How can I keep a Heroku Node.js instance running indefinitely? [closed] - node.js

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 6 years ago.
I've written a simple Node server, deployed on Heroku, that connects to a Firebase DB written to by users of an app of mine. The server monitors child changes, sending me a message via Twilio when there's new data I need to review. There's no front end, nor does there need to be. How can I have Heroku, which ordinarily terminates idle dynos, run this indefinitely so that I can have it monitor Firebase and pass on alerts?

Heroku free web dynos that receive no traffic in a 30-minute period will sleep. That prevents the Firebase Node.js child_changed listeners from staying active, so you won't receive the Twilio messages. To prevent that, you can include a very simple setInterval call (like the example in this article) that pings your app at some interval shorter than 30 minutes:
var http = require("http");

setInterval(function () {
  // Request the app's own URL so the dyno never goes 30 minutes without traffic
  http.get("http://<your app name>.herokuapp.com");
}, 300000); // every 5 minutes (300000 ms)
As a long-term strategy, I don't see anything wrong with this approach.
In case something causes your Node.js server to crash, you can also include code to notify yourself that something needs to be checked:
process.on('uncaughtException', function (err) {
  console.log(err);
  // Text yourself so you know the process needs attention
  twilioClient.messages.create({...}, function (twilioErr, message) {
    if (twilioErr) {
      console.log("twilio error: " + twilioErr);
    }
  });
});
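For reference, here is a minimal sketch of the Firebase-to-Twilio wiring the question describes, assuming the firebase-admin and twilio packages; the database path, credential setup, environment variables, and phone numbers are placeholders:
var admin = require("firebase-admin");
var twilio = require("twilio");

admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: "https://<your-project>.firebaseio.com"
});

var twilioClient = twilio(process.env.TWILIO_SID, process.env.TWILIO_TOKEN);

// Watch a path for changed children and text yourself when something needs review
admin.database().ref("/submissions").on("child_changed", function (snapshot) {
  twilioClient.messages.create({
    body: "New data to review: " + snapshot.key,
    from: process.env.TWILIO_FROM,
    to: process.env.MY_PHONE
  }, function (err, message) {
    if (err) console.log("twilio error: " + err);
  });
});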

Related

Performance issue while using node.js, socket.io and MongoDB for chat application like Teams [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 4 days ago.
We are using node.js, socket.io, express, and a Mongo database to build a chat application like Teams.
When a user sends a message:
The data needs to be saved.
Respective users are notified in the app if they are actively connected.
Otherwise we send a Firebase notification (to notify mobile devices).
We are facing a performance issue when several users of a group send messages at the same time. We iterate over the list of recipients to store data against each one, because we also track later on when the message is delivered and read by that recipient, and we notify each recipient over the socket or via Firebase.
Because of this computation, performance degrades and the whole app gets stuck on the server side until it finishes.
Note: while saving we use an async function, and we use a Promise to return the saved reference id (which is auto-generated for each entry).
I tried to use worker threads to save and notify, but could not change the code to use them due to its complexity. Please suggest a proper workaround.
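A minimal sketch of the flow described above, assuming a Mongoose-style MessageStatus model, a socket.io io instance, firebase-admin, and hypothetical connectedUsers/deviceTokens lookups; it batches the per-recipient writes with insertMany and fans out the notifications in parallel instead of awaiting each recipient in turn:
// Sketch only: MessageStatus, io, admin, connectedUsers, and deviceTokens are assumed to exist in your app.
async function deliverMessage(message, recipients) {
  // One bulk insert for all per-recipient delivery records instead of one write each
  const statuses = await MessageStatus.insertMany(
    recipients.map(userId => ({ messageId: message._id, userId, delivered: false, read: false }))
  );

  // Notify everyone in parallel; connected users over socket.io, others via FCM
  await Promise.all(recipients.map(async userId => {
    const socketId = connectedUsers.get(userId);      // your own connection registry (assumed)
    if (socketId) {
      io.to(socketId).emit("new-message", message);
    } else {
      await admin.messaging().send({
        token: deviceTokens.get(userId),               // assumed lookup of the user's FCM token
        notification: { title: "New message", body: message.text }
      });
    }
  }));

  return statuses.map(s => s._id);                     // the auto-generated ids the question mentions
}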

Difference between Googlecloud's pub/sub, socket, ipc, etc [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 4 years ago.
I'm using a node.js server and GCloud. What exactly is GCloud's Pub/Sub? Does it work like a socket or TCP? My server uses socket.io and node-ipc for communication; can GCloud's Pub/Sub be an alternative?
PubSub is a messaging service. These allow asynchronous communication between two applications; one "publishes a message" to a message service and then some other process reads that message from the message service at a later time - seconds, minutes, or hours later. The application that published the message does not need to "stay connected".
That's really useful for scalable and reliable communication between applications, but quite different from socket-based communication, which is point-to-point between a client and a server process. Implementing request/response-type communication is difficult over a messaging service; "send and forget" is the usual model. As @komarkovich noted, a message can also be received by many applications if that is appropriate.
Google Cloud Pub/Sub is an asynchronous publish/subscribe messaging service.
A publisher creates and sends messages to a topic. A subscriber creates a subscription to a topic to receive messages from it. Communication can be one-to-many, many-to-one, or many-to-many.
Pub/Sub has two endpoints:
Publisher: Any application that can make HTTPS requests to googleapis.com.
Subscriber:
Pull subscriber: Also any application that can make HTTPS requests to googleapis.com.
Push subscriber: Webhook endpoints that can accept POST requests over HTTPS.
You can check the Cloud Pub/Sub Client Libraries and review the example for Node.js to get you started using Google Cloud Pub/Sub API.
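A minimal sketch with the Node.js client library (@google-cloud/pubsub), assuming a topic named my-topic and a pull subscription named my-subscription already exist:
const { PubSub } = require("@google-cloud/pubsub");
const pubsub = new PubSub();

// Publisher: any process with credentials can publish to the topic
async function publish() {
  await pubsub.topic("my-topic").publishMessage({ data: Buffer.from("hello") });
}

// Pull subscriber: receives messages whenever this process is running
const subscription = pubsub.subscription("my-subscription");
subscription.on("message", message => {
  console.log("received:", message.data.toString());
  message.ack();
});

publish().catch(console.error);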

How can I show if a user is online in express? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I am making a chat app and would like to add the functionality of showing an online/offline indicator next to users. How can I do this reliably with a minimum number of server requests and database writes?
One way I found is to update a lastSeenAt field in the user document every time that user requests a page, and use this to indicate whether the user is online or offline. Another way is to ping the server from the client side at fixed intervals and then update the lastSeenAt field.
Both of these approaches would require a lot of database writes and/or server requests. Is there a way to do this more efficiently?
You have to push data from your server to the clients when something changes, using something like WebSockets or server-sent events. socket.io is a popular tool for implementing such functionality.
With socket.io, or WebSockets generally, you can track whether a client is connected and keep a flag in the database that is updated whenever a client connects or disconnects. Keep in mind that a client might have multiple connections (multiple devices, or even multiple browser tabs), so if one connection disconnects the user might still be online.
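A rough sketch of that connection counting with socket.io, assuming the client identifies itself in the connection handshake and that setUserOnline is your own function that writes the flag to the database:
// io is your socket.io server instance; setUserOnline(userId, bool) is assumed to persist the flag
const connectionCounts = new Map(); // userId -> number of open sockets

io.on("connection", socket => {
  const userId = socket.handshake.auth.userId;        // assumes the client sends its id on connect (socket.io v3+ auth payload)
  const count = (connectionCounts.get(userId) || 0) + 1;
  connectionCounts.set(userId, count);
  if (count === 1) setUserOnline(userId, true);        // first tab/device: mark online

  socket.on("disconnect", () => {
    const remaining = (connectionCounts.get(userId) || 1) - 1;
    connectionCounts.set(userId, remaining);
    if (remaining <= 0) setUserOnline(userId, false);  // last connection gone: mark offline
  });
});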
Assuming you will have multiple stateless web servers (common practice), then once a client connects or disconnects you want to notify the other interested clients, so you should use a pub/sub pattern to notify the other servers, which will in turn notify their connected clients. There are a lot of implementations, such as Redis, ZeroMQ, AWS SNS, and GCP Cloud Pub/Sub. You could even use MongoDB with tailable cursors as your pub/sub.
However, this means a lot of persistent connections to your servers, which might hurt your scalability. If it proves too expensive, fall back to your lastSeenAt approach with some sane polling interval. You can find out which works better for your setup by running some experiments.
If database writes and requests worry you, you can always throw more servers at the problem. You could have a microservice specifically for presence so your main server isn't affected much, and you could also give it its own database. If performance worries you, you can use an in-memory database.
You can also add some caching in your servers with simple time-based invalidation, or sync the online/offline data between your servers with a pub/sub pattern.
I would suggest keeping your setup as simple as possible and running experiments to see how it handles your expected traffic.
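If you go the lastSeenAt route instead, the browser just pings an endpoint on an interval (for example setInterval(() => fetch("/heartbeat", { method: "POST" }), 30000)); a minimal server-side sketch with Express and a hypothetical Mongoose User model follows, with arbitrary numbers you would tune:
const express = require("express");
const User = require("./models/user"); // hypothetical Mongoose model with a lastSeenAt Date field
const app = express();

// req.user is assumed to be set by your auth middleware
app.post("/heartbeat", async (req, res) => {
  await User.updateOne({ _id: req.user.id }, { lastSeenAt: new Date() });
  res.sendStatus(204);
});

// Anyone seen within the last 60 seconds counts as online
app.get("/users/:id/online", async (req, res) => {
  const user = await User.findById(req.params.id);
  res.json({ online: Boolean(user && user.lastSeenAt > new Date(Date.now() - 60000)) });
});

app.listen(3000);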
Look into Express.io, where you can combine HTTP calls with sockets (the best method for real-time communication is sockets).

Which side, front-end or back-end, should the attachment file size check be placed on? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
Suppose a client wants to upload an image file or any other attachment, and the maximum allowed size is 10 MB.
Where should the file size check be placed: on the front end or on the server (back end)?
Which is the better design, and why?
It should be at both ends.
Why on the client side (front end): it is annoying for the user if you throw an error only after 10 MB of data has been uploaded, and that 10 MB also reaches the server, so you waste the server's bandwidth and processing power.
Why on the server side (back end): people can bypass or modify client-side code and upload files larger than 10 MB, so you must validate on the server as well.
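On the client the check is usually just comparing e.target.files[0].size from the File API against 10 * 1024 * 1024 before submitting. For the server half, a minimal sketch assuming Express with multer (other upload middleware has equivalent limit options):
const express = require("express");
const multer = require("multer");

const app = express();
const upload = multer({
  dest: "uploads/",
  limits: { fileSize: 10 * 1024 * 1024 } // 10 MB cap, enforced regardless of what the client does
});

app.post("/upload", upload.single("file"), (req, res) => {
  res.sendStatus(201);
});

// Multer forwards a LIMIT_FILE_SIZE error to Express when the cap is exceeded
app.use((err, req, res, next) => {
  if (err && err.code === "LIMIT_FILE_SIZE") return res.status(413).send("File too large");
  next(err);
});

app.listen(3000);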
If the client is browser based, then you must have server-side validation, because it is easy and widely known how to circumvent client-side validation. You may have client-side validation as well for normal users.
If the client is a proper application and it is the only channel to your server, then have client-side validation because it is closer to your users. If there are multiple channels to the server, then you may want to add server-side validation to enforce consistency across all channels.

What are good message queue options for nodejs? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 6 years ago.
Looking to use a message queue in a small web app I'm building with node.js. I looked at resque, but I'm not sure it's appropriate. The goal is to push notifications to clients based on backend and other client actions, using socket.io. I could do this with just socket.io, but I thought a proper message queue would make this cleaner and I wouldn't have to reinvent the wheel.
What are the options out there?
You could use Redis with the lightning-fast node_redis client. It even has built-in pub/sub semantics.
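A small sketch of that pub/sub usage, assuming the callback-style node_redis v3 API (the v4 client is promise-based):
var redis = require("redis");
var sub = redis.createClient();
var pub = redis.createClient();

// Subscriber: gets every message published on the channel
sub.on("message", function (channel, message) {
  console.log("received on %s: %s", channel, message);
});
sub.subscribe("notifications");

// Publisher: fire-and-forget broadcast to all current subscribers
pub.publish("notifications", "user 42 did something");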
You could use the node STOMP client. This would let you integrate with a variety of message queues including:
ActiveMQ
RabbitMQ
HornetQ
I haven't used this library before, so I can't vouch for its quality. But STOMP is a pretty simple protocol so I suspect you can hack it into submission if necessary.
Another option is to use beanstalkd with node. beanstalkd is a very fast "task queue" written in C that is very good if you don't need the feature flexibility of the brokers listed above.
Shameless plug: I'm working on Bokeh: a simple, scalable and blazing-fast task queue built on ZeroMQ. It supports pluggable data stores for persisting tasks, currently in-memory, Redis and Riak are supported. Check it out.
Here are a couple of recommendations I can make:
node-amqp: A RabbitMQ client that I have successfully used in combination with Socket.IO to make a real-time multiplayer game and chat application, amongst other things. It seems reliable enough (see the RabbitMQ sketch after this list).
zeromq.node: If you want to go down the non-brokered route, this might be worth a look. It's more work to implement functionality, but you're more likely to get lower latency and higher throughput.
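As a rough illustration of the brokered route, here is a work-queue sketch using amqplib against a local RabbitMQ broker; note this is a different client than the node-amqp module recommended above, but the idea is the same:
const amqp = require("amqplib");

async function main() {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertQueue("tasks", { durable: true });

  // Producer: enqueue a job
  ch.sendToQueue("tasks", Buffer.from(JSON.stringify({ userId: 42, action: "notify" })));

  // Consumer: process jobs and acknowledge them once handled
  await ch.consume("tasks", msg => {
    console.log("got job:", msg.content.toString());
    ch.ack(msg);
  });
}

main().catch(console.error);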
Take a look at node-busmq - it's a production grade, highly available and scalable message bus backed by redis.
I wrote this module for our global cloud and it's currently deployed in our production environment in several datacenters around the world. It supports named queues, peer-to-peer communication, guaranteed delivery and federation.
For more information on why we created this module you can read this blog post: All Aboard The Message Bus
I recommend trying Kestrel; it's fast and simple like Beanstalk but supports fanout queues. It speaks the memcached protocol. It's built with Scala and used at Twitter.
kue is the only message queue you would ever need
You might want to have a look at Redis Simple Message Queue for Node.js, which uses Redis and offers most features of Amazon's SQS.
Look at node-queue-lib. Perhaps it will be enough for you. It supports node.js and browsers, and has two delivery strategies: broadcast and round-robin. JavaScript only.
Quick example:
var Queue = require('node-queue-lib/queue.core');
var queue = new Queue('Queue name', 'broadcast');

// subscribe to 'Queue name' messages
queue.subscribe(function (err, subscriber) {
  subscriber.on('error', function (err) {
    //
  });
  subscriber.on('data', function (data, accept) {
    console.log(data);
    accept(); // acknowledge that the message was processed
  });
});

// publish message
queue.publish('test');
How about Azure Service Bus? It supports Node.js.
I used Kue with socket.io like you described.
I stored the socket ID with the job and could then retrieve it when the job completes.
Kue is based on Redis and has good examples on GitHub.
Something like this:
jobs.process('YourQueuedJob', 10, function (job, done) {
  doTheJob(job, done);
});

function doTheJob(job, done) {
  // Look up the socket that was stored on the job when it was created
  var socket = io.sockets.sockets[job.data.socketId];
  try {
    socket.emit('news', { status: 'completed', task: job.data.task });
  } catch (err) {
    io.sockets.emit('news', { status: 'fail', task: job.data.task, socketId: job.data.socketId });
  }
  done(); // tell Kue the job is finished
}
You might also want to check out ewd-qoper8: https://github.com/robtweed/ewd-qoper8
