I'm a novice with Node and JavaScript, and I'm more familiar with the old paradigm of synchronous programming than with the callbacks, promises, etc. used for asynchronous programming in Node and browser-based JavaScript.
I was adding SIGINT handling to some Node scripts I was developing and noticed a peculiarity. I have a variety of Node utility scripts. One is an Express.js-based app that serves stuff over HTTP. Another is a Kafka subscriber that processes messages arriving on a specific topic on the bus. And a third is a simple test/debug script for trying out Node.
The Express and Kafka scripts handle SIGINT fine and terminate when the signal arrives. But the simple debug script doesn't, and keeps running even though I've sent Ctrl+C (or Ctrl+D). So my question is: how should a novice write basic Node code that properly catches SIGINT, when not using frameworks like Express or a Node Kafka client that already support this? Here's my sample code below. Please suggest how to rework it (or encapsulate the relevant code) so it catches the signal. As you can see, it's very basic code that a novice would likely write, like a hello-world demo.
var sleep = require('sleep');

process.on('SIGINT', function() {
  console.log("Performing graceful shutdown");
  process.exit();
});

while (true) {
  console.log("running " + new Date());
  sleep.sleep(1);
}
Node.js can't handle any event while the event loop is blocked.
The while (true) {} in your code blocks the event loop, so every SIGINT is queued in the event queue and can't be handled until the loop breaks.
For more on the event loop, please refer to this video.
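For illustration, here is a minimal sketch of the same demo that avoids blocking the event loop by using setInterval() instead of a busy loop, so the SIGINT handler actually gets a chance to run (one option among several, not the only way to do it):

// Non-blocking version of the demo: setInterval() yields back to the event
// loop between ticks, so the SIGINT handler runs as soon as Ctrl+C arrives.
var timer = setInterval(function() {
  console.log("running " + new Date());
}, 1000);

process.on('SIGINT', function() {
  console.log("Performing graceful shutdown");
  clearInterval(timer);
  process.exit();
});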
I want to catch uncaught exceptions in an Electron app. I read that I could use
process.on("uncaughtException", err => {
console.log(err)
}
I am wondering whether the process object refers to the overall process of the app, or to the process of the function it was called in (if it can even be used that way)?
For example, if I call process.exit or something similar, will it kill the app or just shut down the function?
Thanks for your time!
From the docs:
The process object is a global that provides information about, and control over, the current Node.js process. As a global, it is always available to Node.js applications without using require().
Yes, the process object is the overall process of the app.
So if you call process.exit(), it will quit the whole process.
Process docs
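As a small sketch of what that means in practice (the log message and exit code below are my own choices, not from the docs): calling process.exit() inside any handler, wherever it is defined, ends the entire app.

// Runs for any exception not handled elsewhere in this Node/Electron main process.
process.on("uncaughtException", err => {
  console.error("Uncaught exception:", err);
  process.exit(1); // exits the whole app, not just the current function
});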
I am looking into using newrelic APM to monitor certain parts of our codebase.
I want to watch transactions that are not simple HTTP calls, but background processes. These transactions are completed by worker processes and we want to monitor them in the main part of the app.
Pseudo code:
var childProcess = require('child_process');

// Spawn the worker that performs the background transactions.
var fork = childProcess.spawn('node', ['--harmony', 'path-to-worker.js', args]);

fork.stdout.on('data', function(data) {
  // a finished transaction
  // this most likely fires more than once
});
We basically need something like newrelic.createBackgroundTransaction() that can log a transaction immediately, without having to pass it a function to execute and time (I can do that myself).
Can I do something like this on the free tier of newrelic?
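For what it's worth, the timing and collection part of this can be sketched in plain Node.js, leaving the actual APM call as a placeholder (the JSON marker format and the reportToApm() helper below are assumptions for illustration, not New Relic API):

var childProcess = require('child_process');

var fork = childProcess.spawn('node', ['--harmony', 'path-to-worker.js']);

fork.stdout.on('data', function(data) {
  // Assumption: the worker prints one JSON line per finished transaction,
  // e.g. {"name":"resize-image","durationMs":1234}
  data.toString().split('\n').filter(Boolean).forEach(function(line) {
    var txn;
    try {
      txn = JSON.parse(line);
    } catch (e) {
      return; // ignore non-JSON output from the worker
    }
    reportToApm(txn.name, txn.durationMs);
  });
});

// Hypothetical reporting hook; replace with whatever APM call you settle on.
function reportToApm(name, durationMs) {
  console.log('transaction', name, 'took', durationMs, 'ms');
}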
Folks,
I would like to set up a message queue between our Java API and NodeJS API.
After reading several examples of using aws-sdk, I am not sure how to make the service watch the queue.
For instance, this article Using SQS with Node: Receiving Messages Example Code tells me to use sqs.receiveMessage() to receive a message and sqs.deleteMessage() to delete one.
What I am not clear about is how to wrap this in a service that runs continuously, constantly taking messages off the SQS queue, passing them to the model, storing them in Mongo, etc.
Hope my question is not entirely vague. My experience with Node lies primarily with Express.js.
Is the answer as simple as using something like sqs-poller? How would I implement the same thing in an already running Node.js Express app? Quite possibly I should look into SNS to avoid any delay in message delivery.
Thanks!
For a start, a standard Amazon SQS queue guarantees availability of messages but not FIFO ordering. You have to implement sequencing logic in your app if you want it to work that way.
Coming back to your question: SQS has to be polled from within your app to check whether new messages are available. I implemented this in an app using setInterval(). I would poll the queue for items; if none were found, I delayed the next call, and if some were found, the next call fired immediately, bypassing the setInterval(). This is obviously a very raw implementation, and you can look into alternatives. How about a child process on your server that pings your Node.js app when a new item appears in SQS? I think you could implement that watcher in Bash without using Node.js. You can also check npm in case a module for this already exists.
In short, there are many ways you can poll but polling has to be done one way or the other if you are working with Amazon SQS.
I am not sure about this but if you want to be notified of items, you might want to look into Amazon SNS.
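For reference, a minimal sketch of such a polling loop with the aws-sdk (v2) SQS client; the region and queue URL are placeholders, and long polling (WaitTimeSeconds) stands in for the manual delay described above:

var AWS = require('aws-sdk');

var sqs = new AWS.SQS({ region: 'us-east-1' }); // placeholder region
var queueUrl = 'https://sqs.us-east-1.amazonaws.com/account-id/queue-name'; // placeholder

function poll() {
  sqs.receiveMessage({
    QueueUrl: queueUrl,
    MaxNumberOfMessages: 10,
    WaitTimeSeconds: 20 // long polling: SQS holds the request until messages arrive or 20s pass
  }, function(err, data) {
    if (err) {
      console.error(err);
      return setTimeout(poll, 5000); // back off briefly on errors
    }
    (data.Messages || []).forEach(function(message) {
      console.log('Processing message:', message.Body);
      // ...pass it to your model, store it in Mongo, etc...
      sqs.deleteMessage({ QueueUrl: queueUrl, ReceiptHandle: message.ReceiptHandle }, function(err) {
        if (err) console.error(err);
      });
    });
    poll(); // poll again right away; WaitTimeSeconds keeps this from spinning
  });
}

poll();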
When writing applications to consume messages from SQS I use sqs-consumer:
const Consumer = require('sqs-consumer');
const app = Consumer.create({
queueUrl: 'https://sqs.eu-west-1.amazonaws.com/account-id/queue-name',
handleMessage: (message, done) => {
console.log('Processing message: ', message);
done();
}
});
app.on('error', (err) => {
console.log(err.message);
});
app.start();
See the docs for more information (well documented):
https://github.com/bbc/sqs-consumer
I would like to pause a function in order to wait for the end of another section of code.
Does Firefox OS have a synchronization method like wait() and notify() in Java?
Thanks
JavaScript doesn't have this concept; generally speaking it uses callbacks (function pointers) instead, a bit like anonymous classes in Java. For example, making a call to a web server:
function callToWebServer(url, doneCallback) {
// do all kinds of magic, waiting for the web server to reply etc.
// when done:
doneCallback();
}
Now using it like this:
callToWebServer('http://example.com', function() { // the URL is just a placeholder
  // this is executed after the call to the web server succeeded
});
alert(1); // this is executed straight away
We can never wait, because JavaScript is a single-threaded execution environment. All code is written asynchronously.
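If callbacks feel awkward, the same idea can be expressed with a Promise, assuming your environment supports Promises (a sketch; the URL is a placeholder and the setTimeout stands in for a real web-server call):

function callToWebServer(url) {
  return new Promise(function(resolve) {
    // Stand-in for the real request; resolve when the "server" replies.
    setTimeout(function() {
      resolve('response from ' + url);
    }, 1000);
  });
}

callToWebServer('http://example.com').then(function(response) {
  // Runs after the call finished; nothing blocks while waiting.
  console.log(response);
});

console.log('this is executed straight away');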
The issue is:
Lets assume we have two Node.js processes running: example1.js and example2.js.
In example1.js there is function func1(input) which returns result1 as a result.
Is there a way from within example2.js to call func1(input) and obtain result1 as the outcome?
From what I've learned about Node.js, I have only found one solution, which uses sockets for communication. This is less than ideal, however, because it would require one process to listen on a port. If possible, I wish to avoid that.
EDIT: After some questions, I'd like to add that in the process hierarchy example1.js cannot be a child process of example2.js, but the opposite is fine. Also, if it helps: there can be only one example1.js processing its own data, and many example2.js instances processing their own data plus data from the first process.
The use case you describe makes me think of dnode, which lets you easily expose functions to be called by different processes. dnode coordinates the calls over network sockets (and socket.io, so you can use the same mechanism in the browser).
Another approach would be to use a message queue, there are many good bindings for different message queues.
The simplest way to my knowledge, is to use child_process.fork():
This is a special case of the spawn() functionality for spawning Node processes. In addition to having all the methods in a normal ChildProcess instance, the returned object has a communication channel built-in. The channel is written to with child.send(message, [sendHandle]) and messages are received by a 'message' event on the child.
So, for your example, you could have example2.js:
var fork = require('child_process').fork;
var example1 = fork(__dirname + '/example1.js');

// Replies from example1.js arrive as 'message' events on the child handle.
example1.on('message', function(response) {
  console.log(response); // logs "Hello input"
});

// Send the input over the built-in IPC channel.
example1.send({func: 'input'});
And example1.js:
function func(input) {
  // Reply to the parent over the same IPC channel.
  process.send('Hello ' + input);
}

process.on('message', function(m) {
  func(m.func); // m is the object sent by the parent, e.g. {func: 'input'}
});
Maybe you should try Messenger.js. It can do IPC in a handy way, so you don't have to implement the communication between the two processes yourself.
Use Redis as a message bus/broker.
https://redis.io/topics/pubsub
You can also use socket messaging like ZeroMQ, which is point-to-point / peer-to-peer, instead of using a message broker like Redis.
How does this work?
With Redis, each of your Node applications has two Redis clients doing pub/sub: a publisher and a subscriber (yes, you need two clients per Node process, because a Redis connection in subscriber mode cannot issue other commands).
With ZeroMQ, you can send messages over IPC channels directly between Node.js processes, with no broker involved (except perhaps the OS itself).
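A minimal sketch of the Redis side, assuming the classic node_redis (v3) callback API and a locally running Redis; the channel name is arbitrary, and each process would hold its own pair of clients:

var redis = require('redis');

// One connection for publishing, one for subscribing: a connection in
// subscriber mode cannot issue ordinary commands.
var pub = redis.createClient();
var sub = redis.createClient();

sub.on('message', function(channel, message) {
  console.log('received on', channel + ':', message);
});
sub.subscribe('example-channel'); // arbitrary channel name

// The other process (or this one, for a quick test) publishes to the same channel:
pub.publish('example-channel', JSON.stringify({ func: 'input' }));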