I have a code snippet in Node.js like this:
Every 2 seconds, foo() will be called.
function foo()
{
  while (count < 10)
  {
    doSomething();
    count++;
  }
}

function doSomething()
{
  ...
}
The limitation is that foo() has no callback.
How can I make the while loop run so that foo() completes without waiting for doSomething() to finish (i.e. call doSomething() and proceed), with doSomething() executing in parallel?
I think what you want is:
function foo()
{
  while (count < 10)
  {
    process.nextTick(doSomething);
    count++;
  }
}
process.nextTick will schedule the execution of doSomething on the next tick of the event loop. So, instead of switching immediately to doSomething, this code will just schedule the execution and complete foo first.
You may also try setTimeout(doSomething, 0) and setImmediate(doSomething). They'll allow I/O calls to occur before doSomething is executed.
Passing arguments to doSomething
If you want to pass some parameters to doSomething, it's best to ensure they'll be encapsulated and won't change before doSomething is executed:
setTimeout(doSomething.bind(null, foo, bar), 0);
In this case doSomething will be called with the correct arguments even if foo and bar are changed or deleted in the meantime. But this won't work if foo is an object and you change one of its properties, because bind captures a reference to the object rather than a copy.
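For example, here's a minimal sketch of that caveat (the options object is hypothetical):
// bind captures a reference to the object, not a copy
var options = { limit: 10 };
setTimeout(doSomething.bind(null, options), 0);
options.limit = 99; // by the time doSomething runs, it sees limit === 99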
What are the alternatives?
If you want doSomething to be executed in parallel (not just asynchronously, but actually in parallel), then you may be interested in a job-processing solution. I recommend looking at kickq:
var kickq = require('kickq');
kickq.process('some_job', function (jobItem, data, cb) {
doSomething(data);
cb();
});
// ...
function foo()
{
while (count < 10)
{
kickq.create('some_job', data);
count ++;
}
}
kickq.process will create a separate process for processing your jobs. So, kickq.create will just register the job to be processed.
Note that kickq uses Redis to queue jobs and won't work without it.
Using node.js built-in modules
Another alternative is building your own job-processor using Child Process. The resulting code may look something like this:
var fork = require('child_process').fork,
child = fork(__dirname + '/do-something.js');
// ...
function foo()
{
while (count < 10)
{
child.send(data);
count ++;
}
}
Here do-something.js is a separate .js file containing the doSomething logic:
process.on('message', doSomething);
The actual code may be more complicated.
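For instance, a slightly fuller do-something.js might look like this (a sketch; the reply message shape is just an assumption):
// do-something.js
function doSomething(data) {
  // ... heavy work here ...
}

process.on('message', function (data) {
  doSomething(data);
  // optionally notify the parent that this job is finished
  process.send({ done: true });
});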
Things you should be aware of
Node.js is single-threaded, so it executes only one function at a time, and a single process can't utilize more than one CPU.
Node.js is asynchronous, so it's capable of processing multiple functions at once by switching between them. It's really efficient when dealing with functions that do lots of I/O calls, because it never blocks. So, while one function waits for the response from a DB, another function is executed. But node.js is not a good choice for blocking tasks with heavy CPU utilization.
It's possible to do real parallel computation in node.js using modules like child_process and cluster. child_process allows you to start a new node.js process, and it creates a communication channel between the parent and child processes. cluster allows you to run a cluster of identical processes. It's really handy when you're dealing with HTTP requests, because cluster can distribute them between workers. So, it's possible to create a cluster of workers processing your data in parallel, even though each node.js process is single-threaded.
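For example, a minimal cluster sketch (assuming an HTTP workload):
var cluster = require('cluster');
var http = require('http');
var os = require('os');

if (cluster.isMaster) {
  // fork one worker per CPU core
  for (var i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  // each worker runs its own server; incoming connections are distributed between them
  http.createServer(function (req, res) {
    res.end('handled by worker ' + process.pid);
  }).listen(8000);
}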
Related
I know I can write a while (true) loop to monitor the queue, but it will cause the CPU 100% problem.
I can sleep some seconds inside the while (true) loop, but it's NOT efficient.
In C language, I can wait for a semaphore inside the while (true) loop. When a task is added to the queue, release the semaphore so that the while (true) loop can do its job. After the queue is empty, it can set the semaphore and wait for it.
Is there a similar way to do this in Node.js?
Imagine we have this taskQueue:
// Tasks will be added to the array randomly
const tasks = [];
Note: the taskQueue above is something completely different from the internal NodeJS micro/macro task queues that I'm referring to throughout this post.
A way of constantly monitoring this array would be to schedule a 'micro-task' or 'macro-task' that parses the array.
As an example:
function handleTasks() {
if (tasks.length) {
// Alternatively loop and pop all the current tasks in queue
const task = tasks.pop();
// Do something with the task
}
setImmediate(handleTasks)
}
setImmediate(handleTasks)
The setImmediate function will add a task to the internal macro-task queue.
The JS micro- and macro-tasks do not block the main thread and will only be executed when the event loop picks them off the internal micro/macro task queue.
In NodeJS there are 4 ways of scheduling a function in a non-blocking way. Which way you pick is based on how much priority you'd want to give to the function.
Ordered by highest priority first the ways to do this are:
process.nextTick(handleTask)
new Promise((resolve) => { resolve() }).then(handleTask)
setImmediate(handleTask) / setTimeout(handleTask, 0)
setTimeout(handleTask, 1) // any timeout value greater than 0
Be aware that recursively scheduling this function at the highest priority could starve the rest of your code.
Depending on how important clearing this taskQueue is, I'd generally suggest using setTimeout with a reasonable value (as high as you can afford) to avoid hurting the performance of your application. (The same goes for any other function that schedules itself on the micro/macro task queue.)
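As a quick illustration of the priorities listed above, this sketch should log in numbered order (note that the relative order of setImmediate and setTimeout(fn, 0) can vary between runs when called from the main module):
process.nextTick(() => console.log('1: process.nextTick'));
Promise.resolve().then(() => console.log('2: promise microtask'));
setImmediate(() => console.log('3: setImmediate'));
setTimeout(() => console.log('4: setTimeout 0'), 0);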
Questions
I know I can write a while (true) loop to monitor the queue, but it
will cause the CPU 100% problem.
In JavaScript, functions cannot be preempted, meaning that their execution cannot be halted somewhere in the middle.
The consequence is that once a function starts, it has to finish before another line of code (somewhere else) can be executed.
Therefore an infinite while-loop will not work.
I can sleep some seconds inside the while (true) loop, but it's NOT
efficient.
// assumes a promisified timeout helper, e.g.
// const timeout = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
while (true) {
  await timeout(1000);
  // Do sth
}
This is roughly syntactic sugar for:
timeout(1000).then(() => {
// Do sth
timeout(1000).then(() => {
// Do sth
// ...etc
})
})
Using await inside a loop is considered bad practice, but it works here since it just schedules each next iteration on the micro-task queue.
In C language, I can wait for a semaphore inside the while (true) loop. When a
task is added to the queue, release the semaphore so that the while
(true) loop can do its job. After the queue is empty, it can set the
semaphore and wait for it.
There is no such thing as a semaphore in JS. Something that might achieve a similar effect could be a callback function.
Example:
let resumeExecution; // holds the "wake-up" callback

function heavyLoadTask() {
  // Do sth
  resumeExecution = () => {
    // What to do when execution is resumed
  };
}
// Somewhere else the execution could be resumed like this;
if (typeof resumeExecution === "function"){
resumeExecution();
}
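Putting it together, a semaphore-like wait can be sketched with a promise resolver as the wake-up signal (all names here are hypothetical):
const tasks = [];
let wakeUp = null;

async function worker() {
  while (true) {
    if (tasks.length === 0) {
      // park until a task arrives; no busy-waiting, no CPU burn
      await new Promise((resolve) => { wakeUp = resolve; });
    }
    const task = tasks.pop();
    // do something with the task
  }
}

function addTask(task) {
  tasks.push(task);
  // "release the semaphore" if the worker is parked
  if (wakeUp) { wakeUp(); wakeUp = null; }
}

worker();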
Recommended reading
https://javascript.info/event-loop
https://nodejs.dev/learn/understanding-process-nexttick
https://nodejs.dev/learn/understanding-setimmediate
How can I run a single method multiple times multi-threaded when called as a method of a class?
At first I tried to use the cluster module, but I realized it just re-runs the whole process from the start, rightfully so.
How can I achieve something like what's outlined below?
I want a class's method to spawn n processes, and when the parallel tasks are completed, I can resolve a promise which the method returns.
The problem with the code below is that calling cluster.fork() will fork index.js process.
index.js
const Person = require('./Person.js');
var Mary = new Person('Mary');
Mary.run(5).then(() => {...});
console.log('I should only run once, but I am called 5 times too many');
Person.js
const cluster = require('cluster');
class Person {
  constructor(name) {
    this.name = name;
  }
  run(distance) {
var completed = 0;
return new Promise((resolve, reject) => {
for (var i = 0; i < distance; i++) {
  // run a separate process for each
  // (send() returns a boolean, so keep a reference to the worker)
  var worker = cluster.fork();
  worker.send(i);
  worker.on('message', message => {
    if (message === 'completed') { ++completed; }
    if (completed === distance) { resolve(); }
  });
}
});
}
}
I think the short answer is that it's impossible. It's even worse: this has nothing to do with JS. To use multiple processes or threads for your particular problem, you essentially need a copy of the object in every worker, since the method (maybe) needs access to its fields. So you would need to either initialize the object in every worker or share memory. The latter isn't provided by cluster, and it's not trivial in other languages for every use case either.
If the calculation is independent of the Person, I suggest you extract it and use the usual pattern (in index.js):
if(cluster.isWorker) {
//Use the i for calculation
} else {
//Create Person, then fork children in for loop
}
You then collect the results and change the Person as needed. You will be copying index.js, but this is standard and you only run what you need.
The problem is if the results depend on the Person. If they are constant for all i, you can still send them to your forks independently. Otherwise, what you have is the only way to fork. In general, forking in cluster is not meant for methods but for the app itself, which is the standard forking behavior.
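A sketch of the standard pattern described above, assuming the calculation only needs i and not the Person instance (the calculation itself is a placeholder):
const cluster = require('cluster');

if (cluster.isWorker) {
  // use the i sent by the parent for the calculation
  process.on('message', (i) => {
    const result = i * 2; // placeholder calculation
    process.send(result);
    process.exit(0);
  });
} else {
  // create the Person, then fork children in a for loop
  const distance = 5;
  let completed = 0;
  for (let i = 0; i < distance; i++) {
    const worker = cluster.fork();
    worker.send(i);
    worker.on('message', (result) => {
      // collect the results and change the Person as needed
      if (++completed === distance) {
        console.log('all workers finished');
      }
    });
  }
}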
Another solution
Following your comment, I suggest you check out child_process.execFile or child_process.exec on the same file.
This way you can spawn a totally independent process on the fly. Now instead of calling cluster.fork you call execFile. You can use either the exit code or stdout as return values (stderr etc.). Promise is now replaced with:
var child_process = require('child_process');

var results = [];
for (var i = 0; i < distance; i++) {
  // run a separate process for each
  results.push(child_process.execFile('node', ['mymethod.js', i]));
}
// ... catch the exit event from all results or return a callback using results.
Inside mymethod.js, have your code take i and return what you want, either in the exit code or through stdout (both are available on the returned child process). This is a bit un-node.js-y since you're waiting on asynchronous calls, but your requirements are non-standard. Since I'm not sure how you use this, perhaps returning a callback with the array is a better idea.
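A sketch of both sides (the file name and the doubling calculation are just placeholders):
// mymethod.js: reads i from argv and reports the result through stdout
const i = Number(process.argv[2]);
process.stdout.write(String(i * 2));

And the caller side, collecting results through the execFile callback:
const { execFile } = require('child_process');
const results = [];
for (let i = 0; i < distance; i++) {
  execFile('node', ['mymethod.js', String(i)], (err, stdout) => {
    if (err) throw err;
    results.push(Number(stdout));
    // when results.length === distance, all children have finished
  });
}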
I am very new to node.js and socket.io. Can this code lead to a race condition on the counter variable? Should I use a locking library to safely update the counter variable?
"use strict";
module.exports = function (opts) {
var module = {};
var io = opts.io;
var counter = 0;
io.on('connection', function (socket) {
socket.on("inc", function (msg) {
counter += 1;
});
socket.on("dec" , function (msg) {
counter -= 1;
});
});
return module;
};
No, there is no race condition here. Javascript in node.js is single threaded and event driven so only one socket.io event handler is ever executing at a time. This is one of the nice programming simplifications that come from the single threaded model. It runs a given thread of execution to completion and then and only then does it grab the next event from the event queue and run it.
Hopefully you do realize that the same counter variable is accessed by all socket.io connections. While this isn't a race condition, it means that there's only one counter that all socket.io connections are capable of modifying.
If you wanted a per-connection counter (a separate counter for each connection), then you could define the counter variable inside the io.on('connection', ...) handler.
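For example, a sketch of the per-connection variant:
io.on('connection', function (socket) {
  var counter = 0; // a separate counter for each connection
  socket.on("inc", function (msg) {
    counter += 1;
  });
  socket.on("dec", function (msg) {
    counter -= 1;
  });
});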
The race conditions you do have to watch out for in node.js are when you make an async call and then continue the rest of your coding logic in the async callback. While the async operation is underway, other node.js code can run and can change publicly accessible variables you may be using. That is not the case in your counter example, but it does occur with lots of other types of node.js programming.
For example, this could be an issue:
var flag = false;
function doSomething() {
// set flag indicating we are in a fs.readFile() operation
flag = true;
fs.readFile("somefile.txt", function(err, data) {
// do something with data
// clear flag
flag = false;
});
}
In this case, immediately after we call fs.readFile(), we return control back to node.js. It is free at that time to run other operations. If another operation could also run this code, it would stomp on the value of flag and we'd have a concurrency issue.
So, you have to be aware that anytime you make an async operation and then the rest of your logic continues in the callback for the async operation that other code can run and any shared variables can be accessed at that time. You either need to make a local copy of shared data or you need to provide appropriate protections for shared data.
In this particular case, the flag could be incremented and decremented rather than simply set to true or false, and it would probably serve the desired purpose of keeping track of whether this file is currently being read or not.
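That counter variant might look like this (a sketch building on the example above):
var fs = require('fs');
var pendingReads = 0; // counter instead of a boolean flag

function doSomething() {
  pendingReads++; // one more read in flight
  fs.readFile("somefile.txt", function(err, data) {
    // do something with data
    pendingReads--; // this read is done
  });
}
// pendingReads > 0 means the file is still being read somewhere,
// even if doSomething() was called several times concurrently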
Shorter answer:
"Race condition" is when you execute a series of ordered asynchronous functions and because of their async nature they won't finish processing in their original order.
In your code, you are executing a series of ordered synchronous operations (increasing or decreasing the counter), so each finishes immediately after it starts, resulting in ordered output. So no racing here!
Suppose you've got a 3rd-party library that's got a synchronous API. Naturally, attempting to use it in an async fashion yields undesirable results in the sense that you get blocked when trying to do multiple things in "parallel".
Are there any common patterns that allow us to use such libraries in an async fashion?
Consider the following example (using the async library from NPM for brevity):
var async = require('async');
function ts() {
return new Date().getTime();
}
var startTs = ts();
process.on('exit', function() {
console.log('Total Time: ~' + (ts() - startTs) + ' ms');
});
// This is a dummy function that simulates some 3rd-party synchronous code.
function vendorSyncCode() {
var future = ts() + 50; // ~50 ms in the future.
while(ts() <= future) {} // Spin to simulate blocking work.
}
// My code that handles the workload and uses `vendorSyncCode`.
function myTaskRunner(task, callback) {
// Do async stuff with `task`...
vendorSyncCode(task);
// Do more async stuff...
callback();
}
// Dummy workload.
var work = (function() {
var result = [];
for(var i = 0; i < 100; ++i) result.push(i);
return result;
})();
// Problem:
// -------
// The following two calls will take roughly the same amount of time to complete.
// In this case, ~6 seconds each.
async.each(work, myTaskRunner, function(err) {});
async.eachLimit(work, 10, myTaskRunner, function(err) {});
// Desired:
// --------
// The latter call with 10 "workers" should complete roughly an order of magnitude
// faster than the former.
Are fork/join or spawning worker processes manually my only options?
Yes, it is your only option.
If you need to use 50ms of CPU time to do something, and need to do it 10 times, then you'll need 500ms of CPU time in total. If you want it done in less than 500ms of wall-clock time, you need to use more CPUs. That means multiple node instances (or a C++ addon that pushes the work out onto the thread pool). How to get multiple instances depends on your app structure: a child process that you feed the work to using child_process.send() is one way; running multiple servers with cluster is another; breaking up your server is yet another. Say it's an image store application that is mostly fast to process requests, except when someone asks to convert an image into another format, which is CPU intensive. You could push the image processing portion into a different app and access it through a REST API, leaving the main app server responsive.
If you aren't concerned that the request takes 50ms of CPU, but instead are concerned that you can't interleave handling of other requests with the processing of the CPU intensive request, then you could break the work up into small chunks and schedule the next chunk with setInterval(). That's usually a horrid hack, though. Better to restructure the app.
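A sketch of that chunking idea (using setImmediate here so I/O can interleave between chunks; setInterval works similarly):
function processInChunks(items, chunkSize, done) {
  let i = 0;
  function nextChunk() {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      // CPU-bound work on items[i] goes here
    }
    if (i < items.length) {
      setImmediate(nextChunk); // yield so other requests can be handled
    } else {
      done();
    }
  }
  nextChunk();
}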
When I receive an "on" event on the server side, I want to start a task in parallel so it does not block the current event loop thread. Is it possible to do so? How?
I don't want to block the server side loop and I want to be able to send back a message to the client once the task is done, something such as:
client.on('execute-parallel-task', function(msg) {
setTimeout(function() {
// do something that takes a while
client.emit('finished-that-task');
},0);
// this block should return asap, not waiting for the previous call
});
I am not sure if setTimeout will do the job.
It depends on what "takes a while" is. If it takes a while asynchronously (you can tell because you'll have to register a callback or completion handler) and it's blocked on something like I/O rather than being CPU bound, it'll inherently run in parallel.
If, however, it's something synchronous or CPU bound, then while you can use setTimeout, setImmediate etc. to send back a message immediately, once the handler for setTimeout or setImmediate executes, your single thread of execution will be stuck handling it; you're not really fixing the problem, merely deferring it.
To exhibit true parallel behaviour, you'll need to launch a child process. You can use the message passing functionality to notify your worker what work to do, and to notify the parent process once the work is complete.
var cp = require('child_process');
var child = cp.fork(__dirname + '/my-child-worker.js');

child.on('message', function (m) {
  if (m === 'done') {
    // Whey!
  }
});

child.send(/* job id, or something */);
Then in my-child-worker.js:
process.on('message', function (m) {
switch (m) {
case 'get-x':
// blah
break;
// other jobs
}
process.send('done');
});
You do not need the setTimeout.
Your function(msg) handler will be called when the 'execute-parallel-task' event arrives, and it can emit 'finished-that-task' once the work completes.
If you are designing a task to run in an async manner, you can look at something like the async lib for node.js:
Async Node JS Link
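For example, a minimal sketch with the async lib (the two tasks are dummies):
var async = require('async');

async.parallel([
  function (cb) { setTimeout(function () { cb(null, 'one'); }, 100); },
  function (cb) { setTimeout(function () { cb(null, 'two'); }, 50); }
], function (err, results) {
  // results is ['one', 'two']; order matches the task array, not completion order
});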