Loops with callbacks in node.js

I have the following code in node.js
for (var i = 0; i < allLetters.length; i++) {
  for (var k = 0; k < allLetters.length; k++) {
    var allFilesName = fs.readdirSync("/opt/" + allLetters[i] + "/" + allLetters[k]);
    for (var t = 0; t < allFilesName.length; t++) {
      dosomething(allFilesName[t]);
    }
  }
}
dosomething is a function with a callback, and it includes I/O operations.
The problem is that my application doesn't execute the callbacks until it has finished the i, k and t loops. All the CPU time is spent completing the loops, and only after they are all done do the callbacks run and return.
I want the loops and the callbacks to execute in parallel, so that I get results from the callbacks while the loops are still running.

As stated in the comments, the each function of the async library does what you are looking for.
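The pattern async.each implements can also be sketched with plain callbacks; the hypothetical each helper below starts the asynchronous iteratee for every item and fires a final callback once all of them (or the first error) have come back:

```javascript
// Minimal stand-in for async.each: run an async iteratee over every
// item "in parallel" and call done() once all of them have finished.
function each(items, iteratee, done) {
  var remaining = items.length;
  var failed = false;
  if (remaining === 0) return done(null);
  items.forEach(function (item) {
    iteratee(item, function (err) {
      if (failed) return;                 // an error was already reported
      if (err) {
        failed = true;
        return done(err);
      }
      if (--remaining === 0) done(null);  // last iteratee just finished
    });
  });
}

// Applied to the question's code, dosomething(file, callback) is
// started for every file and I/O results arrive as they complete:
// each(allFilesName, dosomething, function (err) { /* all done */ });
```

The key point is that each only *starts* the work; the loop returns immediately and the callbacks fire as the I/O completes.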

Related

Node.js synchronous loop execution

How to make this loop run synchronously in Node.js
CODE:
for (var i = 0; i < arr.length; i++) {
  inv(arr[i][0], arr[i][1]);
}
inv(arr[i][0], arr[i][1]); // inv fetches data from a website; calls must run one by one or the IP will be blocked, so they can't execute concurrently
The problem was with the function: it was asynchronous, so I made it recursive (each call starting the next from its callback) to solve the problem.
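The recursive fix described above can be sketched like this; inv is stubbed out here, on the assumption that the real one takes a completion callback as its last argument:

```javascript
// Hypothetical stand-in for the real inv(): pretend to fetch from a
// website, then invoke the completion callback asynchronously.
function inv(a, b, done) {
  setImmediate(function () { done(null, a + "/" + b); });
}

// Process arr strictly one entry at a time: the next call to inv
// only starts from inside the previous call's callback, so requests
// never overlap.
function invSequentially(arr, i, finished) {
  if (i >= arr.length) return finished(null);
  inv(arr[i][0], arr[i][1], function (err) {
    if (err) return finished(err);
    invSequentially(arr, i + 1, finished);
  });
}

// invSequentially([["a", 1], ["b", 2]], 0, function (err) { /* done */ });
```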

Node.js for loop, event loop, asynchronous resolution

Recently I came across code that makes me wonder how the Node.js engine works, specifically with regard to the event loop, for loops, and asynchronous processing. For example:
const callback = () => {
  console.log("done one");
};
const x = 100000000; // or some stupendously large number
for (let i = 0; i < x; i++) {
  asyncCall(callback);
}
Let's say asyncCall takes anywhere from 1 ms to 100 ms. Is there a possibility that console.log("done one") will be called before the for loop finishes?
To try to find my own answer, I've read this article: https://blog.sessionstack.com/how-javascript-works-event-loop-and-the-rise-of-async-programming-5-ways-to-better-coding-with-2f077c4438b5. But I'm not sure whether there is ever a case where the call stack is empty in the middle of the for loop, such that the event loop could slot the callback in between asyncCall calls.
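Assuming asyncCall merely *schedules* its callback (via setImmediate, a timer, or I/O), the answer is no: the loop body runs synchronously on the call stack, and the event loop cannot run any callback until that stack unwinds. A small sketch (asyncCall here is a hypothetical stand-in):

```javascript
const order = [];

// Hypothetical asyncCall: just schedules the callback for a later
// turn of the event loop.
function asyncCall(cb) {
  setImmediate(cb);
}

for (let i = 0; i < 3; i++) {
  asyncCall(() => order.push("callback"));
  order.push("loop " + i);
}
// Every "loop i" entry is recorded before any "callback" entry: the
// call stack never empties in the middle of a synchronous for loop,
// so the event loop can't interleave callbacks with loop iterations.
```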

npm wait.for not working as expected

I tried the code below, as provided on the official site - https://www.npmjs.com/package/wait.for - but it is not working as expected.
Output:
*before calling test
after calling test
reverse for 216.58.196.142: ["syd15s04-in-f14.1e100.net"]*
Expected output:
*before calling test
reverse for 216.58.196.142: ["syd15s04-in-f14.1e100.net"]
after calling test*
What is that I can do to make it work?
var dns = require("dns"), wait = require('wait.for');
function test() {
  var addresses = wait.for(dns.resolve4, "google.com");
  for (var i = 0; i < addresses.length; i++) {
    var a = addresses[i];
    console.log("reverse for " + a + ": " + JSON.stringify(wait.for(dns.reverse, a)));
  }
}
console.log("before calling test");
wait.launchFiber(test);
console.log("after calling test");
The last line, wait.launchFiber(test), means "launch and forget": launchFiber starts a concurrent execution fiber. Inside the fiber you can use wait.for, but the fiber runs concurrently with the main execution thread, so console.log("after calling test") executes immediately, before the fiber has finished.
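In current Node you can get the expected ordering without fibers by awaiting the asynchronous work; lookup() below is a hypothetical stand-in for the wait.for(dns.resolve4, ...) call:

```javascript
const lines = [];

// Hypothetical stand-in for dns.resolve4 wrapped in a promise.
function lookup() {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(["216.58.196.142"]); }, 10);
  });
}

async function test() {
  const addresses = await lookup();
  lines.push("reverse for " + addresses[0]);
}

(async function main() {
  lines.push("before calling test");
  await test();                     // suspends only this async function
  lines.push("after calling test"); // runs after test() has finished
})();
```

Because main awaits test(), the "after calling test" line is only recorded once the lookup completes, which is exactly the ordering the asker expected from wait.for.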

Removing event listeners on a currently emitting event

I have the following sample app, written in Node.js:
'use strict';
var events = require('events'),
util = require('util');
var EventEmitter = events.EventEmitter;
var Foo = function () {};
util.inherits(Foo, EventEmitter);
var foo = new Foo();
foo.once('x', function () {
  foo.removeAllListeners();
  console.log('Google!');
});
foo.once('x', function () {
  foo.removeAllListeners();
  console.log('Yahoo!');
});
foo.emit('x');
It prints:
Google!
Yahoo!
Now my question is: apparently removeAllListeners does not affect the event listeners that are currently bound to the event being emitted. Is this accidental, or is it by design? (I checked this using 0.10.32 as well as 0.11.13.)
The background of my question is: If I bind two event handlers to a stream's end event, and one of them calls removeAllListeners, does Node.js guarantee that both will always be run, or is this just by good luck?
Looking at the implementation of the .emit() method, it appears that once it starts processing an event and calling its listeners, that event will not be affected by code that calls removeAllListeners(), so in your example both listeners will be called.
The code for .emit() makes a copy of the array of listeners before executing any of them so that once it starts executing one, it will execute all of them, even if they are removed during execution. Here's the relevant piece of code:
} else if (util.isObject(handler)) {
  len = arguments.length;
  args = new Array(len - 1);
  for (i = 1; i < len; i++)
    args[i - 1] = arguments[i];

  listeners = handler.slice();
  len = listeners.length;
  for (i = 0; i < len; i++)
    listeners[i].apply(this, args);
}
From the EventEmitter implementation here: https://github.com/joyent/node/blob/857975d5e7e0d7bf38577db0478d9e5ede79922e/lib/events.js#L120
In this piece of code, handler will be an array of listener functions. The line
listeners = handler.slice()
makes a copy of the listeners array before any listeners are executed. This is to be expected, because iteration of an array can be corrupted (duplicates or skips) if code is allowed to modify the array while it is being iterated. So emit freezes the set of listeners to be called before calling any of them.

Can I allow for "breaks" in a for loop with node.js?

I have a massive for loop and I want to allow I/O to continue while I'm processing. Maybe every 10,000 or so iterations. Any way for me to allow for additional I/O this way?
A massive for loop is just you blocking the entire server.
You have two options, either put the for loop in a new thread, or make it asynchronous.
var data = [];
var next = function (i) {
  if (i >= data.length) return; // stop once every item has been handled
  // do something with data[i];
  process.nextTick(next.bind(this, i + 1));
};
process.nextTick(next.bind(this, 0));
I don't recommend the latter. You're just implementing naive time slicing, which the OS-level process scheduler can do better than you.
var exec = require("child_process").exec;
var s = exec("node " + filename, function (err, stdout, stderr) {
  // stdout here is the complete buffered output, delivered once the
  // child process has exited
});
s.stdout.on("data", function (chunk) {
  // handle output incrementally as it arrives
});
Alternatively use something like hook.io to manage processes for you.
Actually you probably want to aggressively redesign your codebase if you have a blocking for loop.
Maybe something like this to break your loop into chunks...
Instead of:
for (var i = 0; i < len; i++) {
  doSomething(i);
}
Something like:
var i = 0;
function doChunk() {
  var limit = Math.min(i + 10000, len);
  for (; i < limit; i++) {
    doSomething(i);
  }
  if (i < len) {
    process.nextTick(doChunk); // give other events a chance to run
  }
}
doChunk();
The nextTick() call gives other events a chance to get in there, but it still does most of the looping synchronously, which (I'm guessing) will be a lot faster than creating a new event for every iteration. And obviously, you can experiment with the chunk size (10,000) until you get the results you want.
In theory, you could also use setTimeout() rather than nextTick(), in case it turns out that giving other work a somewhat bigger time slice helps. That gives you one more variable (the timeout in milliseconds) that you can use for tuning.
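The setTimeout variant could look like this sketch, with the delay as the extra tuning knob (runChunked and its parameters are illustrative names, not from the original answer):

```javascript
// Chunked loop using setTimeout instead of process.nextTick; the
// delayMs argument is the extra tuning variable mentioned above.
function runChunked(len, chunkSize, delayMs, doSomething, done) {
  var i = 0;
  function doChunk() {
    var limit = Math.min(i + chunkSize, len);
    for (; i < limit; i++) {
      doSomething(i);
    }
    if (i < len) {
      setTimeout(doChunk, delayMs); // yield before the next chunk
    } else {
      done();
    }
  }
  doChunk();
}

// runChunked(1000000, 10000, 0, work, function () { /* all done */ });
```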
