Multiple parallel requests not processing asynchronously in Node.js - node.js

var express = require('express');
var app = express();
module.exports = app;

app.get('/request1', function(req, res) {
    request1(function() {
        res.end();
    });
    console.log("req1");
});

app.get('/request2', function(req, res) {
    console.log("req2");
    request2(function() {
        res.end();
    });
});

function request1(callback) {
    process.nextTick(function() {
        for (var i = 0; i < 9999999; i++) {
            console.log(i);
        }
        return callback();
    });
}

function request2(callback) {
    process.nextTick(function() {
        console.log('request2');
        callback();
    });
}

app.listen(3000);
In this code, I first call /request1, which takes time to process because of the loop.
Then, in another tab, I request /request2 before /request1 has completed, but /request2 does not execute until /request1 finishes.
Please help me understand how to solve this.

Node.js is single threaded. It performs I/O and network operations asynchronously.
Here you are executing a for loop that blocks the thread.
So only once that loop has finished will the second request be handled.

You should look into the concept of callbacks when making requests. Callbacks are important if you want work to appear to run at the same time. However, Node.js cannot truly do multiple CPU-bound things at the same time; callbacks only give you that feel for I/O-bound work.
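By way of illustration (this is not from the original answer), one way to keep a long CPU-bound loop from blocking other requests is to split the work into chunks and yield back to the event loop with setImmediate between chunks; the chunk size below is an arbitrary choice:

    // Hypothetical sketch: process the loop in chunks so other requests
    // can be served between chunks instead of waiting for the whole loop.
    function request1(callback) {
        var i = 0;
        var total = 9999999;
        function chunk() {
            var end = Math.min(i + 100000, total);
            for (; i < end; i++) {
                // ... one unit of work per iteration ...
            }
            if (i < total) {
                setImmediate(chunk); // yield to the event loop, then continue
            } else {
                callback();          // all chunks done
            }
        }
        chunk();
    }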

Related

Where does the callback within this express app come from?

After some freeCodeCamp I started doing the Express.js tutorial from MDN (https://developer.mozilla.org/en-US/docs/Learn/Server-side/Express_Nodejs/Displaying_data/Home_page) for some backend practice.
I am stuck at understanding where the callback in async.parallel is coming from and what it represents.
If I delete the callback the site won't load, so it must have some important meaning, but unfortunately I have no clue. Is it calling the function(err, results) { res.render('index', [...]) } to make the result available for data?
var Book = require('../models/book');
var async = require('async');

exports.index = function(req, res) {
    async.parallel({
        book_count: function(callback) {
            Book.countDocuments({}, callback);
        },
        [...]
    },
    function(err, results) {
        res.render('index', {
            title: 'Local Library Home',
            error: err,
            data: results
        });
    });
};
A callback is a generic function invoked upon the completion of an asynchronous request. In this particular instance, the callback is being used as a way of getting the data out of the asynchronous request in order to fill in the number of books on your page. Callbacks are required because these queries are non-blocking, meaning JavaScript will keep executing other surrounding code until the callback is invoked. If you want more detail on how they work in general, look here, as previously mentioned by @dnp1204. I hope this answered your question.
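As a minimal, hypothetical sketch (the author_count task and the placeholder values are made up, not from the tutorial), this is how async.parallel supplies the callback to each task and then invokes the final function once every task has reported back:

    var async = require('async');

    async.parallel({
        book_count: function(callback) {
            // async.parallel provides this callback; call it when this task is done
            setTimeout(function() { callback(null, 42); }, 10); // placeholder async work
        },
        author_count: function(callback) {
            setTimeout(function() { callback(null, 7); }, 10);  // placeholder async work
        }
    }, function(err, results) {
        // Runs once every task above has called its callback.
        // results is e.g. { book_count: 42, author_count: 7 }
        console.log(err, results);
    });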

Wait for an event to happen before sending HTTP response in NodeJS?

I'm looking for a solution for waiting for an event to happen before sending an HTTP response.
Use Case
The idea is that I call a function in one of my routes: zwave.connect("/dev/ttyACM5"); This function returns immediately.
But there are two events that indicate whether connecting to the device succeeded or failed:
zwave.on('driver ready', function(){...});
zwave.on('driver failed', function(){...});
In my route, I would like to know whether the device succeeded or failed to connect before sending the HTTP response.
My "solution"
When an event happens, I save it in a database:
zwave.on('driver ready', function(){
    // In the database, save the fact that the event happened; here it's the event "CONNECTED"
});
In my route, I execute the connect function and wait for the event to appear in the database:
router.get('/', function(request, response, next) {
    zwave.connect("/dev/ttyACM5");
    waitForEvent("CONNECTED", 5, null, function(){
        response.redirect('/connected');
    });
});
// The function used to wait for the event
waitForEvent: function(eventType, nbCallMax, nbCall, callback) {
    if (nbCall == null) nbCall = 1;
    if (nbCallMax == null) nbCallMax = 1;
    // Look for the event in the database (returns true if the event happened, false otherwise)
    event = findEventInDataBase(eventType);
    if (event) {
        callback();
    } else {
        // setTimeout needs a function, not the result of calling one
        setTimeout(function() {
            waitForEvent(eventType, nbCallMax, nbCall + 1, callback);
        }, 1500);
    }
}
I don't think this is good practice, because it repeatedly polls the database.
So what are your opinions/suggestions about it?
I've gone ahead and added the asynchronous and control-flow tags to your question because at the core of it, that is what you're asking about. (As an aside, if you're not using ES6 you should be able to translate the code below back to ES5.)
TL;DR
There are a lot of ways to handle async control flow in JavaScript (see also: What is the best control flow module for node.js?). You are looking for a structured way to handle it, most likely Promises or the Reactive Extensions for JavaScript (a.k.a. RxJS).
Example using a Promise
From MDN:
The Promise object is used for asynchronous computations. A Promise represents a value which may be available now, or in the future, or never.
The async computation in your case is the computation of a boolean value describing the success or failure to connect to the device. To do so, you can wrap the call to connect in a Promise object like so:
const p = new Promise((resolve) => {
    // This assumes that the events are mutually exclusive
    zwave.connect('/dev/ttyACM5');
    zwave.on('driver ready', () => resolve(true));
    zwave.on('driver failed', () => resolve(false));
});
Once you have a Promise representing the state of the connection, you can attach functions to its "future" value:
// Inside your route file
const p = /* ... */;

router.get('/', function(request, response, next) {
    p.then(successful => {
        if (successful) {
            response.redirect('/connected');
        } else {
            response.redirect('/failure');
        }
    });
});
You can learn more about Promises on MDN, or by reading one of many other resources on the topic (e.g. You're Missing the Point of Promises).
Have you tried this? From the look of it, your zwave object has probably already implemented an EventEmitter; you just need to attach a listener to it:
router.get('/', function(request, response, next) {
    zwave.connect("/dev/ttyACM5");
    zwave.once('driver ready', function(){
        response.redirect('/connected');
    });
});
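If you also want to cover the failure case, a variation of the same idea (a sketch only, still assuming the 'driver ready' and 'driver failed' events from the question and the '/failure' route used in the Promise example above) could listen for both outcomes:

    router.get('/', function(request, response, next) {
        zwave.connect("/dev/ttyACM5");
        zwave.once('driver ready', function() {
            response.redirect('/connected');
        });
        zwave.once('driver failed', function() {
            response.redirect('/failure');
        });
        // Note: in a real handler you would remove the other listener once one
        // event has fired, so the response can never be sent twice.
    });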
There is also an npm sync module, which is used to synchronize the process of executing queries.
When you want to run parallel queries in a synchronous way, Node resists this because it never waits for a response; the sync module is well suited to that kind of solution.
Sample code
/* require the sync module */
var Sync = require('sync');

app.get('/', function(req, res, next) {
    story.find().exec(function(err, data) {
        var sync_function_data = find_user.sync(null, { name: "sanjeev" });
        res.send({ story: data, user: sync_function_data });
    });
});

/***** sync function defined here *****/
function find_user(req_json, callback) {
    process.nextTick(function () {
        users.find(req_json, function (err, data) {
            if (!err) {
                callback(null, data);
            } else {
                callback(null, err);
            }
        });
    });
}
reference link: https://www.npmjs.com/package/sync

Does Express.js support sending unbuffered progressively flushed responses?

Perl's Catalyst framework permits you to send a progressively flushed response over an open connection. You could, for instance, use write_fh() on Catalyst::Response. I've begun using Node.js, and I can't find how to do the equivalent.
If I want to send a big CSV file, on the order of 200 MB, is there a way to do that without buffering the whole CSV file in memory? Granted, the client will time out if you don't send data within a certain amount of time, so a promise would be nice, but is there any way to do this?
When I try to do a res.send(text) in a callback, I get
Express
500 Error: This socket has been ended by the other party
And, it doesn't seem that Express.js supports an explicit socket.close() or anything of the ilk.
Here is an example:
exports.foo = function (res) {
    var query = client.query("SELECT * FROM naics.codes");
    query.on('row', function(row) {
        //console.log(row);
        res.write("GOT A ROW");
    });
    query.on('end', function() {
        res.end();
        client.end();
    });
};
I would expect that to send "GOT A ROW" for each row, until the call to client.end() signifies completion.
Express is built on the native HTTP module, which means res is an instance of http.ServerResponse, which inherits from the writable stream interface. That said, you can do this:
app.get('/', function(req, res) {
    var stream = fs.createReadStream('./file.csv');
    stream.pipe(res);

    // or, instead of pipe(), use event handlers
    stream.on('data', function(data) {
        res.write(data);
    });
    stream.on('end', function() {
        res.end();
    });
});
The reason you can't use the res.send() method in Express for streaming is that it ends the response for you automatically.
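Applied to the row-by-row query from the question, the same writable-stream interface looks roughly like this (a sketch only, reusing the client.query 'row'/'end' events shown above; the Content-Type and per-row formatting are assumptions):

    exports.foo = function (req, res) {
        res.setHeader('Content-Type', 'text/plain'); // send headers before streaming
        var query = client.query("SELECT * FROM naics.codes");
        query.on('row', function(row) {
            res.write(JSON.stringify(row) + '\n');   // flush each row as it arrives
        });
        query.on('end', function() {
            res.end();    // finish the HTTP response
            client.end(); // close the database connection
        });
    };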

Request not ending in node.js (used to work)

I have this piece of code:
var app = require('http').createServer(function(req, res){
    console.log(req);
    req.addListener('end', function () {
        fileServer.serve(req, res);
    });
});

var statics = require('node-static');
var fileServer = new statics.Server('./');

app.listen(1344, '127.0.0.1');

app.on('error', function(err){
    console.log(err);
});
It was working just fine until I made a couple of changes; Node gave an error, and when I reverted them, the error was gone. But instead of working like before, the end event is no longer fired, so anything inside req.addListener('end', function (){}); is not called.
And even if I run another Node.js app that uses the same event, it is not fired either. So it is as if the end event of the request is broken. But how can that be possible?
This is not the first time it has happened. Last time I ended up re-installing Node (after trying lots of different things). I would prefer to find a solution, so I can understand the problem!
NOTE: The original code includes socket.io and other kinds of connections, but I've pasted just the piece of code where the app gets stuck.
It could also be useful to know how to debug the problem!
@InspiredJW should get credit for pointing this out, since I had forgotten about it, but undoubtedly your problem is due to the changes to readable streams. In order for the end event to be emitted you either have to attach a listener to the data event, or you have to call stream.resume().
require('http').createServer(function(req, res){
    req.addListener('end', function () {
        // won't ever get called in node v0.10.3
    });
});

require('http').createServer(function(req, res){
    req.addListener('end', function () {
        // will get called in node v0.10.3 because we called req.resume()
    });
    req.resume();
});

require('http').createServer(function(req, res){
    req.on('data', function (chunk) { });
    req.addListener('end', function () {
        // also will get called because we attached a data event listener
    });
});
http://nodejs.org/api/stream.html#stream_compatibility

Delaying events until a callback is called (Node.js)

I am using Node.js with Express and have code similar to this as part of my routes:
requireLogin: function(req, res, next) {
    User.find(req.session.userId)
        .on('success', function(user) {
            req.addListener('data', function(chunk) {
                console.log("DATA: " + chunk);
            });
            next();
        });
}
I am using Sequelize and the User.find method is accessing the database. The trouble is, the request 'data' event that I bind to is never fired. It seems that the data event had already been triggered and handled by the time the user is returned from the database and it's too late to do anything with it. In the example above I could just move the req.addListener to outside the database callback, but in reality I am calling next() here which can't be moved.
All of the following route middleware that is called by next() then doesn't have access to the request data since these events have already been fired. Worse than that, they just hang waiting for the data event from req because it has already happened.
How can I somehow delay the data event so that it can be bound to from within the database callback? Or have I misunderstood something fundamental and need to change my way of going about this?
Thanks a lot.
Edit: I found a relevant discussion in the nodejs Google group which suggests there isn't a solution that will work for me.
var cache = new function () {
    var arr = [],
        cbs = [];

    this.add = function(data) {
        arr.push(data);
        cbs.forEach(function(cb) {
            cb(arr);
        });
    };

    this.get = function(cb) {
        cbs.push(cb);
        if (arr.length > 0) {
            cb(arr);
        }
    };
};
req.addListener('data', function(chunk) {
    cache.add(chunk);
});
User.find(
    req.session.userId
).on('success', function(user) {
    cache.get(function(data) {
        // stuff
        next();
    });
});
I presume what you actually want is some kind of message caching. Now this is a vague proof of concept. What you actually want depends on your code.
If you have any kind of deferred library / abstraction available then the code will become a lot smaller.
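For instance, here is a hedged sketch of the same idea using a plain Promise in place of the hand-rolled cache (assuming a Node version with Promise support, which postdates the original answer):

    // Buffer the request body while the database call is in flight.
    // Attaching the 'data' listener up front keeps the stream flowing.
    var chunks = [];
    var body = new Promise(function(resolve) {
        req.on('data', function(chunk) { chunks.push(chunk); });
        req.on('end', function() { resolve(Buffer.concat(chunks)); });
    });

    User.find(req.session.userId).on('success', function(user) {
        body.then(function(data) {
            console.log("DATA: " + data);
            next();
        });
    });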
