Multiple clients posting data in Node.js

I've read that in Node.js one should treat POST requests carefully, because the POST data may arrive in chunks, so it has to be handled like this, concatenating:
function handleRequest(request, response) {
    if (request.method == 'POST') {
        var body = '';
        request.on('data', function (data) {
            body += data;
        });
        request.on('end', function () {
            // data is complete here
        });
    }
}
What I don't understand is how this code snippet will handle several clients at the same time. Say two separate clients start uploading large POST data: won't it all be appended to the same body, mixing up the data?
Or does the framework handle this, triggering a separate instance of the handleRequest function for each request so that the body variables don't get mixed up?
Thanks.

Given the (request, response) signature of your method, it looks like it's a listener for the request event.
Assuming that's correct, this event is emitted for every new request, so as long as you only concatenate new data onto a body variable that is unique to each invocation of the handler (as in your current example), you're good to go.
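To make this concrete, here is a minimal sketch of the registration side (the createServer call and the port are assumptions, since they aren't shown in the question). Every incoming request fires the 'request' event, which invokes handleRequest afresh:

var http = require('http');

// Each call to handleRequest creates a new closure with its own
// `body` variable, so two simultaneous uploads never share state.
http.createServer(handleRequest).listen(8080);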

Related

NodeJS ending https response early

I have a Node.js application that sends HTTP requests to an external API and retrieves JSON data (in the form of a hex buffer) in response, using the https module.
However, when I try to pull large data sets from the API, the data is incomplete. The JSON cuts off about a third of the way through, and trying to parse it produces an error. I don't think it's a problem with my parsing (toString), because res.complete is never triggered, so the response clearly isn't finishing.
Is there a way to force my request to wait for res.complete before resolving?
My request looks like this:
const req = https.get(pull_options, res => {
    res.on('data', d => {
        if (res.complete) {
            resolve(d.toString());
        } else {
            console.error('Connection terminated while message was still being sent.');
        }
    });
});
I really don't think it's a problem with the API cutting me off, because I'm able to pull the same data set with nearly identical code in Python with no issues.
let output = '';
const req = https.get(pull_options, res => {
    res.on('data', d => {
        output += d;
    });
    res.on('end', () => {
        resolve(output);
    });
});
Moving the resolve into the 'end' handler, so that all the data has come in first, worked for me.
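For context, a complete version of that pattern would wrap the request in a Promise, which also explains where resolve comes from in the snippets above (pull_options stands in for whatever host/path options the original code used):

const https = require('https');

function pullData(pull_options) {
    return new Promise((resolve, reject) => {
        let output = '';
        https.get(pull_options, res => {
            // 'data' fires once per chunk; 'end' fires only after the
            // last chunk, so resolving there guarantees a complete body.
            res.on('data', d => { output += d; });
            res.on('end', () => resolve(output));
        }).on('error', reject);
    });
}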

When an Express.js middleware modifies the chunk that goes into res.end(), response times go up by 10x

Node.js version: 14.16, Express version: 4.17
We use express-winston for logging our Express responses. Since we want additional information to go into our logs that we don't want our end-users to see, we decided to include a middleware that wraps res.end(), so as to intercept the chunk sent by express-winston and remove the additional data.
For reference, here's the line of code in express-winston that calls res.end(), whose chunk we want to replace before the response is sent to the end-user, without altering the chunk that is logged:
https://github.com/bithavoc/express-winston/blob/bdba1d39965f83b003178646d213cd974b090326/index.js#L317
Here is a sample middleware that we wrote:
module.exports.responseMiddleware = (req, res, next) => {
    const { end } = res;
    res.end = (chunk, encoding) => {
        res.end = end;
        // If alterBody is enabled, the chunk sent to res.end is modified
        const resultChunk = req.body.alterBody === 'yes'
            ? Buffer.from(JSON.stringify({}))
            : chunk;
        res.end(resultChunk, encoding);
    };
    next();
};
The original response is sent with a call to res.status(...).json(...)
What we found was that when we enable alterBody, the response time goes up by 10x (from 500ms to 5s).
What could be the reason for this? Is there a way that we can maintain the original response time, while also logging and sending two different chunks?

Node http.createServer how to buffer incoming requests

I'm building a small Node server that generates PDF files (using Nightmare.js). Each request calls createPage to generate one PDF.
The incoming requests tend to all arrive around the same time, overloading the PC this is running on.
I need to buffer the incoming requests to delay execution of some requests until some of the current requests have completed. How do I do this?
var http = require('http');

function createPage(o, final) {
    // generate pdf files
}

http.createServer(function (request, response) {
    var body = [];
    request.on('data', function (chunk) {
        body.push(chunk);
    }).on('end', function () {
        body = Buffer.concat(body).toString();
        var json = JSON.parse(body);
        createPage(json, function (status) {
            if (status === true) {
                response.writeHead(200, { 'Content-Length': 0 });
                console.log('status good');
            } else {
                response.writeHead(500, { 'Content-Type': 'text/html' });
                response.write(' ' + status);
            }
            response.end('\nEnd of Request \n');
        });
    });
}).listen(8007);
If I understand correctly, you want to continually accept HTTP requests but throttle the rate at which createPage is invoked. If so, you probably need to consider a slightly different design: in the current one, each subsequent client has to wait longer than the previous one to find out whether their request succeeded or failed.
Approach 1:
use a queue (RabbitMQ, AWS SQS, ZeroMQ, Kafka, etc.).
Here's the basic workflow:
receive the request
generate a unique id
put a message on the queue that includes the data and the unique id
return the unique id to the client
the client periodically checks for the completion of the task using the unique id
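Here is a minimal sketch of Approach 1, using an in-memory queue as a stand-in for a real broker (RabbitMQ, SQS, etc.). The concurrency limit, the id scheme, and the URL layout are all assumptions made for illustration; createPage is the PDF generator from the question:

var http = require('http');
var crypto = require('crypto');

var jobs = {};           // id -> 'pending' | 'done' | 'failed'
var queue = [];
var running = 0;
var MAX_CONCURRENT = 2;  // assumed limit; tune for your machine

function drain() {
    // Start queued jobs until the concurrency limit is reached
    while (running < MAX_CONCURRENT && queue.length > 0) {
        var job = queue.shift();
        running++;
        createPage(job.data, function (status) {
            jobs[job.id] = status === true ? 'done' : 'failed';
            running--;
            drain();
        });
    }
}

http.createServer(function (request, response) {
    if (request.method === 'POST') {
        var body = [];
        request.on('data', function (chunk) {
            body.push(chunk);
        }).on('end', function () {
            var id = crypto.randomBytes(8).toString('hex');
            jobs[id] = 'pending';
            queue.push({ id: id, data: JSON.parse(Buffer.concat(body).toString()) });
            drain();
            // Return the id immediately; the client polls with it later
            response.end(JSON.stringify({ id: id }));
        });
    } else {
        // GET /status/<id> -- the client checks for completion
        var id = request.url.split('/').pop();
        response.end(JSON.stringify({ status: jobs[id] || 'unknown' }));
    }
}).listen(8007);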
Approach 2:
Use a queue with message duplexing.
receive the request
generate a correlation id and relate it to the http transaction
send message on queue to worker with correlation id
when worker completes, it sends the response back with the correlation id
server uses correlation id to find the http transaction and send the appropriate response to the client
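And a sketch of the correlation-id bookkeeping in Approach 2. workerChannel here is a hypothetical duplex queue client; the point is only that the server parks each pending response object under an id and completes it when the worker's reply comes back:

var pending = {};  // correlation id -> http response object

function handleWork(id, response, data) {
    // Relate the correlation id to the open http transaction
    pending[id] = response;
    workerChannel.send({ correlationId: id, data: data });
}

workerChannel.on('message', function (msg) {
    var response = pending[msg.correlationId];
    if (response) {
        delete pending[msg.correlationId];
        response.end(JSON.stringify(msg.result));
    }
});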

Node.js and understanding how response works

I'm really new to Node.js, so please bear with me if I'm making an obvious mistake.
To understand Node.js, I'm trying to create a webserver that basically:
1) updates the page by appending "hello world" every time the root URL (localhost:8000/) is hit.
2) lets the user go to another URL (localhost:8000/getChatData) that displays all the data built up from the root URL (localhost:8000/) being triggered.
Problem I'm experiencing:
1) I'm having an issue displaying that data on the rendered page. I have a timer that should call get_data() every second and update the screen with the data variable that stores the appended output. Specifically, the line response.simpleText(200, data); below isn't working correctly.
The file:
// Load the node-router library by creationix
var server = require('C:\\Personal\\ChatPrototype\\node\\node-router').getServer();
var data = null;

// Configure our HTTP server to respond with "hello world" on the root request
server.get("/", function (request, response) {
    if (data != null) {
        data = data + "hello world\n";
    } else {
        data = "hello world\n";
    }
    response.writeHead(200, {'Content-Type': 'text/plain'});
    console.log(data);
    response.simpleText(200, data);
    response.end();
});

// Configure our HTTP server to respond with the accumulated chat data
server.get("/getChatData", function (request, response) {
    setInterval(function () { get_data(response); }, 1000);
});

function get_data(response) {
    if (data != null) {
        response.writeHead(200, {'Content-Type': 'text/plain'});
        response.simpleText(200, data);
        console.log("data: " + data);
        response.end();
    } else {
        console.log("no data");
    }
}

// Listen on port 8000 on localhost
server.listen(8000, "localhost");
If there is a better way to do this, please let me know. The goal is basically to have a way for a server to call a URL to update a variable, and have another HTML page report/display the updated data dynamically every second.
Thanks,
D
The client-server model works by a client sending a request to the server, and the server in return sending a response. The server cannot send a response the client hasn't asked for; the client initiates the request. Therefore you cannot have the server changing the response object on an interval.
The client will not see those changes. Something like this is usually handled through AJAX: the initial response from the server sends JavaScript code to the client, which then makes requests to the server on an interval.
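As an illustration, the page served at the root could include a script along these lines, which polls /getChatData every second from the browser (the "chat" element id is an assumption for the sketch):

setInterval(function () {
    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
        // Replace the page contents with the latest chat data
        document.getElementById('chat').textContent = xhr.responseText;
    };
    xhr.open('GET', '/getChatData');
    xhr.send();
}, 1000);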
setInterval accepts a function without parameters, which makes sense since it will be executed later in time; any values that function needs must still be available at that point. In your case, the response object you are trying to pass is a local instance whose scope is limited to server.get's callback (where you set up the setInterval).
There are several ways you can resolve this issue: you can keep a copy of the response instance in the outer scope where get_data lives, or you can move get_data entirely inside the callback and remove the setInterval. The first solution is not recommended, because if getChatData is called several times within a second, only the last copy will prevail.
But my suggestion would be to keep the data in a database and return it when getChatData is called.
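For the second suggestion, the handler could look something like this, mirroring the root handler from the question: it responds once per request instead of on a timer, so the response object is still in scope:

server.get("/getChatData", function (request, response) {
    response.writeHead(200, {'Content-Type': 'text/plain'});
    if (data != null) {
        // Respond immediately with whatever has accumulated so far
        response.simpleText(200, data);
    } else {
        response.simpleText(200, "no data yet\n");
    }
    response.end();
});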

Fetch data from multiple tables by sending only one request

I am using Node.js for server-side development and Backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to Node.js, but I can't merge all the results with each other because of Node.js's asynchronous execution. I have tried this, but it ends up sending lots of GET requests to Node.js to get the data from all the tables, and because of this my site's performance has become slower. Please help if anyone has any ideas.
I would create a method that aggregates the results from each of the requests and sends the response back. Basically, each of your three async db calls would pass its data to the same method. That method would check whether it had all of the data it needed to complete the request and, if it did, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
    var results = {};
    db.getUsers(function (data) {
        aggregate('users', data);
    });
    db.getPosts(function (data) {
        aggregate('posts', data);
    });
    db.getComments(function (data) {
        aggregate('comments', data);
    });
    function aggregate(name, data) {
        results[name] = data;
        if (results.users && results.posts && results.comments) {
            res.send(results);
        }
    }
}
This is simplified greatly; you should of course also check for errors and timeouts on the db calls, but this pattern lets you wait for all the async calls to complete before sending the data.
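A more modern variant of the same idea, assuming each db call is (or can be wrapped as) a promise-returning function, is Promise.all, which aggregates the three results and fails fast on the first error:

function handleRequest(req, res) {
    Promise.all([db.getUsers(), db.getPosts(), db.getComments()])
        .then(function (values) {
            // values arrive in the same order as the calls above
            res.send({ users: values[0], posts: values[1], comments: values[2] });
        })
        .catch(function (err) {
            res.status(500).send({ error: err.message });
        });
}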
