Node http.createServer how to buffer incoming requests - node.js

I'm building a small node server that generates PDF files (using Nightmare.js). Each request calls createPage to generate one pdf.
The incoming requests tend to all arrive around the same time, overloading the PC this is running on.
I need to buffer the incoming requests to delay execution of some requests till some of the current requests have completed. How do I do this?
function createPage(o, final) {
    // generate pdf files
}

http.createServer(function (request, response) {
    var body = [];
    request.on('data', function (chunk) {
        body.push(chunk);
    }).on('end', function () {
        body = Buffer.concat(body).toString();
        var json = JSON.parse(body);
        createPage(json, function (status) {
            if (status === true) {
                response.writeHead(200, { 'Content-Length': 0 });
                console.log('status good');
            } else {
                response.writeHead(500, { 'Content-Type': 'text/html' });
                response.write(' ' + status);
            }
            response.end('\nEnd of Request \n');
        });
    });
}).listen(8007);

If I understand correctly, you want to continually accept HTTP requests but throttle the rate at which createPage is invoked. If so, you probably need to consider a slightly different design: as it stands, each new client will have to wait longer than the previous one to find out whether their request succeeded or failed.
Approach 1:
Use a queue (RabbitMQ, AWS SQS, ZeroMQ, Kafka, etc.).
Here's the basic workflow:
receive the request
generate a unique id
put a message on the queue that includes the data and the unique id
return the unique id to the client
the client periodically checks for the completion of the task using the unique id
Approach 2:
Use a queue with message duplexing.
receive the request
generate a correlation id and relate it to the http transaction
send message on queue to worker with correlation id
when worker completes, it sends the response back with the correlation id
server uses correlation id to find the http transaction and send the appropriate response to the client
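If an external broker feels like overkill for a single machine, the same throttling idea can be sketched in-process. The concurrency limit and the job/callback shape below are assumptions for illustration, not part of the question's code:

```javascript
// Minimal in-process throttle: never run more than MAX_CONCURRENT jobs at once.
const MAX_CONCURRENT = 2;
const queue = [];
let running = 0;

function enqueue(job, done) {
    queue.push({ job: job, done: done });
    drain();
}

function drain() {
    while (running < MAX_CONCURRENT && queue.length > 0) {
        const next = queue.shift();
        running += 1;
        next.job(function (result) {  // job calls back when finished
            running -= 1;
            next.done(result);
            drain();                  // start the next waiting job, if any
        });
    }
}
```

The HTTP handler would then call something like enqueue(createPage.bind(null, json), sendResponse) instead of calling createPage directly. Note that clients still wait longer as the queue grows, which is why the asynchronous designs above are preferable under real load.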

Related

Communication between two Node.js express requests

I have two clients. Client 1 sends an HTTP request to a Node.js Express server, which writes to the DB and then waits for a value to change. The value changes when client 2 sends an HTTP request to the same server, which UPDATEs the record in the DB. Then the first request sends a JSON response to the first client.
The idea is that the first client's request waits for a signal from the second client's request without busy waiting (like reading from the database every 100 ms). Here is the code (uniqueid is a value that is the PRIMARY KEY of that table's row):
var foo = function(uniqueid, client, cb){
    var query = "UPDATE... WHERE uniqueid=" + db.escape(uniqueid) + ";";
    db.query(query, function(err, result, fields){
        if(client == 1){ // if we are client 1, we wait
            waitClient2(uniqueid, function(callback){
                // then read the new value
                getValue(uniqueid, function(v){
                    cb(v);
                });
            });
        } else { // else unlock client 1
            unlockClient1(uniqueid);
            // then read the new value
            getValue(uniqueid, function(v){
                cb(v);
            });
        }
    });
}

var waitClient2 = function(uniqueid, cb){
    // wait(uniqueid)
    // cb(0) to unlock foo
}

var unlockClient1 = function(uniqueid){
    // signal(uniqueid)
}
Is there a way to send a message (signal) from one client's request to the other so that the waiting request can unblock itself, by implementing the last two functions?
I appreciate other ideas too.
Thank you.

How to poll another server periodically from a node.js server?

I have a node.js server A with MongoDB as the database.
There is another remote server B (it doesn't need to be Node based) which exposes an HTTP GET API '/status' and returns either 'FREE' or 'BUSY' as the response.
When a user hits a particular API endpoint in server A (say POST /test), I wish to start polling server B's status API every minute, until server B returns 'FREE' as the response. The user doesn't need to wait until server B returns 'FREE' (polling B is a background job in server A). Once server A gets a 'FREE' response from B, it should send an email to the user.
How can this be achieved in server A, keeping in mind that the number of concurrent users can grow large?
I suggest you use Agenda. https://www.npmjs.com/package/agenda
With agenda you can create recurring schedules under which you can schedule anything pretty flexible.
I suggest you use request module to make HTTP get/post requests.
https://www.npmjs.com/package/request
Going from the example in the node.js docs, I'd go with something like the code below. I tested it and it works. BTW, I'm assuming here that the API response is something like {"status":"BUSY"} or {"status":"FREE"}.
const http = require('http');

const poll = {
    pollB: function() {
        http.get('http://serverB/status', (res) => {
            const { statusCode } = res;
            let error;
            if (statusCode !== 200) {
                error = new Error(`Request Failed.\n` +
                                  `Status Code: ${statusCode}`);
            }
            if (error) {
                console.error(error.message);
                res.resume();
            } else {
                res.setEncoding('utf8');
                let rawData = '';
                res.on('data', (chunk) => { rawData += chunk; });
                res.on('end', () => {
                    try {
                        const parsedData = JSON.parse(rawData);
                        // The important logic comes here
                        if (parsedData.status === 'BUSY') {
                            setTimeout(poll.pollB, 10000); // request again in 10 secs
                        } else {
                            // Call the background process you need to
                        }
                    } catch (e) {
                        console.error(e.message);
                    }
                });
            }
        }).on('error', (e) => {
            console.error(`Got error: ${e.message}`);
        });
    }
};

poll.pollB();
You probably want to play with this script and get rid of the code you don't need, but that's homework ;)
Update:
For coping with a lot of concurrency in node.js, I'd recommend implementing a cluster or using a framework. Here are some links to start researching the subject:
How to fully utilise server capacity for Node.js Web Apps
How to Create a Node.js Cluster for Speeding Up Your Apps
Node.js v7.10.0 Documentation :: cluster
ActionHero.js :: Fantastic node.js framework for implementing an API, background tasks, cluster using http, sockets, websockets
Use a library like request, superagent, or restify-clients to call server B. I would recommend avoiding polling and instead using a webhook when calling B (assuming you are also the author of B). If you can't change B, then setTimeout can be used to schedule subsequent calls on a 1 second interval.
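The setTimeout approach the answers describe can be sketched as a small per-user background job. Here, checkStatus and sendEmail are stand-ins (assumptions) for the real HTTP call to server B and the real mailer, so the scheduling logic stays visible on its own:

```javascript
// Start one background polling loop for a user; resolve it by email when B is FREE.
function startPolling(userEmail, checkStatus, sendEmail, intervalMs) {
    function tick() {
        checkStatus(function (status) {
            if (status === 'FREE') {
                sendEmail(userEmail);          // B is free: notify the user
            } else {
                setTimeout(tick, intervalMs);  // still BUSY: poll again later
            }
        });
    }
    tick();
}
```

A POST /test handler would call startPolling(user.email, realCheck, realMailer, 60 * 1000) and return immediately, since the user doesn't wait for the result.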

How to send a response two or more times via callbacks for one request

I'm using express, and the POST request looks like this:
router.post('/', function(req, res, next){
    var data = req.body;
    getRandom(data, function(value){
        res.json({value: value});
    });
});
The POST is sent through AJAX, and the textarea is then updated with the new data.
$.ajax({
    type: "POST",
    url: "/",
    data: JSON.stringify(datareq),
    dataType: 'json',
    contentType: 'application/json',
    success: function(x){
        $.each(x, function(index, value) {
            $('.textarea').append(value + '\n');
        });
    },
    error: function(x) {
        console.log(x + 'error');
    }
});
How can I send this using one POST and several responses? The user receives one piece of data in the textarea when the callback finishes, then another, and so on until the end.
<textarea>
data 1 - 1sec
data 2 - 2sec leater
data 3 - 3 second later
...
</textarea>
I added the times (1 sec ...) only to show that the callback has a lot of work to do before it can send the next piece of data.
Of course this doesn't work, because res.send() closes the connection and I get an error.
So how do I achieve my idea of sending multiple pieces of data after one POST request? I want to give the user data very quickly, then another piece when it is ready, not waiting for everything before sending a response.
You can't.
Reason:
HTTP closes the connection after sending the response. You cannot keep it open and send multiple responses to the client; HTTP doesn't support it.
Solution 1:
Simply put a timer on the client side and request periodically.
Solution 2 (Recommended):
Use sockets and pass the data through them. socket.io is a socket library for Node.js applications and is very easy to use: set up a connection, keep sending data from the server, and receive it on the client side.
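To make Solution 2 concrete on the server side, here is a tiny sketch of staged pushing. sendFn is a stand-in (an assumption) for whatever transport you wire up, e.g. a socket.io emit:

```javascript
// Push items to the client one at a time as each becomes "ready".
// sendFn stands in for the real transport, e.g. socket.emit('chunk', item).
function pushInStages(sendFn, items, delayMs, done) {
    var i = 0;
    (function next() {
        if (i >= items.length) {
            if (done) done();       // all pieces delivered
            return;
        }
        sendFn(items[i]);           // client appends this to the textarea
        i += 1;
        setTimeout(next, delayMs);  // next piece after the work is done
    })();
}
```

On the client, the matching socket listener would append each received piece to the textarea, exactly as the $.each loop does in the AJAX version.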
Just to add on to the answer above: it explains why res.send closes the connection.

Node.js and understanding how response works

I'm really new to node.js, so please bear with me if I'm making an obvious mistake.
To understand node.js, I'm trying to create a webserver that basically:
1) updates the page by appending "hello world" every time the root URL (localhost:8000/) is hit.
2) lets the user go to another URL (localhost:8000/getChatData) to display all the data built up from the root URL being hit.
Problem I'm experiencing:
1) I'm having issues displaying that data on the rendered page. I have a timer that should call get_data() every second and update the screen with the data variable that stores the appended output. Specifically, the line response.simpleText(200, data); isn't working correctly.
The file
// Load the node-router library by creationix
var server = require('C:\\Personal\\ChatPrototype\\node\\node-router').getServer();

var data = null;

// Configure our HTTP server to respond with Hello World on the root request
server.get("/", function (request, response) {
    if (data != null) {
        data = data + "hello world\n";
    } else {
        data = "hello world\n";
    }
    response.writeHead(200, {'Content-Type': 'text/plain'});
    console.log(data);
    response.simpleText(200, data);
    response.end();
});

// Configure our HTTP server to serve the accumulated data on /getChatData
server.get("/getChatData", function (request, response) {
    setInterval(function() { get_data(response); }, 1000);
});

function get_data(response) {
    if (data != null) {
        response.writeHead(200, {'Content-Type': 'text/plain'});
        response.simpleText(200, data);
        console.log("data: " + data);
        response.end();
    } else {
        console.log("no data");
    }
}

// Listen on port 8000 on localhost
server.listen(8000, "localhost");
If there is a better way to do this, please let me know. The goal is basically to have a way for the server to update a variable when one URL is hit, and to have another HTML page report/display the updated data dynamically every second.
Thanks,
D
The client-server model works by the client sending a request to the server and the server sending back a response. The server cannot send the client a response the client hasn't asked for; the client initiates every request. Therefore you cannot have the server changing the response object on an interval.
The client will never see those changes. Something like this is usually handled with AJAX: the initial response from the server includes JavaScript code that makes the client issue new requests to the server on an interval.
setInterval accepts a function with no parameters, which makes sense, since it will be executed later in time; all values that function needs must still be available at that point. In your case, the response object you are passing is a local instance whose scope is only server.get's callback (where you set up the setInterval).
There are several ways to resolve this issue. You can keep a copy of the response instance in the outer scope where get_data lives, or you can move get_data entirely inside the handler and remove the setInterval. The first solution is not recommended: if getChatData is called several times within a second, only the last copy will prevail.
But my suggestion would be to keep the data in a database and show it when getChatData is called.
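The "move get_data inside the handler" fix can be sketched like this: respond exactly once per request with whatever has accumulated so far, instead of holding onto the response and writing to it on a timer. This is a minimal sketch; the handler is kept as a plain function rather than tied to node-router:

```javascript
// `data` stands in for the module-level accumulator from the question.
var data = null;

// Respond once per request: no setInterval, no saved response objects.
function getChatData(request, response) {
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    response.end(data != null ? data : 'no data\n');
}
```

The client-side page then polls this endpoint every second, which matches the AJAX-on-an-interval approach described in the first answer.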

Multiple clients posting data in node js

I've read that in Node.js one should treat POST requests carefully because the POST data may arrive in chunks, so it has to be handled like this, concatenating:
function handleRequest(request, response) {
    if (request.method == 'POST') {
        var body = '';
        request.on('data', function (data) {
            body += data;
        });
        request.on('end', function () {
            // data is complete here
        });
    }
}
What I don't understand is how this code snippet handles several clients at the same time. Let's say two separate clients start uploading large POST data: won't they be added to the same body, mixing up the data?
Or is it the framework that handles this, invoking a separate instance of the handleRequest function for each request so that the data doesn't get mixed up in the body variable?
Thanks.
Given the (request, response) signature of your method, it looks like it's a listener for the request event.
Assuming that's correct, this event is emitted for every new request, so as long as you only concatenate new data onto a body variable that is unique to each invocation of the handler (as in your current example), you're good to go.
