fetch data from multiple tables by sending only one request - node.js

I am using Node.js for server-side development and Backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to Node.js, but I can't merge all of the results together because of the asynchronous execution of Node.js. What I have now sends a lot of GET requests to Node.js to get the data from all the tables, and because of this the performance of my site has become slower. Please help if anyone has any ideas.

I would create a method which aggregates the results from each of the requests and sends the response back. Basically, each of your three async DB calls would pass its data to the same method. That method would check whether it has all of the data it needs to complete the request and, if it does, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
  var results = {};

  db.getUsers(function (data) {
    aggregate('users', data);
  });
  db.getPosts(function (data) {
    aggregate('posts', data);
  });
  db.getComments(function (data) {
    aggregate('comments', data);
  });

  // Store each result and send the response once all three have arrived.
  function aggregate(name, data) {
    results[name] = data;
    if (results.users && results.posts && results.comments) {
      res.send(results);
    }
  }
}
This is greatly simplified; you should of course also check for errors and timeouts on the DB calls, but this approach lets you wait for all of the async calls to complete before sending the data.
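If your data layer can return promises (an assumption; the db.getUsers, db.getPosts and db.getComments names below are the same placeholders used in the pseudo code above), the same aggregation can be written more compactly with Promise.all:

// A minimal sketch, assuming each db method returns a promise.
async function handleRequest(req, res) {
  try {
    const [users, posts, comments] = await Promise.all([
      db.getUsers(),
      db.getPosts(),
      db.getComments()
    ]);
    res.send({ users: users, posts: posts, comments: comments });
  } catch (err) {
    // If any query fails, Promise.all rejects and the error is reported once.
    res.status(500).send({ error: err.message });
  }
}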

Related

Asynchronous processing of data in Expressjs

I have an Express route which receives some data, processes it, and then inserts it into Mongo (using Mongoose).
This works well if I return a response after the following steps are done:
Receive request
Process the request data
Insert the processed data into Mongo
Return 204 response
But the client will be calling this API concurrently for millions of records, so the requirement is not to block the client while the data is processed. Hence I made a small change in the code:
Receive request
Return response immediately with 204
Process the requested data
Insert the processed data into Mongo
The above works fine for the first few requests (say the first thousand or so); after that the client gets a socket exception: connection reset by peer. I guess it is because the server is blocking connections as ports are not freed, and at some point I notice my Node.js process throwing an out-of-memory error.
Sample code is as follows:
async function enqueue(data) {
  // 1. Process the data
  // 2. Insert the data in mongo
}

async function expressController(request, response) {
  logger.info('received request')
  response.status(204).send()

  try {
    await enqueue(request.body)
  } catch (err) {
    throw new Error(err)
  }
}
Am I doing something wrong here?
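One pattern that can help here (a sketch under stated assumptions, not the original code: the queue, the limit of 10, and the schedule helper are all made up for illustration) is to stop awaiting enqueue inside the controller and instead push the body onto a small in-memory queue that only lets a fixed number of enqueue calls run at once:

// Sketch: bound how many enqueue() calls run concurrently (the limit of 10 is arbitrary).
const pending = []
let running = 0
const LIMIT = 10

function schedule(data) {
  pending.push(data)
  drain()
}

function drain() {
  while (running < LIMIT && pending.length > 0) {
    running++
    const data = pending.shift()
    enqueue(data)
      .catch(err => logger.error(err))
      .finally(() => {
        running--
        drain()
      })
  }
}

async function expressController(request, response) {
  response.status(204).send()
  schedule(request.body)
}

This keeps the 204-first behaviour while preventing an unbounded number of in-flight Mongo writes from accumulating; if ingestion outpaces processing for long periods, the pending array itself still grows, and a persistent queue would be the next step.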

How to return 2 responses for a single request in node.js

My Node app is simple: for a request querying the total customers in a MySQL database, I make a blocking call using await to wait for the query to finish.
The problem is that it can only handle ~75 requests per second, which is too low.
So I try to return 200 as soon as I get the request, telling the caller the request was received.
Then I return the query result when it is ready; the MySQL query could take a while.
But it is not working for me yet. This is the code:
router:
router.get('', controller.getCustomers);
controller:
const getCustomers = (req, res) => {
  try {
    service.getCustomers(res);
    res.write('OK');
    //res.send(200); this will end the response before query ends
  } catch (err) {
    res.end(err);
  }
}
service:
const getCustomers = async (res) => {
  customers = await mysqlPool.query('select * from cusomerTable');
  res.send(customers);
}
Error: Can't set headers after they are sent to the client.
How can I fix this, please?
You can use Server-Sent Events: send an event as soon as you get the request, and send another event when the result is ready. My GitHub repo has an SSE sample you may find useful.
OR
You can just use res.write() to stream the response. This StackOverflow answer may also be useful.
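A minimal sketch of the SSE idea applied to the customer query (the route path is an assumption, and the query is taken from the service code above):

// Sketch: stream two events over one response instead of sending two responses.
router.get('/customers/stream', async (req, res) => {
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });

  // First event: acknowledge the request immediately.
  res.write('event: accepted\ndata: {}\n\n');

  try {
    const customers = await mysqlPool.query('select * from cusomerTable');
    // Second event: the query result, once it is ready.
    res.write('data: ' + JSON.stringify(customers) + '\n\n');
  } catch (err) {
    res.write('event: error\ndata: ' + JSON.stringify({ message: err.message }) + '\n\n');
  }
  res.end();
});

This avoids the "Can't set headers after they are sent" error because only res.write() and res.end() are used on the single response.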

Node.js multiple requests - no responses

I am trying to get data from a web API with axios (Node.js). I need to execute approximately 200 requests with different URLs to fetch some data for further analysis. I tried several libraries for the HTTP callouts, but in every case I had the same issue: I did not receive a success or error callback. The request just got stuck somewhere.
async function sendRequest(url) {
  let resp = await axios.get(url);
  return resp.data;
}
I am calling this function in a for loop:
for (var url in urls) {
  try {
    setData(url)
  } catch (e) {
    console.log(e);
  }
}
async function setData(url) {
  var data = await sendRequest(url);
  // Set this data in global variable.
  globalData[url] = data;
}
I often received this error:
Error: read ECONNRESET
I think this is all connected to sending too many requests in a small interval.
What should I do to receive all of the responses? My temporary fix is to periodically send 20 requests every 20 seconds (still not OK, but I receive more responses). But this is slow and takes too much time.
However, I need all the data from the 200 requests in one variable for further analysis. If I wait for every request sequentially, it takes too much time.
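A sketch of one way to collect all 200 results in a single object while capping how many requests are in flight at once (the chunk size of 20 is an assumption, borrowed from the temporary fix described above, and fetchAll is a made-up helper name):

// Process the URLs in chunks so only a limited number of requests run at once.
async function fetchAll(urls, chunkSize) {
  const results = {};
  for (let i = 0; i < urls.length; i += chunkSize) {
    const chunk = urls.slice(i, i + chunkSize);
    await Promise.all(chunk.map(async (url) => {
      try {
        const resp = await axios.get(url);
        results[url] = resp.data;
      } catch (e) {
        // Record the failure and keep going with the rest of the chunk.
        console.log(url, e.message);
      }
    }));
  }
  return results;
}

// Usage: const globalData = await fetchAll(urls, 20);

Unlike the for...in loop above, this waits for each chunk to finish before starting the next one, so at most chunkSize sockets are open at a time, which is often enough to avoid ECONNRESET from an overloaded endpoint while still being much faster than one request at a time.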

Reasonable design of using socket.io for RPC

I am building a web app that uses socket.io to trigger remote procedure calls that pass session-specific data back to the client. As my app gets bigger and gains more users, I wanted to check whether my design is reasonable.
The server that receives websocket communications and triggers RPCs looks something like this:
s.on('run', function(input) {
  client.invoke(input.method, input.params, s.id, function(error, res, more) {
    s.emit('output', {
      method: input.method,
      error: error,
      response: res,
      more: more,
      id: s.id
    });
  });
});
However, this means that the client has to first emit the method invocation, and then listen to all method returns and pluck out its correct return value:
socket.on('output', function(res) {
  if (res.id === socket.sessionid) {
    if (!res.error) {
      if (res.method === 'myMethod') {
        var myResponse = res.response;
        // Do more stuff with my response
      }
    }
  }
});
It is starting to seem like a messy design as I add more and more functions... is there a better way to do this?
The traditional AJAX way of attaching a callback to each function would be a lot nicer, but I want to take advantage of the benefits of websockets (e.g. less overhead for rapid communications).
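One option worth considering (a sketch, not part of the original post) is socket.io's acknowledgement callbacks, which give each emit its own callback, the per-call style described above, while still running over the websocket transport:

// Server: when the client passes a callback, socket.io exposes it as the last argument.
s.on('run', function(input, ack) {
  client.invoke(input.method, input.params, s.id, function(error, res, more) {
    ack({ error: error, response: res, more: more });
  });
});

// Client: the callback fires only for this particular invocation, so there is
// no need to filter a shared 'output' event by session id and method name.
socket.emit('run', { method: 'myMethod', params: params }, function(result) {
  if (!result.error) {
    var myResponse = result.response;
    // Do more stuff with my response
  }
});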

Multiple clients posting data in node js

I've read that in Node.js one should treat POST requests carefully because the POST data may arrive in chunks, so it has to be handled like this, concatenating the chunks:
function handleRequest(request, response) {
  if (request.method == 'POST') {
    var body = '';
    request.on('data', function (data) {
      body += data;
    });
    request.on('end', function () {
      // data is complete here
    });
  }
}
What I don't understand is how this code snippet will handle several clients at the same time. Let's say two separate clients start uploading large POST data: won't their chunks be appended to the same body, mixing up the data?
Or does the framework handle this by triggering a separate invocation of the handleRequest function for each request, so that they do not get mixed up in the body variable?
Thanks.
Given the request, response signature of your method, it looks like that's a listener for the request event.
Assuming that's correct, this event is emitted for every new request, so as long as you are only concatenating new data onto a body variable that is unique to that invocation (as in your current example), you're good to go.
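For illustration (a sketch; the original post does not show how handleRequest is registered), wiring it up as the request listener makes Node call it once per incoming request, and each call creates its own body variable:

const http = require('http');

// handleRequest runs once per request; the 'body' variable declared inside it
// is a fresh closure for each invocation, so concurrent uploads never share state.
const server = http.createServer(handleRequest);
server.listen(3000);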
