I have an Express route which receives some data, processes it, then inserts it into Mongo (using Mongoose).
This works well if I return a response after the following steps are done:
Receive request
Process the request data
Insert the processed data into Mongo
Return 204 response
But the client will be calling this API concurrently for millions of records, so the requirement is not to block the client while the data is processed. Hence I made a small change in the code:
Receive request
Return response immediately with 204
Process the requested data
Insert the processed data into Mongo
The above works fine for the first few requests (say the first few thousand); after that the client gets a socket exception: connection reset by peer. I guess it is because the server is blocking connections as no port is free, and at some point I notice my Node.js process throwing an out-of-memory error.
Sample code is as follows:
async function enqueue(data) {
  // 1. Process the data
  // 2. Insert the data in mongo
}

async function expressController(request, response) {
  logger.info('received request')
  response.status(204).send()
  try {
    await enqueue(request.body)
  } catch (err) {
    throw new Error(err)
  }
}
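For completeness, here is a minimal sketch of one way to bound the in-flight work with a simple in-memory queue and a single worker (illustrative only, not my actual code; enqueue and logger are the same as above):

const queue = [];
let draining = false;

async function enqueueBounded(data) {
  queue.push(data);
  if (draining) return;
  draining = true;
  // Drain one item at a time so only one enqueue() runs concurrently.
  while (queue.length > 0) {
    try {
      await enqueue(queue.shift());
    } catch (err) {
      logger.error(err);
    }
  }
  draining = false;
}

Note the array itself can still grow without bound under sustained load; real backpressure would also cap the queue length or hand the work to an external queue.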
Am I doing something wrong here?
Background:
I have a NodeJS TCP stream server with SSL (tls server) which simply forwards data from client A to client B. i.e. it listens to connections on port 9000 (client A) and writes the received data to an existing persistent connection on port 9001 (client B). The connection between NodeJS and client B is persistent, never broken. Everything is running on my local machine.
Problem:
When using JMeter as client A and sending 300 requests (threads) with a ramp-up period of 1 second, a few requests never arrive at client B.
Troubleshooting done so far:
First I checked the logs of the NodeJS application, which indicated that 300 requests were received from client A and 300 requests were sent to client B.
Then I checked the logs of client B, which indicated that only 298 requests arrived.
Next I monitored the local network using Wireshark, which to my surprise indicated that 300 requests arrived from client A at the server, but only 298 of them were sent to client B.
Server-side function which writes data to client B:
The function is a class member, where vSocket is a NodeJS TLSSocket representing the persistent connection with client B. In the server logs I see "Writing data to socket." and "Wrote data to socket." 300 times. I never see "Unexpected error while writing data to socket." or "Write returned false.".
public async SendAsync (pData: string): Promise<void> {
  return new Promise<void>((pResolve, pReject) => {
    try {
      this.uLogger.Debug('Writing data to socket.', pData);
      const rc = this.vSocket.write(pData, (pError) => {
        if (pError) {
          this.uLogger.Error('Unexpected error while writing data to socket.', pError);
          return pReject(pError);
        }
        this.uLogger.Debug('Wrote data to socket.', pData);
        if (this.vKeepConnectionOpen == false) {
          this.uLogger.Debug('Not keeping this socket open.');
          this.CloseAsync();
        }
        return pResolve();
      });
      if (rc == false) {
        this.uLogger.Debug('Write returned false.', this.uId);
      }
    } catch (error) {
      this.uLogger.Error('Unexpected error while writing data to socket.', error);
      return pReject(error);
    }
  });
}
socket.write(data[, encoding][, callback]):
The docs say:
Returns true if the entire data was flushed successfully to the kernel buffer. Returns false if all or part of the data was queued in user memory.
Since in my case this function never returns false, I assume NodeJS has successfully written the data to the OS buffer, and the data should have been sent.
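For reference, the usual way to honor that return value is to pause further writes until the 'drain' event fires; a minimal sketch (not my actual code, since write() never returns false for me):

function writeWithBackpressure(socket, chunk) {
  return new Promise((resolve, reject) => {
    // The callback fires once the data is finally written out.
    const flushed = socket.write(chunk, (err) => (err ? reject(err) : resolve()));
    if (!flushed) {
      // All or part of the data was queued in user memory; wait for the
      // kernel buffer to drain before writing more.
      socket.once('drain', () => {
        // Safe to resume writing further chunks here.
      });
    }
  });
}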
This problem is easily reproducible, and more likely to occur when I send hundreds of requests with JMeter. It is less likely with 10 requests, and has never happened with fewer than 5 requests.
I don't understand what's happening here, any help is appreciated.
I have to insert into a table data regarding sent emails, after each email is sent.
Inside a loop I'm filling an array of promises to be resolved by Promise.all().
insertData is a function that inserts data, given two arguments: connector, the connection pool, and dataToInsert, an object with the data to be inserted.
async function sendAndInsert(payload) {
  for (const data of payload) {
    let array = [];
    const dataToInsert = {
      id: data.id,
      campaign: data.campaign,
    }
    for (const email of data) {
      array.push(insertData(connector, dataToInsert));
    }
    await Promise.all(array);
  }
}
Afterwards, the function is invoked:
async invoke () {
  await sendAndInsert(toInsertdata);
}
To insert 5000 records, it takes about 10 minutes, which is nuts.
Using
Node.js v10
pg-txclient as the DB connector to PostgreSQL.
What I've done and can be discarded as a possible source of error:
Inserted random data into the table using the same connection.
I'm sure there is no issue with the DB server or the connection.
The issue must be in the Promise.all() / await stuff.
It looks like each record is being inserted through a separate call to insertData. Each call is likely to include overhead such as network latency, and 5000 requests cannot all be handled simultaneously. One call to insertData has to send the data to the database and wait for a response before the next call can even start sending its data. 5000 requests over 10 minutes corresponds to about 120 ms of latency per request, which is not unreasonable if the database is on another machine.
A better strategy is to insert all of the objects in one network request. You should modify insertData to allow it to accept an array of objects to insert instead of just one at a time. Then all the data can be sent to the database at once, and you only pay the latency cost a single time.
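A minimal sketch of what that could look like, assuming a node-postgres-style connector.query(sql, params) interface (the table and column names are made up; adapt to pg-txclient):

async function insertDataBatch(connector, rows) {
  if (rows.length === 0) return;
  const params = [];
  // Build one multi-row INSERT: ($1, $2), ($3, $4), ...
  const placeholders = rows.map((row, i) => {
    params.push(row.id, row.campaign);
    return `($${i * 2 + 1}, $${i * 2 + 2})`;
  });
  const sql = `INSERT INTO sent_emails (id, campaign) VALUES ${placeholders.join(', ')}`;
  await connector.query(sql, params);
}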
I have a Node API that works fine when tested using Postman.
But when I use this API in my Angular project, an error occurs and the browser doesn't get any response; it just keeps waiting. When I go to the console I see the error message.
How can I make that error message be sent back to the browser with the full stack trace?
In general, you will need to catch that error, then populate the HTTP response object with it, just the same as if you were sending successful response data back to the requestor.
Synchronous processing:
try {
  // do my requested stuff
  res.status(200).json({something:"returned"});
} catch(ex) {
  res.status(500).json(ex);
}
Promises:
Promise.resolve()
  .then(() => {
    // do my requested stuff
    // return my results stuff to the client
    res.status(200).json({something:"returned"});
  })
  .catch((ex) => {
    // return 500 error and exception data to the client
    res.status(500).json(ex);
  });
Also, as standard practice you should catch all errors and, at the very least, return a 500 to the browser (res.status(500)) so you don't leave it hanging when unexpected issues arise.
And, of course you can return html rather than json, and/or more info in the response.
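With async/await the same pattern looks like this (a sketch, assuming an Express-style handler; note that JSON.stringify on an Error yields {} because its properties are non-enumerable, so the message and stack are copied out explicitly):

app.get('/resource', async (req, res) => {
  try {
    // do my requested stuff
    res.status(200).json({ something: 'returned' });
  } catch (ex) {
    // Send the message and stack explicitly; consider omitting
    // the stack outside of development.
    res.status(500).json({ message: ex.message, stack: ex.stack });
  }
});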
Good luck.
I am trying to get data from a web API with axios (Node.js). I need to execute approximately 200 requests with different URLs to fetch some data for further analysis. I tried multiple libs for the HTTP callouts, but in every case I have the same issue: I never receive a success or error callback. The request just gets stuck somewhere.
async function sendRequest(url) {
  let resp = await axios.get(url);
  return resp.data;
}
I am calling this function in a for loop:
for (var url in urls) {
  try {
    setData(url)
  } catch (e) {
    console.log(e);
  }
}
async function setData(url) {
  var data = await sendRequest(url);
  // Set this data in global variable.
  globalData[url] = data;
}
I often received this error:
Error: read ECONNRESET
I think this is all connected to sending too many requests in a small interval.
What should I do to receive all the responses? My temporary fix is to periodically send 20 requests every 20 seconds (still not OK, but I receive more responses). But this is slow and takes too much time.
However, I need all the data from the 200 requests in one variable for further analysis. If I wait for every request one at a time, it takes too much time.
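For reference, the chunked workaround described above looks roughly like this (a sketch; the chunk size and delay are assumptions, and sendRequest is the function from earlier):

async function fetchAll(urls) {
  const results = {};
  for (let i = 0; i < urls.length; i += 20) {
    const chunk = urls.slice(i, i + 20);
    // Fire one chunk concurrently and wait for it to settle.
    await Promise.all(chunk.map(async (url) => {
      try {
        results[url] = await sendRequest(url);
      } catch (e) {
        console.log(url, e.message);
      }
    }));
    // Brief pause between chunks to avoid overwhelming the server.
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
  return results;
}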
I am using Node.js for server-side development and Backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to Node.js, but I can't merge all of the results with each other because of Node.js's asynchronous execution. What I have now sends lots of GET requests to Node.js to get the data from all the tables, and because of this my site's performance has become slower. Please help if anyone has any idea.
I would create a method which aggregates the results from each of the requests and sends the response back. Basically, each of your three async db calls would pass its data to the same method. That method would check to see if it had all of the data it needed to complete the request, and if it did, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
  var results = {};
  db.getUsers(function(data) {
    aggregate('users', data);
  });
  db.getPosts(function(data) {
    aggregate('posts', data);
  });
  db.getComments(function(data) {
    aggregate('comments', data);
  });
  function aggregate(name, data) {
    results[name] = data;
    if(results.users && results.posts && results.comments) {
      res.send(results);
    }
  }
}
This is simplified greatly; you should also, of course, check for errors and timeouts on the db calls, but this will let you wait for all the async calls to complete before sending the data.
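If your db layer returns promises (or you promisify it), Promise.all expresses the same idea more directly; a sketch, assuming promise-returning versions of the same calls:

async function handleRequest(req, res) {
  try {
    // Run all three queries concurrently and wait for every result.
    const [users, posts, comments] = await Promise.all([
      db.getUsers(),
      db.getPosts(),
      db.getComments(),
    ]);
    res.send({ users, posts, comments });
  } catch (err) {
    res.status(500).send({ error: err.message });
  }
}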