Node.js multiple requests - no responses

I am trying to get data from a web API with axios (Node.js). I need to execute approximately 200 requests with different URLs to fetch data for further analysis. I tried several libraries for the HTTP callouts, but in every case I have the same issue: I never receive a success or error callback. The request just gets stuck somewhere.
async function sendRequest(url) {
    let resp = await axios.get(url);
    return resp.data;
}
I am calling this function in a for loop:
for (var url in urls) {
    try {
        setData(url)
    } catch (e) {
        console.log(e);
    }
}
async function setData(url) {
    var data = await sendRequest(url);
    // Set this data in a global variable.
    globalData[url] = data;
}
I often received this error:
Error: read ECONNRESET
I think this is all connected with too many requests in a small interval.
What should I do to receive all the responses? My temporary fix is to periodically send 20 requests every 20 seconds (still not OK, but I receive more responses), but this is slow and takes too much time.
However, I need the data from all 200 requests in one variable for further analysis. If I wait for each request one at a time, it takes too long.
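For reference, here is a minimal sketch of one way to fetch all the URLs with only a bounded number of requests in flight at once, assuming the urls array and the sendRequest helper above; the batch size of 10 is an arbitrary assumption, not a requirement of axios or the remote API.
// Sketch: fetch all URLs in batches so only a limited number of requests
// is in flight at once. The batch size is an assumption; tune it to what
// the remote API tolerates.
async function fetchAll(urls, batchSize = 10) {
    const results = {};
    for (let i = 0; i < urls.length; i += batchSize) {
        const batch = urls.slice(i, i + batchSize);
        const data = await Promise.all(batch.map(url => sendRequest(url)));
        batch.forEach((url, j) => { results[url] = data[j]; });
    }
    return results;
}
Calling await fetchAll(urls) resolves once every request has completed, with all responses collected in one object keyed by URL. If a single failure should not abort a whole batch, each sendRequest call can be wrapped in its own catch, or Promise.allSettled can be used instead of Promise.all.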

Related

Chain of endpoints in Node and Express: how to prevent some of them from stalling the whole series?

On one page I have to get information from 8 different endpoints. 2 of them are outside of my application and sometimes they cause a delay in displaying the data; the web browser waits until the data is processed. Since they're outside of my app I can't refactor them to make them faster, but I need to show the information they provide. In addition, sometimes one of them returns nothing; if so, I use default data to show to the user. The waiting time hurts the user experience.
I'm using promises to call these endpoints. Below is part of the code snippet that I am using.
The code is working fine. The issue is the delay.
First, here is the array that contains all the services that I need to call:
var requests = [{
    // 0
    url: urlLocalApi + '/endpointURL_1/',
    headers: {
        'headers': 'apitoken'
    },
}, {
    // 1
    url: urlLocalApi + '/endpointURL_2/',
    headers: {
        'headers': 'apitoken'
    },
}];
Building this array is encapsulated in the following method:
const requests = homePageFunctions.createRequest();
Now, here is how the data is processed. I am using both 'request-promise' and 'bluebird', plus a custom logger to check whether everything goes fine.
const Promise = require("bluebird");
const request = require('request-promise');

var viewsHelper = {
    getPageData: function (requests) {
        return Promise.map(requests, function (obj) {
            return request(obj).then(function (body) {
                AppLogger.log(`Endpoint parsed`, statusLogger.infodate);
                return JSON.parse(body);
            });
        });
    }
};

module.exports = viewsHelper;
How do I call this?
viewsHelper.getPageData(requests)
    .then(results => {
        var output = [];
        for (var i = 0; i < results.length; i++) {
            output.push(results[i]);
        }
        // render data
        res.render('homepage/index', output);
        AppLogger.log(`PageData is rendered`, statusLogger.infodate);
    })
    .catch(err => {
        console.log(err);
    });
Note that each item of the "output" array holds the data returned by one of the endpoints.
The problem here is:
If any of the endpoints takes long, the entire chain is held up, even though the others have already finished. The web page waits on a blank screen.
How can I prevent this behavior?
That is an interesting question, but I have some questions of my own in order to answer it effectively.
You have a Node server and a client (HTML/JS).
You have 8 endpoints; 2 are slow because you don't have control over them.
Is the client (page) aware of the 8 endpoints, i.e. does it make 8 calls every time you reload the page?
OR
Does the page make one request to your Node.js server, which then calls the 8 endpoints itself?
If it is 1, lazy loading will work easily for you, since the page is making the requests.
If it is 2, lazy loading will work only on the server side; the client will still be blocked, because it doesn't know (or care) how you load your data. The page made one request and it is blocked waiting for that request.
Obviously each method has pros and cons.
One way you can solve this is to asynchronously call those endpoints on the Node side and cache the results, so that when the page makes its one request the data is already ready.
Again, we know very little about the situation; there are many ways to solve this.
Hope this helps.
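As a rough illustration of the caching idea above, here is a minimal sketch. It assumes the getPageData helper and requests array from the question, an Express app object, and an arbitrarily chosen 60-second refresh interval; names like cachedResults and refreshCache are hypothetical.
const REFRESH_MS = 60 * 1000;        // arbitrary refresh interval (assumption)
let cachedResults = null;            // last successful set of endpoint data

// Refresh the cache in the background, independently of page requests.
async function refreshCache() {
    try {
        cachedResults = await viewsHelper.getPageData(requests);
    } catch (err) {
        console.log('cache refresh failed', err);
    }
}
refreshCache();
setInterval(refreshCache, REFRESH_MS);

// The route handler renders whatever is already cached instead of waiting
// for the slow endpoints on every page load.
app.get('/', (req, res) => {
    res.render('homepage/index', cachedResults || []);
});
The trade-off is that the page may show data that is up to one refresh interval old, in exchange for never blocking on the slow endpoints.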

Asynchronous processing of data in Expressjs

I have an Express route which receives some data, processes it, then inserts it into Mongo (using Mongoose).
This is working well if I return a response after the following steps are done:
Receive request
Process the request data
Insert the processed data into Mongo
Return 204 response
But the client will be calling this API concurrently for millions of records. The requirement is therefore not to block the client while the data is processed, so I made a small change in the code:
Receive request
Return response immediately with 204
Process the requested data
Insert the processed data into Mongo
The above works fine for the first few requests (say the first 1000 or so); after that the client gets a socket exception: connection reset by peer. I guess this is because the server keeps the connections busy, so ports are not freed, and at some point I notice my Node.js process throwing an out-of-memory error.
Sample code is as follows:
async function enqueue(data) {
    // 1. Process the data
    // 2. Insert the data in mongo
}

async function expressController(request, response) {
    logger.info('received request')
    response.status(204).send()

    try {
        await enqueue(request.body)
    } catch (err) {
        throw new Error(err)
    }
}
Am I doing something wrong here?
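One hedged sketch of the kind of back-pressure that avoids unbounded in-flight work: instead of starting enqueue for every request immediately after the response is sent, push the body onto an in-memory queue drained by a fixed number of workers. The queue, worker count, and scheduleWork helper below are illustrative assumptions, not part of the original code, and an in-process queue still loses data on a crash.
const queue = [];                 // pending request bodies (assumption)
const MAX_WORKERS = 5;            // arbitrary concurrency limit (assumption)
let activeWorkers = 0;

function scheduleWork() {
    while (activeWorkers < MAX_WORKERS && queue.length > 0) {
        const body = queue.shift();
        activeWorkers++;
        enqueue(body)
            .catch(err => console.error(err))
            .finally(() => {
                activeWorkers--;
                scheduleWork();
            });
    }
}

async function expressController(request, response) {
    response.status(204).send()
    queue.push(request.body)      // memory still grows if producers outpace workers
    scheduleWork()
}
For sustained load at this volume, a durable external queue (e.g. Redis, RabbitMQ, Kafka) is usually the safer choice than an in-process array.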

How to return 2 responses for a single request in node.js

My Node app is simple: for a request that queries the total customers in a MySQL database, I make a blocking call using await to wait for the query to finish.
The problem is that it can only handle ~75 requests per second, which is too low.
So I try to return 200 as soon as I get the request, telling the caller that I received it.
Then I return the query result when it is ready; the MySQL query can take a while.
But it is not working for me yet. This is the code:
router:
router.get('', controller.getCustomers);
controller:
const getCustomers = (req, res) => {
    try {
        service.getCustomers(res);
        res.write('OK');
        //res.send(200); this will end the response before the query ends
    } catch (err) {
        res.end(err);
    }
};
service:
const getCustomers = async (res) => {
    const customers = await mysqlPool.query('select * from cusomerTable');
    res.send(customers);
}
Error: Can't set headers after they are sent to the client.
How can I fix this, please?
You can use Server-Sent Events (SSE). You can send an event as soon as you get the request.
When the result is ready, you can send another event; you may find my GitHub repo useful as an SSE sample.
OR
You can just use res.write() to stream the response; this StackOverflow answer covers it.
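For illustration, a minimal sketch of the SSE idea in Express, reusing the mysqlPool and router from the question; the route path, event names, and the immediate res.end() after the result are assumptions made for the example.
router.get('/customers/stream', async (req, res) => {
    // Standard SSE headers: keep the connection open and stream events.
    res.set({
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        Connection: 'keep-alive'
    });

    // First event: acknowledge that the request was received.
    res.write('event: accepted\ndata: "query started"\n\n');

    try {
        const customers = await mysqlPool.query('select * from cusomerTable');
        // Second event: the actual result, once the query finishes.
        res.write(`event: result\ndata: ${JSON.stringify(customers)}\n\n`);
    } catch (err) {
        res.write(`event: error\ndata: ${JSON.stringify(err.message)}\n\n`);
    }
    res.end();
});
On the client, an EventSource pointed at this route receives the "accepted" event immediately and the "result" event when the query completes, so a single HTTP request effectively carries two responses.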

Shopify API - Get all products (60k products) Got Request Time Out or socket hang up

I am trying to get all products, but I got "Request timed out" while trying to fetch 60k products for an inventory management app.
I am using Node.js to loop through 200 pages, each page limited to 250 products. I limited my calls to 2 requests every 10 seconds (1 request per 5 seconds).
Sometimes I get these errors on some pages, sometimes not:
read ECONNRESET
Request timed out
socket hang up
Could anyone please tell me what the problem is? I would appreciate your help.
for (var i = 1; i <= totalPage; i++) {
    var promise = shopify.product.list({limit: limit, page: i, fields: fields})
        .then(products => {
            // do something here with the product list:
            // loop through each product, then save it to the DB
            // ShopifyModel.updateOne(.....)
        })
        .catch(error => {
            // sometimes an error is fired here
        });
}
I also tried to write a separate function that gets the products of one page:
const request = require('request-promise');

var getProductOnePage = function (Url_Page, headers, cb) {
    request.get(Url_Page, { headers: headers, gzip: true })
        .then((ListProducts) => {
            console.log("Got the product list of one page");
            cb(ListProducts);
        })
        .catch(err => {
            // all errors end up here when this is called from a for loop,
            // or from map/forEach with Promise.all
            console.log("Error: can't get products of one page: ", err.message);
        });
}
EDIT:
I found some problems similar to my case here:
https://github.com/request/request/issues/2047
https://github.com/twilio/twilio-node/issues/312
ECONNRESET and "Request timed out" errors are mostly due to network problems. Check that you have a stable internet connection.
If you're using the shopify-api-node package, use the autoLimit property. It will take care of rate limiting.
e.g.:
const shopify = new Shopify({
    shopName: shopName,
    apiKey: api_key,
    password: password,
    autoLimit: { calls: 2, interval: 1000, bucketSize: 30 }
});
Edit: Instead of writing then/catch inside a for loop, use async/await. Whether you implement a request-and-wait approach or not, the for loop will fire all the requests at once; with await, it processes one request at a time.
let getProducts = async () => {
    for (var i = 1; i <= totalPage; i++) {
        try {
            let products = await shopify.product.list({limit: limit, page: i, fields: fields});
            if (!products.length) {
                // all products have been fetched
                break;
            }
            // do your stuff here
        } catch (error) {
            console.log(error);
        }
    }
}
You have to understand the concept of rate limiting. With any public API like Shopify, you can only make so many calls before they put you on hold. So when you get a response back from Shopify, you can check the header for how many requests you can make. If it is zero, you'll get back a 429 if you try a request.
So when you get a 0 for credits, or a 429 back, you can set yourself a little timeout and wait to make your next call.
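For illustration, here is a hedged sketch of that check using request-promise (already used above). The X-Shopify-Shop-Api-Call-Limit header and 429 behaviour reflect Shopify's documented REST rate limits, but verify them against the current API docs; the back-off delay and function name are arbitrary assumptions.
const request = require('request-promise');

// Fetch one page and pause when the call-limit bucket is nearly exhausted.
async function getPageWithBackoff(url, headers) {
    const response = await request.get(url, {
        headers: headers,
        gzip: true,
        resolveWithFullResponse: true   // so the response headers are available
    });

    // Header looks like "39/40": calls used / bucket size.
    const limit = response.headers['x-shopify-shop-api-call-limit'] || '0/40';
    const [used, bucket] = limit.split('/').map(Number);
    if (bucket - used <= 1) {
        // Almost out of credits: wait before the next call (delay is arbitrary).
        await new Promise(resolve => setTimeout(resolve, 2000));
    }
    return JSON.parse(response.body);
}
Awaiting this helper page by page keeps the credit check and the back-off in one place instead of hard-coding a fixed request interval.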
If on the other hand, as you say, you are only doing 2 calls every 10 seconds (not at all clear from your code how you do that, and why??) and you're getting timeouts, then your Internet connection to Shopify is probably the problem.

fetch data from multiple tables by sending only one request

I am using Node.js for server-side development and Backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to Node.js, but I can't merge all the results because of the asynchronous execution in Node.js. What I have now sends a lot of GET requests to Node.js to get the data from all the tables, and because of this my site's performance has become slower. Please help if anyone has any ideas.
I would create a method which aggregates the results from each of the requests and sends the response back. Basically each of your three async db calls would pass their data to the same method. That method would check to see if it had all of the data it needed to complete the request, and if it did, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
    var results = {};

    db.getUsers(function(data) {
        aggregate('users', data);
    });
    db.getPosts(function(data) {
        aggregate('posts', data);
    });
    db.getComments(function(data) {
        aggregate('comments', data);
    });

    function aggregate(name, data) {
        results[name] = data;
        if (results.users && results.posts && results.comments) {
            res.send(results);
        }
    }
}
This is greatly simplified; you should of course also check for errors and timeouts on the DB calls, but it lets you wait for all the async operations to complete before sending the data.
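If the database layer exposes promises (an assumption; the callback-style calls above suggest it may not), the same aggregation can be written more compactly with Promise.all. The names below simply mirror the pseudo code above.
// Sketch assuming promisified db helpers that return promises.
async function handleRequest(req, res) {
    try {
        const [users, posts, comments] = await Promise.all([
            db.getUsers(),
            db.getPosts(),
            db.getComments()
        ]);
        // One response containing the data from all three tables.
        res.send({ users, posts, comments });
    } catch (err) {
        res.status(500).send(err.message);
    }
}
Promise.all rejects as soon as any query fails, so a single failed table read short-circuits the whole response; Promise.allSettled can be used if partial results should still be returned.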
