I am trying to POST a request to an external API using node-fetch in my Node.js code.
I want to retry the request 3 times if there are any timeouts or network failures during the POST request.
Could you let me know how to retry the request using node-fetch? I see that there is an npm module, https://www.npmjs.com/package/node-fetch-retry, but it doesn't seem to work as expected and it also doesn't accept a retry interval between retries. Any code snippets would be very helpful.
EDIT:
Thanks, I tried using promise-fn-retry but it doesn't seem to do any retries. Below is the code snippet I tried, switching off my Wi-Fi and then making the fetch call to see if it retried 3 times. But it performs the fetch just once and returns the error.
const promiseFn = () =>
  fetch(url, { method: 'POST', headers: header, body: JSON.stringify(payload) });

const options = {
  times: 3,
  initialDelay: 100,
  onRetry: (error) => {
    console.log(error);
  }
};

retry(promiseFn, options)
  .then((res) => res.json())
  .then((res) => { console.log('PromiseFn result is ' + JSON.stringify(res)); })
  .catch((err) => { console.log('Error in PromiseFn: ' + err); });
This snippet helped me in the past! Hope it is what you are looking for:
const fetch = require('node-fetch');

const delay = (ms) => new Promise((resolve) => setTimeout(() => resolve(), ms));

const retryFetch = (
  url,
  fetchOptions = {},
  retries = 3,
  retryDelay = 1000,
  timeout
) => {
  return new Promise((resolve, reject) => {
    // overall timeout across all attempts
    if (timeout) setTimeout(() => reject('error: timeout'), timeout);

    const wrapper = (n) => {
      fetch(url, fetchOptions)
        .then((res) => resolve(res))
        .catch(async (err) => {
          if (n > 0) {
            await delay(retryDelay);
            wrapper(--n);
          } else {
            reject(err);
          }
        });
    };

    wrapper(retries);
  });
};

retryFetch('http://localhost:8080/test', {}, 20)
  .then((res) => res.text())
  .then(console.log)
  .catch(console.error);
You can also try the promise-fn-retry package; it has options to specify the number of retries, the initial delay time, and callback methods to handle failed fetches/promises.
Hope this helps!
I'm attempting to
Fetch GET my website (with node-fetch)
Scrape it with Cheerio to get specific posts
Fetch GET from my CMS (with node-fetch) to check if there's already a post with the same time
If the check shows no duplicates, Fetch POST into my CMS
The problem I've run into, though, is that I can console.log() the duplicate-check result, but when I build a conditional on that result for the fetch POST request, it always evaluates the check result as a Promise.
I'm not sure how to structure my async .then() calls to perform the check correctly.
Code (edited for simplicity):
fetch('https://www.myblog.com/', { method: 'get' })
.then((res) => {
return res.text()
})
.then((data) => {
const $ = cheerio.load(data)
const siteHeading = $('.post-item')
siteHeading.each((index, element) => {
const title = $(element).find('.entry-title').text()
const desc = $(element).find('.entry-content').text()
const exists = fetch(
'https://mycms.com/api/articles?filters[title][$eq]=' +
title,
{
method: 'GET',
}
)
.then((response) => response.json())
.then((article) => article.data[0])
.then((exists) => {
if (exists) {
// ^exists always shows up as True, console.log(exists) shows Promise<pending>
fetch('https://api.broadband.money/api/news-articles', {
method: 'POST',
body: JSON.stringify({
data: {
title: title ? title : null,
desc: desc ? desc : null,
},
}),
})
.then((response) => response.json())
.then((data) => console.log(data))
}
})
})
})
Using .then() makes code difficult to understand, especially when the calls are nested. You have to keep track of each then and make sure the brackets are closed in the correct place.
Use async/await instead. This way you can achieve your objective without so much nesting:
Example:
const data = await fetch('https://www.myblog.com/', { method: 'GET' }).then((res) => res.json());
Here the program will wait for the fetch to complete, and only then will it proceed further.
Remember to use the async keyword before the declaration of the function inside which you are using fetch.
Example:
async function functionName() {
  // ...
  const data = await fetch(/* ... */);
  // ...
}
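Putting this together for the duplicate check in the question, a minimal sketch could look like the following. Note this is only a sketch: `fetchFn` stands in for node-fetch (passed in as a parameter just to keep the example self-contained), and the URLs mirror the ones in the question.

```javascript
// Sketch only: check the CMS for a duplicate, then POST if none exists.
// `fetchFn` stands in for node-fetch; the URLs mirror those in the question.
async function postIfNew(fetchFn, title, desc) {
  const checkRes = await fetchFn(
    'https://mycms.com/api/articles?filters[title][$eq]=' + title,
    { method: 'GET' }
  );
  const article = await checkRes.json();
  const exists = article.data[0]; // an awaited value now, not a pending Promise
  if (!exists) {
    const postRes = await fetchFn('https://api.broadband.money/api/news-articles', {
      method: 'POST',
      body: JSON.stringify({ data: { title: title || null, desc: desc || null } }),
    });
    return postRes.json();
  }
  return null; // duplicate found, nothing posted
}
```

Because every step is awaited, `exists` holds the resolved value, so the conditional behaves as intended.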
First, what Marcos advises (combining async/await with .then) is an anti-pattern: use either one or the other, not both.
Second, you use an extra .then where you don't need it.
Here:
...
.then((exists) => {
...
Third, the forEach method does not wait for asynchronous code to finish (although in your case this does not seem to matter).
Here is a small example for understanding:
function each() {
console.log("Start");
const arr = [0, 1, 2, 3];
arr.forEach(async (el) => {
await new Promise((resolve) => {
setTimeout(() => {
resolve(console.log("This log should be after the start", el));
}, 1000);
});
});
console.log("Finish");
}
each();
Output:
Start
Finish
This log should be after the start 0
This log should be after the start 1
This log should be after the start 2
This log should be after the start 3
Instead you should use a "for" loop:
async function loop() {
console.log("Start");
const arr = [0, 1, 2, 3];
for (let i = 0; i < arr.length; i++) {
await new Promise((resolve) => {
setTimeout(() => {
resolve(console.log("This log should be after the start", arr[i]));
}, 1000);
});
}
console.log("Finish");
}
loop();
Output:
Start
This log should be after the start 0
This log should be after the start 1
This log should be after the start 2
This log should be after the start 3
Finish
And finally, it seems to me that replacing this
.then((article) => article.data[0])
.then((exists) => {
with this
.then((article) => {
const exists = article.data[0];
should help. If you provide working code, I can tell you more.
The only way I found to set a timeout on a node-fetch request is to use the abort-controller:
import AbortController from 'abort-controller'
import fetch from 'node-fetch'
const fetchWithTimeout = async (url, options) => {
const controller = new AbortController()
const timeout = setTimeout(() => {
controller.abort()
}, 10000)
const response = await fetch(url, {
signal: controller.signal,
...options
})
clearTimeout(timeout)
return response.text()
}
Besides the fact that this looks ugly, it also has a huge issue that I don't know how to dodge:
The controller.abort() aborts ALL current requests, not just one. So, if I call fetchWithTimeout() 10 times in parallel and one of them gets aborted, then all the others get aborted too.
I've kinda solved it with even more awkward code:
import AbortController from 'abort-controller'
import fetch from 'node-fetch'
const aborted = {}
const fetchWithTimeout = async (url, options) => {
const controller = new AbortController()
const timeout = setTimeout(() => {
aborted[url] = true
controller.abort()
}, 10000)
try {
const response = await fetch(url, {
signal: controller.signal,
...options
})
clearTimeout(timeout)
return response.text()
} catch (e) {
clearTimeout(timeout)
if (aborted[url] || e?.type !== 'aborted') {
delete aborted[url]
throw e
} else {
return await fetchWithTimeout(url, options)
}
}
}
But this is so wrong and inefficient. What else can I do with it besides replacing node-fetch with an alternative?
I tried to reproduce the behaviour of controller.abort() aborting ALL current requests, but it seems to work as expected and I doubt that it's a bug in node-fetch.
I think the problem is how you handle the errors when you execute the requests in parallel (presumably) using Promise.all. As stated in the docs it rejects immediately upon any of the input promises rejecting or non-promises throwing an error, and will reject with this first rejection message / error.
You are probably looking for Promise.allSettled instead, as it resolves after all of the given promises have either fulfilled or rejected independently of each other. So it could look something like this:
(async () => {
const parallelRequests = [];
for (let i = 0; i < 5; i++) {
parallelRequests.push(fetchWithTimeout('http://some-url.com'));
}
const result = await Promise.allSettled(parallelRequests);
// result will be an array holding status information about each promise and the resolved value or the rejected error
})();
The reason your second approach works with Promise.all is that you actually catch and handle errors from the fetch call. But be aware that in the case of a non-aborted error, you re-throw it, which will again cause an immediate rejection regardless of the other requests.
I've spent too many hours on various node fetch libraries. Just use curl. It just works.
import { exec } from 'child_process'
async function curl (url: string) {
  return await new Promise(resolve => {
const command = `curl "${url}" -L --connect-timeout 10 --max-time 12`
exec(command, { maxBuffer: 1024 * 1024 * 10 }, (err, stdout, stderr) => {
if (err) {
console.error(err, stderr)
resolve('')
} else {
resolve(stdout)
}
})
})
}
export default curl
I have a script using axios that hits an API with a limit of 5 requests per second. At present my request array length is 72 and it will grow over time. I receive an abundance of 429 errors, and the responses per endpoint change with each run of the script; e.g. url1 returns 429 on iteration 1, then 200 on iterations 2 and 3, then 429 again on iteration 4.
Admittedly my understanding of async/await and promises are spotty at best.
What I understand:
I can have multiple axios.get calls running because of async. The variable I set in main that uses the async function can include await to ensure all requests have been processed before the script continues.
Promise.all can run multiple axios.get calls, but if a single request fails, the chain breaks and no more requests will run.
Because the API will only accept 5 requests per second, I have to chunk my axios.get requests into groups of 5 endpoints and wait for those to finish processing before sending the next chunk of 5.
setTimeout will assign a time limit to a single request; once the time is up, the request is done and will not be sent again, even if the response was something other than 200.
setInterval will also assign a time limit, but it will send the request again after the time is up and keep requesting until it receives a 200.
async function main() {
var endpoints = makeEndpoints(boards, whiteList); //returns an array of string API endpoints ['www.url1.com', 'www.url2.com', ...]
var events = await getData(endpoints);
...
}
The getData() function has seen many iterations in an attempt to correct the 429s. Here are a few:
// will return the 200s sometimes and not others; I believe it's the timeout, but that won't attempt to hit a failed URL (as I understand it)
async function getData(endpoints) {
let events = [];
for (x = 0; x < endpoints.length; x++) {
try {
let response = await axios.get(endpoints[x], {timeout: 2000});
if ( response.status == 200 &&
response.data.hasOwnProperty('_embedded') &&
response.data._embedded.hasOwnProperty('events')
) {
let eventsArr = response.data._embedded.events;
eventsArr.forEach(event => {
events.push(event)
});
}
} catch (error) {
console.log(error);
}
}
return events;
}
// returns a great many 429 errors via the setInterval; as I understand it, this function sets a delay of N seconds before attempting the next call
async function getData(endpoints) {
let data = [];
let promises = [];
endpoints.forEach((url) => {
promises.push(
axios.get(url)
)
})
setInterval(function() {
for (i = 0; i < promises.length; i += 5) {
let requestArr = promises.slice(i, i + 5);
axios.all(requestArr)
.then(axios.spread((...res) => {
console.log(res);
}))
.catch(err => {
console.log(err);
})
}
}, 2000)
}
// Here I hoped Promise.all would allow each request to do its thing and return the data, but after further reading I found that if a single request fails, the rest will fail in the Promise.all
async function getData(endpoints) {
  try {
    const res = await Promise.all(endpoints.map(url => axios.get(url))).catch(err => {});
    return res;
  } catch {
    throw Error("Promise failed");
  }
}
// Returns many 429s and only about 3/4 of the data I expect
async function getData(endpoints) {
const someFunction = () => {
return new Promise(resolve => {
setTimeout(() => resolve('222'), 100)
})
}
const requestArr = endpoints.map(async data => {
let waitForThisData = await someFunction(data);
return axios.get(data)
.then(response => { console.log(response.data)})
.catch(error => console.log(error.toString()))
});
Promise.all(requestArr).then(() => {
console.log('resolved promise.all')
})
}
// Seems close to working, but once an error is hit, Promise.all stops processing the endpoints
async function getData(endpoints) {
(async () => {
try {
const allResponses = await Promise.all(
endpoints.map(url => axios.get(url).then(res => console.log(res.data)))
);
console.log(allResponses[0]);
} catch(e) {
console.log(e);
// handle errors
}
})();
}
It seems like I have so many relevant pieces, but I cannot connect them into an efficient, working model. Perhaps axios has something completely unknown to me? I've also tried using Bluebird's concurrency option to limit the requests to 5 per attempt, but that still returned 429s from axios.
I've been staring at this for days, and with so much new information swirling in my head I'm at a loss as to how to send 5 requests per second, await the responses, then send another set of 5 requests to the API.
Guidance/links/ways to improve the question would be much appreciated.
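In other words, what I think I need is something like the following untested sketch (`fetchFn` stands in for axios.get so the helper stays generic; the names are mine, not from any library):

```javascript
// Untested sketch: fire requests in batches of 5, pausing between batches.
// `fetchFn` stands in for axios.get.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function getDataInBatches(endpoints, fetchFn, batchSize = 5, delayMs = 1000) {
  const results = [];
  for (let i = 0; i < endpoints.length; i += batchSize) {
    const batch = endpoints.slice(i, i + batchSize);
    // allSettled: one 429 in a batch does not reject the others
    const settled = await Promise.allSettled(batch.map((url) => fetchFn(url)));
    for (const r of settled) {
      if (r.status === 'fulfilled') results.push(r.value);
    }
    if (i + batchSize < endpoints.length) await sleep(delayMs);
  }
  return results;
}
```

Promise.allSettled (rather than Promise.all) keeps one failed request from dropping the rest of the batch, and the sleep between batches caps the rate at batchSize requests per delayMs.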
I noticed that if a request gets no response, it waits more than 60 seconds. In my case I have a nested async loop, managed with callbacks and promises; each element is an API call.
So I want to detect, for example, when an API doesn't respond within 10 seconds, and move on.
My code for the API requests:
return new Promise((resolve, reject) => {
  var start = moment();
  const req = adapterFor(url).request(options, (resp) => {
    let data = '';
    resp.on('data', (chunk) => {
      data += chunk;
    });
    resp.on('end', () => {
      try {
        const tmpData = JSON.parse(data.trim());
        if (tmpData.length != 0 && tmpData.films.length > 0) {
          data = tmpData.films.filter(v => {
            return v.film_o.length > 0;
          });
          resolve(data);
        } else {
          resolve({ "Errore": url, error: 'some error', error_type: 'some error', duration: ((moment() - start) / 1000) + ' s' });
        }
      } catch (e) {
        resolve({ "Errore": url, error: 'HERE MAYBE', error_type: '?!', duration: ((moment() - start) / 1000) + ' s' });
      }
    });
  });
  req.end();
})
If you can use modern JavaScript (ES2015+), you can use Promise.race(). Consider the following example:
Promise.race([
new Promise((resolve, reject) => setTimeout(() => resolve(1), 5000)),
new Promise((resolve, reject) => setTimeout(() => resolve(2), 10000))
]).then(res => console.log(res))
You can provide multiple promises in an array, and Promise.race will always settle with the fastest one. So you can pass your API-call promise as one element and a timeout promise, as shown above, as the other. The combined promise will then settle either when the timeout fires or when your API call is done, whichever comes first.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/race
https://jsfiddle.net/yLv64zr5/
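As a concrete illustration, here is a hedged sketch of a race-based timeout wrapper (the helper name withTimeout is mine, not from any library):

```javascript
// Sketch: reject if `promise` has not settled within `ms` milliseconds.
const withTimeout = (promise, ms) =>
  Promise.race([
    promise,
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error('timeout after ' + ms + ' ms')), ms)
    ),
  ]);
```

withTimeout(apiCall(url), 10000) then behaves like apiCall(url), except that it rejects after 10 seconds if the call has not settled. Note the losing timer keeps running until it fires; a production version would clear it once the race settles.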
One option is to use a flag and reject after the configured time. In the example below, the variable reqHasFinished is set to true inside the request callback, so if it is still false after 10 seconds, the promise is rejected:
return new Promise((resolve, reject) => {
let start = moment(),
reqHasFinished = false;
const req = adapterFor(url).request(options, (resp) => {
reqHasFinished = true;
let data = '';
// resp operations here
});
// if after 10 seconds the request has not finished, the promise is rejected.
setTimeout(() => {
if (!reqHasFinished) {
reject('Go ahead');
}
}, 10000)
req.end();
});
My application uses an internal web service for fetching data. I have a job which creates approximately 500 requests that are fired asynchronously to complete the fetch operation.
I use axios, creating an array of axios promises and then resolving them using axios.all().
It works fine up to roughly 200 requests, but beyond that I get "socket hang up" errors, even though on the server side I can see the requests being processed.
How do I configure axios to set a custom timeout, or is it a better idea to splice my promise array and run the requests in multiple batches?
Source code
let getAxiosPromiseArray = (urlList) => {
var axiosArrayofPromise = [];
return new Promise ( (resolve, reject) => {
try {
urlList.forEach ( (URL) => {
axiosArrayofPromise.push(axios.get(URL));
});
resolve(axiosArrayofPromise);
}
catch (err) {
reject("There is a problem getting Axios array of promises " + err);
}
})
}
async function processAxiosPromises (PromiseArray) {
try {
var results = []
results = await axios.all(PromiseArray);
return results;
}
catch(err) {
throw("There was a problem resolving promises array (Axios) " + err);
}
}
getallID().then ( (urlList) => {
return getAxiosPromiseArray(urlList);
}).then( (AxiosPromises) => {
return processAxiosPromises(AxiosPromises);
}).then ((resultData) => {
console.log(resultData);
});
Error
There was a problem resolving promises array (Axios) Error: socket hang up
First, that pair of functions getAxiosPromiseArray() and processAxiosPromises() needs fixing.
Your new Promise() construction is unnecessary. You can simply return Promise.all(arrayofPromise) (or axios.all(...) if you must) and do away with the other function.
Renaming the remaining function to something meaningful, you would end up with, e.g.:
let getData = (urlList) => {
return Promise.all(urlList.map(URL => axios.get(URL)))
.catch(error => {
error.message = "There is a problem getting Axios array of promises " + error.message; // augment the error message ...
throw error; // ... and re-throw the error.
});
};
And call as follows :
getallID().then(getData)
.then(resultData => {
console.log(resultData);
}).catch(error => {
console.error(error);
});
That will put you on solid ground but, on its own, is unlikely to fix a concurrency problem (if that's what it is), for which the simplest approach is to use Bluebird's Promise.map with the concurrency option.
The caller code can remain the same, just change getData(), as follows:
let getData = (urlList) => {
let concurrency = 10; // play with this value to find a reliable concurrency limit
return Promise.map(urlList, URL => axios.get(URL), {'concurrency': concurrency})
.catch(error => {
error.message = "There is a problem getting Axios array of promises " + error.message;
throw error;
});
};
// where `Promise` is Bluebird.
const axios = require('axios');
const axiosThrottle = require('axios-throttle');
// pass the axios object and the delay between requests in ms
axiosThrottle.init(axios, 200)
const options = {
method: 'GET',
};
const urlList = [
'https://jsonplaceholder.typicode.com/todos/1',
'https://jsonplaceholder.typicode.com/todos/2',
'https://jsonplaceholder.typicode.com/todos/3',
'https://jsonplaceholder.typicode.com/todos/4',
'https://jsonplaceholder.typicode.com/todos/5',
'https://jsonplaceholder.typicode.com/todos/6',
'https://jsonplaceholder.typicode.com/todos/7',
'https://jsonplaceholder.typicode.com/todos/8',
'https://jsonplaceholder.typicode.com/todos/9',
'https://jsonplaceholder.typicode.com/todos/10'
];
const promises = [];
const responseInterceptor = response => {
console.log(response.data);
return response;
};
// add an interceptor to work with each response separately when it is resolved
axios.interceptors.response.use(responseInterceptor, error => {
return Promise.reject(error);
});
for (let index = 0; index < urlList.length; index++) {
options.url = urlList[index];
promises.push(axiosThrottle.getRequestPromise(options, index));
}
//run when all promises are resolved
axios.all(promises).then(responses => {
console.log(responses.length);
});
https://github.com/arekgotfryd/axios-throttle