Retry logic for https.get in Node.js

Is there a way to implement retries for the https.get method in Node.js using async-retry?

If you are using the async-retry module (https://github.com/zeit/async-retry), the answer is in its README.md:
// Packages
const retry = require('async-retry')
const fetch = require('node-fetch')

await retry(
  async bail => {
    // if anything throws, we retry
    const res = await fetch('https://google.com')

    if (403 === res.status) {
      // don't retry upon 403
      bail(new Error('Unauthorized'))
      return
    }

    const data = await res.text()
    return data.substr(0, 500)
  },
  {
    retries: 5
  }
)
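The same pattern applies to the built-in https.get from the question; the only extra step is wrapping the callback-style call in a Promise so that a network error or bad status rejects and triggers a retry. A minimal sketch, assuming a 5xx status should be retried (the URL is a placeholder):

const retry = require('async-retry')
const https = require('https')

// Wrap https.get so that failures reject the returned Promise
function httpsGet(url) {
  return new Promise((resolve, reject) => {
    https.get(url, res => {
      if (res.statusCode >= 500) {
        res.resume() // discard the body so the socket is released
        return reject(new Error(`Server error: ${res.statusCode}`))
      }
      let body = ''
      res.on('data', chunk => (body += chunk))
      res.on('end', () => resolve(body))
    }).on('error', reject) // network errors also reject, so they are retried
  })
}

// Inside an async function: retry the wrapped https.get up to 5 times
const body = await retry(() => httpsGet('https://example.com'), { retries: 5 })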
If you want a more widely used solution/npm module, you can use requestretry (https://www.npmjs.com/package/requestretry):
const request = require('requestretry');

...

// use await inside an async function
const response = await request.get({
  url: 'https://api.domain.com/v1/a/b',
  json: true,
  fullResponse: true, // (default) resolve the promise with the full response or just the body

  // The parameters below are specific to requestretry
  maxAttempts: 5,   // (default) try 5 times
  retryDelay: 5000, // (default) wait 5s before trying again
  retryStrategy: request.RetryStrategies.HTTPOrNetworkError // (default) retry on 5xx or network errors
});

Related

Extend node-fetch default timeout

I am using node-fetch to perform a GET request:
const fetch = require("node-fetch");

try {
  const response = await fetch(`https://someurl/?id=${id}`);
} catch (error) {
  console.error(error);
}
The API takes a long time to return a response, around 15 minutes. With Postman everything works fine, but with node-fetch I get a timeout error after 600000 ms, so I assume node-fetch has a default timeout. I found some ways to change the default timeout, but if I understand the code correctly, that only makes the timeout shorter rather than extending it. Any advice on that?
import fetch from "node-fetch";

// Race the fetch against a timer; whichever settles first wins
export default function (url: any, options: any, timeout = 5000) {
  return Promise.race([
    fetch(url, options),
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error("timeout")), timeout)
    ),
  ]);
}
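Assuming the wrapper above is saved as, say, fetchWithTimeout.ts (the name is illustrative), the call from the question would simply pass a deadline longer than the API's ~15-minute response time. Note that Promise.race only rejects the combined promise; it does not abort the underlying request.

import fetchWithTimeout from "./fetchWithTimeout"; // hypothetical module name

// Allow up to 20 minutes before treating the request as timed out
const response = await fetchWithTimeout(`https://someurl/?id=${id}`, {}, 20 * 60 * 1000);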

Error "Unexpected token o in JSON at position 1" when trying to parse JSON from an HTTP request

I am new to Node.js and would appreciate some assistance.
I am currently writing code that fetches a request id by parsing JSON, then relays that id to another HTTP call to derive a token, which is then entered into HTML via Puppeteer.
Currently, when I attempt to parse the JSON via JSON.parse, I receive the error:
UnhandledPromiseRejectionWarning: SyntaxError: Unexpected token o in JSON at position 1
My intention is to extract the value of the 'request' attribute so it can be referenced as a constant in the subsequent steps. However, I receive the above error.
The JSON looks like this when I make the HTTP request:
{"status":1,"request":"123452590"}
(The request value in this test case would be 123452590, which is what I want to obtain and store as a constant.)
Any help would be really appreciated.
Currently, the code is structured in the following manner:
const formData = {
  method: 'userrecaptcha',
  key: 'xxx',
  googlekey: 'xxx',
  pageurl: 'http://test.ca',
  json: 1
};

var request = require('request');

(async () => {
  const response = await request.post('http://2captcha.com/in.php', { form: formData });
  const requestId = JSON.parse(response).request;
  console.log(requestId);

  async function pollForRequestResults(key, id, retries = 30, interval = 1500, delay = 15000) {
    await timeout(delay);
    return poll({
      taskFn: requestCaptchaResults(key, id),
      interval,
      retries
    });
  }

  function requestCaptchaResults(key, requestId) {
    const url = `http://2captcha.com/res.php?key=${key}&action=get&id=${requestId}&json=1`;
    return async function () {
      return new Promise(async function (resolve, reject) {
        const rawResponse = await request.get(url);
        const resp = JSON.parse(rawResponse);
        if (resp.status === 0) return reject(resp.request);
        resolve(resp.request);
      });
    };
  }

  const timeout = millis => new Promise(resolve => setTimeout(resolve, millis));
})();
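No answer is quoted here, but the usual cause of "Unexpected token o in JSON at position 1" is calling JSON.parse on a value that is already an object: the object is coerced to the string "[object Object]", and the o at position 1 is what the parser chokes on. The callback-based request library does not return a promise, so await request.post(...) does not yield the response body. A minimal sketch, not from the original thread, that wraps the call in a Promise resolving with the body (the helper name postForm is an assumption):

const request = require('request');

// Hypothetical helper: resolve with the response body instead of the request object
function postForm(url, form) {
  return new Promise((resolve, reject) => {
    request.post(url, { form }, (err, httpResponse, body) => {
      if (err) return reject(err);
      resolve(body); // body is the raw response string
    });
  });
}

(async () => {
  const body = await postForm('http://2captcha.com/in.php', formData);
  const requestId = JSON.parse(body).request; // e.g. "123452590"
  console.log(requestId);
})();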

How can I change the result status in Axios with an adapter?

The why
We're using the axios-retry library, which uses this code internally:
axios.interceptors.response.use(null, error => {
Since it only specifies the error callback, the Axios documentation says:
Any status codes that falls outside the range of 2xx cause this function to trigger
Unfortunately we're calling a non-RESTful API that can return 200 with an error code in the body, and we need to retry that.
We've tried adding an Axios interceptor before axios-retry does, changing the result status in this case; that did not trigger the subsequent interceptor's error callback, though.
What did work was specifying a custom adapter. However, this is not well documented, and our code does not handle every case.
The code
const axios = require('axios');
const httpAdapter = require('axios/lib/adapters/http');
const settle = require('axios/lib/core/settle');
const axiosRetry = require('axios-retry');

const myAdapter = async function (config) {
  return new Promise((resolve, reject) => {
    // Delegate to the default http adapter
    return httpAdapter(config).then(result => {
      // We would have more logic here in the production code
      if (result.status === 200) result.status = 500;
      settle(resolve, reject, result);
      return result;
    });
  });
};

const axios2 = axios.create({
  adapter: myAdapter
});

function isErr(error) {
  console.log('retry checking response', error.response.status);
  return !error.response || (error.response.status === 500);
}

axiosRetry(axios2, {
  retries: 3,
  retryCondition: isErr
});

// httpstat.us can return various status codes for testing
axios2.get('http://httpstat.us/200')
  .then(result => {
    console.log('Result:', result.data);
  })
  .catch(e => console.error('Service returned', e.message));
This works in the error case, printing:
retry checking response 500
retry checking response 500
retry checking response 500
retry checking response 500
Service returned Request failed with status code 500
It works in the success case too (change the URL to http://httpstat.us/201):
Result: { code: 201, description: 'Created' }
The issue
Changing the URL to http://httpstat.us/404, though, results in:
(node:19759) UnhandledPromiseRejectionWarning: Error: Request failed with status code 404
at createError (.../node_modules/axios/lib/core/createError.js:16:15)
at settle (.../node_modules/axios/lib/core/settle.js:18:12)
A catch on the httpAdapter call will catch that error, but how do we pass that down the chain?
What is the correct way to implement an Axios adapter?
If there is a better way to handle this (short of forking the axios-retry library), that would be an acceptable answer.
Update
A coworker figured out that doing .catch(e => reject(e)) (or just .catch(reject)) on the httpAdapter call appears to handle the issue. However we'd still like to have a canonical example of implementing an Axios adapter that wraps the default http adapter.
Here's what worked (in node):
const httpAdapter = require('axios/lib/adapters/http');
const settle = require('axios/lib/core/settle');
const customAdapter = config =>
new Promise((resolve, reject) => {
httpAdapter(config).then(response => {
if (response.status === 200)
// && response.data contains particular error
{
// log if desired
response.status = 503;
}
settle(resolve, reject, response);
}).catch(reject);
});
// Then do axios.create() and pass { adapter: customAdapter }
// Now set up axios-retry and its retryCondition will be checked
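Spelling out those last two comments, the remaining wiring might look roughly like this (the 503 retry condition mirrors the adapter above; the instance name is illustrative):

const axios = require('axios');
const axiosRetry = require('axios-retry');

// Route every request through the custom adapter defined above
const client = axios.create({ adapter: customAdapter });

// Retry whenever the adapter rewrote the status to 503, or on network errors
axiosRetry(client, {
  retries: 3,
  retryCondition: error => !error.response || error.response.status === 503
});

client.get('http://httpstat.us/200')
  .then(result => console.log('Result:', result.data))
  .catch(e => console.error('Service returned', e.message));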
Workaround with interceptor and custom error
const axios = require("axios").default;
const axiosRetry = require("axios-retry").default;
axios.interceptors.response.use(async (response) => {
if (response.status == 200) {
const err = new Error("I want to retry");
err.config = response.config; // axios-retry using this
throw err;
}
return response;
});
axiosRetry(axios, {
retries: 1,
retryCondition: (error) => {
console.log("retryCondition");
return false;
},
});
axios
.get("https://example.com/")
.catch((err) => console.log(err.message)); // gonna be here anyway as we'll fail due to interceptor logic

Attach two listeners to a single axios stream

I am trying to fetch a PDF URL as a stream with axios. I need to upload that file to another location and return the hash of the uploaded file. I have a third-party function which accepts the stream and uploads the file to the target location. How can I use the same stream to get the hash of the file?
I am trying to run the code below:
const getFileStream = await axios.get<ReadStream>(externalUrl, {
  responseType: "stream"
});

const hashStream = crypto.createHash("md5");
hashStream.setEncoding("hex");

const pHash = new Promise<string>(resolve => {
  getFileStream.data.on("finish", () => {
    resolve(hashStream.read());
  });
});

const pUploadedFile = externalUploader({
  stream: () => getFileStream.data
});

getFileStream.data.pipe(hashStream);

const [hash, uploadedFile] = await Promise.all([pHash, pUploadedFile]);

return { hash, id: uploadedFile.id };
After running this code, when I download the uploaded PDF, I get a corrupted file.
You can reuse the same axios getFileStream.data to pipe to multiple sinks as long as they are consumed simultaneously.
Below is an example of downloading a file using an axios stream and "concurrently" calculating the MD5 checksum of the file while uploading it to a remote server.
The example writes the following to stdout:
Incoming file checksum: 82c12f208ea18bbeed2d25170f3669a5
File uploaded. Awaiting server response...
File uploaded. Done.
Working example:
const { Writable, Readable, Transform, pipeline } = require('stream');
const crypto = require('crypto');
const https = require('https');
const axios = require('axios');

(async () => {
  // Create an axios stream to fetch the file
  const axiosStream = await axios.get('https://upload.wikimedia.org/wikipedia/commons/thumb/8/86/Map_icon.svg/128px-Map_icon.svg.png', {
    responseType: "stream"
  });

  // To re-upload the file to a remote server, we can use multipart/form-data which will require a boundary key
  const key = crypto.randomBytes(16).toString('hex');

  // Create a request to stream the file as multipart/form-data to another server
  const req = https.request({
    hostname: 'postman-echo.com',
    path: '/post',
    method: 'POST',
    headers: {
      'content-type': `multipart/form-data; boundary=--${key}`,
      'transfer-encoding': 'chunked'
    }
  });

  // Create a promise that will be resolved/rejected when the remote server has completed the HTTP(S) request
  const uploadRequestPromise = new Promise((resolve, reject) => req.once('response', (incomingMessage) => {
    incomingMessage.resume(); // prevent response data from queuing up in memory
    incomingMessage.on('end', () => {
      if (incomingMessage.statusCode === 200) {
        resolve();
      }
      else {
        reject(new Error(`Received status code ${incomingMessage.statusCode}`));
      }
    });
  }));

  // Construct the multipart/form-data delimiters
  const multipartPrefix = `\r\n----${key}\r\n` +
    'Content-Disposition: form-data; filename="cool-name.png"\r\n' +
    'Content-Type: image/png\r\n' +
    '\r\n';
  const multipartSuffix = `\r\n----${key}--`;

  // Write the beginning of a multipart/form-data request before streaming the file content
  req.write(multipartPrefix);

  // Create a promise that will be fulfilled when the file has finished uploading
  const uploadStreamFinishedPromise = new Promise((resolve, reject) => {
    pipeline(
      // Use the axios request as a stream source
      axiosStream.data,
      // Piggyback a nodejs Transform stream because of the convenient flush() call that can
      // add the multipart/form-data suffix
      new Transform({
        objectMode: false,
        transform(chunk, encoding, next) {
          next(null, chunk);
        },
        flush(next) {
          this.push(multipartSuffix);
          next();
        }
      }),
      // Write the streamed data to the remote server
      req,
      // This callback is executed when all data from the stream pipe has been processed
      (error) => {
        if (error) {
          reject(error);
        }
        else {
          resolve();
        }
      }
    );
  });

  // Create an MD5 stream hasher
  const hasher = crypto.createHash("md5");

  // Create a promise that will be resolved when the hash function has processed all the stream data
  const hashPromise = new Promise((resolve, reject) => pipeline(
    // Use the axios request as a stream source.
    // Note that it's OK to use the same stream to pipe into multiple sinks. In this case, we're
    // using the same axios stream both for calculating the hash and for uploading the file above
    axiosStream.data,
    // The hash function will process the stream data
    hasher,
    // This callback is executed when all data from the stream pipe has been processed
    (error) => {
      if (error) {
        reject(error);
      }
      else {
        resolve();
      }
    }
  ));

  /**
   * Note that there are no 'awaits' before both stream sinks have been established. That is
   * important since we want both sinks to process data from the beginning of the stream
   */

  // We must wait to call the hash function's digest() until all the data has been processed
  await hashPromise;
  const hash = hasher.digest("hex");

  console.log("Incoming file checksum:", hash);

  await uploadStreamFinishedPromise;
  console.log("File uploaded. Awaiting server response...");

  await uploadRequestPromise;
  console.log("File uploaded. Done.");
})()
  .catch(console.error);
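Applied to the asker's original snippet, the essential point is just to attach both sinks to the same response stream before anything is awaited. A minimal sketch, not from the answer above, assuming externalUploader consumes the stream it is handed (externalUrl and externalUploader are the asker's placeholders):

const axios = require('axios');
const crypto = require('crypto');
const { PassThrough } = require('stream');

async function uploadAndHash(externalUrl, externalUploader) {
  const response = await axios.get(externalUrl, { responseType: 'stream' });
  const source = response.data;

  // Branch 1: feed every chunk into an MD5 hash as it arrives
  const hasher = crypto.createHash('md5');
  const hashDone = new Promise((resolve, reject) => {
    source.on('data', chunk => hasher.update(chunk));
    source.on('end', () => resolve(hasher.digest('hex')));
    source.on('error', reject);
  });

  // Branch 2: pipe the same chunks through a PassThrough for the uploader.
  // Both branches are attached before any await, so neither misses data.
  const uploadBranch = new PassThrough();
  source.pipe(uploadBranch);

  const uploadedFile = await externalUploader({ stream: () => uploadBranch });
  const hash = await hashDone;

  return { hash, id: uploadedFile.id };
}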

Puppeteer: How to listen to a specific response?

I'm tinkering with the headless Chrome Node API called Puppeteer.
I'm wondering how to listen for a specific request's response and how to act on it.
I have looked at the requestfinished and response events, but they give me all the requests/responses already performed in the page.
How can I achieve this behaviour?
One option is to do the following:
page.on('response', response => {
  if (response.url().endsWith("your/match")) {
    console.log("response code: ", response.status());
    // do something here
  }
});
This still catches all requests, but allows you to filter and act on the event emitter.
https://github.com/GoogleChrome/puppeteer/blob/master/docs/api.md#event-response
This waits (for up to 11 seconds) for the response to a PATCH or POST request to a specific URL, then parses the body as JSON, every time you call it:
const finalResponse = await page.waitForResponse(
  response =>
    response.url() === urlOfRequest &&
    (response.request().method() === 'PATCH' || response.request().method() === 'POST'),
  { timeout: 11000 }
);

let responseJson = await finalResponse.json();
console.log(responseJson);
Since puppeteer v1.6.0 (I guess) you can use page.waitForResponse(urlOrPredicate[, options])
Example usage from docs:
const firstResponse = await page.waitForResponse('https://example.com/resource');
const finalResponse = await page.waitForResponse(response =>
response.url() === 'https://example.com' && response.status() === 200
);
return finalResponse.ok();
I was using jest-puppeteer and trying to test for a specific response code of my test server. page.goto() resolves to the response of the original request.
Here is a simple test that a 404 response is returned when expected.
describe(`missing article page`, () => {
let response;
beforeAll(async () => {
response = await page.goto('http://my-test-server/article/this-article-does-not-exist')
})
it('has an 404 response for missing articles', () => {
expect(response.status()).toEqual(404)
})
it('has a footer', async () => {
await expect(page).toMatch('My expected footer content')
})
})
To get the XHR response, simply do:
const firstResponse = await page.waitForResponse('https://example.com/resource')
// the NEXT line will extract the json response
console.log( await firstResponse.json() )
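A related pattern, not shown in the answers above, is to start waiting for the response before triggering the action that causes the request, so the response cannot be missed; the selector and URL fragment below are illustrative placeholders:

// Begin waiting before clicking, then resolve both together
const [matchedResponse] = await Promise.all([
  page.waitForResponse(res => res.url().includes('/api/submit') && res.status() === 200),
  page.click('#submit'),
]);
console.log(await matchedResponse.json());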
