Node.js GET API is getting called twice intermittently

I have a node.js GET API endpoint that calls some backend services to get data.
app.get('/request_backend_data', function(req, res) {
  // ... calls backend services and sends the response ...
});
When there is a delay getting a response back from the backend services, this endpoint (request_backend_data) gets triggered again exactly 2 minutes later. I have checked my application code, and there is no retry logic written anywhere for the delay case.
Does a Node.js API endpoint get called twice in any case (like a delay or timeout)?

There might be a few reasons:
Some Chrome extensions can cause duplicate requests; run your app in a different browser, and if the problem disappears it is a Chrome-specific issue.
The browser might be making a second request for favicon.ico. To handle this, use the serve-favicon module: https://www.npmjs.com/package/serve-favicon
Add a CORS policy; the browser might be sending preflight requests. Use this npm package: https://www.npmjs.com/package/cors

No, Node.js will not invoke your handler twice by itself. (One caveat: before Node 13, http.Server had a default socket timeout of 2 minutes; when it fires, the connection is destroyed, and many clients then retry the request - which would match the "exactly 2 minutes" symptom. Check server.timeout.)
Look for the issue in your frontend:
it can be the JavaScript fetch API with a 'retry' option set
it can be a messed-up RxJS operator chain that implicitly emits events and triggers another REST request
it can be an entire page reload on timeout, which re-fetches all necessary data from the backend
it can be request interceptors (in axios, Angular, etc.) that modify something and re-send
... many potential reasons, but most likely not in the backend (Node.js)
Just make a simple example that invokes your Node.js 'request_backend_data' endpoint with axios or XMLHttpRequest - you will likely see that the problem is not in the backend part.

Try checking the API call with the code below, which follows redirects. Add headers as needed (i.e., 'Authorization': 'bearer dhqsdkhqd...', etc.).
var https = require('follow-redirects').https;

var options = {
  method: 'GET',
  hostname: 'foo.com',
  path: '/request_backend_data',
  headers: {},
  maxRedirects: 20
};

var req = https.request(options, function (res) {
  var chunks = [];
  res.on("data", function (chunk) {
    chunks.push(chunk);
  });
  res.on("end", function () {
    var body = Buffer.concat(chunks);
    console.log(body.toString());
  });
});
req.on("error", function (error) {
  console.error(error);
});
req.end();
Paste this into a file called test.js, then run it with node test.js.

Related

Axios always times out on AWS Lambda for a particular API

Describe the issue
I'm not really sure if this is an Axios issue or not. The following code runs successfully on my local development machine but always times out whenever I run it from the cloud (e.g. AWS Lambda). The same thing happens when I run it on repl.it.
I can confirm that AWS Lambda has internet access and it works for any other API but this:
https://www.target.com.au/ws-api/v1/target/products/search?category=W95362
Example Code
https://repl.it/repls/AdeptFluidSpreadsheet
const axios = require('axios');

const handler = async () => {
  const url = 'https://www.target.com.au/ws-api/v1/target/products/search?category=W95362';
  const response = await axios.get(url, { timeout: 10000 });
  console.log(response.data.data.productDataList);
}

handler();
Environment
Axios Version: 0.19.2
Runtime: nodejs12x
Update 1
I tried the native require('https') and it times out on both localhost and cloud server. Please find sample code here: https://repl.it/repls/TerribleViolentVolume
const https = require('https');
const url = 'https://www.target.com.au/ws-api/v1/target/products/search?category=W95362';

https.get(url, res => {
  var body = '';
  res.on('data', chunk => {
    body += chunk;
  });
  res.on('end', () => {
    var response = JSON.parse(body);
    console.log("Got a response: ", response);
  });
}).on('error', e => {
  console.log("Got an error: ", e);
});
Again, I can confirm that same code works on any other API.
Update 2
I suspect that this is something server side as it also behaves very weirdly with curl.
curl from local -> 403 access denied
curl from local with User-Agent header -> success
curl from cloud server -> 403 access denied
It must be server side validation, something related to AkamaiGHost.
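Given the curl results above, one hedged workaround is to send a browser-like User-Agent from the Lambda as well. A sketch of building such request options (the UA string below is just an example value, not anything the API documents):

```javascript
// Build axios-style request options that include a browser-like User-Agent,
// since the curl experiment shows the server returns 403 without one.
function buildOptions() {
  return {
    timeout: 10000,
    headers: {
      // Example UA string - any common browser value passed the curl test.
      'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36',
    },
  };
}

const opts = buildOptions();
console.log(opts.headers['User-Agent']);
```

These options would then be passed as the second argument to `axios.get(url, buildOptions())`.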
You have probably placed your Lambda function in a VPC without internet access to the outside world. Check the VPC section in your Lambda configuration and route the function's subnets through a NAT gateway (an internet gateway alone is not enough, because Lambda's network interfaces have no public IPs).
Try wrapping the axios call in a try/catch block; that may surface the underlying error.
const axios = require('axios');

const handler = async () => {
  try {
    const url = 'https://www.target.com.au/ws-api/v1/target/products/search?category=W95362';
    const response = await axios.get(url, { timeout: 10000 });
    console.log(typeof (response));
    console.log(response);
  } catch (e) {
    console.log(e, "error api call");
  }
}

handler();
As suggested by Akshay, you can use a try and catch block to get the error. Maybe it will help you out.
Have you configured error handling for asynchronous invocation?
To configure error handling, follow the steps below:
Open the Lambda console Functions page.
Choose a function.
Under Asynchronous invocation, choose Edit.
Configure the following settings.
Maximum age of event – The maximum amount of time Lambda retains an event in the asynchronous event queue, up to 6 hours.
Retry attempts – The number of times Lambda retries when the function returns an error, between 0 and 2.
Choose Save.
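If you prefer not to click through the console, the same asynchronous-invocation settings can be applied with the AWS CLI (the function name below is a placeholder):

```shell
# Disable retries and cap event age for async invocations of my-function.
aws lambda put-function-event-invoke-config \
  --function-name my-function \
  --maximum-event-age-in-seconds 3600 \
  --maximum-retry-attempts 0
```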
axios is just a Promise-based HTTP client for the browser and Node.js, and since you set timeout: 10000, I believe the timeout issue is not on its end.
Your API
https://www.target.com.au/ws-api/v1/target/products/search?category=W95362
works fine in the browser and renders JSON data.
Note that the Lambda function timeout defaults to only 3 seconds (configurable up to a maximum of 15 minutes); with your 10-second axios timeout, the function may be killed before the request completes, so check that setting too.
Make sure you have set the other configuration (permissions, etc.) as suggested in the documentation.
Here you can check the default limits for AWS Lambda.

How would I return status codes and responses in AWS Lambda

I am writing a Lambda function which is going to be used to send a test message to an API. If there are errors, I will need it to run certain functionality (like notifying me with AWS messaging). I would like a simple test by status code: for example, if I get a 2XX, do nothing; if I get a 4XX or 5XX, notify me so I can research the issue. In the test environment I am passing the body as an XML string as a value in a JSON object.
example Lambda Test Event
{
"data": "<xml stuff, credentials, etc"
}
here is my function
exports.handler = async (event, context) => {
  const https = require('https');
  const options = {
    hostname: 'mythingy.com', // note: no scheme here - 'https://' in hostname breaks the lookup
    port: 443,
    path: '/target',
    method: 'POST',
    headers: {'Content-Type': 'application/xml'}
  };
  const req = https.request(options, res => {
    console.log(`statusCode: ${res.statusCode}`);
    res.on('data', d => {
      process.stdout.write(d);
    });
  });
  req.on('error', error => {
    console.error(error);
  });
  req.write(event.data);
  req.end();
};
I'm using Node 10.x in Lambda, and I am getting a "result succeeded" message from Lambda, but no logged response statusCode. I've done it several ways, and have easily pulled statusCodes from Node fetch, ajax, and http requests in the past. I know this probably has something to do with Lambda's environment and the promise. Can anyone help me figure out how to log the status code in Lambda?
You don't see it printed out because your function is async and https.request uses a callback approach, which will be run asynchronously by the Node.js workers. It turns out that the function will have reached its end before it has a chance to execute the code inside the callback. And yes, you are right, this is due to the way Lambda functions work, because they are short-lived (contexts can be reused, but that's a story for another question), therefore the processes are terminated by the underlying containers. It never happened to you in traditional Node.js applications because they usually run behind a webserver, which is responsible for keeping the process up and running, so callbacks are eventually executed.
You have to either promisify https.request or use a library which already works with Promises, so you could easily await on them. Axios and Request are good options.
Once you have chosen your library - or have promisified https.request - (I'll use axios for my example), you can simply await on the call, get its results and do whatever you want with it.
const res = await axios.post('https://service-you-want-to-connect-to.com', {})
console.log(JSON.stringify(res)) // inspect the res object here and decide what to do with the status code

Are multiple HTTP headers with the same name supported?

I'm experimenting with migrating an ASP.net REST backend to Azure Functions. My possibly naive approach to this was creating a catch-all function that proxies HTTP requests via Node's http module, then slowly replacing endpoints with native Azure Functions with more specific routes. I'm using the CLI, and created a Node function like so:
var http = require("http")

module.exports = function (context, req) {
  var body = Buffer.alloc(0) // Buffer.alloc replaces the deprecated new Buffer([])
  var headers = req.headers
  headers["host"] = "my.proxy.target"
  var proxy = http.request({
    hostname: "my.proxy.target",
    port: 80,
    method: req.method,
    path: req.originalUrl,
    headers: headers
  }, function(outboundRes) {
    console.log("Got response")
    context.res.status(outboundRes.statusCode)
    for (var header in outboundRes.headers) {
      console.log("Header", header)
      if (header != "set-cookie")
        context.res.setHeader(header, outboundRes.headers[header])
      else
        console.log(outboundRes.headers[header])
    }
    outboundRes.addListener("data", function(chunk) {
      body = Buffer.concat([body, chunk])
    })
    outboundRes.addListener("end", function() {
      console.log("End", context.res)
      context.res.raw(body)
    })
  })
  proxy.end(req.body)
}
This almost seems to work, but my backend sets several cookies using several Set-Cookie headers. Node hands these back as an array of cookie values, but it seems like Azure Functions doesn't accept arrays as values, or doesn't permit setting multiple headers with the same name, as seems to be allowed for Set-Cookie.
Is this supported? I've googled and checked the TypeScript source for Response, but it doesn't appear to be.
If this isn't supported, what Azure platform services should I use to fail over 404s from one app to another, so I can slowly replace the monolith with Functions? Function proxies would work if I could use them as fallbacks, but that doesn't appear possible.
Thanks!

Sinon fake server not intercepting requests

Trying to use Sinon for the first time because of its fake server functionality, which lets me stub an API response. The test itself is written for Mocha.
However, the fake server doesn't seem to be intercepting the requests.
Code:
describe('when integrated', function() {
  var server;

  beforeEach(function() {
    server = sinon.createFakeServer();
  });

  afterEach(function() {
    server.restore();
  });

  it('can send a message to the notification service', function() {
    server.respondWith("POST", new RegExp('.*/api/notificationmanager/messages.*'),
      [200,
       { "Content-Type": "application/json" },
       '{ "messageId":23561}'
      ]);

    var messageOnly = new PushMessage(initMessageObj);
    var originalUrl = PushMessage.serverUrl;
    messageOnly.setServerAPI("http://a.fake.server/api/notificationmanager/messages");

    console.log("fake server is: ", server);
    messageOnly.notify()
      .then(function(response) {
        messageOnly.setServerAPI(originalUrl);
        return response;
      })
      .then(function(response) {
        response.should.be.above(0);
      });

    console.log(server.requests);
    server.respond();
  });
});
For reference, PushMessage is an object that has a static property serverUrl. I'm just setting the value to a fake URL & then resetting it.
The notify() function makes a post message using request-promise-native to the serverUrl set in the PushMessage's static property.
What seems to be happening, is that the POST request ends up being properly attempted against the URL of http://a.fake.server/api/notificationmanager/messages, resulting in an error that the address doesn't exist...
Any idea what I'm doing wrong...? Thanks!
There have been several issues on the Sinon GitHub repository about this. Sinon's fake server:
Provides a fake implementation of XMLHttpRequest and provides several interfaces for manipulating objects created by it.
Also fakes native XMLHttpRequest and ActiveXObject (when available, and only for XMLHTTP progids). Helps with testing requests made with XHR.
Node doesn't make XHR requests, so Sinon's fake server can't intercept them in this use case. I wish it did too.
Here's an issue that breaks it down: https://github.com/sinonjs/sinon/issues/1049
Nock is a good alternative that works with Node: https://www.npmjs.com/package/nock

Nodejs Request module -- how to set global keepalive

I am using the request npm module in my app to create an HTTP client, like this:
var request = require('request');
And each time, I make a request to some server, I pass the options as below:
var options = {
  url: "whateverurl...",
  body: { /* some json data for POST ... */ }
};

request(options, function cb(e, r, body) {
  // handle response here...
});
This was working fine, until I started testing with high load, and I started getting errors indicating no address available (EADDRNOTAVAIL). It looks like I am running out of ephemeral ports, as there is no pooling or keep-alive enabled.
After that, I changed it to this:
var options = {
  url: "whateverurl...",
  body: { /* some json data for POST ... */ },
  forever: true
};

request(options, function cb(e, r, body) {
  // handle response here...
});

(Note the forever: true option.)
I looked up the request module's documentation for how to set keep-alive. According to the documentation and this Stack Overflow thread, I am supposed to add { forever: true } to my options.
It didn't seem to work for me: when I checked with tcpdump, the server was still closing the connection. So, my questions are:
Am I doing something wrong here?
Should I instead be setting a global option on the request module when I require it, rather than passing { forever: true } on each HTTP request? This is confusing to me.
