Are multiple HTTP headers with the same name supported? - node.js

I'm experimenting with migrating an ASP.net REST backend to Azure Functions. My possibly naive approach to this was creating a catch-all function that proxies HTTP requests via Node's http module, then slowly replacing endpoints with native Azure Functions with more specific routes. I'm using the CLI, and created a Node function like so:
var http = require("http")

module.exports = function (context, req) {
    var body = Buffer.alloc(0) // new Buffer([]) is deprecated
    var headers = req.headers
    headers["host"] = "my.proxy.target"
    var proxy = http.request({
        hostname: "my.proxy.target",
        port: 80,
        method: req.method,
        path: req.originalUrl,
        headers: headers
    }, function(outboundRes) {
        console.log("Got response")
        context.res.status(outboundRes.statusCode)
        for (var header in outboundRes.headers) {
            console.log("Header", header)
            if (header != "set-cookie")
                context.res.setHeader(header, outboundRes.headers[header])
            else
                console.log(outboundRes.headers[header])
        }
        outboundRes.addListener("data", function(chunk) {
            body = Buffer.concat([body, chunk])
        })
        outboundRes.addListener("end", function() {
            console.log("End", context.res)
            context.res.raw(body)
        })
    })
    proxy.end(req.body)
}
This almost seems to work, but my backend sets several cookies using several Set-Cookie headers. Node hands these back as an array of cookie values, but it seems like Azure Functions doesn't accept arrays as values, or doesn't permit setting multiple headers with the same name, as seems to be allowed for Set-Cookie.
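For reference, here is the general shape of a workaround I considered (a sketch only, assuming the runtime accepts a plain headers object assigned to context.res; whether it preserves the set-cookie array and writes it back as separate headers is exactly what I'm asking):
// Node represents repeated Set-Cookie response headers as an array of strings:
// outboundRes.headers["set-cookie"] === ["a=1; Path=/", "b=2; Path=/"]
// Sketch: assign the whole response in the "end" listener instead of calling
// setHeader() per header.
outboundRes.addListener("end", function() {
    context.res = {
        status: outboundRes.statusCode,
        headers: outboundRes.headers, // includes the "set-cookie" array
        body: body,
        isRaw: true
    }
    context.done()
})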
Is this supported? I've googled and have checked out the TypeScript source for Response, but it doesn't appear to be.
If this isn't supported, what Azure platform services should I use to fail over 404s from one app to another, so I can slowly replace the monolith with Functions? Function proxies would work if I could use them as fallbacks, but that doesn't appear possible.
Thanks!

Related

Node.js GET API is getting called twice intermittently

I have a node.js GET API endpoint that calls some backend services to get data.
app.get('/request_backend_data', function(req, res) {
    // ... calls backend services and returns the data ...
});
When there is a delay getting a response back from the backend services, this endpoint (request_backend_data) gets triggered again exactly 2 minutes later. I have checked my application code, and there is no retry logic written anywhere.
Does a Node.js API endpoint get called twice in any case (like a delay or timeout)?
There might be a few reasons:
Some Chrome extensions can cause this kind of bug; they have been causing a lot of issues lately. Run your app in a different browser, and if the problem disappears it is a Chrome-specific problem.
The browser might be requesting favicon.ico, which hits your routes. To prevent this, use this module: https://www.npmjs.com/package/serve-favicon
Add a CORS policy. The browser might be sending preflight (OPTIONS) requests. Use this npm package: https://www.npmjs.com/package/cors (see the sketch below)
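A minimal Express sketch wiring up both modules mentioned above (the favicon path and the port are placeholders, adjust them to your project):
const path = require('path');
const express = require('express');
const favicon = require('serve-favicon');
const cors = require('cors');

const app = express();
app.use(favicon(path.join(__dirname, 'public', 'favicon.ico'))); // answer favicon.ico without hitting your routes
app.use(cors()); // answer preflight OPTIONS requests

app.get('/request_backend_data', function(req, res) {
    // ... call backend services ...
    res.json({ ok: true });
});

app.listen(3000);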
No, there are no default timeouts in Node.js or anything like that.
Look for the issue in your frontend code:
it can be the JavaScript fetch API with a 'retry' option set
it can be a messed-up RxJS operator chain which emits events implicitly and triggers another REST request
it can be an entire page reload on timeout, which re-fetches all the necessary data from the backend
it can be request interceptors (in axios, Angular, etc.) which modify something and re-send
... many potential reasons, but not in the backend (Node.js) for sure.
Just make a simple example and invoke your Node.js 'request_backend_data' endpoint with axios or XMLHttpRequest; you will see that the problem is not on the backend side.
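For instance, a minimal standalone check might look like this (a sketch; the localhost URL and port are placeholders for wherever your server listens):
// call the endpoint once and log how long it takes
const axios = require('axios');

const started = Date.now();
axios.get('http://localhost:3000/request_backend_data', { timeout: 5 * 60 * 1000 })
    .then(function (res) {
        console.log('status', res.status, 'after', Date.now() - started, 'ms');
    })
    .catch(function (err) {
        console.error('failed after', Date.now() - started, 'ms:', err.message);
    });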
Try checking the API call with the code below, which follows redirects. Add headers as needed (i.e. 'Authorization': 'bearer dhqsdkhqd...etc').
var https = require('follow-redirects').https;
var fs = require('fs');

var options = {
    'method': 'GET',
    'hostname': 'foo.com',
    'path': '/request_backend_data',
    'headers': {
    },
    'maxRedirects': 20
};

var req = https.request(options, function (res) {
    var chunks = [];

    res.on("data", function (chunk) {
        chunks.push(chunk);
    });

    res.on("end", function (chunk) {
        var body = Buffer.concat(chunks);
        console.log(body.toString());
    });

    res.on("error", function (error) {
        console.error(error);
    });
});

req.end();
Paste into a file called test.js then run with node test.js.

I have 2 different Node.js servers (A, B). How can I call Server A endpoints inside Server B endpoints?

I have an Update endpoint which updates the users DB inside Server A. I want this endpoint to call another endpoint in Server B to update the accounts DB.
Add an env variable which points to Server B.
Create a small module that makes an HTTP call with the pure, built-in Node API (see the sketch after this list).
Execute a method/fn from the created module while handling the request in Server A.
Do not forget to properly log all invocation results to make your life easier! =)
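As a rough sketch (the SERVER_B_URL env variable and the /accounts/update path are placeholders, not part of your actual API):
// callServerB.js - tiny helper around the built-in http module
const http = require('http');

// SERVER_B_URL is an env variable pointing at Server B, e.g. http://localhost:4000
function updateAccounts(payload) {
    return new Promise(function (resolve, reject) {
        const body = JSON.stringify(payload);
        const req = http.request(process.env.SERVER_B_URL + '/accounts/update', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) }
        }, function (res) {
            let data = '';
            res.on('data', function (chunk) { data += chunk; });
            res.on('end', function () {
                console.log('Server B replied', res.statusCode, data); // log every invocation result
                resolve({ status: res.statusCode, body: data });
            });
        });
        req.on('error', reject);
        req.end(body);
    });
}

module.exports = { updateAccounts };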
You have two services:
Service A to update users.
Service B to update accounts.
Do you want to call both endpoints at once?
Here are a few solutions:
Use pub/sub with Redis, bull, or bee-queue.
You can call the Service B endpoint from Service A using a REST client like axios or got (see the sketch after this list).
You can save to the accounts DB directly from Service A.
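For the second option, a minimal sketch with axios (the URL, path, and payload are placeholders):
// inside Server A's update-users handler, after the users DB write succeeds
const axios = require('axios');

async function notifyServerB(userId, changes) {
    // SERVER_B_URL is assumed to point at Server B, e.g. http://localhost:4000
    const res = await axios.post(`${process.env.SERVER_B_URL}/accounts/update`, {
        userId: userId,
        changes: changes
    });
    return res.data;
}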
You can create microservices, then put your private API key in .env in Server A; the service created in Server B provides the data. As the comment above said, you can use libraries like fetch or axios, and while you are sending the HTTP request you need to pass the API_KEY in the headers:
var http = require('http');

var options = {
    host: 'localhost', // host must not include the protocol
    path: '/ServerB_API',
    port: '28017',
    headers: {'custom': 'Custom Header Demo works', 'API_KEY': 'Some Secret Key'}
};

var callback = function(response) {
    var str = '';
    response.on('data', function (chunk) {
        str += chunk;
    });
    response.on('end', function () {
        console.log(str);
    });
};

var req = http.request(options, callback);
req.end();
In the response you will get the data back, but before responding, Server B should check that the API_KEY header matches the API_KEY in its .env; they must be the same so that no other client can access your services:
// Node lower-cases incoming header names, so 'API_KEY' arrives as 'api_key'
if (!req.headers['api_key'] || req.headers['api_key'] !== process.env.API_KEY) {
    throw new Error('Invalid api key');
}

Nodejs proxy request coalescing

I'm running into an issue with my http-proxy-middleware setup. I'm using it to proxy requests to another service which, for example, might resize images.
The problem is that multiple clients might call the method multiple times and thus create a stampede on the original service. I'm now looking into a solution (what some services, e.g. Varnish, call request coalescing) that would call the service once, wait for the response, 'queue' the incoming requests with the same signature until the first is done, and then answer them all in a single go. This is different from caching results: I want to prevent calling the backend multiple times simultaneously, not necessarily cache the results.
I'm trying to find out whether something like this goes by a different name, or whether I'm missing something that others have already solved somehow, but I can't find anything.
As the use case seems pretty basic for a reverse-proxy setup, I would have expected a lot of hits in my searches, but since the problem space is pretty generic I'm not getting anything.
Thanks!
A colleague of mine helped me hack together my own answer. It's currently used as an (Express) middleware for specific GET endpoints; it hashes the request into a map and starts a single separate request. Concurrent incoming requests with the same hash are looked up in the map and resolved in that single request's callback, so the upstream call is reused. This also means that if the first response is particularly slow, all coalesced requests are too.
This seemed easier than hacking it into http-proxy-middleware, but oh well, it got the job done :)
const axios = require('axios');

// map of in-flight upstream requests, keyed by path + query
const responses = {};

module.exports = (req, res) => {
    const queryHash = `${req.path}/${JSON.stringify(req.query)}`;

    if (responses[queryHash]) {
        // an identical request is already in flight: just join its list of waiters
        console.log('re-using request', queryHash);
        responses[queryHash].push(res);
        return;
    }

    console.log('new request', queryHash);
    const axiosConfig = {
        method: req.method,
        url: `[the original backend url]${req.path}`,
        params: req.query,
        headers: {}
    };
    if (req.headers.cookie) {
        axiosConfig.headers.Cookie = req.headers.cookie;
    }

    responses[queryHash] = [res];
    axios.request(axiosConfig).then((axiosRes) => {
        // answer every response that piled up while the upstream call was running
        responses[queryHash].forEach((coalescingRequest) => {
            coalescingRequest.json(axiosRes.data);
        });
        responses[queryHash] = undefined;
    }).catch((err) => {
        responses[queryHash].forEach((coalescingRequest) => {
            coalescingRequest.status(500).json(false);
        });
        responses[queryHash] = undefined;
    });
};
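A hypothetical mounting example, assuming the middleware above is saved as coalesce.js and that you only want it on a couple of heavy GET routes (the route names and port here are made up):
const express = require('express');
const coalesce = require('./coalesce'); // the handler from above (hypothetical filename)

const app = express();

// use the coalescing handler only for heavy, idempotent GET endpoints
app.get('/resize', coalesce);
app.get('/thumbnails', coalesce);

app.listen(8080);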

Node.js - Why does my HTTP GET request return a 404 when I know the data is there at the URL I am using

I'm still new enough with Node that HTTP requests trip me up. I have checked all the answers to similar questions but none seem to address my issue.
I have been dealt a hand in the Wild of having to go after JSON files in an API. I then parse those JSON files to separate them out into rows that populate a SQL database. The API has one JSON file with an ID of 'keys.json' that looks like this:
{
"keys":["5sM5YLnnNMN_1540338527220.json","5sM5YLnnNMN_1540389571029.json","6tN6ZMooONO_1540389269289.json"]
}
Each array element in the keys property holds the value of one of the JSON data files in the API.
I am having problems getting either type of file returned to me, but I figure if I can learn what is wrong with the way I am trying to get 'keys.json', I can leverage that knowledge to get the individual JSON data files represented in the keys array.
I am using the npm modules 'request' and 'request-promise-native' as follows:
const request = require('request');
const rp = require('request-promise-native');
My URL is constructed with the following elements (I have used the ... to keep my client anonymous, but other than that it is a direct copy):
let baseURL = 'http://localhost:3000/Users/doug5solas/sandbox/.../server/.quizzes/'; // this is the development value only
let keysID = 'keys.json';
Clearly the localhost aspect will have to go away when we deploy but I am just testing now.
Here is my HTTP call:
let options = {
    method: 'GET',
    uri: baseURL + keysID,
    headers: {
        'User-Agent': 'Request-Promise'
    },
    json: true // Automatically parses the JSON string in the response
};
rp(options)
    .then(function (res) {
        jsonKeysList = res.keys;
        console.log('Fetched', jsonKeysList);
    })
    .catch(function (err) {
        // API call failed
        let errMessage = err.options.uri + ' ' + err.statusCode + ' Not Found';
        console.log(errMessage);
        return errMessage;
    });
Here is my console output:
http://localhost:3000/Users/doug5solas/sandbox/.../server/.quizzes/keys.json 404 Not Found
It is clear to me that the .catch() clause is being hit and not the .then() clause, but I do not know why, because the data is there at that location; I know it is, because I placed it there manually.
Thanks to @Kevin B for the tip regarding serving of static files. I revamped the logic using express.static, served the file that way, and everything worked as expected.
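Roughly what that looked like (a sketch; the mount path, directory name, and port reflect my local setup and may differ for you):
// serve the quiz JSON files as static assets instead of expecting
// a route to exist for the raw filesystem path
const path = require('path');
const express = require('express');

const app = express();

// anything under the .quizzes directory is now reachable as
// http://localhost:3000/quizzes/<filename>, e.g. /quizzes/keys.json
app.use('/quizzes', express.static(path.join(__dirname, '.quizzes')));

app.listen(3000);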

NTLM Proxy Authentication in Node Request

I'm trying to make a request using Node behind a corporate web proxy which requires NTLM authentication. I've tried using a couple libraries such as the proxying-agent but am having little success.
Here's a simplified version of my code, using the request library and ntlm.js (similar to the proxying-agent). I'd expect to receive a successful response after the last request call, but for some reason I'm still getting a 407:
// `request` is the request library; `ntlm` is the ntlm.js helper referenced above
// (it provides createType1Message / parseType2Message / createType3Message)
var request = require('request');

var ntlmRequest = function(req) {
    var ntlmOptions = {};
    ntlmOptions.ntlm = {};
    ntlmOptions.method = 'GET';
    ntlmOptions.path = 'http://www.jsonip.co.uk';
    ntlmOptions.ntlm.username = 'USERNAME';
    ntlmOptions.ntlm.password = 'Pa$$word';
    ntlmOptions.ntlm.workstation = 'PC-NAME';
    ntlmOptions.ntlm.domain = 'DOMAIN';

    // type 1 (negotiate) message goes out in the Proxy-Authorization header
    var type1message = ntlm.createType1Message(ntlmOptions.ntlm);

    var requestOptions = {
        url: 'http://www.jsonip.co.uk',
        proxy: 'http://webproxy.domain.com:8080',
        headers: req.headers
    };
    requestOptions.headers['Proxy-Authorization'] = type1message;
    requestOptions.headers['Proxy-Connection'] = 'Keep-Alive';

    request(requestOptions, function(err, res) {
        // the proxy's 407 response carries the type 2 (challenge) message
        var type2message = ntlm.parseType2Message(res.headers['proxy-authenticate']);
        var type3message = ntlm.createType3Message(type2message, ntlmOptions.ntlm);
        requestOptions.headers['Proxy-Authorization'] = type3message;

        // second request with the type 3 (authenticate) message
        request(requestOptions, function(err, res) {
            console.log(res.statusCode);
        });
    });
};
I've tried comparing some packet captures and on a working NTLM request (using curl) I can see this during the type 1 and type 3 requests -
[HTTP request 1/2]
[HTTP request 2/2]
My requests only show 1/1.
I'm thinking maybe in other implementations of NTLM the browsers are spanning the requests across multiple packets. Not sure if this is the reason why it isn't working, or maybe just a different way of doing things.
Thanks in advance.
