I'm trying to make a request using Node from behind a corporate web proxy which requires NTLM authentication. I've tried using a couple of libraries such as proxying-agent but am having little success.
Here's a simplified version of my code using the request library and ntlm.js, similar to what proxying-agent does. I'd expect to receive a successful response after the last request call, but for some reason I'm still getting a 407:
var ntlmRequest = function(req) {
  var ntlmOptions = {};
  ntlmOptions.ntlm = {};
  ntlmOptions.method = 'GET';
  ntlmOptions.path = 'http://www.jsonip.co.uk';
  ntlmOptions.ntlm.username = 'USERNAME';
  ntlmOptions.ntlm.password = 'Pa$$word';
  ntlmOptions.ntlm.workstation = 'PC-NAME';
  ntlmOptions.ntlm.domain = 'DOMAIN';

  // Type 1 (negotiate) message, sent to the proxy in the first request
  var type1message = ntlm.createType1Message(ntlmOptions.ntlm);

  var requestOptions = {
    url: 'http://www.jsonip.co.uk',
    proxy: 'http://webproxy.domain.com:8080',
    headers: req.headers
  };
  requestOptions.headers['Proxy-Authorization'] = type1message;
  requestOptions.headers['Proxy-Connection'] = 'Keep-Alive';

  request(requestOptions, function(err, res) {
    // Type 2 (challenge) comes back in the proxy's 407 response
    var type2message = ntlm.parseType2Message(res.headers['proxy-authenticate']);
    var type3message = ntlm.createType3Message(type2message, ntlmOptions.ntlm);

    // Type 3 (authenticate) message goes back to the proxy
    requestOptions.headers['Proxy-Authorization'] = type3message;
    request(requestOptions, function(err, res) {
      console.log(res.statusCode); // still 407 here
    });
  });
};
I've tried comparing some packet captures, and on a working NTLM exchange (using curl) I can see this during the type 1 and type 3 requests:
[HTTP request 1/2]
[HTTP request 2/2]
My requests only show 1/1.
I'm thinking that in other NTLM implementations the clients may be sending both requests over the same connection. I'm not sure whether this is the reason mine isn't working, or whether it's just a different way of doing things.
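For what it's worth, here is a rough sketch of what I'm planning to try next. It is not a verified fix; the idea is just that NTLM authenticates the TCP connection rather than the individual request, so both the type 1 and type 3 requests would need to travel over the same keep-alive socket to the proxy. The './ntlm' require path and the agent settings are my own assumptions, and the proxy host and credentials are the same placeholders as above.

// Rough sketch only: pin both NTLM requests to a single keep-alive socket
// so the type 3 message goes over the same connection the proxy challenged.
var http = require('http');
var request = require('request');
var ntlm = require('./ntlm'); // same ntlm.js helper as above (path is an assumption)

var keepAliveAgent = new http.Agent({ keepAlive: true, maxSockets: 1 });

var creds = { username: 'USERNAME', password: 'Pa$$word',
              workstation: 'PC-NAME', domain: 'DOMAIN' };

var opts = {
  url: 'http://www.jsonip.co.uk',
  proxy: 'http://webproxy.domain.com:8080',
  agent: keepAliveAgent,
  headers: { 'Proxy-Connection': 'Keep-Alive' }
};

opts.headers['Proxy-Authorization'] = ntlm.createType1Message(creds);

request(opts, function (err, res) {
  if (err) return console.error(err);
  var type2 = ntlm.parseType2Message(res.headers['proxy-authenticate']);
  opts.headers['Proxy-Authorization'] = ntlm.createType3Message(type2, creds);

  // The second request should reuse the same socket via the shared agent
  request(opts, function (err, res) {
    if (err) return console.error(err);
    console.log(res.statusCode); // hoping for something other than 407
  });
});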
Thanks in advance.
I would like to POST two pieces of data in multipart/form-data format using Node-RED.
(One is text data, the other is voice data.)
I set up the function node and http request node as follows, but the POST does not seem to work.
I think I need to create a multipart body and assign it to msg.body, but I do not know how to build the multipart body for the voice data.
I do not know how to solve this, so I would appreciate it if someone could tell me.
function node
var request = global.get('requestModule');
var fs = global.get('fsModule');

msg.body = {
    'apikey' : "**********",
    'wav' : fs.createReadStream('/tmp/testtest.wav')
};

msg.headers = {};
msg.headers['Content-type'] = 'multipart/form-data';

return msg;
http request(Property)
method ⇒ POST
URL ⇒ https://xxxxyyyzzz/
SSL/TLS ⇒ No
Basic ⇒ No
Output ⇒ JSON
The http request Node-RED core node supports multipart/form-data POSTs out of the box.
Add a function node before your http request node with this function:
msg.headers = {};
msg.headers['Content-Type'] = 'multipart/form-data';
msg.headers['Accept'] = 'application/json';

msg.payload = {
    'apikey': msg.apiKey,
    'wav': {
        value: msg.payload.invoice.file,
        options: {
            filename: 'testtest.wav',
            contentType: 'audio/wav' // This is optional
        }
    }
};

return msg;
The http request node uses the Request Node.js library under the hood, and that in turn uses the form-data library for handling multipart, so all the options supported by those libraries work.
See the source code of the relevant part of the http request node that handles multipart.
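For the case in the question, where the wav file sits on disk rather than already in msg.payload, a function node along these lines should work. This is only a sketch: it assumes fs is exposed to function nodes as fsModule via functionGlobalContext, and it reuses the file path and apikey placeholder from the question.

// Sketch: build the multipart payload for the http request node from a file
// on disk. Assumes 'fs' is available as 'fsModule' in functionGlobalContext.
var fs = global.get('fsModule');

msg.headers = {
    'Content-Type': 'multipart/form-data',
    'Accept': 'application/json'
};

msg.payload = {
    'apikey': '**********',                          // text field
    'wav': {
        value: fs.readFileSync('/tmp/testtest.wav'), // Buffer with the voice data
        options: {
            filename: 'testtest.wav',
            contentType: 'audio/wav'
        }
    }
};

return msg;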
I'm experimenting with migrating an ASP.NET REST backend to Azure Functions. My (possibly naive) approach was to create a catch-all function that proxies HTTP requests via Node's http module, then slowly replace endpoints with native Azure Functions with more specific routes. I'm using the CLI, and created a Node function like so:
var http = require("http")

module.exports = function (context, req) {
    var body = new Buffer([])
    var headers = req.headers
    headers["host"] = "my.proxy.target"

    var proxy = http.request({
        hostname: "my.proxy.target",
        port: 80,
        method: req.method,
        path: req.originalUrl,
        headers: headers
    }, function (outboundRes) {
        console.log("Got response")
        context.res.status(outboundRes.statusCode)

        // Copy response headers across, except Set-Cookie, which arrives as an array
        for (header in outboundRes.headers) {
            console.log("Header", header)
            if (header != "set-cookie")
                context.res.setHeader(header, outboundRes.headers[header])
            else
                console.log(outboundRes.headers[header])
        }

        outboundRes.addListener("data", function (chunk) {
            body = Buffer.concat([body, chunk])
        })
        outboundRes.addListener("end", function () {
            console.log("End", context.res)
            context.res.raw(body)
        })
    })

    proxy.end(req.body)
}
This almost seems to work, but my backend sets several cookies using several Set-Cookie headers. Node hands these back as an array of cookie values, but it seems Azure Functions doesn't accept arrays as header values, or doesn't permit setting multiple headers with the same name, as HTTP allows for Set-Cookie.
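To illustrate the shape I mean, here is a standalone check outside of Azure Functions (it reuses the my.proxy.target placeholder from above; the example values in the comment are made up):

var http = require("http")

// Standalone check: Node returns repeated Set-Cookie response headers as a
// single array, one element per Set-Cookie header the server sent.
http.get({ hostname: "my.proxy.target", port: 80, path: "/" }, function (res) {
    console.log(Array.isArray(res.headers["set-cookie"])) // true when any cookies are set
    console.log(res.headers["set-cookie"])                // e.g. [ "a=1; Path=/", "b=2; Path=/" ]
    res.resume()
})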
Is this supported? I've googled and have checked out the TypeScript source for Response, but it doesn't appear to be.
If this isn't supported, what Azure platform services should I use to fail over 404s from one app to another, so I can slowly replace the monolith with Functions? Function proxies would work if I could use them as fallbacks, but that doesn't appear possible.
Thanks!
I use the request module to consume web services. The web services expect a Proxy-Authorization field in the header.
I send this value, but by the time the request reaches the destination server, the Proxy-Authorization header is gone.
Please help.
I got some help from my colleague; here is the workaround.
Just override proxyHeaderExclusiveList:
// request reserves the headers in its proxy-exclusive list for the proxy and
// strips them from the request sent to the destination; overriding concat
// keeps that list empty so Proxy-Authorization is left in place.
const proxyHeaderExclusiveList = [];
proxyHeaderExclusiveList.concat = function newConcat() {
    return this;
};

const options = {
    url: SOME_URL,
    headers: SOME_HEADERS,
    proxyHeaderExclusiveList: proxyHeaderExclusiveList
};
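Here is a fuller sketch of how it fits together. The URL, proxy address and header value below are placeholders rather than anything from my real setup; the point is only that the Proxy-Authorization header stays on the request that reaches the destination.

const request = require('request');

// Keep request from moving Proxy-Authorization into its proxy-only list
const proxyHeaderExclusiveList = [];
proxyHeaderExclusiveList.concat = function newConcat() {
    return this;
};

request({
    url: 'https://api.example.com/resource',       // placeholder
    proxy: 'http://webproxy.example.com:8080',     // placeholder
    headers: {
        'Proxy-Authorization': 'Bearer some-token' // placeholder value
    },
    proxyHeaderExclusiveList: proxyHeaderExclusiveList
}, (err, res, body) => {
    if (err) return console.error(err);
    console.log(res.statusCode);
});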
I am sending an HTTPS GET request to an external API.
First, I created the https agent as below:
import https from 'https';

const KeepAliveAgent = new https.Agent( {
    keepAlive: true
} );
then set the request options as below:
let options = {
    url: 'externalapiurl',
    method: "GET",
    qs: queryString,
    agent: KeepAliveAgent
};
I have just used sample strings for url and qs here; in the original request I am using the actual API URL and query string. Then I send the request as below:
console.time( "requestTime" );
request( options, ( err, response, body ) => {
    if ( err ) {
        logger.warn( err.message );
    }
    console.timeEnd( "requestTime" );
});
This works fine, but the response time I am printing above is much higher when I send the request from behind a proxy. Without the proxy it takes less than half a second, but with the proxy it takes around 3 seconds, so it seems keep-alive is not working behind the proxy. How can I make this work?
I tried the same request using the https-proxy-agent Node module, but the issue persists. I'd appreciate any help.
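To show what I am comparing, here is a stripped-down version of my timing test (the URL is the same placeholder as above, and the helper function and labels are just for this sketch). Checking the agent's freeSockets after each response is how I am trying to tell whether the connection is actually being kept alive and reused.

// Minimal timing/reuse check. If keep-alive works, freeSockets should list an
// entry for the host after the first response, and the second request should
// be noticeably faster.
const https = require('https');
const request = require('request');

const keepAliveAgent = new https.Agent({ keepAlive: true });

function timedGet(label, cb) {
    console.time(label);
    request({ url: 'https://externalapiurl', method: 'GET', agent: keepAliveAgent },
        (err, response, body) => {
            if (err) console.warn(err.message);
            console.timeEnd(label);
            console.log('free sockets:', Object.keys(keepAliveAgent.freeSockets));
            cb();
        });
}

timedGet('first request', () => timedGet('second request', () => {}));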
This post covers measuring the performance of HTTP/HTTPS proxies: https://blog.thousandeyes.com/measuring-performance-with-http-proxies/
The code you have given seems correct.
I suggest using Socket.IO in Node.js for the kind of implementation you are doing.
In a browser, when I send a GET request, the cookies are sent along with it. Now I want to simulate such a GET request from Node; how do I write the code?
Using the marvelous request library, cookies are enabled by default. You can send your own like so (taken from the GitHub page):
var j = request.jar()
var cookie = request.cookie('your_cookie_here')
j.add(cookie)
request({url: 'http://www.google.com', jar: j}, function () {
    request('http://images.google.com')
})
If you want to do it with the native http.request() method, you need to set the appropriate Cookie header (see an HTTP reference for what it should look like) in the headers member of the options argument; there are no specific methods in the native code for dealing with cookies. Refer to the source code in Mikeal's request library and/or the cookieParser code in Connect if you need concrete examples.
But Femi is almost certainly right: dealing with cookies is full of rather nitpicky details, and you're almost always going to be better off using code that's already been written and, more importantly, tested. If you try to reinvent this particular wheel, you're likely to come up with code that seems to work most of the time but occasionally and unpredictably fails mysteriously.
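For completeness, a minimal sketch of the native approach (the hostname and cookie string are placeholders of mine, not from any real service):

// Send a cookie with the native http module by setting the Cookie request
// header yourself. Note: Cookie is the request header; Set-Cookie is what
// the server returns in its response.
var http = require('http');

var options = {
    hostname: 'www.example.com', // placeholder
    path: '/',
    method: 'GET',
    headers: {
        'Cookie': 'session_id=abc123; theme=dark' // placeholder cookie
    }
};

http.request(options, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
        console.log(res.statusCode, res.headers['set-cookie']);
    });
}).end();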
const fs = require('fs');
const jwt = require('jsonwebtoken'); // presumably; jwt.sign matches its API
const request = require('request');

var jar = request.jar();

// jwtPayload and settings are defined elsewhere in the original snippet
const jwtSecret = fs.readFileSync(`${__dirname}/.ssh/id_rsa`, 'utf8');
const token = jwt.sign(jwtPayload, jwtSecret, settings);

// Attach the signed token as a cookie scoped to the target URL
jar.setCookie(`any-name=${token}`, 'http://localhost:12345/');

const options = {
    method: 'GET',
    url: 'http://localhost:12345',
    jar,
    json: true
};

request(options, handleResponse); // handleResponse is not shown in the original snippet
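In case it helps, a hypothetical handleResponse to make the snippet self-contained (the name comes from the snippet above; the body is just an example):

// Example callback for the request above
function handleResponse(err, res, body) {
    if (err) {
        return console.error('request failed:', err.message);
    }
    console.log(res.statusCode, body);
}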