I have a Node web server which listens on this route:
this.app.use('/register-receiver', (req, res) => {
    const receiverId = req.query.id;
    res.status(200).set({
        connection: 'keep-alive',
        'Cache-Control': 'no-cache',
        'Content-Type': 'application/json',
    });
    this.receivers[receiverId] = res;
});
It periodically sends a JSON payload to clients connected to /register-receiver with
this.receivers[receiverId].write(JSON.stringify({
    // some big json
}))
I have another node program which acts as a client that makes a connection to the server on startup.
const options = {
    agent: false,
    host: <my host>,
    defaultPort: <my port>,
    path: `/register-receiver?id=${activeConfig.receiver.id}`,
    headers: {
        connection: 'keep-alive',
        'Content-Type': 'application/json',
    }
};
http.get(options, res => {
    res.setEncoding('utf8');
    res.on('data', data => {
        try {
            const response = JSON.parse(data);
            // do stuff with response
        } catch (e) {
            console.log(`error ${e} with ${data}`);
        }
    });
    res.on('end', () => console.log(`connection finished`));
});
The server needs to periodically send JSON payloads to the client. The client should receive these JSONs and do something with them. However, the problem is that large JSON writes are chunked, so the client receives the JSON in pieces. This breaks JSON.parse(data), and now the client doesn't know how to process the server payloads. I can't rely on res.on('end') to detect the completion of a chunked write because this is a keep-alive request that should stay open forever.
I want to avoid designing my own protocol for combining JSON chunks because of its complexity. It's not as simple as concatenating the strings, since I could get interleaved JSON chunks if the server sends 2 large JSON payloads at the same time.
Is it possible to force the server to write the entire JSON as 1 chunk in the stream? How can I set up my client such that it establishes a "forever" connection with my server and listens for complete JSON payloads?
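For reference, the simplest framing usually reached for here is newline-delimited JSON; a minimal client-side sketch, assuming the server appended '\n' after every JSON.stringify (that delimiter is an assumption, not something the code above does):
// Sketch only: assumes the server frames every payload as
// res.write(JSON.stringify(payload) + '\n').
http.get(options, res => {
    res.setEncoding('utf8');
    let buffered = '';
    res.on('data', chunk => {
        buffered += chunk;
        let newlineIndex;
        // Every complete line is one full JSON payload; whatever follows the
        // last newline is still incomplete and stays in the buffer.
        while ((newlineIndex = buffered.indexOf('\n')) !== -1) {
            const line = buffered.slice(0, newlineIndex);
            buffered = buffered.slice(newlineIndex + 1);
            if (line.trim().length === 0) continue;
            const response = JSON.parse(line);
            // do stuff with response
        }
    });
});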
I am looking to open a HTTP/2 stream and use that stream to make multiple HTTP/2 POST requests. Each POST request will have its own body payload.
I currently have the code below, which works for requests that don't require a payload, but I'm not sure how to customize it for requests that do require a payload.
I've read RFC 7540 and nearly every related post on SO, but I'm still finding it difficult to write working HTTP/2 code that sends a payload body.
For example:
Is using stream.write the recommended way to send DATA frames, or should I be using a built-in function provided by http2?
Do I pass arguments in plaintext, and the http2 protocol takes care of the binary encoding, or do I encode it myself?
How should I modify the code below to send a payload body?
const http2 = require('http2')
const connection = http2.connect('https://www.example.com:443')
const stream = connection.request({
    ':authority': 'www.example.com',
    ':scheme': 'https',
    ':method': 'POST',
    ':path': '/custom/path',
}, { endStream: false })
stream.setEncoding('utf8')
stream.on('response', (headers) => {
    console.log('RESPONSE', headers)
    stream.on('data', (data) => console.log('DATA', data))
    stream.on('end', () => console.log('END'))
})
stream.write(Buffer.from('POST-request-payload-body-here?'))
The first thing you need to do is convert the body data into a buffer:
var buffer = Buffer.from(JSON.stringify(body));
Then you need to update the connection.request headers with Content-Type and Content-Length keys. Note that Content-Length is the length of the buffer:
const stream = connection.request({
    ':authority': 'www.example.com',
    ':scheme': 'https',
    ':method': 'POST',
    ':path': '/custom/path',
    'Content-Type': 'application/json',
    'Content-Length': buffer.length
}, { endStream: false })
Finally, send the request body by converting the body data into a string and ending the stream:
stream.end(JSON.stringify(body));
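Putting those steps together, a minimal end-to-end sketch (the host, path, and body object here are placeholders, not values from the original question):
const http2 = require('http2');

const body = { hello: 'world' }; // placeholder payload
const buffer = Buffer.from(JSON.stringify(body));

const connection = http2.connect('https://www.example.com:443');
const stream = connection.request({
    ':method': 'POST',
    ':path': '/custom/path',
    'content-type': 'application/json',
    'content-length': buffer.length
}, { endStream: false });

stream.setEncoding('utf8');
stream.on('response', (headers) => console.log('RESPONSE', headers));
stream.on('data', (chunk) => console.log('DATA', chunk));
stream.on('end', () => connection.close());

// Write the payload and close our side of the stream.
stream.end(buffer);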
This works in my project:
const body = { /* any */ };
const req = client.request({
    ':path': '/',
    ':method': 'POST',
    'Content-Type': 'application/json',
});
req.write(JSON.stringify(body), 'utf8'); // set body
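For completeness (an addition, not part of the original answer), the stream still needs to be ended and the response consumed; something along these lines, reusing req and client from the snippet above:
req.setEncoding('utf8');
let responseData = '';
req.on('response', (headers) => console.log('STATUS', headers[':status']));
req.on('data', (chunk) => { responseData += chunk; });
req.on('end', () => {
    console.log('BODY', responseData);
    client.close(); // close the underlying HTTP/2 session
});
req.end(); // signal that the request body is complete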
Stack Overflow community, greetings. I'm trying to pass the response of a new request back on the original request object using the Node HTTP module, for a basic autocomplete search app on my website (i.e. using Node as a proxy that transforms and redirects requests within the server).
The flow basically is:
Client Browser - Node - ElasticSearch - Node - Client Browser
I've started with:
1. Listen to requests with http.createServer(function (req, res)).
2. Get the body from the req object and use it in a new request with http.request(options, function (newReqResponse)).
3. Get the body from that newReqResponse object and send it back to the client on the res object.
The problem is that the content of newReqResponse is always outdated (it trails behind the last typed character), i.e.:
If I type "te", the content of newReqResponse corresponds to that if I had typed only "t".
If I type "test", it corresponds to that if I had typed "tes".
And so on.
I've tried to solve it using Node.js streams and using the file system module to write and read files sync and async, but the result is the same. Here's a sample of the whole picture in code:
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    var reqBody = '';
    var newReqResponseBody = "";
    req.on('data', function (chunk) {
        reqBody += chunk;
        fs.writeFile('reqbody.json', reqBody, function (err) {
            if (err) {
                throw err;
            }
        });
        var options = {
            hostname: '127.0.0.1',
            port: 9500,
            method: 'GET',
            path: '/_search',
            headers: {
                host: 'es',
                'content-length': Buffer.byteLength(reqBody),
                'content-type': 'application/json',
                accept: 'application/json'
            },
        };
        var newReq = http.request(options, function (newReqResponse) {
            newReqResponse.setEncoding("UTF-8");
            newReqResponse.on('data', function (ch) {
                newReqResponseBody += ch;
            });
            newReqResponse.on("end", function () {
                fs.writeFile("newReqResponseBody.json", newReqResponseBody, function (err) {
                    if (err) {
                        throw err;
                    }
                });
            });
        });
        newReq.on("error", function (err) {
            console.log(`problem with request: ${err.message}`);
        });
        newReq.write(reqBody);
        newReq.end();
    });
    req.on('end', function () {
        var responseBody = fs.readFileSync('newReqResponseBody.json', 'utf8');
        console.log(responseBody);
        res.end(responseBody);
    });
}).listen(3000, '127.0.0.1');
Is there a workaround to work with requests and responses within the http server? If there isn't, I'll be very grateful if you give me any directions on how to solve this.
Since the planned use for Node is rather basic, I prefer to stick with core modules rather than having to get new ones from npm, unless it's necessary.
Thanks in advance.
EDIT:
All I had to do was to call res.end(responseBody) within the newReqResponse.on("end") callback, which is totally counterintuitive for me, but... it works.
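In other words, the end handler from the snippet above becomes something like this (a sketch of the described fix, not the exact final code):
newReqResponse.on('end', function () {
    // The proxied response is complete only here, so this is the first
    // point at which it is safe to answer the original client request.
    res.end(newReqResponseBody);
});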
Glad you solved your own problem. However, I see room for improvement (not sure if you're new), especially if you're transferring data, which you can do with streams.
You can see that I didn't calculate the content length; you're asked not to, and it would be ignored (for this specific case) according to the HTTP specification, because streams pass data in chunks with the 'Transfer-Encoding': 'chunked' header.
const fs = require('fs');
const http = require('http');

const options = {
    hostname: '127.0.0.1',
    port: 9500,
    method: 'GET',
    path: '/_search',
    headers: {
        'Content-Type': 'application/json'
    }
};

http.createServer((req, res) => {
    req.pipe(fs.createWriteStream('reqBody.json'));
    let request = http.request(options, (newRes) => {
        newRes.pipe(res);
    });
    fs.createReadStream('reqBody.json').pipe(request);
    res.setHeader('Content-Type', 'application/json');
}).listen(3000, '127.0.0.1');
You can shorten this snippet more if you don't want your data saved in the future and only want to pipe the req stream to request.
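That shorter variant might look roughly like this (a sketch, reusing the options object from the snippet above):
http.createServer((req, res) => {
    res.setHeader('Content-Type', 'application/json');
    // Pipe the incoming body straight into the outgoing request, and the
    // proxied response straight back to the client, with no temp file.
    req.pipe(http.request(options, (newRes) => newRes.pipe(res)));
}).listen(3000, '127.0.0.1');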
I am trying to act as a proxy between a client and an IP Camera using a NodeJS server. When I request a real-time stream from the camera, it responds with
HTTP/1.0 200 OK
Content-Type: Application/octet-stream
followed by a continuous stream of data. If I open the camera stream in Chrome, it initiates a never-ending download, and curling it also produces a continuous response.
Node appears to be buffering the response from the camera and parsing it through its HTTP parser each time. This works fine the first time as it has the correct headers but upon the second buffer of data it errors with
HPE_INVALID_HEADER_TOKEN
Can someone please help explain why this is happening? It's a continuous stream of data, why is it trying to parse the HTTP headers on the second buffer? I am not sure whether there is an option I am missing or my camera is not following the HTTP specification properly.
Edit: Example Code
const options = {
    family: 4,
    headers: {
        Authorization: 'Basic ' + base64EncodedAuth,
    },
    host: '192.168.1.131',
    method: 'GET',
    path: '/cgi-bin/realmonitor.cgi?action=getStream&channel=1&subtype=0',
    port: 80,
    protocol: 'http:',
};

const req = http.request(options, (res) => {
    console.log(`STATUS: ${res.statusCode}`);
    console.log(`HEADERS: ${JSON.stringify(res.headers)}`);
    res.on('data', (chunk) => {
        console.log(`BODY: ${chunk}`);
    });
    res.on('end', () => {
        console.log('No more data in response.');
    });
});

req.on('error', (e) => {
    console.log(`problem with request: ${e.message}`);
});

req.end();
The only callback that is hit is the 'error' one.
I further examined the curl log from the camera and noticed that everything was being marked as:
<= Recv header
It is never sending the separate CRLF required by the HTTP specification to signal that all the headers have been sent. That is why the parser was trying to parse it as a header and, quite rightly, throwing an error.
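A possible workaround (my assumption, not something the original answer proposed) is to bypass Node's HTTP parser and read the camera's non-compliant response from a raw TCP socket:
const net = require('net');

// Sketch only: base64EncodedAuth and the host/path are taken from the
// question's code; the camera's raw bytes arrive unparsed.
const socket = net.connect(80, '192.168.1.131', () => {
    socket.write(
        'GET /cgi-bin/realmonitor.cgi?action=getStream&channel=1&subtype=0 HTTP/1.0\r\n' +
        'Authorization: Basic ' + base64EncodedAuth + '\r\n' +
        '\r\n'
    );
});
socket.on('data', (chunk) => {
    // Includes the camera's header lines; framing the stream is up to us.
    console.log('received', chunk.length, 'bytes');
});
socket.on('error', (err) => console.log('socket error: ' + err.message));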
I have been working with node.js to set up a proxy server that will handle incoming client requests and verify that they have the proper certificates to get connected to the server.
What I want to do is add the client's certificate to the request headers to craft a username that I will pass on to the server.
function (req, res) {
    // Here is the client certificate in a variable
    var clientCertificate = req.socket.getPeerCertificate();
    // Proxy a web request
    return this.handle_proxy('web', req, res);
};
What I want to be able to do is this: req.setHeader('foo', 'foo')
I know that proxy.on('proxyReq') exists, but the way the code is set up, I need to be able to use the req parameter.
Is there a way to do this?
Please let me know if I need to clarify my question.
You can craft your own http request with the headers provided in the original request plus any extra headers that you'd like by using http.request. Just receive the original request, copy the headers into the new request headers, add the new headers and send the new request.
var data = [];
var options = {
    hostname: 'www.google.com',
    port: 80,
    path: '/upload',
    method: 'POST',
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': Buffer.byteLength(postData)
    }
};

var req = http.request(options, function (res) {
    res.setEncoding('utf8');
    res.on('data', function (chunk) {
        data.push(chunk);
    });
    res.on('end', function () {
        console.log(data.join(""));
        // send the response to your original request
    });
});

req.on('error', function (e) {
    console.log('problem with request: ' + e.message);
});

// Set headers here, e.g. req.setHeader('Content-Type', originalReq.headers['content-type']);
// write data to request body
req.write(/*original request data goes here*/);
req.end();
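To make the "copy the headers" part concrete for the certificate question above, the incoming request's headers are available on originalReq.headers, so a hypothetical options object (the header name is invented for illustration) could look like:
var options = {
    hostname: 'www.google.com',
    port: 80,
    path: '/upload',
    method: 'POST',
    // Hypothetical: forward the original headers plus a crafted one built
    // from the peer certificate (originalReq and clientCertificate are
    // assumed to come from the question's handler).
    headers: Object.assign({}, originalReq.headers, {
        'x-client-cert-cn': clientCertificate.subject.CN
    })
};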
I'm trying to send a SSE text/event-stream response from an express.js end point. My route handler looks like:
function openSSE(req, res) {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream; charset=UTF-8',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive',
        'Transfer-Encoding': 'chunked'
    });
    // support the polyfill
    if (req.headers['x-requested-with'] == 'XMLHttpRequest') {
        res.xhr = null;
    }
    res.write(':' + Array(2049).join('\t') + '\n'); // 2kb padding for IE
    res.write('id: ' + lastID + '\n');
    res.write('retry: 2000\n');
    res.write('data: cool connection\n\n');
    console.log("connection added");
    connections.push(res);
}
Later, I call:
function sendSSE(res, message) {
    res.write(message);
    if (res.hasOwnProperty('xhr')) {
        clearTimeout(res.xhr);
        res.xhr = setTimeout(function () {
            res.end();
            removeConnection(res);
        }, 250);
    }
}
My browser makes and holds the request:
None of the response gets pushed to the browser and none of my events are fired. If I kill the express.js server, the response is suddenly drained and every event hits the browser at once.
If I update my code to add res.end() after the res.write(message) line, it flushes the stream correctly; however, it then falls back to event polling and doesn't stream the response.
I've tried adding padding to the head of the response like
res.write(':' + Array(2049).join('\t') + '\n');
as I've seen in other SO posts that this can trigger a browser to drain the response.
I suspect this is an issue with express.js, because I had previously been using this code with Node's native http server and it was working correctly. So I'm wondering if there is some way to bypass Express's wrapping of the response object.
This is the code I have working in my project.
Server side:
router.get('/listen', function (req, res) {
    res.header('transfer-encoding', 'chunked');
    res.set('Content-Type', 'text/json');
    var callback = function (data) {
        console.log('data');
        res.write(JSON.stringify(data));
    };
    // Event listener which calls callback.
    dbdriver.listener.on(name, callback);
    res.socket.on('end', function () {
        // Removes the listener on socket end
        dbdriver.listener.removeListener(name, callback);
    });
});
Client side:
xhr = new XMLHttpRequest();
xhr.open("GET", '/listen', true);
xhr.onprogress = function () {
    // responseText contains ALL the data received
    console.log("PROGRESS:", xhr.responseText);
};
xhr.send();
I was struggling with this one too, so after some browsing and reading I solved this issue by setting an extra header on the response object:
res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    "Content-Encoding": "none"
});
Long story short, when the EventSource is negotiating with the server, it sends an Accept-Encoding: gzip, deflate, br header, which makes Express respond with a Content-Encoding: gzip header. So there are two solutions to this issue: the first is to add a Content-Encoding: none header to the response, and the second is to actually (gzip) compress your response.
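If you prefer the second option, the usual pattern (a sketch assuming the third-party compression middleware, which adds res.flush()) looks roughly like this:
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());

app.get('/events', (req, res) => {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache'
    });
    res.write('data: cool connection\n\n');
    res.flush(); // provided by the compression middleware; pushes the gzipped chunk out
});

app.listen(3000);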