Pipe an MJPEG stream through a Node.js proxy

Using Motion on Linux, every webcam is served up as a stream on its own port.
I now want to serve up those streams, all on the same port, using Node.js.
Edit: This solution now works. I needed to get the boundary string from the original MJPEG stream (which was "BoundaryString" in my Motion config).
app.get('/motion', function(req, res) {
    var boundary = "BoundaryString";

    var options = {
        // host to forward to
        host: '192.168.1.2',
        // port to forward to
        port: 8302,
        // path to forward to
        path: '/',
        // request method
        method: 'GET',
        // headers to send
        headers: req.headers
    };

    var creq = http.request(options, function(cres) {
        res.setHeader('Content-Type', 'multipart/x-mixed-replace;boundary="' + boundary + '"');
        res.setHeader('Connection', 'close');
        res.setHeader('Pragma', 'no-cache');
        res.setHeader('Cache-Control', 'no-cache, private');
        res.setHeader('Expires', 0);
        res.setHeader('Max-Age', 0);

        // wait for data
        cres.on('data', function(chunk) {
            res.write(chunk);
        });

        cres.on('close', function() {
            // closed, let's end client request as well
            res.writeHead(cres.statusCode);
            res.end();
        });
    }).on('error', function(e) {
        // we got an error, return 500 error to client and log error
        console.log(e.message);
        res.writeHead(500);
        res.end();
    });

    creq.end();
});
I would have thought this serves up the MJPEG stream at 192.168.1.2:8302 as /motion, but it does not.
Maybe that's because the stream never ends, and the proxy example I based this on wasn't really a streaming example?

Streaming over HTTP isn't the issue. I do that with Node regularly. I think the problem you're having is that you aren't sending a content type header to the client. You go right to writing data without sending any response headers, actually.
Be sure to send the right content type header back to the client making the request, before sending any actual content data.
You may need to handle multipart responses, if Node's HTTP client doesn't already do it for you.
Also, I recommend debugging this with Wireshark so you can see exactly what is being sent and received. That will help you narrow down problems like this quickly.
I should also note that some clients have a problem with chunked encoding, which is what Node will send if you don't specify a content length (which you can't, because the stream is indefinite). If you need to disable chunked encoding, see my answer here: https://stackoverflow.com/a/11589937/362536 Basically, you just need to disable it: response.useChunkedEncodingByDefault = false;. Don't do this unless you need to, though! And make sure to send a Connection: close header along with it!
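To tie that back to the question's code, here is a minimal sketch of the proxy route with the content type forwarded from the upstream response and the body piped straight through. Assumptions: the same Express app, http module, and Motion host as in the question; useChunkedEncodingByDefault is the undocumented property from the linked answer and is only needed if a client chokes on chunked encoding.
var http = require('http');

app.get('/motion', function (req, res) {
    var creq = http.request({ host: '192.168.1.2', port: 8302, path: '/' }, function (cres) {
        // forward the upstream multipart content type (it already contains the boundary)
        res.setHeader('Content-Type', cres.headers['content-type']);
        res.setHeader('Cache-Control', 'no-cache, private');

        // only if a client can't handle chunked encoding (see the linked answer):
        // res.useChunkedEncodingByDefault = false;
        // res.setHeader('Connection', 'close');

        // pipe the upstream body straight to the client
        cres.pipe(res);
    });

    creq.on('error', function (e) {
        console.log(e.message);
        res.writeHead(500);
        res.end();
    });

    creq.end();
});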

What you need to do is request the MJPEG stream only once, when it's needed, and then respond to each connected client with MJPEG (or JPEG, if you need IE support).
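A rough sketch of that fan-out idea, assuming the same Express app and a single Motion camera at 192.168.1.2:8302. The clients array is hypothetical and only for illustration; a viewer that joins mid-stream may receive a partial frame first and pick up again at the next boundary.
var http = require('http');
var clients = [];

// one upstream MJPEG request, opened once and shared by every viewer
http.get({ host: '192.168.1.2', port: 8302, path: '/' }, function (upstream) {
    upstream.on('data', function (chunk) {
        // rebroadcast every chunk to all connected clients
        clients.forEach(function (client) { client.write(chunk); });
    });
});

app.get('/motion', function (req, res) {
    res.writeHead(200, {
        'Content-Type': 'multipart/x-mixed-replace;boundary="BoundaryString"'
    });
    clients.push(res);
    // stop writing to clients that disconnect
    req.on('close', function () {
        clients.splice(clients.indexOf(res), 1);
    });
});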

Related

SSE events are not reaching client until res.end() is called

The same code used to work before, but I don't know why it is not working now. Please help.
My problem is that when I use SSE for real-time data sharing, the data object that should be sent with res.write(`data:${JSON.stringify(dataObject)}\n\n`) is not sent until res.end() is called, at which point all of the event stream data arrives at once.
response.writeHead(200, {
    Connection: "keep-alive",
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    "Access-Control-Allow-Origin": '*',
    'Content-Encoding': 'identity' // tried both with and without this
});

let syncher = setInterval(() => {
    if (response.finished) { // if the response has ended, clear the interval
        clearInterval(syncher);
        return;
    } else {
        let dataToSend = getEventData(user, event);
        if (!dataToSend) {
            response.write('data:{close:true}');
            clearInterval(syncher);
            return;
        }
        response.write(`data:${JSON.stringify(dataToSend)}\n\n`);
        response.flushHeaders(); // also tried with response.flush()
        if (dataToSend.close) {
            delEventData(user, event);
            response.end();
        }
    }
}, 500);
The above code is on the server side; it also has an on-close listener to close the connection.
const ev = new EventSource(conf.apiUrl + '/getStatus/' + (userData.id || ''));
let data = '';
ev.onmessage = eventData => {
    data = JSON.parse(eventData.data);
    if (!data) {
        setState('progress ' + data.completedSoFar);
        return;
    }
    if (!data.close) {
    } else {
        if (data.success) {
            console.log('Done Successfully');
            ev.close();
        }
    }
};
This is my client-side code.
I don't know why the event listener is not getting the data stream. While searching the internet about this issue, I only found that it occurs when compression middleware is used, but I don't use any compression middleware in my app. I am using Node.js v11.4.0. My guess is that when I make the EventSource request, Chrome adds gzip to the Accept-Encoding header by default and Node uses that to set the response encoding to gzip; I tried to delete and replace that header, but it didn't work. Is that what is causing this issue?
Here are my request and response headers for the EventSource request.
Sorry for my grammar if I made any mistakes.
Thanks for the help. Cheers!
Next.js is basically compressing your data to make it transmit faster. Unfortunately, this means you can't see the data until the buffered response is flushed (my guess is the behaviour changed because compression changed the content). You can disable compression entirely by including compress: false in your next.config.js.
I found here that including the following header sidesteps the compression for a specific endpoint:
res.setHeader("Cache-Control", "no-cache, no-transform");
Caution: This will increase bandwidth/resource usage! HTTP compression can reduce the size of your data by 70%.
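For reference, a minimal next.config.js sketch of the compress: false option mentioned above. It turns Next.js's built-in gzip compression off for the whole app, so the per-endpoint Cache-Control header is usually the gentler fix.
// next.config.js
module.exports = {
    compress: false // disable Next.js's built-in gzip compression entirely
};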
After a lot of debugging and research, the problem turned out to be webpack-dev-server, which compressed my responses. For more info, refer to https://github.com/facebook/create-react-app/issues/966
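If you control the webpack config yourself, the dev server's gzip compression can be switched off with its compress option; a minimal sketch follows (with create-react-app the config isn't exposed, so you would need one of the workarounds from the linked issue).
// webpack.config.js
module.exports = {
    // ...existing config...
    devServer: {
        compress: false // stop webpack-dev-server from gzipping (and buffering) responses
    }
};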

How to stream a large XML response from Express

I am trying to stream a large XML file from Express to a client, but nothing is sent until the file is done processing on the server and res.end() is called.
The XML file is built with xmlbuilder-js. It has a callback that receives document chunks, in which I am trying to send each chunk using res.write(chunk).
res.writeHead(200, {
    'Content-Type': 'text/xml',
    'Transfer-Encoding': 'chunked'
})

xmlToReturn = xmlBuilder.begin({
    writer: {
        pretty: true,
    }
}, function(chunk) {
    res.write(chunk)
}).dec('1.0', 'UTF-8', true)

...

res.end()
The callback works as expected, it shows the data chunks coming through.
I have tried:
- changing the content-type on the response to, for example, 'application/octet-stream'
- using res.flush() after calling res.write(), or doing that periodically
- experimenting with other headers
In all cases, if I can get the response to send, the client never receives the start of it until res.end() is called. What do I need to do so that Express starts delivering the content as it flows through the callback?
I've explored questions and posts like this one, which suggest my approach is correct but that I am doing something wrong, or that streaming is not working in Express, possibly due to other modules or middleware.
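One way to narrow this down (a sketch, not from the original thread): serve a trivial route with nothing but timed writes and no other middleware. If the client still sees nothing until res.end(), the buffering is coming from middleware or a proxy in front of Express rather than from xmlbuilder.
app.get('/stream-test', function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    var n = 0;
    var timer = setInterval(function () {
        res.write('chunk ' + n + '\n'); // should appear on the client roughly every 500 ms
        n++;
        if (n === 10) {
            clearInterval(timer);
            res.end();
        }
    }, 500);
});
Testing such a route with curl -N makes it easy to see whether chunks arrive as they are written or all at once.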

Node.js - Stream Binary Data Straight from Request to Remote server

I've been trying to stream binary data (PDF, images, other resources) directly from a request to a remote server but have had no luck so far. To be clear, I don't want to write the document to any filesystem. The client (browser) will make a request to my node process which will subsequently make a GET request to a remote server and directly stream that data back to the client.
var request = require('request');

app.get('/message/:id', function(req, res) {
    // db call for specific id, etc.

    var options = {
        url: 'https://example.com/document.pdf',
        encoding: null
    };

    // First try - unsuccessful
    request(options).pipe(res);

    // Second try - unsuccessful
    request(options, function (err, response, body) {
        var binaryData = body.toString('binary');
        res.header('content-type', 'application/pdf');
        res.send(binaryData);
    });
});
Putting both data and binaryData in a console.log shows that the proper data is there, but the PDF that is subsequently downloaded is corrupt. I can't figure out why.
Wow, never mind. It turns out Postman (the Chrome app) was hijacking the request and response somehow. The // First try example in my code excerpt works properly in the browser.

Node.js - Create a proxy, why is request.pipe needed?

Can someone explain this code to create a proxy server? Everything makes sense except the last block: request.pipe(proxy, ...). I don't get that, because when proxy is declared it makes a request and pipes its response to the client's response. What am I missing here? Why would we need to pipe the original request to the proxy, when the http.request method already makes the request contained in the options var?
var http = require('http');

function onRequest(request, response) {
    console.log('serve: ' + request.url);

    var options = {
        hostname: 'www.google.com',
        port: 80,
        path: request.url,
        method: 'GET'
    };

    var proxy = http.request(options, function (res) {
        res.pipe(response, {
            end: true
        });
    });

    request.pipe(proxy, {
        end: true
    });
}

http.createServer(onRequest).listen(8888);
What am I missing here? [...] the http.request method already makes the request contained in the options var.
http.request() doesn't actually send the request in its entirety immediately:
[...] With http.request() one must always call req.end() to signify that you're done with the request - even if there is no data being written to the request body.
The http.ClientRequest it creates is left open so that body content, such as JSON data, can be written and sent to the responding server:
var req = http.request(options);
req.write(JSON.stringify({
    // ...
}));
req.end();
.pipe() is just one option for this, when you have a readable stream, as it will .end() the client request by default.
Although, since GET requests rarely have a body that would need to be piped or written, you can typically use http.get() instead, which calls .end() itself:
Since most requests are GET requests without bodies, Node provides this convenience method. The only difference between this method and http.request() is that it sets the method to GET and calls req.end() automatically.
http.get(options, function (res) {
    res.pipe(response, {
        end: true
    });
});
Short answer: the event loop. I don't want to talk too far out of my ass, and this is where node.js gets both beautiful and complicated, but the request isn't strictly MADE on the line declaring proxy: it's added to the event loop. So when you connect the pipe, everything works as it should, piping from the incoming request > proxy > outgoing response. It's the magic / confusion of asynchronous code!

Node.js gm content-length implementation hangs browser

I've written a simple image manipulation service that uses node gm on an image from an HTTP response stream. If I use Node's default transfer-encoding: chunked, things work just fine. But as soon as I try to add the content-length implementation, Node hangs the response or I get content-length mismatch errors.
Here's the gist of the code in question (some variable definitions have been omitted for brevity):
var image = gm(response);

// gm getter used to get origin properties of image
image.identify({bufferStream: true}, function(error, value) {
    this.setFormat(imageFormat)
        .compress(compression)
        .resize(width, height);

    // instead of default transfer-encoding: chunked, calculate content-length
    this.toBuffer(function(err, buffer) {
        console.log(buffer.length);
        res.setHeader('Content-Length', buffer.length);
        gm(buffer).stream(function (stError, stdout, stderr) {
            stdout.pipe(res);
        });
    });
});
This will spit out the desired image and a content length that looks right, but the browser will hang, suggesting that there's a mismatch or something else wrong. I'm using node gm 1.9.0.
I've seen similar posts on nodejs gm content-length implementation, but I haven't seen anyone post this exact problem yet.
Thanks in advance.
I ended up changing my approach. Instead of using this.toBuffer(), I save the new file to disk using this.write(fileName, callback), then read it with fs.createReadStream(fileName) and pipe it to the response. Something like:
var filePath = './output/' + req.param('id') + '.' + imageFormat;

this.write(filePath, function (writeErr) {
    var stat = fs.statSync(filePath);
    res.writeHead(200, {
        'Content-Type': 'image/' + imageFormat,
        'Content-Length': stat.size
    });
    var readStream = fs.createReadStream(filePath);
    readStream.pipe(res);
    // async delete the file from filesystem
    ...
});
You end up getting all of the headers you need, including the new Content-Length, returned to the client.
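A small sketch of how the elided async delete might look, assuming the same fs, filePath, and res from the answer's code: wait for the read stream to finish, then unlink the temp file.
var fs = require('fs');

var readStream = fs.createReadStream(filePath);
readStream.pipe(res);

// once the file has been fully read and the stream closed, delete the temp file
readStream.on('close', function () {
    fs.unlink(filePath, function (unlinkErr) {
        if (unlinkErr) console.error('failed to delete ' + filePath, unlinkErr);
    });
});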
