nodejs gm content-length implementation hangs browser - node.js

I've written a simple image-manipulation service that uses node gm on an image from an HTTP response stream. If I use Node.js' default transfer-encoding: chunked, things work just fine. But as soon as I try to add the content-length implementation, Node.js hangs the response or I get content-length mismatch errors.
Here's the gist of the code in question (some variable declarations have been omitted for brevity):
var image = gm(response);
// gm getter used to get origin properties of image
image.identify({bufferStream: true}, function (error, value) {
  this.setFormat(imageFormat)
    .compress(compression)
    .resize(width, height);
  // instead of default transfer-encoding: chunked, calculate content-length
  this.toBuffer(function (err, buffer) {
    console.log(buffer.length);
    res.setHeader('Content-Length', buffer.length);
    gm(buffer).stream(function (stError, stdout, stderr) {
      stdout.pipe(res);
    });
  });
});
This spits out the desired image and a content length that looks right, but the browser hangs, suggesting a mismatch or something else wrong. I'm using node gm 1.9.0.
I've seen similar posts on nodejs gm content-length implementation, but I haven't seen anyone post this exact problem yet.
Thanks in advance.

I ended up changing my approach. Instead of using this.toBuffer(), I save the new file to disk using this.write(fileName, callback), then read it with fs.createReadStream(fileName) and pipe it to the response. Something like:
var filePath = './output/' + req.param('id') + '.' + imageFormat;
this.write(filePath, function (writeErr) {
  var stat = fs.statSync(filePath);
  res.writeHead(200, {
    'Content-Type': 'image/' + imageFormat,
    'Content-Length': stat.size
  });
  var readStream = fs.createReadStream(filePath);
  readStream.pipe(res);
  // async delete the file from filesystem
  ...
});
You end up with all of the headers you need, including the new content-length, to return to the client.
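For comparison, a simpler variant of the original toBuffer() approach would be to write the measured buffer directly instead of re-streaming it through gm. A minimal sketch (untested): re-running gm(buffer).stream() re-encodes the image, so its output length can differ from buffer.length, which would explain the mismatch.
this.toBuffer(function (err, buffer) {
  if (err) { res.statusCode = 500; return res.end(); }
  // buffer already holds the final bytes, so this Content-Length is
  // guaranteed to match the body that is actually sent
  res.setHeader('Content-Type', 'image/' + imageFormat);
  res.setHeader('Content-Length', buffer.length);
  res.end(buffer);
});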

Related

nodejs handling arraybuffers

Suppose I make a multipart, application/octet-stream request with responseType set to 'arraybuffer', receive the response in Node.js, and try to write it to a file. How can I handle this so that I don't corrupt the contents?
My current approach is something like this:
var req = restler.post(url, opts)
  .on('data', function (data) {
    console.log('receiving data...');
    console.log(data);
  })
  .on('complete', function (data) {
    var buff = new Buffer(data); // this is prolly incorrect, but I can't figure this out at all
    fs.writeFile(file_name, buff.toString('binary'), function (err) {
      console.log('done!');
    });
  });
Here I write the contents into file_name.
If I fetch, say, a Microsoft Word file, I only end up with a corrupt file. I'm also using the restler package for this.
According to the restler documentation, you can set decoding: 'buffer' in your opts and it will keep the binary data intact as a Buffer instead of the default utf8-encoded string. From there it's just a matter of passing the buffer directly to fs.writeFile() without calling buffer.toString().
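A minimal sketch of that approach (assuming restler's decoding option behaves as documented; url, opts, and file_name come from the question):
var restler = require('restler');
var fs = require('fs');

opts.decoding = 'buffer'; // ask restler for a raw Buffer instead of a utf8 string

restler.post(url, opts).on('complete', function (data) {
  // data is a Buffer here; write it verbatim, with no string conversion
  fs.writeFile(file_name, data, function (err) {
    if (err) console.log(err);
    else console.log('done!');
  });
});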

hitting a multipart url in nodejs

I have client code that uses the form-data module to hit a URL that returns a content-type of image/jpeg. Below is my code:
var FormData = require('form-data');
var fs = require('fs');
var form = new FormData();
//form.append('POLICE', "hello");
//form.append('PAYSLIP', fs.createReadStream("./Desert.jpg"));
console.log(form);
//https://fbcdn-profile-a.akamaihd.net/hprofile-ak-xfp1/v/t1.0-1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a4703494c6c1&oe=55ADC7C8&__gda__=1436921313_bf58cbf91270adcd7b29241838f7d01a
form.submit({
  protocol: 'https:',
  host: 'fbcdn-profile-a.akamaihd.net',
  path: '/hprofile-ak-xfp1/v/t1.0-1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a3494c6c1&oe=55ADCC8&__gda__=1436921313_bf58cbf91270adcd7b2924183',
  method: 'get'
}, function (err, res) {
  var data = "";
  res.on("data", function (chunks) {
    data += chunks;
  });
  res.on("end", function () {
    console.log(data);
    console.log("Response Headers - " + JSON.stringify(res.headers));
  });
});
I'm getting some chunks of data, and the response headers I received were:
{"last-modified":"Thu, 12 Feb 2015 09:49:26 GMT","content-type":"image/jpeg","timing-allow-origin":"*","access-control-allow-origin":"*","content-length":"1443","cache-control":"no-transform, max-age=1209600","expires":"Thu, 30 Apr 2015 07:05:31 GMT","date":"Thu, 16 Apr 2015 07:05:31 GMT","connection":"keep-alive"}
I am now stuck on how to process the response I received into a proper image. I tried base64 decoding, but it seemed to be the wrong approach. Any help will be much appreciated.
I expect that data, once the file has been completely downloaded, contains a Buffer.
If that is the case, you should write the buffer as is, without any decoding, to a file:
fs.writeFile('path/to/file.jpg', data, function onFinished (err) {
  // Handle possible error
})
See fs.writeFile() documentation - you will see that it accepts either a string or a buffer as data input.
Extra awesomeness by using streams
Since the res object is a readable stream, you can simply pipe the data directly to a file, without keeping it in memory. This has the added benefit that if you download a really large file, Node.js will not have to keep the whole file in memory (as it does now), but will write it to the filesystem continuously as it arrives.
form.submit({
  // ...
}, function (err, res) {
  // res is a readable stream, so let's pipe it to the filesystem
  var file = fs.createWriteStream('path/to/file.jpg')
  res.on('end', function writeDone () {
    // File is saved ('end' receives no arguments; listen for 'error' separately)
  })
  .pipe(file) // Send the incoming file to the filesystem
})
The chunks you got are the raw image. Do whatever you want with the image: save it to disk, let the user download it, whatever.
So if I understand your question correctly, you want to download a file from an HTTP endpoint and save it to your computer, right? If so, you should look into using the request module instead of form-data.
Here's a contrived example for downloading things using request:
var fs = require('fs');
var request = require('request');
request('http://www.example.com/picture.jpg')
  .pipe(fs.createWriteStream('picture.jpg'));
Where 'picture.jpg' is the location to save to disk. You can open it up using a normal file browser.

Response not ending and browser keeps loading - nodejs and graphicsmagick

I am new to nodejs. I am using graphicsmagick to resize the image before sending it to the browser.
My code looks like this (res is the response to be sent from function(req,res){...}) -
imageURLPromise
  .then(function (response) {
    var body = response.body;
    if (body) {
      console.log("Found From: " + response.request.uri.href);
      // set response headers here
      setResponseData(res, response);
      var buf = new Buffer(body, "binary");
      console.log(buf);
      // gm module
      gm(buf).resize(400, 300).toBuffer(function (err, buffer) {
        console.log("buffer here");
        res.end(buffer, "binary");
      });
    }
  }, function (error) {
    console.log(error);
  });
I get the image in the browser and I see the "buffer here" log, but the browser stays in the "loading" state.
I have tried using .stream() with gm and pipe the stdout to response but it has the same problem.
If I do away with gm and directly write body to the response like this
res.end(body, 'binary');
then it works correctly.
Can someone tell what I am doing wrong here?
I figured out the problem.
The problem was not with node or gm but with the HTTP response headers.
When gm returns a buffer and we write it to the HTTP response, Node sets the Transfer-Encoding header to "chunked". In that case, the Content-Length header must never also be set.
You can read more about it here
http://en.wikipedia.org/wiki/Chunked_transfer_encoding
Since I was setting both, the browser kept waiting for content even after the image had been sent.
The code is exactly the same as posted, except that in the setResponseData() function (which is basically used to set headers) I no longer set the content-length header.
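For illustration, here is a sketch of what the corrected header setup might look like. setResponseData() is not shown in the question, so this is a hypothetical version:
// Hypothetical setResponseData(): copy over the Content-Type but deliberately
// omit Content-Length so Node's chunked transfer encoding can do its job
function setResponseData(res, response) {
  res.setHeader('Content-Type', response.headers['content-type']);
  // No Content-Length here: the gm output is re-encoded, so its final size
  // differs from the original body's length anyway
}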

How to make a socket a stream? To connect https response to S3 after imagemagick

I am new to Node and programming in general, and I have been really struggling with this...
I want to take a https response, resize it with graphicsmagick and send it to my amazon S3 bucket.
It appears that the https res is an IncomingMessage object (I can't find any info about that) and the stdout from graphicsmagick is a Socket.
The weird thing is that I can use pipe and send both of these to a writeStream with a local path, and both res and stdout create a nice new resized image.
And I can even send res to the S3 (using knox) and it works.
But stdout doesn't want to go to the S3 :-/
Any help would be appreciated!
https.get(JSON.parse(queryResponse).data.url, function (res) {
  var headers = {
    'Content-Length': res.headers['content-length'],
    'Content-Type': res.headers['content-type']
  };
  graphicsmagick(res)
    .resize('50', '50')
    .stream(function (err, stdout, stderr) {
      req = S3Client.putStream(stdout, 'new_resized.jpg', headers, function (err, res) {
      });
      req.end();
    });
});
knox - for connecting to S3 – https://github.com/LearnBoost/knox
graphicsmagick - for image manipulation – https://github.com/aheckmann/gm
The problem was that Amazon needs to know the content length beforehand (thanks DarkGlass).
However, since my images are relatively small, I found buffering preferable to a multipart upload.
My solution:
https.get(JSON.parse(queryResponse).data.url, function (res) {
  graphicsmagick(res)
    .resize('50', '50')
    .stream(function (err, stdout, stderr) {
      var chunks = [];
      stdout.on('data', function (data) {
        chunks.push(data);
      });
      stdout.on('close', function () {
        var image = Buffer.concat(chunks);
        var req = S3Client.put("new-file-name", {
          'Content-Length': image.length,
          'Content-Type': res.headers['content-type']
        });
        req.on('response', function (s3res) { // prepare 'response' callback from S3
          if (200 == s3res.statusCode)
            console.log('it worked');
        });
        req.end(image); // send the content of the file and end the request
      });
    });
});
You appear to be setting Content-Length from the original image and not the resized one.
Maybe this helps:
get a stream's content-length
https://npmjs.org/package/knox-mpu
You shouldn't be calling req.end() there. By doing that, you close the stream to S3 before it has had time to send the image data. It will end itself automatically once all of the image data has been sent.
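A sketch of that suggestion applied to the question's code (untested; it assumes knox's putStream ends the request itself once the source stream finishes, and the Content-Length concern from the accepted answer still applies):
graphicsmagick(res)
  .resize('50', '50')
  .stream(function (err, stdout, stderr) {
    // No req.end() here: putStream pipes stdout into the S3 request and
    // ends it automatically when stdout emits 'end'
    S3Client.putStream(stdout, 'new_resized.jpg', headers, function (err, s3res) {
      // check s3res.statusCode here
    });
  });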

Pipe an MJPEG stream through a Node.js proxy

Using Motion on linux, every webcam is served up as a stream on its own port.
I now want to serve up those streams, all on the same port, using Node.js.
Edit: This solution now works. I needed to get the boundary string from the original mjpeg stream (which was "BoundaryString" in my Motion config)
app.get('/motion', function (req, res) {
  var boundary = "BoundaryString";
  var options = {
    // host to forward to
    host: '192.168.1.2',
    // port to forward to
    port: 8302,
    // path to forward to
    path: '/',
    // request method
    method: 'GET',
    // headers to send
    headers: req.headers
  };
  var creq = http.request(options, function (cres) {
    res.setHeader('Content-Type', 'multipart/x-mixed-replace;boundary="' + boundary + '"');
    res.setHeader('Connection', 'close');
    res.setHeader('Pragma', 'no-cache');
    res.setHeader('Cache-Control', 'no-cache, private');
    res.setHeader('Expires', 0);
    res.setHeader('Max-Age', 0);
    // wait for data
    cres.on('data', function (chunk) {
      res.write(chunk);
    });
    cres.on('close', function () {
      // closed, let's end client request as well
      res.writeHead(cres.statusCode);
      res.end();
    });
  }).on('error', function (e) {
    // we got an error, return 500 error to client and log error
    console.log(e.message);
    res.writeHead(500);
    res.end();
  });
  creq.end();
});
I would think this serves up the mjpeg stream at 192.168.1.2:8302 as /motion, but it does not.
Maybe because it never ends, and this proxy example wasn't really a streaming example?
Streaming over HTTP isn't the issue. I do that with Node regularly. I think the problem you're having is that you aren't sending a content type header to the client. You go right to writing data without sending any response headers, actually.
Be sure to send the right content type header back to the client making the request, before sending any actual content data.
You may need to handle multipart responses, if Node's HTTP client doesn't already do it for you.
Also, I recommend debugging this with Wireshark so you can see exactly what is being sent and received. That will help you narrow down problems like this quickly.
I should also note that some clients have a problem with chunked encoding, which is what Node will send if you don't specify a content length (which you can't here, since the stream is indefinite). If you need to disable chunked encoding, see my answer here: https://stackoverflow.com/a/11589937/362536. Basically, you just need to disable it with response.useChunkedEncodingByDefault = false;. Don't do this unless you need to, though! And make sure to send Connection: close in your headers along with it!
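A minimal sketch of that workaround for clients that can't handle chunked encoding, assuming the same proxy handler as above:
// Disable chunked transfer encoding and tell the client the connection
// will close when the stream ends
res.useChunkedEncodingByDefault = false;
res.setHeader('Connection', 'close');
res.setHeader('Content-Type', 'multipart/x-mixed-replace;boundary="' + boundary + '"');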
What you need to do is request the MJPEG stream just once, when it's needed, and then respond to each connected client with MJPEG or JPEG (if you need IE support).
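A rough sketch of that fan-out idea (assumptions: an Express-style app, the Motion stream from the question at 192.168.1.2:8302, and the "BoundaryString" boundary; newly joined clients may start mid-frame and resync at the next boundary):
var http = require('http');
var clients = [];    // currently connected browsers
var upstream = null; // single connection to the camera stream

function ensureUpstream() {
  if (upstream) return;
  upstream = http.get({ host: '192.168.1.2', port: 8302, path: '/' }, function (cres) {
    cres.on('data', function (chunk) {
      // fan each upstream chunk out to every connected client
      clients.forEach(function (c) { c.write(chunk); });
    });
    cres.on('close', function () { upstream = null; });
  });
}

app.get('/motion', function (req, res) {
  res.setHeader('Content-Type', 'multipart/x-mixed-replace;boundary="BoundaryString"');
  clients.push(res);
  req.on('close', function () { clients.splice(clients.indexOf(res), 1); });
  ensureUpstream();
});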
