Response not ending and browser keeps loading - Node.js and GraphicsMagick

I am new to Node.js. I am using GraphicsMagick (the gm module) to resize an image before sending it to the browser.
My code looks like this (res is the response to be sent from function(req, res) {...}):
imageURLPromise
    .then(function (response) {
        var body = response.body;
        if (body) {
            console.log("Found From: " + response.request.uri.href);
            // set response headers here
            setResponseData(res, response);
            var buf = new Buffer(body, "binary");
            console.log(buf);
            // gm module
            gm(buf).resize(400, 300).toBuffer(function (err, buffer) {
                console.log("buffer here");
                res.end(buffer, "binary");
            });
        }
    }, function (error) {
        console.log(error);
    });
I get the image in the browser and I see the "buffer here" log, but the browser stays in the "loading" state.
I have tried using .stream() with gm and piping the stdout to the response, but it has the same problem.
If I do away with gm and write body directly to the response, like this:
res.end(body, 'binary');
then it works correctly.
Can someone tell me what I am doing wrong here?

I figured out the problem.
The problem was not with node or gm but with the HTTP response headers.
When gm returns a buffer and we write it to the HTTP response, Node sets the Transfer-Encoding header to "chunked". In that case the Content-Length header must never be set.
You can read more about it here:
http://en.wikipedia.org/wiki/Chunked_transfer_encoding
Since I was setting both, the browser kept waiting for content even after the image had been sent.
The code is exactly the same as posted, except that the setResponseData() function (which basically sets the headers) no longer sets the Content-Length header.
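For reference, here is a minimal sketch of what a header-copying helper like setResponseData() might look like with that fix applied. The function name comes from the question; which headers it copies (and that the upstream response exposes a headers object) is an assumption:

function setResponseData(res, upstreamResponse) {
    // Copy only headers that are still valid for the resized image.
    // Content-Length is deliberately NOT set: the resized buffer has a different
    // length, so Node falls back to Transfer-Encoding: chunked on res.end(buffer).
    res.statusCode = 200;
    res.setHeader("Content-Type", upstreamResponse.headers["content-type"]);
    if (upstreamResponse.headers["cache-control"]) {
        res.setHeader("Cache-Control", upstreamResponse.headers["cache-control"]);
    }
}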

Related

Node: Express: How to handle application/octet-stream;charset=;UTF-8 response?

I have a Node/Express application. In it, I'm trying to call an API which responds with a raw xlsx object with
'Content-Type' : 'application/octet-stream;charset=;UTF-8'
Here is how I'm calling the API:
var unirest = require("unirest");

var reqClient = unirest("POST", "https://api.application.com/getExcel");
reqClient.headers({
    "Authorization": "Bearer " + req.session.passport.user.token,
    "content-type": req.headers['content-type']
});
reqClient.type("json");
reqClient.send(JSON.stringify(requestbody));
reqClient.end(function (res) {
    if (res.error) throw new Error(res.error);
    console.log(res.body);
});
Now there are 2 things I'm actually trying to do with this data.
First, write it into an Excel file. Below is the code for how I'm trying it:
let data = res.body; // res is the response coming from the API
let buf = Buffer.from(data);
let excelfile = fs.createWriteStream("result.xlsx");
excelfile.write(buf);
excelfile.end();
Second, send it to the UI, where the Excel file will be created. Below is my code for that:
let data = res.body; // res is the response coming from the API
let buf = Buffer.from(data);
response.write(buf); // response is the response to the request from the UI
response.end();
In both cases the resulting file is corrupted.
But the API response itself is perfect, because when the UI consumes it directly, the xlsx file is generated properly.
When dealing with binary data, you have to set the encoding to null:
reqClient.encoding(null);
reqClient.end(function (res) {
    if (res.error) {
        return response.status(500).send('error');
    }
    // res.body is now a Buffer
    response.write(res.body);
    response.end();
});
Otherwise, the data is converted to a UTF-8 string, and you can't convert from UTF-8 back to binary and get the same bytes back, which is what you were doing with:
Buffer.from(res.body)
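For illustration, here is a quick standalone demonstration of why that round trip is lossy (my own example, not part of the original code):

// Invalid UTF-8 byte sequences are replaced with U+FFFD during decoding,
// so the bytes that come back are not the bytes that went in.
const original = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // arbitrary binary bytes
const roundTripped = Buffer.from(original.toString('utf8'));
console.log(original);     // <Buffer 89 50 4e 47>
console.log(roundTripped); // a longer, different buffer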
The recommended approach is to use streams directly. I don't see a simple way of doing that with unirest, so I recommend using request or got, which let you .pipe the response directly to a file or to the Express res.
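As an illustration, here is a rough sketch of that streaming approach using the request module. The URL, token, requestbody, and response all come from the question; the exact options are an assumption rather than a drop-in fix:

const fs = require("fs");
const request = require("request");

request
    .post({
        url: "https://api.application.com/getExcel",
        headers: { Authorization: "Bearer " + req.session.passport.user.token },
        json: requestbody // serializes the body and sets Content-Type: application/json
    })
    // Stream the binary response straight to disk...
    .pipe(fs.createWriteStream("result.xlsx"));
    // ...or straight to the UI instead: .pipe(response)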

NodeJS http.request fails when header is too large

My code is as follows
var http = require('http');

var host = ...;
var postData = {
    // some fun stuff
};
var postOptions = {
    host: host,
    path: '/api/dostuff',
    method: 'POST',
    headers: {
        AppID: "some stuff",
        Authorization: "OAuth token",
        "Content-Type": "application/json"
    }
};
var req = http.request(postOptions, function (res) {
    var data = '';
    res.on('data', function (chunk) {
        data += chunk;
    });
    res.on('end', function () {
        // sanitize data stuff here
        console.log("DATA HERE: " + data);
        return data;
    });
});
req.write(JSON.stringify(postData));
req.end();
It's a basic HTTP POST to a C# server. The important stuff is in the headers. I send the app ID (which is ~50 characters) and the OAuth token (which can be thousands of characters). Right now, the server isn't set up to do anything with the Authorization header. It doesn't even check whether it's there.
My problem is that when I populate the Authorization header (or any header) with a few random characters as a test, the POST succeeds. When I try it again with a full, valid Authorization token (which, to reiterate, is very long), it fails. No matter which part of the headers I fill, once they get too large the request returns an error. The error I receive is "Processing of the HTTP request resulted in an exception. Please see the HTTP response returned by the 'Response' property of this exception for details". I was fairly certain this was a server issue, but when I ran the exact same body and headers in Postman, I got a valid response.
Does anyone have any idea what is causing this?
There's a compiled-in constant that limits Node HTTP headers to 80 KB. Are you running into that? I'd recommend checking how big the header is with your OAuth token. It shouldn't exceed 80 KB, though, and FWIW, even a kilobyte is huge for an OAuth token... But regardless, try dumping the size of the headers (in bytes).
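A rough way to dump that size from the question's postOptions (a sketch; it ignores the request line and framing overhead, but it's close enough to see whether you're anywhere near the limit):

// Approximate the serialized size of the outgoing request headers, in bytes.
var headerBytes = Object.keys(postOptions.headers).reduce(function (total, name) {
    return total + Buffer.byteLength(name + ": " + postOptions.headers[name] + "\r\n");
}, 0);
console.log("Request header size: ~" + headerBytes + " bytes");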

Fabric.js loadFromJSON sometimes fails in Node.js if string contains images

I have a problem with PNG image generation on the server side, using Fabric.js + Node.js. I am surprised that I can't find anyone with a similar problem in the forums. I am in total despair; it puts the use of Fabric.js in our project at risk.
PNG image generation in the Fabric.js Node.js service fails on an irregular basis, and I cannot determine why it sometimes works and sometimes does not.
I need to generate the PNG on the server side. I've developed a small Node.js web service based on samples here and here.
I've also developed a custom Fabric.js image class, "RemoteImage", based on Kangax's sample here.
To minimize the JSON string size, I store a dataless JSON in my database, and the images are supposed to be loaded via the link provided in the "src" attribute of the Fabric.js Image element. As a result, I need to load the following JSON, which contains 3 images, into the canvas:
{"objects":[{"type":"remote-image","originX":"left","originY":"top","left":44,"top":29,"width":976,"height":544,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.5,"scaleY":0.5,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"222c0a8b-46ac-4c01-9c5c-79753937bc24","layerName":"productCanvas","itemName":"mainCanvas","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/222c0a8b-46ac-4c01-9c5c-79753937bc24","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/222c0a8b-46ac-4c01-9c5c-79753937bc24","lockUniScaling":true},
{"type":"remote-image","originX":"left","originY":"top","left":382.5,"top":152.25,"width":292,"height":291,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.43,"scaleY":0.43,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"8d97050e-eae8-4e95-b50b-f934f0df2d4c","itemName":"BestDeal.png","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/8d97050e-eae8-4e95-b50b-f934f0df2d4c","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/8d97050e-eae8-4e95-b50b-f934f0df2d4c","lockUniScaling":true},
{"type":"remote-image","originX":"left","originY":"top","left":38,"top":38.5,"width":678,"height":370,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.21,"scaleY":0.21,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"42dc0e49-e45f-4aa7-80cf-72d362deebb7","itemName":"simple_car.png","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/42dc0e49-e45f-4aa7-80cf-72d362deebb7","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/42dc0e49-e45f-4aa7-80cf-72d362deebb7","lockUniScaling":true}],"background":""}
On the Node.js server side I use the following code. I transfer the JSON string base64-encoded to avoid some special-character problems:
var fabric = require('fabric').fabric;

function generatePNG(response, postData) {
    var canvas = fabric.createCanvasForNode(1500, 800);
    var decodedData = new Buffer(postData, 'base64').toString('utf8');
    response.writeHead(200, "OK", { 'Content-Type': 'image/png' });
    console.log("decodedData data: " + JSON.stringify(decodedData));
    console.log("prepare to load");
    canvas.loadFromJSON(decodedData, function () {
        console.log("loaded");
        canvas.renderAll();
        console.log("rendered");
        var stream = canvas.createPNGStream();
        stream.on('data', function (chunk) {
            response.write(chunk);
        });
        stream.on('end', function () {
            response.end();
        });
    });
}
In the console I see the message "prepare to load" appear, but the message "loaded" does not. I am not an expert in Node.js, and this is the only way I can tell that an error happens during the loadFromJSON call. But I do not understand where the problem is.
I am using fabric v1.5.0 and node-canvas v1.1.6 on the server side.
The Node.js + Fabric.js service is running on a Windows 8 machine, and I am making the request from a .NET MVC application using a POST request.
Remark: maybe I should omit my comment about base64 encoding, as it is confusing. I also tried with a plain JSON string, with the same result.
If the images referenced in the JSON are on the Node.js server, try changing the file path to the directory path on the server as opposed to a web URL.
I'm not sure I fully understand how you are using the base64 image, but there are some character corrections required for base64 images. I don't recall the specifics and don't have the code where I do this handy, but a Google search should set you in the right direction.
I hope those ideas help.
It turned out that the problem was related to the way the fabric.util.loadImage method works. For external images, loadImage makes an HTTP request assuming that no error can happen. The method used for requesting external images simply logs an error and ends, instead of returning the error through the callback back to loadImage. At that point the image-loading routine falls apart in an erroneous state and without any feedback; it just terminates, crashing the whole Node.js process.
It took me 3 days to finally find out that it was actually my image-supplying web service that was responding with status code 500, making the Node.js request fail. Using the image-supplying web service through the browser worked correctly, so at first I did not consider that the error was related specifically to the request.
As a result I rewrote the fromObject method of my custom Fabric.js object. Now it works in a safer fashion, and in case of an error I get more feedback. Here is the implementation of my fromObject method. For the HTTP request I use the "request" module.
fabric.RemoteImage.fromObject = function (object, callback) {
    var requestUrl = object.remoteSrc;
    request({
        url: object.remoteSrc,
        encoding: null // keep the body as a raw Buffer
    },
    function (error, response, body) {
        if (error || response.statusCode !== 200) {
            var errorMessage = "Error retrieving image " + requestUrl;
            if (response) {
                errorMessage += "\nResponse for a new image returned status code " + response.statusCode;
            }
            if (error) {
                errorMessage += " " + error.name + " with message: \n" + error.message;
                console.log(error.stack);
            }
            console.log(errorMessage);
            callback && callback(null, new Error(errorMessage));
        } else {
            var img = new Image();
            var buff = new Buffer(body, 'binary');
            img.src = buff;
            var fabrImg = new fabric.RemoteImage(img, object);
            callback && callback(fabrImg);
        }
    });
};

node.js: browser image caching with correct headers

I'm developing a web application that manages a large number of images, storing and resizing them.
The request for an image looks something like:
domain:port/image_id/size
The server takes the image_id and, if an image of that size does not exist yet, creates it and stores it on the filesystem.
So everything is OK and the server is running, but I need to cache those images in the browser for at least one day to reduce server bandwidth consumption.
I did several tests but nothing seems to work.
Here is the code I use to build the response headers:
response.writeHead(304, {
    "Pragma": "public",
    "Cache-Control": "max-age=86400",
    "Expires": new Date(Date.now() + 86400000).toUTCString(),
    "Content-Type": contentType
});
response.write(data);
response.end();
I also tried with response status 200.
contentType is always a MIME type like "image/jpg" or "image/png", and data is the byte buffer of the image.
Any advice?
Thanks a lot.
live long and prosper,
d.
I did a lot of tests and came up with a solution that seems to handle this caching problem pretty well.
Basically, I take the request and check for the request header named "if-modified-since".
If I find it and its value (a date) equals the modified date of the file, the response is a 304 status with no content.
If I don't find it, or its value differs from the modified date of the file, I send the complete response with status 200 and the caching headers for further access by the browser.
Here is the complete code of the working test I did.
By "working" I mean that the first request gets the file from the server, while subsequent requests get a 304 response with no content and the browser loads the image from its local cache.
var http = require("http");
var url = require("url");
var fs = require('fs');
function onRequest(request, response) {
var pathName = url.parse(request.url).pathname;
if (pathName!="/favicon.ico") {
responseAction(pathName, request, response);
} else {
response.end();
}
}
function responseAction(pathName, request, response) {
console.log(pathName);
//Get the image from filesystem
var img = fs.readFileSync("/var/www/radar.jpg");
//Get some info about the file
var stats = fs.statSync("/var/www/radar.jpg");
var mtime = stats.mtime;
var size = stats.size;
//Get the if-modified-since header from the request
var reqModDate = request.headers["if-modified-since"];
//check if if-modified-since header is the same as the mtime of the file
if (reqModDate!=null) {
reqModDate = new Date(reqModDate);
if(reqModDate.getTime()==mtime.getTime()) {
//Yes: then send a 304 header without image data (will be loaded by cache)
console.log("load from cache");
response.writeHead(304, {
"Last-Modified": mtime.toUTCString()
});
response.end();
return true;
}
} else {
//NO: then send the headers and the image
console.log("no cache");
response.writeHead(200, {
"Content-Type": "image/jpg",
"Last-Modified": mtime.toUTCString(),
"Content-Length": size
});
response.write(img);
response.end();
return true;
}
//IF WE ARE HERE, THERE IS A PROBLEM...
response.writeHead(200, {
"Content-Type": "text/plain",
});
response.write("ERROR");
response.end();
return false;
}
http.createServer(onRequest).listen(8889);
console.log("Server has started.");
Of course, I don't want to reinvent the wheel; this is a benchmark for a more complex server previously developed in PHP, and this script is a sort of "porting" of this PHP code:
http://us.php.net/manual/en/function.header.php#61903
I hope this will help!
Please, if you find any errors or anything that could be improved let me know!
Thanks a lot,
Daniele

nodejs gm content-length implementation hangs browser

I've written a simple image manipulation service that uses node gm on an image from an HTTP response stream. If I use Node's default Transfer-Encoding: chunked, things work just fine. But as soon as I try to add a Content-Length implementation, Node hangs the response or I get content-length mismatch errors.
Here's the gist of the code in question (some variable setup has been omitted for the example):
var image = gm(response);

// gm getter used to read the original properties of the image
image.identify({ bufferStream: true }, function (error, value) {
    this.setFormat(imageFormat)
        .compress(compression)
        .resize(width, height);

    // instead of the default Transfer-Encoding: chunked, calculate Content-Length
    this.toBuffer(function (err, buffer) {
        console.log(buffer.length);
        res.setHeader('Content-Length', buffer.length);
        gm(buffer).stream(function (stError, stdout, stderr) {
            stdout.pipe(res);
        });
    });
});
This spits out the desired image and a content length that looks right, but the browser hangs, suggesting there's a bit of a mismatch or something else wrong. I'm using node gm 1.9.0.
I've seen similar posts on the Node.js gm Content-Length implementation, but I haven't seen anyone post this exact problem yet.
Thanks in advance.
I ended up changing my approach. Instead of using this.toBuffer(), I save the new file to disk using this.write(fileName, callback), then read it with fs.createReadStream(fileName) and pipe it to the response. Something like:
var filePath = './output/' + req.param('id') + '.' + imageFormat;

this.write(filePath, function (writeErr) {
    var stat = fs.statSync(filePath);
    res.writeHead(200, {
        'Content-Type': 'image/' + imageFormat,
        'Content-Length': stat.size
    });
    var readStream = fs.createReadStream(filePath);
    readStream.pipe(res);
    // async delete the file from the filesystem
    ...
});
You end up getting all of the headers you need, including your new Content-Length, to return to the client.
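For comparison, here is a minimal sketch of an in-memory alternative (not the approach taken above, and untested): since toBuffer() already holds the final encoded bytes, the Content-Length can be taken from that same buffer and the buffer sent directly, avoiding the second gm(buffer).stream() pass, whose re-encoded output may not match buffer.length.

// Sketch only: assumes image, imageFormat, compression, width, height and res
// from the question are in scope.
image.identify({ bufferStream: true }, function (error, value) {
    this.setFormat(imageFormat)
        .compress(compression)
        .resize(width, height)
        .toBuffer(function (err, buffer) {
            if (err) {
                res.statusCode = 500;
                return res.end();
            }
            res.writeHead(200, {
                'Content-Type': 'image/' + imageFormat,
                'Content-Length': buffer.length
            });
            res.end(buffer); // the bytes sent are exactly the bytes that were measured
        });
});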
