response.write failure in Nodejs - node.js

I am trying to make a proxy server that gets a page from www.xxx.com for example, caches the response, and then sends it to the requesting browser.
To do so, on the server I create an HTTP client that requests the page from xxx.com. The response arrives in chunks (Buffers). Since the number of chunks differs from page to page, I push the chunks into an array of Buffers and then send the elements of the array.
My problem is that not all the chunks are sent successfully. Is there another way I can cache the data before sending it? (I know I can send the data directly, but I need to send the cache instead because I want to send it to more than one browser.)
To save the chunks I use:
function getURL(u) {
    u = url.parse(u);
    // http.createClient is the old client API from early Node releases
    var client = http.createClient(u.port || 80, u.hostname);
    var request = client.request('GET', '/', {
        'Host': u.hostname
    });
    var cache = {};
    cache.data = [];
    request.end();
    request.on('response', function (response) {
        cache.statusCode = response.statusCode;
        cache.headers = response.headers;
        response.on('data', function (chunk) {
            cache.data.push(chunk);
        });
    });
}
To send the cache, I use:
function sendCache(response, cache) {
    var writeSuccess = [];
    response.writeHead(cache.statusCode, cache.headers);
    for (var i = 0; i < cache.data.length; i++) {
        // don't encode the data, leave it as it is
        writeSuccess[i] = response.write(cache.data[i], "binary");
        console.log("chunk " + i + " is of length "
            + cache.data[i].length + ". Response success: " + writeSuccess[i]);
    }
}
Here I log the return value of response.write to check whether it succeeds. The node.js API docs do not say whether this function returns anything, so I just tried it out.
What I noticed is that response.write was sometimes true and sometimes false for chunks of the cache, whereas if I send the response directly without caching, response.write is true for all chunks.
Does anyone notice something wrong in what I am doing, or know a better way to cache the data (preferably as binary, so that non-ASCII characters are cached too)?

If you are trying to proxy requests in node.js you should try using https://github.com/nodejitsu/node-http-proxy; it will save you a lot of time and headaches.
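A minimal sketch of what that could look like (the target URL is just a placeholder, and the exact API depends on which version of http-proxy you install):
// Sketch only: forward every incoming request to a fixed upstream target.
var http = require('http');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({});

http.createServer(function (req, res) {
    // proxy the browser's request to the site being cached/proxied
    proxy.web(req, res, { target: 'http://www.example.com' });
}).listen(8000);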

In the latest release of Node.js (v0.4.0), the issue of writing a response from a buffer was fixed, so simply updating to that version solved my problem.
However, be aware that response.write may still return false. That does not mean the data was not sent; it means it was not sent immediately but buffered (back-pressure, the leaky-bucket concept). This is what I was able to conclude from the comments inside the node.js source (I hope I am correct).
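The usual way to respect that back-pressure is to pause writing until the 'drain' event fires; a rough sketch (not from the original answer, names are illustrative) of how the cache-sending function could do that:
// Sketch: write cached chunks, pausing whenever write() returns false
// and resuming when the response emits 'drain'.
function sendCacheWithDrain(response, cache) {
    response.writeHead(cache.statusCode, cache.headers);
    var i = 0;
    function writeNext() {
        while (i < cache.data.length) {
            var ok = response.write(cache.data[i++]);
            if (!ok) {
                // outgoing buffer is full; continue once it has drained
                response.once('drain', writeNext);
                return;
            }
        }
        response.end();
    }
    writeNext();
}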

Related

NodeJS ending https response early

I have a nodejs application that sends http requests to an external API and retrieves JSON data (in the form of a hex buffer) in response, using http.
However, when I try to pull large data sets from the API, the data is incomplete. The JSON cuts off about a third of the way through, and trying to parse it produces an error. I don't think it's a problem with my parsing (toString), because res.complete is never triggered, so the response is clearly not finishing.
Is there a way to force my request to wait for res.complete to finish?
My request looks like this:
const req = https.get(pull_options, res => {
    res.on('data', d => {
        if (res.complete) {
            resolve(d.toString());
        } else {
            console.error("Connection terminated while message was still being sent.");
        }
    });
});
I really don't think it's a problem with the API cutting me off because I'm able to pull the same data set with nearly identical code in python with no issues.
let output = '';
const req = https.get(pull_options, res => {
    res.on('data', d => {
        output += d;
    });
    res.on('end', () => {
        resolve(output);
    });
});
Moving the resolve into the 'end' handler, so that all the data has arrived first, worked for me.
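Wrapped in a Promise, the whole request might look roughly like this sketch (pull_options is assumed to be defined elsewhere, as in the question):
// Sketch: collect the chunks and resolve only once the response has ended.
const https = require('https');

function pullData(pull_options) {
    return new Promise((resolve, reject) => {
        const req = https.get(pull_options, res => {
            let output = '';
            res.on('data', d => { output += d; });
            res.on('end', () => resolve(output));
        });
        req.on('error', reject);
    });
}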

server response to webform: how to answer duplicates?

I'm running a small server that needs to receive web forms. The server checks the request and sends back "success" or "fail", which is then displayed on the form (the client's screen).
Now, checking the form may take a few seconds, so the user may be tempted to send the form again.
What is the correct way to ignore the second request?
So far I have come up with these solutions for when the form is a duplicate of the previous one:
1. Don't check it and send some error back (like 429, or 102, or some other code).
2. Close the connection directly: req.destroy(); res.destroy();
3. Ignore the request and return from the requestListener function.
With solutions 1 and 2 the form (in the client's browser) displays an error message, even though the first request they sent was correct (and so were the duplicates). So that's not a good option.
Solution 3 gives the desired outcome... but I'm not sure it is the right way to go about it: basically leaving req and res alone instead of destroying them. Could this cause issues or slow the server down? (Do they stack up?) Of course the first request, once it has been checked, will be answered with the outcome code. My concern is with the duplicate requests, which I neither destroy nor answer.
Some details on the setup: a Node.js application using the very default code from the http module.
const http = require("http");
const requestListener = function (req, res) {
var requestBody = '';
req.on('data', (data)=>{
requestBody += data;
});
req.on('end', ()=>{
if (isduplicate(requestBody))
return;
else
evalRequest(requestBody, res);
})
}
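(isduplicate and evalRequest are not shown in the question; a hypothetical isduplicate could be as simple as remembering the last body seen:)
// Hypothetical helper, not from the question: a request counts as a duplicate
// if its body is identical to the previous one received.
let lastBody = null;

function isduplicate(body) {
    if (body === lastBody) {
        return true;
    }
    lastBody = body;
    return false;
}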

Is there a way to limit the amount of data that I get from a response?

Hello, I've got a small challenge where I have to display some data that I get from an API. The main page will display the first 20 results, and clicking a button will add 20 more results to the page.
The API call that I was given returns an array with around 1500 elements, and the API doesn't have a parameter to limit the number of elements in the array. My question is whether I can limit it somehow with axios, or whether I should just fetch all of the elements and display them.
This is the api: https://api.chucknorris.io/
There are two answers to your question.
The short answer is:
On your side, there's nothing you can do until pagination is implemented on the API side.
The second answer is:
You can handle it with the http module, like this:
const http = require('http');

// opts holds the usual request options (host, path, etc.)
const request = http.request(opts, function (response) {
    console.log("Content-length: ", response.headers['content-length']);
    var str = '';
    response.on('data', function (chunk) {
        str += chunk;
        if (str.length > 10000) {
            // stop the transfer once roughly 10,000 bytes have arrived
            request.abort();
        }
    });
    response.on('end', function () {
        console.log('done', str.length);
        // ...
    });
});
request.end();
This will abort the request at around 10,000 bytes, since the data arrives in chunks of varying sizes.
Since the API has no parameter to limit the number of results, you are responsible for modifying the response.
Since you're using Axios, you could do this with a response interceptor, so that the response is modified before it reaches your application.
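A rough sketch of such an interceptor (it assumes the array lives in response.data.result; adjust the property name to whatever the endpoint actually returns):
// Sketch: cut the array down to the first 20 items before the app sees it.
const axios = require('axios');

const api = axios.create({ baseURL: 'https://api.chucknorris.io' });

api.interceptors.response.use((response) => {
    if (response.data && Array.isArray(response.data.result)) {
        response.data.result = response.data.result.slice(0, 20);
    }
    return response;
});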
You may want to consider where the best place to do this is, though. If you let the full response come back to your application and store it somewhere, it may be easier to return the next page of 20 results on the user's request rather than repeatedly calling the API.

Fabric.js loadFromJSON sometimes fails in Node.js if string contains images

I have a problem with PNG image generation on the server side, using Fabric.js + Node.js. I am surprised that I can't find anyone with a similar problem in the forums. I am in total despair; it puts the use of Fabric.js in our project at risk.
PNG generation in my Fabric.js Node.js service fails on an irregular basis. I cannot determine why it sometimes works and sometimes doesn't.
I need to generate the PNG on the server side. I've developed a small Node.js web service based on the samples here and here.
I've also developed a custom Fabric.js image class, "RemoteImage", based on Kangax's sample here.
To minimize the JSON string size, I store a data-less JSON in my database, and the images are supposed to be loaded via the link provided in the "src" attribute of the Fabric.js Image element. As a result, I need to load the following JSON, which contains 3 images, into the canvas:
{"objects":[{"type":"remote-image","originX":"left","originY":"top","left":44,"top":29,"width":976,"height":544,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.5,"scaleY":0.5,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"222c0a8b-46ac-4c01-9c5c-79753937bc24","layerName":"productCanvas","itemName":"mainCanvas","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/222c0a8b-46ac-4c01-9c5c-79753937bc24","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/222c0a8b-46ac-4c01-9c5c-79753937bc24","lockUniScaling":true},
{"type":"remote-image","originX":"left","originY":"top","left":382.5,"top":152.25,"width":292,"height":291,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.43,"scaleY":0.43,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"8d97050e-eae8-4e95-b50b-f934f0df2d4c","itemName":"BestDeal.png","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/8d97050e-eae8-4e95-b50b-f934f0df2d4c","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/8d97050e-eae8-4e95-b50b-f934f0df2d4c","lockUniScaling":true},
{"type":"remote-image","originX":"left","originY":"top","left":38,"top":38.5,"width":678,"height":370,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.21,"scaleY":0.21,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"42dc0e49-e45f-4aa7-80cf-72d362deebb7","itemName":"simple_car.png","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/42dc0e49-e45f-4aa7-80cf-72d362deebb7","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/42dc0e49-e45f-4aa7-80cf-72d362deebb7","lockUniScaling":true}],"background":""}
On the Node.js server side I use the following code. I transfer the JSON string base64-encoded to avoid some special-character problems:
var fabric = require('fabric').fabric;

function generatePNG(response, postData) {
    var canvas = fabric.createCanvasForNode(1500, 800);
    var decodedData = new Buffer(postData, 'base64').toString('utf8');
    response.writeHead(200, "OK", { 'Content-Type': 'image/png' });
    console.log("decodedData data: " + JSON.stringify(decodedData));
    console.log("prepare to load");
    canvas.loadFromJSON(decodedData, function () {
        console.log("loaded");
        canvas.renderAll();
        console.log("rendered");
        var stream = canvas.createPNGStream();
        stream.on('data', function (chunk) {
            response.write(chunk);
        });
        stream.on('end', function () {
            response.end();
        });
    });
}
In the console I see the message "prepare to load" appear, but the message "loaded" does not. I am not an expert in Node.js, and this is the only way I can tell that the error happens during the loadFromJSON call. But I do not understand where the problem is.
I am using fabric v1.5.0 and node-canvas v1.1.6 on the server side.
The Node.js + Fabric.js service runs on a Windows 8 machine, and I am making the request from a .NET MVC application using a POST request.
Remark: maybe I should have omitted my comment about base64 encoding, as it is confusing. I tried running with a plain JSON string, with the same result.
If the images referenced in the JSON are on the Node.js server, try changing the file path to a directory path on the server rather than a web URL.
I'm not sure I fully understand how you are using the base64 image, but there are some character corrections that are required for base64 images. I don't recall the specifics and don't have my code handy, but a Google search should set you in the right direction.
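(One common correction, offered here only as a guess at what is meant: when base64 data travels through a URL-encoded form POST, '+' characters arrive as spaces and have to be put back before decoding:)
// Hypothetical sketch: restore '+' characters that a urlencoded POST turned
// into spaces, then decode the base64 payload.
var repaired = postData.replace(/ /g, '+');
var decodedData = new Buffer(repaired, 'base64').toString('utf8');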
I hope those ideas help.
It turned out that the problem was related to the way the fabric.util.loadImage method works. For external images, loadImage makes an HTTP request assuming that no error can happen. The method used to request external images simply logs an error and ends, instead of passing the error back to loadImage through the callback. At that point the image-loading routine is left in an erroneous state with no feedback: it just terminates, crashing the whole Node.js process.
It took me 3 days to finally find out that it was actually my image-supplying web service that responded with status code 500, making the Node.js request fail. Using the image-supplying web service through a browser worked correctly, so at first I did not consider that the error was related to that particular request.
As a result, I rewrote the fromObject method of my custom Fabric.js object. Now it works in a safer fashion, and in case of error I get more feedback. Here is the implementation of my fromObject method. For the HTTP request I use the "request" module.
fabric.RemoteImage.fromObject = function (object, callback) {
    var requestUrl = object.remoteSrc;
    request({
        url: object.remoteSrc,
        encoding: null
    },
    function (error, response, body) {
        if (error || response.statusCode !== 200) {
            var errorMessage = "Error retrieving image " + requestUrl;
            errorMessage += "\nResponse for a new image returned status code " + response.statusCode;
            if (error) {
                errorMessage += " " + error.name + " with message: \n" + error.message;
                console.log(error.stack);
            }
            console.log(errorMessage);
            callback && callback(null, new Error(errorMessage));
        } else {
            var img = new Image();
            var buff = new Buffer(body, 'binary');
            img.src = buff;
            var fabrImg = new fabric.RemoteImage(img, object);
            callback && callback(fabrImg);
        }
    });
};

Node.js - How can I wait for something to be POSTed before I reply to a GET

I have 2 clients and one Node.js server URL: localhost:8888/ServerRequest. The first client GETs from this URL and waits 20 seconds to see whether the second client has POSTed some data for it within that timeout period. If the second client did POST before the timeout, that value is returned in response to the GET request; otherwise a default value is returned. I am not sure what the best way to implement this is. I am trying something like this, but it is not working as desired:
function ServerRequest(response, postData, request)
{
    var id;
    if (request.method == "GET")
    {
        id = setTimeout(function ()
        {
            // handle timeout here
            console.log("Got a timeout, sending default value");
            cmd = "DefaultVal";
            response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>");
            response.end();
        }, 20000);
    }
    else if (request.method == "POST")
    {
        console.log("Received POST, sending POSTed value");
        cmd = postData;
        // Cancel timeout
        clearTimeout(id);
        console.log(" \n Received POST");
        response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>");
        response.end();
    }
}
Another approach I had in mind was to use 2 separate URLs: one for the GET request (/ServerRequest) and the other for the POST request (/PostData). But then how will I pass the POSTed data from one URL to the other if it is received before the timeout?
EDIT: I think I now know what I exactly need. I need to implement a long poll, where a client sends a GET request and waits for a timeout period (the data might not be immediately available to consume, so it waits 20 seconds for some other client to POST some data for the first client to consume). If the timeout occurs, a default value is returned in response to the first client's GET request. I'm working on the long-poll implementation I found here; I'll update if I succeed. If someone can point me to or provide a better example, it would be helpful.
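In the meantime, here is a minimal long-poll sketch of the two-handler idea (it assumes a single waiting GET at a time; the pending variable and the port are illustrative, not from the question):
// Sketch: park the GET's response object; a later POST resolves it,
// otherwise the timeout sends the default value.
const http = require('http');

let pending = null; // the one waiting GET: { res, timer }

http.createServer((req, res) => {
    if (req.method === 'GET') {
        const timer = setTimeout(() => {
            pending = null;
            res.end('<?xml version="1.0" encoding="UTF-8"?><list id="20101001"><com type="DefaultVal"></com></list>');
        }, 20000);
        pending = { res: res, timer: timer };
    } else if (req.method === 'POST') {
        let body = '';
        req.on('data', (chunk) => { body += chunk; });
        req.on('end', () => {
            if (pending) {
                clearTimeout(pending.timer);
                pending.res.end('<?xml version="1.0" encoding="UTF-8"?><list id="20101001"><com type="' + body + '"></com></list>');
                pending = null;
            }
            res.end('received');
        });
    }
}).listen(8888);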
Edit: removed my original code after a more careful reading of the question.
The best solution would probably be WebSockets; otherwise the browser will appear to hang while it waits for the 20 seconds.
Using a library like socket.io you can do this:
var io = require('socket.io').listen(8888);

function postHandler(req, data, res) {
    io.sockets.emit("response", data);
}
Then, on the client side:
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect('http://localhost:8888');
    socket.on('response', function (data) {
        console.log(data);
    });
</script>
