HTTP Content-Encoding: gzip not recognized and wrong Content-Length - node.js

I'm developing a Node.js proxy and stumbled over some strange behaviour in Google Chrome, which I can also reproduce in Fiddler.
If I send gzipped content to the browser, it doesn't recognize the different zipped/unzipped content sizes. The content is still being displayed correctly, though.
Here is a small piece of code to reproduce the problem:
var http = require('http'),
    zlib = require('zlib'),
    fs = require('fs');

var Proxy = http.createServer(function(request, response) {
    var raw = fs.createReadStream('test');
    response.writeHead(200, {
        'content-encoding': 'gzip'
    });
    raw.pipe(zlib.createGzip()).pipe(response);
});

Proxy.listen(8000);
The file 'test' contains some dummy HTML; its size is about 90 KB.
I test the code like this:
$ curl localhost:8000 | gunzip
This works correctly, so I think the Node.js code is correct.
The Problem
Here is a screenshot of the gzipped response. The Size and Content values are nearly the same, but I expect the received gzipped content (Size) to be much smaller than the unzipped content (Content).
Also, I do not see the expected "Content-Encoding: gzip" header, and the "Content-Length" header shows the length of the uncompressed file. I get the same results if I pipe the HTTP traffic through Fiddler.
This URL produces the expected behaviour:
http://www.heise.de/js/jquery/jquery-1.7.1.min.js
So, what am I doing wrong?
Why is Chrome showing the wrong sizes?
Or is the Node.js code sending something wrong in the response?

Oh god, I got it working!
I don't know why, but today I booted into my Windows development machine instead of Ubuntu.
As it seems, Chrome for Windows doesn't display or process the response correctly.
Under Ubuntu, it works on the first try.
I want 6 hours of my life back.

Related

serverless express can't retrieve pdf file (base64 encoding)

I have set up an express/serverless application to retrieve a PDF file on a GET request, but I just retrieve a corrupted PDF response. I'm wondering if my settings are correct to achieve a correct response.
I'm using aws-serverless-express and want to return my PDF buffer to the client browser (it should open in the browser).
My code:
let status = 200;
let fileName = "demo.pdf";
res.setHeader('Content-disposition', 'inline; filename="' + fileName + '"');
res.setHeader('Content-type', 'application/pdf');
res.setHeader('isBase64Encoded', true); // isBase64Encoded: true
let pdf = pdfBuffer.toString('base64');
res.status(status).send(pdf);
So I'm sending a base64-encoded string to API Gateway. I'm not actually sure if I can set the isBase64Encoded flag via a header; I read this somewhere before, but I'm not certain about it.
I have done this whole procedure before, but without aws-serverless-express (where I could set the isBase64Encoded flag easily).
I'm also using serverless-apigw-binary to automatically set up API Gateway for the correct decoding of the base64-encoded data.
Lambda is automatically encoding to base64, so I had to remove my own encoding and send the buffer directly.
I came across a similar problem while using serverless Express. AWS API Gateway needs an HTTP response like this:
{
  "headers": { ... },
  "body": "/9j/4AAQSkZ...",
  "isBase64Encoded": true
}
It's useless to put isBase64Encoded in the HTTP response headers; AWS API Gateway only checks for isBase64Encoded outside the headers. If it's true, the body is decoded before being sent to the HTTP client.
Express (its response object) doesn't seem to allow adding anything outside the HTTP headers in a response.
If you don't want to give up using Express, the workaround is to do the decoding in the browser, which is pretty easy:
var decodedData = Buffer.from(body,'base64')

How to send back custom HTTP headers from my Perl server?

I am writing a Perl HTTP server using HTTP::Daemon. My Perl client sends a HEAD request to the server to get the content length of the file which I have to GET later.
My problem is that I am not able to generate a custom header and send it back to the client.
I can send back basic HTTP headers using $c->send_basic_header, but as soon as I try to send specific headers using $c->send_header( $field1, $value1, $field2, $value2, ... ) it does not work.
I am not able to understand what is the problem.
The headers which I am trying to send are:
$c->send_header('Content-Type', 'image/jpeg', 'Content-Length', '56360', 'Accept-Ranges', 'bytes')
I am new to Perl, so please help me understand how to do this.
You don't show your code, but do you realise that you need to call send_basic_header as well as send_header?
Your code should look like this:
$c->send_basic_header;
$c->send_header(
    'Content-Type'   => 'image/jpeg',
    'Content-Length' => '56360',
    'Accept-Ranges'  => 'bytes',
);
$c->send_crlf;

Node.js: How to stream remote file through my server to the user?

There's a large binary file at somewhere.com/noo.bin
I want to send this to the user on my web app.
I don't want to save this file on my server and then serve it; I'm wondering if there's a way to stream the file so that my web app acts as a proxy (the file will look like it's at mysite.com/noo.bin).
Install request, then:
var http = require('http'),
    request = require('request'),
    remote = 'http://somewhere.com';

http.createServer(function (req, res) {
    // e.g. /noo.bin becomes http://somewhere.com/noo.bin
    var remoteUrl = remote + req.url;
    request(remoteUrl).pipe(res);
}).listen(8080);
Though I would have written exactly @LaurentPerrin's answer myself, for completeness' sake I should say this:
The drawback of that method is that the request headers you send to somewhere.com are unrelated to the request headers your server got. For example: if the request sent to you has a specific value for Accept-Language, it is likely that (as the code stands) you are not going to specify the same value for Accept-Language when proxying from somewhere.com. Thus the resource might be returned to you (and then from you to the original requester) in the wrong language.
Or if the request to you comes in with Accept-Encoding: gzip, the current code will get the large file uncompressed and will stream it back uncompressed, when you could have saved bandwidth and time by accepting and streaming back a compressed file.
This may or may not be of relevance to you.
If there are important headers you feel you need to pass, you could either add some code to explicitly copy them from your request to the request you are sending to somewhere.com (and then copy relevant response headers back), or use node-http-proxy at https://github.com/nodejitsu/node-http-proxy.
An example of a forward proxy using node-http-proxy is https://github.com/nodejitsu/node-http-proxy/blob/master/examples/http/forward-proxy.js

Is There a Way to Check Sent Headers with Node/Express 2.x?

Is there a way of checking what specific headers have been sent using Node/Express 2.x?
I have a file download that works perfectly most of the time, but in a few specific instances I get an error in Chrome (no errors in node):
Duplicate headers received from server
The response from the server contained duplicate headers. This problem is generally the result of a misconfigured website or proxy. Only the website or proxy administrator can fix this issue.
Error 349 (net::ERR_RESPONSE_HEADERS_MULTIPLE_CONTENT_DISPOSITION): Multiple distinct Content-Disposition headers received. This is disallowed to protect against HTTP response splitting attacks.
For testing purposes, I'd like to see if a specific header has been sent or not, is there a way of doing this with node.js?
...And because someone's going to ask me about the code I use to set headers: I'm piping a stream as the download and am only setting headers in one spot.
res.setHeader('Content-disposition', 'attachment; filename=' + filename)
res.setHeader('Content-Length', stats.size)
res.setHeader('Content-type', 'application/pdf')
stream.pipe(res)
The HTTP response is a WritableStream. When the stream closes, a 'finish' event is emitted, so listening for it does the trick:
res.on('finish', function() {
console.log(res._headers);
});
This is much more flexible: it can be put in a middleware or a resource handler.
As @generalhenry stated in the comments on my question:
stream.pipe(res).on('end', function () {
console.log(res._headers);
});
The above line worked for me.
res.set("Content-disposition", "attachment; filename=\""+file.name+"\"")
This worked for me.

Express (node.js) seems to be replacing my content-type with application/json

I've written an express web app route that is essentially a proxy: it pipes the contents of an input stream (a zip file) into the server response's output stream.
I'd like the browser to prompt the user for download, or whatever is most appropriate for a zip file. However, when I load this route in a browser, the contents of the input stream (the zip file's contents) show up in the browser window as text rather than prompting a download.
This is the code sending the response:
res.statusCode = 200;
res.setHeader('Content-Length', size);
res.setHeader('Content-Type', 'application/zip');
console.log("content-type is " + res.header('Content-Type'));
inputStream.pipe(res);
The console.log statement above outputs "content-type is application/zip".
However, when I examine the request in Chrome's network tab, I see that the response's content-type is "application/json". This implies that express, or something else, is overriding my content-type header, or perhaps has already sent it.
Does anyone know what is changing the content-type on me, and how I could make sure the content-type is the one I set?
Thanks for any help.
You should check the order of your middleware; it's really tricky, and things can get messed up if they are not in the correct order.
You can check the correct order here, on the Connect webpage.
