Chrome is not rendering chunked json response - node.js

I want to make a streaming endpoint that keeps sending JSON objects. I have tried chunked transfer encoding and was able to make all the browsers (Chrome, Firefox, Safari) render the data sent from the server when the content type was either text/html or text/plain. But when I use application/json as the Content-Type, it does not work in Chrome. Here's the code:
var http = require('http');

http.createServer(function (request, response) {
  response.setHeader('Connection', 'Transfer-Encoding');
  response.setHeader('Transfer-Encoding', 'chunked');
  response.setHeader('Cache-Control', 'no-cache');
  // response.setHeader('Content-Type', 'application/json');
  response.setHeader('Content-Type', 'text/html');
  setInterval(function () {
    response.write('test <br>');
    // response.write('{"test": "test"}');
  }, 10);
}).listen(8000);
The above code works as expected, but I can't make it work with the application/json content type. Am I missing something?

This is a bug in browsers, and it has been fixed in the latest Chrome (at least in Canary):
Chrome Bug - Transfer-Encoding chunked not support on text/plain
Safari (I have tested on Mac OS X) is now the only browser that does not render non-HTML content with chunked encoding.
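For reference, a minimal sketch of a streaming endpoint that sends one JSON object per line (my own illustration, not from the original question; the newline-delimited framing and the application/x-ndjson content type are assumptions, chosen so a client can split objects on line boundaries):

var http = require('http');

http.createServer(function (request, response) {
  // Node adds Transfer-Encoding: chunked automatically when the
  // response has no Content-Length.
  response.setHeader('Cache-Control', 'no-cache');
  response.setHeader('Content-Type', 'application/x-ndjson');

  var timer = setInterval(function () {
    // One complete JSON object per line, so a client can parse each
    // object as soon as its line arrives.
    response.write(JSON.stringify({ test: 'test', time: Date.now() }) + '\n');
  }, 1000);

  request.on('close', function () {
    clearInterval(timer); // stop writing once the client disconnects
  });
}).listen(8000);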

Related

Express modifying the response headers

I'm using Express to serve some webpages. I have a font file (WOFF) that I'm downloading using Node's https API. I then pass the response headers along to the response I'm returning to the client. The issue is that Express seems to be modifying the response headers, specifically the content-type and content-length headers.
I'm calling res.set() and passing in the headers from the server-side request.
Here is some code:
app.get('/*', (req, res, next) => {
  https.get(URI, (serversideRes) => {
    let data = '';
    // accumulating binary chunks into a UTF-8 string (how `data` is
    // presumably built here) is itself lossy for binary font data
    serversideRes.on('data', (chunk) => { data += chunk; });
    serversideRes.on('end', () => {
      res.set(serversideRes.headers);
      console.log(res.getHeaders()['content-type']);   // font/x-woff
      console.log(res.getHeaders()['content-length']); // 27756
      res.send(data);
      console.log(res.getHeaders()['content-type']);   // font/x-woff; charset=utf-8
      console.log(res.getHeaders()['content-length']); // 49574
    });
  });
});
In the browser console I'm getting:
Failed to decode downloaded font
OTS parsing error: incorrect file size in WOFF header
From the Express API docs:
This method performs many useful tasks for simple non-streaming responses: For example, it automatically assigns the Content-Length HTTP response header field (unless previously defined) and provides automatic HEAD and HTTP cache freshness support.
In this case, the headers clearly seem to be previously defined.
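A likely fix, sketched under the assumption that the body was being accumulated as a UTF-8 string (which corrupts binary data and would explain both the added charset and the inflated Content-Length): collect the upstream response into a Buffer and send that, since Express neither appends a charset to the Content-Type nor re-encodes Buffer bodies:

const https = require('https');
const express = require('express');
const app = express();

const URI = 'https://example.com/font.woff'; // placeholder for the real font URL

app.get('/*', (req, res) => {
  https.get(URI, (serversideRes) => {
    const chunks = [];
    serversideRes.on('data', (chunk) => chunks.push(chunk));
    serversideRes.on('end', () => {
      res.set(serversideRes.headers);
      // A Buffer is sent byte-for-byte: no charset is appended and
      // Content-Length matches the real byte count of the WOFF file.
      res.send(Buffer.concat(chunks));
    });
  });
});

app.listen(8000);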

Why can't I remove the transfer-encoding header in a node proxy?

I have a Node http-proxy server doing some response body rewriting that basically does this:
1. Client GETs localhost:8000/api/items
2. Node proxy forwards the request from localhost:8000 to example.com/api
3. Server responds with JSON: [{ id: 1234, url: "http://example.com/api/items/1234" }]
4. Node proxy rewrites the JSON to [{ id: 1234, url: "http://localhost:8000/api/items/1234" }]
5. Node proxy calculates the new content-length header, sets it, and returns the response to the client
This was working fine until the backend server enabled compression. So now, by default, responses were being gzipped. I worked around this by setting this in my proxy:
req.headers['accept-encoding'] = 'deflate';
So after that, responses weren't being gzipped, and I could parse them and rewrite the body as before. However, this stopped working in IE. I think the problem is that the response still has a transfer-encoding: chunked header, so IE expects a chunked response. Because that transfer-encoding header is present, there is no content-length header, even though I'm explicitly setting it (the two headers are mutually exclusive). I've tried everything I can think of to remove the transfer-encoding header and put the content-length header in instead, but nothing is working. I've tried all of these:
// In the context of my middleware response.writeHead function
res.setHeader('transfer-encoding', null);
res.setHeader('transfer-encoding', '');
res.removeHeader('transfer-encoding');
res.setHeader('content-length', modifiedBuffer.length); // this line alone worked before
res.originalWriteHead.call(res, statusCode, { 'Content-Length': modifiedBuffer.length });

// In the context of my middleware response.write function res.write(data, encoding)
// Here, the encoding parameter is undefined
// According to the docs, encoding defaults to 'utf8'
res.oldWrite.call(res, modifiedBuffer, 'utf8');
res.oldWrite.call(res, modifiedBuffer, '');
res.oldWrite.call(res, modifiedBuffer, null);
// tried the same three variants for res.end
Basically, no matter what I do, the response is not chunked, but it still has the transfer-encoding header set and no content-length. Firefox, Safari, and Chrome all seem to handle this fine, but IE fails with the error XMLHttpRequest: Network Error 0x800c0007, No data is available for the requested resource. This is (from what I can tell) because it's waiting for chunks (because of the transfer-encoding header), but hits the end of the response without a content-length to read it by.
Does anyone know how I can solve this? Am I doing something wrong in trying to remove the transfer-encoding header in favor of content-length?
I figured this out myself:
I was effectively using two middleware components: my own (described in the question) and the Express compression middleware. My middleware was decompressing the response, but compression ensures that the response is always written with transfer-encoding: chunked and that content-length is removed.
Removing the Express compression module solved this for me.
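For completeness, here is a minimal sketch of the same kind of body-rewriting proxy with no compression middleware in the chain, using http-proxy's selfHandleResponse option (the target and rewrite URLs are placeholders, and this assumes a reasonably recent http-proxy version):

const http = require('http');
const httpProxy = require('http-proxy');

// selfHandleResponse lets us buffer, rewrite, and re-send the body ourselves
const proxy = httpProxy.createProxyServer({
  target: 'http://example.com',
  selfHandleResponse: true
});

proxy.on('proxyRes', (proxyRes, req, res) => {
  const chunks = [];
  proxyRes.on('data', (chunk) => chunks.push(chunk));
  proxyRes.on('end', () => {
    const body = Buffer.concat(chunks)
      .toString('utf8')
      .replace(/http:\/\/example\.com\/api/g, 'http://localhost:8000/api');
    const headers = Object.assign({}, proxyRes.headers);
    delete headers['transfer-encoding'];                 // fixed-length body instead of chunked
    delete headers['content-encoding'];                  // body is uncompressed after the rewrite
    headers['content-length'] = Buffer.byteLength(body); // length of the *rewritten* body
    res.writeHead(proxyRes.statusCode, headers);
    res.end(body);
  });
});

http.createServer((req, res) => {
  // 'identity' asks the upstream not to compress at all, which is
  // safer than switching to 'deflate'
  req.headers['accept-encoding'] = 'identity';
  proxy.web(req, res);
}).listen(8000);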

Headers in node.js - socket io Resource interpreted as Script but transferred with MIME type text/plain:

I'm new to node.js and I cannot understand how headers work. I'm trying to include fancybox in my index.html but it doesn't work. I'm using a c9.io workspace, so the script tag looks like this: <script src="http://space.......c9.io/jquery.fancybox-1.3.4.js"></script>
I still get the same error in the console: Resource interpreted as Script but transferred with MIME type text/plain
Please, is there anyone who can explain this to me as simply as possible?
My js file:
var http = require("http"),
    express = require('express'),
    app = express(),
    server = app.listen(process.env.PORT),
    io = require('socket.io').listen(server);

app.get('/', function (req, res) {
  res.sendfile(__dirname + '/index.html');
});

io.sockets.on('connection', function (socket) {
});
I think the solution to your problem is a little type="text/javascript" that you should add to your script tag, but here's a rundown on MIME types:
Here's some text about setting headers when using raw Node.js.
And here's the same thing when using Express.
And here's MIME types on Wikipedia.
I'm not an expert, but as I understand it, every file or chunk of info sent from server to client (browser, mostly) comes with this MIME type that basically tells the browser how to deal with the file or chunk. Your browser is smart enough to handle the file correctly even though it received no headers, and it is also smart enough to notify you that other browsers might not be that smart.
This is how I'd write it:
app.get('/', function (request, response) {
  response.set('Content-Type', 'text/html'); // 'text/html' => MIME type
  response.sendfile(__dirname + '/index.html');
});
Lots of MIME types are listed here. But I think you can just google something like "{file extension} mime type" and Google will serve you well.
Alternatively, you can use this little package to change response.set('Content-Type', 'text/html') into response.set('Content-Type', mime.lookup(x)) - x being a string such as 'kuku.mpeg' - and mime will return the correct MIME type. I use it to serve plugins that have many subfolders with different file types in each one.
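For example (a sketch assuming the classic mime v1 API, where mime.lookup() maps a file name to a type; newer major versions of the package renamed this to mime.getType()):

var mime = require('mime');

app.get('/video', function (request, response) {
  var file = __dirname + '/kuku.mpeg';             // hypothetical file
  response.set('Content-Type', mime.lookup(file)); // -> 'video/mpeg'
  response.sendfile(file);
});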
An HTTP header is a field that contains information about an HTTP request or response. It helps the server or client decide what to do with the data: what type of data to accept, how big the request or response should be, what the origin of the request is, whether to cache the data, and so on.
In HTTP, MIME headers tell the client or server what type of data is going to be sent or received. The error message you received probably means that the browser expected MIME type text/javascript but received text/plain instead.
Resource interpreted as script but transferred with MIME type text/plain
To fix this problem, specify the content type when sending the script file:
app.get('/script.js', function (req, res) {
  res.set('Content-Type', 'text/javascript');
  res.sendfile('./script.js');
});
Note that HTTP headers are not specific to Node.js, but are part of the HTTP protocol.

HTTP Content-Encoding: gzip not recognized and wrong Content-Length

I'm developing a Node.js proxy and stumbled over some strange behaviour in Google Chrome, which I can also reproduce in Fiddler.
If I send gzipped content to the browser, it doesn't show different zipped/unzipped content sizes. The content is still being displayed correctly, though.
Here a small piece of code to reproduce the problem:
var http = require('http'),
    zlib = require('zlib'),
    fs = require('fs');

var Proxy = http.createServer(function (request, response) {
  var raw = fs.createReadStream('test');
  response.writeHead(200, {
    'content-encoding': 'gzip'
  });
  raw.pipe(zlib.createGzip()).pipe(response);
});

Proxy.listen(8000);
The file 'test' contains some dummy HTML; the file size is about 90 KB.
I test the code like this:
$ curl localhost:8000 | gunzip
This works correctly, so I think the Node.js code is correct.
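As an aside (my own addition, not part of the original post, and it does not explain the DevTools display described below): a more defensive version of the server would only gzip when the client actually advertises support in Accept-Encoding:

var http = require('http'),
    zlib = require('zlib'),
    fs = require('fs');

http.createServer(function (request, response) {
  var raw = fs.createReadStream('test');
  // Only compress if the client says it can handle gzip.
  if (/\bgzip\b/.test(request.headers['accept-encoding'] || '')) {
    response.writeHead(200, { 'content-encoding': 'gzip' });
    raw.pipe(zlib.createGzip()).pipe(response);
  } else {
    response.writeHead(200);
    raw.pipe(response);
  }
}).listen(8000);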
The Problem
Here is a screenshot of the gzipped response: the Size and Content values are nearly the same.
But I would expect the received gzipped content (Size) to be much smaller than the unzipped content (Content).
Also, I do not see the expected "Content-Encoding: gzip" header, and the "Content-Length" header shows the length of the uncompressed file. I get the same results if I pipe the HTTP traffic through Fiddler.
This URL produces the expected behaviour:
http://www.heise.de/js/jquery/jquery-1.7.1.min.js
So, what am I doing wrong?
Why is Chrome showing the wrong sizes?
Or is the Node.js code sending something wrong in the response?
Oh god, I got it working!
I don't know why, but today I booted into my Windows development machine instead of Ubuntu.
As it seems, Chrome for Windows doesn't display or process the response correctly (a common culprit for this kind of thing is antivirus software or a local proxy on Windows decompressing responses in transit, though I haven't confirmed that).
Under Ubuntu, it works on the first try.
I want 6 hours of my life back.

Empty req.body receiving text/plain POST request to node.js

Why can't I receive plain text sent in a POST request body?
The request made from a client browser:
var xhr = new XMLHttpRequest();
xhr.open("POST", "/MyRoute/MySubRoute");
xhr.setRequestHeader("Content-Type", "text/plain;charset=UTF-8");
xhr.send("hello!");
Using Express with my node server:
app.post('/MyRoute/MySubRoute', function (req, res) {
  console.log("Received:" + require('util').inspect(req.body, { depth: null }));
  res.send();
});
Logged to the console I get:
Received:{}
I've tried text/plain without the charset too, with the same result. If I change the content type to application/json and pass a simple JSON string, it works fine.
Summarising the above comments, which answer the question:
The client's XMLHttpRequest is correct.
On the server side, Express uses Connect's bodyParser, which by default only supports the following content types:
application/json
application/x-www-form-urlencoded
multipart/form-data
Because text/plain is not among them, no middleware collects the body before the app.post route runs, so req.body stays empty.
The solution is to add the text/plain content type to Express, as described here.
In newer Express versions (4.17+), add:
app.use(express.text())
and req.body will contain the plain-text body as a string. You can read more about it here.
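A complete minimal sketch (assuming Express 4.17 or later, where express.text() ships with Express itself):

const express = require('express');
const app = express();

// express.text() parses text/plain request bodies into a string on req.body
app.use(express.text({ type: 'text/plain' }));

app.post('/MyRoute/MySubRoute', (req, res) => {
  console.log('Received:' + req.body); // Received:hello!
  res.send();
});

app.listen(8000);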
