Sending multiple responses with the same response object in Node.js

I have a long-running process which needs to send data back in multiple responses. Is there some way to send back multiple responses with Express.js?
I want to send "test", then 3 seconds later a second response "bar":
app.get('/api', (req, res) => {
    res.write("test");
    setTimeout(function () {
        res.write("bar");
        res.end();
    }, 3000);
});
With res.write I instead wait roughly 3 seconds and then receive everything in a single response.

Yes, you can do it. What you need here is chunked transfer encoding.
This is an old technique; since WebSockets arrived a decade or so ago, I haven't seen it used much.
You apparently just need to send pieces of the response at different times, perhaps as later events fire, and chunks do exactly that. This is actually the default transfer encoding Express uses when you call res.write.
But there is a catch. When you test this with a modern browser or curl, both of which buffer chunks, you won't see the expected result. The trick is to fill up the chunk buffer before sending consecutive response chunks. See the example below:
const express = require('express'),
    app = express(),
    port = 3011;

app.get('/', (req, res) => {
    // res.write("Hello\n");
    res.write("Hello" + " ".repeat(1024) + "\n");
    setTimeout(() => {
        res.write("World");
        res.end();
    }, 2000);
});

app.listen(port, () => console.log(`Listening on port ${port}!`));
The first res.write with the additional 1024 spaces forces the browser to render the chunk; your second res.write then behaves as you expected. You can see the difference by uncommenting the first res.write.
At the network level there is no difference. You can even achieve this in the browser with the XHR object (the first AJAX implementation); here is the related answer.
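A browser-side sketch of that idea (not part of the original answer): XHR exposes partial chunked responses through its onprogress event, so each res.write() from the server above can be observed as it arrives.

var xhr = new XMLHttpRequest();
var seen = 0;
xhr.onprogress = function () {
    // responseText grows as chunks arrive; log only the newly received part
    console.log(xhr.responseText.slice(seen));
    seen = xhr.responseText.length;
};
xhr.onload = function () {
    console.log('done');
};
xhr.open('GET', '/');
xhr.send();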

An HTTP request and its response have a one-to-one mapping: you can return an HTTP response only once, however long it may be. There are two common techniques:
Comet responses - Since the response is a writable stream, you can write data to it from time to time before ending it (a minimal sketch follows this list). Make sure the read timeout is properly configured on the client that will be receiving this data.
Simply use WebSockets - These are persistent connections that allow two-way messaging between server and client.
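A minimal Comet-style sketch, assuming an Express app and a made-up /events route: the response stays open, data is written to it as it becomes available, and it is ended when done.

app.get('/events', (req, res) => {
    res.setHeader('Content-Type', 'text/plain');
    let count = 0;
    const timer = setInterval(() => {
        res.write('update ' + (++count) + '\n');
        if (count === 5) {
            clearInterval(timer);
            res.end();
        }
    }, 1000);
    // stop writing if the client disconnects early
    req.on('close', () => clearInterval(timer));
});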

Related

Streaming API - how to send data from node JS connection (long-running HTTP GET call) to frontend (React application)

We are attempting to ingest data from a streaming sports API in Node.js for our React app (a MERN app). According to their API docs: "The livestream read APIs are long running HTTP GET calls. They are designed to be used to receive a continuous stream of game actions." So we are attempting to ingest data from a long-running HTTP GET call in Node.js, to use in our React app.
Are long-running HTTP GET calls the same as websockets? We have not ingested streaming data before, so we're not sure about either of these.
So far, we have added the following code into the index.js of our node app:
index.js
...
// long-running HTTP request
http.get(`https://live.test.wh.geniussports.com/v2/basketball/read/1234567?ak=OUR_KEY`, res => {
    res.on('data', chunk => {
        console.log(`NEW CHUNK`);
        console.log(Buffer.from(chunk).toString('utf-8')); // simply console log for now
    });
    res.on('end', () => {
        console.log('ENDING');
    });
});

// Start Up The Server
const PORT = process.env.PORT || 8080;
app.listen(PORT, () => console.log(`Express server up and running on port: ${PORT}`));
This successfully connects and console-logs data in the terminal, and continues to log new data as it becomes available from the long-running GET call.
Is it possible to send this data from our http.get() request up to React? Can we create a route for a long-running request in the same way that other routes are made, and then call that route from React?
Server-sent events work like this, with a text/event-stream content-type header.
Server-sent events have a number of benefits over WebSockets, so I wouldn't say it's an outdated technique, but it looks like this provider rolled their own SSE, which is definitely not great.
My recommendation is to just use SSE for your use case: it has built-in browser support and is ideal for a stream of read-only data.
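A minimal SSE sketch of that recommendation, assuming an Express app; the /scores route name and the UPSTREAM_URL placeholder are made up, and the upstream request mirrors the http.get call from the question.

const https = require('https'); // the feed URL is https, so use the https module

app.get('/scores', (req, res) => {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        Connection: 'keep-alive',
    });
    const upstream = https.get(UPSTREAM_URL, apiRes => {
        apiRes.on('data', chunk => {
            // each SSE frame is "data: <payload>\n\n"
            res.write('data: ' + chunk.toString('utf-8').replace(/\n/g, ' ') + '\n\n');
        });
        apiRes.on('end', () => res.end());
    });
    // stop the upstream request if the browser closes the EventSource
    req.on('close', () => upstream.destroy());
});

// In React: const es = new EventSource('/scores'); es.onmessage = e => console.log(e.data);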
To answer question #1:
Websockets are different from long-running HTTP GET calls. Websockets are full-duplex connections that let you send messages of any length in either direction. When a Websocket is created, it uses HTTP for the handshake, but then changes the protocol using the Upgrade: header. The Websockets protocol itself is just a thin layer over TCP, to enable discrete messages rather than a stream. It's pretty easy to design your own protocol on top of Websockets.
To answer question #2:
I find Websockets flexible and easy to use, though I haven't used the server-sent events that Evert describes in his answer. Websockets could do what you need (as could SSE according to Evert).
If you go with Websockets, note that it's possible to queue a message to be sent, and then (like any network connection) have the Websocket close before it's sent, losing any unsent messages. Thus, you probably want to build in acknowledgements for important messages. Also note that Websockets close after about a minute of being idle, at least on Android, so you need to either send heartbeat messages or be prepared to reconnect as needed.
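A rough sketch of those two points (acknowledgements and heartbeats) using the 'ws' package, which is not mentioned in the question; the message format here is made up.

const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8081 });

wss.on('connection', ws => {
    ws.isAlive = true;
    ws.on('pong', () => { ws.isAlive = true; });

    ws.on('message', raw => {
        const msg = JSON.parse(raw);
        // acknowledge important messages so the sender can re-send lost ones
        ws.send(JSON.stringify({ type: 'ack', id: msg.id }));
    });
});

// heartbeat: ping every 30 seconds and drop connections that never answered
setInterval(() => {
    wss.clients.forEach(ws => {
        if (!ws.isAlive) return ws.terminate();
        ws.isAlive = false;
        ws.ping();
    });
}, 30000);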
If you decide to go with websockets, I would recommend this approach, using socket.io on both client and server:
Server-Side code:
const https = require('https'); // use the https module for the https:// feed URL
const server = require('http').createServer();
const io = require('socket.io')(server);
const PORT = process.env.PORT || 8080;

io.on("connection", (socket) => {
    https.get(`https://live.test.wh.geniussports.com/v2/basketball/read/1234567?ak=OUR_KEY`, res => {
        res.on('data', chunk => {
            console.log(`NEW CHUNK`);
            let dataString = Buffer.from(chunk).toString('utf-8');
            socket.emit('socketData', { data: dataString });
        });
        res.on('end', () => {
            console.log('ENDING');
        });
    });
});

server.listen(PORT);
Client-Side code:
import { io } from 'socket.io-client';

const socket = io('<your-backend-url>');

socket.on('socketData', (data) => {
    // data is the same data object that we emitted from the server
    console.log(data);
    // use your websocket data
});

Node.js Express block

My problem is that I'm planning to use Express to hold on to all requests I receive for a certain amount of time, and then send all the responses at once.
But unfortunately I can't receive a second request until I've responded to the first one, so I guess Node/Express is somehow blocking the further processing of other requests.
I built a minimal working example so you can see better what I'm talking about.
var express = require('express');
var app = express();
var ca = [];

app.get('/hello.txt', function (req, res) {
    ca.push(res);
    console.log("Push");
});

setInterval(function () {
    while (ca.length) {
        var res = ca.shift();
        res.send('Hello World');
        console.log("Send");
    }
}, 9000);

var server = app.listen(3000, function () {
    console.log('Listening on port %d', server.address().port);
});
When I send just one request to localhost:3000 and wait 9 seconds, I'm able to send a second one. But when I send both without waiting for the interval callback, the second one is blocked until the first interval has fired.
Long story short: why is this blocking happening, and what ways are there to avoid it?
PS: It seems that the default http package behaves differently: http://blog.nemikor.com/2010/05/21/long-polling-in-nodejs/
Try it with Firefox and Chrome (one request from each) to prevent the browser from serializing the requests...
OK, I've got the solution.
The issue wasn't in my code; it was caused by Chrome. It seems that Chrome serializes requests that target the same URL: it does send both requests eventually, but it won't serve the second request with the response of the first.
Anyway, thanks for your help!
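Not from the original thread, but a commonly suggested workaround is to make each request URL unique so Chrome has no reason to serialize them; the server code from the question can stay unchanged.

// browser-side sketch: a throwaway query string makes the two requests
// distinct, so Chrome issues them concurrently instead of back to back
fetch('/hello.txt?_=' + Date.now())
    .then(res => res.text())
    .then(text => console.log(text));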

Node JS, delayed response

var http = require('http');

var s = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.write('Hello\n');
    setTimeout(function () {
        res.end(' World\n');
    }, 2000);
    console.log("Hello");
});

s.listen(8080);
After starting the above server, I run:
curl http://127.0.0.1:8080
I get the required delay. Output:
Hello <2 seconds> World
But in the browser the whole content loads after 2 seconds:
Hello World <together after 2s>
What am I doing wrong?
The following piece of code opens a response stream and writes to it over time. In curl you'll get "Hello" first and "World" after 2 seconds (since you've set a timer of 2000 milliseconds).
res.write('Hello\n');
setTimeout(function () {
    res.end(' World\n');
}, 2000);
But the browser renders it only after the complete response stream is received; that is why you see everything after 2 seconds.
This is purely the browser's behavior: it doesn't use the response stream until the whole response has arrived. Once the stream is closed, the whole response is rendered at once. (In PHP, by comparison, there is a way to flush the response stream if need be.)
However, if you're looking to stream data on a frequent basis, this isn't the best way to do it. I'd suggest using a Comet technique or websockets instead.
I hope this is what you are looking for.
// simulate delayed response
app.use((req, res, next) => {
    setTimeout(() => next(), 2000);
});
Browser behavior is different from curl: the browser will not render the page until you call res.end(). So if you want to load part of a web page after a delay, you need to load that second part separately, via a websocket or an AJAX request. I recommend websockets; take a look at socket.io, which is a simple way of using websockets in Node.js.
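As an illustration of the AJAX variant (the /world route and the markup are made up for this sketch, assuming an Express app named app):

// server: send the page shell right away, expose the delayed part on its own route
app.get('/', (req, res) => {
    res.send(
        '<div id="out">Hello</div>' +
        '<script>fetch("/world").then(r => r.text())' +
        '.then(t => { document.getElementById("out").textContent += t; });</script>'
    );
});
app.get('/world', (req, res) => {
    setTimeout(() => res.send(' World'), 2000);
});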

respond to each request without having to wait until current stream is finished

I'm testing streaming by creating a basic Node.js app that basically streams a file to the response. Using code from here and here.
But if I make a request from http://127.0.0.1:8000/, then open another browser and request another file, the second file will not start to download until the first one is finished. In my example I created a 1 GB file: dd if=/dev/zero of=file.dat bs=1G count=1
But if I request three more files while the first one is downloading, the three files will start downloading simultaneously once the first file has finished.
How can I change the code so that it responds to each request as it's made, instead of waiting for the current download to finish?
var http = require('http');
var fs = require('fs');
var i = 1;

http.createServer(function (req, res) {
    console.log('starting #' + i++);

    // This line opens the file as a readable stream
    var readStream = fs.createReadStream('file.dat', { bufferSize: 64 * 1024 });

    // This will wait until we know the readable stream is actually valid before piping
    readStream.on('open', function () {
        console.log('open');
        // This just pipes the read stream to the response object (which goes to the client)
        readStream.pipe(res);
    });

    // This catches any errors that happen while creating the readable stream (usually invalid names)
    readStream.on('error', function (err) {
        res.end(err);
    });
}).listen(8000);
console.log('Server running at http://127.0.0.1:8000/');
Your code seems fine the way it is.
I checked it with node v0.10.3 by making a few requests in multiple term sessions:
$ wget http://127.0.0.1:8000
Two requests ran concurrently.
I get the same result when using two different browsers (i.e. Chrome & Safari).
Further, I can get concurrent downloads in Chrome by just changing the request url slightly, as in:
http://localhost:8000/foo
and
http://localhost:8000/bar
The behavior you describe seems to manifest when making multiple requests from the same browser for the same url.
This may be a browser limitation - it looks like the second request isn't even made until the first is completed or cancelled.
To answer your question, if you need multiple client downloads in a browser (a rough sketch of both points follows):
Ensure that your server code is implemented such that the file-to-URL mapping is one-to-many (i.e. using a wildcard).
Ensure your client code (i.e. JavaScript in the browser) uses a different URL for each request.
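One way to sketch this, reusing the server from the question (which already ignores the path, so any URL maps to the same file); the link generation here is made up for illustration.

// browser-side sketch: each download gets its own URL (the query string differs),
// while the server from the question serves the same file regardless of path
['/file.dat?n=1', '/file.dat?n=2', '/file.dat?n=3'].forEach(path => {
    const a = document.createElement('a');
    a.href = 'http://127.0.0.1:8000' + path;
    a.textContent = 'Download ' + path;
    document.body.appendChild(a);
});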

Sending an http response outside of the route function?

So, I have a route function like the following:
var http = require('http').createServer(start);

function start(req, res) {
    // routing stuff
}
and below that, I have a socket.io event listener:
io.sockets.on('connection', function (socket) {
    socket.on('event', function (data) {
        // perform an http response
    });
});
When the socket event 'event' is called, I would like to perform an http response like the following:
res.writeHead(200, {
    'Content-Disposition': 'attachment; filename=file.zip',
    'Content-Type': 'application/zip'
});
var filestream = fs.createReadStream('file.zip');
filestream.on('data', function (chunk) {
    res.write(chunk);
});
filestream.on('end', function () {
    res.end();
});
This last part, when performed within the routing function works just fine, but unfortunately when it is called from the socket event, it of course does not work, because it has no reference to the 'req' or 'res' objects. How would I go about doing this? Thanks.
Hmmm... interesting problem.
It's not impossible to do what you're trying to do; the flow would be something like this:
Receive http request, don't respond, keep res object saved somewhere.
Receive websocket request, do your auth/"link" it to the res object saved earlier.
Respond with file via res.
BUT it's not very pretty for a few reasons:
You need to keep res objects saved, if your server restarts a whole bunch of response objects are lost.
You need to figure out how to link websocket clients to http request clients. You could do something with cookies/localstorage to do this, I think.
Scaling to another server will become a lot harder / will you proxy clients to always be served by the same server somehow? Otherwise the linking will get harder.
I would propose a different solution: you want to run some client/server steps over websockets before letting someone download a file, right?
This question has a solution to do downloads via websocket: receive file via websocket and initiate download dialog
Sounds like it won't work on older browsers / IE, but a nice option.
Also mentions downloading via hidden iframe
Check here whether this solution is cross-browser enough for you: http://caniuse.com/#feat=datauri
Another option would be to generate a unique URL for the download, and only append it to the browser's window (either as a hidden-iframe download or as a simple download button) once you've done your logic via websocket. This option is more cross-browser friendly and easier to code.
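A rough sketch of that last option, assuming an Express app plus the existing socket.io setup from the question; the token scheme, event names, and /download route are made up for illustration.

const crypto = require('crypto');
const fs = require('fs');

const pendingDownloads = new Set(); // tokens that are currently allowed to download

io.sockets.on('connection', function (socket) {
    socket.on('event', function (data) {
        // ...websocket logic from the question...
        const token = crypto.randomBytes(16).toString('hex');
        pendingDownloads.add(token);
        // the client appends this URL as a hidden iframe or a download button
        socket.emit('downloadReady', { url: '/download/' + token });
    });
});

app.get('/download/:token', function (req, res) {
    // one-time token: reject unknown or already-used tokens
    if (!pendingDownloads.delete(req.params.token)) {
        return res.sendStatus(404);
    }
    res.writeHead(200, {
        'Content-Disposition': 'attachment; filename=file.zip',
        'Content-Type': 'application/zip'
    });
    fs.createReadStream('file.zip').pipe(res);
});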
