request.on in http.createServer(function(request,response) {}); - node.js

var http = require('http');
var map = require('through2-map');

var uc = map(function (ch) {
  return ch.toString().toUpperCase();
});

var server = http.createServer(function (request, response) {
  request.on('data', function (chunk) {
    if (request.method == 'POST') {
      // change the data from request to uppercase letters and
      // pipe to response.
    }
  });
});

server.listen(8000);
I have two questions about the code above. First, I read the documentation for request; it says that request is an instance of IncomingMessage, which implements the Readable Stream interface. However, I couldn't find an .on method in the Stream documentation, so I don't know what chunk in the callback passed to request.on contains. Secondly, I want to do some manipulation of the data from request and pipe it to response. Should I pipe from chunk or from request? Thank you for your consideration!

Is chunk a stream?
Nope. The stream is the flow through which the chunks of the whole data are sent.
A simple example: if you read a 1 GB file, a stream will read it in chunks of, say, 10 KB, and each chunk will go through your stream, from beginning to end, in the right order.
I use a file as an example, but sockets, requests, and other streams are based on the same idea.
Also, whenever someone sends a request to this server, would that entire thing be a chunk?
In the particular case of HTTP requests, only the body is a stream: the posted files/data on the request side, or the body on the response side. Headers are treated as objects applied to the request before the body is written to the socket.
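For instance, a minimal sketch of that split (illustrative only):
var http = require('http');

http.createServer(function (request, response) {
  // headers are already parsed into a plain object:
  console.log(request.headers['content-type']);
  // the body, in contrast, arrives chunk by chunk as a stream:
  request.on('data', function (chunk) {
    console.log('got %d bytes of body', chunk.length);
  });
  request.on('end', function () {
    response.end('done\n');
  });
}).listen(8000);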
A small example to help you with some concrete code:
var through2 = require('through2');
var Readable = require('stream').Readable;

var s1 = through2(function transform(chunk, enc, cb) {
  console.log("s1 chunk %s", chunk.toString());
  cb(null, chunk.toString() + chunk.toString());
});
var s2 = through2(function transform(chunk, enc, cb) {
  console.log("s2 chunk %s", chunk.toString());
  cb(null, chunk);
});
s2.on('data', function (data) {
  console.log("s2 data %s", data.toString());
});
s1.on('end', function () {
  console.log("s1 end");
});
s2.on('end', function () {
  console.log("s2 end");
});

var rs = new Readable({ read: function () {} }); // no-op read; we push the data manually below
rs.push('beep '); // this is a chunk
rs.push('boop');  // this is a chunk
rs.push(null);    // this is the signal to end the stream
rs.on('end', function () {
  console.log("rs end");
});

console.log(
  ".pipe always returns the destination stream: %s", rs.pipe(s1) === s1
);
s1.pipe(s2);
I would also suggest some further reading:
https://github.com/substack/stream-handbook
http://maxogden.com/node-streams.html
https://github.com/maxogden/mississippi

All streams are instances of EventEmitter (docs); that is where the .on method comes from.
Regarding the second question, you MUST pipe from the Stream object (request in this case). The 'data' event emits the data as a Buffer or a String (the chunk argument in the event listener), not as a stream.
Manipulating streams is usually done by implementing a Transform stream (docs). There are many npm packages that make this simpler (like through2-map), but under the hood they still produce Transform streams.
Consider the following:
var http = require('http');
var map = require('through2-map');

var server = http.createServer(function (request, response) {
  // Create a fresh Transform stream per request; a single shared
  // transform would end after the first response and break later requests.
  var uc = map(function (ch) {
    return ch.toString().toUpperCase();
  });

  // Pipe from the request, through the transform stream, to the response
  request
    .pipe(uc)
    .pipe(response);
});

server.listen(8000);
You can test by running curl:
$ curl -X POST -d 'foo=bar' http://localhost:8000
# logs FOO=BAR
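For reference, here is roughly what through2-map builds for you, sketched with the core stream module (illustrative, not part of the original answer):
var Transform = require('stream').Transform;

// roughly equivalent to map(function(ch){ return ch.toString().toUpperCase(); })
var uc = new Transform({
  transform: function (chunk, enc, callback) {
    // pass the uppercased chunk downstream
    callback(null, chunk.toString().toUpperCase());
  }
});
It plugs in exactly the same way: request.pipe(uc).pipe(response).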

Related

Cannot pipe after data has been emitted from the response nodejs

I've been experiencing a problem with the request library for Node.js. When I try to pipe the response to both a file and a stream, I get the error: "you cannot pipe after data has been emitted from the response". This is because I do some calculations before actually piping the data.
Example:
var request = require('request')
var fs = require('fs')
var through2 = require('through2')
var options = {
  url: 'url-to-fetch-a-file'
};

var req = request(options);

req.on('response', function (res) {
  // Some computations that potentially remove files.
  // These computations take quite some time.
  // createPath creates the path recursively.
  createPath(path, function () {
    var file = fs.createWriteStream(path + fname);
    var stream = through2.obj(function (chunk, enc, callback) {
      this.push(chunk);
      callback();
    });
    req.pipe(file);
    req.pipe(stream);
  });
});
If I just pipe to the stream without any calculations, it works fine. How can I pipe to both a file and a stream using the request module in Node.js?
I found this: Node.js Piping the same readable stream into multiple (writable) targets, but it is not the same thing. There, the piping happens twice in different ticks. This example pipes like the answer in that question and still receives an error.
Instead of piping directly to the file, you can add a listener to the stream you defined. So you can replace req.pipe(file) with:
stream.on('data', function (data) {
  file.write(data);
});
stream.on('end', function () {
  file.end();
});
or
stream.pipe(file)
This will pause the stream until it is read, something that doesn't happen with the request module.
More info: https://github.com/request/request/issues/887
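Putting it together, a minimal sketch of the whole handler under the question's assumptions (createPath, path, and fname come from the question): pipe into the through2 stream immediately in the 'response' handler, and attach the file destination once the slow work finishes.
var request = require('request');
var fs = require('fs');
var through2 = require('through2');

var req = request({ url: 'url-to-fetch-a-file' });

req.on('response', function (res) {
  // Pipe into the through2 stream immediately, in the same tick as
  // 'response'; it will buffer (and exert backpressure) until read.
  var stream = through2.obj(function (chunk, enc, callback) {
    this.push(chunk);
    callback();
  });
  req.pipe(stream);

  // The slow work can now happen after piping has started.
  createPath(path, function () {
    var file = fs.createWriteStream(path + fname);
    stream.pipe(file); // attach the file destination later
  });
});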

Having trouble streaming response to client using expressjs

I am having a really hard time wrapping my head around how to stream data back to my client when using Node.js/Express.
I am grabbing a lot of data from my database in chunks, and I would like to stream it back to the client as I get it, so that I do not have to store the entire dataset in memory as a JSON object before sending it back.
I would like the data to stream back as a file, i.e. I want the browser to ask my users what to do with the file on download. Previously I was creating a file-system write stream, streaming the contents of my data to the file system, and sending the file back to the client when done. I would like to eliminate the middle man (the tmp file on the file system) and just stream the data to the client.
app.get(
  '/api/export',
  function (req, res, next) {
    var notDone = true;
    while (notDone) {
      var partialData = // grab partial data from database (maybe first 1000 records);
      // stream this partial data as a string to res???
      if (checkIfDone) notDone = false;
    }
  }
);
I can call res.write("some string data") and then call res.end() when I am done. However, I am not 100% sure that this actually streams the response to the client as I write. It seems like Express is storing all the data until I call end and then sending the response. Is that true?
What is the proper way to stream string chunks of data to a response using Express?
The response object is already a writable stream. Express handles sending chunked data automatically, so you won't need to do anything extra but:
response.send(data)
You may also want to check out the built-in pipe method, http://nodejs.org/api/stream.html#stream_event_pipe.
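For instance, a sketch of the pipe approach (the source file name here is purely illustrative; any Readable works):
var fs = require('fs');

app.get('/api/export', function (req, res) {
  // any Readable stream can be piped straight into the response
  fs.createReadStream('big-export.json').pipe(res);
});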
You can do this by setting the appropriate headers and then just writing to the response object. Example:
var basename = require('path').basename; // used by contentDisposition() below

res.writeHead(200, {
  'Content-Type': 'text/plain',
  'Content-Disposition': contentDisposition('foo.data')
});
var c = 0;
var interval = setInterval(function () {
  res.write(JSON.stringify({ foo: Math.random() * 100, count: ++c }) + '\n');
  if (c === 10) {
    clearInterval(interval);
    res.end();
  }
}, 1000);

// extracted from Express, used by `res.download()`
function contentDisposition(filename) {
  var ret = 'attachment';
  if (filename) {
    filename = basename(filename);
    // if filename contains non-ascii characters, add a utf-8 version ala RFC 5987
    ret = /[^\040-\176]/.test(filename)
      ? 'attachment; filename="' + encodeURI(filename) + '"; filename*=UTF-8\'\'' + encodeURI(filename)
      : 'attachment; filename="' + filename + '"';
  }
  return ret;
}
Also, Express/Node does not buffer data written to a socket unless the socket is paused (either explicitly or implicitly due to backpressure). Data buffered by Node in this paused state may or may not be combined with other data chunks that are already buffered. You can check the return value of res.write() to determine if you should continue writing to the socket: if it returns false, listen for the 'drain' event and then continue writing.
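A minimal sketch of that check-and-drain loop (illustrative only, with a simple counter standing in for the question's database records):
function streamRecords(res) {
  var c = 0;
  (function writeNext() {
    var ok = true;
    while (c < 1000 && ok) {
      // write() returns false once the internal buffer is full
      ok = res.write(JSON.stringify({ count: ++c }) + '\n');
    }
    if (c >= 1000) {
      res.end();
    } else {
      // resume writing once the socket has drained
      res.once('drain', writeNext);
    }
  })();
}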

HTTP request stream not firing readable when reading fixed sizes

I am trying to work with the new streams API in Node.js, but I am having trouble when specifying a fixed read buffer size.
var http = require('http');
var req = http.get('http://143.226.75.100/waug_mp3_128k', function (res) {
res.on('readable', function () {
var receiveBuffer = res.read(1024);
console.log(receiveBuffer.length);
});
});
This code will receive a few buffers and then exit. However, if I add this line after the console.log() line:
res.read(0);
... all is well again. My program continues to stream as predicted.
Why is this happening? How can I fix it?
It's explained here.
As far as I understand it, by reading only 1024 bytes with each readable event, Node is left to assume that you're not interested in the rest of the data that's in the stream buffers, and discards it. Issuing the read(0) (in the same event loop iteration) 'resets' this behaviour. I'm not sure why the process exits after reading a couple of 1024-byte buffers though; I can recreate it, but I don't understand it yet :)
If you don't have a specific reason to use the 1024-byte reads, just read the entire buffer for each event:
var receiveBuffer = res.read();
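If you do need fixed-size reads, the usual pattern (a minimal sketch, not from the original answer) is to keep calling read(1024) inside the readable handler until it returns null, so the buffered remainder is never discarded:
res.on('readable', function () {
  var receiveBuffer;
  // read(1024) returns null once fewer than 1024 bytes are buffered
  while ((receiveBuffer = res.read(1024)) !== null) {
    console.log(receiveBuffer.length);
  }
});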
Or, instead of using non-flowing mode, use flowing mode via the data/end events:
var http = require('http');
var req = http.get('http://143.226.75.100/waug_mp3_128k', function (res) {
  var chunks = [];
  res.on('data', function (chunk) {
    chunks.push(chunk);
    console.log('chunk:', chunk.length);
  });
  res.on('end', function () {
    var result = Buffer.concat(chunks);
    console.log('final result:', result.length);
  });
});

Node.js - Writing data to a writable stream

In my Node application I'm writing data to a file using the write method on the stream returned by createWriteStream. Now I need to find out whether the write for a particular stream is complete or not. How can I find that out?
var fs = require('fs');
var data = 'some data'; // placeholder for whatever you are writing

var stream = fs.createWriteStream('myFile.txt', { flags: 'a' });
var result = stream.write(data);

writeToStream();

function writeToStream() {
  var result = stream.write(data + '\n');
  if (!result) {
    // pass the function reference; calling it here would recurse immediately
    stream.once('drain', writeToStream);
  }
}
I need to call another method every time a write completes. How can I do this?
From the Node.js WritableStream.write(...) documentation: you can give the write method a callback that is called when the written data is flushed:
var stream = fs.createWriteStream('myFile.txt', {flags: 'a'});
var data = "Hello, World!\n";
stream.write(data, function() {
// Now the data has been written.
});
Note that you probably don't need to wait for each call to write to complete before queueing the next call. Even if the write method returns false you can still call subsequent writes, and Node will buffer the pending write requests in memory.
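For example (a small sketch of that behaviour): you can queue several writes back to back and use the callback of end() to learn when everything queued has been flushed:
var fs = require('fs');

var stream = fs.createWriteStream('myFile.txt', { flags: 'a' });
stream.write('first line\n');
stream.write('second line\n'); // queued even if the first has not flushed yet
stream.end('last line\n', function () {
  console.log('all queued data has been flushed');
});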
I am using maerics's answer along with some error handling. The flag 'a' is used to open the file for appending; the file is created if it does not exist. There are other flags you can use.
var fs = require('fs');
var outPutData = 'some output data'; // placeholder for the data being written

// Create a writable stream in append mode and attach finish/error handlers
var writerStream = fs.createWriteStream('MockData/output.txt', { flags: 'a' })
  .on('finish', function () {
    console.log("Write Finish.");
  })
  .on('error', function (err) {
    console.log(err.stack);
  });

// Write the data to the stream
writerStream.write(outPutData, function () {
  // Now the data has been written.
  console.log("Write completed.");
});

// Mark the end of the file
writerStream.end();

Node.js - How to get stream into string

I have got a stream and I need to get the stream's content into a string. I stream from the internet using http.get. I also write the stream into a file, but I don't want to write the file and then open that same file and read from it...
So I need to convert the stream into a string.
Thanks for any advice...
var http = require('http');
var string = '';
var request = http.get("http://www.google.cz", function (response) {
  response.on('data', function (chunk) {
    string += chunk.toString();
  });
  response.on('end', function () {
    console.log(string);
  });
});
This works for sure. I am using it.
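A note on encoding (an addition, not from the original answer): concatenating chunk.toString() can split a multi-byte UTF-8 character that straddles a chunk boundary. A safer sketch collects raw Buffers and decodes once at the end:
var http = require('http');

http.get('http://www.google.cz', function (response) {
  var chunks = [];
  response.on('data', function (chunk) {
    chunks.push(chunk); // keep the raw Buffers
  });
  response.on('end', function () {
    // decode once at the end so multi-byte characters are never split
    var string = Buffer.concat(chunks).toString('utf8');
    console.log(string);
  });
});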
Instead of building the string in a second nested function on 'end', you can accumulate it as the data arrives using toString() on each chunk, or pipe the response wherever you want it to go (for example through a through stream that assigns it to a variable or uses it directly).
var http = require('http');
var string = '';
var request = http.get('http://www.google.cz', function (response) {
  response.on('data', function (data) {
    string += data.toString();
    // any other code you may want
  });
});
request.on('error', function (err) {
  console.log(err);
});
// anything else
One last note: the http.get() method takes two parameters, the URL and a callback. The callback receives the response stream, not an (err, data) pair; errors are emitted on the request object returned by http.get(), which may be why you were getting nothing back.
