Writing a streaming response from a streaming query in Koa with Mongoose - node.js

I'm trying to send a large result-set from a Mongo database to the user of a Koa application (using Mongoose).
I originally had something like:
var res = yield Model.find().limit(500).exec();
this.body = {data: res};
However, the size of the result set being sent was causing the application to time out, and as such I'd like to stream the response as it comes from the database.
With Mongoose you can turn the result of a query into a stream by doing something like:
var stream = Model.find().limit(300).stream();
However, I'm not sure how to write this stream into the response while preserving the format needed. I want something like this to happen:
this.body.write("{data: "});
this.body.write(stream);
this.body.write("}");
but I know there is no body.write in Koa and I'm sure I'm not using streams properly either. Can someone point me in the right direction?

koa-write may help, but you might not need it. Koa allows you to do:
this.body = stream;
In your case you can create a transform stream since the mongoose stream isn't exactly what you want to output.
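For illustration, here is a minimal sketch of that approach, assuming Mongoose's .stream() emits one document per 'data' event (JSONArrayTransform is a hand-rolled name, not a library API):

var Transform = require('stream').Transform;
var util = require('util');

// Wraps a stream of documents into the string form {"data":[doc1,doc2,...]}
function JSONArrayTransform() {
  Transform.call(this, { objectMode: true });
  this.first = true;
}
util.inherits(JSONArrayTransform, Transform);

JSONArrayTransform.prototype._transform = function (doc, enc, done) {
  this.push((this.first ? '{"data":[' : ',') + JSON.stringify(doc));
  this.first = false;
  done();
};

JSONArrayTransform.prototype._flush = function (done) {
  this.push(this.first ? '{"data":[]}' : ']}');
  done();
};

// Inside the Koa middleware:
this.type = 'application/json';
this.body = Model.find().limit(300).stream().pipe(new JSONArrayTransform());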

Related

node.js - res.end vs fs.createWriteStream

I am rather new to Node and am attempting to learn streaming; please correct me if my understanding is flawed.
Using fs.createReadStream and fs.createWriteStream together with the .pipe method will effectively stream any kind of data.
Also, the res.end method utilizes streaming by default.
So could we use fs.createReadStream together with res.end to create the same streaming effect?
How would this look?
Under what circumstances would you normally use res.end?
Thank you
You can use pipe to stream any readable stream into the response:
readStream.pipe(res);
See this answer for a working example of using it.
Basically it's something like:
var s = fs.createReadStream(file);
s.on('open', function () {
  s.pipe(res);
});
plus some error handling and MIME type support - see this for the full code (a rough sketch also follows the list below):
How to serve an image using nodejs
where you can find it used in three examples using three node modules:
express
connect
http
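As a rough sketch of what that error handling and MIME type support might look like with the plain http module (the file path and content type here are assumptions for illustration):

var http = require('http');
var fs = require('fs');
var path = require('path');

http.createServer(function (req, res) {
  var file = path.join(__dirname, 'image.png'); // hypothetical file
  var s = fs.createReadStream(file);

  s.on('open', function () {
    res.setHeader('Content-Type', 'image/png'); // assumed MIME type
    s.pipe(res);
  });

  s.on('error', function () {
    res.statusCode = 404;
    res.end('Not found');
  });
}).listen(3000);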

Can npm request module be used in a .pipe() stream?

I am parsing a JSON file using a parsing stream module and would like to stream results to request like this:
var request = require('request');
var fs = require('fs');

fs.createReadStream('./data.json')
  .pipe(parser)
  .pipe(streamer)
  .pipe(new KeyFilter({ find: 'URL' }))
  .pipe(request)
  .pipe( etc ... )
(Note: KeyFilter is a custom transform that works fine when piping to process.stdout)
I've read the docs and source code. This won't work with 'request' or 'new request()' because the constructor wants a URL.
It will work with request.put(), like this: yourStream.pipe(request.put('http://example.com/form'))
After more research and experimenting, I've concluded that request cannot be used in this way. The short answer is that request creates a readable stream, and .pipe() expects a writable stream.
I made several attempts to wrap request in a transform to get around this, with no luck. While you can receive the piped URL and create a new request, I can't figure out how to reset the pipe callbacks without some truly unnatural bending of the stream pattern. Any thoughts would be appreciated, but I have moved on to using an event on the URL stream to kick off a new request(url).pipe(etc) type stream, as sketched below.
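A minimal sketch of that event-based workaround, reusing the pipeline from the question (parser, streamer and KeyFilter are assumed from there, as is KeyFilter emitting one URL per 'data' event; the destination is a placeholder):

fs.createReadStream('./data.json')
  .pipe(parser)
  .pipe(streamer)
  .pipe(new KeyFilter({ find: 'URL' }))
  .on('data', function (url) {
    // start a fresh request per emitted URL instead of piping into request
    request(String(url))
      .on('error', console.error)
      .pipe(process.stdout); // or any writable destination
  });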

Decode a base64 document in ExpressJS response

I would like to store some documents in a database as base64 strings. Then when those docs are requested using HTTP, I would like ExpressJS to decode the base64 docs and return them. So something like this:
app.get('/base64', function (req, res) {
  // pdf is my base64 encoded string that represents a document
  var buffer = new Buffer(pdf, 'base64');
  res.send(buffer);
});
The code is simply to give an idea of what I'm trying to accomplish. Do I need to use a stream for this? If so, how would I do that? Or should I be writing these docs to a temp directory and then serving up the file? Would be nice to skip that step if possible. Thanks!
UPDATE: Just to be clear I would like this to work with a typical HTTP request. So the user will click a link in his browser that will take him to a URL that returns a file from the database. Seems like it must be possible, Microsoft SharePoint stores serialized files in a SQL database and returns those files over http requests, and I don't believe it writes all those files to a temp location first. I'm feeling like a nodejs stream may be the answer, but I'm not very familiar with streaming.
Before saving a file representation to the DB, you can just use the toString method with base64 encoding:
var base64pdf = pdf.toString('base64');
After you get the base64 file representation from the DB, use Buffer as follows to convert it back to a file:
var decodedFile = new Buffer(base64pdf, 'base64');
More information on Buffer usages can be found here - NodeJS Buffer
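Tying that back to the route in the question, a minimal sketch (fetchPdfBase64 is a hypothetical DB lookup, and the Content-Type is an assumption; adjust it to the stored document type):

app.get('/base64', function (req, res) {
  // fetchPdfBase64 is a hypothetical function returning the stored base64 string
  fetchPdfBase64(function (err, base64pdf) {
    if (err) return res.status(500).end();
    var decodedFile = new Buffer(base64pdf, 'base64');
    res.set('Content-Type', 'application/pdf'); // assumed document type
    res.send(decodedFile);
  });
});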
As for how to send a buffer from the Express server to the client, Socket.IO should solve this issue.
Using socket.emit -
Emits an event to the socket identified by the string name. Any other parameters can be included. All data structures are supported, including Buffer. JavaScript functions can’t be serialized/deserialized.
var io = require('socket.io')();
io.on('connection', function (socket) {
  socket.emit('an event', { some: 'data' });
});
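Since Buffer is among the supported data structures, the decoded document itself could be emitted, e.g.:

socket.emit('document', new Buffer(base64pdf, 'base64'));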
The relevant documentation is on the socket.io website.

Node.js request stream ends/stalls when piped to writable file stream

I'm trying to pipe() data from Twitter's Streaming API to a file using modern Node.js Streams. I'm using a library I wrote called TweetPipe, which leverages EventStream and Request.
Setup:
var TweetPipe = require('tweet-pipe')
, fs = require('fs');
var tp = new TweetPipe(myOAuthCreds);
var file = fs.createWriteStream('./tweets.json');
Piping to STDOUT works and stream stays open:
tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(process.stdout);
Piping to the file writes one tweet and then the stream ends silently:
tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(file);
Could anyone tell me why this happens?
It's hard to say from what you have here, but it sounds like the stream is getting cleaned up before you expect. This can be triggered in a number of ways; see here: https://github.com/joyent/node/blob/master/lib/stream.js#L89-112
A stream could emit 'end', and then something just stops.
Although I doubt this is the problem, one thing that concerns me is this
https://github.com/peeinears/tweet-pipe/blob/master/index.js#L173-174
destroy() should be called after 'error' is emitted.
I would normally debug a problem like this by adding logging statements until I can see what is not happening right.
Can you post a script that can be run to reproduce?
(for extra points, include a package.json that specifies the dependencies :)
According to this, you should create an error handler on the stream created by tp.
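A sketch of what that might look like, with extra lifecycle logging to help narrow down where the pipeline stops (the event names are standard stream events):

var source = tp.stream('statuses/filter', { track: ['bieber'] });

source.on('error', function (err) { console.error('source error:', err); });
source.on('end', function () { console.log('source ended'); });
file.on('error', function (err) { console.error('file error:', err); });
file.on('close', function () { console.log('file closed'); });

source.pipe(tp.stringify()).pipe(file);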

Serialize node.js stream object (which is circular) to JSON work around

I need to serialize the stream object that is passed to the callback of net.createServer():
var server = net.createServer(function (stream) {
  var json = JSON.stringify(stream);
});
However, when I do this I get a TypeError, because the stream object contains circular references.
Is there a workaround for this?
@Jason is correct here. You want to take the data from the stream, not the stream itself, and put it into Redis. To do this, add event listeners to the stream for the 'data' and 'end' events. Each 'data' callback hands you a chunk, which you can either write to Redis in pieces or assemble in memory and write as a whole when the 'end' callback fires. Here's an example you can follow; a minimal sketch is also below.
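A minimal sketch of the assemble-then-write variant (the Redis client usage follows the node_redis set API; the key name is made up):

var net = require('net');
var redis = require('redis');
var client = redis.createClient();

var server = net.createServer(function (stream) {
  var chunks = [];
  stream.on('data', function (chunk) {
    chunks.push(chunk); // collect each chunk as it arrives
  });
  stream.on('end', function () {
    // assemble and store the complete payload once the stream ends
    client.set('stream:payload', Buffer.concat(chunks).toString());
  });
});
server.listen(8000);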
