Piping a stream to a stream with custom results - node.js

After reading and marginally understanding the node stream handbook, I want to use streams whenever it seems appropriate/possible.
I have a request that uploads a file which should be written to another spot on the file system. This is done via:
readStream = fs.createReadStream(request.files.file.path);
readStream.pipe(fs.createWriteStream(targetPath));
This works great, but I want to pipe the result of the write stream to a response -- specifically I want the target path to be piped to the result when it's successful. Right now I'm doing:
readStream.pipe(fs.createWriteStream(targetPath)).on("close", function () {
  serverResponse.send(200, targetPath);
});
This works fine, but I feel like it is more verbose than it needs to be, and that I should be able to call .pipe on the result, as in read.pipe(write).pipe(response).
Is there something I can do to get the write stream to pipe the target path to the response or better way I can go about doing what I'm doing?
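A minimal sketch of one way to flatten this, using Node's built-in stream.pipeline helper (assuming serverResponse supports the same send(status, body) call used above; the error branch is an assumption):

const { pipeline } = require('stream');
const fs = require('fs');

pipeline(
  fs.createReadStream(request.files.file.path),
  fs.createWriteStream(targetPath),
  function (err) {
    if (err) {
      serverResponse.send(500, err.message); // error handling is an assumption
    } else {
      // The callback only fires once the write stream has fully finished.
      serverResponse.send(200, targetPath);
    }
  }
);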

Related

Unable to use one readable stream to write to two different targets in Node JS

I have a client side app where users can upload an image. I receive this image in my Node JS app as readable data and then manipulate it before saving like this:
uploadPhoto: async (server, request) => {
  try {
    const randomString = `${uuidv4()}.jpg`;
    const stream = Fse.createWriteStream(`${rootUploadPath}/${userId}/${randomString}`);
    const resizer = Sharp()
      .resize({
        width: 450
      });
    await data.file
      .pipe(resizer)
      .pipe(stream);
This works fine, and writes the file to the project's local directory. The problem comes when I try to use the same readable data again in the same async function. Please note, all of this code is in a try block.
const stream2 = Fse.createWriteStream(`${rootUploadPath}/${userId}/thumb_${randomString}`);
const resizer2 = Sharp()
  .resize({
    width: 45
  });
await data.file
  .pipe(resizer2)
  .pipe(stream2);
The second file is written, but when I check it, it seems corrupted or the data wasn't written successfully. The first image is always fine.
I've tried a few things, and found one method that seems to work, but I don't understand why. I add this code just before I create the second write stream:
data.file.on('end', () => {
  console.log('There will be no more data.');
});
Putting the code for the second write stream inside the on-end callback doesn't make a difference; however, if I leave the code outside of the callback, between the first write stream code and the second write stream code, then it works and both files are successfully written.
It doesn't feel right leaving the code the way it is. Is there a better way I can write the second thumbnail image? I've tried using the Sharp module to read the file after the first write stream writes the data, and then create a smaller version of it, but it doesn't work. The file never seems to be ready to use.
You have two alternatives, depending on how your software is designed.
If possible, I would avoid executing two transform operations on the same stream in the same "context", e.g. an API endpoint. I would rather separate the two transforms so they do not work on the same input stream.
If that is not possible or would require too many changes, the solution is to fork the input stream and then pipe it into two different Writables. I normally use Highland.js fork for these tasks.
Please also see my comments on how to properly handle streams with async/await to check when the write operation is finished.
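As a rough sketch of the fork-and-pipe idea, here is one possible shape using two PassThrough streams and stream/promises.pipeline rather than Highland.js (rootUploadPath, userId, randomString and the Sharp resize widths are taken from the question; this assumes it runs inside the async uploadPhoto handler):

const { PassThrough } = require('stream');
const { pipeline } = require('stream/promises');

// Fork the incoming readable into two independent branches.
const full = new PassThrough();
const thumb = new PassThrough();
data.file.pipe(full);
data.file.pipe(thumb);

// Each branch gets its own Sharp transform and write stream;
// pipeline() resolves only once the corresponding file is fully written.
await Promise.all([
  pipeline(full, Sharp().resize({ width: 450 }),
    Fse.createWriteStream(`${rootUploadPath}/${userId}/${randomString}`)),
  pipeline(thumb, Sharp().resize({ width: 45 }),
    Fse.createWriteStream(`${rootUploadPath}/${userId}/thumb_${randomString}`)),
]);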

Node Pipes Handle Input

I am piping data from a text file to another pipe which downloads images from some URLs. As expected, this sends a large number of requests in quick succession and the remote server shuts me down. I would like to handle the next chunk only after the first is processed.
My code is:
read.pipe(JSONStream.parse('*'))
  .pipe(es.map(function (d, cb) {
    download_images(x, y)
      .then(function (r) ...)
      .fail(function (r) ...)
      .fin(function (f) cb())
  }))
  .pipe(xyz)
Since I have just started looking into streams, I might have missed a very simple point, or in my zeal to use streams I could have ignored a better approach.
Constraints: the JSON file is extremely large, and the images should be downloaded with a delay.
You can call read.pause() right before calling download_images() and then call read.resume() right before you call cb().
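Applied to the snippet above, that looks roughly like this (download_images, x, y and xyz are placeholders carried over from the question):

read.pipe(JSONStream.parse('*'))
  .pipe(es.map(function (d, cb) {
    read.pause();               // stop pulling new chunks while this one downloads
    download_images(x, y)
      .fin(function (f) {
        read.resume();          // allow the next chunk through
        cb();
      });
  }))
  .pipe(xyz);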

Piping to a stream that needs to be both readable and writable

I'd like to be able to pipe from the src stream to the something stream. I understand that `something` needs to be a writable stream, as it's being piped to; however, it also needs to be a readable stream, so that it can pipe to `somethingElse`. What should `something` return in order to make this work?
example.task('taskOne', function() {
  return example
    .src('pathName')
    .pipe(something())
    .pipe(somethingElse());
});
Solved!
I made use of the node module through2, which solves this exact issue.
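A minimal sketch of what something() can return with through2 (object mode is an assumption here, matching gulp-style pipelines):

var through2 = require('through2');

// through2 gives back a Transform stream: writable on the way in,
// readable on the way out, so it can sit in the middle of a pipe chain.
function something() {
  return through2.obj(function (file, enc, callback) {
    // ... inspect or modify `file` here ...
    callback(null, file); // pass it downstream to somethingElse()
  });
}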

Can npm request module be used in a .pipe() stream?

I am parsing a JSON file using a parsing stream module and would like to stream results to request like this:
var request = require("request");
fs.createReadStream('./data.json')
  .pipe(parser)
  .pipe(streamer)
  .pipe(new KeyFilter({ find: "URL" }))
  .pipe(request)
  .pipe( etc ... )
(Note: KeyFilter is a custom transform that works fine when piping to process.stdout)
I've read the docs and source code. This won't work with 'request' or 'new request()' because the constructor wants a URL.
It will work with request.put(), as in: yourStream.pipe(request.put('http://example.com/form'))
After more research and experimenting, I've concluded that request cannot be used in this way. The simple answer is that request creates a readable stream, while the .pipe() method expects a writable stream.
I made several attempts to wrap request in a transform to get around this, with no luck. While you can receive the piped URL and create a new request, I can't figure out how to reset the pipe callbacks without some truly unnatural bending of the stream pattern. Any thoughts would be appreciated, but I have moved on to using an event on the URL stream to kick off a new request(url).pipe(etc) type stream.
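Roughly, that event-based approach looks like this (parser, streamer and KeyFilter are the same pieces as above; makeDestination() is a hypothetical writable target for each URL):

fs.createReadStream('./data.json')
  .pipe(parser)
  .pipe(streamer)
  .pipe(new KeyFilter({ find: "URL" }))
  .on('data', function (url) {
    // Each URL emitted by KeyFilter starts its own request; the response
    // is a readable stream that can be piped wherever it needs to go.
    request(String(url)).pipe(makeDestination(url)); // makeDestination() is hypothetical
  });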

Node.js request stream ends/stalls when piped to writable file stream

I'm trying to pipe() data from Twitter's Streaming API to a file using modern Node.js Streams. I'm using a library I wrote called TweetPipe, which leverages EventStream and Request.
Setup:
var TweetPipe = require('tweet-pipe')
, fs = require('fs');
var tp = new TweetPipe(myOAuthCreds);
var file = fs.createWriteStream('./tweets.json');
Piping to STDOUT works and stream stays open:
tp.stream('statuses/filter', { track: ['bieber'] })
.pipe(tp.stringify())
.pipe(process.stdout);
Piping to the file writes one tweet and then the stream ends silently:
tp.stream('statuses/filter', { track: ['bieber'] })
.pipe(tp.stringify())
.pipe(file);
Could anyone tell me why this happens?
It's hard to say from what you have here; it sounds like the stream is getting cleaned up before you expect. This can be triggered in a number of ways, see here https://github.com/joyent/node/blob/master/lib/stream.js#L89-112
A stream could emit 'end', and then something just stops.
Although I doubt this is the problem, one thing that concerns me is this
https://github.com/peeinears/tweet-pipe/blob/master/index.js#L173-174
destroy() should be called after emitting 'error'.
I would normally debug a problem like this by adding logging statements until I can see what is going wrong.
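For instance, something along these lines hung onto the streams from the question, just to see which lifecycle event fires before the data stops (the event names are standard Node stream events):

var source = tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify());

// Log lifecycle events to see where the flow stops.
source.on('end',   function ()    { console.log('source: end'); });
source.on('close', function ()    { console.log('source: close'); });
source.on('error', function (err) { console.log('source: error', err); });

file.on('close', function ()    { console.log('file: close'); });
file.on('error', function (err) { console.log('file: error', err); });

source.pipe(file);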
Can you post a script that can be run to reproduce?
(for extra points, include a package.json that specifies the dependencies :)
According to this, you should create an error handler on the stream created by tp.
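In other words, something like this (tp.stream and tp.stringify as used in the question):

tp.stream('statuses/filter', { track: ['bieber'] })
  .on('error', function (err) {
    // Surface errors instead of letting the pipeline end silently.
    console.error('tweet stream error:', err);
  })
  .pipe(tp.stringify())
  .pipe(file)
  .on('error', function (err) {
    console.error('file write error:', err);
  });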
