Can npm request module be used in a .pipe() stream? - node.js

I am parsing a JSON file using a parsing stream module and would like to stream results to request like this:
var request = require("request")
var fs = require("fs")

fs.createReadStream('./data.json')
  .pipe(parser)
  .pipe(streamer)
  .pipe(new KeyFilter({ find: "URL" }))
  .pipe(request)
  .pipe( etc ... )
(Note: KeyFilter is a custom transform that works fine when piping to process.stdout)
I've read the docs and source code. This won't work with 'request' or 'new request()' because the constructor wants a URL.

It will work with request.put(), like this: yourStream.pipe(request.put('http://example.com/form'))
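For context, a slightly fuller version of that pattern (the file name and URL here are placeholders, not from the original post):

var fs = require('fs');
var request = require('request');

// request.put() returns a writable request object, so it can sit on the
// right-hand side of .pipe(); .pipe() returns that same object, so we can
// listen for the server's response on it.
fs.createReadStream('./data.json')
  .pipe(request.put('http://example.com/form'))
  .on('response', function (res) {
    console.log('upload finished with status', res.statusCode);
  });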

After more research and experimenting, I've concluded that request cannot be used in this way. The simple answer is that request creates a readable stream, while the .pipe() method expects a writable stream as its destination.
I made several attempts to wrap request in a transform to get around this, with no luck. While you can receive the piped URL and create a new request, I can't figure out how to reset the pipe callbacks without some truly unnatural bending of the stream pattern. Any thoughts would be appreciated, but I have moved on to using an event in the URL stream to kick off a new request(url).pipe(etc) style stream.
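For illustration, a minimal sketch of that event-based workaround, assuming the KeyFilter stream from the question emits one URL per 'data' event (the output path logic is invented for the example):

var fs = require('fs');
var request = require('request');

var urls = fs.createReadStream('./data.json')
  .pipe(parser)
  .pipe(streamer)
  .pipe(new KeyFilter({ find: "URL" }));

// Kick off a separate request pipeline for each URL that comes through,
// instead of trying to pipe into request itself.
urls.on('data', function (url) {
  request(String(url))
    .on('error', console.error)
    .pipe(fs.createWriteStream('./out-' + encodeURIComponent(String(url))));
});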

Related

typescript fetch response streaming

I am trying to stream a response, but I want to be able to read the response (and work with the data) while it is still being sent. I basically want to send multiple messages in one response.
It works internally in Node.js, but when I tried to do the same thing in TypeScript it doesn't work anymore.
My attempt was to make the request via fetch in TypeScript; the response comes from a Node.js server that writes parts of the response onto the response stream.
fetch('...', {
  ...
}).then((response) => {
  const reader = response.body.getReader();
  reader.read().then(({ done, value }) => {
    if (done) {
      return response;
    }
    console.log(String.fromCharCode.apply(null, value)); // just for testing purposes
  });
}).then(...)...
On the Node.js side it basically looks like this:
// doing stuff with the request
response.write(first_message)
// do some more stuff
response.write(second_message)
// do even more stuff
response.end(last_message)
In Node.js, like I said, I can just read every message once it's sent via res.on('data', ...), but reader.read in TypeScript only triggers once, and that is when the whole response has been sent.
Is there a way to make it work like I want, or do I have to look for another way?
I hope it is kinda understandable what I want to do; I noticed while writing this how much I struggled to explain it :D
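As an aside, reader.read() resolves with a single chunk, so consuming every message requires looping; a minimal sketch of the usual pattern:

// Read chunks until the stream reports completion.
function pump(reader) {
  return reader.read().then(({ done, value }) => {
    if (done) return;
    console.log(new TextDecoder().decode(value)); // handle one chunk
    return pump(reader);
  });
}

fetch('...').then((response) => pump(response.body.getReader()));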
I found the problem, and as usual it was sitting in front of the PC.
I forgot to write a header first, before writing the response.
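For reference, a minimal sketch of that fix; the status code and content type are assumptions:

// Flush headers before the first write so the client receives
// chunks as they are written instead of one buffered response.
response.writeHead(200, { 'Content-Type': 'text/plain; charset=utf-8' });
response.write(first_message);
response.write(second_message);
response.end(last_message);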

node.js - res.end vs fs.createWriteStream

I am rather new to Node and am attempting to learn streaming; please correct me if my understanding is flawed.
Using fs.createReadStream and fs.createWriteStream together with the .pipe method will effectively stream any kind of data.
Also, the res.end method utilizes streaming by default.
So could we use fs.createReadStream together with res.end to create the same streaming effect?
How would this look?
Under what circumstances would you normally use res.end?
Thank you
You can use pipe like this:
readStream.pipe(res);
to stream some readable stream to the response.
See this answer for a working example of using it.
Basically it's something like:
var s = fs.createReadStream(file);
s.on('open', function () {
  s.pipe(res);
});
plus some error handling and MIME type support - see this for the full code:
How to serve an image using nodejs
where you can find it used in three examples using three Node modules (a sketch of the plain-http variant follows the list):
express
connect
http
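For example, a minimal sketch of the plain-http variant with basic error handling; the file path and MIME type are assumptions:

var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  var s = fs.createReadStream('./image.jpg'); // assumed path
  s.on('open', function () {
    res.writeHead(200, { 'Content-Type': 'image/jpeg' });
    s.pipe(res);
  });
  s.on('error', function () {
    // The read stream failed (e.g. file missing), so answer with a 404.
    res.writeHead(404, { 'Content-Type': 'text/plain' });
    res.end('Not found');
  });
}).listen(3000);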

Piping a readstream into a writestream does not work

I (as the client) am trying to post an image with restify, and the server just needs to save it.
req.pipe(fs.createWriteStream('test.jpg'));
is not working. An empty file is created, but nothing more. It works when I copy req.body into a buffer and then use fs.writeFile(...). I have also tried req.body.pipe, but this throws an error.
You're probably using body-parser middleware that is already reading all of the data from the request, so there is nothing left to read. Try adjusting the placement of your route handler and/or the body-parsing middleware if you want to read directly from the request object.
However, that will only work if the request contains only the image data. Typically a request is formatted as multipart/form-data if it contains at least one file, so you cannot just pipe the request and expect image data only.
So something else in your middleware chain, probably restify.bodyParser(), is already streaming the request body into a buffer or string as req.body, and you can't stream something twice. Find that middleware and disable it for this route if you want to handle the streaming straight to the filesystem yourself.
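For illustration, a minimal sketch of a route that streams the raw body straight to disk; it assumes no body-parsing middleware is registered for this route, and the route path is invented:

var restify = require('restify');
var fs = require('fs');

var server = restify.createServer();

// With no bodyParser in the chain, req is still an unread stream
// and can be piped directly to a file.
server.put('/image', function (req, res, next) {
  var out = fs.createWriteStream('test.jpg');
  req.pipe(out);
  out.on('finish', function () {
    res.send(201);
    return next();
  });
  out.on('error', function (err) {
    return next(err);
  });
});

server.listen(8080);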

Piping a stream to a stream with custom results

After reading and marginally understanding the node stream handbook, I want to use streams whenever it seems appropriate/possible.
I have a request that uploads a file which should be written to another spot on the file system. This is done via:
var readStream = fs.createReadStream(request.files.file.path);
readStream.pipe(fs.createWriteStream(targetPath));
This works great, but I want to pipe the result of the write stream to a response -- specifically I want the target path to be piped to the result when it's successful. Right now I'm doing:
readStream.pipe(fs.createWriteStream(targetPath)).on("close", function () {
  serverResponse.send(200, targetPath);
});
This works fine, but I feel like it is more verbose than it needs to be and that I should be able to call .pipe on the result, as in read.pipe(write).pipe(response).
Is there something I can do to get the write stream to pipe the target path to the response or better way I can go about doing what I'm doing?
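For what it's worth, a file write stream is not readable, so there is nothing to pipe onward; listening for completion is the usual route. A minimal sketch using the 'finish' event (serverResponse is assumed to be an Express-style response, as in the question):

var fs = require('fs');

var writeStream = fs.createWriteStream(targetPath);
readStream.pipe(writeStream);

// 'finish' fires once all data has been flushed to the file.
writeStream.on('finish', function () {
  serverResponse.send(200, targetPath);
});
writeStream.on('error', function (err) {
  serverResponse.send(500, err.message);
});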

Node.js request stream ends/stalls when piped to writable file stream

I'm trying to pipe() data from Twitter's Streaming API to a file using modern Node.js Streams. I'm using a library I wrote called TweetPipe, which leverages EventStream and Request.
Setup:
var TweetPipe = require('tweet-pipe')
, fs = require('fs');
var tp = new TweetPipe(myOAuthCreds);
var file = fs.createWriteStream('./tweets.json');
Piping to STDOUT works and the stream stays open:
tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(process.stdout);
Piping to the file writes one tweet and then the stream ends silently:
tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(file);
Could anyone tell me why this happens?
It's hard to say from what you have here; it sounds like the stream is getting cleaned up before you expect. This can be triggered a number of ways; see here: https://github.com/joyent/node/blob/master/lib/stream.js#L89-112
A stream could emit 'end', and then something just stops.
Although I doubt this is the problem, one thing that concerns me is this:
https://github.com/peeinears/tweet-pipe/blob/master/index.js#L173-174
destroy should be called after emitting error.
I would normally debug a problem like this by adding logging statements until I can see where things go wrong.
Can you post a script that can be run to reproduce?
(for extra points, include a package.json that specifies the dependencies :)
According to this, you should create an error handler on the stream created by tp.
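For illustration, attaching an error handler to the source stream so failures surface instead of the pipeline ending silently; this assumes only standard Node stream events:

var stream = tp.stream('statuses/filter', { track: ['bieber'] });

// Without an 'error' listener, an emitted error can tear the pipe down
// (or crash the process) with no visible explanation.
stream.on('error', function (err) {
  console.error('stream error:', err);
});

stream.pipe(tp.stringify()).pipe(file);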
