I'm trying to create a Read Stream from a remote file without writing it to disc.
var file = fs.createWriteStream('Video.mp4');
var request = http.get('http://url.tld/video.mp4', function(response){
  response.pipe(file);
});
Can I create a Read Stream directly from an HTTP response without writing it to disk? Maybe by buffering it in chunks and converting it to a readable stream?
Seems like you can use the request module.
Have a look at 7zark7's answer here: https://stackoverflow.com/a/14552721/7189461
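For example, a minimal sketch using the request module (the URL is the one from the question; request() hands you a plain Readable stream, so nothing needs to touch the disk):
var request = require('request');

// request() returns a Readable stream for the remote file
var videoStream = request('http://url.tld/video.mp4');

videoStream.on('data', function (chunk) {
  // each chunk is a Buffer of video bytes; consume it or pipe it anywhere
  console.log('received ' + chunk.length + ' bytes');
});
videoStream.on('error', function (err) {
  console.error(err);
});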
I am rather new to Node and am attempting to learn streaming; please correct me if my understanding is flawed.
Using fs.createReadStream and fs.createWriteStream together with the .pipe method will effectively stream any kind of data.
Also, the res.end method utilizes streaming by default.
So could we use fs.createReadStream together with res.end to create the same streaming effect?
How would this look?
Under what circumstances would you normally use res.end?
Thank you
You can use pipe like this:
readStream.pipe(res);
to stream any readable stream to the response.
See this answer for a working example of using it.
Basically it's something like:
var s = fs.createReadStream(file);
s.on('open', function () {
  s.pipe(res);
});
plus some error handling and MIME type support (a rough http-only sketch also follows the list below) - see this for the full code:
How to serve an image using nodejs
where you can find it used in three examples using three node modules:
express
connect
http
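For example, the bare http variant with basic error handling and a hard-coded MIME type might look roughly like this (a minimal sketch; the file path and content type are placeholders, and the linked answer has the complete versions):
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  // './image.png' is a placeholder path
  var s = fs.createReadStream('./image.png');

  s.on('error', function (err) {
    // e.g. file not found
    res.writeHead(404, { 'Content-Type': 'text/plain' });
    res.end('Not found');
  });

  s.on('open', function () {
    res.writeHead(200, { 'Content-Type': 'image/png' });
    s.pipe(res);
  });
}).listen(8000);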
I'm trying to download a large XML file and parse it using xml-stream library. I'm using request to download the file and it's capable of streaming the content of the file. Ideally, I'd like to pass that stream directly to xml-stream and have it parsed. But I can't figure out how to connect those two.
Here is the code I have so far:
request('http://example.com/data.xml').pipe(fs.createWriteStream('data.xml'));
...
var stream = fs.createReadStream('data.xml');
var xml = new XmlStream(stream);
Is it possible to connect them directly without a temp data.xml file?
request() returns a Readable stream, so just pass that return value to XmlStream():
var xml = new XmlStream(request('http://example.com/data.xml'));
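For example, a minimal sketch of consuming the parsed elements (here 'item' is a hypothetical element name; use whatever your feed actually contains):
var request = require('request');
var XmlStream = require('xml-stream');

var xml = new XmlStream(request('http://example.com/data.xml'));

// 'item' is a placeholder element name
xml.on('endElement: item', function (item) {
  console.log(item);
});

xml.on('error', function (err) {
  console.error(err);
});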
Setup: I am reading data from a database and emailing the dataset as a CSV file. The data is read in chunks of 500 rows, and I'm using ya-csv to write the records as CSV to a stream. I then want to use mailgun-js to email the file as an attachment.
Option 1 (what I don't want to do):
1. create a temp file;
2. create a write stream to that file;
3. write all CSV records;
4. read all of it back into memory to attach to an email.
Option 2 (what I want to do but don't quite know how to):
1. create a writable stream;
2. create a readable stream;
3. somehow pipe the writes from (1) into (2);
4. pass the writable stream to ya-csv;
5. pass the readable stream to mailgun;
6. fetch data and write to the write stream until there's no more data;
7. end the write stream, thus ending the read stream and sending the email.
I've been reading https://github.com/substack/stream-handbook and https://nodejs.org/api/stream.html, and the problem is that I can't use writable.pipe(readable);.
I have tried using a Duplex stream (i.e. both the write and read streams are just Duplex streams) but this doesn't work as Duplex is an abstract class and I'd have to implement several of the linking parts.
Question: how do I use streams to link up this writing of CSV records to streaming an attachment to mailgun?
Don't overthink it: mailgun-js can take a stream as an attachment, so it can be as easy as:
var csv = require('csv');
// initialize mailgun-js with your own credentials (placeholders here)
var mailgun = require('mailgun-js')({ apiKey: 'key-...', domain: 'example.com' });

// myDbStream is your database row stream; this will stream some csv
var file = myDbStream.pipe(csv.stringify());

var data = {
  from: 'Excited User <me#samples.mailgun.org>',
  to: 'serobnic#mail.ru',
  subject: 'Hello',
  text: 'Testing some Mailgun awesomness!',
  attachment: file // attach it to your message, mailgun should deal with it
};

mailgun.messages().send(data, function (error, body) {
  if (error) return console.error(error);
  console.log(body);
});
I don't know what your Db is; maybe the driver already has support for streams, or you'll have to feed csv manually (this can be done very easily with event-stream).
Edit
ya-csv does not seem to be easily pipeable; csv works better here.
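If the driver has no stream support, a rough sketch of feeding the stringifier manually in 500-row pages could look like this (getPage is a hypothetical paginated query helper, and the Mailgun credentials are placeholders):
var csv = require('csv');
var mailgun = require('mailgun-js')({ apiKey: 'key-...', domain: 'example.com' }); // placeholders

var stringifier = csv.stringify(); // writable and readable: rows go in, CSV text comes out

// getPage(offset, cb) is a hypothetical helper returning up to 500 rows per call
function writePage(offset) {
  getPage(offset, function (err, rows) {
    if (err) { console.error(err); return stringifier.end(); }
    rows.forEach(function (row) {
      stringifier.write(row); // row as an array of column values
    });
    if (rows.length === 500) {
      writePage(offset + 500); // there may be more data
    } else {
      stringifier.end(); // ends the readable side, letting mailgun finish the attachment
    }
  });
}
writePage(0);

mailgun.messages().send({
  from: 'Excited User <me#samples.mailgun.org>',
  to: 'serobnic#mail.ru',
  subject: 'Hello',
  text: 'CSV attached',
  attachment: stringifier // mailgun-js accepts a stream as attachment
}, function (error, body) {
  console.log(error || body);
});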
I'm trying to pipe() data from Twitter's Streaming API to a file using modern Node.js Streams. I'm using a library I wrote called TweetPipe, which leverages EventStream and Request.
Setup:
var TweetPipe = require('tweet-pipe')
, fs = require('fs');
var tp = new TweetPipe(myOAuthCreds);
var file = fs.createWriteStream('./tweets.json');
Piping to STDOUT works and stream stays open:
tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(process.stdout);
Piping to the file writes one tweet and then the stream ends silently:
tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(file);
Could anyone tell me why this happens?
It's hard to say from what you have here; it sounds like the stream is getting cleaned up before you expect. This can be triggered in a number of ways - see here: https://github.com/joyent/node/blob/master/lib/stream.js#L89-112
A stream could emit 'end', and then something just stops.
Although I doubt this is the problem, one thing that concerns me is this
https://github.com/peeinears/tweet-pipe/blob/master/index.js#L173-174
destroy should be called after emitting error.
I would normally debug a problem like this by adding logging statements until I can see what is not happening right.
Can you post a script that can be run to reproduce?
(for extra points, include a package.json that specifies the dependencies :)
According to this, you should create an error handler on the stream created by tp.
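For example, attaching handlers to both sides of the pipe makes it obvious where things stop (the logging is purely for debugging; the event names are the standard stream events):
var tweets = tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify());

tweets.on('error', function (err) {
  console.error('tweet stream error:', err);
});
tweets.on('end', function () {
  console.log('tweet stream ended');
});

file.on('error', function (err) {
  console.error('file stream error:', err);
});
file.on('close', function () {
  console.log('file stream closed');
});

tweets.pipe(file);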
I'm trying to read a giant logfile (250,000 lines), parse each line into a JSON object, and insert each JSON object into CouchDB for analytics.
I'm trying to do this by creating a buffered stream that will process each chunk separately, but I always run out of memory after about 300 lines. It seems like using buffered streams and util.pump should avoid this, but apparently not.
(Perhaps there are better tools for this than node.js and CouchDB, but I'm interested in learning how to do this kind of file processing in node.js and think it should be possible.)
CoffeeScript below, JavaScript here: https://gist.github.com/5a89d3590f0a9ca62a23
fs = require 'fs'
util = require('util')
BufferStream = require('bufferstream')
files = [
  "logfile1",
]
files.forEach (file)->
  stream = new BufferStream({encoding:'utf8', size:'flexible'})
  stream.split("\n")
  stream.on("split", (chunk, token)->
    line = chunk.toString()
    # parse line into JSON and insert in database
  )
  util.pump(fs.createReadStream(file, {encoding: 'utf8'}), stream)
Maybe this helps:
Memory leak when using streams in Node.js?
Try to use pipe() to solve it.
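In plain JavaScript that would look roughly like this, keeping the BufferStream setup from the question and simply swapping util.pump for pipe():
var fs = require('fs');
var BufferStream = require('bufferstream');

var stream = new BufferStream({ encoding: 'utf8', size: 'flexible' });
stream.split('\n');
stream.on('split', function (chunk, token) {
  var line = chunk.toString();
  // parse line into JSON and insert in database
});

// pipe() replaces the deprecated util.pump and handles backpressure for you
fs.createReadStream('logfile1', { encoding: 'utf8' }).pipe(stream);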