Possible to delete data read from readable stream? - node.js

I have some ffmpeg processes running in order and all writing into one stream (fs.createWriteStream).
Is it possible to delete the data read through fs.createReadStream from the file?
I want to run the script 24/7 and want the stream to act like a buffer.
Thanks in advance!

You can actually "append data" to a running ffmpeg instance - or to any other writable stream. To make this possible, you need to use this option for pipe:
myFile.pipe(ffmpegRunner, {end: false});
This tells pipe not to notify ffmpeg that the file has ended. Then, you can switch files once the first one ends:
myFile.on("end", () => {
  myFile.unpipe(ffmpegRunner);
  anotherFile.pipe(ffmpegRunner, {end: false});
});
You can do that even before the stream ends I guess.

Related

Piping to a stream that needs to be both readable and writable

I'd like to be able to pipe from the src stream to the something stream. I understand that `something` needs to be a writable stream, as it's being piped to; however, it also needs to be a readable stream, so that it can pipe to `somethingElse`. What should `something` return in order to make this work?
example.task('taskOne', function() {
  return example
    .src('pathName')
    .pipe(something())
    .pipe(somethingElse());
});
Solved!
I made use of the node module through2, which solves this exact issue.

Node spawn stdout.on data delay

I am checking for USB drive removal on linux. I am simply monitoring the output of a command line process with child_process.spawn. But for some reason the child's stdout data event doesn't emit until like 20 lines have been printed, which makes it unable to detect a removed drive. After removing the drive many times, it does finally go. But obviously that won't do.
Original:
var udevmonitor = require("child_process").spawn("udevadm", ["monitor", "--udev"]);
udevmonitor.stdout.on("data", function(data) {
  return console.log(data.toString());
});
Pretty simple. So I figure it's an issue with the piping node is using internally. So instead of using the pipe, I figure I'll just use a simple passthrough stream. That could solve the problem and give me real-time output. That code is:
var stdout = new require('stream').PassThrough();
require("child_process").spawn("udevadm", ["monitor", "--udev"], { stdio: ['pipe', stdout, 'pipe'] });
stdout.on("data", function(data) {
console.log(data.toString());
});
But that gives me an error:
child_process.js:922 throw new TypeError('Incorrect value for stdio stream: ' + stdio);
The documentation says you can pass a stream in. I don't see what I'm doing wrong and stepping through the child_process source didn't help.
Can someone help? You can run this yourself, provided you're on Linux. Run the code and insert a USB drive. Perhaps you can run the command 'udevadm monitor --udev' in another terminal to see what happens. Remove and reinsert a few times and eventually node will print out.
mscdex, I love you. Changing the spawn command to
spawn("stdbuf", ["-oL", "-eL", "udevadm", "monitor", "--udev"]);
did the trick. I really appreciate your help!

Piping a stream to a stream with custom results

After reading and marginally understanding the node stream handbook, I want to use streams whenever it seems appropriate/possible.
I have a request that uploads a file which should be written to another spot on the file system. This is done via:
readStream = fs.createReadStream(request.files.file.path);
readStream.pipe(fs.createWriteStream(targetPath));
This works great, but I want to pipe the result of the write stream to a response -- specifically I want the target path to be piped to the result when it's successful. Right now I'm doing:
readStream.pipe(fs.createWriteStream(targetPath)).on("close", function () {
  serverResponse.send(200, targetPath);
});
This works fine, but I feel like it is more verbose than it needs to be; I should be able to call .pipe on the result, as in read.pipe(write).pipe(response).
Is there something I can do to get the write stream to pipe the target path to the response or better way I can go about doing what I'm doing?

Node.js request stream ends/stalls when piped to writable file stream

I'm trying to pipe() data from Twitter's Streaming API to a file using modern Node.js Streams. I'm using a library I wrote called TweetPipe, which leverages EventStream and Request.
Setup:
var TweetPipe = require('tweet-pipe')
, fs = require('fs');
var tp = new TweetPipe(myOAuthCreds);
var file = fs.createWriteStream('./tweets.json');
Piping to STDOUT works and stream stays open:
tp.stream('statuses/filter', { track: ['bieber'] })
.pipe(tp.stringify())
.pipe(process.stdout);
Piping to the file writes one tweet and then the stream ends silently:
tp.stream('statuses/filter', { track: ['bieber'] })
.pipe(tp.stringify())
.pipe(file);
Could anyone tell me why this happens?
It's hard to say from what you have here, but it sounds like the stream is getting cleaned up before you expect. This can be triggered in a number of ways; see here: https://github.com/joyent/node/blob/master/lib/stream.js#L89-112
A stream could emit 'end', and then something just stops.
Although I doubt this is the problem, one thing that concerns me is this
https://github.com/peeinears/tweet-pipe/blob/master/index.js#L173-174
destroy should be called after emitting error.
I would normally debug a problem like this by adding logging statements until I can see what is not happening right.
Can you post a script that can be run to reproduce?
(for extra points, include a package.json that specifies the dependencies :)
According to this, you should create an error handler on the stream created by tp.

Does a WriteStream have to be closed?

I used var writeStream = fs.createWriteStream('...') to open a WriteStream, but there is no writeStream.close() function. How is this handled in node?
Call stream.end()
http://nodejs.org/api/stream.html#stream_stream_end
The stream has to be closed after you have finished your file-writing work.
Use
stream.end()
You can't call stream.write('xxx') after stream.end().
As long as you haven't ended/closed the writable stream, you (or anyone else) can keep writing to the file, and Node.js will keep the writable stream's reference in RAM.