Pipe PCM-Streams into one function - node.js

I have two PCM-streams (decoder1 + decoder2):
var readable1 = fs.createReadStream("track1.mp3");
var decoder1 = new lame.Decoder({
  channels: 2,
  mode: lame.STEREO
});
readable1.pipe(decoder1);
and
var readable2 = fs.createReadStream("track2.mp3");
var decoder2 = new lame.Decoder({
  channels: 2,
  mode: lame.STEREO
});
readable2.pipe(decoder2);
Now I want to pipe the streams into one mix-function, where I can use the buffer-function like:
function mixStream(buf1, buf2, callback) {
  // The mixStream function is not implemented yet (dummy)
  var out = new Buffer(buf1.length);
  for (var i = 0; i < buf1.length; i += 2) {
    var uint = Math.floor(0.5 * buf1.readInt16LE(i));
    out.writeInt16LE(uint, i);
  }
  this.push(out);
  callback();
}
I need something like
mixStream(decoder1.pipe(), decoder2.pipe(), function() { }).pipe(new Speaker());
for output to speaker. Is this possible?

Well, the pipe() function actually links one stream to another, a readable to a writable, for instance. This 'linking' writes to the writable stream whenever a data chunk is ready on the readable stream, along with somewhat more complex logic such as pause() and resume() to deal with backpressure.
So all you have to do is create a pipe-like function that processes two readable streams at the same time: it drains data from stream1 and stream2, and once the data is ready, writes it to the destination writable stream.
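A rough, untested sketch of that idea (this is not part of the original answer; the mixBuffers/mixStreams names and the chunk size are just placeholders) could look like this:
var Readable = require('stream').Readable;

// Average two 16-bit little-endian PCM buffers of equal length.
function mixBuffers(buf1, buf2) {
  var out = new Buffer(buf1.length);
  for (var i = 0; i < buf1.length; i += 2) {
    var sample = Math.floor(0.5 * buf1.readInt16LE(i) + 0.5 * buf2.readInt16LE(i));
    out.writeInt16LE(sample, i);
  }
  return out;
}

function mixStreams(src1, src2) {
  var CHUNK = 4096; // arbitrary block size; must be a multiple of the frame size
  var out = new Readable();
  out._read = function() {}; // data is pushed as it arrives; no backpressure handling
  function tryMix() {
    var a, b;
    // read(n) returns null until at least n bytes are buffered
    while ((a = src1.read(CHUNK)) && (b = src2.read(CHUNK))) {
      out.push(mixBuffers(a, b));
    }
    if (a && !b) src1.unshift(a); // put the unmatched chunk back for the next round
  }
  src1.on('readable', tryMix);
  src2.on('readable', tryMix);
  return out;
}

// mixStreams(decoder1, decoder2).pipe(new Speaker());
Ending the mixed stream (out.push(null)) and dealing with leftover samples when one source finishes early are left out here for brevity.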
I'd strongly recommend that you go through the Node.js docs for Stream.
Hope this is what you are looking for :)

Related

How to capture the first 10 seconds of an mp3 being streamed over HTTP

Disclaimer: newbie to Node.js and audio parsing.
I'm trying to proxy a digital radio stream through an Express app with the help of node-icecast, which works great. I am getting the radio's MP3 stream and, via node-lame, decoding the MP3 to PCM and then sending it to the speakers. All of this works straight from the GitHub project's readme example:
var lame = require('lame');
var icecast = require('icecast');
var Speaker = require('speaker');

// URL to a known Icecast stream
var url = 'http://firewall.pulsradio.com';

// connect to the remote stream
icecast.get(url, function (res) {
  // log the HTTP response headers
  console.error(res.headers);

  // log any "metadata" events that happen
  res.on('metadata', function (metadata) {
    var parsed = icecast.parse(metadata);
    console.error(parsed);
  });

  // Let's play the music (assuming MP3 data).
  // lame decodes and Speaker sends to speakers!
  res.pipe(new lame.Decoder())
    .pipe(new Speaker());
});
I'm now trying to set up a service to identify the music using the Doreso API. The problem is that I'm working with a stream and don't have a file (and I don't know enough yet about readable and writable streams, and I'm a slow learner). I have been looking around for a while at how to write the stream (ideally to memory) until I have about 10 seconds' worth. Then I would pass that portion of audio to my API; however, I don't know if that's possible, or where to start with slicing 10 seconds off a stream. I thought about passing the stream to ffmpeg, since it has a -t option for duration and perhaps that could limit it, but I haven't got that to work yet.
Any suggestions to cut a stream down to 10 seconds would be awesome. Thanks!
Updated: Changed my question as I originally thought I was getting PCM and converting to mp3 ;-) I had it backwards. Now I just want to slice off part of the stream while the stream still feeds the speaker.
It's not that easy... but I managed it this weekend. I would be happy if you guys could point out how to improve this code further. I don't really like the approach of simulating the "end" of a stream. Is there something like "detaching" or "rewiring" parts of a stream pipeline in Node?
First, you should create your very own Writable Stream class which itself creates a lame encoding instance. This writable stream will receive the decoded PCM data.
It works like this:
var stream = require('stream');
var util = require('util');
var fs = require('fs');
var lame = require('lame');
var streamifier = require('streamifier');
var WritableStreamBuffer = require("stream-buffers").WritableStreamBuffer;

var SliceStream = function(lameConfig) {
  stream.Writable.call(this);
  this.encoder = new lame.Encoder(lameConfig);
  // we need a stream buffer to buffer the PCM data
  this.buffer = new WritableStreamBuffer({
    initialSize: (1000 * 1024),   // start at about 1 MiB
    incrementAmount: (150 * 1024) // grow by 150 KiB each time the buffer overflows
  });
};
util.inherits(SliceStream, stream.Writable);

// some attributes, initialization
SliceStream.prototype.writable = true;
SliceStream.prototype.encoder = null;
SliceStream.prototype.buffer = null;

// will be called each time the decoded stream emits "data",
// together with a chunk of binary data as a Buffer
SliceStream.prototype.write = function(buf) {
  //console.log('bytes recv: ', buf.length);
  this.buffer.write(buf);
  //console.log('buffer size: ', this.buffer.size());
  return true; // signal that we can accept more data
};

// this method will be invoked when the setTimeout below
// emits the simulated "end" event. Let's encode to MP3 again...
SliceStream.prototype.end = function(buf) {
  if (arguments.length) {
    this.buffer.write(buf);
  }
  this.writable = false;
  //console.log('buffer size: ' + this.buffer.size());

  // fetch the binary data from the buffer
  var PCMBuffer = this.buffer.getContents();

  // create a stream out of the binary buffer data
  streamifier.createReadStream(PCMBuffer).pipe(
    // and pipe it right into the MP3 encoder...
    this.encoder
  );

  // but don't forget to pipe the encoder's output
  // into a writable file stream
  this.encoder.pipe(
    fs.createWriteStream('./fooBar.mp3')
  );
};
Now you can pipe the decoded stream into an instance of your SliceStream class, like this (in addition to the other pipes):
icecast.get(streamUrl, function(res) {
  var lameEncoderConfig = {
    // input
    channels: 2,       // 2 channels (left and right)
    bitDepth: 16,      // 16-bit samples
    sampleRate: 44100, // 44,100 Hz sample rate
    // output
    bitRate: 320,
    outSampleRate: 44100,
    mode: lame.STEREO  // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
  };
  var decodedStream = res.pipe(new lame.Decoder());

  // pipe the decoded PCM stream into a SliceStream instance
  decodedStream.pipe(new SliceStream(lameEncoderConfig));
  // now play it...
  decodedStream.pipe(new Speaker());

  setTimeout(function() {
    // after 10 seconds, emulate an end of the stream
    res.emit('end');
  }, 10 * 1000 /* milliseconds */);
});
Can I suggest using removeListener after 10 seconds? That will prevent future events from being sent through the listener.
var request = require('request'),
    fs = require('fs'),
    masterStream = request('-- mp3 stream --');

var writeStream = fs.createWriteStream('recording.mp3'),
    handler = function(bit) {
      writeStream.write(bit);
    };

masterStream.on('data', handler);

setTimeout(function() {
  masterStream.removeListener('data', handler);
  writeStream.end();
}, 1000 * 10);

Basic streams issue: Difficulty sending a string to stdout

I'm just starting learning about streams in node. I have a string in memory and I want to put it in a stream that applies a transformation and pipe it through to process.stdout. Here is my attempt to do it:
var through = require('through');
var stream = through(function write(data) {
  this.push(data.toUpperCase());
});
stream.push('asdf');
stream.pipe(process.stdout);
stream.end();
It does not work. When I run the script on the cli via node, nothing is sent to stdout and no errors are thrown. A few questions I have:
If you have a value in memory that you want to put into a stream, what is the best way to do it?
What is the difference between push and queue?
Does it matter if I call end() before or after calling pipe()?
Is end() equivalent to push(null)?
Thanks!
Just use the vanilla stream API
var Transform = require("stream").Transform;

// create a new Transform stream
var stream = new Transform({
  decodeStrings: false,
  encoding: "ascii"
});

// implement the _transform method
stream._transform = function _transform(str, enc, done) {
  this.push(str.toUpperCase() + "\n");
  done();
};

// connect to stdout
stream.pipe(process.stdout);

// write some stuff to the stream
stream.write("hello!");
stream.write("world!");

// output
// HELLO!
// WORLD!
Or you can build your own stream constructor. This is really the way the stream API is intended to be used
var Transform = require("stream").Transform;

function MyStream() {
  // call Transform constructor with `this` context
  // {decodeStrings: false} keeps data as `string` type instead of `Buffer`
  // {encoding: "ascii"} sets the encoding for our strings
  Transform.call(this, {decodeStrings: false, encoding: "ascii"});

  // our function to do "work"
  function _transform(str, encoding, done) {
    this.push(str.toUpperCase() + "\n");
    done();
  }

  // export our function
  this._transform = _transform;
}

// extend the Transform.prototype to your constructor
MyStream.prototype = Object.create(Transform.prototype, {
  constructor: {
    value: MyStream
  }
});
Now use it like this
// instantiate
var a = new MyStream();
// pipe to a destination
a.pipe(process.stdout);
// write data
a.write("hello!");
a.write("world!");
Output
HELLO!
WORLD!
Some other notes about .push vs .write.
.write(str) adds data to the writable buffer. It is meant to be called externally. If you think of a stream like a duplex file handle, it's just like fwrite, only buffered.
.push(str) adds data to the readable buffer. It is only intended to be called from within our stream.
.push(str) can be called many times. Watch what happens if we change our function to
function _transform(str, encoding, done) {
  this.push(str.toUpperCase());
  this.push(str.toUpperCase());
  this.push(str.toUpperCase() + "\n");
  done();
}
Output
HELLO!HELLO!HELLO!
WORLD!WORLD!WORLD!
First, you want to use write(), not push(). write() puts data into the stream, push() pushes data out of the stream; you only use push() when implementing your own Readable, Duplex, or Transform streams.
Second, you'll only want to write() data to the stream after you've set up the pipe() (or added some event listeners). If you write to a stream with nothing wired to the other end, the data you've written will be lost. As #naomik pointed out, this isn't true in general, since a Writable stream will buffer write()s. In your example, though, you do need to write() after pipe(); otherwise the process will end before anything is written to STDOUT. This is possibly due to how the through module is implemented, but I don't know that for sure.
So, with that in mind, you can make a couple simple changes to your example to get it to work:
var through = require('through');
var stream = through(function write(data) {
  this.push(data.toUpperCase());
});
stream.pipe(process.stdout);
stream.write('asdf');
stream.end();
Now, for your questions:
The easiest way to get data from memory into a writable stream is simply to write() it, just like we're doing with stream.write('asdf') in your example.
As far as I know, the stream doesn't have a queue() function; did you mean write()? Like I said above, write() is used to put data into a stream, and push() is used to push data out of the stream. Only call push() in your own stream implementations.
Only call end() after all your data has been written to your stream. end() basically says: "Ok, I'm done now. Please finish what you're doing and close the stream."
push(null) is pretty much equivalent to end(). That being said, don't call push(null) unless you're doing it inside your own stream implementation (as stated above). It's almost always more appropriate to call end().
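To illustrate the last two points, here is a small sketch (not from the original answer) of where each call belongs:
var Readable = require('stream').Readable;
var through = require('through');

// Inside your own stream implementation, push(null) signals "no more data":
var source = new Readable();
source._read = function() {
  this.push('from push\n');
  this.push(null); // the readable side has ended
};
source.pipe(process.stdout);

// From the outside, as the writer, call end() when you are done writing:
var upper = through(function(data) {
  this.push(data.toString().toUpperCase());
});
upper.pipe(process.stdout);
upper.write('from write\n');
upper.end(); // finish up and close the stream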
Based on the examples for stream (http://nodejs.org/api/stream.html#stream_readable_pipe_destination_options) and through (https://www.npmjs.org/package/through), it doesn't look like you are using your stream correctly... What happens if you use write(...) instead of push(...)?

Node.js: splitting a readable stream pipe to multiple sequential writable streams

Given a Readable stream (which may be process.stdin or a file stream), is it possible/practical to pipe() to a custom Writable stream that will fill a child Writable until a certain size; then close that child stream; open a new Writable stream and continue?
(The context is to upload a large piece of data from a pipeline to a CDN, dividing it up into blocks of a reasonable size as it goes, without having to write the data to disk first.)
I've tried creating a Writable that handles the opening and closing of the child stream in the _write function, but the problem comes when the incoming chunk is too big to fit in the existing child stream: it has to write some of the chunk to the old stream; create the new stream; and then wait for the open event on the new stream before completing the _write call.
The other thought I had was to create an extra Duplex or Transform stream to buffer the pipe and ensure that the chunk coming into the Writable is definitely equal to or less than the amount the existing child stream can accept, to give the Writable time to change the child stream over.
Alternatively, is this overcomplicating everything and there's a much easier way to do the original task?
I came across this question while looking for an answer to a related problem: how to parse a file and split its lines into separate files depending on some category value in the line.
I did my best to adapt my code to make it more relevant to your problem. However, it was adapted quickly and is not tested; treat it as pseudo-code.
var fs = require('fs'),
    through = require('through');

var destCount = 0, dest, size = 0, MAX_SIZE = 1000;

readableStream
  .on('data', function(data) {
    var out = data.toString() + "\n";
    size += out.length;
    if (dest && size > MAX_SIZE) {
      dest.end();
      dest = null;
      size = 0;
    }
    if (!dest) {
      // option 1: manipulate the data before saving it
      dest = through();
      dest.pipe(fs.createWriteStream("log" + destCount++));
      // option 2: write directly to a file
      // dest = fs.createWriteStream("log" + destCount++);
    }
    dest.write(out);
  })
  .on('end', function() {
    if (dest) dest.end();
  });
I would introduce a Transform between the Readable and Writable streams, and in its _transform I would do all the logic I need.
Maybe I would have only a Readable and a Transform. The _transform method would create all the Writable streams I need.
Personally, I only use a Writable stream when I'm dumping data somewhere and I'm done processing that chunk.
I avoid implementing _read and _write as much as I can and abuse Transform streams instead; a rough sketch of this idea follows below.
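For illustration, here is a rough, untested sketch of that idea (not from the original answer; MAX_SIZE and the 'block' file names are made-up placeholders):
var Transform = require('stream').Transform;
var fs = require('fs');

var MAX_SIZE = 1000 * 1000; // bytes per child stream
var splitter = new Transform();
var current = null, written = 0, fileIndex = 0;

splitter._transform = function(chunk, enc, done) {
  if (!current || written + chunk.length > MAX_SIZE) {
    if (current) current.end(); // close the previous child stream
    current = fs.createWriteStream('block' + fileIndex++);
    written = 0;
  }
  written += chunk.length;
  current.write(chunk, done); // let the child's write callback provide the backpressure
};

splitter._flush = function(done) {
  if (current) current.end(done);
  else done();
};

// readableStream.pipe(splitter);
Passing done as the child's write callback is a simple way to wait for the child before asking the source for more data; a real uploader would also need to handle the case where a single chunk is larger than MAX_SIZE.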
But the point I don't understand in your question is the part about size. What do you mean by it?

Node.js Readable file stream not getting data

I'm attempting to create a Readable file stream that I can read individual bytes from. I'm using the code below.
var rs = fs.createReadStream(file).on('open', function() {
  var buff = rs.read(8); // Read first 8 bytes
  console.log(buff);
});
Given that file is an existing file of at least 8 bytes, why am I getting 'null' as the output for this?
The open event means that the stream has been initialized; it does not mean you can read from the stream yet. You would have to listen for either the readable or the data event.
var rs = fs.createReadStream(file);
rs.once('readable', function() {
  var buff = rs.read(8); // Read first 8 bytes, only once
  console.log(buff.toString());
});
It looks like you're calling the rs.read() method. However, that method is only available in the streams interface, and in the streams interface you're looking for the 'data' event and not the 'open' event.
That said, the docs actually recommend against doing this. Instead you should probably be handling a chunk at a time if you want to stream the file:
var rs = fs.createReadStream('test.txt');
rs.on('data', function(chunk) {
  console.log(chunk);
});
If you want to read just a specific portion of a file, you may want to look at fs.open() and fs.read() which are lower level.
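For example, a minimal sketch of that lower-level approach (not part of the original answer; 'test.txt' is just a placeholder):
var fs = require('fs');

// open the file, read only its first 8 bytes, then close it
fs.open('test.txt', 'r', function(err, fd) {
  if (err) throw err;
  var buf = new Buffer(8); // Buffer.alloc(8) on newer Node versions
  fs.read(fd, buf, 0, 8, 0, function(err, bytesRead) {
    if (err) throw err;
    console.log(buf.slice(0, bytesRead));
    fs.close(fd, function() {});
  });
});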

How to wrap a buffer as a stream2 Readable stream?

How can I transform a Node.js buffer into a Readable stream using the streams2 interface?
I already found this answer and the stream-buffers module but this module is based on the stream1 interface.
The easiest way is probably to create a new PassThrough stream instance, and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the first stream.
var stream = require('stream');
// Initiate the source
var bufferStream = new stream.PassThrough();
// Write your buffer
bufferStream.end(Buffer.from('Test data.'));
// Pipe it to something else (i.e. stdout)
bufferStream.pipe(process.stdout)
As natevw suggested, it's even more idiomatic to use a stream.PassThrough, and end it with the buffer:
var buffer = new Buffer( 'foo' );
var bufferStream = new stream.PassThrough();
bufferStream.end( buffer );
bufferStream.pipe( process.stdout );
This is also how buffers are converted/piped in vinyl-fs.
A modern, simple approach that is usable anywhere you would use fs.createReadStream(), but without having to first write the file to a path:
const {Duplex} = require('stream'); // Native Node Module
function bufferToStream(myBuffer) {
  let tmp = new Duplex();
  tmp.push(myBuffer);
  tmp.push(null); // signal that there is no more data
  return tmp;
}
const myReadableStream = bufferToStream(your_buffer);
The bufferToStream() helper is re-usable.
The buffer and the stream exist only in memory, without writing to local storage.
I use this approach often when the actual file is stored at some cloud service and our API acts as a go-between; files never get written to local disk.
I have found this to be very reliable no matter the buffer (up to 10 MB) or the destination that accepts a Readable stream. Larger files should implement
