How to close a file when writing is done? - node.js

I'm using the lame package [1] to write some MP3 data into a file. The data is sent as raw audio over a socket; when received, it's written into a file stream, and every 10 minutes I start writing into a new file. The problem I'm facing is that when this runs for a long time, the system runs out of file handles because the files are never closed. Something like this:
var stream;

var encoder = lame.Encoder({
  // Input
  channels: 2,
  bitDepth: 16,
  sampleRate: 44100,
  // Output
  bitRate: 128,
  outSampleRate: 22050,
  mode: lame.STEREO // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
});

encoder.on('data', function (data) {
  stream.write(data);
});

var server = net.createServer(function (socket) {
  socket.on('data', function (data) {
    // There is some logic here that decides, based on time, whether
    // it's time to create a new file. When creating a new file it uses
    // the following code.
    stream = fs.createWriteStream(filename);
    // This will write data through the encoder into the file.
    encoder.write(data);
    // Can't close the file here since it might try to write after
    // it's closed.
  });
});

server.listen(port, host);
However, how can I close the file after the last data chunk has been written? Technically a new file can be opened while the previous file still needs to finish writing its last chunk.
In this scenario, how do I correctly close the file?
[1] https://www.npmjs.com/package/lame

You need to process the data as a Readable stream, then use socket.io-stream to handle this.
var ss = require('socket.io-stream');

//encoder.on('data', function(data) {
//  stream.write(data);
//});

var server = net.createServer(function (socket) {
  ss(socket).on('data', function (stream) {
    // There is some logic here that decides, based on time, whether
    // it's time to create a new file. When creating a new file it uses
    // the following code.
    stream.pipe(encoder).pipe(fs.createWriteStream(filename));
  });
});

Close the stream (file) after all writes are done:
stream.end();
See the documentation: https://nodejs.org/api/stream.html
writable.end([chunk][, encoding][, callback])
* chunk String | Buffer Optional data to write
* encoding String The encoding, if chunk is a String
* callback Function Optional callback for when the stream is finished
Call this method when no more data will be written to the stream. If supplied, the
callback is attached as a listener on the finish event.

Related

Pipe PCM-Streams into one function

I have two PCM-streams (decoder1 + decoder2):
var readable1 = fs.createReadStream("track1.mp3");
var decoder1 = new lame.Decoder({
  channels: 2,
  mode: lame.STEREO
});
readable1.pipe(decoder1);
and
var readable2 = fs.createReadStream("track2.mp3");
var decoder2 = new lame.Decoder({
  channels: 2,
  mode: lame.STEREO
});
readable2.pipe(decoder2);
Now I want to pipe both streams into one mix function, where I can work with the buffers, something like:
function mixStream(buf1, buf2, callback) {
  // The mixStream function is not implemented yet (dummy)
  var out = new Buffer(buf1.length);
  for (var i = 0; i < buf1.length; i += 2) {
    var uint = Math.floor(.5 * buf1.readInt16LE(i));
    out.writeInt16LE(uint, i);
  }
  this.push(out);
  callback();
}
I need something like
mixStream(decoder1.pipe(), decoder2.pipe(), function() { }).pipe(new Speaker());
for output to speaker. Is this possible?
Well, the pipe() function actually means that one stream is linked to another, a readable to a writable, for instance. This 'linking' process consists of write()s to the writable stream once any data chunk is ready on the readable stream, along with a little more complex logic like pause() and resume() to deal with backpressure.
So all you have to do is create a pipe-like function that processes two readable streams at the same time: it drains data from stream1 and stream2, and once the data is ready, writes it to the destination writable stream.
I'd strongly recommend you go through the Node.js docs for Stream.
Hope this is what you are looking for :)

How to capture the first 10 seconds of an mp3 being streamed over HTTP

disclaimer: newbie to nodeJS and audio parsing
I'm trying to proxy a digital radio stream through an Express app with the help of node-icecast, which works great. I'm getting the radio's MP3 stream and, via node-lame, decoding the MP3 to PCM, then sending it to the speakers. All of this works straight from the GitHub project's readme example:
var lame = require('lame');
var icecast = require('icecast');
var Speaker = require('speaker');

// URL to a known Icecast stream
var url = 'http://firewall.pulsradio.com';

// connect to the remote stream
icecast.get(url, function (res) {
  // log the HTTP response headers
  console.error(res.headers);
  // log any "metadata" events that happen
  res.on('metadata', function (metadata) {
    var parsed = icecast.parse(metadata);
    console.error(parsed);
  });
  // Let's play the music (assuming MP3 data).
  // lame decodes and Speaker sends to speakers!
  res.pipe(new lame.Decoder())
    .pipe(new Speaker());
});
I'm now trying to set up a service to identify the music using the Doreso API. The problem is that I'm working with a stream and don't have a file (and I don't know enough yet about readable and writable streams, and I'm learning slowly). I've been trying for a while to write the stream (ideally to memory) until I have about 10 seconds' worth. Then I would pass that portion of audio to my API; however, I don't know if that's possible, or where to start with slicing 10 seconds out of a stream. I thought about passing the stream to ffmpeg, since it has a -t option for duration that could perhaps limit it, but I haven't got that to work yet.
Any suggestions to cut a stream down to 10 seconds would be awesome. Thanks!
Updated: Changed my question as I originally thought I was getting PCM and converting to mp3 ;-) I had it backwards. Now I just want to slice off part of the stream while the stream still feeds the speaker.
It's not that easy... but I managed it this weekend. I would be happy if you guys could point out how to improve this code. I don't really like the approach of simulating the "end" of a stream. Is there something like "detaching" or "rewiring" parts of a pipe wiring of streams in Node?
First, you should create your very own Writable Stream class which itself creates a lame encoding instance. This writable stream will receive the decoded PCM data.
It works like this:
var stream = require('stream');
var util = require('util');
var fs = require('fs');
var lame = require('lame');
var streamifier = require('streamifier');
var WritableStreamBuffer = require('stream-buffers').WritableStreamBuffer;

var SliceStream = function (lameConfig) {
  stream.Writable.call(this);
  this.encoder = new lame.Encoder(lameConfig);
  // we need a stream buffer to buffer the PCM data
  this.buffer = new WritableStreamBuffer({
    initialSize: (1000 * 1024),    // start at 1000 KiB
    incrementAmount: (150 * 1024)  // grow by 150 KiB each time the buffer overflows
  });
};
util.inherits(SliceStream, stream.Writable);

// some attributes, initialization
SliceStream.prototype.writable = true;
SliceStream.prototype.encoder = null;
SliceStream.prototype.buffer = null;

// will be called each time the decoded stream emits "data",
// together with a chunk of binary data as a Buffer
SliceStream.prototype.write = function (buf) {
  //console.log('bytes recv: ', buf.length);
  this.buffer.write(buf);
  //console.log('buffer size: ', this.buffer.size());
};

// this method is invoked when the setTimeout function
// emits the simulated "end" event. Let's encode to MP3 again...
SliceStream.prototype.end = function (buf) {
  if (arguments.length) {
    this.buffer.write(buf);
  }
  this.writable = false;
  //console.log('buffer size: ' + this.buffer.size());
  // fetch the binary data from the buffer
  var PCMBuffer = this.buffer.getContents();
  // create a stream out of the binary buffer data
  streamifier.createReadStream(PCMBuffer).pipe(
    // and pipe it right into the MP3 encoder...
    this.encoder
  );
  // but don't forget to pipe the encoder's output
  // into a writable file stream
  this.encoder.pipe(fs.createWriteStream('./fooBar.mp3'));
};
Now you can pipe the decoded stream into an instance of your SliceStream class, like this (in addition to the other pipes):
icecast.get(streamUrl, function (res) {
  var lameEncoderConfig = {
    // input
    channels: 2,        // 2 channels (left and right)
    bitDepth: 16,       // 16-bit samples
    sampleRate: 44100,  // 44,100 Hz sample rate
    // output
    bitRate: 320,
    outSampleRate: 44100,
    mode: lame.STEREO   // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
  };
  var decodedStream = res.pipe(new lame.Decoder());
  // pipe the decoded PCM stream into a SliceStream instance
  decodedStream.pipe(new SliceStream(lameEncoderConfig));
  // now play it...
  decodedStream.pipe(new Speaker());
  setTimeout(function () {
    // after 10 seconds, emulate an end of the stream.
    res.emit('end');
  }, 10 * 1000 /* milliseconds */);
});
Can I suggest using removeListener after 10 seconds? That will prevent future events from being sent through the listener.
var request = require('request'),
    fs = require('fs'),
    masterStream = request('-- mp3 stream --');

var writeStream = fs.createWriteStream('recording.mp3'),
    handler = function (bit) {
      writeStream.write(bit);
    };

masterStream.on('data', handler);

setTimeout(function () {
  masterStream.removeListener('data', handler);
  writeStream.end();
}, 1000 * 10);

Node.js Socket pipe method DOES NOT pipe last packet to the http response

I have a Node server which uses Express as the web app framework.
This server creates a TCP socket connection with a remote TCP server.
I'm trying to pipe the TCP data into the user's HTTP response.
It works fine for a while, but the LAST TCP packet is NOT piped to the HTTP response.
So the web browser's download status stops at 99.9% downloaded.
My source code is below.
Can anyone help me solve this problem?
Thanks in advance.
app.get('/download/*', function (req, res) {
  var tcpClient = new net.Socket();
  tcpClient.connect(port, ip, function () {
    // some logic
  });
  tcpClient.on('data', function (data) {
    /* skip ... */
    tcpClient.pipe(res); // This method is called once in the 'data' event loop
    /* skip ... */
  });
  tcpClient.on('close', function () {
    clog.debug('Connection closed.');
  });
  tcpClient.on('end', function () {
    clog.debug('Connection ended.');
  });
  tcpClient.on('error', function (err) {
    clog.err(err.stack);
  });
});
That's not how you're supposed to use .pipe().
When you pipe one stream into another, you don't have to handle the data events yourself: everything is taken care of by the pipe. Moreover, the data event is emitted for every chunk of data, which means you may be calling pipe() multiple times.
You only need to create and initialize the Socket, and then pipe it to your response stream:
tcpClient.connect(port, ip, function () {
  // some logic
  this.pipe(res);
});
Edit: As you clarified in the comments, the first chunk contains metadata, and you only want to pipe from the second chunk onward. Here's a possible solution:
tcpClient.connect(port, ip, function () {
  // some logic
  // Only call the handler once, i.e. on the first chunk
  this.once('data', function (data) {
    // Some logic to process the first chunk
    // ...
    // Now that the custom logic is done, we can pipe the tcp stream to the response
    this.pipe(res);
  });
});
As a side note, if you want to add custom logic to the data that comes from the tcpClient before writing it to the response object, check out the Transform stream. You will then have to:
* create a transform stream with your custom transforming logic
* pipe all streams together: tcpClient.pipe(transformStream).pipe(res)

Node.js Readable file stream not getting data

I'm attempting to create a Readable file stream that I can read individual bytes from. I'm using the code below.
var rs = fs.createReadStream(file).on('open', function () {
  var buff = rs.read(8); // Read first 8 bytes
  console.log(buff);
});
Given that file is an existing file of at least 8 bytes, why am I getting 'null' as the output for this?
The open event means that the stream has been initialized; it does not mean you can read from the stream. You have to listen for either the readable or data event.
var rs = fs.createReadStream(file);
rs.once('readable', function () {
  var buff = rs.read(8); // Read the first 8 bytes only once
  console.log(buff.toString());
});
It looks like you're calling the rs.read() method. However, that method belongs to the Streams interface, and in the Streams interface you're looking for the 'data' event, not the 'open' event.
That said, the docs actually recommend against doing this. Instead, you should probably handle one chunk at a time if you want to stream the file:
var rs = fs.createReadStream('test.txt');
rs.on('data', function (chunk) {
  console.log(chunk);
});
If you want to read just a specific portion of a file, you may want to look at fs.open() and fs.read() which are lower level.

Output breaking in child Process by nodejs

I connected a Wavecom GSM modem on Ubuntu. I use Node.js to communicate with the GSM modem.
I send commands to the modem via a child process. Here is an example:
var spawn = require('child_process').spawn,
    exec = require('child_process').exec;

// Write the dev_ttyUSB15.tmp file
var child = exec('cat < /dev/ttyUSB15 > /tmp/dev_ttyUSB15.tmp');

// Read the dev_ttyUSB15.tmp file
var m1 = spawn('tail', ['-f', '/tmp/dev_ttyUSB15.tmp']);

// the 'data' event is emitted when dev_ttyUSB15.tmp has some data
m1.stdout.on('data', function (data) {
  console.log("Data : " + data); // this is executed as output
});
Now when I fire a command on port /dev/ttyUSB15, I do not get the output properly.
E.g.
Suppose my output should be
Data : abcd1234
but instead of it I got
Data : abc
Data : d1234
In short, my output gets broken up.
I cannot predict where exactly my output will break; it's random.
Can anyone give me any idea?
Thanks in advance.
As with all streams in Node.js, reading data involves 2 separate events: data and end.
The data event is fired when some data is readable in the stream (in your case, twice).
The end event is fired when no more data events will be fired.
var blob = "";

m1.stdout.on('data', function (data) {
  blob += data;
});

m1.stdout.on('end', function () {
  console.log("Data : " + blob); // here you have all the data within one variable
});
It's hard to say without knowing what protocol you're speaking with the modem, but if it's e.g. \n-delimited, you will have to buffer the data and split on \n:
var buffer = '';

m1.stdout.on('data', function (data) {
  var received = (buffer + data).split('\n');
  buffer = received.pop().trim();
  console.log(received.join(''));
});
