Buffering a Float32Array to a client - node.js

This should be obvious, but for some reason I am not getting any result. I have already spent way too much time trying different ways to get this working, without success.
TLDR: A shorter way to ask this question could be: I know how to stream a sound from a file. How do I stream a buffer containing sound that was synthesized on the server instead?
This works:
client:
var stream = ss.createStream();
ss(socket).emit('get-file', stream, data.bufferSource);
var parts = [];
stream.on('data', function (chunk) {
  parts.push(chunk);
});
stream.on('end', function () {
  var blob = new Blob(parts, { type: "audio" });
  if (cb) {
    cb(blob);
  }
});
server (in the 'socket-connected' callback of socket.io)
var ss = require('socket.io-stream');
// ....
ss(socket).on('get-file', (stream: any, filename: any) => {
  console.log("get-file", filename);
  fs.createReadStream(filename).pipe(stream);
});
Now, the problem:
I want to alter this audio buffer and send the modified audio instead of just the file. I converted the ReadStream into a Float32Array and did some processing sample by sample. Now I want to send that modified Float32Array to the client.
In my view, I just need to replace the fs.createReadStream(filename) with (new Readable()).push(modifiedSoundBuffer). However, I get a TypeError: Invalid non-string/buffer chunk. Interestingly, if I convert this modifiedSoundBuffer into a Uint8Array, it doesn't yell at me, and the client gets a large array, which looks good; only all the array values are 0. I guess it's flooring all the values?
ss(socket).on('get-buffer', (stream: any, filename: any) => {
  let readable = (new Readable()).push(modifiedFloat32Array);
  readable.pipe(stream);
});
I am trying to use streams for two reasons: sound buffers are large, and I want to allow concurrent processing in the future.

What if you convert the Float32Array object to a Buffer before sending, like this: (new Readable()).push(Buffer.from(modifiedSoundBuffer))?
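Building on that comment: note that Buffer.from(float32Array) copies element values truncated to single bytes, so samples in the range [-1, 1] come out as zeros, which is the same flooring seen with the Uint8Array conversion. Wrapping the typed array's underlying ArrayBuffer keeps the raw float bytes instead. A minimal sketch of the handler under that assumption (modifiedFloat32Array is the processed audio from the question):

const { Readable } = require('stream');

ss(socket).on('get-buffer', (stream, filename) => {
  // Buffer.from(typedArray.buffer, ...) shares the raw IEEE 754 bytes,
  // unlike Buffer.from(typedArray), which truncates each sample to a byte.
  const buf = Buffer.from(modifiedFloat32Array.buffer,
                          modifiedFloat32Array.byteOffset,
                          modifiedFloat32Array.byteLength);

  const readable = new Readable({ read() {} }); // no-op _read; we push eagerly
  readable.push(buf);   // note: push() returns a boolean, so don't chain off it
  readable.push(null);  // signal end of stream
  readable.pipe(stream);
});

The client would then reinterpret the received bytes as a Float32Array before handing them to the audio API.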

Related

Basic streams issue: Difficulty sending a string to stdout

I'm just starting to learn about streams in Node. I have a string in memory, and I want to put it into a stream that applies a transformation and pipes it through to process.stdout. Here is my attempt:
var through = require('through');
var stream = through(function write(data) {
  this.push(data.toUpperCase());
});
stream.push('asdf');
stream.pipe(process.stdout);
stream.end();
It does not work. When I run the script via node on the CLI, nothing is sent to stdout and no errors are thrown. A few questions I have:
If you have a value in memory that you want to put into a stream, what is the best way to do it?
What is the difference between push and queue?
Does it matter if I call end() before or after calling pipe()?
Is end() equivalent to push(null)?
Thanks!
Just use the vanilla stream API
var Transform = require("stream").Transform;
// create a new Transform stream
var stream = new Transform({
decodeStrings: false,
encoding: "ascii"
});
// implement the _transform method
stream._transform = function _transform(str, enc, done) {
this.push(str.toUpperCase() + "\n";
done();
};
// connect to stdout
stream.pipe(process.stdout);
// write some stuff to the stream
stream.write("hello!");
stream.write("world!");
// output
// HELLO!
// WORLD!
Or you can build your own stream constructor. This is really the way the stream API is intended to be used
var Transform = require("stream").Transform;
function MyStream() {
// call Transform constructor with `this` context
// {decodeStrings: false} keeps data as `string` type instead of `Buffer`
// {encoding: "ascii"} sets the encoding for our strings
Transform.call(this, {decodeStrings: false, encoding: "ascii"});
// our function to do "work"
function _transform(str, encoding, done) {
this.push(str.toUpperCase() + "\n");
done();
}
// export our function
this._transform = _transform;
}
// extend the Transform.prototype to your constructor
MyStream.prototype = Object.create(Transform.prototype, {
constructor: {
value: MyStream
}
});
Now use it like this
// instantiate
var a = new MyStream();
// pipe to a destination
a.pipe(process.stdout);
// write data
a.write("hello!");
a.write("world!");
Output
HELLO!
WORLD!
Some other notes about .push vs .write.
.write(str) adds data to the writable buffer. It is meant to be called externally. If you think of a stream like a duplex file handle, it's just like fwrite, only buffered.
.push(str) adds data to the readable buffer. It is only intended to be called from within our stream.
.push(str) can be called many times. Watch what happens if we change our function to
function _transform(str, encoding, done) {
  this.push(str.toUpperCase());
  this.push(str.toUpperCase());
  this.push(str.toUpperCase() + "\n");
  done();
}
Output
HELLO!HELLO!HELLO!
WORLD!WORLD!WORLD!
First, you want to use write(), not push(). write() puts data into the stream; push() pushes data out of the stream. You only use push() when implementing your own Readable, Duplex, or Transform streams.
Second, you'll only want to write() data to the stream after you've set up the pipe() (or added some event listeners). If you write to a stream with nothing wired to the other end, the data you've written will be lost. As #naomik pointed out, this isn't true in general, since a Writable stream will buffer write()s. In your example you do need to write() after pipe(), though; otherwise, the process will end before anything is written to STDOUT. This is possibly due to how the through module is implemented, but I don't know that for sure.
So, with that in mind, you can make a couple simple changes to your example to get it to work:
var through = require('through');
var stream = through(function write(data) {
  this.push(data.toUpperCase());
});
stream.pipe(process.stdout);
stream.write('asdf');
stream.end();
Now, for your questions:
The easiest way to get data from memory into a writable stream is simply to write() it, just like we're doing with stream.write('asdf') in your example.
As far as I know, the stream doesn't have a queue() function; did you mean write()? Like I said above, write() is used to put data into a stream, and push() is used to push data out of the stream. Only call push() in your own stream implementations.
Only call end() after all your data has been written to your stream. end() basically says: "Ok, I'm done now. Please finish what you're doing and close the stream."
push(null) is pretty much equivalent to end(). That being said, don't call push(null) unless you're doing it inside your own stream implementation (as stated above). It's almost always more appropriate to call end().
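To make those last two points concrete, here is a small sketch of a hand-rolled Readable where push(null) is the right tool, since we're inside a stream implementation:

var Readable = require('stream').Readable;

var rs = new Readable();
rs._read = function () {
  this.push('one chunk\n'); // queue data for whoever is reading
  this.push(null);          // no more data: ends the readable side, like end() for writers
};

rs.pipe(process.stdout); // prints: one chunk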
Based on the examples for stream (http://nodejs.org/api/stream.html#stream_readable_pipe_destination_options)
and through (https://www.npmjs.org/package/through)
it doesn't look like you are using your stream correctly... What happens if you use write(...) instead of push(...)?

Performing piped operations on individual chunks (node-wav)

I'm new to node and I'm working on an audio stream server. I'm trying to process / transform the chunks of a stream as they come out of each pipe.
So, file = fs.createReadStream(path) (the file stream) is piped into wavy (which removes headers and outputs raw PCM), which is piped into waver (which adds a proper WAV header to the chunk), which is piped into spark (which outputs the chunk to the client).
The idea is that each file-stream chunk has its headers removed, if any (this only applies to the first chunk), and then, using the node-wav Writer, each chunk is endowed with headers and sent to the client. As I'm sure you guessed, this doesn't work.
The pipe operations into node-wav act on the entire file stream, not on the individual chunks. To confirm, I checked the output client-side, and it is effectively dropping the headers and re-adding them to the entire data stream.
From what I've read of the Node Stream docs it seems like what I'm trying to do should be possible, just not the way I'm doing it. I just can't pin down how to accomplish this.
Is it possible, and if so what am I missing?
Complete function:
processAudio = (path, spark) ->
  wavy = new wav.Reader()
  waver = new wav.Writer()
  file = fs.createReadStream(path)
  file.pipe(wavy).pipe(waver).pipe(spark)
I don't really know about WAVs and headers, but if you're "trying to process / transform the chunks of a stream as they come out of each pipe", you can use a Transform stream.
It lets you sit between two streams and modify the bytes flowing between them:
var util = require('util');
var Transform = require('stream').Transform;

util.inherits(Test, Transform);

function Test(options) {
  Transform.call(this, options);
}

Test.prototype._transform = function (chunk, encoding, cb) {
  // do something with chunk, then pass a modified chunk (or not)
  // to the downstream
  cb(null, chunk);
};
To observe the stream and potentially modify it, pipe it like this:
file.pipe(wavy).pipe(new Test()).pipe(waver).pipe(spark)

Playing PCM stream from Web Audio API on Node.js

I'm streaming recorded PCM audio from a browser with web audio api.
I'm streaming it with binaryJS (websocket connection) to a Node.js server, and I'm trying to play that stream on the server using the speaker npm module.
This is my client. The audio buffers are at first non-interleaved IEEE 32-bit linear PCM with a nominal range between -1 and +1. I take one of the two PCM channels to start off and stream it below.
var client = new BinaryClient('ws://localhost:9000');
var Stream = client.send();
recorder.onaudioprocess = function (AudioBuffer) {
  var leftChannel = AudioBuffer.inputBuffer.getChannelData(0);
  Stream.write(leftChannel);
}
Now I receive the data as a buffer and try writing it to a speaker object from the npm package.
var Speaker = require('speaker');
var speaker = new Speaker({
  channels: 1,        // 1 channel
  bitDepth: 32,       // 32-bit samples
  sampleRate: 48000,  // 48,000 Hz sample rate
  signed: true
});

server.on('connection', function (client) {
  client.on('stream', function (stream, meta) {
    stream.on('data', function (data) {
      speaker.write(leftchannel);
    });
  });
});
The result is a high pitch screech on my laptop's speakers, which is clearly not what's being recorded. It's not feedback either. I can confirm that the recording buffers on the client are valid since I tried writing them to a WAV file and it played back fine.
The docs for speaker and the docs for the AudioBuffer in question
I've been stumped on this for days. Can someone figure out what is wrong or perhaps offer a different approach?
Update with solution
First off, I was using the websocket API incorrectly. I updated above to use it correctly.
I needed to convert the audio buffers to an array buffer of integers. I chose Int16Array. Since the given audio buffer has a range between -1 and 1, it was as simple as multiplying by the range of the new array (-32768 to 32767).
recorder.onaudioprocess = function (AudioBuffer) {
  var left = AudioBuffer.inputBuffer.getChannelData(0);
  var l = left.length;
  var buf = new Int16Array(l);
  while (l--) {
    buf[l] = left[l] * 0x7FFF; // scale [-1, 1] floats to the signed 16-bit range
  }
  Stream.write(buf.buffer);
}
It looks like you're sending your stream through as the meta object.
According to the docs, BinaryClient.send takes a data object (the stream) and a meta object, in that order. The callback for the stream event receives the stream (as a BinaryStream object, not a Buffer) in the first parameter and the meta object in the second.
You're passing send() the string 'channel' as the stream and the Float32Array from getChannelData() as the meta object. Perhaps if you were to swap those two parameters (or just use client.send(leftChannel)) and then change the server code to pass stream to speaker.write instead of leftchannel (which should probably be renamed to meta, or dropped if you don't need it), it might work.
Note that since Float32Array isn't a stream or buffer object, BinaryJS might try to send it in one chunk. You may want to send leftChannel.buffer (the ArrayBuffer behind that object) instead.
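Untested, but here is a sketch of your server handler with those suggestions applied (everything else as in your question):

server.on('connection', function (client) {
  client.on('stream', function (stream, meta) {
    stream.on('data', function (data) {
      speaker.write(data); // write the received chunk, not the (misnamed) meta object
    });
  });
});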
Let me know if this works for you; I'm not able to test your exact setup right now.

Node.js Readable file stream not getting data

I'm attempting to create a Readable file stream that I can read individual bytes from. I'm using the code below.
var rs = fs.createReadStream(file).on('open', function () {
  var buff = rs.read(8); // read first 8 bytes
  console.log(buff);
});
Given that file is an existing file of at least 8 bytes, why am I getting 'null' as the output for this?
The 'open' event means that the stream has been initialized; it does not mean you can read from the stream yet. You have to listen for either the 'readable' or 'data' event.
var rs = fs.createReadStream(file);
rs.once('readable', function () {
  var buff = rs.read(8); // read first 8 bytes, only once
  console.log(buff.toString());
});
It looks like you're calling rs.read() yourself. With the streams interface, you're looking for the 'data' event, not the 'open' event.
That said, the docs actually recommend against doing this. Instead, you should probably handle a chunk at a time if you want to stream the file:
var rs = fs.createReadStream('test.txt');
rs.on('data', function (chunk) {
  console.log(chunk);
});
If you want to read just a specific portion of a file, you may want to look at fs.open() and fs.read() which are lower level.
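For example, a quick sketch with minimal error handling (file is the same path as in your question):

var fs = require('fs');

fs.open(file, 'r', function (err, fd) {
  if (err) throw err;
  var buff = Buffer.alloc(8);
  // read 8 bytes from position 0 of the file into buff
  fs.read(fd, buff, 0, 8, 0, function (err, bytesRead) {
    if (err) throw err;
    console.log(buff.slice(0, bytesRead));
    fs.close(fd, function () {});
  });
});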

createWriteStream vs writeFile?

What is the basic difference between these two operations ?
someReadStream.pipe(fs.createWriteStream('foo.png'));
vs
someReadStream.on('data', function(chunk) { blob += chunk } );
someReadStream.on('end', function() { fs.writeFile('foo.png', blob) });
When using the request library for scraping, I can save pics (png, bmp, etc.) only with the former method; with the latter, the file contains similar gibberish (binary) data, but the image doesn't render.
How are they different ?
When you are working with streams in node.js you should prefer to pipe them.
According to Node.js’s stream-event docs, data events emit either buffers (by default) or strings (if encoding was set).
When you are working with text streams you can use data events to concatenate chunks of string data together. Then you'll be able to work with your data as one string.
But when working with binary data it's not so simple, because you'll receive buffers. To concatenate buffers you use special methods like Buffer.concat. It's possible to use a similar approach for binary streams:
var buffers = [];
readstrm.on('data', function (chunk) {
  buffers.push(chunk);
});
readstrm.on('end', function () {
  fs.writeFile('foo.png', Buffer.concat(buffers), function (err) {
    if (err) throw err;
  });
});
You can notice when something goes wrong by checking the output file's size.
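For instance, a quick (hypothetical) sanity check:

fs.stat('foo.png', function (err, stats) {
  if (err) throw err;
  console.log('wrote ' + stats.size + ' bytes'); // compare against the expected size
});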
