Node.js: How to read data from a stream using .read()

Scenario: The user selects an mp4 file from their mobile on a static page hosted by a node.js express server on the same network, and the file stream is received by busboy on the same server. Now the file stream has to be read in parts/segments/buffers and sent on through a websocket to be appended to the MediaSource buffer on the other screen.
Question: How to read x numbers of bytes of data from a paused readable stream?
Code (not working):
function ReadData(stream, BytesToRead) {
    stream.resume(); // bug: resume() switches to flowing mode, where data
                     // is pushed via 'data' events rather than buffered
    var data = stream.read(BytesToRead); // so read() returns null here;
                                         // read(n) only works in paused mode
    stream.pause();
    return data;
}
Kindly help!
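For reference, read(n) only returns data in paused mode, and only once at least n bytes are buffered. A minimal sketch of pulling fixed-size segments off the file stream from the 'readable' event; the 4096-byte size and the ws.send() call are illustrative assumptions, not from the question:

var CHUNK_SIZE = 4096; // illustrative segment size

stream.on('readable', function () {
    var chunk;
    // read() returns null while fewer than CHUNK_SIZE bytes are buffered;
    // 'readable' fires again once more data arrives (at end of stream,
    // the final partial chunk is returned).
    while ((chunk = stream.read(CHUNK_SIZE)) !== null) {
        ws.send(chunk); // forward each segment over the websocket
    }
});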

Related

NodeJS - Live Stream Audio to Specific URL with mp3 audio "chunks"

I'm working on developing an application that will capture audio from the browser in 5 second "chunks" (these are full audio files and not simply partial files), send these 5 second chunks to the server, convert each one from webm to mp3 on the server, and then broadcast the mp3 file to clients connected via a websocket or a static URL.
I've successfully managed to do parts 1 and 2; however, I'm not quite sure of the best approach to transmit this created mp3 audio file to the user. My thinking was to generate a single URL for clients to listen to, e.g. http://localhost/livestream.mp3 (a live stream URL that would automatically update itself with the latest audio data), or to emit the audio files to the clients over a websocket and attempt to play these sequenced audio files seamlessly, without any noticeable gaps between them as they switch out.
Here's a snippet of my [typescript] code where I create the mp3 file. I've pointed out the area in which I would perform the write stream, and from there I would expect to pipe this to users when they make an HTTP request.
private createAudioFile(audioObj: StreamObject, socket: SocketIO.Socket): void {
    const directory: string = `${__dirname}/streams/live`;
    fs.writeFile(`${directory}/audio_${audioObj.time}.webm`, audioObj.stream, (err: NodeJS.ErrnoException) => {
        if (err) logger.default.info(err.toString());
        try {
            const process: childprocess.ChildProcess = childprocess.spawn(
                'ffmpeg',
                ['-i', `${directory}/audio_${audioObj.time}.webm`, `${directory}/audio_${audioObj.time}.mp3`]
            );
            process.on('exit', () => {
                // Ideally, this is where I would be broadcasting the audio from
                // the static URL by adding the new stream data to it, or by
                // emitting it out to all clients connected to my websocket
                // const wso = fs.createWriteStream(`${directory}/live.mp3`);
                // const rso = fs.createReadStream(`${directory}/audio_${audioObj.time}.mp3`);
                // rso.pipe(wso);
                if (audioObj.last === true) {
                    this.archiveAudio(directory, audioObj.streamName);
                }
            });
        } catch (e) {
            logger.default.error('CRITICAL ERROR: Exception occurred when converting file to mp3:');
            logger.default.error(e);
        }
    });
}
I've seen a number of questions that ask about a similar concept, but none quite with the final goal I'm looking for. Is there a way to make this work?
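One approach that could work: serve the live URL as a never-ending chunked audio/mpeg response and write each newly transcoded segment's bytes to every open response. This is a plain-JavaScript sketch, not the poster's code; the route, port, and broadcastSegment helper are illustrative names, and truly gapless playback still depends on the segments being cleanly cut:

const express = require('express');
const fs = require('fs');

const app = express();
const listeners = []; // open responses for the live URL

app.get('/livestream.mp3', (req, res) => {
    res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
    listeners.push(res);
    req.on('close', () => listeners.splice(listeners.indexOf(res), 1));
});

// Hypothetical helper: call this from the ffmpeg 'exit' handler with
// the path of the segment that was just transcoded.
function broadcastSegment(mp3Path) {
    fs.createReadStream(mp3Path)
        .on('data', (chunk) => listeners.forEach((res) => res.write(chunk)));
}

app.listen(3000);

Clients joining mid-stream would simply start hearing audio from the next segment boundary onwards.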

HTML5 WebM streaming using chunks from FFMPEG via Socket.IO

I'm trying to make use of websockets to livestream chunks from a WebM stream. The following is some example code on the server side that I have pieced together:
// Assumes fluent-ffmpeg and an existing socket.io instance named `io`.
const ffmpeg = require('fluent-ffmpeg')

const command = ffmpeg()
    .input('/dev/video0')
    .fps(24)
    .audioCodec('libvorbis')
    .videoCodec('libvpx')
    .outputFormat('webm')

const ffstream = command.pipe()
ffstream.on('data', chunk => {
    io.sockets.emit('Webcam', chunk)
})
I have the server code structured in this manner so ffstream.on('data', ...) can also write to a file. I am able to open the file and view the video locally, but have difficulty using the chunks to render in a <video> tag in the DOM.
const ms = new MediaSource()
const video = document.querySelector('#video')
video.src = window.URL.createObjectURL(ms)

ms.addEventListener('sourceopen', function () {
    const sourceBuffer = ms.addSourceBuffer('video/webm; codecs="vorbis,vp8"')
    // read socket
    // ...sourceBuffer.appendBuffer(data)
})
I have something like the above on my client side. I am able to receive the exact same chunks from my server, but sourceBuffer.appendBuffer(data) throws the following error:
Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer has been removed from the parent media source.
Question: How can I display these chunks in an HTML5 video tag?
Note: From my reading, I believe this has to do with getting key-frames, but I'm not able to determine how to recognize them.
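On the key-frame point: in WebM, decodable append points line up with Cluster boundaries, and a Cluster element begins with the EBML ID bytes 0x1F 0x43 0xB6 0x75. A minimal sketch for spotting that marker in incoming chunks; it assumes the marker lands at the start of a chunk, which the muxer does not guarantee, and that the init segment has already been appended:

// Detect the start of a WebM Cluster so appending can begin at a
// decodable point rather than mid-cluster.
function isClusterStart(chunk) {
    const bytes = new Uint8Array(chunk)
    return bytes.length >= 4 &&
        bytes[0] === 0x1f && bytes[1] === 0x43 &&
        bytes[2] === 0xb6 && bytes[3] === 0x75
}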

Streaming video with socket io

I am having some difficulty streaming a video file with socket.io and node. My video file is on my server, and I am using the fs module to read it into a readStream. I am then passing chunks of data to a MediaSource on the client side, which feeds into an HTML5 video tag.
Although the client is receiving the chunks (I'm logging them), and I am appending them to the MediaSource's buffer, nothing shows up in the video tag.
Anyone know how to fix this?
Here's my code:
Client side:
var mediaSource = new MediaSource();
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
document.getElementById('video').src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function (event) {
    var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
    console.log(sourceBuffer);
    socket.on('chunk', function (data) {
        if (!sourceBuffer.updating) {
            sourceBuffer.appendBuffer(data);
            console.log(data);
        }
    });
    socket.emit('go', {});
});
Server side:
var fs = require('fs');
// Note: the original used window.currentvidpath here, but window does not
// exist in Node; the path has to come from a server-side variable.
var stream = fs.createReadStream(currentVidPath);

socket.on('go', function () {
    console.log('WENT');
    stream.addListener('data', function (data) {
        console.log('VIDDATA', data);
        socket.emit('chunk', data);
    });
});
Thanks a lot.
The problem is that you only append to the source buffer when it is not updating:
if (!sourceBuffer.updating) {
    sourceBuffer.appendBuffer(data);
    console.log(data);
}
Here's my console after I added an else branch and logged the times it doesn't append:
SourceBuffer {mode: "segments", updating: false, buffered: TimeRanges, timestampOffset: 0, appendWindowStart: 0…}
site.html:24 connect
site.html:17 ArrayBuffer {}
30 site.html:20 not appending
So it appended one chunk of the video and ignored 30.
You should store the chunks that aren't appended in an array, then drain that array with a setInterval loop, as in the sketch below.
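A minimal sketch of that fix, reusing the question's socket and sourceBuffer; the 50 ms interval is an arbitrary choice:

var queue = [];

socket.on('chunk', function (data) {
    queue.push(data); // buffer every chunk instead of dropping it
});

// Drain the queue whenever the SourceBuffer is idle.
setInterval(function () {
    if (queue.length > 0 && !sourceBuffer.updating) {
        sourceBuffer.appendBuffer(queue.shift());
    }
}, 50);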

Record Internet Audio Stream in NodeJS

I have an internet audio stream that's constantly being broadcast (accessible via an HTTP URL), and I want to somehow record it with NodeJS and write files consisting of one-minute segments.
Every module or article I find on the subject is all about streaming from NodeJS to the browser. I just want to open the stream and record it (time block by time block) to files.
Any ideas?
I think the project at https://github.com/TooTallNate/node-icy makes this easy. Just do what you need with the res object; in this example it is decoded and sent to the audio system:
var icy = require('icy');
var lame = require('lame');
var Speaker = require('speaker');

// URL to a known ICY stream
var url = 'http://firewall.pulsradio.com';

// connect to the remote stream
icy.get(url, function (res) {
    // log the HTTP response headers
    console.error(res.headers);

    // log any "metadata" events that happen
    res.on('metadata', function (metadata) {
        var parsed = icy.parse(metadata);
        console.error(parsed);
    });

    // Let's play the music (assuming MP3 data).
    // lame decodes and Speaker sends to speakers!
    res.pipe(new lame.Decoder())
       .pipe(new Speaker());
});
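To record to files instead of playing the audio, the same res stream can be redirected into a write stream that rotates every sixty seconds. A sketch under those assumptions; the file naming is illustrative, and segments cut mid-frame will not be perfectly clean MP3 files:

var icy = require('icy');
var fs = require('fs');

icy.get('http://firewall.pulsradio.com', function (res) {
    var out = null;

    // Close the current segment (if any) and open the next one.
    function rotate() {
        if (out) out.end();
        out = fs.createWriteStream('segment_' + Date.now() + '.mp3');
    }

    rotate();
    setInterval(rotate, 60 * 1000);
    res.on('data', function (chunk) {
        out.write(chunk);
    });
});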

How to bridge byte array and audio streaming?

I'm creating a relay server for my streaming app. Basically, it should work like this:
Client A streams microphone audio to the server through sockets
The server gets the stream and maybe stores it somewhere temporarily? (not sure)
Client B gets the stream from the server and plays it.
Basically, I have the 1st part done (sending mic audio to the server):
while (isStreaming)
{
    minBufSize = recorder.read(buffer, 0, buffer.length);
    mSocket.emit("stream", Arrays.toString(buffer));
}
And the 3rd part done, simply playing the audio:
mediaplayer.reset();
mediaplayer.setDataSource("http://192.168.1.2:1337/stream");
mediaplayer.prepare();
mediaplayer.start();
Now I'm not sure how to bridge the incoming byte array and the streaming. Here is my current server code:
var ms = require('mediaserver');

// from server to Client B
exports.letsStream = function (req, res, next) {
    ms.pipe(req, res, "sample_song_music_file.mp3");
};

// from Client A to server
exports.handleSocketConnection = function (socket) {
    console.log("connected");
    socket.on('stream', function (data) {
        var bytes = JSON.parse(data);
        console.log("GETTING STREAM:" + bytes);
    });
};
Any suggestions? How can I directly stream that byte array?
The mediaserver module only supports streaming existing audio files, rather than a "live" stream, so this won't work.
One way to achieve the three tasks would be:
https://www.npmjs.com/package/microphone to read the microphone audio as a byte stream.
http://binaryjs.com/ to handle transmitting the byte stream over websockets to the server and then on to the client. Set up two separate paths, one for sending the data and one for receiving, and pipe the data from one stream to the other (see the sketch after this list).
https://github.com/TooTallNate/node-speaker to play the incoming PCM data stream on Client B.
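A rough server-side sketch of that middle step with binaryjs. The role metadata is an illustrative convention for telling senders apart from listeners, not part of the binaryjs API:

var BinaryServer = require('binaryjs').BinaryServer;
var server = new BinaryServer({ port: 9000 });

var listeners = []; // streams opened by Client B connections

server.on('connection', function (client) {
    client.on('stream', function (stream, meta) {
        if (meta && meta.role === 'listener') {
            listeners.push(stream); // Client B: remember its stream
        } else {
            // Client A's microphone bytes: fan them out to every listener.
            stream.on('data', function (chunk) {
                listeners.forEach(function (out) {
                    out.write(chunk);
                });
            });
        }
    });
});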
