Node.js Video Stream WEBM Live Feed to HTML

I have a Node.js server that's receiving small packets of WebM blob binary data through socket.io from a webpage.
(navigator.mediaDevices.getUserMedia -> stream -> mediaRecorder.ondataavailable -> DATA. I'm sending that DATA back to the server, so it includes a timestamp and the binary data.)
How do I stream those back over an HTTP request, as a never-ending live stream that an HTML page can consume simply by putting the URL in a video tag?
Like this:
<video src=".../video" autoplay></video>
I want to create a live video stream and basically stream my webcam back to an HTML page, but I'm a bit lost on how to do that. Please help. Thanks.
Edit: I'm using Express.js to serve the app.
I'm just not sure what I need to do on the server with the incoming WebM binary blobs to serve them properly so an HTML page can consume them from an endpoint /video.
Please help :)

After many failed attempts I was finally able to build what I was trying to:
Live video streaming through socket.io.
So what I was doing was (sketched in code after the list):
Start getUserMedia to turn on the webcam
Start a MediaRecorder set to record in intervals of 100 ms
On each available chunk, emit an event through socket.io to the server with the blob converted to a base64 string
The server sends the base64-encoded 100 ms video chunk back to all connected sockets
The webpage receives the chunk and uses MediaSource and SourceBuffer to append it to the buffer
Attach the MediaSource to a video element and voila :) the video plays smoothly, as long as you append every chunk in order and don't skip any (in which case it stops playing)
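For reference, here is a minimal sketch of that pipeline. The event name 'video-chunk', the codec string and the variable names are my own choices, not from the original setup, and the chunks are sent as ArrayBuffers here rather than base64 strings for brevity:

    // Capture page: record 100 ms WebM chunks and ship them over socket.io
    const socket = io(); // socket.io client loaded via <script>
    async function startCapture() {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8' });
      recorder.ondataavailable = async (e) => {
        // e.data is a Blob; the original approach base64-encodes it first
        socket.emit('video-chunk', await e.data.arrayBuffer());
      };
      recorder.start(100); // fire dataavailable every 100 ms
    }
    startCapture();

    // Server (Express + socket.io): relay every chunk to all other connected sockets
    io.on('connection', (socket) => {
      socket.on('video-chunk', (chunk) => socket.broadcast.emit('video-chunk', chunk));
    });

    // Viewer page: append chunks to a SourceBuffer in arrival order (const socket = io() here too)
    const video = document.querySelector('video');
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', () => {
      const sourceBuffer = mediaSource.addSourceBuffer('video/webm;codecs=vp8');
      const queue = [];
      const appendNext = () => {
        if (queue.length && !sourceBuffer.updating) sourceBuffer.appendBuffer(queue.shift());
      };
      sourceBuffer.addEventListener('updateend', appendNext);
      socket.on('video-chunk', (chunk) => {
        queue.push(new Uint8Array(chunk));
        appendNext();
      });
    });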
And IT WORKED! BUT it was unusable... :(
The problem is that the MediaRecorder process is CPU intensive: the page's CPU usage jumped to 15% and the whole pipeline was TOO SLOW.
There was about 2.5 seconds of latency on the video stream passing through socket.io, and virtually the same EVEN if I DIDN'T send the blobs through socket.io but rendered them on the same page.
So I found out this works, but it DOESN'T work for a sustainable video chat service. It's just not designed for that. MediaRecorder is fine for recording a webcam video to play back later, but not for live streaming.
I guess for live streaming there's no way around WebRTC: you MUST use WebRTC to send the video stream either to a peer or to a server that forwards it to other peers. DO NOT TRY to build a live video chat service with MediaRecorder. You're only going to waste your time. I did that for you :) so you don't have to. Just look into WebRTC. You may have to use a TURN server. Twilio provides STUN and TURN servers, but that costs money. You can also run your own TURN server with Coturn and other services, but I have yet to look into that.
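For anyone going down that road, the WebRTC side looks roughly like this (the signaling exchange over socket.io is omitted, the STUN URL is just a common public example, and a TURN server may still be needed behind strict NATs):

    // Sender page: push the webcam into an RTCPeerConnection instead of a MediaRecorder
    const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });
    async function startCall() {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      stream.getTracks().forEach((track) => pc.addTrack(track, stream));
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      // send pc.localDescription to the other side over your signaling channel,
      // apply its answer with pc.setRemoteDescription, and exchange ICE candidates
    }
    startCall();

    // Receiver page: attach the incoming track straight to a video element
    pc.ontrack = (e) => { document.querySelector('video').srcObject = e.streams[0]; };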
Thanks. Hope that helps someone.

Related

Should a long-running video processing task be done client-side or server-side?

I was creating an application in React for uploading a video, using a REST API to send it to the server and store it in S3. I also want a simple audio-only version of the video for some other tasks, and I am confused about which might be the better approach:
Create the audio file on the fly when it is needed using the node-ffmpeg package, and don't store it anywhere
Convert the video file to audio in the browser client only, and post that to the server for storage along with the video
Just post the video to the server and use a queue system to create a new task for the video-to-audio conversion, then save the result to S3 storage (see the sketch below)
The second method seems to save some compute power on the server, but it might be a problem if the video upload completes while the audio conversion is still going on and the client disconnects.
Would appreciate some help, thanks.
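For what it's worth, a rough sketch of the third option, assuming ffmpeg is installed on the server; the queue wiring, file paths and function names here are placeholders, not part of the question:

    // audio-worker.js - hypothetical queue consumer that extracts the audio track with ffmpeg
    const { spawn } = require('child_process');

    // Called by whatever queue system is in use (Bull, SQS, ...) with a local copy of the video
    function extractAudio(videoPath, audioPath) {
      return new Promise((resolve, reject) => {
        // -vn drops the video stream; the audio is re-encoded to MP3
        const ffmpeg = spawn('ffmpeg', ['-i', videoPath, '-vn', '-acodec', 'libmp3lame', audioPath]);
        ffmpeg.on('error', reject);
        ffmpeg.on('close', (code) =>
          code === 0 ? resolve(audioPath) : reject(new Error('ffmpeg exited with code ' + code)));
      });
    }

    // Inside the job handler: download the video from S3, convert, upload the MP3, clean up
    // await extractAudio('/tmp/upload.mp4', '/tmp/upload.mp3');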

Question for Node multithreading, media consuming and piping to HTTP response

I have an interesting problem, in short: how to share information between threads in NodeJS (12+).
The tech stack, in short as well:
A remote/online streaming server that produces an MP4 live stream
A client application that only consumes the live view as RTSP over HTTP
A small Node.js-based application that gets the MP4, transforms it, and pipes it back to the client.
The modules I use:
NodeJS 12+
Request/fetch/https module
Express module
Stream module
The story:
I have an application that acts as a gateway/relay between two different systems. One provides a live media stream (a simple MP4 (H.264) stream) and the other is supposed to consume it as RTSP over HTTP. The weird part is that the consumer client does not behave like any other player (such as VLC or a web player): sometimes, seemingly at random, it resends the request, and sometimes it closes the current request and resends it. So a direct pipe does not really work for this use case.
I made a worker (from worker_threads) that holds a readable stream object, and when the client hits the request I start populating the MP4 stream into that readable object in the worker, so even if the request gets closed or resent, it does not break the live media stream consuming process.
And whenever the client connects, I would just like to pipe the readable object to it.
Originally I thought a simple pipe from request/fetch/http.get or FFmpeg would be enough, but the client may repeat the call anywhere between 3 seconds and 2 minutes later.
So my question is: what could be the best solution to pass the data back from the worker to the main thread and let it reach the HTTP routing?
I had some ideas, like:
I know I can have my own channel between the threads and pass information back and forth, but as far as I know, waiting for messages and keeping the process up blocks the app (worker.on('message', (stuff) => {});) (see the sketch after this question)
Use Socket.io to pass data back from the worker, populate the readable in the main thread, and pipe the readable at the HTTP level (basically a fake shared object)
Create a secondary HTTP server that offers the media stream, then just relay that into the response (i.e. gatewaying/proxying)
Look for a proxy solution where I can simply redirect and reshape things, e.g. transform the input MP4 into an RTSP stream and pipe it to the consumer response
Should I just 'remember' the active stream, and while it is being streamed by the remote server, always just use the same URL, pass it to FFmpeg and keep piping to the res?
Notes:
I set all the headers to keep the connection alive, but the client software seems to behave the same regardless.
By default it uses RTSP and RTP/TCP to consume the video stream, but it has an option for RTSP over HTTP.
I am probably overlooking some trivial step for serving RTSP video from a remote live MP4, but I have not found any good example or source anywhere (basically the same 3 articles get re-shared everywhere).
I did not find any similar question or article anywhere (but I did check out 'nodejs ffmpeg play video at specific time and stream it to client').
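For reference, here is a minimal sketch of the first idea under my own assumptions (the worker file name, the /live route and the upstream URL are placeholders). The worker's 'message' events are delivered on the event loop rather than blocking it, so the main thread can push each chunk into a PassThrough that the Express route pipes into the response:

    // main.js - fan chunks coming from the worker out to every open HTTP response
    const express = require('express');
    const { Worker } = require('worker_threads');
    const { PassThrough } = require('stream');

    const app = express();
    const clients = new Set();

    const worker = new Worker('./stream-worker.js'); // hypothetical worker file
    worker.on('message', (chunk) => {
      // chunk arrives as a Uint8Array; write it to every connected client stream
      for (const stream of clients) stream.write(chunk);
    });

    app.get('/live', (req, res) => {
      res.writeHead(200, { 'Content-Type': 'video/mp4', Connection: 'keep-alive' });
      const stream = new PassThrough();
      clients.add(stream);
      stream.pipe(res);
      req.on('close', () => { clients.delete(stream); stream.destroy(); });
    });

    app.listen(3000);

    // stream-worker.js - keep consuming the remote MP4 and post each chunk to the main thread
    const { parentPort } = require('worker_threads');
    const https = require('https');

    https.get('https://example.com/live.mp4', (upstream) => { // placeholder URL
      upstream.on('data', (chunk) => parentPort.postMessage(chunk));
    });

This only covers getting the bytes from the worker to the route handler; it does not address the MP4-to-RTSP transformation itself.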

Send browser audio stream over socket.io

I'm using socket.io-stream to share a file over a socket from server to browser. I'd like to use the same to share an audio stream from the browser to the server. Is it possible? I know that a browser audio stream is different from a Node.js stream, so I need to convert it. How?
Not 100% sure what you're expecting to do with the data, but this answer may be of use to you.
Specifically, I'd suggest you use getUserMedia to get your audio, hook it up to a Script Processor, convert the data, and emit those data chunks to socket.io. Then on your server, you can capture those chunks and write them to your node.js stream. Code samples are at the link; they're fairly lengthy and I don't want to spam, so I won't reproduce them here.
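A rough sketch of that approach, with my own event name and buffer size (note that ScriptProcessorNode is deprecated in favour of AudioWorklet, but it matches the linked answer):

    // Browser: capture microphone audio and emit raw PCM chunks over socket.io
    const socket = io();
    navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
      const ctx = new AudioContext();
      const source = ctx.createMediaStreamSource(stream);
      const processor = ctx.createScriptProcessor(4096, 1, 1); // 4096-sample mono buffers

      processor.onaudioprocess = (e) => {
        const pcm = new Float32Array(e.inputBuffer.getChannelData(0)); // copy the samples
        socket.emit('audio-chunk', pcm.buffer);
      };

      source.connect(processor);
      processor.connect(ctx.destination); // some browsers need this for onaudioprocess to fire
    });

    // Server: write each incoming chunk into a Node.js stream
    const { PassThrough } = require('stream');
    const audioStream = new PassThrough();
    io.on('connection', (socket) => {
      socket.on('audio-chunk', (chunk) => audioStream.write(Buffer.from(chunk)));
    });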

Accessing live video stream midway using websockets

I am using a combination of fragmented MP4 and WebSockets to stream a live video stream to the web browser, from where MSE takes over.
I have successfully fragmented it into the appropriate fMP4 format using ffmpeg and have checked the data using an mpeg4parser tool.
Utilising a WebSocket server, the incoming data is broadcast to all the browser clients connected via WebSocket. This works fine for both playback and live streaming (using an RTSP stream as the input).
The problem I am facing occurs when a client tries to access the stream midway, i.e. once the ffmpeg stream has already started. I have saved the init segment (ftyp + moov) elements in a queue buffer in the WebSocket server. This queue buffer sends that data to each new client on connection.
I believe this data is sent correctly, since the browser console does not throw the 'Media Source Element not found' error. Yet no video is streamed when it receives the broadcast moof/mdat pairs.
So a couple of questions I would like the answer to are:
1) I have observed that each moof element contains a sequence number in its mfhd child element. Does this always have to start from 1? That will naturally not be the case for a video stream accessed midway.
2) Is it possible to view the data in the browser's client.js? At present all I can see is that my mediaBuffer contains a bunch of [Object ArrayBuffer]. Can I print the binary data inside these buffers?
3) From the server side, the data seems to be sent in moof/mdat fragments, as each new piece of data arriving from the ffmpeg output at the WebSocket server begins with a moof element. This was noticed by printing the binary data to the console. Is there a similar way to view this data on the client side?
4) Does anyone have an idea of why this is happening? Some fragmented MP4 or ISO BMFF format detail that I am missing?
If any further detail is required for clarification please let me know, I will provide it.
Make sure your fragments include a base media decode time. Then set the video tag's currentTime to the time of the first fragment received.
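On the client, that could look roughly like this, assuming the usual MSE setup already exists (sourceBuffer and video are whatever you created earlier):

    // After the first fragment for a late joiner is appended, jump to its start time
    sourceBuffer.addEventListener('updateend', function seekToLive() {
      if (video.buffered.length > 0) {
        // buffered.start(0) reflects the base media decode time of the first appended moof
        video.currentTime = video.buffered.start(0);
        sourceBuffer.removeEventListener('updateend', seekToLive);
      }
    });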

Show local video stream in HTML5 video tag

I'm working on a system where we want to show a video stream from a video capture card in a browser. The browser connects to a remote server and fetches an HTML page that has a video in it. This video should be streamed from the client machine where the video capture card is connected.
On the client side we run Linux, and the capture card is registered as /dev/video0 by Video4Linux2. The browser on the client side is Chrome (chromium-browser). On the client side we also have a web server (lighttpd) that could be used for streaming.
I have looked into the getUserMedia API, but support for it seems to be poor right now. Other thoughts I have had are to use the local web server, or to set up a streaming server on the client side that streams the video source locally.
Any ideas on how to design this would be great input for me!
Thanks,
/Peter
Since Chrome does not yet support RT(S)P streaming for the <video> tag, you will have to use a plugin for this.
Given its availability, I would suggest using Flash and writing a simple SWF that finds the correct video source and displays it.
If needed you can use one of the many 'Recording Apps' available and strip out the recording part.
