I have an application where Wasm can handle the video & audio codecs, which can be passed to Media Source Extensions for rendering purposes. To get RTSP & RTP stream data into the browser, I need an RTSP Wasm module which can consume RTSP streams.
Is there any way we can achieve it?
WebAssembly itself does not provide any APIs for interacting with the browser or OS. All I/O must be handled via imports and exports. So you would need to implement JavaScript functions that send and receive the data and pass it in and out of your WebAssembly module.
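For illustration, a minimal sketch of that wiring, assuming a WebSocket proxy that relays the raw RTSP/RTP bytes (browsers cannot open RTSP/TCP sockets directly), and a Wasm module that exports alloc and demux_packet and imports on_frame (all of these names are hypothetical placeholders, not a real module's API):

```js
let wasm; // filled in once the module is instantiated

const ws = new WebSocket('wss://example.com/rtsp-proxy'); // placeholder proxy URL
ws.binaryType = 'arraybuffer';

WebAssembly.instantiateStreaming(fetch('rtsp_demuxer.wasm'), {
  env: {
    // The module calls this import to hand a parsed frame back to JS,
    // e.g. so it can be appended to a Media Source Extensions SourceBuffer.
    on_frame(ptr, len) {
      const frame = new Uint8Array(wasm.memory.buffer, ptr, len).slice();
      // ...sourceBuffer.appendBuffer(frame) once the SourceBuffer is ready...
    },
  },
}).then(({ instance }) => {
  wasm = instance.exports;
  ws.onmessage = (event) => {
    // Copy each received packet into Wasm linear memory and let the
    // module parse it via its exported entry point.
    const packet = new Uint8Array(event.data);
    const ptr = wasm.alloc(packet.length);
    new Uint8Array(wasm.memory.buffer).set(packet, ptr);
    wasm.demux_packet(ptr, packet.length);
  };
});
```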
I'm using Node.js and React (both with TypeScript) and I'm trying to get the video from an IP camera through RTSP.
I tried using node-rtsp-stream, but this library uses jsmpeg as a consumer on the front-end, and it does not work with React & TypeScript.
I need some recommendations or other solutions.
I have an idea: we can convert the RTSP stream to an HTTP-based format (like HLS) first; then many video players and JS libraries are able to play it.
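For example, a minimal Node.js sketch of that idea, spawning ffmpeg to repackage the RTSP feed as a rolling HLS playlist (the camera URL and output path are placeholders; ffmpeg must be installed and on the PATH):

```js
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', [
  '-rtsp_transport', 'tcp',           // TCP is usually more reliable than UDP here
  '-i', 'rtsp://camera.local/stream', // placeholder camera URL
  '-c:v', 'copy',                     // no re-encode if the source is already H.264
  '-c:a', 'aac',
  '-f', 'hls',
  '-hls_time', '2',                   // short segments keep latency down
  '-hls_list_size', '5',
  '-hls_flags', 'delete_segments',    // rolling playlist for a live stream
  'public/live/stream.m3u8',
]);

ffmpeg.stderr.on('data', (d) => process.stderr.write(d)); // ffmpeg logs to stderr
```

The public/live directory can then be served statically (e.g. with express.static), and the playlist played with hls.js in the React front-end, or natively in Safari.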
I have an interesting problem; in short: how to share information between threads in Node.js (12+).
The tech stack, in short:
A remote/online streaming server that produces an MP4 live stream
A client application that only consumes the live view through RTSP over HTTP
A small Node.js based application to get the MP4, transform it, and pipe it back to the client.
The modules I use:
NodeJS 12+
Request/fetch/https module
Express module
Stream module
The story:
I have an application that has a gateway/relay role between two different systems. One provides a live media stream (a simple MP4 (h264) stream) and the other one is supposed to consume it as RTSP over HTTP. The weird part is that the consumer client does not behave like any other player (such as VLC or a web player): sometimes, seemingly randomly, it resends the request, and sometimes it closes the current request and resends it. So a direct pipe does not really work for this use-case.
I made a worker (from worker_threads) that holds a readable stream object, and when the client hits the request, I start populating the MP4 stream into the readable object in the worker, so even if the request gets closed or resent, it will not break the live media stream consumption.
And whenever the client connects, I would just like to pipe the readable object to it.
Originally, I thought a simple pipe from request/fetch/http.get or FFMPEG would be enough, but the client can repeat the call anywhere between 3 seconds and 2 minutes apart.
So, my question is: what would be the best solution to pass the data back from the worker to the main thread and make it reachable from the HTTP routing?
I had some ideas, like:
I know I can have my own channel between the threads and pass information back and forth, but waiting for a message and keeping the process alive blocks the app, as far as I know (worker.on('message', (stuff) => {});) - see the sketch after this list.
Using Socket.io to pass the data back from the worker, populate the readable in the main thread, and pipe the readable at the HTTP level (basically a fake shared object)
Creating a secondary HTTP server that offers the media stream, then just relaying that into the response (e.g. gatewaying/proxying)
Looking up some proxy solution where I can simply redirect and reshape things, like transforming the input MP4 into an RTSP stream and piping it to the consumer response
Should I just "remember" the active stream, and if it is still being streamed by the remote server, always just use the same URL, pass it to FFMPEG, and continue piping to the res?
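For clarity, a rough sketch of the first idea; note that worker.on('message', ...) only registers an event listener, so as far as I can tell it does not actually block the main thread's event loop. The upstream URL and file names are placeholders:

```js
// main.js - the worker owns the upstream connection and posts chunks
// back; the main thread fans them out to every open HTTP response.
const { Worker } = require('worker_threads');
const express = require('express');

const app = express();
const clients = new Set();

const worker = new Worker('./stream-worker.js');
worker.on('message', (chunk) => {
  // Event-driven, non-blocking: runs whenever the worker posts a chunk.
  for (const res of clients) res.write(Buffer.from(chunk));
});

app.get('/live', (req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/mp4', 'Connection': 'keep-alive' });
  clients.add(res);
  // If the picky client closes or resends the request, only this
  // response is dropped; the worker keeps consuming the upstream.
  req.on('close', () => clients.delete(res));
});

app.listen(8080);
```

```js
// stream-worker.js - keeps the live MP4 flowing no matter what the
// client does, forwarding every chunk to the main thread.
const { parentPort } = require('worker_threads');
const https = require('https');

https.get('https://upstream.example/live.mp4', (res) => { // placeholder URL
  res.on('data', (chunk) => parentPort.postMessage(chunk));
});
```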
Note:
I set up all the headers to keep the connection alive, but it seems the client software behaves the same regardless.
By default it uses RTSP and RTP/TCP to consume the video stream, but it has an option for RTSP over HTTP.
Probably I am overlooking some trivial step for serving RTSP video from a remote live MP4, but I did not find any good example or source anywhere (basically the same 3 articles re-shared everywhere).
I did not find any similar question or article anywhere (but I did check out nodejs ffmpeg play video at specific time and stream it to client).
I'm using socket.io-stream to share a file over a socket from the server to the browser. I'd like to use the same approach to share an audio stream from the browser to the server. Is it possible? I know that a browser audio stream is different from a Node.js stream, so I need to convert it, but how?
Not 100% sure what you're expecting to do with the data, but this answer may be of use to you.
Specifically, I'd suggest you use getUserMedia to get your audio, hook it up to a Script Processor, convert the data, and emit those data chunks to socket.io. Then on your server, you can capture those chunks and write them to your node.js stream. Code samples are at the link; they're fairly lengthy and I don't want to spam, so I won't reproduce them here.
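For orientation only, a condensed sketch of that pipeline, not the linked samples: the 'audio-chunk' event name and the file sink are assumptions of mine, and io is assumed to be your socket.io server instance. ScriptProcessorNode is deprecated in favour of AudioWorklet, but it matches the linked answer and still works in current browsers.

```js
// Browser side: capture the mic, tap the samples with a
// ScriptProcessorNode, and emit each buffer as a binary chunk.
const socket = io();

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const processor = ctx.createScriptProcessor(4096, 1, 1);

  processor.onaudioprocess = (e) => {
    // Float32 PCM samples in [-1, 1]; copy before sending, since the
    // underlying buffer is reused between callbacks.
    const samples = e.inputBuffer.getChannelData(0);
    socket.emit('audio-chunk', samples.slice().buffer);
  };

  source.connect(processor);
  processor.connect(ctx.destination); // some browsers need this for onaudioprocess to fire
});
```

```js
// Server side: push each chunk into a Node.js Readable stream, which
// can then be piped anywhere a normal stream goes (file, ffmpeg, ...).
const fs = require('fs');
const { Readable } = require('stream');

io.on('connection', (socket) => {
  const audio = new Readable({ read() {} }); // push-mode readable
  audio.pipe(fs.createWriteStream('capture.raw')); // placeholder sink: raw Float32 PCM
  socket.on('audio-chunk', (chunk) => audio.push(Buffer.from(chunk)));
  socket.on('disconnect', () => audio.push(null)); // end the stream
});
```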
I need to be able to take the input video from a client side camera and convert it to an RTMP stream to be viewed live by other clients as part of a website application I am building. What libraries exist that could help facilitate this?
I have an RTSP server which is re-streaming the A/V stream from the camera to clients.
On the client side we are using Media Foundation (MF) to play the stream.
I can successfully play the video, but I am not able to play the audio from the server. However, when I use VLC to play it, it can play both A/V.
Currently I am implementing IMFMediaStream and have created my customized media stream. I have also created a separate IMFStreamDescriptor for audio and added all the required attributes. When I run it, everything goes fine, but my RequestSample method never gets called.
Please let me know if I am doing it wrong or if there is any other way to play the audio in MF.
Thanks,
Prateek
Media Foundation support for RTSP is limited to a small number of payload formats. VLC supports more (AFAIR through the Live555 library). Most likely, your payload is not supported in Media Foundation.