How to display RTSP stream in Electron? - node.js

I have a video stream using the UDP protocol, accessible through either rtp://ipadd:port or udp://#:port. I have absolutely no control over the server, so I can't change it to serve the stream over a WebSocket or transcode it to a compatible format on-the-fly.
I want to display the stream in an Electron app, but the sources I have found through a Google search all tell me that accomplishing this requires an undesirably hacky solution using something like webchimera.js.
I have tried dropping the URL in a <video> tag as per this answer, but Electron says that the udp and rtp URL schemes are not recognized. I have also tried require('child_process').exec with a static build of ffplay, which works, but it displays the stream in a separate window, which is not what I want. ActiveX, NPAPI, and other plugin solutions are not an option because Electron does not support them.
Am I out of luck, or is there a solution that I haven't come upon yet?

Figured it out by looking at pages and pages of other people's code.
Apparently my initial understanding of WebSockets was incorrect - I would not need a server-side change to use WebSockets in my situation.
I had to transcode the stream to MPEG-1 from within Electron using an ffmpeg Node.js wrapper, which sends the video to an Express server instance, which then serves the video within a static web page rendered by jsmpeg. The static web page is then displayed as an iframe within the main Electron app page.
The resulting stream has considerably more visual artifacts than what one would see when playing the raw UDP stream with ffplay, and this approach probably introduces a lot of latency, but it works well enough for my needs.
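For anyone attempting the same pipeline, here is a minimal sketch of the transcoding leg, assuming ffmpeg is on the PATH and using the ws package; the source address, port, and bitrate are placeholders:

    // Sketch only: transcode the UDP source to MPEG-1 video in an MPEG-TS
    // container (the format jsmpeg decodes) and fan it out over a WebSocket.
    const { spawn } = require('child_process');
    const WebSocket = require('ws');

    const wss = new WebSocket.Server({ port: 9999 }); // placeholder port

    const ffmpeg = spawn('ffmpeg', [
      '-i', 'udp://127.0.0.1:1234', // placeholder source address
      '-f', 'mpegts',               // mux into MPEG-TS, which jsmpeg expects
      '-codec:v', 'mpeg1video',     // jsmpeg decodes MPEG-1 video
      '-b:v', '1000k',
      '-r', '30',
      '-'                           // write the muxed stream to stdout
    ]);

    // Relay each chunk to every connected jsmpeg player
    ffmpeg.stdout.on('data', (chunk) => {
      for (const client of wss.clients) {
        if (client.readyState === WebSocket.OPEN) client.send(chunk);
      }
    });

The static page then just instantiates new JSMpeg.Player('ws://localhost:9999', { canvas: someCanvas }) to decode and draw the stream.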

Related

How to broadcast my cam to my server and then to another RTMP server

Hello, I am looking for a way to forward my live stream from my server to another server, for example Facebook, via RTMP.
The structure would be something like:
My cam -> my server -> other RTMP server -> viewers
My intention is to capture the transmission and forward it to many RTMP servers so that it consumes the server's resources rather than the client's. I don't have much knowledge of video transmission, so if it is possible to do this via Node.js, that would be great. Thanks.
I have looked into SFUs and other possible approaches, but I want to have several alternatives and find the most suitable one to implement in production.
I never did it myself, so I can't recommend the best way to do it.
After some research, if you want to stay with Node.js, I personally recommend Mediasoup.
It is a powerful SFU developed in C++ which provides really good bindings for Node.js. All the heavy processing is done in C++, and the Node.js API spawns a child process in which the C++ mediasoup worker runs. You only have to care about the Node.js API, nothing else.
With mediasoup it should not be too difficult to get your stream onto the Node.js server.
After that, to transmit your stream to an RTMP server, it seems you can call ffmpeg in a child process to transfer it from your Node.js server to the RTMP server.
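As a rough, hypothetical sketch of that ffmpeg leg (the SDP file describing the RTP stream that mediasoup forwards, the encoder settings, and the RTMP URL are all placeholders):

    // Hypothetical sketch: read the RTP stream described by a local SDP file
    // and push it to an RTMP ingest endpoint.
    const { spawn } = require('child_process');

    const ffmpeg = spawn('ffmpeg', [
      '-protocol_whitelist', 'file,udp,rtp',
      '-i', 'stream.sdp',           // SDP describing the RTP stream from mediasoup
      '-c:v', 'libx264',            // RTMP services generally expect H.264 + AAC
      '-preset', 'veryfast',
      '-c:a', 'aac',
      '-f', 'flv',                  // RTMP carries an FLV-muxed stream
      'rtmp://live.example.com/app/STREAM_KEY'
    ]);

    ffmpeg.stderr.on('data', (d) => process.stderr.write(d)); // ffmpeg logs on stderr
    ffmpeg.on('exit', (code) => console.log('ffmpeg exited with code', code));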
I found two GitHub projects with this kind of approach.
The first one is a bit outdated, using an old mediasoup version, but maybe you can find something interesting in it, especially for the client/browser part: it has an HTML file that should be helpful. Be aware that the Mediasoup API may have changed, both on the front end and the back end.
EDIT: The first project does not use the Mediasoup client library; you can look at it here
The second is more recent and really seems to match your need, though you may need some customization. But they don't provide any front-end part.
For mediasoup, you will find a lot of resources across the internet, GitHub, and YouTube for the client/server part.
If you want to look into it, here is the installation guide for Mediasoup v3 (the latest version). You have to install a specific Python version and set a few environment variables. After that you can install the npm package and happy coding!
It is easier to install on Linux, so if you are on Windows, preferably use WSL2 for testing. I don't know anything about Mac, but I know Docker is possible, so that should be fine too.
A much simpler option for streaming your webcam to other servers would be to use OBS Studio, but you must have already considered it.
It has a plugin that lets you send your stream to multiple platforms at once, which looks really cool! Here
Hope this gives you some more options!

An Electron RTSP player: How to play the video stream?

What seemed to be an easy job turned out otherwise. I wanted to write an Electron app to manage DVR streams as I wasn't satisfied with some apps I had used. I chose Electron because I recently started learning JS and took the opportunity to practice it and also play a bit with Electron.
After deciding how to handle the GUI using web components, it was time to see how to read RTSP streams. My initial approach was to use FFmpeg, but I didn't know how to do so in Node.js or Electron, so I started researching.
Long story short, I understood that if one wants to use a C/C++ library in JS, the best practice is to create bindings using Node-API (formerly N-API), which would result in an FFmpeg native addon. I then assumed a decent addon of this kind would already be available, since FFmpeg is the go-to tool for video work, but to my surprise this isn't the case. Although there are some packages that run the ffmpeg executable binary, the ones that provide actual bindings are rare and not recent (beamcoder, for example).
FFmpeg compiled to WebAssembly is another option I may consider, but it seemed overkill since I am not opening the streams in a pure browser.
Another approach was to use Chromium's media capabilities, since it has FFmpeg bundled for some media functions, but to my understanding it cannot open RTSP streams, at least for now.
Can you please add to my current understanding of the matter?

How to add video and voice calls to a Python application?

I want to add video and voice calls to my web application developed with Python.
I searched the internet and found that I can do this with WebRTC, but that work is done in JavaScript, and I don't know how to do it with Python.
I'm using Sanic as the web framework on Python 3.6.
On the other hand, is it possible to do this with socketio in Python?
I know this module is suitable for chat apps.
I appreciate your help.
There are several aspects involved in building a WebRTC application:
Serving the web pages and JavaScript code used by your web clients. You can either use plain static files or a server-side framework of your choice.
Providing a signaling channel which allows participants to exchange information about what media they support (audio, video, data channels) and how they can reach each other. Very often a WebSocket is used for this, but it's not the only possibility.
Taking part in the actual WebRTC media exchange. This really depends on your use case. If you are doing one-to-one audio/video, then the WebRTC endpoints are usually web browsers, but they could also be native applications. If you are building something like a voice-over-IP service, then most likely one endpoint is a browser and the other is a server such as Asterisk or FreeSWITCH.
In the event you actually want your users to communicate with a custom server written in Python (for instance, if you are doing audio/video processing with OpenCV), you can take a look at aiortc:
https://github.com/jlaine/aiortc
Sanic is just a web server that serves HTML and JavaScript. You could use any web server and it would not matter to WebRTC; the web server has no interaction with the WebRTC code in any way.
All the WebRTC code you need for video chat will be in a JavaScript file, and that code will be run by the browser (Firefox, Chrome, Opera, ...). What you need the server for is signaling between peers, and for that signaling process you can use socketio in Python.
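To make the division of labor concrete, here is a minimal browser-side sketch; the 'signal' event name and the #remote element are made up, and the Python server would simply relay these messages between the two peers:

    // Browser-side sketch: the server only forwards 'signal' messages.
    const socket = io(); // socket.io client served by the Python app
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
    });

    // Forward our ICE candidates to the other peer through the server
    pc.onicecandidate = ({ candidate }) => {
      if (candidate) socket.emit('signal', { candidate });
    };

    // Attach the remote stream to a <video id="remote"> element
    pc.ontrack = ({ streams }) => {
      document.querySelector('#remote').srcObject = streams[0];
    };

    async function call() {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      stream.getTracks().forEach((t) => pc.addTrack(t, stream));
      await pc.setLocalDescription(await pc.createOffer());
      socket.emit('signal', { sdp: pc.localDescription });
    }

    // Apply whatever the other peer sends back
    socket.on('signal', async (msg) => {
      if (msg.sdp) {
        await pc.setRemoteDescription(msg.sdp);
        if (msg.sdp.type === 'offer') {
          await pc.setLocalDescription(await pc.createAnswer());
          socket.emit('signal', { sdp: pc.localDescription });
        }
      } else if (msg.candidate) {
        await pc.addIceCandidate(msg.candidate);
      }
    });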
I would recommend learning more about WebRTC: https://codelabs.developers.google.com/codelabs/webrtc-web/#0

How do online radios live stream music, and are there resources available to build one with Node.js?

I'm a little curious about how live streaming web applications work. Recently I have wanted to build something like an online radio that can live stream to all clients: music, speech, etc. I'm quite familiar with Java Spring MVC and Node.js. If there are resources using these technologies, it would be really helpful for me to see how this works. Thanks in advance.
There are two good articles about it:
Streaming Audio on the Web with NodeJS
Using NodeJS to Stream a Radio Broadcast
You may also find this module helpful:
https://www.npmjs.com/package/websockets-streaming-audio
The best way to do this is to use Node.js as your source application and leave the actual serving of streams to existing servers. There is no reason to re-invent streaming on the web if you can get all the flexibility you need by writing the source end.
The flow will look like this:
Your Radio Source App --> Icecast (or similar) --> Listeners
Inside your app itself:
Raw audio sources --> Codecs (MP3, AAC w/ADTS, etc.) --> Icecast Source Client
Basically, you'll need to create a raw PCM audio stream using whatever method suits your use case. From there, you'll send that stream off to a handful of codecs configured with different bitrates. Which bitrates and quality levels you use is up to you, based on the bandwidth available to your users and the quality tradeoff you prefer. These days, I usually have 64k streams for bad mobile connections and 256k streams for good connections. As long as you have at least a 128k stream in there, you'll be putting out acceptable quality.
The Icecast source client can be a simple HTTP PUT these days. The old method is very similar... instead of PUT, the verb was SOURCE. (There are some other minor differences as well, but that's the gist.)
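As an illustration, here is a sketch of such a PUT-based source client in Node.js; the host, mountpoint, password, and input file are placeholders, and ffmpeg stands in for whatever encoder you use:

    // Sketch: encode a local source to MP3 and PUT it to an Icecast mountpoint.
    const { spawn } = require('child_process');
    const http = require('http');

    // Placeholder encoder: read an audio source at realtime speed, emit MP3 on stdout
    const encoder = spawn('ffmpeg', ['-re', '-i', 'input.wav', '-f', 'mp3', '-b:a', '128k', '-']);

    // Icecast (2.4+) accepts source clients over plain HTTP PUT
    const req = http.request({
      method: 'PUT',
      host: 'icecast.example.com', // placeholder
      port: 8000,
      path: '/radio.mp3',          // the mountpoint listeners will tune into
      headers: {
        'Content-Type': 'audio/mpeg',
        'Authorization': 'Basic ' + Buffer.from('source:hackme').toString('base64')
      }
    });

    encoder.stdout.pipe(req);      // keep streaming the encoded audio indefinitely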

Send MediaStream from Browser to Server to encode

I am developing a video conferencing system that should enable users to stream a session to (for now) the server. I would prefer using WebRTC to connect client and server. The big hurdle I have stumbled upon is how to actually "live stream" the video from getUserMedia to the server.
I came across various methods, including using a canvas element as well as putting a gateway (like Janus or Kurento) in the middle. I also found this answer here on Stack Overflow. However, since this is a learning project, I would prefer a pure WebRTC solution, and I would rather not use the upcoming recording API, since I am aiming at "live streaming" to the server.
My idea was to use Node.js and have the server act as a peer. However, I did not find a package for Node.js that would enable both channels (data channels always seem to be possible and might be a solution together with the canvas approach). So my question now: are there any packages out there that I missed? Or is there another way to implement this?
