Is it possible to detect WebRTC one-to-one audio or video call duration in Node? - node.js

I am very new to WebRTC and slightly confused about it.
I am able to do a one-to-one video/audio call using Node.js, but I am still not sure whether it is possible to check how long two people talked.
If yes, please guide me.
If not, what is the best way to monitor call length? (I don't want to record the audio or video, just the length.)
Thanks in advance.

Are you using Node.js as your socket server, or as the actual endpoints? Last I checked, WebRTC didn't have a native Node.js interface, but you could use one of the available npm modules.
It's always possible to track duration from the app side: get the time at the start, get the time at the end, and report the difference to your server. The WebRTC APIs for iOS, Android, and JavaScript also expose a getStats() method you can call during or after a session to get this information. AppRTC has examples of how to do that.
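To make that first suggestion concrete, here is a minimal browser-side sketch. It assumes you already have an RTCPeerConnection called `pc`, and the `/call-duration` endpoint is a hypothetical route on your own Node server; the getStats() call is shown only as an optional extra.

```js
// A minimal sketch, assuming `pc` is your existing RTCPeerConnection and
// that your Node server exposes a hypothetical /call-duration endpoint.
let callStartedAt = null;

pc.addEventListener('connectionstatechange', async () => {
  if (pc.connectionState === 'connected' && callStartedAt === null) {
    callStartedAt = Date.now();
  }

  if (['disconnected', 'failed', 'closed'].includes(pc.connectionState) &&
      callStartedAt !== null) {
    const durationMs = Date.now() - callStartedAt;
    callStartedAt = null;

    // Optional: getStats() also works here; its report entries carry
    // timestamps you could compare against ones sampled at call start.
    const stats = await pc.getStats();
    console.log('stats entries at hang-up:', stats.size);

    // Fire-and-forget report to your own backend.
    navigator.sendBeacon('/call-duration', JSON.stringify({ durationMs }));
  }
});
```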

Related

How to broadcast my cam to my server and then to another RTMP server

Hello, I am looking for a way to forward my live stream from my server to another server, for example Facebook, via RTMP.
The structure would be something like:
My cam -> my server -> other RTMP server -> viewers
My intention is to capture the transmission and forward it to many RTMP servers so that the server's resources are consumed instead of the client's. I don't have much knowledge of video transmission; if it is possible to do this via Node.js that would be great, thanks.
I have looked into SFUs and other possible approaches, but I want to have several alternatives and find the best one to implement in production.
I have never done this myself, so I can't recommend the single best way to do it.
After some research, if you want to stay with Node.js, I personally recommend Mediasoup.
It is a powerful SFU developed in C++ that provides really good bindings for Node.js. All the heavy processing is done in C++: the Node.js API spawns a child process in which the C++ Mediasoup worker runs, so you only have to care about the Node.js API and nothing else.
With Mediasoup it should not be too difficult to get your stream onto the Node.js server.
After that, for transmitting your stream to an RTMP server, you can call ffmpeg in a child process to push it from your Node.js server to the RTMP endpoint, as sketched below.
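As a rough illustration of that last step (not taken from either project mentioned next), here is a sketch that spawns ffmpeg from Node and pipes a server-side stream into an RTMP ingest URL. The codec and format flags are assumptions; the right input options depend entirely on what your SFU actually hands you.

```js
// A rough sketch: spawn ffmpeg and pipe a server-side stream into an
// RTMP ingest URL. The codec/format flags are assumptions; adjust them
// to match the stream you actually get from your SFU.
const { spawn } = require('child_process');

function forwardToRtmp(inputStream, rtmpUrl) {
  const ffmpeg = spawn('ffmpeg', [
    '-re',                // read the input at its native rate
    '-i', 'pipe:0',       // take the input from stdin
    '-c:v', 'libx264',    // RTMP-friendly video codec
    '-preset', 'veryfast',
    '-c:a', 'aac',        // RTMP-friendly audio codec
    '-f', 'flv',          // RTMP expects an FLV container
    rtmpUrl,
  ]);

  inputStream.pipe(ffmpeg.stdin);
  ffmpeg.stderr.on('data', (line) => console.log(line.toString()));
  ffmpeg.on('exit', (code) => console.log('ffmpeg exited with code', code));
  return ffmpeg;
}

// Example (URL is a placeholder):
// forwardToRtmp(someReadableStream, 'rtmp://live.example.com/app/streamKey');
```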
I found two GitHub projects that take this kind of approach.
The first one is a bit outdated, using an old Mediasoup version, but maybe you can find something interesting in it. Especially for the client/browser part, it has an HTML file that should be helpful. Be aware that the Mediasoup API may have changed, both on the front end and the back end.
EDIT: The first project does not use the Mediasoup client library; you can look at it here.
The second is more recent and really seems to match your need, though you may need some customization. Note that it does not provide any front-end part.
For Mediasoup, you will find a lot of resources across the internet, GitHub, and YouTube covering both the client and the server parts.
If you want to look into it, there is an installation guide for Mediasoup v3 (the latest version). You have to install a specific Python version and set a few environment variables; after that you can install the npm package, and happy coding!
It is easier to install on Linux, so if you are on Windows, preferably use WSL2 for testing. I don't know anything about Mac, but I know Docker works, so that should be fine too.
A much simpler option to stream your webcam to other servers would be to use OBS Studio, but you have probably already considered it.
It has a plugin that lets you send your stream to multiple platforms at once, which looks really cool! Here
Hope this gives you some more options!

Signaling mechanism for WebRTC using the SimpleWebRTC JS library and a Django backend

I am trying to build a video conferencing web application where multiple people can join a call with video/audio/data transmission. I have researched this a lot. What I understand is that we can achieve this using the WebRTC protocol. I started researching JS libraries and came across SimpleWebRTC. I want to develop with Django as the backend. I tried to run the sample demo code from https://www.sitepoint.com/webrtc-video-chat-application-simplewebrtc/, but I am not able to establish a connection with socket.io, which is the SimpleWebRTC signaling sandbox server.
I tried with a Node Express server as well, but I still got the same error: net::ERR_FAILED. The error occurred in simplewebrtc-with-adapter-js while making the connection.
What would be the correct technologies to achieve this type of functionality?
Front-end WebRTC libraries: SimpleWebRTC / EasyRTC / any other APIs?
Signaling mechanism: what can we use to connect with WebRTC, and how?
Backend: Node.js / Django?
I am still confused about the signaling protocols and STUN/TURN servers, since we have to define the servers ourselves, and SimpleWebRTC does not provide ones we can use in production.
Any help would be much appreciated!
I just started building a video calling and chat application as well, using open-easyrtc. No problems so far; their demo just works after npm install.
As for signaling servers, since I just started I haven't concerned myself much with them, but as far as I can tell the signaling server is used for exchanging information like video metadata, network information, and so on. open-easyrtc comes with public STUN and TURN servers; I'm not sure about their limitations, especially if you're going to have a lot of users.
It's also possible to deploy your own; I'm planning to learn more about coturn once I finish developing my application.
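For what it's worth, the "exchanging information" part is small enough to sketch. Below is a minimal socket.io relay; the 'join' and 'signal' event names are made up for this sketch and would need to match whatever your front-end library actually emits. The server never sees the media, it only forwards SDP offers/answers and ICE candidates.

```js
// A minimal signaling relay sketch with socket.io. The 'join' and 'signal'
// event names are hypothetical; the server only forwards SDP offers/answers
// and ICE candidates, it never touches the media.
const http = require('http');
const { Server } = require('socket.io');

const httpServer = http.createServer();
const io = new Server(httpServer, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  socket.on('join', (room) => {
    socket.join(room);
    socket.to(room).emit('peer-joined', socket.id);
  });

  // Relay any signaling payload (offer, answer, ICE candidate) to the
  // other peers in the same room.
  socket.on('signal', ({ room, data }) => {
    socket.to(room).emit('signal', { from: socket.id, data });
  });
});

httpServer.listen(3000, () => console.log('signaling server on :3000'));
```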
You can use simple-peer, a simple library for WebRTC. Here is an example project with multiple users, and a DEMO.

Building a chat application, NodeJS and Express - what should I use for media streaming?

I've previously built chat servers using NodeJS (i.e. central chat server with clients, no p2p), with Electron, or just good old Express. I'd like to re-use as much of my old code as possible. Thus, the only missing piece of the puzzle for me is what to use to enable both public and private video/audio streaming. File sending isn't necessary.
Is there anything out there I can 'easily' drop into this model? I'm aware of Kurento and a few similar offerings, but these feel like overkill for how I'm hoping to work.
Update: I've had a few suggestions about WebRTC, which I'm open to, but plans for this app include automated moderation/content filtering of any video broadcasts and text. So I assume such a solution would need to treat the server as a 'hardcoded' peer somehow, so that it's fairly safe to assume it will see a copy of anything sent over the public chat network. Of course, for private communications this need not be the case. On the flip side, worst case, operating in a hub-and-spoke topology is fine too.
You can start with the WebRTC samples:
https://webrtc.github.io/samples/
WebRTC is more or less the standard now for audio/video calls. The media itself flows peer-to-peer, with no server in the media path.
The only thing you need to build is a signaling channel to connect the two users, and for that you can reuse/extend your Node.js chat server, as in the sketch below.
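Here is a rough sketch of that signaling hand-shake on the browser side, assuming your chat server already gives you a way to send and receive messages; the `sendSignal`/`onSignal` helpers are hypothetical stand-ins for that transport.

```js
// Browser-side sketch of the signaling hand-shake. sendSignal()/onSignal()
// are hypothetical helpers standing in for your existing chat connection.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
});

pc.onicecandidate = (e) => {
  if (e.candidate) sendSignal({ type: 'candidate', candidate: e.candidate });
};

// Caller side: attach the local camera/mic and send an offer.
async function startCall(localStream) {
  localStream.getTracks().forEach((t) => pc.addTrack(t, localStream));
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendSignal({ type: 'offer', sdp: pc.localDescription });
}

// Both sides: react to whatever arrives over the chat connection.
onSignal(async (msg) => {
  if (msg.type === 'offer') {
    await pc.setRemoteDescription(msg.sdp);
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    sendSignal({ type: 'answer', sdp: pc.localDescription });
  } else if (msg.type === 'answer') {
    await pc.setRemoteDescription(msg.sdp);
  } else if (msg.type === 'candidate') {
    await pc.addIceCandidate(msg.candidate);
  }
});
```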

Send MediaStream from Browser to Server to encode

I am developing a video conferencing system that should enable users to stream a session to (for now) the server. I would prefer using WebRTC to connect client and server. The big hurdle I have stumbled upon is how to actually "live stream" the video from getUserMedia to the server.
I came across various methods, including using a canvas element as well as putting some gateway (like Janus or Kurento) in the middle. I also found this answer here on Stack Overflow. However, since this is a learning project, I would prefer a pure WebRTC solution, and I don't want to use the upcoming recording API, since I am aiming at "live streaming" to the server.
My idea was to use Node.js and have the server act as a peer. However, I have not found a package for Node.js that would enable both media channels (a dataChannel always seems to be possible and might be a solution together with the canvas approach). So my question is: are there any packages out there that I missed? Or is there any other way to implement this?
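To illustrate the canvas + dataChannel idea mentioned in the question, here is a rough browser-side sketch; `pc` is assumed to be an RTCPeerConnection to the server, and the frame rate and JPEG quality are arbitrary choices. It is a learning-project hack, not a replacement for real WebRTC media transport.

```js
// Rough browser-side sketch of the canvas + dataChannel idea: draw the
// camera into a canvas and push JPEG frames over a data channel. `pc` is
// assumed to be an RTCPeerConnection to the server.
const video = document.createElement('video');
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');
const channel = pc.createDataChannel('frames');

navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
  video.srcObject = stream;
  video.play();

  setInterval(() => {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0);
    canvas.toBlob(async (blob) => {
      // One JPEG frame per message; keep frames small enough for SCTP.
      if (blob && channel.readyState === 'open') {
        channel.send(await blob.arrayBuffer());
      }
    }, 'image/jpeg', 0.6);
  }, 100); // roughly 10 frames per second
});
```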

Is there a good radio-like audio streaming solution for node.js?

I'm looking for something to stream audio like a radio station (playing continuously, with clients able to join in the middle of a song) with Node.js. Is there a Node.js module (I couldn't find one) or anything else I can use along with Node.js to achieve this? Is this possible at all with Node.js? If not, what do you recommend instead? (Though I do prefer Node.js.) It's OK for me to use the HTML5 Audio API, and I don't care about IE support.
Thanks.
Yes, this is entirely possible. I am hosting internet radio on Node.js at the moment.
All you have to do is take the raw stream data from the encoder and send it via HTTP to every connected client. The clients are good about syncing up with the stream, so you don't have to worry about aligning to frames or anything.
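A minimal sketch of that approach, assuming ffmpeg as the encoder and a single looping file as the source (both are placeholders; any encoder that writes MP3 to a stream would do):

```js
// Minimal sketch: one encoder process (ffmpeg looping a file and producing
// MP3 -- both placeholders) fanned out over plain HTTP to every listener.
const http = require('http');
const { spawn } = require('child_process');

const listeners = new Set();

const encoder = spawn('ffmpeg', [
  '-re', '-stream_loop', '-1', '-i', 'playlist.mp3',
  '-c:a', 'libmp3lame', '-b:a', '128k', '-f', 'mp3', 'pipe:1',
]);

// Push each chunk of encoded audio to everyone currently connected.
encoder.stdout.on('data', (chunk) => {
  for (const res of listeners) res.write(chunk);
});

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  listeners.add(res);
  req.on('close', () => listeners.delete(res));
}).listen(8000, () => console.log('radio on http://localhost:8000'));
```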
