I am trying to build a screen sharing app using Electron, WebRTC, Socket.IO and Express.
Can you please explain to me:
how I can send the MediaStream obtained from getDisplayMedia to the server through Socket.IO. I have read articles about this, and they explain that it is not possible to send it through a WebSocket and that you should use an RTCPeerConnection instead, because the stream is live. Can you tell me more about this? I also wonder whether I could send the video and audio tracks one by one through WebSockets, or whether you have a solution for this that uses WebSockets instead of RTCPeerConnection.
Sorry for the long message, I would appreciate your help <3
Related
I want to build an internet radio station using Node.js. My architecture needs to be like the following: there is one producer who records live audio, and this live audio data needs to be sent to the server. On the server side, I need to save the live audio stream in some audio format (for future playback) and simultaneously stream the live audio to multiple clients. Can somebody please point me towards some implementations or libraries available to achieve this? I have read several posts and Stack Overflow answers but couldn't find anything related to my need.
Is WebRTC needed? I don't want clients to get a peer-to-peer connection, as I also want to save the live audio on the server. Any help would be appreciated.
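Roughly, the server side I have in mind would look something like this (just a sketch to illustrate the architecture; the 'audio-chunk' event name and the file name are placeholders):

// Express + Socket.IO server: persist incoming audio chunks and relay them live.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');
const fs = require('fs');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// Append every chunk to a file for future playback (placeholder file name).
const archive = fs.createWriteStream('live-session.webm');

io.on('connection', (socket) => {
  socket.on('audio-chunk', (chunk) => {
    archive.write(Buffer.from(chunk));            // save on the server
    socket.broadcast.emit('audio-chunk', chunk);  // relay to all listeners
  });
});

server.listen(3000);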
I have a Node.js server that's receiving small WebM blob binary data packets through socket.io from a webpage!
(navigator.mediaDevices.getUserMedia -> stream -> mediaRecorder.ondataavailable -> DATA. I'm sending that DATA back to the server, so it includes a timestamp and the binary data.)
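Roughly, my capture code looks like this (trimmed down; the 'video-chunk' event name is just a placeholder):

// Browser side: record the webcam and push small WebM chunks to the server.
const socket = io();

navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then((stream) => {
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8,opus' });
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) {
      // event.data is a Blob; socket.io sends it as binary
      socket.emit('video-chunk', { timestamp: Date.now(), data: event.data });
    }
  };
  recorder.start(100); // emit a chunk roughly every 100 ms
});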
How do I stream those back over an HTTP request as a never-ending live stream that can be consumed by an HTML page simply by adding the URL to a video tag?
Like this:
<video src=".../video" autoplay></video>
I want to create a live video stream and basically stream my webcam back to an HTML page, but I'm a bit lost on how to do that. Please help. Thanks.
Edit: I'm using express.js to serve the app.
I'm just not sure what I need to do on the server with the incoming WebM binary blobs to serve them properly so they can be consumed by an HTML page at an endpoint /video.
Please help :)
After many failed attempts I was finally able to build what I was trying to:
Live video streaming through socket.io.
So what I was doing was:
Start getUserMedia to start the web camera
Start a MediaRecorder set to record in intervals of 100 ms
On each available chunk, emit an event through socket.io to the server with the blob converted to a base64 string
The server sends the base64-encoded 100 ms video chunk back to all connected sockets.
The webpage gets the chunk and uses MediaSource and SourceBuffer to append the chunk to its buffer
Attach the MediaSource to a video element and VOILA :) the video plays SMOOTHLY, as long as you append each chunk in order and don't skip chunks (in which case it stops playing). A trimmed-down sketch of the whole pipeline is below.
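Roughly, that pipeline looked like this (simplified sketch, not my exact code; event names and the codec string are illustrative):

// ---- Browser (sender): record 100 ms chunks and emit them as base64 ----
const socket = io();
navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then((stream) => {
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8,opus' });
  recorder.ondataavailable = async (e) => {
    const bytes = new Uint8Array(await e.data.arrayBuffer());
    socket.emit('chunk', btoa(String.fromCharCode(...bytes))); // base64-encode the chunk
  };
  recorder.start(100);
});

// ---- Node server: broadcast each chunk to every connected socket ----
io.on('connection', (socket) => {
  socket.on('chunk', (base64) => io.emit('chunk', base64));
});

// ---- Browser (viewer): feed chunks into MediaSource/SourceBuffer ----
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  socket.on('chunk', (base64) => {
    const bytes = Uint8Array.from(atob(base64), (c) => c.charCodeAt(0));
    // Chunks must be appended in order; a real version would also queue
    // appends while sourceBuffer.updating is true.
    sourceBuffer.appendBuffer(bytes);
  });
});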
And IT WORKED! BUT it was unusable... :(
The problem is that the MediaRecorder process is CPU intensive; the page's CPU usage was jumping to 15% and the whole thing was TOO SLOW.
There was 2.5 seconds of latency on the video stream passing through socket.io, and virtually the same EVEN if I DIDN'T send the blobs through socket.io but rendered them on the same page.
Sooo I found out this works but DOESN'T work for a sustainable video chat service. It's just not designed for it. For recording a webcam video to play back later, MediaRecorder can work, but not for live streaming.
I guess for live streaming there's no way around WebRTC: you MUST use WebRTC to send the video stream either to a peer or to a server that forwards it to other peers. DO NOT TRY to build a live video chat service with MediaRecorder. You're only gonna waste your time. I did that for you :) so you don't have to. Just look into WebRTC. You may have to use a TURN server. Twilio provides STUN and TURN servers, but it costs money. BUT you can run your own TURN server with Coturn and other tools, though I'm yet to look into that.
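For reference, the sending side of a WebRTC setup boils down to something like this (minimal sketch inside an async function; the STUN/TURN URLs and credentials are placeholders, and you still need a signaling channel, e.g. socket.io, to exchange the offer/answer and ICE candidates):

// Capture the webcam and push its tracks into an RTCPeerConnection.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.com:3478' },                                      // placeholder
    { urls: 'turn:turn.example.com:3478', username: 'user', credential: 'pass' } // placeholder
  ]
});

const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
stream.getTracks().forEach((track) => pc.addTrack(track, stream));

// Signaling over socket.io (or anything else): exchange offer/answer and ICE candidates.
pc.onicecandidate = (e) => { if (e.candidate) socket.emit('ice', e.candidate); };
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
socket.emit('offer', pc.localDescription);
socket.on('answer', (answer) => pc.setRemoteDescription(answer));
socket.on('ice', (candidate) => pc.addIceCandidate(candidate));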
Thanks. Hope that helps someone.
I've been researching a lot on how to live stream frames from the camera in the browser to a Node server. This server processes the image and is then supposed to send the processed image back to the client. I was wondering what the best approach would be. I've seen solutions such as sending frames in chunks to the server for the server to process. I've looked into WebRTC, but came to the conclusion that it is geared more toward client-to-client connections. Would a simple implementation using WebSockets, or socket.io, suffice?
You can use WebSockets, but I'd not recommend it. I don't think you should drop WebRTC yet; it's not just for client-to-client connections. You can use a media server like Kurento or Jitsi to process your frames and return the output. I've seen Kurento samples for adding filters and such, and you can build your own modules for how the frames are processed. I'd recommend that you check out a media server and see if it fits your requirements. Use WebSockets only if you are sure that WebRTC doesn't work for you.
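If you do end up on plain WebSockets anyway, the naive frame-by-frame approach looks roughly like this (sketch only; the URL, frame rate and JPEG quality are arbitrary):

// Browser: grab JPEG frames from the video element via a canvas and send them.
const ws = new WebSocket('ws://localhost:8080'); // placeholder URL
const video = document.querySelector('video');   // already playing the camera stream
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');

setInterval(() => {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  ctx.drawImage(video, 0, 0);
  canvas.toBlob((blob) => {
    if (blob && ws.readyState === WebSocket.OPEN) ws.send(blob); // one frame per message
  }, 'image/jpeg', 0.7);
}, 100); // ~10 frames per second

// The server would decode each frame, process it, and send the result back on
// the same socket; the client then draws the processed frame as it arrives.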
I am developing a collaborative instrument playing game, where multiple users will play an instrument (a synthesizer or sample, using the Web Audio API). In my first prototype I set up a keyboard that sends note/volume signals via Socket.io to the server, and when the server gets that signal it sends it back to all connected sockets, which then play the corresponding note.
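Roughly, the prototype looks like this (trimmed sketch; the event name is simplified and synth stands in for my Web Audio code):

// Browser: emit a small message for every key press, play whatever comes back.
const socket = io();
function onKeyPress(note, volume) {
  socket.emit('note', { note, volume });
}
socket.on('note', ({ note, volume }) => synth.play(note, volume)); // synth = placeholder

// Server: rebroadcast every note to all connected sockets.
io.on('connection', (socket) => {
  socket.on('note', (msg) => io.emit('note', msg));
});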
You might have guessed it right: there's a massive amount of lag and inconsistency as to the order of arrival of notes.
What are some efficient ways that I can send the output of WebAudio to the server, and have it broadcast to all connected users, so I have some sort of consistency?
You could try using a MediaStream by adding a MediaStreamAudioDestinationNode to your audio node graph as a destination, and use that stream with either WebRTC or RecordRTC to send it to your server.
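Something along these lines (untested sketch; the oscillator just stands in for your own synth nodes):

// Route the Web Audio graph into a MediaStream instead of the speakers.
const audioCtx = new AudioContext();
const destination = audioCtx.createMediaStreamDestination();

const osc = audioCtx.createOscillator(); // stand-in for your synth/sample nodes
osc.connect(destination);
osc.start();

// destination.stream is a regular MediaStream: hand it to an RTCPeerConnection...
const pc = new RTCPeerConnection();
destination.stream.getTracks().forEach((track) => pc.addTrack(track, destination.stream));

// ...or to a MediaRecorder / RecordRTC if you want to record it and upload later.
const recorder = new MediaRecorder(destination.stream);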
Here is some info I found you could look at. It does talk about using the getUserMedia method, but both getUserMedia and MediaStreamAudioDestinationNode produce a MediaStream. This info
has some ideas on how you could send a MediaStream to your server. However, it does say that the stream needs to be recorded first, not sent while it's live and running.
Sending a MediaStream to host Server with WebRTC after it is captured by getUserMedia
I hope this helps :)
I wanted to clear up a few questions about WebSockets.
Is it possible to stream video from server to client and from client to server at the same time... something like video calling?
Can the server stream two videos to a single client at a time?
Regarding the first question: yes, you can. There are already wrappers that simplify that task, such as BinaryJS.
As for your second question, it would require a little extra work. Once a bidirectional link between client and server is established, the client will treat every incoming message as part of the same stream. To separate or multiplex two videos over the same connection, each message would have to carry a mark that helps the client tell them apart (see the sketch below).
It would be a better idea to open a new connection (with the same server) to stream a second video.
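If you do multiplex them over one connection, the 'mark' can be as simple as a stream id on every message (rough sketch with invented names):

// Server: tag each chunk with the id of the video it belongs to.
ws.send(JSON.stringify({ streamId: 'video-1', chunk: base64ChunkA }));
ws.send(JSON.stringify({ streamId: 'video-2', chunk: base64ChunkB }));

// Client: route each incoming chunk to the right buffer using that mark.
ws.onmessage = (event) => {
  const { streamId, chunk } = JSON.parse(event.data);
  buffers[streamId].append(chunk); // e.g. one SourceBuffer (or player) per streamId
};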