I've been researching how to live-stream frames from the camera in the browser to a Node server. The server processes each image and is then supposed to send the processed image back to the client. I was wondering what the best approach would be. I've seen solutions such as sending frames in chunks to the server for processing. I've looked into WebRTC, but came to the conclusion that it is geared more toward client-to-client connections. Would a simple implementation using WebSockets, or Socket.IO, suffice?
You can use WebSockets, but I wouldn't recommend it; I don't think you should drop WebRTC yet. It's not just for client-to-client connections. You can use a media server like Kurento or Jitsi to process your frames and return the output. I've seen Kurento samples for adding filters and such, and you can build your own modules for processing the frames. I'd recommend checking whether a media server fits your requirements, and using WebSockets only if you're sure WebRTC doesn't work for you.
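If you do end up on the plain-WebSocket path, a minimal sketch of the frame-chunking approach might look like this (the endpoint URL, JPEG quality, and frame rate are placeholder choices):

```js
// Browser: capture the camera, draw frames to a canvas, ship JPEG blobs over a WebSocket.
const ws = new WebSocket('wss://example.com/frames'); // hypothetical endpoint

navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
  const video = document.createElement('video');
  video.srcObject = stream;
  video.play();

  const canvas = document.createElement('canvas');
  const ctx = canvas.getContext('2d');

  setInterval(() => {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0);
    // toBlob is asynchronous; each blob is one self-contained JPEG frame.
    canvas.toBlob((blob) => {
      if (blob && ws.readyState === WebSocket.OPEN) ws.send(blob);
    }, 'image/jpeg', 0.7);
  }, 100); // ~10 fps; tune for your processing speed
});

// The server would process each frame and push the result back on the same socket.
ws.onmessage = (event) => {
  // event.data is the processed frame; e.g. display it via URL.createObjectURL(event.data)
};
```

Note that this re-encodes every frame as a standalone JPEG, so it costs far more bandwidth than a real video codec; that inefficiency is a big part of why a WebRTC media server is usually the better fit.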
I created a web app to let people communicate. I want to implement screen sharing and audio calls.
My current app is written in Node.js and uses Express and Socket.IO to serve the client and open a socket connection. I want to stream video and audio. My problem with WebRTC is that, because it is p2p, everyone who connects to a call can see the other participants' IP addresses and is therefore vulnerable to DDoS attacks. I found an article from Discord explaining how they managed to route all voice traffic through their servers: https://blog.discord.com/how-discord-handles-two-and-half-million-concurrent-voice-users-using-webrtc-ce01c3187429. That's exactly what I want to achieve.
Could I possibly use socket.io-stream (https://www.npmjs.com/package/socket.io-stream)? I haven't figured out how yet, and it seems like all the Socket.IO streaming libraries are built for file upload/download, not for actual video/audio streaming.
If that doesn't work, a library like the one Discord built would be the perfect solution, since all traffic is proxied rather than p2p. I couldn't find any such library, though; maybe I'm just searching for the wrong thing?
Best regards
You will want to use an SFU (Selective Forwarding Unit).
Each peer negotiates a session with the SFU and then exchanges media through it, so every peer only ever communicates with the server. This has lots of other benefits and is what most WebRTC deployments use today.
There are lots of open-source SFUs out there, and you can even build your own with open-source libraries.
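Every peer runs the same offer/answer negotiation with the SFU that it would run with another browser. A minimal sketch of the client side, assuming a hypothetical Socket.IO signaling channel (the 'offer'/'answer' event names and the attachToVideoElement helper are made up):

```js
async function joinRoom() {
  const socket = io('https://sfu.example.com'); // hypothetical signaling server
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
  });

  // Send our media to the SFU; it forwards the tracks to everyone else.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // Tracks the SFU forwards from other participants arrive here.
  pc.ontrack = (event) => attachToVideoElement(event.streams[0]); // hypothetical helper

  // Standard offer/answer exchange, with the SFU acting as the remote peer.
  // (Trickle ICE candidate exchange is elided for brevity.)
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  socket.emit('offer', pc.localDescription);
  socket.on('answer', (answer) => pc.setRemoteDescription(answer));
}
```

Because the SFU terminates every peer connection, clients only ever learn the server's address, which is exactly the property the Discord article describes.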
I am developing a collaborative instrument-playing game in which multiple users play an instrument (a synthesizer or sampler, using the Web Audio API). In my first prototype I set up a keyboard that sends note/volume signals via Socket.IO to the server; when the server receives a signal, it sends it back to all connected sockets, which then play the corresponding note.
You might have guessed it: there's a massive amount of lag, and the order in which notes arrive is inconsistent.
What are some efficient ways to send the output of Web Audio to the server and have it broadcast to all connected users, so that I get some sort of consistency?
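A minimal version of that relay looks something like this on the server (the 'note' event name and payload shape are illustrative):

```js
// Server: rebroadcast each note event to every connected client.
const { Server } = require('socket.io');
const io = new Server(3000);

io.on('connection', (socket) => {
  socket.on('note', (msg) => {
    // msg might be { note: 60, velocity: 0.8 }; everyone, sender included, plays it.
    io.emit('note', msg);
  });
});
```

Every note makes a full round trip over TCP before anyone hears it, which explains the lag, and differing per-client round-trip times explain the inconsistent ordering.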
You could try using a MediaStream: add a MediaStreamAudioDestinationNode to your audio node graph as a destination, then send that stream to your server with either WebRTC or RecordRTC.
Here is some info I found that you could look at. It talks about using the getUserMedia method, but both getUserMedia and MediaStreamAudioDestinationNode produce a MediaStream object. This question has some ideas on how you could send a MediaStream to your server, although it says the stream needs to be recorded first rather than sent while it's live:
Sending a MediaStream to host Server with WebRTC after it is captured by getUserMedia
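A minimal sketch of that wiring, assuming an oscillator as the source and with the peer-connection signaling elided:

```js
const audioCtx = new AudioContext();

// Whatever your synth graph produces...
const osc = audioCtx.createOscillator();
const gain = audioCtx.createGain();
osc.connect(gain);

// ...route it into a MediaStreamAudioDestinationNode. Its .stream property
// is a live MediaStream, just like one returned by getUserMedia.
const dest = audioCtx.createMediaStreamDestination();
gain.connect(dest);
osc.start();

// Hand the stream to an RTCPeerConnection (offer/answer signaling not shown).
const pc = new RTCPeerConnection();
dest.stream.getAudioTracks().forEach((track) => pc.addTrack(track, dest.stream));
```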
I hope this helps :)
I need to stream the video captured by my home webcam online in real time, making it appear on a webpage accessible from anywhere, so that I can watch it from a browser by entering something like http://example.com/webcamStream.
I've read some material about WebSockets and WebRTC, and it seems that in my case the best option would be WebRTC.
I have already installed a web server (Apache) and configured my home router so that it redirects external requests to the Apache server. I have also installed Node.js and the socket.io, express, and ws modules.
I did some small tests following tutorials such as this one (which creates a little WebSocket chat) and this one, which creates a video stream using WebRTC.
In the second example I was able to start the video stream, but I am still confused about how to make that stream accessible from the web; it's not yet clear to me, as I am not very experienced with this kind of thing.
I hope someone can help me understand what I need in order to accomplish that webcam stream from my house. Any help will be really appreciated :)
I wanted to clear up a few questions about WebSockets.
Is it possible to stream video from server to client and from client to server at the same time, something like video calling?
Can the server stream two videos to a single client at the same time?
Regarding the first question: yes, you can. There are already wrappers, such as BinaryJS, that simplify the task.
As for your second question, it would require a little extra configuration. Once a bidirectional link between client and server is established, the client treats every incoming message as part of the same stream. To separate or multiplex two videos within one stream, each message would have to carry an extra marker that tells the client which video it belongs to.
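A hypothetical sketch of that marking scheme, prefixing every binary chunk with a one-byte stream id so the client can demultiplex (the ids and the two buffer objects are made up):

```js
// Server side (using the ws module): tag each chunk with its stream id.
function sendChunk(socket, streamId, chunk) {
  socket.send(Buffer.concat([Buffer.from([streamId]), chunk]));
}
// sendChunk(socket, 0, chunkOfVideoA);
// sendChunk(socket, 1, chunkOfVideoB);

// Client side: peel the id off and route the payload to the right buffer.
const ws = new WebSocket('wss://example.com/videos'); // hypothetical endpoint
ws.binaryType = 'arraybuffer';
ws.onmessage = (event) => {
  const bytes = new Uint8Array(event.data);
  const streamId = bytes[0];
  const payload = bytes.subarray(1);
  (streamId === 0 ? videoBufferA : videoBufferB).push(payload); // made-up buffers
};
```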
That said, it would be a better idea to open a second connection (to the same server) to stream the second video.
I am writing a multiplayer real-time game for the browser, with the server as the master instance and the clients as input devices and slaves that display the graphics.
I have to send out changes in the game world very often and very fast, and it doesn't matter if some of the data occasionally gets lost on the way, because a few milliseconds later the next update will arrive anyway.
Right now I am using Socket.IO to talk between the server and the clients, but it runs over TCP, so a lost packet stalls everything behind it and updates sometimes arrive unnecessarily late.
I know that WebRTC data channels would let me send my updates over UDP, which would be very awesome and exactly what I want. They even seem to be implemented in Firefox and Chrome already: https://stackoverflow.com/a/12864512/63779
What I need now is some Node library that would let me use data channels to send my data (for now just JSON strings) over UDP to the clients, which are browsers. In the browser I could use webkitRTCPeerConnection(), but I have no idea how to start something like that on the Node server. Any suggestions? If there is no Node module for this, would it be possible to write something in another language and just pass the data over Unix domain sockets or something?
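On the browser side, the UDP-like behaviour you want is just a pair of options on the data channel; packages such as node-webrtc (wrtc on npm) expose the same RTCPeerConnection API in Node, so the server half can look almost identical. A sketch of the browser half, with signaling elided:

```js
const pc = new RTCPeerConnection();

// ordered: false + maxRetransmits: 0 gives UDP-like semantics: messages may
// arrive out of order or not at all, which is fine when the next world
// update supersedes the lost one anyway.
const channel = pc.createDataChannel('game', {
  ordered: false,
  maxRetransmits: 0,
});

channel.onopen = () => {
  setInterval(() => {
    channel.send(JSON.stringify({ x: 1, y: 2, t: Date.now() })); // example update
  }, 33); // ~30 updates per second
};

// The offer/answer and ICE candidate exchange with the Node peer can go over
// any side channel you already have, e.g. your existing Socket.IO connection.
```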