I want to establish a WebRTC peer connection between a Node.js server and a browser. I will be sending the video stream to the server using ffmpeg. Once the connection is established between the server and the client, the client will receive the stream and display it. Note: my actual situation is a robot streaming video to a browser.
Can anyone suggest how I can solve this scenario?
I just want to know how to get a WebRTC media stream to the Node.js server and serve it to the clients connected to that server.
I already created a P2P WebRTC application with a Node.js signalling server, and it works fine. But now I want to route the media stream through the server. It should have low latency and minimal delay so it can work on a live server.
What you need is an SFU (Selective Forwarding Unit), such as:
mediasoup
janus-gateway
kurento
Here is a GitHub project where it is implemented using mediasoup.
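If mediasoup is the route you take, here is a rough sketch of the server-side bootstrapping (mediasoup v3). The announced IP and codec list are assumptions to adapt to your setup, and the signalling channel you already have would carry the transport parameters between server and peers:

```javascript
// Minimal mediasoup v3 sketch: one worker, one router, and a WebRTC
// transport created per peer. Signalling is assumed to exist already.
const mediasoup = require('mediasoup');

async function startSfu() {
  const worker = await mediasoup.createWorker();

  // The router defines which codecs the SFU is willing to forward.
  const router = await worker.createRouter({
    mediaCodecs: [
      { kind: 'audio', mimeType: 'audio/opus', clockRate: 48000, channels: 2 },
      { kind: 'video', mimeType: 'video/VP8', clockRate: 90000 }
    ]
  });

  // Call this once per connecting peer (the robot producing video, and each
  // browser consuming it), then ship the transport parameters over signalling.
  async function createWebRtcTransport() {
    const transport = await router.createWebRtcTransport({
      listenIps: [{ ip: '0.0.0.0', announcedIp: 'YOUR_PUBLIC_IP' }], // placeholder
      enableUdp: true,
      enableTcp: true
    });
    // transport.id, transport.iceParameters, transport.iceCandidates and
    // transport.dtlsParameters go back to the peer via your signalling server.
    return transport;
  }

  return { router, createWebRtcTransport };
}

startSfu().catch(console.error);
```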
As the title suggests, I want to build a Node.js server to receive live video streams from multiple clients at the same time OVER THE INTERNET.
My client would basically be an RPi with a camera sending its live video stream to my Node.js server.
I would have multiple clients (RPis) doing the same.
My Node.js server has a static public IP.
My Node.js server and clients (RPis) are not on the same local network. So how can I send live video streams from multiple RPis to the central Node.js server over the internet?
Basically, I want to build a video-stream API (implemented on the Node.js server) which my multiple clients can use to push frames to my server.
I have no clue how to achieve the above. Please help.
...
I saw many solutions online. None seem to fulfill my purpose.
Most of the solutions online say to configure the RPi as a Node.js server and stream the camera feed on the local network, but that is not what I want: I have multiple clients, i.e. multiple Raspberry Pis, which are not on the same network as the Node.js server, so I need this to work over the INTERNET.
Option 1 - push (each Raspberry Pi streams to the central server):

Client (Raspberry Pi):
const request = require('request');
// placeholder: however you expose the camera as a readable stream
const cameraStream = require('magical-not-provided-way-to-access-camera-stream');
const r = request.post('http://staticIp/');
cameraStream.pipe(r);

Server:
// inside the HTTP request listener
requestListener(req, res) { req.pipe(writableStream); }

Option 2 - pull (the central server fetches the stream from each Pi):

Client (the central server):
raspberrys.forEach(pullStream);

Server (Raspberry Pi):
// inside the HTTP request listener running on the Pi
requestListener(req, res) { cameraStream.pipe(res); }
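For a slightly more concrete version of the push variant, here is a minimal sketch. It assumes the camera is exposed by spawning raspivid and reading H.264 from stdout, and that the server simply dumps each incoming stream to a file; the command, host, and sink are all placeholders for whatever your setup actually uses:

```javascript
// --- pi-client.js (runs on each Raspberry Pi) ---
const http = require('http');
const { spawn } = require('child_process');

// Placeholder: raspivid writing raw H.264 to stdout; substitute your camera tool.
const camera = spawn('raspivid', ['-t', '0', '-o', '-']);

// Long-lived POST; the path identifies which Pi is sending.
const req = http.request({
  method: 'POST',
  host: 'your.static.ip',   // placeholder for the server's public IP
  port: 8080,
  path: '/stream/pi-01'
});

camera.stdout.pipe(req);
```

And on the central server with the static public IP:

```javascript
// --- server.js (central Node.js server) ---
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url.startsWith('/stream/')) {
    const id = req.url.split('/').pop();
    // Placeholder sink: write to disk; in practice you might pipe into
    // ffmpeg, a media server, or fan the stream out to viewers.
    req.pipe(fs.createWriteStream(`${id}.h264`));
    req.on('end', () => res.end('ok'));
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8080);
```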
I have a Raspberry Pi 3 with a camera module. I've installed uv4l, which lets me access the camera stream by creating a WebRTC connection from the browser. However, I want to have an extra layer between the client (browser) and the uv4l WebRTC stream.
Is there a way to have a Node.js server act as a bridge between my client (browser) app and the uv4l stream?
I have an IoT device that I am trying to connect to the server via Wi-Fi to send real-time data. The server uses socket.io with Node.js, but the Wi-Fi module (ESP8266) I am using only has a WebSocket package. Would a socket.io connection be able to receive data sent via WebSocket?
Yes, but it will be a bit of a kludge:
socket.io supports multiple transports, one of which is a WebSocket. You can connect using only the WebSocket transport, but you will need to implement the socket.io protocol yourself (see the sketch after the links below).
https://github.com/socketio/socket.io-protocol
If you are using a popular platform, there may be libraries available; e.g. for Arduino, there is:
https://github.com/billroy/socket.io-arduino-client
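To make the framing concrete, here is a rough sketch of what a bare WebSocket client has to send to talk to a Socket.IO v4 server. The Node 'ws' package stands in for the ESP8266 firmware here, and the URL and event name are placeholders:

```javascript
// Raw-WebSocket client speaking the Socket.IO v4 framing:
// Engine.IO packet type digit + Socket.IO packet type digit + payload.
const WebSocket = require('ws');

// EIO=4 selects the Engine.IO v4 protocol; transport=websocket skips polling.
const ws = new WebSocket('ws://your-server:3000/socket.io/?EIO=4&transport=websocket');

ws.on('open', () => {
  // Join the default namespace: "4" (message) + "0" (connect).
  ws.send('40');
});

ws.on('message', (data) => {
  const msg = data.toString();

  if (msg === '2') {
    // Engine.IO ping from the server must be answered with a pong
    // or the server will drop the connection.
    ws.send('3');
  } else if (msg.startsWith('40')) {
    // Namespace connected; emit an event: "42" + JSON array [event, payload].
    ws.send('42' + JSON.stringify(['sensorData', { temp: 21.5 }]));
  }
});
```

On the ESP8266 side the same byte sequences would be sent over its plain WebSocket package; the library linked above does this bookkeeping for you on Arduino-class boards.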
I am in search of server software that can stream different audio to different clients.
For example, every client would be able to create their own playlist and the server would stream it.
Any help will be appreciated.
You can check Flash, which supports RTMP to stream audio in real time in a client-server setup, and RTMFP, which works over peer-to-peer technology. Use RTMFP when the peer is directly reachable; otherwise use RTMP. There is also an open-source media server, Red5, which supports the RTMP protocol.