webRTC - stream webcam in real time on a webpage - node.js

I need to stream the video captured by my home webcam online in real time,
making it appear in a webpage accessible from anywhere,
so I can watch it from a browser by entering something like http://example.com/webcamStream
I read some material about WebSockets and WebRTC, and it seems that in my case the best option could be WebRTC.
I have already installed a web server (Apache) and configured my home router so that external requests are now redirected to the Apache server.
I installed node.js and the node modules socket.io, express, and ws.
I did some small tests following tutorials like this one (which creates a little WebSocket chat)
and this one, which creates a video stream using WebRTC.
In the second example I was able to start the video stream, but I am still confused about how to make that stream accessible from the web; it is still not clear to me, as I am not very experienced with this kind of thing.
I hope someone can help me understand what I need in order to accomplish that webcam stream from my house. Any help will be really appreciated :)
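Since you already have socket.io and ws installed: WebRTC still needs a separate signaling channel to exchange SDP offers/answers and ICE candidates before the video can flow peer-to-peer, and that channel is exactly what a small Node process can provide. A minimal sketch, assuming the simplest setup of exactly two peers (the home broadcaster page and one remote viewer); the `relaySignal` helper and the shape of the peer map are my own illustration, not from any particular tutorial:

```javascript
// Minimal WebRTC signaling relay sketch. WebRTC carries the video
// peer-to-peer; this server only forwards SDP offers/answers and ICE
// candidates between the broadcaster and the viewer.

// Pure relay: forwards a signaling message from one peer to all others.
// `peers` maps a peer id to an object with a send(msg) method
// (e.g. a wrapped socket.io socket or a ws connection).
function relaySignal(peers, fromId, msg) {
  for (const [id, peer] of Object.entries(peers)) {
    if (id !== fromId) peer.send(JSON.stringify({ from: fromId, ...msg }));
  }
}

// Wiring it up with socket.io would look roughly like this (not run here):
//   const peers = {};
//   io.on('connection', socket => {
//     peers[socket.id] = { send: m => socket.emit('signal', m) };
//     socket.on('signal', msg => relaySignal(peers, socket.id, msg));
//     socket.on('disconnect', () => delete peers[socket.id]);
//   });

module.exports = { relaySignal };
```

The broadcaster page would capture the webcam with `getUserMedia`, attach the track to an `RTCPeerConnection`, and send its offer through this relay; the viewer page answers through the same channel. Your Apache/port-forwarding setup then only needs to serve the pages and expose the signaling port.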

Related

Capture audio/video in NodeJS from hardware, stream it to frontend

I am running a NodeJS/express server on a device (I have both a BeagleBone and a Raspberry Pi). This software needs to also access camera/microphone hardware directly, and send these audio/video streams to connected clients. I don't care at this point how the video/audio gets to the client, just that it gets there and can be decoded and played. The client in this case is a React Native mobile app.
I want to take a moment to mention that this "device code" is NOT running in a browser. It's NodeJS / server side code. Consider this, for all intents and purposes, a "security" camera device. It sits somewhere and broadcasts what it sees and hears.
How do I:
a) Access video and audio streams in NodeJS
b) Get those streams into some kind of format that a web browser can play?
c) Decode the given video/audio in React Native?
d) Decode the video/audio in React (web)?
Working code examples would be greatly appreciated, as opposed to explanations that lead me to dead ends when things don't work as expected.
I've been googling this for the last month and cannot find answers. I can't even find someone else doing this same kind of project. Any help is appreciated. Thanks

WebRTC through host with nodeJS express and socketio

I created a web app to let people communicate. I want to implement screen sharing and audio calls.
My current app is written in NodeJs and uses express and socket.io to serve the client and open a socket connection. I want to stream video and audio. My problem with WebRTC is that everyone who connects to a call is vulnerable to a DDoS attack, since it is p2p. I found an article from Discord explaining how they managed to route the entire traffic through their servers: https://blog.discord.com/how-discord-handles-two-and-half-million-concurrent-voice-users-using-webrtc-ce01c3187429; that's exactly what I want to achieve.
Could I possibly use socket.io-stream https://www.npmjs.com/package/socket.io-stream ? I didn't yet figure out how, and it seems like all socket.io streaming libraries are made for file upload/download, not for actual video/audio streaming.
If that doesn't work, a library such as what Discord managed to make would be the perfect solution, since all traffic is proxied, and not p2p. Though I couldn't find any of those libraries, maybe I'm just looking for the wrong thing?
Best regards
You will want to use a SFU.
Each peer negotiates a session with the SFU and then exchanges media through it. Each peer communicates only with the server, so peers never see each other's IP addresses. This has lots of other benefits and is what most WebRTC deployments today use.
There are lots of Open Source SFUs out there. You can even build your own with Open Source libraries.
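The routing core of an SFU (Selective Forwarding Unit) is a fan-out: each peer uploads its media once, and the server forwards that one stream to every other subscriber. This toy `forwardPacket` is only an illustration of that idea; real open-source SFUs such as mediasoup, Janus, or Jitsi add the actual WebRTC transports, encryption, and congestion control on top:

```javascript
// SFU fan-out sketch: forward one peer's packet to every other
// subscribed peer. No peer ever learns another peer's address;
// everything goes through the server.

function forwardPacket(subscribers, senderId, packet) {
  const forwardedTo = [];
  for (const [id, sub] of subscribers) {
    if (id !== senderId) {
      sub.write(packet);   // e.g. an RTP send over that peer's transport
      forwardedTo.push(id);
    }
  }
  return forwardedTo;      // ids the packet was forwarded to
}

module.exports = { forwardPacket };
```

The point of the sketch is the topology, not the transport: because the fan-out happens server-side, your bandwidth bill grows with the number of listeners, which is the trade-off Discord accepts in the article linked above.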

Client browser webcam streaming to node server and back

I've been researching a lot on how to live stream frames coming from the camera on browser, to a node server. This server processes the image, and then is supposed to send the processed image back to the client. I was wondering what the best approach would be. I've seen solutions such as sending frames in chunks to the server, for the server to process. I've looked into webRTC, but came to the conclusion that this works more for client to client connections. Would a simple implementation such as using websockets, or using socket.io suffice?
You can use WebSockets. But, I'd not recommend it. I don't think you should drop WebRTC, yet. It's not just for client to client connections. You can use a MediaServer like Kurento or Jitsi to process your frames and return the output. I've seen Kurento samples for adding filters and stuff. You can build your own modules on how to process the frames. I'd recommend that you check the MediaServer and see if it fits your requirements. Use WebSockets only if you are sure that WebRTC doesn't work for you.
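If you do prototype the WebSocket route first, the server side reduces to "receive binary frame, transform the bytes, send the result back". The invert filter below is a stand-in for whatever real image processing you have, and the `ws` wiring shown in the comment is an assumption, not something tested here:

```javascript
// Server-side frame processing sketch for a WebSocket pipeline.
// The browser grabs a frame onto a <canvas>, sends the pixel data as a
// binary message, and the server transforms it and sends it back.

function invertFrame(pixels) {
  // pixels: Buffer of 8-bit samples (e.g. grayscale or RGB without alpha)
  const out = Buffer.alloc(pixels.length);
  for (let i = 0; i < pixels.length; i++) out[i] = 255 - pixels[i];
  return out;
}

// With the `ws` package the handler would look roughly like (not run here):
//   wss.on('connection', sock =>
//     sock.on('message', frame => sock.send(invertFrame(frame))));

module.exports = { invertFrame };
```

Be aware that sending raw canvas frames this way costs far more bandwidth than WebRTC's encoded video, which is one reason the MediaServer route above usually wins for anything beyond a prototype.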

Straightforward way of getting a telephone audio stream into a node.js environment

What's the most straightforward way of receiving and sending a real time audio between a VoIP calling service and a node app that's only connected to the internet? It needs to be able to 'dial' a call, and send/ receive audio.
Right now, the architecture I've figured out is roughly to use Twilio's Elastic SIP trunking, then setup a SIP server like Asterisk that proxies RTP to WebRTC and connect that to Twilio, and then use something like JsSIP (although I'm not even sure if it allows getting an audio stream in a node environment) as a SIP over WebRTC client, but this is extremely complicated to setup, and just feels like overkill.
Is there an easier way/ service that provides this functionality, or at least an existing guide on how to do this?
You have the following options:
1) WebRTC
2) EAGI (the call audio is delivered to your script on file descriptor 3, one way)
3) Asterisk to JACK
4) create your own C/C++ handler and output whatever format you want

Video Stream Hosting

Good day! I'm a newbie on video streaming. Can you help me find good ways on how to make a video streaming secure?
I'm having some issues on my video hosting project security.
I am creating a web page which embeds a video stream hosted on a different server from the one where my web page is deployed.
Server 1 (web page video embed) calls the video stream on Server 2 (video host).
The problem is that they are hosted on completely different networks. Should Server 2, where the video is hosted, be private and only allow Server 1 to fetch the video stream (creating a server-to-server transfer of data), or should it be public so that clients can access it directly?
Can you help me decide what to do to secure my videos?
I badly need some idea on this... thanks guys!
How are you streaming and what streaming protocol are you using?
Server-to-server won't help in securing the video. It is better to stream the video directly from Server 2 (video host) to the client, so that it isn't an overhead for Server 1 (web page video embed). You need a secure way to protect your video on Server 2; if Server 2 is not secure, streaming through Server 1 won't help either.
Here are the security levels of different video streaming approaches:
1) Progressive download. This can be done over the normal HTTP protocol. In this approach the video URL is visible in the browser, and once a user has the URL they can download the video as a normal file download. Security is very low here; even if you sign the video URL, the user can download the video easily.
2) Streaming. You can stream the video using a protocol such as RTMP. In this approach the video can't be downloaded directly, but good software exists that can capture the video stream and save it to the PC.
3) Secure streaming. There are protocols like RTMPE. I have only tried RTMPE; in this protocol the streaming content is encrypted on the server and decrypted on the client, so capture software can't grab the video stream.
Along with approach 3, signing the video URL adds more security. Hope this helps.
