Capture audio/video in NodeJS from hardware, stream it to frontend - node.js

I am running a NodeJS/express server on a device (I have both BeagleBone and Raspberry Pi). This software needs to also access camera / microphone hardware directly, and send these audio/video streams to connected clients. I don't care at this point how the video/audio gets to the client, just that it gets there and can be decoded and played. The client in this case is a React Native mobile app.
I want to take a moment to mention that this "device code" is NOT running in a browser. It's NodeJS / server side code. Consider this, for all intents and purposes, a "security" camera device. It sits somewhere and broadcasts what it sees and hears.
How do I:
a) Access video and audio streams in NodeJS
b) Get those streams into some kind of format that a web browser can play?
c) Decode the given video/audio in React Native?
d) Decode the video/audio in React (web)?
Working code examples would be greatly appreciated, as opposed to explanations that lead me to dead ends when things don't work as expected.
I've been googling this for the last month and cannot find answers. I can't even find anyone else doing this same kind of project. Any help is appreciated. Thanks
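For (a) and (b), one common approach is not to touch the raw hardware from JavaScript at all, but to let ffmpeg do the capture and encoding and have Node act as a relay. Below is a minimal, hypothetical sketch along those lines, assuming ffmpeg is installed on the device, the camera is /dev/video0 (Video4Linux2), the microphone is the default ALSA device, and the ws npm package is available; the port, resolution and bitrates are placeholders.

    // Sketch: capture camera + mic with ffmpeg, relay MPEG-TS over a WebSocket.
    const { spawn } = require('child_process');
    const WebSocket = require('ws');

    const wss = new WebSocket.Server({ port: 8081 });

    // ffmpeg does the actual capture and encoding; Node only relays the bytes.
    const ffmpeg = spawn('ffmpeg', [
      '-f', 'v4l2', '-framerate', '25', '-video_size', '640x480', '-i', '/dev/video0', // camera
      '-f', 'alsa', '-i', 'default',                                                   // microphone
      '-f', 'mpegts',
      '-codec:v', 'mpeg1video', '-b:v', '800k',
      '-codec:a', 'mp2', '-b:a', '128k',
      '-'                                                                              // write the muxed stream to stdout
    ]);

    // Fan every MPEG-TS chunk out to all connected clients.
    ffmpeg.stdout.on('data', (chunk) => {
      wss.clients.forEach((client) => {
        if (client.readyState === WebSocket.OPEN) client.send(chunk);
      });
    });

    ffmpeg.stderr.on('data', (line) => process.stderr.write(line));

On a web client, a JavaScript MPEG-1 decoder such as jsmpeg can play that WebSocket stream into a canvas; React Native has no direct equivalent, so there you would more likely repackage the same ffmpeg output as HLS or RTSP and hand it to a native video component.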

Related

Recommendation on client-side echo cancellation

I am developing an MCU-based VoIP service. The traditional way of doing an MCU, I think, is to have N audio mixers at the server, so that every participant in the call receives a stream that does not include their own voice.
What I wish to do instead is have only one audio mixer running at the server and, on a broadcast-style model, send the final mixed audio to every participant (for scalability, obviously).
This obviously creates the problem of hearing your own voice coming back from the speaker in the MCU's output stream.
I am wondering if there is any "client-side echo cancellation" project that I can use to cancel the user's own voice at the desktop/mobile level.
The general approach is to filter/subtract each participant's own voice in the MCU itself; doing this on the client side does not work.
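To make that concrete, here is a minimal, hypothetical sketch of the server-side "mix-minus" idea: compute one full mix, then derive each participant's feed by subtracting their own samples, so you still only run a single mixing pass. It assumes each participant's audio arrives as decoded PCM in equal-length Float32Array frames.

    // Sketch: mix-minus. inputs is a Map<participantId, Float32Array>.
    function mixMinus(inputs) {
      const frameLen = inputs.values().next().value.length;
      const full = new Float32Array(frameLen);

      // One pass to build the full mix (plain summation).
      for (const samples of inputs.values()) {
        for (let i = 0; i < frameLen; i++) full[i] += samples[i];
      }

      // Each participant's feed = full mix minus their own samples.
      // Note: only valid if the full mix is a plain sum (no per-mix compression/limiting).
      const outputs = new Map();
      for (const [id, samples] of inputs) {
        const out = new Float32Array(frameLen);
        for (let i = 0; i < frameLen; i++) out[i] = full[i] - samples[i];
        outputs.set(id, out);
      }
      return outputs;
    }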

Client browser webcam streaming to node server and back

I've been researching a lot on how to live-stream frames coming from the camera in the browser to a Node server. This server processes the image, and then is supposed to send the processed image back to the client. I was wondering what the best approach would be. I've seen solutions such as sending frames in chunks to the server for it to process. I've looked into WebRTC, but came to the conclusion that it works more for client-to-client connections. Would a simple implementation, such as using WebSockets or socket.io, suffice?
You can use WebSockets. But I'd not recommend it. I don't think you should drop WebRTC yet. It's not just for client-to-client connections. You can use a media server like Kurento or Jitsi to process your frames and return the output. I've seen Kurento samples for adding filters and such, and you can build your own modules for how to process the frames. I'd recommend that you check out a media server and see if it fits your requirements. Use WebSockets only if you are sure that WebRTC doesn't work for you.
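If you do end up on the simple WebSocket/socket.io route, a minimal sketch of the client side might look like the following; the server URL and the 'frame'/'processed-frame' event names are just this example's assumptions.

    // Sketch: grab webcam frames into a canvas and ship them via socket.io.
    const socket = io('http://localhost:3000');
    const video = document.querySelector('video');
    const canvas = document.createElement('canvas');

    navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
      video.srcObject = stream;
      video.play();
      setInterval(() => {
        canvas.width = video.videoWidth;
        canvas.height = video.videoHeight;
        canvas.getContext('2d').drawImage(video, 0, 0);
        // One JPEG data URL per frame: simple, but far less efficient than WebRTC.
        socket.emit('frame', canvas.toDataURL('image/jpeg', 0.6));
      }, 100); // ~10 fps
    });

    socket.on('processed-frame', (dataUrl) => {
      document.querySelector('img').src = dataUrl; // show the server's output
    });

On the Node side you would listen for 'frame', decode the data URL, run your image processing, and emit the result back as 'processed-frame'. This is easy to build but considerably less efficient than letting a WebRTC media server handle the encoding.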

Ways to broadcast audio from WebAudio API to server-side and then to connected clients

I am developing a collaborative instrument-playing game, where multiple users will play an instrument (a synthesizer or sample, using the WebAudio API). In my first prototype I've set up a keyboard that sends note/volume signals via Socket.io to the server, and when the server gets that signal it sends it back to all connected sockets, which will play the corresponding note.
You might have guessed it right: there's a massive amount of lag and inconsistency as to the order of arrival of notes.
What are some efficient ways that I can send the output of WebAudio to the server, and have it broadcast to all connected users, so I have some sort of consistency?
You could try using a MediaStream by adding a MediaStreamAudioDestinationNode to your audio node graph as a destination, and using that stream with either WebRTC or RecordRTC to send it to your server.
Here is some info I found that you could look at.
It talks about using the getUserMedia method, but both getUserMedia and a MediaStreamAudioDestinationNode give you a MediaStream object. This other info
has some ideas on how you could send a MediaStream to your server, though it says the stream needs to be recorded first rather than sent while it's live and running.
Sending a MediaStream to host Server with WebRTC after it is captured by getUserMedia
I hope this helps :)
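As a hypothetical sketch of what that looks like in code (an oscillator stands in for whatever synth or sample nodes the game actually uses):

    // Sketch: route the Web Audio graph into a MediaStream usable by WebRTC/RecordRTC.
    const audioCtx = new AudioContext();
    const osc = audioCtx.createOscillator();              // stand-in for the real synth
    const dest = audioCtx.createMediaStreamDestination(); // MediaStreamAudioDestinationNode

    osc.connect(dest);                  // anything connected to `dest` ends up in the stream
    osc.connect(audioCtx.destination);  // also play it locally
    osc.start();

    const stream = dest.stream;         // a MediaStream, same shape as getUserMedia's result

    // e.g. add it to a peer connection so the server receives real audio, not note events:
    const pc = new RTCPeerConnection();
    stream.getAudioTracks().forEach((track) => pc.addTrack(track, stream));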

webRTC - stream webcam in real time on a webpage

I need to stream, online and in real time, the video captured from my domestic webcam, making it appear in a webpage accessible from anywhere, so I can watch it from a browser by entering something like http://example.com/webcamStream.
I read some material about WebSockets and WebRTC, and it seems that in my case the best option could be WebRTC.
I already installed a web server (Apache) and set up my domestic router so that it now redirects external requests to the Apache server. I installed Node.js and the node_modules socket.io, express, and ws.
I did some little tests following tutorials like this one (which creates a little WebSocket chat) and this one, which creates a video stream using WebRTC.
In the second example I was able to start the video stream, but I am still confused about how to make that stream accessible from the web; it is still not clear to me, as I am not very experienced with this kind of thing.
I hope someone can help me understand what I need in order to accomplish that webcam stream from my house. Any help will be really appreciated :)
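For what it's worth, the part such tutorials usually gloss over is the signaling server that lets the capturing page and the viewing page exchange WebRTC offers, answers and ICE candidates. A minimal, hypothetical sketch using the express and socket.io modules you already installed (the 'offer'/'answer'/'candidate' event names are just this example's convention):

    // Sketch: tiny WebRTC signaling relay with Express + Socket.IO.
    const express = require('express');
    const http = require('http');
    const { Server } = require('socket.io');

    const app = express();
    app.use(express.static('public'));      // serves the webcamStream page

    const server = http.createServer(app);
    const io = new Server(server);

    io.on('connection', (socket) => {
      // Simply forward signaling messages to everyone else.
      socket.on('offer', (msg) => socket.broadcast.emit('offer', msg));
      socket.on('answer', (msg) => socket.broadcast.emit('answer', msg));
      socket.on('candidate', (msg) => socket.broadcast.emit('candidate', msg));
    });

    server.listen(8080, () => console.log('signaling on http://localhost:8080'));

The media itself then flows peer-to-peer, so nothing else has to pass through Apache or Node; but to reach the stream from outside your LAN you will generally also need the page served over HTTPS (getUserMedia requires a secure origin in current browsers) and a STUN/TURN server in the RTCPeerConnection configuration.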

Show local video stream in HTML5 video tag

I am working on a system where we want to show a video stream from a video capture card in a browser. The browser connects to a remote server and fetches an HTML page that has video in it. This video should be streamed from the client machine where the video capture card is connected.
On the client side we are running Linux, and the capture card is registered as /dev/video0 by Video4Linux2. The browser on the client side is Chrome (chromium-browser). On the client side we also have a web server (lighttpd) that could be used for streaming.
I have looked into the getUserMedia API, but support for it seems to be poor right now. Other thoughts I have had are to use the local web server, or to set up a streaming server on the client side that streams the video source locally.
Any ideas how to design this would be great input for me!
Thanks,
/Peter
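For reference, the getUserMedia route mentioned above would look roughly like the sketch below if the capture card is exposed as a normal camera device; this is hypothetical, and at the time of the question Chrome still kept the API behind a webkit prefix.

    // Sketch: render the local capture device in a <video> tag, no server involved.
    const video = document.querySelector('video');

    navigator.mediaDevices.getUserMedia({ video: true, audio: false })
      .then((stream) => {
        video.srcObject = stream;   // older Chrome: video.src = window.URL.createObjectURL(stream)
        return video.play();
      })
      .catch((err) => console.error('Could not open the capture device:', err));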
Since Chrome does not yet support RT(S)P streaming for the <video> tag, you will have to use a plugin for this.
Given its availability, I would suggest using Flash to write a simple SWF which finds the correct video source and displays it.
If needed you can use one of the many 'Recording Apps' available and strip out the recording part.
