RTP to WebRTC or WebSocket - node.js

I have walkie-talkies sending speech via RTP (G.711a) into my LAN. My goal is to take this audio stream and provide it (one-to-many) to different web clients. My preferred solution is WebRTC, but I can't find the right tools for the job. My preferred environment is Node.js and C/C++.
Can anybody out there help me find the right entry point/tools for this task?
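One possible starting point (not a full WebRTC solution, just a sketch under assumptions of my own: RTP arriving on UDP port 5004, browsers connecting over plain WebSockets, and the third-party ws package) is to receive the RTP packets in Node.js and fan the raw G.711a payload out to the connected clients:

    // Minimal sketch: receive RTP (G.711a) over UDP and fan the payload out
    // to WebSocket clients. Port numbers are assumptions. Requires: npm install ws
    const dgram = require('dgram');
    const { WebSocketServer, WebSocket } = require('ws');

    // Browsers connect here, e.g. new WebSocket('ws://host:8080') on the client.
    const wss = new WebSocketServer({ port: 8080 });

    // The walkie-talkie gateway is assumed to send RTP to this UDP port.
    const rtp = dgram.createSocket('udp4');

    rtp.on('message', (packet) => {
      if (packet.length <= 12) return;       // not even a full fixed RTP header
      const payload = packet.subarray(12);   // strip the 12-byte fixed header (ignores CSRC/extensions)
      for (const client of wss.clients) {
        if (client.readyState === WebSocket.OPEN) {
          client.send(payload);              // raw A-law bytes; decode in the browser
        }
      }
    });

    rtp.bind(5004, () => console.log('Listening for RTP on udp/5004'));

On the browser side the A-law samples would still need to be decoded (for example via the Web Audio API); a true one-to-many WebRTC path would instead need a gateway/SFU that terminates RTP, which is a bigger project.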

Related

Play video from one device to another

I'm looking to essentially use two devices: a Raspberry Pi 3 and a Mac running macOS 10.15. I am using the Pi to capture video from my webcam, and I want to use my Mac as a kind of extension of the Pi, so that when I use cv2.VideoCapture I can capture that same video, preferably in real time or something close to it. I'm programming this in Python on both devices. I thought of putting it on a local server and retrieving it, but I have no idea how I could use that with OpenCV. If someone could provide and explain a useful example, I would greatly appreciate it. Thank you.
To transfer a video stream, instead of a custom solution you could run an RTMP server on the source machine, feed it with the cam source, and have the target open the stream and process it.
A similar approach is widely implemented in IP cameras: they run an RTMP server to make the stream available to phones and PCs.

VoIP android studio

Hi, I'm not very good at coding, but I'm trying to learn as much as I can. I've been looking for tutorials on how to create an internet-calling app in Android Studio. So far I haven't found any. If anyone knows of a process that could guide me, I would very much appreciate it.
You can use Android's own implementation of the Session Initiation Protocol (SIP).
From the docs:
Android provides an API that supports the Session Initiation Protocol (SIP). This lets you add SIP-based internet telephony features to your applications. Android includes a full SIP protocol stack and integrated call management services that let applications easily set up outgoing and incoming voice calls, without having to manage sessions, transport-level communication, or audio record or playback directly.
Or other third-party libraries like the following:
1. Pjsip
2. Mjsip
3. doubango
4. belle-sip
Hope it helps.
P.S. Taken from this answer.
Refer to this also.
Happy Coding :)

Is SIP required for webRTC calling to legacy VTC products

I am working on a WebRTC application and would like to be able to support multiple calls, and to call from the browser to legacy VoIP or videoconferencing systems as well as browser to browser.
Now that Asterisk has added WebSocket support in its latest builds, would you need SIP and a SIP proxy in order to communicate with VoIP systems, or will Asterisk allow this?
Now that H.264 has been open-sourced by Cisco, would you still need a transcoder in order to call a legacy VTC system?
Is Node.js the preferred technology for implementing WebRTC client/server deployments? I've looked into Mobicents SIP Servlets a bit, but that seems to be the only alternative technology available besides a Node.js solution.
If needed, I am planning on creating a SIP trunk between an Asterisk server and our Polycom VBP, so the WebRTC clients should be able to get presence information through that connection. If no media transcoding is required with the recent changes, then media should be able to pass directly from the Polycom endpoint to the browser, with Asterisk handling the signalling.
Thank you to anyone who is able to answer any of these questions; it is still early in the R&D portion of this project for me, and I'd like to get as much information as possible.
Also: I did see SIP over WebSockets to true SIP. I understand that "something" needs to stand between the WebRTC client and the VoIP phone or legacy SIP endpoint. What I would like to know is whether that can be just Asterisk with the recent update. If Asterisk is all that is required, is there a way to include a media transcoder like Red5? I haven't seen anything in the WebRTC API that would allow you to include a transcoder; Asterisk has transcoding modules, but none that will do VP8 to H.26x or Opus to anything, as far as I know.
The answer to that question depends heavily on the destination "legacy" system. Cisco "legacy" systems use H.323 and SIP, which are not compatible with WebRTC.
Sure, there are a lot of ways to set up Asterisk, Red5, OpenSIPS or others as the translation layer.
WebRTC's goal is calling from the browser. It was never supposed to have any API for transcoding. That has to be done by the server side (which requires special knowledge and experience to set up properly).
There is a lot of documentation available on the internet; there is no way to put the answer in less than 30 pages of text.

Controlling an Arduino over the web with the lowest latency

I have an Arduino board in one location, and a web server.
I also have a website that is supposed to control the Arduino. What technique would allow users to take control of the Arduino board with the least amount of latency?
I have Node.js, a socket server, and Jabber in mind to experiment with, but is this the right direction?
You should have a look at Socket.IO for implementing WebSockets on the server and client side.
There's a great project called duino for accessing an Arduino with Node.js; you "only" have to pipe all communication through WebSockets.
Update: In the meantime I have published a framework for sending commands to an Arduino with Node.js, JavaScript and WebSockets. Make sure to have a look at Noduino!
I had good luck using node-serialport to talk to the Arduino. Using the serial port results in very low latency, and I used it to build a photobooth. The code is on GitHub if you want to check it out, though it's very poorly organized as I was rushing to get this done for my wedding and well... corners were cut.
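To make the serial-port route concrete, here is a minimal sketch of relaying browser commands to the board over WebSockets (the device path /dev/ttyUSB0, the baud rate, the port number and the single-character command protocol are all assumptions of mine; it also assumes the serialport v10+ API and the ws package):

    // Minimal sketch: relay WebSocket messages to an Arduino over the serial port.
    // Requires: npm install serialport ws
    const { SerialPort } = require('serialport');        // serialport v10+ API
    const { WebSocketServer, WebSocket } = require('ws');

    // Serial link to the Arduino; path and baud rate must match your setup.
    const arduino = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 9600 });

    // Browsers connect here and send single-character commands, e.g. '1' / '0'.
    const wss = new WebSocketServer({ port: 8080 });

    wss.on('connection', (ws) => {
      // Forward each browser message straight to the board over the serial port.
      ws.on('message', (msg) => arduino.write(msg.toString()));
    });

    // Relay anything the Arduino prints back to every connected browser.
    arduino.on('data', (data) => {
      for (const client of wss.clients) {
        if (client.readyState === WebSocket.OPEN) client.send(data.toString());
      }
    });

Keeping the serial write directly in the WebSocket message handler avoids any polling, which is what keeps the round-trip latency low.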

How to program an audio/video application on a network?

I want to make (for fun, as a challenge) a videoconferencing application. I have some ideas about this:
1) Take the audio/video streams (I don't know exactly what an audio/video stream is).
2) Pass them to a server that lets the clients communicate. I can figure out how to write a server (there are a lot of books and documentation about this), but I really don't know how to interact with the webcam and with audio/video in general.
I would like some links, books, or suggestions about the basics of digital audio/video, especially on the programming side. Please help me!
I want to make it run on a Linux platform.
Linux makes video grabbing really nice, as long as you have a driver that outputs the video stream to the /dev/video* devices. All you have to do is open a control connection to the device (an exercise for the OP) and then read from the device like a file (given the parameters set by the control connection). Audio should work the same way, but don't quote me on that.
BTW: video streaming from a server is a very complex issue. You have to develop your own protocol or use an existing one. You have to be very aware of network delays, and adjust the information sent to the client (resize or recompress) based on the link capacity between the client and the server.
