Mix multiple RTP streams into a single one - audio

I am trying to build a basic conference call system based on plain RTP.
                 ______
RTP IN #1 ______|      |______ MIX RTP receiver #1
                | MIX  |
RTP IN #2 ______| RTP  |______ MIX RTP receiver #2
                |______|
I am creating RTP streams on Android via the AudioStream class and using a server written in Node.js to receive them.
The naive approach I've been using is that the server receives the UDP packets and forwards them to the participants of the conversation. This works perfectly as long as there are only two participants: it's basically the same as if the two were sending their RTP streams directly to each other.
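A minimal sketch of that naive relay in Node.js (the port and the endpoint-tracking scheme are my own illustration; the actual code may differ):

const dgram = require('dgram');

const socket = dgram.createSocket('udp4');
const peers = new Map(); // "address:port" -> { address, port }

socket.on('message', (packet, rinfo) => {
  const sender = `${rinfo.address}:${rinfo.port}`;
  peers.set(sender, { address: rinfo.address, port: rinfo.port });

  // Forward the RTP packet, untouched, to every other endpoint seen so far.
  for (const [key, peer] of peers) {
    if (key !== sender) socket.send(packet, peer.port, peer.address);
  }
});

socket.bind(5004); // hypothetical relay port

With exactly two peers this degenerates into plain bidirectional forwarding, which is why it works for two participants and breaks for more.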
I would like this to work with multiple participants, but forwarding the RTP packets as they arrive at the server doesn't work, for fairly obvious reasons. With more than two participants, delivering the packets coming in from the different sources to each participant (excluding the sender of each packet) results in completely broken audio.
Without changing the topology of the network (star rather than mesh), I presume the server will need to perform some operations on the packets in order to produce a single output RTP stream containing the mixed input streams.
I'm just not sure how to go about doing this.

In your case I know of two options:
MCU or Multipoint Control Unit
Or RTP simulcast
MCU (Multipoint Control Unit)
This is a middle box (network element) that receives several RTP streams and generates one or more RTP streams.
You can implement it yourself, but it is not trivial because you need to deal with:
Stream decoding (and therefore you need a jitter buffer and codec implementations)
Stream mixing - so you need some synchronisation between streams (collect some data from source 1 and source 2, mix it and send it to destination 3); a sketch of this step follows below
There are also several projects that can do this for you (like Asterisk, FreeSWITCH etc.); you could try to write an integration layer with them. I haven't heard of anything comparable on Node.js.
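The mixing step itself, once the jitter buffers and decoders have produced aligned raw audio, boils down to a saturating sample-wise sum. A minimal sketch, assuming two time-aligned blocks of 16-bit little-endian PCM in Node.js Buffers:

// Mix two already-decoded, time-aligned blocks of 16-bit signed PCM.
// In a real MCU these would come out of per-source jitter buffers and decoders.
function mixPcm16(a, b) {
  const samples = Math.floor(Math.min(a.length, b.length) / 2);
  const out = Buffer.alloc(samples * 2);
  for (let i = 0; i < samples; i++) {
    let sum = a.readInt16LE(i * 2) + b.readInt16LE(i * 2);
    // Clamp to the 16-bit range to avoid wrap-around distortion.
    if (sum > 32767) sum = 32767;
    if (sum < -32768) sum = -32768;
    out.writeInt16LE(sum, i * 2);
  }
  return out;
}

The mixed block then has to be re-encoded and re-packetized as RTP before being sent on to destination 3.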
Simulcast
This is a pretty new technology and its specifications are available only as IETF drafts. The core idea here is to send several RTP streams inside one RTP session simultaneously.
When the destination receives several RTP streams, it needs to do exactly what an MCU does - decode all the streams and mix them together - but in this case the destination may use a hardware audio mixer to do it.
The main con of this approach is the bandwidth to the client device. If you have N participants you need to:
either send all N streams to all the other participants
or select streams based on some metadata, like voice activity or audio level
The first is not efficient; the second is very tricky.

The options suggested by Dimtry's answer were not feasible in my case because:
The middle box solution is difficult to implement, requires too many resources, or makes you rely on an external piece of software, which I didn't want to depend on, especially because the Android RTP stack should work out of the box with basic support from a server component, particularly for hole punching
The simulcast solution cannot be used because the Android RTP package cannot handle it, and as far as my understanding goes it is only capable of handling simple RTP streams
Other options I've been evaluating:
SIP
Android supports it, but it's more of a high-level feature and I wanted to build the solution into my own custom application, without relying on the additional abstractions introduced by a high-level protocol such as SIP. It also felt just too complex to set up, and conferencing doesn't even seem to be a core feature of it but rather an extension
WebRTC
This is supposed to be the de facto standard for peer-to-peer voice and video conferencing, but looking through code examples it just looks too difficult to set up. It also requires support from servers for hole punching.
Our solution
Even though I had, and still have, little experience in this area, I thought there must be a way to make it work using plain RTP and some support from a simple server component.
The server component is necessary for hole punching; without it, getting the clients to talk to each other is really tricky.
So what we ended up doing for conference calling is to have the caller act as the mixer and the server component act as the middle man that delivers RTP packets to the participants.
In practice:
Whenever an N-user call is started, we instantiate N-1 simple UDP broadcast servers, listening on N-1 different ports
We send those N-1 ports to the initiator of the call via a signaling mechanism built on socket.io, and one port to each of the remaining participants
The server component listening on those ports simply acts as a relay: whenever it receives a UDP packet containing RTP data, it forwards it to all the connected clients (the sockets it has seen thus far) except the sender - see the sketch after this list
The initiator of the call receives data from and sends data to the other participants, mixing it via the Android AudioGroup class
The participants send data only to the initiator of the call, and they receive the mixed audio (the caller's own voice together with the other participants' voices) on the server port that has been assigned to them
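A condensed sketch of the relay part of this setup (the port range, the call-size handling and the function name are illustrative, not our actual code):

const dgram = require('dgram');

// Spawn one relay per non-initiating participant and return the bound ports,
// which are then distributed over the socket.io signaling channel.
function startCallRelays(participantCount, basePort = 40000) {
  const ports = [];
  for (let i = 0; i < participantCount - 1; i++) {
    const relay = dgram.createSocket('udp4');
    const endpoints = new Map(); // "address:port" -> rinfo

    relay.on('message', (packet, rinfo) => {
      const sender = `${rinfo.address}:${rinfo.port}`;
      endpoints.set(sender, rinfo);
      // Forward the RTP packet untouched to everyone seen on this port,
      // except the sender.
      for (const [key, ep] of endpoints) {
        if (key !== sender) relay.send(packet, ep.port, ep.address);
      }
    });

    relay.bind(basePort + i);
    ports.push(basePort + i);
  }
  return ports;
}

Since the initiator connects to all N-1 ports and each participant to exactly one, every relay effectively bridges the initiator and a single participant.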
This allows for a very simple implementation, both on the client and on the server side, with minimal signaling work required. It's certainly not a bulletproof conferencing solution, but given its simplicity and feature completeness (especially regarding common network issues like NAT traversal, which is basically a non-issue when a server assists), it is in my opinion better than writing lots of code and spending many resources on server-side mixing, relying on external software like SIP servers, or using protocols like WebRTC, which basically achieve the same thing with much more implementation effort.

Related

Why is the application data of a packet called a protocol?

I've been reading about packets a lot today. I was confused for some time because SMTP, HTTP and FTP, for example, are all called protocols, yet they also somehow utilize transport protocols like TCP. I couldn't locate them among the packet's four layers, until I discovered they're simply part of the application layer.
I want to know what exactly these "protocols" offer. I'm guessing a specific format for the data, which applications on the client side know how to handle? If so, does this mean that, realistically, I might have to create my own "protocols" if I created an application with unique functionality?
A protocol, in this case, is just a structured way of communicating between two or more parties.
If you write, for example, a PHP app and offer an API, you have created a protocol for interacting with your program. It defines how others interact with it and what response they can expect while doing so. Your self-created protocol depends on others, like HTTP and TCP.
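As a toy illustration of such a self-created protocol (the commands, message shape and port are made up), here is one that defines "one JSON object per line" on top of TCP in Node.js:

const net = require('net');

// A toy application-layer protocol: one JSON object per line over TCP.
// A real implementation would buffer partial lines across TCP chunks
// and handle malformed input.
const server = net.createServer((conn) => {
  conn.on('data', (chunk) => {
    for (const line of chunk.toString().split('\n')) {
      if (!line.trim()) continue;
      const msg = JSON.parse(line);
      if (msg.cmd === 'PING') conn.write(JSON.stringify({ reply: 'PONG' }) + '\n');
      if (msg.cmd === 'ECHO') conn.write(JSON.stringify({ reply: msg.text }) + '\n');
    }
  });
});

server.listen(9000); // arbitrary port

The rules "one JSON object per line" and "a PING gets a PONG back" are the protocol; TCP underneath only guarantees that the bytes arrive in order.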
I suggest watching the following video by LiveOverflow, which explains exactly this:
https://www.youtube.com/watch?v=d-zn-wv4Di8&ab_channel=LiveOverflow
I want to know what exactly these "protocols" offer.
You can read the definition of each protocol, if you really want to.

WebRTC peer to nodejs

I would like to use WebRTC, but instead of p2p I would like to broadcast my audio/video feed to Node.js in real time. I can encode the video at 125 kbps and 10-12 frames per second for smooth transmission. The idea is that Node.js will receive this feed, save it, and broadcast it at the same time as a realtime session/webinar. I can connect p2p, but I am not sure how to
send feed to nodejs instead of peer
on nodejs how to receive feed
The WebRTC protocol suite is complex enough that implementing a selective forwarding unit (SFU) from scratch is likely to take a team of experts at least a year. It requires handling a variety of networking protocols, including datagrams (UDP) and TCP, and it may require transcoding between video and audio codecs.
The good news is that browser endpoints are now excellent. And open-source server implementations are good enough to get to a minimum viable product.

Handle different UDP message types in Nodejs

I'm writing an application in Node.js where a client sends UDP messages to a server. I'm trying to find out how people normally handle different message types in Node.js, but I can only find tons of examples of echo servers where the kind of message is not relevant.
The only example I have found so far is https://github.com/vbo/node-webkit-mp-game-template/tree/networking_1/networking
Maybe the best way is to send the UDP messages as JSON?
User Datagram Protocol (UDP) is a network protocol and mechanism for sending short messages from one host to another without any guarantee of delivery. What you put in the message is entirely up to you.
While JSON can be used to encode your message, it suffers from two problems: it is not secure, and it is self-describing.
The first problem means that bad actors can easily see the content of your message while it is in flight; the second implies substantial overhead for every message, above and beyond its intended purpose.
Depending on your needs, a better choice might be to define your own binary protocol, specific to your purpose, using a Node.js Buffer.
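For example, a minimal self-defined layout could be a 1-byte message type, a 2-byte big-endian payload length, and then the payload (the layout and the message type are purely illustrative):

const MSG_CHAT = 1; // example message type

function encode(type, payload) {
  const body = Buffer.from(payload, 'utf8');
  const header = Buffer.alloc(3);
  header.writeUInt8(type, 0);           // 1 byte: message type
  header.writeUInt16BE(body.length, 1); // 2 bytes: payload length
  return Buffer.concat([header, body]);
}

function decode(buf) {
  const type = buf.readUInt8(0);
  const length = buf.readUInt16BE(1);
  return { type, payload: buf.subarray(3, 3 + length).toString('utf8') };
}

// encode(MSG_CHAT, 'hello') -> <Buffer 01 00 05 68 65 6c 6c 6f>

Since each UDP datagram carries exactly one such message, the receiving side can switch on the type byte instead of parsing self-describing JSON.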
Another might be to use a more compact interchange format like Thrift.

Should I use WebRTC or Websockets (and Socket.io) for OSC communication

I am working on an application that will send OSC control messages (which are, as I understand it, datagram packets) from a web page to an OSC receiver (server), such as Max/MSP or Node or any other.
I know typically UDP is used because speed is important in the realtime/ audio visual control work done with OSC (which is also the work I will be doing), but I know other methods can be used.
Right now, for instance, I am sending OSC from the browser to a Node.js server (using socket.io), and then from the Node.js server to Max (which is where I ultimately want the data), also using socket.io. I believe this means I am using WebSockets, and the delay/latency has not been bad.
I am curious though, now that WebRTC is out, if I should place the future of my work there. In all of my work with OSC I have always used UDP, and only used the Socket.io/Websockets connection because I didn't know about WebRTC.
Any advice on what I should do? Specifically I am interested in:
1. How I could send OSC messages from the browser directly to an OSC server (as opposed to going through a node server first)
2. Whether I should stay with the Node/Socket.io/WebSocket method for sending OSC data or look into WebRTC
Regarding your first question - whether there is a solution for a direct WebSocket link between browser and server (for OSC) - you can have a look at this:
ol.wsserver object for Max/MSP by Oli Larkin: https://github.com/olilarkin/wsserver
I published an osc-js utility library which does the same with a UDP/Websocket bridge plugin:
https://github.com/adzialocha/osc-js
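For reference, the UDP leg of such a bridge is quite small. A hand-rolled sketch for a single-float OSC message in Node.js (the address, value and Max's udpreceive port are assumptions; a library like osc-js covers the general case):

const dgram = require('dgram');

// OSC strings are null-terminated and padded to a multiple of 4 bytes.
function oscPad(str) {
  const len = Math.ceil((str.length + 1) / 4) * 4;
  const buf = Buffer.alloc(len);
  buf.write(str, 0, 'ascii');
  return buf;
}

// Encode an OSC message with a single float argument: address,
// type tag string ",f", then the big-endian float32 value.
function encodeOscFloat(address, value) {
  const arg = Buffer.alloc(4);
  arg.writeFloatBE(value, 0);
  return Buffer.concat([oscPad(address), oscPad(',f'), arg]);
}

const udp = dgram.createSocket('udp4');
udp.send(encodeOscFloat('/volume', 0.8), 7400, '127.0.0.1'); // Max [udpreceive 7400]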
This is not really a question about a specific technical challenge, but rather a discovery challenge, and it is pretty much opinion-based.
But I will try to share my thoughts, as they may be useful to others too.
If the OSC server supports WebSocket or WebRTC connections, then you might use one of them to talk to it directly.
Given the nature of browsers, you probably need to support several protocols rather than one specific one, as browsers differ in what they support: http://caniuse.com/rtcpeerconnection - unless your users can be "forced" to use a specific browser.
WebRTC is meant to be high-performance. I'm not sure it is fair to compare it with UDP at all, as it is much more than UDP, and its implementation varies (internally) per browser.
You need to talk to a server rather than between clients, while WebRTC is mainly designed for peer-to-peer communication. So you need to investigate carefully what the benefits are, in terms of both latency and CPU load on the server side.
The benefit of UDP is the fact that some packets can be dropped/skipped/undelivered, because the data is not "mandatory" - for things like picture frames or pieces of sound, loss only results in reduced quality, which is "expected" behaviour in the streaming world. WebSockets will ensure that data is delivered, and the "price" for this is the machinery that, in the end, slows down the queue of packets in order to guarantee ordered delivery.

How to create an RTSP streaming server

So I am trying to create an RTSP server that streams music.
I do not understand how the server plays the music and how different requests get whatever is playing at that time.
So, to organize my questions:
1) How does the server play a music file?
2) What does a request to the server look like to get what's currently playing?
3) What does the response look like that gets the music playing in the client that requested it?
First: READ THIS (RTSP), and THEN READ THIS (SDP), and then READ THIS (RTP). Then you can ask more sensible questions.
It doesn't; the server streams small parts of the audio data to the client, telling it when each part is to be played.
There is no such request. If you want, you can have a URL for live streaming and, in the response to the RTSP DESCRIBE request, tell the client what is currently on.
Read the first (RTSP) document - it's all there! The answer to your question is this:
RTSP/1.0 200 OK
CSeq: 3
Session: 123456
Range: npt=now-
RTP-Info: url=trackID=1;seq=987654
But to get the music playing you will have to do a lot more to initiate a streaming session.
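For orientation, the client side of such a session setup looks roughly like this (the URL, track ID, client ports and CSeq values are illustrative; the Session ID comes from the SETUP response):

DESCRIBE rtsp://example.com/music RTSP/1.0
CSeq: 1

SETUP rtsp://example.com/music/trackID=1 RTSP/1.0
CSeq: 2
Transport: RTP/AVP;unicast;client_port=8000-8001

PLAY rtsp://example.com/music RTSP/1.0
CSeq: 3
Session: 123456
Range: npt=now-

Each request must wait for the server's 200 OK before the next step, and the actual audio then arrives over RTP on the negotiated client ports.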
You should first be clear about what RTSP and RTP are. The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in communication systems to control streaming media servers, whereas most RTSP servers use the Real-time Transport Protocol (RTP) for the actual media stream delivery. RTP typically uses UDP to deliver the packet stream. Try to understand these concepts first.
Then have a look at this project:
http://sourceforge.net/projects/unvedu/
This is an open-source project developed by our university, which is used to stream video (MKV) and audio files over UDP.
You can also find a .NET implementation of RTP and RTSP at https://net7mma.codeplex.com/, which includes an RTSP client and server implementation and many other useful utilities, e.g. implementations of many popular digital media container formats.
The solution has a modular design and better performance than ffmpeg or libav at the current time.
