If a WebRTC application is started in the middle of a phone call, the peer who is on the call cannot hear anything even after ending that call - browser

I have an application built on top of WebRTC. If a peer (A) clicks the link to the application and starts it in the middle of a phone call, that peer (A) still gets connected to peer (B) successfully. But even after ending the phone call, peer A cannot hear peer B, whereas peer B can see the video and hear the audio of peer A. The application works perfectly if the PeerConnection is created without any ongoing phone call.
I have tried an ICE restart (creating a new offer) and tried to acquire the user media again and replace the tracks, but nothing works.
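For reference, the recovery attempt looked roughly like the sketch below (assuming an existing RTCPeerConnection named pc; the signaling exchange is omitted):

async function tryRecoverAudio(pc) {
  // Re-acquire the microphone/camera and swap the fresh tracks into the existing senders.
  const freshStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  for (const sender of pc.getSenders()) {
    const newTrack = freshStream.getTracks().find(t => sender.track && t.kind === sender.track.kind);
    if (newTrack) await sender.replaceTrack(newTrack);
  }

  // Trigger an ICE restart by creating a new offer.
  const offer = await pc.createOffer({ iceRestart: true });
  await pc.setLocalDescription(offer);
  // ...then send the offer over the signaling channel and apply the answer as usual.
}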
Is there a way to detect in the browser that there is an ongoing phone call, so that I can prevent the customer from even initiating the RTCPeerConnection?
How can I recover from this situation of no audio at peer A even after hanging up the phone call?

This is a known Chrome bug; see here.

Related

WebRTC Mobile - Audio not working unless on same wifi

I am using react-native-webrtc to handle the WebRTC portion of this.
I am using WebSockets for signaling and trickle ICE to exchange the ICE candidates.
I queue my ICE candidates until setLocalDescription has been called on the callee side. Then I addIceCandidate for each candidate in the queue.
On the caller side I am doing the same thing and not processing my ICE candidates until setRemoteDescription has been called.
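The candidate-queuing pattern described above looks roughly like this, shown for the side that waits on setRemoteDescription (a sketch using the browser-style API that react-native-webrtc mirrors; pc, pendingCandidates and remoteDescriptionSet are names made up here):

const pendingCandidates = [];
let remoteDescriptionSet = false;

// Candidates arriving from the signaling channel before the remote description is set get queued.
async function onRemoteIceCandidate(pc, candidate) {
  if (remoteDescriptionSet) {
    await pc.addIceCandidate(candidate);
  } else {
    pendingCandidates.push(candidate);
  }
}

// Once setRemoteDescription succeeds, flush the queue.
async function onRemoteDescription(pc, description) {
  await pc.setRemoteDescription(description);
  remoteDescriptionSet = true;
  for (const candidate of pendingCandidates) {
    await pc.addIceCandidate(candidate);
  }
  pendingCandidates.length = 0;
}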
I am only doing audio, so no video is being used.
When I test this with two mobile devices on the same network I have no issues.
But if I disconnect one device from the WiFi, the call still connects just fine, except that the audio cannot be heard on either device.
The onConnectionStateChange handler still reports "connected" and onIceGatheringStateChanged still reports "complete".
I thought maybe I needed a TURN server to get this working, so I started using Twilio's paid TURN/STUN service, but the issue persists.
Any ideas what to look into?
BACKGROUND
Ok, so you need some background on P2P connections on RTC platforms. So here it is (in a very short version):
In order to establish a connection you have to establish a direct connection between the two clients (obvious, I know). In order to find these routes you need the help of network servers.
That's why you set up the local SDP with settings describing which servers you can reach: ICE, TURN, STUN (you can find more information, for ex. this one). Host (local) candidates are the most obvious ones, because those endpoints live within your local network, and that's why your version does not work across different networks.
Right, you have to use TURN/STUN to traverse NAT and find correct routes between peers. Most TURN servers are private and paid, but for a lightly loaded application public STUN servers would be more than enough.
You can find many available out there. One example set is below; a configuration sketch follows the list.
stun.l.google.com:19302
stun1.l.google.com:19302
stun2.l.google.com:19302
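For example, passing one of those public STUN servers into the RTCPeerConnection configuration could look like the sketch below (the TURN entry is a placeholder with a made-up URL and credentials, not a real server):

const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    {
      // Placeholder TURN entry; substitute the URL and credentials your provider (e.g. Twilio) gives you.
      urls: 'turn:your.turn.server:3478',
      username: 'user',
      credential: 'pass'
    }
  ]
});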
SOLUTION
Now coming to your problem. If your signaling says it has connected your devices, that does not mean the devices themselves are connected. (Just to clarify: if you have no media flowing between your devices, your RTC connection failed to establish, and it's not just the audio.)
The problem lies in the TURN/STUN servers used on your devices: you have to trace the SDP that is established during setRemoteDescription and check that those servers were included. Furthermore, there is always the Google demo, which works perfectly.
UPDATE
In order to trace how the remote SDP is set and the connection established, you have to print the candidates that will be used for the setup. To do that, print the information about which candidates are gathered around setLocalDescription and setRemoteDescription.
In the place where you are gathering candidates, add logging to print the candidate information. You should see that STUN and TURN candidates are there. Below is an example in JavaScript. The word ICE shouldn't bother you; it just means these are the candidates found after ICE traversal.
// Listen for local ICE candidates on the local RTCPeerConnection
peerConnection.addEventListener('icecandidate', event => {
  if (event.candidate) {
    // Log the entire candidate; you should see host, srflx (STUN) and relay (TURN) entries here.
    console.log('Gathered ICE candidate:', event.candidate.candidate);
    // This is also where you send the candidate to your signaling channel.
  }
});

Can you help me implement screen sharing in a WebRTC-based peer-to-peer connection?

I want to build a video chatting application based on WebRTC. It is pretty easy to get the screen-sharing stream via the dedicated API, but I am unable to stitch it into the ongoing peer-to-peer connection. Can anyone provide the necessary up-to-date resources and ideas for implementing this?
In a peer-to-peer connection, we mostly use getUserMedia to send our camera video to the other side.
To share your screen with the other person, you can use getDisplayMedia instead.
navigator.mediaDevices.getDisplayMedia({
  audio: true, video: true
}).then(async function (stream) {
  // The captured screen (and possibly audio) stream is available here.
}).catch(error => console.log(error));
This is the basic example that I used in my program.
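To stitch the captured screen into an already-established peer connection, one approach (a sketch, assuming an existing RTCPeerConnection named pc that is currently sending a camera video track) is to swap the outgoing video track with RTCRtpSender.replaceTrack, which does not require renegotiation:

async function shareScreen(pc) {
  // Capture the screen; audio capture support varies by browser.
  const screenStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const screenTrack = screenStream.getVideoTracks()[0];

  // Find the sender that currently carries the camera video and swap its track.
  const videoSender = pc.getSenders().find(s => s.track && s.track.kind === 'video');
  await videoSender.replaceTrack(screenTrack);

  // When the user stops sharing, you could switch back to the camera track here.
  screenTrack.addEventListener('ended', () => {
    // e.g. videoSender.replaceTrack(cameraTrack);
  });
}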

How is my WebRTC audio being heard when all elements are muted?

In my WebRTC application there are two video elements and both are muted. Both participants in the WebRTC chat have two peer connections. When only the one-to-one connection is running, everything works fine, but as soon as the other peer connection kicks in, even though all elements are muted, the sender can still hear his own audio played back to himself.
How is that even possible?
This is not possible. It's surely some bug in your code.
I just solved my issue after days of trial and error. It turned out to be a problem with the SDP: I changed 'a=sendrecv' in the sender's SDP to 'a=sendonly' for the senders.
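A similar effect can be achieved without editing the SDP string, by setting the transceiver direction on the sending side before creating the offer (a sketch, assuming a standard RTCPeerConnection named pc):

async function createSendOnlyOffer(pc) {
  // Mark every sending transceiver as send-only so the generated SDP contains a=sendonly.
  pc.getTransceivers().forEach(transceiver => {
    if (transceiver.sender && transceiver.sender.track) {
      transceiver.direction = 'sendonly';
    }
  });
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  return offer;
}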

How can we know on the server side if the peer is still connected, using Node.js and WebRTC?

I am using WebRTC to make an audio, video and chat application. How can we check on the server side whether a peer is still connected?
Actually, I want to check before making an audio/video call that the user on the other end is still connected. I am able to maintain presence (i.e. online/offline) when a user logs in or logs out of the application.
But suppose the network connection drops or gets disconnected; then I do not get any information on the server side. If I could get it, I could communicate it to the rest of the connected peers.
So I need help getting the information about whether a peer is still connected or not. I am using Node.js and WebRTC in my application.
Socket.IO has a concept of 'rooms' that makes it very handy for building WebRTC signaling servers, and a disconnect event is fired when a user disconnects. You can also set up a custom event to be emitted when, for example, a user stops a video stream or leaves a page.
You might want to take a look at the codelab at bitbucket.org/webrtc/codelab, which uses Socket.IO for signaling. (Apologies, once again, for shameless self promotion!)
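A minimal sketch of that idea, assuming a Socket.IO signaling server where clients join one room per conversation (the 'join' and 'peer-left' event names are made up for illustration):

const io = require('socket.io')(3000);

io.on('connection', socket => {
  // Each client joins a room for its conversation.
  socket.on('join', room => socket.join(room));

  // Fired automatically when the underlying connection drops (tab closed, network lost, ...).
  socket.on('disconnect', () => {
    // Tell the remaining clients that this peer is gone so they can update presence.
    socket.broadcast.emit('peer-left', socket.id);
  });
});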
You would need to implement your own logic to do that.
Since you already have the client registering presence, you could:
maintain a persistent connection via WebSockets
implement a polling / keep-alive algorithm between your clients and the server (a rough sketch of this follows)
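A rough sketch of the keep-alive idea, assuming each client holds a plain WebSocket connection and sends periodic pings (the 'ws' package, the 15-second threshold and the lastSeen map are assumptions for illustration):

const WebSocket = require('ws');          // assumes the 'ws' package
const wss = new WebSocket.Server({ port: 8080 });

const lastSeen = new Map();               // clientId -> timestamp of last ping

wss.on('connection', ws => {
  ws.on('message', msg => {
    const { type, clientId } = JSON.parse(msg);
    if (type === 'ping') lastSeen.set(clientId, Date.now());
  });
});

// Any client silent for more than 15 seconds is considered disconnected.
setInterval(() => {
  const now = Date.now();
  for (const [clientId, ts] of lastSeen) {
    if (now - ts > 15000) {
      lastSeen.delete(clientId);
      // Notify the remaining peers here that clientId went offline.
    }
  }
}, 5000);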

Skype Conference Procedure

I've been looking into Skype's protocol, or what people have been able to make out, since it is a proprietary protocol. I've read "An analysis of the Skype peer-to-peer internet telephony protocol"; though it is old, it discusses a certain property which I'm looking to recreate in my own architecture. What I'm interested in is that during a video conference, data is sent to one machine (the one most likely to have the best bandwidth and processing power), which then redistributes it to the other machines.
What is not explained is what happens when the machine receiving and sending the data unexpectedly drops out. Of course, rather than dropping the conference, it would be best to find another machine to carry on receiving and distributing the data. Is there any documentation on how this is performed in Skype or a similar peer-to-peer VoIP system?
Basically I'm looking for the fastest method to detect when a "super peer" unexpectedly drops out and to quickly migrate operations to another machine.
You need to set a timeout (i.e., a limit) and declare that if you don't receive communication within it, the communication is either dead (no path between the peers, a reachability issue) or the remote peer is down. There is no other method.
If you have a direct TCP or other connection to the super peer, you can also catch events telling you the connection has died. If your communication is relayed, and your framework automatically attempts to find a new route to your target peer, it will either find one or never find out. Hence the necessity for a timeout.
If nobody hears from someone for some time, they are finally considered/declared dead.
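As a rough illustration of that timeout idea (the names HEARTBEAT_TIMEOUT_MS, onSuperPeerHeartbeat and promoteNewSuperPeer are hypothetical), each peer could run a watchdog that declares the super peer dead and triggers the migration when its heartbeats stop:

const HEARTBEAT_TIMEOUT_MS = 5000;   // arbitrary: how long silence means "dead"
let lastHeartbeat = Date.now();

// Call this whenever any packet or explicit heartbeat arrives from the super peer.
function onSuperPeerHeartbeat() {
  lastHeartbeat = Date.now();
}

function promoteNewSuperPeer() {
  // Hypothetical: pick the remaining peer with the best measured bandwidth
  // (a deterministic rule, so all peers agree without voting) and reroute streams to it.
}

// Watchdog: if the super peer has been silent for too long, start the migration.
setInterval(() => {
  if (Date.now() - lastHeartbeat > HEARTBEAT_TIMEOUT_MS) {
    promoteNewSuperPeer();
  }
}, 1000);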

Resources