Audio latency with GamingAnywhere

I am setting up GamingAnywhere on an Ubuntu 12.04 server. The server runs fine; the only problem is that the audio lags while the video plays without any latency.
The audio is about 2 seconds behind, while the video has no noticeable latency. I would expect the audio to stay in sync with the video.
What could be the reason?

Related

VLC Player drops audio on RTSP stream after a short time

I'm streaming audio and video from a Hikvision security camera, and the audio drops out shortly after I start VLC. It doesn't return unless I close and restart VLC. I'm using VLC 3.0.18 on Windows 10 with an NVIDIA RTX A4000.
I have tried H.264 and H.265 video encoding with MP3, MP2L2, and PCM audio encoding; PCM didn't work at all. The camera offers the following audio encoding options: G.722.1, G.711ulaw, G.711alaw, MP2L2, G.726, PCM, and MP3. I'm streaming a fairly low-resolution (640x360) feed. I haven't tried a high-resolution stream, but I doubt that would help. My gigabit switch shows the camera transmitting at only about 300 Kbps on a 100 Mbit connection, so a bandwidth problem seems unlikely, especially since only two cameras, one PC, a phone, and a tablet are connected to this network and I'm the only one using any of it.
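One way to narrow down whether the camera or the player is at fault is to check what the RTSP URL actually advertises for audio with a separate tool. A minimal sketch, assuming ffprobe (from FFmpeg) is installed and using a hypothetical camera URL:

```typescript
// Sketch: inspect what the camera's RTSP stream advertises for audio.
// Assumes ffprobe is on PATH; the RTSP URL below is a placeholder.
import { execFile } from "node:child_process";

const rtspUrl = "rtsp://192.168.1.64:554/Streaming/Channels/102"; // hypothetical camera URL

execFile(
  "ffprobe",
  [
    "-v", "error",
    "-rtsp_transport", "tcp",   // force TCP; UDP packet loss can silently kill audio
    "-select_streams", "a",     // audio streams only
    "-show_entries", "stream=codec_name,sample_rate,channels",
    "-of", "json",
    rtspUrl,
  ],
  (err, stdout) => {
    if (err) throw err;
    console.log(stdout); // e.g. codec_name "pcm_mulaw" for G.711ulaw
  }
);
```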

Live streaming from Raspberry Pi to NodeJS server hosted on Google App Engine

Description:
I have a Raspberry Pi controlling a small vehicle, and it has a RealSense camera attached to it. What I need to do is send a live stream from the camera to an HTML page/NodeJS server hosted on Google App Engine, so that the user can see the stream on their page. The stream will be used to manually control the vehicle, so low latency is very important.
What I attempted:
My current solution is just a simple socket connection using Socket.IO: I send a frame through the socket, decode it, and display it in the page. The problem is that this method is extremely slow and, from what I understand, not a good way to stream to a remote client, which is why I need to find a different approach.
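A minimal sketch of this kind of Socket.IO relay (the event name, port, and JPEG-per-frame encoding are assumptions for illustration); this is the slow path being replaced, not a recommendation:

```typescript
// Server side (Node/Socket.IO): relay JPEG frames pushed by the Pi to browsers.
// The "frame" event name and the port are assumptions for illustration.
import { createServer } from "node:http";
import { Server } from "socket.io";

const httpServer = createServer();
const io = new Server(httpServer, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  // The Pi emits "frame" with a base64-encoded JPEG; rebroadcast it to viewers.
  socket.on("frame", (jpegBase64: string) => {
    socket.broadcast.emit("frame", jpegBase64);
  });
});

httpServer.listen(8080);
```

Each frame travels as a standalone JPEG over TCP with no inter-frame compression, which is why the bandwidth cost and latency grow quickly.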
I tried using uv4l. When I run uv4l --driver uvc --device-id "realsense camera id", it says the camera is recognized, but then it immediately stops without any error. When I try to open the stream with my IP and click "call", I get the error "invalid input device". I could not find any solution to this problem.
I also thought about using WebRTC. I tried to follow this example (which is the closest I found to what I need): https://dev.to/whitphx/python-webrtc-basics-with-aiortc-48id , but it uses a Python server and I want to use my GAE/NodeJS server, and I'm struggling to figure out how to convert this code to use a Python client and a NodeJS server.
If anyone can provide some information or advice I'd really appreciate it.
If you want to control the vehicle, latency is extremely important. Ideally it should be around 100 ms, and it should not exceed about 400 ms even when the network jitters for a while.
Latency is introduced at every stage: the encoder on the Raspberry Pi, the transfer to the media server, and the HTML5 (H5) player. The encoder and the player contribute the most.
The best solution is to use a UDP-based protocol such as WebRTC:
Raspberry PI PC Chrome H5
Camera --> Encoder ---UDP--> Media Server --UDP---> WebRTC Player
So I recommend using WebRTC to encode and send the frames to the media server, and playing them with an H5 WebRTC player. You can test this solution by replacing the encoder with an H5 WebRTC publisher; the latency is about 100 ms (please see this wiki). The architecture is below:
Raspberry PI PC Chrome H5
Camera --> WebRTC ---UDP--> Media Server --UDP---> WebRTC Player
Note: The WebRTC stack is complex, so you could first build the pipeline from H5 to H5 and test the latency, then move the media server from the intranet to the internet and test again, and finally replace the H5 publisher with your Raspberry Pi and test the latency once more.
If you want to get the solution running as soon as possible, FFmpeg is the better encoder to start with: encode the frames from the camera, package them as RTMP, publish to the media server over RTMP, and finally play with an H5 WebRTC player (please read this wiki; a command sketch also follows the SRT note below). The latency is larger than with a WebRTC encoder, probably around 600 ms, but it should be fine for running the demo. The architecture is below:
Raspberry PI PC Chrome H5
Camera --> FFmpeg ---RTMP--> Media Server --UDP---> WebRTC Player
Alternatively, you can use SRT instead of RTMP; SRT is also a real-time protocol, with roughly 200-500 ms of latency.
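To make the FFmpeg-over-RTMP path concrete, here is a rough sketch that spawns FFmpeg from Node to push the camera to an RTMP ingest. The device path, media-server URL, and encoder settings are assumptions, not part of the original answer:

```typescript
// Sketch: push a V4L2 camera on the Raspberry Pi to an RTMP media server via FFmpeg.
// The camera device and RTMP URL are placeholders.
import { spawn } from "node:child_process";

const ffmpeg = spawn("ffmpeg", [
  "-f", "v4l2",                 // capture from a Video4Linux2 device
  "-framerate", "30",
  "-video_size", "640x480",
  "-i", "/dev/video0",          // hypothetical camera device
  "-c:v", "libx264",
  "-preset", "veryfast",
  "-tune", "zerolatency",       // reduce encoder-side buffering
  "-g", "30",                   // short GOP so the player can start quickly
  "-f", "flv",
  "rtmp://media-server/live/stream", // hypothetical ingest URL
]);

ffmpeg.stderr.on("data", (d) => process.stderr.write(d));
ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with ${code}`));
```

The same command could of course be run directly from a shell; wrapping it in Node just keeps it alongside the rest of the vehicle's control code.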
Note that you could also run the media server on the Raspberry Pi itself and use a WebRTC player to play the stream from it when both are on the same Wi-Fi. The latency should be minimal in that case, because everything stays on the intranet.

What's the best protocol for live audio (radio) streaming for mobile and web?

I am trying to build a website and mobile apps (iOS, Android) for an internet radio station.
Website users broadcast their music or radio shows, and mobile users just listen to the stations and chat with other listeners.
I searched for a week and made a prototype with the Wowza engine (using HLS and RTMP) and a SHOUTcast server on Amazon EC2.
HLS has a delay of about 5 seconds, while RTMP and SHOUTcast have about a 2-second delay.
Based on this result, I think I should choose RTMP or SHOUTcast.
But I am not sure RTMP and SHOUTcast are the best protocols. :(
What protocol should I choose?
Do I need to provide multiple protocols to cover all platforms?
This is a very broad question. Let's start with the distribution protocol.
Streaming Protocol
HLS has the advantage of allowing users to get the stream in the bitrate that is best for their connection. Clients can scale up/down seamlessly without stopping playback. This is particularly important for video, but for audio even mobile clients are capable of playing 128kbit streams in most areas. If you intend to have a variety of bitrates available and want to change quality mid-stream, then HLS is a good protocol for you.
The downside of HLS is compatibility. iOS supports it, but that's about it. Android has HLS support but it is still buggy. (Maybe in another year or two once all the Android 3.0 folks are gone, this won't be as much of an issue.) JWPlayer has some hacks to make HLS work in Flash for desktop users.
I wouldn't bother with RTMP unless you're only concerned with Flash users.
Pure progressive streaming with HTTP is the route I almost always choose to go. Everything can play it. (Even my Palm Pilot's default media player from 12 years ago.) It's simple to implement and well understood.
SHOUTcast is effectively HTTP, but a poorly implemented version that has compatibility issues, particularly on mobile devices. It has a non-standard status line in its response which breaks a lot of clients. Icecast is a good alternative, and is what I would recommend for production use today. As another option, I have created my own streaming service called AudioPump which is HTTP as well, and has been specifically built to fix compatibility with oddball mobile clients, such as native Android players on old hardware. It isn't generally available yet, but you can contact me at brad#audiopump.co if you want to try it.
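To show how little the client needs for plain HTTP progressive streaming, a minimal sketch (the mount URL and element ID are placeholders):

```typescript
// Browser sketch: play an Icecast/HTTP MP3 mount with a plain audio element.
// The stream URL is a placeholder; no special player library is needed.
const audio = new Audio("https://radio.example.com/stream.mp3");
audio.preload = "none"; // don't start buffering until the user asks

document.querySelector("#play")!.addEventListener("click", () => {
  audio.play().catch(console.error); // play() returns a promise in modern browsers
});
```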
Latency
You mentioned a latency of 2 seconds being desirable. If you're getting 2-second latency with SHOUTcast, something is wrong. You don't want latency that low, particularly if you're streaming to mobile clients. I usually start with a 20-second buffer at a minimum, which is flushed to the client as fast as it can receive it. This enables immediate starting of the stream playback (as it fills up the client-side buffer so it can begin decoding) while providing some protection against buffer underruns due to network conditions. It's not uncommon for mobile users to walk around the corner of a building and lose their nice signal quality. You want your stream to survive that as best as possible, so if you have already sent the data to cover the drop out, the user doesn't have to know or care that their connection became mediocre for a short period of time.
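The "fill a large buffer and flush it as fast as the client can take it" behaviour just described can be sketched in a few lines. This is a toy illustration (the chunk source, sizes, and port are assumptions), not how Icecast or any particular server implements it:

```typescript
// Toy sketch of burst-on-connect: keep the last ~20 s of encoded audio in a
// ring buffer and send it immediately to each new listener, then continue live.
import { createServer, type ServerResponse } from "node:http";

const BURST_BYTES = 20 * 16_000; // ~20 s of a 128 kbps stream (16 kB/s)
const burst: Buffer[] = [];
let burstSize = 0;
const listeners = new Set<ServerResponse>();

// Called by whatever produces encoded MP3/AAC chunks (encoder pipe, relay, ...).
export function onEncodedChunk(chunk: Buffer): void {
  burst.push(chunk);
  burstSize += chunk.length;
  while (burstSize > BURST_BYTES) {
    burstSize -= burst.shift()!.length; // drop oldest data beyond ~20 s
  }
  for (const res of listeners) res.write(chunk); // feed connected listeners live
}

createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "audio/mpeg" });
  for (const chunk of burst) res.write(chunk); // flush the burst right away
  listeners.add(res);
  req.on("close", () => listeners.delete(res));
}).listen(8000);
```

The client's decoder starts immediately on the burst, and the extra buffered seconds absorb short network dropouts.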
If you do require low latency, you're looking at the wrong technology entirely. For low latency, check out WebRTC.
You certainly can tweak your traditional internet radio setup to reduce latency, but rarely is that a good idea.
Codec
Codec choice is what will dictate your compatibility more than anything else. MP3 is easily the most compatible, and AAC isn't far behind. If you go with AAC, you get better quality audio for a given bitrate. Most folks use this to reduce their bandwidth bill.
There are licensing fees with MP3, and there may be with AAC depending on what you're using for a codec. Check with a lawyer. I am not one, and the licensing is extremely complicated.
Other codecs include Vorbis and Opus. If you can use Opus, do so as the licensing is wide open and you get good quality for the bandwidth. Client compatibility here though is the killer of Opus. (Maybe in a few years it will be better.) Vorbis is a mediocre codec, but is free and clear.
On the extreme end, I have some stations doing their streaming in FLAC. This is lossless audio quality, but you're paying for about 8x the bandwidth of a medium-quality MP3 station. FLAC-over-HTTP streaming compatibility is not good at the moment, but it works alright in VLC.
It is very common to support multiple codecs for your streams. Depending on your budget, if you can't do that, you're best off with MP3.
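A quick way to see why offering multiple codecs matters is to ask the browser what it can play, using the standard canPlayType() check (the exact MIME strings below are illustrative):

```typescript
// Probe browser codec support for common streaming formats.
// canPlayType() returns "probably", "maybe", or "" (no support).
const audio = document.createElement("audio");
const formats: Record<string, string> = {
  MP3: "audio/mpeg",
  AAC: "audio/aac",
  Vorbis: 'audio/ogg; codecs="vorbis"',
  Opus: 'audio/ogg; codecs="opus"',
  FLAC: "audio/flac",
};

for (const [name, mime] of Object.entries(formats)) {
  console.log(`${name}: ${audio.canPlayType(mime) || "no"}`);
}
```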
Finally on encoding, don't go from a lossy codec to another lossy codec if you can help it. Try to get the output stream as close to the input as possible. If you re-encode audio, you lose quality every time.
Recording from Browser
You mentioned users streaming from a browser. I built something like this a couple years ago with the Web Audio API where the audio is captured and then encoded and sent off to Icecast/SHOUTcast servers. Check it out here: http://demo.audiopump.co:3000/ A brief explanation of how it works is here: https://stackoverflow.com/a/20850467/362536
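That demo is built on the Web Audio API with its own encoder. As a much simpler, and different, sketch of browser capture, the MediaRecorder API can hand you encoded chunks that a server could forward; the upload endpoint and MIME type below are assumptions:

```typescript
// Sketch: capture microphone audio in the browser and upload encoded chunks.
// This uses MediaRecorder, not the Web Audio API pipeline described above;
// the /ingest endpoint and the MIME type are assumptions for illustration.
async function startCapture(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: "audio/webm;codecs=opus" });

  recorder.ondataavailable = async (event: BlobEvent) => {
    if (event.data.size === 0) return;
    // Forward each encoded chunk to a hypothetical ingest endpoint.
    await fetch("/ingest", { method: "POST", body: event.data });
  };

  recorder.start(1000); // emit a chunk roughly every second
}

startCapture().catch(console.error);
```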
Anyway, I hope this helps you get started.
Streaming straight audio/mpeg (mp3 packets) has worked everywhere I've tried.
If you are developing an app, go with AAC; if you are simply playing via a web browser, you need an HTML5 implementation, which means MP3. Custom protocols like RTMP or SHOUTcast require additional UI to be built. There are some third-party players available in the open-source world; you can either use them or stick to HTML5 MP3/OGG, as most people nowadays use Chrome or other HTML5-compliant browsers.

Streaming audio over wifi: feasible and how?

I'm evaluating building an application which, simplifying the requirements, records from a small microphone-equipped computer (e.g., a Raspberry Pi) and streams the digitized sound over a wireless connection in near real time to a server on the same LAN (no Internet involved). Ideally, the server application would record the different streams from the various Wi-Fi microphones and mix them together.
I'm currently looking to get fairly good quality out of this, comparable to a 128 kbps stereo MP3.
At this point I'm still evaluating options, so I'm also interested in your opinion on the feasibility of this. If you think it's doable, what libraries, APIs, and protocols would you use? Consider that this will likely be deployed on Linux-based embedded computers (for the Wi-Fi mic part) and Linux-based servers.
Thanks for your help.
I often listen to SHOUTcast on the iPad, and it sounds pretty good to me. I don't know the bitrate they use; I think they stream MP3. So I don't think quality would be a big issue if you can live with the loss that comes with MP3. The bigger issue might be how good your wireless connection is: when the network is busy, you get more errors and lower speeds. It also depends on the wireless standard and the hardware you are using. You may want to think about buffering, too.
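To make the feasibility question concrete, here is a rough sketch of the capture-and-send half on the Pi, piping raw ALSA audio over TCP to a LAN server. The device, server address, and the choice to send uncompressed PCM are all assumptions; a real deployment would compress to something like Opus or MP3 first:

```typescript
// Sketch: capture from the Pi's microphone with ALSA's arecord and pipe the
// raw PCM over TCP to a server on the same LAN. Uncompressed 44.1 kHz stereo
// 16-bit PCM is about 1.4 Mbps, which a healthy Wi-Fi link can carry, but in
// practice you would encode before sending.
import { spawn } from "node:child_process";
import { connect } from "node:net";

const socket = connect(9000, "192.168.1.10", () => { // hypothetical server address
  const arecord = spawn("arecord", [
    "-D", "default",   // ALSA capture device
    "-f", "S16_LE",    // 16-bit little-endian samples
    "-r", "44100",     // sample rate
    "-c", "2",         // stereo
    "-t", "raw",       // raw PCM, no WAV header
  ]);
  arecord.stdout.pipe(socket); // stream captured audio straight to the server
});
```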

Best bitrate to use for streaming audio via mobile

I run a podcasting site where users can stream the podcasts via an HTML5 player. I've just optimized my site for mobile so that users can listen on the go.
The issue I'm having is that it tends to buffer a lot, as the MP3 quality of all the podcasts is V0, which I'm finding must be too high for mobile connections. Plus, the amount of data it transfers would be too high for a lot of people's phone plans to justify bothering with.
Rather than trial-and-error with loads of different bitrates, I was wondering if anyone knew the best MP3 bitrate to go with for mobile streaming, i.e., which bitrate is low enough to allow a stream with little to no buffering over 3G, but high enough that the quality degradation isn't noticeable.
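As a starting point before testing, the data cost of each bitrate is simple arithmetic. A small sketch comparing a few common MP3 bitrates (the ~245 kbps figure for V0 is only a typical average, an assumption rather than a fixed value):

```typescript
// Rough data usage per hour for candidate MP3 bitrates (kbps -> MB/hour).
// V0 is VBR; ~245 kbps is only a typical average, not a fixed value.
const bitrates: Record<string, number> = {
  "V0 (~245 kbps avg)": 245,
  "128 kbps CBR": 128,
  "96 kbps CBR": 96,
  "64 kbps CBR": 64,
};

for (const [label, kbps] of Object.entries(bitrates)) {
  const mbPerHour = (kbps * 3600) / 8 / 1000; // kilobits/s -> megabytes/hour
  console.log(`${label}: ~${mbPerHour.toFixed(0)} MB per hour`);
}
```

That works out to roughly 110 MB/hour at V0 versus about 29 MB/hour at 64 kbps, which is why the bitrate choice matters so much on metered mobile plans.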
