mplayer can't read UDP video stream - Linux

I'm trying to compare the latency of different video codecs using ffmpeg and MPlayer's benchmark mode.
I am using this command line to generate and send the stream:
ffmpeg -s 1280x720 -r 100 -f x11grab -i :0.0 -vcodec mpeg2video -b:v 8000 -f mpegts udp://localhost:4242
And I'm successfully using ffplay to receive and read it in real time:
ffplay -an -sn -fflags nobuffer -i udp://localhost:4242?listen
Now, instead of playing the stream with ffplay, I'd like to use the MPlayer benchmark to get some information on the latency:
mplayer -msglevel all=6 -benchmark udp://localhost:4242
But I get this output instead:
Playing udp://localhost:4242.
get_path('sub/') -> '/home/XXXXX/.mplayer/sub/'
STREAM_UDP, URL: udp://localhost:4242
Filename for url is now udp://localhost:4242
Listening for traffic on localhost:4242 ...
Timeout! No data from host localhost
udp_streaming_start failed
No stream found to handle url udp://localhost:4242
I tried the RTP protocol instead; that didn't work either...
Does anyone have an idea what I'm doing wrong?
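One thing worth trying is MPlayer's ffmpeg:// protocol wrapper, which hands the URL to libavformat instead of MPlayer's native UDP reader (this assumes your MPlayer was built with FFmpeg support):

```shell
# Let libavformat open the UDP socket instead of MPlayer's built-in
# udp:// handler, which is what times out above:
mplayer -msglevel all=6 -benchmark ffmpeg://udp://127.0.0.1:4242
```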

Thanks for the answers.
I actually tried a lot of different codecs, especially VP9, H.264 and MPEG-2, but the lowest latency I got was with mpeg2video. Here are three of the command lines I used. I read the ffmpeg streaming guide and each codec's encoding guide to find the best parameters for each, but the difference is still noticeable:
ffmpeg -an -sn -s 1280x720 -r 30 -f x11grab -i :0.0 -vcodec libx264 -crf 18 -tune zerolatency -preset ultrafast -pix_fmt yuv420p -profile:v baseline -b:v 8000 -f mpegts -threads 4 udp://127.0.0.1:4242
ffmpeg -s 1280x720 -r 30 -f x11grab -i :0.0 -vcodec mpeg2video -b:v 800k -f mpegts -threads 8 udp://127.0.0.1:4242
ffmpeg -t 5 -s 1280x720 -r 30 -f x11grab -i :0.0 -vcodec libvpx-vp9 -an -crf 18 -b:v 1M -f webm -threads 8 udp://127.0.0.1:4242
On localhost I get close to no latency at all with mpeg2video, whereas I have almost 1 s of latency with H.264. I've heard VP9 can achieve very low latency too, but I apparently don't know how to use its options in ffmpeg, because I get really bad latency values...
Anyway, to get back to the topic: 127.0.0.1 instead of localhost doesn't help, and ffmpeg://udp://ip:port doesn't work either... :/ I think my mplayer may be misconfigured; maybe I should try compiling it myself.
But actually, I don't even know whether mplayer would give me the information I want (the average number of milliseconds a codec needs to encode/decode a frame, so that I can compare my different codecs precisely).
EDIT: Sorry about that... ffmpeg://udp://ip_addr works =) I had made a typo... n_n
Thanks a lot. However, the video quality is really awful with mplayer compared to ffplay...
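On the per-frame timing question: MPlayer's -benchmark prints a summary after playback ends, roughly of the form "BENCHMARKs: VC: ...s VO: ...s A: ...s Sys: ...s", where VC is the total time spent in the video decoder. Dividing VC by the number of decoded frames gives an average decode time per frame. A sketch with hypothetical numbers (12.501 s of decoder time over 3000 frames):

```shell
# Convert total decoder time (VC) into an average per-frame figure.
# 12.501 s and 3000 frames are made-up example values.
awk 'BEGIN { printf "%.3f ms/frame\n", 12.501 * 1000 / 3000 }'
```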

Related

How can I stream an image to YouTube using FFmpeg?

I have an image that changes every few seconds. How can I stream it to YouTube using FFmpeg?
ffmpeg -f image2 -loop 1 -i input.jpg -re -f lavfi -i anullsrc -vf format=yuv420p -c:v libx264 -b:v 2000k -maxrate 2000k -bufsize 4000k -g 50 -c:a aac -f flv rtmp://output
-f image2 is needed to manually select the image demuxer. Otherwise, depending on the input format, it may choose a different demuxer that does not allow arbitrary replacing of input.jpg.
Replace input.jpg atomically (such as with mv but not cp) or else it may fail.
YouTube requires audio. The anullsrc filter will generate silent audio.
See FFmpeg Wiki: YouTube Streaming.
(Optional) Use the slowest -preset that provides 25 fps (or whatever frame rate you set using the -framerate image demuxer input option).
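The atomic-replacement point above can be sketched with a dummy file (the printf stands in for whatever actually renders the next frame):

```shell
# Write the new image under a temporary name, then rename it into place.
# rename(2) is atomic on the same filesystem, so the image2 demuxer never
# reads a half-written input.jpg; cp truncates and rewrites, which can fail.
printf 'new frame' > input.jpg.tmp   # stand-in for rendering the next frame
mv input.jpg.tmp input.jpg
```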

ffmpeg livestream by ip camera with a problem of DTS

I have a problem with an ffmpeg command: ffmpeg -rtsp_transport tcp -i "rtsp://admin:passw#xxxxxxxx.ddns.net:554/live/ch0" -deinterlace -vcodec libx264 -pix_fmt yuv420p -preset medium -s 1920x1080 -b:v 5000k -acodec aac -strict -2 -ar 44100 -threads 6 -qscale 3 -b:a 712000 -bufsize 128k -f flv "rtmp://a.rtmp.youtube.com/live2/key"
I use it for live streaming from an IP camera, but I get this output:
[flv @ 0x558333a41100] Non-monotonous DTS in output stream 0:1; previous: 73709, current: 73220; changing to 73709. This may result in incorrect timestamps in the output file.
[aac @ 0x558333a0f100] Queue input is backward in time14.43 bitrate=1288.8kbits/s dup=0 drop=5 speed=0.486x
[flv @ 0x558333a41100] Non-monotonous DTS in output stream 0:1; previous: 74591, current: 73614; changing to 74591. This may result in incorrect timestamps in the output file.
Who can help me?
In live streams especially, timestamps will occasionally "jump". Why this happens... well, there are many reasons, including packet loss.
When it happens, it causes momentary distortion in the output.
Here FFmpeg tells you a jump has occurred. I consider this not an error on your side, just a warning.
As the warning says, FFmpeg already handled it (by correcting the timestamps).
There isn't much you can do, other than sometimes needing to restart FFmpeg because of unexpected changes in the stream.
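The "restart FFmpeg" advice can be automated with a small wrapper. A minimal sketch (restart_stream and its parameters are hypothetical names, not part of ffmpeg): rerun the streaming command each time it exits abnormally, up to a limit, pausing between attempts so a dead camera doesn't cause a busy loop:

```shell
# Rerun a command until it exits cleanly or we run out of attempts.
restart_stream() {
    max=$1; shift
    i=0
    while [ "$i" -lt "$max" ]; do
        "$@" && return 0                              # clean exit: stop
        echo "stream exited abnormally, restarting..." >&2
        sleep 1                                       # avoid a busy loop
        i=$((i + 1))
    done
    return 1                                          # gave up
}
# Usage sketch (not run here):
# restart_stream 1000 ffmpeg -rtsp_transport tcp -i "rtsp://..." \
#     -c:v libx264 -c:a aac -f flv "rtmp://a.rtmp.youtube.com/live2/key"
```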
Thank you so much.
I switched to a simpler command: ffmpeg -rtsp_transport tcp -i "rtsp://xxxxxxxxxxxxxxxxxxx.ddns.net:554/live/ch0" -tune zerolatency -s 1920x1080 -c:v libx264 -c:a aac -preset ultrafast -g 50 -f flv "rtmp://a.rtmp.youtube.com/live2/key"
Actually, the problem was that the YouTube live stream would end abruptly even though ffmpeg kept processing. With the new command the terminal no longer prints the warning, but the problem persists.

Problem with combining a video and an audio stream from USB device

I have two USB devices attached to an RPi; each shows up, as usual, as /dev/video0. Here's some additional info from two command-line tools:
Device 1, video only (attached to an RPi4):
ffmpeg -f v4l2 -list_formats all -i /dev/video0 reports
[video4linux2,v4l2 @ 0xe5e1c0] Compressed: mjpeg :
Motion-JPEG : 1280x720 640x480 320x240
v4l2-ctl --list-formats-ext reports
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'MJPG' (Motion-JPEG, compressed)
Size: Discrete 1280x720
Interval: Stepwise 0.033s - 0.033s with step 0.000s
(30.000-30.000 fps)
Size: Discrete 640x480
Interval: Stepwise 0.033s - 0.033s with step 0.000s
(30.000-30.000 fps)
Size: Discrete 320x240
Interval: Stepwise 0.033s - 0.033s with step 0.000s
(30.000-30.000 fps)
Does work: ffmpeg -f v4l2 -i /dev/video0 -vcodec h264_omx -preset ultrafast -tune zerolatency -g 300 -b:v 1M -mpegts_service_type advanced_codec_digital_hdtv -f mpegts udp://OtherMachine:Port?pkt_size=1316
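A side note on the pkt_size=1316 used throughout these commands: an MPEG-TS packet is 188 bytes, and exactly 7 of them fit in one UDP datagram while staying under a typical 1500-byte Ethernet MTU:

```shell
echo $((188 * 7))       # 1316 payload bytes per datagram
echo $((188 * 7 + 28))  # 1344 bytes on the wire with IPv4/UDP headers, under 1500
```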
Device 2, video and audio (attached to an RPi3, but does not work either on the RPi4):
ffmpeg -f v4l2 -list_formats all -i /dev/video0 reports
[video4linux2,v4l2 @ 0x2c41210] Compressed: mjpeg :
Motion-JPEG : 1920x1080 1280x720
v4l2-ctl --list-formats-ext reports
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'MJPG' (compressed)
Name : Motion-JPEG
Size: Discrete 1920x1080
Interval: Discrete 0.033s
(30.000 fps)
Interval: Discrete 0.067s
(15.000 fps)
Size: Discrete 1280x720
Interval: Discrete 0.033s
(30.000 fps)
Interval: Discrete 0.067s
(15.000 fps)
After quite some tedious work and way too many hours, I got this running:
Video only: ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 -c:v copy -preset ultrafast -tune zerolatency -g 300 -f matroska udp://OtherMachine:Port?pkt_size=1316
Does not work at all: ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 -c:v copy -preset ultrafast -tune zerolatency -g 300 -f mpegts udp://OtherMachine:Port?pkt_size=1316; on "OtherMachine" I can see in VLC that there is an incoming data stream, but it cannot be decoded properly.
Audio only: ffmpeg -f alsa -thread_queue_size 1024 -i plughw:1 -c:a mp2 -ac 2 -ar 44100 -preset ultrafast -tune zerolatency -b:a 128K -f mpegts udp://OtherMachine:Port?pkt_size=1316
But this does not work either:
ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 -f alsa -thread_queue_size
1024 -i plughw:1 -c:v copy -c:a mp2 -ac 2 -ar 44100 -preset ultrafast -tune zerolatency -g 300 -b:a 128K -f mpegts udp://OtherMachine:Port?pkt_size=1316
Could you please provide a hint on how to get these two streams from device 2 working together? Both come from the same hardware/device. My guess is that the MJPEG video stream is somehow not fully compliant with the MPEG-TS standard (device 1's stream was re-encoded to H.264 first), since it works with matroska but not with mpegts. Could that be? What needs to be done in that case?
Another hint, with the same kind of hardware setup I can do this
cvlc -vvv v4l2:///dev/video0 --input-slave=alsa://plughw:1,0 --sout='#transcode{acodec=mpga,ab=128}:std{access=http,mux=asf,dst=:Port}'
So my understanding is that the video gets passed on unchanged (MJPEG) and the audio gets transcoded via VLC's mpga, which presumably corresponds to ffmpeg's mp2. The container format is asf, but I was not able to get that running with ffmpeg, for no obvious reason. Anyway, picking up this VLC broadcast stream via http://StreamingMachine:Port works well on any other machine in my network. But how do I achieve that with ffmpeg directly, and potentially not as http:// but as udp:// or a pipe stream?
Alternatively, let me ask this question: given an incoming MJPEG video stream and an incoming MP2 audio stream, which container format (OK, it's obviously not mpegts) is the most appropriate for combined streaming across my LAN, or even into a pipe for further processing? Believe me, I tried my best over a couple of hours to find out how to proceed, with no success. To my humble knowledge there is no table providing answers to questions of that kind.
I'd be glad to get some insights.
Best
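For what it's worth: as far as I know MJPEG has no standard stream type in MPEG-TS, which would explain why VLC can't digest it, while Matroska can carry almost any codec. One untested sketch, then, is simply to combine the two commands that already work individually and mux them into Matroska instead of MPEG-TS:

```shell
# Untested sketch: same capture settings as the working video-only and
# audio-only commands, muxed together into Matroska rather than MPEG-TS.
ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 \
       -f alsa -thread_queue_size 1024 -i plughw:1 \
       -c:v copy -c:a mp2 -ac 2 -ar 44100 -b:a 128k \
       -f matroska "udp://OtherMachine:Port?pkt_size=1316"
```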

ffmpeg - Have troubling syncing up audio and video together

I have a webcam and a separate mic. I want to record what is happening.
It almost works; however, the audio seems to play too quickly, with parts missing, while playing over the video.
This is the command I am currently using to get it partially working
ffmpeg -thread_queue_size 1024 -f alsa -ac 1 -i plughw:1,0 -f video4linux2 -thread_queue_size 1024 -re -s 1280x720 -i /dev/video0 -r 25 -f avi -q:a 2 -acodec libmp3lame -ab 96k out.mp4
I have tried other arguments, but unsure if it has to do with the formats I am using or incorrect parameter settings.
Also, the next part would be how to stream it. Every time I try going through RTP it complains about multiple streams. I tried html as well, but didn't like the format: html://localhost:50000/live_feed, or rts://localhost:5000.
edit:
I am running this on an RPi 3.
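On the RTP complaint: plain RTP carries a single elementary stream per session, which is why ffmpeg refuses to put audio and video into one rtp:// output. One possibility (a sketch, not verified on an RPi) is ffmpeg's rtp_mpegts muxer, which wraps an MPEG-TS mux in RTP so both streams travel together:

```shell
# MPEG-TS over RTP: audio and video bundled into a single RTP session.
ffmpeg -f alsa -thread_queue_size 1024 -i plughw:1,0 \
       -f video4linux2 -thread_queue_size 1024 -i /dev/video0 \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -c:a libmp3lame -b:a 96k \
       -f rtp_mpegts rtp://localhost:5000
```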

How to eliminate the distortion in live streaming Nodejs + ffmpeg

I tried live video streaming with NodeJS and the ffmpeg encoder. It works, with a lag of around 2 s and some distortion as well. The lag doesn't matter, as there always is some, but I need to eliminate the video distortion as much as possible. What would be suitable bit rates, and is there a better encoder for this? ffmpeg encodes to mpegts; is there a preferable format to mpegts? Please help.
My encoding command was:
ffmpeg -s 640x480 -f dshow -i video="HP HD Webcam":audio="Microphone (Realtek High Definition Audio)" -preset ultrafast -qp 0 -f mpegts -v:b 800 -r 100 http://localhost:8082/abc/640/480/
You didn't set a video codec, so ffmpeg used MPEG-2 video (the default for mpegts). You want H.264, so use -c:v libx264:
ffmpeg -s 640x480 -f dshow -i video="HP HD Webcam":audio="Microphone (Realtek High Definition Audio)" -c:v libx264 -preset ultrafast -qp 0 -f mpegts -b:v 800k -r 100 http://localhost:8082/abc/640/480/
And then it should be fine. In addition, the green boxes you mention sound like bugs (overflows?), so perhaps file a report about them on the ffmpeg bug tracker.
