ffmpeg audio and video sync error - linux

./ffmpeg \
-f alsa -async 1 -ac 2 -i hw:2,0 \
-f video4linux2 -vsync 1 -s:v vga -i /dev/video0 \
-acodec aac -b:a 40k \
-r 25 -s:v vga -vcodec libx264 -strict -2 -crf 25 -preset fast -b:v 320K -pass 1 \
-f flv rtmp://192.168.2.105/live/testing
With the above command I am able to stream at 25 fps, but there is no audio/video synchronization: the audio runs ahead of the video. I am using ffmpeg 0.11.1 on a PandaBoard for RTMP streaming. Help me solve this problem.
Thanks
Ameeth

Don't use -pass 1 if you're not actually doing two-pass encoding.
From the docs (emphasis added):
‘-pass[:stream_specifier] n (output,per-stream)’
Select the pass number (1 or 2). It is used to do two-pass video encoding. The statistics of the video are recorded in the first pass into a log file (see also the option -passlogfile), and in the second pass that log file is used to generate the video at the exact requested bitrate. On pass 1, you may just deactivate audio and set output to null, examples for Windows and Unix:
ffmpeg -i foo.mov -c:v libxvid -pass 1 -an -f rawvideo -y NUL
ffmpeg -i foo.mov -c:v libxvid -pass 1 -an -f rawvideo -y /dev/null
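Applied to the command in the question, the minimal fix is to drop -pass 1 and leave everything else unchanged (hedged: untested on the 0.11.1 build):
./ffmpeg \
-f alsa -async 1 -ac 2 -i hw:2,0 \
-f video4linux2 -vsync 1 -s:v vga -i /dev/video0 \
-acodec aac -b:a 40k \
-r 25 -s:v vga -vcodec libx264 -strict -2 -crf 25 -preset fast -b:v 320K \
-f flv rtmp://192.168.2.105/live/testing
As far as I know, -crf also takes precedence over -b:v with libx264, so the bitrate option could probably go as well.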

I was streaming to Twitch and, funnily enough, removing the -r option made the video sync with the audio. Now, you might still want to limit the framerate in some way; unfortunately, I have no tested solution for that, but dropping -r does let the video sync with the audio very well.
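One hedged, untested way to cap the frame rate without the output -r option is to move it upstream, either by requesting the rate from the capture device or by resampling in a filter:
# ask the v4l2 capture device for 25 fps instead of forcing it on output
ffmpeg -f video4linux2 -framerate 25 -i /dev/video0 ...
# or drop/duplicate frames in the filtergraph rather than with -r
ffmpeg -i input -vf fps=25 ...
Whether either preserves the sync behaviour described above is untested.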

Related

How to convert video for web with ffmpeg

I am trying to rescale, subclip and convert video for the web (HTML5 video tag). Target browsers: Chrome, Safari, Firefox, Ya Browser.
I am using a command like this (changing some params):
ffmpeg -i C.mp4 -ss 00:00:00 -t 10 -vf scale=312x104 -vcodec libx264 -strict -2 -movflags faststart -pix_fmt yuv420p -profile:v high -level 3 -r 25 -an -sn -dn d.mp4 -y
But every time, the video fails to play in some browser.
I would like a way to do this that is fast (that's why I am using ffmpeg) and reliable (so that any input video yields a valid result in all browsers).
I also tried playing with the setsar and setdar params, but still no success.
Thanks everyone, I think I found something suitable for my case:
ffmpeg -i C.mp4 -ss 00:00:00 -t 10 -vf scale=dstw=312:dsth=104:flags=accurate_rnd,setdar=3/1 -vcodec libx264 -refs 2 -pix_fmt yuv420p -profile:v high -level 3.1 -color_primaries 1 -color_trc 1 -colorspace 1 -movflags +faststart -r 30 -an -sn -dn d.mp4
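For comparison, a commonly recommended baseline for broad browser compatibility (a hedged sketch, not from this thread; it keeps the even 312x104 dimensions, which yuv420p requires):
ffmpeg -i C.mp4 -ss 00:00:00 -t 10 \
-vf "scale=312:104,setsar=1" \
-c:v libx264 -profile:v main -level 3.1 -pix_fmt yuv420p \
-movflags +faststart -an -sn -dn -y d.mp4
The main profile is often suggested over high when older Safari or mobile targets are in play.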

How can I clean the sound of aplay received by a capture card

I am trying to set up my Linux desktop so I can view and listen to the device connected to my capture card. I wrote this two-line script to do that; however, the sound is out of tune and a bit distorted. How could I clean it up?
arecord --buffer-time=1 -f cd - | aplay --buffer-time=1 -c 5 -r 48000 -f S16_LE - 2> /dev/null &
ffplay -f video4linux2 -framerate 30 -video_size 1920x1080 -input_format mjpeg /dev/video1 2> /dev/null &
I also tried doing that with ffmpeg piped to ffplay, and the sound is crystal clear; however, there is a 2-3 second delay on the video and sound. Is there a way to fix this?
ffmpeg -framerate 30 -video_size 1920x1080 -thread_queue_size 1024 -input_format mjpeg -i /dev/video1 -f pulse -i 'Analog Input - USB Video' -r 30 -threads 4 -vcodec libx264 -crf 0 -preset ultrafast -vsync 1 -async 1 -f matroska - |ffplay -
Could you try just using ffplay for your second approach?
ffplay -framerate 30 -video_size 1920x1080 \
-thread_queue_size 1024 -input_format mjpeg -i /dev/video1 \
-f pulse -i 'Analog Input - USB Video'
I could be off-base, as I'm only familiar with ffmpeg and don't personally use ffplay, but they share a lot (e.g., backend libraries and command-line parsing), so I'm hedging that this would work.
Also, what do you mean by "there is 2-3 seconds delay on the video and sound"? Are they 2-3 seconds behind what you are physically seeing and hearing? Or are they out of sync by that many seconds?
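If the 2-3 seconds turn out to be buffering latency in the pipe, a hedged tweak is to tune the encoder for low delay and tell ffplay not to buffer; both options exist, but the actual gain on this setup is untested:
ffmpeg -framerate 30 -video_size 1920x1080 -thread_queue_size 1024 \
-input_format mjpeg -i /dev/video1 -f pulse -i 'Analog Input - USB Video' \
-c:v libx264 -preset ultrafast -tune zerolatency -crf 0 \
-f matroska - | ffplay -fflags nobuffer -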
[addendum]
Not sure if OP is still checking this post, but there is a way to combine two inputs for ffplay: use an input filtergraph with the movie and amovie source filters. The following worked on Windows (despite unacceptably large latency):
ffplay -f lavfi -i \
movie=filename=video="Logitech HD Webcam C310":format_name=dshow:format_opts=rtbufsize=702000k[out0]; \
amovie=filename=audio="Microphone (HD Webcam C310)":format_name=dshow[out1]
Note that this is for illustration purposes only, as a dshow device can output multiple streams (though the latency is still too bad for real-time use).
The same should be possible in Linux:
ffplay -f lavfi -i \
movie=filename=/dev/video1:format_name=video4linux2:format_opts='framerate=30:video_size=1920x1080:thread_queue_size=1024:input_format=mjpeg'[out0]; \
amovie=filename='Analog Input - USB Video':format_name=pulse[out1]
(Disclaimer: Untested and it may be missing escaping)
The latency may be better on Linux (and with a higher-specced PC than mine), so it might be worth a try.

FFmpeg (merge two audio files)

I have two long audio files. I want to cut a part from each file and merge both parts into one file.
With the command below, the problem is that there is no audio from the second input: the output contains the first part, but the second is empty.
What is the problem?
ffmpeg -f lavfi -i color=c=black
-ss 157.824 -t 99.818
-i "file1.mp4"
-ss 315.764 -t 50.308
-i "file2.mp4"
-s 854x480
-aspect 1.779167
-r 25
-c:v libx264
-b:v 800k
-c:a aac
-strict experimental
-b:a 128k
-f mp4
-t 150.126 -async 1
-y "output.mp4"

Creating Video and Streaming Command Line

Currently, I am using two ffmpeg commands in my terminal to:
1) create a video from a bunch of images:
ffmpeg -r 60 -f image2 -s 1920x1080 -i rotated-pano_frame%05d_color_corrected_gradblend.jpg -vcodec libx264 -crf 25 -pix_fmt yuv420p test.mp4
2) stream the video to a udp address:
ffmpeg -re -i test.mp4 -c copy -f flv udp://127.0.0.1:48550
I am trying to combine both these instructions into one command line instruction, using &&, as suggested in the answer to a previous question of mine:
ffmpeg -r 60 -f image2 -s 1920x1080 -i rotated-pano_frame%05d_color_corrected_gradblend.jpg -vcodec libx264 -crf 25 -pix_fmt yuv420p test.mp4 \
&& ffmpeg -re -i test.mp4 -c copy -f flv udp://127.0.0.1:48550
but am encountering an error which prevents streaming:
[flv @ 0x7fa2ba800000] video stream discovered after head already parsed.
Any thoughts on a different command-line syntax to join the two instructions, a different ffmpeg approach (filters, perhaps?), or why I am getting the error?
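As for a single invocation: a hedged alternative is to let one ffmpeg both write the file and stream it via the tee muxer, rather than chaining two runs (untested sketch; tee requires explicit stream mapping, hence -map 0:v):
ffmpeg -framerate 60 -f image2 -s 1920x1080 \
-i rotated-pano_frame%05d_color_corrected_gradblend.jpg \
-vcodec libx264 -crf 25 -pix_fmt yuv420p -map 0:v \
-f tee "test.mp4|[f=flv]udp://127.0.0.1:48550"
Note this streams while encoding, so the real-time pacing you got from -re on a finished file is lost; whether that matters depends on the receiver.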

FFmpeg 0.11.1 rtmp mp4 streaming issues

I am using ffmpeg 0.11.1 on a PandaBoard for streaming, running Linux linaro-developer 3.4.0-1-linaro-lt-omap #1~120625232503-Ubuntu (Ubuntu OS).
I want to live stream raw video from a video device or a cam to an RTMP server in MP4 format. Command used:
./ffmpeg \
-f alsa -async 1 -ac 2 -i hw:2,0 \
-f video4linux2 -i /dev/video0 \
-acodec aac -b:a 40k \
-r 50 -s 320x240 -vcodec libx264 -strict -2 -b:v 320K -pass 1 \
-f flv rtmp://...../mp4:demo101
With this command I am able to stream, but the fps varies between 8, 7, 6, and so on. Help me figure it out.
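A few hedged guesses (untested on a PandaBoard): the unstable fps is most likely the board's CPU saturating in libx264. Request the size and rate from the capture device, use the cheapest x264 preset, and drop -pass 1 (see the first answer above); if the 0.11 build rejects -framerate/-video_size, the older -r/-s input options should do the same:
./ffmpeg \
-f alsa -async 1 -ac 2 -i hw:2,0 \
-f video4linux2 -framerate 25 -video_size 320x240 -i /dev/video0 \
-acodec aac -b:a 40k \
-r 25 -vcodec libx264 -preset ultrafast -tune zerolatency -strict -2 -b:v 320K \
-f flv rtmp://...../mp4:demo101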
