I have a problem adding several fade effects to one audio file. When I try to use a command like this:
ffmpeg -y -i /home/user/video/test/sound.mp3 -af "afade=t=in:ss=0:d=3,afade=t=out:st=7:d=3,afade=t=in:st=10:d=3,afade=t=out:st=17:d=3,afade=t=in:st=20:d=3,afade=t=out:st=27:d=3" /tmp/test.mp3
then my output audio file has a fade-in and a fade-out applied only once; none of the subsequent fades get applied. Is there any way to apply several fade effects to the same audio file? Also, what is the difference between the ss and st parameters in this command?
The problem is that after fading out the audio, you are trying to fade in the resulting silence.
The solution is to disable the fade-out filter when you want to start fading in again.
You can achieve that with timeline editing, which enables each filter only for a particular time interval.
The following example works just fine:
ffmpeg -i input.mp3 -af "afade=enable='between(t,0,3)':t=in:ss=0:d=3,afade=enable='between(t,7,10)':t=out:st=7:d=3,afade=enable='between(t,10,13)':t=in:st=10:d=3,afade=enable='between(t,13,16)':t=out:st=13:d=3" -t 16 output.mp3
Works for me with ffmpeg 2.5.2.
I'm using the fade-in and fade-out audio filters, both with a duration of 3 seconds.
ffmpeg -i audio.mp3 -af 'afade=t=in:ss=0:d=3,afade=t=out:st=27:d=3' out.mp3
I'd recommend upgrading your ffmpeg, as this might be a bug. More information is in the docs.
Take a look here: ffmpeg volume filters
volume='if(lt(t,10),1,max(1-(t-10)/5,0))':eval=frame
Complete command:
ffmpeg -i movie.wav -filter volume='if(lt(t,10),1,max(1-(t-10)/5,0))':eval=frame modified-movie.wav
I have a folder with 15 images and 1 audio file:
image_1.jpg, image_2.jpg, image_3.jpg, ..., and music.webm
(Also, the resolution of the images is 1440x720.)
I want to combine these images into a video with the audio in the background. The framerate I require is 0.2 (5 seconds for each frame). I searched Stack Overflow, found the closest example, and tried it, but it failed:
ffmpeg -f image2 -i image%03d.jpg -i music.webm output.mp4
(I have very little knowledge of ffmpeg, so please excuse my foolishness.)
Please help me with my issue. (Also, I didn't understand where in the command I have to specify the framerate.)
Edit: If needed, I can easily change the filenames of the images; feel free to tell me that too.
How did your command fail? Please paste the error output.
Given your image file names, the pattern should be image_%d.jpg, not image%03d.jpg.
e.g.
image_%d.jpg applies to: image_1.jpg, image_2.jpg, image_3.jpg
image_%04d.jpg applies to: image_0001.jpg, image_0002.jpg ... image_9999.jpg
Also, when using this kind of pattern sequence, make sure that:
the sequence starts from xxx001.jpg (or xxx1.jpg); otherwise you should pass the -start_number option with the first index;
the sequence is not broken, e.g. image5.jpg, image6.jpg, (missing 7) image8.jpg.
Refer to:
https://ffmpeg.org/ffmpeg-formats.html#image2-2
https://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide/image_sequence
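For example, with the unpadded file names from the question, a command along these lines might work (-framerate 0.2 gives 5 seconds per image; -start_number, the codec choices and -shortest, which stops at the shorter of the two streams, are assumptions on my part, not part of the answer above):
ffmpeg -framerate 0.2 -start_number 1 -i image_%d.jpg -i music.webm -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest output.mp4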
Try this:
ffmpeg -r 0.2 -i image_%02d.jpg -i music.webm -vcodec libx264 -crf 25 -preset veryslow -acodec copy video.mkv
-r specifies the fps (I actually didn't try a fractional value there, but give it a go).
-vcodec specifies the video codec, -crf the quality, -preset the encoding speed (slower is more efficient), and -acodec copy says the audio should be copied as-is.
I think that should work; give it a try. You will need to rename the images to image_01.jpg, image_02.jpg, ...
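If the images are currently named image_1.jpg through image_15.jpg (as in the question), a small shell loop like this could pad the single-digit names; the two-digit ones are already fine:
for i in $(seq 1 9); do mv "image_${i}.jpg" "$(printf 'image_%02d.jpg' "$i")"; done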
Also have a look here: How to create a video from images with FFmpeg?
I got two different commands.
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[0:a]atrim=end=10,asetpts=N/SR/TB[begin];[0:a]atrim=start=10,asetpts=N/SR/TB[end];[begin][1:a][end]concat=n=3:v=0:a=1[a]" -map "[a]" output
This command inserts second.mp3 into input.mp3. It seems to always keep the parameters of input.mp3, and it inserts it at exactly 10 seconds into input.mp3.
Here is the second command:
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[1:a]adelay=10000|10000[1a];[0:a][1a]amix=duration=first" output
This command is closer to my final goal. It plays input.mp3 and, at exactly 10 seconds, plays second.mp3 on top of it without stopping input.mp3's sound. (I think that's called mixing?)
My final goal is to create final.mp3.
Its duration must always equal input.mp3's duration, and it must keep the sample rate, channel count, etc. of input.mp3.
When playing final.mp3, it must play the whole input.mp3.
But every 10-15 seconds, it must play second.mp3 without stopping input.mp3 (mix them).
It could be said that I must use the "second command", but in a loop.
It would be great if there were a one-line command for that in ffmpeg.
I am working with FLAC, MP3 and WAV, and both of the commands were suitable for that.
For example:
input.mp3 could be 40 seconds long.
second.mp3 could be 2 seconds long.
When I play final.mp3, it will be 40 seconds long, but every 10-15 seconds (at random) it will play second.mp3 at the same time as input.mp3.
Sadly, I have no experience with ffmpeg; both of the commands I got are from answers to questions here on Stack Overflow. I hope somebody can help me. Thank you!
I ended up generating a long ffmpeg command using PHP:
-i input_mp3.mp3 -i second.wav -filter_complex "[1:a]adelay=2000|2000[1a];[1:a]adelay=19000|19000[2a];[1:a]adelay=34000|34000[3a];[1:a]adelay=51000|51000[4a];[1:a]adelay=62000|62000[5a];[1:a]adelay=72000|72000[6a];[1:a]adelay=85000|85000[7a];[1:a]adelay=95000|95000[8a];[1:a]adelay=106000|106000[9a];[1:a]adelay=123000|123000[10a];[1:a]adelay=139000|139000[11a];[1:a]adelay=154000|154000[12a];[1:a]adelay=170000|170000[13a];[1:a]adelay=184000|184000[14a];[1:a]adelay=197000|197000[15a];[1:a]adelay=212000|212000[16a];[1:a]adelay=224000|224000[17a];[1:a]adelay=234000|234000[18a];[1:a]adelay=248000|248000[19a];[1:a]adelay=262000|262000[20a];[1:a]adelay=272000|272000[21a];[1:a]adelay=288000|288000[22a];[0:a][1a][2a][3a][4a][5a][6a][7a][8a][9a][10a][11a][12a][13a][14a][15a][16a][17a][18a][19a][20a][21a][22a]amix=23:duration=first,dynaudnorm" output_mp3_dynaudnorm.mp3
-i input_wav.wav -i second.wav -filter_complex "[1:a]adelay=1000|1000[1a];[0:a][1a]amix=2:duration=first,dynaudnorm" output_wav_dynaudnorm.wav
-i input_flac.flac -i second.wav -filter_complex "[1:a]adelay=1000|1000[1a];[1:a]adelay=11000|11000[2a];[1:a]adelay=27000|27000[3a];[0:a][1a][2a][3a]amix=4:duration=first,dynaudnorm" output_flac_dynaudnorm.flac
That kind of syntax seems to work. I also added dynaudnorm to counteract the volume drop caused by amix (see: FFMPEG amix filter volume issue with inputs of different duration).
Even though it is claimed that dynaudnorm fixes the amix problem, that is not completely true, at least not in my case, where I am mixing ~30 inputs...
But the final command works. I will ask a new question about how to improve the results.
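For anyone who wants to script this without PHP, here is a rough bash sketch of the same idea; the file names, the 300-second cap and the 2-second first offset are assumptions, and RANDOM is bash-specific:
#!/bin/bash
# Build one adelay chain per overlay, spaced 10-15 seconds apart at random,
# then mix everything with amix and dynaudnorm as in the commands above.
filter=""
labels=""
n=0
t=2
while [ "$t" -lt 300 ]; do            # 300 = assumed input duration in seconds
  n=$((n + 1))
  ms=$((t * 1000))                    # adelay expects milliseconds per channel
  filter="${filter}[1:a]adelay=${ms}|${ms}[${n}a];"
  labels="${labels}[${n}a]"
  t=$((t + 10 + RANDOM % 6))          # next overlay 10-15 seconds later
done
filter="${filter}[0:a]${labels}amix=$((n + 1)):duration=first,dynaudnorm"
ffmpeg -i input.mp3 -i second.wav -filter_complex "$filter" output.mp3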
I've tried many ways to solve the problem. I also tried this command before:
ffmpeg -i movie.mp4 -vf trim=3:8 cut.mp4
but in the result, the audio is still there while the video is gone. What I want is for both the video and the audio in that range to be removed.
Can anyone help me resolve this?
Use the select/aselect filters to drop the 3-8 second range from both the video and the audio and regenerate the timestamps:
ffmpeg -i movie.mp4 -vf "select='not(between(t,3,8))',setpts=N/FRAME_RATE/TB" -af aselect='not(between(t,3,8))',asetpts=N/SR/TB cut.mp4
I am trying to concatenate video files so that the next one follows the one before it when played. The formatting of all of the files is the same. The files all have audio and video.
I think I am very close (hopefully!) to getting this to work, but I have one final problem. The command below takes all of the mp4 files in my folder and creates a big mp4 file, which is the right size in total MB, but the images for all videos after the first video are garbled. The audio is okay (continues just fine from video to video). Also, I don't get any error messages.
ffmpeg -f concat -i <(for f in /folder1/*.mp4; do echo "file '$f'"; done) -c copy /folder1/all.mp4
I'm not very familiar with ffmpeg yet, so I've just been trying the different suggestions I've found on the web. Can anyone suggest other things for me to try? (I've tried reading the FAQs, but I have to confess that I don't fully understand them. Also, there seem to be some posts about audio being missing after concatenation, but I haven't seen anything on images being garbled.) Thanks in advance!
I have had good luck using the following (avconv is a fork of ffmpeg):
avconv -i 1.mp4 1.mpeg
avconv -i 2.mp4 2.mpeg
avconv -i 3.mp4 3.mpeg
cat 1.mpeg 2.mpeg 3.mpeg | avconv -f mpeg -i - -vcodec mpeg4 -strict experimental output.mp4
Does anyone know how to watermark video from the Linux command line using a simple tool?
Watermarking in ffmpeg isn't supported in the current version, and requires a custom compile.
Max.
ffmpeg -y -i 'inputFile.mpg' -vhook '/usr/lib/vhook/watermark.so -f /home/user/logo.gif'
Make note of the "-vhook" parameter; watermark.so path may vary.
Another simple way to do this is to update ffmpeg to the newest version and add the overlay video filter:
ffmpeg -y -i video.mp4 -i watermark.png -filter_complex "overlay=(main_w-overlay_w):(main_h-overlay_h)" watermark.mp4
This also gives you more options for where to place the watermark. For example, if you wanted to place the watermark in the center of the video, you would use:
-filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2"
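Putting that together, a full command for a centered watermark would look like this (same input names as the example above, with a different output name):
ffmpeg -y -i video.mp4 -i watermark.png -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" centered.mp4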