How can I convert gif to webm preserving alpha channel? - linux

To start with, I have a gif from Google Images which has a transparent alpha channel.
[The original gif, and a screen recording of it playing, were attached to the question; open the gif in a new tab to see the transparency.]
Then I run the following script to convert it to webm, which is what I need for the game framework I'm using.
avconv -f gif -i img.gif img.webm
However, it doesn't maintain the transparency. Here it is overlaid on a properly transparent webm (the water, taken from https://phaser.io/examples/v2/video/alpha-webm):
The white box shouldn't be appearing around the gem.

First convert the gif to png frames:
convert img.gif img%03d.png
Then combine them into a webm with this command (I had to get outside help on this):
ffmpeg -framerate 25 -f image2 -i ./img%03d.png -c:v libvpx -pix_fmt yuva420p img.webm
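Depending on your build, ffmpeg can often do this in a single step without the intermediate png frames. This is a sketch, not the method from the answer above, and it assumes your ffmpeg's gif decoder preserves the palette transparency:
# sketch: one-step conversion; alpha survives only with an alpha-capable pix_fmt and encoder
ffmpeg -i img.gif -c:v libvpx -pix_fmt yuva420p img.webm
Either way, the key is -pix_fmt yuva420p: the extra alpha plane is what carries the transparency, and libvpx (VP8) is one of the few widely supported encoders that keeps it.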

Related

FFMPEG color key overlay. audio output not working after conversion

ffmpeg -i hollywood.webm -i chroma.mp4 -filter_complex "[1:v]colorkey=0x000000:0.3:0.2[ckout];[0:v][ckout]overlay[out]" -map "[out]" test1.mp4
If you don't know the answer, I would appreciate a pointer to the -map documentation; I couldn't find it.
This is what I have been trying to do, where chroma is the overlay video (which also has audio) and hollywood is the background video (also with audio). The output is good image-wise, but the -map section is probably not going well.
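A minimal sketch of the usual cause (my assumption, not a confirmed fix from the thread): -map "[out]" selects only the filtered video, so the output gets no audio unless you add a second -map for the audio stream you want, e.g. the background's:
# sketch: "[out]" is the filtered video; 0:a picks the background (hollywood) audio
ffmpeg -i hollywood.webm -i chroma.mp4 -filter_complex "[1:v]colorkey=0x000000:0.3:0.2[ckout];[0:v][ckout]overlay[out]" -map "[out]" -map 0:a test1.mp4
The -map option is documented in the main ffmpeg manual under "Advanced options".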

Why does ffmpeg not play the corrected pixel aspect ratio defined in the fMP4 boxes of an HLS file set?

We created a 10 sec 1920x1200 video file of a circle with a border to test HLS streaming pixel aspect ratio (PAR) correction. The video was scaled and encoded with x264 to 320x176, giving a PAR of 22/25. Our source and output HLS fMP4 files can be found here: https://gitlab.com/kferguson/aspect-ratio-streaming-files.git
The moov.trak.tkhd MP4 box in the fragment_1000kbps_init.mp4 file shows the correct visual presentation size of 281x176 and the ...avc1.pasp box shows the correct PAR of 22/25. (We used the "isoviewer-2.0.2-jfx.jar" application to view the MP4 boxes.)
If the master.m3u8 file is played with the VLC player or with gstreamer,
gst-launch-1.0 playbin uri=file:///master.m3u8
the aspect ratio is correctly displayed. However, with ffplay, or when streaming to hls.js embedded in a web site, the PAR is not corrected (the circle in the video is squashed). What are ffplay and hls.js 'looking' for in the MP4 boxes in order to play back correctly?
We did a further experiment by concatenating all the fMP4 files into one .mp4 file with the following PowerShell command:
gc -Raw .\fragment_1000kbps_init.mp4, .\fragment_1000kbps_00000.m4s, ..., .\fragment_1000kbps_00000.m4s | sc -NoNewline .\fragment_1000kbps.mp4
Strangely, ffplay plays this concatenated mp4 file back with the correct PAR. (It is also included in the git link above.) From this we assumed there is some info in the fragment m4s file boxes that ffplay (and hls.js) require to play back correctly, but we cannot find it.
The issue is the optional VUI parameters (sar_width and sar_height) within the H.264 Sequence Parameter Set (SPS) carried in the "avcC" MP4 box. The VLC player and gstreamer only require the pasp box, but the ffmpeg and hls.js players additionally require the VUI parameters to be set correctly.
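For completeness, a sketch of how those VUI fields get set at encode time (an assumption about the pipeline, not part of the original answer): with ffmpeg/libx264, the setsar filter sets the sample aspect ratio, which libx264 then writes into the SPS VUI:
# sketch: setsar=22/25 ends up as sar_width/sar_height in the SPS VUI
ffmpeg -i source.mp4 -vf "scale=320:176,setsar=22/25" -c:v libx264 out.mp4
The standalone x264 encoder has the equivalent --sar 22:25 option.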

How to stretch the width of mpv/mplayer by keeping the height of the video same

I have a screen which is in portrait mode and want to play some videos on it using mpv or mplayer, on just the lower 70% of the screen area. But since the screen is in portrait mode, the video (which is landscape) isn't stretched to the full width; it only occupies the width that matches the video's resolution.
The command I tried was
mplayer -vf scale -zoom -xy 500 out.mp4
The video should fill the entire width of the screen, keeping the height of the video the same. The video would of course get stretched, but that's OK. (In the diagram attached to the question, I'm getting the blue area for the video, but I want the orange area.)
Got the answer from this post:
FFmpeg - Change resolution of the video with aspect ratio
I had to flip the aspect ratio for my project from 16/9 to 9/16:
ffmpeg -i <input> -vf "scale=100:-1,setdar=9/16" <output>
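If re-encoding is undesirable, a player-side sketch for mpv (my assumption, not from the linked post): turning off aspect preservation lets the video stretch to whatever window geometry you give it:
# sketch: stretch to the window; size the window to the lower part of a portrait screen
mpv --keepaspect=no --geometry=100%x70% out.mp4
--keepaspect=no stretches the video to fill the window, and --geometry sizes the window; adjust the percentages and position to your layout.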

Using ffmpeg to overlay black line or add border to two side by side videos

I am using the following to generate a video that is side by side.
ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; [1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w" -y final.mp4
It looks like this.
http://www.mo-de.net/d/partnerAcrobatics.mp4
I would like to place a vertical black line on top, right in the middle, or add a black border to the video on the left. If I add a border to the left video, I would like to maintain the total dimensions of the original videos, which would require subtracting the border width from the left video's width. I will take either solution.
Thanks
Solution: If both videos have no audio, use this.
ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=639:720, pad=640:720:0:0:black[tmp0]; [1:v]crop=639:720, pad=640:720:1:0:black[tmp1]; [tmp0][tmp1]hstack[v] " -map [v] -y o.mp4
If both videos have audio use the following.
ffmpeg -i c2.mov -i c1.mov -filter_complex "[0:v]crop=1279:720, pad=1280:720:0:0:black[tmp0]; [1:v]crop=1279:720, pad=1280:720:1:0:black[tmp1]; [tmp0][tmp1]hstack[v];[0:a][1:a]amerge=inputs=2[a]" -map "[v]" -map "[a]" -ac 2 -y o.mp4
Both videos must have the same height.
crop=1279:720
I used crop to remove one pixel of width from the right side of the video; it was originally 1280 pixels wide.
pad=1280:720:0:0:black[tmp0]
I padded the left movie by declaring a new canvas size of 1280 pixels. This moved the movie to the left, leaving one pixel of space on the right, which is colored black.
The right movie I padded and moved one pixel to the right, exposing the black border on the left.
pad=1280:720:1:0:black[tmp1]
I did this to both videos so the effect remains centered when the videos have the same dimensions.
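An alternative to the crop/pad approach (a sketch, my addition rather than the answerer's method): since the joined video already exists, the drawbox filter can paint the divider directly:
# sketch: draw a 2px vertical black line centered on the seam of the joined video
ffmpeg -i final.mp4 -vf "drawbox=x=iw/2-1:y=0:w=2:h=ih:color=black:t=fill" -y lined.mp4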
Use
ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS,crop=iw-10:ih:0:0, pad=2*(iw+10):ih[bg]; [1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w" -y final.mp4
Since you've already joined the videos, it seems you want to separate them with a vertical black line.
Since this line is stationary (not animated), the overlay can be a still picture, e.g. black.png (10px wide and the same height as your video).
If you want this line to be animated or a moving picture, then the overlay can be another video.
If the videos had NOT been joined, you could pad the 2nd video on the left or the 1st video on the right before joining,
eg. ffmpeg -i 1.mp4 -vf "pad=width=<new width>:height=<same height>:color=black" out.mp4
The code below answers your question, with a vertical line of width 10 pixels in the middle, separating the 2 videos:
ffmpeg -i in.mp4 -i black.png -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" out.mp4
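If you don't already have a suitable black.png, a sketch of one way to generate it with ffmpeg's color source (assuming a 10x720 line for a 720px-tall video):
# sketch: write a single 10x720 black frame as a png
ffmpeg -f lavfi -i color=c=black:s=10x720 -frames:v 1 black.png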

FFmpeg: how to make video out of slides and audio

So I have several images, some png and some jpgs, and I have mp3 audio. I want to make a video file; I don't care what format.
So I want either:
a video of a given size (xyz), meaning images are centered and cropped if they go beyond the dimensions, coupled with the mp3 audio,
or just one image, centered and/or cropped, as a still-image video with audio.
I have tried copying and pasting commands, and even modifying them after reading the documentation, but in the end I got a blank video with audio and a huge file that took forever to complete.
I am on Windows 7.
You have to rename the images in a sequence.
For example, if you have a.png, bds.png, asda.png...
rename them to image1.png, image2.png, image3.png, and so on
(in the order you want the images to appear in the video).
Now make sure you are in the folder where you saved the renamed images, and use:
ffmpeg -i image%d.png output.mp4
(or whichever output format you want). Now, to add audio (say input.mp3) to output.mp4, use:
ffmpeg -i input.mp3 -i output.mp4 output2.mp4
This should work. Hope this helps.
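Note that the two-step recipe above shows each image for only 1/25 of a second (ffmpeg's default input frame rate). A single-step sketch with a slide duration and the audio muxed in (my assumptions: 5 seconds per slide, and -shortest to stop at the end of the shorter stream):
# sketch: 1 image every 5 s in, 25 fps out, yuv420p for broad player compatibility
ffmpeg -framerate 1/5 -i image%d.png -i input.mp3 -c:v libx264 -r 25 -pix_fmt yuv420p -shortest output.mp4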
