Backstory: One livestreaming site I use isn't smart enough to detect the capabilities of my webcam (Logitech Brio, 4K), and instead just uses the default frame rate, which is 5 fps.
(full solution walk-through in the answer)
The best solution I could think of (besides changing livestream providers) was to create a loopback virtual webcam using v4l2loopback that I could force to have the exact settings I wanted to use on that livestream site.
For the Brio, the higher frame rates come with MJPEG, not the default YUYV.
Problem 1:
I could easily read MJPEG, but unfortunately kept banging my head against the wall because v4l2loopback evidently only wanted YUYV.
I tried things like:
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-vcodec copy \
-f v4l2 /dev/video6
and
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-vcodec yuyv422 \ # this line changed (even tried "copy")
-f v4l2 /dev/video6
But they wouldn't work. I got errors like:
Unknown V4L2 pixel format equivalent for yuvj422p
and
...deprecated pixel format used, make sure you did set range correctly...
...V4L2 output device supports only a single raw video stream...
Eventually I got this to work:
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-pix_fmt yuyv422 \ # The winning entry
-f v4l2 /dev/video6
Problem 2:
The next problem was getting Chrome to see the virtual webcam. It worked correctly with guvcview, and in Firefox the webcam-testing sites picked the virtual camera up without a problem.
Turns out Google, in its overly-protective nature (while it's siphoning off all our data, btw), doesn't want to use webcams that can be both read from and written to.
So when starting v4l2loopback you have to tell it to announce itself as capture-only ("read only") to consumers like Chrome.
Here's the exact modprobe I use that works:
sudo modprobe v4l2loopback devices=1 exclusive_caps=1
Exact solution.
1. Figure out which webcam is the correct input webcam
Use v4l2-ctl to list all the webcams:
v4l2-ctl --list-devices
My output is this (yours will vary; I'll use mine as an example as I go):
Logitech BRIO (usb-0000:00:14.0-5.2):
/dev/video0
/dev/video1
HP HD Camera: HP HD Camera (usb-0000:00:14.0-9):
/dev/video2
/dev/video3
/dev/video4
/dev/video5
In this case my Brio is video0.
2. Start v4l2loopback:
sudo modprobe v4l2loopback devices=1 exclusive_caps=1
3. Confirm your loopback device:
v4l2-ctl --list-devices
Mine now shows this, indicating video6 is the loopback:
Dummy video device (0x0000) (platform:v4l2loopback-000):
/dev/video6
Logitech BRIO (usb-0000:00:14.0-5.2):
/dev/video0
/dev/video1
HP HD Camera: HP HD Camera (usb-0000:00:14.0-9):
/dev/video2
/dev/video3
/dev/video4
/dev/video5
4. Determine your optimal input settings
Use guvcview to figure out which codec gives you the resolution and framerate you're looking for (you may have to use the menu -> Video -> Video Codec -> Raw camera input).
I got 60 fps using MJPEG, though I only needed 30. The default YUYV gave a miserable 5 fps.
Now use ffmpeg to list the capabilities of the camera and get the matching codec:
ffmpeg -f v4l2 -list_formats all -i /dev/video0 # use your camera here from step 1
In the output you'll see something like this:
[video4linux2,v4l2 @ 0x55f1a4e989c0] Raw : yuyv422 : YUYV 4:2:2 : 640x480 160x120 176x144 320x180 320x240 352x288 340x340 424x240 440x440 480x270 640x360 800x448 800x600 848x480 960x540 1024x576 1280x720 1600x896 1920x1080
[video4linux2,v4l2 @ 0x55f1a4e989c0] Compressed: mjpeg : Motion-JPEG : 640x480 160x120 176x144 320x180 320x240 352x288 424x240 480x270 640x360 800x448 800x600 848x480 960x540 1024x576 1280x720 1600x896 1920x1080
In my case it was mjpeg that gave the best output in guvcview, and that is the exact name of the format to use (as shown above).
5. Start ffmpeg using that input codec, converting the pixel format to yuyv422:
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-pix_fmt yuyv422 \
-f v4l2 /dev/video6
Update the video size to the highest resolution your livestream/recording service will support, as long as your camera also supports it.
Now when you want to livestream, just use the camera labeled "Dummy".
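The whole walkthrough can be condensed into one small script. A minimal sketch, assuming the same device paths as above (CAM and LOOP are placeholders — substitute the nodes from your own `v4l2-ctl --list-devices` output):

```shell
#!/bin/sh
# Assumed device paths — adjust to match your machine (steps 1 and 3).
CAM=/dev/video0    # physical webcam
LOOP=/dev/video6   # v4l2loopback device

# Compose the relay arguments in one place so they are easy to inspect.
ARGS="-f v4l2 -input_format mjpeg -framerate 30 -video_size 1280x720"
ARGS="$ARGS -i $CAM -pix_fmt yuyv422 -f v4l2 $LOOP"

# Load the loopback module, then start the relay (uncomment to run):
# sudo modprobe v4l2loopback devices=1 exclusive_caps=1
# ffmpeg $ARGS
echo "ffmpeg $ARGS"
```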
Related
I am working on an application that records a portion of the Windows desktop screen using FFmpeg. It works fine using a command like this:
ffmpeg -f dshow -i audio="Microphone (Realtek Audio)" -f gdigrab -offset_x 0 -offset_y 0 -video_size 300x200 -i desktop -pix_fmt yuv420p -c:v libx264 -r 15 output.mp4
But when I change the audio source to my Bluetooth headset, FFmpeg just hangs and doesn't start recording. Here is the same command with the Bluetooth audio device:
ffmpeg -f dshow -i audio="Headset (QCY-T1_R Hands-Free AG Audio)" -f gdigrab -offset_x 0 -offset_y 0 -video_size 300x200 -i desktop -pix_fmt yuv420p -c:v libx264 -r 15 output1.mp4
Can you please suggest how can we solve this?
After trying different things, I finally found out that the ffmpeg recording command was fine, but some (or maybe many?) Chinese Bluetooth headsets (I tried QCY AirDots and Redmi AirDots) do not work properly with Windows, and that causes the voice recording to fail. Screen recording with audio worked perfectly with another Sony Bluetooth headset.
I'm using an rpi1 (running Raspbian Lite and JWM) with a USB webcam hooked to a CRT TV to display its output. Up until now I've been using Camorama, which works nicely, but I lose some of the measly 640x480 screen resolution to the app's title bar, and some more below it to the window manager's buttons etc. that I don't need. Is there any way I can simply show the video output of the device in a full-screen window, so I can add it to the window manager's startup and run it on top? I really don't need any of the features that Camorama has, because all I want is to display video à la security camera.
Display webcam output from Linux
Note: Adjust width and height where necessary
mplayer tv:// -tv driver=v4l:width=352:height=288
or
mplayer tv:// -tv driver=v4l2:width=640:height=480:device=/dev/video0:fps=30:outfmt=yuy2
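For the full-screen part of the question, mplayer's `-fs` flag should cover it. A sketch (the capture settings mirror the command above, and the device path is an example; the command is composed into a variable here so the flags can be checked without a camera attached):

```shell
#!/bin/sh
# Hypothetical full-screen variant of the command above: -fs makes
# mplayer take over the whole screen, which suits a security-camera view.
OPTS="-fs tv:// -tv driver=v4l2:width=640:height=480:device=/dev/video0:fps=30:outfmt=yuy2"
# mplayer $OPTS   # uncomment to run on the Pi
echo "mplayer $OPTS"
```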
How to Record a Screencast & Convert it to an mpeg
ffmpeg -f x11grab -r 25 -s 640x480 -i :0.0 /tmp/VideoOutput.mpg
Record audio and video from webcam using ffmpeg
Record webcam audio using ALSA, MP3 encoded & video as MPEG-4.
ffmpeg -f alsa -ar 16000 -i hw:2,0 -f video4linux2 -s 800x600 -i /dev/video0 -r 30 -f avi -vcodec mpeg4 -vtag xvid -q:v 3 -acodec libmp3lame -ab 96k myVideo.avi
Hope this helps.
When I record both audio and video using ffmpeg the audio recording cuts out for the last two seconds of the video.
ffmpeg \
-f v4l2 -i /dev/video0 \
-f alsa -i hw:2 \
samples/video.mp4
I have tried using different audio and video codecs, as well as different video formats, and I have noticed that with the mpg format instead of mp4 the audio works better.
I have also tried using different codecs with the mp4 and checked the compatibilities wikipedia but they don't seem to matter much.
So adding the following options seems to solve the problem:
-preset ultrafast -threads 0
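For context, a sketch of where those options would sit in the full command from the question (they are output options for the x264 encoder that mp4 defaults to, so they go after the inputs; the device names are the question's):

```shell
#!/bin/sh
# Hypothetical assembled command: -preset ultrafast reduces encoding
# cost per frame and -threads 0 lets the encoder pick its thread count.
CMD="ffmpeg -f v4l2 -i /dev/video0 -f alsa -i hw:2"
CMD="$CMD -preset ultrafast -threads 0 samples/video.mp4"
# eval "$CMD"   # uncomment to actually record
echo "$CMD"
```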
I am trying to create an RTP stream using ffmpeg. I am taking input from my Logitech C920, which has built-in h264 encoding support and also has a microphone. I wanted to send both video (h264, either with the built-in encoder or ffmpeg's encoder) and audio (any encoding) over RTP and then play the stream using ffplay.
So far I am able to send only the video with the following command:
ffmpeg -i /dev/video0 -r 24 -video_size 320x240 -c:v libx264 -f rtp rtp://127.0.0.1:9999
and also the audio separately using the command:
ffmpeg -f alsa -i plughw:CARD=C920,DEV=0 -acodec libmp3lame -t 20 -f rtp rtp://127.0.0.1:9998
and play the sdp file with:
ffplay -protocol_whitelist file,udp,rtp -i test3.sdp
ffplay -protocol_whitelist file,udp,rtp -i test4.sdp
I'm on Ubuntu 14.04
How can I play the two streams with a single ffplay command, since ffplay cannot take two inputs, and I can't send two streams in a single RTP session (or can I?)?
Also, how can I use the built-in h264 encoder of my webcam?
Thank you!
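On the second question: cameras like the C920 usually expose their hardware h264 stream as a v4l2 input format, which can then be passed through untouched with `-c:v copy`. A sketch, assuming the camera is /dev/video0 and actually advertises an h264 format — confirm first with `ffmpeg -f v4l2 -list_formats all -i /dev/video0`:

```shell
#!/bin/sh
# Hypothetical: request the camera's built-in h264 encoding and relay it
# over RTP without re-encoding. Device path and port are assumptions.
CMD="ffmpeg -f v4l2 -input_format h264 -framerate 24 -video_size 320x240"
CMD="$CMD -i /dev/video0 -c:v copy -f rtp rtp://127.0.0.1:9999"
# eval "$CMD"   # uncomment to stream
echo "$CMD"
```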
I want to live stream video from a webcam and sound from a microphone from one computer to another, but there are some problems.
When I use this command line:
ffmpeg.exe -f dshow -rtbufsize 500M -i video="Camera":audio="Microphone" -c:v mpeg4 -c:a mp2 -f mpegts udp://127.0.0.1:1234
The FFmpeg console starts filling with yellow warning messages and the stream becomes unstable: http://s16.postimg.org/qglcgr345/Untitled.png
To solve this problem I added a new parameter to the command line to set the frame rate, -r 25:
ffmpeg.exe -f dshow -rtbufsize 500M -r 25 -i video="Camera":audio="Microphone" -c:v mpeg4 -c:a mp2 -f mpegts udp://127.0.0.1:1234
After I added -r 25 the problem with the yellow messages disappeared, but then another problem appeared. When I freshly start FFmpeg with this command line, video and sound look synchronous, but after one or two minutes a ~25 second lag appears between video and sound; the sound falls behind the video. I have tried different protocols (UDP, TCP, RTP) but the problems are the same. Please help me!
I found the answer to my problem with "-r" and asynchronous audio and video. For anyone interested, the answer is here: https://trac.ffmpeg.org/wiki/DirectShow (in the paragraph "Specifying input framerate").
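For anyone who doesn't want to dig through the wiki: as I understand that section, the fix is to request the rate with `-framerate` as a dshow *input* option instead of `-r`. A sketch based on the command from the question (device names unchanged; the flag placement is the point, not a verified recipe):

```shell
#!/bin/sh
# Hypothetical reworked command: -framerate 25 is placed before -i so it
# applies to the dshow capture rather than resampling the output.
CMD='ffmpeg.exe -f dshow -rtbufsize 500M -framerate 25'
CMD="$CMD -i video=\"Camera\":audio=\"Microphone\""
CMD="$CMD -c:v mpeg4 -c:a mp2 -f mpegts udp://127.0.0.1:1234"
echo "$CMD"
```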