How can you use a Kinect camera inside Google Chrome? - linux

EDIT: It's now supported.
I want to use a Kinect inside Chrome using WebRTC. On Linux, the Kinect's UYVY pixel format is not supported.
Is it possible to create a new device descriptor (/dev/video1) that transforms /dev/video0 from UYVY to YUYV?

We need to create a virtual device (loopback):
git clone git@github.com:umlaeute/v4l2loopback.git
cd v4l2loopback
make
sudo make install
sudo modprobe v4l2loopback  # creates /dev/video2 or /dev/videox ...
Install GStreamer:
sudo apt-get install gstreamer0.10
Convert from sRGB to YUY2 (YUYV):
gst-launch-0.10 -v v4l2src device=/dev/video0 ! \
ffmpegcolorspace ! \
video/x-raw-rgb ! \
ffmpegcolorspace ! \
video/x-raw-yuv,format=\(fourcc\)YUY2 ! \
v4l2sink device=/dev/video2
Test that everything works:
gst-launch v4l2src device=/dev/video2 ! xvimagesink
v4l2-ctl -d /dev/video2 --all
Driver Info (not using libv4l2):
Driver name : v4l2 loopback
Card type : Dummy video device (0x0000)
Bus info : v4l2loopback:0
Driver version: 0.8.0
Capabilities : 0x05000001
Video Capture
Read/Write
Streaming
Video input : 0 (loopback: ok)
Video output: 0 (loopback in)
Format Video Capture:
Width/Height : 1280/1024
Pixel Format : 'YUYV'
Field : None
Bytes per Line: 2560
Size Image : 2621440
Colorspace : SRGB
Format Video Output:
Width/Height : 1280/1024
Pixel Format : 'YUYV'
Field : None
Bytes per Line: 2560
Size Image : 2621440
Colorspace : SRGB
Streaming Parameters Video Capture:
Frames per second: 30.000 (30000/1000)
Read buffers : 8
Streaming Parameters Video Output:
Frames per second: 30.000 (30000/1000)
Write buffers : 8
keep_format (bool) : default=0 value=0
sustain_framerate (bool) : default=0 value=0
timeout (int) : min=0 max=100000000 step=1 default=0 value=0
timeout_image_io (bool) : default=0 value=0
Now this should work in Chrome via the "Dummy video device (0x0000)".
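As a sanity check on the numbers in the v4l2-ctl output above: YUYV packs two pixels into four bytes, i.e. 2 bytes per pixel, so Bytes per Line and Size Image follow directly from the 1280x1024 mode. A quick sketch of the arithmetic:

```python
def yuyv_frame_bytes(width, height):
    """YUYV (YUY2) packs two pixels into four bytes: 2 bytes per pixel."""
    bytes_per_line = width * 2
    return bytes_per_line, bytes_per_line * height

# The 1280x1024 mode reported by v4l2-ctl above:
bpl, size = yuyv_frame_bytes(1280, 1024)
print(bpl, size)  # 2560 2621440, matching Bytes per Line and Size Image
```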

Just to update the answer from 2014:
gst-launch-1.0 uses different element names and caps syntax, so the new command, which worked for me, is:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
videoconvert ! \
video/x-raw, format=RGB ! \
videoconvert ! \
video/x-raw,format=YUY2 ! \
v4l2sink device=/dev/video1
where /dev/video1 is the loopback device.

Related

Gstreamer: Is there plugin to increase audio volume of an audio source

As above: I want to artificially increase the loudness of a microphone recording of a live scene. I have used the volume plugin:
pulsesrc volume=8.0 \
! audioconvert \
! audioresample \
! volume volume=1.0 \
! audio/x-raw,rate=8000,channels=1,depth=8,format=S16LE \
! filesink location=record.wav
I have also tried audioamplify, but it seems both have the raw recorded volume as the upper limit in loudness.
Thanks for your help.
The volume property of the volume element has a range from 0.0 to 10.0, where 1.0 represents 100% volume.
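For intuition about what those linear factors mean, converting them to decibels with 20*log10(factor) shows how much gain the element actually offers (a quick sketch of the math, not a GStreamer API):

```python
import math

def linear_to_db(factor):
    """Convert a linear amplitude factor (what the volume property takes) to dB."""
    return 20 * math.log10(factor)

print(round(linear_to_db(8.0), 1))   # 18.1 -> volume=8.0 is roughly +18 dB of gain
print(round(linear_to_db(10.0), 1))  # 20.0 -> the element's maximum, +20 dB
```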

Gstreamer, alsa, bluealsa: bluetooth audio not working in 2 ways audio stream

Context
So I have 2 Gstreamer pipelines, one to send audio up (PC to Pi) to a Raspberry Pi, and another to send audio down (Pi to PC).
The audio down is getting sound from a USB mic and the audio up is outputting sound to a Bluetooth speaker.
The libraries used are ALSA with BlueZ.
Problem
Here is where it gets complicated:
[✓] The audio up alone, is working
[✓] The audio down alone is working
But:
[✗] If the audio down is simultaneous, the audio up is not working
Somehow, Bluetooth audio-out is not working when listening at the same time to a capture device.
However:
[✓] If the audio up is outputting to a non-Bluetooth device (e.g. with the jack connector) it works fine, even simultaneously with audio down.
In other words: the output to the Bluetooth device works only if I am not listening to a capture device simultaneously, while if the output device is not Bluetooth, it works no matter what.
"Not working" means that the audio cannot be heard on the speaker.
Question
How can I get the Bluetooth device to work with GStreamer even when a capture device is listening?
Code
Audio Down
Send audio down
import gi
gi.require_version('Gst', '1.0')
# gst-launch-1.0
cmd = ' alsasrc device=hw:1,0 ! audio/x-raw ! queue '
cmd += ' ! audioresample ! audioconvert ! avenc_ac3 ! queue '
cmd += f' ! rtpac3pay ! udpsink host={ip} port={port} '
Audio Up
Receive audio and output to bluetooth speaker
# gst-launch-1.0
cmd = f'udpsrc port={port} ! decodebin ! audioresample ! audioconvert '
cmd += f' ! alsasink device=bluealsa:DEV={MAC_ADDRESS} '
Receive audio and output to normal speaker (e.g. jack or USB)
# gst-launch-1.0
cmd = f'udpsrc port={port} ! decodebin ! audioresample ! audioconvert '
cmd += ' ! alsasink device=hw:2,0 '
Notes
I am using the python library for Gstreamer.

How to change mjpeg to yuyv422 from a webcam to a v4l2loopback?

Backstory: One livestreaming site I use isn't smart enough to detect the capabilities of my webcam (Logitech Brio, 4k), and instead just uses the default frames-per-second setting, which is 5 fps.
(full solution walk-through in the answer)
The best solution I could think of (besides changing livestream providers) was to create a loopback virtual webcam using v4l2loopback that I could force to have the exact settings I wanted to use on that livestream site.
For the brio, the higher frame rates come with mjpeg, not the default yuyv.
Problem 1:
I could easily read mjpeg, but unfortunately kept banging my head against the wall because v4l2loopback evidently only wanted yuyv.
I tried things like:
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-vcodec copy \
-f v4l2 /dev/video6
and
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-vcodec yuyv422 \ # this line changed (even tried "copy")
-f v4l2 /dev/video6
But they wouldn't work. I got errors like:
Unknown V4L2 pixel format equivalent for yuvj422p
and
...deprecated pixel format used, make sure you did set range correctly...
...V4L2 output device supports only a single raw video stream...
Eventually I got this to work:
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-pix_fmt yuyv422 \ # The winning entry
-f v4l2 /dev/video6
Problem 2
The next problem was getting chrome to see the virtual webcam. It worked correctly with guvcview, and on firefox I could use webcam testing sites and it would pick the virtual camera up without a problem.
Turns out Google, in its overly-protective nature (while it's siphoning off all our data, btw), doesn't want to use webcams that can be read and written to.
So when starting v4l2loopback you have to tell it to announce that it's "read only" to consumers like chrome.
Here's the exact modprobe I use that works:
sudo modprobe v4l2loopback devices=1 exclusive_caps=1
Exact solution.
1. Figure out which webcam is the correct input webcam
Use v4l2-ctl to list all the webcams:
v4l2-ctl --list-devices
My output is this (yours will vary, I'll use mine as an example as I go):
Logitech BRIO (usb-0000:00:14.0-5.2):
/dev/video0
/dev/video1
HP HD Camera: HP HD Camera (usb-0000:00:14.0-9):
/dev/video2
/dev/video3
/dev/video4
/dev/video5
In this case my brio is video0.
2. Start v4l2loopback:
sudo modprobe v4l2loopback devices=1 exclusive_caps=1
3. Confirm your loopback device:
v4l2-ctl --list-devices
Mine now shows this, indicating video6 is the loopback:
Dummy video device (0x0000) (platform:v4l2loopback-000):
/dev/video6
Logitech BRIO (usb-0000:00:14.0-5.2):
/dev/video0
/dev/video1
HP HD Camera: HP HD Camera (usb-0000:00:14.0-9):
/dev/video2
/dev/video3
/dev/video4
/dev/video5
4. Determine your optimal input settings
Use guvcview to figure out which codec gives you the resolution and framerate you're looking for (you may have to use the menu -> Video -> Video Codec -> Raw camera input).
I got 60fps using mjpeg, I only needed 30. The default yuyv gave a miserable 5fps.
Now use ffmpeg to list the capabilities of the camera and get the matching codec:
ffmpeg -f v4l2 -list_formats all -i /dev/video0 #use your camera here from step 2
In the output you'll see something like this:
[video4linux2,v4l2 @ 0x55f1a4e989c0] Raw : yuyv422 : YUYV 4:2:2 : 640x480 160x120 176x144 320x180 320x240 352x288 340x340 424x240 440x440 480x270 640x360 800x448 800x600 848x480 960x540 1024x576 1280x720 1600x896 1920x1080
[video4linux2,v4l2 @ 0x55f1a4e989c0] Compressed: mjpeg : Motion-JPEG : 640x480 160x120 176x144 320x180 320x240 352x288 424x240 480x270 640x360 800x448 800x600 848x480 960x540 1024x576 1280x720 1600x896 1920x1080
In my case it was the mjpeg that gave the best output in guvcview, and that was the exact name of the codec (as indicated above).
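If you want to script this step, the -list_formats line is easy to parse. A sketch against an abbreviated copy of the sample output above (the parsing helpers are my own, not an ffmpeg API):

```python
import re

# Abbreviated copy of the "Compressed" line from the ffmpeg output above
sample = ("[video4linux2,v4l2 @ 0x55f1a4e989c0] Compressed: mjpeg : "
          "Motion-JPEG : 640x480 848x480 1280x720 1920x1080")

# Codec name follows "Compressed:"
codec = re.search(r'Compressed:\s+(\w+)', sample).group(1)
# The supported resolutions all sit after the last ':' on the line
sizes = [tuple(map(int, s.split('x'))) for s in sample.rsplit(':', 1)[-1].split()]
largest = max(sizes, key=lambda wh: wh[0] * wh[1])
print(codec, largest)  # mjpeg (1920, 1080)
```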
5. Start ffmpeg using that input codec and changing the pixel format to yuyv:
ffmpeg -f v4l2 \
-input_format mjpeg \
-framerate 30 \
-video_size 1280x720 \
-i /dev/video0 \
-pix_fmt yuyv422 \
-f v4l2 /dev/video6
Update the video size to the highest size your livestream/video record will support, as long as your camera also supports it.
Now when you want to livestream, just use the camera labeled "Dummy".
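One caveat worth knowing: the loopback carries uncompressed YUYV, so the data rate through it is substantial. A rough estimate (just the 2-bytes-per-pixel arithmetic, nothing ffmpeg-specific):

```python
def raw_yuyv_rate(width, height, fps):
    """Raw YUYV data rate in bytes per second (2 bytes per pixel)."""
    return width * height * 2 * fps

rate = raw_yuyv_rate(1280, 720, 30)
print(round(rate / 1e6, 1))  # 55.3 -> about 55 MB/s for the settings above
```

This is why the mjpeg-to-raw conversion has to happen locally; pushing raw frames over a network at these settings would be impractical.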

GStreamer When the frame rate is 10/1, the video playback speed gets faster

I want to save a video using GStreamer in the following environment.
Hardware: Raspberry pi (BCM 2709 Revision: a 22082)
OS: Raspbian GNU/Linux 9 (stretch)
Webcam: Logitech HD WEBCAM C270
When I save a video with the following command, the playback speed becomes faster than expected.
$ gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,format=YV12,framerate=10/1 ! videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! filesink location=video.h264
However, if I change "framerate" to 30/1, I can watch without problems.
I checked the frame rate of the video with the following command:
$ ffmpeg -i video.h264
Input #0, h264, from 'video.h264':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264 (High), yuv420p (progressive), 640x480 [SAR 1:1 DAR 4:3], 25 fps, 25 tbr, 1200k tbn, 50 tbc
The frame rate is 25/1.
I tried specifying the frame rate to the hardware with the following command:
$ v4l2-ctl -d /dev/video0 -p 10
Frame rate set to 10.000 fps
But there was no effect.
Also I tried using “videorate” plugin.
$ gst-launch-1.0 v4l2src ! videorate ! video/x-raw,width=640,height=480,format=YV12,framerate=10/1 ! videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! filesink location=video.h264
But it gave worse results.
Even if playback is started, the video remains stopped, and after a while it plays at high speed.
The version of GStreamer is 1.10.4.
This is the only version available with apt-get.
I tried to compile from source code, but building on Raspbian is not supported and it is very difficult.
Update:
I tried Lad's suggestion (Thanks!)
But the following error message was shown...
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,format=YV12,framerate=10/1 ! videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! mp4mux ! filesink location=video.h264
...
ERROR: from element /GstPipeline:pipeline0/GstMP4Mux:mp4mux0: Could not multiplex stream.
Additional debug info:
gstqtmux.c(3391): gst_qt_mux_add_buffer (): /GstPipeline:pipeline0/GstMP4Mux:mp4mux0:
Buffer has no PTS.
What is wrong?
You are missing the mp4mux element in the pipeline. Try the following pipeline:
$ gst-launch-1.0 v4l2src !
video/x-raw,width=640,height=480,format=YV12,framerate=10/1 !
videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! mp4mux !
filesink location=video.h264
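The reason a muxer matters here: a bare .h264 elementary stream carries no timestamps, so players fall back to an assumed rate (ffmpeg reported 25 fps above). That assumption predicts exactly the speedup observed; a quick check:

```python
def perceived_speedup(capture_fps, assumed_fps=25):
    """How fast a container-less recording appears to play back
    when the player assumes assumed_fps instead of the real rate."""
    return assumed_fps / capture_fps

print(perceived_speedup(10))            # 2.5 -> a 10 fps capture plays 2.5x too fast
print(round(perceived_speedup(30), 2))  # 0.83 -> close enough to look normal
```

Wrapping the stream in a container (mp4mux here) stores the real timestamps, so playback speed comes out right regardless of the capture rate.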

flv from vlc to ffmpeg live video error when no sound temporarily

When we pipe a live stream from VLC to FFmpeg, whenever there is a 5-6 second part of the video with no sound, FFmpeg dies with this log:
[flv @ 0x8b426d0] illegal ac vlc code at 4x6
[flv @ 0x8b426d0] Error at MB: 142
[flv @ 0x8b426d0] concealing 257 DC, 257 AC, 257 MV errors
[mpegts @ 0x8b44e50] dts < pcr, TS is invalid
Is there any way to avoid this problem?
dvch,
This error occurs when you have corrupted bits in your video capture. With RTP or live streams, this happens a lot since UDP will drop packets. FFMPEG tries hard to recover these areas, but there will be some loss depending on format.
Try grabbing a live stream that is encoded in raw h263, one in raw h263+, and one in raw h264, and see what works better for you. VLC should do a good job of the packetization, so it may be the decoder within FFmpeg that is causing the problem. You could try GStreamer.
My working pipeline:
Sender:
gst-launch-0.10 -v filesrc location=June/akiyo_qcif.264 ! h264parse !
video/x-h264 ! rtph264pay pt=96 config-interval=5 ! udpsink host=127.0.0.1
port=42050 sync=false
Receiver:
gst-launch-0.10 udpsrc port=42050 caps="application/x-rtp, media=(string)video,
clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96,
ssrc=(guint)4091714163, clock-base=(guint)4007889851, seqnum-base=(guint)31909"
! rtph264depay ! filesink location=June/test6.264
via Farah at
Gstreamer-devel Post about Streaming RTP and h264
I hope this helps. I have had this same error too, using h263-1998 over RTP/AVP even on a local network; h263 has no hiccups and h264 has no hiccups, but there is something going on with FFmpeg and the h263p format.
Here are the same errors I was getting with FFmpeg:
[h263 @ 0x101015a00] illegal ac vlc code at 12x15
[h263 @ 0x101015a00] Error at MB: 357
[h263 @ 0x101015a00] concealing 44 DC, 44 AC, 44 MV errors
Cheers,
Scott Haines
Try
-b:v 64k <your output method>
Use it before declaring your output.
