As above: I want to artificially increase the loudness of a microphone recording of a live scene. I have used the following pipeline with the volume plugin:
pulsesrc volume=8.0 \
! audioconvert \
! audioresample \
! volume volume=1.0 \
! audio/x-raw,rate=8000,channels=1,depth=8,format=S16LE \
! filesink location=record.wav
I have also tried audioamplify, but it seems like both are capped at the loudness of the raw recording.
Thanks for your help
The volume property of the volume element has a range from 0.0 to 10.0, where 1.0 represents 100% volume.
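So setting it above 1.0 boosts the signal past the recorded level. A minimal sketch of an amplifying pipeline, assuming your 8 kHz mono S16LE target (the depth field is dropped, since format=S16LE already implies 16-bit); wavenc is added so the file gets a proper WAV header, and large gains will clip:
gst-launch-1.0 pulsesrc \
! audioconvert \
! audioresample \
! audio/x-raw,rate=8000,channels=1,format=S16LE \
! volume volume=4.0 \
! wavenc \
! filesink location=record.wav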
Context
So I have two GStreamer pipelines: one to send audio up (PC to Pi) to a Raspberry Pi, and another to send audio down (Pi to PC).
The audio down is getting sound from a USB mic and the audio up is outputting sound to a Bluetooth speaker.
The libraries used are ALSA with BlueZ.
Problem
Here is where it gets complicated:
[✓] The audio up alone, is working
[✓] The audio down alone is working
But:
[✗] If the audio down runs at the same time, the audio up does not work
Somehow, Bluetooth audio out does not work while I am simultaneously listening to a capture device.
However:
[✓] If the audio up is outputting to a non-Bluetooth device (e.g. the jack connector), it works fine, even simultaneously with the audio down.
In other words: the output to the Bluetooth device works only if I am not listening to a capture device at the same time, whereas if the output device is not Bluetooth, it works no matter what.
"Not working" means that the audio cannot be heard on the speaker.
Question
How can I get the Bluetooth device to work with GStreamer even when a capture device is listening?
Code
Audio Down
Send audio down
import gi
gi.require_version('Gst', '1.0')
# gst-launch-1.0
cmd = ' alsasrc device=hw:1,0 ! audio/x-raw ! queue '
cmd += ' ! audioresample ! audioconvert ! avenc_ac3 ! queue '
cmd += f' ! rtpac3pay ! udpsink host={ip} port={port} '
Audio Up
Receive audio and output to Bluetooth speaker
# gst-launch-1.0
cmd = f'udpsrc port={port} ! decodebin ! audioresample ! audioconvert '
cmd += f' ! alsasink device=bluealsa:DEV={MAC_ADDRESS} '
Receive audio and output to normal speaker (e.g. jack or USB)
# gst-launch-1.0
cmd = f'udpsrc port={port} ! decodebin ! audioresample ! audioconvert '
cmd += ' ! alsasink device=hw:2,0 '
Notes
I am using the Python bindings for GStreamer.
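For reference, a minimal sketch of how such a command string can actually be launched with the Python bindings (GObject Introspection); the ip and port values here are hypothetical placeholders:
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

ip, port = '192.168.1.10', 5004  # hypothetical placeholders, replace with real values

# Audio down: USB mic -> AC-3 over RTP/UDP (same string as above)
cmd = ' alsasrc device=hw:1,0 ! audio/x-raw ! queue '
cmd += ' ! audioresample ! audioconvert ! avenc_ac3 ! queue '
cmd += f' ! rtpac3pay ! udpsink host={ip} port={port} '

pipeline = Gst.parse_launch(cmd)
pipeline.set_state(Gst.State.PLAYING)

# Block until an error or EOS is posted on the bus, then clean up.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)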
I want to save a video using GStreamer in the following environment.
Hardware: Raspberry Pi (BCM2709, Revision: a22082)
OS: Raspbian GNU/Linux 9 (stretch)
Webcam: Logitech HD WEBCAM C270
When the video is saved with the following command, the playback speed becomes much faster than expected.
$ gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,format=YV12,framerate=10/1 ! videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! filesink location=video.h264
However, if I change framerate to 30/1, I can watch it without problems.
I checked the frame rate of the video with the following command:
$ ffmpeg -i video.h264
Input #0, h264, from 'video.h264':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p (progressive), 640x480 [SAR 1:1 DAR 4:3], 25 fps, 25 tbr, 1200k tbn, 50 tbc
The frame rate is 25/1.
I tried setting the frame rate on the hardware with the following command:
$ v4l2-ctl -d /dev/video0 -p 10
Frame rate set to 10.000 fps
But there was no effect.
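(Side note, in case it helps narrow things down: you can list which resolution/framerate combinations the camera driver actually advertises; this assumes the C270 is /dev/video0.)
$ v4l2-ctl -d /dev/video0 --list-formats-ext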
I also tried the videorate plugin:
$ gst-launch-1.0 v4l2src ! videorate ! video/x-raw,width=640,height=480,format=YV12,framerate=10/1 ! videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! filesink location=video.h264
But the results got worse.
Even when playback starts, the video stays frozen, and after a while it plays at high speed.
The version of GStreamer is 1.10.4.
This is the only version available with apt-get.
I tried to compile from source code, but building on Raspbian is not supported and it is very difficult.
Update:
I tried Lad's suggestion (thanks!),
but the following error message was shown:
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,format=YV12,framerate=10/1 ! videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! mp4mux ! filesink location=video.h264
...
ERROR: from element /GstPipeline:pipeline0/GstMP4Mux:mp4mux0: Could not multiplex stream.
Additional debug info:
gstqtmux.c(3391): gst_qt_mux_add_buffer (): /GstPipeline:pipeline0/GstMP4Mux:mp4mux0:
Buffer has no PTS.
What is wrong?
You are missing the mp4mux element in the pipeline. Try the following pipeline:
$ gst-launch-1.0 v4l2src !
video/x-raw,width=640,height=480,format=YV12,framerate=10/1 !
videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! mp4mux !
filesink location=video.h264
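One caveat on top of that answer: mp4mux needs the pipeline to shut down cleanly so it can finalize the file, so running gst-launch-1.0 with the -e flag (send EOS on Ctrl-C) is usually required, and an .mp4 extension makes the container explicit. A hedged variant:
$ gst-launch-1.0 -e v4l2src ! \
video/x-raw,width=640,height=480,format=YV12,framerate=10/1 ! \
videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! mp4mux ! \
filesink location=video.mp4
If mp4mux still reports "Buffer has no PTS", forcing timestamps at the source with v4l2src do-timestamp=true is sometimes suggested, though I cannot confirm it fixes this particular setup.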
When I record both audio and video using ffmpeg the audio recording cuts out for the last two seconds of the video.
ffmpeg \
-f v4l2 -i /dev/video0 \
-f alsa -i hw:2 \
samples/video.mp4
I have tried using different audio and video codecs, as well as different container formats, and I have noticed that with the mpg format instead of mp4 the audio works better.
I have also tried different codecs with mp4 and checked their compatibility on Wikipedia, but they don't seem to matter much.
So adding the following options seems to solve the problem:
-preset ultrafast -threads 0
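For clarity, a sketch of the full command with those options in place; they are output options, so they go before the output file (this assumes the default H.264 encoder, which understands -preset):
ffmpeg \
-f v4l2 -i /dev/video0 \
-f alsa -i hw:2 \
-preset ultrafast -threads 0 \
samples/video.mp4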
EDIT: It's now supported
I want to use a Kinect inside of Chrome using WebRTC. On Linux, UYVY is not supported.
Is it possible to create a new device descriptor (/dev/video1) transforming /dev/video0 from UYVY to YUYV?
We need to create a virtual device (loopback)
git clone git@github.com:umlaeute/v4l2loopback.git
cd v4l2loopback
make
sudo make install
sudo modprobe v4l2loopback   # creates /dev/video2 or /dev/videox ...
install gstreamer
sudo apt-get install gstreamer0.10
convert from sRGB to YUY2 (YUYV)
gst-launch-0.10 -v v4l2src device=/dev/video0 ! \
ffmpegcolorspace ! \
video/x-raw-rgb ! \
ffmpegcolorspace ! \
video/x-raw-yuv,format=\(fourcc\)YUY2 ! \
v4l2sink device=/dev/video2
test if everything works
gst-launch v4l2src device=/dev/video2 ! xvimagesink
v4l2-ctl -d /dev/video2 --all
Driver Info (not using libv4l2):
Driver name : v4l2 loopback
Card type : Dummy video device (0x0000)
Bus info : v4l2loopback:0
Driver version: 0.8.0
Capabilities : 0x05000001
Video Capture
Read/Write
Streaming
Video input : 0 (loopback: ok)
Video output: 0 (loopback in)
Format Video Capture:
Width/Height : 1280/1024
Pixel Format : 'YUYV'
Field : None
Bytes per Line: 2560
Size Image : 2621440
Colorspace : SRGB
Format Video Output:
Width/Height : 1280/1024
Pixel Format : 'YUYV'
Field : None
Bytes per Line: 2560
Size Image : 2621440
Colorspace : SRGB
Streaming Parameters Video Capture:
Frames per second: 30.000 (30000/1000)
Read buffers : 8
Streaming Parameters Video Output:
Frames per second: 30.000 (30000/1000)
Write buffers : 8
keep_format (bool) : default=0 value=0
sustain_framerate (bool) : default=0 value=0
timeout (int) : min=0 max=100000000 step=1 default=0 value=0
timeout_image_io (bool) : default=0 value=0
Now this should work in Chrome via the Dummy video device (0x0000).
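(Not part of the original answer, but possibly useful: newer v4l2loopback releases accept module parameters such as video_nr, card_label and exclusive_caps, the last of which some browsers need before they will list the device; whether your version supports them depends on how recent it is.)
sudo modprobe v4l2loopback video_nr=2 card_label="Kinect loopback" exclusive_caps=1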
Just to update the answer from 2014.
The gst-launch-1.0 syntax differs a bit, so the new command to use, which worked for me, is:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
videoconvert ! \
video/x-raw, format=RGB ! \
videoconvert ! \
video/x-raw,format=YUY2 ! \
v4l2sink device=/dev/video1
where /dev/video1 is the loopback device.
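(To sanity-check the loopback feed outside the browser, you can point a quick viewer pipeline at it; this assumes a desktop session where the automatically selected video sink works.)
gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! autovideosink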
When we feed a live stream from VLC to FFmpeg, whenever there is a 5-6 second part of the video with no sound, FFmpeg dies with this log:
[flv @ 0x8b426d0] illegal ac vlc code at 4x6
[flv @ 0x8b426d0] Error at MB: 142
[flv @ 0x8b426d0] concealing 257 DC, 257 AC, 257 MV errors
[mpegts @ 0x8b44e50] dts < pcr, TS is invalid
Is there any way to avoid this problem?
dvch,
This error occurs when you have corrupted bits in your video capture. With RTP or live streams, this happens a lot since UDP will drop packets. FFMPEG tries hard to recover these areas, but there will be some loss depending on format.
Try grabbing a live stream that is encoded in raw h263, one in raw h263+, and one in raw h264, and see what works better for you. VLC should do a good job of the packetization, so it may be the decoder within FFmpeg that is causing the problem. You could try GStreamer.
My working pipeline:
Sender:
gst-launch-0.10 -v filesrc location=June/akiyo_qcif.264 ! h264parse !
video/x-h264 ! rtph264pay pt=96 config-interval=5 ! udpsink host=127.0.0.1
port=42050 sync=false
Receiver:
gst-launch-0.10 udpsrc port=42050 caps="application/x-rtp, media=(string)video,
clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96,
ssrc=(guint)4091714163, clock-base=(guint)4007889851, seqnum-base=(guint)31909"
! rtph264depay ! filesink location=June/test6.264
via Farah at
Gstreamer-devel Post about Streaming RTP and h264
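(If you are on GStreamer 1.0 rather than 0.10, an equivalent sketch might look like the following; this is my adaptation, not part of the original post, and the receiver caps are trimmed to the fields rtph264depay normally needs.)
Sender:
gst-launch-1.0 -v filesrc location=June/akiyo_qcif.264 ! h264parse ! \
rtph264pay pt=96 config-interval=5 ! udpsink host=127.0.0.1 port=42050 sync=false
Receiver:
gst-launch-1.0 udpsrc port=42050 \
caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
! rtph264depay ! filesink location=June/test6.264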
I hope this helps. I have had this same error too, using H263-1998 over RTP/AVP even on a local network; h263 has no hiccups, and h264 has no hiccups, but there is something going on with FFmpeg and the h263p format.
Here are the same errors I was getting with FFmpeg:
[h263 @ 0x101015a00] illegal ac vlc code at 12x15
[h263 @ 0x101015a00] Error at MB: 357
[h263 @ 0x101015a00] concealing 44 DC, 44 AC, 44 MV errors
Cheers,
Scott Haines
Try
-b:v 64k <your output>
and use it before declaring your output.
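For example, with hypothetical input and output placeholders, the bitrate option goes right before the output:
ffmpeg -i <your input> -b:v 64k <your output>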