I am not sure why this pipeline is breaking. I have GStreamer installed on Linux following the website's exact instructions. Any ideas?
gst-launch-1.0 v4l2src device=/dev/video0 ! videoscale ! video/x-raw, width=2592, height=600 ! autovideosink -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000093207
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
If I change it to:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! videoscale ! video/x-raw, width=2592, height=600 ! autovideosink -v
it works, but why does it not work the other way?
Your webcam outputs raw video in whatever pixel format it natively supports. Adding videoconvert converts that raw video into a pixel format that the videoscale element can work with, and that the autovideosink element can display. Without it, the camera's native format may not match what the downstream elements accept, so caps negotiation fails (the "not-negotiated" error you saw).
So gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! videoscale ! video/x-raw, width=2592, height=600 ! autovideosink tells GStreamer to grab the raw video from the camera, convert it to a compatible format, scale it, and display it.
I really recommend that when you have doubts about an element, you call gst-inspect-1.0 <element name> to see its description and properties.
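For instance, this is the kind of check gst-inspect-1.0 enables; a minimal sketch (guarded so it degrades gracefully if the GStreamer tools are not installed) that prints videoconvert's pad templates, i.e. the raw formats it can negotiate on each side:

```shell
# Print the "Pad Templates" section of videoconvert's description: these are
# the caps it can accept and produce, which is what caps negotiation matches.
OUT=$(gst-inspect-1.0 videoconvert 2>/dev/null | sed -n '/Pad Templates/,/^$/p')
# Fall back to a note if gst-inspect-1.0 is not on PATH.
[ -n "$OUT" ] || OUT="gst-inspect-1.0 not available on this machine"
printf '%s\n' "$OUT"
```

If the camera's native format does not appear in videoscale's templates, inserting videoconvert between them is exactly what bridges the gap.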
This is my actual pipeline
gst-launch-1.0 --gst-debug=3 fdsrc name=fdsrc ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! rtpsbcdepay ! sbcparse ! sbcdec ! audioconvert ! audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! audioresample quality=2 ! appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
I didn't get audio data, so I tried to find out which element causes the issue.
I used filesink to store the data coming from the phone and debugged step by step.
Here is the command to capture the RTP-encoded data:
"fdsrc name=fdsrc ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! filesink location=/tmp/bt_capture.dat"
With this, I got some RTP audio packets in the bt_capture.dat file.
I then used this file as the source and added the elements back one by one.
I get the error at rtpsbcdepay:
$ gst-launch-1.0 --gst-debug=3 filesrc location=/media/media/Audio/bt_capture.dat ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100,format=time ! rtpsbcdepay ! sbcparse ! sbcdec ! audioconvert ! audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! audioresample quality=2 ! appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
0:00:00.000177132 19448 0x197d800 INFO GST_INIT gst.c:586:init_pre: Initializing GStreamer Core Library version 1.18.4
0:00:00.000618217 19448 0x197d800 INFO GST_INIT gst.c:587:init_pre: Using library installed in /usr/lib
0:00:00.000884197 19448 0x197d800 INFO GST_INIT gst.c:605:init_pre: Linux W-4B17-BM 5.15.21-rt30-qsc+ #1 SMP PREEMPT_RT Thu Oct 20 12:57:18 UTC 2022 x86_64
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
0:00:00.062258692 19448 0x1b5cd20 ERROR rtpbasedepayload gstrtpbasedepayload.c:662:gst_rtp_base_depayload_handle_event: Segment with non-TIME format not supported
0:00:00.062676736 19448 0x1b5cd20 ERROR rtpbasedepayload gstrtpbasedepayload.c:662:gst_rtp_base_depayload_handle_event: Segment with non-TIME format not supported
0:00:00.064241297 19448 0x1b5cd20 WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: Internal data stream error.
0:00:00.064581497 19448 0x1b5cd20 WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: streaming stopped, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
../../unpacked/gstreamer-1.18.4/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming stopped, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Can anyone please suggest how to overcome this error?
P.S.: I have used format=time, but it didn't help and threw a syntax error.
Adding an rtpjitterbuffer, along with do-timestamp=TRUE on fdsrc, solves the problem, though with some warnings. Still, these warnings didn't affect playback (maybe some packets are missing?).
=====================
0:01:21.926788667 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 1874 samples = 0:00:00.042494331
0:01:21.927975600 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 7968 samples = 0:00:00.180680272
0:01:22.122851228 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 1874 samples = 0:00:00.042494331
Pipeline:
fdsrc name=fdsrc do-timestamp=TRUE !
application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 !
rtpjitterbuffer !
rtpsbcdepay !
sbcparse !
sbcdec !
audioconvert !
audio/x-raw,layout=interleaved,format=F32LE,channels=2 !
audioresample quality=2 !
appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
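For reference, the same fixed pipeline assembled as a single shell string (all element names and caps are copied from the snippet above; nothing new beyond the do-timestamp=TRUE and rtpjitterbuffer additions):

```shell
# Build the fixed pipeline description. rtpjitterbuffer needs timestamped
# buffers to order packets, which is why do-timestamp=TRUE is set on fdsrc.
P="fdsrc name=fdsrc do-timestamp=TRUE"
P="$P ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100"
P="$P ! rtpjitterbuffer"
P="$P ! rtpsbcdepay ! sbcparse ! sbcdec ! audioconvert"
P="$P ! audio/x-raw,layout=interleaved,format=F32LE,channels=2"
P="$P ! audioresample quality=2"
P="$P ! appsink name=appsink caps=\"audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2\""
echo "$P"
# Launch with: gst-launch-1.0 $P   (needs the live Bluetooth fd on stdin)
```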
Environment: Raspberry Pi v2 camera, Jetson Nano board, Ubuntu 18.04.
I started with nvarguscamerasrc and it's working:
gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM),width=3820, height=2464, framerate=21/1, format=NV12' ! nvegltransform ! nveglglessink -e
I then tried running these pipelines:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-h264, width=3280, height=2464' ! filesink
and also:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=640, height=480' ! filesink
and got this output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
I installed a fresh version of Ubuntu 18.04, and I am trying to demux a single channel from a Matroska file by running the exact command from the GStreamer documentation (https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html).
GStreamer is current:
gst-launch-1.0 --version
gst-launch-1.0 version 1.14.1
GStreamer 1.14.1
https://launchpad.net/distros/ubuntu/+source/gstreamer1.0
The problem is that GStreamer keeps complaining:
GstMatroskaDemux:d: Delayed linking failed
Error:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:d: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:d:
failed delayed linking pad video_00 of GstMatroskaDemux named d to some pad of GstMatroskaMux named matroskamux0
ERROR: from element /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
How can I get this to work?
I remembered that WebM uses the Matroska container format. The example in the documentation is faulty/out of date: the pad naming is wrong. Instead of video_00, use video_0:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_0 ! matroskamux ! filesink location=sintel_video.mkv
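A quick way to avoid guessing pad names is to check the demuxer's pad templates; a small sketch (guarded in case gst-inspect-1.0 is not installed) that greps them out:

```shell
# matroskademux exposes "sometimes" pads from templates video_%u and
# audio_%u, so the first video stream appears as video_0 (not video_00,
# which was the old 0.10-era naming the tutorial still shows).
PADS=$(gst-inspect-1.0 matroskademux 2>/dev/null | grep -E 'video_|audio_')
[ -n "$PADS" ] || PADS="gst-inspect-1.0 not available; 1.x templates are video_%u / audio_%u"
printf '%s\n' "$PADS"
```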
I have installed gstreamer-1.0 on my target board.
When I run the command below:
gst-launch-1.0 filesrc location="/home/test.mp4" ! decodebin ! videoconvert ! xvimagesink
I am getting the following output:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1765): gst_xvimagesink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
Setting pipeline to NULL ...
Freeing pipeline ...
Please suggest a solution to make the pipeline work.
Regards,
Sainath
Hm, do you have libxv somewhere?
What happens when you check gst-inspect-1.0 xvimagesink?
The library path is listed there too (| grep Filename); you can run ldd on it:
ldd /usr/local/lib/gstreamer-1.0/libgstxvimagesink.so
I had some problems with this as well. Try installing libxv-dev or libxv1; the dev package is probably the safer bet.
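Also worth noting: the log says "Could not open display (null)", which usually means the DISPLAY environment variable is not set at all (common when running from a serial console or a plain ssh session), so it is worth checking that before chasing libraries; a trivial sketch:

```shell
# xvimagesink talks to an X server; "(null)" in the error message is the
# unset DISPLAY variable being printed.
MSG="DISPLAY is unset - try: export DISPLAY=:0 (or whichever display your X server uses)"
[ -n "${DISPLAY:-}" ] && MSG="Using DISPLAY=$DISPLAY"
echo "$MSG"
```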
I'm trying to use GStreamer to create a seekable (indexed) video file on Linux. My pipelines work for recording and saving the data, but I can't figure out how to index the data so I can seek using gst_element_seek_simple() [http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+4%3A+Time+management].
I have seen this post: Gstreamer video output position tracking and seeking, and validated that I am sending an EOS on the pipeline with -e.
Here is my pipeline and its output. I'm teeing it to both display on my embedded system's screen and save to the M4V file.
# gst-launch-0.10 -e v4l2src ! \
tee name=t ! \
queue ! \
video/x-raw-yuv,width=320,height=240 ! \
videoflip method=clockwise ! \
ffmpegcolorspace ! \
fbdevsink t. ! \
queue ! \
ffmpegcolorspace ! \
ffenc_mpeg4 ! \
filesink location=output.m4v
Here is the output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
WARNING: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not get parameters on device '/dev/video0'
Additional debug info:
v4l2src_calls.c(235): gst_v4l2src_set_capture (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
system error: Inappropriate ioctl for device
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
(gst-launch-0.10:534): GLib-CRITICAL **: Source ID 62 was not found when attempting to remove it
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 10057977251 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
And here is the output of gst-discoverer-0.10 on my new file:
beaglebone:# gst-discoverer-0.10 output.m4v
Analyzing file:///output.m4v
Done discovering file:///output.m4v
Topology:
video: MPEG-4 Video
Properties:
Duration: 0:00:00.000000000
Seekable: no
Thanks
You need to store the result in a seekable/indexed format. For that, you can put the MPEG-4 video inside a container such as MP4 or Matroska: use "! mp4mux ! filesink" or "! matroskamux ! filesink" to wrap it in one of those formats, which should make it seekable.
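As a concrete sketch (0.10 element names to match the question; matroskamux is used here, and mp4mux would also work since the -e flag already forces the EOS that lets the muxer finalize its index on shutdown):

```shell
# Same pipeline as in the question, with matroskamux inserted before filesink
# so the output is a container with an index (i.e. seekable).
P="v4l2src ! tee name=t"
P="$P ! queue ! video/x-raw-yuv,width=320,height=240 ! videoflip method=clockwise ! ffmpegcolorspace ! fbdevsink"
P="$P t. ! queue ! ffmpegcolorspace ! ffenc_mpeg4 ! matroskamux ! filesink location=output.mkv"
echo "gst-launch-0.10 -e $P"
```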
Side note: GStreamer 0.10 has been obsolete and unmaintained for over two years; please upgrade to 1.0.
http://gstreamer.freedesktop.org/ is the official GStreamer website, and you will find the releases for the 1.x versions there. The gstreamer.com website is not related to the official project; if you read the text on gstreamer.com, you will see that you should be using the official repositories and installers.