Environment: Raspberry Pi v2 camera, Jetson Nano board, Ubuntu 18.04
I started with nvarguscamerasrc and it works:
gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM),width=3820, height=2464, framerate=21/1, format=NV12' ! nvegltransform ! nveglglessink -e
Then I tried running these pipelines:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-h264, width=3280, height=2464' ! filesink
and also:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=640, height=480' ! filesink
and got this output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
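In case it helps anyone hitting the same thing: if I remember correctly, the Raspberry Pi v2 sensor (IMX219) is exposed on the Nano's /dev/video0 as raw Bayer only, so v4l2src cannot negotiate video/x-h264 (the sensor has no H.264 output) or a plain 640x480 video/x-raw stream, which is why both attempts end in not-negotiated. A sketch of the usual Jetson recording path instead goes through nvarguscamerasrc and the hardware encoder; the element names assume a JetPack 4.x install, and the resolution, bitrate and filename are only placeholders:
gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvv4l2h264enc bitrate=8000000 ! h264parse ! qtmux ! filesink location=test.mp4 -e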
This is my actual pipeline
gst-launch-1.0 --gst-debug=3 fdsrc name=fdsrc ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! rtpsbcdepay ! sbcparse ! sbcdec ! audioconvert ! audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! audioresample quality=2 ! appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
I didn't get any audio data, so I tried to find out which element causes the issue.
I used "filesink" to store the data coming from the phone and tried to debug step by step.
Here is the command to capture the RTP-encoded data:
"fdsrc name=fdsrc ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! filesink location=/tmp/bt_capture.dat"
With this, I got some RTP audio packets in the bt_capture.dat file.
I then used this file as the source and added the elements back one by one.
The error shows up at "rtpsbcdepay":
$ gst-launch-1.0 --gst-debug=3 filesrc location=/media/media/Audio/bt_capture.dat ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100,format=time ! rtpsbcdepay ! sbcparse ! sbcdec ! audioconvert ! audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! audioresample quality=2 ! appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
0:00:00.000177132 19448 0x197d800 INFO GST_INIT gst.c:586:init_pre: Initializing GStreamer Core Library version 1.18.4
0:00:00.000618217 19448 0x197d800 INFO GST_INIT gst.c:587:init_pre: Using library installed in /usr/lib
0:00:00.000884197 19448 0x197d800 INFO GST_INIT gst.c:605:init_pre: Linux W-4B17-BM 5.15.21-rt30-qsc+ #1 SMP PREEMPT_RT Thu Oct 20 12:57:18 UTC 2022 x86_64
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
0:00:00.062258692 19448 0x1b5cd20 ERROR rtpbasedepayload gstrtpbasedepayload.c:662:gst_rtp_base_depayload_handle_event: Segment with non-TIME format not supported
0:00:00.062676736 19448 0x1b5cd20 ERROR rtpbasedepayload gstrtpbasedepayload.c:662:gst_rtp_base_depayload_handle_event: Segment with non-TIME format not supported
0:00:00.064241297 19448 0x1b5cd20 WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: Internal data stream error.
0:00:00.064581497 19448 0x1b5cd20 WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: streaming stopped, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
../../unpacked/gstreamer-1.18.4/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming stopped, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Can anyone please suggest how to overcome this error?
P.S.: I have tried format=time, but it didn't help and threw a syntax error.
Adding a "rtpjitterbuffer" along with "do-timestamp=TRUE" on fdsrc solves the problem, though with some warnings. Still, these warnings didn't affect playback (maybe some packets go missing?).
=====================
0:01:21.926788667 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 1874 samples = 0:00:00.042494331
0:01:21.927975600 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 7968 samples = 0:00:00.180680272
0:01:22.122851228 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 1874 samples = 0:00:00.042494331
Pipeline:
"fdsrc name=fdsrc do-timestamp=TRUE ! "
"application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! "
"rtpjitterbuffer !"
"rtpsbcdepay ! "
"sbcparse ! "
"sbcdec ! "
"audioconvert ! "
"audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! "
"audioresample quality=2 ! "
"appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
I would like to stream over RTSP using GStreamer pipeline elements. First, I checked with gst-inspect-1.0 that rtspclientsink is available:
xilinx-k26-starterkit-2020_2:/# gst-inspect-1.0 | grep rtsp
rtspclientsink: rtspclientsink: RTSP RECORD client
rtsp: rtspsrc: RTSP packet receiver
rtsp: rtpdec: RTP Decoder
Then I wrote the simplest possible pipeline and tested it with videotestsrc as the source and kmssink as the sink. The following pipeline works well:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! kmssink bus-id=fd4a0000.zynqmp-display fullscreen-overlay=1 sync=false
Then I changed the sink to rtspclientsink:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! rtspclientsink location=rtsp://localhost:554/test
However, even with this simple pipeline, the stream could not be started and I encountered the following error:
xilinx-k26-starterkit-2020_2:/# gst-launch-1.0 videotestsrc ! video/x-raw, width=1920,height=1080 ! rtspclientsink location=rtsp://localhost:554/test
Setting pipeline to PAUSED ...
Pipeline is PREROLLED ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://localhost:554/test
ERROR: from element /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: Could not open resource for reading and writing.
Additional debug info:
../../../gst-rtsp-server-1.16.1/gst/rtsp-sink/gstrtspclientsink.c(3236): gst_rtsp_client_sink_connect_to_server (): /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Could anyone enlighten me about the error and how I can use rtspclientsink as a sink? I also considered streaming with a script (given below) that uses gst-rtsp-server, but I wonder whether it is possible to use rtspclientsink as a pipeline element. Thanks.
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>
#define DEFAULT_RTSP_PORT "9001"
...(some code)
/* create a server instance */
server = gst_rtsp_server_new ();
g_object_set (server, "service", port, NULL);
mounts = gst_rtsp_server_get_mount_points (server);
factory = gst_rtsp_media_factory_new ();
gst_rtsp_media_factory_set_launch (factory, argv[1]);
gst_rtsp_media_factory_set_shared (factory, TRUE);
...(some code that creates pipeline and calls rtsp stream function)
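If that snippet follows the upstream test-launch example from gst-rtsp-server, the elided part presumably attaches the factory to a mount point; the binary name and the /test mount point below are assumptions taken from that example, not from your code:
./test-launch "( videotestsrc ! video/x-raw,width=1920,height=1080 ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"
and the stream could then be played back from another machine with something like:
gst-launch-1.0 rtspsrc location=rtsp://<board-ip>:9001/test ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink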
rtspclientsink needs an RTSP server to publish (RECORD) to, so it fails with "Failed to connect" when nothing is listening at the target URL. Download rtsp-simple-server from
https://github.com/aler9/rtsp-simple-server/releases
extract and run it:
tar xvf rtsp-simple-server_v0.19.3_linux_amd64.tar.gz
./rtsp-simple-server
It will tell you which ports it is listening on:
2022/07/31 13:23:34 INF rtsp-simple-server v0.19.3
2022/07/31 13:23:34 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2022/07/31 13:23:34 INF [RTMP] listener opened on :1935
2022/07/31 13:23:34 INF [HLS] listener opened on :8888
Point your RTSP stream at it (change localhost to the host running rtsp-simple-server if you are sending over the network):
gst-launch-1.0 -v videotestsrc ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! x264enc speed-preset=veryfast tune=zerolatency bitrate=800 ! rtspclientsink location=rtsp://localhost:8554/mystream
Check the rtsp-simple-server console log:
2022/07/31 13:26:02 INF [RTSP] [conn 192.168.1.130:34932] opened
2022/07/31 13:26:02 INF [RTSP] [session 247376253] created by 192.168.1.130:34932
2022/07/31 13:26:03 INF [RTSP] [session 247376253] is publishing to path 'mystream', 1 track with UDP
Open VLC -> Media -> Open Network Stream, enter rtsp://localhost:8554/mystream (or the server's address), and press Play.
VLC should show a test pattern.
You will then see the following in the rtsp-simple-server console log:
2022/07/31 13:27:10 INF [RTSP] [conn 127.0.0.1:53900] opened
2022/07/31 13:27:10 INF [RTSP] [session 749381985] created by 127.0.0.1:53900
2022/07/31 13:27:10 INF [RTSP] [session 749381985] is reading from path 'mystream', 1 track with UDP
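If you would rather verify with GStreamer instead of VLC, something along these lines should show the same test pattern (the decoder elements are assumptions about what is installed on the viewing machine):
gst-launch-1.0 rtspsrc location=rtsp://localhost:8554/mystream latency=200 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink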
I am not sure why this pipeline is breaking. I have GStreamer installed on Linux following the website's exact instructions. Any ideas?
gst-launch-1.0 v4l2src device=/dev/video0 ! videoscale ! video/x-raw, width=2592, height=600 ! autovideosink -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000093207
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
If I change it to:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! videoscale ! video/x-raw, width=2592, height=600 ! autovideosink -v
it works, but why does it not work the other way?
Your webcam delivers raw video in whatever pixel format it natively produces (typically something like YUY2). videoconvert does not encode to a codec; it converts between raw video formats, turning the camera's native output into something that videoscale and autovideosink can actually negotiate. Without it, the requested caps cannot be satisfied and v4l2src fails with not-negotiated.
So gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! videoscale ! video/x-raw, width=2592, height=600 ! autovideosink tells GStreamer to grab the raw video from the camera, convert it into a compatible raw format, scale it, and display it.
I really recommend that when you have doubts about an element, you call gst-inspect-1.0 <element name> to see its description and properties.
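A quick way to see why the caps could not be negotiated without videoconvert is to list what the camera actually offers, either with v4l2-ctl (from the v4l-utils package, which may need installing) or with GStreamer's device monitor:
v4l2-ctl --list-formats-ext -d /dev/video0
gst-device-monitor-1.0 Video/Source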
I installed a fresh version of Ubuntu 18.04 and I am trying to demux a single channel from a Matroska file by running the exact command from the GStreamer documentation (https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html).
GStreamer is current:
gst-launch-1.0 --version
gst-launch-1.0 version 1.14.1
GStreamer 1.14.1
https://launchpad.net/distros/ubuntu/+source/gstreamer1.0
The problem is that GStreamer keeps complaining:
GstMatroskaDemux:d: Delayed linking failed
Error:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:d: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:d:
failed delayed linking pad video_00 of GstMatroskaDemux named d to some pad of GstMatroskaMux named matroskamux0
ERROR: from element /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
How can I get this to work?
I remembered that WebM actually uses the Matroska container format. The example in the documentation is faulty/out of date: the pad naming is wrong. Instead of video_00, use video_0:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_0 ! matroskamux ! filesink location=sintel_video.mkv
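You can confirm the naming scheme on your own install by looking at the demuxer's pad templates; on 1.14 they should be video_%u / audio_%u / subtitle_%u, so the first video pad comes out as video_0:
gst-inspect-1.0 matroskademux | grep "template:"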
I have installed gstreamer-1.0 on my target board.
When I run the below command:
gst-launch-1.0 filesrc location="/home/test.mp4" ! decodebin ! videoconvert ! xvimagesink
I am getting the following output:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1765): gst_xvimagesink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
Setting pipeline to NULL ...
Freeing pipeline ..
Please suggest some solution to make the pipeline work.
Regards,
Sainath
Hm, do you have libxv somewhere?
What happens when you check gst-inspect-1.0 xvimagesink?
The output also mentions the plugin's library path (| grep Filename); you can run ldd on it:
ldd /usr/local/lib/gstreamer-1.0/libgstxvimagesink.so
I had some problems with this as well. Try installing libxv-dev or libxv1; the dev package is probably the safer choice.
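If the plugin and libxv check out, note that the "(null)" in "Could not open display (null)" usually just means no X display is set in the environment you launch from; exporting one before running is worth a try (":0" is an assumption about your setup, and it only helps if an X server is actually running on the board):
export DISPLAY=:0
gst-launch-1.0 filesrc location=/home/test.mp4 ! decodebin ! videoconvert ! xvimagesink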