I want to stream video from a TS-WPTCAM IP camera.
I can stream it directly in VLC using rtsp://192.168.100.50:19112/ipcam_h264.sdp, but when I try with GStreamer it does not play the video.
Below is the output.
Lnx-Workstation:~$ gst-launch-1.0 -v rtspsrc location="rtsp://192.168.100.50:19112/ipcam_h264.sdp" name=demux demux. ! queue max-size-buffers=2 ! rtph264depay ! autovideosink sync=false
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayGBM\)\ gldisplaygbm0";
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.100.50:19112/ipcam_h264.sdp
0:00:20.130738709 13686 0x5632fcf9d2d0 ERROR default gstrtspconnection.c:1004:gst_rtsp_connection_connect_with_response: failed to connect: Socket I/O timed out
0:00:20.130840128 13686 0x5632fcf9d2d0 ERROR rtspsrc gstrtspsrc.c:4702:gst_rtsp_conninfo_connect:<demux> Could not connect to server. (Generic error)
0:00:20.130850670 13686 0x5632fcf9d2d0 WARN rtspsrc gstrtspsrc.c:7469:gst_rtspsrc_retrieve_sdp:<demux> error: Failed to connect. (Generic error)
0:00:20.130893392 13686 0x5632fcf9d2d0 WARN rtspsrc gstrtspsrc.c:7548:gst_rtspsrc_open:<demux> can't get sdp
0:00:20.130917551 13686 0x5632fcf9d2d0 WARN rtspsrc gstrtspsrc.c:5628:gst_rtspsrc_loop:<demux> we are not connected
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:demux: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:demux:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I also tried to play the video using playbin like this:
Lnx-Workstation:~$ gst-launch-1.0 -v playbin uri=rtsp://192.168.100.50:19112/ipcam_h264.sdp uridecodebin0::source::latency=100
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: ring-buffer-max-size = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-size = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-duration = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: use-buffering = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: download = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: uri = rtsp://192.168.100.50:19112/ipcam_h264.sdp
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: connection-speed = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: latency = 100
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: source = "\(GstRTSPSrc\)\ source"
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.100.50:19112/ipcam_h264.sdp
0:00:20.040912220 13549 0x55e2654b5e80 ERROR default gstrtspconnection.c:1004:gst_rtsp_connection_connect_with_response: failed to connect: Socket I/O timed out
0:00:20.041032034 13549 0x55e2654b5e80 ERROR rtspsrc gstrtspsrc.c:4702:gst_rtsp_conninfo_connect:<source> Could not connect to server. (Generic error)
0:00:20.041058980 13549 0x55e2654b5e80 WARN rtspsrc gstrtspsrc.c:7469:gst_rtspsrc_retrieve_sdp:<source> error: Failed to connect. (Generic error)
0:00:20.041160200 13549 0x55e2654b5e80 WARN rtspsrc gstrtspsrc.c:7548:gst_rtspsrc_open:<source> can't get sdp
0:00:20.041185827 13549 0x55e2654b5e80 WARN rtspsrc gstrtspsrc.c:5628:gst_rtspsrc_loop:<source> we are not connected
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Note: playbin does work with a file source.
How can I play RTSP video using GStreamer?
EDIT:
As per Gregory's answer:
Lnx-Workstation:~$ gst-launch-1.0 -v rtspsrc location="rtsp://192.168.100.50:19112/ipcam_h264.sdp" ! queue max-size-buffers=2 ! rtph264depay ! h264parse ! decodebin ! autovideosink sync=false
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.100.50:19112/ipcam_h264.sdp
0:00:20.108314006 3310 0x563cd86e8850 ERROR default gstrtspconnection.c:1004:gst_rtsp_connection_connect_with_response: failed to connect: Socket I/O timed out
0:00:20.108425505 3310 0x563cd86e8850 ERROR rtspsrc gstrtspsrc.c:4702:gst_rtsp_conninfo_connect:<rtspsrc0> Could not connect to server. (Generic error)
0:00:20.108449668 3310 0x563cd86e8850 WARN rtspsrc gstrtspsrc.c:7469:gst_rtspsrc_retrieve_sdp:<rtspsrc0> error: Failed to connect. (Generic error)
0:00:20.108540016 3310 0x563cd86e8850 WARN rtspsrc gstrtspsrc.c:7548:gst_rtspsrc_open:<rtspsrc0> can't get sdp
0:00:20.108569689 3310 0x563cd86e8850 WARN rtspsrc gstrtspsrc.c:5628:gst_rtspsrc_loop:<rtspsrc0> we are not connected
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
The error seems to be the same.
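The logs show the failure is identical in every attempt: after 20 seconds, gst_rtsp_connection_connect_with_response reports "Socket I/O timed out", meaning the TCP connection to the camera never completes at all, so the elements after rtspsrc are irrelevant. Since VLC plays the same URL, it is worth checking whether VLC uses a proxy or another network interface. A quick way to test raw reachability of the RTSP port is the sketch below (a bash-only sketch using the /dev/tcp pseudo-device; host and port are taken from the question):

```shell
#!/usr/bin/env bash
# Probe a TCP port the way rtspsrc tries to connect to it.
check_rtsp_port() {
  local host=$1 port=$2
  # bash's /dev/tcp pseudo-device opens a plain TCP connection;
  # give up after 5 s instead of rtspsrc's 20 s default timeout
  timeout 5 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null
}

# host and port as used in the question
if check_rtsp_port 192.168.100.50 19112; then
  echo "RTSP port reachable"
else
  echo "RTSP port unreachable - check interface, routes, firewall or proxy"
fi
```

If the probe fails here too, the issue is network configuration, not GStreamer.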
You are depayloading H.264 from RTP, but you forgot to parse and decode it before passing it to autovideosink. I also don't see why you need the demux naming, since you have only one stream, and it is video only.
Try the following:
gst-launch-1.0 -v rtspsrc location="rtsp://192.168.100.50:19112/ipcam_h264.sdp" ! queue max-size-buffers=2 ! rtph264depay ! h264parse ! decodebin ! autovideosink sync=false
Not sure it applies to your case, but IIRC some GStreamer versions have RTSP authentication problems when plugins-ugly is installed. You may try:
sudo apt-get remove gstreamer1.0-plugins-ugly
If that is not enough, you may share the SDP for further advice.
Using mplayer here:
mplayer rtsp://username:password@IPADDRESS:PORT/your-path-to-stream
This is my actual pipeline:
gst-launch-1.0 --gst-debug=3 fdsrc name=fdsrc ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! rtpsbcdepay ! sbcparse ! sbcdec ! audioconvert ! audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! audioresample quality=2 ! appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
I didn't get any audio data, so I tried to find out which element causes the issue.
I used filesink to store the data coming from the phone and debugged step by step.
Here is the pipeline to capture the RTP-encoded data:
"fdsrc name=fdsrc ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! filesink location=/tmp/bt_capture.dat"
With this, I got some RTP audio packets in the bt_capture.dat file.
I then used this file as the source and added the elements back one by one.
I get an error at rtpsbcdepay:
$ gst-launch-1.0 --gst-debug=3 filesrc location=/media/media/Audio/bt_capture.dat ! application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100,format=time ! rtpsbcdepay ! sbcparse ! sbcdec ! audioconvert ! audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! audioresample quality=2 ! appsink name=appsink caps="audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2"
0:00:00.000177132 19448 0x197d800 INFO GST_INIT gst.c:586:init_pre: Initializing GStreamer Core Library version 1.18.4
0:00:00.000618217 19448 0x197d800 INFO GST_INIT gst.c:587:init_pre: Using library installed in /usr/lib
0:00:00.000884197 19448 0x197d800 INFO GST_INIT gst.c:605:init_pre: Linux W-4B17-BM 5.15.21-rt30-qsc+ #1 SMP PREEMPT_RT Thu Oct 20 12:57:18 UTC 2022 x86_64
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
0:00:00.062258692 19448 0x1b5cd20 ERROR rtpbasedepayload gstrtpbasedepayload.c:662:gst_rtp_base_depayload_handle_event: Segment with non-TIME format not supported
0:00:00.062676736 19448 0x1b5cd20 ERROR rtpbasedepayload gstrtpbasedepayload.c:662:gst_rtp_base_depayload_handle_event: Segment with non-TIME format not supported
0:00:00.064241297 19448 0x1b5cd20 WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: Internal data stream error.
0:00:00.064581497 19448 0x1b5cd20 WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: streaming stopped, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
../../unpacked/gstreamer-1.18.4/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming stopped, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Can anyone please suggest how to overcome this error?
P.S.: I tried adding format=time, but it didn't help and threw a syntax error.
Adding an rtpjitterbuffer along with do-timestamp=TRUE on fdsrc solves the problem (the depayloader needs a segment in TIME format, which the BYTES segment pushed by filesrc does not provide), though with some warnings. Still, the warnings didn't affect playback (maybe some packets are missing?).
=====================
0:01:21.926788667 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 1874 samples = 0:00:00.042494331
0:01:21.927975600 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 7968 samples = 0:00:00.180680272
0:01:22.122851228 20721 0x7f47100072a0 WARN audioresample gstaudioresample.c:729:gst_audio_resample_check_discont: encountered timestamp discontinuity of 1874 samples = 0:00:00.042494331
Pipeline:
"fdsrc name=fdsrc do-timestamp=TRUE ! "
"application/x-rtp,media=audio,encoding=SBC,payload=96,clock-rate=44100 ! "
"rtpjitterbuffer ! "
"rtpsbcdepay ! "
"sbcparse ! "
"sbcdec ! "
"audioconvert ! "
"audio/x-raw,layout=interleaved,format=F32LE,channels=2 ! "
"audioresample quality=2 ! "
"appsink name=appsink caps=\"audio/x-raw,layout=interleaved,format=F32LE,rate=48000,channels=2\""
I would like to stream over RTSP using GStreamer pipeline elements. First, I checked with gst-inspect-1.0 that rtspclientsink is available:
xilinx-k26-starterkit-2020_2:/# gst-inspect-1.0 | grep rtsp
rtspclientsink: rtspclientsink: RTSP RECORD client
rtsp: rtspsrc: RTSP packet receiver
rtsp: rtpdec: RTP Decoder
Then I wrote the simplest possible pipeline and tested it with videotestsrc as the source and kmssink as the sink. The following pipeline works well:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! kmssink bus-id=fd4a0000.zynqmp-display fullscreen-overlay=1 sync=false
Then I changed the sink to rtspclientsink:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! rtspclientsink location=rtsp://localhost:554/test
However, even with this simple pipeline the stream could not be started, and I got the following error:
xilinx-k26-starterkit-2020_2:/# gst-launch-1.0 videotestsrc ! video/x-raw, width=1920,height=1080 ! rtspclientsink location=rtsp://localhost:554/test
Setting pipeline to PAUSED ...
Pipeline is PREROLLED ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://localhost:554/test
ERROR: from element /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: Could not open resource for reading and writing.
Additional debug info:
../../../gst-rtsp-server-1.16.1/gst/rtsp-sink/gstrtspclientsink.c(3236): gst_rtsp_client_sink_connect_to_server (): /GstPipeline:pipeline0 /GstRTSPClientSink:rtspclientsink0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Could anyone enlighten me about the error and about how to use rtspclientsink as a sink? I also considered streaming with a script (excerpted below) that uses an RTSP server, but I wonder whether it is possible to use rtspclientsink directly as a pipeline element. Thanks.
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>
#define DEFAULT_RTSP_PORT "9001"
...(some code)
/* create a server instance */
server = gst_rtsp_server_new ();
g_object_set (server, "service", port, NULL);
mounts = gst_rtsp_server_get_mount_points (server);
factory = gst_rtsp_media_factory_new ();
gst_rtsp_media_factory_set_launch (factory, argv[1]);
gst_rtsp_media_factory_set_shared (factory, TRUE);
...(some code that creates pipeline and calls rtsp stream function)
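For completeness: the gst-rtsp-server sources ship a ready-made example, examples/test-launch, that wraps exactly the boilerplate in the snippet above, so the server side can be tried without writing code. A sketch (option names as in recent upstream examples; the pipeline is illustrative, and the payloader must be named pay0 so the server can find it):

```shell
# From a gst-rtsp-server build tree: serve a test pattern at
# rtsp://<this-host>:9001/test (test-launch mounts the stream at /test).
./test-launch --port=9001 "( videotestsrc ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"

# A client can then play it back, e.g.:
# gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:9001/test ! rtph264depay ! h264parse ! decodebin ! autovideosink
```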
Download rtsp-simple-server from
https://github.com/aler9/rtsp-simple-server/releases
Extract and run it:
tar xvf rtsp-simple-server_v0.19.3_linux_amd64.tar.gz
./rtsp-simple-server
It will tell you which ports it is listening on:
2022/07/31 13:23:34 INF rtsp-simple-server v0.19.3
2022/07/31 13:23:34 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2022/07/31 13:23:34 INF [RTMP] listener opened on :1935
2022/07/31 13:23:34 INF [HLS] listener opened on :8888
Point your RTSP stream at it (change localhost to the host running rtsp-simple-server if you are sending over the network):
gst-launch-1.0 -v videotestsrc ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! x264enc speed-preset=veryfast tune=zerolatency bitrate=800 ! rtspclientsink location=rtsp://localhost:8554/mystream
Check the rtsp-simple-server console log:
2022/07/31 13:26:02 INF [RTSP] [conn 192.168.1.130:34932] opened
2022/07/31 13:26:02 INF [RTSP] [session 247376253] created by 192.168.1.130:34932
2022/07/31 13:26:03 INF [RTSP] [session 247376253] is publishing to path 'mystream', 1 track with UDP
Open VLC -> Media -> Open Network Stream, enter rtsp://localhost:8554/mystream, and press Play.
VLC should show a test pattern, and you will see the following in the rtsp-simple-server console log:
2022/07/31 13:27:10 INF [RTSP] [conn 127.0.0.1:53900] opened
2022/07/31 13:27:10 INF [RTSP] [session 749381985] created by 127.0.0.1:53900
2022/07/31 13:27:10 INF [RTSP] [session 749381985] is reading from path 'mystream', 1 track with UDP
Environment: Raspberry Pi v2 camera, Jetson Nano board, Ubuntu 18.04.
I started with nvarguscamerasrc, and it works:
gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM),width=3820, height=2464, framerate=21/1, format=NV12'! nvegltransform ! nveglglessink -e
Then I tried running these pipelines:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-h264, width=3280, height=2464' ! filesink
and also:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=640, height=480' ! filesink
and got this output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
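No answer was recorded for this one, but "reason not-negotiated" means v4l2src cannot produce the caps requested downstream. The Raspberry Pi v2 camera (IMX219) exposes raw Bayer data through V4L2 rather than video/x-h264 or the usual video/x-raw formats, which is why nvarguscamerasrc (which routes the sensor through the ISP) works while plain v4l2src does not. (Note also that filesink needs a location=... property.) The formats the driver actually offers can be listed with v4l2-ctl from the v4l-utils package; a diagnostic sketch, not part of the original thread:

```shell
# List every pixel format, frame size and interval /dev/video0 offers,
# then compare them with the caps requested in the failing pipeline.
v4l2-ctl --device=/dev/video0 --list-formats-ext
```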
I installed a fresh copy of Ubuntu 18.04 and am trying to demux a single channel from a Matroska file by running the exact command from the GStreamer documentation (https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html).
GStreamer is current:
gst-launch-1.0 --version
gst-launch-1.0 version 1.14.1
GStreamer 1.14.1
https://launchpad.net/distros/ubuntu/+source/gstreamer1.0
The problem is that GStreamer keeps complaining:
GstMatroskaDemux:d: Delayed linking failed
Error:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:d: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:d:
failed delayed linking pad video_00 of GstMatroskaDemux named d to some pad of GstMatroskaMux named matroskamux0
ERROR: from element /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
How can I get this to work?
I remembered that WebM uses the Matroska container format. The example in the documentation is faulty/out of date: the pad naming is wrong. Instead of video_00, use video_0:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_0 ! matroskamux ! filesink location=sintel_video.mkv
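The pad names can be verified against the element's pad templates: in GStreamer 1.x, matroskademux declares video_%u, audio_%u and subtitle_%u sometimes-pads (the video_00 naming in the old example likely dates from the 0.10-era %02d templates). To check on your own install:

```shell
# Print matroskademux's pad templates; the source pads show up as
# video_%u / audio_%u / subtitle_%u, hence d.video_0 for the first video track.
gst-inspect-1.0 matroskademux
```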
I have installed GStreamer on my i.MX6 board. I want to stream the connected camera using RTSP.
The following command displays the camera content on the LVDS screen:
gst-launch tvsrc ! imxv4l2sink
Instead of displaying it on the screen, I want to send the content over the network via RTSP and display it on another device's display.
I used the following command to start streaming, which runs without throwing any errors:
gst-launch-0.10 -vv imxv4l2src ! video/x-raw-yuv, framerate=30/1, width=1024, height=768 ! vpuenc codec=avc ! rtph264pay ! udpsink host=127.0.0.1 port=5004 sync=false
On the other device, I executed the following command:
gst-launch rtspsrc location=rtsp://<ip Address>:5004 name=source ! queue ! rtph264depay ! vpudec low-latency=true ! imxv4l2sink
It fails with the following error:
gstrtspsrc.c(5685): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:source:
Failed to connect. (System error: Connection refused)
I can ping from the other device to the streaming device.
What can be the issue?
You are sending RTP over UDP, but you do not provide any RTSP protocol on top of it, so a receiver that tries to connect via RTSP will fail. You either need to use the GstRTSPServer class and implement some application logic on the sender side, or receive the data via udpsrc. For the latter you still need some way to transmit the SDP data (which is usually one of the things an RTSP server does).
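For the udpsrc route, a receiver sketch matching the sender pipeline above might look like this (untested; the caps values, in particular clock-rate=90000 and payload=96, are assumptions that must match what rtph264pay actually sends, and the sender's udpsink host=127.0.0.1 must be changed to the receiver's address):

```shell
# Receive the raw RTP/H.264 stream that udpsink pushes to port 5004.
gst-launch-0.10 udpsrc port=5004 \
  caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
  ! rtph264depay ! vpudec low-latency=true ! imxv4l2sink
```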