GStreamer RTSP client connection refused error - Linux

I have installed GStreamer on my i.MX6 board and want to stream the connected camera over RTSP.
The following command displays the camera content on the LVDS screen:
gst-launch tvsrc ! imxv4l2sink
Instead of displaying it locally, I want to send the content over the network via RTSP and display it on another device's screen.
I used the following command to start streaming which works without throwing any error
gst-launch-0.10 -vv imxv4l2src ! video/x-raw-yuv,framerate=30/1,width=1024,height=768 ! vpuenc codec=avc ! rtph264pay ! udpsink host=127.0.0.1 port=5004 sync=false
On the other device, I executed the following command:
gst-launch rtspsrc location=rtsp://<ip Address>:5004 name=source ! queue ! rtph264depay ! vpudec low-latency=true ! imxv4l2sink
It fails with the following error:
gstrtspsrc.c(5685): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:source:
Failed to connect. (System error: Connection refused)
I can ping from the other device to the streaming device.
What can be the issue?

You are sending RTP over UDP, but nothing on the sender side speaks the RTSP protocol, so a receiver that tries to connect via RTSP is refused. You either need to run an actual RTSP server on the sender (e.g. an application built on the GstRTSPServer class) or receive the data with udpsrc. For the latter you still need some way to transmit the SDP data (distributing the SDP is normally one of the things an RTSP server does); with gst-launch you can substitute explicit caps, as sketched below. Also note that your udpsink uses host=127.0.0.1, so the packets never leave the streaming device; set host to the receiver's IP address.
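A minimal receiver sketch in the question's 0.10 syntax, assuming rtph264pay's default payload type 96; the explicit caps stand in for the SDP:
gst-launch-0.10 udpsrc port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! vpudec low-latency=true ! imxv4l2sink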

Related

Play rtsp stream from webcam using Gstreamer

I want to stream video from a TS-WPTCAM IP camera.
I can play the stream directly in VLC using rtsp://192.168.100.50:19112/ipcam_h264.sdp, but when I try with GStreamer, it does not play the video.
Below is the output.
Lnx-Workstation:~$ gst-launch-1.0 -v rtspsrc location="rtsp://192.168.100.50:19112/ipcam_h264.sdp" name=demux demux. ! queue max-size-buffers=2 ! rtph264depay ! autovideosink sync=false
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayGBM\)\ gldisplaygbm0";
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.100.50:19112/ipcam_h264.sdp
0:00:20.130738709 13686 0x5632fcf9d2d0 ERROR default gstrtspconnection.c:1004:gst_rtsp_connection_connect_with_response: failed to connect: Socket I/O timed out
0:00:20.130840128 13686 0x5632fcf9d2d0 ERROR rtspsrc gstrtspsrc.c:4702:gst_rtsp_conninfo_connect:<demux> Could not connect to server. (Generic error)
0:00:20.130850670 13686 0x5632fcf9d2d0 WARN rtspsrc gstrtspsrc.c:7469:gst_rtspsrc_retrieve_sdp:<demux> error: Failed to connect. (Generic error)
0:00:20.130893392 13686 0x5632fcf9d2d0 WARN rtspsrc gstrtspsrc.c:7548:gst_rtspsrc_open:<demux> can't get sdp
0:00:20.130917551 13686 0x5632fcf9d2d0 WARN rtspsrc gstrtspsrc.c:5628:gst_rtspsrc_loop:<demux> we are not connected
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:demux: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:demux:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I also tried to play the video using playbin like this:
Lnx-Workstation:~$ gst-launch-1.0 -v playbin uri=rtsp://192.168.100.50:19112/ipcam_h264.sdp uridecodebin0::source::latency=100
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: ring-buffer-max-size = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-size = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-duration = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: use-buffering = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: download = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: uri = rtsp://192.168.100.50:19112/ipcam_h264.sdp
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: connection-speed = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: latency = 100
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: source = "\(GstRTSPSrc\)\ source"
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.100.50:19112/ipcam_h264.sdp
0:00:20.040912220 13549 0x55e2654b5e80 ERROR default gstrtspconnection.c:1004:gst_rtsp_connection_connect_with_response: failed to connect: Socket I/O timed out
0:00:20.041032034 13549 0x55e2654b5e80 ERROR rtspsrc gstrtspsrc.c:4702:gst_rtsp_conninfo_connect:<source> Could not connect to server. (Generic error)
0:00:20.041058980 13549 0x55e2654b5e80 WARN rtspsrc gstrtspsrc.c:7469:gst_rtspsrc_retrieve_sdp:<source> error: Failed to connect. (Generic error)
0:00:20.041160200 13549 0x55e2654b5e80 WARN rtspsrc gstrtspsrc.c:7548:gst_rtspsrc_open:<source> can't get sdp
0:00:20.041185827 13549 0x55e2654b5e80 WARN rtspsrc gstrtspsrc.c:5628:gst_rtspsrc_loop:<source> we are not connected
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
*** Playbin for a file source works.
How can I play RTSP video using GStreamer?
EDIT:
As per Gregory's answer:
Lnx-Workstation:~$ gst-launch-1.0 -v rtspsrc location="rtsp://192.168.100.50:19112/ipcam_h264.sdp" ! queue max-size-buffers=2 ! rtph264depay ! h264parse ! decodebin ! autovideosink sync=false
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.100.50:19112/ipcam_h264.sdp
0:00:20.108314006 3310 0x563cd86e8850 ERROR default gstrtspconnection.c:1004:gst_rtsp_connection_connect_with_response: failed to connect: Socket I/O timed out
0:00:20.108425505 3310 0x563cd86e8850 ERROR rtspsrc gstrtspsrc.c:4702:gst_rtsp_conninfo_connect:<rtspsrc0> Could not connect to server. (Generic error)
0:00:20.108449668 3310 0x563cd86e8850 WARN rtspsrc gstrtspsrc.c:7469:gst_rtspsrc_retrieve_sdp:<rtspsrc0> error: Failed to connect. (Generic error)
0:00:20.108540016 3310 0x563cd86e8850 WARN rtspsrc gstrtspsrc.c:7548:gst_rtspsrc_open:<rtspsrc0> can't get sdp
0:00:20.108569689 3310 0x563cd86e8850 WARN rtspsrc gstrtspsrc.c:5628:gst_rtspsrc_loop:<rtspsrc0> we are not connected
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
The error seems to be the same.
You are depayloading H.264 from RTP, but you forgot to parse and decode it before passing it to autovideosink. I also don't see why you named the source "demux"; you use only video, and only one stream.
Try the following:
gst-launch-1.0 -v rtspsrc location="rtsp://192.168.100.50:19112/ipcam_h264.sdp" ! queue max-size-buffers=2 ! rtph264depay ! h264parse ! decodebin ! autovideosink sync=false
Not sure about your case, but IIRC some GStreamer versions have RTSP authentication problems when plugins-ugly is installed. You may try:
sudo apt-get remove gstreamer1.0-plugins-ugly
If that is not enough, you can share the SDP for further advice.
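Since every attempt fails with "Socket I/O timed out" rather than an RTSP-level error, it is also worth confirming that the camera's RTSP port is reachable from the workstation at all; a quick check, assuming netcat is available:
nc -vz 192.168.100.50 19112
If that also hangs, the problem is at the network or firewall level rather than in the GStreamer pipeline.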
mplayer works here:
mplayer rtsp://username:password@IPADDRESS:PORT/your-path-to-stream

rtspclientsink test pipeline from command line

I would like to stream over RTSP using GStreamer pipeline elements. First, I checked with gst-inspect-1.0 that rtspclientsink is available:
xilinx-k26-starterkit-2020_2:/# gst-inspect-1.0 | grep rtsp
rtspclientsink: rtspclientsink: RTSP RECORD client
rtsp: rtspsrc: RTSP packet receiver
rtsp: rtpdec: RTP Decoder
Then I wrote the simplest possible pipeline and tested it with videotestsrc as the source and kmssink as the sink. The following pipeline works well:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! kmssink bus-id=fd4a0000.zynqmp-display fullscreen-overlay=1 sync=false
Then I changed the sink to rtspclientsink:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! rtspclientsink location=rtsp://localhost:554/test
However, even with this simple pipeline, the stream could not be started and failed with the following error:
xilinx-k26-starterkit-2020_2:/# gst-launch-1.0 videotestsrc ! video/x-raw, width=1920,height=1080 ! rtspclientsink location=rtsp://localhost:554/test
Setting pipeline to PAUSED ...
Pipeline is PREROLLED ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://localhost:554/test
ERROR: from element /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: Could not open resource for reading and writing.
Additional debug info:
../../../gst-rtsp-server-1.16.1/gst/rtsp-sink/gstrtspclientsink.c(3236): gst_rtsp_client_sink_connect_to_server (): /GstPipeline:pipeline0 /GstRTSPClientSink:rtspclientsink0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Could anyone enlighten me about the error and how I can use rtspclientsink as a sink? I also considered streaming with an application (given below) that uses an RTSP server, but I wonder whether it is possible to use rtspclientsink as a pipeline element. Thanks.
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>
#define DEFAULT_RTSP_PORT "9001"
...(some code)
/* create a server instance */
server = gst_rtsp_server_new ();
g_object_set (server, "service", port, NULL);
mounts = gst_rtsp_server_get_mount_points (server);
factory = gst_rtsp_media_factory_new ();
gst_rtsp_media_factory_set_launch (factory, argv[1]);
gst_rtsp_media_factory_set_shared (factory, TRUE);
...(some code that creates pipeline and calls rtsp stream function)
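For reference, an application built from that snippet works like the test-launch example shipped with gst-rtsp-server: it serves whatever gst-launch-style description you pass in argv[1]. A sketch of running it, assuming the binary is named my-rtsp-server (hypothetical) and the elided code mounts the factory at /test; note that GstRTSPMediaFactory requires the payloader to be named pay0:
./my-rtsp-server "( videotestsrc ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"
A client can then play it back (the port is the DEFAULT_RTSP_PORT from the code above):
gst-launch-1.0 rtspsrc location=rtsp://<server-ip>:9001/test latency=100 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink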
rtspclientsink publishes (RECORD) to an existing RTSP server, and nothing is listening on localhost:554, hence the connection failure. Download rtsp-simple-server from
https://github.com/aler9/rtsp-simple-server/releases
Extract and run it:
tar xvf rtsp-simple-server_v0.19.3_linux_amd64.tar.gz
./rtsp-simple-server
It will tell you which ports it is listening on:
2022/07/31 13:23:34 INF rtsp-simple-server v0.19.3
2022/07/31 13:23:34 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2022/07/31 13:23:34 INF [RTMP] listener opened on :1935
2022/07/31 13:23:34 INF [HLS] listener opened on :8888
Point your RTSP stream at it (change localhost to the host running rtsp-simple-server if you are sending over the network):
gst-launch-1.0 -v videotestsrc ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! x264enc speed-preset=veryfast tune=zerolatency bitrate=800 ! rtspclientsink location=rtsp://localhost:8554/mystream
Check the rtsp-simple-server console log:
2022/07/31 13:26:02 INF [RTSP] [conn 192.168.1.130:34932] opened
2022/07/31 13:26:02 INF [RTSP] [session 247376253] created by 192.168.1.130:34932
2022/07/31 13:26:03 INF [RTSP] [session 247376253] is publishing to path 'mystream', 1 track with UDP
Open VLC -> Media -> Open Network Stream, enter rtsp://localhost:8554/mystream, and press Play.
VLC should show a test pattern.
You will see the following in the rtsp-simple-server console log:
2022/07/31 13:27:10 INF [RTSP] [conn 127.0.0.1:53900] opened
2022/07/31 13:27:10 INF [RTSP] [session 749381985] created by 127.0.0.1:53900
2022/07/31 13:27:10 INF [RTSP] [session 749381985] is reading from path 'mystream', 1 track with UDP
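As an alternative to VLC, the stream can be pulled back with GStreamer itself; a minimal sketch (avdec_h264 assumes the gstreamer1.0-libav plugins are installed):
gst-launch-1.0 rtspsrc location=rtsp://localhost:8554/mystream latency=100 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink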

Running gstreamer app in OpenVino using Windows subsystem in Linux (WSL2)

My main OS is Windows 10, with WSL2 installed as a subsystem.
OpenVINO is installed in Linux under WSL2, because dlstreamer-gst is supported only on Linux.
When I test the following app, only the fps mode works; display does not.
If I run the app as
./vehicle_pedestrian_tracking.sh person-bicycle-car-detection.mp4 10 CPU fps
I can see the output as
FpsCounter(1sec): total=346.54 fps, number-streams=1, per-stream=346.54 fps
FPSCounter(average): total=340.06 fps, number-streams=1, per-stream=340.06 fps
Got EOS from element "pipeline0".
Execution ended after 0:00:02.456260800
Setting pipeline to NULL ...
Freeing pipeline ...
When I run it for display,
./vehicle_pedestrian_tracking.sh person-bicycle-car-detection.mp4 10 CPU display
I have error as
ERROR: from element /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
../sys/xvimage/xvimagesink.c(1778): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstXvImageSink:xvimagesink0:
Could not open display (null)
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...
Is display output simply not available in the WSL subsystem?
Use MobaXterm with WSL2, and change xvimagesink to ximagesink in vehicle_pedestrian_tracking.sh.
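If you are not relying on WSLg, DISPLAY usually has to be pointed at the X server on the Windows host manually; a sketch, assuming a default WSL2 network setup and an X server (such as the one bundled with MobaXterm) listening on the host:
export DISPLAY=$(grep nameserver /etc/resolv.conf | awk '{print $2}'):0.0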
My workaround: I can't display, but I can save to a file:
gst-launch-1.0 filesrc location=video1.avi ! decodebin ! video/x-raw ! queue ! gvadetect model=/home/nyan/intel/dl_streamer/models/intel/vehicle-detection/FP16/model_fp16.xml model-proc=./model_proc/vehicle-detection.json inference-interval=10 threshold=0.6 device=CPU ! queue ! gvafpscounter ! filesink location=output.avi
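Note that this writes raw decoded frames to filesink, so output.avi is not a playable AVI despite its name. To get a file that players accept, encode and mux before the sink; a sketch, assuming x264enc and avimux are available:
gst-launch-1.0 filesrc location=video1.avi ! decodebin ! videoconvert ! queue ! gvadetect model=/home/nyan/intel/dl_streamer/models/intel/vehicle-detection/FP16/model_fp16.xml model-proc=./model_proc/vehicle-detection.json inference-interval=10 threshold=0.6 device=CPU ! queue ! gvafpscounter ! videoconvert ! x264enc ! avimux ! filesink location=output.avi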

Sink to the virtual v4l2 device

I have tried an example on Ubuntu 19.04
gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video10
But gstreamer fails
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Cannot identify device '/dev/video10'.
Additional debug info:
v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
system error: No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...
Why doesn't it work? I haven't found this in the documentation. Do I need to create /dev/video10 somehow?
I tried the same with the default device /dev/video1, but that is the input camera device on my laptop:
sudo gst-launch-1.0 videotestsrc ! v4l2sink
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Device '/dev/video1' is not a output device.
Additional debug info:
v4l2_calls.c(639): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
Capabilities: 0x4a00000
Setting pipeline to NULL ...
Freeing pipeline ...
Thanks in advance.
The title of your question suggests you would like to write to a virtual video device. v4l2 devices can be either video inputs or video outputs; your camera is a video input (capture) device, so pointing v4l2sink (an endpoint of the pipeline) at it will fail.
You can, however, create a virtual output device. What you are looking for is the v4l2loopback kernel module. It lets you generate a virtual /dev/video10 device like this:
modprobe v4l2loopback video_nr=10
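With the module loaded (assuming the v4l2loopback package is installed, e.g. v4l2loopback-dkms on Ubuntu), the pipeline from the question should run, and the device can be read back like any camera:
gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video10
Then, in a second terminal:
gst-launch-1.0 v4l2src device=/dev/video10 ! videoconvert ! autovideosink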
Another possible solution for the same error message: recreate the v4l2loopback interface:
sudo rmmod -f v4l2loopback
sudo modprobe v4l2loopback
This might apply to others seeing the error message from the original question who already know they need a v4l2loopback device as the gstreamer sink.
When trying to stream video to an existing v4l2loopback device that I had previously written to with ffmpeg, I got the same error message:
Device '/dev/video0' is not a output device.
Investigation
When comparing the state of a working loopback video device with a non-working one (i.e. after writing to it with ffmpeg) using v4l2-ctl --all -d 0 and diff, I found the following difference:
--- working 2020-11-19 18:03:52.499440518 +0100
+++ non-working 2020-11-19 18:03:57.472802868 +0100
@@ -3,21 +3,18 @@
Card type : GPhoto2 Webcam
Bus info : platform:v4l2loopback-000
Driver version : 5.9.8
- Capabilities : 0x85208002
- Video Output
+ Capabilities : 0x85208000
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Device Capabilities
- Device Caps : 0x05208002
- Video Output
+ Device Caps : 0x05208000
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Priority: 0
-Video output: 0 (loopback in)
Format Video Output:
Width/Height : 960/640
Pixel Format : 'YU12' (Planar YUV 4:2:0)
Somehow that "Video Output" capability is required for gstreamer to work, and it was taken away by my previous ffmpeg call.
The behaviour only occurred when I had loaded the v4l2loopback module with the exclusive_caps=1 option, see 1.
The solution was to recreate the interface by forcefully removing the v4l2loopback kernel module and adding it again using rmmod / modprobe (see above).

How to create an indexed video file with Gstreamer

I'm trying to use Gstreamer to create a seekable (indexed) video file in Linux. My pipelines work for recording and saving the data, but I can't figure out how to index the data so I can seek using gst_element_seek_simple() [http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+4%3A+Time+management]
I have seen this post: Gstreamer video output position tracking and seeking and validated I am sending an EOS on the pipeline with -e.
Here is my pipeline and output. I'm teeing the stream so that it is both displayed on my embedded system's screen and saved to the M4V file.
# gst-launch-0.10 -e v4l2src ! \
    tee name=t ! \
    queue ! \
    video/x-raw-yuv,width=320,height=240 ! \
    videoflip method=clockwise ! \
    ffmpegcolorspace ! \
    fbdevsink t. ! \
    queue ! \
    ffmpegcolorspace ! \
    ffenc_mpeg4 ! \
    filesink location=output.m4v
Here is the output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
WARNING: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not get parameters on device '/dev/video0'
Additional debug info:
v4l2src_calls.c(235): gst_v4l2src_set_capture (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
system error: Inappropriate ioctl for device
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
(gst-launch-0.10:534): GLib-CRITICAL **: Source ID 62 was not found when attempting to remove it
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 10057977251 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
And here is the output of gst-discoverer-0.10 on my new file:
beaglebone:# gst-discoverer-0.10 output.m4v
Analyzing file:///output.m4v
Done discovering file:///output.m4v
Topology:
video: MPEG-4 Video
Properties:
Duration: 0:00:00.000000000
Seekable: no
Thanks
You need to store the result in a seekable/indexed format. For that, put the MPEG-4 video inside a container such as MP4 or Matroska: use "! mp4mux ! filesink" or "! matroskamux ! filesink" to get a file in one of those formats, which should make it seekable.
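Applied to the recording branch of the pipeline above, only the tail changes; a sketch in the question's 0.10 syntax (matroskamux is the safer choice here, since unlike mp4mux it typically still leaves a playable file if the pipeline is interrupted before EOS is written):
gst-launch-0.10 -e v4l2src ! \
    tee name=t ! \
    queue ! \
    video/x-raw-yuv,width=320,height=240 ! \
    videoflip method=clockwise ! \
    ffmpegcolorspace ! \
    fbdevsink t. ! \
    queue ! \
    ffmpegcolorspace ! \
    ffenc_mpeg4 ! \
    matroskamux ! \
    filesink location=output.mkv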
Side note: GStreamer 0.10 has been obsolete and unmaintained for over 2 years; please upgrade to 1.0.
http://gstreamer.freedesktop.org/ is the official GStreamer website, and you will find the releases for the 1.x versions there. The gstreamer.com website is not related to the official project; as its own text explains, you should be using the official repository and installers.
