How to create an indexed video file with Gstreamer - linux

I'm trying to use GStreamer to create a seekable (indexed) video file on Linux. My pipelines work for recording and saving the data, but I can't figure out how to index the data so I can seek with gst_element_seek_simple() [http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+4%3A+Time+management].
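For reference, the kind of seek I eventually want to do (roughly following the linked tutorial; pipeline here stands for my playback pipeline) is:
/* sketch: jump 30 seconds into the recorded file */
gst_element_seek_simple (pipeline, GST_FORMAT_TIME,
    GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND);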
I have seen this post: Gstreamer video output position tracking and seeking, and validated that I am sending an EOS on the pipeline with -e.
Here are my pipeline and its output. I'm teeing the stream so it is both displayed on my embedded system's screen and saved to an M4V file.
# gst-launch-0.10 -e v4l2src ! \
tee name=t ! \
queue ! \
video/x-raw-yuv,width=320,height=240 ! \
videoflip method=clockwise ! \
ffmpegcolorspace ! \
fbdevsink t. ! \
queue ! \
ffmpegcolorspace ! \
ffenc_mpeg4 ! \
filesink location=output.m4v
Here is the output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
WARNING: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not get parameters on device '/dev/video0'
Additional debug info:
v4l2src_calls.c(235): gst_v4l2src_set_capture (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
system error: Inappropriate ioctl for device
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
(gst-launch-0.10:534): GLib-CRITICAL **: Source ID 62 was not found when attempting to remove it
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 10057977251 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
And here is the output of gst-discoverer on my new file:
beaglebone:# gst-discoverer-0.10 output.m4v
Analyzing file:///output.m4v
Done discovering file:///output.m4v
Topology:
video: MPEG-4 Video
Properties:
Duration: 0:00:00.000000000
Seekable: no
Thanks

You need to store the result in a seekable/indexed format. For that you can put the MPEG-4 video inside a container such as MP4 or Matroska: use "! mp4mux ! filesink" or "! matroskamux ! filesink" to wrap it in one of those formats, which should make it seekable.
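For example, keeping the display branch unchanged, the recording branch of your pipeline could look roughly like this (a sketch for 0.10, untested on your board; keep -e so the muxer receives an EOS and can finalize the index/duration):
# gst-launch-0.10 -e v4l2src ! \
tee name=t ! \
queue ! \
video/x-raw-yuv,width=320,height=240 ! \
videoflip method=clockwise ! \
ffmpegcolorspace ! \
fbdevsink t. ! \
queue ! \
ffmpegcolorspace ! \
ffenc_mpeg4 ! \
matroskamux ! \
filesink location=output.mkv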
Side note: GStreamer 0.10 has been obsolete and unmaintained for over two years; please upgrade to 1.0.
http://gstreamer.freedesktop.org/ is the official GStreamer website, and you will find the 1.x releases there. The gstreamer.com website is a project that is not related to the official one; if you read the text on gstreamer.com, you will see that you should be using the official repositories and installers.
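For reference, a rough 1.0 equivalent of the pipeline above would look something like this (a sketch; the element names follow the usual 0.10 -> 1.0 renames such as ffmpegcolorspace -> videoconvert and ffenc_mpeg4 -> avenc_mpeg4, and are untested here):
gst-launch-1.0 -e v4l2src ! \
tee name=t ! \
queue ! \
video/x-raw,width=320,height=240 ! \
videoflip method=clockwise ! \
videoconvert ! \
fbdevsink t. ! \
queue ! \
videoconvert ! \
avenc_mpeg4 ! \
matroskamux ! \
filesink location=output.mkv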

Related

rtspclientsink test pipeline from command line

I would like to stream over RTSP using GStreamer pipeline elements. First, I checked with gst-inspect-1.0 that rtspclientsink is available:
xilinx-k26-starterkit-2020_2:/# gst-inspect-1.0 | grep rtsp
rtspclientsink: rtspclientsink: RTSP RECORD client
rtsp: rtspsrc: RTSP packet receiver
rtsp: rtpdec: RTP Decoder
Then I wrote the simplest possible pipeline and tested it with videotestsrc as the source and kmssink as the sink. The following pipeline works well:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! kmssink bus-id=fd4a0000.zynqmp-display fullscreen-overlay=1 sync=false
Then I changed the sink to rtspclientsink:
gst-launch-1.0 videotestsrc ! video/x-raw, width=1920, height=1080 ! rtspclientsink location=rtsp://localhost:554/test
However, even with this simple pipeline, the stream could not be started and I got the following error:
xilinx-k26-starterkit-2020_2:/# gst-launch-1.0 videotestsrc ! video/x-raw, width=1920,height=1080 ! rtspclientsink location=rtsp://localhost:554/test
Setting pipeline to PAUSED ...
Pipeline is PREROLLED ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://localhost:554/test
ERROR: from element /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: Could not open resource for reading and writing.
Additional debug info:
../../../gst-rtsp-server-1.16.1/gst/rtsp-sink/gstrtspclientsink.c(3236): gst_rtsp_client_sink_connect_to_server (): /GstPipeline:pipeline0 /GstRTSPClientSink:rtspclientsink0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Could anyone enlighten me about the error and how I can use rtspclientsink as a sink? I also considered streaming with a program (given below) that uses an RTSP server, but I wonder whether it is possible to use rtspclientsink as a pipeline element. Thanks.
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>
#define DEFAULT_RTSP_PORT "9001"
...(some code)
/* create a server instance */
server = gst_rtsp_server_new ();
g_object_set (server, "service", port, NULL);           /* port the server listens on */
mounts = gst_rtsp_server_get_mount_points (server);     /* mount points map request URLs to media factories */
factory = gst_rtsp_media_factory_new ();                /* factory that builds the media pipeline per client */
gst_rtsp_media_factory_set_launch (factory, argv[1]);   /* pipeline description taken from the command line */
gst_rtsp_media_factory_set_shared (factory, TRUE);      /* share one pipeline between all clients */
...(some code that creates pipeline and calls rtsp stream function)
Download rtsp-simple-server from
https://github.com/aler9/rtsp-simple-server/releases
Extract and run it:
tar xvf rtsp-simple-server_v0.19.3_linux_amd64.tar.gz
./rtsp-simple-server
It will tell you which ports it's listening on:
2022/07/31 13:23:34 INF rtsp-simple-server v0.19.3
2022/07/31 13:23:34 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2022/07/31 13:23:34 INF [RTMP] listener opened on :1935
2022/07/31 13:23:34 INF [HLS] listener opened on :8888
Point your RTSP stream at it (change localhost to the host running rtsp-simple-server if you are sending over the network):
gst-launch-1.0 -v videotestsrc ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! x264enc speed-preset=veryfast tune=zerolatency bitrate=800 ! rtspclientsink location=rtsp://localhost:8554/mystream
Check the rtsp-simple-server console log:
2022/07/31 13:26:02 INF [RTSP] [conn 192.168.1.130:34932] opened
2022/07/31 13:26:02 INF [RTSP] [session 247376253] created by 192.168.1.130:34932
2022/07/31 13:26:03 INF [RTSP] [session 247376253] is publishing to path 'mystream', 1 track with UDP
Open VLC -> Media -> Open Network Stream, enter rtsp://<server>:8554/mystream, and press Play.
VLC should show a test pattern.
You will see the following in the rtsp-simple-server console log:
2022/07/31 13:27:10 INF [RTSP] [conn 127.0.0.1:53900] opened
2022/07/31 13:27:10 INF [RTSP] [session 749381985] created by 127.0.0.1:53900
2022/07/31 13:27:10 INF [RTSP] [session 749381985] is reading from path 'mystream', 1 track with UDP

Running a GStreamer app in OpenVINO using Windows Subsystem for Linux (WSL2)

My system's main OS is Windows 10, with WSL2 installed as a subsystem.
OpenVINO is installed in Linux under WSL2, because dlstreamer-gst is supported only on Linux.
When I test the following test app, only the fps mode works; the display mode does not.
If I run the app as
./vehicle_pedestrian_tracking.sh person-bicycle-car-detection.mp4 10 CPU fps
I see the following output:
FpsCounter(1sec): total=346.54 fps, number-streams=1, per-stream=346.54 fps
FPSCounter(average): total=340.06 fps, number-streams=1, per-stream=340.06 fps
Got EOS from element "pipeline0".
Execution ended after 0:00:02.456260800
Setting pipeline to NULL ...
Freeing pipeline ...
When I run it for display,
./vehicle_pedestrian_tracking.sh person-bicycle-car-detection.mp4 10 CPU dispaly
I get this error:
ERROR: from element /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
../sys/xvimage/xvimagesink.c(1778): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstXvImageSink:xvimagesink0:
Could not open display (null)
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...
Is the display simply not available in the WSL subsystem?
Use MobaXterm with WSL2 and change xvimagesink to ximagesink in vehicle_pedestrian_tracking.sh.
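To check that the X display forwarded by MobaXterm works at all, a simple test like this (not the script's actual pipeline, just a minimal check) should show a test pattern:
gst-launch-1.0 videotestsrc ! videoconvert ! fpsdisplaysink video-sink=ximagesink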
My solution: I can't display, but I can save to a file with
gst-launch-1.0 filesrc location=video1.avi ! decodebin ! video/x-raw ! queue ! gvadetect model=/home/nyan/intel/dl_streamer/models/intel/vehicle-detection/FP16/model_fp16.xml model-proc=./model_proc/vehicle-detection.json inference-interval=10 threshold=0.6 device=CPU ! queue ! gvafpscounter ! filesink location=output.avi

Sink to the virtual v4l2 device

I have tried an example on Ubuntu 19.04:
gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video10
But GStreamer fails:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Cannot identify device '/dev/video10'.
Additional debug info:
v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
system error: No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...
Why doesn't it work? I haven't found this in the documentation; do I need to create /dev/video10 somehow?
I did the same for the default device /dev/video1, but it is an input camera device on my laptop:
sudo gst-launch-1.0 videotestsrc ! v4l2sink
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Device '/dev/video1' is not a output device.
Additional debug info:
v4l2_calls.c(639): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
Capabilities: 0x4a00000
Setting pipeline to NULL ...
Freeing pipeline ...
Thanks in advance.
The title of your question suggests you would like to write to a virtual video device. v4l2 devices can be either input or output video devices. Your camera is a video input (capture) device, so pointing a v4l2sink (an output endpoint of the pipeline) at it in GStreamer will fail.
You can, however, create a virtual output device. What you are looking for is something like the v4l2loopback module. It allows you to create a virtual /dev/video10 device like this:
modprobe v4l2loopback video_nr=10
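After loading the module, the pipeline from the question should be able to write to the new device:
gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video10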
Another possible solution for the same error message: recreate the v4l2loopback interface:
sudo rmmod -f v4l2loopback
sudo modprobe v4l2loopback
This might apply to others seeing the error message from the original question who are already aware that they need a v4l2loopback device as the GStreamer sink.
When trying to stream a video to an existing v4l2loopback device that I had previously streamed to with ffmpeg, I got the same error message:
Device '/dev/video0' is not a output device.
Investigation
When comparing the state of a working loopback video device and a non-working one (i.e. after writing to it with ffmpeg) with v4l2-ctl --all -d 0 using diff, I found the following difference:
--- working 2020-11-19 18:03:52.499440518 +0100
+++ non-working 2020-11-19 18:03:57.472802868 +0100
@@ -3,21 +3,18 @@
Card type : GPhoto2 Webcam
Bus info : platform:v4l2loopback-000
Driver version : 5.9.8
- Capabilities : 0x85208002
- Video Output
+ Capabilities : 0x85208000
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Device Capabilities
- Device Caps : 0x05208002
- Video Output
+ Device Caps : 0x05208000
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Priority: 0
-Video output: 0 (loopback in)
Format Video Output:
Width/Height : 960/640
Pixel Format : 'YU12' (Planar YUV 4:2:0)
Somehow that "Video Output" capability is required for GStreamer to work and was taken away by my previous ffmpeg call.
The behaviour only occurred when I loaded the v4l2loopback module with the exclusive_caps=1 option (see [1]).
The solution was to reload the v4l2loopback kernel module: forcefully removing it and adding it again using rmmod / modprobe (see above).

GStreamer Example from Official Tutorial does not Run on Ubuntu 18.04 with GStreamer 1.14.1

I installed a fresh version of Ubuntu 18.04 and I am trying to demux a single channel from a Matroska file by running the exact command from the GStreamer documentation (https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html).
GStreamer is current:
gst-launch-1.0 --version
gst-launch-1.0 version 1.14.1
GStreamer 1.14.1
https://launchpad.net/distros/ubuntu/+source/gstreamer1.0
The problem is that GStreamer keeps complaining:
GstMatroskaDemux:d: Delayed linking failed
Error:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:d: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:d:
failed delayed linking pad video_00 of GstMatroskaDemux named d to some pad of GstMatroskaMux named matroskamux0
ERROR: from element /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
How can I get this to work?
I remembered that WebM uses the Matroska file format. The example is faulty/out of date: the pad naming is wrong. Instead of video_00 use video_0:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_0 ! matroskamux ! filesink location=sintel_video.mkv
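If in doubt, you can check the pad template names with gst-inspect; in 1.x matroskademux exposes source pad templates such as video_%u, which is why the pad is called video_0:
gst-inspect-1.0 matroskademux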

Could not initialize Xv output

I have installed gstreamer-1.0 on my target board.
When I run the below command:
gst-launch-1.0 filesrc location="/home/test.mp4" ! decodebin ! videoconvert ! xvimagesink
I am getting the following output:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1765): gst_xvimagesink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
Setting pipeline to NULL ...
Freeing pipeline ..
Please suggest some solution to make the pipeline work.
Regards,
Sainath
Hmm, do you have libxv somewhere?
Find out what happens when you check gst-inspect-1.0 xvimagesink.
The output also mentions the plugin's library path (gst-inspect-1.0 xvimagesink | grep Filename); you can ldd it:
ldd /usr/local/lib/gstreamer-1.0/libgstxvimagesink.so
I had some problems with this as well. Try installing libxv-dev or libxv1; the dev package is probably the safer choice.
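On a Debian/Ubuntu-style target (an assumption; your board may use a different package manager) that would be something like:
sudo apt-get install libxv1 libxv-dev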
