Running a GStreamer app in OpenVINO using Windows Subsystem for Linux (WSL2) - openvino

My main OS is Windows 10, with WSL2 installed as a subsystem.
OpenVINO is installed in Linux under WSL2, because dlstreamer-gst is supported only on Linux.
When I run the following test app, only the fps mode works; display does not.
If I run the app as
./vehicle_pedestrian_tracking.sh person-bicycle-car-detection.mp4 10 CPU fps
I see this output:
FpsCounter(1sec): total=346.54 fps, number-streams=1, per-stream=346.54 fps
FPSCounter(average): total=340.06 fps, number-streams=1, per-stream=340.06 fps
Got EOS from element "pipeline0".
Execution ended after 0:00:02.456260800
Setting pipeline to NULL ...
Freeing pipeline ...
When I run it for display,
./vehicle_pedestrian_tracking.sh person-bicycle-car-detection.mp4 10 CPU display
I get this error:
ERROR: from element /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
../sys/xvimage/xvimagesink.c(1778): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstXvImageSink:xvimagesink0:
Could not open display (null)
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...
Is display output not available in the WSL subsystem?

Use MobaXterm with WSL2, and change xvimagesink to ximagesink in vehicle_pedestrian_tracking.sh.
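With an X server such as MobaXterm running on Windows, WSL2 (before WSLg) also needs DISPLAY pointed at the Windows host, whose IP WSL2 exposes as the nameserver in /etc/resolv.conf. A minimal sketch; the ":0" display number is an assumption, so check your X server's settings:

```shell
# Derive the Windows host IP (WSL2 writes it into /etc/resolv.conf as the
# nameserver) and point DISPLAY at the X server running there.
# The ":0" display number is an assumption; verify it in your X server.
WIN_HOST=$(awk '/^nameserver/ {print $2; exit}' /etc/resolv.conf)
export DISPLAY="${WIN_HOST}:0"
echo "$DISPLAY"
```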

My workaround: I can't display, but I can save to a file:
gst-launch-1.0 filesrc location=video1.avi ! decodebin ! video/x-raw ! queue ! gvadetect model=/home/nyan/intel/dl_streamer/models/intel/vehicle-detection/FP16/model_fp16.xml model-proc=./model_proc/vehicle-detection.json inference-interval=10 threshold=0.6 device=CPU ! queue ! gvafpscounter ! filesink location=output.avi
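Note that the pipeline above sends raw decoded frames straight into output.avi with no encoder or muxer, so the result may not be playable. A hedged variant (x264enc from gst-plugins-ugly and matroskamux from gst-plugins-good are assumed to be installed; the gvadetect/gvafpscounter stages from the original pipeline would slot in after decodebin):

```shell
# Sketch: re-encode and mux before the filesink so the output is a playable
# file. Element availability (x264enc, matroskamux) is an assumption about
# the installed plugin sets; file names are examples.
gst-launch-1.0 filesrc location=video1.avi ! decodebin ! videoconvert ! \
  x264enc ! matroskamux ! filesink location=output.mkv
```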

Related

Raspi v2 camera is not working with v4l2src

Environment: Raspi v2 camera, Jetson Nano board, Ubuntu 18.04
I started with nvarguscamerasrc and it works:
gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM),width=3820, height=2464, framerate=21/1, format=NV12'! nvegltransform ! nveglglessink -e
Then I tried running these pipelines:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-h264, width=3280, height=2464' ! filesink
and also:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=640, height=480' ! filesink
and got this output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
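No answer appears above, but "not-negotiated" from v4l2src usually means the device cannot produce the requested caps. On the Jetson Nano the Raspi v2 (IMX219) sensor delivers raw Bayer data over /dev/video0 rather than H.264 or ordinary video/x-raw, which is why the nvarguscamerasrc path through the ISP works while v4l2src does not. A hedged capture-to-file sketch that stays on the working source (omxh264enc and the 1280x720 mode are assumptions based on typical JetPack setups):

```shell
# Sketch: capture through the ISP (nvarguscamerasrc), convert out of NVMM
# memory with nvvidconv, then encode and mux; -e sends EOS on Ctrl-C so the
# MP4 file is finalized. Encoder and mode choices are assumptions.
gst-launch-1.0 -e nvarguscamerasrc sensor_mode=0 ! \
  'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1,format=NV12' ! \
  nvvidconv ! 'video/x-raw,format=I420' ! \
  omxh264enc ! h264parse ! qtmux ! filesink location=capture.mp4
```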

Sink to the virtual v4l2 device

I have tried an example on Ubuntu 19.04
gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video10
But GStreamer fails:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Cannot identify device '/dev/video10'.
Additional debug info:
v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
system error: No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...
Why doesn't it work? I haven't found this in the documentation. Do I need to create /dev/video10 somehow?
I did the same for the default device /dev/video1, but it is an input camera device on my laptop:
sudo gst-launch-1.0 videotestsrc ! v4l2sink
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Device '/dev/video1' is not a output device.
Additional debug info:
v4l2_calls.c(639): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
Capabilities: 0x4a00000
Setting pipeline to NULL ...
Freeing pipeline ...
Thanks in advance.
The title of your question suggests you would like to write to a virtual video device. v4l2 devices can be either video input or video output devices. Your camera is a video input (capture) device, so directing a v4l2sink (an endpoint of the pipeline) at it will fail.
You can, however, create a virtual output device. What you are looking for is the v4l2loopback module. It lets you create a virtual /dev/video10 device like this:
modprobe v4l2loopback video_nr=10
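Putting the pieces together, an end-to-end sketch (the device number and the verification step are illustrative, not part of the original answer):

```shell
# Create a virtual output device numbered 10, feed it a test pattern in the
# background, then confirm the kernel exposes it as a V4L2 device.
sudo modprobe v4l2loopback video_nr=10
gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video10 &
v4l2-ctl --all -d /dev/video10
```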
Another possible solution for the same error message: recreate the v4l2loopback interface:
sudo rmmod -f v4l2loopback
sudo modprobe v4l2loopback
This might apply to others experiencing the error message from the original question, but who are already aware they need a v4l2loopback device as gstreamer sink.
When trying to stream video with GStreamer to a v4l2loopback device I had previously streamed to using ffmpeg, I got the same error message:
Device '/dev/video0' is not a output device.
Investigation
When comparing the state of a working loopback video device and a non-working one (i.e. after writing to it with ffmpeg) with v4l2-ctl --all -d 0 using diff, I found the following difference:
--- working 2020-11-19 18:03:52.499440518 +0100
+++ non-working 2020-11-19 18:03:57.472802868 +0100
@@ -3,21 +3,18 @@
Card type : GPhoto2 Webcam
Bus info : platform:v4l2loopback-000
Driver version : 5.9.8
- Capabilities : 0x85208002
- Video Output
+ Capabilities : 0x85208000
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Device Capabilities
- Device Caps : 0x05208002
- Video Output
+ Device Caps : 0x05208000
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Priority: 0
-Video output: 0 (loopback in)
Format Video Output:
Width/Height : 960/640
Pixel Format : 'YU12' (Planar YUV 4:2:0)
Somehow that "Video Output" capability is required for GStreamer to work successfully, and my previous ffmpeg call took it away.
The behaviour only occurred when I loaded the v4l2loopback module with the exclusive_caps=1 option, see 1.
The solution was to reload the v4l2loopback kernel module: forcefully remove it and add it again using rmmod / modprobe (see above).
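The recovery sequence can be sketched in one go (exclusive_caps=1 mirrors the module option described above; device number 0 and the grep check are illustrative):

```shell
# Forcefully reload the module with the same option as before, then verify
# the "Video Output" capability is listed again before pointing a GStreamer
# v4l2sink at the device. Device number 0 is an example.
sudo rmmod -f v4l2loopback
sudo modprobe v4l2loopback exclusive_caps=1
v4l2-ctl --all -d 0 | grep -A 6 'Capabilities'
```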

GStreamer Example from Official Tutorial does not Run on Ubuntu 18.04 with GStreamer 1.14.1

I installed a fresh version of Ubuntu 18.04, and I am trying to demux a single channel from a Matroska file by running the exact command from the GStreamer documentation (https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html).
GStreamer is current:
gst-launch-1.0 --version
gst-launch-1.0 version 1.14.1
GStreamer 1.14.1
https://launchpad.net/distros/ubuntu/+source/gstreamer1.0
The problem is that GStreamer keeps complaining:
GstMatroskaDemux:d: Delayed linking failed
Error:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:d: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:d:
failed delayed linking pad video_00 of GstMatroskaDemux named d to some pad of GstMatroskaMux named matroskamux0
ERROR: from element /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
How can I get this to work?
I remembered that WebM uses the Matroska container format. The example in the documentation is faulty/out of date: the pad naming is wrong. Instead of video_00 use video_0:
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_0 ! matroskamux ! filesink location=sintel_video.mkv
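If you're unsure which pad names a demuxer will expose, the pad templates shown by gst-inspect-1.0 give the pattern (a quick check, not from the original answer):

```shell
# matroskademux advertises sometimes-pads following the video_%u / audio_%u
# templates, so the first video pad is video_0 (not video_00, as older
# tutorials suggest). Inspect the element to see the templates:
gst-inspect-1.0 matroskademux
```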

Could not initialize Xv output

I have installed gstreamer-1.0 on my target board.
When I run the below command:
gst-launch-1.0 filesrc location="/home/test.mp4" ! decodebin ! videoconvert ! xvimagesink
I am getting the following output:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1765): gst_xvimagesink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
Setting pipeline to NULL ...
Freeing pipeline ..
Please suggest a solution to make the pipeline work.
Regards,
Sainath
Hmm, do you have libxv somewhere?
What happens when you check gst-inspect-1.0 xvimagesink?
The inspect output also mentions the library path (pipe it through | grep Filename); you can then run ldd on it:
ldd /usr/local/lib/gstreamer-1.0/libgstxvimagesink.so
I had some problems with this as well; try installing libxv-dev or libxv1 (dev is probably safer).
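"Could not open display (null)" also suggests the DISPLAY environment variable is simply unset, so the sink has no X server to talk to. A minimal sketch, assuming an X server is actually running on the target and ":0" is its display number:

```shell
# "(null)" in the error points at an unset DISPLAY; export it before
# launching (":0" is an example), or fall back to autovideosink, which
# picks a working sink and does not strictly require Xv support.
export DISPLAY=:0
gst-launch-1.0 filesrc location="/home/test.mp4" ! decodebin ! videoconvert ! autovideosink
```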

How to create an indexed video file with Gstreamer

I'm trying to use Gstreamer to create a seekable (indexed) video file in Linux. My pipelines work for recording and saving the data, but I can't figure out how to index the data so I can seek using gst_element_seek_simple() [http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+4%3A+Time+management]
I have seen this post: Gstreamer video output position tracking and seeking and validated I am sending an EOS on the pipeline with -e.
Here is my pipeline and output. I'm teeing it so it both displays on my embedded system's screen and saves to the M4V file.
# gst-launch-0.10 -e v4l2src ! \
tee name=t ! \
queue ! \
video/x-raw-yuv,width=320,height=240 ! \
videoflip method=clockwise ! \
ffmpegcolorspace ! \
fbdevsink t. ! \
queue ! \
ffmpegcolorspace ! \
ffenc_mpeg4 ! \
filesink location=output.m4v
Here is the output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
WARNING: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not get parameters on device '/dev/video0'
Additional debug info:
v4l2src_calls.c(235): gst_v4l2src_set_capture (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
system error: Inappropriate ioctl for device
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
(gst-launch-0.10:534): GLib-CRITICAL **: Source ID 62 was not found when attempting to remove it
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 10057977251 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
And here is the output of gst-discoverer on my new file:
beaglebone:# gst-discoverer-0.10 output.m4v
Analyzing file:///output.m4v
Done discovering file:///output.m4v
Topology:
video: MPEG-4 Video
Properties:
Duration: 0:00:00.000000000
Seekable: no
Thanks
You need to store the result in a seekable/indexed format. For that you can put the MPEG-4 video inside a container such as MP4 or Matroska. Use "! mp4mux ! filesink" or "! matroskamux ! filesink" to wrap it in one of those formats, which should make it seekable.
Side notes: GStreamer 0.10 is over 2 years obsolete and unmaintained; please upgrade to 1.0.
http://gstreamer.freedesktop.org/ is the official GStreamer website, and you will find the releases for the 1.x versions there. The gstreamer.com website is not related to the official project; if you read the text on gstreamer.com, you will see that you should be using the official repository and installers.
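Putting both suggestions together, a 1.0-era sketch of the recording branch (element names and caps are assumptions about the target; x264enc requires the gst-plugins-ugly set):

```shell
# Hypothetical 1.0 rewrite of the recording branch: encode and mux into
# Matroska so the file is indexed and seekable; -e forces EOS on Ctrl-C so
# the muxer can finalize the index on shutdown.
gst-launch-1.0 -e v4l2src ! \
  video/x-raw,width=320,height=240 ! \
  videoconvert ! x264enc ! matroskamux ! filesink location=output.mkv
```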
