How do I capture an MPEG-DASH stream using Python3 OpenCV?

I have a URL that links to an MPEG-DASH stream (https://something.com/manifest.mpd). I would like to capture this stream so that I can work with the frames in OpenCV on Python 3, which I installed using pip3. How would I do this?
I have already tried cv2.VideoCapture(URL), but this does not work.
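For reference, this is roughly what the direct OpenCV attempt looks like. A minimal sketch, assuming an OpenCV build with FFmpeg support; explicitly requesting the FFmpeg backend sometimes helps with network streams, but whether a .mpd manifest opens at all depends on how your OpenCV/FFmpeg was compiled:

```python
import cv2

# The manifest URL is the placeholder from the question.
cap = cv2.VideoCapture("https://something.com/manifest.mpd", cv2.CAP_FFMPEG)
if not cap.isOpened():
    raise RuntimeError("This OpenCV build could not open the DASH manifest")

ok, frame = cap.read()
while ok:
    # ... process `frame` (a NumPy BGR array) here ...
    ok, frame = cap.read()
cap.release()
```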

You can try VidGear. It supports the MPEG-DASH format, but it has yet to incorporate Apple's HLS format. If you want a scalable solution, you can use AWS Elemental MediaConvert, which can convert your source files to formats such as m3u8 (HLS) or mpd (DASH). You can use AWS Elemental MediaLive to do the same for live streams.
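If you go the VidGear route, a minimal sketch using its CamGear reader might look like this (the manifest URL is a placeholder, and whether a given .mpd plays still depends on the underlying OpenCV/FFmpeg build and your VidGear version):

```python
import cv2
from vidgear.gears import CamGear

# Open the DASH stream and start the internal frame-grabber thread.
stream = CamGear(source="https://something.com/manifest.mpd").start()

while True:
    frame = stream.read()  # returns None when the stream ends or fails
    if frame is None:
        break
    cv2.imshow("DASH stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cv2.destroyAllWindows()
stream.stop()
```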

Related

Can anyone explain the correlation between MSE, DASH and HLS?

I am new to media streaming and just started learning about adaptive streaming.
I have a few queries; please clarify:
1. Does MSE support only DASH streaming? I mean, if a website uses DASH and my browser supports MSE with DASH, it will play; but if a website uses HLS, my browser does not play the video content even though it has MSE. Is it because MSE does not support HLS, or because my browser's MSE implementation does not handle HLS?
2. When I inspect a webpage playing a video stream, I see that many sites use a video tag with a blob in the "src" attribute. Does a blob mean the page is using MSE? Can we have a blob in the "src" attribute for DASH (as I saw on YouTube) and for HLS (as on Dailymotion or Twitch.tv) as well?
3. I was reading a few articles on Twitch.tv: does Twitch.tv only support HLS through the HTML5 player or Flash? If a browser supports neither Flash nor HLS through the HTML5 player, is there no way to play Twitch.tv content in the browser?
Thanks
Media Source Extensions (MSE) support anything you can de-mux in JavaScript and send to the browser's native codecs. Browsers don't support DASH natively. Some browsers support HLS natively, but most don't. It is possible to use both DASH and HLS in browsers that support MSE, given the correct JavaScript library for handling each.
The blob you see could be a regular blob (an immutable chunk of binary data), but more than likely it's coming from MSE.
I can't speak to what Twitch does internally.
Your questions don't really make sense as asked, so I can't answer 1, 2 and 3 directly, but I can clear up some of your confusion.
HLS and DASH are each a collection of technologies, not single competing technologies. Most HTTP streaming protocols are made up of a binary video format and a text-based manifest format. DASH uses an overly complex XML manifest format with a fragmented MP4 video format. HLS uses an m3u8 manifest with fragmented MPEG transport streams as the video format; as of iOS 10, HLS also supports fragmented MP4.
MSE can play fragmented MP4, but browsers don't read manifests. Hence a player application must be used to download and parse the manifest, download the video fragments, and then hand them to the browser to play. Twitch uses HLS with transport streams, but runs custom software in the browser to convert them to MP4 fragments (or FLV streams in the case of Flash). When you see a src with a blob, that is a normal (not fragmented) MP4, which is completely different. Safari is an exception: it can play HLS natively, using an m3u8 manifest as the source.
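To make the "player application" part concrete, here is a rough sketch of the loop every HLS/DASH player performs: fetch the text manifest, parse out the media segment URIs, then fetch the binary fragments. The URL is hypothetical, and real players also handle master playlists, live refreshes, encryption, and so on:

```python
from urllib.parse import urljoin
from urllib.request import urlopen

MANIFEST = "https://example.com/stream/index.m3u8"  # hypothetical media playlist

def segment_urls(manifest_url):
    """Yield absolute segment URLs from a simple m3u8 media playlist."""
    text = urlopen(manifest_url).read().decode("utf-8")
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):  # non-comment lines are segment URIs
            yield urljoin(manifest_url, line)

for url in segment_urls(MANIFEST):
    fragment = urlopen(url).read()  # one binary media fragment
    # a browser player would append this to an MSE SourceBuffer at this point
    print(url, len(fragment), "bytes")
```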

Pixelated video in Wowza RTSP streaming using Amazon S3

I am using the Wowza streaming server (VODS3) on Amazon EC2 with files in an S3 bucket.
Videos are played via RTSP.
RTSP is working fine, but the video quality is not good; the video gets pixelated at times.
I took Wowza from the AWS Marketplace.
Do I need to set any configuration or something else?
I would suggest trying the default sample.mp4 file that comes with your Wowza install. Upload it to your S3 account, play it back, and see if there is a difference. You might use ffmpeg to re-encode your custom files prior to uploading to S3 to ensure smooth playback (see the sketch below). If you are playing back on an Android device, you may also try HLS playback (depending on manufacturer and version).
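A sketch of that re-encode step, driven from Python for consistency with the rest of this page. The settings (H.264 baseline profile, a fixed ~2-second keyframe interval, +faststart) are common choices for smooth streaming playback, not Wowza-mandated values, and the filenames are placeholders:

```python
import subprocess

def reencode_for_streaming(src: str, dst: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-c:v", "libx264", "-profile:v", "baseline",
            "-g", "60", "-keyint_min", "60",   # fixed keyframe interval (~2 s at 30 fps)
            "-c:a", "aac", "-b:a", "128k",
            "-movflags", "+faststart",         # moov atom up front for quicker starts
            dst,
        ],
        check=True,
    )

reencode_for_streaming("input.mp4", "sample_streamable.mp4")
```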

How to implement real-time video encoding using libde265 on Linux

I have been reading a lot about the H.265 encoder, but I'm not really sure how to start a C or Python application that encodes a video stream in real time with H.265 using libde265. I have already installed the library, and I guess I could use OpenCV to get the input video stream from a USB camera. Has anyone worked on this type of application?
If you are not particular about using libde265, which I am not familiar with, please give GStreamer a shot. GStreamer has lots of plugins and examples for standard tasks like encoding a stream, and it also integrates well with native development.
I have worked on a project similar to yours, where we did H.264 encoding and decoding, along with a few other things, on a live camera feed.
You can find a video of a similar application here: https://www.youtube.com/watch?v=JcpkGDpfVU0
Just my two cents!
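One caveat worth knowing: libde265 is a decoder, so for encoding, GStreamer's x265enc element (backed by the x265 library) is the usual choice. Below is a minimal sketch of a camera-to-H.265 pipeline launched from Python via gst-launch-1.0; the device path, caps and bitrate are assumptions for illustration:

```python
import subprocess

# v4l2src grabs the USB camera, x265enc does the H.265 encoding,
# and the result is muxed into a Matroska file.
PIPELINE = (
    "v4l2src device=/dev/video0 ! "
    "videoconvert ! video/x-raw,width=640,height=480,framerate=30/1 ! "
    "x265enc bitrate=1000 speed-preset=ultrafast tune=zerolatency ! "
    "h265parse ! matroskamux ! filesink location=out.mkv"
)

# -e sends EOS on Ctrl-C so the output file is finalized cleanly.
subprocess.run(["gst-launch-1.0", "-e"] + PIPELINE.split(), check=True)
```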

Do Windows Azure live media encoders provide live transcoding?

I have a simple question: I want to stream live video + audio. I would like to use Windows Azure for that (mainly because it seems to provide HLS with AES protection, which I have not encountered in open-source solutions, and pricing per streaming user that is clear to managers). I am troubled by the following quote:
Currently, Media Services does not provide a live transcoding service.
You can use one of the following third party live encoders that output
RTMP or Smooth Streaming formats: Elemental, Envivio, Cisco and RGB
encoders output Smooth Streaming; Adobe Flash Live, Wirecast and
Teradek encoders output RTMP.
And a few lines after
You can deliver your live stream in any of the following formats:
Smooth Streaming, DASH and HLS. When doing live streaming, HLS is
packaged dynamically and the default HLS packaging ratio is 3 Smooth
fragments to 1 HLS segment (3:1).
...
Configure a live transcoder.
Every time you reconfigure the transcoder, call the Reset method on
the channel.
So no transcoding is provided, yet I shall set up a transcoder... What? How?
In FFmpeg there are two types of transcoding:
- from one encoded data format to another (say, PCM raw data to encoded MP3 frames)
- from one frame/packet container to another (say, MP4 frames of already-encoded audio/video to FLV frames with the same encoded data in them)
Are they trying to tell me that they provide frame repacking from RTMP to HLS, yet no live encoding into another compression type (say, from Speex audio to AAC)?
As I answered on your other post, you can use a tool like Wirecast 6 to encode your live stream and push it into an Azure ingest URL. We will give you a publish URL that can dynamically package the content into HLS, Smooth Streaming and DASH.
For more information, please refer to this post: http://azure.microsoft.com/blog/2014/09/10/getting-started-with-live-streaming-using-the-azure-management-portal/
Yes. The second type of transcoding you describe is better named transpackaging, because no video coding is done.
Transcoding is not provided. Transpackaging is provided.
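To illustrate the difference: transpackaging is what ffmpeg's stream copy does, rewrapping the already-encoded audio/video into a new container without touching the codec. A hedged sketch (filenames are placeholders), repackaging an MP4 into HLS segments:

```python
import subprocess

subprocess.run(
    [
        "ffmpeg", "-i", "input.mp4",
        "-c", "copy",                  # no decoding/encoding: repackage only
        "-f", "hls",
        "-hls_time", "6",              # target segment duration in seconds
        "-hls_playlist_type", "vod",
        "out/index.m3u8",
    ],
    check=True,
)
```

Swapping "-c copy" for real codec settings (say, re-encoding Speex audio to AAC) would be transcoding in the first sense, which is the part Azure did not provide at the time.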

Encoding WAV to FLAC and streaming through Node.js

I am trying to create an application which will take raw WAV audio as input and output FLAC.
Now, I need to stream the input and the output at the same time through Node.
Can someone guide me on how I can work this out?
Thanks
As far as I know, there is no possibility to do this "live" via the Internet with the FLAC format. Only AAC, MP3 and WAV are supported for live streaming.
You can have the client download the file; the client can then play it using his or her apps.
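For the encoding half of the question, one approach is to pipe the audio through ffmpeg so both input and output are streamed incrementally. Shown in Python for consistency with the rest of this page (the same command works from Node's child_process); the filenames are placeholders:

```python
import subprocess

# ffmpeg reads WAV from stdin ("pipe:0") and writes FLAC to stdout ("pipe:1"),
# so neither side needs to be a file on disk.
proc = subprocess.Popen(
    ["ffmpeg", "-f", "wav", "-i", "pipe:0", "-f", "flac", "pipe:1"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
)

with open("input.wav", "rb") as src:
    flac_bytes, _ = proc.communicate(src.read())

with open("output.flac", "wb") as dst:
    dst.write(flac_bytes)
```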
