For SRS SaaS, the DVR output is HLS (m3u8), as mentioned here https://github.com/ossrs/srs/issues/2856 and here: https://mp.weixin.qq.com/s/UXR5EBKZ-LnthwKN_rlIjg.
The same idea was also discussed recently in https://www.bilibili.com/video/BV1234y1b7Pv?spm_id_from=333.999.0.0 (around timestamp 9:50): for SRS SaaS, the DVR output is HLS (m3u8).
Question: can we also DVR an RTMP stream into HLS (m3u8) in SRS? Only the MP4 and FLV options are discussed in the wiki: https://github.com/ossrs/srs/wiki/v4_EN_DVR
The answer: SRS supports DVR to FLV/MP4 files, and you can also use HLS as DVR, because what DVR does is convert RTMP to files such as FLV/MP4/HLS.
If you only want a recording of the live stream, you can simply use the DVR feature of SRS, and you will see the files being generated. It works like this:
OBS --RTMP--> SRS --DVR--> FLV/MP4 file
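As a sketch, the DVR section of srs.conf looks something like the following; the directives match the SRS DVR wiki, but the path and plan values here are only illustrative:

```
# conf/srs.conf -- minimal DVR sketch (values illustrative)
vhost __defaultVhost__ {
    dvr {
        enabled      on;
        # session: one file per publish session; segment: split by duration
        dvr_plan     session;
        dvr_path     ./objs/nginx/html/[app]/[stream].[timestamp].flv;
    }
}
```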
But you can also use HLS to DVR the live stream, which is a more complex but more powerful way. For example, if you stop publishing, adjust the encoder parameters (or switch encoders), then continue publishing, how do you DVR it into one file?
If you use the DVR of SRS, you will get multiple files, because each stream is converted to a file, and DVR starts a new file whenever a new publish begins.
If you use HLS, you need to write a backend server that receives the on_hls callback; there you decide whether to append to the previous m3u8 or start a new one. It's controlled by your backend server, and because you must write a backend server, it's more complex. It works like this:
OBS --RTMP--> SRS --HLS--> m3u8/ts file
               +
               +--on_hls--> Your Backend Server
                            (HTTP Callback)
There is an example of how to use HLS to convert RTMP to a VoD file; please read srs-cloud for details.
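A minimal sketch of such a backend in Python is below. The payload field names (app, stream) follow the SRS HTTP-callback docs, but treat them as assumptions and check them against your SRS version; the "one playlist per stream" policy is just an example of a decision your server could make.

```python
# Minimal sketch of a backend for SRS's on_hls HTTP callback.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Map "app/stream" -> name of the VoD playlist we keep appending to.
sessions = {}

def handle_on_hls(payload):
    """Decide whether a new ts segment extends an existing recording or
    starts a new one. Returns the playlist name we append to."""
    key = f"{payload['app']}/{payload['stream']}"
    if key not in sessions:
        # First segment for this stream: start a new recording playlist.
        sessions[key] = f"{payload['stream']}-vod.m3u8"
    return sessions[key]

class OnHlsHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        handle_on_hls(json.loads(body))
        # SRS treats a response of 0 (or {"code": 0}) as success.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"0")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8085), OnHlsHandler).serve_forever()
```

Point the on_hls directive of your vhost at this server; any non-success response makes SRS treat the callback as failed.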
Related
I don't know how to get started with this.
What I am trying to do is get a video + audio stream from the front-end and host the live stream as MP4 that's accessible in the browser.
I was able to find information on WebRTC, socket.io, and RTMP, but I'm not really sure which tool to use or what's best suited for something like this.
Also, a follow-up question: my front-end is an iOS app, so in what format would I send the live stream to the server?
It depends on which live streaming protocol you want the player to play. As @Brad said, HLS is the most common protocol for players.
Note: Besides HLS, a native iOS app can use fijkplayer or FFmpeg to play any live streaming format, like HLS, RTMP or HTTP-FLV, even MKV. However, the most straightforward solution is HLS: you only need a video tag to play MP4 or HLS. MSE is also an option, using flv.js/hls.js to play live streams on iOS/Android/PC; this post covers these protocols.
The stream flow is like this:
FFmpeg/OBS ---RTMP--->--+
                        +--> Media Server ---> HLS/HTTP-FLV ---> Player
Browser ----WebRTC--->--+
The protocol used to push to the media server (or to receive in your node server) depends on your encoder, either RTMP or H5 (WebRTC):
For RTMP, you could use FFmpeg or OBS to push the stream to your media server.
If you want to push the stream from H5, the only way is WebRTC.
The media server converts between the publisher's protocol and the player's protocol, which are different in live streaming right now (as of 2022.01); please read more in this post.
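For the RTMP publishing side, a hypothetical helper that builds the FFmpeg command line for pushing a source file to the media server can be sketched as follows; the rtmp://host/app/stream URL layout is the common SRS/nginx-rtmp convention and is an assumption here:

```python
# Sketch: build the FFmpeg argv for pushing a source to an RTMP media server.
def build_push_cmd(source, rtmp_url):
    return [
        "ffmpeg",
        "-re",            # read input at its native frame rate (for files)
        "-i", source,
        "-c", "copy",     # no transcoding; assumes H.264/AAC input
        "-f", "flv",      # RTMP carries an FLV-framed stream
        rtmp_url,
    ]

cmd = build_push_cmd("input.mp4", "rtmp://localhost/live/livestream")
# Run it with: subprocess.run(cmd)
```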
I'm using Cloudflare as the video streaming provider for a project. I'm trying to pre-fetch multiple videos on a mobile device, so using HLS (with its larger chunk size) is impacting performance; this is why I would like to request the video be sent using DASH. Here, the Cloudflare team writes: "Cloudflare uses two standards for adaptive streaming: HLS and MPEG-DASH".
Every GET request to the video has yielded a stream with HLS. Is there any way to request DASH given my Cloudflare video id?
Typically a video origin server and CDN will serve the stream that best matches a device's capabilities; usually this is triggered by the device requesting either an HLS or an MPEG-DASH stream, the two most popular streaming formats today.
Cloudflare Stream should provide you URLs to both an HLS manifest and a DASH manifest automatically; they should look something like:
MPEG-DASH: https://videodelivery.net/VIDEOID/manifest/video.mpd
HLS: https://videodelivery.net/VIDEOID/manifest/video.m3u8
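A tiny sketch that derives both manifest URLs from a video id, assuming the Cloudflare Stream URL layout shown above (VIDEOID is a placeholder; verify the exact pattern against your dashboard):

```python
# Sketch: derive DASH and HLS manifest URLs from a Cloudflare Stream video id.
BASE = "https://videodelivery.net"

def manifest_urls(video_id):
    # Same video, two manifests; the player picks the format it supports.
    return {
        "dash": f"{BASE}/{video_id}/manifest/video.mpd",
        "hls": f"{BASE}/{video_id}/manifest/video.m3u8",
    }

urls = manifest_urls("VIDEOID")
```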
I am building a Real Time Messaging Protocol (RTMP) parser and collecting the video/audio data from the RTMP packets. To play the video in a player, I need to know the container format as well as the codec used. From the video data in the RTMP packets I know the codec used (e.g. On2 VP6), but I don't know how to determine the container of the audio/video stream I am receiving. Should I assume that RTMP supports only the FLV container, or is it possible to get audio/video packets from other container formats? If yes, how do I determine the container type from the information present in the RTMP packet? Adobe's RTMP specification does not provide any information about the container of the audio/video data. Any help on this? I've been stuck here for quite some time.
The question is slightly mis-framed.
RTMP is a transport protocol that includes its own framing inside.
Technically it is not correct to say that RTMP carries FLV, because FLV has two layers of encapsulation and RTMP carries only the bottom layer.
So, it is right to say that RTMP can transfer only those codecs that FLV can, but it is not 100% right to say that RTMP transfers FLV.
Adobe's specification of RTMP was created not for developers but for a legal dispute with Wowza, so it is not written to help you understand what is happening. Read the sources of red5, crtmp or some other RTMP server; they are rather easy to understand.
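To make the "two layers" point concrete, here is a sketch that parses the 11-byte FLV tag header, the framing layer that RTMP does not transmit as-is (RTMP carries the same type/timestamp/stream information in its own message header instead). The byte layout follows the FLV file format spec:

```python
# Sketch: parse an 11-byte FLV tag header.
# Layout: TagType(1) DataSize(3) Timestamp(3) TimestampExtended(1) StreamID(3)
def parse_flv_tag_header(data):
    tag_type = data[0] & 0x1F                       # 8=audio, 9=video, 18=script data
    data_size = int.from_bytes(data[1:4], "big")    # payload bytes that follow
    timestamp = int.from_bytes(data[4:7], "big") | (data[7] << 24)
    stream_id = int.from_bytes(data[8:11], "big")   # always 0 in FLV files
    return {"type": tag_type, "size": data_size,
            "timestamp": timestamp, "stream_id": stream_id}

# A video tag (type 9), payload 5 bytes, timestamp 1000 ms:
hdr = bytes([9]) + (5).to_bytes(3, "big") + (1000).to_bytes(3, "big") \
      + b"\x00" + b"\x00\x00\x00"
```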
I need to be able to take the input video from a client-side camera and convert it to an RTMP stream to be viewed live by other clients as part of a website application I am building. What libraries exist that could help facilitate this?
I'm using MediaElement to play an audio MP3 stream,
and everything works fine, but now I have an MP3 stream whose URL does not end with .mp3
( http://server2.fmstreams.com:8011/spin103) and I'm getting
AG_E_NETWORK_ERROR
I found a solution suggesting to add ?ext=mp3, but it didn't work for me. Any ideas?
If you are streaming live radio, the stream may be encoded by an IceCast or ShoutCast server. To read these streams, you will need to decode the stream in memory and pass it to the MediaElement once it has been decoded.
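For reference, the ShoutCast/IceCast in-stream metadata framing those servers use can be sketched as follows (language-agnostic illustration, not Silverlight code): when you request with an Icy-MetaData: 1 header, the server's icy-metaint response header tells you how many audio bytes separate each metadata block, whose first byte is its length in 16-byte units.

```python
# Sketch: split one ShoutCast/IceCast chunk into MP3 audio and inline metadata.
def parse_icy_chunk(buf, metaint):
    """Return (audio_bytes, stream_title or None) for one metaint-sized chunk."""
    audio = buf[:metaint]
    meta_len = buf[metaint] * 16           # length byte counts 16-byte units
    meta = buf[metaint + 1 : metaint + 1 + meta_len].rstrip(b"\x00").decode("latin-1")
    title = None
    if meta.startswith("StreamTitle='"):
        title = meta[len("StreamTitle='"):].split("';", 1)[0]
    return audio, title
```

The audio bytes go to your MP3 decoder; the metadata block is where the "now playing" title lives.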
Have a look at Mp3MediaStreamSource (http://archive.msdn.microsoft.com/ManagedMediaHelpers)
and "Audio output from Silverlight".
I lost tons of time on this, and this is the best solution I found so far.
You also have to make sure that while you are testing, the device is unplugged from the computer.