How do I use HLS in Ant Media Server? - http-live-streaming

I am streaming via RTMP and have enabled HLS in the settings. However, I cannot manage to make the stream use HLS.
I have tried both of the following:
/LiveApp/name.m3u8
/LiveApp/streams/name.m3u8
but neither of these works. How can I force HLS output in Ant Media Server when streaming via RTMP?

To make HLS available, you need to broadcast explicitly like this:
"rtmp://SERVER_ADDR/LiveApp/name"
Then HLS will be available as you expect. For example:
"http://SERVER_ADDR:5080/LiveApp/streams/name.m3u8"

If you want to play an HLS stream that was published to the Ant Media server using WebRTC and you have adaptive streaming turned on, you need to pull the stream from the WebRTCAppEE path, not the LiveApp path. And if you are using adaptive streams, you need to suffix the stream ID with the resolution.
For instance, if it was broadcasting at 720p:
http://SERVER_ADDR:5080/WebRTCAppEE/streams/name_720p.m3u8
Note that you may also need to set the MIME type, which should be:
"application/x-mpegurl"

Related

Web Audio live streaming

There is an audio stream sent from a mobile device to the server, and the server sends chunks of data (over WebSockets) to the web client.
The question is: what should I use to play this audio live? There should also be a way to rewind, listen to what came before, and then switch back to live mode.
I considered options such as the Media Source API, but it's not supported by Safari or Chrome on iOS, is it? And we need that support.
There is also the Web Audio API, which is supported by modern browsers, but I'm not sure whether it can both play audio live and rewind.
Any ideas or guides on how to implement it?
I considered options such as the Media Source API, but it's not supported by Safari or Chrome on iOS, is it? And we need that support.
Then, you can't use MediaSource Extensions. Thanks Apple!
And the server sends chunks of data (over WebSockets) to the web client.
Without MediaSource Extensions, you have no way of using this data from a WebSocket connection. (Unless it's PCM, or you're decoding it to PCM, in which case you could use the Web Audio API, but this is totally impractical, inefficient, and not something you should pursue.)
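For completeness, here is roughly what that impractical PCM path would look like, assuming (hypothetically) that the server sends raw 32-bit float mono samples at 44100 Hz over the socket:

```typescript
// A sketch only: schedule raw PCM chunks from a WebSocket with the
// Web Audio API. The URL and sample format are assumptions.
const ctx = new AudioContext();
const ws = new WebSocket("wss://example.com/audio");
ws.binaryType = "arraybuffer";

let playhead = ctx.currentTime;

ws.onmessage = (event: MessageEvent<ArrayBuffer>) => {
  const samples = new Float32Array(event.data);
  const buffer = ctx.createBuffer(1, samples.length, 44100);
  buffer.copyToChannel(samples, 0);

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);

  // Schedule chunks back to back; any network jitter becomes a glitch.
  playhead = Math.max(playhead, ctx.currentTime);
  source.start(playhead);
  playhead += buffer.duration;
};
```

Rewinding would mean buffering every chunk yourself, which is exactly why the options below are a better fit.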
You have to change how you're streaming. You have a few choices:
Best Option: HLS
If you switch to HLS, you'll get the compatibility you need, as well as the ability to go back in time and what not. This is what you should do.
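The going-back-in-time part works because a live HLS playlist with a long enough sliding window gives the player a seekable range. A minimal sketch of rewinding and returning to live on a video element that already has the HLS stream attached (the element id is hypothetical):

```typescript
const video = document.querySelector<HTMLVideoElement>("#player")!;

// Jump back a number of seconds within the seekable DVR window.
function rewind(seconds: number): void {
  const { seekable } = video;
  if (seekable.length > 0) {
    video.currentTime = Math.max(seekable.start(0), video.currentTime - seconds);
  }
}

// Return to the live edge.
function goLive(): void {
  const { seekable } = video;
  if (seekable.length > 0) {
    video.currentTime = seekable.end(seekable.length - 1);
  }
}
```

How far back you can seek is controlled server-side by how many segments the playlist retains.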
Mediocre Option: HTTP Progressive
This is a fine way to stream for most use cases but there isn't any built-in way to handle the stream seeking that you want. You'd have to build it, which is not worth your time since you could just use HLS.
Even More Mediocre Option: WebRTC
You could switch to WebRTC for streaming, but it comes with greatly increased infrastructure costs and complexity. And you still need to figure out how you're going to handle seeking. The only reason to go the WebRTC route is if you absolutely need the lowest latency.

Convert RTMP to RTSP & Stream

I am trying to stream the camera view from a React Native or Ionic app and then receive the output as an RTSP feed on my PC. I have been able to use Bambuser's SDK to stream to their servers, and from there I can output to RTMP. Unfortunately it doesn't seem to allow RTSP output.
Can anyone tell me the best way of receiving this RTMP stream & converting it to RTSP please? I can use the nginx RTMP module (https://hub.docker.com/r/tiangolo/nginx-rtmp/) but I am not sure how to convert to RTSP.
I've tried using Wowza for this instead, which works great, but there is an error when trying to use their SDK on the Vuzix Blade smart glasses I am using, so I am having to look for an alternative solution.
Thanks for any help.
I managed this by using MistServer in the end. It accepts a wide range of protocols including RTMP, and can output RTSP.
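If MistServer isn't an option, a common alternative is to have FFmpeg repackage the RTMP feed and push it to a standalone RTSP server such as MediaMTX. A minimal sketch of driving that from Node; all URLs are placeholders for your own setup:

```typescript
import { spawn } from "node:child_process";

// Pull the RTMP feed and re-publish it over RTSP without re-encoding
// (-c copy only repackages the existing audio/video streams).
// Assumes an RTSP server such as MediaMTX is listening on port 8554.
const ffmpeg = spawn("ffmpeg", [
  "-i", "rtmp://localhost/live/stream",
  "-c", "copy",
  "-f", "rtsp",
  "rtsp://localhost:8554/stream",
]);

ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk));
ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));
```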

How to play a RTP stream in HTML5?

The Janus server is able to relay the RTP stream.
Is there a way to play an RTP stream directly in an HTML5 video element?
(I don't really get the difference between RTP and RTSP)
And how can I play the RTP stream? Should I transcode it to something like HLS?
You don't; it's not supported in HTML5. I'd recommend transcoding it to DASH and/or HLS, using either open-source tools like FFmpeg or commercial solutions like Bitmovin.
WebRTC is supported in HTML5, so you can view the video in the browser.
Janus has a streaming plugin for RTSP/RTP, which receives RTSP/RTP data and then sends it to the web browser client using WebRTC.
https://janus.conf.meetecho.com/docs/streaming.html
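With the streaming plugin, the browser attaches to the plugin, asks to watch a mountpoint, and Janus negotiates WebRTC so the media lands in an ordinary video element. A rough sketch using the janus.js client library; the server URL, mountpoint id, and element id are placeholders, and newer janus.js versions replace onremotestream with onremotetrack:

```typescript
declare const Janus: any; // janus.js is loaded globally and ships no types

let handle: any;

Janus.init({
  callback: () => {
    const janus = new Janus({
      server: "wss://example.com/janus", // placeholder Janus endpoint
      success: () => {
        janus.attach({
          plugin: "janus.plugin.streaming",
          success: (h: any) => {
            handle = h;
            // Ask the plugin to send us mountpoint 1 (configured server-side).
            handle.send({ message: { request: "watch", id: 1 } });
          },
          onmessage: (_msg: any, jsep: any) => {
            if (jsep) {
              // Answer the plugin's offer; we only receive, never send.
              handle.createAnswer({
                jsep,
                media: { audioSend: false, videoSend: false },
                success: (answer: any) =>
                  handle.send({ message: { request: "start" }, jsep: answer }),
              });
            }
          },
          onremotestream: (stream: MediaStream) => {
            const video = document.querySelector<HTMLVideoElement>("#player")!;
            Janus.attachMediaStream(video, stream);
          },
        });
      },
    });
  },
});
```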

Windows Azure live media encoders provide live transcoding?

I have a simple question: I want to stream live video + audio. I would like to use Windows Azure for that (mainly because it seems to provide HLS with AES protection, which I have not encountered in open-source solutions, and pricing per streaming user that is clear for managers). I am troubled by this quote:
Currently, Media Services does not provide a live transcoding service. You can use one of the following third-party live encoders that output RTMP or Smooth Streaming formats: Elemental, Envivio, Cisco, RGB encoders output Smooth Streaming; Adobe Flash Live, Wirecast and Teradek encoders output RTMP.
And a few lines later:
You can deliver your live stream in any of the following formats: Smooth Streaming, DASH and HLS. When doing live streaming, HLS is packaged dynamically and the default HLS packaging ratio is 3 Smooth fragments to 1 HLS segment (3:1).
...
Configure a live transcoder.
Every time you reconfigure the transcoder, call the Reset method on the channel.
So no transcoding is provided, yet I am supposed to set up a transcoder... What? How?
In FFmpeg there are two types of transcoding:
from one encoded data format to another (say, raw PCM data to encoded MP3 frames)
from one frame/packet format to another (say, MP4 frames of already-encoded audio/video to FLV frames with the same encoded data in them)
Are they trying to tell me that they provide frame repacking from RTMP to HLS, but no live encoding into another compression format (say, from Speex audio to AAC)?
As I answered on your other post, you can use a tool like Wirecast 6 to encode your live stream and push it to the Azure ingest URL. We will give you a publish URL that can dynamically package the content into HLS, Smooth Streaming and DASH.
For more information, please refer to this post: http://azure.microsoft.com/blog/2014/09/10/getting-started-with-live-streaming-using-the-azure-management-portal/
Yes. The second type of transcoding you describe is better called transpackaging, because no video coding is done.
Transcoding is not provided. Transpackaging is provided.
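The difference is easy to see in FFmpeg terms: -c copy passes the compressed frames through untouched and only changes the container (transpackaging), while naming a codec forces a decode and re-encode (transcoding). A hypothetical side-by-side, spawned from Node:

```typescript
import { spawnSync } from "node:child_process";

// Transpackaging: copy the already-encoded streams into a new container.
// Only the packaging changes; nothing is decoded or re-encoded.
spawnSync("ffmpeg", ["-i", "input.mp4", "-c", "copy", "repackaged.flv"]);

// Transcoding: the audio is decoded and re-encoded to AAC, while the
// video is still only copied.
spawnSync("ffmpeg", ["-i", "input.mp4", "-c:v", "copy", "-c:a", "aac", "transcoded.flv"]);
```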

If I can't use WebRTC, what can I use right now for live streaming video

I'm working on a web app in node.js to allow clients to view a live video stream that another client broadcasts from their webcam, via a unique URL, e.g., http://myapp.com/thevideo
I understand that WebRTC is still not supported in enough browsers to be useful.
I would also like to save the video stream so it can be viewed later within the app.
Things get somewhat confusing as I try to narrow down a solution to make this work.
I would like to get some recommendations on proven solutions out there to make this work on desktop and mobile? Any hints would be great.
I'll make a quick suggestion based on the limited details. I would use FFmpeg to encode to HLS. This format will play back natively on iOS and in Safari on Mac. For all other platforms, either provide an RTMP stream with a Flash front end, or use the commercial version of JW Player 6, which can play HLS. Or use a Wowza server to handle all of this for you.
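A minimal sketch of that FFmpeg-to-HLS step, spawned from the node.js app; the input URL and output path are placeholders:

```typescript
import { spawn } from "node:child_process";

// Encode an incoming RTMP feed to HLS: H.264 video, AAC audio,
// 4-second segments, keeping the last 6 segments in the playlist.
const ffmpeg = spawn("ffmpeg", [
  "-i", "rtmp://localhost/live/stream",
  "-c:v", "libx264",
  "-c:a", "aac",
  "-f", "hls",
  "-hls_time", "4",
  "-hls_list_size", "6",
  "public/live/stream.m3u8",
]);

ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));
```

Serve the output directory over HTTP and point players at the .m3u8; iOS and Safari will play it natively.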
