Azure Media Services player

I have a feeling this is a really dumb question, but my research tells me I have to create my own player. Is that true?
I have a link (publish URL) from Azure Media services like this:
http://streamvideotest.streaming.mediaservices.windows.net/4ed49a08-f82d-462e-a05e-acea910064a5/7g91d5d8-b213-406c-90d8-75a3a5e2456d.ism/Manifest
I would like to hand this out to a few people so they can play the video, or the live feed on that channel. But you need some type of player, right? I've tried Windows Media Player (Open URL), but that always fails.

It depends on which platform you want to reach.
If you are trying to play a live stream on a PC using Smooth Streaming, you could use the Silverlight Player (http://smf.cloudapp.net/healthmonitor). Or if you want to stream DASH through a modern browser (IE or Chrome), you can do it natively through the HTML5 video tag.
If you are trying to reach the iOS platform, you can do it natively by delivering an HLS stream: append (format=m3u8-aapl) to your link above.
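As a small illustration (a sketch only; the (format=mpd-time-csf) suffix for DASH comes from one of the related questions below and is an assumption about your streaming endpoint), the same publish URL can be rewritten per protocol like this:

```typescript
// Sketch: deriving per-protocol delivery URLs from the AMS publish (streaming) URL.
// Assumes dynamic packaging is enabled on the streaming endpoint.
const manifestUrl =
  "http://streamvideotest.streaming.mediaservices.windows.net/" +
  "4ed49a08-f82d-462e-a05e-acea910064a5/7g91d5d8-b213-406c-90d8-75a3a5e2456d.ism/Manifest";

const smoothUrl = manifestUrl;                          // Smooth Streaming (Silverlight player)
const hlsUrl = `${manifestUrl}(format=m3u8-aapl)`;      // HLS for iOS/Safari
const dashUrl = `${manifestUrl}(format=mpd-time-csf)`;  // MPEG-DASH for HTML5 video/MSE players

console.log({ smoothUrl, hlsUrl, dashUrl });
```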
This is an article that describes the different players you could use on different platforms: https://msdn.microsoft.com/en-us/library/azure/dn223283.aspx.
As said, Azure Media Services has just rolled out Azure Media Player, which could detect the capability of the platform and feed in right streaming protocols by using the right player technology. please check out http://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/.
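For a browser page, a minimal embedding sketch might look like the following (assuming the azuremediaplayer.min.js/.css files from the announcement post are already loaded and a video element with id "azuremediaplayer" and class "azuremediaplayer amp-default-skin" exists; check the announcement and docs for the exact setup):

```typescript
// Minimal sketch of wiring up Azure Media Player; not a definitive integration.
declare const amp: any; // Azure Media Player exposes a global factory function

const player = amp("azuremediaplayer", { autoplay: false, controls: true });
player.src([
  {
    // The plain publish URL; AMP detects the platform and picks HLS/DASH/Smooth for you.
    src: "http://streamvideotest.streaming.mediaservices.windows.net/4ed49a08-f82d-462e-a05e-acea910064a5/7g91d5d8-b213-406c-90d8-75a3a5e2456d.ism/Manifest",
    type: "application/vnd.ms-sstr+xml",
  },
]);
```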
Thanks,
Mingfei Yan

The Azure Media Player has officially been rolled out at: http://amsplayer.azurewebsites.net/
Of course, Mingfei is the Queen of Azure Media Services and the definitive source for answers to any and all AMS questions!

Just to add to this thread: as Mingfei mentioned, it depends on where you want your video to work. However, if you have a browser-based offering, we'd recommend starting with Azure Media Player.
http://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/

Related

Stream music from streaming platform (Deezer, Spotify, Soundcloud) to Web Audio API

Do any of you know a way to get the audio stream of a music platform and plug it into the Web Audio API?
I am building a music visualizer based on the Web Audio API. It currently reads sound from my computer's mic and renders a real-time visualization. If I play music loud enough, my viz works!
But now I'd like to move on and read only the sound coming from my computer, so that the visualization responds only to the music and not to other sounds such as people chatting.
I know I can buffer an MP3 file into that API and it would work perfectly. But in 2020, streaming music via Deezer, Spotify, SoundCloud, etc. is very common.
I know they all have an API, but they often offer an SDK where you cannot really do much more than "play" music. There is no easy access to the stream of audio data. Maybe I am wrong, and that is why I am asking for your help.
Thanks
The way to stream music into Web Audio is to use a MediaElementAudioSourceNode or MediaStreamAudioSourceNode. However, these nodes will output zeros unless you're allowed to access the data. This means you have to set the CORS property correctly on your end, and the server also has to allow the access through CORS.
A Google search will help with setting up CORS, but many sites won't allow access unless you have the right permissions. Then you are out of luck.
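As a minimal sketch (the track URL is a placeholder; the server must send the appropriate Access-Control-Allow-Origin header or the node outputs silence), the MediaElementAudioSourceNode route looks roughly like this:

```typescript
// Sketch: feeding a remote audio stream into Web Audio for visualization.
const audioCtx = new AudioContext();

const audio = new Audio();
audio.crossOrigin = "anonymous";              // set the CORS property on our end
audio.src = "https://example.com/track.mp3";  // placeholder; must be CORS-enabled

const source = audioCtx.createMediaElementSource(audio);
const analyser = audioCtx.createAnalyser();
source.connect(analyser);
analyser.connect(audioCtx.destination);

audio.play(); // usually needs to be triggered by a user gesture

// Pull frequency data each frame to drive the visualizer.
const bins = new Uint8Array(analyser.frequencyBinCount);
function draw(): void {
  analyser.getByteFrequencyData(bins);
  // ...render bins...
  requestAnimationFrame(draw);
}
draw();
```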
I found a "no-code" workaround. At least on Ubuntu 18.04, I am able to tell Firefox to take my speakers as the "microphone input".
You just have to select the right "mic" in the list when your browser asks for mic permission.
That solution is very convenient since I do not need to write platform-specific binding code to access the audio stream.
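Expressed in code, that workaround is just the standard getUserMedia path; the browser's permission prompt is where the user picks the loopback/"monitor" device (a sketch, not specific to any platform):

```typescript
// Sketch: visualize whatever input device the user picks in the mic permission dialog
// (on Linux that can be the "Monitor of ..." loopback source, i.e. the speakers).
async function analyserFromPickedInput(): Promise<AnalyserNode> {
  const audioCtx = new AudioContext();
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const source = audioCtx.createMediaStreamSource(stream);
  const analyser = audioCtx.createAnalyser();
  source.connect(analyser); // analyse only; not connecting to destination avoids a feedback loop
  return analyser;
}
```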

If I can't use WebRTC, what can I use right now for live streaming video

I'm working on a web app in Node.js to allow clients to view a live video stream via a unique URL that another client will broadcast from their webcam, e.g., http://myapp.com/thevideo
I understand that WebRTC is still not supported in enough browsers to be useful.
I would also like to save the video stream so it can be viewed later within the app.
Things get somewhat confusing as I try to narrow down a solution to make this work.
I would like some recommendations on proven solutions out there to make this work on desktop and mobile. Any hints would be great.
I'll make a quick suggestion based on the limited details. I would use ffmpeg to encode to HLS. This format will play back natively on iOS and Safari on Mac. For all other platforms, either provide an RTMP stream with a Flash front end, or use the commercial version of JW Player 6, which can play HLS. Or use a Wowza server to handle all of this for you.
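As a rough sketch only (the RTMP ingest URL, output path, and flags are placeholders to adapt to your source), the Node app could shell out to ffmpeg to produce the HLS playlist and segments:

```typescript
// Sketch: package an incoming live stream as HLS by spawning ffmpeg from Node.
import { spawn } from "child_process";

const ffmpeg = spawn("ffmpeg", [
  "-i", "rtmp://localhost/live/thevideo", // incoming broadcast (placeholder URL)
  "-c:v", "libx264",                      // H.264 video for broad device support
  "-c:a", "aac",
  "-f", "hls",
  "-hls_time", "6",                       // ~6 second segments
  "-hls_list_size", "0",                  // keep every segment so the stream can be replayed later
  "public/streams/thevideo.m3u8",
]);

ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk));
ffmpeg.on("close", (code) => console.log(`ffmpeg exited with code ${code}`));
```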

Chromecast support for Smooth Streaming with PlayReady

I'm aware that the developer preview of the Chromecast receiver does not fully support Smooth Streaming manifest URLs (see Update #1).
I have tested content from the Microsoft PlayReady(TM) Test Server - Smooth Streaming assets using the sample receiver app provided in the GitHub project.
Smooth Streaming Support
As expected, the manifest file does not work (see Update #1), but I was able to play individual .ismv files (only at low bitrates). When I use a higher bitrate, the video container stays black.
PlayReady Support
When I tried to play a PlayReady-protected low-bitrate .ismv file, I was expecting some sort of callback to MediaProtocolMessageStream.onKeyRequested(), but that did not happen. Here is my Android CustomMediaProtocolMessageStream implementation.
So, does anybody know how PlayReady or Widevine is supposed to work with Chromecast? I have seen that Netflix invokes some binary shell command when its app is loaded on the Chromecast, but I assume they worked with Google to accomplish this.
Additional SO Resources
How to play smooth streaming video in Chromecast?
Is it actually possible to play SmoothStreaming videos on Chromecast without using (format=mpd-time-csf)?
Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device
Update #1
Based on Les Vogel's answer, Smooth Streaming manifest files for adaptive bitrate streaming are supported by Chromecast. You need a custom player to handle them.
As far as I am aware, there are currently two JS players which can handle that, but I don't know if they will work on Chromecast (see the dash.js sketch after this list):
dash.js - By DASH Industry Forum (https://github.com/Dash-Industry-Forum/dash.js)
Microsoft HTML5 Player Framework - Part of Microsoft Media Platform (http://playerframework.codeplex.com/)
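For reference, this is roughly what playing a DASH rendition with dash.js looks like (current dash.js API; the manifest URL and the (format=mpd-time-csf) rewrite are assumptions about the asset, and whether this runs inside a Chromecast receiver is exactly the open question):

```typescript
// Sketch: attach dash.js to an HTML5 <video> element and point it at a DASH manifest.
declare const dashjs: any; // global exposed when dash.js is loaded via <script>

const video = document.querySelector("video") as HTMLVideoElement;
const player = dashjs.MediaPlayer().create();
player.initialize(video, "http://example.com/content.ism/manifest(format=mpd-time-csf)", true);
```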
Currently, you need to write your own media player to support adaptive bitrate streaming on Chromecast.
Unfortunately, the MS test server assets do not correctly provide a CORS header, which would be needed if you wrote a javascript player.
PlayReady and Widevine are both supported. We'll be providing additional documentation shortly.
EDIT: We announced the beta of the Cast Media Player Library today, 2/3/14 - it supports HLS, Smooth Streaming, and MPEG-DASH.
Yes, you can use "com.microsoft.playready" for PlayReady and "com.widevine.alpha" for Widevine.
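For what it's worth, those same key system strings can be probed with the standard EME API in a browser context; this is a generic sketch rather than the Cast Media Player Library's own configuration, and a real setup also needs a license server URL and license-request handling:

```typescript
// Sketch: check which DRM key systems the runtime supports via Encrypted Media Extensions.
const config: MediaKeySystemConfiguration[] = [{
  initDataTypes: ["cenc"],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
}];

for (const keySystem of ["com.microsoft.playready", "com.widevine.alpha"]) {
  navigator.requestMediaKeySystemAccess(keySystem, config)
    .then(() => console.log(`${keySystem} supported`))
    .catch(() => console.log(`${keySystem} not supported`));
}
```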

How to add live video streaming to a website?

Hi everyone! I am looking to create a website with a live video streaming feature.
I have done some research and read about some applications including Flash Media Live Encoder.
Can anyone please guide me on how to start with this? Thanks!
It really depends on your requirements.
Do you need live streaming for a big event or a small event (what is your bandwidth)?
Do you need to stream to different devices (desktop + mobile)?
Do you have to stream your desktop/webcam, or high-quality camera feeds through capture cards?
Are you flexible about using different operating systems?
Your question is too general. FMLE + FMS is a good solution, but FMS can be expensive.
Also take a look at Wowza.
If you just need a few live videos on your website, the solution is quite simple: Flash Media Live Encoder plus Flash Media Server is suitable.

Sending voice over the net and get it with HTML5 and mobile apps

I'm trying to put in place a basic streaming system from the browser.
The idea is to let the user stream audio live from his mic through the browser, and then allow others to listen to this stream in their browsers (desktop, mobile, etc.) and in iOS/Android apps.
I started doing some tests with the Red5 Server (which is a great free alternative to the Flash Media Server).
With this technology, I can publish a stream over RTMP (e.g., rtmp://myserver/myApp).
But the problem is that I can't find a way to read the published stream on other platforms (using the HTML5 video tag, on iOS, etc.).
As I failed to do that, my question is:
How can I let a user stream his voice over the net (using Flash or not) and then allow others to listen to that stream using lightweight technologies (HTML5) and mobile apps?
Thanks,
Regards
Looks like RED5 should be able to do what you want...
0.9.0 RC2 has the ability to:
Streaming Audio (MP3, F4A, M4A)
Recording Client Streams (FLV only)
Some links that may help:
http://osflash.org/pipermail/red5_osflash.org/2007-April/010400.html
http://www.red5chat.com/
Though not exactly what you're after, you could take a look at BigBlueButton, which is a web conferencing suite based on open source components (RED5 is one of them). It has a rather complex architecture, but they do have a Flash-based client you can take a look at.
