I have looked through the examples for creating a live stream on Azure, but this leaves you with a URL where the stream can be viewed. I'm wondering if there's any way to push the live stream to another service (such as twitch.tv or YouTube live streaming)?
Currently, live streaming doesn't support pushing to another service.
To clarify: I have a website hosted in Azure. I want to add a 1.5-minute how-to video. I can't imagine it will get viewed more than a few tens or hundreds of times a month (maybe a few thousand if the site takes off).
I was planning on using Azure Media Player to play the video.
In relation to this I thought the video would sit in a streaming endpoint.
But this seems an expensive way of doing this. Are there better ways (especially cheaper)?
EDIT: is it possible to host the video elsewhere and have it embedded in Azure?
The cheap way to do this would be to place the video in blob storage and then play it from a web page.
There is a video explaining how to do this here: https://www.youtube.com/watch?v=qmzns7PgP0A
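For instance, a minimal web page can point an HTML5 `<video>` element straight at the blob's URL. The storage account and container names below are placeholders; you would substitute your own (and make the container or blob publicly readable, or use a SAS URL):

```html
<!-- Hypothetical storage account ("mystorageaccount") and container ("videos"). -->
<video controls width="640">
  <source src="https://mystorageaccount.blob.core.windows.net/videos/howto.mp4"
          type="video/mp4">
  Your browser does not support the video tag.
</video>
```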
I would recommend using Azure Media Services: a video-on-demand and content delivery service that you manage through an Azure Media Services application in the Azure portal.
Azure Media Services lets you deliver any media, on virtually any device, to anywhere in the world using the cloud. The collection of services provide encoding, live or on-demand streaming, content protection and indexing for video and audio content.
The Windows Azure Media Services platform has four types of services: content uploading, encoding, content encryption, and streaming.
Media Service Pricing: https://azure.microsoft.com/en-in/pricing/details/media-services/
Additional information: Streaming Videos from Azure (Blob or Media Services)
I have an interesting scenario using Windows Azure to store video directly from IP cameras.
I know Azure Media Services is great for video streaming, but I don't know whether it's possible to use Media Services to store video from IP security cameras.
Does anyone here have experience with that?
Yes, it's possible if your camera or your streaming software supports RTMP or Fragmented MP4 as a streaming protocol.
How to get things ready:
In Azure Media Services, the entity that stores a live stream is a "Program".
Common live streaming scenario
The following steps describe tasks involved in creating common live streaming applications.
Connect a video camera to a computer. Launch and configure an on-premises live encoder that outputs a multi-bitrate RTMP or Fragmented MP4 (Smooth Streaming) stream, such as Wirecast. This step could also be performed after you create your Channel.
Create and start a Channel.
Retrieve the Channel ingest URL. The ingest URL is used by the live encoder to send the stream to the Channel.
Retrieve the Channel preview URL. Use this URL to verify that your channel is properly receiving the live stream.
Create a program. When using the Azure Management Portal, creating a program also creates an asset. When using the .NET SDK or REST API, you need to create an asset and specify it when creating a Program.
Publish the asset associated with the program. Make sure to have at least one streaming reserved unit on the streaming endpoint from which you want to stream content.
Start the program when you are ready to start streaming and archiving. Optionally, the live encoder can be signaled to start an advertisement. The advertisement is inserted in the output stream.
Stop the program whenever you want to stop streaming and archiving the event.
Delete the Program (and optionally delete the asset).
All of these tasks can be performed using the Azure Management Portal, .NET SDK, Java SDK, Azure Media Services REST API, etc.
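As a rough sketch, the lifecycle above reduces to a fixed sequence of operations. The function names below are hypothetical placeholders standing in for the corresponding portal/SDK/REST calls, not a real Azure SDK; the point is only to make the required ordering explicit:

```javascript
// Hypothetical placeholders for the Channel/Program lifecycle described above.
// Each "call" just records its name, so the required ordering is explicit.
const calls = [];
const step = (name) => () => { calls.push(name); };

const createChannel = step("createChannel"); // create and start a Channel
const getIngestUrl  = step("getIngestUrl");  // hand this URL to the live encoder
const getPreviewUrl = step("getPreviewUrl"); // verify the Channel receives the feed
const createProgram = step("createProgram"); // also creates/uses an asset
const publishAsset  = step("publishAsset");  // needs >= 1 streaming reserved unit
const startProgram  = step("startProgram");  // begin streaming and archiving
const stopProgram   = step("stopProgram");   // end the event
const deleteProgram = step("deleteProgram"); // clean up (optionally delete the asset)

[createChannel, getIngestUrl, getPreviewUrl, createProgram,
 publishAsset, startProgram, stopProgram, deleteProgram].forEach(f => f());
```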
More details here
I am working on a school project and using my MSDN subscription for Azure access. I have written a program that uploads MP4 recordings (video surveillance) from a private network to Azure storage on a scheduled basis.
I want to be able to view these MP4 files using the Azure Media Player. I will be the only one using this stream and it would only be on a very infrequent basis (while away on vacation). I played around with the Azure Media Services a bit and it seemed like the only way I could get an "endpoint" for the media player was to open a live streaming channel. Once I did that it gave me an endpoint which I put in the player and it played my video as expected. I turned in my project proposal to my professor based on this prototype and got it approved as my semester project (40% of my grade).
To my surprise, two days later I got an email alert saying that my Azure account had been shut down automatically because I exceeded the $50/month allocation. I was surprised, since the files I uploaded amounted to only 5 MB and I only downloaded them twice during my proof-of-concept work.
While reviewing my billing details, it appears all these charges came from the Media Services channel, and the cost appears to be based on the time the channel is "alive". 43 hours of this pretty much ate up my whole allotment for the month.
Here are my questions (keeping in mind I am a decent C# developer but completely green about all things Azure):
1) Am I going about this the right way? Do I need a live streaming channel to use the Azure Media Player?
2) If yes to the above, is there a way I can start/stop the live streaming service from code? In this way could I send a command to Azure to wake up the channel when viewing is needed then shut down when complete?
3) Is there some other html5 based media player I could use against Azure file storage so I bypass the live streaming channel and associated costs?
Thanks for any help. When I called Microsoft support all they could do was explain the billing to me and steered me here for technical support.
Based on your project description, you already have files encoded to MP4 and uploaded to Azure Media Services.
You don't need to create a streaming endpoint and start a live channel to play back your files. You need to instruct the system to generate a playback URL that will be accessible to you for a defined period of time, and use this URL in the player.
Example of code:
// Find the uploaded MP4 asset by name (note: == comparison, not = assignment).
IAsset asset = _mediaContext.Assets.Where(c => c.Name == "Your MP4 Asset").First();
// Create a read-only access policy valid for 15 minutes.
IAccessPolicy accessPolicy = _mediaContext.AccessPolicies.Create("Read15Min", TimeSpan.FromMinutes(15), AccessPermissions.Read);
// Create a SAS locator granting access to the asset's blobs.
ILocator locator = _mediaContext.Locators.CreateLocator("ReadOnlyLocator", LocatorType.Sas, asset, accessPolicy, startTime: null);
// Append the file name to the locator path to get a direct progressive-download URL.
var uriBuilder = new UriBuilder(locator.Path);
uriBuilder.Path += "/" + asset.AssetFiles.First().Name;
var fileUri = uriBuilder.ToString();
To test that it is actually working, simply go to http://aka.ms/azuremediaplayer and paste the value of fileUri into the URL input textbox.
If you are serious about security, I would recommend reading about Content Protection in Azure Media Services and dynamically encrypting content with token authentication. In my blog post I showed how you can integrate Azure AD authentication in your ASP.NET MVC app with Azure Media Services to protect your content.
You can point Azure Media Player directly at the MP4 file's "Absolute URI", found by selecting the file and clicking "Edit" at the bottom.
Previously I was using Amazon AWS; now I'm moving to the Microsoft BizSpark program, which lets me use Azure for free within certain limits. I'm new to Azure, and I want to set up RTMP live streaming with a CDN. I'm using FlowPlayer on my website and OBS software to broadcast my live stream. Can I use Azure CDN with RTMP live streaming?
Azure Media Services supports ingesting live feeds using RTMP and uses Dynamic Packaging to dynamically transmux live streams for delivery in MPEG-DASH, Microsoft Smooth Streaming, Apple HLS, or Adobe HDS formats.
You can use Azure Media Player and Wirecast instead of Flow Player and OBS Software. You can follow this article to setup the live streaming.
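Dynamic packaging works by appending a format token to the Smooth Streaming manifest URL exposed by the origin. The sketch below builds per-protocol playback URLs; the exact format strings are my assumptions based on the AMS v2 conventions, and the endpoint hostname is a placeholder, so verify both against your own streaming endpoint:

```javascript
// Build playback URLs for AMS dynamic packaging from a Smooth Streaming
// origin URL. The format tokens are assumptions from AMS v2 conventions.
function playbackUrls(smoothUrl) {
  // smoothUrl looks like: https://<endpoint>/<locatorId>/<name>.ism/manifest
  return {
    smooth: smoothUrl,
    hls:  smoothUrl + "(format=m3u8-aggregate)",
    dash: smoothUrl + "(format=mpd-time-csf)",
    hds:  smoothUrl + "(format=f4m-f4f)",
  };
}

const urls = playbackUrls(
  "https://example.streaming.mediaservices.windows.net/abc123/live.ism/manifest"
);
console.log(urls.hls);
// → https://example.streaming.mediaservices.windows.net/abc123/live.ism/manifest(format=m3u8-aggregate)
```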
I am in the middle of developing services that will deal with media files (audio/video). These services are responsible for uploading and then streaming media files uploaded by clients (iOS, Android, but not limited to these devices/platforms).
We are using node.js with mongodb as database. In the near future our services will be part of Azure. (Portions of our backend are already there in Azure)
In that context I came across Azure Media Services. I know that it doesn't have an SDK for Node.js, so my only option here is to use the Azure Media Services REST API.
My questions are:
1) What's the correct approach adopted by developers already handling this scenario? I am open to different approaches/practices here and to changing what I am currently planning to do.
2) If I use Azure Media Services, how would my clients call my services (Node.js), which act as a proxy for the Azure Media Services REST API? How exactly will this work: do I have the file in hand in my proxy and re-upload it, or should my services internally redirect the upload to Media Services?
3) How are media files uploaded to Media Services related to a record in MongoDB? For example, a record can have multiple media files.
I appreciate any pointers/explanations here.
Thanks,
To properly answer your question, a few things need to be clarified first.
1. What functionality are you going to provide on top of Azure Media Services? From your question it seems the main goal is to let users upload assets and then have the ability to stream the uploaded content.
For this purpose, you need to implement the following steps in Node.js:
Create asset and asset file records in Windows Azure Media Services (WAMS) by calling the REST API: http://msdn.microsoft.com/en-us/library/windowsazure/hh973617.aspx
Create an access policy and a locator, which gives you the URI of the blob storage location where the file needs to be uploaded (WAMS REST API).
Upload the file from Node.js to blob storage: http://www.windowsazure.com/en-us/develop/nodejs/how-to-guides/blob-storage/
Create an encoding job which will encode your input into multi-bitrate MP4 (WAMS REST API).
Package your multi-bitrate MP4 into Smooth or HLS format, or utilize the dynamic packaging feature in WAMS: http://channel9.msdn.com/Series/Windows-Azure-Media-Services-Tutorials/Introduction-to-dynamic-packaging
Once you are ready to stream your content, give the user's client a playback URL pointing to the origin server. To do this, call the WAMS REST API and create an origin locator.
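Since the WAMS REST calls are plain JSON over HTTP, they can be issued from Node.js with any HTTP client. The sketch below only builds (does not send) the request that creates an upload access policy; the endpoint, header versions, and permission values (Read = 1, Write = 2) are my assumptions from the v2 REST API, so check them against the linked documentation:

```javascript
// Build (but do not send) a WAMS v2 REST request that creates a write
// access policy for uploading. Header and body values are assumptions
// from the v2 REST API documentation.
function buildAccessPolicyRequest(bearerToken) {
  return {
    method: "POST",
    url: "https://media.windows.net/API/AccessPolicies",
    headers: {
      "Content-Type": "application/json;odata=verbose",
      "Accept": "application/json;odata=verbose",
      "DataServiceVersion": "3.0",
      "MaxDataServiceVersion": "3.0",
      "x-ms-version": "2.11",
      "Authorization": "Bearer " + bearerToken,
    },
    body: JSON.stringify({
      Name: "UploadPolicy",
      DurationInMinutes: "300",
      Permissions: 2, // assumed: 1 = Read, 2 = Write
    }),
  };
}

const req = buildAccessPolicyRequest("<token>");
```

You would hand `req` to your HTTP client of choice, then issue a similar POST to create the locator that returns the upload URI.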
Assets expose Id and AlternativeId properties, which you can use to map your content metadata to WAMS assets and implement any additional content-management logic.
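For the MongoDB question, a common pattern is simply to store the WAMS asset Id in the MongoDB document (and set the asset's AlternativeId to your record's id for the reverse lookup). A minimal sketch of such a document; all field names here are made up for illustration:

```javascript
// A made-up MongoDB document shape linking one record to several WAMS assets.
const videoRecord = {
  _id: "545a4c4e8a1b2c0001d3e111", // MongoDB ObjectId (shown as a string)
  title: "Lecture 1",
  assets: [
    { wamsAssetId: "nb:cid:UUID:aaaa-1111", role: "source" },  // uploaded MP4
    { wamsAssetId: "nb:cid:UUID:bbbb-2222", role: "encoded" }, // multi-bitrate output
  ],
};

// On the WAMS side, set each asset's AlternativeId to the Mongo _id so you
// can navigate in both directions.
const alternativeId = videoRecord._id;
```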
You need to act as a proxy if you have some user-based authentication and don't want to dedicate a separate Azure Media account to each user. WAMS provides basic building blocks for asset ingest, encoding, packaging, and on-demand delivery, and in the near future for live streaming.
It can be used as the foundation for your CMS system, or you can act as a SaaS provider by adding additional authentication and authorization layers. Currently you can use third-party offerings (http://www.ezdrm.com/) for playback DRM protection, or your own license server: http://msdn.microsoft.com/en-us/library/windowsazure/dn223278.aspx
I have a small solution, but I think it will require some work from you; maybe you won't like it that much. How about working with Windows Azure Mobile Services? It supports adding NPM packages now. The reason I am suggesting Windows Azure Mobile Services is that it will help you connect to your clients whatever application platform they use.
http://weblogs.asp.net/scottgu/archive/2013/06/14/windows-azure-major-updates-for-mobile-backend-development.aspx
To integrate between .NET and Node.js, you can start by using Edge.js or SignalR, I think.
http://www.infoq.com/articles/the_edge_of_net_and_node
http://www.asp.net/signalr
I just want to suggest an idea that might help work around the lack of Media Services support in Node.js: use blob storage for streaming. After all, Media Services is based on blob storage, I think. Here is a link that will guide you through using blob storage:
http://www.windowsazure.com/en-us/develop/nodejs/how-to-guides/blob-storage/
Here are also some questions posted before about how to stream from blob storage using Node.js; I hope you find them beneficial:
How to stream a Windows Azure Blob to the client using Node.js?
Getting contents of a streaming Blob to be sent to a Node.js Server
Here is another link that will help you do so: "Geo-Protection for Video Blobs Using a Node.js Media Proxy"
http://msdn.microsoft.com/en-us/magazine/dn198240.aspx
Just wanted to make sure that you have the Windows Azure Node.js SDK; you might find some solutions there that can help with the development of your application.
https://github.com/WindowsAzure/azure-sdk-for-node
I hope my answer helps you. Let me know if you need anything else.
I have more recent TypeScript-based samples for the AMS v3 API, using our latest JavaScript Node.js SDK, here:
https://github.com/Azure-Samples/media-services-v3-node-tutorials