I am using Azure Media Services to live stream an event. My source is an HD SDI feed captured via an AJA Kona LHi card and sent to Azure using Wirecast.
I'm using Wirecast's default settings for Azure Media Services, with standard encoding and the 720p preset, and the Azure Media Player embedded in a private site.
Everything seems to work; however, iOS devices are unable to AirPlay the stream. The video plays correctly on the device, but no AirPlay controls are available.
If I use the exact same setup to stream my webcam, the AirPlay controls are available. Is there some specific configuration required to make this work reliably?
On iPad, Azure Media Player streams videos in DASH by default. You need to specify the techOrder to give HLS a higher priority, e.g.:
var myOptions = {
    techOrder: ["html5", "azureHtml5JS", "flashSS"],
    autoplay: false,
    controls: true,
    width: "640",
    height: "400"
};
var myPlayer = amp("azuremediaplayer", myOptions);
If you don't see the AirPlay button on the video player, you can swipe down on the upper-right region of the screen to bring up the "Quick controls" and access AirPlay from there.
If this still doesn't work, reply with the URL of your webpage and I can take a look.
Related
I've got a link from the Tuya API Explorer using the "IoT Video Live Stream" service. I want to know where I can use this link to stream my camera's video. I can see the video in my Tuya app, but I want to use this link directly.
Here's an example of the API response:
{"result": {
"url": "rtsps://eb3ba0.....aa418b.....:4UBYKMX9T.....T0#aws-tractor1.tuyaus.com:443/v1/proxy/echo_show/b6721582601.....b46ac1b71.....bbb3d*****bf7.....},"success": true,"t": 1642462403630}
For some time now, we have been trying to upload videos to Azure Media Services and watch them on mobile. This works fine on a PC, but beyond that the situation is baffling.
We upload a video to Azure Media Services using the .NET API.
We can watch those videos in our own Azure Media Player page, but NOT from the Azure portal (there is an option there to watch videos), nor in the Azure Media Player sample viewer.
So we don't know whether the problem is in the Azure portal, in Azure Media Player, or in how we upload the video (create asset, encode, create locator and policy...).
This is one of my videos: http://media6franquiciasworldw.streaming.mediaservices.windows.net/e70ca01a-0be8-4f54-911c-6f4b85c0d396/12_mixtaSaltamontes.ism/manifest
This is my code:
// Create the asset from a file
IAsset inputAsset = _context.Assets.CreateFromFile(video.PathFile, AssetCreationOptions.StorageEncrypted);
// Encode the video: transform the first asset into another one that will actually be streamed.
// A preset (JSON/XML) defined in video.Enconder is used.
IAsset encodedAsset = EncodeToAdaptiveBitrate(inputAsset, AssetCreationOptions.None, video.Enconder, video.GetAssetName(), video);
// If I use "AssetDeliveryProtocol.All", it throws an error: "Account is not enabled for HDS streaming"
IAssetDeliveryPolicy policy = _context.AssetDeliveryPolicies.Create("Clear Policy", AssetDeliveryPolicyType.NoDynamicEncryption, AssetDeliveryProtocol.SmoothStreaming, null);
encodedAsset.DeliveryPolicies.Add(policy);
// Publish the output asset by creating an Origin locator for adaptive streaming
_context.Locators.Create(
    LocatorType.OnDemandOrigin,
    encodedAsset,
    AccessPermissions.Read,
    TimeSpan.FromDays(3650));
And here is my "Encoder" preset: https://pastebin.com/zQ8rS73c
I see a couple of issues here.
Do you have a Streaming Endpoint started and running in your account? Make sure of that first.
Don't use AssetDeliveryProtocol.All. There is an issue in the SDK where it tries to add in Adobe HDS, which we are dropping support for. You should enable only the specific protocols that you require for streaming in the delivery policy, so use the following pattern:
AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.Dash | AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.ProgressiveDownload
You were likely not getting any playback on iOS or Android clients because you set the policy to allow only SmoothStreaming, which is supported only on desktop or custom mobile clients. Adding DASH for Android and Apple HLS for iOS should help here.
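For reference, the policy creation from the question would then look something like this (a sketch using the same classic v2 .NET SDK calls as the code above; nothing else needs to change):
// Enable only the protocols actually needed: Smooth Streaming for desktop,
// DASH for Android, HLS for iOS; Adobe HDS is deliberately left out.
IAssetDeliveryPolicy policy = _context.AssetDeliveryPolicies.Create(
    "Clear Policy",
    AssetDeliveryPolicyType.NoDynamicEncryption,
    AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.Dash |
        AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.ProgressiveDownload,
    null);
encodedAsset.DeliveryPolicies.Add(policy);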
I am using the navigator.webkitGetUserMedia API to capture the desktop, and the microphone to capture audio. When I make the following call:
navigator.webkitGetUserMedia({
    audio: true,
    video: {
        mandatory: {
            chromeMediaSource: 'desktop',
            chromeMediaSourceId: id,
            maxWidth: screen.width,
            maxHeight: screen.height
        }
    }
}, gotStream, getUserMediaError);
I am getting a screen capture error. Does this API not support the above scenario?
I am able to capture audio and desktop video individually but not together. Also, since I am capturing desktop and not the webcam video, does that make any difference?
Chrome does not allow you to request an audio stream alongside a chromeMediaSource.
See Why Screen Sharing Fails here for more info.
You may be able to circumvent this by sending individual getUserMedia requests - one for the audio stream and one for desktop.
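A rough sketch of that two-request workaround (reusing id, gotStream, and getUserMediaError from your snippet; merging the microphone track into the desktop stream this way may need adjusting for your Chrome version):
// Request 1: microphone audio only, with no chromeMediaSource constraint.
navigator.webkitGetUserMedia({ audio: true }, function (audioStream) {
    // Request 2: desktop video only.
    navigator.webkitGetUserMedia({
        audio: false,
        video: {
            mandatory: {
                chromeMediaSource: 'desktop',
                chromeMediaSourceId: id,
                maxWidth: screen.width,
                maxHeight: screen.height
            }
        }
    }, function (desktopStream) {
        // Merge the microphone track into the desktop stream and hand
        // the combined stream to the existing success callback.
        desktopStream.addTrack(audioStream.getAudioTracks()[0]);
        gotStream(desktopStream);
    }, getUserMediaError);
}, getUserMediaError);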
I'm trying to get my receiver to play an mp3 file hosted on the server with the following function:
playSound_: function(mp3_file) {
    var snd = new Audio("audio/" + mp3_file);
    snd.play();
},
However, most of the time it doesn't play and when it does play, it's delayed. When I load the receiver in my local browser, however, it works fine.
What's the correct way to play audio on the receiver?
You can use either a MediaElement tag or the Web Audio API. The simplest is probably a MediaElement.
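For example, a minimal media-element version of your function, written as a plain function here (this assumes a hypothetical <audio id="player"> tag in the receiver page and that audio/ is served relative to that page):
// Reuse a single media element instead of constructing a new Audio()
// object per call; the receiver resolves the path against the page URL.
var player = document.getElementById('player');

function playSound_(mp3_file) {
    player.src = 'audio/' + mp3_file;
    player.play();
}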
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else I am missing in implementing MPMoviePlayerController for video streaming? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether the problem is in my code, the media server setup, or a combination of the two.
I'm pretty sure you don't technically need a streaming server to play a video file from an HTTP URL. However, it is a requirement for applications (on the App Store) to do so:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to be changed to handle this (it's server-side, not client-side).