Playing a Smooth Streaming URL by providing its Manifest file to a Chromecast device - google-cast

We want to play a Smooth Streaming URL on a Chromecast device by providing the stream's Manifest file.
We can play the following on a Chromecast device:
1. .mp4 files
2. .ismv files
3. .isma files
But if we provide a Manifest file such as the following, we are not able to play it on the Chromecast device:
http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest
Please let us know how to play a Smooth Streaming URL on a Chromecast device.
Or do we need to play the .ismv files one by one by providing them in a loop?

The Chromecast has support for Smooth Streaming content through the Media Player Library: https://developers.google.com/cast/docs/player
Below is a bare-bones implementation.
Google provides a proper example on GitHub which takes advantage of the MediaManager and accounts for other streaming formats: https://github.com/googlecast/CastMediaPlayerStreamingDRM
var $mediaElement = $('<video>').attr('autoplay', ''),
    mediaElement = $mediaElement[0],
    mediaUrl = "http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest",
    mediaHost,
    mediaPlayer;

// Start the receiver manager before loading any media.
cast.receiver.CastReceiverManager.getInstance().start();

$('body').append(mediaElement);

// The Host ties the <video> element to the stream URL.
mediaHost = new cast.player.api.Host({
    mediaElement: mediaElement,
    url: mediaUrl
});

// Use the Smooth Streaming protocol for the .ism/Manifest URL.
var protocol = cast.player.api.CreateSmoothStreamingProtocol(mediaHost);

mediaPlayer = new cast.player.api.Player(mediaHost);
mediaPlayer.load(protocol);

Microsoft's test files (including the .ism) do not return the CORS headers required by the Chromecast. Enable CORS on your server and it will work.
I've encountered this too, and it works if I host the files myself with CORS enabled.

Related

Chrome Extension Manifest v3 Service Worker MediaRecorder Alternative

I have a Chrome extension (written in Manifest v2) that uses MediaRecorder to record the web session.
In background.js I have the code below to capture the web session:
chrome.tabs.captureVisibleTab(null, {format: 'png', quality: 100},
    function(base64) {
        // base64 is drawn onto a canvas,
        // canvas.captureStream() provides the MediaStream,
        // and the stream is passed as input to the MediaRecorder
    });
Now I have to migrate to Manifest v3, where the service worker doesn't have access to MediaRecorder, and I am looking for an alternative.
I have also tried sending the base64 data to the content script to draw it onto a canvas and use the canvas stream and MediaRecorder available there, but whenever the page refreshes the stream ends and a new canvas and stream are created, which splits the video across different tracks.
Please let me know if anyone has a solution for this.
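For clarity, here is a sketch of the MV2 pipeline described above. The helper names are my own, not from the question, and the standard canvas method is captureStream() (the "getMediaStream" mentioned above does not exist):

```javascript
// Frames from captureVisibleTab are drawn onto a canvas; the canvas is turned
// into a MediaStream with canvas.captureStream(), which feeds a MediaRecorder.
const chunks = [];

function startRecording(canvas) {
  const stream = canvas.captureStream(10);           // 10 fps
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  recorder.ondataavailable = (e) => { if (e.data.size) chunks.push(e.data); };
  recorder.start(1000);                              // emit a chunk every second
  return recorder;
}

function drawFrame(canvas, base64Png) {
  const img = new Image();
  img.onload = () => canvas.getContext('2d').drawImage(img, 0, 0);
  img.src = base64Png;                               // data URL from captureVisibleTab
}

// Merge recorded chunks into one Blob when recording stops.
function mergeChunks(parts) {
  return new Blob(parts, { type: 'video/webm' });
}
```

Keeping all chunks in one array (rather than restarting the recorder) is what avoids the multiple-track problem the question describes, but it only works while the recording context itself survives.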

Switch audio output to speakers

I work on my own specialized VoIP client for Windows 10 mobile & desktop. The basic things work OK.
However, I cannot get audio output to the speakers on my old Lumia.
foreach (var item in (await DeviceInformation.FindAllAsync(DeviceClass.AudioRender))
    .Where(i => i.Name.Contains("Speakers")))
{
    RendererId = item.Id;
}
There is a "Speakers (WaveRT)" entry in the device list, so RendererId is valid.
Later the application tries to open the audio device (WASAPI) with the found RendererId. But the phone still plays through the receiver (earpiece) only.
I modified the VoIP sample app in an attempt to reproduce the issue - yes, it happens with the VoIP sample app as well.
My colleague confirms he has the same problem on his phone.
Is it possible to play audio via the speaker in a VoIP app?
Thank you!
On Phone devices only, you can use the AudioRoutingManager to change the audio output.
// check whether audio routing is available on this device
IsAudioRoutingSupported = ApiInformation.IsApiContractPresent(typeof(PhoneContract).FullName, 1);
if (IsAudioRoutingSupported)
{
    // audio routing is supported; register for the output change events
    m_audioRoutingManager = AudioRoutingManager.GetDefault();
    m_audioRoutingManager.AudioEndpointChanged += OnAudioEndpointChanged;
}

// to change the output
m_audioRoutingManager.SetAudioEndpoint(AudioRoutingEndpoint.Speakerphone);

To play VOD streams in Chrome using video.js

I am trying to play an HLS stream (.m3u8) containing .ts file segments.
It played fine before when I tried some open sources. Now that I am trying to play the streams provided by my service provider, it plays in all major browsers except Chrome.
P.S.: I am using video.js to accomplish this. I also tested with Viblast, but no luck there.
For reference I am posting my code:
<script>
    // myPlayer is my <video> object for video.js
    myPlayer.src({
        type: "application/x-mpegURL", // one MIME type per source entry
        src: encodeURI(some m3u8 link)
    });
    myPlayer.play();
</script>
This code works in every other browser, but it fails in Chrome.
The error response from Chrome is as follows:
VIDEOJS: ERROR: (CODE:3 MEDIA_ERR_DECODE) The media playback was
aborted due to a corruption problem or because the media used features
your browser did not support.
MediaError {code: 3, message: "The media playback was aborted due to a
corruption…media used features your browser did not support."}
Note: I am getting my streams from ScaleEngine.
I came across this error when using an MP4 source with a WebM fallback. Everything worked well in Firefox, but I was getting this error in Chrome. I switched the order of the fallbacks so that video.js used WebM first and MP4 as the fallback. That fixed it, for me at least.
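A sketch of that reordering, under stated assumptions: the orderSources helper and the file names are mine, not from the answer; video.js accepts an array of sources and tries them in order.

```javascript
// Order candidate sources so the preferred MIME type comes first.
function orderSources(sources, preferredType) {
  return [...sources].sort(
    (a, b) => (b.type === preferredType) - (a.type === preferredType));
}

var sources = orderSources([
  { type: 'video/mp4', src: 'movie.mp4' },
  { type: 'video/webm', src: 'movie.webm' },
], 'video/webm');
// sources[0] is now the WebM entry.

// myPlayer.src(sources); // WebM is tried first, MP4 is the fallback
```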

Play Audio from receiver website

I'm trying to get my receiver to play an mp3 file hosted on the server with the following function
playSound_: function(mp3_file) {
    var snd = new Audio("audio/" + mp3_file);
    snd.play();
},
However, most of the time it doesn't play, and when it does, it's delayed. When I load the receiver page in my local browser, it works fine.
What's the correct way to play audio on the receiver?
You can use either a MediaElement tag or the Web Audio API. The simplest is probably a MediaElement.
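A sketch of the Web Audio route, which decodes the mp3 once up front so later plays start with minimal delay. The audioUrl/preload helper names are assumptions of mine; the "audio/" path convention matches the playSound_ function above.

```javascript
let ctx = null;
let soundBuffer = null;

// Lazily create the AudioContext (must run in the receiver page).
function audioContext() {
  return (ctx = ctx || new AudioContext());
}

// Same "audio/" + file convention as the playSound_ function above.
function audioUrl(file) {
  return 'audio/' + file;
}

// Fetch and decode once at startup, not on every play.
async function preload(file) {
  const data = await (await fetch(audioUrl(file))).arrayBuffer();
  soundBuffer = await audioContext().decodeAudioData(data);
}

// Buffer sources are one-shot, so create a fresh one for every play.
function playSound() {
  if (!soundBuffer) return;
  const src = audioContext().createBufferSource();
  src.buffer = soundBuffer;
  src.connect(audioContext().destination);
  src.start();
}
```

The delay in the original code likely comes from constructing a new Audio element and fetching the file on each call; decoding ahead of time avoids both.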

Monotouch video streaming with MPMoviePlayerController

I am creating a Monotouch iPhone app that will display streaming videos. I have been able to get the MPMoviePlayerController working with a local file (NSUrl FromFile), but have not been able to get videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else in implementing the MPMoviePlayerController for video streaming that I am missing? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server, is that a requirement? I have never worked with video streaming on an iOS device before so I am not sure if there is something lacking in my code or the media server setup or a combination.
I'm pretty sure you need a streaming server to serve a video file from an HTTP URL. In any case, it is a requirement for applications (on the App Store) to do so:
Important iPhone and iPad apps that send large amounts of audio or
video data over cellular networks are required to use HTTP Live
Streaming. See “Requirements for Apps.”
The good news is that your existing code should not have to be changed to handle this (it's server-side, not client-side).
