Converting Cast V2 to CAF when it also uses DLNA + custom media path - google-cast

Looking to move an Android app from Cast V2 to CAF, but the app also allows casting to both DLNA and custom devices. Those are currently handled via a MediaRouteSelector and a MediaRouter callback, which CAF apps are not supposed to touch. So how do you integrate these other media routes into a CAF app?
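For illustration only (this is not a confirmed answer), here is a minimal Kotlin sketch of one way the two discovery paths could coexist: CAF builds its own route selector from the receiver application ID declared in the OptionsProvider, while the app keeps its existing MediaRouter callback but narrows its selector to only the DLNA/custom control categories. The category strings, class names, and receiver app ID are placeholders, and whether this split actually avoids conflicts with CAF's internal MediaRouter usage would still need to be verified.

```kotlin
import android.content.Context
import androidx.mediarouter.media.MediaRouteSelector
import androidx.mediarouter.media.MediaRouter
import com.google.android.gms.cast.framework.CastOptions
import com.google.android.gms.cast.framework.OptionsProvider
import com.google.android.gms.cast.framework.SessionProvider

// Standard CAF options provider: CAF derives its own route selector from the
// receiver application ID declared here and manages Cast discovery itself.
class CastOptionsProvider : OptionsProvider {
    override fun getCastOptions(context: Context): CastOptions =
        CastOptions.Builder()
            .setReceiverApplicationId("YOUR_RECEIVER_APP_ID") // placeholder
            .build()

    override fun getAdditionalSessionProviders(context: Context): List<SessionProvider>? = null
}

// Separate discovery path for the non-Cast routes, kept out of CAF's way.
class CustomRouteDiscovery(context: Context) {
    private val mediaRouter = MediaRouter.getInstance(context)

    // Only the app's own control categories go into this selector, so the
    // callback never sees (or touches) the Cast routes that CAF manages.
    private val selector = MediaRouteSelector.Builder()
        .addControlCategory("com.example.dlna.CATEGORY_DLNA")      // hypothetical category
        .addControlCategory("com.example.custom.CATEGORY_CUSTOM")  // hypothetical category
        .build()

    private val callback = object : MediaRouter.Callback() {
        override fun onRouteAdded(router: MediaRouter, route: MediaRouter.RouteInfo) {
            // Surface the DLNA/custom route in the app's own device picker.
        }

        override fun onRouteRemoved(router: MediaRouter, route: MediaRouter.RouteInfo) {
            // Drop it from the picker again.
        }
    }

    fun start() = mediaRouter.addCallback(
        selector, callback, MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY
    )

    fun stop() = mediaRouter.removeCallback(callback)
}
```

The OptionsProvider still has to be registered in the manifest via the usual com.google.android.gms.cast.framework.OPTIONS_PROVIDER_CLASS_NAME metadata entry so that CastContext can pick it up.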

Related

Azure Media Services transcript generation

I have a requirement to generate transcripts of my videos when I upload them to Azure Blob storage from my C# function app. We already have a Media Services v2 setup, which we use for video encoding; it uses a CloudMediaContext object to create the context and jobs.
I could not find a way to add a Transform job to the CloudMediaContext object; the only examples I have seen online add a Transform using an IAzureMediaServicesClient object.
Do I need to upgrade my Media Services account from v2 to v3? With that I would also have to migrate my existing code and assets. Or is there a way to add a Transform in v2 as well?
I would recommend that if you have an existing app on the Media Services v2 API, you spend the time (hopefully minimal) to upgrade it to the v3 API very soon. The v2 API is not getting any new features and does not have the latest speech-to-text models and transcription support.
Transforms were introduced in the v3 API, which also added a basic "audio analyzer" preset that does not exist in v2.
If you are using .NET, take a look at these samples to get started with the v3 API and the Audio analyzer preset using a Transform.
https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/AudioAnalytics/AudioAnalyzer
Let me know if you have further questions. Details on the Audio analyzer preset can be found in the docs here - https://learn.microsoft.com/en-us/azure/media-services/latest/analyze-video-audio-files-concept

Azure Media Player HLS support

I'm using Azure Media Player in my project to play Azure Media Service assets and it works great for that. However, I'd also need to play some HLS content within the same project and would like to use the same player.
Microsoft claims that AMP supports HLS, but when I put any HLS source into it and set the format to HLS, I end up with a "No compatible source was found for this media." error.
Has anyone managed to successfully play HLS content with AMP?
Azure Media Player only supports playing content from Media Services. If you want to use a single player for any HLS source, including non-Media Services content, then you'll probably want to use a different player such as Shaka, Video.js, or JWPlayer. That said, you might have success with AMP if you disable the URL rewriter as described at https://learn.microsoft.com/en-us/azure/media-services/azure-media-player/azure-media-player-url-rewriter.
See https://learn.microsoft.com/en-us/azure/media-services/azure-media-player/azure-media-player-playback-technology. AMP uses "html5" and "html5FairPlayHLS" for HLS playback, both of which rely on native HLS support rather than JavaScript-based playback. If you need HLS on Windows or older versions of Android, you would need to use another player.

Using LibVLCSharp on an ASP.NET Core Razor page

How do we use LibVLCSharp on an ASP.NET Core Razor page?
I have looked at the LibVLCSharp.NetCore.Sample, which is an ASP.NET Core console app.
I tried to create an ASP.NET Core web application and modify the example to display the video
on the Razor page.
If what you are asking is to have a VLC-based player in the browser, this will be supported when https://code.videolan.org/jbk/vlc.js/ lands.
If what you are asking is to use LibVLCSharp on the server to stream to a JS player in a web page, it is likely possible (though probably not the best tool for the job), but you will have to edit your question with more details about what you want to know.
The LibVLCSharp.NetCore.Sample is a .NET Core sample, not an ASP.NET Core sample.
Displaying a video in a web page is something that goes beyond the usage of libvlc.
See also this issue

Integrating the Spotify API with a desktop app using C++ to stream music and get metadata

I didn't find any information on whether I can stream music and get song metadata through a
desktop API, without using some kind of local server to emulate the Web API.
Any ideas?
I want to stream music and get metadata in a C++ app.

Specifying Qt5 Multimedia backends

On Windows, I saw that there are at least two media service backends among Qt's plugins: dsengine (DirectShow) and wmfengine (Media Foundation) for playing videos (https://wiki.qt.io/Qt_5.5.0_Multimedia_Backends).
How can we know which media service backend is being used? And how can we specify the media service backend to use programmatically?
