How to build an XML device description for a minimal but fully DLNA-compatible audio renderer device?

I'm developing an embedded DLNA audio media renderer in C++ which will capture the audio stream and use it as a source for an LED controller to create visual effects.
I have currently chosen the gupnp library. Its documentation claims that it can generate a C code "scaffold" if I feed it a correct service description XML file.
I have read the DLNA specification documents, but it's not easy to put together the absolute minimum requirements. For now, I know only that I'll need the ConnectionManager, RenderingControl, and maybe AVTransport services, but I'm not sure which actions will be absolutely required to allow the device to receive MP3/AAC/WAV audio streams from any DLNA media server, radio streamer, etc. out there.
I'll be grateful for pointers to such a template, or for instructions on building a DLNA-compatible audio renderer description (MediaRenderer:1 should work fine) with the minimum required actions and state variables to feed into UPnP generator tools (specifically, gupnp).
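Edit: here is what I've pieced together so far from the UPnP AV MediaRenderer:1 device template and the DLNA guidelines for a Digital Media Renderer (DMR). It's a rough sketch with placeholder URLs and UUID, and I'm not sure it's complete; as far as I can tell, DLNA requires AVTransport for a DMR even though plain UPnP marks it optional.

    <?xml version="1.0" encoding="utf-8"?>
    <root xmlns="urn:schemas-upnp-org:device-1-0"
          xmlns:dlna="urn:schemas-dlna-org:device-1-0">
      <specVersion>
        <major>1</major>
        <minor>0</minor>
      </specVersion>
      <device>
        <deviceType>urn:schemas-upnp-org:device:MediaRenderer:1</deviceType>
        <friendlyName>LED Audio Renderer</friendlyName>
        <manufacturer>Me</manufacturer>
        <modelName>LED Renderer 0.1</modelName>
        <!-- Replace with a stable, device-unique UUID. -->
        <UDN>uuid:00000000-0000-0000-0000-000000000000</UDN>
        <!-- DLNA device class: Digital Media Renderer, guidelines 1.50. -->
        <dlna:X_DLNADOC>DMR-1.50</dlna:X_DLNADOC>
        <serviceList>
          <service>
            <serviceType>urn:schemas-upnp-org:service:ConnectionManager:1</serviceType>
            <serviceId>urn:upnp-org:serviceId:ConnectionManager</serviceId>
            <SCPDURL>/ConnectionManager.xml</SCPDURL>
            <controlURL>/ConnectionManager/control</controlURL>
            <eventSubURL>/ConnectionManager/event</eventSubURL>
          </service>
          <service>
            <serviceType>urn:schemas-upnp-org:service:RenderingControl:1</serviceType>
            <serviceId>urn:upnp-org:serviceId:RenderingControl</serviceId>
            <SCPDURL>/RenderingControl.xml</SCPDURL>
            <controlURL>/RenderingControl/control</controlURL>
            <eventSubURL>/RenderingControl/event</eventSubURL>
          </service>
          <service>
            <serviceType>urn:schemas-upnp-org:service:AVTransport:1</serviceType>
            <serviceId>urn:upnp-org:serviceId:AVTransport</serviceId>
            <SCPDURL>/AVTransport.xml</SCPDURL>
            <controlURL>/AVTransport/control</controlURL>
            <eventSubURL>/AVTransport/event</eventSubURL>
          </service>
        </serviceList>
      </device>
    </root>

As far as I can tell, gupnp's gupnp-binding-tool consumes the per-service SCPD files (ConnectionManager.xml etc.; the UPnP AV service specifications include sample XML for these) rather than this device document, and the ConnectionManager GetProtocolInfo action is where the supported sink formats (audio/mpeg, audio/mp4, audio/x-wav, ...) get advertised to servers and control points. Corrections welcome.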

Related

Azure Media Player HLS support

I'm using Azure Media Player in my project to play Azure Media Services assets, and it works great for that. However, I also need to play some HLS content within the same project and would like to use the same player.
Microsoft claims that AMP supports HLS, but when I put any HLS source into it and set the format to HLS, I end up with a "No compatible source was found for this media." error.
Has anyone managed to successfully play HLS content with AMP?
Azure Media Player only supports playing content from Media Services. If you want to use a single player for any HLS source, including non-Media Services content, then you'll probably want to use a different player such as Shaka, Video.js, or JWPlayer. That said, you might have success with AMP if you disable the URL rewriter, as described at https://learn.microsoft.com/en-us/azure/media-services/azure-media-player/azure-media-player-url-rewriter.
See https://learn.microsoft.com/en-us/azure/media-services/azure-media-player/azure-media-player-playback-technology. AMP uses "html5" and "html5FairPlayHLS" for HLS playback, both of which rely on native HLS support rather than JavaScript-based playback. If you need HLS on Windows or on older versions of Android, you will need to use another player.

MPEG-DASH support on Android via NDK (C++)

I'm looking to support DRM-protected playback in a custom player built as an NDK C++ library plugin, which decodes, converts, and then performs some image processing before final presentation. In this scenario, what is the best way to support DRM (I will use the NDK's Crypto and DRM interfaces), given that the documentation hints at supporting only the MPEG-DASH format, which is not natively supported?
Please tell me if any of my assumptions are incorrect, or if there are simple libraries (like libdash) that can solve the problem. An extreme solution would be ExoPlayer, but the current infrastructure is built using C++ and the NDK interface to leverage hardware decoders, which rules that out as an option.
If your image processing requires access to the raw image, then unfortunately you won't (or you shouldn't!) be able to do this, as encrypted video is designed to play via a secure media path which does not allow access to the raw video.
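As a side note, the NDK Crypto and DRM interfaces you mention do exist and are easy to probe. A minimal sketch (my own, untested; API level 21+, linked against -lmediandk) that checks for Widevine support and opens a session:

    // Probe Widevine support and open a DRM session via the NDK media APIs.
    #include <media/NdkMediaCrypto.h>
    #include <media/NdkMediaDrm.h>
    #include <cstdio>

    int main() {
        // Widevine's well-known crypto scheme UUID.
        static const uint8_t kWidevine[16] = {
            0xed, 0xef, 0x8b, 0xa9, 0x79, 0xd6, 0x4a, 0xce,
            0xa3, 0xc8, 0x27, 0xdc, 0xd5, 0x1d, 0x21, 0xed};

        if (!AMediaCrypto_isCryptoSchemeSupported(kWidevine)) {
            std::printf("Widevine not supported on this device\n");
            return 1;
        }

        AMediaDrm* drm = AMediaDrm_createByUUID(kWidevine);
        AMediaDrmSessionId sessionId;
        if (AMediaDrm_openSession(drm, &sessionId) == AMEDIA_OK) {
            // A real player would request a license (AMediaDrm_getKeyRequest),
            // submit the response, and attach an AMediaCrypto to AMediaCodec
            // so that decryption happens inside the secure path.
            AMediaDrm_closeSession(drm, &sessionId);
        }
        AMediaDrm_release(drm);
        return 0;
    }

But this doesn't change the conclusion above: with a secure decoder, decrypted samples flow from AMediaCodec directly to the display path and are never handed back to application memory, so the raw-frame processing step cannot run on protected content.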

Xamarin.Forms: streaming and recording audio

I have an app that I have written using Xamarin.Forms. I want to know if there is any media class I could use to record audio as well as stream audio from a server. All the articles I have found on the web so far are platform-specific.
Thanks in advance.
There currently isn't any audio support in Xamarin.Forms. You will need to write platform-specific code to handle the audio, and use the XF DependencyService (or something similar) to call it from your shared code.

Is it actually possible to play SmoothStreaming videos on Chromecast without using (format=mpd-time-csf)?

I'm trying to get a video playing on Chromecast - it's available as an MS Smooth Streaming manifest (example), but I can't specify format=mpd-time-csf (example), as it's not available in that format.
Directly passing the manifest URL to the Chromecast doesn't work yet, but will be available for the final SDK release, as stated by Les Vogel here:
Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device
As I understand it, Chromecast can play DASH/Smooth videos by embedding the dash.js player into the receiver app. However, dash.js only plays Smooth Streaming videos if format=mpd-time-csf is specified; normal Smooth manifests don't work.
Does this mean that, in its current state, Chromecast only supports Smooth video in the mpd-time-csf format? I assume the Netflix app uses Smooth for its Chromecast app - is this how they're doing it?
Currently, unless you write your own (JavaScript) player that can handle Smooth Streaming (i.e. parse the manifest, fetch fragments, use Media Source Extensions, etc.), you cannot play Smooth Streaming content on Chromecast.
Ali.
Just following up on my own question in case anyone stumbles across this from Google - with the release of the SDK, Smooth Streaming should be playable out of the box with the Media Player Library: https://developers.google.com/cast/docs/player.

Android Native Audio development

I'm trying to develop an application based on native audio on Gingerbread. I ran the native-audio sample program that ships with the NDK, but I'm not clear on how it works. I need an example to learn how to use the OpenSL library.
Can anyone suggest an example of OpenSL ES-based code?
The OpenSL ES documentation and that sample app are the best resources out there. Not to say that they're great, but they are definitely sufficient, provided you have a working knowledge of object-oriented programming and audio. If you don't, those are the things to look into first.
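To give you something concrete to start from, the skeleton below (my own sketch, untested) is the setup every OpenSL ES program performs first, including the NDK's native-audio sample: create and realize the engine object, fetch its engine interface, and create the output mix that audio players feed into. Link with -lOpenSLES.

    #include <SLES/OpenSLES.h>
    #include <cassert>

    int main() {
        SLObjectItf engineObject = nullptr;
        SLEngineItf engine = nullptr;
        SLObjectItf outputMix = nullptr;

        // 1. Create and realize the engine object, the root of every
        //    OpenSL ES object graph. Realization is done synchronously here.
        SLresult res = slCreateEngine(&engineObject, 0, nullptr, 0, nullptr, nullptr);
        assert(res == SL_RESULT_SUCCESS);
        res = (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);
        assert(res == SL_RESULT_SUCCESS);

        // 2. Get the engine interface, used to create all other objects.
        res = (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engine);
        assert(res == SL_RESULT_SUCCESS);

        // 3. Create and realize the output mix, the destination that
        //    audio players connect to.
        res = (*engine)->CreateOutputMix(engine, &outputMix, 0, nullptr, nullptr);
        assert(res == SL_RESULT_SUCCESS);
        res = (*outputMix)->Realize(outputMix, SL_BOOLEAN_FALSE);
        assert(res == SL_RESULT_SUCCESS);

        // A real app would now create an audio player with a buffer-queue
        // data source feeding this output mix, register a buffer callback,
        // and enqueue PCM buffers from it.

        // Destroy objects in reverse order of creation.
        (*outputMix)->Destroy(outputMix);
        (*engineObject)->Destroy(engineObject);
        return 0;
    }

Once that structure is clear, the rest of the native-audio sample (player creation, buffer-queue callbacks) becomes much easier to follow.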
