How to play MP3 audio stored in a byte array in .NET MAUI

I have some MP3 sounds in a database, and I read and store them in a byte[] variable.
My question is: how can I play the audio stored in the byte[] variable in .NET MAUI?
Thanks in advance.
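For what it's worth, here is a minimal sketch of one way to do this, assuming the Plugin.Maui.Audio NuGet package (a community plugin, not part of .NET MAUI itself). Its CreatePlayer overload accepts a Stream, so the byte[] can be wrapped in a MemoryStream without ever touching the file system:

```csharp
using System.IO;
using Plugin.Maui.Audio;

public class Mp3ByteArrayPlayer
{
    // Keep a reference so the player is not garbage-collected mid-playback.
    IAudioPlayer? player;

    // mp3Bytes is the byte[] read from the database.
    public void Play(byte[] mp3Bytes)
    {
        var stream = new MemoryStream(mp3Bytes);
        player = AudioManager.Current.CreatePlayer(stream);
        player.Play();
    }
}
```

An alternative, if you already use the .NET MAUI Community Toolkit, is to write the bytes to a file under FileSystem.CacheDirectory and point a MediaElement control at that file.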

Related

JavaFX8 processing internal audio stream

Audio playback: all the examples I've seen for Audio or Media in JavaFX8 have a source, either http: or file:. In my case the byte stream will be coming through as a ByteArrayInputStream; ultimately I suspect this is what the Audio or Media class objects are processing anyway. The source would start life as compressed audio, which I would decompress and then feed to the Audio object. I don't see how to feed a byte array into a JavaFX audio object. Would someone please point me at a solution?
Thanks.

What's the UCMA 3.0 VoiceXML Browser's default audio format?

Currently, I am developing a UCMA 3.0 VoiceXML application. In the VoiceXML document, I use a record element to take the caller's recording, then send the recording to an external web server and save it to a WAV file.
What's the format of the saved WAV file: 8 kHz/8-bit, 8 kHz/16-bit, or 16 kHz/16-bit?
Can I set or change the audio format of the UCMA VoiceXML Browser to generate audio in a different format?
The supported audio formats for UCMA VoiceXML are:
Raw (headerless) 8 kHz 8-bit mono mu-law [PCM] single channel (G.711) -- audio/basic (from [RFC1521])
Raw (headerless) 8 kHz 8-bit mono A-law [PCM] single channel (G.711) -- audio/x-alaw-basic
WAV (RIFF header) 8 kHz 8-bit mono mu-law [PCM] single channel -- audio/x-wav
WAV (RIFF header) 8 kHz 8-bit mono A-law [PCM] single channel -- audio/x-wav
The part after the "--" is the MIME type. You specify the MIME type in the "type" attribute of the "record" element, for example <record name="msg" type="audio/x-wav">.

How to convert audio output data into frames using AVFoundation in the iPhone SDK

I am trying to capture video and audio using AVCaptureSession. I am done with the video capture: I convert it into a pixel buffer and play the captured output on the server side using ffmpeg and an RTMP server. The thing is, how can I convert the audio into data and play it on the server side where the data is received? I also want to know what format the captured audio is in.
Thanks all,
MONISH

Adobe AIR FileStream with audio playback

I've tried to mix audio playback with the URLStream and FileStream classes. My idea was to stream the file to disk to save memory and use the sampleData event of the Audio class to play some audio. Is it possible to somehow access the streamed file while it is still streaming, to feed the Audio class?
This is interesting because there are large podcasts out there that take a lot of memory. The current solution is to delete the Audio object when the user changes the track, and it is working fine, but I want to make it even better.

How to convert the audio data to CMSampleBufferRef?

I want to record audio to a video file using AVAssetWriterInput, and the audio data is stored in memory as a byte array. How can I do this?
I found that the CMAudioSampleBufferCreateWithPacketDescriptions function meets my needs; does anyone have a sample of using it with byte-array audio data?
