JavaFX 8 processing internal audio stream

Audio playback. All the examples I've seen for Audio or Media in JavaFX 8 have a source that is either an http: or a file: URL. In my case the bytes will arrive as a ByteArrayInputStream, and I suspect a byte stream is ultimately what the Audio or Media objects are processing anyway. The source starts life as compressed audio; I decompress it and then want to feed it to the Audio object. I don't see how to feed a byte array into a JavaFX audio object. Would someone please point me at a solution?
Thanks.
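For what it's worth: JavaFX's Media and AudioClip constructors only accept a URI string, so there is no direct way to hand them a ByteArrayInputStream. Two common workarounds are to write the bytes to a temporary file and pass its file: URI to Media, or to bypass JavaFX and play the decompressed PCM through javax.sound.sampled, which does accept a stream. Here is a minimal sketch of the latter; the 44.1 kHz 16-bit mono format is an assumption, so substitute whatever your decoder actually emits.

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class ByteArrayAudioPlayer {

    // Plays decompressed PCM bytes through the default audio device.
    public static void play(byte[] pcm, AudioFormat format) throws Exception {
        try (InputStream in = new ByteArrayInputStream(pcm);
             SourceDataLine line = AudioSystem.getSourceDataLine(format)) {
            line.open(format);
            line.start();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                line.write(buf, 0, n); // blocks until the device accepts the data
            }
            line.drain(); // let the last buffered samples finish playing
        }
    }

    public static void main(String[] args) throws Exception {
        // Assumed format: 44.1 kHz, 16-bit signed, mono, little-endian.
        AudioFormat fmt = new AudioFormat(44100f, 16, 1, true, false);
        play(new byte[44100 * 2], fmt); // one second of silence as a smoke test
    }
}

If you need the JavaFX Media pipeline specifically (e.g. for its player controls), the temp-file route is the usual answer: Files.write the byte array to a temporary file and construct the Media from tmp.toURI().toString().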

Related

Routing AVPlayer audio output to AVAudioEngine

Due to the richness and complexity of my app's audio content, I am using AVAudioEngine to manage all audio across the app. I am converting every audio source to be represented as a node in my AVAudioEngine graph.
For example, instead of using AVAudioPlayer objects to play MP3 files in my app, I create AVAudioPlayerNode objects from buffers of those audio files.
However, I do have a video player in my app that plays video files, with audio, using AVPlayer (I know of nothing else in iOS that can play video files). Unfortunately, there seems to be no way to obtain that audio output stream as a node in my AVAudioEngine graph.
Any pointers?
If you have a video file, you can extract the audio data and pull it out of the video.
Then set the volume of the AVPlayer to 0 (if you didn't remove the audio track from the video itself)
and play the audio through an AVAudioPlayerNode.
If you receive the video data over the network, you will have to write a parser for the packets and demux them yourself.
But A/V sync is a very tough problem.

Moviepy write_videofile changes the number of video and audio frames even when using 'rawvideo' as the codec parameter

I am using moviepy (Python) to read the video and audio frames of a video file and, after making some changes, write them back to a new file, say new.avi. To preserve the changes and avoid compression, I am using codec='rawvideo' in the write_videofile function. But when I read the video and audio frames back, their counts differ from when they were written; they are usually higher.
Can anybody tell me the reason? Is it because of the ffmpeg build being used, or something else? Does it always happen, or is there some problem on my machine? Thank you :-)

PCM/RAW audio container for streaming

I would like to know if any of you have tried to stream RAW audio data. I tried to do it with a WAV file, but that format does not support streaming. Could anyone suggest a container for this (other than Matroska)? :)
Thank you
I discovered OggPCM. I think it is the only container that lets you stream PCM audio while remaining reasonably standard.
A WAV file does not allow you to stream raw data, so I eliminated it from the list. I wanted to stream audio over my wireless network, so bandwidth was not an issue.
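If both endpoints are under your control (as on a home wireless network), the simplest option is no container at all: send headerless PCM over a TCP socket and agree on the sample format out of band. Below is a minimal receiver sketch, with the port number and the 44.1 kHz 16-bit stereo format as assumptions; note the frame-alignment bookkeeping, since a socket read can return a partial frame and SourceDataLine.write only accepts whole frames.

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class RawPcmReceiver {
    public static void main(String[] args) throws Exception {
        // Headerless PCM carries no metadata, so both ends must agree
        // on the format out of band: 44.1 kHz, 16-bit, stereo, little-endian.
        AudioFormat fmt = new AudioFormat(44100f, 16, 2, true, false);
        int frameSize = fmt.getFrameSize(); // 4 bytes per frame here
        try (ServerSocket server = new ServerSocket(9000); // arbitrary port
             Socket sender = server.accept();
             InputStream in = sender.getInputStream();
             SourceDataLine line = AudioSystem.getSourceDataLine(fmt)) {
            line.open(fmt, 1 << 16); // 64 KiB device buffer absorbs network jitter
            line.start();
            byte[] buf = new byte[4096];
            int filled = 0;
            int n;
            while ((n = in.read(buf, filled, buf.length - filled)) != -1) {
                filled += n;
                int whole = filled - (filled % frameSize); // whole frames only
                line.write(buf, 0, whole);
                // carry any partial frame to the front of the buffer
                System.arraycopy(buf, whole, buf, 0, filled - whole);
                filled -= whole;
            }
            line.drain();
        }
    }
}

The matching sender just writes raw sample bytes to the socket. What a container such as OggPCM buys you on top of this is a self-describing stream that third-party tools can also play.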

How to convert audio output data into frames using AVFoundation in the iPhone SDK

I am trying to capture video and audio using AVCaptureSession. The video side is done: I convert the captured video into pixel buffers and play it on the server side using ffmpeg and an RTMP server. But how can I convert the audio into data and play it on the server side where that data is received? And I'd like to know what format the captured audio is in.
Thanks all,
MONISH

Adobe AIR Filestream with audio playback

I've tried to mix audio playback with the URLStream and FileStream classes. My idea was to stream the file to disk to save memory and use the sampleData event of the Sound class to play the audio. Is it possible to access the streamed file while it is still being written, so it can feed the Sound class?
This matters because some podcasts out there are large and take a lot of memory. The current solution is to dispose of the Sound object when the user changes track, and it works fine, but I want to make it even better.
