I am trying to understand how to capture audio in ExoPlayer in order to render a waveform on an OpenGL canvas.
It seems possible to implement this by extending the MediaCodecAudioTrackRenderer class, as it has a processOutputBuffer() method.
Can anybody advise or point me to an example of how this can be implemented?
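For context, whatever hook ends up supplying the decoded PCM bytes (an overridden processOutputBuffer() is one candidate, though its exact signature varies across ExoPlayer versions), reducing those bytes to per-bucket peak amplitudes for the waveform is plain Java. The sketch below assumes 16-bit little-endian PCM; the class name and bucket layout are illustrative, not ExoPlayer API:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WaveformExtractor {

    /**
     * Reduces a buffer of 16-bit little-endian PCM samples to per-bucket
     * peak amplitudes in [0, 1], suitable for drawing a waveform.
     */
    public static float[] peaks(ByteBuffer pcm, int buckets) {
        // Duplicate so the caller's position/limit are left untouched.
        ByteBuffer buf = pcm.duplicate().order(ByteOrder.LITTLE_ENDIAN);
        int sampleCount = buf.remaining() / 2;
        float[] out = new float[buckets];
        if (sampleCount == 0) return out;
        int samplesPerBucket = Math.max(1, sampleCount / buckets);
        for (int i = 0; i < sampleCount; i++) {
            int bucket = Math.min(i / samplesPerBucket, buckets - 1);
            // Normalise the signed 16-bit sample to [-1, 1] and keep the peak.
            float amp = Math.abs(buf.getShort() / 32768f);
            if (amp > out[bucket]) out[bucket] = amp;
        }
        return out;
    }
}
```

Each resulting peak then maps to one vertical line (or quad) on the GL canvas, so the bucket count would typically match the canvas width in pixels.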
I would like to seek the playback position of the Audio component in Vaadin, or to read the current time from the audio player. Is this possible somehow?
These features are not included in the Audio component. But luckily there is an add-on, a more sophisticated audio player, which has the features you are looking for.
https://vaadin.com/directory/component/audiovideo
For even more complex use cases with multiple audio streams there is also
https://vaadin.com/directory/component/audioplayer-add-on
Could anyone help me with simple example code for sound/music visualisation (an oscillogram) in C++?
Is it possible to do it without registering an MFT DLL, as in DShow\Scope, by simply connecting the source to the visualisation manually?
You can use the Sample Grabber Sink configured to accept audio samples (an audio IMFMediaType). The data from the captured audio samples can then be visualized using DirectX, GDI, or even simple controls like progress bars.
Check this link: https://msdn.microsoft.com/en-us/library/windows/desktop/hh184779(v=vs.85).aspx
The OnProcessSample function prints some info about each audio sample via printf. You can use it as a starting point for your visualization code.
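Independent of the Media Foundation plumbing, turning each captured 16-bit sample into a pixel coordinate for the oscillogram is a couple of lines of arithmetic. The sketch below is in Java purely for illustration; the C++ version inside OnProcessSample would be line-for-line the same, and the names are made up:

```java
public class Oscillogram {

    /**
     * Maps a signed 16-bit PCM sample (-32768..32767) to a vertical pixel
     * coordinate on a canvas of the given height, with y = 0 at the top.
     * A sample of 0 lands on the centre line.
     */
    public static int sampleToY(short sample, int height) {
        // Normalise to [-1, 1], then offset from the centre line.
        double normalized = sample / 32768.0;
        int centre = height / 2;
        return (int) Math.round(centre - normalized * centre);
    }
}
```

Plotting sampleToY for consecutive samples at consecutive x positions gives the classic scope trace.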
I have developed a music player in JavaFX.
Is there any way to implement a 3D audio effect similar to the WOW effect in Windows Media Player?
I have searched, and there is no method on MediaPlayer to implement such a thing.
My JavaFX level: intermediate
AFAIK, the only effect that can be applied on the JavaFX MediaPlayer is an equalizer, with the javafx.scene.media.AudioEqualizer class. You can get the MediaPlayer's AudioEqualizer with the getAudioEqualizer method, and modify its existing bands, add/remove bands, etc.
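Applying that is a single call like `player.getAudioEqualizer().getBands().get(0).setGain(gainDb)`, where the gain is in decibels. A small self-contained helper for going from a linear amplitude factor to a clamped dB gain might look like this; the clamp limits below are what I recall EqualizerBand documenting, so double-check them against EqualizerBand.MIN_GAIN and MAX_GAIN in your JavaFX version:

```java
public class EqualizerGain {

    // Assumed gain limits of javafx.scene.media.EqualizerBand, in decibels.
    public static final double MIN_GAIN_DB = -24.0;
    public static final double MAX_GAIN_DB = 12.0;

    /**
     * Converts a linear amplitude factor (e.g. 2.0 = roughly double the
     * loudness of that band) to decibels, clamped to the band's range.
     */
    public static double linearToDb(double linear) {
        double db = 20.0 * Math.log10(linear);
        return Math.max(MIN_GAIN_DB, Math.min(MAX_GAIN_DB, db));
    }
}
```

For example, `band.setGain(EqualizerGain.linearToDb(2.0))` boosts that band by about 6 dB.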
However, if you want to implement other effects (reverb, delay, distortion... whatever else), I think you're out of luck: The JavaFX media API doesn't provide methods for that, and it doesn't seem to be meant to be extensible in any way (you can't add support for other codecs either, for example).
If you need more than what the JavaFX media API provides, the only solution for serious media playback in Java seems to be a native library with a Java wrapper. vlcj (website here, Javadoc here) seems to be a good option: it offers a Java wrapper around VLC, which is a really powerful media player, so you should be able to do most of what you may want; worst case, it provides APIs to directly access the audio buffer and manipulate it yourself. It's clearly quite a bit more involved than using JavaFX's native media playback, though...
I have C code for a video codec. It takes a compressed format as input and gives out a YUV data buffer. As a standalone application, I'm able to render the generated YUV using OpenGL.
Note: This codec is currently not supported by VLC/gstreamer.
My task now is to create a player using this code (that is, with features such as play, pause, step, etc.). Instead of reinventing the whole wheel, I think it would be better if I were able to integrate my codec into the GStreamer player code (for Linux).
Is it possible to achieve the above? Is there some tutorial with which I can proceed? I have searched a lot on the net but was unable to find anything specific to my requirement. Any information or links specific to the above problem will be of great help to me. Thanks in advance.
-Regards
Since the codec and container are of new MIME types, you will have to implement a new GstElement for demuxer and codec. A simple example (for audio) is available in this location. I presume this should provide a good starting reference for you.
Some additional links:
To create a decoder plugin, you can refer to the vorbisdec implementation.
To create a demuxer, you can refer to the oggdemuxer implementation.
Reference to factory make
I have a project for the iPhone that uses Core Audio to play multiple files at the same time. So what is the fastest library/framework to animate graphics alongside Core Audio programming: Core Graphics? Core Animation? Cocos2D? OpenGL?
My needs are simple: loading, displaying, hiding, and rotating images, and panning some views.
Thanks.
André
If I am interpreting your question correctly, you want to know which of the specified frameworks can animate graphics fast enough to keep up with some audio. I would use Core Animation for this. Core Animation is a wrapper around OpenGL and is therefore very efficient. It supports both 2D and 3D animations, and all animation is hardware accelerated. Core Graphics is not for animating but for drawing, so you can rule that out. OpenGL is a possible candidate, but it is very low level and difficult to learn, so I wouldn't recommend it; furthermore, it is largely used for 3D. Cocos2D is a game framework and can be used for animation, but I don't recommend using it solely for animation if you aren't also using it for an actual game.
Hope this helps!
The question doesn't make sense. Graphics is independent of audio.
Sounds like you could get away with animating properties on UIViews, although if you need high performance you would want Core Animation.
Actually, Core Animation is conceptually the better path, seeing as you probably don't need to respond to touch events.
That is what you want anyway: Core Animation.