Monotouch - Fast Forward/Rewind Audio Stream - xamarin.ios

I'm using the MonoTouch sample that plays an audio file from a remote server (https://github.com/xamarin/monotouch-samples/tree/master/StreamingAudio), and I need to implement fast-forward and rewind functionality.
The StreamingPlayback class implements Play and Pause via OutputAudioQueue. I have worked with AVAudioPlayer in Objective-C, where I could do something like:
- (void)rewind
{
    myPlayer.currentTime -= 10.0;
}
However, with OutputAudioQueue, CurrentTime is read-only, and it looks to me like there is no way to implement rewind/fast-forward with it. Does anyone have a solution to this, or a better way to play my remote audio files?
Thanks

Related

Pre Spatializer Effect - Unity and Google VR Audio

I am using GVR Audio within Unity to provide HRTFs for my audio sources. My project involves modelling the acoustics of the virtual environment, which needs to happen before the HRTF filters are applied.
On a default Unity audio source there is an option to spatialize post effects, meaning I can insert my own effect there. However, the GVR Audio source has no such option. What is the recommended way to spatialize post effects with GVR?
GvrAudioSource uses Unity's AudioSource under the hood. This means it is possible to apply pre-spatialization processing using the OnAudioFilterRead method, as you'd normally do for audio sources in your scripts.
Alternatively, for other audio effect components that require the spatializePostEffects option, you can simply enable the option through script by adding the corresponding line below to the Awake() function in GvrAudioSource.cs:
void Awake () {
    ...
    audioSource.spatialBlend = 1.0f;
    audioSource.spatializePostEffects = true; // Add this line.
    OnValidate();
    ...
}
Please also note that this unfortunately does not currently allow you to add Unity's stock audio effect components (e.g. AudioLowPassFilter) in the Editor, as it would complain about the lack of an AudioSource component on that game object. This is, however, only a UI limitation; adding a component with such restrictions at run time should still work as expected.
Hope this answers your question.
Cheers

Using javax.sound.sampled with openjdk-7 on Raspi

I'm having big problems getting good sound out of my Raspi using Java.
I want to write a little AirPlay client for a media server I wrote in Java. I started with the Player class from JavaZoom (http://www.javazoom.net/javalayer/docs/docs1.0/javazoom/jl/player/package-summary.html), which gave me playback of a streamed MP3 file that was not exactly choppy, but somehow distorted and slower than normal.
My first idea was that decoding an MP3 might be a bit too much for the Raspi, especially since overclocking helped a little.
So now I'm converting the MP3 to a WAV file on the server and then streaming it to the Raspi, playing it with the javax.sound.sampled.* stuff. Yet, no improvement :|
Does anybody have experience with playing sound files from Java on a Raspi? Any advice helps!
Thanks, Stefan
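For reference, the javax.sound.sampled playback path the question describes usually looks something like the sketch below. The class and method names here are illustrative, not the asker's actual code; the point is that the stream must support mark/reset so the WAV header can be sniffed.
import java.io.BufferedInputStream;
import java.io.InputStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;

public class WavStreamPlayer {
    public static void play(InputStream networkStream) throws Exception {
        // AudioSystem needs mark/reset support to read the WAV header
        AudioInputStream in = AudioSystem.getAudioInputStream(
                new BufferedInputStream(networkStream));
        AudioFormat format = in.getFormat();
        SourceDataLine line = (SourceDataLine) AudioSystem.getLine(
                new DataLine.Info(SourceDataLine.class, format));
        line.open(format);
        line.start();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf, 0, buf.length)) != -1) {
            line.write(buf, 0, n); // blocks until the device accepts the data
        }
        line.drain();
        line.close();
        in.close();
    }
}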
I suggest you try using JavaFX instead. Its media support is way better than the standard JDK's. Besides, you are already using Java 7, so migrating will be easy. Here is how:
import javafx.application.Platform;
import javafx.embed.swing.JFXPanel;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;
import javax.swing.SwingUtilities;

// The JavaFX runtime must be initialized before Media/MediaPlayer can be used;
// from Swing code, constructing a JFXPanel is the usual way to do that.
new JFXPanel();

// Utils.findURI is the answerer's own helper that resolves the file to a URI.
final Media media = new Media(Utils.findURI(baseDirName, soundFileName).toString());
SwingUtilities.invokeLater(new Runnable() {
    @Override
    public void run() {
        Platform.runLater(new Runnable() {
            @Override
            public void run() {
                MediaPlayer mediaPlayer = new MediaPlayer(media);
                mediaPlayer.setCycleCount(1);
                mediaPlayer.play();
            }
        });
    }
});
The implementation of sound in OpenJDK has lots of bugs. One of them is that it abuses the sound system and monopolizes it, so on Linux you have a better chance of getting things working if you do as above.

Using MediaPlayer with a Timer

I currently have a timer, and I want it to play a sound file when it ends. However, it only plays a fraction of the audio file I have stored. I have nothing in the code that tells the file to stop playing, so I don't know why this is happening.
I start out with this:
public class DrinkinzActivity extends Activity {
    /** Called when the activity is first created. */
    Timer myTimer;
    public MediaPlayer mMediaPlayer;
Then in the onCreate method I have this:
mMediaPlayer = new MediaPlayer(); // redundant: create() below returns a ready player
mMediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.soundfile);
Then when the timer runs out, I have this:
mMediaPlayer.start();
I haven't even gotten to the part where the user can pick which sound file to play, or use a ringtone instead. Is there something I'm missing?
Apparently I answered my own question. The audio file I was using was in WMA format, and Android (being Linux-based) didn't really like that. I converted it to MP3 and it works fine as of right now. It had nothing to do with the timer.
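For reference, here is a minimal consolidated sketch of the setup described in the question above. The resource name R.raw.soundfile comes from the question; the 10-second delay, timer cancellation, and the release() call are illustrative additions, not part of the original code:
import java.util.Timer;
import java.util.TimerTask;
import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;

public class DrinkinzActivity extends Activity {
    Timer myTimer;
    public MediaPlayer mMediaPlayer;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // create() returns a player already prepared for the given resource
        mMediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.soundfile);

        myTimer = new Timer();
        myTimer.schedule(new TimerTask() {
            @Override
            public void run() {
                // plays the whole file; nothing here stops it early
                mMediaPlayer.start();
            }
        }, 10000); // hypothetical 10-second countdown
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        myTimer.cancel();       // stop the timer with the activity
        mMediaPlayer.release(); // free the player to avoid leaking it
    }
}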

How do I play a sound file in Java ME on Samsung mobile phones?

File formats I would like to play include .wav, .mp3, .midi.
I have tried using the Wireless Toolkit classes with no success. I have also tried using the AudioClip class that is part of the Samsung SDK; again with no success.
If the device supports audio/mpeg, you should be able to play MP3 files using this code inside your MIDlet.
This works on my Nokia Symbian phones:
// Put this into your MIDlet's run() method.
public void run()
{
    try
    {
        InputStream is = getClass().getResourceAsStream("your_audio_file.mp3");
        Player player = Manager.createPlayer(is, "audio/mpeg");
        // if "audio/mpeg" doesn't work, try "audio/mp3"
        player.realize();
        player.prefetch();
        player.start();
    }
    catch (Exception e)
    {
        e.printStackTrace(); // at least log the failure instead of swallowing it
    }
}
As for emulators, my Nokia experience is that I couldn't make the emulator play MP3, but when I put the application on the phone it works...
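To check up front whether a device supports MPEG audio at all, you can query the standard MMAPI Manager; a small sketch (the fallback suggestion at the end is mine, not from the answer):
// Query the device's MMAPI implementation for supported content types;
// passing null means "for any protocol".
String[] types = Manager.getSupportedContentTypes(null);
boolean mp3Supported = false;
for (int i = 0; i < types.length; i++) {
    if ("audio/mpeg".equals(types[i]) || "audio/mp3".equals(types[i])) {
        mp3Supported = true;
    }
}
// If mp3Supported is false, fall back to WAV or MIDI.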
Without source code to review, I would suggest using the Wireless Toolkit (from http://java.sun.com) first.
It contains the standard J2ME emulator for Windows and example code that will allow you to play a WAV file.
Assuming that works for you, try the same code on your Samsung device (of course, the location of the WAV file will probably change, so you have a tiny modification to make in the example code).
Assuming that works too, compare your failing code with the example code.

j2me screen flicker when switching between canvases

I'm writing a mobile phone game using J2ME. In this game, I am using multiple Canvas objects.
For example, the game menu is a Canvas object, and the actual game is a Canvas object too.
I've noticed that on some devices, when I switch from one Canvas to another, e.g. from the main menu to the game, the screen momentarily "flickers". I'm using my own double-buffered Canvas.
Is there any way to avoid this?
I would say that using multiple canvases is generally bad design. On some phones it will even crash. The best approach is really a single Canvas that tracks the state of the application. Then, in the paint method, you would have:
protected void paint(final Graphics g) {
    if (menu) {
        paintMenu(g);
    } else if (game) {
        paintGame(g);
    }
}
There are better ways to handle application state, with screen objects that would make the design cleaner (a sketch follows below), but I think you got the idea :)
/JaanusSiim
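Here is a sketch of the screen-object approach mentioned above; all the names in it are hypothetical, not from the original answer:
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;

// Each screen knows how to draw itself.
interface Screen {
    void paint(Graphics g);
}

class MenuScreen implements Screen {
    public void paint(Graphics g) { /* draw the menu */ }
}

class GameScreen implements Screen {
    public void paint(Graphics g) { /* draw the game */ }
}

// A single Canvas that delegates to whichever screen is current,
// so switching screens never switches the Displayable.
class MainCanvas extends Canvas {
    private Screen current = new MenuScreen();

    void setScreen(Screen s) {
        current = s;
        repaint();
    }

    protected void paint(Graphics g) {
        current.paint(g);
    }
}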
Do you use double buffering? If the device itself does not support double buffering, you should define an off-screen buffer (an Image), paint to it first, and then paint the end result to the real screen. Do this for each of your canvases. Here is an example:
public class MyScreen extends Canvas {
    private Image osb;
    private Graphics osg;
    //...

    public MyScreen()
    {
        // if the device is not double buffered,
        // use an Image as an off-screen buffer
        if (!isDoubleBuffered())
        {
            osb = Image.createImage(screenWidth, screenHeight);
            osg = osb.getGraphics();
            osg.setFont(defaultFont);
        }
    }

    protected void paint(Graphics graphics)
    {
        if (!isDoubleBuffered())
        {
            // do your painting on the off-screen buffer first
            renderWorld(osg);
            // once done, draw the off-screen image to the real screen
            graphics.drawImage(osb, 0, 0, Tools.GRAPHICS_TOP_LEFT);
        }
        else
        {
            osg = graphics;
            renderWorld(graphics);
        }
    }
}
A possible fix is to synchronise the switch using Display.callSerially(); a sketch follows below. The flicker is probably caused by the app attempting to draw to the screen while the switch of the Canvas is still ongoing. callSerially() is supposed to wait for the repaint to finish before attempting to call run() again.
But all this is entirely dependent on the phone, since many devices do not implement callSerially(), never mind follow the implementation described in the official documentation. The only devices I've known to work correctly with callSerially() were Siemens phones.
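A sketch of that callSerially() idea (display and gameCanvas are hypothetical fields):
// Switch the Displayable first, then queue the first repaint so it
// runs only after the switch has been served by the display thread.
display.setCurrent(gameCanvas);
display.callSerially(new Runnable() {
    public void run() {
        // runs after pending display events have been handled
        gameCanvas.repaint();
    }
});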
Another possible attempt would be to add a Thread.sleep() of something huge, like 1000 ms, making sure that you've called your setCurrent() method beforehand. This way, the device might manage to make the change before the Displayable attempts to draw.
The most likely problem is that it is a device issue and the guaranteed fix to the flicker is simple - use one Canvas. Probably not what you wanted to hear though. :)
It might be a good idea to use the GameCanvas class if you are writing a game. It is much better suited for that purpose, and when used properly it should solve your problem.
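A minimal sketch of what the GameCanvas approach could look like; this is an illustration of the suggestion, not code from the answer. GameCanvas provides a built-in off-screen buffer: you draw via getGraphics() and push the whole frame at once with flushGraphics(), which avoids flicker:
import javax.microedition.lcdui.Graphics;
import javax.microedition.lcdui.game.GameCanvas;

public class MyGameCanvas extends GameCanvas implements Runnable {

    public MyGameCanvas() {
        super(true); // true suppresses regular key events for game keys
    }

    public void run() {
        Graphics g = getGraphics(); // draws into the off-screen buffer
        while (true) {
            g.setColor(0x000000);
            g.fillRect(0, 0, getWidth(), getHeight());
            // ... draw the current frame here ...
            flushGraphics(); // copy the finished frame to the screen in one go
            try {
                Thread.sleep(50); // ~20 frames per second
            } catch (InterruptedException e) {
                return;
            }
        }
    }
}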
Hypothetically, using one Canvas with state machine code for your application is a good idea. However, the only device I have to test applications on (a MOTO v3) crashes at resource-loading time simply because there is too much code to be loaded into one GameCanvas (I haven't tried with Canvas). It's as painful as it is real, and at the moment I haven't found a solution to the problem.
If you're lucky enough to have a good number of devices to test on, it is worth implementing both approaches and pretty much making versions of your game for each device.
