Notification sound not playing in J2ME - audio

I am working on a J2ME application.
I am using Nokia 6131 NFC phone. I am using NetBeans IDE.
I have 4 forms, and I play notification sounds for the user while they fill in the forms.
The problem is that the sound suddenly stops working after 3 to 4 minutes, and the only fix is to exit the application and open it again.
My Code
public void playSoundOK()
{
    try
    {
        InputStream is = getClass().getResourceAsStream("/OK.wav");
        Player player = Manager.createPlayer(is, "audio/X-wav");
        player.realize();
        player.prefetch();
        player.start();
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
}
Exception
javax.microedition.media.MediaException: AUD
    at com.nokia.mid.impl.isa.mmedia.audio.AudioOutImpl.openSession(AudioOutImpl.java:206)
    at com.nokia.mid.impl.isa.mmedia.MediaOut.openDataSession(MediaOut.java:282)
    at com.nokia.mid.impl.isa.mmedia.MediaPlayer.doPrefetch(MediaPlayer.java:155)
    at com.nokia.mid.impl.isa.amms.audio.AdvancedSampledPlayer.doPrefetch(+4)
    at com.nokia.mid.impl.isa.mmedia.BasicPlayer.prefetch(BasicPlayer.java:409)
    at org.ird.epi.ui.UtilityClass.playSoundOK(UtilityClass.java:139)
    at org.ird.epi.ui.EnrollmentForm.targetDetected(+695)
    at javax.microedition.contactless.DiscoveryManager.notifyTargetListeners(DiscoveryManager.java:700)
    at javax.microedition.contactless.DiscoveryManager.access$1200(DiscoveryManager.java:103)
    at javax.microedition.contactless.DiscoveryManager$Discoverer.notifyIndication(DiscoveryManager.java:882)
    at com.nokia.mid.impl.isa.io.protocol.external.nfc.isi.NFCConnectionHandler$IndicationNotifier.run(+67)

I would advise you to split NFC handling and audio playback into two different threads.
It is generally a bad idea to call a method that takes some time to complete (like prefetch) from inside an API callback (like targetDetected), because it makes you rely on a particularly robust internal threading model that may not actually exist in your phone's implementation of MIDP.
You should have one thread whose sole purpose is to play the sounds your application can emit. Use the NFC callback to post a non-blocking command to play a sound (typically via synchronized access to a queue of commands). The playback thread can ignore commands issued while it is busy playing a sound, since there is no point in notifying the user of multiple simultaneous NFC contacts. A minimal sketch of this pattern follows.
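Here is one way that could look (hypothetical class and method names, with a single pending-sound slot standing in for the queue; this is a sketch of the idea, not code from the original application):
import java.io.InputStream;

import javax.microedition.media.Manager;
import javax.microedition.media.Player;
import javax.microedition.media.PlayerListener;

final class SoundPlayerThread extends Thread {
    private String pendingResource;           // null means nothing queued
    private boolean running = true;

    // Called from targetDetected(); returns immediately.
    public synchronized void requestSound(String resource) {
        if (pendingResource == null) {        // ignored while a sound is queued or playing
            pendingResource = resource;
            notify();
        }
    }

    public synchronized void shutdown() {
        running = false;
        notify();
    }

    public void run() {
        while (true) {
            String resource;
            synchronized (this) {
                while (running && pendingResource == null) {
                    try { wait(); } catch (InterruptedException e) {}
                }
                if (!running) {
                    return;
                }
                resource = pendingResource;   // keep the slot occupied while playing,
            }                                 // so further requests are ignored
            playBlocking(resource);
            synchronized (this) {
                pendingResource = null;       // ready for the next request
            }
        }
    }

    private void playBlocking(String resource) {
        Player player = null;
        final Object done = new Object();
        try {
            InputStream is = getClass().getResourceAsStream(resource);
            player = Manager.createPlayer(is, "audio/X-wav");
            player.addPlayerListener(new PlayerListener() {
                public void playerUpdate(Player p, String event, Object data) {
                    if (PlayerListener.END_OF_MEDIA.equals(event)
                            || PlayerListener.ERROR.equals(event)) {
                        synchronized (done) { done.notify(); }
                    }
                }
            });
            player.realize();
            player.prefetch();
            synchronized (done) {
                player.start();
                done.wait(10000);             // safety timeout
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (player != null) {
                player.close();               // always release the native audio session
            }
        }
    }
}
The targetDetected callback then just calls something like soundThread.requestSound("/OK.wav") and returns immediately.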

You should also close your player. Register a listener like the one below in your method, before you call player.start(), so the player releases its native audio resources when playback finishes:
PlayerListener listener = new PlayerListener() {
    public void playerUpdate(Player player, String event, Object eventData) {
        if (PlayerListener.END_OF_MEDIA.equals(event)) {
            // free the audio session once the clip has finished playing
            player.close();
        }
    }
};
player.addPlayerListener(listener);
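For completeness, here is a sketch of how the original playSoundOK could look with that listener wired in (illustration only; it assumes javax.microedition.media.PlayerListener is imported, and it also closes the player on the ERROR event, since a failed player never reaches END_OF_MEDIA):
public void playSoundOK() {
    try {
        InputStream is = getClass().getResourceAsStream("/OK.wav");
        Player player = Manager.createPlayer(is, "audio/X-wav");
        // Close the player when the clip ends or fails, so audio sessions are not leaked.
        player.addPlayerListener(new PlayerListener() {
            public void playerUpdate(Player p, String event, Object eventData) {
                if (PlayerListener.END_OF_MEDIA.equals(event)
                        || PlayerListener.ERROR.equals(event)) {
                    p.close();
                }
            }
        });
        player.realize();
        player.prefetch();
        player.start();
    } catch (Exception e) {
        e.printStackTrace();
    }
}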

Related

Haptics don't play in background when bluetooth headphones are connected to Apple Watch

I have a watchOS 3 workout app that uses haptic notifications. It is set up correctly to run a workout session, and haptics work when it is running in the background. However, if Bluetooth headphones are connected to the Apple Watch, then you only get the haptic vibration OR the audio chime for the haptic, depending on whether the app is currently showing on the watch face or running in the background.
Here's how I'm playing the haptic:
WKInterfaceDevice.current().play(.notification)
Here are the details:
Apple Watch Nike+ not paired to headphones: the haptic and chime sound activate regardless of whether the watch face is on or off. The chime sound is loud and clear.
Apple Watch Nike+ paired to Bluetooth headphones:
The haptic is only active when the watch face is on (and the audio chime is off); the audio chime is on when the watch face is off (and the haptic is off). The chime sound is loud and clear.
I tested the app by pairing the Apple Watch separately with the Plantronics BackBeat Go 2 (released 7/2013) and the Bose QuietControl 30 (released 10/2016). The results were the same.
Anyone know if this is a limitation of watchOS 3, a bug, or is there something else I need to be doing?
Thanks,
Jeff
There's another factor to consider, which is whether the main Watch mute switch is on or off.
I found the following to be a suitable workaround.
When you want your audio/haptic to be noticed while music is playing on Bluetooth Headphones, have this AVAudioSession category set:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: .duckOthers)
Then, when you're done with your audio/haptic, reset the AVAudioSession back to:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)
I have this helper class to help manage the states:
import AVFoundation

enum AudioPlaybackState {
    case playback
    case playbackDuckOthers
}

class AVAudio {
    private init() {} // strictly a helper class

    static func setAudioState(_ state: AudioPlaybackState) {
        DispatchQueue.main.async {
            do {
                deactivateAudioSession()
                switch state {
                case .playback:
                    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)
                case .playbackDuckOthers:
                    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: .duckOthers)
                }
                activateAudioSession()
            } catch {}
        }
    }

    static func deactivateAudioSession() {
        activateAudioSession(false)
    }

    private static func activateAudioSession(_ value: Bool = true) {
        do {
            try AVAudioSession.sharedInstance().setActive(value)
        } catch {}
    }
}
Then, I can switch quickly by: AVAudio.setAudioState(.playbackDuckOthers)
The behavior documented in the question is the way it's supposed to be, according to Apple. The given reason is the delay between the haptic tap and the haptic audio when using Bluetooth headphones. I'm hoping this behavior changes in the future...

How do I turn the mic off?

I'm using navigator.getUserMedia for a voice memo app. It works like a charm.
When I quit the app, the microphone stays on: there is a notification "Audio Memos Mic is on" and the red dot remains in the status bar. It stays on even after the phone has been asleep for fifteen minutes.
How do I turn the mic off when I quit the app? I've checked the MediaStream API but couldn't find any reference.
Have you tried the stop method: https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder.stop
There may be a better way, but I got the light to turn off on minimized apps by using code similar to the following.
mediaRecorder is a reference to the MediaRecorder object tied to the stream; you can also just call stop on the stream returned by the getUserMedia call. I also had to call setup() again when the app initializes.
function handleVisibilityChange() {
    if (document.hidden) {
        console.log("hidden");
        if (gumStream) {
            // older API; modern browsers stop each track instead:
            // gumStream.getTracks().forEach(function (track) { track.stop(); });
            gumStream.stop();
            mediaRecorder = null;
        }
    } else {
        console.log("visible");
        setup(); // call getUserMedia again
    }
}
document.addEventListener("visibilitychange", handleVisibilityChange, false);

Resume music after exiting app on Windows Phone

I'm using a MediaElement to play background music in my app, and that works just fine.
The problem is when the user minimizes the application. When the application resumes there is no sound... I can play other sounds in my application but can't play that background music any more.
First, I have this code to stop any background music the first time the app opens:
if (Microsoft.Xna.Framework.Media.MediaPlayer.State == MediaState.Playing)
{
    Microsoft.Xna.Framework.Media.MediaPlayer.Pause();
    FrameworkDispatcher.Update();
}
The XAML code of that MediaElement:
<MediaElement AutoPlay="True" Source="/Dodaci/pozadina.mp3" x:Name="muzika_pozadina" MediaEnded="pustiPonovo" Loaded="pustiPonovo" />
and the C# code:
private void pustiPonovo(object sender, RoutedEventArgs e)
{
    muzika_pozadina.Play();
}
The sound is about 300 KB in size.
So, how can I resume playing that sound after the user resumes the application?
When your App is put into the Dormant state (when you hit the Start button, for example), the MediaElement is stopped. Then, after you return to your App (and it wasn't Tombstoned), the Page is not initialized again, which means your MediaElement is not loaded again, so your music doesn't start again.
How it can be restored depends on your purpose and code. In a very simple case, when you don't need to remember the music's last position, you can just set the Source of your MediaElement again in the OnNavigatedTo() event:
protected override void OnNavigatedTo(NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    if (e.NavigationMode == NavigationMode.Back)
        muzika_pozadina.Source = new Uri("/Dodaci/pozadina.mp3", UriKind.RelativeOrAbsolute);
}
Since you have set MediaElement.AutoPlay to true, it should start automatically (and because of that you probably also don't need your Loaded event handler pustiPonovo).
In more complicated cases you can take advantage of the Activated and Deactivated events of your App; returning to a MediaElement from the Dormant/Tombstoned state is well explained in the linked article.
You should also read about Fast App Resume, in case the user decides to return to your App via its Tile instead of through Launchers/Choosers.
I haven't tried the above code, but hopefully it will do the job.

IMFMediaPlayer hangs during SetSourceFromByteStream

Background: I'm coding a Metro-style app for Win8. I need to be able to play music files. Because of quality and space requirements we're using encoded audio (mp3/ogg).
I'm using XAudio2 to play sound effects (.wav files), but since I couldn't figure out a way to play encoded audio with it, I decided to play the music files with Media Foundation (the IMFMediaPlayer interface).
I downloaded the Metro app samples and found that the Media Engine native C++ video playback sample was closest to what I needed.
Now that my app has the media player playing music, I ran into a problem. If the device running the app is slow enough, the media player hangs. When I run the release version of the app on my device it's fine and I can hear the music just fine. But when I attach the debugger or run it on a slower device, it hangs when I set the byte stream for the media player to play.
Here's some code; you'll find it pretty similar to the sample:
StorageFolder^ installedLocation = Windows::ApplicationModel::Package::Current->InstalledLocation;

m_pickFileTask = Concurrency::task<StorageFile^>(installedLocation->GetFileAsync(filename), m_tcs.get_token());

auto player = this;
m_pickFileTask.then([player](StorageFile^ fileHandle)
{
    player->SetURL(fileHandle->Path);

    Concurrency::task<IRandomAccessStream^> fOpenStreamTask =
        Concurrency::task<IRandomAccessStream^>(fileHandle->OpenAsync(Windows::Storage::FileAccessMode::Read));

    fOpenStreamTask.then([player](IRandomAccessStream^ streamHandle)
    {
        MEDIA::ThrowIfFailed(
            player->m_spMediaEngine->Pause()
        );
        MEDIA::GetMediaError(player->m_spMediaEngine);

        player->SetBytestream(streamHandle);

        if (player->m_spMediaEngine)
        {
            MEDIA::ThrowIfFailed(
                player->m_spEngineEx->Play()
            );
            MEDIA::GetMediaError(player->m_spMediaEngine);
        }
    });
});
And here's the SetBytestream method:
void SetBytestream(IRandomAccessStream^ streamHandle)
{
    if (m_spMFByteStream != nullptr)
    {
        m_spMFByteStream->Close();
        m_spMFByteStream = nullptr;
    }

    MEDIA::ThrowIfFailed(
        MFCreateMFByteStreamOnStreamEx((IUnknown*)streamHandle, &m_spMFByteStream)
    );

    MEDIA::ThrowIfFailed(
        m_spEngineEx->SetSourceFromByteStream(m_spMFByteStream.Get(), m_bstrURL)
    );
    MEDIA::GetMediaError(m_spEngineEx);

    return;
}
The line where it hangs is:
m_spEngineEx->SetSourceFromByteStream(m_spMFByteStream.Get(), m_bstrURL)
When I'm debugging the app, I can press pause and see the stack. Well, not much of it, but at least I can see that it's stuck indefinitely at
ntdll.dll!77b7f4dc()
Any ideas why my app would hang in such a way?
(Optional: if you know a better way to play mp3/ogg in a C++ Metro-style app, let me know.)
I could not figure out why this is happening, but I managed to code a workaround:
IMFSourceReader can be used to decode MP3s and feed the decoded bytes into an XAudio2 source voice.
The XAudio2 audio stream effect sample contains a good example of how to do this.

Blackberry Audio Recording Sample Code

Does anyone know of a good repository to get sample code for the BlackBerry? Specifically, samples that will help me learn the mechanics of recording audio, possibly even sampling it and doing some on the fly signal processing on it?
I'd like to read incoming audio, sample by sample if need be, then process it to produce a desired result, in this case a visualizer.
The RIM API contains JSR 135 (the Java Mobile Media API) for handling audio and video content.
You're correct about the mess in the BB Knowledge Base. The only way is to browse it, hoping they're not going to change the site map again.
It's under Developers -> Resources -> Knowledge Base -> Java APIs & Samples -> Audio & Video.
Audio Recording
Basically, it's simple to record audio:
create a Player with the correct audio encoding
get the RecordControl
start recording
stop recording
Links:
RIM 4.6.0 API ref: Package javax.microedition.media
How To - Record Audio on a BlackBerry smartphone
How To - Play audio in an application
How To - Support streaming audio to the media application
How To - Specify Audio Path Routing
How To - Obtain the media playback time from a media application
What Is - Supported audio formats
What Is - Media application error codes
Audio Record Sample
A Thread with the Player, RecordControl and resources is declared:
import java.io.ByteArrayOutputStream;
import javax.microedition.media.*;
import javax.microedition.media.control.RecordControl;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.component.Dialog;

final class VoiceNotesRecorderThread extends Thread {
    private Player _player;
    private RecordControl _rcontrol;
    private ByteArrayOutputStream _output;
    private byte _data[];

    VoiceNotesRecorderThread() {}

    private int getSize() {
        return (_output != null ? _output.size() : 0);
    }

    private byte[] getVoiceNote() {
        return _data;
    }
}
In Thread.run(), audio recording is started:
public void run() {
    try {
        // Create a Player that captures live audio.
        _player = Manager.createPlayer("capture://audio");
        _player.realize();

        // Get the RecordControl and set the record stream.
        _rcontrol = (RecordControl) _player.getControl("RecordControl");

        // Create a ByteArrayOutputStream to capture the audio stream.
        _output = new ByteArrayOutputStream();
        _rcontrol.setRecordStream(_output);
        _rcontrol.startRecord();
        _player.start();
    } catch (final Exception e) {
        UiApplication.getUiApplication().invokeAndWait(new Runnable() {
            public void run() {
                Dialog.inform(e.toString());
            }
        });
    }
}
And in the thread's stop() method, recording is stopped:
public void stop() {
    try {
        // Stop recording, capture data from the OutputStream,
        // close the OutputStream and the player.
        _rcontrol.commit();
        _data = _output.toByteArray();
        _output.close();
        _player.close();
    } catch (Exception e) {
        synchronized (UiApplication.getEventLock()) {
            Dialog.inform(e.toString());
        }
    }
}
Processing and sampling audio stream
At the end of recording you will have an output stream filled with data in a specific audio format, so to process or sample it you will have to decode that audio stream.
On-the-fly processing is more complex: you will have to read the output stream during recording, without committing the record. That leaves several problems to solve (a rough sketch of the threading side follows the list):
synchronized access to the output stream for the recorder and the sampler (a threading issue)
reading the correct amount of audio data (you'll need to dig into the audio format to learn its framing rules)
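Here is a rough sketch of the sampler side only (the SamplerThread name and structure are hypothetical; it assumes the recorder keeps writing into the same ByteArrayOutputStream that was passed to setRecordStream(), and it simply polls for bytes that arrived since its last pass):
import java.io.ByteArrayOutputStream;

final class SamplerThread extends Thread {
    private final ByteArrayOutputStream _output;  // shared with the recorder
    private int _consumed = 0;                    // bytes already processed
    private volatile boolean _running = true;

    SamplerThread(ByteArrayOutputStream output) {
        _output = output;
    }

    public void shutdown() {
        _running = false;
    }

    public void run() {
        while (_running) {
            byte[] fresh = null;
            synchronized (_output) {              // recorder and sampler agree on this lock
                byte[] all = _output.toByteArray();
                if (all.length > _consumed) {
                    fresh = new byte[all.length - _consumed];
                    System.arraycopy(all, _consumed, fresh, 0, fresh.length);
                    _consumed = all.length;
                }
            }
            if (fresh != null) {
                process(fresh);                   // decode/visualize according to the audio format
            }
            try {
                Thread.sleep(100);                // poll roughly ten times a second
            } catch (InterruptedException e) {
            }
        }
    }

    private void process(byte[] chunk) {
        // placeholder: format-aware decoding / visualization goes here
    }
}
Note that toByteArray() copies the whole buffer on every poll, so for longer recordings you would replace ByteArrayOutputStream with your own OutputStream that exposes only the new chunk; the sketch is only meant to show the locking and the "consume only what's new" bookkeeping. Whether a chunk boundary falls on a decodable frame boundary is exactly the second problem above.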
Also possibly useful:
java.net: Experiments in Streaming Content in Java ME by Vikram Goyal
While not audio specific, this question does have some good "getting started" references.
Writing BlackBerry Applications
I spent ages trying to figure this out too. Once you've installed the BlackBerry Component Packs (available from their website), you can find the sample code inside the component pack.
In my case, once I had installed the Component Packs into Eclipse, I found the extracted sample code in this location:
C:\Program Files\Eclipse\eclipse3.4\plugins\net.rim.eide.componentpack4.5.0_4.5.0.16\components\samples
Unfortunately, when I imported all that sample code I had a bunch of compile errors. To work around that I just deleted the 20% of packages with compile errors.
My next problem was that launching the Simulator always launched the first sample code package (in my case activetextfieldsdemo); I couldn't get it to run just the package I was interested in. The workaround for that was to delete all the packages listed alphabetically before the one I wanted.
Other gotchas:
- Right-click on the project in Eclipse and select Activate for BlackBerry.
- Choose BlackBerry -> Build Configurations... -> Edit... and select your new project so it builds.
- Make sure you put your BlackBerry source code under a "src" folder in the Eclipse project, otherwise you might hit build issues.
