XNA / MonoGame: Is there an alternative to XACT? (audio)

I'm making a game in XNA. I haven't looked at MonoGame yet, but I'm conscious that I'll probably be looking at it in the future.
I haven't implemented sound in my game yet. Atmosphere is very important in this game, so having different reverb and delay on the sounds in different rooms matters.
I could use XACT to do this dynamically; however, I know XACT is not supported by MonoGame.
Is there something else I could look at?
What I could do is record three versions of each sound effect with low, medium, and high reverb and just play a different one depending on which room you are in. I think this would work OK, and I'm assuming that with less real-time audio processing going on it will be lighter on the CPU.

This is an old question, but I think it still deserves a proper answer for anyone who comes looking for one.
For sound in MonoGame, you can use the SoundEffect or MediaPlayer classes to play audio.
Example for the SoundEffect class:
Declaration: SoundEffect soundEffect;
In LoadContent(): soundEffect = Content.Load<SoundEffect>("sound_title");
Wherever you want to play the sound: soundEffect.Play();
Example for the SoundEffectInstance class (using the SoundEffect created above):
SoundEffectInstance soundEffectInstance = soundEffect.CreateInstance();
soundEffectInstance.Play();
You can then stop the sound effect whenever you want with: soundEffectInstance.Stop();
Example for the MediaPlayer class (best for background music):
In LoadContent():
Song song = Content.Load<Song>("song_title");
MediaPlayer.Play(song);
Hope this helps!

Related

Libgdx music/sound effect with reverb

Is it possible to add specific reverb to my sound effect/music track in libgdx?
I want to add outdoor/indoor reverb to make all the tracks sound the same.
I don't think that libgdx has a mechanism for adding effects to sound. The Sound class provides no function for this.
I see three solutions here:
Prepare two versions of each sound (one with reverb, one without - this is easy to do with software like Audacity) and play one or the other depending on the player's current environment
Try to implement it yourself
I see that the Sound class has a setPitch(long soundId, float pitch) method. According to Wikipedia, reverb is essentially a series of echoes, so maybe (but not for sure) you could fake the effect by:
making a copy of the sound
slowing it down a little
lowering its volume
playing it alongside the original sound (a slight delay makes it read more like an echo) - see the sketch below
Find a third-party library that will do it for you - Google returns some examples of libraries that work with libgdx, such as the SoundTouch Audio Processing Library - maybe you will find something useful
The first option is the easiest, and if you are not afraid of the extra storage space I would strongly recommend it (although there is no harm in trying to implement it yourself).
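If you do want to try the do-it-yourself echo trick, here is a rough sketch using only the stock libgdx Sound API plus Timer for the delay. The file name, volume, pitch, and delay values are made up for illustration; tune them by ear.
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.Sound;
import com.badlogic.gdx.utils.Timer;

public class FakeReverb {
    // Load once after libgdx is initialised (e.g. in create()); "step.wav" is a placeholder asset.
    private final Sound sound = Gdx.audio.newSound(Gdx.files.internal("step.wav"));

    public void play() {
        sound.play(1.0f); // the dry, original sound
        // A quieter, slightly lower-pitched copy played a moment later acts as a crude echo.
        Timer.schedule(new Timer.Task() {
            @Override
            public void run() {
                sound.play(0.35f, 0.97f, 0f); // volume, pitch, pan of the "echo"
            }
        }, 0.08f); // delay in seconds
    }
}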
I've implemented reverb, positional audio, and arbitrary filters using OpenAL against the latest libgdx 1.10+/lwjgl 3+ with this demo code, based off of gdx-sfx (which only works with lwjgl 2) and libgdx-audio-effects.
I'd like to promote this into a fully fledged library at some point 😂
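To give an idea of what the OpenAL route involves, below is a minimal sketch of wiring a reverb effect to a source using LWJGL 3's EFX bindings. It assumes the ALC_EXT_EFX extension is available and that you already have a raw OpenAL source handle from wherever you manage your audio (libgdx keeps its sources internal, which is exactly what libraries like the ones above work around).
import static org.lwjgl.openal.AL11.alSource3i;
import static org.lwjgl.openal.EXTEfx.*;

public class ReverbSetup {
    // sourceId is assumed to be an existing OpenAL source handle.
    public static void attachReverb(int sourceId) {
        // Create a reverb effect and tweak one of its parameters.
        int effect = alGenEffects();
        alEffecti(effect, AL_EFFECT_TYPE, AL_EFFECT_REVERB);
        alEffectf(effect, AL_REVERB_DECAY_TIME, 4.0f); // longer decay = bigger-sounding room

        // Load the effect into an auxiliary effect slot.
        int slot = alGenAuxiliaryEffectSlots();
        alAuxiliaryEffectSloti(slot, AL_EFFECTSLOT_EFFECT, effect);

        // Route the source's output through the slot (send 0, no extra filter).
        alSource3i(sourceId, AL_AUXILIARY_SEND_FILTER, slot, 0, AL_FILTER_NULL);
    }
}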

Play More Than One Sound File in LiveCode

I want to create a game with sound effects. When I start the game, the background music should play until the game is over. When I click on something in the game (such as a button), a sound effect should play, but at the moment the background music gets stopped.
How can I make the background music keep playing while the sound effect from an object is playing?
I already have these scripts...
Card script...
on openCard
play "backgroundmusic.wav" looping
end openCard
Buttons (or any object)...
on mouseup
play "sound.wav"
end mouseup
How do I play these sounds together?
Update: I found a game uploaded to a game jam that was ranked #1. When I played it, the sound was amazing: it has both background music and sound effects. But the owner hasn't uploaded the LiveCode stack file, so I can't study it. The game is entitled Space Shooter Game. The sound in this game is what I expect.
Note:
From what I figured out from the answers, using a player object can work. But this requires QuickTime, which I don't have installed on my PC. I also want the sound to play on mobile devices.
As it stands, the soundChannel property has no effect in LiveCode and is only provided for HyperCard compatibility.
Currently on desktop there are two ways to do multi-channel sound: 1) play imported sounds as one channel, and use a player object as the second channel, or 2) use two player objects.
Typically, a good option is to import short, play-once sounds into the stack as sound effects and reserve the player object for background music. Imported sounds usually play with the least latency; however, you cannot play multiple imported sounds simultaneously -- attempting to play a second sound while a first is playing will stop the first in order to play the second. If you need to play overlapping sound effects, this option alone will not work; you must use a combination of playback options.
Multiple players can be used, but note that there can be some latency during the process of loading a sound (assigning a sound's filepath to a player) and playing it.
Also note that truly seamless playback of a track is difficult if not impossible -- LiveCode will at some point be hit by a system event that causes a slight pause between loops. A while back, Trevor DeVore made an addition to his Enhanced QuickTime external that enabled true seamless looping of audio. However, with Apple getting rid of QuickTime, it's unknown how much longer this option will be useful.
With the enhancements that the RunRev guys have been making to the engine, it's likely we'll see improvements in media playback and management, hopefully sooner rather than later.
In the LiveCode forums, they suggest using player objects on the card instead and telling them to play.
In HyperCard, you could set the soundChannel property for that. Have you checked in the LiveCode documentation whether it supports that? The docs for the play command and the sound property might also help. Maybe those contain hints. FWIW, in HyperCard
set the soundChannel to 1
play "BackgroundMusic"
set the soundChannel to 2
play "SoundEffect"
would play the sound effect and background music at the same time. Maybe that's how it works in LiveCode as well?
The multimedia capabilities are going through a transformation. Previously everything (well, almost everything) was built around QuickTime, and you needed to add a player control for each concurrent sound. Currently the whole foundation is being changed because Apple dropped QuickTime, but assuming you develop for desktop you should still (again) be able to add a player object and then use:
start player "name of player"
You can also create a player object dynamically with
create player "my player"
and then use
set the filename of player "my player" to "/path/to/your/audio/file"
before starting your sound. As long as you have different players for your different sounds, they should play simultaneously.
on openCard
-- loop the background track on its own named channel (mobile only)
put specialFolderPath("engine") & "/soundfx/backgroundmusic.wav" into tSound
mobilePlaySoundOnChannel tSound, "Background", "looping"
end openCard
on mouseup
-- a short effect; on mobile this could also be sent to a separate channel with mobilePlaySoundOnChannel
play "sound.wav"
end mouseup

How to implement audio effects like 3D surround sound and reverb in JavaFX

I have developed a music player in JavaFX.
Is there any way to implement a 3D audio effect similar to the WOW effect in Windows Media Player?
I have searched, and there seems to be no MediaPlayer method for implementing such a thing.
My JavaFX level: intermediate
AFAIK, the only effect that can be applied to the JavaFX MediaPlayer is an equalizer, via the javafx.scene.media.AudioEqualizer class. You can get the MediaPlayer's AudioEqualizer with the getAudioEqualizer method, and modify its existing bands, add/remove bands, etc.
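For illustration, here is a minimal sketch of adjusting those bands. It assumes the JavaFX toolkit is already running; the class name, method name, media URL, and the 250 Hz / 6 dB values are placeholders, not anything prescribed by the API.
import javafx.scene.media.AudioEqualizer;
import javafx.scene.media.EqualizerBand;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;

public class EqualizerDemo {
    // Call this from a running JavaFX application.
    public static MediaPlayer playWithBassBoost(String mediaUrl) {
        MediaPlayer player = new MediaPlayer(new Media(mediaUrl));
        AudioEqualizer eq = player.getAudioEqualizer();
        eq.setEnabled(true);
        // Boost the lower bands a little for a "wider" feel.
        for (EqualizerBand band : eq.getBands()) {
            if (band.getCenterFrequency() < 250) {
                band.setGain(6.0); // dB, must stay within EqualizerBand.MIN_GAIN..MAX_GAIN
            }
        }
        player.play();
        return player;
    }
}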
However, if you want to implement other effects (reverb, delay, distortion... whatever else), I think you're out of luck: The JavaFX media API doesn't provide methods for that, and it doesn't seem to be meant to be extensible in any way (you can't add support for other codecs either, for example).
If you need more than what the JavaFX media API provides, the only solution for serious media playback in java seems to be to use a native library with a Java wrapper. vlcj (website here, Javadoc here) seems to be a good solution: it offers a java wrapper around VLC, which is a really powerful media player, so you should be able to do most of what you may want - worst case, it provides APIs to directly access the audio buffer and manipulate it yourself. It's clearly quite a bit more involved than using JavaFX's native media playback though...

Corona SDK: Is there a difference between audio.play() and media.play() and which one is better?

Is there a difference between audio.play() and media.play() and which one is better?
The audio.* API calls use the OpenAL audio layer to play. They are considered a safer and better way to play audio in Corona SDK. You can have 32 different sounds playing at once. You can control the volume on each channel independently, pause and resume, fade in, fade out, etc. It is the preferred way to play sound.
The media.* API calls write directly to the hardware, and you cannot control the volume or have multiple sounds playing at once. The media.* API calls are, however, good for video and for playing long clips, like podcasts, since that audio can be backgrounded. More importantly, on Android, Google's OpenAL implementation is poor, and under 4.x there is a significant lag between the time you call audio.play() and the sound actually playing. The lag isn't as bad under 2.2 and 2.3, but it is still there. The media.* API calls, if you're playing a short clip, will play in a timely fashion.
media API: only one sound can be playing at a time using this API. Calling it with a different sound file will stop the existing sound and play the new one.

How to check that ads are being played, before the real video plays?

I am working on a site, which airs ads before the real video plays.
The business requirement is that the ads should play before the video plays.
I am using Watir for testing. Can you help me in this regard?
Thanks.
You may want to investigate Sikuli. I've seen other threads where people were using it in combination with Watir to work with things like Flash. However, since it works based on visual recognition, I expect it would not work well with video while it is playing (a changing image that might only be 'right' for a fraction of a second), unless there is some relatively static aspect of the screen that could be used to tell that video playback is in progress. See this blog posting for more info.
