I am working with jPlayer 2.4.
I have used the call player.pause() to pause the audio, and this works fine.
However, there are no 'resume' or 'continue' calls, so I used the call player.play().
But it starts at the beginning of the song.
What is the proper way to pause and resume a song in jPlayer?
According to the jPlayer dev guide:
$(id).jPlayer( "play", [Number: time] ) : jQuery
Description: This method is used to play the media specified using jPlayer("setMedia", media). If necessary, the file will begin downloading.
Without the time parameter, new media will play from the start. Open media will play from where the play-head was when previously paused using jPlayer("pause", [time]).
So, consider reviewing your code and making use of the correct jPlayer selector, e.g. $('#jquery_jplayer').jPlayer('play');. It should do the trick.
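To illustrate (a minimal sketch, assuming the player was initialised on an element with id jquery_jplayer):

// Pause at the current position; jPlayer remembers the play-head.
$('#jquery_jplayer').jPlayer('pause');

// Calling play() without a time argument resumes from the paused position.
$('#jquery_jplayer').jPlayer('play');

// Calling play(0) instead restarts the media from the beginning.
$('#jquery_jplayer').jPlayer('play', 0);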
I'm trying to get a multi-audio HLS stream working on a v3 Google Cast custom receiver app. The master playlist of the stream refers to several video renditions of different resolution and two alternative audio tracks:
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aac",LANGUAGE="de",NAME="TV Ton",DEFAULT=YES, AUTOSELECT=YES,URI="index_1_a.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aac",LANGUAGE="de",NAME="Audiodeskription",DEFAULT=NO, AUTOSELECT=NO,URI="index_2_a.m3u8"
#EXT-X-STREAM-INF:AUDIO="aac",BANDWIDTH=383000,RESOLUTION=320x176,CODECS="avc1.4d001f, mp4a.40.2",CLOSED-CAPTIONS=NONE
index_0_av.m3u8
...more renditions
#EXT-X-STREAM-INF:AUDIO="aac",BANDWIDTH=3697000,RESOLUTION=1280x720,CODECS="avc1.4d001f, mp4a.40.2",CLOSED-CAPTIONS=NONE
index_6_av.m3u8
The video plays fine in both the sender and receiver apps, and I can see both audio tracks in the sender app, but when casting to the receiver there are no controls for changing the audio track.
When accessing the AudioTracksManager's getTracks() method while intercepting the LOAD message like so...
playerManager.setMessageInterceptor(
  cast.framework.messages.MessageType.LOAD, loadRequestData => {
    loadRequestData.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS;
    const audioTracksManager = playerManager.getAudioTracksManager();
    console.log(audioTracksManager.getTracks());
    console.log('Load request: ', loadRequestData);
    return loadRequestData;
  });
I get an error saying:
Uncaught Error: Tracks info is not available.
Maybe unrelated, but super weird: I can console.log the request's media prop and see its tracks prop (an array with the expected one video and two audio tracks); however, if I try to access the tracks property in the LOAD message interceptor, I get undefined.
I cannot look into the iOS sender code yet, so I tried to eliminate error sources on the receiver end. The thing is:
I always assumed that the receiver identifies alternative audio tracks on its own when loading HLS playlists. Is this assumption correct or can the AudioTracksManager only access tracks that have been previously defined in a sender app?
I couldn't find a clear statement on that in the Google Cast reference...
OK, feeling stupid for the time I spent on this, but I'm finally able to answer my own question. I didn't realize that I was accessing the AudioTracksManager in the wrong place - namely in the LOAD message interceptor instead of in a PLAYER_LOAD_COMPLETE event listener (as is properly documented here).
After placing my logic into this event listener I was able to access and programmatically set my audio tracks.
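For reference, a minimal sketch of what now works for me (the setActiveById call is just an example of programmatically picking a track):

playerManager.addEventListener(
  cast.framework.events.EventType.PLAYER_LOAD_COMPLETE, () => {
    const audioTracksManager = playerManager.getAudioTracksManager();
    const tracks = audioTracksManager.getTracks();
    console.log('Audio tracks:', tracks);
    // Example: switch to the second audio track by its id.
    if (tracks.length > 1) {
      audioTracksManager.setActiveById(tracks[1].trackId);
    }
  });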
So to answer my original question: Yes, the receiver app automatically identifies alternative audio tracks from an HLS playlist.
In my Corona SDK project, I have a composer scene called "menu.lua" (created with composer.newScene()) that is the first scene, called by the main.lua file. I have a background track only for this scene, loaded into a local variable in scene:create() with audio.loadSound(). When I load another scene (let's suppose it's a "credits" scene - static, with no music, sounds, animations, timers, etc.) and then come back to the menu scene, the audio still plays, but at a lower volume.
The audio plays in a loop on channel 2: I call audio.play() in the scene:show "will" phase, fade it out with audio.fadeOut() in the scene:hide "will" phase, stop it with audio.stop() in the "did" phase, and dispose of it with audio.dispose() in scene:destroy.
In "menu.lua" file
local composer=require("composer")
local scene=composer.newScene()
local theme --this is the variable for audio
function scene:create(event)
local sceneGroup=self.view
theme=audio.loadSound("sfx/theme_short.mp3")
end
function scene:show(event)
local sceneGroup=self.view
if event.phase=="will"
audio.play(theme,{loops=-1,channel=2})
end
end
function scene:hide(event)
local sceneGroup=self.view
if event.phase=="will" then
audio.fadeOut(theme,{500})
end
elseif event.phase=="did" then
audio.stop(2)
end
end
function scene:destroy(event)
local sceneGroup=self.view
audio.dispose(theme)
end
The other scene (let's suppose it's "credits.lua") is opened by a button with a "tap" event attached. In "credits.lua" I use this function, also called from a button's "tap" listener, to go back to the "menu.lua" scene:
local function goMenu()
    composer.removeScene("menu")
    composer.gotoScene("menu", "slideUp", 500)
    return true
end
I've already tried playing the audio in the scene:show "did" phase and in scene:create, but the problem persists. It happens with all three scenes, which are all static. Any ideas?
You should replace
audio.fadeOut(theme,{500})
with
audio.fadeOut( { channel=2, time=500 } )
since you are using the wrong syntax.
See audio.fadeOut()
Make sure you read the "Gotcha" section of the docs:
When you fade the volume, you are changing the volume of the channel. This value is persistent and it's your responsibility to reset the volume on the channel if you want to use the channel again later (see audio.setVolume()).
You are responsible for setting the channel volume back, because fadeOut changes the volume of the channel itself, and that change persists after playback stops.
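Putting it together, a minimal sketch of the corrected scene:hide handler (assuming the same channel-2 setup as in your question, and that full volume is the desired reset value):

function scene:hide(event)
    if event.phase == "will" then
        -- Fade channel 2 (not the handle) down over 500 ms.
        audio.fadeOut({ channel = 2, time = 500 })
    elseif event.phase == "did" then
        audio.stop(2)
        -- The fade left channel 2 at volume 0; restore it so the
        -- track is audible the next time this scene plays it.
        audio.setVolume(1.0, { channel = 2 })
    end
end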
I am creating a game using libGDX. I have a lot of small sound files in MP3 format, and since there are so many I do not preload them. I only load the sound I want to play when it is to be used, like this:
actorSound = Gdx.audio.newSound(Gdx.files.internal(sound));
The code above works great, but unfortunately the rest of my sounds do not. The actor above has its own class and plays a different sound every time it is touched.
When I try to play sounds in my GameScreen, I get the following error:
AUDIO_OUTPUT_FLAG_FAST denied by client
All the audiofiles have the same properties and have been recorded using the same microphone & Audacity. The files are all 44100Hz and only a few kb in size each.
I wonder why the sounds the actor plays work when the other sounds do not.
I decided to try changing the non-working sounds to Music instead, and now they play fine - for a little while, that is. I can play a full game, return to the menu, and start a new game again. The second time I start a game I only get three sounds from the GameScreen, and then it is silent except for the sounds from the actor. The error that appears looks like this:
E/MediaPlayer: Error (1,-19)
I load the Music just the same way as the Sound:
gameSound = Gdx.audio.newMusic(Gdx.files.internal(soundeffect));
I have looked into the two errors by reading posts like these:
AUDIO_OUTPUT_FLAG_FAST denied by client
Mediaplayer error (-19,0) after repeated plays
But I'm not sure what to do or change to solve my problem. I would prefer Sound if that is possible, but Music is an acceptable workaround...
When it comes to Music, the cause is probably what the linked post suggests: I do not release the media players. I am not sure how to do that, though.
When I leave the GameScreen for another screen, like the MenuScreen or RewardScreen, I dispose of the Music first. The other screens use Music as well, and their sounds are loaded when needed. When I change back to the GameScreen I dispose again and then start a new game...
Any ideas or suggestions? Any help is greatly appreciated.
I added an AssetManager as suggested, and I now have a loading screen that loads all the sounds. I load them as Sound and not Music, which is how I prefer it.
The sounds work well in the actual game, but once I get to the reward screen only the first sound plays, and after that the app crashes with the following error:
06-02 07:47:09.774 6208-6282 E/AndroidRuntime: FATAL EXCEPTION: GLThread 2935
Process: PID: 6208
java.lang.NullPointerException: Attempt to read from field 'com.badlogic.gdx.assets.AssetManager Assets.Assets.manager' on a null object reference
    at DelayedSounds(RewardsScreen.java:538)
    at RewardsScreen.Update(RewardsScreen.java:567)
    at Screens.RewardsScreen.render(RewardsScreen.java:577)
    at Game.render(Game.java:46)
    at com.badlogic.gdx.backends.android.AndroidGraphics.onDrawFrame(AndroidGraphics.java:459)
    at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1562)
    at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1262)
06-02 07:47:13.823 6208-6208 E/AndroidGraphics: waiting for pause synchronization took too long; assuming deadlock and killing
The DelayedSounds method looks like this:
public void DelayedSounds() {
    timer = timer + Gdx.graphics.getDeltaTime();
    if (playitem && timer > 2) {
        itemsound = "vinster/" + item + ".mp3";
        assets.manager.get(itemsound, Sound.class).play(volume);
        //sound = Gdx.audio.newMusic(Gdx.files.internal(itemsound));
        //sound.setVolume(volume);
        //sound.play();
        playitem = false;
    }
    if (playkeep && timer > 3) {
        assets.manager.get("dialog/VINSTEN.mp3", Sound.class).play(volume);
        //sound = Gdx.audio.newMusic(Gdx.files.internal("prizes/prize.mp3"));
        //sound.setVolume(volume);
        //sound.play();
        playkeep = false;
    }
}
As can be seen, I use an AssetManager now and have commented out my old code for testing purposes. I define the AssetManager in my main class and then pass it around to all other classes that use sounds.
The RewardsScreen will only play the first sound and then crashes on what appears to be a null object reference.
If I change my code back to using Music in the RewardsScreen, it works fine (see the code that is commented out).
The sound I try to play is exactly the same in both cases. The Assets class that handles the loading of all my assets includes the sounds, and since I still get a null object I assume one or more items fail to load?
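For reference, the screens get their Assets reference roughly like this (heavily simplified placeholder code; the real classes contain much more):

import com.badlogic.gdx.ScreenAdapter;

public class RewardsScreen extends ScreenAdapter {
    // Shared Assets instance handed in from the main Game class.
    private final Assets assets;

    public RewardsScreen(Assets assets) {
        this.assets = assets;
    }
}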
I searched the logs and found this where the sounds are loaded:
06-02 08:05:09.360 9844-9888 E/WVMExtractor: Failed to open libwvm.so: dlopen failed: library "libwvm.so" not found
06-02 08:05:09.395 9844-9888 E/NdkMediaExtractor: sf error code: -1010
06-02 08:05:09.395 9844-9888 E/SoundPool: Unable to load sample
Maybe this is related to my problem?
I load all the sounds in the same manner, and they most definitely seem to work; the loading is the regular:
manager.load("prizes/cash.mp3", Sound.class);
I still find AUDIO_OUTPUT_FLAG_FAST denied by client in my logs, but now the sounds are playing instead of being rejected.
Any more ideas about how to solve this?
The links you posted contain enough information to explain your bug.
It is not a good idea to create a resource instance each time you need it in a game. Create it once and reuse it all over your game; if possible, use an AssetManager.
For example, create the Music instance once, in the create() method of your game:
gameSound = Gdx.audio.newMusic(Gdx.files.internal(soundeffect));
Music has play(), pause(), dispose(), and many other helpful methods.
You can also take a look at this; it may be helpful.
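As a concrete illustration, a minimal sketch of loading and reusing a sound through an AssetManager (the file name is taken from your question; everything else is hypothetical):

import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.audio.Sound;

public class Assets {
    public final AssetManager manager = new AssetManager();

    public void load() {
        // Queue sounds once, e.g. behind a loading screen.
        manager.load("prizes/cash.mp3", Sound.class);
        manager.finishLoading(); // or poll manager.update() each frame
    }

    public void playCash(float volume) {
        // Reuse the already-loaded instance instead of creating a new one.
        manager.get("prizes/cash.mp3", Sound.class).play(volume);
    }

    public void dispose() {
        manager.dispose(); // disposes every loaded asset exactly once
    }
}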
I have an EmbeddedMediaPlayerComponent and I want to check, before playing, whether the video has an audio track.
The getMediaPlayer().getAudioTrackCount() method works fine, but only while the video is playing and I am inside the public void playing(MediaPlayer mp) event handler.
I also tried
getMediaPlayer().prepareMedia("/path/to/media", null);
getMediaPlayer().play();
System.out.println("TRACKS: "+getMediaPlayer().getAudioTrackCount());
But it does not work; it says 0.
I also tried:
MediaPlayerFactory factory = new MediaPlayerFactory();
HeadlessMediaPlayer p = factory.newHeadlessMediaPlayer();
p.prepareMedia("/path/to/video", null);
p.parseMedia();
System.out.println("TRACKS: "+p.getAudioTrackCount());
But that says -1. Is there a way I can do this, or another technique I can use?
The track count is not metadata, so using parseMedia() here is not going to help.
parseMedia() will work to get e.g. ID3 tag data, title, artist, album, and so on.
The track data is usually not available until after the media has started playing, since it is the particular decoder plugin that knows how many tracks there are. Even then, it is not always available immediately after the media has started playing, sometimes there's an indeterminate delay (and no LibVLC event).
In applications where I need the track information before playing the media, I usually use something like the native MediaInfo application and parse its output - it has a plain-text output format, an XML output format, and IIRC the newer versions have a JSON output format. The downside is that you have to launch a native process to do this; I use CommonsExec for things like this. It is pretty simple and does work, even though it is not a pure Java solution - but neither is vlcj!
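Roughly along these lines - a sketch that uses the JDK's ProcessBuilder instead of CommonsExec, and assumes the mediainfo CLI is on the PATH:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class MediaInfoProbe {
    public static void main(String[] args) throws Exception {
        // "--Output=XML" also works; newer builds support JSON output too.
        Process p = new ProcessBuilder("mediainfo", "/path/to/video.mp4")
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // parse the audio-track lines here
            }
        }
        p.waitFor();
    }
}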
A slight aside: if you did actually want the metadata, there is an easier way - just use this method on the MediaPlayerFactory:
public MediaMeta getMediaMeta(String mediaPath, boolean parse);
This gives you the metadata without having to prepare, play, or parse the media.
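For example, a minimal sketch (the path and the printed tag fields are illustrative):

import uk.co.caprica.vlcj.player.MediaMeta;
import uk.co.caprica.vlcj.player.MediaPlayerFactory;

public class MetaDemo {
    public static void main(String[] args) {
        MediaPlayerFactory factory = new MediaPlayerFactory();
        // true = parse the media now so the tag data is populated
        MediaMeta meta = factory.getMediaMeta("/path/to/media.mp3", true);
        System.out.println("Title : " + meta.getTitle());
        System.out.println("Artist: " + meta.getArtist());
        meta.release();    // MediaMeta wraps native resources
        factory.release();
    }
}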
I made a streaming music player and it works fine in the foreground.
But in the background on iOS 4 it doesn't play the next song automatically (remote control works).
The reason is that AudioQueueStart returns -12985.
I have already checked the audio session; it is fine. I call AudioQueueStart when the music starts playing.
How can I get rid of the AudioQueueStart error?
- (void)play
{
    // Calculate the size to use for each audio queue buffer, and the
    // number of packets to read into each buffer.
    [self setupAudioQueueBuffers];

    OSStatus status = AudioQueueStart(self.queueObject, NULL);
}
I read the answers on the web about AudioQueueStart failures.
One thing to check is that the AudioSession is active first.
In my case, I had previously set the session to inactive between song changes, before starting a new song: AudioSessionSetActive(false);
Once I removed this, AudioQueueStart worked just fine from the background.
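In other words, the working order of operations looks roughly like this (a sketch, with queueObject as in the question's play method):

// Keep the audio session active across song changes instead of
// deactivating and reactivating it; toggling it off in the background
// is what made AudioQueueStart fail here.
AudioSessionSetActive(true);
OSStatus status = AudioQueueStart(self.queueObject, NULL);
if (status != noErr) {
    NSLog(@"AudioQueueStart failed: %d", (int)status);
}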
In my experience, the -12985 message occurs because another app already has an audio session active when you try to start playback in your app. The options are to 1) instruct the user to close the other app, or 2) set mix mode (see kAudioSessionProperty_OverrideCategoryMixWithOthers).
The disadvantage of mix mode is that if you depend on lock screen art or remote controls, they won't work while mix mode is set.
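For reference, setting mix mode looks roughly like this (a sketch using the old C AudioSession API that this era of iOS used):

// Allow our audio to mix with another app's active session.
UInt32 mix = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers,
                        sizeof(mix), &mix);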
I also faced this problem a week ago. I spent two days looking for a solution, and I found it. Maybe this link will help (it is an official answer): http://developer.apple.com/library/ios/#qa/qa1668/_index.html
Make sure that you activate the session from the applicationDidEnterBackground handler. Now my application can play sound in the background.
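Roughly, as a sketch using the same C AudioSession API as the other answers:

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    // Reactivate the audio session on entering the background so a
    // queued AudioQueueStart can succeed there (see QA1668).
    AudioSessionSetActive(true);
}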
See this.
You probably need to include the following:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
Towards the bottom there is a reiteration of how important that line is. As it is not mentioned in any of the three main audio guides (AVFoundation, AudioSession, or AudioQueue), it can easily be missed.
I have the same problem.
I register an AudioSessionInterruptionListener, pause the audio when a phone call comes in, and resume it after the call ends, but I get the -12985 error code when calling AudioQueueStart to resume.
My workaround is to call AudioQueueStart again after a 0.02 s delay.
I don't know the reason.
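The delayed retry looks something like this (a sketch using GCD, with queueObject as in the question):

// Retry AudioQueueStart a moment later; starting immediately after the
// interruption ends is what returned -12985 here.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.02 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    AudioQueueStart(self.queueObject, NULL);
});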
On iOS 7, AudioQueueStart was returning '!int' ('tni!'), though I'm sure no one would be surprised to find that this code is not documented in the docs or headers. It was the same issue, though, and the same fix (setting the audio session to active in the background task handler) worked for me.