Adding observers to AVPlayerItem

I am working on an app that plays an audio stream. I have added observers for playbackBufferEmpty and playbackLikelyToKeepUp to show a "buffering" message when the internet connection is lost.
This is what happens with the following code: it starts playing without problems and I see the "good to go" message. If I lose the connection, playbackBufferEmpty is detected and I see the "buffering" message, but when the connection comes back it is as if the observer were lost: observeValueForKeyPath: never even runs for playbackLikelyToKeepUp.
Here is my code:
- (void)viewDidLoad
{
    …
    NSString *urlstr = @"http://xxxxxxx.aac";
    NSURL *url = [NSURL URLWithString:urlstr];
    playerItem = [[AVPlayerItem playerItemWithURL:url] retain];
    [playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil];
    [playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:nil];
    player = [[AVPlayer playerWithPlayerItem:playerItem] retain];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (object == playerItem && [keyPath isEqualToString:@"playbackBufferEmpty"]) {
        if (playerItem.playbackBufferEmpty) {
            escuchando.text = @"buffering";
        }
    }
    else if (object == playerItem && [keyPath isEqualToString:@"playbackLikelyToKeepUp"]) {
        if (playerItem.playbackLikelyToKeepUp) {
            escuchando.text = @"good to go";
        }
    }
}
Thanks for your help!
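One thing worth ruling out (a sketch, not from the original post; itemFailed: is a hypothetical handler name): a long outage can make the item fail outright, in which case playbackLikelyToKeepUp never flips back to YES and no further KVO fires. Watching the failure notification makes that case visible:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(itemFailed:)
                                             name:AVPlayerItemFailedToPlayToEndTimeNotification
                                           object:playerItem];

- (void)itemFailed:(NSNotification *)note
{
    // If this fires, the item is dead; recreating playerItem and player is the way to recover.
    NSError *error = [note.userInfo objectForKey:AVPlayerItemFailedToPlayToEndTimeErrorKey];
    NSLog(@"stream failed: %@", error);
}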

Related

Front Camera recording is MUTE

I am working on a camera recording app. I want to record video using both the front and back cameras. With the back camera the video works fine, but with the front camera the final video is mute (no audio).
CODE:
- (id)initWithPreviewView:(UIView *)previewView {
    self = [super init];
    if (self) {
        NSError *error;
        self.captureSession = [[AVCaptureSession alloc] init];
        self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        //AVCaptureSessionPresetHigh AVCaptureSessionPresetPhoto
        AVCaptureDevice *videoDevice;
        // if (isNeededToSave)
        // {
        //     //for front cam
        //     videoDevice = [self frontCamera];
        // }
        // else
        // {
        //     //for back cam
        videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        // }
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (error) {
            NSLog(@"Video input creation failed");
            return nil;
        }
        if (![self.captureSession canAddInput:videoIn]) {
            NSLog(@"Video input add-to-session failed");
            return nil;
        }
        [self.captureSession addInput:videoIn];

        /*Take Photo*/
        self.isUsingFrontFacingCamera = 0;
        // Make a still image output
        stillImageOutput = [AVCaptureStillImageOutput new];
        [stillImageOutput addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:(__bridge void *)(AVCaptureStillImageIsCapturingStillImageContext)];
        if ([self.captureSession canAddOutput:stillImageOutput])
            [self.captureSession addOutput:stillImageOutput];
        // Make a video data output
        videoDataOutput = [AVCaptureVideoDataOutput new];
        // we want BGRA; both CoreGraphics and OpenGL work well with 'BGRA'
        NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
            [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [videoDataOutput setVideoSettings:rgbOutputSettings];
        [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)
        // create a serial dispatch queue used for the sample buffer delegate as well as when a still image is captured
        // a serial dispatch queue must be used to guarantee that video frames will be delivered in order
        // see the header doc for setSampleBufferDelegate:queue: for more information
        videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
        if ([self.captureSession canAddOutput:videoDataOutput])
            [self.captureSession addOutput:videoDataOutput];
        [[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:NO];
        /*Take Photo*/

        // save the default format
        self.defaultFormat = videoDevice.activeFormat;
        defaultVideoMaxFrameDuration = videoDevice.activeVideoMaxFrameDuration;

        AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *audioIn = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        [self.captureSession addInput:audioIn];

        self.fileOutput = [[AVCaptureMovieFileOutput alloc] init];
        [self.captureSession addOutput:self.fileOutput];

        self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        self.previewLayer.frame = previewView.bounds;
        self.previewLayer.contentsGravity = kCAGravityResizeAspectFill;
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [previewView.layer insertSublayer:self.previewLayer atIndex:0];
        [self.captureSession startRunning];
    }
    return self;
}
- (void)switchCameras
{
    AVCaptureDevicePosition desiredPosition;
    NSInteger isFront = [[NSUserDefaults standardUserDefaults] integerForKey:@"isUsingFrontFacingCamera"];
    if (isFront)
        desiredPosition = AVCaptureDevicePositionBack;
    else
        desiredPosition = AVCaptureDevicePositionFront;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            [[self.previewLayer session] beginConfiguration];
            AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:d error:nil];
            for (AVCaptureInput *oldInput in [[self.previewLayer session] inputs]) {
                [[self.previewLayer session] removeInput:oldInput];
            }
            [[self.previewLayer session] addInput:input];
            [[self.previewLayer session] commitConfiguration];
            break;
        }
    }
    if (isFront == 0) {
        [[NSUserDefaults standardUserDefaults] setInteger:1 forKey:@"isUsingFrontFacingCamera"];
    } else {
        [[NSUserDefaults standardUserDefaults] setInteger:0 forKey:@"isUsingFrontFacingCamera"];
    }
    [[NSUserDefaults standardUserDefaults] synchronize];
}
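Not stated in the post, but a plausible culprit worth checking (an assumption, with d being the device matched in the position loop above): the inner loop removes every input, including the audio input, and only adds a video input back, so any recording made after a switch has no audio track. A sketch that swaps only the video input:

AVCaptureSession *session = [self.previewLayer session];
[session beginConfiguration];
for (AVCaptureDeviceInput *oldInput in [session inputs]) {
    // Leave the audio input in place; only the camera input should be replaced.
    if ([[oldInput device] hasMediaType:AVMediaTypeVideo]) {
        [session removeInput:oldInput];
    }
}
AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:d error:nil];
if ([session canAddInput:newInput]) {
    [session addInput:newInput];
}
[session commitConfiguration];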

GPUImage saving video in background issue

I'm having an issue with saving video from a GPUImage videoCamera to the Camera Roll when my app goes into the background. The file is only saved to the Camera Roll when the app returns to the foreground / is restarted. I'm no doubt making a beginner's coding error; if anyone can point it out, that would be appreciated.
- (void)applicationDidEnterBackground:(UIApplication *)application {
    if (isRecording) {
        [self stopRecording];
    }
    if (self.isViewLoaded && self.view.window) {
        [videoCamera stopCameraCapture];
    }
    runSynchronouslyOnVideoProcessingQueue(^{
        glFinish();
    });
    NSLog(@"applicationDidEnterBackground");
}
and then
- (void)stopRecording {
    [filterBlend removeTarget:movieWriter];
    videoCamera.audioEncodingTarget = nil;
    [movieWriter finishRecording];
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"file.mov"];
    ALAssetsLibrary *al = [[ALAssetsLibrary alloc] init];
    [al writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path] completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error %@", error);
        } else {
            NSLog(@"Success");
        }
    }];
    isRecording = NO;
    NSLog(@"Stop recording");
}
It was exactly as Brad pointed out in his, as usual, insightful comment: -writeVideoAtPathToSavedPhotosAlbum:completionBlock: wasn't completing until after the app returned to the foreground. I solved it by adding
self.backgroundTask = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
    NSLog(@"Background handler called. Not running background tasks anymore.");
    [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTask];
    self.backgroundTask = UIBackgroundTaskInvalid;
}];
and
@property (nonatomic) UIBackgroundTaskIdentifier backgroundTask;
Found this solution at http://www.raywenderlich.com/29948/backgrounding-for-ios
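To make the fix concrete (a sketch, assuming the backgroundTask property above): the task must be ended inside the asset library's completion block, after the write has actually finished, not before:

[al writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path]
                       completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Error %@", error);
    } else {
        NSLog(@"Success");
    }
    // The write has completed; only now is it safe to give the background time back.
    [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTask];
    self.backgroundTask = UIBackgroundTaskInvalid;
}];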

UIManagedDocument - How to deal with UIDocumentStateSavingError?

I am working on my first iCloud app. After working for a while, the app can no longer access a UIManagedDocument due to a UIDocumentStateSavingError. Is there any way to actually find out what error occurred?
This is my code to create the UIManagedDocument:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    iCloudURL = [[NSFileManager defaultManager] URLForUbiquityContainerIdentifier:nil];
    if (iCloudURL == nil) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self iCloudNotAvailable];
        });
        return;
    }
    iCloudDocumentsURL = [iCloudURL URLByAppendingPathComponent:@"Documents"];
    iCloudCoreDataLogFilesURL = [iCloudURL URLByAppendingPathComponent:@"TransactionLogs"];
    NSURL *url = [iCloudDocumentsURL URLByAppendingPathComponent:@"CloudDatabase"];
    iCloudDatabaseDocument = [[UIManagedDocument alloc] initWithFileURL:url];
    NSMutableDictionary *options = [NSMutableDictionary dictionary];
    NSString *name = [iCloudDatabaseDocument.fileURL lastPathComponent];
    [options setObject:name forKey:NSPersistentStoreUbiquitousContentNameKey];
    [options setObject:iCloudCoreDataLogFilesURL forKey:NSPersistentStoreUbiquitousContentURLKey];
    iCloudDatabaseDocument.persistentStoreOptions = options;
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(documentContentsChanged:) name:NSPersistentStoreDidImportUbiquitousContentChangesNotification object:iCloudDatabaseDocument.managedObjectContext.persistentStoreCoordinator];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(documentStateChanged:) name:UIDocumentStateChangedNotification object:iCloudDatabaseDocument];
    if ([[NSFileManager defaultManager] fileExistsAtPath:[iCloudDatabaseDocument.fileURL path]]) {
        // This is true, the document exists.
        if (iCloudDatabaseDocument.documentState == UIDocumentStateClosed) {
            [iCloudDatabaseDocument openWithCompletionHandler:^(BOOL success) {
                if (success) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self documentConnectionIsReady];
                    });
                } else {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self connectionError:iCloudConnectionErrorFailedToOpen];
                    });
                }
            }];
        } else if (iCloudDatabaseDocument.documentState == UIDocumentStateNormal) {
            ...
        }
    } else {
        ...
    }
});
The document already exists, so openWithCompletionHandler: is called on it. This fails, and the UIDocumentStateChangedNotification fires with a documentState of 5, i.e. the combination of:
UIDocumentStateClosed and
UIDocumentStateSavingError
After this, the completion block gets called. What is the correct way to proceed from here? Is there any way to find out what went wrong and what kind of error occurred?
I tried to re-open the document in the completion block, but the result is the same.
I guess I could solve the problem by just deleting the file and recreating it. But that is obviously not an option once the app is out in the store. I would like to know what is going wrong and give the user an appropriate way to handle the problem.
I have already checked other questions here that deal with UIDocumentStateSavingError (there are not a lot of them), but they do not seem applicable to the problem here.
Any idea how I can find out what the problem is? I cannot believe that the API tells you "Something went wrong during saving but I will not tell you what!"
You can query the documentState in the completion handler. Unfortunately, if you want the exact error, the only way to get it is to subclass and override handleError:userInteractionPermitted:
Maybe something like this would help (typed freehand without compiler)...
@interface MyManagedDocument : UIManagedDocument
- (void)handleError:(NSError *)error
userInteractionPermitted:(BOOL)userInteractionPermitted;
@property (nonatomic, strong) NSError *lastError;
@end

@implementation MyManagedDocument
@synthesize lastError = _lastError;
- (void)handleError:(NSError *)error
userInteractionPermitted:(BOOL)userInteractionPermitted
{
    self.lastError = error;
    [super handleError:error
userInteractionPermitted:userInteractionPermitted];
}
@end
Then you can create it like this...
iCloudDatabaseDocument = [[MyManagedDocument alloc] initWithFileURL:url];
and use it in the completion handler like this...
[iCloudDatabaseDocument openWithCompletionHandler:^(BOOL success) {
    if (success) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self documentConnectionIsReady];
        });
    } else {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self connectionError:iCloudConnectionErrorFailedToOpen
                        withError:iCloudDatabaseDocument.lastError];
        });
    }
}];
Based on @JodyHagins' excellent snippet, I have made a UIDocument subclass.
@interface SSDocument : UIDocument
- (void)openWithSuccess:(void (^)())successBlock
           failureBlock:(void (^)(NSError *error))failureBlock;
@end

@interface SSDocument ()
@property (nonatomic, strong) NSError *lastError;
@end

@implementation SSDocument

- (void)handleError:(NSError *)error userInteractionPermitted:(BOOL)userInteractionPermitted {
    self.lastError = error;
    [super handleError:error userInteractionPermitted:userInteractionPermitted];
}

- (void)clearLastError {
    self.lastError = nil;
}

- (void)openWithSuccess:(void (^)())successBlock failureBlock:(void (^)(NSError *error))failureBlock {
    NSParameterAssert(successBlock);
    NSParameterAssert(failureBlock);
    [self clearLastError];
    [self openWithCompletionHandler:^(BOOL success) {
        if (success) {
            successBlock();
        } else {
            NSError *error = self.lastError;
            [self clearLastError];
            failureBlock(error);
        }
    }];
}

@end
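Usage might look like this (a sketch; documentConnectionIsReady and connectionError:withError: stand in for the poster's own handling):

SSDocument *document = [[SSDocument alloc] initWithFileURL:url];
[document openWithSuccess:^{
    [self documentConnectionIsReady];
} failureBlock:^(NSError *error) {
    // The error captured by handleError: surfaces here as the block argument.
    NSLog(@"open failed: %@", error);
    [self connectionError:iCloudConnectionErrorFailedToOpen withError:error];
}];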

Unable to do audio playback in background in iOS5 with AVQueuePlayer

I'm trying to build an app to play local music, but unfortunately I'm unable to get audio playback to continue in the background in iOS 5 with AVQueuePlayer.
In my viewDidLoad, I have this code:
// Player setup
mAudioPlayer = [[AVQueuePlayer alloc] init];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemDidReachEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
[mAudioPlayer addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, 1) queue:NULL usingBlock:^(CMTime time) {
    [self updatePositionOnDisplay];
}];

// Audio session setup
NSError *setCategoryErr = nil;
NSError *activationErr = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&setCategoryErr];
[[AVAudioSession sharedInstance] setActive:YES error:&activationErr];
Here is my playerItemDidReachEnd: method:
- (void)playerItemDidReachEnd:(NSNotification *)notification
{
    NSLog(@"playerItemDidReachEnd");
    UIBackgroundTaskIdentifier newTaskID = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
        [[UIApplication sharedApplication] endBackgroundTask:newTaskID];
    }];
    NSLog(@"playerItemDidReachEnd 2");
    NSLog(@"searching next song");
    mCurrentSong = [self getNextSongWithIsSwitched:NO];
    if (mCurrentSong != nil) {
        NSLog(@"Start : %@ - %@", mCurrentSong.artist, mCurrentSong.title);
        mTapeTitle.text = [NSString stringWithFormat:@"%@ - %@", mCurrentSong.artist, mCurrentSong.title];
        AVPlayerItem *i = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:mCurrentSong.path]];
        if (i != nil) {
            [mAudioPlayer insertItem:i afterItem:nil];
        } else {
            NSLog(@"BING!! no AVPlayerItem created for song's path: %@", mCurrentSong.path);
        }
        [i release];
    } else {
        NSLog(@"no song found");
        [mAudioPlayer pause];
        isPlaying = NO;
        [mPlayButton setSelected:NO];
    }
    [[UIApplication sharedApplication] endBackgroundTask:newTaskID];
    newTaskID = UIBackgroundTaskInvalid;
}
When I start playback, it works and keeps playing when I switch off the screen. BUT when the song is over, here are the logs:
2012-03-01 10:00:27.342 DEMO[3096:707] playerItemDidReachEnd
2012-03-01 10:00:27.360 DEMO[3096:707] playerItemDidReachEnd 2
2012-03-01 10:00:27.363 DEMO[3096:707] searching next song
2012-03-01 10:00:27.381 DEMO[3096:707] Start : Moby - Ah-Ah
But no song actually starts...
Can anyone tell me what's wrong with my code?
Thanks a lot.
Try commenting out the following lines:
[[UIApplication sharedApplication] endBackgroundTask:newTaskID];
newTaskID = UIBackgroundTaskInvalid;
If it works, then you need to add an observer to mAudioPlayer for "currentItem.status"; when the status is ready to play, end the background task.
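A sketch of that observer (mBackgroundTaskID is a hypothetical ivar holding the task ID; when reading currentItem.status, the item-level constant is AVPlayerItemStatusReadyToPlay; note also that background audio requires the audio value under UIBackgroundModes in Info.plist):

[mAudioPlayer addObserver:self forKeyPath:@"currentItem.status" options:NSKeyValueObservingOptionNew context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if (object == mAudioPlayer && [keyPath isEqualToString:@"currentItem.status"]) {
        if (mAudioPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay) {
            // The next item is ready to play; now it is safe to end the background task.
            [[UIApplication sharedApplication] endBackgroundTask:mBackgroundTaskID];
            mBackgroundTaskID = UIBackgroundTaskInvalid;
        }
    }
}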

iPhone: how to display a leaderboard screen in my own game, developed in cocos2d

I want to show a leaderboard in my own game. I am using the following method for that, but nothing happens. I am confused about the root view controller, as my game is developed in cocos2d, so there is nothing like that. :(
// Leaderboards
- (void)showLeaderboard
{
    if (isGameCenterAvailable == NO)
        return;
    GKLeaderboardViewController *leaderboardVC = [[[GKLeaderboardViewController alloc] init] autorelease];
    if (leaderboardVC != nil)
    {
        leaderboardVC.leaderboardDelegate = self;
        [self presentViewController:leaderboardVC];
    }
}

- (void)leaderboardViewControllerDidFinish:(GKLeaderboardViewController *)viewController
{
    [self dismissModalViewController];
    [delegate onLeaderboardViewDismissed];
}

- (UIViewController *)getRootViewController
{
    return [UIApplication sharedApplication].keyWindow.rootViewController;
}

- (void)presentViewController:(UIViewController *)vc
{
    UIViewController *rootVC = [self getRootViewController];
    [rootVC presentModalViewController:vc animated:YES];
}

- (void)dismissModalViewController
{
    UIViewController *rootVC = [self getRootViewController];
    [rootVC dismissModalViewControllerAnimated:YES];
}
...
regards
Haseeb
I don't know why, but it works for me. If anyone can describe the real reason why this works this way, I will be very glad. I call it through the app delegate:
[(myAppDelegate*)[[UIApplication sharedApplication] delegate]gameCenter];
and from the app delegate I call the root view controller's method like this:
- (void)gameCenter
{
    [rootViewController gameCenterLeaderboard];
}
and in the root view controller there is a method:
- (void)gameCenterLeaderboard
{
    GKLeaderboardViewController *leaderboardVC = [[[GKLeaderboardViewController alloc] init] autorelease];
    if (leaderboardVC != nil) {
        leaderboardVC.leaderboardDelegate = self;
        [self presentModalViewController:leaderboardVC animated:YES];
    }
}
The following method is also overridden in the root view controller:
- (void)leaderboardViewControllerDidFinish:(GKLeaderboardViewController *)leaderboardController
{
    [self dismissModalViewControllerAnimated:YES];
}
If you don't have a root UIViewController, then I'd recommend creating a new UIViewController, setting its view to your openGLView, and then using that view controller to present the leaderboard as a modal view controller.
UIViewController *leaderboardViewController = [[UIViewController alloc] init];
[leaderboardViewController setView:[[CCDirector sharedDirector] openGLView]];
[leaderboardViewController presentModalViewController:leaderboardVC animated:YES]; //leaderboardVC is your GKLeaderboardViewController
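One caveat (an addition to the answer above, assuming a cocos2d 1.x-style openGLView): a freshly created controller that is not installed in the window hierarchy may fail to present modally, so it can help to make it the window's root first:

UIViewController *hostViewController = [[UIViewController alloc] init];
hostViewController.view = [[CCDirector sharedDirector] openGLView];
// Installing the controller in the window lets it present modally.
[UIApplication sharedApplication].keyWindow.rootViewController = hostViewController;
[hostViewController presentModalViewController:leaderboardVC animated:YES]; // leaderboardVC is your GKLeaderboardViewController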
