I'm having an issue with saving video from a GPUImage videoCamera to the Camera Roll when my app goes into the background. The file is only saved to the camera roll when the app returns to the foreground / is restarted. I'm no doubt making a beginner's coding error; if anyone can point it out, that would be appreciated.
- (void)applicationDidEnterBackground:(UIApplication *)application {
    if (isRecording) {
        [self stopRecording];
    }
    if (self.isViewLoaded && self.view.window) {
        [videoCamera stopCameraCapture];
    }
    runSynchronouslyOnVideoProcessingQueue(^{
        glFinish();
    });
    NSLog(@"applicationDidEnterBackground");
}
and then
-(void)stopRecording {
    [filterBlend removeTarget:movieWriter];
    videoCamera.audioEncodingTarget = nil;
    [movieWriter finishRecording];
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"file.mov"];
    ALAssetsLibrary *al = [[ALAssetsLibrary alloc] init];
    [al writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path] completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error %@", error);
        } else {
            NSLog(@"Success");
        }
    }];
    isRecording = NO;
    NSLog(@"Stop recording");
}
It was exactly as Brad pointed out in his, as usual, insightful comment: the -writeVideoAtPathToSavedPhotosAlbum:completionBlock: call wasn't completing until after the app returned to the foreground. I solved it by adding
self.backgroundTask = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
    NSLog(@"Background handler called. Not running background tasks anymore.");
    [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTask];
    self.backgroundTask = UIBackgroundTaskInvalid;
}];
and
@property (nonatomic) UIBackgroundTaskIdentifier backgroundTask;
Found this solution at http://www.raywenderlich.com/29948/backgrounding-for-ios
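For completeness, here is a minimal sketch of how the two pieces fit together (a hypothetical helper that assumes the backgroundTask property above and the same temporary file.mov path used in -stopRecording); the key point is ending the background task inside the assets library's completion block, so iOS keeps the app running until the write finishes:

- (void)saveRecordingToCameraRollInBackground {
    // Ask for extra execution time before kicking off the asynchronous write.
    self.backgroundTask = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
        // Time ran out before the write finished; release the task handle anyway.
        [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTask];
        self.backgroundTask = UIBackgroundTaskInvalid;
    }];

    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"file.mov"];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path]
                                completionBlock:^(NSURL *assetURL, NSError *error) {
        NSLog(@"Camera roll write finished, error: %@", error);
        // The write is done (or failed), so the background task is no longer needed.
        [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTask];
        self.backgroundTask = UIBackgroundTaskInvalid;
    }];
}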
Is it possible to play a track when the app has exited/gone into a background state?
For example:
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    NSURL *trackURL = [NSURL URLWithString:@"spotify:track:489K1qunRVBm2OpS4XGLNd"];
    [[SPSession sharedSession] trackForURL:trackURL callback:^(SPTrack *track) {
        if (track != nil) {
            [SPAsyncLoading waitUntilLoaded:track timeout:kSPAsyncLoadingDefaultTimeout then:^(NSArray *tracks, NSArray *notLoadedTracks) {
                [self.playbackManager playTrack:track callback:^(NSError *error) {
                    if (error) {
                        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Cannot Play Track"
                                                                        message:[error localizedDescription]
                                                                       delegate:nil
                                                              cancelButtonTitle:@"OK"
                                                              otherButtonTitles:nil];
                        [alert show];
                    } else {
                        self.currentTrack = track;
                    }
                }];
            }];
        }
    }];
}
The above code was taken from the Simple Player app provided with cocoalibspotify.
Copying and pasting code like this won't get you far when backgrounding - you can't use UI elements like UIAlertView, for a start.
You need to declare your application as supporting the audio background mode as documented by Apple. Then, when you go into the background you should make a background task to start playback.
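Roughly, that looks like the sketch below (assuming the audio value has been added under the UIBackgroundModes key in Info.plist, and reusing the playbackManager and currentTrack properties from the question's code; the background task keeps the app alive while playback starts asynchronously):

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    __block UIBackgroundTaskIdentifier taskID =
        [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
            [[UIApplication sharedApplication] endBackgroundTask:taskID];
            taskID = UIBackgroundTaskInvalid;
        }];

    [self.playbackManager playTrack:self.currentTrack callback:^(NSError *error) {
        if (error) {
            // No UIAlertView here; just log, since the app is in the background.
            NSLog(@"Could not start playback: %@", error);
        }
        [[UIApplication sharedApplication] endBackgroundTask:taskID];
        taskID = UIBackgroundTaskInvalid;
    }];
}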
I would have thought NSFileManager's removeItemAtURL:error: method would remove the Core Data log files created when using UIManagedDocuments with iCloud.
What is the best way to make sure all of these log files are removed?
I have used...
- (void)deleteRemnantsOfOldDatabaseDocumentAndItsTransactionLogsWithCompletionHandler:(completion_success_t)completionBlock
{
    __weak CloudController *weakSelf = self;
    NSURL *databaseStoreFolder = self.iCloudDatabaseStoreFolderURL;
    NSURL *transactionLogFolder = self.transactionLogFilesFolderURL;
    [self deleteFileAtURL:databaseStoreFolder withCompletionBlock:^(BOOL docSuccess) {
        [weakSelf deleteFileAtURL:transactionLogFolder withCompletionBlock:^(BOOL logSuccess) {
            completionBlock(docSuccess && logSuccess);
        }];
    }];
}
In conjunction with...
- (void)deleteFileAtURL:(NSURL *)fileURL withCompletionBlock:(completion_success_t)completionBlock
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSFileCoordinator *fileCoordinator = [[NSFileCoordinator alloc] initWithFilePresenter:nil];
        NSError *coordinatorError = nil;
        __block BOOL success = NO;
        [fileCoordinator coordinateWritingItemAtURL:fileURL
                                            options:NSFileCoordinatorWritingForDeleting
                                              error:&coordinatorError
                                         byAccessor:^(NSURL *writingURL) {
            NSFileManager *fileManager = [[NSFileManager alloc] init];
            NSError *removalError = nil;
            if ([fileManager fileExistsAtPath:[writingURL path]]) {
                if (![fileManager removeItemAtURL:writingURL error:&removalError]) {
                    NSLog(@"deleteFileAtURL: removal error: %@", removalError);
                } else {
                    success = YES;
                }
            }
        }];
        if (coordinatorError) {
            NSLog(@"deleteFileAtURL: coordinator error: %@", coordinatorError);
        }
        completionBlock(success);
    });
}
Note: this was used for a single-document, toolbox-style app, and was intended more for clearing out the iCloud container before creating a brand new document in an 'apparently' empty iCloud store for the first time. But I'm sure it can be adapted without too much work.
Oops, the above won't make sense/work without:
typedef void (^completion_success_t)(BOOL success);
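As a usage sketch (the cloudController property and the follow-up actions are assumptions, not part of the original code):

[self.cloudController deleteRemnantsOfOldDatabaseDocumentAndItsTransactionLogsWithCompletionHandler:^(BOOL success) {
    if (success) {
        NSLog(@"iCloud container cleared; safe to create a brand new document");
    } else {
        NSLog(@"Could not remove all of the old files; leaving the container as is");
    }
}];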
You can debug the contents of your iCloud container and verify things have been removed by using a method like the following (which, to be honest, I've probably lifted from somewhere else and modified):
- (void)logDirectoryHierarchyContentsForURL:(NSURL *)url
{
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSDirectoryEnumerator *directoryEnumerator = [fileManager enumeratorAtURL:url
                                                   includingPropertiesForKeys:@[NSURLNameKey, NSURLContentModificationDateKey]
                                                                      options:NSDirectoryEnumerationSkipsHiddenFiles
                                                                 errorHandler:nil];
    NSMutableArray *results = [NSMutableArray array];
    for (NSURL *itemURL in directoryEnumerator) {
        NSString *fileName;
        [itemURL getResourceValue:&fileName forKey:NSURLNameKey error:NULL];
        NSDate *modificationDate;
        [itemURL getResourceValue:&modificationDate forKey:NSURLContentModificationDateKey error:NULL];
        [results addObject:[NSString stringWithFormat:@"%@ (%@)", itemURL, modificationDate]];
    }
    NSLog(@"Directory contents: %@", results);
}
It's also worth logging in to developer.icloud.com and examining what is actually in the iCloud store. There is sometimes a difference between what is retained in the device's ubiquity container and what is actually in the iCloud server folder structure. Between all of these you can get quite a good idea of what's going on.
I am working on my first iCloud app. After working for a while, the app cannot access a UIManagedDocument anymore due to a "UIDocumentStateSavingError". Is there any way to actually find out what error occurred?
This is my code to create the UIManagedDocument:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    iCloudURL = [[NSFileManager defaultManager] URLForUbiquityContainerIdentifier:nil];
    if (iCloudURL == nil) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self iCloudNotAvailable];
        });
        return;
    }
    iCloudDocumentsURL = [iCloudURL URLByAppendingPathComponent:@"Documents"];
    iCloudCoreDataLogFilesURL = [iCloudURL URLByAppendingPathComponent:@"TransactionLogs"];
    NSURL *url = [iCloudDocumentsURL URLByAppendingPathComponent:@"CloudDatabase"];
    iCloudDatabaseDocument = [[UIManagedDocument alloc] initWithFileURL:url];

    NSMutableDictionary *options = [NSMutableDictionary dictionary];
    NSString *name = [iCloudDatabaseDocument.fileURL lastPathComponent];
    [options setObject:name forKey:NSPersistentStoreUbiquitousContentNameKey];
    [options setObject:iCloudCoreDataLogFilesURL forKey:NSPersistentStoreUbiquitousContentURLKey];
    iCloudDatabaseDocument.persistentStoreOptions = options;

    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(documentContentsChanged:) name:NSPersistentStoreDidImportUbiquitousContentChangesNotification object:iCloudDatabaseDocument.managedObjectContext.persistentStoreCoordinator];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(documentStateChanged:) name:UIDocumentStateChangedNotification object:iCloudDatabaseDocument];

    if ([[NSFileManager defaultManager] fileExistsAtPath:[iCloudDatabaseDocument.fileURL path]]) {
        // This is true, the document exists.
        if (iCloudDatabaseDocument.documentState == UIDocumentStateClosed) {
            [iCloudDatabaseDocument openWithCompletionHandler:^(BOOL success) {
                if (success) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self documentConnectionIsReady];
                    });
                } else {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self connectionError:iCloudConnectionErrorFailedToOpen];
                    });
                }
            }];
        } else if (iCloudDatabaseDocument.documentState == UIDocumentStateNormal) {
            ...
        }
    } else {
        ...
    }
});
The document already exists, and thus openWithCompletionHandler: is called on it. This fails, and the UIDocumentStateChangedNotification is fired, which shows a document state of 5:
UIDocumentStateClosed and
UIDocumentStateSavingError
After this the completion block gets called. What is the correct way to proceed from here? Is there any way to find out what went wrong and what kind of error occurred?
I tried to re-open the document in the completion block but the result is the same.
I guess I could solve the problem by just deleting the file and recreating it. But this is obviously not an option once the app is out in the store. I would like to know what is going wrong and give the user an appropriate way to handle the problem.
I already checked other questions here dealing with UIDocumentStateSavingError (there are not a lot of them), but they don't seem to be applicable to the problem here.
Any idea how I can find out what the problem is? I cannot believe that the API tells you "Something went wrong during saving but I will not tell you what!"
You can query the documentState in the completion handler. Unfortunately, if you want the exact error, the only way to get it is to subclass and override handleError:userInteractionPermitted:
Maybe something like this would help (typed freehand without compiler)...
@interface MyManagedDocument : UIManagedDocument
- (void)handleError:(NSError *)error
    userInteractionPermitted:(BOOL)userInteractionPermitted;
@property (nonatomic, strong) NSError *lastError;
@end

@implementation MyManagedDocument
@synthesize lastError = _lastError;

- (void)handleError:(NSError *)error
    userInteractionPermitted:(BOOL)userInteractionPermitted
{
    self.lastError = error;
    [super handleError:error
        userInteractionPermitted:userInteractionPermitted];
}
@end
Then you can create it like this...
iCloudDatabaseDocument = [[MyManagedDocument alloc] initWithFileURL:url];
and use it in the completion handler like this...
[iCloudDatabaseDocument openWithCompletionHandler:^(BOOL success) {
    if (success) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self documentConnectionIsReady];
        });
    } else {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self connectionError:iCloudConnectionErrorFailedToOpen
                        withError:iCloudDatabaseDocument.lastError];
        });
    }
}];
Based on @JodyHagins' excellent snippet, I have made a UIDocument subclass.
@interface SSDocument : UIDocument
- (void)openWithSuccess:(void (^)())successBlock
           failureBlock:(void (^)(NSError *error))failureBlock;
@end

@interface SSDocument ()
@property (nonatomic, strong) NSError *lastError;
@end

@implementation SSDocument

- (void)handleError:(NSError *)error userInteractionPermitted:(BOOL)userInteractionPermitted {
    self.lastError = error;
    [super handleError:error userInteractionPermitted:userInteractionPermitted];
}

- (void)clearLastError {
    self.lastError = nil;
}

- (void)openWithSuccess:(void (^)())successBlock failureBlock:(void (^)(NSError *error))failureBlock {
    NSParameterAssert(successBlock);
    NSParameterAssert(failureBlock);
    [self clearLastError];

    [self openWithCompletionHandler:^(BOOL success) {
        if (success) {
            successBlock();
        } else {
            NSError *error = self.lastError;
            [self clearLastError];
            failureBlock(error);
        }
    }];
}

@end
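Usage then looks something like this (a hypothetical call site; documentURL is whatever file URL you would normally pass to -initWithFileURL:):

SSDocument *document = [[SSDocument alloc] initWithFileURL:documentURL];
[document openWithSuccess:^{
    NSLog(@"Document opened");
} failureBlock:^(NSError *error) {
    // error is whatever -handleError:userInteractionPermitted: captured last.
    NSLog(@"Document failed to open: %@", error);
}];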
I'm trying to build an app to play local music, but unfortunately I'm unable to do audio playback in the background in iOS 5 with AVQueuePlayer.
In my viewDidLoad, I have this code:
// Player setup
mAudioPlayer = [[AVQueuePlayer alloc] init];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemDidReachEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
[mAudioPlayer addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, 1) queue:NULL usingBlock:^(CMTime time) {
    [self updatePositionOnDisplay];
}];

// Audio session setup
NSError *setCategoryErr = nil;
NSError *activationErr = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&setCategoryErr];
[[AVAudioSession sharedInstance] setActive:YES error:&activationErr];
Here is my "playerItemDidReachEnd" method:
- (void)playerItemDidReachEnd:(NSNotification*)notification
{
    NSLog(@"playerItemDidReachEnd");
    UIBackgroundTaskIdentifier newTaskID = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
        [[UIApplication sharedApplication] endBackgroundTask:newTaskID];
    }];
    NSLog(@"playerItemDidReachEnd 2");
    NSLog(@"searching next song");
    mCurrentSong = [self getNextSongWithIsSwitched:NO];
    if (mCurrentSong != nil) {
        NSLog(@"Start : %@ - %@", mCurrentSong.artist, mCurrentSong.title);
        mTapeTitle.text = [NSString stringWithFormat:@"%@ - %@", mCurrentSong.artist, mCurrentSong.title];
        AVPlayerItem* i = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:mCurrentSong.path]];
        if (i != nil) {
            [mAudioPlayer insertItem:i afterItem:nil];
        } else {
            NSLog(@"BING!! no AVPlayerItem created for song's path: %@", mCurrentSong.path);
        }
        [i release];
    } else {
        NSLog(@"no song found");
        [mAudioPlayer pause];
        isPlaying = NO;
        [mPlayButton setSelected:NO];
    }
    [[UIApplication sharedApplication] endBackgroundTask:newTaskID];
    newTaskID = UIBackgroundTaskInvalid;
}
When I start playback, it works and keeps playing when I switch off the screen. BUT when the song is over, here are the logs:
2012-03-01 10:00:27.342 DEMO[3096:707] playerItemDidReachEnd
2012-03-01 10:00:27.360 DEMO[3096:707] playerItemDidReachEnd 2
2012-03-01 10:00:27.363 DEMO[3096:707] searching next song
2012-03-01 10:00:27.381 DEMO[3096:707] Start : Moby - Ah-Ah
But no song actually starts...
Can anyone tell me what's wrong with my code?
Thanks a lot.
Try commenting out the following lines:
[[UIApplication sharedApplication] endBackgroundTask:newTaskID];
newTaskID = UIBackgroundTaskInvalid;
If that works, then you need to add an observer to mAudioPlayer for "currentItem.status"; when the status becomes AVPlayerStatusReadyToPlay, end the background task.
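Untested sketch of that idea (the key path and AVPlayerItemStatusReadyToPlay are standard AVFoundation, the item-level counterpart of the constant mentioned above since the key path goes through currentItem; the mBackgroundTask ivar is an assumption):

// In -playerItemDidReachEnd:, instead of ending the task right away,
// keep the identifier around and watch the new item's status.
mBackgroundTask = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
    [[UIApplication sharedApplication] endBackgroundTask:mBackgroundTask];
    mBackgroundTask = UIBackgroundTaskInvalid;
}];
[mAudioPlayer addObserver:self forKeyPath:@"currentItem.status" options:0 context:NULL];

// Elsewhere in the same class:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"currentItem.status"] &&
        mAudioPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay) {
        // The next item is ready to play, so the background task can be released.
        [[UIApplication sharedApplication] endBackgroundTask:mBackgroundTask];
        mBackgroundTask = UIBackgroundTaskInvalid;
        [mAudioPlayer removeObserver:self forKeyPath:@"currentItem.status"];
    }
}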
I've been struggling with adding assets from the iPhone Photo Library to an AVMutableComposition and then exporting them. Here is what I've got:
Finding the assets: (here I grab the AVURLAsset)
-(void) findAssets {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // Within the group enumeration block, filter to enumerate just videos.
        [group setAssetsFilter:[ALAssetsFilter allVideos]];
        [group enumerateAssetsUsingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
            // The end of the enumeration is signaled by asset == nil.
            if (alAsset) {
                ALAssetRepresentation *representation = [alAsset defaultRepresentation];
                NSURL *url = [representation url];
                AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
                // Do something interesting with the AV asset.
                [thumbs addObject:alAsset];
                [assets addObject:avAsset];
            } else if (alAsset == nil) {
                [self createScroll];
            }
        }];
    }
    failureBlock:^(NSError *error) {
        // Typically you should handle an error more gracefully than this.
        NSLog(@"No groups");
    }];
    [library release];
}
Here I add an asset to my composition (I use the first object in the array for testing only).
-(void) addToCompositionWithAsset:(AVURLAsset*)_asset {
    NSError *editError = nil;
    composition = [AVMutableComposition composition];
    AVURLAsset* sourceAsset = [assets objectAtIndex:0];
    Float64 inSeconds = 1.0;
    Float64 outSeconds = 2.0;

    // calculate time
    CMTime inTime = CMTimeMakeWithSeconds(inSeconds, 600);
    CMTime outTime = CMTimeMakeWithSeconds(outSeconds, 600);
    CMTime duration = CMTimeSubtract(outTime, inTime);
    CMTimeRange editRange = CMTimeRangeMake(inTime, duration);

    [composition insertTimeRange:editRange ofAsset:sourceAsset atTime:composition.duration error:&editError];
    if (!editError) {
        CMTimeGetSeconds(composition.duration);
    }
}
And finally I export the composition, and here it crashes:
-(void)exportComposition {
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
    NSLog(@"can export: %@", exportSession.supportedFileTypes);

    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
    NSString *exportPath = [documentsDirectoryPath stringByAppendingPathComponent:EXPORT_NAME];
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
    exportSession.outputURL = exportURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie; // @"com.apple.quicktime-movie"

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"i is in your block, exportin. status is %d",
              exportSession.status);
        switch (exportSession.status) {
            case AVAssetExportSessionStatusFailed:
            case AVAssetExportSessionStatusCompleted: {
                [self performSelectorOnMainThread:@selector(exportDone:)
                                       withObject:nil
                                    waitUntilDone:NO];
                break;
            }
        }
    }];
}
Does anyone have an idea of what it might be? It crashes on
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
And I tried different presets and outputFileTypes.
Thanks
* SOLVED *
I have to answer my own question now that I have solved it. It's amazing that I struggled with this for a whole day and then fixed it right after posting a question :)
I changed and moved:
composition = [AVMutableComposition composition];
to:
composition = [[AVMutableComposition alloc] init];
I think I was too tired when I was working on this yesterday. Thanks guys!
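For anyone wondering why that change matters: the rest of the code uses manual reference counting ([i release], [library release]), so the [AVMutableComposition composition] factory method returns an autoreleased object that the ivar never retains; by the time the asynchronous export runs, the composition can already have been deallocated, hence the crash. Allocating it directly gives you ownership. An equivalent fix, assuming MRC, would be a retaining property:

// Declaring the composition as a retained property also keeps the
// autoreleased factory object alive across the asynchronous export.
@property (nonatomic, retain) AVMutableComposition *composition;

// ...
self.composition = [AVMutableComposition composition];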