GPUImage - Combine two GPUImageFilterGroups from two distinct sets of cases

I'm building a photo app for the iPhone using Brad Larson's GPUImage, based on the DLCImagePickerController by GoBackSpaces (which has taught me a lot so far), but I'm stuck on one of the features I want the app to have. I have two sets of cases:
Cases A: a set of cases that represent "film rolls" (filter cases)
Cases B: a set of cases that represent "camera lenses" (lens cases)
Each case builds a GPUImageFilterGroup. I want the user to be able to choose a filter from Cases A and then apply another filter from Cases B on top of the same staticPicture. In short: after the user chooses a filter from Cases A, the staticPicture changes accordingly; I then want to apply a filter from Cases B over that already-filtered staticPicture. I've tried it several different ways with no luck; the staticPicture disappears as soon as I choose a filter from Cases B. Do I need to retain something? Any help would be truly appreciated. Here is my code:
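// Default (identity) filters until the user picks one from Cases A / Cases B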
filter = [[GPUImageRGBFilter alloc] init];
lens = [[GPUImageRGBFilter alloc] init];
-(void) filterClicked:(UIButton *) sender {
for(UIView *view in self.filterScrollView.subviews){
if([view isKindOfClass:[UIButton class]]){
[(UIButton *)view setSelected:NO];
}
}
[sender setSelected:YES];
[self removeAllTargets];
selectedFilter = sender.tag;
[self setFilter:sender.tag];
[self prepareFilter];
}
-(void) lensClicked:(UIButton *) sender {
for(UIView *view in self.lensScrollView.subviews){
if([view isKindOfClass:[UIButton class]]){
[(UIButton *)view setSelected:NO];
}
}
[sender setSelected:YES];
selectedLens = sender.tag;
[self setLens:sender.tag];
[self prepareLens];
}
-(void) setFilter:(int) index {
switch (index) {
case 1:{
filter = [[GPUImageFilterGroup alloc] init];
UIImage *inputImage23 = [UIImage imageNamed:@"vignette.png"];
sourcePicture5 = [[GPUImagePicture alloc] initWithImage:inputImage23 smoothlyScaleOutput:YES];
GPUImageOverlayBlendFilter * overlay23 = [[GPUImageOverlayBlendFilter alloc] init];
[(GPUImageFilterGroup *)filter addFilter:overlay23];
[sourcePicture5 addTarget:overlay23 atTextureLocation:1];
[sourcePicture5 processImage];
UIImage *inputImage21 = [UIImage imageNamed:@"frame.png"];
sourcePicture6 = [[GPUImagePicture alloc] initWithImage:inputImage21 smoothlyScaleOutput:YES];
GPUImageNormalBlendFilter * overlay21 = [[GPUImageNormalBlendFilter alloc] init];
[(GPUImageFilterGroup *)filter addFilter:overlay21];
[sourcePicture6 addTarget:overlay21 atTextureLocation:1];
[sourcePicture6 processImage];
GPUilforddelta100 *ilford = [[GPUilforddelta100 alloc] init];
[(GPUImageFilterGroup *)filter addFilter:ilford];
[overlay23 addTarget:overlay21];
[overlay21 addTarget:ilford];
[(GPUImageFilterGroup *)filter setInitialFilters:[NSArray arrayWithObject:overlay23]];
[(GPUImageFilterGroup *)filter setTerminalFilter:ilford];
} break;
case 2: {
filter = [[GPUImageFilterGroup alloc] init];
UIImage *inputImage1 = [UIImage imageNamed:@"trixgrain.png"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage1 smoothlyScaleOutput:YES];
GPUImageMultiplyBlendFilter * overlay1 = [[GPUImageMultiplyBlendFilter alloc] init];
[(GPUImageFilterGroup *)filter addFilter:overlay1];
[sourcePicture addTarget:overlay1 atTextureLocation:1];
[sourcePicture processImage];
UIImage *inputImage2 = [UIImage imageNamed:@"trixfeel.png"];
sourcePicture3 = [[GPUImagePicture alloc] initWithImage:inputImage2 smoothlyScaleOutput:YES];
GPUImageOverlayBlendFilter * overlay2 = [[GPUImageOverlayBlendFilter alloc] init];
[(GPUImageFilterGroup *)filter addFilter:overlay2];
[sourcePicture3 addTarget:overlay2 atTextureLocation:1];
[sourcePicture3 processImage];
UIImage *inputImage18 = [UIImage imageNamed:@"frame.png"];
sourcePicture2 = [[GPUImagePicture alloc] initWithImage:inputImage18 smoothlyScaleOutput:YES];
GPUImageNormalBlendFilter * overlay18 = [[GPUImageNormalBlendFilter alloc] init];
[(GPUImageFilterGroup *)filter addFilter:overlay18];
[sourcePicture2 addTarget:overlay18 atTextureLocation:1];
[sourcePicture2 processImage];
GPUTrix400 *trix400 = [[GPUTrix400 alloc] init];
[(GPUImageFilterGroup *)filter addFilter:trix400];
[overlay2 addTarget:overlay1];
[overlay1 addTarget:overlay18];
[overlay18 addTarget:trix400];
[(GPUImageFilterGroup *)filter setInitialFilters:[NSArray arrayWithObject:overlay2]];
[(GPUImageFilterGroup *)filter setTerminalFilter:trix400];
} break;
default:
filter = [[GPUImageRGBFilter alloc] init];
break;
}
}
-(void) setLens:(int) index {
switch (index) {
case 1:{
lens = [[GPUImageFilterGroup alloc] init];
UIImage *inputImage1 = [UIImage imageNamed:@"lightleak.png"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage1 smoothlyScaleOutput:YES];
GPUImageOverlayBlendFilter * overlay1 = [[GPUImageOverlayBlendFilter alloc] init];
[(GPUImageFilterGroup *)lens addFilter:overlay1];
[sourcePicture addTarget:overlay1 atTextureLocation:1];
[sourcePicture processImage];
UIImage *inputImage18 = [UIImage imageNamed:@"holgaframe.png"];
sourcePicture2 = [[GPUImagePicture alloc] initWithImage:inputImage18 smoothlyScaleOutput:YES];
GPUImageNormalBlendFilter * overlay18 = [[GPUImageNormalBlendFilter alloc] init];
[(GPUImageFilterGroup *)lens addFilter:overlay18];
[sourcePicture2 addTarget:overlay18 atTextureLocation:1];
[sourcePicture2 processImage];
GPUholgaroid *holgaroid = [[GPUholgaroid alloc] init];
[(GPUImageFilterGroup *)lens addFilter:holgaroid];
[holgaroid addTarget:overlay1];
[overlay1 addTarget:overlay18];
[(GPUImageFilterGroup *)lens setInitialFilters:[NSArray arrayWithObject:holgaroid]];
[(GPUImageFilterGroup *)lens setTerminalFilter:overlay18];
} break;
case 2:{
lens = [[GPUImageFilterGroup alloc] init];
UIImage *inputImage1 = [UIImage imageNamed:@"dianafeel.png"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage1 smoothlyScaleOutput:YES];
GPUImageOverlayBlendFilter * overlay1 = [[GPUImageOverlayBlendFilter alloc] init];
[(GPUImageFilterGroup *)lens addFilter:overlay1];
[sourcePicture addTarget:overlay1 atTextureLocation:1];
[sourcePicture processImage];
UIImage *inputImage18 = [UIImage imageNamed:@"dianaframe.png"];
sourcePicture2 = [[GPUImagePicture alloc] initWithImage:inputImage18 smoothlyScaleOutput:YES];
GPUImageNormalBlendFilter * overlay18 = [[GPUImageNormalBlendFilter alloc] init];
[(GPUImageFilterGroup *)lens addFilter:overlay18];
[sourcePicture2 addTarget:overlay18 atTextureLocation:1];
[sourcePicture2 processImage];
GPUdianaf *dianaf = [[GPUdianaf alloc] init];
[(GPUImageFilterGroup *)lens addFilter:dianaf];
[dianaf addTarget:overlay1];
[overlay1 addTarget:overlay18];
[(GPUImageFilterGroup *)lens setInitialFilters:[NSArray arrayWithObject:dianaf]];
[(GPUImageFilterGroup *)lens setTerminalFilter:overlay18];
} break;
default:
lens = [[GPUImageRGBFilter alloc] init];
break;
}
}
-(void) prepareFilter {
if (![UIImagePickerController isSourceTypeAvailable: UIImagePickerControllerSourceTypeCamera]) {
isStatic = YES;
}
if (!isStatic) {
[self prepareLiveFilter];
} else {
[self prepareStaticFilter];
}
}
-(void) prepareLiveFilter {
[filter removeAllTargets];
[cropFilter removeAllTargets];
[blurFilter removeAllTargets];
[stillCamera addTarget:cropFilter];
[cropFilter addTarget:filter];
if (hasBlur) {
[filter addTarget:blurFilter];
[blurFilter addTarget:self.imageView];
} else {
[filter addTarget:self.imageView];
}
//GPUImage issue with new code, do not use until further improvement
//[filter prepareForImageCapture];
}
-(void) prepareStaticFilter {
if (!staticPicture) {
[self performSelector:@selector(switchToLibrary:) withObject:nil afterDelay:3.5];
}
[stillCamera removeAllTargets];
[staticPicture removeAllTargets];
[staticPicture addTarget:filter];
if (hasBlur) {
[filter addTarget:blurFilter];
[blurFilter addTarget:self.imageView];
} else {
[filter addTarget:self.imageView];
}
GPUImageRotationMode imageViewRotationMode = kGPUImageNoRotation;
switch (staticPictureOriginalOrientation) {
case UIImageOrientationLeft:
imageViewRotationMode = kGPUImageRotateLeft;
break;
case UIImageOrientationRight:
imageViewRotationMode = kGPUImageRotateRight;
break;
case UIImageOrientationDown:
imageViewRotationMode = kGPUImageRotate180;
break;
default:
imageViewRotationMode = kGPUImageNoRotation;
break;
}
[self.imageView setInputRotation:imageViewRotationMode atIndex:0];
[staticPicture processImage];
}
-(void) prepareLens {
if (![UIImagePickerController isSourceTypeAvailable: UIImagePickerControllerSourceTypeCamera]) {
isStatic = YES;
}
if (!isStatic) {
[self prepareLiveLens];
} else {
[self prepareStaticLens];
}
}
-(void) prepareLiveLens {
[stillCamera addTarget:lens];
[lens addTarget:self.imageView];
//GPUImage issue with new code, do not use until further improvement
//[lens prepareForImageCapture];
}
-(void) prepareStaticLens {
if (!staticPicture) {
[self performSelector:@selector(switchToLibrary:) withObject:nil afterDelay:3.5];
}
[staticPicture addTarget:lens];
[lens addTarget:self.imageView];
GPUImageRotationMode imageViewRotationMode = kGPUImageNoRotation;
switch (staticPictureOriginalOrientation) {
case UIImageOrientationLeft:
imageViewRotationMode = kGPUImageRotateLeft;
break;
case UIImageOrientationRight:
imageViewRotationMode = kGPUImageRotateRight;
break;
case UIImageOrientationDown:
imageViewRotationMode = kGPUImageRotate180;
break;
default:
imageViewRotationMode = kGPUImageNoRotation;
break;
}
[self.imageView setInputRotation:imageViewRotationMode atIndex:0];
[staticPicture processImage];
}
-(void) removeAllTargets {
[filter removeAllTargets];
[lens removeAllTargets];
[sourcePicture removeAllTargets];
[sourcePicture2 removeAllTargets];
[sourcePicture3 removeAllTargets];
[stillCamera removeAllTargets];
[staticPicture removeAllTargets];
[cropFilter removeAllTargets];
[blurFilter removeAllTargets];
}
-(void) prepareForCapture {
[stillCamera.inputCamera lockForConfiguration:nil];
if(self.flashToggleButton.selected &&
[stillCamera.inputCamera hasTorch]){
[stillCamera.inputCamera setTorchMode:AVCaptureTorchModeOn];
[self performSelector:@selector(captureImage)
withObject:nil
afterDelay:0.50];
}else{
[self captureImage];
}
}
-(void)captureImage {
[lens forceProcessingAtSize:CGSizeMake (1936, 1936)];
[staticPicture addTarget:lens];
[stillCamera capturePhotoAsImageProcessedUpToFilter:lens
withCompletionHandler:^(UIImage *processed, NSError *error) {
isStatic = YES;
runOnMainQueueWithoutDeadlocking(^{
@autoreleasepool {
[stillCamera.inputCamera unlockForConfiguration];
[stillCamera stopCameraCapture];
[self removeAllTargets];
staticPicture = [[GPUImagePicture alloc] initWithImage:processed smoothlyScaleOutput:NO];
staticPictureOriginalOrientation = processed.imageOrientation;
[self prepareFilter];
[self.retakeButton setHidden:NO];
[self.photoCaptureButton setTitle:@"Done" forState:UIControlStateNormal];
[self.photoCaptureButton setImage:nil forState:UIControlStateNormal];
[self.photoCaptureButton setEnabled:YES];
if(![self.filtersToggleButton isSelected]){
[self showFilters];
}
}
});
}];
}
-(IBAction) takePhoto:(id)sender{
[self.photoCaptureButton setEnabled:NO];
if (!isStatic) {
isStatic = YES;
[self.libraryToggleButton setHidden:YES];
[self.cameraToggleButton setEnabled:NO];
[self.flashToggleButton setEnabled:NO];
[self prepareForCapture];
} else {
GPUImageOutput<GPUImageInput> *processUpTo;
if (hasBlur) {
processUpTo = blurFilter;
} else {
processUpTo = lens;
}
[staticPicture processImage];
UIImage *currentFilteredVideoFrame = [processUpTo imageFromCurrentlyProcessedOutputWithOrientation:staticPictureOriginalOrientation];
NSDictionary *info = [[NSDictionary alloc] initWithObjectsAndKeys:
UIImageJPEGRepresentation(currentFilteredVideoFrame, self.outputJPEGQuality), @"data", nil];
[self.delegate imagePickerController:self didFinishPickingMediaWithInfo:info];
}
}
-(void) dealloc {
[self removeAllTargets];
stillCamera = nil;
cropFilter = nil;
filter = nil;
blurFilter = nil;
sourcePicture = nil;
sourcePicture2 = nil;
sourcePicture3 = nil;
staticPicture = nil;
self.blurOverlayView = nil;
self.focusView = nil;
}
Thanks in advance for any explanation or help with this question from a newbie.

You can take a look at the link below:
https://github.com/BradLarson/GPUImage/issues/112
Thanks
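For what it's worth, here is a minimal sketch of how the two groups could be chained on the static picture once both have been built by setFilter: and setLens: (the names staticPicture, filter, lens and imageView are taken from the code above; this is an illustration of the chaining order, not a drop-in fix):
// Rebuild the whole chain whenever either selection changes:
// staticPicture -> filter (Cases A) -> lens (Cases B) -> imageView
- (void)prepareStaticFilterAndLens {
    [staticPicture removeAllTargets];
    [filter removeAllTargets];
    [lens removeAllTargets];

    [staticPicture addTarget:filter];   // the Cases A group filters the picture first
    [filter addTarget:lens];            // the Cases B group receives the already-filtered output
    [lens addTarget:self.imageView];

    [staticPicture processImage];       // re-render through the complete chain
}
Since GPUImageFilterGroup is both a source and a target, one group can feed the next directly; the key is to rebuild the full chain (and re-process the picture) every time either selection changes, rather than attaching the lens group on top of a chain that was already torn down.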

Related

Implementation of videos in video with blur effect in Objective-C

I have to build a video editor with a blur effect on video.
Can someone please point me to some useful links or explain how this task should be approached? I have tried overlapping the videos, but it doesn't center the videos exactly.
- (void) overlapVideos{
AVURLAsset* firstAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"BearVideo" ofType:@"mp4"]] options:nil];
AVURLAsset * secondAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"BearVideo" ofType:@"mp4"]] options:nil];
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction * instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
CGAffineTransform Scale = CGAffineTransformMakeScale(0.6f,0.6f);
CGAffineTransform Move = CGAffineTransformMakeTranslation(140,20);
[FirstlayerInstruction setTransform:CGAffineTransformConcat(Scale,Move) atTime:kCMTimeZero];
AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
CGAffineTransform SecondScale = CGAffineTransformMakeScale(0.9f,0.9f);
CGAffineTransform SecondMove = CGAffineTransformMakeTranslation(0,0);
[SecondlayerInstruction setTransform:CGAffineTransformConcat(SecondScale,SecondMove) atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction,SecondlayerInstruction,nil];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = [NSArray arrayWithObject:instruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(1280, 720);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:@"overlapVideo.mov"];
if([[NSFileManager defaultManager] fileExistsAtPath:myPathDocs])
{
[[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:nil];
}
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
[exporter setVideoComposition:videoComposition];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinish:exporter];
});
}];
}
- (void)exportDidFinish:(AVAssetExportSession*)session
{
NSURL *outputURL = session.outputURL;
if(self.videodelegateObj!=nil){
[_videodelegateObj videoOverlappingFinished:outputURL];
}
}
-(void)applyBlurOnAsset:(AVAsset *)asset Completion:(void(^)(BOOL success, NSError* error, NSURL* videoUrl))completion{
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
AVVideoComposition *composition = [AVVideoComposition videoCompositionWithAsset: asset
applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request){
// Clamp to avoid blurring transparent pixels at the image edges
CIImage *source = [request.sourceImage imageByClampingToExtent];
[filter setValue:source forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithDouble:10.0] forKey:kCIInputRadiusKey];
CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];
[request finishWithImage:output context:nil];
}];
NSURL *outputUrl = [[NSURL alloc] initWithString:@"Your Output path"];
[[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset960x540] ;
exporter.videoComposition = composition;
exporter.outputFileType = AVFileTypeMPEG4;
if (outputUrl){
exporter.outputURL = outputUrl;
[exporter exportAsynchronouslyWithCompletionHandler:^{
switch ([exporter status]) {
case AVAssetExportSessionStatusFailed:
NSLog(#"crop Export failed: %#", [[exporter error] localizedDescription]);
if (completion){
dispatch_async(dispatch_get_main_queue(), ^{
completion(NO,[exporter error],nil);
});
return;
}
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"crop Export canceled");
if (completion){
dispatch_async(dispatch_get_main_queue(), ^{
completion(NO,nil,nil);
});
return;
}
break;
default:
break;
}
if (completion){
dispatch_async(dispatch_get_main_queue(), ^{
completion(YES,nil,outputUrl);
});
}
}];
}
}
Kindly give some guidance; any help in this direction would be highly appreciated. Thanks in advance.
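If it helps, one way to tie the two snippets above together is to export the overlapped composition first and then run the exported file through applyBlurOnAsset:Completion:. A rough sketch, reusing the names from the code above (note this blurs the whole composed video, not just one layer):
- (void)exportDidFinish:(AVAssetExportSession *)session
{
    NSURL *outputURL = session.outputURL;
    // Blur the freshly exported overlap video before handing it back to the delegate
    AVURLAsset *overlappedAsset = [AVURLAsset URLAssetWithURL:outputURL options:nil];
    [self applyBlurOnAsset:overlappedAsset Completion:^(BOOL success, NSError *error, NSURL *videoUrl) {
        if (success && self.videodelegateObj != nil) {
            [self.videodelegateObj videoOverlappingFinished:videoUrl];
        }
    }];
}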

Using NSExpression to combine entities and return an NSArray of NSArray elements

I would like the NSFetchRequest for my UITableViewController to group similar records (entities) based upon a particular attribute. I currently do a two-step process, but I believe there might be some way to use + (NSExpression *)expressionForAggregate:(NSArray *)collection instead.
Could someone help with the appropriate code?
Here's the code for my two-step process, which returns an array of arrays:
+(NSArray *)getTopQforDogByProgram2:(Dog *)dog
inProgram:(RunProgramTypes)programType
inManagedContext:(NSManagedObjectContext *)context {
NSString *searchString;
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Run"];
request.predicate = [NSPredicate predicateWithFormat:[NSString stringWithFormat:@"dog.callName = '%@'",dog.callName]];
NSSortDescriptor *classSortDescriptor = [NSSortDescriptor sortDescriptorWithKey:@"runClass" ascending:NO];
request.sortDescriptors = [NSArray arrayWithObject:classSortDescriptor];
NSError *error = nil;
NSArray *dataArray = [context executeFetchRequest:request error:&error];
NSMutableArray *returnArray = [[NSMutableArray alloc] init];
if ( [dataArray count] > 0 ) {
NSMutableArray *pointArray = [[NSMutableArray alloc] init];
for ( Run *run in dataArray ) {
if ( ! [returnArray count] ) {
[pointArray addObject:run];
[returnArray addObject:pointArray];
} else {
BOOL wasSame = FALSE;
for ( NSMutableArray *cmpArray in returnArray ) {
Run *cmpRun = [cmpArray lastObject];
if ( cmpRun.runClass == run.runClass ) {
[cmpArray addObject:run];
wasSame = TRUE;
break;
}
}
if ( ! wasSame ) {
pointArray = [[NSMutableArray alloc] init];
[pointArray addObject:run];
[returnArray addObject:pointArray];
}
}
}
}
return returnArray;
}
You could use a fetched results controller to do the sectioning for you, like this:
NSFetchedResultsController* controller = [[NSFetchedResultsController alloc] initWithFetchRequest:request
managedObjectContext:context
sectionNameKeyPath:@"runClass"
cacheName:nil];
// The sections are only populated after performing the fetch
NSError *error = nil;
[controller performFetch:&error];
NSMutableArray* sections = [[NSMutableArray alloc] initWithCapacity:[[controller sections] count]];
for (id<NSFetchedResultsSectionInfo> section in [controller sections]) {
NSMutableArray* sectionCopy = [NSMutableArray arrayWithArray:[section objects]];
[sections addObject:sectionCopy];
}
Or do it yourself (given that the results are sorted by runClass):
NSMutableArray* sections = [NSMutableArray new];
NSMutableArray* currentSection = nil;
for (Run* run in dataArray) {
Run* lastObject = (Run*)[currentSection lastObject];
// Start a new section whenever the runClass changes (or on the first run)
if (!lastObject || ![run.runClass isEqual:lastObject.runClass]) {
currentSection = [NSMutableArray new];
[sections addObject:currentSection];
}
[currentSection addObject:run];
}
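Another compact in-memory option, sketched here with KVC collection operators rather than NSExpression (so it sidesteps the expressionForAggregate: question rather than answering it):
// Distinct runClass values, then one filtered sub-array per value.
NSArray *classes = [dataArray valueForKeyPath:@"@distinctUnionOfObjects.runClass"];
NSMutableArray *sections = [NSMutableArray arrayWithCapacity:[classes count]];
for (id runClass in classes) {
    NSPredicate *byClass = [NSPredicate predicateWithFormat:@"runClass == %@", runClass];
    [sections addObject:[dataArray filteredArrayUsingPredicate:byClass]];
}
Note that the distinct values are not guaranteed to come back in any particular order, so you may want to sort classes before building the sections.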

Lat/long values are not accurate to the current location on iPhone

I am new to iOS and am developing a location app. I am not getting correct latitude/longitude values while running my app on an iPhone; the lat/long values it gives me are far from the current location. Can anyone tell me why it shows these values?
Try this. Set the delegate (CLLocationManagerDelegate):
#import <CoreLocation/CoreLocation.h>
CLLocationManager *locationManager;
CLLocation *startLocation;
NSString *currentLatitude;
NSString *currentLongitude;
- (void)viewDidLoad {
[super viewDidLoad];
locationManager = [[CLLocationManager alloc] init];
locationManager.desiredAccuracy = kCLLocationAccuracyBest;
locationManager.delegate = self;
[locationManager startUpdatingLocation];
startLocation = nil;
}
-(void)locationManager:(CLLocationManager *)manager
didUpdateToLocation:(CLLocation *)newLocation
fromLocation:(CLLocation *)oldLocation
{
currentLatitude = [[NSString alloc]
initWithFormat:@"%g",
newLocation.coordinate.latitude];
currentLongitude = [[NSString alloc]
initWithFormat:@"%g",
newLocation.coordinate.longitude];
NSLog(@"%@",currentLatitude);
NSLog(@"%@",currentLongitude);
}
-(void)locationManager:(CLLocationManager *)manager
didFailWithError:(NSError *)error
{
UIAlertView *alert = [[UIAlertView alloc]initWithTitle:@"ERROR" message:[NSString stringWithFormat:@"%@",error] delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil, nil];
[alert show];
[alert release];
}
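One common reason for coordinates that are far off is that the very first fix delivered is a cached, stale location with poor horizontalAccuracy. A sketch of filtering those out, using the newer delegate callback (didUpdateToLocation:fromLocation: is deprecated since iOS 6):
- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
    CLLocation *newLocation = [locations lastObject];
    // Skip cached fixes older than a few seconds and fixes with poor accuracy
    NSTimeInterval age = -[newLocation.timestamp timeIntervalSinceNow];
    if (age > 5.0 || newLocation.horizontalAccuracy < 0 || newLocation.horizontalAccuracy > 100.0) {
        return;
    }
    NSLog(@"%f, %f (accuracy %.0f m)", newLocation.coordinate.latitude,
          newLocation.coordinate.longitude, newLocation.horizontalAccuracy);
}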

MKAnnotation automatic/default callout not happening

I added an annotation, and in viewForAnnotation: I set canShowCallout and selected the annotation as well, but it isn't getting selected.
-(void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation{
currentCoordinates = newLocation.coordinate;
ParkPlaceMark *pa = [[ParkPlaceMark alloc] init];
pa.coordinate = currentCoordinates;
pa.title = #"POI";
pa.title2 = #"CUrrent Locn";
pa.subtitle = #"Drag and drop to adjust the position if necessary";
[mapView addAnnotation:pa];
[pa release];
}
- (MKAnnotationView *)mapView:(MKMapView *)map
viewForAnnotation:(ParkPlaceMark*)annotation {
MKAnnotationView *test=[[MKAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:@"parkingloc"];
if([annotation.title caseInsensitiveCompare:@"POI"] == NSOrderedSame)
{
test.annotation = annotation;
test.userInteractionEnabled = YES;
test.draggable = YES;
// test.pinColor = MKPinAnnotationColorRed;
[test setImage:[UIImage imageNamed:@"marker3.png"]];
[test setEnabled:YES];
[test setCanShowCallout:YES];
[test setDragState:MKAnnotationViewDragStateEnding animated:YES];
[self.mapView selectAnnotation:test.annotation animated:YES];
return test;
}
return nil;
}
I had added a custom animation, so
[self.mapView selectAnnotation:test.annotation animated:YES];
was not working. The correct code is:
[self.mapView selectAnnotation:test.annotation animated:NO];
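An alternative that avoids the timing problem entirely is to defer the selection until the annotation view has actually been added to the map, for example:
- (void)mapView:(MKMapView *)mapView didAddAnnotationViews:(NSArray *)views
{
    // The views are on screen at this point, so animated selection works reliably
    for (MKAnnotationView *view in views) {
        if ([view.annotation.title caseInsensitiveCompare:@"POI"] == NSOrderedSame) {
            [mapView selectAnnotation:view.annotation animated:YES];
        }
    }
}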

Background-queue changes to parent NSManagedObjectContext in UIManagedDocument cause duplicates in NSFetchedResultsController on merge

Ok guys. This one is driving me up the wall. I have
A UIManagedDocument and its two managed object contexts (regular and parent).
A UITableViewController (subclassed to CoreDataTableViewController by Paul Hegarty) that runs off of an
NSFetchedResultsController
A background GCD queue, used for syncing with the server, that accesses the parent context
I've tried this so many different ways and I run into problems each time.
When I add a new "animal" entity, there is no problem and it immediately shows up in the table. But when I upload it to the server (on the upload queue) and change its "status" (with the parent context) so that it should move to the uploaded section, it appears there but doesn't disappear from the not-yet-uploaded section.
I END UP WITH TWINS I DIDN'T WANT! Or sometimes it doesn't even produce the correct one and just keeps the wrong one.
BUT, the extra one disappears when the app is shut down and relaunched, so it's just in memory somewhere. I can verify in the store that everything is correct; the NSFetchedResultsController just isn't firing its controllerDidChange... callbacks.
Here is the superclass of my view controller
CoreDataTableViewController.m
#pragma mark - Fetching
- (void)performFetch
{
self.debug = 1;
if (self.fetchedResultsController) {
if (self.fetchedResultsController.fetchRequest.predicate) {
if (self.debug) NSLog(#"[%# %#] fetching %# with predicate: %#", NSStringFromClass([self class]), NSStringFromSelector(_cmd), self.fetchedResultsController.fetchRequest.entityName, self.fetchedResultsController.fetchRequest.predicate);
} else {
if (self.debug) NSLog(#"[%# %#] fetching all %# (i.e., no predicate)", NSStringFromClass([self class]), NSStringFromSelector(_cmd), self.fetchedResultsController.fetchRequest.entityName);
}
NSError *error;
[self.fetchedResultsController performFetch:&error];
if (error) NSLog(#"[%# %#] %# (%#)", NSStringFromClass([self class]), NSStringFromSelector(_cmd), [error localizedDescription], [error localizedFailureReason]);
} else {
if (self.debug) NSLog(#"[%# %#] no NSFetchedResultsController (yet?)", NSStringFromClass([self class]), NSStringFromSelector(_cmd));
}
[self.tableView reloadData];
}
- (void)setFetchedResultsController:(NSFetchedResultsController *)newfrc
{
NSFetchedResultsController *oldfrc = _fetchedResultsController;
if (newfrc != oldfrc) {
_fetchedResultsController = newfrc;
newfrc.delegate = self;
if ((!self.title || [self.title isEqualToString:oldfrc.fetchRequest.entity.name]) && (!self.navigationController || !self.navigationItem.title)) {
self.title = newfrc.fetchRequest.entity.name;
}
if (newfrc) {
if (self.debug) NSLog(#"[%# %#] %#", NSStringFromClass([self class]), NSStringFromSelector(_cmd), oldfrc ? #"updated" : #"set");
[self performFetch];
} else {
if (self.debug) NSLog(#"[%# %#] reset to nil", NSStringFromClass([self class]), NSStringFromSelector(_cmd));
[self.tableView reloadData];
}
}
}
#pragma mark - UITableViewDataSource
- (NSInteger)numberOfSectionsInTableView:(UITableView *)tableView
{
if (self.debug) NSLog(#"fetchedResultsController returns %d sections", [[self.fetchedResultsController sections] count]);
return [[self.fetchedResultsController sections] count];
}
- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section
{
return [[[self.fetchedResultsController sections] objectAtIndex:section] numberOfObjects];
}
- (NSString *)tableView:(UITableView *)tableView titleForHeaderInSection:(NSInteger)section
{
return [[[self.fetchedResultsController sections] objectAtIndex:section] name];
}
- (NSInteger)tableView:(UITableView *)tableView sectionForSectionIndexTitle:(NSString *)title atIndex: (NSInteger)index
{
return [self.fetchedResultsController sectionForSectionIndexTitle:title atIndex:index];
}
- (NSArray *)sectionIndexTitlesForTableView:(UITableView *)tableView
{
return [self.fetchedResultsController sectionIndexTitles];
}
#pragma mark - NSFetchedResultsControllerDelegate
- (void)controllerWillChangeContent:(NSFetchedResultsController *)controller
{
if (!self.suspendAutomaticTrackingOfChangesInManagedObjectContext)
{
[self.tableView beginUpdates];
self.beganUpdates = YES;
}
}
- (void)controller:(NSFetchedResultsController *)controller
didChangeSection:(id <NSFetchedResultsSectionInfo>)sectionInfo
atIndex:(NSUInteger)sectionIndex
forChangeType:(NSFetchedResultsChangeType)type
{
if (!self.suspendAutomaticTrackingOfChangesInManagedObjectContext)
{
switch(type)
{
case NSFetchedResultsChangeInsert:
[self.tableView insertSections:[NSIndexSet indexSetWithIndex:sectionIndex] withRowAnimation:UITableViewRowAnimationFade];
break;
case NSFetchedResultsChangeDelete:
[self.tableView deleteSections:[NSIndexSet indexSetWithIndex:sectionIndex] withRowAnimation:UITableViewRowAnimationFade];
break;
}
}
}
- (void)controller:(NSFetchedResultsController *)controller
didChangeObject:(id)anObject
atIndexPath:(NSIndexPath *)indexPath
forChangeType:(NSFetchedResultsChangeType)type
newIndexPath:(NSIndexPath *)newIndexPath
{
if(self.debug) NSLog(#"controller didChangeObject: %#", anObject);
if (!self.suspendAutomaticTrackingOfChangesInManagedObjectContext)
{
NSLog(#"#########Controller did change type: %d", type);
switch(type)
{
case NSFetchedResultsChangeInsert:
[self.tableView insertRowsAtIndexPaths:[NSArray arrayWithObject:newIndexPath] withRowAnimation:UITableViewRowAnimationFade];
break;
case NSFetchedResultsChangeDelete:
[self.tableView deleteRowsAtIndexPaths:[NSArray arrayWithObject:indexPath] withRowAnimation:UITableViewRowAnimationFade];
break;
case NSFetchedResultsChangeUpdate:
[self.tableView reloadRowsAtIndexPaths:[NSArray arrayWithObject:indexPath] withRowAnimation:UITableViewRowAnimationFade];
break;
case NSFetchedResultsChangeMove:
[self.tableView deleteRowsAtIndexPaths:[NSArray arrayWithObject:indexPath] withRowAnimation:UITableViewRowAnimationFade];
[self.tableView insertRowsAtIndexPaths:[NSArray arrayWithObject:newIndexPath] withRowAnimation:UITableViewRowAnimationFade];
break;
}
}
}
- (void)controllerDidChangeContent:(NSFetchedResultsController *)controller
{
if (self.beganUpdates) [self.tableView endUpdates];
if (self.debug) NSLog(#"controller Did Change Content");
}
- (void)endSuspensionOfUpdatesDueToContextChanges
{
_suspendAutomaticTrackingOfChangesInManagedObjectContext = NO;
}
- (void)setSuspendAutomaticTrackingOfChangesInManagedObjectContext:(BOOL)suspend
{
if (suspend) {
_suspendAutomaticTrackingOfChangesInManagedObjectContext = YES;
} else {
[self performSelector:@selector(endSuspensionOfUpdatesDueToContextChanges) withObject:0 afterDelay:0];
}
}
@end
And here's my specific view controller I subclassed from it:
- (NSArray *)sectionHeaderTitles
{
if (_sectionHeaderTitles == nil) _sectionHeaderTitles = [NSArray arrayWithObjects:@"Not Yet Uploaded", @"Uploaded But Not Featured", @"Previously Featured", nil];
return _sectionHeaderTitles;
}
- (NSDictionary *)selectedEntry
{
if (_selectedEntry == nil) _selectedEntry = [[NSDictionary alloc] init];
return _selectedEntry;
}
- (void)setupFetchedResultsController
{
[self.photoDatabase.managedObjectContext setStalenessInterval:0.0];
[self.photoDatabase.managedObjectContext.parentContext setStalenessInterval:0.0];
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Animal"];
request.sortDescriptors = [NSArray arrayWithObjects:[NSSortDescriptor sortDescriptorWithKey:@"status" ascending:YES], [NSSortDescriptor sortDescriptorWithKey:@"unique" ascending:NO], nil];
self.fetchedResultsController = [[NSFetchedResultsController alloc] initWithFetchRequest:request managedObjectContext:self.photoDatabase.managedObjectContext sectionNameKeyPath:@"status" cacheName:nil];
NSError *error;
BOOL success = [self.fetchedResultsController performFetch:&error];
if (!success) NSLog(@"error: %@", error);
else [self.tableView reloadData];
self.fetchedResultsController.delegate = self;
}
- (void)useDocument
{
if (![[NSFileManager defaultManager] fileExistsAtPath:[self.photoDatabase.fileURL path]]) {
[self.photoDatabase saveToURL:self.photoDatabase.fileURL forSaveOperation:UIDocumentSaveForCreating completionHandler:^(BOOL success) {
[self setupFetchedResultsController];
}];
} else if (self.photoDatabase.documentState == UIDocumentStateClosed) {
[self.photoDatabase openWithCompletionHandler:^(BOOL success) {
[self setupFetchedResultsController];
}];
} else if (self.photoDatabase.documentState == UIDocumentStateNormal) {
[self setupFetchedResultsController];
}
}
- (void)setPhotoDatabase:(WLManagedDocument *)photoDatabase
{
if (_photoDatabase != photoDatabase) {
_photoDatabase = photoDatabase;
[self useDocument];
}
}
- (void)viewDidLoad
{
[super viewDidLoad];
UILabel *label = [[UILabel alloc] initWithFrame:CGRectZero];
label.backgroundColor = [UIColor clearColor];
label.font = [UIFont fontWithName:@"AmericanTypewriter" size:20];
label.shadowColor = [UIColor colorWithWhite:0.0 alpha:0.5];
label.textAlignment = UITextAlignmentCenter;
label.textColor = [UIColor whiteColor];
self.navigationItem.titleView = label;
label.text = self.navigationItem.title;
[label sizeToFit];
}
- (void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
// Get CoreData database made if necessary
if (!self.photoDatabase) {
NSURL *url = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
url = [url URLByAppendingPathComponent:@"Default Photo Database"];
self.photoDatabase = [[WLManagedDocument alloc] initWithFileURL:url];
NSLog(#"No existing photoDatabase so a new one was created from default photo database file.");
}
self.tableView.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"DarkWoodBackGround.png"]];
}
- (void)syncWithServer
{
// This is done on the syncQ
// Start the activity indicator on the nav bar
dispatch_async(dispatch_get_main_queue(), ^{
[self.spinner startAnimating];
self.navigationItem.leftBarButtonItem = [[UIBarButtonItem alloc] initWithCustomView:self.spinner];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(managedObjectContextDidSave:)
name:NSManagedObjectContextDidSaveNotification
object:self.photoDatabase.managedObjectContext.parentContext];
});
// Find new animals (status == 0)
NSFetchRequest *newAnimalsRequest = [NSFetchRequest fetchRequestWithEntityName:@"Animal"];
newAnimalsRequest.predicate = [NSPredicate predicateWithFormat:@"status == 0"];
NSError *error;
NSArray *newAnimalsArray = [self.photoDatabase.managedObjectContext.parentContext executeFetchRequest:newAnimalsRequest error:&error];
if ([newAnimalsArray count]) NSLog(@"There are %d animals that need to be uploaded.", [newAnimalsArray count]);
if (error) NSLog(@"fetchError: %@", error);
// Get the existing animals from the server
NSArray *parsedDownloadedAnimalsByPhoto = [self downloadedAllAnimalsFromWeb];
// In the parent context, insert downloaded animals into core data
for (NSDictionary *downloadedPhoto in parsedDownloadedAnimalsByPhoto) {
[Photo photoWithWebDataInfo:downloadedPhoto inManagedObjectContext:self.photoDatabase.managedObjectContext.parentContext];
// table will automatically update due to NSFetchedResultsController's observing of the NSMOC
}
// Upload the new animals if there are any
if ([newAnimalsArray count] > 0) {
NSLog(#"There are %d animals that need to be uploaded.", [newAnimalsArray count]);
for (Animal *animal in newAnimalsArray) {
// uploadAnimal returns a number that lets us know if it was accepted by the server
NSNumber *unique = [self uploadAnimal:animal];
if ([unique intValue] != 0) {
animal.unique = unique;
// uploadThePhotosOf returns a success BOOL if all 3 uploaded successfully
if ([self uploadThePhotosOf:animal]){
[self.photoDatabase.managedObjectContext performBlock:^{
animal.status = [NSNumber numberWithInt:1];
}];
}
}
}
}
[self.photoDatabase.managedObjectContext.parentContext save:&error];
if (error) NSLog(#"Saving parent context error: %#", error);
[self performUpdate];
// Turn the activity indicator off and replace the sync button
dispatch_async(dispatch_get_main_queue(), ^{
// Save the context
[self.photoDatabase saveToURL:self.photoDatabase.fileURL forSaveOperation:UIDocumentSaveForOverwriting completionHandler:^(BOOL success) {
if (success)
{
NSLog(#"Document was saved");
[self.photoDatabase.managedObjectContext processPendingChanges];
} else {
NSLog(#"Document was not saved");
}
}];
[self.spinner stopAnimating];
self.navigationItem.leftBarButtonItem = self.syncButton;
});
// Here it skips to the notification I got from saving the context so I can MERGE them
}
- (NSNumber *)uploadAnimal:(Animal *)animal
{
NSURL *uploadURL = [NSURL URLWithString:@"index.php" relativeToURL:self.remoteBaseURL];
NSString *jsonStringFromAnimalMetaDictionary = [animal.metaDictionary JSONRepresentation];
NSLog(@"JSONRepresentation of %@: %@", animal.namestring, jsonStringFromAnimalMetaDictionary);
ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:uploadURL];
[request setPostValue:jsonStringFromAnimalMetaDictionary forKey:@"newmeta"];
[request startSynchronous];
NSError *error = [request error];
NSString *response;
if (!error) {
response = [request responseString];
NSNumber *animalUnique = [(NSArray *)[response JSONValue]objectAtIndex:0];
return animalUnique;
} else {
response = [error description];
NSLog(#"%# got an error: %#", animal.namestring, response);
return [NSNumber numberWithInt:0];
}
}
- (BOOL)uploadThePhotosOf:(Animal *)animal
{
NSURL *uploadURL = [NSURL URLWithString:@"index.php" relativeToURL:self.remoteBaseURL];
int index = [animal.photos count];
for (Photo *photo in animal.photos) {
// Name the jpeg file
NSTimeInterval timeInterval = [NSDate timeIntervalSinceReferenceDate];
NSString *imageServerPath = [NSString stringWithFormat:@"%lf-Photo.jpeg",timeInterval];
// Update the imageServerPath
photo.imageURL = imageServerPath;
NSData *photoData = [[NSData alloc] initWithData:photo.image];
NSString *photoMeta = [photo.metaDictionary JSONRepresentation];
ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:uploadURL];
[request addPostValue:photoMeta forKey:@"newphoto"];
[request addData:photoData withFileName:imageServerPath andContentType:@"image/jpeg" forKey:@"filename"];
[request setUploadProgressDelegate:self.progressView];
[request startSynchronous];
NSLog(#"%# progress: %#", animal.namestring, self.progressView.progress);
NSString *responseString = [request responseString];
NSLog(#"uploadThePhotosOf:%# photo at placement: %d has responseString: %#", animal.namestring, [photo.placement intValue], responseString);
SBJsonParser *parser= [[SBJsonParser alloc] init];
NSError *error = nil;
id jsonObject = [parser objectWithString:responseString error:&error];
NSNumber *parsedPhotoUploadResponse = [(NSArray *)jsonObject objectAtIndex:0];
// A proper response is not 0
if ([parsedPhotoUploadResponse intValue] != 0) {
photo.imageid = parsedPhotoUploadResponse;
--index;
}
}
// If the index spun down to 0 then it was successful
int success = (index == 0) ? 1 : 0;
return success;
}
- (NSArray *)downloadedAllAnimalsFromWeb
{
NSURL *downloadURL = [NSURL URLWithString:@"index.php" relativeToURL:self.remoteBaseURL];
ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:downloadURL];
[request setPostValue:@"yes" forKey:@"all"];
request.tag = kGetHistoryRequest;
[request startSynchronous];
NSString *responseString = [request responseString];
NSLog(#"downloadedAllAnimalsFromWeb responseString: %#", responseString);
SBJsonParser *parser= [[SBJsonParser alloc] init];
NSError *error = nil;
id jsonObject = [parser objectWithString:responseString error:&error];
NSArray *parsedDownloadedResponseStringArray = [NSArray arrayWithArray:jsonObject];
return parsedDownloadedResponseStringArray;
}
- (void)performUpdate
{
NSManagedObjectContext * context = self.photoDatabase.managedObjectContext.parentContext;
NSSet * inserts = [context updatedObjects];
if ([inserts count])
{
NSError * error = nil;
NSLog(#"There were inserts");
if ([context obtainPermanentIDsForObjects:[inserts allObjects]
error:&error] == NO)
{
NSLog(#"BAM! %#", error);
}
}
[self.photoDatabase updateChangeCount:UIDocumentChangeDone];
}
- (void)managedObjectContextDidSave:(NSNotification *)notification
{
[[NSNotificationCenter defaultCenter] removeObserver:self name:NSManagedObjectContextDidSaveNotification object:self.photoDatabase.managedObjectContext.parentContext];
NSLog(#"userInfo from the notification: %#", [notification userInfo]);
// Main thread context
NSManagedObjectContext *context = self.fetchedResultsController.managedObjectContext;
SEL selector = @selector(mergeChangesFromContextDidSaveNotification:);
[context performSelectorOnMainThread:selector withObject:notification waitUntilDone:YES];
NSLog(#"ContextDidSaveNotification was sent. MERGED");
}
#pragma mark - Table view data source
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"EntryCell"];
if (!cell) {
cell = [[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:@"EntryCell"];
}
// Configure the cell here...
Animal *animal = [self.fetchedResultsController objectAtIndexPath:indexPath];
cell.textLabel.text = animal.namestring;
if (([animal.numberofanimals intValue] > 0) && animal.species) {
cell.detailTextLabel.text = [NSString stringWithFormat:@"%@s", animal.species];
} else {
cell.detailTextLabel.text = animal.species;
}
return cell;
}
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
NSIndexPath *indexPath = [self.tableView indexPathForCell:sender];
Animal *animal = [self.fetchedResultsController objectAtIndexPath:indexPath];
// be somewhat generic here (slightly advanced usage)
// we'll segue to ANY view controller that has a photographer @property
if ([segue.identifier isEqualToString:@"newAnimal"]) {
NSLog(@"self.photodatabase");
[(NewMetaEntryViewController *)[segue.destinationViewController topViewController] setPhotoDatabaseContext:self.photoDatabase.managedObjectContext];
} else if ([segue.destinationViewController respondsToSelector:@selector(setAnimal:)]) {
// use performSelector:withObject: to send without compiler checking
// (which is acceptable here because we used introspection to be sure this is okay)
[segue.destinationViewController performSelector:@selector(setAnimal:) withObject:animal];
NSLog(@"animal: %@ \r\n indexPath: %@", animal, indexPath);
}
}
- (CGFloat)tableView:(UITableView *)tableView heightForHeaderInSection:(NSInteger)section
{
return 30;
}
- (NSArray *)sectionIndexTitlesForTableView:(UITableView *)tableView
{
return nil;
}
- (UIView *)tableView:(UITableView *)tableView viewForHeaderInSection:(NSInteger)section
{
NSLog(#"header for section called for section: %d", section);
NSLog(#"fetchedResultsController sections: %#", self.fetchedResultsController.sections);
CGRect headerRect = CGRectMake(0, 0, tableView.bounds.size.width, 30);
UIView *header = [[UIView alloc] initWithFrame:headerRect];
UILabel *headerTitleLabel = [[UILabel alloc] initWithFrame:CGRectMake(5, 5, tableView.bounds.size.width - 10, 20)];
if ([(Animal *)[[[[self.fetchedResultsController sections] objectAtIndex:section] objects] objectAtIndex:0] status] == [NSNumber numberWithInt:0]) {
headerTitleLabel.text = [self.sectionHeaderTitles objectAtIndex:0];
} else if ([(Animal *)[[[[self.fetchedResultsController sections] objectAtIndex:section] objects] objectAtIndex:0] status] == [NSNumber numberWithInt:1]) {
headerTitleLabel.text = [self.sectionHeaderTitles objectAtIndex:1];
} else {
headerTitleLabel.text = [self.sectionHeaderTitles objectAtIndex:2];
}
headerTitleLabel.textColor = [UIColor whiteColor];
headerTitleLabel.font = [UIFont fontWithName:@"AmericanTypewriter" size:20];
headerTitleLabel.backgroundColor = [UIColor clearColor];
headerTitleLabel.alpha = 0.8;
[header addSubview:headerTitleLabel];
return header;
}
Way too much code for anyone to want to wade through.
However, from a quick inspection, it looks like you are violating the MOC threading constraints. Specifically, you are accessing the parent context directly, and not from its own thread either.
Typically, you would start a background thread (or queue), create a new MOC there, set its parent to the document's MOC, do your work, and then call save on the new MOC. The save will notify the parent, which should handle the updating.
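A rough sketch of that pattern, assuming photoDatabase is the UIManagedDocument and syncQueue is the background GCD queue from the question (both names are stand-ins here):
dispatch_async(syncQueue, ^{
    // Private-queue child context whose parent is the document's main context
    NSManagedObjectContext *backgroundContext =
        [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
    backgroundContext.parentContext = photoDatabase.managedObjectContext;

    [backgroundContext performBlockAndWait:^{
        // ... fetch, insert and update Animal objects on backgroundContext only ...

        NSError *saveError = nil;
        if (![backgroundContext save:&saveError]) {
            NSLog(@"background save failed: %@", saveError);
        }
        // Saving the child pushes its changes up into photoDatabase.managedObjectContext,
        // which is the context the NSFetchedResultsController watches, so the table
        // updates without any manual merge-notification handling.
    }];
});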
