I would have thought this would work easily, but I cannot understand why my NSMutableDictionary property is not behaving as I would have expected.
[self.testSprite.userData setValue:@"CAT" forKey:@"key"];
NSLog(@"%@", [self.testSprite.userData objectForKey:@"key"]);
NSLog(@"%lu", [self.testSprite.userData count]);
I am getting back (null) and 0.
Is there a special trick to using a sprite node's userData?
Thanks
The userData property is initially nil. You have to create a dictionary and assign it first:
self.testSprite.userData = [NSMutableDictionary dictionary];
[self.testSprite.userData setValue:@"CAT" forKey:@"key"];
NSLog(@"%@", [self.testSprite.userData objectForKey:@"key"]);
NSLog(@"%lu", [self.testSprite.userData count]);
I am currently working on an app that uses the data returned by reverse geocoding. Right now I can successfully receive the following values for a location: address, city, state, zip code, and country. In addition to the values that I am able to get, I would also like to obtain the name of the neighborhood for the locations that I reverse geocode. My code is as follows:
CLLocationManager *location = [[CLLocationManager alloc] init];
[location setDelegate:self];
location.desiredAccuracy=kCLLocationAccuracyBest;
location.distanceFilter=kCLDistanceFilterNone;
[location startMonitoringSignificantLocationChanges];
CLGeocoder *geolocation = [[CLGeocoder alloc] init];
- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
NSLog(#"Update method is definitely being called!");
NSLog(#"Your current location is : %#", [locations lastObject]);
[geolocation reverseGeocodeLocation:[locations lastObject] completionHandler:^(NSArray *placemarks, NSError *error) {
NSLog(#"Reverse geocode complete: %#", placemarks);
CLPlacemark *placemark = [placemarks objectAtIndex:0];
NSLog(#"The locality area is: %#", placemark.locality);
}];
}
I expected placemark.locality to return the neighborhood but it returns the city instead.
Any help would be greatly appreciated,
Dave
After reading Apple's documentation for the CLPlacemark class, I noticed there were a few fields I was unaware of.
Among these fields is exactly what I was trying to acquire: subLocality, which appears to be Apple's term for the neighborhood. If I had read the documentation instead of assuming that the CLPlacemark stored in placemark would contain no more data than what NSLog(@"%@", [placemarks objectAtIndex:0]) prints, I would have figured this out much sooner. Oh well. The code I used to access the neighborhood is:
[placemark subLocality];
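For context, a reverse-geocode completion handler that reads both fields might look roughly like this (a sketch reusing the geolocation geocoder and locations array from the question, not the original code):
[geolocation reverseGeocodeLocation:[locations lastObject] completionHandler:^(NSArray *placemarks, NSError *error) {
    if (error != nil || placemarks.count == 0) {
        NSLog(@"Reverse geocode failed: %@", error);
        return;
    }
    CLPlacemark *placemark = [placemarks objectAtIndex:0];
    NSLog(@"City (locality): %@", placemark.locality);
    NSLog(@"Neighborhood (subLocality): %@", placemark.subLocality); // may be nil outside larger cities
}];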
I am having an issue running GPUImage. I have modified the SimpleVideoFileFilter program (replaced the filter with a chroma key filter) and am using my own video. My program is terminating due to the following error:
[AVAssetWriter startWriting] Cannot call method when status is 3'
I have gone through the forums but am not sure why the movie writer is being closed and then written to afterwards.
I am using an iPhone 4 running iOS 7.0.
Any clues are greatly appreciated. Thanks much!
Check whether your destination file exists already. If it does, remove it.
In my case, I was trying to write the file to a directory that did not exist. For example, with /Videos/Video.mov, changing the path to just /Video.mov worked.
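A minimal sketch of both checks before creating the movie writer (the paths here are placeholders):
// Sketch: make sure the destination directory exists and no stale file is in the way.
NSString *moviePath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Videos/Video.mov"];
NSFileManager *fileManager = [NSFileManager defaultManager];

// Create the intermediate directory if it is missing.
[fileManager createDirectoryAtPath:[moviePath stringByDeletingLastPathComponent]
       withIntermediateDirectories:YES
                        attributes:nil
                             error:NULL];

// Remove any existing file; AVAssetWriter will not overwrite an existing file.
if ([fileManager fileExistsAtPath:moviePath]) {
    [fileManager removeItemAtPath:moviePath error:NULL];
}

NSURL *movieURL = [NSURL fileURLWithPath:moviePath];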
Ok, I have a few ideas for you.
When you say "it just shows a frame and never plays the video" we have a good indication that your entire processing pipeline from start to finish is functional exactly once, then stops working.
That tells us that you are stringing things together correctly, but some of the components don't exist longer than a single frame buffer cycle, and subsequently the whole process stops.
It looks like filter and movieWriter are scoped to the class (I'm assuming they're instance variables rather than properties, given the lack of an underscore prefix like _filter and _movieWriter), so they will live on after this method has finished (correct me if I'm wrong...).
I think where you are encountering trouble is your (GPUImageView*)displayView
This should probably be declared as a class property (although it could work as just a variable) and then instantiated through the nib or the viewDidLoad method of the view controller.
As you have it now, the line GPUImageView* filterView = (GPUImageView*)displayView; makes an assignment to filterView which is never used (and is therefore unnecessary). It's not clear whether displayView really is an instance of GPUImageView, or whether it will still be in existence when the current method finishes. (In fact, you say it "is a UIView that I have programmatically created".)
displayView will have to be a subclass of GPUImageView for this whole thing to work, and it will have to be scoped to the class, and not the method.
Declare it like this:
@property (strong, nonatomic) GPUImageView *displayView;
and then instantiate it and add it to your view hierarchy from within viewDidLoad
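A rough sketch of that setup in the view controller (the frame and view hierarchy are assumptions):
// Sketch: create the GPUImageView in viewDidLoad so it outlives a single frame cycle.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.displayView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:self.displayView];
}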
movieFile1 = [[GPUImageMovie alloc] initWithURL:movieFileURL1];
movieFile2 = [[GPUImageMovie alloc] initWithURL:movieFileURL2];
movieFile2.runBenchmark = YES;
movieFile2.playAtActualSpeed = NO;
filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[(GPUImageChromaKeyBlendFilter *)filter setColorToReplaceRed:0.0 green:1.0 blue:0.0];
[(GPUImageChromaKeyBlendFilter *)filter setThresholdSensitivity:0.4];
GPUImageView *filterView = (GPUImageView*)displayView;
[filter addTarget:displayView];
[movieFile1 addTarget:filter];
[movieFile2 addTarget:filter];
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(1920.0, 1280.0)];
[filter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
movieFile1.audioEncodingTarget = movieWriter;
[movieFile1 enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile1 startProcessing];
[movieFile2 startProcessing];
[movieWriter setCompletionBlock:^{
[filter removeTarget:movieWriter];
[movieWriter finishRecording];
}];
if (outputPath) {
finalURL = [[stongObj tempFileURL] copy];
DebugLog(#"Start Filter Processing :%#",finalURL);
DebugLog(#"movieUrl :%#",movieUrl);
// [CSUtils removeChuckFilePaths:#[outputPath]];
//Create Image Movie Object
_movieFile = [[GPUImageMovie alloc] initWithURL:outputPath];
//_movieFile = [[GPUImageMovie alloc] initWithURL:[[NSBundle mainBundle] URLForResource:@"videoviewdemo" withExtension:@"mp4"]];
_movieFile.runBenchmark = NO;
_movieFile.playAtActualSpeed = YES;
_movieFile.delegate = self;
//Movie Writer Object
_movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:finalURL size:CGSizeMake([UIScreen mainScreen].bounds.size.height,[UIScreen mainScreen].bounds.size.height)];
//_movieWriter.delegate = self;
//Create Selecetive GPU Image Filter
[stongObj setGpuOutputFilter:selectedVideoFilterType];
//Create Group Filter
groupFilter = [[GPUImageFilterGroup alloc] init];
[groupFilter addTarget:imageOutputFilter];
// Only Single Filter is implemented.
//Apply Initial and Terminal Filter
[(GPUImageFilterGroup *)groupFilter setInitialFilters:[NSArray arrayWithObject:imageOutputFilter]];
[(GPUImageFilterGroup *)groupFilter setTerminalFilter:imageOutputFilter];
//_movieWriter -> groupFilter ->_movieFile
[_movieFile addTarget:groupFilter];
[groupFilter addTarget:_movieWriter];
_movieWriter.shouldPassthroughAudio = YES;
_movieFile.audioEncodingTarget = _movieWriter;
[_movieFile enableSynchronizedEncodingUsingMovieWriter:_movieWriter];
//Start Recording
[_movieWriter startRecording];
//Start Processing
[_movieFile startProcessing];
__weak typeof(self) weekSelf=self;
[_movieWriter setCompletionBlock:^{
__strong typeof(self) stongSelf=weekSelf;
DebugLog(#"Movie Write Completed");
//Finish Recording.
[stongSelf.movieWriter finishRecording];
//Release all object
// [self releaseAllObject];
//remove movieUrl,audioUrl,outputPath
[CSUtils removeChuckFiles:@[movieUrl, audioUrl, outputPath]];
}];
The app crashes on the [_movieFile startProcessing]; line above on iOS 8, but works fine on iOS 7.
@Seasia Creative, I don't have enough reputation to add a comment on that post, so I'm creating a new one to answer you.
I checked the output URL; the console logged "/var~~~~/tmpmerge.mp4", so I realized I was missing a "/": it should be "/var~~~~/tmp/merge.mp4".
If the URL is not correct, the project runs into the same error.
Hope this helps someone.
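Building the path with stringByAppendingPathComponent (or fileURLWithPath) lets the framework insert the separators, so a missing "/" like this can't happen. A small sketch with illustrative names:
NSString *outputDirectory = NSTemporaryDirectory();                       // ".../tmp/"
NSString *outputPath = [outputDirectory stringByAppendingPathComponent:@"merge.mp4"];
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];                    // file:///.../tmp/merge.mp4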
Just a quick question about SPTrack and SPAlbum
Say I have an array of SPTracks stored in myTracks
I can get the name of this track by doing
[[myTracks objectAtIndex:0] name];
However; when I try to get the name of the album like this
[[myTracks objectAtIndex:0] album];
an SPAlbum object is returned. This makes sense, but I am unable to access the name property of the SPAlbum like so:
[[[myTracks objectAtIndex:0] album] name];
The name property is defined as an NSString in SPAlbum.m. Am I attempting to access this incorrectly? Thanks for your help.
Are you getting nil back?
If so, you need to make sure your objects are loaded first, using SPAsyncLoading.
[SPAsyncLoading waitUntilLoaded:[[myTracks objectAtIndex:0] album] timeout:kSPAsyncLoadingDefaultTimeout then:^(NSArray *loadedItems, NSArray *notLoadedItems) {
    if (loadedItems.count == 0) return; // Album didn't load!
    NSLog(@"%@", [[[myTracks objectAtIndex:0] album] name]);
}];
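If the track itself has not finished loading yet, its album property may still be nil, in which case you may need to wait on the track first and then on its album. A sketch using the same API (this nesting is an assumption, not something the library requires in every case):
SPTrack *track = [myTracks objectAtIndex:0];
[SPAsyncLoading waitUntilLoaded:track timeout:kSPAsyncLoadingDefaultTimeout then:^(NSArray *loadedTracks, NSArray *notLoadedTracks) {
    if (loadedTracks.count == 0) return; // Track didn't load
    [SPAsyncLoading waitUntilLoaded:track.album timeout:kSPAsyncLoadingDefaultTimeout then:^(NSArray *loadedAlbums, NSArray *notLoadedAlbums) {
        if (loadedAlbums.count == 0) return; // Album didn't load
        NSLog(@"%@", track.album.name);
    }];
}];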
Setup: I have a collection of parent objects, call them ObjectA. Each ObjectA has a one-to-many relation to ObjectB. So, one ObjectA may contain 0..n ObjectB-s, and each ObjectB has a specific ObjectA as its parent.
Now, I would like to do a Core Data fetch of ObjectA-s, where they are sorted by their latest ObjectB. Is it possible to create a sort descriptor for that?
There is a related question that describes exactly the same situation. The answer suggests denormalizing the attribute from ObjectB into ObjectA. This would be OK if there really is no way to do this with one fetch request.
The related question also mentions:
Actually, I just had an idea! Maybe I can sort Conversations by messages.@max.sortedDate…
I tried. It doesn’t seem to be possible. I get this error:
2012-10-05 17:51:42.813 xxx[6398:c07] *** Terminating app due to uncaught
exception 'NSInvalidArgumentException', reason: 'Keypath containing
KVC aggregate where there shouldn't be one; failed to handle
ObjectB.@max.creationTime'
Is denormalizing the attribute into ObjectA the only/best solution?
You could add an attribute to ObjectB that stores the timestamp of when it was added; then in the fetch request you can do something like this:
NSSortDescriptor *descriptor = [[NSSortDescriptor alloc] initWithKey:@"objectB.addTime" ascending:YES];
...
fetchRequest.sortDescriptors = @[descriptor];
I know this question is a bit old, but what I did was fetch all ObjectBs, iterate over the results, pull out each one's ObjectA, and add it to a new array.
NSFetchRequest *fetchRequest = [NSFetchRequest new];
[fetchRequest setEntity:self.entityDescForObjectB];
// sort
NSSortDescriptor *sortDescriptor = [[NSSortDescriptor alloc] initWithKey:@"date" ascending:YES];
[fetchRequest setSortDescriptors:@[sortDescriptor]];
NSError *error = nil;
NSArray *fetchedObjects = [self.managedObjectContext executeFetchRequest:fetchRequest error:&error];
if (fetchedObjects == nil) {
NSLog(#"Error fetching objects: %#", error.localizedDescription);
return;
}
// pull out all the ObjectA objects
NSMutableArray *tmp = [@[] mutableCopy];
for (ObjectB *obj in fetchedObjects) {
if ([tmp containsObject:obj.objectA]) {
continue;
}
[tmp addObject:obj.objectA];
}
This works because CoreData is an object graph so you can work backwards. The loop at the end basically checks to see if the tmp array already has a specific ObjectA instance and if not adds it to the array.
It's important that you sort the ObjectBs otherwise this exercise is pointless.
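For completeness, the denormalization route mentioned in the question would store the latest ObjectB timestamp directly on ObjectA and sort on that in a single fetch. A rough sketch, where latestObjectBDate is a hypothetical attribute on ObjectA and creationTime comes from the error message above:
// Keep the denormalized attribute up to date whenever an ObjectB is inserted.
objectB.creationTime = [NSDate date];
objectB.objectA.latestObjectBDate = objectB.creationTime;

// Fetch ObjectA-s sorted by their latest ObjectB in one request.
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"ObjectA"];
request.sortDescriptors = @[[[NSSortDescriptor alloc] initWithKey:@"latestObjectBDate" ascending:NO]];
NSArray *sortedObjectAs = [self.managedObjectContext executeFetchRequest:request error:NULL];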
Not sure if this is the right place (I am sure someone will let me know if it is not). I have an iPhone application with a UITableView that is backed by Core Data. I want to perform a narrowing search so that only the items starting with the characters entered into the search bar are shown. This is normally done with the delegate method - (void)searchBar:(UISearchBar *)searchBar textDidChange:(NSString *)searchText, no problem. As I am new to Core Data, I am a little confused about how to do this here. One of the big problems, as I see it, is going to be updating the interface to let it know what to present. I assume an alternative NSFetchedResultsController needs to be given to the UITableView, is that correct?
So here are my issues:
1) I assume I need to create an NSFetchedResultsController with only the correct items in it, then tell the UITableView to use it as the data source and reload the table?
2) Is there a better way than executing a full sorted fetch and removing the objects that do not match? That is, is there a way of doing a "select where" type of fetch?
Thanks in advance and sorry if this is a dumb question.
Regards
Damien
Yes, you will need a new NSFetchedResultsController.
You should use an NSPredicate in your new NSFetchRequest to filter by your search text.
For example, if your managed objects have a field "name" that should be filtered:
NSPredicate *pred = [NSPredicate predicateWithFormat:@"%K beginswith[c] %@", @"name", searchText];
[fetchRequest setPredicate:pred];
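Putting it together, the replacement fetched results controller might be built roughly like this (the entity name, sort key, and nil cache are assumptions for the sake of the example):
// Sketch: a fetched results controller whose request filters on the search text.
NSFetchRequest *fetchRequest = [NSFetchRequest fetchRequestWithEntityName:@"Item"];
fetchRequest.predicate = [NSPredicate predicateWithFormat:@"%K beginswith[c] %@", @"name", searchText];
fetchRequest.sortDescriptors = @[[[NSSortDescriptor alloc] initWithKey:@"name" ascending:YES]];

NSFetchedResultsController *searchResultsController =
    [[NSFetchedResultsController alloc] initWithFetchRequest:fetchRequest
                                        managedObjectContext:self.managedObjectContext
                                          sectionNameKeyPath:nil
                                                   cacheName:nil];
NSError *error = nil;
if (![searchResultsController performFetch:&error]) {
    NSLog(@"Search fetch failed: %@", error.localizedDescription);
}
[self.tableView reloadData];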
I used a slightly different solution: instead of relying on a different NSFetchedResultsController, I created an NSMutableArray (filteredListContent) in my table view controller to store the temporary data, as inspired by Apple sample code and Mugunth Kumar's tutorial.
In tableView:cellForRowAtIndexPath:, I return data from the appropriate data-source array:
if (receivedTableView == self.searchDisplayController.searchResultsTableView) {
    Objects *object = [self.filteredListContent objectAtIndex:indexPath.row];
    cell.textLabel.text = object.name;
} else {
    Objects *object = [self.unfilteredListContent objectAtIndex:indexPath.row];
    cell.textLabel.text = object.name;
}
As in Apple's sample code, add pretty much the same check in other methods, such as
- (NSInteger)tableView:(UITableView *)receivedTableView numberOfRowsInSection:(NSInteger)section {
    if (receivedTableView == self.searchDisplayController.searchResultsTableView) {
        return [self.filteredListContent count];
    }
    return [self.unfilteredListContent count];
}
As well as in tableView:didSelectRowAtIndexPath:...
Then I conformed to the UISearchDisplayDelegate protocol and added the following methods:
- (void)filterContentForSearchText:(NSString *)searchText
{
    if (!self.filteredListContent) {
        self.filteredListContent = [[NSMutableArray alloc] init];
    }
    [self.filteredListContent removeAllObjects];

    // Case- and diacritic-insensitive "contains" match against the search text.
    NSPredicate *predicate = [NSPredicate predicateWithFormat:@"(SELF contains[cd] %@)", searchText];
    for (Objects *object in [self.coreDataStuffVariable.fetchedResultsController fetchedObjects])
    {
        NSString *elementTitle = [NSString stringWithFormat:@"%@", object.name];
        if ([predicate evaluateWithObject:elementTitle])
        {
            [self.filteredListContent addObject:object];
        }
    }
}
- (BOOL)searchDisplayController:(UISearchDisplayController *)controller
shouldReloadTableForSearchString:(NSString *)searchString
{
    [self filterContentForSearchText:searchString];
    // Return YES to cause the search result table view to be reloaded.
    return YES;
}
Pretty simple. I guess it can end up badly if the core data objects are reloaded during a search, but well... if you can sleep knowing that then it may be a good solution!