Save/Load system for a game using Core Data

I've been trying to build a simple load/save system for a game using Core Data.
Saving, loading and creating my UIManagedDocument works fine. Setting and loading values for attributes in my savegame entity works fine as well.
The problem is that these values are effectively lost when my app quits, because I don't know how to read them back in when the app launches again.
I have about 200 attributes inside my savegame entity, such as currentIncome, currentMoney, and currentMonth (all NSNumber attributes). Seems simple, right? If I were to group these into related entities, I'd probably end up with about 20 attributes. From what I gathered from the Core Data Programming Guide, in order to fill the entity with the values from my saved UIManagedDocument, I need to perform a fetch request with a predicate, which gives me an array of results.
This is where my first question arises: Would I need to perform a fetch request for every single attribute? That seems useful if you only have a few attributes or if you have to-many relationships. In my case, though, it would be incredibly tedious.
I might be missing something essential here, but what I need is a way to load my UIManagedDocument and have my managed object automatically filled with the values that were saved in the document's context, and I cannot find one.
That is where my second question comes in: Are Core Data and UIManagedDocument even the right approach for this? 200 variables is too much for NSUserDefaults, though I could imagine using NSCoding. I definitely want to incorporate iCloud savegame sharing at a later point, and UIManagedDocument and Core Data just seemed perfect for that.
Solved:
I just rewrote the entire Core Data fetching code (20 lines down to 10 or so).
Performing a fetch request for an entity without a predicate apparently returns every object of that entity.
If my fetchedResults array comes back empty (the database is empty), I insert a new instance of my savegame entity into my managed object context.
If it comes back non-empty, I just do NSManagedObject *saveGame = [fetchedResults lastObject] and every value gets loaded fine.
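For reference, the fetch-or-insert part boils down to something like this (a sketch; the SaveGame entity name and document property are assumptions based on the description above):
// No predicate: fetch every SaveGame object in the store.
NSManagedObjectContext *context = self.document.managedObjectContext;
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"SaveGame"];
NSError *error = nil;
NSArray *fetchedResults = [context executeFetchRequest:request error:&error];

NSManagedObject *saveGame = nil;
if (fetchedResults.count == 0) {
    // Empty database: create the single save-game object.
    saveGame = [NSEntityDescription insertNewObjectForEntityForName:@"SaveGame"
                                             inManagedObjectContext:context];
} else {
    // Existing database: reuse the one object that holds all the game state.
    saveGame = [fetchedResults lastObject];
}

// Attributes can then be read and written directly, e.g.:
[saveGame setValue:@(12) forKey:@"currentMonth"];
NSNumber *money = [saveGame valueForKey:@"currentMoney"];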

From a database perspective, it sounds like what you have here is a single table, saveGame, with 200 columns (currentMoney, currentMonth, etc.) and a single row representing the current game state.
The NSFetchRequest is the equivalent of a SQL SELECT statement, and as you only have one row you don't really need any predicates (WHERE clauses, etc.); you just get everything from the table, which is what a fetch request that only specifies the entity does: SELECT * FROM saveGame.
So all in all it doesn't sound like you're getting much value out of the Core Data framework here. An alternative might be to look into the iCloud key-value storage API (NSUbiquitousKeyValueStore), which sounds closer to what you are currently getting from Core Data.
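If the key-value route is enough, a minimal sketch might look like this (key names are illustrative; note that the iCloud key-value store has tight per-app size limits):
NSUbiquitousKeyValueStore *store = [NSUbiquitousKeyValueStore defaultStore];

// Save a couple of game values to iCloud key-value storage.
[store setDouble:1234.0 forKey:@"currentMoney"];
[store setLongLong:7 forKey:@"currentMonth"];
[store synchronize];

// Read them back (on this or another device).
double money = [store doubleForKey:@"currentMoney"];
long long month = [store longLongForKey:@"currentMonth"];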

Related

NSRangeException following Core Data migration

After adding a new Core Data model version to my app, I performed a lightweight migration, apparently successfully. The migrated file loaded fine, but upon the first attempt to access an attribute via a particular relationship, the app crashes with an NSRangeException: '*** -[__NSArrayM objectAtIndex:]: index 4294967295 beyond bounds [0 .. 35]'. This relationship worked fine prior to the migration. I know from other posts here that 4294967295 is really -1, but the only thing I can identify with 36 items in my app/data is that there are 36 total entities in the data model (for reference, the relationship that's being fetched has 58 items in its table).
The question:
My question is: based on the error I'm getting and the troubleshooting I've done below, is there a type of schema change that could pass the lightweight migration, but corrupt the data along the way, leading to the noted exception? I'm going to try breaking down the migration into smaller chunks over several versions to either isolate or avoid the issue, but it would be nice to be able to focus on specific schema changes that might be at fault.
The failure:
The failure occurs with the following code in "myobject":
[[self object2] text];
The object2 relationship is to-one, non-optional both ways and neither the forward nor inverse relationship was changed between data models. The text attribute is likely not relevant because when the error occurs, awakeFromFetch is not reached in object2. If I assign [self object2] to a variable prior to the above statement, the assignment is successful and reports data: <fault>.
The database:
Looking at the database in sqlite3, I notice the following:
The index values for the forward and inverse relationships appear to be correct in each table.
The object2 table has two columns for the inverse relationship instead of the one it had prior to migration (ZMYOBJECT as before, plus the additional Z2_MYOBJECT, which is empty for all rows). No other relationships were added that would explain this column.
In the Z_PRIMARYKEY table, all entries post-migration show -1 for Z_MAX, whereas prior to migration they showed zero for empty tables and the maximum row number for populated tables. Manually updating Z_MAX to the proper values did not help with the exception. All Z_SUPER values were correct.
I set up a mapping model to see if anything looked awry with the automatic mappings, but everything looked fine.
Overall schema changes:
In the source version of the data model, there were fourteen entities, of which only four had been populated with data (the app is still in development). Seven were top-level entities and seven were sub-entities of three of the top-level entities.
In the target version of the data model, twenty-two entities were added, some top-level and some sub-entities, with dozens of relationships, including some added to existing entities.
Some attributes and relationships were removed from existing entities and others were added. No data types or relationship settings were changed, no attributes or relationships were renamed, and no special mappings were required.
Update (2/25/12): As I started working on a new intermediate model, I remembered that I had changed the class (representedClassName) for a number of entities from NSManagedObject to an NSManagedObject subclass, but hadn't generated the class files. I didn't suspect that would cause an issue and, indeed, creating all of the class files did not help with the exception. I just wanted to note that as another change between models.
Conclusions:
This is a wild guess, but if the 36 entity count is not a coincidence, it seems that when "myobject" attempts to fault in "object2" it does not have a valid reference for the table and is attempting to load table number -1, causing the exception. The fact that a simple assignment of [self object2] is successful, however, doesn't jibe with that conclusion.
Any ideas?
By working through several incremental migrations I was able to determine what is causing the issue, and a solution.
The problem:
One of the existing entities with data has no child entities in the current model. If I create a new model that simply adds a child entity, containing no attributes or relationships, and makes no other changes, the NSRangeException, Z_MAX observation, and doubling of the inverse relationship noted in my question all occur.
The solution:
After observing the failures following a "successful" lightweight migration for the case above, I created a mapping model. Since the only change was one additional entity, all but one of the entity mappings were straightforward. The question was what to do with the single added entity.
By default, the added entity with no attributes or relationships of its own was showing attribute and relationship mappings for all of the parent's properties. All of the mappings had empty value expressions by default, which I assumed meant that it would just skip them during the migration. Not true, apparently. By deleting all of the attribute and relationship mappings within the entity mapping and then turning off inferred mapping, the migration proceeded successfully.
I still have to tackle all of the remaining entities and will be trying this approach to do the rest in bulk, with all planned attributes and relationships intact.
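For reference, "turning off inferred mapping" here comes down to the store options used when the persistent store is added: with these options Core Data still migrates automatically, but it looks for an explicit mapping model in the bundle instead of inferring one. A sketch, where coordinator and storeURL are placeholders:
// Migrate automatically, but require an explicit mapping model rather than an inferred one.
NSDictionary *options = @{ NSMigratePersistentStoresAutomaticallyOption : @YES,
                           NSInferMappingModelAutomaticallyOption       : @NO };

NSError *error = nil;
[coordinator addPersistentStoreWithType:NSSQLiteStoreType
                          configuration:nil
                                    URL:storeURL
                                options:options
                                  error:&error];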
Your posts were helpful when I encountered this problem. Thank you. [Have you reported the bug yet?]
Here are some more experimental results but, alas, not a great solution.
My schema change similarly added an entity subtype that has no additional attributes or relationships. The error message is the same as yours except the bounds are [0 .. 19]. That does correspond to 20 entity types, validating your hypothesis. Like your situation, the error happened when attempting to access an entity property after migration completed.
Adding a dummy attribute and a dummy self-relationship to the new entity type didn't avoid the post-migration crash. (However, I didn't test with that new entity type as the only schema change since I previously pushed that schema change to alpha testers.)
I observe the Z2_MYOBJECT column and Z_PRIMARYKEY.Z_MAX = -1 symptoms after successful migrations for other schema changes, so those may not be problematic at all. The -1 values get replaced lazily by the proper max values. The extra column might be used during migration.
In my case, the new entity's supertype has an ordered to-many relationship. In the very simple case where the entire data store contains just one object instance (an instance of that entity type with no outgoing relationship links), the schema migration succeeds. It does have the extra Z2_MYOBJECT column and Z_PRIMARYKEY.Z_MAX = -1 values and yet the resulting data store works fine when adding objects from there.
I tried creating a mapping model but was unsuccessful in getting Core Data to apply it. Turning off inferred mapping just made Core Data unable to migrate at all. Is there a trick to it? Do I have to write custom migration code to invoke a mapping model? This is Xcode 4.6.2 so the older bug is long gone.
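(For reference, custom migration code that applies a mapping model explicitly would look roughly like this with NSMigrationManager; the source/destination models and store URLs below are placeholders.)
// Locate the mapping model for this source/destination pair in the main bundle.
NSMappingModel *mapping = [NSMappingModel mappingModelFromBundles:nil
                                                   forSourceModel:sourceModel
                                                 destinationModel:destinationModel];

NSMigrationManager *manager = [[NSMigrationManager alloc] initWithSourceModel:sourceModel
                                                              destinationModel:destinationModel];

NSError *error = nil;
BOOL ok = [manager migrateStoreFromURL:sourceStoreURL
                                  type:NSSQLiteStoreType
                               options:nil
                      withMappingModel:mapping
                      toDestinationURL:destinationStoreURL
                       destinationType:NSSQLiteStoreType
                    destinationOptions:nil
                                 error:&error];
if (!ok) NSLog(@"Migration failed: %@", error);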
When using git to roll the code & data model backwards or forwards to conduct an experiment, it seems to be necessary to (1) close & reopen the Xcode project and (2) do a clean build. Otherwise Xcode may crash and/or leave confounding state around.
To experimentally roll backwards, you must delete the .momd/ directory or the entire app from the target iOS simulator/device (or deploy the app via iTunes or TestFlight) since redeploying via Xcode won't remove obsolete files (like .mom and .omo data model definitions) which in turn lets the app do lightweight migrations that the actual deployed app can't do.
About the entity mapping to use for the added entity type, note that when Core Data applies a mapping model, it's copying entities from the old data store to a new one. It's not modifying the tables in place. You don't want it to "skip" properties (including inherited properties) unless you want to drop them.
However, since the schema change added an entity type, that entity has no instances to migrate so its custom mapping model rules do not matter.
Thus I wonder if something else caused your crashes to stop, like leftover experimental .mom files or custom migration code. Did your workaround hold up?
After 2 days of experimenting I decided my alpha testers would have to live without data migration this time. Fortunately this happened without production customers. But it doesn't give me confidence in Core Data.
I had the same sort of NSRangeException after adding a core data model version when accessing any instance of a particular entity after automatic lightweight migration. In my case also the range corresponded to the number of entities in my model.
I generated a mapping model with Xcode 4.6 (4H127) using File > New > File... and then selecting Core Data > Mapping Model. This caused the crash to (d)evolve into -[NSSymbolicExpression length]: unrecognized selector sent to instance...
Solution
The issue in my case was that my entity causing the original crash had a relationship named size, which is a reserved word listed in Apple's Predicate Programming Guide. An examination of the mapping model revealed that the reserved word had been capitalized in the Value Expression for the relationship:
FUNCTION($manager, "destinationInstancesForEntityMappingNamed:sourceInstances:", "PNSizeOptionToPNSizeOption", $source.SIZE)
I found the solution in Core Data Model Versioning and Data Migration Programming Guide:
Reserved words in custom value expressions: If you use a custom value expression, you must escape reserved words such as SIZE, FIRST, and LAST using a # (for example, $source.#size).
Unfortunately, Xcode's algorithm for generating the mapping model did not recognize the reserved word, and I had to change the expression's key path in the Relationship Mapping inspector to $source.#size. This solved the problem. I assume that Core Data's inferred mapping model ran into a similar problem during lightweight migration.
There may be other causes of this kind of crash, so this solution may not apply, but it may be worth checking the property names in your model against the list of reserved words in the Predicate Programming Guide.

Supplying a UITableView with Core Data the old-fashioned way

Does anyone have an example of how to efficiently provide a UITableView with data from a Core Data model, preferably including the use of sections (via a referenced property), without the use of NSFetchedResultsController?
How was this done before NSFetchedResultsController became available? Ideally the sample should fetch only the data that's being viewed and make additional requests as necessary.
Thanks,
Tim
For the record, I agree with CommaToast that there's at best a very limited set of reasons to implement an alternative version of NSFetchedResultsController. Indeed I'm unable to think of an occasion when I would advocate doing so.
That being said, for the purpose of education, I'd imagine that:
upon creation, NSFetchedResultsController runs the relevant NSFetchRequest against the managed object context to create the initial result set;
subsequently — if it has a delegate — it listens for NSManagedObjectContextObjectsDidChangeNotification from the managed object context. Upon receiving that notification it updates its result set.
Fetch requests sit atop predicates, and predicates can't always be broken down into the keys they reference (e.g., if you create one via predicateWithBlock:). Furthermore, although the inserted and deleted lists are quite explicit, the list of changed objects doesn't provide clues as to how those objects have changed. So I'd imagine it just reruns the predicate supplied in the fetch request against the combined set of changed and inserted records, then suitably accumulates the results, dropping anything from the deleted set that it previously considered a result.
There are probably more efficient things you could do whenever dealing with a fetch request with a fetch limit. Obvious observations, straight off the top of my head:
if you already had enough objects, none of those were deleted or modified, and none of the newly inserted or modified objects has a higher sort position than the objects you had, then there are obviously no changes to propagate and you needn't run a new query;
even if you've lost some of the objects you had, if you kept whichever was lowest then you've got an upper bound for everything that didn't change, so if the changed and inserted ones, together with those you already had, make more than enough, then you can also avoid a new query.
The logical extension would seem to be that you need to re-interrogate the managed object context only if the deletions, insertions and changes modify your sorted list such that, before you chop it down to the given fetch limit, the bottom object isn't one you had from last time. The reasoning is that you don't already know anything about the stored objects you don't have hold of versus the insertions and modifications; you only know how those you don't have hold of compare to those you previously had.
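A bare-bones sketch of that change-tracking, assuming a table view controller that keeps the fetch request and its current results as properties (all names here are illustrative):
// Minimal stand-in for NSFetchedResultsController's change tracking.
// self.managedObjectContext, self.fetchRequest, self.results and self.tableView
// are assumed properties of the table view controller.
- (void)viewDidLoad {
    [super viewDidLoad];
    NSError *error = nil;
    self.results = [self.managedObjectContext executeFetchRequest:self.fetchRequest error:&error];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(contextObjectsDidChange:)
                                                 name:NSManagedObjectContextObjectsDidChangeNotification
                                               object:self.managedObjectContext];
}

- (void)contextObjectsDidChange:(NSNotification *)note {
    // Naive approach: re-run the fetch and reload. A smarter version would inspect
    // NSInsertedObjectsKey / NSUpdatedObjectsKey / NSDeletedObjectsKey in note.userInfo
    // and patch the cached results in place, as described above.
    NSError *error = nil;
    self.results = [self.managedObjectContext executeFetchRequest:self.fetchRequest error:&error];
    [self.tableView reloadData];
}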

Core Data: am I on the right track? Setting up a data model for data that contains multiple arrays, e.g. accelerometer data

I am working on a project that involves a lot of data, and at first I was doing it all in plist, and I realized it was getting out of hand and I would have to learn Core Data. I'm still not entirely sure whether I can do what I want in Core Data, but I think it should work out. I've set up a data model, but I'm not sure if it's the right way to do it. Please read on if you think you can help out and let me know if I'm on the right track. Please bear with me, because I am trying to explain it as thoroughly as I can.
I've got the basic object with attributes set up at the root level; say a person with attributes like a name, date of birth, etc. Pretty simple. You set up one entity like this "Person" in your model, and you can save as many of them as you want in your data and retrieve them as an array, right? It could be sorted based on an attribute in the Person, such as the date they were added to the database.
Now where I get a bit more confused is when I want to store several different collections of data with each person. For example a list of courses and associated test marks. In a plist I would have stored an array of dictionaries that stored this, sorted by the date assessed. The way I set this up in my data model was that I added an entity called "Tests" and a "to-many" relationship from Person to Tests, and then when I pull that I get an NSSet that I can order by a timestamp again? Is there a better way to do this?
Similarly, the Person may have a set of arrays of numerical data (the kind you could graph over time; e.g., Nike+ stores your running data as distance vs. time, and a person would have multiple runs associated with them, hence a set of arrays, each with its own date of collection). The way I set this up is a little different: a "Runs" entity with just a timestamp attribute, connected from Person via a to-many relationship with the inverse "forPerson". The Runs entity is then connected via another to-many relationship to an entity with attributes for the numerical data and the time. Once again, I would use a time/order attribute to sort them.
So the main question I have is whether using an internal attribute like a timestamp to sort a set is the right way to load an "array" from Core Data. Forum and Stack Overflow posts about how to store NSArrays in Core Data seem overly complicated compared to this, giving me the sense that I'm misunderstanding something.
Thanks for your help. Sorry for all the text, but I'm new to Core Data and I figure setting up the data model properly is essential before starting to code methods for getting/saving data. If necessary, I can set up a sample model to demonstrate this and post a picture of it.
Core Data will give you NSSets by default. These are convertible to arrays by calling allObjects or, if you want a sorted array, sortedArrayUsingDescriptors:. The "ordered" property on the relationship description gives you an NSOrderedSet on the managed object instead. Hashed sets provide quicker adds, accesses and membership checks, with a penalty (relative to ordered sets) when you need to sort.
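For example, pulling a Person's runs out in timestamp order might look like this (property names follow the question and are assumptions):
// Sort the to-many "runs" relationship (an NSSet) by its timestamp attribute.
NSSortDescriptor *byTime = [NSSortDescriptor sortDescriptorWithKey:@"timestamp" ascending:YES];
NSArray *sortedRuns = [person.runs sortedArrayUsingDescriptors:@[byTime]];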

Sorting Core Data objects by transformable attribute with NSFetchedResultsController

I'm displaying objects stored in Core Data in a UITableView and am having problems sorting these objects by one of the object's transformable attributes. I should point out that I'm using an NSFetchedResultsController as the controller between the Core Data store and my table view. When I was simply using an array to hold all of my objects, I could sort them without any problems at all. I'm using an FRC because I need the data grouped in sections with section headers and the FRC makes that very easy.
Let's call these objects I'm sorting "Measurement" objects. Each Measurement object has a distance attribute. That distance attribute is of a custom class, EPHDistance, so it's set up in the Core Data model as a Transformable attribute.
To make a long story short, the sorting of Measurement objects by their distance does work, but only after I've edited an object that's stored by Core Data or added a new object to the store. After editing the store and returning to my table that lists all the Measurement objects in order, everything works great. It's just the initial launch and viewing of the table view where the objects aren't sorted properly. I've actually placed an NSLog statement in my EPHDistance -compare: method and it's not getting called during sorting until I add/edit an object in the Core Data store. For what it's worth, if I sort these Measurement objects by their "date" attribute, which is an NSDate, it works great right out of the gate.
I'm not super experienced with Core Data and this is my first real attempt at using an NSFetchedResultsController so I'm a little baffled by this. Any input would be greatly appreciated.
Thanks a lot,
Erik
You could create a method in your Measurement class called -(NSString *)distanceCompareString, which returns a string derived from your EPHDistance object that you can sort on. Then, in your NSSortDescriptor, you just use distanceCompareString as the sort key.
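One caveat: with an SQLite-backed NSFetchedResultsController the sort descriptors are evaluated by the store, so the sort key generally has to be a modeled, persistent attribute rather than a computed method. A sketch of that variant, where distanceSortValue is an assumed persistent attribute kept in sync with the EPHDistance value whenever it is set:
// "distanceSortValue" is a hypothetical persistent attribute (e.g. the distance in meters)
// written whenever the transformable distance attribute changes.
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Measurement"];
request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"distanceSortValue" ascending:YES]];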

Store NSArray In Core Data Sample Code?

I have been searching for some sample code on how to store an NSArray in Core Data for a while now, but haven't had any luck. Would anyone mind pointing me to a tutorial or example, or better yet writing a simple sample as an answer to this question? I have read this but it doesn't show an example of how to go about implementing a transformable attribute that is an NSArray. Thanks in advance!
If you really need to do it, then encode it as data. I simply created a new attribute called receive of type NSData (Binary Data).
Then in the NSManagedObject implementation:
// Archive the array into the receive (NSData) attribute.
-(void)setReceiveList:(NSArray *)list {
    self.receive = [NSKeyedArchiver archivedDataWithRootObject:list];
}
// Unarchive the stored data back into an NSArray.
-(NSArray *)getReceiveList {
    return [NSKeyedUnarchiver unarchiveObjectWithData:self.receive];
}
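A hypothetical call site, assuming the managed object instance is called savedObject:
[savedObject setReceiveList:@[@1, @2, @3]];
NSArray *numbers = [savedObject getReceiveList];   // @[@1, @2, @3]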
Transformable attributes are the correct way to persist otherwise unsupported object values in Core Data (such as NSArray). From Core Data Programming Guide: Non-Standard Persistent Attributes:
The idea behind transformable attributes is that you access an attribute as a non-standard type, but behind the scenes Core Data uses an instance of NSValueTransformer to convert the attribute to and from an instance of NSData. Core Data then stores the data instance to the persistent store.
A transformable attribute uses an NSValueTransformer to store an otherwise unsupported object in the persistent store. This allows Core Data to store just about anything that can be represented as NSData - which can be very useful. Unfortunately, transformable attributes cannot be matched in a predicate or used in sorting results with the NSSQLiteStoreType. This means that transformable attributes are useful only for storage, not discovery of objects.
The default transformer allows any object that supports NSCoding (or NSSecureCoding) to be stored as a transformable attribute. This includes NSArray, UIColor, UIImage, NSURL, CLLocation, and many others. It's not recommended to use this for data that can be arbitrarily large, as that can have a significant performance impact when querying the store. Images, for example, are a poor fit for transformable attributes - they are large bags of bytes that fragment the store. In that case, it's better to use the external records storage capabilities of Core Data, or to store the data separately as a file, and store the URL to the file in Core Data. If you must store a UIImage in Core Data, be sure you know the trade-offs involved.
Creating a transformable attribute is easy:
• In the Xcode Core Data Model Editor, select the model attribute you want to modify. In the right side inspector, set the attribute type as "Transformable". You can leave the "Name" field blank to use the default transformer. If you were using a custom transformer, you would enter the class name here and register the class using +[NSValueTransformer setValueTransformer:forName:] somewhere in your code.
• In your NSManagedObject subclass header declare the property that describes the transformable attribute with the correct type. In this case, we're using NSArray:
@property (nonatomic, retain) NSArray *transformedArray;
• In the NSManagedObject subclass implementation file the property should be dynamic:
@dynamic transformedArray;
And you are done. When an NSArray value object is passed to setTransformedArray: that array is retained by the object. When the context is saved Core Data will transform the NSArray into NSData using the NSValueTransformer described in the model. The NSData bytes will be saved in the persistent store.
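A quick usage sketch, assuming the entity/class is named MyEntity and context is an existing NSManagedObjectContext (both names are placeholders):
MyEntity *entry = [NSEntityDescription insertNewObjectForEntityForName:@"MyEntity"
                                                inManagedObjectContext:context];
entry.transformedArray = @[@"one", @"two", @"three"];   // held as a real NSArray in memory

NSError *error = nil;
[context save:&error];   // converted to NSData by the value transformer at save time

NSArray *roundTripped = entry.transformedArray;   // reads back as an NSArray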
You don't store an NSArray natively in Core Data. You need to transform the values stored within the array into something Core Data can use, and then save the data in the store so that you can push and pull it to your NSArray as needed.
Philip's answer is right: you don't store arrays in Core Data, because that goes against what Core Data is designed for. Most of the time you don't need the whole array, just one element, and that element can be loaded dynamically by Core Data. When it comes to collections, it makes little difference whether you iterate through an array of your own properties or through an array of fetched results backed by an NSSet (which is basically just an array too).
Here is the explanation of what Philip said: you can't store an array directly, but you can create a property list from it. An NSArray of property-list types can be serialized into a nice, clean string, and Core Data loves strings. The cool thing about property lists stored as strings is that they can be turned back into what they were; NSString has a method for that (-propertyList). Ta-da.
There is a price of course.
Arrays stored as property lists can get gigantic, and that doesn't go well with iOS devices, where RAM is limited. Trying to save a large array to Core Data usually indicates a poor entity design; a small array is OK for speed reasons.
Another, less space-consuming option is binary property lists. Those come close to zipped sizes when stored in Core Data or directly in the file system. The downside is that you can't simply open and read them like an XML or JSON file. For development I prefer something human-readable, and for release the binary version; a constant tied to the DEBUG preprocessor value takes care of that, so I don't have to change my code.
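A sketch of that DEBUG-controlled switch using NSPropertyListSerialization (the array contents must be property-list types; variable names are illustrative):
// XML plists while developing (human-readable), binary plists for release (compact).
#ifdef DEBUG
NSPropertyListFormat format = NSPropertyListXMLFormat_v1_0;
#else
NSPropertyListFormat format = NSPropertyListBinaryFormat_v1_0;
#endif

NSError *error = nil;
NSData *plistData = [NSPropertyListSerialization dataWithPropertyList:array
                                                                format:format
                                                               options:0
                                                                 error:&error];

// ...and back into a collection later:
NSArray *restored = [NSPropertyListSerialization propertyListWithData:plistData
                                                               options:NSPropertyListImmutable
                                                                format:NULL
                                                                 error:&error];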
Core Data stores instances of NSManagedObject or subclasses of same. NSManagedObject itself is very much like a dictionary. To-many relationships between objects are represented as sets. Core Data has no ordered list that would correspond to an array. Instead, when you retrieve objects from a Core Data store, you use a fetch request. That fetch request can specify one or more sort descriptors that are used to sort the objects, and the objects returned by a fetch request are stored in an array.
If preserving the order of objects is important, you'll need to include an attribute in your entity that can be used to sort the objects when you fetch them.
