I work with iPhone iOS 4.3.
In my project I need a read-only, pre-populated table of data (say a table with 20 rows and 20 fields).
Each row of this data has to be fetched by key.
What is the better approach: Core Data, archives, SQLite, or something else? And how can I prepare and store this table?
Thank you.
I would use Core Data for that. Drawback: you have to write a program (desktop or iOS) to populate the persistent store.
To see how to use a pre-populated store, have a look at Apple's Recipes sample code.
The simplest approach would be to use an NSArray of NSDictionary objects and save the array to disk as a plist. Include the plist in your build and then open it read-only from the app bundle at runtime.
Each "row" would be an element index of the array, which would return a dictionary object wherein each "column" is a key-value pair.
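That structure is easy to generate ahead of time. A rough sketch of the build step using Python's plistlib (the row contents here are made up; on the device you would read the bundled file back with NSArray's initWithContentsOfFile:):

```python
import plistlib

# Build the table: each "row" is a dictionary of column-name -> value.
rows = [{"name": "apple", "color": "red"},
        {"name": "pear", "color": "green"}]

# Serialize to a plist file that can be included in the app bundle.
with open("table.plist", "wb") as f:
    plistlib.dump(rows, f)

# Reading it back: the row index selects the dictionary,
# the key selects the "column".
with open("table.plist", "rb") as f:
    table = plistlib.load(f)
print(table[1]["color"])  # -> green
```

For a 20x20 read-only table this is about as simple as persistence gets, and there is no store-migration machinery to worry about.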
I've done this two different ways:
Saved all my data as dictionaries in a plist, then deserialized everything and loaded it into the app during startup
Created a program during development that populates the Core Data db. Save that db to the app bundle, then copy the db during app startup into the Documents folder for use as the Persistent Store
Both options are relatively easy, and when the initial data requirements get very large, the second has proven to be the more performant for me.
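For the second approach, the startup step is just "install the seed store into a writable directory unless one is already there". A minimal sketch of that check (paths and names are placeholders; in an iOS app you would use NSFileManager with the bundle path and the Documents directory):

```python
import os
import shutil

def install_seed_store(seed_path, writable_path):
    # Copy the pre-populated store on first launch only, so that
    # data the user has saved since then is never overwritten.
    if not os.path.exists(writable_path):
        shutil.copyfile(seed_path, writable_path)
    return writable_path

# Hypothetical usage:
# store = install_seed_store("MyApp.app/Seed.sqlite",
#                            "Documents/Store.sqlite")
```

The existence check is the important part: running the copy unconditionally on every launch would reset the user's data each time.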
Related
I have an online SQL database that needs to sync with my app's Core Data persistent store. Is there a simple way of writing to Core Data only data that currently does not exist?
For example:
If the SQL database currently holds 3 records: A B and C.
Core Data currently holds only A and B.
I just want to add C to Core Data during the syncing process.
I could do a series of loops to check each record, but is there an easier way? All the records will be unique, so maybe there is a method for making a Core Data attribute act like a primary key.
Give the online database records a "last updated" attribute. Keep track of when your Core Data store was last updated. When it's time to sync, process only the remote records changed since your last local update.
Looping through all records, using find-or-create in Core Data, will be slow, because you'll be creating an NSManagedObject instance for each object in your datastore. Keep the logic needed for sync-or-not outside of Core Data for speed.
I have an application that uses two merged Core Data models mapped to two different data stores (both SQLite) via model configurations (each unique configuration within each model is mapped to its own data store). The persistent store coordinator does a good job of saving the relevant data into the correct store. However, when the stores are first created by Core Data on the very first save operation, their schemas are absolutely identical and correspond to a union of the two merged models.
Is there any way to make Core Data create each store based solely on the configuration/model mapped to that store?
I guess not, because if Core Data generated partial schemas into different persistent stores, it might break the relationships between them and thus cause problems. At least at this stage I don't think Apple intends to support this.
I wrote a console application that reads a list of flat files, parses the data types on a row basis, and inserts the records one after another into the respective tables.
A few of the flat files contain about 63k records (rows). For such files, my program takes about 6 hours to complete one file.
This is a test data file; in production I have to deal with 100 times the load.
I am worried about whether I can do this any better to speed it up. Can anyone suggest the best way to handle this job?
The workflow is as follows:
Read the flat file from the local machine using File.ReadAllLines("location")
Create a record entity object after parsing each field of the row
Insert the current row into the database via that entity
The purpose of making this a console application is that it should run as a scheduled job on a weekly basis, and there is conditional logic in it: based on some variable it will do a full table replace, update an existing table, or delete records from a table.
You can try a 'bulk insert' operation for loading large amounts of data into the database.
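The biggest win is usually committing in batches rather than once per row. A sketch of the idea using SQLite (the table and column names are invented; with SQL Server the analogues would be BULK INSERT or SqlBulkCopy):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, value TEXT)")

rows = [(i, "value-%d" % i) for i in range(63000)]

# One transaction for the whole batch instead of one commit per INSERT;
# per-row commits are typically what make 63k rows take hours.
with conn:
    conn.executemany("INSERT INTO records VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)  # 63000
```

The same principle applies regardless of database engine: parse all rows first, then hand them to the driver's bulk/batched API inside a single transaction.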
I have a set of data which also contains images. I want to cache this data. Should I store the images on the file system or in Core Data, and why?
There are two main options:
Store the file on disk, and then store the path to the image in Core Data
Store the binary data of the image in Core Data
I personally prefer the first option, since it lets me choose when to load the actual image into memory. It also means I don't have to remember what format the raw data is in; I can just use the path to alloc/init a new UIImage object.
You might want to read this from the Core Data Programming Guide on how to deal with binary large objects (BLOBs). There are rules of thumb for what size binary data should and should not be stored within the actual Core Data store.
You might also look at Core Data iPad/iPhone BLOBS vs File system for 20k PDFs
If you do place binary data within Core Data store, you would do well to have a "Data" entity that holds the actual data and to have your "Image" entity separate. Create a relationship between the two entities, so that "Data" need only be loaded when actually needed. The "Image" entity can hold the meta-data such as title, data type, etc.
As for where to store the user data/files: I found Application Support to be a decent location, given that I was wary of the user moving, deleting, or altering the file in some way that would leave the image unrecoverable and unusable later by my application.
Take minecraft as an example:
eg. "~/Library/Application Support/minecraft/saves/"
I would agree with the previous comments: store paths to the images in Core Data, but store the images themselves as PNG files in their own folder outside of Core Data.
I'm converting an app from SQLitePersistentObjects to Core Data.
In the app, I have a class that I generate many* instances of from an XML file retrieved from my server. The UI can trigger actions that require me to save some* of those objects until the next invocation of the app.
Other than having a single NSManagedObjectContext for each of these objects (shared only with their subservient objects, which can include blobs), I can't see a way to get fine-grained control (i.e. at the object level) over which objects are persisted. If I try to use a single context for all newly created objects, I get an exception when I move one of my objects to a new context so I can persist it on its own. I'm guessing this is because the objects it owns are left in the 'old' context.
The other option I see is to have a single context, persist all my objects, and then delete the ones I don't need later; this feels like it would hit the database too much, but maybe Core Data does magic.
So:
1. Am I missing something basic about the way my Core Data app should be architected?
2. Is having a context per object a good design pattern?
3. Is there a better way to move objects between contexts, to avoid 2?
* where "many" means "tens, maybe hundreds, not thousands" and "some" is at least one order of magnitude less than "many"
Also cross posted to the Apple forums.
Core Data is really not an object persistence framework. It is an object graph management framework that just happens to be able to persist that graph to disk (see this previous SO answer for more info). So trying to use Core Data to persist just some of the objects in an object graph is going to be working against the grain. Core Data would much rather manage the entire graph of all objects that you're going to create. So, the options are not perfect, but I see several (including some you mentioned):
You could create all the objects in the Core Data context, then delete the ones you don't want to save. Until you save the context, everything is in-memory so there won't be any "going back to the database" as you suggest. Even after saving to disk, Core Data is very good at caching instances in the contexts' row cache and there is surprisingly little overhead to just letting it do its thing and not worrying about what's on disk and what's in memory.
If you can create all the objects first, then do all the processing in-memory before deciding which objects to save, you can create a single NSManagedObjectContext with a persistent store coordinator having only an in-memory persistent store. When you decide which objects to save, you can then add a persistent (XML/binary/SQLite) store to the persistent store coordinator, assign the objects you want to save to that store (using the context's (void)assignObject:(id)object toPersistentStore:(NSPersistentStore *)store) and then save the context.
You could create all the objects outside of Core Data, then copy the objects to-be-saved into a Core Data context.
You can create all the objects in a single in-memory context and write your own methods to copy those objects' properties and relationships to a new context to save just the instances you want. Unless the entities in your model have many relationships, this isn't that hard (see this page for tips on migrating objects from one store to another using a multi-pass approach; it describes the technique in the context of versioning managed object models and is no longer needed in 10.5 for that purpose, but the technique applies to your use case as well).
Personally, I would go with option 1 -- let Core Data do its thing, including managing deletions from your object graph.