Core Data Managed Object one-to-many relationship insert/update save

Let's assume I have a Person->Phones relationship (one-to-many, of course). The initial insert and save work correctly, where I do:
if (/* person does not exist */) {
    user = (Person *)[NSEntityDescription insertNewObjectForEntityForName:@"Person"
                                                   inManagedObjectContext:ctx];
} else {
    // assuming a helper that fetches the existing Person
    user = [self searchObjectsForEntity:@"Person" withPredicate:pred
                             andSortKey:nil andSortAscending:NO andContext:ctx];
}
Should this be replaced with just one insertNewObjectForEntityForName: call which would either insert a new object or fetch the existing one?
Next, I need to create my Phone objects and add them to my Person, which I do with something like this:
Phone *phone = (Phone *)[NSEntityDescription insertNewObjectForEntityForName:@"Phone"
                                                       inManagedObjectContext:ctx];
[mutableSetOfPhones addObject:phone];
user.phones = mutableSetOfPhones;
So I create a new Phone managed object, add it to a set, set that set on the Person, and after that I save.
All this works except when I use the same code to re-save the Person instance, i.e. edit/insert/delete a phone or make any other modifications to the user's data. The old Phone records remain in the database and are no longer associated with any Person.
What is the right approach here? Do I need to iterate through user.phones and match up edits/deletes by some id? Should I just delete the older instances prior to saving the updated records (much simpler)? What is the recommended approach; maybe I am doing something completely incorrect?
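A minimal sketch of the delete-then-replace option mentioned above, assuming ctx, user, and mutableSetOfPhones from the snippets in the question, and that the Person.phones relationship has no Cascade delete rule doing this for you:

// Delete the user's existing phones, then attach the freshly built set.
// Without the explicit deleteObject: calls (or a Cascade delete rule on
// the relationship), the old Phone rows linger in the store, orphaned.
for (Phone *oldPhone in [user.phones copy]) {
    [ctx deleteObject:oldPhone];
}
user.phones = mutableSetOfPhones;

NSError *error = nil;
if (![ctx save:&error]) {
    NSLog(@"Save failed: %@", error);
}

If instead you want to preserve edits to existing phones, you would match the incoming phone data to user.phones by some identifier, update the matches, insert the rest, and delete the leftovers.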


How to avoid changing property values in an NSBatchInsertRequest?

I have a simple Core Data entity Story that I occasionally update with the latest data from a network call. This network call sometimes updates many, many Story instances, so I run an NSBatchInsertRequest, shown below. (The other reason I'm using a batch insert is that many stories might need to be added to the persistent store.)
The problem is a user can have already marked a Story as a favorite. When they do that, I set story.isFavorite = true on the main thread and save viewContext.
However, when the batch insert occurs it overwrites story.isFavorite, setting it back to false, even though I'm using NSMergeByPropertyObjectTrumpMergePolicy on both the batch insert and view contexts. I am not touching story.isFavorite in the batch insert handler either so I don't expect that property to be overwritten.
I thought the benefit of a batch insert with this merge policy was to avoid first fetching + then manually updating changed properties + finally saving. What is the right way to avoid changing property values in an NSBatchInsertRequest?
Story
@objc(Story)
public class Story: NSManagedObject {
    @NSManaged public var title: String?
    @NSManaged public var storyURL: URL?
    @NSManaged public var updatedTime: Date?
    @NSManaged public var isFavorite: Bool // <- the problem property
}
Batch insert
container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
container.viewContext.automaticallyMergesChangesFromParent = false

let context = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
context.parent = container.viewContext
context.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy

context.perform {
    var index = 0
    let batchInsert = NSBatchInsertRequest(entity: Story.entity(), managedObjectHandler: { managedObject in
        guard index < downloadedStories.count else { return true } // true = no more data
        let story = managedObject as! Story
        let storyResponse = downloadedStories[index]
        // Update story with latest response data BUT don't modify story.isFavorite.
        story.title = storyResponse.title
        story.storyURL = storyResponse.storyURL
        story.updatedTime = storyResponse.updatedTime
        // ...
        index += 1
        return false // false = hand me the next object
    })
    batchInsert.resultType = .objectIDs // ask for the inserted objectIDs so they can be merged below
    do {
        let result = try context.execute(batchInsert) as? NSBatchInsertResult
        if let insertedIDs = result?.result as? [NSManagedObjectID] {
            // Merge changes into parent context. Skip save() because not needed for batch insert.
            NSManagedObjectContext.mergeChanges(fromRemoteContextSave: [NSInsertedObjectsKey: insertedIDs],
                                                into: [container.viewContext])
        }
    } catch {
        // handle the batch insert error
    }
}
Edit
The Story entity does have a unique value constraint using attribute storyURL.
Update after Michael Tsai's answer
By making the Story entity attribute isFavorite a non-Optional Boolean without a default value (it was marked as Optional before, though I'm not sure that makes a difference here) and keeping the Use Scalar Type box checked, I can confirm that existing objects in the store will not be modified (at all) with this configuration of the batch insert context:
context.persistentStoreCoordinator = container.persistentStoreCoordinator
// HOWEVER, observe that regardless of the merge policy below,
// setting `context.parent = container.viewContext` will also
// overwrite the store data!
context.mergePolicy = NSMergeByPropertyStoreTrumpMergePolicy
// NSMergeByPropertyObjectTrumpMergePolicy ignores objects in the store
// (which have the same unique constraint value, here equal `storyURL`)
// and overwrites all properties.

// To confirm that the batch insert operation does not modify existing
// Story instances (at all), first delete all instances where
// isFavorite == false. Then load all the story data again and execute
// the NSBatchInsertRequest with this change to managedObjectHandler:
story.title = storyResponse.title + " (modified)"
You will see the missing stories get inserted back, this time with their titles carrying the suffix " (modified)", but previously favorited stories do not get modified (basically, with this setup, the batch insert won't re-insert existing objects).
So the isFavorite property does not get overwritten, BUT neither do any properties that should change (because the response carried a new title, for example).
Therefore, if you don't want your existing objects to get updated, but you do want completely new objects to be inserted, you can use this approach.
However, if you are expecting your objects to require updates, here are some alternatives:
- you may opt to run a separate update operation, maybe an NSBatchUpdateRequest, after you run your batch insert in this way;
- or, after the batch insert, you can update certain properties in a simple loop in a (possibly background/child) context without a batch operation, which can be fine if there isn't tons of data;
- lastly, you might be able to first batch insert the new data into a temporary store, then somehow manually merge your choice of properties into the main store, then delete the temporary store.
A simpler approach (sketched below): you could fetch all the properties you want to keep unchanged before you execute the batch insert (storing them in a dictionary keyed by your object's uniqueness-constraint value), and then set those properties again during the batch insert.
For this approach you will want to use a merge policy such as NSMergeByPropertyObjectTrumpMergePolicy, so that the updated object gets re-inserted into the store (make sure to fetch all the properties that you don't want to lose in advance of the batch insert).
random idea: How to Save Data When Using One ManagedObjectContext and PersistentStoreCoordinator with Two Stores
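A rough sketch of that snapshot step, hedged: it reuses downloadedStories and storyURL from the question, assumes storyURL is the unique-constraint attribute, and is an outline rather than drop-in code:

// 1. Before the batch insert: snapshot the property to preserve,
//    keyed by the unique-constraint value (storyURL).
let request = NSFetchRequest<NSDictionary>(entityName: "Story")
request.resultType = .dictionaryResultType
request.propertiesToFetch = ["storyURL", "isFavorite"]

var favorites: [URL: Bool] = [:]
for row in (try? context.fetch(request)) ?? [] {
    if let url = row["storyURL"] as? URL, let fav = row["isFavorite"] as? Bool {
        favorites[url] = fav
    }
}

// 2. Inside the NSBatchInsertRequest's managedObjectHandler, re-apply it:
//    story.isFavorite = favorites[storyResponse.storyURL] ?? false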
I don't think it is actually possible to do a partial update with a batch insert request. It's hard to know for sure because I don't think any of this is documented except in WWDC sessions. When I first watched the 2019 session, I was excited because the presenter said:
Attributes that are optional or configured with default values can be omitted from the dictionary as well.
In the case of updating an object with unique constraint, the existing values will not be changed.
I took this to mean that:
You can omit values for new objects, and you'll get the defaults or NULL. That makes sense.
If there's an existing object and you omit a value, that value will not be changed. So you can purposely omit values to do a partial update, i.e. update other values while leaving your isFavorite alone.
But, after writing code to test this and looking at the output from com.apple.CoreData.SQLDebug, what actually seems to happen with NSMergeByPropertyObjectTrumpMergePolicy is:
If you omit a value that's required you get a validation error.
If you omit a value that's optional, it updates the row to NULL. For a Bool property in Swift, this will become false.
If you omit a value with a default value, it updates the row to the default.
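For concreteness, here is what "omitting" a value looks like with the dictionary-based form of the request (a sketch; the rows mapping reuses the question's hypothetical response objects):

// Build one dictionary per row, deliberately leaving isFavorite out.
let rows: [[String: Any]] = downloadedStories.map { response in
    [
        "title": response.title,
        "storyURL": response.storyURL,
        "updatedTime": response.updatedTime
        // "isFavorite" omitted -- yet, per the findings above, an existing
        // row is still reset to the attribute's default/NULL on conflict.
    ]
}
let batchInsert = NSBatchInsertRequest(entityName: "Story", objects: rows)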
This is a shame because it seems like partial updates could be implemented by having the ON CONFLICT clause only specify DO UPDATE SET for the attributes that you actually set. But (as of macOS 11) Core Data seems to always generate SQL to set all of the columns.
In summary, with batch inserts, NSMergeByPropertyObjectTrumpMergePolicy does not actually merge by property based on what's changed (like with a regular Core Data save). Rather, it either inserts a new row (if the object is absent) or overwrites all the columns but preserves the objectID (if the object was present).
NSMergeByPropertyStoreTrumpMergePolicy also doesn't merge by property. It just means to leave the stored object alone if it's already present.
Update (2021-06-24): I heard from DTS that Apple considers the current (iOS 14/macOS 11) behavior described above a bug, and that it should let you batch insert without changing omitted properties. The Radar number is 79747419.

Maximo automation script to change the status of a work order

I have created a non-persistent attribute in my WoActivity table named VDS_COMPLETE. It is a boolean that gets changed by a checkbox in one of my applications.
I am trying to write an automation script in Python to change the status of every task of a work order that has been checked when I save the work order.
I don't know why it isn't working, but I'm pretty sure I'm close to the answer...
Do you have an idea why it isn't working? I know that I have code in comments; I have done a few experimentations...
from psdi.mbo import MboConstants
from psdi.server import MXServer

mxServer = MXServer.getMXServer()
userInfo = mxServer.getUserInfo(user)
mboSet = mxServer.getMboSet("WORKORDER")
#where1 = "wonum = :wonum"
#mboSet.setWhere(where1)
#mboSet.reset()
workorderSet = mboSet.getMbo(0).getMboSet("WOACTIVITY", "STATUS NOT IN ('FERME', 'ANNULE', 'COMPLETE', 'ATTDOC')")
#where2 = "STATUS NOT IN ('FERME', 'ANNULE', 'COMPLETE', 'ATTDOC')"
#workorderSet.setWhere(where2)
if workorderSet.count() > 0:
    for x in range(0, workorderSet.count()):
        if workorderSet.getString("VDS_COMPLETE") == 1:
            workorder = workorderSet.getMbo(x)
            workorder.changeStatus("COMPLETE", MXServer.getMXServer().getDate(), u"Script d'automatisation", MboConstants.NOACCESSCHECK)
    workorderSet.save()
workorderSet.close()
It looks like your two biggest mistakes here are: 1. trying to get your boolean field (VDS_COMPLETE) off the set (meaning off of the collection of records, like the whole table) instead of off of the Mbo (meaning an actual record, one entry in the table); and 2. getting your set of data fresh from the database (via that MXServer call), which means using the previously saved data instead of getting your data set from the screen, where the pending changes have actually been made (and remember that non-persistent fields do not get saved to the database).
There are some other problems with this script too, like the use of count() in your for loop (or even calling it more than once at all), which is an expensive operation, and the way you are currently not filtering the work order set before grabbing the first work order (though this may be a result of your debugging), meaning you get an effectively random work order from the table, and then doing a dynamic relationship off of that record (instead of using a normal relationship, or skipping the relationship altogether and using just a "where" clause), even though that relationship likely already exists.
Here is a Stack Overflow answer describing relationships and "where" clauses in Maximo in more detail: Describe relationship in maximo 7.5
This question also has some more information about getting data from the screen versus fresh from the database: Adding a new row to another table using java in Maximo
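Putting those points together, a corrected script might look roughly like the sketch below. Hedged assumptions: the script runs at the WORKORDER save point so the implicit mbo variable is the work order being saved (with its pending, unsaved changes, including the non-persistent VDS_COMPLETE flags), and a WOACTIVITY relationship already exists on WORKORDER:

from psdi.mbo import MboConstants
from psdi.server import MXServer

# Use the work order from the event (mbo), not a fresh set from the
# database, so unsaved screen changes -- including non-persistent
# fields -- are still visible.
taskSet = mbo.getMboSet("WOACTIVITY")
taskSet.setWhere("STATUS NOT IN ('FERME', 'ANNULE', 'COMPLETE', 'ATTDOC')")

task = taskSet.moveFirst()
while task is not None:
    # Read the flag off the record (the Mbo), not off the set.
    if task.getBoolean("VDS_COMPLETE"):
        task.changeStatus("COMPLETE", MXServer.getMXServer().getDate(),
                          u"Script d'automatisation", MboConstants.NOACCESSCHECK)
    task = taskSet.moveNext()
# No explicit save here: the tasks belong to the work order being saved.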

Insert/Update SQL table from ObservableList

Ok, so I'm working with an ObservableList, which is working fine, but now I need to use the observable list to insert rows into, and update rows in, an SQL database table. I've found little info on working between JavaFX and SQL databases... all the examples of data tables have the data created in the Java code. I had hope when I saw "update SQL database" in this post:
Update sql database from FoxPro data on Glassfish server
but it was not applicable to my situation.
So the question is: how do I start the code to read from the ObservableList so I can run my SQL INSERT statement? If you could point me to example code where an ObservableList is used and an SQL table is created/added to/updated, I would greatly appreciate it.
Thanks!
UPDATE TO QUESTION:
I can't really post relevant code here because the relevant parts are what I don't have. However, I'm thinking what I need to do is something like this:
mylist.moveToFirst();
while (mylist.next()) {
make connection // think I got it
INSERT INTO mytable (name, address, phone) VALUES (observablename, observableaddress, observablephone // think I got this as well
Obviously I'm applying my knowledge of other areas to ObservableList, but I am doing it to demonstrate what I don't know how to do with my ObservableList (mylist).
Again, thanks for any help.
Tying up loose ends today, and this question has not really been answered. I reposted a newer question with more specifics once I learned more about the situation, and that question also went unanswered, but I did figure it out, and posted an answer here: Understanding my ObservableList.
However, to be neat and tidy, let me post here some code to help me remember, as well as help anyone else who looks at this question and says, "YES, BUT WHAT IS THE SOLUTION?!?!?"
Generically, it looks something like this:
I like to open my connection and prepare my statement(s) first.
Use the iterator to get the variables from the list.
Within the iterator, add the variables to the prepared statement and execute.
I read somewhere about batch execution of statements, but with as few updates as I'm doing with each list, that seemed too complicated, so I just do each update individually within the iterator (a batch variant is sketched after the code below).
Specifically, here is some code:
Connection con;
con = [your connection string]; // I actually keep my connection string in its own class
// and just call it (OpenDB.connect()). This way I can swap out the class OpenDB
// for whatever database I'm using (MySQL, MS Access, etc.) and I don't have to
// change a bunch of connection strings in other classes.

PreparedStatement pst;
String insertString = "INSERT INTO People (Name, Address, Phone) VALUES (?, ?, ?)";
pst = con.prepareStatement(insertString);

for (Person p : mylist) { // read as: for each Person [a data model defined in a class
    // named Person] which in this set of statements we shall call 'p' within the list
    // previously defined and named 'mylist' ... or "For each Person 'p' in 'mylist'"
    String name = p.name.get();       // get the name which corresponds to the Person in this element of 'mylist'
    String address = p.address.get(); // ditto, address
    Integer phone = p.phone.get();    // ditto, phone. Done as Integer here to show how to add to pst below
    pst.setString(1, name);    // replace question mark 1 with the value of 'name'
    pst.setString(2, address); // ditto, 2 and 'address'
    pst.setInt(3, phone);      // ditto, 3 and 'phone'
    pst.executeUpdate();
}
And that's how I did it. Not sure if it's the 'proper' way to do it, but it works. Any input is welcomed, as I'm still learning.
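For reference, the batch execution mentioned above would only be a small change to the same loop; a sketch under the same assumptions (the Person model and 'mylist' from the code above):

for (Person p : mylist) {
    pst.setString(1, p.name.get());
    pst.setString(2, p.address.get());
    pst.setInt(3, p.phone.get());
    pst.addBatch(); // queue this row instead of executing it immediately
}
pst.executeBatch(); // send all queued rows to the database in one go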
In JavaFX you usually get to be the person who creates the example :)
ObservableList supports listeners; these receive events which tell you what has been added or updated. There is a good example in the javadocs here.
To get update events you need to provide an 'extractor' to the method creating the list here. This should take an instance of the object in the list and provide an array of the properties you want to listen to.
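A minimal sketch of those two pieces together (assuming a Person bean that exposes JavaFX properties; the property names are hypothetical):

import javafx.beans.Observable;
import javafx.collections.FXCollections;
import javafx.collections.ListChangeListener;
import javafx.collections.ObservableList;

// The extractor tells the list which properties should fire update events.
ObservableList<Person> mylist = FXCollections.observableArrayList(p ->
        new Observable[] { p.nameProperty(), p.addressProperty(), p.phoneProperty() });

mylist.addListener((ListChangeListener<Person>) change -> {
    while (change.next()) {
        if (change.wasAdded()) {
            // run the SQL INSERT for change.getAddedSubList()
        } else if (change.wasUpdated()) {
            // run the SQL UPDATE for items from change.getFrom() to change.getTo()
        }
    }
});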
Try this:
SQLEXEC(lnConn, "Update INVENTORY SET brand = ?_brand, model = ?_model, code =?_code, timestamp =?_datetime where recno=?_id ")

cakePHP and authorization for CRUD operations

I have a cakephp 1.3 application and I have run into a 'data leak' security hole. I am looking for the best solution using cake, not just something that will work. The application is a grade-tracking system that lets teachers enter grades and students retrieve their grades. Everything is working as expected, but when I started to audit security I found that the basic CRUD operations have leaks, meaning that student X can see student Y's grades. Students should only see their own grades. I will limit this question to the read operation.
Using cake, I have a grade_controller.php file with this view function:
function view($id = null) {
    // Extra, unrelated code removed
    $this->set('grade', $this->grade->read(null, $id));
}
And
http://localhost/grade/view/5
shows the grade for student $id=5. That's great. But if student #5 manipulates the URL and changes the 5 to a 6, person #6's grades are shown: the classic data-leak security hole.
I had two thoughts on the best way to resolve this: 1) I can add checks to every CRUD operation called in the controller, or 2) I can add code to the model (for example, using beforeFind()) to check whether person X has access to that data element.
Option #1 seems time consuming and error prone.
Option #2 seems like the best way to go, but it requires calling find() before some operations. The read() example above never executes beforeFind(), and there is no beforeRead() callback.
Suggestions?
Instead of having a generic read() in your controller, you should move ALL finds, queries, etc. into the respective model.
Then, go through each model and add any type of security check you need on any finds that should be restricted. 1) It will be much more DRY coding, and 2) you'll be better able to manage security risks like this, since you know where all your queries are held.
For your example, I would create a getGrade($id) method in my Grade model and check the student_id field (or whatever) against your Auth user id, CakeSession::read("Auth.User.id").
You could also build some generic method(s) similar to is_owner() to re-use the same logic throughout multiple methods.
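A rough sketch of that getGrade() idea (hedged: the student_id column and the exact Auth setup are assumptions based on the example above):

// app/models/grade.php (CakePHP 1.3)
class Grade extends AppModel {

    function getGrade($id) {
        $grade = $this->find('first', array(
            'conditions' => array('Grade.id' => $id)
        ));
        // Only hand the record back if it belongs to the logged-in student.
        if ($grade && $grade['Grade']['student_id'] == CakeSession::read('Auth.User.id')) {
            return $grade;
        }
        return false;
    }
}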
If CakePHP supports isAuthorized, here's something you could do:
Create a column that holds the type of each user (e.g. 'student', 'teacher', ...).
Now, if the type of the user is 'student', you can limit their access to view only their own data. An example of isAuthorized is as follows; I am allowing the student to edit only their own profile information. You can extend the concept.
if (($role['User']['role'] & $this->user_type['student']) == $this->user_type['student']) {
    if (in_array($this->action, array('view')) == true) {
        $id = $this->params->pass[0];
        if ($id == $user_id) {
            return (true);
        }
    }
}

Insert/update Doctrine object from Excel

On the project I am currently working on, I have to read an Excel file (with over 1000 rows), extract them all, and insert/update them into a database table.
In terms of performance, is it better to add all the records to a Doctrine_Collection and insert/update them afterwards using the fromArray() method? One other possible approach is to create a new object for each row (an Excel row becomes an object) and then save it, but I think that's worse in terms of performance.
Every time the Excel file is uploaded, it is necessary to compare its rows to the existing objects in the database. If a row does not exist as an object, it should be inserted; otherwise, updated. My first approach was to turn both the objects and the rows into arrays (or Doctrine_Collections) and then compare both arrays before performing the needed operations.
Can anyone suggest any other possible approach?
We did a bit of this in a project recently, with CSV data. It was fairly painless. There's a symfony plugin, tmCsvPlugin, but we extended this quite a bit, so the version in the plugin repo is pretty out of date. Must add that to the #TODO list :)
Question 1:
I don't explicitly know about performance, but I would guess that adding the records to a Doctrine_Collection and then calling Doctrine_Collection::save() would be the neatest approach. I'm sure it would be handy if an exception were thrown somewhere and you had to roll back your last save.
Question 2:
If you can use a row field as a unique identifier (let's assume a username), then you can search for an existing record. If you find one, and assuming that your imported row is an array, use Doctrine_Record::synchronizeWithArray() to update the record; then add it to a Doctrine_Collection. When complete, just call Doctrine_Collection::save().
A fairly rough 'n' ready implementation:
// set up a new collection
$collection = new Doctrine_Collection('User');

// assuming $row is an associative
// array representing one imported row.
foreach ($importedRows as $row) {

    // try to find an existing record
    // based on a unique identifier.
    $user = Doctrine_Core::getTable('User')
        ->findOneByUsername($row['username']);

    // create a new user record if
    // no existing record is found.
    if (!$user instanceof User) {
        $user = new User();
    }

    // sync record with current data.
    $user->synchronizeWithArray($row);

    // add to collection.
    $collection->add($user);
}

// done. save collection.
$collection->save();
Pretty rough but something like this worked well for me. This is assuming that you can use your imported row data in some way to serve as a unique identifier.
NOTE: be wary of synchronizeWithArray() if you're using sf1.2/doctrine 1.0; if I remember correctly, it was not implemented correctly back then. It works fine in doctrine 1.2 though.
I have never worked with Doctrine_Collections, but I can answer in terms of database queries and code logic in a broader sense. I would apply the following logic:
1. Fetch all the rows of the corresponding database table in a single query and store them in an array; call it $storedSheet.
2. Create a single array of all the rows of the uploaded Excel sheet; call it $uploadedSheet. The structures of $uploadedSheet and $storedSheet should be similar (both two-dimensional, so rows and cells can be identified and compared).
3. Run foreach loops on $uploadedSheet as follows, and only identify which rows need to be inserted and which need to be updated (do the actual queries later):
$rowsToBeUpdated  = array();
$rowsToBeInserted = array();

foreach ($uploadedSheet as $row => $eachRow) {
    if (is_array($storedSheet[$row])) {
        foreach ($eachRow as $column => $value) {
            if ($value != $storedSheet[$row][$column]) {
                // This is a representation of comparison
                $rowsToBeUpdated[$row] = true;
                break; // No need to check this row anymore - one difference detected.
            }
        }
    } else {
        $rowsToBeInserted[$row] = true;
    }
}
4. This way you have two arrays. Now perform two database queries:
- bulk insert all those rows of $uploadedSheet whose numbers are stored in the $rowsToBeInserted array;
- bulk update all those rows of $uploadedSheet whose numbers are stored in the $rowsToBeUpdated array.
These bulk queries are the key to faster performance (see the sketch below).
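As a rough illustration of the bulk-insert half (hedged: plain PDO with hypothetical table and column names, since the answer is framed in general database terms):

// Build one multi-row INSERT with placeholders, then execute it once.
$placeholders = array();
$values = array();
foreach (array_keys($rowsToBeInserted) as $rowIndex) {
    $placeholders[] = '(?, ?, ?)';
    $values[] = $uploadedSheet[$rowIndex]['username'];
    $values[] = $uploadedSheet[$rowIndex]['email'];
    $values[] = $uploadedSheet[$rowIndex]['score'];
}
if (!empty($placeholders)) {
    $stmt = $pdo->prepare(
        'INSERT INTO users (username, email, score) VALUES ' . implode(', ', $placeholders)
    );
    $stmt->execute($values);
}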
Let me know if this helped, or you wanted to know something else.
