How can I roll back the 1st table's updated data if the 2nd table's update is not successful, using Hibernate EntityManager?

I want to update two database tables using Hibernate's EntityManager. Currently I update the 2nd table after verifying that the data has been updated in the 1st table.
My question is: how do I roll back the 1st table if the data is not updated in the 2nd table?
This is how I am updating an individual table:
try {
    wapi = getWapiUserUserAuthFlagValues(subject, UserId);
    wapi.setFlags((int) flags);
    entityManager.getTransaction().begin();
    entityManager.merge(wapi);
    entityManager.flush();
    entityManager.getTransaction().commit();
} catch (NoResultException nre) {
    wapi = new Wapi();
    wapi.setSubject(merchant);
    wapi.setUserId(UserId);
    wapi.setFlags((int) flags);
    entityManager.getTransaction().rollback();
}
Note: I am calling separate methods to update each table's data.
Thanks

I found the solution. Basically I was calling two methods to update the 2 DB tables, and both of those methods are called from one method.
E.g. I call methods p and q from method r.
Initially I was calling the begin, merge, flush and commit EntityManager methods in both p and q.
Now I call begin and commit in r, and merge and flush in p and q.
So now my tables are updated together, and rollback is simple too, as shown in the sketch below.
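In outline, the restructuring looks something like this (a minimal sketch, not the original code; the Object parameters stand in for whatever entities p and q actually merge):

import javax.persistence.EntityManager;

public class TwoTableUpdater {

    // r: owns the transaction boundary for both updates
    public void updateBothTables(EntityManager em, Object first, Object second) {
        em.getTransaction().begin();
        try {
            mergeAndFlush(em, first);     // p: merge + flush only, no begin/commit
            mergeAndFlush(em, second);    // q: likewise
            em.getTransaction().commit(); // both updates become visible together
        } catch (RuntimeException e) {
            em.getTransaction().rollback(); // a failure in either update undoes both
            throw e;
        }
    }

    // p and q reduce to this: they work on the persistence context
    // but leave transaction control to the caller (r).
    private void mergeAndFlush(EntityManager em, Object entity) {
        em.merge(entity);
        em.flush();
    }
}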
Hope it helps someone. I wasted my time on this, so maybe it can save someone else's.
Thanks

Related

Validate @VersionColumn value before saving an entity with TypeORM

I'm currently working on saving data in a Postgres DB using TypeORM with the NestJS integration. I'm saving data which keeps track of a version property using TypeORM's @VersionColumn feature, which increments a number each time save() is called on a repository.
For my feature it is important to check this version number before updating the records.
Important
I know I could technically achieve this by retrieving the record before updating it and checking the versions, but this leaves a small window for errors. If a 2nd user updates the same record in the millisecond between the get and the save (or longer, for some weird reason), it would bump the version and make the data in the first call invalid. TypeORM doesn't check the version value, so even if a call carries a lower version than what is in the database, it still saves the data, even though it should be treated as out of date.
1: User A checks latest version => TypeORM gives back the latest version: 1
2: User B updates record => TypeORM ups the version: 2
3: User A saves their data with version 1 <-- This needs to validate the versions first.
4: TypeORM overwrites User B's record with User A's data
What I'm looking for is a way to make TypeORM decline step 3 as the latest version in the database is 2 and User A tries to save with version 1.
I've tried using the query builder and update statements to make this work, but the built-in @VersionColumn only bumps the version on save() calls from a repository or entity manager.
Besides this, I also got a tip to look into database triggers, but as far as I could find, this feature is not yet supported by TypeORM.
Here is an example of the setup:
async update(entity: Foo): Promise<boolean> {
  const value = await this._configurationRepository.save(entity);
  if (value === entity) {
    return true;
  }
  return false;
}
In my opinion, something like this is much better served through triggers directly in the database, as that addresses the race-condition concerns and also ensures that modifications made outside the ORM update the version number. Here is a SQL Fiddle demonstrating triggers in action. You'll just need to incorporate it into your schema migrations.
Here is the relevant DDL from the SQL Fiddle example:
CREATE TABLE entity_1
(
    id         serial PRIMARY KEY,
    some_value text,
    version    int NOT NULL DEFAULT 1
);

CREATE OR REPLACE FUNCTION increment_version() RETURNS TRIGGER AS
$BODY$
BEGIN
    NEW.version = NEW.version + 1;
    RETURN NEW;
END;
$BODY$
LANGUAGE plpgsql VOLATILE;

CREATE TRIGGER increment_entity_1_version
    BEFORE UPDATE ON entity_1
    FOR EACH ROW
    EXECUTE PROCEDURE increment_version();
The same trigger function can be used for any table that has a version column in case this is a pattern you want to use across multiple tables.
I think you are looking for concurrency control. If that is the case, there is a solution about halfway down this thread: TypeORM concurrency control issue.
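If you also want the check enforced at the application level, one common pattern (not built into @VersionColumn itself) is to put the expected version into the UPDATE's WHERE clause and treat zero affected rows as a conflict. A rough sketch, assuming a Foo entity with id, someValue and version columns (the names are illustrative):

import { DataSource } from 'typeorm';
import { Foo } from './foo.entity'; // assumed entity: id, someValue, version

// Returns true only if the row still had the version the caller read earlier.
async function updateWithVersionCheck(dataSource: DataSource, entity: Foo): Promise<boolean> {
  const result = await dataSource
    .createQueryBuilder()
    .update(Foo)
    .set({ someValue: entity.someValue, version: () => 'version + 1' })
    .where('id = :id AND version = :version', { id: entity.id, version: entity.version })
    .execute();

  // affected === 0 means another writer bumped the version in the meantime,
  // so the stale write is rejected instead of silently overwriting their data.
  return (result.affected ?? 0) === 1;
}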

Sequelize/Postgres - how to update each row individually on migrate?

I have lots of records in my Postgres database (using Sequelize to communicate with it).
I want to have a migration script, but due to locking I have to make each change as atomic as possible.
So I don't want to select all the records, then modify them, then save them all back.
In Mongo I have a forEach cursor which allows me to update a record, save it, and only then move on to the next one.
Is there anything similar in Sequelize/Postgres?
Currently I am doing this in my code: getting the IDs, then performing a query for each one.
return migration.runOnAllUpdates((record) => {
  record.change = 'new value';
  return record.save();
});
where runOnAllUpdates will simply give me records one by one.
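For reference, the ID-batching approach described above can be written in plain Sequelize roughly like this (a sketch; the Record model, connection string and column names are placeholders):

import { Sequelize, DataTypes } from 'sequelize';

// Placeholder model; swap in your real table definition.
const sequelize = new Sequelize('postgres://user:pass@localhost:5432/mydb');
const Record = sequelize.define('Record', { change: DataTypes.STRING });

async function migrateRowByRow(): Promise<void> {
  // Fetch only the primary keys so the initial query stays light.
  const rows = (await Record.findAll({ attributes: ['id'], raw: true })) as unknown as { id: number }[];

  for (const { id } of rows) {
    // One short UPDATE per row keeps each lock brief, instead of
    // one long statement locking the whole table.
    await Record.update({ change: 'new value' }, { where: { id } });
  }
}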

UserInfoProvider.DeleteUser() vs DeleteData(whereCondition)

I know that DeleteUser() will run procedures to delete all relationships etc. Will the private internal DeleteData() with a where condition also delete all relationships, or will it just try deleting the main record from the table? If any relational data exists, will it throw an error?
If you call UserInfoProvider.DeleteData(), it won't delete the related data. It just executes the object's deletion SQL query; it won't even look for the cms.user.removedependencies query.
On the other hand, calling DeleteData() on an info object will cause the related data to be deleted.
If you need to bulk delete users, first retrieve them from the DB using an object query (make sure you restrict the columns; UserID should be enough), then iterate through the collection calling Delete() on each of them:
foreach (var user in UserInfoProvider.GetUsers().Where("UserEnabled=0").Columns("UserID").TypedResult.Items)
{
    user.Delete();
}

Insert/Update SQL table from observablelist

OK, so I'm working with an ObservableList, which is working fine, but now I need to use the observable list to insert rows into, and update rows in, an SQL database table. I've found little info on working between JavaFX and SQL databases... all the examples of data tables have the data created in the Java code. I got my hopes up when I saw "update SQL database" in this post:
Update sql database from FoxPro data on Glassfish server
but it was not applicable to my situation.
So the question is: how do I start the code that reads from the ObservableList so I can run my SQL INSERT statement? If you could point me to example code where an ObservableList is used and an SQL table is created/added to/updated, I would greatly appreciate it.
Thanks!
UPDATE TO QUESTION:
I can't really post relevant code here, because the relevant parts are what I don't have. However, I'm thinking I need to do something like this:
mylist.moveToFirst();
while (mylist.next()) {
    make connection // think I got it
    INSERT INTO mytable (name, address, phone) VALUES (observablename, observableaddress, observablephone) // think I got this as well
}
Obviously I'm applying my knowledge of other areas to ObservableList, but I am doing it to demonstrate what I don't know how to do with my ObservableList (mylist).
Again, thanks for any help.
Tying up loose ends today, and this question has not really been answered. I reposted a newer question with more specifics once I learned more about the situation, and that question also went unanswered, but I did figure it out, and posted an answer here: Understanding my ObservableList.
However, to be neat and tidy, let me post here some code to help me remember, as well as help anyone else who looks at this question and says, "YES, BUT WHAT IS THE SOLUTION?!?!?"
Generically, it looks something like this:
1. I like to open my connection and prepare my statement(s) first.
2. Use the iterator to get the variables from the list.
3. Within the iterator, add the variables to the prepared statement and execute it.
I read somewhere about batch execution of statements, but with as few updates as I'm doing with each list, that seemed too complicated, so I just do each update individually within the iterator.
Specifically, here is some code:
Connection con = [your connection string];
// I actually keep my connection string in its own class and just call it
// (OpenDB.connect()). This way I can swap out the OpenDB class for whatever
// database I'm using (MySQL, MS Access, etc.) and I don't have to change a
// bunch of connection strings in other classes.

String insertString = "INSERT INTO People (Name, Address, Phone) VALUES (?, ?, ?)";
PreparedStatement pst = con.prepareStatement(insertString);

// Read as: "for each Person 'p' in 'mylist'", where Person is the data model
// class and 'mylist' is the ObservableList<Person> defined previously.
for (Person p : mylist) {
    String name = p.name.get();       // the name for this Person in 'mylist'
    String address = p.address.get(); // ditto, address
    Integer phone = p.phone.get();    // ditto, phone; an Integer here to show setInt() below
    pst.setString(1, name);           // replace question mark 1 with the value of 'name'
    pst.setString(2, address);        // ditto, 2 and 'address'
    pst.setInt(3, phone);             // ditto, 3 and 'phone'
    pst.executeUpdate();
}
And that's how I did it. Not sure if it's the 'proper' way to do it, but it works. Any input is welcomed, as I'm still learning.
In JavaFX you usually get to be the person to create the example :)
ObservableList supports listeners; these receive events which tell you what has been added or updated. There is a good example in the javadocs here.
To get update events you need to provide an 'extractor' to the method creating the list, here. This should take an instance of the object in the list and provide an array of the properties you want to listen to, as in the sketch below.
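For example, an extractor-backed list that fires update events when an element's properties change might look like this (a sketch, assuming a Person model that exposes JavaFX properties named name, address and phone, as in the answer above):

import javafx.beans.Observable;
import javafx.collections.FXCollections;
import javafx.collections.ListChangeListener;
import javafx.collections.ObservableList;

// The extractor tells the list which properties to watch on each element.
ObservableList<Person> mylist = FXCollections.observableArrayList(
        (Person p) -> new Observable[] { p.name, p.address, p.phone });

mylist.addListener((ListChangeListener<Person>) change -> {
    while (change.next()) {
        if (change.wasUpdated()) {
            // A watched property changed: run the SQL UPDATE here.
        } else if (change.wasAdded()) {
            // New elements: run the SQL INSERT for change.getAddedSubList().
        }
    }
});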
Try this:
SQLEXEC(lnConn, "Update INVENTORY SET brand = ?_brand, model = ?_model, code =?_code, timestamp =?_datetime where recno=?_id ")

Insert/update Doctrine object from Excel

On the project I am currently working on, I have to read an Excel file (with over 1000 rows), extract all of the rows, and insert/update them into a database table.
In terms of performance, is it better to add all the records to a Doctrine_Collection and insert/update them afterwards using the fromArray() method? One other possible approach is to create a new object for each row (an Excel row becomes an object) and then save it, but I think that is worse in terms of performance.
Every time the Excel file is uploaded, it is necessary to compare its rows to the existing objects in the database. If a row does not exist as an object, it should be inserted; otherwise it should be updated. My first approach was to turn both the objects and the rows into arrays (or Doctrine_Collections), then compare the two before performing the needed operations.
Can anyone suggest any other possible approach?
We did a bit of this in a project recently, with CSV data. It was fairly painless. There's a symfony plugin, tmCsvPlugin, but we've extended it quite a bit since, so the version in the plugin repo is pretty out of date. Must add that to the #TODO list :)
Question 1:
I don't explicitly know about performance, but I would guess that adding the records to a Doctrine_Collection and then calling Doctrine_Collection::save() would be the neatest approach. I'm sure it would be handy if an exception were thrown somewhere and you had to roll back your last save.
Question 2:
If you can use a row field as a unique identifier (let's assume a username), then you can search for an existing record. If you find one, and assuming your imported row is an array, use Doctrine_Record::synchronizeWithArray() to update the record, then add it to a Doctrine_Collection. When complete, just call Doctrine_Collection::save().
A fairly rough 'n' ready implementation:
// set up a new collection
$collection = new Doctrine_Collection('User');

// assuming $row is an associative array
// representing one imported row.
foreach ($importedRows as $row) {
    // try to find an existing record
    // based on a unique identifier.
    $user = Doctrine_Core::getTable('User')
        ->findOneByUsername($row['username']);

    // create a new user record if
    // no existing record is found.
    if (!$user instanceof User) {
        $user = new User();
    }

    // sync record with current data.
    $user->synchronizeWithArray($row);

    // add to collection.
    $collection->add($user);
}

// done. save collection.
$collection->save();
Pretty rough, but something like this worked well for me. This assumes that you can use your imported row data in some way to serve as a unique identifier.
NOTE: be wary of synchronizeWithArray() if you're using sf1.2/Doctrine 1.0; if I remember correctly, it was not implemented correctly. It works fine in Doctrine 1.2, though.
I have never worked with Doctrine_Collections, but I can answer in terms of database queries and code logic in a broader sense. I would apply the following logic:
1. Fetch all the stored rows for the sheet from the database in a single query and store them in an array; call it $storedSheet.
2. Create a single array of all the rows of the uploaded Excel sheet; call it $uploadedSheet. I expect the structures of $uploadedSheet and $storedSheet will be similar (both two-dimensional: rows and cells can be identified and compared).
3. Run foreach loops over $uploadedSheet as follows, only identifying which rows need to be inserted and which updated (do the actual queries later):
$rowsToBeUpdated = array();
$rowsToBeInserted = array();

foreach ($uploadedSheet as $row => $eachRow) {
    if (is_array($storedSheet[$row])) {
        foreach ($eachRow as $column => $value) {
            // a representation of the comparison
            if ($value != $storedSheet[$row][$column]) {
                $rowsToBeUpdated[$row] = true;
                break; // no need to check this row any further - one difference detected.
            }
        }
    } else {
        $rowsToBeInserted[$row] = true;
    }
}
4. This way you have two arrays. Now perform 2 database queries:
- bulk insert all the rows of $uploadedSheet whose numbers are stored in the $rowsToBeInserted array;
- bulk update all the rows of $uploadedSheet whose numbers are stored in the $rowsToBeUpdated array.
These bulk queries are the key to faster performance; see the sketch below.
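As a rough illustration of the bulk insert in step 4 (a sketch using plain PDO rather than Doctrine; it assumes a users table, an existing $pdo connection, and that each row of $uploadedSheet is an array like ['username' => ..., 'email' => ...]):

$insertRows = array_intersect_key($uploadedSheet, $rowsToBeInserted);
if (!empty($insertRows)) {
    $placeholders = array();
    $params = array();
    foreach ($insertRows as $row) {
        $placeholders[] = '(?, ?)';
        $params[] = $row['username'];
        $params[] = $row['email'];
    }
    // One multi-row INSERT instead of one INSERT per row.
    $stmt = $pdo->prepare(
        'INSERT INTO users (username, email) VALUES ' . implode(', ', $placeholders)
    );
    $stmt->execute($params);
}

The updates can be batched similarly, e.g. one prepared UPDATE executed per changed row inside a single transaction.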
Let me know if this helped, or you wanted to know something else.
