Restore Fixed Assets records with custom fields and related table records (Acumatica)

In Fixed Assets (FA303000), I have a customization that contains two custom fields and one custom table, which is a child table referenced by the FixedAsset table's AssetID column.
Now, for some reason, we have to delete half of our fixed assets and then put them back into Acumatica. We are not creating and restoring a full snapshot; we are processing half of the fixed asset records by removing them and putting them back.
My initial thought was that I just needed to copy those records into a temporary table (select * into duplicate_FixedAsset from FixedAsset), delete them from FixedAsset, and later insert them back into FixedAsset.
The custom field values on Fixed Assets would come back with the rows, and for my custom child table that is linked via AssetID, I could use AssetCD to look up the AssetID and relink it, and everything should be fine.
But I was wrong: after inserting the records back, they did not appear on the Fixed Assets page.
Upon inspecting the RowDeleting event, I found the code snippet below.
protected virtual void FixedAsset_RowDeleting(PXCache sender, PXRowDeletingEventArgs e)
{
    FixedAsset asset = (FixedAsset)e.Row;
    if (asset == null) return;

    if (null != (FATran)PXSelect<FATran, Where<FATran.assetID, Equal<Current<FixedAsset.assetID>>, And<FATran.batchNbr, IsNotNull>>>.SelectSingleBound(this, new object[] { asset }))
    {
        throw new PXSetPropertyException(Messages.BalanceRecordCannotBeDeleted);
    }

    this.EnsureCachePersistence(typeof(FARegister));
    this.EnsureCachePersistence(typeof(FABookHistory));

    foreach (FARegister reg in PXSelectJoinGroupBy<FARegister,
        LeftJoin<FATran, On<FARegister.refNbr, Equal<FATran.refNbr>>>,
        Where<FATran.assetID, Equal<Required<FixedAsset.assetID>>>,
        Aggregate<GroupBy<FARegister.refNbr>>>.Select(this, asset.AssetID))
    {
        this.Caches<FARegister>().Delete(reg);
    }

    foreach (FABookHistory hist in PXSelect<FABookHistory, Where<FABookHistory.assetID, Equal<Required<FixedAsset.assetID>>>>.Select(this, asset.AssetID))
    {
        this.Caches<FABookHistory>().Delete(hist);
    }
}
So the process is not as simple as I thought at the beginning: FARegister and FABookHistory are linked to the asset as well.
I would really appreciate knowing the proper steps to follow so that I not only get the fixed assets back without breaking any relations to other tables, but also properly relink them to my custom fields and custom table.
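For what it's worth, here is a minimal, unverified sketch of the direction I am considering instead of raw SQL: driving the delete through the FA303000 graph (AssetMaint in PX.Objects.FA), so that the FixedAsset_RowDeleting handler above runs and the related FARegister/FABookHistory rows are removed together with the asset (or the delete is blocked when released transactions exist). The helper class and method names are made up for illustration only.

using PX.Data;
using PX.Objects.FA;

public static class AssetRestoreSketch
{
    // Delete one asset through the Fixed Assets graph so that
    // FixedAsset_RowDeleting fires and the related FARegister /
    // FABookHistory records are cleaned up along with the asset.
    public static void DeleteAsset(string assetCD)
    {
        var graph = PXGraph.CreateInstance<AssetMaint>();

        // Look the asset up by its user-visible key (AssetCD).
        FixedAsset asset = PXSelect<FixedAsset,
            Where<FixedAsset.assetCD, Equal<Required<FixedAsset.assetCD>>>>
            .Select(graph, assetCD);
        if (asset == null) return;

        // Deleting through the cache (rather than SQL) runs the graph's
        // RowDeleting logic; PressSave persists the cascading deletes.
        graph.Caches<FixedAsset>().Delete(asset);
        graph.Actions.PressSave();
    }
}

Re-entry could go through the same graph (or an import scenario), after which each re-created asset's new AssetID could be looked up by AssetCD, as described above, and written back to my custom child table together with the backed-up custom field values.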

Related

How to avoid changing property values in an NSBatchInsertRequest?

I have a simple Core Data entity Story that occasionally I update with the latest data from a network call. This network call sometimes updates many, many Story instances, so I run an NSBatchInsertRequest, shown below. (The other reason I'm using a batch insert is that many stories might need to be added to the persistent store.)
The problem is a user can have already marked a Story as a favorite. When they do that, I set story.isFavorite = true on the main thread and save viewContext.
However, when the batch insert occurs it overwrites story.isFavorite, setting it back to false, even though I'm using NSMergeByPropertyObjectTrumpMergePolicy on both the batch insert and view contexts. I am not touching story.isFavorite in the batch insert handler either so I don't expect that property to be overwritten.
I thought the benefit of a batch insert with this merge policy was to avoid first fetching + then manually updating changed properties + finally saving. What is the right way to avoid changing property values in an NSBatchInsertRequest?
Story
@objc(Story)
public class Story: NSManagedObject {
    @NSManaged public var title: String?
    @NSManaged public var storyURL: URL?
    @NSManaged public var updatedTime: Date?
    @NSManaged public var isFavorite: Bool // <- the problem property
}
Batch insert
container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
container.viewContext.automaticallyMergesChangesFromParent = false

let context = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
context.parent = container.viewContext
context.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy

context.perform {
    var index = 0
    let batchInsert = NSBatchInsertRequest(entity: Story.entity(), managedObjectHandler: { managedObject in
        guard index < downloadedStories.count else { return true } // true = no more objects to provide
        let story = managedObject as! Story
        let storyResponse = downloadedStories[index]
        // Update story with latest response data BUT don't modify story.isFavorite.
        story.title = storyResponse.title
        story.storyURL = storyResponse.storyURL
        story.updatedTime = storyResponse.updatedTime
        // ...
        index += 1
        return false // false = keep providing objects
    })
    batchInsert.resultType = .objectIDs // so the result contains the inserted object IDs
    do {
        let result = try context.execute(batchInsert) as? NSBatchInsertResult
        if let insertedIDs = result?.result as? [NSManagedObjectID] {
            // Merge changes into parent context. Skip save() because not needed for batch insert.
            NSManagedObjectContext.mergeChanges(fromRemoteContextSave: [NSInsertedObjectsKey: insertedIDs],
                                                into: [container.viewContext])
        }
    } catch {
        print("Batch insert failed: \(error)")
    }
}
Edit
The Story entity does have a unique value constraint using attribute storyURL.
Update after Michael Tsai's answer
By making the Story entity attribute isFavorite a non-Optional Boolean without a default value (it was marked as Optional before, though I'm not sure that makes a difference here) and keeping the Use Scalar Type box checked, I can confirm that existing objects in the store are not modified (at all) with this configuration of the batch insert context:
context.persistentStoreCoordinator = container.persistentStoreCoordinator
// HOWEVER, observe that regardless of the merge policy below,
// setting `context.parent = container.viewContext` will also
// overwrite the store data!
context.mergePolicy = NSMergeByPropertyStoreTrumpMergePolicy
// NSMergeByPropertyObjectTrumpMergePolicy ignores objects in the store
// (which have the same unique constraint value, here equal `storyURL`)
// and overwrites all properties.
// To confirm that the batch insert operation does not modify
// existing Story instances (at all), first delete all instances
// where isFavorite == false. Then load all the story data again and
// execute the NSBatchInsertRequest with this change to managedObjectHandler:
story.title = storyResponse.title + " (modified)"
You will see the missing stories get inserted back, this time with their titles having a suffix " (modified)"; but previously favorited stories
do not get modified (basically, with this setup, the batch insert won't re-insert objects).
So the isFavorite property does not get overwritten BUT neither do any properties that should be changed (because they received a new title, for example).
Therefore, if you don't want your objects to get updated, but you want completely new objects to be inserted, you can use this approach.
However, if you are expecting your objects to require updates, here are some alternatives:
you may opt to run a separate update operation, maybe an NSBatchUpdateRequest after you run your batch insert in this way,
or after the batch insert, you can update certain properties in a simple loop in a (possibly background/child) context without a batch operation, which could be fine if there isn't tons of data;
lastly, you might be able to first batch insert new data to a temporary store before somehow manually merging your choice of properties with the new store, then delete the temporary store.
A simpler approach: you could fetch all the properties you want to keep unchanged before you execute the batch insert (storing them in a dictionary keyed by your object's uniqueness constraint value), and then during the batch insert set the property again.
For this approach, you will want to use a different merge policy, such as NSMergeByPropertyObjectTrumpMergePolicy, so that the updated object gets re-inserted into the store (make sure to fetch all properties that you don't want to lose in advance of the batch insert).
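For illustration only, here is a minimal sketch of that approach, reusing the container and downloadedStories from above and assuming storyResponse.storyURL is a non-optional URL: the isFavorite values are collected up front, keyed by the unique storyURL, and re-applied inside the handler.

import CoreData

let viewContext = container.viewContext

// 1. Before the batch insert, remember which stories are favorites,
//    keyed by the unique-constraint attribute (storyURL).
let favoritesRequest = NSFetchRequest<Story>(entityName: "Story")
favoritesRequest.predicate = NSPredicate(format: "isFavorite == YES")
let favoriteURLs = Set((try? viewContext.fetch(favoritesRequest))?.compactMap { $0.storyURL } ?? [])

// 2. Run the batch insert with NSMergeByPropertyObjectTrumpMergePolicy and
//    re-apply the preserved value, so the overwritten row keeps it.
let context = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
context.persistentStoreCoordinator = container.persistentStoreCoordinator
context.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy

context.perform {
    var index = 0
    let batchInsert = NSBatchInsertRequest(entity: Story.entity(), managedObjectHandler: { object in
        guard index < downloadedStories.count else { return true }
        let story = object as! Story
        let storyResponse = downloadedStories[index]
        story.title = storyResponse.title
        story.storyURL = storyResponse.storyURL
        story.updatedTime = storyResponse.updatedTime
        // Re-apply the value that should survive the overwrite.
        story.isFavorite = favoriteURLs.contains(storyResponse.storyURL)
        index += 1
        return false
    })
    _ = try? context.execute(batchInsert)
}

The trade-off is one extra fetch before the insert; whether that is acceptable depends on how many favorites there are.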
random idea: How to Save Data When Using One ManagedObjectContext and PersistentStoreCoordinator with Two Stores
I don't think it is actually possible to do a partial update with a batch insert request. It's hard to know for sure because I don't think any of this is documented except in WWDC sessions. When I first watched the 2019 session, I was excited because the presenter said:
Attributes that are optional or configured with default values can be omitted from the dictionary as well.
In the case of updating an object with unique constraint, the existing values will not be changed.
I took this to mean that:
You can omit values for new objects, and you'll get the defaults or NULL. That makes sense.
If there's an existing object and you omit a value, that value will not be changed. So you can purposely omit values to do a partial update, i.e. update other values while leaving your isFavorite alone.
But, after writing code to test this and looking at the output from com.apple.CoreData.SQLDebug, what actually seems to happen with NSMergeByPropertyObjectTrumpMergePolicy is:
If you omit a value that's required you get a validation error.
If you omit a value that's optional, it updates the row to NULL. For a Bool property in Swift, this will become false.
If you omit a value with a default value, it updates the row to the default.
This is a shame because it seems like partial updates could be implemented by having the ON CONFLICT clause only specify DO UPDATE SET for the attributes that you actually set. But (as of macOS 11) Core Data seems to always generate SQL to set all of the columns.
In summary, with batch inserts, NSMergeByPropertyObjectTrumpMergePolicy does not actually merge by property based on what's changed (like with a regular Core Data save). Rather, it either inserts a new row (if the object is absent) or overwrites all the columns but preserves the objectID (if the object was present).
NSMergeByPropertyStoreTrumpMergePolicy also doesn't merge by property. It just means to leave the stored object alone if it's already present.
Update (2021-06-24): I heard from DTS that Apple considers the current (iOS 14/macOS 11) behavior described above a bug, and that it should let you batch insert without changing omitted properties. The Radar number is 79747419.

Load product image everywhere in the Shopware 6 storefront from external URL at runtime without saving it in filesystem

I am changing the image of a product from an external URL at runtime on the saleschannel.product.load event. This all works fine, but when placing the order, it fails with the following error:
SQLSTATE[23000]: Integrity constraint violation: 1452 Cannot add or update a child row: a foreign key constraint fails (`sv3_dev`.`order_line_item`, CONSTRAINT `fk.order_line_item.cover_id` FOREIGN KEY (`cover_id`) REFERENCES `media` (`id`) ON UPDATE CASCADE)
I am guessing this is because I am overwriting the media entity of the product with my custom implementation like this, so it does not find the media cover ID when inserting the order line item:
$pathInfo = pathinfo($url);
$media = new MediaEntity();
$media->setId(Uuid::randomHex());
$media->setUrl($url);
$media->setMimeType(sprintf('image/%s', $pathInfo['extension']));
$media->setFileExtension($pathInfo['extension']);
$media->setFileName($pathInfo['filename']);
$productMediaEntity = new ProductMediaEntity();
$productMediaEntity->setId(Uuid::randomHex());
$productMediaEntity->setMedia($media);
$productMediaEntity->setPosition(0);
$mediaCollection = new ProductMediaCollection([$productMediaEntity]);
$entity->setMedia($mediaCollection);
if ($entity->getCover() === null) {
    $entity->setCover($productMediaEntity);
} else {
    $entity->getCover()->setMedia($productMediaEntity->getMedia());
}
Is there a way to dynamically change the image at run time everywhere in the storefront?
I cannot save the image / media in the filesystem due to some copyright clause which does not allow the images to be downloaded in the shop. We can only load it at runtime.
To anyone else who stumbles upon this: dynamically adding the media entity as described in the question works for the rest of the shop, except when placing the order, because the order line item requires a valid media ID due to the FK constraint. So what I did was create a media entity through the mediaRepository and use that entity's ID as the reference for the order, instead of Uuid::randomHex(), without saving the actual image in the filesystem.

How can I manipulate the Audit screen (SM205510) through code

I'm trying to manipulate the Audit screen (SM205510) through code, using a graph object. The operation of the screen has processes that seem to work when a screen ID is selected in the header. This is my code to create a new record:
using PX.Data;
using PX.Objects.SM;
var am = PXGraph.CreateInstance<AUAuditMaintenance>();
AUAuditSetup auditsetup = new AUAuditSetup();
auditsetup.ScreenID = "GL301000";
auditsetup = am.Audit.Insert(auditsetup);
am.Actions.PressSave();
Now, when I execute the code above, it creates a record in the AUAuditSetup table just fine - but it doesn't automatically create the AUAuditTable records the way they are auto-generated in the screen (I realize the records aren't in the database yet). How can I get the graph object to auto-generate the AUAuditTable records in the cache the way the screen does?
I've tried looking at the source code for the Audit screen - but it just shows blank, like there's nothing there. I looked in the code repository in Visual Studio and I don't see any file for AUAuditMaintenance either, so I can't see any process that I could run in the graph object that would populate those AUAuditTable records.
Any help would be appreciated.
Thanks...
If I had such a need to manipulate Audit screen records, I'd rather create my own graph and probably generate a DAC class. I'd also add one more column, UsrIsArtificial, set to false by default, and then manage the rows as ordinary records; each time my code added something, I'd set UsrIsArtificial to true.
You can hardly find how those records are managed at graph level, because they are created and handled not at graph level but at framework level. Also, think twice or even more about the design, as writing directly into the audit history may confuse users about what was caused by a user and what was caused by your code. From that point of view, I would rather add one more table than add confusion to the existing one.
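To illustrate that suggestion only, here is a minimal sketch of what such a flag might look like on a hypothetical custom DAC (XXCodeAuditRecord and UsrIsArtificial are placeholder names, not part of Acumatica):

using System;
using PX.Data;

// Hypothetical custom table for audit-like rows written by code.
[Serializable]
[PXCacheName("Code Audit Record")]
public class XXCodeAuditRecord : IBqlTable
{
    #region ScreenID
    public abstract class screenID : IBqlField { }
    [PXDBString(8, IsKey = true, IsFixed = true)]
    [PXUIField(DisplayName = "Screen ID")]
    public virtual string ScreenID { get; set; }
    #endregion

    #region UsrIsArtificial
    public abstract class usrIsArtificial : IBqlField { }
    // Defaults to false; rows created by code explicitly set this to true,
    // so they can be told apart from user-entered data.
    [PXDBBool]
    [PXDefault(false)]
    [PXUIField(DisplayName = "Created by Code")]
    public virtual bool? UsrIsArtificial { get; set; }
    #endregion
}

Rows flagged this way can then be filtered out (or highlighted) wherever they might otherwise be confused with user-entered audit data.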
Acumatica support provided this solution, which works beautifully (Hat tip!):
var screenID = "GL301000"; //"SO303000";
var g = PXGraph.CreateInstance<AUAuditMaintenance>();

// Set Header Current
g.Audit.Current = g.Audit.Search<AUAuditSetup.screenID>(screenID);
if (g.Audit.Current == null) // If no Current then insert
{
    var header = new AUAuditSetup();
    header.ScreenID = screenID;
    header.Description = "Test Audit";
    header = g.Audit.Insert(header);
}

foreach (AUAuditTable table in g.Tables.Select())
{
    table.IsActive = true;
    // Sets Current for Detail
    g.Tables.Current = g.Tables.Update(table);

    foreach (AUAuditField field in g.Fields.Select())
    {
        field.IsActive = false;
        g.Fields.Update(field);
    }
}
g.Actions.PressSave();

Umbraco - Examine index not updated

I am using Umbraco CMS, and trying to use its site search function that uses Examine.
When I edit a page and publish it, the examine index is not updated, hence search results are always out of date. I have to manually delete the Index folder to update it.
Shouldn't the index be updated automatically every time you update the content?
I wrote a class that updates the index on publish.
using umbraco.BusinessLogic;
using umbraco.cms.businesslogic.web;
using Examine;

public class UmbracoEvents : ApplicationBase
{
    /// <summary>Constructor</summary>
    public UmbracoEvents()
    {
        Document.AfterPublish += new Document.PublishEventHandler(Document_AfterPublish);
    }

    private void Document_AfterPublish(Document sender, umbraco.cms.businesslogic.PublishEventArgs e)
    {
        // Rebuild SiteSearchIndexer
        ExamineManager.Instance.IndexProviderCollection["SiteSearchIndexer"].RebuildIndex(); // Unfortunately this doesn't index the latest change, must republish to index it
    }
}
However, it doesn't pick up the latest change even though it is supposed to run "after" publish. So, to get up-to-date search results, you have to publish twice :S
You can manually update the index by using the Examine Dashboard.
To automatically rebuild the indexes on app startup, you could add this line to ExamineIndex.config, located in the config directory:
<Examine RebuildOnAppStart="true">
Indexes should rebuild automatically when you publish/republish a content node. If that doesn't happen, you may have a configuration problem with Examine.

Insert/update Doctrine object from Excel

On the project I am currently working on, I have to read an Excel file (with over 1000 rows), extract all of the rows, and insert or update them in a database table.
In terms of performance, is it better to add all the records to a Doctrine_Collection and insert/update them afterwards using the fromArray() method? The other possible approach is to create a new object for each row (an Excel row becomes an object) and then save it, but I think that is worse in terms of performance.
Every time the Excel file is uploaded, it is necessary to compare its rows to the existing objects in the database. If a row does not exist as an object, it should be inserted; otherwise it should be updated. My first approach was to turn both the objects and the rows into arrays (or Doctrine_Collections) and then compare them before performing the needed operations.
Can anyone suggest any other possible approach?
We did a bit of this in a project recently, with CSV data. It was fairly painless. There's a symfony plugin tmCsvPlugin, but we extended it quite a bit since, so the version in the plugin repo is pretty out of date. Must add that to the #TODO list :)
Question 1:
I don't explicitly know about performance, but I would guess that adding the records to a Doctrine_Collection and then calling Doctrine_Collection::save() would be the neatest approach. I'm sure it would be handy if an exception was thrown somewhere and you had to roll back your last save.
Question 2:
If you could use a row field as a unique identifier (let's assume a username), then you could search for an existing record. If you find a record, and assuming that your imported row is an array, use Doctrine_Record::synchronizeWithArray() to update this record; then add it to a Doctrine_Collection. When complete, just call Doctrine_Collection::save().
A fairly rough 'n' ready implementation:
// set up a new collection
$collection = new Doctrine_Collection('User');

// assuming $row is an associative
// array representing one imported row.
foreach ($importedRows as $row) {
    // try to find an existing record
    // based on a unique identifier.
    $user = Doctrine_Core::getTable('User')
        ->findOneByUsername($row['username']);

    // create a new user record if
    // no existing record is found.
    if (!$user instanceof User) {
        $user = new User();
    }

    // sync record with current data.
    $user->synchronizeWithArray($row);

    // add to collection.
    $collection->add($user);
}

// done. save collection.
$collection->save();
Pretty rough but something like this worked well for me. This is assuming that you can use your imported row data in some way to serve as a unique identifier.
NOTE: be wary of synchronizeWithArray() if you're using sf1.2/Doctrine 1.0 - if I remember correctly, it was not implemented correctly. It works fine in Doctrine 1.2 though.
I have never worked with Doctrine_Collections, but I can answer in terms of database queries and code logic in a broader sense. I would apply the following logic:
1. Fetch all the rows of the relevant database table in a single query and store them in an array, $storedSheet.
2. Create a single array of all the rows of the uploaded Excel sheet; call it $uploadedSheet. I guess the structures of $storedSheet and $uploadedSheet will be similar (both two-dimensional - rows and cells can be identified and compared).
3. Run foreach loops on $uploadedSheet as follows, and only identify the rows which need to be inserted and which need to be updated (do the actual queries later):
$rowsToBeUpdated = array();
$rowsToBeInserted = array();

foreach ($uploadedSheet as $row => $eachRow)
{
    if (is_array($storedSheet[$row]))
    {
        foreach ($eachRow as $column => $value)
        {
            if ($value != $storedSheet[$row][$column])
            { // This is a representation of comparison
                $rowsToBeUpdated[$row] = true;
                break; // No need to check this row anymore - one difference detected.
            }
        }
    }
    else
    {
        $rowsToBeInserted[$row] = true;
    }
}
4. This way you have two arrays. Now perform two database queries:
bulk insert all those rows of $uploadedSheet whose numbers are stored in the $rowsToBeInserted array;
bulk update all the rows of $uploadedSheet whose numbers are stored in the $rowsToBeUpdated array.
These bulk queries are the key to faster performance.
Let me know if this helped, or if you want to know something else.
