How to decode a CKRecord NSData field, created by a transformable CoreData attribute, back to its original type? - core-data

Setup:
My app uses CoreData & CloudKit sync.
Entity Item has, among others, an attribute status of type Int16 with corresponding property @NSManaged var status: Int16, and an attribute names of type Transformable using NSSecureUnarchiveFromDataTransformer with corresponding property @NSManaged var names: Set<String>?
To get notified when iCloud records change, the app uses a query subscription.
When a notification is delivered, it carries an identifier of the changed object; from it the objectID of the NSManagedObject is obtained, and the corresponding ckRecord: CKRecord is fetched using persistentContainer.record(for: objectID) on the persistentContainer: NSPersistentCloudKitContainer to get the actual property values.
Question:
"normal" field values, e.g. status, can easily be obtained using ckRecord["CD_status"] as! Int16. I can also get ckRecord["CD_names"] as! NSData, but I don't know how to convert it back to Set<String>?.

Property types are limited to a fixed set in Core Data, and Set<String> is not one of them; that's why you need a value transformer, which transforms it into (NS)Data.
So you can use that same transformer to transform it back with transformedValue(_:):
NSSecureUnarchiveFromDataTransformer().transformedValue(ckRecord["CD_names"])

Related

afterUpdate listener in TypeORM does not listen to updates on @UpdateDateColumn - postgres DB

I'm working on a functionality where I have an entity called 'Employee' with a few columns like first name, last name, DOB, etc., and two columns, createdAt and updatedAt, defined with @CreateDateColumn and @UpdateDateColumn respectively.
My requirement is that whenever an update operation is performed on this entity, I should log the name of each updated column along with its old and new values.
I'm performing the update on the Employee table using queryRunner.manager.save().
To track the changes, I'm making use of the entity subscriber's 'afterUpdate' event.
Now, on performing an update, e.g. when I change the name, my Employee entity correctly updates the name as well as the 'updatedAt' field to the current timestamp in the DB.
However, my 'afterUpdate' listener only logs the 'name' column with its old and new values; it does not log the updatedAt column.
Can anyone please help me understand what's possibly going wrong here?
Entity definition:
// ...all columns (first name, etc.)
@UpdateDateColumn({ name: 'updated_at', type: 'timestamp', nullable: true })
updatedAt: Date;
Subscriber code:
@EventSubscriber()
export class EmployeeSubscriber implements EntitySubscriberInterface<EmployeeEntity> {
  constructor() {}

  listenTo() {
    return EmployeeEntity;
  }

  afterUpdate(event: UpdateEvent<EmployeeEntity>) {
    const { updatedColumns, databaseEntity, entity } = event;
    updatedColumns.forEach(({ propertyName }) => {
      // Get the updated column name and its old and new values
      // colName = propertyName;
      // oldVal = databaseEntity[propertyName as keyof EmployeeEntity] as string;
      // newVal = entity?.[propertyName as keyof EmployeeEntity] as string;
    });
  }
}
I tried using different save operations (via repository, queryRunner, manager), but it didn't work.
The official docs say the event is triggered only on a save operation.
If I manually pass the date to the updatedAt field, then I am able to get the field within the event handler, but I don't think that's the right approach, as manual intervention should not be needed for @UpdateDateColumn.
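Since event.updatedColumns only reflects columns TypeORM itself considered changed by the caller, one possible workaround (a sketch, not documented TypeORM behavior) is to diff event.databaseEntity against the saved entity yourself, which also picks up updatedAt, assuming the generated timestamp has been assigned to the entity by the time afterUpdate runs. The diffEntities helper and its field names are illustrative assumptions:

```typescript
// Hedged sketch: compute changed columns by comparing the pre-update snapshot
// (databaseEntity) against the entity passed to save(), instead of relying on
// event.updatedColumns. All names below are illustrative.
interface Change { column: string; oldVal: unknown; newVal: unknown }

function diffEntities<T extends object>(before: T, after: Partial<T>): Change[] {
  const changes: Change[] = [];
  for (const column of Object.keys(after) as (keyof T)[]) {
    const oldVal = before[column];
    const newVal = after[column];
    // Compare by value; Date columns need valueOf() to compare correctly.
    const a = oldVal instanceof Date ? oldVal.valueOf() : oldVal;
    const b = newVal instanceof Date ? newVal.valueOf() : newVal;
    if (a !== b) changes.push({ column: String(column), oldVal, newVal });
  }
  return changes;
}

// Inside afterUpdate(event) this could be called as:
//   const changes = diffEntities(event.databaseEntity, event.entity ?? {});
const before = { name: 'Ann', updatedAt: new Date('2024-01-01T00:00:00Z') };
const after = { name: 'Anna', updatedAt: new Date('2024-02-01T00:00:00Z') };
console.log(diffEntities(before, after).map(c => c.column)); // ['name', 'updatedAt']
```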

Azure Cosmos DB: Unique index constraint violation using UpsertDocumentAsync

I have defined a UniqueKey policy in my Azure Cosmos DB container for the field UniqueName.
The function below is called on a timer.
I'm attempting to upsert documents into Azure Cosmos DB using Azure Functions bindings, like so:
// Note: the element type of records is assumed here; as posted it was
// "string records", which would not compile with the loop below.
public async Task ManageItems([ActivityTrigger] IReadOnlyList<RecordItem> records,
    [CosmosDB(
        databaseName: "mydatabase",
        collectionName: "items",
        ConnectionStringSetting = "CosmosDbConnectionString")] DocumentClient client,
    ILogger log)
{
    var collectionUri = UriFactory.CreateDocumentCollectionUri("mydatabase", "items");
    foreach (var record in records)
    {
        log.LogDebug($"Upserting itemNumber={record.UniqueName}");
        await client.UpsertDocumentAsync(collectionUri, record);
    }
}
During the first execution against a blank "items" container, the Upsert for each record works splendidly, inserting each record as a separate document.
However, when running a test with the same data as the first execution, now expecting an "Update" as opposed to an "Insert", I get an exception: Unique index constraint violation after the UpsertDocumentAsync method runs.
What am I missing here?
To my understanding, an Upsert is either an update or an insert, depending on whether the object already exists, based on its unique identifier.
The check of whether the outgoing object's unique id matches an existing document's unique id is supposed to happen at the Cosmos DB container level.
What I expect is that the call notices a document with that unique id already exists and performs an update rather than throwing an exception. I would expect an exception only if the method were insert-only.
This issue was fixed by specifying an explicit "id" field in the class the "records" came from.
The "id" was set to the unique "recordNumber" value that I wanted to use as the unique key.
For good measure I set disableAutomaticIdGeneration to true in the UpsertDocumentAsync call:
UpsertDocumentAsync(collectionUri, record, disableAutomaticIdGeneration: true);
No more unique index violation, and no duplicates either.
Worth noting the solution is similar to this one: How can I insert/update data in CosmosDB in an Azure function
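The core of the fix generalizes across SDKs: derive the document's id deterministically from the unique field, so a repeated upsert matches the same document instead of inserting a new one under a fresh auto-generated id and then tripping the unique-key constraint. A minimal sketch of that mapping (TypeScript here; RecordItem and uniqueName are assumed names):

```typescript
// Hedged sketch: derive the Cosmos "id" from the record's unique field so that
// upserting the same record twice targets the same document. Names are assumptions.
interface RecordItem { uniqueName: string; payload?: unknown }

function toDocument(record: RecordItem): { id: string } & RecordItem {
  // Without an explicit id, the SDK auto-generates one per call, so a second
  // upsert of the "same" record is treated as a brand-new document and then
  // violates the unique-key policy on uniqueName.
  return { id: record.uniqueName, ...record };
}

const first = toDocument({ uniqueName: 'item-001' });
const second = toDocument({ uniqueName: 'item-001' });
console.log(first.id === second.id); // true: both upserts hit the same document
```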

Google Datastore can't update an entity

I'm having issues retrieving an entity from Google Datastore. Here's my code:
async function pushTaskIdToCurrentSession(taskId) {
  console.log(`Attempting to add ${taskId} to current Session: ${cloudDataStoreCurrentSession}`);
  const transaction = datastore.transaction();
  const taskKey = datastore.key(['Session', cloudDataStoreCurrentSession]);
  try {
    await transaction.run();
    const [task] = await transaction.get(taskKey);
    let sessionTasks = task.session_tasks;
    sessionTasks.push(taskId);
    task.session_tasks = sessionTasks;
    transaction.save({
      key: taskKey,
      data: task,
    });
    await transaction.commit();
    console.log(`Task ${taskId} added to current Session successfully.`);
  } catch (err) {
    console.error('ERROR:', err);
    await transaction.rollback();
  }
}
taskId is a string id of another entity that I want to store in an array of a property called session_tasks.
But it doesn't get that far. After this line:
const [task] = await transaction.get(taskKey);
The error is that task is undefined:
ERROR: TypeError: Cannot read property 'session_tasks' of undefined
at pushTaskIdToCurrentSession
Anything immediately obvious from this code?
UPDATE:
Using this instead:
const task = await transaction.get(taskKey).catch(console.error);
Gets me a task object, but it seems to be creating a new entity on the datastore:
I also get this error:
(node:19936) UnhandledPromiseRejectionWarning: Error: Unsupported field value, undefined, was provided.
at Object.encodeValue (/Users/.../node_modules/#google-cloud/datastore/build/src/entity.js:387:15)
This suggests the array is unsupported?
The issue here is that Datastore supports two kinds of IDs:
IDs that start with name= are custom IDs and are treated as strings.
IDs that start with id= are numeric auto-generated IDs and are treated as integers.
When you tried to update the value in Datastore, cloudDataStoreCurrentSession was treated as a string. Since Datastore couldn't find an existing entity with that custom name, it created one and added name= to mark it as a custom name. So you have to pass cloudDataStoreCurrentSession as an integer to save the data properly.
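A sketch of that coercion, assuming the @google-cloud/datastore client (which exposes datastore.int() for addressing numeric auto-generated IDs); makeSessionKey is an illustrative helper, exercised here against a stub client rather than a live Datastore:

```typescript
// Hedged sketch: build a Session key with a numeric ID instead of a string, so
// transaction.get() finds the auto-generated entity rather than implicitly
// addressing a non-existent custom-named one. makeSessionKey is illustrative.
interface DatastoreLike {
  key(path: [string, unknown]): { kind: string; idOrName: unknown };
  int(value: string | number): { value: string };
}

function makeSessionKey(datastore: DatastoreLike, sessionId: string | number) {
  // Numeric-looking IDs must be wrapped with datastore.int(); passing the raw
  // string would silently address a custom-named key ("name=...") instead.
  const id = /^\d+$/.test(String(sessionId)) ? datastore.int(sessionId) : sessionId;
  return datastore.key(['Session', id]);
}

// Stub standing in for new Datastore():
const stub: DatastoreLike = {
  key: ([kind, idOrName]) => ({ kind, idOrName }),
  int: (value) => ({ value: String(value) }),
};
console.log(makeSessionKey(stub, '5639456635748352'));
```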
If I understand correctly, you are trying to load an array list of strings from Datastore, using a specific entity kind and entity key, then add one more task and update the value in Datastore for that entity kind and entity key.
I have created the same scenario as yours and done a bit of coding myself. In this GitHub code you will find my example, which does the following:
Goes to Datastore entity kind Session.
Retrieves all the data from entity key id=5639456635748352 (e.g.).
Gets the array list from key session_tasks.
Adds the new task passed in the function's arguments.
Performs the transaction to Datastore and updates the values.
All steps are logged in the code and there are a lot of comments explaining exactly how it works. There are also two examples of currentSessionID: one for custom names and one for automatically generated IDs. You can test the code to understand its usage and modify it according to your needs.

Cloud firestore - adding object to an array of objects

I am using Cloud Firestore (not Realtime Database). I have a document of user details containing an array (cards) of user credit cards, i.e. Card objects. A card object is structured as follows:
{
  address: "123",
  number: "456643634634634",
  postcode: "hshshs"
}
The document is defined as follows:
userDocument: AngularFirestoreDocument<User>;
this.userDocument = this.afs.collection('users').doc(`${this.userId}`);
When a user adds a new card, I call the following function:
addCard(card: Card) {
  this.userDocument.update({
    cards: firebase.firestore.FieldValue.arrayUnion(card)
  });
}
However, I get the following errors before even saving the document:
[ts]
Type 'FieldValue' is not assignable to type 'Card[]'.
Property 'length' is missing in type 'FieldValue'. [2322]
user.model.ts(15, 5): The expected type comes from property 'cards' which is declared here on type 'Partial<User>'
Not sure how to use arrayUnion to add an element to an array in Cloud Firestore correctly. Can anyone help with this?
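The error above is a compile-time type mismatch rather than a Firestore problem: the User model declares cards as Card[], while arrayUnion returns a FieldValue sentinel. One common workaround (a sketch, not the only option) is to widen the field's type so writes may pass the sentinel. The snippet below substitutes a stand-in FieldValue class so it runs without the firebase SDK; in a real app the sentinel comes from firebase.firestore.FieldValue and update() from AngularFirestoreDocument:

```typescript
// Hedged sketch of the typing workaround, using a stand-in FieldValue so the
// example runs without the firebase SDK.
interface Card { address: string; number: string; postcode: string; }

// Stand-in for firebase.firestore.FieldValue:
class FieldValue {
  constructor(public readonly elements: unknown[]) {}
}
const arrayUnion = (...elements: unknown[]) => new FieldValue(elements);

// Widen the field so writes may use either the concrete array or the sentinel:
interface User {
  cards: Card[] | FieldValue;
}

function buildUpdate(card: Card): Partial<User> {
  return { cards: arrayUnion(card) }; // compiles: FieldValue is now allowed
}

const update = buildUpdate({ address: '123', number: '456643634634634', postcode: 'hshshs' });
console.log(update.cards instanceof FieldValue); // true
```

The trade-off of widening the model type is that reads must narrow cards back to Card[]; keeping separate read and write types is another option.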

Record Id and Type coming as null when I get the result from a saved search in SuiteScript

I have a saved search already in my sandbox account. I am not sure what record type the saved search is created on. I tried loading the saved search as:
var savedSearch = nlapiLoadSearch("item", searchId);
var resultset = savedSearch.runSearch();
resultset.getResults(0, 1000); // Actually I have looped and got all my search results.
When I run it in the debugger I can see the results in the columns correctly, but the recordId and recordType of the saved search result are null. I want to have the record type so that I can load that particular record as required.
Attached is a screenshot of the debugger results in variables section.
If the methods Eric mentions are returning nulls, then your search is probably using aggregates like count and sum.
You can get the internal id by including internal id as a group field, and you can include type as a group field too, but you can't use it directly the way you can with results[i].getRecordType().
nlobjSearchResult objects have getId() and getRecordType() methods for this purpose.
For example, if you store your results in an array called searchResults:
searchResults.forEach(printResult);

function printResult(result) {
  var recordId = result.getId();
  var recordType = result.getRecordType();
  // ...
}
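Combining the two answers: for a plain search use getId()/getRecordType(); for a grouped/summary search, those return null and internal id and type must be read as GROUP columns via getValue(name, join, summary), which is the SuiteScript 1.0 signature. The sketch below exercises that fallback against a stub result object (the stub data is illustrative; real code would run inside NetSuite):

```typescript
// Hedged sketch: prefer the direct accessors, then fall back to grouped
// columns for summary searches. The stub mimics an nlobjSearchResult.
interface SearchResultLike {
  getId(): string | null;
  getRecordType(): string | null;
  getValue(name: string, join: string | null, summary: string | null): string | null;
}

function extractIdAndType(result: SearchResultLike): { id: string | null; type: string | null } {
  const id = result.getId() ?? result.getValue('internalid', null, 'GROUP');
  const type = result.getRecordType() ?? result.getValue('type', null, 'GROUP');
  return { id, type };
}

// Stub of a grouped search result where the direct accessors return null:
const groupedResult: SearchResultLike = {
  getId: () => null,
  getRecordType: () => null,
  getValue: (name, _join, summary) =>
    summary === 'GROUP'
      ? (({ internalid: '42', type: 'inventoryitem' } as Record<string, string>)[name] ?? null)
      : null,
};
console.log(extractIdAndType(groupedResult)); // { id: '42', type: 'inventoryitem' }
```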
