Data migration after adding CloudKit configuration to local Core Data

My iOS app was using a pure CoreData local store ("Default" configuration).
I have added a new configuration with a CloudKit container to add some new entities into it.
I use this piece of code to manage my "local" store and "cloud" store:
let container = NSPersistentCloudKitContainer(name: "MyModel")

let description = container.persistentStoreDescriptions.first
description?.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
container.persistentStoreDescriptions = [description!]

let storeDirectory = FileManager.default.urls(for: .applicationSupportDirectory, in: .userDomainMask).first!

// Create a store description for a local store
let defaultUrl = storeDirectory.appendingPathComponent("default.sqlite")
let localStoreDescription = NSPersistentStoreDescription(url: defaultUrl)
localStoreDescription.configuration = "Default"

// Create a store description for a CloudKit-backed local store
let cloudUrl = storeDirectory.appendingPathComponent("cloud.sqlite")
let cloudStoreDescription = NSPersistentStoreDescription(url: cloudUrl)
cloudStoreDescription.configuration = "Cloud"

// Set the container options on the cloud store
cloudStoreDescription.cloudKitContainerOptions =
    NSPersistentCloudKitContainerOptions(containerIdentifier: "iCloud.com.myname.myapp")

// Update the container's list of store descriptions
container.persistentStoreDescriptions = [
    cloudStoreDescription,
    localStoreDescription
]

// Load both stores
container.loadPersistentStores { _, error in
    guard error == nil else {
        fatalError("Could not load persistent stores. \(error!)")
    }
}

context = container.viewContext
context.automaticallyMergesChangesFromParent = true
context.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
It works, but all the objects that were in my local store are lost with this new store. These legacy objects are supposed to stay in the local store (the "cloud" entities are new).
I have created a new version of my .xcdatamodel, but I haven't found any solution to keep my previous data with this new store setup.
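A minimal sketch of one possible direction, assuming the legacy objects live in the store that the plain NSPersistentContainer created at its default location (typically Application Support/MyModel.sqlite; that file name is a guess, adjust it to whatever the old store is actually called). The idea is to move that existing file to the new defaultUrl once, before loading the stores, so the "Default" configuration keeps the old data. It reuses container and defaultUrl from the code above:

// One-time migration of the legacy store file to the new local URL.
// ASSUMPTION: the old store sits at NSPersistentContainer's default
// location under the model's name; change oldUrl if it lives elsewhere.
let oldUrl = NSPersistentContainer.defaultDirectoryURL()
    .appendingPathComponent("MyModel.sqlite")

if FileManager.default.fileExists(atPath: oldUrl.path),
   !FileManager.default.fileExists(atPath: defaultUrl.path) {
    do {
        // replacePersistentStore copies the store (including its -wal/-shm
        // journal files) to the destination URL in a safe way.
        try container.persistentStoreCoordinator.replacePersistentStore(
            at: defaultUrl,
            destinationOptions: nil,
            withPersistentStoreFrom: oldUrl,
            sourceOptions: nil,
            ofType: NSSQLiteStoreType)
    } catch {
        print("Could not migrate legacy store: \(error)")
    }
}
// ...then call container.loadPersistentStores as in the code above.

The same move can also be done with migratePersistentStore(_:to:options:withType:), but that requires loading the old store first, so the file-level replace is simpler here.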

Related

PagerDuty Terraform API Limitations

Just inquiring whether anyone is aware of any permission limitations with the PagerDuty Terraform API. With a base role of Observer in PagerDuty, it appears that certain objects (which my user created) can be deleted via the GUI, but not via the Terraform API, even though I'm using the same user account. A PagerDuty Extension is an example of an object where I'm hitting this issue.
The same test case works as expected if I try it with a user with a base role of Manager, though. Here's a quick Terraform file I threw together to verify this test case:
resource "pagerduty_schedule" "schedule" {
name = "terraform-test-schedule"
time_zone = "America/Denver"
teams = ["PRDBAEK"]
layer {
name = "weekly"
start = "2020-02-05T09:00:00-06:00"
rotation_virtual_start = "2020-02-05T09:00:00-06:00"
rotation_turn_length_seconds = 604800
users = ["PN94M6Q"]
}
}
resource "pagerduty_escalation_policy" "escalation_policy" {
name = "terraform-test-ep"
description = "terraform-test-ep"
num_loops = 0
teams = ["PRDBAEK"]
rule {
escalation_delay_in_minutes = 10
target {
type = "schedule_reference"
id = pagerduty_schedule.schedule.id
}
}
}
resource "pagerduty_service" "event" {
name = "terraform-test-service"
description = "terraform-test-service"
alert_creation = "create_alerts_and_incidents"
escalation_policy = pagerduty_escalation_policy.escalation_policy.id
incident_urgency_rule {
type = "constant"
urgency = "severity_based"
}
alert_grouping_parameters {
type = "intelligent"
config {
fields = []
timeout =0
}
}
auto_resolve_timeout = "null"
acknowledgement_timeout = "null"
}
resource "pagerduty_extension" "test_extension" {
name = "terraform-test-extension"
extension_schema = data.pagerduty_extension_schema.generic_v2_webhook.id
endpoint_url = https://fakeurl.com
extension_objects = [
pagerduty_service.event.id
]
config = jsonencode({})
}
All objects can be created successfully. However, when I test a terraform destroy with an account that has the base role Observer, I get the following error; it can't delete the extension.
Error: DELETE API call to https://api.pagerduty.com/extensions/P53423F failed 403 Forbidden. Code: 2010, Errors: <nil>, Message: Access Denied
But using that same account, I can delete that extension in the GUI with no issues.

How can I set up an InfluxDB database based on workspace name?

I have a Terraform script where I have to set up an InfluxDB server, and I want to create different databases based on the workspace name. Is it possible to create a map in the variables file to allocate a database name and look it up from a different variable within the same file?
Ex:
var file:
variable "influx_database" "test" {
name = "${lookup(var.influx_database_name, terraform.workspace)}
}
variable "influx_database_name" {
type = "map"
default = {
dump = "dump_database"
good = "good_database"
}
}
You can use a local value like below:
locals {
  influx_database_name = "${lookup(var.influx_database_name, terraform.workspace)}"
}

variable "influx_database_name" {
  type = "map"
  default = {
    default = "default_database"
    dump    = "dump_database"
    good    = "good_database"
  }
}

output "influx_database_name" {
  value = "${local.influx_database_name}"
}
local.influx_database_name is then resolved from the current workspace name.

How to share variables in Node.js?

I want to share variables between different files in Node.
I have seen many sites, but none of them worked for me.
These are some of the sites:
Share variables between files in Node.js?
https://stackabuse.com/how-to-use-module-exports-in-node-js/
usersConroller.js file
module.exports.fetchedUser = fetchedUser;
module.exports.fetchedUser.branchId = branchId;
module.exports.fetchedUser.role = role;
module.exports.isLoggedIn = isLoggedIn;
Then, in another file, I imported the controller and tried to access the variables like this:
let usersController = require('./usersController');
let fetchedUser = usersController.fetchedUser;
let branchId = usersController.branchId;
let role = usersController.role;
let isLoggedIn = usersController.isLoggedIn;
When I console.log() the variables, they say undefined.
Any help, please?
Thank you for your help!
If there is no typo anywhere and you are using the correct file name in your require statement, then the problem is the way you are accessing the variables.
Your exported object looks something like this:
exports = {
    fetchedUser: {
        branchId: <some_value>,
        role: <some_other_value>
    },
    isLoggedIn: <another_value>
}
Now, let's look at your code:
// this line should give you the desired result
let fetchedUser = usersController.fetchedUser;
// this a wrong way to access branchId
// let branchId = usersController.branchId;
// branchId is actually a property of fetchedUser
// so you'll have to first access that
let branchId = usersController.fetchedUser.branchId;
// alternatively (because fetchedUser is already
// saved in a variable):
branchId = fetchedUser.branchId;
// similar problem while accessing role property
// let role = usersController.role;
// correct way:
let role = fetchedUser.role;
// this line is correct
let isLoggedIn = usersController.isLoggedIn;

Can I delete a file in Acumatica via the API?

I'm creating a file in Acumatica by calling an action from the API, so that I can retrieve the file in my application.
Is it possible to delete the file via API after I'm done with it? I'd rather not have it cluttering up my Acumatica database.
Failing this, is there a recommended cleanup approach for these files?
I found examples of how to delete a file from within Acumatica, as well as how to save a new version of an existing file! The implementation below saves a new version but has the deletion method commented out. Because I built this into my report generation process, I'm not later deleting the report via the API, but it would be easy to turn the deletion into an action callable by the API.
private IEnumerable ExportReport(PXAdapter adapter, string reportID, Dictionary<String, String> parameters)
{
    // Press save if the SO is not completed
    if (Base.Document.Current.Completed == false)
    {
        Base.Save.Press();
    }
    PX.SM.FileInfo file = null;
    using (Report report = PXReportTools.LoadReport(reportID, null))
    {
        if (report == null)
        {
            throw new Exception("Unable to access Acumatica report writer for specified report : " + reportID);
        }
        PXReportTools.InitReportParameters(report, parameters, PXSettingProvider.Instance.Default);
        ReportNode reportNode = ReportProcessor.ProcessReport(report);
        IRenderFilter renderFilter = ReportProcessor.GetRenderer(ReportProcessor.FilterPdf);
        // Generate the PDF
        byte[] data = PX.Reports.Mail.Message.GenerateReport(reportNode, ReportProcessor.FilterPdf).First();
        file = new PX.SM.FileInfo(reportNode.ExportFileName + ".pdf", null, data);
        // Save the PDF to the SO
        UploadFileMaintenance graph = new UploadFileMaintenance();
        // Check to see if a file with this name already exists
        Guid[] files = PXNoteAttribute.GetFileNotes(Base.Document.Cache, Base.Document.Current);
        foreach (Guid fileID in files)
        {
            FileInfo existingFile = graph.GetFileWithNoData(fileID);
            if (existingFile.Name == reportNode.ExportFileName + ".pdf")
            {
                // If we later decide we want to delete previous versions instead of saving them, this can be changed to
                // UploadFileMaintenance.DeleteFile(existingFile.UID);
                // But in the meantime, for history purposes, set the UID of the new file to that of the existing file so we can save it as a new version.
                file.UID = existingFile.UID;
            }
        }
        // Save the file with the setting to create a new version if one already exists based on the UID
        graph.SaveFile(file, FileExistsAction.CreateVersion);
        // Save the note attribute so we can find it again.
        PXNoteAttribute.AttachFile(Base.Document.Cache, Base.Document.Current, file);
    }
    // Return the info on the file
    return adapter.Get();
}
The response from Acumatica:
The screen-based (S-b) API allows a clean way of downloading a generated report as a file. The contract-based (C-b) API simply does not have this feature. I suggest you provide feedback here: feedback.acumatica.com (EDIT: Done! https://feedback.acumatica.com/ideas/ACU-I-1852)
I think a couple of workarounds are:
1) use the screen-based API with the login from the contract-based API to generate the report and get it as a file (see the example below), or
2) create another method to delete the file once the required report file is downloaded. For that, you will need to pass back a FileID or something similar to identify the file for deletion.
Example of #1:
using (DefaultSoapClient sc = new DefaultSoapClient("DefaultSoap1"))
{
    string sharedCookie;
    using (new OperationContextScope(sc.InnerChannel))
    {
        sc.Login("admin", "123", "Company", null, null);
        var responseMessageProperty = (HttpResponseMessageProperty)
            OperationContext.Current.IncomingMessageProperties[HttpResponseMessageProperty.Name];
        sharedCookie = responseMessageProperty.Headers.Get("Set-Cookie");
    }
    try
    {
        Screen scr = new Screen(); // add reference to report e.g. http://localhost/Demo2018R2/Soap/SO641010.asmx
        scr.CookieContainer = new System.Net.CookieContainer();
        scr.CookieContainer.SetCookies(new Uri(scr.Url), sharedCookie);
        var schema = scr.GetSchema();
        var commands = new Command[]
        {
            new Value { LinkedCommand = schema.Parameters.OrderType, Value = "SO" },
            new Value { LinkedCommand = schema.Parameters.OrderNumber, Value = "SO004425" },
            schema.ReportResults.PdfContent
        };
        var data = scr.Submit(commands);
        if (data != null && data.Length > 0)
        {
            System.IO.File.WriteAllBytes(@"c:\Temp\SalesOrder.pdf",
                Convert.FromBase64String(data[0].ReportResults.PdfContent.Value));
        }
    }
    finally
    {
        sc.Logout();
    }
}
Hope this helps. Also, it would be great if you update the Stack Overflow post based on these suggestions.
Thanks
Nayan Mansinha
Lead - Developer Support | Acumatica

Buggy and Slow scrolling when loading TableView images from CoreData

Problem:
Loading images into the table view using paths stored in the Core Data DB works, but the user experience is not good: scrolling is slow and buggy.
Important notes:
For my local DB I use Core Data.
I'm not saving the image itself in Core Data, only its path (the image name).
As for the table view data source, I use an array of type Person that contains an ID and an image name (the number of table view rows equals array.count).
This is the URL I'm getting my JSON from (check it out) - Json Link
All the objects are inside the Core Data DB.
As far as I know, I do all the UI updates on the main thread.
After each check I'm reloading the table view.
These are the steps, taken in this order:
Get the data using an NSURLSession data task, then parse it and check (in a for loop) whether each object exists in the Core Data DB; if it does, append its values to the table view data source array and reload the data.
1)
let request = NSMutableURLRequest(URL: dataSourceURL!)
let task = NSURLSession.sharedSession().dataTaskWithRequest(request) {
    data, response, error in
    if error != nil {
        println("error=\(error)")
        return
    }
    if data != nil {
        let datasourceDictionary = NSJSONSerialization.JSONObjectWithData(data, options: nil, error: nil) as NSDictionary
        var DataArray = datasourceDictionary["descisions"] as NSArray
        // Convert it to a useable array
        for var i = 0; i < DataArray.count; i++ {
            let ID_s = DataArray[i]["ID"]! as Int
            // For each object from the JSON array, check whether it exists in the local DB
            var ConvertetdID = Int32(ID_s)
            let object: Lockdown? = self.CheckIfObjectExistInDBbyID(ConvertetdID)
            // CheckIfObjectExistInDBbyID - if it exists, it returns an object with the correct values
            if object != nil {
                // Exists - load the file info from Core Data
                let imgname = object!.imgPath
                let photoRecord = PhotoRecord(name: "\(ConvertetdID)", url: imgname)
                self.photos.append(photoRecord)
                // TableView object array (photos)
                dispatch_async(dispatch_get_main_queue()) {
                    self.tableView.reloadData()
                    // After each check reload the tableView
                }
            }
        }
    }
}
task.resume()
Method that checks whether an object exists in the Core Data DB (it receives the ID and returns the object if it exists, nil if not):
func CheckIfObjectExistInDBbyID(id: Int32) -> Lockdown? {
    let appDelegate = UIApplication.sharedApplication().delegate as AppDelegate
    let managedContext = appDelegate.managedObjectContext!
    var request: NSFetchRequest = NSFetchRequest(entityName: "Lockdown")
    request.predicate = NSPredicate(format: "id = %d", id)
    var error: NSError?
    request.fetchLimit = 1
    var count: Int = managedContext.countForFetchRequest(request, error: &error)
    if count == 0 {
        // println("Error : \(error?.localizedDescription)")
        println("Object \(id) doesn't exist in CoreData")
        return nil
    }
    println("Object \(id) exists in CoreData")
    let result = managedContext.executeFetchRequest(request, error: &error) as NSArray!
    let lockdown = result.objectAtIndex(0) as Lockdown
    println(lockdown.id)
    return lockdown
}
cellForRowAtIndexPath method
let cell = tableView.dequeueReusableCellWithIdentifier("CellIdentifier", forIndexPath: indexPath) as UITableViewCell
let photoDetails = photos[indexPath.row]
cell.textLabel.text = photoDetails.name as String
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0)) {
    var myPathList: NSArray = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.CachesDirectory, NSSearchPathDomainMask.UserDomainMask, true)
    var myPath = myPathList[0] as String
    myPath = myPath.stringByAppendingPathComponent("\(photoDetails.name).png")
    var image: UIImage = UIImage(contentsOfFile: myPath)!
    dispatch_async(dispatch_get_main_queue()) {
        cell.imageView.image = image
    }
}
return cell
Any suggestions as to what is causing the problem?
You shouldn't load the image from disk every time; it's slow. Instead, load the image from disk the first time and then store it in a cache (in memory). Check whether the image is cached before loading it from disk.
You need to empty the cache if it gets too big or you get a memory warning.
Advanced versions can pre-cache some images for the cells that are just below the current scroll position.
You may be able to use a library for this, perhaps SDWebImage with file URLs (I haven't tried this).
The buggy part is because your async block doesn't check whether the cell has been reused before the image finishes loading. If it has, the wrong image will appear on the reused cell (until it's replaced by another image). To fix it, capture the indexPath in the block and get the cell for that indexPath once the image is loaded; if the table view returns nil, the cell is no longer visible and you don't need to do anything other than cache the image for future use.
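Putting those suggestions together, here is a minimal sketch of the cache plus reuse check, written in current Swift syntax (the question's code uses an older Swift/GCD style). The PhotoRecord type, the photos array, and the "CellIdentifier" reuse identifier stand in for the corresponding pieces of the code above:

import UIKit

final class PhotoListViewController: UITableViewController {

    struct PhotoRecord {
        let name: String
    }

    var photos: [PhotoRecord] = []

    // In-memory cache; NSCache evicts entries on its own under memory pressure.
    private let imageCache = NSCache<NSString, UIImage>()

    override func tableView(_ tableView: UITableView,
                            cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "CellIdentifier", for: indexPath)
        let photo = photos[indexPath.row]
        cell.textLabel?.text = photo.name

        if let cached = imageCache.object(forKey: photo.name as NSString) {
            // Cache hit: no disk access at all.
            cell.imageView?.image = cached
        } else {
            cell.imageView?.image = nil
            DispatchQueue.global(qos: .userInitiated).async { [weak self] in
                // Load from the Caches directory off the main thread.
                let cachesDir = FileManager.default.urls(for: .cachesDirectory,
                                                         in: .userDomainMask)[0]
                let fileURL = cachesDir.appendingPathComponent("\(photo.name).png")
                guard let image = UIImage(contentsOfFile: fileURL.path) else { return }
                self?.imageCache.setObject(image, forKey: photo.name as NSString)
                DispatchQueue.main.async {
                    // Re-fetch the cell for this index path. If it is no longer
                    // visible (nil) or was reused for another row, do nothing;
                    // the image stays in the cache for the next pass.
                    if let visibleCell = tableView.cellForRow(at: indexPath) {
                        visibleCell.imageView?.image = image
                        visibleCell.setNeedsLayout()
                    }
                }
            }
        }
        return cell
    }
}

NSCache covers the "empty the cache if it gets too big or you get a memory warning" part automatically, and the cellForRow(at:) check covers the reuse problem described in the last paragraph.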
