I have an old Core Data model version with Allows External Storage checked on a Binary Data attribute, and I have created a new version where Allows External Storage is unchecked. When I tried to launch my app I got an error:
Unresolved error
Error Domain=NSCocoaErrorDomain Code=134140 "The operation couldn’t be
completed. (Cocoa error 134140.)" UserInfo=0xbd5cd20 {reason=Can't
find or automatically infer mapping model for migration,
destinationModel=...
Then I created a new mapping model file in my project, Model.xcmappingmodel, and selected the source and target data models.
After that I see:
NSPersistentStoreCoordinator error: NSPersistentStoreCoordinator
_coordinator_you_never_successfully_opened_the_database_schema_mismatch
For future context ... writing to Core Data's external storage is broken as of iOS 12. My solution was based on Drew McCormack's comments on Twitter.
It's worth noting that only writing is broken; reading remains functional.
Create a new property, e.g. «original property name»Internal, with the same type but with the Allows External Storage box unchecked.
You now have two options:
Migrate everything on the first launch of the app, i.e. fetch all objects with a predicate like «original property name»Internal == NULL && «original property name» != NULL and move the data across (I did this; see the sketch after this list).
Migrate on the fly, i.e. use the two properties in parallel but only write to «original property name»Internal (while nilling «original property name»).
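For option 1, a minimal sketch of the startup migration, assuming a hypothetical entity named Item whose Binary Data attribute payload has external storage enabled and whose replacement payloadInternal does not:

import CoreData

func migrateExternalStorage(in context: NSManagedObjectContext) throws {
    // Only fetch objects that still need migrating.
    let request = NSFetchRequest<NSManagedObject>(entityName: "Item")
    request.predicate = NSPredicate(format: "payloadInternal == NULL && payload != NULL")

    for object in try context.fetch(request) {
        // Copy the externally stored data into the new internal attribute...
        object.setValue(object.value(forKey: "payload"), forKey: "payloadInternal")
        // ...and nil out the old attribute so it is never written again.
        object.setValue(nil, forKey: "payload")
    }
    try context.save()
}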
Worth noting one last time that your customers' data is safe until they update it. Tread carefully, but you should be fine.
I am currently working on a project where I deploy multiple ARM templates, each deploying a VM and performing a few operations on it. I wanted to handle quota issues by calling template validation before triggering the first deployment. So I created a template that contains the logic to create the required VMs, and I use this template only for validation (to check that the quota will not be exceeded).
Since our code already has the ResourceManagementClient, I tried the following code:
Deployment parameters = new Deployment(
    new DeploymentProperties(DeploymentMode.Incremental)
    {
        Template = templateFile,
        Parameters = parameterFile,
    });
DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);
But when I try to access the Value property of the variable dp, I keep getting the following exception:
Generic Exception System.InvalidOperationException: The operation has
not completed yet. at Azure.Core.ArmOperationHelpers`1.get_Value()
at
Azure.ResourceManager.Resources.DeploymentsValidateOperation.get_Value()
at DeployTemplate.Program.d__3.MoveNext() in
\Program.cs:line 88
I even added a loop after the "StartValidate" call to wait until dp.HasCompleted is set to true, but this seems to run indefinitely. I also tried the "StartValidateAsync" method, which seems to have the same issue.
I wanted to understand whether I am using this method correctly, and whether there is a better way to do the template validation. I could not find any examples of this method's usage; if possible, please share any code snippet where this method is used, for my reference.
Note: since this is not working, I am currently testing the Fluent API approach instead. That seems to work, but it requires a lot of changes in our code, as it creates ambiguity with many classes in "Azure.ResourceManager.Resources" that are already used for other operations.
I found that even though the deployment operation's HasCompleted property is never set, calling dp.GetRawResponse() returns exactly the errors expected.
I now use this to validate my templates.
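A rough sketch of that approach, assuming the same preview Azure.ResourceManager.Resources client as in the question (groupName, templateFile and parameterFile are placeholders):

Deployment parameters = new Deployment(
    new DeploymentProperties(DeploymentMode.Incremental)
    {
        Template = templateFile,
        Parameters = parameterFile,
    });

DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);

// HasCompleted never flips to true here, so read the raw HTTP response instead.
Response raw = dp.GetRawResponse();
if (raw.Status == 200)
{
    Console.WriteLine("Template is valid.");
}
else if (raw.ContentStream != null)
{
    // The body contains the validation errors (e.g. quota violations).
    using (var reader = new StreamReader(raw.ContentStream))
    {
        Console.WriteLine("Validation failed ({0}): {1}", raw.Status, reader.ReadToEnd());
    }
}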
Following these guides https://developers.google.com/apps-script/guides/rest/quickstart/target-script and https://developers.google.com/apps-script/guides/rest/quickstart/nodejs, I am trying to use the Execution API in Node to return some data that is in a Google Spreadsheet.
I have set the script ID to be the Project Key of the Apps Script file. I have also verified that running the function in the Script Editor works successfully.
However, when running the script locally with node, I get this error:
The API returned an error: Error: ScriptError
I have also made sure the script is associated with the project that I use to authenticate with Google APIs.
Does anyone have any suggestions on what I can do to debug or fix this issue? The error is so generic that I am not sure where to look.
UPDATE: I've included a copy of the code in this JSBin (the year function is the entry point)
https://jsbin.com/zanefitasi/edit?js
UPDATE 2: The error seems to be caused by the inclusion of this line
var spreadsheet = SpreadsheetApp.open(DriveApp.getFileById(docID));
It seems that I didn't request the right scopes. The Node.js example includes 'https://www.googleapis.com/auth/drive', but I also needed to include 'https://www.googleapis.com/auth/spreadsheets' in the SCOPES array. The ScriptError message is not very informative here.
To find the scopes you need, go to the Script Editor > File > Project Properties > Scopes. Remember to delete the old credentials file ~/.credentials/old-credential.json so that the script will request new ones.
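For reference, the SCOPES array in the quickstart's Node.js code would become something like this (the exact list depends on which services your script uses):

var SCOPES = [
  'https://www.googleapis.com/auth/drive',        // from the Node.js quickstart
  'https://www.googleapis.com/auth/spreadsheets'  // needed for the SpreadsheetApp call
];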
EDIT: With the updated information I took a closer look and saw that you are returning a non-basic type; specifically, you are returning a Sheet object.
The basic types in Apps Script are similar to the basic types in
JavaScript: strings, arrays, objects, numbers and booleans. The
Execution API can only take and return values corresponding to these
basic types -- more complex Apps Script objects (like a Document or
Sheet) cannot be passed by the API.
https://developers.google.com/apps-script/guides/rest/api
In your Account "class":
this.report = spreadsheet.getSheetByName(data.reportSheet);
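One way around this, sketched with illustrative parameter names, is to keep the Sheet internal and have the entry point return only basic values, e.g. the cell contents:

// In the Apps Script entry point: return a 2-D array of cell values
// (strings/numbers/booleans), which the Execution API can serialize,
// instead of the Sheet object itself.
function year(docID, reportSheet) {
  var spreadsheet = SpreadsheetApp.open(DriveApp.getFileById(docID));
  var sheet = spreadsheet.getSheetByName(reportSheet);
  return sheet.getDataRange().getValues();
}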
Old answer:
'data.business_exp' will be null in this context. You need to load the data from somewhere. Every time a script is called, a new instance of the script is created, and at the end of the execution chain it is destroyed; any data stored in global objects will be lost. You need to save that data to a permanent location, such as the script/user properties, and reload it on each script execution.
https://developers.google.com/apps-script/reference/properties/
I'm using Web API with Entity Framework 4.2 and the Sybase ASE connector.
This was working without issues, returning JSON, until I tried to add a new table.
return db.car
    .Include("tires")
    .Include("tires.hub_caps")
    .Include("tires.hub_caps.colors")
    .Include("tires.hub_caps.sizes")
    .Include("tires.hub_caps.sizes.units")
    .Where(c => c.tires == 13);
The above works without issues if the following line is removed:
.Include("tires.hub_caps.colors")
However, when that line is included, I am given the error:
""An error occurred while preparing the command definition. See the inner exception for details."
The inner exception reads:
"InnerException = {"Specified method is not supported."}"
"source = Sybase.AdoNet4.AseClient"
The following also results in an error:
List<car> cars = db.car.AsNoTracking()
    .Include("tires")
    .Include("tires.hub_caps")
    .Include("tires.hub_caps.colors")
    .Include("tires.hub_caps.sizes")
    .Include("tires.hub_caps.sizes.units")
    .Where(c => c.tires == 13)
    .ToList();
The error is as follows:
An exception of type 'System.Data.EntityCommandCompilationException' occurred in System.Data.Entity.dll but was not handled in user code
Additional information: An error occurred while preparing the command definition. See the inner exception for details.
Inner exception: "Specified method is not supported."
This points to a fault with the Sybase ASE data connector.
I am using data annotations on all tables to control which fields are returned. On the colors table, I have tried the following annotations to limit the properties returned to just the key:
[JsonIgnore]
[IgnoreDataMember]
Any ideas what might be causing this issue?
Alternatively, if I keep colors in and remove
.Include("tires.hub_caps.sizes")
.Include("tires.hub_caps.sizes.units")
then it also works. It seems that the Sybase ASE connector does not support cases where an Include path forks from one object in two directions. Is there a way around this? The same issue occurs with Sybase ASE and the Progress data connector.
The issue does not occur in a standard ASP.NET MVC controller class; the problem is with serializing two one-to-many relationships on a single table to JSON.
This issue still occurs if lazy loading is turned on.
It seems to me that this is a bug in Sybase ASE that none of the connectors are able to work around.
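If you cannot avoid the fork in the model, one possible workaround is to split the query so each branch is loaded separately; a sketch, assuming the entities are change-tracked in a single context so EF's relationship fixup can merge the results:

// First query follows only the colors branch.
List<car> cars = db.car
    .Include("tires")
    .Include("tires.hub_caps")
    .Include("tires.hub_caps.colors")
    .Where(c => c.tires == 13)
    .ToList();

// Second query follows only the sizes branch. Because the same context
// tracks both result sets, relationship fixup stitches them into one
// object graph, so no single command has to fork.
db.car
    .Include("tires.hub_caps.sizes")
    .Include("tires.hub_caps.sizes.units")
    .Where(c => c.tires == 13)
    .ToList();

return cars;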
Given the following code:
listView.ItemsSource = App.azureClient.GetTable<SomeTable>().ToIncrementalLoadingCollection();
We get incremental loading without further changes.
But what if we modify the read.js server-side script, e.g. to use mssql to query another table instead? What happens to the incremental loading? I'm assuming it breaks; if so, what's needed to support it again?
And what if the query used the untyped version instead, e.g.
App.azureClient.GetTable("SomeTable").ReadAsync(...)
Could incremental loading somehow be supported in this case, or must it be done "by hand"?
Bonus points for insights on how Azure Mobile Services implements incremental loading between the server and the client.
The incremental loading collection works by sending the $top and $skip query parameters (those are also sent when you do a query using the .Take and .Skip methods on the table). So if you want to modify the read script to do something other than the default behavior, while still maintaining the ability to use that table with an incremental loading collection, you need to take those values into account.
To do that, you can ask for the query components, which will contain the values, as shown below:
function read(query, user, request) {
    var queryComponents = query.getComponents();
    console.log('query components: ', queryComponents); // useful to see all information
    var top = queryComponents.take;
    var skip = queryComponents.skip;
    // do whatever you want with those values, then call request.respond(...)
}
The way it's implemented at the client is by using a class which implements the ISupportIncrementalLoading interface. You can see it (and the full source code for the client SDKs) in the GitHub repository, or more specifically the MobileServiceIncrementalLoadingCollection class (the method is added as an extension in the MobileServiceIncrementalLoadingCollectionExtensions class).
And the untyped table does not have that method; as you can see in the extension class, it is only added to the typed version of the table.
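So for the untyped table it would have to be done by hand, which is not much code. A minimal sketch, assuming the read script still honors $top and $skip as described above (page size and collection handling are up to you):

// Page through an untyped table by sending $top and $skip ourselves,
// which is essentially what the incremental loading collection does.
var table = App.azureClient.GetTable("SomeTable");
const int pageSize = 50;
int skip = 0;
JArray page;
do
{
    JToken result = await table.ReadAsync(string.Format("$top={0}&$skip={1}", pageSize, skip));
    page = (JArray)result;
    foreach (JObject item in page)
    {
        // Append the item to your own observable collection here.
    }
    skip += pageSize;
} while (page.Count == pageSize);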
A few minutes ago I tried to create a timer job.
I added some properties like:
this.Properties.Add("fileName", fileName);
this.Properties.Add("username", new NetworkCredential("username", "passworD"));
After updating the job, I got a critical error in the Timer Job list of the Central Administration:
The platform does not know how to deserialize an object of type System.Net.NetworkCredential. The platform can deserialize primitive types such as strings, integers, and GUIDs; other SPPersistedObjects or SPAutoserializingObjects; or collections of any of the above. Consider redesigning your objects to store values in one of these supported formats, or contact your software vendor for support.
Now I'm unable to delete or retract the job with SPJobDefinition's Delete() method or other classes within the SP object model.
OK, I got it.
I deleted the corresponding object in the SharepointConfigDatabase.dbo.Objects table.
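For anyone hitting the same deserialization error: store only primitive values in Properties and rebuild the NetworkCredential at run time. A sketch with illustrative class and property names (the plain-text password only mirrors the original snippet; the Secure Store Service would be preferable in practice):

using System;
using System.Net;
using Microsoft.SharePoint.Administration;

public class MyTimerJob : SPJobDefinition
{
    public MyTimerJob() : base() { }

    public MyTimerJob(string name, SPWebApplication webApp, string fileName)
        : base(name, webApp, null, SPJobLockType.Job)
    {
        // Store only primitives, which the platform can deserialize.
        Properties.Add("fileName", fileName);
        Properties.Add("username", "username");
        Properties.Add("password", "passworD");
    }

    public override void Execute(Guid targetInstanceId)
    {
        // Rebuild the credential from the stored primitives at run time.
        var credential = new NetworkCredential(
            (string)Properties["username"],
            (string)Properties["password"]);
        // ... use credential here ...
    }
}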