iCloud with Core Data

This question was asked a year ago, but I have a specific question about online/offline syncing.
Devices A and B are both offline, and you make independent changes to the model on each. Let's say you connect device A first, so its changes sync to the cloud.
Now you connect device B; its changes need to be merged with the changes already in the cloud (not simply replace the cloud, because it has changes as well).
Does iCloud take care of this?

That's the idea: Core Data works with iCloud and merges any changes, resolving conflicts by choosing a winner. You don't get to help choose the winner, but you do get notified if/when new changes are available to your app.
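For example, the notification hookup looks roughly like this; a sketch, assuming an already-configured iCloud-backed Core Data stack (coordinator and context are whatever your app has already created):

```swift
import CoreData

// Sketch: fold iCloud-imported changes into a live context. `coordinator`
// and `context` come from your existing (iCloud-enabled) Core Data stack.
func observeUbiquitousChanges(coordinator: NSPersistentStoreCoordinator,
                              context: NSManagedObjectContext) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: .NSPersistentStoreDidImportUbiquitousContentChanges,
        object: coordinator,
        queue: .main
    ) { note in
        // By the time this fires, Core Data has already picked the winner
        // and written it to the store; merging just updates the context.
        context.perform {
            context.mergeChanges(fromContextDidSave: note)
        }
    }
}
```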

Related

Is CouchDB/PouchDB a viable solution for my project? Any advice is welcome

I have been reading up a lot about CouchDB (and PouchDB) and am still unsure what the best option would be for a project of mine.
I do have a possible way to solve the project in my head based on what I have read so far, but I am unsure about things like performance and would love to get some insights. Or perhaps there's a better place to ask this question? Please let me know if that's the case! (Already tried their IRC channel and the mailing list, but no answers there as of yet)
So the project is basically an 'offline-first' mobile application. The users are device installers. They get assigned a few locations and devices to install every day. They need to walk around buildings and update the data (e.g. device X has been installed at location Y, or property A of device B at location C has been changed to D, etc.)
Some more info about the basic data:
There are users, they are the device installers. They need to log into the app.
There are locations, all the places that the device installers need to visit.
There are devices, all the different devices that can be installed by the users.
There are todos, basically a planned installation for a specific user at a specific location for specific devices.
Of course I have tried to simplify the data, but this should contain the gist.
Now, these are important characteristics of the application:
Users, locations and devices can be changed by an administrator (back-end software).
Todos can be planned by an administrator (back-end software).
App user (device installer) only sees his/her own todos/planning for today + 1 week ahead.
Multiple app users (device installers) might be assigned to the same location and/or todos, because for a big building there might be multiple installers at work.
Automatic synchronization between the data in each app in use and the global database.
Secure, it should only be possible for user X to request his/her own todos/planning.
Taking into account these characteristics, I currently have the following in mind:
One global 'master' database containing all users, locations, devices, and todos.
Filtered replication/sync using a selector object which, for every user, replicates only the data that this specific user may access (see the sketch after this list).
An Ionic application using PouchDB which does full/normal replication/sync with the user's own database.
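For concreteness, this is the kind of server-side filtered replication I have in mind, sketched here against CouchDB's plain HTTP API. The server URL, credentials, database names, and the assigned_to field are placeholders, not my real schema:

```swift
import Foundation

// Create a continuous, selector-filtered replication from the global
// 'master' database into one user's database by storing a document in
// CouchDB's _replicator database. All names/credentials are placeholders.
let replicationDoc: [String: Any] = [
    "_id": "master-to-userdb-alice",
    "source": "http://localhost:5984/master",
    "target": "http://localhost:5984/userdb-alice",
    "continuous": true,
    // CouchDB 2.x: replicate only documents matching this Mango selector
    // (much cheaper than a JavaScript filter function).
    "selector": ["assigned_to": "alice"]
]

var request = URLRequest(url: URL(string: "http://localhost:5984/_replicator")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
let auth = Data("admin:secret".utf8).base64EncodedString()
request.setValue("Basic \(auth)", forHTTPHeaderField: "Authorization")
request.httpBody = try! JSONSerialization.data(withJSONObject: replicationDoc)

URLSession.shared.dataTask(with: request) { data, _, error in
    // CouchDB answers {"ok":true,...} once the replication document is stored.
    if let data = data { print(String(data: data, encoding: .utf8) ?? "") }
    if let error = error { print("Failed to create replication: \(error)") }
}.resume()
```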
Am I correct in assuming the following?
The user of the application using PouchDB will have full read access to his/her own user database, which has been filtered server-side?
For updating data I can make use of validate_doc_update to check whether the user may or may not modify something?
Any changes done on the PouchDB database will be replicated to the 'user' database?
These changes will then also be replicated from the 'user' database to the global 'master' database?
Any changes done on the global 'master' database will be replicated to the 'user' database, but only if required (only if there have been new/changed(/deleted) documents for this user)?
These changes will then also be replicated from the 'user' database to the PouchDB database for the mobile app?
If all this holds true, then it might be a good fit for this project. At least I think so? (Correct me if I'm wrong!) But I did read about 'performance' problems regarding filtered replication. Suppose there are hundreds of users (device installers); there aren't that many right now, but there might be in the future. Would it be a problem to have this filtered replication running for hundreds of 'user' databases?
I did read that CouchDB 2.0 and 2.1 support a selector object for filtered replication instead of the usual JS map/reduce filter, which is supposed to be up to 10x faster. But my question still stands: does this work well, even for hundreds (or even thousands) of 'filtered' databases? I don't know enough about the underlying algorithms and limitations, but I am wondering whether a change to the global 'master' database requires expensive calculations to decide which 'filtered' databases to replicate to. And if it does... does it matter in practice?
Please, any advice would be welcome. I did also consider using other databases. My first approach would actually have been to use a relational database, but one of the required characteristics of this app is real-time synchronization. In the past I have handled this myself using revision fields in an RDBMS and a lot of code, but I would really prefer something as elegant as CouchDB/PouchDB for the synchronization. This is really an area that would save me a lot of headaches. Keeping this in mind, what are my options? Am I on the right path, or could performance become an issue down the road?
Note that I have also thought about having a separate database for each user ('one database per user'), but I think it might not be the best fit for this project, because some todos might be assigned to multiple users, and when one user updates something for a todo, it must be updated for the other users as well.
Hopefully some CouchDB experts can shed some light on my questions. Much appreciated!
I understand there might be some debate but I am only interested in the facts and expertise of others.

When does tvOS purge user data

Does anyone know what events will cause tvOS to purge temp user data on Apple TV? I'm assuming it will be things like when it needs space to cache movies and other iTunes content. Does this also apply to streaming offline content (i.e. streaming things from local servers in a non-internet-connected environment)?
I have a client who wants an Apple TV version of their app. Data is currently stored using Core Data, and I will be using CloudKit to sync that data with iCloud. However, the problem is that my client may not have the Apple TV connected to the internet at all times. My concern is that they may make a bunch of entries (small data, a few MBs at most) while the unit is offline. If they never connect the unit to the internet so it can sync with iCloud, is it still possible that the OS will purge that user data?
If it may still be purged, what are the other options? I know we only get 500KB of NSUserDefaults storage, which is not going to be enough space for my needs.
My assumption is that if the Apple TV is not connected to the internet and not downloading or streaming content from iTunes, the OS will have no need to purge anything and the user data should be safe.
Can anyone comment on this, or point to some documentation in some magical hidden location on the Apple Developer portal?

iCloud-CoreData resolve conflicts

I'm using the Ulysses application for iPad. It uses iCloud as its sync system. In case of conflicts, the app shows a popover with descriptions of the devices involved.
In particular:
Name of the device (e.g. Matteo's Macbook Pro)
Time stamps (e.g. 22nd March 2015 9:34)
After choosing the right version of the note, it's then possible to complete the synchronization.
I've already set up the iCloud stack to handle the synchronization, and it works pretty well, but I can't figure out how to retrieve that kind of information in case of conflicts.
Any suggestions?
Listen for the NSPersistentStoreDidImportUbiquitousContentChangesNotification and, rather than just calling mergeChangesFromContextDidSaveNotification:, first examine the two versions.
This can be done by retrieving the userInfo dictionary of the notification, which should contain the NSManagedObjectIDs of the changed objects under the NSUpdatedObjectsKey.
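Sketched out, that looks something like this. Note that the notification only hands you object IDs; the Ulysses-style device names and timestamps are not provided by Core Data, so you would have to store and surface that metadata yourself:

```swift
import CoreData

// Sketch: inspect what changed before merging. Unlike a context-did-save
// notification, the userInfo here contains NSManagedObjectIDs, not objects.
func handleImport(_ note: Notification, context: NSManagedObjectContext) {
    if let updatedIDs = note.userInfo?[NSUpdatedObjectsKey] as? Set<NSManagedObjectID> {
        for objectID in updatedIDs {
            // The context typically still holds the local version of this
            // object, while the store already has the imported version.
            if let localVersion = try? context.existingObject(with: objectID) {
                // Record whatever you want to show in a conflict picker
                // before the merge below overwrites the in-memory values.
                print("Changed object:", localVersion.objectID)
            }
        }
    }
    context.perform {
        context.mergeChanges(fromContextDidSave: note)
    }
}
```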

Share UILocalNotification between devices

I'm developing a ToDo app, and I'm planning to use UILocalNotification. I have integrated Core Data with iCloud support.
I have searched a lot on Google for how to share notifications between devices, but I can't find anything.
So I hope some of you out there can help a new Swift developer here. My questions are:
1: Can I share notifications between devices without using a server to send push notifications?
2: How many UILocalNotifications can I have on a device?
//Kim
If you are using Core Data with iCloud to keep the data in sync between devices, you can somewhat achieve what you're after. What you would need to do is detect when new changes are imported from iCloud and refresh the notifications at that point (see the sketch after this list). I have used that approach successfully before, but it does come with a few caveats. In particular:
- Without storing additional information, the same notification will be shown on multiple devices.
- The synchronization of the notifications relies on the iCloud data being updated. This means that without a means to refresh the iCloud data in the background, other devices will not be in sync. It may be possible to work around this using an extension or background services, but I'm not too familiar with them.
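Roughly, the refresh step could look like the sketch below. The entity and attribute names (ReminderItem, dueDate, title) are placeholders for your own model. This also touches your second question: iOS keeps at most 64 scheduled local notifications per app (the soonest-firing ones), so don't schedule more than that.

```swift
import UIKit
import CoreData

// Sketch: call this after merging iCloud changes into `context`.
func refreshLocalNotifications(context: NSManagedObjectContext) {
    // Throw away notifications scheduled from the now-stale data...
    UIApplication.shared.cancelAllLocalNotifications()

    // ...and reschedule from the freshly merged Core Data contents.
    // "ReminderItem", "dueDate" and "title" are placeholder names.
    let request = NSFetchRequest<NSManagedObject>(entityName: "ReminderItem")
    request.predicate = NSPredicate(format: "dueDate > %@", NSDate())
    request.sortDescriptors = [NSSortDescriptor(key: "dueDate", ascending: true)]
    request.fetchLimit = 64   // iOS only keeps the soonest-firing 64

    for item in (try? context.fetch(request)) ?? [] {
        let notification = UILocalNotification()
        notification.fireDate = item.value(forKey: "dueDate") as? Date
        notification.alertBody = item.value(forKey: "title") as? String
        UIApplication.shared.scheduleLocalNotification(notification)
    }
}
```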

GitHub and Source Code Protection and Control [duplicate]

This question already has answers here:
How do you protect your software from illegal distribution? [closed]
(22 answers)
Closed 5 years ago.
I am working in a small startup organization with approximately 12-15 developers. We recently had an issue with one of our servers whereby the entire server was "re-provisioned", i.e. completely wiped of all the code, databases, and information on it. Our hosting company informed us that only someone with access to the server account could have done something like this, and we believe that it may have been a disgruntled employee (we have recently had to downsize). We had local backups of some of the data but are still trying to recover from the data loss.
My question is this: we have recently begun using GitHub to manage source control on some of our other projects, and we have more than a few private repositories. Is there any way to ensure that there is some sort of protection of our source code? What I mean by this is that I am aware that you can delete an entire project on GitHub, or even a series of code updates. I would like to prevent this from happening.
What I would like to do is create (perhaps in a separate repository) a complete replica of the project in Git, and ensure that only a single individual has access to this replicated project. That way, if the original project is corrupted or destroyed for any reason, we can restore to where we were (with history intact) from the backup repository.
Is this possible? What is the best way to do this? GitHub has recently introduced "Company" accounts... is that the way to go?
Any help on this situation would be greatly appreciated.
Cheers!
Well, if a disgruntled employee leaves, you can easily remove them from all your repositories, especially if you are using Organizations: you just remove them from a team. In the event that someone who still had access for some reason maliciously deletes a repository, we keep daily backups of all repositories and will reconstitute one if you ask. So at worst you would lose no more than one day of code work, and likely someone on the team will have a copy with that code anyhow. If you need more protection than that, then yes, you can set up a cron'd fetch or something similar that mirrors your code more often.
First, you should really consult GitHub support -- only they can tell you how they do backups, what options for permission control they have (especially now that they have introduced "organizations"), etc. Also, you have an agreement with them -- do read it.
Second, it's still very easy to do git fetch by cron, say, once an hour (on your local machine or on your server) -- and you're pretty safe.
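For instance, something along these lines run hourly from cron (a sketch; the repository URL and backup path are placeholders, and a two-line shell script with git clone --mirror plus git remote update does the same job):

```swift
import Foundation

// Sketch of a cron'd mirror backup: keep a bare --mirror clone of the
// repository and refresh it on every run.
func git(_ arguments: [String], in directory: String? = nil) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/env")
    process.arguments = ["git"] + arguments
    if let directory = directory {
        process.currentDirectoryURL = URL(fileURLWithPath: directory)
    }
    try process.run()
    process.waitUntilExit()
}

let remote = "git@github.com:example/project.git"  // placeholder repository
let mirror = "/backups/project.git"                // placeholder backup path

if FileManager.default.fileExists(atPath: mirror) {
    // Subsequent runs: fetch every new branch, tag, and commit.
    try git(["remote", "update", "--prune"], in: mirror)
} else {
    // First run: a bare --mirror clone copies all refs, with full history.
    try git(["clone", "--mirror", remote, mirror])
}
```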
Git is a distributed system, so your local copy is the same as your remote copy on GitHub! You should be OK to push it back up there.
