Cosmos DB apply changes to existing data

I'm using Cosmos DB in an attempt to keep a Web App as cheap / free as possible. I'm not very familiar with it.
I've added a bunch of data, approximately 200 rows in a table called Members. I then added more fields, in particular this one:
public bool ArchiveMember { get; set; }
Any new Members I add have the ArchiveMember field, but existing data doesn't include the new field (set to false) as I expected.
Is there a way of applying migrations to all data?

Thank you rickvdbosch. Posting your comment as an answer to help other community members with this similar issue.
"You should update the data yourself using a script or tool. It might be simpler to have the entity have a default value for the ArchiveMember property instead of updating all data. You could also take a look at Table Storage which is a feature of Storage Accounts. The API is also supported by Cosmos DB, enabling you to start with a storage account and migrate over if requirements or performance change."

Related

How to get both before and after updated records in azure data factory

I have a scenario where, if a record is updated, I need both the before-update value and the currently updated value in Azure Data Factory, and to pass them to a function app. Since CDC is not supported in Azure SQL, I cannot get history records. Kindly suggest a better solution.
I'm afraid Data Factory can't get both the value before the update and the currently updated value. Data Factory is more focused on data transfer and some data conversion.
You may need to build your own code and design new logic to achieve that: before updating the record, query first and store the data; then update the record and query again to get the current data.
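As an alternative to querying twice, if the updates are issued from your own code, T-SQL's OUTPUT clause can return the before (deleted.*) and after (inserted.*) values in a single statement. Below is a hedged sketch using Microsoft.Data.SqlClient; the table, columns, and connection string are hypothetical.

using System;
using Microsoft.Data.SqlClient;

class BeforeAfterUpdate
{
    static void Main()
    {
        using var conn = new SqlConnection("<azure-sql-connection-string>"); // assumption
        conn.Open();

        // OUTPUT returns the pre-update (deleted.*) and post-update (inserted.*) values.
        using var cmd = new SqlCommand(
            @"UPDATE dbo.Orders
              SET Status = @newStatus
              OUTPUT deleted.Status AS BeforeValue, inserted.Status AS AfterValue
              WHERE OrderId = @id", conn);
        cmd.Parameters.AddWithValue("@newStatus", "Shipped");
        cmd.Parameters.AddWithValue("@id", 42);

        using SqlDataReader reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            Console.WriteLine($"before: {reader["BeforeValue"]}, after: {reader["AfterValue"]}");
        }
        // Both values could now be posted to the function app.
    }
}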

Logic App to push data from Cosmosdb into CRM and perform an update

I have created a logic app with the goal of pulling data from a container within cosmosdb (with a query), looping over the results and then pushing this data into CRM (or Common Data Service). When the data is pushed to CRM, an ID will be generated. I wish to then update cosmosdb with this new ID. Here is what I have so far:
This next step queries the data within our Cosmos DB database and selects all IDs with a length greater than 15. (This tells us that the ID is not yet within the CRM database.)
Then we loop over the results and push this into CRM (Dynamics365 or the Common Data Service)
Dilemma: The first part of this process appears to be correct, however, I want to make sure that I am on the right track with this. Furthermore, once the data is successfully pushed to CRM, CRM automatically generates an ID for each record. How would I then update cosmosDB with the newly generated IDs?
Any suggestion is appreciated
Thanks
I see a red flag in your approach here with this query: length(c.id) > 15. This is not something I would do. I don't know how big your database is going to be, but it's generally not very performant to run high volumes of cross-partition queries, especially if the database is going to keep growing.
Cosmos DB already provides an awesome streaming capability, so rather than doing this in a batch I would use Change Feed to accomplish whatever you're doing here in your Logic App. This will likely give you better control of the process and allow you to get the ID back out of your CRM app to insert back into Cosmos DB.
Because you will be writing back to Cosmos DB, you will need a flag to ignore the update in Change Feed when the item is updated.
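A minimal sketch of that Change Feed approach with the .NET SDK's change feed processor is below, including the ignore flag just mentioned. The container names, the crmSynced flag, and the PushToCrmAsync call are all assumptions standing in for your actual CRM integration.

using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Newtonsoft.Json.Linq;

class CrmSyncProcessor
{
    static Container monitored;

    static async Task Main()
    {
        using var client = new CosmosClient("<connection-string>");    // assumption
        monitored = client.GetContainer("MyDatabase", "Items");        // names assumed
        Container leases = client.GetContainer("MyDatabase", "leases");

        ChangeFeedProcessor processor = monitored
            .GetChangeFeedProcessorBuilder<JObject>("crm-sync", HandleChangesAsync)
            .WithInstanceName("worker-1")
            .WithLeaseContainer(leases)
            .Build();

        await processor.StartAsync();
        Console.ReadLine(); // run until a key is pressed
        await processor.StopAsync();
    }

    static async Task HandleChangesAsync(
        IReadOnlyCollection<JObject> changes, CancellationToken token)
    {
        foreach (JObject doc in changes)
        {
            // The flag mentioned above: skip items we already wrote back,
            // so our own update does not re-trigger the CRM push.
            if (doc.Value<bool?>("crmSynced") == true) continue;

            string crmId = await PushToCrmAsync(doc); // hypothetical CRM call

            doc["crmId"] = crmId;    // store the newly generated CRM ID
            doc["crmSynced"] = true;
            await monitored.UpsertItemAsync(doc);
        }
    }

    static Task<string> PushToCrmAsync(JObject doc) =>
        Task.FromResult(Guid.NewGuid().ToString()); // stand-in for the real CRM call
}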

Copy documents from one DocumentCollection to another?

In my Azure Cosmos DB account, which I use with the Gremlin API, there is one database called graphdb with several DocumentCollections.
I would like to copy a selected set of Vertices and Edges from one collection (graphdb) to another (Tintin).
I managed to do this by transferring all data via the client, but it would be much easier if the data stayed in Azure. So I tried some SQL in the Azure portal, like:
SELECT *
INTO Tintin
FROM graphdb;
However, this seems unsupported.
Currently you cannot join multiple collections, and your query violates this rule.
But +1 for your idea; you should post it on https://feedback.azure.com/
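For reference, the client-side transfer the question mentions can be fairly compact. The sketch below assumes the graph's documents can also be read with the Core (SQL) API client, which typically works for Gremlin accounts since the graph is stored as JSON documents; the connection string and the label filter are assumptions.

using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

class CopyGraphDocuments
{
    static async Task Main()
    {
        using var client = new CosmosClient("<connection-string>"); // assumption
        Container source = client.GetContainer("graphdb", "graphdb");
        Container target = client.GetContainer("graphdb", "Tintin");

        // The filter selecting "a selected set of Vertices and Edges" is assumed.
        var query = new QueryDefinition("SELECT * FROM c WHERE c.label = 'person'");

        using FeedIterator<dynamic> it = source.GetItemQueryIterator<dynamic>(query);
        while (it.HasMoreResults)
        {
            foreach (dynamic doc in await it.ReadNextAsync())
            {
                await target.UpsertItemAsync<dynamic>(doc);
            }
        }
    }
}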

Syncing Problems with Xamarin Forms and Azure Easy Tables

I've been working on a Xamarin.Forms application in Visual Studio using Azure for the backend for a while now, and I've come across a really strange issue.
Please note that I am following the methods mentioned in this blog.
For some strange reason the PullAsync() method seems to have some bizarre problems. Any data that I create and sync will only be pulled by PullAsync() from that solution. What I mean is that if I create another solution that accesses the exact same backend, it can create and sync its own data, but will not bring over the data generated by the other solution, even though they both seem to have the exact same access. This appears to be some kind of security feature/issue, but I can't quite make sense of it.
Has anyone else encountered this at all? Was there a work-around at all? This could potentially cause problems down the road if I were to ever want to create another solution that accesses the same system/data for whatever reason.
For some strange reason the PullAsync() method seems to have some bizarre problems. Any data that I create and sync will only be pulled by PullAsync() from that solution.
According to the tutorial you provided, I found that the related PullAsync is using Incremental Sync.
await coffeeTable.PullAsync("allCoffees", coffeeTable.CreateQuery());
Incremental Sync:
the first parameter to the pull operation is a query name that is used only on the client. If you use a non-null query name, the Azure Mobile SDK performs an incremental sync. Each time a pull operation returns a set of results, the latest updatedAt timestamp from that result set is stored in the SDK local system tables. Subsequent pull operations retrieve only records after that timestamp.
Here is my test, you could refer to it for a better understanding of Incremental Sync:
Client : await todoTable.PullAsync("todoItems-02", todoTable.CreateQuery());
The client SDK checks whether there is a record whose id equals deltaToken|{table-name}|{query-id} in the __config table of your SQLite local store.
If there is no such record, the SDK sends the following request to pull your records:
https://{your-mobileapp-name}.azurewebsites.net/tables/TodoItem?$filter=(updatedAt%20ge%20datetimeoffset'1970-01-01T00%3A00%3A00.0000000%2B00%3A00')&$orderby=updatedAt&$skip=0&$top=50&__includeDeleted=true
Note: the $filter would be set as (updatedAt ge datetimeoffset'1970-01-01T00:00:00.0000000+00:00')
If there is such a record, the SDK picks up its value as the latest updatedAt timestamp and sends a request as follows:
https://{your-mobileapp-name}.azurewebsites.net/tables/TodoItem?$filter=(updatedAt%20ge%20datetimeoffset'2017-06-26T02%3A44%3A25.3940000%2B00%3A00')&$orderby=updatedAt&$skip=0&$top=50&__includeDeleted=true
Per my understanding, if you run the same logical query with the same (non-null) query ID in different mobile clients, you need to make sure the local database is newly created by each client. Also, if you want to opt out of incremental sync, pass null as the query ID. In this case, all records are retrieved on every call to PullAsync, which is potentially inefficient. For more details, you could refer to How offline synchronization works.
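To illustrate the opt-out just described, a full (non-incremental) pull simply passes null as the query ID:

await todoTable.PullAsync(null, todoTable.CreateQuery()); // no query ID: full pull every time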
Additionally, you could leverage Fiddler to capture the network traces when you invoke PullAsync, in order to troubleshoot your issue.

Azure Mobile Services and Code First Migrations update

I have created an Azure Mobile Service project. From the beginning of the project I created my entities and enabled Code First Migrations. During the development process I never had any problem creating new entities, modifying existing ones and updating the database through data migrations. All sweet and nice.
I published my solution to Azure Mobile Services. My database schema was created automatically and everything was playing nice.
After a few days I needed to update a field in a table. So I updated the entity locally and ran the service locally. My local version of the database was updated with my new addition. I uploaded the service to Azure and expected my online database to be updated as well. But I get this error:
The model backing the 'xxxxx' context has changed since the database was created. Consider using Code First Migrations to update the database.
That is strange, since Code First Migrations are already enabled; my database was initially created using them. After many days of trying almost everything, I deleted the schema of the online database. I ran the service online again and it recreated the database schema with the last change I made. So I figured out that the Azure Mobile Service has no problem creating the schema from scratch but cannot work out how to apply schema updates.
I do not recommend this as an answer (so please don't accept it as such), but I ended up so frustrated with the code-first migrations (which, like you, I just could not get to work) that I did this as a work-around while I await enlightenment.
1) Update the data model
For me this was simply adding this line to my Item class:
public bool IsPublic { get; set; }
2) Manually update the SQL server
You'll find the connection details in the publish profile you can download from the mobile service's dashboard in the Azure Portal. My T-SQL command was simply:
ALTER TABLE vcollectapi.Items
ADD IsPublic BIT NOT NULL DEFAULT(0)
3) Stop the service checking whether the model backing the context has changed since the last successful migration
There are several answers on how to do this, I followed this one and added the following static constructor to my data context class VCollectAPIContext
static VCollectAPIContext()
{
    Database.SetInitializer<VCollectAPIContext>(null);
}
Now my service is back up-and-running and my data remained intact.
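For completeness, a hedged alternative to step 3: EF6's MigrateDatabaseToLatestVersion initializer applies any pending Code First migrations on startup instead of disabling model checks entirely. Migrations.Configuration below is the class that enable-migrations generates; adjust the namespace to your project.

using System.Data.Entity;

// Replaces the null initializer from step 3; applies pending migrations on startup.
static VCollectAPIContext()
{
    Database.SetInitializer(
        new MigrateDatabaseToLatestVersion<VCollectAPIContext, Migrations.Configuration>());
}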
