Azure Data Factory - continue on conflict on insert

We are building a data migration pipeline using Azure Data Factory (ADF). We are transferring data from one Cosmos DB instance to another. We plan to enable dual writes, so that we write to both databases before the migration begins; that way, if any data point changes during the migration, both databases get the most up-to-date data. However, ADF only offers insert or upsert options. What we need is: on insert, if it hits a conflict, continue rather than failing the pipeline. Can anyone give any pointers on how to achieve that in ADF?
The other option would be to create our own custom tool using the Cosmos DB client libraries to transfer the data.

If you are doing a live migration, ADF is not the right tool to use, as it is intended for offline migrations. If you are migrating from one Cosmos DB account to another, your best option is to use the Cosmos DB Live Data Migrator.
That tool also provides dead-letter support, which is another requirement you have.
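If you do end up writing the custom tool mentioned in the question, the "continue on conflict" behaviour is easy to express against the Cosmos DB SDK directly. A minimal sketch, assuming the Microsoft.Azure.Cosmos v3 .NET SDK and an already-resolved target container (the document type below is a placeholder):

```csharp
using System.Net;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class MigrationCopy
{
    // Insert a document into the target container; if it already exists there
    // (written by the dual-write path), swallow the 409 Conflict and carry on.
    public static async Task InsertOrSkipAsync(Container target, MigratedDoc doc)
    {
        try
        {
            await target.CreateItemAsync(doc, new PartitionKey(doc.PartitionKey));
        }
        catch (CosmosException ex) when (ex.StatusCode == HttpStatusCode.Conflict)
        {
            // Already present in the target -- skip instead of failing the run.
        }
    }
}

// Placeholder document shape; replace with the real model being migrated.
public class MigratedDoc
{
    public string id { get; set; }
    public string PartitionKey { get; set; }
}
```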

Related

Is there a way to create a CosmosDb Snapshot and restore it using a C# Console app?

I searched everywhere and I couldn't find a single article that shows a way to create a script/console app that creates a snapshot of a CosmosDb database and restores it -- is this even possible?
Cosmos DB doesn't have the ability to snapshot a database. You'd need to create this on your own.
While "how" you accomplish this is a bit off-topic, as it's very broad, there are two built-in Azure approaches:
Change Feed. Cosmos DB has a Change Feed you may subscribe to in order to consume content from a container in a streaming fashion. By consuming the change feed, you can effectively re-create a container's data in another container (a sketch follows after these two options). There are several write-ups around this very topic.
Data Factory. You can copy content between containers via an Azure Data Factory pipeline (Cosmos DB is available as both a source and a sink for pipelines).
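For the Change Feed route, a minimal sketch of what the copy looks like, assuming the Microsoft.Azure.Cosmos v3 change feed processor, a small lease container for bookkeeping, and illustrative database/container names:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class ContainerCopy
{
    // Continuously replay writes from "source" into "backup" by subscribing to the
    // change feed. The "leases" container stores the processor's checkpoints.
    public static async Task StartCopyAsync(CosmosClient client)
    {
        Container source = client.GetContainer("mydb", "source");
        Container backup = client.GetContainer("mydb", "backup");
        Container leases = client.GetContainer("mydb", "leases");

        ChangeFeedProcessor processor = source
            .GetChangeFeedProcessorBuilder<dynamic>(
                "copy-to-backup",
                async (IReadOnlyCollection<dynamic> changes, CancellationToken ct) =>
                {
                    foreach (var doc in changes)
                    {
                        // Upsert so re-processing the same change is harmless.
                        await backup.UpsertItemAsync(doc);
                    }
                })
            .WithInstanceName("backup-worker-1")
            .WithLeaseContainer(leases)
            .Build();

        await processor.StartAsync();
    }
}
```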

On-premises databases to Azure SQL Database with continuous sync

My requirements are as below:
Move 3 local SAP databases to 3 Azure SQL databases.
Then sync daily transactions or data to Azure every night. If a local transaction already exists in Azure, it should be updated; if not, it should be inserted.
The local systems will not stop after moving to Azure; they will keep running for about 6 months.
Note:
We cannot use the Azure Data Sync process because of its limitations: it supports only 500 tables, cannot sync tables without primary keys, and does not sync views or procedures. It also increases the database size on both sides (local and Azure).
An Azure Data Factory pipeline can fulfill my requirements, but I would have to create a pipeline and procedure manually for each table (SAP has over 2,000 tables, so that is not practical for me).
We don't use Azure VMs or Managed Instance.
Can you guide me to the best solution to move and sync? I am new to Azure.
Thanks all.
Since you mentioned that ADF basically meets your needs, I will start from ADF. Actually, you don't need to create each table's pipeline manually one by one: the creation can be done with the ADF SDK, a PowerShell script, or the REST API. Please refer to the official documentation: https://learn.microsoft.com/en-us/azure/data-factory/
So, if you can get the list of SAP table names (I found this thread: https://answers.sap.com/questions/7375575/how-to-get-all-the-table-names.html), you could loop over the list and run code to create the pipelines in a batch; only the table name property needs to change per pipeline. A rough sketch of that loop is shown below.
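The sketch drives the ADF REST API from C#. It assumes, purely for illustration, that parameterized source/sink datasets "SapTableDataset" and "AzureSqlTableDataset" taking a "tableName" parameter already exist in the factory, and that the caller has rights on the factory; all names are placeholders:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Azure.Core;
using Azure.Identity;

class CreateSapCopyPipelines
{
    static async Task Main()
    {
        string subscriptionId = "<subscription-id>";
        string resourceGroup  = "<resource-group>";
        string factoryName    = "<data-factory-name>";
        string[] tableNames   = { "MARA", "VBAK", "KNA1" };   // built from the SAP table list

        // Acquire an ARM token (DefaultAzureCredential picks up az login, managed identity, etc.).
        var credential = new DefaultAzureCredential();
        AccessToken token = await credential.GetTokenAsync(
            new TokenRequestContext(new[] { "https://management.azure.com/.default" }));

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", token.Token);

        foreach (string table in tableNames)
        {
            string pipelineName = $"CopySap_{table}";
            string url = $"https://management.azure.com/subscriptions/{subscriptionId}" +
                         $"/resourceGroups/{resourceGroup}/providers/Microsoft.DataFactory" +
                         $"/factories/{factoryName}/pipelines/{pipelineName}?api-version=2018-06-01";

            // One copy activity per pipeline; only the table name changes between iterations.
            string body = $@"{{
              ""properties"": {{
                ""activities"": [{{
                  ""name"": ""Copy_{table}"",
                  ""type"": ""Copy"",
                  ""inputs"":  [{{ ""referenceName"": ""SapTableDataset"", ""type"": ""DatasetReference"",
                                   ""parameters"": {{ ""tableName"": ""{table}"" }} }}],
                  ""outputs"": [{{ ""referenceName"": ""AzureSqlTableDataset"", ""type"": ""DatasetReference"",
                                   ""parameters"": {{ ""tableName"": ""{table}"" }} }}],
                  ""typeProperties"": {{ ""source"": {{ ""type"": ""SapTableSource"" }},
                                         ""sink"":   {{ ""type"": ""AzureSqlSink"" }} }}
                }}]
              }}
            }}";

            var response = await http.PutAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
            Console.WriteLine($"{pipelineName}: {response.StatusCode}");
        }
    }
}
```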

Cosmos DB data migration

I want to implement my own backup mechanism for Cosmos DB. To do that, I just want to grab the data every x hours and put it into some other storage account or a different Cosmos DB instance.
Since I can't use Data Factory (not available in my region), is there any other easy way to get data out of Cosmos and put it somewhere else?
The first thing that comes to mind is just some SQL queries that would go through all collections and copy them. Is there an easier way?
Since you can't use Data Factory (which would probably be the most suitable option for you), I suggest the two solutions below:
1. Azure timer-triggered Function.
It supports CRON expressions, so you could query the data and copy it into the target collection via the Cosmos DB SDK (see the sketch after this list). However, please note that Azure Functions have an execution time limitation.
2. Azure Cosmos DB Data Migration Tool.
The tool can be executed from the command line, so you could package the commands into a .bat file and run it with a Windows scheduled task. Alternatively, you could use an Azure WebJob to implement the same requirement.
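For option 1, a minimal sketch of the timer-triggered copy, assuming the in-process Azure Functions model, the Microsoft.Azure.Cosmos v3 SDK, an app setting "CosmosConnection", and illustrative database/container names:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class CosmosBackupFunction
{
    private static readonly CosmosClient client =
        new CosmosClient(Environment.GetEnvironmentVariable("CosmosConnection"));

    // Runs every 6 hours (CRON "0 0 */6 * * *") and copies the source container
    // into a backup container. Watch the Consumption-plan execution time limit
    // if the container is large.
    [FunctionName("CosmosBackup")]
    public static async Task Run([TimerTrigger("0 0 */6 * * *")] TimerInfo timer, ILogger log)
    {
        Container source = client.GetContainer("mydb", "source");
        Container backup = client.GetContainer("backupdb", "backup");

        FeedIterator<dynamic> feed = source.GetItemQueryIterator<dynamic>("SELECT * FROM c");
        while (feed.HasMoreResults)
        {
            foreach (var doc in await feed.ReadNextAsync())
            {
                // Upsert so a rerun over the same data is harmless.
                await backup.UpsertItemAsync(doc);
            }
        }

        log.LogInformation("Cosmos DB backup copy finished at {time}", DateTime.UtcNow);
    }
}
```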

Use CosmosDB Database Triggers with Azure Data Factory Copy Activity

I have an ADF copy activity that copies rows of data from Azure SQL to Azure Cosmos DB.
I need to manipulate the generated documents. I wrote the logic for this inside a pre-create database trigger that gets executed whenever a new document is created.
The trigger is not getting executed.
I was not able to understand what the problem is, and couldn't find any documentation either. The Cosmos DB client APIs for creating a document need the trigger to be specified explicitly (see the sketch below); I am not sure whether something similar can be done for the ADF copy activity as well. Please help.
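For reference, this is roughly what that explicit opt-in looks like on the client side; a minimal sketch assuming the Microsoft.Azure.Cosmos v3 SDK, with "trgAddMetadata" as a placeholder for the actual trigger name:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class PreTriggerExample
{
    // A pre-trigger never fires implicitly; it has to be named on each create request.
    public static Task CreateWithPreTriggerAsync(Container container, object document, string partitionKey)
    {
        var options = new ItemRequestOptions
        {
            PreTriggers = new List<string> { "trgAddMetadata" }
        };
        return container.CreateItemAsync(document, new PartitionKey(partitionKey), options);
    }
}
```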
I am trying to avoid writing a custom activity (so as to leverage built-in scaling and error handling capabilities).
This seems similar to Azure cosmos db trigger, but the answers are not applicable to this question.

How to migrate a DB that contains the ASP.NET membership provider, with its data, from SQL Server to SQL Azure

In one of my projects I have used the ASP.NET membership provider, and there is some data in the production environment. Now I need to migrate that DB to SQL Azure.
I used the SQLAzureMW tool to migrate it, and the tool did a proper migration except for the aspnet_users table's data and some of the SPs; I skipped that table manually at the final step of the tool.
When I looked into the data, the aspnet_users table's data was missing!
I have also read about the new script for SQL Azure, but I think that is for creating from scratch. I have also heard about the Universal Providers but am confused about them.
As per my requirements, what steps do I need to follow to migrate the existing SQL DB to Windows Azure SQL Database (WASD) with its data, and what would the impact of that be on the application?
Note: session state is also being managed using the SQL provider here.
Update
I have tried it again with the SQLAzureMW tool, and this time I noticed that due to the default collation type some of the SPs were missing, so I ran them manually as per this link.
However, I still need to make sure: would there be any issue regarding session state or anything else, since I have migrated from the existing DB to WASD?
Did the SQLAzureMW tool show any message about what you skipped?
I have had many problems with this tool, but finally I could do everything with it. I would try creating the data schema only the first time, and copying the data afterwards. Maybe you should also use the DROP and CREATE option when generating the script, if the old data can't be overwritten. I hope this helps!
