Azure Data Factory Incremental Load without altering on-premises database

I am trying to use Azure Data Factory to perform an incremental load on a database without using a watermark or change tracking technology. I do not have the rights to add watermark columns to tables; I can only read data from the source database. The database system also has no way to enable change tracking. And it is a very large database, which is why I want to load changes incrementally rather than dropping the entire database and re-uploading it every night.
Is there a way to only upload the changes without altering the on-premises database or am I SOL?
I am connecting to an old Sybase database on-premises and uploading data to an Azure SQL Database.

I would suggest using a Data Flow. Its sink provides an 'upsert' option that lets you insert or update the data in the Azure SQL database, so you don't need to drop the entire database and re-upload it every night.
Ref: Sink transformation
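For context, the sink's 'upsert' behaves roughly like a keyed MERGE against the target table. Below is a minimal sketch of that semantics done by hand with pyodbc, just to make the behaviour concrete; the server, table, and column names (dbo.Orders, staging.Orders, OrderId) are made up for illustration:

    import pyodbc

    # Placeholder connection details -- substitute your own.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myserver.database.windows.net;DATABASE=mydb;"
        "UID=myuser;PWD=<password>"
    )

    # Upsert rows that were first landed in a staging table (e.g. by a
    # Copy activity). MERGE inserts new keys and updates existing ones,
    # so only the delta is written -- no drop-and-reload.
    merge_sql = """
    MERGE INTO dbo.Orders AS tgt
    USING staging.Orders AS src
        ON tgt.OrderId = src.OrderId
    WHEN MATCHED THEN
        UPDATE SET tgt.Amount = src.Amount, tgt.Status = src.Status
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (OrderId, Amount, Status)
        VALUES (src.OrderId, src.Amount, src.Status);
    """

    with conn:  # pyodbc commits on clean exit from the block
        conn.execute(merge_sql)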

Related

Azure SQL: How to merge multiple DBs into a single managing DB, while bidirectionally syncing any future changes

I have multiple devices with assignments, each generating similarly structured data offline. Each device periodically comes online to sync with its own Azure SQL database, separate and assigned only to it. The devices also receive new assignments through syncing with that Azure SQL database.
I want to combine these multiple databases into a single database for management, while bidirectionally receiving updates whenever a sync goes through and relaying any assignments back to the separate databases.
Any help or ideas would be much appreciated.
You can use Azure SQL Data Sync for this purpose; it can update bi-directionally and can be scheduled to run according to your requirements. However, for multiple databases you need to create multiple sync groups.
Set up SQL Data Sync between databases in Azure SQL Database and SQL Server

Azure data factory - Continue on conflict on insert

We are building a data migration pipeline using Azure Data Factory (ADF) to transfer data from one Cosmos DB instance to another. We plan to enable dual writes, so that we write to both databases before the migration begins; that way, if any data point changes during the migration, both databases have the most up-to-date data. However, ADF only offers insert or upsert options. What we want is: on insert, if there is a 'conflict', continue rather than fail the pipeline. Can anyone give any pointers on how to achieve that in ADF?
The other option would be to create our own custom tool using the Cosmos DB libraries to transfer the data.
If you are doing a live migration, ADF is not the right tool to use, as it is intended for offline migrations. If you are migrating from one Cosmos DB account to another, your best option is to use the Cosmos DB Live Data Migrator.
This tool also provides dead-letter support, which is another requirement you have.
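If you do end up writing your own tool, 'continue on conflict' is straightforward to express in code, because a conflicting insert surfaces as a distinct 409 error. A minimal sketch with the Python azure-cosmos package; the account, database, and container names are placeholders:

    from azure.cosmos import CosmosClient
    from azure.cosmos.exceptions import CosmosResourceExistsError

    # Placeholder endpoint, key, and names -- substitute your own.
    client = CosmosClient(
        "https://myaccount.documents.azure.com:443/", credential="<key>"
    )
    container = client.get_database_client("mydb") \
                      .get_container_client("mycontainer")

    def copy_items(items):
        """Insert each item; skip documents that already exist (HTTP 409)."""
        for item in items:
            try:
                container.create_item(body=item)  # plain insert, not upsert
            except CosmosResourceExistsError:
                # The dual-write path already wrote this document; keep going.
                continue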

Apply local DB changes to Azure SQL Database

I have a backup file that came from Server A. I copied that .bak file to my local machine and restored the DB in SQL Server Management Studio. After setting it up, I deployed it to Azure SQL Database. But the data in Server A has since changed because it's still being used, so I need to get all those changes into the Azure SQL Database I just deployed. How am I going to do that?
Note: I'm using Azure for my server and I have a local copy of the Server A database. In terms of data and structure, my local DB and the earlier Server A DB are the same. But after a few days the Server A data has been updated, while my local DB is still the same as when I took the backup of the Server A db.
How can I update the DB in Azure to pick up all the changes made in Server A?
You've got a few choices; it's really a question of which data you're going to migrate. Let's say it's a neat, complete replacement. Then I'd suggest looking at the bacpac mechanism. That's a way to export a database, its structure and data, and then import it into a new location. This is one mechanism for moving to Azure.
If you can't simply replace everything, you need to look at other options. First, there's SSIS: you can build a pipeline to move just the data you need. There's also export and import through sqlcmd, which can connect to Azure SQL Database. You can also look to a third-party tool like Redgate SQL Data Compare as a way to pick and choose the data that gets moved. There are a whole bunch of other Extract/Transform/Load (ETL) tools out there that can help.
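For the complete-replacement case, the bacpac route can be scripted around the SqlPackage CLI. A rough sketch, assuming SqlPackage is on your PATH; all server, database, and file names are placeholders:

    import subprocess

    # Export schema + data from the local restore of Server A into a .bacpac.
    subprocess.run([
        "sqlpackage", "/Action:Export",
        "/SourceConnectionString:Server=localhost;Database=ServerA_Local;"
        "Trusted_Connection=True;",
        "/TargetFile:ServerA.bacpac",
    ], check=True)

    # Import the .bacpac into Azure SQL Database as a new database.
    subprocess.run([
        "sqlpackage", "/Action:Import",
        "/TargetConnectionString:Server=myserver.database.windows.net;"
        "Database=ServerA_Azure;User Id=myuser;Password=<password>;",
        "/SourceFile:ServerA.bacpac",
    ], check=True)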
Do you want to sync schema changes as well as data changes, or just data? If it is just data, the best service to use would be the Azure Database Migration Service, which can help you copy the data delta to Azure incrementally, in both online and offline modes, and you can also decide on the schedule.

Do I copy a db or use data sync to create and manage a db for testing in Azure

If I want a daily copy/replication of my production database, I know I can copy, but what happens when the size grows to ~100 terabytes or more?
It doesn't seem logical to copy a db of that size everyday just to use for testing/QA.
Ideally I'd like a solution where -
1. just the changes (data) are copied (nightly) to the testing db, thereby eliminating the overhead of copying a large db.
2. when I do push changes (column additions, keys, etc) to production then those changes get copied to the testing db as well.
Is there an Azure solution or setup for this?
Please refer to the SQL Data Sync documentation. SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple SQL databases and SQL Server instances.
Data Sync is based around the concept of a Sync Group. A Sync Group is a group of databases that you want to synchronize.
Data Sync uses a hub and spoke topology to synchronize data. You define one of the databases in the sync group as the Hub Database. The rest of the databases are member databases. Sync occurs only between the Hub and individual members.
You can sync the data between the hub database and member databases manually or automatically. Please see Tutorial: Set up SQL Data Sync between Azure SQL Database and SQL Server on-premises.
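Sync groups can also be provisioned programmatically instead of through the portal. A rough sketch with the azure-mgmt-sql management SDK; the resource names are placeholders, and the exact operation and model names should be checked against the SDK version you have installed:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.sql import SqlManagementClient
    from azure.mgmt.sql.models import SyncGroup

    client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Define a sync group on the hub database; member databases are added
    # separately (client.sync_members) before the group is provisioned.
    poller = client.sync_groups.begin_create_or_update(
        resource_group_name="my-rg",
        server_name="my-hub-server",
        database_name="HubDb",
        sync_group_name="nightly-sync",
        parameters=SyncGroup(
            interval=86400,                       # seconds; one sync per day
            conflict_resolution_policy="HubWin",  # hub wins on row conflicts
            sync_database_id=(
                "/subscriptions/<subscription-id>/resourceGroups/my-rg"
                "/providers/Microsoft.Sql/servers/my-hub-server"
                "/databases/SyncMetadataDb"
            ),
        ),
    )
    poller.result()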
Hope this helps.

Azure Database sync failed

I am trying to sync my on-premises SQL database with an Azure SQL database. The first time it was successful. However, when I tried to modify my sync database structure (deleting the unnecessary tables from the sync group), it couldn't sync. The error was:
Failed to perform data sync operation: Exception of type 'Microsoft.SqlAzureDataSync.ObjectModel.SyncGroupNotReadyForReprovisionException' was thrown.
I searched on Google but couldn't find a solution. How can I solve this?
Your sync database structure has changed; that's why the sync stopped and the error was thrown.
SQL Data Sync lets users synchronize data between Azure SQL databases and on-premises SQL Server in one direction or in both directions. One of the current limitations of SQL Data Sync is a lack of support for the replication of schema changes. Every time you change the table schema, you need to apply the changes manually on all endpoints, including the hub and all members, and then update the sync schema.
If you are making a change in an on-premises SQL Server database, make sure the schema change is supported in Azure SQL Database.
For more details, please see Automate the replication of schema changes in Azure SQL Data Sync. This article introduces a solution to automatically replicate schema changes to all SQL Data Sync endpoints.
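Mechanically, 'apply the changes manually on all endpoints' just means running the same DDL against the hub and every member before you update the sync schema. A minimal sketch with pyodbc; the connection strings and the ALTER statement are placeholders:

    import pyodbc

    # One entry per endpoint: the hub first, then every member database.
    ENDPOINTS = [
        "DRIVER={ODBC Driver 18 for SQL Server};SERVER=hub.database.windows.net;"
        "DATABASE=HubDb;UID=u;PWD=<password>",
        "DRIVER={ODBC Driver 18 for SQL Server};SERVER=onprem-host;"
        "DATABASE=MemberDb;UID=u;PWD=<password>",
    ]

    # The same schema change, applied everywhere before the sync schema is
    # updated (in the portal or via PowerShell) to include the new column.
    DDL = "ALTER TABLE dbo.Customers ADD LoyaltyTier NVARCHAR(20) NULL;"

    for conn_str in ENDPOINTS:
        conn = pyodbc.connect(conn_str, autocommit=True)
        try:
            conn.execute(DDL)
        finally:
            conn.close()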
Hope this helps.
