I am trying to sync my on-premises SQL Server database with an Azure SQL database. The first sync succeeded. However, when I tried to modify my sync database structure (deleting unnecessary tables from the sync group), it could no longer sync. The error was:
Failed to perform data sync operation: Exception of type 'Microsoft.SqlAzureDataSync.ObjectModel.SyncGroupNotReadyForReprovisionException' was thrown.
I searched it on Google but I couldn't find a solution for that. How can I solve this?
Your sync database structure has changed; that is why the sync stopped and the error is thrown.
SQL Data Sync lets users synchronize data between Azure SQL databases and on-premises SQL Server in one direction or in both directions. One of the current limitations of SQL Data Sync is a lack of support for the replication of schema changes. Every time you change the table schema, you need to apply the changes manually on all endpoints, including the hub and all members, and then update the sync schema.
If you are making a change in an on-premises SQL Server database, make sure the schema change is supported in Azure SQL Database.
For more details, please see Automate the replication of schema changes in Azure SQL Data Sync. This article introduces a solution to automatically replicate schema changes to all SQL Data Sync endpoints.
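As a concrete illustration of applying a schema change manually, the same DDL has to be run on the hub and on every member before the sync schema is refreshed. A minimal sketch (the table and column names are hypothetical):

```sql
-- Run the same DDL on the hub database AND on every member database.
-- Hypothetical example: adding a nullable column to a synced table.
ALTER TABLE dbo.Customers
    ADD PhoneNumber NVARCHAR(20) NULL;
```

Once the column exists on all endpoints, edit the sync group in the Azure portal and refresh the sync schema so the new column is picked up.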
Hope this helps.
Related
I have multiple devices with assignments, each generating similarly structured data offline. Each device periodically goes online to sync with an Azure SQL database that is separate and assigned only to it. The devices also receive new assignments through syncing with that Azure SQL database.
I want to combine these multiple databases into a single database for management, receiving updates bi-directionally whenever a sync goes through, and also relaying any new assignments back to the separate databases.
Any help or ideas would be much appreciated.
You can use Azure SQL Data Sync for this purpose; it updates bi-directionally and can be scheduled to run according to your requirements. However, for multiple databases you need to create multiple sync groups.
Set up SQL Data Sync between databases in Azure SQL Database and SQL Server
I am trying to use Azure Data Factory to perform an incremental load on a database without using a watermark or change tracking technology. I do not have the rights to add watermarks to tables, I can only read data from the target database. The database system does not have an ability to enable change tracking technology. It is also a very large database, which is why I want to be able to incrementally load changes rather than dropping the entire database and re-uploading it every night.
Is there a way to only upload the changes without altering the on-premises database or am I SOL?
I am connecting to an old Sybase database on premises and uploading data to an Azure SQL Server Database.
I would suggest using a Data Flow. Its sink provides an 'upsert' option that lets you insert or update the data in the Azure SQL database, so you don't need to drop the entire database and re-upload it every night.
Ref here: Sink transformation
Error: Getting schema information for the database failed with the exception "unable to process a schema with 2434 500"
Database Sync Group error on Azure when syncing a local offline server.
What's the best way to sync an offline database, even if it's not this way?
Please reference the Data Sync limitations on service and database dimensions.
I agree with @Alberto Morillo: your full exception should read "Unable to process a schema with 2434 tables, 500 is the max....".
Here's the official Azure blog post about how to sync SQL data at large scale using Azure SQL Data Sync. It gives you a solution for this exception:
Sync data between databases with many tables
Currently, Data Sync can only sync between databases with fewer than 500 tables. You can work around this limitation by creating multiple sync groups using different database users. For example, suppose you want to sync two databases with 900 tables. First, define two different users in the database you load the sync schema from, where each user can only see 450 (or any number fewer than 500) tables. Sync setup requires ALTER DATABASE permission, which implies CONTROL permission over all tables, so you need to explicitly DENY the permissions on the tables you don't want a specific user to see, instead of using GRANT. You can find the exact privileges needed for sync initialization in the best-practice guidance.

Then create two sync groups, one for each user. Each sync group will sync 450 tables between these two databases. Since each user can see fewer than 500 tables, you will be able to load the schema and create the sync groups. After a sync group is created and initialized, we recommend you follow the best-practice guidance to update the user permissions and make sure they have the minimum privileges needed for ongoing sync.
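The DENY-based visibility trick from the blog excerpt can be sketched like this (the user and table names are hypothetical; the exact privileges needed for sync initialization are in the best-practice guidance):

```sql
-- Hypothetical sketch: one of the sync users, DENIED the tables that
-- belong to the OTHER sync group so it sees fewer than 500 tables.
CREATE LOGIN SyncUserA WITH PASSWORD = '<strong password>';
CREATE USER SyncUserA FOR LOGIN SyncUserA;

-- DENY overrides any permission implied by broader grants:
DENY SELECT, INSERT, UPDATE, DELETE ON dbo.Orders_Archive TO SyncUserA;
DENY VIEW DEFINITION ON dbo.Orders_Archive TO SyncUserA;
-- Repeat the DENY pair for every table assigned to the second sync group.
```

A second user (say SyncUserB) is then DENIED the complementary set of tables, and each sync group is created while connected as its respective user.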
Hope this helps.
The full error message you are receiving may look like: Getting schema information for the database failed with the exception "Unable to process a schema with 2434 tables, 500 is the max. For more information, provide tracing ID '8d609598-3dsf-45ae-93v7-04ab21e45f6f' to customer support."
It is a current limitation of SQL Data Sync that if the database has more than 500 tables, it does not load the schema for you to select tables from, even if you only want to select and sync one table.
A workaround is to delete the extra tables or move the unneeded tables into another database. Not ideal, we agree, but it is a workaround for now. To try it, perform the following steps.
Script the tables you want to sync (fewer than 500 tables per sync group)
Create a new temporary database and run the script to create the tables you want to sync
Register and add the new temporary database as a member of the sync group
Use the new temporary database to pick the tables you want to sync
Add all other databases that you want to sync with (on-premises databases and the hub database)
Once the provisioning is done, remove the temporary database from the sync group.
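Steps 2 and 3 above can be sketched as follows (the database and table names are hypothetical; in practice the CREATE TABLE statements come from the script generated in step 1):

```sql
-- Hypothetical sketch: a temporary database holding only the tables to sync.
CREATE DATABASE SyncSchemaTemp;
GO
USE SyncSchemaTemp;
GO
-- Paste the scripted definitions of the (fewer than 500) tables here, e.g.:
CREATE TABLE dbo.Customers (
    CustomerId INT           NOT NULL PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL
);
```

Because the temporary database stays under the 500-table limit, the portal can load its schema, and the table selection carries over to the sync group before the database is removed again.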
This can be managed using the following steps (I used the limited-tables approach, as I was only targeting 400 tables for my requirement):
If you are looking for limited tables only
Create a user at your on-prem SQL Server with rights on a limited set of tables, i.e. GRANT SELECT ON schema_name.table_name TO username; (add every table that needs to be synced)
Once this is done, register this user in the SQL Data Sync agent application on your on-prem server
SQL Data Sync image
and select it in your Database Sync Group in Azure (Add an On-Premises Database).
Database Sync Group image
You'll then see only the tables this user was granted permission on in your on-prem DB
If you are looking for all tables
Create multiple users, each able to see up to 500 tables, and create multiple sync groups using these different database users.
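A minimal sketch of the GRANT-based variant from the steps above (the login, user, and table names are hypothetical):

```sql
-- Hypothetical sketch: a user that can only see the tables to be synced.
CREATE LOGIN sync_user WITH PASSWORD = '<strong password>';
CREATE USER sync_user FOR LOGIN sync_user;

-- Grant SELECT only on the tables that belong to this sync group:
GRANT SELECT ON dbo.Orders    TO sync_user;
GRANT SELECT ON dbo.Customers TO sync_user;
-- One GRANT per table, keeping each user under the 500-table limit.
```

Registering this user in the sync agent then exposes only the granted tables when the sync group loads the schema.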
If I want a daily copy/replication of my production database, I know I can copy, but what happens when the size grows to ~100 terabytes or more?
It doesn't seem logical to copy a DB of that size every day just to use it for testing/QA.
Ideally I'd like a solution where -
1. just the changes (data) are copied (nightly) to the testing DB, thereby eliminating the overhead of copying a large DB.
2. when I do push changes (column additions, keys, etc) to production then those changes get copied to the testing db as well.
Is there an Azure solution or setup for this?
Please reference the document of SQL Data Sync. SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple SQL databases and SQL Server instances.
Data Sync is based around the concept of a Sync Group. A Sync Group is a group of databases that you want to synchronize.
Data Sync uses a hub and spoke topology to synchronize data. You define one of the databases in the sync group as the Hub Database. The rest of the databases are member databases. Sync occurs only between the Hub and individual members.
You can sync the data between the hub database and member databases manually or automatically. Please see Tutorial: Set up SQL Data Sync between Azure SQL Database and SQL Server on-premises
Hope this helps.
Currently, we are mirroring our local SQL Server database to an Azure SQL database. For this we are using the Azure Data Management Gateway, but the problem is that we are not able to handle the update or delete scenarios: updates and deletes are not reflected in the Azure SQL (mirror) database.
Thanks.
What type of activity are you using in your pipeline? I'm assuming a simple copy activity?
My suggestion would be to have a copy activity that first lands a clone of the on-premises data in the Azure SQL DB, maybe in a staging schema's set of tables. Then have a second downstream stored procedure activity. You can code a MERGE statement (or similar) in the procedure to apply the staged data to the target table.
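A minimal sketch of such a procedure, assuming a hypothetical staging.Customers table loaded by the copy activity and a dbo.Customers target:

```sql
-- Hypothetical sketch: merge staged rows into the target table.
CREATE PROCEDURE dbo.MergeCustomers
AS
BEGIN
    MERGE dbo.Customers AS target
    USING staging.Customers AS source
        ON target.CustomerId = source.CustomerId
    WHEN MATCHED THEN
        UPDATE SET target.Name = source.Name
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerId, Name)
        VALUES (source.CustomerId, source.Name)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;  -- propagates on-prem deletes to the mirror
END;
```

Note the WHEN NOT MATCHED BY SOURCE clause only gives correct delete behavior if the copy activity lands a full snapshot in staging; if staging holds only deltas, drop that clause and handle deletes separately.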
It sounds like you almost want a secondary node for your SQL Server in Azure. Maybe just use an availability group?! SQL Server 2014 or higher is required on-prem, though.
You can also take a look at transactional replication to Azure SQL Database. With transactional replication, the Azure SQL database acts as a subscriber, and update/delete changes you make in the on-premises SQL Server database are reflected in the Azure SQL database.