I've created a backup of my local database using "Export Data-tier Application" and saved the file to Azure Blob storage.
In the Azure Portal, I select my SQL server and import a new database. I pick the backup from Blob storage and wait a long time for the database to be created. It stays stuck at 1% the whole time.
After 40 minutes, I get this message every single time I try to create the database:
The ImportExport operation with Request Id
'f6743e06-592d-4531-b319-4297b345f744e' failed due to 'Could not
import package. Warning SQL0: A project which specifies SQL Server
2019 or Azure SQL Database Managed Instance as the target platform may
experience compatibility issues with Microsoft Azure SQL Database v12.
Warning SQL72012: The object [data_0] exists in the target, but it
will not be dropped even though you selected the 'Generate drop
statements for objects that are in the target database but that are
not in the source' check box. Warning SQL72012: The object [log]
exists in the target, but '.
This is very frustrating; it's just a database with tables (no data) that only weighs 25 MB. I'm following every single tutorial, every single step, and I always get that error, no matter which database name I choose.
Any help will be appreciated.
Thanks.
Instead of going through the process of creating a bacpac, uploading it to an Azure Storage account, and then having the import into Azure SQL fail at the end, you can easily migrate that SQL Server database to Azure using the Data Migration Assistant (DMA).
You just have to create an empty Azure SQL database, and DMA does the rest. You can download it from here.
I am looking for a way to back up the SSIS Catalog database that was deployed on Azure. I looked through the documentation here:
SSIS Catalog
It seems like the first step would be backing up the master key, which is not supported on Azure, so I tried to look for a more general way of backing up a SQL Server database on Azure, such as this method using the Azure Portal GUI:
Azure Backup: SQL Databases and How To Back Them Up
Or using the SSMS export wizard:
Export a Data-tier Application
However, they all seem to fail to back up the SSIS Catalog DB, giving this Error SQL71564:
One or more unsupported elements were found in the schema used as part of a data package.
Error SQL71564: Error validating element Signature for '[internal].[get_database_principals]': The element Signature for '[internal].[get_database_principals]' cannot be deployed. This element contains state that cannot be recreated in the target database.
Error SQL71564: Error validating element Signature for '[internal].[get_principal_id_by_sid]': The element Signature for '[internal].[get_principal_id_by_sid]' can...
I am investigating the details of these error messages, and they seem to mean that I should change the metadata or structure of the SSISDB, which I can't really do at the moment. Is there any way to back up the SSIS DB on Azure without actually changing it?
Please try generating a DACPAC from the command line.
You can follow these articles:
Using SQLPackage to import or export SQL Server and Azure SQL DB
Generating an SSISDB DACPAC
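For reference, a schema-level DACPAC extraction with SqlPackage from the command line looks roughly like this (a sketch only; the server name, credentials, and output path are placeholders to replace with your own):

SqlPackage.exe /Action:Extract /SourceServerName:yourserver.database.windows.net /SourceDatabaseName:SSISDB /SourceUser:youradmin /SourcePassword:yourpassword /TargetFile:C:\backups\SSISDB.dacpac

The resulting .dacpac captures the schema and can later be deployed with /Action:Publish.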
I'm using this very useful SQLCLR script to make a REST call to an API and save the data to SQL Server on the fly.
I have created a stored procedure that pulls new data every hour so my data is always up to date.
I would like to have all this on Azure so I can then create a Power BI data visualization.
THE PROBLEM:
As soon as I try to transfer the database to Azure, I receive this error:
TITLE: Microsoft SQL Server Management Studio
------------------------------
Could not import package.
Warning SQL0: A project which specifies SQL Server 2019 or Azure SQL Database Managed Instance as the target platform may experience compatibility issues with Microsoft Azure SQL Database v12.
Error SQL72014: .Net SqlClient Data Provider: Msg 40517, Level 16, State 1, Line 4 Keyword or statement option 'unsafe' is not supported in this version of SQL Server.
Error SQL72045: Script execution error. The executed script:
CREATE ASSEMBLY [ClrHttpRequest]
AUTHORIZATION [dbo]
FROM 0x4D5A90000300000004000000FFFF0000B800000000000000400000000000000000000000000000000000000000000000000000000000000000000000800000000E1FBA0E00B409CD21B8014CCD21546869732070726F6772616D2063616E6E6F742062652072756E20696E20444F53206D6F64652E0D0D0A2400000000000000504500004C0103006D85475F0000000000000000E00022200B0130000026000000060000000000007E45000000200000006000000000001000200000000200000400000000000000060000000000000000A00000000200004C1E01000300608500001000001000000000100000100000000000001000000000000000000000002C4500004F00000000600000FC03000000000000000000000000000000000000008000000C000000F44300001C0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000200000080000000000000000000000082000004800000000000000000000002E7465787400000084250000002000000026000000020000000000000000000000000000200000602E72737263000000FC030000006000000004000000280000000000000000000000000000400000402E72656C6F6300000C000000008000000002000000
(Microsoft.SqlServer.Dac)
------------------------------
BUTTONS:
OK
------------------------------
This happens because Azure SQL Database has some features stripped out, like SQLCLR and SQL Server Agent (for fairly obvious security reasons).
Is there any alternative to SQLCLR on Azure?
Is there any alternative to SQL Server Agent on Azure?
Basically: how to automate a REST call to an API every hour and save the result to SQL Server on Azure?
I do not think there is a straightforward replacement for SQLCLR. However, there are some Azure offerings that might be interesting.
One alternative is a scheduled Azure Function that calls the API and stores the result in the Azure SQL database.
Do keep in mind that if the process takes longer than 10 minutes you cannot use a Consumption plan for the Azure Function, which is probably the most cost-effective option.
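As a rough sketch (not a drop-in implementation), a timer-triggered Function in Python could look something like this; the API URL, target table, and the AZURE_SQL_CONNECTION app setting are placeholders, and the hourly schedule (e.g. "0 0 * * * *") lives in the function's binding configuration:

import os
import requests                      # call the external REST API
import pyodbc                        # write the result to Azure SQL Database
import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    # Fetch the latest data from the (placeholder) API endpoint.
    rows = requests.get("https://api.example.com/data", timeout=30).json()

    # AZURE_SQL_CONNECTION is an app setting holding an ODBC connection string.
    conn = pyodbc.connect(os.environ["AZURE_SQL_CONNECTION"])
    cursor = conn.cursor()
    for row in rows:
        cursor.execute(
            "INSERT INTO dbo.ApiData (Id, Value) VALUES (?, ?)",
            row["id"], row["value"])
    conn.commit()
    conn.close()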
Depending on the scenario, Azure Data Factory can also provide a solution. You can create a pipeline that calls the API and copies the data to SQL Server, as outlined here, based on a schedule trigger.
Even though Azure Functions is great, you could also solve this with hardly any code using Azure Logic Apps: a recurrence trigger, the HTTP action, and the SQL Server connector.
https://azure.microsoft.com/de-de/services/logic-apps/
I have a backup file that came from Server A. I copied that .bak file to my local machine and restored the database in SQL Server Management Studio. After setting it up, I deployed it to Azure SQL Database. But the data in Server A has since changed because it's still being used, so I need to get all those changes into the Azure SQL database I just deployed. How am I going to do that?
Note: I'm using Azure for my server and I have a local copy of the Server A database. In terms of data and structure, my local copy and Server A were originally identical, but after a few days Server A's data has been updated while my local DB is still the same as when I took the backup.
How can I update the database in Azure to pick up all the changes made in Server A?
You've got a few choices. It's really a question of which data you're going to migrate. Let's say it's a neat, complete replacement. Then I'd suggest looking at the bacpac mechanism. That's a way to export a database, its structure and data, and then import it into a new location. This is one mechanism for moving to Azure.
If you can't simply replace everything, you need to look at other options. First, there's SSIS. You can build a pipeline to move the data you need. There's also export and import through sqlcmd, which can connect to Azure SQL Database. You can also look at a third-party tool like Redgate SQL Data Compare as a way to pick and choose the data that gets moved. There are a whole bunch of other Extract/Transform/Load (ETL) tools out there that can help.
Do you want to sync schema changes as well as data changes, or just data? If it is just data, the best service to use would be the Azure Database Migration Service, which can help you copy the data delta to Azure incrementally, in both online and offline modes, and lets you decide on the schedule.
My requirements are as below:
Move 3 local SAP databases to 3 Azure SQL databases.
Then sync daily transactions or data to Azure every night. If a transaction from the local DB already exists in Azure, it should be updated; if not, it should be inserted.
The local systems will not stop after moving to Azure; they will keep running for about 6 months.
Note:
Azure SQL Data Sync doesn't fit our case because of its limitations: it only supports 500 tables, can't sync tables without primary keys, and doesn't sync views or procedures. It also increases database size on both sides (local and Azure).
An Azure Data Factory pipeline can fulfill my requirements, but I would have to create a pipeline and procedure manually for each table (SAP has over 2,000 tables, so that's not practical for me).
We don't use Azure VMs or Managed Instance.
Can you guide me to the best solution for moving and syncing? I am new to Azure.
Thanks all.
Since you mentioned that ADF basically meets your needs, I will start from ADF. Actually, you don't need to create the pipeline for each table manually, one by one. The creation can be done with the ADF SDK, a PowerShell script, or the REST API. Please refer to the official documentation: https://learn.microsoft.com/en-us/azure/data-factory/
So, if you can get the list of SAP table names (I found this thread: https://answers.sap.com/questions/7375575/how-to-get-all-the-table-names.html), you could loop over the list and run code that creates the pipelines in a batch; only the table name property needs to be set, as in the sketch below.
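As an illustration only, a Python sketch of that loop might look like the following; it assumes the azure-mgmt-datafactory and azure-identity packages, and that parameterized datasets named SapTableDataset and AzureSqlDataset already exist in the factory (the subscription, resource group, factory, and table names are all placeholders):

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, SapTableSource, AzureSqlSink)

# Placeholder identifiers - replace with your own subscription, resource group and factory.
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "my-resource-group", "my-data-factory"

# In practice this list would come from the SAP system (see the thread linked above).
sap_tables = ["MARA", "VBAK", "VBAP"]

for table in sap_tables:
    copy = CopyActivity(
        name=f"Copy_{table}",
        inputs=[DatasetReference(reference_name="SapTableDataset",
                                 parameters={"tableName": table})],
        outputs=[DatasetReference(reference_name="AzureSqlDataset",
                                  parameters={"tableName": table})],
        source=SapTableSource(),
        sink=AzureSqlSink())
    adf.pipelines.create_or_update(rg, factory, f"CopySap_{table}",
                                   PipelineResource(activities=[copy]))

Each iteration creates (or updates) one copy pipeline per SAP table, so the 2,000-odd pipelines never have to be clicked together by hand.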
I'm attempting to add a new data source from a SQL Server on an Azure VM for a search service and indexer I'm creating through the Azure web portal. It's my understanding that I can create an index, import this data, and then create an indexer to regularly push data to the index. I'm adding the connection string for our SQL Server and getting a success message when clicking "Test Connection". The tables show up in a drop-down list, and I select one.
When I click "OK" to create the new data source, a popup comes up that says "Sampling Data Source..." followed by "Error detecting index schema from data source: 'Data source payload should specify at least one of datasource name and type'".
I've tried Googling this error, but I can't find anything on it and I'm not sure how to fix it so I can proceed.
This looks like a bug in Azure Search portal support for SQL Server data sources.
We'll investigate. In the meantime, you can create your datasource programmatically as shown in Connecting Azure SQL Database to Azure Search.
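Roughly, the REST call to create the data source looks like this (a sketch; the service name, admin key, api-version, connection string, and table name are placeholders to replace with your own values):

import requests

service = "my-search-service"                 # placeholder search service name
api_key = "<admin-api-key>"                   # placeholder admin key from the portal
# Use an api-version your service supports; 2017-11-11 is one example.
url = f"https://{service}.search.windows.net/datasources?api-version=2017-11-11"

payload = {
    "name": "sql-datasource",                 # the 'name' the error complains about
    "type": "azuresql",                       # and the 'type'
    "credentials": {
        "connectionString": "Server=tcp:myserver;Database=mydb;User ID=me;Password=secret;"
    },
    "container": {"name": "MyTable"}          # the table or view to index
}

resp = requests.post(url, json=payload, headers={"api-key": api_key})
resp.raise_for_status()

Once the data source exists, the index and indexer can be created the same way, or you can continue in the portal.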