Azure SDK - Copy Azure SQL database schema only through code

We currently use the Microsoft.Azure.Management.Fluent libraries in our solution to create copies of Azure SQL databases, and it works well:
await server.Databases.Define(databaseName)
    .WithExistingElasticPool(elasticPoolName)
    .WithSourceDatabase(source)
    .WithMode(CreateMode.Copy)
    .CreateAsync();
Is there a way to do the same thing but only copy the schema - no data rows?
I was hoping for a WithMode option but nothing stands out in the documentation.
Thanks much in advance

Related

Additional column throwing validation issue with Azure SQL data sink in Azure Data Factory

I've got this weird issue where validation fails on 'additional columns' for my Azure SQL data sink, coming from a Blob Storage source, in the Azure Data Factory GUI. No matter how many times we recreate the dataset (or specify another, new dataset), we can't get past this validation issue.
The irony is that we deploy these pipelines from code, and when we run them we get no errors at all. This issue has made developing the pipelines further really difficult, as we have to do everything in code; we can't use the pipeline publish option.
Here are some screen grabs of the pipeline so you can see the flow.
[Screenshots: pipeline overview; inside copyCustomer: source, mapping, sink]
Any ideas on how to fix this validation would be greatly appreciated.
For what it's worth, we have recreated the dataset multiple times (clone and new) to avoid any issue with the dataset model not being the latest as per what's documented here https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview#add-additional-columns-during-copy
Sometimes setting the sink table to auto-create has shown the validation to be 'fixed', but then when we go to publish it errors out again.
This is expected behavior when your Azure SQL dataset was created a long time ago and still uses an outdated dataset model that does not support Additional Columns.
As per the official Microsoft documentation, to resolve this issue you can simply follow the error message: create a new Azure SQL dataset and use it as the copy sink.
I followed the error message, created a new dataset, and it is working fine for me.
[Screenshots: source, mapping, sink, and output]
I suspect your sink dataset type is incorrect here. I reproduced the same setup at my end, and it's working fine. Make sure you create the sink dataset with the Azure SQL Database connector type only.
Please check the screenshots below from my implementation.
If it still doesn't help, feel free to share your sink dataset connector details along with screenshots.

Manual Azure Backup Cosmos DB

I tried to export data from Cosmos DB, but it was not successful. According to https://learn.microsoft.com/en-us/azure/cosmos-db/storage-explorer, I can export the data inside Cosmos DB using this tool, but there is no option to export. I also tried the instructions at https://azure.microsoft.com/en-us/updates/documentdb-data-migration-tool/ and https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#JSON, but I keep encountering errors.
Can you help me do this in Data Factory, or with any steps just to manually back up Cosmos DB?
I tried doing the backup through Azure Data Factory, but Data Factory can't seem to connect to Cosmos DB, which is weird because the primary/secondary connection string I used comes straight from the Cosmos DB account details.
Thank you.
Can you help me how to do this in Data Factory
According to your description, it seems you are having trouble exporting data, not importing it. You can use the Copy activity in ADF, which supports a Cosmos DB connector. For your needs, Cosmos DB is the source dataset; please add one more sink dataset (the destination), such as JSON files in Blob Storage. Just make sure you configure the right authentication information for your Cosmos DB account.
ADF is more suitable for batch or daily backups.
or any steps just to manual backup cosmos DB
Yes, Storage Explorer is not for exporting data from Cosmos DB; the Data Migration Tool is the suitable option. Please install the tool and refer to the details at this link: https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#export-to-json-file
DMT is more suitable for a one-off backup. That said, it also supports batch execution if you run it from the command line.
The Cosmos DB Data Migration Tool can be used to export data from Cosmos DB.
Refer to https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#export-to-json-file
This one worked for me... since SSL on my MacBook did not work, I did these steps from an Azure VM that I created.
Steps:
Download the MongoDB Community Server client tools matching your OS version and a compatible MongoDB version.
(Or you can download v3.2.22 for Windows x64 directly; please don't download a version beyond 4.2, as it's incompatible.)
After installing the MongoDB client tools, go to the installation directory, open the 'bin' subfolder containing mongoexport.exe, and issue the command below to export your data:
mongoexport --host=<host>:<port> -u=<username> -p=<password> --db=<database> --collection=<collection> --ssl --sslAllowInvalidCertificates --out=<output file>
Note: You can find the <host>, <port>, <username>, and <password> in the Cosmos DB portal under 'Connection String'.

How to log all incompatible rows to a storage account using the ADF V2 Copy Data tool

I have selected the option to log all incompatible rows to the storage account's default container, but no logs have been written to the storage account, and I am wondering why.
Is there anything that can be done to make this work?
It's a regression and we are working on a fix; it's expected to be deployed by the end of this week. Please try after that.
Update:
The issue is fixed; can you try again?

Windows Azure SQL Database Collation

How can I change the collation of my Azure SQL Database? I need to change it to Latin1_General_CI_AS
Thank you!
SQL Azure v12 supports altering the data collation of the database. However, the catalog collation is fixed and cannot be altered.
You cannot change the collation of a SQL Azure Database at the server level:
http://msdn.microsoft.com/en-us/library/windowsazure/ee336245.aspx#sscs
You can set it when creating the database initially.
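For example (a minimal sketch with a hypothetical database name; specifying COLLATE at creation time is supported on Azure SQL Database):

-- Specify the default collation when the database is created.
CREATE DATABASE MyAppDb COLLATE Latin1_General_CI_AS;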
Here is how I went about changing the default collation.
1) Create a new database instance from the portal with the proper collation
2) Export the schema from the source and apply it to the destination
3) Disable all constraints (see the T-SQL sketch after these steps)
4) Using the SSMS Import Wizard, copy all data from the old instance to the new instance. This was extremely fast for me, as both instances were on the same server.
5) Re-enable all constraints
All columns picked up the new default collation. I went from CI to CS.
You can then rename the databases or just reference the new instance.
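The steps above don't include a script for disabling and re-enabling the constraints (steps 3 and 5), so here is a minimal T-SQL sketch of one way to do it, my own rather than the answerer's. It covers foreign keys only (sp_MSforeachtable is not available on Azure SQL Database); check constraints could be handled the same way via sys.check_constraints.

-- Step 3: build and run ALTER TABLE ... NOCHECK for every foreign key.
DECLARE @sql nvarchar(max) = N'';
SELECT @sql += N'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id))
    + N'.' + QUOTENAME(OBJECT_NAME(parent_object_id))
    + N' NOCHECK CONSTRAINT ' + QUOTENAME(name) + N';' + CHAR(10)
FROM sys.foreign_keys;
EXEC sp_executesql @sql;

-- Step 5 (run after the data copy): re-enable and revalidate every foreign key.
SET @sql = N'';
SELECT @sql += N'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id))
    + N'.' + QUOTENAME(OBJECT_NAME(parent_object_id))
    + N' WITH CHECK CHECK CONSTRAINT ' + QUOTENAME(name) + N';' + CHAR(10)
FROM sys.foreign_keys;
EXEC sp_executesql @sql;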
To check your columns, you can run this query before and after:

SELECT c.name, c.collation_name
FROM sys.columns c
JOIN sys.tables t ON t.object_id = c.object_id;
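To confirm the database-level default collation as well, this standard T-SQL one-liner (not part of the original answer) helps:

-- Returns the database's default collation name.
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS DatabaseCollation;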
Note: the import wizard doesn't copy everything (e.g., users), and this might not work for you if you specified a column-level collation.

No longer able to create a bacpac: SQL70015: Deprecated feature 'String literals as column aliases' is not supported on SQL Azure

We have encountered a critical error today - we are no longer able to create bacpac files of our live Azure production databases. Everything was working up until now, and suddenly we've started encountering the following error:
Error encountered during the service operation. Could not extract package from specified database. Error SQL70015: Deprecated feature 'String literals as column aliases' is not supported on SQL Azure.
We have a complex database schema which has been deployed live to Azure for over a year. We are relying on daily bacpacs as our only backup strategy and need help figuring out how to resume making bacpacs.
Well, I feel your pain... the answer here is: change your schema... there's no other way.
Instead of 'Column Name' use [Column Name]; instead of SELECT CryptColumnA 'Column A' FROM myTable, write SELECT CryptColumnA AS [Column A] FROM myTable, and so forth.
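If you need to hunt down the offending objects first, a crude starting point (a hypothetical helper, not from the thread) is to search module definitions for string literals; expect false positives, since any quoted literal will match:

-- Lists programmable objects (views, procs, functions) whose definitions
-- contain a single-quoted string literal; review each hit manually.
SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id)        AS object_name,
       definition
FROM sys.sql_modules
WHERE definition LIKE '%''%';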
We have filed a support ticket with Microsoft and the issue was acknowledged as a bug. We only had a problem with bacpac export, not import - and apparently it was due to a SQL Azure change which hardened some of the export validations.
To make the long story short, the issue has been fixed by Microsoft and we are no longer experiencing the problem - and that is without any schema changes on our end.
