Clone Cosmos DB - EnableDataTransfer is not enabled for this account - Azure

I am trying to follow this document to clone my Cosmos DB container: https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-container-copy
When I ran the create container copy job command, I got this error.
I tried to find it in the Azure documentation, but nowhere does it mention this error. At first I thought it was a setting on the Cosmos account, but I looked at the JSON template for the account and there is no EnableDataTransfer property or anything similar.
I also confirmed that I have write data access to that Cosmos account.
Could anyone help please? Thanks.

You might have skipped the step on that docs page that says to register for the feature first.
To get started using container copy jobs, register for "Intra-account offline container copy (Cassandra & SQL)" preview from the 'Preview Features' list in the Azure portal. Once the registration is complete, the preview will be effective for all Cassandra and API for NoSQL accounts in the subscription.
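If you prefer to script the registration instead of using the portal, here is a minimal sketch with the azure-mgmt-resource FeatureClient. The feature name below is a placeholder - copy the exact name shown under "Preview features" for the Microsoft.DocumentDB provider in your subscription - and the subscription ID is assumed.

```python
# Minimal sketch, assuming azure-identity and azure-mgmt-resource are installed.
# The feature name is a placeholder -- use the exact name from the portal's
# "Preview features" list for the Microsoft.DocumentDB provider.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import FeatureClient

client = FeatureClient(DefaultAzureCredential(), "<subscription-id>")

result = client.features.register(
    resource_provider_namespace="Microsoft.DocumentDB",
    feature_name="<intra-account-container-copy-preview-feature-name>",
)
print(result.properties.state)  # shows "Registering" until the registration completes
```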

Related

How to choose only specific tables as follower instead of the entire DB in Azure Data Explorer using Azure Data Share

I am working on something where I need to replicate only a few tables instead of the entire database from the leader cluster. How should I do it in the Azure portal using Azure Data Share? I can see from the Azure documentation that they are using C# or some other language for it; can we do it directly via the Azure portal?
As of this writing, table-level sharing isn't yet available through Azure Data Share, but should become available in the next few weeks (follow this doc for updates: https://learn.microsoft.com/en-us/azure/data-explorer/data-share)
As you mentioned correctly, it is already available programmatically using the management API (documented here: https://learn.microsoft.com/en-us/azure/data-explorer/follower#table-level-sharing)
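As a rough illustration of that programmatic route, here is a minimal sketch using the azure-mgmt-kusto management SDK to attach a follower database that only follows specific tables. The resource IDs, names, location and table list are all placeholders, and the exact model fields may differ slightly between SDK versions.

```python
# Minimal sketch: attach a follower database with table-level sharing
# (pip install azure-identity azure-mgmt-kusto). All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.kusto import KustoManagementClient
from azure.mgmt.kusto.models import AttachedDatabaseConfiguration, TableLevelSharingProperties

client = KustoManagementClient(DefaultAzureCredential(), "<follower-subscription-id>")

config = AttachedDatabaseConfiguration(
    location="westeurope",
    database_name="LeaderDb",
    cluster_resource_id=(
        "/subscriptions/<leader-sub>/resourceGroups/<rg>/providers/"
        "Microsoft.Kusto/Clusters/<leader-cluster>"
    ),
    default_principals_modification_kind="Union",
    # Only these tables are followed instead of the whole database.
    table_level_sharing_properties=TableLevelSharingProperties(
        tables_to_include=["Sales", "Customers"],
    ),
)

poller = client.attached_database_configurations.begin_create_or_update(
    resource_group_name="<follower-rg>",
    cluster_name="<follower-cluster>",
    attached_database_configuration_name="follow-leaderdb",
    parameters=config,
)
print(poller.result().provisioning_state)
```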

On-premises Databases to Azure SQL Databases and Sync Continuously

My requirements are as below :
Move 3 SAP local databases to 3 Azure SQL DBs.
Then sync daily transactions or data to Azure every night. If transactions from the local DB already exist in Azure, they should be updated; if not, they should be inserted.
The local systems will not stop after moving to Azure. They will keep running for about 6 months.
Note:
Azure Data Sync does not fit our needs because of its limitations - it only supports 500 tables, can't sync tables without primary keys, and doesn't handle views or procedures. It also increases the database size on both sides (local and Azure).
An Azure Data Factory pipeline can fulfill my requirements, but I would have to create a pipeline and procedure manually for each table. (SAP has over 2000 tables, which is not practical for me.)
We don't use Azure VMs or Managed Instance.
Can you guide me to the best solution to move and sync? I am new to Azure.
Thanks all.
Since you mentioned that ADF basically meets your needs, I will start from ADF. Actually, you don't need to create each table's pipeline manually one by one. The creation can be done with the ADF SDK, a PowerShell script, or the REST API. Please refer to the official documentation: https://learn.microsoft.com/en-us/azure/data-factory/
So, if you can get the list of SAP table names (I found this thread: https://answers.sap.com/questions/7375575/how-to-get-all-the-table-names.html), you can loop over the list and run the code to create the pipelines in a batch. Only the table name property needs to be set; a sketch follows below.
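A minimal sketch of that loop with the azure-mgmt-datafactory SDK is below. The subscription, resource group, factory and dataset names, the table list, and the SAP/SQL source and sink types are all assumptions - it presumes you already have an SAP table dataset and an Azure SQL dataset in the factory, both parameterised on the table name.

```python
# Minimal sketch: batch-create one copy pipeline per SAP table
# (pip install azure-identity azure-mgmt-datafactory). Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, SapTableSource, AzureSqlSink
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Table names pulled from SAP (e.g. via the catalogue, as the linked thread suggests).
sap_tables = ["MARA", "VBAK", "VBAP"]  # ...extend to the full list

for table in sap_tables:
    copy_activity = CopyActivity(
        name=f"Copy_{table}",
        # Both datasets are assumed to be parameterised on the table name.
        inputs=[DatasetReference(type="DatasetReference",
                                 reference_name="SapTableDataset",
                                 parameters={"tableName": table})],
        outputs=[DatasetReference(type="DatasetReference",
                                  reference_name="AzureSqlDataset",
                                  parameters={"tableName": table})],
        source=SapTableSource(),
        sink=AzureSqlSink(),
    )
    pipeline = PipelineResource(activities=[copy_activity])
    client.pipelines.create_or_update(RESOURCE_GROUP, FACTORY_NAME, f"CopySap_{table}", pipeline)
    print(f"Created pipeline CopySap_{table}")
```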

Notify email when Azure Storage table gets new entry

Is there a built-in way to send an email notification when a new entry is added to the Table?
I am asking whether this can be done without writing anything programmatically, just within their own UI.
Not currently, but you could put the new entries on an Azure Storage Queue and use Azure Functions to process them into Table Storage and send an email.
Check out this page for what is possible: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings
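A minimal sketch of that pattern, assuming the Python v2 programming model for Azure Functions and plain SMTP for the mail; the queue and table names, connection strings, entity shape and mail settings are placeholders.

```python
# Minimal sketch (Python v2 programming model): queue-triggered function that
# stores the entry in Table Storage and sends a notification e-mail.
import json
import smtplib
import uuid
from email.message import EmailMessage

import azure.functions as func
from azure.data.tables import TableClient

app = func.FunctionApp()

@app.queue_trigger(arg_name="msg", queue_name="new-entries",
                   connection="AzureWebJobsStorage")
def on_new_entry(msg: func.QueueMessage) -> None:
    payload = json.loads(msg.get_body().decode("utf-8"))

    # Persist the entry to Table Storage.
    table = TableClient.from_connection_string("<storage-connection-string>", "Entries")
    table.create_entity({"PartitionKey": "entries", "RowKey": str(uuid.uuid4()), **payload})

    # Send the notification (any mail provider works; plain SMTP shown here).
    mail = EmailMessage()
    mail["Subject"] = "New table entry"
    mail["From"] = "noreply@example.com"
    mail["To"] = "you@example.com"
    mail.set_content(json.dumps(payload, indent=2))
    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login("<smtp-user>", "<smtp-password>")
        smtp.send_message(mail)
```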
Okay, so the easiest way to see the data is to use their desktop app, Azure Storage Explorer:
https://azure.microsoft.com/en-us/features/storage-explorer/
The latest Azure product updates covering event-driven applications suggest adopting these application patterns to react to events published by Azure Storage. These docs/resources might help you explore the application patterns and current platform/framework support:
Azure Storage
Reacting to Blob storage events (preview)
I did not find anything similar for Azure Tables - what about suggesting this on UserVoice?
CosmosDB
Working with the change feed support in Azure Cosmos DB
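For the Cosmos DB route, here is a minimal sketch of reading the change feed with the azure-cosmos Python SDK; the account endpoint, key, database and container names are placeholders, and the notification itself is left out.

```python
# Minimal sketch: poll the Cosmos DB change feed and react to new/updated items.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<account-key>")
container = client.get_database_client("<database>").get_container_client("<container>")

# Reads everything from the start of the feed; persist the continuation token
# between runs if you only want changes since the last poll.
for item in container.query_items_change_feed(is_start_from_beginning=True):
    print("changed item:", item["id"])  # e.g. trigger an e-mail notification here
```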

Change Tracking in an Azure SQL DB for Azure Search for non-dbo Schema

I am associating an Azure SQL DB table with my Azure Search index using an indexer. I am setting this all up in the Azure portal: https://portal.azure.com
When I try and create the Indexer in Azure Search, I get the warning about "Consider enabling integrated change tracking on your database." However, I have enabled integrated change tracking on my database and table.
I have successfully setup several tables this way, in the same database, and they're working just fine with Azure Search. However, this table has a schema other than [dbo], and the others with change tracking were [dbo]. The same SQL user is being used for all the tables, and it has been granted the change tracking permission to this table, too.
Is there a problem with the Azure website where I cannot do this via the UI? Can this be done otherwise? Is there a permission issue with my DB's schema? Something else?
Because of this warning, I have not actually created this Azure Search Index.
Any help is appreciated!
It's a limitation of the Azure Search portal - it doesn't support enabling integrated change tracking for non-default schemas. The workaround is to create the indexer programmatically, using the REST API or the .NET SDK. For a walkthrough, see https://learn.microsoft.com/azure/search/search-howto-connecting-azure-sql-database-to-azure-search-using-indexers.
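For illustration, a minimal sketch of the REST calls (shown here from Python with requests); the service URL, admin key, api-version, schema/table name and index name are placeholders.

```python
# Minimal sketch: create a data source with SQL integrated change tracking and
# an indexer for it via the Azure Cognitive Search REST API.
import requests

SERVICE = "https://<search-service>.search.windows.net"
HEADERS = {"Content-Type": "application/json", "api-key": "<admin-api-key>"}
API = {"api-version": "2020-06-30"}

datasource = {
    "name": "sales-datasource",
    "type": "azuresql",
    "credentials": {"connectionString": "<azure-sql-connection-string>"},
    # Qualify the table with its non-dbo schema here.
    "container": {"name": "sales.Orders"},
    "dataChangeDetectionPolicy": {
        "@odata.type": "#Microsoft.Azure.Search.SqlIntegratedChangeTrackingPolicy"
    },
}
requests.post(f"{SERVICE}/datasources", headers=HEADERS, params=API, json=datasource).raise_for_status()

indexer = {
    "name": "sales-indexer",
    "dataSourceName": "sales-datasource",
    "targetIndexName": "sales-index",
    "schedule": {"interval": "PT15M"},
}
requests.post(f"{SERVICE}/indexers", headers=HEADERS, params=API, json=indexer).raise_for_status()
```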

How do I back up my Azure Search index?

I am using Azure Search and would like to make sure I can recover from a self-inflicted disaster before I push more docs in there. How do I back up my index?
Is creating Azure Search replicas equivalent to making a backup?
How would one restore that?
Thanks
Microsoft has released a console app on GitHub that can be used to back up and restore Azure Search indexes - it's not perfect, but I use it almost daily for backups and restores from prod to CI/QC/Dev instances:
https://learn.microsoft.com/en-us/samples/azure-samples/azure-search-dotnet-samples/azure-search-backup-restore-index/
Right now you can't do that from the API or the portal; just save a copy of the JSON schema to a .js file, for example. See the Get Index API.
Normally you don't need to touch the index very often - you only add, update, or remove documents.
You would need to use an indexer from an external source to push the data into Search and be able to create backups at the same time.
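A minimal sketch of saving the schema with the Get Index REST API; the service name, index name, admin key and api-version are placeholders, and note this captures only the index definition, not the documents.

```python
# Minimal sketch: download the index definition with the Get Index REST API
# so the index can be recreated later from the saved JSON.
import json
import requests

SERVICE = "https://<search-service>.search.windows.net"
INDEX = "<index-name>"

resp = requests.get(
    f"{SERVICE}/indexes/{INDEX}",
    headers={"api-key": "<admin-api-key>"},
    params={"api-version": "2020-06-30"},
)
resp.raise_for_status()

with open(f"{INDEX}-schema.json", "w") as f:
    json.dump(resp.json(), f, indent=2)
```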
If it's an Azure SQL database, this may do it for you automatically, depending on your subscription.
Create a table with the same fields as the Azure Search index and add a deleted flag and a last-update date, then import all of your data into the database. Set the date column to the time that you imported the data.
At the top of the Azure Search blade there is an option to 'Import Data'. This lets you connect the data source; that way you can create an index that looks at the last-modified date and the deleted flag when you create the connection.
The wizard will take you through all of the options.
From there, just update the SQL table with your changes and the indexer will automatically push them to Azure Search.
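For reference, the data source the Import Data wizard builds can also be defined through the REST API; a minimal sketch, with the high-water-mark policy pointed at the last-update-date column and the soft-delete policy at the deleted flag (all names, the key and the api-version are placeholders).

```python
# Minimal sketch: the data source behind the wizard, defined via the REST API.
import requests

SERVICE = "https://<search-service>.search.windows.net"
HEADERS = {"Content-Type": "application/json", "api-key": "<admin-api-key>"}

datasource = {
    "name": "backup-table-datasource",
    "type": "azuresql",
    "credentials": {"connectionString": "<azure-sql-connection-string>"},
    "container": {"name": "dbo.SearchDocuments"},
    # Re-index rows whose last-update date has moved forward...
    "dataChangeDetectionPolicy": {
        "@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
        "highWaterMarkColumnName": "LastUpdated",
    },
    # ...and remove documents whose deleted flag is set.
    "dataDeletionDetectionPolicy": {
        "@odata.type": "#Microsoft.Azure.Search.SoftDeleteColumnDeletionDetectionPolicy",
        "softDeleteColumnName": "IsDeleted",
        "softDeleteMarkerValue": "1",
    },
}

requests.post(
    f"{SERVICE}/datasources",
    headers=HEADERS,
    params={"api-version": "2020-06-30"},
    json=datasource,
).raise_for_status()
```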
Thank you for the answer about https://learn.microsoft.com/en-us/rest/api/searchservice/Get-Index
Sometimes the Azure Search index is the only source from which to restore data.
For example, in Microsoft QnA Maker - if you delete the Azure web app or Azure App Service, you can no longer even export the knowledge base from QnA Maker.
To somehow restore the data from QnA Maker, I used the Azure Search index.
