I want to copy the data from an Azure Cosmos DB database/container to my local machine.
I am trying with the AzCopy tool. I have tried the command as per this URL, but it's not working.
The command I have tried is below:
Azcopy /Source:<Endpoint\Database\Container> /SourceKey:key /"<PrimaryKey>" /Dest:<Local Location> /EntityOperation:InsertOrReplace
What do I have to modify in this command to get the data from Cosmos DB to my local folder?
The above method is recommended for the Table API, as mentioned in the doc; if you want to migrate data from the SQL API, use the Data Migration Tool.
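Also note that the export direction is the reverse of what you wrote: for Table API the table endpoint is the /Source and the local folder is the /Dest, and /EntityOperation only applies when importing into a table. A rough sketch using the legacy AzCopy (v7.x) table-export syntax; the account, table name, key, and manifest name are placeholders, and the exact Cosmos Table endpoint is an assumption to verify against your account's connection string:

AzCopy /Source:https://<account>.table.cosmos.azure.com/<TableName>/ /Dest:C:\ExportFolder\ /SourceKey:<PrimaryKey> /Manifest:"<TableName>.manifest" /PayloadFormat:CSV

This writes the entities to the local folder (JSON by default; /PayloadFormat:CSV switches to CSV), and the manifest file lets you re-import them later.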
I need to save the output from a Kusto query on monitoring logs into a database table, but I am unable to find a way to do it. I am presuming there is a way to get the output from a Kusto query, save it to storage, and then pull that data into a table using a pipeline.
Any suggestions welcome
I have reproduced this in my environment and got the expected results, as below.
Firstly, I executed the below Kusto query and exported the result to a CSV file on my local machine:
AzureActivity
| project OperationName,Level,ActivityStatus
Then I uploaded the CSV file from my local machine to my Blob storage account.
Next, I created an ADF instance, created a new pipeline in it, and added a Copy activity to that pipeline.
Then I created a linked service for Blob storage as the source and a linked service for the SQL database as the sink.
In the source dataset I pointed to the blob CSV file, and in the sink dataset I pointed to the SQL Server table.
In the Copy activity sink settings, I set the table option to Auto create table.
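For reference, the relevant part of the Copy activity ends up looking roughly like this in the pipeline JSON (a trimmed sketch; the dataset names are placeholders):

{
    "name": "CopyBlobCsvToSql",
    "type": "Copy",
    "inputs": [ { "referenceName": "BlobCsvDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "AzureSqlTableDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "AzureSqlSink", "tableOption": "autoCreate" }
    }
}

The "tableOption": "autoCreate" setting is what lets the sink create the table automatically when it does not already exist.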
The copied rows were then visible in the SQL query editor.
So what we do now is: we have created a Logic App that runs the query in real time and returns the data via HTTP, and then we save that to the table. No manual intervention.
I want to take an archive (backup) of an Azure SQL table to Azure Blob Storage. I have done the backup to Azure Blob Storage via a pipeline, in CSV file format. From Azure Blob Storage, I have successfully restored the data into the Azure SQL table using the BULK INSERT process.
But now I want to retrieve the data from this CSV file using some kind of filter criteria. Is there any way that I can apply a filter query on Azure Blob storage to retrieve the data?
Is there another way to take the backup so that I can then retrieve the data from Azure Storage?
My end goal is to take a backup of the Azure SQL table in Azure Storage and retrieve the data directly from Azure Storage with a filter.
Note
I know that I can take a backup using SSMS, but that is not what I want; I want this process done through some kind of pipeline or using a SQL command.
AFAIK, there is no such filtering option available when restoring the data. But since you are asking for another way of backing up and restoring, SQL Server Management Studio (SSMS) is one of the most convenient platforms for almost all SQL Server related activities.
You can use SSMS to access the Azure SQL database using the server name and login password.
See this official tutorial from Microsoft on how to take a backup of your Azure SQL database, store it in a storage account, and then restore it.
What are the best ways to back up and restore an Azure SQL Database schema in the Azure cloud?
I have tried creating BACPAC files, but the problem with that is that they are imported as a new database. I want to back up and restore only a specific schema within the same database.
Another way I am looking at is creating a SQL script file that contains data and schema using SSMS, but the size of that SQL script is huge.
Any help is greatly appreciated
We can use the bcp utility to export and import the data quickly.
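For example (a sketch, not a tested script; the server, database, table, and credentials below are placeholders):

Export a table from the source database:
bcp dbo.MyTable out C:\backup\MyTable.dat -S <server>.database.windows.net -d <SourceDb> -U <user> -P <password> -n

Import it into the target:
bcp dbo.MyTable in C:\backup\MyTable.dat -S <server>.database.windows.net -d <TargetDb> -U <user> -P <password> -n

The -n switch uses native format, which is the fastest option. You would repeat this per table in the schema, and the table definitions themselves still need to be scripted separately, since bcp moves data only.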
I want to back up and restore specific schema only within the same database.
There is no native tool for Azure SQL Database that can back up and restore only a certain schema.
The closest one to the requirement is a BACPAC; however, it can only restore data into an empty or new database.
Therefore, a possible option is to move the data out and then back in using ETL tools like:
SSIS
ADF
Databricks
I have a table [Assets] on Azure SQL Server with columns (Id, Name, Owner, Asset). The [Asset] column is a varbinary (blob) column that stores PDF files.
I would like to use Azure Search to search through the content of this column. Currently, Azure Search can be used directly with Blob storage or Table storage; however, I am not able to find a solution for my scenario. Any help in terms of approach is greatly appreciated.
Is it possible for you to create a SQL VM, sync your data from SQL Azure to the VM with SQL Data Sync, and then sync the data on the SQL VM with Azure Search as explained here?
Another option is to move your SQL Azure database to a SQL VM on Azure, then sync data on SQL VM with Azure Search as explained here.
Hope this helps.
Azure Search SQL indexer doesn't support document extraction from varbinary/blob columns.
One approach would be to upload the file data into Azure blob storage and then use Azure Search blob indexer.
Another approach is to use Apache Tika or iTextSharp to extract text from PDF in your code and then index it with Azure Search.
I have read this SO question, but mine is specifically about the "import" of the CSV, not about how to access the blob to get the CSV out.
Which is the best way?
1) CSV stored in the blob: use a worker role, read the CSV from the blob, parse the data, and update the database.
2) Is SqlBulkCopy/BULK INSERT an option? The challenge here is that it should not involve anything on-premises; everything stays within Azure: blob -> SQL Database.
3) Will Azure Automation help? Are there PowerShell scripts/workflows that help with such a bulk load of CSV data into Azure SQL DB? I haven't found any, though.
Are there other options that help import blob CSV data to SQL DB without having to write custom code?
Appreciate any thoughts...
Your first method would work. You could also use AzCopy (http://aka.ms/azcopy) to download the file locally and then use BCP to load it into SQL; this way you won't have to write any code for this.
Azure Automation would help if you want to do this repeatedly. You should be able to set this up as a script even if one doesn't exist.
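A rough sketch of the download-then-load approach, using the legacy AzCopy (v7.x) syntax; the account, container, file, table, and credentials are placeholders:

AzCopy /Source:https://<account>.blob.core.windows.net/<container> /Dest:C:\data /SourceKey:<storage-key> /Pattern:"data.csv"
bcp dbo.TargetTable in C:\data\data.csv -S <server>.database.windows.net -d <database> -U <user> -P <password> -c -t, -F2

Here -c loads character data, -t, sets the comma field terminator, and -F2 skips the header row; real-world CSVs with quoted fields usually need a bcp format file instead.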
I know this is an outdated question, but for anyone looking for a quick way to do this, feel free to check my article on how to do it quickly using a SQL procedure triggered by a Logic App.
In short, you run this on master:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'UNIQUE_STRING_HERE'
Then you run this on the database:
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=SAS_TOKEN_HERE';
CREATE EXTERNAL DATA SOURCE AzureBlob
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://<account_name>.blob.core.windows.net/<container_name>',
CREDENTIAL = BlobCredential
);
And then
BULK INSERT <my_table>
FROM '<file_name>.csv'
WITH (
DATA_SOURCE = 'AzureBlob',
FORMAT = 'CSV',
FIRSTROW = 2
);
Just wrap this insert in a procedure and execute it from the Logic App.
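For example, a minimal wrapper could look like this (the procedure name is a placeholder, and the table and file placeholders are the same as above):

CREATE PROCEDURE dbo.usp_ImportCsvFromBlob
AS
BEGIN
    BULK INSERT <my_table>
    FROM '<file_name>.csv'
    WITH (
        DATA_SOURCE = 'AzureBlob',
        FORMAT = 'CSV',
        FIRSTROW = 2
    );
END;

The Logic App then just calls EXEC dbo.usp_ImportCsvFromBlob (for example, via the SQL Server connector's Execute stored procedure action) whenever its trigger fires.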
https://marczak.io/posts/azure-loading-csv-to-sql/
Or just use ADF, like here:
https://azure4everyone.com/posts/2019/07/data-factory-intro/
Late answer to an old question, but...
If you can use Azure SQL Data Warehouse, you could take advantage of PolyBase to directly query the CSV data stored in the blob: https://learn.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-polybase-guide#export-data-to-azure-blob-storage. This lets you map the data directly as an external table and query it dynamically.
This saves you the trouble of writing an external tool or solution for extracting, parsing, and uploading the data to the Azure SQL database. Unfortunately, PolyBase only works with Azure SQL Data Warehouse, not Azure SQL Database, but you could set up something that reads the structured data from the warehouse into your solution.
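A rough sketch of what that looks like in SQL Data Warehouse; the object names, container path, and column list are placeholders, and private storage also needs a database scoped credential on the data source:

CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
);

CREATE EXTERNAL DATA SOURCE CsvBlobSource
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://<container>@<account>.blob.core.windows.net'
    -- add CREDENTIAL = <database scoped credential> for non-public storage
);

CREATE EXTERNAL TABLE dbo.CsvExternal (
    Col1 INT,
    Col2 NVARCHAR(100)
)
WITH (
    LOCATION = '/csv-folder/',
    DATA_SOURCE = CsvBlobSource,
    FILE_FORMAT = CsvFormat
);

You can then SELECT from dbo.CsvExternal with any filter you like, or CTAS it into a regular table.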
I know this question is two years old, but for those just now searching on the topic, I'd like to mention that the Azure Feature Pack for SSIS makes this an easy task. In Visual Studio Data Tools, after installing the Azure Feature Pack, you would open an empty SSIS project and 1) create an Azure Storage connection manager, 2) add a Data Flow Task, then open the Data Flow Task and 3) add an Azure Blob Source to connect to the CSV, and then 4) use the Destination Assistant to connect to the SQL table where the data is going. You can then execute this as a one-time load interactively inside the Visual Studio Data Tools IDE, or publish it to the SQL Server instance and create a recurring job.