Azure newbie here.
I have an architecture requirement to move data from an on-premises database to a cloud database. This is done in two steps due to security restrictions and timelines:
Move the file to Azure Blob Storage.
Read from the blob and import into the SQL database.
Azure Blob Storage is suggested for unstructured data. However, the data we want to move to the cloud is a simple export of SQL tables to CSV files.
For such a requirement, what is recommended: Azure Blob Storage or Azure Files? When should blob storage be used versus a file share?
Actually, if you want to migrate a database from a local/on-premises SQL Server to Azure SQL, there are many ways to do that directly, without Blob or File storage, such as:
Using the Data Migration Assistant (DMA) to migrate the data/database to Azure.
Ref: Migrate on-premises SQL Server or SQL Server on Azure VMs to Azure SQL Database using the Data Migration Assistant
Using the SQL Server Management Studio (SSMS) task Deploy Database to Microsoft Azure SQL Database.
Ref: Using the Deploy Database to SQL Azure Wizard in SQL Server Management Studio to move to the Cloud
Of course, you could also export the SQL Server data as CSV files to Blob or File storage, then import the CSV files into Azure SQL. Usually, Blob Storage is the one used with Azure SQL Database. It's up to you; see the document #Gaurav Mantri-AIS mentioned.
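For the blob-to-database import step, here is a minimal T-SQL sketch using BULK INSERT with an external data source (the storage account, container, SAS token, and table names are placeholders, and a database master key is assumed to already exist):

    -- One-time setup: a credential and an external data source pointing at the container.
    CREATE DATABASE SCOPED CREDENTIAL BlobCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET = '<sas-token-without-leading-question-mark>';

    CREATE EXTERNAL DATA SOURCE MyBlobStorage
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://mystorageaccount.blob.core.windows.net/exports',
          CREDENTIAL = BlobCredential);

    -- Import one exported CSV file into an existing table.
    BULK INSERT dbo.Customers
    FROM 'customers.csv'
    WITH (DATA_SOURCE = 'MyBlobStorage',
          FORMAT = 'CSV',
          FIRSTROW = 2);  -- skip the header row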
Hope this helps.
Related
I am thinking about using Snowflake as a data warehouse. My databases are in Azure SQL Database, and I would like to know what tools I need to ETL my data from Azure SQL Database to Snowflake.
I think Snowpark could work for data transformations, but I wonder what other code tools I could use.
Also, I wonder whether I should use Azure Blob Storage as a staging area, or whether Snowflake has its own.
Thanks
You can use Hevo Data, a third-party tool, to migrate data directly from Microsoft SQL Server to Snowflake.
STEPS TO BE FOLLOWED
Make a connection to your Microsoft SQL Server database.
Choose a replication mode.
Create a Snowflake Data Warehouse configuration.
Alternatively, you can use SnowSQL to connect Microsoft SQL Server to Snowflake: export the data from SQL Server using SSMS, upload it to either Azure storage or S3, and move the data from storage into Snowflake, as sketched below.
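A minimal Snowflake SQL sketch of that final load step (the stage URL, SAS token, and table name are placeholders):

    -- Create an external stage pointing at the Azure container holding the CSV files.
    CREATE OR REPLACE STAGE azure_csv_stage
      URL = 'azure://mystorageaccount.blob.core.windows.net/exports'
      CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>');

    -- Load the exported file into the target table.
    COPY INTO mydb.public.customers
    FROM @azure_csv_stage/customers.csv
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);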
REFERENCES:
Microsoft SQL Server to Snowflake
How to move the data from Azure Blob Storage to Snowflake
I want to take an archival (backup) of an Azure SQL table to Azure Blob Storage. I have done the backup to Azure Blob Storage via a pipeline, in CSV file format, and from Azure Blob Storage I have successfully restored the data into the Azure SQL table using the BULK INSERT process.
But now I want to retrieve the data from this CSV file using some filter criteria. Is there any way to apply a filter query against Azure Blob Storage to retrieve the data?
Alternatively, is there a different way to take the backup so that I can then retrieve the data from Azure Storage?
My end goal is to take a backup of the Azure SQL table in Azure Storage and retrieve the data directly from Azure Storage with a filter.
Note
I know that I can take a backup using SSMS, but that is not what is required; I want this process to run through some kind of pipeline or SQL command.
AFAIK, there is no such filtering option available when restoring the database. But since you are asking for another way to back up and restore, SQL Server Management Studio (SSMS) is one of the most convenient platforms for almost all SQL Server related activities.
You can use SSMS to access an Azure SQL database using the server name and login credentials.
See this official tutorial from Microsoft on how to take a backup of your Azure SQL database, store it in a storage account, and then restore it.
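That said, if the goal is only to read a filtered subset of the archived CSV rather than restore the whole file, one workaround is to bulk-load it into a staging table and filter with ordinary T-SQL. A minimal sketch, assuming an external data source named MyBlobStorage already points at the container (file, table, and column names are placeholders):

    -- Stage the archived CSV, then filter with a normal WHERE clause.
    CREATE TABLE #ArchiveStaging (CustomerId INT, Name NVARCHAR(100), CreatedOn DATE);

    BULK INSERT #ArchiveStaging
    FROM 'backups/customers.csv'
    WITH (DATA_SOURCE = 'MyBlobStorage',
          FORMAT = 'CSV',
          FIRSTROW = 2);

    SELECT CustomerId, Name
    FROM #ArchiveStaging
    WHERE CreatedOn >= '2023-01-01';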
I want to solve the problems below:
1) Take a database dump from an on-premises Oracle database.
2) Create an Oracle database in Azure.
3) Place the on-premises dump file in Blob storage and import the database into the Azure Oracle database.
To create the database dump, I am trying to use the SQL Developer Database Export utility, but I am struggling with the output format.
For the Azure Oracle DB, I have deployed Oracle Standard 12.2 from the Marketplace, but I don't know how to create a database and import it using the dump file in Blob storage.
You could use the Copy Activity in Azure Data Factory.
It supports both an Oracle Database connector and an Azure Blob Storage connector.
Here is an official detailed guide about transferring data to or from an on-premises Oracle DB.
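If you want to load the Data Pump dump file itself, you would typically copy it from Blob storage onto the Oracle VM (for example with AzCopy) into the OS directory backing DATA_PUMP_DIR, then import it with impdp. A hedged PL/SQL sketch of the same import using the DBMS_DATAPUMP API (file, directory, and schema mode are placeholders):

    -- Import a schema-mode Data Pump dump that already sits in DATA_PUMP_DIR.
    DECLARE
      h          NUMBER;
      job_state  VARCHAR2(30);
    BEGIN
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA');
      DBMS_DATAPUMP.ADD_FILE(h, 'onprem_export.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.ADD_FILE(h, 'import.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
      DBMS_OUTPUT.PUT_LINE('Import finished with state: ' || job_state);
    END;
    /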
I'm doing some tests with Azure Data Lake Analytics and I can’t add a new SQL Server database as a Data Source. When I click on "Add data source", the only two available options are: "Azure Data Lake Storage Gen1" and "Azure Storage".
What I want is to add one SQL Server database so that I can run U-SQL queries against it.
Our SQL Server firewall is correctly configured to allow access to Azure Services, but I am not allowed to add it as a data source.
How can this be done? Is it a matter of other configuration issues?
Any help would be greatly appreciated.
Per my research, this is not a configuration issue with the SQL Server data source in Data Lake Analytics. Based on the official doc, Data Lake Analytics only supports two data sources: Data Lake Store and Azure Storage.
As a workaround, I suggest using Azure Data Factory to transfer the data from your SQL Server database to Azure Storage, so that you can run U-SQL scripts against that data source.
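Once Data Factory has landed the data in Azure Storage as CSV, a minimal U-SQL sketch against it might look like this (the wasb path, columns, and output location are placeholders, and it assumes the storage account has been added to the Data Lake Analytics account as a data source):

    // Read the CSV that Data Factory copied into blob storage.
    @rows =
        EXTRACT CustomerId int,
                Name       string
        FROM "wasb://exports@mystorageaccount.blob.core.windows.net/customers.csv"
        USING Extractors.Csv(skipFirstNRows: 1);

    // Any U-SQL transformation can now run against the rowset.
    @filtered =
        SELECT CustomerId, Name
        FROM @rows
        WHERE CustomerId > 100;

    OUTPUT @filtered
    TO "/output/filtered_customers.csv"
    USING Outputters.Csv(outputHeader: true);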
Any concerns, please let me know.
I am a beginner with the Azure portal. I have configured Azure Application Insights on the front end (Angular 2) and the back end (ASP.NET Core).
I can track my application logs through Azure Application Insights and also export them as an XLS sheet (http://dailydotnettips.com/2015/12/04/exporting-application-insights-data-to-excel-its-just-a-single-click/), but I need to store all my log files in Azure Data Lake storage for backup-tracking purposes.
I need to be able to debug my application when issues occur. I found the link https://learn.microsoft.com/en-us/azure/application-insights/app-insights-code-sample-export-sql-stream-analytics and "Can I download data collected by Azure Application Insights (events list)?", which describe continuous export to SQL and Blob storage, but I don't want an extra, unwanted storage resource in Azure just to hold my data.
So, is there any way to connect Application Insights to Azure Data Lake through a connector or plugin? If so, could you please share a link?
Thank you.
Automatic
If you export the events to Azure Blob Storage, you can do multiple things:
Use Azure Data Factory to copy the data from blob storage to Azure Data Lake
Use AdlCopy to copy the data from blob storage to Azure Data Lake
Write a U-SQL job to copy the data to Azure Data Lake (see the sketch after this list)
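As a rough sketch of that U-SQL option (the container, account, and file-name pattern are placeholders; it assumes the exported blobs are line-oriented JSON and that the backspace character never occurs in them, so it can serve as a dummy column delimiter):

    // Pass each exported Application Insights blob through as raw lines of text.
    @lines =
        EXTRACT line string,
                filename string   // virtual column bound to the {filename} pattern
        FROM "wasb://appinsights@mystorageaccount.blob.core.windows.net/{filename}.blob"
        USING Extractors.Text(delimiter: '\b', quoting: false);

    @out =
        SELECT line
        FROM @lines;

    // Land the combined export in the Data Lake Analytics default store (ADLS).
    OUTPUT @out
    TO "/appinsights/export.json"
    USING Outputters.Text(quoting: false);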
Manual
To manually place exported Application Insights data (in .xls format) in Azure Data Lake, you can use the portal to upload the file.
If you need more control over the exported data, you can use Application Insights Analytics to create a query based on the available data and export the result to an .xls file.
Of course, you can also create a small app to push the .xls file to Azure Data Lake if you do not want to upload it through the portal; you can use the Data Lake Store API for that.