Bulk Load Files into SQL Azure?

I have an ASP.NET app that takes multi-megabyte file uploads, writes them to disk, and later loads them into MSSQL 2008 with BCP.
I would like to move the whole thing to Azure, but since there are no "files" for BCP, can anyone comment on how to get bulk data from an Azure app into SQL Azure?
I did see http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx but am not sure if that applies.
Thanks.

BCP is one way to do it.
This post explains it in three easy steps:
Bulk insert with Azure SQL
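
For reference, a minimal sketch of such a bcp invocation against SQL Azure (server, database, table, credentials, and file name are all placeholders):

    bcp dbo.Uploads in data.txt -S tcp:yourserver.database.windows.net -d yourdb -U youruser -P yourpassword -c -t,

Here -c loads the file in character mode and -t sets the field terminator; run it from any machine that can reach the Azure SQL endpoint.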

You are on the right track. The Bulk Copy API will work; I am using it. It's the fastest way to import data because it issues INSERT BULK statements. Not to be confused with the BULK INSERT statement, which is not supported in SQL Azure. In essence, BCP and the SqlBulkCopy API use the same mechanism.
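
For anyone who wants a starting point, here is a minimal sketch of the SqlBulkCopy approach (the connection string and table name are placeholders, and it assumes the rows are already in a DataTable):

    // Minimal sketch: push an in-memory DataTable into SQL Azure via SqlBulkCopy.
    using System.Data;
    using System.Data.SqlClient;

    class BulkLoader
    {
        public static void Load(DataTable rows)
        {
            var connStr = "Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=youruser;Password=...;Encrypt=True;";
            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();
                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.Uploads";
                    bulk.BatchSize = 10000;    // commit in batches to keep transactions small
                    bulk.WriteToServer(rows);  // issues INSERT BULK under the hood
                }
            }
        }
    }

If you don't want to materialize the upload on disk at all, WriteToServer also accepts an IDataReader, so you can stream straight from the request into the table.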

See http://www.solidq.com/sqj/Pages/2011-May-Issue/Migrating-Data-into-Microsofts-Data-Platform-SQL-Azure.aspx for a detailed analysis of the available options.

I think it's important to note that BCP cannot handle Unicode source files when you use a format file for the import.

Related

Build a pipeline in Azure Data Factory to load Excel files, format the content, transform it to CSV and send it to Azure SQL DB

I'm getting started with the Azure environment, watching tutorials and reading documents, but I'm trying to figure out how to set up the flow described below. The starting point is a set of reports in .xlsx format produced monthly by the Mktg Dept: the requirement is to bring them into an Azure SQL DB so that the data can be stored and analysed. So far I have managed to put those files (previously converted to .csv by hand) in BLOB storage and build an ADF pipeline that copies each file into a table in the SQL DB.
The problem is that, as far as I understood, ADF cannot directly handle xlsx files, and I'm wondering how to set up an automated procedure that converts them from .xlsx to .csv and saves them to BLOB storage. I was thinking about adding a Python script or a Databricks notebook to the pipeline to convert the format, but I'm not sure this is the best solution. Any hint or reference to an existing tutorial or resource would be much appreciated.
I found a tutorial which uses Logic Apps to do the conversion.
Datanovice indirectly suggested using a Custom activity to run either a C# or Python application to do the conversion for you.
The least expensive solution would be to do the conversion before uploading to blob, as Datanovice said. If you do convert in the pipeline itself, see the sketch below.
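
If you go the custom-code route, a minimal sketch of the conversion step could look like this (assuming the ClosedXML NuGet package; file names are placeholders, and CSV quoting/escaping is omitted for brevity):

    // Minimal sketch: convert the first worksheet of an .xlsx file to CSV.
    using System.IO;
    using System.Linq;
    using ClosedXML.Excel;

    using (var workbook = new XLWorkbook("report.xlsx"))
    using (var writer = new StreamWriter("report.csv"))
    {
        foreach (var row in workbook.Worksheet(1).RowsUsed())
        {
            // GetString() renders each used cell as text
            writer.WriteLine(string.Join(",", row.Cells().Select(c => c.GetString())));
        }
    }

The same code could run in an Azure Function or an ADF Custom activity, writing the .csv back to BLOB storage before the copy step.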

Table Replication and Synchronization in Azure

I am pretty new to the Azure cloud and stuck at a point where I want to replicate one table into another database with the same schema and table name.
By replication I mean that the new table in the other database should automatically stay in sync with the original table. I can do this using an elastic query (external) table, but the queries take far too long and sometimes time out, so I am thinking of having a local table in the other database instead of the elastic table. I am just not sure how to do this in Azure.
Note: both databases reside on the same DB server.
Any examples or links would be helpful.
Thanks
To achieve this you can use a DACPAC (Data-tier Application Package). A DACPAC can be created in Visual Studio or extracted from an existing database; it contains the database creation scripts and manages your deltas for you. More information can be found here. For information about how to build and deploy a DACPAC, both from VS and extracted from a database, see this answer.
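
As a sketch, the extract/deploy round trip with the SqlPackage utility looks like this (connection strings and file names are placeholders):

    SqlPackage /Action:Extract /SourceConnectionString:"Server=...;Database=SourceDb;..." /TargetFile:SourceDb.dacpac
    SqlPackage /Action:Publish /SourceFile:SourceDb.dacpac /TargetConnectionString:"Server=...;Database=TargetDb;..."

Note that a DACPAC carries the schema; for keeping the table data itself in sync you would still need something like Azure SQL Data Sync or a scheduled copy job.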

How to load a local file into Azure SQL DB

I have not been able to find a solution to this so will ask the experts.
A co-worker has a .txt file on his laptop that we want to load into Azure SQL DB using SSMS and BULK INSERT. We can open the local file easily enough, but we don't know how to reference this file in the FROM clause.
Assuming a file named myData.txt is saved to
c:\Users\Someone
how do we tell Azure SQL DB where that file is?
You don't. :) You have to upload the file to an Azure Blob Store and then, from there, you can use BULK INSERT or OPENROWSET to open it (see the sketch after the links below).
https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017
I've written an article that describes the steps to open a JSON file here:
https://medium.com/@mauridb/work-with-json-files-with-azure-sql-8946f066ddd4
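
For completeness, a sketch of the blob-backed BULK INSERT in T-SQL (storage account, container, SAS token, and table names are placeholders):

    -- One-time setup: point the database at the blob container
    -- (a database master key may need to exist first)
    CREATE DATABASE SCOPED CREDENTIAL BlobCred
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET = '<SAS token, without the leading ?>';

    CREATE EXTERNAL DATA SOURCE MyBlobStore
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://myaccount.blob.core.windows.net/mycontainer',
          CREDENTIAL = BlobCred);

    -- The file is then addressed relative to the data source location
    BULK INSERT dbo.MyTable
    FROM 'myData.txt'
    WITH (DATA_SOURCE = 'MyBlobStore', FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');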
I fixed this problem by uploading the file to a local database and then using a linked server to my Azure DB to insert or update the records. Much easier than creating Blob storage. However, if the file is very big or you have a lot of files to upload, you might not want to use my method, as a linked server is not the quickest connection.

I want to back up and restore a SQLite database file using the Xamarin Azure SDK for my Xamarin app

I am using the Xamarin Azure SDK to download and manage the local database for my Xamarin.Forms app.
We are facing long download times because we have a lot of data, so I am thinking of taking a backup of the SQLite file from one device and using it to restore on other devices.
The plan is to store the SQLite backup in Azure Blob storage, and on each new device download that blob and restore it as the local SQLite file.
Any Help will be appreciated.
Thanks :)
An approach I have used in the past is to create a controller method on the Azure end, which the client app can call, that generates a pre-filled SQLite database or 'snapshot' on the server (making sure you include all the extra Azure tables and columns) and then returns a download URL for the file to the client. We also zip up the snapshot database to reduce download times. You could store this 'snapshot' in Azure Blob if you wanted.
Please refer to the link given below; note the caveat that the only thing not supported is SQLite relationships such as foreign keys.
Memory Stream as DB
You can upload the backup file to Blob storage along with the respective user details, and when there is a call with the same user details you can download it from Blob.
Those are the links that provide you with the code / knowledge required to use Azure Blob Storage on Xamarin:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-xamarin-blob-storage
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs
As this question is very general, I can only provide these general links. There are many details on how to implement this in your case; if you face a specific problem, I recommend asking another question with an exact description of that problem.
EDIT: According to your comment you have some problems replacing the local file. The only catch is that you must replace it before you initialize SQLite; otherwise it is a simple file operation.
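
A minimal sketch of that blob round trip from the app side (assuming the Azure.Storage.Blobs NuGet package; the connection string, container, blob name, and local path variables are placeholders):

    using Azure.Storage.Blobs;

    // Inside an async method; connectionString and localDbPath are assumed to exist.
    var blob = new BlobClient(connectionString, "sqlite-backups", "snapshot.db3");

    // Device A: push its SQLite file up as the shared backup
    await blob.UploadAsync(localDbPath, overwrite: true);

    // Device B: pull the backup down, overwriting the local file
    // BEFORE SQLite is initialized (see the EDIT above)
    await blob.DownloadToAsync(localDbPath);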

Azure Database Backup Loses Comments

I am using the DAC framework's Import/Export Service client tool to export a BACPAC file from a SQL Database on Azure. But when I restore the BACPAC file to the Azure database, it loses the comment on a stored procedure or a view that was originally placed above the CREATE statement. Is this because the export tool ignores comments outside of the CREATE VIEW or CREATE PROCEDURE statement?
/*
XXXX Procedure
Version 1.0
*/
CREATE PROCEDURE [dbo].XXXX
I am not familiar with DAC, but I think this is expected behavior. Comments are not used at runtime, so if they are removed, performance can be slightly improved because the engine does not need to parse them.
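
If keeping the comment matters, one workaround (my assumption about how module text is stored, not something stated in the thread) is to move the comment inside the CREATE statement, since the stored definition begins at CREATE:

    CREATE PROCEDURE [dbo].XXXX
    AS
    BEGIN
        /* XXXX Procedure, Version 1.0 -- inside the module body,
           so it travels with the definition */
        SELECT 1; -- placeholder body
    END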
Best Regards,
Ming Xu.
