I have not been able to find a solution to this, so I will ask the experts.
A co-worker has a .txt file on his laptop that we want to load into Azure SQL DB using SSMS and BULK INSERT. We can open the local file easily enough, but we don't know how to reference this file in the FROM clause.
Assuming a file named myData.txt is saved to
c:\Users\Someone
how do we tell Azure SQL DB where that file is?
You don't. :) You have to upload the file to Azure Blob Storage and then, from there, you can use BULK INSERT or OPENROWSET to open it.
https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017
I've written an article that describes the steps to open a JSON file here:
https://medium.com/@mauridb/work-with-json-files-with-azure-sql-8946f066ddd4
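To make the blob-backed approach concrete, here is a minimal sketch, shown in Python via pyodbc since the file starts out on a laptop rather than on a server. Every name in it (server, table, container, credential) is a placeholder, and the file is assumed to have already been uploaded to the container (for example with Azure Storage Explorer or azcopy):

import pyodbc

# A sketch only: all names and secrets below are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;UID=myuser;PWD=mypassword",
    autocommit=True,
)
cursor = conn.cursor()

# One-time setup: a database master key must already exist, then the container
# is registered as an external data source that BULK INSERT can read from.
cursor.execute("""
    CREATE DATABASE SCOPED CREDENTIAL MyBlobCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET = '<sas-token-without-leading-question-mark>'
""")
cursor.execute("""
    CREATE EXTERNAL DATA SOURCE MyBlobStore
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://myaccount.blob.core.windows.net/mycontainer',
          CREDENTIAL = MyBlobCredential)
""")

# Load the uploaded myData.txt into an existing table; the FROM path is
# relative to the container named in the data source.
cursor.execute("""
    BULK INSERT dbo.MyData
    FROM 'myData.txt'
    WITH (DATA_SOURCE = 'MyBlobStore',
          FIELDTERMINATOR = ',',
          ROWTERMINATOR = '0x0a')
""")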
I fixed this problem by loading the file into a local database and then using a linked server to my Azure DB to insert or update the records. Much easier than creating Blob Storage. However, if the file is very big or you have a lot of files to upload, you might not want to use my method, as linked servers are not the quickest connection.
I'm new to the Azure environment and have been watching tutorials and reading documents, but I'm trying to figure out how to set up a flow that enables the process I will describe below. The starting point is a set of reports in .xlsx format produced monthly by the Marketing Dept; the requirement is to bring them into Azure SQL DB so that the data can be stored and analysed. So far I have managed to put those files (previously converted manually to .csv format) into Blob storage and build an ADF pipeline that copies each file into a table in the SQL DB.
The problem is that, as far as I understand, ADF cannot handle .xlsx files directly, and I'm wondering how to set up an automated procedure that converts the files from .xlsx to .csv and saves them to Blob storage. I was thinking about adding a Python script or Databricks notebook to the pipeline to convert the format, but I'm not sure this is the best solution. Any hint or reference to an existing tutorial or resource would be much appreciated.
I found a tutorial which uses Logic Apps to do the conversion.
Datanovice indirectly suggested using a Custom activity to run either a C# or Python application to do the conversion for you.
The least expensive solution would be to do the conversion before uploading to blob, like Datanovice said.
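To illustrate that conversion step, here is a minimal sketch in Python using pandas with the openpyxl engine; the file names are placeholders, and the function could run in a Databricks notebook or be packaged into an ADF Custom activity, with the resulting .csv then copied to Blob storage as before:

import pandas as pd

def xlsx_to_csv(xlsx_path, csv_path, sheet_name=0):
    # Read one worksheet from the .xlsx report and write it back out as .csv.
    df = pd.read_excel(xlsx_path, sheet_name=sheet_name, engine="openpyxl")
    df.to_csv(csv_path, index=False)

xlsx_to_csv("mktg_report_2020_01.xlsx", "mktg_report_2020_01.csv")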
I'm using Databricks on Azure and am using a library called openpyxl.
I'm running the sample code shown here, and the last line of the code is:
wb.save('document.xlsx', as_template=False)
The code seems to run, so I'm guessing it's storing the file somewhere on the cluster. Does anyone know where, so that I can then transfer it to Blob storage?
To save a file to the FileStore, put it in the /FileStore directory in DBFS:
dbutils.fs.put("/FileStore/my-stuff/my-file.txt", "Contents of my file")
Note: FileStore is a special folder within the Databricks File System (DBFS) where you can save files and have them accessible from your web browser.
For more details, refer to "Databricks - The FileStore".
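For the openpyxl case in the question, a minimal sketch of getting the saved workbook into the FileStore could look like this, run inside a Databricks notebook where dbutils is predefined and wb is the workbook from the sample code (the paths are placeholders):

# Save to the driver's local filesystem first, then copy the file into DBFS,
# from where it can be served to the browser or moved on to Blob storage.
wb.save("/tmp/document.xlsx")
dbutils.fs.cp("file:/tmp/document.xlsx", "dbfs:/FileStore/my-stuff/document.xlsx")

# Verify the copy landed where expected.
display(dbutils.fs.ls("dbfs:/FileStore/my-stuff/"))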
Hope this helps.
I am using the Xamarin Azure SDK to download and manage the local database for my Xamarin.Forms app.
We are facing download-time issues because we have a lot of data, so I am thinking of taking a backup of the SQLite file from one device once and restoring that same file on other devices.
The plan is to use Azure Blob storage to store the backup of the SQLite file; each of the other devices would then download that blob and restore it locally.
Any help will be appreciated.
Thanks :)
An approach I have used in the past is to create a controller method on the Azure end, which the client app can call, that generates a pre-filled SQLite database or 'snapshot' on the server (making sure you include all the extra Azure tables and columns) and then returns a download URL for the file to the client. We also zip up the snapshot database to reduce download times. You could store this 'snapshot' in Azure Blob storage if you wanted.
Please refer to the link given below; the only thing SQLite does not support here is relationships such as foreign keys.
Memory Stream as DB
You can upload the backup file to Blob storage along with the respective user details, and when there is a call with the same user details you can download it from the blob.
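As a sketch of that idea, here is the blob side only, written in Python purely for illustration (the Xamarin app itself would use the Azure Storage SDK for .NET/Xamarin); the connection string, container name and per-user naming convention are all assumptions:

from azure.storage.blob import BlobClient

CONN_STR = "<storage-account-connection-string>"
CONTAINER = "sqlite-backups"

def upload_backup(user_id, local_db_path):
    # Upload one user's SQLite file as a blob keyed by the user id.
    blob = BlobClient.from_connection_string(CONN_STR, CONTAINER, f"{user_id}.db3")
    with open(local_db_path, "rb") as data:
        blob.upload_blob(data, overwrite=True)

def download_backup(user_id, local_db_path):
    # Fetch the same user's blob onto another device, before SQLite is initialized.
    blob = BlobClient.from_connection_string(CONN_STR, CONTAINER, f"{user_id}.db3")
    with open(local_db_path, "wb") as out:
        out.write(blob.download_blob().readall())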
These are the links that provide you with the code/knowledge required to use Azure Blob Storage with Xamarin:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-xamarin-blob-storage
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs
As this question is very general, I can only provide those general links. There are many details to how you would implement this in your case; if you face a specific problem, I recommend asking another question with an exact description of that problem.
EDIT: According to your comment, you are having some problems replacing the local file. The only thing is that you must replace it before you initialize SQLite; otherwise it is a simple file operation.
I want to import data from an Excel file into an Oracle database using Oracle Forms 10g. I tried using WebUtil, but it is very slow.
Can anyone help me find another way?
Big Thanks
The easiest way is to get the file onto the database server.
If the file is on the client side, you can do this by using a shared drive between your application server and DB server and then transferring the file using WebUtil.
Once the file is on the DB server, read it using an external table.
See this link for more information on external tables.
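For reference, the external table itself could be defined roughly as below. ORACLE_LOADER reads delimited text, so the Excel sheet would first be saved as .csv before being transferred to the server. The directory object, column list and file name are placeholder assumptions, and the statements are shown here through python-oracledb only to keep the examples in one language; the same DDL can be run from SQL*Plus or any other client:

import oracledb

# Placeholders throughout: connection details, directory object, columns, file name.
conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/orclpdb1")
cur = conn.cursor()

# The directory object must already point at the server-side folder holding the
# file, e.g. created once by a DBA: CREATE DIRECTORY data_dir AS '/u01/app/loads'
cur.execute("""
    CREATE TABLE excel_data_ext (
      emp_id   NUMBER,
      emp_name VARCHAR2(100),
      salary   NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('mydata.csv')
    )
    REJECT LIMIT UNLIMITED
""")

# The external table can now be queried like any other table, or used to load
# a permanent table in one statement.
cur.execute("INSERT INTO employees SELECT * FROM excel_data_ext")
conn.commit()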
I have an ASP.NET app that takes multi-megabyte file uploads, writes them to disk, and MSSQL 2008 later loads them with BCP.
I would like to move the whole thing to Azure, but since there are no "files" for BCP, can anyone comment on how to get bulk data from an Azure app into SQL Azure?
I did see http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx but am not sure if that applies.
Thanks.
BCP is one way to do it.
This post explains it in three easy steps:
Bulk insert with Azure SQL
You are on the right track. The Bulk Copy API will work; I am using it, and it's the fastest way to import data because it uses INSERT BULK statements. Not to be confused with the BULK INSERT statement, which is not supported in SQL Azure. In essence, BCP and the SqlBulkCopy API use the same method.
See http://www.solidq.com/sqj/Pages/2011-May-Issue/Migrating-Data-into-Microsofts-Data-Platform-SQL-Azure.aspx for a detailed analysis of the options available.
I think it's important to note that BCP cannot handle Unicode source files when you are using a format file to do your import.