I have an ASP.NET WebApp that manages some Records. The Records are kept in Azure Table Storage tables.
The client gave me an Excel file with a few hundred Records in Excel table format (fields in columns).
How can I export that table from Excel to an Azure Table? I saw there is a way to import data from Azure Tables into Office 2016 Excel (via Data > Get Data > From Azure), but I'd like to know if there is a way to do it the other way around (from Excel to Azure), and perhaps apply custom logic when exporting that data (such as handling DateTime values or transforming enumerations).
I would like to import at least the string fields that do not need transformation; I would then do the rest manually or by code.
You have several options:
1. Use Azure Storage Explorer to import/export data to and from Table Storage using CSV files.
2. Upload the file to Blob Storage and use Azure Data Factory to transform and import the data.
3. Write some code to do this (see the sketch further down).
A note about the transform part: you might be able to do this in the source Excel file as well; in that case option 1 is probably the easiest.
When it comes to option 1, you can choose to use a .typed.csv file or a regular one. Using the latter, Storage Explorer will try to infer the types. So importing a .csv file that looks like this:
PartitionKey,RowKey,C1,C2
a,a,1,w
a,b,2,ww
a,c,5,www
will result in a table with four columns. (Actually there will be five; you get the Timestamp column for free.)
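For option 3, here is a minimal sketch of the code route, assuming the azure-data-tables and openpyxl packages and a workbook whose first row holds the column names; the connection string, table name, file name, and column names are placeholders, not part of the original question:

# Minimal sketch: import rows from an Excel sheet into Azure Table Storage.
# Assumes: pip install azure-data-tables openpyxl
# Connection string, table name, file name and column layout are placeholders.
from datetime import datetime, timezone

from azure.core.exceptions import ResourceExistsError
from azure.data.tables import TableClient
from openpyxl import load_workbook

CONN_STR = "<your-storage-connection-string>"
table = TableClient.from_connection_string(CONN_STR, table_name="Records")
try:
    table.create_table()
except ResourceExistsError:
    pass  # table already exists

wb = load_workbook("records.xlsx", data_only=True)
ws = wb.active
headers = [c.value for c in next(ws.iter_rows(min_row=1, max_row=1))]

for i, row in enumerate(ws.iter_rows(min_row=2, values_only=True)):
    # Skip empty cells so only populated fields become entity properties.
    entity = {k: v for k, v in zip(headers, row) if v is not None}
    # Custom logic goes here, e.g. parsing dates or mapping enumeration text to ints.
    if isinstance(entity.get("CreatedOn"), datetime):
        entity["CreatedOn"] = entity["CreatedOn"].replace(tzinfo=timezone.utc)
    entity["PartitionKey"] = "excel-import"
    entity["RowKey"] = str(i)
    table.upsert_entity(entity)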
I want to convert Excel to CSV in Azure Synapse Analytics but I got an error.
The error message is "Invalid excel header with empty value".
The Excel file I want to convert looks like this (created for the question), and I need to remove the blank column A when converting to CSV.
I have never used ADF before, so I don't know how to approach this.
Can someone please tell me how to do this?
Any help would be appreciated.
You have to use Data Flows to do that in ADF.
First, create a linked service for your source dataset.
Then create a linked service for your target folder.
My input looks like this (taken from your attached sheet):
Go to the Author tab of Data Factory and select New data flow.
Source settings should look like this
Source options: point to the location where you have stored the Excel sheet and select the sheet name; in my case it is Sheet1 (for this example I have used Azure Blob Storage).
Keep the rest of the tabs at their defaults and add a sink to your data flow.
Sink settings should look like the following.
Point to the target location where you want to store your CSV file (I have used Azure Blob Storage). Keep the rest of the settings at their defaults.
Create a new pipeline, drag a Data Flow activity onto the canvas, and trigger your data flow.
And my output CSV looks like this.
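If you prefer a code-based route instead of a data flow (for example in a Synapse notebook), a minimal pandas sketch can do the same conversion; the file paths, sheet name, and the assumption that column A is entirely empty are placeholders:

# Minimal sketch: convert an Excel sheet to CSV and drop the blank column.
# Assumes: pip install pandas openpyxl; paths and sheet name are placeholders.
import pandas as pd

df = pd.read_excel("sample.xlsx", sheet_name="Sheet1")

# Drop columns that are entirely empty (e.g. the blank column A).
df = df.dropna(axis=1, how="all")

df.to_csv("sample.csv", index=False)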
I'm new to Data Factory, and what I want to do is copy the information from several CSV files (in a storage account) into the respective, already-created tables of a SQL Server database. If, for example, I have 4 CSV files, there should be 4 tables.
I have been testing some activities, for example Copy Data, but that would force me to create the same number of datasets; if, for example, there were 15 tables, that would be too many datasets.
I want to make it dynamic but I can't figure out how to do it.
How do you suggest I do this? Any example would be appreciated, thanks.
You can either use a wildcard to read all files that are under the same blob container together; once you have read them, you can add an identifier in the mapping to determine which columns belong to which table, and use it to import the data into the respective tables. Or you can use a ForEach loop to read all files from the blob container and import the data into the respective table based on the file name (the sketch below illustrates the per-file logic).
I created the article below on copying data from one SQL database to another for beginners; you can refer to it for the initial setup.
Azure Data Factory (ADF) Overview For Beginners
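This is not ADF itself, but a minimal Python sketch of the per-file logic that the ForEach approach implements, assuming the azure-storage-blob, pandas, sqlalchemy, and pyodbc packages, that each CSV is named after its target table, and that all connection strings and names are placeholders:

# Minimal sketch: load every CSV in a blob container into the SQL table
# that matches its file name. All names and connection strings are placeholders.
import io

import pandas as pd
from azure.storage.blob import ContainerClient
from sqlalchemy import create_engine

container = ContainerClient.from_connection_string(
    "<storage-connection-string>", container_name="csv-files"
)
engine = create_engine(
    "mssql+pyodbc://user:password@myserver.database.windows.net/mydb"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

for blob in container.list_blobs():
    if not blob.name.lower().endswith(".csv"):
        continue
    data = container.download_blob(blob.name).readall()
    df = pd.read_csv(io.BytesIO(data))
    table_name = blob.name.rsplit("/", 1)[-1][:-4]  # "customers.csv" -> "customers"
    df.to_sql(table_name, engine, if_exists="append", index=False)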
I am trying to automatically export Azure SQL database table data to Excel sheets. I tried to achieve it with Azure Data Factory but couldn't succeed, as Azure Data Factory doesn't have direct support for Excel. I found some documentation which mentioned that the SQL database should be exported as a text file first. Following that documentation, I exported the SQL database data as a CSV file in Azure Blob Storage using Azure Data Factory, but couldn't proceed further. Is there any way to convert that CSV in Azure Blob to Excel in an automated way? Are there any better alternatives for the overall process?
Exporting data from SSMS to Excel by copy-pasting from the data grid, or exporting to .csv, will lose the data types, which in turn will cost you additional work when importing the data into Excel (it is imported as text).
I have developed the SSMSBoost add-in for SSMS, which allows you to copy-paste data to Excel using the native Excel clipboard format, preserving all data types.
Additionally, you can use SSMSBoost to create a .dqy query file from your current SQL script and open it in Excel (in this case Excel will use the provided connection information and SQL text to execute the query directly against your database).
I hope this helps.
You can have an Azure Function activity in your Azure Data Factory pipeline and chain it to your Copy activity. By chaining the activities, you make sure the Azure Function is invoked only once the CSV file has been written successfully.
In the Azure Function, you can use a language of your choice to write the code that converts the CSV file to XLS/XLSX.
There are a bunch of libraries you can use for the conversion. Some of them are listed below:
Simple Excel for Javascript
Some ideas to do it in Java
Python
Hope this helps.
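For the Python route, a minimal sketch of the conversion step itself, assuming pandas and openpyxl are available in the Function's environment; the file paths are placeholders, and in a real Azure Function you would read the CSV from, and write the workbook back to, Blob Storage through bindings or the storage SDK:

# Minimal sketch: convert a CSV file to an Excel workbook.
# Assumes: pip install pandas openpyxl; paths are placeholders.
import pandas as pd

def csv_to_xlsx(csv_path: str, xlsx_path: str) -> None:
    df = pd.read_csv(csv_path)
    df.to_excel(xlsx_path, sheet_name="Export", index=False)

csv_to_xlsx("export.csv", "export.xlsx")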
I didn't find a way to convert that CSV file in Azure Blob Storage to Excel in an automated way.
But I found that the tool FileSculptor can help you convert a CSV file to Excel automatically with scheduled tasks.
Main Benefits:
Convert between file formats CSV, XML, XLS and XLSX
Support for spreadsheets from Excel 97 to Excel 2019
Support for international character sets (unicode)
Select and reorder fields
Create calculated fields
Create an icon on the desktop to convert files with one click
Automatically convert files using scheduled tasks
Additional utility to convert files from the DOS command prompt or .BAT files
For more details, please see this tutorial: Convert Between CSV, XML, XLS and XLSX Files.
And about how to export an Azure SQL database to a CSV file, the tutorial How to export SQL table to Excel gives you two examples:
Export SQL table to Excel using the Sql to Excel utility. Perhaps the simplest way to export a SQL table to Excel is to use the Sql to Excel utility, which actually creates a CSV file that can be opened with Excel. It doesn't require installation; everything you need to do is connect to your server and select the database and tables you want to export.
Export SQL table to Excel using SSMS.
I didn't find a way to automatically export the Azure SQL Database table data to Excel sheets. Azure SQL Database doesn't support SQL Server Agent.
Hope this helps.
For an experiment I need data from football matches (the results and statistics of both teams before the match, the result of the match, the number of spectators, the referee, etc.). On www.flaschscores.com it is very well summarized. Is it possible to import data from that website into an Azure ML experiment?
AzureML only supports the following data formats:
Plain text (.txt)
Comma-separated values (CSV) with a header (.csv) or without (.nh.csv)
Tab-separated values (TSV) with a header (.tsv) or without (.nh.tsv)
Excel file
Azure table
Hive table
SQL database table
OData values
SVMLight data (.svmlight) (see the SVMLight definition for format information)
Attribute Relation File Format (ARFF) data (.arff) (see the ARFF definition for format information)
Zip file (.zip)
R object or workspace file (.RData)
Nevertheless, this gives you a lot of versatility: you can use web technologies to scrape any site and create a CSV, or store the data in an Azure SQL database, and then connect that data source to Azure ML.
By itself, Azure ML is not designed to scrape websites.
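As a rough illustration of the scrape-then-CSV idea, a minimal sketch assuming pandas with lxml installed and a page that serves its results as a plain HTML table; the URL and table index are placeholders, and many scoreboard sites (including the one mentioned) render their data with JavaScript, in which case you would need a browser-automation tool instead:

# Minimal sketch: scrape an HTML results table into a CSV that Azure ML can read.
# Assumes: pip install pandas lxml; the URL and table position are placeholders
# and the target page must serve a static HTML <table>.
import pandas as pd

tables = pd.read_html("https://example.com/football/results")
matches = tables[0]  # pick the table that holds the match data
matches.to_csv("matches.csv", index=False)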
I have a SharePoint list which I have linked to in MS Access.
The information in this table needs to be compared to information in our data warehouse, based on keys that both sets of data share.
I want to be able to create a query which will upload the iShare data into our data warehouse under my login, run the comparison, and then export the details to Excel somewhere. MS Access seems to be the way to go here.
I have managed to link the iShare list (with difficulties due to the attachment fields) and then create a local table based on it.
I have managed to create the temp table in my volatile space.
How do I append the newly created table that I created from the list into my temporary space?
I am using Access 2010 and SharePoint 2007.
Thank you for your time
If you can avoid using Access, I'd recommend it, since it is an extra step for what you are trying to do. You can easily manipulate or combine data within the Teradata session and export the results.
You can run the following types of queries using the standard Teradata SQL Assistant:
-- Volatile table: visible only to the current session, dropped automatically at logoff
CREATE VOLATILE TABLE NewTable (
    column1 DEC(18,0),
    column2 DEC(18,0)
)
PRIMARY INDEX (column1)
ON COMMIT PRESERVE ROWS;
Change SQL Assistant to import mode (File -> Import Data) and run:
INSERT INTO NewTable VALUES (?,?);
Browse for your file; this example expects a comma-delimited file with two numeric columns, with column one being the index.
You can now query this table or join it to any information in the database you are connected to.
When you are finished you can drop with:
DROP TABLE NewTable
You can export results using File->Export Data as well.
If this is something you plan on running frequently, there are many ways to easily do these types of imports and exports. The Python module pandas has simple functionality for reading a query directly into a DataFrame and dropping it into Excel, through the pandas.read_sql() function (pandas.io.sql.read_frame() in older versions) and DataFrame.to_excel().
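A minimal sketch of that pandas route, assuming the teradatasql driver plus pandas and openpyxl are installed; the host, credentials, query, and output path are placeholders:

# Minimal sketch: pull a Teradata query into a DataFrame and export it to Excel.
# Assumes: pip install teradatasql pandas openpyxl; connection details are placeholders.
import pandas as pd
import teradatasql

with teradatasql.connect(host="tdhost", user="myuser", password="mypassword") as con:
    df = pd.read_sql("SELECT * FROM NewTable", con)

df.to_excel("comparison.xlsx", index=False)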