I need to generate an Excel file in Azure Data Factory and save it to Blob Storage. I know that Excel is not available as a sink in Data Factory. I would like to create an Azure Function in PowerShell to achieve this, but I don't know how to go about it.
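For reference, a rough sketch of what such a function might look like is below. It is only an outline: it assumes an HTTP-triggered PowerShell function, that the ImportExcel and Az.Storage modules are declared in requirements.psd1, and that the storage settings and the "reports" container are placeholders to replace.

```powershell
# run.ps1 -- HTTP-triggered PowerShell Azure Function (illustrative sketch only).
# Assumes the ImportExcel and Az.Storage modules are declared in requirements.psd1
# and that STORAGE_ACCOUNT / STORAGE_KEY are placeholder application settings.
using namespace System.Net

param($Request, $TriggerMetadata)

# Rows to write are expected as a JSON array in the request body,
# e.g. posted from a Data Factory Web activity.
$rows = @($Request.Body) | ForEach-Object { [pscustomobject]$_ }

# Build the .xlsx file in the function's temp folder.
$localPath = Join-Path $env:TEMP 'output.xlsx'
$rows | Export-Excel -Path $localPath -WorksheetName 'Data' -ClearSheet

# Upload the workbook to Blob Storage (the 'reports' container is a placeholder).
$ctx = New-AzStorageContext -StorageAccountName $env:STORAGE_ACCOUNT `
                            -StorageAccountKey  $env:STORAGE_KEY
Set-AzStorageBlobContent -File $localPath -Container 'reports' `
                         -Blob 'output.xlsx' -Context $ctx -Force | Out-Null

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = 'Excel file written to blob storage.'
})
```

A Data Factory Web activity can then call this function with the rows to export, since Functions can be invoked from a pipeline.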
Related
Is it possible to convert CSV files into PDF using Azure Data Factory or any other Azure technology?
Converting CSV (or any other file type) to PDF is not supported in Azure Data Factory.
You can use Azure Logic Apps or Azure Functions with Microsoft Graph to convert a file to PDF.
Note: You can call an API or execute Azure Functions from an Azure Data Factory pipeline.
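As a rough illustration of the Microsoft Graph route (not part of the original answer): the driveItem content endpoint accepts a format=pdf query parameter for several source formats, including CSV, so a script or function that already has a Graph access token can download the converted file. The token variable and file paths below are placeholders.

```powershell
# Illustrative only: download a OneDrive/SharePoint file converted to PDF via Microsoft Graph.
# $accessToken and the drive path/file names are placeholders.
$accessToken = '<graph-access-token>'
$uri = 'https://graph.microsoft.com/v1.0/me/drive/root:/reports/data.csv:/content?format=pdf'

# Graph answers with a redirect to a pre-authenticated download URL;
# Invoke-WebRequest follows it and writes the PDF locally.
Invoke-WebRequest -Uri $uri -Headers @{ Authorization = "Bearer $accessToken" } `
                  -OutFile 'data.pdf'
```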
I receive an Excel file every day with one sheet (the sheet name is different every time), and it is stored in an Azure Blob container. Is there any way to convert the Excel file to CSV, either by using an SSIS Script Task or Azure Logic Apps?
Any help would be appreciated. Thank you.
There are many ways we can do that.
With Logic Apps, you could refer to the answer here:
Converting should be pretty easy. At a high level, you can do the following:
Use the Excel connector to read the content of the Excel file
Use Data Operations - Create CSV Table to create a CSV populated with the dynamic data from step #1
Use the Azure Blob connector to create and save the new CSV file in Blob Storage
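If you prefer a script over the Logic Apps designer, the same three steps can be sketched in PowerShell. This is an illustrative alternative, assuming the community ImportExcel module and Az.Storage; the account, container, and file names are made up.

```powershell
# Sketch: Excel file in a blob container -> CSV file in a blob container.
# Requires the ImportExcel and Az.Storage modules; all names below are placeholders.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' `
                            -StorageAccountKey  '<key>'

# 1. Read the content of the Excel file (download it locally first).
Get-AzStorageBlobContent -Container 'input' -Blob 'daily.xlsx' `
                         -Destination "$env:TEMP\daily.xlsx" -Context $ctx -Force | Out-Null
$rows = Import-Excel -Path "$env:TEMP\daily.xlsx"   # first sheet is used by default,
                                                    # so a changing sheet name is not a problem

# 2. Create the CSV from the imported rows.
$rows | Export-Csv -Path "$env:TEMP\daily.csv" -NoTypeInformation

# 3. Save the new CSV file back to blob storage.
Set-AzStorageBlobContent -File "$env:TEMP\daily.csv" -Container 'output' `
                         -Blob 'daily.csv' -Context $ctx -Force | Out-Null
```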
Since the Excel file is stored in Blob Storage, I would suggest you use Data Factory; it supports Excel files directly:
Create the source dataset for the Excel file.
Create the sink dataset and set the new CSV file name.
Review the Copy activity overview.
It works well and is very easy and direct.
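Once the datasets and the Copy activity are set up in the Data Factory UI, you can also trigger and monitor the pipeline from PowerShell if you want to script the daily run; the resource group, factory, and pipeline names below are placeholders.

```powershell
# Sketch: trigger and monitor the Excel-to-CSV copy pipeline with Az.DataFactory.
# Resource group, factory, and pipeline names are placeholders.
$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName 'rg-data' `
                                        -DataFactoryName  'adf-demo' `
                                        -PipelineName     'CopyExcelToCsv'

# Poll the run until it finishes.
do {
    Start-Sleep -Seconds 15
    $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName 'rg-data' `
                                          -DataFactoryName  'adf-demo' `
                                          -PipelineRunId    $runId
} while ($run.Status -in 'Queued', 'InProgress')

"Pipeline finished with status: $($run.Status)"
```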
Have you ever converted an Excel file in Azure Data Lake to a CSV file?
First, I tried using SSIS with the Azure Data Lake source, but mapping is not possible; the only choice is to add text.
Second, I tried using Azure Logic Apps with Create CSV Table, but the CSV that comes out only contains the structure of that folder.
Thank you in advance
There is no built-in way to extract data from an Excel file in Azure Data Lake. I would suggest you try one of the approaches below:
Write a custom .NET library for converting Excel to CSV and deploy it to Azure Data Lake Analytics. See the Azure Data Lake Analytics Programming Guide.
Write a custom .NET activity in Azure Data Factory to do this. See Custom Activities in Azure Data Factory.
Use Azure Functions and Open XML to do this, as detailed in the Stack Overflow post (a sketch of this idea follows this list).
Use an SSIS package to do the conversion. You can have SSIS runtimes in Azure Data Factory. See SSIS packages running in Azure Data Factory.
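As a very rough sketch of the Azure Functions idea, here is the conversion written in PowerShell against Azure Data Lake Storage Gen1, using the community ImportExcel module in place of Open XML; the account and path names are made up for illustration.

```powershell
# Sketch: convert an .xlsx stored in Azure Data Lake Storage Gen1 to CSV.
# Uses Az.DataLakeStore plus the ImportExcel module; names are placeholders.
$account = 'mydatalake'

# Download the Excel file from the lake to a temp location.
Export-AzDataLakeStoreItem -Account $account -Path '/raw/report.xlsx' `
                           -Destination "$env:TEMP\report.xlsx" -Force

# Convert it to CSV and push the result back into the lake.
Import-Excel -Path "$env:TEMP\report.xlsx" |
    Export-Csv -Path "$env:TEMP\report.csv" -NoTypeInformation

Import-AzDataLakeStoreItem -Account $account -Path "$env:TEMP\report.csv" `
                           -Destination '/curated/report.csv' -Force
```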
As far as I know about Azure, there isn't any way to convert the Excel file to CSV directly.
You could follow these steps:
Download the Excel file to your computer.
Import the Excel file into your SQL database.
Then export the table data as a CSV file to your Blob Storage.
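If you would rather script those three steps than use the import/export wizards (a PowerShell variant of the manual process described above, not the original answer's method), a rough outline could look like this; it assumes the ImportExcel, SqlServer, and Az.Storage modules, and every server, database, table, and container name is a placeholder.

```powershell
# Sketch: Excel file -> Azure SQL Database table -> CSV file -> Blob Storage.
# Assumes the ImportExcel, SqlServer and Az.Storage modules; all names are placeholders.
$server   = 'myserver.database.windows.net'
$database = 'mydb'
$cred     = Get-Credential   # SQL login for the Azure SQL database

# 1. Import the Excel rows into a SQL table (-Force creates the table if needed).
$rows = Import-Excel -Path 'C:\data\sales.xlsx'
Write-SqlTableData -ServerInstance $server -DatabaseName $database -Credential $cred `
                   -SchemaName 'dbo' -TableName 'Sales' -InputData $rows -Force

# 2. Export the table data to a CSV file.
Invoke-Sqlcmd -ServerInstance $server -Database $database -Credential $cred `
              -Query 'SELECT * FROM dbo.Sales' |
    Export-Csv -Path 'C:\data\sales.csv' -NoTypeInformation

# 3. Upload the CSV to Blob Storage.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey '<key>'
Set-AzStorageBlobContent -File 'C:\data\sales.csv' -Container 'exports' `
                         -Blob 'sales.csv' -Context $ctx -Force | Out-Null
```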
You could reference this document:
Import data from Excel to SQL Server or Azure SQL Database
Connect to Azure Blob Storage (SQL Server Import and Export Wizard)
Hope this helps.
I am new to Azure and hence trying to understand what services to use when and how.
At the moment, I have one Excel file with a couple of tabs that require some transformation to create one new tab inside the source file itself (say tab "x"). The final tab "x" is then used to create one final Excel file that is shared with various teams.
At present, everything is done manually.
This needs to change, and producing the Excel file shared with the team has to be automated. The source is the Excel file with the various tabs (excluding tab "x"), the reporting tool will be SSRS, and the Excel data will be stored in the cloud.
Keeping this scenario in mind, what is the best way to store the Excel data in the cloud? The data will be loaded into the cloud on a monthly basis. I am confused as to whether to store the data in Azure SQL, Azure Data Lake Storage Gen2, Azure Data Lake Analytics, or SQL Server on an Azure VM.
Every month the data can be fetched from the Excel file and loaded into Azure using Azure Data Factory. But I am not sure what the best way to store the data in the cloud is, considering that some ETL process is needed to generate data in a format similar to tab "x".
I think you can consider using Azure SQL Database.
Azure SQL Database and SQL Server support importing data from Excel (or CSV) files. For more details and limits, please see: Import data from Excel to SQL Server or Azure SQL Database.
Once your data is stored in Azure SQL Database, you can also use Excel to get the data from Azure SQL Database:
Connect Excel to a single database in Azure SQL Database and import data and create tables and charts based on values in the database. In this tutorial you will set up the connection between Excel and a database table, save the file that stores data and the connection information for Excel, and then create a pivot chart from the database values.
Reference: Import data from Excel to SQL Server or Azure SQL Database.
I don't think you need to store these Excel files in Azure Data Lake. Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob Storage. It's still just storage.
The more Azure resources you use, the more you have to pay.
If your Excel files are stored on your local computer, you can use Azure Data Factory with a self-hosted integration runtime to access these local files.
Please reference: Copy data to or from a file system by using Azure Data Factory.
Hope this helps.
Your storage requirements are very minimal, so I would select Data Lake to store your documents. The alternative is Blob Storage, but I always prefer Data Lake because it works with Azure Active Directory.
In your scenario, drop it in the ADL, and use the ADL as the source in Azure Data Factory.
Edit:
Honestly, your original post is a little confusing. You have a raw Excel document, you do some transformations on it to generate a source Excel document, and this source document holds the final dataset that the dev team will use to build out SSRS reports. You need to make this dataset available to the teams so that they can connect to it to build the reports? My suggestion is to keep it simple: drop the final source dataset, in Excel format, into Blob or Data Lake storage and ask the dev team to pick it up from that location. If you go the route of designing and maintaining a data pipeline (Blob > Data Factory > SQL, or CSV/TSV), then you are introducing unnecessary complications.
I want to create an Azure Function that connects to Logic Apps that will be used as an Add-in for Excel Online. I want this Azure Function to read the Excel online file as a blob.
How can I do this?
Per your description, I assume you want to use a Logic App to read the Excel file and then use a Function to store the file in Blob Storage.
You could do it with the Logic App alone. First use the SharePoint connector to get the Excel file content, then use the Azure Blob connector's Create blob action with the file name and file content.
And here is the Excel blob file in Azure Blob Storage.