I have a Power Automate flow that uses the Azure Blob Storage connector to read an Excel file from the blob with the Get blob content action.
The problem is that I need to process the Excel data and save it into a D365 F&O entity. For that I need the data in JSON format. I saw we can use the Cloudmersive connector to convert Excel to JSON.
How can I do this without using any 3rd-party connector?
You can read the file content and load it into a table. After that, you can use Compose actions or arrays to assemble the rows into a JSON object.
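Outside the designer, the same compose/array idea can be sketched in Python. The worksheet content and column names below are hypothetical, standing in for whatever Get blob content returns:

```python
import json

# Hypothetical worksheet content: first row is the header, rest are data rows.
rows = [
    ["AccountNum", "Name", "Balance"],
    ["A-001", "Contoso", "1200.50"],
    ["A-002", "Fabrikam", "845.00"],
]

header, data = rows[0], rows[1:]
# Zip each data row with the header to build one JSON object per record,
# mirroring what the Compose/array actions assemble in the flow.
records = [dict(zip(header, row)) for row in data]
payload = json.dumps(records, indent=2)
print(payload)
```

Each object in `records` then maps onto one record of the D365 F&O entity.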
I am trying to populate a SharePoint Online list. The source files are in CSV format and reside in Azure Blob storage. Is it possible to use Logic Apps to read the contents of the CSV line by line (not copy/move files to SharePoint) and insert a row into the SharePoint list? I could not find any actions that would allow me to parse the CSV files in Azure Logic Apps.
Side note: the CSV files are being generated by an Azure Data Factory pipeline. I am only writing them to CSV because ADF doesn't allow SharePoint Online as a sink. If there is a way to populate the data to SharePoint directly and avoid writing to a CSV, that would be magical!
You can use the Parse CSV action of Plumsail, a 3rd-party connector. Below is the flow of my logic app, in which I was able to insert each row from the CSV into the SharePoint list.
Below is the CSV file in my storage account:
RESULTS:
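For reference, the line-by-line parsing that the flow performs can be sketched outside Logic Apps. The CSV text and SharePoint column names here are hypothetical:

```python
import csv
import io

# Hypothetical blob content: the CSV text returned by the Get blob content action.
csv_text = "Title,Owner,Status\r\nTask 1,Alice,Open\r\nTask 2,Bob,Closed\r\n"

reader = csv.DictReader(io.StringIO(csv_text))
for row in reader:
    # In the logic app, each iteration would map to a SharePoint
    # "Create item" action with these column values.
    print(row["Title"], row["Owner"], row["Status"])
```

Each parsed row carries the values that a per-row "Create item" action would write to the list.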
I have a simple logic app that runs a daily SQL report, converts the data into a CSV, then attaches the CSV to an email (see image).
The problem is that the data is getting too big for the email server to allow. So if I can compress the CSV, or convert it to an XLS, it will be small enough for the email server to handle.
Can I get this done without writing to blob storage or any other storage system? I did find a 3rd party action by Encodian which I might be able to use, just can't figure out the details.
You can use Encodian's Convert Excel action.
To set up the connection, you need to add an API key, choosing the Encodian subscription that fits your needs and budget.
For more information, you can refer to Encodian Flowr for Azure Logic Apps.
Alternatively, you can use Plumsail's CSV to Excel action.
While adding the connector, you can generate an API key from Plumsail Account API Key.
REFERENCES:
Convert Excel and CSV Files
How to convert CSV files to Excel
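The compression route can also be illustrated outside the designer. This sketch zips an in-memory CSV the way a compression action would before the file is attached to the email; the file name and content are hypothetical:

```python
import io
import zipfile

# Hypothetical report content: repetitive CSV data, as SQL exports tend to be.
csv_bytes = b"col1,col2\r\n" + b"value1,value2\r\n" * 10000

# Zip the CSV entirely in memory -- no blob storage or temp files involved.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("report.csv", csv_bytes)
zipped = buf.getvalue()

print(len(csv_bytes), len(zipped))  # repetitive text compresses heavily
```

The `zipped` bytes are what would be handed to the email action as the attachment content.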
I have a query being executed on an Azure server periodically, and I need to add some code to it so it can save some data from tables/views to an Excel file during execution.
I have implemented code like this on other (non-Azure) databases, but executing the same code in Azure gives me messages like "Azure doesn't support" some of the tools I used.
What should I use to do this? I just need to save some tables' data to specific sheets in Excel.
Thanks in advance!
If the requirement is specific to Excel file creation, you can use a logic app to query the Azure SQL database and generate an Excel file based on the below link:
https://community.dynamics.com/ax/b/d365fortechies/posts/logic-app-for-azure-sql-db-to-azure-file-storage-workflow
Note: you can select Excel file generation for the logic app rather than the CSV mentioned in the above example, or generate a CSV file and then convert it into Excel.
Since OPENDATASOURCE is not supported in Azure SQL, you can also use other ETL tools to save data from tables/views to Excel.
For example, Azure Data Factory:
Using the Copy activity in Azure Data Factory, you can query a table, run your SQL query, or execute a stored procedure, and then convert the result to an Excel file. There are multiple destinations you can choose from to store this Excel file, in the cloud or on a local server.
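The "generate a CSV file first, then convert" route mentioned above can be sketched as follows, with a hypothetical result set standing in for the output of the SQL query:

```python
import csv
import io

# Hypothetical result set from the Azure SQL query (tuples of column values).
columns = ["OrderId", "Customer", "Total"]
rows = [(1, "Contoso", 99.90), (2, "Fabrikam", 45.00)]

# Write the header and rows into an in-memory CSV, ready for a
# downstream CSV-to-Excel conversion step.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(columns)
writer.writerows(rows)
csv_text = out.getvalue()
print(csv_text)
```

From there, a conversion action (or a library such as openpyxl) can turn the CSV into the final workbook.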
I am receiving an Excel file every day with one sheet (the sheet name will be different every time), stored in an Azure Blob container. Is there any possibility of converting the Excel file to CSV, either by using an SSIS Script Task or Azure Logic Apps?
Any help would be appreciated. Thank you.
There are many ways we can do that.
With a logic app, you could refer to the answer here:
Converting should be pretty easy. At a high level, you can do the following:
Use the Excel connector to read the content of the Excel file
Use the Data Operations - Create CSV table action to create CSV output populated with the dynamic data from step #1
Use the Azure Blob connector to create and save the new CSV file in blob storage
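The three steps above can be sketched outside the designer. The rows below are a hypothetical stand-in for the Excel connector's output, and `csv.DictWriter` plays the role of the Create CSV table action:

```python
import csv
import io

# Hypothetical output of the Excel connector: one dict per worksheet row.
excel_rows = [
    {"Name": "Widget", "Qty": "3"},
    {"Name": "Gadget", "Qty": "7"},
]

# Equivalent of the "Create CSV table" data operation.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=list(excel_rows[0].keys()))
writer.writeheader()
writer.writerows(excel_rows)
csv_content = out.getvalue()
print(csv_content)  # this string is what the blob connector would write
```

The resulting `csv_content` string is the body that the Create blob action would save to the container.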
Since the Excel file is stored in Blob Storage, I would suggest you use Data Factory; it supports Excel files directly:
Create the Source dataset:
Create the sink dataset and set the new CSV file name:
Copy activity overview:
It works well and is very easy and direct:
I am trying to list all filenames/URLs in my blob container and save them to a CSV file or a table in an Azure SQL database.
I was struggling in ADF with the Get Metadata activity:
But I can't get the child items into a table or CSV. Is there any advice?
I suggest you use a logic app to achieve your needs, because it is very simple; the specific design is shown in the figure:
As the URL is in this format:
https://myaccount.blob.core.windows.net/mycontainer/myblob
You need to define a variable as a prefix.
For the usage of List blobs, you can refer to this link. For how to connect to your Azure database, you can refer to this official document.
===========update==============
Regarding your question on how to create a CSV file, the answer is updated as follows:
I designed my logic app like this:
In these steps, you can learn how to create a CSV table from this official document.
I tested it for you and found no problems:
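The prefix-plus-blob-name step and the CSV table creation can be sketched together. The blob names are hypothetical, and the account/container names match the URL format shown above:

```python
import csv
import io

# List blobs returns just the names; the prefix variable supplies the rest of the URL.
prefix = "https://myaccount.blob.core.windows.net/mycontainer/"
blob_names = ["report1.csv", "report2.csv", "archive/old.csv"]

# Build a name/URL table, equivalent to the Create CSV table step in the flow.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["FileName", "Url"])
for name in blob_names:
    writer.writerow([name, prefix + name])  # concatenate prefix + blob name
csv_table = out.getvalue()
print(csv_table)
```

The same name/URL pairs could equally be inserted as rows into the Azure SQL table instead of written to a CSV.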
If you are trying to get the list in a C# program, you could use BlobContainerClient.GetBlobs().