Can a Logic App refresh an Excel file in Azure Blob Storage?

I have a data source from an external system (KoBoToolbox). It can export the data as an Excel file (via a web connection) when you request a specific web link. After getting all the data, I put these Excel files in Azure Blob Storage.
Using a Logic App, I've got the file into the blob container, but I have no idea how to refresh this Excel file so it pulls the external data again. There are two Excel Online connectors (OneDrive & Business), and I don't know what the next steps are.
My question: is it possible to refresh this Excel file to get the latest data from the external data source using a Logic App?
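Note that an Excel file sitting in blob storage is a static snapshot; nothing in blob storage (or the blob connector) can trigger the workbook's own web connection. A common workaround is to re-download the export from the source on a schedule and overwrite the blob. Here is a minimal sketch of that pattern in Python; the KoBoToolbox export URL, connection string, container, and blob names are all placeholders, not real values:

```python
# Sketch: "refresh" an Excel export in blob storage by re-downloading it.
# The export URL and connection string below are placeholders.
import requests
from azure.storage.blob import BlobServiceClient

EXPORT_URL = "https://kf.kobotoolbox.org/<your-export-link>.xlsx"  # placeholder
CONN_STR = "<storage-connection-string>"                           # placeholder

def refresh_blob_excel():
    # Pull the latest export from the external source.
    resp = requests.get(EXPORT_URL, timeout=60)
    resp.raise_for_status()

    # Overwrite the existing blob so downstream readers see fresh data.
    service = BlobServiceClient.from_connection_string(CONN_STR)
    blob = service.get_blob_client(container="exports", blob="survey.xlsx")
    blob.upload_blob(resp.content, overwrite=True)

if __name__ == "__main__":
    refresh_blob_excel()
```

Run something like this on a schedule (a timer-triggered Function, or the equivalent Logic App built from a Recurrence trigger, an HTTP action, and a Create blob action) and the blob always holds the latest export, with no Excel-side refresh needed.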

Related

Azure Blob storage to SharePoint Online using Azure Logic Apps

I am trying to populate a SharePoint Online list. The source files are in CSV format and reside in Azure Blob storage. Is it possible to use Logic Apps to read the contents of the CSV line by line (not copy/move files to SharePoint) and insert a row into the SharePoint list for each one? I could not find any actions that would let me parse CSV files in Azure Logic Apps.
Side note: the CSV files are being generated by an Azure Data Factory pipeline. I am only writing them to CSV because ADF doesn't allow SharePoint Online as a sink. If there is a way to populate the data directly to SharePoint and avoid writing a CSV at all, that would be magical!
You can use the Parse CSV action from Plumsail, a third-party connector. Below is the flow of my Logic App, which inserts each row from the CSV into the SharePoint list.
(Screenshots of the flow, the source CSV file in storage, and the resulting list items omitted.)
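If you'd rather avoid the third-party connector, the same flow can be scripted. A rough sketch, assuming the list's internal column names match the CSV headers and that you already have a valid SharePoint access token (the connection string, site URL, list title, and blob names below are placeholders):

```python
# Sketch: read a CSV from blob storage and insert each row into a
# SharePoint list via the SharePoint REST API. Token acquisition,
# site URL, list title, and blob names are all placeholders.
import csv
import io
import requests
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-connection-string>"                 # placeholder
SITE_URL = "https://tenant.sharepoint.com/sites/site1"   # placeholder
ACCESS_TOKEN = "<bearer-token>"                          # placeholder

def copy_csv_rows_to_list():
    # Download the CSV from blob storage.
    service = BlobServiceClient.from_connection_string(CONN_STR)
    blob = service.get_blob_client(container="csv-files", blob="data.csv")
    text = blob.download_blob().readall().decode("utf-8")

    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json;odata=nometadata",
        "Content-Type": "application/json;odata=nometadata",
    }
    items_url = f"{SITE_URL}/_api/web/lists/getbytitle('MyList')/items"

    # One list item per CSV row; the CSV header names must match the
    # list's internal column names for this to work as-is.
    for row in csv.DictReader(io.StringIO(text)):
        resp = requests.post(items_url, headers=headers, json=row)
        resp.raise_for_status()
```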

Azure Database -> Excel Query

I have a query that executes periodically on an Azure SQL server, and I need to add some code to it so that it saves data from tables/views to an Excel file during execution.
I have implemented code like this on other (non-Azure) databases, but executing the same code on Azure gives me messages saying Azure doesn't support some of the tools I used.
What should I use to do this? I just need to save some tables' data to specific sheets in an Excel file.
Thanks in advance!
If the requirement is specifically Excel file creation, you can use a Logic App to query the Azure SQL database and generate the Excel file, following the walkthrough below:
https://community.dynamics.com/ax/b/d365fortechies/posts/logic-app-for-azure-sql-db-to-azure-file-storage-workflow
Note: you can have the Logic App generate an Excel file rather than the CSV shown in the example, or generate a CSV file and then convert it to Excel.
Since OPENDATASOURCE is not supported in Azure SQL, you can also use other ETL tools to save data from tables/views to Excel, such as Azure Data Factory:
Using the Copy activity in Azure Data Factory, you can query a table, run your own SQL query, or execute a stored procedure, then write the result to an Excel file. There are multiple destinations you can choose for that Excel file, in the cloud or on a local server.
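If a standalone script or scheduled job is acceptable instead, the same export takes only a few lines. A minimal sketch, assuming pyodbc connectivity to the Azure SQL database and pandas for writing the workbook (server, credentials, and table/sheet names are placeholders):

```python
# Sketch: export Azure SQL tables/views to specific sheets of one
# Excel workbook. Connection details and object names are placeholders.
import pandas as pd
import pyodbc

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

# Map each source table/view to the sheet it should land on.
EXPORTS = {
    "dbo.SalesView": "Sales",
    "dbo.Customers": "Customers",
}

def export_to_excel(path: str = "report.xlsx") -> None:
    with pyodbc.connect(CONN_STR) as conn:
        # One ExcelWriter holds all sheets in a single workbook.
        with pd.ExcelWriter(path) as writer:
            for source, sheet in EXPORTS.items():
                df = pd.read_sql(f"SELECT * FROM {source}", conn)
                df.to_excel(writer, sheet_name=sheet, index=False)

if __name__ == "__main__":
    export_to_excel()
```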

Power BI - Using an Azure blob as a dataflow data source

Hopefully someone can help. I need to build a sort of "bring your own data model" feature so clients can upload their own files, use them in a Power BI dataflow, link them with other sources (databases, services, etc.), and create reports from the result. The only problem is that I'm stuck creating a dataflow from an Azure blob file.
The files are in CSV format
I have the Premium feature
I first tested and created an M query in Power BI Desktop, and it works; I can access the data
What I do is copy the M query into a blank query in the Power BI dataflow data source. The result is that I can preview the data, but when I try to save it, I get this error: (error screenshot omitted)
I tried the steps on this link but no luck.
Any ideas?

Read an Excel Online document in an Azure Function

I want to create an Azure Function that connects to Logic Apps and will be used as an add-in for Excel Online. I want this Azure Function to read the Excel Online file as a blob.
How can I do this?
From your description, I assume you want to use a Logic App to read the Excel file and then use a Function to store the Excel file in the blob.
You could do it with just a Logic App: first use the SharePoint connector to get the Excel file content, then use the Azure Blob connector's Create blob action with the file name and file content.
(Screenshot of the resulting Excel blob in Azure Blob storage omitted.)
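If you do want the Function itself to read the workbook once it lands in blob storage, a blob-triggered Python Function is one option. A sketch, assuming an .xlsx file and a function.json blob-trigger binding named "inputblob" (the container path and names are assumptions, not from the question):

```python
# Sketch: blob-triggered Azure Function (Python) that opens an Excel
# workbook from the triggering blob. Assumes function.json binds a
# blob trigger named "inputblob" to a container such as "excel-files".
import io
import logging

import azure.functions as func
from openpyxl import load_workbook

def main(inputblob: func.InputStream) -> None:
    logging.info("Processing blob: %s (%s bytes)", inputblob.name, inputblob.length)

    # Load the workbook directly from the blob stream.
    workbook = load_workbook(io.BytesIO(inputblob.read()), read_only=True)
    sheet = workbook.active

    # Example: log each row of the first sheet.
    for row in sheet.iter_rows(values_only=True):
        logging.info("row: %s", row)
```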

Azure Data Factory and SharePoint

I have some Excel files stored in SharePoint Online. I want to copy the files from the SharePoint folders to Azure Blob storage.
To achieve this, I am creating a new pipeline in Azure Data Factory using the Azure portal. What are the possible ways to copy files from SharePoint to Azure Blob storage using Azure Data Factory pipelines?
I have looked at all the linked service types in Azure Data Factory but couldn't find any suitable type for connecting to SharePoint.
Rather than accessing the file in SharePoint directly from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file
Use an Azure Function
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a web activity.
You can now call an Azure Function directly.
We can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate, provide a username and password (or Azure Key Vault references).
Note: this requires a self-hosted integration runtime.
You can use a Logic App to fetch the data from SharePoint and load it into Azure Blob Storage, then use Azure Data Factory to read the data from the blob. You can even set an event trigger so that whenever a file lands in the blob container, the pipeline runs automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an automated cloud flow that triggers whenever a new file is dropped in a SharePoint folder
Use whichever trigger fits your requirement and fill in the SharePoint details
Add an action to create a blob and fill in the details for your use case
This way you copy all the SharePoint files to the blob without using ADF at all.
My previous answer was true at the time, but in the years since, Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy files from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then passing it to a subsequent Copy activity that copies the data with the HTTP connector as its source.
I ran into some issues with large files and Logic Apps: it turned out there were some extremely large files in the SharePoint library being copied. SharePoint has a default buffer-size limit of 100 MB, and the Get file content action doesn't natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially, and it takes longer than that, you might get a timeout error.
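To make the token-then-copy pattern concrete, here is a rough script equivalent of the Web activity + Copy activity sequence. It assumes an Azure AD app registration granted access to the site; the tenant/client IDs, secret, site, and file paths are placeholders. The ACS token endpoint shown is the one Microsoft's SharePoint-to-ADF guidance uses for app-only access:

```python
# Sketch: authenticate to SharePoint Online with an app-only token,
# then download a file the same way the ADF Web + Copy activities do.
# All IDs, secrets, and paths below are placeholders.
import requests

TENANT_ID = "<tenant-guid>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
TENANT_NAME = "mytenant"  # as in mytenant.sharepoint.com

def get_spo_token() -> str:
    # ACS token endpoint used for SharePoint app-only authentication.
    url = f"https://accounts.accesscontrol.windows.net/{TENANT_ID}/tokens/OAuth/2"
    resp = requests.post(url, data={
        "grant_type": "client_credentials",
        "client_id": f"{CLIENT_ID}@{TENANT_ID}",
        "client_secret": CLIENT_SECRET,
        "resource": f"00000003-0000-0ff1-ce00-000000000000/"
                    f"{TENANT_NAME}.sharepoint.com@{TENANT_ID}",
    })
    resp.raise_for_status()
    return resp.json()["access_token"]

def download_file(server_relative_path: str, out_path: str) -> None:
    # Same URL shape the Copy activity's HTTP source uses.
    url = (f"https://{TENANT_NAME}.sharepoint.com/sites/site1/_api/web/"
           f"GetFileByServerRelativeUrl('{server_relative_path}')/$value")
    headers = {"Authorization": f"Bearer {get_spo_token()}"}
    with requests.get(url, headers=headers, stream=True) as resp:
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)

if __name__ == "__main__":
    download_file("/sites/site1/libraryname/folder1/myfile.CSV", "myfile.CSV")
```

Fetching a fresh token per file, as above, also sidesteps the one-hour token expiry mentioned in the previous answer.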
