How to copy the latest file from SharePoint to Blob Storage using a Logic App?

I am trying to extract the latest Excel file from SharePoint into Azure Blob Storage using a Logic App.
I created the flow and it's working. However, it copies all the files from SharePoint to Blob, not just the latest one.
Below is my flow (screenshot omitted).
I get a new Excel file every day in my SharePoint folder (/Shared documents/Data), hence I used the List folder action to locate it.
Then I used a Filter array action to keep files whose last-modified time is within the last 5 minutes.
I don't get any error; however, it copies all the files rather than only the last-modified file.
Can anyone advise how to address this?

You can use the trigger 'When a file is created in a folder', which fires per file instead of listing the whole folder. Documentation link: https://learn.microsoft.com/en-us/connectors/sharepoint/#when-a-file-is-created-in-a-folder
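If you keep the List folder + Filter array approach instead, the underlying issue is that a time-window filter returns every file modified in that window, not the single newest file. In C# terms, the selection you actually want looks like this (a minimal sketch with hypothetical types, not Logic Apps expression syntax):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class SharePointFile
{
    public string Name { get; set; }
    public DateTime LastModified { get; set; }
}

class Program
{
    static void Main()
    {
        // Stand-in for the "List folder" output.
        var files = new List<SharePointFile>
        {
            new SharePointFile { Name = "report-2023-01-01.xlsx", LastModified = new DateTime(2023, 1, 1) },
            new SharePointFile { Name = "report-2023-01-02.xlsx", LastModified = new DateTime(2023, 1, 2) },
        };

        // Sort by last-modified descending and take exactly one file,
        // instead of keeping everything inside a time window.
        var newest = files.OrderByDescending(f => f.LastModified).First();
        Console.WriteLine(newest.Name); // report-2023-01-02.xlsx
    }
}
```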

Related

SharePoint document library not showing actual user's name in 'Modified By' on Custom List's meta-data

I have created a utility for browsing and uploading files from a user's machine (Windows 10) to a SharePoint site's document library. The utility consists of (1) a canvas form created in Power Apps, (2) a workflow created in Power Automate, and (3) the destination document library on the SharePoint site.
To briefly summarize how it works: when a user uploads a file with this form, a workflow is triggered to store the file as an entry in a SharePoint list, and it is then sent to designated reviewers for review and approval. On approval, the file is moved from the SharePoint list to another site's document library.
Files are successfully being moved to the destination document library.
Here is the problem I'm facing with the moved files: the document library has a 'Modified By' metadata column associated with the files. This field does not display the name of the user who actually uploaded the file or replaced an existing file with a modified version. It always shows my name in the 'Modified By' field.
Is there any way to fix this and show the actual name of the user who uploads a new file or replaces an existing one?
Any help is really appreciated.
Thank you.
The issue you are having is by design.
The kind of trigger you are using in the SharePoint connector always operates in the context of the owner of the flow (you).
Please check the following for a detailed description and possible workarounds: https://sharepoint.stackexchange.com/questions/269396/microsoft-flow-always-run-in-context-of-user-who-published-it
You can try using 'Update file properties'. I have not tested whether this works, but it seems a practical solution to try.
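If you pursue the property-update route outside the flow, the SharePoint CSOM can set the 'Editor' (Modified By) field explicitly. Below is a minimal, untested sketch, assuming the Microsoft.SharePointOnline.CSOM package, that legacy SharePoint Online credentials are still accepted on your tenant, and that all URLs and accounts are placeholders:

```csharp
using System.Security;
using Microsoft.SharePoint.Client;

class Program
{
    static void Main()
    {
        var password = new SecureString();
        foreach (char c in "placeholder-password") password.AppendChar(c); // placeholder only

        using (var ctx = new ClientContext("https://contoso.sharepoint.com/sites/docs"))
        {
            ctx.Credentials = new SharePointOnlineCredentials("owner@contoso.com", password);

            // Resolve the user who really uploaded the file.
            var uploader = ctx.Web.EnsureUser("i:0#.f|membership|uploader@contoso.com");
            ctx.Load(uploader);
            ctx.ExecuteQuery();

            var file = ctx.Web.GetFileByServerRelativeUrl("/sites/docs/Shared Documents/report.xlsx");
            var item = file.ListItemAllFields;

            // "Editor" is the internal name of the 'Modified By' column.
            item["Editor"] = new FieldUserValue { LookupId = uploader.Id };
            item.Update();
            ctx.ExecuteQuery();
        }
    }
}
```

Note that a plain Update() may also bump the Modified timestamp; test against a throwaway library first.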

Azure Data Factory Excel read via HTTP fails

I am looking to import data from a publicly available Excel sheet into ADF. I have set up the dataset using an HTTP linked service (see first screenshot), with AutoResolveIntegrationRuntime. However, when I attempt to preview the data, I get an error suggesting that the source file is not in the correct format (second screenshot).
I'm wondering if I may have something set incorrectly in my configuration?
The .xls format is not supported when using HTTP.
Since the API downloads the file, you can't preview the data. You can load the file to Blob or Azure Data Lake Storage using a Copy activity, and then put a dataset on top of that file to preview it.
The workaround is to save your .xlsx file as a .csv file, because Azure Data Factory does not support reading .xlsx files over the HTTP connector.
Furthermore, there is no need to convert the .xlsx file to .csv if you only want to copy it; simply select the Binary Copy option.
There is a similar QnA discussion where a Microsoft FTE confirmed with the product team that this is not supported yet for the HTTP connector.
Please submit a proposal in that QnA thread to allow this functionality in future versions; it is actively monitored by the Data Factory product team and will be evaluated for adoption.
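If you ever need to do that staging step outside ADF (for example from a small console app), the "copy first, then read" idea looks roughly like this. A sketch assuming the Azure.Storage.Blobs package, with a hypothetical source URL, container name, and connection string:

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class Program
{
    static async Task Main()
    {
        // Hypothetical source and destination — replace with your own.
        const string sourceUrl = "https://example.com/data/report.xlsx";
        const string connectionString = "<storage-connection-string>";

        using var http = new HttpClient();
        using Stream excel = await http.GetStreamAsync(sourceUrl);

        // Stage the file as a blob; an Excel dataset in ADF can then
        // read it from storage instead of over the HTTP connector.
        var blob = new BlobClient(connectionString, "staging", "report.xlsx");
        await blob.UploadAsync(excel, overwrite: true);
    }
}
```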

Create SharePoint list using Azure Logic Apps

I want to create a list from an Excel sheet I am uploading to SharePoint, using Azure Logic Apps. Now I want to use this app to update a list on SharePoint using the same Excel file. The flow executes, but it puts absurd values in the list. Please help me understand how I can make this work.
This is because you are passing the raw file content, not the items; the Logic App won't do any data processing for you. You need to design your flow to parse the data and then create the items one by one.
Below is my test flow: I get the CSV file from SharePoint, then use the Plumsail Parse CSV action to get the items. If you are processing another kind of Excel file, you could use the Excel connector to get the rows.
Here is my test result (screenshot omitted).
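The shape of that processing, if you wrote it in code instead of using the Plumsail action, is: split the CSV into rows, map each row to fields, and create one list item per row. A rough sketch, with Console.WriteLine standing in for the hypothetical per-row "Create item" call:

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // In the Logic App this would be the file content from SharePoint.
        // (Real CSVs also need quoting and escaping handled.)
        string csv = "Title,Amount\nWidget,10\nGadget,25";

        var lines = csv.Split('\n');
        var headers = lines[0].Split(',');

        // One "Create item" per data row — this loop is what the flow was missing.
        foreach (var line in lines.Skip(1))
        {
            var values = line.Split(',');
            var fields = headers.Zip(values, (h, v) => $"{h}={v}");
            Console.WriteLine(string.Join(", ", fields)); // stand-in for CreateListItem(...)
        }
    }
}
```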

I am having a problem reading an Excel file from SharePoint using the OData SSIS component

As the title says, I am trying to read an Excel file stored in SharePoint Online using the OData component in SSIS.
I am trying to make an SSIS package that will update my SQL Server database with the data from that Excel file. I managed to do that when the file is stored locally.
I also managed to see my file by adding /_vti_bin/listdata.svc/ to the target URI; I get a SharePoint list where one of the entries is my Excel file, but I can't open it that way.
Also, I need to use credentials to access the SharePoint site.
I would be thankful for any help :D
An acceptable solution would be to download the file first, but I haven't managed to do that either.
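For that download step, one common approach is a C# script (for example in an SSIS Script Task) that authenticates against SharePoint Online and pulls the file to disk before the rest of the package runs. A minimal sketch, assuming the Microsoft.SharePointOnline.CSOM package and that legacy authentication is still enabled on the tenant; the URL, account, and paths are placeholders:

```csharp
using System.Net;
using System.Security;
using Microsoft.SharePoint.Client;

class Downloader
{
    static void Main()
    {
        var password = new SecureString();
        foreach (char c in "placeholder-password") password.AppendChar(c); // placeholder only

        // SharePointOnlineCredentials handles the SPO cookie-based sign-in
        // that a plain NetworkCredential cannot.
        var creds = new SharePointOnlineCredentials("user@contoso.com", password);

        using (var client = new WebClient())
        {
            client.Credentials = creds;
            client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            client.DownloadFile(
                "https://contoso.sharepoint.com/sites/data/Shared%20Documents/source.xlsx",
                @"C:\temp\source.xlsx");
        }
    }
}
```

The package can then read the local copy exactly as it did before.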

Upload Excel 2013 Workbook to website hosted on Azure

Does anyone have guidance and/or example code (which would be awesome) on how I would go about the following?
With a Web application using C# / ASP.NET MVC and hosted on Azure:
Allow a user to upload an Excel Workbook (multiple worksheets) via a web page UI
Populate a Dataset by reading in the worksheets so I can then process the data
A couple of things I'm unclear on:
I've read that Azure doesn't have the ACE OLEDB provider, which is what Excel 2007+ requires, and that I'd have to use the Open XML SDK. Is this true? Is this the only way?
Is it possible to read the file into memory and not actually save it to Azure storage?
I DO NOT need to modify the uploaded spreadsheet. Only read the data in and then throw the spreadsheet away.
Well, that's many questions in one post; let me see if we can tackle them one by one.
On questions 1 and 2: you can let the user upload the Excel workbook to a /temp location, and once you have read it you can do the cleanup; you can also write a script that cleans up any files that couldn't be deleted from /temp for whatever reason.
Alternatively, if you want to keep the files, you should store them in Azure Storage and fetch/read them when you need to.
Check out this thread: read excelsheet in azure uploaded as a blob.
By default, when you upload a file it is written to local disk, and you can later choose to save it to Azure Storage or elsewhere.
Reading the Excel file: you can use any of the NuGet packages listed at http://nugetmusthaves.com/Tag/Excel; I prefer GemBox and NPOI.
For the upload part of the MVC application, see http://www.aspdotnet-suresh.com/2014/12/how-to-upload-files-in-asp-net-mvc-razor.html
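To make the "read it in memory, never save it" part concrete, here is a minimal sketch of an MVC action using the NPOI package, assuming .xlsx uploads; it builds a DataSet straight from the request stream with a naive all-strings mapping:

```csharp
using System.Data;
using System.Web;
using System.Web.Mvc;
using NPOI.SS.UserModel;
using NPOI.XSSF.UserModel;

public class ImportController : Controller
{
    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase file)
    {
        // Read the workbook straight from the upload stream — nothing is saved.
        IWorkbook workbook = new XSSFWorkbook(file.InputStream); // .xlsx; HSSFWorkbook for .xls

        var ds = new DataSet();
        for (int s = 0; s < workbook.NumberOfSheets; s++)
        {
            ISheet sheet = workbook.GetSheetAt(s);
            var table = new DataTable(sheet.SheetName);

            // Naive mapping: every cell becomes a string.
            for (int r = 0; r <= sheet.LastRowNum; r++)
            {
                IRow row = sheet.GetRow(r);
                if (row == null) continue;

                while (table.Columns.Count < row.LastCellNum)
                    table.Columns.Add($"Col{table.Columns.Count}", typeof(string));

                var dataRow = table.NewRow();
                for (int c = 0; c < row.LastCellNum; c++)
                    dataRow[c] = row.GetCell(c)?.ToString() ?? string.Empty;
                table.Rows.Add(dataRow);
            }
            ds.Tables.Add(table);
        }

        // The spreadsheet itself is thrown away; only the DataSet survives.
        return Content($"Parsed {ds.Tables.Count} sheet(s).");
    }
}
```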
