Power Automate: Sweep through Subsites in SharePoint and copy new files

Good Day!
I am struggling with the best way to approach this via Power Automate.
Desired Flow: a scheduled flow that triggers every 10 minutes and cycles through a SharePoint list containing the subsites within my enterprise, along with the path to a specific folder in a specific library. Using an 'Apply to each' loop, it identifies any NEW files since the last sweep and copies those files to a different directory in SharePoint (the details of which are also included in that same SharePoint list).
I have all the pieces for this EXCEPT: how to identify the new files since the last sweep.
I can't use the "when a new file is created" trigger, as this listing has close to 300 subsites on it, which would mean 300 different flows.
Appreciate any guidance!
UPDATE:
Getting an error when using Get files. Here are snippets of the flow:
Snippet of the flow where the error is occurring
Snippet of the SharePoint list

I would use a Get files (properties only) action with a Filter Query that checks for a Created date greater than 10 minutes ago.
Below is an example.
Created ge '@{addMinutes(utcNow(), -10)}'
This is for the second part of your question.
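Inside the Apply to each you can feed the Get files (properties only) fields from the current list item and reuse that same filter. A minimal sketch, assuming the loop is named Apply to each and the list has hypothetical columns SiteUrl and FolderPath:
Site Address: @{items('Apply_to_each')?['SiteUrl']}
Folder (Limit Entries to Folder): @{items('Apply_to_each')?['FolderPath']}
Filter Query: Created ge '@{addMinutes(utcNow(), -10)}'
One design note: a fixed 10-minute window assumes every run completes on schedule, so a skipped run can miss files. Storing the timestamp of the last successful sweep (for example in another column of that same list) and filtering on that is a more robust variant.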

Related

Sending a recently created SharePoint file as an attachment with Power Automate

After some months I can say I am getting the hang of Microsoft Flow; however, I could use some help with the following issue:
In a flow for reporting purposes, a temporary file (.xlsx) is created in a SharePoint folder by means of a template. This temporary file is then filled with rows and info from other sources. So far so good.
I use the body of this newly created and furnished file as an attachment for an e-mail to the chief. However, the attachment comes out identical to the (empty) template file, without the rows and furnishing.
Adding a delay of two minutes before attaching and sending the mail solved it for relatively small reports, but this is not ideal, as I want it to work regardless of file size. Furthermore, I do not understand why it would send an empty (old) version of the temporary file in the first place, as all the furnishing operations should have executed before copying and attaching (the flow is entirely in series).
Sorry for the long story. Does anyone have a more elegant solution than using a Delay-node?
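One alternative sometimes used instead of a fixed Delay is a Do until loop that polls the file until it has actually grown beyond the empty template, and only then fetches the attachment content again with Get file content. A rough sketch of the loop condition, assuming an action named Get_file_metadata inside the loop and a hypothetical empty-template size of 8000 bytes:
@greater(outputs('Get_file_metadata')?['body/Size'], 8000)
A short Delay inside the loop keeps it from polling continuously, and the wait then scales with how long the file actually takes to be written rather than being a fixed two minutes. If the attachment body comes from the step that created the file, it reflects the content at creation time (the empty template), so fetching it with a fresh Get file content after the loop is the safer route.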

Power Automate - Infinite Loop issue

This sounds so simple in my head, but Power Automate doesn't like it.
I have a library with a lookup column. I have a Flow created which takes the filename of the document and puts this name into a "Title" column. Then I can use a lookup column on the Title column to find all the files in the library.
I've used "When a file is created or modified". Yet this flow runs constantly. No files are being updated or modified at 1am, yet it still runs over and over. I've had an automated email telling me to fix this before it is disabled.
All I want it to do is run the damn flow ONLY when a file is updated or uploaded, just as the trigger's name suggests.
It would seem I need to add trigger conditions, but all the guides I found were talking about checking whether a specific person has modified it.
This used to be so simple with workflows, it would only run if something was modified or uploaded.
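One pattern for this situation is a trigger condition (under the trigger's Settings) so the flow only fires when the Title still needs to be set; the run that writes the Title equal to the file name then no longer re-triggers itself. A sketch, assuming the Title column and the file name with extension are both present in the output of "When a file is created or modified (properties only)":
@not(equals(triggerOutputs()?['body/Title'], triggerOutputs()?['body/{FilenameWithExtension}']))
Trigger conditions use the bare @expression form (no surrounding braces), and when the condition evaluates to false the flow simply does not run.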

Extract Keywords from Office Documents with SharePoint Flow

I am trying to implement a document management system using Sharepoint. One major issue is that colleagues cannot find documents in the current setup (local fileserver). They have asked that we have a system that scans uploaded documents and automatically looks for keywords in them and then populates a "Meta" column.
I have had some success with OCR on image files, but so far I have had no success getting keywords out of Office documents (doc, xls, etc.).
Is there a way to setup a flow to do this task for me?
Any help is much appreciated.
I tried "Get file metadata" and Azure "Text analysis", but it seems to take the raw data of the files (XML, I assume) and returns that the document is too large to analyse.
There is something vague about this requirement - how is a keyword defined in a document?
Therefore, the first obvious solution would be to assign keywords to each file upon uploading it. You could create a process for this with Flow - tasks, reminders and so on.
Automating this with OCR means you need an OCR service that works with MS Flow, and there you have only one choice - ElasticOCR. Then, in your flow:
- feed the document content to the ElasticOCR action
- keep in mind that OCR is not 100% accurate
- analyze the generated text content according to your keyword definition
- finally write the meta back to the library in the corresponding columns.
Having worked on a similar requirement, we asked uploaders to publish their documents with a short abstract (a column from the content type). The assumption is that the abstract contains the keywords and is stored in a multi-line column, making it searchable site-wide.

Export SharePoint list to .csv and upload to Azure Data Lake using Flow

I am trying to use Microsoft Flow to export a SharePoint list to Azure Data Lake.
I want it so that anytime a particular online list is changed, its entire contents are loaded into a file in Data Lake. If the file already exists, I want to overwrite it. Can someone please explain how I can go about doing this, I have tried multiple ways, but they are not getting the job done.
Thanks
I was able to get the items in the SharePoint list to near perfection. I will post the Flow here in case anyone in the future needs it.
So what I did is that every 5 minutes I "create" a file in Azure Data Lake, which overwrites the file if it exists. The content of the file cannot be blank, so I added a newline as the content. Then I use Get items to retrieve all the items in the SharePoint list. From there, using an Apply to each loop, I append the content of the current row of the SharePoint list to the Data Lake file (separated by | and ending with a new line after all the content is added). This works to near perfection, with the only caveat being the newline at the beginning of the file, which I eliminate using Power Query.
This is exactly what I needed. If anybody sees a way to make this better, please post so that we can get this to perfection.
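For reference, the per-row content in the append step can be built with a single expression. A sketch, assuming hypothetical list columns Title and Status on the current item of the Apply to each:
concat(items('Apply_to_each')?['Title'], '|', items('Apply_to_each')?['Status'], decodeUriComponent('%0A'))
decodeUriComponent('%0A') is a common way to produce the trailing newline inside an expression without embedding a literal line break in the designer.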

How to export work items from one TFS server to another TFS server using Excel

I need to migrate Work Items from one TFS server to another TFS server. I tried migrating them using the TFSMigration tool available in CodePlex.
The problem I am encountering is that the schema for the work item on the source TFS is different from the schema of the work item type on the destination. I don't want the destination TFS server's schema for the work item to be modified. The change in schema is only one new column, but I still don't want to take that change.
One blog said this can be done using Excel, but not many details were available. I am not sure we can even use Excel to migrate the entire history of the work items.
Have a look at the TFS Integration Tools on VS gallery. This supports custom field mappings as part of a migration, documentation here (direct download).
I did this a while back and, not finding an appropriate tool, resorted to copying the title and description etc across manually, as we only had a few active work items at the time, so it only took about an hour.
However, if I need to do it again, I'll use the TFS API to read the fields of interest and write them to the new database. That way any schema differences don't matter, and the process is automated but under your control. Search for working with work items via the TFS API for details - it's really very easy.
Of course with both of these approaches (and all the migration tools AFAIK) you will only get a snapshot of the data - all history will be lost (or at best you can query using AsOf to get historical data, but all the entries you make will be timestamped at the moment you write them, not with the historical time that the event originally occurred.)
You can use the Excel editor to edit the source query's All Items via "Open Query in Microsoft Excel". Then open the destination query's All Items in Excel the same way. Copy and paste the contents from one Excel window to the other. Certain fields, like attachments, will not transfer.
