Logic App - Copy from SharePoint to Azure Blob Storage

I am currently testing my newly created workflow. The goal is to copy files from SharePoint to Blob Storage.
It needs to check at certain intervals whether something has been added or updated in SharePoint and then copy it.
For this I use 'When a file is created or modified (properties only)', followed by a couple of 'Initialize variable' actions to adjust the path etc., and after that 'Get file content'. At the end, 'Upload blob to storage container'.
It works fine; however, it only copies files that were added or updated in SharePoint after the workflow was created.
My question is whether it is also possible to copy the files that were already in SharePoint before the workflow was created.
Thanks : ))

After reproducing this on my end, I was able to get it done by listing the files in the required folder and creating the blob with 'Get file content using path'. Below is the flow of my logic app.
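In code view, a trimmed-down sketch of that flow looks like the fragment below. This is only an illustration: the site URL, folder id, container name and connection names are placeholders, the connector paths are indicative of what the designer generates (they can differ between connector versions), and the // lines are annotations, not valid JSON.

"actions": {
  "List_folder": {
    "type": "ApiConnection",
    "runAfter": {},
    "inputs": {
      "host": { "connection": { "name": "@parameters('$connections')['sharepointonline']['connectionId']" } },
      "method": "get",
      // illustrative path for the SharePoint 'List folder' operation
      "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://contoso.sharepoint.com/sites/MySite'))}/folders/@{encodeURIComponent('<folder-id>')}"
    }
  },
  "For_each_file": {
    "type": "Foreach",
    "foreach": "@body('List_folder')",
    "runAfter": { "List_folder": [ "Succeeded" ] },
    "actions": {
      "Get_file_content_using_path": {
        "type": "ApiConnection",
        "runAfter": {},
        "inputs": {
          "host": { "connection": { "name": "@parameters('$connections')['sharepointonline']['connectionId']" } },
          "method": "get",
          "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://contoso.sharepoint.com/sites/MySite'))}/GetFileContentByPath",
          "queries": { "path": "@items('For_each_file')?['Path']", "queryParametersSingleEncoded": true }
        }
      },
      "Create_blob": {
        "type": "ApiConnection",
        "runAfter": { "Get_file_content_using_path": [ "Succeeded" ] },
        "inputs": {
          "host": { "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" } },
          "method": "post",
          "path": "/datasets/default/files",
          // in a real flow you may also want to skip items where IsFolder is true
          "queries": { "folderPath": "/mycontainer", "name": "@items('For_each_file')?['Name']", "queryParametersSingleEncoded": true },
          "body": "@body('Get_file_content_using_path')"
        }
      }
    }
  }
}

Because the loop runs over everything 'List folder' returns, the files that already existed before the workflow was created are copied as well.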
RESULTS:

Related

Move data from SharePoint through a Logic App

We are using a Logic App to move data from a SharePoint folder to Azure Blob Storage.
We were using the SharePoint trigger "When a file is created or modified in a folder". Unfortunately, this trigger has been deprecated and no longer works (i.e., when a file is indeed created or modified, no further action is taken after the trigger runs).
No file is moved around anymore. The trigger does not execute the Logic App even though a file is created or modified in the SharePoint origin folder. I have been through the various other SharePoint triggers, but they do not seem to fit our use case. We cannot create a Logic App for each file. We are not using SharePoint lists but classic folders. We could use several triggers pointing directly at each existing file, but since we have many files to move in the same folder, we would have to create many Logic Apps, and that is not how we want to do it. Moreover, new files may be created in the future.
What could we do to keep the same architecture of moving data from SharePoint to Blob Storage through non-deprecated Logic App triggers?
Thank you in advance,
Alexis
You can use 'When a file is created or modified (properties only)' to get the properties of the file that is created or updated. Then you can use 'Get file content' with the properties from the previous step. Finally, you can create a blob from the output of the previous steps. Below is the flow of my logic app.
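As an illustration, the corresponding fragments in code view look roughly like this; the recurrence, site URL, library id and connection names are placeholders, the trigger and action paths are only indicative of what the designer generates, and the // line is an annotation, not valid JSON. The 'Create blob' action then takes @body('Get_file_content') as its content, following the same pattern as the sketch further up.

"triggers": {
  "When_a_file_is_created_or_modified_(properties_only)": {
    "type": "ApiConnection",
    "recurrence": { "frequency": "Minute", "interval": 5 },
    "inputs": {
      "host": { "connection": { "name": "@parameters('$connections')['sharepointonline']['connectionId']" } },
      "method": "get",
      "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://contoso.sharepoint.com/sites/MySite'))}/tables/@{encodeURIComponent('<library-id>')}/onupdatedfileitems"
    }
  }
},
"actions": {
  "Get_file_content": {
    "type": "ApiConnection",
    "runAfter": {},
    "inputs": {
      "host": { "connection": { "name": "@parameters('$connections')['sharepointonline']['connectionId']" } },
      "method": "get",
      // the properties-only trigger exposes the file id as {Identifier} and the name as {FilenameWithExtension}
      "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://contoso.sharepoint.com/sites/MySite'))}/files/@{encodeURIComponent(triggerBody()?['{Identifier}'])}/content",
      "queries": { "inferContentType": true }
    }
  }
}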
RESULTS:

I want to transfer multiple files from multiple SFTP subfolders to a SharePoint Document Library using a Logic App

The files and folders on the SFTP server are structured like this:
folder/new/demo/test/sample.csv,
folder/new/demo/test/sample1.csv,
folder/new/demo/test1/sample3.csv,
folder/new/demo/test/practice/sample5.csv, ... and so on.
I want to copy the same structure that is on the SFTP server (folders inside subfolders, plus the files) to the SharePoint Document Library. I have created a workflow but got stuck at this point; I am attaching a screenshot of the workflow.
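As an illustration of the key idea only (not of the stuck workflow itself), the relative folder for each file can be derived from its SFTP path and reused on the SharePoint side. The sketch below assumes the SFTP trigger output exposes 'Path' and 'Name' properties and that '/Shared Documents' is the target library root; the // line is an annotation, not valid JSON.

"Initialize_target_folder": {
  "type": "InitializeVariable",
  "runAfter": {},
  "inputs": {
    "variables": [
      {
        "name": "TargetFolder",
        "type": "string",
        // e.g. 'folder/new/demo/test/sample.csv' becomes '/Shared Documents/demo/test/'
        "value": "@replace(replace(triggerBody()?['Path'], triggerBody()?['Name'], ''), 'folder/new', '/Shared Documents')"
      }
    ]
  }
}

The SharePoint 'Create file' action can then use this variable as its folder path (assuming the target folders exist or are created beforehand).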

How to trigger Azure Logic App when dropping file in Sharepoint folder

I created a Logic App that uses the SharePoint trigger "When a file is created or modified in a folder". It works perfectly when I upload a file in SharePoint Online (in a SharePoint browser tab), but it doesn't work when I drop a file into my synced Windows Explorer folder.
I read that someone faced the same problem: https://learn.microsoft.com/en-us/answers/questions/41215/logic-app-why-does-sharepoint-file-properties-trig.html. Here it says:
Move files and flow runs: When you move one or more files from one document library to another, the original file is moved from the source library to the destination library. Moving the file does not alter any custom metadata, including when the file was created and modified. Hence, this action does not trigger any flows for those file updates associated in the library where it was moved.
Syncing files to your OneDrive for Business and SharePoint document libraries: When users sync one or more files from one document library to another, the original file is moved (synced) from your client to the destination library. Syncing the file will not alter any custom metadata, including when the file was created and modified. Hence, this action will not trigger any flows for those file syncs in that library or in your OneDrive for Business.
The thing is that I NEED this Logic App to run just by dropping the file into a Windows Explorer folder (which is a SharePoint folder shared with a certain person). Do you know how I can achieve this?
It started working for me when I used the OneDrive 'When a file is created' connector, because we use OneDrive to sync the Windows Explorer folder. You need to specify the folder where the trigger should be invoked, and you must set 'Include subfolders' to true if you want the trigger to fire when a file is added to any of the subfolders.
Here are the screenshots of the logic app working:
When adding a file in the subfolders
When adding a file in the root folder
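For reference, the trigger fragment in code view ends up looking roughly like the following. The folder id is a placeholder, and the query names (includeSubfolders, inferContentType) are assumptions about what the designer generates when you toggle 'Include subfolder', so treat this purely as a sketch.

"When_a_file_is_created": {
  "type": "ApiConnection",
  "recurrence": { "frequency": "Minute", "interval": 3 },
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['onedriveforbusiness']['connectionId']" } },
    "method": "get",
    "path": "/datasets/default/triggers/onnewfile",
    "queries": {
      "folderId": "<id-of-the-watched-folder>",
      "includeSubfolders": true,
      "inferContentType": true
    }
  }
}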

Setting up a trigger to watch new folders in Azure Logic Apps

I am trying to create a logic app that will transfer files, as they are created, from my FTP server to my Azure file share. The folder my trigger is watching is organized by date (see below). Each day that a file is added, a new folder is created, so I need the trigger to check new subfolders, but I don't want to go into the app every day to change which folder the trigger looks at. Is this possible?
Here's what my folder (called DATA) structure looks like; each day that a file is added, a new folder is created:
-DATA-
2016-10-01
2016-10-02
2016-10-03
...
The FTP connector uses configurable polling, where you set how often it should look for a file. The trigger currently does not support dynamic folders. However, you could try the following (a code-view sketch of the first steps follows below):
Trigger your logic app by recurrence (same principle as the FTP trigger, in fact)
Action: Create a variable to store the date/time (in the format used in your folder naming)
Action: List files in folder (here you should be able to set the folder name dynamically using the variable you created)
For each file in the folder
Action: Get file content
Whatever you need to do with the file (calling a nested logic app is smart if you need to do multiple processing actions on each file and want to handle resubmits of the flow per file)
To avoid picking up every file on each run, you will need a way to exclude files that were processed in an earlier run: either rename each file after it is processed to an extension you can exclude on the next run, or move it to a "Processed\datetime" subfolder in the root.
This solution requires more actions and will therefore be more expensive. I haven't tried it out, but I think it should work; at least it's the approach I would try to set up.
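Here is a minimal sketch of the date-folder part in code view. The expression formatDateTime(utcNow(), 'yyyy-MM-dd') matches the folder naming above; the FTP 'List files in folder' path is only indicative of what the designer generates, and the // lines are annotations, not valid JSON.

"Initialize_folder_path": {
  "type": "InitializeVariable",
  "runAfter": {},
  "inputs": {
    "variables": [
      {
        "name": "FolderPath",
        "type": "string",
        // builds e.g. '/DATA/2016-10-03' for the current day
        "value": "@concat('/DATA/', formatDateTime(utcNow(), 'yyyy-MM-dd'))"
      }
    ]
  }
},
"List_files_in_folder": {
  "type": "ApiConnection",
  "runAfter": { "Initialize_folder_path": [ "Succeeded" ] },
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['ftp']['connectionId']" } },
    "method": "get",
    // illustrative path for the FTP 'List files in folder' operation, fed by the variable
    "path": "/datasets/default/folders/@{encodeURIComponent(variables('FolderPath'))}"
  }
}

The For each, 'Get file content' and the copy to the file share then follow as described in the list above.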
Unfortunately, what you're asking is not possible with the current FTP connector, and there isn't a really great solution right now. :(
As an aside, I've seen this pattern several times and, as you are seeing, it causes more problems than it solves (which realistically is zero). :)
If you own the FTP server, the best thing to do is put the files in one folder.
If you do not own the FTP server, politely mention to the owner that this pattern is causing problems and doesn't help you in any way, so please put the files in one folder. ;)

Azure Logic Apps FTP file content

I'm using Azure Logic Apps to copy files from FTP to Blob Storage. I'm using the action "FTP - When a file is added or modified", and after this I copy the file content to Blob Storage.
The problem is that the file is copied to Blob Storage before it has been completely uploaded to the FTP server, so I get partial files.
Is there any way to make Azure Logic Apps wait so that it only copies the file after it is complete?
This behavior happens when your FTP file system does not provide file locking. Similar behavior can occur when using the FTP adapter in BizTalk.
In BizTalk, the best way to handle this is to modify the client that creates the file on the FTP server so that it uses temporary filenames:
Client creates the file ftpFile.tmp
Client writes the file content
Client renames ftpFile.tmp to ftpFile.xml (or whatever extension is needed)
BizTalk only picks up files with the .xml extension
A major problem for me with the Logic Apps FTP connector is that you can't specify a file mask in the designer when using the connector as a trigger, which is strange, because I remember that option being available in the first version of Logic Apps.
You need to enable the 'Include File Content' option. You can do this in two ways:
1. Go to the Logic App Designer, expand your FTP trigger, and set 'Include File Content' to Yes.
2. Go to the Logic App Code View, find the trigger JSON, and add "includeFileContent": true inside "queries".
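For the second option, the trigger JSON ends up looking something like the fragment below. The folder id is a placeholder and the trigger path is only indicative of what the designer generated for 'When a file is added or modified'; the only line you add by hand is the includeFileContent query. The // line is an annotation, not valid JSON.

"When_a_file_is_added_or_modified": {
  "type": "ApiConnection",
  "recurrence": { "frequency": "Minute", "interval": 5 },
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['ftp']['connectionId']" } },
    "method": "get",
    "path": "/datasets/default/triggers/onupdatedfile",
    "queries": {
      "folderId": "<your-ftp-folder-id>",
      // this is the property you add manually in code view
      "includeFileContent": true
    }
  }
}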