Copy SharePoint folder to local drive

I'm new to Power Automate and have been searching Microsoft forums and Google for a flow that copies all contents of a SharePoint root folder (Documents) to a local drive, and I cannot find an easy, straightforward answer.
All I see is how to copy files to a local drive. After trying and failing a lot, I finally found one flow that helped me do it in two steps:
1 - When a file is created;
2 - Create file
What I intend is to back up the root Documents folder monthly, with all subfolders and files included, to a local drive with Power Automate.
Appreciate any help.
I'll post screenshots of the flow I have right now:
Created Flow
Error after running flow
What happens is that one subfolder is selected and only the files in that subfolder are copied to the local drive, not the subfolder itself; after that the flow stops, saying no dependent actions succeeded. I was expecting the following:
1 - Select files in folder and copy to chosen path;
2 - Select subfolders with files and create the same subfolders with files on the chosen path;

First you have to list all the files you need to copy to your drive using the SharePoint Get files action:
Next, add an Apply to each using the first dynamic content you have on the right side, normally it's value, as shown below, then add the SharePoint action Get file content using path. Select the same root directory you used in the Get files action and, in the File Path property, use the dynamic content Complete Path.
Next, all you have to do is create a new file using the Google Drive action Create file, still inside the Apply to each. Use the file name, and in File Contents use the dynamic content File contents:
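If you'd rather script this backup outside Power Automate, below is a minimal sketch of the same idea using the Microsoft Graph API from Python: list the library contents, recurse into subfolders, and write each file to the local target. The access token, site ID and local path are placeholders, and paging of large folders is omitted.

```python
# Minimal sketch, assuming a valid Microsoft Graph access token and site ID
# (both placeholders): recursively mirror a document library to a local folder.
# Paging of large folders (@odata.nextLink) is omitted for brevity.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<site-id>"                                 # assumption: your site ID
HEADERS = {"Authorization": "Bearer <access-token>"}  # assumption: token obtained elsewhere

def download_folder(local_dir: str, item_id: str | None = None) -> None:
    """Download the files in a library folder, recursing into subfolders."""
    os.makedirs(local_dir, exist_ok=True)
    children_url = (f"{GRAPH}/sites/{SITE_ID}/drive/root/children" if item_id is None
                    else f"{GRAPH}/sites/{SITE_ID}/drive/items/{item_id}/children")
    for child in requests.get(children_url, headers=HEADERS).json().get("value", []):
        if "folder" in child:                         # subfolder: recurse into it
            download_folder(os.path.join(local_dir, child["name"]), child["id"])
        else:                                         # file: download its content
            content_url = f"{GRAPH}/sites/{SITE_ID}/drive/items/{child['id']}/content"
            data = requests.get(content_url, headers=HEADERS).content
            with open(os.path.join(local_dir, child["name"]), "wb") as f:
                f.write(data)

download_folder(r"C:\Backups\Documents")              # local target folder (placeholder)
```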

Related

I want to copy files at the bottom of the folder hierarchy and put them in one folder

In Azure Synapse Analytics, I want to copy files at the bottom of the folder hierarchy and put them in one folder.
The files you want to copy are located in their respective folders.
(There are 21 files in total.)
I tried using the ability of the Copy activity to flatten the hierarchy.
However, as you can see in the attached image, the file name is generated on the Synapse side.
I tried to get the name of the bottom-level file with the "Get Metadata" activity, but I could not use wildcards in the file path.
I considered creating and running 21 pipelines that would copy each file, but since the files are updated daily in Blob, it would be impractical to run the pipeline manually every day using 21 folder paths.
Does anyone know of any smart way to do this?
Any help would be appreciated.
Using flatten hierarchy does not preserve the existing file names; new file names are generated. Wildcard paths are not accepted by the Get Metadata activity. Hence, one option is to use Get Metadata with ForEach to achieve the requirement.
The following images show the folder structure that I used for this demonstration.
I created a Get Metadata activity first. It retrieves the folder names (21 folders like '20220701122731.zip') inside the Intage Sample folder, using the field list Child items.
Now I used a ForEach activity to loop through these folder names, giving the items value as @activity('Get folders level1').output.childItems.
Inside the ForEach I have 3 activities. The first is another Get Metadata activity to get the subfolder names (to get the one folder inside '20220701122731.zip', that is, '20220701122731').
In this, while creating the dataset, we passed the name of the parent folder (folder_1 = '20220701122731.zip') to the dataset, to use it in the path as
@{concat('unzipped/Intage Sample.zip/Intage Sample/', dataset().folder_1)}
This returns the names of the subfolders (like '20220701122731') which sit inside each parent folder (like '20220701122731.zip', which has one subfolder each). I then used a Set variable activity to assign this child items output to a variable (called 'test' here) using @activity('Get folder inner').output.childItems.
The final step is copy activity to move the required files to one single destination folder. Since there is only one sub-folder inside each of the 21 folders (only one sub-folder like '20220701122731' inside folder like '20220701122731.zip'), we can use the values achieved from above steps directly to complete the copy.
Along with the help of wildcard paths in this Copy data activity, we can complete the copy. The wildcard directory path will be
@{concat('unzipped/Intage Sample.zip/Intage Sample/', item().name, '/', variables('test')[0].name)}
item().name gives the parent folder name, in your case '20220701122731.zip'.
variables('test')[0].name gives the sub-folder name, in your case like '20220701122731'.
For the sink, I created a dataset pointing to a folder inside my container called output_files. When triggered, the pipeline runs successfully.
The following are the contents of my output_files folder.
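For comparison, if scripting is an option, the same "flatten but keep the original file names" copy can be done with the azure-storage-blob SDK instead of a pipeline. This is only a sketch; the connection string and container name are placeholders, and the source prefix mirrors the path used above.

```python
# Sketch of the same "flatten but keep file names" copy done outside the
# pipeline, using the azure-storage-blob SDK. Connection string and container
# name are placeholders; the prefix mirrors the path used in the answer.
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-connection-string>"   # assumption: account connection string
CONTAINER = "<container-name>"             # assumption: container holding 'unzipped/...'
SOURCE_PREFIX = "unzipped/Intage Sample.zip/Intage Sample/"
DEST_PREFIX = "output_files/"

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client(CONTAINER)

# Walk every blob under the nested folders and copy it into one flat folder,
# keeping only the original file name (the last path segment).
for blob in container.list_blobs(name_starts_with=SOURCE_PREFIX):
    file_name = blob.name.rsplit("/", 1)[-1]
    source = container.get_blob_client(blob.name)
    dest = container.get_blob_client(DEST_PREFIX + file_name)
    # Server-side copy within the same account; depending on access settings,
    # a SAS appended to source.url may be required.
    dest.start_copy_from_url(source.url)
```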

Transfer files under multiple folder from sftp to sharepoint document library using logic app

I have a scenario where I need to transfer files from an SFTP server to a SharePoint document library.
Example: the files on the SFTP server are /new/folder1/id1.csv, /new/folder2/id2.csv, and so on; files are uploaded to these folders every day. How can I replicate the same structure in a SharePoint document library using Logic Apps?
The workflow for your folder structure would be as follows:
List files in your SFTP folder "/new".
Create a "For each" loop using the output of the list action as a parameter.
To make sure you don't treat files as folders (if you can have files in /new, e.g. /new/test.txt), add a condition: the IsFolder property of the loop item = true.
Inside the loop (and the condition result True) list files again, this time in the subfolder, using the Path property of the loop item.
Create a new "For each" loop using the output of this list action as a parameter.
Optionally, add a condition: the IsFolder property of the inner loop item = false.
Get content of the SFTP file, using the Id property of the inner loop item.
Create a file on SharePoint using the folder path, file name, and file content retrieved in the previous actions as parameters. If the folder doesn't exist in the SharePoint library, it should be created automatically.
This is the simplest scenario, given the folder structure provided in your question. If the folder structure is more complex (subfolders can contain both files and subfolders, which in turn can contain other files and subfolders, and so on), then you would need a recursive algorithm. First the Logic App would list the files in a single SFTP folder (provided to the Logic App in the HTTP request body). Then, for each listed file (not subfolder), it would upload the file's content to SharePoint, and for each listed subfolder (not file), the Logic App would call itself, passing the subfolder path in the HTTP request body. This way all subfolders are processed recursively and all files in them are transferred to SharePoint.
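As a rough illustration of that recursive traversal, here is a Python sketch using paramiko for the SFTP side. The host, credentials and the upload_to_sharepoint() helper are placeholders; the actual SharePoint upload (Graph or SharePoint REST) is not shown.

```python
# Sketch of the recursive traversal described above, using paramiko for the
# SFTP side. Host, credentials and upload_to_sharepoint() are placeholders;
# the actual upload to the document library is out of scope here.
import stat
import paramiko

def upload_to_sharepoint(folder_path: str, file_name: str, content: bytes) -> None:
    """Placeholder: create folder_path/file_name in the document library."""
    raise NotImplementedError

def transfer_folder(sftp: paramiko.SFTPClient, remote_path: str) -> None:
    """Recursively walk remote_path; upload files, recurse into subfolders."""
    for entry in sftp.listdir_attr(remote_path):
        child = f"{remote_path}/{entry.filename}"
        if stat.S_ISDIR(entry.st_mode):
            transfer_folder(sftp, child)           # subfolder: recurse
        else:
            with sftp.open(child, "rb") as f:      # file: read and upload
                upload_to_sharepoint(remote_path, entry.filename, f.read())

transport = paramiko.Transport(("sftp.example.com", 22))   # assumption: host/port
transport.connect(username="user", password="password")    # assumption: credentials
sftp = paramiko.SFTPClient.from_transport(transport)
transfer_folder(sftp, "/new")
transport.close()
```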
Please note that each time such a workflow runs, it would transfer all files; it wouldn't check which files are new, which files were transferred in previous Logic App runs, and so on. That would be a completely different challenge.

Transfer a modified sharepoint Excel file to another folder

Is there a way that, whenever I modify an Excel file in the "INPUT" folder in SharePoint, the file is then copied to the "OUTPUT" folder, erasing the previous one?
Thanks in advance
I see the tag [flow] in your question, so I'll answer with that.
This is a straightforward flow to create. You need two steps.
The trigger is a SharePoint one, When a file is created or modified in a folder. You provide the site name and the source folder.
The next action is the SharePoint Move file action.
Current Site address is the same as the trigger
File to move is the Dynamic Content x-ms-file-id
Destination Site is the site where you want to copy the file to
Destination folder is the target folder
Choose your overwrite strategy, presumably 'Replace'
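If you ever need to do the same move outside Power Automate, a hedged sketch using Microsoft Graph is shown below: updating a driveItem's parentReference moves the file into another folder in the same library. The token, site ID and item/folder IDs are placeholders, and overwrite handling for an existing file in OUTPUT is not included.

```python
# Sketch of the same move done via Microsoft Graph instead of the flow:
# updating a driveItem's parentReference moves it to another folder.
# Token, site ID and item/folder IDs are placeholders; overwriting an
# existing file in OUTPUT (e.g. deleting it first) is not shown here.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access-token>",
           "Content-Type": "application/json"}
SITE_ID = "<site-id>"
FILE_ID = "<id-of-the-modified-excel-file>"
OUTPUT_FOLDER_ID = "<id-of-the-OUTPUT-folder>"

resp = requests.patch(
    f"{GRAPH}/sites/{SITE_ID}/drive/items/{FILE_ID}",
    headers=HEADERS,
    json={"parentReference": {"id": OUTPUT_FOLDER_ID}},
)
resp.raise_for_status()
```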

Automatically create Subfolders on SharePoint Document Library

I have been trying to accomplish this for weeks now and keep hitting a wall.
I have a document library on SharePoint Online with the following (close enough) structure.
Clients
-> Schools
-->Client Name
--->Communications
--->Documentation
--->Projects
---->Project Name 1
---->Project Name 2
->Retail
-->Client Name
--->Communications
--->Documentation
--->Projects
---->Project Name 1
---->Project Name 2
... and so on.
Inside the "Projects" folder there is a set of folders as well.
Right now we have a project template folder that we used to simply copy, paste, and rename when we had our file server, but now on SharePoint the Copy to process takes far too many clicks to get it to that location.
What I am trying to accomplish is be able to create a new project folder and automatically create all the folders under it.
Appreciate the guidance on this.
I was able to figure this out.
My challenge was that, when creating a folder, the workflow always wanted to create it in the root of the document library rather than in the subfolder.
So I created 2 Content types for folders, one for clients and one for projects.
I used SharePoint Designer to create the workflow. The trick was to extract the URL from the current item (the folder being created) and remove the first x characters from it, where x equals the length of the SharePoint document library location. The remaining part of the string was the exact location where I wanted the subfolders to be created.
After that, I used that variable to create all other subfolders.
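As a plain illustration of that string manipulation, the sketch below strips the library prefix from the new folder's URL and builds the subfolder paths to create. The library URL and subfolder names are only placeholders.

```python
# Sketch of the string manipulation described above: strip the document
# library prefix from the newly created folder's URL, then append the
# subfolder names to create. URL and subfolder names are placeholders.
LIBRARY_URL = "https://contoso.sharepoint.com/sites/team/Clients"      # assumption
TEMPLATE_SUBFOLDERS = ["Subfolder A", "Subfolder B", "Subfolder C"]    # placeholders

def subfolder_paths(new_folder_url: str) -> list[str]:
    """Return the library-relative paths of the subfolders to create."""
    # Remove the first len(LIBRARY_URL) characters, leaving e.g.
    # "/Schools/Client Name/Projects/Project Name 1".
    relative = new_folder_url[len(LIBRARY_URL):]
    return [f"{relative}/{name}" for name in TEMPLATE_SUBFOLDERS]

print(subfolder_paths(LIBRARY_URL + "/Schools/Client Name/Projects/Project Name 1"))
```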

sharepoint workflow to move files of a content type to new folder (after creating it if needed)

I have multiple SharePoint document libraries for different meetings. I want to keep the libraries organized by meeting day, so each meeting would have its own folder and all the files for that day would go in that folder.
To make it easy I wanted to make it so you can upload a file and then a SP workflow will create a folder for that meeting, if needed, and move the file.
So I created a "meeting file" content type that also captures the meeting date and the file type (minutes, presentation, misc, etc.).
What I need to do next is check if a folder for that meeting date exists and create it if it does not. Then move the file over to that folder.
Any ideas how I could do this?
I could also try it without using content types, but then the workflow starts automatically for every file added, and I cannot create a new folder with that workflow (because doing so would start a new instance of the workflow).
I was hoping to keep it reusable so I could just use one workflow for all the document libraries. I thought the workflow could find the path of the list it is being run on, and create the folders and do the other work within that list.
Any ideas are appreciated.
For creating folder via workflows:
Creating folders and sub-folders using SharePoint 2010 Designer Workflow
and for checking folder name:
Create a string workflow variable.
Now create a lookup for your folder and set the variable to the folder's title. Take a look here for some helpful information and usage guidelines when it comes to list folders.
The actions dependent on the existence of the folder have to be placed inside an if-statement.
E.g. the if-statement should be like "if [variable] not equals [folder name]" if you want the actions to run only when the folder does not exist.
And here is the whole tutorial that I found:
Create folders using a SPD workflow
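As an illustration of the check-then-create-then-move logic, here is a short Python sketch. The three helpers are placeholders for the workflow actions or SharePoint calls you would actually use.

```python
# Sketch of the check / create-if-missing / move logic from the answer.
# The three helpers are placeholders; in a real solution they would be
# SharePoint workflow actions or REST calls, which are not shown here.
from datetime import date

def folder_exists(library: str, folder_name: str) -> bool:
    """Placeholder: look up a folder by name in the document library."""
    raise NotImplementedError

def create_folder(library: str, folder_name: str) -> None:
    """Placeholder: create the folder in the document library."""
    raise NotImplementedError

def move_file(library: str, file_name: str, folder_name: str) -> None:
    """Placeholder: move the uploaded file into the folder."""
    raise NotImplementedError

def on_file_uploaded(library: str, file_name: str, meeting_date: date) -> None:
    """On upload: ensure the meeting-date folder exists, then move the file."""
    folder_name = meeting_date.isoformat()        # e.g. "2013-05-21"
    if not folder_exists(library, folder_name):
        create_folder(library, folder_name)
    move_file(library, file_name, folder_name)
```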
