I am getting one new file every day in my SharePoint. The file is stored in "Shared Document/Data".
Below is my Logic App flow. I used Get files, selected Documents, and included nested items, because the new data lands in the "Data" folder inside the Shared Documents library.
I used a Filter array action to keep files whose last modified time is within the last 5 minutes, so that I can pick up the latest file.
I am facing two issues:
Get files returns every file in SharePoint, and the Filter array is not working.
I seem to have configured Create blob incorrectly.
Can anyone advise me on how to do this?
Follow the workaround below.
You can use the trigger and connector below:
SharePoint (trigger) - When a file is created or modified in a folder
Use this trigger to point at the exact directory and pick up the most recently created or modified file.
Azure Blob Storage (connector) - Create blob (V2)
Use this connector to create the blob.
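In code view, the Create blob (V2) inputs mostly come straight from the trigger. A minimal sketch, assuming the SharePoint trigger exposes the file name in the x-ms-file-name header (an assumption; check your trigger's outputs, as the exact header can vary):
Blob name: @{triggerOutputs()?['headers']?['x-ms-file-name']}
Blob content: @{triggerBody()}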
Result
The modified file is fetched and added to a blob.
Refer here for more information
Updated Answer
Here is the list of available directories in SharePoint that will be shown in a SharePoint Trigger. You can select according to your requirement.
Related
The files get updated in blob storage every day, so I want an incremental method that transfers only new files and creates the folder in SharePoint if it does not exist. For example: mycontainer/folder/20210101/test.csv, mycontainer/folder/20210102/test.csv. The CSV files may be a single file or multiple files. I have created a workflow in a Logic App but got stuck; I am attaching a screenshot of my workflow.
Image screenshot:
Here is the overall flow
This is how I achieved your requirement
I first built a folder where files will be added on a daily basis, and then I used a Compose action to retrieve the 'LastModified' date from it.
Here is the Compose expression that I used [Compose]:
substring(join(split(triggerBody()?['LastModified'],'-'),''),0,8)
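For illustration, assuming LastModified comes back as an ISO 8601 string (the sample value below is hypothetical), the expression reduces it to a yyyyMMdd folder name:
triggerBody()?['LastModified'] -> "2021-01-02T07:30:00Z"
split(..., '-') then join(..., '') -> "20210102T07:30:00Z"
substring(..., 0, 8) -> "20210102"
An equivalent, arguably simpler, alternative would be formatDateTime(triggerBody()?['LastModified'], 'yyyyMMdd').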
Later, I created another folder from that value and added the file to it. Then I used another Compose action to get the path.
Here is the path Compose expression [Compose2]:
substring(body('Create_blob_(V2)')?['Path'],0,lastIndexOf(body('Create_blob_(V2)')?['Path'],'/'))
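This simply trims the file name off the blob path, leaving only the folder portion. With a hypothetical path, it evaluates like this:
body('Create_blob_(V2)')?['Path'] -> "/mycontainer/folder/20210102/test.csv"
lastIndexOf(..., '/') -> 28 (index of the last slash)
substring(..., 0, 28) -> "/mycontainer/folder/20210102"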
Lastly, I used a SharePoint connector to create a folder from the above path, and in the next step I created a file there matching the blob structure.
Here are the screenshots from my storage account and SharePoint
Storage Account
Sharepoint
Update
I noticed that the blob was deleted when I added a Delete blob connector at the end, using 'List of Files Path' as the blob. As a result, this may meet your requirement.
I am currently working for a client that has a SharePoint list in which a new subfolder is created every month. Every day a new file is added to that month's subfolder.
I already designed a Logic App which copies all files to an Azure Storage Account. The problem is, I only need the most recent file. This is what the Logic App looks like:
Logic App picture link
I tried to compare the SharePoint list with yesterday's blob storage list so that each new day the new file would be recognized, but that didn't work out as I hoped.
Is there any way to retrieve only the most recently added files from a SharePoint list?
You can use a Filter array action (with a Compose if needed) and compare LastModified against the time window you want, e.g. the last 24 or 12 hours.
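For example, a minimal Filter array setup could look like the following (the List blobs (V2) action name is an assumption; use whichever action produces your blob list):
From: @body('Lists_blobs_(V2)')?['value']
Condition: @greaterOrEquals(item()?['LastModified'], addHours(utcNow(), -24))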
Here is the screenshot of the flow for your reference
Thereby you will be able to view all the blobs that fall within the 24-hour timeframe.
Updated Answer
If the source is SharePoint, then you can add the SharePoint Get files connector and apply a Filter array.
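A minimal sketch of the Filter array condition, assuming each item returned by Get files exposes a LastModified timestamp and you want, say, the last four months:
@greaterOrEquals(item()?['LastModified'], addToTime(utcNow(), -4, 'Month'))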
Here is the screenshot for your reference
In the image, I was referring to a list covering four months of values and was able to retrieve them.
I need to get the file name of a file when it is uploaded to blob storage using Logic Apps. I'm new to Logic Apps, and this seems like it should be easy, but I'm not having any luck.
To try to find the filename, I'm sending what's available to me in an email. I will eventually use the filename as part of an HTTP POST to another service.
The Logic App is triggered as it should be when I upload, but I do not get any data in my email for the items I chose. I am not uploading to a subfolder. I've looked at the code view and searched other posts but am not finding a solution. Any help is most appreciated.
Thanks
Instead of using the built-in When a blob is added or modified trigger of the Azure Blob Storage connector, try using When a blob is added or modified (properties only) (V2) and use the List of Files Display Name dynamic content to get the file name.
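In code view, with the properties-only (V2) trigger the display name typically resolves to an expression like the one below (a sketch; the exact path depends on whether the trigger splits the returned list into one run per blob):
@triggerBody()?['DisplayName']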
Here are the screenshots
Here is the overall Logic app flow
Here is the screenshot from my outlook
I have some Excel files stored in SharePoint Online. I want to copy the files stored in SharePoint folders to Azure Blob storage.
To achieve this, I am creating a new pipeline in Azure Data factory using Azure Portal. What are possible ways to copy files from SharePoint to Azure blob store using Azure Data Factory pipelines?
I have looked at all linked services types in Azure data factory pipeline but couldn't find any suitable type to connect to SharePoint.
Rather than directly accessing the file in SharePoint from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file
Use an Azure Function
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a web activity.
You can directly call an Azure Function now.
We can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate, provide the username and password (or an Azure Key Vault reference).
Note: Use Self-hosted IR
You can use a Logic App to fetch data from SharePoint and load it into Azure Blob Storage, and then use Azure Data Factory to fetch the data from the blob. You can even set an event trigger so that whenever a file lands in the blob container, the ADF pipeline triggers automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an Automated cloud flow that triggers whenever a new file is dropped in SharePoint
Use any mentioned trigger as per your requirement and fill in the SharePoint details
Add an action to create a blob and fill in the details as per your use case
By using this, you will be copying all the SharePoint files to the blob without even using ADF.
My previous answer was true at the time, but in the last few years Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy a file from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then passing it to a subsequent Copy activity that copies the data with the HTTP connector as the source.
I ran into some issues with large files and Logic Apps. It turned out there were some extremely large files to be copied from that SharePoint library. SharePoint has a default limit of 100 MB buffer size, and the Get File Content action doesn’t natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially, and it takes longer than that, you might get a timeout error.
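A minimal sketch of that two-activity pattern (activity names here are placeholders, not from the original posts):
Web activity "GetSPOToken": POST to https://accounts.accesscontrol.windows.net/<tenant-id>/tokens/OAuth/2 with a client_credentials grant for the SPO app registration.
Copy activity HTTP source, additional headers: Authorization: Bearer @{activity('GetSPOToken').output.access_token}
HTTP dataset relative URL: _api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value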
How do I delete all files in a source folder (located on an on-premises file system)? I would need help with a .NET custom activity or any out-of-the-box solution in Azure Data Factory.
PS: I did find a delete custom activity, but it's geared more towards Blob storage.
Please help.
Currently, there is NO support for a custom activity on Data Management Gateway. Data Management Gateway only supports copy activity and Stored Procedure activity as of today (02/22/2017).
Workaround: As I do not have a delete facility for on-premises files, I am planning to keep the source files in yyyy-mm-dd folder structures, so every date folder (e.g. the 2017-02-22 folder) will contain all the related files. I will then configure my Azure Data Factory job to pull data based on the date.
Example: the ADF job on Feb 22nd will look in the 2017-02-22 folder; in the next run it will look in the 2017-02-23 folder. This way I don't need to delete the processed files.
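In current (v2) Data Factory pipelines, a hedged sketch of the same idea is to build the folder path from the run date in the dataset or source settings, for example:
@{formatDateTime(utcNow(), 'yyyy-MM-dd')}
so the Feb 22nd run reads from the 2017-02-22 folder without needing any cleanup step.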
Actually, there is a straightforward way to do it. Create an Azure Functions app that accepts a POST with your FTP/SFTP settings (if you use one) and the name of the file to remove. In the function, parse the request content as JSON, extract the settings, and use the SSH.NET library to remove the desired file. If you have just a file share, you do not even need to bother with SSH.
Later on, in Data Factory, you add a Web activity with dynamic content in the Body section that constructs the JSON request in the form mentioned above. For the URL, you specify the published Azure Function URL plus ?code=<your function key>.
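A hedged sketch of what that Body could look like (the property names are placeholders; use whatever your function expects):
{
  "host": "@{pipeline().parameters.SftpHost}",
  "userName": "@{pipeline().parameters.SftpUser}",
  "password": "@{pipeline().parameters.SftpPassword}",
  "fileName": "@{pipeline().parameters.FileToDelete}"
}
URL: https://<your-function-app>.azurewebsites.net/api/<function-name>?code=<your function key>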
We actually ended up creating a whole set of Azure Functions that serve as custom activities for our ADF pipelines.