I'm using Azure Logic Apps to copy files from FTP to Blob Storage. I'm using the trigger "FTP - When a file is added or modified" and then copying the file content to Blob Storage.
The problem is that the file is copied to Blob Storage before it has been completely written to the FTP server, so I get partial files.
Is there any way to make Azure Logic Apps wait so that it only copies a file once it is complete?
This behavior happens when your FTP file system does not provide file locking. Similar behavior can occur when using the FTP-adapter in BizTalk.
In BizTalk, the best way to handle this is to modify the client that creates the file on the FTP server so that it uses temporary filenames, as sketched in the code below:
1. The client creates ftpFile.tmp.
2. The client writes the file content.
3. The client renames ftpFile.tmp to ftpFile.xml (or whatever extension is needed).
4. BizTalk only picks up files with the .xml extension.
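For illustration, here is a minimal sketch of the client side of this pattern using the Apache Commons Net FTPClient (the host, credentials, and filenames are placeholders):

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class TempNameUpload {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");    // placeholder host
        ftp.login("user", "password");     // placeholder credentials
        ftp.setFileType(FTP.BINARY_FILE_TYPE);
        ftp.enterLocalPassiveMode();

        // Steps 1-2: upload under a temporary name the consumer will not match
        try (InputStream in = Files.newInputStream(Paths.get("ftpFile.xml"))) {
            ftp.storeFile("ftpFile.tmp", in);
        }

        // Step 3: rename only after the upload has finished; on most servers
        // the rename is atomic, so the consumer never sees a partial .xml file
        ftp.rename("ftpFile.tmp", "ftpFile.xml");

        ftp.logout();
        ftp.disconnect();
    }
}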
A major problem for me with the Logic Apps FTP connector is that you can't specify a file mask in the designer when using the connector as a trigger. That is strange, because I remember that option being available in the first version of Logic Apps.
You need to enable the Include File Content option. You can do this in two ways:
1. Go to the Logic App Designer, expand your FTP trigger, and set Include File Content to Yes.
2. Go to the Logic App Code View, find the trigger JSON, and add "includeFileContent": true inside the "queries" object.
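For reference, the relevant fragment of the trigger JSON would then look roughly like this (the other query parameters, such as the folder ID, will vary with your setup):

"queries": {
  "folderId": "<your folder id>",
  "includeFileContent": true
}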
We are using a Logic App to move data from a SharePoint folder to Azure Blob Storage.
We were using the SharePoint trigger "When a file is created or modified in a folder". Unfortunately, this trigger has been deprecated and no longer works (i.e., when a file is created or modified, nothing runs after the trigger).
No file is moved anymore: the trigger does not execute the Logic App even though a file is created or modified in the SharePoint source folder. I have been through the various other SharePoint triggers, but they do not seem to fit our use case; we cannot create a Logic App for each file. We are not using SharePoint lists but classic folders. We could use several triggers pointing directly at each existing file, but since we have many files to move in the same folder, we would have to create many Logic Apps, and that is not how we want to do it. Moreover, new files may be created in the future.
What could we do to keep the same architecture of moving data from SharePoint to Blob Storage using non-deprecated Logic App triggers?
Thank you in advance,
Alexis
You can use When a file is created or modified (properties only) to get the properties of the file that is being created or updated. Then you can use Get file content with the properties from the previous step. Finally, you can create a blob from those outputs. Below is the flow of my Logic App.
RESULTS:
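In code view, the two actions after the properties-only trigger might look roughly like this (the site URL, container name, and connection names are placeholders, and exact paths can differ between connector versions):

"Get_file_content": {
  "type": "ApiConnection",
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['sharepointonline']['connectionId']" } },
    "method": "get",
    "path": "/datasets/@{encodeURIComponent('https://contoso.sharepoint.com/sites/MySite')}/files/@{encodeURIComponent(triggerBody()?['{Identifier}'])}/content"
  }
},
"Create_blob": {
  "runAfter": { "Get_file_content": [ "Succeeded" ] },
  "type": "ApiConnection",
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" } },
    "method": "post",
    "path": "/datasets/default/files",
    "queries": {
      "folderPath": "/mycontainer",
      "name": "@triggerBody()?['{FilenameWithExtension}']"
    },
    "body": "@body('Get_file_content')"
  }
}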
I created a Logic App that uses the SharePoint trigger "When a file is created or modified in a folder". It works perfectly when I upload a file in SharePoint Online (in a SharePoint browser tab), but it doesn't work when I drop a file into my synced Windows Explorer folder.
I read that someone faced the same problem: https://learn.microsoft.com/en-us/answers/questions/41215/logic-app-why-does-sharepoint-file-properties-trig.html. Here it says:
Move files and flow runs: When you move one or more files from one document library to another, the original file is moved from the source library to the destination library. Moving the file does not alter any custom metadata, including when the file was created and modified. Hence, this action does not trigger any flows for those file updates associated in the library where it was moved.

Syncing files to your OneDrive for Business and SharePoint document libraries: When users sync one or more files from one document library to another, the original file is moved (synced) from your client to the destination library. Syncing the file will not alter any custom metadata, including when the file was created and modified. Hence, this action will not trigger any flows for those file syncs in that library or in your OneDrive for Business.
The thing is that I NEED this Logic App to run just by dropping a file into a Windows Explorer folder (which is a SharePoint folder shared with a certain person). Do you know how I can achieve this?
It started working for me when I used the OneDrive trigger "When a file is created", because we use OneDrive to sync the Windows Explorer folder. You need to specify the folder where the trigger should be invoked, and you must set Include subfolders to true if you want the trigger to fire when a file is added to any of the subfolders.
Here are screenshots of the Logic App working, both when adding a file in the subfolders and when adding a file in the root folder.
Is it possible to rename an underlying file while unzipping using a Logic App? I am calling an HTTP action to download a ZIP file. That ZIP contains only one underlying file, with some value appended to its name. I want to store the unzipped file under a better name so that it can be used further on. Is that possible?
Incoming ZIP File --> SAMPLEFile.ZIP
Underlying File --> SampleTextFile20200824121212.TXT
Desired File --> SampleTextFile.TXT
Suggestions?
As far as I know, we can't implement this requirement directly in the "Extract archive to folder" action. We can only rename the file by copying it from one folder to another (shown below).
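If the appended value has a fixed length, as in the example (a 14-digit timestamp plus the .TXT extension, i.e. 18 trailing characters), the destination name for that copy can be computed with a workflow expression. A sketch, assuming a hypothetical variable fileName holding the extracted file's name:

concat(substring(variables('fileName'), 0, sub(length(variables('fileName')), 18)), '.TXT')

Applied to SampleTextFile20200824121212.TXT, this yields SampleTextFile.TXT.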
You can create a new ticket on the feedback page to ask the Azure team for this feature.
I have a use case where I need to read a file from a remote Windows share, with the following steps:
1. Read the file from the Windows share location, say an input folder.
2. Process the file and upload its content to a REST service.
3. If the call is successful, move the remote file to an archive folder, which will also be on the Windows remote share.
I am using Spring Integration SMB. Please help: how can I achieve the archival use case?
Thanks,
barvepan
If it's on the same share, use the SmbRemoteFileTemplate's rename() method.
If it's a different share, you'll have to copy the file and then remove it (you can use the remote file template for that too).
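A minimal sketch of the same-share case, assuming hypothetical host, credentials, and folder names:

import org.springframework.integration.smb.session.SmbRemoteFileTemplate;
import org.springframework.integration.smb.session.SmbSessionFactory;

public class SmbArchiver {
    public static void main(String[] args) {
        SmbSessionFactory sessionFactory = new SmbSessionFactory();
        sessionFactory.setHost("fileserver");      // placeholder host
        sessionFactory.setUsername("user");        // placeholder credentials
        sessionFactory.setPassword("password");
        sessionFactory.setShareAndDir("share/");   // placeholder share

        SmbRemoteFileTemplate template = new SmbRemoteFileTemplate(sessionFactory);

        // On the same share, rename() moves the file from input to archive
        // in one operation, so there is no copy step to fail halfway.
        template.rename("input/data.txt", "archive/data.txt");
    }
}

Call the rename only after the REST upload succeeds; if the call fails, skip it and the file stays in the input folder for a retry.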
I have created an MVC 4 application with a text file. I write some data to the file "ddd.txt". On my PC I used a path like "D://Project//ddd.txt" and it worked fine. However, when I deploy my website to azurewebsites.net, I have to use a different path.
This text file is very important to me, and I would like to read data from it.
What directory path should I use in my application so that it works on the Windows Azure server?
I would try using App_Data (you can just add the directory to your project if it doesn't already exist).
You could then load the file like this:
string path = HttpContext.Server.MapPath("~/App_Data/ddd.txt");
string contents = System.IO.File.ReadAllText(path); // read the file contents
The other option would be to store it in Blob Storage; there is a good walkthrough of the different features here: http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/