I am trying to create a logic app that will transfer files from my FTP server to my Azure file share as they are created. The folder my trigger is watching is organized by date (see below). Each day that a file is added, a new folder is created, so I need the trigger to check new subfolders, but I don't want to go into the app every day to change which folder the trigger looks at. Is this possible?
Here's the structure of my folder (called DATA); each day that a file is added, a new folder is created:
-DATA-
2016-10-01
2016-10-02
2016-10-03
...
The FTP connector uses a configurable polling interval where you set how often it should look for new files. The trigger currently does not support dynamic folders. However, what you could try is the following:
Trigger your logic app by recurrence (same principle as the FTP trigger in fact)
Action: Create a variable to store the current date/time (in the format used in your folder naming)
Action: Do a list files in folder (here you should be able to dynamically set the folder name using the variable you created)
For-each file in folder
Action: Get File Content
Whatever you need to do with the file (calling a nested logic app is smart if you need to run multiple processing actions on each file or handle resubmits of the flow per file)
To avoid picking up every file on each run, you will need a way to exclude files that were processed in an earlier run. So either rename the file after it's processed to an extension you can exclude in the next run, or move the file to a subfolder "Processed\datetime" in the root.
This solution will require more actions and thus will be more expensive. I haven't tried it out, but I think this should work. Or at least it's the approach I would try to set up.
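For the dynamic folder name in the "List files in folder" action, a minimal expression sketch (assuming the daily folders sit directly under /DATA and use the yyyy-MM-dd naming from the question; concat, formatDateTime and utcNow are standard Logic Apps expression functions) could be:

    concat('/DATA/', formatDateTime(utcNow(), 'yyyy-MM-dd'))

You would set the variable from the steps above to this expression and pass it to the folder parameter of the list action.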
Unfortunately, what you're asking is not possible with the current FTP connector. And there isn't any really great solution right now... :(
As an aside, I've seen this pattern several times and, as you are seeing, it just causes more problems than it solves (which, realistically, is zero). :)
If you own the FTP Server, the best thing to do is put the files in one folder.
If you do not own the FTP server, politely mention to the owner that this pattern is causing problems and doesn't help you in any way, so please, put the files in one folder ;)
We are using a Logic App to move data from a SharePoint folder to Azure Blob Storage.
We were using the Sharepoint trigger "When a file is created or modified in a folder". Unfortunately, this trigger has been deprecated and does not work anymore (i.e., when a file is indeed created or modified, no further action is done after running the trigger).
No file is moved around anymore. The trigger does not execute the Logic App even though a file is created or modified in the SharePoint origin folder. I have been through the various other SharePoint triggers, but they do not seem to fit our use case. We cannot create a Logic App for each file. We are not using SharePoint lists, just classic folders. We could use several triggers pointing directly at each existing file, but as we have many files to move in the same folder, we would have to create many Logic Apps, and that is not how we want to do it. Moreover, some new files may be created in the future.
What could we do to keep the same architecture of moving data from SharePoint to Blob Storage through non-deprecated Logic App triggers?
Thank you in advance,
Alexis
You can use When a file is created or modified (properties only) to get the properties of the file that is being created or updated. Then you can use Get file content with the properties from the previous step. Finally, you can create a blob from that content. That is the flow of my logic app.
I've been searching around and can't find any clear answers to this. I need a small amount of data - talking kilobytes, probably not ever reaching megabyte range - available as a file on my Azure instance, outside the web app itself, for a web job to work with. I won't get into why this is necessary, but it is (alternatives have been explored), and the question is now where to put those files. The obvious answer seems to be to connect to the FTP, create a directory, plop them there and work with them there.
I did a quick test and I'm able to create a "downloads" directory within the "data" directory, drop some files in it, and work with them there. It works great for this very small, simple need that I have.
How long will that data stay there? Is that directory purged at any point automatically by the servers? Is that directory part of any backups that are maintained? How "safe" is something I manually put outside the wwwroot folder?
It will never be purged. The only folder that can get purged is the %TEMP% folder. All other folders that you have write access to will be persisted forever.
I am using WSS 3.0 and I need a way to listen on a shared folder library for file changes coming from users, check out those files, and copy them somewhere else on disk. It's almost like the alert functionality, but every time it fires, instead of emailing people, it needs to run some code to check out the new files and copy them to a network location.
The best solution I can come up with is creating a custom timer job that checks which files have changed since my last successful run, but then I would need to save the date/time of my last successful run somewhere.
If anybody has a better idea, they are more than welcome to share it.
You can add an event receiver to this library, and every time an item is added it will fire. Then, inside the event receiver, you copy the file to your disk location.
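As a rough sketch (the class name and target path are assumptions, and error handling and permissions to the network share are left out), the receiver could look something like this:

    using System.IO;
    using Microsoft.SharePoint;

    // Hypothetical receiver: checks out each newly added file and copies it to a network share.
    public class CopyOnAddReceiver : SPItemEventReceiver
    {
        private const string TargetRoot = @"\\fileserver\archive"; // assumed destination

        public override void ItemAdded(SPItemEventProperties properties)
        {
            SPFile file = properties.ListItem.File;
            if (file == null)
            {
                return; // item is a folder or has no file attached
            }

            file.CheckOut();                    // check the file out, as required
            byte[] content = file.OpenBinary(); // read the uploaded document
            File.WriteAllBytes(Path.Combine(TargetRoot, file.Name), content); // copy to the network location
        }
    }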
The tasks I do manually for updating my web site:
Stop IIS 7
Copy source files from a folder to the virtual directory of my web site
Start IIS 7
There are many ways to approach this, but here is one way.
I am assuming you don't want every single file in your source repository to exist on your destination server. The most reliable way to extract what you need from your source on a regular basis is through a build file; two options for achieving this are NAnt and MSBuild.
Once you have the set of files you want to deploy, you need a way to distribute them to your destination server and to stop and start IIS. Again, there are options, but I would personally recommend PowerShell (with the IIS snap-in) for this.
If you want this to happen regularly, consider a batch file executed by some timer, such as a scheduled task, or even better, a CI solution such as TeamCity.
For a full rundown, there are examples within my PowerUp project that do this.
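As a rough illustration of the PowerShell step (a minimal sketch; the site name, paths, and the WebAdministration module/snap-in being available are assumptions):

    Import-Module WebAdministration   # IIS cmdlets (a snap-in on older PowerShell hosts)

    Stop-Website -Name 'MySite'
    Copy-Item -Path 'C:\build\output\*' -Destination 'C:\inetpub\wwwroot\MySite' -Recurse -Force
    Start-Website -Name 'MySite'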
It depends where you are updating from, but you could have your virtual directory pointing to a local read-only working copy of your source code and create a task that runs a batch file/PowerShell script/etc. every day to update that working copy (via an svn update, git pull, etc.).
That supposes that you have a branch that always contains the latest releasable code.
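For example, the scheduled script could be as small as one line (the working-copy path is a placeholder):

    git -C "C:\inetpub\wwwroot\MySite" pull

or the equivalent svn update against the same path.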
You have to create a batch file with the following content:
Stop WWW publishing service
Delete your old files
Copy the new files
Start WWW publishing service
You can start/stop services like this:
net stop "World Wide Web Publishing Service"
When you have your batch file you can create a task in the Task Scheduler that calls your batch in a regular time interval (e.g. each day).
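Putting those steps together, a minimal batch file could look like this (the deploy and site paths are placeholders):

    @echo off
    net stop "World Wide Web Publishing Service"
    rmdir /s /q "C:\inetpub\wwwroot\MySite"
    xcopy "C:\deploy\MySite" "C:\inetpub\wwwroot\MySite" /e /i /y
    net start "World Wide Web Publishing Service"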
I need to convert any document that gets uploaded to an image.
I downloaded the exe (with all the DLLs) to my local machine (it does not have to be installed).
export.exe sourcefile.doc destinationfile.gif - this syntax works from my local DOS prompt.
How do I run the same command, "export.exe exampledoc.doc exampledoc.gif", when an item is added to a SharePoint document library?
Also, do I need to put the folder (where the exe and DLLs for this live) on the SharePoint front-end server so it's accessible? If yes, where should this folder reside? Do the folder and files need SharePoint service account access?
I am totally new to this and would really appreciate it if someone could shed some light on it (step by step if possible).
Thanks
Justin...
In order to do this from a SharePoint event handler, each WFE on the farm would need to have your conversion application available. Your event handler code would need to: place the uploaded file in a temporary location on disk; invoke the conversion application (look at the .NET Process class for this); cancel the addition of the original, unconverted document; add the output of your processed file to the library (use the DisableEventFiring() method of the event handler so it does not trigger itself when adding the new file); and then clean up after itself.
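A rough, partial sketch of that flow (the converter path and class name are assumptions; error handling, cleanup, and removing the original document are omitted):

    using System.Diagnostics;
    using System.IO;
    using Microsoft.SharePoint;

    // Hypothetical receiver: converts the uploaded document with export.exe and adds the image to the library.
    public class ConvertOnAddReceiver : SPItemEventReceiver
    {
        public override void ItemAdded(SPItemEventProperties properties)
        {
            SPFile uploaded = properties.ListItem.File;
            string source = Path.Combine(Path.GetTempPath(), uploaded.Name); // temporary copy on this WFE
            string target = Path.ChangeExtension(source, ".gif");

            File.WriteAllBytes(source, uploaded.OpenBinary());

            // Run the converter that is deployed on this front-end server (path is an assumption).
            using (Process convert = Process.Start(@"C:\Tools\export\export.exe",
                "\"" + source + "\" \"" + target + "\""))
            {
                convert.WaitForExit();
            }

            DisableEventFiring(); // don't let our own add re-trigger this receiver
            uploaded.ParentFolder.Files.Add(Path.GetFileName(target), File.ReadAllBytes(target), true);
            EnableEventFiring();
        }
    }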
Having said that, this seems like something you really wouldn't want to do in real time on a web server that gets any real traffic. You may want to look into batching the jobs to run daily during traffic lulls, handled by another system or by one dedicated resource on the farm.