I need to process files in order of their modified/created date. I'm using a Logic App to process the files, but I cannot get to the date property using the List or Get actions of either the SFTP connector or the FTP connector.
Any thoughts on how this can be accomplished?
Any access to source code so I can make a tweak or two?
The current SFTP and FTP connectors do not return the modified date/time. If you could choose one of the following, which would you prefer? Not making any promises, but I'm investigating the best way to resolve this and light up this scenario:
Add a FileModifiedDateTime property for each file returned.
Provide a parameter to sort the results of ListFiles. The property would still not be exposed, but the files would be returned in the order the client requires, so you wouldn't have to check the time of each file to find the earliest.
I am using a Logic App to detect any change in an FTP folder. There are 30+ files, and whenever there is a change the changed file is copied to blob storage. The issue is that the workflow fires for each file: if 30 files are changed, it fires 30 times. I want it to fire only once, no matter how many files in the folder changed. After the blobs are copied, I fire a GET request so that my website is updated as well. Am I using the wrong approach?
Below you can see my whole logic.
In your question you mention that you are using the FTP connector, but your screenshot (which includes the file content property on the trigger) suggests you are actually using the SFTP-SSH connector trigger, since the FTP trigger doesn't have that property. Please correct me if my understanding is wrong.
If you are using the When a file is added or modified trigger, it will run your workflow for every file that is added or modified; that is the expected behavior.
But if you are using the When files are added or modified (properties only) trigger, it has a Split On setting that you can disable (it is enabled by default), so your workflow will execute only once for all the files that were added or modified within the polling interval you configured in How often do you want to check for the item?.
If it is the FTP connector, the same still applies: disable the Split On setting. For more details refer to this section.
How are you doing?
I'm trying to download an Excel file from a website (specifically DataCamp) in order to use its data in an automated process, but before getting the file it is necessary to sign in on the page. I was thinking this might be possible with a JSON query on the HTTP action, but to be honest I don't know where to start (I'm new to Azure).
The process that I need to emulate to extract the file would be as follows (I know this could be done with an API or RPA, but I don't have either available for now):
Could you give me some advice (how to get the desired result, or at least where to do some research)? Is this even possible?
Best regards.
If you don't have another way in (e.g. the source being available on an SFTP server), then using an HTTP action should work. Pass the BODY to your next action (you might want to persist it to a BLOB if the content is binary).
If your content is "readable", e.g. JSON or CSV, and you want to load it for processing, then for large files you need to make sure you read it in chunks so it is loaded completely before processing.
Detailed explanation at https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-handle-large-messages#download-content-in-chunks
I need to check an SFTP site where multiple files are uploaded to a folder. I am writing Logic Apps to process the files; each Logic App handles one file, because each file's format is different. The problem is that the SFTP trigger can only detect a change to ANY file in the folder, so when one file changes, the Logic App for that file runs, but the OTHER Logic Apps run as well, which is not desired.
I have tried using a Recurrence trigger followed by an SFTP get file content by path action, but that fails if the specified file does not exist. What I want is for the Logic App to simply quit, or better, not be triggered at all.
How do I trigger a Logic App only when a particular file is updated/uploaded?
On your Logic App you can actually use Dynamic Content and expressions to do the following:
decodeBase64(triggerOutputs()['headers']['x-ms-file-name-encoded'])
Hope it helps!
I tried this with my Azure web FTP site, using a condition that checks whether the file name is equal to abc.txt, and hit the same issue: the expression result is always false.
Then I checked the run details and found that the file name in OUTPUTS wasn't abc.txt; it was encoded with Base64.
In my test, abc.txt was encoded as YWJjLnR4dA==. After I changed the file name in the Logic App condition to YWJjLnR4dA==, it worked.
So you can check your run history to get the encoded file name, or go to this site to encode your file name with Base64.
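If you'd rather compute the encoded name yourself than use a website, any language's standard Base64 routine gives the same result; here is a quick sketch in Java, purely for illustration (the Logic App itself keeps using the decodeBase64(...) expression):

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class FileNameEncoding {
    public static void main(String[] args) {
        // Encoding the plain file name gives the value you see in the trigger OUTPUTS.
        String encoded = Base64.getEncoder()
                .encodeToString("abc.txt".getBytes(StandardCharsets.UTF_8));
        System.out.println(encoded); // YWJjLnR4dA==

        // Decoding it gives the original name back, which is what the
        // decodeBase64(...) expression does inside the Logic App condition.
        String decoded = new String(Base64.getDecoder().decode("YWJjLnR4dA=="),
                StandardCharsets.UTF_8);
        System.out.println(decoded); // abc.txt
    }
}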
Hope this helps; if you still have other questions, please let me know.
I want to export a table to an Excel file. I need to export a report.
ORA_EXCEL.new_document;
ORA_EXCEL.add_sheet('Sheet name');
ORA_EXCEL.query_to_sheet('select * from mytable');
ORA_EXCEL.save_to_blob(myblob);
I have saved my table to a BLOB.
How do I export it / return it to the user (client)?
I need something simple that allows a user to download an Excel file to their own computer. I tried this procedure in an Oracle workflow:
ORA_EXCEL.save_to_file('EXPORT_DIR', 'example.xlsx');
But this did not help, because it saves the file to a directory on the server, and I need it on the user's (client) machine.
The way I have handled similar requirements in the past was to work with the systems people to mount a directory from either a web server or a file server on the database server.
Then create a directory object so that the procedure can save to a location that is accessible to the user.
If the files are not sensitive and there is a limited number of users, then a file server makes sense, as it is then just a matter of giving the user access to the file share.
If the files are sensitive, or there is a large number of users or the users are unknown, we used the web server and sent the user an email with a link so they could download their file. Naturally, security needs to be built in to stop people from downloading other users' files.
We didn't just email the files as attachments because:
1) Emails with attachments tend to get blocked.
2) We always advise people not to open attachments in emails. (Yes, I know we also advise not to click on links, but nothing is perfect.)
Who or what is invoking the production of the document?
If it's done by an application the user is working in, that application can fetch the BLOB, store it in e.g. a TEMP directory and call
System.Diagnostics.Process.Start("..."); to open it with the associated application (see Open file with associated application).
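If that calling application happened to be Java rather than .NET, an equivalent sketch would look roughly like this (the BLOB is assumed to have been fetched already and is passed in as a stream; names are placeholders):

import java.awt.Desktop;
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public class OpenExportLocally {
    // blobStream: InputStream of the BLOB already fetched from the database.
    public static void openAsExcel(InputStream blobStream) throws Exception {
        // Write the BLOB to a temporary .xlsx file on the client machine.
        File tmp = File.createTempFile("export-", ".xlsx");
        Files.copy(blobStream, tmp.toPath(), StandardCopyOption.REPLACE_EXISTING);
        // Open it with whatever application is associated with .xlsx (e.g. Excel).
        Desktop.getDesktop().open(tmp);
    }
}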
If it's a website, it could stream the BLOB back with the Excel MIME type (see Setting mime type for excel document).
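A minimal sketch of that streaming approach, here as a Java servlet; it assumes you have first persisted the BLOB produced by ORA_EXCEL.save_to_blob into a table, and the table EXPORT_FILES, its FILE_DATA column and the DataSource wiring are made-up placeholders:

import java.io.IOException;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

public class ExcelDownloadServlet extends HttpServlet {

    private DataSource dataSource; // obtained via JNDI or injection; wiring omitted here

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Tell the browser it is receiving an .xlsx download.
        resp.setContentType("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
        resp.setHeader("Content-Disposition", "attachment; filename=\"example.xlsx\"");

        // Fetch the stored BLOB and copy it straight to the response stream.
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(
                     "SELECT file_data FROM export_files WHERE id = ?")) {
            ps.setLong(1, Long.parseLong(req.getParameter("id")));
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    try (InputStream in = rs.getBinaryStream(1)) {
                        in.transferTo(resp.getOutputStream()); // Java 9+
                    }
                } else {
                    resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                }
            }
        } catch (SQLException e) {
            throw new ServletException(e);
        }
    }
}

Whatever web stack you actually use, the essentials are the same: set the Excel MIME type, set a Content-Disposition header, and copy the BLOB stream to the response.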
You could also store it in an Oracle DIRECTORY, but that has to be on the server and would have to be a network share to be accessible to clients (which is rarely accepted in a production environment!).
If mail isn't the solution, then maybe FTP is a way to store the files on a common share. See the UTL_TCP package; with it an FTP transfer can be achieved (a bit hard to code, but there are solutions to be found on the web), and I guess professional tools that generate Office documents from an Oracle DB and distribute them do it like this.
I am using int-sftp:inbound-channel-adapter to download valid files from an SFTP server to a local directory. I need to delete the local file if it is rejected by my custom filter. Can this be achieved via configuration, or do I need to implement it in code? If so, is there a sample out there?
You would need to do the deletion in your custom filter. Use File.delete().
But, of course, it would be better to use a custom remote filter, instead of a custom local filter, to avoid fetching the invalid file (unless you need to look at the contents).
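For illustration, a rough sketch of a local filter that deletes what it rejects, along the lines suggested above; the .xml check stands in for whatever your real validation is, and you would plug the bean in via the adapter's local-filter attribute:

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import org.springframework.integration.file.filters.FileListFilter;

public class DeleteRejectedLocalFilter implements FileListFilter<File> {

    @Override
    public List<File> filterFiles(File[] files) {
        List<File> accepted = new ArrayList<>();
        for (File file : files) {
            if (isValid(file)) {
                accepted.add(file);
            } else {
                // Rejected: remove the copy that was already fetched to the local directory.
                file.delete();
            }
        }
        return accepted;
    }

    private boolean isValid(File file) {
        // Placeholder for your real acceptance rule.
        return file.getName().endsWith(".xml");
    }
}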