I am trying to access a file from SharePoint from my Azure Logic App, using the SharePoint connector trigger "When a file is created or modified in a folder". The connector accesses the file, but in the properties the file name is given as "Z2lnYWJpdF92b3VjaGVycy5jc3Y=" rather than the actual file name "mydata.csv". Does anyone know why this is?
The file name is in base64 format, so you need to decode it with base64ToString in a Compose action, i.e. base64ToString(<FileName>)
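For example, a Compose action input could be the following (the exact trigger output holding the encoded name is an assumption; pick the file-name token from your trigger's dynamic content):

base64ToString(triggerOutputs()['headers']['x-ms-file-name-encoded'])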
I need to check an SFTP site, and multiple files will be uploaded to a folder there. I am writing logic apps to process the files. Each logic app will handle one file, because each file format is different. The problem is that the SFTP trigger detects a change to ANY file in the folder. So if a file changes, the logic app for that file will run, but the OTHER logic apps will run as well, which is not desired.

I have tried using a Recurrence trigger followed by an SFTP "get file content by path" action, but that fails if the specified file does not exist. What I want is for the logic app to just quit, or better, not be triggered at all.

How can I trigger the logic app only if a particular file is updated/uploaded?
In your logic app you can use dynamic content and expressions to do the following:
decodeBase64(triggerOutputs()['headers']['x-ms-file-name-encoded'])
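To keep the other logic apps from firing at all, the same expression can also be used as a trigger condition (under the trigger's Settings), so each logic app only runs for its own file. For example, for a logic app that should only handle abc.txt (the file name here is just an example):

@equals(decodeBase64(triggerOutputs()['headers']['x-ms-file-name-encoded']), 'abc.txt')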
Hope it helps!
I tried my Azure web FTP site with a condition checking whether the file name is equal to abc.txt, and got the same input. The expression result was always false.

Then I checked the run details and found that the file name in OUTPUTS wasn't abc.txt; it was encoded with base64.

In my test, abc.txt was encoded to YWJjLnR4dA==, so I changed the file name in the Logic App condition to YWJjLnR4dA== and it worked.

So you could check your run history to get the encoded file name, or encode your file name with Base64 yourself.
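If you prefer to compute the encoded name yourself instead of copying it from the run history, a couple of lines of standard-library Java reproduce the value above:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class FileNameBase64 {
    public static void main(String[] args) {
        // Encode the plain file name the same way the trigger output is encoded
        String encoded = Base64.getEncoder()
                .encodeToString("abc.txt".getBytes(StandardCharsets.UTF_8));
        System.out.println(encoded); // YWJjLnR4dA==

        // ...and decode it back to verify
        String decoded = new String(Base64.getDecoder().decode(encoded), StandardCharsets.UTF_8);
        System.out.println(decoded); // abc.txt
    }
}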
Hope this helps; if you still have other questions, please let me know.
I'm working with a PHP script that POSTs to a GPService toolbox (written in Python). The first parameter is supposed to be a GPDataFile. From the documentation, it looks like I can set the value of this parameter to a JSON-formatted string literal, {"url": "http://localhost/export/1234567890.kml"}, and arcpy.GetParameter(0) should handle this object correctly.

Unfortunately, I am receiving an error saying 'Please check your parameters'. There are two other parameters on the toolbox, but they are just strings and are working correctly. I am working in ArcGIS 10.0.

The overall goal of this interaction is to send a KML file from our SWF/ActionScript to the PHP script, which saves the KML to our database and subsequently sends it to the GPService to translate it into a GDB and then into individual shapefile objects that are stored in the database for rendering back to the SWF/ActionScript.

Any help or thoughts on how to get the toolbox to accept the JSON structure would be greatly appreciated. I would like to avoid having to send the KML contents as a string object to the toolbox.
The answer can be what maniksundaram wrote in the ESRI forum (https://community.esri.com/thread/107738):

ArcGIS Server will not support direct GPDataFile upload. You have to upload the file using an upload task and give the item ID to the GP service.

Here is the high-level idea to get it to work for any GP service that needs a file upload:

- Publish the geoprocessing service with the upload option.

Refer: ArcGIS Help (10.2, 10.2.1, and 10.2.2)
Operations allowed: Uploads: This capability controls whether a client can upload a file to your GIS server that the tasks within the geoprocessing service would eventually use. The upload operation is mainly used by web clients that need a way to send a file to the server for processing. The upload operation returns a unique ID for the file after the upload completes, which the web application could pass to the geoprocessing service. You may need to modify the maximum file size and timeouts depending on how large an upload you want your server to accept. Check the local REST SDK documentation installed on your ArcGIS Server machine for information on using an uploaded file with a geoprocessing service. This option is off by default. Allowing uploads to your service could possibly pose a security risk. Only turn this on if you need it.
- Upload the file using the upload URL that is generated for the geoprocessing service. It will give you the itemID of the uploaded file in the response.
http://<servername>:6080/arcgis/rest/services/GP/ConvertKMLToLayer/GPServer/uploads/upload
Response JSON:
{"success":true,"item":{"itemID":"ie84b9b8a-5007-4337-8b6f-2477c79cde58","itemName":"SStation.csv","description":null,"date":1409942441508,"committed":true}}
- Invoke the geoprocessing service with the item ID as the GPDataFile input.

For example, the KMLInput value would be {"itemID":"ie84b9b8a-5007-4337-8b6f-2477c79cde58"}

- The result will be added to a map service (with the job ID) if you have configured viewing the GP results in a map service, or you can read the response as it returns.
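As a rough sketch of those two REST calls in Java (java.net.http, Java 11+): the task name ConvertKMLToLayer and the KMLInput parameter come from the example above, but the host name, the multipart form-field name "file", and the use of submitJob (an asynchronous GP service) are assumptions you should check against your own service:

import java.io.ByteArrayOutputStream;
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class UploadAndSubmit {
    public static void main(String[] args) throws Exception {
        String base = "http://servername:6080/arcgis/rest/services/GP/ConvertKMLToLayer/GPServer";
        HttpClient client = HttpClient.newHttpClient();

        // 1) Upload the file to the service's uploads endpoint as multipart/form-data
        String boundary = "----Boundary" + System.currentTimeMillis();
        ByteArrayOutputStream body = new ByteArrayOutputStream();
        body.write(("--" + boundary + "\r\n"
                + "Content-Disposition: form-data; name=\"file\"; filename=\"input.kml\"\r\n"
                + "Content-Type: application/octet-stream\r\n\r\n").getBytes(StandardCharsets.UTF_8));
        body.write(Files.readAllBytes(Path.of("input.kml")));
        body.write(("\r\n--" + boundary + "--\r\n").getBytes(StandardCharsets.UTF_8));

        HttpRequest upload = HttpRequest.newBuilder()
                .uri(URI.create(base + "/uploads/upload?f=json"))
                .header("Content-Type", "multipart/form-data; boundary=" + boundary)
                .POST(HttpRequest.BodyPublishers.ofByteArray(body.toByteArray()))
                .build();
        String uploadJson = client.send(upload, HttpResponse.BodyHandlers.ofString()).body();
        System.out.println(uploadJson); // on success this contains the itemID

        // 2) Submit the job, passing {"itemID": "..."} as the GPDataFile parameter.
        //    Parse the itemID out of uploadJson with your JSON library of choice.
        String itemId = "ie84b9b8a-5007-4337-8b6f-2477c79cde58"; // value parsed from uploadJson
        String form = "KMLInput=" + URLEncoder.encode("{\"itemID\":\"" + itemId + "\"}", StandardCharsets.UTF_8)
                + "&f=json";
        HttpRequest submit = HttpRequest.newBuilder()
                .uri(URI.create(base + "/ConvertKMLToLayer/submitJob"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();
        System.out.println(client.send(submit, HttpResponse.BodyHandlers.ofString()).body());
    }
}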
I need to process files in order based on the file modified/created date. I'm using a logic app to process the files, but I cannot get to the date property using the List or Get actions of either the SFTP connector or the FTP connector.
Any thoughts on how this can be accomplished?
Any access to source code so I can make a tweak or two?
The current SFTP and FTP connectors do not return the modified date/time. If you could choose one of the following, do you have a preference? Not making any promises, but we are investigating the best way to resolve this and light up this scenario:

- Add a FileModifiedDateTime property for each file returned.

- Provide a parameter to sort ListFiles. The property would still not be exposed, but the files would be sorted as required by the client, so you wouldn't have to check the time of each file to see which is earliest.
I'm trying to integrate with the Microsoft OneDrive service. I read the API docs and went through authorization. Unfortunately, there's no info in the docs about creating different file types.

I created a .txt file using HTTP requests as described here: https://dev.onedrive.com/items/upload_put.htm

When I try to create an Excel file this way, the file is created, but the document doesn't open. I think I also need to send some special params (metadata?), but I don't know which ones.

I would be very pleased to get any help :)
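For reference, the simple upload described on that page boils down to a single PUT of the raw file bytes; here is a sketch in Java (the path and token handling are assumptions). Note that the uploaded bytes must already be a valid file of the target type: an .xlsx is a zipped OOXML package, so uploading plain text under an .xlsx name produces a file Excel cannot open, which may be what you are seeing:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;

public class OneDriveUpload {
    public static void main(String[] args) throws Exception {
        String accessToken = "..."; // OAuth bearer token from your authorization flow
        // PUT the raw bytes of an existing, valid workbook to the destination path
        HttpRequest put = HttpRequest.newBuilder()
                .uri(URI.create("https://api.onedrive.com/v1.0/drive/root:/Documents/report.xlsx:/content"))
                .header("Authorization", "Bearer " + accessToken)
                .PUT(HttpRequest.BodyPublishers.ofByteArray(Files.readAllBytes(Path.of("report.xlsx"))))
                .build();
        HttpResponse<String> resp = HttpClient.newHttpClient()
                .send(put, HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.statusCode() + " " + resp.body());
    }
}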
If I upload a file to Azure blob storage in a container where a file with the same name already exists, it overwrites the existing file. How can I avoid this overwrite? Below I describe the scenario...

step 1 - upload a file "abc.jpg" to Azure in a container called, say, "filecontainer"

step 2 - once it is uploaded, try uploading a different file with the same name to the same container

Output - it will overwrite the existing file with the latest upload

My requirement - I want to avoid this overwrite, as different people may upload files having the same name to my container.

Please help.

P.S.

- I do not want to create different containers for different users

- I am using the REST API with Java
Windows Azure Blob Storage supports conditional headers, which you can use to prevent overwriting blobs. You can read more about conditional headers here: http://msdn.microsoft.com/en-us/library/windowsazure/dd179371.aspx

Since you want the blob not to be overwritten, you would need to specify the If-None-Match conditional header and set its value to *. If the blob already exists, this causes the upload operation to fail with a Precondition Failed (412) error.
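A minimal sketch of that request in Java (java.net.http, Java 11+), since you mentioned using the REST API from Java; a SAS URL is assumed here, with Shared Key auth you would need to sign the request instead:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;

public class PutBlobIfAbsent {
    public static void main(String[] args) throws Exception {
        String blobUrl = "https://myaccount.blob.core.windows.net/filecontainer/abc.jpg?sv=..."; // SAS URL (assumed)
        HttpRequest put = HttpRequest.newBuilder()
                .uri(URI.create(blobUrl))
                .header("x-ms-blob-type", "BlockBlob")
                .header("If-None-Match", "*") // only succeed if the blob does not exist yet
                .PUT(HttpRequest.BodyPublishers.ofByteArray(Files.readAllBytes(Path.of("abc.jpg"))))
                .build();
        HttpResponse<Void> resp = HttpClient.newHttpClient()
                .send(put, HttpResponse.BodyHandlers.discarding());
        if (resp.statusCode() == 412) {
            System.out.println("Precondition Failed: a blob with this name already exists");
        }
    }
}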
Another idea would be to check for the blob's existence just before uploading (by fetching its properties); however, I would not recommend this approach, as it may lead to concurrency issues.
You have no control over the names your users upload their files with. You do, however, have control over the names you store those files under. The standard way is to generate a GUID and name each file accordingly. The chance of a conflict is practically zero.

A simple pseudocode looks like this:

//generate a Guid and rename the file the user uploaded with the generated Guid

//store the name of the file in a database (or what-have-you) together with the Guid

//upload the file to blob storage using the name you generated above
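In Java, that pseudocode might look like the sketch below (the database write and the actual upload are left as comments):

import java.util.UUID;

public class BlobNames {
    // Derive a unique storage name from the user's file name, keeping the extension
    static String storageName(String originalName) {
        int dot = originalName.lastIndexOf('.');
        String ext = dot >= 0 ? originalName.substring(dot) : "";
        return UUID.randomUUID() + ext;
    }

    public static void main(String[] args) {
        String original = "abc.jpg";
        String stored = storageName(original);
        System.out.println(stored); // e.g. 3f2504e0-4f89-11d3-9a0c-0305e82c3301.jpg
        // store the (stored, original) pair in your database,
        // then upload the blob under `stored`
    }
}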
Hope that helps.
Let me put it this way:

step one - user X uploads file "abc1.jpg" and you save it to a local folder XYZ

step two - user Y uploads another file with the same name, "abc1.jpg", and now you save it again to the local folder XYZ
What do you do now?
With this I am illustrating that your question does not relate to Azure in any way!
Just do not rely on original file names when saving files, wherever you are saving them. Generate random names (GUIDs, for example) and "attach" the original name as metadata.
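Concretely, with the Blob REST API that "attach" is just one extra header on the upload request; metadata headers use the documented x-ms-meta-* prefix (the SAS-style URL and the metadata key name here are assumptions):

import java.net.URI;
import java.net.http.HttpRequest;

public class UploadWithOriginalName {
    // Build a Put Blob request that stores the user's original file name as blob metadata
    static HttpRequest buildPut(String blobUrlWithSas, byte[] fileBytes, String originalName) {
        return HttpRequest.newBuilder()
                .uri(URI.create(blobUrlWithSas))
                .header("x-ms-blob-type", "BlockBlob")
                .header("x-ms-meta-originalname", originalName) // original name travels as metadata
                .PUT(HttpRequest.BodyPublishers.ofByteArray(fileBytes))
                .build();
    }
}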