How to iterate through all files in an SFTP folder in a Microsoft Azure Logic App

Steps I have already taken using the SFTP connector (how can I access the files while looping through "List files in folder" in an Azure Logic App?):
1. I added a For each loop.
2. I added the "List files in folder" action.
3. I passed Body as the parameter to the For each loop.
4. I then added an action to create a new file with a new name for each file.
But I am not able to get the file name and content while iterating over the SFTP folder with the For each loop.

Please see the example image showing the Logic App design. I am iterating over the SFTP folder and posting each file's content to an HTTP endpoint:
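For reference, a sketch of the expressions each step would need, assuming the SFTP connector's standard actions (the action names here are the defaults and may differ in your app; "List files in folder" returns an array of file metadata objects with Name and Path properties):

```
For each input:    @body('List_files_in_folder')
File name:         @items('For_each')?['Name']
File content:      use "Get file content using path" with path = @items('For_each')?['Path']
Create file name:  @concat('new_', items('For_each')?['Name'])
```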

Related

Azure Data Factory: Delete an Empty Directory Issue

Within an Azure Data Factory pipeline I am attempting to remove an empty directory.
The files within the directory were removed by a previous pipeline's iterative operation, leaving an empty directory to be removed.
The directory is a sub-folder; the hierarchy is:
container / top-level-folder (always present) / directory (dynamically created, the result of an unzip operation).
I have defined a specific dataset that points to
container / @concat('top-level-folder/', dataset().dataset_folder)
where 'dataset_folder' is the only parameter.
The Delete Activity is configured like this:
On running the pipeline it errors with this error:
Failed to execute delete activity with data source 'AzureBlobStorage' and error 'The required Blob is missing. Folder path: container/top level directory/Directory to be removed/.'. For details, please reference log file here:
The log is an empty spreadsheet.
What am I missing from either the dataset or delete activity?
In Azure Blob Storage, when all the contents of a folder are deleted, the folder itself is deleted automatically.
When I deleted each file and then tried to delete the folder at the end, I got the same error.
This is because the folder disappears as soon as the files inside it are deleted. Only with Azure Data Lake Storage (hierarchical namespace) do you have to delete the folder explicitly.
Since the requirement is to delete a folder in Azure Blob Storage, you can simply remove the Delete activity that targets the folder.
I used an Azure Function, as suggested here, to perform the deletion:
Delete folder using Microsoft.WindowsAzure.Storage.Blob: blockBlob.DeleteAsync();
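Where the files do have to be removed one by one (for example, replicating the Azure Function approach without the SDK), here is a minimal sketch against the Blob REST API, assuming Node 18+ (global fetch) and a SAS token; the account, container, and folder names are placeholders:

```typescript
// Build the List Blobs URL for everything under a folder prefix
// (container-scope listing with the `prefix` URI parameter).
function listBlobsUrl(account: string, container: string, folder: string): string {
  const prefix = folder.endsWith("/") ? folder : folder + "/"; // normalize trailing slash
  return `https://${account}.blob.core.windows.net/${container}` +
    `?restype=container&comp=list&prefix=${encodeURIComponent(prefix)}`;
}

// Pull the <Name> elements out of the XML listing response.
// A real implementation would parse the XML properly.
function blobNames(listXml: string): string[] {
  const names: string[] = [];
  const re = /<Name>([^<]+)<\/Name>/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(listXml)) !== null) names.push(m[1]);
  return names;
}

// Sketch: list every blob under the folder, then DELETE each one.
// (Not executed here; `sas` is a placeholder SAS query string.)
async function deleteFolderContents(account: string, container: string,
                                    folder: string, sas: string): Promise<void> {
  const res = await fetch(`${listBlobsUrl(account, container, folder)}&${sas}`);
  for (const name of blobNames(await res.text())) {
    await fetch(`https://${account}.blob.core.windows.net/${container}/${name}?${sas}`,
                { method: "DELETE" });
  }
}
```

Once the last blob under the prefix is gone, the virtual folder disappears with it, which is exactly the behavior described above.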

How to get the contents of files in a SharePoint document library using PnP.js and pass them to a client application so users can view them

I need to get a list of files from a SharePoint document library and return the results to a client application, including both the file metadata and the contents.
I can get the list of files using
const files = await sp.web.getFolderByServerRelativeUrl("/sites/mysite/mylib/docs").files.get();
How can I retrieve the contents of the files and send them to clients too, so they can view the files along with the metadata?
It looks like I can get the content of a file using
const blob = await sp.web.getFileByServerRelativeUrl("/sites/dev/documents/file.avi").getBlob();
Since I already retrieve an array of file objects, is the file content already included in each file object? If yes, how do I access it?
I know that I can return the LinkingUrl to the client. But for various reasons we have users who do not have access to the SharePoint site, so the link URL doesn't work for them. I have to return the contents directly.
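One point worth noting: .files.get() returns metadata only, so the content needs a separate request per file (PnP.js exposes getBlob(), getBuffer(), and getText() on a file object for that). As a hedged sketch, the same can be done against the underlying SharePoint REST endpoint, base64-encoding the bytes so they can travel inside a JSON response; the site URL, path, and auth header below are placeholders:

```typescript
// The /$value endpoint returns the raw bytes of a file.
function fileContentUrl(siteUrl: string, serverRelativePath: string): string {
  return `${siteUrl}/_api/web/GetFileByServerRelativeUrl('${serverRelativePath}')/$value`;
}

// Fetch the bytes and base64-encode them for a JSON payload to the client.
// (Not executed here; authHeader is a placeholder bearer token.)
async function getFileAsBase64(siteUrl: string, path: string,
                               authHeader: string): Promise<string> {
  const res = await fetch(fileContentUrl(siteUrl, path), {
    headers: { Authorization: authHeader },
  });
  return Buffer.from(await res.arrayBuffer()).toString("base64");
}

console.log(fileContentUrl("https://tenant.sharepoint.com/sites/mysite",
                           "/sites/mysite/mylib/docs/report.pdf"));
```

Returning base64 content alongside the metadata keeps everything in one response, at the cost of roughly a third more payload size than the raw bytes.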

How to get multiple files using Get blob content and add them as attachments to an email in an Azure Logic App?

Hi, I am working in an Azure Logic App. I am trying to get multiple files from Azure Data Lake Storage Gen2 and attach them to a single email. As a first step I added an HTTP request and supplied the required information along with the file path. This works fine for one file, but I want to pass a folder path instead, get all the files inside that folder, and attach them to the email.
Logic app Flow Diagram
Added sample screenshot for attachment
Tried to add attachment
In the above diagram, the Get blob content step works fine for one file, but I am finding it difficult to attach multiple files to the email. Can someone help me figure out a solution? Any help would be appreciated. Thank you.
You can use the List blobs action to list all blobs in the folder you want:
Then you can define an Array variable to collect the attachments.
Use a For each to loop over the blobs from the List blobs action. Within the For each, use Get blob content to get each blob's content, then use Append to array variable to build up the attachments.
The expressions for Path, DisplayName, and File Content are as follows:
Path : items('For_each')?['Path']
DisplayName : items('For_each')?['DisplayName']
File Content : body('Get_blob_content')
Finally, please fill in the attachment in the email:
========== Update ==========
If sending the email fails with a 400 response, use this expression instead in Append to array variable:
base64(body('Get_blob_content'))
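Putting the expressions together, the item appended to the attachments array would look roughly like this (a sketch assuming the Outlook-style attachment shape with Name and ContentBytes, and that your loop and action are named For_each and Get_blob_content):

```json
{
  "Name": "@{items('For_each')?['DisplayName']}",
  "ContentBytes": "@{base64(body('Get_blob_content'))}"
}
```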

Rest API to get the list of files inside the container/directory/ in AZURE storage

I have a container called 'services'. Inside the container, I have a directory called 'Test' with a few CSVs in it. How can I get the list of blobs inside the directory 'Test'? I need the REST API call to list the files.
However, I am able to get the list of items inside the container easily using the REST API below:
https://myaccount.blob.core.windows.net/services?restype=container&comp=list
I tried
https://myaccount.blob.core.windows.net/services/Test?restype=directory&comp=list
but it is not working.
Please help with the correct parameter value or REST API call to list the items inside the directory.
https://myaccount.blob.core.windows.net/services/Test?restype=directory&comp=list&prefix=Test/
The doc:
https://learn.microsoft.com/en-us/rest/api/storageservices/list-blobs#uri-parameters
Just adding the prefix parameter is enough.
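Per the linked doc, the listing with a prefix can also be issued at container scope. A small sketch of calling it from TypeScript (Node 18+ global fetch assumed; the SAS token is a placeholder, and the XML response is parsed crudely for brevity):

```typescript
// Pull the blob names out of the List Blobs XML response.
function extractBlobNames(listXml: string): string[] {
  const names: string[] = [];
  const re = /<Name>([^<]+)<\/Name>/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(listXml)) !== null) names.push(m[1]);
  return names;
}

// List everything under Test/ in the 'services' container.
// (Not executed here; `sas` is a placeholder SAS query string.)
async function listTestFiles(sas: string): Promise<string[]> {
  const url = "https://myaccount.blob.core.windows.net/services" +
              "?restype=container&comp=list&prefix=Test/&" + sas;
  const res = await fetch(url);
  return extractBlobNames(await res.text());
}

const sample = "<Blob><Name>Test/a.csv</Name></Blob><Blob><Name>Test/b.csv</Name></Blob>";
console.log(extractBlobNames(sample)); // → [ 'Test/a.csv', 'Test/b.csv' ]
```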

Logic Apps - for each loop with liquid from blob storage

I'm learning Logic Apps and I'm trying to create a simple flow: take a file from Azure Blob Storage, perform Liquid parsing on it, and then save the parsed file to another blob container.
How it should work:
1. Whenever a new file is added to the blob container ("from") [containing XML files]
2. A Liquid action takes place (XML -> JSON)
3. The new .json file is saved to the blob container ("too") :)
What I have learned:
1. I managed to write a Liquid template for the XML files - tested, working.
2. I know how to copy a file between blob containers - tested, working.
For each loop:
https://i.imgur.com/ImaT3tf.jpg "FE loop"
Completed:
https://i.imgur.com/g6M9eLJ.jpg "Completed..."
Current LA:
https://i.imgur.com/ImaT3tf.jpg "Current"
What I don't know how to do:
1. How do I "insert" the current file's content in the For each into the Liquid action? It looks like the Logic App is skipping that step.
The main problem is that you cannot use the current item directly as the XML content. You need to get the content with a Get blob content action inside the For each, then transform the XML to JSON. After that, create the blob in the other container with the JSON value.
You can refer to my workflow:
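A sketch of the expressions inside the For each, assuming default action names (adjust to match your own; depending on how the blob content comes back, the Liquid action may need the body wrapped in xml()):

```
Get blob content:       blob path = @items('For_each')?['Path']
Transform XML to JSON:  content   = @xml(body('Get_blob_content'))
Create blob:            content   = @body('Transform_XML_to_JSON')
```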
