Storage account name is required in Logic App - Azure

I created an HTTP request trigger and a List blobs action, then inside a For each I added Get blob content and Create blob actions.
In the Create blob action, even after entering the storage account name, I still get the error "storage account name is required". After this step I also want to delete blob1.
Any help is appreciated.

In the storage account I created two containers, blob1 and blob2.
blob1 holds sample data and blob2 is empty.
After reproducing the issue on my side: in the For each, take the value as dynamic content. That lets you retrieve the previous action's output, and the storage account name then resolves correctly in the Create blob action.
The Logic App ran successfully, and the sample file appears in the blob2 container.
The Delete blob action also ran successfully, so the blob1 container is now empty.
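For reference, a trimmed, hypothetical sketch of what the For each section of the workflow definition (code view) can look like. The action names, the azureblob connection reference, and the /blob2 destination folder are illustrative assumptions, not copied from a real workflow:

```json
"For_each": {
  "type": "Foreach",
  "foreach": "@body('List_blobs')?['value']",
  "actions": {
    "Get_blob_content": {
      "type": "ApiConnection",
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" } },
        "method": "get",
        "path": "/datasets/default/files/@{encodeURIComponent(items('For_each')?['Path'])}/content"
      }
    },
    "Create_blob": {
      "runAfter": { "Get_blob_content": [ "Succeeded" ] },
      "type": "ApiConnection",
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" } },
        "method": "post",
        "body": "@body('Get_blob_content')",
        "path": "/datasets/default/files",
        "queries": {
          "folderPath": "/blob2",
          "name": "@items('For_each')?['Name']"
        }
      }
    }
  }
}
```

The key point is the `items('For_each')` expressions: once the loop iterates over the List blobs output as dynamic content, each blob's path and name feed the inner actions.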

Related

Storage event trigger in Azure Data Factory

I have to create a storage event trigger to process a file created on blob storage. While creating it, I am asked for the storage account and container name. I need to supply these values dynamically, since I have a different storage account name for each environment (prod and non-prod).
But I am unable to find an option to give a dynamic storage account name. What should I do?

Azure blob snapshots not getting deleted via logic app

While deleting old blobs using a Logic App by giving the container path, we ran into the error "Status code: 409, message: This operation is not permitted because the blob has snapshots." This subsequently fails the run of the Logic App. I tried to use Delete blob by providing the Id and filename, but the error persists. Is there any way to specifically delete a blob and its corresponding snapshots using the Logic App? Approaches to solving the issue are welcome. Blob lifecycle management policy does not work for us.
You can use an Azure Function to delete your blob, including this header in your request:
x-ms-delete-snapshots: {include, only}
Required if the blob has associated snapshots. Specify one of the following two options:
include: Delete the base blob and all of its snapshots.
only: Delete only the blob's snapshots, not the blob itself.
This header should be specified only for a request against the base blob resource. If it is specified on a request to delete an individual snapshot, the Blob service returns status code 400 (Bad Request).
If it is not specified and the blob has associated snapshots, the Blob service returns status code 409 (Conflict) - which is exactly the error you are seeing.
See the Delete Blob operation in the Azure Storage REST API documentation for details.
Alternatively, within your Logic App you can filter and order your blobs so that snapshots are deleted first, before removing the base blob.
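As a minimal, stdlib-only sketch of what the Azure Function would send: the function below only builds the Delete Blob REST call with the x-ms-delete-snapshots header; the account, container, and blob names are placeholders, and authentication (SAS token or Authorization header) is left out:

```python
def build_delete_request(account: str, container: str, blob: str,
                         snapshot_option: str = "include") -> dict:
    """Build the URL and headers for a Delete Blob call that also
    handles snapshots via the x-ms-delete-snapshots header."""
    if snapshot_option not in ("include", "only"):
        raise ValueError("snapshot_option must be 'include' or 'only'")
    return {
        "method": "DELETE",
        "url": f"https://{account}.blob.core.windows.net/{container}/{blob}",
        "headers": {
            # 'include' deletes the base blob plus all of its snapshots;
            # 'only' deletes just the snapshots and keeps the blob.
            "x-ms-delete-snapshots": snapshot_option,
            "x-ms-version": "2021-08-06",
        },
    }

# Placeholder names for illustration only.
req = build_delete_request("mystorageacct", "oldblobs", "report.csv")
```

If your function uses the azure-storage-blob SDK instead of raw REST, the same behavior is available as `BlobClient.delete_blob(delete_snapshots="include")`.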

Azure Data Factory to Azure Blob Storage Permissions

I'm connecting ADF to Blob Storage v2 using a managed identity, following this doc: Doc1
When I test the connection with my first dataset, the connection to the linked service succeeds. But when I try by file path and enter "testfolder" (which exists in the blob container), it fails with the generic forbidden error shown at the end of this post.
However, when I opt to "browse" the folders in the dataset portal, the folder "testfolder" does show up. But when I select it, nothing inside that folder is shown.
The Data Factory managed identity is given the Contributor role, granting full access to manage all resources. Is there some other hidden issue, or a way to narrow it down? My instinct is that this is something within the blob container, since I can view the containers but not their contents.
Error message:
It seems you haven't assigned a data role on the blob storage account - the Contributor role grants management access but not access to the blob data itself.
Please follow these steps:
1. Open IAM (Access control) on the storage account, navigate to Role assignments, and add a role assignment.
2. Choose a role that grants data access according to your need, such as Storage Blob Data Contributor (or Storage Blob Data Reader for read-only), and select your Data Factory's managed identity.
3. A few minutes later, retry choosing the file path.
Hope this helps.

How to copy blob from one container to another using Logic App

In Azure, I have blob storage with two containers, Input and Output. I have a file, say Test1.csv, which after processing I want to copy to the Output container. I need to do this as a step in an Azure Logic App. However, I am not able to understand how to configure the Copy blob action - I cannot configure the source path URL correctly.
It gives the error message "file not found".
Thanks
If you want to use the Copy blob action, the simplest way to get the blob URL is to use the Create SAS URI by path action. Then pass that URL to the Copy blob action along with the destination.
Alternatively, you can use Create blob to copy the blob: first use Get blob content using path to get the blob content, then use Create blob to upload it to the destination container.
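Under the hood, the Copy blob action maps to the Copy Blob REST operation: a PUT on the destination with an x-ms-copy-source header pointing at the source blob's SAS URL. A stdlib-only sketch that just builds that request (the account and container names, and the truncated SAS query string, are placeholders):

```python
def build_copy_request(account: str, dest_container: str, dest_blob: str,
                       source_sas_url: str) -> dict:
    """Build the URL and headers for a Copy Blob call that copies the
    blob behind source_sas_url into dest_container/dest_blob."""
    return {
        "method": "PUT",
        "url": f"https://{account}.blob.core.windows.net/{dest_container}/{dest_blob}",
        "headers": {
            # The storage service fetches the source itself;
            # the SAS in the URL grants it read access to the source blob.
            "x-ms-copy-source": source_sas_url,
            "x-ms-version": "2021-08-06",
        },
    }

# Placeholder values; a real SAS URL would carry sv=, se=, sig=, etc.
req = build_copy_request(
    "mystorageacct", "output", "Test1.csv",
    "https://mystorageacct.blob.core.windows.net/input/Test1.csv?sv=...&sig=...")
```

This is why the SAS URI step matters: a plain blob URL without a SAS (on a private container) gives the service no permission to read the source, which surfaces as "file not found"-style errors.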

Access a file from a directory in Azure Blob Storage through Azure Logic App

I am using a Logic App to import a set of files inside a directory (/devcontainer/sample1/abc.csv).
The problem is that I cannot even locate the Azure file from my Logic App. I am getting the following error:
"Verify that the path exists and does not contain the blob name. List Folder is not allowed on blobs."
"The problem is that I cannot even locate the Azure file from my Logic App."
The file explorer shows all the containers and blobs when you choose a blob path, and it caches the data for a period of time to keep the operation smooth. If a blob was added to the container recently, it will not be visible or selectable in the file explorer. The workaround is to click the "change connection" link and use a new connection to retrieve the data.
Does your blob connection point to the correct storage account? One thing you can try: instead of providing the path directly, browse the path, so that you can see which containers and blobs are actually present in the storage account you are trying to access.
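The error text itself is the main clue: the folder field was given a full blob path. A tiny helper (a hypothetical sketch, not connector code) shows how the path from the question splits into the pieces the connector expects:

```python
def split_blob_path(path: str):
    """Split a full blob path like '/devcontainer/sample1/abc.csv' into
    (container, folder, blob_name). The Logic App folder/container fields
    must receive only the folder portion - never the blob name itself,
    which is what triggers 'List Folder is not allowed on blobs'."""
    parts = path.strip("/").split("/")
    container = parts[0]                  # e.g. 'devcontainer'
    folder = "/" + "/".join(parts[:-1])   # e.g. '/devcontainer/sample1'
    blob_name = parts[-1]                 # e.g. 'abc.csv'
    return container, folder, blob_name

container, folder, blob_name = split_blob_path("/devcontainer/sample1/abc.csv")
```

So List blobs should be pointed at /devcontainer/sample1, and only Get blob content should receive the full path ending in abc.csv.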
