I am creating an 'Azure event trigger' in my Azure Data Factory, but I am getting the error below.
I found other threads that mention checking that the role/access is set to "Owner".
That is already the case, so I am a bit lost trying to find a solution.
Has anyone faced this issue before, or does anyone know how to solve it?
The client *** with object id *** does not have authorization to perform action 'Microsoft.EventGrid/eventSubscriptions/write' over scope '/subscriptions/ZZZ/resourceGroups//providers/Microsoft.Storage/storageAccounts//providers/Microsoft.EventGrid/eventSubscriptions/****' or the scope is invalid. If access was recently granted, please refresh your credentials.
Thank you
You need any of the following RBAC settings for a storage event trigger:
Owner role on the storage account
Contributor role on the storage account
Microsoft.EventGrid/EventSubscriptions/Write permission on the storage account /subscriptions/####/resourceGroups/####/providers/Microsoft.Storage/storageAccounts/storageAccountName
Source: Authorizing access to Event Grid resources, Create a trigger that runs a pipeline in response to a storage event
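For example, either of the role options above can be granted at the storage-account scope with the Azure CLI. This is a minimal sketch, assuming you are logged in with `az login`; the placeholder IDs in angle brackets are hypothetical and must be replaced with your own values:

```shell
# Grant the publishing identity Contributor on the storage account
# (Owner works too; Contributor is the least-privileged of the two role options).
az role assignment create \
  --assignee "<user-or-service-principal-object-id>" \
  --role "Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
```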
Related
I am setting up an event trigger on a Blob Storage v2 account in a Data Factory pipeline. When I publish the pipeline, I keep getting the error below. I only set up the storage account recently, but I can't see anything out of place. Do I need to set up an event subscription in the blob storage and create the event from the storage account itself, as there are options to set up automation in there?
The attempt to configure storage notifications for the provided storage account hmtest1 failed. Please ensure that your storage account meets the requirements described at https://aka.ms/storageevents. The error is Failed to retrieve credentials for request=RequestUri=https://management.azure.com/subscriptions
{"code":"InvalidAuthenticationToken","message":"The received access token is not valid: at least one of the claims 'puid' or 'altsecid' or 'oid' should be present. If you are accessing as application please make sure service principal is properly created in the tenant."}}
AFAIK, in ADF this error occurs when the Data Factory resource provider is not registered in the subscription.
To resolve this, we need to register the Data Factory resource provider.
Go to Subscriptions -> your subscription -> Resource providers and check whether Microsoft.DataFactory is registered.
If it shows as NotRegistered, select it and click Register.
After it is successfully registered, create a new data factory workspace and check the storage event trigger again.
If it still gives the same error, register Microsoft.EventGrid as well and re-check.
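The same checks and registrations can be done from the Azure CLI. A sketch, assuming you are logged in with `az login` and have permission on the subscription:

```shell
# Check the current registration state of the Data Factory provider
az provider show --namespace Microsoft.DataFactory --query registrationState -o tsv

# Register both providers needed for storage event triggers
az provider register --namespace Microsoft.DataFactory
az provider register --namespace Microsoft.EventGrid
```

Registration can take a few minutes; re-run the `az provider show` command until it reports Registered.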
I'm trying to do something similar to this link:
when something (an HTML file) is added to the blob
get the content (in my case, an HTML file that I want to be my email body; see the next step)
email the content (the simple case is as an attachment; what I'm trying to do is as the email body)
However, I'm stuck at the beginning with some permission issues:
I checked my storage account's IAM roles to see if I could add the Logic App as Blob Contributor/Reader, but I couldn't find anything there. It doesn't list my logic app:
Can someone help me with that?
As far as I know, it has nothing to do with assigning roles to the logic app as you mentioned in your last screenshot. It is related to the permissions of the user who created the blob storage API connection.
According to the screenshot you provided, it seems you were able to add the trigger "When a blob is added or modified (properties only)" with a blob storage API connection to the logic app successfully, but it shows the error message Please check your account...... (I tested on my side: if I do not have the permission, it will not allow me to create the connection when adding the trigger).
So when you added the trigger, it might not have asked you to select a storage account (to create an API connection). It might have just reused an existing API connection (in the same resource group as the logic app) to connect to the storage account. You can see the API connection in the same resource group as your logic app. Its name may be azureblob, and if you click into the API connection, you can see the display name is f, which matches your screenshot.
But the user who created the API connection no longer has permission to the storage account, or the API connection has expired (it may expire after 90 days). So it shows the error message.
To solve this problem, you can click the "Change connection" button at the bottom of the trigger and add another connection to the storage account.
=============================Update===========================
To connect to storage from a Logic App through a VNet, we can refer to this post.
I am trying to understand the overlap between two Azure RBAC roles. It looks like Monitoring Contributor completely covers Application Insights Component Contributor except for "Microsoft.Resources/deployments/*". Consider the following situation: I am deploying web availability tests into an Application Insights resource, and the deployment identity is a service principal that was already granted Monitoring Contributor. Should I also grant this identity Application Insights Component Contributor to be able to create those resources, or is Monitoring Contributor good enough?
Edit
I am also deploying alert rules along with the tests, and those rules are implemented as an ARM template. If the SP was granted Monitoring Contributor only, it fails with:
Error: requesting Validation for Template Deployment "app508-dfpg-dev3-diag-eastus2-backoffice-ai-test-dep" (Resource Group "app508-dfpg-ne-diag-eastus2"): resources.DeploymentsClient#Validate: Failure sending request: StatusCode=403 -- Original Error: Code="AuthorizationFailed" Message="The client '2c20abbf-e825-495c-9d06-90c5f04f9c60' with object id '2c20abbf-0000-0000-0000-90c5f04f9c60' does not have authorization to perform action 'Microsoft.Resources/deployments/validate/action' over scope '/subscriptions/s/resourcegroups/app508-dfpg-ne-diag-eastus2/providers/Microsoft.Resources/deployments/app508-dfpg-dev3-diag-eastus2-backoffice-ai-test-dep' or the scope is invalid. If access was recently granted, please refresh your credentials."
There is no need to grant the Application Insights Component Contributor role; the Monitoring Contributor role is enough. When deploying the web availability tests, you just need the Microsoft.Insights/webtests/* action permission, and it is already included in Monitoring Contributor. Note that the 403 in your edit is for 'Microsoft.Resources/deployments/validate/action', which Monitoring Contributor does not include; for ARM template deployments the identity also needs the 'Microsoft.Resources/deployments/*' permissions (which, as you noted, Application Insights Component Contributor does include).
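To compare the two roles yourself, you can dump their action lists with the Azure CLI. A sketch, assuming you are logged in with `az login`:

```shell
# List the management-plane actions granted by each built-in role
az role definition list --name "Monitoring Contributor" \
  --query "[0].permissions[0].actions"
az role definition list --name "Application Insights Component Contributor" \
  --query "[0].permissions[0].actions"
```

Diffing the two outputs shows exactly which actions one role has that the other lacks.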
Details of the Error: Get access token from MSI failed for Datafactory XXXX, region XXXX. Please verify resource url is valid and retry. Details: Accquire MI token from MI store V1 failed.
Error Code: 2403
Failure type: User Configuration issue
I used a Web activity in Azure Data Factory to access an Azure Function app using MSI.
I also had these kinds of issues, and it took me some time to figure out the right resource ID for the token I needed.
First of all, the Web activity in ADF or Azure Synapse can be used for performing Azure REST API calls quite well.
But we have to understand that one "access token" is not always the same as another: Azure AD issues different access tokens depending on the resource provider you want to access.
Here is a list of Resource IDs you can use:
https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/services-support-managed-identities#azure-services-that-support-azure-ad-authentication
Unfortunately it doesn't seem up to date; in my case I'm using https://dev.azuresynapse.net (which is not listed in the docs yet).
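For illustration, here is a minimal sketch of a Web activity definition that requests an MSI token scoped to the ARM resource ID. The activity name and URL are assumptions for the example; substitute the resource ID that matches the API you are calling (e.g. your function app's identifier URI instead of https://management.azure.com/):

```json
{
  "name": "CallRestApiWithMsi",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://management.azure.com/subscriptions?api-version=2020-01-01",
    "method": "GET",
    "authentication": {
      "type": "MSI",
      "resource": "https://management.azure.com/"
    }
  }
}
```

The "resource" property is what determines which audience the token is issued for, which is why a token that works for one API fails for another.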
As an alternative, there is the Azure Function activity in Azure Data Factory. You can try that: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity
I'm connecting ADF to Blob Storage v2 using a managed identity, following this doc: Doc1
When it comes to testing the connection with my first dataset, I am successful when I test the connection to the linked service. When I test by the file path and enter "testfolder" (which exists in the blob container), it fails, returning the generic forbidden error displayed at the end of this post.
However, when I opt to "browse" the folders in the dataset portal, the folder "testfolder" does show up. But when I select it, it will not show me anything within that folder.
The Data Factory managed identity has been given the Contributor role, granting full access to manage all resources. Is there some other hidden issue, or a way to narrow this down? My instinct is that this is something within the blob container, since I can view the containers but not their contents.
Error message:
It seems that you didn't assign a role on the Azure Blob Storage account itself.
Please follow these steps:
1. Click IAM on the Azure Blob Storage account, navigate to Role assignments, and add a role assignment.
2. Choose a role according to your need (for reading blob contents, a data-plane role such as Storage Blob Data Reader; Contributor alone only grants management-plane access) and select your data factory.
3. A few minutes later, retry choosing the file path.
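Equivalently, the role can be assigned with the Azure CLI. A sketch, assuming the data factory's managed identity object ID is known; the angle-bracket placeholders are hypothetical and Storage Blob Data Reader is one example of a suitable data-plane role:

```shell
# Grant the ADF managed identity read access to blob data on the storage account
az role assignment create \
  --assignee-object-id "<data-factory-managed-identity-object-id>" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
```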
Hope this can help you.