Unable to add service principal or groups to the $logs container in ADLS Gen2 - Azure

I recently enabled storage analytics on an ADLS Gen2 storage account. I can see the $logs container, and logs are being written to it hourly. But when I try to add a service principal to this container, I get a permission-denied error. I have the Storage Blob Data Contributor role on this storage account; is any special permission required to achieve this?

In general, being able to manage IAM requires higher-level roles to be granted to your account. I assume you're trying to grant access via the Access Control (IAM) feature or the corresponding API call. Storage Blob Data Contributor is not sufficient, as it only gives you read/write/delete access to containers and blobs.
You need a role that grants the Microsoft.Authorization/*/write permission (such as Owner or User Access Administrator) in order to get this working.
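For reference, a minimal sketch of making that container-level assignment programmatically with the azure-mgmt-authorization Python package (recent versions), assuming your account already holds such a role; the resource group, account, and principal IDs are placeholders:

import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Scope the assignment to the $logs container rather than the whole account.
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
    "/blobServices/default/containers/$logs"
)

# ba92f5b4-... is the built-in role-definition GUID for Storage Blob Data Contributor.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment names must be new GUIDs
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<service-principal-object-id>",
        principal_type="ServicePrincipal",
    ),
)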

The problem was resolved by adding the service principal/groups from the portal at the container level instead of through Storage Explorer.

Related

Azure Storage Blob Python Invalid Permission

I created an Azure storage account, and I have ensured that I have the following permissions:
Contributor (Subscription, inherited): Grants full access to manage all resources, but does not allow you to assign roles in Azure RBAC, manage assignments in Azure Blueprints, or share image galleries.
Storage Blob Data Contributor (Subscription, inherited): Allows for read, write, and delete access to Azure Storage blob containers and data.
Storage Blob Data Owner (Subscription, inherited): Allows for full access to Azure Storage blob containers and data, including assigning POSIX access control.
Storage Blob Data Reader (Subscription, inherited): Allows for read access to Azure Storage blob containers and data.
Then I went to App Registrations and registered an application. I use the Application (client) ID and a generated client secret, but I still get:
azure.core.exceptions.HttpResponseError: This request is not authorized to perform this operation using this permission.
ErrorCode:AuthorizationPermissionMismatch
Error:None
To resolve the error, make sure to grant the roles to the service principal (your Azure AD app) in addition to your user account:
Go to Azure Portal -> Storage Accounts -> Your Storage Account -> Access Control (IAM) -> Add role assignment
Note: Make sure to add the Storage Blob Data Contributor role to your service principal (Azure AD app).
Now you have permission to access Azure Blob Storage and can connect to it without any permission issues.
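A minimal sketch of the client side, assuming the role assignment above is in place (tenant, app, and account values are placeholders; new role assignments can take a few minutes to propagate):

from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<application-client-id>",
    client_secret="<client-secret>",
)
service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=credential,
)

# Fails with AuthorizationPermissionMismatch until the data-plane role is granted.
for container in service.list_containers():
    print(container.name)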

Azure Databricks cluster doesn't have access to mounted ADLS Gen2

I followed the azure-datalake-gen2-sp-access documentation and mounted ADLS Gen2 storage in Databricks, but when I try to view the data from the GUI I get the following error:
Cluster easy-matches-cluster-001 does not have the proper credentials to view the content. Please select another cluster.
I can't find any documentation about this, only something about premium Databricks. Can I only access the data with a premium Databricks resource?
Edit 1: I can see the mounted storage with dbutils.
After mounting the storage account, run this command to check whether you have data access permissions on the mount point you created:
dbutils.fs.ls("/mnt/<mount-point>")
If you have data access, you will see the files inside the storage account.
If you don't have data access, you will get a 403 error: "This request is not authorized to perform this operation using this permission".
If you are able to mount the storage but unable to access it, check whether the ADLS Gen2 account has the necessary roles assigned.
I was able to repro the same. Since you are using an Azure Active Directory application, you have to assign the "Storage Blob Data Contributor" role to that application too.
Below are the steps for granting the Storage Blob Data Contributor role to the registered application:
1. Select your ADLS account. Navigate to Access Control (IAM). Select Add role assignment.
2. Select the Storage Blob Data Contributor role, then search for and select your registered Azure Active Directory application and assign it.
Back in the Access Control (IAM) tab, search for your AAD app and verify its access.
3. Run dbutils.fs.ls("/mnt/<mount-point>") to confirm access.
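For completeness, a sketch of the mount itself with service-principal credentials, following the pattern from the azure-datalake-gen2-sp-access documentation (all IDs, names, and the secret scope are placeholders):

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<secret-scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-point>",
    extra_configs=configs,
)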
Solved by unmounting, remounting, and restarting the cluster. I followed this doc: https://learn.microsoft.com/en-us/azure/databricks/kb/dbfs/remount-storage-after-rotate-access-key
If you still encounter the same issue even though access control checks out, do the following (a sketch follows these steps):
Use dbutils.fs.unmount() to unmount all storage accounts.
Restart the cluster.
Remount the storage account.
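A small sketch of the unmount step, assuming all user-created mounts live under /mnt/:

# Detach every mount under /mnt/, leaving DBFS internals alone.
for m in dbutils.fs.mounts():
    if m.mountPoint.startswith("/mnt/"):
        dbutils.fs.unmount(m.mountPoint)

# After restarting the cluster, remount with the original configs:
# dbutils.fs.mount(source=..., mount_point=..., extra_configs=configs)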

Is there any method by which I can restrict other users from viewing my container in Azure Data Lake Gen2?

Problem statement: There are two different teams working on two different projects for the same client. Both teams have access to the Azure resource group in which the Azure Data Lake storage account has been created. The client wants us to use the same Data Lake storage for both projects, but also wants the team working on a given container to have no access to the containers the other team uses, and vice versa.
Example:
Azure Data Lake storage: both teams have access to this
-> container1: only team 1 should have access to this
-> container2: only team 2 should have access to this
Can anyone please suggest how we can achieve this?
Thanks in advance!
You can manage access to containers, directories, and blobs by using the access control lists (ACLs) feature in Azure Data Lake Storage Gen2.
You can associate a security principal with an access level for files and directories. Each association is captured as an entry in an access control list (ACL). Each file and directory in your storage account has an access control list. When a security principal attempts an operation on a file or directory, an ACL check determines whether that security principal (user, group, service principal, or managed identity) has the correct permission level to perform the operation.
To manage the ACL on the container, follow the below steps:
Go to the container in the storage account.
Navigate to any container, directory, or blob. Right-click the object, and then select Manage ACL.
The Access permissions tab of the Manage ACL page appears. Use the controls in this tab to manage access to the object.
To add a security principal to the ACL, select the Add principal button.
Find the security principal by using the search box, and then click the Select button.
You should create a security group in Azure AD for each of your teams, and then maintain permissions on the group rather than for individual users.
Refer: Access control lists (ACLs) in Azure Data Lake Storage Gen2
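As a rough sketch of what this looks like with the azure-storage-file-datalake Python package, granting team 1's security group access to container1 (the group object ID is a placeholder; note that set_access_control replaces the whole ACL, so the base user/group/other entries must be included):

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
root = service.get_file_system_client("container1").get_directory_client("/")

# Access entry for team 1's group, plus a default entry so new children inherit it.
acl = (
    "user::rwx,group::r-x,other::---,"
    "group:<team1-group-object-id>:rwx,"
    "default:group:<team1-group-object-id>:rwx"
)
root.set_access_control(acl=acl)
print(root.get_access_control()["acl"])  # verify the entries took effect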

Find out if an Azure AD user has a role assignment for a specific Azure Storage account ADLS Gen2 container?

I have an Azure Storage account with ADLS Gen2 containers. Permissions for users get added by code: it goes to the storage container > Access Control (IAM) > Roles > Storage Blob Data Contributor, and then adds a user, group, or service principal.
Is there an easy way via Python to check whether a user or service principal is in a specific role (such as Storage Blob Data Contributor) for a specific container?
I've attached a screenshot of the screen in Azure whose functionality I'm trying to replicate in Python.
I've tried Role Assignments - List For Scope with a filter, but it does not seem to return the same results.
One of the options you could try is the Get Container ACL REST API. This will give you a list of the entities that have access to the container; you can run a quick search through this list to verify access.
I couldn't find anything similar in the Python SDK.
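If you want to stay close to Role Assignments - List For Scope, the azure-mgmt-authorization Python package wraps the same REST API. Here is a hedged sketch that checks for Storage Blob Data Contributor at container scope; the scope components are placeholders, and ba92f5b4-2d11-453d-a403-e96b0029c9fe is the well-known role-definition GUID for that role. Keep in mind that list_for_scope also returns assignments inherited from the account, resource group, and subscription, which may be why a plain filter did not match the portal blade:

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
    "/blobServices/default/containers/<container>"
)
BLOB_DATA_CONTRIBUTOR = "ba92f5b4-2d11-453d-a403-e96b0029c9fe"

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

def has_role(principal_object_id: str) -> bool:
    # assignedTo() narrows to the principal; the role check happens client-side.
    for ra in client.role_assignments.list_for_scope(
        scope, filter=f"assignedTo('{principal_object_id}')"
    ):
        if ra.role_definition_id.lower().endswith(BLOB_DATA_CONTRIBUTOR):
            return True
    return False

print(has_role("<user-or-service-principal-object-id>"))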

Azure RBAC based access to Storage Account

I have a Service Principal that has been granted the Contributor role on a storage account.
When I attempt to create a container within that account, I receive the following error message:
One-time registration of Microsoft.Storage failed - The client 'd38eaaca-1429-44ef-8ce2-3c63a62849c9' with object id 'd38eaaca-1429-44ef-8ce2-3c63a62849c9' does not have authorization to perform action 'Microsoft.Storage/register/action' over scope '/subscriptions/********'
My goal is to allow a service principal READ-ONLY access to the blobs contained within a given storage account, and the ability to create containers within that storage account. What are the steps needed to configure my principal to do that?
Regarding your error, please see this thread: In Azure as a Resource Group contributor why can't I create Storage Accounts and what should be done to prevent this situation?.
"My goal is to allow a service principal READ-ONLY access to the blobs contained within a given storage account, and the ability to create containers within that storage account. What are the steps needed to configure my principal to do that?"
As of the time this answer was written, it was not possible to do so, simply because RBAC only applied to the control plane of the API. Using RBAC, you could control who can create/update/delete a storage account, but access to the data inside a storage account was still controlled by an account key; anyone with the account key had complete control over that storage account. (Azure has since introduced data-plane roles such as Storage Blob Data Reader and Storage Blob Data Contributor, used in the answers above, which cover exactly this scenario.)
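With those data-plane roles, a sketch of the original goal, assuming the service principal has been granted Storage Blob Data Contributor on the account (all IDs and names are placeholders):

from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
service = BlobServiceClient(
    "https://<account>.blob.core.windows.net", credential=credential
)

# Creating a container needs Storage Blob Data Contributor (or Owner).
service.create_container("new-container")

# Listing and reading blobs needs only Storage Blob Data Reader.
for blob in service.get_container_client("new-container").list_blobs():
    print(blob.name)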
