Granted the Reader and Storage Blob Data Reader roles on an Azure Data Lake Gen2 storage account to the user DataLakeTester.
Also, under Manage Access, granted full rights in the Access / Default section.
But when logged into Azure Storage Explorer with the above user, the connection to the data lake succeeds, but the containers cannot be listed and the error below is thrown. Is there some other role assignment to be done?
The latest version of Storage Explorer now available is 1.11.1. Please update and try again: https://github.com/Microsoft/AzureStorageExplorer/releases
In response to your query:
"But when logged into Azure Storage Explorer with the above user, the connection to the data lake succeeds, but the containers cannot be listed and the error below is thrown. Is there some other role assignment to be done?" It works fine on my side; could you try to sign out and sign in again?
The RBAC roles you have appear to be sufficient. It can take some time for RBAC changes to propagate. So accessing things in Storage Explorer might not work as expected for a few minutes.
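If you want to double-check outside Storage Explorer once the assignment has propagated, here is a minimal sketch using the Azure SDK for Python (assumes the azure-identity and azure-storage-blob packages; the account URL is a placeholder):

```python
# Sanity check with the DataLakeTester identity once "Storage Blob Data Reader"
# has propagated. The account URL is a placeholder.
from azure.identity import InteractiveBrowserCredential
from azure.storage.blob import BlobServiceClient

credential = InteractiveBrowserCredential()  # interactive AAD sign-in for the user
service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=credential,
)

# Succeeds once the role assignment is effective; raises a 403 HttpResponseError
# while the RBAC change is still propagating.
for container in service.list_containers():
    print(container.name)
```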
We are using Fluent Bit to output application logs to an Azure Log Analytics workspace. The application logs do appear in the workspace as a table under the Logs blade, Custom Logs category. So far so good.
Because the maximum retention period of a Log Analytics workspace is limited to 730 days, I thought that linking a storage account of type Custom logs & IIS logs under Linked storage accounts would solve the problem for me. My understanding is that once a storage account is linked for Custom logs & IIS logs, all custom logs will be written to the nominated storage account instead of the default storage account that comes with the creation of the Log Analytics workspace. Is this understanding correct?
Secondly, after clicking on the Custom logs & IIS logs item and selecting a storage account from the pop-up blade on the left-hand side, the Azure portal reported the message "Successfully linked storage account". However, the Linked storage accounts view still reports No linked storage accounts.
Browsing the target storage account, no logs appear to have been written to it.
Update 1
Storage account network configuration.
Update 2
The answer is accepted as it is technically correct. However, a few steps/details are missing from the documentation. In order to map a customer storage account to a LA workspace, one must build resources to match the following diagram.
1. Create an AMPLS (Azure Monitor Private Link Scope) resource.
2. Link the AMPLS resource to your LA workspace.
3. Create a private endpoint on the target VNet for the AMPLS resource.
4. Create a storage account.
5. Create private endpoints (blob type) on the target VNet.
6. Link the storage account to the LA workspace.
We need to follow a few prerequisites before linking the storage account to the workspace:
The storage account should be in the same region as the Log Analytics workspace.
You need to grant permissions for other services to access the storage account.
Allow Azure Monitor to access the storage account. If you chose to allow only select networks to access your storage account, you should select the exception "Allow trusted Microsoft services to access this storage account".
For the rest of the configuration information, refer to MS Docs.
By following the above documentation, I was able to link the storage account successfully.
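If you prefer to do the same linking step programmatically, here is a minimal sketch assuming the azure-mgmt-loganalytics package (resource names are placeholders and the exact operation/model names may vary between SDK versions):

```python
# Sketch: link a storage account to a Log Analytics workspace for the
# "Custom logs & IIS logs" (CustomLogs) data source type. IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

subscription_id = "<subscription-id>"
client = LogAnalyticsManagementClient(DefaultAzureCredential(), subscription_id)

storage_account_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client.linked_storage_accounts.create_or_update(
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
    data_source_type="CustomLogs",
    parameters={"storage_account_ids": [storage_account_id]},
)
```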
I followed the documentation azure-datalake-gen2-sp-access and mounted an ADLS Gen2 storage account in Databricks, but when I try to see the data from the GUI I get the following error:
Cluster easy-matches-cluster-001 does not have the proper credentials to view the content. Please select another cluster.
I can't find any documentation, only something about premium Databricks, so can I only access it with a premium Databricks resource?
Edit 1: I can see the mounted storage with dbutils.
After mounting the storage account, please run this command to check whether you have data access permissions on the mount point created:
dbutils.fs.ls("/mnt/<mount-point>")
If you have data access, you will see the files inside the storage account.
If you don't have data access, you will get this error: "This request is not authorized to perform this operation using this permission." (403).
If you are able to mount the storage but unable to access it, check whether the ADLS Gen2 account has the necessary roles assigned.
I was able to repro the same. Since you are using an Azure Active Directory application, you would have to assign the "Storage Blob Data Contributor" role to that Azure Active Directory application too.
Below are the steps for granting the Storage Blob Data Contributor role to the registered application:
1. Select your ADLS account. Navigate to Access Control (IAM). Select Add role assignment.
2. Select the Storage Blob Data Contributor role, then search for and select your registered Azure Active Directory application and assign it.
Back in the Access Control (IAM) tab, search for your AAD app and verify its access.
3. Run dbutils.fs.ls("/mnt/<mount-point>") to confirm access.
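For reference, the mount this role assignment applies to is the standard service-principal (OAuth) mount from that documentation; a minimal sketch with placeholder identifiers (the secret scope, application ID, tenant, container, and account names are all assumptions):

```python
# Sketch of the service-principal (OAuth) mount the role assignment applies to.
# All identifiers below (secret scope, app ID, tenant, container, account) are placeholders.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<secret-scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-point>",
    extra_configs=configs,
)

# Once the AAD application holds Storage Blob Data Contributor, listing should succeed:
display(dbutils.fs.ls("/mnt/<mount-point>"))
```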
Solved by unmounting, remounting, and restarting the cluster. I followed this doc: https://learn.microsoft.com/en-us/azure/databricks/kb/dbfs/remount-storage-after-rotate-access-key
If you still encounter the same issue even after the access control has been checked, do the following (a short sketch follows these steps):
Use dbutils.fs.unmount() to unmount all storage accounts.
Restart the cluster.
Remount the storage account.
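A minimal sketch of the unmount-and-remount part (the mount point and configs are placeholders; restarting the cluster itself is done from the Clusters UI):

```python
# Unmount existing /mnt mounts, restart the cluster, then remount.
# Mount point and configs are placeholders.
for m in dbutils.fs.mounts():
    if m.mountPoint.startswith("/mnt/"):
        dbutils.fs.unmount(m.mountPoint)

# ...restart the cluster from the Clusters UI, then remount with the same
# service-principal configs used for the original mount:
dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-point>",
    extra_configs=configs,
)
dbutils.fs.refreshMounts()  # make the new mount visible to running clusters
```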
I recently enabled storage analytics on an ADLS Gen2 storage account. I can see the $logs container, and logs are written to it on an hourly basis. But when I try to add a service principal to this container, I get permission denied. I have the Storage Data Contributor role on this storage account; is any special permission required to achieve this?
In general, being able to manage IAM requires higher-level roles to be granted to your account. I assume that you're trying to grant access via the Access Control (IAM) feature / API call. Storage Data Contributor is not sufficient, as it only allows you to access containers and blobs with read/write/delete access.
You need a role which grants you the Microsoft.Authorization/*/write permission (for example, Owner or User Access Administrator) in order to get it working.
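To make that concrete, adding a service principal to the container via IAM is a role-assignment write under the hood; a hedged sketch of that call, assuming the azure-mgmt-authorization package (all IDs below are placeholders):

```python
# Sketch: adding a service principal to the $logs container via IAM is a
# role-assignment write, so the caller's own role must include
# Microsoft.Authorization/*/write. All IDs below are placeholders.
import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

container_scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    "/blobServices/default/containers/$logs"
)

client.role_assignments.create(
    scope=container_scope,
    role_assignment_name=str(uuid.uuid4()),
    parameters=RoleAssignmentCreateParameters(
        role_definition_id=f"/subscriptions/{subscription_id}"
        "/providers/Microsoft.Authorization/roleDefinitions/<role-definition-guid>",
        principal_id="<service-principal-object-id>",
    ),
)
```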
The problem was resolved by adding the SP/groups from the portal at the container level instead of through Storage Explorer.
I have an Azure Storage Account and want to grant read access to a colleague. All identities are in the same Azure Active Directory so it was easy to add him to the "Reader" role in the Access Control blade of the Azure portal.
When he opens Microsoft Azure Storage Explorer, the subscription and storage account are visible, but the node for Blob Containers can't be expanded. The exception says:
Could not obtain keys for Storage Account. Please check that you have the correct permissions
This is expected behavior. Essentially, to list storage keys the user must be in a role that allows the listKeys operation. The built-in Reader role does not have permission to perform the listKeys operation.
The rationale (a bit convoluted, though) behind this decision is that a user in the Reader role should only be able to read and not perform any inserts, updates, or deletes. Since anyone who has the account key for a storage account can perform those operations, a user in the Reader role is not granted permission to list the account keys.
What you could do is create a Shared Access Signature (SAS) with read/list permissions and share that SAS URL with your colleague. Then they will be able to access the data in that storage account but won't be able to perform any create/update/delete operations.
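A minimal sketch of generating such a read/list SAS with the azure-storage-blob package (the account name, key, and expiry are placeholders):

```python
# Sketch: account-level SAS limited to read + list on blob containers/objects.
# Account name and key are placeholders; adjust the expiry to your needs.
from datetime import datetime, timedelta
from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

sas_token = generate_account_sas(
    account_name="<storage-account>",
    account_key="<account-key>",
    resource_types=ResourceTypes(container=True, object=True),
    permission=AccountSasPermissions(read=True, list=True),
    expiry=datetime.utcnow() + timedelta(days=7),
)

# Share this URL with the colleague; it grants read/list only, no create/update/delete.
sas_url = f"https://<storage-account>.blob.core.windows.net/?{sas_token}"
print(sas_url)
```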
Looks like this is now possible (in preview). Your AD users can be given the "Storage Blob Data Reader" role.
https://azure.microsoft.com/en-us/blog/announcing-the-preview-of-aad-authentication-for-storage/
I have a Service Principal that has been granted the Contributor role on a storage account.
When I attempt to create a container within that account, I receive the following error message:
One-time registration of Microsoft.Storage failed - The client 'd38eaaca-1429-44ef-8ce2-3c63a62849c9' with object id 'd38eaaca-1429-44ef-8ce2-3c63a62849c9' does not have authorization to perform action 'Microsoft.Storage/register/action' over scope '/subscriptions/********'
My goal is to allow a Service Principal READ-ONLY access to the blobs contained within a given storage account and to create containers within that storage account. What are the steps needed to configure my principal to do that?
Regarding your error, please see this thread: In Azure as a Resource Group contributor why can't I create Storage Accounts and what should be done to prevent this situation?.
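For context, the one-time registration mentioned in the error has to be performed by an identity with sufficient rights at the subscription scope; a minimal sketch, assuming the azure-mgmt-resource package (the subscription ID is a placeholder):

```python
# Sketch: register the Microsoft.Storage resource provider once per subscription.
# Requires a role allowed to perform Microsoft.Storage/register/action at
# subscription scope, which the service principal in the error lacks.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")
client.providers.register("Microsoft.Storage")

print(client.providers.get("Microsoft.Storage").registration_state)
```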
My goal is to allow a Service Principal READ-ONLY access to the blobs contained within a given storage account and to create containers within that storage account. What are the steps needed to configure my principal to do that?
As of today, it is not possible to do so, simply because RBAC only applies to the control plane of the API. Using RBAC, you can control who can create/update/delete a storage account, but access to the data inside a storage account is still controlled by an account key. Anyone who has access to the account key has complete control over that storage account.
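To make the control-plane/data-plane distinction concrete, at the time of this answer data-plane operations such as creating a container were authorized with the account key rather than RBAC; a minimal sketch using the azure-storage-blob package (names are placeholders):

```python
# Sketch: data-plane access with the account key (as described above), independent
# of any RBAC role assignment. Names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential="<account-key>",  # shared key, not an AAD/RBAC identity
)

# Whoever holds the key can both create containers and read every blob.
service.create_container("new-container")
for blob in service.get_container_client("new-container").list_blobs():
    print(blob.name)
```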