I have an Azure account with the Owner role on our subscription. I can see two role assignments for the same subscription: one is Owner, the other is Contributor. I am trying to delete the blob cache with the following Azure CLI command:
az storage blob delete-batch --source <containerName> --account-name <storageAccountName> --auth-mode login
I am getting the error below.
I am not sure why I am getting this error despite having sufficient permissions. Please help.
Attaching the permissions on my subscription:
My access permissions on the storage account:
If you set the --auth-mode parameter to login, it means that you use Azure AD auth to access Azure blob data. If so, the Azure AD security principal you used to log in must be assigned the Storage Blob Data Owner, Storage Blob Data Contributor, or Storage Blob Data Reader role. Otherwise, you have no permission to access Azure blobs.
Right now your account has only been assigned the Owner role, so set the --auth-mode parameter to key instead, which means the CLI attempts to retrieve the account access key and uses it to access Azure blobs. The Owner role has the permissions to do that.
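For example, both options sketched with the placeholders from your command (the role assignment is an assumed alternative if you want to keep Azure AD auth):

# Option 1: fall back to key-based auth; the Owner role can list the account keys.
az storage blob delete-batch --source <containerName> --account-name <storageAccountName> --auth-mode key

# Option 2 (assumed alternative): keep --auth-mode login, but first grant yourself a data-plane role.
az role assignment create \
    --assignee "<your-object-id-or-upn>" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Storage/storageAccounts/<storageAccountName>"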
For more details, please refer to here and here
When I issue the following command:
az storage entity query --account-name acc1 --table-name table1
I successfully get my query result with the following warning:
There are no credentials provided in your command and environment, we will query for account key for your storage account.
It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.
You also can add `--auth-mode login` in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles.
For more information about RBAC roles in storage, visit https://docs.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.
In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.
To avoid the above warning, I added --auth-mode login to the command:
az storage entity query --account-name acc1 --table-name table1 --auth-mode login
Then I get this error:
You do not have the required permissions needed to perform this operation.
Depending on your operation, you may need to be assigned one of the following roles:
"Storage Blob Data Owner"
"Storage Blob Data Contributor"
"Storage Blob Data Reader"
"Storage Queue Data Contributor"
"Storage Queue Data Reader"
"Storage Table Data Contributor"
"Storage Table Data Reader"
If you want to use the old authentication method and allow querying for the right account key, please use the "--auth-mode" parameter and "key" value.
My account is able to get the query result without the --auth-mode login switch. Why does it fail authorization with the switch?
When you don't specify the authentication type, the CLI tries to get the access key of the storage account.
This requires the Microsoft.Storage/storageAccounts/listKeys/action permission. If you have the Contributor role over the storage account, you have the required permission.
--auth-mode login means it will use Azure AD auth to connect to the storage. You can use one of the built-in roles to access the storage (see the documentation), for example via a role assignment as sketched after this list:
Storage Table Data Contributor
Storage Table Data Reader
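A role assignment along these lines should make --auth-mode login work (a sketch; the assignee and scope are placeholders):

az role assignment create \
    --assignee "user@contoso.com" \
    --role "Storage Table Data Reader" \
    --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Storage/storageAccounts/acc1"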
When using AAD Auth, you could also disable access key authentication.
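For instance, a sketch assuming the account from the question:

az storage account update --name acc1 --resource-group <resourceGroup> --allow-shared-key-access false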
There is a good article related to the RBAC management and data plane model:
Assign an Azure role for access to blob data.
I am trying to create a user delegation key from the Azure portal.
No matter which privileges I assign to myself, I hit the same error message:
You don't have permissions to grant read access. You can still create a shared access signature, but you'll need an RBAC role with additional permissions before you can grant that level of access to your signature recipient. Learn more about Azure roles for access to blob data.
So far I have the following roles assigned:
And the link provided in the error message says I need one of the following:
Contributor
Storage Account Contributor
Storage Blob Data Contributor
Storage Blob Data Owner
Storage Blob Data Reader
Storage Blob Delegator
So it should work, but it doesn't. What am I missing?
The error usually occurs if you don't have the required roles/permissions assigned to create a user delegation key.
Please note that in order to create a user delegation key, you must have a role that includes the following action:
Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey/action
The above action is included in the roles below (a verification sketch follows the list):
Storage Blob Data Contributor
Storage Blob Data Owner
Storage Blob Data Reader
Storage Blob Delegator
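If you want to double-check which built-in roles carry that action, a rough JMESPath query such as the following should work (a sketch, assuming you are logged in with the Azure CLI):

az role definition list --query "[?contains(to_string(permissions[].actions), 'generateUserDelegationKey')].roleName" --output tsv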
Try assigning either the Storage Blob Data Contributor or the Storage Blob Data Owner role, since neither is currently assigned to you.
Please also check the scope at which you assigned the role; make sure to assign the roles at the level of the storage account, the resource group, or the subscription.
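To see what is currently assigned and at which scope, something like this should help (a sketch):

az role assignment list --assignee "<your-object-id-or-upn>" --all --output table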
I tried this in my environment and got the same error when the roles were not assigned:
After assigning the roles, I was able to create a user delegation key successfully without errors.
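If the portal keeps failing, the same user delegation SAS can also be created from the CLI once the roles are in place (a sketch; the names and expiry are placeholders):

az storage container generate-sas \
    --account-name <storageAccountName> \
    --name <containerName> \
    --permissions r \
    --expiry 2030-01-01 \
    --auth-mode login \
    --as-user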
If the error still persists, try creating an Azure support ticket.
For more details, please refer to the links below:
Create SAS tokens for containers and blobs with the Azure portal | Microsoft Docs
azure-docs/storage-blob-user-delegation-sas-create-cli.md at main · MicrosoftDocs/azure-docs · GitHub
I am trying to create a mount point to ADLS Gen2 using Key Vault in Databricks, but I am not able to do so because of an error I am getting.
I have Contributor access, and I tried both Storage Blob Data Contributor and Contributor access for the SPN, but I am still not able to create the mount points.
I would appreciate some help.
# OAuth (client credentials) configuration for the ABFS driver.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "abcdefgh",
    # Client secret pulled from the Key Vault-backed secret scope.
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="myscope", key="mykey"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/tenantid/oauth2/token",
    "fs.azure.createRemoteFileSystemDuringInitialization": "true"
}

# Mount the container at /mnt/cont1.
dbutils.fs.mount(
    source="abfss://cont1@storageaccount.dfs.core.windows.net/",
    mount_point="/mnt/cont1",
    extra_configs=configs
)
The error I am getting is:
An error occurred while calling o280.mount.
: HEAD https://storageaccount.dfs.core.windows.net/cont1?resource=filesystem&timeout=90
StatusCode=403
StatusDescription=This request is not authorized to perform this operation.
When performing the steps in the "Assign the application to a role" section of the tutorial, make sure to assign the Storage Blob Data Contributor role to the service principal.
Repro: I assigned the Owner role to the service principal and tried to run dbutils.fs.ls("mnt/azure/"); it returned the same error message as above.
Solution: Assign the Storage Blob Data Contributor role to the service principal instead.
After assigning the Storage Blob Data Contributor role to the service principal, I was able to get the output without any error message.
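If you prefer the CLI over the portal for the role assignment, it could look roughly like this (a sketch; the IDs are placeholders):

az role assignment create \
    --assignee "<service-principal-app-id>" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Storage/storageAccounts/storageaccount"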
For more details, refer to "Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark".
Our CI pipeline needs to back up some files to Azure Blob Storage. I'm using the Azure CLI like this: az storage blob upload-batch -s . -d container/directory --account-name myaccount
When giving the service principal Contributor access, it works as expected. However, I would like to lock down permissions so that the service principal is allowed to add files but not, for example, delete them. What are the permissions required for this?
I've created a custom role giving it the same permissions as Storage Blob Data Contributor minus delete. This (and also just using the Storage Blob Data Contributor role directly) fails with a Storage account ... not found. OK, I then proceeded to add more read permissions to the blob service. Not enough; now I'm at a point where it wants to do Microsoft.Storage/storageAccounts/listKeys/action. But if I give it access to the storage keys, then what's the point? With the storage keys the SP will have full access to the account, which is what I want to avoid in the first place. Why is az storage blob upload-batch requesting keys, and can I prevent this from happening?
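For reference, a custom role along those lines could be sketched as follows (assumed JSON; a trimmed-down selection of the data actions from Storage Blob Data Contributor with delete left out, and the role name is made up):

az role definition create --role-definition '{
    "Name": "Storage Blob Data Writer (no delete)",
    "Description": "Sketch of a custom role: Storage Blob Data Contributor minus delete.",
    "Actions": [],
    "DataActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action"
    ],
    "AssignableScopes": ["/subscriptions/<subscriptionId>"]
}'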
I've created a custom role giving it the same permissions as Storage Blob Data Contributor minus delete. This (and also just using the Storage Blob Data Contributor role directly) fails with a Storage account ... not found.
I can also reproduce your issue; actually, what you did will work. The trick is the --auth-mode parameter of the command: if you do not specify it, key is used by default, and the command first lists all the storage accounts in your subscription; when it finds your storage account, it lists the account keys and uses a key to upload the blobs.
However, your Storage Blob Data Contributor minus delete role has no permission to list storage accounts, so you get the error.
To solve the issue, just specify --auth-mode login in your command. The CLI will then use the credentials of your service principal to get an access token and use that token to call the Put Blob REST API to upload the blobs; for the underlying principle, see Authorize access to blobs and queues using Azure Active Directory.
az storage blob upload-batch -s . -d container/directory --account-name myaccount --auth-mode login
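Under the hood, --auth-mode login amounts to roughly the following (a sketch, not the CLI's exact implementation; the account and file names are placeholders):

# Acquire an Azure AD access token for the storage resource...
token=$(az account get-access-token --resource https://storage.azure.com/ --query accessToken --output tsv)
# ...and call the Put Blob REST operation with it.
curl -X PUT "https://myaccount.blob.core.windows.net/container/directory/file.txt" \
    -H "Authorization: Bearer $token" \
    -H "x-ms-version: 2021-08-06" \
    -H "x-ms-blob-type: BlockBlob" \
    --data-binary @file.txt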
I followed the tutorial (linked below *) and now have a service principal.
How can I use this service principal when reading a blob using Get-AzureStorageBlob?
Get-AzureStorageBlob requires a New-AzureStorageContext; can I use the SP instead of the StorageAccountKey GUID?
Thanks, Peter
https://azure.microsoft.com/en-us/documentation/articles/resource-group-authenticate-service-principal/
As far as I know, you cannot use an SPN for accessing items in blob storage directly. You will need to use the access keys or SAS tokens.
Recently, Azure has added an option to Manage access rights to Azure Storage data with RBAC. You need to add one of the built-in RBAC roles scoped to the storage account to your service principal.
Storage Blob Data Contributor (Preview)
Storage Blob Data Reader (Preview)
Then, if you want to use the Azure CLI to access Blob Storage with a service principal:
Log in with a service principal
$ az login --service-principal --tenant contoso.onmicrosoft.com -u http://azure-cli-2016-08-05-14-31-15 -p VerySecret
Enable the preview extension
$ az extension add -n storage-preview
Use --auth-mode parameter with your AzureCLI command
$ az storage blob download --account-name storagesamples --container sample-container --name myblob.txt --file myfile.txt --auth-mode login
For more information please see:
Manage access rights to Azure Storage data with RBAC (Preview)
Use an Azure AD identity to access Azure Storage with CLI or PowerShell (Preview)
If your SPN has only the Reader role, you cannot access the storage without a SAS or account key.
You can assign the SPN the Contributor role and create a SAS for the other normal users; those users can then access the storage with the SAS.
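A rough sketch of that flow (placeholders throughout; generate-sas without --auth-mode login falls back to the account key, which the Contributor role can list):

# Log in as the Contributor SPN and mint a read-only SAS for the container.
az login --service-principal -u <appId> -p <clientSecret> --tenant <tenantId>
az storage container generate-sas \
    --account-name <storageAccountName> \
    --name <containerName> \
    --permissions r \
    --expiry 2030-01-01 \
    --output tsv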