I am using the following code to obtain a token for the Azure Blob service:
from azure.storage.blob import BlobServiceClient
from azure.identity import InteractiveBrowserCredential, DeviceCodeCredential, ClientSecretCredential

credential = DeviceCodeCredential(authority="login.microsoftonline.com", tenant_id="***", client_id="***")
blobber = BlobServiceClient(account_url="https://***.blob.core.windows.net", credential=credential)
blobs = blobber.list_containers()
for b in blobs:
    print(b)
It works perfectly.
However, during a certain timeframe, a user may need to invoke the blob service more than once. The key point is that the process may close and reopen several times.
Making the user go through the interactive token acquisition process each time the process restarts would be very annoying. I would like to persist the token and reuse it in later flows until it expires (assume persistence is secure).
What type of credential should I use? ClientSecretCredential doesn't work. Alternatively, perhaps there is a token cache mechanism I am not aware of.
EDIT:
I reposted a variation of this question. It also has a working answer.
Thank you Jim Xu.
According to my research, the DeviceCodeCredential doesn't cache tokens--each get_token(*scopes, **kwargs) call begins a new authentication flow.
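To illustrate that claim, here is a minimal sketch (the storage scope and the placeholder IDs are my assumptions, not from the question): per the research quoted above, each get_token call below triggers its own device-code prompt, and nothing persists across a process restart.

from azure.identity import DeviceCodeCredential

credential = DeviceCodeCredential(tenant_id="***", client_id="***")

# Per the behavior described above, each call starts a fresh device-code
# flow; the token from the first call is not reused by the second.
token_a = credential.get_token("https://storage.azure.com/.default")
token_b = credential.get_token("https://storage.azure.com/.default")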
For your scenario, you can use ClientSecretCredential. To implement it, please refer to the following steps.
Create a service principal and assign an Azure RBAC role (such as Storage Blob Data Owner, Storage Blob Data Contributor, or Storage Blob Data Reader) to it so it can perform Azure AD auth and access Azure Blob storage. For more details, please refer to the documentation.
I use the Azure CLI:
# create a service principal and assign the Storage Blob Data Contributor role at the storage account level
az login
az ad sp create-for-rbac -n "MyApp" --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>" --sdk-auth

# just assign the Storage Blob Data Contributor role at the storage account level
az role assignment create --assignee <sp_name> --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
Code
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

token_credential = ClientSecretCredential(
    sp_tenant_id,
    sp_application_id,
    sp_application_secret
)

# Instantiate a BlobServiceClient using a token credential
blob_service_client = BlobServiceClient(account_url="https://<storage-account>.blob.core.windows.net", credential=token_credential)
blobs = blob_service_client.list_containers()
for b in blobs:
    print(b)
Related
When I issue the following command:
az storage entity query --account-name acc1 --table-name table1
I successfully get my query result with the following warning:
There are no credentials provided in your command and environment, we will query for account key for your storage account.
It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.
You also can add `--auth-mode login` in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles.
For more information about RBAC roles in storage, visit https://docs.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.
In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.
To avoid the above warning, I add --auth-mode login to the command:
az storage entity query --account-name acc1 --table-name table1 --auth-mode login
Then I get this error:
You do not have the required permissions needed to perform this operation.
Depending on your operation, you may need to be assigned one of the following roles:
"Storage Blob Data Owner"
"Storage Blob Data Contributor"
"Storage Blob Data Reader"
"Storage Queue Data Contributor"
"Storage Queue Data Reader"
"Storage Table Data Contributor"
"Storage Table Data Reader"
If you want to use the old authentication method and allow querying for the right account key, please use the "--auth-mode" parameter and "key" value.
My account is able to get the query result without the --auth-mode login switch. Why does it fail authorization with the switch?
When you don't specify the authentication type, it will try to get the access key of the storage account.
This requires the Microsoft.Storage/storageAccounts/listkeys/action permission. If you have the Contributor role over the storage account, you have the required permission.
--auth-mode login means it will use AAD auth to connect to the storage. You can use one of the built-in roles to access the storage (see the documentation):
Storage Table Data Contributor
Storage Table Data Reader
When using AAD Auth, you could also disable access key authentication.
There is a good article related to RBAC management and the data plane model:
Assign an Azure role for access to blob data.
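For comparison, here is a minimal Python sketch of the same AAD ("login") auth path against the table endpoint; it assumes the azure-data-tables package and an identity that already holds Storage Table Data Reader (the account and table names reuse the question's placeholders):

from azure.data.tables import TableServiceClient
from azure.identity import ClientSecretCredential

# AAD token auth: the SDK analogue of `--auth-mode login`
credential = ClientSecretCredential(tenant_id="***", client_id="***", client_secret="***")
service = TableServiceClient(endpoint="https://acc1.table.core.windows.net", credential=credential)

table = service.get_table_client("table1")
for entity in table.list_entities():
    print(entity)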
I am trying to create a user delegation key from the Azure portal.
No matter what privileges I assign to myself, I hit the same error message:
You don't have permissions to grant read access. You can still create a shared access signature, but you'll need an RBAC role with additional permissions before you can grant that level of access to your signature recipient. Learn more about Azure roles for access to blob data
So far I have the following roles assigned:
And the link provided in the error message says I need one of the following:
Contributor
Storage Account Contributor
Storage Blob Data Contributor
Storage Blob Data Owner
Storage Blob Data Reader
Storage Blob Delegator
So it should work, but it doesn't. What am I missing?
The error usually occurs if you don't have the required roles/permissions assigned to create a user delegation key.
Please note that in order to create a user delegation key, you need a role that includes the following action:
Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey
The above action is included in the below roles:
Storage Blob Data Contributor
Storage Blob Data Owner
Storage Blob Data Reader
Storage Blob Delegator
Try assigning either the Storage Blob Data Contributor or the Storage Blob Data Owner role, since you haven't assigned one of these.
Please check at what scope you have assigned the role; make sure to assign the roles at the level of the storage account, the resource group, or the subscription.
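Once a suitable role is assigned, the same user delegation key the portal creates can also be requested through the SDK. A minimal sketch, with placeholder names, assuming the signed-in identity now holds one of the roles above:

from datetime import datetime, timedelta, timezone

from azure.identity import InteractiveBrowserCredential
from azure.storage.blob import BlobSasPermissions, BlobServiceClient, generate_blob_sas

client = BlobServiceClient(account_url="https://<storage-account>.blob.core.windows.net",
                           credential=InteractiveBrowserCredential())

# This call performs the generateUserDelegationKey action mentioned above.
start = datetime.now(timezone.utc)
delegation_key = client.get_user_delegation_key(key_start_time=start,
                                                key_expiry_time=start + timedelta(hours=1))

# The key can then sign a user delegation SAS for a specific blob.
sas = generate_blob_sas(account_name="<storage-account>",
                        container_name="<container>",
                        blob_name="<blob>",
                        user_delegation_key=delegation_key,
                        permission=BlobSasPermissions(read=True),
                        expiry=start + timedelta(hours=1))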
I tried this in my environment and got the same error when the roles were not assigned.
After assigning the roles, I am able to create a user delegation key successfully without errors.
If still the error persists, try creating an Azure Support ticket.
For more detail, please refer to the links below:
Create SAS tokens for containers and blobs with the Azure portal | Microsoft Docs
azure-docs/storage-blob-user-delegation-sas-create-cli.md at main · MicrosoftDocs/azure-docs · GitHub
I have an Azure account with Owner permission for the subscription we have. I can see two role assignments for the same subscription: one is Owner and the other is Contributor. I am trying to delete the blob cache with the following Azure CLI command:
az storage blob delete-batch --source <containerName> --account-name <storageAccountName> --auth-mode login
I am getting the below error
I am not sure why, despite having enough permissions, I am getting this error. Please help.
Attaching the permission of my subscription
My access permission to storage account
If you set the --auth-mode parameter to login, it means that you use Azure AD auth to retrieve Azure blob data. If so, the Azure AD security principal you used to log in should be assigned the role Storage Blob Data Owner, Storage Blob Data Contributor, or Storage Blob Data Reader. Otherwise, you have no permission to process Azure blobs.
Since your account has only been assigned the Owner role, please set the --auth-mode parameter to key, which means the CLI attempts to retrieve the account access key and use it to process Azure blobs. The Owner role has the permission to do that.
For more details, please see here and here.
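As an aside, if you do want to stay on Azure AD auth instead of falling back to key, here is a minimal Python sketch of the same batch delete; it assumes a data role such as Storage Blob Data Contributor has been assigned, and the names reuse the question's placeholders:

from azure.identity import AzureCliCredential
from azure.storage.blob import BlobServiceClient

# Reuses the existing `az login` session; requires a data role such as
# Storage Blob Data Contributor on the account, as explained above.
client = BlobServiceClient(account_url="https://<storageAccountName>.blob.core.windows.net",
                           credential=AzureCliCredential())

container = client.get_container_client("<containerName>")
for blob in container.list_blobs():
    container.delete_blob(blob.name)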
Our CI pipeline needs to back up some files to Azure Blob Storage. I'm using the Azure CLI like this:

az storage blob upload-batch -s . -d container/directory --account-name myaccount
When giving the service principal contributor access, it works as expected. However, I would like to lock down permissions so that the service principal is allowed to add files, but not delete, for example. What are the permissions required for this?
I've created a custom role giving it the same permissions as Storage Blob Data Contributor minus delete. This (and also just using the Storage Blob Data Contributor role directly) fails with a Storage account ... not found. Ok, I then proceeded to add more read permissions to the blob service. Not enough, now I'm at a point where it wants to do Microsoft.Storage/storageAccounts/listKeys/action. But if I give it access to the storage keys, then what's the point? With the storage keys the SP will have full access to the account, which I want to avoid in the first place. Why is az storage blob upload-batch requesting keys and can I prevent this from happening?
I've created a custom role giving it the same permissions as Storage Blob Data Contributor minus delete. This (and also just using the Storage Blob Data Contributor role directly) fails with a Storage account ... not found.
I can also reproduce your issue; actually, what you did will work. The trick is the --auth-mode parameter of the command: if you do not specify it, it uses key by default. The command then lists all the storage accounts in your subscription; when it finds your storage account, it lists the keys of the account and uses a key to upload blobs.
However, your Storage Blob Data Contributor minus delete role has no permission to list storage accounts, so you get the error.
To solve the issue, just specify --auth-mode login in your command. It will then use the credential of your service principal to get an access token and use the token to call the Put Blob REST API to upload blobs. For the underlying principle, see Authorize access to blobs and queues using Azure Active Directory.
az storage blob upload-batch -s . -d container/directory --account-name myaccount --auth-mode login
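For reference, the token-based path the command takes can be sketched directly with the Python SDK. This assumes the service principal holds your custom role (or Storage Blob Data Contributor) and uses placeholder credentials:

import os

from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

credential = ClientSecretCredential(tenant_id="***", client_id="***", client_secret="***")
client = BlobServiceClient(account_url="https://myaccount.blob.core.windows.net",
                           credential=credential)
container = client.get_container_client("container")

# Each upload is a Put Blob call authorized with a bearer token;
# Microsoft.Storage/storageAccounts/listKeys/action is never needed.
for name in os.listdir("."):
    if os.path.isfile(name):
        with open(name, "rb") as data:
            container.upload_blob(name=f"directory/{name}", data=data, overwrite=True)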
I followed the tutorial (below *)
and now have a Service Principal.
How can I use this Service Principal when reading a blob using Get-AzureStorageBlob?
Get-AzureStorageBlob requires a New-AzureStorageContext; can I use the SP instead of the StorageAccountKey GUID?
Thanks, Peter
https://azure.microsoft.com/en-us/documentation/articles/resource-group-authenticate-service-principal/
As far as I know, you cannot use an SPN for accessing items in blob storage. You will need to use the access keys or SAS tokens.
Recently, Azure has added an option to Manage access rights to Azure Storage data with RBAC. You need to add one of the built-in RBAC roles scoped to the storage account to your service principal.
Storage Blob Data Contributor (Preview)
Storage Blob Data Reader (Preview)
Then, if you want to use the Azure CLI to access Blob Storage with a Service Principal:
Log in with a service principal
$ az login --service-principal --tenant contoso.onmicrosoft.com -u http://azure-cli-2016-08-05-14-31-15 -p VerySecret
Enable the preview extension
$ az extension add -n storage-preview
Use the --auth-mode parameter with your Azure CLI command
$ az storage blob download --account-name storagesamples --container sample-container --name myblob.txt --file myfile.txt --auth-mode login
For more information please see:
Manage access rights to Azure Storage data with RBAC (Preview)
Use an Azure AD identity to access Azure Storage with CLI or PowerShell (Preview)
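The same download can also be done from Python rather than the CLI. A rough sketch, assuming the service principal already holds one of the data roles above (identifiers reuse the placeholders from the commands; the client ID is an assumption):

from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobClient

credential = ClientSecretCredential(tenant_id="contoso.onmicrosoft.com",
                                    client_id="***",
                                    client_secret="VerySecret")
blob = BlobClient(account_url="https://storagesamples.blob.core.windows.net",
                  container_name="sample-container",
                  blob_name="myblob.txt",
                  credential=credential)

# Stream the blob down to a local file, authorized via Azure AD.
with open("myfile.txt", "wb") as f:
    f.write(blob.download_blob().readall())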
If your SPN has only the Reader role, you cannot access the storage without a SAS or account key.
You can assign the SPN to the Contributor role and create a SAS for other normal users,
then switch to another normal user to access the storage with the SAS; a sketch of that pattern follows.
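A hedged Python sketch of the pattern just described: the privileged principal signs a container SAS with the account key, and other users access the storage with only that SAS (the names reuse the CLI example's placeholders; the key is a placeholder):

from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Signed with the account key, so only the privileged principal runs this part.
sas = generate_container_sas(account_name="storagesamples",
                             container_name="sample-container",
                             account_key="<account-key>",
                             permission=ContainerSasPermissions(read=True, list=True),
                             expiry=datetime.now(timezone.utc) + timedelta(hours=8))

# Other users append the SAS to the blob URL; no account key is required:
# https://storagesamples.blob.core.windows.net/sample-container/myblob.txt?<sas>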