For Azure there is an API endpoint that allows you to regenerate storage account keys.
The endpoint looks like this:
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}/regenerateKey?api-version=2017-06-01
The documentation states:
When you have code that needs to access or modify resources, you must set up an Azure Active Directory (AD) application.
However, when I use it (i.e. create a POST request) I'm getting the error Authentication failed. The 'Authorization' header is missing. I tried to follow this tutorial and did all the steps except Assign application to role. What role should I select to be able to regenerate the key? How do I do that? Am I getting this correct?
I think you could use the Storage Account Key Operator Service Role.
The Storage Account Key Operator Service Role allows listing and regenerating keys on storage accounts. For comparison:
Storage Account Contributor: Lets you manage storage accounts, but not access to them.
Contributor: Lets you manage everything except access to resources.
If you need to perform more actions, you could use one of the more powerful roles, but if you only want to regenerate keys, I suggest the Storage Account Key Operator Service Role.
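For illustration, here is a minimal Python sketch of calling that regenerateKey endpoint with a token obtained for a service principal holding that role. All IDs and names are placeholders, and the azure-identity and requests packages are my assumption, not part of the original question:

import requests
from azure.identity import ClientSecretCredential

# Authenticate as the Azure AD application (service principal).
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<app-client-id>",
    client_secret="<app-client-secret>",
)
# Acquire a token for the Azure Resource Manager endpoint.
token = credential.get_token("https://management.azure.com/.default")

url = (
    "https://management.azure.com/subscriptions/<subscriptionId>"
    "/resourceGroups/<resourceGroupName>/providers/Microsoft.Storage"
    "/storageAccounts/<accountName>/regenerateKey?api-version=2017-06-01"
)
# The 'Authorization' header missing from the failed request goes here.
resp = requests.post(
    url,
    headers={"Authorization": "Bearer " + token.token},
    json={"keyName": "key1"},  # or "key2"
)
resp.raise_for_status()
print(resp.json())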
I have an Azure Storage Account with public access disabled. Inside the storage account are a few Blob Storage containers. Access to each container is managed with Azure AD, with varying permissions for each user/group on the different containers.
I want to be able to download items inside the Blob Storage using tools like wget or curl using HTTP Basic Auth or specifying user information in the request.
I'm aware that AzCopy can be used to download Blobs from the containers, but previously we have used http requests to download artifacts and would like to continue using that method.
This question from 2016 makes it seem like it's possible to do this with a Shared Access Signature, which makes me believe it's possible with a User Delegation SAS, but I have not found a way to set this up, and it requires a lot of parameters, more than a username/password or token.
Does Azure Blob Storage have a way where a user can access blob storage without AzCopy or any other specialized tools and authenticate via a method that does not require additional resources?
No. You must make a separate request for a token and then send it to the Blob service.
When a security principal (a user, group, or application) attempts to access a blob resource, the request must be authorized, unless it is a blob available for anonymous access. With Azure AD, access to a resource is a two-step process. First, the security principal's identity is authenticated and an OAuth 2.0 token is returned. Next, the token is passed as part of a request to the Blob service and used by the service to authorize access to the specified resource. The authentication step requires that an application request an OAuth 2.0 access token at runtime.
Overview of Azure AD for blobs
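As a rough Python sketch of that two-step flow (the service principal, its data-plane role such as Storage Blob Data Reader, and all names below are assumptions):

import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
# Step 1: authenticate and receive an OAuth 2.0 token for the storage resource.
token = credential.get_token("https://storage.azure.com/.default")

# Step 2: pass the token with the request; the Blob service authorizes it.
resp = requests.get(
    "https://<account>.blob.core.windows.net/<container>/<blob>",
    headers={
        "Authorization": "Bearer " + token.token,
        "x-ms-version": "2020-04-08",  # OAuth requires 2017-11-09 or later
    },
)
resp.raise_for_status()
with open("downloaded-blob", "wb") as f:
    f.write(resp.content)

So wget/curl can still be used, but only after a separate token request; there is no HTTP Basic Auth equivalent.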
In order to connect to Azure shared storage (in particular, a File Share) to perform tasks like copying/removing/modifying files from a remote machine to Azure Storage, we need either a SAS (Shared Access Signature) or Active Directory settings enabled (and then roles assigned based on requirements).
I wanted to implement access using the SAS approach. I tried generating a SAS from the UI, and tried generating a SAS using the Access Keys (kept inside the storage account; the confidential and most important keys for the storage account); both worked. But the UI approach isn't workable in my case, and the access keys can't be given to anyone apart from the administrator.
So is there a way to generate a SAS using Azure AD credentials, or some service where we can create an account and password/key, and that account can be used to create a SAS token via curl (a REST call), instead of generating the SAS via the access keys (admin keys)?
The tricky part is letting your users create a SAS token for the file share without granting them permissions on the whole storage account.
You can use a middle-tier application that creates the SAS token and let the users call that app. An Azure Function with an HTTP trigger can be used, for example. You grant the Azure Function access to the storage account using a Managed Service Identity, and secure access to the Azure Function either with Active Directory or with a function key that you distribute to your users.
You can try this approach:
A SAS token for access to a container, directory, or blob may be secured by using either Azure AD credentials or an account key.
Microsoft recommends that you use Azure AD credentials when possible as a security best practice, rather than using the account key, which can be more easily compromised. When your application design requires shared access signatures, use Azure AD credentials to create a user delegation SAS for superior security.
Create a User delegation SAS
Generate a User Delegation Key:
POST https://myaccount.blob.core.windows.net/?restype=service&comp=userdelegationkey
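If you would rather not call that REST operation directly, a minimal sketch with the azure-storage-blob and azure-identity packages looks roughly like this (account, container, and blob names are placeholders; note that a user delegation SAS applies to Blob storage, not File Shares):

from datetime import datetime, timedelta
from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobServiceClient,
    BlobSasPermissions,
    generate_blob_sas,
)

# Authenticate with Azure AD credentials (e.g. the managed identity of
# the middle-tier app suggested above) instead of the account key.
credential = DefaultAzureCredential()
service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=credential,
)

# Equivalent of the userdelegationkey REST call above.
start = datetime.utcnow()
expiry = start + timedelta(hours=1)
udk = service.get_user_delegation_key(start, expiry)

# Sign a short-lived, read-only SAS for one blob with that key.
sas = generate_blob_sas(
    account_name="<account>",
    container_name="<container>",
    blob_name="<blob>",
    user_delegation_key=udk,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)
print("https://<account>.blob.core.windows.net/<container>/<blob>?" + sas)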
We aim to collect data from the Azure Management APIs. These APIs provide information on the resources we have running in Azure, the consumed budget, etc. (example). Following our design choices, we prefer to use Azure Data Factory exclusively to make the HTTP requests and store the data in our data lakes. The obvious candidate is the REST linked service. However, we struggle to correctly set up the OAuth2 authentication dance with this method.
Our first idea was to store the token and the refresh token within the Azure Key Vault. A series of HTTP requests within the pipeline would then test whether the token is still valid, or otherwise use the refresh token to get a new token. The downside to this approach is that the token within the Azure Key Vault is never updated when needed, and that the logic becomes more complex.
Alternatively, we were trying to set up the authorization through a combination of a registered app and a service principal in our Azure AD account. The REST linked service within Data Factory can be created with a service principal, which would then handle most of the scope and consent details. The service principal is also accompanied by an Azure AD app, which would hold the token, etc. Unfortunately, we are unable to make this setup work correctly.
Questions we have:
Can we actually use a service principal / app to store our OAuth2 tokens? If so, will these be automatically refreshed within our app?
How do we assign the correct privileges / authorizations to our app that it can use this (external) API?
Is the additional logic with HTTP calls within Azure Data Factory pipeline needed to update the tokens or can these apps / service principals handle this?
Thank you for your time and help!
It is not a good idea to store the tokens in the Key Vault, because they will expire.
In your case, there are two options you can use:
Use service principal to auth
Use managed identity to auth (best practice)
Steps to use service principal to auth:
1. Register an application with Azure AD and create a service principal.
2. Get the values for signing in and create a new application secret.
3. To call the Azure REST API, e.g. the Resources - List operation you mentioned, your service principal needs an RBAC role in your subscription.
Navigate to the Azure portal -> Subscription -> assign your service principal the Contributor/Owner role on the subscription as shown below.
4. In the linked service, configure it as shown below, filling in the values obtained in step 2.
Don't forget to replace the {subscriptionId} in the Base URL.
https://management.azure.com/subscriptions/{subscriptionId}/resources?api-version=2020-06-01
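Since the original screenshot is not reproduced here, the JSON definition of such a REST linked service might look roughly like this (property names follow the REST connector schema; all values are placeholders):

{
    "name": "AzureManagementApi",
    "properties": {
        "type": "RestService",
        "typeProperties": {
            "url": "https://management.azure.com/subscriptions/<subscriptionId>/resources?api-version=2020-06-01",
            "enableServerCertificateValidation": true,
            "authenticationType": "AadServicePrincipal",
            "servicePrincipalId": "<app-client-id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<app-client-secret>"
            },
            "tenant": "<tenant-id>",
            "aadResourceId": "https://management.azure.com"
        }
    }
}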
5. Test the linked service with a copy activity; it works fine.
Steps to use managed identity to auth:
1. Make sure your data factory has MSI (managed identity) enabled; if you created it in the portal or via PowerShell, MSI is enabled automatically, so don't worry about that.
2. Navigate to the Subscription in the portal and add the role to the MSI as in step 3 of the service principal steps; just search for your ADF name in the bar. The MSI is essentially a service principal with the same name as your ADF, managed by Azure.
3. Then in the linked service, just change the authentication type as shown below.
Finally, to answer your questions:
Can we actually use a service principal / app to store our OAuth2 tokens? If so, will these be automatically refreshed within our app?
As I mentioned, storing the tokens is not a good idea; just use the service principal/MSI to auth as in the steps above.
How do we assign the correct privileges / authorizations to our app that it can use this (external) API?
To use the Azure REST API, just assign the RBAC roles as above and specify the correct AAD resource, e.g. https://management.azure.com in this case.
Is the additional logic with HTTP calls within Azure Data Factory pipeline needed to update the tokens or can these apps / service principals handle this?
No other steps are needed; with the configuration above, it will essentially use the client credential flow to get the token in the background for you automatically, then use the token to call the API.
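For reference, the client credential flow that runs in the background is roughly equivalent to this Python sketch (IDs are placeholders; this is only to show what happens, you do not need to implement it yourself):

import requests

# Token request against the AAD v1 token endpoint, as used for ARM.
resp = requests.post(
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "<app-client-id>",
        "client_secret": "<app-client-secret>",
        "resource": "https://management.azure.com/",
    },
)
token = resp.json()["access_token"]

# The token is then attached to each REST call and refreshed as needed.
result = requests.get(
    "https://management.azure.com/subscriptions/<subscriptionId>/resources?api-version=2020-06-01",
    headers={"Authorization": "Bearer " + token},
)
print(result.json())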
I am writing an Azure Functions application in Python. It is an HTTP-trigger application. The input for the application is a user ID.
{ "user":"TEST_USER_1" }
The function should use the username specified in the 'user' parameter, look up the password in Azure Key Vault, and attempt a remote login to one of the services. For this, we need to add an application settings parameter for a specific Key Vault secret version, BUT these changes are static.
For us, all the users and their passwords are added to Azure Key Vault, and we will keep provisioning more users.
Now, the challenge is how to dynamically access the password for any user specified during the API call. Is it possible to access all the secrets in a Key Vault without specifying individual secret versions?
Here's what you need to do:
create a managed identity for your Azure Function
grant permission on Azure Key Vault (Get / List) for the previous identity
get access token from Azure AD in your function
use the token to retrieve the secret.
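Putting those steps together, a minimal sketch of the function body, assuming the azure-identity and azure-keyvault-secrets packages (which handle the token acquisition for you) and that each user's password is stored as a secret named after the user:

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    user = req.get_json().get("user")  # e.g. "TEST_USER_1"
    if not user:
        return func.HttpResponse("Missing 'user' parameter.", status_code=400)

    # DefaultAzureCredential picks up the Function's managed identity
    # and requests the Key Vault access token for you.
    credential = DefaultAzureCredential()
    client = SecretClient(
        vault_url="https://<vault-name>.vault.azure.net",
        credential=credential,
    )

    # Omitting the version fetches the latest version of the secret,
    # so no static per-version application setting is needed.
    password = client.get_secret(user).value

    # ... use `password` for the remote login here ...
    return func.HttpResponse("Retrieved secret for " + user, status_code=200)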
Some useful links:
https://learn.microsoft.com/en-us/azure/python/python-sdk-azure-authenticate
https://learn.microsoft.com/en-us/python/api/overview/azure/key-vault?view=azure-python
I am trying to implement a Key Vault managed storage account in Azure to rotate storage keys using Key Vault. I followed the documentation, which uses both "ServicePrincipalID" and "UserPrincipalID", but in my case I am provisioning my resources and implementing all the steps involved using my service principal (as we deploy using VSTS with a service principal), and using "ServicePrincipalID" as the ObjectID in place of "UserPrincipalID" (as there is no user intervention during the provisioning and post-provisioning process). I gave my service principal the "Owner" role and all required permissions for Key Vault to access storage. But when I run "Add-AzureKeyVaultManagedStorageAccount" I get the error below, which says "KeyVault is unable to perform the action on behalf of the caller", so I am not sure what access I am still missing, even after making my principal an Owner. Please find my screenshots below for more details. Would be glad to hear any suggestions to cross this hurdle.
Error
KeyVault details
Thanks
Chaitanya Alladi.
I get the below error which says "KeyVault is unable to perform the action on behalf of the caller". So I am not sure what access I am still missing, even after making my principal an Owner.
Unfortunately, we can't do that with a service principal right now.
AAD doesn't support getting an OBO (On-Behalf-Of) token for service principal caller tokens.
We need to use user credentials instead of service principal credentials. There are some operations on storage account keys that, as of now, are only possible on behalf of a user and not a service principal.