I am writing an Azure Functions application in Python. It is an HTTP-trigger application, and the input is a user ID:
{ "user":"TEST_USER_1" }
The function should take the username specified in the 'user' parameter, look up that user's password in Azure Key Vault, and attempt a remote login to one of our services. For this we would need to add an application settings entry referencing a specific Key Vault secret version, but those settings are static.
In our setup, all users and their passwords are added to Azure Key Vault, and we will keep provisioning more users.
The challenge is how to dynamically access the password for whichever user is specified in the API call. Is it possible to access all the secrets in a Key Vault without specifying individual secret versions?
Here's what you need to do:
Create a managed identity for your Azure Function.
Grant that identity Get/List secret permissions on the Azure Key Vault.
Get an access token from Azure AD in your function.
Use the token to retrieve the secret.
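A minimal sketch of that flow in a Python HTTP-triggered function, using the azure-identity and azure-keyvault-secrets libraries; the vault URL and the one-secret-per-user naming convention are assumptions, not something from the question:

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up the function's managed identity at runtime,
# so no client ID/secret is stored anywhere.
credential = DefaultAzureCredential()
# placeholder vault URL; in practice read it from an app setting
secret_client = SecretClient(vault_url="https://<your-vault-name>.vault.azure.net", credential=credential)

def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        user = req.get_json().get("user")
    except ValueError:
        user = None
    if not user:
        return func.HttpResponse("Missing 'user' in request body", status_code=400)

    # assumes one secret per user, named after the user ID; omitting the version
    # always returns the latest version, so newly provisioned users need no config change
    password = secret_client.get_secret(user).value

    # ... attempt the remote login with user/password here ...
    return func.HttpResponse(f"Retrieved credentials for {user}", status_code=200)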
Some useful links:
https://learn.microsoft.com/en-us/azure/python/python-sdk-azure-authenticate
https://learn.microsoft.com/en-us/python/api/overview/azure/key-vault?view=azure-python
People say not to store API keys and passwords in config files and instead to use a secrets vault, e.g. AWS or Azure.
But to access these you need a clientId and clientSecret. These need to be stored somewhere in the app, e.g. app.config. So I really don't understand what problem this solves if the hacker can use the clientId and clientSecret in the app to get the passwords or API keys anyway.
It seems even worse than the original problem of storing one API key, since if they get access to the secrets manager they will have ALL THE KEYS and ALL the passwords.
AWS offers a few different services to store your secrets. For example, if you have a database password in an application configuration file, you can store it as a secret in either AWS Secrets Manager or AWS Parameter Store.
To retrieve these values securely, you do not have to store another secret in your application. You can use a mechanism called role-based access in AWS.
If you are running your application on an EC2 instance, you can configure an AWS role/instance profile and assign it to the EC2 instance. This links Secrets Manager and the EC2 machine securely, so your application can decrypt the secret and use it without any stored credentials.
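As a rough illustration in Python with boto3 (the secret name and region are placeholders): on such an instance the code needs no stored credentials, because boto3 picks up the instance role automatically.

import boto3

# credentials come from the EC2 instance role/profile, not from any config file
client = boto3.client("secretsmanager", region_name="us-east-1")

# hypothetical secret name
response = client.get_secret_value(SecretId="prod/app/db-password")
db_password = response["SecretString"]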
If you are using Azure, the answer is to use Managed Identities.
The way it works is that you assign an identity to the resources (VMs, WebApps etc.) that need access to Key Vault. That way the resource becomes like a user in your Azure AD (much like a Service Principal or any other user). Then you can make use of Key Vault Access Policies to assign appropriate access to keys and secrets in your Key Vault to these Managed Identities. Doing this would not require you to specify a Client Id/Client Secret to access the Key Vault.
While a system-assigned Managed Identity is something you assign to a single resource, that can become cumbersome if you have many resources. That's where User Assigned Managed Identity comes into the picture. A User Assigned Managed Identity is a resource in your Azure subscription. The process is very much the same: you create such an identity and then grant it appropriate access on your Key Vault resources.
Wherever you need to access Key Vault in your applications, you then specify the ID of this identity. The application, using the appropriate SDK, will get an access token on behalf of this identity and connect to Key Vault using that access token.
You can learn more about these identities here: https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview.
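For a user-assigned identity, the application selects the identity by its client ID. A hedged sketch with the azure-identity Python library (the identity's client ID, the vault name, and the secret name are placeholders):

from azure.identity import ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient

# client ID of the user-assigned managed identity (placeholder value)
credential = ManagedIdentityCredential(client_id="<user-assigned-identity-client-id>")

client = SecretClient(vault_url="https://<your-vault-name>.vault.azure.net", credential=credential)
secret = client.get_secret("my-app-secret")  # hypothetical secret name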
But to access these you need a clientId and clientSecret. These need to be stored somewhere in the app, e.g. app.config
You can use runtime permissions to access the secrets and parameters. In Azure they are called Managed Identities; in AWS there are service roles. I am more familiar with AWS, so I will use its terminology, but every larger cloud provider has a similar approach under a different name.
Basically, you can assign the compute resource where your code runs (EC2/VM server, Lambda function, ECS container, ...) a role, which you can think of as a set of permissions. Using the AWS API, you can then access the secrets or parameters from the code without storing any client credentials.
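For example, reading an encrypted parameter from Parameter Store in Python requires no stored credentials when a role is attached to the compute resource (the parameter name and region are placeholders):

import boto3

# the IAM role attached to the EC2/Lambda/ECS resource supplies the credentials
ssm = boto3.client("ssm", region_name="us-east-1")

# hypothetical parameter name; WithDecryption handles SecureString parameters
parameter = ssm.get_parameter(Name="/prod/app/db-password", WithDecryption=True)
db_password = parameter["Parameter"]["Value"]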
if they get access to the secrets manager they will have ALL THE KEYS and ALL the passwords.
That's why we all need to apply the principle of least privilege: the defined runtime identity should have only the permissions it really needs.
In order to connect to Azure shared storage (in particular File Share) to perform tasks like copying, removing, or modifying files from a remote location to Azure Storage, we need either a SAS (Shared Access Signature) or Active Directory settings enabled (and then roles assigned based on requirements).
I wanted to implement access using the SAS approach. I tried generating a SAS from the UI, and I tried generating a SAS by making use of the access keys (found inside the storage account; confidential and the most important keys for the storage account); both worked. But the UI approach isn't workable in my case, and the access keys can't be given to anyone apart from the administrator.
So is there a way to generate a SAS using Azure AD credentials, or some service where we can create an account and password/key, and that account can be used to create a SAS token via curl (REST call) rather than generating the SAS via the access keys (admin keys)?
The tricky part is to let your users create a SAS token for the file share without granting them permissions on the whole storage account.
You can use a middle-tier application that creates the SAS token and allow the users to use that app. An Azure Function with an HTTP trigger can be used, for example. You grant the Azure Function access to the storage account using a Managed Service Identity, and you secure access to the Azure Function either with Active Directory or with a function key that you distribute to your users.
You can try with this approach:
A SAS token for access to a container, directory, or blob may be secured by using either Azure AD credentials or an account key.
Microsoft recommends that you use Azure AD credentials when possible as a security best practice, rather than using the account key, which can be more easily compromised. When your application design requires shared access signatures, use Azure AD credentials to create a user delegation SAS for superior security.
Create a User delegation SAS
Generate a User Delegation Key:
POST https://myaccount.blob.core.windows.net/?restype=service&comp=userdelegationkey
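A hedged sketch of the same flow with the azure-storage-blob Python SDK: the SAS is signed with a user delegation key obtained via Azure AD instead of the account key (note that user delegation SAS applies to Blob storage; the account, container, and blob names below are placeholders):

from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobSasPermissions, BlobServiceClient, generate_blob_sas

account_name = "myaccount"  # placeholder
credential = DefaultAzureCredential()
service = BlobServiceClient(f"https://{account_name}.blob.core.windows.net", credential=credential)

# the calling identity needs an RBAC role such as Storage Blob Data Contributor
start = datetime.now(timezone.utc)
expiry = start + timedelta(hours=1)
delegation_key = service.get_user_delegation_key(key_start_time=start, key_expiry_time=expiry)

sas_token = generate_blob_sas(
    account_name=account_name,
    container_name="mycontainer",  # placeholder
    blob_name="myblob.txt",        # placeholder
    user_delegation_key=delegation_key,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)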
In my application I have to store very sensitive user data, such as various passwords to third-party services (the user fills in a form where they provide us the login and password to a third-party service).
The goal of the application is to set up another complex system using PowerShell scripts generated from over 100 inputs. There is a requirement to save user work as a draft, and that is why I need to encrypt sensitive fields somehow.
I have read a lot about Azure Key Vault, and whenever I read about secrets they seem to be described as holding app settings rather than user secrets, so I am not sure if this is the right place for that data.
Are Azure Key Vault secrets suitable for that job?
Moreover, I am able to see those values in the Azure portal in plain text, and I want to avoid that. I suppose I could encrypt them first and store the already-encrypted values, but that may be over-engineering.
It's not clear what you are describing as user secrets. If it's user credentials, then you need to federate login to an Identity Provider like Azure AD or Azure AD B2C. Key Vault is NOT an identity provider, but a secret store. If it's application secrets (think connection strings) then you should look at Key Vault (with Managed Service Identity).
Conversely, Application Settings (in App Service) are exposed in the Portal but are encrypted at rest. So if you're careful about who can access what within your subscription namespace, you should be just fine.
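If the values are really data your application owns (credentials for third-party services that your app uses on behalf of users), storing them as individual Key Vault secrets is straightforward; a hedged sketch with the azure-keyvault-secrets Python library, where the vault URL and the per-user naming convention are made up:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://<your-vault-name>.vault.azure.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# hypothetical naming convention: one secret per user and third-party service
client.set_secret("user123--serviceX-password", "the-sensitive-value")
password = client.get_secret("user123--serviceX-password").value

Anyone who can read secrets in that vault (including in the portal) will still see the plain values, so restricting access policies/RBAC is what addresses the "visible in the portal" concern.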
For Azure there is an API endpoint that allows you to regenerate a storage account key.
The endpoint looks like
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}/regenerateKey?api-version=2017-06-01
Documentation states
When you have code that needs to access or modify resources, you must set up an Azure Active Directory (AD) application.
However, when I use it (i.e. create a POST request) I'm getting the error "Authentication failed. The 'Authorization' header is missing." I tried to follow this tutorial and did all the steps except "Assign application to role". What role should I select to be able to regenerate the key? How do I do that? Am I getting this right?
I think you could use Storage Account Key Operator Service Role.
Storage Account Key Operators are allowed to list and regenerate keys on the storage account.
Storage Account Contributor: Lets you manage storage accounts, but not access to them.
Contributor: Lets you manage everything except access to resources.
If you have more actions to perform, you could use a more powerful role, but if you only want to regenerate keys, I suggest you use the Storage Account Key Operator Service Role.
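Once a principal with that role is available, the missing piece in the request is the Authorization header carrying an Azure AD bearer token for the management endpoint. A rough Python sketch using the azure-identity library (subscription, resource group, and account names are placeholders):

import requests
from azure.identity import DefaultAzureCredential

# placeholder values
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
account_name = "<storage-account>"

# token for the Azure Resource Manager endpoint
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.Storage"
    f"/storageAccounts/{account_name}/regenerateKey?api-version=2017-06-01"
)
response = requests.post(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"keyName": "key1"},
)
response.raise_for_status()
print(response.json()["keys"])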
Usually when you use Key Vault to encrypt and decrypt data, you have to keep the ClientID and ClientSecret of your AD-registered app (which has the authorization to access Key Vault) in plain text somewhere. This seems like a security problem: if someone steals the ClientID and Secret, anyone can claim to be the registered app.
Is there or can there be a more secure approach?
You can use a certificate to authenticate instead of a secret.
There are three things you need to do for this approach:
Create a certificate to use.
When creating the Active Directory application that you will use to access the Key Vault, you need to pass in the certificate you created in step 1. I don't think you can do this through the portal at the minute, so you'll need to use the New-AzureRMADApplication PowerShell command.
Use that certificate when authenticating to Key Vault. You'll need to use an overload of the AuthenticationContext.AcquireTokenAsync() method that receives a ClientAssertionCertificate to do that. You can create a ClientAssertionCertificate by simply passing the client id and the X509Certificate2.
From this blog post you can get some code for the first two steps.
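The AcquireTokenAsync/ClientAssertionCertificate call above is the .NET ADAL approach; a rough Python equivalent using the azure-identity library's certificate support might look like this (tenant ID, client ID, certificate path, and vault name are placeholders):

from azure.identity import CertificateCredential
from azure.keyvault.secrets import SecretClient

# authenticate as the AD application with its certificate instead of a client secret
credential = CertificateCredential(
    tenant_id="<tenant-id>",
    client_id="<app-client-id>",
    certificate_path="/path/to/cert-with-private-key.pem",
)

client = SecretClient("https://<your-vault-name>.vault.azure.net", credential)
secret = client.get_secret("my-secret")  # hypothetical secret name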
In addition to certificate-based authentication with Key Vault, Azure Managed Service Identity also introduces a way to make an Azure service a service principal without any client app registration or client secret. Currently it is only available in preview for some services: Azure VM, Azure App Service, Azure Functions, Azure Event Hubs, and Azure Service Bus. More information can be found here: https://learn.microsoft.com/en-us/azure/active-directory/msi-overview
[Update] Whenever you need to retrieve something from Key Vault with Azure MSI, you don't need a client secret; just use the AzureServiceTokenProvider() method to retrieve an access token.
In a real-world deployment with automation (for example via Ansible), you can store sensitive variables in Ansible Vault and secure them with an external certificate and a 256-bit key. During the automated deployment, the certificate is decrypted to access these variables and carry out the rest of the deployment. This adds another encryption layer to the whole Azure deployment.