Allowing Azure CDN to access Azure KeyVault

I'm trying to set up my Azure CDN endpoint to use HTTPS for the custom domain I already set up.
When I tried to point at the SSL cert in my Azure KeyVault, I got an error stating that I need to grant Azure CDN access to KeyVault. Any idea how I can do this -- hopefully through the Portal and NOT PowerShell, though I have a feeling it'll end up requiring PowerShell commands.
Basically, I'm trying to get my Azure CDN endpoint to use the SSL cert in my Azure KeyVault.
Anyway, I'd appreciate someone pointing me to an article or a set of instructions, please. Thanks!

Instructions for enabling SSL with your own certificate stored in Azure KeyVault are described here:
https://learn.microsoft.com/en-us/azure/cdn/cdn-custom-ssl?tabs=option-2-enable-https-with-your-own-certificate#ssl-certificates.

There's something broken with this lately.
Azure tells you to do the thing that doesn't work:
New-AzADServicePrincipal -ApplicationId "205478c0-bd83-4e1b-a9d6-db63a3e1e1c8" -Role Contributor
If you're curious as to what it actually does, it just gives the hidden Microsoft.AzureFrontDoor-Cdn service principal the Contributor role in your subscription's IAM; you can view it in the portal.
It seems that the "Contributor" role no longer contains the necessary permissions to read key vaults - namely the Microsoft.KeyVault/vaults/secrets/readMetadata/action permission.
I've noticed that:
this permission is given to the built-in Key Vault Secrets User role
all the roles that seem "superior" to Secrets User, like Key Vault Administrator, don't work; it HAS to be Key Vault Secrets User
So what fixed it for me was going into my key vault and giving MYSELF (not Azure CDN) the permission to read and list secrets. It seems that even as owner and god-emperor of your Azure subscription you can't access secrets by default.
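For reference, here is a rough CLI sketch of the steps above. The vault, resource group and subscription names are placeholders, which command you need depends on whether your vault uses RBAC or access policies, and on older CLI versions the object ID property is objectId rather than id:
# Object ID of the hidden Microsoft.AzureFrontDoor-Cdn principal (app ID from the docs above)
az ad sp show --id 205478c0-bd83-4e1b-a9d6-db63a3e1e1c8 --query id -o tsv
# RBAC-enabled vault: assign Key Vault Secrets User (to the CDN principal, or to yourself)
az role assignment create \
  --assignee <object-id-or-your-upn> \
  --role "Key Vault Secrets User" \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>
# Access-policy vault: grant get/list on secrets instead
az keyvault set-policy --name <vault-name> --object-id <object-id> --secret-permissions get list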

Related

Accessing managed secrets in Azure Key Vault with RBAC?

I have a Python script running on an Azure Virtual Machine which uploads a file into a file share in a storage account. The VM is given a user-assigned managed identity with the built-in 'Key Vault Secrets User' role.
I followed this tutorial to allow key vault to automatically manage the storage account access key as a secret. Therefore, it is a managed secret (not viewable through the portal but visible through the CLI). My Python app attempts to retrieve the access key from the vault and uses it to generate a SAS token with write permission to file shares. However, when I attempt to retrieve the secret from key vault, I get the following error:
azure.core.exceptions.HttpResponseError: (Forbidden) The user, group or application 'appid=xxx;iss=https://sts.windows.net/xxx/' does not have secrets get permission on key vault 'my-vault-name;location=eastus'. For help resolving this issue, please see https://go.microsoft.com/fwlink/?linkid=2125287
I allowed access to the vault with the managed identity I created through the portal, and this was yesterday, so the change has definitely propagated okay. What is the issue with my process? Do I need to give the managed identity more permissions than just 'Key Vault Secrets User'?
Solved. I had gotten confused and granted the managed identity access through IAM instead of through the vault Access Policies; adding it under the vault's Access Policies fixed it.
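In case it helps anyone else, this is roughly the CLI version of that fix. Names are placeholders and it assumes the vault uses the Access Policies permission model:
# Object ID (principalId) of the user-assigned managed identity
az identity show --name <identity-name> --resource-group <rg> --query principalId -o tsv
# Add an access policy on the vault for that identity
az keyvault set-policy --name my-vault-name \
  --object-id <principal-id-from-above> \
  --secret-permissions get list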

Users of PowerApp can't get secret from azure key vault (through Power Automate)

For an Azure Key Vault connection in Power Automate I am using an app registration. Users of a PowerApp I made can't seem to get secrets from the Azure Key Vault unless I give them access to the key vault. I was hoping adding the users to the access policies in the key vault would be enough.
Is there a way to let users get secrets in a PowerApp (through Power Automate) without giving them full access to the key vault?
I am trying to do something similar to this
You could grant them the "get" permission only on secrets:
az keyvault set-policy --name myvault --secret-permissions get --upn <user ID/email>
However, a better approach might be to run your application as a service principal (or have a middleware service that does - it really depends on why users need access to the secrets) and have it contact Key Vault directly. That service principal should be given minimal rights - basically the same command as above, except using --spn instead of --upn.
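For example (placeholder values, same command scoped to a service principal rather than a user):
# Grant only 'get' on secrets to the service principal the app runs as
az keyvault set-policy --name myvault --secret-permissions get --spn <app-id-of-the-service-principal>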

Secure management plane's key vault in Terraform

I am working on Terraform code. The requirements are:
Put the TF state file onto an Azure storage account. The access key to the storage account must be secured.
My TF program authenticates using a Service Principal, whose client_id, client_secret, and tenant_id should be encrypted and put onto the Azure storage account.
My idea is to use a single KeyVault that contains all of the above secrets. But how can a Terraform program secure the access key and the connection used to access the KeyVault? This is about the management plane.
According to https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-oauth2-auth-code-flow, this provides a way of coding an authorization request for an application that has already been registered with the Azure AD service.
How can I apply a similar idea in Terraform code? Or can you suggest a better way to realize the idea?
Thanks a lot.
You can use an Azure DevOps pipeline and provide the service principal secrets as a secure variable group (which can be linked from the Key Vault) or as a service connection.
Check:
https://blog.gft.com/pl/2020/03/04/secure-terraform-delivery-pipeline-best-practices-part-1-2/ (disclaimer: I am the author)
and
https://blog.gft.com/pl/2020/04/24/automating-infrastructure-deployment-on-azure-using-ci-cd-pipeline-and-terraform/
I don't think it's possible to encrypt both the storage access key and the secret of the service principal. For the backend state, you can use the service principal to authenticate so that you do not expose the storage access key. You just need to grant the right permission to the service principal. Take a look at this.
But for the service principal, if you store things like client_id, client_secret, and tenant_id in the storage account, you need to get them before executing the Terraform code. That again needs the storage access key, the service principal secret, or a SAS token.
So I think you can use the Azure CLI to authenticate: log in with the service principal, and then you do not need to provide the azurerm provider with the secret. But that is not absolutely secure either: the access token can be found under the path ~/.azure/, so you may want to change the permissions on that path. If you still think it's not secure enough, you can take a look at the managed identity of a VM, but you need to have a VM before all the things.
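As a rough sketch of that approach (all values are placeholders, and the exact backend options depend on your Terraform/azurerm versions):
# Log in as the service principal so Terraform can reuse the CLI session
az login --service-principal \
  --username <client-id> \
  --password <client-secret> \
  --tenant <tenant-id>
# Point the azurerm backend at the state storage account; with resource_group_name set,
# the backend can look up the access key via ARM using the logged-in principal,
# so no access key has to be stored in the code
terraform init \
  -backend-config="resource_group_name=<rg>" \
  -backend-config="storage_account_name=<account>" \
  -backend-config="container_name=tfstate" \
  -backend-config="key=prod.terraform.tfstate"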

Unable to Import Key Vault Certificate in Azure Government Cloud

I am trying to import a certificate from a Key Vault to an App Service to configure SSL in the Azure Government Cloud. When I do this I get an error stating:
Failed to get App Service Service principal details.
I am getting a similar error when I try to do this through an ARM template, which is what caused me to try this manually. I have tried giving my App Service a managed identity and giving that identity access to the key vault. I have tried a technique that worked in the regular Azure Cloud of giving the "Microsoft Azure App Service" account permission to the key vault, but that doesn't seem to exist in the Government Cloud.
I would have expected this to simply work and allow me to configure my SSL correctly on the app service so I don't need to manage the certificates individually on every app service.
You have to enable the managed identity of your App Service so that it can be assigned Azure permissions, then go to the Key Vault and grant the permissions to the App Service; a command-line sketch follows the links below.
https://learn.microsoft.com/en-us/azure/app-service/media/app-service-managed-service-identity/msi-blade-system.png
More information:
https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity
https://learn.microsoft.com/es-es/azure/key-vault/tutorial-net-create-vault-azure-web-app
https://azure.microsoft.com/en-us/resources/samples/app-service-msi-keyvault-dotnet/
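If you prefer the command line, a rough equivalent of those two steps (enable the identity, then grant it access) looks like this; all names are placeholders:
# Enable a system-assigned managed identity on the App Service and note the principalId it prints
az webapp identity assign --name <app-name> --resource-group <rg>
# Grant that identity read access to certificates and secrets in the vault
az keyvault set-policy --name <vault-name> \
  --object-id <principal-id-from-above> \
  --secret-permissions get \
  --certificate-permissions get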
I eventually found the solution to the issue.
Following the directions found here:
https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-certificate-from-key-vault
I tried to authorize the 'Microsoft.Azure.WebSites' Resource Provider as described in the link, but that GUID doesn't exist in the Government Cloud.
This link however does give you the equivalent GUID for the Government Cloud:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/azure-government/documentation-government-services-webandmobile.md#app-services
After using the script from the first link with the GUID value from the second link, I was able to get both the template deployments and the manual SSL import working.
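For anyone else hitting this, the manual equivalent of that script is roughly the following; the app ID is a placeholder, since the whole point is that the 'Microsoft.Azure.WebSites' resource provider uses a different GUID in the Government Cloud (see the second link):
# Grant the App Service resource provider's service principal access to the vault
# (use the Government Cloud app ID from the doc linked above, not the public-cloud one)
az keyvault set-policy --name <vault-name> \
  --spn <app-service-resource-provider-app-id> \
  --secret-permissions get \
  --certificate-permissions get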

Creating a Secret Scope in Databricks backed by Azure Key Vault fails

You can create scopes in Databricks backed by Azure Key Vault instead of using the Databricks CLI. However, when you try to create a Scope, an obscure error message (with a spelling mistake!) is shown. It appears that not many people encounter this error:
"Internal error happened while granting read/list permission to Databricks ervice principal to KeyVault: XYZ"
Setting the Manage Principal to All Users does NOT help in this case.
I figured that this was a Service Principal issue in Azure AD. The particular user I was logged on to Databricks with was not an AD contributor and only had the Contributor role on the Databricks and Key Vault services. I could not find any default Object ID in AD for Databricks, so I assumed it was creating a service principal on the fly and connecting Databricks with Key Vault (I might be wrong here - it might already exist in AD when you enable the Databricks resource provider).
Logging in as an Admin with the rights to create service principals solved the problem. After that you can see in the Key Vault the DB service principal used for the key retrieval.
As mentioned by #rcabr in his comment above, there is already an SP named 'AzureDatabricks' under Enterprise Applications; you need to get its object ID and add it to the access policy of the key vault. With this, Databricks will be able to access the Key Vault.
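Something like this should do it (vault name is a placeholder; on older CLI versions the object ID property is objectId rather than id):
# Look up the object ID of the built-in 'AzureDatabricks' enterprise application
az ad sp list --display-name AzureDatabricks --query "[].id" -o tsv
# Add it to the vault's access policy so the secret scope can read and list secrets
az keyvault set-policy --name <vault-name> \
  --object-id <object-id-from-above> \
  --secret-permissions get list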
