I'm trying to get hold of the Thumbprint value for an App Service Certificate to be used in the hostNameBindings:
I've tried [reference(resourceId(variables('sslRg'), 'Microsoft.CertificateRegistration/certificateOrders', variables('sslName')), '2021-03-01').Thumbprint] but that doesn't have the Thumbprint property:
https://learn.microsoft.com/en-us/azure/templates/microsoft.certificateregistration/2021-03-01/certificateorders?tabs=bicep
You can get the thumbprint value like this:
[reference(resourceId('Microsoft.CertificateRegistration/certificateOrders', parameters('certificateOrderName'))).SignedCertificate.Thumbprint]
Please refer to this template as an example.
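If you want to double-check that property path outside of a deployment, here is a rough PowerShell sketch (assuming the Az module, with placeholder values standing in for variables('sslRg') and variables('sslName')) that reads the same signedCertificate.thumbprint value the reference(...) expression exposes:

# Placeholder values standing in for variables('sslRg') and variables('sslName')
$rg = "my-ssl-rg"
$orderName = "my-cert-order"

# Read the certificate order with its properties expanded and pull the same
# signedCertificate.thumbprint path the template expression uses
$order = Get-AzResource -ResourceGroupName $rg `
    -ResourceType "Microsoft.CertificateRegistration/certificateOrders" `
    -Name $orderName -ExpandProperties
$order.Properties.signedCertificate.thumbprint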
Related
In the first step of certificate configuration, I couldn't set the Key Vault. I tried creating a new one, but it still doesn't work.
It always shows this error.
Failed to link certificate with the selected Key Vault. Check below errors for more detail.:
An error has occurred.
Have you followed the steps here?
I just bought a new App Service Certificate and created a new vault, and now I can store it successfully.
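In case someone hits the same linking error with an existing vault: it is often a permissions problem, because the App Service resource providers need access to the vault before the certificate can be stored in it. A rough sketch with the Az module follows; the two application IDs are the well-known App Service and Certificate Registration service principals, but please verify them against the current docs before relying on this:

# Grant the App Service and Certificate Registration resource providers access to the vault
Set-AzKeyVaultAccessPolicy -VaultName "my-vault" `
    -ServicePrincipalName "abfa0a7c-a6b6-4736-8310-5855508787cd" `
    -PermissionsToSecrets get,set,delete -PermissionsToCertificates get,list,import
Set-AzKeyVaultAccessPolicy -VaultName "my-vault" `
    -ServicePrincipalName "f3c21649-0979-4721-ac85-b0216b2cf413" `
    -PermissionsToSecrets get,set,delete -PermissionsToCertificates get,list,import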
At the moment I'm working in Azure with Azure Automation and an Automation account. To execute a runbook I want to authenticate with a service principal + certificate.
Unfortunately I get the error message "The private key is not present in the X.509 certificate".
I am using:
Add-AzureRmAccount -ServicePrincipal -Tenant xxx -ApplicationId xxx -CertificateThumbprint xxx
But with Login-AzureRmAccount and Connect-AzureRmAccount I get the same error message.
What have I done so far?
Since I'm working in a big company I can't use a self-signed certificate. Our team created a .cer file and a .key (private key) file for me. After some testing I found out that I need something like this:
Example
Furthermore, I know that I can get this with a .pfx file, but that is not accepted by the other team, which will import the certificate into my service principal, since they only accept .cer files.
How can I get a .cer file with a public key included?
Thanks a lot!
Your certificate needs to include the private key if you want to sign in with it, and from the error message it appears to be missing. A certificate with the private key included will normally have a .pfx file extension. For reference, check the Microsoft docs here: "Clients which sign in with the service principal also need access to the certificate's private key"
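As a rough sketch of one way to get there with the files you already have (assuming OpenSSL is available, the .cer/.key are PEM-encoded, and the file names below are placeholders): combine them into a .pfx, import it into your CurrentUser store, and sign in with the thumbprint. The .cer the other team uploads to the service principal stays public-key-only; the private key only has to exist in your local store.

# Combine the public certificate and the private key into a PFX (OpenSSL assumed, PEM-encoded inputs)
openssl pkcs12 -export -in mycert.cer -inkey mycert.key -out mycert.pfx

# Import the PFX into the CurrentUser store so the private key is available for sign-in
$pfxPassword = Read-Host "PFX password" -AsSecureString
$cert = Import-PfxCertificate -FilePath .\mycert.pfx -CertStoreLocation Cert:\CurrentUser\My -Password $pfxPassword

# Authenticate with the service principal using the certificate thumbprint
Add-AzureRmAccount -ServicePrincipal -TenantId "xxx" -ApplicationId "xxx" -CertificateThumbprint $cert.Thumbprint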
I have a WCF service that is telling me it can't find its certificate:
Cannot find the X.509 certificate using the following search criteria:
StoreName 'My', StoreLocation 'CurrentUser', FindType
'FindByThumbprint', FindValue
'cf51e92041d0440a262df6a357f3f709f6f8d710'.
and the config specifies the certificate by the thumbprint
<serviceCertificate storeLocation="CurrentUser" storeName="My"
findValue="cf51e92041d0440a262df6a357f3f709f6f8d710"
x509FindType="FindByThumbprint" />
Using the powershell command Get-ChildItem cert:\CurrentUser\My finds the certificate. If I change the config file to specify LocalMachine the service starts correctly.
What is going on? I suppose I could change the config file; however, when publishing my service to Azure it then can't find the uploaded certificate because that is looking in CurrentUser. I know I could use different configs for different environments, but I don't want to blindly head down that path without understanding the "why" of things.
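One thing worth checking: the CurrentUser store is per Windows account, so the service only sees the CurrentUser\My store of the identity it runs under, not yours. A quick check (run once as yourself and once as the service's account, using the thumbprint from the error above) is something like:

# Look for the thumbprint in both stores visible to the current identity
$thumb = "cf51e92041d0440a262df6a357f3f709f6f8d710"
Get-ChildItem Cert:\CurrentUser\My  | Where-Object { $_.Thumbprint -eq $thumb }
Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Thumbprint -eq $thumb }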
I basically want to create my HDI/Spark Cluster which accesses an Azure Data Lake Store by using ARM templates and also Azure Key Vault.
So far I created the cluster manually and stored the ARM template. Then I tried to populate the sensitive values from Azure Key Vault, but I am struggling with how to pass in the "identityCertificate" correctly.
I also followed these steps to create the certificate and everything: https://github.com/Azure/azure-quickstart-templates/tree/master/201-hdinsight-datalake-store-azure-storage
and then these steps to upload the certificate into the Key Vault: https://blogs.technet.microsoft.com/kv/2016/09/26/get-started-with-azure-key-vault-certificates/
However, referencing the Key Vault secret in my ARM template always ends up in this error:
{ "status": "Failed", "error": { "code": "ResourceDeploymentFailure",
"message": "The resource operation completed with terminal provisioning state 'Failed'.", "details": [ { "code": "InvalidDocumentErrorCode", "message": "DeploymentDocument 'AmbariConfiguration_1_7'
failed the validation. Error: 'Error while getting access to the datalake storage account gbhdi: The specified network password is not correct.\r\n.'" } ] } }
Doing everything manually in the Azure Portal using the same certificate etc. works just fine.
I also tried to set the "identityCertificate" parameter manually by using the Base64-encoded value of my certificate, but this did not work either.
Which value would I need to pass to my parameter if I hard-code it?
It seems I found the issue, and it is actually related to the previously failed ARM deployments, which leave some fragments of the HDI cluster behind; new deployments do not overwrite these fragments but use the old settings.
After deleting the cluster (which was not working anyway) I could deploy it as expected.
However, it is worth mentioning that the certificate has to be stored in Key Vault as a Secret (not as a Key) and that it has to be Base64-encoded!
Here is the PowerShell script that I used:
# Add certificate to Key Vault as a Base64-encoded secret
$base64Cert = [System.Convert]::ToBase64String((Get-Content $certFilePath -Encoding Byte))
$base64Cert | Out-File $certFilePath.Replace(".pfx", ".base64.txt")
$cer3 = Set-AzureKeyVaultSecret -VaultName $vaultName -Name $certName -SecretValue (ConvertTo-SecureString -String $base64Cert -AsPlainText -Force)
hope that helps other people facing the same issue!
-gerhard
Thanks Gerhard, I think you saved me a couple of hours of investigation.
First I tried using plain text values. I changed the SecureString types to String in the template, and provided plain text passwords. For the identityCertificate parameter I added the Base64-encoded string of the certificate, and everything worked. If you wanted to hardcode it, that would be the way to do it. The failure in this could have been due to the previous failed attempts.
After that I tried to use the key vault. I added the password as a secret in the vault, and the certificate, well... as a certificate. Then it failed with the exact same error message you mentioned. So the solution was to add the Base64-encoded certificate as a secret too (through the UI).
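For completeness, once the Base64-encoded pfx is sitting in the vault as a secret, you can also pull it out at deployment time from PowerShell instead of the parameters file. A rough sketch (assuming the AzureRM cmdlets used above, and that the template parameter is named identityCertificate and typed securestring; $rgName and the template path are placeholders):

# Read the Base64-encoded pfx secret back out of the vault
$identityCert = (Get-AzureKeyVaultSecret -VaultName $vaultName -Name $certName).SecretValue

# Pass it straight into the deployment; template parameters surface as dynamic cmdlet parameters
New-AzureRmResourceGroupDeployment -ResourceGroupName $rgName `
    -TemplateFile .\azuredeploy.json `
    -identityCertificate $identityCert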
I've uploaded a .pfx certificate as a secret to my Azure Key Vault via the portal, and I now want to use this to sign the credentials in IdentityServer4.
I've got a reference to the vault in the startup of the API using:
builder.AddAzureKeyVault(
$"https://{config["Vault"]}.vault.azure.net/",
config["ClientId"],
config["ClientSecret"]);
But I'm not too sure how to get the certificate out to pass to:
services.AddIdentityServer()
.AddSigningCredential(...)
Is it possible to reference the certificate directly from the Key Vault, or do I need to deploy the cert to the web app the API is running on?
Thanks
You are already building the IConfiguration object, so you can reference the pfx secret just like you would reference any other value from the configuration.
If the secret is in the Key Vault you can reference it with something like:
// Name of your pfx secret in the Key Vault.
var key = _configuration["pfx"];
// The secret value is the Base64-encoded pfx, so decode it back to bytes.
var pfxBytes = Convert.FromBase64String(key);
// Create the certificate (use the overload that also takes a password if the pfx is protected).
var cert = new X509Certificate2(pfxBytes);

services.AddIdentityServer()
    .AddSigningCredential(cert);
I would recommend using the Key Vault for this, but you could decide to upload the pfx to the certificate store of the web app instead. You can do this in Azure by going to your web app -> SSL Certificates -> Upload certificate and entering the password. Then go to Application Settings and add the app setting WEBSITE_LOAD_CERTIFICATES : <thumbprint>. The last thing you would do is retrieve the certificate from the certificate store and pass it to AddSigningCredential(cert) as before.
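If you go the certificate-store route, the uploaded certificate ends up in the CurrentUser\My store of the web app's identity, so a quick sanity check from the Kudu (Advanced Tools) PowerShell console is something like this, where <thumbprint> is the value you put in WEBSITE_LOAD_CERTIFICATES:

# Run in the Kudu debug console to confirm the certificate was actually loaded
Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Thumbprint -eq "<thumbprint>" }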