I'm having some difficulty managing Azure certificates from my code.
I'm trying to use the Azure REST Services API (i.e. building HTTP requests) to query the state of my services from my Azure web site.
It works well when debugging locally, but my web role seems to have some limitation with the certificate store. Below is what I do:
// this method stores a certificate from the resources
// into the local certificate store, then reads it back
private X509Certificate2 StoreCertificate(Certificates certName)
{
    // get the certificate from the resources
    X509Certificate2 newCert = this.GetCertificateFromResources(certName);

    // store it in the local certificate store
    if (newCert != null)
    {
        var store = new X509Store(
            StoreName.TrustedPeople,
            StoreLocation.LocalMachine
        );
        store.Open(OpenFlags.ReadWrite);
        store.Add(newCert); // "Access is denied" is thrown here
        store.Close();
    }

    // reset the reference and try to load it back from the store
    newCert = null;
    newCert = this.GetCertificate(certName);
    return newCert;
}
An "Access is denied" error occurs when I try to add the certificate.
Any idea? Can I store certificates in the Azure VM?
Should I use a special location to store them?
Are you using a Cloud Service (web/worker role)? If so, which OS family? I've seen reports that with a worker role on OS family 3, you need to run the role with elevated permissions in order to read certs from the local cert store. I'm not sure whether that also applies to web roles, or to adding to the cert store.
Has the service cert been added via the Azure management portal as well (or via the REST API or PowerShell)?
Well, I have found a lot of things:
I was deploying my code to a web site, so I could not add a certificate to the shared VM in Azure.
I tried connecting to the VM in a remote desktop session and added a certificate manually.
Even in that case, I get a (403) Forbidden error inside an InvalidOperationException.
So here is the current state:
a certificate has been created (makecert) and added manually to the VM that hosts my web role (deployed in a service)
this certificate has been uploaded both to the Azure account certificates and to the certificates of the Azure service that deploys my web role
the thumbprint of this certificate has been added to my code, and I can access the certificate when my code executes
So, two questions:
Is there something more I should do with my certificate?
When I run my web role locally in the Azure emulator, everything works. Is there a special setting to update during the publish/deploy step?
Thanks for your help.
To save other developers some time, here is what I did to solve the main problem:
connect to the VM that hosts the web role: see there
create the certificate: see there
finally, manage the certificate with the certificates manager (mmc.exe)
After that, the certificate is available from the code.
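For example, once the certificate is in the store on the VM, loading it by thumbprint looks roughly like this (a minimal sketch; the store name/location and the thumbprint placeholder are assumptions based on the steps above):

var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
// validOnly = false: also return certificates whose chain does not
// validate, which is the case for self-signed/makecert certificates
var matches = store.Certificates.Find(
    X509FindType.FindByThumbprint, "<thumbprint>", false);
X509Certificate2 certificate = matches.Count > 0 ? matches[0] : null;
store.Close();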
Related
When I run my ASP.NET Core API locally (in release mode), it makes an external call using WCF and a client certificate (X509Certificate2) and returns the data correctly. But when the API is deployed as an Azure App Service, it reports "The Client Certificate Credentials Were Not Recognized". The X509Certificate2 is loaded from the file system correctly (verified via remote debugging).
I've tried making the call with a plain HttpClient and attaching the certificate, but this gave me the same result.
We also tried using a certificate store, with equal results.
private async Task ProcessRequestAsync(string endpoint, X509Certificate2 certificate, Func<SsoSoapType, Task> action)
{
    // configure the binding for client-certificate authentication
    // before the channel factory is created
    BasicHttpsBinding binding = new BasicHttpsBinding();
    binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Certificate;

    EndpointAddress endpointAddress = new EndpointAddress(new Uri(endpoint));
    ChannelFactory<SsoSoapType> factory = new ChannelFactory<SsoSoapType>(binding, endpointAddress);
    factory.Credentials.ClientCertificate.Certificate = certificate;

    await action(factory.CreateChannel());

    if (factory.State == CommunicationState.Faulted)
        factory.Abort();
    else
        factory.Close();
}
I expect the deployed version to behave just like my local version. But apparently this isn't the case.
Can someone explain where this is going wrong?
Or is it caused by some settings in the Azure portal that need to be set accordingly?
Kind regards,
Jacco
Alright, after some thorough research I found out that my Azure App Service was on the "D1" hosting plan. This means the machine is shared between different app services and therefore cannot use the certificate store (otherwise you would be able to see other people's certificates). After upgrading to the "B1" hosting plan the issue was resolved.
I am trying to follow the tutorial for deploying a split-merge service (Azure Elastic Database... tools).
The first complication is that the doc instructs me to create an "Azure Cloud Service." The closest thing to that seems to be "Cloud service (classic)," so that's what I created.
When it came to creating a self-signed cert, I had to translate the parameters for makecert (which is deprecated and no longer seems to ship with any SDK) to the PowerShell New-SelfSignedCertificate cmdlet. The relevant parameters I passed to the cmdlet were:
Subject: CN=*.cloudapp.net
KeySpec: KeyExchange
TextExtension: 2.5.29.37={text}1.3.6.1.5.5.7.3.1,1.3.6.1.5.5.7.3.2
I finally got the certificate created/exported/uploaded, completed the service configuration file, and created the service. The Azure portal reports the web and worker roles as running, but I can't hit the service URL (403 access denied, even after it prompts me to select my certificate). I confirmed that my certificate thumbprint shows correctly in the various places in the service configuration (DataEncryptionPrimaryCertificateThumbprint, DataEncryptionPrimary, AdditionalTrustedRootCertificationAuthorities, AllowedClientCertificateThumbprints). My certificate's thumbprint also appears under the "Certificates" section of the configuration as SSL, CA, and DataEncryptionPrimary.
The only thing I can think of that would cause the access denied is something mentioned in this doc: "If you are using self-signed certificates, you will need to disable certificate chain validation." The PowerShell cmdlet it shows for disabling chain validation in that case (for an API service; no clue how that differs from my service) fails with InvalidOperation.
Is there some way for me to disable certificate chain validation for my "classic" cloud service? Other suggestions of things to check?
I want to use certificates (uploaded via the portal to the cloud service deployment) in my cloud service web role.
I would expect that, after uploading the certificates, they would be applied to my running web roles and that I could then find them via their thumbprint.
I upload the certificate via the portal by going to my cloud service, selecting "Certificates" and then uploading the .pfx and providing the password.
This is the code I am using to try to get certificates:
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);

X509Certificate2 certificate = null;
foreach (X509Certificate2 cert in store.Certificates)
{
    string certHash = cert.Thumbprint;
    if (certHash.Equals(binding.SslThumbprint, StringComparison.OrdinalIgnoreCase))
    {
        certificate = cert;
        break;
    }
}
store.Close();
This works if I register the certificates in the .csdef file, but I need to be able to load the certificates dynamically. Changes to the .csdef file require deploying a new package - which is not an option.
Azure Websites has a similar feature: you can add a WEBSITE_LOAD_CERTIFICATES setting with a wildcard value to your app settings and then find the certificates by thumbprint in code. Basically, I am looking for the same capability in cloud services.
There is no ability to dynamically load certs uploaded via the Azure portal into a Cloud Service role without first specifying them in the CSDEF/CSCFG files.
You can, however, upload your certs to some external storage (e.g. Blob storage, a SQL Azure db, or, as Poul mentioned, Key Vault) and load them from there.
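As a rough sketch of the Blob storage variant (the container and blob names are assumptions, and this uses the classic Microsoft.WindowsAzure.Storage client):

// requires the Microsoft.WindowsAzure.Storage package
private static X509Certificate2 LoadCertificateFromBlob(string storageConnectionString, string pfxPassword)
{
    var account = CloudStorageAccount.Parse(storageConnectionString);
    var container = account.CreateCloudBlobClient().GetContainerReference("certs"); // container name is an assumption
    var blob = container.GetBlockBlobReference("mycert.pfx");                       // blob name is an assumption

    byte[] pfxBytes;
    using (var ms = new MemoryStream())
    {
        blob.DownloadToStream(ms);
        pfxBytes = ms.ToArray();
    }

    // MachineKeySet avoids relying on a user-profile key store on the role VM
    return new X509Certificate2(pfxBytes, pfxPassword, X509KeyStorageFlags.MachineKeySet);
}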
HTH
I deployed a web application as a Web App on Azure App Service.
I uploaded some certificates to the Azure Portal, since the Web App runs over SSL, and we use another certificate to perform some decryption.
For the latter case I have a method (which works fine locally) to find a certificate:
public static X509Certificate2 FindCertificate(KnownCertificate certificate)
{
    return FindCertificate(StoreName.My, StoreLocation.CurrentUser, X509FindType.FindByThumbprint, certificate.Thumbprint);
}
But I get an error that the certificate with thumbprint XYZ is not found, although it is present on the Azure Portal (I had uploaded and imported it).
I am using StoreLocation.CurrentUser as suggested in THIS POST, but it still does not work. Am I using the wrong store, or what else am I missing?
EDIT: I have managed to remotely debug my WebApp, and with the Immediate Window feature of Visual Studio I executed this code:
new X509Store(StoreName.CertificateAuthority, StoreLocation.CurrentUser).Certificates.Find(findType, findValue, false).Count;
testing all possible combinations of StoreName and StoreLocation, but to no avail.
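(A sketch of that brute-force search over every store name/location combination, assuming findType and findValue as above; Debug lives in System.Diagnostics:)

foreach (StoreLocation location in Enum.GetValues(typeof(StoreLocation)))
{
    foreach (StoreName name in Enum.GetValues(typeof(StoreName)))
    {
        var store = new X509Store(name, location);
        store.Open(OpenFlags.ReadOnly);
        // count matches in this store without requiring a valid chain
        int count = store.Certificates.Find(findType, findValue, false).Count;
        Debug.WriteLine(location + "/" + name + ": " + count);
        store.Close();
    }
}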
Is it possible, as stated here, that to use a certificate for purposes other than HTTPS traffic you need a Cloud Service, and that (I suppose) App Services do not support it?
You need to add WEBSITE_LOAD_CERTIFICATES to your web app's App Settings. Set the value either to '*' or to the thumbprint of the certificate you want loaded into the web app environment. My personal preference is to set this value to '*', which means: load all certificates that have been uploaded.
After you apply this change you should be able to load your certificate from within your web app code.
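For example (a minimal sketch; with this setting, uploaded certificates surface in the CurrentUser/My store of the web app, and the thumbprint here is a placeholder):

var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
// validOnly = false so that certificates with an unverifiable chain
// (e.g. self-signed) are returned as well
var matches = store.Certificates.Find(
    X509FindType.FindByThumbprint, "<thumbprint>", false);
X509Certificate2 certificate = matches.Count > 0 ? matches[0] : null;
store.Close();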
More information on how to use certificates is available here. The article is a bit dated (by today's standards) but still relevant.
I followed the tutorial here for settings up the Azure Scheduler:
http://fabriccontroller.net/blog/posts/a-complete-overview-to-get-started-with-the-windows-azure-scheduler/
I want to run my application on an Azure Website, but it blocks me from creating my X509Certificate.
I found this article: http://blog.tylerdoerksen.com/2013/08/23/pfx-certificate-files-and-windows-azure-websites/
which points out the issue:
"Well it turns out that when you load certificates the system will use a local directory to store the key. The default location for the key is under the local user profile, and with Windows Azure Websites, there is no local user profile directory."
So, following his advice and adding the flag X509KeyStorageFlags.MachineKeySet, I get past:
CryptographicException: The system cannot find the file specified
but I now get:
CryptographicException: Access denied.
Is there really no way to use the SDK from an Azure Website?! It defeats a lot of the appeal of the Azure Scheduler if I am forced to use a WebRole instead of an Azure Website.
In this thread: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/cfe06e73-53e1-4030-b82d-53200be37647/load-privately-created-p12-cert-from-azureblob-and-have-it-be-trusted
they appear to be successfully creating an X509Certificate on an Azure Website, so what is different about mine that it throws "Access Denied" when I try to?
The problem was with using the ManagementCertificate string in the PublishSettings file... I created a self-signed certificate on my local machine using the Visual Studio console and exported both a '.cer' and a '.pfx'.
Uploaded the self-signed .cer to my Azure / Settings / Management Certificates
Bundled the .pfx with my solution and published it to Azure Web Sites
Then used the following code to create the certificate:
var certificate = new X509Certificate2(
    HttpContext.Current.Server.MapPath("~/<filename>.pfx"),
    "<password>",
    X509KeyStorageFlags.MachineKeySet);
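The loaded certificate can then be used to build the management credentials for the clients from the tutorial (a sketch, assuming the Microsoft.WindowsAzure management libraries; the subscription id is a placeholder):

// CertificateCloudCredentials is in the Microsoft.WindowsAzure namespace
var credentials = new CertificateCloudCredentials("<subscription-id>", certificate);
// pass 'credentials' to the management client(s) used in the tutorial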