I'm getting the following error when starting my spring-boot application with the azure application-insights agent:
*************************
ApplicationInsights Java Agent failed to send telemetry data.
*************************
Description:
Unable to find valid certification path to requested target.
Action:
Please import the SSL certificate from https://westeurope.livediagnostics.monitor.azure.com/QuickPulseService.svc, into your custom java key store located at:
/truststore/truststore.jks
Learn more about importing the certificate here: https://go.microsoft.com/fwlink/?linkid=2151450
This error only appears sometimes; other times it works.
It seems that different Azure servers use different certificate chains (and root CAs).
My truststore contains the following certificates:
Baltimore CyberTrust Root
Microsoft RSA TLS CA 02
The certificate for live.applicationinsights.azure.com
The certificate for in.applicationinsights.azure.com
I found the following list of all Azure certificates.
Which certificates from this list do I need to add to the truststore to make the error go away permanently?
EDIT:
I've now added all the root certs from the page to my truststore, but I'm still getting the same error message. I do however see data in Azure Insights...
Try using the Application Insights agent, which requires no modification of your Spring Boot application; you only need to provide the correct environment variables.
Here is a sample for setting the environment variable:
APPLICATIONINSIGHTS_CONNECTION_STRING = <Copy connection string from Application Insights Resource Overview>
Alternatively, you can set the connection string in the agent's applicationinsights.json configuration file; below is a sample of that:
{
"connectionString": "Copy connection string from Application Insights Resource Overview"
}
For complete information about setting the environment variables, you can refer to the document suggested by @Paizo.
Related
I am trying to follow the tutorial for deploying a split-merge service (Azure Elastic Database... tools).
The first complication is that the doc instructs me to create an "Azure Cloud Service." The closest thing to that seems to be "Cloud service (classic)," so that's what I created.
When it came to creating a self-signed cert, I had to translate the parameters for makecert (which is deprecated and no longer seems to be present in any SDK) to the PowerShell New-SelfSignedCertificate cmdlet. The relevant parameters I passed to the cmdlet were (a rough C# equivalent is sketched after this list):
Subject: CN=*.cloudapp.net
KeySpec: KeyExchange
TextExtension: 2.5.29.37={text}1.3.6.1.5.5.7.3.1,1.3.6.1.5.5.7.3.2
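If it helps anyone doing the same translation in code, a rough C# equivalent of those parameters is sketched below. This is only an illustration, assuming .NET Framework 4.7.2+ (for the CertificateRequest API); the file names and password are placeholders, and the two OIDs are the server- and client-authentication EKUs from the TextExtension above.
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
class SelfSignedCertSketch
{
    static void Main()
    {
        using (var rsa = new RSACng(2048))
        {
            var request = new CertificateRequest(
                "CN=*.cloudapp.net", rsa, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
            // Same EKUs as the TextExtension above:
            // 1.3.6.1.5.5.7.3.1 = server authentication, 1.3.6.1.5.5.7.3.2 = client authentication.
            request.CertificateExtensions.Add(new X509EnhancedKeyUsageExtension(
                new OidCollection { new Oid("1.3.6.1.5.5.7.3.1"), new Oid("1.3.6.1.5.5.7.3.2") },
                critical: false));
            using (X509Certificate2 cert = request.CreateSelfSigned(
                DateTimeOffset.UtcNow.AddDays(-1), DateTimeOffset.UtcNow.AddYears(1)))
            {
                // Export the private-key (.pfx) and public (.cer) halves for upload.
                System.IO.File.WriteAllBytes("splitmerge.pfx", cert.Export(X509ContentType.Pfx, "<password>"));
                System.IO.File.WriteAllBytes("splitmerge.cer", cert.Export(X509ContentType.Cert));
            }
        }
    }
}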
I finally got the certificate created/exported/uploaded, completed the service configuration file, and created the service. The Azure portal reports the web and worker roles as running, but I can't hit the service URL (403 access denied, even after it prompts me to select my certificate). I confirmed that my certificate thumbprint shows correctly in the various places in the service configuration (DataEncryptionPrimaryCertificateThumbprint, DataEncryptionPrimary, AdditionalTrustedRootCertificationAuthorities, AllowedClientCertificateThumbprints). My certificate's thumbprint also appears in the "Certificates" section of the configuration as SSL, CA, and DataEncryptionPrimary.
The only thing I can think of that is causing the access denied is something mentioned in this doc, "If you are using self-signed certificates, you will need to disable certificate chain validation." The PowerShell cmdlet that it shows to use to disable chain validation in that case (for an API service; no clue how that differs from my service) fails with InvalidOperation.
Is there some way for me to disable certificate chain validation for my "classic" cloud service? Other suggestions of things to check?
I have a .NET Framework 4.7 application that allows users to upload X.509 certificates in PFX or PKCS#12 format (think: "SSL certificates" with the private key included); it then loads the certificate into a System.Security.Cryptography.X509Certificates.X509Certificate2 instance. As my application code also needs to re-export the certificate, I specify the X509KeyStorageFlags.Exportable option.
When running under IIS on my production web-server, the Windows user-profile for the identity that w3wp.exe runs under is not loaded, so I do not specify the UserKeySet flag.
String filePassword = ...
Byte[] userProvidedCertificateFile = ...
using( X509Certificate2 cert = new X509Certificate2( rawData: userProvidedCertificateFile, password: filePassword, keyStorageFlags: X509KeyStorageFlags.Exportable | X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.PersistKeySet ) )
{
...
}
In early 2017 I deployed this code to an Azure App Service (aka Azure Website) instance and it worked okay - after initially failing because I did have the UserKeySet flag set (as Azure App Services do not load a user-profile certificate store).
However, since mid-2017 (possibly around May or June) my application has stopped working - I assume the Azure App Service was moved to an updated system (though Kudu reports my application is running on Windows Server 2012 (NT 6.2.9200.0)).
It currently fails with one of two error messages, depending on the input:
CryptographicException "The system cannot find the file specified."
CryptographicException "Access denied."
I wrote an extensive test-case that tries different combinations of X509Certificate2 constructor arguments, as well as with and without the WEBSITE_LOAD_CERTIFICATES Azure application setting (a trimmed-down sketch of the harness appears after the findings below).
Here are my findings when working with an uploaded PFX/PKCS#12 certificate file that contains a private key and is not password-protected:
Running under IIS Express on my development box:
Loading the certificate file always succeeds, regardless of X509KeyStorageFlags value.
Exporting the certificate file requires at least X509KeyStorageFlags.Exportable.
Running under IIS on a production server (not an Azure App Service) where the w3wp.exe user-profile is not loaded:
Loading the certificate file requires that X509KeyStorageFlags.UserKeySet is not set, but otherwise always succeeds.
Exporting the certificate requires at least X509KeyStorageFlags.Exportable; without it, the export fails with "Key not valid for use in specified state."
Running under Azure App Service, without WEBSITE_LOAD_CERTIFICATES defined:
Loading the certificate with MachineKeySet set (and UserKeySet not set) fails with a CryptographicException: "Access denied."
Loading the certificate with any other keyStorageFlags value, including combinations like UserKeySet | MachineKeySet | Exportable or just DefaultKeySet, fails with a CryptographicException: "The system cannot find the file specified."
As I was not able to load the certificate at all I could not test exporting certificates.
Running under Azure App Service, with WEBSITE_LOAD_CERTIFICATES defined as the thumbprint of the certificate that was uploaded:
Loading the certificate with MachineKeySet set (and UserKeySet not set) fails with a CryptographicException: "Access denied."
Values like UserKeySet, UserKeySet | MachineKeySet, and Exportable do work, however.
Exporting certificates requires X509KeyStorageFlags.Exportable - same as all other environments.
So WEBSITE_LOAD_CERTIFICATES seems to work - but only if the certificate being loaded into an X509Certificate2 instance has the same thumbprint as the one specified in WEBSITE_LOAD_CERTIFICATES.
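For reference, the findings above came from a harness roughly like the following trimmed-down sketch. The names pfxBytes and password are placeholders for the uploaded file and its password, and the flag list is only a subset of what I actually probed.
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
static class KeyStorageFlagProbe
{
    // Try a handful of key-storage flag combinations and log whether
    // construction and re-export succeed.
    public static void Run(Byte[] pfxBytes, String password)
    {
        X509KeyStorageFlags[] combos =
        {
            X509KeyStorageFlags.DefaultKeySet,
            X509KeyStorageFlags.MachineKeySet,
            X509KeyStorageFlags.UserKeySet,
            X509KeyStorageFlags.UserKeySet | X509KeyStorageFlags.MachineKeySet,
            X509KeyStorageFlags.Exportable | X509KeyStorageFlags.MachineKeySet,
            X509KeyStorageFlags.Exportable | X509KeyStorageFlags.UserKeySet
        };
        foreach (X509KeyStorageFlags flags in combos)
        {
            try
            {
                using (var cert = new X509Certificate2(pfxBytes, password, flags))
                {
                    Console.WriteLine("{0}: load OK", flags);
                    try
                    {
                        cert.Export(X509ContentType.Pfx, password);
                        Console.WriteLine("{0}: export OK", flags);
                    }
                    catch (CryptographicException ex)
                    {
                        Console.WriteLine("{0}: export failed - {1}", flags, ex.Message);
                    }
                }
            }
            catch (CryptographicException ex)
            {
                Console.WriteLine("{0}: load failed - {1}", flags, ex.Message);
            }
        }
    }
}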
Is there any way around this?
I thought more about how WEBSITE_LOAD_CERTIFICATES seems to make a difference - but I had a funny feeling about it really only working with the certificate thumbprint that's specified.
So I changed the WEBSITE_LOAD_CERTIFICATES value to a dummy thumbprint - an arbitrary 40-character Base16 string, and re-ran my test - and it worked, even though the thumbprint had no relation to the certificate I was working with.
It seems that simply having WEBSITE_LOAD_CERTIFICATES defined enables the Azure website's ability to use X509Certificate and X509Certificate2 - even if the loaded certificate is never installed into, or even retrieved from, any system-wide or user-profile certificate store (as seen in the Certificates snap-in for MMC.exe).
This behaviour does not seem to be documented anywhere, so I'm mentioning it here.
I've contacted Azure support about this.
Regarding the behavioural change I noticed at mid-year - it's very likely that I originally had WEBSITE_LOAD_CERTIFICATES set for a testing certificate we were using. When I made a new deployment around June I must have reset the Application settings, which removed WEBSITE_LOAD_CERTIFICATES and so broke X509Certificate2 instances.
TL;DR:
Open your Azure App Service (Azure Website) blade in portal.azure.com
Go to the Application settings page
Scroll to App settings
Add a new entry key: WEBSITE_LOAD_CERTIFICATES, and provide a dummy (fake, made-up, randomly-generated) value for it.
The X509Certificate2( Byte[], String, X509KeyStorageFlags ) constructor will now work, but note:
keyStorageFlags: X509KeyStorageFlags.MachineKeySet will fail with "Access denied"
All other keyStorageFlags values, including MachineKeySet | UserKeySet, will succeed (i.e. MachineKeySet by itself fails, but MachineKeySet combined with other flags works).
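For example, once the dummy WEBSITE_LOAD_CERTIFICATES value is in place, a call shaped like the sketch below loads and re-exports the certificate on App Service. The names pfxBytes and pfxPassword are placeholders for the uploaded file and its password.
using System;
using System.Security.Cryptography.X509Certificates;
static class UploadedPfxLoader
{
    // Works on App Service once WEBSITE_LOAD_CERTIFICATES is defined;
    // bare MachineKeySet would still fail with "Access denied".
    public static Byte[] ReExport(Byte[] pfxBytes, String pfxPassword)
    {
        using (var cert = new X509Certificate2(
            pfxBytes,
            pfxPassword,
            X509KeyStorageFlags.UserKeySet |
            X509KeyStorageFlags.MachineKeySet |
            X509KeyStorageFlags.Exportable))
        {
            return cert.Export(X509ContentType.Pfx, pfxPassword);
        }
    }
}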
I deployed a web application as a Web App on Azure App Service.
I uploaded some certificates to the Azure Portal, since the Web App runs over SSL, and we use another certificate to perform some decryption.
For the latter case I have a method (which works fine locally) to find a certificate:
public static X509Certificate2 FindCertificate(KnownCertificate certificate)
{
return FindCertificate(StoreName.My, StoreLocation.CurrentUser, X509FindType.FindByThumbprint, certificate.Thumbprint);
}
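The overload that does the actual lookup is not shown above; it boils down to opening the requested store and filtering by the given find type and value, roughly like this sketch (the ReadOnly open and first-match return are my simplification, not necessarily the real helper):
using System.Security.Cryptography.X509Certificates;
// Sketch of the lookup overload: open the requested store, search by the
// given find type/value, and return the first match (or null if absent).
public static X509Certificate2 FindCertificate(
    StoreName storeName, StoreLocation storeLocation, X509FindType findType, string findValue)
{
    var store = new X509Store(storeName, storeLocation);
    try
    {
        store.Open(OpenFlags.ReadOnly | OpenFlags.OpenExistingOnly);
        X509Certificate2Collection matches = store.Certificates.Find(findType, findValue, false);
        return matches.Count > 0 ? matches[0] : null;
    }
    finally
    {
        store.Close();
    }
}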
But I get an error that the certificate with thumbprint XYZ is not found, although it is present on the Azure Portal (I had uploaded and imported it).
I am using StoreLocation.CurrentUser as suggested in THIS POST but it still does not work. Am I using the wrong store or what else am I missing?
EDIT: I have managed to remotely debug my WebApp, and with the Immediate Window feature of Visual Studio I have executed this code:
new X509Store(StoreName.CertificateAuthority, StoreLocation.CurrentUser).Certificates.Find(findType, findValue, false).Count;
testing all possible combinations of StoreNames and StoreLocations, but to no avail.
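Expressed as a loop rather than one-off Immediate Window calls, the probe looked roughly like this sketch (the thumbprint argument is a placeholder for the one I was searching for):
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
// Diagnostic: report how many certificates match the thumbprint in every
// StoreName/StoreLocation combination.
public static void DumpMatches(string thumbprint)
{
    foreach (StoreLocation location in Enum.GetValues(typeof(StoreLocation)))
    {
        foreach (StoreName name in Enum.GetValues(typeof(StoreName)))
        {
            var store = new X509Store(name, location);
            try
            {
                store.Open(OpenFlags.ReadOnly | OpenFlags.OpenExistingOnly);
                int count = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false).Count;
                Console.WriteLine("{0}/{1}: {2}", location, name, count);
            }
            catch (CryptographicException)
            {
                Console.WriteLine("{0}/{1}: store could not be opened", location, name);
            }
            finally
            {
                store.Close();
            }
        }
    }
}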
Is it possible, as stated here, that to use a certificate for purposes other than HTTPS traffic you would need a Cloud Service, and that (I suppose) App Services do not support this?
You need to add WEBSITE_LOAD_CERTIFICATES to your web app's App Settings. Set the value either to * or to the thumbprint of the certificate you want loaded into the web app environment. My personal preference is to set this value to *, which means: load all certificates that have been uploaded.
After you apply this change you should be able to load your certificate from within your web app code.
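Once the setting is in place, the uploaded certificate is surfaced in the current user's personal store, so a lookup along these lines should work (the thumbprint is a placeholder for the one you uploaded, and returning the first match is just a sketch):
using System.Security.Cryptography.X509Certificates;
// WEBSITE_LOAD_CERTIFICATES surfaces uploaded certificates in CurrentUser\My.
public static X509Certificate2 LoadUploadedCertificate(string thumbprint)
{
    var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    try
    {
        store.Open(OpenFlags.ReadOnly);
        X509Certificate2Collection found =
            store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
        return found.Count > 0 ? found[0] : null;
    }
    finally
    {
        store.Close();
    }
}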
More information on how to use certificates is available here. The article is a bit dated (by today's standards) but still relevant.
I followed the tutorial here for setting up the Azure Scheduler:
http://fabriccontroller.net/blog/posts/a-complete-overview-to-get-started-with-the-windows-azure-scheduler/
I want to run my application on an Azure Website but it is blocking me from creating my X509Certificate.
I found this article: http://blog.tylerdoerksen.com/2013/08/23/pfx-certificate-files-and-windows-azure-websites/
Which points out the issue:
"Well it turns out that when you load certificates the system will use a local directory to store the key (??)
The default location for the key is under the local user profile, and with Windows Azure Websites, there is no local user profile directory."
So, following his advice and adding the flag X509KeyStorageFlags.MachineKeySet, I get past:
CryptographicException: The system cannot find the file specified
but I now get:
CryptographicException: Access denied.
Is there really no way to use the SDK from an Azure Website? It defeats a lot of the appeal of the Azure Scheduler if I am forced into using a WebRole instead of an Azure Website.
In this thread: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/cfe06e73-53e1-4030-b82d-53200be37647/load-privately-created-p12-cert-from-azureblob-and-have-it-be-trusted
It appears as if they are successfully creating an X509Certificate on an Azure Website, so what is different about mine that it throws "Access Denied" when I try to?
The problem was with using the ManagementCertificate string in the PublishSettings file... I created a self-signed certificate on my local machine using the Visual Studio console and exported both a '.cer' and a '.pfx'.
Uploaded the self-signed .cer into my Azure/Settings/Management Certificates
Bundled the .pfx with my solution and published to Azure Web Sites
Then used the following code to create the certificate:
var certificate = new X509Certificate2(
HttpContext.Current.Server.MapPath("~/<filename>.pfx"), "<password>", X509KeyStorageFlags.MachineKeySet);
I have a problem upgrading my deployment to Windows Server 2012: my deployment works fine with osFamily=2 compiled against .NET 4, but fails with .NET 4.5 and osFamily=3.
The exception I see when I remote into the VM is "Keyset does not exist", which seems to be related to certificates. My program uses the certificates to encrypt a stream and should be able to use the same certificates to decode that stream after I deploy it.
I checked the certificates on the VM; they are installed fine, in the right place.
So I suspect this is an issue with a different security policy in 2012 that prevents my role from getting the key from the certificates.
This has blocked me for a while, so thanks a lot for any clue!
"Keyset does not exist" typically refers to an error where your program is trying to access the private key of a certificate and is unable to do so, either because the private key does not exist or because the process has no permission to access it.
You will need to find the certificate in question in your certificate store and verify that it contains a private key (that will show up in the properties of the certificate).
And then verify that your process/application pool has permission to access the private key by right-clicking the certificate in the certificate store and choosing All Tasks -> Manage Private Keys. From there, make sure to add the required users to the allowed list.
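If you want to check this from the role's own code, a small diagnostic along these lines can confirm whether the private key is present and accessible (the thumbprint is a placeholder; adjust the store name/location to wherever your certificate is installed):
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
// Find the certificate and confirm its private key is present and readable.
// Accessing PrivateKey is what surfaces "Keyset does not exist" / "Access denied".
public static void CheckPrivateKeyAccess(string thumbprint)
{
    var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
    store.Open(OpenFlags.ReadOnly);
    try
    {
        var matches = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
        if (matches.Count == 0)
        {
            Console.WriteLine("Certificate not found in LocalMachine\\My.");
            return;
        }
        X509Certificate2 cert = matches[0];
        Console.WriteLine("HasPrivateKey: {0}", cert.HasPrivateKey);
        try
        {
            var key = cert.PrivateKey; // throws if the key is missing or access is denied
            Console.WriteLine("Private key accessible: {0}", key != null);
        }
        catch (CryptographicException ex)
        {
            Console.WriteLine("Private key not accessible: {0}", ex.Message);
        }
    }
    finally
    {
        store.Close();
    }
}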
Hope this helps