Azure App Service "The Client Certificate Credentials Were Not Recognized"

When I run my ASP.NET Core API locally (in release mode), it makes an external call over WCF using a client certificate (X509Certificate2) and returns the data correctly. But when the API is deployed as an Azure App Service, the call fails with "The Client Certificate Credentials Were Not Recognized". The X509Certificate2 is loaded from the filesystem correctly (verified via remote debugging).
I've tried making the call with a plain HttpClient and attaching the certificate, but this gave me the same result.
We also tried using a certificate store, with the same result.
private async Task ProcessRequestAsync(string endpoint, X509Certificate2 certificate, Func<SsoSoapType, Task> action)
{
    // Configure transport security to require a client certificate
    // before the channel factory is created.
    BasicHttpsBinding binding = new BasicHttpsBinding();
    binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Certificate;

    EndpointAddress endpointAddress = new EndpointAddress(new Uri(endpoint));
    ChannelFactory<SsoSoapType> factory = new ChannelFactory<SsoSoapType>(binding, endpointAddress);
    factory.Credentials.ClientCertificate.Certificate = certificate;

    try
    {
        await action(factory.CreateChannel());
    }
    finally
    {
        // Close the factory even if the action throws; abort if faulted.
        if (factory.State == CommunicationState.Faulted)
            factory.Abort();
        else
            factory.Close();
    }
}
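For context, a hypothetical call site for this helper might look like the following. The endpoint URL, PFX path/password, and the operation name on the SsoSoapType contract are all placeholders, since the actual contract is not shown in the question:

```csharp
// Hypothetical usage: load the client certificate from a PFX file and
// invoke a SOAP operation through the helper. All names are illustrative.
var cert = new X509Certificate2("client.pfx", "pfx-password");
await ProcessRequestAsync(
    "https://sso.example.com/service.svc",
    cert,
    async channel =>
    {
        // call whatever operation the SsoSoapType contract actually exposes
        await channel.SomeOperationAsync();
    });
```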
I expect the deployed version to behave just like my local version, but apparently this isn't the case.
Can someone explain where this is going wrong?
Or is it caused by some settings in the Azure portal that need to be set accordingly?
Kind regards,
Jacco

Alright, after some thorough research I found out that my Azure App Service was on the "D1" hosting plan. On that tier the machine is shared between different app services, so it can't use the certificate store (you would otherwise be able to see other tenants' certificates). After upgrading to the "B1" hosting plan the issue was resolved.

Related

Application insights not generated for .Net core app deployed on service fabric Linux cluster

I have a .NET Core application deployed on a Service Fabric Linux cluster. Application Insights is configured in the app.
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    var aiOptions = new ApplicationInsights.AspNetCore.Extensions.ApplicationInsightsServiceOptions
    {
        EnableAdaptiveSampling = false,
        EnableQuickPulseMetricStream = false,
        InstrumentationKey = "xxx"
    };
    services.AddApplicationInsightsTelemetry(aiOptions);
}
I have a controller class that has some action methods and logs the information.
[HttpPost]
public ActionResult actionMethod(...)
{
TraceLine("------------------------------------");
//some code
}
private static void TraceLine(string msg)
{
msg = $">> {DateTime.UtcNow.ToString("o")}: {msg}";
Log.Information(msg);
}
I am using Serilog, configured in appsettings.json & Program.cs
When I hit the action method directly from my local machine (without hosting it even on a local SF cluster) via Postman, I see App Insights telemetry getting generated and pushed to Azure.
[Screenshot: Azure App Insights telemetry]
But when I hit the action method deployed on the Azure Service Fabric cluster, I don't see any telemetry getting generated.
What am I missing here?
Any help is much appreciated!
Well, we need to check a few things here:
1) The App Insights URL and the instrumentation key in the deployment parameter file for the cloud-hosted cluster (Cloud.xml).
2) After checking Cloud.xml, the best next step is to access the log files and check what the actual problem is.
There's a description here which explains how to discover where the log files are stored.
You can use RDP to access the machine, which is explained here.
I was able to solve the issue by using Microsoft.ApplicationInsights.ServiceFabric.Native SDK in my application to log app insights.
Refer .NetCore section in ApplicationInsights-ServiceFabric on how to configure insights for service fabric application.
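As a rough sketch of what that configuration can look like (helper names are from the ApplicationInsights-ServiceFabric README as best I recall, so verify them against the repository; `serviceContext` is the `ServiceContext` handed to your service at startup):

```csharp
// Sketch: register the Service Fabric telemetry initializer with ASP.NET Core
// DI so each telemetry item is stamped with service/node context, then enable
// Application Insights. Assumes the
// Microsoft.ApplicationInsights.ServiceFabric.Native package is referenced.
public void ConfigureServices(IServiceCollection services)
{
    services.AddSingleton<ITelemetryInitializer>(
        sp => FabricTelemetryInitializerExtension
                  .CreateFabricTelemetryInitializer(serviceContext));
    services.AddApplicationInsightsTelemetry("xxx");
}
```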

Why does my IdentityServer4 application fail to start in an Azure WebApp hosting environment?

I'm attempting to put together a simple identity server following the directions here: http://docs.identityserver.io/en/latest/ . The project runs fine locally; I have only created a new project and made no modifications. Running locally, it launches and I can load the .well-known/openid-configuration endpoint with no issues.
I take this bare-bones project, attempt to deploy it to an Azure Web App, and get:
HTTP Error 502.5 - Process Failure
It appears that the application fails to start in Azure.
I can publish a .NET Core MVC application with no problem, but with this IS4 template I always receive the process failure. What am I missing? Should this not work out of the box?
The accepted answer is incorrect and insecure. That default code is there for a reason: during development you have a developer signing credential; for production you have to provide your own signing credential. See the following answer:
How do I configure "key material" in Identity Server 4 to use SQL, KeyVault, or any other system?
You need to change the code in the Startup.cs file. If you don't run it in the development environment, it throws the exception ("need to configure key material"). I just commented out that code; then it works in Azure.
public void ConfigureServices(IServiceCollection services)
{
    // uncomment, if you want to add an MVC-based UI
    //services.AddMvc().SetCompatibilityVersion(Microsoft.AspNetCore.Mvc.CompatibilityVersion.Version_2_1);

    var builder = services.AddIdentityServer()
        .AddInMemoryIdentityResources(Config.GetIdentityResources())
        .AddInMemoryApiResources(Config.GetApis())
        .AddInMemoryClients(Config.GetClients());

    //if (Environment.IsDevelopment())
    //{
        builder.AddDeveloperSigningCredential();
    //}
    //else
    //{
    //    throw new Exception("need to configure key material");
    //}
}
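Rather than shipping the developer credential to production, the environment check can be kept and a persistent signing certificate loaded instead. A hedged sketch (the PFX path and password are placeholders; loading from a certificate store or Key Vault works the same way via the `AddSigningCredential` overloads):

```csharp
// Sketch: developer credential only in Development; a real signing
// certificate everywhere else. File name and password are illustrative.
var builder = services.AddIdentityServer()
    .AddInMemoryIdentityResources(Config.GetIdentityResources())
    .AddInMemoryApiResources(Config.GetApis())
    .AddInMemoryClients(Config.GetClients());

if (Environment.IsDevelopment())
{
    builder.AddDeveloperSigningCredential();
}
else
{
    // persistent key material so tokens survive restarts and scale-out
    var signingCert = new X509Certificate2("signing.pfx", "pfx-password");
    builder.AddSigningCredential(signingCert);
}
```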

web application in stage can't connect to azure cloud, working fine in development environment

While working on one of my projects, I am uploading documents from our content server (we are using Documentum) to the Azure cloud, as we want to save some space on our content server. I have written the code below to upload my zipped files to Azure. In the development environment it works fine, but in the stage environment it throws timeout errors:
"An unknown failure occurred : Connection timed out: connect"
Also, when I try to download a file (which works fine in development), in stage it shows "An error occurred while enumerating the result". It doesn't throw any exception, nor is there any error.
My code is:
if (Boolean.parseBoolean(azureCloudUseDefaultContainer)) {
    container = client.getContainerReference(azureCloudDefaultContainer);
} else {
    container = client.getContainerReference(DEFAULT_CONTAINER);
}
container.createIfNotExists();

CloudBlockBlob blob = container.getBlockBlobReference(assetName);
BlobRequestOptions blobRequestOptions = new BlobRequestOptions();
blobRequestOptions.setTimeoutIntervalInMs(10000);
blobRequestOptions.setRetryPolicyFactory(new RetryLinearRetry(3000, 3));

//blob.upload(new FileInputStream(file), file.length());
// If the blob already exists on the cloud, the asset was uploaded in the past,
// so there is no need to upload it again (avoids duplicate blobs).
if (!blob.exists()) {
    blob.upload(new FileInputStream(file), file.length(), null, blobRequestOptions, null);
}
Is this a configuration issue or a network problem? What is your opinion?
If you are using an SSL certificate (it is not clearly stated in your question, so this is only my assumption), then you must also configure it for the staging slot. Have you seen the Azure documentation about configuring deployment slots? Some settings are not swapped when using slots in App Service.
Settings that are swapped:
General settings - such as framework version, 32/64-bit, Web sockets
App settings (can be configured to stick to a slot)
Connection strings (can be configured to stick to a slot)
Handler mappings
Monitoring and diagnostic settings
WebJobs content
Settings that are not swapped:
Publishing endpoints
Custom Domain Names
SSL certificates and bindings
Scale settings
WebJobs schedulers
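App settings and connection strings can additionally be pinned to a slot so they never move during a swap. A hedged sketch using the Azure CLI (resource group, app, slot, and setting names are placeholders):

```shell
# Mark MY_SETTING as sticky to the staging slot so it is not swapped
az webapp config appsettings set \
  --resource-group my-rg \
  --name my-app \
  --slot staging \
  --slot-settings MY_SETTING=value
```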

Certificate not found on Azure Web App

I deployed a web application as a Web App on Azure App Service.
I uploaded some certificates to the Azure Portal, since the Web App runs over SSL, and we use another certificate to perform some decryption.
For the latter case I have a method (which works fine locally) to find a certificate:
public static X509Certificate2 FindCertificate(KnownCertificate certificate)
{
return FindCertificate(StoreName.My, StoreLocation.CurrentUser, X509FindType.FindByThumbprint, certificate.Thumbprint);
}
But I get an error that the certificate with thumbprint XYZ is not found, although it is present on the Azure Portal (I uploaded and imported it).
I am using StoreLocation.CurrentUser as suggested in THIS POST, but it still does not work. Am I using the wrong store, or what else am I missing?
EDIT: I have managed to remotely debug my Web App, and with the Immediate Window feature of Visual Studio I executed this code:
new X509Store(StoreName.CertificateAuthority, StoreLocation.CurrentUser).Certificates.Find(findType, findValue, false).Count;
testing all possible combinations of StoreName and StoreLocation, but to no avail.
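That combination sweep can be sketched as a small diagnostic like the one below (`thumbprint` is whatever value you are searching for; stores that do not exist on the machine are skipped):

```csharp
using System;
using System.Security.Cryptography.X509Certificates;

// Diagnostic sketch: try every StoreName/StoreLocation combination and
// report how many certificates match the given thumbprint.
public static void DumpMatches(string thumbprint)
{
    foreach (StoreLocation location in Enum.GetValues(typeof(StoreLocation)))
    {
        foreach (StoreName name in Enum.GetValues(typeof(StoreName)))
        {
            try
            {
                using (var store = new X509Store(name, location))
                {
                    store.Open(OpenFlags.ReadOnly | OpenFlags.OpenExistingOnly);
                    int count = store.Certificates
                        .Find(X509FindType.FindByThumbprint, thumbprint, false)
                        .Count;
                    Console.WriteLine($"{location}/{name}: {count}");
                }
            }
            catch (Exception)
            {
                // store does not exist or is not accessible in this sandbox
            }
        }
    }
}
```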
Is it possible, as stated here, that using a certificate for purposes other than HTTPS traffic requires a Cloud Service, and that (I suppose) App Services do not support it?
You need to add WEBSITE_LOAD_CERTIFICATES to your web app's App Settings. Set the value to either * or to the thumbprint of the certificate you want loaded into the web app environment. My personal preference is *, which means: load all certificates that have been uploaded.
After you apply this change you should be able to load your certificate from within your web app code.
More information on how to use certificates is available here. The article is a bit dated (by today's standards) but still relevant.
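Once WEBSITE_LOAD_CERTIFICATES is set, App Service exposes the uploaded certificate in the CurrentUser\My store, so a lookup like the following should succeed (a sketch; the thumbprint value is a placeholder):

```csharp
using System.Security.Cryptography.X509Certificates;

// Sketch: load a certificate made available by WEBSITE_LOAD_CERTIFICATES.
// App Service places uploaded certificates in CurrentUser\My.
public static X509Certificate2 LoadFromWebApp(string thumbprint)
{
    using (var store = new X509Store(StoreName.My, StoreLocation.CurrentUser))
    {
        store.Open(OpenFlags.ReadOnly);
        // validOnly: false so self-signed/test certificates are also found
        var matches = store.Certificates.Find(
            X509FindType.FindByThumbprint, thumbprint, false);
        return matches.Count > 0 ? matches[0] : null;
    }
}
```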

Cannot add a certificate in local Azure Web role VM

I have some difficulties managing Azure certificates from my code.
I'm trying to use the Azure REST Services API (i.e., creating HTTP requests) in order to check my services' state from my Azure web site.
It works well when debugging locally, but my web role seems to have some limitations with the certificate store. Below is what I do:
// This method stores a certificate from the resources
// into the local machine certificate store.
private X509Certificate2 StoreCertificate(Certificates certName)
{
    // get certificate from resources
    X509Certificate2 newCert = this.GetCertificateFromResources(certName);

    // store it into the local certificate store
    if (newCert != null)
    {
        var store = new X509Store(
            StoreName.TrustedPeople,
            StoreLocation.LocalMachine
        );
        store.Open(OpenFlags.ReadWrite);
        store.Add(newCert);
        store.Close();
    }

    // reload it from the store to verify it was added
    return this.GetCertificate(certName);
}
An "Access is denied" error occurs when I try to add the certificate.
Any ideas? Can I store certificates in the Azure VM?
Should I use a special location to store them?
Are you using a Cloud Service (web/worker role)? If so, which OS family? I've seen some reports that with a worker role using OS family 3, you need to run the role with elevated permissions in order to read certs from the local cert store. Not sure if that applies to web roles as well and adding to the cert store.
Has the service cert been added via the Azure management portal as well (or via the REST API or PowerShell)?
Well, I have found a lot of things:
I was deploying my code to a web site, so I cannot add a certificate to the shared VM in Azure.
I tried connecting to the VM in a remote desktop session and added a certificate manually.
Even in this case, I get a (403) Forbidden error wrapped in an InvalidOperationException.
So here is the current state:
a certificate has been created (makecert) and added manually to the VM that hosts my web role (deployed in a service)
this certificate has been uploaded to both the Azure account certificates and the Azure service certificates (of the service that deploys my web role)
the thumbprint of this certificate has been added to my code, and I can access the certificate when my code executes
So 2 questions:
Is there something more I should do with my certificate?
When I try my web role locally in the Azure emulator, everything works. Is there a special setting to update during the publish/deploy step?
Thanks for your help.
To save other developers' time, here is what I did to solve the main problem:
connect to the VM that hosts the web role: see there
create the certificate: see there
optionally, manage the certificates with the certificates manager (mmc.exe)
Then the certificate is available from the code.
