HttpRequest with Certificate fails in Azure Web-role - azure

On my deployed Azure web role I try to send a GET request to a web server that authorizes the request based on the certificate provided by the requesting client.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
var filepath = Path.GetTempPath();
string certpath = Path.Combine(filepath, "somecert.cer");
Trc.Information(string.Format("Certificate at {0} will be used", certpath));
X509Certificate cert = X509Certificate.CreateFromCertFile(certpath);
WebRequest request = WebRequest.Create(endPoint);
((HttpWebRequest)request).ProtocolVersion = HttpVersion.Version10;
((HttpWebRequest)request).IfModifiedSince = DateTime.Now;
((HttpWebRequest)request).AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
((HttpWebRequest)request).ClientCertificates.Add(cert);
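The original snippet stops before the request is actually executed; presumably it continues roughly like this (my addition for context, assuming the usual System.Net and System.IO usings), and GetResponse is the call that fails once deployed:
using (var response = (HttpWebResponse)request.GetResponse()) // this is the call that throws on Azure
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string body = reader.ReadToEnd();
    // ... process the response
}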
The above code works perfectly in the Azure emulator, but not when deployed; then the call to GetResponse always fails.
System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.
at System.Net.HttpWebRequest.GetResponse()
at XYZ.Import.DataImport.OpenResponseStream(String endPoint)
I read through many of the existing discussion threads where using SecurityProtocolType.Ssl3 solved the problem, but it does not in my case. Are there further debugging options, considering that it is running on Azure?
Update 1
I tried all the debugging steps suggested by Alexey. They are really helpful but quite hard to execute properly on Azure.
Here is what I came up with after at least two hours.
I used the System.Net settings supplied by this post [1].
At first the output was not present in the expected folder; the file system permissions on the folder needed to be tweaked so that the NT AUTHORITY\NETWORK SERVICE account is allowed to write to the target folder.
After that the file still didn't show up as expected, because there seems to be a problem when only an app.config is supplied. See this thread [2]. So I provided an app.config, a [ProjectAssembly].dll.config, and a web.config with the content from the post [1].
To test whether the problem is related to user rights, I tested with and without elevated rights as shown in post [3].
Beforehand I changed the test project to execute in two modes. The first mode tries to load the public part from the *.cer file as shown in the code above.
The other version uses the private certificate, which is loaded with this command:
X509Certificate cert = new X509Certificate2(certpath, "MYPASSWORD", X509KeyStorageFlags.MachineKeySet);
As a result I gained the following insights:
When using the public part (.cer), it only works when the rights are elevated and the private cert is imported into the machine store.
When using the private key (.pfx), it only works if the private cert is imported into the machine store.
The second setup (.pfx) even runs without elevated rights.
While debugging, the CAPI2 log only contained information with no direct relevance. The System.Net diagnostics from point one above contained this:
System.Net Information: 0 : [1756] SecureChannel#50346327 - Cannot find the certificate in either the LocalMachine store or the CurrentUser store.
[snip]
System.Net Error: 0 : [1756] Exception in HttpWebRequest#36963566:: - The request was aborted: Could not create SSL/TLS secure channel..
System.Net Error: 0 : [1756] Exception in HttpWebRequest#36963566::GetResponse - The request was aborted: Could not create SSL/TLS secure channel..
From this output, and from the fact that the situation changes when elevated rights are used, I would deduce that I should look further into the rights of the running web role in combination with the certificate store (see the sketch after the references below).
[1] http://msdn.microsoft.com/de-de/library/ty48b824(v=vs.110).aspx
[2] Combined Azure web role and worker role project not seeing app.config when deployed
[3] http://blogs.msdn.com/b/farida/archive/2012/05/01/run-the-azure-worker-role-in-elevated-mode-to-register-httplistener.aspx
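Following the deduction above, here is a minimal sketch (my assumption, not the fix from the original post) that loads the client certificate from the LocalMachine\My store by thumbprint instead of from a temp file, so the role depends neither on file permissions nor on a password in code (requires System.Security.Cryptography.X509Certificates; the thumbprint is a placeholder):
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly | OpenFlags.OpenExistingOnly);
try
{
    // "REPLACE_WITH_THUMBPRINT" is a placeholder for the certificate's real thumbprint
    var matches = store.Certificates.Find(
        X509FindType.FindByThumbprint, "REPLACE_WITH_THUMBPRINT", false);
    if (matches.Count == 0)
        throw new InvalidOperationException("Client certificate not found in LocalMachine\\My.");
    ((HttpWebRequest)request).ClientCertificates.Add(matches[0]);
}
finally
{
    store.Close();
}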

Remove SecurityProtocolType.Ssl3.
Turn on the CAPI2 log and check it for errors (on your local machine).
If there is no error, then check the location of the CA and intermediate certificates.
Turn on System.Net diagnostics and check that log for errors.
This article describes how to find and turn on the CAPI2 event log.
Hope this helps.

Related

Domino App Service Pack Installation, failed to start up IAM services as in the tutorial

I configured the Domino Credential Store.
I modified the Domino Proton server settings to enable client authentication.
I created the Vault ID.
I created the iam-store.nsf from the template, with this error message:
Error executing agent 'DeleteExpiredDocs' in 'iam-store.nsf'. Agent signer 'Domino Template Development/Domino': You are not authorized to perform that operation
I gave the IAM's functional ID access to the database.
I installed the IAM services for Domino, with the following message:
result screen of install domino-iam-service-2.2.0.tgz
Since I would like to configure the IAM services for my testing server, I selected the pilot mode setup.
According to the tutorial (https://doc.cwpcollaboration.com/appdevpack/docs/en/iam_landing_page.html), I could access the demo database with the anonymous setting of the Proton server:
C:\src\domino-db\package>npm run ptest -- read serv.org.com:3003/App\node-demo.nsf -q "Form = 'Contact' and LastName = 'Moody'"
This reads the content of the demo database, and the pilot mode was configured successfully.
What am I doing wrong?
Error when trying to start up the pilot mode of the IAM service
I have put all the certificates into the config/certs folder; they were created by create_certs.cmd from the tutorial, and I have converted the ca.crt into ca.pem.
Besides, I also put the keys created by ProtonCA into config/certs.
Keys created by ProtonMicroCA
According to the tutorial, I modified make_certs.cmd as follows:
make_certs.cmd
The certificates are posted to the config/certs directory.
I'm not sure about your complete setup; a support ticket would help us diagnose this better. There should be a ca folder in the config/certs directory that contains any root certs you're using (like the ca.pem you have).

Credentials manager for Azure Data Factory not working

Good day!
I am working on moving files via Azure Data Factory from an on-prem file store and/or FTP site to Azure Blob storage using the Copy Data activity. When setting up security access, I am using the credential manager. However, when clicking 'Set credential', a 'Preparing...' message shows for a split moment, and then nothing happens and the box is left blank. What exactly is the credentials manager? Is it a separate application which needs to be installed, or the Windows Credential Manager available via Administrative Tools? I used IE for this. In Chrome it tries to install a ClickOnce app, which fails to install with this error log (googling it reveals nothing). Does anyone know the solution?
IDENTITIES
Deployment Identity : CredentialsManager.application, Version=1.1.6273.1, Culture=neutral, PublicKeyToken=c3bce3770c238a49, processorArchitecture=msil
APPLICATION SUMMARY
* Online only application.
* Trust url parameter is set.
ERROR SUMMARY
Below is a summary of the errors, details of these errors are listed later in the log.
* Activation of C:\Users\YToropov\Downloads\CredentialsManager.application resulted in exception. Following failure messages were detected:
+ Deployment and application do not have matching security zones.
COMPONENT STORE TRANSACTION FAILURE SUMMARY
No transaction error was detected.
WARNINGS
There were no warnings during this operation.
OPERATION PROGRESS STATUS
* [4/5/2017 5:50:08 AM] : Activation of C:\Users\YToropov\Downloads\CredentialsManager.application has started.
* [4/5/2017 5:50:08 AM] : Processing of deployment manifest has successfully completed.
* [4/5/2017 5:50:08 AM] : Installation of the application has started.
ERROR DETAILS
Following errors were detected during this operation.
* [4/5/2017 5:50:08 AM] System.Deployment.Application.InvalidDeploymentException (Zone)
- Deployment and application do not have matching security zones.
- Source: System.Deployment
- Stack trace:
at System.Deployment.Application.DownloadManager.DownloadApplicationManifest(AssemblyManifest deploymentManifest, String targetDir, Uri deploymentUri, IDownloadNotification notification, DownloadOptions options, Uri& appSourceUri, String& appManifestPath)
at System.Deployment.Application.ApplicationActivator.DownloadApplication(SubscriptionState subState, ActivationDescription actDesc, Int64 transactionId, TempDirectory& downloadTemp)
at System.Deployment.Application.ApplicationActivator.InstallApplication(SubscriptionState& subState, ActivationDescription actDesc)
at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivation(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl)
at System.Deployment.Application.ApplicationActivator.ActivateDeploymentWorker(Object state)
COMPONENT STORE TRANSACTION DETAILS
No transaction information is available.
You may need to clean this folder and try again using IE11:
C:\Users\{account}\AppData\Local\Apps\2.0
If it still does not work, you may need to reset the Internet options.
Instead of trying to use the credentials manager, can I suggest you create your data factory in Visual Studio, then simply deploy it to Azure with different sets of configuration files.
Check out this blog post on how:
https://www.purplefrogsystems.com/paul/2017/01/using-azure-data-factory-configuration-files/
This way credentials do not need to be copied into any portal blades and can be handled using other tools, plus they can be source controlled.
The JSON strings will also be masked if viewed via the Author and Deploy blade.
Plus any changes can be dealt with locally and your on-prem linked service in ADF just redeployed.
Hope this helps.
The credential manager is a .NET ClickOnce application running on your on-prem machine. When using the credential manager to set the username/password, it talks directly to the gateway, so no username/password data is transferred over the wire. If you use the "by web browser" option, the encrypted username/password is transferred over the wire with a POST request and then gets pushed to the gateway. In both options the credentials are encrypted, but the credential manager saves the round trip through the public network.
The reason you get this error is that Chrome by default does not support .NET ClickOnce applications. It should work if you are using IE or Edge.
For this to work in Chrome, you can add an extension to enable ClickOnce application support, like the one below:
https://chrome.google.com/webstore/detail/meta4-clickonce-launcher/jkncabbipkgbconhaajbapbhokpbgkdc?hl=en
Solution: Clear the ClickOnce cache and try to install the application again. Here is how to clear the ClickOnce cache.
From the command line run: rundll32 dfshim CleanOnlineAppCache
If it doesn't work, delete the actual folder:
Windows Vista/7/8/10
C:\Users\[username]\AppData\Local\Apps\2.0\
Windows XP/2003
C:\Documents and Settings\[username]\Local Settings\Apps\2.0\
For more information, you can look at this; it may be helpful:
http://codeketchup.blogspot.sg/2013/06/how-to-fix-deployment-and-application.html

SharePoint 2013 Forms Based Authentication is Slow – Why does SetPrincipalAndWriteSessionToken take 20 seconds or more?

We have a SharePoint implementation in which our web application uses Forms Based Authentication (FBA).
There are 2 servers in the farm. A web front end server that resides in a DMZ and a SQL server within the corporate network. A firewall separates them.
We are using SQL Authentication.
We need to force the user to change their password after the first successful login. Therefore we created a custom sign-in form for FBA based on the following article:
(https://sharepoint.stackexchange.com/questions/42541/how-to-create-a-custom-fba-login-page-that-forces-user-to-change-password-and-vi).
The code in question is:
private void SignInUser()
{
    SecurityToken token = SPSecurityContext.SecurityTokenForFormsAuthentication(
        new Uri(SPContext.Current.Web.Url),
        GetMembershipProvider(SPContext.Current.Site),
        GetRoleProvider(SPContext.Current.Site),
        _userName,
        _password,
        SPFormsAuthenticationOption.None);
    SPFederationAuthenticationModule fam = SPFederationAuthenticationModule.Current;
    fam.SetPrincipalAndWriteSessionToken(token, SPSessionTokenWriteType.WriteSessionCookie);
    SPUtility.Redirect(System.Web.Security.FormsAuthentication.DefaultUrl,
        SPRedirectFlags.UseSource, this.Context);
}
public static string GetMembershipProvider(SPSite site)
{
    // get the membership provider of whichever zone in the web app has FBA enabled
    SPIisSettings settings = GetFbaIisSettings(site);
    if (settings == null) return null;
    return settings.FormsClaimsAuthenticationProvider.MembershipProvider;
}
public static string GetRoleProvider(SPSite site)
{
    // get the role provider of whichever zone in the web app has FBA enabled
    SPIisSettings settings = GetFbaIisSettings(site);
    if (settings == null) return null;
    return settings.FormsClaimsAuthenticationProvider.RoleProvider;
}
The code which takes the time is:
fam.SetPrincipalAndWriteSessionToken(token, SPSessionTokenWriteType.WriteSessionCookie);
From my understanding this line of code does the following:
Invokes the OnSessionSecurityTokenCreated method to raise the SessionSecurityTokenCreated event.
Invokes the AuthenticateSessionSecurityToken method on SPFederationAuthenticationModule.Current to set the thread principal and then write the session cookie.
Some other points to note are:
This 20-second login time also occurs for the default SharePoint FBA page (/_forms/default.aspx).
It does not occur on a standalone dev machine.
For me this would indicate the bottleneck is network related.
Any help would be much appreciated.
I managed to shave about 13 seconds off the login process by resolving the following ULS log entry.
3/30/2016 11:08:53.71 w3wp.exe (0x2448) 0x1148 SharePoint Foundation Topology 8321 Critical A certificate validation operation took 23141.9482 milliseconds and has exceeded the execution time threshold. If this continues to occur, it may represent a configuration issue. Please see http://go.microsoft.com/fwlink/?LinkId=246987 for more details. bc926d9d-52af-f0fb-b2ae-236a27cd54f1
So, SharePoint uses certificates to sign security tokens that are issued by the Security Token Service (STS). Like all certificates, the validity of the STS certificate has to be verified periodically to make sure that the certificate has not been revoked. By default, the root certificate in the chain is not added to the Trusted Root Certificate Authorities store of the SharePoint servers. Because of this, the certificate revocation list (CRL) check for the certificate is performed over the Internet which is not possible on our WFE server.
I resolved this by exporting the root cert, on the WFE server, using
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content C:\SharePointRootAuthority.cer -Encoding byte
And then importing the cert into the Trusted Root Certification Authorities store using the Certificates MMC snap-in.
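As an alternative to the MMC snap-in (a hedged sketch of my own, not what the original answer used), the exported root certificate can also be added to the Trusted Root store in code, run with rights to write to the LocalMachine\Root store:
var rootCertificate = new X509Certificate2(@"C:\SharePointRootAuthority.cer");
var rootStore = new X509Store(StoreName.Root, StoreLocation.LocalMachine);
rootStore.Open(OpenFlags.ReadWrite);
rootStore.Add(rootCertificate);
rootStore.Close();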

how to connect to azure (management) rest api via C# in IIS

I am trying to set up a website (local testing at the moment) to connect to the Azure REST API to see our settings. I created a cert locally (W7 machine):
makecert -sky exchange -r -n "CN=azureConnectionNew" -pe -a sha1 -len 2048 -ss My "azureConnectionNew.cer"
I can see the cert in the Certificates MMC snap-in (I do not have a right-click edit-permissions option when I view the cert there).
I have a class library that sets up the connection. The cert is passed in by getting it via the thumbprint; this works great for the console app, but when I try to do the same in a web app it all goes wrong: I get 403 errors.
I first thought this was because the website runs as ApplicationPoolIdentity and so doesn't have access to the cert. So I tried passing in the cert (to the same code as the console app) by loading the actual file:
var path = @"C:\temp\azureconnection\azureConnectionNew.cer";
var cert = new X509Certificate2();
cert.Import(path);
I still get 403 errors.
I tried exporting the .cer file from the MMC certificates snap-in as a .pfx file (with private keys included). I set local IIS to use this cert and navigated to the https version of my local site, but still got 403.
I am not sure how to include / set up / reference the cert so that IIS can send an HttpWebRequest from the server side to Azure and get a valid response.
It is always better to use the thumbprint of the certificate to get the certificate. Please make sure you have created the certificate correctly. Also please check that you have placed the certificate in the Personal certificate section of the Local Machine store. You can check this using the MMC snap-in. Please try the code below:
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.OpenExistingOnly | OpenFlags.ReadOnly);
var certificate = store.Certificates
    .Cast<X509Certificate2>()
    .SingleOrDefault(c => string.Equals(c.Thumbprint, "CertificateThumbprint", StringComparison.OrdinalIgnoreCase)); // please replace CertificateThumbprint with the original thumbprint
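Once the certificate is found, it can be attached to the management request. A rough sketch follows; the subscription ID placeholder and the x-ms-version header value are assumptions on my part, not taken from the original answer:
var request = (HttpWebRequest)WebRequest.Create(
    "https://management.core.windows.net/<subscription-id>/services/hostedservices");
request.Headers.Add("x-ms-version", "2014-06-01"); // Service Management API version header
request.ClientCertificates.Add(certificate);
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd());
}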
This isn't the right way to use the certificate - it needs to be stored in the personal/certificates store of the user running the code (you should update the App Pool identity to be a user who can log in and into whose certificate store you import the cert). Here's sample code showing you how to use the service API: http://code.msdn.microsoft.com/windowsazure/CSAzureManagementAPI-609fc31a/

Unable to authenticate to ASP.NET Web Api service with HttpClient

I have an ASP.NET Web API service that runs on a web server with Windows Authentication enabled.
I have a client site built on MVC4 that runs in a different site on the same web server that uses the HttpClient to pull data from the service. This client site runs with identity impersonation enabled and also uses windows authentication.
The web server is Windows Server 2008 R2 with IIS 7.5.
The challenge I am having is getting the HttpClient to pass the current windows user as part of its authentication process. I have configured the HttpClient in this manner:
var clientHandler = new HttpClientHandler();
clientHandler.UseDefaultCredentials = true;
clientHandler.PreAuthenticate = true;
clientHandler.ClientCertificateOptions = ClientCertificateOption.Automatic;
var httpClient = new HttpClient(clientHandler);
My understanding is that running the site with identity impersonation enabled and then building the client in this manner should result in the client authenticating to the service using the impersonated identity of the currently logged in user.
This is not happening. In fact, the client doesn't seem to be authenticating at all.
The service is configured to use Windows authentication and this seems to work perfectly. I can go to http://server/api/shippers in my web browser and be prompted for Windows authentication; once entered, I receive the data requested.
In the IIS logs I see the API requests being received with no authentication and receiving a 401 challenge response.
Documentation on this one seems to be sparse.
I need some insight into what could be wrong or another way to use windows authentication with this application.
Thank You,
Craig
I have investigated the source code of HttpClientHandler (the latest version I was able to get my hands on) and this is what can be found in its SendAsync method:
// BeginGetResponse/BeginGetRequestStream have a lot of setup work to do before becoming async
// (proxy, dns, connection pooling, etc). Run these on a separate thread.
// Do not provide a cancellation token; if this helper task could be canceled before starting then
// nobody would complete the tcs.
Task.Factory.StartNew(startRequest, state);
Now if you check within your code the value of SecurityContext.IsWindowsIdentityFlowSuppressed(), you will most probably get true. As a result, the startRequest delegate is executed on a new thread with the credentials of the ASP.NET process (not the credentials of the impersonated user).
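A quick way to confirm this in your own code (my sketch, not from the original answer):
// if this prints true, the impersonated Windows identity will not flow to
// threads started via Task.Factory.StartNew inside HttpClientHandler
bool suppressed = System.Security.SecurityContext.IsWindowsIdentityFlowSuppressed();
System.Diagnostics.Trace.WriteLine("Windows identity flow suppressed: " + suppressed);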
There are two possible ways out of this. If you have access to your server's aspnet.config, you should set the following settings (setting those in web.config seems to have no effect):
<legacyImpersonationPolicy enabled="false"/>
<alwaysFlowImpersonationPolicy enabled="true"/>
If you can't change the aspnet.config, you will have to create your own HttpClientHandler to support this scenario.
UPDATE REGARDING THE USAGE OF FQDN
The issue you have hit here is a feature in Windows that is designed to protect against "reflection attacks". To work around it you need to whitelist the domain you are trying to access on the machine that is trying to access the server. Follow the steps below (a scripted equivalent is sketched after them):
Go to Start --> Run --> regedit
Locate HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0 registry key.
Right-click on it, choose New and then Multi-String Value.
Type BackConnectionHostNames (ENTER).
Right-click just created value and choose Modify.
Put the host name(s) for the site(s) that are on the local computer in the value box and click OK (each host name/FQDN needs to be on its own line; no wildcards; the name must be an exact match).
Save everything and restart the machine.
You can read full KB article regarding the issue here.
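For completeness, a scripted equivalent of the registry steps above (a sketch, assuming it is run elevated; the host name is a placeholder):
using (var key = Microsoft.Win32.Registry.LocalMachine.OpenSubKey(
    @"SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0", writable: true))
{
    // each host name / FQDN goes into the multi-string value on its own line
    key.SetValue("BackConnectionHostNames",
        new[] { "myserver.mydomain.com" },
        Microsoft.Win32.RegistryValueKind.MultiString);
}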
I was also having this same problem. Thanks to the research done by @tpeczek, I developed the following solution: instead of using the HttpClient (which creates threads and sends requests asynchronously), I used the WebClient class, which issues requests on the same thread. Doing so enables me to pass on the user's identity to Web API from another ASP.NET application.
The obvious downside is that this will not work async.
var wi = (WindowsIdentity)HttpContext.User.Identity;
var wic = wi.Impersonate();
try
{
    var data = JsonConvert.SerializeObject(new
    {
        Property1 = 1,
        Property2 = "blah"
    });
    using (var client = new WebClient { UseDefaultCredentials = true })
    {
        client.Headers.Add(HttpRequestHeader.ContentType, "application/json; charset=utf-8");
        client.UploadData("http://url/api/controller", "POST", Encoding.UTF8.GetBytes(data));
    }
}
catch (Exception exc)
{
    // handle exception
}
finally
{
    wic.Undo();
}
Note: Requires NuGet package: Newtonsoft.Json, which is the same JSON serializer WebAPI uses.
The reason this is not working is that you need double-hop authentication.
The first hop is the web server; getting impersonation with Windows authentication to work there is no problem. But when using HttpClient or WebClient to authenticate to another server, the web server needs to run under an account that has permission to perform the necessary delegation.
See the following for more details:
http://blogs.technet.com/b/askds/archive/2008/06/13/understanding-kerberos-double-hop.aspx
Fix using the "setspn" command:
http://www.phishthis.com/2009/10/24/how-to-configure-ad-sql-and-iis-for-two-hop-kerberos-authentication-2/
(You will need sufficient access rights to perform these operations.)
Just consider what would happen if any server was allowed to forward your credentials as it pleases... To avoid this security issue, the domain controller needs to know which accounts are allowed to perform the delegation.
To impersonate the original (authenticated) user, use the following configuration in the Web.config file:
<authentication mode="Windows" />
<identity impersonate="true" />
With this configuration, ASP.NET always impersonates the authenticated user, and all resource access is performed using the authenticated user's security context.
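With that configuration in place, a quick check (my addition, not from the original answer) of which identity a request is actually executing under:
// should log the authenticated caller rather than the application pool account
var identity = System.Security.Principal.WindowsIdentity.GetCurrent();
System.Diagnostics.Trace.WriteLine(string.Format(
    "Executing as: {0} (impersonation level: {1})",
    identity.Name, identity.ImpersonationLevel));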
