I have a web application in IIS that needs to authenticate the user using Windows Authentication. This normally works fine, but when I attempt to introduce a new site binding, the authentication stops working.
The application currently runs on local dev clients, where the site binding uses a certificate whose "Issued to" name matches the local computer name. Applications running under this site that require Windows Authentication work fine:
*:8445 (https) - binding certificate: devclientXXX.domain.com
For various reasons we want to replace that binding with an alias common to all dev clients, i.e. dev-localhost. I got a new certificate and set up a new binding, so we now have these:
*:8445 (https) - binding certificate: devclientXXX.domain.com
*:443 (https) - binding certificate: dev-localhost
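For reference, the bindings can be listed with IIS's WebAdministration module (the site name below is a placeholder):
# List the HTTPS bindings for the site ("Default Web Site" is a placeholder).
Import-Module WebAdministration
Get-WebBinding -Name "Default Web Site" -Protocol https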
The new site binding allows me to browse resources available with Anonymous Authentication.
However, when attempting to browse Windows Authentication resources, my credentials are rejected: Chrome prompts me repeatedly for credentials without ever accepting them.
Meanwhile, browsing via the original binding works just as before: my Windows credentials are accepted without any prompt.
As far as I can tell, the two bindings only differ in the selected certificate.
Does anyone have any suggestion as to what might be the cause of this problem?
-S
I refined my Google queries and found this:
https://serverfault.com/questions/722722/windows-auth-in-iis-does-not-work-when-browsing-to-the-website-on-the-server-run
This prompted me to modify the registry to add the following:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
(Multi-String Value) BackConnectionHostNames = dev-localhost
This actually solved my problem!
Edit: Here's a PowerShell snippet for doing exactly that.
$hostName = "dev-localhost"
$value = (Get-ItemProperty "HKLM:\System\CurrentControlSet\Control\Lsa\MSV1_0").BackConnectionHostNames
if (-not($value | ? { $_ -eq $hostName }))
{
$value += $hostName
$item = New-ItemProperty "HKLM:\System\CurrentControlSet\Control\Lsa\MSV1_0" -Name "BackConnectionHostNames" -Value $value -PropertyType MultiString -Force
}
I use the script below to import a certificate during a pipeline build process.
PowerShell script:
param($PfxFilePath, $Password)
$absolutePfxFilePath = Resolve-Path -Path $PfxFilePath
Write-Output "Importing store certificate '$absolutePfxFilePath'..."
Add-Type -AssemblyName System.Security
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cert.Import($absolutePfxFilePath, $Password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList "MY", "CurrentUser"
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
$store.Add($cert)
$store.Close()
I get the error below:
. 'C:\JobAppAgent_work\1\s\JobApp\DevOps\Build\Import-PfxCertificate.ps1' -PfxFilePath $env:DOWNLOADSECUREFILE1_SECUREFILEPATH -Password ****
Importing store certificate 'C:\JobAppAgent_work_temp\DD.Job.Desktop_TemporaryKey.pfx'...
##[error]Exception calling "Import" with "3" argument(s): "The specified network password is not correct.
This script was running fine when the build ran on Azure Pipelines. Now I have created a private agent pool that runs on a Windows 10 VM.
Make sure that the certificate is valid and has not expired. You can check the expiration date by double-clicking the certificate and viewing its details, or script the check as shown after this list.
Check that the certificate is properly installed on the machine where the build is being performed. If the certificate is not installed, it will not be available for use in the build process.
Make sure that the certificate is correctly referenced in the build pipeline. This may involve specifying the path to the certificate file or the thumbprint of the certificate.
If you are using a self-signed certificate, make sure that it is trusted by the machine where the build is being performed. To do this, you will need to install the certificate in the trusted root certification authorities store on the machine.
If you are using a certificate from a certificate authority (CA), make sure that the CA is trusted by the machine where the build is being performed. This may involve installing the root certificate of the CA on the machine.
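If it helps, here is a minimal sketch of such a pre-flight check, reusing the $PfxFilePath and $Password parameters from the script above:
# Hypothetical pre-flight check: load the PFX and warn if it has expired.
$probe = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($PfxFilePath, $Password)
if ($probe.NotAfter -lt (Get-Date)) {
    Write-Warning "Certificate expired on $($probe.NotAfter)"
}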
This is a PowerShell script that imports a certificate from a file with a given password into the "MY" store in the current user's certificate store. The certificate is imported using the Import method of the X509Certificate2 class, which takes as input the path to the certificate file, the password, and a set of key storage flags. The script then creates an X509Store object representing the "MY" store in the current user's certificate store, opens the store in read-write mode, adds the imported certificate to the store, and closes the store.
This script assumes that the certificate file is in the Personal Information Exchange (PFX) format, which is a common format for storing certificates and their private keys. PFX files are often used to export or import certificates, and they can be password-protected for added security.
Verify that the password you are using to import the certificate is correct. It's possible that the password has been changed or entered incorrectly.
Check that the certificate file has not been damaged or modified in any way. If the file has been altered, it may be causing the import to fail.
Make sure that the certificate file is accessible to the machine where the script is being run. If the file is on a network share or another machine, check that the machine has the necessary permissions to read the file.
If the certificate file is password-protected, double-check that you are passing the exact password it was exported with.
Try running the script with different key storage flags to see if that has any effect on the error. For example, you could try using "Exportable" instead of "PersistKeySet" as the key storage flag.
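For example, a variation of the import call with a different flag combination might look like this (the combination shown is illustrative, not a known fix):
# Illustrative only: retry the import with Exportable + MachineKeySet flags.
$flags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable -bor `
         [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::MachineKeySet
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($absolutePfxFilePath, $Password, $flags)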
I am trying to run a PowerShell script that calls Get-AzKeyVaultSecret using the examples provided by Microsoft, and keep getting an error stating "No such host is known."
Generically, the error is simple enough, but since I'm not specifying a host address or IP during the call, the error seems very disconnected from the actual issue.
Line |
14 | Get-AzKeyVaultSecret -VaultName $KeyVaultName -Name $SecretName
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| No such host is known.
I have tried just about everything I can think of to get this working, and this is the error I receive every time. I've checked that I have the appropriate privileges in Azure access policies and also checked that I have the Access Control roles, etc. So I don't understand the error message.
I was previously attempting this using the AzureRM PowerShell module, but since it is due for deprecation within a couple of years, I opted to go this route instead; it doesn't seem to be working either.
What exactly does "no such host" mean, and how do I resolve the problem? I am running under PowerShell 7.
Because of the first comment regarding posting the remainder of the script, I'll add that I receive the same error when calling the cmdlet directly in the PowerShell window.
PS C:\SQL Scripts\PowerShell> Get-AzKeyVaultSecret -VaultName 'myKeyVaultName' -Name 'myKeyVaultSecretName'
Get-AzKeyVaultSecret: No such host is known.
SHORT VERSION ANSWER:
The environment needs to be specified when working within private sectors such as Government, Education, etc.
LONG ANSWER/EXPLANATION:
The comment by @Ked Mardemootoo led me to view the issue from a different perspective. The issue turned out to be somewhat network related, arguably even a "DNS" issue, but not a DNS issue on the system from which the call is being made.
The Get-AzKeyVaultSecret cmdlet performs some work under the hood, which includes resolving the FQDN of the requested resource using the credentials provided to Connect-AzAccount.
In most common scenarios these requests are routed to Azure over the public networks, but in the comparatively smaller set of cases where Azure runs in a private sector/network (e.g. Education, Government), there is an additional parameter via which the environment needs to be specified.
Connect-AzAccount
Connects the user within the public domain.
Connect-AzAccount -Environment <EnvironmentName>
Connects the user within the private domain/sector specified by the Environment switch.
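For example, signing in to a sovereign cloud before querying Key Vault might look like this (AzureUSGovernment is one of the built-in environment names; run Get-AzEnvironment to list them):
# Sign in against the sovereign-cloud endpoints rather than public Azure.
Connect-AzAccount -Environment AzureUSGovernment
# Subsequent calls now resolve the vault against the correct cloud.
Get-AzKeyVaultSecret -VaultName 'myKeyVaultName' -Name 'myKeyVaultSecretName'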
If you have an Azure account, both methods will log you onto the Azure platform, but if you're in a private sector and subsequently attempt to use cmdlets to acquire information or resources without having designated the environment, you will receive the "no such host is known" error.
The error is somewhat cryptic and abstract, and in my opinion should have been more specific to better clue the user in to the actual problem, such as "Resource not found" or something similar.
Once I specified the environment (something that isn't front and center in the documentation I accessed), the cmdlet functioned as expected.
Hopefully this information helps others avoid falling into this pit of obscurity.
It appears there's something wrong with the DNS resolution on your machine.
I'd suggest running the command from a different device or from the Azure CloudShell to narrow it down further.
I've tried to replicate it on my end (within my context/subscription) to see what kind of error message shows up in different scenarios.
Wrong KV name shows clear error message:
PS /Users/kedmardemootoo> Get-AzKeyVaultSecret -VaultName 'kv-wrong-name' -Name 'correct-secret-name'
Get-AzKeyVaultSecret: nodename nor servname provided, or not known
Correct KV name but wrong Secret doesn't show any error/output:
PS /Users/kedmardemootoo> Get-AzKeyVaultSecret -VaultName 'kv-correct-name' -Name 'wrong-secret-name'
Correct KV and secret name but no access via access policies:
PS /Users/kedmardemootoo> Get-AzKeyVaultSecret -VaultName 'kv-correct-name' -Name 'correct-secret-name'
Get-AzKeyVaultSecret: Operation returned an invalid status code 'Forbidden'
Correct KV and secret name with the right access policies:
PS /Users/kedmardemootoo> Get-AzKeyVaultSecret -VaultName 'kv-correct-name' -Name 'correct-secret-name'
Vault Name : kv-correct-name
Name : correct-secret-name
Version : 0abbb10de45a1235f5544
Id : https://kv-correct-name.vault.azure.net:443/secrets/correct-secret-name/0abbb10de45a1235f5544
Enabled : True
Expires : 06/03/2022 05:20:05
Not Before :
Created : 06/03/2022 05:29:07
Updated : 06/03/2022 05:34:09
Content Type :
Tags :
You can add the -Debug switch to see the "Absolute Uri":
Get-AzKeyVaultSecret -VaultName 'myKeyVaultName' -Name 'myKeyVaultSecretName' -Debug
You will see something like:
https://myKeyVaultName.vault.azure.net/secrets/myKeyVaultSecretName
You will get an error (no such host is known) if there is a typo in the VaultName or the VaultName does not exist.
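As a quick sanity check, you can also try resolving the vault's FQDN directly (Resolve-DnsName comes from Windows's DnsClient module; the vault name is a placeholder):
# If this fails to resolve, the vault name has a typo or the vault does not exist.
Resolve-DnsName "myKeyVaultName.vault.azure.net"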
A week ago I was able to access SharePoint Online programmatically through a C# application. Now I am getting the following error:
The remote server returned an error: (503) Server Unavailable.
I can access the SharePoint site in my browser completely fine.
I tried accessing it through SharePoint Online Management Shell but I get the same error when doing the following:
$adminUPN="name@business.com"
$userCredential = Get-Credential -UserName $adminUPN -Message "Type the password."
Connect-SPOService -Url https://business.sharepoint.com/sites/bd/resume/ -Credential $userCredential
When I try and connect without the credentials using:
Connect-SPOService -Url https://business.sharepoint.com/sites/bd/resume/
It first pops up a Microsoft sign-in window to enter just my username/email, which looks normal. But when I enter my email and click Next, it takes me to a different sign-in page which looks off (see screenshot below; I removed company information with black scribble).
After I enter my password and hit enter I get a different error:
Connect-SPOService : Could not authenticate to SharePoint Online https://business.sharepoint.com/sites/bd/resume/ using OAuth 2.0
Firstly, I want to confirm whether this is a problem on my end or a problem with permissions, etc. on the admin end.
The cause of your issue could be that the LegacyAuthProtocolsEnabled property, at the tenant level, is set to False. Setting this property to True can solve the issue.
To get the current value, run the following commands in PowerShell:
Connect-SPOService
Get-SPOTenant
To set the value to True for LegacyAuthProtocolsEnabled run the following commands in PowerShell:
Connect-SPOService
Set-SPOTenant -LegacyAuthProtocolsEnabled $True
After you run the commands, it's necessary to wait some time until the change takes effect.
According to the documentation, a value of False prevents Office clients using non-modern authentication protocols from accessing SharePoint Online resources.
A value of True enables Office clients using non-modern authentication protocols (such as Forms-Based Authentication (FBA) or the Identity Client Run Time Library (IDCRL)) to access SharePoint resources.
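Put together, a minimal check-and-set sequence looks like this; the admin URL below follows the usual <tenant>-admin.sharepoint.com pattern and is an assumption to adapt:
# Connect to the tenant admin endpoint (assumed URL; replace with your tenant's).
Connect-SPOService -Url https://business-admin.sharepoint.com
# Inspect the current value, then enable legacy auth protocols.
(Get-SPOTenant).LegacyAuthProtocolsEnabled
Set-SPOTenant -LegacyAuthProtocolsEnabled $True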
On my deployed Azure web role I try to send a GET request to a web server that authorizes the request using the certificate provided by the requesting client.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
var filepath = Path.GetTempPath();
string certpath = Path.Combine(filepath, "somecert.cer");
Trc.Information(string.Format("Certificate at {0} will be used", certpath));
X509Certificate cert = X509Certificate.CreateFromCertFile(certpath);
WebRequest request = WebRequest.Create(endPoint);
((HttpWebRequest)request).ProtocolVersion = HttpVersion.Version10;
((HttpWebRequest)request).IfModifiedSince = DateTime.Now;
((HttpWebRequest)request).AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
((HttpWebRequest)request).ClientCertificates.Add(cert);
The above code works perfectly in the Azure emulator but not when it is deployed; then the call to GetResponse always fails.
System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.
at System.Net.HttpWebRequest.GetResponse()
at XYZ.Import.DataImport.OpenResponseStream(String endPoint)
I read through many of the existing discussion threads where using SecurityProtocolType.Ssl3 solved the problem, but it does not in my case. Are there further debugging options, considering that it is running on Azure?
Update 1
I tried all the debugging steps Alexey suggested. They are really helpful but quite hard to execute properly on Azure.
Here is what I came up with after at least two hours.
I used the System.Net settings supplied by this post [1].
At first the output was not present in the expected folder; the file system permissions on the folder need to be tweaked so that the NT AUTHORITY\NETWORK SERVICE account is allowed to write to the target folder.
After that the file still didn't show up as expected, because there seems to be a problem when only an app.config is supplied (see this thread [2]). So I provided an app.config, a [ProjectAssembly].dll.config, and a web.config with the content from the post [1].
To test whether the problem is related to user rights, I tested both with and without elevated rights, as shown in post [3].
Beforehand I changed the test project to execute in two modes. The first mode tries to load the public part from the *.cer file, as shown in the code above.
The other mode uses the private certificate, which is loaded with this command:
X509Certificate cert = new X509Certificate2(certpath, "MYPASSWORD", X509KeyStorageFlags.MachineKeySet);
As a result I gained the following insights.
When using the public part (.cer), it only works when rights are elevated and the private cert is imported into the machine store.
When using the private cert (.pfx), it only works if the private cert is imported into the machine store (see the sketch after this list).
The second setup, with the .pfx, runs even without elevated rights.
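For reference, a minimal way to do that machine-store import from an elevated prompt (the path and password are placeholders, mirroring the code above):
# Imports the PFX, including the private key, into LocalMachine\My.
certutil -f -p "MYPASSWORD" -importpfx "C:\somepath\somecert.pfx"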
While debugging, the CAPI2 log contained only information with no direct relevance. The System.Net diagnostics from point one above contained this:
System.Net Information: 0 : [1756] SecureChannel#50346327 - Cannot find the certificate in either the LocalMachine store or the CurrentUser store.
[snip]
System.Net Error: 0 : [1756] Exception in HttpWebRequest#36963566:: - The request was aborted: Could not create SSL/TLS secure channel..
System.Net Error: 0 : [1756] Exception in HttpWebRequest#36963566::GetResponse - The request was aborted: Could not create SSL/TLS secure channel..
From this output, and from the behavior changing when elevated rights are used, I deduce that I should look further into the rights of the running web role in combination with the certificate store.
[1] http://msdn.microsoft.com/de-de/library/ty48b824(v=vs.110).aspx
[2] Combined Azure web role and worker role project not seeing app.config when deployed
[3] http://blogs.msdn.com/b/farida/archive/2012/05/01/run-the-azure-worker-role-in-elevated-mode-to-register-httplistener.aspx
Remove SecurityProtocolType.Ssl3
Turn on the CAPI2 log and check it for errors (on your local machine); a snippet for enabling it follows this list.
If there are no errors, then check the location of the CA and intermediate certificates.
Turn on system.net diagnostics and check this log for errors.
This article describes how to find and turn on the CAPI2 event log.
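A minimal sketch for enabling the CAPI2 log and pulling recent errors (run elevated; reproduce the failing request in between):
# Enable the CAPI2 operational event log.
wevtutil set-log Microsoft-Windows-CAPI2/Operational /enabled:true
# ...reproduce the TLS failure, then list recent errors and warnings:
Get-WinEvent -LogName Microsoft-Windows-CAPI2/Operational -MaxEvents 50 |
    Where-Object { $_.Level -le 3 } |
    Format-List TimeCreated, Id, Message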
Hope this helps.
I am trying to set up a website (local testing atm) to connect to the Azure REST API to see our settings. I created a cert locally (W7 machine):
makecert -sky exchange -r -n "CN=azureConnectionNew" -pe -a sha1 -len 2048 -ss My "azureConnectionNew.cer"
I can see the cert in the certificates MMC snap-in (I do not have a right-click "edit permissions" option when I view the cert in there).
I have a class library that sets up the connection; the cert is passed in by getting it via the thumbprint. This works great for the console app, but when I try to do this in a web app it all goes wrong: I get 403 errors.
I first thought that this was due to the website running as the ApplicationPoolIdentity and so not having access to the cert. So I tried passing in the cert (to the same code as the console app) by loading the actual file:
var path = @"C:\temp\azureconnection\azureConnectionNew.cer";
var cert = new X509Certificate2();
cert.Import(path);
I still get 403 errors.
I tried exporting the cer file from the MMC certificates snap-in as a pfx file (with private keys included). I set local IIS to use this cert and navigated to the https version of my local site, but still got 403.
I am not sure how to include / setup / reference the cert so that IIS can send a HttpWebRequest from the server side to Azure and get a valid response.
It is always better to use the certificate's thumbprint to retrieve it. Please make sure you have created the certificate correctly, and check that you have placed the certificate in the Personal certificates section of the Local Machine store. You can verify this using the MMC snap-in. Please try the code below:
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.OpenExistingOnly | OpenFlags.ReadOnly);
var certificate = store.Certificates
.Cast<X509Certificate2>()
.SingleOrDefault(c => string.Equals(c.Thumbprint, "CertificateThumbprint", StringComparison.OrdinalIgnoreCase)); // replace CertificateThumbprint with the actual thumbprint
This isn't the right way to use the certificate: it needs to be stored in the personal certificates store of the user running the code (you should update the App Pool identity to be a user who can log in, and import the cert into that user's certificate store). Here's sample code showing you how to use the service API: http://code.msdn.microsoft.com/windowsazure/CSAzureManagementAPI-609fc31a/
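As a rough sketch, importing the PFX into a machine's Personal store can be done with the PKI module (the file path and password below are assumptions to adapt):
# Requires the PKI module (Windows 8 / Server 2012 and later); run elevated for LocalMachine.
Import-PfxCertificate -FilePath "C:\temp\azureconnection\azureConnectionNew.pfx" `
    -CertStoreLocation Cert:\LocalMachine\My `
    -Password (ConvertTo-SecureString "yourPfxPassword" -AsPlainText -Force)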