I am deploying an Azure Service Fabric cluster by following the Microsoft docs.
After deploying the cluster, when I browse to Service Fabric Explorer I get "Access to srvfabric.cloudapp.azure.com was denied. You don't have the user rights to view this page."
HTTP error 403
When I deployed a Service Fabric cluster, I got the same error code as yours.
Make sure the Key Vault certificate used to create the Service Fabric cluster is correct, with your cluster's domain name added as the common name (CN).
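If it helps, here is a rough Azure CLI sketch of creating such a Key Vault certificate with the cluster domain as the CN. The vault name, certificate name, and file names below are made-up placeholders, not values from the question:

# Grab the default certificate policy and edit its "subject" field to
# "CN=<your-cluster-domain>" (the cluster's full .cloudapp.azure.com name).
az keyvault certificate get-default-policy > policy.json

# Create the certificate in Key Vault using the edited policy.
az keyvault certificate create --vault-name myvault --name sfclustercert --policy @policy.json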
Download the above certificate as a .pfx to your local machine and import it into your local certificate manager.
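The private key is stored in the Key Vault secret that shares the certificate's name, so the .pfx can be pulled down roughly like this (again, the vault, certificate, and file names are placeholders):

# Download the certificate together with its private key as a PFX file.
az keyvault secret download --vault-name myvault --name sfclustercert --encoding base64 --file sfclustercert.pfx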
Run certmgr.msc from the Run dialog to open Certificate Manager, then import this certificate from your local machine:
Click Personal > Certificates > All Tasks > Import, and import the certificate from your local folder by browsing to its path.
Click Next > Next (a password is not required) and import the certificate.
The certificate will then show up in the store.
While creating the Service Fabric cluster, I added the above certificate for client authentication.
The certificate is also visible in the cluster's settings after the SFC deployment.
Now browse to the URL from the cluster's Overview tab. If you receive a "Your connection is not private" error, choose Advanced and proceed. If the browser prompts for a certificate, select the certificate you imported in the step above, and you will be routed to Service Fabric Explorer successfully.
The Service Fabric Explorer page then loads successfully.
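If you prefer to check the cluster from the command line instead of the browser, here is a rough sketch using the Service Fabric CLI (sfctl), assuming the default 19080 management port and that the .pfx is first converted to a .pem (file names are placeholders):

# Convert the client certificate to PEM for sfctl.
openssl pkcs12 -in sfclustercert.pfx -out sfclustercert.pem -nodes

# Point sfctl at the cluster using the client certificate, then query cluster health.
sfctl cluster select --endpoint https://srvfabric.cloudapp.azure.com:19080 --pem sfclustercert.pem --no-verify
sfctl cluster health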
For Application Gateway, all the documentation says to upload a .pfx certificate, but when I go to the HTTP settings for the backend pool it only allows a ".cer" certificate; it won't let a ".pfx" file be uploaded and shows a "wrong format" error.
Am I doing something wrong, or has Azure changed the functionality while the documentation has not been updated?
Strangely, through this command I am able to upload a PFX:
az network application-gateway ssl-cert create
Update: I am trying to do this for an existing Application Gateway.
Update 2: Strangely, when I am creating a new gateway Azure shows me the option for a PFX, but I don't know why it becomes .cer when I try to do this on an existing one.
Is this one of Microsoft's easter eggs??
It seems you selected the wrong entry point in the portal.
If you are configuring Add HTTP setting, you really do need a .cer certificate there: the HTTP settings take the backend authentication (public) certificate, not the SSL .pfx.
For more information, please refer to this link.
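For reference, a hedged Azure CLI sketch of attaching a .cer as a backend authentication certificate to an existing gateway (resource and file names are placeholders):

# Attach a backend (public) .cer to the gateway; it is then referenced from the HTTP settings.
az network application-gateway auth-cert create --resource-group MyResourceGroup --gateway-name MyAppGateway --name backendcert --cert-file backend.cer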
The command az network application-gateway ssl-cert create is used to configure SSL for the frontend. You can find the equivalent in the portal under Settings > Listeners.
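The .pfx belongs on that listener side, roughly like this (the names and password are placeholders):

# Upload the .pfx as an SSL certificate for the gateway's listener.
az network application-gateway ssl-cert create --resource-group MyResourceGroup --gateway-name MyAppGateway --name frontendcert --cert-file frontend.pfx --cert-password "<password>"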
I'm using an Azure Resource Manager (ARM) template to create and update a resource group in a release definition in Visual Studio Team Services (VSTS). I'm using the same template to upload the .pfx certificate to the web app.
For the first deployment the certificate got uploaded perfectly, but from the next deployment onwards the deployment fails with the error "Another certificate exists with same thumbprint ******** at location West US in the Resource Group MyResourceGroup".
I tried recreating the web app, but to my surprise the deployment now fails even on the first attempt. It looks like the certificate was uploaded to the resource group.
Is there a way to overwrite the existing .pfx certificate on every deployment?
You do not have to upload the certificate on every deployment. The certificate uploaded the first time becomes available to all subsequent deployments.
Certificates are exposed at the resource group level, so deploying the same certificate again will definitely error out.
However, I don't see a reason as to why you need to upload a certificate.
Does your application need to read this certificate? If yes, there is a different way to do this. See this article:
https://azure.microsoft.com/en-us/blog/using-certificates-in-azure-websites-applications/
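In short, that approach uploads the certificate once and then exposes it to the app through the certificate store via the WEBSITE_LOAD_CERTIFICATES app setting. A hedged CLI sketch, where the app name, resource group, and thumbprint are placeholders:

# Make the uploaded certificate with this thumbprint available in the web app's certificate store.
az webapp config appsettings set --resource-group MyResourceGroup --name mywebapp --settings WEBSITE_LOAD_CERTIFICATES=<certificate-thumbprint>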
Until today I had never encountered this error. I have been able to redeploy my applications, certificates and all, with no issues. I believe that in my case someone had previously added the certificate manually under a different name, possibly through the portal, and when my pipeline executed it then tried to add the same certificate (same thumbprint) under its own name.
Certificates are child resources of Microsoft.Web under the resource group. There are likely a number of options for resolving this, but I am going to focus on removing the certificate using Resource Explorer. (I bet there is an Azure CLI or Azure PowerShell command to do this too; a rough sketch follows these steps.)
In Resource Explorer, locate the certificates node associated with your resource group using the left-hand navigation pane. This will likely be somewhere like subscriptions -> {subscription name} -> resourceGroups -> {resource group name} -> providers -> Microsoft.Web -> certificates -> {certificate name}
Once located, select your certificate and then use the Actions (POST, DELETE) tab in the right-hand pane to delete the certificate. You should then be able to redeploy.
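As hinted above, the cleanup can probably also be done from the Azure CLI, since the certificate is just a Microsoft.Web/certificates resource. A sketch with placeholder names:

# List the certificates that exist in the resource group.
az resource list --resource-group MyResourceGroup --resource-type Microsoft.Web/certificates --output table

# Delete the conflicting certificate so the next deployment can recreate it.
az resource delete --resource-group MyResourceGroup --resource-type Microsoft.Web/certificates --name <certificate-name>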
I'm attempting to use Microsoft Virtual Machine Converter 3.1 to convert a VM to an Azure VM. I am following the steps here:
https://technet.microsoft.com/en-us/library/dn874004.aspx
and I have created a management certificate using the instructions here:
https://msdn.microsoft.com/library/azure/gg551722.aspx
and I have uploaded this to my Azure subscription. I have verified that the certificate is in my Personal store, and I have even copied it to the Local Machine store. Both certificates show that they have private keys as expected, and the uploaded certificate also shows up under Azure Management Certificates.
When I run MVMC and provide the subscription ID and the certificate thumbprint, I get the message "No certificate exists with thumbprint XXXXXXXXXXXX...". I cannot get past this screen to migrate the VM to Azure. Does anyone have a recommendation or suggestion?
I know this is old, but I thought I would post the answer to help whoever finds this. ;)
From: https://support.microsoft.com/en-us/kb/2977336
"Certificate thumbprint is not found in the personal certificate store" error when you try to migrate to Microsoft Azure using Microsoft Virtual Machine Converter
To resolve this problem, follow these steps:
1. Start the MMC.exe process.
2. Click File, click Add/Remove Snap-in, and then click Certificates.
3. On the Certificates snap-in screen, click Add, and then select My user account. Click Finish, and then click OK.
4. Expand Console Root, expand Certificates - Current User, expand Personal, and then expand Certificates.
5. Right-click your Microsoft Azure certificate. By default, this is named Microsoft Azure Tools. Select All Tasks, and then click Export.
6. Click Next, and then click No, do not export the private key. Click Next.
7. On the Certificate Export Wizard screen, click DER encoded X.509 (.CER), and then click Next.
8. Type a file name, click Next, and then click Finish.
9. Expand Console Root\Certificates - Current User, expand Trusted Root Certification Authorities, and then expand Certificates.
10. Right-click Certificates, and then click Import.
11. Click Next, and then locate the file that you exported in step 8.
12. Follow the steps in the wizard to complete the import process. Verify that the Microsoft Azure Tools certificate now appears in both the Personal and Trusted Root Certification Authorities stores.
13. Return to MVMC, and then retry the Azure connection task.
I followed the tutorial here for setting up the Azure Scheduler:
http://fabriccontroller.net/blog/posts/a-complete-overview-to-get-started-with-the-windows-azure-scheduler/
I want to run my application on an Azure Website, but it is blocking me from creating my X509Certificate.
I found this article: http://blog.tylerdoerksen.com/2013/08/23/pfx-certificate-files-and-windows-azure-websites/
Which points out the issue:
"Well it turns out that when you load certificates, the system will use a local directory to store the key (??)
The default location for the key is under the local user profile, and with Windows Azure Websites, there is no local user profile directory."
So, following his advice and adding the flag X509KeyStorageFlags.MachineKeySet, I get past:
CryptographicException: The system cannot find the file specified
but I now get:
CryptographicException: Access denied.
Is there really no way to use the SDK from an Azure Website?! It defeats a lot of the appeal of the Azure Scheduler if I am forced to use a Web Role instead of an Azure Website.
In this thread: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/cfe06e73-53e1-4030-b82d-53200be37647/load-privately-created-p12-cert-from-azureblob-and-have-it-be-trusted
It appears that they are successfully creating an X509Certificate on an Azure Website, so what is different about mine that it throws "Access denied" when I try?
The problem was with using the ManagementCertificate string in the PublishSettings file... I created a self-signed certificate on my local machine using the Visual Studio console and exported both a '.cer' and a '.pfx'.
Uploaded the self-signed .cer into Azure > Settings > Management Certificates
Bundled the .pfx with my solution and published to Azure Web Sites
Then used the following code to create the certificate:
// MachineKeySet keeps the private key in the machine store instead of a user profile,
// which does not exist for an Azure Website.
var certificate = new X509Certificate2(
    HttpContext.Current.Server.MapPath("~/<filename>.pfx"), "<password>", X509KeyStorageFlags.MachineKeySet);
I am developing a Web API-based web service to be hosted on Azure. I am using the Azure 1.8 SDK.
When I try to deploy my cloud service, it takes a very long time to upload, after which I get an error message that says:
12:09:52 PM - Error: The certificate with thumbprint d22e9de125640c48a4f83de06ae6069f09cfb76c was not found. Http Status Code: BadRequest OperationId: 50daf49111c9487f82f3be09763e7924
12:09:53 PM - Deployment failed with a fatal error
Apparently, the certificate being referred to is related to enabling Remote Desktop for the role instances in the cloud (I am not very sure about this; I saw it mentioned on the internet for a similar problem). However, I did not check the option to enable Remote Desktop on the instances while publishing.
What could be going wrong here?
What worked for me was:
Go to PowerShell and type mmc
Add the Certificates snap-in by going to File > Add/Remove Snap-in > choose Certificates from the list > choose My user account
Right-click Certificates - Current User and select Find Certificates
In the dialog box, set Contains to 'azure' and Look in Field to 'Issued To'
Press Find Now. You should see a list of certificates.
Check the thumbprint by double-clicking the certificate > Details tab > scroll down to Thumbprint
Once you have found your certificate, close the dialog, right-click the certificate, and select Export
Choose to export the private key. Follow the steps until you have a .pfx file to upload to Azure
Go to your cloud service and select the Certificates tab
Click Upload, select the exported .pfx file, and supply the password you set during export
Go to the Dashboard and update the cloud package
The certificate used in your project doesn't exist in the cloud environment. Make sure the same certificate used by your project is uploaded to the cloud environment. If you are using Visual Studio, you can fix this error as follows:
Right-click your Web Role / Worker Role (under the Roles folder in the cloud project) → Properties → Certificates
Click the ellipsis button under Thumbprint, which points to your certificate.
Upload the certificate shown here to the Windows Azure environment (Production or Staging).
Have you uploaded your publish settings file in Visual Studio and/or a management certificate? This is crucial for being trusted by your Azure subscription, which is why you could be having this issue. By the way, try upgrading to SDK 2.1 for better support and features (if possible, of course).
Adding to Arbie's answer: you can skip the first few steps. Just type "Manage user certificates" in the Windows search bar and go to Personal > Certificates.
Your certificates will be issued to "Windows Azure Tools".
You can check the thumbprint by opening the certificate and looking at the Details tab.