Virtual Machine Converter 3.1 Cannot Find Thumbprint - Azure

I'm attempting to use the Virtual Machine Converter 3.1 to convert a VM to Azure VM. I am following the steps here:
https://technet.microsoft.com/en-us/library/dn874004.aspx
and I have created a management certificate using the instructions here:
https://msdn.microsoft.com/library/azure/gg551722.aspx
and I have uploaded this to my Azure subscription. I have verified that the certificate is in my Personal store, and I have even copied it to the Local Machine store. Both certificates show that they have private keys, as expected, and the uploaded certificate shows under Azure Management Certificates as well.
When I run the MVMC and provide the Subscription ID and the Certificate Thumbprint, I get the message: "No certificate exists with thumbprint XXXXXXXXXXXX...". I cannot get past this screen to migrate the VM to Azure. Does anyone have a recommendation or suggestion?

I know this is old, but I thought I would post the answer to help whoever finds this. ;)
From: https://support.microsoft.com/en-us/kb/2977336
"Certificate thumbprint is not found in the personal certificate store" error when you try to migrate to Microsoft Azure using Microsoft Virtual Machine Converter
To resolve this problem, follow these steps:
1. Start the MMC.exe process.
2. Click File, click Add/Remove Snap-in, and then click Certificates.
3. On the Certificates snap-in screen, click Add, and then select My user account. Click Finish, and then click OK.
4. Expand Console Root, expand Certificates - Current User, expand Personal, and then expand Certificates.
5. Right-click your Microsoft Azure certificate. By default, this is named Microsoft Azure Tools. Select All Tasks, and then click Export.
6. Click Next, and then click No, do not export the private key. Click Next.
7. On the Certificate Export Wizard screen, click DER encoded X.509 (.CER), and then click Next.
8. Type a file name, click Next, and then click Finish.
9. Expand Console Root\Certificates - Current User, expand Trusted Root Certification Authorities, and then expand Certificates.
10. Right-click Certificates, and then click Import.
11. Click Next, and then locate the file that you exported in step 8.
12. Follow the steps in the wizard to complete the import process. Verify that the Microsoft Azure Tools certificate now appears in both the Personal and Trusted Root Certification Authorities stores.
13. Return to MVMC, and then retry the Azure Connection task.
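To rule out a simple mismatch before retrying, you can compute the thumbprint of the exported .cer yourself: Azure's "thumbprint" is just the certificate's SHA-1 fingerprint with the colons removed. A sketch using openssl (file names are stand-ins, and a throwaway self-signed certificate is generated so the commands run end to end):

```shell
# Throwaway self-signed cert standing in for the "Microsoft Azure Tools"
# certificate you exported from the MMC.
openssl req -x509 -newkey rsa:2048 -nodes -sha256 \
  -subj "/CN=Microsoft Azure Tools" \
  -keyout demo-key.pem -out demo-cert.pem 2>/dev/null

# DER-encoded .cer, the same format the export wizard produces.
openssl x509 -in demo-cert.pem -outform DER -out demo.cer

# The SHA-1 fingerprint minus the colons is the thumbprint MVMC expects.
openssl x509 -inform DER -in demo.cer -noout -fingerprint -sha1 \
  | cut -d= -f2 | tr -d ':'
```

If the 40-character value printed here does not match what the portal shows, you exported the wrong certificate.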

Related

Facing access denied (HTTP 403) while accessing the Azure Service Fabric Explorer

I am deploying an Azure Service Fabric cluster following the Microsoft docs. When I open the cluster using Service Fabric Explorer, I get "Access to srvfabric.cloudapp.azure.com was denied. You don't have the user rights to view this page. HTTP error 403."
When I deployed the SFC cluster, I got the same error code as yours:
Make sure the Key Vault certificate used to create the SFC cluster is correct, with your SFC domain name added as the common name (CN).
Download that certificate as a .pfx to your local machine and import it into the certificate manager: run certmgr.msc from the Run command to open Certificate Manager, then click Personal > Certificates > All Tasks > Import and select the downloaded .pfx from your local folder.
Click Next > Next; no password is required. Complete the wizard and the certificate will be imported.
While creating the Service Fabric cluster, I added this certificate for authentication (the certificate can also be added after SFC deployment).
Now browse the URL from the Overview tab. If you receive a "Your connection is not private" error, choose Advanced and proceed. If prompted for a certificate, select the certificate you imported in the step above, and you will be routed to the SFC Explorer successfully.
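As a quick check outside the portal, you can confirm the certificate's CN really matches the cluster's domain with openssl. This is only a sketch: the domain is a stand-in, and a throwaway self-signed cert is generated in place of the one you would download from Key Vault.

```shell
# Stand-in domain; replace with your cluster's DNS name.
DOMAIN="srvfabric.cloudapp.azure.com"

# Throwaway self-signed cert with the cluster domain as CN, standing in
# for the certificate held in Key Vault.
openssl req -x509 -newkey rsa:2048 -nodes -sha256 \
  -subj "/CN=${DOMAIN}" -keyout sfc-key.pem -out sfc-cert.pem 2>/dev/null

# The CN in the subject must match the domain you browse to.
CN=$(openssl x509 -in sfc-cert.pem -noout -subject -nameopt RFC2253 | sed 's/^subject=CN=//')
echo "certificate CN: ${CN}"
[ "${CN}" = "${DOMAIN}" ] && echo "CN matches cluster domain"
```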

Error uploading .pfx certificate to Azure Web app using ARM template from VSTS

I'm using an Azure Resource Manager (ARM) template to create and update a resource group in a release definition in Visual Studio Team Services (VSTS). I'm using the same template to upload the .pfx certificate to the web app.
For the first deployment the certificate got uploaded perfectly, but from the next deployment onward the deployment fails with the error "Another certificate exists with same thumbprint ******** at location West US in the Resource Group MyResourceGroup".
I tried recreating the web app, but to my surprise the deployment then failed even on the first attempt. It looks like the certificate got uploaded at the resource group level.
Is there a way to overwrite the existing .pfx certificate on every deployment?
You do not have to upload the certificate for every deployment; the first certificate becomes available to all deployments.
Certificates are exposed at the resource group level, so deploying the same certificate again will definitely error out.
However, I don't see why you need to upload a certificate on each deployment.
Does your application need to read the certificate? If yes, there is a different way to do this; see this article:
https://azure.microsoft.com/en-us/blog/using-certificates-in-azure-websites-applications/
Until today I had never encountered this error. I have been able to redeploy my applications, certificates and all, with no issues. I believe that in my case someone had previously added the certificate manually, possibly through the portal, and that when my pipeline executed it attempted to add the same certificate under a different name.
Certificates are child resources of Microsoft.Web under the resource group. There are likely a number of options for resolving this, but I am going to focus on removing the certificate using Resource Explorer. (I bet there is an Azure CLI or Azure PowerShell command to do this too.)
In resource explorer, locate the certificates node associated with your resource group using the left hand navigation pane. This will likely be in something like subscriptions -> {subscription name} -> resourceGroups -> {resource group name} -> providers -> Microsoft.Web -> certificates -> {certificate name}
Once located, select your certificate and then can use the Actions (POST, DELETE) tab in the right hand pane to delete the certificate. You should then be able to redeploy.
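For the scripted route alluded to above, the generic `az resource delete` command should work, since the certificate is an ordinary Microsoft.Web child resource. A sketch that only composes and prints the command (the resource group and certificate names are hypothetical, and actually running it requires `az login`):

```shell
# Hypothetical names; substitute your own resource group and certificate.
RG="MyResourceGroup"
CERT="MyUploadedCert"

# Compose the Azure CLI delete call; echoed rather than executed here,
# since running it needs an authenticated az session.
CMD="az resource delete --resource-group ${RG} --resource-type Microsoft.Web/certificates --name ${CERT}"
echo "would run: ${CMD}"
```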

Azure cloud deployment fails : Certificate with thumbprint was not found

I am developing a Web API based web service to be hosted on Azure. I am using Azure 1.8 SDK.
When I try to deploy my cloud service, it takes a very long time to upload after which I get an error message which says:
12:09:52 PM - Error: The certificate with thumbprint d22e9de125640c48a4f83de06ae6069f09cfb76c was not found. Http Status Code: BadRequest OperationId: 50daf49111c9487f82f3be09763e7924
12:09:53 PM - Deployment failed with a fatal error
Apparently, the certificate being referred to is related to enabling Remote Desktop on the role instances (I am not sure about this; I saw it on the internet for a similar problem). However, I did not check the option to enable Remote Desktop on the instances while publishing.
What could be going wrong here?
What worked for me was:
Go to PowerShell and type mmc
Add the Certificates snap-in by going to File > Add/Remove Snap-in > choose Certificates from the list > choose My user account
Right-click Certificates - Current User and select Find Certificates
In the dialog box, set Contains to 'azure' and Look in Field to 'Issued To'
Press Find Now. You should see a list of certificates.
Check the thumbprint by double-clicking the certificate > Details tab > scroll down to Thumbprint
Once you have found your certificate, close the dialog, right-click it and select Export
Select to export the private key. Follow the steps until you have a *.pfx file for upload to Azure
Go to your service and select the Certificates tab
Click Upload, select the exported *.pfx file, and supply the password you set during export
Go to the Dashboard and update the cloud package
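If you prefer the command line to the MMC export wizard, an equivalent password-protected .pfx can be produced with openssl. This is a sketch: a throwaway key/cert pair stands in for the certificate found in your store, and the password is illustrative.

```shell
# Throwaway key/cert pair standing in for the certificate found above.
openssl req -x509 -newkey rsa:2048 -nodes -sha256 \
  -subj "/CN=azure-demo" -keyout demo-key.pem -out demo-cert.pem 2>/dev/null

# Bundle the certificate and its private key into a password-protected .pfx.
openssl pkcs12 -export -inkey demo-key.pem -in demo-cert.pem \
  -passout pass:changeit -out demo.pfx

# Sanity check: the bundle opens with the same password you'll supply on upload.
openssl pkcs12 -in demo.pfx -passin pass:changeit -noout && echo "pfx OK"
```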
The certificate used in your project doesn't exist on the cloud environment. Make sure the same certificate used by your project is uploaded to the cloud environment. If you are using Visual Studio then you can fix this error as follows:
Right click your Web Role / Worker Role (under Roles folder in the cloud project) → Properties → Certificates
Click on the ellipsis button under Thumbprint which will point to your certificate.
Upload the certificate shown here to the Windows Azure environment (Production or Staging)
Have you uploaded your publishing settings file in Visual Studio and/or a management certificate? This is crucial for being trusted by your Azure subscription, and could be why you are having this issue. BTW, try upgrading to SDK 2.1 for better support and better features (if possible, of course).
Adding to Arbie's answer. You can avoid the first few steps. Just type "Manage user certificates" in the Windows search bar. Go to Personal > Certificates.
Your certificates would have Issued to "Windows Azure Tools".
You can check for the thumbprint by opening the certificate and checking the Details.

How do I upload a VM to Azure

I see a lot of confusion about how to connect to Azure and upload a VM. It involves creating a management certificate with makecert and uploading with csupload, and there are a lot of flags to get wrong. So I thought I'd ask the question and answer it to save someone some trouble.
(cut from initial question and pasted as an answer)
Basic Principles
You must have Visual Studio and the Azure SDK installed.
To connect to Azure, you create a security certificate on your local machine that identifies you. Then you go to Azure and import the certificate. Now your local machine and Azure are able to talk to each other securely. For this reason you can't start the work on one machine and finish it on another. Work on one machine.
You must have the certificate in your Current User certificate store and also exported to your hard drive. You need a copy on the hard drive to upload to Azure, and you need it in the certificate store because when you connect to Azure that's where it will look for it. You can create it on your hard drive and import it, or you can create it in the certificate store and export it. The following instructions show you how to do the latter.
Create the Certificate
Open a Visual Studio command prompt as an Administrator (right-click the menu item and click "Run as administrator").
Copy/paste the following:
makecert -sky exchange -r -n "CN=MyCertificateName" -pe -a sha256 -len 2048 -ss My "MyCertificateName.cer"
This will create the certificate and install it in your Current User certificate store. It will not create a copy on your hard drive. It's the "My" keyword that causes the certificate to be stored in the certificate store for your current account.
Open the certificate manager by typing certmgr.msc in the Start menu. You should see Certificates - Current User at the top. Open Personal/Certificates and you should see the certificate you just created.
Right-click the certificate and click All Tasks, Export. Click Next. Select "No, do not export the private key". Click Next. Select the DER encoded format. Click Next. Save the certificate somewhere on your hard drive with the same name as you used to create it (it doesn't have to be the same, but it avoids confusion).
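If makecert is not available on your machine, openssl can produce a comparable self-signed certificate plus the DER-encoded .cer. This is a hedged alternative, not the SDK's documented path; note that unlike makecert with -ss My, it writes the private key to disk rather than into the certificate store, so guard the .key file.

```shell
# Self-signed 2048-bit SHA-256 certificate, comparable to the makecert call above.
# Unlike makecert -ss My, this leaves the private key on disk as a .key file.
openssl req -x509 -newkey rsa:2048 -nodes -sha256 -days 365 \
  -subj "/CN=MyCertificateName" \
  -keyout MyCertificateName.key -out MyCertificateName.pem 2>/dev/null

# The Azure management-certificate upload expects the DER-encoded .cer form.
openssl x509 -in MyCertificateName.pem -outform DER -out MyCertificateName.cer
```

You would still need to import the certificate into your Current User store (e.g. via certmgr.msc) so that csupload can find it when connecting.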
Import the Certificate into Azure
Log into Azure.
Click Settings then Management Certificates then Upload.
Browse to the management certificate that you just exported and saved, and upload it.
Copy the Subscription Identifier and Thumbprint from the uploaded certificate and paste them into a text file. Save the file on your local hard drive. You need these numbers handy for the next step.
If you want to be safe, delete the certificate that you exported to your hard drive. You don't need it there any more. Azure will look for the certificate in your certificate store when it authorizes you, not on your hard drive.
At this point you are able to make a secure connection between your computer/account and Azure. You will now use this secure connection to upload your Virtual Machine.
Upload your Virtual Machine
First establish a secure connection to Azure. Open an Azure command prompt as an Administrator and enter the following:
csupload Set-Connection "SubscriptionId=YourSubscriptionIdGoesHere;CertificateThumbprint=YourCertificateThumbPrintGoesHere;ServiceManagementEndpoint=https://management.core.windows.net"
Finally it's time to upload the file. Open the Azure portal, select your storage account and copy the blobs service endpoint URL. Enter the following at the same Azure command prompt as above:
csupload Add-PersistentVMImage -Destination "YourBlobServiceEndPointUrlGoesHere/vhds/YourVhdNameGoesHere" -Label YourVhdNameGoesHere -LiteralPath "ThePathToYourVhdOnTheLocalComputerGoesHere" -OS Windows
The VHD should begin to upload.
Here's an easier way. You will need:
Windows Azure PowerShell
1. Open "Windows Azure PowerShell", or open a PS prompt and run:
Set-ExecutionPolicy RemoteSigned
Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
2. Get-AzurePublishSettingsFile
(Will prompt you to save a .publishsettings file required in the next step)
3. Import-AzurePublishSettingsFile "C:\Temp\Windows Azure...credentials.publishsettings"
4. Add-AzureVhd -Destination "https://YourStorageAccountGoesHere.blob.core.windows.net/vhds/File.vhd" -LocalFilePath "C:\Users\Public\Documents\Hyper-V\Virtual hard disks\File.vhd"
For more info see:
Get Started with Windows Azure Cmdlets

Azure/Wasabi: Certificate never makes it to the CurrentUser/My store

I am attempting to get Wasabi (the Enterprise library autoscaling block) to work within an Azure worker role. The Wasabi worker role (Extra Small, full trust) is scaling a different worker role within the same service. It works perfectly from a local console app, with an identical configuration - given the errors, the certificate isn't making it to the VM. I am using the latest versions of the Azure SDKs, the enterprise library autoscaling block, and the Azure portal.
Here are the steps I took, based on these docs: http://msdn.microsoft.com/en-us/library/hh680937(v=pandp.50).aspx
I created a management certificate as per the directions here: http://msdn.microsoft.com/en-us/library/gg432987.aspx.
I exported the .pfx with the private key and gave it a password.
I uploaded the .cer to the Settings->Management Certificates section on the portal.
I uploaded the .pfx with the correct password to the Cloud Services->(My Service)->Certificates, noting the thumbprint listed.
I created some trivial rules that scale up my app a few instances, and correctly configured the service information to use my new cert. The XML files are in blob storage. This exactly configuration works fine on my local machine in a console app.
I added an entry in the Wasabi role's configuration, using the correct cert name and thumbprint. I set it to use the CurrentUser\My store. I confirmed that the .csdef and .cscfg files were correctly updated.
I deploy the service to staging on Azure, using the publish tool. The certificate configuration setting correctly shows up in the Configuration setting for the role.
It doesn't scale the app. I check the trace entries, and it has an exception when trying to pull access the management API. It is trying to access the right subscription, and it's trying to find the correct certificate thumbprint in the right store, so my configuration is being loaded correctly. It claims that it cannot find the certificate with that thumbprint in that store.
I tried the LocalMachine\My store (configured in the role certificate settings and in the service information store XML), and I got a different exception: the error listed in Azure WASABi SecurityNegotiationException. The fix there was ultimately to go to CurrentUser, so that doesn't help me here.
I tried a lot of other combinations of CurrentUser\LocalMachine and different stores, and all CurrentUser locations result in certificate-not-found, and all LocalMachine stores result in the other exception.
I triple-checked the thumbprints in the role settings, the portal (certificate page) and the service information file, and they all match.
I then enabled remote-desktop and logged in to the Wasabi role instance, and used MMC to look at the certificate configuration for both the local machine and the current user. When I selected the LocalMachine store in the role certificate settings, the certificate did show up in the LocalMachine store, which tells me that the certificate is correctly installed in the service and the thumbprints match. When the CurrentUser store is selected, the certificate is not visible anywhere. This could be because the user that is used by RDP is not the same user as the service, but it does match the error.
So, in summary:
The certificate was correctly configured and installed in the portal (management certificates for the subscription, and service certificates).
Apparently, you must use the CurrentUser location, not the LocalMachine location, for the Wasabi role (as per the linked SO question).
When I'm trying to install to the CurrentUser, the certificate is not getting placed in the VM, at least not anywhere that the role can find it.
Any ideas?
Thanks!
See my answer to this SO POST. The certificate must be in LocalMachine and because of config changes in SDK 1.8 and Server2012 role initialization you have to run the worker role with elevated permissions to give NETWORK SERVICE access to the cert's private keys. Edit ServiceDefinition.csdef
<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="blah" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2012-10.1.8">
  <WorkerRole name="blah" vmsize="Small">
    <Runtime executionContext="elevated" />
    ...
  </WorkerRole>
</ServiceDefinition>