How do I upload a VM to Azure?

I see a lot of confusion about how to connect to Azure and upload a VM. It involves creating a management certificate with makecert and uploading with csupload, and there are a lot of flags to get wrong. So I thought I'd ask the question and answer it to save someone some trouble.

(cut from initial question and pasted as an answer)
Basic Principles
You must have Visual Studio and the Azure SDK installed.
To connect to Azure, you create a security certificate on your local machine that identifies you. Then you go to Azure and import the certificate. Now your local machine and Azure are able to talk to each other securely. For this reason you can't start the work on one machine and finish it on another. Work on one machine.
You must have the certificate in your Current User certificate store and also exported to your hard drive. You need a copy on the hard drive to upload to Azure, and you need it in the certificate store because when you connect to Azure that's where it will look for it. You can create it on your hard drive and import it, or you can create it in the certificate store and export it. The following instructions show you how to do the latter.
Create the Certificate
Open a Visual Studio command prompt as an administrator (right-click the menu item and click "Run as administrator").
Copy/paste the following:
makecert -sky exchange -r -n "CN=MyCertificateName" -pe -a sha256 -len 2048 -ss My "MyCertificateName.cer" This will create the certificate and install it in your Current User certificate store. It will not create a copy on your hard drive. It's the "My" key word that causes the certificate to be stored in the certificate store for your current account.
Open the certificate manager by typing certmgr.msc into the Start menu search box. You should see Certificates - Current User at the top. Open Personal/Certificates and you should see the certificate you just created.
Right-click the certificate and click All Tasks > Export. Click Next. Select "No, do not export the private key". Click Next. Select the DER encoded format. Click Next. Save the certificate somewhere on your hard drive, with the same name you used to create it (it doesn't have to be the same, but it avoids confusion).
Import the Certificate into Azure
Log into Azure.
Click Settings then Management Certificates then Upload.
Browse to the management certificate that you just exported and saved, and upload it.
Copy the Subscription Identifier and Thumbprint from the uploaded certificate and paste them into a text file. Save the file on your local hard drive. You need these numbers handy for the next step.
If you want to be safe, delete the certificate that you exported to your hard drive. You don't need it there any more. Azure will look for the certificate in your certificate store when it authorizes you, not on your hard drive.
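The moving parts here are simple even though the tooling is Windows-specific: a self-signed certificate, a DER-encoded .cer file containing only the public key, and a thumbprint that is just the SHA-1 hash of the DER bytes. As a sketch of the same artifacts using openssl (an assumption: makecert is Windows-only, and all filenames here are illustrative):

```shell
# Self-signed certificate + private key, roughly what makecert -r -a sha256 -len 2048 produces
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
  -subj "/CN=MyCertificateName" \
  -keyout MyCertificateName.key -out MyCertificateName.pem

# DER-encoded .cer containing only the public certificate -- the file the portal upload expects
openssl x509 -in MyCertificateName.pem -outform der -out MyCertificateName.cer

# The thumbprint shown in the portal is the SHA-1 hash of these DER bytes
openssl x509 -in MyCertificateName.pem -fingerprint -sha1 -noout
```

The fingerprint printed by the last command, with the colons removed, is the same form of thumbprint the Azure portal displays next to an uploaded management certificate.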
At this point you are able to make a secure connection between your computer/account and Azure. You will now use this secure connection to upload your Virtual Machine.
Upload your Virtual Machine
First establish a secure connection to Azure. Open an Azure command prompt as an Administrator and enter the following:
csupload Set-Connection "SubscriptionId=YourSubscriptionIdGoesHere;CertificateThumbprint=YourCertificateThumbPrintGoesHere;ServiceManagementEndpoint=https://management.core.windows.net"
Finally it's time to upload the file. Open the Azure portal, select your storage account and copy the blobs service endpoint URL. Enter the following at the same Azure command prompt as above:
csupload Add-PersistentVMImage -Destination "YourBlobServiceEndPointUrlGoesHere/vhds/YourVhdNameGoesHere" -Label YourVhdNameGoesHere -LiteralPath "ThePathToYourVhdOnTheLocalComputerGoesHere" -OS Windows
The VHD should begin to upload.

Here's an easier way. You will need:
Windows Azure PowerShell
1. Open "Windows Azure PowerShell", or open a PowerShell prompt and run:
Set-ExecutionPolicy RemoteSigned
Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
2. Get-AzurePublishSettingsFile
(This will prompt you to save a .publishsettings file, which is required in the next step.)
3. Import-AzurePublishSettingsFile "C:\Temp\Windows Azure...credentials.publishsettings"
4. Add-AzureVhd -Destination "https://<yourstorageaccount>.blob.core.windows.net/vhds/File.vhd" -LocalFilePath "C:\Users\Public\Documents\Hyper-V\Virtual hard disks\File.vhd"
For more info see:
Get Started with Windows Azure Cmdlets

Related

Installed certificates on Batch account and Pool not available for task

I have an Azure Batch account setup with system assigned identity (the account was created through TF and User assigned identities are not yet supported).
A certificate is available to the batch account and on the pool as well.
When inspecting a node in the pool (scaled to one for now), it shows a certificate reference.
I've manually created a job and a simple task (/bin/bash -c 'ls -la $AZ_BATCH_CERTIFICATES_DIR/') to list the directory's contents, and it comes back empty.
This seems to be the case for all self-signed certificates I've used to try this.
Can somebody please point out what I'm doing wrong?
(I've tried all combinations of Task-NonAdmin, Task-Admin, Pool-NonAdmin, and Pool-Admin, together with LocalMachine and CurrentUser.)
Thanks all!
Well, this thing happened:
Issue with Windows LocalMachine certificates:
If you are adding certificate references on your pool which install into the Windows LocalMachine certificate store, and are running tasks without admin access which need access to the certificate's private key, your tasks will work on the old agent but not work in the new agent.
Only pfx files where your non-admin task needs access to the private key should be moved to "My" in CurrentUser
https://github.com/Azure/Batch/issues/1
If I upload the certs to CurrentUser\My, the tasks do get the certs.

Azure Application gateway cannot upload pfx certificate

For Application Gateway, all the documentation says to upload a .pfx certificate, but when I go to the HTTP settings for the backend pool it only allows a ".cer" certificate and won't allow a ".pfx" file to be uploaded; the error displayed says wrong format.
Am I doing something wrong, or has Azure changed the functionality while the documentation has not yet been updated?
Strangely, through this command I am able to upload a PFX:
az network application-gateway ssl-cert create
Screenshot attached
Update: I am trying to do this for an existing Application Gateway.
Update 2: Strangely, when I am creating a new gateway Azure shows me an option for PFX, but I don't know why it becomes .cer when I try to do this for an existing one.
Is this one of Microsoft's easter eggs?
It seems you selected the wrong entry point in the portal.
If you are configuring Add HTTP setting, you really do need a .cer certificate.
For more information, please refer to this link.
The command az network application-gateway ssl-cert create is used to configure SSL. You can find the equivalent in the portal under Settings > Listeners.
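The underlying distinction is between the two file formats: a .pfx bundles the certificate with its private key (what SSL termination on the listener needs), while a .cer carries only the DER-encoded public certificate (what the backend HTTP settings accept). A sketch with openssl showing how both are derived from the same certificate (hostname, filenames, and password are illustrative; the az line is commented out because it requires a live gateway and an Azure login):

```shell
# Self-signed certificate + key for a hypothetical gateway hostname
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=gateway.example.com" -keyout gw.key -out gw.pem

# .pfx bundles certificate AND private key -- what the listener / ssl-cert upload wants
openssl pkcs12 -export -inkey gw.key -in gw.pem -passout pass:MyPfxPassword -out gw.pfx

# .cer is the DER-encoded public certificate only -- what backend HTTP settings accept
openssl x509 -in gw.pem -outform der -out gw.cer

# For an existing gateway, the .pfx can be uploaded from the CLI:
# az network application-gateway ssl-cert create --resource-group MyRg \
#   --gateway-name MyGateway --name MySslCert --cert-file gw.pfx --cert-password MyPfxPassword
```

This also explains the behavior in the question: the two portal blades are asking for different halves of the same key pair, not rejecting your certificate as such.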

Virtual Machine Converter 3.1 Cannot find Thumbprint

I'm attempting to use the Virtual Machine Converter 3.1 to convert a VM to Azure VM. I am following the steps here:
https://technet.microsoft.com/en-us/library/dn874004.aspx
and I have created a management certificate using the instructions here:
https://msdn.microsoft.com/library/azure/gg551722.aspx
and I have uploaded this to my Azure Subscription. I have verified that the certificate is in my Personal Store, and I have even copied this to the Local Machine Store. Both Certificates show that they have private keys as expected and the certificate uploaded and shows in the Azure Management Certificates as well.
When I run the MVMC and I provide the Subscription ID and the Certificate Thumbprint I get the message: "No certificate exists with thumbprint XXXXXXXXXXXX...". I cannot get past this screen to successfully migrate the VM to Azure, does anyone have a recommendation or suggestion?
I know this is old, but I thought I would post the answer to help whoever finds this. ;)
From: https://support.microsoft.com/en-us/kb/2977336
"Certificate thumbprint is not found in the personal certificate store" error when you try to migrate to Microsoft Azure using Microsoft Virtual Machine Converter
To resolve this problem, follow these steps:
1. Start the MMC.exe process.
2. Click File, click Add/Remove Snap-in, and then click Certificates.
3. On the Certificates snap-in screen, click Add, and then select My user account. Click Finish, and then click OK.
4. Expand Console Root, expand Certificates - Current User, expand Personal, and then expand Certificates.
5. Right-click your Microsoft Azure certificate. By default, this is named Microsoft Azure Tools. Select All Tasks, and then click Export.
6. Click Next, and then click No, do not export the private key. Click Next.
7. On the Certificate Export Wizard screen, click DER encoded X.509 (.CER), and then click Next.
8. Type a file name, click Next, and then click Finish.
9. Expand Console Root\Certificates - Current User, expand Trusted Root Certification Authorities, and then expand Certificates.
10. Right-click Certificates, and then click Import.
11. Click Next, and then locate the file that you exported in step 8.
12. Follow the steps in the wizard to complete the import process. Verify that the Microsoft Azure Tools certificate now appears in both the Personal and Trusted Root Certification Authorities stores.
13. Return to MVMC, and then retry the Azure Connection task.

Can't create new schedules from Azure Websites

I followed the tutorial here for setting up the Azure Scheduler:
http://fabriccontroller.net/blog/posts/a-complete-overview-to-get-started-with-the-windows-azure-scheduler/
I want to run my application on an Azure Website but it is blocking me from creating my X509Certificate.
I found this article: http://blog.tylerdoerksen.com/2013/08/23/pfx-certificate-files-and-windows-azure-websites/
Which points out the issue:
"Well it turns out that when you load certificates the system will use a local directory to store the key (??). The default location for the key is under the local user profile, and with Windows Azure Websites, there is no local user profile directory."
So, following his advice and adding the flag X509KeyStorageFlags.MachineKeySet, I get past:
CryptographicException: The system cannot find the file specified
but I now get:
CryptographicException: Access denied.
Is there really no way to use the SDK from an Azure Website? It defeats a lot of the appeal of the Azure Scheduler if I am forced to use a WebRole instead of an Azure Website.
In this thread: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/cfe06e73-53e1-4030-b82d-53200be37647/load-privately-created-p12-cert-from-azureblob-and-have-it-be-trusted
It appears they are successfully creating an X509Certificate on an Azure Website, so what is different about mine that it throws "Access Denied" when I try?
The problem was with using the ManagementCertificate string in the PublishSettings file... I created a self-signed certificate on my local machine using the Visual Studio console and exported both a '.cer' and a '.pfx'.
Uploaded the self signed .cer into my Azure/Settings/Management Certificates
Bundled the .pfx with my solution and published to Azure Web Sites
Then used the following code to create the certificate:
var certificate = new X509Certificate2(
    HttpContext.Current.Server.MapPath("~/<filename>.pfx"),
    "<password>",
    X509KeyStorageFlags.MachineKeySet);

Azure cloud deployment fails : Certificate with thumbprint was not found

I am developing a Web API based web service to be hosted on Azure. I am using Azure 1.8 SDK.
When I try to deploy my cloud service, it takes a very long time to upload after which I get an error message which says:
12:09:52 PM - Error: The certificate with thumbprint d22e9de125640c48a4f83de06ae6069f09cfb76c was not found. Http Status Code: BadRequest OperationId: 50daf49111c9487f82f3be09763e7924
12:09:53 PM - Deployment failed with a fatal error
Apparently, the certificate being referred to is related to enabling remote desktop on the role instances in the cloud (I am not sure about this; I saw it on the internet for a similar problem). However, I did not check the option to enable remote desktop on the instances while publishing.
What could be going wrong here?
What worked for me was:
1. Go to PowerShell and type mmc
2. Add the Certificates snap-in by going to File > Add/Remove Snap-in > choose Certificates from the list > choose My user account
3. Right-click Certificates - Current User and select Find Certificates
4. In the dialog box, set Contains to 'azure' and Look in Field to 'Issued To'
5. Press Find Now. You should see a list of certificates.
6. Check for the thumbprint by double-clicking the certificate > Details tab > scroll down to Thumbprint
7. Once you have found your certificate, close the dialog, right-click the certificate and select Export
8. Select to export the private key, and follow the steps until you have a *.pfx file for upload to Azure
9. Go to your service and select the Certificates tab
10. Click Upload, select the exported *.pfx file, and supply the password you set during export
11. Go to the Dashboard and update the cloud package
The certificate used in your project doesn't exist on the cloud environment. Make sure the same certificate used by your project is uploaded to the cloud environment. If you are using Visual Studio then you can fix this error as follows:
Right click your Web Role / Worker Role (under Roles folder in the cloud project) → Properties → Certificates
Click on the ellipsis button under Thumbprint which will point to your certificate.
Upload the certificate shown here to the Windows Azure environment (Production or Staging).
Have you uploaded your publishing settings file in Visual Studio and/or a management certificate? This is crucial to being trusted by your Azure subscription, and hence why you could be having this issue. BTW, try upgrading to SDK 2.1 for better support and features (if possible, of course).
Adding to Arbie's answer. You can avoid the first few steps. Just type "Manage user certificates" in windows search bar. Go to Personal > Certificates.
Your certificate will show Issued To as "Windows Azure Tools".
You can check for the thumbprint by opening the certificate and checking the Details.
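When chasing a "certificate with thumbprint X was not found" error, it helps to confirm which local file actually has that thumbprint. The thumbprint Azure reports is the SHA-1 fingerprint of the DER-encoded certificate with the colons removed. A sketch with openssl (the demo certificate is illustrative; against a real deployment you would run the last command on your own .cer file):

```shell
# A throwaway certificate standing in for the one your cloud project references
openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
  -subj "/CN=demo" -keyout demo.key -out demo.pem

# Convert to DER (.cer), the form whose SHA-1 hash is the thumbprint
openssl x509 -in demo.pem -outform der -out demo.cer

# Print the thumbprint in the colon-free form Azure displays
openssl x509 -inform der -in demo.cer -fingerprint -sha1 -noout \
  | cut -d= -f2 | tr -d ':'
```

If the printed value matches the thumbprint in the deployment error, that file is the certificate Azure is looking for, and the one to export (with its private key) and upload to the service's Certificates tab.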
