Invalid VHD path error when trying to create an Azure Image

I'm trying to create a new image using the Azure Portal.
When I upload an image as any blob type to a container in a storage account and then enter the path to it, I get the following error: "Invalid VHD blob path. Please make sure the path to the VHD is valid."
The path looks like this: "https://storageAccountName.blob.core.windows.net/containerName/filename"
What am I doing wrong, and how do I fix it?

Please make sure the VHD path ends in .vhd, like this: https://storageAccountName.blob.core.windows.net/containerName/myUploadedVHD.vhd
Also, upload .vhd files as page blobs; Azure disks and images require page blobs, not block blobs.
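For illustration, here is a minimal sketch of uploading a VHD as a page blob with the Az.Storage PowerShell module; the account, container, and file names below are placeholders:

# Minimal sketch using Az.Storage; account/container/file names are placeholders.
$ctx = New-AzStorageContext -StorageAccountName "storageAccountName" -StorageAccountKey "<account-key>"

# -BlobType Page matters: Azure disks and images must be page blobs.
Set-AzStorageBlobContent -File "C:\vhds\myUploadedVHD.vhd" `
    -Container "containerName" `
    -Blob "myUploadedVHD.vhd" `
    -BlobType Page `
    -Context $ctx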
For more information, see:
Prepare a Windows VHD or VHDX to upload to Azure
Create a Windows VM from a specialized disk by using PowerShell
Creating An Azure VM From The VHDX/VHD File

Related

How to upload a file from Azure blob storage to a Linux VM created on Azure

I have one large file in my Azure blob storage container. I want to move the file from blob storage to a Linux VM created on Azure. How can I do that using Data Factory, or any PowerShell command?
The easiest way, without any tools, is to generate a SAS token for the blob and run cURL.
Generate the SAS URL, then download the blob with cURL:
curl <blob_sas_url> -o output.txt
If you need this automated, you can generate the SAS URL from a script, or just use AzCopy.
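If you script it, generating the SAS URL could look roughly like this (a sketch using the Az.Storage module; the account, container, and blob names are placeholders):

# Sketch: create a read-only SAS URL valid for one hour (placeholder names).
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
$sasUrl = New-AzStorageBlobSASToken -Container "mycontainer" -Blob "bigfile.dat" `
    -Permission r -ExpiryTime (Get-Date).AddHours(1) -FullUri -Context $ctx

# On the Linux VM (real curl, not the PowerShell alias):
curl $sasUrl -o output.txt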
Please see this blog post: How to copy data to VM from blob storage. It gives you a way to solve the problem with Data Factory:
"To anyone who might get into same problem in future, I solved my problem by using 'copy wizard' present in ADF.
We need to install Data Management Gateway on VM and register it before we use 'copy wizard'.
We need to specify blob storage as source and in destination we need to choose 'File Server Share' option. In 'File Server Share' option we need to specify user credentials which I suppose pipeline uses to login to VM, folder on VM where pipeline will copy the data."
From the Azure Blob storage documentation, another option is to mount Blob storage as a file system with blobfuse on Linux.
Blobfuse is a virtual file system driver for Azure Blob storage. It lets you access your existing block blob data in a storage account through the Linux file system, using the virtual directory scheme with the forward slash '/' as a delimiter.
That guide shows you how to use blobfuse to mount a Blob storage container on Linux and access the data. To learn more about blobfuse, read the details in the blobfuse repository.
If you want to use AzCopy, see Transfer data with AzCopy and Blob storage. You can download AzCopy for Linux; the document provides the commands for uploading and downloading files.
For example, to upload a file:
azcopy copy "<local-file-path>" "https://<storage-account-name>.<blob or dfs>.core.windows.net/<container-name>/<blob-name>"
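The download direction just swaps the arguments, e.g.:
azcopy copy "https://<storage-account-name>.blob.core.windows.net/<container-name>/<blob-name>" "<local-file-path>"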
For PowerShell, you need PowerShell Core 6.x or later on non-Windows platforms. It works with Windows and Linux virtual machines using Windows PowerShell 5.1 (Windows only) or PowerShell 6 (Windows and Linux).
You can find the PowerShell commands in this document: Quickstart: Upload, download, and list blobs by using Azure PowerShell.
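For example, downloading a blob to the VM's disk could look like this (a sketch using the Az.Storage cmdlets from that quickstart; names are placeholders):

# Sketch: download a blob to the local disk (placeholder names).
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
Get-AzStorageBlobContent -Container "mycontainer" -Blob "bigfile.dat" `
    -Destination "/home/azureuser/bigfile.dat" -Context $ctx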
Here is another link that covers Copy Files to Azure VM using PowerShell Remoting 6 (Windows and Linux).
Hope this helps.
You have many options to copy content from the blob store to the disk on the VM:
1. Use AzCopy
2. Use Azure Pipelines - File copy task
3. Use PowerShell cmdlets
A lot of content is available on these approaches on SO!
It seems this is not properly documented anywhere, so I am sharing the most basic approach, which is to use the azcopy tool available for both Windows and Linux. This approach doesn't need the complexity of creating credentials/tokens.
Download azcopy
It's a simple executable which can be run directly after extraction
Create a managed identity (system-assigned identity) for your virtual machine. Navigate to VM -> Identity -> turn the Status to 'On' -> Save
Now the VM can be assigned permission at the following levels:
Storage account
Container (file system)
Resource group
Subscription
For this case, navigate to the storage account -> IAM -> Add role assignment -> select the role 'Storage Blob Data Contributor' -> assign access to 'Virtual machine' -> select the desired VM -> Save
NOTE: If you give access to the VM on IAM properties of a Resource Group, the VM will be able to access all the storage accounts of the RG.
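If you prefer scripting the role assignment over the portal steps above, a rough sketch with the Az module (the resource names and subscription ID are placeholders):

# Sketch: grant the VM's system-assigned identity data access on one storage account.
$vm = Get-AzVM -ResourceGroupName "myRG" -Name "myVM"
New-AzRoleAssignment -ObjectId $vm.Identity.PrincipalId `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/mystorageaccount"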
Log in to the VM and assume the identity (run the command from the same location where azcopy is located):
For windows : azcopy login --identity
For linux : ./azcopy login --identity
Upload or download the files now:
azcopy cp "source-file" "storageUri/blob-container/" --recursive=true
Example: azcopy cp "C:\test.txt" "https://mystorageaccount.blob.core.windows.net/backup/" --recursive=true
IAM permissions can take a few minutes to propagate. If you change or add permissions or access levels anywhere, run the azcopy login --identity command again to pick up the updated identity.
More info on AzCopy is available here

How to download Azure blob content with the same name as the file

I have an Azure storage account where I have created a folder to upload and download files. I also perform a rename operation on files, e.g. when I rename a file and upload it to the blob, all the blob metadata gets updated successfully.
Please suggest the changes.
How to download Azure blob content with the same name as the file
As Gaurav Mantri said, you could specify the ContentDisposition property for your blob. Using Azure Storage Explorer, you can quickly set the ContentDisposition property from the blob's properties.
But when downloading the image, the ContentDisposition header seemed not to work at all. Then I found a similar issue: you also need to set the DefaultServiceVersion for your blob storage service. You need to do that in code; for more details, refer to here and choose your development language.
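With the Az.Storage module, the two steps could look roughly like this (a sketch; the names are placeholders, and accessing ICloudBlob assumes the older blob API surface):

# Sketch: set DefaultServiceVersion once per storage account, then set the
# blob's ContentDisposition so downloads get a filename (placeholder names).
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
Update-AzStorageServiceProperty -ServiceType Blob -DefaultServiceVersion "2017-07-29" -Context $ctx

$blob = Get-AzStorageBlob -Container "images" -Blob "photo.jpg" -Context $ctx
$blob.ICloudBlob.Properties.ContentDisposition = 'attachment; filename="photo.jpg"'
$blob.ICloudBlob.SetProperties()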
Additionally, if you upload/download your blob files programmatically, you could refer to issue1 and issue2.

AzCopy uploading local files to Azure Storage as files, not Blobs

I'm attempting to upload 550K files from my local hard drive to Azure Blob Storage using the following command (AzCopy 5.1.1):
AzCopy /Source:d:\processed /Dest:https://ContainerX.file.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
It starts churning right away.
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
Also, is there any way to keep the date created (or similar) values of the old files?
Thanks,
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
The reason you're seeing this behavior is because you're uploading to File storage instead of Blob storage. To upload the files to Blob storage, you need to specify blob service endpoint (blob.core.windows.net). So your command would be:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Also, is there any way to keep the date created (or similar) values of the old files?
Assuming you want to keep the blob's creation date the same as that of the desktop file, that is not possible. A blob's Last Modified date/time is a system property that gets assigned when the blob is created and is updated every time the blob is changed. You could, however, make use of the blob's metadata and store the file's creation date/time there.
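A sketch of that metadata workaround (placeholder names; this uses the Az.Storage module rather than AzCopy):

# Sketch: store the local file's creation time in blob metadata at upload time.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
$file = Get-Item "d:\processed\report01.csv"
Set-AzStorageBlobContent -File $file.FullName -Container "fec-data" `
    -Blob "Reports/report01.csv" `
    -Metadata @{ "OriginalCreationTimeUtc" = $file.CreationTimeUtc.ToString("o") } `
    -Context $ctx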
I think you have to target the blob service endpoint where you want to deploy the file, like:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Blob: Upload
Upload single file
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:key /Pattern:"abc.txt"
If the specified destination container does not exist, AzCopy will create it and upload the file into it.
Upload single file to virtual directory
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/vd /DestKey:key /Pattern:abc.txt
If the specified virtual directory does not exist, AzCopy will upload the file to include the virtual directory in its name (e.g., vd/abc.txt in the example above).
Please refer to this link: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy

Failed vhd copy: "the blob type is invalid for this operation"

I'm trying to clone a virtual machine on Azure. I stopped the VM and navigated to its VHD file in blob storage using Neudesic's Azure Storage Explorer. But when I try to copy the VHD blob, I get the error "the blob type is invalid for this operation" after the target blob is created but before any bytes are copied. What steps am I missing?
I think you have found a bug in Azure Storage Explorer. It is basically trying to copy as a block blob while your original blob is a page blob.
To confirm, I tried to copy a VHD file and captured the request in Fiddler: the x-ms-blob-type request header is sent as BlockBlob instead of PageBlob.
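As a workaround until the tool is fixed, a server-side copy preserves the source blob type; a sketch with the Az.Storage cmdlets (placeholder names):

# Sketch: Start-AzStorageBlobCopy does a server-side copy, so a page blob
# stays a page blob (placeholder names).
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
Start-AzStorageBlobCopy -SrcContainer "vhds" -SrcBlob "myvm.vhd" `
    -DestContainer "vhds" -DestBlob "myvm-clone.vhd" -Context $ctx

# Wait for the asynchronous copy to finish.
Get-AzStorageBlobCopyState -Container "vhds" -Blob "myvm-clone.vhd" -Context $ctx -WaitForComplete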

How to attach my uploaded VHD to a virtual machine in Azure?

I have successfully uploaded my 1 TB VHD (not containing Windows files) to Azure storage.
Now I want to attach it as a second drive to my virtual machine, but in the attach list I can find only the "attach an empty disk" option!
I used Add-AzureVhd to upload the VHD file:
Creating new page blob of size 999653638656...
I linked the storage resource in the Cloud Service, but the VHD is still not available to mount.
The storage container where I uploaded my VHD is the same one where the C: drive of my VM is saved.
The container access is set to private.
Will it help if I change it to Public Blob or Public Container?
What else to try?
Thanks
Take a look at the PowerShell command Add-AzureDataDisk. This should be what you're looking for, as you can specify the media location of the uploaded VHD.
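A sketch with the classic (ASM) cmdlets, matching the Add-AzureVhd workflow in the question (the service, VM, label, and blob names are placeholders):

# Sketch: import the uploaded VHD as a data disk on an existing classic VM.
Get-AzureVM -ServiceName "myCloudService" -Name "myVM" |
    Add-AzureDataDisk -ImportFrom -DiskLabel "DataDisk1" -LUN 0 `
        -MediaLocation "https://mystorage.blob.core.windows.net/vhds/mydata.vhd" |
    Update-AzureVM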
Alternatively, in the portal, go to Virtual Machines and navigate to the Disks tab, where you can create a new disk and point it at your uploaded VHD.
After this is done, the new disk should become available for you to add to a virtual machine.
It should show the options to attach (Empty Disk and Existing Disk) as shown in this link from the Azure documentation.
Assuming the above is not possible, for whatever reason, the alternative is:
Since you already say you can see "Attach Empty Disk", you can attach a 1 TB empty disk, then download the blob contents and put them there.
You won't be charged for the outbound bandwidth, as it is all internal.
Make sure you used CSUpload and not just pushed the VHD to blob storage. See: http://msdn.microsoft.com/en-us/library/windowsazure/gg466228.aspx
