How can I tell what is using a large VHD? (Azure)

I am running a VM in Windows Azure. It has two disks attached to it (OS 40GB and DATA 60GB).
In addition to my two VHDs, the storage account contains one more 40 GB VHD named dmzvyyq2.jja20130312104458.vhd.
I would like to know where this VHD came from and what is using it. Surprisingly, the 'LAST MODIFIED' date is yesterday, so something must have updated it. I went through all the options in the Portal, but nothing seems to have this VHD attached.
Ultimately I would like to delete this VHD to save storage space and cost.

One way to find out is by using Storage Analytics. If you have storage analytics enabled, you can view the contents of the $logs blob container, download the logs for the date in question, and check for all the activity on this particular blob. You can use a tool like Azure Management Studio from Cerebrata to view storage analytics data. However, if you haven't enabled analytics on your storage account, it will be very hard to find that information.
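If logging is enabled, you can also scan those logs programmatically: list the $logs container and search each log blob for the VHD's name. Below is a minimal C# sketch using the classic Windows Azure storage client library; the connection string is a placeholder and the VHD name is the one from the question. Downloading every log blob can be slow on busy accounts, so narrow the listing by date prefix if you can.

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class LogScan
    {
        static void Main()
        {
            // Placeholder connection string; point this at the storage account in question.
            var account = CloudStorageAccount.Parse("<connection-string>");
            var client = account.CreateCloudBlobClient();
            var logs = client.GetContainerReference("$logs");

            // Log blobs are organized as <service>/YYYY/MM/DD/hhmm/counter.log,
            // so a prefix such as "blob/2013/03/12" limits the scan to one day.
            foreach (IListBlobItem item in logs.ListBlobs("blob/2013/03/12", useFlatBlobListing: true))
            {
                var blob = (CloudBlockBlob)item;
                string content = blob.DownloadText();
                if (content.Contains("dmzvyyq2.jja20130312104458.vhd"))
                    Console.WriteLine("Activity recorded in: " + blob.Name);
            }
        }
    }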

Related

Detailed logging in Azure storage account?

Every 70 minutes or so, something is downloading a few hundred megabytes of data from one of my Azure storage accounts. Is there any way to figure out what the cause is?
All the logs and statistics I've looked at only give me aggregate usage graphs (chart omitted here), which tell me that stuff has been downloaded, but not what or by whom.
You can find this information by viewing Storage Analytics data, especially the logging data contained in the $logs blob container of the storage account in question.
You can use Microsoft Storage Explorer or any other storage explorer to browse the contents of this blob container and download the appropriate blobs. The request you want to look for is GetBlob (that's the request sent to Azure Storage to download a blob).
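Once a log blob is downloaded, each line is a semicolon-delimited record, and (per version 1.0 of the Storage Analytics log format) the third field is the operation type. Here is a small C# sketch that pulls out just the GetBlob entries from a downloaded log file; the file name is a placeholder:

    using System;
    using System.IO;

    class FilterGetBlob
    {
        static void Main()
        {
            // "storage.log" stands in for a log blob downloaded from $logs.
            foreach (string line in File.ReadAllLines("storage.log"))
            {
                string[] fields = line.Split(';');
                // Layout: version;request-start-time;operation-type;request-status;...
                if (fields.Length > 2 && fields[2] == "GetBlob")
                    Console.WriteLine(line); // the full record includes requester IP and the blob URL
            }
        }
    }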

Azure SDK C# Convert Page Blob to VHD?

I have copied two VHDs into blob storage as page blobs. Using the SDK API wrappers from C#, how can I let Azure know that one is an OS disk, and one is a data disk? I want to set this up so I can then use the regular v1 portal GUI to create a new VM using the disks I uploaded.
Thanks.
AFAIK, you don't have to do anything special. You can simply create disks out of these page blobs (as long as they are valid VHDs) and start using them.
Once you've done this, you should be able to create a VM from the OS disk and attach the data disk to it.
For more information on attaching a disk to a VM, you may find this link useful: https://azure.microsoft.com/en-us/documentation/articles/storage-windows-attach-disk/.
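If you'd rather register the disks from code instead of the portal, the classic Service Management API has an Add Disk operation: including the <OS> element marks the VHD as a bootable OS disk, while omitting it registers a plain data disk. The sketch below is a rough illustration from memory of that classic API; the subscription ID, management certificate, disk name, and blob URL are all placeholders.

    using System;
    using System.IO;
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    using System.Text;

    class AddDisk
    {
        static void Main()
        {
            string subscriptionId = "<subscription-id>";
            var request = (HttpWebRequest)WebRequest.Create(
                "https://management.core.windows.net/" + subscriptionId + "/services/disks");
            request.Method = "POST";
            request.Headers.Add("x-ms-version", "2012-08-01");
            request.ContentType = "application/xml";
            // Management certificate previously uploaded to the subscription.
            request.ClientCertificates.Add(new X509Certificate2("management.pfx", "<password>"));

            // <OS> makes this an OS disk; drop the element to register a data disk.
            string body =
                "<Disk xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
                "<OS>Windows</OS>" +
                "<Label>myosdisk</Label>" +
                "<MediaLink>https://myaccount.blob.core.windows.net/vhds/os.vhd</MediaLink>" +
                "<Name>myosdisk</Name>" +
                "</Disk>";
            byte[] bytes = Encoding.UTF8.GetBytes(body);
            using (Stream s = request.GetRequestStream())
                s.Write(bytes, 0, bytes.Length);
            using (var response = (HttpWebResponse)request.GetResponse())
                Console.WriteLine("Status: " + response.StatusCode);
        }
    }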

Determining the size of table and blob in Azure Storage Service

I would like to determine the size of each table and each blob in my Azure storage account.
I tried using the $MetricsTransactionsBlob, $MetricsTransactionsTable and $MetricsTransactionsQueue tables and the analytics of the storage service, but I am not able to determine the size of a given table or a given blob.
I have also tried downloading Cerebrata Azure Management Studio, but I am unable to determine the size of a table or blob with it.
Can someone share sample code that can help me determine the size?
As far as finding the size of a table in Azure Table Storage goes, there's nothing available out there to the best of my knowledge. This is something you would need to do on your own: fetch all entities from the table and calculate the size of each entity. You may find this blog post useful: http://blogs.msdn.com/b/avkashchauhan/archive/2011/11/30/how-the-size-of-an-entity-is-caclulated-in-windows-azure-table-storage.aspx
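Based on the size formula in that blog post (roughly 4 bytes of overhead per entity, 2 bytes per character of the keys and property names, 8 bytes of overhead per property, plus the size of each value), here is a hedged C# sketch that walks a table and sums an estimate; the table name and connection string are placeholders:

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    class TableSize
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("<connection-string>");
            var table = account.CreateCloudTableClient().GetTableReference("mytable");

            long totalBytes = 0;
            foreach (DynamicTableEntity e in table.ExecuteQuery(new TableQuery<DynamicTableEntity>()))
            {
                // Per-entity overhead plus 2 bytes per character of the keys.
                long size = 4 + 2 * (e.PartitionKey.Length + e.RowKey.Length);
                foreach (var p in e.Properties)
                {
                    size += 8 + 2 * p.Key.Length;       // per-property overhead + name
                    size += EstimateValueSize(p.Value); // value payload
                }
                totalBytes += size;
            }
            Console.WriteLine("Approximate table size: {0} bytes", totalBytes);
        }

        static long EstimateValueSize(EntityProperty p)
        {
            switch (p.PropertyType)
            {
                case EdmType.String: return 2 * (p.StringValue ?? "").Length;
                case EdmType.Binary: return (p.BinaryValue ?? new byte[0]).Length;
                case EdmType.Boolean: return 1;
                case EdmType.Int32: return 4;
                case EdmType.Int64:
                case EdmType.Double:
                case EdmType.DateTime: return 8;
                case EdmType.Guid: return 16;
                default: return 0;
            }
        }
    }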
Coming to blob containers, storage analytics provides this capability to some extent. You can find the total size of blob storage using the $MetricsCapacityBlob table. More information about it here: http://msdn.microsoft.com/en-us/library/windowsazure/hh343264.aspx.
You mentioned that you're using Azure Management Studio from Cerebrata. AMS has support for storage analytics, so you can explore the $MetricsCapacityBlob table in the tool itself. You can also find the size of a blob container by right-clicking the container in question and clicking the Storage Statistics context menu item. Another alternative from Cerebrata is their Azure Management Cmdlets product (http://www.cerebrata.com/products/azure-management-cmdlets/features); its Get-BlobContainerSize cmdlet returns the size and total number of blobs in a blob container.
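If you'd rather query $MetricsCapacityBlob yourself, it is addressable like any other table. The rows with RowKey "data" describe your own blobs (RowKey "analytics" covers the analytics data itself), and each row carries Capacity (in bytes), ContainerCount and ObjectCount. A small C# sketch, assuming the classic table client and a placeholder connection string:

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    class BlobCapacity
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("<connection-string>");
            var table = account.CreateCloudTableClient().GetTableReference("$MetricsCapacityBlob");

            // Only look at the "data" rows, i.e. capacity consumed by your own blobs.
            var query = new TableQuery<DynamicTableEntity>().Where(
                TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.Equal, "data"));

            foreach (DynamicTableEntity row in table.ExecuteQuery(query))
            {
                Console.WriteLine("{0}: {1} bytes across {2} blobs",
                    row.PartitionKey,                          // one row per day
                    row.Properties["Capacity"].Int64Value,
                    row.Properties["ObjectCount"].Int64Value);
            }
        }
    }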

Why does my custom VM Image not show up in Azure Create VM Interface?

I recently went through the hassle of creating and uploading a .VHD image containing a nice little Debian installation to my Azure storage account. It was created in fixed mode and uploaded as a page blob.
After a couple of attempts I was able to create an Image from my Blob, but I have no idea where to go from here.
Obviously, I want to create a VM instance from my image, but I can't figure out how to select my Image. I followed the NEW > Compute > Virtual Machine > From Gallery link, and there is a tab labeled My Images, but my image does not show up there.
Does anyone have an idea why?
EDIT: When I try to create a Disk from my Blob, I get the following Error:
The storage account does not support this operation. Please check the location of this storage account or create a new storage account and retry.
But the Disk is not associated with any storage account, or is it?
If you've uploaded your VHD, you should be able to create a new disk using Virtual Machines > Disks > Create Disk. This will prompt you for the URL of the VHD you uploaded and allow you to specify the OS type and a name for the disk.
From there you can create a new Virtual Machine. New > Compute > Virtual Machine > From Gallery > My Disks
EDIT
I'm told by a colleague that some storage accounts do not support disks for VMs. A workaround can be to create a new VM using the portal (either from scratch or using a pre-made gallery image); this will create a storage account called something like portalvhdxxxxxxxx. You should then be able to upload your VHD to this storage account and create your disk from there.
I just ran into the same problem and this question helped me get around it. To provide more details: Azure VMs are not currently supported in certain data centers (such as North Central US). So if you create a storage account in an unsupported data center, you'll be able to upload VHD blobs and even create an image from them via the Azure portal. However, that image will not show up under My Images when you attempt to spin up a VM from it.
It's pretty confusing, but that seems to be what is happening. So if you want to store your VHD blobs in a storage account that isn't called portalvhdxxxxxx, just make sure your storage account is created in a data center that supports VMs. Those are exactly the data centers you can choose from when you quick-create a VM directly in the portal.
I had the same issue, where I was unable to find the image under "Shared Images". However, I learnt that the subscription under which I was trying to create the VM was different from the subscription in which the shared image is present.
After I set the "Default subscription filter" in the portal settings to "Select all", I was able to see the image under "Shared Images".
Note: though it is a shared image, the default subscription filter should be set to "Select all" when you are creating VMs across different subscriptions.

Upload 650,000 documents to Azure

I can't seem to find any reference to bulk uploading data to Azure.
I have a document store with 650,000 PDF documents that take up about 1.2 TB of disk space.
Uploading those files to Azure via the web will be difficult. Is there a way I can mail a hard drive and have your team upload the files for me?
If not, can you recommend the best way to upload this many documents?
Maybe not the answer you expected, but you could use Amazon's AWS Import/Export (this allows you to mail them an HDD and they'll import it into your S3 account).
To transfer the data to a Windows Azure storage account, you can then leverage one of the new features in the 1.7.1 SDK: the StartCopyFromBlob method. This method allows you to copy a file at a specific URL in an asynchronous way (you could use this to copy all files from your S3 bucket to your Azure storage account).
Read the following blog post for a fully working example: How to Copy a Bucket from Amazon S3 to Windows Azure Blob Storage using “Copy Blob”
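For illustration, a minimal C# sketch of that approach (names follow the 2.x storage client, and the 1.7.1 names differed slightly; the S3 pre-signed URL, container, and connection string are placeholders):

    using System;
    using System.Threading;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class CrossCloudCopy
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("<connection-string>");
            var container = account.CreateCloudBlobClient().GetContainerReference("documents");
            container.CreateIfNotExists();

            // The source must be readable by Azure, e.g. via an S3 pre-signed URL.
            var source = new Uri("https://mybucket.s3.amazonaws.com/doc0001.pdf?<signature>");
            CloudBlockBlob destination = container.GetBlockBlobReference("doc0001.pdf");
            destination.StartCopyFromBlob(source);

            // The copy runs server-side; poll the copy state until it finishes.
            do
            {
                Thread.Sleep(1000);
                destination.FetchAttributes();
            } while (destination.CopyState.Status == CopyStatus.Pending);
            Console.WriteLine("Copy finished: " + destination.CopyState.Status);
        }
    }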
While Azure doesn't offer a physical ingestion process today, if you talk nicely to the Azure team they can do this as a one-off. If you like, I can get you a contact on the product team (dave at greenbutton dot com).
Alternatively, there are solutions such as Aspera, which provides accelerated data transfers over UDP and is being beta tested on Azure along with the Azure Media Services offering.
We have some tools that help with this as well (http://www.greenbutton.com), and they leverage Aspera's technology.
As disk shipments are not supported by Windows Azure, your best bet is to use a 3rd-party application (or write your own) that supports parallel uploads. That way you can still upload much faster. 3rd-party applications like Gladinet and CloudBerry could be used to upload the data, but I am not sure how configurable they are for maximizing parallel uploads to achieve the fastest transfer.
If you decide to write it yourself, here is a starting point: Asynchronous Parallel Block Blob Transfers with Progress Change Notification
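As a rough sketch of the parallel-upload idea (not the code from that article; the directory, container, and connection string are placeholders):

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class BulkUpload
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("<connection-string>");
            var container = account.CreateCloudBlobClient().GetContainerReference("documents");
            container.CreateIfNotExists();

            string[] files = Directory.GetFiles(@"D:\pdfs", "*.pdf", SearchOption.AllDirectories);

            // Upload several blobs at once; tune the parallelism to your upstream bandwidth.
            Parallel.ForEach(files, new ParallelOptions { MaxDegreeOfParallelism = 8 }, path =>
            {
                CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(path));
                using (FileStream stream = File.OpenRead(path))
                {
                    blob.UploadFromStream(stream);
                }
            });
        }
    }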
I know this is a bit too late for the OP, but in the Azure Management Portal, under Storage, pick your storage instance, then click the Import/Export link at the top. At the bottom of that screen, there is a "Create Import Job" link and icon. Also, if you click the blue help icon on the far right side, it says this:
You can use the Windows Azure Import/Export service to transfer large amounts of file data to Windows Azure Blob storage in situations where uploading over the network is prohibitively expensive or infeasible. You can also use the Import/Export service to transfer large quantities of data resident in Blob storage to your on-premises installations in a timely and cost-effective manner. Use the Windows Azure Import/Export Service to Transfer Data to Blob Storage
To transfer a large set of file data into Blob storage, you can send one or more hard drives containing that data to a Microsoft data center, where your data will be uploaded to your storage account. Similarly, to export data from Blob storage, you can send empty hard drives to a Microsoft data center, where the Blob data from your storage account will be copied to your hard drives and then returned to you. Before you send in a drive that contains data, you'll encrypt the data on the drive; when Microsoft exports your data to send to you, the data will also be encrypted before shipping.
Both Windows Azure Storage PowerShell and AzCopy can bulk upload data to Azure.
With the Azure Storage PowerShell cmdlets, you can use: ls -File -Recurse | Set-AzureStorageBlobContent -Container upload
You can refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx for more details.
For AzCopy, see this article: http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx
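For reference, an AzCopy invocation from that era of the tool looked roughly like this (account, container, and key are placeholders; /S recurses into subfolders; the syntax has changed in later AzCopy versions):

    AzCopy C:\documents https://myaccount.blob.core.windows.net/upload /destkey:<storage-account-key> /S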
