Move Azure VHD from Premium to Standard Storage [closed] - azure

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 6 years ago.
I have a VM that currently has the OS disk in Premium Storage -- I'd prefer that it use Standard Storage and my data disks use Premium Storage. That said, is there an easy method to move the existing VHD from Premium to Standard?

You will need to:
1. delete the VM while preserving the disks
2. use AzCopy to copy the OS disk VHD to a Standard Storage account
3. create a new Premium Storage capable VM using the copied disk (the VM size must still support Premium Storage so your data disks can stay there)
This may be more trouble than it is worth. You can likely script it: download the VM configuration prior to deletion, do the copy, then modify the configuration and create the new VM.
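The AzCopy copy could be sketched like this using the classic AzCopy syntax of that era; the account names, keys, and VHD name below are all placeholders:

```shell
# Copy the OS disk VHD from a Premium Storage account to a Standard Storage account.
# Account names, keys and the blob name are placeholders - substitute your own.
AzCopy /Source:https://mypremiumacct.blob.core.windows.net/vhds ^
       /Dest:https://mystandardacct.blob.core.windows.net/vhds ^
       /SourceKey:<premium-account-key> /DestKey:<standard-account-key> ^
       /Pattern:myvm-osdisk.vhd
```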

Jdixon04,
We published an article with a step-by-step guide for migrating to Premium Storage: https://azure.microsoft.com/en-us/documentation/articles/storage-migration-to-premium-storage/. There is also a sample script at the end of the article if you wish to automate the flow. If you have multiple VMs to migrate, automating it through PowerShell scripts will be helpful. Let us know if you need additional information.
Aung

Related

How can I backup an Azure Cosmos DB [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 2 years ago.
I have an Azure Cosmos DB and I need to delete all the resources from this subscription. Is there any way to take a backup offline from the portal?
UPDATE: Cosmos DB now supports backup; see "Online backup and on-demand data restore in Azure Cosmos DB".
You can use the Data Migration Tool, suggested in the article "Automatic online backup and restore with Azure Cosmos DB", to do the same.
There is no way to take a backup and import to Azure CosmosDB.
The recommendation is to open a support ticket (e.g. via the Azure Portal) or call Azure Support to plan your backup/restore strategy, and to ask Azure to restore the latest backup in case of a disaster event. In addition, you can contact the Azure Cosmos DB team by emailing AskCosmosDB@microsoft.com.
Sajeetharan's answer is wrong.
mongodump --uri="PRIMARY_CONNECTION_STRING"
Use this command. It will create a dump in your current working directory.
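Note that this works only when the Cosmos DB account exposes the MongoDB API; the connection strings below are placeholders for the one copied from the portal's Connection String blade:

```shell
# Dump every database in the account into ./dump (MongoDB API accounts only;
# the connection string is a placeholder - copy yours from the Azure portal)
mongodump --uri="mongodb://myaccount:<primary-key>@myaccount.mongo.cosmos.azure.com:10255/?ssl=true"

# Later, restore the dump into the same or another MongoDB API account
mongorestore --uri="mongodb://myaccount:<primary-key>@myaccount.mongo.cosmos.azure.com:10255/?ssl=true" dump/
```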
MS has finally introduced a backup policy feature for Cosmos DB! See https://learn.microsoft.com/en-us/azure/cosmos-db/online-backup-and-restore#modify-the-backup-interval-and-retention-period
I guess this removes the need for third party and/or custom tools to do such a basic Ops routine.
Automatic backup is not free: restoring one requires the Standard support plan ($100/mo).
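For accounts on periodic backup mode, the interval and retention from that article can also be changed from the CLI; the account and resource-group names below are made up, and the values are in minutes and hours respectively:

```shell
# Back up every 8 hours and keep each backup for 16 hours
# (account and resource group names are placeholders)
az cosmosdb update \
    --name mycosmosaccount \
    --resource-group my-rg \
    --backup-interval 480 \
    --backup-retention 16
```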
I'm using the free ~~Azure Cosmos DB Data Migration Tool~~.
EDIT: thanks @e2eDev for the link: Azure Cosmos DB Data Migration Tool 1.8.3 on GitHub.
There is both a GUI tool (dtui.exe) and a CLI tool (dt.exe).
It supports many sources and targets, including plain JSON files (for both import and export).
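An export with the CLI flavour might look like the following; the connection string, collection, and output file are placeholders, and the switch names reflect my reading of the tool's documentation rather than a verified invocation:

```shell
# Export a Cosmos DB collection to a local JSON file with dt.exe
# (connection string, collection and output file are placeholders)
dt.exe /s:DocumentDB ^
       /s.ConnectionString:"AccountEndpoint=https://myaccount.documents.azure.com:443/;AccountKey=<key>;Database=mydb" ^
       /s.Collection:mycollection ^
       /t:JsonFile /t.File:backup.json
```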

Azure Website content or blob storage - which is faster? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Want to improve this question? Update the question so it can be answered with facts and citations by editing this post.
Closed 8 years ago.
I have a website in Azure which advertises my software. Users can download my software file which is about 100Mb. Right now I'm putting this file in a folder in the Visual Studio project as a content file and publishing it along with the other project files.
Would moving the download files to Azure Blob Storage give faster download speeds than my current approach?
Geo-redundant blobs are replicated to a secondary datacentre, but unless you use "Read-Access Geo-Redundant Storage" the blob will only be served from the primary data centre. With RA-GRS you can optionally access the secondary via a different domain, but this will not improve latency for users outside your region.
For best download performance and scalability, store your file in blob storage and cache it close to your users with Azure CDN.
With the file in blob storage you can also have it geo-replicated, so that it's closer to your users, and you can set up a CDN endpoint in front of that blob storage.
It will also make your deployments a lot faster, since you will not need to upload the 100MB file ;) (yes, I know, Web Deploy does not upload unmodified files, so that's not always the case)
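A sketch of that setup with today's az CLI (all names are placeholders; in the portal of that era the same steps were available through the UI):

```shell
# Upload the installer to a blob container (all names are placeholders)
az storage blob upload --account-name myacct --container-name downloads \
    --name mysoftware.zip --file ./mysoftware.zip

# Put a CDN endpoint in front of the storage account
az cdn profile create --name my-cdn --resource-group my-rg --sku Standard_Microsoft
az cdn endpoint create --name my-downloads --profile-name my-cdn --resource-group my-rg \
    --origin myacct.blob.core.windows.net
```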

How are VHDs in Azure Storage charged? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 9 years ago.
If I create and attach a 200GB VHD to my Azure Virtual Machine and only consume 15GB of this drive, am I charged for the full 200GB or the 15GB?
VHDs are persisted as page blobs, so you are charged only for the space actually consumed in the blob.
From this:
https://www.windowsazure.com/en-us/pricing/details/#storage
For Windows Azure Drive storage, you will be billed only for the storage space used by the page blob and the read/write transactions to the page blob. You will not be charged for read transactions that utilize the local drive cache. Windows Azure Drive usage is billed at the same rates as standard Windows Azure Storage and is included with Windows Azure Storage usage in your bill. There is not a separate line item for Windows Azure Drive on your bill.
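In concrete terms, a 200GB VHD with only 15GB written is billed for roughly the 15GB of allocated pages. The rate below is purely illustrative, not a real Azure price:

```shell
# Page blobs are billed for allocated (written) pages, not the provisioned VHD size.
PROVISIONED_GB=200
USED_GB=15
RATE_CENTS_PER_GB=7                       # illustrative rate only, not a real price
COST_CENTS=$((USED_GB * RATE_CENTS_PER_GB))
echo "billed ~${COST_CENTS} cents/month for ${USED_GB} of ${PROVISIONED_GB} GB provisioned"
```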

deleting data disk in linux vm return error in windows azure [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 10 years ago.
My Linux VM on Windows Azure has a 300 GB data disk attached to it. When I try to delete the disk, the portal returns an error saying the disk is still attached. In the Azure portal I still see the data disk attached to the Linux VM, but I cannot delete it.
Does anyone else see the same problem?
To delete a data disk from a Windows Azure VM, follow these steps:
1. Identify the storage account and container where your data disk VHD is physically located, and note them.
2. Log into your Windows Azure VM (Linux or Windows) and unmount the data disk.
3. Detach the data disk from the Windows Azure VM configuration; this can be done via the portal or with PowerShell commands.
4. After you detach it, the data disk is still listed in the Virtual Machines -> Disks (Disk Type: Data Disk) section, so select your specific data disk from the list and delete it.
Step #4 removes the VHD lease, and the data disk will no longer be listed in the Virtual Machines -> Disks section. However, the VHD still takes up space in your Windows Azure Storage account, so you need to remove it to reclaim that space.
To delete the VHD permanently, go to the Windows Azure Storage account you identified in step #1 (using third-party Azure Storage tools or any other method) and delete the VHD blob.
These steps should delete the data disk from your Windows Azure VM.
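The detach and delete steps can also be scripted with the classic (Service Management) Azure PowerShell cmdlets of that era; the service, VM, and disk names below are placeholders:

```powershell
# Detach the data disk at LUN 0 from the VM configuration
# (service, VM and disk names are placeholders)
$vm = Get-AzureVM -ServiceName "my-service" -Name "my-linux-vm"
Remove-AzureDataDisk -VM $vm -LUN 0 | Update-AzureVM

# Delete the disk object and its underlying VHD blob in one go
Remove-AzureDisk -DiskName "my-linux-vm-data-disk-0" -DeleteVHD
```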

What tools can provide scheduled backups of Azure blob storage? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 6 years ago.
I'm looking for the best way to prevent accidental deletion by IT - perhaps by copying to disk, to a separate Azure Storage account, or to Amazon. What tools can do this? Redgate Cloud Services seems like the closest fit for what I want, but it appears to require configuration per container. I know other tools such as Cloud Storage Studio and Azure Sync Tool exist, but I don't think they support scheduled backups of blob storage.
Windows Azure Storage is backed by geo-replication, which means there are a total of six copies of your data at any given time. However, there is no built-in service in Windows Azure to back up data from Azure Storage to a location outside Azure Storage or to a user-defined location.
Windows Azure Storage is managed through a RESTful interface, so third-party vendors have created applications for exactly this purpose. Besides the tools above, I have had the chance to use the Gladinet Cloud Backup solution, which could be useful in your case. Based on my experience, there are a few backup tools available, but not a single one perfectly matches everybody's expectations.
A cheap way to prevent accidental deletion by IT is to snapshot the blobs into a backup container. IT would have to be very persistent and delete all of the snapshots taken of the original blob in order to accidentally delete it.
"A blob that has snapshots cannot be deleted unless the snapshots are also deleted. You can delete a snapshot individually, or tell the storage service to delete all snapshots when deleting the source blob. If you attempt to delete a blob that still has snapshots, your call will return an error."
http://msdn.microsoft.com/en-us/library/windowsazure/hh488361
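Taking a snapshot is a one-liner; with today's az CLI it might look like this (account, container, and blob names are placeholders):

```shell
# Create a point-in-time, read-only snapshot of a blob
# (account, container and blob names are placeholders)
az storage blob snapshot \
    --account-name myacct \
    --container-name data \
    --name important-file.bin
```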
CloudBerry Backup: it supports Amazon S3, Azure, Google, and many more cloud storage providers
http://www.cloudberrylab.com/amazon-s3-microsoft-azure-google-storage-online-backup.aspx
