I have an Azure Fileshare and there are snapshots which were taken manually without the backup service. Meaning there is no Recovery Services vault.
I want to restore one of the snapshots. I can restore an individual file from the snapshot via Azure Portal. How can I restore the whole snapshot? (meaning not file by file)
Presently it's not possible. You can mount the Azure file share on a different OS and copy the data out, use the azcopy tool to copy the snapshot contents, or register the storage account with a new Recovery Services vault.
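As a sketch of the azcopy route — assuming a hypothetical storage account `mystorageaccount`, share `myshare`, a SAS token you generate yourself, and a snapshot timestamp from your snapshot list — the whole snapshot can be addressed by appending a `sharesnapshot` query parameter to the share URL:

```shell
# Hypothetical values -- replace with your own account, share, SAS token,
# and snapshot timestamp (visible in the portal's snapshot list).
ACCOUNT="mystorageaccount"
SHARE="myshare"
SAS="?sv=REPLACE_WITH_SAS_TOKEN"
SNAPSHOT="2023-01-15T10:30:00.0000000Z"

# The snapshot is addressed by adding sharesnapshot=<timestamp> to the
# share URL; azcopy then copies the entire snapshot in one operation.
SRC="https://${ACCOUNT}.file.core.windows.net/${SHARE}${SAS}&sharesnapshot=${SNAPSHOT}"
DST="/restore/myshare"

echo azcopy copy "$SRC" "$DST" --recursive
# Drop the echo (and use a real SAS with read+list rights) to run the copy.
```

This copies the snapshot to a local path; pointing `DST` at another share URL would restore it cloud-to-cloud instead.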
This article explains how to use the Azure portal to restore an entire file share or specific files from a restore point created by Azure Backup.
An Azure file share sits inside a storage account. Azure Backup can see that file share and create backup snapshots of it in the same storage account. To do this, the storage account is registered with a Recovery Services vault, where the backup policy and retention points are managed.
Azure Backup offers a variety of options to restore your file share data. You can choose to restore the entire file share or individual files and folders. Restores can also be done to the original location or to alternate file shares in the same or different storage accounts. Azure Backup also preserves and restores all access control lists (ACLs) of files and folders.
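If the share is protected by Azure Backup, a full-share restore can also be scripted with the Azure CLI. A minimal sketch, assuming a hypothetical vault `myRecoveryVault`, resource group `myResourceGroup`, storage account `mystorageaccount`, and share `myshare` (the recovery point name would come from `az backup recoverypoint list`):

```shell
# Hypothetical names -- substitute your own vault, storage account and share.
RG="myResourceGroup"
VAULT="myRecoveryVault"
SA="mystorageaccount"    # the registered container is the storage account
SHARE="myshare"
RP="932887541532871865"  # recovery point name from 'az backup recoverypoint list'

# Restore the entire share in place, overwriting conflicting files.
CMD="az backup restore restore-azurefileshare \
  --resource-group $RG --vault-name $VAULT \
  --container-name $SA --item-name $SHARE \
  --rp-name $RP --restore-mode OriginalLocation --resolve-conflict Overwrite"

echo "$CMD"   # remove the echo (after 'az login') to start the restore
```

`--restore-mode AlternateLocation` with a target share would restore to a different location instead.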
I have the following question: when backing up VMs using Azure Backup there are two recovery point types, "Snapshot and Vault" and "Vault". What is the difference between them?
In the docs they mention at least this:
If the recovery type for a restore point is “Snapshot and vault” and I perform a restore operation, which recovery type will be used?
If the recovery type is “snapshot and vault”, restore will be automatically done from the local snapshot, which will be much faster compared to the restore done from the vault.
So this seems to be related to the Instant Restore feature.
"Snapshot and Vault" means there is a local snapshot in your storage account, which is faster to restore from.
Recovery points marked only "Vault" are slower to restore, as the data needs to be pulled from the vault.
By default, these snapshots are kept for 2 days.
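You can check which recovery type each restore point has from the CLI. A sketch with hypothetical names (`myResourceGroup`, `myRecoveryVault`, `myVM`); the listing shows each recovery point, from which you can see whether an instant-restore snapshot still backs it:

```shell
# Hypothetical names -- substitute your own resource group, vault and VM.
RG="myResourceGroup"
VAULT="myRecoveryVault"
VM="myVM"

CMD="az backup recoverypoint list \
  --resource-group $RG --vault-name $VAULT \
  --container-name $VM --item-name $VM \
  --backup-management-type AzureIaasVM --output table"

echo "$CMD"   # remove the echo (after 'az login') to run the query
```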
I created Recovery Services Vault in Azure and didn't see any option to choose from GRS or LRS. I am told it uses GRS by default. How can I change it to LRS?
That's right, GRS is the default option but you can change it.
From the Properties of your vault > Backup Configuration > Update
Note: if you already have protected VMs, you must stop protection and delete their backup data first, then back them up again.
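The same change can be scripted with the Azure CLI; a sketch with hypothetical vault and resource group names:

```shell
# Hypothetical names -- substitute your own vault and resource group.
RG="myResourceGroup"
VAULT="myRecoveryVault"

# Switch the vault's backup storage redundancy from the GRS default to LRS.
# This only succeeds while the vault has no protected items.
CMD="az backup vault backup-properties set \
  --name $VAULT --resource-group $RG \
  --backup-storage-redundancy LocallyRedundant"

echo "$CMD"   # remove the echo (after 'az login') to apply the change
```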
I'm backing up my VM using a vault in the same resource group, but noticed that it was sent to GRS instead of LRS. GRS is double the price and I don't need anything better than LRS. I've checked the vault and backup settings but don't see a way to change this, or even configure it from the start.
Is there an option that I'm missing, or is this controlled by Microsoft? Online documents seem to indicate that it can be controlled, but don't show how/where.
In fact, you currently cannot change the storage replication type once the Recovery Services vault has been used for backup. When you first deploy the vault, before protecting anything, you can change the storage replication type from GRS to LRS in the Azure portal.
In this case, you have to remove the old vault and create a new one. Refer to changing Azure Recovery Services Vault to LRS Storage. You could also vote for this similar UserVoice suggestion here.
I am thinking of using Azure Blob Storage for a document management system which I am developing. All blobs (images, videos, Word/Excel/PDF files, etc.) will be stored in Azure Blob storage. As I understand it, I need to create a container, and these files can be stored within the container.
I would like to know how to safeguard against accidental/malicious deletion of the container. If a container is deleted, all the files it contains will be lost. I am trying to figure out how to put backup and recovery mechanism in place for my storage account so that it is always guaranteed that if something happens to a container, I can recover files inside it.
Is there any way provided by Microsoft Azure for such backup and recovery, or do I need to explicitly write code so that files are stored in two separate blob storage accounts?
Anyone with access to your storage account's key (primary or secondary; there are two keys for a storage account) can manipulate the storage account in any way they see fit. The only way to ensure nothing happens? Don't give anyone access to the key(s). If you place the storage account within a resource group that only you have permissions on, you'll at least prevent others with access to the subscription from discovering the storage account and accessing it.
Within the subscription itself, you can place a lock on the actual resource (the storage account), so that nobody with access to the subscription accidentally deletes the entire storage account.
Note: with storage account keys, you do have the ability to regenerate the keys at any time. So if you ever suspected a key was compromised, you can perform a re-gen action.
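Both safeguards can be scripted. A sketch with hypothetical resource names, showing a CanNotDelete lock on the storage account and a regeneration of the primary key:

```shell
# Hypothetical resource names -- substitute your own.
RG="myResourceGroup"
SA="mystorageaccount"

# 1) Put a CanNotDelete lock on the storage account so nobody with
#    subscription access can delete it by accident:
LOCK_CMD="az lock create --name DoNotDeleteStorage --lock-type CanNotDelete \
  --resource-group $RG --resource-name $SA \
  --resource-type Microsoft.Storage/storageAccounts"

# 2) Regenerate the primary key if you suspect it was compromised:
KEY_CMD="az storage account keys renew --resource-group $RG \
  --account-name $SA --key primary"

echo "$LOCK_CMD"
echo "$KEY_CMD"
# Remove the echoes (after 'az login') to run the commands for real.
```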
Backups
There are several backup solutions offered for blob storage in case containers get deleted. More product info can be found here: https://azure.microsoft.com/en-us/services/backup/
Redundancy
If you are concerned about availability: "The data in your Microsoft Azure storage account is always replicated to ensure durability and high availability. Replication copies your data, either within the same data center, or to a second data center, depending on which replication option you choose." There are several replication options:
Locally redundant storage (LRS)
Zone-redundant storage (ZRS)
Geo-redundant storage (GRS)
Read-access geo-redundant storage (RA-GRS)
More details can be found here:
https://learn.microsoft.com/en-us/azure/storage/common/storage-redundancy
Managing Access
Finally, managing access to your storage account is the best way to secure it and avoid any loss of your data. You can provide read-only access if you don't want anyone to delete files or folders. Shared Access Signatures (SAS) let you create policies and grant access based on Read, Write, List, Delete, and other permissions. A quick GIF demo can be seen here: https://azure.microsoft.com/en-us/updates/manage-stored-access-policies-for-storage-accounts-from-within-the-azure-portal/
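As a sketch, a read+list SAS for a hypothetical `documents` container can be generated with the CLI; anyone holding only this token can read and enumerate blobs but not delete or overwrite them:

```shell
# Hypothetical account and container names -- substitute your own.
SA="mystorageaccount"

# r = read, l = list; no write or delete permissions are granted.
SAS_CMD="az storage container generate-sas \
  --account-name $SA --name documents \
  --permissions rl --expiry 2030-01-01T00:00Z \
  --https-only --output tsv"

echo "$SAS_CMD"   # remove the echo (after 'az login') to emit the token
```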
We are using blob storage to store documents and for document management.
To prevent deletion of blobs, you can now enable soft delete as described here:
https://azure.microsoft.com/en-us/blog/soft-delete-for-azure-storage-blobs-ga/
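Soft delete can also be enabled from the CLI; a sketch with a hypothetical account name and a 14-day retention window:

```shell
# Hypothetical account name -- substitute your own.
SA="mystorageaccount"

# Keep deleted blobs recoverable for 14 days.
CMD="az storage blob service-properties delete-policy update \
  --account-name $SA --enable true --days-retained 14"

echo "$CMD"   # remove the echo (after 'az login') to enable soft delete
```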
You can also build your own automation around PowerShell or azcopy to do incremental and full backups.
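For example, an incremental copy between two hypothetical storage accounts can be sketched with `azcopy sync`, which only transfers blobs that changed since the last run (both SAS tokens below are placeholders):

```shell
# Hypothetical accounts, containers and SAS tokens -- substitute your own.
SRC_SA="mystorageaccount"
DST_SA="mybackupaccount"
SRC_SAS="?sv=REPLACE_WITH_SOURCE_SAS"
DST_SAS="?sv=REPLACE_WITH_DEST_SAS"

SRC="https://${SRC_SA}.blob.core.windows.net/documents${SRC_SAS}"
DST="https://${DST_SA}.blob.core.windows.net/documents-backup${DST_SAS}"

echo azcopy sync "$SRC" "$DST" --recursive
# Drop the echo (and use real SAS tokens) for an actual incremental pass.
```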
The last element would be to use RA-GRS, which lets you read from a read-only secondary endpoint in another region in case the primary data center goes down.
Designing Highly Available Applications using RA-GRS
https://learn.microsoft.com/en-us/azure/storage/common/storage-designing-ha-apps-with-ragrs?toc=%2fazure%2fstorage%2fqueues%2ftoc.json
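To illustrate, an RA-GRS account exposes its read-only replica at the same account name with a `-secondary` suffix, so a reader can fail over by switching the base URL (account name here is hypothetical):

```shell
# Hypothetical account name -- substitute your own.
ACCOUNT="mystorageaccount"

PRIMARY="https://${ACCOUNT}.blob.core.windows.net"
# RA-GRS exposes a read-only secondary endpoint with a '-secondary' suffix:
SECONDARY="https://${ACCOUNT}-secondary.blob.core.windows.net"

echo "primary:   $PRIMARY"
echo "secondary: $SECONDARY"
```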
Use Microsoft's Azure Storage Explorer. It allows you to download the full contents of blob containers, including folders and subfolders with blobs. Conversely, you can upload to containers in the same way. Simple and free!