Azure Storage account consistently only adding Blob storage, missing Table/Queue/Files

Whenever I create a new Storage (classic) account through the Azure portal I consistently have issues whereby the Table/Queue/File storage is not created at all, leaving the account with only Blob storage, like this:
Instead of like this (separate account):
I have tried this multiple times and every attempt has had the same result. I don't see how I can be getting this wrong, as there are only four options on the form to create the account, and none of them govern the contents of the account.
When I then attempt to create a new Table or Queue in this new account I get a 502 Bad Gateway error.
Am I missing something here? Can anyone tell me how I can add the required storage types to the account?

Not sure what's up with the portal, but a storage account always comprises blob, table, queue, and file storage (unless you create a Premium storage account - that's strictly blobs).
You should be able to confirm this by creating an app to, say, create, write, and read from a queue or table.
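A minimal round-trip check could look like this, assuming the current azure-storage-queue (v12) package and a connection string in an AZURE_STORAGE_CONNECTION_STRING environment variable (the queue name is just an illustration):

    import os
    from azure.storage.queue import QueueClient

    queue = QueueClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"], "smoketest")

    queue.create_queue()             # fails immediately if the account has no queue endpoint
    queue.send_message("hello")      # write
    for msg in queue.receive_messages():
        print(msg.content)           # read back: "hello"
        queue.delete_message(msg)

If the account really does support queues, this runs cleanly; on a blobs-only account the create call errors out right away.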
EDIT: I see you edited your question, showing that you did try to create a table/queue. If this is a non-premium account, I suggest reaching out to support, as this makes no sense.
EDIT 4/2017: Aside from Premium storage accounts (which only have page blobs), there is another type of general (non-premium) storage account, specific to blobs only, where you won't be able to create tables and queues. It's not available via the "Classic" deployment model; it's available only via the "Resource Manager" deployment model.

In my case the issue was due to selecting Zone Redundant Storage (ZRS).
"Since ZRS accounts only support Block Blobs, you will not see the table, queue or file endpoints listed on the portal for the new account."
https://blogs.msdn.microsoft.com/windowsazurestorage/2014/08/01/introducing-zone-redundant-storage/
Recreating the storage account using geo-redundant storage (GRS) worked.

Related

Blob storage compatibility with Azure Functions

I have some e-mail attachments being saved to Azure Blob.
I am now trying to write an Azure Functions app that would connect to that blob storage, run some scripts, and re-save the file.
However, when selecting a storage account for the function, I couldn't select my blob storage account.
I went on the website and it said this:
When creating a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. Some storage accounts don't support queues and tables. These accounts include blob-only storage accounts and Azure Premium Storage.
I'm wondering, is there any workaround for this? And if not, perhaps any other suggestions? I'm becoming a little lost in all the options and which one to actually choose.
Thanks!
EDIT: Might I add, I am writing the function in Python.
I think you are overlooking the fact that you can have multiple storage accounts. In order for an Azure Function to work you need a storage account. That storage account is used to store runtime information of the Azure Function for internal purposes like state management. This storage account is subject to restrictions as you already found out. There is no workaround for that.
However, if the function you are writing needs to access another storage account, it is free to do so. You just have to provide the details to connect to that specific storage account. In that case you also have a clear separation between the storage account that is used by the Azure Function for its internal operations and the storage account your application needs to connect to, over which you have total control, without having to worry that you break things by deleting internally used blobs/tables/queues.
You can have a blob-triggered function that gets triggered when changes occur in your specific blob storage. That doesn't need to be the storage account that the Azure Function uses internally, which is created/selected when creating the Azure Function.
Here is a sample that shows how to add a blob-triggered Azure Function in Python. MyStorageAccountAppSetting refers to an app setting that holds the connection string to the storage account that you use for storage.
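A minimal sketch of such a function, assuming the Azure Functions Python (v1) programming model; the "email-attachments" container path is illustrative only:

    # __init__.py -- assumes a function.json next to it declaring a
    # "blobTrigger" binding named "myblob" with a path such as
    # "email-attachments/{name}" and "connection" set to
    # "MyStorageAccountAppSetting".
    import logging
    import azure.functions as func

    def main(myblob: func.InputStream):
        logging.info("Processing blob %s (%s bytes)", myblob.name, myblob.length)
        data = myblob.read()  # the attachment's bytes; run your scripts on these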
The snippet from the website you are quoting is for storing the function app code itself and any related modules. It does not pertain to what your function can access when the code of your function executes.
When your function executes it will need to use the Azure Blob Storage SDK/modules to connect to your blob storage account and read the email attachments. Here's a quickstart guide for using Azure Storage with Python: Quickstart with Azure Storage Blobs SDK for Python
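To read and then re-save an attachment from your own (blob-only) storage account inside the function body, a minimal sketch assuming the azure-storage-blob (v12) package; the container and blob names are placeholders, and app settings are available to the function as environment variables:

    import os
    from azure.storage.blob import BlobClient

    blob = BlobClient.from_connection_string(
        os.environ["MyStorageAccountAppSetting"],   # your own storage account
        container_name="email-attachments",
        blob_name="invoice.pdf")

    data = blob.download_blob().readall()           # read the attachment
    processed = data                                # ... run your scripts here ...
    blob.upload_blob(processed, overwrite=True)     # re-save the file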
General-purpose v2 storage accounts support the latest Azure Storage features and incorporate all of the functionality of general-purpose v1 and Blob storage accounts (see here).
There are more integration options with GPv2 accounts including Azure Function Triggers. See: Azure Blob storage bindings for Azure Functions
Further, refer to: Types of storage accounts
If you use a Blob storage account, you can choose an access tier based on the frequency of access for the data (the e-mail attachments); see Access tiers for Azure Blob Storage - hot, cool, and archive. If you use a general-purpose storage account, it uses the standard performance tier.

Azure blob container backup and recovery

I am thinking of using Azure Blob Storage for a document management system which I am developing. All blobs (images, videos, Word/Excel/PDF files, etc.) will be stored in Azure Blob Storage. As I understand it, I need to create a container, and these files can be stored within the container.
I would like to know how to safeguard against accidental/malicious deletion of the container. If a container is deleted, all the files it contains will be lost. I am trying to figure out how to put backup and recovery mechanism in place for my storage account so that it is always guaranteed that if something happens to a container, I can recover files inside it.
Is there any way provided by Microsoft Azure for such backup and recovery, or do I need to explicitly write code in such a way that files are stored in two separate Blob storage accounts?
Anyone with access to your storage account's key (primary or secondary; there are two keys for a storage account) can manipulate the storage account in any way they see fit. The only way to ensure nothing happens? Don't give anyone access to the key(s). If you place the storage account within a resource group that only you have permissions on, you'll at least prevent others with access to the subscription from discovering the storage account and accessing it.
Within the subscription itself, you can place a lock on the actual resource (the storage account), so that nobody with access to the subscription accidentally deletes the entire storage account.
Note: with storage account keys, you do have the ability to regenerate the keys at any time. So if you ever suspected a key was compromised, you can perform a re-gen action.
Backups
There are several backup solutions offered for Blob storage in case containers get deleted. More product info can be found here: https://azure.microsoft.com/en-us/services/backup/
Redundancy
If you are concerned about availability: "The data in your Microsoft Azure storage account is always replicated to ensure durability and high availability. Replication copies your data, either within the same data center, or to a second data center, depending on which replication option you choose." There are several replication options:
Locally redundant storage (LRS)
Zone-redundant storage (ZRS)
Geo-redundant storage (GRS)
Read-access geo-redundant storage (RA-GRS)
More details can be found here:
https://learn.microsoft.com/en-us/azure/storage/common/storage-redundancy
Managing Access
Finally, managing access to your storage account would be the best way to secure it and ensure you avoid any loss of your data. You can provide read-only access if you don't want anyone to delete files, folders, etc., through the use of Shared Access Signatures (SAS), which allow you to create policies and grant access based on Read, Write, List, Delete, etc. A quick GIF demo can be seen here: https://azure.microsoft.com/en-us/updates/manage-stored-access-policies-for-storage-accounts-from-within-the-azure-portal/
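For example, a read-only, time-limited SAS for a single blob can be generated in code; a short sketch assuming the azure-storage-blob (v12) package, with the account, container, and blob names as placeholders:

    from datetime import datetime, timedelta
    from azure.storage.blob import generate_blob_sas, BlobSasPermissions

    sas = generate_blob_sas(
        account_name="mydocsaccount",
        container_name="documents",
        blob_name="contract.pdf",
        account_key="<account-key>",
        permission=BlobSasPermissions(read=True),   # read only: no write/delete
        expiry=datetime.utcnow() + timedelta(hours=1))

    url = "https://mydocsaccount.blob.core.windows.net/documents/contract.pdf?" + sas

Anyone holding this URL can read that one blob for an hour, but cannot delete or overwrite anything.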
We are using blobs to store documents and for document management.
To prevent deletion of the blob, you can now enable soft deletion as described in here:
https://azure.microsoft.com/en-us/blog/soft-delete-for-azure-storage-blobs-ga/
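Soft delete can also be enabled programmatically; a sketch assuming the azure-storage-blob (v12) package (it can equally be switched on in the portal):

    import os
    from azure.storage.blob import BlobServiceClient, RetentionPolicy

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"])

    # keep deleted blobs (and overwritten snapshots) recoverable for 14 days
    service.set_service_properties(
        delete_retention_policy=RetentionPolicy(enabled=True, days=14))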
You can also create your own automation around PowerShell and AzCopy to do incremental and full backups.
The last element would be to use RA-GRS, where you can read from a read-only secondary copy in another region in case the primary data center goes down.
Designing Highly Available Applications using RA-GRS
https://learn.microsoft.com/en-us/azure/storage/common/storage-designing-ha-apps-with-ragrs?toc=%2fazure%2fstorage%2fqueues%2ftoc.json
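Reading from the secondary is just a matter of pointing at the "-secondary" endpoint; a sketch assuming the azure-storage-blob (v12) package, with placeholder names:

    from azure.storage.blob import BlobClient

    # the RA-GRS secondary endpoint is the account name plus "-secondary"
    blob = BlobClient(
        account_url="https://mydocsaccount-secondary.blob.core.windows.net",
        container_name="documents",
        blob_name="contract.pdf",
        credential="<account-key>")

    data = blob.download_blob().readall()  # read-only; still served if the primary is down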
Use Microsoft's Azure Storage Explorer. It will allow you to download the full contents of blob containers including folders and subfolders with blobs. Conversely, you can upload to containers in the same way. Simple and free!

Azure Blob Storage: Does Microsoft Implement Redundant Backups?

I've searched the web and contacted technical support yet no one seems to be able to give me a straight answer on whether items in Azure Blob Storage are backed up or not.
What I mean is, do I need to create a twin storage account as a "backup" and program copies of all content from one storage to another, or are the contents of a client's Blob Storage automatically redundantly backed up by Microsoft?
I know with AWS, storage is redundantly backed up via onsite drives as well as across other nodes in the cluster.
do I need to create a twin storage account as a "backup" and program copies of all content from one storage to another, or are the contents of a client's Blob Storage automatically redundantly backed up by Microsoft?
Yes, you will need to do backup manually. Azure Storage does not back up the contents of your storage account automatically.
Azure Storage does provide geo-redundant replication (provided you configure the redundancy level for your storage account as GRS or RA-GRS), but that is not backup. Once you delete content from your primary account (location), it will automatically be removed from the secondary account (geo-redundant location).
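One simple way to do such a manual backup is a server-side copy into a second account; a minimal sketch assuming the azure-storage-blob (v12) package, where the account, container, and blob names are placeholders and the source URL needs an appended read SAS unless the blob is public:

    import os
    from azure.storage.blob import BlobClient

    source_url = ("https://primaryaccount.blob.core.windows.net/"
                  "documents/contract.pdf?<read-sas>")

    backup = BlobClient.from_connection_string(
        os.environ["BACKUP_STORAGE_CONNECTION_STRING"],
        container_name="documents-backup",
        blob_name="contract.pdf")

    backup.start_copy_from_url(source_url)  # asynchronous, server-side copy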
Both the AWS (EBS) and Azure (Blob Storage) options provide durability by replicating the data across different data centers. This is for the high availability and durability of the data, as guaranteed by the cloud provider.
"In order to ensure that your data is durable, Azure Storage has the ability to keep (and manage) multiple copies of your data. This is called replication, or sometimes redundancy. When you set up your storage account, you select a replication type. In most cases, this setting can be modified after the storage account is set up."
For more details, refer to the replication section in the documentation.
If you need to capture changes to the storage and allow restoring previous versions (e.g. in situations like data corruption, or application feature requirements like restore points and backups), you need to take a snapshot manually. This is common to both AWS and Azure.
For more details on creating a snapshot of a blob in Azure, refer to the documentation.
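Taking a snapshot from code is a one-liner; a sketch assuming the azure-storage-blob (v12) package, with placeholder names:

    import os
    from azure.storage.blob import BlobClient

    blob = BlobClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"],
        container_name="documents",
        blob_name="contract.pdf")

    snapshot = blob.create_snapshot()   # read-only, point-in-time copy
    print(snapshot["snapshot"])         # timestamp that identifies this snapshot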

What kind of Azure Storage Account did I create?

I created a new Storage account in the Azure portal and chose an existing Resource Group. It did not create a classic storage account but some kind of resource group-ish storage account that doesn't have all the options a classic storage account has.
As an example, I could create the "files" folder through code, but I can't use the code "blockBlob.UploadFromStream(fileStream);" - it gives me error 400 Bad Request. The same code works when I upload to a classic storage account.
What kind of storage account is seen in my image? Which is more correct, to create a classic storage account (blue icon in image) or the one I did (green/white/grey icon in image)?
First, I would suggest you have a look at David's reply in this thread to know the difference between new Azure storage account and classic Azure storage account.
it gives me error 400 bad request.
There are a lot of issues that can cause a 400 error. I would suggest you check your code to find out the detailed issue. Please test using your code to create a container (the container name must meet the naming limitations) to see whether it works. It would be better if you could provide the key code.
You created an Azure Resource Manager storage account of type Premium storage in region "North Europe" within your given resource group.
There is not really a right or wrong. As almost always it depends on your use-case.
I want to suggest these docs and samples for getting started with code addressing block blobs and Azure Storage in general. Run and explore this code against the storage emulator and/or live storage accounts (classic / ARM standard / ARM premium). Maybe this helps in finding a bug or a misconfiguration in your project.
Azure Blob Storage Samples for .NET
Azure Storage Samples
The issue here has nothing to do with Classic vs Resource Manager. It's related to the fact that the storage account is of type "Premium."
Premium storage accounts are exclusively used for Azure disks (attached durable storage), which are Page Blobs.
Premium storage doesn't support generic blobs/tables/queues. For that, you'd need a non-premium storage account.

Queues working with Azure Storage (Classic) but not with modern

Doing some work with Azure and queues, and I can get it to work with a classic storage account, but not the standard (newer) storage account.
"Modern" Azure storage account
Classic Storage Account
If I run this code against each of them...:
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
var documentProcessQueue = queueClient.GetQueueReference("documentprocessrequest");
documentProcessQueue.CreateIfNotExists();
...It works for Classic, but with "modern" I get this error:
An unhandled exception of type 'Microsoft.WindowsAzure.Storage.StorageException' occurred in Microsoft.WindowsAzure.Storage.dll
Additional information: The remote name could not be resolved: 'xxxxxxdocstest.queue.core.windows.net'
What am I missing? Am I doing something wrong, or do queues simply not work with "modern" storage accounts (sounds unlikely)?
I am using the latest version of the Azure SDK. I have tested the connection strings to the storage accounts with other things, like uploading a blob, and they do work.
EDIT:
I created a "modern" Azure Storage account like the below screenshot (and tested with this one, same error) - and I changed the first image to reflect this account.
Basically, you're getting this error because you created a "Blob Storage" kind of storage account, which only supports blobs. Please see this link to learn more about account kinds: https://azure.microsoft.com/en-in/documentation/articles/storage-create-storage-account/.
At this moment, it is not possible to convert this account into a regular (standard) storage account. Thus you would need to create a new storage account and transfer any blobs that you may have in this account to a newer account.
When you create the new storage account, please ensure that the redundancy type of that account is not ZRS or Premium LRS, as storage accounts with these redundancy types again only support blobs.
