Azure Storage Total Requests High

I signed up for Azure Storage the other day. Today I noticed in the Azure portal that there are about 500 requests per hour to Table storage. The strange thing is that I'm not using Table Storage and my site isn't live at the moment. So what could possibly be making all these requests? Any ideas?

Azure Storage has a feature called Storage Analytics, which performs logging and provides metrics data for a storage account. This data is stored in the same storage account in special tables (whose names start with $, e.g. $MetricsCapacityBlob). By default some analytics data is collected, and this is why you're seeing these requests.
One way to check the transactions is to explore the contents of the $logs blob container. It will tell you in detail where the requests to your storage account originate.
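If you want to script that check, here is a minimal sketch using the azure-storage-blob Python package (the package choice, the connection string, and the path prefix are my assumptions; you can just as easily browse $logs with a storage explorer tool):

    # pip install azure-storage-blob
    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string for the storage account in question.
    conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
    service = BlobServiceClient.from_connection_string(conn_str)

    logs = service.get_container_client("$logs")
    # Log blobs are grouped by service and time, e.g. table/2017/09/30/1400/000000.log
    for blob in logs.list_blobs(name_starts_with="table/"):
        print(blob.name)
        # Each log blob is semicolon-delimited text; download it to inspect
        # the operation type, requester IP, user agent, and so on:
        # print(logs.download_blob(blob.name).readall().decode("utf-8"))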

OK, mystery solved. It turns out it's the actual Azure Portal that is generating the traffic. I originally thought it was the SDK somehow making the calls, but then I had the website turned off, and the portal open, and it continued making requests. Close portal for a while, no requests.

Related

Azure activity logs not displaying any write data

I'm trying to set up logging for a storage resource (a table, specifically, though it seems the activity log doesn't go down to that level and just logs the entire Storage account).
The logging seems to capture my ListKeys operations and occasional access from ApplicationInsights, but isn't logging any writes/reads I'm making to the tables themselves through either my app or the Microsoft Azure Storage Explorer. This table has been written to multiple times over the past few weeks, yet none of that activity shows up.
Am I misinterpreting this page, which states that this activity log should track posts/deletes? Do I need any additional setup to track these operations?
Per my understanding, you could leverage Storage Analytics logging to log the operations on your storage. For the detailed operations that are logged for the corresponding storage service, you could refer to this official document.
Based on your description, I tested operations against Table storage using both the REST API and the Storage Explorer tool. Here are my test results for reference:
Table Storage Analytics logging
Table Storage Metrics
As noted in this document:
As requests are logged, Storage Analytics will upload intermediate results as blocks. Periodically, Storage Analytics will commit these blocks and make them available as a blob.
In summary, please follow this tutorial to enable and configure Storage Analytics, then wait a while and check your Table storage logs.
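If you prefer to enable it in code rather than via the portal, here is a sketch using the azure-data-tables Python package (the package and model names are my assumptions and may differ across SDK versions; the retention settings are placeholders):

    # pip install azure-data-tables
    from azure.data.tables import (
        TableServiceClient,
        TableAnalyticsLogging,
        TableMetrics,
        TableRetentionPolicy,
    )

    conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
    service = TableServiceClient.from_connection_string(conn_str)

    # Keep logs/metrics for 7 days; tune to taste.
    retention = TableRetentionPolicy(enabled=True, days=7)
    service.set_service_properties(
        analytics_logging=TableAnalyticsLogging(
            version="1.0", read=True, write=True, delete=True,
            retention_policy=retention,
        ),
        hour_metrics=TableMetrics(
            enabled=True, include_apis=True, retention_policy=retention,
        ),
    )

Once this is set, writes and reads against your tables should start appearing in the $logs container and the $Metrics tables after the usual commit delay described above.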
If you are leveraging the Azure Activity Log, remember that it is meant for control-plane operations, so ListKeys would show up there.
If you are looking for data-plane operations (such as entity writes into a table), then make sure Diagnostics are turned on inside the Storage account that you are writing to.
The Azure Activity Log only records management-plane operations that go through Azure Resource Manager (ARM), specifically PUT/DELETE/POST requests; that includes ListKeys, which is an HTTP POST.
For storage analytics logging, you can use this article to see the types of data logged.

Azure Storage account consistently only adding Blob storage, missing Table/Queue/Files

Whenever I create a new Storage (classic) account through the Azure portal I consistently have issues whereby the Table/Queue/File storage is not created at all, leaving the account with only Blob storage, like this:
Instead of like this (separate account):
I have tried this multiple times, all with the same result. I don't see how I could be getting this wrong, as there are only 4 options on the form to create the account, and none of them govern the content of the account.
When I then attempt to create a new Table or Queue in this new account I get a 502 Bad Gateway error.
Am I missing something here? Can anyone tell me how I can add the required storage types to the account?
Not sure what's up with the portal, but a storage account always comprises blob, table, queue, and file storage (unless you create a Premium storage account - that's strictly blobs).
You should be able to confirm this by creating an app to, say, create, write, and read from a queue or table.
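For example, a quick smoke test against the Table endpoint might look like this (a sketch using the modern azure-data-tables Python package; the table name and values are placeholders, and at the time of this question you'd have done the same with the .NET SDK):

    # pip install azure-data-tables
    from azure.data.tables import TableServiceClient

    conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
    service = TableServiceClient.from_connection_string(conn_str)

    # Create a table, write an entity, and read it back.
    table = service.create_table_if_not_exists("smoketest")
    table.upsert_entity({"PartitionKey": "p1", "RowKey": "r1", "value": 42})
    entity = table.get_entity(partition_key="p1", row_key="r1")
    print(entity["value"])  # prints 42 if the Table endpoint is working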
EDIT I see you edited your question, showing that you did try to create a table/queue. If this is a non-premium account, I suggest reaching out to support, as this makes no sense.
EDIT 4/2017 Aside from Premium storage accounts (which only have page blobs), there is another type of general (non-premium) storage account, specific to blobs only, where you won't be able to create Tables and Queues. It's not available via the "Classic" deployment model; it's available only via the "Resource Manager" deployment model:
In my case the issue was due to selecting Zone Redundant Storage (ZRS).
Since ZRS accounts only support Block Blobs, you will not see the table, queue or file endpoints listed on the portal for the new account.
https://blogs.msdn.microsoft.com/windowsazurestorage/2014/08/01/introducing-zone-redundant-storage/
Recreating the storage account using Geo-Redundant Storage (GRS) worked.

Azure storage monitoring from the web portal

I'm trying to enable the monitoring tiles in my Azure Storage account, but for the life of me can't get it to work. It keeps coming up with the error "monitoring may not be enabled. Please turn on diagnostics".
I've tried ticking all the checkboxes under Settings->Diagnostics->Monitoring but nothing seems to work.
Is there a trick I'm missing?
Are you trying to monitor a Premium Azure Storage account? I don't believe those can be monitored at this time, as the monitoring/analytics data for an account is stored in Table storage, and Premium Storage accounts do not support Table or Queue storage.

Is this a sensible Azure Blob Storage setup and are there restructuring tools to help me migrate to it?

I think we have gone slightly wrong in the way we have used Azure Storage in a SaaS system. We created a storage account per client (security was the prime consideration) and containers per system area, e.g. Vehicle, Work, etc.
Having done further reading, it seems the suggestion would be that we should have used one account for all clients: each client gets a container (so we can create it programmatically), which we then secure, and files are organized using a "virtual" folder structure, e.g. a container called "Client A", with files for Jobs (in the Work area of the system) stored like Work/Jobs/{entity id}/blah.pdf. Does this sound sensible?
If so, we now have about 10 accounts that we need to restructure. Are there any tools that will let us easily copy one account's contents into containers in another account? I appreciate we probably can't move the files between accounts (we set them up ages ago, so we can't use the native copy function), so I guess it would be some sort of copy. There are GBs of files across all the accounts.
It may not be such a bad idea to keep different storage accounts per client. The benefits of doing that (to me) are:
Better security as mentioned by you.
You'll be able to achieve better throughput per client, as each client will have their own storage account. If you keep one storage account for all clients and one client starts hitting that account hard, the other clients will be impacted.
Better scalability. Each storage account can hold up to 200 TB of data. So if you keep just one storage account and assuming each client consumes 100 GB of data, you'll be able to accommodate only 2000 clients (I hope my math is right :)). With individual storage accounts, you won't be restricted in that sense.
There're some downsides as well. Some of them are:
Management would be a nightmare. Imagine you have 2000 customers: you would end up managing 2000 storage accounts.
You may be limited by Windows Azure. Currently, by default, you get about 10 or 20 storage accounts per subscription, and you would need to contact support to manually raise that limit. They can do that for you, but I would imagine you would want a self-service model where you can create as many storage accounts as you want without contacting support.
Now coming to your question about tooling: you could write something of your own which makes use of the Copy Blob functionality, which lets you copy blob data across storage accounts asynchronously. Basically, here is what you would do (see the sketch after these steps):
First create a blob container for each client in the target storage account.
Enumerate all blob containers in source storage account.
For each blob container in source storage account, enumerate the blobs.
Copy each blob asynchronously to target storage account in the client's blob container.
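Here is a minimal sketch of that loop using the modern azure-storage-blob Python package (the package choice, connection strings, key, and SAS lifetime are my assumptions; the original answer predates this SDK):

    # pip install azure-storage-blob
    from datetime import datetime, timedelta, timezone
    from azure.core.exceptions import ResourceExistsError
    from azure.storage.blob import (
        BlobServiceClient, generate_blob_sas, BlobSasPermissions
    )

    src = BlobServiceClient.from_connection_string("<source-connection-string>")
    dst = BlobServiceClient.from_connection_string("<target-connection-string>")
    src_key = "<source-account-key>"  # needed to sign the per-blob SAS

    for container in src.list_containers():          # step 2: enumerate containers
        try:
            dst.create_container(container.name)     # step 1: container per client
        except ResourceExistsError:
            pass
        src_container = src.get_container_client(container.name)
        for blob in src_container.list_blobs():      # step 3: enumerate blobs
            # The target service must be able to read the source blob,
            # so attach a short-lived read-only SAS to its URL.
            sas = generate_blob_sas(
                account_name=src.account_name,
                container_name=container.name,
                blob_name=blob.name,
                account_key=src_key,
                permission=BlobSasPermissions(read=True),
                expiry=datetime.now(timezone.utc) + timedelta(hours=24),
            )
            source_url = f"{src_container.url}/{blob.name}?{sas}"
            # step 4: kick off a server-side, asynchronous copy
            dst.get_blob_client(container.name, blob.name).start_copy_from_url(source_url)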
If you're a PowerShell fan, you can look into Cerebrata's Azure Management Cmdlets (http://www.cerebrata.com/Products/AzureManagementCmdlets) as well, which wrap this functionality. I could have recommended Cerebrata's Azure Management Studio too, but I haven't tried this functionality there just yet. [Disclosure: I'm one of the devs on the Cerebrata team.]
Hope this helps.
Adding to Gaurav Mantri's answer...
You can have a shared storage account for customers and use a Shared Access Signature (SAS) to limit access to a particular container or blob (and likewise for tables and queues)...
http://msdn.microsoft.com/en-us/library/windowsazure/hh508996.aspx
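For instance, issuing a read-only, time-limited SAS for one client's container could look like this (a sketch with the azure-storage-blob Python package; the account, container, key, and lifetime are placeholders):

    # pip install azure-storage-blob
    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import generate_container_sas, ContainerSasPermissions

    sas = generate_container_sas(
        account_name="sharedaccount",
        container_name="client-a",
        account_key="<account-key>",
        permission=ContainerSasPermissions(read=True, list=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )
    url = f"https://sharedaccount.blob.core.windows.net/client-a?{sas}"
    # Hand this URL to Client A; it grants no access to any other container.
    print(url)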

Azure storage metrics data

I am trying to implement Azure Storage metrics code in my role, but I am wondering whether there is an easy way to get Azure Storage metrics data about my file usage. My code is stable and I do not want to change it again.
Actually, if you already have a Windows Azure role running, you don't need to make any changes to your code, and you can still get Windows Azure Blob storage metrics data.
I wrote a blog post about this a while back: Collecting Windows Azure Storage REST API level metrics data without a single line of programming, just by using tools.
Please try the above and see if it works for you.
Storage Analytics is disabled by default, so any operations against your storage up till now have not been logged for analysis.
You may choose to enable analytics at any time, for both logging (detailed access information for every single object) and metrics (hourly rollups). Further, you may choose which specific storage service to track (blobs, tables, queues) and which operations to track (read, write, delete). Once analytics are enabled, you may access the resulting analytics data from any app (as long as you have the storage account name + key).
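As a sketch of that last point: the hourly metrics rollups land in special tables you can query like any other table. Here is what that might look like with the azure-data-tables Python package (the table name is the standard one for blob-service hourly transaction metrics; the connection string and filter window are placeholders):

    # pip install azure-data-tables
    from azure.data.tables import TableServiceClient

    conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
    service = TableServiceClient.from_connection_string(conn_str)

    # Hourly transaction metrics for the blob service; swap "Blob" for
    # "Table" or "Queue" to read the other services' rollups.
    metrics = service.get_table_client("$MetricsHourPrimaryTransactionsBlob")
    # PartitionKey is a timestamp bucket like '20170930T1400'.
    for row in metrics.query_entities("PartitionKey ge '20170901T0000'"):
        print(row["PartitionKey"], row["RowKey"], row.get("TotalRequests"))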
Persistent Systems just published a blog post on enabling Storage Analytics for Java apps. The same principles may be applied to a .NET app (and the SDKs are very similar).
Additionally, Full Scale 180 published a sample app encapsulating Storage Analytics (based on the REST API, as it was written before SDK v1.6 came out).
