Azure CDN file download statistics

How do I see file download statistics with Azure CDN and/or get statistics using a .NET program? I want to put a file in Azure CDN and see how many times the file was downloaded, more details like country of downloaders etc would be nice too.

I don't think such statistics are currently available. All you can see is the number of CDN transactions and the CDN bandwidth consumed, which appear on the monthly bill (in billing history).

Try the "Manage CDN" button in the portal; it leads you to another portal where you can see analytics. Programmatic access is not supported yet.

Related

Azure Blob Storage - Static Site analytics

I've got a static web site hosted in Azure Blob Storage with Cloudflare as my CDN. It's such a small site (not even 1 MB and only one blog post), but I've been getting 1.1-1.2 GB of requests each month for the past six months or so with no explanation. Is there a way to find out what is being requested? In Azure, I can only find information about performance, up-time, etc., but nothing about URLs, and I believe I'd need to pay to get this info from Cloudflare. Has anyone else experienced such strange requests?
I suggest you enable Diagnostic settings and download Azure Storage Explorer to view the logs.
Once the settings are in place, you can inspect the logs with the tool and see request URLs and HTTP status info.
Historical data won't be visible, but you can monitor and analyze future requests.
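If you'd rather turn the logging on from code, here is a minimal sketch using the classic WindowsAzure.Storage .NET SDK; the connection string is a placeholder and the retention period is just an example:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class EnableBlobLogging
{
    static void Main()
    {
        // Placeholder connection string - substitute your own account details.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        var client = account.CreateCloudBlobClient();

        // Turn on read/write/delete logging for the blob service.
        ServiceProperties props = client.GetServiceProperties();
        props.Logging.LoggingOperations =
            LoggingOperations.Read | LoggingOperations.Write | LoggingOperations.Delete;
        props.Logging.RetentionDays = 7;   // keep a week of logs
        props.Logging.Version = "1.0";
        client.SetServiceProperties(props);

        // The logs show up as blobs in the special $logs container,
        // which is what Storage Explorer reads.
    }
}
```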
When I did a lookup on those two IPs, they were both registered to Cloudflare. One makes sense given I'm using their service, but having what appears to be their bot hit my site with this frequency doesn't... I wonder if there's a setting.

CDN does not choose closest PoP for blob storage

I have a storage account in the South Central US data center with images, and a CDN (Verizon standard) endpoint for this storage account. I am using a SAS key for accessing my storage account. When accessing content from the CDN, it actually takes more time than getting the data directly from the storage account. I investigated and found that the CDN content is being downloaded from a US Verizon POP location instead of my closest POP. I am accessing from India, and Verizon has a POP in my city.
Any suggestion as to what the issue might be?
Optimization choices are designed to use best-practice behaviors to improve content delivery performance and better origin offload. Your scenario choices affect performance by modifying configurations for partial caching, object chunking, and the origin failure retry policy.
This article provides an overview of various optimization features and when you should use them. For more information on features and limitations, see the respective articles on each individual optimization type: https://learn.microsoft.com/en-us/azure/cdn/cdn-optimization-overview
Try switching the optimization scenario; it may resolve the latency issue you are experiencing. Let me know if this helped.
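As a quick sanity check, you can also see which edge IPs DNS hands back from your location and geolocate them with an external whois/IP-lookup tool. A minimal sketch (the endpoint hostname is a made-up example):

```csharp
using System;
using System.Net;

class CheckEdgeIps
{
    static void Main()
    {
        // Hypothetical CDN endpoint hostname - replace with your own.
        const string host = "myendpoint.azureedge.net";

        // Print the edge IPs your local resolver returns; geolocating these
        // shows which POP is actually answering your requests.
        foreach (IPAddress ip in Dns.GetHostAddresses(host))
            Console.WriteLine(ip);
    }
}
```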

Cloud Services - Two web roles sharing file system

I have a very special requirement which is:
Two web roles accessing a local shared file location.
I am aware of the "Local Storage" role settings, but those are only accessible within each role scope.
Does anyone know another option to accomplish this?
------- EDIT --------
As suggested I will explain more clearly what I'm trying to achieve here.
I'm implementing Only Office, which is a web editor for office files. Their product requires a file to be saved on the file system so it can be opened in the editor.
I don't want to mix their ASP.NET MVC open-source project with my code, so that's why I want to deploy their website as a separate web role.
-------- END EDIT ------------
Thanks
In your question, you state that (my emphasis):
I'm implementing Only Office, which is a web editor for office files. Their product requires a file to be saved on the file system so it can be opened in the editor.
If Only Office's requirement is temporary file storage used only while the document is being edited, you may be able to get away with this in a Cloud Service Web Role. This assumes your users wouldn't be too angry if the temporary working document was 'lost' during a role restart.
Web (and Worker) Roles are non-durable, and the Azure fabric controller might bounce them if it needs to patch the underlying host, or they might simply crash due to a fault (which is usually why you deploy them in pairs: fault tolerance, etc.). If you save something to the file system on a Web Role, you are not guaranteed that it will still be there if the role is bounced.
If, however, you need durability, you will need to implement something based on Azure Blob Storage, possibly using blob leases. However, I imagine Only Office doesn't have an implementation for Azure...
Failing that, you could try running on Azure Web Apps, but I imagine you would have the same issue regarding backing storage and would need to implement something on Blob Storage.
So, finally, if you want complete control and something akin to running on-premises, take a look at using an IaaS virtual machine, where you have the whole file system to play with as you please.
==UPDATE==
Taking a look at the Only Office website, there is a SaaS offering, Only Office SaaS Hosting, which is probably cheaper to run for a year than the time it took me to write this answer!
Failing that, if you look at the requirements for Only Office Document Server, there is no way you're going to run that on a Web Role. Go with Azure IaaS VMs.
You basically have two options here, both mentioned in the comments. You can use blob storage, or you can use an SMB share via Azure Files, which I believe is still in preview. We have used Azure Files to mount an SMB share on several Linux boxes. One thing we have noticed is that it is not particularly fast; it is also built on top of blob storage. Here is a link to Azure Files: https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/.
If you choose to use blob storage, you will need to consider concurrency:
https://azure.microsoft.com/en-us/blog/managing-concurrency-in-microsoft-azure-storage-2/
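To illustrate the pessimistic-locking side of that article, here is a rough sketch using a blob lease with the classic .NET SDK (account details, container, and blob names are invented):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobLeaseSketch
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        CloudBlockBlob blob = account.CreateCloudBlobClient()
            .GetContainerReference("documents")
            .GetBlockBlobReference("report.docx");

        // Acquire an exclusive 30-second lease; writers without the lease ID
        // get HTTP 412 until it expires or is released.
        string leaseId = blob.AcquireLease(TimeSpan.FromSeconds(30), null);
        try
        {
            blob.UploadText("updated contents",
                accessCondition: AccessCondition.GenerateLeaseCondition(leaseId));
        }
        finally
        {
            blob.ReleaseLease(AccessCondition.GenerateLeaseCondition(leaseId));
        }
    }
}
```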
I would suggest using Azure File Service; it gives you a share-like URI to use.
Take a look at this:
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/
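For what it's worth, a minimal sketch of writing a file to a share with the classic .NET SDK (the share and file names are made up); anything one role writes here is visible to the other:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

class FileShareSketch
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");

        // Both web roles address the same named share.
        CloudFileShare share = account.CreateCloudFileClient()
            .GetShareReference("onlyoffice-docs");
        share.CreateIfNotExists();

        // A file written by one role can be read (or SMB-mounted) by the other.
        CloudFile file = share.GetRootDirectoryReference()
            .GetFileReference("draft.docx");
        file.UploadText("document contents");
    }
}
```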

Getting web page hit count with IIS logs in Azure

I have a website hosted in Azure as a cloud service (not as a website), and I need to get the hit count for every web page of the site.
I enabled Azure Diagnostics, and I see the IIS logs copied to my blob storage; however, these logs contain very little data (only one hit to a JavaScript file).
Furthermore, setting "Verbose" or "All" in the diagnostics configuration of the web role doesn't seem to affect the results; I still get only one line (an access to a CSS file, an image file, etc.).
I'm using Azure SDK 2.0.
Is it possible to use the IIS logs generated by Azure to get a hit count? What do I need to change in the diagnostics configuration?
Or do I need a different approach to achieve this?
The IIS logs it produces are the same ones you'd find on any Windows Server. Note that, depending on the settings you provided to diagnostics, it might take a little while before the data is moved to the storage account. The verbosity level you set in the configuration determines what is moved from the instances over to the storage account. Did you give it plenty of time to move the data over before looking at the file in storage again? Sometimes it just brings over what it has so far, and of course there can be buffering, which means that when the file was brought over, not everything was in it yet.
You should be able to get this information from the logs, and yes, from the IIS logs specifically. That said, if what you are after is a hits-per-page count, I would suggest a different approach: look at an analytics provider like Google Analytics or one of its competitors. You'll get a massive amount of information beyond just page hits, and no need to worry about parsing log files, etc.
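If you do want to parse the IIS logs yourself, counting hits per page is just a matter of tallying the cs-uri-stem field. A rough sketch over a downloaded log file (the path is hypothetical, and the field index assumes the default W3C layout, so check the #Fields header in your own logs):

```csharp
using System;
using System.IO;
using System.Linq;

class HitCounter
{
    static void Main()
    {
        // Hypothetical path to a log file pulled down from blob storage.
        var hits = File.ReadLines(@"C:\logs\u_ex130101.log")
            .Where(line => !line.StartsWith("#"))      // skip W3C header directives
            .Select(line => line.Split(' '))
            .Where(fields => fields.Length > 4)
            .GroupBy(fields => fields[4])              // cs-uri-stem in the default field order
            .OrderByDescending(g => g.Count());

        foreach (var page in hits)
            Console.WriteLine("{0,6}  {1}", page.Count(), page.Key);
    }
}
```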

Storage Transaction Profiler for Windows Azure Web Deploy Accelerator

I've recently begun using the Web Deployment Accelerator for my Windows Azure account. It is providing an immediate return in time saved and is an excellent offering.
However, since "everything" is now stored in Azure Storage rather than on the regular E: drive, I am immediately seeing a cost consequence for using the tool.
In one day I have racked up a mighty 4 cent NZD charge. To do that I had to burn through about 80,000 storage transactions, and frankly I can't figure out where they all went.
I uploaded 6 very small sites that wouldn't have more than 300 files each. So I'm wondering:
a. Is there a profiling tool for the Web Deployment Accelerator that will let me see where and how 80,000 storage transactions were used for such a small deployment? Is it a storage-transaction-intensive tool? Has any cost analysis been carried out on how this tool operates? Has it been optimized with cost in mind?
b. If I'm using this tool, do I pay for two storage transactions per HTTP request to a site? Since the tool now writes the web server logs to table storage, that would be one storage transaction to pull the requested resource (img, script, etc.) and another to write the log entry, would it not?
I'm not concerned about current charges; I'm concerned about the future if I start rolling all my hosted business into the cloud. I mean, I'm now being charged even just to "look" at my data, right? If I list the contents of a storage folder using a tool like Azure Storage Explorer, is that x storage transactions, where x = the number of files in the folder?
Not sure of a third-party profiler tool, but Windows Azure Storage logging and metrics will give you very detailed info regarding both individual accesses and hourly rollups. It's pretty straightforward to enable, and the November 2011 SDK includes support for the API calls required to turn it on. See here for an overview of what's offered for metrics and logging.
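In later versions of the classic .NET SDK, enabling hourly transaction metrics looks roughly like this (the connection string is a placeholder):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class EnableBlobMetrics
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        var client = account.CreateCloudBlobClient();

        // Request per-API hourly rollups with a week of retention.
        ServiceProperties props = client.GetServiceProperties();
        props.HourMetrics.MetricsLevel = MetricsLevel.ServiceAndApi;
        props.HourMetrics.RetentionDays = 7;
        props.HourMetrics.Version = "1.0";
        client.SetServiceProperties(props);

        // The rollups appear in the $MetricsHourPrimaryTransactionsBlob table,
        // which breaks transaction counts down by operation type.
    }
}
```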
My team worked with Fullscale180 to build a storage library, Azure Store XRay, to demonstrate how to enable and query storage metrics and logging. Note: This was published before the SDK had logging and metrics support, so it uses the REST API calls instead. But that won't impact you if you try to use the library.
You can also look at another code demo, Cloud Ninja, which calls the XRay library for its metrics display (see here for running demo).
Regarding querying storage for objects in blob containers: that's not a 1:1 transaction-to-file scenario. You can specify the maximum number of blobs to return when listing items in a container, so it's possible that all blobs are returned in one transaction. Of course, if you then grab each blob, each download will be at least one transaction (depending on blob size). See here for details about listing blobs.
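To make that concrete, here is a rough sketch with the classic .NET SDK (account and container names are invented): each segmented listing call is a single transaction and can return up to 5,000 blobs:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ListingTransactions
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        var container = account.CreateCloudBlobClient().GetContainerReference("sites");

        BlobContinuationToken token = null;
        do
        {
            // One storage transaction per segment, no matter how many
            // blobs (up to 5,000) come back in it.
            BlobResultSegment segment = container.ListBlobsSegmented(
                null, true, BlobListingDetails.None, 5000, token, null, null);

            foreach (IListBlobItem item in segment.Results)
                Console.WriteLine(item.Uri);

            token = segment.ContinuationToken;
        } while (token != null);
    }
}
```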
