Detailed logging in Azure storage account?

Every 70 minutes or so, something is downloading some hundred megabytes of data from one of my Azure storage accounts. Is there any way to figure out what is causing this?
All the logs and statistics I've looked at only give me aggregate graphs like the one shown, which tell me that data has been downloaded, but not what or by whom.

You can find this information by viewing Storage Analytics data, specifically the logging data contained in the $logs blob container of the storage account in question.
You can use Microsoft Storage Explorer or any other storage explorer to browse the contents of this container and download the appropriate blobs. The operation you want to look for is GetBlob (that's the request sent to Azure Storage when a blob is downloaded).
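If you'd rather script it, here is a minimal sketch assuming the classic Azure PowerShell module and that Storage Analytics logging is already enabled; the account name, key, and local folder are placeholders:

```powershell
# Minimal sketch: download the analytics logs, then filter for GetBlob entries.
# Account name, key and local folder are placeholders.
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey "<account-key>"
New-Item -ItemType Directory -Path "C:\StorageLogs" -Force | Out-Null

# $logs is single-quoted so PowerShell does not expand it as a variable.
Get-AzureStorageBlob -Container '$logs' -Context $ctx |
    Get-AzureStorageBlobContent -Destination "C:\StorageLogs" -Context $ctx

# Log entries are semicolon-delimited; GetBlob appears as the operation field.
Get-ChildItem "C:\StorageLogs" -Recurse -Filter *.log |
    Select-String -Pattern ";GetBlob;" |
    ForEach-Object { $_.Line }
```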

Related

How to prevent large files upload to Azure via Put Blob

Because of a bandwidth issue, I'm currently sending my files directly to Azure Blob storage.
But I don't want anyone with the access token to be able to upload files that are too large. According to the Microsoft docs: Documentation
Note that the storage emulator only supports blob sizes up to 2 GB.
Is there any way I can prevent users from uploading files larger than 10 MB to my blob storage via Azure settings or policies?
Please notice that the limit you quote applies to the storage emulator (a local, simulated storage service), not to the real service.
For real Azure blob Storage, please refer to this:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits#azure-blob-storage-limits
And about your requirement: Azure Storage has no setting or policy that lets you cap the size of an uploaded blob; you can only enforce such a limit in your own code.
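For illustration, a client-side sketch of such a check, assuming the classic Azure PowerShell module; note that a caller holding a valid token could still bypass a client-side check, so real enforcement has to live in your own upload service. All names are placeholders:

```powershell
# Skip files over 10 MB before uploading; this is a convention in your
# code, not something Azure Storage enforces. Names are placeholders.
$maxBytes = 10MB
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey "<account-key>"

Get-ChildItem "C:\outbox" -File -Recurse | ForEach-Object {
    if ($_.Length -gt $maxBytes) {
        Write-Warning "Skipping $($_.Name): $($_.Length) bytes exceeds the 10 MB limit"
    } else {
        Set-AzureStorageBlobContent -File $_.FullName -Container "uploads" -Context $ctx
    }
}
```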

Getting error while trying to download logs for Azure Table Storage using AzCopy

I would like to download the logs for Azure Table Storage for the past 3 days, using AzCopy. I tried different combinations of URLs, HTTP/HTTPS, and patterns, but I still get an error while downloading the logs. Am I passing incorrect parameters to AzCopy?
The $logs container is created automatically when Storage Analytics is enabled for a storage account. If you cannot find the $logs container, please make sure you have enabled Storage Analytics.
I enabled Storage Analytics and was able to download the Storage Logging log data using the Azure Copy Tool (AzCopy) on my side.
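For reference, a sketch of a download command in the classic (pre-v10) AzCopy syntax; the account name, key, and destination folder are placeholders:

```powershell
# Download the table-service analytics logs. Single quotes stop
# PowerShell from expanding $logs as a variable.
AzCopy /Source:'https://myaccount.blob.core.windows.net/$logs/table' `
       /Dest:'C:\StorageLogs\table' `
       /SourceKey:'<account-key>' /S /V
```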

Is Azure Blob storage the right place to store many (small) communication logs?

I am working with a program which connects to multiple APIs; the logs for each operation (HTML/XML/JSON) need to be stored for possible later review. Is it feasible to store each request/reply in an Azure blob? There can be hundreds of requests per second (all of which need storing), varying in size with an average of 100 kB.
Because the logs need to be searchable (by metadata), my plan is to store them in Azure Blob storage and put the metadata (blob locations, custom application-related request and content identifiers, etc.) in an easily searchable database.
You can store logs in Azure Table storage or Blob storage, but Microsoft itself recommends Blob storage; Azure Storage Analytics stores its own log data in Blob storage.
The 'Azure Storage Table Design Guide' points out several drawbacks of using Table storage for logs and also provides details on how to use Blob storage for them. Read the 'Log data anti-pattern' section in particular for this use case.
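As a rough sketch of your proposed layout, assuming the classic Azure PowerShell module (the -Metadata parameter needs a reasonably recent version of it); all names and values are illustrative, and the same metadata would also be written to your searchable index database:

```powershell
# Upload one log payload with its searchable fields attached as blob metadata.
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey "<account-key>"

$meta = @{ requestId = "12345"; api = "orders"; direction = "reply" }
Set-AzureStorageBlobContent -File "C:\logs\12345-reply.json" `
    -Container "apilogs" -Blob "orders/2014/05/12345-reply.json" `
    -Metadata $meta -Context $ctx
```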

How can I tell what is using a large VHD?

I am running a VM in Windows Azure. It has two disks attached to it (OS 40GB and DATA 60GB).
In addition to my two VHDs, the Storage has one more 40GB VHD named dmzvyyq2.jja20130312104458.vhd.
I would like to know where this VHD came from and what is using it. Surprisingly, its 'LAST MODIFIED' date is yesterday, so something must have updated it. I went through all the options in the Portal, but nothing seems to have this VHD attached.
Ultimately I would like to delete this VHD to save storage space and cost.
One way to find this out is by using Storage Analytics. If you have Storage Analytics enabled, you can view the contents of the $logs blob container, download the data for the date in question, and check for all activity on this particular blob. You can use a tool like Azure Management Studio from Cerebrata to view Storage Analytics data. However, if you haven't enabled analytics on your storage account, it will be very hard to find that information.
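For example, once the $logs blobs are downloaded locally (as in the first answer at the top of this page), you can grep the log files for the VHD's name to see which operations touched it; the local path is a placeholder:

```powershell
# Find every log entry that mentions the mystery VHD.
Get-ChildItem "C:\StorageLogs" -Recurse -Filter *.log |
    Select-String -SimpleMatch "dmzvyyq2.jja20130312104458.vhd" |
    ForEach-Object { $_.Line }
```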

Upload 650,000 documents to Azure

I can't seem to find any reference to bulk-uploading data to Azure.
I have a document store with 650,000 PDF documents that take up about 1.2 TB of disk space.
Uploading those files to Azure via the web will be difficult. Is there a way I can mail a hard drive and have your team upload them for me?
If not, can you recommend the best way to upload this many documents?
Maybe not the answer you expected, but you could use Amazon's AWS Import/Export (this allows you to mail them an HDD and they'll import it into your S3 account).
To transfer the data to a Windows Azure storage account you can leverage one of the new features in the 1.7.1 SDK: the StartCopyFromBlob method. This method lets you copy a file from a specific URL asynchronously (you could use it to copy all files from your S3 bucket to your Azure storage account).
Read the following blog post for a fully working example: How to Copy a Bucket from Amazon S3 to Windows Azure Blob Storage using “Copy Blob”
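A PowerShell analogue of that StartCopyFromBlob approach, assuming the classic Azure PowerShell module; the S3 URL and destination names are placeholders, and the source object must be publicly readable (or use a pre-signed URL):

```powershell
# Kick off a server-side copy from an S3 URL into a blob container.
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey "<account-key>"

Start-AzureStorageBlobCopy `
    -AbsoluteUri "https://mybucket.s3.amazonaws.com/document-0001.pdf" `
    -DestContainer "documents" -DestBlob "document-0001.pdf" `
    -DestContext $ctx

# The copy runs server-side and asynchronously; poll its progress like this:
Get-AzureStorageBlobCopyState -Container "documents" -Blob "document-0001.pdf" -Context $ctx
```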
While Azure doesn't offer a physical ingestion process today, if you talk nicely to the Azure team they can do this as a one-off. If you like, I can get you a contact on the product team (dave at greenbutton dot com).
Alternatively, there are solutions such as Aspera, which provides accelerated data transfers over UDP and is being beta-tested on Azure alongside the Azure Media Services offering.
We have some tools that help with this as well (http://www.greenbutton.com); they leverage Aspera's technology.
As disk shipments are not supported by Windows Azure, your best bet is to use a 3rd-party application (or write your own) that supports parallel upload; that way you can still upload much faster. 3rd-party applications like Gladinet or CloudBerry could be used to upload the data, but I am not sure how configurable they are for achieving the fastest possible parallel upload.
If you decide to write it yourself, here is a starting point: Asynchronous Parallel Block Blob Transfers with Progress Change Notification.
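A rough PowerShell sketch of the same idea, assuming the classic Azure PowerShell module; -ConcurrentTaskCount parallelizes the block uploads within each blob, and for many small files you would additionally run several such loops at once (e.g. one Start-Job per subfolder). Names are placeholders:

```powershell
# Upload a folder tree of PDFs with parallel block transfers per blob.
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey "<account-key>"

Get-ChildItem "C:\documents" -Filter *.pdf -Recurse | ForEach-Object {
    Set-AzureStorageBlobContent -File $_.FullName -Container "documents" `
        -ConcurrentTaskCount 8 -Context $ctx
}
```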
I know this is a bit too late for the OP, but in the Azure Management Portal, under Storage, pick your storage instance, then click the Import/Export link at the top. At the bottom of that screen, there is a "Create Import Job" link and icon. Also, if you click the blue help icon on the far right side, it says this:
You can use the Windows Azure Import/Export service to transfer large amounts of file data to Windows Azure Blob storage in situations where uploading over the network is prohibitively expensive or infeasible. You can also use the Import/Export service to transfer large quantities of data resident in Blob storage to your on-premises installations in a timely and cost-effective manner. Use the Windows Azure Import/Export Service to Transfer Data to Blob Storage
To transfer a large set of file data into Blob storage, you can send one or more hard drives containing that data to a Microsoft data center, where your data will be uploaded to your storage account. Similarly, to export data from Blob storage, you can send empty hard drives to a Microsoft data center, where the Blob data from your storage account will be copied to your hard drives and then returned to you. Before you send in a drive that contains data, you'll encrypt the data on the drive; when Microsoft exports your data to send to you, the data will also be encrypted before shipping.
Both Windows Azure storage PowerShell and AzCopy can bulk-upload data to Azure.
For Azure storage PowerShell, you could use ls -File -Recurse | Set-AzureStorageBlobContent -Container upload (this assumes the session's current storage account, or an explicit -Context, has already been set).
You can refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx for more details.
For AzCopy, you can refer to this article: http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx
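For illustration, an upload command in the style of that (2012-era) AzCopy post; the account, key, and local path are placeholders:

```powershell
# Recursively upload a local folder into a blob container (/S recurses).
AzCopy 'C:\documents' 'https://myaccount.blob.core.windows.net/documents' /DestKey:'<account-key>' /S
```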
