Azure Storage Explorer Not Responding on HDInsight Cluster Container

I am using ASE to access my Azure Storage accounts and it works great, but for some reason when I try to access my HDInsight cluster container (the one that has the same name as the HDInsight cluster), I get nothing; it seems to time out without a message. This is quite frustrating. A search turned up nothing, so I suspect this is not usual behavior?
Any ideas about what I could do to fix this?
Here is a Fiddler screenshot. It looks like about 15+ MB of data was transferred, but it was never displayed... odd.
(Note: I just noticed that it actually does work if I try to use ASE from a VM in the same data center as my storage account.)

I haven't looked at the source code of Azure Storage Explorer (ASE), but from the Fiddler trace it seems that ASE is trying to fetch all blobs in your blob container first and then display them in the UI (judging by the multiple requests to your storage account, each carrying a continuation token as a query string parameter, the marker parameter), and it looks like you have lots of blobs in that container. Given that ASE makes use of the .NET Storage Client Library, my guess is that it is using the ListBlobs method, which fetches all blobs in the container, instead of the ListBlobsSegmented method.
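For comparison, here is a minimal sketch of what segmented (paged) listing looks like with the classic Microsoft.WindowsAzure.Storage client library; the connection string, container name, and page size are placeholders, not anything taken from ASE's actual code:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobPagingSketch
{
    static void Main()
    {
        // Placeholder credentials and container name.
        CloudStorageAccount account = CloudStorageAccount.Parse("<your-storage-connection-string>");
        CloudBlobContainer container = account.CreateCloudBlobClient()
            .GetContainerReference("mycontainer");

        BlobContinuationToken token = null;
        do
        {
            // Fetch at most 100 blobs per round trip instead of enumerating everything at once.
            BlobResultSegment segment = container.ListBlobsSegmented(
                null,                     // prefix
                true,                     // useFlatBlobListing
                BlobListingDetails.None,
                100,                      // maxResults per page
                token,
                null,                     // BlobRequestOptions
                null);                    // OperationContext

            foreach (IListBlobItem item in segment.Results)
            {
                Console.WriteLine(item.Uri);
            }

            token = segment.ContinuationToken; // null once the listing is complete
        } while (token != null);
    }
}
```

A UI built on this could render one page at a time and only fetch the next segment on demand, rather than waiting for the full listing.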
Your options would be:
Get the source code of ASE from CodePlex and modify it to implement some kind of pagination.
File a bug/feature request on ASE to support pagination.
Use another storage explorer. If you're looking for a desktop-based blob explorer, I would recommend looking into Cerebrata tools (www.cerebrata.com). If you're looking for a browser-based tool, I would recommend looking at Cloud Portam (www.cloudportam.com). [Disclosure: I am developing Cloud Portam]

Related

Moving locally stored documents to Azure

I want to spike whether Azure and the cloud are a good fit for us.
We have a website where users upload documents to our currently hosted site.
Every document has an equivalent record in a database.
I am using Terraform to create the Azure infrastructure.
What is the best way of migrating the documents from the local file path on the server to Azure?
Should I be using File Storage or Blob Storage? I am confused about the difference.
Is there anything in Terraform that can help with this?
Based on your comments, I would recommend storing them in Blob Storage. This service is suited for storing and serving unstructured data like files and images. There are many other features like redundancy, archiving etc. that you may find useful in your scenario.
File Storage is more suitable for lift-and-shift scenarios where you're moving an on-premises application to the cloud and the application writes data to a local or network-attached disk.
You may also find this article useful: https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
UPDATE
Regarding uploading files from local computer to Azure Storage, there are actually many options available:
Use a storage explorer such as Microsoft's Azure Storage Explorer.
Use the AzCopy command-line tool.
Use the Azure PowerShell cmdlets.
Use the Azure CLI.
Write your own code using any of the available storage client libraries or by directly consuming the REST API (a rough sketch with the .NET client library follows below).
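For the last option, here is a minimal sketch, assuming the classic Microsoft.WindowsAzure.Storage library; the connection string, container name, blob name, and local file path are all placeholders:

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadSketch
{
    static void Main()
    {
        // Placeholder credentials and names.
        CloudStorageAccount account = CloudStorageAccount.Parse("<your-storage-connection-string>");
        CloudBlobContainer container = account.CreateCloudBlobClient()
            .GetContainerReference("documents");
        container.CreateIfNotExists();

        // The blob name can mirror the original folder structure, e.g. "2019/03/report.pdf".
        CloudBlockBlob blob = container.GetBlockBlobReference("2019/03/report.pdf");
        using (FileStream fileStream = File.OpenRead(@"C:\data\2019\03\report.pdf"))
        {
            blob.UploadFromStream(fileStream);
        }
    }
}
```

Terraform can create the storage account and containers for you, but the actual file copy is usually done with one of the tools or code paths above.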

How to make code on an Azure VM trigger from storage blob change (like Functions do)

I've got some image processing code that I need to run in Azure. It's perfect for an Azure Function, but unfortunately requires a component with a complex installation procedure and therefore will need to run in a VM.
However, I'd like to make it behave much like an Azure Function, and trigger whenever new items arrive in blob storage.
My question is: Does Azure provide me with any handy way of doing this, or do I have to write code that polls the blob storage looking for new items?
Have a look at the Azure WebJobs SDK. It shares its API model with Functions, but you can host it in any .NET application. See the BlobTrigger binding.
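For illustration, a minimal sketch of a blob-triggered WebJob in the WebJobs SDK 2.x style; the container name "images" and the storage connection strings (read from app config as AzureWebJobsStorage/AzureWebJobsDashboard) are assumptions you'd replace with your own:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

class Program
{
    static void Main()
    {
        // The host reads the storage connection strings from configuration
        // and listens for new or updated blobs on behalf of the functions below.
        var host = new JobHost();
        host.RunAndBlock();
    }
}

public class Functions
{
    // Fires when a blob appears or changes in the "images" container;
    // {name} binds to the blob name.
    public static void ProcessImage(
        [BlobTrigger("images/{name}")] Stream input,
        string name,
        TextWriter log)
    {
        log.WriteLine("New or updated blob detected: " + name);
        // ... hand the stream to the image processing component installed on the VM ...
    }
}
```

Because the WebJob is just a console application, it can run on the VM alongside the component with the complex installation procedure.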

View Azure Blob Metadata Online

Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically, without any problems, but when I go back to read the metadata in another section of the program there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.
One of the simplest ways to set/get an Azure Storage Blob's metadata is by using the cross-platform Microsoft Azure Storage Explorer, which is a standalone app from Microsoft that allows you to easily work with Azure Storage data on Windows, macOS and Linux.
Just right-click the blob you want to examine and select Properties; you will see the metadata list if any exists.
Note: Version tested - 0.8.7
There is no way to check this in the portal; however, you can try the Storage Explorer tool.
If you want to check the metadata in your code, see Get Blob Metadata; a rough sketch follows below.
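For illustration, a minimal sketch of writing and reading back metadata with the classic Microsoft.WindowsAzure.Storage library; the connection string, container, blob, and metadata key are placeholders:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class MetadataSketch
{
    static void Main()
    {
        // Placeholder credentials and names.
        CloudStorageAccount account = CloudStorageAccount.Parse("<your-storage-connection-string>");
        CloudBlockBlob blob = account.CreateCloudBlobClient()
            .GetContainerReference("mycontainer")
            .GetBlockBlobReference("myfile.txt");

        // Write metadata to the service.
        blob.Metadata["processedBy"] = "importer";
        blob.SetMetadata();

        // Read it back; FetchAttributes pulls properties and metadata from the service,
        // so you can confirm what was actually persisted in the cloud.
        blob.FetchAttributes();
        foreach (var pair in blob.Metadata)
        {
            Console.WriteLine(pair.Key + " = " + pair.Value);
        }
    }
}
```

A common gotcha is reading blob.Metadata from a freshly created reference without calling FetchAttributes first, in which case the dictionary is empty even though the metadata exists on the service.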

What is the best strategy for using Windows Azure as a file storage system, with HTTP download capabilities?

I need to store multiple files that users upload, and then provide these users with the capability of accessing their files via HTTP. There are two key considerations:
- Storage (which is my primary concern here)
- Security (which let's leave aside for now)
The question is:
What is the most cost-efficient and performant way of storing all these files and giving access to them later? I believe the answer is:
- Store files within Azure Storage Account, and have a key that references them in an SQL Azure database.
Am I correct on this?
Is blob storage flat, or can I create something like folders inside it to better organize my files?
The idea of using SQL Azure to store metadata for your blobs is a pretty common scenario, which allows you to take advantage of SQL for searching, and blobs for storage.
Blobs are organized by container. So you'd have something like:
http://mystorage.blob.core.windows.net/mycontainer/myfile.doc
You can also simulate a hierarchy using a delimiter, but in reality there's just container plus blob.
If you keep the container or blob private, the user would either have to go through your web front end (or web service), or you'd have to provide them with a special URL with a Shared Access Signature appended, which is a time-limited URL.
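For illustration, a rough sketch of generating such a time-limited URL with the classic Microsoft.WindowsAzure.Storage library; the account details, blob name, and one-hour expiry are placeholders:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class SasSketch
{
    static void Main()
    {
        // Placeholder credentials; the "folder" segments are simply part of the blob name.
        CloudStorageAccount account = CloudStorageAccount.Parse("<your-storage-connection-string>");
        CloudBlockBlob blob = account.CreateCloudBlobClient()
            .GetContainerReference("mycontainer")
            .GetBlockBlobReference("users/42/myfile.doc");

        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
        };

        // The SAS token is appended to the blob URL to form a time-limited download link
        // that works even though the container itself stays private.
        string sasToken = blob.GetSharedAccessSignature(policy);
        Console.WriteLine(blob.Uri + sasToken);
    }
}
```

The SQL Azure record for each file would then only need to store the container and blob name; the SAS URL is generated on demand when a user requests a download.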
I would recommend taking a look at the BlobShare sample, which is a simple file-sharing application that demonstrates the storage services of the Windows Azure platform together with the authentication and authorization capabilities of Access Control Service (ACS). The full sample code is located at the following link:
http://blobshare.codeplex.com/
You can use this sample code immediately, just by adding the proper references to your Windows Azure account credentials. The best thing about this sample is that you can provide blob access directly through Access Control Service. You can also modify the code to add SAS support as well as blob downloads from public containers. Once you have it working and understand the concept, you can tweak it to work the way you want.

Is it possible to mount blob storage to my local machine for deployment?

It would be very useful to configure my build script to dump some files into Azure blob storage so they can be picked up by my Azure web role.
My preferred plan was to find some way of mounting the blob storage on my build server as a mapped drive and simply using Robocopy to copy the files over. This would involve the least amount of friction, as I am already deploying some files like this to other web servers using WebDrive.
I found a piece of software that will allow me to do that: http://www.gladinet.com/
However, on further investigation I found that it needs port 80 to run without some hairy-looking hacking about on the server.
So is there another piece of software I could use or perhaps another way I haven't considered, such as deploying the files to a local folder that is automagically synced with blob storage?
Update in response to @David Makogon
I am using http://waacceleratorumbraco.codeplex.com/ which performs two-way synchronisation between blob storage and the web roles. I have tested this with http://cloudberrylab.com/ and I can deploy files manually to the blob and they are deployed correctly to the web roles. I have also done the reverse and updated files in the web roles, which have then been synced back to the blob, and I have subsequently edited/downloaded them from blob storage.
What I'm really looking for is a way to automate the CloudBerry side of things, so I don't have a manual step to copy a few files over. I will investigate the PowerShell solutions in the meantime.
I know this is an old post - but in case someone else comes here... the answer is now "yes". I've been working on a CodePlex project to do exactly that. (All source code is available).
http://azuredrive.codeplex.com/
If you're comfortable using PowerShell in your build process, then you could use the Cerebrata cmdlets to upload the files. If that doesn't work for you, you could write a custom activity (but this sounds quite a bit more involved).
Mounting a cloud drive from a non-Windows Azure compute instance (e.g. your local build machine) is not supported.
Having said that: Even if you could mount a Cloud Drive from your build machine, your compute instances would need access to it too, and there can only be one writer. If your compute instances only needed read-only access, they'd need to create a snapshot after you upload new files.
This really doesn't sound like a good idea though. As knightpfhor suggested, the Cerebrata cmdlets provide this capability (look at Import-File). This allows you to push individual files into their own blobs. You can optimize further by pushing a single ZIP file into a blob. You can then use a technique similar to the one described by Nate Totten in his multi-tenant web role sample, to detect new zip files and expand them to your local storage. Nate's blog post is here.
Oh, and if you don't want to use the Cerebrata cmdlets, you can upload blobs directly with the Windows Azure Storage REST API (though the cmdlets are very simple to use and integrate seamlessly with PowerShell).
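For illustration, a rough sketch of the "detect new zip files and expand them to local storage" idea with the classic Microsoft.WindowsAzure.Storage library; the container name, target folder, and polling approach are assumptions for the sketch, not Nate Totten's actual implementation:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class DeploymentSyncSketch
{
    // Remembers the timestamp of the last package we expanded.
    static DateTimeOffset? _lastDeployed;

    // Call this periodically (e.g. from a timer in the web role's Run loop).
    static void CheckForNewPackage()
    {
        CloudBlobContainer container = CloudStorageAccount.Parse("<your-storage-connection-string>")
            .CreateCloudBlobClient()
            .GetContainerReference("deployments"); // placeholder container name

        // Find the newest .zip blob in the container.
        CloudBlockBlob newest = container.ListBlobs()
            .OfType<CloudBlockBlob>()
            .Where(b => b.Name.EndsWith(".zip", StringComparison.OrdinalIgnoreCase))
            .OrderByDescending(b => b.Properties.LastModified)
            .FirstOrDefault();

        if (newest == null || newest.Properties.LastModified <= _lastDeployed)
            return; // nothing new since the last check

        // Download the package and expand it into local storage.
        string zipPath = Path.Combine(Path.GetTempPath(), Path.GetFileName(newest.Name));
        using (FileStream fs = File.Create(zipPath))
        {
            newest.DownloadToStream(fs);
        }

        // In a real web role you'd expand into a LocalResource path and clear/version it first.
        ZipFile.ExtractToDirectory(zipPath, @"C:\site-content");
        _lastDeployed = newest.Properties.LastModified;
    }
}
```

The build script's only job then becomes pushing a single ZIP blob into the container; the running instances notice the new package and expand it themselves.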
