Where to find "blob name" and "container name" on MS Azure Storage

I have created a "blobs" storage on MS cloud and when I tried to read data from it to Jupyter notebook for some analysis which according to this documentation, it requires "container_name" as well as "blob_name". But when I created the storage, as far as I remember I didn't come across the step where I had to assign the blob name. However, apparently I needed it. Unfortunately, so far I couldn't find it but I believed that I could guess the "container_name". I did a quick research on google but couldn't find any resources that says exactly where it is. So, I would like to know how I can find out the "container_name" as well as "blob_name" from the MS Azure panel.
Thank you in advance.

You can access them via your storage account in the Azure portal; refer to the screenshot.
Choose a container and you will see its blobs, including their names, blob type, size, etc.
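If you prefer to discover the names from code rather than the portal, here is a minimal sketch using the Python azure-storage-blob package; the connection string is an assumption (copy yours from the storage account's Access keys blade), and the values printed are exactly the container_name and blob_name the Jupyter documentation asks for:

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Assumption: a connection string copied from
# Azure portal -> storage account -> Access keys.
conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(conn_str)

# Walk every container in the account and print each blob's name.
for container in service.list_containers():
    print("container_name:", container.name)
    container_client = service.get_container_client(container.name)
    for blob in container_client.list_blobs():
        print("  blob_name:", blob.name)
```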

Related

View Azure Blob Metadata Online

Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically, seemingly without any errors, but when I go back to read the metadata in another section of the program, there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.
One of the simplest ways to set/get an Azure Storage blob's metadata is the cross-platform Microsoft Azure Storage Explorer, a standalone app from Microsoft that lets you easily work with Azure Storage data on Windows, macOS, and Linux.
Just right-click the blob you want to examine and select Properties; you will see the metadata list if any exists.
Note: Version tested - 0.8.7
There is no way to check this in the portal; however, you can try the Storage Explorer tool.
If you want to check the metadata in your code, see the Get Blob Metadata operation.
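As a hedged illustration with the Python azure-storage-blob package (the connection string, container, and blob names below are placeholders), writing metadata and immediately reading it back to confirm it persisted looks like this:

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Assumptions: your connection string, and an existing container/blob.
conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
service = BlobServiceClient.from_connection_string(conn_str)
blob_client = service.get_blob_client("mycontainer", "myblob.txt")

# Write metadata; note this replaces any metadata already on the blob.
blob_client.set_blob_metadata({"department": "finance", "owner": "alice"})

# Read it back to confirm it was actually persisted in the cloud.
props = blob_client.get_blob_properties()
print(props.metadata)  # expect: {'department': 'finance', 'owner': 'alice'}
```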

What kind of Azure Storage Account did I create?

I created a new storage account in the Azure portal and chose an existing resource group. It did not create a classic storage account but some kind of resource-group-ish storage account that doesn't have all the options a classic storage account has.
As an example, I could create the "files" folder through code, but I can't use the code "blockBlob.UploadFromStream(fileStream);"; it gives me error 400 Bad Request. The same code works when I upload to a classic storage account.
What kind of storage account is shown in my image? Which is more correct: creating a classic storage account (blue icon in the image) or the one I created (green/white/grey icon in the image)?
First, I would suggest you have a look at David's reply in this thread to understand the difference between the new kind of Azure storage account and the classic kind.
it gives me error 400 bad request.
Many different issues can cause a 400 error, so I would suggest checking your code to find the detailed cause. Please test whether your code can create a container (the container name must meet the naming limitations) to see whether that works; it would also help if you could share the key code. A quick smoke test of both steps is sketched below.
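The original code is .NET, but here is a hedged sketch of the same smoke test with the Python azure-storage-blob package (connection string and file names are placeholders). Container names must be 3-63 characters of lowercase letters, digits, and hyphens; an invalid name is a common cause of 400 Bad Request:

```python
from azure.storage.blob import BlobServiceClient

# Assumption: a valid connection string for the account under test.
conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
service = BlobServiceClient.from_connection_string(conn_str)

# Container names must be 3-63 chars: lowercase letters, digits, hyphens.
container_client = service.create_container("upload-smoke-test")

# Upload a small block blob; if this also fails with 400, inspect the
# request/response (e.g. with Fiddler) for the detailed error code.
with open("local-file.txt", "rb") as data:
    container_client.upload_blob("test-blob.txt", data)
```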
You created an Azure Resource Manager storage account of type Premium storage in region "North Europe" within your given resource group.
There is not really a right or wrong; as almost always, it depends on your use case.
I want to suggest these docs and samples for getting started with code addressing block blobs and Azure Storage in general. Run and explore this code against the storage emulator and/or live storage accounts (classic / ARM standard / ARM premium). Maybe this will help you find a bug or a misconfiguration in your project.
Azure Blob Storage Samples for .NET
Azure Storage Samples
The issue here has nothing to do with Classic vs Resource Manager. It's related to the fact that the storage account is of type "Premium."
Premium storage accounts are exclusively used for Azure disks (attached durable storage), which are Page Blobs.
Premium storage doesn't support generic blobs/tables/queues. For that, you'd need a non-premium storage account.

What is using my Azure Blob Accounts

After using Azure for a while, we have accumulated a bunch of storage accounts. There doesn't seem to be a way to figure out whether those storage accounts are in use, or what they are used by. It looks like even spinning up a VM creates a storage account.
Is there a way (without the PowerShell) to see what is being used and delete the unused storage?
As others have said, it's not possible to give you an exact answer, but you can iterate over the storage accounts and, within that loop, iterate over each account's containers to see which ones have blobs. I would approach this from within Visual Studio by creating a new project and using NuGet to add a reference to the WindowsAzure.Storage client library; it makes iterating those collections easier and is essentially a wrapper around the Azure Management API. There is likely a way to do it with PowerShell as well; a sketch of the idea follows.
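The answer above describes the .NET route; as a hedged sketch of the same scan using the Python azure-storage-blob package (the account names and keys below are placeholders, and you still have to supply the list of accounts yourself, e.g. from the portal or CLI):

```python
from itertools import islice
from azure.storage.blob import BlobServiceClient

# Placeholder map of storage account name -> connection string.
accounts = {
    "accountone": "DefaultEndpointsProtocol=https;AccountName=accountone;AccountKey=<key>",
    "accounttwo": "DefaultEndpointsProtocol=https;AccountName=accounttwo;AccountKey=<key>",
}

for name, conn_str in accounts.items():
    service = BlobServiceClient.from_connection_string(conn_str)
    for container in service.list_containers():
        container_client = service.get_container_client(container.name)
        # Fetch at most one blob; an empty result means an empty container.
        has_blobs = any(True for _ in islice(container_client.list_blobs(), 1))
        print(f"{name}/{container.name}: {'in use' if has_blobs else 'empty'}")
```

An account whose containers are all empty is a deletion candidate, but check its tables, queues, and any VM disks before deleting anything.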

Unable to create an Import/Export Job on the new Azure portal

I have been trying to set up an import job as described here; the problem is that we do not have "Classic" storage; rather, we are trying to set it up with "new" storage. Using the new portal, I cannot find the place where one is meant to create a new job. The linked article shows how to do this only for classic storage on the old portal.
I have tried using the second approach they mention, which is to use the API, but that is turning out to be more of a pain than I thought.
Does anyone know where I can add an import/export job in the new portal? Is this possible with "new" storage? If I manage to get the API way to work, can it be applied to "new" storage or is it only for "classic"?
Unfortunately, Import/Export is not available in the Preview Portal, and does not work at this time with v2 storage accounts. Can you use a Classic storage account instead?
We may be able to provide sample code to unblock your scenario. Can you please send an email with your detailed requirements to waimportexport@microsoft.com so that we can set up a call to discuss further?
Thanks,
Aung

Azure Storage Explorer not Responding on HDInsight Node Container

I am using ASE to access my Azure storage accounts and it works great, but for some reason, when I try to access my HDInsight cluster container (the one that has the same name as the HDInsight cluster), I get nothing; it seems to time out without a message. This is quite frustrating. A search turned up nothing, so I suspect this is not usual behavior?
Any ideas about what I could do to fix this?
Here is a Fiddler screenshot. It looks like it transferred 15+ MB of data, but it never displayed anything... odd.
(Note: I just noticed that it actually does work if I use ASE from a VM in the same data center as my storage account.)
I haven't looked at the source code of Azure Storage Explorer (ASE), but from the Fiddler trace it seems that ASE first tries to fetch all blobs in your blob container and only then displays them in the UI; the trace shows multiple requests to your storage account, each containing a continuation token as a query string parameter (the marker parameter), and it looks like you have lots of blobs in the container. Given that ASE uses the .NET Storage Client Library, my guess is that it calls the ListBlobs method, which fetches all blobs in the container, instead of the ListBlobsSegmented method, which fetches them one page at a time.
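For comparison, this is roughly what segmented (paginated) listing looks like with the Python azure-storage-blob package; the continuation token plays the same role as the marker parameter visible in the Fiddler trace (the connection string, container name, and page size are assumptions):

```python
from azure.storage.blob import ContainerClient

conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
container_client = ContainerClient.from_connection_string(conn_str, "mycontainer")

# Ask for 100 results per page instead of enumerating every blob up front.
pages = container_client.list_blobs(results_per_page=100).by_page()

# Render just the first page; each page boundary corresponds to one
# request/response pair (with a "marker" token) in the Fiddler trace.
for blob in next(pages):
    print(blob.name)

# The token lets a UI resume from here for a "next page" action.
token = pages.continuation_token
```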
Your options would be:
Get the source code of ASE from CodePlex, mess around with it, and implement some kind of pagination.
File a bug/feature request on ASE to support pagination.
Use another storage explorer. If you're looking for a desktop-based blob explorer, I would recommend the Cerebrata tools (www.cerebrata.com); if you're looking for a browser-based tool, I would recommend Cloud Portam (www.cloudportam.com). [Disclosure: I am developing Cloud Portam]
