View Azure Blob Metadata Online

Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically without any problems, but when I read it back in another section of the program, there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.

One of the simplest ways to set/get an Azure Storage Blob's metadata is by using the cross-platform Microsoft Azure Storage Explorer, which is a standalone app from Microsoft that allows you to easily work with Azure Storage data on Windows, macOS and Linux.
Just right-click the blob you want to examine and select Properties; you will see the metadata list if any metadata exists.
Note: Version tested - 0.8.7

There is no way to check this in the portal; however, you can try the Storage Explorer tool.

If you want to check the metadata in your code, try the Get Blob Metadata operation.
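For example, here is a minimal sketch using the azure-storage-blob (v12) Python SDK; the connection string, container, and blob names are placeholders. One common cause of "missing" metadata is reading it from a locally constructed client object instead of fetching the blob's properties from the service:

```python
# Minimal sketch using the azure-storage-blob (v12) Python SDK.
# The connection string, container, and blob names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="mycontainer", blob="myblob.txt")

# Write metadata. Note: set_blob_metadata REPLACES all existing metadata,
# so a later call with a different dict silently drops earlier keys.
blob.set_blob_metadata({"department": "finance", "reviewed": "true"})

# Read it back. Metadata is only populated by an explicit round trip to
# the service; a freshly constructed client object carries none.
props = blob.get_blob_properties()
print(props.metadata)  # {'department': 'finance', 'reviewed': 'true'}
```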

Related

Read-only access to Azure storage account

Our software uses Azure blob & Azure table storage.
I would like developers to be able to look through our production data with the Microsoft Azure Storage Explorer, but not be allowed to accidentally edit its data.
I don't want to allow anonymous access to the data (read only) as suggested here.
What would be a good way to achieve this?
Make use of the Shared Access Signature (SAS) option to connect to Azure Blob Storage from Storage Explorer.
Find more details about SAS here.
Find more details about SAS in Storage Explorer here.
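If it helps, here is a hedged sketch of generating such a read-only SAS with the azure-storage-blob (v12) Python SDK; the account name, container name, and key are placeholders:

```python
# Sketch: generate a read-only, list-enabled SAS for one container
# using the azure-storage-blob (v12) Python SDK. Names are placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

sas = generate_container_sas(
    account_name="mystorageaccount",
    container_name="production-data",
    account_key="<account-key>",
    permission=ContainerSasPermissions(read=True, list=True),  # no write/delete
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)
# Developers paste this URL into Storage Explorer's SAS URI connection
# option to browse the container without being able to modify it.
print(f"https://mystorageaccount.blob.core.windows.net/production-data?{sas}")
```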

How to find the path to blob?

I want to export a machine learning model I created in Azure Machine Learning Studio. One of the required inputs is "Path to blob beginning with container".
How do I find this path? I have already created a blob storage but I have no idea how to find the path to the blob storage.
You should be able to find this in the Azure portal. Open the storage account, drill down into Blobs, then into your container. Select Properties from the context menu; the URL should be the path.
You can also get this URL using the Azure Storage Explorer desktop application,
or through an online version of Azure Storage Explorer.
You can also simply guess the URL if you know the storage account name and container name
https://[storageaccount].blob.core.windows.net/[container]/[blob]
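If you already use the azure-storage-blob (v12) Python SDK, it can hand you the same URL, which confirms the pattern above; the names here are placeholders:

```python
# Small sketch: the SDK exposes the blob URL directly, following the
# https://<account>.blob.core.windows.net/<container>/<blob> pattern.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="mycontainer", blob="model/data.csv")
print(blob.url)
# e.g. https://mystorageaccount.blob.core.windows.net/mycontainer/model/data.csv
```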

Azure Storage Explorer not Responding on HDInsight Node Container

I am using ASE to access my Azure Storage accounts and it works great, but for some reason when I try to access my HDInsight cluster container (the one that has the same name as the HDInsight cluster), I get nothing; it seems to time out without a message. This is quite frustrating. A search turned up nothing, so I suspect this is not usual behavior?
Any ideas about what I could do to fix this?
Here is a Fiddler screenshot. It looks like it transferred about 15+ MB of data, but it never displayed it... odd.
(note: I just noticed that it actually does work if I try and use ASE from a VM in the same data center as my storage account)
I haven't looked at the source code of Azure Storage Explorer (ASE), but from the Fiddler trace it appears that ASE fetches all blobs in a container before displaying them in the UI: the trace shows multiple requests to your storage account, each carrying a continuation token as a query string parameter (the marker parameter), and it looks like you have a lot of blobs in the container. Given that ASE makes use of the .NET Storage Client Library, my guess is that it is using the ListBlobs method, which fetches all blobs in the container, instead of the ListBlobsSegmented method; a sketch of the segmented approach follows the options below.
Your options would be:
Get the source code of ASE from CodePlex and modify it to implement some kind of pagination.
File a bug/feature request on ASE to support pagination.
Use another storage explorer. If you're looking for a desktop-based blob explorer, I would recommend looking into Cerebrata tools (www.cerebrata.com). If you're looking for a browser-based tool, I would recommend looking at Cloud Portam (www.cloudportam.com). [Disclosure: I am developing Cloud Portam]
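For reference, here is the segmented-listing idea sketched with the azure-storage-blob (v12) Python SDK, since I don't have the ASE source at hand; the .NET equivalent of this pattern is ListBlobsSegmented, and the connection string and container name are placeholders:

```python
# Sketch of segmented (paged) blob listing in Python, azure-storage-blob v12.
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<connection-string>", container_name="hdinsight-cluster"
)

# by_page() yields one segment per service round trip instead of
# buffering every blob in memory; the continuation token lets a UI
# render one page and fetch the next on demand.
pages = container.list_blobs(results_per_page=100).by_page()
first_page = next(pages)
for blob in first_page:
    print(blob.name)
token = pages.continuation_token  # pass back later to resume listing
```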

Azure Data Factory

I have two questions.
Is there any way to move data (CSV files on an FTP server) into an Azure Storage account periodically using ADF?
After switching Azure mode using
Switch-AzureMode AzureResourceManager
I could not use Get-Help datafactory.
(I used PowerShell in admin mode and added my Azure account using "Add-AzureAccount".)
Thanks
I'll try to answer your questions in order:
Not a native way at the moment, but you can create a custom activity that loads your CSV files into Azure Storage; the core copy step is sketched after these answers. The scheduling can be done via JSON (as with most of the functionality in Data Factory). You can expect native support to arrive at some point in the future.
Haven't tried it, but you could try Get-Help azuredatafactory (caution: no whitespace), or help azuredatafactory, or have a look at the cmdlets reference.
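To illustrate the first answer, here is a hedged sketch of the copy step such a custom activity would perform: pull a CSV from FTP and upload it as a blob. The host, credentials, paths, and names are placeholders, and scheduling would still come from the pipeline JSON:

```python
# Sketch of the FTP-to-blob copy step a custom activity could run.
import io
from ftplib import FTP
from azure.storage.blob import BlobServiceClient

# Download the CSV from the FTP server into memory.
buf = io.BytesIO()
with FTP("ftp.example.com") as ftp:
    ftp.login(user="<user>", passwd="<password>")
    ftp.retrbinary("RETR /exports/daily.csv", buf.write)

# Upload it to the storage account as a blob.
service = BlobServiceClient.from_connection_string("<connection-string>")
service.get_blob_client("ingest", "daily.csv").upload_blob(
    buf.getvalue(), overwrite=True
)
```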

Saving XML file with Azure

I have an XML file that I'm saving on my local machine (Windows 8).
I want to allow the application to push that XML up to Azure. Then I would also need the ability to recall it later, as part of a sync process.
I'm not sure what services I need and if Azure would even provide it.
Any tips would be appreciated.
I suggest you use Azure Blob Storage. Push the XML file up as a blob to your storage account; you will get a URL to access it.
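A minimal sketch of that push/recall round trip, using the azure-storage-blob (v12) Python SDK; the connection string, container, and file names are placeholders:

```python
# Sketch: upload a local XML file to Blob Storage and pull it back down.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="appdata", blob="settings.xml")

# Push the local XML file up to the cloud...
with open("settings.xml", "rb") as f:
    blob.upload_blob(f, overwrite=True)

# ...and recall it later (e.g. on another machine) to sync.
with open("settings.xml", "wb") as f:
    f.write(blob.download_blob().readall())
```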
