How to watch a folder of an Azure Storage blob container - azure

We have some blob containers in Azure Storage.
I would like to have a dashboard with links to specific folders, e.g. to see at a glance the latest files in a specific folder of a blob container.
At the moment this is only possible with several clicks: navigating down into the folder and sorting its contents.
I already tried to create a metrics chart on the dashboard, but it only gives me the blob count and statistics for the blob service as a whole, not for individual folders.
Any ideas how to watch specific folders immediately?

Thing is, folders don't exist in Azure Blob Storage. There are only containers and blobs inside containers. Blob names define virtual folders: tools like the Azure Portal or Azure Storage Explorer use the / separator in the blob URL to present a virtual folder structure.
So the answer is that this isn't possible, since there are no physical folders, as stated in the docs as well:
Blob storage offers three types of resources:
The storage account
A container in the storage account
A blob in a container
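
That said, a dashboard-like view of the newest files under a virtual folder can be scripted against the blob service directly. Below is a minimal sketch using the Python azure-storage-blob SDK; the connection string, container name (images) and prefix (reports/) are placeholders, not values from the question:

```python
from azure.storage.blob import ContainerClient

# Placeholder connection string and container name.
container = ContainerClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="images",
)

# List only the blobs whose names start with the virtual folder prefix.
blobs = container.list_blobs(name_starts_with="reports/")

# Sort client-side by last-modified time to see the newest files first.
latest = sorted(blobs, key=lambda b: b.last_modified, reverse=True)[:10]
for blob in latest:
    print(blob.name, blob.last_modified, blob.size)
```

A script like this could run on a schedule and publish its output wherever your dashboard lives, since the portal itself won't chart per-folder metrics.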

Related

Can we upload a folder with files to Azure Blob Storage like a file share?

I need a suggestion on Azure Blob Storage. We are using Azure Blob Storage heavily for various files, and now we need to categorize these files and store them in a folder structure based on certain categories, i.e. stored the same way we would use an Azure file share.
Example - Azure Storage Account A - Container A - Folder1 - File1,File4,File9
Example - Azure Storage Account A - Container A - Folder2 - File11,File7,File10
Example - Azure Storage Account A - Container A - Folder3 - File21,File8,File2
We don't want to move to a file share, as this would require a huge effort and various changes, and Azure Files is expensive as well.
A second question: how many blob containers can I create in a single storage account - are there any limits?
Please suggest...
For folder or directory support, you can use Azure Data Lake Storage Gen2, i.e. the hierarchical namespace feature on top of Blob Storage.
There is no upper limit on the number of containers or blobs in a single storage account.
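
If you go the hierarchical namespace route, directories become real objects rather than virtual ones. A rough sketch with the Python azure-storage-file-datalake SDK, assuming an ADLS Gen2-enabled account; the connection string, file system name and file contents are placeholders:

```python
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder connection string and file system (container) name.
service = DataLakeServiceClient.from_connection_string("<adls-gen2-connection-string>")
file_system = service.get_file_system_client("container-a")

# With a hierarchical namespace, directories are first-class objects.
directory = file_system.create_directory("Folder1")

# Create and upload a file directly inside that directory.
file_client = directory.create_file("File1.txt")
file_client.upload_data(b"example content", overwrite=True)
```

Without the hierarchical namespace, the same layout can still be mimicked on plain blob storage by naming blobs Folder1/File1.txt, as discussed elsewhere in this thread.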

Storage statistics of a folder in Azure Blob Storage

I would like the statistics of specific folders in Azure Blob Storage. For example, I would like to know how many files are present in a folder, what the size of each file is, and what the total size of a folder is. Does Blob Storage provide similar data through an API endpoint?
Edit: I have a very large number of files in Azure Blob Storage, so I am looking for a solution where I do not have to iterate over all the files in order to calculate the total size of a virtual folder.
Does Blob Storage provide similar data through an API endpoint?
Azure Blob Storage as such does not provide an API to get storage statistics at the folder level, but you can use the List Blobs REST API operation to get that information.
The List Blobs operation lists the blobs inside a container, and you can use the prefix parameter to get the list of blobs inside a virtual folder, where the prefix is the path of the virtual folder. For example, if you wish to list the blobs inside the folder1 virtual folder, you would specify the prefix as folder1/.
Each item in the list is a blob with a size attribute, which gives you the size of that blob. You can then add up the sizes of the individual blobs to get the total size of the folder.
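
A minimal sketch of that approach with the Python SDK, assuming a placeholder container name and the virtual folder folder1/:

```python
from azure.storage.blob import ContainerClient

# Placeholder connection string and container name.
container = ContainerClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="mycontainer",
)

total_size = 0
file_count = 0

# The prefix restricts the listing to the virtual folder "folder1/".
for blob in container.list_blobs(name_starts_with="folder1/"):
    total_size += blob.size   # size is in bytes
    file_count += 1

print(f"{file_count} blobs, {total_size} bytes total under folder1/")
```

Note that this still enumerates every blob under the prefix, so for very large folders (as in the edit above) the only way to avoid iterating on demand is to maintain your own running totals as blobs are added or removed.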

Moving files from Azure Blob / Files storage to Azure FTP space

I would like to know whether it is feasible to move a folder (with files) from Azure Blob/File storage to a web app's root.
Scenario: I would like to replace the gallery images folder used by a static HTML site's gallery section on a weekly basis, using PowerShell.
I'd appreciate suggestions or alternatives, as I'm not sure how to handle this in Azure and how to schedule the swapping of folders between blob storage and the FTP space.
You can use a BlobTrigger with a WebJob deployed on the same web app and copy the files from blob storage to the local file system.
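
The copy step itself is straightforward. Here is a hedged sketch of a script that a scheduled WebJob could run; the connection string, the gallery container name and the local target path are placeholders, not values from the question:

```python
import os
from azure.storage.blob import ContainerClient

# Placeholder connection string and container name.
container = ContainerClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="gallery",
)
# Typical content path on a Windows App Service; adjust for your app.
target_dir = r"D:\home\site\wwwroot\gallery"

for blob in container.list_blobs():
    local_path = os.path.join(target_dir, blob.name.replace("/", os.sep))
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    with open(local_path, "wb") as f:
        # download_blob() returns a stream; readall() pulls the full content.
        f.write(container.download_blob(blob.name).readall())
```

The same logic can be written in PowerShell with the Az.Storage cmdlets if you prefer to stay with your existing scripting.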
I would like to replace the gallery images folder used by a static HTML site's gallery section on a weekly basis, using PowerShell.
Please try storing the images in Azure Blob Storage directly. We can access the images in a blob container with the 'Full public read access' mode or the 'Public read access for blobs only' mode. Refer to this article for more details. Then we can use a Scheduler WebJob to replace the images directly.
It wasn't clear to me exactly what you are trying to do. If you have a legacy app or an existing FTP workflow, you can mount an FTP server on top of Azure File Storage. Alternatively, Blob Storage can be used for public data as described above. If you want a simple tool for interacting with Blob Storage, you can try Storage Explorer.

Azure storage for files in a specific folder structure

Currently I have an FTP server with a deep structure of folders and files on it - it could be as many as 10 levels down from the root folder. Having already migrated my local database to an Azure database successfully, I wonder whether there is an Azure FTP equivalent I could migrate this to as well. I know there is Azure Storage, where I could create a container of type File or Blob - could one of those be used like an FTP server? Could I create a folder structure there somehow, using a container and either Files or Blobs, and how does that work? Is either blob or file storage suited to such purposes?
Let me add to what NDJ has written. So both Azure Blobs and Files would serve your purpose.
As mentioned by NDJ, Azure Blob Storage is a 2-level hierarchy. At the top you have a blob container, and each blob container contains 0 or more blobs. So it does not support a folder structure per se, but as NDJ mentioned, you can create the illusion of sub folders by using an appropriate blob delimiter (usually /). If you were to compare it with a local file system, a directory at the root level (C:) is a container in blob storage, and the files go in there. So imagine you have a folder called images in C:\ of your computer; that would be a container in blob storage. Now imagine that folder has 2 sub folders (let's call them hires and lores) and both of them contain a file (say image1.png). When you move them to Azure Blob Storage, the container name would be images but the blob names would be hires/image1.png and lores/image1.png. Some storage explorers take this delimiter (/) and show you that your container contains 2 folders, each with an image called image1.png, but in reality there are only 2 blobs in that container.
Azure File Service is a close match to your local file system. At the top level you've got a Share, and each share can contain directories and files. Each directory can again contain many directories and files.
As NDJ mentioned, there's no FTP access to Azure Storage, but there are many tools that will allow you to upload files from your local computer to Azure Storage, and many of them will preserve the file hierarchy. You can always write code to upload the files yourself (see the sketch below). If you decide to use Azure Files, you can simply mount a File Storage share as a network drive on your local computer and then transfer the files from your local computer to Azure Files as if you were transferring files from one drive to another.
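
To illustrate the "write code to upload the files yourself" option while preserving the folder hierarchy as virtual folders, here is a rough sketch with the Python azure-storage-blob SDK; the connection string, container name and local root folder are placeholders:

```python
import os
from azure.storage.blob import ContainerClient

# Placeholder connection string, container name and local root folder.
container = ContainerClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="images",
)
local_root = r"C:\images"

for dirpath, _dirnames, filenames in os.walk(local_root):
    for filename in filenames:
        local_path = os.path.join(dirpath, filename)
        # Keep the relative path, using "/" so tools show it as nested folders.
        blob_name = os.path.relpath(local_path, local_root).replace(os.sep, "/")
        with open(local_path, "rb") as data:
            container.upload_blob(name=blob_name, data=data, overwrite=True)
```

With this naming scheme, C:\images\hires\image1.png ends up as the blob hires/image1.png, which the Portal and Storage Explorer render as a hires folder.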
UPDATE
Regarding the difference between Azure Blob Storage and File Storage: both are used to store files. There are a few differences that I can think of:
A Share in Azure File Storage can be mounted as a network drive on your local computer/Azure VM, whereas a Blob Container in Azure Blob Storage can't. So if you have an application that writes files to the local file system, you can take the application as-is, make use of Azure File Storage, and write the files to that network drive without many changes to your code (a typical example of a lift-and-shift kind of application).
You can set an ACL on a Blob Container, whereas you can't do the same on a Share. This makes Azure Blob Storage ideal for storing static content (images, css, js) for your websites. For exposing files in File Storage, you would need to resort to a Shared Access Signature.
You can set the size of a Share (the default is 5 GB), whereas no such thing exists for a Blob Container. A blob container can grow up to the size limit of a storage account.
To understand Azure Files, I would recommend reading this: https://azure.microsoft.com/en-in/documentation/articles/storage-dotnet-how-to-use-files/.
Azure blob storage supports 10 levels down and more (a blob name can contain up to 254 path segments). Basically the files are stored non-hierarchically, but each / separator gives the appearance of directories.
It's relatively trivial to write something to move files to Azure. As far as I know there is no FTP functionality yet, but it has been requested, and it looks like some people have already created some code for this.
You can now use Storage Explorer across all platforms to easily work within any folder structure.

Azure - Check if a new blob is uploaded to a container

Are there ways to check whether a container in Azure has a new blob (it doesn't matter which blob it is)? LastModifiedUtc does not seem to change when a blob is dropped into the container.
You should use a BlobTrigger function in an App Service resource; see the blob trigger documentation.
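
A minimal sketch of such a trigger, assuming the Azure Functions Python v2 programming model; the container name is a placeholder and AzureWebJobsStorage is the default storage connection setting:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# Placeholder container name; fires for each blob added or updated there.
@app.blob_trigger(arg_name="newblob",
                  path="mycontainer/{name}",
                  connection="AzureWebJobsStorage")
def on_new_blob(newblob: func.InputStream):
    # React to the new blob here (notify, index, copy, etc.).
    logging.info("New blob detected: %s (%s bytes)", newblob.name, newblob.length)
```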
Azure Blob Storage does not provide this functionality out of the box. You would need to handle it on your end. A few things come to mind (just thinking out loud):
If the blobs are uploaded by your application (and not through 3rd party tools), then after a blob is uploaded you could just update the container properties (for example, add or update a metadata entry with information about the last blob uploaded). You could also make an entry in Azure Table Storage and keep updating it with information about the last blob uploaded. As I said above, this method only works if all blobs are uploaded through your application.
You could periodically iterate through the blobs in the container and sort them by last modified date. This method works fine for a container with a smaller number of blobs. If the number of blobs is larger (say in the tens of thousands), you would end up fetching a long list, because blob storage only sorts blobs by name.
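
A sketch of that second option with the Python SDK, assuming a modestly sized container (connection string and container name are placeholders); it enumerates every blob and then finds the newest one client-side:

```python
from azure.storage.blob import ContainerClient

# Placeholder connection string and container name.
container = ContainerClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="mycontainer",
)

# The service returns blobs ordered by name, so pick the newest one ourselves.
blobs = list(container.list_blobs())
if blobs:
    newest = max(blobs, key=lambda b: b.last_modified)
    print("Most recently modified blob:", newest.name, newest.last_modified)
```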
