Store CSV files on Azure blobs

I have a few CSV files on my local system. I want to upload them to Azure Blob storage in a particular directory structure, and I need to create that directory structure on Azure as well.
Please suggest the possible options to achieve this.

1 - Create your storage account in Azure
2 - Get Azure Storage Explorer
https://azure.microsoft.com/en-us/features/storage-explorer/
3 - Start the app, sign in and navigate to your blob container
4 - Drag and drop your folders and files :)
Basically, this Microsoft-provided software lets you work with your storage account using a classic folders-and-files structure.
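If you would rather script the upload than drag and drop, the same result can be reached with a few lines of Python. This is a minimal sketch using the azure-storage-blob package; the connection string, container name and local folder are placeholders, and the "directories" are created implicitly because each blob name carries the file's relative path.

    import os
    from azure.storage.blob import BlobServiceClient

    # Placeholders -- replace with your own values.
    CONNECTION_STRING = "<your-storage-connection-string>"
    CONTAINER_NAME = "csv-data"
    LOCAL_FOLDER = r"C:\data\csv"

    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    container = service.get_container_client(CONTAINER_NAME)

    # Walk the local folder and upload each .csv file, keeping the relative
    # path as the blob name (e.g. reports/2023/sales.csv).
    for root, _dirs, files in os.walk(LOCAL_FOLDER):
        for name in files:
            if not name.lower().endswith(".csv"):
                continue
            local_path = os.path.join(root, name)
            blob_name = os.path.relpath(local_path, LOCAL_FOLDER).replace("\\", "/")
            with open(local_path, "rb") as data:
                container.upload_blob(name=blob_name, data=data, overwrite=True)
            print(f"Uploaded {blob_name}")

The container is assumed to already exist; create it first in the portal or with service.create_container(CONTAINER_NAME).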

Related

Can we upload folder with files in Azure Blob storage like file share?

I need a suggestion on Azure Blob storage. We use Azure Blob storage heavily for various files, and we now need to categorize these files and store them in a folder structure based on certain categories, i.e. stored the same way we would use an Azure File share.
Example - Azure Storage Account A - Container A - Folder1 - File1, File4, File9
Example - Azure Storage Account A - Container A - Folder2 - File11, File7, File10
Example - Azure Storage Account A - Container A - Folder3 - File21, File8, File2
We don't want to move to File share, as that would require huge effort and various changes, and Azure File share is expensive as well.
Second question: how many blob containers can I create in a single storage account? Is there any limit?
Please suggest...
For folder or directory support, you can use Azure Data Lake Storage Gen2 with its hierarchical namespace feature.
There is no upper limit on the number of containers or blobs in a single storage account.
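If you enable the hierarchical namespace (Data Lake Storage Gen2), directories become first-class objects and can be created explicitly. A minimal sketch with the azure-storage-file-datalake package, assuming a Gen2-enabled account and a container (file system) named container-a with the Folder1/Folder2/Folder3 layout from the question:

    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder connection string for a storage account with the
    # hierarchical namespace (Data Lake Gen2) enabled.
    CONNECTION_STRING = "<your-adls-gen2-connection-string>"

    service = DataLakeServiceClient.from_connection_string(CONNECTION_STRING)
    file_system = service.get_file_system_client(file_system="container-a")

    # Create real directories, then upload a file into one of them.
    for folder in ("Folder1", "Folder2", "Folder3"):
        file_system.create_directory(folder)

    file_client = file_system.get_directory_client("Folder1").create_file("File1")
    file_client.upload_data(b"example content", overwrite=True)

Without the hierarchical namespace you can still get the same grouping by naming blobs Folder1/File1, Folder2/File11 and so on, since / in a blob name is treated as a virtual folder separator by most tools.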

Azure storage options to serve content on Azure Web App

I am a total newbie to Azure Web Apps and storage, so I need some clarification/confirmation. The main thing to note is that my application (described below) requires a folder hierarchy. Blob storage seems out of the question, and a file share doesn't allow anonymous access unless I use a Shared Access Signature (SAS).
Am I understanding Azure storage correctly: either you fit into the Azure storage model or you don't?
Can anyone advise how I can achieve what's required by the CMS application, described below, using Blobs?
The only option I see is to find a way to change the CMS application so that it always includes the SAS in the URL of every file it requests from storage in order to serve content on my Web App. If so, is it a problem if I set my SAS to expire sometime in the distant future?
https://<appname>.file.core.windows.net/instance1/site1/file1.jpg?<SAS>
Problem with using Blob
So far my understanding is that Blob storage doesn't allow "sub folders", as a container holds unstructured data, so I'm unable to use it for my application (described below), which requires a folder structure.
The problem with using File Share
A file share seemed perfect as it allows a folder hierarchy, so naturally that's what I've used.
However, no anonymous access is allowed for files stored in File storage; access needs to be authorised. One way of authorising access is to create a SAS at the file/share level with Read permission and then use that SAS URL to access the file.
Cannot access Windows azure file storage document
My application
I've created a Linux Web App running an open source CMS application. The application allows the creation of multiple websites; each website's content (images, docs, multimedia) is stored on a file server, and those files are then served to the website via a defined URL.
The CMS application has a setting for the location where it should save its files, which is a folder on the file server. It then creates a new sub folder in that location for every site it hosts.
Example folder hierarchy
/instance1
    /site1
        /file1
        /file2
    /site2
        /file1
        /file2
Am I understanding Azure storage correctly, it's either you fit into the Azure storage model or you don't?
You can use the Azure storage model for your CMS application, with either Blob Storage or a File Share.
Can anyone advise how I can achieve what's required by the CMS application as described below by using Blobs?
You can use a Data Lake Gen2 storage account if you want to stay with Azure Blob Storage.
Data Lake Gen2 enables a hierarchical namespace, so you can use subfolders in Blob Storage as your requirements dictate.
Problem with using Blob
Blob Storage allows subfolders if you use a Data Lake Gen2 storage account. You can also enable public anonymous read access on the blobs.
The problem with using File Share
Azure File Share supports folder hierarchies but does not allow public anonymous access. You can use an Azure managed identity (system-assigned) for your web app to access the Azure File Share.
Your application would then be able to access the Azure File Share without a SAS token.
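If you take the Blob route and want the CMS files readable without a SAS, one way (sketched below, assuming anonymous access is permitted at the storage-account level and using a hypothetical container name of cms-content) is to create the container with blob-level public read access and name the blobs with the site path:

    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection-string>")  # placeholder

    # "blob" public access means individual blobs can be read anonymously via
    # https://<account>.blob.core.windows.net/cms-content/instance1/site1/file1.jpg
    container = service.create_container("cms-content", public_access="blob")

    with open("file1.jpg", "rb") as data:
        container.upload_blob(name="instance1/site1/file1.jpg", data=data, overwrite=True)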
The lack of real folders in Blob storage shouldn't be an issue for your use case. Just because it doesn't have traditional folders doesn't mean it can't serve content at e.g. instance1/site1/file1; that's still possible, but instance1/site1/ will simply be part of the blob's name.
Tools like the Azure Portal or Storage Explorer will actually show folders by using the delimiter / and querying data that appears to be inside a folder by using the path as a prefix.
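To make the delimiter behaviour concrete, here is a small sketch of what the Portal and Storage Explorer effectively do when they render "folders": list blobs with a prefix and group on the / delimiter. The connection string, container name and prefix are placeholders.

    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection-string>")  # placeholder
    container = service.get_container_client("cms-content")                    # placeholder

    # walk_blobs groups blob names on the "/" delimiter, so a name such as
    # instance1/site1/file1.jpg shows up as a virtual folder (BlobPrefix)
    # plus the blobs directly under the prefix.
    for item in container.walk_blobs(name_starts_with="instance1/", delimiter="/"):
        print(item.name)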

Azure storage sync mechanisms

I have a problem that I have been racking my brain about and figured I would need some perspective and insight from people who are a lot more knowledgeable about this.
What I have currently: a web-based application hosted in Azure that uses Azure Blob storage to store files generated as part of data import processes. We have a separate application that extends the original web application and allows users to upload files; these files are currently also stored in Azure Blob storage.
Where I am trying to go: I have a requirement to let users map network file shares on their laptops and access the files that currently reside in the blob store.
Since Azure Blob storage does not support SMB, I have no way of actually doing this with a blob store.
I could use Azure Files in conjunction with a file server running the sync agent. However, this requires a lot of work in terms of refactoring and setup, plus some custom service that adds and removes permissions on the file server.
I'm wondering if there is a service or a piece of software on the market that would allow me to continue using blob storage and sync the blob files into a file server, which users could then access and open files from using Windows File Explorer. I found one that looks like an open source project, but it only does a one-way sync from the blob to the file share. Ideally I'd like to find a solution that does a two-way sync, like Azure File Sync does.
Any thoughts and ideas will be appreciated.
Since the maximum number of blob containers and file shares is unlimited, per my understanding you could leverage the following approaches:
Migrate the data from blob storage to an Azure file share, so that the subsequent file store is Azure File storage.
Note: currently you must specify the storage account key when mounting file shares; for details you could follow this feedback item. I recommend that you do not map network file shares on users' laptops.
You could still use blob storage: create a blob container per user and generate a SAS token for each container, then the users can use Azure Storage Explorer to manage their blob files, or use AzCopy and other command-line tools to download the blob files to their laptop's file system.
Note: for security, you could combine a stored access policy with a SAS; to revoke permissions, you just need to invalidate the related access policy instead of regenerating the account key. For details, see Controlling a SAS with a stored access policy and Shared Access Signatures, Part 2: Create and use a SAS with Blob storage.
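As a rough illustration of the stored-access-policy approach, here is a sketch with azure-storage-blob: attach a policy to the user's container, then issue a SAS that references the policy by id, so revoking the policy later invalidates every SAS issued under it. The account name, key, container and policy id are placeholders.

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import (
        BlobServiceClient, AccessPolicy, ContainerSasPermissions, generate_container_sas,
    )

    ACCOUNT_NAME = "<account-name>"   # placeholders
    ACCOUNT_KEY = "<account-key>"
    CONTAINER = "user1-files"

    service = BlobServiceClient(
        account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
        credential=ACCOUNT_KEY,
    )
    container = service.get_container_client(CONTAINER)

    # 1. Store an access policy (read + list, valid for 90 days) on the container.
    policy = AccessPolicy(
        permission=ContainerSasPermissions(read=True, list=True),
        start=datetime.now(timezone.utc) - timedelta(minutes=5),
        expiry=datetime.now(timezone.utc) + timedelta(days=90),
    )
    container.set_container_access_policy(signed_identifiers={"user1-read": policy})

    # 2. Generate a SAS that references the policy; deleting or editing the
    #    "user1-read" policy later revokes this token without touching the key.
    sas = generate_container_sas(
        account_name=ACCOUNT_NAME,
        container_name=CONTAINER,
        account_key=ACCOUNT_KEY,
        policy_id="user1-read",
    )
    print(f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{sas}")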

Moving files from Azure Blob / Files storage to Azure FTP space

I would like to know whether it is feasible to move a folder (with files) from Azure Blob/File storage to the web app root.
Scenario: I would like to replace the gallery-of-images folder used by a static HTML site's gallery section weekly, using PowerShell.
I would appreciate suggestions or alternatives, as I am not sure how to handle this in Azure and how to schedule swapping the folders between blob storage and the FTP space.
You can use a BlobTrigger with a WebJob deployed on the same web app and copy the files from blob storage to the local file system.
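Leaving the trigger wiring aside, the copy step itself is straightforward. A minimal sketch in Python (connection string, container name and destination folder are placeholders; on a Windows web app the local content root is typically under D:\home\site\wwwroot):

    import os
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection-string>")  # placeholder
    container = service.get_container_client("gallery-source")                 # placeholder
    destination = r"D:\home\site\wwwroot\gallery"                              # placeholder

    # Download every blob, recreating the virtual folder structure locally.
    for blob in container.list_blobs():
        local_path = os.path.join(destination, blob.name.replace("/", os.sep))
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        with open(local_path, "wb") as f:
            f.write(container.download_blob(blob.name).readall())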
I would like to replace the gallery-of-images folder used by a static HTML site's gallery section weekly, using PowerShell.
Please try storing the images in Azure blob storage directly. We can access images in Azure blob storage with the 'Full public read access' mode or the 'Public read access for blobs only' mode; refer to this article for more details. Then we can use a scheduled WebJob to replace the images directly.
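The question mentions PowerShell, but to keep these sketches in a single language, here is roughly what the weekly swap could look like in Python: remove the blobs under the gallery prefix and upload the new set. The connection string, container, prefix and local folder are placeholders.

    import os
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection-string>")  # placeholder
    container = service.get_container_client("site-assets")                    # placeholder
    PREFIX = "gallery/"
    NEW_IMAGES = r"C:\new-gallery"                                             # placeholder

    # Delete the current gallery blobs...
    for blob in container.list_blobs(name_starts_with=PREFIX):
        container.delete_blob(blob.name)

    # ...then upload this week's images under the same prefix.
    for name in os.listdir(NEW_IMAGES):
        with open(os.path.join(NEW_IMAGES, name), "rb") as data:
            container.upload_blob(name=PREFIX + name, data=data, overwrite=True)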
It wasn't clear to me exactly what you are trying to do. If you have a legacy app or an existing reliance on FTP, you can mount an FTP server on top of Azure File storage. Alternatively, Blob storage can be used for public data as described above. If you want a simple tool for interacting with Blob storage, you can try Storage Explorer.

Azure storage for files in specific folder structure

Currently I have an FTP server with a deep structure of folders and files on it, possibly 10 levels down from the root folder. As I have already successfully migrated my local database to an Azure database, I wonder whether there is some kind of Azure FTP I could use to migrate this as well. I know there is Azure Storage, where I could create a container of type File or Blob. Could one of those be used like a particular FTP server, i.e. could I somehow create a folder structure there using a container and either Files or Blobs? How does it work, and which of the two suits this purpose?
Let me add to what NDJ has written. Both Azure Blobs and Files would serve your purpose.
As mentioned by NDJ, Azure Blob Storage is a two-level hierarchy. At the top you have a blob container, and each blob container contains zero or more files. So it does not support a folder structure per se, but as NDJ mentioned, you can create the illusion of a sub folder by using an appropriate blob delimiter (usually /). If you were to compare it with a local file system, a directory at the root level (C:) is a container in blob storage, and the files go in there. So imagine you have a folder called images in C:\ of your computer; that would be a container in blob storage. Now imagine that you have two sub folders beneath this folder (let's call them hires and lores) and both of them contain a file (say image1.png). When you move them to Azure Blob Storage, the container name would be images, but the blob names would be hires/image1.png and lores/image1.png. Some storage explorers take this delimiter (/) and show you that your container contains two folders, each holding an image called image1.png, but in reality there are only two blobs in that blob container.
Azure File Service is a close match to your local file system. At the top level you've got a Share, and each share contains directories and files. Each directory can again contain many directories and files.
As NDJ mentioned, there's no FTP access to Azure Storage, but there are many tools that will let you upload files from your local computer to Azure Storage, and many of them preserve the file hierarchy. You can always write code to upload the files yourself. If you decide to use Azure Files, you can simply mount a File Storage share as a network drive on your local computer and then transfer the files from your local computer to Azure Files as if you were transferring files from one drive to another.
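For the code route with Azure Files, here is a minimal sketch with the azure-storage-file-share package (the share is assumed to already exist; the connection string, share name, paths and file are placeholders). Unlike blobs, each directory level has to be created explicitly:

    from azure.core.exceptions import ResourceExistsError
    from azure.storage.fileshare import ShareClient

    # Placeholders -- replace with your own values.
    share = ShareClient.from_connection_string("<connection-string>", share_name="ftp-archive")

    # Create the nested directory structure level by level.
    path = ""
    for segment in ["level1", "level2", "level3"]:
        path = f"{path}/{segment}" if path else segment
        try:
            share.get_directory_client(path).create_directory()
        except ResourceExistsError:
            pass  # directory already exists

    # Upload a file into the deepest directory.
    with open("report.csv", "rb") as data:
        share.get_directory_client(path).get_file_client("report.csv").upload_file(data)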
UPDATE
Regarding the difference between Azure Blob Storage and File Storage: both are used to store files. There are a few differences that I can think of:
A Share in Azure File Storage can be mounted as a network drive on your local computer/Azure VM, whereas a Blob Container in Azure Blob Storage can't. So if you have an application which writes files to the local file system, you can take the application as is, make use of Azure File Storage and write the files to that network drive without making many changes to your code (a typical lift-and-shift kind of application).
You can set an ACL on a Blob Container, whereas you can't do the same on a Share. This makes Azure Blob Storage ideal for storing static content (images, css, js) for your websites. For exposing files in File Storage, you would need to resort to a Shared Access Signature.
You can set the size of a Share (the default is 5 GB), whereas no such setting exists for a Blob Container; a blob container can grow up to the size of the storage account.
To understand Azure Files, I would recommend reading this: https://azure.microsoft.com/en-in/documentation/articles/storage-dotnet-how-to-use-files/.
Azure blob supports your 10 levels down (a blob name can contain up to 254 path segments). Basically the files are stored non-hierarchically, but each / separator gives the appearance of directories.
It's relatively trivial to write something to move files to Azure; as far as I know there is no FTP functionality yet, but it has been requested. It looks like some people have already created some code for this.
You can now use Storage Explorer across all platforms to easily work within any folder structure.
