File management on Azure Blob Storage

Is there a service or an (open-source) library that could help with programmatically managing files stored on Azure Blob Storage? By "manage" I mean search with security trimming, authorizing downloads, and document versioning.
I've looked online, but most of the solutions are end-user products. Should I build my own layer to talk to blob storage, or is there a way to take part of this burden off my shoulders?
A solution that could work with both cloud (blob) and on-premises storage would be great!

I am not aware of any such library. That being said, the storage platform does include the underlying capabilities on which you could implement these features yourself. For example, to authorize access to objects you could issue SAS tokens to authorized users, and for document versioning you can create snapshots of objects, etc.
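As a rough sketch of the snapshot-based versioning idea, using the Python azure-storage-blob SDK (the connection string, container, and blob names here are placeholders, not anything from the question):

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string and names -- substitute your own.
    service = BlobServiceClient.from_connection_string("<connection-string>")
    blob = service.get_blob_client(container="documents", blob="report.docx")

    # "Versioning" via snapshots: each snapshot is a read-only,
    # point-in-time copy of the blob.
    blob.upload_blob(b"version 1 contents", overwrite=True)
    blob.create_snapshot()
    blob.upload_blob(b"version 2 contents", overwrite=True)

    # Enumerate all versions (snapshots) alongside the current blobs.
    container = service.get_container_client("documents")
    for item in container.list_blobs(include=["snapshots"]):
        print(item.name, item.snapshot)  # snapshot is None for the current version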

Related

When should we use a file share in Azure as compared to Azure Blobs?

Could someone please give some examples of where we would use an Azure file share instead of Azure Blobs? Whenever I search the internet, I read that it can be mounted or that it follows the SMB protocol, but I still don't understand a single concrete case where an Azure file share is the right choice.
I tried looking into When to use Azure blob storage versus Azure file share? - this is a similar question, but it doesn't answer mine.
Azure provides a variety of storage tools and services, including Azure Storage. To determine which Azure technology is best suited for your scenario, see Review your storage options in the Azure Cloud Adoption Framework.
For detailed information and examples refer to this article: https://learn.microsoft.com/en-us/azure/storage/common/storage-introduction
It depends mostly on your use case and how you plan to access the data. If you simply want to mount and access your files, Azure Files will be your best fit. If you are looking for the lowest cost and want to access your data programmatically through your application, Azure Blob Storage would be a better fit (see the sketch after this answer). Both are accessible through the portal or Azure Storage Explorer.
I also recommend this Learn module, which covers the differences in data types and solutions.
Additional information: Azure Blob Storage vs Azure File Storage
Cost details of Azure Blob Storage pricing & Azure Files pricing
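To illustrate the "access your data programmatically" point, here is a minimal sketch using the Python azure-storage-blob SDK (the connection string, container, and blob names are invented for the example). An Azure file share, by contrast, would typically just be mounted and accessed through the filesystem:

    from azure.storage.blob import BlobServiceClient

    # Hypothetical connection string and container name.
    service = BlobServiceClient.from_connection_string("<connection-string>")
    container = service.get_container_client("app-data")

    # List blobs and download one through the SDK rather than a mounted drive.
    for blob in container.list_blobs():
        print(blob.name, blob.size)

    data = container.get_blob_client("config.json").download_blob().readall()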
In short: if you ...
- have an application that needs to store or access files in the cloud, use Blob Storage
- need a file share that can be used by, for instance, a server, use File Shares
Azure Files shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Azure Files shares can also be cached on Windows Servers with Azure File Sync for fast access near where the data is being used.
This means a File Share is, somewhat simplified, similar to a network share you would have in a local environment.
Azure Blob Storage helps you create data lakes for your analytics needs, and provides storage to build powerful cloud-native and mobile apps. Optimize costs with tiered storage for your long-term data, and flexibly scale up for high-performance computing and machine learning workloads.
This means Blob Storage is what you need when you're building powerful cloud-native and mobile apps.

Azure Face API - How to see stored face images/templates?

I am consuming the Azure Face API (Detection and Find Similar), but I do not see any documentation that explains how to access the stored data on Azure's platform (in the Azure portal UI).
According to this, only facial templates are stored. But how can I see them? Are these resources accessible to devs? Is any other data stored?
By default, the data can be accessed through Blob Storage, the default storage in the Azure portal. Blob storage cannot be accessed publicly out of the box; it is private. To grant access without making it public, we can use a shared access signature (SAS) to provide secure, scoped access to the resources in the storage account.
Use a procedure like the sketch below to get a SAS token and use it in API calls.
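A hedged sketch of such a procedure with the Python azure-storage-blob SDK (the account name, key, container, and blob names are placeholders):

    from datetime import datetime, timedelta
    from azure.storage.blob import generate_blob_sas, BlobSasPermissions

    # Placeholder account, key, and names -- use your storage account's values.
    sas = generate_blob_sas(
        account_name="mystorage",
        container_name="faces",
        blob_name="template.bin",
        account_key="<account-key>",
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1),
    )
    url = f"https://mystorage.blob.core.windows.net/faces/template.bin?{sas}"
    # Pass `url` to any HTTP client or API call that needs read access.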

Moving locally stored documents to Azure

I want to spike whether Azure and the cloud are a good fit for us.
We have a website where users upload documents to our currently hosted website.
Every document has an equivalent record in a database.
I am using Terraform to create the Azure infrastructure.
What is my best way of migrating the documents from the local file path on the server to azure?
Should I be using File Storage or Blob Storage? I am confused about the difference.
Is there anything in terraform that can help with this?
Based on your comments, I would recommend storing them in Blob Storage. This service is suited for storing and serving unstructured data like files and images. There are many other features, like redundancy and archiving, that you may find useful in your scenario.
File Storage is more suitable in lift-and-shift scenarios where you're moving an on-premises application to the cloud and the application writes data to either a local or network-attached disk.
You may also find this article useful: https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
UPDATE
Regarding uploading files from local computer to Azure Storage, there are actually many options available:
Use a storage explorer like Microsoft's Azure Storage Explorer.
Use AzCopy command-line tool.
Use Azure PowerShell Cmdlets.
Use Azure CLI.
Write your own code using any of the available storage client libraries, or consume the REST API directly (see the sketch below).
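As a rough sketch of the last option, migrating a local documents folder with the Python azure-storage-blob client library might look like this (the connection string, local path, and container name are assumptions):

    from pathlib import Path
    from azure.storage.blob import BlobServiceClient

    # Hypothetical connection string, local path, and container name.
    service = BlobServiceClient.from_connection_string("<connection-string>")
    container = service.get_container_client("documents")

    local_root = Path("/var/www/uploads")
    for path in local_root.rglob("*"):
        if path.is_file():
            # Preserve the relative path as the blob name, e.g. "2017/01/report.pdf".
            blob_name = path.relative_to(local_root).as_posix()
            with path.open("rb") as data:
                container.upload_blob(name=blob_name, data=data, overwrite=True)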

Rich ACLs with Azure Storage - delegating to AD?

How do I build a rich storage ACL policy system with Azure storage?
I want to have a blob container that has the following users:
public - read-only against some set of blobs
Uploader - read-write against some subset of blob names; these keys are shared out to semi-trusted build machines
shared admin - full capabilities against this blob subset
Ideally these users are accounts driven through Azure AD, so I can use the full directory service power with them... :)
My understanding of shared access keys is that they are (1) time-limited and (2) have to be created with hand-tooled code. My desire is that I can do something similar to AWS IAM policies on S3... :-)
Nothing like AWS IAM policies for S3 exists for Azure Blob Storage today. Azure recently introduced Role-Based Access Control (RBAC), which is available for Azure Storage, but it is limited to management activities only, like creating storage accounts. It is not yet available for data-plane activities like uploading blobs.
You may want to look at Azure Rights Management Service (Azure RMS) and see if it is the right solution for your needs. If you search for Azure RMS Blob, one of the search results links to a PDF file that talks about securing blob storage with this service (the link directly downloads the PDF file, hence I could not include it here).
If you're looking for a third-party service to do this, take a look at the "Team Edition" of Cloud Portam (a service I am currently building); we recently released the Team Edition. In short, Cloud Portam is a browser-based Azure explorer that supports managing Azure Storage, Search Service, and DocumentDB accounts. The Team Edition uses your Azure AD for user authentication, and you can grant permissions (None, Read-Only, Read-Write, and Read-Write-Delete) on the Azure resources you manage through the application.
Paul,
While Gaurav is correct in that Azure Storage does not have AD integration today, I wanted to point out a couple of things about shared access signatures from your post:
My understanding of shared access keys is that they are (1) time-limited and (2) have to be created with hand-tooled code
1) A SAS token/URI does not need to have an expiry date on it (it's an optional field), so in that sense it is not time-limited and need not be regenerated unless you change the shared key with which you generated the token.
2) You can use PowerShell cmdlets to do this, e.g.: https://msdn.microsoft.com/en-us/library/dn806416.aspx. Some storage explorers also support creating SAS tokens/URIs statically, without you having to write code.
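On point 1, one way to issue a SAS whose token carries no expiry is to tie it to a stored access policy on the container; revocation then happens by changing or deleting the policy. A hedged sketch with the Python azure-storage-blob SDK (connection string, container, blob, and policy names are placeholders):

    from azure.storage.blob import (
        AccessPolicy, BlobServiceClient, ContainerSasPermissions, generate_blob_sas,
    )

    service = BlobServiceClient.from_connection_string("<connection-string>")
    container = service.get_container_client("builds")

    # A stored access policy with no expiry; the SAS below references it by id.
    container.set_container_access_policy(signed_identifiers={
        "uploader-policy": AccessPolicy(
            permission=ContainerSasPermissions(read=True, write=True))
    })

    # The token itself has no expiry; revoke it by removing the policy.
    sas = generate_blob_sas(
        account_name=service.account_name,
        container_name="builds",
        blob_name="artifact.zip",
        account_key="<account-key>",
        policy_id="uploader-policy",
    )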

What is the best strategy for using Windows Azure as a file storage system - with HTTP download capabilities

I need to store multiple files that users upload, and then provide these users with the capability of accessing their files via HTTP. There are two key considerations:
- Storage (which is my primary concern here)
- Security (which let's leave aside for now)
The question is:
What is the most cost-efficient and performant way of storing all these files and giving access to them later? I believe the answer is:
- Store files within Azure Storage Account, and have a key that references them in an SQL Azure database.
Am I correct on this?
Is blob storage flat? Or can I create something like folders inside it to better organize my files?
The idea of using SQL Azure to store metadata for your blobs is a pretty common scenario, which allows you to take advantage of SQL for searching, and blobs for storage.
Blobs are organized by container. So you'd have something like:
http://mystorage.blob.core.windows.net/mycontainer/myfile.doc
You can also simulate a hierarchy using a delimiter (sketched below), but in reality there's just container plus blob.
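A small sketch of that simulated hierarchy with the Python azure-storage-blob SDK (names are invented): blob names like "invoices/2011/jan.pdf" are flat keys, but walk_blobs can group them by a delimiter as if they were folders:

    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection-string>")
    container = service.get_container_client("mycontainer")

    # These are flat blob names; the "/" is just part of the key.
    container.upload_blob("invoices/2011/jan.pdf", b"...", overwrite=True)
    container.upload_blob("invoices/2011/feb.pdf", b"...", overwrite=True)

    # walk_blobs groups names by the delimiter, yielding BlobPrefix "folders".
    for item in container.walk_blobs(name_starts_with="invoices/", delimiter="/"):
        print(item.name)  # -> "invoices/2011/"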
If you keep the container or blob private, the user would either have to go through your web front end (or web service), or you'd have to provide them with a special URL with a Shared Access Signature appended, which is a time-limited URL.
I would recommend taking a look at the BlobShare sample, which is a simple file-sharing application that demonstrates the storage services of the Windows Azure platform, together with the authentication and authorization capabilities of Access Control Service (ACS). The full sample code is located at the following link:
http://blobshare.codeplex.com/
You can use this sample code immediately, just by adding your Windows Azure account credentials. The best thing about this sample is that you can provide blob access directly through Access Control Service. You can also modify the code to add SAS support, as well as blob download from public containers (see the sketch below). Once you have it working and understand the concept, you can tweak it to work the way you want.
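For the public-container download mentioned above, an anonymous read needs no credentials at all; a minimal sketch with the Python azure-storage-blob SDK (the account URL, container, and blob names are made up):

    from azure.storage.blob import BlobClient

    # Anonymous access: no credential argument. This works only if the
    # container's access level permits public reads.
    blob = BlobClient(
        account_url="https://mystorage.blob.core.windows.net",
        container_name="public-downloads",
        blob_name="brochure.pdf",
    )
    content = blob.download_blob().readall()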
