Azure blob storage and security

I'm in the process of deciding which technology to use for a project. The project will store large numbers of documents (Word, PDF, etc.) and I'm trying to figure out the best way of storing them. So far, I've come up with:
- Standard hosting, using the file system
- Standard hosting, storing the documents in SQL Server and using Full-Text Search
- Azure blobs
Under no circumstances can the documents be publicly visible; only certain, authorised people should be able to view them. Can anyone point me in the direction of how to secure the documents so that you can't just point a browser at one and view it?

Windows Azure blobs are a great place to store a lot of documents. By default, blobs can only be retrieved by someone who has the access key for the account. (The headers on API calls have to be signed with that key, so there's a cryptographic guarantee that unauthorized third parties can't access them.)
I assume you'll host a front-end of some sort that gives access to authorized users, and this front-end will act as a proxy to the files themselves.
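For illustration, here's a minimal sketch of that proxy pattern using Python, Flask, and the current azure-storage-blob SDK. The connection string, container name, and the is_authorized() check are placeholders for your own setup:

```python
# A minimal sketch of the proxy pattern: the web app holds the account
# key, checks the user's authorization, and streams the blob itself.
from azure.storage.blob import BlobServiceClient
from flask import Flask, Response, abort

app = Flask(__name__)
service = BlobServiceClient.from_connection_string("<connection-string>")

def is_authorized(document_name: str) -> bool:
    """Placeholder: look up the current user's permissions, e.g. in a DB."""
    raise NotImplementedError

@app.route("/documents/<name>")
def get_document(name):
    if not is_authorized(name):
        abort(403)  # unauthorized users never see a blob URL at all
    downloader = service.get_blob_client("documents", name).download_blob()
    # Stream chunks so large documents aren't buffered entirely in memory.
    return Response(downloader.chunks(), content_type="application/octet-stream")
```

Because the account key never leaves the server, there's no URL a user could share or bookmark that bypasses the check.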

Don't store your documents in a web server directory. It's as simple as that. Why go through all the effort of configuring a web server when you don't want the files on the web in the first place?

Related

Kentico Azure blob integration

On my Kentico project I have integrated Azure blob storage instead of saving files locally. I followed this article: https://docs.kentico.com/k12/custom-development/working-with-physical-files-using-the-api/configuring-file-system-providers/configuring-azure-storage
Things are working alright except for one problem: now all the files are publicly accessible. There are some PDF files in the media library that I want only logged-in users to view, but now anyone can view these files. Is there any workaround for this issue?
Files in the Media Library are always accessible via a direct link, and you can't restrict them to logged-in users only, regardless of whether they're on Azure storage or the local disk.
But there are two ways of achieving this:
Presentation-only restrictions. When you present those PDF links on the website, display them only to logged-in users. The files will still be accessible via direct links, but only logged-in users will see the links.
Hard restrictions. As far as I know, these restrictions can be set up only for files stored in the CMS tree. This approach checks permissions even when files are accessed via a direct link.
If you are storing files in blob storage, there is no way. You can restrict access to a whole container (or an individual blob) with a SAS token, but not to a specific folder. A folder is a purely virtual structure; it exists only in the file path.
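To illustrate, with the current azure-storage-blob Python SDK the two available scopes look roughly like this (account name, key, and blob names are placeholders); note there is no folder-level option in between:

```python
# Sketch: SAS tokens can scope access to a container or a single blob,
# but never to a virtual "folder". Account name/key are placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import (
    generate_container_sas, generate_blob_sas,
    ContainerSasPermissions, BlobSasPermissions,
)

expiry = datetime.now(timezone.utc) + timedelta(hours=1)

# Read access to everything in the container...
container_token = generate_container_sas(
    account_name="mystorage", container_name="media",
    account_key="<account-key>",
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=expiry,
)

# ...or to one specific blob. "documents/report.pdf" is just a blob
# name that happens to contain a delimiter, not a real folder.
blob_token = generate_blob_sas(
    account_name="mystorage", container_name="media",
    blob_name="documents/report.pdf", account_key="<account-key>",
    permission=BlobSasPermissions(read=True), expiry=expiry,
)
```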

Storing images on Azure web app. File system limit and performance?

I am building an ASP.NET web app hosted on Azure. I have decided to use the file system to store images instead of the database, with the image paths stored in the database. The images are added by users, and they can add a lot at once, up to 100.
My question is, what are the limits on file system storage of images? Currently, they are all just being stored in a single directory. Eventually there will be thousands (possibly hundreds of thousands) of images in a single directory. Will this have a drastic effect on performance when these images are embedded in a page? None of the images are bigger than 1 MB. Would it be better practice to make sub-directories?
I strongly recommend using Azure Blob storage to store the files; it is the best way to store static content in Azure.
Also, if you have to scale your application in the future (a web farm, an Azure web app, or anything with more than one VM instance), storing files on the server is not a good practice.
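For illustration, here's roughly what that looks like with the current azure-storage-blob Python SDK; the connection string, container name, and JPEG content type are assumptions for the sketch:

```python
# Sketch: store uploaded images in blob storage and keep only the URL/path
# in your database. Connection string and container name are placeholders.
import uuid
from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("user-images")

def save_image(data: bytes) -> str:
    # A generated name avoids collisions and sidesteps the
    # thousands-of-files-in-one-directory concern entirely.
    name = f"{uuid.uuid4()}.jpg"
    container.upload_blob(
        name,
        data,
        content_settings=ContentSettings(content_type="image/jpeg"),
    )
    return container.get_blob_client(name).url  # store this in the database
```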
You can find some "best practices" about static content like images on this page: https://msdn.microsoft.com/en-us/library/dn589776.aspx
And the Azure Storage documentation that will help you to get started: https://azure.microsoft.com/en-us/documentation/services/storage/
Hope this helps,
Julien

Uploading Multiple SCORM Websites/Packages (Mini Websites) To Existing Azure Website

We have a solution which allows our customers to upload SCORM (http://scorm.com/scorm-explained/) packages, which are basically mini websites: just HTML pages and a predefined JavaScript interface that allows the packages to talk to our site. This all works fine on a self-hosted IIS machine, as we build up a folder structure for each SCORM module within the website root and can let the user open each one and complete the course.
However, these can be quite large, containing multiple videos, etc. While on a self-hosted machine we can place the site on a large hard drive, how would this work if we wanted to migrate the solution to Azure cloud services? I have read in several places that the site must be stateless and that the VM can be re-imaged at any time. Does that mean we shouldn't store anything in the folder structure that wasn't part of the original package? Is there a way to configure a shared, permanent folder for our websites to use?
In a word: blob storage!
As you quite rightly point out, Azure VMs are stateless, so you need a single, persistent repository for your data/files, and blob storage easily fits the bill.
Without further information on your precise requirements or how you explicitly need to reference these files, it's a bit hard to suggest the best way for you. However, here are a couple of things...
This video provides a quick overview of blob storage and retrieving items directly from blob storage. If your scenario requires you to serve up or store files on the local machine, this might be a good starting point.
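As a minimal sketch of that local-serving approach (using the modern azure-storage-blob Python SDK; container and prefix names are invented), you could pull a package's files down onto the VM's disk before serving them:

```python
# Sketch: copy a package's blobs from storage onto the local VM disk.
# Container name and prefix layout are illustrative only.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("scorm-packages")

def download_package(package_prefix: str, local_root: str) -> None:
    # Blob names like "course-42/index.html" recreate the folder layout.
    for blob in container.list_blobs(name_starts_with=package_prefix):
        local_path = os.path.join(local_root, blob.name)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        with open(local_path, "wb") as f:
            container.download_blob(blob.name).readinto(f)
```

Remember the local copy is a disposable cache; the blobs remain the single source of truth if the VM is re-imaged.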
Blob storage supports direct access over HTTP and HTTPS, so you may want to simply reference the files directly from blob storage (something that shouldn't be a problem, as they're web assets). Check out the HTTP section in this article for the URL format. Note: you can also secure these blobs using Shared Access Signatures if you want to restrict access.
There's also no reason why you couldn't use a standard VM and then map a network drive using the Microsoft Azure File Service. I can't say I've personally done this (I used something else), but the concept of a shared resource backed by Azure storage is very doable.
HTH

What is the best strategy for using Windows Azure as a file storage system - with http download capabilities

I need to store multiple files that users upload, and then provide these users with the capability of accessing their files via HTTP. There are two key considerations:
- Storage (which is my primary concern here)
- Security (let's leave that aside for now)
The question is:
What is the most cost-efficient and performant way of storing all these files and giving access to them later? I believe the answer is:
- Store the files in an Azure storage account, and keep a key that references them in a SQL Azure database.
Am I correct on this?
Is blob storage flat, or can I create something like folders inside it to better organize my files?
The idea of using SQL Azure to store metadata for your blobs is a pretty common scenario, which allows you to take advantage of SQL for searching, and blobs for storage.
Blobs are organized by container. So you'd have something like:
http://mystorage.blob.core.windows.net/mycontainer/myfile.doc
You can also simulate a hierarchy using a delimiter, but in reality there's just container plus blob.
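As an illustration of that simulated hierarchy, here's how it looks through the current azure-storage-blob Python SDK (connection string, container, and prefix names are made up):

```python
# Sketch: blob names like "invoices/2016/report.pdf" simulate folders;
# the "/" delimiter is just part of the blob name.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("mycontainer")

# List only the blobs "under" a virtual folder:
for blob in container.list_blobs(name_starts_with="invoices/2016/"):
    print(blob.name)

# Or browse one level at a time, like a directory listing:
for item in container.walk_blobs(delimiter="/"):
    print(item.name)  # virtual-folder prefixes end in "/", blobs don't
```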
If you keep the container or blob private, the user would either have to go through your web front end (or web service), or you'd have to provide them with a special URL with a Shared Access Signature appended, which is a time-limited URL.
I would recommend taking a look at the BlobShare sample, a simple file-sharing application that demonstrates the storage services of the Windows Azure platform, together with the authentication and authorization capabilities of Access Control Service (ACS). The full sample code is located at the following link:
http://blobshare.codeplex.com/
You can use this sample code immediately, just by adding your own Windows Azure account credentials. The best thing about this sample is that you can grant blob access directly through Access Control Service. You can also modify the code to add SAS support, as well as blob download from public containers. Once you have it working and understand the concept, you can tweak it to work the way you want.

Allowing access to Azure Storage nodes to select users?

Given a file stored on Azure Storage (blobs, tables, or queues; it doesn't matter), is it possible to make it reachable by everyone, but grant access only based on permissions?
For example, I have a big store of images, and a DB containing users and authorizations. I want user X to only be able to access images Y and Z. So the URL would be generally inaccessible unless you provide some sort of temporary security token along with it. How is that possible?
I know I can shut the storage off from the outside world and allow access to it only through an application that checks these permissions, but this would require the application to be on Azure as well; an on-premises app wouldn't be able to deliver any content from Azure Storage.
It is my understanding that most CDNs provide such a capability, and I sure hope Azure provides a solution for this as well!
Itamar.
I don't think you can achieve this level of access filtering. The only methods I'm aware of are described in this MSDN article:
Managing Access to Containers and Blobs
and in this blog post, which walks through a small piece of code to implement it:
Using Container-Level Access Policies in Windows Azure Storage
I'm not sure this fits your need, but if I understood it right, I would do it this way:
1. Organize your content into containers that match the roles.
2. In your on-premises application, check whether the user has access; if so, generate the right URL to give them temporary access to the resource (see the sketch below).
Of course, this only works if the users have to go through a central point to get access to the content in the blob. If they bookmark the generated link, it will fail once the expiration date has passed.
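As a sketch of step 2, assuming the Python azure-storage-blob SDK and a hypothetical user_can_access() permission lookup (account, container, and key are placeholders too):

```python
# Sketch: after your own permission check, hand the user a time-limited
# SAS URL for exactly one blob. All names here are placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

def temporary_image_url(user, image_name: str) -> str:
    if not user_can_access(user, image_name):  # your DB authorization check
        raise PermissionError(image_name)
    token = generate_blob_sas(
        account_name="mystorage", container_name="images",
        blob_name=image_name, account_key="<account-key>",
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
    )
    return (f"https://mystorage.blob.core.windows.net/images/"
            f"{image_name}?{token}")
```

Note this logic runs wherever your application lives, on-premises included; only the storage key has to be available to it.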
Good luck.
This is actually possible to implement with blob storage. Consider (a) an explorer-like UI, and (b) that users are already authenticated (you could use Access Control Service, but you don't need to).
The explorer-like UI could expose the resources that are appropriate to the authenticated user. The underlying access to these resources would be controlled by Shared Access Signatures at the granularity appropriate to the objects: for example, access to a single file in a folder, access to the whole folder, or the ability to create a file in a folder can all be expressed.
This explorer-like UI would need access to logic that presents the right files for a given user, while also creating the appropriate Shared Access Signatures as needed. Note that this logic would not need to be hosted in Azure; it would just need access to the proper storage key (from the Azure portal).