Kentico Azure blob integration

On my Kentico project I have integrated Azure blob storage instead of saving files locally, following this article: https://docs.kentico.com/k12/custom-development/working-with-physical-files-using-the-api/configuring-file-system-providers/configuring-azure-storage
Things are working alright except for one problem: now all the files are publicly accessible. There are some PDF files in the media library that I want only logged-in users to view, but now anyone can view these files. Is there any workaround for this issue?

Files in the media library are always accessible via a direct link and you can't restrict them to logged-in users only, regardless of whether they are on Azure storage or the local disk.
But there are two ways of achieving this:
Presentation-only restrictions. When you present those PDF links to the website user, display them only to logged-in users. The files will still be accessible via direct links, but only logged-in users will see them.
Hard restrictions. As far as I know, these restrictions can be set up only for files stored in the CMS content tree. This approach checks permissions even when a file is accessed via its direct link.

If you are storing files in blob storage there is no way. You can restrict access to the whole container (or to an individual blob) with a SAS token, but not to a specific folder. A folder is a purely virtual structure; it exists only in the file path.
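If you do go the SAS route, here is a minimal sketch of issuing a short-lived, read-only SAS URL for a single blob using the Python azure-storage-blob SDK; the account, container and blob names are placeholders, and your application still has to decide who gets handed such a URL:

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    # Placeholder account/container/blob names - replace with your own.
    account_name = "mystorageaccount"
    account_key = "<account-key>"
    container_name = "media"
    blob_name = "files/protected/report.pdf"

    # Read-only SAS valid for one hour; only callers holding this URL can read the blob.
    sas = generate_blob_sas(
        account_name=account_name,
        container_name=container_name,
        blob_name=blob_name,
        account_key=account_key,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )

    print(f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas}")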

Related

Azure storage options to serve content on Azure Web App

I am a total newbie to Azure Web Apps and storage, so I need some clarification/confirmation. The main thing to take note of: my application (described below) requires a folder hierarchy. Blob is out of the question and a file share doesn't allow anonymous access unless I use a Shared Access Signature (SAS).
Am I understanding Azure storage correctly, in that either you fit into the Azure storage model or you don't?
Can anyone advise how I can achieve what's required by the CMS application described below by using blobs?
The only option I see is to find a way to change the CMS application so that it always includes the SAS in the URL of every file it requests from storage in order to serve content on my Web App. If so, is it a problem if I set my SAS to expire sometime in the distant future?
https://<appname>.file.core.windows.net/instance1/site1/file1.jpg?<SAS>
Problem with using Blob
So far my understanding is that Blob storage doesn't allow "sub folders", as it's a container that holds unstructured data, and therefore I'm unable to use it for my application (described below), which requires a folder structure.
The problem with using File Share
File share seemed perfect as it allows for folder hierarchy, naturally that's what I've used.
However, no anonymous access is allowed for files stored in file storage; access needs to be authorised. One way of authorising the access is to create a SAS at the file/share level with Read permission and then use that SAS URL to access the file.
Cannot access Windows azure file storage document
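For reference, here is a minimal sketch of what that looks like with the Python azure-storage-file-share SDK: a share-level, read-only SAS appended to each file URL. The account, share and file names are placeholders mirroring the example URL above, and the long expiry is exactly the trade-off being asked about:

    from datetime import datetime, timedelta, timezone
    from azure.storage.fileshare import ShareSasPermissions, generate_share_sas

    # Placeholder account and share names - replace with your own.
    account_name = "appnamestorage"
    account_key = "<account-key>"
    share_name = "instance1"

    # Read-only SAS covering the whole share; a distant expiry works but weakens security.
    sas = generate_share_sas(
        account_name=account_name,
        share_name=share_name,
        account_key=account_key,
        permission=ShareSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(days=365),
    )

    print(f"https://{account_name}.file.core.windows.net/{share_name}/site1/file1.jpg?{sas}")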
My application
I've created a Linux Web App running an open-source CMS application. This application allows the creation of multiple websites, with each website's content (images, docs, multimedia) stored on a file server. These files are then served to the website via a defined URL.
The CMS application has a setting for the location where it should save its files; this would be a folder on the file server. It then creates a new sub folder for every site it hosts in that location.
Example folder hierarchy
/instance1
    /site1
        /file1
        /file2
    /site2
        /file1
        /file2
Am I understanding Azure storage correctly, it's either you fit into the Azure storage model or you don't?
You can use the Azure storage model for your CMS application: either Blob Storage or a File Share.
Can anyone advise how I can achieve what's required by the CMS application as described below by using Blobs?
You can use a Data Lake Storage Gen2 account if you want to use Azure Blob Storage. Data Lake Storage Gen2 enables a hierarchical namespace, so you can use real subfolders in Blob Storage as your requirements demand.
Problem with using Blob
Blob Storage allows subfolders if you use a Data Lake Storage Gen2 account. You can also enable public anonymous access on the blobs.
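For illustration, here is a rough sketch with the Python azure-storage-file-datalake SDK, assuming a storage account that already has the hierarchical namespace enabled; the connection string, file system and paths are placeholders:

    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder connection string and names - replace with your own.
    service = DataLakeServiceClient.from_connection_string("<connection-string>")
    fs = service.get_file_system_client("webcontent")  # a container with hierarchical namespace

    # Real directories, courtesy of the hierarchical namespace.
    site_dir = fs.get_directory_client("instance1/site1")
    site_dir.create_directory()

    # Upload a file into that directory.
    file_client = site_dir.create_file("file1.jpg")
    with open("file1.jpg", "rb") as data:
        file_client.upload_data(data, overwrite=True)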
The problem with using File Share
Azure File Share supports a folder hierarchy but does not allow public anonymous access. You can use a system-assigned managed identity for your web app to access the Azure File Share.
Then your application would be able to access the Azure File Share without a SAS token.
The lack of real folders in blob storage shouldn't be an issue for your use case. Just because it doesn't have traditional folders doesn't mean it can't serve content at e.g. instance1/site1/file1. That's still possible; instance1/site1/ will just be part of the name of the blob.
Tools like the Azure Portal or Storage Explorer will actually show folders by using the / delimiter and querying data that appears to be inside a folder by using the path as a prefix.
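You can do the same prefix/delimiter trick from code; a small sketch with the Python azure-storage-blob SDK (connection string and container name are placeholders):

    from azure.storage.blob import ContainerClient

    # Placeholder connection string and container name - replace with your own.
    container = ContainerClient.from_connection_string("<connection-string>", "webcontent")

    # Everything that "looks like" it lives under instance1/site1/.
    for blob in container.list_blobs(name_starts_with="instance1/site1/"):
        print(blob.name)

    # Or walk one level at a time, treating "/" as a folder delimiter.
    for item in container.walk_blobs(name_starts_with="instance1/", delimiter="/"):
        print(item.name)  # "instance1/site1/", "instance1/site2/", ...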

Deep link to text file in Azure Data Lake Store

I am trying to quickly access text files via URL. The Azure portal (http://portal.azure.com) can (at best) link to the explore view of a specific folder, but I have not found any way to deep link into a specific file.
I also tried Azure Storage Explorer, which does support adl:// URLs, but (apart from opening slowly) it only browses to the folder and doesn't actually open the file.
My use case is that at the end of each data processing job, I want to print a URL to open a text file for browsing.
Any ideas or workarounds?
In fact, there's no anonymous access allowed for files stored in ADLS. Access needs to be authorized, so you can't open a file via its URL directly.
Based on your situation, I suggest creating your own endpoint (for example, an Azure Function) as a proxy that accesses the resources with proper authorization. You could call the Azure Function with the URL (or path) of the file you want to open as a parameter; it then makes the request to get the content of the file and returns it for browsing.
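A rough sketch of such a proxy, written as an HTTP-triggered Azure Function in Python purely to illustrate the idea. It reads from an ADLS Gen2 / blob account via a connection string (a Gen1 store would use the azure-datalake-store SDK and AAD auth instead), and the connection string, file system and query parameter are all placeholders:

    import azure.functions as func
    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder storage details - adjust to your own store.
    CONN_STR = "<connection-string>"
    FILE_SYSTEM = "jobresults"

    def main(req: func.HttpRequest) -> func.HttpResponse:
        # e.g. https://<functionapp>.azurewebsites.net/api/openfile?path=2024-01-01/output.txt
        path = req.params.get("path")
        if not path:
            return func.HttpResponse("Missing 'path' parameter", status_code=400)

        # The function itself is authorized; the caller only needs the function URL.
        service = DataLakeServiceClient.from_connection_string(CONN_STR)
        file_client = service.get_file_system_client(FILE_SYSTEM).get_file_client(path)
        content = file_client.download_file().readall()

        return func.HttpResponse(content, mimetype="text/plain")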
In addition, since file access needs to be secured, you should also look at access control in Azure Data Lake Store.
Hope it helps.

Move existing images present in Media folders of Orchard to CDN

We have an existing site that uses Orchard's Media folder to store images, which are used heavily in web pages. Now we want the Media folder to be shared across two different Web Apps (one is the production site and the other is staging).
When a content developer adds an image to the site, it is stored on the file system in production, but these images are missing from our mirror site, so we have to copy them over manually.
Currently we are thinking of storing the media files in Azure blob storage so that we can share the images between production and staging. Has anyone done that? If yes, please share your thoughts.
Any other ideas?
You need to use the Microsoft Azure Media Storage module to enable storing the assets in Azure Blobs.
There is a setup process for this described in docs.
The connection string will happily work shared between multiple projects.
If you have tenants then they can have their own isolated Storage accounts as well (and therefore their own custom domains).
When you enable it, though, it won't automatically copy the existing assets over to your Azure Blob Storage. I think there is a tool called AzCopy which you can use to move files in and out of cloud storage.
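If AzCopy isn't an option, a one-off copy can also be scripted. A rough sketch with the Python azure-storage-blob SDK (local path, connection string and container name are placeholders; check which container and path layout the Orchard module actually expects before running something like this):

    import os
    from azure.storage.blob import BlobServiceClient

    # Placeholder values - point these at your Orchard Media folder and storage account.
    LOCAL_MEDIA_ROOT = r"C:\inetpub\wwwroot\Media"
    CONN_STR = "<connection-string>"
    CONTAINER = "media"

    container = BlobServiceClient.from_connection_string(CONN_STR).get_container_client(CONTAINER)

    for root, _dirs, files in os.walk(LOCAL_MEDIA_ROOT):
        for name in files:
            local_path = os.path.join(root, name)
            # Keep the relative path so blob names mirror the Media folder layout.
            blob_name = os.path.relpath(local_path, LOCAL_MEDIA_ROOT).replace("\\", "/")
            with open(local_path, "rb") as data:
                container.upload_blob(name=blob_name, data=data, overwrite=True)
            print("uploaded", blob_name)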
FYI, although it is a kind of CDN, by default Azure Blob Storage is just stored in one data center, replicated three times. There is a separate product offering on Azure for a true CDN if you want assets replicated to points around the world to speed up delivery for global users, but that doesn't seem to be what you're looking for based on your original question.
Sharing a blob storage with media between production and staging just works.
I regularly copy my production site to my local machine and run the site locally and see all images.
Maybe watch out that you only add images at the production site; I'm not sure which references to the files there are in the Orchard database.
Have a look in the database, or just try it out and let us know.

Uploading Multiple SCORM Websites/Packages (Mini Websites) To Existing Azure Website

We have a solution which allows our customers to upload SCORM (http://scorm.com/scorm-explained/) packages, which are basically mini websites: just HTML pages and a predefined JavaScript interface which allows the packages to talk to our site. This all works fine on a self-hosted IIS machine, as we build up a folder structure for each SCORM module within the website root and can allow the user to open each one and complete the course.
However, these can be quite large, containing multiple videos, etc., and while on a self-hosted machine we can place the site on a large hard drive, how would this work if we wanted to migrate the solution to Azure cloud services? I have read in several places that the site must be stateless and the VM can be re-imaged at any time; does that mean we shouldn't store anything in the folder structure that wasn't part of the original package? Is there a way to configure a shared permanent folder for our websites to use?
In a word: blob storage!
As you quite rightly point out, Azure VMs are stateless, so you need a single, persisted repository for your data/files, and blob storage easily fits the bill.
Without further information on precise requirements or how you explicitly need to reference these files, it's a bit hard to suggest the best way for you. However, here's a couple of things...
This video provides a quick overview of blob storage and retrieving items directly from blob storage. If your scenario requires you to serve up or store files on the local machine, this might be a good starting point.
Blob storage supports direct access over HTTP and HTTPS, so you may want to simply reference files directly from blob storage (something that shouldn't be a problem as they're web assets). Check out the HTTP section in this article for the URL format. Note: you can also secure these blobs using Shared Access Signatures if you want to restrict access.
There's no reason why you cannot use a standard VM and then map a network drive using the Microsoft Azure File service. I can't say I've personally done this (I used something else), but the concept of a shared resource existing in Azure storage is very doable.
HTH

Upload and use web folder on Windows Azure

We already allow our users to upload files through the app to the Azure Blob Storage, and then view them inside the app.
What we need now is to allow the upload of an entire folder containing web files (HTML, JS, CSS, images...), maintaining the folder structure it has, and then be able to run these files in the browser. The link references between the files must also be maintained so it can work.
What will be the correct way to do this?
Is it possible through Blob Storage or do we need to upload the folder and its contents directly to the file system?
Thanks!
Note that Azure Blob storage doesn't have a concept of a "folder". The closest you would get is to name a file "foldername/filename.ext". How you populate blob storage in this fashion would depend on how you allow a user to upload all their files, perhaps as a zip file or through some form of ajax-based file upload UI on a web page. Ultimately you can't build folders, but you should be able to replicate the behaviour of one.
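As a sketch of that idea, here is roughly what unpacking a user-supplied zip into blobs could look like with the Python azure-storage-blob SDK; the connection string, container, prefix and zip name are placeholders. Because each file's relative path becomes part of the blob name under one prefix, relative links between the HTML/CSS/JS files keep resolving, and setting the content type lets the browser render them directly:

    import mimetypes
    import zipfile
    from azure.storage.blob import BlobServiceClient, ContentSettings

    # Placeholder values - replace with your own.
    CONN_STR = "<connection-string>"
    CONTAINER = "sites"
    PREFIX = "user42/site1"  # the virtual "folder" for this upload

    container = BlobServiceClient.from_connection_string(CONN_STR).get_container_client(CONTAINER)

    # Unpack a zip the user uploaded and recreate its structure as blob-name prefixes.
    with zipfile.ZipFile("uploaded-site.zip") as archive:
        for entry in archive.infolist():
            if entry.is_dir():
                continue
            blob_name = f"{PREFIX}/{entry.filename}"  # e.g. user42/site1/css/main.css
            content_type, _ = mimetypes.guess_type(entry.filename)
            container.upload_blob(
                name=blob_name,
                data=archive.read(entry),
                overwrite=True,
                # Correct content types let the browser render the files from their blob URLs.
                content_settings=ContentSettings(content_type=content_type or "application/octet-stream"),
            )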
