I have a WordPress site that I am deploying to Azure Websites. I'm curious: if I use multiple instances, what happens when a user uploads content? Does the content get propagated automatically? I am not able to find anything about this in the Azure documentation.
Thanks
Actually, Azure Websites uses a single shared location for content, so updates will be reflected on all instances since they are uploaded to the same location.
Any updates made to the Azure Web Site's hard disk should be considered temporary. At BUILD last year it was stated that each instance gets a copy of the "original" blob storage, so any updates to one will not be propagated to the others. Of course, most of the internal workings of Azure are not published, so this may have changed since then. There is a nice WordPress plugin that will store uploaded media in your Windows Azure Blob Storage account: http://wordpress.org/extend/plugins/windows-azure-storage/ The user manual can be found here: http://plugins.svn.wordpress.org/windows-azure-storage/trunk/UserGuide.pdf
Related
So, I can do this in AWS and I'm curious if it's possible in Azure blob storage.
Currently I've mapped a domain like so:
www.contoso.com -> http://asdf.blob.core.windows.net/
asdf has a container called 'main', which contains my index.html, .js, and .css files.
Currently to hit the site I need to go to a url like this:
www.contoso.com/main/index.html
But what I really want is for people to be able to reach the files with just the .com address.
Does anyone know if this is possible in Azure?
While you can store all of your static assets in Azure blobs, Azure doesn't have the notion of a default file in blob storage. That is, you cannot map your domain name to index.html for example. You can, however, reference all other static content directly e.g. mysite.com/foo.html (since foo.html would effectively be a blob).
You'd need a way to host your root file though. Where you place that is really up to you (e.g. web app, VM, cloud service). Since it's static content, a web app might provide the best starting point (given there's a free tier, vs. consuming an entire VM to host just a single file).
If Google took you here: since June 2018, Azure Blob Storage has offered a static website feature.
This article should get you started.
P.S. The performance data in the article above looks strange around Azure Functions.
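For reference, here is a minimal sketch of turning the feature on programmatically with the Python azure-storage-blob package (v12); the connection string and file name are placeholders, and the same thing can be done through the portal instead:

```python
from azure.storage.blob import BlobServiceClient, ContentSettings, StaticWebsite

# Placeholder connection string -- use your storage account's own.
service = BlobServiceClient.from_connection_string("<connection-string>")

# Enable the static website feature; content is then served from the special
# $web container at the account's *.web.core.windows.net endpoint.
service.set_service_properties(
    static_website=StaticWebsite(
        enabled=True,
        index_document="index.html",
        error_document404_path="404.html",
    )
)

# Upload the site root so it is reachable at the web endpoint's root URL.
with open("index.html", "rb") as f:
    service.get_blob_client("$web", "index.html").upload_blob(
        f,
        overwrite=True,
        content_settings=ContentSettings(content_type="text/html"),
    )
```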
I think the root container would solve your problem:
https://learn.microsoft.com/en-us/rest/api/storageservices/fileservices/working-with-the-root-container
Simply create a container called '$root', and then you can address blobs without referring to the container; e.g., www.contoso.com/index.html
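If it helps, here is a minimal sketch of setting that up with the Python azure-storage-blob package (v12); the connection string is a placeholder. Note the earlier caveat still applies: there is no default document, so visitors still have to request /index.html explicitly.

```python
from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string("<connection-string>")

# The special container name "$root" lets blobs be addressed without a
# container segment, e.g. https://asdf.blob.core.windows.net/index.html
root = service.create_container("$root", public_access="blob")

with open("index.html", "rb") as f:
    root.upload_blob(
        "index.html",
        f,
        overwrite=True,
        content_settings=ContentSettings(content_type="text/html"),
    )
```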
We have an existing site that uses Orchard's Media folder to store images, which are used heavily in web pages. Now we want to share the Media folder across two different Web Apps (one is the production site and the other is staging).
When a content developer adds an image to the site, it is stored on the production file system, but the image is missing from our mirror (staging) site, so we have to copy it manually.
Currently we are thinking of storing the media files in Azure Blob Storage so that we can share the images between production and staging. Has anyone done that? If yes, please share your thoughts.
Any other ideas?
You need to use the Microsoft Azure Media Storage module to enable storing the assets in Azure Blobs.
There is a setup process for this described in docs.
The connection string will happily work shared between multiple projects.
If you have tenants then they can have their own isolated Storage accounts as well (and therefore their own custom domains).
When you enable it, though, it won't automatically copy the existing assets over to your Azure Blob Storage. There is a tool called AzCopy which you can use to move files in and out of cloud storage.
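If AzCopy isn't convenient, the one-off copy can also be scripted. Here's a rough sketch with the Python azure-storage-blob package; the connection string, container name, and local path are placeholders, not anything defined by the module:

```python
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("media")  # hypothetical container name

# Walk the local Orchard Media folder and upload each file, keeping the
# relative path as the blob name so URLs retain the same structure.
local_root = r"C:\path\to\site\Media"  # placeholder path
for dirpath, _dirs, filenames in os.walk(local_root):
    for name in filenames:
        path = os.path.join(dirpath, name)
        blob_name = os.path.relpath(path, local_root).replace(os.sep, "/")
        with open(path, "rb") as data:
            container.upload_blob(blob_name, data, overwrite=True)
```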
FYI: although it is a kind of CDN, by default Azure Blob Storage is stored in just one data center, replicated three times. There is actually a different product offering on Azure for a true CDN, if you want content replicated to points around the world to speed up asset delivery for global users, but that doesn't seem to be what you're looking for based on your original question.
Sharing a blob storage with media between production and staging just works.
I regularly copy my production site to my local machine and run the site locally and see all images.
Just watch out that you only add images on the production site; I'm not sure which references to the files exist in the Orchard database.
Have a look in the database, or just try it out and let us know.
As the title says, I'm looking for a way to access an Azure Files share (in preview) directly from an Azure Website. I cannot use any REST API or anything like that, and I was looking for the possibility of mounting an SMB share directly into the website (through the new portal or any other way).
I found the following links, from which I understand that this is still under review (http://feedback.azure.com/forums/169385-web-apps-formerly-websites/suggestions/6084609-allow-map-azure-file-share-microsoft-azure-file-s) and also a SO question (Can the new Azure File Service be used from Azure WebSites?) that doesn't answer my question.
To be honest, and for the sake of giving more details, my scenario is pretty simple: I have some websites and also some virtual machines that should access the files from the Azure Files service. Regarding the VMs, the approach is pretty straightforward and easy, but regarding the Websites, I can't find any way at the moment.
On the other hand, regardless of the answer to the above question, does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Thank you very much.
As of today, no single technology will serve your purpose. You can't use the File Service, as you don't have the capability to mount a share in an Azure Website, and it is not suited for streaming purposes (all access to files there needs to be authorized, and there's no concept of a Shared Access Signature in the File Service today).
I guess you would have to pick one of the two technologies (Blob Service or File Service) and make some compromises to make it work in both Websites and Virtual Machines.
Assuming you go with the File Service, you can mount the share in the Virtual Machine and do the processing on the files there. On the website front, you would need to use the Storage Client library to download the relevant files into a folder in your website and stream those files from there.
Assuming you go with the Blob Service, you can simply stream the files in your website directly from blob storage (no need to have those files in your website). In the Virtual Machine, when you need to process those files (blobs), you would simply download them to your VM for processing and then re-upload them to blob storage.
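As a rough illustration of the Blob Service route on the VM side, here is a sketch using the current Python azure-storage-blob package (the classic Storage Client library mentioned above has an equivalent API); the connection string, container, and blob names are placeholders:

```python
from azure.storage.blob import BlobClient

conn = "<connection-string>"  # placeholder

# Download a blob, process it locally, then upload the result back.
src = BlobClient.from_connection_string(
    conn, container_name="files", blob_name="video.mp4")
with open("video.mp4", "wb") as f:
    f.write(src.download_blob().readall())

# ... process the local copy, producing video-processed.mp4 ...

dst = BlobClient.from_connection_string(
    conn, container_name="files", blob_name="video-processed.mp4")
with open("video-processed.mp4", "rb") as f:
    dst.upload_blob(f, overwrite=True)
```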
Does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Currently it is not possible to serve Azure File Service files via CDN.
We have a solution which allows our customers to upload SCORM (http://scorm.com/scorm-explained/) packages, which are basically mini websites: just HTML pages and a predefined JavaScript interface which allows the packages to talk to our site. This all works fine on a self-hosted IIS machine, as we build up a folder structure for each SCORM module within the website root and can allow the user to open each one and complete the course.
However, these can be quite large, containing multiple videos, etc. While on a self-hosted machine we can place the site on a large hard drive, how would this work if we wanted to migrate the solution to Azure Cloud Services? I have read in several places that the site must be stateless and the VM can be re-imaged at any time. Does that mean we shouldn't store anything in the folder structure that wasn't part of the original package? Is there a way to configure a shared permanent folder for our websites to use?
In a word: BlobStorage!
As you quite rightly point out, Azure VMs are stateless, so you need a single, persisted repository for your data/files, and blob storage easily fits the bill.
Without further information on your precise requirements or how you explicitly need to reference these files, it's a bit hard to suggest the best way for you. However, here are a couple of things...
This video provides a quick overview of blob storage and retrieving items directly from BlobStorage. If your scenario requires you to serve up or store files on the local machine, this might be a good starting point.
BlobStorage supports direct access over HTTP and HTTPS, so you may want to simply reference files directly from BlobStorage (something that shouldn't be a problem as they're web assets). Check out the HTTP section in this article for the URL format. Note: you can also secure these blobs using Shared Access Signatures if you want to restrict access.
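For example, here is a rough sketch of generating a read-only SAS URL with the Python azure-storage-blob package (v12); the account, key, container, and blob names below are placeholders, not anything from your setup:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# All names below are illustrative placeholders.
account = "mystorageaccount"
account_key = "<account-key>"

# Create a read-only token that expires in one hour.
sas = generate_blob_sas(
    account_name=account,
    container_name="scorm-packages",
    blob_name="course1/index.html",
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

url = (f"https://{account}.blob.core.windows.net/"
       f"scorm-packages/course1/index.html?{sas}")
print(url)
```

Anyone holding that URL can read the blob until the expiry time, without the container needing public access.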
There's no reason why you couldn't use a standard VM and then map a network drive using the Microsoft Azure File Service. I can't say I've personally done this (I used something else), but the concept of a shared resource existing in BlobStorage is very doable.
HTH
I have just implemented Umbraco in an Azure Cloud instance. I was able to migrate my existing SQL database to run on SQL Azure, and everything runs fine except for the images and documents inside the media folder.
By default the media folder resides in [siteroot]/Media.
Is there a way to map this folder to Azure storage? If not, I don't think I'm going to be able to scale up my cloud instances, since the images depend on the virtual server's local storage.
Edit: Bounty Started
What I have so far is this:
1. Define a stand-alone web role which would hold the media directory and all the files.
2. Map this folder to the Azure Blob Storage service with Cloud Drive, in order to minimize the risk of losing data and relying on a single point of storage.
3. Somehow (and this is the part I don't know how to accomplish) keep the [siteRoot]/media folder synced with this shared drive on all running instances.
I've seen a similar approach taken with the Azure Accelerator project from Umbraco here: http://azureaccelerators.codeplex.com/releases
But they haven't updated the release since 2011, and I'm not sure it would work with the current version of Azure.
Edit 2:
Umbraco has its own accelerator, but they've deprecated it in favor of using Websites instead of Web Roles:
https://github.com/Microsoft-DPE/wa-accelerator-umbraco
This release works with the 1.6 SDK. The current version is 1.8, I believe...
I'm not sure about a way of mapping the path to storage, but depending on the version of Umbraco you are using, I think from 4.9 (possibly 4.10) they introduced the FileSystemProviders configuration, which may help solve your problem.
My understanding of it is that it allows you to replace the default Umbraco FileSystemProvider, Umbraco.Core.IO.PhysicalFileSystem with your own custom implementation. I'm pretty sure you could implement an Azure-based provider that wrote and read from the blob storage. In the source, it looks fairly straightforward, a matter of implementing their IFileSystem.
Ended up using Matt Brailsford's Universal Media Picker solution:
http://our.umbraco.org/projects/backoffice-extensions/universal-media-picker
The final solution actually circumvents the Umbraco media folder and reads directly from Blob Storage, so I had to rewrite all the macros and templates that previously rendered images and point them directly to the Blob Storage account.
Unfortunately there's no way to map an NTFS directory to BlobStorage directly.
Have a look at the CloudDrive class of the Windows Azure SDK. This feature allows you to upload a virtual hard disk file (.vhd) into your blob storage and mount it as a local drive inside Windows Azure instances.
You should know that (if you're using multiple instances) only one cloud instance can mount the VHD in read/write mode. The rest of them have only read access to the drive. If the "Media" folder stores static content that you update manually only a few times, this is okay. But if user content is placed there too, you might want only one instance to mount the VHD and grant other instances access to it via a network share.
This package, provided by Ali Sheikh Taheri, solves the problem of the media folder:
http://our.umbraco.org/projects/backoffice-extensions/ast-azure-media-sync