Best way to store database backup files in Azure App Service?

I have an app running on Azure App Service, and I have created some batch scripts that take backups of the databases (the DBs run on other servers, i.e. third-party cloud DB services, not Azure). The question is: what is the best way/place to store these backup files when using Azure App Service? Creating a folder named "Backup" in my source directory would overwrite these backups every time code is deployed. Following are some of my concerns:
Security of the backup files
Backup files should be easy to download whenever I want to restore one
Backup files shouldn't be overwritten or lost when a deployment is done or app slots are switched
I was thinking of storing the files in the %HOME% directory; is that a good idea?
Also, is there any size or storage limit with Azure App Service plans?

I would recommend that you store the backups outside the Azure App Service. Here are some problems with storing the files in App Service:
You can't easily move the app from one App Service to another.
App Service has storage limitations: Free and Shared sites get 1 GB of space, Basic sites get 10 GB, and Standard sites get 50 GB.
It's not easy to access the backups outside of your app.
Instead, Azure Blob Storage is an ideal place for storing large files.
Regarding your concerns:
1) You can make the Azure Blob Storage container private, so that you can only access it if you know the key.
2) There are multiple ways to access the backups stored in Azure Blob Storage:
Azure Storage Explorer is a GUI for accessing Blob Storage.
AZCopy, which you can easily call from .BAT files.
A few lines of simple C# (see the sketch after this answer).
3) When storing backups in Blob Storage, deployment slots don't affect the backups.
Blob Storage also offers an "Archive" tier, which is ideal for rarely accessed backups.
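For the C# route, here is a minimal sketch using the Azure.Storage.Blobs v12 SDK. The connection string, container name, and file paths are hypothetical placeholders; the container it creates is private by default, and the timestamped blob name keeps new backups from overwriting old ones.

using System;
using Azure.Storage.Blobs;

class BackupUploader
{
    static void Main()
    {
        // Hypothetical values: substitute your own connection string and container.
        var container = new BlobContainerClient("<storage-connection-string>", "db-backups");
        container.CreateIfNotExists(); // containers are private by default

        // Timestamped blob name so a new backup never overwrites an old one.
        var blobName = $"mydb-{DateTime.UtcNow:yyyyMMdd-HHmmss}.bak";
        container.GetBlobClient(blobName).Upload(@"C:\backups\mydb.bak", overwrite: false);

        Console.WriteLine($"Uploaded {blobName}");
    }
}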

Related

Azure blob storage streaming performance issue

Until now my application worked with local zip files,
meaning I directly returned new FileStream()
in the application over a local zip file located on an SSD/network drive path (the zip files can be hundreds of GB).
I configured the application to work with Azure Blob Storage, meaning each FileStream that was previously returned is now returned by the Azure Blob SDK call:
GetBlobStreamAsync(ContainerName, BlobName).ConfigureAwait(false).GetAwaiter().GetResult()
I uploaded some zip files to a container in the blob storage and set the connection string in the application to work with that storage account.
The application was deployed to and running on a Windows virtual machine located in the same region as the Azure Blob Storage account.
Note: This is a private cloud network.
When the app streams a zip file from Azure Blob Storage, performance seems to have decreased by at least 8-9 times (problematic with hundreds of GB).
The speed comparison is between the local C: drive on the same Windows virtual machine that the application runs on and an Azure Storage account located in the same region.
Note: network bandwidth on the Azure VM is 50 GB.
Solutions that I tried:
Azure Blob Premium performance storage - didn't improve performance.
.NET Core - takes advantage of performance enhancements (we work with .NET Framework, so this is irrelevant).
Network File System (NFS) 3.0 performance considerations in Azure Blob storage - does not work with a private cloud.
Hot, Cool, and Archive access tiers for blob data - the default is Hot, so we have already tried this scenario, with no improvement.
Solutions I want to try:
Azure Files share storage as a cache solution
.NET Framework configuration - the docs list several quick configuration settings that can yield significant performance improvements
Question:
Does anyone have any suggestions on how I can optimize streaming in front of Azure Blob Storage?
Azure Files (shares) and Blob Storage are likely not the right services to use as-is for this scenario. There are two possible paths:
Break the single file into multiple files and leverage the Blob Storage service, which handles throughput better than Azure Files. Azure Files performs better with small(er) files typical of user documents (PDFs, Word, Excel, etc.).
Switch over to a dedicated service designed specifically for large-size data transfer, if breaking a single file into multiple blobs is not an option.
Which option to recommend will depend heavily on the implementation details, requirements, and constraints of the system.
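For the Blob Storage path, here is a minimal sketch of parallel, ranged transfers with the Azure.Storage.Blobs v12 SDK. StorageTransferOptions is a real SDK type; the connection string, names, and tuning numbers are hypothetical and would need benchmarking against your workload.

using System.Threading.Tasks;
using Azure.Storage;
using Azure.Storage.Blobs;

class ParallelDownload
{
    static async Task Main()
    {
        // Hypothetical names: substitute your own connection string, container, and blob.
        var blob = new BlobClient("<storage-connection-string>", "zips", "big-archive.zip");

        // The SDK splits the blob into ranges and fetches them concurrently, which
        // usually saturates VM bandwidth far better than one sequential stream.
        var transferOptions = new StorageTransferOptions
        {
            MaximumConcurrency = 8,                // concurrent range requests
            MaximumTransferSize = 8 * 1024 * 1024  // 8 MB per request
        };
        await blob.DownloadToAsync(@"D:\cache\big-archive.zip", default, transferOptions);
    }
}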

Migrating Large Content Web App to Azure

I have a large web app with around 20 gigabytes of images and mp3s. It currently uses standard file IO libraries to read and write the sounds and mp3s. I'd like to migrate it to Azure, but I have concerns about storing that much content. Is it possible to use an App Service to host the web app with some sort of storage mounted at the root of the site for the assets, without rewriting all of the file access to use blobs or some other API?
If you look at the App Service plans here, you will notice that with Standard and better plans you get more than 20 GB of storage (50 GB+), so it is certainly possible to take your app as-is and run it in Azure. However, it is not recommended practice.
What you should do is make use of Azure Blob Storage for storing media content. You will need to make some changes as you can't simply mount Azure Blob Storage as a network drive.
There's Azure Files as well, which can be mounted as a network drive, but as of today you can't mount a file share as a network drive in an Azure Web App. You would need to deploy your application in a Virtual Machine (IaaS) or rewrite it to run as a Cloud Service.
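To give a sense of the scale of the rewrite, here is a hedged before/after sketch using the Azure.Storage.Blobs SDK; the container and blob names are hypothetical, and the code that consumes the returned Stream does not need to change.

using System.IO;
using Azure.Storage.Blobs;

class MediaReader
{
    static void Main()
    {
        // Before: Stream s = File.OpenRead(@"D:\content\sounds\intro.mp3");
        // After (hypothetical names): stream the same asset from Blob Storage.
        var container = new BlobContainerClient("<storage-connection-string>", "media");
        using (Stream s = container.GetBlobClient("sounds/intro.mp3").OpenRead())
        {
            // ...existing code that consumes the Stream stays unchanged...
        }
    }
}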

Azure Cloud Web Service, storage options

We are migrating our PHP website to Azure Cloud Web Service (Web Role).
Currently the website saves user-submitted image files to the filesystem via drive-letter access. These images are then served via a URL, e.g. content.example.com.
What options have I got if I want persistent file storage on an Azure Cloud Web Service?
I am currently assessing BLOB Storage for this.
Thanks
Blob storage is the right answer. Although you could convert your images to base64 and save them in Azure SQL as well, it is really not recommended.
Check: Azure, best way to store and deploy static content (e.g. images/css)? or Where to store things like user pictures using Azure? Blob Storage?
One of the options to reduce rewriting of your application is to mount an Azure File Service share as a network drive. Here is some information on how to do it: http://blogs.msdn.com/b/windowsazurestorage/archive/2014/05/12/introducing-microsoft-azure-file-service.aspx
Mounting of the drives can be done in a Web Role start-up task and can be scripted.
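For the Blob Storage route, here is a minimal sketch (in C# for illustration; the PHP SDK exposes the same operations) of persisting a user upload and serving it by URL. The connection string, container, and file names are hypothetical.

using System;
using System.IO;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class ImageStore
{
    static void Main()
    {
        // Hypothetical names: substitute your own connection string and container.
        var container = new BlobContainerClient("<storage-connection-string>", "user-images");
        container.CreateIfNotExists(PublicAccessType.Blob); // individual blobs readable by URL

        var blob = container.GetBlobClient("avatars/user-42.jpg");
        using (var upload = File.OpenRead(@"C:\temp\user-42.jpg"))
        {
            blob.Upload(upload, new BlobUploadOptions
            {
                HttpHeaders = new BlobHttpHeaders { ContentType = "image/jpeg" }
            });
        }

        // This URI can sit behind a CDN or a custom domain such as content.example.com.
        Console.WriteLine(blob.Uri);
    }
}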

Azure Websites - Persistent File Issues

We are looking at migrating some sites from Azure Cloud Services to Azure Websites (as that is how things seem to be going). With Cloud Services, we were specifically told that the file system state wasn't preserved, as instances are re-deployed on machine failure.
I am assuming Websites are built on blob storage. Is there a page from Microsoft that confirms that if I upload files to the site via FTP etc. they are persistent, backed up, and preserved as part of the site? If they are persistent, what are their SLAs? Is there any built-in function to back up local files? What happens to files on the local file system when instances scale out? Can I get access to the underlying blob storage?
If they are persistent, are there any issues with letting users upload files, in a hosting sense? I appreciate the risks of what users shouldn't upload. If files are persistent, is it still best practice to offload them to blob storage?
Yes, files that are part of your Web Site are persisted. You can access them via FTP.
You can use the backup service (currently in preview) to schedule backups to blob storage.
The Azure Web Sites SLA is available here: http://www.microsoft.com/en-us/download/details.aspx?id=39303

How to achieve incremental deployment of Blob Storage files to different environments of Windows Azure Storage?

We are new to Windows Azure and are developing a web application. At the beginning of the project we deployed the complete code to different environments, which published the complete code and uploaded blob objects to Azure Storage, as we linked Sitefinity to hold its blob objects in Azure Storage. Now that we are in the middle of development, we only need to upload any newly created blob files, which can be quite few in number (1 or 2, or maybe a handful). I would like to know the best process to sync these blob files across the Azure Storage environments, one per cloud service. Ideally we would update the staging cloud service and staging storage first, test there, and then, once no bugs are found, update the UAT and production storage accounts with the changed or new blob objects.
Please help.
You can use the Azure Storage Explorer to manually upload/download blobs from storage accounts very easily. For one or two blobs this is an easy solution; otherwise you will need to write a tool that connects to blob storage via an API and does the copying for you.
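For the tooling route, here is a minimal sketch of an incremental copy between storage accounts using the Azure.Storage.Blobs SDK. The connection strings and container name are hypothetical, and a real cross-account copy requires the source blob to be readable, e.g. via a SAS token appended to the source URI.

using Azure.Storage.Blobs;

class BlobSync
{
    static void Main()
    {
        // Hypothetical names: substitute your own connection strings and container.
        var staging = new BlobContainerClient("<staging-connection-string>", "sitefinity-assets");
        var prod = new BlobContainerClient("<production-connection-string>", "sitefinity-assets");
        prod.CreateIfNotExists();

        foreach (var item in staging.GetBlobs())
        {
            var target = prod.GetBlobClient(item.Name);

            // Copy only blobs missing from production: a simple incremental sync.
            // NOTE: for a cross-account copy the source URI must be readable
            // (e.g. append a SAS token to the Uri below).
            if (!target.Exists().Value)
                target.StartCopyFromUri(staging.GetBlobClient(item.Name).Uri);
        }
    }
}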
