How to achieve incremental deployment of blob storage files to different environments of Windows Azure storage?

We are new to Windows Azure and are developing a web application. At the beginning of the project we deployed the complete code to each environment, which published the full codebase and uploaded all blob objects to Azure storage (we configured Sitefinity to keep its blob objects in Azure storage). But now that we are in the middle of development, we only need to upload the blob files that are new or changed, which can be quite few in number (one, two, or maybe a handful). I would like to know the best process to sync these blob files to the different Azure storage environments, one for each cloud service. Ideally we would update the staging cloud service and staging storage first, test there, and once no bugs are found, push the changed or new blob objects to the UAT and production storage accounts as well.
Please help.

You can use Azure Storage Explorer to manually upload/download blobs from storage accounts very easily. For one or two blobs this is an easy solution; otherwise you will need to write a tool that connects to blob storage via the API and does the copying for you, as sketched below.
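A minimal sketch of such a tool, using the v12 Azure.Storage.Blobs NuGet package: it pushes a handful of new or changed files to one environment's storage account, so you can run it against staging first, test, then run it against UAT and production. The container name ("sitefinity-content") and the file paths are placeholder assumptions, not details from the question.

using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static class BlobSync
{
    public static async Task PushChangedFilesAsync(
        string connectionString, string localRoot, params string[] changedFiles)
    {
        // One container in each environment's storage account; created if missing.
        var container = new BlobContainerClient(connectionString, "sitefinity-content");
        await container.CreateIfNotExistsAsync();

        foreach (var relativePath in changedFiles)
        {
            // overwrite: true makes re-runs of the sync idempotent.
            await container.GetBlobClient(relativePath)
                           .UploadAsync(Path.Combine(localRoot, relativePath), overwrite: true);
        }
    }
}

Usage would be, for example, await BlobSync.PushChangedFilesAsync(stagingConnectionString, @"C:\content", "images/logo.png") after a new image is added.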

Related

Azure blob storage streaming performance issue

Until now my application worked with local zip files, meaning I returned a new FileStream() directly over a zip file located on an SSD/network drive path (the zip files can be hundreds of GB).
I then configured the application to work with Azure Blob Storage, meaning each FileStream that used to be returned is now obtained via the Azure Blob SDK method:
GetBlobStreamAsync(ContainerName, BlobName).ConfigureAwait(false).GetAwaiter().GetResult()
I uploaded some zip files to a container in the blob storage and set the application's connection string to that storage account.
The application was deployed and running on a Windows virtual machine located in the same region as the Azure storage account.
Note: This is a private cloud network.
When the app streams a zip file from Azure Blob Storage, performance drops by at least 8-9x (problematic with hundreds of GB).
The speed comparison is between the local C: drive on the Windows virtual machine the application runs on and an Azure Storage account located in the same region.
Note: network bandwidth is 50 GB on the Azure VM.
Solutions that I tried:
Azure Blob premium performance storage - didn't improve performance.
.NET Core - takes advantage of performance enhancements, but we work with .NET Framework, so this is irrelevant.
Network File System (NFS) 3.0 in Azure Blob Storage - does not work with a private cloud.
Hot, cool, and archive access tiers for blob data - the default is hot, so we have already tried this scenario, with no improvement.
Solutions I want to try:
Azure Files share storage as a cache solution.
.NET Framework configuration - lists several quick configuration settings that can yield significant performance improvements (sketched below).
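For reference, these are the quick .NET Framework settings that the Azure Storage performance checklist recommends; they are set once at application startup (e.g. in Main or Application_Start), before the first storage request is made:

using System.Net;

ServicePointManager.DefaultConnectionLimit = 100; // the default of 2 throttles concurrent storage requests
ServicePointManager.Expect100Continue = false;    // skip the extra 100-Continue round trip per request
ServicePointManager.UseNagleAlgorithm = false;    // Nagle's algorithm hurts small, chatty requests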
Question:
Does anyone have suggestions on how I can optimize streaming from Azure Blob Storage?
Azure Files (shares) and Blob Storage are likely not the right services to be used for this scenario as-is. There are two possible paths:
Break the single file into multiple files and leverage the Blob service, which handles throughput better than Azure Files; Azure Files performs better with small(er) files typical of user documents (PDFs, Word, Excel, etc.).
Switch to a dedicated service designed specifically for large-size data transfer, if breaking a single file into multiple blobs is not an option.
The recommendation for each option will depend heavily on the implementation details, requirements, and constraints of the system.
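Before restructuring the data, it may also be worth checking whether the single-blob download is parallelized at all: the v12 Azure.Storage.Blobs SDK can split one download into concurrent range reads. This is a hedged sketch, not the answerer's recommendation; the concurrency and chunk size are illustrative values to tune.

using System.IO;
using System.Threading.Tasks;
using Azure.Storage;
using Azure.Storage.Blobs;

static async Task DownloadLargeBlobAsync(BlobClient blob, string destinationPath)
{
    var transferOptions = new StorageTransferOptions
    {
        MaximumConcurrency = 8,                  // parallel range requests
        MaximumTransferSize = 100 * 1024 * 1024  // bytes per range
    };

    using (var destination = File.Create(destinationPath))
    {
        await blob.DownloadToAsync(destination, transferOptions: transferOptions);
    }
}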

Azure WebApp storing Files

I am updating a system that had all of its files stored inside SQL Server.
It's going from an on-prem server to an Azure web app.
My questions are:
I think I should be using a storage blob for these files. Is that correct or is there a better option inside of Azure that I should be using?
Is there a quick way to migrate files from sql to that blob?
For storage purposes, do I write the file to the blob and then store the hyperlink to that file?
The staging environment gets updated with the latest data from production when they do a release; is there a way to migrate the blob storage to a different resource group when they do this?
Yes, I would use blob storage.
The quickest way would be a short PowerShell or CLI script, or a console app, that pulls the files from the database and uploads them to blob storage.
I don't store the entire hyperlink to the file in the database, just the path; that way the storage account and container can be environment configuration.
I would recommend against doing this: since we started doing automated continuous deployment, we haven't had a reason to move backwards, which has eliminated a lot of effort. That being said, AzCopy is a utility that allows you to do server-side copies of blobs between storage accounts (along with many other types of source and destination if needed). That should do what you need.
To answer your questions:
I think I should be using a storage blob for these files. Is that correct or is there a better option inside of Azure that I should be using?
That's correct. Blob storage is designed for exactly this purpose.
Is there a quick way to migrate files from sql to that blob?
I'm not aware of an automated way to do that. What you would need to do is read the binary data from the SQL database, create a stream out of it, and upload that stream. You can use the Azure Storage SDK for the upload; a sketch follows.
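A minimal one-off migration sketch along those lines, using System.Data.SqlClient and the v12 Azure.Storage.Blobs package. The table and column names (Files, FileName, Content) and the container name are hypothetical placeholders for your actual schema.

using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static async Task MigrateFilesAsync(string sqlConnectionString, string storageConnectionString)
{
    var container = new BlobContainerClient(storageConnectionString, "documents");
    await container.CreateIfNotExistsAsync();

    using (var conn = new SqlConnection(sqlConnectionString))
    using (var cmd = new SqlCommand("SELECT FileName, Content FROM Files", conn))
    {
        await conn.OpenAsync();
        // SequentialAccess lets us stream the varbinary column instead of buffering it in memory.
        using (var reader = await cmd.ExecuteReaderAsync(CommandBehavior.SequentialAccess))
        {
            while (await reader.ReadAsync())
            {
                var name = reader.GetString(0);
                using (var content = reader.GetStream(1))
                {
                    await container.GetBlobClient(name).UploadAsync(content, overwrite: true);
                }
            }
        }
    }
}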
For storage purposes, do I write the file to the blob and then store the hyperlink to that file?
Under normal circumstances that is the recommended approach; however, considering you need a staging environment that is a copy of the production environment (including, I assume, the database), I would recommend you store two things in your database: the blob container name and the blob name (or you could store a relative URL, e.g. <container-name>/<blob-name>). Assuming you keep the storage account name somewhere in the configuration file, you can create the URL dynamically using the https://<account-name>.blob.core.windows.net/<container-name>/<blob-name> pattern.
The staging environment gets updated with the latest data from production when they do a release, is there a way to migrate storage blob to a different resource group for when they do this?
Azure Storage provides Copy Blob functionality, with which you can copy blobs from one blob container to another in the same or a different storage account. You can use that to copy data from the production environment to the staging environment; a sketch follows.
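A hedged sketch of that server-side copy with the v12 Azure.Storage.Blobs SDK: it copies every blob from a source container to a destination container without routing the bytes through your machine. For a cross-account copy, the source URI must be readable by the destination service (e.g. via a SAS token), which this sketch glosses over.

using System.Threading.Tasks;
using Azure.Storage.Blobs;

static async Task CopyContainerAsync(BlobContainerClient source, BlobContainerClient destination)
{
    await foreach (var item in source.GetBlobsAsync())
    {
        // Starts an asynchronous server-side copy of each blob, then waits for it to finish.
        var operation = await destination.GetBlobClient(item.Name)
                                         .StartCopyFromUriAsync(source.GetBlobClient(item.Name).Uri);
        await operation.WaitForCompletionAsync();
    }
}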

Copy files from Azure Blob storage to Azure File Storage with Azure functions

I have files in Azure Blob Storage that I want to be able to share with users through an FTP server running on an Azure VM.
As I understand it, you can't mount Blob Storage on a VM, but you can mount an Azure file share using 'net use'.
The files will be uploaded to Blob Storage incrementally, so ideally I would like to copy each one to Azure Files when it is uploaded; an Azure Function seems like the ideal way to do this, since Functions are easy to set up and handle the Blob Storage trigger for me.
How would I go about copying a file from Blob Storage to an Azure File Share using an Azure function?
You can set up a trigger binding on the Azure Function so it is triggered by blobs in the Azure Blob Storage container. Then you'll have to download the blob's stream and upload it to the Azure Storage file share.
Azure Functions does not include support for an output binding directly to an Azure Storage file share. So you'll need to either use the Azure Storage SDK from code (sketched below), or look into mounting the file share into the Azure Functions runtime environment so you can write file updates to it from within the function.
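A hedged sketch of the SDK approach as an in-process C# function: the blob trigger hands the uploaded blob to the function as a stream, which is then written to the share with the Azure.Storage.Files.Shares package. The container name ("uploads"), share name ("ftp-files"), and the StorageConnection app setting are assumptions.

using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Files.Shares;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobToFileShare
{
    [FunctionName("BlobToFileShare")]
    public static async Task Run(
        [BlobTrigger("uploads/{name}", Connection = "StorageConnection")] Stream blob,
        string name,
        ILogger log)
    {
        var share = new ShareClient(
            System.Environment.GetEnvironmentVariable("StorageConnection"), "ftp-files");
        var file = share.GetRootDirectoryClient().GetFileClient(name);

        // Azure Files requires creating the file at its final size before uploading content.
        await file.CreateAsync(blob.Length);
        await file.UploadAsync(blob);

        log.LogInformation($"Copied {name} ({blob.Length} bytes) to the file share.");
    }
}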
An alternative solution would be to use Azure Logic Apps to implement this without writing any code. This article might help for integrating with an Azure Storage file share -> Connect to on-premises file systems from logic apps with the File System connector

Best way to store database backup files in Azure App Service?

I have an app running on Azure App Service, and I have created some batch scripts that back up the databases (the DBs run on other servers, i.e. third-party cloud DB services, not Azure). The question is: what is the best way/place to store these backup files from Azure App Service? Creating a folder named "Backup" in my source directory would overwrite these backups every time code is deployed. These are some of the concerns:
Security of backup files
Backup files should be easily downloaded whenever I want to restore it
Backup files shouldn't be overwritten or lost when a deployment is done or app slots are switched.
I was thinking of storing the files in the %HOME% directory; is that a good idea?
Also, is there any size or storage limit with Azure App Service plans?
I would recommend that you store the backups outside the Azure App Service. Here are some problems with storing the files in App Service:
You can't easily move the app from one App Service to another.
App Service has some storage limitations: Free and Shared sites get 1 GB of space, Basic sites get 10 GB, and Standard sites get 50 GB.
It's not easy to access the backups outside of your app.
Instead, Azure Blob Storage is an ideal place for storing large files.
Regarding your concerns:
1) You can make the Azure Blob Storage container private, so that it can only be accessed with the account key.
2) There are multiple ways to access backups stored in Azure Blob Storage:
Azure Storage Explorer is a GUI for accessing the blob storage.
AzCopy, which you can easily use from .BAT files.
Simple C# with the Storage SDK (see the sketch below).
3) When storing backups in Blob Storage, deployment slots don't affect the backups.
Blob Storage also offers an "Archive" tier, which is ideal for storing rarely used backups.
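A minimal sketch of the "simple C#" option with the v12 Azure.Storage.Blobs package: one helper pushes a backup file to a private container, the other pulls it back for a restore. The container name is a placeholder.

using System.Threading.Tasks;
using Azure.Storage.Blobs;

static class BackupStore
{
    static BlobContainerClient Container(string connectionString) =>
        new BlobContainerClient(connectionString, "db-backups");

    public static async Task UploadAsync(string connectionString, string localFile, string blobName)
    {
        var container = Container(connectionString);
        await container.CreateIfNotExistsAsync(); // created with no public access by default
        await container.GetBlobClient(blobName).UploadAsync(localFile, overwrite: true);
    }

    public static Task DownloadAsync(string connectionString, string blobName, string localFile) =>
        Container(connectionString).GetBlobClient(blobName).DownloadToAsync(localFile);
}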

What's the best way to host a large amount of content in Windows Azure?

I have one application that consumes over 40 GB of content in XML files. Each file is about 40-100 KB.
Today there is a tool that generates new files or updates existing files and then publishes them to on-premise servers.
Now I'm moving my application to Azure. Because I have no way to make changes to the publishing tool, I can build another tool that uploads all the content to Azure blob storage and keeps blob storage in sync with the on-premise content.
Is there another way to do this?
You can use Azure Storage Explorer, or Azure Explorer from Cerebrata:
http://www.cerebrata.com/products/azure-explorer/introduction
https://azurestorageexplorer.codeplex.com/
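If you do build the sync tool described in the question, here is a hedged sketch of the incremental part with the v12 Azure.Storage.Blobs package: compare each local file's MD5 against the blob's stored Content-MD5 and upload only new or changed files. The container and directory layout are assumptions, and Content-MD5 may be absent on blobs uploaded in blocks, in which case this simply re-uploads them.

using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Threading.Tasks;
using Azure;
using Azure.Storage.Blobs;

static async Task SyncDirectoryAsync(BlobContainerClient container, string localRoot)
{
    foreach (var path in Directory.EnumerateFiles(localRoot, "*.xml", SearchOption.AllDirectories))
    {
        var blobName = path.Substring(localRoot.Length + 1).Replace('\\', '/');
        var blob = container.GetBlobClient(blobName);

        // Hash the local file so it can be compared with the blob's stored Content-MD5.
        byte[] localMd5;
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
            localMd5 = md5.ComputeHash(stream);

        try
        {
            var props = await blob.GetPropertiesAsync();
            if (props.Value.ContentHash != null && props.Value.ContentHash.SequenceEqual(localMd5))
                continue; // unchanged - skip the upload
        }
        catch (RequestFailedException e) when (e.Status == 404)
        {
            // Blob doesn't exist yet - fall through and upload it.
        }

        await blob.UploadAsync(path, overwrite: true);
    }
}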
