I'm currently using Azure Blob storage to store files, and I upload/download them from an ASP.NET application hosted outside of Azure. (I do not have a Web Role or Worker Role.)
Is it possible to zip multiple files into one zip file within Azure Blob before downloading?
Thanks in advance!
The only way to achieve this would be to use a Windows Azure compute role in the cloud. You obviously wouldn't want to do it on your on-premises servers, as you'd round-trip the files twice.
One approach you might consider would be to build a download 'client' in Silverlight. This could handle the communication with blob storage, pull down the blobs (maybe in parallel), and then create the zip client-side for saving.
But the short answer is that this is not possible using Windows Azure storage alone.
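If you do end up zipping the blobs in a compute role (or any other server-side process), the zipping step itself is straightforward. The following is only a rough sketch, assuming the Azure.Storage.Blobs SDK; the container and blob names are placeholders:

```csharp
// Rough sketch: stream several blobs into a single zip on the compute side.
// Assumes the Azure.Storage.Blobs SDK; blob names are supplied by the caller.
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class BlobZipper
{
    public static async Task<Stream> ZipBlobsAsync(
        BlobContainerClient container, string[] blobNames)
    {
        var output = new MemoryStream();

        // Leave the stream open so the caller can return it to the browser.
        using (var zip = new ZipArchive(output, ZipArchiveMode.Create, leaveOpen: true))
        {
            foreach (var name in blobNames)
            {
                var entry = zip.CreateEntry(name, CompressionLevel.Optimal);
                using var entryStream = entry.Open();
                await container.GetBlobClient(name).DownloadToAsync(entryStream);
            }
        }

        output.Position = 0;
        return output;
    }
}
```

The key point is that the blobs still have to be downloaded to wherever the zip is created, which is why doing this inside Azure (rather than on-premises) avoids the double round-trip.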
We currently have a Windows service to process inbound/outbound files.
For inbound files, we read the data, perform some calculations, and store the data in a database.
For outbound files, we generate the data from the database.
We want to migrate to Azure now. I have the following questions:
1) What is the best way to store files in Azure (Blob storage or an Azure file share)? We only have ".pdf", ".txt", and ".xlsx" formats, no videos.
2) Which option is better for processing the files: WebJobs, a virtual machine with the Windows service installed, Azure Batch, Azure Kubernetes Service, or Service Fabric?
Can someone please help me with this?
Thanks
How are you receiving the files: API, FTP, or some other way? There are a ton of details that are needed to really answer this, but here are my thoughts.
Blob storage would be more cost effective. You only need to use a file share if you want to be able to map a network drive from a VM.
If processing one file completes in less than 10 minutes, I would look at Azure Functions for that. If you're processing thousands of files per day, Azure Functions would be expensive, so I would look at running the work on an App Service, on VMs, or moving to Service Fabric.
If you have a web site that's used to upload the files and you're already using Azure App Service, then you could use WebJobs.
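If Azure Functions turns out to be a fit, a blob-triggered function is the usual shape for this. A minimal sketch, assuming the in-process Functions/WebJobs model; the container name "inbound-files" and the function name are made up:

```csharp
// Minimal sketch of a blob-triggered Azure Function for the inbound files.
// Fires whenever a new blob lands in the "inbound-files" container.
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class InboundFileFunctions
{
    [FunctionName("ProcessInboundFile")]
    public static void Run(
        [BlobTrigger("inbound-files/{name}")] Stream inboundFile,
        string name,
        ILogger log)
    {
        log.LogInformation("Processing inbound file {Name} ({Length} bytes)",
            name, inboundFile.Length);

        // Read the file, run the calculations, and write the results
        // to the database here.
    }
}
```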
I have a problem that I have been wracking my brain about and figured I would need some perspective and insight from people who are a lot more knowledgeable about this.
What I have currently: a web-based application hosted in Azure uses Azure blob storage to store files that are generated as part of data import processes. We have a separate application that extends the original web application and allows users to upload files; these files are currently also stored in Azure blob storage.
Where I am trying to go: I have a requirement to map network file shares on a user's laptop so that users can access the files that currently reside in the blob store.
Since Azure blob storage does not support SMB, I have no way of actually doing this with a blob store.
I could use Azure Files in conjunction with a file server running the sync agent. However, this requires a lot of work in terms of refactoring, setup, and a custom service that adds/removes permissions on the file server.
I'm wondering if there is a service or a piece of software on the market that would let me keep using blob storage and sync the blob files to a file server, which users could then access and open files from using Windows File Explorer. I found one that looks like an open-source project, but it only does a one-way sync from the blob store to the file share. Ideally I'd like a solution that does a two-way sync, like Azure File Sync does.
Any thoughts and ideas will be appreciated.
Since the maximum number of blob containers and file shares is unlimited, per my understanding you could leverage the following approaches:
Migrate the data from blob storage to an Azure file share, so that subsequent files are stored in Azure Files.
Note: Currently you must specify the storage account key when mounting file shares (see the related feedback item for details), so I recommend that you do not map network file shares on a user's laptop.
You could still use blob storage: create a blob container for each user and generate a container-level SAS token for each user. The users could then use Azure Storage Explorer to manage their blob files, or use AzCopy and other command-line tools to download the blob files to their laptop's file system.
Note: For security, you could combine a stored access policy with the SAS; to revoke the permissions, you just need to invalidate the related access policy instead of regenerating the account key. For details, see Controlling a SAS with a stored access policy and Shared Access Signatures, Part 2: Create and use a SAS with Blob storage.
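To illustrate the stored-access-policy idea, here is a sketch of issuing a revocable, container-level SAS per user, assuming the Azure.Storage.Blobs and Azure.Storage.Sas packages; the policy id "read-policy" and the overall shape are illustrative only:

```csharp
// Sketch: create a stored access policy on a per-user container, then issue a
// SAS tied to that policy so access can be revoked later by removing the policy.
using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Sas;

public static class UserContainerSas
{
    public static Uri CreateRevocableSas(
        BlobContainerClient container, StorageSharedKeyCredential credential)
    {
        // 1. Store the permissions and expiry in a named access policy on the container.
        var policy = new BlobSignedIdentifier
        {
            Id = "read-policy",
            AccessPolicy = new BlobAccessPolicy
            {
                PolicyExpiresOn = DateTimeOffset.UtcNow.AddDays(7),
                Permissions = "rl" // read + list
            }
        };
        container.SetAccessPolicy(permissions: new[] { policy });

        // 2. Build a SAS that only references the policy by its identifier.
        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = container.Name,
            Resource = "c",            // container-level SAS
            Identifier = "read-policy"
        };
        var sasToken = sasBuilder.ToSasQueryParameters(credential).ToString();

        // Revoking access later just means deleting or changing "read-policy"
        // on the container; the account key is never regenerated.
        return new Uri($"{container.Uri}?{sasToken}");
    }
}
```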
I have an application which consumes over 40 GB of XML content. Each file is about 40-100 KB.
Today there is a tool that generates new files or updates existing files and then publishes them to on-premises servers.
Now I'm moving my application to Azure. Because I have no way to make any changes to the publishing tool, I can build another tool to upload all the content to Azure blob storage and keep blob storage in sync with the on-premises content.
Is there other way to do this?
You can use Azure Storage Explorer or Azure Explorer from Cerebrata
http://www.cerebrata.com/products/azure-explorer/introduction
https://azurestorageexplorer.codeplex.com/
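If you do end up writing your own upload tool instead, the core of it is small. A rough one-way sketch, assuming the Azure.Storage.Blobs SDK; the container name "xml-content" and the folder layout are placeholders:

```csharp
// Rough sketch of a one-way "upload everything" tool for the XML content.
// Uploads every .xml file under a local root, preserving the folder layout
// in the blob names.
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class ContentUploader
{
    public static async Task UploadFolderAsync(string localRoot, string connectionString)
    {
        var container = new BlobContainerClient(connectionString, "xml-content");
        await container.CreateIfNotExistsAsync();

        foreach (var path in Directory.EnumerateFiles(localRoot, "*.xml", SearchOption.AllDirectories))
        {
            // Use the relative path as the blob name so the on-premises layout is preserved.
            var blobName = Path.GetRelativePath(localRoot, path).Replace('\\', '/');
            await container.GetBlobClient(blobName).UploadAsync(path, overwrite: true);
            Console.WriteLine($"Uploaded {blobName}");
        }
    }
}
```

For ongoing synchronization rather than a one-off upload, the tools mentioned above (or AzCopy) may be simpler than maintaining custom code.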
We are looking at migrating some sites from Azure Cloud Services to Azure Websites (as that seems to be the direction things are going). With Cloud Services, we were specifically told that the file system state is not preserved, since instances are redeployed on machine failure.
I am assuming Websites are built on blob storage. Is there a page from Microsoft that confirms that files I upload to the site via FTP etc. are persistent, backed up, and preserved as part of the site? If they are persistent, what are their SLAs? Is there any built-in function to back up local files? What happens to files on the local file system when an instance scales out? Can I get access to the underlying blob storage?
If this is standard behaviour, are there any issues with letting users upload files, in a hosting sense? I appreciate the risks around what users shouldn't upload. If files are persistent, is it still best practice to offload them to blob storage?
Yes, files that are part of your Web Site are persisted. You can access them via FTP.
You can use the backup service (currently in preview) to schedule backups to blob storage.
The Azure Web Sites SLA is available here: http://www.microsoft.com/en-us/download/details.aspx?id=39303
We are new to Windows Azure and are developing a web application. At the beginning of the project, we deployed the complete code to the different environments, which published the complete code and uploaded the blob objects to Azure storage, since we configured Sitefinity to hold its blob objects in Azure storage. Now that we are in the middle of development, we only need to upload any newly created blob files, which can be quite few in number (1 or 2, or maybe a few more). I would like to know the best process for syncing these blob files to the different Azure storage environments, one per cloud service. Ideally we would like to update the staging cloud service and staging storage first, test there, and then, once no bugs are found, update the UAT and production storage accounts with the changed or new blob objects.
Please help.
You can use Azure Storage Explorer to manually upload/download blobs from storage accounts very easily. For one or two blobs this would be an easy solution; otherwise you will need to write a tool that connects to blob storage via an API and does the copying for you.
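If you go the custom-tool route, one option is a server-side copy between the storage accounts. A sketch, assuming the Azure.Storage.Blobs SDK; connection strings and the shared container name are placeholders:

```csharp
// Sketch of copying all blobs in a container between environments
// (e.g. staging -> UAT) using a server-side copy.
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class EnvironmentBlobCopier
{
    public static async Task CopyContainerAsync(
        string sourceConnectionString, string targetConnectionString, string containerName)
    {
        var source = new BlobContainerClient(sourceConnectionString, containerName);
        var target = new BlobContainerClient(targetConnectionString, containerName);
        await target.CreateIfNotExistsAsync();

        await foreach (var item in source.GetBlobsAsync())
        {
            // StartCopyFromUriAsync asks the storage service to copy the blob;
            // the source must be readable by the target account (public container,
            // or append a SAS to the source URI).
            var sourceBlob = source.GetBlobClient(item.Name);
            await target.GetBlobClient(item.Name).StartCopyFromUriAsync(sourceBlob.Uri);
        }
    }
}
```

You would run this first against the staging account, test, and then repeat against the UAT and production accounts.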