I use Azure Blob storage to serve images. Sometimes I need to make an image temporarily unavailable (e.g. someone reported the image) and later either restore or delete it. How do I achieve this?
My container is public:
container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
This is how I made the blob:
CloudBlockBlob block_blob = ImagesContainer.GetBlockBlobReference(blob_name);
block_blob.Properties.ContentType = file.ContentType;
block_blob.Properties.CacheControl = "public, max-age=2592000"; // 30 days
block_blob.UploadFromStream(file.InputStream);
Is there something like:
block_blob.Properties.AccessType = private;
block_blob.SetProperties();
so that I can make it unavailable to everyone? And later I might recover it by setting the property to "public". I can't find any properties related to this usage.
Thanks very much.
There is no way to do this within a public container. I would recommend copying the blob to a private container and deleting it from the public one; you can then copy it back whenever you want to make it available again. Using the Copy Blob API (http://msdn.microsoft.com/en-us/library/azure/dd894037.aspx) between two containers in a single storage account is very fast, even for large files.
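For illustration, a rough sketch of that approach with the storage client library might look like the code below. The "quarantine" container name is just a placeholder, blobClient stands for the CloudBlobClient you already use to create ImagesContainer, and the copy method is StartCopyFromBlob in the 2.x client library (older versions call it CopyFromBlob, newer ones StartCopy), so adjust to the version you reference:

// "images" is the existing public container; "quarantine" is a private container
// in the same storage account (containers are private unless you set permissions).
CloudBlobContainer publicContainer = blobClient.GetContainerReference("images");
CloudBlobContainer privateContainer = blobClient.GetContainerReference("quarantine");
privateContainer.CreateIfNotExists();

CloudBlockBlob source = publicContainer.GetBlockBlobReference(blob_name);
CloudBlockBlob target = privateContainer.GetBlockBlobReference(blob_name);

// Hide the image: copy it into the private container, then remove the public copy.
// (Check target.CopyState before deleting the source if you want to be strict;
// within one storage account the copy normally completes almost immediately.)
target.StartCopyFromBlob(source);
source.DeleteIfExists();

// To restore the image later, copy it back the other way and delete the private copy.

Bear in mind that the "public, max-age=2592000" Cache-Control header set in the question means browsers and any CDN in front of the container may keep serving a cached copy for a while after you remove the public blob.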
Related
I want to know if there is a benefit to zipping files before sending them to Azure Blob Storage - strictly for transfer purposes. Put another way, will pre-zipping files make file transfers any faster when going to/from blob storage? Or does this automatically happen at the transport level by using gzip?
As of 12 August 2015, Azure Blob storage (when served through the Azure CDN) supports automatic GZip compression.
Compression method - supported compression methods are gzip/deflate/bzip2; a supported method must be set in the Accept-Encoding request header.
Improve performance by compressing files
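Outside the CDN scenario, if you do decide to pre-compress files yourself before upload, one common pattern is to gzip the content and set the blob's ContentEncoding property so that clients decompress it transparently. A minimal sketch, assuming the same storage client library used elsewhere in this thread, and assuming every consumer of the blob can handle gzip (blob storage serves the stored bytes as-is, with no content negotiation):

// Requires System.IO and System.IO.Compression.
public void UploadGzipped(CloudBlobContainer container, string blobName, Stream input, string contentType)
{
    using (var compressed = new MemoryStream())
    {
        using (var gzip = new GZipStream(compressed, CompressionMode.Compress, leaveOpen: true))
        {
            input.CopyTo(gzip);
        }
        compressed.Seek(0, SeekOrigin.Begin);

        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
        blob.Properties.ContentType = contentType;
        blob.Properties.ContentEncoding = "gzip";   // browsers decompress automatically
        blob.UploadFromStream(compressed);
    }
}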
UPDATE
I'm not sure how I originally got these results; all I can think is that I was reading them incorrectly. Everything I can find about Azure (from MSDN to the storage client code itself) now tells me that Azure does not gzip content for transfer purposes. I don't know under what circumstances I was able to get the results below, and I cannot reproduce them now. Needless to say, I'm very disappointed.
(THIS ANSWER IS INCORRECT, SEE THE UPDATE ABOVE) The answer is no, there is no transfer-speed benefit to zipping a file before sending it to blob storage. By turning on Fiddler, you can see that the transport level automatically gzips content across the wire; my Fiddler captures appeared to confirm this.
Edit 1 - Quick Clarifications for Gaurav
The byte array that comes back in code has a length of 386803, but the network card only saw 23505 bytes go by, because the response was gzipped by Azure. I didn't have to do anything for that to happen.
Here is the code I'm using to initiate the request from Blob Storage
public Byte[] Read(string containerName, string filename)
{
    CheckContainer(containerName);
    Initialize();
    // Retrieve a reference to a previously created container.
    CloudBlobContainer container = _blobClient.GetContainerReference(containerName);
    // Retrieve a reference to the requested blob.
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
    byte[] buffer;
    // Download the blob contents into an in-memory buffer.
    using (var stream = new MemoryStream())
    {
        blockBlob.DownloadToStream(stream);
        stream.Seek(0, SeekOrigin.Begin);
        buffer = new byte[stream.Length];
        stream.Read(buffer, 0, (int)stream.Length);
    }
    return buffer;
}
I'm working on a personal project to manage the members of my club; it's hosted on the free Azure package (for now at least), partly as an experiment to try out Azure. Part of creating their records is adding a photo, so I've got a Contact Card view that lets me see who they are, when they joined, and their photo.
I have installed ImageResizer and it's really easy to resize the 10 MP photos from my camera and save them to the local file system, but it seems that for Azure I need to use their blobs to Upload Pictures to Windows Azure Web Sites, and that's new to me. The ImageResizer documentation says that I need to use AzureReader2 in order to work with Azure blobs, but it isn't free. It also says in their best practices #5 to
Use dynamic resizing instead of pre-resizing your images.
Which is not what I was thinking; I was going to resize to 300x300 and 75x75 (for the thumbnail) when creating the user's record. But if I should be storing full-size images as blobs and dynamically resizing on the way out, can I just use the standard means to upload a blob into a container, and then, when I want to display the images, pass each one to ImageResizer to resize as required? That way I wouldn't need AzureReader2. Or have I misunderstood what it does / how it works?
Is there another way to consider?
I've not yet implemented cropping, but that's next to tackle once I've worked out how to actually store the images properly.
With some trepidation, I'm going to disagree with astaykov here. I believe you CAN use ImageResizer with Azure WITHOUT needing AzureReader2. Maybe I should qualify that by saying 'It works on my setup' :)
I'm using ImageResizer in an MVC 3 application. I have a standard Azure account with an images container.
Here's my test code for the view:
@using (Html.BeginForm("UploadPhoto", "BasicProfile", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
    <input type="file" name="file" />
    <input type="submit" value="OK" />
}
And here's the corresponding code in the Post Action method:
// This action handles the form POST and the upload
[HttpPost]
public ActionResult UploadPhoto(HttpPostedFileBase file)
{
    // Verify that the user selected a file
    if (file != null && file.ContentLength > 0)
    {
        string newGuid = Guid.NewGuid().ToString();
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
        // Create the blob client.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        // Retrieve reference to a previously created container.
        CloudBlobContainer container = blobClient.GetContainerReference("images");
        // Retrieve reference to the blob we want to create.
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(newGuid + ".jpg");
        // Populate our blob with contents from the uploaded file.
        using (var ms = new MemoryStream())
        {
            ImageResizer.ImageJob i = new ImageResizer.ImageJob(file.InputStream,
                ms, new ImageResizer.ResizeSettings("width=800;height=600;format=jpg;mode=max"));
            i.Build();
            blockBlob.Properties.ContentType = "image/jpeg";
            ms.Seek(0, SeekOrigin.Begin);
            blockBlob.UploadFromStream(ms);
        }
    }
    // Redirect back to the index action to show the form once again.
    return RedirectToAction("UploadPhoto");
}
This is 'rough and ready' code to test the theory and could certainly stand improvement, but it does work both locally and when deployed on Azure. I can also view the images I've uploaded, and they are correctly resized.
Hope this helps someone.
The answer to the concrete question:
If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?
is YES. As described in ImageResizer's documentation, that plugin is used to read, process, and serve images out of Blob storage. So there is no doubt: if you are going to use ImageResizer with blobs, AzureReader2 is the plugin you need to make things work; it takes care of reading and serving the blobs.
Although I question the ImageResizer team's familiarity with Windows Azure versioning, since they reference "Azure SDK v2" while the most current Azure SDK (at the time of writing) is 1.8. What they actually mean is the Azure Storage Client Library, which exists in versions 1.7 and 2.x; version 2.x is the recommended one and ships with Azure SDK 1.8. So do not go searching for an "Azure SDK 2.0": install the latest SDK, which is 1.8, and use the NuGet Package Manager to install the Azure Storage Client Library v2.0.x.
You can also upload resized versions to Azure. So you first upload the original image as a blob, say with the name /original/xxx.jpg; then you create a resized version and upload that to Azure with a name such as /thumbnail/xxx.jpg. If you want to create the resized versions on the fly or on a separate thread, you may need to temporarily save the original to disk.
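A rough sketch of that approach, reusing the ImageJob call and the blobClient and file variables from the answer above; the container name and the original/thumbnail prefixes are just examples:

// Upload the original as-is, then a pre-resized thumbnail under a parallel prefix.
string id = Guid.NewGuid().ToString();
CloudBlobContainer container = blobClient.GetContainerReference("images");

// 1. The original, exactly as received.
CloudBlockBlob original = container.GetBlockBlobReference("original/" + id + ".jpg");
original.Properties.ContentType = "image/jpeg";
original.UploadFromStream(file.InputStream);

// 2. A 75x75 thumbnail produced with ImageResizer, built in memory first.
using (var ms = new MemoryStream())
{
    file.InputStream.Seek(0, SeekOrigin.Begin); // rewind after the first upload
    new ImageResizer.ImageJob(file.InputStream, ms,
        new ImageResizer.ResizeSettings("width=75;height=75;format=jpg;mode=max")).Build();
    ms.Seek(0, SeekOrigin.Begin);

    CloudBlockBlob thumb = container.GetBlockBlobReference("thumbnail/" + id + ".jpg");
    thumb.Properties.ContentType = "image/jpeg";
    thumb.UploadFromStream(ms);
}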
The question I am asking is specifically because I don't want to use the AzureDirectory project. I am just trying something on my own.
cloudStorageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=xxxx;AccountKey=xxxxx");
blobClient = cloudStorageAccount.CreateCloudBlobClient();

List<CloudBlobContainer> containerList = new List<CloudBlobContainer>();
IEnumerable<CloudBlobContainer> containers = blobClient.ListContainers();
if (containers != null)
{
    foreach (var item in containers)
    {
        Console.WriteLine(item.Uri);
    }
}
// Used to test connectivity.

// State the file location of the index.
string indexLocation = containers.Last().Name.ToString();
Lucene.Net.Store.Directory dir = Lucene.Net.Store.FSDirectory.Open(indexLocation);

// Create an analyzer to process the text.
Lucene.Net.Analysis.Analyzer analyzer =
    new Lucene.Net.Analysis.Standard.StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);

// Create the index writer with the directory and analyzer defined.
bool indexExists = Lucene.Net.Index.IndexReader.IndexExists(dir);
Lucene.Net.Index.IndexWriter indexWriter = new Lucene.Net.Index.IndexWriter(
    dir, analyzer, !indexExists, Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED);

// Create a document and add in a couple of fields.
Lucene.Net.Documents.Document doc = new Lucene.Net.Documents.Document();
string path = "D:\\try.html";
TextReader reader = new FilterReader("D:\\try.html");
doc.Add(new Lucene.Net.Documents.Field("url", path, Lucene.Net.Documents.Field.Store.YES, Lucene.Net.Documents.Field.Index.NOT_ANALYZED));
doc.Add(new Lucene.Net.Documents.Field("content", reader.ReadToEnd(), Lucene.Net.Documents.Field.Store.YES, Lucene.Net.Documents.Field.Index.ANALYZED));

indexWriter.AddDocument(doc);
indexWriter.Optimize();
indexWriter.Commit();
indexWriter.Close();
Now the issue is that after indexing completes I am not able to see any files created inside the container. Can anybody help me out?
You're using FSDirectory there, which writes files to the local disk.
You're passing it the name of a blob storage container, but blob storage is a service exposed over a REST API and is not addressable directly from the file system, so FSDirectory is not going to be able to write your index into storage.
Your options are:
1. Mount a VHD disk on the machine, and store the VHD in blob storage. There are some instructions on how to do this here: http://blogs.msdn.com/b/avkashchauhan/archive/2011/04/15/mount-a-page-blob-vhd-in-any-windows-azure-vm-outside-any-web-worker-or-vm-role.aspx
2. Use AzureDirectory, which you refer to in your question. I have rebuilt AzureDirectory against the latest storage SDK: https://github.com/richorama/AzureDirectory
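For the AzureDirectory option, a minimal sketch of replacing the FSDirectory in the question might look like the following. The namespace and constructor shown (Lucene.Net.Store.Azure.AzureDirectory taking a storage account and a container name) are what I believe the project exposes, but forks differ, so verify against the copy you actually reference:

// Assumes a reference to the AzureDirectory project (Lucene.Net.Store.Azure).
CloudStorageAccount account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=http;AccountName=xxxx;AccountKey=xxxxx");

// The index segments end up as blobs in the "myindex" container instead of on local disk.
var azureDir = new Lucene.Net.Store.Azure.AzureDirectory(account, "myindex");

var analyzer = new Lucene.Net.Analysis.Standard.StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);
bool indexExists = Lucene.Net.Index.IndexReader.IndexExists(azureDir);
using (var writer = new Lucene.Net.Index.IndexWriter(azureDir, analyzer, !indexExists,
    Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED))
{
    // Add documents exactly as in the question's code, then Commit/Close as usual.
}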
Another alternative for people looking around: I wrote a directory implementation that uses the Azure shared cache (preview), which can be an alternative to AzureDirectory (albeit for bounded search sets):
https://github.com/ajorkowski/AzureDataCacheDirectory
I'm currently attempting to automate the deployment of an application to an Azure worker role by pulling a file into the role from blob storage and working with it via a batch script, also located in blob storage. I'm using OnStart to accomplish this. Here's a reduced version of my OnStart method:
Getting ready to pull the files down:
public override bool OnStart()
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
    container.CreateIfNotExist();
    CloudBlob file = container.GetBlobReference("file.bat");
Actually getting the files into the role:
    LocalResource localResource = RoleEnvironment.GetLocalResource("localStore");
    string filePath = System.IO.Path.Combine(localResource.RootPath, "file.bat");
    using (var fileStream = System.IO.File.OpenWrite(filePath))
    {
        file.DownloadToStream(fileStream);
    }
This is how I get the batch file and the dependencies into the role. My problem now is that I originally built the batch file assuming the other files would be dropped right on C:\ (for example C:\installer.exe, C:\archive.zip, etc.), but now the files are in local storage.
I'm thinking I can either A) somehow tell the batch file where local storage is by dynamically writing the script in OnStart, or B) change local storage to use C:\.
I'm not sure how to do either, or what the best thing to do here would be. Thoughts?
I would not change the local storage to use C:\ (how would you do that anyway?). Take a look at Steve's blog post: Using a Local Storage Resource From a Startup Task. He explains how you can get a LocalResource using PowerShell (and even call that script from a batch file).
And why not use the Windows Azure Bootstrapper? This is a little tool that helps you configure your role without having to write any code; you simply call it from a startup task and it can download files (also from blob storage, as you're doing), work with local resources, and more. For example:
bootstrapper.exe -get http://download.microsoft.com/download/F/3/1/F31EF055-3C46-4E35-AB7B-3261A303A3B6/AspNetMVC3ToolsUpdateSetup.exe -lr $lr(temp) -run $lr(temp)\AspNetMVC3ToolsUpdateSetup.exe -args /q
Note: instead of using absolute references in your batch file, make it use relative paths via %~dp0, which expands to the directory the batch file itself is running from.
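For option A in the question (telling the batch file where local storage is), one simple sketch is to hand the path over via an environment variable when you launch the script from OnStart; the variable name LOCALSTORE and the file names here are just illustrative:

// After downloading file.bat into the local resource (as in the code above),
// launch it and pass the local storage root through an environment variable.
LocalResource localResource = RoleEnvironment.GetLocalResource("localStore");
string batchPath = System.IO.Path.Combine(localResource.RootPath, "file.bat");

var psi = new System.Diagnostics.ProcessStartInfo("cmd.exe", "/c \"" + batchPath + "\"")
{
    UseShellExecute = false,              // required to set environment variables
    WorkingDirectory = localResource.RootPath
};
psi.EnvironmentVariables["LOCALSTORE"] = localResource.RootPath;

using (var proc = System.Diagnostics.Process.Start(psi))
{
    proc.WaitForExit();
}

Inside file.bat you can then refer to %LOCALSTORE%\installer.exe (or %~dp0installer.exe) instead of C:\installer.exe.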
I was trying to upload a 34 MB file to a blob, but I am getting this error:
XML Parsing Error: no element found
Location: http://127.0.0.1:83/Default.aspx
Line Number 1, Column 1:
What should I do? How do I solve this?
I am able to upload small files of around 500 KB, but I have a 34 MB file that needs to go into my blob container.
I tried it using:
protected void ButUpload_click(object sender, EventArgs e)
{
    // Store the uploaded file as a blob.
    if (uplFileUpload.HasFile)
    {
        name = uplFileUpload.FileName;
        // Get a reference to the cloud blob container.
        CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");
        // Set the name for the uploaded file.
        string UploadDocName = name;
        // Get the blob reference and set the metadata properties.
        CloudBlob blob = blobContainer.GetBlobReference(UploadDocName);
        blob.Metadata["FILETYPE"] = "text";
        blob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;
        // Upload the blob to storage.
        blob.UploadFromStream(uplFileUpload.FileContent);
    }
}
But I am not able to upload it. Can anyone tell me how to do that?
Blobs larger than 64 MB cannot be uploaded in a single operation; they must be uploaded as a set of blocks. You break the file into blocks, upload each block (associating each with a unique string identifier), and at the very end you post the list of block IDs to the blob to commit the entire batch in one go.
Uploading in blocks is also recommended for large blobs under 64 MB. It is very easy for a hiccup in the network connection or in internet routing to lose a frame or two during a very large upload, which will corrupt or invalidate the entire upload; with smaller blocks only the failed block needs to be retried, so you reduce your exposure to cosmic events.
More info in this discussion thread: http://social.msdn.microsoft.com/Forums/en-NZ/windowsazure/thread/f4575746-a695-40ff-9e49-ffe4c99b28c7
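For illustration, a minimal sketch of that block-based approach with the storage client library (PutBlock/PutBlockList), reusing the blobContainer and UploadDocName names from the question's code. Note that it needs a CloudBlockBlob rather than the plain CloudBlob used above, and that block IDs must be Base64 strings of equal length:

// Split the uploaded stream into 1 MB blocks and commit them as a single block blob.
CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference(UploadDocName);
blockBlob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;

const int blockSize = 1 * 1024 * 1024;
var blockIds = new List<string>();
var buffer = new byte[blockSize];
int blockNumber = 0;
int bytesRead;
Stream input = uplFileUpload.FileContent;

while ((bytesRead = input.Read(buffer, 0, blockSize)) > 0)
{
    // BitConverter always yields 4 bytes, so the Base64 IDs are all the same length.
    string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber++));
    using (var blockData = new MemoryStream(buffer, 0, bytesRead))
    {
        blockBlob.PutBlock(blockId, blockData, null);   // null = no per-block MD5 check
    }
    blockIds.Add(blockId);
}

// Commit the whole batch in one go; only now does the blob become visible.
blockBlob.PutBlockList(blockIds);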
I would start by dropping some logging into the project to try to track the problem down; it may not be happening where you think. There might also be a permissions error. Try adding some dummy data to the database; if that still fails, you have found a likely culprit.
But track it down yourself with some debugging, logging, and code review. I bet you can get to the bottom of the problem sooner that way, and it will also help make your code more robust.
You can use blobs here. I think it's an issue with your web request size limit. You can change this in web.config by increasing the maxRequestLength attribute on the <httpRuntime> element. Also, if you are sending chunks of 500 KB you are wasting bandwidth and bringing down performance; send bigger chunks of 1-2 MB each. See my Silverlight or HTML5 based upload control for chunked uploads: Pick Your Azure File Upload Control: Silverlight and TPL or HTML5 and AJAX
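For reference, the relevant web.config settings might look something like this; maxRequestLength is in kilobytes, while the IIS7+ maxAllowedContentLength limit is in bytes (the values here allow roughly 100 MB and are only an example):

<system.web>
  <!-- Maximum request size in KB (here ~100 MB), plus a longer timeout for big uploads. -->
  <httpRuntime maxRequestLength="102400" executionTimeout="3600" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- IIS7+ limit, in bytes. -->
      <requestLimits maxAllowedContentLength="104857600" />
    </requestFiltering>
  </security>
</system.webServer>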
Use the Blob Transfer Utility to download and upload all your blob files.
It's a tool for handling thousands of (small or large) blob transfers in an effective way.
Binaries and source code, here: http://bit.ly/blobtransfer