Azure Blob Storage Download

I am learning Azure. I have successfully uploaded and listed files in my containers. When I run the code below on my home PC everything works fine with no exceptions, but when I run it on my work PC I catch an exception that states:
Blob data corrupted. Incorrect number of bytes received '12288' / '-1'
The file seems to download to my local drive just fine; I just cannot figure out why the exact same code behaves differently on two different PCs.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("My connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob blockBlob = container.GetBlockBlobReference("ARCS.TXT");

using (var fileStream = System.IO.File.OpenWrite(@"c:\a\ARCS.txt"))
{
    blockBlob.DownloadToStream(fileStream);
}

Maybe your organization's firewall is blocking a specific port. I have written a blog post that discusses similar port-related issues; please check whether it applies to your case: http://nabaruns.blogspot.in/2012/11/common-port-related-troubleshoot-in.html
Regards
Nabarun

Your code looks correct.
That is a weird issue, and it's even weirder that the file downloads properly despite the error. I would recommend trying Azure Storage Explorer on both of your machines.
If Azure Storage Explorer works fine on both machines, the next step would be to check the SDK version on each machine; such errors can occur with older versions of the SDK.
You may also want to try the command-line downloader to troubleshoot your issue.
Note - Azure Storage Explorer and the command-line downloader are open source. If downloads through them work fine, you can also grab their code and debug through it.

I'd recommend trying CloudBlob.DownloadToFile or CloudBlob.DownloadToStream instead of the CloudBlockBlob methods.
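For illustration, a minimal sketch of that approach, assuming the classic storage client library where CloudBlob exposes DownloadToFile (connection string, container, and path mirror the question):
// Minimal sketch, assuming the classic Azure Storage Client Library.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("My connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlob blob = container.GetBlobReference("ARCS.TXT");

// Let the library manage the output file instead of an OpenWrite stream.
blob.DownloadToFile(@"c:\a\ARCS.txt");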

Related

How to upload a large file in chunks with parallelism in Azure SDK v12?

In Azure SDK v11, we had the option to specify the ParallelOperationThreadCount through the BlobRequestOptions. In Azure SDK v12, I see that BlobClientOptions does not have this, and on BlockBlobClient (previously CloudBlockBlob in SDK v11) parallelism is only mentioned for the download methods.
We have three files: one 200MB, one 150MB, and one 20MB. For each file, we want it split into blocks and have those uploaded in parallel. Is this done automatically by BlockBlobClient? If possible, we would also like to upload the three files in parallel.
You can also make use of StorageTransferOptions in v12.
Sample code:
BlobServiceClient blobServiceClient = new BlobServiceClient(conn_str);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("xxx");
BlobClient blobClient = containerClient.GetBlobClient("xxx");

// Set the transfer options here.
StorageTransferOptions transferOptions = new StorageTransferOptions();
// transferOptions.MaximumConcurrency or other settings.

blobClient.Upload("xxx", transferOptions: transferOptions);
By the way, for uploading large files you can also use the Microsoft Azure Storage Data Movement Library for better performance.
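For example, a rough sketch with the Data Movement Library (a sketch only; it works with the classic CloudBlockBlob types rather than the v12 clients, the exact namespaces depend on the library version, and the container, blob, and file names are placeholders):
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

// Assumes this runs inside an async method.
CloudStorageAccount account = CloudStorageAccount.Parse(conn_str);
CloudBlockBlob destBlob = account.CreateCloudBlobClient()
    .GetContainerReference("xxx")
    .GetBlockBlobReference("largefile.bin");

// Number of parallel block transfers used by the library.
TransferManager.Configurations.ParallelOperations = 16;

await TransferManager.UploadAsync(@"C:\local\largefile.bin", destBlob);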
Using Fiddler, I verified that BlockBlobClient does indeed upload the files in chunks without needing any extra work. To process each of the major files in parallel, I simply created a task for each one, added it to a list of tasks, and used await Task.WhenAll(tasks).
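For reference, a sketch of that approach, assuming Azure.Storage.Blobs v12 (the container and file names are placeholders):
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Storage;
using Azure.Storage.Blobs;

public static async Task UploadAllAsync(string conn_str)
{
    var containerClient = new BlobContainerClient(conn_str, "mycontainer");

    // Let the client split each blob into blocks and upload them concurrently.
    var transferOptions = new StorageTransferOptions
    {
        MaximumConcurrency = 8,               // parallel block uploads per blob
        MaximumTransferSize = 4 * 1024 * 1024 // block size in bytes
    };

    var files = new[] { "file200mb.bin", "file150mb.bin", "file20mb.bin" };
    var tasks = new List<Task>();

    foreach (var file in files)
    {
        BlobClient blobClient = containerClient.GetBlobClient(file);
        // One upload task per file so the three files also run in parallel.
        tasks.Add(blobClient.UploadAsync(file, transferOptions: transferOptions));
    }

    await Task.WhenAll(tasks);
}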

If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?

I'm working on a personal project to manage users of my club, it's hosted on the free Azure package (for now at least), partly as an experiment to try out Azure. Part of creating their records is to add a photo, so I've got a Contact Card view that lets me see who they are, when they came and a photo.
I have installed ImageResizer and it's really easy to resize the 10MP photos from my camera and save them to the file system locally, but it seems that for Azure I need to use their Blobs to Upload Pictures to Windows Azure Web Sites, and that's new to me. The documentation for ImageResizer says that I need to use AzureReader2 in order to work with Azure blobs, but it isn't free. It also says in their best practices #5 to
Use dynamic resizing instead of pre-resizing your images.
Which is not what I was thinking; I was going to resize to 300x300 and 75x75 (for the thumbnail) when creating the user's record. But if I should be storing full-size images as blobs and dynamically resizing them on the way out, can I just use standard means to upload a blob into a container to save it to Azure, and then, when I want to display the images, use ImageResizer and pass it each image to resize as required? That way I wouldn't need AzureReader2. Or have I misunderstood what it does / how it works?
Is there another way to consider?
I've not yet implemented cropping, but that's next to tackle once I've worked out how to actually store the images properly.
With some trepidation, I'm going to disagree with astaykov here. I believe you CAN use ImageResizer with Azure WITHOUT needing AzureReader2. Maybe I should qualify that by saying 'It works on my setup' :)
I'm using ImageResizer in an MVC 3 application. I have a standard Azure account with an images container.
Here's my test code for the view:
@using (Html.BeginForm("UploadPhoto", "BasicProfile", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
    <input type="file" name="file" />
    <input type="submit" value="OK" />
}
And here's the corresponding code in the Post Action method:
// This action handles the form POST and the upload
[HttpPost]
public ActionResult UploadPhoto(HttpPostedFileBase file)
{
    // Verify that the user selected a file
    if (file != null && file.ContentLength > 0)
    {
        string newGuid = Guid.NewGuid().ToString();
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);

        // Create the blob client.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

        // Retrieve reference to a previously created container.
        CloudBlobContainer container = blobClient.GetContainerReference("images");

        // Retrieve reference to the blob we want to create.
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(newGuid + ".jpg");

        // Populate our blob with contents from the uploaded file.
        using (var ms = new MemoryStream())
        {
            ImageResizer.ImageJob i = new ImageResizer.ImageJob(file.InputStream,
                ms, new ImageResizer.ResizeSettings("width=800;height=600;format=jpg;mode=max"));
            i.Build();
            blockBlob.Properties.ContentType = "image/jpeg";
            ms.Seek(0, SeekOrigin.Begin);
            blockBlob.UploadFromStream(ms);
        }
    }

    // redirect back to the index action to show the form once again
    return RedirectToAction("UploadPhoto");
}
This is 'rough and ready' code to test the theory and could certainly stand improvement, but it does work both locally and when deployed on Azure. I can also view the images I've uploaded, which are correctly resized.
Hope this helps someone.
The answer to the concrete question:
If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?
is YES. As described in ImageResizer's documentation, that plugin is used to read/process/serve images out of Blob Storage. So there is no doubt: if you are going to use ImageResizer, AzureReader2 is the plugin you need to make things work. It will take care of blob uploads/serving.
Although I question the ImageResizer team's competency on Windows Azure, since they reference Azure SDK v2.0 while the most current version of the Azure SDK is 1.8. What they actually mean is the Azure Storage Client Library, which has versions 1.7 and 2.x; version 2.x is the recommended one and ships with Azure SDK 1.8. So do not go searching for Azure SDK 2.0; install the latest one, which is 1.8. And by the way, use the NuGet Package Manager to install the Azure Storage Library v2.0.x.
You can also upload resized versions to Azure. First upload the original image as a blob, say with the name /original/xxx.jpg; then create a resized version and upload it with a name like /thumbnail/xxx.jpg. If you want to create the resized versions on the fly or on a separate thread, you may need to temporarily save the original to disk.
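For illustration, a rough sketch of that idea, reusing the classic storage client and ImageResizer calls from the answer above (the /original and /thumbnail prefixes are placeholders; container, file, and newGuid come from that answer's code):
// Store the untouched upload under an "original/" prefix.
CloudBlockBlob original = container.GetBlockBlobReference("original/" + newGuid + ".jpg");
original.Properties.ContentType = "image/jpeg";
original.UploadFromStream(file.InputStream);

// Build and store a thumbnail alongside it under "thumbnail/".
using (var ms = new MemoryStream())
{
    file.InputStream.Seek(0, SeekOrigin.Begin);
    new ImageResizer.ImageJob(file.InputStream, ms,
        new ImageResizer.ResizeSettings("width=75;height=75;format=jpg;mode=max")).Build();

    ms.Seek(0, SeekOrigin.Begin);
    CloudBlockBlob thumbnail = container.GetBlockBlobReference("thumbnail/" + newGuid + ".jpg");
    thumbnail.Properties.ContentType = "image/jpeg";
    thumbnail.UploadFromStream(ms);
}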

Azure Library for Lucene.net Error. File Not Found After write.lock created

When executing the following code against an empty Azure container, I get a file-not-found error (segments.gen; The specified blob does not exist.).
AzureDirectory azureDirectory = new AzureDirectory(account, "audiobookindex"); // <-- audiobookindex is the name of the blob storage container on my Azure account

// Create the index writer
IndexWriter indexWriter = new IndexWriter(azureDirectory, new StandardAnalyzer(), true);
It seems to be failing on OpenInput inside the Azure Library for Lucene.Net assembly. However, I don't understand why it's even calling that method; I would think it would just try to create the file.
Also, the assembly and code ARE hitting the container, because a write.lock file is created that I can see in the container.
Any suggestions?
This should solve the problem. The examples out there were developed with older APIs, older framework versions, etc. I found the above solution, which works fine! No need to interfere with the debugger ;)

Automating App Deployment in Azure with LocalResource

I'm currently attempting to automate the deployment of an application to an Azure worker role by pulling a file into the role from blob storage and working with it via a batch script, also located in blob storage. I'm using OnStart to accomplish this. Here's a reduced version of my OnStart method:
Getting ready to pull the files down:
public override bool OnStart()
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
    container.CreateIfNotExist();
    CloudBlob file = container.GetBlobReference("file.bat");
Actually getting the files into the role:
LocalResource localResource = RoleEnvironment.GetLocalResource("localStore");
string filePath = System.IO.Path.Combine(localResource.RootPath, "file.bat");

using (var fileStream = System.IO.File.OpenWrite(filePath))
{
    file.DownloadToStream(fileStream);
}
This is how I get the batch file and its dependencies into the role. My problem now is that I originally built the batch file assuming the other files would be dropped right on C:\ (for example, C:\installer.exe, C:\archive.zip, etc.), but now the files are in local storage.
I'm thinking I can either A) somehow tell the batch file where local storage is by dynamically writing the script in OnStart, or B) change local storage to use C:\.
I'm not sure how to do either, or what the best thing to do here would be. Thoughts?
I would not change the local storage to use C:\ (how would you do that anyway?). Take a look at Steve's blog post: Using a Local Storage Resource From a Startup Task. He explains how you can get a LocalResource using PowerShell (and even call that script from a batch file).
And why not use the Windows Azure Bootstrapper? It's a little tool that can help you with the configuration of your role without having to write any code; you simply call it from a startup task and it can download files (also from blob storage, as you're doing), work with local resources, ...
bootstrapper.exe -get http://download.microsoft.com/download/F/3/1/F31EF055-3C46-4E35-AB7B-3261A303A3B6/AspNetMVC3ToolsUpdateSetup.exe -lr $lr(temp) -run $lr(temp)\AspNetMVC3ToolsUpdateSetup.exe -args /q
Note: Instead of using absolute references in your batch file, make it use relative paths using %~dp0
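If you do go with option A from the question and drive the batch file from the role's C# code instead, a hypothetical sketch (variable names follow the question's OnStart snippet) could pass the local resource path as an argument, which the script reads as %1:
// Hypothetical: launch the downloaded file.bat and hand it the LocalResource root
// so the script never has to assume C:\.
var startInfo = new System.Diagnostics.ProcessStartInfo
{
    FileName = filePath,                              // the downloaded file.bat
    Arguments = "\"" + localResource.RootPath + "\"", // available to the script as %1
    WorkingDirectory = localResource.RootPath,
    UseShellExecute = false
};

using (var process = System.Diagnostics.Process.Start(startInfo))
{
    process.WaitForExit();
}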

Azure: Downloading from Blob Storage results in permissions error?

I’ve uploaded some files to Blob storage, and now I’m using the OnStart method to retrieve those files and run them. Right now I’m working locally.
Using the following code:
using (var fileStream = System.IO.File.OpenWrite(@"C:\testfolder"))
{
    blob.DownloadToStream(fileStream);
}
Results in an “Access to the path 'C:\testfolder' is denied.” error.
What do you think is causing this? And - will this be an issue once the project is actually pushed up to Azure? I can change permissions locally, but I'm hoping that once it's actually in a live worker role, it won't be an issue.
Any help would be awesome :)
Scratch that - it looks like the path should specify the file name, not just the folder. I've changed it to C:\testfolder\test.txt and it works just fine :).
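In other words, roughly (same code as in the question, just pointing OpenWrite at a file rather than a directory):
// OpenWrite needs a full file path, not a directory.
using (var fileStream = System.IO.File.OpenWrite(@"C:\testfolder\test.txt"))
{
    blob.DownloadToStream(fileStream);
}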
