Upload 2GB file to Azure blob - azure

I have a zipped 2GB file that I need to upload to Windows Azure over my home cable connection. How long should I expect the upload to take? Has anyone averaged their upload times for files of this size?
Also, when I do the following to create a blob for uploading a file:
CloudBlob _blob = _container.GetBlobReference("file1");
is it creating a CloudBlockBlob or a CloudPageBlob by default? I have been using the above code to upload files, and it has been quite slow.

CloudBlob _blob = _container.GetBlobReference("file1");
It does not create either a CloudBlockBlob or a CloudPageBlob by default; it only returns a generic CloudBlob reference.
If you want to use CloudBlockBlob (Azure SDK v2.0):
// Retrieve reference to a blob named "myblob".
CloudBlockBlob blob = container.GetBlockBlobReference("myblob");
Now split your file into small pieces (4 MB max) and upload each piece like this:
blob.PutBlock(blockId, memoryStream, null);
where blockId is a base64-encoded block ID that identifies the block, and memoryStream is a stream that provides the data for the block. Once all blocks are uploaded, commit them with PutBlockList. See the MSDN documentation for details.
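For completeness, here is a minimal sketch of the full block-upload loop, including the final commit. The local path, blob name, and block-ID scheme are illustrative assumptions, not from the original post (requires System.IO, System.Text and System.Collections.Generic):

// Sketch: upload a large file as 4 MB blocks, then commit them with PutBlockList.
CloudBlockBlob blob = container.GetBlockBlobReference("file1.zip");
const int blockSize = 4 * 1024 * 1024; // 4 MB per block
var blockIds = new List<string>();
using (var fileStream = System.IO.File.OpenRead(@"D:\file1.zip"))
{
    var buffer = new byte[blockSize];
    int bytesRead;
    int blockNumber = 0;
    while ((bytesRead = fileStream.Read(buffer, 0, blockSize)) > 0)
    {
        // All block IDs in a blob must have the same length, hence the fixed-width counter.
        string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
        blockIds.Add(blockId);
        using (var memoryStream = new MemoryStream(buffer, 0, bytesRead))
        {
            blob.PutBlock(blockId, memoryStream, null);
        }
        blockNumber++;
    }
}
// Nothing is visible in the blob until the block list is committed.
blob.PutBlockList(blockIds);

Committing with PutBlockList is what turns the uploaded blocks into the blob's content; uploading blocks in parallel is also a common way to speed up large transfers.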

Related

Is there a default image type for Azure Blob Storage?

When I upload a jpg image to Azure Blob Storage and then pull it back out, it always comes back as a png. Is this a safe default to assume? I could not find any documentation. I can easily convert it back to a jpg in my byte conversion, since I store the extension when I pull it out of the container, but can I safely assume that any image stored in Azure Blob Storage comes back as a png?
but can I safely assume any image type that is stored in Azure Blob Storage is done as a png
As far as I know, blob storage does not have a default file type. Each blob carries a Content-Type property that tells clients what kind of content it holds. If you set the blob name to xxx.jpg, it will be stored as xxx.jpg.
Here is an example.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connection string");
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Create the container if it doesn't already exist.
container.CreateIfNotExists();
CloudBlockBlob blockBlob = container.GetBlockBlobReference("brandotest.jpg");
// Create or overwrite the "myblob" blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\1.PNG"))
{
blockBlob.UploadFromStream(fileStream);
}
Result: if you look at the container in Storage Explorer, you will find the blob stored as brandotest.jpg.
I suspect the type changes because you set the downloaded file's name yourself when you download it with code, like this (here the download path is set to myfile.png):
CloudBlockBlob blockBlob = container.GetBlockBlobReference("brandotest.jpg");
// Save blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite(@"D:\authserver\myfile.png"))
{
blockBlob.DownloadToStream(fileStream);
}
The result will be myfile.png.
In short, the type of the file you download locally or upload to the blob is determined by the name you specify in your code when you get or upload it.
It is best not to assume that png is the default; it depends on your upload mechanism and the name you give the file during the upload process.
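If you also want the blob to advertise the correct MIME type when served over HTTP, you can set the ContentType property explicitly. A minimal sketch, with the file name and type chosen only for illustration:

CloudBlockBlob blockBlob = container.GetBlockBlobReference("brandotest.jpg");
using (var fileStream = System.IO.File.OpenRead(@"D:\1.jpg"))
{
    blockBlob.UploadFromStream(fileStream);
}
// Set the Content-Type so HTTP clients treat the blob as a JPEG.
blockBlob.Properties.ContentType = "image/jpeg";
blockBlob.SetProperties();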

How to download files with white space on name on Azure Blob Storage?

I'm trying to download a file from this URL:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test Document.pdf
The browser is changing the URL to this:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test%20Document.pdf
My file in the blob storage has the name: Test Document.pdf
So, when I click to download, Azure says the file does not exist:
The specified resource does not exist.
Probably because the browser is trying to get the file with "%20" in the name.
How I can solve this?
As far as I know, if you upload a file whose name contains a space using the Azure Storage API, the name is automatically URL-encoded (the space is replaced with %20) in the blob's URL.
See the example below, where I uploaded Test Document.pdf to blob storage:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("brando");
// Create the container if it doesn't already exist.
container.CreateIfNotExists();
// Retrieve reference to a blob named "myblob".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("Test Document.pdf");
// Create or overwrite the "myblob" blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\Test Document.pdf"))
{
blockBlob.UploadFromStream(fileStream);
}
Then I suggest you check the blob's URL in its properties, either in Storage Explorer (right-click the blob and choose Properties) or in the Azure portal. You will find the space replaced with %20 in that URL.
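If you build download links in your own code, the SDK already gives you the percent-encoded address, so you do not need to encode the name yourself. A minimal sketch (the account and container are whatever you used when uploading):

CloudBlockBlob blockBlob = container.GetBlockBlobReference("Test Document.pdf");
// The Uri property is already percent-encoded, e.g.
// https://<account>.blob.core.windows.net/brando/Test%20Document.pdf
string downloadUrl = blockBlob.Uri.AbsoluteUri;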

Azure blob storage split blobs and store on Cloud file share

I am uploading zip files to Azure Blob Storage which are relatively huge.
Now I need to go to those containers, get the blob reference, and store that zip file as multiple smaller files on an Azure file share. I am not sure how to proceed with that.
var storageAccount = AzureUtility.CreateStorageAccountFromConnectionString();
var container = AzureUtility.GetAzureCloudBlobContainerReference("fabcd");
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("sample-share");
share.CreateIfNotExists();
CloudBlockBlob sourceBlob = container.GetBlockBlobReference("Test.zip");
var file = share.GetRootDirectoryReference().GetFileReference("Test.zip").Exists();
if(file)
{
//split and share
}
Any suggestions?
My understanding is that you're trying to download a blob and then divide and upload it into multiple files.
There are two options here, both of which are exposed through the .NET API. The blob API exposes both DownloadRangeTo* and DownloadTo* methods, and the file API exposes UploadFrom* methods. If you know in advance the divisions you want to make, you can download each range and then upload it to a file. Otherwise you can download the entire blob, divide it client side, and then upload the divisions.
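A minimal sketch of the range-based approach, assuming fixed-size chunks and an illustrative part-naming scheme (note that the resulting parts are raw byte ranges of the original zip, not standalone zip archives):

const long chunkSize = 100 * 1024 * 1024; // 100 MB per part, chosen arbitrarily
sourceBlob.FetchAttributes(); // populates sourceBlob.Properties.Length
long blobLength = sourceBlob.Properties.Length;
var rootDir = share.GetRootDirectoryReference();
int part = 0;
for (long offset = 0; offset < blobLength; offset += chunkSize, part++)
{
    long bytesToRead = Math.Min(chunkSize, blobLength - offset);
    using (var ms = new MemoryStream())
    {
        // Download only this byte range of the blob...
        sourceBlob.DownloadRangeToStream(ms, offset, bytesToRead);
        ms.Position = 0;
        // ...and upload it as its own file on the share.
        CloudFile chunkFile = rootDir.GetFileReference("Test.part" + part + ".zip");
        chunkFile.UploadFromStream(ms);
    }
}

Each chunk is buffered in memory here; for very large chunks you may prefer to stream through a temporary file instead.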

Azure Storage Emulator - How to specify hierarchical path?

I'm developing locally with VS 2013 & Azure Storage Emulator 4.1.0.0.
I'd like to upload a file but specify the Blob's hierarchical path to the file.
const string ImageToUpload = @"C:\Users\Me\Documents\Visual Studio 2013\Projects\AzureCloudService1\DataBlobStorage1\HelloWorld2.png";
CloudBlockBlob blockBlob = container.GetBlockBlobReference(ImageToUpload);
await blockBlob.UploadFromFileAsync(ImageToUpload, FileMode.Open).ConfigureAwait(false);
This code does upload the file, but the blob ends up named with the source file's full physical path. How do I specify the hierarchical path I actually want in Azure Blob Storage?
Thanks!
Imagine you want the path to be AzureCloudService1/DataBlobStorage1/HelloWorld2.png in your container; this is what you would have to do:
CloudBlockBlob blockBlob = container.GetBlockBlobReference("AzureCloudService1/DataBlobStorage1/HelloWorld2.png");
Ah... GetBlockBlobReference is what sets the blob name, and it can be named as desired. The actual file can be read into a fileStream from some other physical location and uploaded into the blob, like so:
string fileName = HttpContext.Current.Server.MapPath("~/App_Data/Clients/Clientname/" + "HelloWorld.png");
CloudBlockBlob blockBlob = container.GetBlockBlobReference("Clients/ClientName/HelloWorld.png");
using (var fileStream = File.OpenRead(fileName))
{
await blockBlob.UploadFromStreamAsync(fileStream);
}
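To confirm the virtual hierarchy afterwards, you can list the blobs under that prefix; a minimal sketch using the illustrative path from above:

// List everything under the "Clients/ClientName/" virtual directory.
CloudBlobDirectory directory = container.GetDirectoryReference("Clients/ClientName");
foreach (IListBlobItem item in directory.ListBlobs())
{
    Console.WriteLine(item.Uri); // e.g. .../mycontainer/Clients/ClientName/HelloWorld.png
}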

Download/Upload file from AzureBlob to Nodejs server

I want to download a file from Azure Blob Storage to the Node.js server's file system, do some processing on that file on the Node.js side, and then upload the updated copy of the file from the file system back to Blob Storage.
Please suggest a way to do this, or share how you have implemented the same.
Thanks :)
Downloading a Blob using Node.js:
getBlobToFile - writes the blob contents to file
getBlobToStream - writes the blob contents to a stream
getBlobToText - writes the blob contents to a string
createReadStream - provides a stream to read from the blob
The following example demonstrates using getBlobToStream to download the contents of the myblob blob and store it to the output.txt file using a stream:
var fs = require('fs');
var azure = require('azure-storage');

// createBlobService() with no arguments picks up credentials from the standard environment variables.
var blobSvc = azure.createBlobService();

blobSvc.getBlobToStream('mycontainer', 'myblob', fs.createWriteStream('output.txt'), function(error, result, response){
  if(!error){
    // blob retrieved and written to output.txt
  }
});
Uploading a Blob using Node.js:
A blob can be either block-based or page-based. Block blobs let you upload large amounts of data efficiently, while page blobs are optimized for random read/write operations. For more information, see Understanding block blobs and page blobs.
Block blobs
To upload data to a block blob, use the following:
createBlockBlobFromLocalFile - creates a new block blob and uploads the contents of a file.
createBlockBlobFromStream - creates a new block blob and uploads the contents of a stream.
createBlockBlobFromText - creates a new block blob and uploads the contents of a string.
createWriteStreamToBlockBlob - provides a write stream to a block blob.
The following example uploads the contents of the test.txt file into myblob.
blobSvc.createBlockBlobFromLocalFile('mycontainer', 'myblob', 'test.txt', function(error, result, response){
if(!error){
// file uploaded
}
});
