Can't rename blob file in Azure Storage

I am trying to rename a blob in Azure Storage via the .NET API, but after a day of trying I am still unable to get it working :(
Here is how I am doing it: by creating a new blob and copying from the old one.
var newBlob = blobContainer.GetBlobReferenceFromServer(filename);
newBlob.StartCopyFromBlob(blob.Uri);
blob.Delete();
The new blob does not exist on the server yet, so I get an HTTP 404 (Not Found) exception.
Here is a working example that I found, but it is for the old .NET Storage API.
CloudBlob blob = container.GetBlobReference(sourceBlobName);
CloudBlob newBlob = container.GetBlobReference(destBlobName);
newBlob.UploadByteArray(new byte[] { });
newBlob.CopyFromBlob(blob);
blob.Delete();
Currently I am using the 2.0 API. Where am I making a mistake?

I see that you're using the GetBlobReferenceFromServer method to create an instance of the new blob object. For this method to work, the blob must already exist, which will not be the case here since you're trying to rename (i.e. create) it.
What you could do instead is call GetBlobReferenceFromServer on the old blob, check its type, and then create an instance of either CloudBlockBlob or CloudPageBlob and perform the copy operation on that. Your code would look something like this:
CloudBlobContainer blobContainer = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
var blob = blobContainer.GetBlobReferenceFromServer("oldblobname");
ICloudBlob newBlob = null;
if (blob is CloudBlockBlob)
{
newBlob = blobContainer.GetBlockBlobReference("newblobname");
}
else
{
newBlob = blobContainer.GetPageBlobReference("newblobname");
}
//Initiate blob copy
newBlob.StartCopyFromBlob(blob.Uri);
//Now wait in the loop for the copy operation to finish
while (true)
{
newBlob.FetchAttributes();
if (newBlob.CopyState.Status != CopyStatus.Pending)
{
break;
}
//Sleep for a second before checking the copy status again
System.Threading.Thread.Sleep(1000);
}
blob.Delete();

The code in the OP was almost fine, except that an asynchronous copy method was called. The simplest code in the new API should be:
var oldBlob = cloudBlobClient.GetBlobReferenceFromServer(oldBlobUri);
var newBlob = container.GetBlobReference("newblobname");
newBlob.CopyFromBlob(oldBlob);
oldBlob.Delete();

Related

Read the blob content on Azure Storage

I'm using the Microsoft.Azure.Storage.Blob NuGet package, trying to get the list of the blobs in a container and then read their content.
With the ListBlobs() method I see all the blobs.
Every blob item has a URI, but I cannot see the blob name that I need for GetBlobReferenceFromServer().
For this reason the blob name is a constant in following sample code.
What is the right way? Do I have to split and parse the URI to find the blob name?
Do I have to use another method?
Microsoft.Azure.Storage.Blob.CloudBlobContainer container =
new Microsoft.Azure.Storage.Blob.CloudBlobContainer(new Uri("https://myaccount.blob.core.windows.net/containername"),
new Microsoft.Azure.Storage.Auth.StorageCredentials("myaccount", "**********=="));
IEnumerable<Microsoft.Azure.Storage.Blob.IListBlobItem> blobs = container.ListBlobs();
foreach (var blobItem in blobs)
{
//string blobUri = blobItem.Uri.ToString();
Microsoft.Azure.Storage.Blob.ICloudBlob blockBlob = container.GetBlobReferenceFromServer("blobname");
MemoryStream downloadStream = new MemoryStream();
blockBlob.DownloadToStream(downloadStream);
string blobContent = Encoding.UTF8.GetString(downloadStream.ToArray());
}
With the ListBlobs() method I see all the blobs. Every blob item has a URI but I cannot see the blob name that I need for GetBlobReferenceFromServer().
The reason for this is that the ListBlobs method returns an enumerable of type IListBlobItem, which does not have a Name property. To get the name of a blob, cast the item to either CloudBlob or CloudBlockBlob (both implement this interface); you can then read the blob's Name and use it with the GetBlobReferenceFromServer method.
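A minimal sketch of that cast (assuming the container holds only blobs, not virtual directories):
foreach (var blobItem in container.ListBlobs())
{
    // CloudBlob exposes the Name property that IListBlobItem lacks
    var cloudBlob = (CloudBlob)blobItem;
    string blobName = cloudBlob.Name;
}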
BTW, once you have listed the blobs you don't really need to call the GetBlobReferenceFromServer method at all, as you already have all the information about each blob from the listing. GetBlobReferenceFromServer makes another request to storage to fetch the same set of properties you already have.
So your code can simply be:
foreach (var blobItem in blobs)
{
var blockBlob = (CloudBlockBlob) blobItem;
MemoryStream downloadStream = new MemoryStream();
blockBlob.DownloadToStream(downloadStream);
string blobContent = Encoding.UTF8.GetString(downloadStream.ToArray());
}
Or, if you don't want to go down the casting route, you can simply create an instance of CloudBlockBlob using the URI you got as part of the listing.
Something like:
foreach (var blobItem in blobs)
{
var blockBlob = new CloudBlockBlob(blobItem.Uri, container.ServiceClient);
MemoryStream downloadStream = new MemoryStream();
blockBlob.DownloadToStream(downloadStream);
string blobContent = Encoding.UTF8.GetString(downloadStream.ToArray());
}

Why can't I download an Azure Blob using an asp.net core application published to Azure server

I am trying to download a blob from an Azure storage account container. When I run the application locally, I get the correct "Downloads" folder, C:\Users\xxxx\Downloads. When I publish the application to Azure and try to download the file, I get an error. I have tried various "KnownFolders"; some return empty strings, others return folders on the Azure server. I am able to upload files fine and list the files in a container, but I am struggling with downloading a file.
string conn = configuration.GetValue<string>("AppSettings:AzureContainerConn");
CloudStorageAccount storageAcct = CloudStorageAccount.Parse(conn);
CloudBlobClient blobClient = storageAcct.CreateCloudBlobClient();
CloudBlobContainer container =
blobClient.GetContainerReference(containerName);
Uri uriObj = new Uri(uri);
string filename = Path.GetFileName(uriObj.LocalPath);
// get block blob reference
CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
Stream blobStream = await blockBlob.OpenReadAsync();
string _filepath = _knownfolder.Path + "\\projectfiles\\";
Directory.CreateDirectory(_filepath);
_filepath = _filepath + filename;
Stream _file = new MemoryStream();
try
{
_file = File.Open(_filepath, FileMode.Create, FileAccess.Write);
await blobStream.CopyToAsync(_file);
}
finally
{
_file.Dispose();
}
The expected end result is that the file ends up in a folder within the user's "Downloads" folder.
Since you're talking about publishing to Azure, the code is presumably from a web application, right? The code for the web application runs on the server, which means it is trying to download the blob to the server running the web application.
To present a download link to the user so they can download the file, use a FileStreamResult, which
Represents an ActionResult that when executed will write a file from a stream to the response.
A (pseudo code) example:
[HttpGet]
public FileStreamResult GetFile()
{
var stream = new MemoryStream();
CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
blockBlob.DownloadToStream(stream);
stream.Seek(0, SeekOrigin.Begin);
return new FileStreamResult(stream, new MediaTypeHeaderValue("text/plain"))
{
FileDownloadName = "someFile.txt"
};
}
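Inside an ASP.NET Core controller you can also lean on the built-in File() helper rather than constructing the FileStreamResult yourself; a sketch under the same assumptions (container and filename come from your own code):
[HttpGet]
public async Task<IActionResult> GetFile()
{
    var stream = new MemoryStream();
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
    await blockBlob.DownloadToStreamAsync(stream);
    stream.Seek(0, SeekOrigin.Begin);
    // File() wraps the stream in a FileStreamResult for us
    return File(stream, "text/plain", "someFile.txt");
}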

Azure Storage Search Blobs by Metadata

I have CloudBlockBlobs that have metadata.
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.jpg");
using (var fileStream = System.IO.File.OpenRead(filePath))
{
blockBlob.UploadFromStream(fileStream);
blockBlob.Properties.ContentType = "image/jpg";
blockBlob.Metadata.Add("Title", "Yellow Pear");
blockBlob.SetProperties();
}
I see the Metadata is there:
Debug.WriteLine(blockBlob.Metadata["Title"]);
Now later if I query from storage I see the blobs but the Metadata is missing:
(in the below I know blobItems[0] had Metadata when uploaded but now blobItems[0].Metadata.Count == 0)
var blobItems = container.ListBlobs(
null, false, BlobListingDetails.Metadata);
I also noticed the Metadata is not available when I obtain the blob by itself:
CloudBlockBlob a = container.GetBlockBlobReference("myblob.jpg");
//Below throws an exception
var b = a.Metadata["Title"];
Thank you!
There are some issues with your code :(
The blob doesn't actually have any metadata set. After adding the metadata, you're calling the blob.SetProperties() method, which only sets the blob's properties (ContentType in your example). To persist the metadata, you need to call the blob.SetMetadata() method.
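For example, keeping the structure of your original code, a minimal sketch of persisting both properties and metadata (assuming the same blockBlob and filePath variables):
using (var fileStream = System.IO.File.OpenRead(filePath))
{
    blockBlob.UploadFromStream(fileStream);
}
blockBlob.Properties.ContentType = "image/jpg";
blockBlob.SetProperties();   // persists the properties
blockBlob.Metadata.Add("Title", "Yellow Pear");
blockBlob.SetMetadata();     // persists the metadata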
Your upload code is currently making 2 calls to the storage service: 1) upload blob and 2) set properties. If you call SetMetadata as well, it would be 3 calls. IMHO, these can be combined into just 1 call to the storage service by doing something like below:
using (var fileStream = System.IO.File.OpenRead(filePath))
{
blockBlob.Properties.ContentType = "image/jpg";
blockBlob.Metadata.Add("Title", "Yellow Pear");
blockBlob.UploadFromStream(fileStream);
}
This will not only upload the blob but also set its properties and metadata in a single call to the storage service.
Regarding:
I also noticed the Metadata is not available when I obtain the blob by itself:
CloudBlockBlob a = container.GetBlockBlobReference("myblob.jpg");
//Below throws an exception
var b = a.Metadata["Title"];
Basically the code above is just creating an instance of the blob on the client side. It doesn't actually fetch the properties (and metadata) of the blob. To fetch details about the blob, you would need to call FetchAttributes method on the blob. Something like:
CloudBlockBlob a = container.GetBlockBlobReference("myblob.jpg");
a.FetchAttributes();
If you then retrieve the blob's metadata, you should be able to see it (provided the metadata was set properly).

Delete "subpath" from Azure Storage

I know Azure doesn't have actual subpaths, but if I have, for example, container/projectID/iterationNumber/filename.jpg and I delete a project, how can I delete everything under projectID? Is it possible through code?
I don't want to use the Azure application, as I am creating a web app.
Thanks in advance
EDIT:
This is the code provided by Microsoft to target one specific item:
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Retrieve reference to a blob named "myblob.txt".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.txt");
// Delete the blob.
blockBlob.Delete();
SystemDesignModel
public static SystemDesign returnImageURL(IListBlobItem item)
{
if (item is CloudBlockBlob)
{
var blob = (CloudBlockBlob)item;
return new SystemDesign
{
URL = blob.Uri.ToString(),
};
}
return null;
}
}
As you know, blob storage does not have the concept of subfolders. It has just a 2-level hierarchy: container & blobs. So in essence, a subfolder is just a prefix that you attach to the blob name. In your example, the actual file you uploaded is filename.jpg, but its name from the blob storage perspective is projectID/iterationNumber/filename.jpg.
Since there is no concept of a subfolder, you can't just delete it the way you would on your local computer. However, there's a way: blob storage provides a way to list blobs starting with a certain blob prefix. So what you have to do is first list all blobs that start with a certain prefix (projectID in your case) and then delete each blob returned by that listing operation.
Take a look at sample code below:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
var container = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
BlobContinuationToken token = null;
do
{
var listingResult = container.ListBlobsSegmented("blob-prefix (projectID in your case)", true, BlobListingDetails.None, 5000, token, null, null);
token = listingResult.ContinuationToken;
var blobs = listingResult.Results;
foreach (var blob in blobs)
{
(blob as ICloudBlob).DeleteIfExists();
Console.WriteLine(blob.Uri.AbsoluteUri + " deleted.");
}
}
while (token != null);

NPOI writing XLS file converting to Azure Blob

I'm trying to convert a current application that uses NPOI for creating an xls document on the server into an Azure-hosted application. I have little experience with NPOI and Azure, so 2 strikes right there. I have the app uploading the xls to a blob container, however it is always blank (9 bytes). From what I understand, NPOI uses a FileStream to write the file, so I just changed that to write to the blob container.
Here is what I think are the relevant portions:
internal void GenerateExcel(DataSet ds, int QuoteID, string ReportFileName)
{
string ExcelFileName = string.Format("{0}_{1}.xls",ReportFileName,QuoteID);
try
{
//these 2 strings will get deleted but left here for now to run side by side at the moment
string ReportDirectoryPath = HttpContext.Current.Server.MapPath(".") + "\\Reports";
if (!Directory.Exists(ReportDirectoryPath))
{
Directory.CreateDirectory(ReportDirectoryPath);
}
string ExcelReportFullPath = ReportDirectoryPath + "\\" + ExcelFileName;
if (File.Exists(ExcelReportFullPath))
{
File.Delete(ExcelReportFullPath);
}
// Create a new workbook.
var workbook = new HSSFWorkbook();
//Rest of the NPOI XLS rows cells etc. etc. all works fine when writing to disk////////////////
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("pricingappreports");
// Create the container if it doesn't already exist.
if (container.CreateIfNotExists())
{
container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
}
// Retrieve reference to a blob with the same name.
CloudBlockBlob blockBlob = container.GetBlockBlobReference(ExcelFileName);
// Write the output to a file on the server
String file = ExcelReportFullPath;
using (FileStream fs = new FileStream(file, FileMode.Create))
{
workbook.Write(fs);
fs.Close();
}
// Write the output to a file on Azure Storage
String Blobfile = ExcelFileName;
using (FileStream fs = new FileStream(Blobfile, FileMode.Create))
{
workbook.Write(fs);
blockBlob.UploadFromStream(fs);
fs.Close();
}
}
I'm uploading to the blob and the file exists, so why doesn't the data get written to the xls?
Any help would be appreciated.
Update: I think I found the problem. It doesn't look like you can write to a file in Blob Storage like that. I found this blog which pretty much answers my question; it doesn't use NPOI but the concept is the same: http://debugmode.net/2011/08/28/creating-and-updating-excel-file-in-windows-azure-web-role-using-open-xml-sdk/
Thanks
Can you install Fiddler and check the request and the response packets? You may also need to seek back to 0 between the two writes. So the correct code here could be to add the seek below before trying to write the stream to the blob:
workbook.Write(fs);
fs.Seek(0, SeekOrigin.Begin);
blockBlob.UploadFromStream(fs);
fs.Close();
I also noticed that you are using String Blobfile = ExcelFileName instead of String Blobfile = ExcelReportFullPath.
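Following the idea in the blog linked from the update, you can also skip the local file entirely and upload straight from memory; a rough sketch, assuming the same workbook and blockBlob objects from the code above:
// Write the workbook to an in-memory stream instead of a local file
byte[] bytes;
using (var ms = new MemoryStream())
{
    workbook.Write(ms);
    bytes = ms.ToArray();   // ToArray works even if the stream gets closed by Write
}
// Upload the captured bytes from a fresh stream
using (var uploadStream = new MemoryStream(bytes))
{
    blockBlob.UploadFromStream(uploadStream);
}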
