I'm using the Microsoft.Azure.Storage.Blob NuGet package to get the list of blobs in a container and then read their content.
With the ListBlobs() method I see all the blobs.
Every blob item has a URI, but I cannot see the blob name that I need for GetBlobReferenceFromServer(). For this reason the blob name is a constant in the following sample code.
What is the right way? Do I have to split and parse the URI to find the blob name? Do I have to use another method?
Microsoft.Azure.Storage.Blob.CloudBlobContainer container =
    new Microsoft.Azure.Storage.Blob.CloudBlobContainer(
        new Uri("https://myaccount.blob.core.windows.net/containername"),
        new Microsoft.Azure.Storage.Auth.StorageCredentials("myaccount", "**********=="));
IEnumerable<Microsoft.Azure.Storage.Blob.IListBlobItem> blobs = container.ListBlobs();
foreach (var blobItem in blobs)
{
    //string blobUri = blobItem.Uri.ToString();
    Microsoft.Azure.Storage.Blob.ICloudBlob blockBlob = container.GetBlobReferenceFromServer("blobname");
    MemoryStream downloadStream = new MemoryStream();
    blockBlob.DownloadToStream(downloadStream);
    string blobContent = Encoding.UTF8.GetString(downloadStream.ToArray());
}
With the ListBlobs() method I see all the blobs. Every blob item has a URI but I cannot see the blob name that I need for the GetBlobReferenceFromServer().
The reason for this is that the ListBlobs method returns an enumerable of type IListBlobItem, which does not have a Name property. To get the name of a blob, you can cast the item to either CloudBlob or CloudBlockBlob (both implement this interface); the Name property on those types gives you the value you can pass to the GetBlobReferenceFromServer method.
BTW, once you have listed the blobs you don't really need to call GetBlobReferenceFromServer at all, as you already have all the information about each blob as part of the listing. GetBlobReferenceFromServer makes another request to storage to fetch the same set of properties that you already have from the listing.
So your code can simply be:
foreach (var blobItem in blobs)
{
    var blockBlob = (CloudBlockBlob) blobItem;
    MemoryStream downloadStream = new MemoryStream();
    blockBlob.DownloadToStream(downloadStream);
    string blobContent = Encoding.UTF8.GetString(downloadStream.ToArray());
}
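If you do need the blob name itself, the cast exposes it. A minimal sketch, assuming a flat listing so that every item really is a blob (in a hierarchical listing, items may also be CloudBlobDirectory, which this cast would reject):

```csharp
foreach (var blobItem in blobs)
{
    // CloudBlob implements IListBlobItem and exposes the Name property.
    var blob = (CloudBlob)blobItem;
    string blobName = blob.Name; // e.g. "folder/file.txt" for a nested blob

    // Only needed if you really want a fresh round-trip to the service:
    ICloudBlob serverBlob = container.GetBlobReferenceFromServer(blobName);
}
```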
Or, if you don't want to go down the casting route, you can simply create an instance of CloudBlockBlob using the URI you got from the listing.
Something like:
foreach (var blobItem in blobs)
{
    var blockBlob = new CloudBlockBlob(blobItem.Uri, container.ServiceClient);
    MemoryStream downloadStream = new MemoryStream();
    blockBlob.DownloadToStream(downloadStream);
    string blobContent = Encoding.UTF8.GetString(downloadStream.ToArray());
}
Related
I am trying to download a block blob from Azure storage explorer. I am able to download all the block blobs that exist in the root directory of my container. I am unable to download blobs that are nested in subfolders inside the container.
CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1);
sasConstraints.Permissions = SharedAccessBlobPermissions.Read;
string sasBlobToken = blob.GetSharedAccessSignature(sasConstraints);
return blob.Uri.AbsoluteUri + sasBlobToken;
I couldn't get the absolute path of the block blob using GetBlockBlobReference(fileName). The code below solved my issue: I got the listing and then used LINQ to pick out the block blob with the correct path details.
This post helped as well
do
{
    var listingResult = await blobDirectory.ListBlobsSegmentedAsync(useFlatBlobListing, blobListingDetails, maxBlobsPerRequest, continuationToken, null, null);
    //The line below fetches the blockBlob with the correct directory details
    //(a single pass over the results instead of enumerating them twice).
    var blockBlob = listingResult.Results
        .FirstOrDefault(x => x.Uri.AbsolutePath.Contains(fileName)) as CloudBlockBlob;
    if (blockBlob != null)
    {
        sasConstraints.SharedAccessExpiryTime = expiryTimeSAS;
        sasConstraints.Permissions = SharedAccessBlobPermissions.Read;
        string sasBlobToken = blockBlob.GetSharedAccessSignature(sasConstraints);
        return blockBlob.Uri.AbsoluteUri + sasBlobToken;
    }
    continuationToken = listingResult.ContinuationToken;
} while (continuationToken != null);
Correct me if there is any other efficient way of pulling the blob information from a list of directories in a container.
The solution below helps to access the absolute path of a single file residing under a directory (or folder) path.
public static String GetBlobUri(string dirPath, string fileName)
{
    //Get a reference to a blob within the container.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("Blob Key");
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("Blob Container");
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(dirPath + fileName);
    return blockBlob.Uri.AbsoluteUri;
}
Hope this helps someone trying to access a blob file path based on a multi-level directory (Level1/Level2/Level3) path.
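As a usage sketch for the helper above (the names here are hypothetical, and dirPath is assumed to carry its trailing slash, since the method concatenates dirPath + fileName directly):

```csharp
// Resolves to the blob named "Level1/Level2/photo.jpg" in the container.
string uri = GetBlobUri("Level1/Level2/", "photo.jpg");
```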
Just use the ListBlobs method mentioned in the answer from Gaurav Mantri to retrieve all files (blobs) within your desired subfolder, then iterate over them and download each one:
var storageAccount = CloudStorageAccount.Parse("yourConnectionString");
var client = storageAccount.CreateCloudBlobClient();
var container = client.GetContainerReference("yourContainer");
var blobs = container.ListBlobs(prefix: "subdirectory1/subdirectory2", useFlatBlobListing: true);
foreach (var blob in blobs.OfType<CloudBlockBlob>())
{
    // Derive a distinct local path per blob; FileMode.Create overwrites existing files.
    blob.DownloadToFile(Path.Combine("yourLocalFolder", Path.GetFileName(blob.Name)), FileMode.Create);
}
I have CloudBlockBlobs that have metadata.
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.jpg");
using (var fileStream = System.IO.File.OpenRead(filePath))
{
blockBlob.UploadFromStream(fileStream);
blockBlob.Properties.ContentType = "image/jpg";
blockBlob.Metadata.Add("Title", "Yellow Pear");
blockBlob.SetProperties();
}
I see the Metadata is there:
Debug.WriteLine(blockBlob.Metadata["Title"]);
Later, when I query storage, I see the blobs but the metadata is missing
(in the code below, I know blobItems[0] had metadata when uploaded, but now blobItems[0].Metadata.Count == 0):
var blobItems = container.ListBlobs(
null, false, BlobListingDetails.Metadata);
I also noticed the Metadata is not available when I obtain the blob by itself:
CloudBlockBlob a = container.GetBlockBlobReference("myblob.jpg");
//Below throws an exception
var b = a.Metadata["Title"];
Thank you!
There are some issues with your code :(
The blob doesn't actually have any metadata set. After setting the metadata, you're calling the blob.SetProperties() method, which only sets the blob's properties (ContentType in your example). To set the metadata, you need to call the blob.SetMetadata() method.
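For reference, a minimal sketch of setting metadata explicitly on an already-uploaded blob (reusing the blockBlob variable from the question):

```csharp
// Metadata changes are local until SetMetadata() is called.
blockBlob.Metadata["Title"] = "Yellow Pear";

// SetMetadata persists the Metadata collection; SetProperties only persists Properties.
blockBlob.SetMetadata();
```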
Your upload code currently makes 2 calls to the storage service: 1) upload the blob and 2) set its properties. If you also called SetMetadata, it would be 3 calls. IMHO, these can be combined into just 1 call to the storage service by doing something like below:
using (var fileStream = System.IO.File.OpenRead(filePath))
{
    blockBlob.Properties.ContentType = "image/jpg";
    blockBlob.Metadata.Add("Title", "Yellow Pear");
    blockBlob.UploadFromStream(fileStream);
}
This will not only upload the blob but also set its properties and metadata in a single call to the storage service.
Regarding
I also noticed the Metadata is not available when I obtain the blob by
itself:
CloudBlockBlob a = container.GetBlockBlobReference("myblob.jpg");
//Below throws an exception
var b = a.Metadata["Title"];
Basically, the code above just creates an instance of the blob on the client side. It doesn't actually fetch the blob's properties (and metadata). To fetch details about the blob, you need to call the FetchAttributes method on it. Something like:
CloudBlockBlob a = container.GetBlockBlobReference("myblob.jpg");
a.FetchAttributes();
If you retrieve the blob's metadata after that, you should be able to see it (provided the metadata was set properly).
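Putting the two calls together, a minimal sketch (assuming the blob exists and its "Title" metadata was set correctly at upload time):

```csharp
CloudBlockBlob a = container.GetBlockBlobReference("myblob.jpg");
a.FetchAttributes(); // populates Properties and Metadata from the service
if (a.Metadata.TryGetValue("Title", out string title))
{
    Debug.WriteLine(title); // no exception: the key is checked before access
}
```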
I upgraded WindowsAzure.Storage to 4.0.3
I want to output to a webpage a list of blobs in a folder, where clicking on the link downloads the blob. As the blobs are in a secure container each URI needs a shared access signature.
I used to have:
var dir = Container.GetDirectoryReference(folderName);
List<IListBlobItem> blobs = dir.ListBlobs().ToList();
var blobsInFolder = new List<Uri>();
foreach (IListBlobItem listBlobItem in blobs)
{
    var blob = Container.GetBlockBlobReference(listBlobItem.Uri.ToString());
    string sasBlobToken = blob.GetSharedAccessSignature(_sasConstraints);
    blobsInFolder.Add(new Uri(blob.Uri + sasBlobToken));
}
return blobsInFolder;
This no longer works, as GetBlockBlobReference no longer accepts a URI but a blob name, and IListBlobItem does not include the name.
I could start chopping up the Uri to get the folder and filename
var blob = Container.GetBlockBlobReference(folderName + "/" + Path.GetFileName(listBlobItem.Uri.AbsolutePath));
...but I feel that's the wrong way to go about it (surely I shouldn't have to do this?). Can someone point me in the right direction, please?
Try casting IListBlobItem to CloudBlockBlob
foreach (IListBlobItem listBlobItem in blobs)
{
    var blob = (CloudBlockBlob) listBlobItem;
    string sasBlobToken = blob.GetSharedAccessSignature(_sasConstraints);
    blobsInFolder.Add(new Uri(blob.Uri + sasBlobToken));
}
return blobsInFolder;
I know Azure doesn't have actual subpaths, but if I have, for example, container/projectID/iterationNumber/filename.jpg and I delete a project, how can I delete everything under projectID? Is it possible through code?
I don't want to use the Azure application, as I am creating a web app.
Thanks in advance
EDIT:
This is the code provided by Microsoft to target on specific item:
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Retrieve reference to a blob named "myblob.txt".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.txt");
// Delete the blob.
blockBlob.Delete();
SystemDesignModel

public static SystemDesign returnImageURL(IListBlobItem item)
{
    if (item is CloudBlockBlob)
    {
        var blob = (CloudBlockBlob)item;
        return new SystemDesign
        {
            URL = blob.Uri.ToString(),
        };
    }
    return null;
}
}
As you know, blob storage does not have the concept of subfolders. It has just a 2-level hierarchy: containers & blobs. So in essence, a subfolder is just a prefix that you attach to the blob name. In your example, the actual file you uploaded is filename.jpg, but its name from blob storage's perspective is projectID/iterationNumber/filename.jpg.
Since there is no concept of a subfolder, you can't delete one the way you would on your local computer. However, there's a way: blob storage provides the ability to search for blobs starting with a certain prefix. So what you have to do is first list all blobs that start with a certain prefix (projectID in your case) and then delete the blobs one at a time as they are returned by the listing operations.
Take a look at sample code below:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
var container = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
BlobContinuationToken token = null;
do
{
    var listingResult = container.ListBlobsSegmented("blob-prefix (projectID in your case)", true, BlobListingDetails.None, 5000, token, null, null);
    token = listingResult.ContinuationToken;
    var blobs = listingResult.Results;
    foreach (var blob in blobs)
    {
        (blob as ICloudBlob).DeleteIfExists();
        Console.WriteLine(blob.Uri.AbsoluteUri + " deleted.");
    }
}
while (token != null);
I am trying to rename a blob in Azure Storage via the .NET API, and after a day I am still unable to get it working :(
Here is how I am doing it: by creating a new blob and copying from the old one.
var newBlob = blobContainer.GetBlobReferenceFromServer(filename);
newBlob.StartCopyFromBlob(blob.Uri);
blob.Delete();
There is no new blob on the server, so I am getting an HTTP 404 Not Found exception.
Here is a working example that I found, but it is for the old .NET Storage API.
CloudBlob blob = container.GetBlobReference(sourceBlobName);
CloudBlob newBlob = container.GetBlobReference(destBlobName);
newBlob.UploadByteArray(new byte[] { });
newBlob.CopyFromBlob(blob);
blob.Delete();
Currently I am using the 2.0 API. Where am I making a mistake?
I see that you're using the GetBlobReferenceFromServer method to create an instance of the new blob object. For this function to work, the blob must already exist, which will not be the case here since you're trying to rename the blob.
What you could do is call GetBlobReferenceFromServer on the old blob, get its type, and then create an instance of either a block blob or a page blob and perform the copy operation on that. So your code would be something like:
CloudBlobContainer blobContainer = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
var blob = blobContainer.GetBlobReferenceFromServer("oldblobname");
ICloudBlob newBlob = null;
if (blob is CloudBlockBlob)
{
    newBlob = blobContainer.GetBlockBlobReference("newblobname");
}
else
{
    newBlob = blobContainer.GetPageBlobReference("newblobname");
}
//Initiate blob copy
newBlob.StartCopyFromBlob(blob.Uri);
//Now wait in a loop for the copy operation to finish
while (true)
{
    newBlob.FetchAttributes();
    if (newBlob.CopyState.Status != CopyStatus.Pending)
    {
        break;
    }
    //Sleep for a second, maybe
    System.Threading.Thread.Sleep(1000);
}
blob.Delete();
The code in the OP was almost fine, except that an asynchronous copy method was called. The simplest code in the new API would be:
var oldBlob = cloudBlobClient.GetBlobReferenceFromServer(oldBlobUri);
var newBlob = container.GetBlobReference("newblobname");
newBlob.CopyFromBlob(oldBlob);
oldBlob.Delete();