Our old ASP.NET application referenced Microsoft.Azure.Management.Storage.Fluent for the code below.
private static void DeleteBlobs(IAzure azure, string sourceContainer, string dbName)
{
    var sAcc = GetStorageAccount(azure);
    CloudBlobClient bClient = sAcc.CreateCloudBlobClient();
    CloudBlobContainer srcCont = bClient.GetContainerReference(sourceContainer);
    var srcDir = srcCont.GetDirectoryReference(dbName);
    var blobs = srcDir.ListBlobs(useFlatBlobListing: true).ToList();
    foreach (CloudBlockBlob blob in blobs)
    {
        CloudBlockBlob dBlob = srcCont.GetBlockBlobReference(blob.Name);
        // Delete the source blob after copying
        dBlob.Delete();
    }
}
Our new WinUI 3 code, which uses the Azure.Management.Fluent package, is as follows. It is similar, but CloudBlobDirectory.ListBlobs does not exist, and we cannot seem to find an equivalent that will work for the foreach statement.
public static async void DeleteBlobsTest(string dbName)
{
    var sAcc = GetStorageAccount(_StorageConnectionString);
    CloudBlobClient bClient = sAcc.CreateCloudBlobClient();
    CloudBlobContainer srcCont = bClient.GetContainerReference(_ActiveProjectsContainer);
    var srcDir = srcCont.GetDirectoryReference(dbName);
    var blobs = srcDir.ListBlobs(useFlatBlobListing: true).ToList(); //ListBlobs() does not exist
    foreach (CloudBlockBlob blob in blobs)
    {
        CloudBlockBlob dBlob = srcCont.GetBlockBlobReference(blob.Name);
        // Delete the source blob after copying
        await dBlob.DeleteAsync();
    }
}
We tried replacing var blobs = srcDir.ListBlobs(useFlatBlobListing: true).ToList(); with the two lines of code below, but it did not work, giving the error: "Unable to cast object of type 'Microsoft.WindowsAzure.Storage.Blob.CloudBlobDirectory' to type 'Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob'." Both the original call and the replacement return an IEnumerable<IListBlobItem> enumeration, but with different implementations, it seems. In any case, are there any workable replacements for ListBlobs() in the WinUI 3 package?
BlobResultSegment blobSegment = await srcDir.ListBlobsSegmentedAsync(new BlobContinuationToken());
IEnumerable<IListBlobItem> blobs = blobSegment.Results;
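(In hindsight, the cast fails because a non-flat segmented listing returns CloudBlobDirectory entries alongside the blobs. A flat segmented listing plus a defensive OfType filter would presumably have avoided it; an untested sketch, covering only the first segment:)
// Flat listing returns only blobs, no CloudBlobDirectory entries;
// OfType<CloudBlockBlob>() skips anything else defensively.
// (A full version would loop on blobSegment.ContinuationToken.)
BlobResultSegment blobSegment = await srcDir.ListBlobsSegmentedAsync(
    true, BlobListingDetails.None, null, null, null, null);
foreach (CloudBlockBlob blob in blobSegment.Results.OfType<CloudBlockBlob>())
{
    await blob.DeleteAsync();
}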
Per Simon Mourier's comment, the best solution seems to be to use a different package and write the code accordingly. The package I had is now deprecated and no longer appears in NuGet.
I added the package: Azure.Storage.Blobs
My code now looks like this; it is similar to, but not an exact match of, the original:
public static void DeleteBlobs(string dbName)
{
    BlobServiceClient bSvcCl = GetStorageAccountTest(_StorageConnectionString);
    BlobContainerClient contCl = bSvcCl.GetBlobContainerClient(_ActiveProjectsContainer);
    // Scope the listing to the dbName "directory" via the prefix parameter,
    // matching what GetDirectoryReference(dbName) did in the old code.
    var blobs = contCl.GetBlobs(prefix: dbName);
    foreach (BlobItem blob in blobs)
    {
        BlobClient sourceBlob = contCl.GetBlobClient(blob.Name);
        sourceBlob.Delete();
    }
}
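If an awaitable variant is needed (the original WinUI attempt was async), here is an async sketch along the same lines, assuming the same GetStorageAccountTest helper and fields:
public static async Task DeleteBlobsAsync(string dbName)
{
    BlobServiceClient bSvcCl = GetStorageAccountTest(_StorageConnectionString);
    BlobContainerClient contCl = bSvcCl.GetBlobContainerClient(_ActiveProjectsContainer);
    // GetBlobsAsync pages lazily; DeleteBlobAsync targets each blob by name.
    await foreach (BlobItem blob in contCl.GetBlobsAsync(prefix: dbName))
    {
        await contCl.DeleteBlobAsync(blob.Name);
    }
}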
We need a solution/code for uploading files larger than 100 MB to Azure Blob Storage from a .NET Core Web API hosted in an Azure App Service. It works from the local machine, but not from the Azure App Service, where an error says the file is too large to upload.
We tried an example like the one below:
[RequestFormLimits(MultipartBodyLengthLimit = 6104857600)]
[RequestSizeLimit(6104857600)]
public async Task<IActionResult> Upload(IFormFile filePosted)
{
    string fileName = Path.GetFileName(filePosted.FileName);
    string localFilePath = Path.Combine(fileName);
    using (var fileStream = new FileStream(localFilePath, FileMode.Create))
    using (var ms = new MemoryStream())
    {
        filePosted.CopyTo(ms);
        ms.WriteTo(fileStream);
        // Rewind before uploading, otherwise the upload starts at the end of the stream
        fileStream.Position = 0;
        BlobServiceClient blobServiceClient = new BlobServiceClient("ConnectionString");
        var containerClient = blobServiceClient.GetBlobContainerClient("Container");
        BlobUploadOptions options = new BlobUploadOptions
        {
            TransferOptions = new StorageTransferOptions
            {
                MaximumConcurrency = 8,
                MaximumTransferSize = 220 * 1024 * 1024
            }
        };
        BlobClient bc = containerClient.GetBlobClient("Name");
        await bc.UploadAsync(fileStream, options);
    }
    return Ok();
}
I tried this in my environment and got the results below.
To upload large files from local storage to Azure Blob Storage or File Storage, you can use the Azure Data Movement library. It provides high performance for uploading and downloading larger files.
Code:
using System;
using System.Threading;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

class Program
{
    public static void Main(string[] args)
    {
        string storageConnectionString = "<Connection string>";
        CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer blobContainer = blobClient.GetContainerReference("test");
        blobContainer.CreateIfNotExists();
        string sourceFile = @"C:\Users\download\sample.docx";
        CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("sample.docx");
        TransferManager.Configurations.ParallelOperations = 64;
        // Set up the transfer context and track the upload progress
        SingleTransferContext context = new SingleTransferContext
        {
            ProgressHandler = new Progress<TransferStatus>(progress =>
            {
                Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
            })
        };
        // Upload the blob
        var task = TransferManager.UploadAsync(
            sourceFile, destBlob, null, context, CancellationToken.None);
        task.Wait();
    }
}
After executing the above code, the large file uploaded to Azure Blob Storage successfully.
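As an aside, the snippet in the question buffers the whole file through a MemoryStream and a local temp file before uploading. A leaner sketch (same placeholder names, untested) streams the request body straight to the blob with Azure.Storage.Blobs; note this does not by itself lift any App Service request-size limits:
[RequestFormLimits(MultipartBodyLengthLimit = 6104857600)]
[RequestSizeLimit(6104857600)]
public async Task<IActionResult> Upload(IFormFile filePosted)
{
    var blobServiceClient = new BlobServiceClient("ConnectionString");
    var containerClient = blobServiceClient.GetBlobContainerClient("Container");
    BlobClient bc = containerClient.GetBlobClient(Path.GetFileName(filePosted.FileName));
    var options = new BlobUploadOptions
    {
        TransferOptions = new StorageTransferOptions
        {
            MaximumConcurrency = 8,
            MaximumTransferSize = 220 * 1024 * 1024
        }
    };
    // Stream directly from the multipart request body; no MemoryStream, no temp file.
    using (var stream = filePosted.OpenReadStream())
    {
        await bc.UploadAsync(stream, options);
    }
    return Ok();
}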
I am trying to get all files in a directory of a blob container. I can get all the files in the whole container, but I am not able to specify a directory.
Here is my code:
public async Task<IDictionary<string, DateTime>> GetBlobFiles(string directory = "adf_exports")
{
    IDictionary<string, DateTime> files = new Dictionary<string, DateTime>();
    try
    {
        var accountName = _configuration["StorageAccount"];
        var blobEndpoint = $"https://{accountName}.blob.core.windows.net";
        var credential = new DefaultAzureCredential();
        BlobServiceClient blobServiceClient = new BlobServiceClient(new Uri(blobEndpoint), credential);
        var containerClient = blobServiceClient.GetBlobContainerClient(_configuration["BlobContainer"] + "/" + directory);
        //var blobClient = containerClient.GetBlobClient(directory);
        var list = containerClient.GetBlobs();
        //var blobs = list.Where(b => Path.GetExtension(b.Name).Equals(".json"));
        foreach (var item in list)
        {
            string name = item.Name;
            BlockBlobClient blockBlob = containerClient.GetBlockBlobClient(name);
            //using (var fileStream = File.OpenWrite(@"C:\Users\mbcrump\Downloads\test\" + name))
            //{
            //    blockBlob.DownloadTo(fileStream);
            //}
        }
        await foreach (BlobItem blob in containerClient.GetBlobsAsync())
        {
            files.Add(blob.Name, DateTime.Now);
        }
        return files;
    }
    catch (Exception ex)
    {
        return files;
    }
}
It triggers the error below:
The requested URI does not represent any resource on the server.
RequestId:b6449bde-d01e-003e-598b-a53f0f000000
Time:2021-09-09T15:02:09.1881302Z
Status: 400 (The requested URI does not represent any resource on the server.)
ErrorCode: InvalidUri
But if we don't specify the directory and pass just the container, like this:
var containerClient = blobServiceClient.GetBlobContainerClient(_configuration["BlobContainer"]);
it works without any issues, but returns all folders and files in that container. How can I specify a folder alone?
FYI: I am using Managed Identity to access the blob; connection strings and access keys are restricted in our environment.
You are receiving that error because the directory name is being appended to the container name, so the resulting URI does not point to any real container (hence InvalidUri). Blob containers are flat; "directories" are just blob-name prefixes.
Here is the code that worked for me:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.IO;
using System.Threading.Tasks;

namespace ViewBlobList
{
    class Program
    {
        static async Task Main(string[] args)
        {
            BlobServiceClient blobServiceClient = new BlobServiceClient("**Connection String**");
            BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("Container Name");
            Console.WriteLine("Listing blobs..." + "\n");
            // List all blobs in the container
            var blobs = containerClient.GetBlobs();
            foreach (BlobItem blobItem in blobs)
            {
                Console.WriteLine(blobItem.Name);
            }
            Console.Read();
        }
    }
}
For more information, you can refer to Quickstart: Azure Blob Storage library v12 - .NET | Microsoft Docs.
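To address the "folder" part directly while keeping Managed Identity: build the container client for the container alone, and pass the directory as a listing prefix. A sketch reworking the question's method (same _configuration keys assumed):
public async Task<IDictionary<string, DateTime>> GetBlobFiles(string directory = "adf_exports")
{
    IDictionary<string, DateTime> files = new Dictionary<string, DateTime>();
    var accountName = _configuration["StorageAccount"];
    var blobEndpoint = $"https://{accountName}.blob.core.windows.net";
    var blobServiceClient = new BlobServiceClient(new Uri(blobEndpoint), new DefaultAzureCredential());
    // Build the client for the container only; containers are flat, so pass
    // the "directory" as a name prefix when listing instead.
    var containerClient = blobServiceClient.GetBlobContainerClient(_configuration["BlobContainer"]);
    await foreach (BlobItem blob in containerClient.GetBlobsAsync(prefix: directory + "/"))
    {
        files.Add(blob.Name, blob.Properties.LastModified?.DateTime ?? DateTime.Now);
    }
    return files;
}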
I'm trying to download a file from Azure Blob Storage, but it returns only part of the file. What am I doing wrong? The file in storage is not corrupted.
public async Task<byte[]> GetFile(string fileName)
{
    var blobClient = BlobContainerClient.GetBlobClient(fileName);
    var downloadInfo = await blobClient.DownloadAsync();
    byte[] b = new byte[downloadInfo.Value.ContentLength];
    await downloadInfo.Value.Content.ReadAsync(b, 0, (int)downloadInfo.Value.ContentLength);
    return b;
}
I'm using the Azure.Storage.Blobs 12.4.2 package.
I tried this code and it works for me:
public async Task<byte[]> GetFile(string fileName)
{
    var blobClient = BlobContainerClient.GetBlobClient(fileName);
    using (var memorystream = new MemoryStream())
    {
        await blobClient.DownloadToAsync(memorystream);
        return memorystream.ToArray();
    }
}
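For reference, the question's snippet returns partial content because a single Stream.ReadAsync call is not guaranteed to fill the buffer. If you do want to read the download stream manually, a loop along these lines is required (a sketch based on the question's code):
public async Task<byte[]> GetFileManualRead(string fileName)
{
    var blobClient = BlobContainerClient.GetBlobClient(fileName);
    var downloadInfo = await blobClient.DownloadAsync();
    byte[] b = new byte[downloadInfo.Value.ContentLength];
    int offset = 0;
    // ReadAsync may return fewer bytes than requested, so loop until
    // the buffer is full or the stream ends.
    while (offset < b.Length)
    {
        int read = await downloadInfo.Value.Content.ReadAsync(b, offset, b.Length - offset);
        if (read == 0) break; // end of stream
        offset += read;
    }
    return b;
}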
I am not able to fully understand your code, as the current BlobClient as of v11.1.1 does not expose any download methods. As @Gaurav Mantri-AIS mentioned, ReadAsync can behave in that manner: it may read fewer bytes than requested.
Consider an alternative using DownloadToByteArrayAsync(), which is part of the API. I have included the code required to connect, but of course this is just for the purpose of demonstrating a full example.
Your method would be condensed as follows:
public async Task<byte[]> GetFile(string containerName, string fileName)
{
    // I am getting the container here; not sure where or how you are doing this
    var container = GetContainer("//your connection string", containerName);
    // Get the blob first
    ICloudBlob blob = container.GetBlockBlobReference(fileName);
    // Fetch attributes so we know how large the buffer must be
    await blob.FetchAttributesAsync();
    var target = new byte[blob.Properties.Length];
    // And now download it straight to a byte array
    await blob.DownloadToByteArrayAsync(target, 0);
    return target;
}

public CloudBlobContainer GetContainer(string connectionString, string containerName)
{
    // 1. Connect to the account
    var account = CloudStorageAccount.Parse(connectionString);
    // 2. Create a client
    var blobClient = account.CreateCloudBlobClient();
    // 3. Get the container reference
    return blobClient.GetContainerReference(containerName);
}
What is the most efficient way to get the count of blobs in an Azure Storage container?
Right now I can't think of any way other than the code below:
CloudBlobContainer container = GetContainer("mycontainer");
var count = container.ListBlobs().Count();
If you just want to know how many blobs are in a container without writing code, you can use the Microsoft Azure Storage Explorer application.
Open the desired BlobContainer
Click the Folder Statistics icon
Observe the count of blobs in the Activities window
I tried counting blobs using ListBlobs() and for a container with about 400,000 items, it took me well over 5 minutes.
If you have complete control over the container (that is, you control when writes occur), you could cache the size information in the container metadata and update it every time an item gets removed or inserted. Here is a piece of code that would return the container blob count:
static int CountBlobs(string storageAccount, string containerId)
{
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(storageAccount);
    CloudBlobClient blobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = blobClient.GetContainerReference(containerId);
    cloudBlobContainer.FetchAttributes();
    string count = cloudBlobContainer.Metadata["ItemCount"];
    string countUpdateTime = cloudBlobContainer.Metadata["CountUpdateTime"];
    bool recountNeeded = false;
    if (String.IsNullOrEmpty(count) || String.IsNullOrEmpty(countUpdateTime))
    {
        recountNeeded = true;
    }
    else
    {
        DateTime dateTime = new DateTime(long.Parse(countUpdateTime));
        // Are we close to the last modified time?
        if (Math.Abs(dateTime.Subtract(cloudBlobContainer.Properties.LastModifiedUtc).TotalSeconds) > 5)
        {
            recountNeeded = true;
        }
    }
    int blobCount;
    if (recountNeeded)
    {
        blobCount = 0;
        BlobRequestOptions options = new BlobRequestOptions();
        options.BlobListingDetails = BlobListingDetails.Metadata;
        foreach (IListBlobItem item in cloudBlobContainer.ListBlobs(options))
        {
            blobCount++;
        }
        cloudBlobContainer.Metadata.Set("ItemCount", blobCount.ToString());
        cloudBlobContainer.Metadata.Set("CountUpdateTime", DateTime.Now.Ticks.ToString());
        cloudBlobContainer.SetMetadata();
    }
    else
    {
        blobCount = int.Parse(count);
    }
    return blobCount;
}
This, of course, assumes that you update ItemCount/CountUpdateTime every time the container is modified. CountUpdateTime is a heuristic safeguard (if the container was modified without someone updating CountUpdateTime, this forces a re-count), but it's not reliable.
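For illustration, the write-side bookkeeping could look like this hypothetical helper (same SDK vintage assumed, where Metadata behaves like a NameValueCollection); call it with +1 after each successful insert and -1 after each delete:
// Hypothetical helper: keeps the cached count in sync with writes.
static void AdjustItemCount(CloudBlobContainer container, int delta)
{
    container.FetchAttributes();
    int current;
    if (!int.TryParse(container.Metadata["ItemCount"], out current))
    {
        current = 0; // no cached count yet
    }
    container.Metadata.Set("ItemCount", (current + delta).ToString());
    container.Metadata.Set("CountUpdateTime", DateTime.Now.Ticks.ToString());
    container.SetMetadata();
}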
The API doesn't provide a container count method or property, so you'd need to do something like what you posted. However, you'll need to deal with NextMarker if you exceed 5,000 items returned (or if you specify a maximum number to return and the list exceeds that number): make additional calls based on NextMarker and add up the counts.
EDIT: Per smarx, the SDK takes care of NextMarker for you. You only need to deal with NextMarker if you're working at the API level, calling List Blobs through REST.
Alternatively, if you're controlling the blob insertions/deletions (through a WCF service, for example), you can use the blob container's metadata area to store a cached container count that you compute with each insert or delete. You'll just need to deal with concurrent writes to the container.
Example using the PHP API and getNextMarker. It counts the total number of blobs in an Azure container, and it takes a long time: about 30 seconds for 100,000 blobs. (This assumes we have a valid $connectionString and $container_name.)
$blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);
$opts = new ListBlobsOptions();
$nblobs = 0;
$cont = true; // loop until there is no next marker
while ($cont) {
    $blob_list = $blobRestProxy->listBlobs($container_name, $opts);
    $nblobs += count($blob_list->getBlobs());
    $nextMarker = $blob_list->getNextMarker();
    if (!$nextMarker || strlen($nextMarker) == 0) $cont = false;
    else $opts->setMarker($nextMarker);
}
echo $nblobs;
If you are not using virtual directories, the following will work as previously answered.
CloudBlobContainer container = GetContainer("mycontainer");
var count = container.ListBlobs().Count();
However, the above code snippet may not give the desired count if you are using virtual directories.
For instance, if your blobs are stored like /container/directory/filename.txt, where the blob name is directory/filename.txt, then container.ListBlobs().Count() will only count how many "/directory" virtual directories you have. If you want to count the blobs contained within virtual directories, you need to set useFlatBlobListing = true in the ListBlobs() call.
CloudBlobContainer container = GetContainer("mycontainer");
var count = container.ListBlobs(null, true).Count();
Note: the ListBlobs() call with useFlatBlobListing = true is a much more expensive/slow call...
Bearing in mind all the performance concerns from the other answers, here is a version for v12 of the Azure SDK leveraging IAsyncEnumerable. This requires a package reference to System.Linq.Async.
public async Task<int> GetBlobCount()
{
    var container = await GetBlobContainerClient();
    var blobsPaged = container.GetBlobsAsync();
    return await blobsPaged
        .AsAsyncEnumerable()
        .CountAsync();
}
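If you would rather avoid the System.Linq.Async dependency, a minimal sketch that counts by enumerating the AsyncPageable directly (assuming the same GetBlobContainerClient helper):
public async Task<int> GetBlobCountManually()
{
    var container = await GetBlobContainerClient();
    int count = 0;
    // AsyncPageable<BlobItem> handles continuation tokens transparently.
    await foreach (var _ in container.GetBlobsAsync())
    {
        count++;
    }
    return count;
}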
With the legacy Python SDK for Azure Storage it looks like this:
from azure.storage import *
blob_service = BlobService(account_name='myaccount', account_key='mykey')
blobs = blob_service.list_blobs('mycontainer')
len(blobs) # returns the number of blobs in the container (first result page only; see the paginated example further down)
If you are using the Azure.Storage.Blobs library, you can use something like the below:
public int GetBlobCount(string containerName)
{
    BlobContainerClient container = new BlobContainerClient(blobConnectionString, containerName);
    // GetBlobs() transparently pages through all blobs; Count() enumerates them.
    return container.GetBlobs().Count();
}
Another Python example; it works slowly but correctly with more than 5,000 files:
from azure.storage.blob import BlobServiceClient

constr = "Connection string"
container = "Container name"
blob_service_client = BlobServiceClient.from_connection_string(constr)
container_client = blob_service_client.get_container_client(container)
blobs_list = container_client.list_blobs()
num = 0
size = 0
for blob in blobs_list:
    num += 1
    size += blob.size
    print(blob.name, blob.size)
print("Count: ", num)
print("Size: ", size)
I spent quite some time finding the solution below, and I don't want someone like me to waste time, so I am replying here even after 9 years.
package com.sai.koushik.gandikota.test.app;

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.*;

public class AzureBlobStorageUtils {

    public static void main(String[] args) throws Exception {
        AzureBlobStorageUtils getCount = new AzureBlobStorageUtils();
        String storageConn = "<StorageAccountConnection>";
        String blobContainerName = "<containerName>";
        String subContainer = "<subContainerName>";
        Integer fileContainerCount = getCount.getFileCountInSpecificBlobContainersSubContainer(
                storageConn, blobContainerName, subContainer);
        System.out.println(fileContainerCount);
    }

    public Integer getFileCountInSpecificBlobContainersSubContainer(
            String storageConn, String blobContainerName, String subContainer) throws Exception {
        try {
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConn);
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.getContainerReference(blobContainerName);
            // Find the sub-container (virtual directory) entry, then count its blobs.
            CloudBlobDirectory subDir = (CloudBlobDirectory) blobContainer
                    .listBlobsSegmented()
                    .getResults()
                    .stream()
                    .filter(listBlobItem -> listBlobItem.getUri().toString().contains(subContainer))
                    .findFirst()
                    .get();
            return subDir.listBlobsSegmented().getResults().size();
        } catch (Exception e) {
            throw new Exception(e.getMessage());
        }
    }
}
Count all blobs in a classic or new blob storage account. Building on @gandikota-saikoushik's answer, this solution works for blob containers with a very large number of blobs.
// Setup: set values from the Azure Portal
var accountName = "<ACCOUNTNAME>";
var accountKey = "<ACCOUNTKEY>";
var containerName = "<CONTAINERNAME>";
var uristr = $"DefaultEndpointsProtocol=https;AccountName={accountName};AccountKey={accountKey}";
var storageAccount = Microsoft.WindowsAzure.Storage.CloudStorageAccount.Parse(uristr);
var client = storageAccount.CreateCloudBlobClient();
var container = client.GetContainerReference(containerName);
var blobcount = CountBlobs(container).ConfigureAwait(false).GetAwaiter().GetResult();
Console.WriteLine($"blobcount:{blobcount}");

public static async Task<int> CountBlobs(CloudBlobContainer container)
{
    BlobContinuationToken continuationToken = null;
    var result = 0;
    do
    {
        var response = await container.ListBlobsSegmentedAsync(continuationToken);
        continuationToken = response.ContinuationToken;
        result += response.Results.Count();
    }
    while (continuationToken != null);
    return result;
}
The list-blobs approach is accurate but slow if you have millions of blobs. Another way that works in a few cases but is relatively fast is querying the MetricsHourPrimaryTransactionsBlob table. Metrics are at the account level and get aggregated hourly.
https://learn.microsoft.com/en-us/azure/storage/common/storage-analytics-metrics
You can use this:
public static async Task<List<IListBlobItem>> ListBlobsAsync()
{
    BlobContinuationToken continuationToken = null;
    List<IListBlobItem> results = new List<IListBlobItem>();
    // Resolve the container once, outside the paging loop.
    CloudBlobContainer container = GetContainer("containerName");
    do
    {
        var response = await container.ListBlobsSegmentedAsync(null,
            true, BlobListingDetails.None, 5000, continuationToken, null, null);
        continuationToken = response.ContinuationToken;
        results.AddRange(response.Results);
    } while (continuationToken != null);
    return results;
}
and then call
var count = (await ListBlobsAsync()).Count;
Hope it will be useful.