Unable to bulk import devices with a private storage account - Azure

I am working on stress testing for our IoT use case.
For testing, I need to create 100 devices.
So I have developed an Azure Function that uses IoT Hub's Import Devices feature, as described in the Microsoft docs.
When I ran the sample code with a public storage account for the import/output blob container SAS token, it worked as expected and created the devices on IoT Hub.
But when I use the same code with a private storage account, it sometimes throws a read error and sometimes a write error on blob storage, even though the SAS token has all the required permissions (Read, Write, Delete, Create, List, Add, etc.) and the private storage account has its DNS configuration available. For other uses I am able to add/update/delete blobs on the same private storage account.
The only problem I am facing is when calling the ImportDevicesAsync method of IoT Hub's RegistryManager.
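For reference, this is roughly how I generate the container SAS used in the code below (a sketch; the connection string variable and container name are placeholders from my setup):

CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("devices");
var policy = new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read
                | SharedAccessBlobPermissions.Write
                | SharedAccessBlobPermissions.Delete
                | SharedAccessBlobPermissions.Create
                | SharedAccessBlobPermissions.List
                | SharedAccessBlobPermissions.Add,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
};
// GetSharedAccessSignature returns a leading '?'; trim it since the URL concatenation below adds its own.
string assetsBlobContainerSas = container.GetSharedAccessSignature(policy).TrimStart('?');
string containerSasUri = container.Uri + "?" + assetsBlobContainerSas;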
My sample code is below.
First, create the devices.txt content and upload it to the proper container:
for (int i = deviceIndex; i < deviceCount + deviceIndex; i++)
{
    var deviceToAdd = new ExportImportDevice()
    {
        Id = $"{devicePrefix}{i.ToString().PadLeft(6, '0')}",
        ImportMode = importMode == "delete" ? ImportMode.Delete : ImportMode.Create,
        Status = DeviceStatus.Enabled,
        Authentication = new AuthenticationMechanism()
        {
            SymmetricKey = new SymmetricKey()
            {
                PrimaryKey = CryptoKeyGenerator.GenerateKey(32),
                SecondaryKey = CryptoKeyGenerator.GenerateKey(32)
            }
        },
        Tags = new TwinCollection(initialTags.SerializeObject())
    };
    serializedDevices.Add(deviceToAdd.SerializeObject());
}

// Write the list to the blob
StringBuilder sb = new();
serializedDevices.ForEach(serializedDevice => sb.AppendLine(serializedDevice));
Uri uri = new(assetsBlockBlobUrl + "?" + assetsBlobContainerSas);
CloudBlockBlob blob = new(uri);
await blob.DeleteIfExistsAsync();
using (CloudBlobStream stream = await blob.OpenWriteAsync())
{
    byte[] bytes = Encoding.UTF8.GetBytes(sb.ToString());
    for (var i = 0; i < bytes.Length; i += 500)
    {
        int length = Math.Min(bytes.Length - i, 500);
        await stream.WriteAsync(bytes.AsMemory(i, length));
    }
}
Then, to import the devices from the same container using the registryManager.ImportDevicesAsync method:
RegistryManager registryManager = RegistryManager.CreateFromConnectionString(Environment.GetEnvironmentVariable("iotHubConnectionString"));
JobProperties importJob = await registryManager.ImportDevicesAsync(containerSasUri, containerSasUri);

// Wait until the job is finished
while (true)
{
    importJob = await registryManager.GetJobAsync(importJob.JobId);
    _logger.LogInformation("import job " + importJob.Status);
    if (importJob.Status == JobStatus.Completed)
    {
        return Common.Utils.GetObjectResult(importMode == "delete" ? MessageConstants.SuccessDeletedAsset : MessageConstants.SuccessCreatedAsset);
    }
    else if (importJob.Status == JobStatus.Failed)
    {
        return Common.Utils.GetObjectResult(importMode == "delete" ? MessageConstants.DeleteDeviceFail : MessageConstants.CreateDeviceFail);
    }
    else if (importJob.Status == JobStatus.Cancelled)
    {
        return Common.Utils.GetObjectResult(importMode == "delete" ? MessageConstants.DeviceDeletionCancel : MessageConstants.DeviceCreationCancel);
    }
    await Task.Delay(TimeSpan.FromSeconds(5));
}

Related

How to mock Azure blobContainerClient.GetBlobsAsync()

I have an Azure blob container which I am accessing using the code below:
var blobContainerClient = GetBlobContainer(containerName);
if (blobContainerClient != null)
{
    // List all blobs in the container
    await foreach (BlobItem blobItem in blobContainerClient.GetBlobsAsync())
    {
        queuedBlobsList.Add(new QueuedBlobs { BlobName = blobItem.Name, LastModified = blobItem.Properties.LastModified });
    }
}

private BlobContainerClient GetBlobContainer(string containerName)
{
    return gen2StorageClient != null
        ? gen2StorageClient.GetBlobContainerClient(containerName)
        : gen1StorageClient.GetBlobContainerClient(containerName);
}
The clients are initialised in the constructor:
public class BlobService : IBlobService
{
    private readonly BlobServiceClient gen1StorageClient, gen2StorageClient;

    public BlobService(BlobServiceClient defaultClient, IAzureClientFactory<BlobServiceClient> clientFactory)
    {
        gen1StorageClient = defaultClient;
        if (clientFactory != null)
        {
            gen2StorageClient = clientFactory.CreateClient("StorageConnectionString");
        }
    }
}
And my unit test where I am setting up GetBlobsAsync is like this. But I want to return a list of BlobItems so I can test the loop above:
private static Mock<BlobContainerClient> GetBlobContainerClientMockWithListOfBlobs()
{
    var blobContainerClientMock = new Mock<BlobContainerClient>("UseDevelopmentStorage=true", EnvironmentConstants.ParallelUploadContainer);
    var cancellationToken = new CancellationToken();
    var blobs = new List<BlobItem>();
    //AsyncPageable<BlobItem> blobItems = new AsyncPageable<BlobItem>(); -- not allowed, AsyncPageable is abstract
    blobContainerClientMock.Setup(x => x.GetBlobsAsync(BlobTraits.All, BlobStates.All, null, cancellationToken)).Returns(It.IsAny<AsyncPageable<BlobItem>>());
    return blobContainerClientMock;
}
I came to this question because I also had the same issue.
Based on this article:
AsyncPageable<T> and Pageable<T> are classes that represent collections of models returned by the service in pages.
The method GetBlobsAsync returns an AsyncPageable<BlobItem>.
To create an AsyncPageable<BlobItem> you first need to create a Page of BlobItem values.
To create a Page<T> instance, use the Page<T>.FromValues method, passing a list of items, a continuation token, and the Response.
So let's start creating the list of items:
var blobList = new BlobItem[]
{
    BlobsModelFactory.BlobItem("Blob1"),
    BlobsModelFactory.BlobItem("Blob2"),
    BlobsModelFactory.BlobItem("Blob3")
};
Note: BlobItem has an internal constructor, but I found in this answer that there's a BlobsModelFactory.
After having the list of blobs, it's time to create a Page<BlobItem>:
Page<BlobItem> page = Page<BlobItem>.FromValues(blobList, null, Mock.Of<Response>());
And finally, create the AsyncPageable<BlobItem>:
AsyncPageable<BlobItem> pageableBlobList = AsyncPageable<BlobItem>.FromPages(new[] { page });
And now you are able to use this to mock the GetBlobsAsync method:
blobContainerClientMock
    .Setup(m => m.GetBlobsAsync(
        It.IsAny<BlobTraits>(),
        It.IsAny<BlobStates>(),
        It.IsAny<string>(),
        It.IsAny<CancellationToken>()))
    .Returns(pageableBlobList);
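For example, a test consuming the mock could look like this (a sketch using xUnit; blobContainerClientMock is the mock configured above):

[Fact]
public async Task GetBlobsAsync_ReturnsMockedBlobs()
{
    var names = new List<string>();
    // Iterates the AsyncPageable built from the fake page above.
    await foreach (BlobItem item in blobContainerClientMock.Object.GetBlobsAsync())
    {
        names.Add(item.Name);
    }
    Assert.Equal(new[] { "Blob1", "Blob2", "Blob3" }, names);
}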
I hope this helps others with this issue.
André

Azure DocumentDB, when uploading and executing - no result?

In my project I am supposed to get data from openweathermap.org and put it in a collection in my DocumentDB database in Azure.
The code below works locally on my development machine, but when I upload the project, it runs and succeeds (says the dashboard), yet no documents are created. I can only create the documents if I run it from my local machine.
Why is that?
Here is my code:
public static void Main()
{
    JobHost host = new JobHost();
    // The following code ensures that the WebJob will be running continuously
    host.Call(typeof(Program).GetMethod("saveWeatherDataToAzureDocumentDB"));
}

[NoAutomaticTrigger]
public static async void saveWeatherDataToAzureDocumentDB()
{
    string endpointUrl = ConfigurationManager.AppSettings["EndPointUrl"];
    string authorizationKey = ConfigurationManager.AppSettings["AuthorizationKey"];
    string url = "http://api.openweathermap.org/data/2.5/weather?q=hanstholm,dk&appid=44db6a862fba0b067b1930da0d769e98";

    var request = WebRequest.Create(url);
    string text;
    var response = (HttpWebResponse)request.GetResponse();
    using (var sr = new StreamReader(response.GetResponseStream()))
    {
        text = sr.ReadToEnd();
    }

    // Create a new instance of the DocumentClient
    var client = new DocumentClient(new Uri(endpointUrl), authorizationKey);

    // Check whether a database with id=weatherdata already exists
    Database database = client.CreateDatabaseQuery().Where(db => db.Id == "weatherdata").AsEnumerable().FirstOrDefault();

    // If the database does not exist, create a new database
    if (database == null)
    {
        database = await client.CreateDatabaseAsync(
            new Database
            {
                Id = "weatherdata"
            });
    }

    // Check whether a document collection with id=weathercollection already exists
    DocumentCollection documentCollection = client.CreateDocumentCollectionQuery(database.SelfLink).Where(c => c.Id == "weathercollection").AsEnumerable().FirstOrDefault();

    // If the document collection does not exist, create a new collection
    if (documentCollection == null)
    {
        documentCollection = await client.CreateDocumentCollectionAsync("dbs/" + database.Id,
            new DocumentCollection
            {
                Id = "weathercollection"
            });
    }

    // Deserialize to a dynamic object
    if (text == "")
    {
        mark m = new mark() { name = "Something" };
        await client.CreateDocumentAsync(documentCollection.DocumentsLink, m);
    }
    else
    {
        var json = JsonConvert.DeserializeObject<dynamic>(text);
        json["id"] = json["name"] + "_" + DateTime.Now;
        await client.CreateDocumentAsync(documentCollection.DocumentsLink, json);
    }
}

public sealed class mark
{
    public string name { get; set; }
}
UPDATE - This is what I have in my App.config
<appSettings>
    <!-- Replace the value with the value you copied from the Azure management portal -->
    <add key="EndPointUrl" value="https://<My account>.documents.azure.com:443/"/>
    <!-- Replace the value with the value you copied from the Azure management portal -->
    <add key="AuthorizationKey" value="The secret code from Azure"/>
</appSettings>
Also, at the DocumentDB account I find the connection string like this: AccountEndpoint=https://knoerregaard.documents.azure.com:443/;AccountKey=my secret password
How should I apply this to the WebJob?
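I assume the connection string simply splits into the two app settings, like this, but I am not sure:

<!-- My assumption: AccountEndpoint maps to EndPointUrl, AccountKey to AuthorizationKey -->
<add key="EndPointUrl" value="https://knoerregaard.documents.azure.com:443/"/>
<add key="AuthorizationKey" value="my secret password"/>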
Appreciate your help!

Upload and encode Audio file in Windows Azure Media Services

Everywhere online I can only find explanations related to video files uploaded to Azure Media Services.
Based on the tutorials I wrote my own code.
After running the StoreAudio method I have:
New Blob on Storage
New Asset on Media Services
New Job successfully completed on Media Services
The created asset is Not Published
When I try to read properties like Id or Uri from the converted asset, I get an exception.
Why are ID and URI null? Why is the content "not published"?
Code:
public string StoreAudio(int ID, byte[] file)
{
    try
    {
        var blobContainerName = AudioBookContainer; //+ AudioChapterID % 1000?
        var fileName = ID + ".mp3";
        var mediaBlobContainer = blobClient.GetContainerReference(blobContainerName);
        mediaBlobContainer.CreateIfNotExists();

        using (MemoryStream ms = new MemoryStream(file))
        {
            var reference = mediaBlobContainer.GetBlockBlobReference(fileName);
            reference.UploadFromStream(ms);
        }

        IAsset asset = _context.Assets.Create(fileName, AssetCreationOptions.None);
        IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy", TimeSpan.FromMinutes(120), AccessPermissions.Write);
        ILocator destinationLocator = _context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);
        Uri uploadUri = new Uri(destinationLocator.Path);
        string assetContainerName = uploadUri.Segments[1];
        CloudBlobContainer assetContainer = blobClient.GetContainerReference(assetContainerName);

        var sourceCloudBlob = mediaBlobContainer.GetBlockBlobReference(fileName);
        sourceCloudBlob.FetchAttributes();
        if (sourceCloudBlob.Properties.Length > 0)
        {
            IAssetFile assetFile = asset.AssetFiles.Create(fileName);
            var destinationBlob = assetContainer.GetBlockBlobReference(fileName);
            destinationBlob.DeleteIfExists();
            destinationBlob.StartCopyFromBlob(sourceCloudBlob);
            destinationBlob.FetchAttributes();
            if (sourceCloudBlob.Properties.Length != destinationBlob.Properties.Length)
                throw new Exception("Error copying");
        }
        destinationLocator.Delete();
        writePolicy.Delete();

        asset = _context.Assets.Where(a => a.Id == asset.Id).FirstOrDefault(); // At this point, you can create a job using your asset.
        var encodedAsset = EncodeToWMA(asset);
        return encodedAsset.Id;

        //var ismAssetFiles = encodedAsset.AssetFiles.ToList().Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase)).ToArray();
        //if (ismAssetFiles.Count() != 1)
        //    throw new ArgumentException("The asset should have only one .ism file");
        //ismAssetFiles.First().IsPrimary = true;
        //ismAssetFiles.First().Update();
        asset.Delete();
        return encodedAsset.Uri.AbsoluteUri;
    }
    catch (Exception exx)
    {
        return exx.Message + exx.InnerException;
    }
}

private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
{
    var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
    if (processor == null)
        throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));
    return processor;
}

public static IAsset EncodeToWMA(IAsset asset)
{
    IJob job = _context.Jobs.Create("Convert MP3 to WMA");
    IMediaProcessor processor = GetLatestMediaProcessorByName("Windows Azure Media Encoder");
    ITask task = job.Tasks.AddNew("My encoding task", processor, "WMA High Quality Audio", TaskOptions.None);
    task.InputAssets.Add(asset);
    task.OutputAssets.AddNew(asset.Name.Replace(".mp3", ".wma"), AssetCreationOptions.None);
    job.Submit();

    Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
    progressJobTask.Wait();
    return task.OutputAssets.First();
}
Adding more explanation:
The reason I suggested you create a SAS locator is that we don't support audio streaming for WMA files for now; therefore, getting an origin streaming locator won't work for you.
The reason the asset is "not published" is that you never published it - getting a SAS locator or origin locator is the way to get your asset published.
For an audio file, you can ask for a SAS locator to access the file; a SAS locator is used for progressive download.
_context.Locators.Create(LocatorType.Sas, outputAsset, AccessPermissions.Read, TimeSpan.FromDays(30));
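To turn that locator into a progressive-download URL, you can splice the file name into the locator path, along these lines (a sketch; outputAsset and the .wma file lookup are assumptions based on the code above):

ILocator sasLocator = _context.Locators.Create(LocatorType.Sas, outputAsset, AccessPermissions.Read, TimeSpan.FromDays(30));
IAssetFile wmaFile = outputAsset.AssetFiles.ToList().First(f => f.Name.EndsWith(".wma", StringComparison.OrdinalIgnoreCase));

// Insert the file name into the path; UriBuilder keeps the SAS query string intact.
var uriBuilder = new UriBuilder(sasLocator.Path);
uriBuilder.Path += "/" + wmaFile.Name;
string downloadUrl = uriBuilder.Uri.AbsoluteUri;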

Append to CloudBlockBlob stream

We have a file system abstraction that allows us to easily switch between local and cloud (Azure) storage.
For reading and writing files we have the following members:
Stream OpenRead();
Stream OpenWrite();
Part of our application "bundles" documents into one file. For our local storage provider OpenWrite returns an appendable stream:
public Stream OpenWrite()
{
    return new FileStream(fileInfo.FullName, FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite, BufferSize, useAsync: true);
}
For Azure blob storage we do the following:
public Stream OpenWrite()
{
    return blob.OpenWrite();
}
Unfortunately this overwrites the blob contents each time. Is it possible to return a writable stream that can be appended to?
Based on the documentation for OpenWrite here http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblockblob.openwrite.aspx, the OpenWrite method will overwrite an existing blob unless explicitly prevented using the accessCondition parameter.
One thing you could do is read the blob data into a stream, return that stream to your calling application, and let the application append data to it. For example, see the code below:
static void BlobStreamTest()
{
    storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
    CloudBlobContainer container = storageAccount.CreateCloudBlobClient().GetContainerReference("temp");
    container.CreateIfNotExists();
    CloudBlockBlob blob = container.GetBlockBlobReference("test.txt");
    blob.UploadFromStream(new MemoryStream()); // Let's just create an empty blob for the sake of demonstration.
    for (int i = 0; i < 10; i++)
    {
        try
        {
            using (MemoryStream ms = new MemoryStream())
            {
                blob.DownloadToStream(ms); // Read blob data into a stream.
                byte[] dataToWrite = Encoding.UTF8.GetBytes("This is line # " + (i + 1) + "\r\n");
                ms.Write(dataToWrite, 0, dataToWrite.Length);
                ms.Position = 0;
                blob.UploadFromStream(ms);
            }
        }
        catch (StorageException excep)
        {
            if (excep.RequestInformation.HttpStatusCode != 404)
            {
                throw;
            }
        }
    }
}
There is now a CloudAppendBlob class that allows you to add content to an existing blob:
var account = CloudStorageAccount.Parse("storage account connectionstring");
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("container name");
var blob = container.GetAppendBlobReference("blob name");
In your case you want to append from a stream:
await blob.AppendFromStreamAsync(new MemoryStream());
But you can also append from text, a byte array, or a file. Check the documentation.
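For example, a minimal end-to-end sketch (the blob name is a placeholder; this assumes the same classic storage SDK that CloudAppendBlob ships in):

var account = CloudStorageAccount.Parse("storage account connectionstring");
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("container name");
var appendBlob = container.GetAppendBlobReference("log.txt");

// Create the append blob only if it doesn't exist yet; CreateOrReplace would wipe it.
if (!await appendBlob.ExistsAsync())
{
    await appendBlob.CreateOrReplaceAsync();
}

await appendBlob.AppendTextAsync("first line\r\n");
await appendBlob.AppendTextAsync("second line\r\n");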

Getting blob count in an Azure Storage container

What is the most efficient way to get the count of blobs in an Azure Storage container?
Right now I can't think of any way other than the code below:
CloudBlobContainer container = GetContainer("mycontainer");
var count = container.ListBlobs().Count();
If you just want to know how many blobs are in a container without writing code you can use the Microsoft Azure Storage Explorer application.
Open the desired BlobContainer
Click the Folder Statistics icon
Observe the count of blobs in the Activities window
I tried counting blobs using ListBlobs() and, for a container with about 400,000 items, it took well over 5 minutes.
If you have complete control over the container (that is, you control when writes occur), you could cache the size information in the container metadata and update it every time an item gets removed or inserted. Here is a piece of code that would return the container blob count:
static int CountBlobs(string storageAccount, string containerId)
{
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(storageAccount);
    CloudBlobClient blobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = blobClient.GetContainerReference(containerId);
    cloudBlobContainer.FetchAttributes();

    string count = cloudBlobContainer.Metadata["ItemCount"];
    string countUpdateTime = cloudBlobContainer.Metadata["CountUpdateTime"];

    bool recountNeeded = false;
    if (String.IsNullOrEmpty(count) || String.IsNullOrEmpty(countUpdateTime))
    {
        recountNeeded = true;
    }
    else
    {
        DateTime dateTime = new DateTime(long.Parse(countUpdateTime));
        // Are we close to the last modified time?
        if (Math.Abs(dateTime.Subtract(cloudBlobContainer.Properties.LastModifiedUtc).TotalSeconds) > 5)
        {
            recountNeeded = true;
        }
    }

    int blobCount;
    if (recountNeeded)
    {
        blobCount = 0;
        BlobRequestOptions options = new BlobRequestOptions();
        options.BlobListingDetails = BlobListingDetails.Metadata;
        foreach (IListBlobItem item in cloudBlobContainer.ListBlobs(options))
        {
            blobCount++;
        }
        cloudBlobContainer.Metadata.Set("ItemCount", blobCount.ToString());
        cloudBlobContainer.Metadata.Set("CountUpdateTime", DateTime.Now.Ticks.ToString());
        cloudBlobContainer.SetMetadata();
    }
    else
    {
        blobCount = int.Parse(count);
    }
    return blobCount;
}
This, of course, assumes that you update ItemCount/CountUpdateTime every time the container is modified. CountUpdateTime is a heuristic safeguard (if the container did get modified without someone updating CountUpdateTime, this will force a re-count) but it's not reliable.
The API doesn't contain a container count method or property, so you'd need to do something like what you posted. However, you'll need to deal with NextMarker if you exceed 5,000 items returned (or if you specify a maximum number to return and the list exceeds that number). You'll then make additional calls based on NextMarker and add up the counts.
EDIT: Per smarx: the SDK takes care of NextMarker for you. You only need to deal with NextMarker if you're working at the API level, calling List Blobs through REST.
Alternatively, if you're controlling the blob insertions/deletions (through a WCF service, for example), you can use the blob container's metadata area to store a cached container count that you update with each insert or delete. You'll just need to deal with concurrent writes to the container.
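For example, the insert path of such a service could look roughly like this (a sketch; it assumes a classic-SDK version where Metadata is an IDictionary<string, string>, and it omits the concurrency handling mentioned above):

// Hypothetical helper: upload a blob and bump the cached ItemCount metadata in one place.
static void UploadAndCount(CloudBlobContainer container, string blobName, Stream content)
{
    container.GetBlockBlobReference(blobName).UploadFromStream(content);

    container.FetchAttributes(); // refresh metadata before updating it
    container.Metadata.TryGetValue("ItemCount", out string existing);
    int current = int.TryParse(existing, out int n) ? n : 0;
    container.Metadata["ItemCount"] = (current + 1).ToString();
    container.Metadata["CountUpdateTime"] = DateTime.UtcNow.Ticks.ToString();
    container.SetMetadata();
}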
An example using the PHP API and getNextMarker. It counts the total number of blobs in an Azure container.
It takes a long time: about 30 seconds for 100,000 blobs.
(This assumes we have a valid $connectionString and a $container_name.)
$blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);
$opts = new ListBlobsOptions();
$nblobs = 0;
$cont = true; // must be initialised, otherwise the loop never runs
while ($cont) {
    $blob_list = $blobRestProxy->listBlobs($container_name, $opts);
    $nblobs += count($blob_list->getBlobs());
    $nextMarker = $blob_list->getNextMarker();
    if (!$nextMarker || strlen($nextMarker) == 0) {
        $cont = false;
    } else {
        $opts->setMarker($nextMarker);
    }
}
echo $nblobs;
If you are not using virtual directories, the following will work as previously answered.
CloudBlobContainer container = GetContainer("mycontainer");
var count = container.ListBlobs().Count();
However, the above code snippet may not give the desired count if you are using virtual directories.
For instance, if your blobs are stored like /container/directory/filename.txt, where the blob name = directory/filename.txt, then container.ListBlobs().Count() will only count how many "/directory" virtual directories you have. If you want to list blobs contained within virtual directories, you need to set useFlatBlobListing = true in the ListBlobs() call.
CloudBlobContainer container = GetContainer("mycontainer");
var count = container.ListBlobs(null, true).Count();
Note: the ListBlobs() call with useFlatBlobListing = true is a much more expensive/slow call...
Bearing in mind all the performance concerns from the other answers, here is a version for v12 of the Azure SDK leveraging IAsyncEnumerable. This requires a package reference to System.Linq.Async.
public async Task<int> GetBlobCount()
{
    var container = await GetBlobContainerClient();
    var blobsPaged = container.GetBlobsAsync();
    return await blobsPaged
        .AsAsyncEnumerable()
        .CountAsync();
}
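If you would rather not take the System.Linq.Async dependency, a plain await foreach over the same AsyncPageable produces the identical count (a sketch reusing the GetBlobContainerClient helper from above):

public async Task<int> GetBlobCountWithoutLinqAsync()
{
    var container = await GetBlobContainerClient();
    int count = 0;
    // AsyncPageable<BlobItem> implements IAsyncEnumerable<BlobItem>, so it can be enumerated directly.
    await foreach (BlobItem _ in container.GetBlobsAsync())
    {
        count++;
    }
    return count;
}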
With the Python API for Azure Storage it looks like this:
from azure.storage import *

blob_service = BlobService(account_name='myaccount', account_key='mykey')
blobs = blob_service.list_blobs('mycontainer')
len(blobs)  # returns the number of blobs in a container
If you are using the Azure.Storage.Blobs library, you can use something like the code below:
public int GetBlobCount(string containerName)
{
    int count = 0;
    BlobContainerClient container = new BlobContainerClient(blobConnectionString, containerName);
    container.GetBlobs().ToList().ForEach(blob => count++);
    return count;
}
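Since GetBlobs() returns a Pageable<BlobItem>, which implements IEnumerable<BlobItem>, the counting loop can be collapsed into a single LINQ call:

// Equivalent one-liner (requires System.Linq).
public int GetBlobCount(string containerName) =>
    new BlobContainerClient(blobConnectionString, containerName).GetBlobs().Count();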
Another Python example; it works slowly but correctly with more than 5,000 files:
from azure.storage.blob import BlobServiceClient

constr = "Connection string"
container = "Container name"

blob_service_client = BlobServiceClient.from_connection_string(constr)
container_client = blob_service_client.get_container_client(container)
blobs_list = container_client.list_blobs()

num = 0
size = 0
for blob in blobs_list:
    num += 1
    size += blob.size
    print(blob.name, blob.size)

print("Count: ", num)
print("Size: ", size)
I spent quite a long time finding the solution below - I don't want someone like me to waste that time - so I am replying here even after 9 years.
package com.sai.koushik.gandikota.test.app;

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.*;

public class AzureBlobStorageUtils {

    public static void main(String[] args) throws Exception {
        AzureBlobStorageUtils getCount = new AzureBlobStorageUtils();
        String storageConn = "<StorageAccountConnection>";
        String blobContainerName = "<containerName>";
        String subContainer = "<subContainerName>";
        Integer fileContainerCount = getCount.getFileCountInSpecificBlobContainersSubContainer(storageConn, blobContainerName, subContainer);
        System.out.println(fileContainerCount);
    }

    public Integer getFileCountInSpecificBlobContainersSubContainer(String storageConn, String blobContainerName, String subContainer) throws Exception {
        try {
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConn);
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.getContainerReference(blobContainerName);
            return ((CloudBlobDirectory) blobContainer.listBlobsSegmented().getResults().stream().filter(listBlobItem -> listBlobItem.getUri().toString().contains(subContainer)).findFirst().get()).listBlobsSegmented().getResults().size();
        } catch (Exception e) {
            throw new Exception(e.getMessage());
        }
    }
}
Count all blobs in a classic or new blob storage account. Building on @gandikota-saikoushik's answer, this solution works for blob containers with a very large number of blobs.
// Setup: set values from the Azure Portal
var accountName = "<ACCOUNTNAME>";
var accountKey = "<ACCOUNTKEY>";
var containerName = "<CONTAINERNAME>";

var uristr = $"DefaultEndpointsProtocol=https;AccountName={accountName};AccountKey={accountKey}";
var storageAccount = Microsoft.WindowsAzure.Storage.CloudStorageAccount.Parse(uristr);
var client = storageAccount.CreateCloudBlobClient();
var container = client.GetContainerReference(containerName);

BlobContinuationToken continuationToken = new BlobContinuationToken();
var blobcount = CountBlobs(container, continuationToken).ConfigureAwait(false).GetAwaiter().GetResult();
Console.WriteLine($"blobcount:{blobcount}");
public static async Task<int> CountBlobs(CloudBlobContainer container, BlobContinuationToken currentToken)
{
    BlobContinuationToken continuationToken = null;
    var result = 0;
    do
    {
        var response = await container.ListBlobsSegmentedAsync(continuationToken);
        continuationToken = response.ContinuationToken;
        result += response.Results.Count();
    }
    while (continuationToken != null);
    return result;
}
The list-blobs approach is accurate but slow if you have millions of blobs. Another way that works in a few cases, but is relatively fast, is querying the MetricsHourPrimaryTransactionsBlob table. It is at the account level, and metrics get aggregated hourly.
https://learn.microsoft.com/en-us/azure/storage/common/storage-analytics-metrics
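For blob counts specifically, a rough sketch of reading the related capacity table ($MetricsCapacityBlob, which per the linked docs exposes a daily account-level ObjectCount) with the classic Table SDK might look like this, assuming storage analytics is enabled on the account:

// Sketch: storage analytics tables are readable with the Table SDK.
// $MetricsCapacityBlob holds daily capacity rows; the "data" rows carry an
// ObjectCount column with the total number of blobs in the account.
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
CloudTable capacityTable = tableClient.GetTableReference("$MetricsCapacityBlob");
var query = new TableQuery<DynamicTableEntity>()
    .Where(TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.Equal, "data"));
foreach (DynamicTableEntity row in capacityTable.ExecuteQuery(query))
{
    Console.WriteLine($"{row.PartitionKey}: {row.Properties["ObjectCount"].Int64Value} blobs");
}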
You can use this
public static async Task<List<IListBlobItem>> ListBlobsAsync()
{
    BlobContinuationToken continuationToken = null;
    List<IListBlobItem> results = new List<IListBlobItem>();
    do
    {
        CloudBlobContainer container = GetContainer("containerName");
        var response = await container.ListBlobsSegmentedAsync(null,
            true, BlobListingDetails.None, 5000, continuationToken, null, null);
        continuationToken = response.ContinuationToken;
        results.AddRange(response.Results);
    } while (continuationToken != null);
    return results;
}
and then call:
var count = (await ListBlobsAsync()).Count;
Hope it will be useful.
