How do I access blob storage sub containers in my ASP.NET Core web application? - azure

I've created a solution that accesses my container contents within my ASP.NET Core 3.1 application and returns those contents as a list to my view. At the moment the application accesses data in the root container, which is called upload. However, this container has many sub containers and I would like to list the blobs in a specific one called 1799.
So, instead of accessing upload and showing me the full contents of that container, I want to access upload/1799 and list all the blobs within that sub container.
I can't see any way to pass this sub container to my method and make this happen. Can anyone help?
Here is my code so far:
CarController.cs
using System.Collections.Generic;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;

namespace MyProject.Controllers
{
    public class HomeController : Controller
    {
        private readonly IConfiguration _configuration;
        private readonly string accessKey = string.Empty;

        public HomeController(IConfiguration configuration)
        {
            _configuration = configuration;
            accessKey = _configuration.GetConnectionString("AzureStorage");
        }

        [HttpGet]
        public IActionResult Edit(int id)
        {
            var car = _carService.GetVessel(id); // _carService is defined elsewhere in this controller
            string strContainerName = "uploads";
            string subdir = "1799";
            var filelist = new List<BlobListViewModel>();

            BlobServiceClient blobServiceClient = new BlobServiceClient(accessKey);
            BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(strContainerName);

            var blobs = containerClient.GetBlobs();
            foreach (var item in blobs)
            {
                filelist.Add(new BlobListViewModel
                {
                    FileName = item.Name
                });
            }
            return View(filelist);
        }
    }
}
I've hunted through all the documentation and I can't find anything related to this.

Thanks to David Browne for his comment: there is indeed a prefix option in GetBlobs(). The parameters act like mini-filters in some ways, letting you define which blob traits are returned, which blob states are included, and the prefix to match. Here is my code, with the first two options set to zero, meaning the default for both traits and states.
[HttpGet]
public IActionResult Edit(int id)
{
    var car = _carService.GetVessel(id);
    string strContainerName = "uploads";
    string subdir = "1799";
    var filelist = new List<BlobListViewModel>();

    BlobServiceClient blobServiceClient = new BlobServiceClient(accessKey);
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(strContainerName);

    // Traits, States, Prefix (0 is the default for both the traits and states enums)
    var blobs = containerClient.GetBlobs(0, 0, subdir);
    foreach (var item in blobs)
    {
        filelist.Add(new BlobListViewModel
        {
            FileName = item.Name
        });
    }
    return View(filelist);
}
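Not part of the answer above, but if you also want to discover nested sub folders under 1799, the hierarchical listing on the same v12 client may help. A minimal sketch, reusing the containerClient and filelist from the method above:
// Sketch: hierarchical listing of what sits directly under "1799/".
foreach (var item in containerClient.GetBlobsByHierarchy(prefix: "1799/", delimiter: "/"))
{
    if (item.IsBlob)
    {
        filelist.Add(new BlobListViewModel { FileName = item.Blob.Name });
    }
    // Non-blob items are nested virtual folders; their path is exposed as item.Prefix.
}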

Related

Get complete hierarchy of azure blob structure based on prefix

I have the following hierarchy in my Azure Storage container:
Container
-- Folder 1
   -- Folder 2
      -- Folder 2.1
         -- File 1
         -- File 2
         -- File 3
What I'm searching for is a generic function where I can pass a string, e.g. "container/Folder1/Folder2", and it returns the hierarchy beneath it, i.e.
-- Folder 2.1
   -- File 1
   -- File 2
   -- File 3
I have the following code in place, but with it I'm not able to pass the prefix as "container/Folder1/Folder2". If I add "/" to the prefix string, it throws an error that the URI string is invalid.
static void printCloudDirectories(IEnumerable<IListBlobItem> blobList, Container cont)
{
    foreach (var blobitem in blobList)
    {
        if (blobitem is CloudBlobDirectory)
        {
            var container = new Container();
            var directory = blobitem as CloudBlobDirectory;
            Console.WriteLine(directory.Prefix);
            container.Name = directory.Prefix;
            BlobContinuationToken token = null;
            var directories = directory.ListBlobsSegmentedAsync(token).Result.Results;
            printCloudDirectories(directories, container);
            cont.Containers.Add(container);
        }
        else
        {
            cont.Children.Add(blobitem.Uri.AbsoluteUri);
        }
    }
}

public static void ListClientMethod(CloudBlobClient cloudBlobClient)
{
    BlobContinuationToken token = null;
    var containerSegments = cloudBlobClient.ListContainersSegmentedAsync(token).Result;
    List<Container> containers = new List<Container>();
    foreach (var container in containerSegments.Results)
    {
        Console.WriteLine("Container: " + container.Name);
        var cont = new Container();
        cont.Name = container.Name;
        // ADD A CALL TO printCloudDirectories:
        BlobContinuationToken token1 = null;
        var blobs = container.ListBlobsSegmentedAsync(token1).Result.Results;
        printCloudDirectories(blobs, cont);
        containers.Add(cont);
    }
}

public class Container
{
    public Container()
    {
        Children = new List<string>();
        Containers = new List<Container>();
    }
    public string Name { get; set; }
    public List<string> Children { get; set; }
    public List<Container> Containers { get; set; }
}
I am using C#.
Please use the ListBlobsSegmentedAsync(String, Boolean, BlobListingDetails, Nullable<Int32>, BlobContinuationToken, BlobRequestOptions, OperationContext) method.
The 1st parameter to this method is the blob prefix; specify Folder 1/Folder 2/ there.
The 2nd parameter is useFlatBlobListing; pass true for that.
It should return you a result like:
Folder 1/Folder 2/Folder 2.1/File 1
Folder 1/Folder 2/Folder 2.1/File 2
Folder 1/Folder 2/Folder 2.1/File 3
and you should be able to construct the desired treeview based on this.
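A rough sketch of that call, assuming a CloudBlobContainer variable named container and an async context; apart from the prefix and useFlatBlobListing, every parameter is left at its default:
// Sketch: flat listing under a folder prefix with the older (v11-style) SDK.
BlobContinuationToken token = null;
var names = new List<string>();
do
{
    var segment = await container.ListBlobsSegmentedAsync(
        "Folder 1/Folder 2/",     // prefix
        true,                     // useFlatBlobListing
        BlobListingDetails.None,  // blobListingDetails
        null,                     // maxResults
        token,                    // currentToken
        null,                     // options
        null);                    // operationContext
    foreach (var blob in segment.Results.OfType<CloudBlockBlob>())
    {
        names.Add(blob.Name);     // e.g. "Folder 1/Folder 2/Folder 2.1/File 1"
    }
    token = segment.ContinuationToken;
} while (token != null);
Splitting those flat names on "/" gives you the segments needed to rebuild the treeview.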

Tidier way of getting Directory Names only from Blob Container

I am trying to get only the directory names of any directories in a specific location within Blob Storage.
I have the helper class below
public static class BlobHelper
{
    private static CloudBlobContainer _cloudBlobContainer;
    private const string _containerName = "administrator";

    public static void Setup(string connectionString)
    {
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
        _cloudBlobContainer = cloudBlobClient.GetContainerReference(_containerName);
    }

    public static List<string> GetDirectoryNames(string relativeAddress)
    {
        var result = new List<string>();
        var directory = _cloudBlobContainer.GetDirectoryReference(relativeAddress);
        var folders = directory.ListBlobs().OfType<CloudBlobDirectory>();
        foreach (var folder in folders)
        {
            var name = folder.Uri.AbsolutePath;
            name = name.Replace(folder.Parent.Prefix, string.Empty)
                       .Replace(@"/", string.Empty)
                       .Replace(_containerName, string.Empty);
            result.Add(name);
        }
        return result;
    }
}
The process of getting the directory names only (i.e. not the full hierarchy) feels a bit hacky, although it does work.
Is there a better way to do this?
I tried the approach below:
var directory = _cloudBlobContainer.GetDirectoryReference(relativeAddress);
var blobs = directory.ListBlobs(true).OfType<CloudBlobDirectory>();
var blobNames = blobs.OfType<CloudBlockBlob>().Select(b => b.Name).ToList();
return blobNames;
The main difference with the above is the use of useFlatBlobListing as true.
However, this approach results in no folders at all being returned, whereas my other logic does at least give me the 2 folders I expect to find.
Any ideas what I am doing wrong?
Cheers
Paul
I suppose your code is OK; I'm not sure what you mean by "a bit hacky". I think you want to get the directories directly.
Since there is no method that returns the directories directly, the known ways to do it with the v11 SDK mostly rely on the blob URI or prefix.
Below is my way to do it.
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("test");
BlobContinuationToken blobContinuationToken = null;
var blobsSeg = cloudBlobContainer.ListBlobsSegmented(null, blobContinuationToken);
var directories = blobsSeg.Results.OfType<CloudBlobDirectory>().Select(b => b.Prefix).ToList();
foreach (string directory in directories)
{
    Console.WriteLine(directory);
}
The result it returns is the list of top-level directory prefixes in the container. Hope this helps.
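If you need the sub-directories under a specific folder rather than at the container root, the same pattern should also work by passing a prefix; a minimal sketch, where the "somefolder/" prefix is just an example:
// Sketch: list the virtual sub-directories directly under a given folder prefix (v11 SDK).
BlobContinuationToken token = null;
var segment = cloudBlobContainer.ListBlobsSegmented("somefolder/", token);
var subDirectories = segment.Results
    .OfType<CloudBlobDirectory>()
    .Select(d => d.Prefix)   // e.g. "somefolder/child/"
    .ToList();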

Transferring files from previously deleted azure media service to new media service- is it possible?

Is it possible to transfer files from an old Media Services account, which was accidentally deleted, to a new Media Services account?
Microsoft tech support has been no help.
I'm able to copy the files over to the new Media Services account, but when I test whether I can publish one of the assets in the portal, it gives me successful streaming URLs; when I try to access those, I get a network error:
<serverError>
  <status>404</status>
  <subStatus>1000</subStatus>
  <hresult>MPE_STORAGE_RESOURCE_NOT_FOUND</hresult>
  <activityId>80028340-0004-F800-B63F-84710C7967BB</activityId>
  <serviceId>4A42CB8E-4542-0C18-2C0D-4B460D96B604</serviceId>
</serverError>
I don't think it can find the manifest file, which is named pc124m190o_AdaptiveStreaming_manifest.xml.
The name of the metadata file could also be a potential problem:
5f7e8f45-87e9-49ce-a2ae-7bb673bf0b0f_metadata.xml
Has anyone successfully done this?
Here is the code I'm using to copy the files. Maybe the error is here?
class Program
{
    // Read values from the App.config file.
    private static readonly string _sourceStorageAccountName =
        ConfigurationManager.AppSettings["SourceStorageAccountName"];
    private static readonly string _sourceStorageAccountKey =
        ConfigurationManager.AppSettings["SourceStorageAccountKey"];
    private static readonly string _NameOfBlobContainerYouWantToCopy =
        ConfigurationManager.AppSettings["NameOfBlobContainerYouWantToCopy"];
    private static readonly string _AMSAADTenantDomain =
        ConfigurationManager.AppSettings["AMSAADTenantDomain"];
    private static readonly string _AMSRESTAPIEndpoint =
        ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
    private static readonly string _AMSClientId =
        ConfigurationManager.AppSettings["AMSClientId"];
    private static readonly string _AMSClientSecret =
        ConfigurationManager.AppSettings["AMSClientSecret"];
    private static readonly string _AMSStorageAccountName =
        ConfigurationManager.AppSettings["AMSStorageAccountName"];
    private static readonly string _AMSStorageAccountKey =
        ConfigurationManager.AppSettings["AMSStorageAccountKey"];

    // Field for service context.
    private static CloudMediaContext _context = null;
    private static CloudStorageAccount _sourceStorageAccount = null;
    private static CloudStorageAccount _destinationStorageAccount = null;

    static void Main(string[] args)
    {
        AzureAdTokenCredentials tokenCredentials = new AzureAdTokenCredentials(_AMSAADTenantDomain,
            new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
            AzureEnvironments.AzureCloudEnvironment);
        var tokenProvider = new AzureAdTokenProvider(tokenCredentials);

        // Create the context for your source Media Services account.
        _context = new CloudMediaContext(new Uri(_AMSRESTAPIEndpoint), tokenProvider);
        _sourceStorageAccount =
            new CloudStorageAccount(new StorageCredentials(_sourceStorageAccountName,
                _sourceStorageAccountKey), true);
        _destinationStorageAccount =
            new CloudStorageAccount(new StorageCredentials(_AMSStorageAccountName,
                _AMSStorageAccountKey), true);

        CloudBlobClient sourceCloudBlobClient =
            _sourceStorageAccount.CreateCloudBlobClient();

        // CreateAssetFromExistingBlobs(sourceContainer);
        // GetAllContainerNames (helper not shown here) returns the names of all containers in the source account.
        List<string> containers = GetAllContainerNames(sourceCloudBlobClient);
        foreach (string item in containers)
        {
            CloudBlobContainer sourceContainer =
                sourceCloudBlobClient.GetContainerReference(item);
            CreateAssetFromExistingBlobs(sourceContainer);
            Console.WriteLine("finished " + item);
        }
    }

    static private IAsset CreateAssetFromExistingBlobs(CloudBlobContainer sourceBlobContainer)
    {
        CloudBlobClient destBlobStorage = _destinationStorageAccount.CreateCloudBlobClient();

        // Create a new asset.
        IAsset asset = _context.Assets.Create("NewAsset_" + Guid.NewGuid(), AssetCreationOptions.None);
        IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy",
            TimeSpan.FromHours(24), AccessPermissions.Write);
        ILocator destinationLocator =
            _context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);

        // Get the asset container URI and Blob copy from mediaContainer to assetContainer.
        CloudBlobContainer destAssetContainer =
            destBlobStorage.GetContainerReference((new Uri(destinationLocator.Path)).Segments[1]);
        if (destAssetContainer.CreateIfNotExists())
        {
            destAssetContainer.SetPermissions(new BlobContainerPermissions
            {
                PublicAccess = BlobContainerPublicAccessType.Blob
            });
        }

        var blobList = sourceBlobContainer.ListBlobs();
        foreach (CloudBlockBlob sourceBlob in blobList)
        {
            var assetFile = asset.AssetFiles.Create((sourceBlob as ICloudBlob).Name);
            ICloudBlob destinationBlob = destAssetContainer.GetBlockBlobReference(assetFile.Name);
            // CopyBlob (helper not shown here) copies sourceBlob into destAssetContainer.
            CopyBlob(sourceBlob, destAssetContainer);
            sourceBlob.FetchAttributes();
            assetFile.ContentFileSize = (sourceBlob as ICloudBlob).Properties.Length;
            assetFile.Update();
            Console.WriteLine("File {0} is of {1} size", assetFile.Name, assetFile.ContentFileSize);
        }

        asset.Update();
        destinationLocator.Delete();
        writePolicy.Delete();

        // Set the primary asset file.
        // If, for example, we copied a set of Smooth Streaming files,
        // set the .ism file to be the primary file.
        // If we, for example, copied an .mp4, then the mp4 would be the primary file.
        var ismAssetFile = asset.AssetFiles.ToList().
            Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase)).ToArray().FirstOrDefault();
        // The following code assigns the first .ism file as the primary file in the asset.
        // An asset should have one .ism file.
        if (ismAssetFile != null)
        {
            ismAssetFile.IsPrimary = true;
            ismAssetFile.Update();
        }
        return asset;
    }
}
Here is what my media storage window looks like
The manifest file you mentioned is technically not required for streaming. What is missing is that when you copied the files to your new Storage account, the new Media Services account knew nothing of them. You must create new assets and copy the files into the new assets for Media Services to see them. I recommend doing an import with Azure Media Services Explorer in your new account from the Storage account.

UWP apps accessing files from random location on system

In UWP there are file and permission restrictions, so we can only access files directly from a few folders, or we can use the file picker to access files from anywhere on the system.
How can I take the files picked from the file picker and use them again any time the app runs? I tried to use them again by path, but that gives a permission error. I know about the FutureAccessList, but its limit is 1000 entries, and it will also make the app slow, if I am not wrong.
Is there a better way to do this? Or can we store storage file links somehow in a local SQLite database?
If you need to access lots of files, asking the user to select the parent folder and then storing that is probably a better solution (unless you want to store 1,000 individually-picked files from different locations). You can store StorageFolders in the access list as well.
I'm not sure why you think it will make your app slow, but the only real way to know if this will affect your performance is to try it and measure against your goals.
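As a minimal sketch of that idea (the "pickedFolderToken" settings key is my own), you can add the picked folder to the FutureAccessList and reopen it by token on a later run:
// Sketch: persist access to a user-picked folder across app runs.
var folderPicker = new Windows.Storage.Pickers.FolderPicker();
folderPicker.FileTypeFilter.Add("*"); // at least one filter entry is required
Windows.Storage.StorageFolder folder = await folderPicker.PickSingleFolderAsync();
if (folder != null)
{
    // Store the folder and remember its token in local settings.
    string token = Windows.Storage.AccessCache.StorageApplicationPermissions
        .FutureAccessList.Add(folder);
    Windows.Storage.ApplicationData.Current.LocalSettings.Values["pickedFolderToken"] = token;
}

// On a later run, re-open the folder (and its files) without another picker prompt.
string savedToken = (string)Windows.Storage.ApplicationData.Current.LocalSettings.Values["pickedFolderToken"];
Windows.Storage.StorageFolder savedFolder = await Windows.Storage.AccessCache
    .StorageApplicationPermissions.FutureAccessList.GetFolderAsync(savedToken);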
Considering this method..
public async static Task<byte[]> ToByteArray(this StorageFile file)
{
    byte[] fileBytes = null;
    using (IRandomAccessStreamWithContentType stream = await file.OpenReadAsync())
    {
        fileBytes = new byte[stream.Size];
        using (DataReader reader = new DataReader(stream))
        {
            await reader.LoadAsync((uint)stream.Size);
            reader.ReadBytes(fileBytes);
        }
    }
    return fileBytes;
}
This class..
public class AppFile
{
    public string FileName { get; set; }
    public byte[] ByteArray { get; set; }
}
And this variable
List<AppFile> _appFiles = new List<AppFile>();
Just..
var fileOpenPicker = new FileOpenPicker();
fileOpenPicker.FileTypeFilter.Add("*"); // at least one file type filter is required before picking
IReadOnlyList<StorageFile> files = await fileOpenPicker.PickMultipleFilesAsync();
foreach (var file in files)
{
    var byteArray = await file.ToByteArray();
    _appFiles.Add(new AppFile { FileName = file.DisplayName, ByteArray = byteArray });
}
UPDATE
using Newtonsoft.Json;
using System.Linq;
using Windows.Security.Credentials;
using Windows.Storage;

namespace Your.Namespace
{
    public class StateService
    {
        public void SaveState<T>(string key, T value)
        {
            var localSettings = ApplicationData.Current.LocalSettings;
            localSettings.Values[key] = JsonConvert.SerializeObject(value);
        }

        public T LoadState<T>(string key)
        {
            var localSettings = ApplicationData.Current.LocalSettings;
            if (localSettings.Values.ContainsKey(key))
                return JsonConvert.DeserializeObject<T>((string)localSettings.Values[key]);
            return default(T);
        }

        public void RemoveState(string key)
        {
            var localSettings = ApplicationData.Current.LocalSettings;
            if (localSettings.Values.ContainsKey(key))
                localSettings.Values.Remove(key);
        }

        public void Clear()
        {
            ApplicationData.Current.LocalSettings.Values.Clear();
        }
    }
}
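A quick usage sketch, assuming the _appFiles list from earlier and that the serialized payload stays within the per-value size limits of LocalSettings:
// Sketch: persist and restore the in-memory file list with the StateService above.
var stateService = new StateService();
stateService.SaveState("appFiles", _appFiles);

// Later, e.g. on the next launch:
List<AppFile> restored = stateService.LoadState<List<AppFile>>("appFiles");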
A bit late, but yes, the future access list will slow down your app in that it returns StorageFile, StorageFolder, or StorageItem objects. These run via the runtime broker, which hits a huge performance barrier at about 400 objects regardless of the host's capability.

azure CloudBlobDirectory.ListBlobs() returns "The specified resource does not exist.", but fetchAttributes() works using same data

I am getting a "The specified resource does not exist" exception when I try to iterate the result of a ListBlobs() call. I can get the blob attributes when I access a blob directly, but I'm trying to get a list of all the blobs in the subdirectory.
I wrote this little test to see exactly where the problem is. I have a test driver and two methods here. The first method, "GetBlockBlobDateTime", runs fine and returns the date time of an existing blob. The second method, "GetBlobDirFiles", uses the same inputs and throws the exception when I try to iterate the blobItems at:
foreach (IListBlobItem blobItem in blobItems)
Note that the same data is used for both methods. What am I missing?
public static void DoTest(string baseURL, string container, string directory, string fileName)
{
    DateTime t = GetBlockBlobDateTime(baseURL, container, directory, fileName);
    List<string> fileList = GetBlobDirFiles(baseURL, container, directory);
}

public static DateTime GetBlockBlobDateTime(string baseURL, string container, string directory, string fileName)
{
    CloudBlobClient blobClient = new CloudBlobClient(baseURL);
    CloudBlobDirectory blobDir = blobClient.GetBlobDirectoryReference(container);
    CloudBlobDirectory subDirectory = blobDir.GetSubdirectory(directory);
    CloudBlockBlob cloudBlockBlob = subDirectory.GetBlockBlobReference(fileName);
    cloudBlockBlob.FetchAttributes();
    DateTime cloudTimeStampUTC = cloudBlockBlob.Properties.LastModifiedUtc;
    return cloudTimeStampUTC;
}

public static List<string> GetBlobDirFiles(string baseURL, string container, string directory)
{
    CloudBlobClient blobClient = new CloudBlobClient(baseURL);
    CloudBlobDirectory blobDir = blobClient.GetBlobDirectoryReference(container);
    CloudBlobDirectory subDirectory = blobDir.GetSubdirectory(directory);
    IEnumerable<IListBlobItem> blobItems = subDirectory.ListBlobs();
    List<string> fileList = new List<string>();
    foreach (IListBlobItem blobItem in blobItems)
    {
        fileList.Add(blobItem.Uri.ToString());
    }
    return fileList;
}
OK, I figured it out:
Apparently, you don't need permissions to get file attributes, but you do to list files in the directory.
CloudBlobClient blobClient = new CloudBlobClient(baseURL);
works when you are going to fetch attributes like this:
cloudBlockBlob.FetchAttributes();
But you need to provide credentials like this:
CloudBlobClient blobClient =
    new CloudBlobClient(baseURL,
        new StorageCredentialsAccountAndKey(myAccount, myKey));
when you are going to list the blobs like this:
var blobList = subDirectory.ListBlobs();
foreach (var blobInfo in blobList)
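Putting the pieces together, a sketch of the listing method with credentials supplied (myAccount and myKey are placeholders, as above):
// Sketch: same GetBlobDirFiles, but with account credentials so listing is allowed.
public static List<string> GetBlobDirFiles(string baseURL, string container, string directory,
    string myAccount, string myKey)
{
    CloudBlobClient blobClient =
        new CloudBlobClient(baseURL,
            new StorageCredentialsAccountAndKey(myAccount, myKey));
    CloudBlobDirectory blobDir = blobClient.GetBlobDirectoryReference(container);
    CloudBlobDirectory subDirectory = blobDir.GetSubdirectory(directory);

    List<string> fileList = new List<string>();
    foreach (IListBlobItem blobItem in subDirectory.ListBlobs())
    {
        fileList.Add(blobItem.Uri.ToString());
    }
    return fileList;
}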
