I am trying to get only the directory names of any directories at a specific location within Blob Storage.
I have the helper class below:
public static class BlobHelper
{
    private static CloudBlobContainer _cloudBlobContainer;
    private const string _containerName = "administrator";

    public static void Setup(string connectionString)
    {
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
        _cloudBlobContainer = cloudBlobClient.GetContainerReference(_containerName);
    }

    public static List<string> GetDirectoryNames(string relativeAddress)
    {
        var result = new List<string>();
        var directory = _cloudBlobContainer.GetDirectoryReference(relativeAddress);
        var folders = directory.ListBlobs().OfType<CloudBlobDirectory>();
        foreach (var folder in folders)
        {
            var name = folder.Uri.AbsolutePath;
            name = name.Replace(folder.Parent.Prefix, string.Empty)
                       .Replace("/", string.Empty)
                       .Replace(_containerName, string.Empty);
            result.Add(name);
        }
        return result;
    }
}
The process to get the directory names only (i.e. not the full hierarchy) feels a bit hacky, although it does work.
Is there a better way to do this?
I tried the approach below
var directory = _cloudBlobContainer.GetDirectoryReference(relativeAddress);
var blobs = directory.ListBlobs(true);
var blobNames = blobs.OfType<CloudBlockBlob>().Select(b => b.Name).ToList();
return blobNames;
The main difference with the above is that UseFlatBlobListing is set to true.
However, this approach returns no folders at all, whereas my other logic does at least give me the two folders I expect to find.
Any ideas what I am doing wrong?
Cheers
Paul
I suppose your code is OK; I'm not sure what you mean by "a bit hacky". I think you want to get the directory names directly.
Since there is no method that returns directories directly, the known approaches with the v11 SDK mostly work from the blob URI or prefix.
Below is my way of doing it:
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("test");
BlobContinuationToken blobContinuationToken = null;
var blobsSeg = cloudBlobContainer.ListBlobsSegmented(null, blobContinuationToken);
var directories = blobsSeg.Results.OfType<CloudBlobDirectory>().Select(b => b.Prefix).ToList();
foreach (string directory in directories)
{
    Console.WriteLine(directory);
}
The result it returns will look like the screenshot below (image omitted). Hope this could help you.
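If you want just the final segment of each returned Prefix rather than the whole path, you can trim it with plain string handling instead of chained Replace calls; a minimal sketch (the helper name is mine, not part of the SDK):

```csharp
// Hypothetical helper: turns a directory prefix such as "parent/child/"
// into just its final segment, "child".
static string DirectoryNameFromPrefix(string prefix)
{
    var trimmed = prefix.TrimEnd('/');          // "parent/child"
    int lastSlash = trimmed.LastIndexOf('/');   // index of the last separator, or -1
    return lastSlash >= 0 ? trimmed.Substring(lastSlash + 1) : trimmed;
}
```

So `directories.Select(DirectoryNameFromPrefix)` would yield bare folder names with no hierarchy or delimiters.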
Related
I've created a solution to access my container contents within my ASP.NET Core 3.1 application and return those contents as a list to my view. At the moment the application accesses data in the root container, which is called upload; however, this container has many sub-folders (virtual directories), and I would like to list the blobs in a specific one called 1799.
So, instead of accessing upload and showing me the full contents of that container, I want to access upload/1799 and list all the blobs within it.
I cannot see any way to add this sub-folder to my method and allow this to happen; can anyone help?
Here is my code so far:
CarController.cs
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
namespace MyProject.Controllers
{
public class HomeController : Controller
{
private readonly IConfiguration _configuration;
private readonly string accessKey = string.Empty;
public HomeController(IConfiguration configuration)
{
_configuration = configuration;
accessKey = _configuration.GetConnectionString("AzureStorage");
}
[HttpGet]
public IActionResult Edit(int id)
{
var car = _carService.GetVessel(id);
string strContainerName = "uploads";
string subdir = "1799";
var filelist = new List<BlobListViewModel>();
BlobServiceClient blobServiceClient = new BlobServiceClient(accessKey);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(strContainerName);
var blobs = containerClient.GetBlobs();
foreach (var item in blobs)
{
filelist.Add(new BlobListViewModel
{
FileName = item.Name
});
}
return View(filelist);
}
}
}
I've hunted through all the documentation and I can't find anything related to this.
Thanks to David Browne for his comment: there is indeed a prefix option on GetBlobs(). The options act like mini-filters in some ways, allowing you to define which properties are returned, the blob state, etc. Here is my code, which has the options set to zero, meaning Default for both Traits and States.
[HttpGet]
public IActionResult Edit(int id)
{
var car = _carService.GetVessel(id);
string strContainerName = "uploads";
string subdir = "1799";
var filelist = new List<BlobListViewModel>();
BlobServiceClient blobServiceClient = new BlobServiceClient(accessKey);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(strContainerName);
//Traits, States, Prefix
var blobs = containerClient.GetBlobs(0, 0, subdir);
foreach (var item in blobs)
{
filelist.Add(new BlobListViewModel
{
FileName = item.Name
});
}
return View(filelist);
}
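One caveat: the prefix filter matches the start of blob names, so a prefix of "1799" would also match a blob named "1799-notes.txt"; including the delimiter ("1799/") is safer. And if you ever need only one level of the hierarchy instead of everything under the prefix, the SDK also offers GetBlobsByHierarchy; a sketch using the same containerClient, with "1799/" as an assumed prefix:

```csharp
// Sketch: list a single level under the "1799/" virtual directory.
// Each BlobHierarchyItem is either a blob or a prefix (a sub-directory).
foreach (BlobHierarchyItem item in containerClient.GetBlobsByHierarchy(
    prefix: "1799/", delimiter: "/"))
{
    if (item.IsBlob)
        Console.WriteLine($"Blob: {item.Blob.Name}");
    else
        Console.WriteLine($"Sub-directory: {item.Prefix}");
}
```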
I'm trying to read the content of a blob inside an azure function.
Here's the code:
Note:
If I comment out the using block and return the blob i.e.
return new OkObjectResult(blob);
I get back the blob object.
However, if I use the using block, I get 500.
Any idea why I can't get the content?
string storageConnectionString = "myConnectionString";
CloudStorageAccount storageAccount;
CloudStorageAccount.TryParse(storageConnectionString, out storageAccount);
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer drawingsContainer = cloudBlobClient.GetContainerReference("drawcontainer");
var blob = drawingsContainer.GetBlockBlobReference("notes.txt");
using (StreamReader reader = new StreamReader(blob.OpenRead()))
{
content = reader.ReadToEnd();
}
return new OkObjectResult(content);
HTTP 500 indicates that the code has an error. The most probable cause here is the variable 'content': it is assigned inside the using block but never declared, and a variable declared inside the block would be scoped to that block only. Declare it before the using block, something like below:
try
{
string content = string.Empty;
using (StreamReader reader = new StreamReader(blob.OpenRead()))
{
content = reader.ReadToEnd();
}
}
catch (Exception ex)
{
// Log exception to get the details.
}
Always make use of try/catch to get more details about errors in the code.
The OpenRead method didn't exist in my project, so I used the async one instead, and that solved it.
I got to this solution after creating an azure function in VS and publishing it and it works.
Here's the code I used:
public static class Function1
{
[FunctionName("Function1")]
public static async Task<ActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequest req, TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
string storageConnectionString = "DefaultEndpointsProtocol=https;AccountName=avitest19a1c;AccountKey=<AccessKey>";
CloudStorageAccount storageAccount = null;
CloudStorageAccount.TryParse(storageConnectionString, out storageAccount);
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer drawingsContainer = cloudBlobClient.GetContainerReference("drawcontainer");
var blob = drawingsContainer.GetBlockBlobReference("notes.txt");
string content = string.Empty;
var contentStream = await blob.OpenReadAsync();
using (StreamReader reader = new StreamReader(contentStream))
{
content = reader.ReadToEnd();
}
return new OkObjectResult(content);
}
}
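Since the function is already async, the synchronous ReadToEnd can be swapped for its async counterpart as well; a minor variation on the working block above:

```csharp
// Same read, asynchronous end-to-end.
string content;
using (var contentStream = await blob.OpenReadAsync())
using (var reader = new StreamReader(contentStream))
{
    content = await reader.ReadToEndAsync();
}
return new OkObjectResult(content);
```

For small text blobs, CloudBlockBlob also offers DownloadTextAsync(), which avoids the stream plumbing entirely.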
Is it possible to transfer files from an old Media Services account, which has been accidentally deleted, to a new Media Services account?
Microsoft tech support has been no help.
I'm able to copy the files over to the new Media Services account, but when I test whether I can publish one of the assets in the portal, it gives me successful streaming URLs; when I try to access those, however, I get a network error:
<serverError>
<status>404</status>
<subStatus>1000</subStatus>
<hresult>MPE_STORAGE_RESOURCE_NOT_FOUND</hresult>
<activityId>80028340-0004-F800-B63F-84710C7967BB</activityId>
<serviceId>4A42CB8E-4542-0C18-2C0D-4B460D96B604</serviceId>
</serverError>
I don't think it can find the manifest file, which is named pc124m190o_AdaptiveStreaming_manifest.xml.
The name of the metadata file could also be a potential problem:
5f7e8f45-87e9-49ce-a2ae-7bb673bf0b0f_metadata.xml
Has anyone successfully done this?
Here is the code I'm using to copy the files. Maybe the error is here?
class Program
{
// Read values from the App.config file.
private static readonly string _sourceStorageAccountName =
ConfigurationManager.AppSettings["SourceStorageAccountName"];
private static readonly string _sourceStorageAccountKey =
ConfigurationManager.AppSettings["SourceStorageAccountKey"];
private static readonly string _NameOfBlobContainerYouWantToCopy =
ConfigurationManager.AppSettings["NameOfBlobContainerYouWantToCopy"];
private static readonly string _AMSAADTenantDomain =
ConfigurationManager.AppSettings["AMSAADTenantDomain"];
private static readonly string _AMSRESTAPIEndpoint =
ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
private static readonly string _AMSClientId =
ConfigurationManager.AppSettings["AMSClientId"];
private static readonly string _AMSClientSecret =
ConfigurationManager.AppSettings["AMSClientSecret"];
private static readonly string _AMSStorageAccountName =
ConfigurationManager.AppSettings["AMSStorageAccountName"];
private static readonly string _AMSStorageAccountKey =
ConfigurationManager.AppSettings["AMSStorageAccountKey"];
// Field for service context.
private static CloudMediaContext _context = null;
private static CloudStorageAccount _sourceStorageAccount = null;
private static CloudStorageAccount _destinationStorageAccount = null;
static void Main(string[] args)
{
AzureAdTokenCredentials tokenCredentials = new AzureAdTokenCredentials(_AMSAADTenantDomain,
new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
// Create the context for your source Media Services account.
_context = new CloudMediaContext(new Uri(_AMSRESTAPIEndpoint), tokenProvider);
_sourceStorageAccount =
new CloudStorageAccount(new StorageCredentials(_sourceStorageAccountName,
_sourceStorageAccountKey), true);
_destinationStorageAccount =
new CloudStorageAccount(new StorageCredentials(_AMSStorageAccountName,
_AMSStorageAccountKey), true);
CloudBlobClient sourceCloudBlobClient =
_sourceStorageAccount.CreateCloudBlobClient();
// CreateAssetFromExistingBlobs(sourceContainer);
List<string> containers=GetAllContainerNames(sourceCloudBlobClient);
foreach(string item in containers)
{
CloudBlobContainer sourceContainer =
sourceCloudBlobClient.GetContainerReference(item);
CreateAssetFromExistingBlobs(sourceContainer);
Console.WriteLine("finished " + item);
}
}
static private IAsset CreateAssetFromExistingBlobs(CloudBlobContainer sourceBlobContainer)
{
CloudBlobClient destBlobStorage = _destinationStorageAccount.CreateCloudBlobClient();
// Create a new asset.
IAsset asset = _context.Assets.Create("NewAsset_" + Guid.NewGuid(), AssetCreationOptions.None);
IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy",
TimeSpan.FromHours(24), AccessPermissions.Write);
ILocator destinationLocator =
_context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);
// Get the asset container URI and Blob copy from mediaContainer to assetContainer.
CloudBlobContainer destAssetContainer =
destBlobStorage.GetContainerReference((new Uri(destinationLocator.Path)).Segments[1]);
if (destAssetContainer.CreateIfNotExists())
{
destAssetContainer.SetPermissions(new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
}
var blobList = sourceBlobContainer.ListBlobs();
foreach (CloudBlockBlob sourceBlob in blobList)
{
var assetFile = asset.AssetFiles.Create((sourceBlob as ICloudBlob).Name);
ICloudBlob destinationBlob = destAssetContainer.GetBlockBlobReference(assetFile.Name);
CopyBlob(sourceBlob, destAssetContainer);
sourceBlob.FetchAttributes();
assetFile.ContentFileSize = (sourceBlob as ICloudBlob).Properties.Length;
assetFile.Update();
Console.WriteLine("File {0} is of {1} size", assetFile.Name, assetFile.ContentFileSize);
}
asset.Update();
destinationLocator.Delete();
writePolicy.Delete();
// Set the primary asset file.
// If, for example, we copied a set of Smooth Streaming files,
// set the .ism file to be the primary file.
// If we, for example, copied an .mp4, then the mp4 would be the primary file.
var ismAssetFile = asset.AssetFiles.ToList().
Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase)).ToArray().FirstOrDefault();
// The following code assigns the first .ism file as the primary file in the asset.
// An asset should have one .ism file.
if (ismAssetFile != null)
{
ismAssetFile.IsPrimary = true;
ismAssetFile.Update();
}
return asset;
}
}
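The CopyBlob helper called in the loop above isn't shown in the snippet. Assuming it performs a server-side copy into the destination asset container, it might look roughly like this (a sketch, not the poster's actual code; note that copying from a private source account generally requires a SAS-authenticated source URI):

```csharp
// Hypothetical sketch of the CopyBlob helper used above: starts a
// server-side copy and polls until it completes.
private static void CopyBlob(CloudBlockBlob sourceBlob, CloudBlobContainer destContainer)
{
    CloudBlockBlob destBlob = destContainer.GetBlockBlobReference(sourceBlob.Name);
    destBlob.StartCopy(sourceBlob);   // server-side; no data flows through this machine
    // Poll until the copy finishes (fine for small batches;
    // prefer async polling with backoff in production).
    while (destBlob.CopyState.Status == CopyStatus.Pending)
    {
        System.Threading.Thread.Sleep(500);
        destBlob.FetchAttributes();
    }
}
```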
Here is what my media storage window looks like (screenshot omitted).
The manifest file you mentioned is technically not required for streaming. What is missing is that, when you copied the files to your new Storage account, the new Media Services account knew nothing about them. You must create new assets and copy the files into those assets for Media Services to see them. I recommend doing an import with Azure Media Services Explorer in your new account from the Storage account.
I've created a storage account on Azure to upload files to. I am using the Kendo UI async upload widget, which allows multiple file uploads. What I'd like is for the system to create a container based on the passed-in id parameter if it doesn't already exist.
I'm struggling with the actual upload part, however; I'm not sure how to pass the enumerable files to the storage account. Here is my code so far:
public ActionResult Async_Save(IEnumerable<HttpPostedFileBase> files, string id)
{
//Connect to Azure
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("my_AzureStorageConnectionString"));
//Create Blob Client
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
//Retrieve a reference to a container
CloudBlobContainer blobContainer = blobClient.GetContainerReference("vehicle_" + id);
try
{
// Create the container if it doesn't already exist
blobContainer.CreateIfNotExists();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
Console.WriteLine(ex.InnerException);
}
foreach (var file in files)
{
//This doesn't seem right to me and it's where I'm struggling
var fileName = Path.GetFileName(file.FileName);
var physicalPath = Path.Combine(blobContainer, fileName);
file.SaveAs(physicalPath);
}
// Return an empty string to signify success
return Content("");
}
What I've attempted to do is create a method that connects to my Azure storage account and:
Checks for the existence of a container with the same ID as the parameter that's passed in; if it doesn't exist, creates it.
Uploads the files to the existing or newly created container, giving them a prefix of the ID, i.e. "_".
As Gaurav Mantri said above, we will need to use Azure Storage SDK.
For better performance, we can do as below:
Parallel.ForEach(files, file =>
{
CloudBlockBlob blob = blobContainer.GetBlockBlobReference(file.FileName);
blob.UploadFromStream(file.InputStream);
file.InputStream.Close();
});
Your full method would then look like this:
public ActionResult Async_Save(List<HttpPostedFileBase> files, string id)
{
// Connect to Azure
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("my_AzureStorageConnectionString"));
try
{
//Create Blob Client
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
//Retrieve a reference to a container
CloudBlobContainer blobContainer = blobClient.GetContainerReference("vehicle_" + id);
// Create the container if it doesn't already exist
blobContainer.CreateIfNotExists();
Parallel.ForEach(files, file =>
{
CloudBlockBlob blob = blobContainer.GetBlockBlobReference(file.FileName);
blob.UploadFromStream(file.InputStream);
file.InputStream.Close();
});
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
Console.WriteLine(ex.InnerException);
}
// Return an empty string to signify success
return new ContentResult();
}
More information about how to use Azure Storage SDK, we can refer to:
Get Quick Start with Storage SDK
To upload a blob in Azure Storage, you will need to use Azure Storage SDK.
Please replace the following code:
foreach (var file in files)
{
//This doesn't seem right to me and it's where I'm struggling
var fileName = Path.GetFileName(file.FileName);
var physicalPath = Path.Combine(blobContainer, fileName);
file.SaveAs(physicalPath);
}
With something like:
foreach (var file in files)
{
var blob = blobContainer.GetBlockBlobReference(file.FileName);
using (var s = file.InputStream)
{
blob.UploadFromStream(s);
}
}
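For reference, in the newer Azure.Storage.Blobs SDK the same create-and-upload flow might look like the sketch below (the connectionString variable is an assumption). Note also that container names may only contain lowercase letters, digits, and hyphens, so the "vehicle_" + id name in the original code would be rejected by the service; a hyphen is used here instead:

```csharp
// Sketch with the newer Azure.Storage.Blobs SDK.
var containerClient = new BlobContainerClient(connectionString, "vehicle-" + id);
containerClient.CreateIfNotExists();
foreach (var file in files)
{
    BlobClient blob = containerClient.GetBlobClient(file.FileName);
    using (var s = file.InputStream)
    {
        blob.Upload(s, overwrite: true);  // replace any existing blob of the same name
    }
}
```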
I am getting a "The specified resource does not exist" exception when I try to iterate the result of a ListBlobs() call. I can get the blob attributes when I access a blob directly, but I'm trying to get a list of all the blobs in the subdirectory.
I wrote this little test to see exactly where the problem is. I have a test driver and two methods here. The first method, "GetBlockBlobDateTime", runs fine and returns the date-time of an existing blob. The second method, "GetBlobDirFiles", uses the same inputs and throws the exception when I try to iterate the blobItems at:
foreach (IListBlobItem blobItem in blobItems)
Note that the same data is used for both methods. What am I missing?
public static void DoTest(string baseURL, string container, string directory, string fileName)
{
DateTime t = GetBlockBlobDateTime( baseURL, container, directory, fileName);
List<string> fileList = GetBlobDirFiles( baseURL, container, directory);
}
public static DateTime GetBlockBlobDateTime(string baseURL, string container, string directory, string fileName)
{
CloudBlobClient blobClient = new CloudBlobClient(baseURL);
CloudBlobDirectory blobDir = blobClient.GetBlobDirectoryReference(container);
CloudBlobDirectory subDirectory = blobDir.GetSubdirectory(directory);
CloudBlockBlob cloudBlockBlob = subDirectory.GetBlockBlobReference(fileName);
cloudBlockBlob.FetchAttributes();
DateTime cloudTimeStampUTC = cloudBlockBlob.Properties.LastModifiedUtc;
return cloudTimeStampUTC;
}
public static List<string> GetBlobDirFiles(string baseURL, string container, string directory)
{
CloudBlobClient blobClient = new CloudBlobClient(baseURL);
CloudBlobDirectory blobDir = blobClient.GetBlobDirectoryReference(container);
CloudBlobDirectory subDirectory = blobDir.GetSubdirectory(directory);
IEnumerable<IListBlobItem> blobItems = subDirectory.ListBlobs();
List<string> fileList = new List<string>();
foreach (IListBlobItem blobItem in blobItems)
{
fileList.Add(blobItem.Uri.ToString());
}
return fileList;
}
OK, I figured it out:
Apparently, you don't need credentials to get a blob's attributes, but you do need them to list the blobs in a directory.
CloudBlobClient blobClient = new CloudBlobClient(baseURL);
works when you are going to fetch attributes like this:
cloudBlockBlob.FetchAttributes();
But you need to provide credentials like this:
CloudBlobClient blobClient =
new CloudBlobClient(baseURL,
new StorageCredentialsAccountAndKey(myAccount, myKey));
when you are going to list the blobs like this:
var blobList = subDirectory.ListBlobs();
foreach (var blobInfo in blobList)
{
    // ...
}
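Putting it together, the listing method from the question works once the client is constructed with credentials; a sketch, assuming myAccount and myKey hold the storage account name and key:

```csharp
// Sketch: the same listing method, but with an authenticated client.
public static List<string> GetBlobDirFiles(string baseURL, string container,
    string directory, string myAccount, string myKey)
{
    var blobClient = new CloudBlobClient(baseURL,
        new StorageCredentialsAccountAndKey(myAccount, myKey));
    CloudBlobDirectory blobDir = blobClient.GetBlobDirectoryReference(container);
    CloudBlobDirectory subDirectory = blobDir.GetSubdirectory(directory);

    var fileList = new List<string>();
    foreach (IListBlobItem blobItem in subDirectory.ListBlobs())
    {
        fileList.Add(blobItem.Uri.ToString());
    }
    return fileList;
}
```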