I am using the Azure object store in the following manner:
I have one container, and beneath it many blobs in a directory structure.
I am using Azure Blob Storage api to manage it.
Is there a way to delete an entire directory?
Do I really need to list all the blobs under it and then delete them one by one?
Is there a workaround like deleting all blobs with the same uri prefix (again, without listing them and then deleting them one by one)?
I don't know whether there is a newer solution, but we did this using CloudBlobContainer.ListBlobs (https://msdn.microsoft.com/library/microsoft.windowsazure.storage.blob.cloudblobcontainer.listblobs.aspx) - watching the traffic in Fiddler shows that only the blobs matching the prefix are returned. Please see if this works for you:
static void GetBlobsByPrefix(string Container, string Prefix)
{
    if (!string.IsNullOrEmpty(Prefix))
    {
        var _Container = GetBlobContainer(Container);
        // Second argument = flat listing, so blobs in "subfolders" under the prefix are included.
        var _Blobs = _Container.ListBlobs(Prefix, true);
        foreach (IListBlobItem blob in _Blobs)
        {
            // process each blob here, e.g. ((CloudBlob)blob).DeleteIfExists();
        }
    }
}
static CloudBlobContainer GetBlobContainer(string container)
{
    CloudStorageAccount _StorageAccount = CloudStorageAccount.Parse(
        CloudConfigurationManager.GetSetting("rus_AzureStorageConnectionString"));
    CloudBlobClient _BlobClient = _StorageAccount.CreateCloudBlobClient();
    CloudBlobContainer _Container = _BlobClient.GetContainerReference(container);
    return _Container;
}
You can delete the container and all your blobs will be deleted. Containers on Azure Storage act as a "folder" for your blobs.
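For example, reusing the GetBlobContainer helper above, a minimal sketch of wiping a whole container (assuming you really want everything in it gone) could look like this:
static void DeleteContainer(string containerName)
{
    // Deleting the container removes every blob stored in it.
    var container = GetBlobContainer(containerName);
    container.DeleteIfExists();
}
Keep in mind that after a container is deleted its name stays unavailable for a short while, so recreating a container with the same name immediately afterwards can fail.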
The task here is to delete folders inside a storage account from a Logic App. I am looking for an action similar to "Delete Blob" that deletes folders as well. For example, the directory structure is like
XYZ -> 2021-06-14 -> filename.json
I want to delete the folder itself but cannot find a direct action for it. Workarounds are also accepted.
Here are some links with details about deleting files or folders in Blob Storage from a Logic App:
1)https://lucavallarelli.altervista.org/blog/gdpr-logic-app-delete-blob/
2)https://sameeraman.wordpress.com/2017/08/25/logic-apps-delete-files-older-than-x-days-from-a-blob-storage/
Azure Blob Storage does not really have the concept of folders. The hierarchy is very simple: storage account > container > blob.
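So a common workaround in a Logic App (assuming the standard Azure Blob Storage connector) is to use the "List blobs" action on the folder path, loop over the result with a "For each", and call "Delete blob" on each item; because the folders are virtual, the folder disappears as soon as its last blob has been deleted.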
There are two ways to do this.
WAY - 1
You can delete a specific blob from the container by using the Delete method as follows:
public void DeleteBlob()
{
    var _containerName = "appcontainer";
    string _storageConnection = CloudConfigurationManager.GetSetting("StorageConnectionString");
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(_storageConnection);
    CloudBlobClient _blobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer _cloudBlobContainer = _blobClient.GetContainerReference(_containerName);
    CloudBlockBlob _blockBlob = _cloudBlobContainer.GetBlockBlobReference("f115a610-a899-42c6-bd3f-74711eaef8d5-.jpg");
    // delete blob from container
    _blockBlob.Delete();
}
The delete action will be:
public ActionResult DeleteBlob()
{
    imageService.DeleteBlob();
    return View();
}
WAY - 2
By listing the blobs within the required container, you can then delete them individually:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("your storage account");
CloudBlobContainer container = storageAccount.CreateCloudBlobClient().GetContainerReference("pictures");
foreach (IListBlobItem blob in container.GetDirectoryReference("users").ListBlobs(true))
{
    if (blob.GetType() == typeof(CloudBlob) || blob.GetType().BaseType == typeof(CloudBlob))
    {
        ((CloudBlob)blob).DeleteIfExists();
    }
}
My code below uploads an Excel file from a user's form submission. However, I'd like to work with the Excel file using EPplus to write it to the database. For that, I need the file's address on the disk, rather than the web address. How can I get that?
Relevant code:
public class ExcelService
{
    public async Task<string> UploadExcelAsync(HttpPostedFileBase upload)
    {
        string excelFullPath = null;
        if (upload == null || upload.ContentLength == 0)
        {
            return null;
        }
        try
        {
            CloudStorageAccount cloudStorageAccount = ConnectionString.GetConnectionString();
            CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
            CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("excel");
            if (await cloudBlobContainer.CreateIfNotExistsAsync())
            {
                await cloudBlobContainer.SetPermissionsAsync(
                    new BlobContainerPermissions
                    {
                        PublicAccess = BlobContainerPublicAccessType.Blob
                    }
                );
            }
            string excelName = Guid.NewGuid().ToString() + "-" + Path.GetExtension(upload.FileName);
            CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(excelName);
            cloudBlockBlob.Properties.ContentType = upload.ContentType;
            await cloudBlockBlob.UploadFromStreamAsync(upload.InputStream);
            excelFullPath = cloudBlockBlob.Uri.ToString();
        }
        catch (Exception ex)
        {
        }
        return excelFullPath;
    }
}
You can't. Azure Blob Storage is a cloud service that only exposes web access to the stored blobs, so there is no physical location on a disk you can access. Unless you download the file, edit it, and upload it again, there is no way to edit it directly.
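If the goal is just to read the uploaded workbook with EPPlus, a minimal sketch (assuming the OfficeOpenXml package and a CloudBlockBlob reference like the cloudBlockBlob in your upload code) is to download the blob into a MemoryStream and open the package from that stream instead of a disk path:
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
using OfficeOpenXml;

public static async Task ReadExcelBlobAsync(CloudBlockBlob excelBlob)
{
    using (var stream = new MemoryStream())
    {
        // Pull the blob contents into memory instead of looking for a file on disk.
        await excelBlob.DownloadToStreamAsync(stream);
        stream.Position = 0;

        using (var package = new ExcelPackage(stream))
        {
            ExcelWorksheet worksheet = package.Workbook.Worksheets.First();
            // ... read cells here and write them to the database ...
        }
    }
}
If the files can get large, DownloadToFileAsync with a temporary path (e.g. from Path.GetTempFileName()) avoids keeping the whole workbook in memory.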
An alternative could be to use Azure Storage Files. It is still an Azure cloud storage service, but it allows you to map shares to your machine so you can access them like any network share. See the docs. This, of course, only works if you are able to map a file share on the web server.
Unfortunately you cannot access Azure Storage Files shares from Azure Web Apps (source), so if that is how your web app is hosted, then you are out of luck there.
I have rather foolishly uploaded a vhd to Azure file storage thinking I can create a virtual machine from it only to find out it really needs to be in blob storage.
I know I can just upload it again - but it is very large and my upload speed is very slow.
My question is - can I move a file from file storage to blob storage without downloading/uploading again? I.e. is there anything in the Azure portal UI to do it, or even a PowerShell command?
You can try AzCopy:
AzCopy.exe /Source:{*URL to source container*} /Dest:{*URL to dest container*} /SourceKey:{*key1*} /DestKey:{*key2*} /S
When copying from File Storage to Blob Storage, the default blob type is block blob; you can specify the /BlobType:page option to change the destination blob type.
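For example (the account, share, container and file names below are placeholders), a copy of a VHD from a file share that should land as a page blob could look like:
AzCopy.exe /Source:https://<account>.file.core.windows.net/<share>/<folder> /Dest:https://<account>.blob.core.windows.net/<container> /SourceKey:<key1> /DestKey:<key2> /Pattern:<name>.vhd /BlobType:page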
AzCopy by default copies data between two storage endpoints asynchronously. The copy operation therefore runs in the background using spare bandwidth capacity, with no SLA on how fast a blob is copied, and AzCopy periodically checks the copy status until the copy completes or fails. The /SyncCopy option ensures that the copy operation runs at a consistent speed.
In C#:
public static CloudFile GetFileReference(CloudFileDirectory parent, string path)
{
    var filename = Path.GetFileName(path);
    var fullPath = Path.GetDirectoryName(path);
    if (fullPath == string.Empty)
    {
        return parent.GetFileReference(filename);
    }
    var dirReference = GetDirectoryReference(parent, fullPath);
    return dirReference.GetFileReference(filename);
}

public static CloudFileDirectory GetDirectoryReference(CloudFileDirectory parent, string path)
{
    if (path.Contains(@"\"))
    {
        var paths = path.Split('\\');
        return GetDirectoryReference(parent.GetDirectoryReference(paths.First()), string.Join(@"\", paths.Skip(1)));
    }
    else
    {
        return parent.GetDirectoryReference(path);
    }
}
The code to copy:
// Source File Storage
string azureStorageAccountName = "shareName";
string azureStorageAccountKey = "XXXXX";
string name = "midrive";
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials(azureStorageAccountName, azureStorageAccountKey), true);
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare fileShare = fileClient.GetShareReference(name);
CloudFileDirectory directorio = fileShare.GetRootDirectoryReference();
CloudFile cloudFile = GetFileReference(directorio, "SourceFolder\\fileName.pdf");
// Destination Blob
string destAzureStorageAccountName = "xx";
string destAzureStorageAccountKey = "xxxx";
CloudStorageAccount destStorageAccount = new CloudStorageAccount(new StorageCredentials(destAzureStorageAccountName, destAzureStorageAccountKey), true);
CloudBlobClient destClient = destStorageAccount.CreateCloudBlobClient();
CloudBlobContainer destContainer = destClient.GetContainerReference("containerName");
CloudBlockBlob destBlob = destContainer.GetBlockBlobReference("fileName.pdf");
// copy
await TransferManager.CopyAsync(cloudFile, destBlob, true);
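Note that TransferManager.CopyAsync comes from the Azure Storage Data Movement library (the Microsoft.Azure.Storage.DataMovement NuGet package), so that package needs to be referenced in addition to the regular storage SDK for the snippet above to compile.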
Another option is to use Azure CLI...
az storage copy -s /path/to/file.txt -d https://[account].blob.core.windows.net/[container]/[path/to/blob]
More info here: az storage copy
Thanks to Gaurav Mantri for pointing me in the direction of AzCopy.
This does allow me to copy between file and blob storage using the command:
AzCopy.exe /Source:*URL to source container* /Dest:*URL to dest container* /SourceKey:*key1* /DestKey:*key2* /S
However, as Gaurav also rightly points out in the comments, the resulting blob will be of type block blob, and this is no good for me. I need one of type page blob in order to create a VM out of it using https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-specialized-vhd
There is no way to change the blob type as far as I can see once it is up there in the cloud, so it looks like my only option is to wait for a lengthy upload again.
I know Azure doesn't have actual subpaths, but if I have, for example, container/projectID/iterationNumber/filename.jpg and I delete a project, how can I delete everything under projectID? Is it possible through code?
I don't want to use the azure application as I am creating a web app.
Thanks in Advance
EDIT:
This is the code provided by Microsoft to target a specific item:
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Retrieve reference to a blob named "myblob.txt".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.txt");
// Delete the blob.
blockBlob.Delete();
SystemDesignModel
public static SystemDesign returnImageURL(IListBlobItem item)
{
    if (item is CloudBlockBlob)
    {
        var blob = (CloudBlockBlob)item;
        return new SystemDesign
        {
            URL = blob.Uri.ToString(),
        };
    }
    return null;
}
As you know, blob storage does not have the concept of subfolders. It has just a two-level hierarchy: container and blobs. So, in essence, a subfolder is just a prefix that you attach to the blob name. In your example, the actual file you uploaded is filename.jpg, but its name from the blob storage perspective is projectID/iterationNumber/filename.jpg.
Since there is no concept of a subfolder, you can't just delete it like you would on your local computer. However, there is a way: blob storage lets you list blobs starting with a certain prefix. So what you have to do is first list all blobs that start with a certain prefix (projectID in your case) and then delete, one at a time, the blobs returned by that listing operation.
Take a look at sample code below:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
var container = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
BlobContinuationToken token = null;
do
{
    var listingResult = container.ListBlobsSegmented("blob-prefix (projectID in your case)", true, BlobListingDetails.None, 5000, token, null, null);
    token = listingResult.ContinuationToken;
    var blobs = listingResult.Results;
    foreach (var blob in blobs)
    {
        (blob as ICloudBlob).DeleteIfExists();
        Console.WriteLine(blob.Uri.AbsoluteUri + " deleted.");
    }
}
while (token != null);
I have created a private blob in a container on Azure.
Unfortunately it changes to public when I upload files. I have tried finding a way to set files as private when uploading, since that might be the problem, but I can't find anything.
Any ideas as to why this is?
Should private files be treated different when uploading?
My upload code:
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
if (file != null)
{
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
    blockBlob.UploadFromStream(file.InputStream);
    blockBlob.Properties.ContentEncoding = MimeTypes.GetContentType(filename);
    if (ht != null)
    {
        foreach (DictionaryEntry item in ht)
        {
            blockBlob.Metadata[item.Key.ToString()] = item.Value.ToString();
        }
        blockBlob.SetMetadata();
    }
    blockBlob.Metadata["Created"] = DateTime.UtcNow.ToString();
    blockBlob.SetProperties();
}
The code seems all right; please take a look here:
How to use Blobs in Windows Azure
Your blob will get the same access permissions as your container.
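If the container itself was created with public access, you can switch it back to private; a small sketch, reusing the container object from your upload code:
// Make the container private again so blobs are only reachable with the account key or a SAS.
container.SetPermissions(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Off
});
Individual blobs do not carry their own public/private flag; this container-level setting decides whether anonymous reads are allowed.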
Some extra information: it seems you are using the StorageClient Library v1.7. This one is deprecated; v3.0 is the recommended version (StorageClient Library v3.0.0).
There is an issue in the StorageClient Library v3.0.0 when using containers on your local (test) machine, though.