Can blob versions be retained while moving or renaming?

I am using the latest .NET SDK for Azure Storage and have enabled versioning in Blob storage. I can upload and list all versions of blobs using my C# code. I would like to maintain the versions when a blob is moved or renamed. Is it possible to do this automatically? If not, is there a workaround that might help?

Figured it out using the docs. Here are the snippets of my code that worked:
// Materialize the listing so the last-item check below compares the same objects
var files = blobContainerClient.GetBlobs(BlobTraits.Metadata, BlobStates.Version, prefix: prefix).ToList();
int fileCount = 0;
foreach (BlobItem file in files)
{
    // Client scoped to this specific version of the blob
    var versionFile = blobContainerClient.GetBlobClient(file.Name).WithVersion(file.VersionId);
    string newFileName = file.Name.Split("/").Last();
    string newPath = $"{destFolderPath}/{newFileName}";
    BlobClient newFile = blobContainerClient.GetBlobClient(newPath);
    // Copy this version to the destination; each copy creates a new version there
    await newFile.StartCopyFromUriAsync(versionFile.Uri);
    fileCount++;
    // Once the last version has been copied, delete the source blob
    if (file == files.Last())
    {
        var rootFile = blobContainerClient.GetBlobClient(file.Name);
        await rootFile.DeleteIfExistsAsync();
    }
}
I removed some project-specific code here, so anyone looking to use this should initialize the appropriate classes first.
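One caveat worth noting: StartCopyFromUriAsync only starts a server-side copy, so deleting the source right away can race an unfinished copy. A minimal sketch of waiting for completion before the delete, reusing the names from the snippet above:
// Start the server-side copy and keep the returned operation object
CopyFromUriOperation copyOperation = await newFile.StartCopyFromUriAsync(versionFile.Uri);
// Wait until the copy has completed (or failed) before touching the source
await copyOperation.WaitForCompletionAsync();
// Only now is it safe to delete the source blob
await rootFile.DeleteIfExistsAsync();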

Azure Blob Storage supports blob versioning natively; you can enable it by following this doc.
You can also retain old blob states by creating blob snapshots.
If you are using the .NET SDK, this official doc could be helpful.
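For reference, here is a minimal sketch of taking a snapshot with the v12 .NET SDK; "data.txt" is a placeholder blob name:
// Create a read-only snapshot of the blob's current state
BlobClient blobClient = blobContainerClient.GetBlobClient("data.txt");
BlobSnapshotInfo snapshotInfo = (await blobClient.CreateSnapshotAsync()).Value;
// A snapshot is addressed by the base blob plus its snapshot timestamp
BlobClient snapshotClient = blobClient.WithSnapshot(snapshotInfo.Snapshot);
Console.WriteLine($"Snapshot created at: {snapshotInfo.Snapshot}");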

Related

Is there a way to write a SpreadsheetLight Excel file directly to blob storage?

I am porting some old application services code into Microsoft Azure and need a little help.
I am pulling in some data from a stored procedure and creating a SpreadsheetLight document (this was brought in from the old code, since my users want to keep the extensive formatting that was already built into this process). That code works fine, but I need to write the file directly into our Azure blob container without saving a local copy first (as this process will run in a pipeline). Using some sample code I found as a guide, I got it working in debug by saving locally and then uploading that to blob storage... but now I need to find a way to remove that local save prior to the upload.
Well I actually stumbled on the solution.
string connectionString = "YourConnectionString";
string containerName = "YourContainerName";
string strFileNameXLS = "YourOutputFile.xlsx";
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
BlobContainerClient blobContainerClient = blobServiceClient.GetBlobContainerClient(containerName);
BlobClient blobClient = blobContainerClient.GetBlobClient(strFileNameXLS);
SLDocument doc = YourSLDocument();
using (MemoryStream ms = new MemoryStream())
{
doc.SaveAs(ms);
ms.Position = 0;
await blobClient.UploadAsync(ms, true);
ms.Close();
}
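If the workbook gets large, you could also write straight into the blob instead of buffering everything in memory. A sketch, assuming SLDocument.SaveAs accepts any writable stream (GetBlockBlobClient and OpenWriteAsync come from Azure.Storage.Blobs.Specialized):
// Get a BlockBlobClient, which exposes OpenWriteAsync
BlockBlobClient blockBlobClient = blobContainerClient.GetBlockBlobClient(strFileNameXLS);
// Open a writable stream onto the blob and let SpreadsheetLight write into it directly
using (Stream blobStream = await blockBlobClient.OpenWriteAsync(overwrite: true))
{
    doc.SaveAs(blobStream);
}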

I want to download a file from storage

File path:
https://ubj10edustgcdn.azureedge.net/userdata/documents/a09b372d-cffe-438a-b98d-123d118ac0a5/provincial management service batch.pdf
How can I download the file from this path in C#?
var filepath = "https://ubj10edustgcdn.azureedge.net/userdata/documents/a09b372d-cffe-438a-b98d-123d118ac0a5/provincial management service batch.pdf";
return File(filepath, "application/force-download", Path.GetFileName(filepath));
This code throws an exception saying the virtual path is not valid.
The URL contains azureedge.net, which means it's served from a CDN. That also means you will not be downloading from storage, but 'just' a file from the internet.
Have a look at How to download a file from a URL in C#?
using (var client = new WebClient())
{
    client.DownloadFile("http://example.com/file/song/a.mpeg", "a.mpeg");
}
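WebClient is considered legacy in current .NET; a roughly equivalent sketch with HttpClient (URL and file name are placeholders):
using (var httpClient = new HttpClient())
{
    // Download the response body and write it to a local file
    byte[] data = await httpClient.GetByteArrayAsync("http://example.com/file/song/a.mpeg");
    await System.IO.File.WriteAllBytesAsync("a.mpeg", data);
}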
If you actually want to download a file from Azure Storage, you will need to know the (internal) path to the file and you can use code like this to download the file:
// Read the storage connection string from configuration
string connectionString = CloudConfigurationManager.GetSetting("StorageConnection");
BlobContainerClient container = new BlobContainerClient(connectionString, "<CONTAINER-NAME>");
var blockBlob = container.GetBlobClient("<FILE-NAME>");
// Stream the blob straight into a local file
using (var fileStream = System.IO.File.OpenWrite("<LOCAL-FILE-NAME>"))
{
    blockBlob.DownloadTo(fileStream);
}
Source: Uploading and Downloading a Stream into an Azure Storage Blob

How to get lease on Azure storage file?

I have a multi-instance web application in Azure. I want to make sure that only one instance reads and processes a file from Azure storage. I thought I could make that happen using the lease option.
However, I am unable to find the proper methods for this operation. I found this REST API option, but I am looking for something like this.
That example is for Blob storage. What are the File storage options? I am using the latest File storage NuGet package.
EDIT:
This is how I am getting data from the storage file and processing it.
var storageAccount = CloudStorageAccount.Parse("FileStorageConnectionString");
var fileClient = storageAccount.CreateCloudFileClient();
var share = fileClient.GetShareReference("StorageFileShareName");
var file = share.GetRootDirectoryReference().GetFileReference("file.txt");
---processing file and renaming the file after processing---
How can I take a lease on this file so that no other cloud instance gets access to it? After processing the file I will also rename it.
Please try something like below:
string connectionString = "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net;";
var shareClient = new ShareClient(connectionString, "test");
shareClient.CreateIfNotExists();
var fileClient = shareClient.GetRootDirectoryClient().GetFileClient("test.txt");
var bytes = Encoding.UTF8.GetBytes("This is a test");
fileClient.Create(bytes.Length);
fileClient.Upload(new MemoryStream(bytes));
var leaseClient = fileClient.GetShareLeaseClient();
Console.WriteLine("Acquiring lease...");
var leaseId = leaseClient.Acquire();
Console.WriteLine("Lease Id: " + leaseId);
Console.WriteLine("Breaking lease...");
leaseClient.Break();
Console.WriteLine("Lease broken...");
It makes use of the Azure.Storage.Files.Shares NuGet package.
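To address the multi-instance part of the question: each instance can attempt to acquire the lease and only process the file when it succeeds. A minimal sketch, assuming the file already exists (error handling reduced to catching the conflict):
var leaseClient = fileClient.GetShareLeaseClient();
try
{
    // Only one instance can hold the lease; the others get a 409 Conflict
    var lease = leaseClient.Acquire();
    // ... read and process the file here ...
    // Release the lease when done so other operations (e.g. the rename) can proceed
    leaseClient.Release();
}
catch (Azure.RequestFailedException ex) when (ex.Status == 409)
{
    // Another instance already holds the lease; skip this file
}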

Lucene.NET and storing data on Azure Blob Storage

I am asking this question specifically because I don't want to use the AzureDirectory project. I am just trying something on my own.
cloudStorageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=xxxx;AccountKey=xxxxx");
blobClient = cloudStorageAccount.CreateCloudBlobClient();

// Used to test connectivity
IEnumerable<CloudBlobContainer> containers = blobClient.ListContainers();
foreach (var item in containers)
{
    Console.WriteLine(item.Uri);
}

// State the file location of the index
string indexLocation = containers.Last().Name;
Lucene.Net.Store.Directory dir = Lucene.Net.Store.FSDirectory.Open(indexLocation);

// Create an analyzer to process the text
Lucene.Net.Analysis.Analyzer analyzer = new Lucene.Net.Analysis.Standard.StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);

// Create the index writer with the directory and analyzer defined
bool indexExists = Lucene.Net.Index.IndexReader.IndexExists(dir);
Lucene.Net.Index.IndexWriter indexWriter = new Lucene.Net.Index.IndexWriter(dir, analyzer, !indexExists, Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED);

// Create a document and add a couple of fields
Lucene.Net.Documents.Document doc = new Lucene.Net.Documents.Document();
string path = "D:\\try.html";
// FilterReader appears to be a custom/third-party reader that strips HTML markup
TextReader reader = new FilterReader("D:\\try.html");
doc.Add(new Lucene.Net.Documents.Field("url", path, Lucene.Net.Documents.Field.Store.YES, Lucene.Net.Documents.Field.Index.NOT_ANALYZED));
doc.Add(new Lucene.Net.Documents.Field("content", reader.ReadToEnd(), Lucene.Net.Documents.Field.Store.YES, Lucene.Net.Documents.Field.Index.ANALYZED));

indexWriter.AddDocument(doc);
indexWriter.Optimize();
indexWriter.Commit();
indexWriter.Close();
Now the issue is that after indexing completes, I am not able to see any files created inside the container. Can anybody help me out?
You're using the FSDirectory there, which is going to write files to the local disk.
You're passing it a list of containers in blob storage. Blob storage is a service made available over a REST API, and is not addressable directly from the file system. Therefore the FSDirectory is not going to be able to write your index to storage.
Your options are:
Mount a VHD disk on the machine, and store the VHD in blob storage. There are some instructions on how to do this here: http://blogs.msdn.com/b/avkashchauhan/archive/2011/04/15/mount-a-page-blob-vhd-in-any-windows-azure-vm-outside-any-web-worker-or-vm-role.aspx
Use the Azure Directory, which you refer to in your question. I have rebuilt the AzureDirectory against the latest storage SDK: https://github.com/richorama/AzureDirectory
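For a rough idea of the second option, usage looks something like the sketch below, based on the AzureDirectory project's README; treat the exact constructor signature as an assumption, since it may differ between versions:
// AzureDirectory persists the Lucene index in a blob container (the "catalog")
var cloudStorageAccount = CloudStorageAccount.Parse("<CONNECTION-STRING>");
var azureDirectory = new AzureDirectory(cloudStorageAccount, "MyIndexCatalog");
// From here on, use it like any other Lucene.Net Directory
var analyzer = new Lucene.Net.Analysis.Standard.StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);
bool indexExists = Lucene.Net.Index.IndexReader.IndexExists(azureDirectory);
var indexWriter = new Lucene.Net.Index.IndexWriter(azureDirectory, analyzer, !indexExists, Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED);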
Another alternative for people looking around: I wrote up a directory that uses the Azure shared cache (preview), which can be an alternative to AzureDirectory (albeit for bounded search sets):
https://github.com/ajorkowski/AzureDataCacheDirectory

How to create a sub container in azure storage location

How can I create a sub-container in an Azure storage location?
Windows Azure doesn't provide the concept of hierarchical containers, but it does provide a mechanism to traverse hierarchy by convention and API. All containers are stored at the same level. You can gain similar functionality by using naming conventions for your blob names.
For instance, you may create a container named "content" and create blobs with the following names in that container:
content/blue/images/logo.jpg
content/blue/images/icon-start.jpg
content/blue/images/icon-stop.jpg
content/red/images/logo.jpg
content/red/images/icon-start.jpg
content/red/images/icon-stop.jpg
Note that these blobs are a flat list against your "content" container. That said, using "/" as a conventional delimiter provides you with the functionality to traverse these in a hierarchical fashion.
protected IEnumerable<IListBlobItem>
GetDirectoryList(string directoryName, string subDirectoryName)
{
CloudStorageAccount account =
CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
CloudBlobClient client =
account.CreateCloudBlobClient();
CloudBlobDirectory directory =
client.GetBlobDirectoryReference(directoryName);
CloudBlobDirectory subDirectory =
directory.GetSubdirectory(subDirectoryName);
return subDirectory.ListBlobs();
}
You can then call this as follows:
GetDirectoryList("content/blue", "images")
Note the use of the GetBlobDirectoryReference and GetSubdirectory methods and the CloudBlobDirectory type instead of CloudBlobContainer. These provide the traversal functionality you are likely looking for.
This should help you get started. Let me know if this doesn't answer your question:
[ Thanks to Neil Mackenzie for inspiration ]
Are you referring to blob storage? If so, the hierarchy is simply StorageAccount/Container/BlobName. There are no nested containers.
Having said that, you can use slashes in your blob name to simulate nested containers in the URI. See this article on MSDN for naming details.
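With the newer v12 .NET SDK, the same simulated hierarchy can be walked with a delimiter-based listing. A minimal sketch (connection string, container, and prefix are placeholders):
BlobContainerClient containerClient = new BlobContainerClient("<CONNECTION-STRING>", "content");
// List one "directory" level at a time, using '/' as the virtual delimiter
foreach (BlobHierarchyItem item in containerClient.GetBlobsByHierarchy(delimiter: "/", prefix: "blue/images/"))
{
    if (item.IsPrefix)
        Console.WriteLine($"Virtual directory: {item.Prefix}");
    else
        Console.WriteLine($"Blob: {item.Blob.Name}");
}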
I agree with tobint's answer, and I want to add something to this situation, because I also needed to upload my game's HTML files to Azure Storage in the same way, creating these directories:
Games\Beautyshop\index.html
Games\Beautyshop\assets\apple.png
Games\Beautyshop\assets\aromas.png
Games\Beautyshop\customfont.css
Games\Beautyshop\jquery.js
So after those recommendations I tried to upload my content with the Azure Storage Explorer tool; you can download the tool and its source code from this URL: Azure Storage Explorer
First of all I tried to upload via the tool, but it doesn't allow uploading a hierarchical directory, because you don't need one (see: How to create sub directory in a blob container).
Finally, I debugged the Azure Storage Explorer source code and edited the Background_UploadBlobs method and the UploadFileList field in the StorageAccountViewModel.cs file. You can edit it however you want. I may have made spelling errors :/ I am so sorry, but that's only my recommendation.
If you are trying to upload files from the Azure portal:
To create a subfolder in a container while uploading a file, you can go to Advanced options and select "Upload to a folder", which will create a new folder in the container and upload the file into it.
Kotlin Code
val blobClient = blobContainerClient.getBlobClient("$subDirNameTimeStamp/$fileName$extension")
This will create a directory named with the timestamp, and inside it your blob file. Notice the slash (/) in the code above, which nests the blob file by creating a folder named after the string before the slash.
Sample code
string myfolder = "<folderName>";
string myfilename = "<fileName>";
// The '/' in the blob name creates the virtual folder
string fileName = String.Format("{0}/{1}.csv", myfolder, myfilename);
CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
