Azure Blob Storage Virtual Directories using Shared Access Signatures

I can't get shared access policies to work with virtual directories in blob storage. It works fine for containers. As far as I know, virtual directories are containers so SAS should work?
When I attempt to access a resource in a virtual directory using a SAS I get this response:
<?xml version="1.0" encoding="utf-8"?>
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:XXXXXXXX-000X-00XX-XXXX-XXXXXX000000 Time:2016-08-15T13:28:57.6925768Z</Message>
<AuthenticationErrorDetail>Signature did not match. String to sign used was r 2016-08-15T13:29:53Z /blob/xxxxxxxxxx/mycontainer 2015-12-11</AuthenticationErrorDetail>
</Error>
Example code to demonstrate:
public static async Task<string> GetFilePath()
{
    var storageAccount = CloudStorageAccount.Parse( "DefaultEndpointsProtocol=https;AccountName=xxxxxxxxxx;AccountKey=xxxxxxxxxx" );
    var blobClient = storageAccount.CreateCloudBlobClient();
    var containerName = "mycontainer/myvd"; // remove /myvd and SAS will work fine
    var containerReference = blobClient.GetContainerReference( containerName );
    var blobName = "readme.txt";
    var blobReference = await containerReference.GetBlobReferenceFromServerAsync( blobName );
    var sasConstraints = new SharedAccessBlobPolicy();
    sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes( 1 );
    sasConstraints.Permissions = SharedAccessBlobPermissions.Read;
    var sasContainerToken = containerReference.GetSharedAccessSignature( sasConstraints );
    var path = $"{blobClient.BaseUri.ToString()}{containerName}/{blobName}{sasContainerToken}";
    return path;
}

The reason for this error is that shared access signatures are only supported at the blob container level or the blob level. In fact, in Azure Blob Storage there's no such thing as a virtual directory; the service only supports a 2-level hierarchy: blob container and blob. A virtual directory is simply a prefix on a blob's name.
Based on this, I would recommend making the following changes to your code:
var containerName = "mycontainer"; // remove /myvd and SAS will work fine
var containerReference = blobClient.GetContainerReference( containerName );
var blobName = "myvd/readme.txt"; //Your blob name is actually "myvd/readme.txt"
var blobReference = await containerReference.GetBlobReferenceFromServerAsync( blobName );
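Putting the answer together, here is a minimal sketch of the corrected method (same placeholder account details as in the question). The container-level SAS still authorizes the blob because "myvd/" is just part of the blob name:

public static async Task<string> GetFilePath()
{
    var storageAccount = CloudStorageAccount.Parse( "DefaultEndpointsProtocol=https;AccountName=xxxxxxxxxx;AccountKey=xxxxxxxxxx" );
    var blobClient = storageAccount.CreateCloudBlobClient();
    var containerName = "mycontainer";
    var containerReference = blobClient.GetContainerReference( containerName );
    var blobName = "myvd/readme.txt"; // the virtual directory is part of the blob name
    var blobReference = await containerReference.GetBlobReferenceFromServerAsync( blobName ); // also verifies the blob exists
    var sasConstraints = new SharedAccessBlobPolicy
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes( 1 ),
        Permissions = SharedAccessBlobPermissions.Read
    };
    var sasContainerToken = containerReference.GetSharedAccessSignature( sasConstraints );
    return $"{blobClient.BaseUri}{containerName}/{blobName}{sasContainerToken}";
}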

Related

Azure Data Storage: Unable to upload file (Not authorized to perform this operation using this permission)

I'm trying to follow the example to upload a file to Azure Data Storage as mentioned in the documentation : https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet?tabs=visual-studio%2Cmanaged-identity%2Croles-azure-portal%2Csign-in-azure-cli%2Cidentity-visual-studio
Following is my code:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.IO;
using Azure.Identity;
// TODO: Replace <storage-account-name> with your actual storage account name
var blobServiceClient = new BlobServiceClient(
    new Uri("https://[some azure storage]"),
    new DefaultAzureCredential());
// Set container name
string containerName = "data";
// Get container
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
// Create a local file in the ./data/ directory for uploading and downloading
string localPath = "data";
Directory.CreateDirectory(localPath);
string fileName = "testupload" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
// Write text to the file
await File.WriteAllTextAsync(localFilePath, "Hello, World!");
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Upload data from the local file
await blobClient.UploadAsync(localFilePath, true);
But I'm getting an error message that the request is not authorized.
Error message:
Azure.RequestFailedException: 'This request is not authorized to perform this operation using this permission.
I have the Contributor role (which, based on its description, grants full access to manage all resources ....); is this role still not enough to perform the operation?
I tried the same code in my environment and initially got the same error.
Console:
Azure.RequestFailedException: 'This request is not authorized to perform this operation using this permission.
The above error occurs when your principal doesn't have data access to Azure Blob Storage. The Contributor role only grants control-plane (management) access; it does not authorize blob data operations.
As Gaurav Mantri's comment correctly points out, you need one of the data roles to access blob storage:
Storage Blob Data Contributor (or)
Storage Blob Data Owner
Go to portal -> storage accounts -> Access Control (IAM) -> Add -> Add role assignments -> assign the Storage Blob Data Contributor or Storage Blob Data Owner role on the storage account.
After assigning the role on my storage account, I executed the same code and it successfully uploaded the file to Azure Blob Storage.
Code:
using Azure.Storage.Blobs;
using System;
using System.IO;
using Azure.Identity;
// TODO: Replace <storage-account-name> with your actual storage account name
var blobServiceClient = new BlobServiceClient(
    new Uri("https://[some azure storage]"),
    new DefaultAzureCredential());
// Set container name
string containerName = "test";
// Get container
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
// Create a local file in the ./data/ directory for uploading and downloading
string localPath = "data";
Directory.CreateDirectory(localPath);
string fileName = "testupload" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
// Write text to the file
await File.WriteAllTextAsync(localFilePath, "Hello, World!");
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Upload data from the local file
await blobClient.UploadAsync(localFilePath, true);
Also make sure the storage account's network access is set to allow public access from all networks if you're not using a VPN or a dedicated network to reach the Azure environment.

Write file to blob storage and save the SAS URL using C#

I am trying to create an Azure Function that creates files in blob storage and then saves a dynamically generated pre-signed blob URL in an Azure table, so that we can return the blob URL to the client program to open.
I am able to create the files in blob storage and save the URLs. Right now the code makes the file URLs public; I am not sure how I can make the current code generate a SAS URL instead of a public URL and save it to the Azure table.
I didn't see any example that shows the usage of CloudBlobClient and SAS. Appreciate any help.
[FunctionName("CreateFiles")]
public static async void Run([QueueTrigger("JobQueue", Connection = "")]string myQueueItem,
[Table("SubJobTable", Connection = "AzureWebJobsStorage")] CloudTable subJobTable,
ILogger log)
{
Job job = JsonConvert.DeserializeObject<Job>(myQueueItem);
var storageAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
string containerName = $"{job.Name.ToLowerInvariant()}{Guid.NewGuid().ToString()}";
CloudBlobContainer cloudBlobContainer =
cloudBlobClient.GetContainerReference(containerName);
cloudBlobContainer.CreateIfNotExists();
BlobContainerPermissions permissions = new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
};
cloudBlobContainer.SetPermissions(permissions);
string localPath = "./data/";
string localFileName = $"{job.Id}.json";
string localFilePath = Path.Combine(localPath, localFileName);
File.WriteAllText(localFilePath, myQueueItem);
CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(localFileName);
log.LogInformation("Uploading to Blob storage as blob:\n\t {0}\n", cloudBlockBlob.Uri.AbsoluteUri);
cloudBlockBlob.UploadFromFile(localFilePath);
// update the table with file uri
DynamicTableEntity entity = new DynamicTableEntity(job.Id, job.PracticeId);
entity.Properties.Add("FileUri", new EntityProperty(cloudBlockBlob.Uri.AbsoluteUri));
entity.Properties.Add("Status", new EntityProperty("Complete"));
TableOperation mergeOperation = TableOperation.InsertOrMerge(entity);
subJobTable.Execute(mergeOperation);
}
It looks like your code is making use of the older version of the SDK (Microsoft.Azure.Storage.Blob). If that's the case, then you would need to use the GetSharedAccessSignature method on CloudBlob to generate a shared access signature token.
Your code would be something like:
...
cloudBlockBlob.UploadFromFile(localFilePath);
var sasToken = cloudBlockBlob.GetSharedAccessSignature(sasTokenParameters); // sasTokenParameters: a SharedAccessBlobPolicy
var sasUrl = $"{cloudBlockBlob.Uri.AbsoluteUri}{sasToken}"; // add a '?' only if the SAS token does not already start with one
...
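Filling in the placeholder, a minimal sketch with the legacy SDK; the one-hour expiry and read-only permission are assumptions, adjust to your needs:

// Ad-hoc read-only SAS on the blob (legacy Microsoft.Azure.Storage.Blob SDK).
var sasPolicy = new SharedAccessBlobPolicy
{
    // Omitting the start time avoids failures from small clock differences.
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
    Permissions = SharedAccessBlobPermissions.Read
};
string sasToken = cloudBlockBlob.GetSharedAccessSignature(sasPolicy); // returned with a leading '?'
string sasUrl = cloudBlockBlob.Uri.AbsoluteUri + sasToken;

// Store the SAS URL instead of the public URL in the table entity:
entity.Properties.Add("FileUri", new EntityProperty(sasUrl));

With a SAS URL stored, you could also drop the BlobContainerPublicAccessType.Blob setting so the container stays private.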

How to download from Azure blob storage using connection string and stored URL [duplicate]

I'm trying to upload images to Azure Blob Storage. I'm using .Net Core and Azure.Storage.Blobs v12.8.0.
The following code is what I have so far.
try
{
    var documentByteArray = // a valid byte array of a png file
    var blobUri = new Uri("https://mystorageaccount.blob.core.windows.net/images/myfile.png");
    BlobClient blobClient = new BlobClient(blobUri);
    using (MemoryStream stream = new MemoryStream(documentByteArray))
    {
        await blobClient.UploadAsync(stream, true, default);
        await blobClient.SetHttpHeadersAsync(new BlobHttpHeaders
        {
            ContentType = "image/png"
        });
    }
}
catch (Exception ex)
{
    //
}
...but somewhat predictably it fails with the exception "Server failed to authenticate the request. Please refer to the information in the www-authenticate header." I say predictably because I've not added any authentication...
And this is the problem/question. How do I add authentication so it will upload?
I know there are Access Keys that I can use - but how? I can't find any examples in MS documentation.
Any insight is appreciated.
If you have access to the Azure Portal, you can get the connection string of the storage account (under "Access Keys" section).
Once you have the connection string, you can use the following code:
var connectionString = "your-storage-account-connection-string";
var containerName = "images";
var blobName = "myfile.png";
var blobClient = new BlobClient(connectionString, containerName, blobName);
//do the upload here...
Another option is to use the storage account name and access key (again, you can get these from the Azure Portal). You would do something like:
var accountName = "account-name";
var accountKey = "account-key";
var blobUri = new Uri("https://mystorageaccount.blob.core.windows.net/images/myfile.png");
var credentials = new StorageSharedKeyCredential(accountName, accountKey);
var blobClient = new BlobClient(blobUri, credentials);
//do the upload here...
You can find more information about BlobClient constructors here: https://learn.microsoft.com/en-us/dotnet/api/azure.storage.blobs.blobclient?view=azure-dotnet.
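For completeness, a minimal sketch that combines the connection-string constructor with the upload and content type from the question (the connection string is a placeholder; container and blob names are the ones above):

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System.IO;

var connectionString = "your-storage-account-connection-string";
var blobClient = new BlobClient(connectionString, "images", "myfile.png");

// Upload the PNG bytes and set the content type in a single call.
byte[] documentByteArray = File.ReadAllBytes("myfile.png"); // stand-in for the question's byte array
using (var stream = new MemoryStream(documentByteArray))
{
    await blobClient.UploadAsync(stream, new BlobUploadOptions
    {
        HttpHeaders = new BlobHttpHeaders { ContentType = "image/png" }
    });
}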
Alternatively, with the legacy SDK (Microsoft.WindowsAzure.Storage) you can upload your blob through a CloudStorageAccount instance, like this:
var storageAccount = new CloudStorageAccount(
    new StorageCredentials("<your account name>", "<your key>"),
    "core.windows.net",
    true);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(containerName);
var blob = container.GetBlockBlobReference(fileName);
await blob.UploadFromStreamAsync(stream);

Fine-uploader Azure upload Url is being changed

I had my project working in .NET Core 1, but when I changed to Core 2 it no longer uploads the images to Azure. I get a 404 response code and a "Problem sending file to Azure" message. The correct URL is returned from the server, but then Fine Uploader calls a URL with the current URL in front.
The URL in the Chrome console returning the 404 is shown as
https://localhost:44348/House/%22https://Customstorage.blob.core.windows.net/images/b0975cc7-fae5-4130-bced-c26272d0a21c.jpeg?sv=2017-04-17&sr=b&sig=UFUEsHT1GWI%2FfMd9cuHmJsl2j05W1Acux52UZ5IsXso%3D&se=2017-09-16T04%3A06%3A36Z&sp=rcwd%22
This is being added to the URL somewhere:
https://localhost:44348/House/%22
I create the SAS with
public async Task<string> SAS(string bloburi, string name)
{
    CloudStorageAccount storageAccount = new CloudStorageAccount(
        new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(
            "<storage-account-name>",
            "<access-key>"), true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("images");
    return await Task.FromResult<string>(GetBlobSasUri(container, bloburi));
}
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
    string sasBlobToken;
    // Get a reference to a blob within the container.
    // Note that the blob may not exist yet, but a SAS can still be created for it.
    var uri = new Uri(blobName);
    // get the name of the file from the path
    string filename = System.IO.Path.GetFileName(uri.LocalPath);
    CloudBlockBlob blob = container.GetBlockBlobReference(filename);
    if (policyName == null)
    {
        // Create a new access policy and define its constraints.
        // Note that the SharedAccessBlobPolicy class is used both to define the parameters of an ad-hoc SAS, and
        // to construct a shared access policy that is saved to the container's shared access policies.
        SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
        {
            // When the start time for the SAS is omitted, the start time is assumed to be the time when the storage service receives the request.
            // Omitting the start time for a SAS that is effective immediately helps to avoid clock skew.
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Delete
        };
        // Generate the shared access signature on the blob, setting the constraints directly on the signature.
        sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
    }
    else
    {
        // Generate the shared access signature on the blob. In this case, all of the constraints for the
        // shared access signature are specified on the container's stored access policy.
        sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
    }
    // Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
Fine Uploader configuration:
var uploader = new qq.azure.FineUploader({
    element: document.getElementById('uploader'),
    template: 'qq-template',
    autoUpload: false,
    request: {
        endpoint: 'https://customstorage.blob.core.windows.net/images'
    },
    signature: {
        endpoint: '/House/SAS'
    },
    uploadSuccess: {
        endpoint: '/House/UploadImage'
    }
});

Create Shared Access Token with Microsoft.WindowsAzure.Storage returns 403

I have a fairly simple method that uses the NEW Storage API to create a SAS and copy a blob from one container to another.
I am trying to use this to copy a blob BETWEEN STORAGE ACCOUNTS. So I have two storage accounts, with the exact same containers, and I am trying to copy a blob from one storage account's container to the other storage account's container.
I don't know if the SDK is built for that, but it seems like it would be a common scenario.
Some additional information:
I create the token on the Destination Container.
Does that token need to be created on both the source and destination? Does it take time to register the token? Do I need to create it for each request, or only once per token "lifetime"?
I should mention a 403 is a Forbidden HTTP status code.
private static string CreateSharedAccessToken(CloudBlobClient blobClient, string containerName)
{
    var container = blobClient.GetContainerReference(containerName);
    var blobPermissions = new BlobContainerPermissions();
    // The shared access policy provides read/write access to the container for 1 hour:
    blobPermissions.SharedAccessPolicies.Add("SolutionPolicy", new SharedAccessBlobPolicy()
    {
        // To ensure the SAS is valid immediately we don't set a start time,
        // so we can avoid failures caused by small clock differences:
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        Permissions = SharedAccessBlobPermissions.Write |
                      SharedAccessBlobPermissions.Read
    });
    blobPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
    container.SetPermissions(blobPermissions);
    return container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "SolutionPolicy");
}
Down the line I use this token to call a copy operation, which returns a 403:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
destBlob.StartCopyFromBlob(uri);
My version of Azure.Storage is 2.1.0.2.
Here is the full copy method in case that helps:
private static void CopyBlobs(
    CloudBlobContainer srcContainer, string blobToken,
    CloudBlobContainer destContainer)
{
    var srcBlobList
        = srcContainer.ListBlobs(string.Empty, true, BlobListingDetails.All); // set to None in prod (for perf)
    //// get the SAS token to use for all blobs
    //string token = srcContainer.GetSharedAccessSignature(
    //    new SharedAccessBlobPolicy(), "SolutionPolicy");
    bool pendingCopy = true;
    foreach (var src in srcBlobList)
    {
        var srcBlob = src as ICloudBlob;
        // Determine BlobType:
        ICloudBlob destBlob;
        if (srcBlob.Properties.BlobType == BlobType.BlockBlob)
        {
            destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
        }
        else
        {
            destBlob = destContainer.GetPageBlobReference(srcBlob.Name);
        }
        // Determine Copy State:
        if (destBlob.CopyState != null)
        {
            switch (destBlob.CopyState.Status)
            {
                case CopyStatus.Failed:
                    log.Info(destBlob.CopyState);
                    break;
                case CopyStatus.Aborted:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    destBlob.StartCopyFromBlob(destBlob.CopyState.Source);
                    return;
                case CopyStatus.Pending:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    break;
            }
        }
        // copy using only Policy ID:
        var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
        destBlob.StartCopyFromBlob(uri);
        //// copy using src blob as SAS
        //var source = new Uri(srcBlob.Uri.AbsoluteUri + token);
        //destBlob.StartCopyFromBlob(source);
    }
}
And finally the account and client (vetted) code:
var credentials = new StorageCredentials("BAR", "FOO");
var account = new CloudStorageAccount(credentials, true);
var blobClient = account.CreateCloudBlobClient();
var sasToken = CreateSharedAccessToken(blobClient, "content");
When I use a REST client this seems to work... any ideas?
Consider also this problem:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
You are probably calling the ToString method of Uri, which produces a "human readable" version of the URL. If the blobToken contains special characters, for example "+", this will cause a malformed-token error on the storage server, which will refuse to give you access.
Use this instead:
String uri = srcBlob.Uri.AbsoluteUri + blobToken;
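A quick demonstration of the difference (the URL below is made up, and the exact unescaping behavior varies by .NET version):

var uri = new Uri("https://example.blob.core.windows.net/c/b?sig=ab%2Bcd");
Console.WriteLine(uri.ToString());  // typically "...?sig=ab+cd"; '%2B' is unescaped, which corrupts the signature
Console.WriteLine(uri.AbsoluteUri); // "...?sig=ab%2Bcd"; escaping is preserved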
Shared Access Tokens are not required for this task. I ended up with two accounts and it works fine:
var accountSrc = new CloudStorageAccount(credsSrc, true);
var accountDest = new CloudStorageAccount(credsSrc, true);
var blobClientSrc = accountSrc.CreateCloudBlobClient();
var blobClientDest = accountDest.CreateCloudBlobClient();
// Set permissions on the container.
var permissions = new BlobContainerPermissions {PublicAccess = BlobContainerPublicAccessType.Blob};
srcContainer.SetPermissions(permissions);
destContainer.SetPermissions(permissions);
//grab the blob
var sourceBlob = srcContainer.GetBlockBlobReference("FOO");
var destinationBlob = destContainer.GetBlockBlobReference("BAR");
//create a new blob
destinationBlob.StartCopyFromBlob(sourceBlob);
Since both CloudStorageAccount objects point to the same account, copying without a SAS token would work just fine as you also mentioned.
On the other hand, you need either a publicly accessible blob or a SAS token to copy from another account. So what you tried was correct, but you established a container-level access policy, which can take up to 30 seconds to take effect as also documented in MSDN. During this interval, a SAS token that is associated with the stored access policy will fail with status code 403 (Forbidden), until the access policy becomes active.
One more thing that I would like to point out: when you call Get*BlobReference to create a new blob object, the CopyState property will not be populated until you do a GET/HEAD operation such as FetchAttributes.
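A minimal sketch of both points, assuming the legacy 2.x client objects from the snippets above (the blob names and one-hour expiry are placeholders). An ad-hoc blob-level SAS takes effect immediately, so it avoids the stored-access-policy propagation delay, and FetchAttributes populates CopyState:

// Ad-hoc (non-policy) SAS on the source blob: effective immediately,
// unlike a container-level stored access policy, which can take ~30 seconds.
CloudBlockBlob srcBlob = srcContainer.GetBlockBlobReference("FOO");
CloudBlockBlob destBlob = destContainer.GetBlockBlobReference("FOO");
string sasToken = srcBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
    Permissions = SharedAccessBlobPermissions.Read
});
destBlob.StartCopyFromBlob(new Uri(srcBlob.Uri.AbsoluteUri + sasToken));

// CopyState is null on a fresh reference until a GET/HEAD request populates it:
destBlob.FetchAttributes();
if (destBlob.CopyState != null)
{
    Console.WriteLine(destBlob.CopyState.Status);
}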