Authentication failure when uploading to Azure Blob Storage using Azure.Storage.Blobs v12.9.0

I get this error when trying to upload files to blob storage. The error occurs both when I run on localhost and when I run in an Azure Function.
My connection string looks like:
DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xxx;EndpointSuffix=core.windows.net
Authentication information is not given in the correct format. Check the value of the Authorization header.
Time:2021-10-14T15:56:26.7659660Z
Status: 400 (Authentication information is not given in the correct format. Check the value of Authorization header.)
ErrorCode: InvalidAuthenticationInfo
This used to work in the past, but it has recently started throwing this error for a new storage account I created. My code looks like this:
public AzureStorageService(IOptions<AzureStorageSettings> options)
{
    _connectionString = options.Value.ConnectionString;
    _containerName = options.Value.ImageContainer;
    _sasCredential = new StorageSharedKeyCredential(options.Value.AccountName, options.Value.Key);
    _blobServiceClient = new BlobServiceClient(new BlobServiceClient(_connectionString).Uri, _sasCredential);
    _containerClient = _blobServiceClient.GetBlobContainerClient(_containerName);
}
public async Task<string> UploadFileAsync(IFormFile file, string location, bool publicAccess = true)
{
    try
    {
        await _containerClient.CreateIfNotExistsAsync(publicAccess
            ? PublicAccessType.Blob
            : PublicAccessType.None);
        var blobClient = _containerClient.GetBlobClient(location);
        await using var fileStream = file.OpenReadStream();
        // throws Exception here
        await blobClient.UploadAsync(fileStream, true);
        return blobClient.Uri.ToString();
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        throw;
    }
}
// To be able to do this, I have to create the container client via the BlobServiceClient that was created together with the StorageSharedKeyCredential
public Uri GetSasContainerUri()
{
    if (_containerClient.CanGenerateSasUri)
    {
        // Create a SAS token that's valid for one hour.
        var sasBuilder = new BlobSasBuilder()
        {
            BlobContainerName = _containerClient.Name,
            Resource = "c"
        };
        sasBuilder.ExpiresOn = DateTimeOffset.UtcNow.AddHours(1);
        sasBuilder.SetPermissions(BlobContainerSasPermissions.Write);
        var sasUri = _containerClient.GenerateSasUri(sasBuilder);
        Console.WriteLine("SAS URI for blob container is: {0}", sasUri);
        Console.WriteLine();
        return sasUri;
    }
    else
    {
        Console.WriteLine(@"BlobContainerClient must be authorized with Shared Key
                            credentials to create a service SAS.");
        return null;
    }
}

Please change the following line of code:
_blobServiceClient = new BlobServiceClient(new BlobServiceClient(_connectionString).Uri, _sasCredential);
to
_blobServiceClient = new BlobServiceClient(_connectionString);
Since your connection string already contains all the necessary information, you don't need the extra steps you're doing; you will be using the BlobServiceClient(string) constructor, which accepts a connection string directly.
You can also delete the following line of code:
_sasCredential = new StorageSharedKeyCredential(options.Value.AccountName, options.Value.Key);
and can probably get rid of AccountName and Key from your configuration settings if they are not used elsewhere.
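For reference, a minimal sketch of the constructor after that change (assuming the same AzureStorageSettings shape as in the question; a client built from a connection string that contains the account key should still be able to generate SAS URIs, so GetSasContainerUri keeps working):
public AzureStorageService(IOptions<AzureStorageSettings> options)
{
    _containerName = options.Value.ImageContainer;
    // The connection string already carries the account name and key,
    // so no separate StorageSharedKeyCredential is needed here.
    _blobServiceClient = new BlobServiceClient(options.Value.ConnectionString);
    _containerClient = _blobServiceClient.GetBlobContainerClient(_containerName);
}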

Related

Deleting Blob from Azure storage using C#

The package which I am using is Azure.Storage.Blobs (v12.9.1) and I am trying to delete a blob.
Here is the code I have written (I do not get any errors):
//path - storage url without token
public async Task<bool> DeleteFilefromStorage(string path)
{
    try
    {
        BlobServiceClient blobServiceClient = new BlobServiceClient(Helper.StorageCS);
        string containerName = Helper.ContainerName;
        Uri uri = new Uri(path);
        string filename = Path.GetFileName(uri.LocalPath);
        BlobContainerClient blobContainerClient = blobServiceClient.GetBlobContainerClient(containerName);
        var blob = blobContainerClient.GetBlobClient(filename);
        return await blob.DeleteIfExistsAsync();
    }
    catch
    {
        throw;
    }
}
The reason your code is failing is that your blob URL is something like https://mystorage.blob.core.windows.net/mycontainer/files/ba143f66-ba18-478a-85d6-0d661e6894dd.xlsx, where the file (ba143f66-ba18-478a-85d6-0d661e6894dd.xlsx) is inside a virtual folder called files.
However, when you do string filename = Path.GetFileName(uri.LocalPath);, it will only return ba143f66-ba18-478a-85d6-0d661e6894dd.xlsx and not files/ba143f66-ba18-478a-85d6-0d661e6894dd.xlsx.
Because of this, the delete request targets a blob name that does not exist, which results in a 404. Since the DeleteIfExistsAsync method swallows the 404 (Not Found) error, you do not see any exception, but the blob is not deleted either (because the name you passed does not exist).
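One possible fix, sketched under the assumption that the rest of the method stays as in the question: let BlobUriBuilder (from the same Azure.Storage.Blobs package) parse the URL, so the blob name keeps its virtual-folder prefix:
public async Task<bool> DeleteFilefromStorage(string path)
{
    BlobServiceClient blobServiceClient = new BlobServiceClient(Helper.StorageCS);
    // BlobUriBuilder splits the URL into the container name and the full blob name,
    // so "files/ba143f66-ba18-478a-85d6-0d661e6894dd.xlsx" keeps its folder prefix.
    var builder = new BlobUriBuilder(new Uri(path));
    BlobContainerClient blobContainerClient = blobServiceClient.GetBlobContainerClient(builder.BlobContainerName);
    var blob = blobContainerClient.GetBlobClient(builder.BlobName);
    return await blob.DeleteIfExistsAsync();
}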

Fine-uploader Azure upload Url is being changed

I had my project working in Core 1, but when I changed to Core 2 it no longer uploads the images to Azure. I get a 404 response code and a "Problem sending file to Azure" message. The correct URL is returned from the server, but then Fine Uploader calls a URL with the current URL in front.
The URL in the Chrome console returning the 404 is shown as
https://localhost:44348/House/%22https://Customstorage.blob.core.windows.net/images/b0975cc7-fae5-4130-bced-c26272d0a21c.jpeg?sv=2017-04-17&sr=b&sig=UFUEsHT1GWI%2FfMd9cuHmJsl2j05W1Acux52UZ5IsXso%3D&se=2017-09-16T04%3A06%3A36Z&sp=rcwd%22
This is being added to the URL somewhere
https://localhost:44348/House/%22
I create the SAS with
public async Task<string> SAS(string bloburi, string name)
{
    CloudStorageAccount storageAccount = new CloudStorageAccount(
        new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(
            "<storage-account-name>",
            "<access-key>"), true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("images");
    return await Task.FromResult<string>(GetBlobSasUri(container, bloburi));
}
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
    string sasBlobToken;
    // Get a reference to a blob within the container.
    // Note that the blob may not exist yet, but a SAS can still be created for it.
    var uri = new Uri(blobName);
    // Get the name of the file from the path.
    string filename = System.IO.Path.GetFileName(uri.LocalPath);
    CloudBlockBlob blob = container.GetBlockBlobReference(filename);
    if (policyName == null)
    {
        // Create a new access policy and define its constraints.
        // Note that the SharedAccessBlobPolicy class is used both to define the parameters of an ad-hoc SAS, and
        // to construct a shared access policy that is saved to the container's shared access policies.
        SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
        {
            // When the start time for the SAS is omitted, the start time is assumed to be the time when the storage service receives the request.
            // Omitting the start time for a SAS that is effective immediately helps to avoid clock skew.
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Delete
        };
        // Generate the shared access signature on the blob, setting the constraints directly on the signature.
        sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
    }
    else
    {
        // Generate the shared access signature on the blob. In this case, all of the constraints for the
        // shared access signature are specified on the container's stored access policy.
        sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
    }
    // Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
Fine-uploader -
var uploader = new qq.azure.FineUploader({
    element: document.getElementById('uploader'),
    template: 'qq-template',
    autoUpload: false,
    request: {
        endpoint: 'https://customstorage.blob.core.windows.net/images'
    },
    signature: {
        endpoint: '/House/SAS'
    },
    uploadSuccess: {
        endpoint: '/House/UploadImage'
    }
});

Create Shared Access Token with Microsoft.WindowsAzure.Storage returns 403

I have a fairly simple method that uses the NEW Storage API to create a SAS and copy a blob from one container to another.
I am trying to use this to copy blobs between storage accounts. I have two storage accounts with identical containers, and I am trying to copy a blob from one storage account's container to the other storage account's container.
I don't know if the SDK is built for that, but it seems like it would be a common scenario.
Some additional information:
I create the token on the Destination Container.
Does that token need to be created on both the source and destination? Does it take time to register the token? Do I need to create it for each request, or only once per token "lifetime"?
I should mention that a 403 is a Forbidden HTTP status code.
private static string CreateSharedAccessToken(CloudBlobClient blobClient, string containerName)
{
    var container = blobClient.GetContainerReference(containerName);
    var blobPermissions = new BlobContainerPermissions();
    // The shared access policy provides read/write access to the container for 10 hours:
    blobPermissions.SharedAccessPolicies.Add("SolutionPolicy", new SharedAccessBlobPolicy()
    {
        // To ensure SAS is valid immediately we don't set start time
        // so we can avoid failures caused by small clock differences:
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        Permissions = SharedAccessBlobPermissions.Write |
                      SharedAccessBlobPermissions.Read
    });
    blobPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
    container.SetPermissions(blobPermissions);
    return container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "SolutionPolicy");
}
Down the line I use this token to call a copy operation, which returns a 403:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
destBlob.StartCopyFromBlob(uri);
My version of Azure.Storage is 2.1.0.2.
Here is the full copy method in case that helps:
private static void CopyBlobs(
    CloudBlobContainer srcContainer, string blobToken,
    CloudBlobContainer destContainer)
{
    var srcBlobList
        = srcContainer.ListBlobs(string.Empty, true, BlobListingDetails.All); // set to none in prod (4perf)
    //// get the SAS token to use for all blobs
    //string token = srcContainer.GetSharedAccessSignature(
    //    new SharedAccessBlobPolicy(), "SolutionPolicy");
    bool pendingCopy = true;
    foreach (var src in srcBlobList)
    {
        var srcBlob = src as ICloudBlob;
        // Determine BlobType:
        ICloudBlob destBlob;
        if (srcBlob.Properties.BlobType == BlobType.BlockBlob)
        {
            destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
        }
        else
        {
            destBlob = destContainer.GetPageBlobReference(srcBlob.Name);
        }
        // Determine Copy State:
        if (destBlob.CopyState != null)
        {
            switch (destBlob.CopyState.Status)
            {
                case CopyStatus.Failed:
                    log.Info(destBlob.CopyState);
                    break;
                case CopyStatus.Aborted:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    destBlob.StartCopyFromBlob(destBlob.CopyState.Source);
                    return;
                case CopyStatus.Pending:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    break;
            }
        }
        // copy using only Policy ID:
        var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
        destBlob.StartCopyFromBlob(uri);
        //// copy using src blob as SAS
        //var source = new Uri(srcBlob.Uri.AbsoluteUri + token);
        //destBlob.StartCopyFromBlob(source);
    }
}
And finally the account and client (vetted) code:
var credentials = new StorageCredentials("BAR", "FOO");
var account = new CloudStorageAccount(credentials, true);
var blobClient = account.CreateCloudBlobClient();
var sasToken = CreateSharedAccessToken(blobClient, "content");
When I use a REST client this seems to work... any ideas?
Consider also this problem:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
You are probably relying on Uri's ToString method, which produces a "human readable" version of the URL (it unescapes some characters). If the blobToken contains special characters such as "+", the token arrives at the storage server malformed, and the server will refuse to grant you access.
Use this instead:
String uri = srcBlob.Uri.AbsoluteUri + blobToken;
Shared Access Tokens are not required for this task. I ended up with two accounts and it works fine:
var accountSrc = new CloudStorageAccount(credsSrc, true);
var accountDest = new CloudStorageAccount(credsSrc, true);
var blobClientSrc = accountSrc.CreateCloudBlobClient();
var blobClientDest = accountDest.CreateCloudBlobClient();
// Set permissions on the container.
var permissions = new BlobContainerPermissions {PublicAccess = BlobContainerPublicAccessType.Blob};
srcContainer.SetPermissions(permissions);
destContainer.SetPermissions(permissions);
//grab the blob
var sourceBlob = srcContainer.GetBlockBlobReference("FOO");
var destinationBlob = destContainer.GetBlockBlobReference("BAR");
//create a new blob
destinationBlob.StartCopyFromBlob(sourceBlob);
Since both CloudStorageAccount objects point to the same account, copying without a SAS token would work just fine as you also mentioned.
On the other hand, you need either a publicly accessible blob or a SAS token to copy from another account. So what you tried was correct, but you established a container-level access policy, which can take up to 30 seconds to take effect as also documented in MSDN. During this interval, a SAS token that is associated with the stored access policy will fail with status code 403 (Forbidden), until the access policy becomes active.
One more thing I would like to point out: when you call Get*BlobReference to create a new blob object, the CopyState property will not be populated until you perform a GET/HEAD operation on it, such as FetchAttributes.
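Putting those pieces together, here is a rough sketch of the cross-account case (account names, keys and container names are placeholders): an ad hoc read SAS is generated directly on the source blob, which avoids the up-to-30-second propagation delay of a stored access policy, and the token is appended to the source URI handed to StartCopyFromBlob:
var srcAccount = new CloudStorageAccount(new StorageCredentials("srcAccount", "srcKey"), true);
var destAccount = new CloudStorageAccount(new StorageCredentials("destAccount", "destKey"), true);
var srcContainer = srcAccount.CreateCloudBlobClient().GetContainerReference("content");
var destContainer = destAccount.CreateCloudBlobClient().GetContainerReference("content");
var srcBlob = srcContainer.GetBlockBlobReference("FOO");
var destBlob = destContainer.GetBlockBlobReference("FOO");
// Ad hoc read-only SAS on the source blob; no stored access policy, so it is usable immediately.
string sas = srcBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
    Permissions = SharedAccessBlobPermissions.Read
});
// The destination account pulls the blob from the SAS-authenticated source URI.
destBlob.StartCopyFromBlob(new Uri(srcBlob.Uri.AbsoluteUri + sas));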

Windows Azure Blob

I've been trying to create a Windows Azure Blob containing an image file. I followed these tutorials: http://www.nickharris.net/2012/11/how-to-upload-an-image-to-windows-azure-storage-using-mobile-services/ and http://www.windowsazure.com/en-us/develop/mobile/tutorials/upload-images-to-storage-dotnet/. Finally the following code represents a merging of them. On the last line, however, an exception is raised:
An exception of type 'System.TypeLoadException' occurred in
mscorlib.ni.dll but was not handled in user code
Additional information: A binding for the specified type name was not
found. (Exception from HRESULT: 0x80132005)
Even though the container and the table entry are created, it doesn't work properly.
private async void SendPicture()
{
    StorageFile media = await StorageFile.GetFileFromPathAsync("fanny.jpg");
    if (media != null)
    {
        //add todo item to trigger insert operation which returns item.SAS
        var todoItem = new Imagem()
        {
            ContainerName = "mypics",
            ResourceName = "Fanny",
            ImageUri = "uri"
        };
        await imagemTable.InsertAsync(todoItem);
        //Upload image direct to blob storage using SAS and the Storage Client library for Windows CTP
        //Get a stream of the image just taken
        using (var fileStream = await media.OpenStreamForReadAsync())
        {
            //Our credential for the upload is our SAS token
            StorageCredentials cred = new StorageCredentials(todoItem.SasQueryString);
            var imageUri = new Uri(todoItem.SasQueryString);
            // Instantiate a Blob store container based on the info in the returned item.
            CloudBlobContainer container = new CloudBlobContainer(
                new Uri(string.Format("https://{0}/{1}",
                    imageUri.Host, todoItem.ContainerName)), cred);
            // Upload the new image as a BLOB from the stream.
            CloudBlockBlob blobFromSASCredential =
                container.GetBlockBlobReference(todoItem.ResourceName);
            await blobFromSASCredential.UploadFromStreamAsync(fileStream.AsInputStream());
        }
    }
}
Please use the Assembly Binding Log Viewer (Fuslogvw.exe) to see which assembly load is failing. As also mentioned in the article, the common language runtime's failure to locate an assembly typically shows up as a TypeLoadException in your application.

Checking if a queue exists

I have a very basic question about Windows Azure Storage Queue errors/access.
I am trying to find out whether the given storage account already contains a queue by the given name, say "queue1". I do not want to create the queue if it does not exist, so I am not keen on using the CreateIfNotExist method. The permissions I have given to the SAS token are Process and Add (since all I want to do is add a new message to the queue, but only if it already exists, and throw an error otherwise).
The problem is that when I try to get a reference to a queue with a made-up name and add a message to it, I get a 403. A 403 can also occur when the SAS token does not have the right permissions, so I cannot be sure what is causing the error.
Is there a way I could explicitly know whether the queue exists or not?
I have tried the BeginExist and EndExist methods, but they always return false even when I can see that the queue is there.
Any suggestions?
The Get Queue Metadata REST API operation will return status code 200 if the queue exists or a Queue Service Error Code otherwise.
Regarding authorization:
This operation can be performed by the account owner and by anyone with a shared access signature that has permission to perform this operation.
A GET request to
https://myaccount.queue.core.windows.net/myqueue?comp=metadata
will return a response like:
Response Status:
HTTP/1.1 200 OK
Response Headers:
Transfer-Encoding: chunked
x-ms-approximate-messages-count: 0
Date: Fri, 16 Sep 2011 01:27:38 GMT
Server: Windows-Azure-Queue/1.0 Microsoft-HTTPAPI/2.0
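As a rough sketch, the same check can be made from the storage client library by issuing that metadata request and treating a 404 as "does not exist". This assumes the credentials used are allowed to read queue metadata (for example the account key, rather than an Add/Process-only SAS):
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("account", "key"), false);
CloudQueue queue = storageAccount.CreateCloudQueueClient().GetQueueReference("queue1");
try
{
    // FetchAttributes issues the Get Queue Metadata request shown above.
    queue.FetchAttributes();
    Console.WriteLine("Queue exists.");
}
catch (StorageException ex)
{
    if (ex.RequestInformation.HttpStatusCode == 404)
    {
        Console.WriteLine("Queue does not exist.");
    }
    else
    {
        throw;
    }
}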
Are you sure you're getting a 403 error even when the queue does not exist? Based on what you described above, I created a simple console app. The queue does not exist in my storage account. When I try to add a message with a valid SAS token, I get a 404 error:
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("account", "key"), false);
CloudQueueClient client = storageAccount.CreateCloudQueueClient();
CloudQueue queue = client.GetQueueReference("non-existent-queue");
var queuePolicy = new SharedAccessQueuePolicy();
var sas = queue.GetSharedAccessSignature(new SharedAccessQueuePolicy()
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30),
    Permissions = SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.ProcessMessages | SharedAccessQueuePermissions.Update
}, null);
StorageCredentials creds = new StorageCredentials(sas);
var queue1 = new CloudQueue(queue.Uri, creds);
try
{
    queue1.AddMessage(new CloudQueueMessage("This is a test message"));
}
catch (StorageException excep)
{
    // Get 404 error here
}
Next, I made the SAS token invalid by setting its expiry to 30 minutes before the current time. Now when I run the application, I get a 403 error as expected.
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("account", "key"), false);
CloudQueueClient client = storageAccount.CreateCloudQueueClient();
CloudQueue queue = client.GetQueueReference("non-existent-queue");
var queuePolicy = new SharedAccessQueuePolicy();
var sas = queue.GetSharedAccessSignature(new SharedAccessQueuePolicy()
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(-30), // -30 to ensure SAS is invalid
    Permissions = SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.ProcessMessages | SharedAccessQueuePermissions.Update
}, null);
StorageCredentials creds = new StorageCredentials(sas);
var queue1 = new CloudQueue(queue.Uri, creds);
try
{
    queue1.AddMessage(new CloudQueueMessage("This is a test message"));
}
catch (StorageException excep)
{
    // Get 403 error here
}
There are now Exists and ExistsAsync methods (with various overloads).
Example of the former in use:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference(queueName);
bool doesExist = queue.Exists();
You will want a reference to Microsoft.Azure.Storage.Queue (I believe the older 'cloud' assemblies may not have had these methods; initially I could only access ExistsAsync before I referenced the right package, but once I had added the above via NuGet, Exists was also available).
For more details see the following links:
Exists
ExistsAsync
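A quick sketch of the async variant, under the same package assumption:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference(queueName);
bool doesExist = await queue.ExistsAsync();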
There is no Exists method in v12 either. I wrote a simple helper method to do the check:
private async Task<bool> QueueExistsAsync(QueueClient queue)
{
    try
    {
        await queue.GetPropertiesAsync();
        return true;
    }
    catch (RequestFailedException ex)
    {
        if (ex.Status == (int)HttpStatusCode.NotFound)
        {
            return false;
        }
        throw;
    }
}
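A possible usage of that helper, assuming a v12 Azure.Storage.Queues QueueClient built from a connection string (the names here are placeholders):
var queue = new QueueClient(connectionString, "queue1");
if (await QueueExistsAsync(queue))
{
    // Only enqueue when the queue already exists, as the question requires.
    await queue.SendMessageAsync("This is a test message");
}
else
{
    throw new InvalidOperationException("Queue 'queue1' does not exist.");
}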
