How to add Authorization header to SAS URI? - azure

I am working on a POC where I have to create a simulated device and connect it to IoT Hub; this part is done. After that, an external application sends a message to IoT Hub for that device.
The message contains a blob storage SAS URI, and I need to download that file to the device.
The simulated device is able to get the SAS URI, but when I start downloading the file I get the error below.
Exception in thread "main" com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Please correct me if my approach is wrong, and suggest the appropriate approach for this use case.
private static void download(String message) throws StorageException, IOException, JSONException, URISyntaxException {
    // need to download the file to a folder on the simulator
    try {
        JSONObject jsonObject = new JSONObject(message);
        String sasUri = (String) jsonObject.get("fileUrl");
        System.out.println("SAS URI from hub -> " + sasUri);
        URI url = new URI(sasUri);
        //downloadFile(sasUri);
        System.out.println("end of file download function");
        CloudBlob blob = new CloudBlockBlob(url);
        blob.downloadToFile("/path/to/download/file");
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Below is the SAS URI:
https://*******.blob.core.windows.net/test/testfile.zip?sv=2017-07-29&ss=b&srt=sco&sp=rwdlac&se=2018-04-16T13:33:22Z&st=2018-04-16T05:00:22Z&spr=https&sig=***********
I am getting the SAS URI directly from the Azure portal, not generating it at runtime.
Thanks in advance!

To narrow down the issue, you can try the following method to see if it helps.
Get the URI of the blob in the Azure portal by clicking "Download".
After that, the file will be downloaded. You can find the URI in the browser's download history. The URI format will look like this:
https://[storage-account].blob.core.windows.net/testdownload/20180417_4.zip?sv=2017-07-29&ss=bqtf&srt=sco&sp=rwdlacup&se=2018-04-19T15:52:15Z&sig=[signature]
Use this URI directly in the following code snippet and it will work.
CloudBlob blob = new CloudBlockBlob(url);
await blob.DownloadToFileAsync(imgPath, System.IO.FileMode.CreateNew);
Update: Another way to get the absolute URI to the blob from the Azure portal is as follows:
First, get the SAS token. Note the start and expiry date/time; the token is valid only within this period.
Second, get the blob URL.
Finally, the complete absolute URI to the blob is the blob URL plus the SAS token. It will look like this:
https://ritastorageaccount.blob.core.windows.net/?sv=2017-07-29&ss=b&srt=sco&sp=rwdlac&se=2018-04-26T10:01:47Z&st=2018-04-26T02:01:47Z&spr=https&sig=[SIG]
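For illustration, here is a minimal C# sketch of that approach using the classic CloudBlockBlob client, along the lines of the snippet above. The account, SAS token, and download path are placeholders; the point is that the SAS token appended to the blob URL is the only credential needed, so no Authorization header has to be set by hand.
using System;
using System.IO;
using Microsoft.Azure.Storage.Blob;

class DownloadWithSas
{
    static void Main()
    {
        // Placeholder values: the blob URL copied from the portal plus the SAS token appended to it.
        string blobUrl = "https://<account>.blob.core.windows.net/test/testfile.zip";
        string sasToken = "?sv=2017-07-29&ss=b&srt=sco&sp=rwdlac&se=2018-04-16T13:33:22Z&st=2018-04-16T05:00:22Z&spr=https&sig=<signature>";

        // The URI already carries the signature, so the SDK authenticates with it directly.
        var blob = new CloudBlockBlob(new Uri(blobUrl + sasToken));
        blob.DownloadToFile(@"C:\temp\testfile.zip", FileMode.CreateNew);
    }
}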

Related

I want to generate SAS URL dynamically via C# code for Azure Blob Container. Using this SAS URL we must be able to upload the files

I want to generate a SAS URL dynamically via C# code for an Azure Blob Container. Using this SAS URL, we must be able to upload files to the Azure Blob Container. I have tried multiple ways to generate the SAS URL by following the Microsoft docs, but I always get an AuthorizationResourceTypeMismatch or AuthorizationPermissionMismatch error.
Error: AuthorizationPermissionMismatch This request is not authorized to perform this operation using this permission.
private static Uri GetServiceSasUriForContainer(BlobContainerClient containerClient,
    string storedPolicyName = null)
{
    // Check whether this BlobContainerClient object has been authorized with Shared Key.
    if (containerClient.CanGenerateSasUri)
    {
        // Create a SAS token that's valid for one hour.
        BlobSasBuilder sasBuilder = new BlobSasBuilder()
        {
            BlobContainerName = containerClient.Name,
            Resource = "c"
        };

        if (storedPolicyName == null)
        {
            sasBuilder.ExpiresOn = DateTimeOffset.UtcNow.AddHours(1);
            sasBuilder.SetPermissions(BlobContainerSasPermissions.Read);
        }
        else
        {
            sasBuilder.Identifier = storedPolicyName;
        }

        Uri sasUri = containerClient.GenerateSasUri(sasBuilder);
        Console.WriteLine("SAS URI for blob container is: {0}", sasUri);
        Console.WriteLine();
        return sasUri;
    }
    else
    {
        Console.WriteLine(@"BlobContainerClient must be authorized with Shared Key
                            credentials to create a service SAS.");
        return null;
    }
}
Error: AuthenticationFailed Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
I need this SAS URL because I use it in my JavaScript to upload files into the Azure Blob Container.
Can someone help me achieve this goal?
The reason you're getting this error is because you are creating the SAS token with Read permission (BlobContainerSasPermissions.Read).
In order to upload a blob in a container using SAS URL, the SAS token needs either Write (BlobContainerSasPermissions.Write) or Create (BlobContainerSasPermissions.Create) permission. Please create a SAS token with one of these permissions and you should not get this error.
To learn more about the permissions, please see this link: https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-directory-container-or-blob.
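As a rough illustration (not the only valid combination), the builder from the question could be changed along these lines, assuming the same Azure.Storage.Blobs v12 setup and a BlobContainerClient authorized with a shared key. The method name is just a hypothetical variant of the original:
private static Uri GetUploadSasUriForContainer(BlobContainerClient containerClient)
{
    BlobSasBuilder sasBuilder = new BlobSasBuilder()
    {
        BlobContainerName = containerClient.Name,
        Resource = "c",                          // "c" = container-level SAS
        ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
    };

    // Create + Write allow uploading new blobs; add Read as well if the
    // JavaScript client also needs to download what it uploads.
    sasBuilder.SetPermissions(
        BlobContainerSasPermissions.Create | BlobContainerSasPermissions.Write);

    return containerClient.GenerateSasUri(sasBuilder);
}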

How to programmatically find out what operations I can do in a blob storage?

I am using libraries Microsoft.Azure.Storage.Blob 11.2.3.0 and Microsoft.Azure.Storage.Common 11.2.3.0 to connect to an Azure BlobStorage from a .NET Core 3.1 application.
When I started working on this, I had been given connection strings that gave me full access to the BlobStorage (or rather, the entire cloud storage account). Based upon those, I chose to write my connection code "defensively", making use of Exists() and CreateIfNotExists() from the CloudBlobContainer class to ensure the application would not fail when a container did not yet exist.
Now, I'm connecting a BlobStorage container using a SAS. While I can freely retrieve and upload blobs within the container like this, unfortunately, it seems that I am not allowed to do anything on the container level. Not only CreateIfNotExists, but even the mere querying of existence by Exists() throws a StorageException saying
This request is not authorized to perform this operation.
The documentation does not mention the exception.
Is there any way to check preemptively whether I am allowed to check the container's existence?
I have tried looking into the container permissions retrieved from GetPermissions, but that will throw an exception, as well.
The only other alternative I can see is to check for container existence within a try-catch-block and assume existence if an exception is thrown ...
There's no definitive way to identify whether an operation can be performed using a SAS token other than performing that operation and catching any exception it may throw. The exception of interest to you is the 403 (not authorized) error.
However, you can try to predict whether an operation can be performed by looking at the SAS token. If it is a Service SAS token and not an Account SAS token, then all the account-related operations are not allowed. The way to distinguish between an Account SAS token and a Service SAS token is that the former will contain attributes like SignedServices (ss) and SignedResourceTypes (srt).
Next, look for the SignedPermissions (sp) attribute in your SAS token. This attribute tells you which operations are possible with the SAS token. For example, if your SAS token is a Service SAS token and it includes the Delete (d) permission, that means you can use this SAS token to delete a blob.
Please see these tables for the permissions/allowed operations combinations:
Service SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-directory-container-or-blob
Account SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas
Please note that the operation might still fail for any number of other reasons, for example the SAS token has expired, the account key has changed since the SAS token was generated, IP restrictions, etc.
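As a rough illustration of that heuristic (and only a heuristic, since the service has the final say), here is a small C# sketch that inspects the token's query parameters. DescribeSasToken is a hypothetical helper, and HttpUtility.ParseQueryString (System.Web) is assumed to be available on .NET Core:
using System;
using System.Web;

static void DescribeSasToken(string sasToken)
{
    // Parse the SAS token as a query string (strip a leading '?' if present).
    var query = HttpUtility.ParseQueryString(sasToken.TrimStart('?'));

    // Account SAS tokens carry SignedServices (ss) and SignedResourceTypes (srt);
    // service SAS tokens do not.
    bool isAccountSas = query["ss"] != null && query["srt"] != null;
    string permissions = query["sp"] ?? "(none)";

    Console.WriteLine(isAccountSas ? "Account SAS token" : "Service SAS token");
    Console.WriteLine("Signed permissions (sp): " + permissions);

    // Container-level calls such as Exists()/CreateIfNotExists() are typically
    // refused with 403 under a service SAS, so keep the try/catch around them.
    if (!isAccountSas)
        Console.WriteLine("Account/container management operations will likely fail.");
}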
I tried this in my system: I was able to check whether the container exists, create it if it does not, and then upload a file.
You need to give the proper permissions to your SAS token.
const string sasToken = "<SAS Token>";
const string accountName = "teststorage65";
const string blobContainerName = "example";
const string blobName = "test.txt";
const string myFileLocation = @"Local Path";

// Build credentials from the SAS token and connect to the storage account.
StorageCredentials storageCredentials = new StorageCredentials(sasToken);
var storageAccount = new CloudStorageAccount(storageCredentials, accountName, null, true);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference(blobContainerName);

var result = blobContainer.Exists();
if (result)
{
    Console.WriteLine("Container exists");
}
else
{
    Console.WriteLine("Container does not exist");
    Console.WriteLine("Creating container " + blobContainerName);
    blobContainer.CreateIfNotExists();
}

CloudBlockBlob cloudBlob = blobContainer.GetBlockBlobReference(blobName);
cloudBlob.UploadFromFile(myFileLocation);

Images uploaded to Azure blob storage unavailable when browsing by direct URL

I have uploaded a number of images to a Blob container on an Azure storage account of type StorageV2 (general purpose v2).
These were uploaded programmatically. Here's the code I used:
public Task CopyFile(string fileName, string targetPath)
{
    var blobRef = Container.GetBlockBlobReference(targetPath);
    blobRef.Properties.ContentType = GetContentType(fileName);
    return blobRef.UploadFromFileAsync(fileName);
}

public string GetContentType(string fileName)
{
    var provider = new FileExtensionContentTypeProvider();
    if (!provider.TryGetContentType(fileName, out var contentType))
    {
        contentType = "application/octet-stream";
    }
    return contentType;
}
Container is an initialized CloudBlobContainer instance.
When I use the Storage Explorer I can see the uploaded files. If I view the properties of any file it lists a Uri property. However, if I copy the value (a URL) and paste into a browser I see the following error page:
<Error>
<Code>ResourceNotFound</Code>
<Message>
The specified resource does not exist. RequestId:12485818-601e-0017-6f69-56c3df000000 Time:2019-08-19T08:35:13.2123849Z
</Message>
</Error>
But if I double-click the file in Storage Explorer it downloads the image correctly. The URL it uses is the same as the one I copied earlier as far as I could tell, except for some additional querystrings that look like this: ?sv=2018-03-28&ss=bqtf&srt=sco&sp=rwdlacup&se=2019-08-19T16:49:38Z&sig=%2FJs7VnGKsjplalKXCcl0XosgUkPWJccg0qdvCSZlDSs%3D&_=1566204636804
I assume this must mean my blobs are not publicly available, but I can't find any setting that will make my images available publicly at their known URI. Can anyone point me in the right direction here? Thank you.
Check the access level that is set on your container.
If it is Private, then you will get the error you are experiencing: ResourceNotFound.
As far as I know, if your container's access level is Private and you use the direct URL to access the blob, you will get this error. If you want to access it, you need to generate a SAS token for it.
For more details, please refer to
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-manage-access-to-resources
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview
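For illustration, here is a minimal sketch of both options using the same CloudBlobContainer client as in the question, assuming the Microsoft.Azure.Storage.Blob (v11) namespace. MakeBlobReachableAsync is a hypothetical helper, and the blob path and expiry are placeholders:
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Storage.Blob;

public async Task<string> MakeBlobReachableAsync(CloudBlobContainer container, string blobPath)
{
    // Option 1: allow anonymous read access to blobs in this container,
    // so the direct URL works without any token.
    await container.SetPermissionsAsync(new BlobContainerPermissions
    {
        PublicAccess = BlobContainerPublicAccessType.Blob
    });

    // Option 2: keep the container private and hand out a time-limited read SAS instead.
    var blob = container.GetBlockBlobReference(blobPath);
    string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
    });
    return blob.Uri.ToString() + sas;
}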

Azure Blob Storage V2 does not deliver content-disposition header any more

I have a question about the "content-disposition" blob property of a file in Azure Blob Storage V2.
I configured this property of my file howto-201901.pdf as "attachment; filename=howto.pdf" with Azure Storage Explorer 1.6.2 (see screenshot) as well as in the Azure portal. The property is set on the file, but it is not delivered as a header when downloading.
With the previous storage V1 this was no problem. If I downloaded the file howto-201901.pdf, the content-disposition HTTP header was set and the browser saved the file as howto.pdf, as configured.
But for the last 2 or 3 months, maybe since my upgrade to storage V2, this feature no longer works. The browser downloads the file with its original name.
Does anyone have information that could help me resolve this behavior?
Best, Tino
The Content-Disposition header is not sent in the response when the download URL is not authenticated. For the client to receive Content-Disposition:
Create a SAS token with limited access.
Append it to the blob download link.
This is a possible solution and worked for me.
Instead of creating a new policy, you can also use an existing policy from your blob storage; see https://learn.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
private Uri GetDownloadUri(CloudBlockBlob blob)
{
    try
    {
        // Build the SAS query string and append it to the blob URI.
        var query = GenerateSASQueryString(blob);
        UriBuilder newUri = new UriBuilder(blob.Uri)
        {
            Query = query
        };
        return newUri.Uri;
    }
    catch (UriFormatException ex)
    {
        Console.WriteLine(ex);
    }
    return blob.Uri;
}

private string GenerateSASQueryString(CloudBlockBlob blob)
{
    if (blob == null)
        return null;

    // Create a new access policy for the blob.
    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(24),
        SharedAccessStartTime = DateTimeOffset.UtcNow
    };

    // Return the SAS token.
    var query = blob.GetSharedAccessSignature(policy);
    return query;
}
I'm facing the same issue, but I'm very confused about the .NET SDKs and SharedAccessBlobPolicy.
I'm using the Azure.Storage.Blobs version 12.4.1 SDK for managing storage. Is it possible to use it to set SharedAccessBlobPolicy, or am I supposed to do it a different way? I tried looking into the documentation, but it is not really helpful; I could only find information about the Microsoft.Azure.Storage.Blob SDK version 11, which is considered legacy and deprecated now.
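As a hedged sketch for the v12 SDK: in Azure.Storage.Blobs the role of SharedAccessBlobPolicy is played by BlobSasBuilder (in the Azure.Storage.Sas namespace), which can also carry a Content-Disposition override. GetDownloadUriV12 is a hypothetical helper, and the account name, key, container, and blob names below are placeholders:
using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

static Uri GetDownloadUriV12()
{
    // Placeholder credentials; a shared key is needed to sign the SAS locally.
    var credential = new StorageSharedKeyCredential("<account-name>", "<account-key>");

    var sasBuilder = new BlobSasBuilder
    {
        BlobContainerName = "docs",
        BlobName = "howto-201901.pdf",
        Resource = "b",                                   // "b" = blob-level SAS
        StartsOn = DateTimeOffset.UtcNow,
        ExpiresOn = DateTimeOffset.UtcNow.AddHours(24),
        // Ask the service to return this Content-Disposition header on download.
        ContentDisposition = "attachment; filename=howto.pdf"
    };
    sasBuilder.SetPermissions(BlobSasPermissions.Read);

    var uriBuilder = new BlobUriBuilder(
        new Uri("https://<account-name>.blob.core.windows.net/docs/howto-201901.pdf"))
    {
        Sas = sasBuilder.ToSasQueryParameters(credential)
    };
    return uriBuilder.ToUri();
}
The resulting URI can then be handed out as the download link, the same way as in the v11 code above.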

Getting Image from API, then storing it as Azure Storage Blob (image error)

So I'm trying to do this:
Get Users from AD using Graph Api in an Azure Function (C# or Node)
For each user, get their photo, using Graph Api (in the same Azure Function)
With the photo data, upload as a blob to Azure Storage
Now I have 1 and 2 working correctly, but I have no idea how to convert that image/jpeg string into a blob in Azure Storage. I've tried a lot and researched a lot, but it's been really difficult.
I've tried to use
blob.Properties.ContentType = "image/jpeg"
blob.UploadText(imgString);
But it doesn't work.
So my code looks something like this:
I get a fresh OAuth token from Azure (good for 3600 sec).
Get /Users from the AD Graph API.
For each user, I use the /user//photo/$value resource, which returns image/jpeg data.
From that data (a string?) I try blob.UploadText, but it fails.
The way I'm getting the image data from GraphApi is using RestSharp, like this:
var client = new RestClient("https://graph.microsoft.com/v1.0/users/" + email + "/photo/$value");
var request = new RestRequest(Method.GET);
request.AddHeader("cache-control", "no-cache");
request.AddHeader("authorization", "Bearer " + token);
request.AddHeader("content-type", "image/jpeg");
return client.Execute(request);
So I return an IRestResponse, which contains something like this:
response.ContentType //to get the content type
response.Content // to get the body (the image)
blob.UploadText(response.Content);
And that's what I'm trying to do, but it doesn't work: the file gets saved OK, but when you open it you can't really see the image. I think the issue might be encoding; I've tried setting different encoding types with no luck.
Take a look at the next picture. On the right, I got the image from the Graph API using PHP, set the header to image/jpeg, and echoed the image data; that works. On the left is the Azure Function in JavaScript or C#: I get the image, and when I try to do the same (show it in the browser) I get a different binary string and no picture on the page (as if it weren't image data), so it looks like the problem is encoding. I'm saving this binary data to a blob file with UploadText, but it's not working.
Any ideas?
Please use RawBytes as the blob content. It works correctly on my side.
blob.UploadFromByteArray(response.RawBytes, 0, response.RawBytes.Length);
The following is my test demo code
var connectionString = "storage connection string";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("container");
container.CreateIfNotExists();

CloudBlockBlob blob = container.GetBlockBlobReference("test.jpeg");
// Upload the raw response bytes so the JPEG is stored without any text re-encoding.
blob.UploadFromByteArray(response.RawBytes, 0, response.RawBytes.Length);
