I want to generate SAS URL dynamically via C# code for Azure Blob Container. Using this SAS URL we must be able to upload the files

I want to generate a SAS URL dynamically via C# code for an Azure Blob container. Using this SAS URL we must be able to upload files to the Azure Blob container. I have tried multiple ways to generate the SAS URL by following the Microsoft docs, but I always get an AuthorizationResourceTypeMismatch or AuthorizationPermissionMismatch error.
Error: AuthorizationPermissionMismatch This request is not authorized to perform this operation using this permission.
private static Uri GetServiceSasUriForContainer(BlobContainerClient containerClient,
    string storedPolicyName = null)
{
    // Check whether this BlobContainerClient object has been authorized with Shared Key.
    if (containerClient.CanGenerateSasUri)
    {
        // Create a SAS token that's valid for one hour.
        BlobSasBuilder sasBuilder = new BlobSasBuilder()
        {
            BlobContainerName = containerClient.Name,
            Resource = "c"
        };

        if (storedPolicyName == null)
        {
            sasBuilder.ExpiresOn = DateTimeOffset.UtcNow.AddHours(1);
            sasBuilder.SetPermissions(BlobContainerSasPermissions.Read);
        }
        else
        {
            sasBuilder.Identifier = storedPolicyName;
        }

        Uri sasUri = containerClient.GenerateSasUri(sasBuilder);
        Console.WriteLine("SAS URI for blob container is: {0}", sasUri);
        Console.WriteLine();
        return sasUri;
    }
    else
    {
        Console.WriteLine(@"BlobContainerClient must be authorized with Shared Key
                          credentials to create a service SAS.");
        return null;
    }
}
Error: AuthenticationFailed Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
I need this SAS URL because I use it in my JavaScript code to upload files into the Azure Blob container.
Can someone help me achieve this?

The reason you're getting this error is that you are creating the SAS token with only the Read permission (BlobContainerSasPermissions.Read).
In order to upload a blob to a container using a SAS URL, the SAS token needs either the Write (BlobContainerSasPermissions.Write) or the Create (BlobContainerSasPermissions.Create) permission. Create a SAS token with one of these permissions and you should no longer get this error.
To learn more about the permissions, please see this link: https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-directory-container-or-blob.
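For example, the permission setup in the snippet above could look roughly like this (a minimal sketch against the same Azure.Storage.Blobs v12 types already used in the question; adjust the expiry and the permission set to your needs):
// Grant only what the JavaScript uploader needs: Create + Write
// (add Read if you also want to read uploaded blobs back through the same SAS).
sasBuilder.ExpiresOn = DateTimeOffset.UtcNow.AddHours(1);
sasBuilder.SetPermissions(
    BlobContainerSasPermissions.Create | BlobContainerSasPermissions.Write);

Uri sasUri = containerClient.GenerateSasUri(sasBuilder);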

Related

How to programmatically find out what operations I can do in a blob storage?

I am using libraries Microsoft.Azure.Storage.Blob 11.2.3.0 and Microsoft.Azure.Storage.Common 11.2.3.0 to connect to an Azure BlobStorage from a .NET Core 3.1 application.
When I started working on this, I had been given connection strings that gave me full access to the BlobStorage (or rather, the entire cloud storage account). Based upon those, I chose to write my connection code "defensively", making use of Exists() and CreateIfNotExists() from the CloudBlobContainer class to ensure the application would not fail when a container was not yet existing.
Now, I'm connecting to a BlobStorage container using a SAS. While I can freely retrieve and upload blobs within the container this way, it unfortunately seems that I am not allowed to do anything at the container level. Not only CreateIfNotExists, but even the mere querying of existence via Exists() throws a StorageException saying
This request is not authorized to perform this operation.
The documentation does not mention the exception.
Is there any way to check preemptively whether I am allowed to check the container's existence?
I have tried looking into the container permissions retrieved from GetPermissions, but that will throw an exception, as well.
The only other alternative I can see is to check for container existence within a try-catch block and assume existence if an exception is thrown ...
There's no definitive way to identify whether an operation can be performed using a SAS token other than performing that operation and catching any exception it may throw. The response of interest here is 403 (this request is not authorized).
However, you can try to predict whether an operation can be performed by looking at the SAS token. If it is a Service SAS token and not an Account SAS token, account-level operations are not allowed. The way to distinguish between the two is that an Account SAS token contains attributes like SignedServices (ss) and SignedResourceTypes (srt).
The next thing you would want to do is look for the SignedPermissions (sp) attribute in your SAS token. This attribute tells you which operations are possible with the SAS token. For example, if your SAS token is a Service SAS token and it includes the Delete (d) permission, you can use it to delete a blob.
Please see these tables for the permissions/allowed operations combinations:
Service SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-directory-container-or-blob
Account SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas
Please note that the operation might still fail for any number of reasons like SAS token has expired, account key has changed since the generation of SAS token, IP restrictions etc.
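As a rough sketch of the kind of check described above (my own illustration, not part of the original answer; it simply splits the SAS token query string and inspects the ss/srt and sp attributes):
using System;
using System.Linq;

static class SasInspector
{
    // Returns whether the token looks like an Account SAS (has ss/srt)
    // and the raw permission letters from the sp attribute.
    public static (bool isAccountSas, string permissions) Inspect(string sasToken)
    {
        var parts = sasToken.TrimStart('?')
            .Split(new[] { '&' }, StringSplitOptions.RemoveEmptyEntries)
            .Select(p => p.Split(new[] { '=' }, 2))
            .ToDictionary(kv => kv[0], kv => kv.Length > 1 ? kv[1] : "");

        bool isAccountSas = parts.ContainsKey("ss") && parts.ContainsKey("srt");
        string permissions = parts.TryGetValue("sp", out var sp) ? sp : "";
        return (isAccountSas, permissions);
    }
}

// Usage: a Service SAS with "w" or "c" in sp should allow blob uploads, but
// container-level calls like Exists() still need an Account SAS (or the account key).
// var (isAccountSas, sp) = SasInspector.Inspect("?sv=2018-03-28&sp=rwl&sig=...");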
I tried this in my system: I checked whether the container exists, created it when it did not, and was then able to upload a file.
You need to grant the proper permissions to your SAS token.
const string sasToken = "<SAS token>";
const string accountName = "teststorage65";
const string blobContainerName = "example";
const string blobName = "test.txt";
const string myFileLocation = @"<local path>";

// Build the storage account from the SAS credentials.
var storageCredentials = new StorageCredentials(sasToken);
var storageAccount = new CloudStorageAccount(storageCredentials, accountName, null, true);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference(blobContainerName);

// Check whether the container exists; create it if it does not.
var result = blobContainer.Exists();
if (result)
{
    Console.WriteLine("Container exists");
}
else
{
    Console.WriteLine("Container does not exist");
    Console.WriteLine("Creating container " + blobContainerName);
    blobContainer.CreateIfNotExists();
}

// Upload the local file to the container.
CloudBlockBlob cloudBlob = blobContainer.GetBlockBlobReference(blobName);
cloudBlob.UploadFromFile(myFileLocation);
OUTPUT

Local files in Azure Function

I am trying to access Google Drive from a (time-triggered) Azure Function. It creates a token file at runtime when permissions are granted to access the drive, stores that file locally, and the Azure Function works fine locally.
But when the function is deployed, I get an error that contains my local system path. Why is it still using my local system path when the function has been deployed?
It should access the path where the Azure Function is stored.
Code
public DriveService GetService()
{
    // Get credentials from the client_secret.json file.
    UserCredential credential;
    string clientSecretString = config[Constant.ClientSecret];
    log.LogInformation("String value is " + clientSecretString);
    byte[] clientSecret = Encoding.UTF8.GetBytes(clientSecretString);
    using (var stream = new MemoryStream(clientSecret)) // <-- the error below is thrown here
    {
        log.LogInformation("Current path is " + Environment.CurrentDirectory);
        credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
            GoogleClientSecrets.Load(stream).Secrets,
            Scopes,
            "user",
            CancellationToken.None,
            new FileDataStore(Environment.CurrentDirectory, false)).Result;
    }
    log.LogInformation("Completed ");

    // Create the Drive API service.
    DriveService service = new DriveService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = credential,
        ApplicationName = Constant.ApplicationName,
    });
    return service;
}
Error Message:
[Error] at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
   at System.Threading.Tasks.Task`1.GetResultCore(Boolean waitCompletionNotification)
   at System.Threading.Tasks.Task`1.get_Result()
   at ExportSheetsToExcelPowerBi.GoogleDriveService.GetService() in C:\Users\username\Documents\Project\GoogleDriveService.cs:line
For Azure Function apps there is no need to read secrets from the token file generated by Google Auth, as answered on one of your previous posts. You can configure your function app to use Google login for authentication when running on Azure. To achieve this, generate a client ID and client secret using Google sign-in for server-side apps; with this connection, the tokens obtained can be stored in the token store. Please refer to this document to configure your function app to use Google login, and to this document regarding the token store and how to retrieve and refresh the token obtained.
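As a rough illustration only (not from the linked documents): once App Service Authentication (Easy Auth) with the Google provider and the token store are enabled, an HTTP-triggered function can read the Google access token that the platform injects into the request headers; a timer-triggered function would instead have to fetch and refresh the token itself.
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class DriveTokenExample
{
    [FunctionName("DriveTokenExample")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
        ILogger log)
    {
        // With App Service Authentication (Easy Auth) + Google configured,
        // the platform adds this header for authenticated requests.
        string googleAccessToken = req.Headers["X-MS-TOKEN-GOOGLE-ACCESS-TOKEN"];

        if (string.IsNullOrEmpty(googleAccessToken))
        {
            return new UnauthorizedResult();
        }

        // The token could then be used with the Google client libraries
        // instead of the local token file, e.g.:
        // var credential = GoogleCredential.FromAccessToken(googleAccessToken);
        log.LogInformation("Got a Google access token from the token store.");
        return new OkResult();
    }
}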

Python: Upload a package to Azure using SAS URI

I am using the Microsoft's Hardware dashboard API to automate the submission of my (.CAB) package for signing. I have followed the steps in this documentation: https://learn.microsoft.com/en-us/windows-hardware/drivers/dashboard/create-a-new-submission-for-a-product
The response of the new submission contains the SAS (Shared Access Signature) URI,
like this (sig and accnt_name changed for security):
'''https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab'''
I need to use this SAS URI to upload my package to Azure blob storage.
The examples in the documentation show C# or .NET, as follows:
string sasUrl = "https://productingestionbin1.blob.core.windows.net/ingestion/26920f66-b592-4439-9a9d-fb0f014902ec?sv=2014-02-14&sr=b&sig=usAN0kNFNnYE2tGQBI%2BARQWejX1Guiz7hdFtRhyK%2Bog%3D&se=2016-06-17T20:45:51Z&sp=rwl";
Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob blockBob =
    new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob(new System.Uri(sasUrl));
await blockBob.UploadFromStreamAsync(stream);
I want to use the SAS URI obtained from submission resource JSON Response to upload the package.
This link, Download file from AZURE BLOB CONTAINER using SAS URI in PYTHON, suggests that there is no equivalent method in Python and that BlockBlobService can be used.
from azure.storage.blob import BlockBlobService

blobservice = BlockBlobService(
    "storage_account",
    sas_token="?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-24T10:01:58Z&st=2019-04-23T02:01:58Z&spr=https&sig=xxxxxxxxx")
blobservice.create_blob_from_path(container_name, local_file_name, full_path_to_file)
However, I am not sure what the storage account name and container name are in the SAS URI obtained from the submission resource.
I have also created a separate Azure storage account and added a new container and blob in it. I have tried passing the new container and storage account name with the SAS access token from the SAS URI (obtained from the Microsoft hardware API submission JSON response), but I always get the error below:
'''
AzureHttpError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. ErrorCode: AuthenticationFailed
AuthenticationFailed: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:5463b7d2-901e-0068-6994-36782e000000
Time:2019-07-09T20:23:04.5760736Z
Signature did not match. String to sign used was rwl
2019-07-10T18:15:58Z
/blob/evcertautomation/ev2/initial_1152921504628106590.cab
2017-04-17
attachment; filename=initial_1152921504628106563.cab
'''
Thanks in advance
If you have a blob SAS URI as you posted below, you can easily upload a file to the blob in Python with requests.
https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab
First, you have to inspect the values of the parameters se and sp. The se parameter is the expiry time of the blob SAS URI, and the sp parameter lists the permitted operations of the blob SAS URL, such as w for blob write permission.
So for your blob SAS URL above, you have blob write permission and can upload a file to this blob before 2019-07-10T18:15:58Z.
Here is my sample code for uploading via a blob sas uri.
import requests

blob_sas_uri = '<your blob sas uri which must include `sp=w`; do the write operation before `se`>'
local_file_name = '<your local file name>'

headers = {
    'x-ms-blob-type': 'BlockBlob'
}
# Read the file as bytes and PUT it directly against the blob SAS URI.
data = open(local_file_name, 'rb').read()
r = requests.put(blob_sas_uri, headers=headers, data=data)
print(r.status_code)
If the result is 201, the upload succeeded.
For reference, there is a similar official sample, Example: Upload a Blob using a Container's Shared Access Signature, which uses a broader container-level permission.
As per the SAS URI you provided: '''https://accnt_name.blob.core.windows.net/scsjc/cexxxxxxxxxx?sv=2017-04-17&sr=b&sig=xxxxxxxxxxxxxx&se=2019-07-10T18:15:58Z&sp=rwl&rscd=attachment%3B filename%3Dinitial_xxxxxxxx.cab'''
The account name should be accnt_name, the container should be scsjc.
So your code should look like below:
from azure.storage.blob import BlockBlobService

storage_account = "accnt_name"
token = "?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-24T10:01:58Z&st=2019-04-23T02:01:58Z&spr=https&sig=xxxxxxxxx"
container = "scsjc"

blobservice = BlockBlobService(storage_account, sas_token=token)
blobservice.create_blob_from_path(container, local_file_name, full_path_to_file)

How to add Authorization header to SAS URI?

I am working on a POC where I have to create a simulated device and connect it to IoT Hub. This part is done; after this, an external application sends a message to IoT Hub for that device.
The message contains the blob storage SAS URI, and I need to download that file to the device.
The simulated device is able to get the SAS URI, but when I start downloading the file I get the error below.
Exception in thread "main" com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Please correct me if my approach is wrong, and suggest an appropriate approach for this use case.
private static void download(String message) throws StorageException, IOException, JSONException, URISyntaxException {
    // Need to download the file to a folder on the simulator.
    try {
        JSONObject jsonObject = new JSONObject(message);
        String sasUri = (String) jsonObject.get("fileUrl");
        System.out.println("SAS URI from hub ->" + sasUri + " ");
        URI url = new URI(sasUri);
        //downloadFile(sasUri);
        System.out.println("end of file download function");
        CloudBlob blob = new CloudBlockBlob(url);
        blob.downloadToFile("/path/to/download/file");
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Below is the SAS URI:
https://*******.blob.core.windows.net/test/testfile.zip?sv=2017-07-29&ss=b&srt=sco&sp=rwdlac&se=2018-04-16T13:33:22Z&st=2018-04-16T05:00:22Z&spr=https&sig=***********
I am getting the SAS URI from the Azure portal directly, not generating it at runtime.
Thanks in advance!
To narrow down the issue, you can try the following method to see if it helps.
Get the URI of the blob in the Azure portal by clicking "Download", like this:
After that, the file will be downloaded. You can find the URI in your browser's download history. The URI format will look like this:
https://[storage-account].blob.core.windows.net/testdownload/20180417_4.zip?sv=2017-07-29&ss=bqtf&srt=sco&sp=rwdlacup&se=2018-04-19T15:52:15Z&sig=[signature]
Directly use this URI in the following code piece and it will work.
CloudBlob blob = new CloudBlockBlob(url);
await blob.DownloadToFileAsync(imgPath, System.IO.FileMode.CreateNew);
Update: another way to get the absolute URI to the blob from the Azure portal looks like this:
First, get the SAS token. Note the start and expiry date/time; the token is valid only within this time period.
Second, get the blob URL.
Finally, the complete absolute URI to the blob is the blob URL plus the SAS token. It will look like this:
https://ritastorageaccount.blob.core.windows.net/?sv=2017-07-29&ss=b&srt=sco&sp=rwdlac&se=2018-04-26T10:01:47Z&st=2018-04-26T02:01:47Z&spr=https&sig=[SIG]
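A small sketch of what "blob URL plus SAS token" means in code, using the same CloudBlockBlob type as the snippet above (the container name, blob name, and local path are placeholders of my own):
// Blob URL (from the blob's properties) and SAS token (from "Shared access signature").
string blobUrl = "https://ritastorageaccount.blob.core.windows.net/mycontainer/myblob.zip";
string sasToken = "?sv=2017-07-29&ss=b&srt=sco&sp=rwdlac&se=2018-04-26T10:01:47Z&st=2018-04-26T02:01:47Z&spr=https&sig=[SIG]";

// Concatenating the two gives the absolute URI the CloudBlockBlob constructor expects.
var blob = new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob(new Uri(blobUrl + sasToken));
await blob.DownloadToFileAsync(@"C:\temp\myblob.zip", System.IO.FileMode.CreateNew);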

How to use SharedAccessSignature to access blobs

I am trying to access a blob stored in a private container in Windows Azure. The container has a Shared Access Signature, but when I try to access the blob I get a StorageClientException: "Server failed to authenticate the request. Make sure the Authorization header is formed correctly including the signature".
The code that created the container and uploaded the blob looks like this:
// Create the container, set a Shared Access Signature, and share it.
// The first thing to do is to create the connection to the storage account.
// This should be in app.config, but as this is a test it will just be implemented here:
// add a reference to Microsoft.WindowsAzure.StorageClient
// and Microsoft.WindowsAzure.StorageClient, then set up the objects.
//storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["ConnectionString"]);
blobClient = storageAccount.CreateCloudBlobClient();

// Get a reference to the container for the shared access signature.
container = blobClient.GetContainerReference("blobcontainer");
container.CreateIfNotExist();

// Now create the permissions policy to use and a public access setting.
var permissions = container.GetPermissions();
permissions.SharedAccessPolicies.Remove("accesspolicy");
permissions.SharedAccessPolicies.Add("accesspolicy", new SharedAccessPolicy
{
    // This policy is live immediately.
    // If the policy should be delayed then use:
    //SharedAccessStartTime = DateTime.Now.Add(T); where T is some timespan
    SharedAccessExpiryTime = DateTime.UtcNow.AddYears(2),
    Permissions = SharedAccessPermissions.Read | SharedAccessPermissions.Write
});

// Turn off public access.
permissions.PublicAccess = BlobContainerPublicAccessType.Off;

// Set the permissions on the container.
container.SetPermissions(permissions);

var sas = container.GetSharedAccessSignature(new SharedAccessPolicy(), "accesspolicy");
StorageCredentialsSharedAccessSignature credentials = new StorageCredentialsSharedAccessSignature(sas);

CloudBlobClient client = new CloudBlobClient(storageAccount.BlobEndpoint,
    new StorageCredentialsSharedAccessSignature(sas));
CloudBlob sasblob = client.GetBlobReference("blobcontainer/someblob.txt");
sasblob.UploadText("I want to read this text via a rest call");

// Write the SAS to a file so I can use it later in other apps.
using (var writer = new StreamWriter(@"C:\policy.txt"))
{
    writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
}
The code I have been trying to use to read the blob looks like this:
// The storage credentials shared access signature is copied directly from the text file "c:\policy.txt".
CloudBlobClient client = new CloudBlobClient("https://my.azurestorage.windows.net/",
    new StorageCredentialsSharedAccessSignature("?sr=c&si=accesspolicy&sig=0PMoXpht2TF1Jr0uYPfUQnLaPMiXrqegmjYzeg69%2FCI%3D"));
CloudBlob blob = client.GetBlobReference("blobcontainer/someblob.txt");
Console.WriteLine(blob.DownloadText());
Console.ReadLine();
I can make the above work by adding the account credentials, but that is exactly what I'm trying to avoid. I do not want something as sensitive as my account credentials just sitting out there, and I have no idea how to get the signature into the client app without having the account credentials.
Any help is greatly appreciated.
Why this?
writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
and not writing the sas string you already created?
It's late and I could easily be missing something but it seems that you might not be saving the same access signature that you're using to write the file in the first place.
Also perhaps not relevant here but I believe there is a limit on the number of container-wide policies you can have. Are you uploading multiple files to the same container with this code and creating a new container sas each time?
In general I think it would be better to request a sas for an individual blob at the time you need it with a short expiry time.
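With the old StorageClient library used in the question, that could look roughly like this (a sketch only, not tested against that exact version; the blob name is a placeholder):
// Ad-hoc SAS for a single blob, valid for 15 minutes, read-only.
CloudBlob blobToShare = container.GetBlobReference("someblob.txt");
string blobSas = blobToShare.GetSharedAccessSignature(new SharedAccessPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
    Permissions = SharedAccessPermissions.Read
});

// The full URL to hand out is just the blob URI plus the SAS query string.
string shareableUrl = blobToShare.Uri.AbsoluteUri + blobSas;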
Is "my.azurestorage.windows.net" just a typo? I would expect something there like "https://account.blob.core.windows.net".
Otherwise the code looks pretty similar to the code in http://blog.smarx.com/posts/shared-access-signatures-are-easy-these-days, which works.
