Azure Blob Container SAS Works in Code and Not AzCopy

I'm using the following code to generate a SAS for a blob container using the Storage Client SDK version 8.0.0 to match what AzCopy is using.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("myconnectionstring");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessStartTime = DateTime.Now.AddHours(-24);
sasConstraints.SharedAccessExpiryTime = DateTime.Now.AddHours(24);
sasConstraints.Permissions = SharedAccessBlobPermissions.List | SharedAccessBlobPermissions.Read;
string sas = container.GetSharedAccessSignature(sasConstraints, null);
Console.WriteLine(sas);
Console.WriteLine(container.Uri);
StorageCredentials creds = new StorageCredentials(sas);
var sasContainer = new CloudBlobContainer(new Uri(container.Uri + sas));
var items = sasContainer.GetBlobReference("items.txt");
This all works fine in code, but when I copy the SAS token into AzCopy to download a blob as the "SourceSAS" parameter, I get a 403 saying the signature does not match (via Fiddler). The parameters used to create the signature look ok to me. I'm confused as to why this works in code but not via AzCopy.
If I create a SAS token using the Azure Storage Explorer tool, it works fine. This SAS token targets a different version of the service, but I can't force the SDK to change that parameter. Would anyone know why this is happening? I will post the Fiddler content in an update.
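The AzCopy invocation looks roughly like this (a sketch of the AzCopy 5.x syntax; the account, container, local path, and token values are placeholders):
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer /Dest:C:\downloads /SourceSAS:"?sv=...&sr=c&sig=..." /Pattern:"items.txt"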

Following your steps, I ran a test that generates a SAS token associated with an access policy in an application using WindowsAzure.Storage v8.0.0, and I could download a blob from my container via the SAS token generated in my app. Both AzCopy v5.0.0 and the latest version (5.2.0) recognized this valid SAS token.
Please check whether you can access https://{storageaccount}.blob.core.windows.net/{mycontainer}/items.txt?{sas token} in your browser after you generate a SAS token.
Also, please try upgrading the SDK and increasing the expiry time, or create a new container and run the same test to check whether the same issue appears.
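For example, you can print the full test URL from the variables in your snippet (a sketch; the blob name and the leading '?' on the SAS follow your code):
// Sketch: build the browser-test URL from the question's own variables.
string testUrl = container.Uri + "/items.txt" + sas;
Console.WriteLine(testUrl);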

How to programmatically find out what operations I can do in a blob storage?

I am using libraries Microsoft.Azure.Storage.Blob 11.2.3.0 and Microsoft.Azure.Storage.Common 11.2.3.0 to connect to an Azure BlobStorage from a .NET Core 3.1 application.
When I started working on this, I had been given connection strings that gave me full access to the BlobStorage (or rather, the entire cloud storage account). Based upon those, I chose to write my connection code "defensively", making use of Exists() and CreateIfNotExists() from the CloudBlobContainer class to ensure the application would not fail when a container was not yet existing.
Now, I'm connecting to a BlobStorage container using a SAS. While I can freely retrieve and upload blobs within the container like this, it unfortunately seems that I am not allowed to do anything at the container level. Not only CreateIfNotExists, but even the mere querying of existence via Exists() throws a StorageException saying
This request is not authorized to perform this operation.
The documentation does not mention the exception.
Is there any way to check preemptively whether I am allowed to check the container's existence?
I have tried looking into the container permissions retrieved from GetPermissions, but that will throw an exception, as well.
The only other alternative I can see is to check for container existence within a try-catch block and assume existence if an exception is thrown ...
There's no definitive way to identify whether an operation can be performed using a SAS token other than performing that operation and catching any exception it may throw. The exception of interest here is Unauthorized (403).
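A minimal sketch of that probe (assuming container is the CloudBlobContainer from your code; StorageException comes from Microsoft.Azure.Storage):
// Sketch: attempt the container-level call and catch the 403 the question describes.
bool? containerExists = null;
try
{
    containerExists = container.Exists();
}
catch (StorageException ex) when (ex.RequestInformation?.HttpStatusCode == 403)
{
    // The SAS does not permit container-level operations; existence is unknown.
}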
However, you can try to predict whether an operation can be performed by looking at the SAS token. If it is a Service SAS token and not an Account SAS token, then all account-level operations are not allowed. The way to distinguish between the two is that an Account SAS token contains attributes like SignedServices (ss) and SignedResourceTypes (srt), while a Service SAS token does not.
Next, look for the SignedPermissions (sp) attribute in your SAS token. This attribute tells you which operations are possible with the token. For example, if your SAS token is a Service SAS token and includes the Delete (d) permission, you can use it to delete a blob.
Please see these tables for the permissions/allowed operations combinations:
Service SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-directory-container-or-blob
Account SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas
Please note that the operation might still fail for any number of reasons like SAS token has expired, account key has changed since the generation of SAS token, IP restrictions etc.
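For illustration, a minimal sketch of such an inspection (sasToken is a hypothetical SAS query string; HttpUtility comes from System.Web):
using System;
using System.Web;

string sasToken = "<your SAS token>";
var query = HttpUtility.ParseQueryString(sasToken);
// An Account SAS carries SignedServices (ss) and SignedResourceTypes (srt);
// a Service SAS has neither.
bool isAccountSas = query["ss"] != null && query["srt"] != null;
// SignedPermissions (sp) lists the granted operations, e.g. "rwd".
string sp = query["sp"] ?? "";
bool canDelete = sp.Contains("d");
Console.WriteLine($"Account SAS: {isAccountSas}, delete allowed: {canDelete}");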
I tried this in my environment: I was able to check whether the container exists, create it when it did not, and then upload a file.
You need to grant the proper permissions to your SAS token.
const string sasToken = "SAS Token";
const string accountName = "teststorage65";
const string blobContainerName = "example";
const string blobName = "test.txt";
const string myFileLocation = @"Local Path";
// build credentials from the SAS token and connect to the account
var storageCredentials = new StorageCredentials(sasToken);
var storageAccount = new CloudStorageAccount(storageCredentials, accountName, null, true);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference(blobContainerName);
// checking existence requires container-level permissions on the SAS
var result = blobContainer.Exists();
if (result)
{
    Console.WriteLine("Container exists");
}
else
{
    Console.WriteLine("Container does not exist");
    Console.WriteLine("Creating container " + blobContainerName);
    blobContainer.CreateIfNotExists();
}
// upload the local file to the container
CloudBlockBlob cloudBlob = blobContainer.GetBlockBlobReference(blobName);
cloudBlob.UploadFromFile(myFileLocation);

Copy Blob in Azure Blob Storage using Java v12 SDK

My Application is in a Kubernetes cluster and I'm using Java v12 SDK to interact with the Blob Storage. To authorize against Blob Storage I'm using Managed Identities.
My application needs to copy blobs within one container. I haven't found any particular recommendations or examples of how SDK should be used to do the copy.
I figured that the following approach works when I'm working with the emulator
copyBlobClient.copyFromUrl(sourceBlobClient.getBlobUrl());
However, when this gets executed in the cluster I get the following error
<Error>
<Code>CannotVerifyCopySource</Code>
<Message>The specified resource does not exist. RequestId: __ Time: __ </Message>
</Error>
Message says "resource does not exist" but the blob is clearly there. My container has private access, though.
Now when I change the public access level to "Blob (anonymous read access for blobs only)", everything works as expected. However, public access is not acceptable to me.
The main question - what is the right way to implement blob copy using the Java v12 SDK?
What could I have missed or misconfigured in my situation?
And the last point is the error message itself. The "CannotVerifyCopySource" code helps you understand that there is an access problem, but the message part is clearly misleading. Shouldn't it be more explicit about the error?
If you want to use the Azure Java SDK to copy blobs with an Azure managed identity (MSI), please refer to the following details.
Copy blobs between storage accounts
If you copy blobs between storage accounts with Azure MSI, we should do the following:
Assign the Storage Blob Data Reader role to the MSI on the source container.
Assign the Storage Blob Data Contributor role to the MSI on the destination container; the copy needs write permission to write content to the destination blob.
Generate a SAS token for the source blob. If the source blob is public, we can use the source blob URL directly, without a SAS token.
For example
try {
    // source account client authenticated with the managed identity
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<>.blob.core.windows.net/")
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();
    // get a user delegation key
    OffsetDateTime delegationKeyStartTime = OffsetDateTime.now();
    OffsetDateTime delegationKeyExpiryTime = OffsetDateTime.now().plusDays(7);
    UserDelegationKey key = blobServiceClient.getUserDelegationKey(delegationKeyStartTime, delegationKeyExpiryTime);
    BlobContainerClient sourceContainerClient = blobServiceClient.getBlobContainerClient("test");
    BlobClient sourceBlob = sourceContainerClient.getBlobClient("test.mp3");
    // generate a user delegation SAS for the source blob
    OffsetDateTime expiryTime = OffsetDateTime.now().plusDays(1);
    BlobSasPermission permission = new BlobSasPermission().setReadPermission(true);
    BlobServiceSasSignatureValues myValues = new BlobServiceSasSignatureValues(expiryTime, permission)
            .setStartTime(OffsetDateTime.now());
    String sas = sourceBlob.generateUserDelegationSas(myValues, key);
    // destination account client, then copy from the source URL + SAS
    BlobServiceClient desServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<>.blob.core.windows.net/")
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();
    BlobContainerClient desContainerClient = desServiceClient.getBlobContainerClient("test");
    String res = desContainerClient.getBlobClient("test.mp3")
            .copyFromUrl(sourceBlob.getBlobUrl() + "?" + sas);
    System.out.println(res);
} catch (Exception e) {
    e.printStackTrace();
}
Copy in the same account
If you copy blobs within the same storage account with Azure MSI, I suggest you assign the Storage Blob Data Contributor role to the MSI on the storage account. Then we can do the copy with the method copyFromUrl.
For example
a. Assign Storage Blob Data Contributor to the MSI at the account level
b. code
try {
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<>.blob.core.windows.net/")
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();
    BlobContainerClient sourceContainerClient = blobServiceClient.getBlobContainerClient("test");
    BlobClient sourceBlob = sourceContainerClient.getBlobClient("test.mp3");
    BlobContainerClient desContainerClient = blobServiceClient.getBlobContainerClient("output");
    String res = desContainerClient.getBlobClient("test.mp3")
            .copyFromUrl(sourceBlob.getBlobUrl());
    System.out.println(res);
} catch (Exception e) {
    e.printStackTrace();
}
For more details, please refer to here and here
I had the same issue using the Java SDK for Azure. I solved it by copying the blob using the URL plus a SAS token. The resource you're addressing through the URL won't appear as available if you don't have the right access to it. Here is the code I used to solve the problem:
BlobClient sourceBlobClient = blobServiceClient
        .getBlobContainerClient(currentBucketName)
        .getBlobClient(sourceKey);
// initializing the copy blob client
BlobClient copyBlobClient = blobServiceClient
        .getBlobContainerClient(newBucketName)
        .getBlobClient(newKey);
// creating the SAS token that grants read permission on the source blob
OffsetDateTime expiryTime = OffsetDateTime.now().plusDays(1);
BlobSasPermission permission = new BlobSasPermission().setReadPermission(true);
BlobServiceSasSignatureValues values = new BlobServiceSasSignatureValues(expiryTime, permission)
        .setStartTime(OffsetDateTime.now());
String sasToken = sourceBlobClient.generateSas(values);
// making the copy using the source blob URL + the SAS token
var res = copyBlobClient.copyFromUrl(sourceBlobClient.getBlobUrl() + "?" + sasToken);
Perhaps another way is to use the streaming API to download and upload the data. In our company, we are not allowed to generate SAS tokens on our storage account for security reasons, so we use the following to copy from an append blob to a block blob (overwriting):
BlobAsyncClient src;
BlobAsyncClient dest;
//...
AppendBlobAsyncClient srcAppend = src.getAppendBlobAsyncClient();
Flux<ByteBuffer> streamData = srcAppend.downloadStream();
Mono<BlockBlobItem> uploaded = dest.upload(streamData, new ParallelTransferOptions(), true);
This returns a Mono<BlockBlobItem>, and you need to subscribe to it to start the process. When used in a non-reactive context, the easiest way is probably to call block().
Note that this only copies the data; additional work is needed if you also want to copy the metadata and tags. For tags there is BlobAsyncClientBase.getTags(), and for metadata BlobAsyncClientBase.getProperties(). You can read the tags and metadata from the source and apply them to dest.

Azure blob returns 403 forbidden from Azure function running on portal

I've read several posts regarding similar queries, like this one, but I keep getting 403.
Initially I wrote the code in Visual Studio - an Azure function accessing a storage blob - and everything runs fine. But when I deploy the very same function, it throws a 403! I tried the suggested fixes, moving to x64 etc. and removing additional files, but nothing works.
Please note - I have verified several times - the access key is correct and valid.
So, I did all the following
(1) - I wrote a simple Azure function on Portal itself (to rule out the deployment quirks), and voila, same 403!
var storageConnection = "DefaultEndpointsProtocol=https;AccountName=[name];AccountKey=[key1];EndpointSuffix=core.windows.net";
var cloudStorageAccount = CloudStorageAccount.Parse(storageConnection);
var blobClient = cloudStorageAccount.CreateCloudBlobClient();
var sourceContainer = blobClient.GetContainerReference("landing");
CloudBlockBlob blob = sourceContainer.GetBlockBlobReference("a.xlsx");
using (var inputStream = new MemoryStream())
{
log.Info($"Current DateTime: {DateTime.Now}");
log.Info("Starting download of blob...");
blob.DownloadToStream(inputStream); // <--- 403 thrown here!!
log.Info("Download Complete!");
}
(2) - I verified the date and time by logging it, and it's UTC on the function server
(3) - I used an Account SAS key generated on the portal, but it still gives 403. I had waited for over 30 seconds after the SAS key generation to ensure that the key propagates.
var sasUri = "https://[storageAccount].blob.core.windows.net/?sv=2017-11-09&ss=b&srt=sco&sp=rwdlac&se=2019-07-31T13:08:46Z&st=2018-09-01T03:08:46Z&spr=https&sig=Hm6pA7bNEe8zjqVelis2y842rY%2BGZg5CV4KLn288rCg%3D";
StorageCredentials accountSAS = new StorageCredentials(sasUri);
var cloudStorageAccount = new CloudStorageAccount(accountSAS, "[storageAccount]", endpointSuffix: null, useHttps: true);
// rest of the code same as (1)
(4) - I generated the SAS key on the fly in code, but again 403.
static string GetContainerSasUri(CloudBlobContainer container)
{
//Set the expiry time and permissions for the container.
//In this case no start time is specified, so the shared access signature becomes valid immediately.
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5);
sasConstraints.SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(25);
sasConstraints.Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Add | SharedAccessBlobPermissions.Create;
//Generate the shared access signature on the container, setting the constraints directly on the signature.
string sasContainerToken = container.GetSharedAccessSignature(sasConstraints);
//Return the URI string for the container, including the SAS token.
return container.Uri + sasContainerToken + "&comp=list&restype=container";
}
and used the above as
var sourceContainer = blobClient.GetContainerReference("landing");
var sasKey = GetContainerSasUri(sourceContainer);
var container = new CloudBlobContainer(new Uri(sasKey));
CloudBlockBlob blob = container.GetBlockBlobReference("a.xlsx");
I completely fail to understand why the code works flawlessly when run from Visual Studio, accessing the storage (not the emulator) in the cloud, but when the same function is either deployed or run explicitly on the portal, it fails.
What am I missing here?
Since you have excluded many possible causes, the only way I can reproduce your problem is to configure a firewall on the Storage Account.
Locally the code works because you have probably added your local IP to the whitelist, while this step was omitted for the Function. On the portal, go to Resource Explorer under Platform features, search for outboundIpAddresses, and add those (usually four) IPs to the Storage Account whitelist.
If you have added the Function IPs but still get the 403 error, check the locations of the Storage account and the Function app. If they live in the same region (like both in Central US), the two communicate internally without going through the outbound IP addresses. The workaround I can offer is to create the Storage account in a different region if the firewall is necessary in your plan. Otherwise, just allow all networks to the Storage account.

How can I get an Azure CloudBlockBlob from a storage URL with a SAS?

I am trying to refactor our MVC code, which has a lot of pages that make use of download URLs pointing at blobs with a SAS. It would be great to be able to pass the URL to the controller and use it to locate the associated blob, e.g. have an action that takes the download URL as its only input parameter. I could also create a link helper that only shows the delete link if the SAS exposes delete, etc.
It would be a great help if I could pass the URL to Azure and get a CloudBlockBlob in return, so I could delete it, update it, get metadata, etc.
The only way I can do it presently is resorting to using techniques like
var deleteBlobRequest = BlobRequest.Delete(new Uri(fileUrl), 30, null, DeleteSnapshotsOption.IncludeSnapshots, "");
deleteBlobRequest.GetResponse().Close();
This works but it seems very odd.
I can't figure out the code to get a CloudBlockBlob from the Uri.
Any ideas? I am presently using Azure Storage 1.7
You don't have to do anything special. If you construct a blob with a SAS URI, the storage client library takes care of this for you. For example, take this code:
CloudBlockBlob cloudBlockBlob = new CloudBlockBlob(new Uri("http://127.0.0.1:10000/devstoreaccount1/temp/sastest.txt?sr=b&st=2013-01-25T04%3A28%3A09Z&se=2013-01-25T05%3A28%3A09Z&sp=rwd&sig=jIWWFwZ6MXaL6FD%2F2%2FpqPl1g4f0ElFrr1fKNg5U%2FAkg%3D"));
cloudBlockBlob.Delete();
This would work just fine.
Here is the code to get the permissions of a SAS token (assuming blobUrl is a URL that includes the SAS token):
// Get permssions for current SAS key.
var queryString = HttpUtility.ParseQueryString(blobUrl);
var permissionsText = queryString["sp"];
var permissions = SharedAccessBlobPermissions.None;
if (permissionsText.Contains("w"))
permissions = permissions | SharedAccessBlobPermissions.Write;
if (permissionsText.Contains("r"))
permissions = permissions | SharedAccessBlobPermissions.Read;
if (permissionsText.Contains("d"))
permissions = permissions | SharedAccessBlobPermissions.Delete;
if (permissionsText.Contains("l"))
permissions = permissions | SharedAccessBlobPermissions.List;
And this will get an ICloudBlob based on a URL with a SAS token (again assuming blobUrl is a URL that includes the SAS token):
// Get the blob reference.
var blobUri = new Uri(blobUrl);
var path = String.Format("{0}{1}{2}{3}", blobUri.Scheme, Uri.SchemeDelimiter, blobUri.Authority, blobUri.AbsolutePath);
var blobClient = new CloudBlobClient(new Uri(path), new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(blobUri.Query));
ICloudBlob blobReference = blobClient.GetBlobReferenceFromServer(new Uri(path));

How to use SharedAccessSignature to access blobs

I am trying to access a blob stored in a private container in Windows Azure. The container has a Shared Access Signature, but when I try to access the blob I get a StorageClientException: "Server failed to authenticate the request. Make sure the Authorization header is formed correctly including the signature".
The code that created the container and uploaded the blob looks like this:
// create the container, set a Shared Access Signature, and share it
// first thing to do is to create the connection to the storage account
// this should be in app.config but as this is a test it will just be implemented
// here:
// add a reference to Microsoft.WindowsAzure and Microsoft.WindowsAzure.StorageClient
// and set up the objects
//storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["ConnectionString"]);
blobClient = storageAccount.CreateCloudBlobClient();
// get a reference to the container for the shared access signature
container = blobClient.GetContainerReference("blobcontainer");
container.CreateIfNotExist();
// now create the permissions policy to use and a public access setting
var permissions = container.GetPermissions();
permissions.SharedAccessPolicies.Remove("accesspolicy");
permissions.SharedAccessPolicies.Add("accesspolicy", new SharedAccessPolicy
{
// this policy is live immediately
// if the policy should be delayed then use:
//SharedAccessStartTime = DateTime.Now.Add(T); where T is some timespan
SharedAccessExpiryTime =
DateTime.UtcNow.AddYears(2),
Permissions =
SharedAccessPermissions.Read | SharedAccessPermissions.Write
});
// turn off public access
permissions.PublicAccess = BlobContainerPublicAccessType.Off;
// set the permissions on the container
container.SetPermissions(permissions);
var sas = container.GetSharedAccessSignature(new SharedAccessPolicy(), "accesspolicy");
StorageCredentialsSharedAccessSignature credentials = new StorageCredentialsSharedAccessSignature(sas);
CloudBlobClient client = new CloudBlobClient(storageAccount.BlobEndpoint,
new StorageCredentialsSharedAccessSignature(sas));
CloudBlob sasblob = client.GetBlobReference("blobcontainer/someblob.txt");
sasblob.UploadText("I want to read this text via a rest call");
// write the SAS to file so I can use it later in other apps
using (var writer = new StreamWriter(@"C:\policy.txt"))
{
writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
}
The code I have been trying to use to read the blob looks like this:
// the storage credentials shared access signature is copied directly from the text file "c:\policy.txt"
CloudBlobClient client = new CloudBlobClient("https://my.azurestorage.windows.net/", new StorageCredentialsSharedAccessSignature("?sr=c&si=accesspolicy&sig=0PMoXpht2TF1Jr0uYPfUQnLaPMiXrqegmjYzeg69%2FCI%3D"));
CloudBlob blob = client.GetBlobReference("blobcontainer/someblob.txt");
Console.WriteLine(blob.DownloadText());
Console.ReadLine();
I can make the above work by adding the account credentials, but that is exactly what I'm trying to avoid. I do not want something as sensitive as my account credentials just sitting out there, and I have no idea how to get the signature into the client app without including the account credentials.
Any help is greatly appreciated.
Why this?
writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
and not writing the sas string you already created?
It's late and I could easily be missing something but it seems that you might not be saving the same access signature that you're using to write the file in the first place.
Also, perhaps not relevant here, but I believe there is a limit on the number of container-wide access policies you can have (five per container). Are you uploading multiple files to the same container with this code and creating a new container SAS each time?
In general I think it would be better to request a sas for an individual blob at the time you need it with a short expiry time.
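For example, a minimal sketch with the 1.x StorageClient API you're already using (blob name and expiry are illustrative):
// Sketch: issue a short-lived, read-only SAS for a single blob.
CloudBlob someBlob = container.GetBlobReference("someblob.txt");
string blobSas = someBlob.GetSharedAccessSignature(new SharedAccessPolicy
{
    Permissions = SharedAccessPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
});
// hand the client someBlob.Uri + blobSas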
Is "my.azurestorage.windows.net" just a typo? I would expect something there like "https://account.blob.core.windows.net".
Otherwise the code looks pretty similar to the code in http://blog.smarx.com/posts/shared-access-signatures-are-easy-these-days, which works.
