Azure Blob Storage SAS token does not expire as expected

My application generates SAS tokens to access existing blobs within my container. However, my SAS token does not appear to be expiring: I can still view and fetch blobs from the container well past the expiry time I set.
Here is the code:
public string GenerateSasToken([NotNull] string containerName, [NotNull] string blobName)
{
    var startTime = DateTimeOffset.UtcNow;
    var expiredTime = startTime.AddSeconds(20);
    var blobClient = new BlobClient(_options.Value.ConnectionString, containerName, blobName);
    var sasBuilder = new BlobSasBuilder(BlobContainerSasPermissions.Read, expiredTime)
    {
        BlobName = blobName,
        BlobContainerName = containerName,
        StartsOn = startTime,
        ExpiresOn = expiredTime
    };
    var uri = blobClient.GenerateSasUri(sasBuilder);
    return uri.ToString();
}
The generated token is valid and I am able to use it, but it does not expire after 20 seconds; in fact, it does not expire even after 15 minutes.
Am I missing something in this API?
Thank you!
Edit:
I am attaching the SAS token that was generated.
?sv=2020-08-04&st=2022-01-24T21%3A20%3A41Z&se=2022-01-24T21%3A21%3A01Z&sr=b&sp=r&sig=signature-here

Even though the SAS token has expired, browser caching can mean you are still able to access the blob using the same SAS URL.
To avoid this, you can override the Cache-Control header via the SAS token, as suggested by @Gaurav Mantri: set the CacheControl value in your BlobSasBuilder.
Your BlobSasBuilder can then look like this:
var sasBuilder = new BlobSasBuilder(BlobContainerSasPermissions.Read, expiredTime)
{
    BlobName = blobName,
    BlobContainerName = containerName,
    StartsOn = startTime,
    ExpiresOn = expiredTime,
    // Cap the cache lifetime at the token's 20-second lifetime,
    // so a cached copy cannot outlive the SAS.
    CacheControl = "max-age=" + (int)(expiredTime - startTime).TotalSeconds
};
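If you want to confirm that the service itself is enforcing the expiry (independent of any browser cache), here is a minimal sketch using the same Azure.Storage.Blobs package; sasUrl is assumed to hold the string returned by GenerateSasToken above:
using System;
using Azure;
using Azure.Storage.Blobs;

// sasUrl is assumed to be the URL returned by GenerateSasToken above.
var blobClient = new BlobClient(new Uri(sasUrl));
try
{
    // Each call here is authenticated by the service, so a browser
    // cache cannot mask an expired token.
    var content = blobClient.DownloadContent().Value.Content;
    Console.WriteLine($"Token still valid, downloaded {content.ToMemory().Length} bytes.");
}
catch (RequestFailedException ex) when (ex.Status == 403)
{
    Console.WriteLine("Token rejected: " + ex.ErrorCode);
}
Run twice, once inside and once after the 20-second window, and the second run should hit the catch block.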

Related

Send message to Azure Service Bus from Azure Scheduler using POST

I want to send a message to Azure Service Bus from Azure Scheduler using POST,
like the demo on this page:
http://www.prasadthinks.com/
but I don't know how to set the 'Authorization' property in the HTTP header.
As far as I know, the 'Authorization' property must contain the Service Bus access token.
You can use your shared access policy's key name and key to generate the access token in code.
For more details, refer to the code below.
string keyName = "keyname";
string key = "key";
var sasToken = createToken("http://yourservicebusname.servicebus.windows.net/queuename", keyName, key);
The createToken function (it requires System.Globalization, System.Security.Cryptography, System.Text and System.Web):
private static string createToken(string resourceUri, string keyName, string key)
{
    // Token expiry expressed as seconds since the Unix epoch, two hours from now.
    TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
    var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + 7200); // expires in 2h
    // Sign the URL-encoded resource URI plus the expiry with HMAC-SHA256.
    string stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
    HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key));
    var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
    // This is the auth token.
    var sasToken = String.Format(CultureInfo.InvariantCulture,
        "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
        HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, keyName);
    return sasToken;
}
The resulting string is the value for the 'Authorization' header; you can copy it into the Scheduler job's header settings. Note that this token is only valid for two hours.
Besides, Azure Scheduler jobs already support sending messages to Service Bus directly: you don't need to create the SAS token yourself; you can just add the key name and key in the job's authentication settings.
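If you want to test the generated token outside of Scheduler, here is a minimal sketch that posts a message to the queue's REST endpoint, reusing the createToken helper above; the namespace, queue name, key name and key are placeholders:
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

static async Task SendTestMessageAsync()
{
    // Placeholders - substitute your own namespace, queue name, key name and key.
    string queueUri = "https://yourservicebusname.servicebus.windows.net/queuename";
    string sasToken = createToken(queueUri, "keyname", "key");

    using (var client = new HttpClient())
    {
        var request = new HttpRequestMessage(HttpMethod.Post, queueUri + "/messages")
        {
            Content = new StringContent("Hello from REST", Encoding.UTF8, "text/plain")
        };
        // The SAS token goes into the Authorization header verbatim.
        request.Headers.TryAddWithoutValidation("Authorization", sasToken);

        HttpResponseMessage response = await client.SendAsync(request);
        Console.WriteLine(response.StatusCode); // 201 Created on success
    }
}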

Can I specify a queue name when generating shared access signatures (SAS) for my Azure Storage account?

Here are the docs that describe how to construct a service SAS.
The document says you can specify a table name, so that the SAS can only access that specific table.
Can I do the same thing with a queue, so the SAS can only access that specific queue?

Can I do the same thing with a queue, so the SAS can only access that specific queue?

Sure you can! Take a look at the code below:
static void GenerateSasForQueue()
{
    var cred = new StorageCredentials(accountName, accountKey);
    var account = new CloudStorageAccount(cred, true);
    var client = account.CreateCloudQueueClient();
    var queue = client.GetQueueReference("queue-name");
    var sasPolicy = new SharedAccessQueuePolicy()
    {
        // Back-date the start time to tolerate small clock differences.
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-15),
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(2),
        Permissions = SharedAccessQueuePermissions.Add | SharedAccessQueuePermissions.Read |
                      SharedAccessQueuePermissions.Update | SharedAccessQueuePermissions.ProcessMessages
    };
    var sasToken = queue.GetSharedAccessSignature(sasPolicy);
    var sasUrl = string.Format("{0}{1}", queue.Uri.AbsoluteUri, sasToken);
}
This code will generate a SAS token for the queue named queue-name in your storage account, with all of those permissions, valid for 2 hours from the time of SAS creation.
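On the consuming side, a client that holds only the SAS URL (and not the account key) can work against that one queue. A minimal sketch with the same SDK, assuming sasUrl is the string built above:
using System;
using Microsoft.WindowsAzure.Storage.Queue;

// Build the queue reference from the SAS URL alone - no account key needed.
var sasQueue = new CloudQueue(new Uri(sasUrl));

// The Add permission lets the client enqueue...
sasQueue.AddMessage(new CloudQueueMessage("hello via SAS"));

// ...and ProcessMessages lets it dequeue and delete.
CloudQueueMessage msg = sasQueue.GetMessage();
Console.WriteLine(msg.AsString);
sasQueue.DeleteMessage(msg);
Any attempt to reach a different queue with this URL fails, since the signature is scoped to queue-name.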

How do I find out whether a blob URL's shared access token has expired?

I have written the code below to get a blob URL with a SAS token, setting the token to expire after 2 hours:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
CloudBlockBlob blockBlob = container.GetBlockBlobReference("blobname");
// Create an ad-hoc shared access policy with read permissions which will expire in 2 hours.
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(2),
};
SharedAccessBlobHeaders headers = new SharedAccessBlobHeaders()
{
    ContentDisposition = string.Format("attachment;filename=\"{0}\"", "blobname"),
};
var sasToken = blockBlob.GetSharedAccessSignature(policy, headers);
blobUrl = blockBlob.Uri.AbsoluteUri + sasToken;
Using the above code I get a blob URL with a valid expiry token; now I want to check in a client application whether the blob URL is still valid.
I tried the WebRequest and HttpClient approach: pass the URL and check the response status code. If the response code is 404 I assume the URL has expired, otherwise it is still valid. But this approach takes too much time.
Please suggest another way.
I tried running code very similar to yours, and I am getting a 403 error, which is actually what is expected in this case. Based on your question, I am not sure whether the 403 is more helpful to you than the 404. Here is code running in a console application that returns a 403:
class Program
{
    static void Main(string[] args)
    {
        string blobUrl = CreateSAS();
        CheckSAS(blobUrl);
        Console.ReadLine();
    }

    // This method returns a reference to the blob with the SAS, and attempts to read it.
    static void CheckSAS(string blobUrl)
    {
        CloudBlockBlob blob = new CloudBlockBlob(new Uri(blobUrl));
        // If the DownloadText() method is run within the two-minute period that the SAS is valid, it succeeds.
        // If it is run after the SAS has expired, it returns a 403 error.
        // Sleep for 3 minutes to trigger the error.
        System.Threading.Thread.Sleep(180000);
        Console.WriteLine(blob.DownloadText());
    }

    // This method creates the SAS on the blob.
    static string CreateSAS()
    {
        string containerName = "forum-test";
        string blobName = "blobname";
        string blobUrl = "";
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference(containerName);
        container.CreateIfNotExists();
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobName + DateTime.Now.Ticks);
        blockBlob.UploadText("Blob for forum test");
        // Create an ad-hoc shared access policy with read permissions which will expire in 2 minutes.
        SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2),
        };
        SharedAccessBlobHeaders headers = new SharedAccessBlobHeaders()
        {
            ContentDisposition = string.Format("attachment;filename=\"{0}\"", blobName),
        };
        var sasToken = blockBlob.GetSharedAccessSignature(policy, headers);
        blobUrl = blockBlob.Uri.AbsoluteUri + sasToken;
        return blobUrl;
    }
}
There are cases in which SAS failures do return a 404, which can create problems for troubleshooting operations using SAS. The Azure Storage team is aware of this issue and in future releases SAS failures may return a 403 instead. For help troubleshooting a 404 error, see http://azure.microsoft.com/en-us/documentation/articles/storage-monitoring-diagnosing-troubleshooting/#SAS-authorization-issue.
I also ran into the same issue a few days back. I was actually expecting the storage service to return a 403 error code when the SAS token has expired, but the storage service returns a 404 error.
Given that we don't have any other option, the way you're doing it is the only viable way, but it is still not entirely reliable, because you would also get a 404 if the blob is simply not present in the storage account.
Maybe you can parse the "se" argument from the generated SAS, which holds the expiry time, e.g. "se=2013-04-30T02%3A23%3A26Z". However, since the server time might not be the same as the client time, this solution can be unstable.
http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-1/
You're using UTC time for SharedAccessExpiryTime (see "Expiry Time" in https://learn.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1#parameters-common-to-account-sas-and-service-sas-tokens).
The expiry time is then recorded in the se parameter of the token, whose value can be checked against the current UTC time on the client side before actually using the token. This way you save yourself an extra call to Blob storage just to find out whether the token has expired.
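A minimal sketch of that client-side check, under the assumption that the SAS URL is well formed; the clock-skew caveat mentioned above still applies:
using System;
using System.Web; // for HttpUtility; reference System.Web

static bool IsSasExpired(string blobUrlWithSas)
{
    // Pull the 'se' (signed expiry) parameter out of the query string.
    var query = HttpUtility.ParseQueryString(new Uri(blobUrlWithSas).Query);
    string expiry = query["se"]; // e.g. "2013-04-30T02:23:26Z" once decoded
    if (expiry == null)
        return true; // no expiry claim - treat the URL as unusable

    // 'se' is an ISO 8601 UTC timestamp; compare it to the client's UTC clock.
    return DateTimeOffset.Parse(expiry).ToUniversalTime() <= DateTimeOffset.UtcNow;
}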

Create shared access token with Microsoft.WindowsAzure.Storage returns 403

I have a fairly simple method that uses the new Storage API to create a SAS and copy a blob from one container to another.
I am trying to use this to copy blobs BETWEEN storage accounts. So I have two storage accounts with exactly the same containers, and I am trying to copy a blob from one storage account's container into the other storage account's container.
I don't know if the SDK is built for that, but it seems like it would be a common scenario.
Some additional information:
I create the token on the destination container.
Does the token need to be created on both the source and the destination? Does it take time for the token to take effect? Do I need to create it for each request, or only once per token "lifetime"?
I should mention that a 403 is the Forbidden HTTP status code.
private static string CreateSharedAccessToken(CloudBlobClient blobClient, string containerName)
{
    var container = blobClient.GetContainerReference(containerName);
    var blobPermissions = new BlobContainerPermissions();
    // The shared access policy provides read/write access to the container for 1 hour:
    blobPermissions.SharedAccessPolicies.Add("SolutionPolicy", new SharedAccessBlobPolicy()
    {
        // To ensure the SAS is valid immediately we don't set a start time,
        // so we can avoid failures caused by small clock differences:
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        Permissions = SharedAccessBlobPermissions.Write |
                      SharedAccessBlobPermissions.Read
    });
    blobPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
    container.SetPermissions(blobPermissions);
    return container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "SolutionPolicy");
}
Down the line I use this token to call a copy operation, which returns a 403:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
destBlob.StartCopyFromBlob(uri);
My version of Azure.Storage is 2.1.0.2.
Here is the full copy method in case that helps:
private static void CopyBlobs(
    CloudBlobContainer srcContainer, string blobToken,
    CloudBlobContainer destContainer)
{
    var srcBlobList
        = srcContainer.ListBlobs(string.Empty, true, BlobListingDetails.All); // set to None in prod (for perf)
    //// get the SAS token to use for all blobs
    //string token = srcContainer.GetSharedAccessSignature(
    //    new SharedAccessBlobPolicy(), "SolutionPolicy");
    bool pendingCopy = true;
    foreach (var src in srcBlobList)
    {
        var srcBlob = src as ICloudBlob;
        // Determine BlobType:
        ICloudBlob destBlob;
        if (srcBlob.Properties.BlobType == BlobType.BlockBlob)
        {
            destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
        }
        else
        {
            destBlob = destContainer.GetPageBlobReference(srcBlob.Name);
        }
        // Determine copy state:
        if (destBlob.CopyState != null)
        {
            switch (destBlob.CopyState.Status)
            {
                case CopyStatus.Failed:
                    log.Info(destBlob.CopyState);
                    break;
                case CopyStatus.Aborted:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    destBlob.StartCopyFromBlob(destBlob.CopyState.Source);
                    return;
                case CopyStatus.Pending:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    break;
            }
        }
        // copy using only Policy ID:
        var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
        destBlob.StartCopyFromBlob(uri);
        //// copy using src blob as SAS
        //var source = new Uri(srcBlob.Uri.AbsoluteUri + token);
        //destBlob.StartCopyFromBlob(source);
    }
}
And finally the account and client (vetted) code:
var credentials = new StorageCredentials("BAR", "FOO");
var account = new CloudStorageAccount(credentials, true);
var blobClient = account.CreateCloudBlobClient();
var sasToken = CreateSharedAccessToken(blobClient, "content");
When I use a REST client this seems to work... any ideas?
Consider also this problem:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
You are probably calling the ToString method of Uri, which produces a "human-readable" version of the URL. If blobToken contains special characters, for example "+", this will cause a malformed-token error on the storage server, which will refuse to give you access.
Use this instead:
String uri = srcBlob.Uri.AbsoluteUri + blobToken;
Shared access tokens are not required for this task. I ended up with two accounts and it works fine:
var accountSrc = new CloudStorageAccount(credsSrc, true);
var accountDest = new CloudStorageAccount(credsSrc, true);
var blobClientSrc = accountSrc.CreateCloudBlobClient();
var blobClientDest = accountDest.CreateCloudBlobClient();
// Set permissions on the containers.
var permissions = new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob };
srcContainer.SetPermissions(permissions);
destContainer.SetPermissions(permissions);
// Grab the blobs.
var sourceBlob = srcContainer.GetBlockBlobReference("FOO");
var destinationBlob = destContainer.GetBlockBlobReference("BAR");
// Create a new blob.
destinationBlob.StartCopyFromBlob(sourceBlob);
Since both CloudStorageAccount objects above are created from the same credentials (credsSrc) and therefore point to the same account, copying without a SAS token works just fine, as you also mentioned.
On the other hand, you need either a publicly accessible blob or a SAS token to copy from another account. So what you tried was correct, but you established a container-level stored access policy, which can take up to 30 seconds to take effect, as documented on MSDN. During this interval, a SAS token that is associated with the stored access policy will fail with status code 403 (Forbidden) until the access policy becomes active.
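Given that propagation window, one defensive option is a short retry loop around the copy when it fails with 403 immediately after the policy was set; a sketch against the same 2.x SDK types used above:
// Retry the copy for up to ~30 seconds while the newly created
// stored access policy propagates; any other failure is rethrown.
var copySource = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
for (int attempt = 0; ; attempt++)
{
    try
    {
        destBlob.StartCopyFromBlob(copySource);
        break;
    }
    catch (StorageException ex)
        when (ex.RequestInformation.HttpStatusCode == 403 && attempt < 6)
    {
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(5));
    }
}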
One more thing that I would like to point out: when you call Get*BlobReference to create a new blob object, the CopyState property will not be populated until you perform a GET/HEAD operation such as FetchAttributes.
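For example, in the copy loop above the destination blob would need a round-trip before its CopyState is meaningful; a sketch:
// CopyState is only populated after a GET/HEAD round-trip:
CloudBlockBlob destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
if (destBlob.Exists())              // HEAD request
{
    destBlob.FetchAttributes();     // populates Properties and CopyState
    if (destBlob.CopyState != null)
        Console.WriteLine(destBlob.CopyState.Status);
}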

SAS token expires while downloading a blob

I have a SAS token which will expire within 2 minutes.
SAS = AzureClient.GetCloudContainer().GetSharedAccessSignature(new SharedAccessPolicy()
{
    SharedAccessExpiryTime = DateTime.UtcNow + TimeSpan.FromMinutes(1)
}, "readonly");
var sasCreds = new StorageCredentialsSharedAccessSignature(SAS);
CloudStorageAccount _storageAccount = AzureClient.GetCloudStorageAccount();
CloudBlobClient sasBlobClient = new CloudBlobClient(_storageAccount.BlobEndpoint, sasCreds);
CloudBlob sasBlob = sasBlobClient.GetBlobReference("blobname");
Where "readonly" is the policy name.
Now I am doing the following operation:
using (BlobStream stream = sasBlob.OpenRead())
{
    using (FileStream fileStream = File.OpenWrite(@"Something.txt"))
    {
        BlobStreamReader(stream, fileStream);
    }
}

private void BlobStreamReader(BlobStream blob, Stream OutputStream)
{
    int buffersize = 4194304; // 4 MB
    byte[] data = new byte[buffersize];
    do
    {
        int bytesRead = blob.Read(data, 0, buffersize);
        if (bytesRead == 0) break;
        OutputStream.Write(data, 0, bytesRead);
    }
    while (true);
}
The problem is that the download fails once the SAS expires. My understanding was that the SAS token is needed only for authentication, and that a download started within the token's validity window would continue even after the SAS expired.
It is correct that the SAS token is needed only for authentication. However, in your case BlobStream issues a new request whenever it needs more data from the server. Because each request is authenticated separately, and your SAS token expires before the entire download finishes, the failure is expected.
If you want to download the entire blob, DownloadToStream is a better alternative, because it issues a single request to the server and then downloads the entire blob.
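With the same legacy client as in the question, the single-request version would look roughly like this:
// One authenticated GET instead of many ranged reads, so the SAS
// only needs to be valid at the moment the request is issued.
CloudBlob sasBlob = sasBlobClient.GetBlobReference("blobname");
using (FileStream fileStream = File.OpenWrite(@"Something.txt"))
{
    sasBlob.DownloadToStream(fileStream);
}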
