Pre-signed URL with expiry for Azure blob storage

We are using Azure Blob Storage for storing unstructured content.
Our setup is as follows:
Browser client (accessing files) -> Our backend (our cloud platform) -> Proxy managing the Azure account (our cloud platform) -> Azure Blob Storage.
The proxy managing the Azure account holds the account credentials. It generates a SAS token and hands it to consumers such as our backend. This SAS token has an infinite expiry time.
From our backend, we now want to generate a pre-signed URL (similar to the S3 concept) with an expiration time and give it to the browser client. We need this because we want large files to be downloaded directly from the browser, bypassing our backend.
It seems the generated signed URL will always have the same unlimited expiry as our SAS token.
Please note that we (our backend) do not have access to the Azure storage account, so we cannot generate an access token ourselves.
Is there any way our problem could be solved?
Best Regards,
Saurav

If I understand correctly, you get a SAS token that never expires, but you would like to specify an expiry date when you use this token in a SAS URL. This is not possible.
Essentially, a SAS URL for a blob is the base blob URL (https://account.blob.core.windows.net/container/blob) + the SAS token.
You cannot change any of the parameters of the SAS token when using it in a SAS URL, because the sig portion of the SAS URL is computed from the other parameters of the SAS token, such as se (expiry) and st (start). Changing them invalidates the SAS token.
Furthermore, you cannot create a new SAS token from another SAS token.
The only solution to your problem is to have your proxy service create a SAS token with a predefined expiry time.
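To see why editing the expiry cannot work, here is a minimal sketch of how a service SAS signature is computed. It is deliberately simplified (a real string-to-sign has more fields, and the field order below is illustrative, not the Azure spec): the service recomputes an HMAC-SHA256 over the query parameters and compares it with sig, so a tampered se no longer matches.

```python
import base64
import hashlib
import hmac

def sign_sas(account_key_b64: str, permissions: str, start: str,
             expiry: str, canonical_resource: str) -> str:
    """Simplified service-SAS signing: HMAC-SHA256 over a
    newline-joined string-to-sign, keyed with the account key."""
    string_to_sign = "\n".join([permissions, start, expiry, canonical_resource])
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

key = base64.b64encode(b"demo-account-key").decode()
resource = "/blob/account/container/blob.txt"

# Token as issued by the proxy (effectively unlimited expiry).
sig = sign_sas(key, "r", "2024-01-01", "9999-12-31", resource)
# Same token with the expiry rewritten to one week.
tampered = sign_sas(key, "r", "2024-01-01", "2024-01-08", resource)

# The signature no longer matches, so the service rejects the edited URL.
print(sig != tampered)  # True
```

This is why only the party holding the account key (the proxy, in your setup) can issue a SAS with a different expiry.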

Related

PowerBI Secure Access Blob Storage

I have created a Power BI report with embedded videos using the HTML custom visual. The videos are hosted on Azure Blob Storage, so I have generated a SAS token that I have added to the video URL in my data.
However, I would like to increase the security of the Blob Storage and find a better solution than having a SAS token out there for everyone to use, with limited control over it.
For example, would it be possible to access a token provider that would generate an access token on the fly if a set of credentials is correct? This way, I would be able to control access to the videos.
I looked into Shared Access Signatures but was not able to implement them from Power BI. Any other ideas are welcome!
I tried to reproduce the same in my environment.
To avoid a SAS token that everyone can use, try generating the token with an allowed IP address or IP range. If the user's IP address is not in that range, the user will get an authorization error when accessing the SAS URL.

Would it be possible to access a token provider that would generate an access token on the fly if a set of credentials is correct?

You can make use of the authorization-code flow to generate the access token; it requires user interaction (a sign-in) while generating the token.
Make use of the below parameters to request the token (note that the token endpoint takes a POST, after the user has been redirected back with the authorization code):

POST https://login.microsoftonline.com/TenantID/oauth2/v2.0/token
client_id: ClientID
client_secret: *****
grant_type: authorization_code
scope: scope
redirect_uri: redirect_uri
code: code

A sign-in screen will appear to validate the user's credentials, after which the access token is generated successfully.
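As a sketch of that exchange (assuming a confidential client; every angle-bracketed value below is a placeholder you must substitute), the token request is a form-encoded POST body:

```python
from urllib.parse import urlencode

# Placeholders: substitute your tenant, client registration, and the code
# that the /authorize step returned to your redirect_uri.
token_url = "https://login.microsoftonline.com/<TenantID>/oauth2/v2.0/token"

body = urlencode({
    "client_id": "<ClientID>",
    "client_secret": "<secret>",
    "grant_type": "authorization_code",
    "scope": "<scope>",
    "redirect_uri": "<redirect_uri>",
    "code": "<authorization-code>",
})

# POST this body to token_url with
# Content-Type: application/x-www-form-urlencoded;
# the JSON response contains access_token and expires_in.
print("grant_type=authorization_code" in body)  # True
```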

Can we Reset SAS Url for azure blob storage?

How to generate an Azure Blob Storage SAS URL using Java?
Do we have any API with which I can expire an existing SAS URL?
Requirement: generate a SAS URL with an expiry time of 7 days, and have some API with which I can revoke that SAS URL at any point within the expiry window.
It looks like this does not exist; each call just creates a new SAS URL, but I want to be able to expire an existing SAS URL at any point in time.
You can create an access policy on the container with a set start and end date, then create a SAS token on the container or on an individual blob using that policy. Once the policy is deleted, it also invalidates all SAS tokens that were generated with it.

public BlobServiceSasSignatureValues(String identifier)
// Where identifier is the name of the access policy

You are limited to 5 access policies per container, but if you are a bit flexible on the expiration date you could create one each week, so every blob would be available for at least one week and at most two. It also means that once you remove a policy, all URLs for that week stop working.
I don't think there is any other way to invalidate the generated URLs, apart from regenerating the access keys.
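The reason revocation works here is that a SAS created from a stored access policy carries only the policy's identifier (the si query parameter); the expiry is looked up from the policy on the server at request time. A minimal sketch of that lookup, using a hypothetical in-memory policy store rather than the Azure SDK:

```python
from datetime import datetime, timezone

# Hypothetical server-side store of stored access policies for a container.
policies = {
    "week-42-policy": {"expiry": datetime(2099, 1, 1, tzinfo=timezone.utc)},
}

def is_sas_valid(policy_id: str, now: datetime) -> bool:
    """A SAS tied to a stored access policy is valid only while the
    policy still exists and its expiry has not passed."""
    policy = policies.get(policy_id)
    if policy is None:  # policy deleted -> every SAS that used it is revoked
        return False
    return now < policy["expiry"]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
print(is_sas_valid("week-42-policy", now))  # True

del policies["week-42-policy"]  # revoke by deleting the policy
print(is_sas_valid("week-42-policy", now))  # False
```

In contrast, an ad-hoc SAS embeds se directly in the signed token, so there is nothing server-side to delete short of rotating the account keys.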

Azure blob storage : Is there any way to show custom html page when the SAS token expired for a file?

For one of my projects I am uploading files to Azure Blob Storage, setting an expiry of 1 week via a SAS token, and sending the URL to the user via email. By default, Azure returns an error in XML format when the link has expired. Is there any way to show a custom HTML page instead of the XML error when the URL has expired?
No, this is currently not possible in Blob Storage. As a workaround, you could route the requests through, for instance, an HTTP-triggered Azure Function that checks the SAS token and returns either the file from Blob Storage or a custom error page.
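A sketch of that workaround's core logic (the expiry check and HTML below are illustrative; this is not the Azure Functions programming model):

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs, urlparse

ERROR_PAGE = "<html><body><h1>This download link has expired.</h1></body></html>"

def handle_request(sas_url: str, now: datetime) -> tuple:
    """Return (status, body): serve the blob while the SAS 'se' (expiry)
    parameter is in the future, otherwise a custom HTML error page."""
    params = parse_qs(urlparse(sas_url).query)
    expiry = datetime.fromisoformat(params["se"][0].replace("Z", "+00:00"))
    if now >= expiry:
        return 410, ERROR_PAGE
    # Here the function would fetch the blob with the SAS URL
    # and stream it back to the caller.
    return 200, "<file contents>"

url = ("https://account.blob.core.windows.net/c/b.txt"
       "?sv=2021-08-06&se=2024-01-08T00:00:00Z&sig=...")
print(handle_request(url, datetime(2024, 1, 9, tzinfo=timezone.utc))[0])  # 410
```

Note this routes every download through the function, so it gives up the direct-to-storage bandwidth benefit of a plain SAS URL.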

How to serve client request for Storage Blob after key rotation in Azure

I have a blob storage account with some resources. I provide SAS tokens to clients, and every token is generated for one specific blob per client. After some amount of time I want to rotate my account keys, which will invalidate all current clients' tokens (clients do not have the account key, they only have a token).
I was wondering if someone has had a similar case, where a REST API over Azure Storage has to provide new SAS tokens to clients after a key rotation. I know that in this situation the client will get a 403 Unauthorized, so one option is to send another request for a new token and then retry the request for the resource.
Or maybe I could send back a 301 Moved HTTP code with a link to the REST endpoint that generates a new token, so the client wouldn't need additional knowledge about another endpoint.
Does anyone have any experience with token rotation like this?
As mentioned in the comment, since your clients access the blob directly, you wouldn't know that they got a 403 error unless they tell you.
If it is acceptable, you could take a look at Authorize access to Azure blobs and queues using Azure Active Directory. Once that is configured, clients can still access the storage even after you rotate the account keys. However, this feature applies at the container level at minimum, not at the blob level, so it may not fit your scenario.
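The retry option from the question can be sketched as follows; the fetch and token-refresh callables are stand-ins for your HTTP client and your token-issuing endpoint, and the URLs are placeholders:

```python
def download_with_refresh(fetch, url, get_new_sas_url):
    """Try the current SAS URL once; on a 403 (e.g. after key rotation),
    request a fresh SAS URL from the backend and retry a single time."""
    status, body = fetch(url)
    if status == 403:
        status, body = fetch(get_new_sas_url())
    return status, body

# Stub transport: the old URL is rejected after rotation, the new one works.
def fake_fetch(url):
    return (200, b"data") if url == "https://example/new-sas" else (403, b"")

status, body = download_with_refresh(fake_fetch, "https://example/old-sas",
                                     lambda: "https://example/new-sas")
print(status)  # 200
```

Capping the retry at one attempt avoids a loop when the 403 has some other cause, such as revoked permissions.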

Azure Shared Access Signature creation

I am generating a shared access signature (SAS) for one of my blob containers (v2) using the Azure portal. I am trying to upload a file from the frontend, for which a SAS is required. The problem is that the SAS expires every day. Is there a way to renew the SAS automatically from code, or a way to do the authentication using Azure AD?
So basically I have a frontend where the user logs in using Azure AD, and now I want to use that session to allow the user to upload to Azure Storage. As the user is already authenticated, I feel there should be a way to generate a SAS on the fly for that session.
Shared access signatures are useful for providing limited permissions to your storage account to clients that should not have the account key.
If you are the one writing data to the storage account, do so server-side. That way you can validate that the user is logged in and, if so, let your backend write to the storage account using one of the access keys (or, better yet, a managed identity).
Of course, you could also have your front end request a SAS token from the back end, for instance from an API; this could be as simple as an Azure Function. The SAS token could then use a near-term expiration time. In the end, though, you are still opening up parts of the storage account to anyone who can access the frontend.
With near-term expiration, even if a SAS is compromised, it's valid only for a short time. This practice is especially important if you cannot reference a stored access policy. Near-term expiration times also limit the amount of data that can be written to a blob by limiting the time available to upload to it.
Source: Using shared access signatures (SAS)
Taken from that same article:
The following code example creates an account SAS that is valid for the Blob and File services, and gives the client read, write, and list permissions to access service-level APIs. The account SAS restricts the protocol to HTTPS, so requests must be made over HTTPS.
static string GetAccountSASToken()
{
    // To create the account SAS, you need to use your shared key credentials. Modify for your account.
    const string ConnectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key";
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);

    // Create a new access policy for the account.
    SharedAccessAccountPolicy policy = new SharedAccessAccountPolicy()
    {
        Permissions = SharedAccessAccountPermissions.Read | SharedAccessAccountPermissions.Write | SharedAccessAccountPermissions.List,
        Services = SharedAccessAccountServices.Blob | SharedAccessAccountServices.File,
        ResourceTypes = SharedAccessAccountResourceTypes.Service,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
        Protocols = SharedAccessProtocol.HttpsOnly
    };

    // Return the SAS token.
    return storageAccount.GetSharedAccessSignature(policy);
}
