I am trying to use Azure Key Vault-managed storage account keys. I succeeded in setting up a managed storage account with a 1-day regeneration period for testing purposes. My questions are:
Is it possible for me to access this storage account from any other application, e.g. Storage Explorer, Cloud Explorer, Power BI Desktop, etc.? If yes, how do I get the key?
I still see keys for this storage account in the Azure portal. Are they invalid? Or will they change every time Key Vault regenerates the keys for this storage account?
I set -ActiveKeyName Key2. Each time the keys rotate, Key1 is the one being regenerated. If Key1 is regenerated, is Key2 still valid even after 1 day? This active-key concept is not so clear in the documentation; can someone explain it?
Is a SAS token the only way to get access to storage account resources? I just want full access to the storage account for the regeneration period. Is that possible without using a SAS token?
I created a SAS definition from PowerShell and create a SAS token from it whenever I want to access the storage account. I think the SAS token would be invalidated, but not the SAS definition. I am assuming I don't have to handle expiry in code because I always get a new SAS token. Am I doing this correctly?
I know it's been 11 months, and you have either abandoned this or figured it out for yourself. I will answer your question in case anyone else finds it.
Yes! Any application you use should talk to Key Vault to get a SAS token. Avoid using the storage account keys; they are still valid, but may change at any time. If you just need one-time access, you can use PowerShell to get a SAS token.
They are valid, but they will change whenever Key Vault rotates them, so don't use them, and don't change them yourself.
There are two valid keys at any one time, but only one of them is used to issue SAS tokens: the active key. When it is time to rotate, Key Vault regenerates the key that is not active and then sets the newly created key as active.
Let's work through an example. Say the keys are called key1 and key2, with key1 equal to 'a' and key2 equal to 'b', and let key1 be the active key.
Regenerate key2; key2 is now equal to 'c'.
Set key2 as active. New SAS tokens are now signed with key2.
Now the keys have been rotated, but key1 is still valid; it will be changed the next time the keys are rotated. This way, as long as the rotation period is longer than the lifetime of the tokens, no token becomes invalid before it expires.
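The rotation sequence above can be sketched as a tiny simulation. KeyRotationSim is a hypothetical class for illustration only; the real rotation happens inside Key Vault, not in your code.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal simulation of Key Vault's managed-storage-account key rotation.
class KeyRotationSim {
    final Map<String, String> keys = new HashMap<>();
    String activeKey;

    KeyRotationSim() {
        keys.put("key1", "a");
        keys.put("key2", "b");
        activeKey = "key1"; // SAS tokens are signed with the active key
    }

    // Rotate: regenerate the *inactive* key, then make it active.
    void rotate(String newValue) {
        String inactive = activeKey.equals("key1") ? "key2" : "key1";
        keys.put(inactive, newValue); // the old inactive value is replaced
        activeKey = inactive;         // the newly generated key becomes active
    }
}
```

After `rotate("c")`, key2 holds 'c' and is active, while key1 still holds 'a', so tokens signed with key1 keep validating until the next rotation replaces it.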
No. The keys are still valid, so they can also be used, but you don't know when they will change.
The SAS definition is where the lifetime of a token is declared. When you created it, a secret was created in Key Vault; every time you read that secret, you get a new token. If you do not store the token but ask for a new one every time, you will always get a valid one. You might want to cache the token, though, as going to Key Vault on every request is slow.
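One way to do that caching is sketched below. The fetchToken supplier is a placeholder for "read the SAS-definition secret from Key Vault"; SasTokenCache and the five-minute refresh margin are my own hypothetical choices, not part of any Azure SDK.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.function.Supplier;

// Caches a SAS token and refreshes it shortly before expiry instead of
// calling Key Vault on every request.
class SasTokenCache {
    private final Supplier<String> fetchToken;   // stands in for the Key Vault call
    private final Duration lifetime;             // lifetime set in the SAS definition
    private final Duration refreshMargin = Duration.ofMinutes(5);
    private String cached;
    private Instant fetchedAt = Instant.MIN;

    SasTokenCache(Supplier<String> fetchToken, Duration lifetime) {
        this.fetchToken = fetchToken;
        this.lifetime = lifetime;
    }

    synchronized String get() {
        Instant expiresAt = fetchedAt.plus(lifetime);
        // Refresh when the cached token is missing or close to expiry.
        if (cached == null || Instant.now().isAfter(expiresAt.minus(refreshMargin))) {
            cached = fetchToken.get();
            fetchedAt = Instant.now();
        }
        return cached;
    }
}
```

Repeated `get()` calls within the token's lifetime return the cached value and only go back to Key Vault when the token is about to expire.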
How to create the managed storage account
How to create the SAS definition
Related
I create SAS tokens on the server on demand and send them to users so that they can upload blobs. By default, each token is set to expire in an hour. I'm also using Azure Functions for server-side processing.
// Assumes a storage connection string is available; Parse comes from the
// Microsoft.Azure.Storage (formerly WindowsAzure.Storage) SDK.
var cloudStorageAccount = CloudStorageAccount.Parse(connectionString);

var sharedAccessAccountPolicy = new SharedAccessAccountPolicy
{
    Permissions = SharedAccessAccountPermissions.Read | SharedAccessAccountPermissions.Write,
    Services = SharedAccessAccountServices.Blob,
    ResourceTypes = SharedAccessAccountResourceTypes.Object,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
    Protocols = SharedAccessProtocol.HttpsOnly
};

// Returns the account SAS as a query string ("?sv=...&sig=...")
var token = cloudStorageAccount.GetSharedAccessSignature(sharedAccessAccountPolicy);
What I'd like to do is expire a SAS token as soon as it has been used successfully, by listening to blob changes via an EventGridTrigger. For example, if it took the user 10 minutes to upload a file, that token should no longer be usable. This is to prevent abuse, because the API that generates SAS tokens is rate limited. If I want the user to upload only one file in an hour, I need a way to enforce this; somebody with a fast internet connection could theoretically upload dozens of files if the token expires in an hour.
So, my question would be, is it possible to programmatically expire a token even if its expiry date has not been reached? Another alternative that would work in my scenario is to generate one-time tokens if it's possible.
I believe you can use a user delegation SAS for this. A user delegation SAS can be created and revoked programmatically.
https://learn.microsoft.com/en-us/rest/api/storageservices/create-user-delegation-sas
Revoke the user delegation key
You can revoke the user delegation key by calling the Revoke User Delegation Keys operation. When you revoke the user delegation key, any shared access signatures relying on that key become invalid. You can then call the Get User Delegation Key operation again and use the key to create new shared access signatures. This approach is the quickest way to revoke a user delegation SAS.
You cannot revoke a SAS token before its expiration date unless it is associated with a stored access policy and you revoke that policy, or you rotate the access key the token was signed with. I don't think either approach really applies to your case. SAS tokens are essentially self-contained and cannot be altered once issued, so you cannot expire them early.
See the Revocation section on this page for the official explanation:
https://learn.microsoft.com/en-us/azure/storage/blobs/security-recommendations#revocation
"A service SAS that is not associated with a stored access policy cannot be revoked."
Also, there are no one-time use SAS tokens and according to this feedback request, Microsoft has no plans to implement that feature: https://feedback.azure.com/forums/217298-storage/suggestions/6070592-one-time-use-sas-tokens
Your best bet is simply to keep the expiration time as short as possible for your use case. If you absolutely must limit uploads for a specific user, consider having the user go through a separate controlled app (such as a Web API) instead of directly to storage; that app can act as a gatekeeper, checking previous uploads and enforcing the limit logic.
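The gatekeeper idea could look like the sketch below: track when each user last received a token and refuse a new one inside the window. UploadGatekeeper and its in-memory map are hypothetical; a real service would persist this state and run behind the SAS-issuing API.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Issues at most one SAS token per user per time window.
class UploadGatekeeper {
    private final Map<String, Instant> lastIssued = new ConcurrentHashMap<>();
    private final Duration window;

    UploadGatekeeper(Duration window) { this.window = window; }

    // Returns true if the user may receive a new SAS token now.
    boolean tryIssue(String userId, Instant now) {
        Instant prev = lastIssued.get(userId);
        if (prev != null && now.isBefore(prev.plus(window))) {
            return false; // still within the one-upload window
        }
        lastIssued.put(userId, now);
        return true;
    }
}
```

Only when `tryIssue` returns true would the API run the token-generation code and hand the SAS to the client; the token's own expiry stays short regardless.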
How do I generate an Azure Blob Storage SAS URL using Java?
Do we have any API where I can expire an existing SAS URL?
Requirement: generate a SAS URL with an expiry time of 7 days, and I want some API that lets me revoke this SAS URL at any point within the expiry window.
But it looks like we don't have this; each call just creates a new SAS URL, and we cannot expire an existing SAS URL at any point in time.
You can create an access policy on the container with a set start and end date, then create a SAS token on the container or an individual blob using that policy. Once the policy is deleted, all SAS tokens that were generated with it are invalidated.
public BlobServiceSasSignatureValues(String identifier)
//Where identifier is the name of the access policy
It's limited to five access policies per container, but if you are a bit flexible on the expiration date you could create one each week, so every blob would be available for at least a week and at most two. It also implies that once you remove a policy, all URLs for that week stop working.
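The weekly scheme could be sketched like this, assuming each policy starts on a Monday and is valid for two weeks (WeeklyPolicy is a hypothetical helper; only the date arithmetic is shown, not the Azure calls):

```java
import java.time.DayOfWeek;
import java.time.LocalDate;
import java.time.temporal.TemporalAdjusters;

// Each stored access policy covers the two weeks starting from the Monday
// of the week a token is issued in. A token issued on any day is therefore
// valid for at least about one week and at most two.
class WeeklyPolicy {
    static LocalDate policyStart(LocalDate issued) {
        return issued.with(TemporalAdjusters.previousOrSame(DayOfWeek.MONDAY));
    }

    static LocalDate policyEnd(LocalDate issued) {
        return policyStart(issued).plusWeeks(2);
    }
}
```

Deleting the policy whose start date is two Mondays ago then invalidates exactly the URLs from that week, which is the trade-off described above.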
I don't think there's any other way to invalidate the generated URLs apart from regenerating the access keys.
I would like to run a local Jupyter notebook connecting to an Azure Databricks cluster, and need to use dbutils to get secrets. This requires saving a privileged token locally, and it only lasts for 2 days. Is there any way to generate a longer-lived token, or to keep using dbutils locally for longer?
Note: Due to security restrictions, calling dbutils.secrets.get requires obtaining a privileged authorization token from your workspace. This is different from your REST API token, and starts with dkea.... The first time you run dbutils.secrets.get, you are prompted with instructions on how to obtain a privileged token. You set the token with dbutils.secrets.setToken(token), and it remains valid for 48 hours.
There are two types of databricks secrets:
Databricks-backed scopes
Azure Key Vault-backed scopes
This is possible by configuring secrets with Azure Key Vault.
To reference secrets stored in an Azure Key Vault, you can create a secret scope backed by Azure Key Vault. You can then leverage all of the secrets in the corresponding Key Vault instance from that secret scope.
Reference: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#akv-ss
https://learn.microsoft.com/en-us/azure/databricks/security/secrets/example-secret-workflow
Hope this helps.
I suppose you followed this tutorial to make Jupyter work with Azure Databricks via Databricks Connect.
And no, as it says here, there is no
"way to generate token longer than that or to keep using dbutils locally longer."
A token expires after 48 hours.
I am planning to use Key Vault to manage storage account keys.
My question is, when the keys get rotated, would the SAS token previously served by the keyVault get invalidated ?
For example, if I request a SAS for a blob with 30 days' validity but the key rotation period I set is 3 days, is the effective validity of the SAS 3 days or 30 days?
PS: I asked this query in the MS docs but did not get a reply, which is why I am asking you good people of SO.
My question is, when the keys get rotated, would the SAS token
previously served by the keyVault get invalidated ?
By default, the answer is yes: the SAS token will be invalidated once the key it was signed with is rotated.
If the SAS token is about to expire, we should get a new SAS token from Key Vault and update it.
More information about keyvault and storage account, please refer to this link.
For example, if I request a SAS for a blob with 30days validity but
the key rotation period I set is 3 days, then effectively the validity
of the SAS would be 3 days or 30 days ?
As far as I know, if we follow the official article, the answer is 3 days.
We can use Key Vault to manage an Azure storage account: update the storage account key or get the storage account key.
For example, we can use the command Update-AzureKeyVaultManagedStorageAccountKey to update the storage account key.
That's actually a bit more complicated than the other answer presents.
For starters, storage accounts have two storage account keys, both of which would give access to that account.
SAS tokens are derived from either of those keys. They keep working until they expire on their own OR until the key they were derived from is rotated (whichever comes sooner).
Key Vault managed storage accounts have a notion of an "active key". Whenever you request a SAS token from Key Vault, it uses the currently active key to generate the token it returns.
Whenever auto-rotation happens, Key Vault rotates the key that is NOT currently active and makes it the active key. The previously active key becomes "inactive", but it stays valid until the next auto-rotation, which means that any SAS tokens generated before the rotation will continue working until they expire or another rotation happens.
None of that matters, of course, if you use Update-AzureKeyVaultManagedStorageAccountKey and rotate the currently active key yourself. In that case, all SAS tokens produced with that key immediately become invalid.
So, as long as you stick to auto-rotation only AND the duration of your SAS tokens is less than the auto-rotation period, SAS tokens should not become invalid because of a storage key change.
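Under those rules, the effective lifetime of a Key Vault-issued token is the earlier of its own expiry and the moment its signing key is regenerated. A sketch of that arithmetic (SasLifetime is a hypothetical helper; it assumes, per the rotation description above, that a signing key is replaced two rotation periods after it became active):

```java
import java.time.Duration;
import java.time.Instant;

// Effective expiry of a SAS token issued by Key Vault: min(requested expiry,
// regeneration time of the key it was signed with).
class SasLifetime {
    // keyActivatedAt: when the signing key became the active key;
    // rotationPeriod: the configured regeneration period.
    static Instant effectiveExpiry(Instant issued, Duration requestedValidity,
                                   Instant keyActivatedAt, Duration rotationPeriod) {
        Instant requestedExpiry = issued.plus(requestedValidity);
        // Active for one period, inactive for one more, then regenerated.
        Instant keyRegeneratedAt = keyActivatedAt.plus(rotationPeriod.multipliedBy(2));
        return requestedExpiry.isBefore(keyRegeneratedAt) ? requestedExpiry : keyRegeneratedAt;
    }
}
```

For the 30-day/3-day example in the question: a token issued right when its signing key became active lives about 6 days, while one issued just before the next rotation lives about 3 days, so the effective validity is between 3 and 6 days, never 30.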
I have generated a SAS using the primary access key. I want to know: if the primary and secondary access keys of the storage account are changed, will the SAS generated using the earlier primary/secondary access keys continue to work, or do I need to generate a new SAS every time the storage access keys are changed?
Once you regenerate the account keys, SAS tokens created with the old key become invalid (you will start getting 403 errors if you use them). You would need to generate new SAS tokens with the new account keys.