I am using an API (Node.js) to generate a read-only shared access signature for an iOS app using Azure Mobile Services. The API generates the SAS using the following code:
var azure = require('azure-storage');
var blobService = azure.createBlobService(accountName, accountKey);
// Passing null for the blob name produces a SAS scoped to the whole container
var sas = blobService.generateSharedAccessSignature("containerName", null, sharedAccessPolicy);
This works great when I want a SAS for access to one container. But I really need access to all containers in the storage account. I could obviously do this with a separate API call for each container but this would require hundreds of extra calls.
I have looked everywhere for a solution but I can't get anything to work. I would very much appreciate knowing if there is a way to generate a single SAS for all containers in a storage account.
You can construct an account-level SAS, where you get to specify:
services to include (blob, table, queue, file)
resource access (e.g. container create & delete)
permissions (e.g. read, write, list)
protocol (e.g. https only, vs http+https)
Just like a service-specific SAS, you get to specify expiry date (and optionally start date).
Given your use case, you can tailor your account SAS to be just for blobs; there's no need to include unneeded services (in your case, tables/queues/files).
More specifics are documented here.
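For illustration, here's a minimal sketch of building such an account SAS with the .NET storage client (Microsoft.WindowsAzure.Storage 5.x or later; accountName/accountKey are placeholders). The Node azure-storage package has an analogous top-level generateAccountSharedAccessSignature helper if you prefer to stay in JavaScript.

// Sketch: a blob-only account SAS with read+list permissions, HTTPS only
var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
var policy = new SharedAccessAccountPolicy
{
    Services = SharedAccessAccountServices.Blob,   // blobs only, no tables/queues/files
    ResourceTypes = SharedAccessAccountResourceTypes.Container | SharedAccessAccountResourceTypes.Object,
    Permissions = SharedAccessAccountPermissions.Read | SharedAccessAccountPermissions.List,
    Protocols = SharedAccessProtocol.HttpsOnly,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
};
var accountSas = account.GetSharedAccessSignature(policy);

The one token can then be appended to any container or blob URL in the account, which avoids the per-container calls.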
Hello, recently I have been trying to use the Azure Graph request noted here:
https://learn.microsoft.com/en-us/graph/api/user-exportpersonaldata?view=graph-rest-1.0&tabs=http
When you make that request, as stated in the documentation, you provide a storage location, which "is a shared access signature (SAS) URL to an Azure Storage account, to where data should be exported."
Every time I provide my SAS URL I get this error: "Storage destination needs to have a Service SAS, not an Account SAS"
Can someone please help me understand what this means? The documentation it links to is not clear.
Storage destination needs to have a Service SAS, not an Account SAS
The difference between an Account SAS and a Service SAS is described here: https://learn.microsoft.com/en-us/rest/api/storageservices/delegate-access-with-shared-access-signature#types-of-shared-access-signatures.
You're providing a SAS URL for the entire account (e.g. https://account.blob.core.windows.net/?sas-parameters), whereas it is expected that you provide a SAS URL for a specific blob container (e.g. https://account.blob.core.windows.net/blob-container/?sas-parameters).
There are two possible solutions:
Create a SAS URL for a specific blob container; in other words, create a Service SAS, as the error message tells you to do. You can do so using a tool like Microsoft Azure Storage Explorer.
Insert the blob container name into your account SAS URL so that it looks something like this: https://account.blob.core.windows.net/blob-container/?sas-parameters.
Please note that if you're using an Account SAS, it should at least have the Write permission on Object for the Blob service.
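If you'd rather create the Service SAS in code than with a tool, here's a hedged sketch using the .NET storage client (the connection string and container name are placeholders):

// Sketch: a Service SAS scoped to one container, with Write permission
// so the export can upload blobs into it
var account = CloudStorageAccount.Parse(connectionString);
var container = account.CreateCloudBlobClient().GetContainerReference("export-container");
var sas = container.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Write,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(24)
});
var sasUrl = container.Uri + sas; // pass this URL as the storage destination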
Situation:
We have a web-app on Azure and blob storage. Via our web-app we write data into the blob, and we currently read that data back out, returning it as responses in the web-app.
What we're trying to do:
We are trying to find a way to restrict access to the blob storage so that only our web-app can access it. Setting up an IP address in the firewall settings works fine when we have a static IP (we often test by running the web-app locally from our office, and that lets us read/write to the blob just fine). However, when we use the IP address of our web-app (as read from the cross-domain page of the web-app), we do not get the same access and get errors trying to read/write to the blob.
Question:
Is there a way to restrict access to the blob storage to the web-app without having to set up a VPN on Azure (too expensive)? I've seen people talk about using SAS to generate time-limited links to blob content, and that makes sense for only allowing users to access content via our web-app (which would then deliver them the link), but it doesn't solve the problem of our web-app not being able to write to the blob when the blob isn't publicly accessible.
Are we just misusing blobs, or is this a valid way to use them that simply requires the VPN approach?
Another option would be to use Azure AD authentication combined with a managed identity on your App Service.
At the time of writing this feature is still in preview, though.
I wrote an article on how to do this: https://joonasw.net/view/azure-ad-authentication-with-azure-storage-and-managed-service-identity.
The key parts:
Enable Managed Identity
Grant the generated service principal the necessary role on the storage account/blob container
Change your code to use AAD access tokens acquired with the managed identity instead of an access key/SAS token
Acquiring the token using https://www.nuget.org/packages/Microsoft.Azure.Services.AppAuthentication/1.1.0-preview:
private async Task<string> GetAccessTokenAsync()
{
    // Acquire an AAD token for Azure Storage via the App Service's managed identity
    var tokenProvider = new AzureServiceTokenProvider();
    return await tokenProvider.GetAccessTokenAsync("https://storage.azure.com/");
}
Reading a blob using the token:
private async Task<Stream> GetBlobWithSdk(string accessToken)
{
    // Wrap the AAD access token in storage credentials
    var tokenCredential = new TokenCredential(accessToken);
    var storageCredentials = new StorageCredentials(tokenCredential);
    // Define the blob to read
    var blob = new CloudBlockBlob(new Uri($"https://{StorageAccountName}.blob.core.windows.net/{ContainerName}/{FileName}"), storageCredentials);
    // Open a data stream to the blob
    return await blob.OpenReadAsync();
}
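Putting the two together (a hedged usage sketch; error handling omitted):

var token = await GetAccessTokenAsync();
using (var stream = await GetBlobWithSdk(token))
using (var reader = new StreamReader(stream))
{
    var content = await reader.ReadToEndAsync();
}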
SAS keys are the correct way to secure and grant access to your Blob Storage. Contrary to your belief, this also works with a private container. Here's a resource you may find helpful:
http://www.siddharthpandey.net/use-shared-access-signature-to-share-private-blob-in-azure/
Please also review Microsoft's guidelines on securing your Blob Storage. This addresses many of the concerns you outline and is a must-read for any Azure PaaS developer:
https://learn.microsoft.com/en-us/azure/storage/common/storage-security-guide
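As an illustration, here's a hedged sketch of that pattern with the .NET client (the storageAccount variable and the container/blob names are placeholders): the container stays private, and the web-app hands out short-lived read links.

var container = storageAccount.CreateCloudBlobClient().GetContainerReference("private-container");
var blob = container.GetBlockBlobReference("document.pdf");
var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15) // the link stops working after 15 minutes
});
var url = blob.Uri + sas; // return this URL to the client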
I need to load external data (in blob storage) into my Azure data warehouse using PolyBase. I had it working fine when I was using Classic Azure Storage.
Recently, I had to migrate our storage to ARM, and I could not figure out how to set up a firewall rule on the ARM storage account for my Azure data warehouse. If I set the firewall to "All networks" everything works seamlessly, but I cannot leave the storage account wide open.
I tried using nslookup to find the outbound IP for our Azure data warehouse and put that value into the storage firewall; I got a "This request is not authorized to perform this operation." error.
Is there a way I can find the IP address of an Azure data warehouse, or should I use a different approach to make this work?
Any Suggestions are appreciated.
Kevin
Under the section 1.1 Create a Credential, it states:
Don't skip this step if you are using this tutorial as a template for loading your own data. To access data through a credential, use the following script to create a database-scoped credential, and then use it when defining the location of the data source.
-- A: Create a master key.
-- Only necessary if one does not already exist.
-- Required to encrypt the credential secret in the next step.
CREATE MASTER KEY;
-- B: Create a database scoped credential
-- IDENTITY: Provide any string, it is not used for authentication to Azure storage.
-- SECRET: Provide your Azure storage account key.
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH
IDENTITY = 'user',
SECRET = '<azure_storage_account_key>'
;
-- C: Create an external data source
-- TYPE: HADOOP - PolyBase uses Hadoop APIs to access data in Azure blob storage.
-- LOCATION: Provide Azure storage account name and blob container name.
-- CREDENTIAL: Provide the credential created in the previous step.
CREATE EXTERNAL DATA SOURCE AzureStorage
WITH (
TYPE = HADOOP,
LOCATION = 'wasbs://<blob_container_name>@<azure_storage_account_name>.blob.core.windows.net',
CREDENTIAL = AzureStorageCredential
);
Edit (an additional way to access blobs from ADW through the use of SAS):
You can also create a Storage linked service by using a shared access signature. It provides restricted, time-bound access to all or specific resources (blob/container) in the storage account.
A shared access signature provides delegated access to resources in your storage account. You can use a shared access signature to grant a client limited permissions to objects in your storage account for a specified time. You don't have to share your account access keys. The shared access signature is a URI that encompasses in its query parameters all the information necessary for authenticated access to a storage resource. To access storage resources with the shared access signature, the client only needs to pass in the shared access signature to the appropriate constructor or method. For more information about shared access signatures, see Shared access signatures: Understand the shared access signature model.
The full document can be found here.
I am looking for a tool or usage example to generate and view SAS (Shared Access Signatures) for both Azure Block Blobs and Azure File Shares. There are lots of examples for Block Blobs and Containers, but what about Azure File Share SAS examples or tools?
The ability to create a Shared Access Signature on a File Service Share was announced in the latest version of the REST API. You must use Storage Client Library 5.0.0 for that purpose.
First, install this library from Nuget:
Install-Package WindowsAzure.Storage -Version 5.0.0
The process of creating a SAS on a File Service Share is then very similar to creating a SAS on a blob container. Please see the sample code below:
static void FileShareSas()
{
    var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
    var fileClient = account.CreateCloudFileClient();
    var share = fileClient.GetShareReference("share");
    var sasToken = share.GetSharedAccessSignature(new Microsoft.WindowsAzure.Storage.File.SharedAccessFilePolicy()
    {
        Permissions = Microsoft.WindowsAzure.Storage.File.SharedAccessFilePermissions.List,
        SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddDays(1))
    });
}
In the above code, we're creating a SAS with the List permission that expires one day from the current date/time (in UTC).
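To consume the token (a hedged sketch; the account and share names are placeholders):

var sasCredentials = new StorageCredentials(sasToken);
var sasShare = new CloudFileShare(new Uri("https://account.file.core.windows.net/share"), sasCredentials);
// The List permission granted above is enough to enumerate the share root
var items = sasShare.GetRootDirectoryReference().ListFilesAndDirectories();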
Also, if you're looking for a tool to do so, may I suggest you take a look at Cloud Portam (disclosure: I am building this tool). We recently released the functionality to manage SAS on a Share.
As far as I know, in Azure Storage we can delegate access to our storage to a single person using a SAS on a CONTAINER basis.
I need to delegate access on a per-BLOB basis to prevent hotlinking.
We are using ASP.NET MVC. Sorry for my English :)
Edit: And how can a new Azure user create a CDN?
You can create a SAS on a blob; the approach is similar to the way you create a SAS on a blob container. Since you're using ASP.NET MVC, I'm assuming you want to use the .NET Storage Client API to create the SAS. To create a SAS on a blob, just call the GetSharedAccessSignature method on the blob object you have created.
For example, the code below would give you a SAS URL where the user has permission to download a blob:
var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5), // start slightly in the past to allow for clock skew
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
});
return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sas);
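In an MVC controller that could look something like this (a hedged sketch; the action, the storageAccount variable, and the container name are hypothetical):

public ActionResult Download(string fileName)
{
    var container = storageAccount.CreateCloudBlobClient().GetContainerReference("images");
    var blob = container.GetBlockBlobReference(fileName);
    var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
    });
    // Redirect to the short-lived SAS URL; once it expires the link is dead,
    // which is what prevents permanent hotlinking
    return Redirect(blob.Uri + sas);
}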
I wrote a blog post some time ago which describes SAS functionality on blobs and containers in more details: http://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/
Regarding your question about CDN: I believe the functionality to create CDN nodes was removed from the Windows Azure Portal when the new portal was announced. I guess you will need to wait for that functionality to reappear in the portal.