Deploy multiple linked ARM templates in a private storage container using a single SAS token

I have a master ARM template with three linked child templates located in a private storage account in Azure. To deploy each linked template, I need to generate three blob SAS (shared access signature) tokens (one per template) and append each token to the URI of its template.
Is it possible to access all three ARM templates in the private storage account using a single SAS token (perhaps a container-level token)?

Yes. You can generate a container-level shared access signature URI in Azure Storage Explorer:
Right-click the container -> Get Shared Access Signature, then choose the permissions and expiry you need.
Alternatively, generate the SAS token in the Azure portal, where you can set the allowed services, resource types, permissions, and other details.
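The same container-level SAS can also be produced from the command line. A minimal sketch with the Azure CLI, assuming a hypothetical storage account named mystorage and a container named templates that holds the three linked templates:

# Sketch: generate one read-only, container-level SAS token. All names,
# keys, and dates below are placeholders, not values from the question.
sas=$(az storage container generate-sas \
    --account-name mystorage \
    --name templates \
    --permissions r \
    --expiry 2024-12-31T23:59:00Z \
    --account-key "<storage-account-key>" \
    --https-only \
    -o tsv)

# The single token can then be appended to every linked template's URI:
# https://mystorage.blob.core.windows.net/templates/child1.json?$sas

Because the token is scoped to the container, one value works for all three templates.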

Related

How to use SAS Token in Azure File Copy Pipeline Task?

I am trying to copy ADF ARM templates to a storage account using the Azure File Copy task in my Azure release pipeline. Since the storage account has firewall and networking rules set up, I want to use a SAS token to allow the pipeline agent to copy the files to the storage account.
However, I am not able to find any documentation on how to pass the SAS token as an optional argument (or anywhere else).
The task version is 2.*.
How do I use the SAS token for copying the files?
I changed the file copy task version to 4.*, and then I was able to add --sas-token=$(sasToken) to the Optional Arguments, which worked for me.
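One way to populate that $(sasToken) variable is to generate it in a preceding script step and register it with a logging command. A sketch, assuming hypothetical account and container names:

# Sketch: generate a container SAS with add/create/write permissions for
# the upload, then expose it as a secret pipeline variable named
# sasToken. All names, keys, and dates are placeholders.
sas=$(az storage container generate-sas \
    --account-name mystorage \
    --name adf-arm-templates \
    --permissions acw \
    --expiry 2024-12-31T23:59:00Z \
    --account-key "<storage-account-key>" \
    -o tsv)

# Later tasks in the same job can now reference it as $(sasToken).
echo "##vso[task.setvariable variable=sasToken;issecret=true]$sas"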

az cli command to grant access to a source blob uri

What CLI command can be used to grant access to an Azure image's source blob URI to a specific App Registration's clientId?
Background
An image is created using the CLI. That image includes a source blob URI whose address is given in the portal as:
https://myblobid.blob.core.windows.net/vhds/letters-and-numbers-guid.vhd
Other tools, such as ARM templates, need to be able to access that same source blob URI, but the problem is that the source blob URI is not accessible either to calling ARM templates or when pasted in raw form into the Azure portal.
There seems to be a permission issue.
The ARM templates will be run using a specific clientId associated with an App Registration, which can be assigned any role that you tell us it needs to have.
So what CLI command must be typed to give the clientId we specify the ability to run ARM template commands that can successfully access and use the given source blob URI?
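One plausible approach, sketched here as an assumption rather than a confirmed answer from this thread, is to grant the App Registration's service principal a data-plane role on the storage account that holds the VHD. Every ID, name, and scope below is a hypothetical placeholder:

# Sketch: assign the built-in Storage Blob Data Reader role to the
# service principal identified by <clientId>, scoped to the storage
# account containing the source blob. All values are placeholders.
az role assignment create \
    --assignee "<clientId>" \
    --role "Storage Blob Data Reader" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/myblobid"

A narrower scope (down to a single container) or a different role may be more appropriate, depending on what the templates actually do with the blob.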

Azure - Blob Storage - Mixing Custom Domain with SAS

I have an Azure Storage account that hosts a static web site as explained here. This means the static web site "lives" in a storage container named $web. This web site is accessible via a custom domain. This is currently working as desired. However, there is one file that I want to restrict access to.
There is one file in the $web storage container that I only want individuals to access if a) they have a key and b) it's during a specific time window. My thinking was that I could accomplish this with a Shared Access Signature (SAS). However, while testing this approach, it doesn't seem to work. It seems that everything in the $web storage container is publicly visible whether a SAS has been generated or not. Is this correct?
Is there a way to require a SAS for a file in the $web storage container? Or do I need to "host" the file in a separate storage container (thus removing it from my custom domain)?
Thank you.
When files stored in the $web container are accessed via the primary static website endpoint (for example, https://contosoblobaccount.z22.web.core.windows.net/index.html), they are always accessible, whether the container is public or private, so it doesn't matter whether a SAS token is specified or not.
The SAS token only takes effect if the $web container has private access and the file is accessed via the primary blob service endpoint (for example, https://contosoblobaccount.blob.core.windows.net/$web/index.html).
Please refer to this official doc for more details.
So for your purpose, you should put the file in another container with private access.
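To illustrate, a time-boxed, read-only SAS URL for a single blob in such a private container could be produced with the Azure CLI; the account, container, and file names below are placeholders:

# Sketch: issue a read-only SAS URL for one blob, valid until the given
# expiry. All names, keys, and dates are hypothetical.
az storage blob generate-sas \
    --account-name contosoblobaccount \
    --container-name private-downloads \
    --name restricted-file.pdf \
    --permissions r \
    --expiry 2024-12-31T23:59:00Z \
    --account-key "<storage-account-key>" \
    --https-only \
    --full-uri \
    -o tsv

Only requests carrying this URL (the key) within the validity window (the time limit) succeed, which matches the two conditions in the question.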

Storage destination needs to have a Service SAS, not an Account SAS. What Does This Mean?

Hello, recently I have been trying to use the Microsoft Graph request noted here:
https://learn.microsoft.com/en-us/graph/api/user-exportpersonaldata?view=graph-rest-1.0&tabs=http
As stated there, when you make the request you provide a storage location, described as "This is a shared access signature (SAS) URL to an Azure Storage account, to where data should be exported."
Every time I provide my SAS URL I get this error: "Storage destination needs to have a Service SAS, not an Account SAS."
Can someone please help me understand what this means? The documentation it links to is not clear.
Storage destination needs to have a Service SAS, not an Account SAS
Difference between Account SAS and Service SAS is described here: https://learn.microsoft.com/en-us/rest/api/storageservices/delegate-access-with-shared-access-signature#types-of-shared-access-signatures.
You're providing a SAS URL for the entire account (e.g. https://account.blob.core.windows.net/?sas-parameters), whereas it is expected that you provide a SAS URL for a specific blob container (e.g. https://account.blob.core.windows.net/blob-container/?sas-parameters).
There are two possible solutions:
Create a SAS URL for a specific blob container; in other words, create a service SAS, as the error message tells you to do. You can do so using a tool like Microsoft Azure Storage Explorer.
Insert the blob container name into your account SAS URL so that it looks something like this: https://account.blob.core.windows.net/blob-container/?sas-parameters.
Please note that if you're using an Account SAS, it should at least have Write permission on Object for Blob service.
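For the first option, a container-scoped (service) SAS with write access could also be generated with the Azure CLI; the account and container names below are placeholders:

# Sketch: create a service SAS scoped to one container with write
# permission, then print the full destination URL expected by the
# export request. All names, keys, and dates are hypothetical.
sas=$(az storage container generate-sas \
    --account-name account \
    --name blob-container \
    --permissions w \
    --expiry 2024-12-31T23:59:00Z \
    --account-key "<storage-account-key>" \
    -o tsv)
echo "https://account.blob.core.windows.net/blob-container?$sas"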

Azure blob storage networking rules (IP) for Azure Data Warehouse

I need to load external data (in blob storage) into my Azure Data Warehouse using PolyBase. I had it working fine when I was using Classic Azure Storage.
Recently, I had to migrate our storage to ARM, and I could not figure out how to set up a firewall rule on the ARM storage account for my Azure Data Warehouse. If I set the firewall to "All networks", everything works seamlessly. However, I cannot leave the blob storage wide open.
I tried using nslookup to find the outbound IP for our Azure Data Warehouse and put that value into the storage firewall, but I got a "This request is not authorized to perform this operation." error.
Is there a way I can find the ip address for an Azure Data warehouse? Or I should use different approach to make it work?
Any Suggestions are appreciated.
Kevin
In the PolyBase loading tutorial, under section 1.1 Create a Credential, it states:
Don't skip this step if you are using this tutorial as a template for loading your own data. To access data through a credential, use the following script to create a database-scoped credential, and then use it when defining the location of the data source.
-- A: Create a master key.
-- Only necessary if one does not already exist.
-- Required to encrypt the credential secret in the next step.
CREATE MASTER KEY;
-- B: Create a database scoped credential
-- IDENTITY: Provide any string, it is not used for authentication to Azure storage.
-- SECRET: Provide your Azure storage account key.
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH
    IDENTITY = 'user',
    SECRET = '<azure_storage_account_key>';
-- C: Create an external data source
-- TYPE: HADOOP - PolyBase uses Hadoop APIs to access data in Azure blob storage.
-- LOCATION: Provide Azure storage account name and blob container name.
-- CREDENTIAL: Provide the credential created in the previous step.
CREATE EXTERNAL DATA SOURCE AzureStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://<blob_container_name>@<azure_storage_account_name>.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);
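The <azure_storage_account_key> placeholder above can be fetched with the Azure CLI; the account and resource group names here are placeholders:

# Sketch: retrieve the primary access key of the storage account to use
# as the SECRET in the database-scoped credential. Names are hypothetical.
az storage account keys list \
    --account-name mystorage \
    --resource-group my-rg \
    --query "[0].value" \
    -o tsv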
Edit (an additional way to access blobs from ADW through the use of SAS):
You can also create a storage linked service (for example, in Azure Data Factory) by using a shared access signature. It provides restricted, time-bound access to all or specific resources (blob/container) in the storage account.
A shared access signature provides delegated access to resources in your storage account. You can use a shared access signature to grant a client limited permissions to objects in your storage account for a specified time. You don't have to share your account access keys. The shared access signature is a URI that encompasses in its query parameters all the information necessary for authenticated access to a storage resource. To access storage resources with the shared access signature, the client only needs to pass in the shared access signature to the appropriate constructor or method. For more information about shared access signatures, see Shared access signatures: Understand the shared access signature model.
The full document can be found here.
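For completeness, an account-level SAS of the kind described above could be produced like this; every value below is a placeholder:

# Sketch: generate an account SAS for the Blob service, scoped to
# container and object resource types, with read and list permissions.
# All names, keys, and dates are hypothetical.
az storage account generate-sas \
    --account-name mystorage \
    --services b \
    --resource-types co \
    --permissions rl \
    --expiry 2024-12-31T23:59:00Z \
    --account-key "<storage-account-key>" \
    -o tsv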
