Block access to Azure storage by IP

I realize that when you create a Shared Access Signature (SAS), you can limit the SAS so that it is only valid from certain IP ranges.
But what I need is to secure the Azure storage account so that even if you have the access keys, you cannot access anything on the account unless the request comes from a set of white-listed IP ranges. Is this at all possible?

As far as I know, Azure does not support IP restrictions on the access keys themselves.
Keep in mind that the storage account name and access keys grant complete access to the account, so sharing them with someone else is not a good choice.
Depending on your needs, using SAS with IP restrictions is your best option. SAS tokens are useful for providing limited permissions on your storage account to clients that should not have the account key; as such, they are a vital part of the security model for any application using Azure Storage.
Azure Storage security guide
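For example, here is a minimal sketch using the azure-storage-blob Python SDK to issue a container-level SAS that Azure Storage will only accept from a given IP range; the account name, key, container name and IP range are placeholders.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

ACCOUNT_NAME = "mystorageaccount"   # placeholder
ACCOUNT_KEY = "<account-key>"       # placeholder
CONTAINER = "customer-data"         # placeholder

# Read/list-only SAS, valid for one hour, accepted only from the given IP range.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    ip="203.0.113.0-203.0.113.255",  # requests from other IPs are rejected
)
print(f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{sas_token}")

Note that this restricts the SAS, not the account key itself; anyone holding the key can still mint new tokens.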

Not sure when this started, but there's an option now:
https://learn.microsoft.com/en-us/azure/storage/common/storage-network-security
You can essentially restrict all communication with a storage account to a set of IPs, CIDR blocks, or Azure VNets.
Go to your storage account > Firewalls and virtual networks > select 'Selected Networks'
Then specify IPs, CIDR blocks or Azure VNets.
NOTE: The moment you turn this on, it blocks all connections that do not match the rules, regardless of any SAS token presented, including access via the Azure Portal (you'll get an access-denied error on the blades showing your containers) and Storage Explorer. If you have apps running that use this account, make sure your rules include them before pressing Save.
If you have configured static site hosting, it will also be affected. This applies to the whole account, not only blobs, so if you have apps accessing tables or files in the account, make sure you add them to the list as well.
If you want to upload, edit, or download anything in your storage account and you are not inside the specified networks, you will have to add your current IP.
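For completeness, here is a hedged sketch of applying the same network rules programmatically with the azure-mgmt-storage package; model and parameter names may differ slightly between SDK versions, and the subscription ID, resource group, account name and IP ranges are placeholders.

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    IPRule,
    NetworkRuleSet,
    StorageAccountUpdateParameters,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "my-rg"                # placeholder
ACCOUNT_NAME = "mystorageaccount"       # placeholder

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Deny by default, then allow only the listed public IPs / CIDR ranges.
client.storage_accounts.update(
    RESOURCE_GROUP,
    ACCOUNT_NAME,
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            default_action="Deny",
            ip_rules=[
                IPRule(ip_address_or_range="203.0.113.10"),
                IPRule(ip_address_or_range="198.51.100.0/24"),
            ],
        )
    ),
)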

Related

Azure: How to provide limited Access Level to a Container in a Storage Account?

My team and I are using Azure Synapse Analytics to ingest data from a REST API into Azure Data Lake Storage Gen2, in order to create views automatically.
The only way we managed to do this in our workspace was by first changing the public access level of the container inside our storage account to "Container (anonymous read access for containers and blobs)".
Is there any way to avoid doing this and grant this level of access to specific containers only, for a limited set of users / IPs, while keeping the container "Private (no anonymous access)"?
[Screenshot: Azure Portal view of the containers inside the storage account]
Yes, you can use Shared Access Signatures for it. You can find more information in here:
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview
I agree with Thiago Custodio and Nandan: you can use a SAS token and URL for limited access to a container in the storage account.
To get a SAS token:
First, open the storage account, then click on Containers.
Click on your container and then on Shared access tokens.
Select the access you want to grant, then generate the token.
You can then send this token to whomever you want to give access to your container (a usage sketch follows).
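As a rough sketch of what the recipient does with that token, assuming it was generated with list and write permissions; the account URL, container name and token value are placeholders.

from azure.storage.blob import ContainerClient

container = ContainerClient(
    account_url="https://mystorageaccount.blob.core.windows.net",  # placeholder
    container_name="customer-data",                                # placeholder
    credential="<sas-token>",                                      # the token generated above
)

# Works only within the permissions and expiry encoded in the SAS.
for blob in container.list_blobs():
    print(blob.name)
container.upload_blob("hello.txt", b"hello from a SAS client")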
Alternatively, you can also create a private endpoint:
First, click on Networking, then click on Private endpoint connections and then (+) to create an endpoint.
The container can then be accessed from a virtual machine integrated with this private endpoint.

Azure Storage Account - Container level access and ACL

Azure Storage Account
In one of our use cases, we would like to use Azure Storage to share data with customers so that they can upload their data to us.
In that context, we are planning to create a storage account per customer, and in order for each customer to access their account, we are planning to share the storage account keys.
We are facing the following issues:
How do we create keys specific to a storage account container, so that a customer can only access that specific container?
Is it possible to have individual keys and access at the container level?
For certain containers, we want to give read-write access.
For others, we want to give only read access.
If I have the storage account keys, does that mean I have access to everything under that storage account?
Is there a better solution to this? Essentially we need an FTP site for customers to upload data.
Sounds like you want to use a shared access signature (SAS):
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview
A shared access signature (SAS) provides secure delegated access to resources in your storage account without compromising the security of your data. With a SAS, you have granular control over how a client can access your data. You can control what resources the client may access, what permissions they have on those resources, and how long the SAS is valid, among other parameters.
You can't have an access key at the container level; access keys apply to the whole storage account.
To give access at a container level (or even finer grain) you need a Shared Access Signature. Documentation here
You can have as many SAS tokens as you need, and you can define each one with the desired permissions (read, read-write, etc.).
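A hedged sketch of that setup with the azure-storage-blob SDK: one container per customer, a read-write SAS for the container they upload into and a read-only SAS for the others. The account name, key, container names and expiry are placeholders.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

ACCOUNT_NAME = "mystorageaccount"   # placeholder
ACCOUNT_KEY = "<account-key>"       # placeholder
EXPIRY = datetime.now(timezone.utc) + timedelta(days=7)

def container_sas(container: str, writable: bool) -> str:
    """Return a SAS token scoped to a single container."""
    permission = (
        ContainerSasPermissions(read=True, list=True, write=True)
        if writable
        else ContainerSasPermissions(read=True, list=True)
    )
    return generate_container_sas(
        account_name=ACCOUNT_NAME,
        container_name=container,
        account_key=ACCOUNT_KEY,
        permission=permission,
        expiry=EXPIRY,
    )

upload_sas = container_sas("customer1-uploads", writable=True)    # read-write
readonly_sas = container_sas("customer1-reports", writable=False) # read-only

The account keys never leave your side; customers only ever receive container-scoped SAS tokens.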

How to whitelist azure API management in storage account's firewall?

I am using an Azure API Management service to serve a small API that accesses table storage in my storage account, using the Table storage REST API (e.g. https://learn.microsoft.com/en-us/rest/api/storageservices/query-entities).
I had no problem accessing the table storage using Shared Key Lite authorization with a little script in the policies, but due to business needs I had to restrict access to the storage account.
For cost reasons I cannot put the APIM instance inside the VNet (neither in external nor internal mode), so I need to find another way to reach the storage account.
I have tried adding the APIM public IP to the firewall exceptions, but requests still return 403 Forbidden.
I have added a managed identity with read access to the entire storage account and used the policy expression:
<authentication-managed-identity resource="https://storage.azure.com/"/>
But after digging more into the docs, it seems that Table storage is not supported by managed identities, only Blob and Queue (https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/services-support-managed-identities#azure-storage-blobs-and-queues).
Does anyone have an idea how to access the Table storage REST API? I cannot wrap my head around why IP whitelisting does not work.
There are only two ways to solve this 403 Forbidden error:
1. Put the API inside a VNet and allow access from that VNet.
2. Allow the service's outbound IP: you need to find the outbound IP of the service and add it to the storage firewall's allowed list (a quick verification sketch follows).
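Once the caller's outbound IP is in the allow list, requests signed with the account key succeed again. A hedged sketch using the azure-data-tables SDK that you can run from an allowed machine to verify; the account name, key, table name and filter are placeholders.

from azure.core.credentials import AzureNamedKeyCredential
from azure.data.tables import TableServiceClient

ACCOUNT_NAME = "mystorageaccount"   # placeholder
ACCOUNT_KEY = "<account-key>"       # placeholder

service = TableServiceClient(
    endpoint=f"https://{ACCOUNT_NAME}.table.core.windows.net",
    credential=AzureNamedKeyCredential(ACCOUNT_NAME, ACCOUNT_KEY),
)

# From a non-allowed IP this call fails with 403, matching the behaviour above.
table = service.get_table_client("mytable")   # placeholder table name
for entity in table.query_entities("PartitionKey eq 'customer1'"):
    print(entity)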

Give Access to storage account in azure

I am new to Azure and trying to learn Azure Storage. Suppose I have created a storage account and stored a few documents, and I want everyone to be able to access my documents. If I give out the URL, everyone can access them, but I also want a few users to be able to access my storage account and upload documents of their own.
Please advise how to achieve this and, if possible, share a link that would be useful to me.
Thanks in advance.
There are a couple of ways you can do this:
Generate and distribute SAS tokens with read/write privileges. This will give you a URL that expires at a given point in time. You can do all this through the portal, through code, or by using the context menus within Azure Storage Explorer. Here is a sample of how to do it with code.
You can also assign the AAD users to a role which has permission to manipulate resources in the storage account. Here is a list of current roles so you can select the proper one based on your use case. There are preview roles which don't appear to be working.
EDIT: MS just announced the preview of AAD support down to the scope of a container or queue. This is likely the granularity you were looking for.
EDIT 2 : Full RBAC support for storage is now available
You can generate a SAS token.
This way you can grant access to others without sharing the account keys.
You can create a SAS token for a specific service (Blob, Queue, File) or an account SAS, which allows you to grant permissions across multiple services within the storage account (e.g. Queue and Table).
SAS tokens give you granular control over types of access including:
The interval over which the SAS is valid, including the start time and the expiry time.
The permissions granted by the SAS. For example, a SAS for a blob might grant read and write permissions to that blob, but not delete permissions.
An optional IP address or range of IP addresses from which Azure Storage will accept the SAS. For example, you might specify a range of IP addresses belonging to your organization.
The protocol over which Azure Storage will accept the SAS. You can use this optional parameter to restrict access to clients using HTTPS.
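A minimal sketch tying those four constraints together in an account SAS with azure-storage-blob; the account name, key and IP range are placeholders, and protocol is assumed here to be accepted as a keyword argument.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

sas = generate_account_sas(
    account_name="mystorageaccount",                          # placeholder
    account_key="<account-key>",                              # placeholder
    resource_types=ResourceTypes(container=True, object=True),
    permission=AccountSasPermissions(read=True, write=True, list=True),
    start=datetime.now(timezone.utc),                         # interval: start time
    expiry=datetime.now(timezone.utc) + timedelta(hours=8),   # interval: expiry time
    ip="203.0.113.0-203.0.113.255",                           # accepted source IP range
    protocol="https",                                         # HTTPS only
)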
Azure Storage offers these options for authorizing access to secure resources:
Azure Active Directory (Azure AD) integration (Preview) for blobs and queues. Azure AD provides role-based access control (RBAC) for fine-grained control over a client's access to resources in a storage account. For more information, see
Authenticating requests to Azure Storage using Azure Active Directory (Preview).
Shared Key authorization for blobs, files, queues, and tables. A client using Shared Key passes a header with every request that is signed using the storage account access key. For more information, see
Authorize with Shared Key.
Shared access signatures for blobs, files, queues, and tables. Shared access signatures (SAS) provide limited delegated access to resources in a storage account. Adding constraints on the time interval for which the signature is valid or on permissions it grants provides flexibility in managing access. For more information, see
Using shared access signatures (SAS).
Anonymous public read access for containers and blobs. Authorization is not required. For more information, see
Manage anonymous read access to containers and blobs.
By default, all resources in Azure Storage are secured and are available only to the account owner. Although you can use any of the authorization strategies outlined above to grant clients access to resources in your storage account, Microsoft recommends using Azure AD when possible for maximum security and ease of use.
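A minimal sketch of that recommended Azure AD path with azure-identity and azure-storage-blob: the caller signs in with their own identity and needs a data-plane RBAC role such as Storage Blob Data Contributor on the account or container. The account URL and container name are placeholders.

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# Succeeds only if the signed-in user holds an appropriate data-plane role.
container = service.get_container_client("shared-docs")  # placeholder
container.upload_blob("report.pdf", b"...")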

is it possible to aggregate Azure resources from different subscriptions?

Our team has Windows Azure MSDN - Visual Studio Premium subscriptions for all our devs. I have been taking advantage of the $100 per month allowance and am building more infrastructure in the cloud.
However, I would like other members of our team to be able to access some of these assets. I am quite new to the Azure infrastructure, so this might be a dumb question, but can they access my blobs? And can I control exactly who can access my blobs?
They can obviously RDP into my VMs; that's not an issue. I assume they can also hit my VMs via the IP address inside Azure, etc. However, I am more interested in the blobs, mostly because I am starting to upload a lot of utility data (large sample datasets, common software we all install, etc.) and I would like to avoid all of us having to upload it again for each subscription.
As of today (11/8/2013), you cannot "pool" MSDN resources, meaning you can't have 4 subscriptions add up to $400/month and consume cloud services à la carte.
You can have one admin (or several) for multiple subscriptions; this will allow you to view the different subscriptions in the portal and manage them in a single spot.
You can also have different deployment profiles, so one Visual Studio instance can deploy to different Azure accounts.
Specific to your question: there are storage account access keys, and if you share the storage account name and key, then yes, they can access your data located there.
Yes, it is possible to control access to your blobs by using SAS (Shared Access Signatures).
SAS grants granular access to containers, blobs, tables, and queues.
These should be good resources to start with:
Manage Access to Windows Azure Storage Resources
Create and Use a Shared Access Signature
However, I would like other members of our team to access certain of the assets. I am quite new to the Azure infrastructure, so this might be a dumb question. But can they access my blobs? and can I control exactly who can access my blobs?
To answer this question specifically: yes, your team members can access the data stored in any blob storage account in any of your subscriptions. There are two ways to give them access to blob storage:
By giving them the account name/account key: with these, they get full access to the storage account and essentially become owners of that storage account.
By using a Shared Access Signature: if you want to give them restricted access to blob storage, use SAS as described by Dan Dinu. SAS basically gives you a URL with which anyone in possession of that URL can access the storage (by writing some code); however, it is not possible to identify which user accessed which storage. For that you would need to build something of your own (a sketch of the basic SAS-URL flow follows).
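A hedged sketch of that SAS-URL flow with azure-storage-blob: the subscription owner generates a read-only blob SAS URL, and a teammate downloads the blob with nothing but that URL. The account, key, container and blob names are placeholders.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobClient, BlobSasPermissions, generate_blob_sas

ACCOUNT_NAME = "mystorageaccount"   # placeholder
ACCOUNT_KEY = "<account-key>"       # placeholder

sas = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="shared-datasets",            # placeholder
    blob_name="sample.zip",                      # placeholder
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=1),
)
sas_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/shared-datasets/sample.zip?{sas}"

# The teammate needs only the URL, not the account key.
data = BlobClient.from_blob_url(sas_url).download_blob().readall()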
