How to configure access to a single blob storage container - Azure

I need to enable one external user to access a single directory in a single container in my data lake, in order to upload some data. From what I see in the documentation, it should be possible to simply use RBAC and ACLs, so that the user can authenticate later on using PowerShell and Connect-AzureAD (or obtain an OAuth2 token).
However, I am having trouble with all those inherited permissions. Once I add a user to my Active Directory, he is not able to see anything unless I give him at least Reader access at the subscription level. That grants him at least Reader permission on every resource in the subscription, which cannot be removed.
Is it possible to configure this access in such a way that my user can only see a single data lake, a single container, and a single folder within that container?

If you want just the one user to access only a single directory/container in your storage account, you should look at Shared Access Signatures or Stored Access Policies instead.
For SAS: https://husseinsalman.com/securing-access-to-azure-storage-part-4-shared-access-signature/
For SAS built on top of Stored Access Policies: https://husseinsalman.com/securing-access-to-azure-storage-part-5-stored-access-policy/
Once you have configured the permissions just for that directory/container, you can send that Shared Access Signature to the user, and he/she can use Azure Storage Explorer to perform file upload/delete and similar actions on your container.
Download Azure Storage Explorer here: https://azure.microsoft.com/en-us/features/storage-explorer/#overview
For how to use Azure Storage Explorer: https://www.red-gate.com/simple-talk/cloud/azure/using-azure-storage-explorer/
More on using Azure Storage Explorer with Azure Data Lake Gen2: https://medium.com/microsoftazure/guidance-for-using-azure-storage-explorer-with-azure-ad-authorization-for-azure-storage-data-access-663c2c88efb
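
If you'd rather script this than click through the portal, here is a minimal sketch of generating a SAS scoped to a single directory, assuming the Python azure-storage-file-datalake package; the account, container, directory names and key are placeholders:

```python
from datetime import datetime, timedelta

from azure.storage.filedatalake import DirectorySasPermissions, generate_directory_sas

# Generate a SAS that is valid only for one directory in one container
# (file system). All names and the key below are placeholders.
sas_token = generate_directory_sas(
    account_name="mydatalake",
    file_system_name="mycontainer",
    directory_name="uploads/external",
    credential="<account-key>",
    permission=DirectorySasPermissions(read=True, write=True, list=True),
    expiry=datetime.utcnow() + timedelta(days=7),
)

# The external user can paste this token into Azure Storage Explorer
# ("Connect via SAS") or append it to the directory URL as a query string.
print(sas_token)
```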

Related

Azure: How to provide limited Access Level to a Container in a Storage Account?

My team and I are using Azure Synapse Analytics to ingest data from a REST API into Azure Data Lake Storage Gen2, in order to create views automatically.
The only way we could manage to do this in our workspace was by first changing the public access level of the container inside our Storage Account to "Container (anonymous read access for containers and blobs)".
Is there any way to avoid doing this, and instead enable this level of access on specific containers for a limited set of users/IPs, while keeping the container "Private (no anonymous access)"?
Yes, you can use Shared Access Signatures for it. You can find more information here:
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview
I agree with @Thiago Custodio and @Nandan; we can use a SAS token and URL to grant limited access to a container in the storage account.
To get a SAS token:
First, open the storage account, then click on Containers.
Click on your container, then click on Shared access tokens.
Select the permissions you want to grant, then generate the token.
Once generated, you can send this token to whomever you want to give access to your container.
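
For reference, the same container-level SAS the portal produces can be generated in code. A sketch assuming Python and the azure-storage-blob package; account name, container name and key are placeholders:

```python
from datetime import datetime, timedelta

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Container-scoped SAS; the account key signs the token but is never shared.
sas_token = generate_container_sas(
    account_name="mystorageaccount",
    container_name="mycontainer",
    account_key="<account-key>",
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.utcnow() + timedelta(hours=24),
)

# Full URL to hand out; the container itself can stay
# "Private (no anonymous access)".
sas_url = f"https://mystorageaccount.blob.core.windows.net/mycontainer?{sas_token}"
print(sas_url)
```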
Alternatively, you can also create a private endpoint:
First, click on Networking, then click on Private endpoint connections, then click (+) to create an endpoint.
The container can then be accessed from a virtual machine integrated with this private endpoint.

How to browse Azure Data Lake Gen2 using a GUI tool

First some background:
I want to facilitate access for different groups of data scientists to Azure Data Lake Gen2. However, we don't want to give them access to the entire data lake, because for security reasons they are not supposed to see all the data; they must be able to see only some limited files/folders. We are doing that by adding the data scientists' AAD groups to the ACLs of the data lake folders. You can refer to the following link for more insight into what I am talking about:
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control
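
For illustration, a minimal sketch of adding an AAD group to a folder's ACL as described above, assuming Python with azure-storage-file-datalake and azure-identity; the account, container, folder path and group object ID are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
directory = service.get_file_system_client("mycontainer").get_directory_client("projects/team-a")

group_oid = "00000000-0000-0000-0000-000000000000"  # placeholder AAD group object ID

# Grant read+execute on the folder and everything already inside it ...
directory.update_access_control_recursive(acl=f"group:{group_oid}:r-x")
# ... and a default entry so newly created children inherit the same access.
directory.update_access_control_recursive(acl=f"default:group:{group_oid}:r-x")
```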
Now the problem:
Since the data scientists are granted access to a very specific/limited area, they are able to access/browse those folders/files using Azure Databricks (Python commands/code etc.). However, they are not able to browse using Azure Storage Explorer.
So, is there some way for them to browse the data lake using Azure Storage Explorer or some other GUI tool?
Or is it possible to create a custom role for such a scenario and grant it to the data scientists' AAD groups, so that they only have access to the specific area (i.e. a custom role that would only have "execute" access on the ADLS Gen2 file systems)?
As far as I know, there is no way to use an RBAC role to control access to individual folders within a file system (container). When we assign a role to an AAD group, we need to define a scope, and the smallest scope in Azure Data Lake Gen2 is the file system (container). If you just want to control access at that level, you do not need to create a custom role; you can directly use the built-in role Storage Blob Data Reader. If a user has that role, he can read all files in the file system. For more details, please refer to the documentation.
It is not possible to access data via Storage Explorer with only ACL permissions assigned. Unfortunately, you need to use ACLs in combination with an RBAC role assigned at the storage account level (e.g. Reader) to be able to see the storage account itself in Storage Explorer. You can then introduce granular permissions using ACLs on specific containers/folders/files; however, with Reader they will still be able to see the names of all the containers in the storage account (but not the containers' contents, until access is granted via ACLs or a data RBAC assignment at the container level).
As you noticed, the only option for accessing a specific folder/file using only ACL permissions is via code, e.g. PowerShell or Python.
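
For completeness, a sketch of what that code route can look like, assuming Python with azure-identity and azure-storage-file-datalake; the account, container and paths are placeholders:

```python
from azure.identity import InteractiveBrowserCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",
    credential=InteractiveBrowserCredential(),
)
fs = service.get_file_system_client("mycontainer")

# Listing works only from the folder the ACLs allow; listing the root of the
# container would raise an authorization error.
for path in fs.get_paths(path="projects/team-a"):
    print(path.name)

# Reading a single permitted file by its full path also works.
data = fs.get_file_client("projects/team-a/report.csv").download_file().readall()
```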

Protecting assets in BLOB storage using Azure B2C

I have a web app (ASP.NET Core) which authenticates using Azure B2C for user accounts (OIDC). I now want to allow users to access 'protected resources' such as images. My plan is to put these in Azure Blob Storage, but I need to protect them so that only the authorized user can access their own images. I don't want the scenario where anyone who knows the URL of a file can access it; only the logged-in user should.
Is this possible with Azure B2C and Blob storage, and if so, what is the best approach to secure these?
I was thinking of creating a container per user, with their B2C Object ID as the container name, so the structure may look like:
Files/04aaffcc-c725-4ff5-9565-cc2fb3d7b4df/image1.jpg
Files/04aaffcc-c725-4ff5-9565-cc2fb3d7b4df/image2.jpg
Files/04aaffcc-c725-4ff5-9565-cc2fb3d7b4df/movie1.mp4
Files/81f052a1-c8c2-4db5-9872-c16c803d1c3f/image66.jpg
Files/81f052a1-c8c2-4db5-9872-c16c803d1c3f/movie-19.mp4
So I need to restrict access so that only the logged-in user with the correct object ID (e.g. 81f052a1-c8c2-4db5-9872-c16c803d1c3f) can access their own resources (e.g. image66.jpg).
Any ideas on how best to implement this and what constructs Azure supports?
Thanks
I am assuming that users can't access the blob storage files directly. The storage should be abstracted by your service, since the storage and its implementation can change at any time.
I would also have another folder (named images) inside the objectId container, because there might be different types of files in the future.
Then let's say the service is hosted at http://contoso.com. The image URL provided to the user will be http://contoso.com/userImages/image123.jpg.
When someone tries to access the resource, I would read the objectId from the token and grant access accordingly.
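
A minimal sketch of that proxy idea, assuming a Python/Flask service where token-validation middleware has already verified the B2C token and stashed its oid claim; all names, routes and the connection string are illustrative, not the asker's actual setup:

```python
from azure.storage.blob import BlobServiceClient
from flask import Flask, Response, abort, g

app = Flask(__name__)
blob_service = BlobServiceClient.from_connection_string("<connection-string>")

@app.route("/userImages/<image_name>")
def user_image(image_name: str):
    # "oid" is assumed to be set by your token-validation middleware after
    # verifying the B2C token; without it, the request is unauthenticated.
    oid = g.get("oid")
    if not oid:
        abort(401)
    # The caller can only ever reach the container named after their own oid.
    blob = blob_service.get_blob_client(container=oid, blob=f"images/{image_name}")
    if not blob.exists():
        abort(404)
    # Content type is hard-coded for brevity; derive it from blob properties
    # in a real service.
    return Response(blob.download_blob().chunks(), content_type="image/jpeg")
```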
You should think about sharing scenarios as well; you will need another table recording who owns a resource and who it is shared with. ObjectId-based containers are not useful in such cases; a flat container with GUIDs as image names, plus a mapping from image names to file names and other properties, might work better.
Use Shared Access Signatures: https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview.

Azure Container Level Access

We have a requirement to store files in Blob Storage. I have created the blob storage account for a company, and there are multiple sites for the same company. We need to restrict site members from seeing other sites' files, so I need an access key scoped at the container level.
The containers will be created dynamically from C#. The credential/access key should be created while creating the container from C#, and that container-level access key/credential will be shared with site members instead of the storage account access key. The storage account key will stay on the application configuration side, hidden from the site members.
How do I get a container-level access key/credential in Azure Blob Storage?
I think SAS could meet your requirements. With a SAS, you can grant clients access to resources in your storage account without sharing your account keys, and you can set the interval over which the SAS is valid and the permissions it grants. For example, a SAS for a blob might grant read and write permissions to that blob, but not delete permissions.
You can create a SAS pointing to one or more resources, with a token that contains a special set of query parameters.
Here are two examples of how to use SAS: SAS examples and create SAS.
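
As a sketch of what that can look like in code, assuming Python and azure-storage-blob rather than the asker's C#; a stored access policy is used so the SAS can be revoked later, and all names and keys are placeholders:

```python
from datetime import datetime, timedelta

from azure.storage.blob import (
    AccessPolicy,
    BlobServiceClient,
    ContainerSasPermissions,
    generate_container_sas,
)

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.create_container("site-a")  # container created dynamically

# Stored access policy: permissions and expiry live server-side, so the SAS
# can be revoked later by deleting or editing the policy.
policy = AccessPolicy(
    permission=ContainerSasPermissions(read=True, write=True, list=True),
    expiry=datetime.utcnow() + timedelta(days=30),
)
container.set_container_access_policy(signed_identifiers={"site-a-members": policy})

# SAS bound to the policy; share this with site members, never the account key.
sas_token = generate_container_sas(
    account_name=service.account_name,
    container_name="site-a",
    account_key="<account-key>",
    policy_id="site-a-members",
)
print(sas_token)
```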
If you still have other questions, please let me know.

Give access to a storage account in Azure

I am new to Azure and trying to learn Azure Storage. Suppose I have created a storage account and stored a few documents, and I want everyone to be able to access my documents. If I give out my URL, everyone can access them, but I also want a few users to be able to access my storage account and upload documents of their own.
Please advise how to achieve this and, if possible, share a link that will be useful to me.
Thanks in Advance.
There are a couple of ways you can do this:
Generate and distribute SAS tokens with read/write privileges. This gives a URL which expires at a given point in time. You can do all this through the portal, through code, or via the context menus in Azure Storage Explorer. Here is a sample of how to do it with code.
You can also assign AAD users to a role which has permission to manipulate resources in the storage account. Here is a list of the current roles so you can select the proper one for your use case. There are preview roles which don't appear to be working.
EDIT: MS just announced the preview of AAD support down to the scope of a container or queue. This is likely the granularity you were looking for.
EDIT 2: Full RBAC support for storage is now available.
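
Tying the two options together, here is a sketch of a user delegation SAS, which is signed with an AAD identity instead of the account key. It assumes Python with azure-storage-blob and azure-identity; names are placeholders, and the signing identity needs an RBAC role that permits generating user delegation keys:

```python
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, ContainerSasPermissions, generate_container_sas

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

# The delegation key is obtained with AAD credentials; no account key involved.
start = datetime.utcnow()
expiry = start + timedelta(hours=12)
delegation_key = service.get_user_delegation_key(start, expiry)

sas_token = generate_container_sas(
    account_name="mystorageaccount",
    container_name="mycontainer",
    user_delegation_key=delegation_key,
    permission=ContainerSasPermissions(read=True, write=True),
    expiry=expiry,
)
print(sas_token)
```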
You can generate a SAS token.
This way you can grant access to others without sharing the account keys.
You can create a SAS token for a specific service (Blob, Queue, File), or an account SAS which allows you to grant permissions across multiple services within the storage account (Queue and Table, for example).
SAS tokens give you granular control over the types of access, including the following (a short code sketch follows the list):
The interval over which the SAS is valid, including the start time and the expiry time.
The permissions granted by the SAS. For example, a SAS for a blob might grant read and write permissions to that blob, but not delete permissions.
An optional IP address or range of IP addresses from which Azure Storage will accept the SAS. For example, you might specify a range of IP addresses belonging to your organization.
The protocol over which Azure Storage will accept the SAS. You can use this optional parameter to restrict access to clients using HTTPS.
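
A sketch of those knobs in code, assuming Python and azure-storage-blob; all values (account, key, IP range) are placeholders:

```python
from datetime import datetime, timedelta

from azure.storage.blob import AccountSasPermissions, ResourceTypes, generate_account_sas

sas_token = generate_account_sas(
    account_name="mystorageaccount",
    account_key="<account-key>",
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, write=True),  # no delete
    start=datetime.utcnow(),
    expiry=datetime.utcnow() + timedelta(hours=8),            # validity window
    ip="203.0.113.0-203.0.113.255",                           # accepted IP range
    protocol="https",                                         # HTTPS-only
)
print(sas_token)
```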
Azure Storage offers these options for authorizing access to secure resources:
Azure Active Directory (Azure AD) integration (Preview) for blobs and queues. Azure AD provides role-based access control (RBAC) for fine-grained control over a client's access to resources in a storage account. For more information, see Authenticating requests to Azure Storage using Azure Active Directory (Preview).
Shared Key authorization for blobs, files, queues, and tables. A client using Shared Key passes a header with every request that is signed using the storage account access key. For more information, see Authorize with Shared Key.
Shared access signatures for blobs, files, queues, and tables. Shared access signatures (SAS) provide limited delegated access to resources in a storage account. Adding constraints on the time interval for which the signature is valid or on the permissions it grants provides flexibility in managing access. For more information, see Using shared access signatures (SAS).
Anonymous public read access for containers and blobs. Authorization is not required. For more information, see Manage anonymous read access to containers and blobs.
By default, all resources in Azure Storage are secured and are available only to the account owner. Although you can use any of the authorization strategies outlined above to grant clients access to resources in your storage account, Microsoft recommends using Azure AD when possible for maximum security and ease of use.
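
For reference, a sketch of how these four options map onto client construction in the Python SDK, assuming azure-storage-blob and azure-identity; the account URL, key and token values are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

url = "https://mystorageaccount.blob.core.windows.net"

aad_client = BlobServiceClient(url, credential=DefaultAzureCredential())  # Azure AD / RBAC
shared_key_client = BlobServiceClient(url, credential="<account-key>")    # Shared Key
sas_client = BlobServiceClient(url, credential="<sas-token>")             # SAS
anonymous_client = BlobServiceClient(url)                                 # anonymous public read
```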
