ADLS Gen2 --> ACL on a folder level - Azure

I have a question regarding permissions for ADLS Gen2.
Short description:
I have a Gen2 storage account and created a container.
The folder structure looks something like this:
StorageAccount1
---> Container1
     ---> Folder1
          ---> Files 1....n
I also have a service principal from a customer.
Now I have to give the customer write-only permission on Folder1 only (the customer should not be able to delete files inside Folder1).
I have assigned the service principal the below permissions in the access control list:
Container1 --> Execute
Folder1 --> Write, Execute
With this the customer can now put data into Folder1, but how do I prevent them from deleting any files inside it? (I don't want to use SAS.)
Or is there any other way other than ACLs?
Please help :)

Please check if the below approach works for you.
ACLs are the way to control access at the file and folder level, unlike role assignments, which apply at the container level.
The best practice is to restrict access using Azure RBAC at the storage account/container level, which has several Azure built-in roles that you can assign to users, groups, service principals, and managed identities, and then combine that with ACLs for more restrictive control at the file and folder level.
Ex: Storage Blob Data Contributor has read/write/delete permission. See Assign an Azure role.
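For illustration, such a role assignment scoped to a single container can be made with the Azure CLI (a minimal sketch; the subscription, resource group, and principal IDs are placeholders, and note that this particular built-in role also includes delete, which is why a custom role is used further below):
az role assignment create \
    --assignee "<service-principal-object-id>" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/StorageAccount1/blobServices/default/containers/Container1"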
If the built-in roles don't meet the specific needs of your
organization, you can create your own Azure custom roles.
Reference
To assign roles, you must have a role that includes role assignments write permission, such as Owner or User Access Administrator, at the scope at which you are trying to assign the role.
To create a custom role with customized permissions, create a new file C:\CustomRoles\customrole1.json like the example below. The ID should be set to null on initial role creation, as a new ID is generated automatically.
{
  "Name": "Restrict user from delete operation on Storage",
  "ID": null,
  "IsCustom": true,
  "Description": "This role will restrict the user from delete operations on the storage account. However, the customer will be able to see the storage account, container, and blobs.",
  "Actions": [
    "Microsoft.Storage/storageAccounts/read",
    "Microsoft.Storage/storageAccounts/blobServices/containers/read",
    "Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey/action"
  ],
  "NotActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/delete"
  ],
  "DataActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action"
  ],
  "NotDataActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete"
  ],
  "AssignableScopes": [
    "/subscriptions/dxxxx7-xxxx"
  ]
}
Using the above role definition, run the below PowerShell command to create the custom role:
New-AzRoleDefinition -InputFile "C:\CustomRoles\customrole1.json"
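Once created, the custom role still has to be assigned to the customer's service principal at the storage account (or container) scope, for example with the Azure CLI (a sketch; the IDs are placeholders):
az role assignment create \
    --assignee "<service-principal-object-id>" \
    --role "Restrict user from delete operation on Storage" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/StorageAccount1"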
See the references below for details.
How to restrict the user from upload/download or delete blob in storage account - Microsoft Tech Community
Azure built-in roles - Azure RBAC | Microsoft Docs
Also consider enabling soft delete so that deleted data can be restored if the assigned role does include delete permissions.
Though you mentioned you do not want to use it, just in case: a shared access signature (SAS) can be used to restrict access to a blob container or an individual blob. A folder in blob storage is virtual and not a real folder. You may refer to the suggestion mentioned in this article.
References
permissions-are-required-to-recursively-delete-a-directory-and-its-contents
Cannot delete blobs from ADLS gen2 when connected via Private Endpoint - Microsoft Q&A
Authorize with Azure Active Directory (REST API) - Azure Storage | Microsoft Docs

Related

How to create custom RBAC/ABAC role in Azure?

The requirement is to create an access package with a few roles so that the users can perform the below activities:
Read & write access to data stored in a given blob container ('abc' blob container).
A role to access Azure Data Factory to build pipelines and to process & load the data to a staging area (a Blob container or SQL Server).
A role with DDL & DML and execute permissions to access the data/database in the SQL Server environment.
I was referring to Azure RBAC and built-in roles but was unable to get a clear idea considering the above points.
My question is: are there any built-in roles for this, or do I need to create a custom role? And how do I create a custom role (for the above requirements) considering baseline security?
Also, is there a way to find additional actions I can refer to when writing custom JSON definitions?
Another question: are RBAC roles possible for SQL Server in a VM? If yes, how?
Additionally, if I have both a PaaS instance of SQL Server and a VM instance of SQL Server (that is, SQL Server in a VM), how will the RBAC roles be managed for both?
Based on your requirements, please go through the below workarounds and see if they are helpful:
Read & write access to data stored in a given blob container ('abc' blob container).
You can make use of a built-in role like Storage Blob Data Contributor, which allows operations such as read, write, and delete on Azure Storage containers and blobs. If you want to know more detail, go through this reference.
Role to access Azure Data Factory to build pipelines, process & load the data to a staging area (to a Blob container or SQL Server).
You can make use of a built-in role like Data Factory Contributor, which allows operations such as creating and managing data factories, as well as child resources within them. Those child resources include pipelines, datasets, linked services, and so on. With this role, you can build pipelines and process and load the data. If you want to know more detail, go through this reference.
DDL & DML and execute permission role to access the data/database in the SQL Server environment.
You can make use of a built-in role like SQL Server Contributor, which allows operations such as managing SQL Servers and databases. If you want to know more detail, go through this reference.
If you want to create a custom role for all of these, make sure you have the Owner or User Access Administrator role on the subscription. You can create a custom role in 3 ways:
Clone a role – you can take an existing role and modify its permissions by adding and deleting them according to your need.
Start from scratch – here, you must add all the permissions you need manually by picking them from their resource providers and excluding the permissions you don't need.
Start from JSON – here, you can just upload a JSON file that you create separately, including all needed permissions in the Actions variable and excluded permissions in the NotActions variable. If the permissions are related to data, then add them to DataActions and NotDataActions based on your need. In AssignableScopes, you can include the scope where the role should be available, i.e., the subscription or resource group, as per need.
Considering baseline security, it is always suggested to give read permissions only. But as you need write permission for the blob container and for building pipelines, you can add only those (read/write) in the Actions section and the remaining ones (delete) in the NotActions section.
If you want to add additional actions, simply include those permissions in the Actions section of the JSON file and make sure to give read permissions on the resource groups.
A sample custom role JSON file for your reference:
{
  "assignableScopes": [
    "/"
  ],
  "description": "Combining all 3 requirements",
  "id": "/subscriptions/{subscriptionId}/providers/Microsoft.Authorization/roleDefinitions/***************************",
  "name": "**********************",
  "permissions": [
    {
      "actions": [
        "Microsoft.Authorization/*/read",
        "Microsoft.Resources/subscriptions/resourceGroups/read",
        "Microsoft.ResourceHealth/availabilityStatuses/read",
        "Microsoft.Resources/deployments/*",
        "Microsoft.Storage/storageAccounts/blobServices/containers/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/write",
        "Microsoft.DataFactory/dataFactories/*",
        "Microsoft.DataFactory/factories/*",
        "Microsoft.Sql/locations/*/read",
        "Microsoft.Sql/servers/*"
      ],
      "notActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/delete",
        "Microsoft.Sql/servers/azureADOnlyAuthentications/delete",
        "Microsoft.Sql/servers/azureADOnlyAuthentications/write"
      ],
      "dataActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action"
      ],
      "notDataActions": []
    }
  ],
  "roleName": "Custom Role Contributor",
  "roleType": "CustomRole",
  "type": "Microsoft.Authorization/roleDefinitions"
}
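If you prefer scripting over the portal upload, the same kind of definition can also be registered with the Azure CLI (a sketch; the file name is an assumption, and note that the CLI documentation uses the flat Name/Actions/AssignableScopes layout shown in the first answer above, so the portal-style JSON may need to be reshaped first):
az role definition create --role-definition @customrole-combined.json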
Reference:
Azure custom roles - Azure RBAC | Microsoft Docs

Azure Synapse serverless SQL pool - query execution fails

After completing tutorial 1, I am working on tutorial 2 from the Microsoft Azure team to run the following query (shown in step 3). But the query execution gives the error shown below:
Question: What may be the cause of the error, and how can we resolve it?
Query:
SELECT
    TOP 100 *
FROM
    OPENROWSET(
        BULK 'https://contosolake.dfs.core.windows.net/users/NYCTripSmall.parquet',
        FORMAT='PARQUET'
    ) AS [result]
Error:
Warning: No datasets were found that match the expression 'https://contosolake.dfs.core.windows.net/users/NYCTripSmall.parquet'. Schema cannot be determined since no files were found matching the name pattern(s) 'https://contosolake.dfs.core.windows.net/users/NYCTripSmall.parquet'. Please use WITH clause in the OPENROWSET function to define the schema.
NOTE: The path of the file in the container is correct; I actually generated the following query just by right-clicking the file inside the container and generating the script, as shown below:
Remarks:
Azure Data Lake Storage Gen2 account name: contosolake
Container name: users
Firewall settings used on the Azure Data Lake account:
Azure Data Lake Storage Gen2 account is allowing public access (ref):
Container has required access level (ref)
UPDATE:
The owner of the subscription is someone else, and I did not get the option to check the "Assign myself the Storage Blob Data Contributor role on the Data Lake Storage Gen2 account" box described in item 3 of the Basics tab > Workspace details section of tutorial 1. I also do not have permission to add roles, although I am the owner of the Synapse workspace. So I am using the workaround described in Configure anonymous public read access for containers and blobs from the Azure team.
--Workaround
If you are unable to grant Storage Blob Data Contributor, use ACLs to grant permissions.
All users that need access to some data in this container also need to have the EXECUTE permission on all parent folders up to the root (the container). Learn more about how to set ACLs in Azure Data Lake Storage Gen2.
Note:
Execute permission at the container level needs to be set within Azure Data Lake Gen2. Permissions on the folder can be set within Azure Synapse.
Go to the container holding NYCTripSmall.parquet.
--Update
As per your update in the comments, it seems you would have to do the following.
Contact the owner of the storage account and ask them to perform the following tasks (a CLI sketch follows the list):
Assign the workspace MSI to the Storage Blob Data Contributor role on the storage account
Assign you to the Storage Blob Data Contributor role on the storage account
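A minimal sketch of those two assignments with the Azure CLI, using the account name from the tutorial (the subscription, resource group, and principal IDs are placeholders):
# workspace managed identity (MSI)
az role assignment create \
    --assignee "<workspace-msi-object-id>" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/contosolake"
# your own user account
az role assignment create \
    --assignee "<your-user-principal-name>" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/contosolake"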
--
I was able to get the query results following the tutorial doc you have mentioned for the same dataset.
Since you confirm that the file is present and at the right path, refresh the linked ADLS source and publish the query before running, just in case it is a transient issue.
Two things I suspect are:
Try setting Microsoft network routing in the Network Routing settings of the ADLS account.
Check if the built-in pool is online and you have at least Contributor roles on both the Synapse workspace and the storage account (if the credentials used to run the query are not the ones that created the resources).

Provide Azure VM access to multiple Storage Accounts

I have two storage accounts, stinboundclient1 & stinboundclient2, and the storage account prefix "stinbound" is common to both. Inside the storage accounts there are containers for each environment (dev, test, prod). Now I have a dev virtual machine (DevVM) and it needs access to only the "dev" container of both storage accounts. What is the best way to provide read/contributor access to the VM using Azure Policy, a custom role, or any other approach?
Please do not suggest the manual way of assigning RBAC permissions to the VM, because granting access container by container is a tedious task, as eventually there will be 30-40 client storage accounts.
Storage Account & Containers:
stinboundclient1/dev
stinboundclient1/test
stinboundclient1/prod
stinboundclient2/dev
stinboundclient2/test
stinboundclient2/prod
DevVM needs access to stinbound*/dev
Similarly, the test and prod VMs need access to their respective containers:
TestVM needs access to stinbound*/test
ProdVM needs access to stinbound*/prod
It seems to me that what you are looking for is actually what Microsoft calls Attribute-based Access Control (ABAC).
That way, you can grant access at a scope and add a condition for that access to be effective based on, for example, the name of the container or the presence of a tag.
This feature is still in Preview though.
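As a rough sketch of what such a conditional assignment could look like with the Azure CLI, assuming the client storage accounts share a resource group and the VM authenticates with its managed identity (the principal ID, subscription, and resource group are placeholders, and the condition uses the ABAC preview grammar, so double-check it against the current docs):
az role assignment create \
    --assignee "<devvm-managed-identity-object-id>" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<client-storage-rg>" \
    --condition "((!(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read'}) AND !(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write'})) OR (@Resource[Microsoft.Storage/storageAccounts/blobServices/containers:name] StringEquals 'dev'))" \
    --condition-version "2.0"
With one assignment at the resource-group scope, the condition restricts blob reads and writes to containers named dev across all of the client storage accounts in that group.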

How to set up ACLs without using RBAC in ADLS Gen2?

Please let me know how you set up ACLs without using RBAC. I tried the below steps:
Created a user in Active Directory
In Storage (Gen2) -> IAM -> gave the Reader role to the user
In Storage Explorer -> right-clicked the root folder -> Manage Access -> gave Read, Write and Execute permissions
Still this is not working. I guess that since I have given the Reader role in IAM, the ACLs are not getting applied.
However, if I do not set read access in IAM, the user is unable to see the storage account when logging in to the Azure portal. Please let me know how I should apply the ACLs.
I have 5 folders. I want to give rwx access on 3 folders to the DE team and r-x access to the DS team.
If you want to use ACLs only to access ADLS Gen2 via the Azure portal, that is not possible, because in the Azure portal users access ADLS Gen2 with the account key by default, so users need permission to list the account key, and ACLs cannot grant that. For more details, please refer to here. If you want to use ACLs, I suggest you use azcopy.
For example
My ADLS Gen2:
FileSystem: test
folder: result_csv
I want to list all files in the folder result_csv.
Configure the ACLs. For more details about ACLs, please refer to here.
Operation            /      result_csv/
list /result_csv     --x    r-x
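A sketch of how those two ACL entries could be applied with the Azure CLI before testing (the service principal's object ID and the account name are placeholders; note that "az storage fs access set" replaces the whole ACL of the path, which is why the owning user/group/other entries are repeated, and if your CLI version does not accept "/" as a path the root ACL can be set in Storage Explorer instead):
# --x on the container (filesystem) root so the folder can be reached
az storage fs access set \
    --acl "user::rwx,group::r-x,other::---,user:<sp-object-id>:--x" \
    -p "/" -f test --account-name <account-name> --auth-mode login
# r-x on the folder so its contents can be listed
az storage fs access set \
    --acl "user::rwx,group::r-x,other::---,user:<sp-object-id>:r-x" \
    -p result_csv -f test --account-name <account-name> --auth-mode login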
Test
azcopy login --tenant-id <your tenant>
azcopy list "your url"

Grant access to Azure Data Lake Gen2 Access via ACLs only (no RBAC)

My goal is to restrict access to an Azure Data Lake Gen2 storage account on a directory level (which should be possible according to Microsoft's promises).
I have two directories, data and sensitive, in a Data Lake Gen2 container. For a specific user, I want to grant read access to the directory data and prevent any access to the directory sensitive.
Following the documentation, I removed all RBAC assignments for that user (on the storage account as well as the data lake container) so that there is no inherited read access on the directories. Then I added a Read ACL entry on the data directory for that user.
My expectation:
The user can directly download files from the data directory.
The user can not access files of the sensitive directory
Reality:
When I try to download files from the data directory I get a 403 ServiceCode=AuthorizationPermissionMismatch
az storage blob directory download -c containername -s data --account-name XXX --auth-mode login -d "./download" --recursive
RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.
I expect that this should work. Otherwise I can only grant access by assigning the Storage Blob Data Reader role, but that applies to all directories and files within a container and cannot be overridden by ACL entries. Did I do something wrong here?
According to my research, if you want to grant a security principal read access to a file, you need to give the security principal Execute permissions on the container and on each folder in the hierarchy of folders that leads to the file. For more details, please refer to the document.
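A sketch of that permission chain with the Azure CLI, using the names from the question (the user's object ID is a placeholder; "update-recursive" merges the given entry into existing ACLs, while "set" on the root replaces the full ACL, and the root entry can alternatively be set in Storage Explorer):
# Execute (traverse) on the container root
az storage fs access set \
    --acl "user::rwx,group::r-x,other::---,user:<user-object-id>:--x" \
    -p "/" -f containername --account-name XXX --auth-mode login
# Read + execute on the data directory and everything below it
az storage fs access update-recursive \
    --acl "user:<user-object-id>:r-x" \
    -p data -f containername --account-name XXX --auth-mode login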
I found that I could not get ACLs to work without an RBAC role. I ended up creating a custom "Storage Blob Container Reader" RBAC role in my resource group with only the permission "Microsoft.Storage/storageAccounts/blobServices/containers/read", so that the role itself does not grant listing or reading of the actual blobs (that part remains controlled by the ACLs).
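A sketch of what such a role definition could look like, shaped like the role-definition JSON earlier in this thread (the exact name, description, and scope are assumptions):
{
  "Name": "Storage Blob Container Reader",
  "IsCustom": true,
  "Description": "List and read containers at the control-plane level only; blob data access is left to ACLs.",
  "Actions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/read"
  ],
  "NotActions": [],
  "DataActions": [],
  "NotDataActions": [],
  "AssignableScopes": [
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
  ]
}
It can then be registered with az role definition create --role-definition @storage-blob-container-reader.json (file name assumed) and assigned to the user at the resource-group or container scope.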
