I'm trying to configure access to a Cosmos DB database that contains multiple containers. In our scenario we want different teams of users to have read and write permissions within specific containers only; any attempt to access another container should be denied.
I've been reading about role-based access within Cosmos and it sounds exactly like what we need, so I'm trying to create custom roles for this, but I'm getting confused by the different permissions available.
I can't embed images, but the link below shows the permissions I've found for containers within DocumentDB.
Link to permissions:
There are permissions such as Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/write, but it sounds like these are for the actual maintenance of the containers themselves rather than the data within them. The set below that in the image mentions the throughput of the container, so that doesn't seem right either.
Previous material I'd seen mentioned that these roles could only be created from something like PowerShell. When I saw them within the portal I assumed that might be outdated, but is that still the case?
Essentially, the setup I want is:
Role 1 given access to read and write items in container 1 only
Role 2 given access to read and write items in container 2 only
Also, as another quick side question: are permissions additive? Can I create this role just for the purpose of accessing that container and then add it onto another, more generic role, or would I need to include the basic permissions you would find in something like the Cosmos DB Account Reader Role?
The permissions you have found are for management operations on Cosmos DB resources. The permissions you are looking for are the data-plane ones, which are documented at Configure role-based access control with Azure Active Directory for your Azure Cosmos DB account.
First step is to create a role definition for each container. You can use the built-in role definitions such as Cosmos DB Built-in Data Contributor, or, if you want to limit the role to discrete actions, you can build a custom role definition using az cli, PowerShell or ARM/Bicep.
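As a rough sketch of the az cli route (the role name, file name and account details are placeholders; "AssignableScopes": ["/"] lets the definition be assigned anywhere in the account):

cat > role-definition.json <<'EOF'
{
  "RoleName": "Container1ReadWriter",
  "Type": "CustomRole",
  "AssignableScopes": ["/"],
  "Permissions": [{
    "DataActions": [
      "Microsoft.DocumentDB/databaseAccounts/readMetadata",
      "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/executeQuery",
      "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/readChangeFeed",
      "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*"
    ]
  }]
}
EOF

az cosmosdb sql role definition create \
  --account-name <account-name> \
  --resource-group <resource-group> \
  --body @role-definition.json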
Once you have your role definitions, you can then create role assignments, scoped to each container, for any principal (user, group or service principal) within your AAD tenant.
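For example, to grant the custom role above on container 1 only (the IDs and the scope string are placeholders; the role definition ID is returned when you create the definition):

az cosmosdb sql role assignment create \
  --account-name <account-name> \
  --resource-group <resource-group> \
  --role-definition-id "<role-definition-id>" \
  --principal-id "<aad-object-id-of-user-or-group>" \
  --scope "/dbs/<database-name>/colls/<container1-name>"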
Last step is to initialize our SDK with Azure AD. This is available for our .NET, Java, Python and JS SDKs. You will need to ensure you are using the correct version of the SDK, so it is best to upgrade if you are on an older version. To authenticate via AAD you create a client secret credential; the credential is passed to the Cosmos client when creating a new instance, and the SDK uses it to acquire the AAD tokens.
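A minimal sketch in Python, assuming an AAD app registration whose service principal has one of the container-scoped role assignments above (all values are placeholders):

# pip install azure-cosmos azure-identity
from azure.identity import ClientSecretCredential
from azure.cosmos import CosmosClient

# Placeholder values for the AAD app registration used to authenticate.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-id>",
    client_secret="<client-secret>",
)

# The client exchanges the credential for AAD tokens; no account keys are involved.
client = CosmosClient("https://<account-name>.documents.azure.com:443/", credential=credential)

container = client.get_database_client("<database-name>").get_container_client("<container1-name>")

# Succeeds only if the assigned role grants the item data actions on this container.
container.upsert_item({"id": "1", "<partition-key-field>": "demo"})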
Finally, you will want to use a custom query string when you access your data via the Cosmos Data Explorer. You may also want to restrict access to your data so that it is only possible via AAD. To do this you will need to deploy an ARM template to your account (be sure to do a GET first so you don't accidentally destroy your resources), then add "disableLocalAuth": true to the properties of the databaseAccount resource.
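If you would rather not redeploy the full template, one lighter-weight option (a sketch, not the method described above; placeholder names) is to patch just that property with az resource update:

az resource update \
  --resource-group <resource-group> \
  --name <account-name> \
  --resource-type "Microsoft.DocumentDB/databaseAccounts" \
  --set properties.disableLocalAuth=true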
Related
I am trying to speed up my resource management in the Azure portal, and to do so I need to assign a role to a UserIdentity which will allow it to create and manage databases and containers, in both the SQL databases and the Gremlin database.
I can get it to work by manually adding permissions in the portal, and I can get the reading and writing of data to work using custom roles, but I can't find which role definitions to use, as they seem not to be accessible in custom roles.
I was trying "Microsoft.DocumentDb/databaseAccounts/", which gives the error "The provided data action string [Microsoft.DocumentDb/databaseAccounts/] does not correspond to any valid SQL data action", even though it's listed in the roles.
Any help would be appreciated.
I need to enable one external user to access a single directory in a single container in my data lake, in order to upload some data. From what I see in the documentation, it should be possible to simply use RBAC & ACLs, so that the user can authenticate himself later on using PowerShell and Connect-AzureAD (or obtain an OAuth2 token).
However, I am having trouble with all those inherited permissions. Once I add a user to my Active Directory, he is not able to see anything unless I give him at least Reader access at the subscription level. This gives him at least Reader permission on all the resources in the subscription, which cannot be removed.
Is it possible to configure this access in such a way, that my user is only able to see a single datalake, single container, and a single folder within this container?
If you want just the one user to access only a single directory/container in your storage account, you should rather look at Shared Access Signatures or Stored Access Policies.
For SAS: https://husseinsalman.com/securing-access-to-azure-storage-part-4-shared-access-signature/
For SAS built on top of Stored Access Policies: https://husseinsalman.com/securing-access-to-azure-storage-part-5-stored-access-policy/
Once you have configured the permissions just for that directory/container, you can send that Shared Access Signature to the user and he/she can use Azure Storage Explorer to perform file upload/delete and other actions on your container.
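As a rough sketch, a container-scoped SAS can be generated with az cli like this (account, container, expiry and key are placeholders; the permission letters grant read, write, create and list):

az storage container generate-sas \
  --account-name <storage-account> \
  --name <container> \
  --permissions rwcl \
  --expiry 2030-01-01T00:00Z \
  --account-key <account-key> \
  --output tsv

# Append the returned token to the container URL before sharing it:
# https://<storage-account>.blob.core.windows.net/<container>?<sas-token>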
Download Azure storage explorer here : https://azure.microsoft.com/en-us/features/storage-explorer/#overview
For how to use Azure Storage Explorer : https://www.red-gate.com/simple-talk/cloud/azure/using-azure-storage-explorer/
More on using Azure storage explorer with azure data lake Gen 2 : https://medium.com/microsoftazure/guidance-for-using-azure-storage-explorer-with-azure-ad-authorization-for-azure-storage-data-access-663c2c88efb
This article says that an Azure subscription owner has access to all the resources in the subscription. However, to get access to an Azure SQL database, one must either be a user in the database or be part of the Azure AD admin group.
Can a subscription owner access the database regardless of the SQL security? If so, how?
The article you refer to gives a very high-level overview of the RBAC roles provided in Azure.
It is important to distinguish the built-in roles that give access to the resources themselves (the management plane) from those that give access to the resource data (the data plane).
Many built-in roles give users access to data, for example for Storage and Key Vault.
As for databases, it all depends on the type of database engine you refer to. Each has its own particularities in terms of roles and permissions.
Access to SQL Database is managed right in the SQL server itself. This link provides additional details on how this is done: SQL Database
Other modern database engines, such as Cosmos DB, come with their own Azure built-in roles (just like Key Vault or Storage). See this link to get a better idea of the roles and the permissions assigned to each role: Role-based access control in Azure Cosmos DB
First some background:
I want to facilitate access for different groups of data scientists in Azure Data Lake Gen 2. However, we don't want to give them access to the entire data lake, because for security reasons they are not supposed to see all the data. They must be able to see only some limited files/folders. We are doing that by adding the data scientists' AAD groups to the ACLs of the data lake folders. You can refer to the following link to get more insight into what I am talking about:
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control
Now the problem:
Since the data scientists are granted access to a very specific/limited area, they are able to access/browse those folders/files using Azure Databricks (Python commands/code etc.). However, they are not able to browse using Azure Storage Explorer.
So is there some way for them to browse the data lake using Azure Storage Explorer or some other GUI tool?
Or is it possible to create some custom role for such a scenario and grant that role to the data scientists' AAD groups so that they only have access to the specific area (i.e. a custom role that would only have "execute" access on the ADLS Gen 2 file systems)?
As far as I know, there is no way to use an RBAC role to control access to individual folders in the file system (container), because when we assign a role to an AAD group we need to define a scope, and the smallest scope in Azure Data Lake Gen2 is the file system (container). If you just want to control access at that level, you do not need to create a custom role: you can directly use the built-in role Storage Blob Data Reader. If a user has that role, he can read all files in the file system. For more details, please refer to the documentation.
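For example, assigning that built-in role at container (file system) scope could look like this with az cli (a sketch; the IDs and names are placeholders):

az role assignment create \
  --assignee-object-id <aad-group-object-id> \
  --assignee-principal-type Group \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"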
It is not possible to access data via Storage Explorer with only ACL permissions assigned. Unfortunately, you need to use ACLs in combination with an RBAC role assigned at the Storage Account level (e.g. Reader) to be able to see the Storage Account itself from Storage Explorer. You can then introduce granular permissions using ACLs on specific containers/folders/files; however, with Reader the users will still be able to see the names of all the containers in the Storage Account (though they cannot see a container's contents until access is granted via an ACL or a data RBAC assignment at the container level).
As you noticed, the only option to access a specific folder/file using only ACL permissions is via code, e.g. PowerShell or Python.
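For example, a minimal Python sketch using the azure-storage-file-datalake package (account, container and folder names are placeholders), which works with ACL-only permissions as long as the identity has execute access on the parent folders:

# pip install azure-storage-file-datalake azure-identity
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Sign in with the AAD identity that was added to the folder's ACL.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

fs = service.get_file_system_client("<container>")

# List from the permitted folder down; listing the whole file system
# would fail without broader permissions.
for path in fs.get_paths(path="<folder>"):
    print(path.name)

# Download a single file from that folder.
data = fs.get_file_client("<folder>/<file.csv>").download_file().readall()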
Given an Azure CosmosDB instance that is created from the Azure portal, it is possible to create multiple databases from a shell connection with the following commands:
use someNewDbName;
db.someNewCollectionName.insert({});
With other DB providers that expose MongoDB APIs, it is possible to configure user roles on either a database or collection level (for users that exist on the same DB instance).
For example, with self-hosted MongoDB, the db.createUser() allows the roles parameter which accepts the db option. MongoDB Atlas allows similar operations to be performed through their UI.
Is it possible to do the same with CosmosDB? Within the Azure Portal, selecting the CosmosDB account, then Access control (IAM) and then Roles, leads to a list of built-in roles as well as text saying it is possible to define your own roles, but with no indication as to how to do that.
I was able to create a custom role using PowerShell, following the approach in the links below.
This role was then displayed in the list of available roles under the "Add role assignment" tab.
These links might help you
https://learn.microsoft.com/en-us/azure/role-based-access-control/tutorial-custom-role-powershell
https://learn.microsoft.com/en-us/azure/role-based-access-control/custom-roles
I also tried to create users and roles for an Azure CosmosDB using the MongoDB interface and followed this documentation: https://learn.microsoft.com/en-GB/azure/cosmos-db/secure-access-to-data?tabs=using-primary-key.
It seems, however, that this is simply not supported by the MongoDB interface. I followed the above documentation using the role-based access control approach and eventually ran into the following issue when executing the below command:
az cosmosdb sql role definition create --account-name <some-account> --resource-group <some-resource-group> --body @role-definition.json
(BadRequest) The Database Account [<some-database>] has API type
[MongoDB] which is invalid for processing SQL Role Definitions.
The above is also confirmed by the documentation on the (resource) token-based approach: https://learn.microsoft.com/en-us/rest/api/cosmos-db/permissions (see the first line):
Azure Cosmos DB is a globally distributed multi-model database that supports the document, graph, and key-value data models. The content in this section is for managing permission resources using the SQL API via REST.
Hope that this helps.