How to create a custom RBAC/ABAC role in Azure? - azure

The requirement is to create an access package with a few roles so that users can perform the following activities:
Read & write access to data stored in a given blob container (the 'abc' blob container).
A role to access Azure Data Factory to build pipelines and process & load the data to a staging area (a blob container or SQL Server).
A role with DDL, DML, and execute permissions to access the data/databases in the SQL Server environment.
I was referring to Azure RBAC and built-in roles but was unable to get a clear idea considering the above points.
My question is: are there any built-in roles for this, or do I need to create a custom role? And how do I create a custom role (for the above requirements) considering baseline security?
Is there any way to look up additional actions, by referring to which I can write custom JSON definitions?
Also, are RBAC roles possible for SQL Server in a VM? If yes, how?
Additionally, if I have both a PaaS instance of SQL Server and a VM instance of SQL Server (that is, SQL Server in a VM), how will the RBAC roles be managed for both?

According to your requirements, please go through the workarounds below and see if they help:
Read & write access to data stored in a given blob container ('abc' blob container).
You can make use of a built-in role like Storage Blob Data Contributor, which allows operations such as read, write, and delete on Azure Storage containers and blobs. If you want to know more detail, go through this reference.
Role to access Azure Data Factory to build pipelines, process & load the data to a staging area (to a blob container or SQL Server).
You can make use of a built-in role like Data Factory Contributor, which allows you to create and manage data factories as well as child resources within them. Those child resources include pipelines, datasets, linked services, and so on. With this role, you can build pipelines and process and load the data. If you want to know more detail, go through this reference.
DDL & DML and execute permission role to access the data/databases in the SQL Server environment.
You can make use of a built-in role like SQL Server Contributor, which allows you to manage SQL servers and databases. If you want to know more detail, go through this reference.
If you want to create a custom role for all of these, make sure you have the Owner or User Access Administrator role on the subscription. You can create a custom role in three ways:
Clone a role – you can start from an existing role and modify its permissions by adding and removing them according to your needs.
Start from scratch – here you add all the permissions you need manually, picking them from their resource providers and excluding the permissions you don't need.
Start from JSON – here you upload a JSON file in which you list all needed permissions in the Actions property and excluded permissions in the NotActions property. If the permissions are data-plane permissions, add them to DataActions and NotDataActions as needed. In AssignableScopes, you include the scopes where the role should be available, i.e., a subscription or resource group as required.
Considering baseline security, it is always suggested to grant read permissions only. But as you need write permission for the blob container and for building pipelines, you can add only those (read/write) to the Actions section and put the rest (delete) in the NotActions section.
If you want additional actions, simply include those permissions in the Actions section of the JSON file, and make sure to grant read permission on resource groups.
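As background, Azure computes a role's effective control-plane permissions as Actions minus NotActions (and data-plane permissions as DataActions minus NotDataActions), with * wildcards in the operation strings. A minimal Python sketch of that subtraction, using fnmatch as a stand-in for the real wildcard matching (a simplification, not Azure's actual evaluator):

```python
from fnmatch import fnmatch

def is_allowed(operation, actions, not_actions):
    """Effective permission = matches some Actions pattern
    and does not match any NotActions pattern."""
    granted = any(fnmatch(operation.lower(), p.lower()) for p in actions)
    denied = any(fnmatch(operation.lower(), p.lower()) for p in not_actions)
    return granted and not denied

actions = ["Microsoft.Storage/storageAccounts/blobServices/containers/*"]
not_actions = ["Microsoft.Storage/storageAccounts/blobServices/containers/delete"]

print(is_allowed("Microsoft.Storage/storageAccounts/blobServices/containers/read",
                 actions, not_actions))   # True: read is granted by the wildcard
print(is_allowed("Microsoft.Storage/storageAccounts/blobServices/containers/delete",
                 actions, not_actions))   # False: delete is carved out by NotActions
```

This is also why NotActions is not a deny list: it only narrows what this role grants, and another role assignment can still grant the excluded operation.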
A sample custom role JSON file for your reference (blob delete is excluded via notDataActions, in line with the read/write-only baseline above):
{
  "assignableScopes": [
    "/"
  ],
  "description": "Combining all 3 requirements",
  "id": "/subscriptions/{subscriptionId}/providers/Microsoft.Authorization/roleDefinitions/***************************",
  "name": "**********************",
  "permissions": [
    {
      "actions": [
        "Microsoft.Authorization/*/read",
        "Microsoft.Resources/subscriptions/resourceGroups/read",
        "Microsoft.ResourceHealth/availabilityStatuses/read",
        "Microsoft.Resources/deployments/*",
        "Microsoft.Storage/storageAccounts/blobServices/containers/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/write",
        "Microsoft.DataFactory/dataFactories/*",
        "Microsoft.DataFactory/factories/*",
        "Microsoft.Sql/locations/*/read",
        "Microsoft.Sql/servers/*"
      ],
      "notActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/delete",
        "Microsoft.Sql/servers/azureADOnlyAuthentications/delete",
        "Microsoft.Sql/servers/azureADOnlyAuthentications/write"
      ],
      "dataActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action",
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action"
      ],
      "notDataActions": [
        "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete"
      ]
    }
  ],
  "roleName": "Custom Role Contributor",
  "roleType": "CustomRole",
  "type": "Microsoft.Authorization/roleDefinitions"
}
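Since a stray trailing comma makes the file invalid JSON and the portal/CLI will reject it, a quick local sanity check before uploading can save a round trip. A minimal sketch using only the Python standard library (the required-property list is a pared-down example, not Azure's full schema):

```python
import json

def check_role_definition(text):
    """Parse a role definition and verify a few properties Azure expects."""
    role = json.loads(text)  # raises ValueError on e.g. trailing commas
    for key in ("assignableScopes", "permissions", "roleName"):
        if key not in role:
            raise KeyError(f"missing required property: {key}")
    return role

sample = """{
  "assignableScopes": ["/"],
  "roleName": "Custom Role Contributor",
  "permissions": [{"actions": ["Microsoft.Sql/servers/*"]}]
}"""
role = check_role_definition(sample)
print(role["roleName"])  # Custom Role Contributor
```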
Reference:
Azure custom roles - Azure RBAC | Microsoft Docs

Related

ADLS Gen2 --> ACL on a folder level

I have a question regarding permissions for ADLS Gen2.
Short description:
I have a Gen2 storage account and created a container.
Folder Structure looks something like this
StorageAccount1
--->Container1
--->Folder1
--->Files 1....n
Also I have a service principal from a customer.
Now I have to provide the customer write-only permission to Folder1 only (they should not be able to delete files inside Folder1).
I have assigned the service principal the below permissions in the access control list:
Container1 --> Execute
Folder1 --> Write , Execute
With this the customer can now put data into Folder1, but how do I prevent them from deleting any files inside it? (I don't want to use SAS.)
Or is there any other way, other than ACLs?
Please help :)
ACLs for ADLS Gen2
Please check whether the below works for you.
ACLs are the way to control access at the file and folder level, unlike the other mechanisms, which apply at the container level.
Best practice is to restrict access using Azure RBAC at the storage account/container level, which has several Azure built-in roles that you can assign to users, groups, service principals, and managed identities, and then combine that with ACLs for more restrictive control at the file and folder level.
E.g., Storage Blob Data Contributor has read/write/delete permission. Assign an Azure role
If the built-in roles don't meet the specific needs of your organization, you can create your own Azure custom roles.
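This is also why ACLs alone cannot express "write but not delete": under the POSIX-style model used by ADLS Gen2, deleting a file is governed by the parent folder's write + execute bits, not by the file's own ACL, so the same write bit that lets the customer upload also lets them delete. A small standard-library sketch of that rule (a simplification of the documented permission requirements):

```python
def can_create(parent_perms):
    """Creating a file needs write + execute on the parent folder."""
    return "w" in parent_perms and "x" in parent_perms

def can_delete(parent_perms):
    """Deleting a file is ALSO decided by the parent folder's
    write + execute bits, not by the file's own ACL."""
    return "w" in parent_perms and "x" in parent_perms

folder1 = "wx"  # the ACL granted to the service principal in the question
print(can_create(folder1))  # True: uploads work
print(can_delete(folder1))  # True: and so do deletes
```

Hence the workaround below of layering an RBAC custom role with the delete data action excluded on top of the ACLs.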
Reference
To assign roles, you must hold a role that has role-assignment write permission, such as Owner or User Access Administrator, at the scope at which you are trying to assign the role.
To create a custom role with customized permissions:
Create a new file C:\CustomRoles\customrole1.json like the example below. The ID should be set to null on initial role creation, as a new ID is generated automatically.
{
  "Name": "Restrict user from delete operation on Storage",
  "ID": null,
  "IsCustom": true,
  "Description": "This role will restrict the user from delete operation on the storage account. However, customer will be able to see the storage account, container, blob.",
  "Actions": [
    "Microsoft.Storage/storageAccounts/read",
    "Microsoft.Storage/storageAccounts/blobServices/containers/read",
    "Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey/action"
  ],
  "NotActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/delete"
  ],
  "DataActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action",
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action"
  ],
  "NotDataActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete"
  ],
  "AssignableScopes": [
    "/subscriptions/dxxxx7-xxxx"
  ]
}
Using the above role definition, run the below PowerShell command to create the custom role:
New-AzRoleDefinition -InputFile "C:\CustomRoles\customrole1.json"
See the references below for details.
How to restrict the user from upload/download or delete blob in
storage account - Microsoft Tech Community
Azure built-in roles - Azure RBAC | Microsoft Docs
Also consider enabling soft delete so that deletes can be restored if a role does end up with delete permission.
Though you mentioned not wanting to use it, just in case: a shared access signature (SAS) can be used to restrict access to a blob container or an individual blob. A folder in blob storage is virtual, not a real folder. You may refer to the suggestion mentioned in this article.
References
permissions-are-required-to-recursively-delete-a-directory-and-its-contents
Cannot delete blobs from ADLS gen2 when connected via Private Endpoint - Microsoft Q&A
Authorize with Azure Active Directory (REST API) - Azure Storage | Microsoft Docs

Provide Azure VM access to multiple Storage Accounts

I have two storage accounts, stinboundclient1 & stinboundclient2; the prefix "stinbound" is common to both. Inside each storage account there are containers for each environment (dev, test, prod). Now I have a dev virtual machine (DevVM) and it needs access to only the "dev" container of both storage accounts. What is the best way to provide read/contributor access to the VM using Azure Policy, a custom role, or any other approach?
Please do not suggest the manual way of granting RBAC permissions to the VM, because it is a tedious task to grant access container by container, as eventually there will be 30-40 clients' storage accounts.
Storage Account & Containers:
stinboundclient1/dev
stinboundclient1/test
stinboundclient1/prod
stinboundclient2/dev
stinboundclient2/test
stinboundclient2/prod
DevVM needs access to stinbound*/dev
Similarly, TestVM and ProdVM need access to their respective containers:
TestVM needs access to stinbound*/test
ProdVM needs access to stinbound*/prod
It seems to me that what you are looking for is actually what Microsoft calls Attribute-based Access Control (ABAC).
That way, you can grant access at a scope and add a particular condition for this access to be effective, based on the name of the container, a tag being present, etc.
This feature is still in Preview though.
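To illustrate the idea (this is a model of the concept, not Azure's actual condition evaluator), an ABAC-style check can be thought of as an ordinary RBAC role assignment plus a predicate over resource attributes; the container-name condition below is a hypothetical example matching the dev/test/prod scenario:

```python
def abac_allows(assignment, operation, attributes):
    """Access requires both the RBAC action match and the
    assignment's condition to hold on the resource attributes."""
    if operation not in assignment["actions"]:
        return False
    condition = assignment.get("condition")
    return condition is None or condition(attributes)

# One assignment at storage-account scope, conditioned on container name,
# instead of one assignment per container.
assignment = {
    "actions": {"read", "write"},
    "condition": lambda attrs: attrs["container"] == "dev",
}

print(abac_allows(assignment, "write", {"container": "dev"}))   # True
print(abac_allows(assignment, "write", {"container": "prod"}))  # False
```

The practical payoff is that one conditioned assignment per VM replaces 30-40 per-container assignments.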

Securing an Azure Function

I'm trying to apply the least-privilege principle to an Azure Function. What I want is to make a function app have only read access to, for example, a storage queue. What I've tried so far is:
Enable managed identity in the FunctionApp
Create a role that only allows read access to the queues (role definition below)
Go to the storage queue IAM permissions, and add a new role assignment, using the new role and the Function App.
But it didn't work. If I try to write to that queue from my function (using an output binding), the item is written, when I expected a failure. I've also tried using the built-in role "Storage Queue Data Reader (Preview)" with the same result.
What's the right way to add/remove permissions of a Function App?
Role definition:
{
  "Name": "Reader WorkingSA TestQueue Queue",
  "IsCustom": true,
  "Description": "Read TestQueue queue on WorkingSA storage account.",
  "Actions": ["Microsoft.Storage/storageAccounts/queueServices/queues/read"],
  "DataActions": [
    "Microsoft.Storage/storageAccounts/queueServices/queues/messages/read"
  ],
  "NotActions": [],
  "NotDataActions": [],
  "AssignableScopes": [
    "/subscriptions/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/resourceGroups/TestAuth-dev-rg"
  ]
}
@anirudhgarg has pointed the right way.
The managed identity and RBAC you set make a difference only when you use a managed identity access token to reach the Storage service in the function app. Those settings have no effect on function bindings, as a binding internally connects to Storage using a connection string. If you haven't set the connection property on the output binding, it uses the AzureWebJobsStorage app setting by default.
To be more specific, a connection string has nothing to do with the Azure Active Directory authentication process, so it can't be influenced by AAD configuration. Hence, if a function takes advantage of a Storage account connection string (e.g. uses a Storage-related binding), we can't limit its access with those settings. Likewise, no connection string usage means no access.
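A standard-library sketch of the distinction the answer is drawing (the function and names are illustrative, not part of the Functions runtime): which credential a client is built from decides whether RBAC is ever consulted.

```python
def auth_mode(connection_string=None, managed_identity_token=None):
    """A binding configured with a connection string authenticates with
    the account key: Storage never sees an AAD identity, so role
    assignments (RBAC, including custom roles) are never evaluated."""
    if connection_string is not None:
        return "shared-key"   # RBAC bypassed entirely
    if managed_identity_token is not None:
        return "aad"          # RBAC and custom roles apply
    return "unauthenticated"

print(auth_mode(connection_string="DefaultEndpointsProtocol=https;..."))  # shared-key
print(auth_mode(managed_identity_token="eyJ..."))                         # aad
```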
Update for using a SAS token
If the queue mentioned is used in a queue trigger or input binding, we can restrict the function to read-and-process (get message then delete) access; this is where a SAS token comes in.
Prerequisites:
The queue lives in a storage account other than the one specified by the AzureWebJobsStorage app setting. AzureWebJobsStorage requires a connection string offering full access with the account key.
The function app is on runtime 2.0. Check it under Function app settings > Runtime version: 2.xx (~2). In 1.x, more permissions are required, as with AzureWebJobsStorage.
Then get a SAS token from the portal and put it in the app settings.

Run an arbitrary Azure provider operation

In the Azure CLI resource manager, we can list providers and their operations.
azure provider list
azure provider operations show Microsoft.Web/sites/*
How do we run one of the listed operations? For instance, how would we run this:
Operation : Microsoft.Web/sites/sourcecontrols/web/Read
OperationName : Get Web App's source control configuration
ProviderNamespace : Microsoft Web Apps
ResourceName : Web App Source Control
Description : Get Web App's source control configuration settings.
The purpose of azure provider operations show is to display the operations that are supported by the various providers so that you can use them when creating custom role-based access control (RBAC) roles. They are not actual commands or endpoints that can be executed.
To create a custom RBAC role, you first create a JSON file describing the role and the operations allowed by the role, then pass the file to azure role create.
More details here: https://azure.microsoft.com/en-us/documentation/articles/role-based-access-control-manage-access-azure-cli/#create-a-custom-role
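To illustrate that workflow (the operation strings are a sample subset and the role name is made up), a sketch that filters a provider's listed operations into a role definition's Actions list:

```python
import json

# A few operation strings as returned by the provider listing (sample subset).
operations = [
    "Microsoft.Web/sites/Read",
    "Microsoft.Web/sites/sourcecontrols/web/Read",
    "Microsoft.Web/sites/Write",
    "Microsoft.Web/sites/Delete",
]

# Keep only the read operations to compose a read-only custom role.
actions = [op for op in operations if op.endswith("/Read")]

role = {
    "Name": "Web App Reader (example)",
    "IsCustom": True,
    "Actions": actions,
    "NotActions": [],
    "AssignableScopes": ["/subscriptions/{subscriptionId}"],
}
print(json.dumps(role, indent=2))
```

The resulting JSON is what you would then pass to the role-creation command.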

Allow users to start/stop particular Azure VMs

Our sales team will be using Azure VMs to do sales demos. I would like to be able to allow certain people to start/stop their own VMs at will. I've seen that you can add people as administrators in the management portal, but this seems to give them access to our whole subscription. I'd like to be able to manage this without having everyone create their own subscription.
Example scenario:
Person A is able to start/stop Person A's dedicated VM.
Person B is able to start/stop Person B's dedicated VM.
etc.
In order to allow a user to start and stop a virtual machine, you need to create a custom role with the right permissions.
In this answer I will list the steps to follow to get this result using the Azure command-line interface. You can do the same using PowerShell or the Azure REST API (find more information about the PowerShell commands at this link and the Azure REST API at this link).
Create a JSON file with the following content (let us name it newRole.json):
{
  "Name": "Virtual Machine Operator",
  "IsCustom": true,
  "Description": "Can deallocate, start and restart virtual machines.",
  "Actions": [
    "Microsoft.Compute/*/read",
    "Microsoft.Compute/virtualMachines/start/action",
    "Microsoft.Compute/virtualMachines/restart/action",
    "Microsoft.Compute/virtualMachines/deallocate/action"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/11111111-1111-1111-1111-111111111111"
  ]
}
A short explanation of each field of the JSON file:
Name: the name of the new role. This is the name that will be shown in the Azure portal.
IsCustom: specifies that it is a user-defined role.
Description: a short description of the role; it is shown in the Azure portal as well.
Actions: the list of actions that can be performed by a user assigned this role. Respectively, each line allows the user to:
See the list of virtual machines (not all of them; we will see later how to specify which VMs will be visible to each user)
Start one of the virtual machines among those in the list
Restart one of the virtual machines among those in the list
Deallocate one of the virtual machines among those in the list
NotActions: the list of actions that can't be performed by a user assigned this role. In this case the list is empty; in general it has to be a subset of the previous field.
AssignableScopes: the set of your subscriptions where the role is available. Each subscription ID is prefixed by the /subscriptions/ string. You can find the ID of your subscription by accessing the Subscriptions menu (identified by this icon)
and copying the value under the SUBSCRIPTION ID column.
Log in to your Azure account with the Azure CLI by executing the command az login. More information about how to install the Azure CLI and perform the login process is available here and here, respectively.
Create the new role by executing the command az role definition create --role-definition newRole.json.
Access the portal and select the virtual machine that has to be powered on and off by the user of your choice.
After you select the machine, select Access control (IAM).
From the new blade, select Add.
Fill in the fields as follows:
Role: select the role you just created, in our case Virtual Machine Operator
Assign access to: Azure AD user, group, or application
Select: the email associated with the account that needs to start/restart/stop the VM
Press Save.
After these operations, when the user accesses the portal she will see the selected VM in her list of virtual machines. If she selects the virtual machine, she will be able to start/restart/stop it.
Open your VM in portal.azure.com.
Navigate to Access control (IAM) → Role assignments and click Add role assignment.
Select the standard role Virtual Machine Contributor.
Leave Assign access to at its default (Azure AD user, group, ...).
In the Select field, enter the email of the new limited user and select the guest account.
Save.
That's all.
I've created a custom role to allow this. I've tested it and it works.
You have to start with the "Virtual Machine User Login" role and then add the additional permissions. This does of course give the user login permissions as well, but I assume that if you are allowing them to start and stop the VM then you would also want to give them the ability to log in.
Via the GUI:
1. Add a custom role.
2. Select "Clone a role"; the role to clone is "Virtual Machine User Login".
3. Click Next.
4. Select Add permissions.
5. Scroll down to "Microsoft.Compute/virtualMachines" and tick:
"Microsoft.Compute/virtualMachines/start/action"
"Microsoft.Compute/virtualMachines/powerOff/action"
"Microsoft.Compute/virtualMachines/deallocate/action"
6. Click Next, select the subscription, Next, Next, then "Create".
All permissions for the role:
Action Microsoft.Network/publicIPAddresses/read
Action Microsoft.Network/virtualNetworks/read
Action Microsoft.Network/loadBalancers/read
Action Microsoft.Network/networkInterfaces/read
Action Microsoft.Compute/virtualMachines/*/read
Action Microsoft.Compute/virtualMachines/start/action
Action Microsoft.Compute/virtualMachines/powerOff/action
Action Microsoft.Compute/virtualMachines/deallocate/action
DataAction Microsoft.Compute/virtualMachines/login/action
Here's the JSON:
{
  "properties": {
    "roleName": "VM_Operator_test",
    "description": "",
    "assignableScopes": [
      "/subscriptions/exampesubscription/resourceGroups/EXAMPLE_RG"
    ],
    "permissions": [
      {
        "actions": [
          "Microsoft.Network/publicIPAddresses/read",
          "Microsoft.Network/virtualNetworks/read",
          "Microsoft.Network/loadBalancers/read",
          "Microsoft.Network/networkInterfaces/read",
          "Microsoft.Compute/virtualMachines/*/read",
          "Microsoft.Compute/virtualMachines/start/action",
          "Microsoft.Compute/virtualMachines/powerOff/action",
          "Microsoft.Compute/virtualMachines/deallocate/action"
        ],
        "notActions": [],
        "dataActions": [
          "Microsoft.Compute/virtualMachines/login/action"
        ],
        "notDataActions": []
      }
    ]
  }
}
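A quick way to reason about what a role like this does and does not allow is to check candidate operations against its action patterns; Python's fnmatch approximates the * wildcard (a simplification of Azure's real matching):

```python
from fnmatch import fnmatch

role_actions = [
    "Microsoft.Compute/virtualMachines/*/read",
    "Microsoft.Compute/virtualMachines/start/action",
    "Microsoft.Compute/virtualMachines/powerOff/action",
    "Microsoft.Compute/virtualMachines/deallocate/action",
]

def allowed(op):
    """True if any of the role's action patterns matches the operation."""
    return any(fnmatch(op, pat) for pat in role_actions)

print(allowed("Microsoft.Compute/virtualMachines/start/action"))     # True
print(allowed("Microsoft.Compute/virtualMachines/delete"))           # False: nothing matches
print(allowed("Microsoft.Compute/virtualMachines/extensions/read"))  # True via the */read wildcard
```

So the sales user can see and power-cycle VMs but cannot delete or create them.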
Currently this is not possible out of the box, though it is possible with some programming. What you see in the Azure portal can be achieved through the Azure Service Management API. What you could do is write an application that consumes this API, and there you could define all the rules.
If you think your sales folks will not mess around, another thing you could do is create some custom PowerShell scripts making use of the Azure PowerShell cmdlets, which they can simply execute to start/stop the VMs.
My recommendation would be to build your own façade that leverages the Azure management API to perform these tasks for you. This allows you to put in place your own controls around access/authorization, as well as rig it to span multiple subscriptions (should this ever prove necessary). This façade could potentially be hosted on a free-tier Azure website.
Through the Azure CLI:
Create the custom-role file "VirtualMachineStartStop.json":
{
  "Name": "Virtual Machine Start Stop Access",
  "IsCustom": true,
  "Description": "Start/Restart/Deallocate virtual machines",
  "Actions": [
    "Microsoft.Storage/*/read",
    "Microsoft.Network/*/read",
    "Microsoft.Compute/*/read",
    "Microsoft.Compute/virtualMachines/start/action",
    "Microsoft.Compute/virtualMachines/restart/action",
    "Microsoft.Compute/virtualMachines/deallocate/action"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/<azure_subscription_id_here>"
  ]
}
Create the role:
az role definition create --role-definition "./VirtualMachineStartStop.json"
Confirm the role creation:
az role definition list --custom-role-only true
