I'm getting the error below when I try to create a VM on Azure:
does not have authorization to perform action 'Microsoft.Storage/register/action' over scope
I created an application on the classic portal and followed this tutorial:
https://azure.microsoft.com/en-us/documentation/articles/resource-group-create-service-principal-portal/
After that I created a resource group on the new portal and assigned the Owner role to this application.
I'm using this Puppet manifest to create the VM:
azure_vm { 'sample':
  ensure         => present,
  location       => 'westus',
  image          => 'canonical:ubuntuserver:14.04.2-LTS:latest',
  user           => 'azureuser',
  password       => 'Password',
  size           => 'Standard_A0',
  resource_group => 'puppettest123',
}
When I run it I get this exact error:
Error: {"error"=>{"code"=>"AuthorizationFailed", "message"=>"The client '5b0bc6d-fcad-4223-8527-a2c9afc2661' with object id '5b0bc6d-fcad-4223-8527-a2c9afc2661' does not have authorization to perform action 'Microsoft.Storage/register/action' over scope '/subscriptions/5ad96a9-45de-4fe1-91e8-2514dd5e6a9'."}}
Error: /Stage[main]/Main/Azure_vm[sample]/ensure: change from absent to present failed: {"error"=>{"code"=>"AuthorizationFailed", "message"=>"The client '5b0bc6d-fcad-4223-8527-a2c9afc2661' with object id '5b0bc6d-fcad-4223-8527-a2c9afc2661' does not have authorization to perform action 'Microsoft.Storage/register/action' over scope '/subscriptions/5ad96a9-45de-4fe1-91e8-2514dd5e6a9'."}}
I'm using the puppetlabs/azure module: https://forge.puppet.com/puppetlabs/azure
Any ideas on how I can fix this issue?
I fixed the problem using this command:
azure role assignment create --objectId 7dbc8265-51ed-4038-8e13-31948c7f4ce7 -o Owner -c /subscriptions/{subscriptionId}/
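For what it's worth, the action named in the error, Microsoft.Storage/register/action, is the resource provider registration the tooling attempts on first use. A rough equivalent with the current az CLI, reusing the same object ID and subscription placeholder (an untested sketch):
az role assignment create --assignee-object-id 7dbc8265-51ed-4038-8e13-31948c7f4ce7 --role Owner --scope /subscriptions/{subscriptionId}
Alternatively, an account that already has subscription-level rights can pre-register the provider once, so the service principal no longer needs */register/action:
az provider register --namespace Microsoft.Storage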
Rukmini, the SP has the Contributor role on the subscription; not sure why it is erroring, and why it worked after adding it in a Reader role?
Here is a snapshot
I am trying some operations from the az CLI (Terraform commands) and getting the following error:
The client '87c92100-.....' with object id '87c92100....' does not have authorization to perform action 'Microsoft.Resources/subscriptions/resourcegroups/read' over scope '/subscriptions/f151ee3f-4725-......../resourcegroups/tfmainrg' or the scope is invalid
I searched everywhere (RG, AAD, storage account...) and am not able to find where this object or client resides. Can you please tell me how I can find what this object is? Thanks.
I tried to reproduce the same in my environment and got the same error as below:
The error "AuthorizationFailed" usually occurs if the user does not have a role that allows reading the resource group.
To resolve the error, make sure to assign the Reader role to the user at the subscription level, like below:
Go to Azure Portal -> Subscriptions -> Select your subscription -> Access control (IAM) -> Add role assignment -> Select Reader
You can also use the Azure CLI to assign the Reader role to the user with the command below. Note that the Owner or User Access Administrator role is required to assign any role, so use another account that has Owner rights at the subscription scope (or higher):
az role assignment create --assignee-object-id UserObjectID --scope /subscriptions/SubscriptionID --role Reader
The Reader role is successfully granted to the user, like below:
After assigning the role, I am able to read the resource group successfully:
az group show -n ResourceGroupName
Based on your further requirements, you can assign other roles by referring to the MS doc below:
Azure built-in roles - Azure RBAC
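As for finding out which identity the object ID in the error belongs to, a directory lookup is usually enough. A minimal sketch, assuming you have permission to read the directory (substitute the object ID from the error message):
az ad sp show --id <objectId> --query displayName     # if it is a service principal / app registration
az ad user show --id <objectId> --query displayName   # if it is a user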
Today I tried to perform an action on Azure Data Factory (ADF) using the CLI (the portal for that subscription can only be used read-only). The Azure CLI is installed on an Azure VM whose managed identity has been granted the Contributor role on the whole subscription. Running the command ended with AuthorizationFailed.
After logging into the Azure CLI with az login -i and running the command az datafactory configure-factory-repo, I get:
(AuthorizationFailed) The client 'CLIENT_ID' with object id 'CLIENT_ID' does not have authorization to perform action 'Microsoft.DataFactory/locations/configureFactoryRepo/action' over scope '/subscriptions/SUBSCRIPTION_ID' or the scope is invalid. If access was recently granted, please refresh your credentials.
Code: AuthorizationFailed
Message: The client 'CLIENT_ID' with object id 'CLIENT_ID' does not have authorization to perform action 'Microsoft.DataFactory/locations/configureFactoryRepo/action' over scope '/subscriptions/SUBSCRIPTION_ID' or the scope is invalid. If access was recently granted, please refresh your credentials.
I have checked, and the Contributor role assigned to the VM includes Microsoft.DataFactory/locations/configureFactoryRepo/action.
What else should I check? (I have no access to Azure AD.)
Edit:
CLIENT_ID is equal to the principalId of the VM from which I'm running the commands.
I assume that CLIENT_ID and SUBSCRIPTION_ID are actually real values and you have replaced them so as not to disclose them here, correct?
To be sure that you are in the correct context, you could first issue 'az account show' after you have logged in using 'az login -i'. Is the response to that what you expected?
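For reference, something along these lines prints the subscription and signed-in identity for the current CLI session (the query simply picks fields from the standard az account show output):
az account show --query "{subscriptionId:id, subscription:name, identity:user.name, type:user.type}" --output table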
-- Edit --
The client ID should be the client ID of the managed identity, also sometimes referred to as the App ID (same thing). So when you log in with -i, I believe it should be the same output as when you run az account show. So that's a good thing.
Then I kind of get the feeling that it is a scope error. It looks a lot like you ran into this, and it's by design as of now. But have a look at lmicverm's comment. You might use the other call (Create or Update Factory) as a workaround?
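Before falling back to a workaround, it may also be worth listing the role assignments that actually exist for the managed identity, to rule out a scope mismatch; a quick check, substituting the principalId from the error message:
az role assignment list --assignee <principalId> --all --output table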
I created a managed identity for a Function app and assigned it the DocumentDB Account Contributor role by following the two sections below:
https://learn.microsoft.com/en-us/azure/cosmos-db/managed-identity-based-authentication#assign-a-system-assigned-managed-identity-to-a-function-app
https://learn.microsoft.com/en-us/azure/cosmos-db/managed-identity-based-authentication#grant-access-to-your-azure-cosmos-account
Microsoft.Azure.Services.AppAuthentication
I got an exception when I tried to run the code from the section below:
https://learn.microsoft.com/en-us/azure/cosmos-db/managed-identity-based-authentication#programmatically-access-the-azure-cosmos-db-keys
Could not load file or assembly 'System.Text.Encodings.Web, Version=6.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'. The system cannot find the file specified.
at System.Text.Json.Serialization.Metadata.JsonPropertyInfo.DeterminePropertyName()
at System.Text.Json.Serialization.Metadata.JsonPropertyInfo.GetPolicies(Nullable`1 ignoreCondition, Nullable`1 declaringTypeNumberHandling)
...
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at Cosmos.Samples.AzureFunctions.AzureFunctionsCosmosClientMI.d__7.MoveNext() in C:.ME\MyLab.Code\AzureCode\CosmosDB\azure-cosmos-dotnet-v3-usage\AzureFunctions\AzureFunctionsCosmosClientMI.cs:line 85
Azure.Identity
Since AppAuthentication is no longer recommended by Microsoft, I switched to using Azure.Identity by following the links below:
https://learn.microsoft.com/en-us/dotnet/api/overview/azure/identity-readme?view=azure-dotnet
https://joonasaijala.com/2021/07/01/how-to-using-managed-identities-to-access-cosmos-db-data-via-rbac-and-disabling-authentication-via-keys/
and the code below
static string cosmosUrl = "https://xxx.documents.azure.com:443/";
private static CosmosClient client = new CosmosClient(cosmosUrl, new DefaultAzureCredential());

var container = client.GetContainer("FamilyDatabase", "FamilyContainer");
try
{
    var result = await container.CreateItemAsync<Item>(data, new PartitionKey(data.LastName));
    return new OkObjectResult(result.Resource.Id);
}
catch (CosmosException cosmosException)
{
    log.LogError("Creating item failed with error {0}", cosmosException.ToString());
    return new BadRequestObjectResult($"Failed to create item. Cosmos Status Code {cosmosException.StatusCode}, Sub Status Code {cosmosException.SubStatusCode}: {cosmosException.Message}.");
}
However, I got the exception below both locally and when running it in Azure.
Failed to create item. Cosmos Status Code Forbidden, Sub Status Code 5301: Response status code does not indicate success: Forbidden (403); Substatus: 5301; ActivityId: xxxx-bf03-4355-8642-5d316f9d3373; Reason: (Request blocked by Auth xxxx : Request is blocked because principal [xxx-2bff-44e9-97be-9ffeb3aae3ee] does not have required RBAC permissions to perform action [Microsoft.DocumentDB/databaseAccounts/readMetadata] on resource [/]. Learn more: https://aka.ms/cosmos-native-rbac. ActivityId: xxx-bf03-4355-8642-5d316f9d3373, Microsoft.Azure.Documents.Common/2.14.0, Windows/10.0.14393 cosmos-netstandard-sdk/3.24.1);.
Locally, I logged into VS following the link
https://learn.microsoft.com/en-us/dotnet/api/overview/azure/identity-readme?view=azure-dotnet#authenticating-via-visual-studio
Any ideas for resolving the issue with Azure.Identity?
Ref:
Connect Function App to CosmosDB with Managed Identity
https://github.com/Azure/azure-sdk-for-net/tree/Azure.Identity_1.5.0/sdk/identity/Azure.Identity/samples
I ran into this same error this morning while setting up Cosmos DB to use the managed identity of my Azure VM. The error message states that your principal does not have the RBAC permission Microsoft.DocumentDB/databaseAccounts/readMetadata. Once you give the principal you are using that permission, authentication using Azure.Identity should work.
The DocumentDB Account Contributor role doesn't include the Microsoft.DocumentDB/databaseAccounts/readMetadata permission, and I couldn't find a built-in Azure role that contained it, so I created my own custom CosmosDBReadWrite role by following the example in this article.
To create the custom role definitions and assignments, you will need to have the Azure CLI installed.
Configuring Custom Role Definitions
Prepare JSON Role Definition File
First you will need to create a JSON file with the role definition. Here are two different custom role configuration JSON files: one grants read-only access to Cosmos DB and the other grants read-write access.
JSON file for the read-only custom role:
{
    "RoleName": "CosmosDBReadOnlyRole",
    "Type": "CustomRole",
    "AssignableScopes": ["/"],
    "Permissions": [{
        "DataActions": [
            "Microsoft.DocumentDB/databaseAccounts/readMetadata",
            "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/read",
            "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/executeQuery",
            "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/readChangeFeed"
        ]
    }]
}
JSON file for the read-write custom role:
{
    "RoleName": "CosmosDBReadWriteRole",
    "Type": "CustomRole",
    "AssignableScopes": ["/"],
    "Permissions": [{
        "DataActions": [
            "Microsoft.DocumentDB/databaseAccounts/readMetadata",
            "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*",
            "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
        ]
    }]
}
Create and Assign Role Definition
After you have saved the JSON file with your custom role definition, you can create the custom role with the Azure CLI and then assign it to the correct principal.
Create your custom role using the JSON file you created above:
resourceGroupName='<myResourceGroup>'
accountName='<myCosmosAccount>'
az cosmosdb sql role definition create -a $accountName -g $resourceGroupName -b @role-definition.json
After you create the role, the definition of the created role should be returned. If not, use the following command to find the roleDefinitionId, which can be found in the name property:
az cosmosdb sql role definition list --account-name $accountName -g $resourceGroupName
Finally, apply the custom role to your principal that needs permission to access CosmosDB.
resourceGroupName='<myResourceGroup>'
accountName='<myCosmosAccount>'
roleDefinitionId='<roleDefinitionId>'
principalId='<ID for the Object that needs access to the CosmosDB>'
az cosmosdb sql role assignment create -a $accountName -g $resourceGroupName -s "/" -p $principalId -d $roleDefinitionId
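Optionally, you can confirm the assignment landed on the account before retrying, using the same variables as above:
az cosmosdb sql role assignment list --account-name $accountName --resource-group $resourceGroupName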
Hope this solves the error you are running into as well!
I have the Terraform script below, which creates a new service account and makes it an owner.
The script creates the service account, but it throws an error when assigning the role:
resource "google_service_account" "pci_api_service_account" {
account_id = "pci-api"
display_name = "Api"
project = var.project_id
}
resource "google_service_account_iam_member" "pci_api_owner_binding" {
# service_account_id = "projects/pcb-poc-pci/serviceAccounts/infra-admin-sa#pcb-poc-pci.iam.gserviceaccount.com"
service_account_id = google_service_account.pci_api_service_account.name
role = "roles/owner"
member = "serviceAccount:${google_service_account.pci_api_service_account.email}"
depends_on = [
google_service_account.pci_api_service_account
]
}
and I have already authenticated with the infra-admin-sa service account by running:
gcloud auth activate-service-account --project=pcb-poc-pci --key-file ~/sa/pcb-poc-pci-test-sa-94aa6c81d650.json
When I run terragrunt apply, I get this error for the second resource:
Error: Error applying IAM policy for service account 'projects/pcb-poc-pci/serviceAccounts/pci-api@pcb-poc-pci.iam.gserviceaccount.com': Error setting IAM policy for service account 'projects/pcb-poc-pci/serviceAccounts/pci-api@pcb-poc-pci.iam.gserviceaccount.com': googleapi: Error 403: Permission iam.serviceAccounts.setIamPolicy is required to perform this operation on service account projects/pcb-poc-pci/serviceAccounts/pci-api@pcb-poc-pci.iam.gserviceaccount.com., forbidden
These are the roles of that service account:
Based on the Google doc here and the error message, Service Account Admin should be enough, which my service account already has.
Not sure what I missed.
Solution 1
It seems the command line was not picking up the correct credentials/service account, even though I used the gcloud auth activate-service-account command.
So I added this to my script:
provider "google" {
credentials = file(var.service_account_file_path)
project = var.project_id
}
and now it works fine.
Solution 2
As per @John Hansley's comment below:
export GOOGLE_APPLICATION_CREDENTIALS=fullpath.json
Terraform will then pick up that service account file and the scripts will run successfully.
This method is preferred since it causes fewer issues in CI/CD pipelines and for other developers than setting Terraform variables.
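Whichever solution you go with, it can help to confirm which credentials the tooling is actually picking up. A small sanity check, assuming the gcloud SDK is installed:
gcloud auth list              # shows the active account
gcloud config list account    # shows the account set in the current configuration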
user { 'acc1':
  ensure          => present,
  managehome      => true,
  password        => 'Test123',
  groups          => ['Administrators'],
  auth_membership => 'minimum',
  notify          => Exec['app config'],
}

exec { 'app config':
  path        => 'c:\\program files (x86)\\app\\bin',
  command     => 'config.bat -f responsefile.rsp',
  refreshonly => true,
}
The user is getting created, but I need the local account to be used for the app configuration.
The above puppet script is executed by domain account abc\myname, and the application requires a local account to be used for the configuration.
So I have created a local account through Puppet and I am using notify to tell the exec to use the account created by Puppet. But when it is executed, the application throws the error: need a local account or administrator.
In the logs the error is that myname is not a local account or administrator.
I see that exec is not using the local user acc1 created by puppet.
Is there any other way I can direct the exec to use a particular local user account for the configuration?
Please advise.
Specify user => 'acc1' in the exec resource: Resource Type Reference: exec: user.
So I have created a local account through puppet and using notify to tell exec to use the account created by the puppet.
notify is an ordering and refreshing mechanic: when resource A notifies resource B, no information is carried between them other than that resource B requires a refresh. It doesn't mean that resource B uses resource A, nor does resource B inherit any properties from resource A.
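Putting that together, a minimal sketch of the exec with the user attribute, reusing the path and command from the question (whether config.bat accepts that account, and whether your Puppet version supports running an exec as another user on Windows, still needs to be verified):
exec { 'app config':
  path        => 'c:\\program files (x86)\\app\\bin',
  command     => 'config.bat -f responsefile.rsp',
  user        => 'acc1',
  refreshonly => true,
}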