Moving resources between subscriptions with Azure DevOps - azure

Currently I'm trying to create dynamic environments via Azure DevOps.
One of the steps to achieve this is to take a copy of our production databases and place them in a temp resource group (production subscription), then move the SQL server and associated databases to our non-production subscription. From there we create the web apps and deploy the code.
When I run this via the Az CLI, I am able to move the resources with the following:
SQLSERVERID=$(az resource show -g $RSGNAMETEMP -n $SQLSERVERNAME --resource-type "Microsoft.Sql/servers" --query id --output tsv)
az resource move --destination-group $RSGNAME --ids $SQLSERVERID --destination-subscription-id $SANDBOXSUBSCRIPTIONID
However, when I run this via Azure DevOps I get the following error:
ERROR: The client (...) with object id (...) has permission to perform action on scope however, it does not have permission to perform action (...) on the linked scope(s).
I believe this problem happens because, when you configure the Az CLI step in Azure DevOps, you select the subscription from the drop-down list. The account / service principal only has access rights to that specific subscription and not to multiple ones. Is it possible to configure a service principal (that can be used in Azure DevOps) that can connect to multiple subscriptions?

Yes, just go to the Azure portal, navigate to the desired subscription blade, go to Access control (IAM), press the + sign at the top and add your principal as a Contributor to the subscription.
To find the service principal name, use this:
Click the Manage link in the Azure Subscription field in your VSTS job; it will navigate you to a new blade. Click Manage Service Principal there. It will take you to the application page in Azure AD. After that you can copy the name under the Managed application in local directory field and use that name to grant it the required permissions.
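If you prefer the CLI over the portal, the same role assignment can be made with a command along these lines (the service principal app ID and subscription ID below are placeholders):
az role assignment create --assignee <service-principal-app-id> --role Contributor --scope /subscriptions/<subscription-id>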

Related

Terraform "AuthorizationFailed" with Azure provider

I'm an owner of an Azure resource group but I do not have permissions on the subscription or on the management group.
When configuring the "azurerm" provider inside my .tf file, I've added the subscription ID and tenant ID (I'm not the owner of that subscription).
--------------------- UPDATE ---------------------
I'm trying to deploy a Linux virtual machine using Terraform but I'm having authorization issues while planning the .tf file.
I've listed all my accounts using the Azure CLI (I want to connect to the second subscription in the output below):
I succeeded in authenticating to the subscription using the Azure CLI with the following command:
az account set --subscription="SUBSCRIPTION_ID"
It's my default and current subscription:
Also, I was able to create and manage resources inside my resource group in that subscription using Azure CLI.
However, I added the exact tenant ID and the exact subscription ID inside my .tf file and still got the same credential errors during "terraform plan".
Using the Azure CLI or the Azure portal I am able to create and manage resources inside the resource group's scope, yet using Terraform I'm facing problems.
Thank you :)
According to your description, you only set the tenant ID and subscription ID in the azurerm provider, so it seems you authenticate via the Azure CLI. Whether you use a user account or a service principal, the Owner role on the resource group is enough to create a virtual machine in that resource group. In that case, you need to log into the Azure CLI first, as shown in the link I have provided.
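A minimal sketch of that CLI login flow, assuming you authenticate as a user rather than a service principal (SUBSCRIPTION_ID is a placeholder):
az login
az account set --subscription "SUBSCRIPTION_ID"
az account show --output table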

Authorize button when Linking Variable Group to Azure Key Vault in Azure DevOps is not working - why?

I am trying to link Azure Key Vault secrets to a variable group in Azure Pipelines (part of Azure DevOps). Microsoft documentation here.
However, the "Authorize" button does not seem to work. It spins endlessly. Screenshot.
My target Azure Key Vault already has the service principal included in its access policy with Get and List permissions. Screenshot.
Anyone seen this issue before?
This workaround also points to what seems like a bug for Azure Key Vault deployments using ARM templates.
If the service principal in question is added to the Azure Key Vault (AKV) access policies through an ARM template by referencing the service principal's Object ID (as Microsoft documentation calls for), permission errors with Azure Pipelines follow.
However, if I manually add the service principal to the AKV's access policies by referencing the service principal's application (client) ID, the permissions errors go away entirely.
Again, feels like a bug. And now my automated deployment pipeline doesn't quite work because of this manual step.
Also, in the AKV ARM template, if I were to combine the mandatory field objectId with the optional field applicationId, the service principal shows up as a "compound identity". That does not fix the permissions issues in Azure Pipelines. I do not see a way of adding a service principal properly without doing it manually.
Firstly, please make sure the service connection is working correctly, then refresh the page and try again. Alternatively, you can also try an InPrivate browser session.
Just as the message says: "The specified Azure service connection needs to have "Get, List" secret management permissions on the selected key vault."
Basically, we need to click the "Authorize" button to enable Azure Pipelines to set these permissions for the specific service connection.
If that doesn't work, we can also manually set the permissions for the specific service connection.
Go to Project settings -> Service connections -> select the specific ARM service connection.
Click Edit to open the Update Authentication for xxx dialog.
Click the "use the full version of the service connection dialog." link to get the Service principal client ID.
Go to your key vault in the Azure portal -> Access Policies -> Add a new Access Policy -> select a template (e.g. Key & Secret Management) -> select Get, List for Secret permissions.
Click Select Principal -> copy and paste the Service principal client ID to search for the user/application -> select the found user/application.
After that you can see the new APPLICATION access policy.
Try it again after successfully adding the application access policy.
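For reference, the same access policy can also be granted from the Azure CLI; a rough equivalent (the vault name and client ID below are placeholders) is:
az keyvault set-policy --name <vault-name> --spn <service-principal-client-id> --secret-permissions get list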
UPDATE:
Generally, in Azure DevOps we need to create an ARM service connection (the client which can access the Azure resources) first, before deploying an Azure Key Vault through an ARM template.
Actually, when you select the Azure subscription and then click Authorize in the Azure resource group deployment task, the ARM service connection is created automatically. You just need to check the AppID and get the ObjectID to use in the ARM template.
We can get the Service principal client ID (AppID) by following the steps above. After that we can get the ObjectId from the AppID by running the following command (see Find service principal object ID using PowerShell for details):
$(Get-AzureADServicePrincipal -Filter "AppId eq 'a89c3dee-f5bf-4ea1-a805-d4c729a4add3'").ObjectId
Then you can specify the ObjectId when deploying Azure Key Vault through an ARM template.
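If you prefer the Azure CLI over PowerShell, a rough equivalent for looking up the object ID is (the app ID is a placeholder; depending on your CLI version the returned property is named id or objectId):
az ad sp show --id <service-principal-app-id> --query id --output tsv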

Not able to make an Azure app as member of an Azure Group

I would like to add an Azure app as a member of an Azure group. I am the owner of the group, but when I click on Add --> Member, it only lists individual users and there is no option for adding an app:
I am not trying to provide access to the SG so it can access the app (for that I would have to go to the specific app page); rather, I am trying to make the app a member of an Azure group that I already own. But I just don't see an option for doing that.
If your group is an Office group, it does not support adding a service principal as a member (i.e. the MSI of your data factory, which is essentially a service principal created automatically by Azure; see this link).
If you want to add the service principal to a group, you need to use a Security group, see this link.
If your user type is Member but you are not able to create a Security group, the UsersPermissionToCreateGroupsEnabled setting has probably been set to false in your Azure AD tenant.
See To restrict the default permissions for member users:
For more details, refer to this link.
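For reference, a rough CLI sketch of adding the service principal to a security group (the group and object ID values below are placeholders) is:
az ad group member add --group <security-group-name-or-id> --member-id <service-principal-object-id>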
You need to run this command first from PowerShell to create the managed identity:
Set-AzDataFactoryV2 -ResourceGroupName <resourceGroupName> -Name <dataFactoryName> -Location <region>
https://learn.microsoft.com/en-us/azure/data-factory/data-factory-service-identity

Unable to create storage for persisting account files in Azure Cloud Shell (CLI)

I'm trying to set up the Azure CLI. The first step is to create storage for account files. I'm using my Developer Program Benefit subscription.
After I click "Create storage" I get an error:
Storage creation failed
Error:409
{"error":{"code":"MissingSubscriptionRegistration","message":"The subscription is not registered to use namespace 'Microsoft.Storage'. See https://aka.ms/rps-not-found for how to register subscriptions."}}
Can't create a storage account. Please try again.
How do I resolve this issue?
The reason you're getting this error is that the Microsoft.Storage resource provider, which manages storage account related resources and activities, is not registered with your Azure subscription.
To fix this, please run the following command:
azure provider register --namespace "Microsoft.Storage" --subscription "<your subscription name or id>"
To get the subscription name/id, please run the following command:
azure account list
For more details on why you're getting this error, please see this: The subscription is not registered to use namespace 'Microsoft.DataFactory' error
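The commands above use the older classic Azure CLI; with the current az CLI, a rough equivalent for registering the provider and checking its state is:
az provider register --namespace Microsoft.Storage
az provider show --namespace Microsoft.Storage --query registrationState --output tsv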
If you prefer GUI, then you can do it on the Azure portal too:
On the left pane click More services and then select Subscriptions
Select the subscription, in our case Developer Program Benefit
In the Settings area click Resource providers
Find Microsoft.Storage and click Register
The same error, with detailed solutions, is covered in this post:
How to fix Azure Cloud Shell error "MissingSubscriptionRegistration - The subscription is not registered to use namespace 'Microsoft.Storage'"

The client with object id does not have authorization to perform action 'Microsoft.DataFactory/datafactories/datapipelines/read' over scope

I was trying to invoke a data factory pipeline from an Azure function programmatically. It's throwing the following error.
link:
http://eatcodelive.com/2016/02/24/starting-an-azure-data-factory-pipeline-from-c-net/
AuthorizationFailed: The client 'XXXX-XXXXX-XXXX' with object id 'XXX829e05'XXXX-XXXXX' does not have authorization to perform action
'Microsoft.DataFactory/datafactories/datapipelines/read' over scope
'/subscriptions/XXXXXX-4bf5-84c6-3a352XXXXXX/resourcegroups/fffsrg/providers/Microsoft.DataFactory/datafactories/ADFTestFFFS/datapipelines/ADFTutorialPipelineCustom'.
I tried to search for similar issues, but none of the search results gave me a solution to my problem. Can you please guide us on what could be the issue?
The objective is to run a data factory pipeline whenever a file is added to blob storage. To achieve this, we are trying to invoke the data factory pipeline from an Azure function using a blob trigger.
Step 1: Log in to your Azure portal.
Step 2: Find Subscriptions in the left-side menu and click it.
Step 3: Click on Access control (IAM) and then click on Add.
Step 4: In the Add Permissions window, select Contributor for the role. In the Select input box, type the name of the app you created in Azure AD (created in Azure Active Directory) and select it. In my case I created Azure Resource Management.
Step 5: After the permission has been granted successfully, click on Refresh in your subscription window and you will see your app showing in the list. See below example.
See Common problem when using Azure resource groups & RBAC:
https://blogs.msdn.microsoft.com/azure4fun/2016/10/20/common-problem-when-using-azure-resource-groups-rbac/
This issue is more likely to happen in newer subscriptions and usually happens if a certain resource type has never been created before in that subscription.
Subscription admins often fix this issue by granting resource group owners contributor rights at the subscription level, which contradicts their strategy of isolating access down to the resource group level rather than the subscription level.
Root cause
Some admins say that some resources require access at the subscription level to be created, and that 'owner' rights at the resource group level are not sufficient. That is not true.
Let’s take a step back to understand how this all works first.
To provision any resource in Azure (using the resource manager model) you need a resource provider that supports the creation of that resource. For example, if you will provision a virtual machine, the 'Microsoft.Compute' resource provider must be available in the subscription first.
Resource providers are registered on the level of the subscription only.
Luckily, the Azure Resource Manager (ARM) is intelligent enough to figure that out for you. When a new Azure resource gets provisioned, if the resource provider required for that resource type is not registered in the subscription yet, ARM will attempt to register it for you. That action (resource provider registration) requires access to the subscription level.
By default, any new Azure subscription will be pre-registered with a list of commonly used resource providers. The resource provider for IoT Hub, for instance, is not one of them.
When a user is granted owner rights only on a specific resource group, and that user tries to provision a resource that requires registering a resource provider for the first time, that operation will fail. That is what happened in our case above when trying to provision IoT Hub.
So the bottom line is, we DO NOT need to grant access permissions at the subscription level for users to be able to create resources like HDInsight, IoT Hub and SQL DW etc. within the resource groups they have owner rights on, as long as the resource providers for these resources are already registered.
You get the error that you are not authorized to perform the action 'Microsoft.DataFactory/datafactories/datapipelines/read' over the scope of the pipeline because you don't have the relevant permissions on the data factory.
You need either "Contributor" or "Data Factory Contributor" permissions to create and manage data factory resources or child resources. More details on the Azure RBAC roles are in the following link:
https://learn.microsoft.com/en-us/azure/active-directory/role-based-access-built-in-roles
Since the customer is trying to use the ADF client from inside an Azure Function, the recommendation is to use an AAD application and service principal for authentication of the ADF client. You can find the instructions for creating the AAD application and service principal here:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-group-authenticate-service-principal
Please follow the instructions in the following link on how to create the Active Directory application and service principal, and then assign it to the Data Factory Contributor role; the link also includes a code sample for using a service principal with the ADF client.
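As a rough CLI sketch of those steps combined (the app name, subscription ID and resource group below are placeholders), creating the service principal and assigning the role in one command might look like:
az ad sp create-for-rbac --name <app-name> --role "Data Factory Contributor" --scopes /subscriptions/<subscription-id>/resourceGroups/<resource-group>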
We recently had this issue with the same message and found that it was caused by the user being logged in with a different subscription (we have 2). Using az login --subscription resolved the problem for us.
For anyone else running into a similar issue with the same error message: after "az login" I was receiving the same error when attempting to create a resource group as Owner. I solved this with:
az account set --subscription "Azure Subscription 1"
Basically it stems from the subscription not being set; you can find the details here:
https://learn.microsoft.com/en-us/cli/azure/manage-azure-subscriptions-azure-cli#get-the-active-subscription
Solution:
Step 1: Register an app in Azure Active Directory.
Step 2: Assign the 'Data Factory Contributor' role to the same app. We can achieve this by using PowerShell.
The code below works for me. Please try it out in PowerShell after logging in with your Azure credentials.
Implementation:
Step 1: $azureAdApplication = New-AzureRmADApplication -DisplayName <AppName> -HomePage <URL> -IdentifierUris <URL with domain> -Password <Password>
Step 2: New-AzureRmRoleAssignment -RoleDefinitionName "Data Factory Contributor" -ServicePrincipalName $azureAdApplication.ApplicationId
Follow this post : https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-group-create-service-principal-portal
In this post, the role is given as "Reader"; it should be "Owner" instead, otherwise it will give a permission error on deployment.
I solved by following this post:
https://www.nwcadence.com/blog/resolving-authorizationfailed-2016
with the command in PowerShell:
Get-AzureRmResourceProvider -ListAvailable | Select-Object ProviderNamespace | Foreach-Object { Register-AzureRmResourceProvider -ProviderName $_.ProviderNamespace}
I solved it by finding the Enterprise Application > Object ID.
(It is weird that it does not use the App Registration > Application ID.)
https://jeanpaul.cloud/2020/02/03/azure-data-factory-pipeline-execution-error/
