I'm trying to derive the Registration Key and URL of my Azure Automation DSC account inside the ARM template at runtime. I've tried using the same syntax as you would for a storage account, i.e.
listKeys(resourceId('Microsoft.Storage/storageAccounts', 'StorageAccountName'), '2015-05-01-preview').key1
by doing this:
listKeys(resourceId('Microsoft.Automation/automationAccounts', 'AutomationAccountName'), '2015-05-01-preview').key1
but no luck (the function appears to simply return null). This would naturally make it easy as pie to provision an automation account and a VM, and to wire the VM up to the automation account, all in the same template. Has anyone successfully got something similar to work?
As per this GitHub issue, this is still under development:
https://github.com/azureautomation/automation-packs/issues/7
With version 2015-10-31 of the Azure Automation API, the following seems to work.
Getting the registration URL:
reference(resourceId('Microsoft.Automation/automationAccounts/', 'AutomationAccountName'), '2015-10-31').RegistrationUrl
Getting the Primary key:
listKeys(resourceId('Microsoft.Automation/automationAccounts/', 'AutomationAccountName'), '2015-10-31').keys[0].value
Getting the Secondary key:
listKeys(resourceId('Microsoft.Automation/automationAccounts/', 'AutomationAccountName'), '2015-10-31').keys[1].value
For reference, the object returned from the listKeys() template function for an Automation account resource looks like this (can easily be found by adding an output value using listKeys() to the outputs section of an ARM template):
{
  "keys": [
    {
      "KeyName": "Primary",
      "Permissions": "Full",
      "Value": "VALUE OF PRIMARY KEY"
    },
    {
      "KeyName": "Secondary",
      "Permissions": "Full",
      "Value": "VALUE OF SECONDARY KEY"
    }
  ]
}
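As a sketch of how these values could then be wired into the Microsoft.Powershell.DSC VM extension in the same template: this is not from the original answer, it follows the pattern used by the published quickstart template for onboarding a VM to Automation DSC, and the vmName and nodeConfigurationName parameters, the apiVersion and the typeHandlerVersion are assumptions.
{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "[concat(parameters('vmName'), '/Microsoft.Powershell.DSC')]",
  "apiVersion": "2015-06-15",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.Compute/virtualMachines', parameters('vmName'))]",
    "[resourceId('Microsoft.Automation/automationAccounts', 'AutomationAccountName')]"
  ],
  "properties": {
    "publisher": "Microsoft.Powershell",
    "type": "DSC",
    "typeHandlerVersion": "2.76",
    "autoUpgradeMinorVersion": true,
    "protectedSettings": {
      "Items": {
        "registrationKeyPrivate": "[listKeys(resourceId('Microsoft.Automation/automationAccounts', 'AutomationAccountName'), '2015-10-31').keys[0].value]"
      }
    },
    "settings": {
      "Properties": [
        {
          "Name": "RegistrationKey",
          "Value": {
            "UserName": "PLACEHOLDER_DONOTUSE",
            "Password": "PrivateSettingsRef:registrationKeyPrivate"
          },
          "TypeName": "System.Management.Automation.PSCredential"
        },
        {
          "Name": "RegistrationUrl",
          "Value": "[reference(resourceId('Microsoft.Automation/automationAccounts', 'AutomationAccountName'), '2015-10-31').RegistrationUrl]",
          "TypeName": "System.String"
        },
        {
          "Name": "NodeConfigurationName",
          "Value": "[parameters('nodeConfigurationName')]",
          "TypeName": "System.String"
        }
      ]
    }
  }
}
The registration key is passed as the password half of a PSCredential that references the protected setting, so the key does not appear in plain text in the deployment settings.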
Hope you are doing well.
Is there a command in the Spark Utilities, run in an Azure Synapse Spark notebook, to see all the secrets in an Azure Key Vault?
Page https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/microsoft-spark-utilities?pivots=programming-language-python#credentials-utilities
I can reference a secret like so: mssparkutils.credentials.getSecret('azure key vault name', 'secret name'), but can I list all the secrets?
Thanks
Unfortunately, there is no command available to list all secrets in Key Vault.
You may check out my answer on the MS Q&A platform on how to access a secret from Key Vault using a Synapse PySpark notebook.
I'd appreciate it if you could share this feedback on our Azure Synapse feedback channel, where it would be open for the user community to upvote and comment on. This allows our product teams to effectively prioritize your request against our existing feature backlog and gives insight into the potential impact of implementing the suggested feature.
You can call Azure Key Vault (AKV) via its REST API and the Get Secrets operation, which returns a list of the secret identifiers in their full URL form. You could use a Web activity in Synapse pipelines to call this. Example settings:
Setting          Value                                      Notes
URL              {vaultBaseUrl}/secrets?api-version=7.2     See below for sample URL
Method           GET
Authentication   Managed Identity
Resource         https://vault.azure.net

Sample Key Vault URL:
https://yourKeyVault-akv.vault.azure.net/secrets?api-version=7.2
Sample results:
{
  "value": [
    {
      "id": "https://yourKeyVault-akv.vault.azure.net/secrets/somepassword",
      "attributes": {
        "enabled": true,
        "created": 1635948403,
        "updated": 1635948403,
        "recoveryLevel": "Recoverable+Purgeable",
        "recoverableDays": 90
      },
      "tags": {}
    },
    {
      "id": "https://yourKeyVault-akv.vault.azure.net/secrets/someusername",
      "attributes": {
        "enabled": true,
        "created": 1635949171,
        "updated": 1635949171,
        "recoveryLevel": "Recoverable+Purgeable",
        "recoverableDays": 90
      },
      "tags": {}
    }
  ]
}
You can loop through the values with a For Each activity, e.g. the Items value would be:
@activity('Web Get AKV Secrets').output.value
Refer to the individual secret inside the For Each activity like this:
@item().id
Get the actual secret name by using the split and last functions, e.g.
@last(split(item().id, '/'))
You could then pass the individual secret name or the collection into a Synapse notebook as a parameter.
I am creating a key vault through an Azure Blueprint: it gets created with no problem.
The thing is that, in order to access the Key Vault (listing it, putting or getting values), access policies must be configured.
With ARM templates, I could insert a section like:
"accessPolicies": [
{
"tenantId": "22222222-3333-4444-aaaa-eeeeeeeeeeee",
"objectId": "77777777-6666-4444-8888-111111111111",
"permissions": {
"keys": [
"Get",
...
"Restore"
],
"secrets": [
"Get",
...
"Restore"
],
"certificates": []
}
},
but then I would need a TenantId and an ObjectId to hardcode, or to pass in as parameters, which is not the right way to do it.
Unfortunately I could not find a way to assign these access policies to the Key Vault (without which the key vault itself just can't be used), other than making those settings manually (and they get deleted each time the blueprint is upgraded).
Is there a guideline or a best practice to do this in the proper way?
You can get the tenantId dynamically using "[subscription().tenantId]". See the official documentation.
Regarding the objectId, using a parameter is usually the right way, as stated in this answer. Object IDs are not on the same layer as ARM components, and therefore there is no real way to get them dynamically using ARM.
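Putting those two together, a minimal sketch of the accessPolicies block could look like the following; the objectId parameter name and the exact permission lists are illustrative, not from the original answer:
"accessPolicies": [
  {
    "tenantId": "[subscription().tenantId]",
    "objectId": "[parameters('objectId')]",
    "permissions": {
      "keys": [ "Get", "List" ],
      "secrets": [ "Get", "List" ],
      "certificates": []
    }
  }
]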
Starting from the answer from @jul_DW, I realized that ARM cannot easily assign users, so I had to use something in the Blueprint itself.
And in fact, one of the key features of Blueprints is Role Assignment.
To use this approach we need to enable RBAC for the Key Vault, which is done easily in the ARM template itself:
...
"type": "Microsoft.KeyVault/vaults",
"properties": {
...
"enableRbacAuthorization": true,
...
}
Once RBAC is enabled, the ARM template should stay away from assigning permissions; the Role Assignment feature of Blueprints should be used instead.
In my case, I assigned the Key Vault Administrator role to the required user group, which can be passed in as a parameter. This keeps the ARM template as simple as possible while still giving great flexibility in assigning roles to different users in the various environments.
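For illustration, a role assignment artifact in a blueprint definition could be sketched roughly like this; the artifact name, the parameter name and the role definition GUID placeholder are mine, not from the original answer:
{
  "type": "Microsoft.Blueprint/blueprints/artifacts",
  "kind": "roleAssignment",
  "name": "key-vault-admin-assignment",
  "properties": {
    "displayName": "Key Vault Administrator for admin group",
    "roleDefinitionId": "/providers/Microsoft.Authorization/roleDefinitions/<Key Vault Administrator role GUID>",
    "principalIds": [
      "[parameters('keyVaultAdminGroupObjectId')]"
    ]
  }
}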
I was asked by Azure Support to post this question, just to see if anyone had a useful opinion.
I am stepping through MS Azure training courses. I created the usual free account to go through these. I've gone through a few dozen of them, and am now at this one:
https://learn.microsoft.com/en-us/learn/modules/secure-and-isolate-with-nsg-and-service-endpoints/3-exercise-network-security-groups?source=learn
This attempts to use the Azure PowerShell service. I had some trouble getting to the PowerShell page. It appears that if I'm not already logged into the portal, it goes into a semi-infinite loop, trying to get to the shell page, then trying to log in, then the shell page again, until it finally gives up and says "We couldn't sign you in. Please try again.".
However, I was able to work around this. If in a separate tab, I log into the Azure Portal, and then go back and follow the link to Azure Cloud Shell, it passes the login gate and sends me to the page where I choose Bash or PowerShell. The course specifies using Bash. When I select that, it then asks me to create a Storage object. When I confirm that, it gives me the following error (subscription id elided):
{
  "error": {
    "code": "RequestDisallowedByPolicy",
    "target": "cs733f82532facdx4f04x95b",
    "message": "Resource 'cs733f82532facdx4f04x95b' was disallowed by policy. Policy identifiers: '[{\"policyAssignment\":{\"name\":\"Enforce tag on resource\",\"id\":\"/subscriptions/xxxxx/providers/Microsoft.Authorization/policyAssignments/740514d625684aad84ef8ca0\"},\"policyDefinition\":{\"name\":\"Enforce tag on resource\",\"id\":\"/subscriptions/xxxxx/providers/Microsoft.Authorization/policyDefinitions/be3862a6-ca1e-40b0-a024-0c0c7d1e8b3e\"}}]'.",
    "additionalInfo": [
      {
        "type": "PolicyViolation",
        "info": {
          "policyDefinitionDisplayName": "Enforce tag on resource",
          "evaluationDetails": {
            "evaluatedExpressions": [
              {
                "result": "True",
                "expressionKind": "Field",
                "expression": "tags[Department]",
                "path": "tags[Department]",
                "targetValue": "false",
                "operator": "Exists"
              }
            ]
          },
          "policyDefinitionId": "/subscriptions/xxxxx/providers/Microsoft.Authorization/policyDefinitions/be3862a6-ca1e-40b0-a024-0c0c7d1e8b3e",
          "policyDefinitionName": "be3862a6-ca1e-40b0-a024-0c0c7d1e8b3e",
          "policyDefinitionEffect": "deny",
          "policyAssignmentId": "/subscriptions/xxxxx/providers/Microsoft.Authorization/policyAssignments/740514d625684aad84ef8ca0",
          "policyAssignmentName": "740514d625684aad84ef8ca0",
          "policyAssignmentDisplayName": "Enforce tag on resource",
          "policyAssignmentScope": "/subscriptions/xxxxx",
          "policyAssignmentParameters": {
            "tagName": {
              "value": "Department"
            }
          }
        }
      }
    ]
  }
}
I think the simple conclusion from this is that my free account doesn't have enough rights to do what is needed here. The documentation I've read seems to imply that I have to get additional rights on the account in order to do this. However, I'm just using a free account that I created to go through the Azure training courses. It doesn't really make sense to ask me to do this. I've seen other Azure courses create a temporary sandbox supposedly because they have particular objects pre-created in the sandbox, but I'm also thinking that the sandbox has particular permissions that are not available in the free account. It seems to me that the only reasonable fix for this problem is for that course to be refactored to use a temporary sandbox with the correct permissions.
I'm just looking for any opinions on this, and confirmations that this is what should be done.
It doesn't look like you are creating the resource (the Cloud Shell storage account) in your free subscription, unless you added it to a work/corporate tenant.
From the information you provide, the subscription you are trying to use has a policy that enforces a Department tag, meaning any resource created must carry a tag with the Department information.
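For example, if you create the Cloud Shell storage account yourself (or any other resource) in that subscription, it would have to carry the Department tag to pass the policy. A minimal sketch follows; the storage account name, apiVersion and tag value are illustrative, not from the original posts:
{
  "type": "Microsoft.Storage/storageAccounts",
  "apiVersion": "2021-04-01",
  "name": "mycloudshellstorage",
  "location": "[resourceGroup().location]",
  "sku": {
    "name": "Standard_LRS"
  },
  "kind": "StorageV2",
  "tags": {
    "Department": "Training"
  },
  "properties": {}
}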
I'm using Azure to host a series of services and need to create some custom roles for RBAC. I'm running into errors when trying to run the appropriate commands to create the roles.
There are a number of good documentation sources and walkthroughs on how to do this, and it seems easy in theory, but I keep running into issues running the PowerShell cmdlet:
New-AzureRmRoleDefinition -InputFile [pathtojsondoc]
The input file is just JSON describing the role:
{
  "Name": "Test Role",
  "IsCustom": true,
  "Description": "Role Description",
  "Actions": [
    "Microsoft.Resources/subscriptions/resourceGroups/read"
  ],
  "NotActions": [],
  "AssignableScopes": []
}
When calling the New-AzureRmRoleDefinition function I receive the following error:
New-AzureRmRoleDefinition : InvalidApiVersionParameter: The api-version '2015-07-01' is invalid. The supported versions are '2016-09-01,2016-07-01,2016-06-01,2016-02-01,2015-11-01,2015-01-01,2014-04-01-preview,2014-04-01,2014-01-0
I'm looking for a resolution to either fix the call as is so it succeeds, or a workaround for creating custom roles.
It turns out that the error I was receiving was a bad error message causing an improper diagnosis.
After updating to the latest version of Azure PowerShell I received the same error - so I did a bit more digging and found an updated code sample.
I was setting the assignable scopes in the file to specific resource groups using the format /subscriptions/*******/myresourcegroupname, but the correct format is /subscriptions/*******/resourceGroups/myresourcegroupname - note the extra /resourceGroups/ segment.
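For illustration, the corrected input file would then look something like this; the subscription id and resource group name are placeholders:
{
  "Name": "Test Role",
  "IsCustom": true,
  "Description": "Role Description",
  "Actions": [
    "Microsoft.Resources/subscriptions/resourceGroups/read"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/<subscription-id>/resourceGroups/myresourcegroupname"
  ]
}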
I am trying to automate creating an API Connection for a storage account in Azure using Resource Manager templates.
I am using the listKeys method in ARM to retrieve the access key of the storage account. I went through this question and it is not working for me.
When I use the method in the outputs section of the template, it is working fine and successfully retrieving and displaying the access key.
"outputs": {
"listKeysOutput": {
"type": "string",
"value": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storagename')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
}
}
However, when I try to use the same function inside a connection resource (as shown below), the template executes without any error. But on accessing the API Connection from the Azure portal, it says 'parameter is missing'.
"parameterValues": {
"accesskey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storagename')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]",
"accountName": "[parameters('storagename')]"
}
Am I missing something here? Or the output of listKeys is not being accepted by the 'accesskey' property?
I had a similar experience a few months ago, and resolved it by using a connection string directly in my code and then passing the connection string into the connections. The value looked like this:
[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageConfigs')[0].name,';AccountKey=', listKeys(resourceId('Microsoft.Storage/storageAccounts/', variables('storageConfigs')[0].name), variables('defaultStorageApiVersion')).key1)]
I used a storage config object as an input, which is why it looks like the above; you could replace variables('storageConfigs')[0].name with whatever name or variable function you use in your code. From your template it looks like that would be parameters('storagename').
Two things that might be causing the issue:
Ensure the API Connection has a dependency on the storage account
Capitalise the key in "accessKey" (some things in templates are case sensitive)
@Naren, I recommend using this REST API operation to get your storage key:
POST
https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}/listKeys?api-version={api-version}
You will get the same result as from the template function.
{
  "keys": [
    {
      "keyName": "key1",
      "value": "key1Value",
      "permissions": "FULL"
    },
    {
      "keyName": "key2",
      "value": "key2Value",
      "permissions": "FULL"
    }
  ]
}
Just for your reference:
https://msdn.microsoft.com/en-us/library/mt163589.aspx
Dependency is indeed a requirement, so that the storage account is already created before the deployment of the API connection is initiated.
The problem with the OP's template code is the use of accesskey, while the correct parameter name for an Azure Blob API connection resource is accessKey (note the capital K).
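Putting both points together, a hedged sketch of the full azureblob connection resource might look like this; the connection name, apiVersion and the storage API version used in listKeys are illustrative, not taken from the question:
{
  "type": "Microsoft.Web/connections",
  "apiVersion": "2016-06-01",
  "name": "azureblob-connection",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.Storage/storageAccounts', parameters('storagename'))]"
  ],
  "properties": {
    "displayName": "azureblob-connection",
    "api": {
      "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', resourceGroup().location, '/managedApis/azureblob')]"
    },
    "parameterValues": {
      "accountName": "[parameters('storagename')]",
      "accessKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storagename')), '2019-06-01').keys[0].value]"
    }
  }
}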
For anyone who struggles with the lack of documentation for the required parameters of API Connection resources - initiate this API Call:
https://management.azure.com/subscriptions/<YOUR SUBSCRIPTION ID>/providers/Microsoft.Web/locations/<YOUR LOCATION>/managedApis/<API TYPE>?api-version=2016-06-01
The <API TYPE> should be the API type of the connection to check, for example azureblob, azurequeues or documentdb.
A description of all the expected parameters is returned along side other descriptive information for that resource.