Can't get global_parameter to work with Terraform azurerm_data_factory

According to the documentation, global_parameter is indeed a valid argument. However, when I try to use it, terraform validate reports: An argument named "global_parameter" is not expected here. Did you mean to define a block of type "global_parameter"?
My config is simply
resource "azurerm_data_factory" "adf" {
name = "adf-${var.project}"
location = var.az_location
resource_group_name = azurerm_resource_group.rg.name
tags = {
environment = var.environment
project = var.project
}
global_parameter = {}
}
I am using version 2.81 of the azurerm provider.

The documentation is misleading here. It's not supposed to be an argument; it's supposed to be declared as a block. Hence this is incorrect:
global_parameter = {}
and this is correct:
global_parameter {}
(no equals sign here)
The documentation lists it as an argument, while the error message is hinting that you probably wanted a block instead of an argument.
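Put together, a fixed version of the configuration from the question could look like this (the parameter shown is illustrative; name, type, and value are the block's arguments per the provider docs):
resource "azurerm_data_factory" "adf" {
  name                = "adf-${var.project}"
  location            = var.az_location
  resource_group_name = azurerm_resource_group.rg.name

  # A block, not an argument: repeat one global_parameter block per parameter.
  global_parameter {
    name  = "environment"
    type  = "String"
    value = var.environment
  }

  tags = {
    environment = var.environment
    project     = var.project
  }
}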

Related

Validating through terraform that a vault policy exists before using it in a group

I have the following structure
module "policies" {
source = "../../../../path/to/my/custom/modules/groups"
for_each = var.config.policies
name = each.key
policy = each.value
}
module "groups" {
source = "../../../../path/to/my/custom/modules/groups"
for_each = var.config.groups
name = each.key
type = each.value.type
policies = each.value.policies
depends_on = [
module.policies
]
}
Policies and groups are declared in a YAML file; yamldecode then turns that file into the variables consumed by for_each.
Is there any way to make sure that the policies passed to policies = each.value.policies of the groups module actually exist?
I mean, OK, I have the depends_on clause, but I also want to guard against typos in the YAML file and other similar situations.
The usual way to declare a dependency on an external object (managed elsewhere) in Terraform is a data block, using a data source defined by the provider responsible for that object. If the goal is only to verify that the object exists, it's enough to declare the data source and have your downstream object's configuration refer to anything about its result, just so Terraform can see that the data source is a dependency and must be resolved first.
Unfortunately it seems like the hashicorp/vault provider doesn't currently have a data source for declaring a dependency on a policy, although there is a feature request for it.
Assuming it did exist, the pattern might look something like this:
data "vault_policy" "needed" {
for_each = var.config.policies
name = each.value
}
module "policies" {
source = "../../../../path/to/my/custom/modules/groups"
for_each = var.config.policies
name = each.key
# Accessing this indirectly via the data resource tells
# Terraform that it must complete the data lookup before
# planning anything which depends on this "policy" argument.
policy = data.vault_policy.needed[each.key].name
}
Without a data source for this particular object type I don't think there will be an elegant way to solve this, but you may be able to work around it by using a more general data source like hashicorp/external's external data source for collecting data by running an external program that prints JSON.
Again, because you don't actually need any specific data from the policy and only want to check whether it exists, it would be sufficient to write an external program which queries Vault and then exits with an unsuccessful status if the request fails, or prints an empty JSON object {} if the request succeeds.
data "external" "vault_policy" {
for_each = var.config.policies
program = ["${path.module}/query-vault"]
query = {
policy_name = each.value
}
}
module "policies" {
source = "../../../../path/to/my/custom/modules/groups"
for_each = var.config.policies
name = each.key
policy = data.external.vault_policy.query.policy_name
}
I'm not familiar enough with Vault to suggest a specific implementation of this query-vault program, but you may be able to use a shell script wrapping the vault CLI program if you follow the advice in Processing JSON in shell scripts. You only need to do the input parsing part of that, because your result would be communicated either by exit 1 to signal failure or echo '{}' followed by exiting successfully to signal success.
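If you'd rather keep everything in one file, the program can even be inlined. A minimal sketch, assuming bash, jq, and the vault CLI are available on the machine running Terraform (untested, for illustration only):
data "external" "vault_policy" {
  for_each = var.config.policies

  # bash reads the query JSON from stdin, asks Vault for the policy,
  # and fails the plan if the policy does not exist.
  program = ["bash", "-c", <<-EOT
    set -euo pipefail
    policy_name="$(jq -r .policy_name)"           # parse the query from stdin
    vault policy read "$policy_name" > /dev/null  # exits non-zero if missing
    echo '{}'                                     # success: empty JSON object
  EOT
  ]

  query = {
    policy_name = each.value
  }
}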

Cannot assign variable from data.tf to variables.tf file

New to Terraform, and have been building out the infrastructure recently.
I am trying to pull secrets from Azure Key Vault and assign them to variables in variables.tf depending on the environment (dev.tfvars, test.tfvars, etc.). However, when I execute the plan with the tfvars file as a parameter, I get an error with the following message:
Error: Variables not allowed
Here are the relevant files and their contents.
variables.tf:
variable "user_name" {
type = string
sensitive = true
}
data.tf (referencing the Azure key vault):
data "azurerm_key_vault" "test" {
name = var.key_vault_name
resource_group_name = var.resource_group
}
data "azurerm_key_vault_secret" "test" {
name = "my-key-vault-key-name"
key_vault_id = data.azurerm_key_vault.test.id
}
test.tfvars:
user_name = "${data.azurerm_key_vault_secret.test.value}" # Where the error occurrs
Can anyone point out what I'm doing wrong here? And if so is there another way to achieve such a thing?
In Terraform, a variable can be used for user input only. You cannot assign it anything dynamically computed by your code; variables are effectively read-only arguments. For more info, see Input Variables in the docs.
If you want to assign a value to something for later use, you must use locals. For example:
locals {
  user_name = data.azurerm_key_vault_secret.test.value
}
Local values can be computed from other parts of your configuration, including data source results. For more info, see Local Values.
You can't create dynamic variables. All variables must have known values before execution of your code. The only thing you can do is use a local instead of a variable:
locals {
  user_name = data.azurerm_key_vault_secret.test.value
}
and then refer to it as local.user_name.
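After declaring the local you can reference it anywhere an expression is allowed, for example in a hypothetical output (note sensitive = true, since the secret's value is sensitive):
output "user_name" {
  value     = local.user_name
  sensitive = true
}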

Terraform : removal of identity block does not remove identity assigned from resource azure logic app

I have this in my main.tf:
dynamic "identity" {
for_each = var.identity == [] ? [] : [1]
content {
type = lookup(var.identity, "type", null)
#identity_ids = lookup(var.identity, "identity_ids", null)
}
}
I have defined the variable as below.
variable "identity" {
description = "creates the identity for Logic App."
type = any
default = []
}
Removing the identity block from the input does not remove the assigned identity; Terraform does not detect the change. Can someone help?
Also, Logic App Standard only supports SystemAssigned, but the doc says otherwise:
https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/logic_app_standard
There seems to be some type confusion in your configuration here, but Terraform isn't able to detect and report it because you didn't give a specific type constraint for your variable.
Specifically, it's not clear whether you intended var.identity to be a list of objects or a single object. You declared the default as [], suggesting you meant a list, but the content of the dynamic "identity" block treats var.identity as if it's just a single object.
I'm going to write this out both ways, so you can choose which one meets your actual requirement.
For a list of "identities" with one identity block each:
variable "identities" {
type = list(object({
type = string
identity_ids = set(string)
}))
default = []
}
resource "example" "example" {
dynamic "identity" {
for_each = var.identities
content {
type = each.value.type
identity_ids = each.value.identity_ids
}
}
}
For a single "identity" object that is optional:
variable "identities" {
type = object({
type = string
identity_ids = set(string)
})
default = null
}
resource "example" "example" {
dynamic "identity" {
for_each = var.identities[*]
content {
type = each.value.type
identity_ids = each.value.identity_ids
}
}
}
In this second example, notice that:
The type constraint for variable "identities" is now just for an object type directly, without the list(...) from the first example.
The default value for that variable is now null, which is the typical way to represent the absence of a single value.
The dynamic "identity" block's for_each expression uses the [*] operator, called the "splat operator", which has a special behavior where it'll convert a null value into an empty list and a non-null value into a single-element list, thus producing a suitable collection value for the for_each argument.
I would recommend always writing type constraints for your input variables, because then Terraform can give you better feedback in situations like yours where you were not consistent in the types you were using. If you use any in a type constraint then Terraform will have less insight into what you are intending and so its error messages will typically be less specific and possibly even misleading, if it makes an incorrect assumption about what your goals were.
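For illustration, with the single-object variant a caller toggles the identity like this (a sketch; the SystemAssigned type comes from the question):
# terraform.tfvars -- assigning an identity:
identities = {
  type         = "SystemAssigned"
  identity_ids = []
}

# To remove the identity on a later apply, set the variable back to
# null (its default): var.identities[*] then becomes an empty list,
# the dynamic block emits no identity block, and Terraform plans
# the removal.
# identities = null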

Iterate over map with lists in terraform 0.12

I am using terraform 0.12.8 and I am trying to write a resource which would iterate over the following variable structure:
variable "applications" {
type = map(string)
default = {
"app1" = "test,dev,prod"
"app2" = "dev,prod"
}
}
My resource:
resource "aws_iam_user" "custom" {
for_each = var.applications
name = "circleci-${var.tags["ServiceType"]}-user-${var.tags["Environment"]}-${each.key}"
path = "/"
}
So I can iterate over my map. However, I can't figure out how to verify that var.tags["Environment"] is enabled for a specific app, e.g. app1.
Basically, I want the resource to be created for each application only if the Environment value appears in the comma-separated list associated with that app's name in the applications map.
Could someone help me out here?
Please note that I am happy to go with a different variable structure if you have something to propose that would accomplish my goal.
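For illustration, a filtered for_each along these lines would express that condition (a sketch based on the variable structure above; it assumes var.tags["Environment"] matches the entries in the comma-separated lists exactly):
resource "aws_iam_user" "custom" {
  # Keep only the applications whose environment list contains
  # the current environment.
  for_each = {
    for app, envs in var.applications :
    app => envs
    if contains(split(",", envs), var.tags["Environment"])
  }

  name = "circleci-${var.tags["ServiceType"]}-user-${var.tags["Environment"]}-${each.key}"
  path = "/"
}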

Terraform how to interpolate map types in a template_file?

I am trying to pass a map variable into a template_file, and am getting the following error:
vars (varsname): '' expected type 'string', got unconvertible type 'map[string]interface {}'
data "template_file" "app" {
template = "${file("./app_template.tpl")}"
vars {
container = "${var.container-configuration}"
}
}
variables.tf
variable "container-configuration" {
description = "Configuration for container"
type = "map"
default = {
image = "blahblah.dkr.ecr.us-east-2.amazonaws.com/connect"
container-port = "3000"
host-port = "3000"
cpu = "1024"
memory = "2048"
log-group = "test"
log-region = "us-east-2a"
}
}
Is there a way to pass the map into the template file for interpolation? I haven't found anything clear in the documentation.
Terraform v0.12 introduced the templatefile function, which absorbs the main use-cases of the template_file data source, and accepts values of any type:
templatefile("${path.module}/app_template.tpl", {
container = var.container-configuration
})
Terraform v0.11 and earlier have no means of rendering a template with non-string values. The limitation stems from the protocol used to pass values to the template: it could only represent maps of strings, a restriction lifted by the new protocol introduced in Terraform v0.12.
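For illustration, the template can then index into the map directly. A sketch of what app_template.tpl might contain, using the key names from the variable's default above (the container name is made up):
[
  {
    "name": "connect",
    "image": "${container["image"]}",
    "cpu": ${container["cpu"]},
    "memory": ${container["memory"]},
    "portMappings": [
      {
        "containerPort": ${container["container-port"]},
        "hostPort": ${container["host-port"]}
      }
    ]
  }
]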
Passing maps is not yet supported; see the template_file documentation.
From that documentation:
Variables must all be primitives. Direct references to lists or maps will cause a validation error.
It means you need to pass the map's values individually.
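In practice that means flattening the map yourself in v0.11, along these lines (a sketch; the template variable names use underscores because ${container-port} would be parsed as subtraction inside a template):
data "template_file" "app" {
  template = "${file("./app_template.tpl")}"

  vars {
    image          = "${var.container-configuration["image"]}"
    container_port = "${var.container-configuration["container-port"]}"
    host_port      = "${var.container-configuration["host-port"]}"
    cpu            = "${var.container-configuration["cpu"]}"
    memory         = "${var.container-configuration["memory"]}"
  }
}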