Pass complex, non-primitive data types to Terraform template provider

Having a more complex list object like this:
variable "proxy" {
  type = list(object({
    enabled  = bool
    host     = string
    port     = number
    user     = string
    password = string
  }))
  default = [
    {
      enabled  = false
      host     = ""
      port     = 0
      user     = ""
      password = ""
    }
  ]
}
I want to use this in an external template (cloudinit in my case). The template_file data source allows passing variables to a template. Sadly, not for more complex types:
Note that variables must all be primitives. Direct references to lists or maps will cause a validation error.
So something like this
data "template_file" "cloudinit_data" {
template = file("cloudinit.cfg")
vars = {
proxy = var.proxy
}
}
causes the error
Inappropriate value for attribute "vars": element "proxy": string required.
This leads me to two questions:
How can I pass the variable to the template? I assume that I need to convert it to a primitive type like this:
vars = {
  proxy_host = var.proxy.host
}
This doesn't work:
This value does not have any attributes.
Is there an alternative way to pass this object directly to the template?
I'm using v0.12.17.

The template_file data source continues to exist only for compatibility with configurations written for Terraform 0.11. Since you are using Terraform 0.12, you should use the templatefile function instead, which is a built-in part of the language and supports all value types.
Because templatefile is a function, you can call it from anywhere expressions are expected. If you want to use the rendered result multiple times then you could define it as a named local value, for example:
locals {
  cloudinit_data = templatefile("${path.module}/cloudinit.cfg", {
    proxy = var.proxy
  })
}
If you only need this result once -- for example, if you're using it just to populate the user_data of a single aws_instance resource -- then you can just write this expression inline in the resource block, to keep everything together and make the configuration (subjectively) easier to read:
resource "aws_instance" "example" {
# ...
user_data = templatefile("${path.module}/cloudinit.cfg", {
proxy = var.proxy
})
}
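Inside cloudinit.cfg the value keeps its full structure, so the template can index into the list and reference the object attributes directly. A minimal sketch of what that file might contain (the proxy-export content and file path are purely illustrative):
#cloud-config
# (illustrative contents; the proxy script path is made up)
%{ if proxy[0].enabled ~}
write_files:
  - path: /etc/profile.d/proxy.sh
    content: |
      export http_proxy="http://${proxy[0].user}:${proxy[0].password}@${proxy[0].host}:${proxy[0].port}"
%{ endif ~}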

Related

Terraform: removal of identity block does not remove identity assigned from resource Azure Logic App

I have this in my main.tf:
dynamic "identity" {
  for_each = var.identity == [] ? [] : [1]
  content {
    type = lookup(var.identity, "type", null)
    #identity_ids = lookup(var.identity, "identity_ids", null)
  }
}
I have defined the variable as below.
variable "identity" {
  description = "creates the identity for Logic App."
  type        = any
  default     = []
}
Removing the identity block from the input does not remove the assigned identity. Terraform does not detect the change. Can someone help?
Also, Logic App Standard only supports SystemAssigned, but the doc says something else:
https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/logic_app_standard
There seems to be some type confusion in your configuration here, but Terraform isn't able to detect and report it because you didn't give a specific type constraint for your variable.
Specifically, it's not clear whether you intended var.identity to be a list of objects or a single object. You declared the default as [], suggesting you meant a list, but the content of the dynamic "identity" block treats var.identity as if it's just a single object.
I'm going to write this out both ways, so you can choose which one meets your actual requirement.
For a list of "identities" with one identity block each:
variable "identities" {
type = list(object({
type = string
identity_ids = set(string)
}))
default = []
}
resource "example" "example" {
dynamic "identity" {
for_each = var.identities
content {
type = each.value.type
identity_ids = each.value.identity_ids
}
}
}
For a single "identity" object that is optional:
variable "identities" {
type = object({
type = string
identity_ids = set(string)
})
default = null
}
resource "example" "example" {
dynamic "identity" {
for_each = var.identities[*]
content {
type = each.value.type
identity_ids = each.value.identity_ids
}
}
}
In this second example, notice that:
The type constraint for variable "identities" is now just for an object type directly, without the list(...) from the first example.
The default value for that variable is now null, which is the typical way to represent the absence of a single value.
The dynamic "identity" block's for_each expression uses the [*] operator, called the "splat operator", which has a special behavior where it'll convert a null value into an empty list and a non-null value into a single-element list, thus producing a suitable collection value for the for_each argument.
I would recommend always writing type constraints for your input variables, because then Terraform can give you better feedback in situations like yours where you were not consistent in the types you were using. If you use any in a type constraint then Terraform will have less insight into what you are intending and so its error messages will typically be less specific and possibly even misleading, if it makes an incorrect assumption about what your goals were.

terraform variable default value interpolation from locals

I have a use case where I need two AWS providers for different resources. The default aws provider is configured in the main module which uses another module that defines the additional aws provider.
By default, I'd like both providers to use the same AWS credentials unless explicitly overridden.
I figured I could do something like this. In the main module:
locals {
  foo_cloud_access_key = aws.access_key
  foo_cloud_secret_key = aws.secret_key
}
variable "foo_cloud_access_key" {
  type    = string
  default = local.foo_cloud_access_key
}
variable "foo_cloud_secret_key" {
  type    = string
  default = local.foo_cloud_secret_key
}
where variables foo_cloud_secret_key and foo_cloud_access_key would then be passed down to the child module like this:
module "foobar" {
  ...
  foobar_access_key = var.foo_cloud_access_key
  foobar_secret_key = var.foo_cloud_secret_key
  ...
}
where module foobar would then configure its additional aws provider with these variables:
provider "aws" {
alias = "foobar_aws"
access_key = var.foobar_access_key
secret_key = var.foobar_secret_key
}
When I run terraform init, Terraform spits out this error (for both variables):
Error: Variables not allowed
on variables.tf line 66, in variable "foo_cloud_access_key":
66: default = local.foo_cloud_access_key
Variables may not be used here.
Is it possible to achieve something like this in terraform or is there any other way to go about this?
Having complex, computed default values of variables is possible, but only with a workaround:
define a dummy default value for the variable, e.g. null
define a local value whose value is either the value of the variable or the actual default value
variable "something" {
default = null
}
locals {
some_computation = ... # based on whatever data you want
something = var.something == null ? local.some_computation : var.something
}
Then only use local.something instead of var.something in the rest of the Terraform files.
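Applied to the question's setup, the pattern might look roughly like this; local.default_access_key and local.default_secret_key are stand-ins for however the fallback credentials are actually obtained:
variable "foo_cloud_access_key" {
  type    = string
  default = null
}
variable "foo_cloud_secret_key" {
  type    = string
  default = null
}
locals {
  # fall back to module-internal defaults when the caller passed nothing;
  # local.default_access_key / local.default_secret_key are assumed to be defined elsewhere
  foo_cloud_access_key = var.foo_cloud_access_key != null ? var.foo_cloud_access_key : local.default_access_key
  foo_cloud_secret_key = var.foo_cloud_secret_key != null ? var.foo_cloud_secret_key : local.default_secret_key
}
module "foobar" {
  source            = "./modules/foobar" # hypothetical path
  foobar_access_key = local.foo_cloud_access_key
  foobar_secret_key = local.foo_cloud_secret_key
}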

Terraform variable to assign using function

variable "cidr" {
type = map(string)
default = {
development = "x.1.0.0/16"
qa = "x.1.0.0/16"
default = "x.1.0.0/16"
}
}
variable "network_address_space" {
default = lookup(var.cidr, var.environment_name,"default")
}
I am getting the error "Error: Function calls not allowed".
variable "subnet_address_space": cidr_subnet2_address_space = cidrsubnet(var.network_address_space,8,1)
A Terraform Input Variable is analogous to a function argument in a general-purpose programming language: its value comes from an expression in the calling module, not from the current module.
The default mechanism allows us to substitute a value for when the caller doesn't specify one, but because variables are intended for getting data into a module from the outside, it doesn't make sense to set the default to something from inside that module: that would cause the result to potentially be something the caller of the module could never actually specify, because they don't have access to the necessary data.
Terraform has another concept Local Values which are roughly analogous to a local variable within a function in a general-purpose programming language. These can draw from function results and other objects in the current module to produce their value, and so we can use input variables and local values together to provide fallback behaviors like you've illustrated in your question:
var "environment_name" {
type = string
}
var "environment_default_cidr_blocks" {
type = map(string)
default = {
development = "10.1.0.0/16"
qa = "10.2.0.0/16"
}
}
var "override_network_range" {
type = string
default = null # If not set by caller, will be null
}
locals {
  subnet_cidr_block = (
    var.override_network_range != null ?
    var.override_network_range :
    var.environment_default_cidr_blocks[var.environment_name]
  )
}
Elsewhere in the module you can use local.subnet_cidr_block to refer to the final CIDR block selection, regardless of whether it was set explicitly by the caller or by lookup into the table of defaults.
When a module uses computation to make a decision like this, it is sometimes useful for the module to export its result as an Output Value so that the calling module can make use of it too, similar to how Terraform resources also export additional attributes recording decisions made by the provider or by the remote API:
output "subnet_cidr_block" {
value = local.subnet_cidr_block
}
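From the calling module's point of view, the usage might then look something like this; the module path and names are illustrative:
module "network" {
  source           = "./modules/network" # hypothetical path
  environment_name = "qa"

  # Uncomment to bypass the per-environment defaults:
  # override_network_range = "10.9.0.0/16"
}

# The caller can then reuse the module's final decision:
# module.network.subnet_cidr_block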
As stated in Interpolate variables inside .tfvars to define another variable by the HashiCorp person, it is intended to be constant by design.
Input variables are constant values passed into the root module, and so they cannot contain interpolations or other expressions that do not yield a constant value.
We cannot use variables in the backend configuration either, as in Using variables in terraform backend config block.
These are the things we Terraform users tripped on at some point, I suppose.

Terraform how to interpolate map types in a template_file?

I am trying to pass a map variable into a template_file, and am being thrown this error:
vars (varsname): '' expected type 'string', got unconvertible type 'map[string]interface {}'
data "template_file" "app" {
template = "${file("./app_template.tpl")}"
vars {
container = "${var.container-configuration}"
}
}
variables.tf
variable "container-configuration" {
description = "Configuration for container"
type = "map"
default = {
image = "blahblah.dkr.ecr.us-east-2.amazonaws.com/connect"
container-port = "3000"
host-port = "3000"
cpu = "1024"
memory = "2048"
log-group = "test"
log-region = "us-east-2a"
}
}
Is there a way to pass the map into the template file for interpolation? I haven't found anything clear in the documentation.
Terraform v0.12 introduced the templatefile function, which absorbs the main use-cases of the template_file data source, and accepts values of any type:
templatefile("${path.module}/app_template.tpl", {
  container = var.container-configuration
})
Terraform v0.11 and earlier do not have any means to render a template with non-string values. The limitation exists due to the nature of the protocol used to represent map values in the configuration: it was only capable of representing maps of strings until the new protocol introduced in Terraform v0.12.
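For reference, inside app_template.tpl the map can then be used directly; this is only a sketch using the question's key names, and note that keys containing hyphens need the index syntax rather than attribute access:
# app_template.tpl -- illustrative contents only
image = ${container.image}
port  = ${container["container-port"]}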
Passing maps is not yet supported, see template_file documentation.
From that article:
Variables must all be primitives. Direct references to lists or maps will cause a validation error.
It means you need to pass variables one by one individually.
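In Terraform 0.11 that workaround would look roughly like this, flattening the map into individual string vars (the var names are illustrative and must match whatever the template references):
data "template_file" "app" {
  template = "${file("./app_template.tpl")}"
  vars {
    # illustrative names; the template would reference container_image, container_port, host_port
    container_image = "${var.container-configuration["image"]}"
    container_port  = "${var.container-configuration["container-port"]}"
    host_port       = "${var.container-configuration["host-port"]}"
  }
}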

Explicitly using variable default in Terraform

I have a map of values (populated from Consul), which I use to configure my resources. However, if values for optional variables are missing, I would like Terraform to act as if the parameter was not provided. For example:
resource "aws_db_instance" "db" {
engine = "${lookup(config_map, "db_engine", "postgres")}"
port = "${lookup(config_map, "db_port", "<pick default for the engine>")}"
}
If port is not given, Terraform picks a default value depending on the engine. Can I trigger this behavior explicitly?
The following should do what you expect (the syntax is validated, but apply has not been tested: I'll update the answer if it works, or delete it otherwise).
First, you need a mapping between engines and default ports somewhere (here it is a variable, but it could be stored in Consul like your config_map):
variable "default_ports_by_engine" {
type = "map"
# key = engine, value = port
default = {
"postgres" = "3333"
"mysql" = "3334"
# other engines/ports...
}
}
Then, you can use this variable in a nested lookup:
resource "aws_db_instance" "db" {
engine = "${lookup(config_map, "db_engine", "postgres")}"
port = "${
lookup(
var.default_ports_by_engine,
"${lookup(config_map, "db_engine", "postgres")}"
)
}"
}
Notice that not passing a third argument to lookup function will make Terraform fail if db_engine is not found in default_ports_by_engine.
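If a hard failure is not what you want, lookup also accepts a default as its third argument; the fallback port below is just an example value:
# "5432" here is only an example fallback port
port = "${lookup(var.default_ports_by_engine, lookup(config_map, "db_engine", "postgres"), "5432")}"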
