include a terraform template in another template - terraform

I have many template files that are used by Terraform scripts. All of the template files share a common part, e.g.:
file a.tmpl:
env=prod
var=a
-------------------
file b.tmpl:
env=prod
var=b
I would like to extract the common part into a separate file so that it doesn't have to be repeated in every file, something like:
file base.tmpl:
env=prod
-------------------
file a.tmpl:
%{ include "base.tmpl" }
var=a
-------------------
file b.tmpl:
%{ include "base.tmpl" }
var=b
but that feature doesn't exist
(it is very similar to the Django templates feature described here: https://stackoverflow.com/a/10985987/245024).
Is there a way to do the include somehow?
I was able to do a workaround by concatenating the files like this:
data "template_file" "vars_a" {
template = "${format("%s \n %s",
file("${path.module}/base.tmpl"),
file("${path.module}/a.tmpl")
)}"
}
but that is more limiting than including the base template directly in the file.

I think you could use templatefile:
a.tmpl
${file("base.tmpl")}
var=a
base.tmpl
var_ddd=ffff
var_sss=adfs
main.tf
data "template_file" "vars_a" {
template = templatefile("a.tmpl", {})
}
output "test" {
value = data.template_file.vars_a.template
}
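On Terraform 0.12 and later you can also skip the template_file data source and use the templatefile result directly; a minimal sketch of that variant, assuming a.tmpl sits next to the configuration:
locals {
  # render a.tmpl (which itself pulls in base.tmpl via file()) with no extra variables
  vars_a = templatefile("${path.module}/a.tmpl", {})
}

output "test" {
  value = local.vars_a
}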

Related

Is there a way to extend a variables.tf file with another?

An example scenario would be that I have a module, and that module has variables. For example, an S3 bucket module would need a variable for the name of the bucket.
The variables.tf file would look like:
variable "bucket_name" { type = string }
Now if I want to use that module to create the S3 bucket and then assign it to a CloudFront distribution, I have to pass the bucket name down to the bucket module from that level. If I also want to give the distribution a name, I need to create a variables.tf for the distribution with the distribution name defined, but because I use the S3 bucket module, I have to include all of the S3 bucket's variables as well. So the variables.tf file of the distribution would look like:
variable "bucket_name" { type = string }
variable "distribution_name" { type = string }
If I have a module that is only for us-east-1, and it needs the CloudFront distribution, I need to create a variables.tf file for that as well, containing all of the above.
If I, for example, have multiple AWS accounts for staging and prod, and want the distribution included in both (or only in one, it doesn't matter), I have to create yet another variables.tf file containing all of the above.
Every time I add a new variable, I have to update at least four variables.tf files, which is definitely not the way to go (Terraform is supposed to reduce hard coding).
What I would like is, as I go up the ladder (from the S3 module to the environment module), to import the variables.tf file of the "child module" and extend it.
Just for an imaginary scenario, I'd like to:
# bucket/variables.tf
variable "bucket_name" { type = string }

# distribution/variables.tf
import_variables {
  source = "../bucket"
}

# us-east-1/variables.tf
import_variables {
  source = "../distribution"
}

# staging/us-east-1/variables.tf
import_variables {
  source = "../../us-east-1"
}
Am I taking a completely wrong approach to Terraform, or have I just missed a method with which this variable definition sharing could be done?
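For context, this is roughly what the repetition looks like with plain Terraform today: each wrapping module re-declares the variables it needs and forwards them into the module call by hand (paths and names here mirror the imaginary layout above):
# distribution/variables.tf
variable "bucket_name" { type = string }
variable "distribution_name" { type = string }

# distribution/main.tf
module "bucket" {
  source      = "../bucket"
  bucket_name = var.bucket_name # forwarded explicitly at every level
}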

How do I set a computed local variable

I have a tf.json file that declares a bunch of local variables. One of the variables is an array of complex objects, like so:
{
  "locals": [
    {
      "ordered_cache_behaviors": [
        {
          "path_pattern": "/poc-app-angular*",
          "s3_target": "dev-ui-static",
          "ingress": "external"
        }
      ]
    }
  ]
}
Here is what I want to do: instead of declaring the variable ordered_cache_behaviors statically in my file, I want this to be a computed value. I will get this configuration from an S3 bucket and set the value here. So statically the value will only be an empty array [] that I will append to with a script after getting the data from S3.
This logic needs to execute each time before a terraform plan or terraform apply. What is the best way to do this? I am assuming I need to use a provisioner to fire off a script? If so, how do I then set the local variable here?
If the cache configuration data can be JSON-formatted, you may be able to use the aws_s3_bucket_object data source plus the jsondecode function as an alternative approach:
Upload your cache configuration data to the poc-app-cache-config bucket as cache-config.json, and then use the following to have Terraform download that file from S3 and parse it into your local ordered_cache_behaviors variable:
data "aws_s3_bucket_object" "cache_configuration" {
bucket = "poc-app-cache-config"
key = "cache-config.json" # JSON-formatted cache configuration map
}
...
locals {
ordered_cache_behaviors = jsondecode(aws_s3_bucket_object.cache_configuration.body)
}
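The decoded value then behaves like an ordinary list of objects and can be referenced elsewhere in the configuration; a minimal sketch to verify the structure (the output name is illustrative):
output "first_path_pattern" {
  value = local.ordered_cache_behaviors[0].path_pattern # "/poc-app-angular*"
}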

How to define global variables in terraform?

I have a Terraform project I am working on. In it, I want a file to contain many variables. I want these variables to be accessible from any module of the project. I have looked in the docs and in a Udemy course but still don't see how to do this. How does one do this in Terraform? Thanks!
I don't think this is possible. There are several discussions about this on GitHub, but this is not something the HashiCorp team wants:
In general we're against the particular solution of Global Variables, since it makes the input -> resources -> output flow of Modules less explicit, and explicitness is a core design goal.
I know, we have to repeat a lot of variables between different modules.
This is what I try to do:
Define the set of common variables, e.g. common_tags, at the bottom/top of the variables.tf file for all modules. Usually your tech ops admin/cloud admin will have a template project created for this.
For each module call, I add the following as the last item:
global_var = var.global_var_val
An example is common tags. In any root/child module I combine them using the merge() function, as sketched below. I hope that makes sense.
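A minimal sketch of that merge pattern, assuming a common_tags variable of type map(string) (the component tag is illustrative):
variable "common_tags" {
  type    = map(string)
  default = {}
}

locals {
  # module-specific tags layered on top of the shared tags passed in from the parent
  tags = merge(var.common_tags, { component = "api" })
}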
You can use -var-file to pass the overrides. Example:
terraform -chdir=staging/000base apply -var-file ../../global.tfvars
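The values in global.tfvars still have to correspond to variables declared in the root module being applied; a minimal sketch of what that might look like (names and values are illustrative):
# global.tfvars
environment = "staging"
common_tags = {
  owner = "platform"
}

# staging/000base/variables.tf
variable "environment" { type = string }
variable "common_tags" { type = map(string) }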
Another way can be a dedicated shared module.
For example, you have a module with:
output "my_var" {
value = "apollo440"
}
And in other modules you can use it like this:
module "gvars" {
# source = "git#github.com:user/terraform-global-vars.git?ref=v0.0.1-alpha"
source="../../../modules/global_vars"
}
provider "something" {
name="${module.gvars.my_var}"
}
A third way could be to use the http data source (https://registry.terraform.io/providers/hashicorp/http/latest/docs/data-sources/http) to query some HTTP endpoint.
Oh... and a fourth way could be to utilize the Vault provider: https://registry.terraform.io/providers/hashicorp/vault/latest/docs
For passing constants between lots of modules we can use the following simple way.
/modules/global_constants/outputs.tf:
Describe a module with global constants as outputs:
output "parameter_1" {
value = "value_1"
sensitive = true
}
output "parameter_2" {
value = "value_2"
sensitive = true
}
/example_1.tf
After that we can use it in any *.tf file:
module "global_settings" {
source = "./modules/global_constants"
}
data "azurerm_key_vault" "keyvault" {
name = module.global_settings.parameter_1
resource_group_name = module.global_settings.parameter_2
}
/modules/module2/main.tf
Or in any other modules:
module "global_settings" {
source = "../global_constants"
}
data "azurerm_key_vault" "keyvault" {
name = module.global_settings.parameter_1
resource_group_name = module.global_settings.parameter_2
}

How can I get the Terraform module name programmatically?

I have defined the following Terraform module:
module "lambda" {
source = "../lambda"
region = "us-west-1"
account = "${var.account}"
}
How can I take advantage of the module name to set the source parameter with an interpolation? I wish for something like:
module "lambda" {
source = "../${this.name}"
region = "us-west-1"
account = "${var.account}"
}
locals {
  module = basename(abspath(path.module))
}

{
  ...
  some-id = local.module
  ...
}
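For instance, the local can feed names or tags; a minimal sketch (the aws_s3_bucket resource and the tag key are illustrative):
resource "aws_s3_bucket" "artifacts" {
  bucket = "artifacts-${local.module}" # bucket name derived from the module directory name

  tags = {
    module = local.module
  }
}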
I think it is not possible. There's a self object that allows you to reference attributes within your resource, but the identifier is not an attribute. Also, self is only allowed within provisioners.
I guess the only way to accomplish what you want is templating the .tf files, like:
module "{{ my-module }}" {
  source  = "../{{ my-module }}"
  region  = "us-west-1"
  account = "${var.account}"
}
but you should render the templates before terraform init. It's straightforward to set up in a CI pipeline, but I find it cumbersome when working locally.

Terraform > Unescaped interpolations

What does this mean:
Note: Inline templates must escape their interpolations (as seen by the double $ above). Unescaped interpolations will be processed before the template.
from https://www.terraform.io/docs/providers/template/index.html
The specific example is:
# Template for initial configuration bash script
data "template_file" "init" {
  template = "$${consul_address}:1234"

  vars {
    consul_address = "${aws_instance.consul.private_ip}"
  }
}
The ${} syntax is used by HCL for interpolation before the template rendering happens, so if you were to just use:
# Template for initial configuration bash script
data "template_file" "init" {
  template = "${consul_address}:1234"

  vars {
    consul_address = "${aws_instance.consul.private_ip}"
  }
}
Terraform will attempt to find consul_address to template into the output instead of using the template variable consul_address (which in turn is resolved to the private_ip output of the aws_instance.consul resource).
This is only an issue for inline templates; you don't need to do this for file-based templates. For example, this would be fine:
init.tpl
#!/bin/bash
echo ${consul_address}
template.tf
# Template for initial configuration bash script
data "template_file" "init" {
  template = "${file("init.tpl")}"

  vars {
    consul_address = "${aws_instance.consul.private_ip}"
  }
}
Of course, if you also needed to use the ${} syntax literally in your output template, then you would need to double-escape with something like this:
#!/bin/bash
CONSUL_ADDRESS_VAR=${consul_address}
echo $${CONSUL_ADDRESS_VAR}
This would then be rendered as:
#!/bin/bash
CONSUL_ADDRESS_VAR=1.2.3.4
echo ${CONSUL_ADDRESS_VAR}
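The same escaping rule carries over to the built-in templatefile function in Terraform 0.12+, which supersedes the template_file data source; a minimal sketch assuming an init.tpl like the one above:
locals {
  # literal ${...} in init.tpl still needs to be written as $${...}
  init_script = templatefile("${path.module}/init.tpl", {
    consul_address = aws_instance.consul.private_ip
  })
}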
