Load config from an external file into Terraform

I use Terraform to provision some Google infrastructure. I would like to store some configuration variables in an external (non-Terraform) config file. The idea is to use those variables both in Terraform and in Bash, so I'd rather not use a typical .tfvars file. How can I achieve this?
I have three files; for simplicity, let's assume they are all stored in the same directory.
General configuration file with the variables to ingest:
# config.txt
GOOGLE_PROJECT_ID='my-test-name'
GOOGLE_REGION='my-region'
Terraform file with the datasources:
# datasources.tf
data "local_file" "local_config_file" {
  filename = "./config.txt"
}
Terraform file with the variables:
# variables.tf
variable "project_id" {}
variable "region" {
  default = "europe-west3"
}
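Note that the local_file data source only exposes the raw file content; the KEY='value' lines still have to be parsed inside Terraform before they can be used. A minimal sketch of one way to do that (assuming Terraform 0.12+ functions such as split, trimspace, and trim):

```hcl
# Sketch: parse KEY='value' lines from config.txt into a map,
# skipping blank lines and comments.
locals {
  config_lines = [
    for line in split("\n", data.local_file.local_config_file.content) :
    trimspace(line)
    if length(trimspace(line)) > 0 && substr(trimspace(line), 0, 1) != "#"
  ]
  config = {
    for line in local.config_lines :
    split("=", line)[0] => trim(split("=", line)[1], "'")
  }
}
```

With the config.txt above, local.config["GOOGLE_PROJECT_ID"] would then evaluate to "my-test-name".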

If all of the variables you'd like to use in Terraform are strings, you can define them as environment variables and use them both in Terraform and in your Bash scripts:
Terraform will read environment variables in the form of TF_VAR_name to find the value for a variable. For example, the TF_VAR_region variable can be set in the shell to set the region variable in Terraform.
# config.sh
export TF_VAR_region="my-region"
export TF_VAR_project_id="my-test-name"
Note that this approach won't work for list or map type variables:
Note: Environment variables can only populate string-type variables. List and map type variables must be populated via one of the other mechanisms.
See the Terraform documentation on environment variables for more information.

Related

Handling variables in nested terraform structure

I've set up a terraform project with the following folder structure:
modules/environments/[$environment,all]/regions/[$region,all]
modules/resources/[$resource]
environments/[$environment]/[$region]
In modules/resources I have folders like api, load balancer, etc. In modules/environments/[$environment,all]/regions/[$region,all], I import the required modules from modules/resources/[$resource]. Given the environment is prod, in eu-west-1 for example, I import modules from modules/environments/[prod,all]/regions/[eu-west-1,all].
If I add a new variable, I have to update the variables.tf file in all the affected places, and in every main.tf, where I assign the variables to the modules. This is a lot of effort.
It would be much easier if I didn't have to assign the variables at the module {} level, and each module instead automatically picked up its variables from the .tfvars file used when planning or applying, but that's not how it works.
Is there any workaround for this?

Can the config crate merge environment variables into a subsection of a hierarchical config file?

I am using the config crate in Rust, and would like to use environment variables to set keys inside a section of the config. The end goal is to override application settings from a Docker Compose file, or the docker command line, using the environment.
If my config were the following, could I use a specially crafted environment variable to set database.echo?
(code blurb below is taken from this example)
debug = true
[database]
echo = true
The example code for configuring this via environment variables only illustrates setting keys at the top level. I'm wondering how to extend this. The .set() method takes a hierarchical key, so I'm hopeful that there's a way to encode the path in the environment variable name.
Answering my own question.
I just noticed that the Environment code accepts a custom separator which will get replaced with . (dot).
So one can set the separator to something like _XX_, and that would get mapped to a ".". Setting DATABASE_XX_ECHO=true, for instance, would then change the database.echo key.
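A minimal sketch of that wiring, assuming the config crate's builder-style API (circa version 0.13; the settings file name is an assumption, and the key names come from the example above):

```rust
use config::{Config, Environment, File};

fn main() {
    // "_XX_" in an env var name is translated to "." in the key path,
    // so DATABASE_XX_ECHO=true overrides database.echo.
    let settings = Config::builder()
        // settings.toml containing: debug = true, [database] echo = true
        .add_source(File::with_name("settings"))
        .add_source(Environment::default().separator("_XX_"))
        .build()
        .expect("failed to build configuration");

    let echo = settings
        .get_bool("database.echo")
        .expect("database.echo not set");
    println!("database.echo = {echo}");
}
```

Running with DATABASE_XX_ECHO=false in the environment would then report the overridden value rather than the one from the file.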

Terraform. Map in tfvars in a module

I'm writing a terraform module that will select a value from a map, based on a key passed to it.
Nothing unusual.
The values in that map are secret however, and do not change based on who is calling the module.
A simple way I thought to keep those secrets secret would be to define the variable as a map in variables.tf in the module, put the keys/values in terraform.tfvars in the module, .gitignore terraform.tfvars, and encrypt it to terraform.tfvars.gpg or something.
But that doesn't work: because I have no default for the variable in the module, Terraform expects the variable to be set by the caller.
I can define the variable in the caller without a value, add it to the call, and either specify --var-file manually or include a terraform.tfvars in the caller. But that means the user has to remember a magic --var-file invocation, or terraform.tfvars has to be duplicated everywhere my new module is used.
Remembering magic strings and duplication are both not good options.
Is it possible for a module to use its own tfvars to fill in variables not passed to it?
There is no way to use an automatic .tfvars file with a non-root module. Child modules always get all of their values from the calling module block (with default values inserted where appropriate); .tfvars is only for assigning values to root module variables.
Another option with similar characteristics to what you're describing is to use a data file in either JSON or YAML format inside the module directory, and load it in using the file function and one of the decoding functions. For example:
locals {
  # This file must be created before using this module by
  # decrypting secrets.yaml.gpg in this directory.
  secrets = yamldecode(file("${path.module}/secrets.yaml"))
}
If the caller neglects to decrypt into the .gitignored file before using the module then the file call will fail with a "file not found" error message.
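As a hypothetical illustration (the key names here are invented), the decrypted file might look like:

```yaml
# secrets.yaml — produced by decrypting secrets.yaml.gpg before running Terraform
database_password: "s3cr3t"
api_token: "tok-123"
```

Elsewhere in the module the values would then be available as local.secrets.database_password, and so on.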
I'm not totally sure I understand but taking a stab. Are you using AWS? A fun solution I've used in the past is SSM parameters.
data "aws_ssm_parameter" "foo" {
  name = "foo"
}
...
value = data.aws_ssm_parameter.foo.value
...
The SSM param could be created outside of tf and looked up in your module (and policies granting access depending on caller via IAM, or whatever).

Referring to sourced ARM variables in Terraform

I am trying to create AKS using Terraform. In the service principal block we need to pass client_id and client_secret. Terraform has the ability to read environment variables and source them if they are named TF_VAR_name.
The Terraform docs also mention that for the provider block we can export the client-related variables as ARM_CLIENT_name-style environment variables. So my question is: how do I use those ARM variables when provisioning my AKS cluster?
Right now I am doing like this
- export ARM_CLIENT_ID=$AZ_USERNAME
- export ARM_CLIENT_SECRET=$AZ_PASSWORD
- export ARM_TENANT_ID=$AZ_TENANT
- export ARM_SUBSCRIPTION_ID=$AZ_SUBSCRIPTION_ID
If I can't refer to the above env variables, then I should do:
- export ARM_CLIENT_ID=$AZ_USERNAME
- export TF_VAR_client_id=$AZ_USERNAME
- export ARM_CLIENT_SECRET=$AZ_PASSWORD
- export TF_VAR_client_secret=$AZ_PASSWORD
- export ARM_TENANT_ID=$AZ_TENANT
- export ARM_SUBSCRIPTION_ID=$AZ_SUBSCRIPTION_ID
What you show in the question are two different situations.
One is authentication for the Azure provider. It can read the necessary input from environment variables such as ARM_CLIENT_ID=$AZ_USERNAME, ARM_CLIENT_SECRET=$AZ_PASSWORD, ARM_TENANT_ID and ARM_SUBSCRIPTION_ID.
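In other words, nothing extra is needed in the configuration itself for this case; a bare provider block is enough (a sketch):

```hcl
# The azurerm provider picks up ARM_CLIENT_ID, ARM_CLIENT_SECRET,
# ARM_TENANT_ID and ARM_SUBSCRIPTION_ID from the environment
# automatically, so no credential arguments are needed here.
provider "azurerm" {
  features {}
}
```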
The other is referencing normal variables. You can export all the variables you need as environment variables with the prefix TF_VAR_, but you also need to declare those variables inside the Terraform configuration. As the docs put it:
As a fallback for the other ways of defining variables, Terraform
searches the environment of its own process for environment variables
named TF_VAR_ followed by the name of a declared variable.
For example, if you want to read the variable aksname from an environment variable, you need to do two things:
export the environment variable with the TF_VAR_ prefix:
export TF_VAR_aksname=example-aks
declare the variable aksname inside the Terraform file and reference it; here I just reference it in an output block:
variable "aksname" {}

output "aks_name" {
  value = var.aksname
}
Then the output will look something like: aks_name = example-aks

terraform.tfvars vs variables.tf difference [duplicate]

This question already has answers here:
What is the difference between variables.tf and terraform.tfvars?
(5 answers)
Closed 3 years ago.
I've been researching this but can't find the distinction. A variables.tf file can store variable defaults/values, like a terraform.tfvars file can.
What's the difference between these two, and why would you need one over the other? My understanding is that you pass in the var file as an argument to terraform via the command line.
There is a thread about this already, and the only benefit seems to be passing in the tfvars file as an argument, since you can "potentially" do assignment of variables in a variables.tf file too.
Is this the correct thinking?
The distinction between these is of declaration vs. assignment.
variable blocks (which can actually appear in any .tf file, but are in variables.tf by convention) declare that a variable exists:
variable "example" {}
This tells Terraform that this module accepts an input variable called example. Stating this makes it valid to use var.example elsewhere in the module to access the value of the variable.
There are several different ways to assign a value to this input variable:
Include -var options on the terraform plan or terraform apply command line.
Include -var-file options to select one or more .tfvars files to set values for many variables at once.
Create a terraform.tfvars file, or files whose names end in .auto.tfvars, which are treated the same as -var-file arguments but are loaded automatically.
For a child module, include an expression to assign to the variable inside the calling module block.
A variable can optionally be declared with a default value, which makes it optional. Variable defaults are used for situations where there's a good default behavior that would work well for most uses of the module/configuration, while still allowing that behavior to be overridden in exceptional cases.
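For example, a variable declared with a default (the region name here is just illustrative):

```hcl
variable "region" {
  type    = string
  default = "europe-west3" # used when the caller does not set the variable
}
```

A caller can then omit region entirely, or override it via any of the mechanisms listed above.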
The various means for assigning variable values are for dealing with differences. What that means will depend on exactly how you are using Terraform, but for example if you are using the same configuration multiple times to deploy different "copies" of the same infrastructure (environments, etc) then you might choose to have a different .tfvars file for each of these copies.
Because terraform.tfvars and .auto.tfvars are automatically loaded without any additional options, they behave similarly to defaults, but the intent of these is different. When running Terraform in automation, some users have their automation generate a terraform.tfvars file or .auto.tfvars just before running Terraform in order to pass in values the automation knows, such as what environment the automation is running for, etc.
The difference between the automatically-loaded .tfvars files and variable defaults is more clear when dealing with child modules. .tfvars files (and -var, -var-file options) only apply to the root module variables, while variable defaults apply when that module is used as a child module too, meaning that variables with defaults can be omitted in module blocks.
A variables.tf file is used to declare the variables and their types, and optionally to set default values.
A terraform.tfvars file is used to set the actual values of the variables.
You could set default values for all your variables and not use tfvars files at all.
Actually, the objective of splitting the definitions from the values is to allow a common infrastructure design to be defined once, and then specific values applied per environment.
Using multiple tfvars files that you give as an argument allows you to set different values per environment: secrets, VM size, number of instances, etc.
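For example, with one .tfvars file per environment (file names and values here are purely illustrative):

```hcl
# env/dev.tfvars — selected with: terraform apply -var-file=env/dev.tfvars
vm_size        = "Standard_B1s"
instance_count = 1

# env/prod.tfvars — selected with: terraform apply -var-file=env/prod.tfvars
vm_size        = "Standard_D4s_v3"
instance_count = 4
```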