Execution order inside Terraform modules

I'm having an issue with the execution order of modules in my Terraform scripts, and I have raised an issue in the source repo: https://github.com/hashicorp/terraform/issues/18143
Can anyone please help me with this issue, either here or on GitHub?
Any help will be highly appreciated.
Thanks!

The execution does not wait for the completion of the "vpc" module, but only for the availability of the value "module.vpc.vpc_id". Producing that value only requires the aws_vpc resource to be created. Therefore, you are not actually telling Terraform to also wait for the consul_keys resource.
To fix this you have to make your other modules depend on the consul_keys resource. This works either by:
Using a value exported by consul_keys in your other modules (either datacenter or var.<name>); see the sketch at the end of this answer.
Dumping the resources that depend on consul_keys in the same file.
Sadly there is no nice solution for this at the moment, but module dependencies are being worked on.
EDIT:
As an example of dumping all resources in the same file:
This does not work because there are no module dependencies:
module "vpc" {
  ...
}

module "other" {
  depends_on = ["module.vpc"]
}
vpc module file:
resource "aws_instance" "vpc_res1" {
  ...
}

resource "consul_keys" "my_keys" {
  ...
}

other module file:
resource "aws_instance" "other_res1" {
  ...
}

resource "aws_instance" "other_res2" {
  ...
}
Putting everything in the same file works. You can also keep the "vpc_res1" resource in a separate module:
resource "consul_keys" "my_keys" {
  ...
}

resource "aws_instance" "other_res1" {
  depends_on = ["consul_keys.my_keys"]
}

resource "aws_instance" "other_res2" {
  depends_on = ["consul_keys.my_keys"]
}
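For the first option (using a value exported by consul_keys), a minimal sketch could look like the following. The key name "ami", the path, and the module input consul_ami are hypothetical, used only to show the pattern:
resource "consul_keys" "my_keys" {
  datacenter = "dc1"

  key {
    name = "ami"
    path = "service/app/ami"
  }
}

module "other" {
  source = "./other"

  # Referencing a value exported by consul_keys (here var.ami, i.e. the key
  # named "ami" above) creates an implicit dependency, so everything inside
  # module "other" waits for the keys to be read first.
  consul_ami = "${consul_keys.my_keys.var.ami}"
}
Because module "other" now consumes a value produced by consul_keys.my_keys, Terraform orders the keys before any resource in that module.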

Related

Load terraform module from a local path which can be set as an environment variable

I have a terraform module in my local "tmp\module\vpc".
Is there any way this local path "tmp\module\vpc" can be set in an environment variable and used in the "source" field of the calling module?
module "vpc" {
  source  = "/tmp/modules/vpc"
  vpcname = var.vpcname
}
Not possible. Dynamic module paths are not something Terraform supports.
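To illustrate why: module sources are resolved during terraform init, before any variable values are known, so a sketch like the following (var.module_path is a hypothetical variable, shown only to demonstrate the failure) is rejected:
module "vpc" {
  # Not allowed: the source argument must be a literal string known at init
  # time, so Terraform rejects an expression here (roughly: "Variables may
  # not be used here").
  source  = var.module_path
  vpcname = var.vpcname
}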

terraform module structure and tfvars files

I have a configuration which uses modules, this is its structure:
main.tf
\modules
  \kubernetes_cluster
    \main.tf
    \variables.tf
At this stage I had no separate tfvars file; I relied on the default values declared in the variables.tf file, and this worked fine. I then decided to create a tfvars file, resulting in:
main.tf
\modules
  \kubernetes_cluster
    \main.tf
    \variables.tf
    \variables.tfvars
At the same time I removed the default values from the variables file. Then, when I ran:
terraform apply -target=module.kubernetes_cluster -auto-approve
I got errors complaining that I needed to pass my variables in as arguments because they were missing, so I moved to this:
main.tf
variables.tf
variables.tfvars
\modules
  \kubernetes_cluster
    \main.tf
    \variables.tf
this is what main.tf in the root module looks like:
module "kubernetes_cluster" {
  source             = "./modules/kubernetes_cluster"
  kubernetes_version = var.kubernetes_version
  node_hosts         = var.node_hosts
}
When I run terraform apply I get prompted for the values of the variables. All I want is to not rely on variable default values and to be able to run terraform apply from the root module directory without having to pass in variable values by hand. I suspect that my module structure is not correct somewhere along the line.
If you want to have TF load tfvars automatically, the file must be called terraform.tfvars, not variables.tfvars. There are other possibilities:
Terraform also automatically loads a number of variable definitions files if they are present:
Files named exactly terraform.tfvars or terraform.tfvars.json.
Any files with names ending in .auto.tfvars or .auto.tfvars.json.
As per the documentation, Terraform automatically loads variable definitions files if:
either the file is named terraform.tfvars or terraform.tfvars.json,
or the file name ends in .auto.tfvars or .auto.tfvars.json.
So in your case renaming variables.tfvars to variables.auto.tfvars would work.
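As a rough sketch of how the root module could then look (the types and values below are placeholders, not taken from the question):
# variables.tf in the root module
variable "kubernetes_version" {
  type = string
}

variable "node_hosts" {
  # type is an assumption; match whatever the child module expects
  type = list(string)
}

# variables.auto.tfvars in the root module; loaded automatically because of
# the .auto.tfvars suffix
kubernetes_version = "1.21.0"
node_hosts         = ["node-1", "node-2"]
With this in place, terraform apply run from the root directory picks up the values without prompting and without extra -var-file flags.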

Terraform upgrade and multiple versions.tf, do child modules inherit the provider versions?

I'm not very experienced with Terraform. I upgraded my project from 0.12 to 0.13 and am looking to upgrade it to 0.14 afterwards.
As the documentation specifies, I ran terraform 0.13upgrade and terraform 0.13upgrade module, and my directory became like this:
terraform
├── module
│   ├── main.tf
│   └── versions.tf
├── main.tf
└── versions.tf
I moved my versions, but the problem is that I specified them only in the root versions.tf:
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 3.8.0"
    }
  }
}
In module/versions.tf I kept only:
terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
  }
}
Do child modules inherit the version from the root (where they are imported) or does this mean my module will automatically run with a more recent provider version (3.64 I think)?
Should I simply remove module/versions.tf? (It's tiresome to have two versions.tf files to edit every time.)
Thanks!
For a given Terraform configuration (which includes both the root module and any other modules you might call), there can be only one version of each provider. Terraform recognizes that two providers are "the same" by them having the same source value after normalization, and in your examples here both are using hashicorp/google, which is short for registry.terraform.io/hashicorp/google, and so both of them need to be able to agree on a particular version of that provider to use.
Terraform handles version constraints for the same provider declared in multiple modules by combining them and trying to honor all of them at once. In your examples here you've written no version argument in the child module, and this means "any version is allowed".
Terraform will therefore look for an available provider version that matches both the ~> 3.8.0 constraint and the implied "any version" constraint, and since the ~> 3.8.0 constraint is a proper subset of "any version" it effectively takes priority over the open constraint in the child module. This is not strictly "inheritance", but it happens to behave somewhat like it in this case because the child module is totally unconstrained.
A more interesting example would be if both of your modules specified different version constraints, which means we can see a more interesting effect of combining them. Let's pretend that your two modules had the following requirements instead:
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 3.8.2"
    }
  }
}
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 3.8.5"
    }
  }
}
In this situation neither of these constraints is a subset of the other, but they do have some overlap: all of the 3.8.x versions from 3.8.5 onwards are acceptable to both modules. Therefore Terraform will select the newest available version from that set.
If you write two modules that have conflicting version constraints then that would be an error:
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 3.7.0"
    }
  }
}

terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 3.8.5"
    }
  }
}
There is no provider version that is both a 3.7.x release and a 3.8.x release at the same time, so no release can ever possibly match both of these constraints, and thus provider version selection will fail. It's for this reason that the Terraform documentation section Best Practices for Provider Versions advises to use ~> version constraints only in the root module.
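Following that advice, a sketch of the recommended split (not taken from the question, just illustrating the documented best practice) would be to pin tightly in the root module and state only a minimum in the child module:
# Root versions.tf: the root module decides how far upgrades may go
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 3.8.0"
    }
  }
}

# module/versions.tf: the child module only records the oldest version it is
# known to work with
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 3.8.0"
    }
  }
}
That way only the root module needs editing when you want to allow a newer provider release.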

Terraform: How can I read variables into Terraform from a YAML file? Or from a DB like Hiera?

I want to read variables into my Terraform configuration from an external YAML file or from a database like Hiera. How can I do that? For example:
provider "aws" {
  region = hiera('virginia') # this should look up virginia=us-east-1
}

resource "aws_instance" {
  ami = hiera('production')
  ....
  ....
}
Basically it's similar to how we can do a lookup for Puppet manifests/configs using Hiera or a YAML file.
The yamldecode function can be used to read a YAML file as input for Terraform and parse it.
Let's say we have a test.yml file like this:
a: 1
b: 2
c: 3
Then the following code can be used to read the YAML file:
output "log" {
  value = "${yamldecode(file("test.yml"))}"
}
To access a specific value read from the YAML file:
output "log" {
  value = "${yamldecode(file("test.yml"))["a"]}"
}
Important: yamldecode is available in Terraform 0.12 and later. If you are using an older version of Terraform and still want to use a YAML file, use terraform-provider-yaml instead.
You can include the YAML file as a local:
locals {
  config = yamldecode(file("${path.module}/configfile.yml"))
}
Now you can reference all of the values from local.config; for example, a value project_name:
local.config.project_name
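Tying this back to the original question, a sketch assuming configfile.yml contains keys named region and production_ami (both hypothetical) could look like:
# configfile.yml (hypothetical contents):
#   region: us-east-1
#   production_ami: ami-0abc1234567890def

locals {
  config = yamldecode(file("${path.module}/configfile.yml"))
}

provider "aws" {
  region = local.config.region
}

resource "aws_instance" "app" {
  ami           = local.config.production_ami
  instance_type = "t3.micro"
}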
You have to move the variables to a terraform.tfvars or a *.auto.tfvars file.
https://www.terraform.io/language/values/variables#variable-definitions-tfvars-files
To persist variable values, create a file and assign variables within
this file. Create a file named terraform.tfvars with the following
contents:
access_key = "foo"
secret_key = "bar"
For all files which match terraform.tfvars or *.auto.tfvars present in
the current directory, Terraform automatically loads them to populate
variables. If the file is named something else, you can use the
-var-file flag directly to specify a file. These files are the same syntax as Terraform configuration files. And like Terraform
configuration files, these files can also be JSON.

Trouble setting terraform variable from CLI

My terraform project layout looks something like:
[project root dir]
  main.tf
  - my_module [directory]:
      variables.tf
      my_instance.tf
In variables.tf I have something like this:
variable "foo-ami" {
  default     = "ami-12345"
  description = "Super awesome AMI"
}
In my_instance.tf I reference it like so:
resource "aws_instance" "my_instance" {
  ami = "${var.foo-ami}"
  ...
}
If I want to override this variable from the command line, the docs here seem to suggest I should be able to run the following command from the top level (main.tf) location:
terraform plan -var 'foo-ami=ami-987654'
However, the variable switch doesn't seem to be getting picked up; the old value remains set. Furthermore, if I remove the default setting I get an error from Terraform saying it's not set, so clearly the -var switch isn't being picked up.
Thoughts?
TIA
The variables of the root module and of my_module are different. If you want to pass a variable to my_module, you need to specify it explicitly.
In main.tf, declare the variable and pass it to the module as follows:
variable "foo-ami" {}

module "my_module" {
  source  = "./my_module"
  foo-ami = "${var.foo-ami}"
  ...
}
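With the variable declared in the root module and passed through to the child module, the value given on the command line should now take effect:
terraform plan -var 'foo-ami=ami-987654'
The -var flag sets the root module's foo-ami, which then flows into my_module through the module argument and overrides the default declared inside the module.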
For details of the module feature refer to the following documents:
https://www.terraform.io/docs/modules/usage.html
