Pass a var-file in a different directory to plan

Using Terraform v0.11.11, when invoking terraform plan, I supply a .tfvars file that is NOT in the current directory, e.g. a common 'global' file for global constants.
Terraform seems to just ignore it when I use ../ in the -var-file option:
-var-file="../$(ENV).global.tfvars"
What options do I have to achieve a similar result if this is not possible (is this still the case even in v0.12)?
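For reference, a sketch of the intended invocation with the Make-style variable expanded (assuming ENV=dev, which is not given in the question); Terraform resolves relative -var-file paths against the directory it is run from:

terraform plan -var-file="../dev.global.tfvars"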

Related

Handling variables in nested terraform structure

I've set up a terraform project with the following folder structure:
modules/environments/[$environment,all]/regions/[$region,all]
modules/resources/[$resource]
environments/[$environment]/[$region]
In modules/resources I have folders like api, load balancer, etc. In modules/environments/[$environment,all]/regions/[$region,all], I import the required modules from modules/resources/[$resource]. Given the environment is prod, in eu-west-1 for example, I import modules from modules/environments/[prod,all]/regions/[eu-west-1,all].
If I add a new variable, I have to update the variables.tf file in all the affected places, and every main.tf where I assign the variables to the modules. This is a lot of effort.
It would be much easier if I didn't have to assign the variables at the module {} level, and they were instead picked up automatically from the .tfvars file used when planning or applying, but that's not the case.
Is there any workaround for this?
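To illustrate the repetition being described (the module and variable names below are hypothetical), every new variable means a declaration in each affected variables.tf plus a manual assignment in each main.tf:

# environments/prod/eu-west-1/main.tf -- hypothetical wiring
module "api" {
  source      = "../../../modules/environments/prod/regions/eu-west-1"
  environment = var.environment
  region      = var.region
  new_var     = var.new_var # repeated by hand for every module that needs it
}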

To specify a working directory for the plan, use the global -chdir flag

Why do I get this error? (Sorry, a newbie.)
"To specify a working directory for the plan, use the global -chdir flag"
I have my tfvars file in a folder called env-vars, as env-vars/dev.tfvars. So when I run terraform plan -var-file= dev.tfvars or terraform plan -var-file= env-vars\dev.tfvars I would like Terraform to use those variable values.
The parent directory is TfTest, which contains main.tf, variables.tf, etc.
I'm not sure I understand working directory vs. workspace when using VS Code.
You have a space between -var-file= and dev.tfvars. That means terraform sees dev.tfvars as a separate argument, hence the big bold error message "Too many command line arguments".
I was able to solve this on my own using this code
terraform plan -var-file="env-vars/dev.tfvars"
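For what it's worth, the -chdir option the error message mentions is a global flag (available since Terraform 0.14) that switches Terraform's working directory before running the subcommand, so the same plan could be run from outside TfTest, for example:

terraform -chdir=TfTest plan -var-file=env-vars/dev.tfvars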

Call to function "file" failed: no file exists

I have a template_file section in my Terraform code, in which a variable's value is meant to be read from a file, like below:
data "template_file" "post_sql" {
  template = file("/home/user/setup_template.yaml")
  vars = {
    name1 = data.azurerm_storage_account.newazure_storage_data.name
    name2 = file("/home/user/${var.newname}loca.txt")
  }
}
This file gets generated in the middle of the run, but Terraform looks for it right at the start of the apply stage. I have even tried adding depends_on, to no avail; it throws the error below:
Call to function "file" failed: no file exists at
/home/user/newnamerloca.txt.
How can I make this work? Any help would be appreciated.
The reason for the behavior you are seeing is included in the documentation for the file function:
This function can be used only with files that already exist on disk at the beginning of a Terraform run. Functions do not participate in the dependency graph, so this function cannot be used with files that are generated dynamically during a Terraform operation. We do not recommend using dynamic local files in Terraform configurations, but in rare situations where this is necessary you can use the local_file data source to read files while respecting resource dependencies.
The file function is intended for reading files that are included on disk as part of the configuration, typically in the same directory as the .tf file that refers to them, and using the path.module symbol to specify the path like this:
file("${path.module}/example.tmpl")
Your question doesn't explain why you are reading files from a user's home directory rather than from the current module configuration directory, or why one of the files doesn't exist before you run Terraform, so it's hard to give a specific suggestion on how to proceed. The documentation offers the local_file data source as a possible alternative, but it may not be the best approach depending on your goals. In particular, reading files on local disk from outside of the current module is often indicative of using Terraform for something outside of its intended scope, and so it may be most appropriate to use a different tool altogether.
Try "cat /home/user/newnamerloca.txt" and see if this file is actually in there.
Edit: Currently there is no workaround this, "data" resources are applied at the start of plan/apply thus need to be present in order to use them
Data resources have the same dependency resolution behavior as defined for managed resources. Setting the depends_on meta-argument within data blocks defers reading of the data source until after all changes to the dependencies have been applied.
NOTE: In Terraform 0.12 and earlier, due to the data resource behavior of deferring the read until the apply phase when depending on values that are not yet known, using depends_on with data resources will force the read to always be deferred to the apply phase, and therefore a configuration that uses depends_on with a data resource can never converge. Due to this behavior, we do not recommend using depends_on with data resources.
So maybe something like:
data "template_file" "post_sql"{
template = "${file("/home/user/setup_template.yaml")}"
vars = {
name1= data.azurerm_storage_account.newazure_storage_data.name
name2="${file("/home/user/${var.newname}loca.txt")}"
}
depends_on = [null_resource.example1]
}
resource "null_resource" "example1" { # **create the file here**
provisioner "local-exec" {
command = "open WFH, '>completed.txt' and print WFH scalarlocaltime"
interpreter = ["perl", "-e"]
}
}
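Note that the file function itself still won't wait on that depends_on, per the documentation quoted above, since functions aren't part of the dependency graph. A minimal sketch of the local_file data source alternative the docs mention (reusing the question's hypothetical path), which does respect dependencies:

data "local_file" "loca" {
  filename   = "/home/user/${var.newname}loca.txt"
  depends_on = [null_resource.example1]
}

# then use data.local_file.loca.content wherever file(...) was used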

How best to handle multiple .tfvars files that use common .tf files?

I am going to be managing the CDN configurations of dozens of applications through Terraform. I have a set of .tf files holding all the default constant settings that are common among all configurations and then each application has its own .tfvars file to hold its unique settings.
If I run something like terraform apply --var-file=app1.tfvars --var-file=app2.tfvars --var-file=app3.tfvars then only the last file passed in is used.
Even if this did work, it would become unmanageable when I extend this to more sites.
What is the correct way to incorporate multiple .tfvars files that populate a common set of .tf files?
Edit: I should add that the .tfvars files define the same variables but with different values. I need to declare the state of the resources defined in the .tf files once for each .tfvars file.
I found the best way to handle this case (without any 3rd-party tools) is to use Terraform workspaces and create a separate workspace for each .tfvars file. This way I can use the same common .tf files and simply switch to a different workspace with terraform workspace select <workspace name> before running terraform apply --var-file=<filename> with each individual .tfvars file, as sketched below.
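A sketch of that workflow, with hypothetical application names (terraform workspace new creates and selects a workspace in one step):

terraform workspace new app1          # first run only
terraform apply -var-file=app1.tfvars

terraform workspace select app2       # subsequent switches
terraform apply -var-file=app2.tfvars

Each workspace keeps its own state, so the common .tf files are applied once per application without the states overwriting each other.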
This should work using process substitution:
terraform apply -var-file=<(cat app1.tfvars app2.tfvars app3.tfvars)
The best way may be to use Terragrunt (https://terragrunt.gruntwork.io/) from Gruntwork, a thin wrapper around Terraform; you can use its HCL configuration file to define your requirements.
Sample terragrunt.hcl configuration:
terraform {
  extra_arguments "conditional_vars" {
    commands = [
      "apply",
      "plan",
      "import",
      "push",
      "refresh"
    ]
    required_var_files = [
      "${get_parent_terragrunt_dir()}/terraform.tfvars"
    ]
    optional_var_files = [
      "${get_parent_terragrunt_dir()}/${get_env("TF_VAR_env", "dev")}.tfvars",
      "${get_parent_terragrunt_dir()}/${get_env("TF_VAR_region", "us-east-1")}.tfvars",
      "${get_terragrunt_dir()}/${get_env("TF_VAR_env", "dev")}.tfvars",
      "${get_terragrunt_dir()}/${get_env("TF_VAR_region", "us-east-1")}.tfvars"
    ]
  }
}
You can pass down tfvars this way, and you can get more out of Terragrunt by organising your Terraform layout better and using its configuration files to pass tfvars from different locations.

Multiple .tf files in a folder

I have a project that I inherited that has multiple .tf files (main.tf, xyz.tf, ...) in certain folders. When it does a source = "../<folder_name>", in what order are the files applied? Is main.tf always applied first, followed by the rest?
Note: These are different from the variables.tf and outputs.tf files.
In Terraform 0.11, regular *.tf files were loaded in alphabetical order and then override files were applied.
When invoking any command that loads the Terraform configuration, Terraform loads all configuration files within the directory specified in alphabetical order.
...
Override files are the exception, as they're loaded after all non-override files, in alphabetical order.
In Terraform 0.12+ (including 1.x), the load order of *.tf files is no longer specified. Behind the scenes Terraform reads all of the files in a directory and then determines a resource ordering that makes sense regardless of the order the files were actually read.
Terraform evaluates all of the configuration files in a module, effectively treating the entire module as a single document. Separating various blocks into different files is purely for the convenience of readers and maintainers, and has no effect on the module's behavior.
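For example (hypothetical resources), a cross-file reference works regardless of file names or ordering, because the module is evaluated as a whole:

# pets.tf
resource "random_pet" "name" {}

# outputs.tf -- references pets.tf freely; load order is irrelevant
output "pet_name" {
  value = random_pet.name.id
}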
