How to indicate custom configuration files for Terragrunt modules?

I am trying to build a Terragrunt setup for deploying infrastructure to the Microsoft Azure cloud. Things are working fairly well, but I am not able to figure out one thing.
The structure of the setup looks something like this:
rootdir
├── terragrunt.hcl
├── someconfig.hcl
├── module1dir
│   ├── terragrunt.hcl
│   └── config.auto.tfvars.json
├── module2dir
│   ├── terragrunt.hcl
│   └── config.auto.tfvars.json
└── module3dir
    ├── terragrunt.hcl
    └── config.auto.tfvars.json
Each module is configured using Terraform's autoload tfvars feature via config.auto.tfvars.json. What I would like is to keep these files outside of the directory structure and somehow instruct Terragrunt to apply the correct external configuration file to the correct submodule.
Any ideas?

I solved this in the following manner:
1. Define an environment variable that holds the location of the configuration files, and make sure its name does not clash with anything existing. In this example we will use TGR_CFGDIR.
2. Place the module configuration files in the external configuration directory and name them properly: each file should be named after its module and end with .auto.tfvars.json. So if your module is named foo, you should have a config file foo.auto.tfvars.json.
3. Change your Terragrunt modules (terragrunt.hcl) to contain these statements:
locals {
  moduleconfig = get_env("TGR_CFGDIR")
  modulename   = basename(get_terragrunt_dir())
}

generate "configuration" {
  path              = "config.auto.tfvars.json"
  if_exists         = "overwrite"
  disable_signature = true
  contents          = file("${local.moduleconfig}/${local.modulename}.auto.tfvars.json")
}
And finally invoke the Terragrunt CLI like this:
TGR_CFGDIR="<configdir>" terragrunt "<somecommand>"
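For illustration, suppose the external configuration directory passed via TGR_CFGDIR looks like this (the module names and values below are hypothetical):

configdir
├── module1dir.auto.tfvars.json
├── module2dir.auto.tfvars.json
└── module3dir.auto.tfvars.json

where each file contains ordinary JSON tfvars matching the variables that module declares, e.g. module1dir.auto.tfvars.json:

{
  "location": "westeurope",
  "node_count": 2
}

Running the command inside module1dir then generates config.auto.tfvars.json in that module from configdir/module1dir.auto.tfvars.json before Terraform runs.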

Related

How to read variables from CSV in terraform

I have a CSV file with a few values, and I want to read those values into variables in Terraform.
I tried pointing a local at the file path, but it says the path was not found. How can I read variables from CSV in Terraform?
My git structure is like below: the Key_vault folder contains my Terraform code, and adf_config contains my CSV file.
My main.tf reads the file from ./adf_config/datasets.csv, and I am getting this error:
Error: Invalid value for "path" parameter: no file exists at ./adf_config/datasets.csv; this function works only with
files that are distributed as part of the configuration source code, so if this file will be created by a resource in
this configuration you must instead obtain this result from an attribute of that resource.
If your Terraform module is in the Key_Vault directory and your CSV file is in adf_config, then the path from the Terraform module to the CSV file must start with ../ to traverse to the parent directory.
I would also typically suggest using path.module to be explicit that we're writing a path relative to the current module, although when your module is the root module it doesn't really make any difference because path.module will always be . (the current directory) in that case. Using path.module can help with refactoring this configuration into a child module later though, since it will already be clear what this path is relative to.
locals {
  datasets = csvdecode(file("${path.module}/../adf_config/datasets.csv"))
}
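Note that csvdecode returns a list of maps, one map per row, keyed by the header names. A minimal sketch of consuming the result, assuming a hypothetical name column in datasets.csv:

output "dataset_names" {
  # local.datasets is the list decoded above; "name" is an assumed
  # header column in datasets.csv - adjust to your actual headers.
  value = [for row in local.datasets : row.name]
}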

terraform module structure and tfvars files

I have a configuration which uses modules; this is its structure:
main.tf
\modules
  \kubernetes_cluster
    \main.tf
    \variables.tf
At this stage I had no separate tfvars file; I relied on the default values declared in variables.tf, and this worked fine. I then decided to create a tfvars file, resulting in:
main.tf
\modules
  \kubernetes_cluster
    \main.tf
    \variables.tf
    \variables.tfvars
At the same time I removed the default values from the variables file. Then, when I ran:
terraform apply -target=module.kubernetes_cluster -auto-approve
I got errors complaining that I needed to pass my variables in as arguments because "they were missing", so I moved to this:
main.tf
variables.tf
variables.tfvars
\modules
  \kubernetes_cluster
    \main.tf
    \variables.tf
This is what main.tf in the root module looks like:
module "kubernetes_cluster" {
  source             = "./modules/kubernetes_cluster"
  kubernetes_version = var.kubernetes_version
  node_hosts         = var.node_hosts
}
When I run terraform apply I get prompted for the values of the variables. All I want is to not rely on variable default values and to be able to run terraform apply from the root module directory without having to pass in variable values by hand. I suspect that my module structure is not correct somewhere along the line.
If you want to have TF load tfvars automatically, the file must be called terraform.tfvars, not variables.tfvars. There are other possibilities:
Terraform also automatically loads a number of variable definitions files if they are present:
Files named exactly terraform.tfvars or terraform.tfvars.json.
Any files with names ending in .auto.tfvars or .auto.tfvars.json.
As per the documentation, Terraform automatically loads a variables file if:
either the file is named terraform.tfvars or terraform.tfvars.json,
or its name ends in .auto.tfvars or .auto.tfvars.json.
So in your case renaming variables.tfvars to variables.auto.tfvars would work.
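If you would rather keep a differently named file, you can also point Terraform at it explicitly with the -var-file flag instead of renaming it:

terraform apply -var-file=variables.tfvars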

CSV file path not set showing error in terraform

I am trying to read a CSV file that lives in another folder from my code, but when I set the path, Terraform shows an error.
The following is the Terraform code which sets the path of the file: modules is a folder, a is another folder inside it, and inside that is the file named test.csv.
locals {
  group_names = csvdecode(file("/modules/a/test.csv"))
}
It shows the following error:
Error: Error in function call

  on VPN_Gateway\VPN_Gateway.tf line 7, in locals:
   7: group_names = csvdecode(file("modules/a/test.csv"))

Call to function "file" failed: no file exists at modules\a\test.csv
When the CSV file is in another folder several directory levels away, I recommend you use the absolute path for it. You can get that path by changing into the folder of the CSV file and running the pwd command (I assume you use Linux). The likely reason you got the error is that you used the wrong relative path; it seems your CSV file is in the Terraform module directory. If your directory tree looks like this:
.
├── main.tf
└── modules
    └── a
        └── test.csv
and you load the CSV file at the root, next to main.tf, then the code should look like this:
locals {
  group_names = csvdecode(file("modules/a/test.csv"))
}
Again, I suggest that the absolute path is more appropriate.
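The earlier answer's advice about path.module applies here as well; a sketch of the same lookup anchored to the module directory, assuming the layout above:

locals {
  # path.module is the directory containing this configuration,
  # which makes it explicit what the relative path is anchored to.
  group_names = csvdecode(file("${path.module}/modules/a/test.csv"))
}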

Referencing Lambda Source Files in Sibling Directory using Terraform

I am attempting to deploy a Lambda function using Terraform, where my source files are in a different directory adjacent to where I have my Terraform files. I want to have Terraform do the zipping of the source files for me and deploy them into the Lambda. Terraform doesn't seem to want to recognize that my files are there, though.
My directory structure:
project_root/
  deployment/
    terraform/
      my-terraform.tf
  function_source/
    function.py
I want it to package everything in the function_source directory (there is only one file there now, but there may be more later) and drop the zip into the deployment directory.
My Terraform:
data "archive_file" "lambda_zip" {
type = "zip"
output_path = "../function.zip"
source_dir = "../../function_source/"
}
resource "aws_lambda_function" "my_lambda" {
filename = "${data.archive_file.lambda_zip.output_path}"
function_name = "my-function"
role = "${aws_iam_role.lambda_role.arn}"
handler = "function.handler"
runtime = "python3.7"
}
When I run this, though, I get the error message:
data.archive_file.lambda_zip: data.archive_file.lambda_zip: error archiving directory: could not archive missing directory: ../../function_source/
I have tried using absolute paths without success (which wouldn't be a good solution anyway). I have also tried creating the .zip file manually and hardcoding it directly in the Lambda declaration, but it only works if I put the .zip file in my terraform directory. It seems Terraform can only see files in its own directory or below, but I'd rather not co-mingle my source files there. Is there a way to do this?
I am using Terraform v0.12.4
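A likely cause is that relative paths here resolve against the directory Terraform is invoked from, not the location of the .tf file. A sketch of the archive_file data source with its paths anchored to path.module instead, assuming the layout above:

data "archive_file" "lambda_zip" {
  type = "zip"
  # path.module is the directory containing my-terraform.tf, so these
  # paths no longer depend on the current working directory.
  output_path = "${path.module}/../function.zip"
  source_dir  = "${path.module}/../../function_source"
}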

Terraform: How can I read variables into Terraform from a YAML file? Or from a DB like Hiera?

I want to read variables into my Terraform configuration from an external YAML file, or from a database like Hiera. How can I do that? For example:
provider "aws" {
region = hiera('virginia') # this should look up for virginia=us-east-1
}
resource "aws_instance" {
ami = hiera('production')
....
....
}
Basically it's similar to how we can do a lookup for Puppet manifests/configs using Hiera or a YAML file.
The yamldecode function can be used to read a YAML file as input for Terraform and parse it for further use.
Let's say we have a test.yml file as below:
a: 1
b: 2
c: 3
Then the code below can be used to read the YAML file:
output "log" {
  value = "${yamldecode(file("test.yml"))}"
}
To parse out a specific value read from the YAML file:
output "log" {
  value = "${yamldecode(file("test.yml"))["a"]}"
}
Important: yamldecode is available in Terraform 0.12 or later. If you are using an older version of Terraform and still want to use a YAML file, use terraform-provider-yaml instead.
You can include the YAML file as a local:
locals {
  config = yamldecode(file("${path.module}/configfile.yml"))
}
Now you can reference any value from local.config, for example a variable project_name:
local.config.project_name
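Tying this back to the question's Hiera-style lookup: with a hypothetical regions.yml containing virginia: us-east-1, the same pattern could stand in for hiera() (the file name and keys here are assumptions):

locals {
  # regions.yml is assumed to map friendly names to AWS regions,
  # e.g. "virginia: us-east-1".
  regions = yamldecode(file("${path.module}/regions.yml"))
}

provider "aws" {
  # Rough equivalent of hiera('virginia') from the question.
  region = local.regions["virginia"]
}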
You have to move the variables to a terraform.tfvars or a *.auto.tfvars file.
https://www.terraform.io/language/values/variables#variable-definitions-tfvars-files
To persist variable values, create a file and assign variables within this file. Create a file named terraform.tfvars with the following contents:

access_key = "foo"
secret_key = "bar"

For all files which match terraform.tfvars or *.auto.tfvars present in the current directory, Terraform automatically loads them to populate variables. If the file is named something else, you can use the -var-file flag directly to specify a file. These files are the same syntax as Terraform configuration files. And like Terraform configuration files, these files can also be JSON.
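As the quoted documentation notes, these files can also be JSON; a terraform.tfvars.json equivalent of the example above would be:

{
  "access_key": "foo",
  "secret_key": "bar"
}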
