I have a CSV file with a few values.
I want to read those values into variables in Terraform.
I tried using a locals block with the file path, but it reports that the path is not found. How can I read variables from a CSV file in Terraform?
My git structure is like below,
where the Key_vault folder holds my Terraform code and adf_config holds my CSV file.
My main.tf looks like this.
I am getting this error:
Invalid value for "path" parameter: no file exists at ./adf_config/datasets.csv; this function works only with files that are distributed as part of the configuration source code, so if this file will be created by a resource in this configuration you must instead obtain this result from an attribute of that resource
If your Terraform module is in the Key_Vault directory and your CSV file is in adf_config then the path from the Terraform module to the CSV file must start with ../ to traverse to the parent directory.
I would also typically suggest using path.module to be explicit that we're writing a path relative to the current module, although when your module is the root module it doesn't really make any difference because path.module will always be . (the current directory) in that case. Using path.module can help with refactoring this configuration into a child module later though, since it will already be clear what this path is relative to.
locals {
  datasets = csvdecode(file("${path.module}/../adf_config/datasets.csv"))
}
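csvdecode returns a list of objects, one object per CSV row, with attributes named after the header row. A minimal sketch of consuming the result, assuming a hypothetical name column (the real headers in datasets.csv aren't shown here):

# "name" is a hypothetical column; substitute the actual headers from datasets.csv.
output "dataset_names" {
  value = [for row in local.datasets : row.name]
}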
Terraform newbie here who would appreciate a push in the right direction.
I have created my first module, a child module to root called 'storage'; in there I want to put all the templates for the storage accounts. I thought this would be simple.
I added a folder named 'storage', moved the existing 'storage.tf' file into it, and added a module block to my main.tf (in root) to tell it the module exists. I did a terraform init and it found and installed it - great!
But now when I try something like:
terraform plan -var-file='env/dev/dev.tfvars'
None of the variables which my storage.tf file references are found. I just get:
Error: Reference to undeclared input variable
│
│ on modules\storage\storage.tf line 11, in resource "google_storage_bucket" "new_bucket":
│ 11: name = "${var.env_name}-${var.name_of_bucket}-test"
│
│ An input variable with the name "env_name" has not been declared. This variable can be declared with a variable "env_name" {} block.
All my variables are defined in a variables.tf in root and when running terraform I pass in the path to the relevant variable values using the .tfvars method.
Why can't the module see those?
I tried creating a vars.tf in the module directory, next to the storage.tf file. I copied the variable definitions into that file and got errors that the value of the variable was missing (even though the values were in the .tfvars file used in the invocation).
I then tried adding the values to the variable definitions using 'default', and this actually worked - but that means I can no longer use the .tfvars to control which environment I'm configuring.
I must be missing something really basic!
Question: How can I get my modules to accept the values contained in my .tfvars? I don't mind having the variables which each module needs in a separate vars.tf for each module - that's fine and actually useful - but there has to be some way to pass in the values with a simple .tfvars? Please... what am I missing?
Update
Below is the folder and file structure; sorry if I am not using the right syntax for the diagram!
condo (folder)
|--.git (folder)
|--.github (folder)
|--creds (folder)
|  |--dev_creds.json (file)
|  |--prod_creds.json (file)
|
|--infracomm (folder) (root)
   |--.terraform (folder)
   |--.terraform.lock.hcl (file)
   |--env (folder)
   |  |--dev (folder)
   |  |  |--dev.tfvars (file)
   |  |
   |  |--prod (folder)
   |     |--prod.tfvars (file)
   |
   |--modules (folder)
   |  |--storage (folder) (child module)
   |     |--storage.tf (file)
   |
   |--backend.tf (file)
   |--main.tf (file)
   |--variables.tf (file)
All I am trying to do is make the variables in the root variables.tf, AND the values for those variables which I am providing in the CLI argument
-var-file='./env/dev/dev.tfvars'
usable inside the storage module.
UPDATE #2
Here is the code from main.tf in root where I am referencing the child module. Let me know if more is needed:
provider "google" {
project = var.project_name
region = "europe-west4"
zone = "europe-west4"
credentials = var.credentials_path
}
// Define modules
module "storage" {
  source = "./modules/storage"
}
Do I need to duplicate those variables in a vars.tf in the module? If so, how do I get values into them without using 'default', as that would break scalability?
I'm sure I'm missing something basic.
Thanks!
Let's say you want to pass the variable variable_to_child_module into your module from the root module.
The value of this variable was added through a .tfvars file in the root module, namely variable_from_root_module.
When you call the storage module from the root module, pass the variable:
// Define modules
module "storage" {
  source                   = "./modules/storage"
  variable_to_child_module = var.variable_from_root_module
}
Also, in the ./modules/storage directory you should add a variables.tf where you add the variable definition, e.g.:
variable "variable_to_child_module" {
description = "my child module variable"
type = string
}
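Inside the module the value is then available as an ordinary variable reference. A minimal sketch, with a hypothetical resource just to illustrate (the bucket settings are assumptions, not from the question):

# ./modules/storage/storage.tf - hypothetical usage of the passed-in variable
resource "google_storage_bucket" "example" {
  name     = var.variable_to_child_module
  location = "EU" # assumed location, adjust as needed
}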
Consider each module (root and storage) an independent module. You need to (re)define the variables which you use in the sub-module (storage) in the sub-module itself.
Assuming you have variables region and size which you pass into the root module and want to use in the storage module, you can do this as follows:
module "storage" {
source = "./modules/storage"
region = var.region
size = var.size
}
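For this to work, the storage module itself must declare those variables, e.g. in ./modules/storage/variables.tf (the types are assumed here):

variable "region" {
  type = string
}

variable "size" {
  type = string
}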
I have a configuration which uses modules; this is its structure:
main.tf
\modules
\kubernetes_cluster
\main.tf
\variables.tf
At this stage I had no separate tfvars file; I relied on the default values declared in the variables.tf file, and this worked fine. I then decided to create a tfvars file, resulting in:
main.tf
\modules
\kubernetes_cluster
\main.tf
\variables.tf
\variables.tfvars
At the same time I removed the default values from the variables file; then, when I ran:
terraform apply -target=module.kubernetes_cluster -auto-approve
I got errors complaining that I needed to pass my variables in as arguments because they were missing, so I moved to this:
main.tf
variables.tf
variables.tfvars
\modules
\kubernetes_cluster
\main.tf
\variables.tf
This is what main.tf in the root module looks like:
module "kubernetes_cluster" {
source = "./modules/kubernetes_cluster"
kubernetes_version = var.kubernetes_version
node_hosts = var.node_hosts
}
When I run terraform apply I get prompted for the values of the variables. All I want to do is not rely on variable default values, and to be able to run terraform apply from the root module directory without having to pass in variable values by hand. I suspect that my module structure somewhere along the line is not correct.
If you want to have TF load tfvars automatically, the file must be called terraform.tfvars, not variables.tfvars. There are other possibilities:
Terraform also automatically loads a number of variable definitions files if they are present:
Files named exactly terraform.tfvars or terraform.tfvars.json.
Any files with names ending in .auto.tfvars or .auto.tfvars.json.
As per the documentation, Terraform automatically loads tfvars if:
Either the file is named exactly terraform.tfvars or terraform.tfvars.json,
Or its name ends in .auto.tfvars or .auto.tfvars.json.
So in your case, renaming variables.tfvars to variables.auto.tfvars would work.
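Concretely, either rename the file so Terraform auto-loads it, or keep the name and pass it explicitly on every run:

# Option 1: rename so Terraform picks the file up automatically
mv variables.tfvars variables.auto.tfvars
terraform apply

# Option 2: keep the name and pass the file explicitly
terraform apply -var-file=variables.tfvars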
I am trying to build a Terragrunt script for deploying infrastructure to the Microsoft Azure cloud. Things are working fairly well, but I am not able to figure out one thing.
The structure of the setup looks something like this:
rootdir
├── terragrunt.hcl
├── someconfig.hcl
├── module1dir
│   ├── terragrunt.hcl
│   └── config.auto.tfvars.json
├── module2dir
│   ├── terragrunt.hcl
│   └── config.auto.tfvars.json
└── module3dir
    ├── terragrunt.hcl
    └── config.auto.tfvars.json
Each module is configured using Terraform's tfvars autoload feature via config.auto.tfvars.json. What I would like is to keep these files outside of the directory structure and somehow instruct Terragrunt to apply the correct external configuration file to the correct submodule.
Any ideas?
I solved this in the following manner:
1. Define the environment variable you plan on using, which should contain the location of the configuration files. Make sure it does not clash with anything existing. In this example we will use TGR_CFGDIR.
2. In the external configuration directory, place the module configuration files and make sure they are properly named. Each file should be named after its module and end with .auto.tfvars.json, so if your module is named foo you should have a config file foo.auto.tfvars.json.
3. Change your Terragrunt modules (terragrunt.hcl) to have these statements:
locals {
  moduleconfig = get_env("TGR_CFGDIR")
  modulename   = basename(get_terragrunt_dir())
}

generate "configuration" {
  path              = "config.auto.tfvars.json"
  if_exists         = "overwrite"
  disable_signature = true
  contents          = file("${local.moduleconfig}/${local.modulename}.auto.tfvars.json")
}
And finally call the Terragrunt CLI like this:
TGR_CFGDIR="<configdir>" terragrunt "<somecommand>"
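For illustration, the external config directory then holds one file per module. A hypothetical module1dir.auto.tfvars.json (the variable names here are made up, not from the question) might look like:

{
  "instance_count": 2,
  "location": "westeurope"
}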
I am trying to read a CSV file that is in another folder, but when I set the path, Terraform shows an error.
The following is my Terraform code. modules is a folder, a is another folder inside it, and inside that is the file named test.csv.
locals {
  group_names = csvdecode(file("/modules/a/test.csv"))
}
It shows the following error:
Error: Error in function call
on VPN_Gateway\VPN_Gateway.tf line 7, in locals:
7: group_names = csvdecode(file("modules/a/test.csv"))
Call to function "file" failed: no file exists at modules\a\test.csv
When the CSV file is in another folder multiple directory levels away, I recommend you use the absolute path to it. You can get that path by changing into the folder that contains the CSV file and running the command pwd (I assume you use Linux). The likely reason you got the error is that you used the wrong relative path. It seems your CSV file is in the Terraform module directory. If your directory tree looks like this:
.
├── main.tf
├── modules
│ └── a
│ └── test.csv
and you load the CSV file from the root path, where the main.tf file is, then the code should look like this:
locals {
  group_names = csvdecode(file("modules/a/test.csv"))
}
To suggest it again: the absolute path is more appropriate here.
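That said, an alternative that avoids machine-specific absolute paths is to anchor the path to the module directory, as in the first answer above. A sketch, assuming the layout shown with main.tf at the root:

locals {
  group_names = csvdecode(file("${path.module}/modules/a/test.csv"))
}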
I am attempting to deploy a Lambda function using Terraform, where my source files are in a different directory adjacent to where I have my Terraform files. I want to have Terraform do the zipping of the source files for me and deploy them into the Lambda. Terraform doesn't seem to want to recognize that my files are there, though.
My directory structure:
project_root/
  deployment/
    terraform/
      my-terraform.tf
  function_source/
    function.py
I want it to package everything in the function_source directory (there is only one file there now, but there may be more later) and drop the zip into the deployment directory.
My Terraform:
data "archive_file" "lambda_zip" {
type = "zip"
output_path = "../function.zip"
source_dir = "../../function_source/"
}
resource "aws_lambda_function" "my_lambda" {
filename = "${data.archive_file.lambda_zip.output_path}"
function_name = "my-function"
role = "${aws_iam_role.lambda_role.arn}"
handler = "function.handler"
runtime = "python3.7"
}
When I run this, though, I get the error message data.archive_file.lambda_zip: data.archive_file.lambda_zip: error archiving directory: could not archive missing directory: ../../function_source/
I have tried using absolute paths without success (which wouldn't be a good solution anyway). I have also tried creating the .zip file manually and hardcoding it directly in the Lambda declaration, but that only works if I put the .zip file in my terraform directory. It seems Terraform can only see files in its own directory or below, but I'd rather not co-mingle my source files there. Is there a way to do this?
I am using Terraform v0.12.4
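For reference, a sketch of the same archive_file with its paths anchored via path.module (the same pattern as in the CSV answers above). This can help when terraform is invoked from a directory other than deployment/terraform, because plain relative paths resolve against the current working directory:

data "archive_file" "lambda_zip" {
  type        = "zip"
  # Anchored to the directory containing my-terraform.tf rather than the cwd
  output_path = "${path.module}/../function.zip"
  source_dir  = "${path.module}/../../function_source/"
}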