The module root could not be found. There is nothing to output - terraform

Running terraform output in my root Terraform directory, I get:
The module root could not be found. There is nothing to output.
I have the following files:
iam.tf:
resource "aws_iam_user" "a_user" {
name = "a_user"
}
output.tf:
data "aws_caller_identity" "current" {}
output "account_id" {
value = "${data.aws_caller_identity.current.account_id}"
}
This https://www.terraform.io/docs/modules/index.html says:
Root module: that is the current working directory when you run terraform apply or get, holding the Terraform configuration files. It is itself a valid module.
Any idea why I get this error message and how to fix it?

Terraform reads the root module's outputs from the terraform.tfstate file.
This file contains everything Terraform knows about the last applied state of your .tf files, along with the output values.
It is generated after the first execution of terraform apply in the current directory.
Simply run terraform apply, then terraform output will show your output variables.
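For example, from the directory containing iam.tf and output.tf (assuming the AWS provider and credentials are already configured), the sequence is roughly:
terraform init                 # download the AWS provider
terraform apply                # create the resources and write terraform.tfstate
terraform output account_id    # print the account_id output recorded in the state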

You haven't added your module config above, but assuming you have a module file, you have to tell Terraform about its source. If the source is a sub-directory called example in the same location as iam.tf and output.tf, then you have to add the module as below and run terraform apply from the directory where output.tf and iam.tf are:
module "consul" {
source = "./example"
}
If your module source is a remote location (e.g. GitHub), then the source has to be as below:
module "consul" {
source = "github.com/some-git.git"
}
Then you have to run terraform get to download your module, terraform apply to apply it, and terraform output to list the outputs you specified above.
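In other words, the full sequence from the root directory is something like:
terraform get        # download the module referenced in source
terraform apply      # apply the configuration and write the state file
terraform output     # list the outputs recorded in the state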

The problem is you have not added your module config file. Something along the lines of:
module "test_module" {
source = "./test_module"
}
You have to make sure the module config exists and that the source is valid. To get output, you need a state file, which is created after running terraform apply. It looks like you either don't have one or you have no outputs in your state file.
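If you are unsure what the state file currently holds, a quick check is:
terraform state list    # resources tracked in the state file
terraform output        # outputs recorded in the state file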

Related

Creating terraform modules

I already have Terraform code in different folders, each having its resources individually, with the state file in remote S3.
E.g.:
ec2-solr
ec2-setup
The requirement is to create a main module calling these root configuration files. I have copied the backend.tf from the root module into main and set up the variables. The problem is that when I now run terraform plan from the main module, Terraform says new resources will be created.
Is there a way to import the existing state into the main module?
terraform v - "0.12.29"
provider.aws: version = "~> 3.37"
The command terraform state mv allows you to hook a remote object up to a new (or renamed or moved) resource instance.
This documentation gives the example of moving a resource into a module like this:
terraform state mv aws_instance.foo module.web
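For the layout described in the question, a sketch might look like the following; the resource and module addresses here are hypothetical, so adjust them to whatever terraform state list shows for your configuration:
# move an existing instance from its root-level address into the new module's address
terraform state mv aws_instance.solr module.ec2_solr.aws_instance.solr

# confirm the resource is now tracked under the module address
terraform state list
Because the state lives in remote S3, run this from a directory whose backend configuration points at that state; terraform state mv updates the remote state in place.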

Terraform - access root module script from child module

I have a ROOT_MODULE with main.tf:
# Root module - just run the script
resource "null_resource" "example" {
  provisioner "local-exec" {
    command = "./script.sh"
  }
}
and script.sh:
echo "Hello world
Now I have another directory elsewhere where I've created a CHILD_MODULE with another main.tf:
# Child module
module "ROOT_MODULE" {
  source = "gitlabURL/ROOT_MODULE"
}
I've exported my planfile: terraform plan -out="planfile"
However, when I do terraform apply against the planfile, the directory I am currently in no longer has any idea where script.sh is. I need to keep the script in the same directory as the root module. This script is also inside a GitLab repository, so I don't have a local path to call it. Any idea how I can get this script into my child module / execute it from my planfile?
Error running command './script.sh': exit status 1. Output: cannot access 'script.sh': No such file or directory
You can access the path to the root module config to preserve pathing for files with the path.root intrinsic:
provisioner "local_exec" {
command = "${path.root}/script.sh"
}
However, based on your question, it appears you have swapped the terminology for root module and child module. Therefore, that module appears to really be your child module and not root, and you need to access the path with the path.module intrinsic:
provisioner "local_exec" {
command = "${path.module}/script.sh"
}
and then the pathing to the script will be preserved regardless of your current working directory.
These intrinsic expressions are documented here.
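Putting the two fragments together, a minimal sketch of the module's main.tf, assuming script.sh is committed next to it in the same repository:
resource "null_resource" "example" {
  # path.module resolves to the directory this module was downloaded to,
  # so the script is found regardless of the current working directory
  provisioner "local-exec" {
    command = "${path.module}/script.sh"
  }
}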

Terraform Enterprise module file location error

Need some assistance regarding using files within modules from Terraform Enterprise. The Git top-level folder structure is like this:
modules
README.md
main.tf
Within the modules folder, the structure is like this:
modules
main.tf
file1.json
The main.tf within modules refers to file1.json like below:
resource "aws_iam_policy" "deny_bill_policy" {
name = "OpsPolicy"
path = "/"
policy = "${file("${path.module}/file1.json")}"
}
The same configuration runs without any issues from my local PC to deploy on AWS, but when I run it through Terraform Enterprise, which pulls the repo from Git, it throws the following error:
module.policy_roles.aws_iam_policy.deny_bill_policy: file: open file1.json: no such file or directory in: ${file("${path.module}/file1.json")}
FYI - no previous/old .terraform dir exists. It seems TFE handles module paths differently. Can someone please assist me here?

terraform sub-module changes not being recognized in plan or apply

I have a Terraform repo that looks something like this:
infrastructure
global
main.tf
The main.tf file references a module in a remote repository:
module "global" {
source = "git#github.com/company/repo//domain/global"
}
and the above module references another module within the same remote repo, in its main.tf:
module "global" {
source = "git#github.com/company/repo//infrastructure/global"
}
If I make a change in this module that's 3 levels deep, and then run terraform get and terraform init in the top-level Terraform project followed by terraform plan, those changes aren't picked up.
Is there any reason for this?
I needed to do one of the following:
1) when running terraform init, pass the -upgrade=true flag
2) or, if running terraform get, pass the -update=true flag
This downloads the latest versions of the requested modules.
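Concretely, from the top-level project:
terraform init -upgrade    # re-download providers and modules instead of reusing the cached copies in .terraform
# or
terraform get -update      # re-download only the modules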

How to copy in additional files during "terraform get" that reside outside of the module directory?

The Hashicorp Consul repository contains a Terraform module for launching a Consul cluster in AWS. The module references several files that are found in the parent directory of the module under the shared/scripts directory here https://github.com/hashicorp/consul/tree/master/terraform
However, when I reference the module in one of my .tf files and run terraform get to download the module, the required files under shared/scripts/ are not included with the downloaded module files, leading to errors like the one described here
My module section in Terraform looks like this:
module "consul" {
source = "github.com/hashicorp/consul/terraform/aws"
key_name = "example_key"
key_path = "/path/to/example_key"
region = "us-east-1"
servers = "3"
platform = "centos7"
}
Is there any way to have terraform get pull in files that live outside the module directory?
Thanks
From looking at what those files do, I'd just copy the ones you need (depending on whether you're deploying on Debian or RHEL), which will be 2 or 3 files, and feed them into a provisioner "file":
https://www.terraform.io/docs/provisioners/file.html
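As a rough sketch (the AMI, connection details, and script paths below are assumptions for illustration), copying one of those scripts onto an instance with the file provisioner looks something like this:
resource "aws_instance" "consul_server" {
  ami           = "ami-12345678"    # placeholder AMI
  instance_type = "t2.micro"
  key_name      = "example_key"

  connection {
    type        = "ssh"
    user        = "centos"
    private_key = "${file("/path/to/example_key")}"
    host        = "${self.public_ip}"
  }

  # copy your local copy of the script from shared/scripts onto the instance
  provisioner "file" {
    source      = "scripts/install.sh"
    destination = "/tmp/install.sh"
  }
}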
