I already have Terraform code in different folders, each managing its own resources, with the state files stored remotely in S3.
E.g.:
ec2-solr
ec2-setup
The requirement is to create a main module that calls these root configurations. I have copied the backend.tf from the root module into main and set up the variables. The problem is that when I now run terraform plan from the main module, Terraform says new resources will be created.
Is there a way to import the existing state into the main module?
Terraform version: "0.12.29"
provider.aws: version = "~> 3.37"
The command terraform state mv allows you to hook a remote object up to a new (or renamed or moved) resource instance.
This documentation gives the example of moving a resource into a module like this:
terraform state mv aws_instance.foo module.web
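Applied to this question, a minimal sketch might look like the following; the resource address aws_instance.solr and the module name ec2_solr are assumptions, so substitute the addresses from your own configurations. Because each folder has its own remote S3 state, you would either run the move in the configuration that will own the resource going forward, or work on local copies pulled with terraform state pull and push the result back afterwards:

# Move an existing resource under the module address the main configuration
# expects (hypothetical names):
terraform state mv aws_instance.solr module.ec2_solr.aws_instance.solr

# Moving between two different, locally pulled state files:
terraform state mv \
  -state=ec2-solr.tfstate \
  -state-out=main.tfstate \
  aws_instance.solr \
  module.ec2_solr.aws_instance.solr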
We are using a private Github and Terraform Cloud for our projects. Everything is able to talk to each other so there is no issue there. However, I'm trying to create modules for a project I started. I was able to make it work as regular terraform files, but when I try to convert to the module system I am having issues with getting the state imported.
We have a separate repository called tf-modules. In this repository, my directory setup:
> root
>> mymodule
>>> lambda.tf
>>> eventbridge.tf
>>> bucket.tf
These files manage the software being deployed in our AWS environment. They are being used across multiple environments for each of our customers (each separated out by environment [qa, dev, prod]).
In my terraform files, I have:
> root
>> CUSTNAME
>>> mymodule
>>>> main.tf
Inside main.tf I have:
module "mymodule" {
source = "git::https://github.com/myprivaterepo/tf-modules.git"
}
In my dev environment, everything is already deployed, so I need to import the existing state. However, it's not detecting the resources at all. In the .terraform directory, it is downloading the entire repository (the root with the readme.md and all).
I'm fairly new to Terraform. Am I approaching this wrong or misunderstanding?
I am using the latest version of Terraform.
Since there is a sub-directory "mymodule", you should specify the whole path.
module "mymodule" {
source = "git::https://github.com/myprivaterepo/tf-modules.git//mymodule"
}
Refer to module sources - sub directory
Example: git::https://example.com/network.git//modules/vpc
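With the sub-directory path in place, you can then import existing resources against the module-qualified addresses. A hedged example (the resource names and IDs below are placeholders, not taken from the question):

terraform import module.mymodule.aws_lambda_function.example my-function-name
terraform import module.mymodule.aws_s3_bucket.example my-bucket-name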
I am starting to use Terraform and I have a .terraform folder created by "terraform init/apply" containing:
./plugins/linux_amd64/lock.json
./plugins/linux_amd64/terraform-provider-google
./plugins/modules/modules.json
terraform.tfstate
Should I version these files? I would say no...
The .terraform directory is a local cache where Terraform retains some files it will need for subsequent operations against this configuration. Its contents are not intended to be included in version control.
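A common way to keep it out of version control (my own suggestion, not part of the original answer) is a .gitignore entry:

# Local Terraform working directory
.terraform/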
However, you can ensure that you can faithfully reproduce this directory on other systems by specifying certain things in your configuration that inform what Terraform will place in there:
Use required_providers in a terraform block to specify an exact version constraint for the Google Cloud Platform provider:
terraform {
  required_providers {
    google = "3.0.0"
  }
}
(this relates to the .terraform/plugins directory)
In each module you call (which seems to be none so far, but perhaps in future), ensure its source refers to an exact version rather than to a floating branch (for VCS modules) or set version to an exact version (for modules from Terraform Registry):
module "example"
source = "git::https://github.com/example/example.git?ref=v2.0.0"
# ...
}
module "example"
source = "hashicorp/consul/aws"
version = "v1.2.0
}
(this relates to the .terraform/modules directory)
If you are using a remote backend, include the full configuration in the backend block inside the terraform block, rather than using the -backend-config argument to terraform init.
(this relates to the .terraform/terraform.tfstate file, which remembers your active backend configuration for later operations)
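As a sketch, a fully specified backend block for this configuration might look like the following, assuming a GCS backend; the bucket name and prefix are placeholders:

terraform {
  backend "gcs" {
    bucket = "example-terraform-state"  # placeholder bucket name
    prefix = "prod"
  }
}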
Need some assistance regarding using files within modules from Terraform Enterprise. The Git top-level folder structure is like this:
modules
README.md
main.tf
Within the modules folder, the structure is like this:
modules
main.tf
file1.json
The main.tf within modules refers to file1.json like below:
resource "aws_iam_policy" "deny_bill_policy" {
name = "OpsPolicy"
path = "/"
policy = "${file("${path.module}/file1.json")}"
}
The same configuration deploys to AWS without any issues from my local PC, but when I run it through Terraform Enterprise, which pulls the repo from Git, it throws the following error:
module.policy_roles.aws_iam_policy.deny_bill_policy: file: open file1.json: no such file or directory in: ${file("${path.module}/file1.json")}
FYI - no previous/old .terraform dir existed. It seems TFE handles module paths differently. Could someone please assist me here?
I have a Terraform repo that looks something like this:
infrastructure
global
main.tf
The main.tf file references a module in a remote repository:
module "global" {
source = "git#github.com/company/repo//domain/global"
}
and the above module references another module within the same remote repo, in its own main.tf:
module "global" {
source = "git#github.com/company/repo//infrastructure/global"
}
If I make a change in this module that's 3 levels deep, and then run terraform get and terraform init in the top-level Terraform project followed by terraform plan, those changes aren't picked up.
Is there any reason for this?
I needed to do one of the following:
1) when running terraform init, pass the flag -upgrade=true
2) or, if running terraform get, pass the flag -update=true
This downloads the latest versions of the requested modules.
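Concretely:

terraform init -upgrade=true   # re-checks module sources and downloads newer copies
# or
terraform get -update=true     # re-downloads modules already present in .terraform/modules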
The Hashicorp Consul repository contains a Terraform module for launching a Consul cluster in AWS. The module references several files that are found in the parent directory of the module under the shared/scripts directory here https://github.com/hashicorp/consul/tree/master/terraform
However, when I reference the module in one of my .tf files and run terraform get to download the module, the required files under shared/scripts/ are not included with the downloaded module files, leading to errors like the one described here
My module section in Terraform looks like this:
module "consul" {
source = "github.com/hashicorp/consul/terraform/aws"
key_name = "example_key"
key_path = "/path/to/example_key"
region = "us-east-1"
servers = "3"
platform = "centos7"
}
Is there any way to have terraform get pull in files that live outside the module directory?
Thanks
From looking at what those files do, I'd just copy the ones you need (depending on whether you're deploying on Debian or RHEL), which will be 2-3 files, and feed them into provisioner "file":
https://www.terraform.io/docs/provisioners/file.html
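A minimal sketch of that approach, assuming the needed scripts have been copied next to your own configuration and the cluster runs CentOS; the file names, SSH user, and key path below are assumptions:

resource "aws_instance" "consul" {
  # ... ami, instance_type, key_name, etc. ...

  # Upload a locally copied script to the instance
  provisioner "file" {
    source      = "${path.module}/scripts/install.sh"   # hypothetical local copy
    destination = "/tmp/install.sh"
  }

  connection {
    type        = "ssh"
    user        = "centos"
    host        = "${self.public_ip}"
    private_key = "${file("/path/to/example_key")}"
  }
}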