Retrieve the value of a provisioner command?

This is different from "Capture Terraform provisioner output?". I have a resource (a null_resource in this case) with a count and a local-exec provisioner that has some complex interpolated arguments:
resource "null_resource" "complex-provisioning" {
count = "${var.count}"
triggers {
server_triggers = "${null_resource.api-setup.*.id[count.index]}"
db_triggers = "${var.db_id}"
}
provisioner "local-exec" {
command = <<EOF
${var.init_command}
do-lots-of-stuff --target=${aws_instance.api.*.private_ip[count.index]} --bastion=${aws_instance.bastion.public_ip} --db=${var.db_name}
EOF
}
}
I want to be able to show what the provisioner did as output (this is not valid Terraform, just mock-up of what I want):
output "provisioner_commands" {
value = {
api_commands = "${null_resource.complex-provisioning.*.provisioner.0.command}"
}
}
My goal is to get some output like
provisioner_commands = {
  api_commands = [
    "do-lots-of-stuff --target=10.0.0.1 --bastion=77.2.4.34 --db=mydb.local",
    "do-lots-of-stuff --target=10.0.0.2 --bastion=77.2.4.34 --db=mydb.local",
    "do-lots-of-stuff --target=10.0.0.3 --bastion=77.2.4.34 --db=mydb.local",
  ]
}
Can I read provisioner configuration and output it like this? If not, is there a different way to get what I want? (If I didn't need to run over an array of resources, I would define the command in a local variable and reference it both in the provisioner and the output.)

You cannot grab the interpolated command from the local-exec provisioner block, but if you put the same interpolation into a trigger, you can retrieve it in an output with a for expression in Terraform 0.12.x:
resource "null_resource" "complex-provisioning" {
count = 2
triggers = {
command = "echo ${count.index}"
}
provisioner "local-exec" {
command = self.triggers.command
}
}
output "data" {
value = [
for trigger in null_resource.complex-provisioning.*.triggers:
trigger.command
]
}
$ terraform apply
null_resource.complex-provisioning[0]: Refreshing state... [id=9105930607760919878]
null_resource.complex-provisioning[1]: Refreshing state... [id=405391095459979423]
Apply complete! Resources: 0 added, 0 changed, 0 destroyed.
Outputs:
data = [
  "echo 0",
  "echo 1",
]
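Applied to the configuration in the question, the same idea would look roughly like this. This is only a sketch: it assumes the variables and resources from the question exist as shown, and it renames var.count to var.instance_count because Terraform 0.12 reserves count as a variable name:
resource "null_resource" "complex-provisioning" {
  count = var.instance_count

  triggers = {
    server_triggers = null_resource.api-setup[count.index].id
    db_triggers     = var.db_id

    # the full command is stored as a trigger so it can be read back in an output
    command = <<-EOT
      ${var.init_command}
      do-lots-of-stuff --target=${aws_instance.api[count.index].private_ip} --bastion=${aws_instance.bastion.public_ip} --db=${var.db_name}
    EOT
  }

  provisioner "local-exec" {
    command = self.triggers.command
  }
}

output "provisioner_commands" {
  value = {
    api_commands = [
      for trigger in null_resource.complex-provisioning.*.triggers :
      trigger.command
    ]
  }
}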

Related

Terraform local-exec command with complex variables

I need to iterate through a list of variables in a local-exec provisioner. Is that possible?
variables.tf:
variable "items" {
default = []
}
main.tf:
resource "null_resource" "loop_list" {
provisioner "local-exec" {
interpreter = ["/bin/bash", "-c"]
command = <<EOF
for i in ${join(' ', var.items)}
print $i
done
EOF
}
}
You should be able to use environment. Something like this:
variable "items" {
default = ["item1", "item2"]
}
resource "null_resource" "loop_list" {
provisioner "local-exec" {
command = "for item in $ITEMS; do echo $item >> test-file; done"
environment = { ITEMS = join(" ", var.items) }
}
}
Running terraform apply and then cat test-file yields:
item1
item2
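If you would rather keep everything inline, the same loop can be built with an interpolation instead of an environment variable. A sketch under the same assumptions as the answer above; the environment variant avoids shell-quoting surprises if the items contain characters that are special to the shell:
resource "null_resource" "loop_list_inline" {
  provisioner "local-exec" {
    interpreter = ["/bin/bash", "-c"]

    # join() produces the space-separated word list the shell loop iterates over
    command = "for item in ${join(" ", var.items)}; do echo $item >> test-file; done"
  }
}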

Conditionally triggering of Terraform local_exec provisioner based on local_file changes

I'm using Terraform 0.14 and have two resources: a local_file that creates a file on the local machine based on a variable, and a null_resource with a local-exec provisioner.
This all works as intended but I can only get it to either always run the provisioner (using an always-changing trigger, like timestamp()) or only run it once. Now I'd like to get it to run every time (and only when) the local_file actually changes.
Does anybody know how I can set a trigger that changes when the local_file content has changed? e.g. a last-updated-timestamp or maybe a checksum value?
resource "local_file" "foo" {
content = var.foobar
filename = "/tmp/foobar.txt"
}
resource "null_resource" "null" {
triggers = {
always_run = timestamp() # this will always run
}
provisioner "local-exec" {
command = "/tmp/somescript.py"
}
}
You can try using a hash of the file content to detect changes:
resource "null_resource" "null" {
triggers = {
file_changed = md5(local_file.foo.content)
}
provisioner "local-exec" {
command = "/tmp/somescript.py"
}
}
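Depending on your version of the hashicorp/local provider, local_file may also export ready-made hash attributes (for example content_md5 in newer releases); if yours does, you can reference that directly instead of hashing the content again. Treat the attribute name as an assumption to check against your provider's documentation:
resource "null_resource" "null" {
  triggers = {
    # content_md5 is exported by newer hashicorp/local releases; verify it exists in your version
    file_changed = local_file.foo.content_md5
  }

  provisioner "local-exec" {
    command = "/tmp/somescript.py"
  }
}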

How to run a null_resource in terraform at the start of the script

I have a use case where I read all of my variables from locals in Terraform, as shown below, but before that I want to run a null_resource block that executes a Python script and writes the data into the file the locals read from.
So, in simple words, my use case is to execute a null_resource block at the start of the Terraform run and then evaluate all the other resource blocks.
My current code sample is as follows:
// executing script for populating data in app_config.json
resource "null_resource" "populate_data" {
provisioner "local-exec" {
command = "python3 scripts/data_populate.py"
}
}
// reading data variables from app_config.json file
locals {
config_data = jsondecode(file("${path.module}/app_config.json"))
}
How do I achieve that? All I have tried is adding a triggers block inside locals, as follows, but even that did not work.
locals {
  triggers = {
    order = null_resource.populate_data.id
  }

  config_data = jsondecode(file("${path.module}/app_config.json"))
}
You can use depends_on
resource "null_resource" "populate_data" {
provisioner "local-exec" {
command = "python3 scripts/data_populate.py"
}
}
// reading data variables from app_config.json file
locals {
depends_on = [null_resource.populate_data]
config_data = jsondecode(file("${path.module}/app_config.json"))
}
Now the locals will always be evaluated after populate_data.
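If the locals still end up being read before the script has run (file() is evaluated while the configuration is being planned), another option is an external data source, which runs a program and feeds its JSON output directly into the dependency graph. This is only a sketch; it assumes data_populate.py can be adapted to print a flat JSON object of strings to stdout instead of (or as well as) writing app_config.json:
data "external" "config_data" {
  # the program must print a single JSON object whose values are all strings
  program = ["python3", "${path.module}/scripts/data_populate.py"]
}

locals {
  config_data = data.external.config_data.result
}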

Executing terraform local-exec

I would like to perform the following scenario in Terraform:
resource "aws_ecr_repository" "jenkins" {
name = var.image_name
provisioner "local-exec" {
command = "./deploy-image.sh ${self.repository_url} ${var.image_name}"
}
}
However, it is not executed. Does anyone have an idea what the problem could be?
I had to add a working directory
resource "null_resource" "backend_image" {
triggers = {
build_trigger = var.build_trigger
}
provisioner "local-exec" {
command = "./deploy-image.sh ${var.region} ${var.image_name} ${var.ecr_repository}"
interpreter = ["bash", "-c"]
working_dir = "${path.cwd}/${path.module}"
}
}
Now it works.
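For completeness, the same working_dir fix can also be applied to the provisioner on the aws_ecr_repository resource from the question (a sketch, assuming deploy-image.sh lives in the module directory):
resource "aws_ecr_repository" "jenkins" {
  name = var.image_name

  provisioner "local-exec" {
    # run from the module directory so the relative ./ path resolves
    working_dir = path.module
    interpreter = ["bash", "-c"]
    command     = "./deploy-image.sh ${self.repository_url} ${var.image_name}"
  }
}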

dealing with external data sources when running destroy in terraform

For an external data source, I need to run a bash command when I run terraform destroy.
Is there a way to do an if to trigger this?
data "external" "token" {
program = ["sh", "${path.module}/get_token.sh"]
query = {
controller = "${packet_device.controller.network.0.address}"
}
}
Maybe using a count as an if? But I would somehow have to make sure it runs on destroy:
count = var.myInitExData ? 1 : 0
Not sure if that works, but you could try a null_resource with a destroy-time provisioner:
resource "null_resource" "token" {
triggers = {
token = data.external.token.result
}
provisioner "local-exec" {
when = "destroy"
working_dir = path.module
command = "destroy_time_script.sh"
interpreter = ["sh"]
}
}
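If the destroy script also needs the token itself, remember that a destroy-time provisioner can only reference the resource through self, so the value has to travel through triggers. A minimal sketch, assuming get_token.sh returns a result key named token (a hypothetical key name):
resource "null_resource" "token" {
  triggers = {
    # pick out a single string key (or jsonencode the whole map) so the trigger is a string
    token = data.external.token.result["token"]
  }

  provisioner "local-exec" {
    when        = destroy
    working_dir = path.module
    interpreter = ["sh"]
    command     = "./destroy_time_script.sh ${self.triggers.token}"
  }
}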