Iterate through map for file function - terraform

I have a use case for uploading multiple files to S3 using Terraform. I would like to upload multiple objects using the count function. In doing this I need to iterate through the source with file("${path.module}/path/to/file").
Is there any way to make the file function a mapped variable leveraging count.index?
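For context, a count-based version along the lines of what the question asks could look something like this (the variable name and file names here are hypothetical):
variable "upload_files" {
  type    = list(string)
  default = ["one.txt", "two.txt"]
}

resource "aws_s3_bucket_object" "example" {
  count = length(var.upload_files)

  bucket  = "example"
  key     = var.upload_files[count.index]
  content = file("${path.module}/files/${var.upload_files[count.index]}")
}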

Terraform 0.12.8 introduced a new function fileset which can return a set of file paths matching a particular pattern in a particular base directory.
We can combine that with resource for_each (rather than count) to upload the matching files to S3, like this:
resource "aws_s3_bucket_object" "example" {
for_each = fileset("${path.module}/files", "*") # could use ** instead for a recursive search
bucket = "example"
key = each.value
source = "${path.module}/${each.value.source_path}"
# Unless the bucket has encryption enabled, the ETag of each object is an
# MD5 hash of that object.
etag = filemd5("${path.module}/${each.value.source_path}")
}
Using for_each instead of count here means that Terraform will identify each instance of the resource by its S3 path rather than by its position in a list, and so you can add and remove files without disturbing other files. For example, if you have a file called example.txt then Terraform will track its instance as aws_s3_bucket_object.example["example.txt"], rather than an address like aws_s3_bucket_object.example[3] where 3 is its position in the list of files.
I have written a Terraform module that builds on fileset to also support template rendering and detecting filetypes based on filename suffixes, which might make life easier in some more complicated situations: apparentlymart/dir/template. You can use its result with aws_s3_bucket_object in a similar way to the above, as shown in its README.
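For illustration, wiring that module up to aws_s3_bucket_object might look roughly like this; the argument and attribute names here are from memory of the module's README, so check the README for the exact interface:
module "template_files" {
  source   = "apparentlymart/dir/template"
  base_dir = "${path.module}/files"
}

resource "aws_s3_bucket_object" "example" {
  for_each = module.template_files.files

  bucket       = "example"
  key          = each.key
  content_type = each.value.content_type

  # Only one of these is non-null for a given file, depending on whether
  # the module treated it as a template or a static file.
  source  = each.value.source_path
  content = each.value.content
}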

Related

How to assign certain Terraform Variables via TFVARS file, while others via Terraform Cloud Variable sets

I currently have a dev.auto.tfvars file with a few dozen variables and their values for my application, such as:
DB_NUM_RECORDS_PER_EXECUTION = "Something here"
QUEUE_API_KEY = "Something here"
application = "My Appplication"
Application_Secret_Key = "Some Secret Key here"
All variables are defined in a variables.tf file, with the sensitive ones assigned a blank value.
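For illustration, the declarations presumably look something along these lines (the blank default is the placeholder described above; the sensitive flag is an assumption and needs Terraform 0.14+):
variable "DB_NUM_RECORDS_PER_EXECUTION" {
  type = string
}

variable "Application_Secret_Key" {
  type      = string
  default   = ""   # blank placeholder; the real value should come from the variable set
  sensitive = true # assumption: marked sensitive, requires Terraform 0.14 or later
}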
I run Terraform in this manner:
terraform apply -var-file=env/dev.auto.tfvars
(I also have qa.auto.tfvars and prod.auto.tfvars for the other environments.)
I want to inject certain sensitive values for certain keys, such as Application_Secret_Key, via Terraform Cloud variable sets so developers don't have to.
I have added Application_Secret_Key in the TF Cloud Variable set.
But when I run the above, Terraform Cloud does not inject the value stored in the variable set; instead it assigns the blank value defined in my configuration.
It is my understanding that the *.auto.tfvars files take precedence and overwrite the terraform.tfvars file in Terraform Cloud, hence the sensitive values are blank.
I do not want to add dozens of variables in the TF Cloud Variable set...only certain sensitive ones.
Is this possible?
Thanks in advance

AWS Boto3 S3: I accidentally renamed a set of files as empty string. They're gone right?

To "rename" some files I copied them with a new name and then deleted the originals. In creating the new name I meant to do this:
new_key_path = '.'.join(key_path.split('.')[0:3])
But I did this
new_key_path = '.'.join(str.split('.')[0:3])
key_path vs str: the former is a valid variable (a path to a file), while the latter is Python's built-in str type, so str.split('.') quietly split the literal string '.' instead of raising an error. The result was that every iteration set new_key_path to '.'. The rest of the logic was such that I was essentially copying to the "root" of the S3 bucket...
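A quick demonstration of the difference (the key_path value here is made up):
key_path = 'reports/2021/data.csv.gz.bak'

# Intended: keep the first three dot-separated parts of the key
new_key_path = '.'.join(key_path.split('.')[0:3])
print(new_key_path)   # reports/2021/data.csv.gz

# Typo'd version: str.split('.') splits the literal string '.' (on whitespace),
# giving ['.'], so the join collapses to '.'
new_key_path = '.'.join(str.split('.')[0:3])
print(new_key_path)   # .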
Anyway, I can get the data back elsewhere but just want to validate that I indeed messed up in this specific way. I don't see it anywhere else in the bucket. Thanks
EDIT: adding the example renaming code. This is out of the box with Boto3.
# Copy the object to the new key, then delete the original
self.s3_resource.Object(self._bucket_name, new_key_path).copy_from(CopySource=copy_src)
self.s3_resource.Object(self._bucket_name, key_path).delete()

How can I convert a tar encoded file into a terraform variable?

I'm trying to copy a license file onto an instance using a provisioner in terraform.
I'm trying to minimize the number of files in my directory, so I would like to avoid having any extra files. For other files, I was able to pass content to the destination in a Terraform provisioner using content rather than source. However, since the file I'm trying to copy is a tar archive, I can't find a way to convert it into a string format that can be expressed as a value for content.
It also needs to be decoded using Terraform's language so that it can be read properly by the VM. This unfortunately limits my options to Terraform's decoding functions (https://www.terraform.io/docs/configuration/functions/base64decode.html).
I was thinking my best option was to try to encode it into base64, but I couldn't find a way to do it. Any suggestions?
A quick way:
First, encode your tar file as base64 and save the output:
cat <tar_file>.tar | base64 -w0 | tee output.txt
Note: -w0 keeps the output on one line.
Then save the base64 string to AWS Systems Manager Parameter Store as a String or SecureString, named, for example, license_key.
Then you can read it back in Terraform:
data "aws_ssm_parameter" "foo" {
name = "license_key"
}
locals {
license = base64encode(data.aws_ssm_parameter.foo.value)
}
output "license" {
value = local.license
}
The code above is for Terraform 0.12.x.
The rest, wiring this into a provisioner, is up to you (a sketch follows).
The code is untested; it is just to illustrate the idea.
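For illustration, one possible way to finish the job with a file provisioner and remote-exec, assuming SSH access to the instance (the AMI, user, and destination paths are hypothetical):
resource "aws_instance" "example" {
  ami           = "ami-12345678" # hypothetical AMI ID
  instance_type = "t3.micro"

  connection {
    type = "ssh"
    user = "ubuntu"
    host = self.public_ip
  }

  # Push the base64 text to the VM...
  provisioner "file" {
    content     = local.license
    destination = "/tmp/license.b64"
  }

  # ...and decode it back into the original tar archive there.
  provisioner "remote-exec" {
    inline = [
      "base64 -d /tmp/license.b64 > /tmp/license.tar",
    ]
  }
}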

Is there a way to input variable values from outside to terraform main file?

Is there a way I can input variable values into the Terraform main file from outside? It could be an Excel sheet or a SQL DB. Is it possible to do so?
What you can't currently do is point your command line at a DB, i.e. to replace a tfvars file, but what you can set up in Terraform is a number of different key/value stores:
Consul
https://www.terraform.io/intro/examples/consul.html
AWS Parameter Store (using a resource or a data source)
https://www.terraform.io/docs/providers/aws/d/ssm_parameter.html
There are quite a number of other key/value stores to choose from, but there's no zero-code solution and you will end up with lots of statements like this:
Set up a key in Consul to provide inputs:
data "consul_keys" "input" {
  key {
    name    = "size"
    path    = "tf_test/size"
    default = "m1.small"
  }
}
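The AWS Parameter Store equivalent looks similar (the parameter name here is hypothetical):
data "aws_ssm_parameter" "size" {
  name = "/tf_test/size"
}

# Reference it elsewhere as data.aws_ssm_parameter.size.value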
There are many ways to do that:
You can use a tfvars file with all your inputs, and you can have one file per customer, user, or environment
You can pass the variables to the terraform executable on the command line
You can define environment variables prefixed with TF_VAR_<variable name> (see the sketch after this list)
You can use https://www.terraform.io/docs/providers/aws/d/ssm_parameter.html as suggested above
You can even store variables in DynamoDB or any other database
You can use Consul + Vault as well
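For instance, the command-line and environment-variable options look like this (the variable name and value are made up):
# Pass a value on the command line
terraform apply -var 'instance_type=t3.micro'

# Or export it as an environment variable that Terraform picks up automatically
export TF_VAR_instance_type=t3.micro
terraform apply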

How do I append array items to a string over a loop in puppet

Let's say I have an array of directory names:
$dirs = ['opt', 'apps', 'apache']
I want to iterate over it and generate a list of the following paths:
/opt
/opt/apps
/opt/apps/apache
from which I can create file resources.
Is there a reason you want to iterate through those files like that?
Because the simplest way to turn those into file resources would be this:
$dirs = ['/opt', '/opt/apps', '/opt/apps/apache']

file { $dirs:
  ensure => directory,
}
If you just want to make sure that all the preceding directories are created, there is also the dirtree module, which will do this all for you:
https://forge.puppet.com/pltraining/dirtree
$apache_dir = dirtree('/opt/apps/apache')
# Will return: ['/opt', '/opt/apps', '/opt/apps/apache']
You can then use that variable to create the directories.
As Matt mentions, you can also use maps, or an iterator to create the resources.
Basic example here:
$dirs = ['/opt', '/opt/apps', '/opt/apps/apache']

$dirs.each |String $path| {
  file { $path:
    ensure => directory,
  }
}
Documented here: https://docs.puppet.com/puppet/latest/lang_iteration.html
There are a few different ways to do what you want in code; it depends on how much management you want to do of those resources after creation.
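If you do want to build the cumulative paths from the short names in the question ('opt', 'apps', 'apache'), one option is reduce; here is a sketch (variable names are arbitrary):
$dirs = ['opt', 'apps', 'apache']

# Accumulate each element onto the previous path, producing
# ['/opt', '/opt/apps', '/opt/apps/apache']
$paths = $dirs.reduce([]) |$memo, $dir| {
  $parent = $memo[-1]   # undef on the first pass, so the first path is "/opt"
  $memo + ["${parent}/${dir}"]
}

file { $paths:
  ensure => directory,
}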
