I'm a newbie to tflint and other scanning tools, and I'm trying to understand the scanning procedure.
In my example, I have a variables.tf file where I'm defining variables like the Azure storage account name, account tier, etc.
My variables.tf file has the following invalid value, which I gave intentionally:
variable "storage_account_name"{
type = string
default = "test-sa-123"
}
In main.tf, I'm using it as var.storage_account_name.
When I run tflint, it should ideally throw an error, because a storage account name should not contain special characters, but it is not throwing any errors.
So I want to understand: is tflint capable of taking that variable from the variables.tf file and throwing an error in main.tf?
I tried Checkov also, but it is not throwing an error for this mistake either.
Is there any other tool which can scan variables.tf and throw an error in main.tf? Or do we need to write our own rule for this in tflint?
thanks,
Santosh
Hyphens are valid in Terraform variable values, so tflint won't flag that as far as I know.
It won't throw an error for this. Even if the variable is passed through to the provider and referenced by a storage account resource, I wouldn't expect that to throw an error.
There are plugin rulesets, like this Azure one for tflint, which might increase what it catches, but I still wouldn't rely on it for this purpose.
The way I'd handle this would be through Terraform variable validation.
The rules around storage account names seem to be (source):
length: 3-24
type: lowercase letters and numbers only
To express that as a regex you could use: ^[a-z0-9]*$ (see tests).
Here's a very rough example:
variable "storage_account_name" {
type = string
validation {
condition = (
length(var.storage_account_name) > 3 && length(var.storage_account_name) < 25
)
error_message = "The storage_account_name value must be between 3 and 24 characters."
}
validation {
condition = (
regex('/^[a-z0-9]*$/g', var.storage_account_name)
)
error_message = "The storage account name must be only lowercase letters and numbers."
}
}
While inserting a new AWS IAM policy rule in Terraform, terraform plan passes but terraform apply fails on the statement ID.
data "aws_iam_policy_document" "db_iam_policy_document" {
version = "2012-10-17"
statement {
actions = ["rds:DeleteDBInstance"]
effect = "Deny"
resources = [
"arn:aws:rds:us-west-2:123456789:db:*"
]
condition {
test = "StringEquals"
variable = "rds:db-tag/environment"
values = [
"production"
]
}
sid = "don't_delete_production_dbs !"
}
}
The error presented in my CI/CD pipeline was the following:
Error: error updating IAM policy arn:aws:iam::123456789:policy/my_policy_name:
MalformedPolicyDocument: Statement IDs (SID) must be alpha-numeric.
Check that your input satisfies the regular expression [0-9A-Za-z]*
According to the AWS IAM documentation, the Sid element of a policy statement (the statement ID, which helps us convey better documentation and readability for our IAM rules) supports ASCII uppercase letters (A-Z), lowercase letters (a-z), and numbers (0-9).
We need to change the sid attribute to follow those rules and get rid of the invalid characters (in this example: the space, underscores, apostrophe, and exclamation mark).
sid = "productionDBDeletionIsProhibited"
I'm working to fine-tune some of my Terraform modules, specifically around the google_compute_vpn_tunnel, google_compute_router_interface, and google_compute_router_peer resources. I'd like to make things similar to AWS, where pre-shared keys and tunnel interface IP addresses are randomized by default, but can be overridden by the user (provided they are within a certain range).
The random option is working fine. For example, to create a 20-character random password, I do this:
resource "random_password" "RANDOM_PSK" {
length = 20
special = false
}
But, I only want to use this value if an input variable called vpn_shared_secret was not defined. Seems like this should work:
variable "vpn_shared_secret" {
type = string
default = null
}
locals {
vpn_shared_secret = try(var.vpn_shared_secret, random_password.RANDOM_PSK.result)
}
resource "google_compute_vpn_tunnel" "VPN_TUNNEL" {
shared_secret = local.vpn_shared_secret
}
Instead, it seems to ignore the vpn_shared_secret input variable and just go with the randomly generated one each time.
Is try() the correct way to be doing this? I'm just now learning Terraform if/else and map statements.
try() isn't quite the right tool here: it returns the first argument that evaluates without an error, and a null value is not an error, so it won't fall back the way you expect. How about the coalesce() function instead?
The coalesce function takes any number of arguments and returns the first one that isn't null or an empty string.
locals {
  vpn_shared_secret = coalesce(var.vpn_shared_secret, random_password.RANDOM_PSK.result)
}
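If you want to fall back only when the variable is null (and let an empty string pass through), a plain conditional expression also works; a sketch using the same names as above:
locals {
  vpn_shared_secret = var.vpn_shared_secret != null ? var.vpn_shared_secret : random_password.RANDOM_PSK.result
}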
Original reference - Quit condition on Terraform blueprint
Is it still possible to make a conditional check like in the above question?
resource "null_resource" "condition_checker" {
count = "${var.variable == 1 ? 0 : 1}"
"Insert your custom error message" = true
}
A similar format does not work in Terraform 0.12 or 0.13, and I could not find any reference to the removal of this feature. Is it possible to make a check like this in 0.12 or 0.13?
Currently it is still not possible to validate inputs that require access to more than one variable. (The validation block only allows access to the variable being validated.)
A hacky validation is still possible using the external data source:
data "external" "check_valid" {
count = var.to_test == true && some_other_condition ? 1 : 0
program = ["sh", "-c", ">&2 echo Condition must be satisfied when to_test is true; exit 1"]
}
This condition is checked before terraform asks for approval of a plan.
On the output it looks like this:
Error: failed to execute "sh": Condition must be satisfied when to_test is true
on variables.tf line 1, in data "external" "check_valid":
1: data "external" "check_valid" {
What you're referring to here was never an actual Terraform feature, but rather an example of exploiting a bug in an earlier version of Terraform to get a result that Terraform had no explicit support for.
With that said, modern versions of Terraform have support for custom variable validation rules which allow you to write out variable validation checks directly inside the corresponding variable block. For example:
variable "variable" {
type = number
validation {
condition = var.variable == 1
error_message = "Variable value must always be 1."
}
}
With that said, I just copied your contrived example from the question here, so this would require some adaptation for a real example. Note also that variable validation rules can only depend on the variable value and other constants, so you can't use this for more complicated checks such as those which involve two different variables. For that sort of situation, I'd recommend refactoring so that the related values arrive in a single variable of an object type, and then the validation can check whether that object is valid, as in the sketch below.
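A minimal sketch of that refactoring, assuming two hypothetical related settings that must agree with each other:
variable "db_settings" {
  type = object({
    engine   = string
    replicas = number
  })

  validation {
    # Both values live in the same variable, so one rule can relate them.
    condition     = var.db_settings.engine != "sqlite" || var.db_settings.replicas == 0
    error_message = "Replicas must be 0 when the engine is sqlite."
  }
}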
I'm trying to use an exported attribute from an Azure Function App in Terraform to get the possible outbound IP addresses that I can add to a whitelist for a firewall.
The attribute returned is a comma-separated string of IPs.
I have tried using the split function within Terraform, but it doesn't give a list; it gives an interface which can't be used as a list. I've tried using local scopes to add square brackets around it, but still the same.
Let me just add that this is Terraform 0.11, not 0.12.
resource "azurerm_key_vault" "keyvault" {
name = "${var.project_name}-${var.environment}-kv"
location = "${azurerm_resource_group.environment.location}"
resource_group_name = "${azurerm_resource_group.environment.name}"
enabled_for_disk_encryption = true
tenant_id = "${var.tenant_id}"
sku_name = "standard"
network_acls {
bypass = "AzureServices"
default_action = "Deny"
ip_rules = "${split(",", azurerm_function_app.function.possible_outbound_ip_addresses)}"
}
tags = {
asset-code = "${var.storage_tags["asset_code"]}"
module-code = "${var.storage_tags["module_code"]}"
environment = "${var.environment}"
instance-code = "${var.storage_tags["instance_code"]}"
source = "terraform"
}
}
This comes back with the error "ip_rules must be a list".
Thanks
I think what you are seeing here is a classic Terraform 0.11 design flaw: when a value is unknown at plan time (because it will be decided only during apply), Terraform 0.11 can't properly track the type information for it.
Because possible_outbound_ip_addresses is an unknown value at planning time, the result of split with that string is also unknown. Because Terraform doesn't track type information for that result, the provider SDK code rejects that unknown value because it isn't a list.
Addressing this in Terraform 0.11 requires doing your initial run with the -target argument so that Terraform can focus on creating the function (and thus allocating its outbound IP addresses) first, and then dealing with the processing of that string separately once it's known:
terraform apply -target=azurerm_function_app.function
terraform apply # to complete the rest of the work that -target excluded
Terraform 0.12 addressed this limitation by tracking type information for both known and unknown values, so in Terraform 0.12 the split function would see that you gave it an unknown string and accept that as being correctly typed, and then it would return an unknown list of strings to serve as a placeholder for the result that will be finally determined during the apply phase.
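For comparison, under Terraform 0.12 syntax the same assignment can be written directly; a sketch reusing the resource names from the question:
resource "azurerm_key_vault" "keyvault" {
  # ... other arguments as in the question ...

  network_acls {
    bypass         = "AzureServices"
    default_action = "Deny"
    ip_rules       = split(",", azurerm_function_app.function.possible_outbound_ip_addresses)
  }
}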
If var.string is 1.2.3.4,5.6.7.8 then split(",", var.string)[0] should give you back 1.2.3.4 as a string. Your question is difficult to answer without an example.
Here is an example of how I get a list of possible outbound IPs:
declare the app names, create a data source, and then a locals value.
variable "app_services" {
  default = ["app1", "app2", "app3"]
}

data "azurerm_app_service" "outbound_ips" {
  count               = length(var.app_services)
  name                = var.app_services[count.index]
  resource_group_name = var.server_resource_group_name
}

locals {
  apps_outbound_ips = distinct(flatten(data.azurerm_app_service.outbound_ips.*.possible_outbound_ip_address_list))
}
You don't have to use a data source either; if you are building the resource in the same configuration, just use its outputs instead of a data source. In my case I use a data source because I build my apps separately.
This works flawlessly for me and produces a list of strings (each string being a unique outbound IP across the set of app services / function apps) in the form of local.apps_outbound_ips.
Enjoy :)
In my lambda.tf, I have a data resource
data "template_file" "handler" {
template = "${file("${path.module}/templates/handler.js")}"
vars = {
ENDPOINT = "${var.domain}"
PASSWORD = "${var.password}"
}
}
However - I'm encountering a syntax error:
Error: failed to render : <template_file>:280,49-50: Extra characters after interpolation expression; Expected a closing brace to end the interpolation expression, but found extra characters.
on ../docs/lambda.tf line 1, in data "template_file" "handler":
1: data "template_file" "handler" {
Is interpolation inside an interpolation allowed for Terraform? If so - any suggestions on pointing towards where the error is would be greatly appreciated.
Terraform v0.12.9.
Provider "aws" version "~> 2.7"
Not exactly clear what your template file looks like or what you are trying to do, so here are a couple different answers.
You can escape interpolation with double dollar signs: $${foo} will be rendered as a literal ${foo}.
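A minimal sketch of that escaping behavior, using an inline string template rather than your actual handler.js:
locals {
  name = "example"

  # ${local.name} is interpolated by Terraform; $${jquery} is escaped and
  # survives into the result as the literal text ${jquery}.
  rendered = "interpolated: ${local.name}, escaped: $${jquery}"
}

output "rendered" {
  value = local.rendered # => "interpolated: example, escaped: ${jquery}"
}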
Terraform does not allow dynamic construction of variable names, because it needs to be able to analyze the configuration statically (that is, without evaluating any expressions) in order to determine which order the expressions must be resolved in.
Terraform supports a map data structure that can be used to achieve this effect.
variable "var1" {
default = "value1"
}
variable "var2" {
default = "value2"
}
locals {
var3 = "${var.var1}_${var.var2}"
values = {
"value1_value2" = "local1"
"value2_value3" = "local2"
"value3_value4" = "local3"
}
}
output "val_output" {
value = "${local.values[local.var3]}"
}
If neither is what you are looking for, you need to share your template file or a modified version that duplicates the error.
The template_file data source continues to exist for users of Terraform 0.11 and earlier, but since you are using a Terraform 0.12 release I'd recommend using the templatefile function instead. Because it's built directly into Terraform, it is able to produce better error messages.
To use it, you can replace your references to data.template_file.handler.rendered with a direct call to templatefile. If you are using that rendered result in multiple locations, you can assign the templatefile result to a local value and reference that in multiple places instead, as sketched after the example below.
templatefile("${path.module}/templates/handler.js", {
ENDPOINT = var.domain
PASSWORD = var.password
})
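If you reference the rendered result in several places, a sketch of the locals variant mentioned above:
locals {
  handler_source = templatefile("${path.module}/templates/handler.js", {
    ENDPOINT = var.domain
    PASSWORD = var.password
  })
}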
The error message you saw suggests that there's a syntax error in your template itself, but because template_file is implemented in a separate provider it's reporting that syntax error in an unhelpful way, pointing to a particular source location but not including the relevant source code snippet.
If you use templatefile instead, Terraform can hopefully report this syntax error itself and give better information about it.
Either way, it seems like the syntax error is on line 280, column 49 of your handler.js file and is caused by Terraform's template engine expecting to find the } that closes a ${ ... } interpolation sequence but finding something else instead. If you correct that syntax error, template rendering should succeed with either approach.