Referencing Lambda Source Files in Sibling Directory using Terraform

I am attempting to deploy a Lambda function using Terraform, where my source files are in a different directory adjacent to where I have my Terraform files. I want to have Terraform do the zipping of the source files for me and deploy them into the Lambda. Terraform doesn't seem to want to recognize that my files are there, though.
My directory structure:
project_root/
  deployment/
    terraform/
      my-terraform.tf
  function_source/
    function.py
I want it to package everything in the function_source directory (there is only one file there now, but there may be more later) and drop the .zip into the deployment directory.
My Terraform:
data "archive_file" "lambda_zip" {
type = "zip"
output_path = "../function.zip"
source_dir = "../../function_source/"
}
resource "aws_lambda_function" "my_lambda" {
filename = "${data.archive_file.lambda_zip.output_path}"
function_name = "my-function"
role = "${aws_iam_role.lambda_role.arn}"
handler = "function.handler"
runtime = "python3.7"
}
When I run this, though, I get the error message:
data.archive_file.lambda_zip: data.archive_file.lambda_zip: error archiving directory: could not archive missing directory: ../../function_source/
I have tried using absolute paths without success (which wouldn't be a good solution anyway). I have also tried creating the .zip file manually and hardcoding its path directly in the Lambda declaration, but that only works if I put the .zip file in my terraform directory. It seems Terraform can only see files in its own directory or below, but I'd rather not commingle my source files there. Is there a way to do this?
I am using Terraform v0.12.4
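A likely fix, offered as a hedged sketch: Terraform resolves relative paths against the directory it is invoked from, not against the file that contains them, so anchoring both paths with path.module is usually enough. Assuming the layout above:

data "archive_file" "lambda_zip" {
  type        = "zip"
  # Anchor both paths to the directory containing this .tf file,
  # so the archive resolves regardless of the working directory
  source_dir  = "${path.module}/../../function_source"
  output_path = "${path.module}/../function.zip"
}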

Related

Terraform file path is not being resolved when using {path.cwd}

I have been trying to fix what seems to be a path-resolution issue when running Terraform. The issue I notice is that ${path.cwd} resolves to the /terraform path and not the actual path in which the .tf files and the output.tf files exist. I have tried hardcoding the path, but then I see an error about the file not being available at that path either. How can I list all the files in path.cwd to help with debugging, if that is possible? What could be causing this?
output "rendered" {
  # outputs require a label, and templatefile is a function, not a block
  value = templatefile("${path.cwd}/somefile.yml", {
    x = a.b.c.something.id
    y = c.v.b.something.id
  })
}
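To answer the debugging part, a hedged sketch: fileset() (available in Terraform 0.12.8+) can enumerate what Terraform actually sees under a directory, and comparing path.cwd with path.module often reveals the mismatch, since path.cwd points at wherever terraform was invoked:

output "debug_paths" {
  value = {
    cwd    = path.cwd
    module = path.module
    files  = fileset(path.cwd, "**") # every file under the working directory
  }
}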

Terraform, Archive failed due to missing directory

I have been working with Terraform in Azure DevOps. Before that I developed the .tf files in VS Code, and everything worked fine. When I moved the files from VS Code to Azure DevOps, I started getting an issue with the archive source file path: it is unable to find the directory. I have searched everywhere but have been unable to resolve this.
The path that worked fine in VS Code was "…/Folder name". I am using the same path in Azure DevOps, as I uploaded the complete folder that I built in VS Code, but it always fails when trying to archive the files because it is unable to find the directory.
Code block (DevOps):
terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
      # Root module should specify the maximum provider version.
      # The ~> operator is a convenient shorthand for allowing only patch releases within a specific minor release.
      version = "~>2.11"
    }
  }
}

provider "azurerm" {
  features {}
  # skip_provider_registration = true
}

locals {
  location = "uksouth"
}

data "archive_file" "file_function_app" {
  type        = "zip"
  source_dir  = "../BlobToBlobTransferPackage"
  output_path = "blobtoblobtransfer-app.zip"
}

module "windows_consumption" {
  source       = "./modules/fa"
  archive_file = data.archive_file.file_function_app
}

output "windows_consumption_hostname" {
  value = module.windows_consumption.function_app_default_hostname
}
Image of VS Code, where everything works fine:
Image of Azure DevOps, showing the missing-directory error:
Folder structure that works fine in VS Code:
Update: it was due to the path, which is fixed now.
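Since the self-answer doesn't show the fix, a hedged sketch of the usual remedy: in a pipeline the working directory often differs from the module directory, so anchoring the archive paths with path.module tends to resolve the missing-directory error:

data "archive_file" "file_function_app" {
  type        = "zip"
  # Resolve relative to this module's directory, not the pipeline's working directory
  source_dir  = "${path.module}/../BlobToBlobTransferPackage"
  output_path = "${path.module}/blobtoblobtransfer-app.zip"
}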

How to read variables from CSV in terraform

I have a CSV file with a few values.
I want to read those values into variables in Terraform.
I have tried using locals with the file path, but it shows that the path is not found. How can I read variables from a CSV in Terraform?
My git structure is like below,
where the Key_vault folder holds my Terraform code and adf_config holds my CSV file.
my main.tf is like this.
I am getting the error:
Invalid value for "path" parameter: no file exists at ./adf_config/datasets.csv; this function works only with files that are distributed as part of the configuration source code, so if this file will be created by a resource in this configuration you must instead obtain this result from an attribute of that resource.
If your Terraform module is in the Key_Vault directory and your CSV file is in adf_config then the path from the Terraform module to the CSV file must start with ../ to traverse to the parent directory.
I would also typically suggest using path.module to be explicit that we're writing a path relative to the current module, although when your module is the root module it doesn't really make any difference because path.module will always be . (the current directory) in that case. Using path.module can help with refactoring this configuration into a child module later though, since it will already be clear what this path is relative to.
locals {
  datasets = csvdecode(file("${path.module}/../adf_config/datasets.csv"))
}
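A short usage sketch (hedged: it assumes the CSV has a header row with a name column, which is not shown in the question): csvdecode returns a list of objects, one per row, so the rows can drive for_each or simple expressions:

output "dataset_names" {
  # One object per CSV row; "name" is a hypothetical column header
  value = [for row in local.datasets : row.name]
}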

how to avoid mixture of \ and / in file paths when joining paths in Docker containerized Python code

As far as I'm aware, I'm using best practices to define paths (raw strings) and to join them (os.path.join()), e.g.
import os
fdir = r'C:\Code\...\samples'
fpath = os.path.join(fdir, 'fname.ext')
and doing so has not caused me any problems when running my code within a Python or command shell. If I print fpath to the console I get consistent use of backslashes in the path:
C:\Code\...\samples\fname.ext
But when I run a Docker containerized version of the code and run the image I get the error:
FileNotFoundError: [Errno 2] No such file or directory:
'C:\Code\...\samples/fname.ext'
I don't understand why os.path.join() used a / to join fdir and fname.ext when the rest of the path uses backslashes. It doesn't do this when I run the code outside of the container.
I have tried using os.path.normpath():
fpath = os.path.join(fdir, 'fname.ext')
fpath = os.path.normpath(fpath)
as discussed here, and os.sep.join():
fpath = os.sep.join([fdir, 'fname.ext'])
as covered here, and Path().joinpath():
from pathlib import Path
fpath = Path(fdir).joinpath('fname.ext')
as well as Path() / 'path_to_add':
fpath = Path(fdir) / 'fname.ext'
as discussed here, but in every case I end up with the same result as with os.path.join().
Can someone please help me to understand what is going on and how to create consistent paths that will work whether I run the code in Python in a Windows environment, or in a Docker container?
Update Nov. 16:
In trying to keep my question brief I think I've left out details that are crucial. Apologies to those who have kindly taken the time to offer suggestions based on my incomplete description of the problem.
My code needs to import/export files from/to directories that are defined within a user-specified configuration file.
So the configuration file has a section of code where the user defines variables and paths, e.g.
samplesDir = r"path-to-samples-directory"
The variables are stored in a dictionary of dictionaries and saved as a .json file.
At the start of the code the user defines the key that selects the dictionary of interest so that at various parts in my code when a file needs to be imported/exported, the paths are at hand.
So back to my example: samplesDir is stored in the configuration dictionary, cfgDict, so all I need to do is append the file name:
sampleFpath = os.path.join(samplesDir, sampleFname)
where sampleFname is determined based on other variables.
Because of the dynamic nature of the variables (including directory and file paths), I think this rules out the use of static paths defined in a .yml with Docker Compose.
Update Nov. 18:
It may help to include a few more details and some screenshots.
The above screenshot shows the file and folder structure of the src directory containing the source code, the main app.py script for command-line use, the Dockerfile, etc.
The configs folder contains JSON files that include variables and paths to directories and files. The user can create configuration files either by copying an existing one and modifying the entries, or configuration files can be generated by calling config.py.
Within config.py I have pre-set variables and paths, so that the directory path to the configuration files (configs), sample files (sample_DROs) and others (e.g. fiducials) are all within src.
I don't anticipate any reason why the user would want to store the config files anywhere else, nor do I expect them to want to use different sample files (or move them elsewhere). However, they will undoubtedly create their own fiducials and may decide not to store them in the fiducials directory (i.e. somewhere not within the src directory).
Likewise, I have pre-set the download directory (files are fetched from a server and downloaded based on the parameters stored within the configuration files) to be inside the default Downloads directory:
rootDownloadDir = os.path.join(Path.home(), "Downloads", "xnat_downloads")
Those files are later imported, processed, and the outputs are (by default) exported into sub-directories within rootDownloadDir.
Within Dockerfile I set the working directory of the container to be that of the source code and copy all of the contents of src (with the exception of some directories defined in .dockerignore):
WORKDIR C:/Code/WP1.3_multiple_modalities/src
...
COPY . .
so that the structure of the container mimics that of WORKDIR:
Hence I have allowed for flexibility in import/export directories, and they are by default a combination of paths within and outside of the src directory. And so, the code executed within the container will need to access files both within and outside of src.
That said, I don't know what rootDownloadDir will look like when os.path.join(Path.home(), "Downloads", "xnat_downloads") is run within the container.
This has got me thinking - Is it bad practice to set the download directory outside of src?
Returning to the original error: the sample file is in the container:
From the actual behavior I can suppose that the container is based on a Unix-like image; the path separator is / on such systems.
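A minimal sketch of why the mix happens (hedged; the directory name is illustrative): on a POSIX system, os.path.join appends components with /, and a Windows-style prefix is just an opaque string to it:

import os

fdir = r'C:\Code\samples'  # hypothetical Windows-style directory string
print(os.path.join(fdir, 'fname.ext'))
# On Linux (e.g. inside the container): C:\Code\samples/fname.ext  <- the reported mixed path
# On Windows:                           C:\Code\samples\fname.ext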
To build an environment-independent path that works inside and outside of the container, you need the following steps:
1. Mount the host folder to a container directory.
2. Read an environment variable inside and outside the container.
I can show an example of how this is achievable via the docker-compose tool and its configuration file, docker-compose.yml:
# docker-compose.yml file
version: '3'
services:
  <service_name>:        # your service name here
    image: <image_name>  # name of the image your container is built on
    environment:
      - SAMPLES_PATH=/samples
    volumes:
      - C:\Code\somepath\samples:/samples
In your Python code you can use the following structure:
import os
fdir = os.getenv('SAMPLES_PATH', r'C:\Code\...\samples')
fpath = os.path.join(fdir, 'fname.ext')
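With this setup, the code reads /samples inside the container (an all-POSIX path, so the joins stay consistent) and falls back to the Windows path when run directly on the host; the same approach should extend to the download directory by mounting it and exposing its container path through another environment variable.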

How to indicate custom configuration files for terragrunt modules?

I am trying to build a Terragrunt script for deploying infrastructure to the Microsoft Azure cloud. Things are working fairly well, but I am not able to figure out one thing.
The structure of setup looks something like this:
rootdir
  terragrunt.hcl
  someconfig.hcl
  module1dir
    terragrunt.hcl
    config.auto.tfvars.json
  module2dir
    terragrunt.hcl
    config.auto.tfvars.json
  module3dir
    terragrunt.hcl
    config.auto.tfvars.json
Each module is configured using Terraform's tfvars autoloading feature via config.auto.tfvars.json. What I would like is to keep these files outside of the directory structure and somehow instruct Terragrunt to apply the correct external configuration file to the correct submodule.
Any ideas?
I solved this in the following manner:
1. Define the environment variable you plan on using, which should contain the location of the configuration files. Make sure it does not clash with anything existing; in this example we will use TGR_CFGDIR.
2. In the external configuration directory, place the module configuration files and make sure they are properly named: each file should be named after its module and end with .auto.tfvars.json. So if your module is named foo, you should have a config file foo.auto.tfvars.json.
3. Change your terragrunt modules (terragrunt.hcl) to have these statements:
locals {
  moduleconfig = get_env("TGR_CFGDIR")
  modulename   = basename(get_terragrunt_dir())
}

generate "configuration" {
  path              = "config.auto.tfvars.json"
  if_exists         = "overwrite"
  disable_signature = true
  contents          = file("${local.moduleconfig}/${local.modulename}.auto.tfvars.json")
}
And finally, call the Terragrunt CLI like this:
TGR_CFGDIR="<configdir>" terragrunt "<somecommand>"
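For illustration (hedged; the keys are hypothetical and not from the question), the external config directory would then hold one JSON file per module, e.g. <configdir>/module1dir.auto.tfvars.json:

{
  "location": "uksouth",
  "instance_count": 2
}

At runtime the generate block copies it into the module as config.auto.tfvars.json, so Terraform autoloads it exactly as before.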
