Use one terraform versions.tf for multiple terraform directories? - terraform

I'm updating my project to use the latest version of Terraform. Each Terraform directory in my project has its own versions.tf file:
terraform
├── s3
│   ├── main.tf
│   ├── variables.tf
│   └── versions.tf
└── pinpoint
    ├── main.tf
    ├── variables.tf
    └── versions.tf
To keep things clean and avoid repeating myself by updating the version in every versions.tf, I was wondering if it was possible to have just one versions.tf file in the project that all directories can use like this:
terraform
├── s3
│   ├── main.tf
│   └── variables.tf
├── pinpoint
│   ├── main.tf
│   └── variables.tf
└── versions.tf
Is this possible? Is there a way to reference another versions file? (terraform version 0.12.31)

The typical content of versions.tf is a description of the providers needed by each specific module and, if needed, the version of Terraform that module is intended to support.
There should be no reason to share these declarations between modules because they are module-specific: one module might require a newer version of a provider than another, for example. It is not repetition to state the requirements for each module just because some of them happen to currently have the same requirements, because these modules should be able to evolve independently over time rather than always changing their requirements in lockstep.
Terraform will automatically select the newest available version of each required provider that meets the constraints of all modules involved in a particular configuration, so as long as all of your modules can agree on one version they are compatible with there is no need to tightly coordinate them.
Note that the required_providers block is intended to represent the set of provider versions that your module is known to be compatible with. That's not the appropriate place to select only a single version of a provider. Terraform automatically records the single selected version of each provider in the dependency lock file, so you can check that file into version control to fix a particular set of version selections, and then run terraform init -upgrade whenever you want to update that file to select newer available versions that are also compatible with your modules' version constraints.
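For example, each module's versions.tf might contain something like the following (a minimal sketch for a modern Terraform release; the provider choice and the constraint ranges are illustrative):

terraform {
  # The range of Terraform versions this module is known to work with.
  required_version = ">= 0.13"

  required_providers {
    aws = {
      source = "hashicorp/aws"
      # A range, not a single pinned version; the dependency lock file
      # records the exact version that terraform init selects.
      version = ">= 3.0, < 5.0"
    }
  }
}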
If you really do still want to share a file between multiple configurations, you can achieve that using the symbolic link mechanism provided by your operating system to make the same file appear in multiple directories. Terraform isn't aware that the file is shared across all of them, so it will still treat each directory as separate, but you can update the real file and all of the symbolic links will refer to the updated copy.
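For example, with the layout from the question, running something like this from inside the terraform directory would do it (note that a symlink's target is resolved relative to the link's own location):

ln -s ../versions.tf s3/versions.tf
ln -s ../versions.tf pinpoint/versions.tf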

Related

How to resolve relative path hell (node with typescript)

I am developing a CLI using Node, TypeScript and other packages. The idea is simple: the CLI gives you an initial structure for your Node projects (JavaScript or TypeScript).
Well, while testing, I discovered a problem that I have not been able to solve for days. The initial structure is:
├── src
│   ├── api
│   │   └── users
│   │       ├── users.controller.ts/js
│   │       ├── users.interface.ts (just for typescript projects)
│   │       ├── users.routes.ts/js
│   │       └── users.service.ts/js
│   ├── config
│   │   └── environments
│   │       ├── development.ts/js
│   │       ├── index.ts/js
│   │       └── production.ts/js
│   ├── global
│   │   └── services
│   │       └── abstract.service.ts (just for typescript projects)
│   ├── middlewares
│   ├── database.ts/js
│   ├── index.ts/js
│   └── server.ts/js
├── .editorconfig
├── .env
├── .gitignore
├── package.json
├── package-lock.json
└── tsconfig.json (just for typescript projects)
The TypeScript service files extend an abstract service (located in ./src/global/services/abstract.service.ts). Each component lives in the ./src/api directory and has its respective folder (like the users component). The relative path to import the abstract class in the users component is ../../global/services/abstract.service
But a component can be created in different directories, so that relative path does not always work. To fix this problem I decided to use a path-alias configuration like webpack's. So instead of the relative path above, I want to use something like @global/services/abstract.service
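(For reference, such an alias would be declared in tsconfig.json roughly like this; a sketch using the @global prefix from above:)

{
  "compilerOptions": {
    "baseUrl": "./src",
    "paths": {
      "@global/*": ["global/*"]
    }
  }
}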
I achieved this when the Node project uses JavaScript with Babel:
@babel/polyfill
@babel/cli
@babel/core
@babel/node
@babel/preset-node
babel-plugin-module-resolver
But I have not achieved that when it is a TypeScript project. tsconfig.json has the "paths" option, which works in development, but when I compile TypeScript to JavaScript it does not work. I tried many ways to resolve this but everything failed. These are the packages that I've used up to now:
link-module-alias
ts-node with tsconfig-paths
ttypescript
@zerollup/ts-transform-paths
The second one works correctly, but I must restart for each change that I make in the ts files; maybe I did not configure the scripts correctly in package.json, because tsc-watch did not work. The links I saw:
Require absolute instead of relative
ttypescript
ts-transform-paths
link-module-alias
Relative path hell
tsconfig paths
Please, if anyone knows how to solve this relative path hell, I would appreciate it if you could tell me how, with examples or something to help.

Store "Provider" and "Terraform" version in separate TF file?

Currently, I have to define the Terraform version and Provider version in each of my Terraform templates.
I would like to have a file outside of my Terraform templates where I can define the Provider version and Terraform version to use for every template in the directory structure.
I've looked at using an overrides file (https://www.terraform.io/docs/configuration/override.html), but it appears I'd have to define the whole Terraform/Provider block from each template within the override.tf file. I'd really like to just be able to tell Terraform to look at (pseudo-file) versions.tf, for example, to get the necessary versions for every template.
So something like this would be the desired (simplified) directory structure:
terraform
├── dev
│   └── main.tf
├── prod
│   └── main.tf
├── stg
│   └── main.tf
└── versions.tf
Right now there would need to be only one Provider version and one Terraform version defined in versions.tf
Is it possible to pass Provider/Terraform version into templates in this way?
If you don't want to use Terraform Workspaces then create a symlink in each subdirectory pointing to the versions.tf file.
For example, if your structure is:
terraform
├── dev
│ └── main.tf
├── prod
│ └── main.tf
├── stg
│ └── main.tf
└── versions.tf
And you want each subdirectory (dev, prod, stg) to point to the versions.tf file in the root terraform directory, create a symlink in each subdirectory (note that a symlink's target is resolved relative to the link's own directory):
ln -sf ../versions.tf terraform/dev/versions.tf
ln -sf ../versions.tf terraform/prod/versions.tf
ln -sf ../versions.tf terraform/stg/versions.tf
Your final structure would be:
terraform
├── dev
│   ├── main.tf
│   └── versions.tf -> ../versions.tf
├── prod
│   ├── main.tf
│   └── versions.tf -> ../versions.tf
├── stg
│   ├── main.tf
│   └── versions.tf -> ../versions.tf
└── versions.tf
Using Terraform Workspaces, instead of having a sub-directory for each "environment", you have a single directory like:
terraform
├── main.tf
└── versions.tf
And a workspace per environment which you create by doing:
terraform workspace new dev
You then use Terraform interpolation to do something different depending on which environment you are working on; Terraform state is also stored separately per workspace.
So if you want to work on the dev environment you switch to that one:
terraform workspace select dev
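For example, a minimal sketch of such interpolation (the variable name and values are illustrative): terraform.workspace holds the name of the currently selected workspace, so you can branch on it:

locals {
  instance_type = terraform.workspace == "prod" ? "m5.large" : "t3.micro"
}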

What is the purpose of variables.tf in Terraform root directory when using modules?

I am writing some code to describe infrastructure on AWS, and I am following Terraform best practices. To make my code more reusable and future-proof, I am using modules. In the end, my code may look like this:
├── modules
│   └── aws_vpc
│       ├── main.tf
│       └── vars.tf
├── prod
│   ├── main.tf
│   └── variables.tf
├── terraform.tfstate
└── terraform.tfstate.backup
I am NOT using Terraform workspaces for the sake of simplicity.
The question is: what is the purpose of variables.tf within the Terraform root directory if I can't reuse it in modules?
The idea is to have separate directories for each environment (dev, prod and qa) where I can reuse all the modules defined in the modules directory and use environment-specific variables defined in the env directory.
There are similar discussions on the Terraform GitHub pages, but as far as I can see, such use is not encouraged.
So, what is the purpose of variables.tf in the Terraform root directory if I can't reuse it later in modules?
I know that there is Terragrunt, which acts as a wrapper around Terraform, but I would like to stick with Terraform alone.
Modules have inputs and outputs. You can invoke the module multiple times, but you have to provide required inputs each time... you cannot simply rely on variables set in the calling context (although you can pass those values in as parameters).
Think of a module like a function, not an include. You have to pass in arguments. Modules do not have access to the global scope. This is generally a good programming practice... global variables lead to spaghetti code and unintended consequences. Some good reading here.
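As a sketch (the module path and variable name are illustrative, borrowing the layout from the question):

# ./prod/main.tf
module "vpc" {
  source = "../modules/aws_vpc"

  # Inputs must be passed explicitly, like function arguments; the module
  # cannot see var.cidr_block from the calling context on its own.
  cidr_block = var.cidr_block
}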
Your module is being invoked by Terraform code in your environment directory ./prod/, which itself has variable declarations in ./prod/variables.tf. Since you're creating a directory per environment, the only way to get around sharing data or resources at this top level is to copy/paste, or to symlink files like so:
├── modules
│   └── aws_vpc
│       ├── main.tf
│       └── vars.tf
├── prod
│   ├── global_variables.tf (symlink from ./shared/global_variables.tf)
│   ├── main.tf
│   └── variables.tf
├── shared
│   └── global_variables.tf
├── terraform.tfstate
└── terraform.tfstate.backup
Note that unless you actually want to change these values at runtime with -var or -var-file, you might want to be using locals instead. I used to use variables with default values rather than locals, but if a value will never change for a given environment, it makes sense to use a local.
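A minimal sketch of the difference (the names and values are illustrative):

# Can be overridden at runtime with -var="region=..." or a -var-file:
variable "region" {
  default = "us-east-1"
}

# Fixed for this configuration; cannot be overridden at runtime:
locals {
  region = "us-east-1"
}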
In any given Terraform dir, I don't think about any .tf files as having any particular function apart from giving you some visibility into how you've broken out your resources and/or arranged your code. So main.tf and variables.tf could be consolidated into 1 file without any loss of function.
In your setup, your variables.tf would have a set of vars that each have a default value (or else terraform commands would prompt for their values). Alternatively, you could omit the default values and instead create a terraform.tfvars file that sets values for each var.
In this setup with separate dirs for each environment (prod, test, dev, etc.), I prefer to use terraform.tfvars as I find it easier to do diffs and I know that the only thing that I need to modify with any given env is the terraform.tfvars file.
For example, cidr_block may be a variable that you specify for each env, and then pass to the aws_vpc module. prod may be 192.168.0.0/16 while test might have 10.1.0.0/16.
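So each environment's terraform.tfvars might differ only in its values (a sketch based on the cidr_block example above):

# prod/terraform.tfvars
cidr_block = "192.168.0.0/16"

# test/terraform.tfvars
cidr_block = "10.1.0.0/16"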
As for the module, although it may seem like all of its vars just duplicate the code in the environment dirs, that's not a given. For example, you may have a var in your env code that is a boolean, passed into your module as an input that the module then uses to make certain 'decisions'.
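For instance (a hypothetical sketch; the flag and resource names are illustrative), the env code might pass create_igw = true into the module, and inside the module:

variable "create_igw" {
  type    = bool
  default = false
}

resource "aws_internet_gateway" "this" {
  # The boolean input decides whether this resource exists at all.
  count  = var.create_igw ? 1 : 0
  vpc_id = aws_vpc.this.id # assumes the module also defines aws_vpc.this
}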

How to use Bazel with Node.js

My understanding is that Bazel expects projects to be under a monorepo with a WORKSPACE file at the top-level and BUILD files in every project:
Repo
├── ProjectA
│   └── BUILD
├── ProjectB
│   └── BUILD
└── WORKSPACE
However, going through the Bazel NodeJS rules documentation, it seems to suggest that every project should have its own WORKSPACE file where it defines its dependencies, i.e. ...
Repo
├── ProjectA
│   ├── BUILD
│   └── WORKSPACE
└── ProjectB
    ├── BUILD
    └── WORKSPACE
This looks similar to a multi-repo with every project referencing other projects as an external dependency, which seemed okay to me, until I realized that for external dependencies, Bazel requires all transitive dependencies to be specified in the WORKSPACE file for every package, which is definitely not ideal.
What's the easiest way to use Bazel with NodeJS projects, with some projects possibly written in other languages? Also, is there an example somewhere for Bazel being used in a multi-repo setting?
Thanks!
I think the 2 possible options are in fact
Repo
├── MyProject
│   └── BUILD
├── third_party
│   └── ProjectB
│       └── BUILD
└── WORKSPACE
or
Repo
├── MyProject
│ └── BUILD
└── WORKSPACE
where in the second case the WORKSPACE references ProjectB with the npm_install rule as defined in https://github.com/bazelbuild/rules_nodejs#using-bazel-managed-dependencies
I'm still trying to figure this out myself, but what I've gathered so far is that there is only one WORKSPACE file at the root of the repo. You need to have a package.json file (probably at the root) containing all the dependencies used in the whole repo, then call npm_install or yarn_install in the WORKSPACE file to download them all.
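A sketch of that WORKSPACE wiring (the exact setup varies by rules_nodejs version):

load("@build_bazel_rules_nodejs//:index.bzl", "npm_install")

npm_install(
    # The name "npm" is what @npm//... labels elsewhere refer to.
    name = "npm",
    package_json = "//:package.json",
    package_lock_json = "//:package-lock.json",
)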
Then your package BUILD file can reference a dependency with @npm//some_package, as in:
filegroup(
    name = "sources",
    srcs = ["index.js"],
)

npm_package(
    name = "pkg",
    srcs = ["package.json"],
    deps = [
        ":sources",
        "@npm//lodash",
    ],
)
There are a few different dependency edge cases I haven't figured out yet so this may not be perfectly correct. Good luck.

Folder structure for a Node.js project

I notice that Node.js projects often include folders like these:
/libs, /vendor, /support, /spec, /tests
What exactly do these mean? What's the difference between them, and where should I include referenced code?
Concerning the folders you mentioned:
/libs is usually used for custom classes/functions/modules
/vendor or /support contains 3rd party libraries (added as a git sub-module when using git as source control)
/spec contains specifications for BDD tests.
/tests contains the unit tests for an application (using a testing framework, see here)
NOTE: both /vendor and /support are deprecated since NPM introduced clean package management. It's recommended to handle all 3rd-party dependencies using NPM and a package.json file.
When building a rather large application, I recommend the following additional folders (especially if you are using some kind of MVC- / ORM-Framework like express or mongoose):
/models contains all your ORM models (called Schemas in mongoose)
/views contains your view-templates (using any templating language supported in express)
/public contains all static content (images, style-sheets, client-side JavaScript)
/assets/images contains image files
/assets/pdf contains static pdf files
/css contains style sheets (or compiled output by a css engine)
/js contains client side JavaScript
/controllers contains all your express routes, separated by module/area of your application (note: when using the bootstrapping functionality of express, this folder is called /routes)
I got used to organizing my projects this way and I think it works out pretty well.
Update for CoffeeScript-based Express applications (using connect-assets):
/app contains your compiled JavaScript
/assets/ contains all client-side assets that require compilation
/assets/js contains your client-side CoffeeScript files
/assets/css contains all your LESS/Stylus style-sheets
/public/(js|css|img) contains your static files that are not handled by any compilers
/src contains all your server-side specific CoffeeScript files
/test contains all unit testing scripts (implemented using a testing-framework of your choice)
/views contains all your express views (be it jade, ejs or any other templating engine)
There is a discussion on GitHub because of a question similar to this one:
https://gist.github.com/1398757
You can use other projects for guidance, search in GitHub for:
ThreeNodes.js - in my opinion, it seems to have a specific structure not suitable for every project;
lighter - a simpler structure, but it lacks a bit of organization;
And finally, a book (http://shop.oreilly.com/product/0636920025344.do) suggests this structure:
├── index.html
├── js/
│   ├── main.js
│   ├── models/
│   ├── views/
│   ├── collections/
│   ├── templates/
│   └── libs/
│       ├── backbone/
│       ├── underscore/
│       └── ...
├── css/
└── ...
You can see more examples from my project architecture here:
├── Dockerfile
├── README.md
├── config
│   └── production.json
├── package.json
├── schema
│   ├── create-db.sh
│   └── db.sql
├── scripts
│   └── deploy-production.sh
├── src
│   ├── app -> contains the API routes
│   ├── db -> DB models (ORM)
│   └── server.js -> the server initializer
└── test
Basically, the logical app is separated into db and app folders inside the src dir.
Assuming we are talking about web applications and building APIs:
One approach is to categorize files by feature. To illustrate:
We are developing a library application. In the first version of the application, a user can:
Search for books and see metadata of books
Search for authors and see their books
In a second version, users can also:
Create an account and log in
Loan/borrow books
In a third version, users can also:
Save a list of books they want to read/mark favorites
First we have the following structure:
books
├── entities
│   ├── book.js
│   └── author.js
├── services
│   ├── booksService.js
│   └── authorsService.js
├── repositories
│   ├── booksRepository.js
│   └── authorsRepository.js
├── controllers
│   ├── booksController.js
│   └── authorsController.js
└── tests
    └── ...
We then add on the user and loan features:
user
├── controllers
├── entities
├── services
└── ...
loan
├── controllers
└── ...
And then the favorites functionality:
favorites
├── controllers
├── entities
└── ...
For any new developer who gets handed the task of making the book search also return information about whether a book has been marked as a favorite, it's really easy to see where in the code he/she should look.
Then when the product owner sweeps in and exclaims that the favorites feature should be removed completely, it's easy to remove it.
It's important to note that there's no consensus on what's the best approach and related frameworks in general do not enforce nor reward certain structures.
I find this to be a frustrating and huge overhead but equally important. It is sort of a downplayed version (but IMO more important) of the style guide issue. I like to point this out because the answer is the same: it doesn't matter what structure you use as long as it's well defined and coherent.
So I'd propose to look for a comprehensive guide that you like and make it clear that the project is based on this.
It's not easy, especially if you're new to this! Expect to spend hours researching. You'll find most guides recommending an MVC-like structure. While several years ago that might have been a solid choice, nowadays that's not necessarily the case. For example here's another approach.
This is an indirect answer, about the folder structure itself, but very related.
A few years ago I had the same question and settled on a folder structure, but I had to do a lot of directory moving later on, because the folders turned out to be meant for a different purpose than what I had read on the internet; that is, what a particular folder does means different things to different people.
Now, having done multiple projects, in addition to the explanations in all the other answers, on the folder structure itself I would strongly suggest following the structure of Node.js itself, which can be seen at https://github.com/nodejs/node. It has great detail on everything, say linters and others: what file and folder structure they have and where. Some folders have a README that explains what is in that folder.
Starting with the above structure is good because, when some day a new requirement comes in, you will have scope to improve, as this structure is already followed by Node.js itself, which has been maintained for many years now.
Just clone the repo from GitHub
https://github.com/abhinavkallungal/Express-Folder-Structure
This is a basic structure for a Node.js/Express.js project, with MongoDB already set up as the database, hbs as the view engine, and nodemon as well, so you can easily set up a Node.js Express project.
Step 1: download or clone the repo
Step 2: Open in any code editor
Step 3: Open the terminal at the folder path
Step 4: Run the command npm start in the terminal
Step 5: start coding
