I am using the following folder structure for my Express CRUD application. I use it to keep the code manageable, with no file containing more than 70 lines of code. What are your thoughts on the folder structure?
.
├── bin\
│ └── www
├── common\
│ ├── enums\
│ │ └── logTypesEnum.js
│ └── validators\
│ └── studentNameValidator.js
├── config\
│ └── db.js # config file for database connection
├── models\
│ └── log.js # contains model data for model 'log'
├── routes\
│ ├── log\
│ │ ├── index.js # handles all routes for /log/ endpoints and requires files in the directory and also contains middleware code
│ │ ├── insert.js # handles all routes for /log/insert endpoints
│ │ ├── remove.js # handles all routes for /log/remove endpoints
│ │ └── exportCSV.js # handles all routes for /log/exportCSV endpoints
│ └── student\
│ ├── index.js
│ ├── insert.js
│ └── remove.js
├── public\
│ ├── javascripts
│ ├── images
│ └── stylesheets
├── views\
│ ├── log\
│ │ ├── index.jade
│ │ ├── insert.jade
│ │ ├── remove.jade
│ │ └── exportCSV.jade
│ └── student\
│ ├── index.jade
│ └── insert.jade
└── app.js
I am not sure why you decided on the number 70, unless you read somewhere that 70 lines makes some sort of ideal microservice, which your structure does not allow for anyway.
As to the directory structure: I have come to the conclusion that internal directory structures are usually set by the programmer or team leader. It is more a matter of what makes sense in your head as you see the visual design and implementation of your code.
That said, IMHO, overly complicated, or let us say overly structured, directory trees in Node (and, say, PHP) cause an inordinate amount of moving up and down the tree just to reach code, classes, or plain functions. Not to mention that it becomes gibberish for those who come after you to maintain the code.
So use the directory structure you feel at home with, but make it clean and not complicated. Do not try to pigeonhole every aspect of every call and function into a specific 70-line definable directory structure (again, no clue where that number came from).
Clean, simple and sensible.
Those would be the best rules to follow IMHO.
Edit based on OP questions below:
Clarity of code is not defined by the number of lines. When I test coders, no matter what language they specialize in, I put some code up on the screen and ask them to translate that code line for line into plain English. First, this actually tests the candidate's ability. Second, if the code is clear and good, then any other coder should be able to read it and clearly understand what is going on. (So it actually tests both the original coder and the candidate.)
So clear code is not about lines. It is about excellent coding practices and the ability to make your code do what you envision.
"Microservices" has become sort of a buzzword, but essentially it simply means "focused": you are creating a module to do a specific task. Each module in your system does a specific task, or tasks essential to your system, and focuses only on that. I think that may actually be what you are striving for. There are quite a few really good articles out there on Node and microservices.
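For instance, the common/validators/studentNameValidator.js file from the question's tree is a natural place for one such focused module: it does exactly one job. A minimal sketch (the actual validation rule here is my assumption, not from the question):

```javascript
// common/validators/studentNameValidator.js (sketch)
// Assumed rule: starts with a letter; letters, spaces, apostrophes,
// hyphens; 2 to 50 characters after trimming.
function isValidStudentName(name) {
  return typeof name === 'string' &&
    /^[A-Za-z][A-Za-z '-]{1,49}$/.test(name.trim());
}

module.exports = { isValidStudentName };
```

Any route handler that needs the check can then require this one small file instead of duplicating the rule.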
I am just providing a piece of code here. When I run it, it says the index.html file is not found.
app = Flask(__name__)
CORS(app)
app.url_map.converters['everything'] = EverythingConverter

def render(duplicates, current, total):
    env = Environment(loader=FileSystemLoader('template'))
    template = env.get_template('index.html')
    return template.render(duplicates=duplicates,
                           current=current,
                           total=total)
the error is
raise TemplateNotFound(template)
jinja2.exceptions.TemplateNotFound: index.html
My file path is
/My_project
├── template
│   └── index.html
└── my_project.py
I have run the program many times, but it keeps printing the same error. Does anyone have an idea? Thanks in advance.
According to the Flask documentation your project layout should look something like this:
my_project/
│
├── templates/
│ └── base.html
│
├── static/
│ └── style.css
│
└── app.py
The directory in which you store your templates should be named:
templates/
Instead of:
template/
Don't forget to change references to this directory as well.
With three different environments, I want to be able to dynamically set variables based on the environment. In my example below, let's say the instance type is different between dev and prod. I'm not able to reference instance_type within the module UNLESS I have a vars.tf file alongside my terraform.tfvars.
The error I get is:
unknown variable referenced: 'instance_type'. define it with 'variable' blocks
If that's the case, then wouldn't this file be the same exact file under modules/apollo/vars.tf?
I thought modules/apollo/vars.tf defines the necessary variables needed for the module. I didn't think it was necessary within the "root" level under env-dev/services/apollo/. If there's a "better" way of doing this, I'm all ears.
├── env-dev
│ └── services
│ └── apollo
│ ├── main.tf
│ ├── terraform.tfvars
│ └── vars.tf # Do i need this?
├── env-test
├── global
├── mgmt
└── modules
├── apollo
│ ├── main.tf
│ ├── user_data.tpl
│ └── vars.tf
└── defaults
└── main.tf
env-dev/services/apollo/terraform.tfvars
instance_type = "t2.medium"
env-prod/services/apollo/terraform.tfvars
instance_type = "t2.large"
modules/apollo/vars.tf
variable "instance_type" {
  description = "EC2 Instance Type"
}
modules/apollo/main.tf
resource "aws_instance" "instance" {
  ...
  instance_type = "${var.instance_type}"
  ...
}
I would adjust the structure; this is my understanding of your application:
├── dev
│ └── apollo_terraform.tfvars
├── test
│ └── apollo_terraform.tfvars
├── global
│ └── apollo_terraform.tfvars
├── mgmt
│ └── apollo_terraform.tfvars
├── main.tf, vars.tf, output.tf, apollo.tf, default.tf, etc
└── modules
├── apollo
│ ├── main.tf
│ ├── user_data.tpl
│ └── vars.tf
└── defaults
└── main.tf
apollo.tf will contain the module block that sources the shared module apollo. The same goes for default.tf.
Your plan/apply command should look like this:
terraform plan -var-file=${env}/apollo_terraform.tfvars
I've been trying to achieve something similar, as intuitively this seems to be how it should work; however, I am coming to the conclusion that modules are simply not designed for this use case.
Basically, you are assigning values to variables that do not exist in your test/prod. To work around this, instead of providing assignments in .tfvars, you could try declaring the variables with default values:
env-dev/services/apollo/variables.tf
variable "instance_type" {
  default = "t2.medium"
}
env-prod/services/apollo/variables.tf
variable "instance_type" {
  default = "t2.large"
}
Having those declared and assigned default values still does not automatically link them to the input variables declared in your module, so additionally, in env-dev/services/apollo/main.tf and env-prod/services/apollo/main.tf, you would still need to fill in the properties for your module:
module "aws_inst" {
  source        = "../../../modules/apollo"
  instance_type = "${var.instance_type}"
}
You can quickly see how this defeats the purpose of modules in this scenario.
To elaborate, I think that modules were not designed for defining a single resource per module and filling in its values dynamically, but rather for creating "collections" of resources within a module, where they can share and reuse the same variables.
Note that when you assign a value to the instance_type key in the module call, you are actually passing that value to the module's input variable, which is then assigned to the resource key of the same name.
I have a folder structure like so:
.
└── client
├── components
└── routes
├── index.js
├── Login
│ ├── index.js
│ ├── assets
│ ├── components
│ ├── container
│ └── modules
└── UpdatePassword
├── index.js
├── assets
├── components
├── container
└── modules
I would like to see if anyone is importing files from the UpdatePassword folder to the Login folder and vice versa.
Basically I'm following a fractal project structure where I want components that are related to the UpdatePassword or Login route to only exist in their respective folders. Shared components would exist in the client/components subdirectory. To maintain a structure like this, I would like to write a test that fails when an 'unacceptable' imports or require is used. I.e. if a file in UpdatePassword imports from Login/components.
Is there a way to test or check whether an import is coming from specific folders?
Try madge. I usually run it as madge --image /path-to-folder/dependencies.png routes (there is also an exclude option if you need it).
You'll get a visual graph which shows you dependencies between files.
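If you want an automated failure rather than a visual graph, ESLint's import plugin has a no-restricted-paths rule that can fail linting on cross-folder imports. A sketch of the config, assuming eslint-plugin-import is installed (the paths follow the question's tree):

```javascript
// .eslintrc.js (sketch; assumes eslint-plugin-import is installed)
module.exports = {
  plugins: ['import'],
  rules: {
    'import/no-restricted-paths': ['error', {
      zones: [
        // Login may not import from UpdatePassword, and vice versa.
        { target: './client/routes/Login', from: './client/routes/UpdatePassword' },
        { target: './client/routes/UpdatePassword', from: './client/routes/Login' },
      ],
    }],
  },
};
```

Running ESLint in CI then acts as the "test that fails" the question asks for.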
I have no idea about a native way to do it, but you can wrap the require function:

var path = require('path');

function myRequire(fromPath, requiredPath) {
  // Judge whether requiredPath may be loaded from fromPath;
  // for example, forbid crossing between Login and UpdatePassword:
  var from = path.resolve(fromPath);
  var target = path.resolve(path.dirname(fromPath), requiredPath);
  var can = !((from.indexOf('Login') !== -1 && target.indexOf('UpdatePassword') !== -1) ||
              (from.indexOf('UpdatePassword') !== -1 && target.indexOf('Login') !== -1));
  if (can) {
    return require(requiredPath);
  } else {
    throw new Error(`you can not load ${requiredPath} from ${fromPath}`);
  }
}
In the Cloud9 Express IDE example, I see the folder/file routes/users.js. Please tell me why this is in a separate file. What is the intention of users.js as a distinct file?
/*
GET users listing.
*/
exports.list = function(req, res){
res.send("respond with a resource");
};
OK, I'm trying to answer that one myself (feel free to chime in):
The basic Express set up folder structure according to Expressjs.com looks like this:
├── app.js
├── bin
│ └── www
├── package.json
├── public
│ ├── images
│ ├── javascripts
│ └── stylesheets
│ └── style.css
├── routes
│ ├── index.js
│ └── users.js
└── views
├── error.jade
├── index.jade
└── layout.jade
It is suggested, not obligatory. However, for noobs like me, this helpful website says:
If you are getting started with Express it is recommended that you use the structure from the generator.
Edited:
In the particular folder structure generated by Cloud9 Express, the file /routes/users.js IS intrinsic to a typical Express-Jade setup.
The basic routing tutorial here explains what each of these files does.
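To make the separation concrete: app.js only maps URLs to handlers, while routes/users.js owns the handler bodies. Here is a plain-JavaScript sketch of that wiring, with fake req/res objects so it runs without Express (the object names are mine, not from the generator):

```javascript
// routes/users.js exports handlers, nothing else:
const usersRoutes = {
  list(req, res) {
    res.send('respond with a resource');
  },
};

// app.js-style wiring: a table mapping a path to a handler.
const routes = { '/users': usersRoutes.list };

// Fake req/res objects just to show the flow end to end.
const res = { send(body) { this.body = body; } };
routes['/users']({ url: '/users' }, res);
```

With Express, app.use('/users', ...) plays the role of the routes table, but the division of responsibility is the same.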
I'm rather new to web development altogether. I'm trying to get a Node.js/Express/Mongo project off the ground. I'm coming from a C/C++ background, and the module organization paradigms seem a little strange to me.
Currently, in my server.js, I create, connect, and initialize my MongoDB object using Mongoose, then pass this object around to query, insert, etc. That doesn't seem like something I should be doing inside server.js.
Is the proper way to loosely couple Mongo from my project to create a separate module (e.g. database) where I do all the initialization, options, etc., and return a MongoDB instance through this new module?
I've never undertaken a project of this size (or type) before, and just don't know how best to organize everything.
Any advice from people much more experienced than me would be appreciated.
There are many ways to do it. Here are some ideas.
Yes, you'll want to have a routing/handlers setup of some kind so that different modules have the ability to call different services and/or decouple.
Below is a fairly standard node.js / express structure:
├── server.js
├── config
│ ├── development
│ ├── production
│ └── staging
├── handlers
│ ├── customers.js
│ └── stores.js
├── node_modules
│ ├── assert
│ ├── ejs
│ ├── express
│ ├── forever
│ ├── mongodb
│ └── mongoskin
├── package.json
├── README.md
then in server.js, you can import your handlers like so:
// import route handlers
var customers = require('./handlers/customers'),
    stores = require('./handlers/stores');
and then inside your handlers, you'll be able to declare functions:
exports.addCustomer = function(req, res) {
// ....
};
which in server.js you can use for routing:
app.post('/customers/add/:id', customers.addCustomer);
So then you've got a basic framework. Defining the database connections outside of the exports.XXX functions in the handler files is fine: those functions will have access, but nothing in server.js will, so you won't pollute your namespace.
var url = config.user + ":"
        + config.pass + "@"
        + config.host + "/"
        + config.database;
var mongo = require('mongoskin').db(url);
where you might load the config object from a JSON file.
hope that helps.
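To the original question about decoupling: yes, a common approach is a separate config/db.js module that owns all connection setup and that everything else requires. A minimal sketch (file names are assumptions; the URL format follows the snippet above, with the connection line left as a comment so the sketch runs without a database):

```javascript
// config/db.js: the one place that knows how to connect.
// buildUrl is split out so it can be exercised without a live database.
function buildUrl(config) {
  return config.user + ':' + config.pass + '@'
       + config.host + '/' + config.database;
}

// Node caches modules, so every require('./config/db') would share
// one connection:
// module.exports = require('mongoskin').db(buildUrl(require('./config.json')));

module.exports = { buildUrl };
```

server.js and the handlers then require this module instead of each building their own connection.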