I am using node.js + express.js + everyauth.js. I have moved all my everyauth logic into a module file
var login = require('./lib/everyauthLogin');
Inside this I load my OAuth config file with the key/secret combinations:
var conf = require('./conf');
.....
twitter: {
  consumerKey: 'ABC',
  consumerSecret: '123'
}
These keys are different for each environment - development / staging / production - as the callbacks go to different URLs.
Question: How do I set these in the environment config so they filter through to all modules, or can I pass the path directly into the module?
Set in env:
app.configure('development', function(){
  app.set('configPath', './confLocal');
});

app.configure('production', function(){
  app.set('configPath', './confProduction');
});
var conf = require(app.get('configPath'));
Pass in
app.configure('production', function(){
  var login = require('./lib/everyauthLogin', {configPath: './confProduction'});
});
Hope that makes sense.
My solution:
Load the app using
NODE_ENV=production node app.js
Then set up config.js as a function rather than an object:
module.exports = function(){
  switch(process.env.NODE_ENV){
    case 'development':
      return { /* dev settings */ };
    case 'production':
      return { /* prod settings */ };
    default:
      return { /* error or other settings */ };
  }
};
Then, as per Jan's solution, load the file and create a new instance, to which we could pass a value if needed; in this case process.env.NODE_ENV is global, so it is not needed.
var Config = require('./conf'),
conf = new Config();
Then we can access the config object properties exactly as before:
conf.twitter.consumerKey
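For completeness, here is a rough sketch of what conf.js could look like under this pattern, using the keys from the question; the callback URLs and the callbackUrl field name are assumptions I've added for illustration, not values from the question:

module.exports = function () {
  switch (process.env.NODE_ENV) {
    case 'production':
      return {
        twitter: {
          consumerKey: 'PROD_KEY',       // placeholder
          consumerSecret: 'PROD_SECRET', // placeholder
          callbackUrl: 'https://example.com/auth/twitter/callback' // assumed field name
        }
      };
    default: // development
      return {
        twitter: {
          consumerKey: 'ABC',
          consumerSecret: '123',
          callbackUrl: 'http://localhost:3000/auth/twitter/callback' // assumed field name
        }
      };
  }
};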
You could also have a JSON file with NODE_ENV as the top level. IMO, this is a better way to express configuration settings (as opposed to using a script that returns settings).
var config = require('./env.json')[process.env.NODE_ENV || 'development'];
Example for env.json:
{
  "development": {
    "MONGO_URI": "mongodb://localhost/test",
    "MONGO_OPTIONS": { "db": { "safe": true } }
  },
  "production": {
    "MONGO_URI": "mongodb://localhost/production",
    "MONGO_OPTIONS": { "db": { "safe": true } }
  }
}
A very useful solution is to use the config module.
After installing the module:
$ npm install config
You can create a default.json configuration file. (You can use JSON, or a JS-style object by using the .json5 extension.)
For example:
$ vi config/default.json
{
  "name": "My App Name",
  "configPath": "/my/default/path",
  "port": 3000
}
This default configuration can be overridden by an environment-specific config file, or by a local config file for a local development environment:
production.json could be:
{
  "configPath": "/my/production/path",
  "port": 8080
}
development.json could be:
{
  "configPath": "/my/development/path",
  "port": 8081
}
On your local PC you could have a local.json that overrides all environments, or a specific local configuration such as local-production.json or local-development.json.
See the config module's documentation for the full file load order.
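For example, a hypothetical local-development.json on a developer machine that only overrides the port (every other value still comes from default.json and development.json):

{
  "port": 3001
}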
Inside your App
In your app you only need to require config and read the attribute you need.
var conf = require('config'); // it loads the right file
var login = require('./lib/everyauthLogin', {configPath: conf.get('configPath')});
Load the App
Load the app using:
NODE_ENV=production node app.js
or set the correct environment with forever or PM2.
Forever:
NODE_ENV=production forever [flags] start app.js [app_flags]
PM2 (via shell):
export NODE_ENV=staging
pm2 start app.js
PM2 (via .json):
process.json
{
  "apps": [{
    "name": "My App",
    "script": "worker.js",
    "env": {
      "NODE_ENV": "development"
    },
    "env_production": {
      "NODE_ENV": "production"
    }
  }]
}
And then
$ pm2 start process.json --env production
This solution is very clean, and it makes it easy to set up different config files for the production/staging/development environments, as well as local settings.
In brief
This kind of setup is simple and elegant:
env.json
{
  "development": {
    "facebook_app_id": "facebook_dummy_dev_app_id",
    "facebook_app_secret": "facebook_dummy_dev_app_secret"
  },
  "production": {
    "facebook_app_id": "facebook_dummy_prod_app_id",
    "facebook_app_secret": "facebook_dummy_prod_app_secret"
  }
}
common.js
var env = require('./env.json'); // adjust the relative path to wherever env.json lives

exports.config = function() {
  var node_env = process.env.NODE_ENV || 'development';
  return env[node_env];
};
app.js
var common = require('./routes/common');
var config = common.config();
var facebook_app_id = config.facebook_app_id;
// do something with facebook_app_id
To run in production mode:
$ NODE_ENV=production node app.js
In detail
This solution is from: http://himanshu.gilani.info/blog/2012/09/26/bootstraping-a-node-dot-js-app-for-dev-slash-prod-environment/. Check it out for more detail.
The way we do this is by passing an argument in when starting the app with the environment. For instance:
node app.js -c dev
In app.js we then load dev.js as our configuration file. You can parse these options with optparse-js.
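As a rough sketch of the idea without any parsing library (the -c flag and the dev.js/prod.js file naming follow the example above and are assumptions about your layout):

// app.js - pick the config file from the -c flag, e.g. `node app.js -c dev`
var flagIndex = process.argv.indexOf('-c');
var envName = flagIndex !== -1 ? process.argv[flagIndex + 1] : 'dev';
var config = require('./' + envName);   // loads ./dev.js, ./prod.js, etc.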
Now say you have some core modules that depend on this config file. Write them like this:
var Workspace = module.exports = function(config) {
  if (config) {
    // do something
  }
};

(function () {
  this.methodOnWorkspace = function () {
  };
}).call(Workspace.prototype);
And you can then call it in app.js like:
var Workspace = require("workspace");
this.workspace = new Workspace(config);
An elegant way is to use a .env file to locally override production settings.
No need for command line switches. No need for all those commas and brackets in a config.json file. See my answer here.
Example: on my machine the .env file is this:
NODE_ENV=dev
TWITTER_AUTH_TOKEN=something-needed-for-api-calls
My local .env overrides any environment variables. But on the staging or production servers (maybe they're on heroku.com) the environment variables are pre-set to NODE_ENV=stage or NODE_ENV=prod.
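The answer doesn't name a specific loader, but a minimal sketch with the dotenv package would look like this (note that dotenv's default behaviour is to leave variables that are already set in the environment untouched):

// at the very top of app.js
require('dotenv').config();   // reads .env from the project root into process.env

var nodeEnv = process.env.NODE_ENV || 'dev';
var twitterToken = process.env.TWITTER_AUTH_TOKEN;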
Set an environment variable on the deployment server (e.g. NODE_ENV=production). You can access it through process.env.NODE_ENV.
Use a config file like the following for the global settings:
const env = process.env.NODE_ENV || "development"
const configs = {
  base: {
    env,
    host: '0.0.0.0',
    port: 3000,
    dbPort: 3306,
    secret: "secretKey for sessions",
    dialect: 'mysql',
    issuer: 'Mysoft corp',
    subject: 'some@user.com',
  },
  development: {
    port: 3000,
    dbUser: 'root',
    dbPassword: 'root',
  },
  smoke: {
    port: 3000,
    dbUser: 'root',
  },
  integration: {
    port: 3000,
    dbUser: 'root',
  },
  production: {
    port: 3000,
    dbUser: 'root',
  }
};

const config = Object.assign(configs.base, configs[env]);

module.exports = config;
"base" contains common config for all environments.
Then import in other modules like:
const config = require('path/to/config.js')
console.log(config.port)
Happy Coding...
How about doing this in a much more elegant way with the nodejs-config module?
This module is able to set the configuration environment based on your computer's name. After that, when you request a configuration, you will get the environment-specific value.
For example, let's assume you have two development machines named pc1 and pc2 and a production machine named pc3. Whenever you request configuration values in your code on pc1 or pc2 you should get the "development" environment configuration, and on pc3 you should get the "production" environment configuration. This can be achieved like this:
Create a base configuration file in the config directory, let's say "app.json", and add the required configurations to it.
Now simply create folders within the config directory that match your environment names, in this case "development" and "production".
Next, create the configuration files you wish to override, specifying the options for each environment in the environment directories. (Note that you do not have to specify every option that is in the base configuration file, only the options you wish to override; the environment configuration files "cascade" over the base files.)
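For example (hypothetical contents), the base file holds the full settings and the environment file only the keys that differ:

config/app.json:
{
  "host": "localhost",
  "port": 3000
}

config/production/app.json:
{
  "host": "0.0.0.0"
}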
Now create a new config instance with the following syntax:
var config = require('nodejs-config')(
  __dirname,  // an absolute path to your application's 'config' directory
  {
    development: ["pc1", "pc2"],
    production: ["pc3"],
  }
);
Now you can get any configuration value without worrying about the environment like this:
config.get('app').configurationKey;
This answer is not something new. It's similar to what @andy_t has mentioned. But I use the below pattern for two reasons:
1. Clean implementation with no external npm dependencies
2. Merging of the default config settings with the environment-based settings
JavaScript implementation
const settings = {
  _default: {
    timeout: 100,
    baseUrl: "http://some.api/",
  },
  production: {
    baseUrl: "http://some.prod.api/",
  },
}
// If you are not using ECMAScript 2018 Standard
// https://stackoverflow.com/a/171256/1251350
module.exports = { ...settings._default, ...settings[process.env.NODE_ENV] }
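Usage is then a plain require (assuming the file above is saved as settings.js, a name I've made up):

const config = require('./settings');
console.log(config.baseUrl); // "http://some.prod.api/" when NODE_ENV=production, otherwise the default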
I usually use TypeScript in my Node projects. Below is my actual implementation, copy-pasted.
TypeScript implementation
const settings: { _default: ISettings, production: any } = {
  _default: {
    timeout: 100,
    baseUrl: "",
  },
  production: {
    baseUrl: "",
  },
}

export interface ISettings {
  timeout: number
  baseUrl: string
}

export const config = ({ ...settings._default, ...settings[process.env.NODE_ENV] } as ISettings)
Related
With vue-cli it was possible to configure the webpack devServer.before function like this:
devServer: {
  before(app) {
    app.get('/apiUrl', (req, res) => res.send(process.env.API_URL))
  }
},
How is it possible to configure the Vite dev server to obtain the same behavior?
(I tried with the proxy option but it does not work.)
According to this GitHub issue, environment variables are not accessible in vite.config.js (nor in vite.config.ts). However, the discussion in that issue also mentions a workaround that you can use in this file:
import { defineConfig, loadEnv } from 'vite'
import vue from '@vitejs/plugin-vue'

export default defineConfig(({mode}) => {
  const env = loadEnv(mode, process.cwd());
  return {
    plugins: [
      vue(),
    ],
    server: {
      proxy: {
        '^/apiUrl': {
          target: env.VITE_API_TARGET,
          changeOrigin: true,
        }
      }
    },
  }
})
Note that the variable name must start with VITE_ for this to work.
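For example, a .env (or .env.development) file in the project root could contain a line like this (the value is an assumption for illustration):

VITE_API_TARGET=http://localhost:3000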
I'm trying to find a way to get rid of the .env file in the dev, stage and prod environments for a Node.js application.
The current solution I have seems a bit contrived. During the build process, pipeline environment variables are written from Jenkins (Config File Provider) into a new .env file, which is attached to the application. The deployed application is run by pm2.
I wonder if it would be a better approach to run ${envlist} pm2 start app?
This is just an option that I use for handling the deployment. In my projects, I always have an ecosystem.config.js file:
module.exports = {
  apps: [{
    name: "my little project",
    script: "./app.js",
    instances: "max",
    env: {
      NODE_ENV: "production",
    },
    env_development: {
      NODE_ENV: "development",
    },
    env_production: {
      NODE_ENV: "production",
    }
  }]
}
In my entry point (app.js) I have:
if (process.env.NODE_APP_INSTANCE === '0' || process.env.NODE_ENV !== 'production') {
  // start cron jobs here, or any action that needs to be done in a single process
}

// note: NODE_APP_INSTANCE is a string, so 800 + '0' yields port "8000", 800 + '1' yields "8001", etc.
if (process.env.NODE_ENV === 'production') server.listen(800 + process.env.NODE_APP_INSTANCE);
else if (process.env.NODE_ENV === 'test') server.listen(700 + process.env.NODE_APP_INSTANCE);
else
  server.listen(3045, err => {
    console.log(err ? err : `REST server started on 3045.`);
  });
And in your Jenkinsfile:
stage('Deploy') {
  environment {
    NODE_ENV = "production"
  }
  steps {
    sh 'pm2 start ecosystem.config.js'
  }
}
You can run your app with any env you want:
pm2 start ecosystem.config.js --env production
I'd like to have a few environments, let's say development, production and test. These environments should be independent and use their own sets of config parameters, e.g. for DB, SERVER_PORT, USER, etc.
They should not be in the code base, so I think they should be separate .env files. That is to say, I should be able to load different .env files depending on which environment is active. Also, it's not clear where I should set that environment switcher.
Maybe it should be a single .env file that has a NODE_ENV parameter, which can be set to any of the above-mentioned values (development, production or test), and depending on the value of this parameter the necessary set of config parameters gets loaded automatically.
I've read the documentation, it seems a little confusing to me at the moment.
Seems like there should be some config factory.
Assuming you have the following config files in the root of the project: .env.development, .env.staging, .env.test
Here is how I would implement it:
In the app.module.ts file:
import { ConfigModule } from '@nestjs/config';

const ENV = process.env.NODE_ENV;

@Module({
  imports: [
    ConfigModule.forRoot({
      envFilePath: !ENV ? '.env' : `.env.${ENV}`,
    }),
  ],
  controllers: [AppController],
})
export class AppModule {}
Inspired by this solution: https://github.com/nestjsx/nestjs-config#using-different-env-files
You can use the config library as mentioned in the official documentation.
Otherwise you can use the npm library dotenv.
Either way, what really matters is how you organise your .env files. Env files are supposed to contain database credentials, encryption secrets and other confidential data, so it's not really a good idea to put them in version control. Instead you should store the .env file on the system: the production server has a .env file with production secrets, and a developer machine can have a .env file with local secrets. Flag .env to be ignored by git. This way you won't have to change anything per environment; the app will automatically pick up the right configuration based on which server you deploy to.
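For instance (all values below are made up), each server keeps its own untracked .env with the same variable names but different values, and .env is listed in .gitignore:

# .gitignore
.env

# .env on a developer machine
DATABASE_URL=postgres://localhost/myapp_dev
JWT_SECRET=local-dev-secret

# .env on the production server (same names, different values)
DATABASE_URL=postgres://db.internal/myapp
JWT_SECRET=some-long-random-secret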
There are two approaches to this use case.
1. Create .prod.env, .development.env and .test.env, and load the required env file like this:
ConfigModule.forRoot({envFilePath: '.development.env'});
2. Create the config from a function:
export default () => ({
  port: parseInt(process.env.PORT, 10) || 3000,
  database: {
    host: process.env.DATABASE_HOST,
    port: parseInt(process.env.DATABASE_PORT, 10) || 5432
  }
});
and use it like this:
@Module({
  imports: [
    ConfigModule.forRoot({
      load: [configuration],
    }),
  ],
})
See more here.
The trick is to define custom file paths for the envFilePath option of NestJS's ConfigModule based on the environment variable NODE_ENV. You can then set the value of NODE_ENV in the different scripts in your package.json.
In your app.module.ts add:
// Grab the system env variable
const ENV = process.env.NODE_ENV;

// Set custom filepath in ConfigModule properties based on NODE_ENV
@Module({
  imports: [
    ConfigModule.forRoot({
      envFilePath: !ENV ? '.env.dev' : `.env.${ENV}`
    })
  ]
})
In your package.json file, modify your "scripts" to set NODE_ENV:
"scripts": {
"start:dev": "NODE_ENV=dev nest start --watch",
"start:prod": "NODE_ENV=prod node dist/main"
}
Another code style ~
If you want to use @nestjs/config more deeply, try it like this.
Make a directory named config with this structure:
- /config
  - config.default.ts
  - config.dev.ts
  - config.production.ts
  - configuration.ts
Set the individual config in each config.<env>.ts like this:
export default {
  // nodemailer config
  mailer: {
    host: 'xxx',
    port: 80,
    auth: {
      user: 'xxx',
      pass: 'xxx',
    },
    secure: false, // or true using 443
  },
  // jwt sign secret
  jwt: {
    secret: process.env.JWT_SECRET || '123456',
  }
}
Then dynamically load and merge these files in configuration.ts:
import { merge } from 'lodash';
import DefaultConfig from './config.default';

export default () => {
  let envConfig = {};
  try {
    // eslint-disable-next-line @typescript-eslint/no-var-requires
    envConfig = require(`./config.${process.env.NODE_ENV}`).default;
  } catch (e) {}
  return merge(DefaultConfig, envConfig);
};
Import configuration.ts in app.module.ts:
import configuration from './config/configuration';

@Module({
  imports: [
    ConfigModule.forRoot({
      load: [configuration],
    }),
  ],
})
That's all.
I'm trying to use environment variables on Docker that are only needed for one command. On Mac/Linux I can simply run token=1234 node command.js and token is available as an environment variable. But when I do this with Docker - docker exec $CONTAINER nenv token=123 node command.js - I get unknown command token=123.
I don't use NODE_ENV for this.
I recommend doing the following:
Create a config folder.
Put this in config/index.js:
var nconf = require('nconf'),
    path = require('path');

nconf.env().argv();
nconf.file('local', path.join(__dirname, 'config.local.json'));
nconf.file(path.join(__dirname, 'config.json'));

module.exports = nconf;
Create the files config/config.json (a template of the config) and config/config.local.json (a copy of the template with the real configuration).
For example:
{
  "app": {
    "useCluster": false,
    "http": {
      "enabled": true,
      "port": 8000,
      "host": "0.0.0.0"
    },
    "https": {
      "enabled": false,
      "port": 443,
      "host": "0.0.0.0",
      "certificate": {
        "key": "server.key",
        "cert": "server.crt"
      }
    },
    "env": "production",
    "profiler": false
  },
  "db": {
    "driver": "mysql",
    "host": "address here",
    "port": 3306,
    "user": "username here",
    "pass": "password here",
    "name": "database name here"
  }
}
Use this at the beginning of your app: var config = require('./config');
and use the config object whenever you need it:
var config = require('./config'),
    cluster = require('./components/cluster'),
    http = require('http'),
    ...
    ...
    https = require('https');

cluster.start(function() {
  if (config.get('app:http:enabled')) {
    var httpServer = http.createServer(app);
    httpServer.listen(config.get('app:http:port'), config.get('app:http:host'),
      function () {
        winston.info('App listening at http://%s:%s', config.get('app:http:host'), config.get('app:http:port'));
      });
  }

  if (config.get('app:https:enabled')) {
    var httpsServer = https.createServer({
      key: fs.readFileSync(path.join(__dirname, 'certificates', config.get('app:https:certificate:key'))),
      cert: fs.readFileSync(path.join(__dirname, 'certificates', config.get('app:https:certificate:cert')))
    }, app);
    httpsServer.listen(config.get('app:https:port'), config.get('app:https:host'),
      function () {
        winston.info('App listening at https://%s:%s', config.get('app:https:host'), config.get('app:https:port'));
      });
  }
});
This example is a more accurate way to have environment-based configs: for example, the config.local.json configuration can be added to .gitignore, and so on.
EDIT (caused by my own stupidity!)
You can't set a new env var with docker on a container that is already running.
You have to do this when you build it (using a Dockerfile or docker-compose), or when you run it (using docker run -e "name=value" $CONTAINER command).
Even if you simply need to pass certain configuration to the command (at docker run time), you can do it by switching from env vars (process.env) to argv usage.
Such cases are not uncommon (docker-compose), and can be handled in a very easy way:
npm install yargs --save
Run the code with docker run or docker exec:
docker exec $CONTAINER node command.js --token 123
Then in the code:
const argv = require('yargs').argv;
...
let boo = doSomething(argv.token); // doSomething stands in for whatever uses the token
I'm new to RequireJS and I'm stuck with the loading order.
I have a global project configuration that I need to be loaded before the modules located in js/app/*.
Here's my structure:
index.html
config.js
js/
  require.js
  app/
    login.js
  lib/
    bootstrap-2.0.4.min.js
Here's the config.js file:
var Project = {
  'server': {
    'host': '127.0.0.1',
    'port': 8080
  },
  'history': 10, // Number of queries kept in the local storage history
  'lang': 'en', // For future use
};
And here's my RequireJS file (app.js):
requirejs.config({
  // By default load any module IDs from js/lib
  baseUrl: 'js/lib',
  // except, if the module ID starts with "app",
  // load it from the js/app directory. paths
  // config is relative to the baseUrl, and
  // never includes a ".js" extension since
  // the paths config could be for a directory.
  paths: {
    bootstrap: '../lib/bootstrap-2.0.4.min',
    app: '../app',
  },
  shim: {
    'app': {
      deps: ['../../config'],
      exports: function (a) {
        console.log('loaded!');
        console.log(a);
      }
    } // Skual Config
  },
});
var modules = [];
modules.push('jquery');
modules.push('bootstrap');
modules.push('app/login');
// Start the main app logic.
requirejs(modules, function ($) {});
But sometimes, when I load the page, I get a "Project is undefined" error, because login.js has been loaded BEFORE config.js.
How can I force config.js to be loaded at first, no matter what ?
Note: I saw order.js as a plugin for RequireJS, but it's apparently not supported since v2; it was replaced by shim.
Ran into a similar problem today - we have bootstrapped data that we want to make sure is loaded before anything else, and that the module exposing that data is set up before any other modules are evaluated.
The easiest solution I found to force load order is to simply require a module be loaded before continuing on with app initialization:
require(["bootstrapped-data-setup", "some-other-init-code"], function(){
require(["main-app-initializer"]);
});
A possible solution is to build a queue of modules to be loaded. In this case all modules will be loaded one by one, in exact order:
var requireQueue = function(modules, callback) {
  function load(queue, results) {
    if (queue.length) {
      require([queue.shift()], function(result) {
        results.push(result);
        load(queue, results);
      });
    } else {
      callback.apply(null, results);
    }
  }

  load(modules, []);
};

requireQueue([
  'app',
  'apps/home/initialize',
  'apps/entities/initialize',
  'apps/cti/initialize'
], function(App) {
  App.start();
});
You won't have to worry about the load order if you define your JS files as AMD modules. (Or you can use the shim config if you can't modify config.js and login.js to call define.)
config.js should look something like this:
define({project: {
  'server': {
    'host': '127.0.0.1',
    'port': 8080
  },
  'history': 10, // Number of queries kept in the local storage history
  'lang': 'en', // For future use
}});
login.js:
define(['jquery', '../../config'], function($, config) {
  // here, config will be loaded
  console.log(config.project)
});
Again, shim config should only be used if calling define() inside the modules is not an option.
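For reference, a rough sketch of the shim alternative (the paths are assumptions based on the directory structure in the question): config.js keeps its global Project, the shim exposes that global as the module value, and a deps entry makes sure it is fetched before login.js:

requirejs.config({
  baseUrl: 'js/lib',
  paths: {
    config: '../../config',   // config.js sits next to index.html in the layout above
    app: '../app'
  },
  shim: {
    // config.js is a plain script that defines the global `Project`
    config: { exports: 'Project' },
    // ensure config is loaded before any module that reads that global
    'app/login': { deps: ['config'] }
  }
});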