Docker Node.js environment variables - node.js

I'm trying to use environment variables in docker that are only needed for one command. On Mac/Linux I can simply run token=1234 node command.js and token is available as an environment variable. But when I do the same with docker, docker exec $CONTAINER env token=123 node command.js, I get unknown command token=123.

I don't use node env. I recommend doing the following:
create a config folder
put this in config/index.js:
var nconf = require('nconf'),
    path = require('path');

nconf.env().argv();
nconf.file('local', path.join(__dirname, 'config.local.json'));
nconf.file(path.join(__dirname, 'config.json'));

module.exports = nconf;
create two files: config/config.json (a template of the config) and config/config.local.json (a copy of the template with the real configuration),
for example:
{
    "app": {
        "useCluster": false,
        "http": {
            "enabled": true,
            "port": 8000,
            "host": "0.0.0.0"
        },
        "https": {
            "enabled": false,
            "port": 443,
            "host": "0.0.0.0",
            "certificate": {
                "key": "server.key",
                "cert": "server.crt"
            }
        },
        "env": "production",
        "profiler": false
    },
    "db": {
        "driver": "mysql",
        "host": "address here",
        "port": 3306,
        "user": "username here",
        "pass": "password here",
        "name": "database name here"
    }
}
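config/config.local.json would then be a copy of this template with the placeholders replaced by real values, e.g. the db block (the credentials below are obviously made-up placeholders):
{
    ...
    "db": {
        "driver": "mysql",
        "host": "127.0.0.1",
        "port": 3306,
        "user": "app_user",
        "pass": "s3cr3t",
        "name": "app_db"
    }
}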
use at the beginning of your app: var config = require('./config');
and use the config object wherever you need it:
var config = require('./config'),
    cluster = require('./components/cluster'),
    http = require('http'),
    ...
    ...
    https = require('https');

cluster.start(function () {
    if (config.get('app:http:enabled')) {
        var httpServer = http.createServer(app);
        httpServer.listen(config.get('app:http:port'), config.get('app:http:host'),
            function () {
                winston.info('App listening at http://%s:%s', config.get('app:http:host'), config.get('app:http:port'));
            });
    }
    if (config.get('app:https:enabled')) {
        var httpsServer = https.createServer({
            key: fs.readFileSync(path.join(__dirname, 'certificates', config.get('app:https:certificate:key'))),
            cert: fs.readFileSync(path.join(__dirname, 'certificates', config.get('app:https:certificate:cert')))
        }, app);
        httpsServer.listen(config.get('app:https:port'), config.get('app:https:host'),
            function () {
                winston.info('App listening at https://%s:%s', config.get('app:https:host'), config.get('app:https:port'));
            });
    }
});
This approach is a more accurate way to have environment-based configs: for example, config.local.json holds the real configuration and is added to .gitignore, and so on.
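Coming back to the original question: because nconf.env() and nconf.argv() are registered before the files, anything exported in the environment or passed on the command line takes precedence over the JSON files. A small sketch (app.js is a hypothetical entry point):
// run as:  token=1234 node app.js
var config = require('./config');

console.log(config.get('token'));          // '1234' (picked up from the environment)
console.log(config.get('app:http:port'));  // 8000 (falls through to config.json)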

EDIT caused by my own stupidity!
You can't set a new env var on an already-running docker container this way.
You have to do this when you build it (using a Dockerfile or docker-compose), or when you run it (docker run -e "name=value" $IMAGE command).
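For example, the run-time variant looks like this (the image name my-node-image is a placeholder), and the value is then read through process.env inside the container:
docker run -e "token=123" my-node-image node command.js

// command.js
console.log('token =', process.env.token);   // prints 'token = 123' inside the container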

Even if you simply need to retrieve certain configuration values from the command (at docker run time), you can do it by switching from node env (process.env) to argv usage.
Such cases are not uncommon (docker-compose), and this can be done in a very easy way.
npm install yargs --save
run the code with docker run or docker exec:
docker exec $CONTAINER node command.js --token 123
then in the code:
const argv = require('yargs').argv;
...
let result = doSomething(argv.token);

Related

AZ Custom Handler No connection could be made because the target machine actively refused it

Trying to add a Custom Handler to a simple Azure Functions project. The function works OK locally in VS Code before adding it. After adding it, F5 starts OK like before:
[2021-12-30T19:33:14.402Z] Startup operation 'a9550626-ee3e-1234-b254-9facc08a3890' completed.
Functions:
select: [GET,POST] http://localhost:7071/api/select
then:
For detailed output, run func with --verbose flag.
[2021-12-30T19:33:16.324Z] Waiting for HttpWorker to be initialized.
Request to: http://127.0.0.1:49774/ failing with exception message: No
connection could be made because the target machine actively refused
it. (127.0.0.1:49774)
The port is random (next time it can be 62974), and then F5 stops by itself.
Here is what's been added:
customHandler into the root host.json.
a new folder middleware containing app.js, myroute.js, host.json, mdw.js
root host.json:
{
    ...
    "customHandler": {
        "description": {
            "defaultExecutablePath": "node",
            "defaultWorkerPath": "middleware/app.js"
        },
        "enableForwardingHttpRequest": true
    }
}
middleware/app.js:
const express = require('express')
const app = express()
const port = 3005;
app.listen(port, ()=>{console.log('================ My mdw is on 3005 ================');});
require("./myroute.js")(app);
middleware/myroute.js:
const express = require('express')
const mdw = require("./mdw");

module.exports = app => {
    app.post("/api/testmdw", mdw.mytest);
};
middleware/mdw.js:
async function mytest(req, res, next) {
    const q = req;
    return res.json({ mdw: "ok" });
}

module.exports = { mytest }
middleware/host.json:
{
    "version": "2.0",
    "extensionBundle": {
        "id": "Microsoft.Azure.Functions.ExtensionBundle",
        "version": "[2.*, 3.0.0)"
    }
}
I am following this blog to use middleware on an Azure Function via a custom handler.
Host.json
"customHandler": {
"description": {
"defaultExecutablePath": "node",
"defaultWorkerPath": "azexpresstest/app.js"
},
"enableForwardingHttpRequest": true,
},
"extensions": {"http": {"routePrefix": ""}}
The customHandler section points to a target as defined by the defaultExecutablePath. The execution target may either be a command, executable, or file where the web server is implemented.
"customHandler": {
"description": {
"defaultExecutablePath": "app/handler.exe",
"workingDirectory": "app"
}
…
Refer here for more information
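One more detail worth checking, and my own assumption rather than part of the answer above: the Functions host tells a custom handler which port to listen on through the FUNCTIONS_CUSTOMHANDLER_PORT environment variable, which is why the host probes a random port such as 127.0.0.1:49774 while the Express app is hardcoded to 3005. A minimal sketch of middleware/app.js that honours it:
const express = require('express');
const app = express();

// Listen on the port the Functions host assigns; fall back to 3005 for standalone local testing.
const port = process.env.FUNCTIONS_CUSTOMHANDLER_PORT || 3005;
app.listen(port, () => { console.log(`Custom handler listening on ${port}`); });

require('./myroute.js')(app);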

Angular, Node API, How to SSL Localhost, DEPTH_ZERO_SELF_SIGNED_CERT, Cookie

Localhost: Angular 11 (https://localhost:4200) and Node API (https://localhost:3001), both using OpenSSL certificates; the browser is Chrome. To iron out the Status: CORS error (due to the different ports) I followed this and added a proxy, and got this in Angular's console:
[HPM] Error occurred while trying to proxy request /somewhere1 from
localhost:4200 to https://localhost:3001 (DEPTH_ZERO_SELF_SIGNED_CERT)
(https://nodejs.org/api/errors.html#errors_common_system_errors)
The following didn't help:
Confirming that the Chrome instance brought up by F5 has chrome://flags/#allow-insecure-localhost enabled.
Adding process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0"; to the Node API's server.js.
Proxy.conf.json
{
    "context": [
        "/somewhere1",
        "/xyz/somewhere"
    ],
    "target": "https://localhost:3001",
    "secure": true,
    "changeOrigin": true,
    "rejectUnauthorized": false,
    "logLevel": "info"
}
angular.json
"serve": {
...
"options": {
"browserTarget": "myapp:build",
"ssl": true,
"proxyConfig": "src/proxy.conf.json"
Call API:
private http: HttpClient;

const httpOptions = {
    headers: new HttpHeaders({ 'Content-Type': 'application/json' }),
    rejectUnauthorized: false
};

this.http.post<any[]>("/somewhere1/hello", {}, httpOptions).subscribe
I believe this is on the Angular end.
After days of frustration, I was finally able to resolve it all; I'm posting the solution here in the hope it will help someone.
Environment:
Angular 11/12 front-end uses 3rd-party authentication such as Google.
API server is Node 14.
Both are HTTPS/SSL.
Solution:
1. Follow this article to create trusted .crt and .key files.
2. I didn't add localhost to the hosts file.
3. Add a proxy.conf.js (or a .json file in the corresponding format) and include it in the angular.json file.
4. No need to specify httpOptions for each individual HTTP call.
5. On the API (Node) side, add the two files from step 1 to server.js.
My Proxy.conf.js:
const PROXY_CONFIG = [
    {
        context: [
            "/path1",
            "/path2",
            ...
        ],
        "target": "https://localhost:3001",  // I use 3000 for non-SSL
        "changeOrigin": true,                // helps with the CORS error in F12
        "logLevel": "debug",
        "rejectUnauthorized": true,          // or false if that's OK for you
        "secure": false,                     // PROD must be true, but DEV false, else "UNABLE_TO_VERIFY_LEAF_SIGNATURE"
        "strictSSL": true,                   // false is the default
        "withCredentials": true              // required for Angular to send the cookie
    }
]

module.exports = PROXY_CONFIG;
My Angular.json:
"architect": {
"serve": {
"builder": "#angular-devkit/build-angular:dev-server",
"options": {
"browserTarget": "myapp:build",
"ssl": true,
"proxyConfig": "src/proxy.conf.js"
...
My Server.js:
const fs = require("fs");
const httpS = require("https");   // required for httpS.createServer below; `app` is the express app

// following are the two files mentioned in No.5
const HttpSOptions = {
    key: fs.readFileSync('ssl\\server.key'),
    cert: fs.readFileSync('ssl\\server.crt')
}

const httpSServer = httpS.createServer(HttpSOptions, app);
httpSServer.listen(3001, () => { console.log('httpS is on 3001'); });
To verify the certificates are FULLY trusted by Chrome, open one of your API URLs in Chrome, for example https://localhost:3001/path1/func/xyz; you should not see a certificate warning.

Config settings in VS Code Debugger to run docker

My goal is to set up a configuration for my VS Code Debugger.
The legacy code I have uses a docker container to run redis, and a gulp task runner to launch the app.
My workflow consists of the following commands that I type in terminals:
docker-compose up, which runs redis with the db
gulp default, which starts the app's server
So far I've successfully created a config for the gulp task, but I'm still struggling to set up a configuration for docker in the debugger.
Docker Desktop is installed locally on Windows 10 Pro
Docker version: 2.0.0.0-win81 (29211)
docker file: docker-compose.yml
version: "2"
services:
  redis:
    container_name: redis
    image: redis:latest
    ports:
      - 7113:6379
  mariadb-test:
    container_name: mariadb-test
    image: wodby/mariadb
    ports:
      - 6604:3306
    environment:
      - MYSQL_ROOT_PASSWORD=***
      - MYSQL_USER=****_****
      - MYSQL_PASSWORD=****
      - MYSQL_DATABASE=****
      - MYSQL_CHARACTER_SET_FILESYSTEM=utf8mb4
      - MYSQL_CHARACTER_SET_SERVER=utf8mb4
      - MYSQL_CLIENT_DEFAULT_CHARACTER_SET=utf8mb4
      - MYSQL_COLLATION_SERVER=utf8mb4_unicode_ci
      - MYSQL_INIT_CONNECT=SET NAMES utf8mb4
Gulp task
gulp.task('default', function (done) {
    runSequence('build:server', 'watch', 'start');
});

gulp.task('build:server', function (done) {
    var tsProject = tsc.createProject('tsconfig.json');
    var tsResult = gulp.src(['server/**/*.ts', 'server/**/*.tsx', '!server/test/**/*'])
        .pipe(cache('typescript'))
        .pipe(sourcemaps.init())
        .pipe(tsProject()).js
        .pipe(sourcemaps.mapSources(function (sourcePath, file) {
            return sourcePath.replace('../../', '../');
        }))
        .pipe(sourcemaps.write("."))
        .pipe(gulp.dest('dist/server'))
        .on('end', done);
});

gulp.task('watch', function () {
    gulp.watch(['server/**/*.ts', 'server/**/*.tsx', '!server/desktop/**/*', '!server/test/**/*'], ['compile']).on('change', function (e) {
        gutil.log(gutil.colors.blue.bold('[CHANGE] ') + gutil.colors.green(e.path));
    });
});

gulp.task('start', function () {
    nodemon({
        script: 'dist/server/bin.js',
        watch: 'dist/',
        ext: 'html js',
        tasks: []
    });
});
My launch.json configuration
{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "node",
            "request": "launch",
            "name": "Gulp",
            "program": "${workspaceFolder}/node_modules/gulp/bin/gulp.js",
            "args": [
                "default"
            ]
        },
        {
            "type": "node",
            "request": "attach",
            "name": "Docker",
            "address": "localhost",
            "port": 6379,
            "localRoot": "${workspaceFolder}",
            "remoteRoot": "/"
        }
    ]
}
All the time I'm getting this error:
Error: Cannot connect to runtime process, timeout after 10000 ms - (reason: Cannot connect to the target: connect ECONNREFUSED 127.0.0.1:6379).
As you don't run your application in docker, you don't need to fiddle with that. However, you will need to add --inspect or --inspect-brk to your gulp task when you start your node process.
I would suggest adding a new debug task to your gulpfile:
gulp.task('debug', function (done) {
    runSequence('build:server', 'watch', 'start:debug');
});

gulp.task('start:debug', function () {
    nodemon({
        script: 'dist/server/bin.js',
        watch: 'dist/',
        nodeArgs: ['--inspect'],  // or --inspect-brk
        ext: 'html js',
        tasks: []
    });
});
then you should be able to debug your code using chrome as can be seen here: https://blog.risingstack.com/how-to-debug-a-node-js-app-in-a-docker-container/
Be advised though that according to this question, you should use nodemon 1.12.7 or above to get this working.
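If you would rather attach VS Code itself instead of Chrome DevTools, a minimal attach configuration could be added to launch.json (this assumes the default inspector port 9229; "restart": true lets the debugger re-attach whenever nodemon restarts the process):
{
    "type": "node",
    "request": "attach",
    "name": "Attach to nodemon",
    "port": 9229,
    "restart": true
}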

Personality insight input var nodejs

var PersonalityInsightsV2 = require('watson-developer-cloud/personality-insights/v2');

var personality_insights = new PersonalityInsightsV2({
    username: '<username>',
    password: '<password>'
});

personality_insights.profile({
    text: '<?php echo $_Session['description'];?>',
    language: 'en' },
    function (err, response) {
        if (err)
            console.log('error:', err);
        else
            console.log(JSON.stringify(response, null, 2));
    });
It doesn't display anything. I have also npm installed watson cloud and saved it, I have put in my credentials and also forked it on git. What am I missing? I am a beginner but would love to use this on my page!
Here are the steps to run it locally; since you are a beginner I'll start from the beginning.
Create a new folder and name it whatever you want. Put these files in there.
Name the first file index.js and
fill in the <YOUR-USERNAME>, <YOUR-PASSWORD>, and <YOUR-100-UNIQUE-WORDS> variables.
var express = require('express');
var app = express();
var http = require('http').Server(app);
var cfenv = require("cfenv");
var appEnv = cfenv.getAppEnv();

http.listen(appEnv.port, appEnv.bind);

var PersonalityInsightsV2 = require('watson-developer-cloud/personality-insights/v2');

var personality_insights = new PersonalityInsightsV2({
    username: '<YOUR-USERNAME>',
    password: '<YOUR-PASSWORD>'
});

personality_insights.profile({
    text: "<YOUR-100-UNIQUE-WORDS>",
    language: 'en' },
    function (err, response) {
        if (err)
            console.log('error:', err);
        else
            console.log(JSON.stringify(response, null, 2));
    });
Create another file, name it package.json, and put these contents in there:
{
    "name": "myWatsonApp",
    "version": "1.0.0",
    "description": "A Watson Personality Insights application",
    "main": "index.js",
    "scripts": {
        "start": "node index.js"
    },
    "dependencies": {
        "cfenv": "^1.0.3",
        "express": "^4.13.4",
        "watson-developer-cloud": "^2.2.0"
    }
}
Open your terminal and cd to the root of the folder you just created.
Run the command npm install,
then run the command npm start.
Your application will then be running and you will see output from the Personality Insights call you made in index.js.
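As a side note, and not part of the original answer: instead of hardcoding the credentials in index.js you could read them from environment variables (the variable names below are made up):
var PersonalityInsightsV2 = require('watson-developer-cloud/personality-insights/v2');

var personality_insights = new PersonalityInsightsV2({
    username: process.env.WATSON_USERNAME,   // hypothetical variable names
    password: process.env.WATSON_PASSWORD
});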

Node.js setting up environment specific configs to be used with everyauth

I am using node.js + express.js + everyauth.js. I have moved all my everyauth logic into a module file
var login = require('./lib/everyauthLogin');
inside this I load my oAuth config file with the key/secret combinations:
var conf = require('./conf');
.....
twitter: {
    consumerKey: 'ABC',
    consumerSecret: '123'
}
These values are different for different environments (development / staging / production) as the callbacks are to different URLs.
Question: how do I set these in the environment config so that they filter through to all modules, or can I pass the path directly into the module?
Set in env:
app.configure('development', function () {
    app.set('configPath', './confLocal');
});

app.configure('production', function () {
    app.set('configPath', './confProduction');
});

var conf = require(app.get('configPath'));
Pass in:
app.configure('production', function () {
    var login = require('./lib/everyauthLogin', { configPath: './confProduction' });
});
? hope that makes sense
My solution:
load the app using
NODE_ENV=production node app.js
Then set up config.js as a function rather than an object:
module.exports = function () {
    switch (process.env.NODE_ENV) {
        case 'development':
            return { /* dev settings */ };
        case 'production':
            return { /* prod settings */ };
        default:
            return { /* error or other settings */ };
    }
};
Then, as per Jan's solution, load the file and create a new instance, which we could pass a value into if needed; in this case process.env.NODE_ENV is global, so it's not needed.
var Config = require('./conf'),
    conf = new Config();
Then we can access the config object properties exactly as before
conf.twitter.consumerKey
You could also have a JSON file with the NODE_ENV values as top-level keys. IMO, this is a better way to express configuration settings (as opposed to using a script that returns settings).
var config = require('./env.json')[process.env.NODE_ENV || 'development'];
Example for env.json:
{
    "development": {
        "MONGO_URI": "mongodb://localhost/test",
        "MONGO_OPTIONS": { "db": { "safe": true } }
    },
    "production": {
        "MONGO_URI": "mongodb://localhost/production",
        "MONGO_OPTIONS": { "db": { "safe": true } }
    }
}
A very useful solution is to use the config module.
After installing the module:
$ npm install config
you can create a default.json configuration file. (You could use JSON, or a JS object using the extension .json5.)
For example
$ vi config/default.json
{
    "name": "My App Name",
    "configPath": "/my/default/path",
    "port": 3000
}
This default configuration can be overridden by an environment config file or by a local config file for a local development environment:
production.json could be:
{
    "configPath": "/my/production/path",
    "port": 8080
}
development.json could be:
{
    "configPath": "/my/development/path",
    "port": 8081
}
On your local PC you could have a local.json that overrides all environments, or you could have a specific local configuration such as local-production.json or local-development.json.
The full load order is listed in the module's documentation.
Inside your App
In your app you only need to require config and read the needed attribute.
var conf = require('config'); // it loads the right file
var login = require('./lib/everyauthLogin', { configPath: conf.get('configPath') });
Load the App
load the app using:
NODE_ENV=production node app.js
or setting the correct environment with forever or pm2
Forever:
NODE_ENV=production forever [flags] start app.js [app_flags]
PM2 (via shell):
export NODE_ENV=staging
pm2 start app.js
PM2 (via .json):
process.json
{
    "apps": [{
        "name": "My App",
        "script": "worker.js",
        "env": {
            "NODE_ENV": "development"
        },
        "env_production": {
            "NODE_ENV": "production"
        }
    }]
}
And then
$ pm2 start process.json --env production
This solution is very clean, and it makes it easy to set up different config files for production / staging / development environments as well as for local settings.
In brief
This kind of setup is simple and elegant:
env.json
{
    "development": {
        "facebook_app_id": "facebook_dummy_dev_app_id",
        "facebook_app_secret": "facebook_dummy_dev_app_secret"
    },
    "production": {
        "facebook_app_id": "facebook_dummy_prod_app_id",
        "facebook_app_secret": "facebook_dummy_prod_app_secret"
    }
}
common.js
var env = require('./env.json');

exports.config = function () {
    var node_env = process.env.NODE_ENV || 'development';
    return env[node_env];
};
app.js
var common = require('./routes/common');
var config = common.config();

var facebook_app_id = config.facebook_app_id;
// do something with facebook_app_id
To run in production mode :
$ NODE_ENV=production node app.js
In detail
This solution is from : http://himanshu.gilani.info/blog/2012/09/26/bootstraping-a-node-dot-js-app-for-dev-slash-prod-environment/, check it out for more detail.
The way we do this is by passing an argument when starting the app with the environment. For instance:
node app.js -c dev
In app.js we then load dev.js as our configuration file. You can parse these options with optparse-js.
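If you would rather avoid the optparse-js dependency, the same -c flag can be read with plain process.argv (a rough sketch that assumes the flag is always passed as "-c <env>"):
// config-loader.js (hypothetical helper)
var idx = process.argv.indexOf('-c');
var envName = idx !== -1 ? process.argv[idx + 1] : 'dev';

// e.g. "node app.js -c dev" loads ./dev.js
module.exports = require('./' + envName + '.js');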
Now you have some core modules that depend on this config file. When you write them as such:
var Workspace = module.exports = function (config) {
    if (config) {
        // do something;
    }
};

(function () {
    this.methodOnWorkspace = function () {
    };
}).call(Workspace.prototype);
And you can call it then in app.js like:
var Workspace = require("workspace");
this.workspace = new Workspace(config);
An elegant way is to use a .env file to locally override production settings.
No need for command line switches. No need for all those commas and brackets in a config.json file. See my answer here
Example: on my machine the .env file is this:
NODE_ENV=dev
TWITTER_AUTH_TOKEN=something-needed-for-api-calls
My local .env overrides any environment variables. But on the staging or production servers (maybe they're on heroku.com) the environment variables are pre-set to stage (NODE_ENV=stage) or production (NODE_ENV=prod).
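The answer doesn't name a loader, but the usual choice for reading a .env file in Node is the dotenv package; a minimal sketch:
// at the very top of app.js
require('dotenv').config();   // loads .env into process.env

console.log(process.env.NODE_ENV);             // 'dev' locally
console.log(process.env.TWITTER_AUTH_TOKEN);   // value from .env, or the server's real environment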
Set the environment variable on the deployment server (e.g. NODE_ENV=production). You can access it through process.env.NODE_ENV.
The following config file contains the global settings:
const env = process.env.NODE_ENV || "development"

const configs = {
    base: {
        env,
        host: '0.0.0.0',
        port: 3000,
        dbPort: 3306,
        secret: "secretKey for sessions",
        dialect: 'mysql',
        issuer: 'Mysoft corp',
        subject: 'some@user.com',
    },
    development: {
        port: 3000,
        dbUser: 'root',
        dbPassword: 'root',
    },
    smoke: {
        port: 3000,
        dbUser: 'root',
    },
    integration: {
        port: 3000,
        dbUser: 'root',
    },
    production: {
        port: 3000,
        dbUser: 'root',
    }
};

const config = Object.assign(configs.base, configs[env]);

module.exports = config;
"base" contains common config for all environments.
Then import in other modules like:
const config = require('path/to/config.js')
console.log(config.port)
Happy Coding...
How about doing this in a much more elegant way with the nodejs-config module?
This module is able to set the configuration environment based on your computer's name. After that, when you request a configuration, you will get the environment-specific value.
For example, let's assume you have two development machines named pc1 and pc2 and a production machine named pc3. Whenever you request configuration values in your code on pc1 or pc2 you must get the "development" environment configuration, and on pc3 you must get the "production" environment configuration. This can be achieved like this:
Create a base configuration file in the config directory, let's say "app.json", and add the required configurations to it.
Now simply create folders within the config directory that match your environment names, in this case "development" and "production".
Next, create the configuration files you wish to override and specify the options for each environment in the environment directories. (Notice that you do not have to specify every option that is in the base configuration file, only the options you wish to override; the environment configuration files "cascade" over the base files.)
Now create a new config instance with the following syntax:
var config = require('nodejs-config')(
    __dirname,  // an absolute path to your application's 'config' directory
    {
        development: ["pc1", "pc2"],
        production: ["pc3"]
    }
);
Now you can get any configuration value without worrying about the environment like this:
config.get('app').configurationKey;
This answer is not something new. It's similar to what @andy_t has mentioned. But I use the pattern below for two reasons:
Clean implementation with no external npm dependencies.
It merges the default config settings with the environment-based settings.
JavaScript implementation
const settings = {
    _default: {
        timeout: 100,
        baseUrl: "http://some.api/",
    },
    production: {
        baseUrl: "http://some.prod.api/",
    },
}
// If you are not using ECMAScript 2018 Standard
// https://stackoverflow.com/a/171256/1251350
module.exports = { ...settings._default, ...settings[process.env.NODE_ENV] }
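Usage from another module would then look like this (assuming the file above is saved as config.js):
const config = require('./config');

console.log(config.baseUrl);   // "http://some.prod.api/" when NODE_ENV=production, otherwise the default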
I usually use TypeScript in my Node projects. Below is my actual implementation, copy-pasted.
TypeScript implementation
const settings: { _default: ISettings, production: any } = {
    _default: {
        timeout: 100,
        baseUrl: "",
    },
    production: {
        baseUrl: "",
    },
}
export interface ISettings {
    baseUrl: string
}
export const config = ({ ...settings._default, ...settings[process.env.NODE_ENV] } as ISettings)
