How to use environment variables in package.json - node.js

Because we don't want sensitive data in the project code, including the package.json file, using environment variables would be a logical choice in my opinion.
Example package.json:
"dependencies": {
"accounting": "~0.4.0",
"async": "~1.4.2",
"my-private-module":"git+https://${BB_USER}:${BB_PASS}#bitbucket.org/foo/bar.git"
Is this possible?
The question is not whether this is wise, just whether it's possible.

If you use a .env file, you can use grep or eval to pull an environment variable's value out of it.
Updated start2 as @Paul suggested:
"scripts": {
"start": "NODE_ENV=$(grep NODE_ENV .env | cut -d '=' -f2) some_script",
"start2": "eval $(grep '^NODE_ENV' .env) && some_script"
}
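For reference, a minimal .env these scripts would match against (contents assumed for illustration):
NODE_ENV=production
grep NODE_ENV .env | cut -d '=' -f2 then yields production, and eval $(grep '^NODE_ENV' .env) evaluates that line as the shell assignment NODE_ENV=production before some_script runs.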

I have a similar but different requirement: in my case, I want to use environment variables in the scripts.
Instead of using the environment variables directly in package.json, I do:
"some-script": "./scripts/some-script.sh",
And in some-script.sh:
#!/bin/sh
npm run some-other-script -- --prop=$SOME_ENV_VAR
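A hypothetical invocation (the variable name and value are illustrative), with the variable set by the calling shell:
SOME_ENV_VAR=hello npm run some-script
which ends up running some-other-script with --prop=hello.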

Here's how I managed to work around package.json to achieve the same purpose. It uses a script that reads from a custom section of package.json for URL modules, interpolates environment variables in them, and installs them with npm install --no-save (the --no-save could be omitted, depending on the use case).
As a bonus: it tries to read the env variables from .env.json, which can be gitignored and is very useful for development.
Create a script that will read from a custom section of package.json
env-dependencies.js
const execSync = require('child_process').execSync
const pkg = require('./package.json')

if (!pkg.envDependencies) {
  return process.exit(0)
}

let env = Object.assign({}, process.env)

if (typeof pkg.envDependencies.localJSON === 'string') {
  try {
    // Merge values from the gitignored .env.json over the process environment
    Object.assign(env, require(pkg.envDependencies.localJSON))
  } catch (err) {
    console.log(`Could not read or parse pkg.envDependencies.localJSON. Proceeding with env only.`)
  }
}

if (typeof pkg.envDependencies.urls === 'undefined') {
  console.log(`pkg.envDependencies.urls not found or empty. Passing.`)
  process.exit(0)
}

if (
  !Array.isArray(pkg.envDependencies.urls) ||
  !(pkg.envDependencies.urls.every(url => typeof url === 'string'))
) {
  throw new Error(`pkg.envDependencies.urls should have a signature of String[]`)
}

// Interpolate every ${VAR_NAME} placeholder in each URL from the merged environment
const parsed = pkg.envDependencies.urls
  .map(url => url.replace(/\${([0-9a-zA-Z_]*)}/g, (_, varName) => {
    if (typeof env[varName] === 'string') {
      return env[varName]
    } else {
      throw new Error(`Could not read env variable ${varName} in url ${url}`)
    }
  }))
  .join(' ')

try {
  execSync('npm install --no-save ' + parsed, { stdio: [0, 1, 2] })
  process.exit(0)
} catch (err) {
  throw new Error('Could not install pkg.envDependencies. Are you sure the remote URLs all have a package.json?')
}
Add a "postinstall": "node env-dependencies.js" to your package.json, that way it will be run on every npm install
Add your private git repos to package.json using the URLs you want (note: they all must have a package.json at root!):
"envDependencies": {
"localJSON": "./.env.json",
"urls": [
"git+https://${GITHUB_PERSONAL_ACCESS_TOKEN}#github.com/user/repo#semver:^2.0.0"
]
},
(the semver specifier #semver:^2.0.0 can be omitted, but refers to a git tag, which can be very useful, as it makes your git server a fully-fledged package manager)
npm install
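For development, a matching .env.json could look like this (the token value is a placeholder, not a real credential):
{
  "GITHUB_PERSONAL_ACCESS_TOKEN": "your-token-here"
}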

No, it's not possible. You should access the repo using git+ssh, and store a private key in ~/.ssh.
Your line then looks like:
"my-private-module":"git+ssh://git#bitbucket.org/foo/bar.git"
Which doesn't contain anything sensitive.

No, it isn't possible, as npm does not treat any string values as templates of any kind.
It may be better to just use git+ssh (if your provider supports it) with an ssh agent.

You can inject environment values into your package.json configuration like this:
Any environment variables that start with npm_config_ will be interpreted as a configuration parameter. For example, putting npm_config_foo=bar in your environment will set the foo configuration parameter to bar. Any environment configurations that are not given a value will be given the value of true. Config values are case-insensitive, so NPM_CONFIG_FOO=bar will work the same.
https://docs.npmjs.com/misc/config#environment-variables
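As a minimal sketch (the foo parameter and script name are made up for illustration), a config value set through the environment is visible to scripts as an npm_config_-prefixed variable:
"scripts": {
  "show-foo": "echo $npm_config_foo"
}
Running npm_config_foo=bar npm run show-foo then prints bar. This matches classic npm behavior; newer npm majors have restricted which config keys are passed through, so treat it as version-dependent.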

I had the same need and my solution was based on @Long Nguyen's response. This way, I can rely only on what's defined in the .env file.
.env
...
SKIP_PREFLIGHT_CHECK=true
...
package.json
...
"scripts": {
"test": "yarn cross-env $(grep SKIP_PREFLIGHT_CHECK ../../.env) react-app-rewired test --watchAll=false"
}
...

You can install the env-cmd package (https://www.npmjs.com/package/env-cmd), and all the variables from your .env file will be visible to your scripts, e.g.:
./.env:
ENV1=THANKS
ENV2=FOR ALL
ENV3=THE FISH
package.json:
"scripts": {
  "test": "env-cmd pact-broker can-i-deploy --broker-token=${ENV1}"
}
or another example from your question:
"my-private-module":"env-cmd git+https://${BB_USER}:${BB_PASS}#bitbucket.org/foo/bar.git"

For complicated environment variables, you can use https://stedolan.github.io/jq/ to access a JSON file (your env file, in this case).
JSON file could be something like
{
  "env": {
    "username": "1345345",
    "Groups": [],
    "arraytest": [
      {
        "yes": "1",
        "no": "0"
      }
    ]
  }
}
so the script could be something like this to access the yes value:
"scripts": {
  "yes": "jq '.env.arraytest[0].yes?'"
}
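Assuming the JSON above is saved as env.json (the filename is an assumption), you can pipe it into the script:
npm run yes < env.json
which prints "1" (add jq's -r flag for raw, unquoted output).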

If you're running node inside a Docker container
Use Docker Compose to inject the env variable
app:
  environment:
    - NODE_ENV=staging
Run your package.json script from your Dockerfile
CMD [ "npm", "run", "start" ]
Use echo or printenv (two equivalent variants; pick one):
"scripts": {
  "start": "node -r dotenv/config app.js dotenv_config_path=/run/secrets/$(echo $NODE_ENV)"
}
or, with printenv:
"scripts": {
  "start": "node -r dotenv/config app.js dotenv_config_path=/run/secrets/$(printenv NODE_ENV)"
}
Don't use this for sensitive env variables. It's a really good way to point to a Docker secrets file (like this example shows).
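For instance, given the compose file above, the path resolves to /run/secrets/staging, which could be a dotenv-format secrets file (contents illustrative):
DB_PASSWORD=example-password
dotenv then loads it into process.env at startup.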

Related

How can I use a CommonJS module in my Quasar project

In an SSR Quasar project using Vite, whenever I try to add the @tiptap/extension-code-block-lowlight extension to my project, build it, and then run node dist/ssr/index.js, it throws the following error:
Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: /home/whatever/devotto/devotto.com/node_modules/lowlight/lib/common.js
require() of ES modules is not supported.
require() of /home/whatever/devotto/devotto.com/node_modules/lowlight/lib/common.js from /home/whatever/devotto/devotto.com/dist/ssr/server/server-entry.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module" which defines all .js files in that package scope as ES modules.
Instead rename common.js to end in .cjs, change the requiring code to use import(), or remove "type": "module" from /home/whatever/devotto/devotto.com/node_modules/lowlight/package.json.
Upon investigation, I have concluded that the issue is the lowlight library being imported by @tiptap/extension-code-block-lowlight.
If I manually go to my node_modules/@tiptap/extension-code-block-lowlight/package.json AND node_modules/lowlight/package.json and remove the line "type": "module", I can run the project with no problem (e.g. yarn build && node dist/ssr/index.js).
This solution works on my current machine but I shouldn't have to touch the node_modules folder.
I would assume that I have to transpile the lowlight library, which prompted me to try altering the Vite configuration, but no luck there either:
module.exports = function () {
  return {
    build: {
      extendViteConf (viteConf, { isClient, isServer }) {
        if (isServer) {
          viteConf.optimizeDeps = viteConf.optimizeDeps || {};
          viteConf.optimizeDeps.include = ['./node_modules/highlight.js'];
          viteConf.build.commonjsOptions = viteConf.build.commonjsOptions || {};
          viteConf.build.commonjsOptions.include = [/highlight.js/, /node_modules/];
          // viteConf.optimizeDeps.entries = [
          //   'node_modules/@tiptap/extension-code-block-lowlight/dist/tiptap-extension-code-block-lowlight.cjs',
          //   'node_modules/highlight.js'
          // ];
        }
      },
    }
  }
}
Is there a solution to this issue without having to manually change the node_modules folder? Thank you very much in advance.
I didn't exactly solve the problem; I just automated the workaround whenever I run the command to build the server, using pre scripts.
On my package.json:
{
  "scripts": {
    "start:test:webserver": "ENV_FILE=test quasar build --mode ssr --port 3000 && node dist/ssr/index.js",
    "prestart:test:webserver": "sed -i '/\"type\": \"module\",/d' node_modules/lowlight/package.json && sed -i '/\"type\": \"module\",/d' node_modules/@tiptap/extension-code-block-lowlight/package.json"
  }
}
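A variant (an untested sketch of the same workaround) is to hang it on a postinstall hook instead, so the "type": "module" lines are stripped again after every install:
{
  "scripts": {
    "postinstall": "sed -i '/\"type\": \"module\",/d' node_modules/lowlight/package.json node_modules/@tiptap/extension-code-block-lowlight/package.json"
  }
}
(GNU sed applies the same in-place edit to each file listed.)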

Passing parameters from Jenkins CI to npm script

When I run a Jenkins build, I would like to pass COMMIT_HASH and BRANCH_NAME to one of my JavaScript files, publish.js, so that I can remove the hard-coded values for tags and consumerVersion.
Here is my code:
Jenkinsfile
stage('Publish Pacts') {
  steps {
    script {
      sh 'npm run publish:pact -Dpact.consumer.version=${COMMIT_HASH} -Dpact.tag=${env.BRANCH_NAME}'
    }
  }
}
package.json
"scripts": {
"publish:pact": "node ./src/test/pact/publish.js"
}
./src/test/pact/publish.js
let publisher = require('@pact-foundation/pact-node');
let path = require('path');

let opts = {
  providerBaseUrl: `http://localhost:${global.port}`,
  pactFilesOrDirs: [path.resolve(process.cwd(), 'pacts')],
  pactBroker: 'http://localhost:80',
  tags: ["prod", "test"], // $BRANCH_NAME
  consumerVersion: "2.0.0" // $COMMIT_HASH
};

publisher.publishPacts(opts).then(() => {
  console.log("Pacts successfully published");
  done()
});
Does anyone know how to do this?
You can pass CLI arguments to your node script; they end up in process.argv.
npm passes CLI arguments on to the script after two dashes (--).
To illustrate this, consider this example:
Jenkinsfile
stage('Publish Pacts') {
  steps {
    script {
      sh 'npm run publish:pact -- ${COMMIT_HASH} ${env.BRANCH_NAME}'
    }
  }
}
package.json
"scripts": {
"publish:pact": "node ./src/test/pact/publish.js"
}
publish.js
// process.argv[0] = path to node binary
// process.argv[1] = path to script
console.log('COMMIT_HASH:', process.argv[2]);
console.log('BRANCH_NAME:', process.argv[3]);
I left the CLI flags out for simplicity.
Hope this helps!
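For completeness, a sketch (untested) of how the original publish.js could consume those positional arguments instead of the hard-coded values:
let publisher = require('@pact-foundation/pact-node');
let path = require('path');

// npm strips everything before the `--`, so these are COMMIT_HASH and BRANCH_NAME
const [commitHash, branchName] = process.argv.slice(2);

let opts = {
  pactFilesOrDirs: [path.resolve(process.cwd(), 'pacts')],
  pactBroker: 'http://localhost:80',
  tags: [branchName],
  consumerVersion: commitHash
};

publisher.publishPacts(opts).then(() => {
  console.log("Pacts successfully published");
});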

Run another yarn/npm task within a package.json, without specifying yarn or npm

I have a task in my package.json "deploy", which needs to first call "build". I have specified it like this:
"deploy": "yarn run build; ./deploy.sh",
The problem is that this hard codes yarn as the package manager. So if someone doesn't use yarn, it doesn't work. Switching to npm causes a similar issue.
What's a good way to achieve this while remaining agnostic to the choice of npm or yarn?
One simple approach is to use the npm-run-all package, whose documentation states:
Yarn Compatibility
If a script is invoked with Yarn, npm-run-all will correctly use Yarn to execute the plan's child scripts.
So you can do this:
"predeploy": "run-s build",
"deploy": "./deploy.sh",
And the predeploy step will use either npm or yarn depending on how you invoked the deploy task.
I think it is good for the scripts in package.json to remain package-manager agnostic, but within a project it is probably prudent to agree on a single package manager so that you're not dealing with conflicting lockfiles.
It's probably not ideal, but you could run a .js file to make these checks.
You could create a file at your project root called yarnpm.js (or whatever), and call said file in your package.json deploy command:
// package.json (trimmed)
"scripts": {
  "deploy": "node yarnpm",
  "build": "whatever build command you use"
},
// yarnpm.js
const fs = require('fs');
const FILE_NAME = process.argv[1].replace(/^.*[\\\/]/, '');
// Command you wish to run, with `{{}}` in place of `npm` or `yarn`.
// This allows you to easily run multiple `npm`/`yarn` commands without much work.
// For example, `{{}} run one && {{}} run two`
const COMMAND_TO_RUN = '{{}} run build; ./deploy.sh';
try {
  if (fs.existsSync('./package-lock.json')) { // Check for `npm`
    // split/join replaces every {{}} placeholder, not just the first
    execute(COMMAND_TO_RUN.split('{{}}').join('npm'));
  } else if (fs.existsSync('./yarn.lock')) { // Check for `yarn`
    execute(COMMAND_TO_RUN.split('{{}}').join('yarn'));
  } else {
    console.log('\x1b[33m', `[${FILE_NAME}] Unable to locate either npm or yarn!`, '\x1b[0m');
  }
} catch (err) {
  console.log('\x1b[31m', `[${FILE_NAME}] Unable to deploy!`, '\x1b[0m');
}

function execute(command) { // Helper function, to make running `exec` easier
  require('child_process').exec(command, (error, stdout, stderr) => {
    if (error) {
      console.log(`error: ${error.message}`);
      return;
    }
    if (stderr) {
      console.log(`stderr: ${stderr}`);
      return;
    }
    console.log(stdout);
  });
}
Hope this helps in some way! Cheers.
EDIT:
...or, if you wanted to parameterize the yarnpm.js script to make it easily reusable and keep all "commands" inside the package.json file, you could do something like this:
// package.json (trimmed, parameterized)
"scripts": {
  "deploy": "node yarnpm '{{}} run build; ./deploy.sh'",
  "build": "node build.js"
},

// yarnpm.js (parameterized)
const COMMAND_TO_RUN = process.argv[2]; // Technically, the first 'parameter' is the third index
const FILE_NAME = process.argv[1].replace(/^.*[\\\/]/, '');
if (COMMAND_TO_RUN) {
  const fs = require('fs');
  try {
    if (fs.existsSync('./package-lock.json')) { // Check for `npm`
      execute(COMMAND_TO_RUN.split('{{}}').join('npm'));
    } else if (fs.existsSync('./yarn.lock')) { // Check for `yarn`
      execute(COMMAND_TO_RUN.split('{{}}').join('yarn'));
    } else {
      console.log('\x1b[33m', `[${FILE_NAME}] Unable to locate either npm or yarn!`, '\x1b[0m');
    }
  } catch (err) {
    console.log('\x1b[31m', `[${FILE_NAME}] Unable to deploy!`, '\x1b[0m');
  }
  function execute(command) { // Helper function, to make running `exec` easier
    require('child_process').exec(command, (error, stdout, stderr) => {
      if (error) {
        console.log(`error: ${error.message}`);
        return;
      }
      if (stderr) {
        console.log(`stderr: ${stderr}`);
        return;
      }
      console.log(stdout);
    });
  }
} else {
  console.log('\x1b[31m', `[${FILE_NAME}] Requires a single argument!`, '\x1b[0m')
}
What about checking before running?
You can create a new file called build.sh with the content below:
#!/bin/sh
# Check whether the current user has a node environment installed; if not, install it automatically.
if command -v node >/dev/null 2>&1; then
  echo "version of node: $(node -v)"
  echo "version of npm: $(npm -v)"
else
  # Install the node environment automatically. This assumes CentOS;
  # change this part to support other platforms.
  curl --silent --location https://rpm.nodesource.com/setup_12.x | sudo bash -
  yum -y install nodejs
fi
npm run build
Then your script will be:
{
  "deploy": "./build.sh && ./deploy.sh"
}
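Both scripts need to be executable for this to work, e.g.:
chmod +x build.sh deploy.sh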
So I think I have a much simpler solution:
"deploy": "yarn run build || npm run build; ./deploy.sh",
Its only real downside is that when yarn exists but the build fails, npm run build will also run.

Is there a way to pass the cypress.io baseUrl env var into my package.json run scripts?

I want to be able to pass the baseUrl from the cypress.json file into the scripts of the package.json file for my cypress test project. Is this possible?
I have been looking at the Cypress documentation and Stack Overflow, but I cannot find a solution that does not require adding another script to do something like "get-base-url": "type cypress.json | jq -r .baseUrl" and passing this script as an argument into the relevant "test" script (see below).
cypress.json file
{
  "baseUrl": "http://localhost:3000/",
  //other key-value pairs
}
package.json scripts section
{
  //other settings
  "scripts": {
    //other scripts
    "test": "start-server-and-test website:dev http://localhost:3000 cy:run",
  },
  //other settings
}
I anticipated there would be an equivalent to Cypress.config().baseUrl to get the value of baseUrl from the JSON file, resulting in something similar to the following (pseudo-code, doesn't work):
{
  //other settings
  "scripts": {
    //other scripts
    "test": "start-server-and-test website:dev ${baseUrl} cy:run",
  },
  //other settings
}
NB: I have not posted on Stack Overflow before, so I apologise if I have not given enough info and/or missed something in the rules.
The scripts field's capabilities are limited. You need a small script to read baseUrl from cypress.json and pass it into the start-server-and-test package.
Let's say we create a script called start-server-and-test.js with the following code and put it under the scripts directory:
const cypressConfig = require('../cypress.json') // line 1
const startServerAndTest = require('start-server-and-test') // line 2
const [startScript, testScript] = process.argv.slice(2) // line 3
startServerAndTest({ // line 4
  start: `npm run ${startScript}`,
  url: cypressConfig.baseUrl,
  test: `npm run ${testScript}`,
})
Here is how we use it in package.json
{
  "scripts": {
    "test": "node scripts/start-server-and-test.js website:dev cy:run"
  }
}
Short explanation:
Line 1: read cypress.json and assign it to cypressConfig, through which you can later access baseUrl as cypressConfig.baseUrl
Line 3: retrieve the command-line arguments, which are ['website:dev', 'cy:run']
Line 4: run the package with the corresponding parameters
Just wanted to elaborate on Hung Tran's solution above for 2021:
/* eslint-disable @typescript-eslint/no-var-requires */
require("dotenv").config();
const startServerAndTest = require("start-server-and-test");
const [startScript, testScript] = process.argv.slice(2);

startServerAndTest.startAndTest({
  services: [{ start: `npm run ${startScript}`, url: process.env.CYPRESS_BASE_URL }],
  test: `npm run ${testScript}`,
});
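Assuming a .env at the project root (the CYPRESS_BASE_URL name is this answer's convention, loaded by the dotenv call above):
CYPRESS_BASE_URL=http://localhost:3000
As a bonus, Cypress itself also reads CYPRESS_-prefixed environment variables, so the same variable overrides baseUrl for the test run.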

Load .env environment variables when running npm task

Let's say we have a .env file with some variables specified:
AWS_PROFILE=hsz
ENVIRONMENT=development
There is also a simple npm task defined:
{
  "name": "project",
  "version": "0.0.1",
  "scripts": {
    "deploy": "sls deploy"
  }
}
But running npm run deploy ignores our .env definition.
It can be resolved with better-npm-run like:
{
  "name": "project",
  "version": "0.0.2",
  "scripts": {
    "deploy": "bnr deploy"
  },
  "betterScripts": {
    "deploy": "sls deploy"
  },
  "devDependencies": {
    "better-npm-run": "^0.1.1"
  }
}
but this looks like overhead - especially when we have 10+ tasks.
Is there a better way to always load .env without proxying all tasks via better-npm-run?
A bit ugly, but you could try something like this:
"scripts": {
  "deploy": "export $(cat .env | xargs) && sls deploy"
}
This will export all environment variables from the .env file before running sls deploy.
There are some variations on this technique in this answer.
Not very clean, but it avoids the use of an extra module.
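One common variation (a sketch; it assumes the .env file is valid shell syntax, i.e. KEY=value with no spaces around =) is to source the file with auto-export turned on:
"scripts": {
  "deploy": "set -a && . ./.env && set +a && sls deploy"
}
set -a marks every variable assigned while sourcing for export, so sls deploy sees them all.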
You can use the env-cmd npm package to set environment variables loaded from a .env file before executing an npm script.
Add package to your package.json devDependencies:
npm i env-cmd -D
Prefix your npm script with the env-cmd program in package.json:
{
  "scripts": {
    "deploy": "env-cmd sls deploy"
  }
}
Maintain and load all your environment-specific configuration in the project itself.
dev.js
module.exports = {
  "host": "dev.com"
}
prod.js
module.exports = {
  "host": "prod.com"
}
config.js - the main file, which resolves the configuration based on the process.env.ENV variable:
const dev = require('./dev');
const prod = require('./prod');

let envObject = {};
const env = process.env.ENV || "dev";

switch (env) {
  case 'prod':
    envObject = prod;
    break;
  default:
    envObject = dev;
}

envObject['ENV'] = env;

// Optional: merge into process.env if you prefer; otherwise just require('./config') wherever you need configuration.
process.env = Object.assign(process.env, envObject);

module.exports = envObject;
index.js - the project entry point, run every time the project starts:
const config = require('./config');
console.log('config object => ',config.host);
package.json
{
  "name": "project",
  "version": "0.0.2",
  "scripts": {
    "deploy": "sls deploy"
  }
}
Running your Node.js code:
Production environment: ENV=prod npm run deploy
Development environment: npm run deploy
The default environment is set to dev in ./config.js.
Using this simple practice you don't need any npm module to manage your environment configurations.
I was having the same issue while trying to sync the DB using an external command, and I fixed it by requiring the dotenv package, which loads the variables:
"scripts": {
  "db-sync": "node --require dotenv/config ./src/sequelize/sync.js"
}
then just call npm run db-sync
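To see the preload in action, a minimal stand-in for sync.js (illustrative only):
// ./src/sequelize/sync.js — process.env is already populated via `node --require dotenv/config`
console.log('AWS_PROFILE =', process.env.AWS_PROFILE);
console.log('ENVIRONMENT =', process.env.ENVIRONMENT);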
