Create a persistent bash shell session in Node.js, know when commands finish, and read and modify sourced/exported variables - node.js

Imagine this contrived scenario:
./main.sh:
source ./config.sh
SOME_CONFIG="${SOME_CONFIG}bar"
./output.sh

./config.sh:
export SOME_CONFIG='foo'

./output.sh:
echo "Config is: ${SOME_CONFIG}"
I am trying to replace ./main.sh with a Node.js powered ./main.js WITHOUT replacing the other shell files. The exported ./config.sh functions/variables must also be fully available to ./output.sh.
Here is a NON-working ./main.js, written for the sole purpose of showing what I want the final code to look like:
const terminal = require('child_process').spawn('bash')
terminal.stdin.write('source ./config.sh\n')
process.env.SOME_CONFIG = `${process.env.SOME_CONFIG}bar` // this must be done in JS
terminal.stdin.write('./output.sh\n') // this must be able to access all exported functions/variables in config.sh, including the JS modified SOME_CONFIG
How can I achieve this? Ideally if there's a library that can do this I'd prefer that.

While this doesn't fully answer my question, it solves the contrived problem I had at hand and could help others if need be.
In general, if bash scripts communicate with each other via environment variables (e.g. using export/source), this will allow you to start moving bash code to Node.js.
./main.js
const child_process = require("child_process");
const os = require("os");

// Source config.sh and print the environment variables, including SOME_CONFIG
const sourcedConfig = child_process
  .execSync(". ./config.sh > /dev/null 2>&1 && env")
  .toString();

// Convert ALL sourced environment variables into an object
const sourcedEnvVars = sourcedConfig
  .split(os.EOL)
  .map((line) => ({
    env: `${line.substr(0, line.indexOf("="))}`,
    val: `${line.substr(line.indexOf("=") + 1)}`,
  }))
  .reduce((envVarObject, envVarEntry) => {
    envVarObject[envVarEntry.env] = envVarEntry.val;
    return envVarObject;
  }, {});

// Make changes
sourcedEnvVars["SOME_CONFIG"] = `${sourcedEnvVars["SOME_CONFIG"]}bar`;

// Run output.sh and pass in the environment variables we got from the previous command
child_process.execSync("./output.sh", {
  env: sourcedEnvVars,
  stdio: "inherit",
});
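One caveat with the approach above: splitting the env output on newlines falls apart if any variable's value contains a newline (exported bash functions, for instance, show up as multi-line BASH_FUNC_* entries when bash is the shell). A minimal sketch of a more defensive variant, assuming your env supports the -0/--null flag (GNU coreutils does), splits on NUL bytes instead; the rest of the script (modifying SOME_CONFIG and passing the object to output.sh) stays the same:

const child_process = require("child_process");

// NUL-separated env output survives values that contain newlines.
// Assumes `env -0` is available (GNU coreutils).
const sourcedConfig = child_process
  .execSync(". ./config.sh > /dev/null 2>&1 && env -0")
  .toString();

const sourcedEnvVars = sourcedConfig
  .split("\0")
  .filter((entry) => entry.includes("="))
  .reduce((envVars, entry) => {
    const separator = entry.indexOf("=");
    envVars[entry.slice(0, separator)] = entry.slice(separator + 1);
    return envVars;
  }, {});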

Related

return a string from an async nodejs function called in bash

I'm calling an async Node.js function that uses prompts (https://www.npmjs.com/package/prompts).
Basically, the user is presented with options and, after they select one, I want the selection output to a variable in bash. I cannot get this to work: it either hangs, or outputs everything, since prompts is a user interface that uses stdout.
// nodefunc.js
async function run() {
  await blahhhh;
  return result; // text string
}
console.log(run());
// bash
x=$(node nodefunc.js)
echo $x
Unless you can ensure nothing else in the node script will print to stdout, you will need a different approach.
I'd suggest having the node script write to a temporary file, and have the bash script read the output from there.
Something like this perhaps:
const fs = require('fs');
const outputString = 'I am output';
fs.writeFileSync('/tmp/node_output.txt', outputString);
node nodefunc.js
# Assuming the node script ran successfully, read the output file
x=$(</tmp/node_output.txt)
echo "$x"
# Optionally, clean up the tmp file
rm /tmp/node_output.txt
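Putting the two pieces together, a minimal sketch of what nodefunc.js could look like (the Promise here is only a stand-in for the real prompts() call from the question):

// nodefunc.js - sketch only; replace the stand-in Promise with the real prompts() call
const fs = require('fs');

async function run() {
  const result = await Promise.resolve('selected-option'); // stand-in for the prompt
  return result;
}

run().then((result) => {
  // prompts can keep drawing its UI on stdout; only the file carries the final answer
  fs.writeFileSync('/tmp/node_output.txt', result);
});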

Node.js: How to pass environment variables with a shell command?

I'm trying to generate specific values in a Node.js script and then pass them as environment variables to a shell command, but I can't seem to get it working. What's the right way to execute a string as a shell command from Node.js while passing in my own environment variables?
The following doesn't seem to be working as I would expect:
const shell = require("shelljs");
const { findPort } = require("./find-port");

async function main() {
  // imagine that `findPort(value)` starts at the provided `value` and
  // increments it until it finds an available port
  const PORT = await findPort(8000); // 8000, 8001, etc
  const DB_PORT = await findPort(3306); // 3306, 3307, etc

  shell.exec(`yarn run dev`, {
    env: {
      PORT,
      DB_PORT,
    },
    async: true,
  });
}

main();
When I try to run this, I get the following error:
env: node: No such file or directory
Important: I don't want any values to leak out of this specific script, which is why I'm trying to avoid the export FOO=bar syntax, but I may be misunderstanding how that works.
I'd prefer a solution that uses shelljs, but I'm open to other solutions that use child_process.exec, execa, etc.
The script does exactly what you ask for: it runs yarn run dev with your environment variables. Unfortunately, that means that it does not run with the system's environment variables that yarn depends on, like PATH.
You can instead run it with both your variables and the system's variables:
shell.exec(`yarn run dev`, {
  env: {
    ...process.env,
    PORT,
    DB_PORT,
  },
});
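Since the question mentions being open to child_process as well, roughly the same thing without shelljs might look like the sketch below (PORT and DB_PORT are the values computed in main() above; the idea is identical: spread process.env so PATH and the rest survive):

const { exec } = require("child_process");

const child = exec("yarn run dev", {
  env: {
    ...process.env, // keep PATH etc. so `yarn` can find `node`
    PORT,
    DB_PORT,
  },
});

// Roughly equivalent to shelljs's `async: true` with the output forwarded
child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);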

Is there a way to run a self-terminating js script that can pass variables to the next?

I'd really like to have some of my secrets/keys be iterable, since I have a growing list of external API keys that would be easier to use if I could match them based on the route being used, without having to statically map them at the start of my application.
The only way I can think of to better organize them, without writing massive JSON one-line strings in a batch/bash file, would be to have it all defined in JS object literals and have a JS script stringify it and load it into ENV variables to be passed to the application that's about to start.
NPM pre-start script:
const env = {
  secret: 'supersecret',
  key: `key
that requires
line breaks`,
  apiKeys: {
    'api-1': 'a;sodhgfasdgflksdaj;lg',
    'api-2': 'ajl;sdfj;adjsfkljasd;f'
  }
}

for (let x in env) {
  if (typeof env[x] == 'string') {
    process.env[x] = env[x];
  } else {
    process.env[x] = JSON.stringify(env[x])
  }
  console.log(x)
}

process.exit(22);
NPM start script:
const key = process.env.key
const apiKeys = JSON.parse(process.env.apiKeys)
Unfortunately, the ENV variables don't remain between instances, so this is useless.
Would it also be secure to use STDIN and STDOUT to pass the data between the two scripts?
My solution ended up being to pipe the output by converting it to JSON, streaming it to STDOUT, and receiving it on STDIN in the second script. Doing this made it platform agnostic, and I can add any sort of active secret management in the source script (e.g. accepting secrets from various other secret management systems/vaults, or generating new secrets at every launch).
Send to STDOUT:
const env = {
  someSecret: 'supersecret',
  superSecretObject: {
    moreProperties: 'data'
  }
};

/* If you have an array of properties or have a very large set of secrets,
   you should create a readable stream from it, and send that to stdout,
   but this is much simpler */
process.stdout.write(JSON.stringify(env));
Accept on STDIN:
const fs = require('fs')

const env = (function () {
  /* Using fs will error out on no input, but you can use process.stdin
     if you don't need to suspend the whole application waiting for the input */
  let envTmp = fs.readFileSync(0).toString();
  envTmp = JSON.parse(envTmp);
  return envTmp;
})();
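For completeness, here is a minimal sketch of wiring the two together from a third script instead of a shell pipe. send.js and receive.js are hypothetical filenames for the two snippets above, and the effect is the same as running node send.js | node receive.js:

const { spawn } = require('child_process');

// Start the secret-producing script and the consuming script
const sender = spawn('node', ['send.js'], { stdio: ['ignore', 'pipe', 'inherit'] });
const receiver = spawn('node', ['receive.js'], { stdio: ['pipe', 'inherit', 'inherit'] });

// Stream the JSON straight from one process to the other;
// it never touches the environment or the filesystem
sender.stdout.pipe(receiver.stdin);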

why i am not able to update env variable in node js

I want to update an env variable in Node.js, but I am not able to. I tried checking with console.log(process.env.DB_HOST), but I still get the old value. I have added my whole code below; can anyone please look into it and help me resolve this issue?
function exec_api() {
  return new Promise(async function (resolve) {
    const execSync = require('child_process').exec;
    //let child_process_obj = execSync('DB_HOST='+process.env.UNITTEST_DB_HOST+' DB_DATABASE='+process.env.UNITTEST_DB_DATABASE+' DB_USERNAME='+process.env.UNITTEST_DB_USERNAME+' DB_PASSWORD='+process.env.UNITTEST_DB_PASSWORD+' PORT='+process.env.UNITTEST_SERVICE_PORT+' ./node_modules/.bin/nodemon main.js');

    await execSync('export DB_HOST=' + process.env.UNITTEST_DB_HOST);
    await execSync('export DB_DATABASE=' + process.env.UNITTEST_DB_DATABASE);
    await execSync('export DB_USERNAME=' + process.env.UNITTEST_DB_USERNAME);
    await execSync('export DB_PASSWORD=' + process.env.UNITTEST_DB_PASSWORD);
    await execSync('export PORT=' + process.env.UNITTEST_API_BACKEND_PORT);

    let child_process_obj = await execSync('node main.js');
    unittest_api_backend_process_id = child_process_obj.pid;
    resolve(true);
  });
}
TLDR: Just change process.env
To change, add or delete environment variables, use process.env. The following is test code showing how this works:
In main.js:
console.log(process.env.DB_DATABASE);
In exec.js:
const execSync = require ('child_process').execSync;
process.env.DB_DATABASE = 'foo'; // this is ALL you need to do
console.log(execSync('node main.js').toString('utf8'));
With the two files above, if you run node exec.js you will see foo printed out in the console. This is printed from main.js which inherits the environment from exec.js.
So all you need to do in your code is:
function exec_api() {
  return new Promise(function (resolve) {
    const exec = require('child_process').exec;

    // The following is node.js equivalent of bash "export":
    process.env.DB_HOST = process.env.UNITTEST_DB_HOST;
    process.env.DB_DATABASE = process.env.UNITTEST_DB_DATABASE;
    process.env.DB_USERNAME = process.env.UNITTEST_DB_USERNAME;
    process.env.DB_PASSWORD = process.env.UNITTEST_DB_PASSWORD;
    process.env.PORT = process.env.UNITTEST_SERVICE_PORT;

    let child_process_obj = exec('node main.js', {
      stdio: ['inherit', 'inherit', 'inherit']
    });

    unittest_api_backend_process_id = child_process_obj.pid;
    resolve(true);
  });
}
Note that if you want the promise to resolve only when main.js ends, you need to do:
function exec_api() {
  return new Promise(function (resolve) {
    const exec = require('child_process').exec;

    // The following is node.js equivalent of bash "export":
    process.env.DB_HOST = process.env.UNITTEST_DB_HOST;
    process.env.DB_DATABASE = process.env.UNITTEST_DB_DATABASE;
    process.env.DB_USERNAME = process.env.UNITTEST_DB_USERNAME;
    process.env.DB_PASSWORD = process.env.UNITTEST_DB_PASSWORD;
    process.env.PORT = process.env.UNITTEST_SERVICE_PORT;

    let child_process_obj = exec('node main.js', {
      stdio: ['inherit', 'inherit', 'inherit']
    });

    unittest_api_backend_process_id = child_process_obj.pid;

    child_process_obj.on('exit', () => resolve(true));
    // ^^^ Cannot use `await` as the API is not promise based
    //     but event based instead.
  });
}
Long story: The full explanation of why export doesn't work
On unixen, environment variables, and indeed, the entire environment including current working directory, root directory (which can be changed via chroot) etc. are not features of shells. They are features of processes.
We may be familiar with the export syntax of some shells for setting environment variables for child processes, but that is the shell's syntax; it has nothing to do with environment variables themselves. C/C++, for example, don't use export but instead use the setenv() function to set environment variables (indeed, internally that's what bash/sh/ksh etc. do when implementing export).
In node.js, the mechanism for reading and setting environment variables is via process.env.
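A minimal illustration of reading, setting, and deleting variables via process.env:

process.env.DB_HOST = 'localhost';  // set (or overwrite) a variable
console.log(process.env.DB_HOST);   // read it back: 'localhost'
delete process.env.DB_HOST;         // remove it entirely
console.log(process.env.DB_HOST);   // undefined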
Why asking a shell to do it doesn't work
This is not merely a node.js issue. It also won't work in bash:
In exporter.sh:
#! /bin/bash
export DB_DATABASE=$1
In exec.sh:
#! /bin/bash
./exporter.sh foo
echo $DB_DATABASE ;# does not print "foo"
This is a core security feature of unixen: other users should not be allowed to mess with your process. The way this policy is enforced in the case of the environment is that only a parent process can set the environment of a child process; a child process is not allowed to set the environment of its parent. The assumption is that the child process belongs to the parent process, so you should be allowed to do what you want with a process you own - but since the parent process (you) doesn't belong to the child process, the child is not allowed to mess with the parent's environment.
That's why your attempt to use export doesn't work. It actually works (the variables are indeed created in the subshell), but the subshell is not allowed to change the environment of its parent (the node.js process).
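The same behaviour is easy to see from Node.js itself. A minimal sketch (it assumes DB_DATABASE isn't already set in your environment):

const { execSync } = require('child_process');

// The child shell sets the variable and can see it...
execSync('export DB_DATABASE=foo && echo "in the child shell: $DB_DATABASE"', {
  stdio: 'inherit',
});

// ...but the parent Node.js process never does
console.log('in the parent process:', process.env.DB_DATABASE); // undefined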
When you use export in a terminal, it instructs the shell to set environment variables.
When you call exec from your code, you are not running one persistent shell; each call spawns its own short-lived shell (keeping a single shell alive would make it hard to tell where one command's output ends and the next begins). Anything export sets disappears as soon as that shell exits.
This makes export effectively an ignored command.
You can solve this by passing an option object to execSync:
execSync('node main.js', {
  env: {
    DB_HOST: 'localhost',
    // More envs...
  }
});
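One more thing worth knowing (it's the same pitfall as in the yarn question above): passing an env object replaces the child's environment entirely, so PATH and everything else is dropped. If the child also needs the inherited variables, spread process.env first. A minimal sketch:

const { execSync } = require('child_process');

execSync('node main.js', {
  env: {
    ...process.env,        // keep PATH and the rest of the inherited environment
    DB_HOST: 'localhost',  // then add/override what you need
  },
});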

How can I access command options in separate files?

I'm using commander to specify some commands and options for my global node module in the index.js file of the project (as shown in the example).
I know that I can easily check if a command was used by using the following code:
if (program.peppers) {
  console.log('-peppers was used')
}
But how can I check those properties in other files? I've already tried exporting program and requiring it in those other files, but it doesn't seem to work.
Let's say I want to check if an option was used in a different file than in the one in which I've defined them. How should I do that?
You could pass the parsed program to the files/functions that need the parsed arguments, or you can export the parsed program instance from its own module and require it wherever it's needed:
File index.js
var program = require('./program');

if (program.peppers) {
  console.log('-peppers was used')
}
File program.js
var program = require('commander');

program
  .version('0.0.1')
  .option('-p, --peppers', 'Add peppers')
  .option('-P, --pineapple', 'Add pineapple')
  .option('-b, --bbq-sauce', 'Add bbq sauce')
  .option('-c, --cheese [type]', 'Add the specified type of cheese [marble]', 'marble')
  .parse(process.argv);

module.exports = program;
Invoking index.js from command line:
> node index.js -p
Yields the output
-peppers was used
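And to answer the original question directly: any other file can check the options the same way by requiring the shared program instance. For example, a hypothetical toppings.js:

// toppings.js (hypothetical filename)
var program = require('./program');

if (program.pineapple) {
  console.log('-P, --pineapple was used')
}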
