Return a string from an async nodejs function called in bash

I'm calling an async Node.js function that uses prompts (https://www.npmjs.com/package/prompts).
Basically, the user is presented with options, and after they select one, I want the selection captured in a bash variable. I cannot get this to work: it either hangs, or outputs everything, since prompts is a user interface that writes to stdout.
// nodefunc.js
async function run() {
  await blahhhh;
  return result; // text string
}
console.log(run());
# bash
x=$(node nodefunc.js)
echo "$x"

Unless you can ensure nothing else in the node script will print to stdout, you will need a different approach.
I'd suggest having the node script write to a temporary file, and have the bash script read the output from there.
Something like this perhaps:
const fs = require('fs');
const outputString = 'I am output';
fs.writeFileSync('/tmp/node_output.txt', outputString);
node nodefunc.js
# Assuming the node script ran successfully, read the output file
x=$(</tmp/node_output.txt)
echo "$x"
# Optionally, cleanup the tmp file
rm /tmp/node_output.txt

Related

Is it possible to roll a dynamically-generated Javascript array (Node) into a Bash array?

So I'm currently rolling a few different env variables into a Docker container using the following syntax:
Node script:
process.env['VAR1'] = 'someArbitraryValue';
process.env['VAR2'] = 'anotherArbitraryValue';
which then execs a bash script that looks like this:
params=()
[[ ! -z "$VAR1" ]] && params+=(-e "VAR1=$VAR1")
[[ ! -z "$VAR2" ]] && params+=(-e "VAR2=$VAR2")
docker run "${params[@]}"
That works just fine since I know the names of those env variables in advance and I can just hardcode the bash command to grab their values and insert them into params. However, what I'd like to be able to do is allow for a dynamically-generated list of variables to be added to the params list.
In other words, I run some function that returns an array that looks like:
var myArray = ['VAR3=somevalue', 'VAR4=anothervalue']
and is then passed into params by iterating through its contents and appending them. Since you can't set an array as an env variable in Bash, I'm not exactly sure if this is possible.
Is there a way to perform this operation, or am I out of luck?
If I'm not missing anything, yes; using child_process.execFile() (also see execFileSync()), you can pass elements of myArray as positional parameters to the bash script and do whatever you want with them in there.
const { execFile } = require('child_process');
// define "myArray" about here
const child = execFile('./myscript.sh', myArray, (error, stdout, stderr) => {
  if (error) {
    throw error;
  }
  console.log(stdout);
});
// ...
#!/bin/bash -
params=()
for param; do
  params+=(-e "${param}")
done
docker run "${params[@]}"

Create a persistent bash shell session in Node.js, know when commands finish, and read and modify sourced/exported variables

Imagine this contrived scenario:
./main.sh:
  source ./config.sh
  SOME_CONFIG="${SOME_CONFIG}bar"
  ./output.sh
./config.sh:
  export SOME_CONFIG='foo'
./output.sh:
  echo "Config is: ${SOME_CONFIG}"
I am trying to replace ./main.sh with a Node.js-powered ./main.js WITHOUT replacing the other shell files. The functions/variables exported by ./config.sh must also remain fully available to ./output.sh.
Here is a non-working ./main.js, written solely to show what I want the final code to look like:
const terminal = require('child_process').spawn('bash')
terminal.stdin.write('source ./config.sh\n')
process.env.SOME_CONFIG = `${process.env.SOME_CONFIG}bar` // this must be done in JS
terminal.stdin.write('./output.sh\n') // this must be able to access all exported functions/variables in config.sh, including the JS modified SOME_CONFIG
How can I achieve this? Ideally if there's a library that can do this I'd prefer that.
While this doesn't fully answer my question, it solves the contrived problem I had at hand and could help others if need be.
In general, if bash scripts communicate with each other via environment variables (eg. using export/source), this will allow you to start moving bash code to Node.js.
./main.js
const child_process = require("child_process");
const os = require("os");

// Source config.sh and print the environment variables, including SOME_CONFIG
const sourcedConfig = child_process
  .execSync(". ./config.sh > /dev/null 2>&1 && env")
  .toString();

// Convert ALL sourced environment variables into an object
const sourcedEnvVars = sourcedConfig
  .split(os.EOL)
  .map((line) => ({
    env: `${line.substr(0, line.indexOf("="))}`,
    val: `${line.substr(line.indexOf("=") + 1)}`,
  }))
  .reduce((envVarObject, envVarEntry) => {
    envVarObject[envVarEntry.env] = envVarEntry.val;
    return envVarObject;
  }, {});

// Make changes
sourcedEnvVars["SOME_CONFIG"] = `${sourcedEnvVars["SOME_CONFIG"]}bar`;

// Run output.sh and pass in the environment variables we got from the previous command
child_process.execSync("./output.sh", {
  env: sourcedEnvVars,
  stdio: "inherit",
});

Why is my function running twice in command line but not in vscode

I am importing a function from one file into another file and calling it there. The function runs twice when invoked from the command line, but only once when I run it in VSCode.
Here is an example:
// fileOne.js
async function task() {
  console.log('Hello')
}
module.exports = { task }
// fileTwo
const fileOne = require('./fileOne');
fileOne.task();
Output when run in VSCode:
Hello
Output when run in the command line:
Hello
Hello
I'm not sure why this is happening. No, I am not calling it in fileOne by accident; if I were, it would also run twice in VSCode.
Thanks.
If your fileOne and fileTwo look exactly as in your problem statement, i.e.:
fileOne.js:
async function task() {
  console.log('Hello')
}
module.exports = { task }
fileTwo.js:
const fileOne = require('./fileOne');
fileOne.task();
the output is 1 single 'Hello' when run in the following ways:
in Command Prompt
node fileTwo.js
in Windows PowerShell
node .\fileTwo.js
in Linux Bash Terminal
$ nodejs fileTwo.js
The same applies if you run the script with both files combined into one (as you mention in the comments).
There have been cases where Node.js printed output twice, but those were different scenarios.
Try running fileTwo.js separately; as already mentioned, it also works fine combined into a single file (e.g. your my_program_here.js, if that is just fileOne.js and fileTwo.js concatenated).
Note that require('./fileOne') resolves relative to the requiring module, so the './' behaves the same across the different command lines.

Use child_process#spawn with a generic string

I have a script in the form of a string that I would like to execute in a Node.js child process.
The data looks like this:
const script = {
  str: 'cd bar && fee fi fo fum',
  interpreter: 'zsh'
};
Normally, I could use
const exec = [script.str, '|', script.interpreter].join(' ');
const cp = require('child_process');
cp.exec(exec, function (err, stdout, stderr) {});
however, cp.exec buffers stdout/stderr, and I would like to be able to stream stdout/stderr wherever I choose.
Does anyone know if there is a way to use cp.spawn with a generic string, the same way you can with cp.exec? I would like to avoid writing the string to a temporary file and then executing that file with cp.spawn.
cp.spawn will work with a string but only if it has a predictable format - this is for a library so it needs to be extremely generic.
...I just thought of something, I am guessing the best way to do this is:
const n = cp.spawn(script.interpreter);
n.stdin.write(script.str); // <<< key part
n.stdout.setEncoding('utf8');
n.stdout.pipe(fs.createWriteStream('./wherever'));
I will try that out, but maybe someone has a better idea.
Ok figured this out.
I used the answer from this question:
Nodejs Child Process: write to stdin from an already initialised process
The following allows you to feed a generic string to a child process under different shell interpreters. It uses zsh, but you could use bash, sh, or really any executable.
const cp = require('child_process');
const n = cp.spawn('zsh');

n.stdin.setEncoding('utf8');
n.stdin.write('echo "bar"\n'); // <<< key part: you must include the newline char

n.stdout.setEncoding('utf8');
n.stdout.on('data', function (d) {
  console.log('data => ', d);
});
Using node itself as the interpreter is about the same, but it seems to need one extra call, n.stdin.end(), like so:
const cp = require('child_process');
const n = cp.spawn('node').on('error', function (e) {
  console.error(e.stack || e);
});

n.stdin.setEncoding('utf-8');
n.stdin.write("\n console.log(require('util').inspect({zim:'zam'}));\n\n"); // <<< key part
n.stdin.end(); // seems necessary to call .end()

n.stdout.setEncoding('utf8');
n.stdout.on('data', function (d) {
  console.log('data => ', d);
});

How to send commands from a BAT file to running NodeJS process in windows?

Is it possible to make/use custom text commands in a batch file while running a nodejs server through it?
//Current batch file
node nodeServer.js
//nodeServer.js
function list() {
  // insert query
}
function unlist() {
  // delete query
}
As of now, after I start the batch file, nodeServer.js starts and the batch window stops accepting any input.
I'd like to be able to type "nodeServer.js list" in the batch window and have it call a function named "list" inside nodeServer.js.
I'm looking to insert data about the server into a database by running an insert query with the "list" function, and to run a delete query with "nodeServer.js unlist" to remove the inserted row before shutting the server down again.
I'm unfamiliar with batch files. Is this possible?
Update
To clarify:
I want to type a text command IN the batch window, AFTER it has started the Node.js server, to run a specific function found inside nodeServer.js.
You want to send commands to Node.js after the node process has started.
To start Node.js without pausing the bat file, use start.
To send the commands, I will use a simple text file: the batch file writes to it with echo, and Node.js reads it with watchFile and readFileSync.
Function names and arguments are separated by spaces, for example: list a b c
The BAT file:
@echo off
REM @echo off keeps the bat file from printing each command as it runs
REM (REM itself does nothing; it is like // in JavaScript)

REM This starts Node.js without pausing the bat file
start node myNode.js

REM Delete the command file if it exists
del command

REM Send 3 commands to the file. You can also add parameters
echo list >> command
echo list a b c>> command
echo unlist>> command
var fs = require('fs')
var filename = __dirname + '/command'

// Add here any command you want to be run by the BAT file
var commands = {
  list: function () {
    console.log('list', arguments)
  },
  unlist: function () {
    console.log('unlist', arguments)
  }
}

console.log('watching: ' + filename)

if (fs.existsSync(filename)) {
  console.log('File exists, running initial command')
  runCommand()
}

fs.watchFile(filename, runCommand)

function runCommand() {
  if (!fs.existsSync(filename)) return
  var lines = fs.readFileSync(filename, 'utf-8').split('\r\n')
  console.log(lines)
  fs.unlinkSync(filename) // remove the file so each command runs only once
  for (var i = 0; i < lines.length; i++) {
    var line = lines[i]
    console.log(line)
    line = line.split(' ') // Split into command and arguments
    var command = line[0]
    var args = line.slice(1)
    if (!commands[command]) {
      console.log('Command not found: "' + command + '"')
      return
    }
    console.log('Running command:', command, '(', args, ')')
    commands[command].call(null, args)
  }
}
Read more about the Node.js fs module: https://nodejs.org/docs/latest/api/fs.html#fs_class_fs_stats
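The dispatch step (splitting a line like "list a b c" into a command name plus arguments and looking it up in the commands table) can be isolated and exercised on its own; the return values here are purely illustrative:

```javascript
// sketch: parse one line of the command file and dispatch it
const commands = {
  list: (args) => 'list:' + args.join(','),
  unlist: (args) => 'unlist:' + args.join(','),
};

function runLine(line) {
  const [command, ...args] = line.trim().split(' ');
  if (!commands[command]) return 'Command not found: "' + command + '"';
  return commands[command](args);
}

console.log(runLine('list a b c')); // list:a,b,c
console.log(runLine('unlist'));     // unlist:
```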
