child_process.execFile slow to exit - node.js

I have a Node script that calls an external program (PluginManager.exe) this way:
const util = require('util');
const execFile = util.promisify(require('child_process').execFile);
const process = execFile('PluginManager.exe', ['/install']);
process
  .then(({stdout, stderr}) => console.log('done', stdout, stderr))
  .catch(e => console.log(e));
PluginManager.exe takes 8 seconds to execute. My problem is that the Node script keeps running for another 10 seconds after the child process has exited. I know when PluginManager.exe finishes because I can see it disappear from the Windows Task Manager process list.
What keeps the Node process running for so long, and what can I do to make sure it exits as soon as the child process exits?

Maybe it's waiting on input and timing out after 10s?
Try closing stdin with .end() as mentioned in https://nodejs.org/api/child_process.html#child_process_subprocess_stdin
(in this usage, you'll need the original return value of execFile, so don't promisify, as per https://stackoverflow.com/a/30883005/1105015)
e.g.
const util = require('util');
const execFile = require('child_process').execFile;
const process = execFile(
  'PluginManager.exe', ['/install'], (e, stdout, stderr) => {
    if (e) {
      console.log(e);
    } else {
      console.log('done', stdout, stderr);
    }
  });
process.stdin.end();
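If you would rather keep the promisified call, the promise returned by util.promisify(execFile) also carries the underlying ChildProcess on its child property (per the Node docs), so stdin can be closed without giving up the promise. A minimal sketch:
const util = require('util');
const execFile = util.promisify(require('child_process').execFile);

const promise = execFile('PluginManager.exe', ['/install']);
// The promisified execFile attaches the ChildProcess as `child`,
// so stdin can still be closed explicitly.
promise.child.stdin.end();

promise
  .then(({stdout, stderr}) => console.log('done', stdout, stderr))
  .catch(e => console.log(e));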

Have you tried adding the killSignal option, set to something a bit more aggressive?
const process = execFile('PluginManager.exe', ['/install'], {killSignal: 'SIGKILL'});

Related

How to make node.js spawn show text realtime with VT100 codes

I've written an Angular builder that dynamically sets up my jest command and then spawns it, as shown below. While I do get all the output, it's not showing the same way as if I were to run the child process directly from the command line.
For example, if I manually run the node command, I get VT100-like color coding and a timer is shown as the jest tests start. With the code below, that doesn't happen, and the output all comes out as a hodge-podge.
Is there a different way to handle running the command?
export async function runJest(options: Options, context: BuilderContext): Promise<BuilderOutput> {
  // ...
  const child = spawn('node', [
    '--experimental-vm-modules',
    '--no-warnings',
    './node_modules/jest/bin/jest.js',
    '--no-cache',
    '--config',
    JSON.stringify(config)
  ])
  child.stdout.on('data', x => process.stderr.write(x))
  child.stderr.on('data', x => process.stderr.write(x))
  const exitCode = await new Promise((resolve) => child.on('close', resolve))
  return {success: exitCode === 0}
}
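One workaround often suggested for this symptom (an assumption here, not something this thread confirms) is to let the child inherit the builder's terminal with stdio: 'inherit'. jest then sees a real TTY and keeps its colors and progress timer, at the cost of the builder no longer being able to intercept the output streams. A sketch:
const { spawn } = require('child_process');

// Sketch: inherit the parent's stdio so the child writes straight to the
// terminal and detects a TTY; only the exit code is observed here.
function runJestInherited(config) {
  return new Promise((resolve) => {
    const child = spawn('node', [
      '--experimental-vm-modules',
      '--no-warnings',
      './node_modules/jest/bin/jest.js',
      '--no-cache',
      '--config',
      JSON.stringify(config),
    ], { stdio: 'inherit' });
    child.on('close', (exitCode) => resolve({ success: exitCode === 0 }));
  });
}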

Is there a way to pass command line arguments to node process while loading environment to child process for a job in bull?

I need to pass command-line arguments, params, or execArgv so they show up in the process parameters while the environment is being loaded for a child process that Bull launches to process a job.
Is it possible? If yes, is there a way to do it?
I can identify that the child process was launched by Bull using args[1], which contains /bull/lib/process, but I want to pass a custom param to the node process.
When a worker script runs, it reads the environment once and keeps it until you shut it down.
If you need variable parameters for the function the worker should use, it is best to send them through your queue.
queue.js
queue.add("foo", {params:"parameters you need", payload:{ foo: "bar" }});
worker.js
const worker = new Worker("foo",
  async (job) => {
    await your_function(job.data.params, job.data.payload);
  }
);

const your_function = async (params, payload) => {
  require("fs").writeFileSync("runner.json", JSON.stringify(payload), "utf8");
  await require("child_process").fork("runner.js", params.split(" "));
};
runner.js
console.log(process.argv);
const fs = require("fs");
fs.readFile("runner.json", "utf8", function (err, data) {console.log("data: ", JSON.parse(data));});

Send text to a .bat file that is run from NodeJS

I want to start a .bat file from within NodeJS. With that, I would then be able to send messages to that running bat file.
My code:
const childprocess = require("child_process")
const mybat = childprocess.exec("start cmd /c my.bat", () => {
  console.log("bat file has finished")
})
// some time later in another function
mybat.send("text to send")
// within the bat, it would use the new message "text to send" as if you typed and sent a message in the cmd terminal
// ...
mybat.send("a")
// sending any key to complete a PAUSE command which will close the cmd
The .send() calls don't actually work, but hopefully they demonstrate what I'm trying to accomplish. Everything except the send calls works fine.
The following code uses @rauschma/stringio to asynchronously write to the stdin of a child process running a shell command:
const {streamWrite, streamEnd, onExit} = require('@rauschma/stringio');
const {spawn} = require('child_process');
async function main() {
  const sink = spawn('cmd.exe', ['/c', 'my.bat'],
    {stdio: ['pipe', process.stdout, process.stderr]}); // (A)
  writeToWritable(sink.stdin); // (B)
  await onExit(sink);
  console.log('bat file has finished');
}
main();

async function writeToWritable(writable) {
  // ...
  await streamWrite(writable, 'text to send\n');
  // ...
  await streamWrite(writable, 'a');
  // ...
  await streamEnd(writable);
}
We spawn a separate process, called sink, for the shell command. writeToWritable writes to sink.stdin. It does so asynchronously and pauses via await, to avoid requiring too much buffering.
Observations:
In line A, we tell spawn() to let us access stdin via sink.stdin ('pipe'). stdout and stderr are forwarded to the parent's process.stdout and process.stderr.
We don't await in line B for the writing to finish; instead, we await until the child process sink is done.
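If the extra dependency is unwanted, the same pattern can be sketched with only the built-in stream API, assuming my.bat reads its prompts from standard input as in the answer above:
const { spawn } = require('child_process');

const sink = spawn('cmd.exe', ['/c', 'my.bat'],
  { stdio: ['pipe', process.stdout, process.stderr] });

// Write to the child's stdin directly; end() signals that no more input
// follows, which lets a trailing PAUSE complete and the cmd window close.
sink.stdin.write('text to send\n');
sink.stdin.write('a');
sink.stdin.end();

sink.on('close', () => console.log('bat file has finished'));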

Wait for all child processes to finish to continue

I would like to know if it is possible to wait for all child processes created using the spawn function to finish before continuing execution.
I have code that looks like this:
const spawn = window.require('child_process').spawn;
let processes = [];
let thing = [];

// paths.length = 2
paths.forEach((path) => {
  const pythonProcess = spawn("public/savefile.py", ['-d', '-j', '-p', path, tempfile]);
  pythonProcess.on('exit', () => {
    fs.readFile(tempfile, 'utf8', (err, data) => {
      thing.push(...)
    });
  });
  processes.push(pythonProcess);
});

console.log(processes) // Here we have 2 child processes
console.log(thing) // empty array... the python processes haven't finished yet
return thing // of course it doesn't work. I want to wait for all the processes and their callbacks to finish before continuing
As you can guess, I would like to know how I could get all the python scripts running at the same time and wait for all of them to finish before continuing my JS code.
I'm running Node 10.15.3.
Thank you
Use forEach to push a Promise for each child into an array of Promises, then Promise.all() them.
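A sketch of that approach, wrapping each spawned child in a Promise that resolves once its exit handler has read the temp file, then awaiting them all:
const { spawn } = require('child_process');
const fs = require('fs');

// Each child gets its own Promise; Promise.all resolves only after every
// exit handler has finished reading the temp file.
function runAll(paths, tempfile) {
  const thing = [];
  const jobs = paths.map((path) => new Promise((resolve, reject) => {
    const pythonProcess = spawn('public/savefile.py', ['-d', '-j', '-p', path, tempfile]);
    pythonProcess.on('error', reject);
    pythonProcess.on('exit', () => {
      fs.readFile(tempfile, 'utf8', (err, data) => {
        if (err) return reject(err);
        thing.push(data); // a per-path temp file would avoid the children clashing
        resolve();
      });
    });
  }));
  return Promise.all(jobs).then(() => thing);
}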
Have you tried spawnSync? Per the docs, it
is generally identical to spawn with the exception that the function
will not return until the child process has fully closed.
import { spawnSync } from "child_process";
spawnSync('ls', ['-la']);
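If you go that route, the synchronous return value carries the exit status and captured output. A small usage sketch with placeholder paths:
const { spawnSync } = require('child_process');

// Placeholder values; spawnSync blocks, so the scripts run one after another.
const paths = ['first/path', 'second/path'];
const tempfile = 'temp.json';

for (const path of paths) {
  const result = spawnSync('public/savefile.py', ['-d', '-j', '-p', path, tempfile], {
    encoding: 'utf8',
  });
  console.log(result.status); // exit code of the child
  console.log(result.stdout); // captured output, since stdio defaults to 'pipe'
}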

Node.js child process isn't receiving stdin unless I close the stdin stream

I'm building a discord bot that wraps a terraria server in node.js so server users can restart the server and similar actions. I've managed to finish half the job, but I can't seem to create a command to execute commands on the terraria server. I've set it to write the command to the stdin of the child process and some basic debugging verifies that it does, but nothing apparently happens.
In the Node.js docs for child process stdin, it says "Note that if a child process waits to read all of its input, the child will not continue until this stream has been closed via end()." This seems likely to be the problem, as calling the end() function on it does actually send the command as expected. That said, it seems hard to believe that I'm unable to continuously send commands to stdin without having to close it.
Is this actually the problem, and if so what are my options for solving it? My code may be found below.
const discordjs = require("discord.js");
const child_process = require("child_process");
const tokens = require("./tokens");

const client = new discordjs.Client();

const terrariaServerPath = "C:\\Program Files (x86)\\Steam\\steamapps\\common\\Terraria\\TerrariaServer.exe"
const terrariaArgs = ['-port', '7777', "-maxplayers", "8", "-world", "test.wld"]

var child = child_process.spawn(terrariaServerPath, terrariaArgs);

client.on('ready', () => {
  console.log(`Logged in as ${client.user.tag}!`);
});

client.on('disconnect', () => {
  client.destroy();
});

client.on('message', msg => {
  if (msg.channel.name === 'terraria') {
    var msgSplit = msg.content.split(" ");
    if (msgSplit[0] === "!restart") {
      child.kill();
      child = child_process.spawn(terrariaServerPath, terrariaArgs);
      registerStdio();
      msg.reply("restarting server")
    }
    if (msgSplit[0] === "!exec") {
      msg.reply(msgSplit[1]);
      child.stdin.write(msgSplit[1] + "\n");
      child.stdin.end();
    }
  }
});

client.login(tokens.discord_token);

var registerStdio = function () {
  child.stdout.on('data', (data) => {
    console.log(`${data}`);
  });
  child.stderr.on('data', (data) => {
    console.error(`${data}`);
  });
}

registerStdio();
I was able to solve the problem by using the library node-pty. As near as I can tell, the problem was that the child process was not reading its stdin as it arrived, and I was unable to flush it. node-pty creates a virtual terminal object that can be written to instead of stdin. This object does not buffer writes, so any input is sent to the program immediately.
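For reference, a minimal node-pty sketch along those lines; the server path and arguments are carried over from the question, and depending on the node-pty version the data callback may be registered with .on('data', ...) instead of onData():
const pty = require('node-pty');

const terrariaServerPath = 'C:\\Program Files (x86)\\Steam\\steamapps\\common\\Terraria\\TerrariaServer.exe';
const terrariaArgs = ['-port', '7777', '-maxplayers', '8', '-world', 'test.wld'];

// node-pty gives the server a pseudo-terminal, so input is read as it
// arrives instead of waiting for stdin to be closed.
const server = pty.spawn(terrariaServerPath, terrariaArgs, { cols: 120, rows: 30 });

server.onData((data) => process.stdout.write(data));

// Later, e.g. from the !exec handler: write the command plus a carriage return.
server.write('help\r');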
