Create and communicate with a non-terminating shell via child_process in Node.js

Is there a way to spawn a single child_process in NodeJS and pass it various commands over time keeping the same process open as long as necessary? Sort of like a spawned terminal which accepts commands from Node.
Why? Performance.
I have a NodeJS/Electron application which should execute PowerShell commands, and this is achieved using Node's child_process module. However, the performance is not great: there appears to be a couple of seconds of overhead each time I spawn a child process (which is to be expected, I suppose).
This means that commands such as Get-Date take 600 ms instead of a few milliseconds, and other commands take 2+ seconds instead of, say, 800 ms.
Desired workflow:
Start a child powershell process (exec with shell = powershell)
Pass it a command
Get the results (stdout/stderr)
Wait seconds to minutes for the user...
Pass it a second command
Get the results (stdout/stderr)
etc...
Close child process
I have considered writing powershell commands from NodeJS to a file commands.txt. Next I would start a single powershell child_process which watches/tails a file for new commands and executes them, passing the output into another file which the parent (NodeJS) process watches. This seems a bit hacky however...

I have found one solution using spawn and periodically piping input to the process with stdin.write:
const { spawn } = require("child_process");

const ps1 = spawn("C:\\Windows\\SysWOW64\\WindowsPowerShell\\v1.0\\powershell.exe", [], {});
console.log("PID", ps1.pid, "started");

ps1.stdout.on('data', (data) => {
    console.log("STDOUT:" + data);
});
ps1.stderr.on('data', (data) => {
    console.log("STDERR:" + data);
});
ps1.on('close', (code, signal) => {
    console.log(`child process terminated due to receipt of signal ${signal}`);
});

setInterval(() => {
    ps1.stdin.write("Get-Date\n");
}, 1000);
Results:
PID 7688 started
STDOUT:Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
STDOUT:PS W:\powershell\powowshell\bak>
STDOUT:Get-Date
STDOUT:
STDOUT:Freitag, 17. Mai 2019 17:55:52
STDOUT:
STDOUT:PS W:\powershell\powowshell\bak>
STDOUT:Get-Date
STDOUT:
STDOUT:Freitag, 17. Mai 2019 17:55:53
So now it's "just" a case of stripping whitespace and other fuzz and getting the results.
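One way to make that stripping less ad hoc is to echo a sentinel after each command, so the parent knows exactly where one command's output ends. The sketch below is illustrative only: it assumes powershell.exe is on PATH, and the SENTINEL constant and run() helper are made up for this example; the prompt and echo lines PowerShell prints would still need filtering. The marker is assembled from two pieces inside the command so that the echoed command line itself never contains the full sentinel.

const { spawn } = require("child_process");

const SENTINEL = "__CMD_DONE__"; // hypothetical end-of-output marker

const ps1 = spawn("powershell.exe", ["-NoLogo"], {});

let buffer = "";
let pending = null;

ps1.stdout.on("data", (data) => {
    buffer += data.toString();
    const idx = buffer.indexOf(SENTINEL);
    if (idx !== -1 && pending) {
        // Everything before the marker belongs to the current command;
        // prompt/echo lines in that slice still need stripping.
        pending(buffer.slice(0, idx).trim());
        buffer = "";
        pending = null;
    }
});

// Run commands one at a time against the same long-lived process
function run(command) {
    return new Promise((resolve) => {
        pending = resolve;
        // The marker is split in two so the echoed command line
        // never matches the full sentinel:
        ps1.stdin.write(command + "; echo ('__CMD_' + 'DONE__')\n");
    });
}

run("Get-Date")
    .then((out) => { console.log(out); return run("Get-Location"); })
    .then((out) => console.log(out));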

Related

Where does the buffer come into the picture when using node.js exec function instead of spawn function?

As I read from the child_process module documentation of Node.js, I understand the difference between exec and spawn. The most crucial difference, also highlighted in a similar StackOverflow question about spawn vs. exec, is:
The main difference is that spawn is more suitable for long-running processes with huge output. That's because spawn streams input/output with a child process. On the other hand, exec buffers output in a small (by default 200K) buffer.
However, I noticed, thanks to TS IntelliSense, that both exec and spawn return a similar object of type ChildProcess. So, I could technically write this for the exec function using stdout as a stream, and it works:
function cmdAsync(cmd, options) {
    return new Promise((resolve) => {
        const proc = exec(cmd, options);
        proc.stdout?.pipe(process.stdout);
        proc.on('exit', resolve);
    });
}

cmdAsync('node server/main.mjs');
And without any buffer time/delay, I could see the logs generated by the server/main.mjs file being piped into the parent process's stdout stream.
So, my question is exactly where the buffering happens and how the streaming behavior differs between exec and spawn. Also, can I rely on this feature even if it is undocumented?
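As a hedged way to see this for yourself (not from the original question): exec exposes the same streams as spawn, but it also accumulates output internally for its callback, and exceeding maxBuffer terminates the child. Shrinking maxBuffer makes the difference visible; the inline node -e child here is just a stand-in that prints more than the artificially small buffer allows.

const { exec } = require("child_process");

const proc = exec(
    "node -e \"console.log('x'.repeat(10000))\"",
    { maxBuffer: 1024 }, // artificially small, just to trigger the limit
    (err) => console.error("exec callback:", err && err.message)
);

// Streaming works like spawn until the internal buffer limit kills the child
proc.stdout.pipe(process.stdout);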

NodeJS child spawn exits without even waiting for process to finish

I'm trying to create an Angular 11 application that connects to a NodeJS API which runs bash scripts when called; on exit it should either send an error or a 200 status with a confirmation message.
Here is one of the functions from that API. It runs a script called initialize_event.sh, feeds it a few inputs when prompted, and once the program finishes its course it should display a success message (there is no error block for this function):
exports.create_event = function (req, res) {
    var child = require("child_process").spawn;
    var spawned = child("sh", ["/home/ubuntu/master/initialize_event.sh"]);

    spawned.stdout.once("data", function (data) {
        spawned.stdin.write(req.body.name + "\n");
    });
    spawned.stdout.once("data", function (data) {
        spawned.stdin.write(req.body.domain_name + "\n");
    });
    spawned.on("exit", function (err) {
        res.status(200).send(JSON.stringify("Event created successfully"));
    });
};
The bash script is a long one, but what it basically does is take two variables (event name and domain name) and use them to create a new event instance. Here are the first few lines of the program:
#!/bin/bash
#GET EVENT NAME
echo -n "Enter event name: "; read event;
echo -n "Enter event domain: "; read eventdomain;
#LOAD VARIABLES
export eventdomain;
export event;
export ename=$event-env;
export event_rds=someurl.com;
export master_rds=otherurl.com;
export master_db=master;
# rest of code...
When called directly from the terminal, the process takes around 30-40 seconds after taking input to create an event and then exits once completed. I can then check the list of events using another script, and the new event shows up in the list. However, when I call this script from the NodeJS function, it takes the inputs and exits within 5 or 6 seconds, saying the event has been created successfully. When I check the list of events, there is no event created. I wait to see if the process is still running and check back after a few minutes; still no event created.
I suspect that the spawn exits before the script can run completely. I thought that maybe the stdio streams were still open, so I tried spawned.on("close") instead of spawned.on("exit"), but still the program exits before it runs completely. I don't see any exceptions or errors in the Node Express console, so I can't figure out why the program exits successfully without running all the way through.
I've used the same inputs when running from the terminal and in Postman, and have logged them as well to check for empty variables being sent, but found nothing wrong with them either. I've double-checked the paths as well, literally copy-pasted from pwd to make sure I haven't been missing something, but still nothing.
What am I doing wrong here??
So here's the problem I found and solved:
The folder the Node Express app was served from and the folder where the bash scripts were saved were different directories.
Problem:
So basically, whenever I created a child process, it was created with the following current directory:
var/www/html/node/
But the bash scripts were run from:
var/www/html/other/bash/scripts/
so any commands that were added to the bash script that involved directory change (like cd) were relative to the bash directory.
However, since the spawn's current directory was var/www/html/node, the script executed in the spawn had the node folder as its working directory, and any directory changes within the script were invalid because they didn't exist relative to the node directory.
E.g.
When run from terminal:
test.sh -> cd savedir/ -> /var/www/html/other/bash/scripts/savedir/ -> exists
When run from spawn:
test.sh -> cd savedir/ -> /var/www/html/node/savedir/ -> Doesn't exist!
Solution:
The easiest way I was able to solve this was to modify the test.sh file: at the start I added cd /var/www/html/other/bash/scripts/. This changed the current directory of the spawned script to the right place, making the mv, cd, and other path-dependent commands valid.
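For reference, a minimal sketch of an alternative that avoids editing the script: child_process.spawn accepts a cwd option, so the child can start in the scripts directory directly (paths reused from the answer above):

var spawned = require("child_process").spawn(
    "sh",
    ["/home/ubuntu/master/initialize_event.sh"],
    { cwd: "/var/www/html/other/bash/scripts/" } // working directory for the child
);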

Nodejs: write to stdin of bash process crashes with EPIPE

My node process gets some PDF file via HTTP request, then uses the request's onData event to pass the incoming data on to a properly configured lpr, spawned via child_process.exec. I write to the child's stdin using stdin.write(...), followed by stdin.end() when done. This allows me to print those files immediately.
Now I have a situation where I don't want the data to be piped to lpr, but to some bash script. The script uses cat to process its stdin.
myscript.sh < somefile.pdf works as expected, as does cat somefile.pdf | myscript.sh.
However, when I spawn /path/to/script.sh from node (by simply replacing lpr with the script path in the source), the process exits with
events.js:183
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at WriteWrap.afterWrite [as oncomplete] (net.js:868:14)
Subsequently, the whole node process crashes, the error sneaking past all try...catch blocks. Logging at the beginning of the bash script shows that it does not even get started.
When I target anything that's not a shell script but some compiled executable, like cat, echo, ..., everything works just fine.
Adding the epipebomb module did not change anything.
I also tried piping to process.exec("bash", ["-c cat | myscript.sh"]), with the same errors.
An example bash script, just to test for execution:
#!/usr/bin/env bash
date > logfile.txt
cat > /dev/null
EDIT:
I think maybe I need to somehow signal that the stdin stream should be kept open.
The process-spawning part of the script, leaving promisification and output processing away:
const { spawn } = require("child_process");

// inputObservable being an rxjs Observable
function execstuff(inputObservable) {
    const task = spawn("/path/to/script.sh");
    inputObservable.subscribe(
        (chunk) => task.stdin.write(chunk),
        (error) => console.error(error),
        () => task.stdin.end()
    );
}
There is an example in the child_process.spawn documentation showing how to write ps ax | grep ssh as a Node.js script; maybe it will be helpful for you:
const { spawn } = require('child_process');

const ps = spawn('ps', ['ax']);
const grep = spawn('grep', ['ssh']);

ps.stdout.on('data', (data) => {
    grep.stdin.write(data);
});

ps.stderr.on('data', (data) => {
    console.log(`ps stderr: ${data}`);
});

ps.on('close', () => {
    grep.stdin.end(); // let grep exit once ps is done
});
The first impression is that you are doing the same thing, so the problem may be in the chunk data: maybe one of the chunks is null and closes the stream before you close it yourself with task.stdin.end().
The other thing you can try is to run the script with NODE_DEBUG=stream node script.js, which will log how Node.js handles the stream internally; that may also be helpful for you.
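One more hedged suggestion, beyond the original answer: the crash is an unhandled 'error' event on the child's stdin, so attaching an error listener at least keeps EPIPE from taking down the whole Node process while you investigate why the script's stdin closes early (the script path is the placeholder from the question):

const { spawn } = require("child_process");

const task = spawn("/path/to/script.sh");
task.stdin.on("error", (err) => {
    // EPIPE means the child closed its end; log instead of crashing
    console.error("stdin error:", err.code, err.message);
});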

Execute multiple commands in Third party CLI mode using Node JS

I want to execute 5 commands in a sequence and log their output. For example, the first command XXXcli ip_address will connect me to the third-party CLI mode, the next commands will execute a script, the next will log output, etc. But my problem is that when I SSH through node.js and spawn a shell inside the SSH session, I don't see any output on my console when I execute the first command. The session creates a shell, and once the shell enters the third-party CLI, it becomes impossible for me to fire the next command or log the output of the first command. Kindly help me with this; I've been stuck for a long time.
Update:
My Code:
session.on('exec', function (accept, reject, info) {
    console.log('Client wants to execute: ' + inspect(info.command));
    var stream = accept();
    var cp = spawn('XXXCLI 10.21.254.12', { shell: true });
    stream.stdin.pipe(cp.stdin);
    cp.stdout.pipe(stream.stdout);
    sleep(6000);
    cp.stderr.pipe(stream.stderr);
    cp.on('exit', function (code, signal) {
        stream.exit(signal || code);
    }).on('end', function (code, signal) {
        stream.close();
    });
});
When I manually type the first command XXXCLI ip_address in my command prompt and press enter, I get the output "Connected to CLI....". Once this connection is successful, I need to execute my second command, i.e. "Lmc sample", which loads the master config and outputs "Message sent..". The third command executes a script and likewise outputs "Message sent..". This is what happens when I enter these commands manually in the command prompt.
What is happening is that once I execute my first command, i.e. "XXXCLI 10.21.254.12", manually in cmd, the path where we actually execute commands, i.e. (C:\users\CLI>), is no longer visible, because the session is now connected to the above-mentioned IP (10.21.254.12). Only after connecting to this IP can I execute my other commands, i.e. the command to load the master config, the command to execute the script, etc.
So I want to execute my first command, keep the stream it creates in a variable, and execute the rest of the commands inside that stream.
Thanks!
I fixed this using Child Process in Node.js and writing the commands in the stream directly. When I did the same with Java it didn't work, but it did in Node.js.
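A minimal sketch of that approach, assuming a long-lived CLI child; the executable name, arguments, and timer-based pacing are illustrative only (a robust version would wait for the expected "Message sent.." responses instead of using timers):

const { spawn } = require("child_process");

const cli = spawn("XXXCLI", ["10.21.254.12"]);

cli.stdout.on("data", (data) => console.log("CLI: " + data));
cli.stderr.on("data", (data) => console.error("CLI err: " + data));

// Write successive commands into the same stdin stream
cli.stdin.write("Lmc sample\n");
setTimeout(() => cli.stdin.write("run script\n"), 6000); // hypothetical next command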

How to restart a group of processes when it is triggered from one of them in C code

I have a few processes (*.rt) written in C.
I want to restart all of them (*.rt) from within the process foo.rt (one of the *.rt) itself (built-in C code).
Normally I have two bash scripts, stop.sh and start.sh, which are invoked from the shell.
Here is what the scripts do:
stop.sh --> sends kill -9 to all ".rt" processes.
start.sh --> starts the processes named ".rt".
My problem is how to restart all the rt's from C code. Is there any idea how to restart all "*.rt" processes, triggered from the foo.rt file?
I tried to use the following in foo.rt, but it doesn't work, because stop.sh kills all .rt processes, even the forked child which is deployed to execute the start.sh script:
...
case 708: /* There is a trigger signal here */
{
    result = APP_RES_PRG_OK;
    if (fork() == 0) { /* child */
        execl("/bin/sh", "sh", "-c", "/sbin/stop.sh", NULL);
        execl("/bin/sh", "sh", "-c", "/sbin/start.sh", NULL); /* Error: this will be killed by /sbin/stop command */
    }
}
I solved the problem with the "at" daemon on Linux.
I invoke two system() calls to schedule stop and start.
My first attempt was faulty, as explained above: execl replaces the process image and never returns, so the later execl only runs if the first one fails.
Here is my solution
case 708: /* There is a trigger signal here */
{
    system("echo '/sbin/start.sh' | at now + 2 min");
    system("echo '/sbin/stop.sh' | at now + 1 min");
}
You could use process groups, at least if all your related processes originate from the same process...
So you could write a glue program in C which sets a new process group using setpgrp(2) and stores its pid (or keeps running, waiting for some IPC).
Then you would stop that process group using killpg(2).
See also the notion of session and setsid(2).
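For contrast with the C-oriented answer above, the same process-group idea is reachable from Node.js. A hedged sketch, POSIX only, with a hypothetical script path: detached: true makes the child the leader of a new process group, and signalling the negative pid reaches the whole group.

const { spawn } = require("child_process");

const leader = spawn("/sbin/start.sh", [], { detached: true, stdio: "ignore" });
leader.unref(); // don't keep the parent alive for this child

// Later: signal every process in the child's group
process.kill(-leader.pid, "SIGTERM");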
