Node.js spawn giving ENOENT error (Raspbian)

I'm getting an error when spawning a Node.js script:
exec('node ./modules/buttons', function (error, stdout, stderr) {
  if (error) console.log(error);
  console.log(stdout);
  if (stderr) console.log(stderr);
});
exec works perfectly fine. However, spawn
var buttons = spawn('node ./modules/buttons.js', []);
buttons.stdout.on('data', function (data) {
  console.log(data);
});
Gives me the following error:
spawn node ./modules/buttons.js ENOENT
Using the absolute path to the script results in the same error. I would appreciate it if someone could help me resolve this; I have absolutely no clue what could be causing it, and Google isn't helping me either.

exec accepts the command to be executed along with all of its command-line parameters, whereas spawn accepts the program to invoke and its command-line arguments as an array.
In your case, Node.js is trying to execute a program called node ./modules/buttons.js, not node with ./modules/buttons.js as a command-line argument. That is why it is failing.
Quoting the example from the spawn docs,
const spawn = require('child_process').spawn;
const ls = spawn('ls', ['-lh', '/usr']);
The difference between exec and spawn is that exec will by default launch the command in a shell, while spawn simply invokes the program.
Note: as you are simply invoking a JavaScript file, you are better off using execFile.
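Applied to the question, a minimal corrected sketch (script path taken from the question) passes the program and its arguments separately:
const { spawn } = require('child_process');
// The program is 'node'; the script path is a separate argument, not part of the command string.
const buttons = spawn('node', ['./modules/buttons.js']);
buttons.stdout.on('data', function (data) {
  console.log(data.toString());
});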

Related

Where does the buffer come into the picture when using the Node.js exec function instead of the spawn function?

As I read the child_process module documentation of Node.js, I understand the difference between exec and spawn. The most crucial difference, also highlighted in a similar Stack Overflow question about spawn vs. exec, is:
The main difference is that spawn is more suitable for long-running processes with huge output. That's because spawn streams input/output with a child process. On the other hand, exec buffers output in a small (by default 200K) buffer.
However, I noticed, thanks to TypeScript IntelliSense, that both exec and spawn return a similar object of type ChildProcess. So I could technically write this for the exec function, using stdout as a stream, and it works:
function cmdAsync(cmd, options) {
  return new Promise((resolve) => {
    const proc = exec(cmd, options);
    proc.stdout?.pipe(process.stdout);
    proc.on('exit', resolve);
  });
}
cmdAsync('node server/main.mjs');
And without any buffering delay, I could see the logs generated by the server/main.mjs file being piped into the parent process's stdout stream.
So my question is: where exactly does the buffering happen, and how does the streaming behavior differ between exec and spawn? Also, can I rely on this feature even if it is undocumented?
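For reference, a minimal sketch (script path taken from the question, maxBuffer value arbitrary) of where exec's buffer shows up: the callback receives the output aggregated up to maxBuffer, while the returned ChildProcess still exposes stdout as a live stream.
const { exec } = require('child_process');

// exec aggregates output for the callback, capped by maxBuffer (here deliberately tiny).
const proc = exec('node server/main.mjs', { maxBuffer: 1024 }, (err, stdout, stderr) => {
  // If the aggregated output exceeds maxBuffer, err reports it and the child is killed.
  if (err) console.error(err.message);
});

// Independently of that buffer, stdout chunks still stream as they arrive.
proc.stdout?.on('data', (chunk) => process.stdout.write(chunk));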

Node.js: write to stdin of bash process crashes with EPIPE

My Node.js process gets a PDF file via an HTTP request, then uses the request's onData event to pass the incoming data on to a properly configured lpr, spawned via child_process.exec. I write to the child's stdin using stdin.write(...), followed by stdin.end() when done. This allows me to print those files immediately.
Now I have a situation where I don't want the data to be piped to lpr, but to some bash script. The script uses cat to process its stdin.
myscript.sh < somefile.pdf works as expected, as does cat somefile.pdf | myscript.sh.
However, when I spawn /path/to/script.sh from node (by simply replacing lpr with the script path in the source), the process exits with
events.js:183
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at WriteWrap.afterWrite [as oncomplete] (net.js:868:14)
Subsequently, the whole Node.js process crashes, the error slipping past all try...catch blocks. Logging at the beginning of the bash script shows that it does not even get started.
When I target anything that's not a shell script but some compiled executable, like cat, echo,... everything works just fine.
Adding the epipebomb module did not change anything.
I also tried piping to process.exec("bash", ["-c cat | myscript.sh"]), with the same errors.
An example bash script, just to test for execution:
#!/usr/bin/env bash
date > logfile.txt
cat > /dev/null
EDIT:
I think I may need to somehow signal that the stdin stream should be kept open.
The process-spawning part of the script, leaving promisification and output processing aside:
const process = require("child_process")
// inputObservable being an rxjs Observable
function execstuff(inputObservable) {
  const task = process.spawn("/path/to/script.sh");
  inputObservable.subscribe(
    chunk => task.stdin.write(chunk),
    error => console.error(error),
    () => task.stdin.end()
  );
}
There is an example in the child_process.spawn docs of how you can write ps ax | grep ssh as a Node.js script; maybe it will be helpful for you:
const { spawn } = require('child_process');
const ps = spawn('ps', ['ax']);
const grep = spawn('grep', ['ssh']);

ps.stdout.on('data', (data) => {
  grep.stdin.write(data);
});

ps.stderr.on('data', (data) => {
  console.log(`ps stderr: ${data}`);
});
The first impression is that you are doing the same thing, so the problem may be in the chunk data: maybe one of the chunks is null and closes the stream before you close it yourself with task.stdin.end().
The other thing you can try is to run the Node.js script with NODE_DEBUG=stream node script.js.
This will log how the Node.js stream internals behave, which may also be helpful for you.
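Another thing worth checking, as a hedged sketch (same script path as in the question): attach an 'error' handler to the child's stdin so a write EPIPE is logged instead of surfacing as the unhandled 'error' event that crashes the process.
const { spawn } = require('child_process');

const task = spawn('/path/to/script.sh');

// Without this handler, a failed write to stdin (EPIPE) becomes an unhandled
// 'error' event and takes down the whole Node.js process.
task.stdin.on('error', (err) => {
  console.error('stdin error:', err);
});

task.on('exit', (code, signal) => {
  console.log(`script exited with code ${code}, signal ${signal}`);
});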

How to run Linux terminal commands on Windows?

I'm not sure how to ask this, but I'd like to run the 'bash' command on Windows 10 so that I can run some Linux commands afterwards. I'm using the Electron framework and the child_process module.
var os = require('os')
var exec = require('child_process').exec
if (os.platform() == 'win32') {
  var cmd_win = 'bash'
  exec(cmd_win, function (error, stdout, stderr) {
    console.log(error)
  });
}
The code snippet gives "Error: Command failed: bash". Does anyone know why? And can you help me? I hope you understood my question.
To initialize the WSL subsystem, you must launch a (hidden) Bash console window in the background, which doesn't work if you execute bash.exe directly; it works with neither exec nor execFile.
The trick is to get the shell (cmd) process that Node.js spawns to launch bash.exe without blocking, which, unfortunately, isn't easy to do: start cannot be used, because bash.exe is a console application and therefore makes start act synchronously.
The solution is to create an auxiliary VBScript file that launches bash.exe, and to invoke that file asynchronously via wscript.exe. Note that the Bash console window is launched hidden:
var os = require('os')
var exec = require('child_process').exec
if (os.platform() === 'win32') {
  var cmd_win = '\
echo WScript.CreateObject("Shell.Application").\
ShellExecute "bash", "", "", "open", 0 > %temp%\\launchBashHidden.vbs \
& wscript %temp%\\launchBashHidden.vbs'
  exec(cmd_win, function (error, stdout, stderr) {
    if (error) console.error(error)
  });
}
Note that the auxiliary VBScript file %temp%\launchBashHidden.vbs lingers between invocations. Cleaning it up after every run would require more work (you can't just delete it right away, because wscript, due to running asynchronously, may not have loaded it yet).
By default, exec uses cmd.exe to execute commands on Windows. What you may be looking for is the shell option specified in the docs:
shell: Shell to execute the command with (default: '/bin/sh' on UNIX, 'cmd.exe' on Windows). The shell should understand the -c switch on UNIX or /s /c on Windows. On Windows, command-line parsing should be compatible with cmd.exe.
const os = require('os')
const exec = require('child_process').exec
if (os.platform() === 'win32') {
  exec('ls', {shell: 'path/to/executable.exe'}, (err, stdout, stderr) => {
    if (err) {
      console.error(err)
      return
    }
    console.log(stdout)
  })
}
I have found a short way to do that:
Install Git on your computer.
Add C:\Program Files\Git\usr\bin to your PATH variable.
Then check whether you can run Linux commands in cmd; a Node.js sketch using this setup follows below.
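Combined with the shell option shown above, a hedged sketch (the Git install path is the default and may differ on your machine):
const { exec } = require('child_process');

// Assumed default Git for Windows location; adjust if Git is installed elsewhere.
const gitBash = 'C:\\Program Files\\Git\\bin\\bash.exe';

exec('ls -la', { shell: gitBash }, (err, stdout, stderr) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(stdout);
});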

NodeJS forking a bash command using child_process

Say I want to run the following bash command: ./someCommand &. When I use the following Node.js code:
var execute = require('child_process').exec;
execute(cmd, function (err, out) {
  // Do stuff
});
I never get inside the callback, and I think the problem is that the process is waiting for ./someCommand to end, even though I've tried to fork it with &. What do I do?
execute('-cs -Some Command-', function (err, res) {
  // Do stuff
})
cs - command shell.
And you can use spawn instead of exec.
https://nodejs.org/api/child_process.html#child_process_child_process_spawn_command_args_options
If you don't want to wait, just put your code outside, after the execute call.
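If the goal is simply to fire ./someCommand off in the background and move on, one hedged sketch (command name taken from the question) is to spawn it detached and unref it so Node.js doesn't wait for it:
const { spawn } = require('child_process');

// Launch ./someCommand detached from the parent and drop the reference,
// so the Node.js event loop doesn't stay alive waiting for it to exit.
const child = spawn('./someCommand', [], {
  detached: true,
  stdio: 'ignore'
});
child.unref();

console.log('someCommand launched in the background');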

Making lftp write to stdout without having to close the process first

I'm trying to wrap the lftp program in a Node.js application using child_process. The problem is that lftp doesn't write its output to stdout, so I cannot catch its output in Node.js. Sample code:
var proc = require('child_process').spawn('lftp', ['-p', port, '-u', username + ',' + password, host]);
proc.stdout.on('data', function (data) {
  console.log('stdout:', data.toString('utf-8'));
});
proc.on('exit', function (code) {
  console.log('process exited with code ' + code);
});
proc.stdin.write('ls');
// proc.stdin.end();
If I uncomment the line that calls stdin.end() on the lftp child process, the output from the ls command appears in my terminal as it should. If I don't, the process simply hangs and nothing is output.
I've also tried using unbuffer, but it doesn't seem to allow me to write to lftp's stdin anymore. It outputs the usual "[Resolving host address...]" stuff, but not the output from the ls command.
My question is: what do I have to do to be able to interact with lftp using node.js' child_process?
Well, this was dumb. I forgot to write a newline after the ls command to stdin. It seems to work without the need for unbuffer.
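For clarity, a minimal sketch of that fix against the question's snippet: terminate the command with a newline so lftp executes it, and stdin can stay open for further commands.
// proc is the lftp child process from the snippet above.
proc.stdin.write('ls\n');
// stdin can stay open for more commands before eventually calling proc.stdin.end(), e.g.:
// proc.stdin.write('pwd\n');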
