I wrote a Java program that takes one argument and prints it to stdout.
It also reads lines from stdin and echoes them back.
Then I compiled it and ran it from TypeScript (a VS Code extension).
const { spawn } = require('node:child_process');
const runJava = spawn('java', ['-cp', 'extension1/src', 'package.Main', 'aaa']);
runJava.stdout.on('data', (data: Buffer) => {
console.log(`stdout: ${data}`);
});
runJava.stderr.on('data', (data: Buffer) => {
console.error(`stderr: ${data}`);
});
runJava.on('close', (code: number | null) => {
console.log(`child process exited with code ${code}`);
});
It writes 'aaa' to stdout successfully.
What do I need to do next to send some text to its stdin?
I tried writing to it with echo, but that didn't work.
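For illustration, a minimal sketch of writing to the child's stdin, assuming it is exposed as a writable stream and the Java program reads lines from System.in (the 'hello' text is just a placeholder):
const { spawn } = require('node:child_process');
const runJava = spawn('java', ['-cp', 'extension1/src', 'package.Main', 'aaa']);
// Write a line to the child's stdin; the Java side would read it with a
// BufferedReader or Scanner on System.in.
runJava.stdin.write('hello\n');
// Signal end-of-input so a read loop that runs until EOF can finish.
runJava.stdin.end();
runJava.stdout.on('data', (data: Buffer) => {
  console.log(`stdout: ${data}`);
});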
Related
I use child_process to execute a .sh file which downloads a file with curl, but when I call the .kill() function the process doesn't stop.
const { spawn } = require('child_process');
let command = spawn('./download.sh', ["url", "name"])
setTimeout(() => {
command.kill()
}, 2000);
command.stdout.on('data', data => {
console.log("Data")
})
command.on('exit', (code, signal) => {
console.log("exit");
})
command.on('close', (code, signal) => {
console.log("close");
})
And this is my output
Data
Data
Data
Data
exit
Data
Data
Data
...
This is my download.sh
curl {test url} --output test.mp4
But when I execute spawn with the curl command directly, the process stops, so I don't understand why.
const { spawn } = require('child_process');
let command = spawn('curl', ["test url", "--output", "test.mp4"])
setTimeout(() => {
command.kill()
}, 2000);
command.stdout.on('data', data => {
console.log("Data")
})
command.on('exit', (code, signal) => {
console.log("exit");
})
command.on('close', (code, signal) => {
console.log("close");
})
And this is my output
Data
Data
Data
Data
exit
close
If someone has an idea, please share it.
Since you mentioned that you receive all the data in both cases, there is no problem.
Processes are asynchronous, so there is no guarantee that you will receive exit before data is finished. exit simply means that your process has exited at that point (and thus, the download should be complete). If the process wrote something to standard output/standard error, that data might still be stored in some buffers until it is consumed.
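As a minimal illustration of that ordering (not your curl script, just a command that floods stdout and exits immediately; this assumes a Unix-like system where seq is available):
const { spawn } = require('child_process');
// seq prints 100000 lines and exits right away, so 'exit' typically fires
// while unread output is still sitting in the pipe buffers.
const child = spawn('seq', ['100000']);
child.stdout.on('data', () => {
  console.log('data');
});
child.on('exit', (code) => {
  // The process itself has terminated, but its stdout may not be drained yet.
  console.log('exit', code);
});
child.on('close', (code) => {
  // Fires only once stdout/stderr have been fully consumed and closed.
  console.log('close', code);
});
If you want a single "everything is done" signal, wait for 'close' rather than 'exit'.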
Consider the simple example below, where I'm printing the response from a long-running command as it arrives.
const { spawn } = require('child_process');
const ping = spawn('ping', ['www.google.com']);
ping.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
ping.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
ping.on('close', (code) => {
if (code !== 0) {
console.log(`ping process exited with code ${code}`);
}
});
This works ok. But when I try to pipe this result to grep, it stops working.
See sample below
const { spawn } = require('child_process');
const ping = spawn('ping', ['www.google.com']);
const grep = spawn('grep', ['bytes'])
ping.stdout.on('data', (data) => {
grep.stdin.write(data)
});
ping.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
ping.on('close', (code) => {
if (code !== 0) {
console.log(`ping process exited with code ${code}`);
}
grep.stdin.end();
});
grep.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
grep.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
grep.on('close', (code) => {
console.log(`grep process exited with code ${code}`);
});
The above code does not give me any results.
It looks like it is waiting for the first command to end before it pipes the result to the next.
I observed this by experimenting with the ls command.
Isn't the whole point of piping not to wait? Or am I missing something here?
Some variations I tried, with no success:
ping.stdout.pipe(grep.stdin); instead of grep.stdin.write(data), although I don't believe there is much difference between the two.
const ping = spawn('sh', ['-c', 'ping www.google.com | grep bytes']); I have tried this with non-long-running commands and it works ok including the pipe and everything.
The problem is that grep block-buffers its output by default, so until several kilobytes of output are available, it won't send anything back to node. Note that if you waited long enough with your program as-is, you'd eventually see dozens of lines all suddenly returned at once. The --line-buffered option changes this, so do spawn('grep', ['--line-buffered', 'bytes']) instead of spawn('grep', ['bytes']).
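A sketch of the adjusted pipe, assuming GNU grep (which is what provides --line-buffered); the rest is the same wiring as in the question:
const { spawn } = require('child_process');
const ping = spawn('ping', ['www.google.com']);
// --line-buffered makes grep flush after every matching line instead of
// waiting until a multi-kilobyte output buffer fills up.
const grep = spawn('grep', ['--line-buffered', 'bytes']);
// Stream ping's output straight into grep's stdin.
ping.stdout.pipe(grep.stdin);
grep.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});
grep.on('close', (code) => {
  console.log(`grep process exited with code ${code}`);
});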
I know there have been variations on this question, but none seem to cover this particular problem.
I am spawning a child process and attempting to send the output to the browser. The issue is that the ANSI coloring is not making it to the output.
I've imported ansi-to-html to render the ANSI output if I receive it, but my spawned child is not preserving it.
const process = spawn(
'bash',
[
'-ic',
'<command I am running>'
],
);
process.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
self.terminalOutput += convert.toHtml(`${data}`);
});
process.stderr.on('data', (data) => {
console.log(`stderr: ${data}`);
self.terminalOutput += convert.toHtml(`${data}`);
});
process.on('close', (code) => {
console.log(`child process exited with code ${code}`);
self.terminalOutput += convert.toHtml(`child process exited with code ${code}`)
});
So it looks like I found an answer here. `${data}` was implicitly converting the data returned from the spawned process (I believe the implicit type conversion was calling toString(), but I could be wrong here).
So to correctly pass the data to ansi-to-html, you just pass it directly:
process.stdout.on('data', (data) => {
self.terminalOutput += convert.toHtml(data);
});
process.stderr.on('data', (data) => {
self.terminalOutput += convert.toHtml(data);
});
For me, setting the environment variable FORCE_COLOR=1 results in the data including the ANSI formatting strings.
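For reference, a sketch of setting that variable on the spawned process itself (this assumes the command respects FORCE_COLOR, which chalk-based Node tools generally do; '<command I am running>' is the placeholder from the question):
const { spawn } = require('child_process');
// Forward the parent's environment plus FORCE_COLOR so the child keeps
// emitting ANSI escape codes even though its stdout is a pipe, not a TTY.
const child = spawn('bash', ['-ic', '<command I am running>'], {
  env: { ...process.env, FORCE_COLOR: '1' },
});
child.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});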
I need to execute one of the npm scripts from a different node project.
What I tried:
const installingf = spawn('cd', ['[path to directory]', '&&', 'npm', 'run', 'test']);
installingf.on('message', (data) => {
console.log(`stdout: ${data}`);
});
installingf.on('error', (data) => {
console.log(`stderr: ${data}`);
});
installingf.on('close', (code) => {
console.log(`child process exited with code ${code}`);
});
but it does not work. It shows child process exited with code 0.
I tried mkdir to test whether anything gets executed, but no directories are created; it exits with code 0 and the script is not executed.
Please help me figure out how to do this and whether I am doing something wrong.
Maybe it is because you're trying to execute the command directly without going through cmd. The cd command doesn't exist as a standalone executable on Windows (unless you created an alias), so you would need to run cmd.exe and pass it the command to run, for example:
const spawn = require('child_process').spawn;
const bat = spawn('cmd.exe', ['/c','calc.exe']);
bat.stdout.on('data', (data) => {
console.log(data);
});
bat.stderr.on('data', (data) => {
console.log(data);
});
bat.on('close', (code) => {
console.log(`Child exited with code ${code}`);
});
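Alternatively, a sketch that avoids cd entirely by using spawn's cwd option (the '[path to directory]' placeholder is from the question; shell: true lets the OS resolve npm vs npm.cmd on Windows):
const { spawn } = require('child_process');
// Run `npm run test` with the working directory set to the other project,
// so no cd is needed.
const installingf = spawn('npm', ['run', 'test'], {
  cwd: '[path to directory]',
  shell: true,
});
installingf.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});
installingf.stderr.on('data', (data) => {
  console.log(`stderr: ${data}`);
});
installingf.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});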
Important: I need to put all the commands in a batch file (test.cmd) with some logic, for example:
IF condition1 (c:\Windows\System32\schtasks.exe /Create ...)
Else (c:\Windows\System32\schtasks.exe /delete ...)
If I remove the if-else statement and leave only one command in test.cmd, code like this can execute it:
exec('some-path/test.cmd', (error, stdout, stderr) => {
if (error) {
console.log(error);
return;
}
console.log(stdout);
});
If I add the if-else statement back, does anyone know how I can pass a parameter from the node.js exec() function? In the terminal, it is easy to pass parameters, like "test.cmd para1".
You can use Node's spawn. Example:
const spawn = require('child_process').spawn;
const ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
ls.stderr.on('data', (data) => {
console.log(`stderr: ${data}`);
});
ls.on('close', (code) => {
console.log(`child process exited with code ${code}`);
});
documentation:
https://nodejs.org/api/child_process.html
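Applied to the batch file from the question, a sketch of passing a parameter with spawn (the path and 'para1' are placeholders, mirroring the "test.cmd para1" example; inside test.cmd the arguments arrive as %1, %2, ...):
const { spawn } = require('child_process');
// Arguments after the script path are forwarded to the batch file.
const bat = spawn('cmd.exe', ['/c', 'some-path/test.cmd', 'para1']);
bat.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});
bat.stderr.on('data', (data) => {
  console.log(`stderr: ${data}`);
});
bat.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});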