How to pass or output data from a Ruby script to a Node.js script?

I am looking for a way to pass or output data from one script to another script so that the latter can execute with the output that came from the first one.
Basically, I have a Ruby script with some instructions in it, and I want to pass (or output...) the result of the Ruby script to a Node.js script.
I would like help (and examples...) on how to accomplish this, and/or recommendations for techniques or technologies I might never have heard of that might do the trick.
Thank you.

You can use child_process's exec to execute a script and handle its output.
Ruby Script
# example.rb
puts "hello world"
Node Script
// example.js
const exec = require('child_process').exec
exec('ruby example.rb', function (err, stdout, stderr) {
  if (err) console.error(err)
  console.error('stderr: ' + stderr)
  console.log('stdout: ' + stdout) // logs "hello world"
})
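If you need to pass structured data rather than a plain string, one common approach (an addition of mine, not part of the original answer) is to have the Ruby script print JSON and parse it on the Node side, e.g. with require 'json' and puts({ greeting: 'hello world' }.to_json) in example.rb:
// parse.js -- assumes example.rb prints a single JSON object to stdout
const exec = require('child_process').exec

exec('ruby example.rb', function (err, stdout, stderr) {
  if (err) throw err
  const data = JSON.parse(stdout) // e.g. { greeting: 'hello world' }
  console.log(data.greeting)
})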

Related

Trying to run a python script from nodeJS and can't figure out the argument list

I have to run a python script from my node project that converts a .csv file to a .txt file.
The command for running the python script on the terminal is
python3 csv_to_rdf.py "pathToFile" "date" > "nameOfNewFile.txt"
eg. python3 csv_to_rdf.py modifiedCSV.csv 08122022 > test.txt
I am using the child_process.spawn() method in Node.js to run the script, but I can't figure out the args.
When I use the following code snippet:
I get the following error message:
My question is how do I send in "> test.txt" in my spawn() method?
This is incredible. I am a semantic developer, and I had an extremely similar file (a Python script that parsed a .csv and then converted it to .txt and then to .TriG) that I needed to run in my Node.js backend.
Short answer
A very simple solution I thought of after writing everything below is:
Python is better at writing to files than bash. Use Python to write to your file, then don't pass a redirect into your spawn() and you'll be good. Escaped characters are difficult to handle in the child_process because you have to deal with JavaScript and Python both trying to handle escaped characters at the same time. If Python doesn't have write access, then the long answer below may serve you better.
Long answer
I moved away from child processes because they can cause security issues. Any child_process that receives user input can be exploited. If you could move away from a child process and re-write your Python file in JS/TypeScript, that would be best. If you are confident in the use of spawn, then here is a solution:
Firstly, you cannot add a redirect to your spawn() because the command is python3 and, as the error suggests, python3 takes the redirect as an unrecognized argument. It literally passes it as an argument to the Python file, so sys.argv is the array you passed in the spawn. Python will take it in, but without seeing your Python code I couldn't say how the extra arguments are being handled; obviously an error is being thrown.
Remember that child_process.spawn() runs asynchronously, so stdout and stderr can be read as the process emits them (see the child_process.spawn() docs). So remove the redirect from the spawn's argument array and either run another child_process or use fs to write the output to a file as the stdout arrives:
const { spawn } = require('node:child_process');

// The script name must come first in the argument list, before its own args.
const childPython = spawn('python3', ['csv_to_rdf.py', 'modifiedCSV.csv', '08182022']);

childPython.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});
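A minimal sketch of the fs route, reusing the script name and arguments from the question: stream the child's stdout straight into test.txt in place of the shell redirect.
const { spawn } = require('node:child_process');
const fs = require('node:fs');

const out = fs.createWriteStream('test.txt'); // takes the place of "> test.txt"
const childPython = spawn('python3', ['csv_to_rdf.py', 'modifiedCSV.csv', '08122022']);

childPython.stdout.pipe(out); // write stdout to the file as it arrives
childPython.stderr.on('data', (data) => {
  console.error(`stderr: ${data}`);
});
childPython.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});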
Or you could use exec instead.
const { exec } = require('node:child_process');
const path = "modifiedCSV.csv";
const date = "08182022";
// The whole string runs in a shell here, so the "> test.txt" redirect works.
exec(`python3 csv_to_rdf.py ${path} ${date} > test.txt`, (error, stdout, stderr) => {
  if (error) {
    console.error(`exec error: ${error}`);
    return;
  }
  console.log(`stdout: ${stdout}`);
  console.error(`stderr: ${stderr}`);
});
The main issue is: if the path to the modified csv file is in any way connected to a user's input, you have a security risk, because the command string is run by a shell with the same privileges as your Node process; a user could inject shell syntax or Python code in place of the file path and re-write or encrypt every file the process can touch.
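One way to reduce that risk (a suggestion of mine, not from the original answer) is to skip the shell entirely and pass user-supplied values as an argument array to execFile, so shell metacharacters in the input are treated as literal text:
const { execFile } = require('node:child_process');

const userPath = 'modifiedCSV.csv'; // imagine this value came from user input

// No shell is spawned, so "> file", ";", "&&" etc. in userPath are inert.
execFile('python3', ['csv_to_rdf.py', userPath, '08182022'], (error, stdout, stderr) => {
  if (error) {
    console.error(`execFile error: ${error}`);
    return;
  }
  console.log(`stdout: ${stdout}`);
});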

shelljs performance is slow

I have been using shelljs.
On my super fast system I execute this:
var shell = require('shelljs')
const exec = require('child_process').exec
console.time('shell mktemp -d')
shell.exec('mktemp -d', { silent: true })
console.timeEnd('shell mktemp -d')

console.time('child exec mktemp -d')
// exec() takes the full command string; an args array is execFile()'s signature.
exec('mktemp -d', function (error, stdout, stderr) {
  if (error) {
    console.error('stderr', stderr)
    throw error
  }
  console.log('exec stdout', stdout)
  console.timeEnd('child exec mktemp -d')
})
It gives the following execution times:
shell mktemp -d: 208.126ms
exec stdout /tmp/tmp.w22tyS5Uyu
child exec mktemp -d: 48.812ms
Why is shelljs 4 times slower? Any thoughts?
Your code example compares async child_process.exec() with sync shell.exec(), which isn't entirely a fair comparison. I think you'll find shell.exec(..., { async: true }) performs a bit better: this is because sync shell.exec() does extra work to provide real-time stdio while still capturing stdout/stderr/return code as part of its return value; async shell.exec() can provide the same feature mostly for free.
Even with { silent: true }, the extra work is still necessary. shell.exec() is built on top of child_process.execSync(), which only returns stdout; shelljs has to do the same extra work in order to also return the exit code and stderr.
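For example, a sketch with the same mktemp command from the question:
const shell = require('shelljs')

// The async form streams output as it arrives instead of doing the extra
// synchronous capture work, so it avoids most of the overhead.
shell.exec('mktemp -d', { silent: true, async: true }, function (code, stdout, stderr) {
  console.log('exit code:', code)
  console.log('stdout:', stdout)
})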
Have a look at how shelljs is implemented:
It relies entirely on Node's fs library, which is cross-platform and written in C++, but not as performant as plain C. More generally, you can't get the performance of C from JS...
Another thing: abstraction layers.
With exec(command), the command is handed to the OS, which creates a process and executes the command in it.
When using shelljs, there are many mechanisms in place to ensure cross-platform behavior while keeping the abstraction of your command as a function and its result as a variable. See the code of exec in shelljs:
https://github.com/shelljs/shelljs/blob/master/src/exec.js
It is not really doing the same thing as your line of code.
Hope that helps!

fetching 'rsync' output with nodejs child_process.exec callback

Currently I'm failing to fetch the rsync output when calling Node.js' child_process.exec with a callback function, as in this snippet:
var exec = require('child_process').exec;

var cmd = 'rsync -rpz test/test-files/one.txt jloos@test.mygnia.de:~/remote-test/a/b/';
exec(cmd, function (error, stdout, stderr) {
  console.log('s: ' + stdout);
  console.log('e: ' + stderr);
});
I think this is caused by the specific behavior of rsync. rsync communicates with its counterpart via a terminal. So how can I fetch the messages from rsync, if that is even possible?
When I use cmd = 'ls -la' I get the expected output.
Thanks
Often stdout is buffered when the program isn't running in a virtual terminal.
Many languages have a pty module which will trick the program into behaving as though it is running in a terminal.
This module provides that functionality for Node.js:
https://github.com/chjj/pty.js
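A minimal sketch with node-pty, the actively maintained successor to pty.js; the file and host are just the ones from the question:
const pty = require('node-pty');

// Run rsync inside a pseudo-terminal so it behaves as if attached to a TTY.
const term = pty.spawn('rsync', ['-rpz', 'test/test-files/one.txt', 'jloos@test.mygnia.de:~/remote-test/a/b/'], {
  name: 'xterm-color',
  cols: 80,
  rows: 30
});

term.onData((data) => {
  process.stdout.write(data); // includes rsync's interactive progress output
});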
Keep in mind that rsync may be writing lots of special characters or using something like ncurses to provide the updating status messages, which may make it more difficult to work with the output.

Making lftp write to stdout without having to close the process first

I'm trying to wrap the lftp program in a node.js application, using child_process. The problem is that lftp doesn't write its output to stdout, so I cannot catch its output in node.js. Sample code:
var proc = require('child_process').spawn('lftp', ['-p', port, '-u', username + ',' + password, host]);

proc.stdout.on('data', function (data) {
  console.log('stdout:', data.toString('utf-8'));
});

proc.on('exit', function (code) {
  console.log('process exited with code ' + code);
});

proc.stdin.write('ls');
// proc.stdin.end();
If I uncomment the line that calls stdin.end() on the lftp child process, the output from the ls command appears in my terminal as it should. If I don't, the process simply hangs and nothing gets output.
I've also tried using unbuffer, but it doesn't seem to allow me to write to lftp's stdin anymore. It outputs the usual "[Resolving host address...]" stuff, but not the output from the ls command.
My question is: what do I have to do to be able to interact with lftp using node.js' child_process?
Well, this was dumb. I forgot to write a newline after the ls command to stdin. It seems to work without the need for unbuffer.
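That is, the fix is simply to terminate the command with a newline:
proc.stdin.write('ls\n'); // lftp doesn't execute the command until it sees a newline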

How can I check filesystem status using only core Node.js (like the 'df' command)?

I want to implement a Node.js program to check the status of filesystems (such as ext3). However, the fs module only provides operations on files. Must I use some third-party module?
One option would be to capture the output of the 'df' command and parse it.
You can run commands using child processes.
http://nodejs.org/docs/latest/api/child_processes.html#child_process.exec
var child_process = require('child_process');

child_process.exec('df', function (err, stdout, stderr) {
  // 'stdout' here is a string containing the things printed by 'df'
  console.log(stdout);
});
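If you then want the data as objects rather than raw text, here is a minimal parsing sketch. It assumes the POSIX output format that df -P guarantees (six whitespace-separated columns per filesystem); layouts otherwise vary by platform.
var child_process = require('child_process');

// -P forces the POSIX output format: one line per filesystem.
child_process.exec('df -P', function (err, stdout, stderr) {
  if (err) throw err;
  var lines = stdout.trim().split('\n').slice(1); // drop the header row
  var filesystems = lines.map(function (line) {
    var cols = line.split(/\s+/);
    return {
      filesystem: cols[0],
      used: Number(cols[2]),
      available: Number(cols[3]),
      capacity: cols[4],
      mountedOn: cols[5]
    };
  });
  console.log(filesystems);
});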
