I am trying to use the Conquest DICOM server to retrieve a DICOM file.
It ships with an executable, dgate.exe, that can execute Lua scripts.
So I create a Lua script and run it using the Node.js exec child process.
Inside the script, I save the original DICOM file for debugging, and I can see it is a valid DICOM file.
But in Node.js, after I save the stdout from the child process, the result is an invalid DICOM file.
After comparing the binary files, I think the exec child process is adding extra CR bytes to the result; the original file has only LF.
How can I get rid of the CR bytes in the stdout of the exec child process before adding it to the image buffer?
--lua script
function readslice(pslice)
local slice = pslice or '86557:1.3.46.670589.30.1.6.1.116520970982.1481396381703.2'
local remotecode =
[[
local ofile=']]..slice..[[';
local outfile = tempfile('.dcm')
local x = DicomObject:new()
x:Read(ofile)
x:Script('save to teste2.dcm')
x:Write(outfile)
returnfile = outfile
]]
r=servercommand('lua:'..remotecode)
io.write(r)
end
-- executing lua script in node
const { exec, spawn } = require("child_process");
const cmd = `--dolua:dofile([[${APIFOLDER}/queryfunctions.lua]]);readslice([[${slice}]])`;
const dgateEx = `${APIFOLDER}/dgate -p${CQPORT} -h${CQAE} -q${CQIP} -w${global.appRoot}/api/dicomapi`;
exec(`${dgateEx} "${cmd}"`,
{ encoding: "binary", maxBuffer: 30000 * 1024 },
function (error, stdout, stderr) {
// save file for testing - results in an invalid DICOM file
fs.writeFileSync("test21.dcm", Buffer.from(stdout, "binary"));
})
.....
Related
I'm calling an async Node.js function that uses prompts (https://www.npmjs.com/package/prompts).
Basically, the user is presented with options, and after they select one, I want the selection output to a variable in bash. I cannot get this to work: it either hangs, or outputs everything, since prompts is a user interface that uses stdout.
//nodefunc.js
async function run() {
await blahhhh;
return result; // text string
}
console.log(run());
// bash
x=$(node nodefunc.js)
echo $x
Unless you can ensure nothing else in the node script will print to stdout, you will need a different approach.
I'd suggest having the node script write to a temporary file, and have the bash script read the output from there.
Something like this perhaps:
const fs = require('fs');
const outputString = 'I am output';
fs.writeFileSync('/tmp/node_output.txt', outputString);
# bash
node nodefunc.js
# Assuming the node script ran successfully, read the output file
x=$(</tmp/node_output.txt)
echo "$x"
# Optionally, cleanup the tmp file
rm /tmp/node_output.txt
Trying to spawn a process with node and read its output.
I would like the output to be in a file and to be able to read it.
This is the code I have so far, but it throws an error -
const outFile = fs.openSync('out.log', 'a')
const errFile = fs.openSync('err.log', 'a')
const child = childProcess.spawn('node', [pathToJsFile], {
stdio: ['ignore', outFile, errFile],
detached: true
})
child.unref()
console.log(child.stdio)
console.log('waiting for output')
child.stdio[1].on('data', (data)=> { // ==> get error since stdio[1] is null
As mentioned in the comment, when I look in child.stdio I see [null, null, null]
However, when I look at the file, I can see the output is written.
I am using node 4.2.1
What am I doing wrong and how can I make this work?
You are wiring the child process's output up to the filesystem file 'out.log', so it goes there and is therefore not also available via stdout. You'll need to read the output file directly by its filesystem path using the fs core module: var outBuffer = fs.readFileSync('out.log');
Is it possible to make/use custom text commands in a batch file while running a nodejs server through it?
//Current batch file
node nodeServer.js
//nodeServer.js
function list(){
//insert query
}
function unlist(){
//delete query
}
As of now, after I start the batch file, nodeServer.js is started and the batch stops accepting any input.
I'd like to be able to type "nodeServer.js list" (in the batch window) and with that call a function called "list" inside nodeServer.js.
I'm looking to insert data about the server into a database by running an insert query with the "list" function, and run a delete query with "nodeServer.js unlist" to remove the inserted row before shutting down the server again.
I'm unfamiliar with batch files. Is this possible?
Update
To Clarify..
I want to type a text command IN the batch window, AFTER it has started the nodejs server, to run a specific function found inside nodeServer.js.
You want to send commands to Node.js after the node process has started.
To start a command from the batch file without pausing it, use start.
To send the commands I will use a simple text file: the batch file writes to it with echo, and Node.js reads it with watchFile and readFileSync.
I will support sending a function name plus arguments separated by spaces, for example: list a b c
The BAT file:
@echo off
REM `@echo off` makes the bat file not echo each command as it runs
REM `REM` does nothing; it's a comment, exactly like // in javascript
REM This will start NodeJS without pause the bat
start node myNode.js
REM delete the command file if it exists
del command
REM Send 3 commands to the file. You can also add parameters
echo list >> command
echo list a b c>> command
echo unlist>> command
The NodeJS file (myNode.js):
var fs = require('fs')
var filename = __dirname + '/command'
// Add here any command you want to be run by the BAT file
var commands = {
list: function() {
console.log('list', arguments)
},
unlist: function() {
console.log('unlist', arguments)
}
}
console.log('watching:' + filename)
if (fs.existsSync(filename)) {
console.log('File exists, running initial command')
runCommand()
}
fs.watchFile(filename, runCommand)
function runCommand() {
  if (!fs.existsSync(filename)) return
  var lines = fs.readFileSync(filename, 'utf-8').split('\r\n')
  console.log(lines)
  fs.unlinkSync(filename) // the async fs.unlink requires a callback
  for (var i = 0; i < lines.length; i++) {
    var line = lines[i].trim()
    if (!line) continue // skip empty lines
    console.log(line)
    line = line.split(' ') // Split into command and arguments
    var command = line[0]
    var args = line.slice(1)
    if (!commands[command]) {
      console.log('Command not found: "' + command + '"')
      continue // keep processing the remaining lines
    }
    console.log('Running command:', command, '(', args, ')')
    commands[command].apply(null, args)
  }
}
Read more about the Node.js fs module: https://nodejs.org/docs/latest/api/fs.html#fs_class_fs_stats
I'm trying to execute Inkscape by passing data via stdin. Inkscape only supports this via /dev/stdin. Basically, I'm trying to do something like this:
echo "<svg>...</svg>" | inkscape -z -f /dev/stdin -A /dev/stdout
I don't want to have to write the SVG to disk.
I tried just using stdin.write(), but it doesn't work (maybe because of /dev/stdin):
var cmd = spawn("inkscape", ["-z", "-f", "/dev/stdin", "-A", "/dev/stdout"], {encoding: "buffer", stdio: ["pipe", stdoutPipe, "pipe"]});
cmd.stdin.write(svg);
This does work, but I have to write the SVG to disk:
var cmd = spawn("inkscape", ["-z", "-f", "/dev/stdin", "-A", "/dev/stdout"], {encoding: "buffer", stdio: [fs.openSync('file.svg', "a"), stdoutPipe, "pipe"]});
I tried passing a stream to stdio, but I just keep getting TypeError: Incorrect value for stdio stream: [object Object]
Any ideas?
Addendum
The examples use Inkscape, but my question applies to any arbitrary program using /dev/stdin.
By the way, this would work for me:
var exec = require('child_process').exec;
exec("echo \"<svg>...</svg>\" | inkscape -z -f /dev/stdin -A /dev/stdout | cat", function (error, stdout, stderr) {});
Except, my SVG is too long, so it throws an error: Error: spawn Unknown system errno 7
Alright, I don't have Inkscape, but this appears to solve the Node.js side of things. I'm using wc as a stand-in for Inkscape; the -c option simply outputs the number of bytes in a given file (in this case /dev/stdin).
var child_process = require('child_process');
/**
* Create the child process, with output piped to the script's stdout
*/
var wc = child_process.spawn('wc', ['-c', '/dev/stdin']);
wc.stdout.pipe(process.stdout);
/**
* Write some data to stdin, and then use stream.end() to signal that we're
* done writing data.
*/
wc.stdin.write('test');
wc.stdin.end();
The trick seems to be signaling that you're done writing to the stream. Depending on how large your SVG is, you may need to pay attention to backpressure from Inkscape by handling the 'drain' event.
As for passing a stream into the child_process.spawn call, you instead need to use the 'pipe' option, and then pipe a readable stream into child.stdin, as shown below. I know this works in Node v0.10.26, but not sure about before that.
var stream = require('stream');
var child_process = require('child_process');
/**
* Create the child process, with output piped to the script's stdout
*/
var wc = child_process.spawn('wc', ['-c', '/dev/stdin'], {stdio: 'pipe'});
wc.stdout.pipe(process.stdout);
/**
* Build a readable stream with some data pushed into it.
*/
var readable = new stream.Readable();
readable._read = function noop() {}; // See note below
readable.push('test me!');
readable.push(null);
/**
* Pipe our readable stream into wc's standard input.
*/
readable.pipe(wc.stdin);
Obviously, this method is a bit more complicated, and you should use the method above unless you have a good reason not to (you're effectively implementing your own readable stream).
Note: The readable._read function must be implemented according to the docs, but it doesn't necessarily have to do anything.
So, I figured out a workaround. It seems like a bit of a hack, but it works just fine.
First, I made this one line shell script:
cat | inkscape -z -f /dev/stdin -A /dev/stdout | cat
Then, I simply spawn that file and write to the stdin like this:
var cmd = spawn("shell_script");
cmd.stdin.write(svg);
cmd.stdin.end();
cmd.stdout.pipe(pipe);
I really think this should work without the shell script, but it won't (for me at least). This may be a Node.js bug.
The problem comes from the fact that file descriptors in node are sockets and that linux (and probably most Unices) won't let you open /dev/stdin if it's a socket.
I found this explanation by bnoordhuis on https://github.com/nodejs/node-v0.x-archive/issues/3530#issuecomment-6561239
The given solution is close to nmrugg's answer:
var run = spawn("sh", ["-c", "cat | your_command_using_dev_stdin"]);
After further work, you can now use the https://www.npmjs.com/package/posix-pipe module to make sure that the process sees a stdin that is not a socket.
Look at the 'should pass data to child process' test in this module, which boils down to:
var p = pipe()
var proc = spawn('your_command_using_dev_stdin', [ .. '/dev/stdin' .. ],
{ stdio: [ p[0], 'pipe', 'pipe' ] })
p[0].destroy() // important to avoid reading race condition between parent/child
proc.stdout.pipe(destination)
source.pipe(p[1])
As Inkscape bug 171016 indicates, Inkscape does not support importing via stdin, but it is on their Wishlist.
I'm trying to launch a shell command from Node.js, without redirecting that command's input and output -- just like shelling out to a command using a shell script, or using Ruby's system command. If the child process wants to write to STDOUT, I want that to go straight to the console (or get redirected, if my Node app's output was redirected).
Node doesn't seem to have any straightforward way to do this. It looks like the only way to run another process is with child_process, which always redirects the child process's input and output to pipes. I can write code to accept data from those pipes and write it to my process's STDOUT and STDERR, but if I do that, the APIs force me to sacrifice some flexibility.
I want two features:
Shell syntax. I want to be able to pipe output between commands, or run Windows batch files.
Unlimited output. If I'm shelling out to a compiler and it wants to generate megabytes of compiler warnings, I want them all to scroll across the screen (until the user gets sick of it and hits Ctrl+C).
It looks like Node wants to force me to choose between those two features.
If I want an unlimited amount of output, I can use child_process.spawn and then do child.stdout.on('data', function(data) { process.stdout.write(data); }); and the same thing for stderr, and it'll happily pipe data until the cows come home. Unfortunately, spawn doesn't support shell syntax.
If I want shell syntax, I can use child_process.exec. But exec insists on buffering the child process's STDOUT and STDERR for me and giving them to me all at the end, and it limits the size of those buffers (configurable, 200K by default). I can still hook the on('data') events, if I want to see the output as it's generated, but exec will still add the data to its buffers too. When the amount of data exceeds the predefined buffer size, exec will terminate the child process.
(There's also child_process.execFile, which is the worst of both worlds from a flexibility standpoint: no shell syntax, but you still have to cap the amount of output you expect.)
Am I missing something? Is there any way to just shell out to a child process in Node, and not redirect its input and output? Something that supports shell syntax and doesn't crap out after a predefined amount of output, just like is available in shell scripts, Ruby, etc.?
You can inherit the stdin/stdout/stderr streams via a spawn option, so you don't need to pipe them manually:
var spawn = require('child_process').spawn;
spawn('ls', [], { stdio: 'inherit' });
Use a shell for shell syntax; for sh/bash it's the -c parameter that reads the script from a string:
var spawn = require('child_process').spawn;
var shellSyntaxCommand = 'ls -l | grep test | wc -c';
spawn('sh', ['-c', shellSyntaxCommand], { stdio: 'inherit' });
To summarise:
var spawn = require('child_process').spawn;
function shspawn(command) {
spawn('sh', ['-c', command], { stdio: 'inherit' });
}
shspawn('ls -l | grep test | wc -c');
You can replace exec with spawn and use shell syntax simply with:
const {spawn} = require ('child_process');
const cmd = 'ls -l | grep test | wc -c';
const p = spawn (cmd, [], {shell: true});
p.stdout.on ('data', (data) => {
console.log (data.toString ());
});
The magic is just {shell: true}.
I haven't used it, but I've seen this library: https://github.com/polotek/procstreams
With it, you'd do this. The .out() automatically pipes to the process's stdout.
var $p = require('procstreams');
$p('cat lines.txt').pipe('wc -l').out();
It doesn't support shell syntax, but that's pretty trivial to emulate, I think:
var command_str = "cat lines.txt | wc -l";
var cmds = command_str.split(/\s?\|\s?/);
var cmd = $p(cmds.shift());
while(cmds.length) cmd = cmd.pipe(cmds.shift());
cmd
.out()
.on('exit', function() {
// Do whatever
});
There's an example in the node docs for the child_process module:
Example of detaching a long-running process and redirecting its output to a file:
var fs = require('fs'),
spawn = require('child_process').spawn,
out = fs.openSync('./out.log', 'a'),
err = fs.openSync('./out.log', 'a');
var child = spawn('prg', [], {
detached: true,
stdio: [ 'ignore', out, err ]
});
child.unref();