How to force UTF-8 in node js with exec process? - node.js

I know the solution is probably simple, but I've been banging my head against this for an hour.
On Windows 10, if I run the command "dir" in a prompt, I get this result:
Il volume nell'unità D non ha etichetta.
In Node.js I try to exec the dir command like this:
var child = exec('dir', {'encoding': 'UTF-8'}, (err, stdout, stderr) => {
console.log(stdout);
});
and I get this result:
Il volume nell'unit� C non ha etichetta.
Ah, damned accented letter!
I tried using UTF-16 and then converting to a string:
var child = exec('dir', {'encoding': 'UTF-16'}, (err, stdout, stderr) => {
let b: Buffer = stdout;
let o: string;
o = stdout.toString('UTF-8');
console.log(o);
});
I get the same cursed result:
"Il volume nell'unit� C non ha etichetta."
Can you help me solve this puzzle?
What am I doing wrong?
It almost seems that the exec command does not accept UTF-8 encoding.
In fact, if I run this script to force conversion from UTF-8 to string:
var child = exec(j.cmd, {'encoding': 'UTF-8'}, (err, stdout, stderr) => {
var utf8 = require('utf8');
var o: string = utf8.decode(stdout)
console.log(o);
});
I get this error:
..\node_modules\utf8\utf8.js:194
throw Error('Invalid UTF-8 detected');
Any idea?

When you use dir in the command prompt, the renderer knows which character encoding stdout is using; it decodes the text bytes and renders the characters with the selected font.
When you exec a command, Node does not know which character encoding stdout is using, so you have to tell it. The problem is that you are telling it the wrong thing. To see which character encoding the console is using, run chcp. But, out of the box, Node only supports a few of the dozens of character encodings.
The solution is to tell the command prompt to use one they have in common. Since you are getting paths from the filesystem, and the filesystem (NTFS) uses the Unicode character set for paths, UTF-8 is a great choice.
So, this should work:
exec('chcp 65001 >nul & dir', {encoding: "UTF-8"},
(err, stdout, stderr) => console.log(stdout));
But the chcp command has a delayed effect and isn't applied to the dir command on the same line. Here is one way of working around that:
exec('chcp 65001 >nul & cmd /d/s/c dir', {encoding: "UTF-8"},
(err, stdout, stderr) => console.log(stdout));
Running a batch file might be a simpler way to get two separate commands to run with sequential effect, but that would require setup and cleanup.

Node.js does not support the encoding used by the Windows cmd.exe, so you have to use a library like iconv-lite to convert it.
const iconv = require('iconv-lite'); // npm install iconv-lite
var child = exec('dir', {'encoding': 'buffer'}, (err, stdout, stderr) => {
console.log(iconv.decode(stdout, 'CP936'));
});
Using 'buffer' as the encoding gives you the original raw bytes.
Replace 'CP936' with your own console's code page. You can find it under cmd.exe > Properties > codepage.
Update: your country's encoding also works fine. In my case, 'GB18030' and 'GBK' are supersets of 'CP936', so both can be used.
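To see why decoding as UTF-8 mangles this output, note that CP936 byte sequences are not valid UTF-8. A quick sketch (the two bytes are an illustrative CP936 sequence):

```javascript
// 0xD6 0xD0 is "中" in CP936/GBK. Decoded as UTF-8, neither byte forms a
// valid sequence, so each becomes U+FFFD — the "�" seen in the question.
const raw = Buffer.from([0xd6, 0xd0]);
console.log(raw.toString('utf8')); // two replacement characters

// Keeping the Buffer intact (encoding: 'buffer') preserves the bytes so a
// library such as iconv-lite can decode them with the right code page.
console.log(raw);
```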

Related

Get image file with node and exec child process

I am trying to use the Conquest DICOM server and get a DICOM file.
It has an '.exe' called 'dgate.exe' that is able to execute Lua scripts.
So, I create a Lua script and run it using the Node.js exec child process.
Inside the script, I save the original DICOM file to debug; I can see it is a valid DICOM file.
But in Node.js, after I save the stdout from the child process, the result is an invalid DICOM file.
After comparing the binary files, I think the exec child process is adding extra CR bytes to the result. The original file has only LF.
How can I get rid of the CR characters from the stdout of the exec child process before adding them to the image buffer?
--lua script
function readslice(pslice)
local slice = pslice or '86557:1.3.46.670589.30.1.6.1.116520970982.1481396381703.2'
local remotecode =
[[
local ofile=']]..slice..[[';
local outfile = tempfile('.dcm')
local x = DicomObject:new()
x:Read(ofile)
x:Script('save to teste2.dcm')
x:Write(outfile)
returnfile = outfile
]]
r=servercommand('lua:'..remotecode)
io.write(r)
end
-- executing lua script in node
const { exec, spawn } = require("child_process");
const cmd = `--dolua:dofile([[${APIFOLDER}/queryfunctions.lua]]);readslice([[${slice}]])`;
const dgateEx = `${APIFOLDER}/dgate -p${CQPORT} -h${CQAE} -q${CQIP} -w${global.appRoot}/api/dicomapi`;
exec(`${dgateEx} "${cmd}"`,
{ encoding: "binary", maxBuffer: 30000 * 1024 },
function (error, stdout, stderr) {
// save file for test - results at invalid file dicom file
fs.writeFileSync("test21.dcm", Buffer.from(stdout, "binary"));
})
.....

Problems forming Imagemagick commands using nodejs gm module

I am trying to run an ImageMagick command on AWS Lambda using the gm module. I keep getting the error `no decode delegate for this image format '' @ error/constitute.c/ReadImage/544`. I believe this error indicates that my syntax for the command isn't correct. I've tried many ways. I can run this command fine on the command line on my Linux system.
Here's the command. (adapted from here)
convert test.jpg -crop 120x120+300+300 -colorspace gray -format "%[fx:100*mean]%%" info:
Here's my function.
gm(imgobj,'test.jpg').command('convert')
.in('-crop', '120x120+300+300','-colorspace','gray','-format','%[fx:100*mean]%%')
.out('info:')
.stream(function (err, stdout, stderr) {
});
The gm nodejs module is here.
SOLVED!
gm(imgobj,'test.jpg').command('convert')
.in('-crop', '120x120+300+300')
.in('-colorspace', 'gray')
.toBuffer(function(err, buffer) {
if(err) throw err;
gm(buffer, 'test.jpg').identify({bufferStream: true, format:'%[fx:100*mean]'},function(err, data) {
if(err) throw err;
console.log('identify',data);
});
});
The documentation mentions this "GOTCHA":
When working with input streams and any 'identify' operation (size,
format, etc), you must pass {bufferStream: true} if you also need to
convert (write() or stream()) the image afterwards (NOTE: this buffers
the readStream in memory!).
The documentation says to use gm().identify(format, callback), and this seems to work for me without setting bufferStream: true. I suppose that is correct since I do not "need to stream the image afterwards." However, for general knowledge, I looked at the source and figured out how to pass both params, {bufferStream: true, format:'%[fx:100*mean]'}, format being the escape argument.

How do I open a terminal application from node.js?

I would like to be able to open Vim from a node.js program running in the terminal, create some content, save and exit Vim, and then grab the contents of the file.
I'm trying to do something like this:
filename = '/tmp/tmpfile-' + process.pid
editor = process.env['EDITOR'] ? 'vi'
spawn editor, [filename], (err, stdout, stderr) ->
text = fs.readFileSync filename
console.log text
However, when this runs, it just hangs the terminal.
I've also tried it with exec and got the same result.
Update:
This is complicated by the fact that this process is launched from a command typed at a prompt with readline running. I completely extracted the relevant parts of my latest version out to a file. Here it is in its entirety:
{spawn} = require 'child_process'
fs = require 'fs'
tty = require 'tty'
rl = require 'readline'
cli = rl.createInterface process.stdin, process.stdout, null
cli.prompt()
filename = '/tmp/tmpfile-' + process.pid
proc = spawn 'vim', [filename]
#cli.pause()
process.stdin.resume()
indata = (c) ->
proc.stdin.write c
process.stdin.on 'data', indata
proc.stdout.on 'data', (c) ->
process.stdout.write c
proc.on 'exit', () ->
tty.setRawMode false
process.stdin.removeListener 'data', indata
# Grab content from the temporary file and display it
text = fs.readFile filename, (err, data) ->
throw err if err?
console.log data.toString()
# Try to resume readline prompt
cli.prompt()
The way it works, as shown above, is that it shows a prompt for a couple of seconds and then launches into Vim, but the TTY is messed up. I can edit and save the file, and the contents are printed correctly. There is a bunch of junk printed to the terminal on exit as well, and readline functionality is broken afterward (no Up/Down arrow, no Tab completion).
If I uncomment the cli.pause() line, then the TTY is OK in Vim, but I'm stuck in insert mode, and the Esc key doesn't work. If I hit Ctrl-C it quits the child and parent process.
You can inherit stdio from the main process.
const child_process = require('child_process')
var editor = process.env.EDITOR || 'vi';
var child = child_process.spawn(editor, ['/tmp/somefile.txt'], {
stdio: 'inherit'
});
child.on('exit', function (e, code) {
console.log("finished");
});
More options here: http://nodejs.org/api/child_process.html#child_process_child_process_spawn_command_args_options
Update: My answer applied at the time it was created, but for modern versions of Node, look at this other answer.
First off, your usage of spawn isn't correct. Here are the docs. http://nodejs.org/docs/latest/api/child_processes.html#child_process.spawn
Your sample code makes it seem like you expect vim to automatically pop up and take over the terminal, but it won't. The important thing to remember is that even though you may spawn a process, it is up to you to make sure that the data from the process makes it through to your terminal for display.
In this case, you need to take data from stdin and send it to vim, and you need to take data output by vim and set it to your terminal, otherwise you won't see anything. You also need to set the tty into raw mode, otherwise node will intercept some of the key sequences, so vim will not behave properly.
Next, don't do readFileSync. If you come upon a case where you think you need to use a sync method, then chances are, you are doing something wrong.
Here's a quick example I put together. I can't vouch for it working in every single case, but it should cover most cases.
var tty = require('tty');
var child_process = require('child_process');
var fs = require('fs');
function spawnVim(file, cb) {
var vim = child_process.spawn( 'vim', [file])
function indata(c) {
vim.stdin.write(c);
}
function outdata(c) {
process.stdout.write(c);
}
process.stdin.resume();
process.stdin.on('data', indata);
vim.stdout.on('data', outdata);
tty.setRawMode(true);
vim.on('exit', function(code) {
tty.setRawMode(false);
process.stdin.pause();
process.stdin.removeListener('data', indata);
vim.stdout.removeListener('data', outdata);
cb(code);
});
}
var filename = '/tmp/somefile.txt';
spawnVim(filename, function(code) {
if (code == 0) {
fs.readFile(filename, function(err, data) {
if (!err) {
console.log(data.toString());
}
});
}
});
Update
I seeee. I don't think readline is as compatible with all of this as you would like unfortunately. The issue is that when you createInterface, node kind of assumes that it will have full control over that stream from that point forward. When we redirect that data to vim, readline is still there processing keypresses, but vim is also doing the same thing.
The only way around this that I see is to manually disable everything from the cli interface before you start vim.
Just before you spawn the process, we need to close the interface, and unfortunately manually remove the keypress listener because, at least at the moment, node does not remove it automatically.
process.stdin.removeAllListeners 'keypress'
cli.close()
tty.setRawMode true
Then in the process 'exit' callback, you will need to call createInterface again.
I tried to do something like this using Node's repl library - https://nodejs.org/api/repl.html - but nothing worked. I tried launching vscode and TextEdit, but on the Mac there didn't seem to be a way to wait for those programs to close. Using execSync with vim, nano, and micro all acted strangely or hung the terminal.
Finally I switched to using the readline library using the example given here https://nodejs.org/api/readline.html#readline_example_tiny_cli - and it worked using micro, e.g.
import { execSync } from 'child_process'
...
case 'edit':
const cmd = `micro foo.txt`
const result = execSync(cmd).toString()
console.log({ result })
break
It switches to micro in a Scratch buffer - hit ctrl-q when done, and it returns the buffer contents in result.

NodeJS: Asynchronous file read problems

New to NodeJS.
Yes I know I could use a framework, but I want to get a good grok on it before delving into the myriad of fine fine tools that are out there.
my problem:
var img = fs.readFileSync(path);
The above works.
fs.readFile(path, function (err, data)
{
if (err) throw err;
console.log(data);
});
The above doesn't work.
The input path is 'C:\NodeSite\chrome.jpg'.
Oh, and I am working on Windows 7.
Any help would be much appreciated.
Fixed
Late night/morning programming introduces errors that are hard to spot. The path was being set from two different places, and so the source paths were different in the two cases. Thank you for your help. I am a complete numpty. :)
If you are not setting an encoding when reading a file, you will get the binary content.
So, for example, the following snippet outputs the content of the test file using UTF-8 encoding. If you don't pass an encoding, you get the raw binary Buffer dumped to your console instead.
var fs = require('fs');
var path = "C:\\tmp\\testfile.txt";
fs.readFile(path, 'utf8', function (err, data) {
if (err) throw err;
console.log(data);
});
Another issue (especially on Windows-based OSs) can be the correct escaping of the target path. The above example shows how paths on Windows have to be escaped.
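A quick demonstration of what goes wrong with single backslashes in a JS string literal (the paths are just examples):

```javascript
// Unrecognized escapes like \N silently drop the backslash, and \t becomes a
// tab character, so the path the OS receives is not the one you typed.
const wrong = 'C:\NodeSite\tmp.jpg';   // '\N' -> 'N', '\t' -> TAB
const right = 'C:\\NodeSite\\tmp.jpg'; // doubled backslashes survive
console.log(wrong); // mangled
console.log(right);
```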
Java developers can use this synchronous call much like blocking I/O in pure Java, trouble-free:
var fs = require('fs');
var Contenu = fs.readFileSync( fILE_FULL_Name , 'utf8');
console.log( Contenu );
That should take care of small and big files alike.

node.js using spawn for perl - stdout line by line

Spawn in Node.js. I have just about managed to use this to run a bash command as follows. This seems to be pretty much non-blocking, and I get action on the browser screen as the command churns through data.
ls = spawn('find',['/'] );
response.writeHead(200, { "Content-Type": "text/plain" });
ls.stdout.on('data', function (data) {
console.log('stdout: ' + data);
response.write(data);
});
But I want to run a perl script with multiple arguments.
ls = spawn('blah.pl',['--argstring here', '--arg blah'] );
The Perl script is written to get its arguments using the Getopt CPAN lib, and it uses the CPAN Expect lib to run through a pile of stuff - it outputs to stdout, and to stderr if there is an error, but I mostly care about stdout right now.
The thing is, this gives me no output. It seems to block completely, at least until the program finishes execution... and in this case it doesn't, at least not for 10 minutes.
Am I using spawn wrong?
I like the node module "carrier"
carrier = require "carrier"
childproc = require "child_process"
find = childproc.spawn "find"
find.stdout.setEncoding "utf8"
linereader = carrier.carry find.stdout
linereader.on "line", (line) -> console.log line
