Node error while using node module usb and pcsclite - node.js

I want to use the node module "usb" to connect to a printer and the node module "pcsclite" to use a smartcard reader.
When I use both modules, node.exe exits with error code -1073740771 (0xc000041d).
Here is a little test program:
var usb = require('usb')
var PCSC = require('pcsclite')()

PCSC.on('reader', function (reader) {
  console.log('add reader')
  reader.on('status', function (status) {
    console.log('status: ' + status)
  })
  reader.on('error', function (err) { console.log('reader error: ' + err) })
  reader.on('end', function () { console.log('end') })
})

PCSC.on('error', function (err) { console.log('PCSC error: ' + err) })
When I remove the first line, the program works fine.
P.S.: I use nodejs 4.8.3, usb 1.3.1 and pcsclite 0.4.12

Related

run-sequence synchronous task never completes

I'm almost certainly going about this in the wrong way, so first up, my high-level requirement.
I'm using the angular2-seed and want to run Protractor tests in headless mode using Xvfb. I don't want an Xvfb server running at all times (this is a build server), so instead I'd like to spin up an Xvfb service, have Protractor do its thing, and then "gracefully" shut down Xvfb. In isolation these tasks work fine; however, I've hit a wall when it comes to adding them to the gulp build setup.
Here's the task in the gulpfile:
gulp.task('e2e.headless', (done: any) =>
  runSequence('start.xvfb',
    'protractor',
    'stop.xvfb',
    done));
The tasks themselves are loaded in through individual TypeScript task files, i.e.:
import {runProtractor} from '../../utils';
export = runProtractor
And here are my (latest) utility files themselves.
protractor.ts
import * as util from 'gulp-util';
import {normalize, join} from 'path';
import {ChildProcess} from 'child_process';
function reportError(message: string) {
  console.error(require('chalk').white.bgRed.bold(message));
  process.exit(1);
}
function promiseFromChildProcess(child: ChildProcess) {
  return new Promise(function (resolve: () => void, reject: () => void) {
    child.on('close', (code: any) => {
      util.log('Exited with code: ', code);
      resolve();
    });
    child.stdout.on('data', (data: any) => {
      util.log(`stdout: ${data}`);
    });
    child.stderr.on('data', (data: any) => {
      util.log(`stderr: ${data}`);
      reject();
    });
  });
}
export function runProtractor(): (done: () => void) => void {
  return done => {
    const root = normalize(join(__dirname, '..', '..', '..'));
    const exec = require('child_process').exec;

    // Our Xvfb instance is running on :99
    // TODO: Pass this in instead of hard-coding
    process.env.DISPLAY = ':99';

    util.log('cwd:', root);
    let child = exec('protractor', { cwd: root, env: process.env },
      function (error: Error, stdout: NodeBuffer, stderr: NodeBuffer) {
        if (error !== null) {
          reportError('Protractor error: ' + error + stderr);
        }
      });
    promiseFromChildProcess(child).then(() => done());
  };
}
xvfb_tools.ts
import * as util from 'gulp-util';
const exec = require('child_process').exec;
function reportError(message: string) {
  console.error(require('chalk').white.bgRed.bold(message));
  process.exit(1);
}

export function stopXvfb() {
  return exec('pkill -c -n Xvfb',
    function (error: NodeJS.ErrnoException, stdout: NodeBuffer, stderr: NodeBuffer) {
      if (error !== null) {
        reportError('Failed to kill Xvfb. Not really sure why...');
      } else if (stdout.toString() === '0') {
        reportError('No known Xvfb instance. Is it running?');
      } else {
        util.log('Xvfb terminated');
      }
    });
}

export function startXvfb() {
  return exec('Xvfb :99 -ac -screen 0 1600x1200x24',
    function (error: NodeJS.ErrnoException, stdout: NodeBuffer, stderr: NodeBuffer) {
      if (error !== null && error.code !== null) {
        reportError('Xvfb failed to start. Err: ' + error.code + ', ' + error + ', ' + stderr);
      }
    });
}
I feel as though I'm probably going around the houses in creating a promise from my exec child_process; however, earlier iterations of the code didn't do it, so...
Note that the debug logging in runProtractor() which should display the root directory never gets called, so I'm quite sure there is an async issue at play here. Here is the output from the task:
[00:47:49] Starting 'e2e.headless'...
[00:47:49] Starting 'start.xvfb'...
[00:47:49] Finished 'start.xvfb' after 12 ms
[00:47:49] Starting 'protractor'...
[00:47:49] Finished 'protractor' after 5.74 ms
[00:47:49] Starting 'stop.xvfb'...
[00:47:49] Finished 'stop.xvfb' after 11 ms
[00:47:49] Finished 'e2e.headless' after 38 ms
[00:47:49] Xvfb terminated
Can someone set me straight/push me in the right direction, please?
Thanks to Ludovic from the angular2-seed team!
The mistake was in not calling the runProtractor function from the wrapper module, i.e.
export = runProtractor(). Once that was noted, I could strip out the unnecessary wrapper function as well as promiseFromChildProcess, which were distractions.
The final task is just an anonymous function that takes the gulp callback "done", which is called on completion:
import * as util from 'gulp-util';
import {normalize, join} from 'path';

function reportError(message: string) {
  console.error(require('chalk').white.bgRed.bold(message));
  process.exit(1);
}

export = (done: any) => {
  const root = normalize(join(__dirname, '..', '..', '..'));
  const exec = require('child_process').exec;

  process.env.DISPLAY = ':99';
  util.log('cwd:', root);

  exec('protractor', { cwd: root, env: process.env },
    function (error: Error, stdout: NodeBuffer, stderr: NodeBuffer) {
      if (error !== null) {
        reportError('Protractor error: ' + error + stderr);
      } else {
        done();
      }
    });
};
You need to add a callback function to your gulp task and call the cb (callback) function after all your runSequence tasks have completed.
gulp.task('e2e.headless', (cb) =>
  runSequence('start.xvfb',
    'protractor',
    'stop.xvfb',
    (err) => {
      if (err) {
        console.log(err.message);
      } else {
        console.log("Build finished successfully");
      }
      cb(err);
    }));

node.js+serialport receiving undefined error when writing to serialport

Guys, I'm getting an error when I try to write information to the serial port. I didn't get any compilation errors, as you can see in the result below, but when I write using serial.write(), the error comes back as undefined.
connect.multipart() will be removed in connect 3.0
visit https://github.com/senchalabs/connect/wiki/Connect-3.0 for alternatives
connect.limit() will be removed in connect 3.0
This is done.
err undefined
results 2
COM3
USB\VID_10C4&PID_EA60\0001
Silicon Laboratories
Here is my program:
var comport = 'COM3';
var serialPort = require("serialport");

serialPort.list(function (err, ports) {
  ports.forEach(function (port) {
    console.log(port.comName);
    console.log(port.pnpId);
    console.log(port.manufacturer);
  });
});

var SerialPort = serialPort.SerialPort; // localize object constructor

var newport = new SerialPort(comport, {
  baudrate: 57600
});

newport.on('open', function () {
  newport.write("S\n", function (err, results) {
    console.log('err ' + err);
    console.log('results ' + results);
  });
  console.log("This is done.");
});

newport.on('data', function (data) {
  console.log(data.toString());
});
Everything is all right with your code: if the error parameter has the value undefined, it means there is no error. If the error parameter were an instance of the Error constructor, then you should take care of it.
A usual pattern to handle errors:
newport.write("S\n", function(err, results) {
if (err) {
// Do something with the error
} else {
// Do something with the result
}
});

Node - how to wait on async operations?

Sorry, just starting with node. This might be a very novice question.
Let's say I have some code which reads some files from a directory in the file system:
var fs = require('fs');

fs.readdir(__dirname + '/myfiles', function (err, files) {
  if (err) throw err;
  files.forEach(function (fileName) {
    fs.readFile(__dirname + '/myfiles/' + fileName, function (err, data) {
      if (err) throw err;
      console.log('finished reading file ' + fileName + ': ' + data);
      module.exports.files.push(data);
    });
  });
});
Note that all of this occurs asynchronously. Let's also say I have a Mocha test which executes this code:
describe('fileProvider', function () {
  describe('#files', function () {
    it.only('files array not empty', function () {
      assert(fileProvider.files.length > 0, 'files.length is zero');
    });
  });
});
The mocha test runs before the files are finished being read. I know this because I see the console.log statement after I see the little dot that indicates a mocha test being run (at least I think that is what is being indicated). Also, if I surround the assert with a setTimeout, the assert passes.
How should I structure my code so that I can ensure the async file operations are completed? Note that this is not just a problem with testing - I need the files to be loaded fully before I can do real work in my app as well.
I don't think the right answer is to read files synchronously, because that will block the Node request / response loop, right?
Bonus question:
Even if I put the assert in a setTimeout with a 0 timeout value, the test still passes. Is this because just putting it in a setTimeout kicks it to the end of the processing chain or something so the filesystem work finishes first?
You can implement a completion callback that is invoked after all files have been read.
exports.files = [];
exports.initialize = initialize;

function initialize(callback) {
  var fs = require('fs');
  fs.readdir(__dirname + '/myfiles', function (err, files) {
    if (err) throw err;
    files.forEach(function (fileName) {
      fs.readFile(__dirname + '/myfiles/' + fileName, function (err, data) {
        if (err) throw err;
        console.log('finished reading file ' + fileName + ': ' + data);
        exports.files.push(data);
        if (exports.files.length == files.length) {
          callback();
        }
      });
    });
  });
}
You can call the file operation method by doing something like:
var f = require('./files.js');

if (f.files.length < 1) {
  console.log('initializing');
  f.initialize(function () {
    console.log('After: ' + f.files.length);
    var another = require('./files.js');
    console.log('Another module: ' + another.files.length);
  });
}
EDIT: Since you only want to have to call this once, you could initialize it once when the application loads. According to the Node.js documentation, modules are cached after the first time they are loaded. The two examples above have been edited as well.
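As a quick illustration of that caching behavior (a minimal sketch using the files.js module from the examples above):

// Both calls resolve to the same cached module instance, so the files
// array populated by initialize() is shared across the application.
var first = require('./files.js');
var second = require('./files.js');
console.log(first === second); // true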
To avoid being caught up in nested callbacks, you might want to use async's each, which lets you run the tasks asynchronously in a non-blocking manner:
https://github.com/caolan/async#each
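A minimal sketch of what that could look like here, assuming the async package is installed (the directory and export names mirror the earlier examples):

var fs = require('fs');
var async = require('async');

exports.files = [];

exports.initialize = function (callback) {
  var dir = __dirname + '/myfiles';
  fs.readdir(dir, function (err, fileNames) {
    if (err) return callback(err);
    // async.each runs the reads in parallel and invokes the final callback
    // once every file has been handled, or immediately on the first error.
    async.each(fileNames, function (fileName, done) {
      fs.readFile(dir + '/' + fileName, function (err, data) {
        if (err) return done(err);
        exports.files.push(data);
        done();
      });
    }, callback);
  });
};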
I think that's a good test; the same thing would happen in any app that used your module, i.e. its code could run before files is set. What you need to do is create a callback like #making3 suggests, or use promises. I haven't used Mocha, but there's a section on asynchronous calls. You could export the promise itself:
module.exports.getFiles = new Promise((resolve, reject) => {
  const datas = [];
  fs.readdir(__dirname + '/myfiles', function (err, files) {
    if (err) {
      reject(err);
      return;
    }
    files.forEach(function (fileName) {
      fs.readFile(__dirname + '/myfiles/' + fileName, function (err, data) {
        if (err) {
          reject(err);
          return;
        }
        console.log('finished reading file ' + fileName + ': ' + data);
        datas.push(data);
        if (datas.length == files.length) {
          resolve(datas);
        }
      });
    });
  });
});
chai-as-promised lets you work directly with promises using eventually, or you can use the callback passed to your test, I think:
describe('fileProvider', function () {
  describe('#files', function () {
    it.only('files array not empty', function (done) {
      fileProvider.getFiles.then(function (value) {
        assert(value.length > 0, 'files.length is zero');
        done();
      }, function (err) {
        done(err);
      });
    });
  });
});
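For reference, a minimal sketch of the eventually style mentioned above, assuming chai and chai-as-promised are installed (the require path for the module is a placeholder):

const chai = require('chai');
chai.use(require('chai-as-promised'));
const expect = chai.expect;
const fileProvider = require('./files'); // placeholder path to the module exporting getFiles

describe('fileProvider', function () {
  describe('#files', function () {
    it('files array not empty', function () {
      // Returning the assertion hands the underlying promise to Mocha,
      // so the test waits for getFiles to resolve before asserting.
      return expect(fileProvider.getFiles).to.eventually.not.be.empty;
    });
  });
});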

Enumerate system drives in nodejs

Is there a way to retrieve the drive names of all logical drives on a computer?
I've looked at the fs api, but from there I can only enumerate the files and directories of a given directory.
I'm not sure what you mean by "drive name". If you mean drives in the form of \\.\PhysicalDriveN, I faced the same problem and implemented this module that works in all major operating systems:
https://github.com/resin-io/drivelist
For Windows, you get information such as:
[
  {
    device: '\\\\.\\PHYSICALDRIVE0',
    description: 'WDC WD10JPVX-75JC3T0',
    size: '1000 GB'
  },
  {
    device: '\\\\.\\PHYSICALDRIVE1',
    description: 'Generic STORAGE DEVICE USB Device',
    size: '15 GB'
  }
]
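A minimal usage sketch (the callback-style list() call follows the project's README of that era; treat the exact signature as an assumption for whichever version you install):

const drivelist = require('drivelist');

// Enumerates the physical drives and passes an array of objects
// shaped like the listing above to the callback.
drivelist.list((error, drives) => {
  if (error) throw error;
  drives.forEach((drive) => {
    console.log(drive.device, drive.description, drive.size);
  });
});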
If you are targeting Windows, you could try this.
The solution is based on the idea from this post;
I wrapped it in a promise.
var spawn = require("child_process").spawn
function listDrives(){
const list = spawn('cmd');
return new Promise((resolve, reject) => {
list.stdout.on('data', function (data) {
// console.log('stdout: ' + String(data));
const output = String(data)
const out = output.split("\r\n").map(e=>e.trim()).filter(e=>e!="")
if (out[0]==="Name"){
resolve(out.slice(1))
}
// console.log("stdoutput:", out)
});
list.stderr.on('data', function (data) {
// console.log('stderr: ' + data);
});
list.on('exit', function (code) {
console.log('child process exited with code ' + code);
if (code !== 0){
reject(code)
}
});
list.stdin.write('wmic logicaldisk get name\n');
list.stdin.end();
})
}
listDrives().then((data) => console.log(data))
Test it; you will see a result like:
["c:", "d:"]
Based on Edwin Lee's answer:
const child = require('child_process');

child.exec('wmic logicaldisk get name', (error, stdout) => {
  console.log(
    stdout.split('\r\r\n')
      .filter(value => /[A-Za-z]:/.test(value))
      .map(value => value.trim())
  );
});
Output: ['C:', 'D:'] etc.
How about using the DiskPart command? Does running diskpart list on the command line give you the output you need? If so, you can execute it in Node using child_process.exec:
var exec = require('child_process').exec;
var cmd = 'diskpart list';

exec(cmd, function (err, stdout, stderr) {
  if (err) {
    console.log('error running diskpart list command');
    console.log(err);
    return;
  }
  console.log('stdout data');
  console.log(stdout);
  console.log('stderr data');
  console.log(stderr);
});
+1 for #Bagherani's downgrade suggestion!
I am using Electron React Boilerplate v4.0 and could not get drivelist to load. I downgraded to drivelist@5.2.12 and it works for my needs.

nodejs serialport write issues?

I am using the nodejs serialport module (https://npmjs.org/package/serialport) and I am having issues when I write to the serial port.
If I simply write to the port as shown below, the serial device never gets the command.
var serialport = require("serialport");
var sp = new serialport.SerialPort(serialPortPath);
sp.write("SYST:ADDR?\n");
However, if I use a setTimeout as shown below, then it seems to work?
var serialport = require("serialport");
var sp = new serialport.SerialPort(serialPortPath);
setTimeout(function(){sp.write("SYST:ADDR?\n")},1000);
FYI, the "serialPortPath" is set elsewhere in the code.
I am not sure what is going on... any ideas?
I think I got it figured out from the GitHub page (https://github.com/voodootikigod/node-serialport)... basically it looks like I was missing the "open" event, as shown below:
serialPort.on("open", function () {
console.log("open");
serialPort.on("data", function(data) {
console.log("data received: " + data);
});
serialPort.write("SYST:ADDR?\n", function(err, results) {
console.log("err: " + err);
console.log("results: " + results);
});
});
Here is another approach which works very well and allows for dynamic addressing of a specific serial device. In my case I am only interested in connecting to the Numato device connected to our integrated system which is why I have the conditional logic in the list callback.
exports.testSerial = function (data) {
  serialPort.list(function (err, ports) {
    var port = {};
    for (var i = 0; i < ports.length; i++) {
      try {
        if (typeof ports[i].manufacturer != 'undefined' && ports[i].manufacturer.includes("Numato")) {
          port = ports[i];
        }
      } catch (err) {
        console.dir(err);
      }
    }
    // the port will be opened via the constructor of this call
    var numato = new serial(port.comName, { baudrate: 19200 }, function (err) {
      if (err) {
        return console.dir(err);
      }
      // by having the write call within the callback you can access it directly w/o using .on()
      numato.write('relay ' + data.state + ' ' + data.channel + '\r', function (err) {
        if (err) {
          console.dir('error writing');
          console.dir(err);
        }
        console.dir('serial message written');
        numato.close();
      });
    });
    return true;
  });
};
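A hypothetical call site for the function above (the module path and the relay values are assumptions, shown only to illustrate the expected shape of data):

var relayControl = require('./relayControl'); // placeholder path to the module above

// data.state becomes part of the Numato relay command ('on' / 'off'),
// data.channel selects the relay number.
relayControl.testSerial({ state: 'on', channel: 0 });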
Hope this helps someone in the future! For reference this is with library version 4.0.7.
