Is there a way to retrieve the names of all logical drives on a computer?
I've looked at the fs API, but from there I can only enumerate the files and directories of a given directory.
I'm not sure what you mean by "drive name". If you mean drives in the form \\.\PhysicalDriveN, I faced the same problem and implemented this module, which works on all major operating systems:
https://github.com/resin-io/drivelist
For Windows, you get information such as:
[
{
device: '\\\\.\\PHYSICALDRIVE0',
description: 'WDC WD10JPVX-75JC3T0',
size: '1000 GB'
},
{
device: '\\\\.\\PHYSICALDRIVE1',
description: 'Generic STORAGE DEVICE USB Device',
size: '15 GB'
}
]
If you're targeting Windows, you could try this. The solution builds on the idea from this post, wrapped in a promise.
const { spawn } = require('child_process');

function listDrives() {
  const list = spawn('cmd');
  return new Promise((resolve, reject) => {
    list.stdout.on('data', (data) => {
      const output = String(data);
      const lines = output.split('\r\n').map((line) => line.trim()).filter((line) => line !== '');
      // The chunk of interest starts with the "Name" header row.
      if (lines[0] === 'Name') {
        resolve(lines.slice(1));
      }
    });
    list.on('exit', (code) => {
      console.log('child process exited with code ' + code);
      if (code !== 0) {
        reject(code);
      }
    });
    list.stdin.write('wmic logicaldisk get name\n');
    list.stdin.end();
  });
}
listDrives().then((data) => console.log(data))
Test it and you'll see a result like:
["c:", "d:"]
Based on Edwin Lee's answer:
const child = require('child_process');
child.exec('wmic logicaldisk get name', (error, stdout) => {
console.log(
stdout.split('\r\r\n')
.filter(value => /[A-Za-z]:/.test(value))
.map(value => value.trim())
);
});
Output: ['C:', 'D:'] etc.
How about using the DiskPart command? If running diskpart list in the command line gives you the output you need, you can execute it from Node using child_process.exec:
var exec = require('child_process').exec
var cmd = 'diskpart list'
exec(cmd, function(err, stdout, stderr) {
if (err) {
console.log('error running diskpart list command')
console.log(err)
return
}
console.log('stdout data')
console.log(stdout)
console.log('stderr data')
console.log(stderr)
})
+1 for @Bagherani's downgrade suggestion!
I am using Electron React Boilerplate v4.0 and could not get drivelist to load. I downgraded to drivelist#5.2.12 and it works for my needs.
I'm creating an Electron app (Create React App via electron-forge) on Windows 10 Pro and am stuck using async functions with execFile and fs.readFile().
I want to achieve the following:
1. Main process: receive a buffer of a screen capture (video) from the renderer process.
2. Create a temporary file and write the buffer to a .mp4 file.
3. Crop the video (based on x:y:width:height) using ffmpeg (bundled with the Electron app as a binary). Output: a .mp4 file in the temporary directory.
4. Read the cropped .mp4 file using fs.readFile() (as a base64-encoded buffer).
5. Send the buffer to another renderer screen.
6. Delete the temp file.
Q: I've managed to do most of it but cannot access the cropped .mp4 file in the temp directory.
I've tried the following:-
Electron main process
const fs = require('fs').promises
const path = require('path')
ipcMain.on("capture:buffer", async (video_object) => {
const {x_pos, y_pos, window_width, window_height, buffer} = video_object
try {
const dir = await fs.mkdtemp(await fs.realpath(os.tmpdir()) + path.sep)
const captured_video_file_path = path.join(dir, "screen_capture_video.mp4")
// This works
await fs.writeFile(captured_video_file_path, buffer, (error, stdout, stderr) => {
if (error) {
console.log(error)
}
console.log("Screen Capture File written")
})
// This also works
execFile(`${ffmpeg.path}`,
['-i', `${captured_video_file_path}`, '-vf',
`crop=${window_width}:${window_height}:${x_pos}:${y_pos}`,
`${path.join(dir,'cropped_video.mp4')}`],
(error, stdout, stderr) => {
if (error) {
console.log(error.message)
}
if (stderr) {
console.log(stderr)
}
console.log("Cropped File created")
})
// This code onwards doesn't work
await fs.readFile(path.join(dir, "cropped_video.mp4"), 'base64', (error, data) => {
if (error) {
console.log(error)
}
// To renderer
mainWindow.webContents.send("main:video_buffer", Buffer.from(data))
})
} catch (error) {
console.log(error)
} finally {
fs.rmdir(dir, {recursive: true})
}
})
When trying to read the file I get the following error:
[Error: ENOENT: no such file or directory, open 'C:\Users\XXXX\XXXXX\XXXXX\temp\temp_eYGMCR\cropped_video.mp4']
I've checked that the correct path exists with console.log.
I suspect it's a simple issue with using async / execFile() properly, but I don't know exactly where I'm making the mistake.
Any help would be appreciated.
Thanks.
Because execFile may not have finished yet by the time fs.readFile is called.
Untested, but you may want to wrap execFile in a promise and wait for it to complete before proceeding, and see whether that works:
await new Promise( resolve => {
execFile(`${ffmpeg.path}`,
['-i', `${captured_video_file_path}`, '-vf',
`crop=${window_width}:${window_height}:${x_pos}:${y_pos}`,
`${path.join(dir,'cropped_video.mp4')}`],
(error, stdout, stderr) => {
if (error) {
console.log(error.message)
}
if (stderr) {
console.log(stderr)
}
console.log("Cropped File created")
resolve() //this tells `await` it's ready to move on
})
})
Thanks for the pointers, guys.
Here's the solution I found.
Another big problem with safely creating and removing temporary directories in Electron is that fs.rmdir() doesn't work when using Electron Forge / Builder, due to an issue with ASAR files (the archive format used to package Electron apps).
const fsPromises = require('fs').promises

ipcMain.on("capture:buffer", async (event, video_object) => {
  const {x_pos, y_pos, window_width, window_height, buffer} = video_object
  const temp_dir = await fsPromises.mkdtemp(await fsPromises.realpath(os.tmpdir()) + path.sep)
  const captured_video_file_path = path.join(temp_dir, "screen_capture_video.mp4")
  try {
    // note no callback req'd as per jfriend's advice
    await fsPromises.writeFile(captured_video_file_path, buffer)
  } catch (error) {
    console.error(error)
  }
  const child_object = execFile(`${ffmpeg.path}`,
    ['-i', captured_video_file_path, '-vf',
      `crop=${window_width}:${window_height}:${x_pos}:${y_pos}`,
      path.join(temp_dir, 'cropped_video.mp4')],
    (error, stdout, stderr) => {
      if (error) {
        console.log(error.message)
      }
      if (stderr) {
        console.log(stderr)
      }
      console.log("Cropped File created")
    })
  child_object.on("close", async () => {
    try {
      const video_buffer = await fsPromises.readFile(path.join(temp_dir, "cropped_video.mp4"))
      // To renderer
      mainWindow.webContents.send("main:video_buffer", video_buffer)
    } catch (error) {
      console.log(error)
    } finally {
      process.noAsar = true
      await fsPromises.rmdir(temp_dir, {recursive: true})
      console.log("Done !!!")
      process.noAsar = false
    }
  })
})
I want to use the node module "usb" to connect to a printer, and the node module "pcsclite" to use a smartcard reader.
When I use both modules, node.exe exits with error code -1073740771 (0xc000041d).
Here is a little test program:
var usb = require('usb')
var PCSC = require('pcsclite')()
PCSC.on('reader', function (reader) {
console.log('add reader')
reader.on('status', function (status) {
console.log('status: ' + status)
})
reader.on('error', function (err) { console.log('reader error: ' + err) })
reader.on('end', function () { console.log('end') })
})
PCSC.on('error', function (err) { console.log('PCSC error: ' + err) })
When I remove the first line, the program works fine.
P.S.: I use Node.js 4.8.3, usb 1.3.1 and pcsclite 0.4.12.
I'm trying to find a way to list all installed web browsers in my macOS Electron app. What would be the best way to do this? Or... I'm happy to maintain a list of possible browsers but need a way to check they are present.
You'll need to create a child process that executes a command to list the currently installed applications. Luckily, macOS offers the system_profiler utility for exactly this, and even better, it supports XML export via the -xml argument. Be aware, though, that it is far from the fastest call.
You'll need to gather the buffer chunks from the subprocess callback, decode them as UTF-8, and then parse the XML string with something like xml2js. After that it's a simple check of whether each browser name appears in the result.
Updated code by Will Stone
import jp from 'jsonpath' // for easier json traversal
import { spawn } from 'child_process'
import parser from 'xml2json'
const sp = spawn('system_profiler', ['-xml', 'SPApplicationsDataType'])
let profile = ''
const browsers = [
'Brave',
'Chromium',
'Firefox',
'Google Chrome',
'Maxthon',
'Opera',
'Safari',
'SeaMonkey',
'TorBrowser',
'Vivaldi'
]
sp.stdout.setEncoding('utf8')
sp.stdout.on('data', data => {
profile += data // gather chunked data
})
sp.stderr.on('data', data => {
console.log(`stderr: ${data}`)
})
sp.on('close', code => {
console.log(`child process exited with code ${code}`)
})
sp.stdout.on('end', function() {
profile = parser.toJson(profile, { object: true })
const installedBrowsers = jp
.query(profile, 'plist.array.dict.array[1].dict[*].string[0]')
.filter(item => browsers.indexOf(item) > -1)
console.log(installedBrowsers)
console.log('Finished collecting data chunks.')
})
Initial code:
const { spawn } = require('child_process');
const xml2js = require('xml2js');
const parser = new xml2js.Parser();
const sp = spawn('system_profiler', ['-xml', 'SPApplicationsDataType']);
sp.stdout.on('data', (data) => {
parser.parseString(data, function(err, result){
console.log(result)
});
});
sp.stderr.on('data', (data) => {
console.log(`stderr: ${data}`);
});
sp.on('close', (code) => {
console.log(`child process exited with code ${code}`);
});
I'm reading a PDF from an S3 bucket using S3fs.readFile, and I would like to take the result, convert it to a string, and immediately spawn a pdftotext child process, passing it the string:
S3Fs.readFile('./my-pdf-in-s3-bucket', {encoding: 'binary'}, (error, result) => {
mychild = child_process.spawn('pdftotext', [
result.Body
]);
});
This breaks the spawn call because the argument string is too long, and I don't want to save the file to disk just to read it again.
Is it possible?
Thanks!
pdftotext should allow reading from stdin and writing to stdout (at least it worked for me with v0.41.0), so you could do this instead:
S3Fs.readFile('./my-pdf-in-s3-bucket', (err, result) => {
if (err) throw err; // Handle better
var cp = child_process.spawn('pdftotext', [ '-', '-' ]);
cp.stdout.pipe(process.stdout);
cp.on('close', (code, signal) => {
console.log(`pdftotext finished with status ${code}`);
});
cp.stdin.end(result);
});
Or possibly better yet, you might be able to stream the file to the child process instead of buffering its entire contents in memory first:
var cp = child_process.spawn('pdftotext', [ '-', '-' ]);
var rs = S3Fs.createReadStream('./my-pdf-in-s3-bucket');
rs.on('error', (err) => {
cp.kill();
});
cp.stdout.pipe(process.stdout);
cp.on('close', (code, signal) => {
console.log(`pdftotext finished with status ${code}`);
});
rs.pipe(cp.stdin);
I have this simple script:
var exec = require('child_process').exec;
exec('coffee -cw my_file.coffee', function(error, stdout, stderr) {
console.log(stdout);
});
where I simply execute a command to compile a CoffeeScript file. But stdout never gets displayed in the console, because the command never ends (due to the -w option of coffee).
If I execute the command directly from the console I get messages like this:
18:05:59 - compiled my_file.coffee
My question is: is it possible to display these messages with the Node.js exec? If so, how?
Thanks
Don't use exec. Use spawn, which returns an EventEmitter object. Then you can listen to stdout/stderr data events (child.stdout.on('data', callback)) as they happen.
From NodeJS documentation:
var spawn = require('child_process').spawn,
ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.on('data', function (data) {
console.log('stdout: ' + data.toString());
});
ls.stderr.on('data', function (data) {
console.log('stderr: ' + data.toString());
});
ls.on('exit', function (code) {
console.log('child process exited with code ' + code.toString());
});
exec buffers the output and usually returns it when the command has finished executing.
exec will also return a ChildProcess object that is an EventEmitter.
var exec = require('child_process').exec;
var coffeeProcess = exec('coffee -cw my_file.coffee');
coffeeProcess.stdout.on('data', function(data) {
console.log(data);
});
OR pipe the child process's stdout to the main stdout.
coffeeProcess.stdout.pipe(process.stdout);
OR inherit stdio using spawn
spawn('coffee', ['-cw', 'my_file.coffee'], { stdio: 'inherit' });
There are already several answers, but none of them mention the best (and easiest) way to do this: using spawn with the { stdio: 'inherit' } option. It seems to produce the most accurate output, for example when displaying the progress information from a git clone.
Simply do this:
var spawn = require('child_process').spawn;
spawn('coffee', ['-cw', 'my_file.coffee'], { stdio: 'inherit' });
Credit to @MorganTouvereyQuilling for pointing this out in this comment.
Inspired by Nathanael Smith's answer and Eric Freese's comment, it could be as simple as:
var exec = require('child_process').exec;
exec('coffee -cw my_file.coffee').stdout.pipe(process.stdout);
I'd just like to add that one small issue with outputting the buffer strings from a spawned process with console.log() is that it adds newlines, which can spread your spawned process's output over additional lines. If you output stdout or stderr with process.stdout.write() instead of console.log(), you'll get the console output from the spawned process exactly as is.
I saw that solution here:
Node.js: printing to console without a trailing newline?
Hope that helps someone using the solution above (which is a great one for live output, even if it is from the documentation).
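To make the difference concrete: stdout chunks from a child process usually already end in a newline, so console.log (which appends its own) doubles them up. A minimal sketch:

```javascript
// A typical 'data' chunk from a child process: already newline-terminated.
const chunk = 'compiled my_file.coffee\n';

// console.log(chunk) would emit the chunk plus an extra '\n', leaving a
// blank line between chunks. process.stdout.write emits it exactly as is:
process.stdout.write(chunk);
```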
I have found it helpful to add a custom exec script to my utilities that does this.
utilities.js
const { exec } = require('child_process')
module.exports.exec = (command) => {
const process = exec(command)
process.stdout.on('data', (data) => {
console.log('stdout: ' + data.toString())
})
process.stderr.on('data', (data) => {
console.log('stderr: ' + data.toString())
})
process.on('exit', (code) => {
console.log('child process exited with code ' + code.toString())
})
}
app.js
const { exec } = require('./utilities.js')
exec('coffee -cw my_file.coffee')
After reviewing all the other answers, I ended up with this:
function oldSchoolMakeBuild(cb) {
var makeProcess = exec('make -C ./oldSchoolMakeBuild',
function (error, stdout, stderr) {
stderr && console.error(stderr);
cb(error);
});
makeProcess.stdout.on('data', function(data) {
process.stdout.write('oldSchoolMakeBuild: '+ data);
});
}
Sometimes data will contain multiple lines, so the oldSchoolMakeBuild header will appear once for several lines. But this didn't bother me enough to change it.
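If the once-per-chunk header does bother you, splitting each chunk into lines before prefixing fixes it. A small sketch (prefixLines is a hypothetical helper, not part of any API; note it also drops blank lines):

```javascript
// Prefix every line in a (possibly multi-line) stdout chunk.
function prefixLines(prefix, chunk) {
  return String(chunk)
    .split('\n')
    .filter(line => line !== '')        // drop the empty trailing segment
    .map(line => prefix + line + '\n')
    .join('');
}
```

It would slot into the handler above as: process.stdout.write(prefixLines('oldSchoolMakeBuild: ', data)).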
child_process.spawn returns an object with stdout and stderr streams.
You can tap the stdout stream to read the data the child process sends back to Node. Being a stream, stdout has "data", "end", and the other events streams have. spawn is best used when you want the child process to return a large amount of data to Node (image processing, reading binary data, etc.).
So you can solve your problem using child_process.spawn as shown below.
var spawn = require('child_process').spawn,
ls = spawn('coffee', ['-cw', 'my_file.coffee']);
ls.stdout.on('data', function (data) {
console.log('stdout: ' + data.toString());
});
ls.stderr.on('data', function (data) {
console.log('stderr: ' + data.toString());
});
ls.on('exit', function (code) {
console.log('code ' + code.toString());
});
Here is an async helper function written in TypeScript that does the trick for me. It won't work for long-lived processes, but it might still be handy for someone.
import * as child_process from "child_process";
private async spawn(command: string, args: string[]): Promise<{code: number | null, result: string}> {
return new Promise((resolve, reject) => {
const spawn = child_process.spawn(command, args)
let result = ''
spawn.stdout.on('data', (data: any) => {
if (result) {
reject(Error('Helper function does not work for long-lived processes'))
}
result = data.toString()
})
spawn.stderr.on('data', (error: any) => {
reject(Error(error.toString()))
})
spawn.on('exit', code => {
resolve({code, result})
})
})
}