Can't get array of fs.Dirent from fs.readdir - node.js

I am using node 8.10.0.
fs.readdir() returns an array of filenames and child directory names, or an array of fs.Dirent objects (fs.Dirent[]).
I can't get that to work. Here's a simple example:
console.log(require("fs").readdirSync("/", {withFileTypes:true}));
This gives me an array of strings (e.g. ["bin", "mnt", "usr", "var", ...]), not an array of fs.Dirent objects (which is what I want).
How do I get this to work?

The required functionality (the withFileTypes option) was added in v10.10.0; you have to update Node.
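On v10.10.0 or later, the snippet from the question returns Dirent objects, which you can verify with a quick check like this (output is illustrative):
const fs = require("fs");
const entries = fs.readdirSync("/", { withFileTypes: true });
console.log(entries[0] instanceof fs.Dirent); // true on >= v10.10.0
console.log(entries.filter(e => e.isDirectory()).map(e => e.name)); // e.g. ["bin", "mnt", ...]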

I've hit the same problem, and although I have the latest Node.js (v10.16 currently) and IntelliSense in VS Code is consistent with the online documentation, the run-time reality surprised me. But that is because the code gets executed by Node.js v10.2 (inside a VS Code extension).
So on Node.js 10.2, this code works for me to get the files in a directory:
import * as fs from 'fs';
import * as path from 'path';
import * as util from 'util';

export const readdir = util.promisify(fs.readdir);

// the promise must be awaited before the result can be filtered
let fileNames: string[] = (await readdir(directory))
    // keep only files, not directories
    .filter(fileName => fs.statSync(path.join(directory, fileName)).isFile());
On the latest Node.js, the same code can be simplified this way:
let fileEnts: fs.Dirent[] = await fs.promises.readdir(directory, { withFileTypes: true });
let fileNames: string[] = fileEnts
    .filter(fileEnt => fileEnt.isFile())
    .map(fileEnt => fileEnt.name);
The code snippets are in TypeScript.

Related

How to create an import/export script using Node.JS?

I'm looking to import/export a list of files in a directory through an index.js file in the same directory.
For example, I have 2 files in a directory: admin.js and user.js, and I am looking to require and export them in the index.js like so:
module.exports = {
    admin: require("./admin"),
    users: require("./users"),
};
The script I have come up with looks like this, but it is not working and is giving me an error:
fs.readdirSync(__dirname, (files) => {
    files.forEach((file) => {
        module.exports[file] = require(`./${file}`);
    });
});
How can I improve this script to make it work?
Thank you!
[Update - 2022 December 18]
Found a solution based on Sequelize's models/index.js. This will pretty much require and export your files and folders; feel free to use and modify it.
const fs = require('fs')
const path = require('path')

const basename = path.basename(__filename)
const controllers = {}

fs.readdirSync(__dirname)
    .filter((folder) => {
        return folder.indexOf('.') !== 0 && folder !== basename
    })
    .forEach((folder) => {
        const controller = require(path.join(__dirname, folder))
        controllers[controller.name] = controller
    })

module.exports = controllers
fs.readdirSync() does NOT accept a callback. It directly returns the result:
const files = fs.readdirSync(__dirname);
for (let file of files) {
    module.exports[file] = require(`./${file}`);
}
Note: the future of the JavaScript language is using import and export with statically declared module names instead of require() and module.exports, and this structure will generally not work with the newer way of doing things. So, if you expect to eventually move to the newer ESM modules, you may not want to bake in this type of architecture.
There is a dynamic import in ESM modules, but it's asynchronous (returns a promise that you have to wait for).
Also, note that the readdirSync loop above will attempt to re-require the index.js file containing this code. That might not be harmful, but it may not be your intention.
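For reference, here is a rough sketch of the ESM approach using dynamic import(); the file name index.mjs and the .js filter are my assumptions, not part of the original question:
// index.mjs -- rough ESM sketch; import() is asynchronous, and
// top-level await requires Node.js 14.8+
import { readdirSync } from 'fs';

const modules = {};
for (const file of readdirSync(new URL('.', import.meta.url))) {
    // only .js files are matched, so this index.mjs file itself is skipped
    if (file.endsWith('.js')) {
        modules[file.replace(/\.js$/, '')] = await import(`./${file}`);
    }
}
export default modules;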

JSON file not found

I have a JSON file named email_templates.json placed in the same folder as my JS file bootstrap.js. When I try to read the file I get an error:
no such file or directory, open './email_templates.json'
bootstrap.js
"use strict";
const fs = require('fs');
module.exports = async () => {
const { config } = JSON.parse(fs.readFileSync('./email_templates.json'));
console.log(config);
};
email_templates.json
[
    {
        "name": "vla",
        "subject": "test template",
        "path": ""
    }
]
I am using VS Code; for some reason VS Code doesn't autocomplete the path either, which is confusing for me. Does anyone know why it is doing this?
Node version: 14.x
A possible solution is to get the full path (right from C:\, for example, if you are on Windows).
To do this, you first need to import path in your code.
const path = require("path");
Next, we need to join the directory the JavaScript file is in with the JSON filename. To do this, we will use the code below.
const jsonPath = path.resolve(__dirname, "email_templates.json");
The resolve() function joins the two segments into one complete, absolute path.
Finally, you can pass this path into readFileSync().
fs.readFileSync(jsonPath);
This should help if the problem was the relative path: relative paths passed to fs functions are resolved against the process's current working directory (process.cwd()), not against the directory of the module, so './email_templates.json' only works when Node is launched from that folder. An absolute path avoids the problem entirely.
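Putting it together, a minimal corrected bootstrap.js could look like this (a sketch; note that email_templates.json contains an array, so the parsed result is an array rather than an object with a config property):
"use strict";
const fs = require('fs');
const path = require('path');

module.exports = async () => {
    // resolve relative to this file, not to the current working directory
    const jsonPath = path.resolve(__dirname, 'email_templates.json');
    const templates = JSON.parse(fs.readFileSync(jsonPath, 'utf8'));
    console.log(templates);
};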

How to delete all files and subdirectories in a directory with Node.js

I am working with node.js and need to empty a folder. I have read a lot about deleting files or folders, but I didn't find an answer for how to delete all files AND folders in my folder Test, without deleting the folder Test itself.
I tried to find a solution with fs or fs-extra. Happy for some help!
EDIT 1: Hey @Harald, you should use the del library that @ziishaned posted in the other answer, because it's cleaner and more scalable. And use my answer to learn how it works under the hood :)
EDIT 2 (Dec 26 2021): I didn't know that there is an fs method named fs.rm that you can use to accomplish the task with just one line of code:
fs.rm(path_to_delete, { recursive: true }, callback)
// or use the synchronous version
fs.rmSync(path_to_delete, { recursive: true })
The above code is analogous to the Linux shell command rm -r path_to_delete. Note that, like rm -r, it removes the directory itself; to empty your Test folder while keeping it, call fs.rm on each entry inside it (or recreate the folder afterwards).
We use fs.unlink and fs.rmdir to remove files and empty directories respectively. To check if a path represents a directory we can use fs.stat().
So we have to list all the contents of your Test directory and remove them one by one.
By the way, I'll be using the synchronous versions of the fs methods mentioned above (e.g., fs.readdirSync instead of fs.readdir) to keep my code simple. But if you're writing a production application, then you should use the asynchronous versions of all the fs methods. I leave it to you to read the docs: Node.js v14.18.1 File System documentation.
const fs = require("fs");
const path = require("path");
const DIR_TO_CLEAR = "./trash";
emptyDir(DIR_TO_CLEAR);
function emptyDir(dirPath) {
const dirContents = fs.readdirSync(dirPath); // List dir content
for (const fileOrDirPath of dirContents) {
try {
// Get Full path
const fullPath = path.join(dirPath, fileOrDirPath);
const stat = fs.statSync(fullPath);
if (stat.isDirectory()) {
// It's a sub directory
if (fs.readdirSync(fullPath).length) emptyDir(fullPath);
// If the dir is not empty then remove it's contents too(recursively)
fs.rmdirSync(fullPath);
} else fs.unlinkSync(fullPath); // It's a file
} catch (ex) {
console.error(ex.message);
}
}
}
Feel free to ask me if you don't understand anything in the code above :)
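For completeness, a minimal async sketch of the same idea using the promise-based API (my addition; fs.promises.rm needs Node.js 14.14+):
const fsp = require("fs").promises;
const path = require("path");

// empty dirPath without deleting dirPath itself
async function emptyDirAsync(dirPath) {
    for (const entry of await fsp.readdir(dirPath)) {
        // rm with recursive + force handles files and non-empty directories alike
        await fsp.rm(path.join(dirPath, entry), { recursive: true, force: true });
    }
}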
You can use the del package to delete files and folders within a directory recursively without deleting the parent directory:
Install the required dependency:
npm install del
Use the code below to delete subdirectories and files within the Test directory without deleting the Test directory itself:
const del = require("del");
del.sync(['Test/**', '!Test']);

How can I use exported strings as my elements in Puppeteer

I can't seem to declare a string const and use that as my element with Puppeteer. For example:
await page.click("#playerView");
Works fine, but:
const playerViewId = "#playerView";
await page.click(playerViewId);
Doesn't. I ultimately want to hold all my Page Elements in an object in a separate file to tidy up my project.
Any ideas why this isn't working?
Thanks
I can confirm that the following case does in fact work:
const playerViewId = '#playerView';
await page.click(playerViewId);
If this is not working, consider upgrading your version of Node.js and/or Puppeteer.
If you are trying to define your variable in a separate file, you can use:
// external-file.js:
module.exports.playerViewId = '#playerView';
// main-file.js:
const external_variables = require('./external-file');
const playerViewId = external_variables.playerViewId;
await page.click(playerViewId);
Additionally, you should check to make sure that the element with id playerView exists and has loaded completely before attempting to use page.click().
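For example, a small sketch that waits for the element before clicking (using the same playerViewId as above):
const playerViewId = '#playerView';
// wait until the element is present in the DOM before clicking it
await page.waitForSelector(playerViewId);
await page.click(playerViewId);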

Can I load multiple files with one require statement?

Maybe this question is a little silly, but is it possible to load multiple .js files with one require statement? Like this:
var mylib = require('./lib/mylibfiles');
and use:
mylib.foo(); //return "hello from one"
mylib.bar(); //return "hello from two"
And in the folder mylibfiles will have two files:
One.js
exports.foo = function(){ return "hello from one"; }
Two.js
exports.bar = function(){ return "hello from two"; }
I was thinking of putting a package.json in the folder that says to load all the files, but I don't know how. Another approach I was thinking of is to have an index.js that exports everything again, but then I would be duplicating work.
Thanks!!
P.S.: I'm working with nodejs v0.6.11 on a Windows 7 machine
First of all, using require does not duplicate anything. It loads the module and caches it, so calling require again will get it from memory (thus you can modify the module on the fly without touching its source code; this is sometimes desirable, for example when you want to store a db connection inside a module).
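For example, a quick check of the cache behaviour:
// both requires resolve to the same cached module object
const a = require('./one.js');
const b = require('./one.js');
console.log(a === b); // true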
Also, package.json does not load anything and does not interact with your app at all. It is only used by npm.
Now, you cannot require multiple modules at once. For example, what would happen if both One.js and Two.js defined a function with the same name? There are more problems like this.
But what you can do, is to write additional file, say modules.js with the following content
module.exports = {
    one: require('./one.js'),
    two: require('./two.js'),
    /* some other modules you want */
}
and then you can simply use
var modules = require('./modules.js');
modules.one.foo();
modules.two.bar();
I have a snippet of code that requires more than one module, but it doesn't clump them together as your post suggests. However, that can be overcome with a trick that I found.
function requireMany () {
    return Array.prototype.slice.call(arguments).map(function (value) {
        try {
            return require(value)
        }
        catch (event) {
            return console.log(event)
        }
    })
}
And you use it as such
requireMany("fs", "socket.io", "path")
Which will return
[ fs {}, socketio {}, path {} ]
If a module is not found, the error will be logged to the console, but it won't break the programme: the failed entry simply appears in the array as undefined, so the array will not be shorter because one of the modules failed to load.
Then you can bind each of those array elements to a variable name, like so:
var [fs, socketio, path] = requireMany("fs", "socket.io", "path")
It essentially works like array destructuring: each element is assigned to the corresponding variable name in the current scope. So, in your case, you could do:
var [foo, bar] = requireMany("./foo.js", "./bar.js")
foo() //return "hello from one"
bar() //return "hello from two"
And if you do want it to break the programme on error, just use this modified version, which is smaller
function requireMany () {
    return Array.prototype.slice.call(arguments).map(require)
}
Yes, you may require a folder as a module, according to the node docs. Let's say you want to require() a folder called ./mypack/.
Inside ./mypack/, create a package.json file with the name of the folder, and a main JavaScript file with the same name inside a ./lib/ directory.
{
    "name": "mypack",
    "main": "./lib/mypack.js"
}
Now you can use require('./mypack') and node will load ./mypack/lib/mypack.js.
However if you do not include this package.json file, it may still work. Without the file, node will attempt to load ./mypack/index.js, or if that's not there, ./mypack/index.node.
My understanding is that this could be beneficial if you have split your program into many JavaScript files but do not want to concatenate them for deployment.
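Applied to the original question, ./mypack/lib/mypack.js could itself just re-export the two files (a sketch, assuming One.js and Two.js sit directly in ./mypack/):
// ./mypack/lib/mypack.js -- re-export the question's two files
module.exports = {
    foo: require('../One.js').foo,
    bar: require('../Two.js').bar,
};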
You can use destructuring assignment to map an array of exported modules from require statements in one line:
const requires = (...modules) => modules.map(module => require(module));
const [fs, path] = requires('fs', 'path');
I was doing something similar to what @freakish suggests in his answer, with a project where I have a list of test scripts that are pulled into a Puppeteer + Jest testing setup. My test files follow the naming convention testname1.js - testnameN.js, and I was able to use a generator function to require N number of files from the particular directory with the approach below:
const fs = require('fs');
const path = require('path');

module.exports = class FilesInDirectory {
    constructor(directory) {
        this.fid = fs.readdirSync(path.resolve(directory));
        this.requiredFiles = this.fid.map((fileId) => {
            let resolvedPath = path.resolve(directory, fileId);
            return require(resolvedPath);
        }).filter(file => !!file);
    }

    printRetrievedFiles() {
        console.log(this.requiredFiles);
    }

    nextFileGenerator() {
        const parent = this;
        const fidLength = parent.requiredFiles.length;
        function* iterate(index) {
            while (index < fidLength) {
                yield parent.requiredFiles[index++];
            }
        }
        return iterate(0);
    }
}
Then use like so:
//Use in test
const FilesInDirectory = require('./utilities/getfilesindirectory');
const StepsCollection = new FilesInDirectory('./test-steps');
const StepsGenerator = StepsCollection.nextFileGenerator();
//Assuming we're in an async function
await StepsGenerator.next().value.FUNCTION_REQUIRED_FROM_FILE(someArg);
