How can I use Node.js fs to sort a set of folders according to their created date? - node.js

I'm building a node.js application in which I need to read all the folders in a parent folder and display their names in the order they were created on the page. Here is what I have so far:
function getMixFolders() {
  const { readdirSync } = require('fs');
  const folderInfo = readdirSync('./shahspace.com/music mixes/', { withFileTypes: true })
    .filter(item => item.isDirectory() && item.name !== 'views');
  return folderInfo.map(folder => folder.name);
}
As you can see, I haven't implemented sorting. This is because readdirSync doesn't return the information I need. The only things it returns are the name of the folder and something called Symbol(type) (which seems to indicate whether it's a folder or file).
Is there another method for getting more details about the folders I'm reading from the parent folder? Specifically the created date?

There is no super-efficient way in Node.js to get a directory listing and statistics on each item (such as the create date) in one call. Instead, you have to distill the listing down to the files/folders you're interested in and then call fs.statSync() (or one of the similar variants) on each one to get that info. Here's a working version that looks like it does what you want:
Get directory list using the {withFileTypes: true} option
Filter to just folders
Ignore any folders named "views"
Get createDate of each folder
Sort the result by that createDate in ascending order (oldest folders first)
This code can be run as its own program to test:
const fs = require('fs');
const path = require('path');

const mixPath = './shahspace.com/music mixes/';

function getMixFolders() {
  const folderInfo = fs.readdirSync(mixPath, { withFileTypes: true })
    .filter(item => item.isDirectory() && item.name !== 'views')
    .map(folder => {
      const fullFolderPath = path.join(path.resolve(mixPath), folder.name);
      // ctimeMs is the last status-change time; on filesystems that record it,
      // stats.birthtimeMs holds the actual creation time instead.
      const stats = fs.statSync(fullFolderPath);
      return { path: fullFolderPath, ctimeMs: stats.ctimeMs };
    }).sort((a, b) => {
      return a.ctimeMs - b.ctimeMs;
    });
  return folderInfo;
}

let result = getMixFolders();
console.log(result);
If you wanted the final array to be only the folder names without the createDates you could add one more .map() to transform the final result.
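For example, a minimal sketch of that extra step, reusing getMixFolders() and the path module already required above:

// Keep only the folder names, still in created-date order
const sortedNames = getMixFolders().map(folder => path.basename(folder.path));
console.log(sortedNames);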

Related

How to create an import/export script using Node.JS?

I'm looking to import/export a list of files in a directory through an index.js file in the same directory.
For example, I have 2 files in a directory, admin.js and user.js, and I am looking to require and export them in the index.js like so:
module.exports = {
  admin: require("./admin"),
  users: require("./users"),
};
The script I have come up with looks like this, but it is not working and gives me an error:
fs.readdirSync(__dirname, (files) => {
  files.forEach((file) => {
    module.exports[file] = require(`./${file}`);
  });
});
How can I improve this script to make it work?
Thank you!
[Update - 2022 December 18]
Found a solution based on sequelize's models/index.js. This will pretty much require and export your files and folders; feel free to use and modify it:
const fs = require('fs')
const path = require('path')

const basename = path.basename(__filename)
const controllers = {}

fs.readdirSync(__dirname)
  .filter((folder) => {
    return folder.indexOf('.') !== 0 && folder !== basename
  })
  .forEach((folder) => {
    const controller = require(path.join(__dirname, folder))
    controllers[controller.name] = controller
  })

module.exports = controllers
fs.readdirSync() does NOT accept a callback. It directly returns the result:
const files = fs.readdirSync(__dirname);

for (let file of files) {
  module.exports[file] = require(`./${file}`);
}
Note, the future of the JavaScript language is using import and export with statically declared module names instead of require() and module.exports, and this structure will generally not work with the newer way of doing things. So, if you expect to eventually move to the newer ESM modules, you may not want to bake in this type of architecture.
There is a dynamic import in ESM modules, but it's asynchronous (returns a promise that you have to wait for).
Also, note that this will attempt to reload your index.js file containing this code. That might not be harmful, but may not be your intention.
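For reference, a rough ESM sketch of the same idea using dynamic import(). The index.mjs filename, the controllers object, and the extension filter are illustrative assumptions, not part of the original question:

// index.mjs - hypothetical sketch; assumes the sibling modules are ES modules
import { readdirSync } from 'fs';
import { fileURLToPath, pathToFileURL } from 'url';
import path from 'path';

const selfPath = fileURLToPath(import.meta.url);
const dirname = path.dirname(selfPath);

const controllers = {};
for (const file of readdirSync(dirname)) {
  const full = path.join(dirname, file);
  if (full === selfPath || !/\.(js|mjs)$/.test(file)) continue; // skip self and non-modules
  // import() is asynchronous; top-level await works because this is an ES module
  const mod = await import(pathToFileURL(full).href);
  controllers[path.parse(file).name] = mod.default ?? mod;
}

export default controllers;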

Bulk Renaming Files Based on JSON in Node Recursively

I'm wanting to bulk rename files within a folder based on a JSON file that I have with the following format:
{
  "1": {
    "Filename": "Background-1",
    "New Filename": "Background-1#4"
  },
  "2": {
    "Filename": "Background-2",
    "New Filename": "Background-2#6"
  },
The original Filenames are within a folder structure such as
Background
--Background-1
--Background-2
Other Folder
--Another-Filename
--Another-Filename-2
And so on and so forth. I want to copy the files with the new names, while retaining the name of the folder they're in, over to a new folder.
So far I've tried using fs and klaw-sync to read the filenames, traverse through directories, etc., but it seems wildly inefficient to run through each key and then run through each folder recursively to find a matching file, then rename it and copy. There are over 180 files and ~15 folders.
Any idea how I can approach this better, or any suggestions/examples I could use?
Here's what I've got so far.
Thanks.
// Require Node's File System module
const fs = require('fs');
var path = require('path');
var klawSync = require('klaw-sync');

// Read the JSON file
fs.readFile(__dirname + '/rename_config.json', function (error, data) {
  if (error) {
    console.log(error);
    return;
  }
  const obj = JSON.parse(data);

  // Iterate over the object
  Object.keys(obj).forEach(key => {
    // Create an empty variable to be accessible in the closure
    var paths;
    // The directory that you want to explore
    var directoryToExplore = path.join(__dirname, '../art');
    try {
      paths = klawSync(directoryToExplore);
    } catch (err) {
      console.error(err);
    }
    //console.log(paths);

    //traverse through paths to find an equal name
    //find the path of that equivalent name, then rename to new directory
  });
});
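One way to avoid the repeated traversal (a rough, untested sketch; the source and destination folder names are assumptions): build a lookup map from old filename to new filename once, then walk the tree a single time with klaw-sync and copy each matching file into the mirrored folder under its new name:

const fs = require('fs');
const path = require('path');
const klawSync = require('klaw-sync');

const config = JSON.parse(fs.readFileSync(path.join(__dirname, 'rename_config.json')));

// Build a map: "Background-1" -> "Background-1#4"
const renameMap = {};
for (const key of Object.keys(config)) {
  renameMap[config[key]['Filename']] = config[key]['New Filename'];
}

const sourceDir = path.join(__dirname, '../art');     // assumed source root
const targetDir = path.join(__dirname, '../renamed'); // assumed destination root

// Single pass over every file; klaw-sync's nodir option skips directories
for (const item of klawSync(sourceDir, { nodir: true })) {
  const ext = path.extname(item.path);
  const base = path.basename(item.path, ext);
  const newName = renameMap[base];
  if (!newName) continue; // no entry in the JSON for this file
  const relativeFolder = path.relative(sourceDir, path.dirname(item.path));
  const destFolder = path.join(targetDir, relativeFolder);
  fs.mkdirSync(destFolder, { recursive: true });
  fs.copyFileSync(item.path, path.join(destFolder, newName + ext));
}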

Nodejs readdir - only find files

When reading a directory, I currently have this:
fs.readdir(tests, (err, items) => {
  if (err) {
    return cb(err);
  }
  const cmd = items.filter(v => fs.lstatSync(tests + '/' + v).isFile());
  k.stdin.end(`${cmd}`);
});
First of all, I need a try/catch in there around fs.lstatSync, which I don't want to add. But is there a way to use fs.readdir to only find files?
Something like:
fs.readdir(tests, {type:'f'}, (err, items) => {});
does anyone know how?
Starting from Node v10.10.0, you can pass withFileTypes in the options parameter to get fs.Dirent objects instead of strings.
// or fs.promises.readdir to get a promise instead
const subPaths = fs.readdirSync(YOUR_BASE_PATH, {
  withFileTypes: true
});
// subPaths is of type fs.Dirent[]
const files = subPaths.filter((dirent) => dirent.isFile());
// files is fs.Dirent[] containing only the file entries
More info is in the Node documentation:
fs.Dirent
fs.readdirSync
fs.readdir
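If you'd rather keep the callback form from the question, the same withFileTypes option works there too. A small sketch reusing the tests, cb and k names from the question:

fs.readdir(tests, { withFileTypes: true }, (err, entries) => {
  if (err) {
    return cb(err);
  }
  // Keep only regular files; no lstatSync (and therefore no try/catch) needed
  const cmd = entries.filter((dirent) => dirent.isFile()).map((dirent) => dirent.name);
  k.stdin.end(`${cmd}`);
});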
Unfortunately, fs.readdir doesn't have an option to specify that you're only looking for files, not folders/directories (per docs). Filtering the results from fs.readdir to knock out the directories is your best bet.
https://nodejs.org/dist/latest-v10.x/docs/api/fs.html#fs_fs_readdir_path_options_callback
The optional options argument can be a string specifying an encoding, or an object with an encoding property specifying the character encoding to use for the filenames passed to the callback. If the encoding is set to 'buffer', the filenames returned will be passed as Buffer objects.
Yeah, fs.readdir can't do this currently (only read files or only read dirs).
I filed an issue with Node.js and it looks like it may be a good feature to add.
https://github.com/nodejs/node/issues/21804
If your use case is scripting/automation, you might try the fs-jetpack library. It can find files in a folder for you, but can also be configured for much more sophisticated searches.
const jetpack = require("fs-jetpack");
// Find all files in my_folder
const filesInFolder = jetpack.find("my_folder", { recursive: false }));
console.log(filesInFolder);
// Example of more sophisticated search:
// Find all `.js` files in the folder tree, with modify date newer than 2020-05-01
const borderDate = new Date("2020-05-01")
const found = jetpack.find("foo", {
matching: "*.js",
filter: (file) => {
return file.modifyTime > borderDate
}
});
console.log(found);

Node.js check if path is file or directory

I can't seem to get any search results that explain how to do this.
All I want to do is be able to know if a given path is a file or a directory (folder).
The following should tell you. From the docs:
fs.lstatSync(path_string).isDirectory()
Objects returned from fs.stat() and fs.lstat() are of this type.
stats.isFile()
stats.isDirectory()
stats.isBlockDevice()
stats.isCharacterDevice()
stats.isSymbolicLink() // (only valid with fs.lstat())
stats.isFIFO()
stats.isSocket()
NOTE:
The above solution will throw an Error if, for example, the file or directory doesn't exist.
If you want a true or false approach, try fs.existsSync(dirPath) && fs.lstatSync(dirPath).isDirectory(); as mentioned by Joseph in the comments below.
Update: Node.js >= 10
We can use the new fs.promises API
const fs = require('fs').promises;

(async () => {
  const stat = await fs.lstat('test.txt');
  console.log(stat.isFile());
})().catch(console.error);
Any Node.js version
Here's how you would detect if a path is a file or a directory asynchronously, which is the recommended approach in node.
using fs.lstat
const fs = require("fs");
let path = "/path/to/something";
fs.lstat(path, (err, stats) => {
if(err)
return console.log(err); //Handle error
console.log(`Is file: ${stats.isFile()}`);
console.log(`Is directory: ${stats.isDirectory()}`);
console.log(`Is symbolic link: ${stats.isSymbolicLink()}`);
console.log(`Is FIFO: ${stats.isFIFO()}`);
console.log(`Is socket: ${stats.isSocket()}`);
console.log(`Is character device: ${stats.isCharacterDevice()}`);
console.log(`Is block device: ${stats.isBlockDevice()}`);
});
Note when using the synchronous API:
When using the synchronous form any exceptions are immediately thrown.
You can use try/catch to handle exceptions or allow them to bubble up.
try {
  fs.lstatSync("/some/path").isDirectory();
} catch (e) {
  // Handle error
  if (e.code == 'ENOENT') {
    // no such file or directory
    // do something
  } else {
    // do something else
  }
}
Seriously, the question has existed for five years and there's still no nice facade?
function isDir(path) {
  try {
    var stat = fs.lstatSync(path);
    return stat.isDirectory();
  } catch (e) {
    // lstatSync throws an error if path doesn't exist
    return false;
  }
}
Depending on your needs, you can probably rely on node's path module.
You may not be able to hit the filesystem (e.g. the file hasn't been created yet), and tbh you probably want to avoid hitting the filesystem unless you really need the extra validation. If you can make the assumption that what you are checking for follows the .<extname> format, just look at the name.
Obviously if you are looking for a file without an extname you will need to hit the filesystem to be sure. But keep it simple until you need something more complicated.
const path = require('path');

function isFile(pathItem) {
  return !!path.extname(pathItem);
}
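For example (illustrative names only), this heuristic treats anything with an extension as a file, so a directory whose name happens to contain a dot would be misclassified:

isFile('notes.txt');   // true  - has an extension
isFile('src');         // false - no extension, assumed to be a directory
isFile('backup.2020'); // true  - even though this might actually be a directory on disk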
If you need this when iterating over a directory (Because that's how I've found this question):
Since Node 10.10+, fs.readdir has a withFileTypes option which makes it return directory entries (fs.Dirent) instead of strings. Directory entries have a name property and useful methods such as isDirectory or isFile, so you don't need to call fs.lstat explicitly.
import { promises as fs } from 'fs';

// ./my-dir has two subdirectories: dir-a, and dir-b
const dirEntries = await fs.readdir('./my-dir', { withFileTypes: true });

// let's filter all directories in ./my-dir
const onlyDirs = dirEntries.filter(de => de.isDirectory()).map(de => de.name);
// onlyDirs is now [ 'dir-a', 'dir-b' ]
Here's a function that I use. Nobody is making use of the promisify and await/async features in this post, so I thought I would share.
const promisify = require('util').promisify;
const lstat = promisify(require('fs').lstat);

async function isDirectory(path) {
  try {
    return (await lstat(path)).isDirectory();
  } catch (e) {
    return false;
  }
}
Note: I don't use require('fs').promises because it has been experimental for a year now; better not to rely on it.
The answers above check whether the filesystem contains a path that is a file or a directory. But they don't identify whether a given path alone is a file or a directory.
The answer is to identify directory-based paths using "/." like --> "/c/dos/run/." <-- trailing period.
Like a path to a directory or file that has not been written yet, or a path from a different computer, or a path where both a file and a directory with the same name exist.
// /tmp/
//  |- dozen.path
//  |- dozen.path/.
//  |- eggs.txt
//
// "/tmp/dozen.path" !== "/tmp/dozen.path/"
//
// Very few fs allow this. But still. Don't trust the filesystem alone!

// Converts the non-standard "path-ends-in-slash" to the standard
// "path-is-identified-by current '.' or previous '..' directory symbol".
function tryGetPath(pathItem) {
  const isPosix = pathItem.includes("/");
  if ((isPosix && pathItem.endsWith("/")) ||
      (!isPosix && pathItem.endsWith("\\"))) {
    pathItem = pathItem + ".";
  }
  return pathItem;
}

// If a path ends with a current directory identifier, it is a path! /c/dos/run/. and c:\dos\run\.
function isDirectory(pathItem) {
  const isPosix = pathItem.includes("/");
  if (pathItem === "." || pathItem === "..") {
    pathItem = (isPosix ? "./" : ".\\") + pathItem;
  }
  return (isPosix
    ? pathItem.endsWith("/.") || pathItem.endsWith("/..")
    : pathItem.endsWith("\\.") || pathItem.endsWith("\\.."));
}

// If a path is not a directory, and it isn't empty, it must be a file
function isFile(pathItem) {
  if (pathItem === "") {
    return false;
  }
  return !isDirectory(pathItem);
}
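Illustrative usage of the helpers above (the paths are made up):

tryGetPath('/c/dos/run/');      // "/c/dos/run/." - normalised to end in the "." directory marker
isDirectory('/c/dos/run/.');    // true
isDirectory('/tmp/dozen.path'); // false - no trailing "/." so it is treated as a file name
isFile('/tmp/eggs.txt');        // true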
Node version: v11.10.0 - Feb 2019
Last thought: Why even hit the filesystem?
I could check if a directory or file exists using this:
// This returns if the path is not a directory.
if (fs.lstatSync(dir).isDirectory() == false) return;

// This returns if the path is not a file.
if (fs.lstatSync(dir).isFile() == false) return;
Function that returns type
I like coffee
type: (uri) -> (fina) ->
  fs.lstat uri, (erro, stats) ->
    console.log {erro} if erro
    fina(
      stats.isDirectory() and "directory" or
      stats.isFile() and "document" or
      stats.isSymbolicLink() and "link" or
      stats.isSocket() and "socket" or
      stats.isBlockDevice() and "block" or
      stats.isCharacterDevice() and "character" or
      stats.isFIFO() and "fifo"
    )
usage:
dozo.type("<path>") (type) ->
console.log "type is #{type}"

node.js require all files in a folder?

How do I require all files in a folder in node.js?
I need something like:
files.forEach(function (v, k) {
  // require routes
  require('./routes/' + v);
});
When require is given the path of a folder, it'll look for an index.js file in that folder; if there is one, it uses that, and if there isn't, it fails.
It would probably make most sense (if you have control over the folder) to create an index.js file and then assign all the "modules" and then simply require that.
yourfile.js
var routes = require("./routes");
index.js
exports.something = require("./routes/something.js");
exports.others = require("./routes/others.js");
If you don't know the filenames you should write some kind of loader.
Working example of a loader:
var normalizedPath = require("path").join(__dirname, "routes");
require("fs").readdirSync(normalizedPath).forEach(function(file) {
require("./routes/" + file);
});
// Continue application logic here
I recommend using glob to accomplish that task.
var glob = require( 'glob' )
  , path = require( 'path' );

glob.sync( './routes/**/*.js' ).forEach( function( file ) {
  require( path.resolve( file ) );
});
Based on @tbranyen's solution, I created an index.js file that loads arbitrary JavaScript files under the current folder as part of the exports.
// Load `*.js` under current directory as properties
//  i.e., `User.js` will become `exports['User']` or `exports.User`
require('fs').readdirSync(__dirname + '/').forEach(function(file) {
  if (file.match(/\.js$/) !== null && file !== 'index.js') {
    var name = file.replace('.js', '');
    exports[name] = require('./' + file);
  }
});
Then you can require this directory from anywhere else.
Another option is to use the package require-dir, which lets you do the following. It supports recursion as well.
var requireDir = require('require-dir');
var dir = requireDir('./path/to/dir');
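Recursion is opt-in via an option; a small sketch (the option name recurse is from the require-dir README, so check the version you have installed):

var requireDir = require('require-dir');
var dir = requireDir('./path/to/dir', { recurse: true });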
I have a folder /fields full of files with a single class each, ex:
fields/Text.js -> Text class
fields/Checkbox.js -> Checkbox class
Drop this in fields/index.js to export each class:
var collectExports, fs, path,
  __hasProp = {}.hasOwnProperty;

fs = require('fs');
path = require('path');

collectExports = function(file) {
  var func, include, _results;

  if (path.extname(file) === '.js' && file !== 'index.js') {
    include = require('./' + file);
    _results = [];
    for (func in include) {
      if (!__hasProp.call(include, func)) continue;
      _results.push(exports[func] = include[func]);
    }
    return _results;
  }
};

fs.readdirSync('./fields/').forEach(collectExports);
This makes the modules act more like they would in Python:
var text = new Fields.Text()
var checkbox = new Fields.Checkbox()
One more option is require-dir-all, which combines features from the most popular packages.
The most popular, require-dir, does not have options to filter the files/dirs and does not have a map function (see below), but uses a small trick to find the module's current path.
Second by popularity, require-all has regexp filtering and preprocessing, but lacks relative paths, so you need to use __dirname (this has pros and cons) like:
var libs = require('require-all')(__dirname + '/lib');
The require-index mentioned here is quite minimalistic.
With map you may do some preprocessing, like creating objects and passing config values (assuming the modules below export constructors):
// Store config for each module in config object properties
// with property names corresponding to module names
var config = {
  module1: { value: 'config1' },
  module2: { value: 'config2' }
};

// Require all files in modules subdirectory
var modules = require('require-dir-all')(
  'modules', // Directory to require
  {          // Options
    // function to be post-processed over exported object for each require'd module
    map: function(reqModule) {
      // create new object with corresponding config passed to constructor
      reqModule.exports = new reqModule.exports( config[reqModule.name] );
    }
  }
);

// Now `modules` object holds not exported constructors,
// but objects constructed using values provided in `config`.
I know this question is 5+ years old, and the given answers are good, but I wanted something a bit more powerful for express, so I created the express-map2 package for npm. I was going to name it simply express-map, but the people at Yahoo already have a package with that name, so I had to rename my package.
1. basic usage:
app.js (or whatever you call it)
var app = require('express')(); // 1. include express and create the app

app.set('controllers', __dirname + '/controllers/'); // 2. set path to your controllers.

require('express-map2')(app); // 3. patch map() into express

app.map({
  'GET /': 'test',
  'GET /foo': 'middleware.foo,test',
  'GET /bar': 'middleware.bar,test' // separate your handlers with a comma.
});
controller usage:
// single function
module.exports = function(req, res) {
};

// export an object with multiple functions.
module.exports = {
  foo: function(req, res) {
  },
  bar: function(req, res) {
  }
};
2. advanced usage, with prefixes:
app.map('/api/v1/books', {
  'GET /': 'books.list',         // GET /api/v1/books
  'GET /:id': 'books.loadOne',   // GET /api/v1/books/5
  'DELETE /:id': 'books.delete', // DELETE /api/v1/books/5
  'PUT /:id': 'books.update',    // PUT /api/v1/books/5
  'POST /': 'books.create'       // POST /api/v1/books
});
As you can see, this saves a ton of time and makes the routing of your application dead simple to write, maintain, and understand. It supports all of the HTTP verbs that express supports, as well as the special .all() method.
npm package: https://www.npmjs.com/package/express-map2
github repo: https://github.com/r3wt/express-map
Expanding on this glob solution: do this if you want to import all modules from a directory into index.js and then import that index.js in another part of the application.
const glob = require("glob");
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
/* see note about this in example below */
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
Full Example
Directory structure
globExample/example.js
globExample/foobars/index.js
globExample/foobars/unexpected.js
globExample/foobars/barit.js
globExample/foobars/fooit.js
globExample/example.js
const { foo, bar, keepit } = require('./foobars/index');
const longStyle = require('./foobars/index');
console.log(foo()); // foo ran
console.log(bar()); // bar ran
console.log(keepit()); // keepit ran unexpected
console.log(longStyle.foo()); // foo ran
console.log(longStyle.bar()); // bar ran
console.log(longStyle.keepit()); // keepit ran unexpected
globExample/foobars/index.js
const glob = require("glob");
/*
Note the following style also works with multiple exports per file (barit.js example)
but will overwrite if you have 2 exports with the same
name (unexpected.js and barit.js have a keepit function) in the files being imported. As a result, this method is best used when
your exporting one module per file and use the filename to easily identify what is in it.
Also Note: This ignores itself (index.js) by default to prevent infinite loop.
*/
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
globExample/foobars/unexpected.js
exports.keepit = () => 'keepit ran unexpected';
globExample/foobars/barit.js
exports.bar = () => 'bar run';
exports.keepit = () => 'keepit ran';
globExample/foobars/fooit.js
exports.foo = () => 'foo ran';
From inside project with glob installed, run node example.js
$ node example.js
foo ran
bar run
keepit ran unexpected
foo ran
bar run
keepit ran unexpected
One module that I have been using for this exact use case is require-all.
It recursively requires all files in a given directory and its sub-directories as long as they don't match the excludeDirs property.
It also allows specifying a file filter and how to derive the keys of the returned hash from the filenames.
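A typical invocation looks roughly like this (the options are taken from the require-all README; the directory name and regexes here are purely illustrative):

var controllers = require('require-all')({
  dirname: __dirname + '/controllers',
  filter: /(.+Controller)\.js$/, // only pick up files ending in "Controller.js"
  excludeDirs: /^\.(git|svn)$/,  // skip these directories
  recursive: true
});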
Require all files from the routes folder and apply them as middleware. No external modules needed.
// require
const { readdirSync } = require("fs");
// apply as middleware
readdirSync("./routes").map((r) => app.use("/api", require("./routes/" + r)));
I'm using the node module copy-to to create a single file that requires all the files in our NodeJS-based system.
The code for our utility file looks like this:
/**
 * Module dependencies.
 */
var copy = require('copy-to');

copy(require('./module1'))
  .and(require('./module2'))
  .and(require('./module3'))
  .to(module.exports);
In all of the files, most functions are written as exports, like so:
exports.function1 = function () { /* function contents */ };
exports.function2 = function () { /* function contents */ };
exports.function3 = function () { /* function contents */ };
So, then to use any function from a file, you just call:
var utility = require('./utility');
var response = utility.function2(); // or whatever the name of the function is
You can use: https://www.npmjs.com/package/require-file-directory
Require selected files by name only, or all files.
No need for absolute paths.
Easy to understand and use.
Using this function you can require a whole dir.
const PATH = require("path");

const GetAllModules = (dirname) => {
  if (dirname) {
    let dirItems = require("fs").readdirSync(dirname);
    return dirItems.reduce((acc, value, index) => {
      if (PATH.extname(value) == ".js" && value.toLowerCase() != "index.js") {
        let moduleName = value.replace(/\.js$/, '');
        acc[moduleName] = require(`${dirname}/${moduleName}`);
      }
      return acc;
    }, {});
  }
};
// calling this function.
let dirModules = GetAllModules(__dirname);
Create an index.js file in your folder with this code:
const fs = require('fs')

const files = fs.readdirSync('./routes')
for (const file of files) {
  require('./' + file)
}
And after that you can simply load the whole folder with require("./routes"). Note that this index.js only runs each file for its side effects; it doesn't re-export anything.
If you want to include all *.js files in a directory (for example "app/lib/*.js"):
In directory app/lib
example.js:
module.exports = function (example) { }
example-2.js:
module.exports = function (example2) { }
In directory app create index.js
index.js:
module.exports = require('./app/lib');
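As written, require('./app/lib') only resolves if app/lib has its own index.js (or a package.json with a main field) re-exporting its files, and from a file inside app the relative path would be './lib'. A minimal sketch of such an aggregator, assuming just the two files above:

// app/lib/index.js - hypothetical aggregator so the folder can be require()'d
module.exports = {
  example: require('./example'),
  example2: require('./example-2')
};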
