Can you pull in excludes/includes options in the TypeScript Compiler API? - node.js

I'm trying to use the TypeScript API to do some custom compilation (using plugins, etc.).
When compiling from the command line, TypeScript will automatically take into account not only the compilerOptions, but also the exclude key, as well as the types key (among other things).
The TypeScript API documentation is almost non-existent, but I've tried manually doing a deep file search and excluding these files from the total file list passed into the createProgram method. However, this has taken a toll on our build performance, and I think the TypeScript compiler has a much more efficient solution.
Here's what I have in my compile.ts file.
import ts from 'typescript';
import keysTransformer from 'ts-transformer-keys/transformer';
import tsConfig from './tsconfig.json';
import fs from 'fs';

const deepFindTsFiles = (dir: string, fileList?: string[]) => {
  dir = dir.endsWith('/') ? dir : dir + '/';
  const files = fs.readdirSync(dir);
  fileList = fileList || [];
  files.forEach((file) => {
    console.log('file', dir + file);
    console.log(tsConfig.exclude.includes(dir + file));
    if (fs.statSync(dir + file).isDirectory() && !tsConfig.exclude.includes(dir + file)) {
      fileList = deepFindTsFiles(dir + file + '/', fileList);
    } else if (!tsConfig.exclude.includes(dir + file) && file.split('.')[1].includes('ts')) {
      fileList!.push(dir + file);
    }
  });
  return fileList;
};
const fileList = deepFindTsFiles(tsConfig.compilerOptions.rootDir);
// the compiler options parsed from tsconfig.json
const { options } = ts.parseJsonConfigFileContent(
  tsConfig,
  ts.sys,
  __dirname
);

// createProgram doesn't seem to have an option for excludes/includes/types
const program = ts.createProgram(fileList, options);

const transformers = {
  before: [keysTransformer(program)],
  after: []
};
program.emit(undefined, undefined, undefined, false, transformers);
The code above properly compiles my source code and applies the proper plugins; however, it's much slower than using the command-line tools.
What's passed into the createProgram method is just the compilerOptions object from my tsconfig.json file, not the files to exclude. The createProgram method does not seem to take any arguments like this.
But surely there is a way? Right? I have to imagine that this is possible, but the documentation is severely lacking.

I managed to figure out how to get the list of fileNames using the tsconfig.json and the TS compiler API.
const configPath = ts.findConfigFile(
  "./", /*searchPath*/
  ts.sys.fileExists,
  "tsconfig.json"
);
if (!configPath) throw new Error("Could not find a valid 'tsconfig.json'.");

const configFile = ts.readJsonConfigFile(configPath, ts.sys.readFile);
const { fileNames } = ts.parseJsonSourceFileConfigFileContent(
  configFile,
  ts.sys,
  './'
);
console.log(fileNames);
This takes into account the includes and excludes specified in the tsconfig.json.
This might not be the best way to do this but I have been struggling to figure this out for a while.
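The same parsed result also exposes the resolved compilerOptions, so both pieces can be fed straight into createProgram. A minimal sketch of tying it back to the original compile.ts (untested, and assuming the same ts-transformer-keys setup as in the question):

import ts from 'typescript';
import keysTransformer from 'ts-transformer-keys/transformer';

const configPath = ts.findConfigFile('./', ts.sys.fileExists, 'tsconfig.json');
if (!configPath) throw new Error("Could not find a valid 'tsconfig.json'.");

// Resolves include/exclude/files into fileNames and the compilerOptions into options in one pass.
const configFile = ts.readJsonConfigFile(configPath, ts.sys.readFile);
const { fileNames, options, errors } = ts.parseJsonSourceFileConfigFileContent(configFile, ts.sys, './');
if (errors.length) console.warn(errors);

const program = ts.createProgram(fileNames, options);
const transformers = {
  before: [keysTransformer(program)],
  after: []
};
program.emit(undefined, undefined, undefined, false, transformers);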

Related

How to create an import/export script using Node.JS?

I'm looking to import/export a list of files in a directory through an index.js file in the same directory.
For example, I have 2 files in a directory, admin.js and user.js, and I am looking to require and export them in the index.js like so:
module.exports = {
  admin: require("./admin"),
  users: require("./users"),
};
The script I have come up with looks like this but it is not working and giving me an error
fs.readdirSync(__dirname, (files) => {
  files.forEach((file) => {
    module.exports[file] = require(`./${file}`);
  });
});
How can I improve this script to make it work?
Thank you!
[Update - 2022 December 18]
Found a solution based on sequelize's models/index.js. This will pretty much require and export your files and folders; feel free to use and modify it.
const fs = require('fs')
const path = require('path')
const basename = path.basename(__filename)
const controllers = {}

fs.readdirSync(__dirname)
  .filter((folder) => {
    return folder.indexOf('.') !== 0 && folder !== basename
  })
  .forEach((folder) => {
    const controller = require(path.join(__dirname, folder))
    controllers[controller.name] = controller
  })

module.exports = controllers
fs.readdirSync() does NOT accept a callback. It directly returns the result:
const files = fs.readdirSync(__dirname);
for (let file of files) {
  module.exports[file] = require(`./${file}`);
}
Note, the future of the JavaScript language is using import and export with statically declared module names instead of require() and module.exports, and this structure will generally not work with the newer way of doing things. So, if you expect to eventually move to the newer ESM modules, you may not want to bake in this type of architecture.
There is a dynamic import in ESM modules, but it's asynchronous (returns a promise that you have to wait for).
Also, note that this will attempt to reload the index.js file containing this code. That might not be harmful, but it may not be your intention.
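If that matters, a small tweak to the loop above (just a sketch, assuming the loader lives in index.js) skips the loader file itself and anything that isn't a .js module:

const fs = require('fs');
const path = require('path');

const basename = path.basename(__filename); // 'index.js'

for (const file of fs.readdirSync(__dirname)) {
  // skip this loader file and anything that isn't a .js module
  if (file === basename || path.extname(file) !== '.js') continue;
  module.exports[path.basename(file, '.js')] = require(`./${file}`);
}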

Can you define a 'page_objects_path' directory which will read from all sub-folders without specifying them explicitly?

The project that I am currently working on uses Selenium WebDriver, Nightwatch and Cucumber.
The issue is that the project's folder structure has changed, and now 'page_objects_path' in the 'nightwatch.conf.js' file looks something like this:
'page_objects_path':
  [
    "./componentTests/page-objects",
    "./componentTests/page-objects/xxxxxx",
    "./componentTests/page-objects/xxxxx xxxx",
    "./endToEndTests/page-objects",
    "./endToEndTests/page-objects/xxxx",
    "./endToEndTests/page-objects/xxxxxxx",
    "./endToEndTests/page-objects/xxxx xxxxx",
    "./endToEndTests/page-objects/xxxxxx",
    "./endToEndTests/page-objects/xxxxxxxxxx"
  ],
Is there any way, where Nightwatch can read all sub-folders from /page-objects/ directory without being explicitly specified in the array as separate paths?
I believe
'page_objects_path':
  [
    "./componentTests/page-objects",
    "./endToEndTests/page-objects",
  ],
should be enough.
The page objects are then namespaced by the sub-folders of your structure.
E.g. the method getTheCoolElement() from "./endToEndTests/page-objects/mainPage/SubPage.js" should be called like: browser.page.mainPage.SubPage().getTheCoolElement()
See working example in owncloud phoenix project, that has a hierarchy of page objects: https://github.com/owncloud/phoenix/tree/master/tests/acceptance/pageObjects
Alternatively, you could just create that array programmatically using JS, e.g.:
const fs = require('fs')
const path = require('path') // needed for path.join below

const getAllFolders = function (dirPath, arrayOfFiles) {
  const files = fs.readdirSync(dirPath)
  arrayOfFiles = arrayOfFiles || []
  files.forEach(function (file) {
    if (fs.statSync(dirPath + '/' + file).isDirectory()) {
      arrayOfFiles = getAllFolders(dirPath + '/' + file, arrayOfFiles)
      arrayOfFiles.push(path.join(dirPath, '/', file))
    }
  })
  return arrayOfFiles
}

let allPageObjectPath = getAllFolders(
  path.join(__dirname, '/componentTests/page-objects')
)
allPageObjectPath = allPageObjectPath.concat(
  getAllFolders(path.join(__dirname, '/endToEndTests/page-objects'))
)

module.exports = {
  page_objects_path: allPageObjectPath,
  // ...
}

Alternative for __dirname in Node.js when using ES6 modules

I use the flag --experimental-modules when running my Node application in order to use ES6 modules.
However, when I use this flag, the metavariable __dirname is not available. Is there an alternative way to get the same string that is stored in __dirname that is compatible with this mode?
As of Node.js 10.12 there's an alternative that doesn't require creating multiple files and handles special characters in filenames across platforms:
import { dirname } from 'path';
import { fileURLToPath } from 'url';
const __dirname = dirname(fileURLToPath(import.meta.url));
The most standardized way in 2021
import { URL } from 'url'; // in the browser, URL is natively accessible on window
const __filename = new URL('', import.meta.url).pathname;
// Will contain trailing slash
const __dirname = new URL('.', import.meta.url).pathname;
And forget about join() to create paths relative to the current file; just use the URL:
const pathToAdjacentFooFile = new URL('./foo.txt', import.meta.url).pathname;
const pathToUpperBarFile = new URL('../bar.json', import.meta.url).pathname;
For Node 10.12 +...
Assuming you are working from a module, this solution should work, and it gives you __filename support as well:
import path from 'node:path';
import { fileURLToPath } from 'node:url';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
The nice thing is that you are also only two lines of code away from supporting require() for CommonJS modules. For that you would add:
import { createRequireFromPath } from 'module';
const require = createRequireFromPath(__filename);
In most cases, using what is native to Node.js (with ES modules) rather than external resources, the use of __filename and __dirname can be totally unnecessary. Most (if not all) of the native methods for reading (streaming) support the new URL + import.meta.url approach, exactly as the official documentation itself suggests:
No __filename or __dirname
No JSON Module Loading
No require.resolve
As you can see in the description of each method, the path parameter lists the supported formats, which include <URL>. Examples:

Method                                         path param supports
fs.readFile(path[, options], callback)         <string>, <Buffer>, <URL>, <integer>
fs.readFileSync(path[, options])               <string>, <Buffer>, <URL>, <integer>
fs.readdir(path[, options], callback)          <string>, <Buffer>, <URL>
fs.readdirSync(path[, options])                <string>, <Buffer>, <URL>
fsPromises.readdir(path[, options])            <string>, <Buffer>, <URL>
fsPromises.readFile(path[, options])           <string>, <Buffer>, <URL>, <FileHandle>
So new URL('<path or file>', import.meta.url) solves it, and you don't need to manipulate strings and create variables to concatenate later.
Examples:
See how it is possible to read a file at the same level as the script without needing __filename or any workaround:
import { readFileSync } from 'fs';
const output = readFileSync(new URL('./foo.txt', import.meta.url));
console.log(output.toString());
List all files in the script directory:
import { readdirSync } from 'fs';
readdirSync(new URL('./', import.meta.url)).forEach((dirContent) => {
  console.log(dirContent);
});
Note: In the examples I used the synchronous functions just to make it easier to copy and execute.
If the intention is to build your own logging (or something similar) that depends on third parties, it may be worth doing some things manually, but within the language and Node.js this is not necessary: with ES modules it is entirely possible to depend on neither __filename nor __dirname, since the native new URL support already solves it.
Note that if you are interested in using something like require at strategic times and need the absolute path from the main script, you can use module.createRequire(filename) (Node.js v12.2.0+ only) combined with import.meta.url to load scripts at levels other than the current script level. This already avoids the need for __dirname. An example using import.meta.url with module.createRequire:
import { createRequire } from 'module';
const require = createRequire(import.meta.url);
// foo-bar.js is a CommonJS module.
const fooBar = require('./foo-bar');
fooBar();
Source from foo-bar.js:
module.exports = () => {
  console.log('hello world!');
};
Which is similar to using without "ECMAScript modules":
const fooBar = require('./foo-bar');
There have been proposals about exposing these variables through import.meta, but for now, you need a hacky workaround that I found here:
// expose.js
module.exports = {__dirname};
// use.mjs
import expose from './expose.js';
const {__dirname} = expose;
I used:
import path from 'path';
const __dirname = path.resolve(path.dirname(decodeURI(new URL(import.meta.url).pathname)));
decodeURI was important: I used spaces and other special characters within the path on my test system.
path.resolve() handles relative URLs.
edit:
fix to support windows (/C:/... => C:/...):
import path from 'path';
const __dirname = (() => {
  let x = path.dirname(decodeURI(new URL(import.meta.url).pathname));
  return path.resolve(process.platform == "win32" ? x.substr(1) : x);
})();
I made this module es-dirname that will return the current script dirname.
import dirname from 'es-dirname'
console.log(dirname())
It works both in CommonJs scripts and in ES Modules both on Windows and Linux.
Open an issue there if you get an error; the script has been working so far in my projects, but it might fail in some other cases. For this reason, do not use it in a production environment. This is a temporary solution, as I am sure the Node.js team will release a robust way to do it in the near future.
import path from 'path';
import { fileURLToPath } from 'url';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// do not use the following code which is bad for CJK characters
const __filename = new URL('', import.meta.url).pathname;
import path from 'path';
const __dirname = path.join(path.dirname(decodeURI(new URL(import.meta.url).pathname))).replace(/^\\([A-Z]:\\)/, "$1");
This code also works on Windows. (the replacement is safe on other platforms, since path.join returns back-slash separators only on Windows)
Since other answers, while useful, don't cover both cross-platform cases (Windows and POSIX) and/or path resolution beyond __dirname or __filename, and it's kind of verbose to repeat this kind of code everywhere:
import { dirname, join } from 'path'
import { fileURLToPath } from 'url'
const __filename = fileURLToPath(import.meta.url)
const __dirname = dirname(__filename)
const somePath = join(__dirname, '../some-dir-or-some-file')
I just published an npm package called esm-path to help with this kind of recurring task, hoping it can also be useful to others.
It's documented, but here's how to use it:
import { getAbsolutePath } from 'esm-path'
const currentDirectoryPath = getAbsolutePath(import.meta.url)
console.log(currentDirectoryPath)
const parentDirectoryPath = getAbsolutePath(import.meta.url, '..')
console.log(parentDirectoryPath)
// Adapt the relative path to your case
const packageJsonFilePath = getAbsolutePath(import.meta.url, '../package.json')
console.log(packageJsonFilePath)

// Adapt the relative path to your case (equivalent, passing the segments separately)
const samePackageJsonFilePath = getAbsolutePath(import.meta.url, '..', 'package.json')
console.log(samePackageJsonFilePath)
Just use path.resolve() method.
import { resolve } from 'path';
app.use('/public/uploads', express.static(resolve('public', 'uploads')))
I use this option: since the path starts with file://, just remove that part.
const __filename = import.meta.url.slice(7);
const __dirname = import.meta.url.slice(7, import.meta.url.lastIndexOf("/"));
As Geoff pointed out, the following code returns not the module's path but the working directory.
import path from 'path';
const __dirname = path.resolve();
works with --experimental-modules
Create a file called root-dirname.js in your project root with this in it:

import { dirname } from 'path'

const dn = dirname(new URL(import.meta.url).pathname)
const __dirname = process.platform === 'win32' ? dn.substr(1) : dn // remove the leading slash on Windows

export const rootDirname = __dirname
Then just import rootDirname when you want the path to the project root folder.
Other than that, Rudolf Gröhling's answer is also correct.
You can use the stack from a new Error(). The error doesn't need to be thrown, and won't stop program execution either. The first line of the stack will always be the error and its message, with the second line being the file from which the error was invoked.
Since this is a method (which is probably in a util.js file), the real location of the getDirname() call is actually the third line of the error stack.
export const getDirname = () => {
  // get the stack
  const { stack } = new Error();
  // get the third line (the original invoker)
  const invokeFileLine = stack.split(`\n`)[2];
  // match the file URL from file://(.+)/ and get the first capturing group
  // the (.+) is a greedy quantifier and will make the RegExp expand to the largest match
  const __dirname = invokeFileLine.match(/file:\/\/(.+)\//)[1];
  return __dirname;
};
another option
import {createRequire} from 'module'; // need node v12.2.0
const require = createRequire(import.meta.url);
const __dirname = require.resolve.paths('.')[0];
I have also published a package on NPM called cross-dirname (forked from es-dirname). The package is tested with Node.js (ESM and CJS), Deno and GJS.
Example:
import dirname from 'cross-dirname'
console.log(dirname())
Agree or disagree with the use of global, I found this to be the easiest way to remember and refactor existing code.
Put somewhere early in your code execution:
import { fileURLToPath } from 'node:url';
import { dirname } from 'node:path';
global.___filename = (path) => {
  return fileURLToPath(path);
};
global.___dirname = (path) => {
  return dirname(global.___filename(path));
};
And then in whichever file you need dirname or filename:
___filename(import.meta.url)
___dirname(import.meta.url)
Of course if we had macros, I wouldn't need to pass import.meta.url, perhaps there's an improvement.
process.cwd()
From the documentation:
The process.cwd() method returns the current working directory of the Node.js process.
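Be aware, though, that the working directory is not the same thing as the directory of the current module; it depends on where the process was launched from. A quick sketch of the difference (the paths here are hypothetical):

// Suppose this module lives at /home/me/app/src/tool.js
// and the process is started with:  cd /home/me && node app/src/tool.js
import { dirname } from 'node:path';
import { fileURLToPath } from 'node:url';

console.log(process.cwd());                           // /home/me
console.log(dirname(fileURLToPath(import.meta.url))); // /home/me/app/src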

Node.js check if path is file or directory

I can't seem to get any search results that explain how to do this.
All I want to do is be able to know if a given path is a file or a directory (folder).
The following should tell you. From the docs:
fs.lstatSync(path_string).isDirectory()
Objects returned from fs.stat() and fs.lstat() are of this type.
stats.isFile()
stats.isDirectory()
stats.isBlockDevice()
stats.isCharacterDevice()
stats.isSymbolicLink() // (only valid with fs.lstat())
stats.isFIFO()
stats.isSocket()
NOTE:
The above solution will throw an Error if, for example, the file or directory doesn't exist.
If you want a true or false approach, try fs.existsSync(dirPath) && fs.lstatSync(dirPath).isDirectory(); as mentioned by Joseph in the comments below.
Update: Node.js >= 10
We can use the new fs.promises API:

const fs = require('fs').promises;

(async () => {
  const stat = await fs.lstat('test.txt');
  console.log(stat.isFile());
})().catch(console.error)
Any Node.js version
Here's how you would detect if a path is a file or a directory asynchronously, which is the recommended approach in Node.
Using fs.lstat:
const fs = require("fs");

let path = "/path/to/something";

fs.lstat(path, (err, stats) => {
  if (err)
    return console.log(err); // Handle error

  console.log(`Is file: ${stats.isFile()}`);
  console.log(`Is directory: ${stats.isDirectory()}`);
  console.log(`Is symbolic link: ${stats.isSymbolicLink()}`);
  console.log(`Is FIFO: ${stats.isFIFO()}`);
  console.log(`Is socket: ${stats.isSocket()}`);
  console.log(`Is character device: ${stats.isCharacterDevice()}`);
  console.log(`Is block device: ${stats.isBlockDevice()}`);
});
Note when using the synchronous API:
When using the synchronous form any exceptions are immediately thrown.
You can use try/catch to handle exceptions or allow them to bubble up.
try {
  fs.lstatSync("/some/path").isDirectory()
} catch (e) {
  // Handle error
  if (e.code == 'ENOENT') {
    // no such file or directory
    // do something
  } else {
    // do something else
  }
}
Seriously, the question has existed for five years and there's no nice facade?
function isDir(path) {
  try {
    var stat = fs.lstatSync(path);
    return stat.isDirectory();
  } catch (e) {
    // lstatSync throws an error if path doesn't exist
    return false;
  }
}
Depending on your needs, you can probably rely on node's path module.
You may not be able to hit the filesystem (e.g. the file hasn't been created yet), and tbh you probably want to avoid hitting the filesystem unless you really need the extra validation. If you can make the assumption that what you are checking for follows the .<extname> format, just look at the name.
Obviously, if you are looking for a file without an extname you will need to hit the filesystem to be sure. But keep it simple until you need something more complicated.
const path = require('path');

function isFile(pathItem) {
  return !!path.extname(pathItem);
}
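For instance (hypothetical file names, just to illustrate the heuristic):

isFile('report.pdf');  // true  - has an extension
isFile('src');         // false - no extension, assumed to be a directory
isFile('.gitignore');  // false - path.extname() returns '' for dotfiles, a known limitation of this heuristic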
If you need this when iterating over a directory (because that's how I found this question):
Since Node 10.10+, fs.readdir has a withFileTypes option which makes it return fs.Dirent directory entries instead of strings. Directory entries have a name property and useful methods such as isDirectory or isFile, so you don't need to call fs.lstat explicitly.
import { promises as fs } from 'fs';
// ./my-dir has two subdirectories: dir-a, and dir-b
const dirEntries = await fs.readdir('./my-dir', { withFileTypes: true });
// let's filter all directories in ./my-dir
const onlyDirs = dirEntries.filter(de => de.isDirectory()).map(de => de.name);
// onlyDirs is now [ 'dir-a', 'dir-b' ]
Here's a function that I use. Nobody is making use of promisify and the async/await feature in this post, so I thought I would share.
const promisify = require('util').promisify;
const lstat = promisify(require('fs').lstat);

async function isDirectory (path) {
  try {
    return (await lstat(path)).isDirectory();
  } catch (e) {
    return false;
  }
}
Note: I don't use require('fs').promises because it has been experimental for a year now; better not to rely on it.
The answers above check whether the filesystem contains a path that is a file or directory. But they don't identify whether a given path, on its own, is a file or a directory.
The answer is to identify directory-based paths using "/." like --> "/c/dos/run/." <-- a trailing period.
This works for, e.g., a path to a directory or file that has not been written yet, a path from a different computer, or a path where both a file and a directory of the same name exist.
// /tmp/
// |- dozen.path
// |- dozen.path/.
// |- eggs.txt
//
// "/tmp/dozen.path" !== "/tmp/dozen.path/"
//
// Very few fs allow this. But still. Don't trust the filesystem alone!

// Converts the non-standard "path-ends-in-slash" to the standard
// "path-is-identified-by the current '.' or previous '..' directory symbol".
function tryGetPath(pathItem) {
  const isPosix = pathItem.includes("/");
  if ((isPosix && pathItem.endsWith("/")) ||
      (!isPosix && pathItem.endsWith("\\"))) {
    pathItem = pathItem + ".";
  }
  return pathItem;
}

// If a path ends with a current directory identifier, it is a directory! /c/dos/run/. and c:\dos\run\.
function isDirectory(pathItem) {
  const isPosix = pathItem.includes("/");
  if (pathItem === "." || pathItem === "..") {
    pathItem = (isPosix ? "./" : ".\\") + pathItem;
  }
  return (isPosix
    ? pathItem.endsWith("/.") || pathItem.endsWith("/..")
    : pathItem.endsWith("\\.") || pathItem.endsWith("\\.."));
}

// If a path is not a directory, and it isn't empty, it must be a file
function isFile(pathItem) {
  if (pathItem === "") {
    return false;
  }
  return !isDirectory(pathItem);
}
Node version: v11.10.0 - Feb 2019
Last thought: Why even hit the filesystem?
I could check if a directory or file exists using this:
// This returns if the path is not a directory.
if (fs.lstatSync(dir).isDirectory() == false) return;

// This returns if the path is not a file.
if (fs.lstatSync(dir).isFile() == false) return;
Function that returns type
I like coffee
type: (uri) -> (fina) ->
  fs.lstat uri, (erro, stats) ->
    console.log {erro} if erro
    fina(
      stats.isDirectory() and "directory" or
      stats.isFile() and "document" or
      stats.isSymbolicLink() and "link" or
      stats.isSocket() and "socket" or
      stats.isBlockDevice() and "block" or
      stats.isCharacterDevice() and "character" or
      stats.isFIFO() and "fifo"
    )
usage:
dozo.type("<path>") (type) ->
  console.log "type is #{type}"
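For those who don't write CoffeeScript, a rough plain-JavaScript equivalent of the same idea (written here as a standalone function rather than the dozo.type method, and with an early return added on error so stats isn't read when it is undefined):

const fs = require('fs');

// Returns a function that takes a callback and calls it with the path's "type" as a string.
const type = (uri) => (fina) => {
  fs.lstat(uri, (erro, stats) => {
    if (erro) return console.log({ erro });
    fina(
      (stats.isDirectory() && 'directory') ||
      (stats.isFile() && 'document') ||
      (stats.isSymbolicLink() && 'link') ||
      (stats.isSocket() && 'socket') ||
      (stats.isBlockDevice() && 'block') ||
      (stats.isCharacterDevice() && 'character') ||
      (stats.isFIFO() && 'fifo')
    );
  });
};

// usage
type('<path>')((t) => console.log(`type is ${t}`));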

node.js require all files in a folder?

How do I require all files in a folder in node.js?
need something like:
files.forEach(function (v, k) {
  // require routes
  require('./routes/' + v);
});
When require is given the path of a folder, it'll look for an index.js file in that folder; if there is one, it uses that, and if there isn't, it fails.
It would probably make most sense (if you have control over the folder) to create an index.js file and then assign all the "modules" and then simply require that.
yourfile.js
var routes = require("./routes");
index.js
exports.something = require("./routes/something.js");
exports.others = require("./routes/others.js");
If you don't know the filenames you should write some kind of loader.
Working example of a loader:
var normalizedPath = require("path").join(__dirname, "routes");

require("fs").readdirSync(normalizedPath).forEach(function(file) {
  require("./routes/" + file);
});
// Continue application logic here
I recommend using glob to accomplish that task.
var glob = require( 'glob' )
  , path = require( 'path' );

glob.sync( './routes/**/*.js' ).forEach( function( file ) {
  require( path.resolve( file ) );
});
Based on tbranyen's solution, I created an index.js file that loads arbitrary JavaScript files under the current folder as part of the exports.
// Load `*.js` under current directory as properties
// i.e., `User.js` will become `exports['User']` or `exports.User`
require('fs').readdirSync(__dirname + '/').forEach(function(file) {
  if (file.match(/\.js$/) !== null && file !== 'index.js') {
    var name = file.replace('.js', '');
    exports[name] = require('./' + file);
  }
});
Then you can require this directory from anywhere else.
Another option is to use the package require-dir, which lets you do the following. It supports recursion as well.
var requireDir = require('require-dir');
var dir = requireDir('./path/to/dir');
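requireDir returns an object keyed by the file names (minus extension) in that directory, so with hypothetical files admin.js and user.js inside ./path/to/dir you could then do:

console.log(dir.admin); // exports of ./path/to/dir/admin.js
console.log(dir.user);  // exports of ./path/to/dir/user.js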
I have a folder /fields full of files with a single class each, e.g.:
fields/Text.js -> Text class
fields/Checkbox.js -> Checkbox class
Drop this in fields/index.js to export each class:
var collectExports, fs, path,
  __hasProp = {}.hasOwnProperty;

fs = require('fs');
path = require('path');

collectExports = function(file) {
  var func, include, _results;
  if (path.extname(file) === '.js' && file !== 'index.js') {
    include = require('./' + file);
    _results = [];
    for (func in include) {
      if (!__hasProp.call(include, func)) continue;
      _results.push(exports[func] = include[func]);
    }
    return _results;
  }
};

fs.readdirSync('./fields/').forEach(collectExports);
This makes the modules act more like they would in Python:
var text = new Fields.Text()
var checkbox = new Fields.Checkbox()
One more option is require-dir-all, combining features from the most popular packages.
The most popular, require-dir, does not have options to filter the files/dirs and does not have a map function (see below), but uses a small trick to find the module's current path.
The second by popularity, require-all, has regexp filtering and preprocessing, but lacks relative paths, so you need to use __dirname (this has pros and cons), like:

var libs = require('require-all')(__dirname + '/lib');

The require-index mentioned here is quite minimalistic.
With map you may do some preprocessing, like creating objects and passing config values (assuming the modules below export constructors):
// Store config for each module in config object properties
// with property names corresponding to module names
var config = {
  module1: { value: 'config1' },
  module2: { value: 'config2' }
};

// Require all files in modules subdirectory
var modules = require('require-dir-all')(
  'modules', // Directory to require
  { // Options
    // function to be post-processed over exported object for each require'd module
    map: function(reqModule) {
      // create new object with corresponding config passed to constructor
      reqModule.exports = new reqModule.exports( config[reqModule.name] );
    }
  }
);

// Now `modules` object holds not exported constructors,
// but objects constructed using values provided in `config`.
I know this question is 5+ years old, and the given answers are good, but I wanted something a bit more powerful for Express, so I created the express-map2 package for npm. I was going to name it simply express-map, however the people at Yahoo already have a package with that name, so I had to rename my package.
1. basic usage:
app.js (or whatever you call it)
var app = require('express'); // 1. include express

app.set('controllers', __dirname + '/controllers/'); // 2. set path to your controllers.

require('express-map2')(app); // 3. patch map() into express

app.map({
  'GET /': 'test',
  'GET /foo': 'middleware.foo,test',
  'GET /bar': 'middleware.bar,test' // separate your handlers with a comma.
});
controller usage:
// single function
module.exports = function(req, res) {

};

// export an object with multiple functions.
module.exports = {
  foo: function(req, res) {

  },
  bar: function(req, res) {

  }
};
2. advanced usage, with prefixes:
app.map('/api/v1/books', {
  'GET /': 'books.list',         // GET /api/v1/books
  'GET /:id': 'books.loadOne',   // GET /api/v1/books/5
  'DELETE /:id': 'books.delete', // DELETE /api/v1/books/5
  'PUT /:id': 'books.update',    // PUT /api/v1/books/5
  'POST /': 'books.create'       // POST /api/v1/books
});
As you can see, this saves a ton of time and makes the routing of your application dead simple to write, maintain, and understand. It supports all of the HTTP verbs that Express supports, as well as the special .all() method.
npm package: https://www.npmjs.com/package/express-map2
github repo: https://github.com/r3wt/express-map
Expanding on the glob solution above: do this if you want to import all modules from a directory into an index.js and then import that index.js in another part of the application. Note that template literals aren't supported by the highlighting engine used by Stack Overflow, so the code might look strange here.
const glob = require("glob");
let allOfThem = {};

glob.sync(`${__dirname}/*.js`).forEach((file) => {
  /* see note about this in example below */
  allOfThem = { ...allOfThem, ...require(file) };
});

module.exports = allOfThem;
Full Example
Directory structure
globExample/example.js
globExample/foobars/index.js
globExample/foobars/unexpected.js
globExample/foobars/barit.js
globExample/foobars/fooit.js
globExample/example.js
const { foo, bar, keepit } = require('./foobars/index');
const longStyle = require('./foobars/index');
console.log(foo()); // foo ran
console.log(bar()); // bar ran
console.log(keepit()); // keepit ran unexpected
console.log(longStyle.foo()); // foo ran
console.log(longStyle.bar()); // bar ran
console.log(longStyle.keepit()); // keepit ran unexpected
globExample/foobars/index.js
const glob = require("glob");
/*
Note the following style also works with multiple exports per file (barit.js example),
but will overwrite if you have 2 exports with the same name (unexpected.js and barit.js
have a keepit function) in the files being imported. As a result, this method is best used
when you're exporting one module per file and use the filename to easily identify what is in it.

Also note: this ignores itself (index.js) by default to prevent an infinite loop.
*/
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
  allOfThem = { ...allOfThem, ...require(file) };
});

module.exports = allOfThem;
globExample/foobars/unexpected.js
exports.keepit = () => 'keepit ran unexpected';
globExample/foobars/barit.js
exports.bar = () => 'bar run';
exports.keepit = () => 'keepit ran';
globExample/foobars/fooit.js
exports.foo = () => 'foo ran';
From inside project with glob installed, run node example.js
$ node example.js
foo ran
bar run
keepit ran unexpected
foo ran
bar run
keepit ran unexpected
One module that I have been using for this exact use case is require-all.
It recursively requires all files in a given directory and its sub-directories, as long as they don't match the excludeDirs property.
It also allows specifying a file filter and how to derive the keys of the returned hash from the filenames.
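A rough usage sketch (based on the package's README; verify the option names against the current docs, and the controllers directory layout here is hypothetical):

const requireAll = require('require-all');

// Assumes every file in ./controllers ends in "Controller.js".
const controllers = requireAll({
  dirname: __dirname + '/controllers',
  filter: /(.+Controller)\.js$/, // derive each hash key from the capture group
  excludeDirs: /^\.(git|svn)$/,  // skip these sub-directories
  recursive: true
});
// controllers.MainController, controllers.OtherController, ...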
Require all files from routes folder and apply as middleware. No external modules needed.
// require
const { readdirSync } = require("fs");
// apply as middleware
readdirSync("./routes").map((r) => app.use("/api", require("./routes/" + r)));
I'm using the copy-to module to create a single file that requires all the files in our Node.js-based system.
The code for our utility file looks like this:
/**
 * Module dependencies.
 */
var copy = require('copy-to');

copy(require('./module1'))
  .and(require('./module2'))
  .and(require('./module3'))
  .to(module.exports);
In all of the files, most functions are written as exports, like so:

exports.function1 = function () { /* function contents */ };
exports.function2 = function () { /* function contents */ };
exports.function3 = function () { /* function contents */ };
So, then to use any function from a file, you just call:
var utility = require('./utility');
var response = utility.function2(); // or whatever the name of the function is
You can use: https://www.npmjs.com/package/require-file-directory
Require selected files by name only, or all files.
No need for absolute paths.
Easy to understand and use.
Using this function you can require a whole dir.
const PATH = require( "path" );

const GetAllModules = ( dirname ) => {
  if ( dirname ) {
    let dirItems = require( "fs" ).readdirSync( dirname );
    return dirItems.reduce( ( acc, value, index ) => {
      if ( PATH.extname( value ) == ".js" && value.toLowerCase() != "index.js" ) {
        let moduleName = value.replace( /.js/g, '' );
        acc[ moduleName ] = require( `${dirname}/${moduleName}` );
      }
      return acc;
    }, {} );
  }
}

// calling this function.
let dirModules = GetAllModules(__dirname);
Create an index.js file in your folder with this code:

const fs = require('fs')

const files = fs.readdirSync('./routes')

for (const file of files) {
  require('./' + file)
}
And after that, you can simply load the whole folder with require("./routes").
If you want to include all *.js files in a directory, for example "app/lib/*.js":
In the directory app/lib:
example.js:

module.exports = function (example) { }

example-2.js:

module.exports = function (example2) { }

In the directory app, create index.js.
index.js:

module.exports = require('./app/lib');
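Note that requiring a directory like this only resolves if the lib directory itself contains an index.js (as described in the first answer above). A minimal sketch of such a loader, assuming the two example files above live in app/lib and reusing the readdirSync pattern from the earlier answers:

// app/lib/index.js
const fs = require('fs');
const path = require('path');

// export each sibling *.js file under its filename (without the extension)
fs.readdirSync(__dirname)
  .filter((file) => file.endsWith('.js') && file !== 'index.js')
  .forEach((file) => {
    exports[path.basename(file, '.js')] = require('./' + file);
  });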
