Exclude SVG from gulp-svg-sprite build process

I'm currently building an SVG Icon system using gulp-svg-sprite and have run into a situation where I need to exclude some icons from the build process.
Is there a way to EXCLUDE an SVG from running through these two pipes? Somehow I need to get the src filename, compare it to the SVG I want to exclude, and do something like:
if src != svgToExclude then run the pipes.
I don't want specific icons optimized via SVGO and the other plugins for those one-off SVGs that require two styleable paths.
Here is the code I'm working with:
const gulp = require('gulp');
const svgo = require('gulp-svgo');
const rsp = require('remove-svg-properties').stream;
const dom = require('gulp-dom');
const xmlEdit = require('gulp-edit-xml');
const gulpIf = require('gulp-if');
const gulpIgnore = require('gulp-ignore');
const { toPath } = require('svg-points');
var excludeIcon = './utilities/checkbox-checked/checkbox-checked--s.svg';
const svgBuild = src => {
  return gulp
    .src(src)
    .pipe(
      rsp.remove({
        properties: ['fill', rsp.PROPS_STROKE],
        log: false,
      })
    )
    .pipe(
      svgo({
        js2svg: {
          indent: 2,
          pretty: true,
        },
        plugins: [{ removeTitle: true }],
      })
    );
};
module.exports = svgBuild;
I'm new to gulp & node so any help would be appreciated!
Thanks,
- Ryan

Using gulp-filter:
const filter = require('gulp-filter');
Something like:

var excludeIconArray = ['iconToExclude1.svg', 'iconToExclude2.svg' /* etc. */];

const svgBuild = src => {
  const svgFilter = filter(file => {
    // will probably need file.path string manipulations here:
    // file.path is the full path but you can select portions of it,
    // so the below is just pseudocode
    return !excludeIconArray.includes(file.path);
  });

  return gulp
    .src(src)
    // put the next pipe wherever you want to exclude certain files,
    // either right after src() or just before the svgo pipe
    .pipe(svgFilter) // note: filter() already returns a stream, so don't call it again
    // ...
};
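If the excluded icons should still reach the destination (just unprocessed), gulp-filter's restore option can bring them back after the processing pipes; a sketch, assuming the documented { restore: true } API and a placeholder 'dist' destination:

const svgFilter = filter(file => !excludeIconArray.includes(file.path), { restore: true });

return gulp
  .src(src)
  .pipe(svgFilter)           // excluded icons are held back here
  // ... the rsp/svgo pipes run only on the kept files ...
  .pipe(svgFilter.restore)   // excluded icons rejoin the stream unprocessed
  .pipe(gulp.dest('dist'));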


Transform, generate and serve dynamic content with Vite

I was wondering if any of the following is possible to implement using the Vite build tool.
Consider that I have files in a directory matching the pattern: /content/file-[id].md
/content/file-1.md
/content/file-2.md
Every time I serve the SPA with the vite command or build the app with vite build, I would like to:
grab all the files /content/file-[id].md and transform them into /content_parsed/file-[id].html
/content_parsed/file-1.html
/content_parsed/file-2.html
grab all files /content_parsed/file-[id].html and generate a manifest file /files.manifest containing the paths of all files.
/files.manifest
This has to be done automatically in watch mode when the app is served (the vite command) and on demand when the app is built (vite build).
I am pretty sure this is possible with a manual script that I could run with node ./prepareFiles.js && vite, but in that case I lose the reactivity when serving the app (i.e. the watch mode), so a direct integration into Vite would be a step up in terms of usability and testability (I think).
Given the above use case: can Vite do this? Do I need to write a custom plugin for it, or do you recommend creating a separate watch-files/watch-directory script?
I have been able to partially accomplish what I wanted. The only issue right now is the hot reload functionality.
If you import the manifest as
import doc from 'docs.json'
then the page will be auto-reloaded when the module is updated.
On the other hand, if you want to dynamically load the data with the fetch API:

fetch('docs.json')
  .then(r => r.json())
  .then(json => {
    // ...
  })

Then the only way to refresh the page contents is a manual refresh. If anyone has a suggestion for how to trigger a reload from within the Vite plugin context, please let me know; I will update the post once I figure it out.
Also, I should mention that I decided not to pre-generate the HTML pages, so this functionality is missing from the plugin, but it could easily be extended with marked, markdown-it, remark, etc.
Plugin: generateFilesManifest.ts
import {PluginOption} from "vite";
import fs from "fs";
import path from 'path'
const matter = require('front-matter');
const chokidar = require('chokidar');
import {FSWatcher} from "chokidar";
import {NormalizedInputOptions} from "rollup"; // type used by the buildStart hook

export type GenerateFilesManifestConfigType = {
  watchDirectory: string,
  output: string
}

export type MatterOutputType<T> = {
  attributes: T,
  body: string,
  bodyBegin: number,
  frontmatter: string,
  path: string,
  filename: string,
  filenameNoExt: string,
}

export default function generateFilesManifest(userConfig: GenerateFilesManifestConfigType): PluginOption {
  let config: GenerateFilesManifestConfigType = userConfig
  let rootDir: string
  let publicDir: string
  let command: string

  function generateManifest() {
    const watchDirFullPath = path.join(rootDir, config.watchDirectory)
    const files = fs.readdirSync(watchDirFullPath);

    // regenerate manifest
    const manifest: any[] = []
    files.forEach(fileName => {
      const fileFullPath = path.join(watchDirFullPath, fileName)

      // get front matter data
      const fileContents = fs.readFileSync(fileFullPath).toString()
      //const frontMatter = matter.read(fileFullPath)
      const frontMatter = matter(fileContents)
      //console.log(frontMatter);

      // get file path relative to public directory
      //const basename = path.basename(__dirname)
      const fileRelativePath = path.relative(publicDir, fileFullPath);

      const fileInfo = JSON.parse(JSON.stringify(frontMatter)) as MatterOutputType<any>;
      fileInfo.path = fileRelativePath
      fileInfo.filename = fileName
      fileInfo.filenameNoExt = fileName.substring(0, fileName.lastIndexOf('.'));
      fileInfo.frontmatter = ''
      manifest.push(fileInfo);
    });

    const outputString = JSON.stringify(manifest, null, 2);
    fs.writeFileSync(config.output, outputString, {encoding: 'utf8', flag: 'w'})
    console.log('Auto-generated file updated')
  }

  let watcher: FSWatcher | undefined = undefined;

  return {
    name: 'generate-files-manifest',
    configResolved(resolvedConfig) {
      publicDir = resolvedConfig.publicDir
      rootDir = resolvedConfig.root
      command = resolvedConfig.command
    },
    buildStart(options: NormalizedInputOptions) {
      generateManifest();
      if (command === 'serve') {
        const watchDirFullPath = path.join(rootDir, config.watchDirectory)
        watcher = chokidar.watch(watchDirFullPath, {
          ignoreInitial: true
        });
        watcher
          .on('add', function (path) {
            //console.log('File', path, 'has been added');
            generateManifest();
          })
          .on('change', function (path) {
            //console.log('File', path, 'has been changed');
            generateManifest();
          })
          .on('unlink', function (path) {
            //console.log('File', path, 'has been removed');
            generateManifest();
          })
          .on('error', function (error) {
            console.error('Error happened', error);
          })
      }
    },
    buildEnd(err?: Error) {
      console.log('build end')
      watcher?.close();
    }
  }
}
In vite.config.ts, use it as:
export default defineConfig({
  plugins: [
    vue(),
    generateFilesManifest({
      watchDirectory: '/public/docs',
      output: './public/docs.json'
    })
  ]
})
You might want to cover edge cases such as the watch directory not being present, etc.
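For example, a minimal guard at the top of generateManifest() could bail out when the directory is missing (a hypothetical addition, not part of the plugin above):

// at the start of generateManifest(): skip the run if the directory is gone
const watchDirFullPath = path.join(rootDir, config.watchDirectory)
if (!fs.existsSync(watchDirFullPath)) {
  console.warn(`watch directory not found: ${watchDirFullPath}`)
  return
}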
front-matter is the library that parses the markdown files; gray-matter is an alternative.
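For reference, a minimal front-matter call looks roughly like this; the attributes/body shape is what MatterOutputType above mirrors:

const matter = require('front-matter');

const { attributes, body } = matter('---\ntitle: Hello\n---\nSome *markdown*');
// attributes => { title: 'Hello' }
// body => 'Some *markdown*'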
EDIT: thanks to @flydev's response I was able to dig up some more examples of the page-reload functionality. Here's the experimental functionality that you could add:
function generateManifest() {
  // ...
  ws?.send({type: 'full-reload', path: '*'})
}

let ws: WebSocketServer | undefined = undefined; // WebSocketServer and ViteDevServer types come from 'vite'

return {
  name: 'generate-files-manifest',
  //...
  configureServer(server: ViteDevServer) {
    ws = server.ws
  },
  // ...
}
Currently the whole page is reloaded regardless of the path. I'm not sure if there is a way to make it smart enough to reload only the pages that loaded the manifest file; I guess it's currently limited by my own ability to write better code :)

gulp - wrap plugin (which uses through2) output with string

I would like to know how exactly I can manipulate the output of my Gulp plugin so that, no matter how many files are passed to the plugin, it wraps the output with a string. Currently I cannot tell when the last file is done.
The super-simplified example below will iterate over 3 files and create a new file named output.js containing the string xxx three times (xxxxxxxxx).
I would like the plugin itself to wrap the contents so the output will
be: +xxxxxxxxx+.
How can I do this?
Thanks!
Gulpfile
var gulp = require('gulp');
var concat = require('gulp-concat');
var foo = require('./index');

gulp.task('default', function() {
  gulp.src(['a.html', 'b.html', 'c.html'])
    .pipe(foo())
    .pipe(concat('output.js'))
    .pipe(gulp.dest('./test/output'))
});
The most basic gulp plugin (index.js):
var through2 = require('through2'),
    gutil = require('gulp-util');

var PLUGIN_NAME = 'foo';

module.exports = function( options ){
  // through2.obj(fn) is a convenience wrapper around
  // through2({ objectMode: true }, fn)
  return through2.obj(function( file, enc, callback ){
    file.contents = new Buffer( 'xxx' );
    this.push(file);
    callback();
  });
}
I understand the files are currently simply returned modified, but what I don't understand is how to append text and return the concatenated result that I want, while keeping it in line with Gulp's working standards.
The "real" plugin should actually wrap the files results with:
var foo = { FILES_CONTENT }
where FILES_CONTENT will actually be a a concatenated string of all the files:
"file_name" : "file_content",
"file_name" : "file_content",
...
I would make the following changes to your gulpfile.js:
var gulp = require('gulp');
var foo = require('./index.js');

gulp.task('default', function() {
  return gulp.src(['a.html', 'b.html', 'c.html'])
    .pipe(foo({fileName:'output.js', varName:'bar'}))
    .pipe(gulp.dest('./test/output'))
});
Since your foo() plugin itself will concatenate all the files, there's no need to use gulp-concat at all. Instead your plugin should accept an option fileName that provides the name of the generated file. I've also added another option varName that will provide the name of the var in the output file.
I'll assume that a.html, b.html and c.html are simple HTML files, something like this:
<h1 class="header">a</h1>
As you've already realized you need to concat all the files in the plugin itself. That's not really difficult however and doesn't require a lot of code. Here's a index.js which does exactly that:
var through2 = require('through2'),
    gutil = require('gulp-util'),
    path = require('path'),
    File = require('vinyl');

var PLUGIN_NAME = 'foo';

module.exports = function(options) {
  var files = { };
  var outputFile = null;

  return through2.obj(function(file, enc, callback){
    outputFile = outputFile || file;
    var filePath = path.relative(file.base, file.path);
    files[filePath] = file.contents.toString();
    callback();
  }, function(callback) {
    outputFile = outputFile ? outputFile.clone() : new File();
    outputFile.path = path.resolve(outputFile.base, options.fileName);
    outputFile.contents = new Buffer(
      'var ' + options.varName + ' = ' +
      JSON.stringify(files, null, 2) + ';'
    );
    this.push(outputFile);
    callback();
  });
}
Since you want to output a key/value mapping from file names to file contents, our transform function just stores both of those things in a regular JavaScript object, files. None of the input files themselves are emitted; their names and contents are just stored until we have all of them.
The only tricky part is making sure that we respect the .base property of each file as is customary for gulp plugins. This allows the user to provide a custom base folder using the base option in gulp.src().
Once all files have been processed, through2 calls the flush function. In there we create our output file with the provided fileName (once again making sure we respect the .base property).
Creating the output file contents is then just a matter of serializing our files object using JSON.stringify() (which automatically takes care of any escaping that has to be done).
The resulting ./test/output/output.js will then look like this:
var bar = {
  "a.html": "<h1 class=\"header\">a</h1>\n",
  "b.html": "<h1 class=\"header\">b</h1>\n",
  "c.html": "<h1 class=\"header\">c</h1>\n"
};
You should use the standard gulp pipeline technique. This means you can use the gulp-insert package to add the string xxx:
var insert = require('gulp-insert');
.pipe(insert.append('xxx')); // Appends 'xxx' to the contents of every file
You can also prepend, append and wrap with this package, and it of course supports the gulp standards.
So the full example will be:

var gulp = require('gulp');
var concat = require('gulp-concat');
var foo = require('./index');
var insert = require('gulp-insert');

gulp.task('default', function() {
  gulp.src(['a.html', 'b.html', 'c.html'])
    .pipe(foo())
    .pipe(insert.append('xxx'))
    .pipe(concat('output.js'))
    .pipe(gulp.dest('./test/output'))
});

Check package version at runtime in nodejs?

I have some of my entries in package.json defined as "*"
"dependencies": {
"express": "4.*",
"passport": "*",
"body-parser": "*",
"express-error-handler": "*"
},
I want to freeze those values to the current version. How can I know what version my packages are at runtime? I don't mind checking one by one since I don't have many of them :)
BTW: I cannot do npm list --depth=0 because I cannot access the VM directly (PaaS restriction), just the logs.
You can use the fs module to read the directories in the node_modules directory and then read package.json in each of them.
var fs = require('fs');
var dirs = fs.readdirSync('node_modules');
var data = {};

dirs.forEach(function(dir) {
  try {
    var file = 'node_modules/' + dir + '/package.json';
    var json = require(file);
    var name = json.name;
    var version = json.version;
    data[name] = version;
  } catch (err) {}
});
console.debug(data['express']); //= 4.11.2
Just in case you need the version on the front-end, there is an npm package just for this, and it can be used both client-side and server-side.
global-package-version
You can use it in your code like this
import globalPackageVersion from 'global-package-version';
// package name is 'lodash'
globalPackageVersion(require('lodash/package.json'));
// You can type 'packageVersion' in browser console to check lodash version
// => packageVersion = { lodash: '4.7.2'}
packageVersion becomes a global object when used on the server side and a window property when used on the client side. It works well with webpack and all other bundling tools.
Disclaimer: I am the author of this package :)
I've 'modernised' @laggingreflex's answer a bit; this works on ES6+ / Node 10, tested on an AWS Lambda. It's an endpoint from an Express app.
const fs = require("fs");

module.exports.dependencies = async (_req, res) => {
  const dirs = fs.readdirSync("./node_modules");
  const modulesInfo = dirs.reduce((acc, dir) => {
    try {
      const file = `${dir}/package.json`;
      const { name, version } = require(file);
      return { ...acc, [name]: version };
    } catch (err) {
      // folders like .bin have no package.json; keep the accumulator
      return acc;
    }
  }, {});
  res.status(200).json(modulesInfo);
};
The accepted solution can be improved upon in terms of both performance and stability:

1. The package name IS the directory name, so in the typical case where you are looking for a specific package, you do not need to load every module.
2. The code will not run on every OS because of the way the paths are formed.
3. Using require means the path needs to be relative to the current file (this would only work if your file is located at the top of your project folder, alongside node_modules). In most cases, using readFile or readFileSync is an easier approach.
const fs = require('fs');
const path = require('path');

const dirs = fs.readdirSync('node_modules');
const data = {};

// add the ones you care about
const trackedPackages = ['express', 'passport', 'body-parser'];

dirs.forEach(function(dir) {
  if (trackedPackages.indexOf(dir) > -1) {
    try {
      const json = JSON.parse(
        fs.readFileSync(path.join('node_modules', dir, 'package.json'), 'utf8')
      );
      data[dir] = json.version;
    } catch(e) {
      console.log(`failed to read/parse package.json for ${dir}`, e);
    }
  }
});

console.debug(data['express']); //= 4.11.2
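As a side note, when you only need the version of a single known package, require can usually resolve that package's own package.json directly (this assumes the package does not restrict subpath exports):

// resolved from node_modules, no manual path handling needed
var expressVersion = require('express/package.json').version; // e.g. '4.11.2'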

How do you create a file from a string in Gulp?

In my gulpfile I have a version number in a string. I'd like to write the version number to a file. Is there a nice way to do this in Gulp, or should I be looking at more general NodeJS APIs?
If you'd like to do this in a gulp-like way, you can create a stream of "fake" vinyl files and call pipe per usual. Here's a function for creating the stream. "stream" is a core module, so you don't need to install anything:
const Vinyl = require('vinyl')

function string_src(filename, string) {
  var src = require('stream').Readable({ objectMode: true })
  src._read = function () {
    this.push(new Vinyl({
      cwd: "",
      base: "",
      path: filename,
      contents: Buffer.from(string, 'utf-8')
    }))
    this.push(null)
  }
  return src
}
You can use it like this:
gulp.task('version', function () {
  var pkg = require('./package.json')
  return string_src("version", pkg.version)
    .pipe(gulp.dest('build/'))
})
It's pretty much a one-liner in node:
require('fs').writeFileSync('dist/version.txt', '1.2.3');
Or from package.json:
var pkg = require('./package.json');
var fs = require('fs');
fs.writeFileSync('dist/version.txt', 'Version: ' + pkg.version);
I'm using it to specify a build date in an easily-accessible file, so I use this code before the usual return gulp.src(...) in the build task:
require('fs').writeFileSync('dist/build-date.txt', new Date());
This can also be done with vinyl-source-stream. See this document in the gulp repository.
var gulp = require('gulp'),
    source = require('vinyl-source-stream');

gulp.task('some-task', function() {
  var stream = source('file.txt');
  stream.end('some data');
  stream.pipe(gulp.dest('output'));
});
According to the maintainer of Gulp, the preferred way to write a string to a file is using fs.writeFile with the task callback.
var fs = require('fs');
var gulp = require('gulp');

gulp.task('taskname', function(cb){
  fs.writeFile('filename.txt', 'contents', cb);
});
Source: https://github.com/gulpjs/gulp/issues/332#issuecomment-36970935
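For reference, the same idea can be written in gulp 4 style, where returning a promise signals task completion; a minimal sketch using fs.promises:

const { writeFile } = require('fs').promises;
const gulp = require('gulp');

// returning the promise lets gulp know when the task is done
gulp.task('taskname', () => writeFile('filename.txt', 'contents'));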
You can also use gulp-file:
var gulp = require('gulp');
var file = require('gulp-file');

gulp.task('version', function () {
  var pkg = require('./package.json')
  return gulp.src('src/**')
    .pipe(file('version', pkg.version))
    .pipe(gulp.dest('build/'))
});

or without using gulp.src():

gulp.task('version', function () {
  var pkg = require('./package.json')
  return file('version', pkg.version, {src: true})
    .pipe(gulp.dest('build/'))
});
The gulp-header package can be used to prefix files with header banners.
e.g. this will inject a banner into the header of your JavaScript files:
var header = require('gulp-header');
var pkg = require('./package.json');

var banner = ['/**',
  ' * <%= pkg.name %> - <%= pkg.description %>',
  ' * @version v<%= pkg.version %>',
  ' * @link <%= pkg.homepage %>',
  ' * @license <%= pkg.license %>',
  ' */',
  ''].join('\n');

gulp.src('./foo/*.js')
  .pipe(header(banner, { pkg: pkg }))
  .pipe(gulp.dest('./dist/'));
Gulp is a streaming build system leveraging pipes.
If you simply want to write a new file with an arbitrary string, you can use the built-in Node fs module.
Using the string-to-stream and vinyl-source-stream modules:
var str = require('string-to-stream');
var source = require('vinyl-source-stream');
var gulp = require('gulp');
str('1.4.27').pipe(source('version.txt')).pipe(gulp.dest('dist'));
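If you want this inside a task, return the stream so gulp can track completion; a minimal sketch reusing the requires above:

gulp.task('version', function () {
  return str('1.4.27')
    .pipe(source('version.txt'))
    .pipe(gulp.dest('dist'));
});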
Here's an answer that works in 2019.
Plugin:
var Vinyl = require('vinyl');
var through = require('through2');
var path = require('path');

// https://github.com/gulpjs/gulp/tree/master/docs/writing-a-plugin#modifying-file-content
function stringSrc(filename, string) {
  /**
   * @this {Transform}
   */
  var transform = function(file, encoding, callback) {
    if (path.basename(file.relative) === 'package.json') {
      file.contents = Buffer.from(
        JSON.stringify({
          name: 'modified-package',
          version: '1.0.0',
        }),
      );
    }
    // if you want to create multiple files, use this.push and provide an empty callback() call instead
    // this.push(file);
    // callback();
    callback(null, file);
  };
  return through.obj(transform);
}
And in your gulp pipeline:
gulp.src([
  ...
])
  .pipe(stringSrc('version.json', '123'))
  .pipe(gulp.dest(destinationPath))
From source: https://github.com/gulpjs/gulp/tree/master/docs/writing-a-plugin#modifying-file-content
The function parameter that you pass to through.obj() is a _transform
function which will operate on the input file. You may also provide an
optional _flush function if you need to emit a bit more data at the
end of the stream.
From within your transform function call this.push(file) 0 or more
times to pass along transformed/cloned files. You don't need to call
this.push(file) if you provide all output to the callback() function.
Call the callback function only when the current file (stream/buffer)
is completely consumed. If an error is encountered, pass it as the
first argument to the callback, otherwise set it to null. If you have
passed all output data to this.push() you can omit the second argument
to the callback.
Generally, a gulp plugin would update file.contents and then choose to
either:
call callback(null, file) or make one call to this.push(file)
This can also be achieved using gulp-tap.
This can be especially helpful if you have identified multiple files that require this header. Here is the relevant code (also from the gulp-tap documentation):
var gulp = require('gulp'),
    tap = require('gulp-tap');

gulp.src("src/**")
  .pipe(tap(function(file){
    file.contents = Buffer.concat([
      new Buffer('Some Version Header', 'utf8'),
      file.contents
    ]);
  }))
  .pipe(gulp.dest('dist'));

node.js require all files in a folder?

How do I require all files in a folder in node.js?
I need something like:

files.forEach(function (v, k){
  // require routes
  require('./routes/' + v);
});
When require is given the path of a folder, it'll look for an index.js file in that folder; if there is one, it uses that, and if there isn't, it fails.
It would probably make the most sense (if you have control over the folder) to create an index.js file, assign all the "modules" there, and then simply require that.
yourfile.js
var routes = require("./routes");
index.js
exports.something = require("./routes/something.js");
exports.others = require("./routes/others.js");
If you don't know the filenames you should write some kind of loader.
Working example of a loader:
var normalizedPath = require("path").join(__dirname, "routes");
require("fs").readdirSync(normalizedPath).forEach(function(file) {
require("./routes/" + file);
});
// Continue application logic here
I recommend using glob to accomplish that task.
var glob = require( 'glob' )
, path = require( 'path' );
glob.sync( './routes/**/*.js' ).forEach( function( file ) {
  require( path.resolve( file ) );
});
Based on @tbranyen's solution, I created an index.js file that loads arbitrary javascripts under the current folder as part of the exports.
// Load `*.js` under current directory as properties
// i.e., `User.js` will become `exports['User']` or `exports.User`
require('fs').readdirSync(__dirname + '/').forEach(function(file) {
  if (file.match(/\.js$/) !== null && file !== 'index.js') {
    var name = file.replace('.js', '');
    exports[name] = require('./' + file);
  }
});
Then you can require this directory from anywhere else.
Another option is to use the package require-dir, which lets you do the following. It supports recursion as well.
var requireDir = require('require-dir');
var dir = requireDir('./path/to/dir');
I have a folder /fields full of files with a single class each, e.g.:

fields/Text.js -> Text class
fields/Checkbox.js -> Checkbox class
Drop this in fields/index.js to export each class:
var collectExports, fs, path,
    __hasProp = {}.hasOwnProperty;

fs = require('fs');
path = require('path');

collectExports = function(file) {
  var func, include, _results;
  if (path.extname(file) === '.js' && file !== 'index.js') {
    include = require('./' + file);
    _results = [];
    for (func in include) {
      if (!__hasProp.call(include, func)) continue;
      _results.push(exports[func] = include[func]);
    }
    return _results;
  }
};

fs.readdirSync('./fields/').forEach(collectExports);
This makes the modules act more like they would in Python:
var text = new Fields.Text()
var checkbox = new Fields.Checkbox()
One more option is require-dir-all, which combines features from the most popular packages.
The most popular, require-dir, does not have options to filter the files/dirs and does not have a map function (see below), but uses a small trick to find the module's current path.
Second by popularity, require-all has regexp filtering and preprocessing, but lacks relative paths, so you need to use __dirname (this has pros and cons) like:
var libs = require('require-all')(__dirname + '/lib');
The require-index mentioned here is quite minimalistic.
With map you may do some preprocessing, like create objects and pass config values (assuming modules below exports constructors):
// Store config for each module in config object properties
// with property names corresponding to module names
var config = {
module1: { value: 'config1' },
module2: { value: 'config2' }
};
// Require all files in modules subdirectory
var modules = require('require-dir-all')(
'modules', // Directory to require
{ // Options
// function to be post-processed over exported object for each require'd module
map: function(reqModule) {
// create new object with corresponding config passed to constructor
reqModule.exports = new reqModule.exports( config[reqModule.name] );
}
}
);
// Now `modules` object holds not exported constructors,
// but objects constructed using values provided in `config`.
I know this question is 5+ years old, and the given answers are good, but I wanted something a bit more powerful for Express, so I created the express-map2 package for npm. I was going to name it simply express-map, but the people at Yahoo already have a package with that name, so I had to rename my package.
1. basic usage:
app.js (or whatever you call it)
var app = require('express')(); // 1. create the express app
app.set('controllers', __dirname + '/controllers/'); // 2. set path to your controllers.
require('express-map2')(app); // 3. patch map() into express

app.map({
  'GET /': 'test',
  'GET /foo': 'middleware.foo,test',
  'GET /bar': 'middleware.bar,test' // separate your handlers with a comma.
});
controller usage:
// single function
module.exports = function(req, res){
};

// export an object with multiple functions.
module.exports = {
  foo: function(req, res){
  },
  bar: function(req, res){
  }
};
2. advanced usage, with prefixes:
app.map('/api/v1/books', {
  'GET /': 'books.list',         // GET /api/v1/books
  'GET /:id': 'books.loadOne',   // GET /api/v1/books/5
  'DELETE /:id': 'books.delete', // DELETE /api/v1/books/5
  'PUT /:id': 'books.update',    // PUT /api/v1/books/5
  'POST /': 'books.create'       // POST /api/v1/books
});
As you can see, this saves a ton of time and makes the routing of your application dead simple to write, maintain, and understand. It supports all of the HTTP verbs that express supports, as well as the special .all() method.
npm package: https://www.npmjs.com/package/express-map2
github repo: https://github.com/r3wt/express-map
Expanding on this glob solution: do this if you want to import all modules from a directory into index.js and then import that index.js in another part of the application.
const glob = require("glob");

let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
  /* see note about this in the example below */
  allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
Full Example
Directory structure
globExample/example.js
globExample/foobars/index.js
globExample/foobars/unexpected.js
globExample/foobars/barit.js
globExample/foobars/fooit.js
globExample/example.js
const { foo, bar, keepit } = require('./foobars/index');
const longStyle = require('./foobars/index');

console.log(foo()); // foo ran
console.log(bar()); // bar run
console.log(keepit()); // keepit ran unexpected

console.log(longStyle.foo()); // foo ran
console.log(longStyle.bar()); // bar run
console.log(longStyle.keepit()); // keepit ran unexpected
globExample/foobars/index.js
const glob = require("glob");

/*
Note: the following style also works with multiple exports per file (see the barit.js example),
but will overwrite exports that share a name (unexpected.js and barit.js both have a keepit
function) in the files being imported. As a result, this method is best used when you export
one module per file and use the filename to easily identify what is in it.
Also note: this ignores itself (index.js) by default to prevent an infinite loop.
*/
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
  allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
globExample/foobars/unexpected.js
exports.keepit = () => 'keepit ran unexpected';
globExample/foobars/barit.js
exports.bar = () => 'bar run';
exports.keepit = () => 'keepit ran';
globExample/foobars/fooit.js
exports.foo = () => 'foo ran';
From inside the project (with glob installed), run node example.js:
$ node example.js
foo ran
bar run
keepit ran unexpected
foo ran
bar run
keepit ran unexpected
One module that I have been using for this exact use case is require-all.
It recursively requires all files in a given directory and its subdirectories, as long as they don't match the excludeDirs property.
It also allows specifying a file filter and how to derive the keys of the returned hash from the filenames.
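A sketch following the options documented in the require-all README (the filter and excludeDirs patterns here are illustrative):

var controllers = require('require-all')({
  dirname: __dirname + '/controllers',
  filter: /(.+Controller)\.js$/, // derive each key from the capture group
  excludeDirs: /^\.(git|svn)$/,  // skip these subdirectories
  recursive: true
});
// controllers is a hash of module exports keyed by the filtered names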
Require all files from the routes folder and apply them as middleware, no external modules needed.
// require
const { readdirSync } = require("fs");
// apply as middleware
readdirSync("./routes").map((r) => app.use("/api", require("./routes/" + r)));
I'm using the copy-to module to create a single file that requires all the files in our Node.js-based system.
The code for our utility file looks like this:
/**
 * Module dependencies.
 */
var copy = require('copy-to');

copy(require('./module1'))
  .and(require('./module2'))
  .and(require('./module3'))
  .to(module.exports);
In all of the files, most functions are written as exports, like so:
exports.function1 = function () { /* function contents */ };
exports.function2 = function () { /* function contents */ };
exports.function3 = function () { /* function contents */ };
So, then to use any function from a file, you just call:
var utility = require('./utility');
var response = utility.function2(); // or whatever the name of the function is
You can use https://www.npmjs.com/package/require-file-directory:
Require selected files by name only, or all files.
No need for an absolute path.
Easy to understand and use.
Using this function you can require a whole dir.
const path = require('path');

const GetAllModules = (dirname) => {
  if (dirname) {
    let dirItems = require('fs').readdirSync(dirname);
    return dirItems.reduce((acc, value) => {
      if (path.extname(value) === '.js' && value.toLowerCase() !== 'index.js') {
        let moduleName = value.replace(/\.js$/, '');
        acc[moduleName] = require(`${dirname}/${moduleName}`);
      }
      return acc;
    }, {});
  }
};

// calling this function.
let dirModules = GetAllModules(__dirname);
Create an index.js file in your folder with this code :
const fs = require('fs')
const files = fs.readdirSync(__dirname)

for (const file of files) {
  if (file !== 'index.js') require('./' + file)
}

And after that you can simply load the whole folder with require("./routes").
If you want to include all *.js files in a directory (for example "app/lib/*.js"):

In directory app/lib:

example.js:
module.exports = function (example) { }

example-2.js:
module.exports = function (example2) { }

In directory app create index.js:

index.js:
module.exports = require('./lib');
