Hi,
I made an app for node.js so my app.js looks like this:
global.fs = require("fs");
global.vm = require('vm');
var includefile = function(path) {
var code = fs.readFileSync(path);
vm.runInThisContext(code, path);
}.bind(this);
includefile("variables.js");
as for variables.js I have this:
global.app = require("express")();
but when I start the app I get this error:
require is not defined at variables.js
Why is it that require loads fine when executed from app.js but not from an external file?
Thank you.
I'm a little confused: is variables.js just another source file in your project? All you should need to do is require the file, like you've done at the top. As an example, for variables.js:
const Variables = {
fs: "some_value"
};
module.exports = Variables;
And for app.js
const Variables = require("./variables.js");
const fs = Variables.fs;
Executing console.log(fs); in app.js will print "some_value". Same can be done with functions.
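The same pattern works for functions. For example (greet here is just a hypothetical addition):
// variables.js
const Variables = {
  fs: "some_value",
  greet: (name) => "hello " + name
};
module.exports = Variables;
// app.js
const Variables = require("./variables.js");
console.log(Variables.fs);           // "some_value"
console.log(Variables.greet("app")); // "hello app"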
If variables.js is part of your project code, you should use the answer of M. Gercz.
If it's not part of your project code and you want to get some information from it, you could use a json file:
app.js
const variables = require('./variables.json');
console.log(variables.whatever);
variables.json
{ "whatever": "valuable information" }
If it's not certain that variables.json is present, you can check for it first using fs.existsSync:
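A minimal sketch of that check (assuming variables.json sits next to app.js):
const fs = require('fs');
const path = require('path');
let variables = {};
const file = path.join(__dirname, 'variables.json');
if (fs.existsSync(file)) {
  variables = require(file); // require also accepts an absolute path
}
console.log(variables.whatever);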
Notice: As jfriend00 commented, using global is not recommended.
file structure is
-src
--Visitor
---visitor.model.js
---Sessions
----session.model.js
In visitor.model.js file
const {Session} = require('./Sessions/session.model');
const Visitor = {};
Visitor.visitorFunc = () => {
}
Session.sessionFunc();
module.exports = {Visitor: Visitor};
In session.model.js file
const {Visitor} = require('../visitor.model.js');
const Session = {};
Session.sessionFunc = () => {
}
Visitor.visitorFunc();
module.exports = {Session: Session};
When I do the imports like this, Session is undefined in the Visitor file. What is the reason for that? Is it calling the imports recursively?
Circular dependencies are allowed in Node.js:
https://nodejs.org/api/modules.html#modules_cycles
When main.js loads a.js, then a.js in turn loads b.js. At that point, b.js tries to load a.js. In order to prevent an infinite loop, an unfinished copy of the a.js exports object is returned to the b.js module. b.js then finishes loading, and its exports object is provided to the a.js module.
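The example from the linked docs makes this concrete: a.js is only partially loaded at the moment b.js receives it.
// main.js
const a = require('./a.js');
// a.js
exports.done = false;
const b = require('./b.js');              // b.js starts loading here
console.log('in a, b.done = %j', b.done); // true
exports.done = true;
// b.js
exports.done = false;
const a = require('./a.js');              // a.js has not finished; we get its partial exports
console.log('in b, a.done = %j', a.done); // false
exports.done = true;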
Since Session and Visitor sound like database models with an M:N relationship, circular dependencies are the way to go (e.g. for a join query):
How to deal with cyclic dependencies in Node.js
Node.js Module.Exports Undefined Empty Object
But it would be less messy to avoid them if you can.
As @prashand explained above, you have to do the imports and call the imported functions after exporting the current module. The example above works with a slight change, as follows:
const Visitor = {};
Visitor.visitorFunc = () => {
console.log('hello from visitor model');
}
module.exports = {Visitor: Visitor};
// import session.model after exporting the current module
const {Session} = require('./Sessions/session.model');
// then call the required function
Session.sessionFunc();
Simply use exports.someMember = someMember instead of module.exports = { someMember }.
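The reason this helps in a cycle: exports.someMember = ... mutates the exports object that the other module already holds a reference to, whereas module.exports = { ... } replaces it with a new object the other module never sees. A minimal sketch (hypothetical a.js in a cycle with b.js):
// a.js -- b.js require()s this file while it is still loading
const b = require('./b.js'); // b.js receives a.js's current exports object here
exports.someMember = () => 'hello from a'; // fills in that same object, so b.js sees it once loading finishes
// module.exports = { someMember: () => 'hello from a' }; // would be a new object that b.js never sees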
Your visitor.model.js file is outside the Sessions directory. In order to import session.model.js you need to give the correct relative path to that file, so your require statement should look like this:
const { Sessions } = require('../Sessions/session.model.js');
I use the flag --experimental-modules when running my Node application in order to use ES6 modules.
However, when I use this flag the __dirname metavariable is not available. Is there an alternative way to get the string stored in __dirname that is compatible with this mode?
As of Node.js 10.12 there's an alternative that doesn't require creating multiple files and handles special characters in filenames across platforms:
import { dirname } from 'path';
import { fileURLToPath } from 'url';
const __dirname = dirname(fileURLToPath(import.meta.url));
The most standardized way in 2021
import { URL } from 'url'; // in the browser, URL is natively available on window
const __filename = new URL('', import.meta.url).pathname;
// Will contain trailing slash
const __dirname = new URL('.', import.meta.url).pathname;
And forget about join to create paths from the current file, just use the URL
const pathToAdjacentFooFile = new URL('./foo.txt', import.meta.url).pathname;
const pathToUpperBarFile = new URL('../bar.json', import.meta.url).pathname;
For Node 10.12 +...
Assuming you are working from a module, this solution should work, and it gives you __filename support as well:
import path from 'node:path';
import { fileURLToPath } from 'node:url';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
The nice thing is that you are also only two lines of code away from supporting require() for CommonJS modules. For that you would add:
import { createRequireFromPath } from 'module';
const require = createRequireFromPath(__filename);
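After that, require works as usual from inside the ES module; for example (legacy-helper.js is a hypothetical CommonJS file next to this module):
const legacyHelper = require('./legacy-helper.js');
const pkg = require('./package.json'); // JSON files work too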
In most cases, using what is native to Node.js (with ES Modules) rather than external resources, __filename and __dirname are simply unnecessary. Most (if not all) of the native methods for reading (streaming) support new URL + import.meta.url, exactly as the official documentation itself suggests:
No __filename or __dirname
No JSON Module Loading
No require.resolve
As you can see in the description of the methods, the path parameter lists the supported formats, which include <URL>. Examples:
fs.readFile(path[, options], callback): <string>, <Buffer>, <URL>, <integer>
fs.readFileSync(path[, options]): <string>, <Buffer>, <URL>, <integer>
fs.readdir(path[, options], callback): <string>, <Buffer>, <URL>
fs.readdirSync(path[, options]): <string>, <Buffer>, <URL>
fsPromises.readdir(path[, options]): <string>, <Buffer>, <URL>
fsPromises.readFile(path[, options]): <string>, <Buffer>, <URL>, <FileHandle>
So new URL('<path or file>', import.meta.url) solves it, and you don't need to manipulate strings and create variables to concatenate paths later.
Examples:
See how it is possible to read a file at the same level as the script without needing __filename or any workaround:
import { readFileSync } from 'fs';
const output = readFileSync(new URL('./foo.txt', import.meta.url));
console.log(output.toString());
List all files in the script directory:
import { readdirSync } from 'fs';
readdirSync(new URL('./', import.meta.url)).forEach((dirContent) => {
console.log(dirContent);
});
Note: In the examples I used the synchronous functions just to make it easier to copy and execute.
If the intention is to build your own logger (or something similar) that depends on third parties, it may be worth doing some of this manually, but within the language and Node.js itself it is not necessary: with ES Modules it is entirely possible to depend on neither __filename nor __dirname, since the native new URL support already solves it.
Note that if you are interested in using something like require at strategic times and need the absolute path from the main script, you can use module.createRequire(filename) (Node.js v12.2.0+) combined with import.meta.url to load scripts at levels other than the current script level, which also avoids the need for __dirname. An example using import.meta.url with module.createRequire:
import { createRequire } from 'module';
const require = createRequire(import.meta.url);
// foo-bar.js is a CommonJS module.
const fooBar = require('./foo-bar');
fooBar();
Source from foo-bar.js:
module.exports = () => {
console.log('hello world!');
};
Which is similar to what you would write without "ECMAScript modules":
const fooBar = require('./foo-bar');
There have been proposals about exposing these variables through import.meta, but for now, you need a hacky workaround that I found here:
// expose.js
module.exports = {__dirname};
// use.mjs
import expose from './expose.js';
const {__dirname} = expose;
I used:
import path from 'path';
const __dirname = path.resolve(path.dirname(decodeURI(new URL(import.meta.url).pathname)));
decodeURI was important: the path contained spaces and other characters on my test system.
path.resolve() handles relative URLs.
edit:
fix to support windows (/C:/... => C:/...):
import path from 'path';
const dn = path.dirname(decodeURI(new URL(import.meta.url).pathname));
const __dirname = path.resolve(process.platform === "win32" ? dn.substr(1) : dn);
I made this module es-dirname that will return the current script dirname.
import dirname from 'es-dirname'
console.log(dirname())
It works both in CommonJs scripts and in ES Modules both on Windows and Linux.
Open an issue there if you get an error; the script has been working so far in my projects, but it might fail in some other cases. For this reason, do not use it in a production environment. This is a temporary solution, as I am sure the Node.js team will release a robust way to do it in the near future.
import path from 'path';
import { fileURLToPath } from 'url';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// do not use the following code which is bad for CJK characters
const __filename = new URL('', import.meta.url).pathname;
import path from 'path';
const __dirname = path.join(path.dirname(decodeURI(new URL(import.meta.url).pathname))).replace(/^\\([A-Z]:\\)/, "$1");
This code also works on Windows. (the replacement is safe on other platforms, since path.join returns back-slash separators only on Windows)
Since other answers, while useful, don't cover both cross-platform cases (Windows and POSIX) and/or path resolution beyond __dirname or __filename, and it's kind of verbose to repeat this kind of code everywhere:
import { dirname, join } from 'path'
import { fileURLToPath } from 'url'
const __filename = fileURLToPath(import.meta.url)
const __dirname = dirname(__filename)
const somePath = join(__dirname, '../some-dir-or-some-file')
I just published an npm package called esm-path to help with this kind of recurring task, hoping it can also be useful to others.
It's documented but here how to use it:
import { getAbsolutePath } from 'esm-path'
const currentDirectoryPath = getAbsolutePath(import.meta.url)
console.log(currentDirectoryPath)
const parentDirectoryPath = getAbsolutePath(import.meta.url, '..')
console.log(parentDirectoryPath)
// Adapt the relative path to your case
const packageJsonFilePath = getAbsolutePath(import.meta.url, '../package.json')
console.log(packageJsonFilePath)
// Adapt the relative path to your case
const packageJsonFilePath = getAbsolutePath(import.meta.url, '..' , 'package.json')
console.log(packageJsonFilePath)
Just use the path.resolve() method.
import { resolve } from 'path';
app.use('/public/uploads', express.static(resolve('public', 'uploads')))
I use this option: since the path starts with file://, just remove that part.
const __filename = import.meta.url.slice(7);
const __dirname = import.meta.url.slice(7, import.meta.url.lastIndexOf("/"));
As Geoff pointed out, the following code returns not the module's path but the working directory.
import path from 'path';
const __dirname = path.resolve();
It works with --experimental-modules.
Create a file called root-dirname.js in your project root with this in it:
import { dirname } from 'path'
const dn = dirname(new URL(import.meta.url).pathname)
const __dirname = process.platform === 'win32' ? dn.substr(1) : dn // remove the leading slash on Windows
export const rootDirname = __dirname
Then just import rootDirname when you want the path to the project root folder.
Other than that, Rudolf Gröhling's answer is also correct.
You can use the stack from a new Error(). The error doesn't need to be thrown, and it won't stop program execution either. The first line of the stack will always be the error and its message, with the second line being the file from which the error was invoked.
Since this is a method (which is probably in a util.js file), the real location of the getDirname() call is actually the third line of the error stack.
export const getDirname = () => {
// get the stack
const { stack } = new Error();
// get the third line (the original invoker)
const invokeFileLine = stack.split(`\n`)[2];
// match the file URL from file://(.+)/ and get the first capturing group
// the (.+) is a greedy quantifier and will make the RegExp expand to the largest match
const __dirname = invokeFileLine.match(/file:\/\/(.+)\//)[1];
return __dirname;
};
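A hypothetical usage from another module would then be:
// some-module.mjs
import { getDirname } from './util.js';
const __dirname = getDirname(); // directory of some-module.mjs, not of util.js
console.log(__dirname);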
Another option:
import {createRequire} from 'module'; // needs Node v12.2.0+
const require = createRequire(import.meta.url);
const __dirname = require.resolve.paths('.')[0];
I have also published a package on NPM called cross-dirname (forked from es-dirname). The package is tested with Node.js (ESM and CJS), Deno and GJS.
Example:
import dirname from 'cross-dirname'
console.log(dirname())
Whether you agree or disagree with the use of global, I found this to be the easiest way to remember and to refactor existing code.
Put somewhere early in your code execution:
import { fileURLToPath } from 'node:url';
import { dirname } from 'node:path';
global.___filename = (path) => {
return fileURLToPath(path);
};
global.___dirname = (path) => {
return dirname(global.___filename(path));
};
And then in whichever file you need dirname or filename:
___filename(import.meta.url)
___dirname(import.meta.url)
Of course if we had macros, I wouldn't need to pass import.meta.url, perhaps there's an improvement.
process.cwd()
From documentation:
The process.cwd() method returns the current working directory of the
Node.js process.
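Keep in mind that this is the directory the process was started from, not the directory of the current module. A quick sketch (hypothetical paths):
// started with: cd /home/user/project && node src/app.js
console.log(process.cwd()); // /home/user/project (where node was launched)
// whereas the module itself lives in /home/user/project/src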
Maybe this question is a little silly, but is it possible to load multiple .js files with one require statement? Like this:
var mylib = require('./lib/mylibfiles');
and use:
mylib.foo(); //return "hello from one"
mylib.bar(); //return "hello from two"
And in the folder mylibfiles will have two files:
One.js
exports.foo= function(){return "hello from one";}
Two.js
exports.bar= function(){return "hello from two";}
I was thinking of putting a package.json in the folder that tells it to load all the files, but I don't know how. Another approach I was thinking of is to have an index.js that exports everything again, but then I would be duplicating work.
Thanks!!
P.S.: I'm working with Node.js v0.6.11 on a Windows 7 machine.
First of all, using require does not duplicate anything. It loads the module and caches it, so calling require again will get it from memory (thus you can modify a module on the fly without touching its source code; this is sometimes desirable, for example when you want to store a db connection inside a module).
Also, package.json does not load anything by itself; apart from the main field (which tells require which file to load when you require a directory), it is only used by npm.
Now, you cannot require multiple modules at once. For example, what would happen if both One.js and Two.js defined a function with the same name? There are other problems as well.
But what you can do is write an additional file, say modules.js, with the following content:
module.exports = {
one : require('./one.js'),
two : require('./two.js'),
/* some other modules you want */
}
and then you can simply use
var modules = require('./modules.js');
modules.one.foo();
modules.two.bar();
I have a snippet of code that requires more than one module, but it doesn't clump them together as your post suggests. However, that can be overcome with a trick that I found.
function requireMany () {
return Array.prototype.slice.call(arguments).map(function (value) {
try {
return require(value)
}
catch (event) {
return console.log(event)
}
})
}
And you use it as such
requireMany("fs", "socket.io", "path")
Which will return
[ fs {}, socketio {}, path {} ]
If a module is not found, an error will be sent to the console. It won't break the programme. The failed module will appear in the array as undefined; the array will not be shorter because one of the modules failed to load.
Then you can bind those each of those array elements to a variable name, like so:
var [fs, socketio, path] = requireMany("fs", "socket.io", "path")
It essentially works like an object, but assigns the keys and their values to variables in the current scope. So, in your case, you could do:
var [foo, bar] = requireMany("./foo.js", "./bar.js")
foo() //return "hello from one"
bar() //return "hello from two"
And if you do want it to break the programme on error, just use this modified version, which is smaller
function requireMany () {
return Array.prototype.slice.call(arguments).map(require)
}
Yes, you may require a folder as a module, according to the node docs. Let's say you want to require() a folder called ./mypack/.
Inside ./mypack/, create a package.json file with the name of the folder and a main javascript file with the same name, inside a ./lib/ directory.
{
"name" : "mypack",
"main" : "./lib/mypack.js"
}
Now you can use require('./mypack') and node will load ./mypack/lib/mypack.js.
However if you do not include this package.json file, it may still work. Without the file, node will attempt to load ./mypack/index.js, or if that's not there, ./mypack/index.node.
My understanding is that this could be beneficial if you have split your program into many javascript files but do not want to concatenate them for deployment.
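For the One.js/Two.js layout from the question, the index.js fallback means a minimal ./mypack/index.js with no package.json at all would also work; a sketch:
// ./mypack/index.js
exports.foo = require('./One.js').foo;
exports.bar = require('./Two.js').bar;
// elsewhere:
var mylib = require('./mypack');
mylib.foo(); // "hello from one"
mylib.bar(); // "hello from two"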
You can use destructuring assignment to map an array of exported modules from require statements in one line:
const requires = (...modules) => modules.map(module => require(module));
const [fs, path] = requires('fs', 'path');
I was doing something similar to what @freakish suggests in his answer, with a project where I have a list of test scripts that are pulled into a Puppeteer + Jest testing setup. My test files follow the naming convention testname1.js - testnameN.js, and I was able to use a generator function to require N files from a particular directory with the approach below:
const fs = require('fs');
const path = require('path');
module.exports = class FilesInDirectory {
constructor(directory) {
this.fid = fs.readdirSync(path.resolve(directory));
this.requiredFiles = (this.fid.map((fileId) => {
let resolvedPath = path.resolve(directory, fileId);
return require(resolvedPath);
})).filter(file => !!file);
}
printRetrievedFiles() {
console.log(this.requiredFiles);
}
nextFileGenerator() {
const parent = this;
const fidLength = parent.requiredFiles.length;
function* iterate(index) {
while (index < fidLength) {
yield parent.requiredFiles[index++];
}
}
return iterate(0);
}
}
Then use like so:
//Use in test
const FilesInDirectory = require('./utilities/getfilesindirectory');
const StepsCollection = new FilesInDirectory('./test-steps');
const StepsGenerator = StepsCollection.nextFileGenerator();
//Assuming we're in an async function
await StepsGenerator.next().value.FUNCTION_REQUIRED_FROM_FILE(someArg);
How do I require all files in a folder in node.js?
I need something like:
files.forEach(function (v,k){
// require routes
require('./routes/'+v);
});
When require is given the path of a folder, it'll look for an index.js file in that folder; if there is one, it uses that, and if there isn't, it fails.
It would probably make most sense (if you have control over the folder) to create an index.js file and then assign all the "modules" and then simply require that.
yourfile.js
var routes = require("./routes");
index.js
exports.something = require("./routes/something.js");
exports.others = require("./routes/others.js");
If you don't know the filenames you should write some kind of loader.
Working example of a loader:
var normalizedPath = require("path").join(__dirname, "routes");
require("fs").readdirSync(normalizedPath).forEach(function(file) {
require("./routes/" + file);
});
// Continue application logic here
I recommend using glob to accomplish that task.
var glob = require( 'glob' )
, path = require( 'path' );
glob.sync( './routes/**/*.js' ).forEach( function( file ) {
require( path.resolve( file ) );
});
Based on @tbranyen's solution, I created an index.js file that loads arbitrary JavaScript files under the current folder as part of the exports.
// Load `*.js` under current directory as properties
// i.e., `User.js` will become `exports['User']` or `exports.User`
require('fs').readdirSync(__dirname + '/').forEach(function(file) {
if (file.match(/\.js$/) !== null && file !== 'index.js') {
var name = file.replace('.js', '');
exports[name] = require('./' + file);
}
});
Then you can require this directory from anywhere else.
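For example, if the directory is called models (a hypothetical name), you could then write:
// anywhere else in the project
const models = require('./models'); // resolves to ./models/index.js
console.log(models.User); // the exports of ./models/User.js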
Another option is to use the package require-dir, which lets you do the following. It supports recursion as well.
var requireDir = require('require-dir');
var dir = requireDir('./path/to/dir');
I have a folder /fields full of files with a single class each, ex:
fields/Text.js -> Text class
fields/Checkbox.js -> Checkbox class
Drop this in fields/index.js to export each class:
var collectExports, fs, path,
__hasProp = {}.hasOwnProperty;
fs = require('fs');
path = require('path');
collectExports = function(file) {
var func, include, _results;
if (path.extname(file) === '.js' && file !== 'index.js') {
include = require('./' + file);
_results = [];
for (func in include) {
if (!__hasProp.call(include, func)) continue;
_results.push(exports[func] = include[func]);
}
return _results;
}
};
fs.readdirSync('./fields/').forEach(collectExports);
This makes the modules act more like they would in Python:
var text = new Fields.Text()
var checkbox = new Fields.Checkbox()
One more option is require-dir-all, which combines features from the most popular packages.
The most popular, require-dir, does not have options to filter the files/dirs and does not have a map function (see below), but it uses a small trick to find the module's current path.
Second by popularity, require-all has regexp filtering and preprocessing, but lacks relative paths, so you need to use __dirname (this has pros and cons), like:
var libs = require('require-all')(__dirname + '/lib');
The require-index package mentioned here is quite minimalistic.
With map you may do some preprocessing, like creating objects and passing config values (assuming the modules below export constructors):
// Store config for each module in config object properties
// with property names corresponding to module names
var config = {
module1: { value: 'config1' },
module2: { value: 'config2' }
};
// Require all files in modules subdirectory
var modules = require('require-dir-all')(
'modules', // Directory to require
{ // Options
// function to be post-processed over exported object for each require'd module
map: function(reqModule) {
// create new object with corresponding config passed to constructor
reqModule.exports = new reqModule.exports( config[reqModule.name] );
}
}
);
// Now `modules` object holds not exported constructors,
// but objects constructed using values provided in `config`.
I know this question is 5+ years old, and the given answers are good, but I wanted something a bit more powerful for Express, so I created the express-map2 package for npm. I was going to name it simply express-map, but the people at Yahoo already have a package with that name, so I had to rename my package.
1. basic usage:
app.js (or whatever you call it)
var app = require('express')(); // 1. create an express app
app.set('controllers',__dirname+'/controllers/');// 2. set path to your controllers.
require('express-map2')(app); // 3. patch map() into express
app.map({
'GET /':'test',
'GET /foo':'middleware.foo,test',
'GET /bar':'middleware.bar,test' // separate your handlers with a comma.
});
controller usage:
//single function
module.exports = function(req,res){
};
//export an object with multiple functions.
module.exports = {
foo: function(req,res){
},
bar: function(req,res){
}
};
2. advanced usage, with prefixes:
app.map('/api/v1/books',{
'GET /': 'books.list', // GET /api/v1/books
'GET /:id': 'books.loadOne', // GET /api/v1/books/5
'DELETE /:id': 'books.delete', // DELETE /api/v1/books/5
'PUT /:id': 'books.update', // PUT /api/v1/books/5
'POST /': 'books.create' // POST /api/v1/books
});
As you can see, this saves a ton of time and makes the routing of your application dead simple to write, maintain, and understand. It supports all of the HTTP verbs that Express supports, as well as the special .all() method.
npm package: https://www.npmjs.com/package/express-map2
github repo: https://github.com/r3wt/express-map
Expanding on the glob solution above: do this if you want to import all modules from a directory into an index.js and then import that index.js in another part of the application. Note that template literals aren't supported by the highlighting engine used by Stack Overflow, so the code might look strange here.
const glob = require("glob");
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
/* see note about this in example below */
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
Full Example
Directory structure
globExample/example.js
globExample/foobars/index.js
globExample/foobars/unexpected.js
globExample/foobars/barit.js
globExample/foobars/fooit.js
globExample/example.js
const { foo, bar, keepit } = require('./foobars/index');
const longStyle = require('./foobars/index');
console.log(foo()); // foo ran
console.log(bar()); // bar ran
console.log(keepit()); // keepit ran unexpected
console.log(longStyle.foo()); // foo ran
console.log(longStyle.bar()); // bar ran
console.log(longStyle.keepit()); // keepit ran unexpected
globExample/foobars/index.js
const glob = require("glob");
/*
Note the following style also works with multiple exports per file (barit.js example)
but will overwrite if you have 2 exports with the same
name (unexpected.js and barit.js have a keepit function) in the files being imported. As a result, this method is best used when
you're exporting one module per file and using the filename to easily identify what is in it.
Also Note: This ignores itself (index.js) by default to prevent infinite loop.
*/
let allOfThem = {};
glob.sync(`${__dirname}/*.js`).forEach((file) => {
allOfThem = { ...allOfThem, ...require(file) };
});
module.exports = allOfThem;
globExample/foobars/unexpected.js
exports.keepit = () => 'keepit ran unexpected';
globExample/foobars/barit.js
exports.bar = () => 'bar run';
exports.keepit = () => 'keepit ran';
globExample/foobars/fooit.js
exports.foo = () => 'foo ran';
From inside project with glob installed, run node example.js
$ node example.js
foo ran
bar run
keepit ran unexpected
foo ran
bar run
keepit ran unexpected
One module that I have been using for this exact use case is require-all.
It recursively requires all files in a given directory and its subdirectories, as long as they don't match the excludeDirs property.
It also allows specifying a file filter and how to derive the keys of the returned hash from the filenames.
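A sketch based on its documented options (double-check the package README for the exact option names):
const requireAll = require('require-all');
const routes = requireAll({
  dirname: __dirname + '/routes',
  filter: /(.+)\.js$/, // the capture group becomes the key in the returned hash
  excludeDirs: /^\.(git|svn)$/,
  recursive: true
});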
Require all files from routes folder and apply as middleware. No external modules needed.
// require
const { readdirSync } = require("fs");
// apply as middleware
readdirSync("./routes").map((r) => app.use("/api", require("./routes/" + r)));
I'm using the copy-to module to create a single file that requires all the files in our Node.js-based system.
The code for our utility file looks like this:
/**
* Module dependencies.
*/
var copy = require('copy-to');
copy(require('./module1'))
.and(require('./module2'))
.and(require('./module3'))
.to(module.exports);
In all of the files, most functions are written as exports, like so:
exports.function1 = function () { /* function contents */ };
exports.function2 = function () { /* function contents */ };
exports.function3 = function () { /* function contents */ };
So, then to use any function from a file, you just call:
var utility = require('./utility');
var response = utility.function2(); // or whatever the name of the function is
You can use: https://www.npmjs.com/package/require-file-directory
Require selected files by name only, or all files.
No need for an absolute path.
Easy to understand and use.
Using this function you can require a whole dir.
const PATH = require( "path" );
const GetAllModules = ( dirname ) => {
  if ( dirname ) {
    let dirItems = require( "fs" ).readdirSync( dirname );
    return dirItems.reduce( ( acc, value ) => {
      if ( PATH.extname( value ) == ".js" && value.toLowerCase() != "index.js" ) {
        let moduleName = value.replace( /\.js$/, '' );
        acc[ moduleName ] = require( `${dirname}/${moduleName}` );
      }
      return acc;
    }, {} );
  }
}
// calling this function.
let dirModules = GetAllModules(__dirname);
Create an index.js file in your folder with this code:
const fs = require('fs')
const path = require('path')
for (const file of fs.readdirSync(__dirname)) {
  if (file !== 'index.js' && path.extname(file) === '.js') {
    require('./' + file)
  }
}
And after that you can simply load the whole folder with require("./routes").
If you want to include all *.js files in a directory (for example "app/lib/*.js"):
In directory app/lib
example.js:
module.exports = function (example) { }
example-2.js:
module.exports = function (example2) { }
In the app directory, create index.js.
index.js:
module.exports = require('./lib');