module.exports and local variables - node.js

I'm trying to create a small web app and I have a question about module.exports.
Suppose I have a file counter.js:
var counter = 0;
module.exports = {
  add: function() {
    counter++;
  },
  get: function() {
    return counter;
  }
};
and I reference this file from several files, say app.js and count.js, all located in the same directory:
// from app.js
var a = require('./counter.js');
a.add();
console.log(a.get()); // the value is 1
// from count.js
var b = require('./counter.js');
b.add();
console.log(b.get()); // the value is 2?
will the value be 1 or 2?

The result will be 2.
From the nodejs.org docs:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Multiple calls to require('foo') may not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.

If you mean that (for example) app.js will include count.js via require(), or that both app.js and count.js will be included by a third file via require(), then they all share one instance and the answer is 2.
If you mean what happens if you run node app.js and then separately run node count.js, then each process gets its own instance of the required module, so the answer is 1.
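To make the shared-instance case concrete, here is a minimal sketch of a hypothetical third file (call it main.js; it is not part of the original question) that requires both:
// main.js - assumes app.js, count.js and counter.js all sit in the same directory
require('./app.js');   // app.js calls add() once and logs 1
require('./count.js'); // count.js gets the same cached counter instance, calls add() again and logs 2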

Related

Discard and re-import dynamic import [duplicate]

From the node.js documentation:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Is there a way to invalidate this cache? i.e. for unit testing, I'd like each test to be working on a fresh object.
You can always safely delete an entry in require.cache, even when there are circular dependencies. That is because deleting only removes a reference to the cached module object, not the module object itself; in the circular-dependency case the module object will not be garbage-collected, since another object still references it.
Suppose you have:
script a.js:
var b=require('./b.js').b;
exports.a='a from a.js';
exports.b=b;
and script b.js:
var a=require('./a.js').a;
exports.b='b from b.js';
exports.a=a;
when you do:
var a=require('./a.js')
var b=require('./b.js')
you will get:
> a
{ a: 'a from a.js', b: 'b from b.js' }
> b
{ b: 'b from b.js', a: undefined }
now if you edit your b.js:
var a=require('./a.js').a;
exports.b='b from b.js. changed value';
exports.a=a;
and do:
delete require.cache[require.resolve('./b.js')]
b=require('./b.js')
you will get:
> a
{ a: 'a from a.js', b: 'b from b.js' }
> b
{ b: 'b from b.js. changed value',
a: 'a from a.js' }
===
The above is valid when running Node.js directly. However, when using tools that have their own module caching system, such as Jest, the correct statement would be:
jest.resetModules();
If you always want to reload your module, you could add this function:
function requireUncached(module) {
  delete require.cache[require.resolve(module)]; // drop the cached entry so the next require re-evaluates the file
  return require(module);
}
and then use requireUncached('./myModule') instead of require.
Yes, you can access the cache via require.cache[moduleName] where moduleName is the name of the module you wish to access. Deleting an entry by calling delete require.cache[moduleName] will cause require to load the actual file.
This is how you would remove all cached files associated with the module:
/**
 * Removes a module from the cache
 */
function purgeCache(moduleName) {
  // Traverse the cache looking for the files
  // loaded by the specified module name
  searchCache(moduleName, function (mod) {
    delete require.cache[mod.id];
  });

  // Remove cached paths to the module.
  // Thanks to @bentael for pointing this out.
  Object.keys(module.constructor._pathCache).forEach(function(cacheKey) {
    if (cacheKey.indexOf(moduleName) > 0) {
      delete module.constructor._pathCache[cacheKey];
    }
  });
};
/**
 * Traverses the cache to search for all the cached
 * files of the specified module name
 */
function searchCache(moduleName, callback) {
  // Resolve the module identified by the specified name
  var mod = require.resolve(moduleName);

  // Check if the module has been resolved and found within
  // the cache
  if (mod && ((mod = require.cache[mod]) !== undefined)) {
    // Recursively go over the results
    (function traverse(mod) {
      // Go over each of the module's children and
      // traverse them
      mod.children.forEach(function (child) {
        traverse(child);
      });

      // Call the specified callback providing the
      // found cached module
      callback(mod);
    }(mod));
  }
};
Usage would be:
// Load the package
var mypackage = require('./mypackage');
// Purge the package from cache
purgeCache('./mypackage');
Since this code uses the same resolver require does, just specify whatever you would for require.
"Unix was not designed to stop its users from doing stupid things, as
that would also stop them from doing clever things." – Doug Gwyn
I think there should have been a way to perform an explicit uncached module load.
There's a Simple Module for that (with tests)
We had this exact issue while testing our code (deleting cached modules so they can be re-required in a fresh state), so we reviewed all the suggestions in the various StackOverflow questions & answers and put together a simple node.js module (with tests):
https://www.npmjs.com/package/decache
As you would expect, it works for both published npm packages and locally defined modules, on Windows, Mac, Linux, etc.
How? (usage)
Usage is pretty simple:
install
Install the module from npm:
npm install decache --save-dev
Use it in your code:
// require the decache module:
const decache = require('decache');
// require a module that you wrote:
let mymod = require('./mymodule.js');
// use your module the way you need to:
console.log(mymod.count()); // 0 (the initial state for our counter is zero)
console.log(mymod.incrementRunCount()); // 1
// delete the cached module:
decache('./mymodule.js');
//
mymod = require('./mymodule.js'); // fresh start
console.log(mymod.count()); // 0 (back to initial state ... zero)
If you have any questions or need more examples, please create a GitHub issue:
https://github.com/dwyl/decache/issues
For anyone coming across this who is using Jest: because Jest does its own module caching, there's a built-in function for this. Just make sure jest.resetModules runs, e.g., after each of your tests:
afterEach( function() {
jest.resetModules();
});
Found this after trying to use decache like another answer suggested. Thanks to Anthony Garvan.
Function documentation here.
The solution is to use:
delete require.cache[require.resolve(<path of your script>)]
Here are some basic explanations for those who, like me, are a bit new to this:
Suppose you have a dummy example.js file in the root of your directory:
exports.message = "hi";
exports.say = function () {
  console.log(exports.message); // log the exported message (a bare `message` would not be defined here)
};
Then you require() like this:
$ node
> require('./example.js')
{ message: 'hi', say: [Function] }
If you then add a line like this to example.js:
exports.message = "hi";
exports.say = function () {
  console.log(exports.message);
};
exports.farewell = "bye!"; // this line is added later on
And continue in the console, the module is not updated:
> require('./example.js')
{ message: 'hi', say: [Function] }
That's when you can use delete require.cache[require.resolve()] indicated in luff's answer:
> delete require.cache[require.resolve('./example.js')]
true
> require('./example.js')
{ message: 'hi', say: [Function], farewell: 'bye!' }
So the cache is cleaned and the require() captures the content of the file again, loading all the current values.
rewire is great for this use case; you get a new instance with each call. Easy dependency injection for node.js unit testing.
rewire adds a special setter and getter to modules so you can modify their behaviour for better unit testing. You may
inject mocks for other modules or globals like process
leak private variables
override variables within the module.
rewire does not load the file and eval the contents to emulate node's require mechanism. In fact it uses node's own require to load the module. Thus your module behaves exactly the same in your test environment as under regular circumstances (except your modifications).
Good news to all caffeine-addicts: rewire works also with Coffee-Script. Note that in this case CoffeeScript needs to be listed in your devDependencies.
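As an illustration, here is a minimal sketch of how rewire might be used against the counter.js module from the first question above; the module path and the injected values are assumptions, while __set__ and __get__ are rewire's documented helpers:
const rewire = require('rewire');

const counter = rewire('./counter.js'); // a fresh instance, loaded independently of require.cache consumers

counter.__set__('counter', 41);          // override the module's private variable
counter.add();
console.log(counter.get());              // 42
console.log(counter.__get__('counter')); // 42, read straight from the private variable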
I'd add to luff's answer one more line and change the parameter name:
function requireCached(_module) {
  var l = module.children.length;
  for (var i = 0; i < l; i++) {
    if (module.children[i].id === require.resolve(_module)) {
      module.children.splice(i, 1);
      break;
    }
  }
  delete require.cache[require.resolve(_module)];
  return require(_module);
}
Yes, you can invalidate cache.
The cache is stored in an object called require.cache, which you can access directly by filename (e.g. /projects/app/home/index.js, as opposed to the ./home you would use in a require('./home') statement).
delete require.cache['/projects/app/home/index.js'];
Our team has found the following module useful for invalidating certain groups of modules:
https://www.npmjs.com/package/node-resource
I am not 100% certain of what you mean by 'invalidate', but you can add the following above the require statements to clear the cache:
Object.keys(require.cache).forEach(function(key) { delete require.cache[key] })
Taken from @Dancrumb's comment here.
requireUncached with relative path: 🔥
const requireUncached = require => module => {
  delete require.cache[require.resolve(module)];
  return require(module);
};

module.exports = requireUncached;
invoke requireUncached with relative path:
const requireUncached = require('../helpers/require_uncached')(require);
const myModule = requireUncached('./myModule');
I couldn't neatly add code in an answer's comment, but I would use @Ben Barkay's answer and then add this to the require.uncache function:
// see https://github.com/joyent/node/issues/8266
// use it in @Ben Barkay's require.uncache function, or along with it
Object.keys(module.constructor._pathCache).forEach(function(cacheKey) {
  if (cacheKey.indexOf(moduleName) > -1) {
    delete module.constructor._pathCache[cacheKey];
  }
});
Say you've required a module, then uninstalled it, then reinstalled the same module but with a different version that has a different main script in its package.json. The next require will fail, because that main script does not exist; its old path is still cached in Module._pathCache.
If you want a module to simply never be cached (sometimes useful for development, but remember to remove it when done!) you can just put delete require.cache[module.id]; inside the module.
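For example, a minimal sketch of such a never-cached module (the file name and exported value are made up):
// never-cached.js
console.log('evaluating never-cached.js'); // printed on every require
module.exports = { loadedAt: Date.now() };
delete require.cache[module.id]; // remove this module's own cache entry as it loads

// elsewhere:
var a = require('./never-cached');
var b = require('./never-cached'); // the file is evaluated again, so a !== b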
Here's my version of this answer, which handles not loading a file if it has (for example) syntax errors:
function reacquire(module) {
  const fullpath = require.resolve(module);
  const backup = require.cache[fullpath];
  delete require.cache[fullpath];

  try {
    const newcopy = require(module);
    console.log("reacquired:", module, typeof newcopy);
    return newcopy;
  } catch (e) {
    console.log("Can't reacquire", module, ":", e.message);
    require.cache[fullpath] = backup; // restore the previous cached version
    return backup;
  }
}
The following two-step procedure works perfectly for me.
After dynamically changing a model file, e.g. 'mymodule.js', you need to delete the precompiled Mongoose model first, then reload it using require-reload:
Example:
// Delete mongoose model
delete mongoose.connection.models[thisObject.singular('mymodule')]
// Reload model
var reload = require('require-reload')(require);
var entityModel = reload('./mymodule.js');
The documentation says:
Modules are cached in this object when they are required. By deleting a key value from this object, the next require will reload the module. This does not apply to native addons, for which reloading will result in an error.
If it's for unit tests, another good tool to use is proxyquire. Every time you proxyquire the module, it will invalidate the module cache and cache a new one. It also allows you to modify the modules required by the file that you are testing.
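A minimal sketch of what that might look like in a test; the module path './mymodule', its './store' dependency, the stubbed read() method and the load() call are all assumptions for illustration, not part of proxyquire itself:
const proxyquire = require('proxyquire');

const storeStub = { read: function () { return 'stubbed value'; } };

// loads a fresh copy of ./mymodule with its './store' dependency replaced by the stub
const mymodule = proxyquire('./mymodule', { './store': storeStub });

console.log(mymodule.load()); // uses storeStub.read() instead of the real store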
I made a small module to delete a module from the cache after loading. This forces re-evaluation of the module the next time it is required. See https://github.com/bahmutov/require-and-forget
// random.js
module.exports = Math.random()
const forget = require('require-and-forget')
const r1 = forget('./random')
const r2 = forget('./random')
// r1 and r2 will be different
// "random.js" will not be stored in the require.cache
PS: you can also put "self-destruct" into the module itself. See https://github.com/bahmutov/unload-me
PPS: more tricks with Node require are in my https://glebbahmutov.com/blog/hacking-node-require/

Require and instanceof

Can someone please explain to me why this happens and how to make it work correctly?
Imagine we have a module published on NPM called test-module with the following content:
function Class() {
}
module.exports = Class;
Then let's imagine our project folder looks like this:
[root]
-- child
and let's assume both the root and child folders have their own npm packages installed, including our test-module module (the same version of the module is used in both folders).
Then we have the following files in the project:
// [root]/index.js
var Test = require("test-module");
var childTest = require("./child");
var one = new Test();
childTest(one);
and
// [root]/child/index.js
var Test = require("test-module");
var two = new Test();
module.exports = function(test) {
console.log("#1:", test instanceof Test);
console.log("#2:", two instanceof Test);
}
If we were to run node index.js on the root folder the result in the console is this:
#1: false
#2: true
Question(s)
Shouldn't both the #1 and #2 statements equal true? Why does the statement marked as #1 in the console log equal false, even though the test argument (one) actually is an instance of our test-module module, just not one installed in the same folder? Is there a way to make this work without the hackish solutions that come to my mind?
Well, the problem is that you have two copies of the class in different files. Just because they are the same version doesn't mean JavaScript will treat them the same. Node.js in particular doesn't really care about the version of the module or whether there are other copies of it around.
Even though they have the same name, to JavaScript they are different classes, because every time the class code executes it produces a unique function. You can observe it like this:
// two functions that share a name are still two distinct functions
const ClassA = function SomeClass() {};
const ClassB = function SomeClass() {};
new ClassA() instanceof ClassB // > false
Now you probably wonder what you have to do to share the exact same module in your case. Normally what people do is have only one copy of the module installed at the root level, and then specify the module version in the child with peerDependencies, which npm ensures matches. Here's a good blog post about this issue.

How to use module.exports of Nodejs [duplicate]

What is the purpose of Node.js module.exports and how do you use it?
I can't seem to find any information on this, but it appears to be a rather important part of Node.js as I often see it in source code.
According to the Node.js documentation:
module
A reference to the current module. In particular, module.exports is the same as the exports object. See src/node.js for more information.
But this doesn't really help.
What exactly does module.exports do, and what would a simple example be?
module.exports is the object that's actually returned as the result of a require call.
The exports variable is initially set to that same object (i.e. it's a shorthand "alias"), so in the module code you would usually write something like this:
let myFunc1 = function() { ... };
let myFunc2 = function() { ... };
exports.myFunc1 = myFunc1;
exports.myFunc2 = myFunc2;
to export (or "expose") the internally scoped functions myFunc1 and myFunc2.
And in the calling code you would use:
const m = require('./mymodule');
m.myFunc1();
where the last line shows how the result of require is (usually) just a plain object whose properties may be accessed.
NB: if you overwrite exports then it will no longer refer to module.exports. So if you wish to assign a new object (or a function reference) to exports then you should also assign that new object to module.exports
It's worth noting that the name added to the exports object does not have to be the same as the module's internally scoped name for the value that you're adding, so you could have:
let myVeryLongInternalName = function() { ... };
exports.shortName = myVeryLongInternalName;
// add other objects, functions, as required
followed by:
const m = require('./mymodule');
m.shortName(); // invokes the module's internal myVeryLongInternalName
This has already been answered but I wanted to add some clarification...
You can use both exports and module.exports to import code into your application like this:
var mycode = require('./path/to/mycode');
The basic use case you'll see (e.g. in ExpressJS example code) is that you set properties on the exports object in a .js file that you then import using require()
So in a simple counting example, you could have:
(counter.js):
var count = 1;
exports.increment = function() {
count++;
};
exports.getCount = function() {
return count;
};
... then in your application (web.js, or really any other .js file):
var counting = require('./counter.js');
console.log(counting.getCount()); // 1
counting.increment();
console.log(counting.getCount()); // 2
In simple terms, you can think of required files as functions that return a single object, and you can add properties (strings, numbers, arrays, functions, anything) to the object that's returned by setting them on exports.
Sometimes you'll want the object returned from a require() call to be a function you can call, rather than just an object with properties. In that case you need to also set module.exports, like this:
(sayhello.js):
module.exports = exports = function() {
console.log("Hello World!");
};
(app.js):
var sayHello = require('./sayhello.js');
sayHello(); // "Hello World!"
The difference between exports and module.exports is explained better in this answer here.
Note that the NodeJS module mechanism is based on CommonJS modules which are supported in many other implementations like RequireJS, but also SproutCore, CouchDB, Wakanda, OrientDB, ArangoDB, RingoJS, TeaJS, SilkJS, curl.js, or even Adobe Photoshop (via PSLib).
You can find the full list of known implementations here.
Unless your module uses Node-specific features or modules, I strongly encourage you to use exports instead of module.exports, which is not part of the CommonJS standard and is therefore mostly unsupported by other implementations.
Another Node.js-specific feature is assigning a reference to a new object to exports instead of just adding properties and methods to it, as in the last example provided by Jed Watson in this thread. I would personally discourage this practice as it breaks the circular-reference support of the CommonJS module mechanism. It is not supported by all implementations, and Jed's example should then be written this way (or a similar one) to provide a more universal module:
(sayhello.js):
exports.run = function() {
console.log("Hello World!");
}
(app.js):
var sayHello = require('./sayhello');
sayHello.run(); // "Hello World!"
Or using ES6 features
(sayhello.js):
Object.assign(exports, {
// Put all your public API here
sayHello() {
console.log("Hello World!");
}
});
(app.js):
const { sayHello } = require('./sayhello');
sayHello(); // "Hello World!"
PS: It looks like Appcelerator also implements CommonJS modules, but without the circular reference support (see: Appcelerator and CommonJS modules (caching and circular references))
A few things you must take care of if you assign a reference to a new object to exports and/or module.exports:
1. All properties/methods previously attached to the original exports or module.exports are of course lost, because the exported object will now reference another, new one
This one is obvious, but if you add an exported method at the beginning of an existing module, be sure the native exported object is not referencing another object at the end:
exports.method1 = function () {}; // exposed to the original exported object
exports.method2 = function () {}; // exposed to the original exported object
module.exports.method3 = function () {}; // exposed with method1 & method2
var otherAPI = {
  // some properties and/or methods
};

exports = otherAPI; // replace the original API (works also with module.exports)
2. If either exports or module.exports references a new value, they no longer reference the same object:
exports = function AConstructor() {}; // override the original exported object
exports.method2 = function () {}; // exposed to the new exported object
// method added to the original exports object, which is not exposed any more
module.exports.method3 = function () {};
3. A tricky consequence: if you change the reference of both exports and module.exports, it's hard to say which API is exposed (it looks like module.exports wins):
// override the original exported object
module.exports = function AConstructor() {};
// try to override the original exported object
// but module.exports will be exposed instead
exports = function AnotherConstructor() {};
The module.exports property or the exports object allows a module to select what should be shared with the application.
I have a video on module.exports available here.
When dividing your program code over multiple files, module.exports is used to publish variables and functions to the consumers of a module. The require() call in your source file then returns the corresponding module.exports loaded from the module.
Remember when writing modules
Module loads are cached; only the initial call evaluates the JavaScript.
It's possible to use local variables and functions inside a module; not everything needs to be exported.
The module.exports object is also available as the exports shorthand, but when returning a sole function, always use module.exports.
According to: "Modules Part 2 - Writing modules".
The reference link works like this:
exports = module.exports = function(){
  //....
}
The properties of exports or module.exports, such as functions or variables, will be exposed outside.
There is something you must pay more attention to: don't override exports.
Why?
Because exports is just a reference to module.exports. You can add properties onto exports, but if you override exports, the reference link will be broken.
good example :
exports.name = 'william';
exports.getName = function(){
console.log(this.name);
}
bad example :
exports = 'william';
exports = function(){
//...
}
If you want to expose only one function or variable, like this:
// test.js
var name = 'william';
module.exports = function(){
console.log(name);
}
// index.js
var test = require('./test');
test();
This module exposes only one function, and the name variable is private to the outside.
There are some default or existing modules in Node.js when you download and install it, like http, sys, etc.
Since they are already in Node.js, when we want to use these modules we basically import them. Why? Because they are already present in Node.js; importing is like taking them from Node.js, putting them into your program, and then using them.
Exports is exactly the opposite: you are creating the module you want, let's say the module addition.js, and putting that module into Node.js by exporting it.
Before I write anything here, remember that module.exports.additionTwo is the same as exports.additionTwo.
Huh, so that's the reason we write:
exports.additionTwo = function(x) {
  return x + 2;
};
Be careful with the path
Let's say you have created an addition.js module,
exports.additionTwo = function(x) {
  return x + 2;
};
When you run this in your Node.js command prompt:
node
var run = require('addition.js');
it will error out, saying
Error: Cannot find module 'addition.js'
This is because the Node.js process is unable to find addition.js, since we didn't mention the path. We can set the path using NODE_PATH:
set NODE_PATH = path/to/your/addition.js
Now this should run successfully, without any errors!
One more thing: you can also run the addition.js file without setting NODE_PATH. Back in your Node.js command prompt:
node
var run = require('./addition.js');
Since we are providing the path here by saying it's in the current directory ./, this should also run successfully.
A module encapsulates related code into a single unit of code. When creating a module, this can be interpreted as moving all related functions into a file.
Suppose there is a file Hello.js which includes two functions:
var sayHelloInEnglish = function() {
  return "Hello";
};

var sayHelloInSpanish = function() {
  return "Hola";
};
We write a function only when the code will be used more than once.
Suppose we want to use these functions from a different file, say World.js. In this case exporting comes into the picture, which is done with module.exports.
You can export both functions with the code given below:
var anyVariable = {
  sayHelloInEnglish: function() {
    return "Hello";
  },
  sayHelloInSpanish: function() {
    return "Hola";
  }
};

module.exports = anyVariable;
Now you just need to require the file in World.js in order to use those functions:
var world = require("./Hello.js");
The intent is:
Modular programming is a software design technique that emphasizes
separating the functionality of a program into independent,
interchangeable modules, such that each contains everything necessary
to execute only one aspect of the desired functionality.
Wikipedia
I imagine it becomes difficult to write large programs without modular, reusable code. In Node.js we can create modular programs using module.exports to define what we expose, and compose our program with require.
Try this example:
fileLog.js
function log(string) { require('fs').appendFileSync('log.txt',string); }
module.exports = log;
stdoutLog.js
function log(string) { console.log(string); }
module.exports = log;
program.js
const log = require('./stdoutLog.js')
log('hello world!');
execute
$ node program.js
hello world!
Now try swapping ./stdoutLog.js for ./fileLog.js.
What is the purpose of a module system?
It accomplishes the following things:
Keeps our files from bloating to really big sizes. Files with e.g. 5000 lines of code in them are usually really hard to deal with during development.
Enforces separation of concerns. Having our code split up into multiple files allows us to have appropriate file names for every file. This way we can easily identify what every module does and where to find it (assuming we made a logical directory structure which is still your responsibility).
Having modules makes it easier to find certain parts of code which makes our code more maintainable.
How does it work?
Node.js uses the CommonJS module system, which works in the following manner:
If a file wants to export something, it has to declare it using the module.exports syntax.
If a file wants to import something, it has to declare it using the require('file') syntax.
Example:
test1.js
const test2 = require('./test2'); // returns the module.exports object of a file
test2.Func1(); // logs func1
test2.Func2(); // logs func2
test2.js
module.exports.Func1 = () => {console.log('func1')};
exports.Func2 = () => {console.log('func2')};
Other useful things to know:
Modules are cached. When you load the same module in 2 different files, the module only has to be loaded once; the second time require() is called on the same module, it is pulled from the cache.
Modules are loaded synchronously. This behavior is required; if it were asynchronous, we couldn't access the object returned from require() right away.
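A small sketch illustrating both points, reusing test2.js from the example above:
const first = require('./test2');  // evaluates test2.js (synchronously)
const second = require('./test2'); // served from require.cache, no re-evaluation

console.log(first === second); // true: both names point to the same exports object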
ECMAScript modules - 2022
Since Node 14.0, ECMAScript modules are no longer experimental and you can use them instead of Node's classic CommonJS modules.
ECMAScript modules are the official standard format to package JavaScript code for reuse. Modules are defined using a variety of import and export statements.
You can define an ES module that exports a function:
// my-fun.mjs
function myFun(num) {
// do something
}
export { myFun };
Then, you can import the exported function from my-fun.mjs:
// app.mjs
import { myFun } from './my-fun.mjs';
myFun();
.mjs is the default extension for Node.js ECMAScript modules.
But you can configure the default modules extension to lookup when resolving modules using the package.json "type" field, or the --input-type flag in the CLI.
Recent versions of Node.js fully support both ECMAScript and CommonJS modules. Moreover, they provide interoperability between them.
module.exports
ECMAScript and CommonJS modules have many differences, but the most relevant difference to this question is that there is no more require, no more exports, no more module.exports.
In most cases, the ES module import can be used to load CommonJS modules.
If needed, a require function can be constructed within an ES module using module.createRequire().
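A minimal sketch of that interop, assuming a hypothetical CommonJS file legacy.cjs sitting next to the ES module:
// interop.mjs
import { createRequire } from 'module';

const require = createRequire(import.meta.url); // a require() scoped to this file's location
const legacy = require('./legacy.cjs');         // the CommonJS exports object

console.log(typeof legacy);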
ECMAScript modules release history:
v15.3.0, v14.17.0, v12.22.0: Stabilized modules implementation
v14.13.0, v12.20.0: Support for detection of CommonJS named exports
v14.0.0, v13.14.0, v12.20.0: Remove experimental modules warning
v13.2.0, v12.17.0: Loading ECMAScript modules no longer requires a command-line flag
v12.0.0: Add support for ES modules using .js file extension via package.json "type" field
v8.5.0: Added initial ES modules implementation
You can find all the changelogs in the Node.js repository.
let test = function() {
  return "Hello world";
};
exports.test = test;

