Maybe this question is a little silly, but is it possible to load multiple .js files with one require statement? Like this:
var mylib = require('./lib/mylibfiles');
and use:
mylib.foo(); //return "hello from one"
mylib.bar(); //return "hello from two"
And the folder mylibfiles will have two files:
One.js
exports.foo= function(){return "hello from one";}
Two.js
exports.bar= function(){return "hello from two";}
I was thinking of putting a package.json in the folder that says to load all the files, but I don't know how. Another approach I was thinking of is to have an index.js that exports everything again, but then I would be duplicating work.
Thanks!!
P.S.: I'm working with Node.js v0.611 on a Windows 7 machine
First of all, using require does not duplicate anything. It loads the module and caches it, so calling require again will get it from memory (thus you can modify the module on the fly without touching its source code - this is sometimes desirable, for example when you want to store a db connection inside a module).
Also package.json does not load anything and does not interact with your app at all. It is only used for npm.
Now, you cannot require multiple modules at once. For example, what would happen if both One.js and Two.js defined a function with the same name? There are more problems like that.
But what you can do is write an additional file, say modules.js, with the following content:
module.exports = {
    one: require('./one.js'),
    two: require('./two.js'),
    /* some other modules you want */
};
and then you can simply use
var modules = require('./modules.js');
modules.one.foo();
modules.two.bar();
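If you would rather call mylib.foo() directly, as in the question, the same idea works with an index.js inside the folder that merges the exports instead of namespacing them. A minimal sketch, assuming One.js and Two.js do not export clashing names:

// ./lib/mylibfiles/index.js -- hypothetical aggregator for the folder
// Copies every export of One.js and Two.js onto a single object.
// If two files export the same name, the one required last wins.
module.exports = Object.assign(
    {},
    require('./One.js'),
    require('./Two.js')
);

Since requiring a folder resolves to its index.js, require('./lib/mylibfiles') then gives you an object on which both mylib.foo() and mylib.bar() work.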
I have a snippet of code that requires more than one module, but it doesn't clump them together as your post suggests. However, that can be overcome with a trick that I found.
function requireMany () {
return Array.prototype.slice.call(arguments).map(function (value) {
try {
return require(value)
}
catch (event) {
return console.log(event)
}
})
}
And you use it as such
requireMany("fs", "socket.io", "path")
Which will return
[ fs {}, socketio {}, path {} ]
If a module is not found, an error will be logged to the console, but it won't break the programme; the module that failed to load will show up in the array as undefined, so the array will not be shorter because one of the modules failed to load.
Then you can bind each of those array elements to a variable name, like so:
var [fs, socketio, path] = requireMany("fs", "socket.io", "path")
It essentially works like destructuring an object: each element of the returned array is assigned to its own variable in the current scope. So, in your case, you could do:
var [foo, bar] = requireMany("./foo.js", "./bar.js")
foo() //return "hello from one"
bar() //return "hello from two"
And if you do want it to break the programme on error, just use this modified version, which is smaller
function requireMany () {
return Array.prototype.slice.call(arguments).map(require)
}
Yes, you may require a folder as a module, according to the node docs. Let's say you want to require() a folder called ./mypack/.
Inside ./mypack/, create a package.json file containing the name of the package and a "main" entry pointing at the main JavaScript file (here with the same name, inside a ./lib/ directory):
{
    "name": "mypack",
    "main": "./lib/mypack.js"
}
Now you can use require('./mypack') and node will load ./mypack/lib/mypack.js.
However, even if you do not include this package.json file, it may still work: without the file, node will attempt to load ./mypack/index.js, or, if that's not there, ./mypack/index.node.
My understanding is that this could be beneficial if you have split your program into many javascript files but do not want to concatenate them for deployment.
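If maintaining such an entry file by hand feels like duplicated work, a minimal index.js sketch (my own suggestion, not something from the docs) can require every other .js file in the folder and merge the exports:

// ./mypack/index.js -- hypothetical auto-loader for the folder
const fs = require('fs');
const path = require('path');

const merged = {};
fs.readdirSync(__dirname)
    .filter((file) => file.endsWith('.js') && file !== 'index.js')
    .forEach((file) => {
        // on a name clash, the file loaded last wins
        Object.assign(merged, require(path.join(__dirname, file)));
    });

module.exports = merged;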
You can use destructuring assignment to map an array of exported modules from require statements in one line:
const requires = (...modules) => modules.map(module => require(module));
const [fs, path] = requires('fs', 'path');
I was doing something similar to what #freakish suggests in his answer, on a project where I have a list of test scripts that are pulled into a Puppeteer + Jest testing setup. My test files follow the naming convention testname1.js - testnameN.js, and I was able to use a generator function to require N files from a particular directory with the approach below:
const fs = require('fs');
const path = require('path');
module.exports = class FilesInDirectory {
constructor(directory) {
this.fid = fs.readdirSync(path.resolve(directory));
this.requiredFiles = (this.fid.map((fileId) => {
let resolvedPath = path.resolve(directory, fileId);
return require(resolvedPath);
})).filter(file => !!file);
}
printRetrievedFiles() {
console.log(this.requiredFiles);
}
nextFileGenerator() {
const parent = this;
const fidLength = parent.requiredFiles.length;
function* iterate(index) {
while (index < fidLength) {
yield parent.requiredFiles[index++];
}
}
return iterate(0);
}
}
Then use like so:
//Use in test
const FilesInDirectory = require('./utilities/getfilesindirectory');
const StepsCollection = new FilesInDirectory('./test-steps');
const StepsGenerator = StepsCollection.nextFileGenerator();
//Assuming we're in an async function
await StepsGenerator.next().value.FUNCTION_REQUIRED_FROM_FILE(someArg);
Related
From the node.js documentation:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Is there a way to invalidate this cache? i.e. for unit testing, I'd like each test to be working on a fresh object.
You can always safely delete an entry in require.cache without a problem, even when there are circular dependencies. When you delete, you only delete a reference to the cached module object, not the module object itself; the module object will not be GCed, because in the case of circular dependencies there is still an object referencing it.
Suppose you have:
script a.js:
var b=require('./b.js').b;
exports.a='a from a.js';
exports.b=b;
and script b.js:
var a=require('./a.js').a;
exports.b='b from b.js';
exports.a=a;
when you do:
var a=require('./a.js')
var b=require('./b.js')
you will get:
> a
{ a: 'a from a.js', b: 'b from b.js' }
> b
{ b: 'b from b.js', a: undefined }
now if you edit your b.js:
var a=require('./a.js').a;
exports.b='b from b.js. changed value';
exports.a=a;
and do:
delete require.cache[require.resolve('./b.js')]
b=require('./b.js')
you will get:
> a
{ a: 'a from a.js', b: 'b from b.js' }
> b
{ b: 'b from b.js. changed value',
a: 'a from a.js' }
===
The above is valid when running Node.js directly. However, when using tools that have their own module caching system, such as Jest, the correct statement would be:
jest.resetModules();
If you always want to reload your module, you could add this function:
function requireUncached(module) {
delete require.cache[require.resolve(module)];
return require(module);
}
and then use requireUncached('./myModule') instead of require.
Yes, you can access the cache via require.cache[moduleName], where moduleName is the resolved filename of the module you wish to access. Deleting an entry with delete require.cache[moduleName] will cause the next require to load the actual file again.
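For instance, a minimal sketch (./config.js is just a hypothetical module here):

// force a fresh copy of a module on its next require
const resolved = require.resolve('./config.js');  // resolve to the full filename used as the cache key
delete require.cache[resolved];                   // drop the cached entry
const freshConfig = require('./config.js');       // the file is read and executed again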
This is how you would remove all cached files associated with the module:
/**
* Removes a module from the cache
*/
function purgeCache(moduleName) {
// Traverse the cache looking for the files
// loaded by the specified module name
searchCache(moduleName, function (mod) {
delete require.cache[mod.id];
});
// Remove cached paths to the module.
// Thanks to #bentael for pointing this out.
Object.keys(module.constructor._pathCache).forEach(function(cacheKey) {
if (cacheKey.indexOf(moduleName)>0) {
delete module.constructor._pathCache[cacheKey];
}
});
};
/**
* Traverses the cache to search for all the cached
* files of the specified module name
*/
function searchCache(moduleName, callback) {
// Resolve the module identified by the specified name
var mod = require.resolve(moduleName);
// Check if the module has been resolved and found within
// the cache
if (mod && ((mod = require.cache[mod]) !== undefined)) {
// Recursively go over the results
(function traverse(mod) {
// Go over each of the module's children and
// traverse them
mod.children.forEach(function (child) {
traverse(child);
});
// Call the specified callback providing the
// found cached module
callback(mod);
}(mod));
}
};
Usage would be:
// Load the package
var mypackage = require('./mypackage');
// Purge the package from cache
purgeCache('./mypackage');
Since this code uses the same resolver require does, just specify whatever you would for require.
"Unix was not designed to stop its users from doing stupid things, as
that would also stop them from doing clever things." – Doug Gwyn
I think there should have been a way to perform an explicit uncached module load.
There's a Simple Module for that (with tests)
We had this exact issue while testing our code (deleting cached modules so they can be re-required in a fresh state), so we reviewed all the suggestions in the various Stack Overflow questions & answers and put together a simple Node.js module (with tests):
https://www.npmjs.com/package/decache
As you would expect, it works for both published npm packages and locally defined modules, on Windows, Mac, Linux, etc.
How? (usage)
Usage is pretty simple:
install
Install the module from npm:
npm install decache --save-dev
Use it in your code:
// require the decache module:
const decache = require('decache');
// require a module that you wrote
let mymod = require('./mymodule.js');
// use your module the way you need to:
console.log(mymod.count()); // 0 (the initial state for our counter is zero)
console.log(mymod.incrementRunCount()); // 1
// delete the cached module:
decache('./mymodule.js');
//
mymod = require('./mymodule.js'); // fresh start
console.log(mymod.count()); // 0 (back to initial state ... zero)
If you have any questions or need more examples, please create a GitHub issue:
https://github.com/dwyl/decache/issues
For anyone coming across this who is using Jest: because Jest does its own module caching, there's a built-in function for this - just make sure jest.resetModules runs, e.g. after each of your tests:
afterEach( function() {
jest.resetModules();
});
Found this after trying to use decache like another answer suggested. Thanks to Anthony Garvan.
Function documentation here.
The solution is to use:
delete require.cache[require.resolve(<path of your script>)]
Here are some basic explanations for those who, like me, are a bit new to this:
Suppose you have a dummy example.js file in the root of your directory:
exports.message = "hi";
exports.say = function () {
console.log(message);
}
Then you require() like this:
$ node
> require('./example.js')
{ message: 'hi', say: [Function] }
If you then add a line like this to example.js:
exports.message = "hi";
exports.say = function () {
console.log(message);
}
exports.farewell = "bye!"; // this line is added later on
If you then continue in the console, the module is not updated:
> require('./example.js')
{ message: 'hi', say: [Function] }
That's when you can use the delete require.cache[require.resolve()] technique indicated in luff's answer:
> delete require.cache[require.resolve('./example.js')]
true
> require('./example.js')
{ message: 'hi', say: [Function], farewell: 'bye!' }
So the cache is cleared and require() picks up the content of the file again, loading all the current values.
rewire is great for this use case; you get a new instance with each call. Easy dependency injection for Node.js unit testing.
rewire adds a special setter and getter to modules so you can modify their behaviour for better unit testing. You may
inject mocks for other modules or globals like process
leak private variables
override variables within the module.
rewire does not load the file and eval the contents to emulate node's require mechanism. In fact it uses node's own require to load the module. Thus your module behaves exactly the same in your test environment as under regular circumstances (except your modifications).
Good news for all caffeine addicts: rewire also works with CoffeeScript. Note that in this case CoffeeScript needs to be listed in your devDependencies.
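A minimal sketch of typical rewire usage in a test (myModule.js and its private counter variable are hypothetical names):

// test.js -- assumes a local ./myModule.js that keeps a private variable `counter`
const rewire = require('rewire');

const myModule = rewire('./myModule.js');     // a fresh instance, not the cached one

myModule.__set__('counter', 42);              // inject/override a private variable
const counter = myModule.__get__('counter');  // leak it back out for assertions
console.log(counter); // 42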
I'd add to luff's answer one more line and change the parameter name:
function requireCached(_module){
var l = module.children.length;
for (var i = 0; i < l; i++)
{
if (module.children[i].id === require.resolve(_module))
{
module.children.splice(i, 1);
break;
}
}
delete require.cache[require.resolve(_module)];
return require(_module)
}
Yes, you can invalidate the cache.
The cache is stored in an object called require.cache which you can access directly according to filenames (e.g. - /projects/app/home/index.js as opposed to ./home which you would use in a require('./home') statement).
delete require.cache['/projects/app/home/index.js'];
Our team has found the following module useful for invalidating certain groups of modules:
https://www.npmjs.com/package/node-resource
I am not 100% certain of what you mean by 'invalidate', but you can add the following above the require statements to clear the cache:
Object.keys(require.cache).forEach(function(key) { delete require.cache[key] })
Taken from #Dancrumb's comment here
requireUncached with relative path: 🔥
const requireUncached = require => module => {
delete require.cache[require.resolve(module)];
return require(module);
};
module.exports = requireUncached;
Invoke requireUncached with a relative path:
const requireUncached = require('../helpers/require_uncached')(require);
const myModule = requireUncached('./myModule');
I couldn't neatly add code in an answer's comment, but I would use #Ben Barkay's answer and then add this to the require.uncache function.
// see https://github.com/joyent/node/issues/8266
// use it in #Ben Barkay's require.uncache function or along with it, whatever
Object.keys(module.constructor._pathCache).forEach(function(cacheKey) {
if ( cacheKey.indexOf(moduleName) > -1 ) {
delete module.constructor._pathCache[ cacheKey ];
}
});
Say you've required a module, then uninstalled it, then reinstalled the same module but with a different version that has a different main script in its package.json. The next require will fail, because that old main script no longer exists, yet its path is still cached in Module._pathCache.
If you want a module to simply never be cached (sometimes useful for development, but remember to remove it when done!) you can just put delete require.cache[module.id]; inside the module.
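A minimal sketch of such a self-uncaching module (the file name is hypothetical):

// always-fresh.js -- re-evaluated on every require of this file
module.exports = { startedAt: Date.now() };

// remove this module's own cache entry once it has finished loading,
// so the next require() executes the whole file again
delete require.cache[module.id];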
Here's my version of this answer, which handles the case where the file fails to load (for example, because it has syntax errors):
function reacquire(module) {
    const fullpath = require.resolve(module);
    const backup = require.cache[fullpath];
    delete require.cache[fullpath];
    try {
        const newcopy = require(module);
        console.log("reacquired:", module, typeof newcopy);
        return newcopy;
    } catch (e) {
        console.log("Can't reacquire", module, ":", e.message);
        require.cache[fullpath] = backup;
        return backup;
    }
}
The following two-step procedure works perfectly for me.
After dynamically changing the model file, i.e. 'mymodule.js', you need to delete the precompiled model from Mongoose first and then reload it using require-reload:
Example:
// Delete mongoose model
delete mongoose.connection.models[thisObject.singular('mymodule')]
// Reload model
var reload = require('require-reload')(require);
var entityModel = reload('./mymodule.js');
The documentation says:
Modules are cached in this object when they are required. By deleting a key value from this object, the next require will reload the module. This does not apply to native addons, for which reloading will result in an error.
If it's for unit tests, another good tool to use is proxyquire. Every time you proxyquire the module, it will invalidate the module cache and cache a new one. It also allows you to modify the modules required by the file that you are testing.
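A minimal sketch of that pattern (the file names and the stub are hypothetical):

// test.js -- assumes ./consumer.js internally does require('./store.js')
const proxyquire = require('proxyquire');

// every call re-evaluates ./consumer.js and injects this fake in place of ./store.js
const consumer = proxyquire('./consumer.js', {
    './store.js': { read: () => 'stubbed value' }
});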
I made a small module that deletes a module from the cache after loading it. This forces re-evaluation of the module the next time it is required. See https://github.com/bahmutov/require-and-forget
// random.js
module.exports = Math.random()
const forget = require('require-and-forget')
const r1 = forget('./random')
const r2 = forget('./random')
// r1 and r2 will be different
// "random.js" will not be stored in the require.cache
PS: you can also put "self-destruct" into the module itself. See https://github.com/bahmutov/unload-me
P.P.S.: more tricks with Node require in my post https://glebbahmutov.com/blog/hacking-node-require/
I am struggling to get const properties defined outside of module.exports from the object where I am using that module. Here is a simple example:
ServiceX.js
const _ = require('lodash');
module.exports = {
testFirstName: function () {
console.log('Nodics');
}
}
ServiceY.js
const utils = require('utils');
module.exports = {
testLastName: function () {
console.log('framework');
}
}
Now, if I import both files via require and merge them via _.merge(), the output contains both of the methods, but it doesn't contain any of the const variables defined outside exports.
let combined = _.merge(require('ServiceX'), require('ServiceY'));
I am writing this combined object to a third file, something like MyService.
Even if I print this combined object via console.log(combined), I get only the two functions, not the const properties.
The use case I have:
I have n files in different locations; I need to read the files, merge them all, and create a new file with the merged content.
Please help me,
Thanks
A problem here is that a service currently looks like:
const _ = require('lodash');
module.exports = {
testFirstName: function () {
console.log('Nodics');
}
}
The reason that's a problem is that if you read all the files and concatenate the result, module.exports will be overwritten by the last file read.
But if it instead looked like:
const _ = require('lodash');
module.exports.testFirstname = function () {
console.log('Nodics');
}
You could pull it off by doing fs.readFileSync() on all your services and concatenating the result.
You could also do it as a build command, rather than in code:
cat service*.js > combined.js
You will most likely run into other problems, though: if one service uses const _ = require('lodash') and another does the same, you will be trying to redefine a const variable.
To get around this you would have to move your requires into another scope, so they don't conflict at file-level scope.
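A minimal sketch of that scoping, assuming you concatenate the two services from above into one file:

// combined.js -- each service keeps its require()s inside its own function scope,
// so two `const _ = require('lodash')` lines no longer clash
(function () {
    const _ = require('lodash');
    module.exports.testFirstName = function () {
        console.log('Nodics');
    };
})();

(function () {
    const _ = require('lodash');
    module.exports.testLastName = function () {
        console.log('framework');
    };
})();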
I apologise for the phrasing of the question - it's a bit difficult to sum up as a question - please feel free to edit it if you can clarify. Also, as this is quite a complex and long query - thank you to all those who are putting in the time to read through it!
I have 4 files (listed with the directory tree from the project root) as part of a project I'm building which aims to scrape blockchains and take advantage of multiple cores to get the job done:
./main.js
./scraper.js
./api/api.js
./api/litecoin_api.js
main.js
const { spawn } = require('child_process')
const { scraper } = require('./scraper.js')
const blockchainCli = process.env.BLOCKSCRAPECLI || 'litecoin-cli'
const client = (args) => {
// create child process which returns a promise which resolves after
// data has finished buffering from locally hosted node using cli
let child = spawn(`${blockchainCli} ${args.join(' ')}`, {
shell: true
})
// ... wrap command in a promise here, etc
}
const main = () => {
// count cores, spawn a worker per core using node cluster, add
// message handlers, then begin scraping blockchain with each core...
scraper(blockHeight)
}
main()
module.exports = {
client,
blockchainCli
}
scraper.js
const api = require('./api/api.js')
const scraper = async (blockHeight) => {
try {
let blockHash = await api.getBlockHashByHeight(blockHeight)
let block = await api.getBlock(blockHash)
    // ... etc, scraper tested and working, writes to shared writeStream
  } catch (error) {
    console.error(error)
  }
}
module.exports = {
scraper
}
api.js
const { client, blockchainCli } = require('../main.js')
const litecoin = require('./litecoin_api')
let blockchain = undefined
if (blockchainCli === 'litecoin-cli' || blockchainCli === 'bitcoin-cli') {
blockchain = litecoin
}
// PROBLEM HERE: blockchainCli (and client) are both undefined if and
// only if running scraper from main.js (but not if running scraper
// from scraper.js)
const decodeRawTransaction = (txHash) => {
return client([blockchain.decodeRawTransaction, txHash])
}
const getBlock = (blockhash) => {
return client([blockchain.getBlock, blockhash])
}
const getBlockHashByHeight = (height) => {
return client([blockchain.getBlockHash, height])
}
const getInfo = () => {
return client([blockchain.getInfo])
}
const getRawTransaction = (txHash, verbose = true) => {
return client([blockchain.getRawTransaction, txHash, verbose])
}
module.exports = {
decodeRawTransaction,
getBlock,
getBlockHashByHeight,
getInfo,
getRawTransaction
}
So, I've taken out most of the noise in the files, which I don't think is necessary, but it's open source, so if you need more take a look here.
The problem is that, if I start the scraper from inside scraper.js by doing, say, something like this: scraper(1234567) it works like a charm and outputs the expected data to a csv file.
However if I start the scraper from inside the main.js file, I get this error:
Cannot read property 'getBlockHash' of undefined
at Object.getBlockHashByHeight (/home/grayedfox/github/blockscrape/api/api.js:19:29)
at scraper (/home/grayedfox/github/blockscrape/scraper.js:53:31)
at Worker.messageHandler (/home/grayedfox/github/blockscrape/main.js:81:5)
I don't know why, when launching the scraper from main.js, the blockchain is undefined. I thought it might be from the destructuring, but removing the curly braces from around the first line in the example main.js file doesn't change anything (same error).
Things are a bit messy at the moment (in the middle of developing this branch) - but the essential problem now is that it's not clear to me why the require would fail (cannot see variables inside main.js) if it's used in the following way:
main.js (execute scraper()) > scraper.js > api.js
But not fail (can see variables inside main.js) if it's run like this:
scraper.js (execute scraper()) > api.js
Thank you very much for your time!
You have a circular dependency between main.js and api.js, each requiring the other: main requires api (through scraper), and api directly requires main. That causes things not to work.
You have to remove the circular dependency by putting the common shared code into its own module that can be included by both, but doesn't include the modules that include it. It just needs better modularity.
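For example, one way to break the cycle (the file name client.js is my own suggestion, not from the repo) is to move client and blockchainCli out of main.js into a module that requires nothing from the rest of the app:

// client.js -- owns the shared pieces and has no app-internal dependencies
const { spawn } = require('child_process')

const blockchainCli = process.env.BLOCKSCRAPECLI || 'litecoin-cli'

const client = (args) => {
    // create child process which returns a promise which resolves after
    // data has finished buffering from the locally hosted node
    let child = spawn(`${blockchainCli} ${args.join(' ')}`, {
        shell: true
    })
    // ... wrap command in a promise here, etc
}

module.exports = { client, blockchainCli }

main.js and api.js then both require('./client.js'), and main.js no longer needs to export anything that api.js reads, so the cycle disappears.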