In other words, the codebase I'm trying to refactor has a lot of this (anti?) pattern:
A.js
const libC = require("/path/to/libC")
const A_init = () => {
// some code
libC.someFunc()
global_variable = libC // <- setting the require return value to a global variable!
// some code
}
exports.A_init = A_init
B.js
const B_init = () => {
// some code
libC.someOtherFunc() // instead of "requiring" libC itself, it relies on A.js's init method to set it
// some code
}
exports.B_init = B_init
Now the user of this library will get a ReferenceError if they attempt to call B_init() before calling A_init(), which seems like very bad practice. I'm aware that global variables like this lead to tightly coupled, difficult-to-manage code. My question is: should the solution be for B.js to also simply "require" libC (and, by extension, for every single module in the codebase to do the same and "require" all of its own dependencies)? Is there some sort of DI framework/pattern used in Node to make this more elegant?
I am writing tests right now for my node application.
I have fixtures which I use to test my data, and I ran into the problem that when I alter any of them in a method, they are globally altered for all the other tests as well, which obviously has to do with referencing. I figured that if I wrote my fixtures into a JSON file and required that JSON in each test file, each file would get its own reference; it turns out they don't.
My question would be: is there an easy way to handle fixtures in Node such that every file gets its own instance of the fixtures that won't affect the other test files?
The way I currently import my fixtures in every test file:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
require calls are cached, so once you call it, consecutive calls will return the same object.
You can do the following:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = JSON.parse(JSON.stringify(fixture1));
const someOtherFixtureCopy = JSON.parse(JSON.stringify(someOtherFixture));
or use a package:
deepcopy
clone
const deepcopy = require('deepcopy');
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = deepcopy(fixture1);
const someOtherFixtureCopy = deepcopy(someOtherFixture);
Or change your module to export a function that will return new copies every time. This is the recommended approach, in my opinion.
// fixture.js
const deepcopy = require('deepcopy');
const fixture = { /* the fixture object you have */ };

module.exports = {
  get() {
    return deepcopy(fixture); // hands out a fresh copy every time
  }
};
const fixture = require('./fixture');
const fixture1 = fixture.get();
This isn't specific to JSON. It's not uncommon that modules need to be re-evaluated in tests. require.cache can be modified in Node.js to affect how modules are cached, either directly or with helpers like decache.
Depending on the case,
decache('../../../../../fixtures/keywords.json')
goes before the require in a test, or into an afterEach hook to clean up.
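For example, with a Mocha/Jest-style afterEach hook (a sketch; decache's single-argument call is its documented usage, the hook name depends on your test runner):
const decache = require('decache');

afterEach(() => {
  // drop the cached copy so the next require() re-reads the JSON from disk
  decache('../../../../../fixtures/keywords.json');
});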
What is the purpose of Node.js module.exports and how do you use it?
I can't seem to find any information on this, but it appears to be a rather important part of Node.js as I often see it in source code.
According to the Node.js documentation:
module: A reference to the current module. In particular, module.exports is the same as the exports object. See src/node.js for more information.
But this doesn't really help.
What exactly does module.exports do, and what would a simple example be?
module.exports is the object that's actually returned as the result of a require call.
The exports variable is initially set to that same object (i.e. it's a shorthand "alias"), so in the module code you would usually write something like this:
let myFunc1 = function() { ... };
let myFunc2 = function() { ... };
exports.myFunc1 = myFunc1;
exports.myFunc2 = myFunc2;
to export (or "expose") the internally scoped functions myFunc1 and myFunc2.
And in the calling code you would use:
const m = require('./mymodule');
m.myFunc1();
where the last line shows how the result of require is (usually) just a plain object whose properties may be accessed.
NB: if you overwrite exports then it will no longer refer to module.exports. So if you wish to assign a new object (or a function reference) to exports then you should also assign that new object to module.exports
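For example (a minimal sketch, using a hypothetical greet.js):
// greet.js
exports = function () { console.log('hi'); };
// broken: require('./greet') still returns the original (now empty) object

module.exports = exports = function () { console.log('hi'); };
// correct: module.exports and exports both point to the new function, so require('./greet')() works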
It's worth noting that the name added to the exports object does not have to be the same as the module's internally scoped name for the value that you're adding, so you could have:
let myVeryLongInternalName = function() { ... };
exports.shortName = myVeryLongInternalName;
// add other objects, functions, as required
followed by:
const m = require('./mymodule');
m.shortName(); // invokes the module's internal myVeryLongInternalName
This has already been answered but I wanted to add some clarification...
You can use both exports and module.exports to import code into your application like this:
var mycode = require('./path/to/mycode');
The basic use case you'll see (e.g. in ExpressJS example code) is that you set properties on the exports object in a .js file that you then import using require()
So in a simple counting example, you could have:
(counter.js):
var count = 1;
exports.increment = function() {
count++;
};
exports.getCount = function() {
return count;
};
... then in your application (web.js, or really any other .js file):
var counting = require('./counter.js');
console.log(counting.getCount()); // 1
counting.increment();
console.log(counting.getCount()); // 2
In simple terms, you can think of required files as functions that return a single object, and you can add properties (strings, numbers, arrays, functions, anything) to the object that's returned by setting them on exports.
Sometimes you'll want the object returned from a require() call to be a function you can call, rather than just an object with properties. In that case you need to also set module.exports, like this:
(sayhello.js):
module.exports = exports = function() {
console.log("Hello World!");
};
(app.js):
var sayHello = require('./sayhello.js');
sayHello(); // "Hello World!"
The difference between exports and module.exports is explained better in this answer here.
Note that the NodeJS module mechanism is based on CommonJS modules which are supported in many other implementations like RequireJS, but also SproutCore, CouchDB, Wakanda, OrientDB, ArangoDB, RingoJS, TeaJS, SilkJS, curl.js, or even Adobe Photoshop (via PSLib).
You can find the full list of known implementations here.
Unless your module uses Node-specific features or modules, I highly encourage you to use exports instead of module.exports, which is not part of the CommonJS standard and is therefore mostly unsupported by other implementations.
Another NodeJS-specific feature is assigning a reference to a new object to exports, instead of just adding properties and methods to it as in the last example provided by Jed Watson in this thread. I would personally discourage this practice as it breaks the circular reference support of the CommonJS module mechanism. It is then not supported by all implementations, and Jed's example should then be written this way (or a similar one) to provide a more universal module:
(sayhello.js):
exports.run = function() {
console.log("Hello World!");
}
(app.js):
var sayHello = require('./sayhello');
sayHello.run(); // "Hello World!"
Or using ES6 features
(sayhello.js):
Object.assign(exports, {
  // Put all your public API here
  sayHello() {
    console.log("Hello World!");
  }
});
(app.js):
const { sayHello } = require('./sayhello');
sayHello(); // "Hello World!"
PS: It looks like Appcelerator also implements CommonJS modules, but without the circular reference support (see: Appcelerator and CommonJS modules (caching and circular references))
A few things you must take care of if you assign a reference to a new object to exports and/or module.exports:
1. All properties/methods previously attached to the original exports or module.exports are of course lost because the exported object will now reference another new one
This one is obvious, but if you add an exported method at the beginning of an existing module, be sure the native exported object is not referencing another object at the end
exports.method1 = function () {}; // exposed to the original exported object
exports.method2 = function () {}; // exposed to the original exported object
module.exports.method3 = function () {}; // exposed with method1 & method2
var otherAPI = {
// some properties and/or methods
}
exports = otherAPI; // replace the original API (works also with module.exports)
2. If exports or module.exports is assigned a new value, they no longer reference the same object
exports = function AConstructor() {}; // exports now points to a new object, but module.exports is untouched
exports.method2 = function () {}; // attached to the new object, so NOT exposed by require()

// method added to the original exported object, which is still what require() returns
module.exports.method3 = function () {};
3. Tricky consequence: if you reassign both exports and module.exports, it is hard to say at a glance which API is exposed (module.exports wins)
// override the original exported object
module.exports = function AConstructor() {};
// try to override the original exported object
// but module.exports will be exposed instead
exports = function AnotherConstructor() {};
The module.exports property or the exports object allows a module to select what should be shared with the application.
I have a video on module.exports available here.
When dividing your program code over multiple files, module.exports is used to publish variables and functions to the consumers of a module. A require() call in your source file then evaluates to the corresponding module.exports loaded from the module.
Remember when writing modules
Module loads are cached, only initial call evaluates JavaScript.
It's possible to use local variables and functions inside a module, not everything needs to be exported.
The module.exports object is also available as exports shorthand. But when returning a sole function, always use module.exports.
According to: "Modules Part 2 - Writing modules".
The reference relationship looks like this:
exports = module.exports = function(){
//....
}
The properties of exports or module.exports, such as functions or variables, will be exposed outside.
There is one thing you must pay attention to: don't override exports.
Why?
Because exports is just a reference to module.exports; you can add properties onto exports, but if you override exports, the reference link will be broken.
good example :
exports.name = 'william';
exports.getName = function(){
console.log(this.name);
}
bad example :
exports = 'william';
exports = function(){
//...
}
If you just want to exposed only one function or variable , like this:
// test.js
var name = 'william';
module.exports = function(){
console.log(name);
}
// index.js
var test = require('./test');
test();
This module exposes only one function, and the name property stays private to the outside.
There are some built-in modules that ship with Node.js when you download and install it, like http, sys, etc.
Since they are already in Node.js, when we want to use these modules we basically import them. Why? Because they are already present in Node.js: importing is like taking them from Node.js and putting them into your program, and then using them.
Exports is exactly the opposite: you are creating the module you want, let's say the module addition.js, and making it available to Node.js; you do that by exporting it.
Before I write anything here, remember: module.exports.additionTwo is the same as exports.additionTwo.
That's the reason we write:
exports.additionTwo = function(x) {
  return x + 2;
};
Be careful with the path
Lets say you have created an addition.js module,
exports.additionTwo = function(x){
return x + 2;
};
When you run this on your NODE.JS command prompt:
node
var run = require('addition.js');
This will error out saying
Error: Cannot find module addition.js
This is because the Node.js process is unable to find addition.js, since we didn't mention the path. We can set the path by using NODE_PATH (it should point to the directory containing addition.js):
set NODE_PATH=path/to/your/modules
Now, this should run successfully without any errors!!
One more thing: you can also run the addition.js file without setting NODE_PATH. Back at your Node.js command prompt:
node
var run = require('./addition.js');
Since we are providing the path here by saying it's in the current directory ./ this should also run successfully.
A module encapsulates related code into a single unit of code. When creating a module, this can be interpreted as moving all related functions into a file.
Suppose there is a file Hello.js which includes two functions:
var sayHelloInEnglish = function() {
  return "Hello";
};
var sayHelloInSpanish = function() {
  return "Hola";
};
We write a function only when we intend to use it more than once.
Suppose we want to use these functions from a different file, say World.js. In this case exporting comes into the picture, via module.exports.
You can export both functions with the code given below:
var anyVariable = {
  sayHelloInEnglish: function() {
    return "Hello";
  },
  sayHelloInSpanish: function() {
    return "Hola";
  }
};
module.exports = anyVariable;
Now you just need to require the file in World.js in order to use those functions:
var hello = require("./hello.js");
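A quick usage sketch:
console.log(hello.sayHelloInEnglish()); // "Hello"
console.log(hello.sayHelloInSpanish()); // "Hola"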
The intent is:
Modular programming is a software design technique that emphasizes
separating the functionality of a program into independent,
interchangeable modules, such that each contains everything necessary
to execute only one aspect of the desired functionality.
Wikipedia
I imagine it becomes difficult to write large programs without modular / reusable code. In nodejs we can create modular programs utilising module.exports, defining what we expose, and compose our program with require.
Try this example:
fileLog.js
function log(string) { require('fs').appendFileSync('log.txt',string); }
module.exports = log;
stdoutLog.js
function log(string) { console.log(string); }
module.exports = log;
program.js
const log = require('./stdoutLog.js')
log('hello world!');
execute
$ node program.js
hello world!
Now try swapping ./stdoutLog.js for ./fileLog.js.
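Only the require line in program.js changes (a sketch); the call site stays the same:
const log = require('./fileLog.js')
log('hello world!'); // now appended to log.txt instead of printed to stdout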
What is the purpose of a module system?
It accomplishes the following things:
Keeps our files from bloating to really big sizes. Files with e.g. 5000 lines of code are usually really hard to deal with during development.
Enforces separation of concerns. Having our code split up into multiple files allows us to have appropriate file names for every file. This way we can easily identify what every module does and where to find it (assuming we made a logical directory structure which is still your responsibility).
Having modules makes it easier to find certain parts of code which makes our code more maintainable.
How does it work?
Node.js uses the CommonJS module system, which works in the following manner:
If a file wants to export something it has to declare it using the module.exports syntax
If a file wants to import something it has to declare it using require('file') syntax
Example:
test1.js
const test2 = require('./test2'); // returns the module.exports object of a file
test2.Func1(); // logs func1
test2.Func2(); // logs func2
test2.js
module.exports.Func1 = () => {console.log('func1')};
exports.Func2 = () => {console.log('func2')};
Other useful things to know:
Modules are cached. When you load the same module in 2 different files, the module only has to be loaded once. The second time a require() is called on the same module, it is pulled from the cache.
Modules are loaded synchronously. This behavior is required; if it were asynchronous, we couldn't access the object returned from require() right away.
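A minimal sketch of the caching behaviour, reusing test2.js from above (the file name cache-demo.js is a placeholder):
// cache-demo.js
const first = require('./test2');
const second = require('./test2'); // served from the require cache, test2.js is not evaluated again
console.log(first === second); // true: both names point to the same exports object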
ECMAScript modules - 2022
From Node 14.0, ECMAScript modules are no longer experimental, and you can use them instead of Node's classic CommonJS modules.
ECMAScript modules are the official standard format to package JavaScript code for reuse. Modules are defined using a variety of import and export statements.
You can define an ES module that exports a function:
// my-fun.mjs
function myFun(num) {
// do something
}
export { myFun };
Then, you can import the exported function from my-fun.mjs:
// app.mjs
import { myFun } from './my-fun.mjs';
myFun();
.mjs is the default extension for Node.js ECMAScript modules.
But you can configure the default modules extension to lookup when resolving modules using the package.json "type" field, or the --input-type flag in the CLI.
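For instance, a minimal package.json along these lines (the name is a placeholder) makes Node.js treat .js files in the package as ES modules:
{
  "name": "my-esm-package",
  "version": "1.0.0",
  "type": "module"
}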
Recent versions of Node.js fully support both ECMAScript and CommonJS modules. Moreover, they provide interoperability between them.
module.exports
ECMAScript and CommonJS modules have many differences but the most relevant difference - to this question - is that there are no more requires, no more exports, no more module.exports
In most cases, the ES module import can be used to load CommonJS modules.
If needed, a require function can be constructed within an ES module using module.createRequire().
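A short sketch of that escape hatch (the required path is a placeholder):
// app.mjs
import { createRequire } from 'module';

const require = createRequire(import.meta.url);
const legacy = require('./some-commonjs-module.js'); // load a CommonJS module from ES module code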
ECMAScript modules releases history
v15.3.0, v14.17.0, v12.22.0: Stabilized modules implementation
v14.13.0, v12.20.0: Support for detection of CommonJS named exports
v14.0.0, v13.14.0, v12.20.0: Remove experimental modules warning
v13.2.0, v12.17.0: Loading ECMAScript modules no longer requires a command-line flag
v12.0.0: Add support for ES modules using .js file extension via package.json "type" field
v8.5.0: Added initial ES modules implementation
You can find all the changelogs in Node.js repository
let test = function() {
return "Hello world"
};
exports.test = test;
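Assuming the snippet above is saved as test.js, it can be consumed like this:
const { test } = require('./test');
console.log(test()); // "Hello world"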
I'm working on my first NodeJS project. I started building modules in the classical way as I read on books and over the internet.
As the project started growing I decided to split modules into small reusable pieces. That led me to have a lot of requires at the top of each file and, sometimes, the risk of circular dependencies. Moreover, this approach doesn't really fit with testing, because I had to require all the dependencies to write tests. I asked other developers for a better way to solve this problem and most of them suggested using dependency injection with a constructor function.
Suppose I have ModuleA and ModuleB,
ModuleC requires both ModuleA and ModuleB. Instead of requiring these modules at the top of the file, I should pass them as arguments to a constructor function.
e.g.
module.exports = function ModuleC(moduleA, moduleB) {
  //module implementation here....
  function doSomething() {}
  return {
    doSomething
  }
}
The problem with this approach, which at first looked good, is that in the main entry point of the application I have to require and instantiate all the modules to pass around.
const ModuleA = require("./moduleA");
const ModuleB = require("./moduleB");
const ModuleC = require("./moduleC");
const moduleA = new ModuleA();
const moduleB = new ModuleB();
const moduleC = new ModuleC(moduleA, moduleB);
moduleC.doSomething();
Now, with only 3 modules, I had to write 7 lines of code to use a single function from one module. If I had 20 modules to work with, the main entry point of the application would be a nightmare.
I guess this is not the best approach, and even with this method testing is not easy.
So, I ask you to suggest/explain to me the best way to achieve this simple task, which I'm finding maybe harder than it is, while starting to explore the NodeJS world. Thanks.
Code re-usability can also be achieved if you put all of your code in a single file. Creating smaller modules is not the only solution. Consider the following code written in a file allLogic.js (say):
var allFunctions = function(){ };
var allFunctionsProto = allFunctions.prototype;
allFunctionsProto.getJSONFile = function(fileName){
var that = this; //use 'that' instead of 'this'
//code to fetch json file
};
allFunctionsProto.calculateInterest = function(price){
var that = this; //use 'that' instead of 'this'
//code to calculate interest
};
allFunctionsProto.sendResponse = function(data){
var that = this; //use 'that' instead of 'this'
//code to send response
};
//similarly, write all of your functions here using "allFunctionsProto.<yourFunctionName>"
module.exports = new allFunctions();
Whenever I need to fetch a JSON file, I know the logic for that has already been written in the allLogic.js file, so I require this module and use it as below.
var allLogic = require('../path-to-allLogic/allLogic.js');
allLogic.getJSONFile('some.json'); // 'some.json' is a placeholder; getJSONFile is the method defined above
This approach is far better than creating tons of modules, one for each piece of work. Of course, if the module gets too long you can create a new module, but in that case you need to consider separation of concerns, otherwise circular dependencies will haunt you.
As for your moduleA and moduleB used in moduleC: if you put all of the code from moduleA, moduleB and moduleC in a single module as I have pointed out, you can reference all of the separate functions inside that module using that, and they are also available after require.
I have read the details on NodeJS site : https://nodejs.org/api/modules.html. I don't understand how modules work, and what are the minimal steps for creating a module, and how npm can help me.
How can I create a module?
How do I use a module?
What does putting it on npm mean?
Note: this is a self answered question, with the purpose of sharing knowledge as a canonical.
You can create a NodeJS module using one line of code:
//mymodule.js
module.exports = 3;
Then you can load the module, by using require:
//app.js
require('./mymodule.js')
I added './' because it is a module of one file. We will cover it later.
Now if you do for example:
var mymodule = require('./mymodule.js');
console.log(mymodule); // 3
You can replace the number 3, with a function, for example:
//mymodule.js:
module.exports = function () {
console.log('function inside the module');
};
Then you can use it:
var mymodule = require('./mymodule.js');
mymodule();
Private variables:
Every variable you define inside a module will be visible only inside it:
//mymodule.js
var myPrivateVariable = 3;
publicVariable = 5; // Never use global variables in modules
//It's bad practice. Always add: var.
module.exports = function() {
// Every function of the module can use the private variables
return myPrivateVariable++
};
//app.js
var mymodule = require('./mymodule.js');
console.log(mymodule()); // return 3
console.log(mymodule()); // return 4
Reuse modules:
One more thing you need to know about NodeJS modules: if you use the same module twice (require it twice), it will return the same instance; it will not run twice.
for example:
//app.js
var mymodule1 = require('./mymodule.js');
var mymodule2 = require('./mymodule.js');
console.log(mymodule1()); //return 3
console.log(mymodule2()); //return 4 (not 3)
console.log(mymodule1()); //return 5
As you see in the example above, the private variable is shared between all the instances of the module.
A module package
If your module contains more than one file, or you want to share the module with others, you have to create the module in a separate folder and create a package.json file for it.
npm init will create package.json file for you.
For a module package, you need a package.json with at least a name and a version:
package.json
{
  "name" : "your-module-name",
  "version" : "0.0.3"
}
Now, you can publish the module, using npm publish. I recommend you publish all your modules to github as well, then the module will be connected to your github page.
What you publish to NPM will be accessible by everyone. So never publish modules that contain private data. For that you can use private npm modules.
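As an extra safeguard against publishing by accident, you can also mark a package as private in its package.json (this is distinct from paid private npm packages); npm will then refuse to publish it. The name below is a placeholder:
{
  "name": "my-internal-module",
  "version": "0.0.1",
  "private": true
}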
Next steps
Modules can return more than one function or one variable. See these samples in which we return an object.
module.exports.a = function() {
// ..
};
module.exports.b = function() {
// ..
};
// OR
var myObj = {
  a: 3,
  b: function() {
    return this.a;
  }
};
module.exports = myObj;
More info:
Read about package.json files
Versioning in your modules: best practices
More best practices for NodeJS modules
Private modules, using private npm
Related Questions:
What is the purpose of Node.js module.exports and how do you use it?
module.exports vs exports in Node.js
Creating a module in Node.js is pretty simple!
You may consider a module as a set of functionalities that you can use in other code by simply requiring it.
For example, consider a file functional.js with the content:
function display(){
console.log('i am in a display function');
}
module.exports = display;
Now just require it in any other module like:
var display = require('./functional');
display()
Output: i am in a display function
Similarly you can do:
var exports = module.exports = {};
exports.display = function(){
console.log('i am in the display function');
}
or you can do the same for objects, like:
var funObj = {
  hello: function(){
    console.log('hello function');
  },
  display: function(){
    console.log('display function');
  }
};
module.exports = funObj;
There are two main ways for wiring modules. One of them is using hard coded dependencies, explicitly loading one module into another using a require call. The other method is to use a dependency injection pattern, where we pass the components as a parameter or we have a global container (known as IoC, or Inversion of Control container), which centralizes the management of the modules.
We can allow Node.js to manage the modules life cycle by using hard coded module loading. It organizes your packages in an intuitive way, which makes understanding and debugging easy.
Dependency Injection is rarely used in a Node.js environment, although it is a useful concept. The DI pattern can result in an improved decoupling of the modules. Instead of explicitly defining dependencies for a module, they are received from the outside. Therefore they can be easily replaced with modules having the same interfaces.
Let’s see an example for DI modules using the factory pattern:
class Car {
  constructor (options) {
    this.engine = options.engine
  }

  start () {
    this.engine.start()
  }
}

function create (options) {
  return new Car(options)
}

module.exports = create
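Wiring it up might look like this (a sketch with hypothetical file names; note how a test can inject a fake engine with the same interface):
// app.js
const createCar = require('./car'); // the factory module above

const car = createCar({ engine: { start() { console.log('vroom'); } } });
car.start(); // 'vroom'

// in a test, inject a stub engine instead of the real one
const calls = [];
const testCar = createCar({ engine: { start() { calls.push('start'); } } });
testCar.start();
console.log(calls); // [ 'start' ]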