Mocha tests: avoid '../../' when requiring - node.js

My folder structure is:
/backend/services/service.js
/test/backend/services/service.js
In each service test, I use the following:
var service = require('../../../backend/services/service')
Is there a more elegant way of automatically requiring the file with the same path, only without the test folder? Maybe using mocha.opts somehow?

I can give you a slightly hacky solution that I use. I have a /test/__bootstrap.test.js file where I create a global testRequire() function that I only use for tests:
// Creation of a wrapper to avoid ../../../.. when using require()
var path = require('path');

global.testRequire = function(name) {
  return require(path.join(__dirname, '..', name));
};
Then in /test/backend/services/service.js I use testRequire() instead of require().
var service = testRequire('backend/services/service');
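If you want the bootstrap file loaded automatically, mocha.opts (which the question mentions) can do it: mocha reads that file as extra command-line arguments, so a --require line pulls the bootstrap in before any test runs. A minimal sketch, assuming mocha is run from the project root:
(test/mocha.opts):
--require ./test/__bootstrap.test.js
--recursive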

Require JSON as deep copy

I am writing tests right now for my node application.
I have fixtures which I use to test my data, and I ran into the problem that when I alter any of them in a method, they are globally altered for all the other tests as well, which obviously has to do with referencing. I figured that if I wrote my fixtures into a JSON file and required that JSON in each test file, each file would get its own copy, but it turns out they don't.
My question would be: is there an easy way to handle fixtures in Node such that every file has an instance of the fixtures which won't affect the other test files.
The way I currently import my fixtures in every test file:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
require calls are cached, so once you call it, consecutive calls will return the same object.
You can do the following:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = JSON.parse(JSON.stringify(fixture1));
const someOtherFixtureCopy = JSON.parse(JSON.stringify(someOtherFixture));
or use a package:
deepcopy
clone
const deepcopy = require('deepcopy');
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = deepcopy(fixture1);
const someOtherFixtureCopy = deepcopy(someOtherFixture);
Or change your module to export a function that returns a new copy every time. This is the recommended approach in my opinion:
// fixture.js
const deepcopy = require('deepcopy');
const fixture = { /* the object you have */ };

module.exports = {
  get() {
    return deepcopy(fixture);
  }
};
// in a test file
const fixtures = require('./fixture');
const fixture1 = fixtures.get();
This isn't specific to JSON. It's not uncommon that modules need to be re-evaluated in tests. require.cache can be modified in Node.js to affect how modules are cached, either directly or with helpers like decache.
Depending on the case,
decache('../../../../../fixtures/keywords.json')
goes either before the require in a test, or in an afterEach hook to clean up.
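For instance, a minimal sketch of the afterEach variant, using the decache package mentioned above:
const decache = require('decache');

afterEach(function() {
  // drop the cached JSON so the next require() returns a fresh copy
  decache('../../../../../fixtures/keywords.json');
});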

How to use module.exports of Nodejs [duplicate]

What is the purpose of Node.js module.exports and how do you use it?
I can't seem to find any information on this, but it appears to be a rather important part of Node.js as I often see it in source code.
According to the Node.js documentation:
module - A reference to the current module. In particular module.exports is the same as the exports object. See src/node.js for more information.
But this doesn't really help.
What exactly does module.exports do, and what would a simple example be?
module.exports is the object that's actually returned as the result of a require call.
The exports variable is initially set to that same object (i.e. it's a shorthand "alias"), so in the module code you would usually write something like this:
let myFunc1 = function() { ... };
let myFunc2 = function() { ... };
exports.myFunc1 = myFunc1;
exports.myFunc2 = myFunc2;
to export (or "expose") the internally scoped functions myFunc1 and myFunc2.
And in the calling code you would use:
const m = require('./mymodule');
m.myFunc1();
where the last line shows how the result of require is (usually) just a plain object whose properties may be accessed.
NB: if you overwrite exports then it will no longer refer to module.exports. So if you wish to assign a new object (or a function reference) to exports then you should also assign that new object to module.exports
It's worth noting that the name added to the exports object does not have to be the same as the module's internally scoped name for the value that you're adding, so you could have:
let myVeryLongInternalName = function() { ... };
exports.shortName = myVeryLongInternalName;
// add other objects, functions, as required
followed by:
const m = require('./mymodule');
m.shortName(); // invokes the module's internal myVeryLongInternalName
This has already been answered but I wanted to add some clarification...
You can use both exports and module.exports to import code into your application like this:
var mycode = require('./path/to/mycode');
The basic use case you'll see (e.g. in ExpressJS example code) is that you set properties on the exports object in a .js file that you then import using require()
So in a simple counting example, you could have:
(counter.js):
var count = 1;
exports.increment = function() {
count++;
};
exports.getCount = function() {
return count;
};
... then in your application (web.js, or really any other .js file):
var counting = require('./counter.js');
console.log(counting.getCount()); // 1
counting.increment();
console.log(counting.getCount()); // 2
In simple terms, you can think of required files as functions that return a single object, and you can add properties (strings, numbers, arrays, functions, anything) to the object that's returned by setting them on exports.
Sometimes you'll want the object returned from a require() call to be a function you can call, rather than just an object with properties. In that case you need to also set module.exports, like this:
(sayhello.js):
module.exports = exports = function() {
console.log("Hello World!");
};
(app.js):
var sayHello = require('./sayhello.js');
sayHello(); // "Hello World!"
The difference between exports and module.exports is explained better in this answer here.
Note that the NodeJS module mechanism is based on CommonJS modules which are supported in many other implementations like RequireJS, but also SproutCore, CouchDB, Wakanda, OrientDB, ArangoDB, RingoJS, TeaJS, SilkJS, curl.js, or even Adobe Photoshop (via PSLib).
You can find the full list of known implementations here.
Unless your module uses Node-specific features or modules, I highly encourage you to use exports instead of module.exports, which is not part of the CommonJS standard and is therefore mostly unsupported by other implementations.
Another NodeJS-specific feature is assigning a reference to a new object to exports instead of just adding properties and methods to it, as in the last example provided by Jed Watson in this thread. I would personally discourage this practice, as it breaks the circular reference support of the CommonJS module mechanism. It is not supported by all implementations, and Jed's example should then be written this way (or a similar one) to provide a more universal module:
(sayhello.js):
exports.run = function() {
console.log("Hello World!");
}
(app.js):
var sayHello = require('./sayhello');
sayHello.run(); // "Hello World!"
Or using ES6 features
(sayhello.js):
Object.assign(exports, {
// Put all your public API here
sayhello() {
console.log("Hello World!");
}
});
(app.js):
const { sayHello } = require('./sayhello');
sayHello(); // "Hello World!"
PS: It looks like Appcelerator also implements CommonJS modules, but without the circular reference support (see: Appcelerator and CommonJS modules (caching and circular references))
A few things you must take care of if you assign a reference to a new object to exports and/or module.exports:
1. All properties/methods previously attached to the original exports or module.exports are of course lost, because the exported object now references a new one
This one is obvious, but if you add an exported method at the beginning of an existing module, be sure the native exported object is not referencing another object at the end:
exports.method1 = function () {}; // exposed to the original exported object
exports.method2 = function () {}; // exposed to the original exported object
module.exports.method3 = function () {}; // exposed with method1 & method2
var otherAPI = {
// some properties and/or methods
}
exports = otherAPI; // replaces the reference (with module.exports, the original API above would be lost)
2. If either exports or module.exports references a new value, they no longer reference the same object
exports = function AConstructor() {}; // replaces the reference, but is NOT what require() returns
exports.method2 = function () {}; // attached to the new object, so not exposed
// method added to the original exported object, which is still what require() returns
module.exports.method3 = function () {};
3. Tricky consequence: if you reassign both exports and module.exports, it's hard to say at a glance which API is exposed (module.exports wins)
// override the original exported object
module.exports = function AConstructor() {};
// try to override the original exported object
// but module.exports will be exposed instead
exports = function AnotherConstructor() {};
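A quick way to see this from the consuming side (assuming the snippet above is saved as mymodule.js, a name chosen here for illustration):
const M = require('./mymodule');
console.log(M.name); // "AConstructor" - module.exports won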
The module.exports property or the exports object allows a module to select what should be shared with the application.
When dividing your program code over multiple files, module.exports is used to publish variables and functions to the consumers of a module. A require() call in your source file then resolves to the corresponding module.exports loaded from the module.
Remember when writing modules
Module loads are cached; only the initial call evaluates the JavaScript.
It's possible to use local variables and functions inside a module, not everything needs to be exported.
The module.exports object is also available as exports shorthand. But when returning a sole function, always use module.exports.
According to: "Modules Part 2 - Writing modules".
The relationship between the two looks like this:
exports = module.exports = function(){
//....
}
The properties of exports or module.exports, such as functions or variables, will be exposed outside.
There is something you must pay attention to: don't override exports.
Why?
Because exports is just a reference to module.exports. You can add properties onto exports, but if you override exports, the reference link is broken.
Good example:
exports.name = 'william';
exports.getName = function(){
console.log(this.name);
}
Bad example:
exports = 'william';
exports = function(){
//...
}
If you want to expose only one function or variable, do it like this:
// test.js
var name = 'william';
module.exports = function(){
console.log(name);
}
// index.js
var test = require('./test');
test();
This module exposes only one function, and the name variable stays private to the module.
There are some default or existing modules in node.js when you download and install node.js, like http, sys etc.
Since they are already in node.js, when we want to use these modules we basically import them. Importing is like taking them from node.js and putting them into your program, and then using them.
Exports is exactly the opposite: you create the module you want, let's say the module addition.js, and make it available to node.js by exporting it.
Before I write anything here, remember: module.exports.additionTwo is the same as exports.additionTwo.
So that's the reason we write:
exports.additionTwo = function(x) {
  return x + 2;
};
Be careful with the path
Let's say you have created an addition.js module,
exports.additionTwo = function(x){
return x + 2;
};
When you run this on your NODE.JS command prompt:
node
var run = require('addition.js');
This will error out saying
Error: Cannot find module addition.js
This is because the node.js process is unable to locate addition.js, since we didn't mention the path. We can set the path using NODE_PATH (it should point at the directory that contains addition.js):
set NODE_PATH=path/to/your/modules
Now, this should run successfully without any errors!!
One more thing: you can also run the addition.js file without setting NODE_PATH. Back in your nodejs command prompt:
node
var run = require('./addition.js');
Since we are providing the path here, saying it's in the current directory (./), this should also run successfully.
A module encapsulates related code into a single unit of code. When creating a module, this can be interpreted as moving all related functions into a file.
Suppose there is a file Hello.js which include two functions
var sayHelloInEnglish = function() {
  return "Hello";
};
var sayHelloInSpanish = function() {
  return "Hola";
};
We write a function only when the code will be used more than once.
Suppose we want to use these functions in a different file, say World.js. This is where exporting comes into the picture, which can be done via module.exports.
You can export both functions with the code given below:
var anyVariable = {
  sayHelloInEnglish: function() {
    return "Hello";
  },
  sayHelloInSpanish: function() {
    return "Hola";
  }
};
module.exports = anyVariable;
Now you just need to require the file in World.js in order to use those functions:
var world = require("./Hello.js");
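Then, with the export fixed as above, World.js can call the functions:
console.log(world.sayHelloInEnglish()); // "Hello"
console.log(world.sayHelloInSpanish()); // "Hola"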
The intent is:
Modular programming is a software design technique that emphasizes
separating the functionality of a program into independent,
interchangeable modules, such that each contains everything necessary
to execute only one aspect of the desired functionality.
Wikipedia
I imagine it becomes difficult to write large programs without modular / reusable code. In nodejs we can create modular programs utilising module.exports to define what we expose, and compose our program with require.
Try this example:
fileLog.js
function log(string) { require('fs').appendFileSync('log.txt',string); }
module.exports = log;
stdoutLog.js
function log(string) { console.log(string); }
module.exports = log;
program.js
const log = require('./stdoutLog.js')
log('hello world!');
execute
$ node program.js
hello world!
Now try swapping ./stdoutLog.js for ./fileLog.js.
What is the purpose of a module system?
It accomplishes the following things:
Keeps our files from bloating to really big sizes. Files with e.g. 5000 lines of code are usually really hard to deal with during development.
Enforces separation of concerns. Having our code split up into multiple files allows us to have appropriate file names for every file. This way we can easily identify what every module does and where to find it (assuming we made a logical directory structure which is still your responsibility).
Having modules makes it easier to find certain parts of code which makes our code more maintainable.
How does it work?
NodejS uses the CommonJS module system, which works in the following manner:
If a file wants to export something, it has to declare it using the module.exports syntax
If a file wants to import something it has to declare it using require('file') syntax
Example:
test1.js
const test2 = require('./test2'); // returns the module.exports object of a file
test2.Func1(); // logs func1
test2.Func2(); // logs func2
test2.js
module.exports.Func1 = () => {console.log('func1')};
exports.Func2 = () => {console.log('func2')};
Other useful things to know:
Modules get cached. When you load the same module in 2 different files, the module only has to be loaded once. The second time require() is called on the same module, it is pulled from the cache (see the sketch below).
Modules are loaded synchronously. This behavior is required; if loading were asynchronous, we couldn't access the object returned by require() right away.
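The caching behavior is easy to verify; a minimal sketch, reusing test2.js from the example above:
const a = require('./test2');
const b = require('./test2');
console.log(a === b); // true - both names point at the same cached module.exports object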
ECMAScript modules - 2022
Since Node 14.0, ECMAScript modules are no longer experimental and you can use them instead of Node's classic CommonJS modules.
ECMAScript modules are the official standard format to package JavaScript code for reuse. Modules are defined using a variety of import and export statements.
You can define an ES module that exports a function:
// my-fun.mjs
function myFun(num) {
// do something
}
export { myFun };
Then, you can import the exported function from my-fun.mjs:
// app.mjs
import { myFun } from './my-fun.mjs';
myFun();
.mjs is the default extension for Node.js ECMAScript modules.
But you can configure the default modules extension to lookup when resolving modules using the package.json "type" field, or the --input-type flag in the CLI.
Recent versions of Node.js fully support both ECMAScript and CommonJS modules, and provide interoperability between them.
module.exports
ECMAScript and CommonJS modules have many differences but the most relevant difference - to this question - is that there are no more requires, no more exports, no more module.exports
In most cases, the ES module import can be used to load CommonJS modules.
If needed, a require function can be constructed within an ES module using module.createRequire().
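For example, a minimal sketch of createRequire in an ES module (the file and module names are placeholders):
// load-cjs.mjs
import { createRequire } from 'module';

const require = createRequire(import.meta.url);
const cjsModule = require('./some-cjs-module.js'); // a hypothetical CommonJS file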
ECMAScript modules releases history
v15.3.0, v14.17.0, v12.22.0 - Stabilized modules implementation
v14.13.0, v12.20.0 - Support for detection of CommonJS named exports
v14.0.0, v13.14.0, v12.20.0 - Remove experimental modules warning
v13.2.0, v12.17.0 - Loading ECMAScript modules no longer requires a command-line flag
v12.0.0 - Add support for ES modules using .js file extension via package.json "type" field
v8.5.0 - Added initial ES modules implementation
You can find all the changelogs in Node.js repository
// test.js: exporting a single named function
let test = function() {
  return "Hello world";
};
exports.test = test;
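Consuming this (assuming the snippet above is saved as test.js):
const { test } = require('./test');
console.log(test()); // "Hello world"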

Node modules as constructor function

I'm working on my first NodeJS project. I started building modules in the classical way as I read on books and over the internet.
As the project started growing I decided to split modules into small reusable pieces. That led me to have a lot of requires at the top of the file and sometimes the risk of circular dependencies. Moreover, this approach doesn't really fit with testing, because I have to require all the dependencies to run tests. I asked other developers for a better way to solve this problem and most of them suggested using dependency injection with constructor functions.
Suppose I have ModuleA and ModuleB,
ModuleC requires both ModuleA and ModuleB. Instead of requiring these modules at the top of the page I should pass them as arguments to a constructor function.
e.g.
module.exports = function ModuleC(moduleA, moduleB) {
//module implementation here....
function doSomething() {}
return {
doSomething
}
}
The problem with this approach, which at first looked good, is that in the main entry point of the application I have to require and instantiate all the modules to pass around:
const ModuleA = require("./moduleA");
const ModuleB = require("./moduleB");
const ModuleC = require("./moduleC");
const moduleA = new ModuleA();
const moduleB = new ModuleB();
const moduleC = new ModuleC(moduleA, moduleB);
moduleC.doSomething();
Now, with only 3 modules, I had to write 7 lines of code to use one function of one module. If I had 20 modules to work with, the main entry point of the application would be a nightmare.
I guess this is not the best approach, and even with this method testing is not easy.
So, I ask you to suggest/explain the best way to achieve this simple task, which I'm finding maybe harder than it is, while starting to explore the NodeJS world. Thanks.
Code re-usability can also be achieved if you put all of your code in a single file. Creating smaller modules is not the only solution. Consider the following code, written in a file called allLogic.js:
var allFunctions = function(){ };
var allFunctionsProto = allFunctions.prototype;
allFunctionsProto.getJSONFile = function(fileName){
var that = this; //use 'that' instead of 'this'
//code to fetch json file
};
allFunctionsProto.calculateInterest = function(price){
var that = this; //use 'that' instead of 'this'
//code to calculate interest
};
allFunctionsProto.sendResponse = function(data){
var that = this; //use 'that' instead of 'this'
//code to send response
};
//similary write all of your functions here using "allFunctionsProto.<yourFunctionName>"
module.exports = new allFunctions();
Whenever I need to get any JSON file, I know that the logic to fetch JSON files has already been written in the allLogic.js file, hence I will require this module and use it as below (the file name argument here is illustrative):
var allLogic = require('../path-to-allLogic/allLogic.js');
allLogic.getJSONFile('data.json');
This approach is far better than creating tons of modules for each task. Of course, if the module gets too long you can create a new module, but in that case you need to consider separation of concerns, otherwise circular dependencies will haunt you.
As you are using moduleA and moduleB in moduleC: if you put all of the code from moduleA, moduleB and moduleC in a single module as I have pointed out, you can reference all of the separate functions inside that module using that, and those are also available after require.
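On the other hand, if you do stay with the constructor-injection style from the question, one common way to tame the wiring in the entry point is a tiny hand-rolled registry that instantiates modules lazily. A minimal sketch with a hypothetical API (container.js and its register/resolve functions are made up for illustration):
(container.js):
const registry = new Map();
const instances = new Map();

exports.register = function(name, factory, deps) {
  registry.set(name, { factory: factory, deps: deps || [] });
};

exports.resolve = function resolve(name) {
  if (!instances.has(name)) {
    const entry = registry.get(name);
    const args = entry.deps.map(resolve); // resolve dependencies first, recursively
    instances.set(name, entry.factory.apply(null, args));
  }
  return instances.get(name);
};
(index.js):
const container = require('./container');
container.register('moduleA', require('./moduleA'));
container.register('moduleB', require('./moduleB'));
container.register('moduleC', require('./moduleC'), ['moduleA', 'moduleB']);
container.resolve('moduleC').doSomething();
The entry point stays one line per module regardless of how deep the dependency graph gets, and tests can register fakes under the same names.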

Using Yeoman programmatically inside nodejs project

I want to use an yeoman generator inside a NodeJS project
I installed yeoman-generator and generator-git (the generator that I want to use) as local dependencies, and at this moment my code is like this:
var env = require('yeoman-generator')();
var path = require('path');
var gitGenerator = require('generator-git');
var workingDirectory = path.join(process.cwd(), 'install_here/');
generator = env.create(gitGenerator);
obviously the last line doesn't work and doesn't generate the scaffold.
The question: How to?
Importantly, I want to stay in local dependency level!
@simon-boudrias's solution does work, but after I changed the process.chdir(), this.templatePath() and this.destinationPath() return the same path.
I could have used this.sourcePath() to tweak the template path, but having to change this for each yeoman generator is not so useful. After digging into yo-cli, I found the following works without affecting the path:
var env = require('yeoman-environment').createEnv();
env.lookup(function() {
env.run('generator-name');
});
env.create() only instantiates a generator - it doesn't run it.
To run it, you could call generator.run(). But that's not ideal.
The best way IMO would be this way:
var path = require('path');
var env = require('yeoman-generator')();
var gitGenerator = require('generator-git');
// Optional: look up every generator on your system. That'll allow composition if needed:
// env.lookup();
env.registerStub(gitGenerator, 'git:app');
env.run('git:app');
If necessary, make sure to process.chdir() in the right directory before launching your generator.
Relevant documentation on the Yeoman Environment class can be found here: http://yeoman.io/environment/Environment.html
Also see: http://yeoman.io/authoring/integrating-yeoman.html
The yeoman-test module is also very useful if you want to pass predefined answers to your prompts. This worked for me.
var path = require('path');
var yeomanTest = require('yeoman-test');
var answers = require('from/some/file.json');
var context = yeomanTest.run(path.resolve('path/to/generator'));
context.settings.tmpdir = false; // don't run in tempdir
context.withGenerators([
'paths/to/subgenerators',
'more/of/them'
])
.withOptions({ // execute with options
'skip-install': true,
'skip-sdk': true
})
.withPrompts(answers) // answer prompts
.on('end', function () {
// do some stuff here
});

Using environment specific configuration files in Node.js

Unlike Rails, there doesn't seem to be an accepted way of loading environment specific config files in Node.js.
Currently I'm using the following:
config/development.js and config/production.js:
module.exports = {
'db': 'mongodb://127.0.0.1/example_dev',
'port': 3002
};
Followed by the following at the top of my app.js file:
var config = require('./config/' + process.env.NODE_ENV + '.js');
This pattern works pretty well, however it forces me to pass along this config file to any modules that need it. This gets kind of clunky, for instance:
var routes = require('./routes')(config);
.. and in routes/index.js:
module.exports = function(config) {
this.index = function...
this.show = function...
};
Etc, etc. The module pattern just seems to be pretty clunky when dealing with something that should be global, like configuration settings. I could require the configuration file at the top of every file that needs it as well, but that doesn't seem ideal either.
Does anyone have a best practice for including a configuration file and making it globally available?
You could just attach it to the global process object:
app.js:
var config = require('./config/' + process.env.NODE_ENV + '.js');
process.config = config;
anywhere else in your app:
console.log(process.config);
Yes, it's a bit dangerous in that it can get overwritten anywhere, but it's still a pretty simple approach.
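An alternative that avoids globals entirely is to lean on the require cache: a config/index.js that picks the environment file once, so every other file can just require the config directory. A minimal sketch, assuming the per-environment files from the question sit in the same folder:
(config/index.js):
// evaluated once, then served from the require cache everywhere
var env = process.env.NODE_ENV || 'development';
module.exports = require('./' + env + '.js');
(anywhere else):
var config = require('./config');
console.log(config.db);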
