Best practice on avoiding duplicated requires in Node.js

I have multiple JS files that all have the same requires at the beginning, like:
var config = require("config");
var expect = require("chai").expect;
var commonAssertions = require('../../../utils/common_assertions.js');
var commonSteps = require('../../../utils/common_steps.js');
I am thinking about putting all of them in one file and just requiring this single file.
I am wondering if there is any best practice or convention for this in Node.js.

Remember that require() simply returns whatever the module assigns to module.exports.
So if you were to extract this to a different file, that would be perfectly fine.
includes.js
exports.config = require("config");
exports.expect = require("chai").expect;
exports.commonAssertions = require('../../../utils/common_assertions.js');
exports.commonSteps = require('../../../utils/common_steps.js');
myfile.js
var includes = require('./includes');
includes.expect(true).to.be.true; // for example
It is not necessarily a good or bad practice. I would say that if you expect to need the exact same modules from many different files, then go for it.
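If you go this route, destructuring at the require site (a small embellishment, not part of the answer above) keeps call sites terse:
var { expect, commonAssertions } = require('./includes');
expect(commonAssertions).to.be.an('object'); // the same chai expect as before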

Defining modules in one file and requiring it in Node.js

I update my code to include new packages quite often, and I have more than 100 files.
I want to make something like this,
File : dependencies.js :
const snekfetch = require("snekfetch");
const fs = require("fs");
It's very annoying to modify every file to add just a single package.
I'm trying to require dependencies.js using this:
require("./dependencies.js")
But I see this in my console:
ReferenceError: snekfetch is not defined
Is there any way to make this work?
I think you are not exporting the modules in dependencies.js.
dependencies.js should look like this:
const snekfetch = require("snekfetch");
const fs = require("fs");
module.exports = {
  snekfetch: snekfetch,
  fs: fs
};
Then you should be able to require this file and use it as follows:
var dependencies = require('./dependencies.js');
// dependencies.fs.readFile();
Although there are much better ways to handle your imports than just creating a simple file of dependencies. Have a look at this link.
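For instance, on a modern Node version you can destructure the exports directly (a sketch, not from the original answer):
const { snekfetch, fs } = require('./dependencies.js');
// fs and snekfetch are now local bindings; no dependencies. prefix needed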

Require JSON as deep copy

I am writing tests right now for my Node application.
I have fixtures which I use to test my data, and I ran into the problem that when I alter any of them in a method, they are altered globally for all the other tests as well, which obviously has to do with referencing. I figured that if I wrote my fixtures into a JSON file and required that JSON in each test file, each file would get its own copy, but it turns out it doesn't.
My question would be: is there an easy way to handle fixtures in Node such that every test file gets its own instance of the fixtures, one that won't affect the other files?
The way I currently import my fixtures in every test file:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
require calls are cached, so once you require a module, subsequent calls will return the same object.
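A quick way to see this (a minimal sketch, using the fixture path from the question):
const a = require('../../../../../fixtures/keywords.json');
const b = require('../../../../../fixtures/keywords.json');
console.log(a === b); // true: both names point at the same cached object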
You can do the following:
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = JSON.parse(JSON.stringify(fixture1));
const someOtherFixtureCopy = JSON.parse(JSON.stringify(someOtherFixture));
or use a package:
deepcopy
clone
const deepcopy = require('deepcopy');
const {fixture1, someOtherFixture } = require('../../../../../fixtures/keywords.json');
const fixtureCopy = deepcopy(fixture1);
const someOtherFixtureCopy = deepcopy(someOtherFixture);
Or change your module to export a function that returns a new copy every time. This is the recommended approach, in my opinion.
fixture.js
const deepcopy = require('deepcopy');
const fixture = { /* ... the object you have ... */ };

module.exports = {
  get() {
    return deepcopy(fixture); // hand out a fresh copy on every call
  }
};
Then in a test file:
const fixture = require('./fixture');
const fixture1 = fixture.get();
This isn't specific to JSON. It's not uncommon that modules need to be re-evaluated in tests. require.cache can be modified in Node.js to affect how modules are cached, either directly or with helpers like decache.
Depending on the case,
decache('../../../../../fixtures/keywords.json')
goes before the require in a test, or into an afterEach hook to clean up.
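For example, in a mocha-style suite (a minimal sketch, assuming decache is installed):
const decache = require('decache');

afterEach(function () {
  // remove the cached module so the next require re-reads the file from disk
  decache('../../../../../fixtures/keywords.json');
});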

Multiple require() from the same library in the same module

I'm looking at the sources of the @slack/client npm package for Node.js and see that right at the top they have this:
var forEach = require('lodash').forEach;
var bind = require('lodash').bind;
var has = require('lodash').has;
var isArray = require('lodash').isArray;
var isEmpty = require('lodash').isEmpty;
var isObject = require('lodash').isObject;
What's the point in cherry-picking all this from the lodash module when you can make it more succinct by including the whole lib once and then using the methods you need?
// Include the whole lib
var _ = require('lodash');
// And later
if (_.isObject(...)) // etc
It's not like they are using each method many times. In fact, most are used just once or twice. Also, it is my understanding that even when you require only part of a module, the whole module is evaluated anyway, so there is no advantage memory- or performance-wise.
I find this package to be very well written so I'm curious to know why they chose to do this.
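For comparison, a single require combined with destructuring (my sketch, not how the package is written) yields the same short local names:
var { forEach, bind, has, isArray, isEmpty, isObject } = require('lodash');
// each helper is now a local binding, obtained from one require call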

Using Yeoman programmatically inside a Node.js project

I want to use a Yeoman generator inside a Node.js project.
I installed yeoman-generator and generator-git (the generator that I want to use) as local dependencies, and at this moment my code looks like this:
var env = require('yeoman-generator')();
var path = require('path');
var gitGenerator = require('generator-git');
var workingDirectory = path.join(process.cwd(), 'install_here/');
var generator = env.create(gitGenerator);
Obviously the last line doesn't work and doesn't generate the scaffold.
The question: how to?
Importantly, I want to stay at the local dependency level!
@simon-boudrias's solution does work, but after I changed the working directory with process.chdir(), this.templatePath() and this.destinationPath() return the same path.
I could have used this.sourcePath() to tweak the template path, but having to change this for each Yeoman generator is not very practical. After digging into yo-cli, I found that the following works without affecting the paths.
var env = require('yeoman-environment').createEnv();
env.lookup(function() {
  env.run('generator-name');
});
env.create() only instantiates a generator; it doesn't run it.
To run it, you could call generator.run(). But that's not ideal.
The best way, IMO, would be this:
var path = require('path');
var env = require('yeoman-generator')();
var gitGenerator = require('generator-git');
// Optional: look up every generator on your system. That'll allow composition if needed:
// env.lookup();
env.registerStub(gitGenerator, 'git:app');
env.run('git:app');
If necessary, make sure to process.chdir() in the right directory before launching your generator.
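For instance, to scaffold into the install_here directory from the question (a sketch reusing its workingDirectory variable):
var workingDirectory = path.join(process.cwd(), 'install_here/');
process.chdir(workingDirectory);
env.run('git:app');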
Relevant documentation on the Yeoman Environment class can be found here: http://yeoman.io/environment/Environment.html
Also see: http://yeoman.io/authoring/integrating-yeoman.html
The yeoman-test module is also very useful if you want to pass predefined answers to your prompts. This worked for me.
var path = require('path');
var yeomanTest = require('yeoman-test');
var answers = require('from/some/file.json');

var context = yeomanTest.run(path.resolve('path/to/generator'));
context.settings.tmpdir = false; // don't run in a temp dir
context.withGenerators([
    'paths/to/subgenerators',
    'more/of/them'
  ])
  .withOptions({ // execute with options
    'skip-install': true,
    'skip-sdk': true
  })
  .withPrompts(answers) // answer prompts
  .on('end', function () {
    // do some stuff here
  });

Using environment specific configuration files in Node.js

Unlike Rails, there doesn't seem to be an accepted way of loading environment-specific config files in Node.js.
Currently I'm using the following:
config/development.js and config/production.js:
module.exports = {
  'db': 'mongodb://127.0.0.1/example_dev',
  'port': 3002
};
Followed by the following at the top of my app.js file:
var config = require('./config/' + process.env.NODE_ENV + '.js');
This pattern works pretty well; however, it forces me to pass this config object along to any modules that need it. This gets kind of clunky, for instance:
var routes = require('./routes')(config);
.. and in routes/index.js:
module.exports = function(config) {
  this.index = function...
  this.show = function...
};
Etc, etc. The module pattern just seems to be pretty clunky when dealing with something that should be global, like configuration settings. I could require the configuration file at the top of every file that needs it as well, but that doesn't seem ideal either.
Does anyone have a best practice for including a configuration file and making it globally available?
You could just attach it to the global process object:
app.js:
var config = require('./config/' + process.env.NODE_ENV + '.js');
process.config = config;
Anywhere else in your app:
console.log(process.config);
Yes, it's a bit dangerous in that it can get overwritten anywhere, but it's still a pretty simple approach.
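A slightly safer variant (my sketch, not from the answer above) leans on require's caching instead of a global: put the environment lookup in one module and require it wherever it's needed.
config/index.js
module.exports = require('./' + (process.env.NODE_ENV || 'development') + '.js');
Anywhere else (the path is relative to the requiring file):
var config = require('./config');
console.log(config.port);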
