Node.js dynamic relative require path

I need to be able to use require() on a dynamic relative path - meaning that the relative path needs to change depending on the current environment.
What is the best practice for this type of situation?
I thought of something like this:
var module = require(process.env.MY_MODULES_PATH + '/my-module');
However, environment variables are not very convenient.
Are there other possibilities?
Maybe use package.json post-install script to set the environment variable for me?
Maybe there's a built in solution in node I don't know about?
EDIT
I just realized that this is a special case of require() "mocking". Is there a best practice for how to mock require(), for unit tests for example?

MockedRequire.js
var path = require('path');
function MockedRequire(module) {
  return require(path.join('/path/to/modules', module));
}
module.exports = MockedRequire;
Use:
var mymodule = require('./MockedRequire.js')('mymodule');
To be honest I haven't actually tested this, but it should work without issues.
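A slightly more flexible variant makes the base path overridable via the MY_MODULES_PATH variable from the question, with a local fallback; a minimal sketch, where the ./modules fallback directory is an assumption for illustration:
var path = require('path');

// Fall back to a local ./modules directory when the env var is unset
// (the fallback location is assumed for illustration).
var basePath = process.env.MY_MODULES_PATH || path.join(__dirname, 'modules');

function MockedRequire(module) {
  return require(path.join(basePath, module));
}

module.exports = MockedRequire;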

Webpack needs to know which files to bundle at compile time, but an expression is only given its value at runtime, so you need require.context:
/* If the directory structure is:
src
 |-- index.js   (where this code lives)
 |-- assets
      |-- img
*/
let assetsPath = require.context('./assets/img', false, /\.(png|jpe?g|svg)$/);
// See where is the image after bundling.
// console.log(assetsPath('./MyImage.png'));
// In fact you could put all the images you want in './assets/img' and access each one by its key, e.g. assetsPath('./otherImg.jpg').
var newelement = {
  "id": doc.id,
  "background": assetsPath('./MyImage.png')
};
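As a side note, the context returned by require.context also exposes a .keys() method listing every matched file, so you can build a lookup of all bundled images at once. A small sketch, assuming the same './assets/img' layout:
let images = {};
assetsPath.keys().forEach((key) => {
  // key looks like './MyImage.png'; calling the context returns the bundled URL
  images[key] = assetsPath(key);
});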

I would suggest using a configuration loader. It will select your path based on your NODE_ENV variable, but it's much cleaner than what you suggested because you keep all of the environment-specific config within a single external file.
Examples:
https://github.com/uber/zero-config
https://github.com/vngrs/konfig
Roll your own (a sketch follows below)
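For the roll-your-own option, a minimal sketch (the file names and the modulesPath field are assumptions for illustration):
// config/index.js
const env = process.env.NODE_ENV || 'development';
// Picks up config/development.js, config/production.js, etc.
module.exports = require('./' + env);

// config/development.js (one file per environment)
module.exports = {
  modulesPath: '/path/to/dev/modules'
};

// elsewhere in the app
const config = require('./config');
const myModule = require(config.modulesPath + '/my-module');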

Related

Require behaves differently in Angular project

If you write something like const theoretically = require('jasmine-theories');, require returns the contents of the file.
But if you set declare var require: any;, then subsequent require calls step inside the webpack bootstrap function __webpack_require__(moduleId) and return the real file path with a hash, not the content (for example 'file.65465436547.js').
I've found out that file-loader has this behavior: https://www.npmjs.com/package/file-loader.
My assumption is that, depending on declare var require: any;, require is taken either from Node.js or from file-loader.
Is that correct? And is there a more obvious rule for how and when to use each of them?
And how can I configure file-loader to behave differently in the case of an Angular application? Angular CLI doesn't expose a webpack.config, so loaders just get installed without any configuration being possible.
Overall the question can be shortened to:
Why does require return content in one case and a file name in the other?
Don't use require; use
import { theoretically } from 'jasmine-theories';
This is tree-shakable by webpack.
It looks like, before declare var require: any;, the default implementation from Node.js is used. After the explicit declaration, file-loader's method becomes visible, and since it has a default export, require usage switches to file-loader's implementation.

How do I use require inside a module with the parent's context?

app.js
require('./modules/mod');
modules/mod/mod.js
module.exports = () => {
  require('./modules/secondmodule');
}
Essentially I want the above code to be able to require modules using the same context it was called from, e.g. another module in the same folder, without having to use relative paths.
I thought module.require() did this, but it seems to give me the same error that require() gave after I moved my code into the separate module (mod.js).
edit:
I have since discovered I can use module.parent.require and it seems to be working. Please let me know if this is not advised.
require uses paths that are relative to the current module. It's possible to do this by providing require from the parent module:
module.exports = parentRequire => {
  parentRequire('./modules/secondmodule');
}
Which is used like:
require('./modules/mod')(require);
It's more correct to use relative paths instead, because a child module shouldn't be aware of the context in which it's evaluated:
require('../secondmodule');
In case mod has to be decoupled from secondmodule, dependencies can be provided to it with some common pattern, e.g. dependency injection or a service locator, as sketched below.
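For example, a minimal dependency-injection sketch (the doWork method is hypothetical):
// modules/mod/mod.js -- the collaborator is passed in rather than required by path
module.exports = (secondmodule) => {
  return {
    run() {
      return secondmodule.doWork(); // doWork is a hypothetical method
    }
  };
};

// app.js -- the caller owns the wiring
const secondmodule = require('./modules/secondmodule');
const mod = require('./modules/mod')(secondmodule);
mod.run();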
Secondary optional answer:
module.exports = () => {
  module.parent.require('./modules/secondmodule');
}
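Note that module.parent is deprecated in modern Node.js. An alternative sketch uses createRequire (available since Node 12.2), which builds a require function that resolves relative to whatever file path the caller passes in:
const { createRequire } = require('module');

module.exports = (parentFilename) => {
  // Resolves modules exactly as a require() inside parentFilename would
  const parentRequire = createRequire(parentFilename);
  return parentRequire('./modules/secondmodule');
};

// called from app.js as:
// require('./modules/mod')(__filename);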

How to properly require files into another file

I have a bunch of JS script files that use require to load the same series of libraries, etc.:
let path = require('path');
let _ = require('underscore');
I want to put all these requirements into a separate library file that I can then reuse amongst the files that need them. I thought I could do something like this:
var common = function() {
  this.requireFiles = function() {
    let path = require('path');
    let _ = require('underscore');
    ...
  }
};
exports.common = common;
However, when I want to include this library in the files that use all these same modules, it does not work. I am trying something like this:
let CommonTools = require('../server_libs/commonFiles').common;
let commonFiles = new CommonTools();
commonFiles.requireFiles();
This gives me an error that _ is not a function when I try to use the underscore methods. Any hints as to where I should look for better understanding on this topic?
I personally do not recommend making a common module. The node.js module mentality is to just require() what a module needs. Yes, it seems like a little extra/redundant typing in each module, but it makes each module self-describing and builds no unnecessary dependencies between modules, leading to the simplest module sharing and reuse options. Modules are cached by the require() subsystem, so it doesn't really cost you anything at runtime to just require() each module where you need it. This is pretty much the node.js way.
That said, if you really want to make a common module, you can do it like this:
common.js
module.exports = {
  _: require('underscore'),
  path: require('path')
}
otherModule.js
const {_, path} = require('./common.js');
// can now call underscore methods _.each()
// can now call path methods path.join()
This uses destructuring assignment to get properties from the common.js exports and assign them to module-scoped variables, for multiple properties in one statement. It still requires you to list each property you want defined in this module (which helps self-describe what you're doing).
This also assumes you're using require() and module.exports. If you're using the newer import and export keywords, you can modify the syntax accordingly, but still use the same concept.
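For reference, the same concept with ES modules might look like this (a sketch, assuming "type": "module" in package.json or .mjs extensions):
// common.mjs
import _ from 'underscore';
import path from 'node:path';
export { _, path };

// otherModule.mjs
import { _, path } from './common.mjs';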

Check if the required module is a built-in module

I need to check if the module that is going to be loaded is a built-in or an external module. For example, suppose that you have a module named fs inside the node_modules directory. If you do require("fs"), the built-in module will be loaded instead of the module inside node_modules, so I'm sure that this question has a solution.
Example:
var loadModule = function (moduleName) {
  if (isCoreModule(moduleName)) {
    ...
  } else {
    ...
  }
};
loadModule("fs");
process.binding('natives') returns an object that provides access to all built-in modules, and the keys of this object are the module names. So you could simply do something like:
var nativeModules = Object.keys(process.binding('natives'));

function loadModule(name) {
  // ~x is 0 (falsy) only when indexOf returns -1, so this tests membership
  if (~nativeModules.indexOf(name)) {
    // `name` is a native module name
  } else {
    // ...
  }
}

loadModule('fs');
You can use require.resolve.paths(str):
1. If str is a core module, the call returns null.
2. If str is not a core module, you get an array of path strings.
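A quick sketch of that behavior, reusing the isCoreModule name from the question:
console.log(require.resolve.paths('fs')); // null, because fs is a core module
console.log(require.resolve.paths('underscore')); // an array of candidate node_modules paths

function isCoreModule(name) {
  return require.resolve.paths(name) === null;
}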
My first attempt would be: require.resolve(moduleName).indexOf('/') <= 0. If that is true, it's a core module. It might not be portable to Windows as implemented, but you should be able to use this idea to get going in the right direction.
Aside: beware that require.resolve does synchronous filesystem IO. Be careful about using it within a network server.
Aside: Careful with the term "native" which generally means a native-code compiled add-on, or a C/C++ implementation. Both "core" and community modules can be either pure JS or native.
I think "core" is the most accurate term for built-in modules.
Aside: best not to shadow global variable names, so moduleName instead of just module which can be confusing with the global of the same name.
module.isBuiltin(moduleName) (added in: v18.6.0, v16.17.0):
import { isBuiltin } from 'node:module';
isBuiltin('node:fs'); // true
isBuiltin('fs'); // true
isBuiltin('fs/promises'); // true
isBuiltin('wss'); // false

How to pass a shared variable to downstream modules?

I have a top-level myapp variable in Node that contains some key application state: loggers, db handlers and some other data. The modules downstream in the directory hierarchy need access to these data. How can I set up a key/value system in Node to do that?
A highly upvoted and accepted answer to "Express: How to pass app-instance to routes from a different file?" suggests using, in a lower-level module:
//in routes/index.js
var app = require("../app");
But this injects hard-coded knowledge of the directory structure and file names, which should be a bigger no-no IMHO. Is there some other method, like something native to JavaScript? Nor do I relish the idea of declaring variables without var.
What is the node way of making a value available to objects created in lower scopes? (I am very much new to node and all-things-node aren't yet obvious to me)
Thanks a lot.
Since using node global (docs here) seems to be the solution that OP used, thought I'd add it as an official answer to collect my valuable points.
I strongly suggest that you namespace your variables, so something like
global.myApp.logger = { /* logger here */ };
global.myApp.db = {
  url: 'mongodb://localhost:27017/test',
  connectOptions: {}
};
If you are in app.js and just want to allow access to it
global.myApp = this;
As always, use globals with care...
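Any downstream module can then read the shared state without knowing where app.js lives. A small sketch, assuming app.js populated global.myApp first (the logger's .info() method is an assumed shape):
// somewhere deep in the directory hierarchy
function getDbConfig() {
  global.myApp.logger.info('reading db config'); // assumes the logger exposes .info()
  return global.myApp.db.connectOptions;
}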
This is not really related to Node but rather to general software architecture decisions.
When you have client and server modules/packages/classes (call them whatever you like), one way is to define routines on the server module that take as arguments whatever state data your client keeps in the 'global' scope, complete their tasks, and report back to the client with the results.
This way, it is perfectly decoupled and you have strict control over what data goes where.
Hope this helps :)
One way to do this is with an anonymous function: instead of returning an object with module.exports, return a function that returns an appropriate value.
So, let's say we want to pass var1 down to our two modules, ./module1.js and ./module2.js. This is how the module code would look:
module.exports = function(var1) {
  return {
    doSomething: function() { return var1; }
  };
}
Then, we can call it like so:
var downstream = require('./module1')('This is var1');
Giving you exactly what you want.
I just created an empty module and installed it under node_modules as appglobals.js
// index.js
module.exports = {};
// package.json too is barebones
{ "name": "appGlobals" }
And then strut it around without fearing future refactoring:
var g = require("appglobals");
g.foo = "bar";
I wish it came built in as setter/getter, but the flexibility has to be admired.
(Now I only need to figure out how to package it for production)
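For the setter/getter wish, the stub module could grow a tiny facade; a sketch (the store object stays private to the module):
// index.js of appglobals
const store = {};

module.exports = {
  set(key, value) { store[key] = value; },
  get(key) { return store[key]; }
};

// usage anywhere:
// var g = require("appglobals");
// g.set("foo", "bar");
// g.get("foo"); // "bar"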
