How to dynamically import package.json dependencies based on environment variables? - node.js

How could I add a script to my package.json file that would allow me to dynamically use a local file instead of a package version based on an environment variable?
"dependencies": {
"dynamic-dependency": "$(process.env.NODE_ENV !== 'dev' ? '^1.0.7' : 'file:../local-path-to-package')"
}

You can't do this in package.json, which is a non-executable JSON file. The JSON variant used in package.json doesn't even support comments :). The purpose of package.json is to specify which dependencies are installed into node_modules, and that's it. Once installed, those dependencies can be used by Node at runtime, which locates them using the module resolution algorithm:
If the module identifier passed to require() is not a core module, and does not begin with '/', '../', or './', then Node.js starts at the parent directory of the current module, and adds /node_modules, and attempts to load the module from that location. Node.js will not append node_modules to a path already ending in node_modules.
So you can't use npm/package.json for this. But I see that you tagged your question with React, so if you are using Webpack, you can solve this issue in your Webpack config. This can be done with resolve.alias:
const path = require('path');

module.exports = {
  //...
  resolve: {
    alias: {
      'dynamic-dependency':
        process.env.NODE_ENV !== 'dev'
          ? 'dynamic-dependency'
          : path.resolve(__dirname, '../local-path-to-package'),
    },
  },
};
I have not used other JS bundlers, but I would think Parcel, Rollup, etc. support this kind of configuration as well.

Related

How to point webpack to a specific node_modules folder

I am trying to build a grpc web client and I need to pack the code to resolve the require statements.
I have compiled the protos to js and it works if I have them in the current folder where I have installed the node modules.
The problem is if I have the compiled proto in some other place and I require them from there, webpack looks for the node modules in that path.
From my client.js
working version:
const {StopRequest, StopReply} = require('./work_pb.js');
Problematic version:
const {StopRequest, StopReply} = require('../../../messages/proto/output/work_pb.js');
In this last case it looks for the node modules in ../../../messages/proto/output/.
The node modules are installed in the current path where my client.js is and from where I ran npx webpack client.js.
ERROR in /home/xxx/yyy/zzz/messages/proto/output/work_pb.js
Module not found: Error: Can't resolve 'google-protobuf' in '/home/xxx/yyy/zzz/messages/proto/output'
# /home/xxx/yyy/zzz/messages/proto/output/work_pb.js 11:11-37
# ./client.js
How do I tell webpack to look in the current path for the node modules and not in the path of the compiled proto?
You can specify resolve.modules to customize the directories where Webpack searches when resolving bare (non-relative) import paths:
// inside webpack.config.js
const path = require('path');

module.exports = {
  //...
  resolve: {
    modules: [path.resolve(__dirname, 'node_modules'), 'node_modules']
  }
};
This lets node_modules inside your project root (where the webpack config resides) take precedence over the default setting "node_modules", which resolves modules by walking up the directory tree, as in the Node module resolution algorithm.
More info: Webpack docs: Module Resolution

Node.js + Webpack + TypeScript: access to path of project with source files (not usage project)

Let me make it clear once again: I don't need process.cwd in this question; I need
access to the absolute path of the source project. E.g.:
Source code: C:\Users\user1\projects\lib1\src\library.ts (will become a Node module in the future)
Project that uses the library: C:\Users\user1\projects\someProject\src\someProject.ts
So, I need to get C:\Users\user1\projects\lib1\src inside library.ts.
I tried:
webpack.config.js
module.exports = {
  // ...
  target: 'node',
  externals: [nodeExternals()],
  plugins: [
    new Webpack.DefinePlugin({
      __PROJECT_ROUTE_ABSOLUTE_PATH__: __dirname
    })
  ]
};
project-types.d.ts
declare var __PROJECT_ROUTE_ABSOLUTE_PATH__: string;
If you try console.log(__PROJECT_ROUTE_ABSOLUTE_PATH__) in library.ts, the following invalid JavaScript
will be produced:
console.log(C:\Users\user1\projects\lib1);
The path is correct, but the quotation marks are missing. I don't know how to explain it.
But anyway, how can we get the right path?
There is also a strange phenomenon: if I invoke __dirname, just / is returned, so path.resolve(__dirname, 'fileName') gives C:\fileName
You can directly use the Node.js path module, which is built in.
The path module provides utilities for working with file and directory paths. It can be accessed using:
const path = require('path');
__filename is the file name of the current module. This is the resolved absolute path of the current module file (e.g. /home/user/some/dir/file.js).
__dirname is the directory name of the current module (e.g. /home/user/some/dir).
fs.readFile(path.resolve(__dirname, 'fileName'))
This will resolve to the path of the file.
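As for the missing quotation marks in the original attempt: DefinePlugin performs a direct textual substitution, so string values must carry their own quotes. Wrapping the value in JSON.stringify is the usual fix, as the Webpack docs recommend. A minimal sketch against the config above:

```javascript
// DefinePlugin pastes the given text verbatim into the bundle, so a bare
// path produces invalid JS. JSON.stringify yields a quoted, properly
// escaped string literal instead.
new Webpack.DefinePlugin({
  __PROJECT_ROUTE_ABSOLUTE_PATH__: JSON.stringify(__dirname)
})
```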

Express, Pug and Webpack

I have a Node js server app which uses Express and Pug. I would like to bundle it to single script which can be deployed by pm2. There seem to be several problems with this.
At runtime I get Cannot find module "." and during compilation a few messages like
WARNING in ./node_modules/express/lib/view.js 80:29-41
Critical dependency: the request of a dependency is an expression
appear, which come from dynamic imports like require(mod).__express. I assume Webpack can't statically resolve those and does not know which dependency to include.
How can this be solved ?
How do I make Pug compile and be part of the output js ?
This is because Webpack re-bundles dependencies from node_modules (which are already published in runnable form), and in the case of Pug, this doesn't work.
You need to use webpack-node-externals in the Webpack externals option in order to specifically ask Webpack not to re-bundle dependencies.
Install webpack-node-externals: npm i -D webpack-node-externals
Integrate it into your webpack config file:
Example
// ...
const nodeExternals = require('webpack-node-externals')

module.exports = {
  target: 'node',
  entry: {
    // ...
  },
  module: {
    // ...
  },
  externals: [nodeExternals()],
  output: {
    // ...
  },
}

Webpack fails with Node FFI and Typescript - dynamic require error

In a simple TypeScript program I require Node FFI with
import * as Electron from 'electron';
import * as ffi from 'ffi';
and then
mylib = ffi.Library('libmoi', {
  'worker': ['string', ['string']],
  'test':   ['string', []]
});
Linking that up via webpack yields
WARNING in ./~/bindings/bindings.js
Critical dependencies:
76:22-40 the request of a dependency is an expression
76:43-53 the request of a dependency is an expression
# ./~/bindings/bindings.js 76:22-40 76:43-53
The problem seems to be that FFI has a dynamic require and the fix seems to be to apply webpack.ContextReplacementPlugin in the webpack.config.js file.
This is a bit out of my reach, but an example for an Angular case is:
plugins: [
  new webpack.ContextReplacementPlugin(
    // The (\\|\/) piece accounts for path separators in *nix and Windows
    /angular(\\|\/)core(\\|\/)(esm(\\|\/)src|src)(\\|\/)linker/,
    root('./src') // location of your src
  )
]
Any idea how to do this for FFI?
Here is the answer: github issue comment on the Johnny-Five repo
Quoting from brodo's answer, this is what you do to stop webpack getting snarled up with "bindings" and similar:
... the webpack config looks like this:
module.exports = {
  plugins: [
    new webpack.ContextReplacementPlugin(/bindings$/, /^$/)
  ],
  externals: ["bindings"]
}
I also had a similar issue and somehow managed to resolve it. I will first explain my understanding.
The main job of Webpack is to bundle separate code files into one file; by default it bundles all the code referenced in its dependency tree.
There are generally two types of node_modules:
To be used on the browser side (angular, rxjs, etc.)
To be used on the Node.js side (express, ffi, etc.)
It is safe to bundle browser-side node_modules, but not Node-side node_modules, because they are not designed for that. So the solution is the following two steps:
Give the appropriate target (node, electron, etc.) in your webpack.config.js file, e.g. "target": 'electron-renderer'; by default it is the browser ('web').
Declare Node-side modules as external dependencies in your webpack.config.js file, e.g.
"externals": {
"bindings": "require('bindings')",
"ffi": "require('ffi')"
}
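Putting the two steps together, a minimal webpack.config.js might look like this (the target and module names are illustrative; adjust them to your setup):

```javascript
// Step 1: match the runtime target; step 2: keep Node-side native
// modules out of the bundle by mapping them to a runtime require().
module.exports = {
  target: 'electron-renderer', // or 'node', depending on where it runs
  externals: {
    bindings: "require('bindings')",
    ffi: "require('ffi')"
  }
};
```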

How does react avoid using require("../../Path/To/Module"), and just use require("Module")

As far as I've seen, npm modules can be require()d without a path:
require("module") // --> npm module
and local modules are require()d using a path:
require("./module") // --> local module, in this directory
require("../../path/to/module") // path to directory
In React.js, modules are required without a path. See here for an example. I'm wondering how they achieve this.
Apparently it uses the rewrite-modules Babel plugin with the module-map module (see gulpfile.js).
There's also this Babel plugin that you can use to achieve the same behavior.
If you're using Webpack, you can add path/to/modules to the resolve.modulesDirectories array (renamed to resolve.modules in Webpack 2+) and it will work similarly to requiring from node_modules instead of using relative paths.
resolve: {
  modulesDirectories: ['path/to/modules', 'node_modules'],
},
and then
var foo = require('foo');
// Instead of:
// var foo = require('/path/to/modules/foo');
// or
// var foo = require('../../foo');
