How to build multiple npm packages for sharing? - node.js

I'm trying to create a mono repository to host multiple small packages that I plan to deploy on npm. I use Rollup for the bundling part.
After many hours spent watching what others do, searching the internet and experimenting, I've reached a point where I'm stuck, and a little push in the right direction would be very much appreciated.
I've created a minimalist demo project that I've hosted on GitHub so it's easy to experiment with. You can find it here:
https://github.com/Stnaire/my-lib
In the repository you'll find 3 packages in the packages directory:
config: contains a SharedConfiguration service
container: a wrapper around Inversify to have a statically accessible container with helper methods
storage: a package containing a StorageService (normally for writing in the local storage, cookies, etc) and a VarHolder helper which is just a memory storage.
In each package there is a package.json (defining the npm package parameters) and a tsconfig.json for the build.
What I'm trying to do is to have an npm package for each of the packages, each allowing for the following types of usage:
With a bundler in a TypeScript environment
import { SharedConfiguration } from '@my-lib/config';
// or ideally:
import { SharedConfiguration } from '@my-lib/config/shared-configuration';
// this way only this dependency is included, not the whole `config` package
With a bundler in a JavaScript environment
var SharedConfiguration = require('@my-lib/config');
// Or like above, ideally:
var SharedConfiguration = require('@my-lib/config/shared-configuration');
By including the output JS file in the browser
<script src="my-lib-config.umd.js"></script>
<script>
MyLibConfig.SharedConfiguration.get(...);
</script>
What I tried
I've created two branches in the demo repository, corresponding to two strategies.
First strategy: create aliases (branch detailed-modules-resolution)
In the tsconfig.json, I do:
{
  "paths": {
    "@my-lib/config/*": ["packages/config/src/*"],
    "@my-lib/container/*": ["packages/container/src/*"],
    "@my-lib/storage/*": ["packages/storage/src/*"]
  }
}
This way I can import precisely what I need:
import { SharedConfiguration } from '@my-lib/config/shared-configuration';
And because the aliases also correspond to the folder structure in node_modules, it should work for the end user with a bundler as well.
But this way I get warnings when building as UMD:
No name was provided for external module '@my-lib/config/shared-configuration' in output.globals – guessing 'sharedConfiguration'
No name was provided for external module '@my-lib/container/container' in output.globals – guessing 'container'
Creating a global alias for each import is out of the question, hence the second strategy.
Second strategy: centralize all public exports (branch centralized-imports)
So the idea is simply to export everything the package wants to expose to other packages in its index.ts:
// packages/config/index.ts
export * from './shared-configuration';
Then in the build script, I do this:
// scripts/build/builds.js
/**
 * Define the mapping of externals.
 */
export const Globals = {
  'inversify': 'Inversify'
};
for (const build of Object.keys(Builds)) {
  Globals[`@my-lib/${Builds[build].package}`] = Builds[build].moduleName;
}
Which creates the following object:
{
  'inversify': 'Inversify',
  '@my-lib/config': 'MyLibConfig',
  '@my-lib/container': 'MyLibContainer',
  '@my-lib/storage': 'MyLibStorage',
}
This way UMD builds are working.
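As a sketch, a Globals map like this would typically be wired into the Rollup output as follows (rollup.config file names and structure here are assumptions based on the question, not the repository's actual config):

```typescript
// rollup.config.ts sketch: the Globals map feeds both `external` (so the
// cross-package imports are not bundled) and `output.globals` (so the UMD
// wrapper knows which global variable each external maps to in a browser).
const Globals: Record<string, string> = {
  'inversify': 'Inversify',
  '@my-lib/config': 'MyLibConfig',
  '@my-lib/container': 'MyLibContainer',
  '@my-lib/storage': 'MyLibStorage',
};

export default {
  input: 'packages/config/index.ts',
  external: (id: string) => id in Globals, // don't bundle mapped externals
  output: {
    file: 'dist/my-lib-config.umd.js',
    format: 'umd',
    name: 'MyLibConfig',                   // becomes window.MyLibConfig
    globals: Globals,                      // external id -> global variable
  },
};
```

This is exactly why the first strategy produced the warnings: with per-file imports like `@my-lib/config/shared-configuration`, every deep path would need its own entry in this map.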
But there are two main drawbacks:
You have very little control over what you import. You import a whole package or nothing. And if the package depends on other packages, you can quickly import thousands of lines for a little service.
This creates circular dependencies, as in my example project.
SharedConfiguration uses VarHolder, which is in @my-lib/storage. But this package also contains a StorageService which uses SharedConfiguration, creating a circular dependency
because all the imports go through the index.ts: @my-lib/config => @my-lib/storage => @my-lib/config.
I thought about using one strategy or the other depending on whether I build as UMD or not, but it feels wrong.
There must be a simpler way to handle all of this.
Thank you very much for reading all this.

Related

Create a TypeScript library with optional dependencies resolved by application

I've written a library published to a private npm repo which is used by my applications.
This library contains utilities and has dependencies on other libraries; as an example, let's choose @aws-sdk/client-lambda.
Some of my applications use only some of the utilities and don't need the dependencies on the external libraries, while some applications use all of the utilities.
To avoid having all applications get a lot of indirect dependencies they don't need, I tried declaring the dependencies as peerDependencies and having the applications resolve the ones they need. It works well for publishing the package, and for using it from applications that declare all of the peerDependencies as their own local dependencies, but applications failing to declare one of the dependencies get build errors when the included .d.ts files of the library are imported in application code:
error TS2307: Cannot find module '@aws-sdk/client-kms' or its corresponding type declarations.
Is it possible to resolve this situation so that my library can contain many different utils but the applications may "cherry-pick" the dependencies they need to fulfill the requirements of those utilities in runtime?
Do I have to use dynamic imports to do this or is there another way?
I tried using @ts-ignore in the library code, and it was propagated to the .d.ts file imported by the applications, but it did not help.
Setup:
my-library
package.json:
peerDependencies: {
  "@aws-sdk/client-lambda": "^3.27.0"
}
foo.ts:
import {Lambda} from '@aws-sdk/client-lambda';
export function foo(lambda: Lambda): void {
  ...
}
bar.ts:
export function bar(): void {
  ...
}
index.ts:
export * from './foo';
export * from './bar';
my-application1 - works fine
package.json:
dependencies: {
  "my-library": "1.0.0",
  "@aws-sdk/client-lambda": "^3.27.0"
}
test.ts:
import {foo} from 'my-library';
foo();
my-application2 - does not compile
package.json:
dependencies: {
  "my-library": ...
}
test.ts:
import {bar} from 'my-library';
bar();
I found two ways of dealing with this:
1. Only use dynamic imports for the optional dependencies
If you make sure that the types exported by the root file of the package only include types and interfaces, and not classes etc., the transpiled JS will not contain any require statement for the optional library. Then use dynamic imports to load the optional library from within a function, so that it is only required when the client explicitly uses those parts of the library.
In the case of @aws-sdk/client-lambda, which was one of my optional dependencies, I wanted to expose a function that could take an instance of a Lambda object or create one itself:
import {Lambda} from '@aws-sdk/client-lambda';
export function foo(options: {lambda?: Lambda}) {
  let {lambda} = options;
  if (!lambda) {
    lambda = new Lambda({ ... });
  }
  ...
}
Since Lambda is a class, it will be part of the transpiled JS as a require statement, so this does not work as an optional dependency. So I had to 1) make that import dynamic and 2) define an interface to be used in place of Lambda in my function's arguments, to get rid of the require statement on the package's root path. Unfortunately in this particular case, the AWS SDK does not offer any type or interface which the class implements, so I had to come up with a minimal type such as
export interface AwsClient {
  config: {
    apiVersion: string;
  }
}
... but of course, lacking a type to represent the Lambda class, you might even resort to any.
Then comes the dynamic import part:
export async function foo(options: {lambda?: AwsClient}) {
  let {lambda} = options;
  if (!lambda) {
    const {Lambda} = await import('@aws-sdk/client-lambda');
    lambda = new Lambda({ ... });
  }
  ...
}
With this code, there is no longer any require('@aws-sdk/client-lambda') on the root path of the package, only within the foo function. Only clients calling the foo function will have to have the dependency in their node_modules.
As you can see, a side-effect of this is that every function using the optional library must be async since dynamic imports return promises. In my case this worked out, but it may complicate things. In one case I had a non-async function (such as a class constructor) needing an optional library, so I had no choice but to cache the promised import and resolve it later when used from an async member function, or do a lazy import when needed. This has the potential of cluttering code badly ...
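The promise-caching idea can be sketched like this (makeLazyLoader is a hypothetical helper, and 'node:crypto' stands in for the optional dependency — a sketch, not the answer's actual code):

```typescript
// A non-async code path (e.g. a constructor) can call the loader to kick off
// loading; an async method can await the same promise later.
function makeLazyLoader<T>(specifier: string): () => Promise<T> {
  let cached: Promise<T> | undefined;
  // import() runs only on the first call; later calls reuse the same promise
  return () => (cached ??= import(specifier) as Promise<T>);
}

const loadCrypto = makeLazyLoader<typeof import('node:crypto')>('node:crypto');
```

Because the loader memoizes the promise rather than the module, concurrent callers all await the same in-flight import instead of triggering it twice.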
So, to summarize:
Make sure any code that imports from the optional library is put inside functions that a client wanting that functionality must call
It's OK to have type-only imports from the optional library in the root of your package, as they're stripped out when transpiled
If needed, define substitute types to act as placeholders for any class arguments (as classes are both types and code!)
Transpile and inspect the resulting JS: if there is any require statement for the optional library at the root, you've missed something.
Note that if using webpack etc., dynamic imports can be tricky as well. If the import paths are constants, it usually works, but building the path dynamically (await import('@aws-sdk/' + clientName)) will usually not, unless you give webpack hints. This had me puzzled for a while, since I wrote a wrapper in front of my optional AWS dependencies, which ended up not working at all for this reason.
2. Put the files using the optional dependencies in .ts files not exported by the root file of the package (i.e., index.ts).
This means that clients wanting to use the optional functionality must
import those files by sub-path, such as:
import {OptionalStuff} from 'my-library/dist/optional';
... which is obviously less than ideal.
In my case, the TypeScript IDE in VS Code fails to import the optional type, so I'm using a relative import path:
// fix: Cannot find module 'windows-process-tree' or its corresponding type declarations
//import type * as WindowsProcessTree from 'windows-process-tree';
import type * as WindowsProcessTree from '../../../../../node_modules/@types/windows-process-tree';
// global variable
let windowsProcessTree: typeof WindowsProcessTree;
if (true) { // some condition
  windowsProcessTree = await import('windows-process-tree');
  windowsProcessTree.getProcessTree(rootProcessId, tree => {
    // ...
  });
}
package.json
{
  "devDependencies": {
    "@types/windows-process-tree": "^0.2.0"
  },
  "optionalDependencies": {
    "windows-process-tree": "^0.3.4"
  }
}
based on vscode/src/vs/platform/terminal/node/windowsShellHelper.ts

How to avoid singleton in node

I have an npm dependency that I import into a file of my Node server. I wish it were not a singleton, because it should not be shared between requests.
The file that imports the dependency:
const dependency = require('dependency');
export class dependencyFactory {
  public static getDependency() {
    return dependency;
  }
}
index.js of the dependency in node_modules:
const path = require('path');
const createApi = require('./createApi');
module.exports = createApi(path.join(__dirname, './lib/providers'));
How can I do that? Thank you.
Modules result in singletons in Node. If this is undesirable, a workaround always depends on the specific package.
The preferable way is to export a factory function or constructor class from a package, so that it can create new instances when needed.
If this is not possible, workarounds may include:
use package internal modules to create new instances
invalidate module cache to re-import a package
get class constructor from a singleton and create a new instance
All of them can be considered hacks and should be avoided when possible. E.g. relying on internal package structure may introduce breaking changes with a new package version, even if the package changelog doesn't mention breaking changes. And a pitfall of cache invalidation is that a package may consist of numerous modules that should or should not be re-imported.
The first workaround seems to be applicable here.
const createApi = require('dependency/createApi');
const instance = createApi(require.resolve('dependency/lib/providers'));
A cleaner solution is to fork a package and add a capability to create multiple instances.
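The factory pattern recommended above can be sketched as follows (Api, createApi and the providers path are stand-ins modelled on the question, not the real package's API):

```typescript
// Instead of `module.exports = createApi(...)` (a shared singleton), the
// package exports the factory itself, so each caller builds its own instance.
class Api {
  constructor(public readonly providersPath: string) {}
}

export function createApi(providersPath: string): Api {
  return new Api(providersPath);
}

// each request handler gets an isolated instance, sharing no state:
const perRequestA = createApi('./lib/providers');
const perRequestB = createApi('./lib/providers');
```

This keeps per-request state out of the module cache entirely, which is why it beats the cache-invalidation and internal-module hacks.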

Importing a file local to an npm module

I have a node module named redux-loop that I'm using, and I would like to modify one of its functions slightly.
It's not a very complicated file, so I've made a copy of it in my react-native app and made my changes. The original code requires a few exports from files inside the module, e.g.:
var { loop, isLoop, getEffect, getModel } = require('./loop');
var { batch, none } = require('./effects');
The problem is that my copy of this file cannot seem to get direct access to those files, so I can't import those symbols.
I've tried various combinations of things, such as:
var { loop, isLoop, getEffect, getModel } = require('redux-loop/loop');
var { batch, none } = require('redux-loop/effects');
…which conceptually would mean requiring the loop.js file inside the redux-loop module, but apparently module loading doesn't work that way.
What's the best way for me to import these symbols into this file?
If I understand your question and you're open to reapplying your changes to the source of the package, you could use npm edit <pkg> and modify the package directly. When your changes are done the package will be rebuilt with your modifications.

Typescript : Add a method on a class from another module / Populate a namespace from different modules

The story
I am building a modular library for math operations. I also want to divide the library into multiple modules : core, relational, vectors, and so on. The modules can be used on their own (but all depend on the core module)
I know it's not possible to use partial classes How do I split a TypeScript class into multiple files? / https://github.com/Microsoft/TypeScript/issues/563
The problem :
The core module defines the Set class, which is a mathematical set. It defines operations such as Set#add, Set#remove.
Though, the optional relational module adds a Set#product operator on the Set class.
Other modules could also add other operations on the Set class. I want to keep the possibility of adding functionality as I see fit.
The question
With TypeScript, how can I add a method on a class that resides in another module?
How can I arrange the typings so that the user of my library will see Set#product in his code completion only if he installed the relational module? Otherwise he only sees the #add and #remove operations.
I am developing this library for node.js, but also use browserify to bundle it for browser use.
// core/set.ts
export class Set {
  add(element) {}
  remove(element) {}
}
// relational/set.ts
import {Set} from '../core/set';
Set.prototype.product = function(){} // ?
// app/index.js
import {core} from 'mylib';
var set = new Set();
set.add();
set.remove();
// set.product() is not available
// app/index2.js
import {core} from 'mylib';
import {relational} from 'mylib';
var set = new Set();
set.add();
set.remove();
set.product() //is available
Bonus question
All these modules are made to be available through a common namespace, let's call it MyLibrary. The core module adds MyLibrary.Core, the relational module adds some objects into MyLibrary.Core, and also adds MyLibrary.Relational.
Let's say I publish another module, only used to act as a facade for the other modules. Let's call this module my-library.
If a user installs the my-library, core and relational modules with npm.
npm install my-library && npm install core && npm install relational
In the client application I would like the user of the library to only have to write
var lib = require('my-library');
Then, my-library would automatically check all installed MyLibrary modules, require them and populate the MyLibrary namespace and return it.
How can I tell the my-library module, the first time it is accessed, in both Node and browser environments, to
Check for any available MyLibrary module (browser and node environments)
Run a method for each module (to install them in the namespace)
Return that nice fruity namespace
If you're just writing declaration files, you can use interfaces instead and do something like what moment-timezone does.
moment.d.ts
declare module moment {
  interface Moment {
    // ...
  }
  interface MomentStatic {
    // ...
  }
}
declare module 'moment' {
  var _tmp: moment.MomentStatic;
  export = _tmp;
}
moment-timezone.d.ts
Just redeclare the same interfaces with extra functions.
declare module moment {
  interface Moment {
    tz(): void;
  }
  interface MomentStatic {
    tz(): void;
  }
}
declare module 'moment-timezone' {
  var _tmp: moment.MomentStatic;
  export = _tmp;
}
Both packages are identical now, and moment gets the new methods automatically.
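Beyond declaration files, the same merging idea works at runtime in your own code: declare an interface with the class's name to merge the new method into its type, then attach the implementation to the prototype. A self-contained sketch (MathSet is a hypothetical stand-in for the question's Set, renamed to avoid clashing with the built-in Set; across separate packages you would wrap the interface in a `declare module './core/set'` block instead):

```typescript
// "core module": the base class
class MathSet {
  private items: number[] = [];
  add(x: number): this {
    if (!this.items.includes(x)) this.items.push(x);
    return this;
  }
  values(): number[] { return [...this.items]; }
}

// "relational module": the interface merges with the class declaration above,
// so Set#product only appears in completion when this code is loaded
interface MathSet {
  product(other: MathSet): MathSet;
}
MathSet.prototype.product = function (other: MathSet): MathSet {
  const out = new MathSet();
  // sketch: pairwise products of the two sets' elements
  for (const a of this.values()) for (const b of other.values()) out.add(a * b);
  return out;
};
```

Declaration merging gives you the typing side, and the prototype assignment gives you the runtime side — together they approximate the partial classes TypeScript doesn't have.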

Why does require() give me a new copy of a module when loaded from a different location?

I'm pretty new to Node.js development and I'm trying to create a Hubot adapter.
The hubot module exports a class named TextMessage that I need to use. In my adapter, I create an instance of it and pass it to the running robot like so:
{Adapter, TextMessage} = require 'hubot'

class MyAdapter extends Adapter
  onNewMessage: (text) =>
    @receive new TextMessage text
However, in Hubot's own code it is checking that my message is an instanceof TextMessage. When I run a bot and use my adapter, this check always fails.
The project structure is laid out like this:
my-bot
|- node_modules
|- my-adapter
| |- node_modules
| | |- hubot
|- hubot
As a result, require('hubot') in my-adapter is giving me a different copy of the hubot module than the one given to my-bot.
I'm pretty sure I'm not understanding some fundamental concept of Node modules here. What am I doing wrong?
The reason it is designed this way is so that it's possible for a module to always get a "fresh" version (one that has not been modified by any libraries such as yours). Generally, if you require() one of your dependencies, then you should be able to rely on default behaviour for that module. (The alternative is unpredictable and possibly insecure.)
If the purpose of your module is to adapt another module, then you should either:
Return a modified module
For example, your module might do:
module.exports = require('hubot');
// ... your custom modifications
Or:
var hubot = module.exports.hubot = require('hubot');
This means that the app itself wouldn't depend on hubot, but only your module.
Not list hubot as a dependency for your adapter
Node.js require() calls cascade up the path - so if you simply don't install hubot as a dependency, then you can still require() it, and it will use the version from the app.
This means that it's possible to install your module without installing hubot, thus causing problems - but on the other hand, it also allows multiple modules to modify the same base module.
Personally, I'd opt for the second option.
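In current npm terms, the second option is usually made explicit with a peerDependency: the adapter declares that it works against the app's single copy of hubot instead of bundling its own, which also keeps instanceof checks working. A sketch of the adapter's package.json (the version range is illustrative):

```json
{
  "name": "my-adapter",
  "peerDependencies": {
    "hubot": "^2.0.0"
  }
}
```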
