How to avoid a singleton in Node.js

I have an npm dependency that I import into a file of my Node server. I don't want it to be a singleton, because its state should not be shared between requests.
The file that imports the dependency:
const dependency = require('dependency');

export class dependencyFactory {
  public static getDependency() {
    return dependency;
  }
}
index.js of the dependency in node_modules:
const path = require('path');
const createApi = require('./createApi');
module.exports = createApi(path.join(__dirname, './lib/providers'));
How can I do that? Thank you.

Modules are effectively singletons in Node: the first require evaluates the module and caches its exports, and every later require returns the cached value. If this is undesirable, the workaround depends on the specific package.
The preferable design is for a package to export a factory function or a class, so that new instances can be created when needed.
If this is not possible, workarounds include:
use package internal modules to create new instances
invalidate module cache to re-import a package
get class constructor from a singleton and create a new instance
All of them are hacks and should be avoided when possible. For example, relying on a package's internal structure may break with a new version, even if the changelog doesn't announce any breaking changes. A pitfall of cache invalidation is that a package may consist of numerous modules, only some of which should be re-imported.
The first workaround seems to be applicable here.
const createApi = require('dependency/createApi');
const instance = createApi(require.resolve('dependency/lib/providers'));
A cleaner solution is to fork the package and add the capability to create multiple instances.

Related

Create a TypeScript library with optional dependencies resolved by application

I've written a library published to a private npm repo which is used by my applications.
This library contains utilities and has dependencies on other libraries; as an example, let's take @aws-sdk/client-lambda.
Some of my applications use only some of the utilities and don't need the dependencies to the external libraries, while some applications use all of the utilities.
To avoid having all applications pull in a lot of indirect dependencies they don't need, I tried declaring the dependencies as peerDependencies and having each application resolve the ones it needs. This works well for publishing the package, and for applications that declare all of the peerDependencies as their own local dependencies. But applications that fail to declare one of the dependencies get build errors when the library's bundled .d.ts files are imported in application code:
error TS2307: Cannot find module '@aws-sdk/client-kms' or its corresponding type declarations.
Is it possible to resolve this situation so that my library can contain many different utils but the applications may "cherry-pick" the dependencies they need to fulfill the requirements of those utilities in runtime?
Do I have to use dynamic imports to do this or is there another way?
I tried using @ts-ignore in the library code, and it was propagated to the d.ts file imported by the applications, but it did not help.
Setup:
my-library
package.json:
peerDependencies: {
  "@aws-sdk/client-lambda": "^3.27.0"
}
foo.ts:
import {Lambda} from '@aws-sdk/client-lambda';
export function foo(lambda: Lambda): void {
  ...
}
bar.ts:
export function bar(): void {
  ...
}
index.ts:
export * from './foo';
export * from './bar';
my-application1 - works fine
package.json:
dependencies: {
  "my-library": "1.0.0",
  "@aws-sdk/client-lambda": "^3.27.0"
}
test.ts:
import {foo} from 'my-library';
foo();
my-application2 - does not compile
package.json:
dependencies: {
"my-library": ...
}
test.ts:
import {bar} from 'my-library';
bar();
I found two ways of dealing with this:
1. Only use dynamic imports for the optional dependencies
If you make sure that the types exported by the root file of the package include only types and interfaces (not classes, etc.), the transpiled JS will not contain any require statement for the optional library. Then use dynamic imports to load the optional library from within a function, so that it is required only when the client explicitly uses that part of the library.
In the case of @aws-sdk/client-lambda, which was one of my optional dependencies, I wanted to expose a function that could take an instance of a Lambda object or create one itself:
import {Lambda} from '@aws-sdk/client-lambda';
export function foo({lambda}: {lambda?: Lambda}) {
  if (!lambda) {
    lambda = new Lambda({ ... });
  }
  ...
}
Since Lambda is a class, it will appear in the transpiled JS as a require statement, so this does not work as an optional dependency. I had to 1) make that import dynamic and 2) define an interface to use in place of Lambda in my function's arguments, to get rid of the require statement on the package's root path. Unfortunately, in this particular case the AWS SDK does not offer any type or interface that the class implements, so I had to come up with a minimal type such as
export interface AwsClient {
  config: {
    apiVersion: string;
  }
}
... but of course, lacking a type to represent the Lambda class, you might even resort to any.
Then comes the dynamic import part:
export async function foo({lambda}: {lambda?: AwsClient}) {
  if (!lambda) {
    const {Lambda} = await import('@aws-sdk/client-lambda');
    lambda = new Lambda({ ... });
  }
  ...
}
With this code, there is no longer any require('@aws-sdk/client-lambda') on the root path of the package, only within the foo function. Only clients calling the foo function need to have the dependency in their node_modules.
As you can see, a side effect of this is that every function using the optional library must be async, since dynamic imports return promises. In my case this worked out, but it may complicate things. In one case I had a non-async function (a class constructor) that needed an optional library, so I had no choice but to cache the promised import and resolve it later from an async member function, or do a lazy import when needed. This has the potential to clutter the code badly.
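The cached-promise trick mentioned above can be sketched like this (not the answer's exact code; 'path' stands in for the optional '@aws-sdk/client-lambda' dependency):

```javascript
// Cache the promised dynamic import so a non-async constructor can start
// the load, and any async method can await the same promise later.
class LambdaUser {
  constructor() {
    // Kick off the import once without awaiting it (constructors can't be async).
    this.modulePromise = import('path'); // stand-in for the optional package
  }
  async invoke(...segments) {
    const mod = await this.modulePromise; // resolved once, reused on every call
    return mod.join(...segments);
  }
}

const user = new LambdaUser();
```

The import starts as soon as the object is constructed, but the dependency is only truly needed (and awaited) when `invoke` is called.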
So, to summarize:
Make sure any code that imports from the optional library lives inside functions that are called only by clients wanting that functionality
It's OK to import types from the optional library in the root of your package, as type-only imports are stripped out when transpiled
If needed, define substitute types to act as placeholders for any class arguments (classes are both types and code!)
Transpile and inspect the resulting JS: if there is any require statement for the optional library in the root, you've missed something
Note that when using webpack etc., dynamic imports can be tricky as well. If the import path is a constant, it usually works, but building the path dynamically (await import('@aws-sdk/' + clientName)) usually will not, unless you give webpack hints. This had me puzzled for a while, since I wrote a wrapper in front of my optional AWS dependencies that ended up not working at all for this reason.
2. Put the files using the optional dependencies in .ts files not exported by the root file of the package (i.e., index.ts).
This means that clients wanting to use the optional functionality must
import those files by sub-path, such as:
import {OptionalStuff} from 'my-library/dist/optional';
... which is obviously less than ideal.
In my case, the TypeScript IDE in VS Code fails to import the optional type, so I'm using a relative import path:
// fix: Cannot find module 'windows-process-tree' or its corresponding type declarations
//import type * as WindowsProcessTree from 'windows-process-tree';
import type * as WindowsProcessTree from '../../../../../node_modules/@types/windows-process-tree';
// global variable
let windowsProcessTree: typeof WindowsProcessTree;
if (true) { // some condition
  windowsProcessTree = await import('windows-process-tree');
  windowsProcessTree.getProcessTree(rootProcessId, tree => {
    // ...
  });
}
package.json
{
  "devDependencies": {
    "@types/windows-process-tree": "^0.2.0"
  },
  "optionalDependencies": {
    "windows-process-tree": "^0.3.4"
  }
}
based on vscode/src/vs/platform/terminal/node/windowsShellHelper.ts

How to build multiple npm packages for sharing?

I'm trying to create a mono-repository to host multiple small packages that I plan to publish on npm. I use Rollup for the bundling part.
After many hours of watching what others do, searching the internet and experimenting, I've reached a point where I'm stuck, and a little push in the right direction would be very much appreciated.
I've created a minimalist demo project that I've hosted on GitHub so it's easy to experiment with. You can find it here:
https://github.com/Stnaire/my-lib
In the repository you'll find 3 packages in the packages directory:
config: contains a SharedConfiguration service
container: a wrapper around Inversify to have a statically accessible container with helper methods
storage: a package containing a StorageService (normally for writing to local storage, cookies, etc.) and a VarHolder helper, which is just an in-memory storage.
In each package there is a package.json (defining the npm package parameters) and a tsconfig.json for the build.
What I'm trying to do is to have an npm package for each of the packages, each allowing the following types of usage:
With a bundler in a TypeScript environment
import { SharedConfiguration } from '@my-lib/config';
// or ideally:
import { SharedConfiguration } from '@my-lib/config/shared-configuration';
// this way only this dependency is included, not the whole `config` package
With a bundler in a JavaScript environment
var SharedConfiguration = require('@my-lib/config');
// Or like above, ideally:
var SharedConfiguration = require('@my-lib/config/shared-configuration');
By including the output JS file in the browser
<script src="my-lib-config.umd.js"></script>
<script>
MyLibConfig.SharedConfiguration.get(...);
</script>
What I tried
I've created two branches in the demo repository, corresponding to two strategies.
First strategy: create aliases (branch detailed-modules-resolution)
In the tsconfig.json, I do:
{
  "paths": {
    "@my-lib/config/*": ["packages/config/src/*"],
    "@my-lib/container/*": ["packages/container/src/*"],
    "@my-lib/storage/*": ["packages/storage/src/*"]
  }
}
This way I can import precisely what I need:
import { SharedConfiguration } from '@my-lib/config/shared-configuration';
And because the aliases also correspond to the folder structure in node_modules, it should work for the end user with a bundler as well.
But this way I get warnings when building as UMD:
No name was provided for external module '@my-lib/config/shared-configuration' in output.globals – guessing 'sharedConfiguration'
No name was provided for external module '@my-lib/container/container' in output.globals – guessing 'container'
Creating a global alias for each import is out of the question, hence the second strategy.
Second strategy: centralize all public exports (branch centralized-imports)
So the idea is simply to export everything the package wants to expose to other packages from its index.ts:
// packages/config/index.ts
export * from './shared-configuration';
Then in the build script, I do this:
// scripts/build/builds.js
/**
* Define the mapping of externals.
*/
export const Globals = {
'inversify': 'Inversify'
};
for (const build of Object.keys(Builds)) {
  Globals[`@my-lib/${Builds[build].package}`] = Builds[build].moduleName;
}
Which creates the following object:
{
  'inversify': 'Inversify',
  '@my-lib/config': 'MyLibConfig',
  '@my-lib/container': 'MyLibContainer',
  '@my-lib/storage': 'MyLibStorage',
}
This way, UMD builds work.
But there are two main drawbacks:
You have very little control over what you import. You import a whole package or nothing, and if the package depends on other packages, you can quickly pull in thousands of lines for one small service.
It creates circular dependencies, as in my example project:
SharedConfiguration uses VarHolder, which lives in @my-lib/storage. But that package also contains a StorageService which uses SharedConfiguration, creating a circular dependency because all the imports go through the index.ts: @my-lib/config => @my-lib/storage => @my-lib/config.
I thought about using one strategy or the other depending on whether I build as UMD or not, but it feels wrong.
There must be a simpler way to handle all of this.
Thank you very much for reading all this.
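One option worth evaluating (not in either branch above) is the package.json "exports" field, which modern Node and bundlers use to resolve sub-path imports without tsconfig aliases; the sub-path and file names below are illustrative:

```json
{
  "name": "@my-lib/config",
  "main": "dist/index.cjs.js",
  "module": "dist/index.esm.js",
  "exports": {
    ".": {
      "import": "./dist/index.esm.js",
      "require": "./dist/index.cjs.js"
    },
    "./shared-configuration": {
      "import": "./dist/shared-configuration.esm.js",
      "require": "./dist/shared-configuration.cjs.js"
    }
  }
}
```

With this, `import { SharedConfiguration } from '@my-lib/config/shared-configuration'` resolves to its own file, so consumers only pull in what they use, without the UMD globals problem affecting the per-file aliases.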

Typescript : Add a method on a class from another module / Populate a namespace from different modules

The story
I am building a modular library for math operations. I want to divide the library into multiple modules: core, relational, vectors, and so on. The modules can be used on their own (but all depend on the core module).
I know it's not possible to use partial classes (see "How do I split a TypeScript class into multiple files?" and https://github.com/Microsoft/TypeScript/issues/563).
The problem:
The core module defines the Set class, which is a mathematical set. It defines operations such as Set#add and Set#remove.
The optional relational module, though, adds a Set#product operator to the Set class.
Other modules could also add other operations to the Set class. I want to keep the possibility of adding functionality when I see fit.
The question
With TypeScript, how can I add a method to a class that resides in another module?
And how can I arrange the typings so that the user of my library sees Set#product in code completion only if they installed the relational module, and otherwise only sees the #add and #remove operations?
I am developing this library for node.js, but also use browserify to bundle it for browser use.
// core/set.ts
export class Set {
  add(element){}
  remove(element){}
}
// relational/set.ts
import {Set} from '../core/set';
Set.prototype.product = function(){} // ?
// app/index.js
import {core} from 'mylib';
var set = new Set();
set.add();
set.remove();
// set.product() is not available
// app/index2.js
import {core} from 'mylib';
import {relational} from 'mylib';
var set = new Set();
set.add();
set.remove();
set.product() //is available
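Setting the typings aside for a moment, the runtime side of this augmentation is plain prototype patching. A minimal JavaScript sketch (the class is renamed MathSet here only to avoid clashing with the built-in Set):

```javascript
// core/set.js — the base class with the core operations
class MathSet {
  constructor() { this.elements = []; }
  add(element) { this.elements.push(element); return this; }
  remove(element) {
    this.elements = this.elements.filter(e => e !== element);
    return this;
  }
}

// relational/set.js — merely loading this module patches the core class,
// so every MathSet instance gains product()
MathSet.prototype.product = function (other) {
  // Cartesian product of two sets, returned as pairs
  const pairs = [];
  for (const a of this.elements)
    for (const b of other.elements) pairs.push([a, b]);
  return pairs;
};

const s1 = new MathSet().add(1).add(2);
const s2 = new MathSet().add('x');
```

Apps that never require the relational module never run the patch, so product() simply doesn't exist for them; the typings question is what the answer below addresses.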
Bonus question
All these modules are made to be available through a common namespace, let's call it MyLibrary. The core module adds MyLibrary.Core, and the relational module adds some objects into the MyLibrary.Core, and also adds MyLibrary.Relational.
Let's say I publish another module, only used to act as a facade for the other modules. Let's call this module my-library.
If a user installs the my-library, core and relational modules with npm:
npm install my-library && npm install core && npm install relational
In the client application I would like the user of the library to only have to write
var lib = require('my-library');
Then my-library would automatically check all installed MyLibrary modules, require them, populate the MyLibrary namespace and return it.
How can I tell the my-library module, the first time it is accessed, in both Node and browser environments, to
Check for any available MyLibrary module (browser and node environments)
Run a method for each module (to install them in the namespace)
Return that nice fruity namespace
If you're just writing declaration files, you can use interfaces instead and do something like what moment-timezone does.
moment.d.ts
declare module moment {
  interface Moment {
    // ...
  }
  interface MomentStatic {
    // ...
  }
}
declare module 'moment' {
  var _tmp: moment.MomentStatic;
  export = _tmp;
}
moment-timezone.d.ts
Just redeclare the same interfaces with extra functions.
declare module moment {
  interface Moment {
    tz(): void;
  }
  interface MomentStatic {
    tz(): void;
  }
}
declare module 'moment-timezone' {
  var _tmp: moment.MomentStatic;
  export = _tmp;
}
Both modules now expose the same interfaces, and moment picks up the new methods automatically through declaration merging.

Using a global variable in Node.js for dependency injection

I'm starting a long-term project based on Node.js, so I'm looking to build on a solid dependency injection (DI) system.
Although Node.js at its core implies using simple module require()s for wiring components, I find this approach not well suited to a large project (e.g. requiring modules in each file is not very maintainable, testable or dynamic).
Now, I'd done my bits of research before posting this question and I've found out some interesting DI libraries for Node.js (see wire.js and dependable.js).
However, for maximal simplicity and minimal repetition, I've come up with my own proposal for implementing DI:
You have a module, di.js, which acts as the container and is initialized by pointing it to a JSON file storing a map of dependency names and their respective .js files.
This already gives the DI a dynamic nature, as you can easily swap test/development dependencies.
The container returns dependencies through an inject() function, which looks up the dependency mapping and calls require() with it.
For simplicity, the module is assigned to a global variable, i.e. global.$di, so that any file in the project may use the container/injector by calling $di.inject().
Here's the gist of the implementation:
File di.js
module.exports = function(path) {
  var deps = require(path);
  return {
    inject: function(name) {
      if (!deps[name])
        throw new Error('dependency "' + name + '" isn\'t registered');
      return require(deps[name]);
    }
  };
};
Dependency map JSON file
{
  "vehicle": "lib/jetpack",
  "fuel": "lib/benzine",
  "octane": "lib/octane98"
}
Initialize the $di in the main JavaScript file, according to development/test mode:
var path = 'dep-map-' + process.env.NODE_ENV + '.json';
$di = require('di')(path);
Use it in some file:
var vehicle = $di.inject('vehicle');
vehicle.go();
So far, the only problem I could think of using this approach is the global variable $di.
Supposedly, global variables are a bad practice, but it seems to me like I'm saving a lot of repetition for the cost of a single global variable.
What can be suggested against my proposal?
Overall this approach sounds fine to me.
The way global variables work in Node.js is that when you declare a variable without the var keyword, it gets added to the global object, which is shared between all modules. You can also use global.varname explicitly. Example:
vehicle = "jetpack"
fuel = "benzine"
console.log(vehicle) // "jetpack"
console.log(global.fuel) // "benzine"
Variables declared with var will only be local to the module.
var vehicle = "car"
console.log(vehicle) // "car"
console.log(global.vehicle) // "jetpack"
So in your code if you are doing $di = require('di')(path) (without var), then you should be able to use it in other modules without any issues. Using global.$di might make the code more readable.
Your approach is clear and simple, which is good. Whether you use a global variable or require your module every time is not that important.
Regarding testability, it allows you to replace your modules with mocks. For unit testing you should add a function that makes it easy to apply different mocks for each test, something that extends your dependency map temporarily.
For further reading I can recommend a great blog article on dependency injection in Node.js as well as a talk on the future dependency injector of angular.js which is designed by some serious masterminds.
BTW, you might be interested in Fire Up! which is a dependency injection container I implemented.

nodejs module does not export function

I ran into an issue with my Node.js application.
I have two different apps that use a shared library, which is located so that it is found one level up, in node_modules. So I have this structure: ./app1/app.js, ./app2/app.js and ./node_modules/shared.libs/index.js.
shared.libs in its turn has some other modules installed, like mongoose, redis etc., plus some mongoose models with additional functions in them. All are exported from index.js like this:
exports.async = require('async');
exports.config = require('./config');
exports.utils = require('./lib/utils');
And then in the apps I import them like this:
var libs = require('shared.libs');
var config = libs.config;
After this code I can use config, which comes from the shared library.
This part was and is working just fine. But now I need to put an additional layer on top of this library (read: provide a more unified interface for both apps).
What I tried to do is add some functions to the shared library's index.js and then export the whole object with these functions. But whenever I try to call the previously imported (by var libs = require('shared.libs');) object, it says that libs is not defined.
What am I doing wrong here?
I generally want to keep the other code the same, so I won't need to go around replacing the require part everywhere, but at the same time provide additional functionality from the shared library, available from that imported libs object.
This should definitely work:
module.exports = {
  async: require('async'),
  config: require('./config'),
  utils: require('./lib/utils'),
  foo: function () {
    return 'bar';
  }
};
Reference it like:
var libs = require('shared.libs');
console.log(libs.async);
console.log(libs.config);
console.log(libs.utils);
console.log(libs.foo);
console.log(libs.foo());
What makes me wonder is one of your comments above, that you get an error that libs is not defined. That suggests you have other unnoticed errors earlier, during the initialization of your shared.libs module.
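The layering the asker describes can be sketched like this: keep the existing flat exports so that `libs.config` keeps working, and add the unified interface on top of the same object. The facade names and stand-in values here are invented for the example:

```javascript
// index.js of the shared library: the existing exports stay where they are
const libs = {
  config: { env: 'dev' },                 // stand-in for require('./config')
  utils: { upper: s => s.toUpperCase() }  // stand-in for require('./lib/utils')
};

// New unified interface layered on top, without moving the old keys,
// so both the old and the new access styles work from the same object.
libs.api = {
  getConfig: function () { return libs.config; },
  format: function (s) { return libs.utils.upper(s); }
};

// (in the real index.js this would end with: module.exports = libs;)
```

Because the old keys are untouched, apps using `libs.config` directly need no changes, while new code can go through `libs.api`.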
