Make a var "global" just in the npm package instead of the whole project in Node.js

I am transforming my Node microservices into npm packages so that I can run several of them on a single server. They exchange RabbitMQ messages, so they are independent of the listening server.
My problem is that each MS used global variables, like the DB connection for example. Now that they are packages, and I can have different MSs on one server, I can't share a global.db_connection because each MS may use a different DB.
Is there a way I can use a global.db_connection just inside a package, so that the variable is global only within that package?
==== UPDATE1
Before, I wrongly referred to modules; I mean npm packages. Each MS is a separate npm package that I install on the server. I've corrected my question, using "package" instead of "module".
=== EXAMPLE
I know there are better ways to share a DB connection across the code. I am using 'db_connection' just as an example of a global object.
-- packageA.js
// module MS_A
global.db_connection = db.connect(URL_1);
class MS_A
{
...
}
module.exports = MS_A;
-- packageB.js
// module MS_B
global.db_connection = db.connect(URL_2);
class MS_B
{
...
}
module.exports = MS_B;
-- server.js
global.needed_variable = 3;
var MS_A = new (require("./packageA.js"))();
MS_A.start();
var MS_B = new (require("./packageB.js"))();
MS_B.start();
/**
"needed_variable" should be visible inside both packages.
"db_connection" should be not visible outside each package
*/
MS_B should use its own db_connection and not be aware of the db_connection of MS_A (and vice versa).
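One way to get exactly this scoping is to lean on Node's module system instead of `global`: a top-level variable in a file is private to that module, and because require() caches modules, every file inside the package that requires it sees the same instance. A minimal sketch, assuming a hypothetical `db.js` file inside each package (the connection object here is a placeholder, not a real driver call):

```javascript
// db.js -- hypothetical file inside the MS_A package.
// A top-level variable is scoped to this module, not the whole process,
// so each package keeps its own "private global".
let dbConnection = null;

function connect(url) {
  if (dbConnection === null) {
    // placeholder for a real db.connect(url)
    dbConnection = { url: url, connected: true };
  }
  return dbConnection;
}

function getConnection() {
  return dbConnection;
}

module.exports = { connect, getConnection };
```

Other files in the same package do require('./db') and call getConnection(); a second package with its own db.js gets a completely separate variable, while server-wide values like needed_variable can still be passed in explicitly or left on global.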

Related

How to set a default/fallback scope on package names with Node require()

I am using a particular module in my Node code:
const example = require('example');
However, this module is slow to be updated, so I have forked it and published it with my updates under my own scope on npmjs.com. But now, to use my own module, I must change every use in my code:
const example = require('@my-username/example');
The problem with this is that I will have to commit a bunch of changes throughout many files to rename the module; then, when upstream merges my changes into the official version, I will have to update my code again to remove the scope from require() across all those files, then add it back if I have more changes that are slow to be accepted, and so on.
Is there a way that I can tell Node or NPM that, if require() can't find a module with no scope in the name, it should then check all the @scope folders in node_modules to see if there's a match there?
If that were possible then I would only need to update package.json with the relevant package version and the code itself could remain unchanged as I switch between my fork and the official version.
You can implement it using module-alias.
This will slow down your startup, but lets you write this logic for every require your application makes.
const moduleAlias = require('module-alias')

// Custom handler function (starting from v2.1)
moduleAlias.addAlias('request', (fromPath, request, alias) => {
  console.log({
    fromPath,
    request,
    alias,
  });
  return __dirname + '/my-custom-request'
})

require('request')

Setting up a central logging package for multiple applications

Say we create a reusable package called 'my-loggers' that looks like this:
import bunyan = require('bunyan');

const loggers = {};

export const getLogger = function(name, config){
  if(loggers[name]){
    return loggers[name];
  }
  return loggers[name] = bunyan.createLogger(config);
};
then in each application, we do:
npm install -S my-loggers
then we use the logger like this:
import * as MyLoggers from 'my-loggers';
import config = require('../my-app-config');
const log = MyLoggers.getLogger('my-app', config.logging);
The problem is that it takes 3 lines of code to retrieve the logger in each file of each app.
I am trying to figure out a way to create a single package that can retrieve a logger for any file in any app, cutting everything down to just one line of code.
I am also trying to avoid relative paths.
I cannot think of a solution that involves one logging package for all our apps. The only things I can think of are either a separate logging package per app (which is kinda lame), or a trick where we symlink some code from our project into node_modules upon npm install, which lets us avoid relative paths.
Anyone know of a good way to solve this one?
Ok, I have a solution. We can just do this:
import log from 'my-loggers/app1'
import log from 'my-loggers/app2'
import log from 'my-loggers/app3'
that way we can store different loggers in the same package.
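As a sketch of what one of those per-app entry points could contain (the file name `app1.js` and the `createLogger` stand-in are assumptions; the real file would delegate to bunyan.createLogger with the app's config):

```javascript
// my-loggers/app1.js -- hypothetical per-app entry point in the shared package.
// The app name and config are baked in, so every file in app1 gets the same
// configured logger with a single import; require() caching makes it a singleton.
const config = { name: 'app1', level: 'info' }; // stands in for the real app config

function createLogger(cfg) {
  // stands in for bunyan.createLogger(cfg)
  return { name: cfg.name, level: cfg.level };
}

// Evaluated once per process, then cached by the module loader.
const logger = createLogger(config);

module.exports = logger;
```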

node: how to require different classes depending on env

In Node, module imports (aka require()) are hard-coded in every file (aka module) that needs them. This can be tens, or in our case hundreds, of duplicate imports. What we are "requiring" in a dynamic way are mainly services, e.g. a "playerService" with find, update, get methods etc., but it could also be domain objects or persistence libraries.
The crux is that we have 3 versions of this "playerService" js file: one which does everything locally (in memory) for development, one which does everything with a local database (test), and one which does everything with an external system via an API (live). The switch in this case is on environment (dev, test or live).
It is worth noting we use classes everywhere we can, because we find functions which return functions of functions etc. to be unreadable/unmaintainable (we are Java developers really struggling with js).
We are also exclusively using WebSockets in our Node app - there is no HTTP code.
So our services look like this:
const Player = require("./player")

class PlayerService {
  constructor(timeout) {
    this.timeout = 3000 // todo, put in a config file
    if (timeout != null) { this.timeout = timeout }
  }

  updatePlayer(player) {
    // logic to lookup player in local array and change it for dev version.
    // test version would lookup player in DB and update it.
  }
}
module.exports = PlayerService
We are familiar with dependency injection from Grails and Spring, but haven't found anything comprehensible (see below) for Node. We are not JavaScript nor Node gurus, unfortunately, despite extensive reading.
Currently we are considering one of these options, but would like to hear any better suggestions:
option 1:
Hard code the "dev" requires, e.g. require("./dev/playerService")
have a Jenkins build server rewrite the source code in every file to require("./test/playerService").
option 2:
Hard code the "dev" requires, e.g. require("./playerService")
have a Jenkins build server swap the file ./test/playerService/playerService in for ./playerService.
Obviously these make it hard for developers to run the test or pro versions on their local machines without hacking the source.
option 3:
1. put the required module paths in a single config file.
2. swap out just the config file. E.g.
let Config = require("./config")
let PlayerService = require(Config.playerService)
We have tried to make this dependent on env, with a global config that the development, test and prod configs override, but have not found an elegant way to do this. One way might be to duplicate this code at the top of every module:
let env = process.env.NODE_ENV || 'development'
let config = require('./config.' + env)
let PlayerService = require("./" + config.playerService)
Then in config.development.js:
var config = require("./config.global")
config.playerService = "devPlayerService"
module.exports = config
Option 4:
Perhaps something like this would work:
let env = process.env.NODE_ENV || 'development'
let PlayerService = require("./" + env + "/playerService")
All the above solutions suffer from a lack of singletons - services are stateless. We are guessing that Node is going to construct a new copy of each service for each request (or WebSocket message in our case). Is there a way to minimise this?
Obviously some simple, readable, and officially maintained form of Dependency injection would be nice, with some way to switch between which set of classes were injected.
We have read the following posts:
https://blog.risingstack.com/dependency-injection-in-node-js/ - the resultant code is unreadable (for us at least). The example is so contrived it doesn't help: team is just some sort of proxy wrapper around User, not a service or anything useful. What are options? Why options?
https://medium.com/@Jeffijoe/dependency-injection-in-node-js-2016-edition-f2a88efdd427
We found both incomprehensible. E.g. the examples have keywords which come from thin air - they don't seem to be JavaScript or Node commands, and the documentation doesn't explain where they come from.
And looked at these projects:
https://github.com/jaredhanson/electrolyte
https://www.npmjs.com/package/node-dependency-injection
https://www.npmjs.com/package/di
but they seemed to be either abandoned (di), not maintained, or we just can't figure them out (electrolyte).
Is there some standard or simple di solution that many people are using, ideally documented for mortals and with a non "express" dependent example?
UPDATE 1
It seems the pattern I am using to create my services creates a new instance every time it is used/called. Services should be singletons. The simple solution is to add this to the bottom of my services:
let playerService = new PlayerService();
module.exports = playerService;
Apparently, this only creates one instance of the object, no matter how many times require("./playerService") is called.
For keeping the configuration per env, the right way is probably (similar to what you suggested) keeping a config/env directory with a file per env, i.e. development.js, test.js etc., each containing the right values. E.g.:
module.exports = {
  playerService: 'dev/PlayerService'
}
and require it:
let env = process.env.NODE_ENV || 'development'
, envConfig = require("./config/" + env)
, playerService = require(envConfig.playerService)
You can also have the all in one file like this:
config.js:
module.exports = {
  development: {
    playerService: '....'
  },
  test: {
    playerService: '....'
  }
}
and require it:
let env = process.env.NODE_ENV || 'development'
, config = require("./config")
, playerService = require(config[env].playerService)
This is a common use-case.
Or, if you have all services in per-env directories, i.e. one directory for dev, one for test etc., you don't need the config; you can require like this:
let env = process.env.NODE_ENV || 'development'
, playerService = require('./' + env + '/playerService')
Making the services singletons in Node.js should be simple; have a look at the following:
https://blog.risingstack.com/fundamental-node-js-design-patterns/
https://www.sitepoint.com/javascript-design-patterns-singleton/
Hope this helps.

How nodejs handles global module being used in multiple applications running in parallel?

I have created a very simple Node.js hello application.
//hello.js
#!/usr/bin/env node
var config = require('./config');
console.log(config.getConfig());
config.setConfig(Math.floor(Math.random() * 6) + 1 );
console.log(config.getConfig());
var readconfig = require('./readconfig');
console.log("Hi !! I am glad to see you again");
//config.js
var config;

exports.getConfig = function(){
  return config;
}

exports.setConfig = function(c){
  config = c;
}
//package.json
"bin": {
  "hello": "hello.js"
}
//readconfig.js
var config = require('./config');
console.log(config.getConfig());
config.setConfig(Math.floor(Math.random() * 6) + 1 );
console.log(config.getConfig());
I installed my sample module globally. And when I run the 'hello' command from different folders, or multiple times from the same folder, I always get:
undefined
2 //some random number
2 //some random number
4 //some random number
Hi !! I am glad to see you again
From the above response it looks like Node.js caches a local module but maintains a separate copy for each application context. However, I am not sure.
Can somebody please explain how exactly Node.js handles global modules and separates their variables across all the applications using them?
Each node application is completely independent of the others. They are separate processes, completely separate modules, each loading independently, sharing no data.
Within a given application, a module loaded with require() is cached. The first time the module is loaded with require() it is loaded from disk and initialized. From then on, any subsequent requests to load that module just return the same module handle that was previously loaded - it doesn't load it again or run it again.
The require() module loader maintains a cache. Once a module is loaded it is in the cache, and any subsequent requests to load it (from within that specific application) just return the cached module handle.

How to reference local files in a npm module?

I wrote a simple npm module to precompile my Handlebars templates when using django-compressor to do post-processing for some client-side components, and found that I need to ship the npm module with a few js files.
Currently I just assume no one is installing this with the global flag, because I've "hard coded" the path to these dependencies in the npm module itself.
example layout of my npm module
/
* /bin
* /lib/main.js
* /vendor/ember.js
Now inside main.js I want to use the ember.js file ... currently my hard coded approach looks like this
var emberjs = fs.readFileSync('node_modules/django-ember-precompile/vendor/ember.js', 'utf8');
Again, this only works because I assume you install it locally, but I'd like to think Node.js has a more legit way to get at locally embedded files.
Anyone know how I can improve this to be more "global" friendly?
What you can do is get the directory of the current file and make your file paths relative to that.
var path = require('path')
, fs = require('fs');
var vendor = path.join(path.dirname(fs.realpathSync(__filename)), '../vendor');
var emberjs = fs.readFileSync(vendor + '/ember.js', 'utf8');
Hope that helps!
One of the great strengths of Node.js is how quickly you can get up and running. The downside is that you are forced to fit the design patterns it was built around.
This is an example where your approach differs too much from Node's approach.
Node expects everything in a module to be exposed from the module's exports, including templates.
Move the readFileSync into the django-ember-precompile module, then expose the returned value via a module export in lib/main.js.
Example:
package.json
{
  "name": "django-ember-precompile",
  "main": "lib/main.js"
}
lib/main.js
var fs = require('fs'), path = require('path')
module.exports.ember = fs.readFileSync(path.join(__dirname, '../vendor/ember.js'), 'utf8')
vendor/ember.js
You obtain your template via
var template = require('django-ember-precompile').ember
This example can be refined, but the core idea is the same.
