Is it possible to keep an instance in Node.js across an entire app?

I want to instantiate a class in a module like this:
const Agenda = require('agenda');
const agenda = new Agenda({db: {address: mongoConnectionString}});
and then access the already configured agenda object from everywhere else in the code (like a singleton). At first I thought about using module.exports = agenda; but when I require this module from another module of the application, won't it execute all the code again?
So, if I'm not mistaken, what is the best approach to achieve this? Thanks.

You can use the global object to share variables everywhere in your Node application:
const Agenda = require('agenda');
const agenda = new Agenda({db: {address: mongoConnectionString}});
global.agenda = agenda;
And then you can get your agenda in another module like this:
const agenda = global.agenda;
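Alternatively, note that Node caches a module after the first require, so its initialization code runs only once; your module.exports = agenda idea therefore also gives you a single shared instance (a sketch; the agenda.js file name is just an assumption for illustration):
agenda.js
const Agenda = require('agenda');
const agenda = new Agenda({db: {address: mongoConnectionString}});
module.exports = agenda;
somewhere-else.js
const agenda = require('./agenda'); // same instance everywhere, thanks to the module cache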

Related

What does ".Strategy" do in Node or Passport?

What does ".Strategy" do here? Is it Node? Is it Passport?
var LocalStrategy = require('passport-local').Strategy;
Everything up to the '.Strategy' part I understand. I just want to know what '.Strategy' does. I have checked the documentation for the passport-local module on npm. I have also checked Passport's documentation, and it is just used in code snippets; no explanation is provided.
I am working with the MEAN stack and we are using Passport to authenticate users.
If you look at the source of passport-local's index.js, you'll see it exports the same thing both directly and as exports.Strategy.
When you do require('passport-local').Strategy you import the export defined in exports.Strategy, but in this case it's really the same as just doing require('passport-local'), because the same constructor is also exported directly from the module.
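In essence, passport-local's index.js does something like this (a paraphrased sketch, not the verbatim source):
// lib/index.js of passport-local (paraphrased)
var Strategy = require('./strategy');
exports = module.exports = Strategy; // exposed directly from the package...
exports.Strategy = Strategy;         // ...and also as a named export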
If you define a module like this:
// module.js
var Thing = { foo: () => 'bar' };
exports = module.exports = Thing;
exports.Thing = Thing;
you can use it in many ways:
const Thing = require('./module');
console.log(Thing.foo());
works, as does
const Thing = require('./module').Thing;
console.log(Thing.foo());
and with either import you can also call
console.log(Thing.Thing.foo());
If you remove the exports.Thing = Thing; part of the module, then
const Thing = require('./module').Thing;
does not work anymore.
exports often causes confusion. You could take a look at the Node docs or, e.g., this answer.
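Part of the confusion is that reassigning exports alone does not change what the module exports; only module.exports does, which is why the example above writes exports = module.exports = Thing. A minimal illustration (file names are just for this sketch):
// broken.js
exports = function () { return 'bar'; };
// require('./broken') still returns the original, empty module.exports object
// works.js
module.exports = function () { return 'bar'; };
// require('./works')() === 'bar'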

Instantiate node module differently per (web) user

I was wondering what the best practice is for the following scenario:
I am planning to use an npm module for a web service, where the user enters an access key and a secret key. Then a module is used which is instantiated like this:
var module = require('module')('ACCESS_KEY','SECRET_KEY');
Each user of course has a different access and secret key. The module exposes several functions which I want to use with the user's access and secret key on his behalf.
Now my question is how I can 'require' that module with the keys from the database for each user, not just once for the whole application with a single static pair. I am on Node 8 and using ES6.
The crucial detail here is that this:
var module = require('module')('ACCESS_KEY','SECRET_KEY');
...is equivalent to this:
var moduleFunc = require('module');
var module = moduleFunc('ACCESS_KEY', 'SECRET_KEY');
In other words, 'module' exports a function, and you're calling that function with two arguments ('ACCESS_KEY', 'SECRET_KEY') and assigning the result to module.
That means you can instead require('module') at the top of your file and then use the function it gives you as many times as you want later on, with different arguments.
For example:
const someApi = require('some-api');
// ...later...
app.get('/', (req, res) => {
  const { ACCESS_KEY, SECRET_KEY } = getUserKeys(req);
  const apiClient = someApi(ACCESS_KEY, SECRET_KEY);
  // ...
});
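If constructing a new client on every request turns out to be expensive, one possible refinement (just a sketch; getUserKeys, someApi and req.user.id are assumptions carried over from or added to the example above) is to cache one client per user:
const clients = new Map();

function clientFor(req) {
  const userId = req.user.id; // however you identify the user in your app
  if (!clients.has(userId)) {
    const { ACCESS_KEY, SECRET_KEY } = getUserKeys(req);
    clients.set(userId, someApi(ACCESS_KEY, SECRET_KEY));
  }
  return clients.get(userId);
}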

How to DRY requires in a Node Express app?

I have a Node + Express app. In many of my files I am doing this at the top
const config = require('./config');
const Twit = require('twit');
const TwitConnector = new Twit(config);
Is there a way to DRY this, so I don't have to repeat this everywhere?
Is there a, best practice, pattern to make something like TwitConnector globally available so that I can use it anytime I need it?
Or maybe that's not a good idea and explicitly requiring it is the right thing to do?
Can't you make a twit-connector.js file and require that instead? Node caches modules after the first require, so every file that requires it gets the same TwitConnector instance. I don't think making it global is a good idea.
twit-connector.js
const config = require('./config');
const Twit = require('twit');
const TwitConnector = new Twit(config);
module.exports = TwitConnector;
somefile.js
const TwitConnector = require('./twit-connector');
// do something with TwitConnector

Two way communication between routers within express app

I have an express app that has a router for different sections of my application, each contained within individual files. At the end of each file I export the router object like so.
var express = require("express");
var router = express.Router();
//routing handlers
module.exports = router;
However, my problem is that I am trying to implement a feature where a user is allowed to edit a post that could be displayed on the front page. In order to always show the most current version of the user's post, I need to know when the user edits it so I can make the necessary changes.
I have two modules: one that handles dispatching the user's posts (call this module B) and another that handles editing (call this module A). I need module A to use a handler function and an array from module B, but I also need module B to be notified when to make changes to the array that module A requires.
I have tried
module A
var express = require('express');
var EventEmitter = require('events').EventEmitter;
var evt = new EventEmitter();
var router = express.Router();
var modB = require('moduleB');
router.evt = evt;
module.exports = router;
Module B
var express = require('express');
var router = express.Router();
var modA = require('moduleA').evt;
modA.on('myEvent',handler);
var myArray = [....];
router.myArray = myArray;
module.exports = router;
This gives me undefined for modA and throws an error. I suspect it might be the order in which the modules are required, but in any case I would like some feedback, since I sense this might not even be good practice.
I think you are running into a common scenario for someone just starting out with Express. A lot of people stick everything into routes/controllers, when really the route should be very simple: extract the data needed to figure out what the request is doing, then pass it to a service for most of the processing.
The solution is to create a Service and put the bulk of your logic and common code there, then you can wire up ModA and ModB to use the Service as needed.
EDIT with an example (not working as-is, but it should give you a good starting point):
Shared Service
var EventEmitter = require('events').EventEmitter;
var evt = new EventEmitter();
module.exports = {
  saveData: function(data) {
    // do some saving stuff then trigger the event
    evt.emit('myEvent', data);
  },
  onDataChange: function(handler) {
    evt.on('myEvent', handler);
  }
};
Module A
var service = require('./SharedService.js');
// listen for events
service.onDataChange(function(data) {
  // do something with the data
});
Module B
var service = require('./SharedService.js');
// save some data which will cause Module A's listener to fire
service.saveData(data);
The example above hides the implementation of EventEmitter, which may or may not be desirable. Another way you could do it would be to have SharedService extend EventEmitter; then your modules could listen/emit directly on the service.
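A minimal sketch of that alternative (assuming the same file name as above; this replaces rather than extends the earlier example):
// SharedService.js
var EventEmitter = require('events').EventEmitter;
var util = require('util');

function SharedService() {
  EventEmitter.call(this);
}
util.inherits(SharedService, EventEmitter);

SharedService.prototype.saveData = function(data) {
  // do some saving stuff, then let any listeners know
  this.emit('myEvent', data);
};

module.exports = new SharedService(); // a single shared instance

// elsewhere:
// var service = require('./SharedService.js');
// service.on('myEvent', function(data) { /* react to the change */ });
// service.saveData(data);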

Using Express.js 3 with database modules, where to init database client?

Knowing that Express.js pretty much leaves app structure up to the developer, and after reading quite a few suggestions on SO (see link1 and link2 for example) as well as checking the example in the official repo, I am still not sure if what I am doing is the best way forward.
Say I am using Redis extensively in my app, and that I have multiple "models" that require redis client to run query, would it be better to init redis client in the main app.js, like this:
var db = redis.createClient();
var models = require('./models')(db);
var routes = require('./controllers')(models);
or would it be better to just init redis in each model, then let each controller require the models of interest?
The latter approach is what I am using, which looks less DRY. But is passing the models instance around the best way? Note that I am loading multiple models/controllers here; I am not sure how to modify my setup to pass the redis client correctly to each model.
//currently in models/index.js
exports.home = require('./home.js');
exports.users = require('./user.js');
TL;DR, my questions are:
where is the best place to init the redis client in an MVC-pattern app?
how do I pass this redis client instance to multiple models with require('./models')(db)?
Update:
I tried a different approach for index.js, using module.exports to return an object of models/controllers instead:
module.exports = function(models){
  var routes = {};
  routes.online = require('./home.js')(models);
  routes.users = require('./user.js')(models);
  return routes;
};
Seems like a better idea now?
Perhaps it's useful if I share how I recently implemented a project using Patio, a SQL ORM. A bit more background: the MVC framework I was using was Locomotive, but that's absolutely not a requirement (Locomotive doesn't have an ORM and leaves how you handle models and databases up to the developer, similar to Express).
Locomotive has a construct called 'initializers', which are just JS files which are loaded during app startup; what they do is up to the developer. In my project, one initializer configured the database.
The initializer established the actual database connection and also took care of loading all JS files in the model directory. Roughly:
var fs = require('fs');
var path = require('path');
var registry = require('model_registry'); // see below
var db = createDatabaseConnection();

fs.readdirSync(MODEL_DIRECTORY).forEach(function(filename) {
  if (path.extname(filename) !== '.js') return; // skip non-JS files
  var mod = require(path.join(MODEL_DIRECTORY, filename));
  var model = mod(db);
  registry.registerModel(model);
});
Models look like this:
// models/mymodel.js
module.exports = function(db) {
  var model = function(/* ... */) { /* model class */ };
  model.modelName = 'MyModel'; // used by registry, see below
  return model;
};
The model registry is a very simple module to hold all models:
module.exports = {
  registerModel: function(model) {
    if (!model.hasOwnProperty('modelName'))
      throw new Error('[model registry] models require a modelName property');
    this[model.modelName] = model;
  }
};
Because the model registry stores the model classes in this (which is module.exports), they can then be imported from other files where you need to access the model:
// mycontroller.js
var MyModel = require('model_registry').MyModel;
var instance = new MyModel(...);
Disclaimer: this worked for me, YMMV. Also, the code samples above don't take into account any asynchronous requirements or error handling, so the actual implementation in my case was a bit more elaborate.
