Exporting functions to multiple callers in different files - node.js

I'm building a logging module that can be called from multiple files.
My objective is to initialize the log file at the start of the program and have the callers just call a function that logs to the file initialized earlier, without going through the whole initialization again.
I can't quite grasp the concept of module exports, so I'm hoping you can help.
The actual logging happens in the write method. In the main app.js file, I can initialize and log just fine.
However, in a different file, I'm having a mental block on how I can just log to the file without creating the log file all over again.
var fs = require('fs');

var fd = {},
    log = {},
    debug = false;

var tnlog = function(env, file, hostname, procname, pid) {
  if (env == 'development')
    debug = true;

  fd = fs.createWriteStream(file, { flags: 'a', encoding: 'utf8', mode: 0644 });
  log = { hostname: hostname, procname: procname, pid: pid };
};

tnlog.prototype.write = function(level, str) {
  if (debug)
    console.log(str);
  else {
    log.timestamp = Date.now();
    log.level = level;
    log.str = str;
    fd.write(JSON.stringify(log) + '\n');
  }
};
exports.tnlog = tnlog;
This is how I initialize and log in the main file:
var logfile = '/var/log/node/www/app.log';
var tnlog = require('./lib/tnlog').tnlog,
log = new tnlog(app.get('env'), logfile, os.hostname(), appname, process.pid);
If you can suggest a better way of doing things, I definitely will appreciate that.

edit
The simplest solution would be to put
var logfile = '/var/log/node/www/app.log';
var tnlog = require('./lib/tnlog').tnlog;
module.exports = new tnlog(app.get('env'), logfile, os.hostname(), appname, process.pid);
into a separate file (mylogger.js), and require that anywhere you want to log something with logger = require('./mylogger.js'). You always get back that single instance of tnlog, because Node caches the exported value.
I also see you might be using Express, so you could also do
app.set("logger",new tnlog(app.get('env'), logfile, os.hostname(), appname, process.pid))
and retrieve it anywhere you have a reference to the app object with app.get("logger").
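For example, inside any route handler where app is in scope (a minimal sketch; the route path and log message are made up for illustration):

// Assumes the logger was stored earlier with app.set('logger', ...)
app.get('/health', function (req, res) {
  app.get('logger').write('info', 'health check requested');
  res.send('OK');
});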
old
More complicated:
You must decide whether you want to support logging to different files while the same app is running. If so, you absolutely need to create an object for each log file. You don't have to export a constructor function per se; you could also export a kind of hybrid between a factory and a singleton pattern.
You would create something like:
var loggers = {};

module.exports = function getLogger(env, file, hostname, procname, pid) {
  var key = env + file + hostname + procname + pid;
  if (loggers[key]) return loggers[key];
  return loggers[key] = new Logger(env, file, hostname, procname, pid);
};
That is, you check whether you have already created the logger object, using the concatenated function arguments as the key.
You then need to create a proper Logger constructor of course, but I assume you know a bit of Javascript.
Note that the loggers object will remain a private variable, just like the Logger constructor. Because node.js caches the object the module exports, the value of the loggers object will persist over multiple calls to require, as part of the getLogger closure.
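For reference, a minimal sketch of what such a Logger constructor might look like, modelled on the original tnlog code (the name Logger and the exact fields are illustrative, not part of the original module):

var fs = require('fs');

function Logger(env, file, hostname, procname, pid) {
  // Mirrors the original tnlog initialization, but keeps state per instance
  this.debug = (env == 'development');
  this.fd = fs.createWriteStream(file, { flags: 'a', encoding: 'utf8' });
  this.meta = { hostname: hostname, procname: procname, pid: pid };
}

Logger.prototype.write = function(level, str) {
  if (this.debug) {
    console.log(str);
    return;
  }
  var entry = {
    hostname: this.meta.hostname,
    procname: this.meta.procname,
    pid: this.meta.pid,
    timestamp: Date.now(),
    level: level,
    str: str
  };
  this.fd.write(JSON.stringify(entry) + '\n');
};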

Related

Is there a way to run a self-terminating js script that can pass variables to the next?

I'd really like to have some of my secrets/keys be iterable, since I have a growing list of external API keys; they would be easier to use if I could match them to the route being used, without having to statically map them at the start of my application.
The only way I can think of to organize them better, without writing massive one-line JSON strings in a batch/bash file, would be to define everything in JS object literals and have a JS script stringify it and load it into ENV variables to be passed to the application that's about to start.
NPM pre-start script:
const env = {
  secret: 'supersecret',
  key: `key
that requires
line breaks`,
  apiKeys: {
    'api-1': 'a;sodhgfasdgflksdaj;lg',
    'api-2': 'ajl;sdfj;adjsfkljasd;f'
  }
}

for (let x in env) {
  if (typeof env[x] == 'string') {
    process.env[x] = env[x];
  } else {
    process.env[x] = JSON.stringify(env[x])
  }
  console.log(x)
}

process.exit(22);
NPM start script:
const key = process.env.key
const apiKeys = JSON.parse(process.env.apiKeys)
Unfortunately, the ENV variables set in the pre-start script don't persist into the next process, so this is useless.
Would it also be secure to use STDIN and STDOUT to pass the data between the two scripts?
My solution ended up being to pipe the output: convert it to JSON, stream it to STDOUT, and receive it on STDIN in the second script. Doing this made it platform agnostic, and I can add any sort of active secret management in the source script (e.g. accepting secrets from various other secret management systems/vaults, or generating new secrets at every launch).
Send to STDOUT:
const env = {
  someSecret: 'supersecret',
  superSecretObject: {
    moreProperties: 'data'
  }
};

/* If you have an array of properties or have a very large set of secrets,
   you should create a readable stream from it, and send that to stdout,
   but this is much simpler */
process.stdout.write(JSON.stringify(env));
Accept on STDIN:
const fs = require('fs')

const env = (function () {
  /* Using fs will error out on no input, but you can use process.stdin
     if you don't need to suspend the whole application waiting for the input */
  let envTmp = fs.readFileSync(0).toString();
  envTmp = JSON.parse(envTmp);
  return envTmp;
})();
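To make the wiring concrete, here is a minimal sketch of launching the two scripts from Node itself instead of a shell pipe (the file names send-secrets.js, app.js and launcher.js are assumptions; the shell equivalent would simply be node send-secrets.js | node app.js):

// launcher.js — hypothetical wrapper around the two scripts above
const { spawn } = require('child_process');

// The producer writes the JSON blob to its stdout
const producer = spawn('node', ['send-secrets.js'], { stdio: ['ignore', 'pipe', 'inherit'] });
// The consumer reads the blob from its stdin
const consumer = spawn('node', ['app.js'], { stdio: ['pipe', 'inherit', 'inherit'] });

// Stream the serialized secrets straight from one process to the other
producer.stdout.pipe(consumer.stdin);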

How to set global modules in multiple node.js files

I have a project I'm working on, but I realised it was over 2000 lines and wanted to split it up into different files for different groups of functions, e.g. the send-message and read-message functions live in message.js. The problem is that I need a lot of modules in each of the files, and if I create an instance of a module, I would have to create a new instance in another file, but I want to use the same instance!
I've tried module.exports = { ... } and exports.function() to pass the modules and instances to other files, but sometimes it says that the function does not exist.
For example in my app.js file:
const SomeModule = require('some-module');
const instance = new SomeModule();

const message = require('./message.js');
message.passModule(instance);
And in my message.js file:
let moduleInstance;

exports.passModule = function(instance) {
  moduleInstance = instance;
};

moduleInstance.doSomething();
So, how can I have all the modules available in all the files but only declare them in one place, and how do I get the instance I created in one file to be usable in the other files?
Some library file
singleton/file.js
const someDependency = require('some-module');

class Singleton {
  method() {
    ...
    return someDependency.someFunctionality();
  }
  ...
}

module.exports = new Singleton();
Someplace where you want to use your singleton
const singleton = require('./singleton/file');
singleton.method();
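Applied to the question's own files, a minimal sketch (the file name shared-instance.js, the module name some-module and sendMessage are made up for illustration): create the instance once in its own file and require it wherever it is needed, instead of passing it around with passModule.
shared-instance.js:
// Creates the single instance once; require() caching makes it shared
const SomeModule = require('some-module');
module.exports = new SomeModule();
message.js:
const moduleInstance = require('./shared-instance.js');

exports.sendMessage = function (text) {
  // Same object as everywhere else, because Node returns the cached export
  moduleInstance.doSomething(text);
};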

How to modify variable in one file from another

My log.js:
var data;
var winston = require('winston');

var config = { 'status': 1 };
module.exports.config = config;
My get.js file (from where I want to modify log.js):
exports.getcategories = function (req, res) {
  if (log.status == 1) {
    var data = 'loaded successfully';
  }
};
Here I want to modify the data variable in log.js from my get.js. Can anyone please suggest how?
You need to read up on Javascript scopes. In order to access any variable in a file or function from outside it, there are only two ways:
Make the variable global (NOT a good idea).
Create getter/setter functions and export them, thereby exposing the variable to the outside world (a sketch follows).
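A minimal sketch of that getter/setter approach, assuming the variable lives in log.js (the function names getData and setData are made up for illustration):
log.js:
var data;

exports.getData = function () {
  return data;
};

exports.setData = function (value) {
  data = value;
};
get.js:
var log = require('./log');

exports.getcategories = function (req, res) {
  // Modify the variable in log.js through the exported setter
  log.setData('loaded successfully');
};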

How do I get the dirname of the calling method when it is in a different file in nodejs?

Let's say I have two files, dir/a.js and lib/b.js
a.js:
b = require('../lib/b');
b.someFn();
b.js:
var fallback = "./config.json";
module.exports = {
someFn = function(jsonFile) {
console.log(require(jsonFile || fallback);
}
}
The entire purpose of b.js in this example is to read a json file, so I might call it as b.someFn("path/to/file.json").
But I want there to be a default, like a config file. But the default should be relative to a.js and not b.js. In other words, I should be able to call b.someFn() from a.js, and it should say, "since you didn't pass me the path, I will assume a default path of config.json." But the default should be relative to a.js, i.e. should be dir/config.json and not lib/config.json, which I would get if I did require(jsonFile).
I could get the cwd, but that will only work if I launch the script from within dir/.
Is there any way for b.js to say, inside someFn(), "give me the __dirname of the function that called me?"
Use callsite, then:
b.js:
var path = require('path'),
    callsite = require('callsite');

module.exports = {
  someFn: function () {
    var stack = callsite(),
        requester = stack[1].getFileName();

    console.log(path.dirname(requester));
  }
};
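Tying this back to the question's goal of a default path relative to the caller, a sketch of b.js that uses callsite to resolve config.json next to the calling file (the exact resolution is my assumption, not part of the original answer):

var path = require('path'),
    callsite = require('callsite');

module.exports = {
  someFn: function (jsonFile) {
    // stack[0] is this frame; stack[1] is whoever called someFn (e.g. a.js)
    var callerDir = path.dirname(callsite()[1].getFileName());
    var fallback = path.join(callerDir, 'config.json');
    console.log(require(jsonFile || fallback));
  }
};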
Alternatively, using parent-module:
const path = require('path');
const parentModule = require('parent-module');
// get caller of current script
console.log(path.dirname(parentModule()));
// get caller of module, change './index.js' to your "main" script
console.log(path.dirname(parentModule(require.resolve('./index.js'))));
If you want to get the directory of the script of the caller function, then use the stack trace as the answer above shows; otherwise, what's the problem with hardcoding the directory of a.js?
var fallback = "dir_a/config.json";

module.exports = {
  someFn: function(jsonFile) {
    console.log(require(jsonFile || fallback));
  }
};

What is the best way to expose methods from Node.js?

Consider I want to expose a method called Print
Binding method as prototype:
File Saved as Printer.js
var printerObj = function(isPrinted) {
  this.printed = isPrinted;
};

printerObj.prototype.printNow = function(printData) {
  console.log('= Print Started =');
};

module.exports = printerObj;
Then access printNow() by putting code require('Printer.js').printNow() in any external .js node program file.
Export method itself using module.exports:
File Saved as Printer2.js
var printed = false;

function printNow() {
  console.log('= Print Started =');
}

module.exports.printNow = printNow;
Then access printNow() by putting code require('Printer2.js').printNow() in any external .js node program file.
Can anyone tell what is the difference and best way of doing it with respect to Node.js?
Definitely the first way. It is called the substack pattern and you can read about it on Twitter and on Mikeal Rogers' blog. Some code examples can be found at the jade github repo in the parser:
var Parser = exports = module.exports = function Parser(str, filename, options){
  this.input = str;
  this.lexer = new Lexer(str, options);
  ...
};

Parser.prototype = {
  context: function(parser){
    if (parser) {
      this.contexts.push(parser);
    } else {
      return this.contexts.pop();
    }
  },

  advance: function(){
    return this.lexer.advance();
  }
};
In the first example you are creating a class; ideally you should use it with new in your calling program:
var PrinterObj = require('./Printer.js');
var printer = new PrinterObj();
printer.printNow();
This is a good read on the subject: http://www.2ality.com/2012/01/js-inheritance-by-example.html
In the second example you are returning a function.
The difference is that you can have multiple instances of the first example (provided you use new as indicated) but only one instance of the second approach.
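A quick sketch that makes this visible, assuming the two files above sit in the same directory:

var PrinterObj = require('./Printer.js');
var a = new PrinterObj(false);
var b = new PrinterObj(true);
console.log(a.printed, b.printed); // false true — each instance keeps its own state

var printer1 = require('./Printer2.js');
var printer2 = require('./Printer2.js');
console.log(printer1 === printer2); // true — require returns the same cached exports object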
