Javascript-emitting template engine for node.js? - node.js

Consider e.g. the following scenario: we give some entry point URL (something like https://our.server/customer-name/entry-point.js) to our customer, so that they're able to include our product on their page by simply writing
<script language="Javascript" src="https://our.server/customer-name/entry-point.js"/>
in the place they want to put our product (yes, I know, this is an ugly solution, but it is not something I could change).
Here is the problem: our entry-point.js needs to know the base URL (https://our.server/customer-name/) from which it should load the other files. So it seems that the answer is to generate entry-point.js dynamically so that it will contain e.g.
var ourcompany_ourproduct_basepath = "https://our.server/customer-name/";
The obvious way to do this is to construct an entry-point.js manually, something like this:
res.write("var ourprefix_basepath = \"" + basepath.escape() + "\";");
res.write("function ourprefix_entryPoint() { /*do something*/ }");
res.write("ourprefix_entryPoint();");
As you can see, this is just too ugly.
Is there any template engine that will allow e.g. for the following:
var basepath = "https://our.server/customer-name/";
var export = {
    ourprefix_basepath: basepath.escape(),
    ourprefix_entrypoint: function() { /* do something */ }
};
templateEngine.render(export);
or
view.vw:
ourprefix_basepath = rewrite("{#basepath}");
function ourprefix_entrypoint() { /* do something */ }
ourprefix_entrypoint();
App.js:
templateEngine.render("view.vw", { basepath: "https://our.server/customer-name/" });
or something like this (you get the idea), which will write the following to the response stream:
var ourprefix_basepath = "https://our.server/customer-name/";
function ourprefix_entrypoint() { /* do something */ };
ourprefix_entrypoint();
?

It seems that Node.js supports reflection, though I can't find it explicitly stated anywhere.
So, by exploiting the fact that JSON is a subset of JS, the following cleaner code is possible without using a template engine:
var entrypoint = function(data) {
    /* do something with data.basepath and data.otherparam here... */
}

exports.processRequest = function(request, response) {
    var data = {
        basepath: computeBasepath(request),
        otherparam: "somevalue"
    };
    response.send("(" + entrypoint.toString() + ")(" + JSON.stringify(data) + ")");
}
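For illustration, assuming computeBasepath(request) returns "https://our.server/customer-name/" (a hypothetical value, matching the question), the string written to the response would look roughly like this (whitespace added for readability):
(function (data) {
    /* do something with data.basepath and data.otherparam here... */
})({"basepath":"https://our.server/customer-name/","otherparam":"somevalue"})
The browser parses and runs this immediately, so the entry point receives its configuration without any template engine.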

Related

NodeJS/Express share function between multiple routes files [duplicate]

Let's say I have a file called app.js. Pretty simple:
var express = require('express');
var app = express.createServer();
app.set('views', __dirname + '/views');
app.set('view engine', 'ejs');
app.get('/', function(req, res){
    res.render('index', {locals: {
        title: 'NowJS + Express Example'
    }});
});
app.listen(8080);
What if I have some functions inside "tools.js"? How would I import them to use in app.js?
Or... am I supposed to turn "tools" into a module and then require it? That seems hard; I'd rather do a basic import of the tools.js file.
You can require any js file; you just need to declare what you want to expose.
// tools.js
// ========
module.exports = {
    foo: function () {
        // whatever
    },
    bar: function () {
        // whatever
    }
};
var zemba = function () {
}
And in your app file:
// app.js
// ======
var tools = require('./tools');
console.log(typeof tools.foo); // => 'function'
console.log(typeof tools.bar); // => 'function'
console.log(typeof tools.zemba); // => 'undefined'
If, despite all the other answers, you still want to traditionally include a file in a node.js source file, you can use this:
var fs = require('fs');
// file is included here:
eval(fs.readFileSync('tools.js')+'');
The empty string concatenation +'' is necessary to get the file content as a string and not an object (you can also use .toString() if you prefer).
The eval() can't be used inside a function and must be called inside the global scope otherwise no functions or variables will be accessible (i.e. you can't create a include() utility function or something like that).
Please note that in most cases this is bad practice and you should instead write a module. However, there are rare situations, where pollution of your local context/namespace is what you really want.
Update 2015-08-06
Please also note this won't work with "use strict"; (when you are in "strict mode") because functions and variables defined in the "imported" file can't be accessed by the code that does the import. Strict mode enforces some rules defined by newer versions of the language standard. This may be another reason to avoid the solution described here.
You need no new functions or new modules.
You simply need to execute the module you're requiring if you don't want to use a namespace.
in tools.js
module.exports = function() {
    this.sum = function(a,b) { return a+b };
    this.multiply = function(a,b) { return a*b };
    //etc
}
In app.js, or in any other .js file like myController.js:
Instead of
var tools = require('./tools.js');
which forces us to use a namespace and call it like tools.sum(1,2), we can simply call
require('./tools.js')();
and then
sum(1,2);
In my case I have a file with controllers, ctrls.js:
module.exports = function() {
    this.Categories = require('./categories.js');
}
and I can use Categories in every context as a public class after require('./ctrls.js')().
Create two js files
// File cal.js
module.exports = {
    sum: function(a,b) {
        return a+b
    },
    multiply: function(a,b) {
        return a*b
    }
};
Main js file
// File app.js
var tools = require("./cal.js");
var value = tools.sum(10,20);
console.log("Value: "+value);
Console Output
Value: 30
Create two files, e.g. app.js and tools.js.
app.js
const tools= require("./tools.js")
var x = tools.add(4,2) ;
var y = tools.subtract(4,2);
console.log(x);
console.log(y);
tools.js
const add = function(x, y){
    return x+y;
}
const subtract = function(x, y){
    return x-y;
}
module.exports = {
    add, subtract
}
output
6
2
Here is a plain and simple explanation:
Server.js content:
// Include the public functions from 'helpers.js'
var helpers = require('./helpers');
// Let's assume this is the data which comes from the database or somewhere else
var databaseName = 'Walter';
var databaseSurname = 'Heisenberg';
// Use the function from 'helpers.js' in the main file, which is server.js
var fullname = helpers.concatenateNames(databaseName, databaseSurname);
Helpers.js content:
// 'module.exports' is a Node.js-specific feature; it is not part of browser JavaScript
module.exports =
{
    // This is the function which will be called in the main file, which is server.js
    // The parameters 'name' and 'surname' will be provided inside the function
    // when the function is called in the main file.
    // Example: concatenateNames('John', 'Doe');
    concatenateNames: function (name, surname)
    {
        var wholeName = name + " " + surname;
        return wholeName;
    },
    sampleFunctionTwo: function ()
    {
    }
};
// Private variables and functions which will not be accessible outside this file
var privateFunction = function ()
{
};
I was also looking for a NodeJS 'include' function and I checked the solution proposed by Udo G - see message https://stackoverflow.com/a/8744519/2979590. His code doesn't work with my included JS files.
Finally I solved the problem like that:
var fs = require("fs");

function read(f) {
    return fs.readFileSync(f).toString();
}
function include(f) {
    eval.apply(global, [read(f)]);
}
include('somefile_with_some_declarations.js');
Sure, that helps.
Create two JavaScript files. E.g. import_functions.js and main.js
1.) import_functions.js
// Declaration --------------------------------------
module.exports =
{
    add,
    subtract
    // ...
}

// Implementation ----------------------------------
function add(x, y)
{
    return x + y;
}
function subtract(x, y)
{
    return x - y;
}
// ...
2.) main.js
// include ---------------------------------------
const sf = require("./import_functions.js");
// use -------------------------------------------
var x = sf.add(4,2);
console.log(x);
var y = sf.subtract(4,2);
console.log(y);
output
6
2
The vm module in Node.js provides the ability to execute JavaScript code within the current context (including global object). See http://nodejs.org/docs/latest/api/vm.html#vm_vm_runinthiscontext_code_filename
Note that, as of today, there's a bug in the vm module that prevents runInThisContext from doing the right thing when invoked from a new context. This only matters if your main program executes code within a new context and then that code calls runInThisContext. See https://github.com/joyent/node/issues/898
Sadly, the with(global) approach that Fernando suggested doesn't work for named functions like "function foo() {}"
In short, here's an include() function that works for me:
var fs = require('fs');
var vm = require('vm');

function include(path) {
    var code = fs.readFileSync(path, 'utf-8');
    vm.runInThisContext(code, path);
}
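A quick sketch of how that include() might be used, assuming a hypothetical tools.js that declares function add(a, b) { return a + b; }:
include('./tools.js');   // evaluates tools.js against the current global object
console.log(add(2, 3));  // 5, because add() is now defined globally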
Say we want to call the functions ping() and add(30,20), which live in the file lib.js, from main.js.
main.js
lib = require("./lib.js")
output = lib.ping();
console.log(output);
//Passing Parameters
console.log("Sum of A and B = " + lib.add(20,30))
lib.js
this.ping = function ()
{
    return "Ping Success"
}

//Functions with parameters
this.add = function(a,b)
{
    return a+b
}
Udo G. said:
The eval() can't be used inside a function and must be called inside
the global scope otherwise no functions or variables will be
accessible (i.e. you can't create a include() utility function or
something like that).
He's right, but there's a way to affect the global scope from a function. Improving his example:
var fs = require('fs');

function include(file_) {
    with (global) {
        eval(fs.readFileSync(file_) + '');
    };
};

include('somefile_with_some_declarations.js');
// the declarations are now accessible here.
Hope that helps.
app.js
let { func_name } = require('path_to_tools.js');
func_name(); //function calling
tools.js
let func_name = function() {
    ...
    //function body
    ...
};
module.exports = { func_name };
It worked for me like the following...
Lib1.js
//Any other private code here
// Code you want to export
exports.function1 = function(params) {.......};
exports.function2 = function(params) {.......};
// Again any private code
Now, in the Main.js file, you need to include Lib1.js:
var mylib = require('lib1.js');
mylib.function1(params);
mylib.function2(params);
Please remember to put Lib1.js in the node_modules folder.
Another way to do this, in my opinion, is to execute everything in the lib file when you call the require() function, by wrapping it in (function(/* things here */){})();. Doing this makes all these functions global scope, exactly like the eval() solution.
src/lib.js
(function () {
    funcOne = function() {
        console.log('mlt funcOne here');
    }
    funcThree = function(firstName) {
        console.log(firstName, 'calls funcThree here');
    }
    name = "Mulatinho";
    myobject = {
        title: 'Node.JS is cool',
        funcFour: function() {
            return console.log('internal funcFour() called here');
        }
    }
})();
And then in your main code you can call your functions by name like:
main.js
require('./src/lib')
funcOne();
funcThree('Alex');
console.log(name);
console.log(myobject);
console.log(myobject.funcFour());
This will produce the following output:
bash-3.2$ node -v
v7.2.1
bash-3.2$ node main.js
mlt funcOne here
Alex calls funcThree here
Mulatinho
{ title: 'Node.JS is cool', funcFour: [Function: funcFour] }
internal funcFour() called here
undefined
Pay attention to the undefined when you call myobject.funcFour(): funcFour() returns the result of console.log(), which is undefined, so console.log(myobject.funcFour()) prints undefined. It would be the same if you loaded the file with eval(). Hope it helps :)
You can put your functions in global variables, but it's better practice to just turn your tools script into a module. It's really not too hard – just attach your public API to the exports object. Take a look at Understanding Node.js' exports module for some more detail.
I just want to add that, in case you need only certain functions imported from your tools.js, you can use a destructuring assignment, which is supported in Node.js since version 6.4 (see node.green).
Example:
(both files are in the same folder)
tools.js
module.exports = {
    sum: function(a,b) {
        return a + b;
    },
    isEven: function(a) {
        return a % 2 == 0;
    }
};
main.js
const { isEven } = require('./tools.js');
console.log(isEven(10));
output: true
This also avoids assigning those functions as properties of another object, as is the case with the following (common) assignment:
const tools = require('./tools.js');
where you need to call tools.isEven(10).
NOTE:
Don't forget to prefix your file name with the correct path - even if both files are in the same folder, you need to prefix with ./
From Node.js docs:
Without a leading '/', './', or '../' to indicate a file, the module
must either be a core module or is loaded from a node_modules folder.
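For example (lodash here is just a stand-in for any package installed under node_modules; the point is the leading ./):
const tools = require('./tools');   // relative path: loads ./tools.js next to this file
const lodash = require('lodash');   // bare name: resolved from node_modules or core modules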
Include file and run it in given (non-global) context
fileToInclude.js
define({
"data": "XYZ"
});
main.js
var fs = require("fs");
var vm = require("vm");

function include(path, context) {
    var code = fs.readFileSync(path, 'utf-8');
    vm.runInContext(code, vm.createContext(context));
}

// Include file
var customContext = {
    "define": function (data) {
        console.log(data);
    }
};
include('./fileToInclude.js', customContext);
Using the ESM module system:
a.js:
export default function foo() {};
export function bar() {};
b.js:
import foo, {bar} from './a.js';
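One setup note (an assumption about your project, not something stated in the question): Node.js treats .js files as CommonJS by default, so to use this syntax you generally either add "type": "module" to package.json or name the files a.mjs and b.mjs. With that in place, b.js can call both exports:
import foo, { bar } from './a.js';

foo();   // default export from a.js
bar();   // named export from a.js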
This is the best way I have come up with so far.
var fs = require('fs'),
    includedFiles_ = {};

global.include = function (fileName) {
    var sys = require('sys');
    sys.puts('Loading file: ' + fileName);
    var ev = require(fileName);
    for (var prop in ev) {
        global[prop] = ev[prop];
    }
    includedFiles_[fileName] = true;
};

global.includeOnce = function (fileName) {
    if (!includedFiles_[fileName]) {
        include(fileName);
    }
};

global.includeFolderOnce = function (folder) {
    var file, fileName,
        sys = require('sys'),
        files = fs.readdirSync(folder);

    var getFileName = function(str) {
            var splited = str.split('.');
            splited.pop();
            return splited.join('.');
        },
        getExtension = function(str) {
            var splited = str.split('.');
            return splited[splited.length - 1];
        };

    for (var i = 0; i < files.length; i++) {
        file = files[i];
        if (getExtension(file) === 'js') {
            fileName = getFileName(file);
            try {
                includeOnce(folder + '/' + file);
            } catch (err) {
                // if (ext.vars) {
                //     console.log(ext.vars.dump(err));
                // } else {
                sys.puts(err);
                // }
            }
        }
    }
};

includeFolderOnce('./extensions');
includeOnce('./bin/Lara.js');
var lara = new Lara();
You still need to declare what you want to export:
includeOnce('./bin/WebServer.js');

function Lara() {
    this.webServer = new WebServer();
    this.webServer.start();
}
Lara.prototype.webServer = null;
module.exports.Lara = Lara;
You can simply require('./filename').
E.g.:
// file: index.js
var express = require('express');
var app = express();
var child = require('./child');

app.use('/child', child);

app.get('/', function (req, res) {
    res.send('parent');
});

app.listen(process.env.PORT, function () {
    console.log('Example app listening on port ' + process.env.PORT + '!');
});
// file: child.js
var express = require('express'),
    child = express.Router();

console.log('child');

child.get('/child', function(req, res){
    res.send('Child2');
});

child.get('/', function(req, res){
    res.send('Child');
});

module.exports = child;
Please note that:
you can't listen on a PORT in the child file; only the parent Express module has the PORT listener
the child uses a 'Router', not the parent Express module.
Node works based on CommonJS modules and, more recently, ESM modules. Basically, you should create modules in separate .js files and make use of imports/exports (module.exports and require).
JavaScript in the browser works differently, based on scope. There is the global scope, and through closures (functions inside other functions) you have private scopes.
So, in Node, export the functions and objects that you will consume in other modules.
The cleanest way IMO is the following. In tools.js:
function A(){
    // ...
}

function B(){
    // ...
}

module.exports = {
    A,
    B
}
Then, in app.js, just require tools.js as follows: const tools = require("./tools");
I was likewise searching for a way to include code without writing modules, or rather to reuse the same tested standalone sources from a different project for a Node.js service, and jmparatte's answer did it for me.
The benefit is that you don't pollute the namespace, I don't have trouble with "use strict";, and it works well.
Here a full sample:
Script to load - /lib/foo.js
"use strict";
(function(){
var Foo = function(e){
this.foo = e;
}
Foo.prototype.x = 1;
return Foo;
}())
SampleModule - index.js
"use strict";
const fs = require('fs');
const path = require('path');
var SampleModule = module.exports = {
instAFoo: function(){
var Foo = eval.apply(
this, [fs.readFileSync(path.join(__dirname, '/lib/foo.js')).toString()]
);
var instance = new Foo('bar');
console.log(instance.foo); // 'bar'
console.log(instance.x); // '1'
}
}
Hope this was helpful somehow.
Say you have a file abc.txt, and many more.
Create 2 files: fileread.js and fetchingfile.js, then in fileread.js write this code:
function fileread(filename) {
    var contents = fs.readFileSync(filename);
    return contents;
}

var fs = require("fs"); // file system
//var data = fileread("abc.txt");
module.exports.fileread = fileread;
//data.say();
//console.log(data.toString());
In fetchingfile.js write this code:
function myerror(){
    console.log("Hey need some help");
    console.log("type file=abc.txt");
}

var ags = require("minimist")(process.argv.slice(2), { string: "file" });
if(ags.help || !ags.file) {
    myerror();
    process.exit(1);
}

var hello = require("./fileread.js");
var data = hello.fileread(ags.file); // importing module here
console.log(data.toString());
Now, in a terminal:
$ node fetchingfile.js --file=abc.txt
You are passing the file name as an argument; moreover, you could include all files in fileread.js instead of passing them in.
Thanks
Another method when using Node.js and the Express.js framework:
var f1 = function(){
    console.log("f1");
}
var f2 = function(){
    console.log("f2");
}
module.exports = {
    f1 : f1,
    f2 : f2
}
Store this in a js file named s.js in the statics folder.
Now, to use the functions:
var s = require('../statics/s');
s.f1();
s.f2();
To turn "tools" into a module, I don't see hard at all. Despite all the other answers I would still recommend use of module.exports:
//util.js
module.exports = {
    myFunction: function () {
        // your logic in here
        let message = "I am message from myFunction";
        return message;
    }
}
Now we need to require this module (in your app|index|server.js):
var util = require('./util');
Now you can refer to and call the function as:
//util.myFunction();
console.log(util.myFunction()); // prints in console :I am message from myFunction
To interactively test the module ./test.js in a Unix environment, something like this could be used:
>> node -e "eval(''+require('fs').readFileSync('./test.js'))" -i
...
Use:
var mymodule = require("./tools.js")
in app.js, and in tools.js define:
module.exports.<your function> = function () {
    <what the function should do>
}

Dynamic require in Nodejs

I'm requiring a library in NodeJS that has a self-invoking function, which results in an error because it looks for an object that is not initialized at that moment.
I want to dynamically require this library once that object is initialized.
Is there any way to dynamically require/load a library?
This is the part of the library being required:
https://github.com/sakren/node-google-maps/blob/develop/lib/Google.js#L5
Actually I want to require it only when the window object is present (client-side rendering).
So, something like this:
'use strict';

var React = require('react');
var Map = require('./map.jsx');
var Common = require('../common/common');
var MapStatic = require('./map-static.jsx');

exports.type = function() {
    return 'map';
};

exports.jsx = function(data) {
    if (Common.isServerSide()) {
        return (<MapStatic data={data}/>);
    } else {
        return (
            <Map data={data}/>
        );
    }
};

exports.transform = require('./map-transform.js');
The reason the code looks weird is that I'm using React.
In Node.js, require can be used anywhere at any time without many limitations, AFAIK.
Which error is thrown once you require at runtime?
In your else branch.
Try the following.
var requires = {};

function getter(key) {
    if (!requires[key]) {
        requires[key] = require(key);
    }
    return requires[key];
}
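Applied to the question's code, the else branch would then pull in the component lazily through getter() instead of the top-level require (a sketch only; map.jsx, Common and MapStatic come from the question):
exports.jsx = function(data) {
    if (Common.isServerSide()) {
        return (<MapStatic data={data}/>);
    } else {
        var Map = getter('./map.jsx'); // required only once, on the first client-side render
        return (<Map data={data}/>);
    }
};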

Send parameters to jshint reporter in Gulp

I have a Gulpfile with JSHint configured to use the jshint-stylish reporter. I need to pass the option verbose to the reporter in order to display warning codes. Is it possible to do this using Gulp?
Currently my gulpfile.js looks like this:
var gulp = require('gulp');
var jshint = require('gulp-jshint');
var compass = require('gulp-compass');
var path = require('path');
require('shelljs/global');

var jsFiles = ['www/js/**/*.js', '!www/js/libraries/**/*.js', 'www/spec/**/*.js', '!www/spec/lib/**/*.js'];
var sassFiles = 'www/sass/*.scss';

gulp.task('lint', function () {
    return gulp
        .src(jsFiles)
        .pipe(jshint())
        .pipe(jshint.reporter('jshint-stylish'));
});

gulp.task('compass', function () {
    gulp.src(sassFiles)
        .pipe(compass({
            project: path.join(__dirname, 'www'),
            css: 'css',
            sass: 'sass',
            image: 'img',
            font: 'fonts'
        })).on('error', function() {});
});

var phonegapBuild = function (platform) {
    if (!which('phonegap')) {
        console.log('phonegap command not found')
        return 1;
    }
    exec('phonegap local build ' + platform);
};

gulp.task('build:android', ['lint', 'compass'], function () {
    phonegapBuild('android');
});

gulp.task('build:ios', ['lint', 'compass'], function () {
    phonegapBuild('ios');
});

gulp.task('watch', function() {
    gulp.watch(jsFiles, ['lint']);
    gulp.watch(sassFiles, ['compass']);
});

gulp.task('default', ['lint', 'compass']);
Well, this, plus the fact that the output of the stylish reporter is hardly readable on Windows due to the darkness of the blue text (so I kept going in and manually changing the colour after installing it), made me do something about it. So you should hopefully have more luck with this reporter I've just written:
https://github.com/spiralx/jshint-summary
You basically use it like this:
var summary = require('jshint-summary');
// ...
.pipe(jshint.reporter(summary({
    verbose: true,
    reasonCol: 'cyan,bold',
    codeCol: 'green'
})));
and the summary function will initialise the function passed to JSHint with those settings - see the page on Github for a bit more documentation.
It's got some very basic tests, and the library's gulpfile.js uses it to show its own JSHint output :)
How about using a similar technique to the one you already used with phonegap?
var jshint = function (parameter) {
    // todo: define paths with js files, or pass them as parameter too
    exec('jshint ' + paths + ' ' + parameter);
};
Based on https://github.com/wearefractal/gulp-jshint/blob/master/index.js#L99 it appears that gulp-jshint doesn't facilitate passing more than the name to the reporter if you load it with a string. It seems a simple thing to extend though. I'll race you to a pull request. :D
Alternatively, try something like this:
var stylish = require('jshint-stylish');
// ...
.pipe(jshint.reporter(stylish(opt)));
I'm pretty sure I have the syntax wrong, but this may get you unstuck.
It's annoying, and makes any decent reporter somewhat tricky to use within the existing framework. I've come up with this hack for the stylish reporter; it's currently just in my gulpfile.js:
function wrapStylishReporter(reporterOptions) {
    var reporter = require('jshint-stylish').reporter,
        reporterOptions = reporterOptions || {};

    var wrapped = function(results, data, config) {
        var opts = [config, reporterOptions].reduce(function(dest, src) {
            if (src) {
                for (var k in src) {
                    dest[k] = src[k];
                }
            }
            return dest;
        }, {});
        reporter(results, data, opts);
    };

    return jshint.reporter(wrapped);
}
And then for the task definition itself:
gulp.task('lint', function() {
    return gulp.src('+(bin|lib)/**/*.js')
        .pipe(jshint())
        .pipe(wrapStylishReporter({ verbose: true }))
        .pipe(jshint.reporter('fail'));
});
Ideally reporters would either be a function that takes an options parameter and returns the reporter function, or a fairly basic class so you could have options as well as state.
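For illustration, such a factory might look like this. This is a hypothetical sketch, not an existing gulp-jshint or jshint-stylish API; the results array follows the same {file, error} shape used by the wrapped reporter above:
function makeReporter(options) {
    options = options || {};
    // returns a JSHint-style reporter function that can be handed to jshint.reporter()
    return function (results, data, config) {
        results.forEach(function (result) {
            var err = result.error;
            var code = options.verbose ? ' (' + err.code + ')' : '';
            console.log(result.file + ': line ' + err.line + ', ' + err.reason + code);
        });
    };
}
// usage: .pipe(jshint.reporter(makeReporter({ verbose: true })))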

Node.JS - fs.exists not working?

I'm a beginner in Node.js, and was having trouble with this piece of code.
var fs = require('fs');

Framework.Router = function() {
    this.run = function(req, res) {
        fs.exists(global.info.controller_file, function(exists) {
            if (exists) {
                // Here's the problem
                res.writeHead(200, {'Content-Type':'text/html'});
                var cname = App.ucfirst(global.info.controller)+'Controller';
                var c = require(global.info.controller_file);
                var c = new App[cname]();
                var action = global.info.action;
                c[action].apply(global.info.action, global.info.params);
                res.end();
            } else {
                App.notFound();
                return false;
            }
        });
    }
};
The problem lies in the part after checking whether 'global.info.controller_file' exists; I can't seem to get the code inside if (exists) { ... } to work properly.
I tried logging out the values for all the variables in that section, and they have their expected values, however the line: c[action].apply(global.info.action, global.info.params);
is not running as expected. It is supposed to call a function in the controller_file and do a simple res.write('hello world');. I wasn't having this problem before I started checking for the file using fs.exists. Everything inside the if statement worked perfectly fine before this check.
Why is the code not running as expected? Why does the request just time out?
Does it have something to do with the whole synchronous vs asynchronous thing? (Sorry, I'm a complete beginner)
Thank you
Like others have commented, I would suggest you rewrite your code to bring it more in-line with the Node.js design patterns, then see if your problem still exists. In the meantime, here's something which may help:
The advice about not using require dynamically at "run time" should be heeded, and calling fs.exists() on every request is tremendously wasteful. However, say you want to load all *.js files in a directory (perhaps a "controllers" directory). This is best accomplished using an index.js file.
For example, save the following as app/controllers/index.js
var fs = require('fs');
var files = fs.readdirSync(__dirname);
var dotJs = /\.js$/;

for (var i in files) {
    if (files[i] !== 'index.js' && dotJs.test(files[i]))
        exports[files[i].replace(dotJs, '')] = require('./' + files[i]);
}
Then, at the start of app/router.js, add:
var controllers = require('./controllers');
Now you can access the app/controllers/test.js module by using controllers.test. So, instead of:
fs.exists(controllerFile, function (exists) {
    if (exists) {
        ...
    }
});
simply:
if (controllers[controllerName]) {
    ...
}
This way you can retain the dynamic functionality you desire without unnecessary disk IO.
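Putting it together, the run method from the question could then be reduced to something like the following sketch. It keeps the same assumptions as the original code (each controller module registers its class on App when loaded, and global.info carries the controller, action and params), and it applies the action with the controller instance as this rather than the action name:
var controllers = require('./controllers');

Framework.Router = function() {
    this.run = function(req, res) {
        if (controllers[global.info.controller]) {
            res.writeHead(200, {'Content-Type': 'text/html'});
            var cname = App.ucfirst(global.info.controller) + 'Controller';
            var c = new App[cname]();
            c[global.info.action].apply(c, global.info.params);
            res.end();
        } else {
            App.notFound();
        }
    };
};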

What is the best way to expose methods from Node.js?

Consider I want to expose a method called Print
Binding method as prototype:
File Saved as Printer.js
var printerObj = function(isPrinted) {
    this.printed = isPrinted;
}

printerObj.prototype.printNow = function(printData) {
    console.log('= Print Started =');
};

module.exports = printerObj;
Then access printNow() by putting code require('Printer.js').printNow() in any external .js node program file.
Export method itself using module.exports:
File Saved as Printer2.js
var printed = false;

function printNow() {
    console.log('= Print Started =');
}

module.exports.printNow = printNow;
Then access printNow() by putting code require('Printer2.js').printNow() in any external .js node program file.
Can anyone tell what is the difference and best way of doing it with respect to Node.js?
Definitely the first way. It is called the substack pattern and you can read about it on Twitter and on Mikeal Rogers' blog. Some code examples can be found at the jade github repo in the parser:
var Parser = exports = module.exports = function Parser(str, filename, options){
    this.input = str;
    this.lexer = new Lexer(str, options);
    ...
};

Parser.prototype = {
    context: function(parser){
        if (parser) {
            this.contexts.push(parser);
        } else {
            return this.contexts.pop();
        }
    },
    advance: function(){
        return this.lexer.advance();
    }
};
In the first example you are creating a class, ideally you should use it with "new" in your caller program:
var PrinterObj = require('./Printer.js');
var printer = new PrinterObj(true);
printer.printNow();
This is a good read on the subject: http://www.2ality.com/2012/01/js-inheritance-by-example.html
In the second example you are returning a function.
The difference is that you can have multiple instances of the first example (provided you use new as indicated) but only one instance of the second approach.
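A quick way to see that difference (hypothetical usage of the two files above; require() caches modules, so Printer2.js is evaluated only once):
// Prototype style: Printer.js exports a constructor, so each caller gets its own instance
var Printer = require('./Printer.js');
var a = new Printer(false);
var b = new Printer(true);
console.log(a.printed, b.printed); // false true

// Exported-function style: every require() of Printer2.js returns the same cached object
var p1 = require('./Printer2.js');
var p2 = require('./Printer2.js');
console.log(p1 === p2); // true, one shared exports object
p1.printNow();          // '= Print Started ='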
