Assuming I have the content of a js file in a string. Furthermore, assume it has an exports['default'] = function() {...} and/or other exported properties or functions. Is there any way to "require" it (compile it) from that string into an object, such that I can use it? (Also, I don't want to cache it like require() does.)
Here's a very simple example using vm.runInThisContext():
const vm = require('vm');
let code = `
  exports['default'] = function() {
    console.log('hello world');
  }
`;
global.exports = {}; // this is what `exports` in the code will refer to
vm.runInThisContext(code);
global.exports.default(); // "hello world"
Or, if you don't want to use globals, you can achieve something similar using eval:
let sandbox = {};
let wrappedCode = `void function(exports) { ${ code } }(sandbox)`;
eval(wrappedCode);
sandbox.default(); // "hello world"
Both methods assume that the code you're feeding them is "safe", because both will happily run arbitrary code.
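For completeness, vm.runInNewContext() gets you both properties at once: no globals and no eval. A minimal sketch, reusing the code string from above:
const vm = require('vm');

let sandbox = { exports: {} }; // sandbox properties become the globals of the new context
vm.runInNewContext(code, sandbox); // `exports` in the code resolves to sandbox.exports
sandbox.exports.default(); // "hello world"
Nothing is cached here either; each call compiles and runs the string afresh. The same safety caveat applies.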
Related
From a performance perspective, is there any difference between invoking code by wrapping it in a function and then exporting it:
function doSomething () {
  // doing something here
}
module.exports = doSomething();
And just requiring it without any exports? Like this:
myModule.js
// Code doing something
file that requires the module:
var doSomething = require('./myModule');
And if the purpose of the code inside the module is to run just once, do I need to store it in a variable?
If you don't need the return value of that function, then you don't have to store it in a variable.
The difference between:
function doSomething () {
  // doing something here
}
module.exports = doSomething();
and using:
var x = require('module');
var y = require('module');
vs.
function doSomething () {
  // doing something here
}
module.exports = doSomething;
and using:
var x = require('module')();
var y = require('module')();
is that in the first case, the function will be run only once, while in the second case the function will be run twice.
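To see why, remember that require caches a module after the first load; the module body runs only once no matter how many times you require it. A quick sketch, assuming a hypothetical counter.js:
// counter.js (hypothetical)
console.log('module body evaluated');
module.exports = function doSomething() {
  return 42;
};

// main.js
var x = require('./counter'); // prints "module body evaluated"
var y = require('./counter'); // prints nothing: served from the require cache
console.log(x === y); // true: both get the same cached export
So with module.exports = doSomething() the call happens once at load time, while module.exports = doSomething makes each require(...)() call it again.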
The difference is that if you just include it without module.exports, the code will execute immediately but stay private to the module. You can only access its data if you export it somehow, with module.exports. The export can be either a function or a JavaScript object. Essentially, you can view everything within the module as being completely hidden from everything else in your application.
The only shortcut that I know of is for JSON files. If you look here: Module.exports vs plain json for config files, you can see that you can require('file.json') and the contents of the JSON file will be parsed into a JavaScript object that you can then use in your application.
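For instance, assuming a hypothetical config.json containing { "port": 8080 }:
var config = require('./config.json'); // parsed for you, no fs or JSON.parse needed
console.log(config.port); // 8080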
Hi, I found a framework that uses this pattern a lot:
exports.install = function() {
  // code
}
but usually in Node.js you see this pattern:
module.exports = {
  // code
}
Is this the same thing, or is it something else?
exports is the object corresponding to module.exports before you do anything to it. I think it's due to some legacy code, but basically folks use module.exports if they want to replace the whole object with their own object or a function, while they use exports if they just want to hang functions off the module. It's a little confusing at first, but essentially exports.install just means that calling code would do something like:
const mod = require('that-module');
mod.install(params, callback); // call that function
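In other words, as long as neither variable is reassigned, the two spellings are interchangeable:
// that-module.js
exports.install = function () { /* ... */ };
// ...is exactly equivalent to:
module.exports.install = function () { /* ... */ };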
The framework you're looking at is probably using it as part of a bootstrapping process; as far as I know, install has no special significance to the Node engine itself.
Yes, it is the same thing; you can set up your code either way.
The difference is in what the two names reference. Initially, exports and module.exports point to the same object. Think of exports as a plain local variable: reassigning it breaks that link, which is why you cannot export your module this way:
Given this module:
// test.js
exports = {
  // this does NOT export anything: reassigning `exports`
  // points it at a brand-new object, while `module.exports`
  // still refers to the original (empty) one
  sayHello: function() {
    console.log("Hello !");
  }
}
The following code will then fail with TypeError: test.sayHello is not a function:
// app.js
var test = require("./test");
test.sayHello();
// You will get TypeError: test.sayHello is not a function
The correct way is to use module.exports to export your module:
// test.js
module.exports = {
  sayHello: function() {
    console.log("Hello !");
  }
}
// app.js
var test = require("./test");
test.sayHello();
// Console prints: Hello !
So which one you use is just a matter of developer style.
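A minimal illustration you can drop into any module to watch the aliasing break:
console.log(exports === module.exports); // true: same object
exports.sayHello = function() {};        // fine: mutates the shared object
exports = { sayHello: function() {} };   // rebinds the local variable only
console.log(exports === module.exports); // false: require() still returns the old object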
Writing the simplest module we can, we put this into hello.js:
var hello = function(){
  console.log('hello');
};
exports = hello; // doesn't work on an Amazon EC2 Ubuntu instance nor in Windows PowerShell
I run Node and require the module
var hello = require('./hello');
hello;
and an empty object {} gets returned when I'm supposed to get [Function].
I tried replacing exports with module.exports, but this doesn't work in my Windows PowerShell. It does work on my Amazon EC2 Ubuntu instance, so why doesn't exports work? Has the API changed? And what could possibly be happening with PowerShell that neither of these works?
I know Windows isn't the most desirable development environment, but I can't get my head around such a simple mishap.
EDIT
Exporting with ES6 is a little nicer:
export const hello = function(){
  console.log('hello');
};
Importing will look like:
import {hello} from './file';
Original answer
You'll want to use module.exports
var hello = function(){
  console.log('hello');
};
module.exports = hello;
If just exporting one thing, I'll usually do it all in one line
var hello = module.exports = function() {
  console.log('hello');
};
Extras
If you use a named function, in the event an error occurs in your code, your stack trace will look a lot nicer. Here's the way I would write it
// use a named function ↓
var hello = module.exports = function hello() {
  console.log("hello");
};
Now instead of showing anonymous for the function name in the stack trace, it will show you hello. This makes finding bugs so much easier.
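For example, if the exported function throws:
var hello = module.exports = function hello() {
  throw new Error("oops");
};
hello();
// the top stack frame reads "at hello (...)" instead of an anonymous frame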
I use this pattern everywhere so that I can debug code easily. Here's another example
// event listeners ↓
mystream.on("end", function onEnd() {
  console.log("mystream ended");
});
// callbacks ↓
Pokemon.where({name: "Metapod"}, function pokemonWhere(err, result) {
  // do stuff
});
If you want to export multiple things, you can use exports directly, but you must provide a key
// lib/foobar.js
exports.foo = function foo() {
  console.log("hello foo!");
};
exports.bar = function bar() {
  console.log("hello bar!");
};
Now when you use that file
var foobar = require("./lib/foobar");
foobar.foo(); // hello foo!
foobar.bar(); // hello bar!
As a final bonus, I'll show you how you can rewrite that foobar.js by exporting a single object but still getting the same behavior
// lib/foobar.js
module.exports = {
  foo: function foo() {
    console.log("hello foo!");
  },
  bar: function bar() {
    console.log("hello bar!");
  }
};
// works the same as before!
This allows you to write modules in whichever way is best suited for that particular module. Yay!
The reason exports is not working is a reference conflict. The top-level variable in each file is module, which has a property module.exports. When the module is loaded, a new variable is created behind the scenes. Something like this happens:
var exports = module.exports;
Obviously exports is a reference to module.exports, but doing
exports = function(){};
forces the exports variable to point at that function object; it does not change module.exports. It's like doing:
var TEST = { foo: 1 };
var foo = TEST.foo;
foo = "bar";
console.log(TEST.foo);
// 1
Common practice is to do:
module.exports = exports = function() { ... };
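That way both names point at the new function, so anything you later hang off exports still lands on the exported object:
module.exports = exports = function main() { /* ... */ };
exports.helper = function helper() { /* ... */ }; // ends up on the exported function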
I have no idea why it doesn't work under Windows PowerShell. To be honest I'm not even sure what that is. :) Can't you just use the native command prompt?
Consider that I want to expose a print method.
Binding the method to a prototype:
File saved as Printer.js
var printerObj = function(isPrinted) {
  this.printed = isPrinted;
}
printerObj.prototype.printNow = function(printData) {
  console.log('= Print Started =');
};
module.exports = printerObj;
Then access printNow() by putting require('./Printer.js').printNow() in any external .js Node program file.
Exporting the method itself using module.exports:
File saved as Printer2.js
var printed = false;
function printNow() {
  console.log('= Print Started =');
}
module.exports.printNow = printNow;
Then access printNow() by putting require('./Printer2.js').printNow() in any external .js Node program file.
Can anyone tell what is the difference and best way of doing it with respect to Node.js?
Definitely the first way. It is called the substack pattern and you can read about it on Twitter and on Mikeal Rogers' blog. Some code examples can be found at the jade github repo in the parser:
var Parser = exports = module.exports = function Parser(str, filename, options){
  this.input = str;
  this.lexer = new Lexer(str, options);
  ...
};
Parser.prototype = {
  context: function(parser){
    if (parser) {
      this.contexts.push(parser);
    } else {
      return this.contexts.pop();
    }
  },
  advance: function(){
    return this.lexer.advance();
  }
};
In the first example you are creating a constructor; ideally you should use it with new in your calling program:
var PrinterObj = require('./Printer.js');
var printer = new PrinterObj();
printer.printNow();
This is a good read on the subject: http://www.2ality.com/2012/01/js-inheritance-by-example.html
In the second example you are exporting an object whose printNow property is the function itself.
The difference is that you can have multiple instances of the first example (provided you use new as indicated) but only one instance of the second approach.
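A quick sketch of that difference, using the file names from the question:
var PrinterObj = require('./Printer.js');
var a = new PrinterObj(true);
var b = new PrinterObj(false);
console.log(a.printed, b.printed); // true false: independent instances

var p1 = require('./Printer2.js');
var p2 = require('./Printer2.js');
console.log(p1 === p2); // true: the same cached object every time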
Maybe this question is a little silly, but is it possible to load multiple .js files with one require statement? Like this:
var mylib = require('./lib/mylibfiles');
and use:
mylib.foo(); // returns "hello from one"
mylib.bar(); // returns "hello from two"
And in the folder mylibfiles will have two files:
One.js
exports.foo= function(){return "hello from one";}
Two.js
exports.bar= function(){return "hello from two";}
I was thinking of putting a package.json in the folder that tells it to load all the files, but I don't know how. Another approach I was considering is to have an index.js that exports everything again, but then I would be duplicating work.
Thanks!!
P.S.: I'm working with Node.js v0.6.11 on a Windows 7 machine
First of all, using require does not duplicate anything. It loads the module and caches it, so calling require again will get it from memory (thus you can modify a module on the fly without touching its source code; this is sometimes desirable, for example when you want to store a db connection inside a module).
Also package.json does not load anything and does not interact with your app at all. It is only used for npm.
Also, you cannot require multiple modules at once. For example, what would happen if both One.js and Two.js defined a function with the same name? There are more problems like that.
But what you can do is write an additional file, say modules.js, with the following content:
module.exports = {
  one: require('./One.js'),
  two: require('./Two.js'),
  /* some other modules you want */
}
and then you can simply use
var modules = require('./modules.js');
modules.one.foo();
modules.two.bar();
I have a snippet of code that requires more than one module, but it doesn't clump them together as your post suggests. However, that can be overcome with a trick that I found.
function requireMany () {
  return Array.prototype.slice.call(arguments).map(function (value) {
    try {
      return require(value)
    }
    catch (event) {
      return console.log(event)
    }
  })
}
And you use it as such
requireMany("fs", "socket.io", "path")
Which will return
[ fs {}, socketio {}, path {} ]
If a module is not found, an error will be sent to the console. It won't break the programme. The error will be shown in the array as undefined. The array will not be shorter because one of the modules failed to load.
Then you can bind each of those array elements to a variable name, like so:
var [fs, socketio, path] = requireMany("fs", "socket.io", "path")
Destructuring works much like object assignment: each element of the returned array is bound to the corresponding variable name in the pattern. So, in your case, you could do:
var [foo, bar] = requireMany("./foo.js", "./bar.js")
foo() //return "hello from one"
bar() //return "hello from two"
And if you do want it to break the programme on error, just use this modified version, which is smaller
function requireMany () {
  return Array.prototype.slice.call(arguments).map(require)
}
Yes, you may require a folder as a module, according to the node docs. Let's say you want to require() a folder called ./mypack/.
Inside ./mypack/, create a package.json file with the name of the package and a "main" entry pointing at the main JavaScript file, here inside a ./lib/ directory:
{
  "name": "mypack",
  "main": "./lib/mypack.js"
}
Now you can use require('./mypack') and node will load ./mypack/lib/mypack.js.
However if you do not include this package.json file, it may still work. Without the file, node will attempt to load ./mypack/index.js, or if that's not there, ./mypack/index.node.
My understanding is that this could be beneficial if you have split your program into many javascript files but do not want to concatenate them for deployment.
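For example, in your case a tiny index.js inside mylibfiles would do it; a sketch using the file names from the question:
// mylibfiles/index.js
exports.foo = require('./One').foo;
exports.bar = require('./Two').bar;
Then require('./lib/mylibfiles') picks up index.js automatically, and mylib.foo() / mylib.bar() work as you wanted.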
You can use destructuring assignment to map an array of exported modules from require statements in one line:
const requires = (...modules) => modules.map(module => require(module));
const [fs, path] = requires('fs', 'path');
I was doing something similar to what @freakish suggests in his answer, in a project where I have a list of test scripts that are pulled into a Puppeteer + Jest testing setup. My test files follow the naming convention testname1.js - testnameN.js, and I was able to use a generator function to require N files from a particular directory with the approach below:
const fs = require('fs');
const path = require('path');

module.exports = class FilesInDirectory {
  constructor(directory) {
    this.fid = fs.readdirSync(path.resolve(directory));
    this.requiredFiles = this.fid.map((fileId) => {
      let resolvedPath = path.resolve(directory, fileId);
      return require(resolvedPath);
    }).filter(file => !!file);
  }

  printRetrievedFiles() {
    console.log(this.requiredFiles);
  }

  nextFileGenerator() {
    const parent = this;
    const fidLength = parent.requiredFiles.length;
    function* iterate(index) {
      while (index < fidLength) {
        yield parent.requiredFiles[index++];
      }
    }
    return iterate(0);
  }
};
Then use like so:
//Use in test
const FilesInDirectory = require('./utilities/getfilesindirectory');
const StepsCollection = new FilesInDirectory('./test-steps');
const StepsGenerator = StepsCollection.nextFileGenerator();
//Assuming we're in an async function
await StepsGenerator.next().value.FUNCTION_REQUIRED_FROM_FILE(someArg);