If I have a string with source code in it
var code = "console.log('I\'ve been loaded.');";
and want to run it in Node, it's normally suggested to use
vm.runInThisContext(code, "NOT_A_FILE.mycode");
This is all well and good until the code becomes more complicated:
var code2 = "require('http');\n" +
"console.log(http);" // TODO make more useless
In this case, we cannot run code2 in vm, because vm doesn't give the code a require() function: it provides a bare V8 context, not a Node one.
This is the only problem I've run into so far, but I don't know what other problems may be down the road.
So, fundamentally, my question is: what extra functionality does require() provide when it creates Node modules, and how can I ensure that my "string" modules have it? If I need this functionality, am I better off just writing temporary files which I actually require()?
Use vm.runInNewContext instead of vm.runInThisContext. It runs the code in a sandbox, and you can pass the require function in yourself. That's it!
var vm = require('vm');

var code2 = "var http = require('http');\n" +
            "console.log(http);"; // TODO make more useless

vm.runInNewContext(code2, {
    require: require,
    http: require('http'),
    console: console
}, 'yourvmfilename1');
Or you can pass the http object in directly.
var vm = require('vm');

var code3 = "console.log(http);"; // TODO make more useless

vm.runInNewContext(code3, {
    http: require('http'),
    console: console
}, 'yourvmfilename2');
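As for the broader question of what require() adds: Node wraps every module file in a function and passes exports, require, module, __filename and __dirname in as arguments. You can imitate that wrapper with vm to give your string modules the same environment. A minimal sketch (the wrapper signature is Node's documented one; the runAsModule helper is an illustration of mine):

var vm = require('vm');

function runAsModule(code, filename) {
    // Node's documented module wrapper signature
    var wrapped = '(function (exports, require, module, __filename, __dirname) {\n' +
                  code +
                  '\n})';
    var fn = vm.runInThisContext(wrapped, filename);
    var mod = { exports: {} };
    // hand the host's require to the string code so it can load real modules
    fn(mod.exports, require, mod, filename, process.cwd());
    return mod.exports;
}

runAsModule("var http = require('http');\nconsole.log(http);", 'NOT_A_FILE.mycode');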
The following Gulp task does almost what I want.
const gulp = require('gulp');
const browserify = require('browserify');
const vinylStream = require('vinyl-source-stream');
const vinylBuffer = require('vinyl-buffer');
const watchify = require('watchify');
const glob = require('glob');
const jasmineBrowser = require('gulp-jasmine-browser');
gulp.task('test', function() {
    let testBundler = browserify({
        entries: glob.sync('src/**/*-test.js'),
        cache: {},
        packageCache: {},
    }).plugin(watchify);
    function updateSpecs() {
        return testBundler.bundle()
            .pipe(vinylStream(jsBundleName)) // jsBundleName is defined elsewhere in my gulpfile
            .pipe(vinylBuffer())
            .pipe(jasmineBrowser.specRunner({console: true}))
            .pipe(jasmineBrowser.headless({driver: 'phantomjs'}));
    }
    testBundler.on('update', updateSpecs);
    updateSpecs();
});
It bundles all my Jasmine specs using Browserify and has them tested through gulp-jasmine-browser. It also watches all specs and all modules that they depend on and re-runs the tests if any of these modules changes.
The only ugly bit, which I'd really like to see solved, is that a new PhantomJS instance and a new Jasmine server are created every time updateSpecs is run. I was hoping to avoid that with code like the following:
gulp.task('test', function() {
    let testBundler = browserify({
        entries: glob.sync('src/**/*-test.js'),
        cache: {},
        packageCache: {},
    }).plugin(watchify);
    // persist the Jasmine server and PhantomJS browser
    let testServer = jasmineBrowser.headless({driver: 'phantomjs'});
    function updateSpecs() {
        return testBundler.bundle()
            .pipe(vinylStream(jsBundleName))
            .pipe(vinylBuffer())
            .pipe(jasmineBrowser.specRunner({console: true}))
            .pipe(testServer);
    }
    testBundler.on('update', updateSpecs);
    updateSpecs();
});
Alas, this doesn't work. Right after starting the task, all tests run fine, but the next time updateSpecs is called, I get a write after end error and the task exits with status 1. This error originates from the readable-stream Node module.
As I understand it, the end event during the first run of updateSpecs leaves testServer in a state in which it doesn't accept any new inputs. Unfortunately, the Node.js streams documentation isn't very clear on how to remedy this.
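The error itself is easy to reproduce with nothing but core streams: once end() has been called, the writable side refuses any further input. A minimal sketch:

var stream = require('stream');

var s = new stream.PassThrough();
s.resume();             // drain the readable side
s.end('first run');     // closes the writable side
s.write('second run');  // Error: write after end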
I have tried breaking the pipe chain at a different place, but I got the same result, which seems to indicate this is universal behaviour for streams. I also tried stopping the end event from propagating by inserting a through-stream that didn't re-emit that event, but this prevented the tests from being run at all. Finally, I tried returning the testServer stream from the task; this stopped the error, but although the updateSpecs function gets called every time the sources change, the tests are only being run the first time the task starts. This time, the testServer simply seems to ignore the new input.
The gulp-jasmine-browser documentation suggests that the following code would work:
var watch = require('gulp-watch');

gulp.task('test', function() {
    var filesForTest = ['src/**/*.js', 'spec/**/*-test.js'];
    return gulp.src(filesForTest)
        .pipe(watch(filesForTest))
        .pipe(jasmineBrowser.specRunner())
        .pipe(jasmineBrowser.server());
});
And it goes on to suggest that you can also make this work with Browserify, but this isn't illustrated. Apparently, gulp-watch does something which causes the follow-up pipes to accept updated inputs later. How can I imitate this behaviour with watchify?
As it turns out (this also came up in a GitHub issue), it is a hard rule in Node.js that you cannot write after the end event. In addition, jasmineBrowser.specRunner(), .server() and .headless() must receive the end signal in order to actually run any tests. This restriction is inherited from the official Jasmine test runner.
The example with gulp-watch from the README doesn't actually work, either, for the same reason. In order to make it work, one would have to do something similar to the working version of my watchify code in the question:
gulp.task('test', function() {
    var filesForTest = ['src/**/*.js', 'spec/**/*-test.js'];
    function runTests() {
        return gulp.src(filesForTest)
            .pipe(jasmineBrowser.specRunner())
            .pipe(jasmineBrowser.server());
    }
    watch(filesForTest, runTests); // gulp-watch's callback form runs on add, change and unlink
});
(I didn't test it, but something very close to this should work.)
So whatever watching mechanism you're using, you'll always need to call .specRunner() and .server() again for every cycle. The good news is that apparently, the Jasmine server will be reused if you explicitly pass a port number:
.pipe(jasmineBrowser.server({port: 8080}));
This also applies to .headless().
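Putting it all together, the watchify setup from the question should work along these lines (a sketch based on the code above; the port number is an arbitrary choice):

gulp.task('test', function() {
    let testBundler = browserify({
        entries: glob.sync('src/**/*-test.js'),
        cache: {},
        packageCache: {},
    }).plugin(watchify);
    function updateSpecs() {
        return testBundler.bundle()
            .pipe(vinylStream(jsBundleName))
            .pipe(vinylBuffer())
            // fresh specRunner and headless streams on every cycle...
            .pipe(jasmineBrowser.specRunner({console: true}))
            // ...but a fixed port, so the Jasmine server is reused
            .pipe(jasmineBrowser.headless({driver: 'phantomjs', port: 8080}));
    }
    testBundler.on('update', updateSpecs);
    updateSpecs();
});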
I know one can retrieve data from package.json when the application is launched from the current node module (i.e. in node my.js I can use require('./package.json').name).
But how could I retrieve the information about the current application from a different module? (i.e. an application is launched and it requires a module which requires my module). In that case I need to know what the root application is.
What would be the most efficient and accurate way to retrieve this information?
I don't know if this is the solution you are looking for. As I understand it, you are trying to get the package descriptors (package.json data) of modules.
Maybe you can get the current package descriptor from any module. I saw some information about require('module') that can be useful for that.
_resolveLookupPaths and _resolveFilename are methods that can be useful, but unfortunately the second one only works for global dependencies, so local dependencies are not considered.
I believe this code can help you to get the full path of a module:
// my own package descriptor
var packageJson = require('./package.json');
console.log(packageJson);

// another module's package descriptor
var moduleName = 'mongoose'; // change the module name as needed
var modulePath = require('module')._resolveLookupPaths('./');
var moduleDescriptor = require(modulePath[1][1] + '/' + moduleName + '/package.json');
console.log(moduleDescriptor.version);
When you run this method:
var modulePath = require('module')._resolveLookupPaths('./');
the output is always the same:
[ './',
[ '.',
'/home/<username>/<folder>/<sub-folder>/node_modules',
'/home/<username>/<folder>/node_modules',
'/home/<username>/node_modules',
... ] ]
So the full path for me is modulePath[1][1].
I believe with that you can build the real path of any local module, but I really don't know if this is the best or official way to get the package descriptor of a module.
Edit:
Maybe this can help you:
// index.js
require('mymodule').init();
In node_modules:
// node_modules/mymodule/index.js
module.exports.init = function() {
    var pack = require(process.cwd() + '\\package.json'); // I'm running on Windows!!
    console.log(pack.version); // output: 1.0.0
};
The output (according to the app that is calling my module) is:
// 1.0.0
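Note that process.cwd() is the directory Node was launched from, which is not necessarily the root application's directory. A more robust sketch uses require.main, the module that was run directly (this assumes package.json sits next to the entry script):

// node_modules/mymodule/index.js
var path = require('path');

module.exports.init = function() {
    // require.main points at the root application's entry module
    var rootDir = path.dirname(require.main.filename);
    var pack = require(path.join(rootDir, 'package.json'));
    console.log(pack.version);
};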
I want to use a Yeoman generator inside a Node.js project.
I installed yeoman-generator and generator-git (the generator that I want to use) as local dependencies, and at this moment my code looks like this:
var env = require('yeoman-generator')();
var path = require('path');
var gitGenerator = require('generator-git');
var workingDirectory = path.join(process.cwd(), 'install_here/');
var generator = env.create(gitGenerator);
Obviously, the last line doesn't work and doesn't generate the scaffold.
The question: how do I make this work?
Importantly, I want to stay at the local dependency level!
@simon-boudrias's solution does work, but after I call process.chdir(), this.templatePath() and this.destinationPath() return the same path.
I could have used this.sourcePath() to tweak the template path, but having to change this in every Yeoman generator is not very practical. After digging into yo-cli, I found that the following works without affecting the paths:
var env = require('yeoman-environment').createEnv();
env.lookup(function() {
    env.run('generator-name');
});
env.create() only instantiates a generator; it doesn't run it.
To run it, you could call generator.run(). But that's not ideal.
The best way, IMO, would be this:
var path = require('path');
var env = require('yeoman-generator')();
var gitGenerator = require('generator-git');

// Optional: look up every generator on your system. That'll allow composition if needed:
// env.lookup();

env.registerStub(gitGenerator, 'git:app');
env.run('git:app');
If necessary, make sure to process.chdir() in the right directory before launching your generator.
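For instance, to scaffold into the workingDirectory from the question (a sketch; the directory has to exist before you chdir into it):

var path = require('path');

var workingDirectory = path.join(process.cwd(), 'install_here/');
process.chdir(workingDirectory); // the generator now writes here
env.run('git:app');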
Relevant documentation on the Yeoman Environment class can be found here: http://yeoman.io/environment/Environment.html
Also see: http://yeoman.io/authoring/integrating-yeoman.html
The yeoman-test module is also very useful if you want to pass predefined answers to your prompts. This worked for me.
var path = require('path');
var yeomanTest = require('yeoman-test');
var answers = require('from/some/file.json');

var context = yeomanTest.run(path.resolve('path/to/generator'));
context.settings.tmpdir = false; // don't run in a temp dir
context.withGenerators([
    'paths/to/subgenerators',
    'more/of/them'
])
.withOptions({ // execute with options
    'skip-install': true,
    'skip-sdk': true
})
.withPrompts(answers) // answer prompts
.on('end', function() {
    // do some stuff here
});
I'm using Express and Passport for Node.js to build a simple web server. I coded a simple module, then loaded it inside a GET request, and everything works great until more than one user accesses the route.
I used to believe that a var inside an app.get handler was removed from memory after the function finished, but that isn't the case: I use some local variables inside the external module and the values are being shared between users. The module looks like this:
var some_value = 0;

function some_method() {
    some_value++;
    return some_value;
}

exports.some_method = some_method;
And the Express request code looks like this:
app.get('/someurl', function(req, res) {
    var some_external_module = require('./some_external_module'); // <----- Right way?
    var data = some_external_module.some_method();
    res.render('view', {
        title: 'Title',
        data_to_view: data
    });
});
Does an object inside an app.get handler always stay in memory, regardless of which user accesses it?
How do I clean up a var after the handler runs?
How can I avoid these memory conflicts?
Do I have to code the module differently, or call it differently?
Thanks a lot.
UPDATE: I guess this is a proper solution, but I need a Node.js/Express expert to review it and approve or correct it.
app.js:
var ext_mod = require('./module');
var express = require('express');
var app = express();

app.get('/', function(req, res) {
    var ex_mod_instance = new ext_mod({});
    ex_mod_instance.func_a({}, function(ret) {
        res.send('Hello World: ' + ret);
    });
    ex_mod_instance = null; // clean up
});

app.listen(8080);
console.log('Listening on port 8080');
module.js:
var node_module = function(config) {
    this.config = config;
    this.counter = 0;
};

node_module.prototype = {
    func_a: function(param, done) {
        this.counter++;
        done(this.counter);
    },
    func_b: function(param, done) {
    }
};

module.exports = node_module;
Is this the best way to avoid memory leaks?
Every time a function is called, you do get "clean" local variables in the local scope. Modules exist so you can write clean, organized code and not have every function and variable in the global scope. require() does cache modules, so you are probably having a problem with variables in the closure around the function exported from the module. You'd have to include more code to say for sure.
One way you could solve this is by exporting a function that creates the module. That function could be your constructor, which will scope your counter locally, as in the sketch below.
Again, this is one solution.
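A minimal sketch of that factory approach:

// counter.js: export a factory instead of a shared singleton
module.exports = function createCounter() {
    var some_value = 0; // scoped per createCounter() call, not per process
    return {
        some_method: function() {
            some_value++;
            return some_value;
        }
    };
};

// in the route handler, every request gets its own counter:
// var createCounter = require('./counter');
// var counter = createCounter();
// counter.some_method(); // always 1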
Your variable some_value is global in the context of the module, so each time a request uses this module, it uses the same variable.
(require() caches modules; they are loaded only the first time they are required.)
I can think of two ways to achieve this (see the sketch below):
If you want one variable per request, declare it inside the handler function, or put it in res.locals.some_value if you need it in several functions during the same request.
If you want one variable per user, use the Express session middleware and store the variable in req.session.some_value.
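A minimal sketch of both options, assuming the express-session middleware (the secret is a placeholder):

var express = require('express');
var session = require('express-session');
var app = express();

app.use(session({secret: 'placeholder', resave: false, saveUninitialized: true}));

// one value per request: res.locals is recreated for every request
app.get('/per-request', function(req, res) {
    res.locals.some_value = 1;
    res.send('value: ' + res.locals.some_value); // always 1
});

// one value per user: req.session persists across one user's requests
app.get('/per-user', function(req, res) {
    req.session.some_value = (req.session.some_value || 0) + 1;
    res.send('value: ' + req.session.some_value);
});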
Unlike Rails, there doesn't seem to be an accepted way of loading environment specific config files in Node.js.
Currently I'm using the following:
config/development.js and config/production.js:
module.exports = {
    'db': 'mongodb://127.0.0.1/example_dev',
    'port': 3002
};
Followed by the following at the top of my app.js file:
var config = require('./config/' + process.env.NODE_ENV + '.js');
This pattern works pretty well, however it forces me to pass along this config file to any modules that need it. This gets kind of clunky, for instance:
var routes = require('./routes')(config);
... and in routes/index.js:
module.exports = function(config) {
    this.index = function...
    this.show = function...
};
Etc, etc. The module pattern just seems to be pretty clunky when dealing with something that should be global, like configuration settings. I could require the configuration file at the top of every file that needs it as well, but that doesn't seem ideal either.
Does anyone have a best practice for including a configuration file and making it globally available?
You could just attach it to the global process object:
app.js:
var config = require('./config/' + process.env.NODE_ENV + '.js');
process.config = config;
Anywhere else in your app:
console.log(process.config);
Yes, it's a bit dangerous in that it can get overwritten anywhere, but it's still a pretty simple approach.
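If you'd rather not touch process, you can lean on require()'s module cache instead: let a small config module pick the right file itself, then require it wherever it's needed. A sketch matching the question's config/ layout:

// config/index.js
var env = process.env.NODE_ENV || 'development';
module.exports = require('./' + env + '.js');

// anywhere else in the app; require() returns the same cached object
// var config = require('./config');
// console.log(config.port);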