Calling a yeoman generator after a generator has finished - node.js

I am looking to call another Yeoman generator once the first generator has finished installing; whether it runs will be based on an answer I give to one of the prompts.
I have tried calling it at the end:
end: function () {
  this.installDependencies({
    callback: function () {
      if (this.generator2) {
        shell.exec('yo generator2');
      }
    }.bind(this)
  });
},
This runs generator2, but I am unable to answer any of its prompts.
These are two separate generators, so I cannot make the second one a sub-generator.

Use Yeoman's composability feature.
As for the code: don't rely on the this.installDependencies() callback (it won't work as you expect); use the run loop priorities instead.
Also, you should review your logic and the way you think about your current problem. When composing generators, the core idea is to keep both decoupled. They shouldn't care about ordering; they should run in any order and produce the same result. Thinking about your code this way will greatly reduce the complexity and make it more robust.
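For illustration, here is a minimal sketch of that idea in the older Base.extend style used elsewhere on this page; the prompt name useGenerator2 is hypothetical, and generator2 is assumed to be installed and resolvable by its namespace:
// generator1/app/index.js -- a sketch, not the original code
module.exports = require('yeoman-generator').Base.extend({
  prompting: function () {
    var done = this.async();
    this.prompt({
      type: 'confirm',
      name: 'useGenerator2',
      message: 'Also run generator2?'
    }, function (answers) {
      this.useGenerator2 = answers.useGenerator2;
      done();
    }.bind(this));
  },
  configuring: function () {
    // compose only when the user asked for it; generator2 then joins
    // the same run loop instead of being shelled out to afterwards
    if (this.useGenerator2) {
      this.composeWith('generator2:app');
    }
  }
});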

I see this is an older question, but I came across a similar requirement and want to make sure all options are listed. I agree with the other answers that the best choice is to use the composability feature and keep the order irrelevant. But in case it really is necessary to run generators sequentially:
You can also execute another generator using the integration features.
So in generator1 you could call
this.env.run('generator2');
This will also let you answer prompts in generator2.
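A minimal sketch of where such a call could sit, assuming the decision was stored earlier from a prompt answer (the exact env.run signature can vary between Yeoman versions):
// inside generator1 -- sketch only
end: function () {
  if (this.generator2) {
    // runs generator2 inside the same Yeoman environment,
    // so its prompts can still be answered interactively
    this.env.run('generator2');
  }
},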

When using .composeWith, a priority group function (e.g. prompting, writing, ...) will be executed for all the generators, then the next priority group runs. If you call .composeWith for generatorB from inside generatorA, execution will be, e.g.:
generatorA.prompting => generatorB.prompting => generatorA.writing => generatorB.writing
You can cover all possible execution scenarios and condition checking with this concept; you can also pass options via .composeWith('my-generator', { 'options' : options }).
If you want to control execution between different generators, I advise you to create a "main" generator which composes them together, as described at http://yeoman.io/authoring/composability.html#order:
// In my-generator/generators/turbo/index.js
module.exports = require('yeoman-generator').Base.extend({
  'prompting': function () {
    console.log('prompting - turbo');
  },
  'writing': function () {
    console.log('writing - turbo');
  }
});
// In my-generator/generators/electric/index.js
module.exports = require('yeoman-generator').Base.extend({
  'prompting': function () {
    console.log('prompting - zap');
  },
  'writing': function () {
    console.log('writing - zap');
  }
});
// In my-generator/generators/app/index.js
module.exports = require('yeoman-generator').Base.extend({
  'initializing': function () {
    this.composeWith('my-generator:turbo');
    this.composeWith('my-generator:electric');
  }
});

Related

Issues running async tasks in Gulp

I have minimal experience with Node.js projects and have inherited one which uses Gulp to build and distribute itself.
The project has several disparate task files, and a gulpfile.js which references them all. For brevity, I will use the smallest function as an example.
//gulpfile.js
...
gulp.task('csscompile', function() {
  gulp.src('./src/ops/gulp/csscompile.js')
});
...
//csscompile.js
...
module.exports = function (gulp) {
  gulp.task('csscompile', function (done) {
    let paths = [
      'src/app/**/*.scss',
      'node_modules/angular-material/angular-material.scss',
    ];
    gulp.src(paths)
      .pipe(sass())
      .pipe(concat('app.css'))
      .pipe(gulp.dest('build'))
      .done();
  });
};
Running $ gulp csscompile from the command line always completes the entire function in about 10ms, and I get output like:
[10:44:43] The following tasks did not complete: csscompile
[10:44:43] Did you forget to signal async completion?
Obviously there is an issue with waiting for the functions to complete. After searching around I have tried every imaginable combination of using async flags, using function (done), and more, to no avail. Even putting the csscompile function directly in the csscompile task does not work.
I assume using gulp.src() is part of the issue; I don't know whether those functions are inlined or what. I also want to avoid having to turn every one of these functions into a Promise.
Does anyone have any recommendations of changes I can make?
According to the docs:
When you see the "Did you forget to signal async completion?" warning, none of the techniques mentioned above were used. You'll need to use the error-first callback or return a stream, promise, event emitter, child process, or observable to resolve the issue.
So you need to return the stream.
This should work:
// csscompile.js -- assuming the usual plugin requires at the top, e.g.:
// var sass = require('gulp-sass');
// var concat = require('gulp-concat');
module.exports = function (gulp) {
  gulp.task('csscompile', function () {
    let paths = [
      'src/app/**/*.scss',
      'node_modules/angular-material/angular-material.scss',
    ];
    return gulp.src(paths)
      .pipe(sass())
      .pipe(concat('app.css'))
      .pipe(gulp.dest('build'));
  });
};

how to set a particular test file as the first when running mocha?

Is there a way to set a particular test file to run first in Mocha, with the rest of the test files then executing in any order?
One technique is to include a number prefix in the test filenames, such as:
01-first-test.js
02-second-test.js
03-third-test.js
Named this way, the tests will be executed in order from the first file to the third.
No. There is no guarantee your tests will run in any particular order. If you need to do some setup for tests inside of a given describe block, try using the before hook, as in the sketch below.
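A minimal sketch of that approach (the setup function here is hypothetical):
describe('user tests', function () {
  before(function () {
    // runs once, before any it() in this describe block
    return seedDatabase(); // hypothetical async setup returning a promise
  });
  it('lists users', function () {
    // assertions that rely on the seeded data
  });
});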
There is no direct way, but there is certainly a workaround: wrap your describe blocks in functions and call those functions in the required order.
firstFile.js
function first() {
  describe("first test", function () {
    it("should run first", function () {
      // your code
    });
  });
}
module.exports = {
  first
};
secondFile.js
function second() {
  describe("second test", function () {
    it("should run after first", function () {
      // your code
    });
  });
}
module.exports = {
  second
};
Then create one main spec file and import both modules.
main.spec.js
const firstThis = require('./firstFile.js');
const secondSecond = require('./secondFile.js');
firstThis.first();
secondSecond.second();
In this way you can use core JavaScript features and still play around with Mocha. This is the solution I have been using for a long time; I would appreciate it if anyone could come up with a better approach.

Trying to understand node.js callback when the function normally doesn't expect any parameters?

I am trying to work with node.js and node-java and trying to get my head wrapped around some concepts, and in particular how to write async method calls.
I think that, for a Java method call myclass.x(), the following are equivalent:
[In Java]:
Z = myclass.x(abc);
And:
[In node.js/node-java]:
myclass.x(abc, function(err, data) {
  //TODO
  Z = data;
});
In other words, the myclass.x function gets evaluated using the parameter abc, and if no error, then the result goes into "data" which is then assigned to Z.
Is that correct?
Here's the thing (or one of the things) that I am confused about.
What happens if the function myclass.x() doesn't take any parameters?
In other words, it is normally (in Java) just called like:
Z = myclass.x();
If that is the case, how should the node.js code look?
myclass.x(, function(err, data) {
  //TODO
  Z = data;
});
doesn't seem right, but:
myclass.x(function(err, data) {
  //TODO
  Z = data;
});
also doesn't seem correct.
So what is the correct way to code the node.js code in this case?
Thanks in advance!!
Jim
EDIT 1: Per the comments, I'm adding the specific code I'm working with; it is the last couple of commented-out lines from this other question:
node.js and node-java: What is equivalent node.js code for this java code?
These are the lines (commented out in that other question):
var MyFactoryImplClass = java.import("oracle.security.jps.openaz.pep.PepRequestFactoryImpl.PepRequestFactoryImpl");
var result = myFactoryImplClass.newPepRequest(newSubject, requestACTIONString ,requestRESOURCEString , envBuilt)
I tried to make the last line use an async call:
MyFactoryImplClass.getPepRequestFactory(function(err, data) {
  //TODO
  pepReqF1 = data;
});
javaLangSystem.out.printlnSync("Finished doing MyFactoryImplClass.getPepRequestFactory() and stored it in pepReqF1 =[" + pepReqF1 + "]");
But the output was showing the value of that pepReqF1 as "undefined".
If calling the method with one parameter and a callback is:
myclass.x(abc, function(err, data) {
  // ...
});
Then calling a method with only a callback would be:
myclass.x(function(err, data) {
  // ...
});
The function(err, data) { } part is just a normal parameter, just like abc. In fact, you can pass a named function with:
function namedFun(err, data) {
  // ...
}
myclass.x(abc, namedFun);
Or even:
var namedFun = function (err, data) {
  // ...
};
myclass.x(abc, namedFun);
Functions in JavaScript are first-class objects like strings or arrays. You can pass a named function as a parameter to some other function:
function fun1(f) {
  return f(10);
}
function fun2(x) {
  return x*x;
}
fun1(fun2);
just like you can pass a named array:
function fun3(a) {
  return a[0];
}
var array = [1, 2, 3];
fun3(array);
And you can pass an anonymous function as a parameter:
function fun1(f) {
  return f(10);
}
fun1(function (x) {
  return x*x;
});
just like you can pass an anonymous array:
function fun3(a) {
  return a[0];
}
fun3([1, 2, 3]);
There is also a nice shortcut so that instead of:
fun1(function (x) {
  return x*x;
});
You can write:
fun1(x => x*x);
Making my comment into an answer...
If the issue you're experiencing is that Z does not have the value you want when you are examining it, then that is probably because of a timing issue. Asynchronous callbacks happen at some unknown time in the future while the rest of your code continues to run. Because of that, the only place you can reliably use the result passed to the asynchronous callback is inside the callback itself or in some function you would call from that function and pass it the value.
So, if your .x() method calls its callback asynchronously, then:
var Z;
myclass.x(function(err, data) {
  // use the err and data arguments here inside the callback
  Z = data;
});
console.log(Z); // outputs undefined
// you can't access Z here, even when it is assigned to a
// higher-scoped variable, because the callback has not yet
// been called when this code executes
You can see this a little more clearly by looking at the sequencing:
console.log('A');
someAsyncFunction(function() {
  console.log('B');
});
console.log('C');
This will produce a log of:
A
C
B
Showing you that the async callback happens some time in the future, after the rest of your sequential code has executed.
Java, on the other hand, primarily uses blocking I/O (the function doesn't return until the I/O operation is complete), so you don't usually have this asynchronous behavior that is standard practice in node.js. Note: I believe there are some asynchronous capabilities in Java, but that isn't the typical way things are done there, whereas in node.js it is the typical way things are done.
This creates a bit of an architectural mismatch if you're trying to port code that does I/O from one environment to the other, because the structure has to be redone in order to work properly in a node.js environment.
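Applied to the snippet from the question, that means moving the code that consumes the result into the callback itself; a sketch reusing the names from the question:
MyFactoryImplClass.getPepRequestFactory(function (err, data) {
  if (err) {
    // handle or log the error here
    return;
  }
  var pepReqF1 = data;
  // the result is only reliably available inside this callback
  javaLangSystem.out.printlnSync("Finished doing MyFactoryImplClass.getPepRequestFactory() and stored it in pepReqF1 = [" + pepReqF1 + "]");
});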

Node.js: should I keep `assert()`s in production code?

A methodological question:
I'm implementing an API interface to some services, using node.js, mongodb and express.js.
On many (almost all) sites I see code like this:
method(function(err, data) {
  assert.equal(null, err);
});
The question is: should I keep assert statements in my code at production time (at least for 'low significance' errors)? Or, are these just for testing code, and I should better handle all errors each time?
You definitely should not keep them in the production environment.
If you google a bit, there are a plethora of alternative approaches to strip them out.
Personally, I'd use the null object pattern by implementing two wrappers in a separate file: the former maps its methods directly to the ones exported by the assert module, the latter offers empty functions and nothing more.
Thus, at runtime, you can plug in the right one by relying on some global variable previously set correctly, like process.env.mode. Within your files, you'll only have to import the above-mentioned module and use it instead of using assert directly.
This way, all around your code you'll never see error-prone stuff like myAssert && myAssert(cond); instead you'll always have a cleaner and safer myAssert(cond) statement.
Here follows a brief example:
// myassert.js
var assert = require('assert');
if ('production' === process.env.mode) {
  var nil = function () { };
  module.exports = {
    equal: nil,
    notEqual: nil
    // all the other functions
  };
} else {
  // a wrapper like this one helps in not polluting the exported object
  module.exports = {
    equal: function (actual, expected, message) {
      assert.equal(actual, expected, message);
    },
    notEqual: function (actual, expected, message) {
      assert.notEqual(actual, expected, message);
    }
    // all the other functions
  };
}
// another_file.js
var assert = require('path_to_myassert/myassert');
// ... your code
assert.equal(true, false);
// ... go on
Yes! asserts are good in production code.
Asserts allow a developer to document assumptions that the code makes, making code easier to read and maintain.
It is better for an assert to fail in production than to allow the undefined behaviour that the assert was protecting against. When an assert fails you can more easily see the problem and fix it.
Knowing your code is working within assumptions is far more valuable than a small performance gain.
I know opinions differ here. I have offered a 'Yes' answer because I am interested to see how people vote.
Probably not.
Ref: When should assertions stay in production code?
Mostly, in my code, I put the error-handling function in a separate file and use the same error method everywhere; it mostly depends on your logic anyway.
For example, people generally forget this:
process.on('uncaughtException', function (err) {
  console.log(err);
});
And err == null doesn't hurt; it checks both null and undefined.
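For reference, a quick sketch of why the loose comparison covers both cases:
console.log(null == null);      // true
console.log(undefined == null); // true  -- so err == null catches both
console.log(0 == null);         // false -- other falsy values are not matched
console.log('' == null);        // false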

Using this within a promise in AngularJS

Is there a best-practice solution for using this within a promise? In jQuery I can bind my object so it can be used in the promise/callback - but what about in AngularJS? Are there best-practice solutions? I'd prefer not to use the "var service = this;" approach...
app.service('exampleService', ['Restangular', function(Restangular) {
  this._myVariable = null;
  this.myFunction = function() {
    Restangular.one('me').get().then(function(response) {
      this._myVariable = true; // undefined
    });
  };
}]);
Are there solutions for this issue? How can I gain access to the members or methods of my service from within the promise?
Thank you in advance.
The generic issue of dynamic this in a callback is explained in this answer, which is very good - I'm not going to repeat what Felix said. I'm going to discuss promise-specific solutions instead:
Promises are specified under the Promises/A+ specification, which allows promise libraries to consume each other's promises seamlessly. Angular $q promises honor that specification, and therefore an Angular promise must by definition execute the .then callbacks as plain function calls - that is, without setting this. In strict mode, promise.then(fn) will always evaluate this to undefined inside fn (and to window in non-strict mode).
The reasoning is that ES6 is around the corner and solves these problems more elegantly.
So, what are your options?
Some promise libraries provide a .bind method (Bluebird, for example); you can use these promises inside Angular and swap out $q.
ES6, CoffeeScript, TypeScript and AtScript all include a => operator which binds this.
You can use the ES5 solution using .bind
You can use one of the hacks in the aforementioned answer by Felix.
Here are these examples:
Adding bind - aka Promise#bind
Assuming you've followed the above question and answer you should be able to do:
Restangular.one('me').get().bind(this).then(function(response) {
  this._myVariable = true; // this is correct
});
Using an arrow function
Restangular.one('me').get().then(response => {
  this._myVariable = true; // this is correct
});
Using .bind
Restangular.one('me').get().then(function(response) {
  this._myVariable = true; // this is correct
}.bind(this));
Using a pre ES5 'hack'
var that = this;
Restangular.one('me').get().then(function(response) {
  that._myVariable = true; // this is correct
});
Of course, there is a bigger issue
Your current design does not contain any way to know when _myVariable is available. You'd have to poll it or rely on internal state ordering. I believe you can do better and have a design where you always execute code when the variable is available:
app.service('exampleService', ['Restangular', function(Restangular) {
  this._myVariable = Restangular.one('me').get();
}]);
Then you can consume _myVariable via this._myVariable.then(function(value) { ... }). This might seem tedious, but if you use $q.all you can easily do this with several values, and this is completely safe in terms of synchronization of state.
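For example, a sketch of waiting on several such promise-valued members with $q.all (the second member and the $q injection are hypothetical additions here):
app.service('exampleService', ['Restangular', '$q', function (Restangular, $q) {
  this._myVariable = Restangular.one('me').get();
  this._myOtherVariable = Restangular.one('settings').get(); // hypothetical
  // resolves once both values are available, regardless of order
  this.ready = $q.all([this._myVariable, this._myOtherVariable])
    .then(function (values) {
      return { me: values[0], settings: values[1] };
    });
}]);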
If you want to lazy load it and not call it the first time (that is, only when myFunction is called) - I totally get that. You can use a getter and do:
app.service('exampleService', ['Restangular', function(Restangular) {
  this.__hidden = null;
  Object.defineProperty(this, "_myVariable", {
    get: function() {
      return this.__hidden || (this.__hidden = Restangular.one('me').get());
    }
  });
}]);
Now, it will be lazy loaded only when you access it for the first time.
