I am developing a "plugins" concept whereby I have a series of files, each containing a single function (a plugin). I would like to automatically load and execute these using Promise.all().
Problem: each plugin function does not execute.
Here is my example plugin plugins/example.js:
"use strict";
exports = function() {
return new Promise(function(resolve, reject) {
console.log("Plugin running....");
setTimeout(resolve, 200, 'example plugin succeeded!');
});
};
From my app.js I then load all plugins using the require-all NPM module:
const plugins = require('require-all')(__dirname + '/plugins');
I then try to execute all as part of my promise chain:
return Promise.all([plugins]);
No logging takes place from the function. Interestingly, when I log the contents of plugins, I see an empty object:
{
  "example": {}
}
Can anyone advise why the example function is not being called?
One way of doing it would be something like the following. Let's say there is a plugins directory with files like pluginA.js, pluginB.js, ..., pluginZ.js. As you stated in your question, the exported value from each of those plugins is always a function that returns a promise. I would create a plugins/index.js that exports everything from those plugins:
// plugins/index.js
'use strict'
const pluginA = require('./pluginA')
const pluginB = require('./pluginB')
...
const pluginZ = require('./pluginZ')
module.exports = [ pluginA, pluginB, ..., pluginZ ]
So then you could use this as following:
// foo.js
'use strict'
const _ = require('lodash')
const plugins = require('./plugins')
Promise.all(_.map(plugins, (fn) => fn()))
.then((data) => console.log(data))
.catch((err) => console.log(err))
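Since plugins/index.js exports a plain array, the lodash dependency isn't strictly needed - Array.prototype.map does the same job. A minimal sketch, using inline stand-in plugins (hypothetical) in place of the require above:

```javascript
// Stand-ins for the functions exported from plugins/index.js (hypothetical)
const plugins = [
  () => Promise.resolve('pluginA succeeded!'),
  () => Promise.resolve('pluginB succeeded!')
];

// Call each plugin to get its promise, then wait for all of them
Promise.all(plugins.map((fn) => fn()))
  .then((data) => console.log(data))
  .catch((err) => console.error(err));
```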
The require-all module returns an object with each plugin's name as a key and its export as the value, so you'll have to take those values, call each function to get its promise, and then you'll have an array of promises:
const plugins = require('require-all')(__dirname + '/plugins');

var promises = Object.keys(plugins).map(function(key) {
  return plugins[key]();
});

Promise.all(promises).then(function() {
  // all done
}).catch(function(err) {
  // fail
});
Background
I am returning data from AWS Secrets Manager, using the aws-sdk to do so. Earlier I asked a question about how to correctly return and export the data, since the exported object never had the data resolved by the time the export was imported somewhere else. This caused me to get a bunch of undefined values.
After solving that problem, it was determined that the way to handle this was to wrap the aws-sdk function in a promise, then call the promise in another file with async/await. This causes me issues.
Example
If I request and return the data from AWS like this,
let secrets = {
jwtHash: 10,
};
const client = new AWS.SecretsManager({
region: region
});
const promise = new Promise((resolve, reject) => {
  client.getSecretValue({ SecretId: secretName }, async (err, data) => {
    if (err) {
      reject(err);
    } else {
      const res = await JSON.parse(data.SecretString);
      secrets.dbUsername = res.username;
      secrets.dbPassword = res.password;
      secrets.dbHost = res.host;
      secrets.dbPort = res.port;
      secrets.dbDatabase = res.dbname;
      resolve(secrets);
    }
  });
});
module.exports = promise;
Then I can import it in another file and use the data like this,
const promise = require('../secrets');
(async () => {
const secrets = await promise;
// use secrets here
})();
Now let's say in that file where I am trying to use secrets I have something like this,
const pool = new Pool({
user: secrets.dbUsername,
host: secrets.dbHost,
database: secrets.dbDatabase,
password: secrets.dbPassword,
port: secrets.dbPort
});
pool.on('error', err => {
console.error('Unexpected error on idle client', err);
process.exit(-1);
});
module.exports = pool;
If I wrap the pool function in an async self-invoking function, I have trouble exporting it so it can be used anywhere in my app when I need a database connection. Similarly, I have many functions throughout my application that need access to the secret data. If I were to walk through the application wrapping all of my code in async functions, it would continue to cause more of these difficulties.
Question
It seems to me the best solution here would be to return the data asynchronously and once it has resolved, export it synchronously.
How can I achieve such a task in this scenario?
A win here would be,
Make the request in /secrets/index.js
Build the secrets object in the same file
Export secrets as an object that can easily be imported anywhere else in my application without the need for asynchronous functions.
Example of How I Would Like to Use This
const secrets = require('../secrets');
const pool = new Pool({
user: secrets.dbUsername,
host: secrets.dbHost,
database: secrets.dbDatabase,
password: secrets.dbPassword,
port: secrets.dbPort
});
pool.on('error', err => {
console.error('Unexpected error on idle client', err);
process.exit(-1);
});
module.exports = pool;
Because the needed data is retrieved asynchronously, there's no way around making everything that depends on it (somehow) asynchronous as well. With asynchronicity involved, one common approach is to export functions that can be called on demand, rather than exporting objects:
an object that depends on the asynchronous data can't be meaningfully exported before the data comes back
if you export functions rather than objects, you can ensure that control flow starts from your single entry point and heads downstream, rather than every module initializing itself at once (which can be problematic when some modules depend on others to be initialized properly, as you're seeing)
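A minimal sketch of the export-a-function approach (the names here are illustrative, not from the question):

```javascript
// db.js - export a factory function instead of an initialized object
let cached = null;

function getDb(config) {
  // lazily create the object on first call, reuse it afterwards
  if (!cached) {
    cached = { host: config.host, connected: true }; // stand-in for a real connection
  }
  return cached;
}

module.exports = { getDb };
```

Downstream modules call getDb(...) only when they actually need the connection, so nothing tries to initialize itself before the asynchronous data is ready.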
On another note, if you have a single Promise that needs to resolve, it's probably easier to call .then on it than to use an async function. For example, rather than
const promise = require('../secrets');
(async () => {
// try/catch is needed to handle rejected promises when using await:
try {
const secrets = await promise;
// use secrets here
} catch(e) {
// handle errors
}
})();
you might consider:
const promise = require('../secrets');
promise
.then((secrets) => {
// use secrets here
})
.catch((err) => {
// handle errors
});
It's less wordy and probably easier to make sense of at a glance - better than an async IIFE. IMO, the place to use await is when you have multiple Promises that need to resolve, and chaining .thens and returned Promises together would get too ugly.
A module that depends on secrets has to, somewhere in its code, effectively wait for secrets to be populated. Although being able to use your const secrets = require('../secrets'); in your lower code example would be nice, it just isn't possible like that. Instead, you can export a function that takes secrets as a parameter rather than requiring it, and then (synchronously!) return the instantiated pool:
// note, secrets is *not* imported
function makePool(secrets) {
  const pool = new Pool({
    user: secrets.dbUsername,
    host: secrets.dbHost,
    database: secrets.dbDatabase,
    password: secrets.dbPassword,
    port: secrets.dbPort
  });
  pool.on('error', err => {
    console.error('Unexpected error on idle client', err);
    process.exit(-1);
  });
  return pool;
}

module.exports = makePool;
Then, to use it in another module, once the secrets are created, call makePool with the secrets, and then use / pass around the returned pool:
const secretsProm = require('../secrets');
const makePool = require('./makePool');
secretsProm.then((secrets) => {
const pool = makePool(secrets);
doSomethingWithPool(pool);
})
.catch((err) => {
// handle errors
});
Note that the doSomethingWithPool function can be completely synchronous, as is makePool - the asynchronous nature of secrets, once handled with .then in one module, does not have to be dealt with asynchronously anywhere else, as long as other modules export functions, rather than objects.
I would suggest doing everything in one file, and then instead of exporting the object you create, export a function that returns the object. The function will always have access to the most up-to-date version of the object, and you can call it from any file to access that same object.
Example:
Create two files in a folder. In the first file, we will do this:
Define a value.
Set a timeout to change the value after some time
Export the value itself
Export a function that returns the value
values.js
let x = 0 ; // set initial value
setTimeout(() => { x = 5; }, 2000); // sometime later, value will change
const getValueOfX = () => { return x; };
module.exports = {
x: x,
getValueOfX: getValueOfX
};
Now in the other file, we just import the two exports from the previous file (we put them both in an object for easy exporting). We can then log them out, wait for some time to pass, and log them out again.
index.js
let values = require('./values');
console.log(`Single value test. x = ${values.x}`);
console.log(`Function return value test. x = ${values.getValueOfX()}`);
setTimeout(() => { console.log(`Single value test. x = ${values.x}`); }, 4000);
setTimeout(() => { console.log(`Function return value test. x = ${values.getValueOfX()}`); }, 4000);
To run the code, just open your Terminal or Command Prompt and, from the same directory as these two files, run node index.js
You'll see that when just the value (object, array, w/e) is exported, it is exported as-is at the moment the export statement runs - almost always before the API call has finished.
BUT - If you export a function that returns the value (object, array, w/e), then that function will retrieve the up-to-date version of the value at the time it is called! Great for API calls!
so your code might look like this:
let secrets = { jwtHash: 10 };
const client = new AWS.SecretsManager({
region: region
});
let pool = null;
client.getSecretValue({ SecretId: secretName }, (err, data) => {
  if (err) {
    // there is no surrounding promise here, so just log (reject is not in scope)
    console.error(err);
  } else {
    const res = JSON.parse(data.SecretString); // JSON.parse is synchronous - no await needed
    pool = new Pool({
      user: res.username,
      host: res.host,
      database: res.dbname,
      password: res.password,
      port: res.port
    });
    pool.on('error', err => {
      console.error('Unexpected error on idle client', err);
      process.exit(-1);
    });
  }
});
module.exports = function(){ return pool; };
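One caveat with this pattern (my note, not part of the answer above): callers can still invoke the exported getter before the callback has run, in which case they get null and must handle it. A self-contained simulation of the timing, with a timer standing in for the AWS call:

```javascript
// A timer stands in for the asynchronous AWS callback
let pool = null;
setTimeout(() => {
  pool = { query: (q) => 'ran: ' + q }; // stand-in for a real Pool
}, 50);

const getPool = () => pool;

console.log(getPool()); // still null - the "secrets" haven't resolved yet
setTimeout(() => {
  console.log(getPool().query('select 1')); // works once initialized
}, 100);
```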
One thing I do (especially when working with a large application that imports static variables which have been moved to a database) is load that file via a function that populates an export.
// config.js
const exports = {};

export async function populate() {
  const RUNTIMEVARS = await what_you_do_to_get_vars();

  for (const config of RUNTIMEVARS) {
    exports[config.key] = config.data;
  }

  // for anything needing the config in the bootstrap.
  return exports;
}

export default exports;
Then in the bootstrap:
// bootstrap.js
import './database-connection.js'; // important to have no internal dependencies.
(async () => {
  const { populate } = await import('./config.js');
  await populate();
  import('./application/index.js');
})();
Now any file inside your application can import config from '../config.js' as though it were statically declared, since we populated the object via the populate function during bootstrap.
I'm asking this after a lot of research
I have these files
// connection.js
const mysql = require('mysql');
module.exports = mysql.getConnection();
// a.js
const connection = require('./connection');

function doQuery() {
  const q = 'select * from table';
  connection.query(q, (err, res) => {
    // ... do some stuff
  });
}

module.exports = doQuery;
When I want to write a test with jest (unnecessary things removed for readability):
// test.js
const jest = require('jest');
const a = require('./a.js');
const connection = {
query: (query, cb) => cb('error', null),
};
jest.mock('./connection.js', () => connection);
test('testing something', () => {
expect(a.doQuery).to.be.true //this is just an example
});
I'm getting the next error
The module factory of `jest.mock()` is not allowed to reference any out-of-scope variables.
Invalid variable access: connection
I tried moving the files into the same folder, using relative paths, absolute paths, and changing the order of the imports, and I really can't get it to work.
I really don't know how to fix this, also I'm migrating from proxyquire to jest that is the reason that I'm doing this and I can't use proxyquire anymore.
Yeah, well...
Gonna answer my very own question for the future.
The thing is that you can't use variables that are declared outside the jest.mock() factory, as I was doing.
The test file and the mock function should be something like this
Solution 1
jest.mock('./connection.js', () => ({
  query: (query, cb) => cb('ee', null),
}));
Note that I'm not using any outside variables (const connection...) inside the jest.mock factory.
Also, I found out another solution thanks to @skyboyer.
Solution 2
Use variables declared outside jest.mock, as long as their names start with the mock prefix:
const mockConnection = {
  query: (query, cb) => cb('ee', null),
};

jest.mock('./connection.js', () => mockConnection);
I am trying to write tests for streams in my app, dealing with the fs library and its createWriteStream function, so the stub I created is as follows:
writeStreamStub = sinon.stub()
onceStreamEventStub = sinon.stub()
endStreamStub = sinon.stub()
onStreamStub = sinon.stub()
createStreamStub = sinon.stub(fs, 'createWriteStream').returns({
  write: writeStreamStub,
  once: onceStreamEventStub,
  end: endStreamStub,
  on: onStreamStub
})
So now I can test whether those functions are called and whether the returned functions are also called. But I am using the --coverage flag, and the code in the callbacks of the returned functions is not covered; the write method is called inside a process.nextTick and I have no idea how to deal with that. Is it possible to cover the whole code, including the code inside the callbacks, and if so, how do I go about it? Thanks in advance.
N.B. The variables are globally declared.
If there's no cogent reason to use both sinon and jest, I'd recommend just using one library. If you decide to go with jest, here's a simple example. Assume you have a class like
const fs = require('fs');
module.exports = class FileWriter {
  constructor() {
    this.writer = fs.createWriteStream('./testfile.txt');
  }

  writeFile() {
    process.nextTick(() => {
      this.writeContent('hello world');
    });
  }

  writeContent(content) {
    this.writer.write(content);
    this.writer.end();
  }
};
and in your unit test you want to mock the behaviour of all the fs functions used (createWriteStream, write, end in this case) and just check that they are called with the correct arguments. You could do that with something like this:
const fs = require('fs');
const FileWriter = require('./FileWriter');
// use this to have mocks for all of fs' functions (you could use jest.fn() instead as well)
jest.mock('fs');
describe('FileWriter', () => {
  it('should write file with correct args', async () => {
    const writeStub = jest.fn().mockReturnValue(true);
    const endStub = jest.fn().mockReturnValue(true);
    const writeStreamStub = fs.createWriteStream.mockReturnValue({
      write: writeStub,
      end: endStub,
    });

    const fileWriter = new FileWriter();
    fileWriter.writeFile();
    await waitForNextTick();

    expect(writeStreamStub).toBeCalledWith('./testfile.txt');
    expect(writeStub).toBeCalledWith('hello world');
    expect(endStub).toHaveBeenCalled();
  });
});

function waitForNextTick() {
  return new Promise(resolve => process.nextTick(resolve));
}
Suppose I have the following module, database.js:
const initOptions = {}
const pgp = require('pg-promise')(initOptions)
const config = require('../../config')
const db = pgp({
host: config.database.host,
port: config.database.port,
database: config.database.database,
user: config.database.user,
password: config.database.password
})
module.exports = db
And the following module, create.js:
const db = require('./database')
function create (name) {
  return new Promise((resolve, reject) => {
    db.func('create', name)
      .then(data => {
        return resolve(data)
      })
      .catch(err => {
        return reject(err)
      })
  })
}

module.exports = create
I'm trying to run a unit test on create.js that will verify that db.func is called with 'create' as the first argument and name as the second, but without actually setting up a database connection (so tests can run offline).
From what I can gather, this is what libraries like Sinon.JS can be used for, so I tried creating a test and stubbing the db object.
const chai = require('chai')
const chaiAsPromised = require('chai-as-promised')
chai.use(chaiAsPromised)
const sinon = require('sinon')
const expect = chai.expect
const create = require('./create.js')
describe('Test Module', () => {
  it('should test stuff', () => {
    const db = require('./database')
    const dbStub = sinon.stub(db, 'func').callsFake(Promise.resolve(null))
    expect(create('test').then).to.be.a('Function')
  })
})
However, it fails with
TypeError: Cannot redefine property: func
Most likely due to my limited exposure to sinon...
How do I go about stubbing (or maybe I need to mock?) the db function so that I can test it an ensure db.func was called?
You can make the properties configurable by disabling locks with the noLocking option in Initialization Options. This allows sinon to replace the properties:
const initOptions = { noLocking : true };
On a related note:
Your create function is creating an unnecessary promise wrapper, which is a promise anti-pattern. You should just return the result from db.func, which is a promise already:
function create(name) {
return db.func('create', name);
}
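To see the difference in isolation (a standalone sketch, not the question's db object):

```javascript
// Anti-pattern: wrapping an existing promise in a new Promise
function createWrapped(value) {
  return new Promise((resolve, reject) => {
    Promise.resolve(value)
      .then((data) => resolve(data))
      .catch((err) => reject(err));
  });
}

// Preferred: return the promise you already have
function create(value) {
  return Promise.resolve(value);
}
```

Both behave identically from the caller's point of view; the wrapper only adds noise and an extra place for bugs to hide.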
Also, callsFake takes a function, but you are giving it a promise. Use returns instead:
const dbStub = sinon.stub(db, 'func').returns(Promise.resolve(null))
I'm having trouble setting the noLocking option. The docs state it can be set after initialization; however, if I set it with db.$config.options.noLocking = true, the same error occurs. If I set it in the database.js init options instead, it works fine.
From the author of pg-promise...
It is because at that point noLocking can only affect tasks and transactions. And since the db level of the protocol is initiated only once, setting noLocking after the library's initialization doesn't affect it.
I have just updated the documentation to clarify it:
This option is dynamic (can be set before or after initialization). However, changing it after the library's initialization will not affect Database objects that have already been created.
I'm trying to call the getMarchandiseList() method, which calls the mongoose method find({}), in my marchandise router.
The problem is that when I send the response, I get result = 0 before the find({}) method has sent its result.
// the result should be:
{ id_marchandise: 01, libelle_marchandise: 'fer' }
marchandiserouter.js:
var express = require("express");
var router = express.Router();
var marchandiseDAO = require ('../DAO/marchandises');
router.get('/listmarchandise', function (req, res) {
  var marchandise = new marchandiseDAO();
  marchandise.getMarchandiseList();
  res.send("result" + marchandise.result);
});

module.exports = router;
MarchandiseDAO.js:
var model = require("../Models/model");
var marchandiseDAO = function () {
  console.log("get instance");
  this.result = 0;
};

marchandiseDAO.prototype.getMarchandiseList = function () {
  console.log("begin");
  model.MarchandiseModel.find({}, function (err, marchandiselist) {
    console.log("traitement");
    if (err) result = err;
    else result = marchandiselist;
  });
  console.log("end");
};
You cannot run mongoose methods synchronously. But if you use a modern version of Node, then you can make it appear as if they ran synchronously.
Unfortunately you posted your code examples as screenshots, which would require me to use Photoshop to fix your code - obviously not worth the hassle. So instead I will show you a general solution.
Using normal promises:
Model.find({...}).then(data => {
  // you can only use data here
}).catch(err => {
  // always handle errors
});
Using await:
try {
  let data = await Model.find({...});
  // you can only use data here
} catch (err) {
  // always handle errors
}
The second example can only be used in an async function. See these for more info:
try/catch blocks with async/await
Use await outside async
Using async/await in Node 6 with Babel
When do async methods throw and how do you catch them?
using promises in node.js to create and compare two arrays
Keeping Promise Chains Readable
function will return null from javascript post/get
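Applied to the code in the question, the idea is for getMarchandiseList to return the promise from find() and for the router to send the response inside .then. A self-contained sketch, with a stand-in for the mongoose model (hypothetical data matching the expected result):

```javascript
// Stand-in for model.MarchandiseModel.find({}) - resolves like mongoose would
function find() {
  return Promise.resolve([{ id_marchandise: '01', libelle_marchandise: 'fer' }]);
}

function getMarchandiseList() {
  // return the promise instead of assigning this.result inside a callback
  return find();
}

// The route handler would then send the response once the data arrives:
getMarchandiseList()
  .then((list) => console.log('result', list))
  .catch((err) => console.error(err));
```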