Is there an alternative to requiring common node modules besides globals? - node.js

Getting tired of typing
const async = require('async');
const _ = require('lodash');
at the head of almost every JS file.
One could use globals, which is good for ease of use but bad for unit tests.
Is there an alternative that I'm missing? If I can do a require('common') to load the utilities I want and use them in the current file, that would be best.

Well then do that - create a common.js file and put all the stuff in there, then simply require whatever you need in a single statement using the destructuring assignment.
Example
common.js
module.exports = {
    fs: require('fs'),
    http: require('http')
    //... what else you want
};
main.js
const { fs, http } = require('./common.js');
Note
This was just an example to show you how to achieve your desired behaviour. But I would not recommend using it, as it obscures what you're actually loading and brings in an unnecessary dependency just to save a few statements.

Wow, there's this amazing thing called keyboard snippets, which completely saves you from typing those redundant characters over and over, without compromising the integrity of the code.
VSCode
"debug require": {
    "prefix": "rede",
    "body": [
        "const debug = require('debug')('$1');$0"
    ]
},
"lodash require": {
    "prefix": "relo",
    "body": [
        "const _ = require('lodash');$0"
    ]
},
"async require": {
    "prefix": "reas",
    "body": [
        "const async = require('async');$0"
    ]
},

Related

what is the equivalent of jasmine.createSpyObj() for jest?

What is the equivalent of the below code in Jest.
let mockHeroService = jasmine.createSpyObj(['getHeros', 'addHero', 'deleteHero']);
I would like to use it in the TestBed.
TestBed.configureTestingModule({
    providers: [
        {
            provide: HeroService,
            useValue: mockHeroService
        }
    ]
});
My understanding is that, with jest, you can only spy on one method of a service like
const spy = jest.spyOn(HeroService, 'getHeros');
Thanks for helping
There's no equivalent because it doesn't have many uses. Jest is focused on modular JavaScript and generates auto-mocks (stubs) with jest.mock and jest.createMockFromModule.
The problem with auto-mocks is that they result in an unspecified set of functions that behave differently from the original ones, and can cause the code that uses them to work incorrectly or fail silently.
A mock with no implementation can be defined as:
let mockHeroService = { getHeros: jest.fn(), ... };
Most times some implementation is expected:
let mockHeroService = { getHeros: jest.fn().mockReturnValue(...), ... };
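If you do want the jasmine-style ergonomics, a small helper is easy to sketch. This is not a Jest API, just an illustrative function; the spyFactory parameter is an assumption added here so the sketch is not hard-wired to jest.fn:

```javascript
// Hypothetical helper approximating jasmine.createSpyObj.
// Inside a Jest test you would call createSpyObj([...], () => jest.fn());
// the factory parameter keeps the sketch independent of Jest itself.
function createSpyObj(methodNames, spyFactory) {
  const obj = {};
  for (const name of methodNames) {
    obj[name] = spyFactory(); // one spy per method, as jasmine does
  }
  return obj;
}
```

Then `let mockHeroService = createSpyObj(['getHeros', 'addHero', 'deleteHero'], () => jest.fn());` can be passed to useValue exactly as in the question.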

Using nock with Intern4 and dojo - What is the correct approach

I have been experimenting with Intern as a testing platform for our code base, which has a number of oddities to it. We basically load outside of the dojoLoader and core dojo files. This means we are outside the release process and lose all the testing goodness that it brings.
I have taken on the task of developing a tool chain that will manage the code (linting, build) and finally testing. And I have got to grips with most aspects of Unit testing and Functional Tests but testing 3rd Party APIs has really had me scratching my head.
From reading the docs, I should be able to use Nock to mock the API. I have tried many different examples to get a basic hello world working, with varying degrees of success.
What I am noticing is that Nock seems to play nicely when you are using Node natively, but it all falls to pieces once Dojo is brought into the equation. I have tried using axios and get, with tdd and bdd, all of which seem to fail miserably.
I have had a breakthrough moment with the below code, which will allow me to test a mock API with Nock successfully.
I have seen other examples taking a TDD approach to this but when I use the define pattern, there is no done() to signify the async process is complete.
While the below does work, I feel I have had to jump through many hoops to get to this point, e.g. the lack of the core node module util.promisify (I am currently running Node v9.10.x) and the lack of support for import et al., all of which makes adapting examples very tough.
I am new to intern and I wonder if there a preferred or standard approach that I am missing which would make this simpler. I honestly prefer the TDD/BDD pattern visually but if the below is my only option for my setup then I will accept that.
define([
    'require',
    'dojo/node!nock',
    'dojo/node!http',
    'dojo/node!es6-promisify'
], function (require, nock, http, util) {
    const { registerSuite } = intern.getInterface('object');
    const { assert } = intern.getPlugin('chai');
    const { get } = http;
    const { promisify } = util;
    const _get = promisify(get);

    registerSuite('async demo', {
        'async test'() {
            const dfd = this.async();
            nock('http://some-made-up-service.com')
                .get('/request')
                .reply(200, { a: 'Hello world!' });
            http.request({
                host: 'some-made-up-service.com',
                path: '/request',
                port: 80
            }, function (res) {
                res.on('data', dfd.callback((data) => {
                    var toJSON = JSON.parse(data.toString());
                    assert.strictEqual(toJSON.a, 'Hello world!');
                }));
            }).end();
        }
    });
});
My config is here also, I am sure there are entries in the file which are unnecessary but I am just figuring out what works at the moment.
{
    "node": {
        "plugins": "node_modules/babel-register/lib/node.js"
    },
    "loader": {
        "script": "dojo",
        "options": {
            "packages": [
                { "name": "app", "location": "asset/js/app" },
                { "name": "tests", "location": "asset/js/tests" },
                { "name": "dojo", "location": "node_modules/dojo" },
                { "name": "dojox", "location": "node_modules/dojox" },
                { "name": "dijit", "location": "node_modules/dijit" }
            ],
            "map": {
                "plugin-babel": "node_modules/systemjs-plugin-babel/plugin-babel.js",
                "systemjs-babel-build": "node_modules/systemjs-plugin-babel/systemjs-babel-browser.js"
            },
            "transpiler": "plugin-babel"
        }
    },
    "filterErrorStack": false,
    "suites": [
        "./asset/js/common/sudo/tests/all.js"
    ],
    "environments": ["node", "chrome"],
    "coverage": "asset/js/common/sudo/dev/**/*.js"
}
If you're using Dojo's request methods, you can use Dojo's request registry to set up mocking, which can be a bit easier to deal with than nock when working with Dojo. Overall, though, the process is going to be similar to what's in your example: mock a request, make a request, and asynchronously resolve the test when the request completes and assertions have been made.
Regarding util.promisify, that's present in Node v8+, so you should be able to use it in 9.10.
Regarding tdd vs bdd, assuming you're referring to Intern test interfaces (although it sounds like you may be referring to something else?), they all support the same set of features. If you can do something with the "object" interface (registerSuite), you can also do it with the "tdd" (suite and test) and "bdd" (describe and it) interfaces.
Regarding the lack of support for import and other language features, that's dependent on how the tests are written rather than a function of Intern. If tests need to run in the Dojo loader, they'll need to be AMD modules, which means no import. However, tests can be written in modern ES6 and run through the TypeScript compiler or babel and emitted as AMD modules. That adds a build step, but at least tests can be written in a more modern syntax.
Note that no node-specific functionality (nock, promisify, etc.) will work in a browser.

express body-parser utf-8 error in test

Super stumped by this. I have some server code that for some reason throws a UTF-8 error in my tests but works fine when running the server normally:
code:
export default ({ projectId = PROJECT_ID, esHost = ES_HOST } = {}) => {
    let app = express();
    app.use(cors());
    app.use(bodyParser.json({ limit: '50mb' }));
    let http = Server(app);
    let io = socketIO(http);
    let server = {
        app,
        io,
        http,
        status: 'off',
        listen(
            port = PORT,
            cb = () => {
                rainbow(`⚡️ Listening on port ${port} ⚡️`);
            },
        ) {
            this.http.listen(port, () => {
                main({ io, app, projectId, esHost, port });
                this.status = 'on';
                cb();
            });
        },
        close(cb = () => {}) {
            if (this.http) {
                this.http.close(() => {
                    this.status = 'off';
                    cb();
                });
            } else {
                throw '❗️ cannot close server that has not been started ❗️';
            }
        },
    };
    return server;
};
usage (exactly the same, but in the Jest test body-parser isn't working properly):
import createServer from '../server'
let server = createServer()
server.listen(5050);
I'm using postman, post response outside of test:
{
    "projects": [
        {
            "id": "test",
            "active": true,
            "timestamp": "2018-02-25T21:33:08.006Z"
        },
        {
            "id": "TEST-PROJECT",
            "active": true,
            "timestamp": "2018-03-05T21:34:34.604Z"
        },
        {
            "id": "asd",
            "active": true,
            "timestamp": "2018-03-06T23:29:55.348Z"
        }
    ],
    "total": 3
}
unexpected post response inside jest test server:
Error
UnsupportedMediaTypeError: unsupported charset "UTF-8"
    at /Users/awilmer/Projects/arranger/node_modules/body-parser/lib/read.js:83:18
    at invokeCallback (/Users/awilmer/Projects/arranger/node_modules/raw-body/index.js:224:16)
    at _combinedTickCallback (internal/process/next_tick.js:131:7)
    at process._tickCallback (internal/process/next_tick.js:180:9)
So I was able to reproduce the issue, find its source, and work out a workaround to make it work. The issue is caused by the jest framework.
Before you jump into the rest of this answer, I would suggest you read another Jest thread I answered a while back; it gives some context on the internals of the require method in jest:
Specify code to run before any Jest setup happens
Cause
The issue happens only in test and not in production because of Jest's require method.
When you run your tests, Jest starts an express server, which ends up in node_modules/raw-body/index.js.
At that point encodings is null, because the iconv-lite module lazy-loads its encodings: they are only loaded when the getCodec method gets executed.
When your test fires the API, the server needs to read the body, so getCodec gets called.
That call then goes through the custom require method in jest-runtime/build/index.js (which is overloaded, as explained in the link above).
execModule has a check on this._environment.global, which is blank in this case, so a null value is returned and the module never gets executed.
As a result, the exports of the encodings module are just a blank object.
So the issue is purely with jest. A feature jest lacks, or mostly a bug?
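To make the mechanism concrete, the lazy-loading pattern at the heart of this looks roughly like the sketch below. This is not iconv-lite's actual source, just an illustration of a deferred require whose result is cached:

```javascript
// Illustrative sketch, not iconv-lite's real code: the codec table is
// only populated on first use. Under Jest's overloaded require, the
// deferred load can yield an empty object, so every lookup then fails.
let encodings = null;

function getCodec(name) {
  if (encodings === null) {
    // stands in for: encodings = require('./encodings');
    encodings = { utf8: { decode: (buf) => buf.toString('utf8') } };
  }
  const codec = encodings[name.toLowerCase().replace(/[-_]/g, '')];
  if (!codec) {
    throw new Error('Encoding not recognized: ' + name);
  }
  return codec;
}
```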
Related Issues
Related issues have already been discussed on below threads
https://github.com/facebook/jest/issues/2605
https://github.com/RubenVerborgh/N3.js/issues/120
https://github.com/sidorares/node-mysql2/issues/489#issuecomment-313374683
https://github.com/ashtuchkin/iconv-lite/issues/118
https://github.com/Jason-Rev/vscode-spell-checker/issues/159
Fix
The fix is to load the module during the test itself, forcing an early load instead of a lazy one. This can be done by adding a line at the top of your index.test.js:
import encodings from '../../node_modules/iconv-lite/encodings';
import createServer from '#arranger/server';
After the change all the tests pass, though there is an error in the URL used by the test, so you get Cannot POST /.
I'm adding a slightly different solution inspired by Tarun Lalwani's answer.
Add the following lines at the top of your test file.
const encodings = require('./node_modules/iconv-lite/encodings');
const iconvLite = require('./node_modules/iconv-lite/lib');
iconvLite.getCodec('UTF-8');
I spent many hours trying to figure out why Jest would report a 415 error code when testing the Node.js server. Node.js is configured to use app.use(bodyParser.json(...)); on our system, too. That didn't solve the issue.
Solution
When using res.status(...), you MUST either chain .json() onto it or call res.json() too. That means if you respond with a 500 error (or any other status) and you don't return any JSON data, you still need to call res.json(). No idea why, as that defeats the whole purpose of app.use(bodyParser.json(...)); in the first place.
Example
const express = require('express');
const router = express.Router();

router.post("/register", (req, res) => {
    // ...
    res.status(500).json();
    // ...
});

log4js - node package - not writing to file

I am a novice to nodejs and wanted my node app to log to a file as well as the console.
Here is my code structure:
fpa_log4j.json:
{
    "appenders": {
        "fpa_file": {
            "type": "file",
            "filename": "fpa.log",
            "maxLogSize": 10485760,
            "backups": 10,
            "compress": true
        },
        "screen": {
            "type": "stdout",
            "layout": {
                "type": "coloured"
            }
        }
    },
    "categories": {
        "default": {
            "appenders": [
                "fpa_file",
                "screen"
            ],
            "level": "trace"
        }
    }
}
config.js
var LOG4J_CFG = 'config/fpa_log4j.json';

var fs = require('fs');
var log4js = require('log4js');
var path = require('path');

LOG4J_CFG = path.resolve(LOG4J_CFG);
log4js.configure(JSON.parse(fs.readFileSync(LOG4J_CFG, 'utf8')));
....
....
module.exports = {
    log4js,
    ....
    ....
}
app.js
var cfg = require('./config');
var log = cfg.log4js.getLogger("appMain");
log.info('FPA program STARTED!');
OUTPUT
[2017-12-04T03:20:17.791] [INFO] appMain - FPA program STARTED!
However the log file seems to be empty:
dir fpa.log
Volume in drive C is XXXXXXXXX
Volume Serial Number is XXXXXXXXXXXX
Directory of c:\fpaMain
12/04/2017 12:13 AM 0 fpa.log
1 File(s) 0 bytes
0 Dir(s) 12,242,776,064 bytes free
To be frank, I was able to make this work a few days back. But then something changed (alas, I am not able to recollect what!) and logging to the file stopped.
Ok. So I figured it out finally. Sharing it here for the sake of others. Here it comes:
There were a few more lines in app.js (removed above for clarity), and the app was in fact crashing immediately after the log.info() statement. I was expecting the statements to appear in the log file just as they appeared on the console. But I failed to realize that, unlike many other languages, nodejs performs I/O asynchronously (which in fact is one of the core reasons for my love towards it :)).
So in my case, nodejs just fired the log statement and went on processing the following statements. When the app crashed, it took the whole program down with it. Since writing to the console is much faster than file I/O (for the obvious reasons!!), the message showed up on the screen first. But before the write to the file system could complete, the app crashed, and nothing ended up written to the file!
Lesson Learned:
Always remember that nodejs I/O is asynchronous.
For any I/O related issues like the above, ensure that the program runs to completion, or at least long enough for the pending I/O to finish.

Can I load multiple files with one require statement?

Maybe this question is a little silly, but is it possible to load multiple .js files with one require statement? Like this:
var mylib = require('./lib/mylibfiles');
and use:
mylib.foo(); //return "hello from one"
mylib.bar(); //return "hello from two"
And in the folder mylibfiles will have two files:
One.js
exports.foo= function(){return "hello from one";}
Two.js
exports.bar= function(){return "hello from two";}
I was thinking of putting a package.json in the folder that says to load all the files, but I don't know how. Another approach I was considering is to have an index.js that exports everything again, but then I would be duplicating work.
Thanks!!
P.S.: I'm working with nodejs v0.611 on a Windows 7 machine
First of all, using require does not duplicate anything. It loads the module and caches it, so calling require again will get it from memory (thus you can modify a module on the fly without touching its source code; this is sometimes desirable, for example when you want to store a db connection inside a module).
Also, package.json does not load anything and does not interact with your app at all. It is only used by npm.
Now, you cannot require multiple modules at once. For example, what would happen if both One.js and Two.js defined a function with the same name? There are more problems.
But what you can do, is to write additional file, say modules.js with the following content
module.exports = {
    one: require('./one.js'),
    two: require('./two.js'),
    /* some other modules you want */
}
and then you can simply use
var modules = require('./modules.js');
modules.one.foo();
modules.two.bar();
I have a snippet of code that requires more than one module, but it doesn't clump them together as your post suggests. However, that can be overcome with a trick that I found.
function requireMany () {
    return Array.prototype.slice.call(arguments).map(function (value) {
        try {
            return require(value)
        }
        catch (event) {
            return console.log(event)
        }
    })
}
And you use it as such
requireMany("fs", "socket.io", "path")
Which will return
[ fs {}, socketio {}, path {} ]
If a module is not found, an error will be logged to the console, but it won't break the programme; the failed entry will simply appear in the array as undefined. The array will not be shorter because one of the modules failed to load.
Then you can bind those each of those array elements to a variable name, like so:
var [fs, socketio, path] = requireMany("fs", "socket.io", "path")
It essentially works like destructuring an object: each element of the returned array is bound to the corresponding local variable name. So, in your case, you could do:
var [foo, bar] = requireMany("./foo.js", "./bar.js")
foo() //return "hello from one"
bar() //return "hello from two"
And if you do want it to break the programme on error, just use this modified version, which is smaller
function requireMany () {
    return Array.prototype.slice.call(arguments).map(require)
}
Yes, you may require a folder as a module, according to the node docs. Let's say you want to require() a folder called ./mypack/.
Inside ./mypack/, create a package.json file with the name of the folder and a main javascript file with the same name, inside a ./lib/ directory.
{
    "name": "mypack",
    "main": "./lib/mypack.js"
}
Now you can use require('./mypack') and node will load ./mypack/lib/mypack.js.
However if you do not include this package.json file, it may still work. Without the file, node will attempt to load ./mypack/index.js, or if that's not there, ./mypack/index.node.
My understanding is that this could be beneficial if you have split your program into many javascript files but do not want to concatenate them for deployment.
You can use destructuring assignment to map an array of exported modules from require statements in one line:
const requires = (...modules) => modules.map(module => require(module));
const [fs, path] = requires('fs', 'path');
I was doing something similar to what freakish suggests in his answer, with a project where I have a list of test scripts that are pulled into a Puppeteer + Jest testing setup. My test files follow the naming convention testname1.js to testnameN.js, and I was able to use a generator function to require N files from a particular directory with the approach below:
const fs = require('fs');
const path = require('path');

module.exports = class FilesInDirectory {
    constructor(directory) {
        this.fid = fs.readdirSync(path.resolve(directory));
        this.requiredFiles = (this.fid.map((fileId) => {
            let resolvedPath = path.resolve(directory, fileId);
            return require(resolvedPath);
        })).filter(file => !!file);
    }

    printRetrievedFiles() {
        console.log(this.requiredFiles);
    }

    nextFileGenerator() {
        const parent = this;
        const fidLength = parent.requiredFiles.length;
        function* iterate(index) {
            while (index < fidLength) {
                yield parent.requiredFiles[index++];
            }
        }
        return iterate(0);
    }
}
Then use like so:
// Use in test
const FilesInDirectory = require('./utilities/getfilesindirectory');
const StepsCollection = new FilesInDirectory('./test-steps');
const StepsGenerator = StepsCollection.nextFileGenerator();

// Assuming we're in an async function
await StepsGenerator.next().value.FUNCTION_REQUIRED_FROM_FILE(someArg);
