How to properly test a module that requires newrelic with jest - node.js

I'm working on migrating a bunch of unit tests from mockery to Jest. When I run Jest on a module that requires the New Relic agent like so: require('newrelic'), I get downstream errors like:
- TypeError: Cannot convert undefined or null to object
at Object.<anonymous> (node_modules/newrelic/lib/config.js:165:33)
at Runtime._execModule (node_modules/jest-cli/src/Runtime/Runtime.js:261:17)
at Object.<anonymous> (node_modules/newrelic/lib/logger.js:18:14)
at Object.<anonymous> (node_modules/newrelic/index.js:3:14)
What is the best way to deal with modules like newrelic which jest has a hard time mocking? What have other people done when they have both jest and newrelic in their stack?

The route I ended up taking was to create a mock module for newrelic in my __mocks__ folder:
module.exports = {
  addCustomParameter: jest.fn()
};
I will probably need to add more functions later, but for now this is enough. I still wonder if there is a way to get jest to auto mock the newrelic library without erroring.

I've seen this with a few modules where automocking fails for various reasons, although it seems to happen a lot less frequently in newer versions of Jest.
As #linuxdan suggests, you might be able to work around the issue by using the manual mocking functionality documented here.
To do this you'll probably want to just export an object with the expected methods generated using jest.fn().
The reason this works is that it stops Jest from trying to determine which methods it needs to auto-mock on the newrelic library; it is during that discovery step that it fails.
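A rough sketch of such a manual mock, placed at __mocks__/newrelic.js; the particular methods listed are assumptions and should mirror whatever parts of the newrelic API your code under test actually calls:
// __mocks__/newrelic.js - manual mock so Jest never loads the real agent
module.exports = {
  addCustomParameter: jest.fn(),
  noticeError: jest.fn(),
  setTransactionName: jest.fn()
};
With this in place, tests can also assert on the recorded calls, for example expect(require('newrelic').addCustomParameter).toHaveBeenCalledWith('someKey', 'someValue').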

Related

Automock modules imported with node protocol in Jest

I'm trying to use Jest manual mocking (ref) with built-in Node.js modules that are imported using the node: protocol (ref) in a TypeScript project. I can get this to work by, for example, creating a file in my project called __mocks__/node:fs.ts and calling jest.mock("node:fs"); in my test.
However, node:fs.ts is an invalid file name on Windows due to the colon. So the question is: is there an alternative file name that's compatible with Jest manual mocking and works with just calling jest.mock("node:fs")?
I tried two alternatives:
Putting the mocks for built-in modules at __mocks__/node/fs.ts; as expected, this didn't work.
For now I landed on the non-ideal solution of mocking built-in modules like jest.mock("node:fs", () => require("../path/to/the/node-fs.mock.ts"));. This works, but is not a nice solution.
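Concretely, the factory-based workaround looks roughly like this in a test (here with the mock object inlined rather than required from a separate file; the mocked method set is illustrative):
// fs.test.js - sketch only; jest.mock with a factory avoids needing a
// __mocks__/node:fs.ts file on disk entirely
jest.mock("node:fs", () => ({
  readFileSync: jest.fn()
}));

const { readFileSync } = require("node:fs");

test("reads through the mocked node:fs", () => {
  readFileSync.mockReturnValue("mock contents");
  expect(readFileSync("/some/file", "utf-8")).toBe("mock contents");
});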
For reference, here's a link to a project where I'm having this issue: https://github.com/ericcornelissen/svgo-action/tree/b8a750b3738ba631e7be677ce03a90c90bab2783

NestJS - Using DotEnv

I am working with NestJS along with TypeORM (MySQL).
The project itself is provisioned by Terraform, run by Jenkins, and deployed on K8s.
I will use process.env.******* for the DB connection, and when it comes to deployment (test, stage and prod) I really don't care: Jenkins provides the credentials (supplied by Terraform).
However, I want a local mode that makes it friendly for other developers to start the service locally.
In my previous project, I had an extra file in the root. That file was only a wrapper, which loads dotenv and then the main app file.
Something like this:
require('dotenv').config();
const lambdaApp = require('./index');
lambdaApp.handler()
That was simple and easy to use. I just have a .env.example file, and if you need it, you set up your own .env yourself.
I figured I would do the same with NestJS. Unfortunately, I am stuck.
If I were to use local.index.js to load dotenv, how can I load and execute the main.ts file? I could call the bootstrap() function, but it won't work.
Simple approach that did not work:
require('dotenv').config();
const mainApp = require('./main.ts');
mainApp.bootstrap();
The main.ts needs to be compiled from TypeScript to JavaScript first.
I could probably find some way to do that in code, but it looks really wrong. There has to be a simpler way to achieve this which, unfortunately, I am not seeing.
This was a case of not reading the documentation and reinventing the wheel. In my defense, I can say there are so many things to read that I don't have time. That is the pure truth, but time and reading can be managed. I should have checked the official documentation first, and I would have found the answer there.
Anyway, right here it is explained. I will not post the full code samples, since it is pointless to do so. They also use the dotenv library and an env file.
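The gist of the documented approach is a small ConfigService that loads the env file with dotenv and exposes the values; a minimal sketch of that shape (file and class names are illustrative, not the exact code from the docs):
// config.service.js - dotenv-backed config service in the spirit of the
// NestJS configuration docs; parses the env file into a plain object
// instead of mutating process.env
const dotenv = require('dotenv');
const fs = require('fs');

class ConfigService {
  constructor(filePath) {
    this.envConfig = dotenv.parse(fs.readFileSync(filePath));
  }

  get(key) {
    return this.envConfig[key];
  }
}

module.exports = { ConfigService };
Registered as a provider, it can be injected wherever the DB connection options are built, and locally it is simply pointed at the developer's own .env file.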

How to prevent Mocha from preserving require cache between test files?

I am running my integration test cases in separate files for each API.
Before it begins I start the server along with all services, like databases. When it ends, I close all connections. I use Before and After hooks for that purpose. It is important to know that my application depends on an enterprise framework where most "core work" is written and I install it as a dependency of my application.
I run the tests with Mocha.
When the first file runs, I see no problems. When the second file runs I get a lot of errors related to database connections. I tried to fix it in many different ways; most of them failed because of the limitations the Framework imposed on me.
While debugging I found out that Mocha actually loads all the files first, which means that all code written before the hooks and the describe calls is executed up front. So when the second file is loaded, require.cache is already full of modules. Only after that does the suite execute the tests sequentially.
That has a huge impact on this Framework because many objects are actually singletons, so if an after hook closes a connection to a database, it closes the connection inside the singleton. The way the Framework was built makes it very hard to work around this problem, for example by reconnecting to all services in the before hook.
I wrote some very ugly code that helps me until I can refactor the Framework. This goes in each test file where I want to invalidate the cache.
function clearRequireCache() {
  Object.keys(require.cache).forEach(function (key) {
    delete require.cache[key];
  });
}

before(() => {
  clearRequireCache();
});
It is working, but it seems to be a very bad practice, and I don't want this in the code.
As a second idea I was thinking about running Mocha multiple times, once for each "module" (in the sense of my Framework) or file:
"scripts": {
"test-integration" : "./node_modules/mocha/bin/mocha ./api/modules/module1/test/integration/*.integration.js && ./node_modules/mocha/bin/mocha ./api/modules/module2/test/integration/file1.integration.js && ./node_modules/mocha/bin/mocha ./api/modules/module2/test/integration/file2.integration.js"
}
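Chaining commands like that gets unwieldy as modules are added, so the same idea could also be written as a small wrapper script that spawns a fresh Mocha process per file, each starting with an empty require cache. A rough sketch, assuming the glob package is available:
// run-integration.js - hypothetical wrapper around the npm script above
const { execFileSync } = require('child_process');
const glob = require('glob');

const files = glob.sync('./api/modules/**/test/integration/*.integration.js');

files.forEach(function (file) {
  // execFileSync throws on a non-zero exit code, so a failing file stops the run
  execFileSync('./node_modules/.bin/mocha', [file], { stdio: 'inherit' });
});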
I was wondering if Mocha provides a solution for this problem, so I can get rid of that code and delay the refactoring a bit.

Debug compilation errors using Angular Universal in an Angular 8 Application

A Brief Backstory
I have been implementing several upgrades to an Angular 8 application, such as server-side rendering and Google Analytics. As most developers do, I would code, then test, then move on to the next task. Typically I use ng serve to run the application as I am developing.
With server-side rendering, to test speed, lazy-loaded images, etc., you need to use a Node Express server running the generated JS file. After building, I run node prerender (my JS file is prerender.js) to see what the application will look like when prerendered on the server.
When I run this command, I should not get any errors, and I know my prerender file will start a local server on port 4000.
The Problem
I get errors when running the Node Express server that I do not get when running with ng serve. I recently got an error that said:
Unhandled Promise rejection: Cannot read property 'subscribe' of undefined ; Zone: <root> ; Task: Promise.then ; Value: TypeError: Cannot read property 'subscribe' of undefined
at new ApplicationRef (C:\4towerdevelopment\dist-stage\server\main.js:45910:37)
at _createClass (C:\4towerdevelopment\dist-stage\server\main.js:37184:20)
at _createProviderInstance (C:\4towerdevelopment\dist-stage\server\main.js:37138:26)
at initNgModule (C:\4towerdevelopment\dist-stage\server\main.js:37044:32)
at new NgModuleRef_ (C:\4towerdevelopment\dist-stage\server\main.js:38176:9)
at Object.createNgModuleRef (C:\4towerdevelopment\dist-stage\server\main.js:38159:12)
at NgModuleFactory_.create (C:\4towerdevelopment\dist-stage\server\main.js:50821:25)
at C:\4towerdevelopment\dist-stage\prerender.js:29175:43
at ZoneDelegate.invoke (C:\4towerdevelopment\dist-stage\prerender.js:481:26)
at Object.onInvoke (C:\4towerdevelopment\dist-stage\prerender.js:28683:33) TypeError: Cannot read property 'subscribe' of undefined
at new ApplicationRef (C:\4towerdevelopment\dist-stage\server\main.js:45910:37)
at _createClass (C:\4towerdevelopment\dist-stage\server\main.js:37184:20)
at _createProviderInstance (C:\4towerdevelopment\dist-stage\server\main.js:37138:26)
at initNgModule (C:\4towerdevelopment\dist-stage\server\main.js:37044:32)
at new NgModuleRef_ (C:\4towerdevelopment\dist-stage\server\main.js:38176:9)
at Object.createNgModuleRef (C:\4towerdevelopment\dist-stage\server\main.js:38159:12)
at NgModuleFactory_.create (C:\4towerdevelopment\dist-stage\server\main.js:50821:25)
at C:\4towerdevelopment\dist-stage\prerender.js:29175:43
at ZoneDelegate.invoke (C:\4towerdevelopment\dist-stage\prerender.js:481:26)
at Object.onInvoke (C:\4towerdevelopment\dist-stage\prerender.js:28683:33)
The closest this gets me to figuring out what is actually causing the problem is letting me know that a provider somewhere is causing this error. It looks like something should be an observable rather than a subscription. Beyond that, I guess I just start looking through my providers. My questions are:
How can I force Angular to possibly throw this error when developing using ng serve?
If I can't, is there a better way to debug this current error besides combing through each provider? Or at least a way to tell what service is causing the issue?
Thank you.
UPDATE: Basic repo with the problem here. I made a new Angular project, made sure the dependencies were up to date, installed ngUniversal per this post, and received the same unhandled promise message when running node prerender.
Verbatim: I went to the Angular CLI website, made a new project (the default Angular version installed was 8.3), installed Angular Universal, and tried to build. Same error message as above.
This looks like it was ApplicationRef that triggered the error, and that class is provided internally by the Angular core.
It could be failing in the constructor of the class, and there are a few calls to subscribe on Zone observables. I don't think you're going to find anything in your source code that directly relates to this error. It looks like a build configuration problem.
https://github.com/angular/angular/blob/bb52fb798c8578c461d21aee2b7623232184a5d3/packages/core/src/application_ref.ts#L562
I do not know what could possibly produce this problem, but I would start a new project with SSR and compare the differences to your current project.
Eventually, I dropped the Angular 6 approach using pre-render and went with the latest Universal package. There seems to be no problem building with npm run build:ssr and serving dist/server.js from an Express server. I am not sure what the problem was with the pre-render approach, but it seems to be outdated anyway. #Reactgular, thanks for the feedback.

Correct configuration with Gulp, Mocha, Browserify to execute client-side tests with server-side tests

I'm working on a node application utilizing gulp for our build processes and the gulp-mocha plugin for our test-runner.
gulp.task('test', function () {
  return gulp.src(TESTJS)
    .pipe(mocha({reporter: 'spec'}))
    .on("error", function (err) {
      // handle the mocha errors so that they don't cloud the test results,
      // or end the watch
      console.log(err.toString());
      this.emit('end');
    });
});
Currently TESTJS covers only my server-side tests. I want to use this same process to execute my client tests as well. I looked into gulp-blanket-mocha and gave it a shot, but I keep running into the same issue: when trying to test my Backbone code, it fails because the other client components it needs (namely jQuery) are not found by the test runner. I get that I need to use some sort of headless WebKit like PhantomJS, but I am having real trouble figuring out how to incorporate that into this gulp process with browserify.
Has anyone tried getting a setup like this going, or have any ideas what I am missing here in terms of having my gulp "test" task execute my client-side Mocha tests as well as my server-side ones?
A potential setup is:
Test runner - this is the glue between Gulp and Karma, and it provides an option to set the Karma options.files from the gulp.src() stream. Frankly, if you have no steps before your Karma tests, then use Karma directly within a Gulp task, without a Gulp plugin (see the sketch after this list).
Use the associated Karma launcher plugins to run on PhantomJS/Chrome/Firefox.
Use the associated Karma plugins for coverage and alt-JS compilation.
Add more plugins and configure the Karma options for reporting of tests and coverage.
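A minimal sketch of running Karma directly from a Gulp task via Karma's public API, as mentioned in the first point (the config file path and task name are assumptions):
// gulpfile.js excerpt - runs Karma without a Gulp plugin; everything else
// comes from karma.conf.js
const path = require('path');
const gulp = require('gulp');
const { Server } = require('karma');

gulp.task('test-client', function (done) {
  new Server({
    configFile: path.resolve(__dirname, 'karma.conf.js'),
    singleRun: true
  }, done).start();
});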
Using browserify will change the whole setup above.
Since it needs to resolve requires, it must run on all the "entry point" files. Typically your tests require your sources, so the tests must be the entry points.
Use karma-bro - it solves the problems in karma-browserify (at the moment that doesn't even work - it can't handle the browserify 5.0 API) and karma-browserifast.
Coverage becomes tricky since sources, vendor sources and tests are all bundled. So I created a custom coverage transform that marks which code should be instrumented while browserify is bundling.
browserify should be a "preprocessor" in Karma.
A set of transforms ("transform: []") should be configured in the browserify options.
The transforms can be configured by taking an existing transform module and wrapping it with a custom module, like what I did above for browserify-istanbul. A config sketch tying these pieces together follows.
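A rough karma.conf.js of that shape (browser, reporter and transform names are assumptions and depend on which Karma plugins are installed):
// karma.conf.js - illustrative only; mirrors the setup described above
module.exports = function (config) {
  config.set({
    frameworks: ['browserify', 'mocha'],
    files: ['test/client/**/*.spec.js'],
    preprocessors: {
      // the test files are the browserify entry points
      'test/client/**/*.spec.js': ['browserify']
    },
    browserify: {
      debug: true,
      // transforms go here, e.g. a coverage transform such as browserify-istanbul
      transform: ['browserify-istanbul']
    },
    browsers: ['PhantomJS'],
    reporters: ['spec', 'coverage'],
    singleRun: true
  });
};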
