Global in jest puppeteer custom test environment loses context in test - jestjs

I'm writing an npm module to reduce the boilerplate code in my test project; the module will be added as a dev dependency of that project. Within this npm module I also use prototypal inheritance to extend an existing class (Page) of the puppeteer library.
When I create a global browser in the custom test environment and use it to create a page instance in the test, the Page class loses all the properties I added to it via prototypal inheritance, and it throws the error page.sendText is not a function.
But when I create the browser instance in the test itself, I am able to use the properties I added to the Page class.
GLOBALSETUP in npm module
const browser = await puppeteer.launch({ /* some config */ });
TEST ENVIRONMENT in npm module:
this.global.__BROWSER__ = await puppeteer.connect({
  browserWSEndpoint: wsEndpoint,
});
Extended PAGE class in npm module:
const Page = require('puppeteer/lib/Page').Page;

Page.prototype.sendText = async function (selector, text) {
  let element = await this.waitForSelector(selector, { visible: true });
  await element.type(text);
};
Test in test project after locally installing the npm module:
describe('awesome test', () => {
  it('something will be ok', async () => {
    const context = global.__BROWSER__.defaultBrowserContext();
    const page = await context.newPage();
    await page.goto('https://google.com/');
    await page.sendText('#search', 'puppeteer');
    await page.screenshot({ path: 'google.png', fullPage: true });
  });
});
The error is: page.sendText is not a function
I was expecting sendText to work.

Related

Error: NormalModuleFactory is no longer a waterfall hook?

I am trying to import WalletProvider from "@truffle/hdwallet-provider" in a ReactJS component, and it gives me this error as soon as I execute npm run:
Error: NormalModuleFactory.resolve (NormalModuleFactory) is no longer a waterfall hook, but a bailing hook instead. Do not return the passed object, but modify it instead. Returning false will ignore the request and results in no module created. Returning a Module object will result in this module used as result.
I tried the same in a separate JS file and it works fine:
const Web3 = require("web3");
const WalletProvider = require("@truffle/hdwallet-provider");

let provider = new WalletProvider({
  mnemonic: {
    phrase:
      "***************************************************************",
  },
  providerOrUrl: "https://goerli.infura.io/v3/*******************",
});

const web3 = new Web3(provider);

const fetch123 = async () => {
  const accounts = await web3.eth.getAccounts();
  console.log(accounts);
};

fetch123();

Puppeteer: TypeError: Readable is not a constructor

I have been trying to use Puppeteer@15.5.0 to generate a PDF on the server side in Node.js.
import { launch } from 'puppeteer';
...
const browser = await launch();
const page = await browser.newPage();
await page.setContent('COME ON!');
console.log(await page.content());
const pdfBuffer = await page.pdf();
The console.log statement gives me the expected output of <html><head></head><body>COME ON!</body></html>
It then runs into the following error:
Error:
TypeError: Readable is not a constructor
at getReadableFromProtocolStream (/Users/kaziehsanaziz/Work/DocSpace/repos/docspace-pay/.webpack/service/src/public-lambda.js:405775:12)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at async Page.pdf (/Users/kaziehsanaziz/Work/DocSpace/repos/docspace-pay/.webpack/service/src/public-lambda.js:403129:26)
at async /Users/kaziehsanaziz/Work/DocSpace/repos/docspace-pay/.webpack/service/src/public-lambda.js:329729:31
Puppeteer cannot be bundled with Webpack, and the issue was that I was trying to do just that. In my case, since I was using Serverless, the solution was to tell the serverless-bundle plugin not to bundle Puppeteer.
bundle:
  packager: yarn
  esbuild: true
  forceExclude:
    - aws-sdk
    - puppeteer
  externals:
    - puppeteer-core
    - '@sparticuz/chrome-aws-lambda'
The forceExclude does the trick for the local environment; externals is what helps in the production environment.
I have also run into this issue. It occurs when webpack (v5 on my end) bundles puppeteer. I solved it by explicitly adding the webpackIgnore directive when importing a file which uses puppeteer. I did this via a dynamic ES import, but a static one could be done in a very similar way:
const loadModule = async (modulePath) => {
  try {
    return await import(/* webpackIgnore: true */ modulePath)
  } catch (e) {
    throw new ImportError(`Unable to import module ${modulePath}`)
  }
}
const renderPdf = (await loadModule('../../renderPdf/index.js')).default
Use a require('puppeteer') call instead of an import puppeteer statement.
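If it helps, a minimal sketch of that suggestion, assuming a CommonJS file that the bundler leaves to be resolved at runtime:

// Instead of the ES import that ends up in the bundle:
// import { launch } from 'puppeteer';

// use a CommonJS require so puppeteer is resolved from node_modules at runtime:
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  // ... use the browser as usual
  await browser.close();
})();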

Make jest compile/transform/serve the module under test locally with puppeteer

I need to pass a function, written in TypeScript, that should run in the browser. The issue I am having is that I need the module I am testing to be transpiled and then encoded so I can pass it to the browser in puppeteer, where it will run normally. This was the approach I was using, and it works: in short, I was using esbuild to bundle the module, then readFile plus encoding so that I can import it in the browser and run it there.
I am wondering if there is a better way to do this with jest-puppeteer. I can't use page.exposeFunction() because that runs in the Node environment, and passing the encoded function would give the browser TS code, which is not what I want. To understand better, look at the code below.
// file: module_under-test.e2e.test.ts

// importing does not help us because we might need the whole module encoded
import { testFn } from './module_under-test';
import fs, { readFileSync } from 'fs';
import util from 'util';

const readFile = util.promisify(fs.readFile);

// this will encode the module in a string, so it can be imported in the browser
async function importer(path) {
  return `data:text/javascript;utf-8,${encodeURIComponent((await readFile(path, { encoding: 'utf-8' })))}`;
}
describe('Basic authentication e2e tests', () => {
  beforeAll(async () => {
    await page.setViewport({
      width: 1920,
      height: 1080,
      deviceScaleFactor: 1
    });
    // we do stuff like opening the page and logging in, etc.
  });

  it('testToRunOnBrowser', async () => {
    // module should already be transpiled, but this was the old approach; I would use importer from the dist folder.
    // with this the test passes, but we don't want to have to transpile code every time we run it,
    // since we could already do that with only esbuild and puppeteer
    expect(await page.evaluate(testToRunOnBrowser, await importer('../dist/module_under-test.mjs'))).toBe(true);
  })
});
export async function testToRunOnBrowser(deps) {
  const { testFn } = await import(deps)
  const ctx = new browserGlobalFunctionCtx();
  const data = ctx.DoGLobalBrowserThings();
  ctx.load(data);
  const dataLoaded = await testFn()
  return dataLoaded === 'what I want to assert'
}
One way I did think of, but was not able to do, is serving the whole src folder, since all the code from this project should be tested in the browser. With that I could use @babel/standalone with "@babel/plugin-transform-modules-umd" and just import the TS in the browser; a rough sketch of that idea follows below. Any ideas or pointers on how to do that with jest-puppeteer?
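For what it's worth, here is a rough sketch of the in-browser step that idea implies, assuming @babel/standalone has been loaded in the page and the raw src folder is served at /src (both of those are assumptions, not part of the original setup). It transpiles the TypeScript in place and keeps ES modules, since a data-URL dynamic import can consume plain ESM directly; the UMD module transform mentioned above is omitted for that reason.

// Inside page.evaluate(): fetch the raw TypeScript source served from the src folder (path is an assumption).
const tsSource = await (await fetch('/src/module_under-test.ts')).text();

// Transpile it in the browser with @babel/standalone (it exposes a global `Babel`).
const { code } = Babel.transform(tsSource, {
  filename: 'module_under-test.ts',
  presets: ['typescript'],
});

// Import the transpiled ES module the same way importer() above does for prebuilt files.
const { testFn } = await import(`data:text/javascript;utf-8,${encodeURIComponent(code)}`);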

Jest: initialize and share objects once per test suite and across test cases

I want to use shared resources between Jest test suites. I read on the internet and found that a custom test environment could be the solution, but the setup is invoked per test file.
I have two test files, links.test.js and subscriptions.test.js. I usually run them with the single command jest, and that's all.
The problem is that the setup function of my custom environment custom-environment.js:
const NodeEnvironment = require('jest-environment-node');
const MySql = require('../../lib/databases/myslq/db');

class CustomEnvironment extends NodeEnvironment {
  constructor(config) {
    super(config)
  }

  async setup() {
    await super.setup();
    console.log(`Global Setup !!!!!!!!!`);
    this.global.gObject = "I am global object"
    this.global.liveUsers = await new MySql("Live Users");
    this.global.stageUsers = await new MySql("Stage Users");
  }

  async teardown() {
    console.log(`Global teardown !!!!!!!!!`);
    await super.teardown();
    this.global.gObject = "I am destroyed";
    this.global.liveUsers.closeConnection();
    this.global.stageUsers.closeConnection();
  }

  runScript(script) {
    return super.runScript(script)
  }
}

module.exports = CustomEnvironment;
is called twice, once per test file:
Global Setup !!!!!!!!!
Global Setup !!!!!!!!!
ERROR>>> Error: listen EADDRINUSE: address already in use 127.0.0.1:3306
So it tries to establish a second connection to the same port, while I could simply reuse the existing connection.
The way it works seems to me no different from defining
beforeAll(async () => {
});
afterAll(() => {
});
hooks.
So to wrap up, the question is: using the jest command (thus running all test suites), how can I invoke the setup function once for all tests and share global objects across them?
setup and teardown are indeed executed for each test suite, similarly to top-level beforeAll and afterAll.
Test suites run in separate processes, and the test environment is initialized for each test suite; e.g. the jsdom environment provides a fresh fake DOM instance for each suite, so suites cannot cross-contaminate each other.
As the documentation states,
Note: TestEnvironment is sandboxed. Each test suite will trigger setup/teardown in their own TestEnvironment.
The environment isn't suitable for global setup and teardown; globalSetup and globalTeardown should be used for that. They are appropriate for setting up and shutting down server instances, which is what the documentation example shows:
// setup.js
module.exports = async () => {
  // ...
  // Set reference to mongod in order to close the server during teardown.
  global.__MONGOD__ = mongod;
};

// teardown.js
module.exports = async function () {
  await global.__MONGOD__.stop();
};
Since this happens in the parent process, __MONGOD__ is unavailable in test suites.
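For completeness, a minimal sketch of how those two files are wired up in the Jest configuration (the file paths are assumptions):

// jest.config.js
module.exports = {
  globalSetup: './setup.js',
  globalTeardown: './teardown.js',
};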

Is there a jest config that will fail tests on console.warn?

How do I configure jest tests to fail on warnings?
console.warn('stuff');
// fail test
You can use this simple override:
let error = console.error

console.error = function (message) {
  error.apply(console, arguments) // keep default behaviour
  throw (message instanceof Error ? message : new Error(message))
}
You can make it available across all tests using Jest setupFiles.
In package.json:
"jest": {
"setupFiles": [
"./tests/jest.overrides.js"
]
}
Then put the snippet into jest.overrides.js
For those using create-react-app, not wanting to run npm run eject, you can add the following code to ./src/setupTests.js:
global.console.warn = (message) => {
  throw message
}

global.console.error = (message) => {
  throw message
}
Now, jest will fail when messages are passed to console.warn or console.error.
create-react-app Docs - Initializing Test Environment
I implemented this recently using jest.spyOn, introduced in v19.0.0, to mock the warn method of console (which is accessed via the global context/object).
You can then expect that the mocked warn was not called, as shown below.
describe('A function that does something', () => {
  it('Should not trigger a warning', () => {
    var warn = jest.spyOn(global.console, 'warn');

    // Do something that may trigger warning via `console.warn`
    doSomething();
    // ... i.e.
    console.warn('stuff');

    // Check that warn was not called (fail on warning)
    expect(warn).not.toHaveBeenCalled();

    // Cleanup
    warn.mockReset();
    warn.mockRestore();
  });
});
There is a useful npm package that helps you to achieve that: jest-fail-on-console
It's easily configurable.
Install:
npm i -D jest-fail-on-console
Configure:
In a file used in the setupFilesAfterEnv option of Jest, add this code:
import failOnConsole from 'jest-fail-on-console'
failOnConsole()
// or with options:
failOnConsole({ shouldFailOnWarn: false })
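A minimal sketch of the wiring, assuming the snippet above lives in tests/fail-on-console.js (the file name is an assumption):

// jest.config.js
module.exports = {
  setupFilesAfterEnv: ['./tests/fail-on-console.js'],
};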
I decided to post a full example based on user1823021's answer:
describe('#perform', () => {
  var api

  // global.fetch is set to a jest.fn() object globally
  global.fetch = jest.fn()
  var warn = jest.spyOn(global.console, 'warn');

  beforeEach(function() {
    // before every test, all mocks need to be reset
    api = new Api()
    global.fetch.mockReset()
    warn.mockReset()
  });

  it('triggers a console.warn if fetch fails', function() {
    // In this test the fetch mock throws an error
    global.fetch.mockImplementationOnce(() => {
      throw 'error triggered'
    })

    // I run the test
    api.perform()

    // I verify that the warn spy has been triggered
    expect(warn).toHaveBeenCalledTimes(1);
    expect(warn).toBeCalledWith("api call failed with error: ", "error triggered")
  });

  it('calls fetch function', function() {
    // I create 2 more mock objects to verify the fetch parameters
    const url = jest.fn()
    const config = jest.fn()
    api.url = url
    api.config = config

    // I run the test
    api.perform()

    // I verify that fetch has been called with the url and config mocks
    expect(global.fetch).toHaveBeenCalledTimes(1)
    expect(global.fetch).toBeCalledWith(url, config)
    expect(warn).toHaveBeenCalledTimes(0)
  });
})
The #perform method I am testing:
class Api {
  constructor(auth) {
    this._credentials = auth
  }

  perform = async () => {
    try {
      return await fetch(this.url, this.config)
    } catch (error) {
      console.warn('api call failed with error: ', error)
    }
  }
}
You can set the environment variable CI=true before running jest which will cause it to fail tests on warnings in addition to errors.
Example which runs all test files in the test folder:
CI=true jest ./test
Automated CI/CD pipelines such as Github Actions set CI to true by default, which can be one reason why a unit test will pass on your local machine when warnings are thrown, but fail in the pipeline.
(Here is the Github Actions documentation on default environment variables: https://docs.github.com/en/actions/learn-github-actions/environment-variables#default-environment-variables)
