E2E tests fail because of an extra opened database connection - NestJS

I'm trying to set up an environment for e2e tests. I set up an e2e database and a module to load fixtures into it. But after all my manipulations I get the following error:
AlreadyHasActiveConnectionError: Cannot create a new connection named "default", because connection with such name already exist and it now has an active connection session.
I don't understand where the extra connection is created, because in the afterEach hook I destroy all created connections.
Here are some links to my code:
test.e2e-spec.ts
Orm config
app.module.ts
test-utils.ts
Docker-compose
Stack:
NestJS
TypeORM
typeorm-fixtures - a library to generate fixtures
Jest
UPD:
I found a good example of a NestJS project with TypeORM and e2e tests. Maybe it will help someone.
The fix for me was to switch to Prisma in my current project. lol
UPD 2:
I suppose the problem was that I didn't provide a DB connection to my custom repositories. Because of that, every repo opened its own connection, which caused the error. The same error happened with Prisma too. So if you are using custom repos, provide the DB connection.
For example:
{
  provide: UserDITokens.UserRepository,
  useFactory: () => new UserRepository(prisma),
},
Where prisma is a singleton of the DB connection.
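To make that concrete, here is a minimal sketch of wiring one shared client into a custom repository provider (UserDITokens and UserRepository are from the snippet above; the PRISMA token and module name are my assumptions, not the actual project code):
import { Module } from '@nestjs/common';
import { PrismaClient } from '@prisma/client';

const PRISMA = Symbol('PRISMA');

@Module({
  providers: [
    // one client instance, shared by every repository in the app
    { provide: PRISMA, useValue: new PrismaClient() },
    {
      provide: UserDITokens.UserRepository,
      // the factory receives the shared client instead of opening its own connection
      useFactory: (prisma: PrismaClient) => new UserRepository(prisma),
      inject: [PRISMA],
    },
  ],
})
export class UserModule {}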

I had this problem.
Fix
Change beforeEach to beforeAll and afterEach to afterAll. I'm not 100% sure, but I'd argue this is caused by the registration of a new connection each time beforeEach runs: even though you destroy the connection in afterEach, it is still registered.
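A minimal sketch of that change in a Nest e2e spec (AppModule and the import path are assumptions):
import { INestApplication } from '@nestjs/common';
import { Test } from '@nestjs/testing';
import { AppModule } from '../src/app.module'; // path assumed

let app: INestApplication;

// was beforeEach: that registered a new "default" connection before every test
beforeAll(async () => {
  const moduleRef = await Test.createTestingModule({ imports: [AppModule] }).compile();
  app = moduleRef.createNestApplication();
  await app.init();
});

// was afterEach: the single connection is now closed once, after the whole suite
afterAll(async () => {
  await app.close();
});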

Related

How to ensure mongoose.dropDatabase() can ONLY be called when connected to mongo-memory-server

We're using mongodb-memory-server for unit tests. After every unit test suite we execute:
await connection.dropDatabase();
await collection.deleteMany({});
To set up mongoose we have two different methods:
setupMongoose(); <--- Connects to our dev database in the cloud (Atlas)
setupMongooseWithMemoryServer(); <---- Connects mongoose to memory server.
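A rough sketch of what the memory-server variant could look like (assuming mongodb-memory-server v7+, whose create() and getUri() are documented API; this is not the team's actual code):
import { MongoMemoryServer } from 'mongodb-memory-server';
import mongoose from 'mongoose';

let mongod: MongoMemoryServer;

async function setupMongooseWithMemoryServer(): Promise<void> {
  mongod = await MongoMemoryServer.create(); // in-process mongod, nothing external
  await mongoose.connect(mongod.getUri());
}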
We're a team of developers, and my worst fear is that someone uses setupMongoose() to set up unit tests by mistake some day. If that happens, dropDatabase() will be called on our "real" dev database. That would be a catastrophe.
So how can I ensure that dropDatabase() and maybe collection.deleteMany({}) can NEVER ever be called on our cloud database?
Some thoughts:
I have thought about setting environment variables and checking for them before calling the dangerous methods. I've also already added a runtime check:
checkForUnitTestEnv() {
  if (!this.init || process.env.JEST_WORKER_ID === undefined) {
    console.error('FATAL: TRIED TO DROP DATABASE WITHOUT JEST!');
    throw new Error('FATAL: TRIED TO DROP DATABASE WITHOUT JEST!');
  }
}
(this.init is only true if the memory server has been initialized.)
But these methods are not foolproof. Errors can still happen if our developers are not careful. So I was hoping to either make these "illegal operations" with our database provider (Atlas) if possible, or check the mongoose connection URI at runtime before calling the dangerous methods (but I haven't found a good way to do this yet).
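One possible runtime check along those lines is to refuse the drop unless the active connection points at a local host (a sketch under that assumption; the allow-list and wrapper name are hypothetical):
import mongoose from 'mongoose';

const LOCAL_HOSTS = ['127.0.0.1', 'localhost']; // the memory server binds locally

async function guardedDropDatabase(connection: mongoose.Connection): Promise<void> {
  // mongoose exposes the resolved host of the active connection
  if (!LOCAL_HOSTS.includes(connection.host)) {
    throw new Error(`Refusing to drop database on non-local host: ${connection.host}`);
  }
  await connection.dropDatabase();
}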

Does SoftDelete work on TypeORM with dropSchema: true?

I'm implementing a test database which uses TypeORM with dropSchema activated.
And I noticed that my delete routes, which call a soft-delete method, just remove the row (until the database drops). But it works fine on a normal database (dropSchema off).
From the TypeORM Connection Options:
dropSchema - Drops the schema each time connection is being
established. Be careful with this option and don't use this in
production - otherwise you'll lose all production data. This option is
useful during debug and development.
So when you run with dropSchema: true, all the data gets deleted, including the soft-deleted rows.
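To illustrate (a sketch with a hypothetical entity, not the asker's schema): softDelete only stamps the @DeleteDateColumn, so the row survives until dropSchema recreates the schema on the next connection.
import { DeleteDateColumn, Entity, PrimaryGeneratedColumn } from 'typeorm';

@Entity()
class Task {
  @PrimaryGeneratedColumn()
  id: number;

  @DeleteDateColumn()
  deletedAt?: Date; // set by softDelete, cleared by restore
}

// repository.softDelete(id) issues UPDATE ... SET "deletedAt" = now();
// repository.find() then skips the row, while find({ withDeleted: true }) still returns it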

Jest tests hang due to open Sequelize connections

The Setup
I have a NodeJS project that uses:
Jest for testing.
Sequelize as an ORM.
Sequelize is instantiated when the models module is loaded, and that module is imported by a few of the files being tested with Jest.
The Problem
Jest tests pass but then hang with the message:
Jest did not exit one second after the test run has completed.
This usually means that there are asynchronous operations that weren't
stopped in your tests. Consider running Jest with
--detectOpenHandles to troubleshoot this issue.
Note: Adding --detectOpenHandles to the test call does not affect the output.
I do not actually invoke the sequelize object from any test paths, however some of the tested files import the models module and therefore Sequelize is instantiated.
(I will also note that this only occurs on my TravisCI environment, but I suspect this is a red herring.)
The Context
Because Jest is running tests in parallel, the models module is loaded multiple times during the entire test process. I confirmed this with debug output saying SEQUELIZE LOADED which appears multiple times when I run the tests.
The Attempts
I did attempt to invoke sequelize.close() inside a globalTeardown, but this appears to simply open (and then close) a new sequelize connection.
Since none of the tests actually rely on a database connection, I attempted running sequelize.close() within the models module immediately before export. This fixed the issue (though it is obviously not a solution).
I have attempted to configure the test connection pools to aggressively end connections.
const sequelizeConfig = {
  ...
  pool: {
    idle: 0,
    evict: 0,
  }
}
This did nothing.
The Requirements
I don't want to use a brute-force solution such as running Jest with --forceExit. This feels like ignoring the root issue and might expose me to other kinds of mistakes down the line.
My tests are spread across dozens of files, so invoking something in an afterAll in every test file would add a lot of redundancy and introduce a code smell.
The Question
How can I ensure that sequelize connections are closed after tests finish, so they don't cause Jest to hang?
Jest provides a way to run universal setup code before every test suite. This makes it possible to leverage Jest's afterAll to close the sequelize connection pool without having to manually include it in every single test suite.
Example Jest config in package.json
"jest": {
...
"setupFilesAfterEnv": ["./src/test/suiteSetup.js"]
}
Example suiteSetup.js
import models from '../server/models'
afterAll(() => models.sequelize.close())
// Note: in my case sequelize is exposed as an attribute of my models module.
Since the setupFilesAfterEnv file is loaded before every test suite, this ensures that every Jest worker that opened a connection ultimately has that connection closed.
This doesn't violate DRY and it doesn't rely on a clunky --forceExit.
It does mean that connections may be closed that were never opened (which is a bit brute force), but that might be the best option.

NodeJS/Express: ECONNRESET when doing multiple requests using Sequelize/Epilogue

I'm building a webapp using the following architecture:
a PostgreSQL database (called DB),
a NodeJS service (called DBService) using Sequelize to manipulate the DB and Epilogue to expose a REST interface via Express,
a NodeJS service (called Backend) serving as a backend and using DBService through REST calls,
an AngularJS website (called Frontend) using Backend.
Here are the versions I'm using:
PostgreSQL 9.3
Sequelize 2.0.4
Epilogue 0.5.2
Express 4.13.3
My DB schema is quite complex, containing 36 tables, some of which contain a few hundred records. The DB is not written to very often; it is mostly read.
But recently I created a script in Backend to run a complete check-up of the data contained in the DB: basically this script retrieves all data from all tables and does some basic checks on it. Currently the script only reads from the database.
In order to run my script I had to remove the pagination limit of Epilogue by using the option pagination: false (see https://github.com/dchester/epilogue#pagination).
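For reference, the relevant Epilogue setup presumably looks roughly like this (model and endpoints are hypothetical; initialize() and resource() are Epilogue's documented API):
const epilogue = require('epilogue');

epilogue.initialize({ app: app, sequelize: sequelize });

epilogue.resource({
  model: CallType,
  endpoints: ['/CallTypes', '/CallTypes/:id'],
  pagination: false, // no paging: every matching record in one response
});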
But now when I launch my script I randomly get this kind of error:
The request failed when trying to retrieve a uniquely associated objects with URL:http://localhost:3000/CallTypes/178/RendererThemes.
Code : -1
Message : Error: connect ECONNRESET 127.0.0.1:3000
The error appears randomly during the script's execution: it's not always this URL that is returned, and not even always the same tables or relations. The error message before the code is a custom message returned by Backend.
The URL refers to the DBService, but I don't see any error there, even when using logging: console.log in Sequelize and DEBUG=express:* to see what happens in Express.
I tried putting some setTimeout calls in my Backend script to slow it down, without any real change. I also tried tweaking different values such as the PostgreSQL max_connections limit (I set it to 1000 connections), and Sequelize's maxConcurrentQueries and pool values, but without success yet.
I did not find where I can customize the connection pool of Express; maybe that would do the trick.
I assume the error comes from DBService, from the Express configuration, or from somewhere in the configuration of the DB (either in Sequelize/Epilogue or even in the PostgreSQL server itself), but as I did not see any error in any log, I'm not sure.
Any ideas to help me solve it?
EDIT
After further investigation I may have found the answer, which is very similar to How to avoid a NodeJS ECONNRESET error?: I'm using my own RestClient object to do my HTTP requests, and this object was built as a singleton with this method:
var NodeRestClient: any = require('node-rest-client').Client;
...
static getClient() {
  if (RestClient.client == null) {
    RestClient.client = new NodeRestClient();
  }
  return RestClient.client;
}
Then I was always using the same object for all my requests, and when the process was too fast, it created collisions... So I just removed the if (RestClient.client == null) test, and for now it seems to work.
If there is a better way to manage this, by closing requests or managing a pool, feel free to contribute :)
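One alternative sketch: instead of sharing or recreating the client ad hoc, cap the number of in-flight requests so the script cannot overwhelm the service (the limit of 10 and fetchFromDbService are placeholders, not the asker's code):
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  // each worker pulls the next index until the list is exhausted
  const worker = async () => {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  };
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, worker));
  return results;
}

// e.g. await mapWithConcurrency(urls, 10, (url) => fetchFromDbService(url));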

How to unit test a method which connects to mongo, without actually connecting to mongo?

I'm trying to write a test for a method that connects to mongo, but I don't actually want to have mongo running and make a real connection to it for my tests to pass.
Here's my current test, which succeeds when my mongo daemon is running.
describe('with a valid mongo string parameter', function() {
  it('should return a fulfilled promise', function(done) {
    var con = mongoFactory.getConnection('mongodb://localhost:27017');
    expect(con).to.be.fulfilled;
    done();
  });
});
mongoFactory.getConnection code:
getConnection: function getConnection(connectionString) {
  // do stuff here
  // Initialize connection once
  MongoClient.connect(connectionString, function(err, database) {
    if (err) {
      def.reject(err);
      return; // without this, a failed connect would also resolve below
    }
    def.resolve(database);
  });
  return def.promise;
}
There are a couple of SO answers related to unit testing code that uses MongoDB as a data store:
Mocking database in node.js?
Mock/Test Mongodb Database Node.js
Embedded MongoDB when running integration tests
Similar: Unit testing classes that have online functionality
I'll make an attempt at consolidating these solutions.
Preamble
First and foremost, you should want MongoDB to be running while performing your tests. MongoDB's query language is complex, so running legitimate queries against a stable MongoDB instance is required to ensure your queries are running as planned and that your application is responding properly to the results. With this in mind, however, you should never run your tests against a production system, but instead a peripheral system to your integration environment. This can be on the same machine as your CI software, or simply relatively close to it (in terms of process, not necessarily network or geographically speaking).
This ENV could be low-footprint and completely run in memory (resource 1) (resource 2), but would not necessarily require the same performance characteristics as your production ENV. (If you want to performance test, this should be handled in a separate environment from your CI anyway.)
Setup
Install a mongod service specifically for CI. If replica sets and/or sharding are of concern (e.g. write concern, no use of $isolated, etc.), it is possible to mimic a clustered environment by running multiple mongod instances (1 config, 2x2 data for shard+repl) and a mongos instance on the same machine with either some init.d scripts/tweaks or something like Docker.
Use environment-specific configurations within your application (either embedded via .json files, or in some place like /etc, /home/user/.your-app or similar). Your application can load these based on a node environment variable like NODE_ENV=int. Within these configurations your db connection strings will differ. If you're not using env-specific configs, start doing this as a means to abstract the application runtime settings (i.e. "local", "dev", "int", "pre", "prod", etc.). I can provide a sample upon request; a minimal sketch also follows this list.
Include test-oriented fixtures with your application/testing suite. As mentioned in one of the linked questions, MongoDB's Node.js driver supports some helper libraries: mongodb-fixtures and node-database-cleaner. Fixtures provide a working and consistent data set for testing: think of them as a bootstrap.
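A minimal sample of the env-config idea (the file layout and keys here are hypothetical):
// config/int.json  -> { "db": { "uri": "mongodb://ci-host/app_int" } }
// config/prod.json -> { "db": { "uri": "mongodb://prod-host/app" } }
import { readFileSync } from 'fs';

const env = process.env.NODE_ENV || 'local';
const config = JSON.parse(readFileSync(`config/${env}.json`, 'utf8'));

// the rest of the app only ever reads config.db.uri, so switching
// environments is just a matter of NODE_ENV=int node app.js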
Builds/Tests
Clean the associated database using something like node-database-cleaner (a rough sketch of this step follows the list).
Populate your fixtures into the now empty database with the help of mongodb-fixtures.
Perform your build and test.
Repeat.
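A rough sketch of the clean step (the database-cleaner constructor and clean() callback follow its README as I recall it; treat the exact signatures as assumptions):
const DatabaseCleaner = require('database-cleaner');
const cleaner = new DatabaseCleaner('mongodb');

cleaner.clean(db, () => {
  // db is an open native-driver connection; once emptied,
  // load the fixture set (e.g. with mongodb-fixtures) and start the test run
});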
On the other hand...
If you still decide that not running MongoDB is the correct approach (and you wouldn't be the only one), then abstracting your data store calls from the driver with an ORM is your best bet (for the entire application, not just testing). For example, something like model claims to be database agnostic, although I've never used it. Utilizing this approach, you would still require fixtures and env configurations, however you would not be required to install MongoDB. The caveat here is that you're at the mercy of the ORM you choose.
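A middle ground for the specific method in the question is to stub the driver call itself, e.g. with sinon (a hedged sketch; fakeDb's shape is whatever getConnection's callers expect):
const sinon = require('sinon');
const { MongoClient } = require('mongodb');

const fakeDb = {};
// yields() invokes the stubbed method's callback with the given arguments,
// so connect() "succeeds" without touching a real server
const connectStub = sinon.stub(MongoClient, 'connect').yields(null, fakeDb);

// mongoFactory.getConnection('mongodb://anything') now resolves with fakeDb;
// call connectStub.restore() in your teardown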
You could try tingodb.
TingoDB is an embedded JavaScript in-process filesystem or in-memory database upwards compatible with MongoDB at the API level.
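A minimal sketch of how that looks (the directory is hypothetical; the Db constructor mirrors the mongodb 1.x driver per tingodb's README):
const Db = require('tingodb')().Db;
const db = new Db('./test-data', {}); // file-backed, no mongod process

db.collection('users').insert({ name: 'test' }, (err, result) => {
  // same callback shape as the native driver of that era
});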
