NodeJS mysql2 - should I handle pool disconnections? - node.js

I use the mysql2 module in my Node.js project.
I understand the concept of database pooling in the mysql2 module
(https://www.npmjs.com/package/mysql2).
Before using a pool, I used a regular connection created with mysql2.createConnection(),
but after some time I got a 'PROTOCOL_CONNECTION_LOST' error.
My code is:
db.js:
const mysql = require('mysql2');

const pool = mysql.createPool({
  host: 'sql server',
  user: 'username',
  database: 'database',
  password: 'pass',
  port: 3306,
  connectionLimit: 10,
  queueLimit: 0
});
module.exports = pool;
And this is how I use it:
const db = require('db');
db.query(...);
The query() and execute() functions on the pool instance automatically release the connection back to the pool, which is very convenient (I don't have to call release() manually after every query).
But I need to understand: if I work like this (with a pool instead of a single connection), is there a guarantee
that 'PROTOCOL_CONNECTION_LOST' will never be thrown? Should I handle it myself, or does the pooling mechanism do that automatically?
I ask because I have seen code on the internet that, for example, re-creates the connection.
With a pool, do I ever need to re-create the connection myself?
Thanks a lot!
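(For illustration, not from the original question: the pool does replace dead connections on later checkouts, but a query already in flight when the server drops the link can still reject, so some application-level handling remains useful. Below is a hedged sketch; queryWithRetry, isRetryable, and the list of error codes are hypothetical helpers, not part of the mysql2 API.)

```javascript
// Hypothetical helper (not built into mysql2): retry a pooled query
// once when the failure looks like a dropped connection. The pool
// will hand the retry a fresh connection.
const RETRYABLE = new Set(['PROTOCOL_CONNECTION_LOST', 'ECONNRESET', 'EPIPE']);

function isRetryable(err) {
  return Boolean(err && RETRYABLE.has(err.code));
}

// `pool` is assumed to be a mysql2 pool; pool.promise().query() returns
// a promise that rejects with an error carrying a `code` property.
async function queryWithRetry(pool, sql, params, retries = 1) {
  try {
    return await pool.promise().query(sql, params);
  } catch (err) {
    if (retries > 0 && isRetryable(err)) {
      return queryWithRetry(pool, sql, params, retries - 1);
    }
    throw err;
  }
}
```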

Related

SQL Server and NodeJS - Connection Pool availability is always zero

I'm trying to initialize a pool of SQL Server connections for my Node.js web application to use. I've set the config to create a minimum of 10 connections, but when I start the app with the code below I have 0 available connections, which doesn't allow me to begin any transactions.
Any help would be greatly appreciated!
test.js
const sql = require('mssql');
require('dotenv').config();

const appPool = new sql.ConnectionPool({
  user: process.env.DB_USER,
  password: process.env.DB_PWD,
  database: process.env.DB_NAME,
  server: process.env.DB_HOST,
  pool: {
    min: 10,
    max: 100,
    acquireTimeoutMillis: 15000,
  },
  options: {
    encrypt: true,
    trustServerCertificate: false
  }
});

appPool.connect().then(pool => {
  console.log(`SERVER: Connected to the db and ${pool.available} connections are available!`);
});
Output
MINGW64 ~/Desktop/React Projects/dummy-project (master)
$ node test.js
SERVER: Connected to the db and 0 connections are available!
node-mssql uses tarn.js for connection pooling. Connections are not created pre-emptively; they are created as and when they are requested. Once the number of concurrent requests for connections exceeds pool.min, the number of available connections in the pool stays at that level.
So doing a transaction or running a query shouldn't be a problem: node-mssql will create the connection and fulfil the request.
If you really want to have pool.min connections in your pool from the start, you can write a small script that fires some concurrent queries at your database to warm up the pool for you.
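A minimal sketch of such a warm-up script, assuming the lazy-growth behaviour described above; warmUpPool is a hypothetical helper, and the node-mssql wiring in the trailing comment is illustrative rather than verified:

```javascript
// Hypothetical warm-up: start `count` trivial queries before awaiting
// any of them, so the pool must check out `count` connections at once
// and tarn.js grows it to that size.
async function warmUpPool(runQuery, count) {
  const probes = Array.from({ length: count }, () => runQuery('SELECT 1'));
  await Promise.all(probes);
  return probes.length;
}

// With node-mssql it might be wired up as:
//   const pool = await appPool.connect();
//   await warmUpPool(q => pool.request().query(q), 10);
```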

Mysql Connection vs Connection Pool

I have 4 separate tables in the same database. Would it be better to use mysql2.createConnection() or mysql2.createPool() to bulk insert into each table? I'd like to run the inserts asynchronously.
The code will be executing the inserts from an AWS Lambda and connections are done through RDS Proxy which handles connection pooling for all connections to the Aurora MySql database instance.
const mysql2 = require('mysql2');

const connection = mysql2.createConnection({
  host: 'example.org',
  user: 'bob',
  password: 'secret'
});
or, with mysql2.createPool():
const mysql2 = require('mysql2');

const pool = mysql2.createPool({
  connectionLimit: 10,
  host: 'example.org',
  user: 'bob',
  password: 'secret'
});
If you would like to run the inserts asynchronously, you will want createPool.
With createConnection there is only one connection, and all queries executed on that connection are queued, which is not really parallel (async from the Node.js perspective, but the queries execute sequentially).
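As a sketch of the pooled approach (table and column names are placeholders, and `pool` is assumed to expose a promise-returning query() such as mysql2's pool.promise()):

```javascript
// Start one INSERT per table before awaiting any of them; the pool can
// then hand each statement its own connection (up to connectionLimit),
// whereas a single connection would run them one after another.
function insertIntoTables(pool, rowsByTable) {
  const jobs = Object.entries(rowsByTable).map(([table, rows]) =>
    pool.query('INSERT INTO ?? (payload) VALUES ?', [table, rows])
  );
  return Promise.all(jobs);
}
```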

Is it a good practice to close my connection after Sequelize operations?

I'm currently designing an app on top of Sequelize using Node.js/TypeScript, and I'm wondering whether not closing a connection can cause performance issues.
For instance, in a microservice, I need data from one entity.
const resolver = async (_, _args, { db }) => {
  const entity1 = await db.models.Entity1.findOne();
  return entity1;
};
Is it required to close the connection after having called findOne?
My understanding is that the following config defines the number of concurrent connections, and that idle is the parameter that makes the connection manager close idle connections:
module.exports = {
  development: {
    host: 'db.sqlite',
    dialect: 'sqlite',
    pool: {
      max: 5,
      min: 0,
      idle: 10000
    }
  },
  test: {
    host: 'test.sqlite',
    dialect: 'sqlite',
    pool: {
      max: 5,
      min: 0,
      idle: 10000
    }
  }
};
Any advice is welcome
Sequelize maintains an internal database connection pool (that's what the pool parameters are for), so this isn't necessary. Each call borrows a connection temporarily and returns it to the pool when done.
Closing that connection manually may poison the pool and cause performance issues.
If you don't close the Sequelize connection, the microservice will keep running until the connection times out (the idle pool parameter). I suggest closing the Sequelize connection, at least in microservices.
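One way to reconcile the two answers, sketched with a hypothetical runJob helper (db stands in for a configured Sequelize instance): let every query borrow from the pool automatically, and close the pool exactly once when the process is done, never per query:

```javascript
// Hypothetical pattern for a short-lived process: each query inside
// `work` borrows and returns a pooled connection on its own; the pool
// itself is drained a single time, at the very end.
async function runJob(db, work) {
  try {
    return await work(db);
  } finally {
    await db.close(); // e.g. sequelize.close() at shutdown
  }
}
```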

What changes are to be made in code to move from Feathers nedb to Postgres?

I have used the feathers-nedb library and wrote a server in Node. Now I need to move to a Postgres DB. I have written models and data-insert queries with feathers-nedb, so is there a way to connect to Postgres and run the code without messing up the structure?
There is a way: you can use the feathers-knex library. Just change the nedb Model to a feathers-knex one, creating the Model from a knex Postgres connection string.
Before (feathers-nedb):
const dbPath = app.get('nedb');
const Model = new NeDB({
  filename: path.join(dbPath, 'test.db'),
  autoload: true
});
After (feathers-knex):
const Model = knex({
  client: 'pg',
  connection: 'postgres://postgres:password@localhost:5432/test',
  searchPath: ['knex', 'public'],
});
This is the only code change required on the Model side. On the service side, use feathers-knex instead of feathers-nedb.

Trying to create a connection in Node.js using the npm light-orm and PostgreSQL (pg)

I am trying to create a PostgreSQL connection using light-orm.
I have it working with MySQL using:
var mysql = require('mysql'),
    lightOrm = require('light-orm');

lightOrm.driver = mysql.createConnection({
  host: "localhost",
  user: "me",
  password: "secret",
  database: "test"
});
lightOrm.driver.connect();
However, I cannot seem to get it to work with PostgreSQL.
What I have is:
var pg = require('pg'),
    lightOrm = require('light-orm');

lightOrm.driver = new pg.Client('postgres://me:secret@localhost/test');
lightOrm.driver.connect();
I know that the connection is happening, because if I change the password to something incorrect, I get an error. But when trying to use the same code that works with MySQL, I either get an error or nothing at all.
My guess is that this problem stems from my lack of knowledge of the pg module rather than from a light-orm issue.
