Hitting connection limits with Azure Functions and Azure SQL (Node.js)

I have a function app that serves a Node.js API. We are hitting the 900-concurrent-connections limit with tedious connecting to Azure SQL and realize we should add connection pooling (unless there is a better recommendation, of course).
The question "Azure Functions + Azure SQL + Node.js and connection pooling between requests?" seems to answer our prayers, but we wanted to validate how you can use a single connection pool with Azure Functions.
Is the best practice to put "let pool = new ConnectionPool(poolConfig, connectionConfig);" above module.exports in every function? Doesn't that create a new pool every time an individual function is called?
Microsoft doesn't have clear documentation on this for Node.js, unfortunately, so any help would be greatly appreciated!

To make the whole Function app share one single pool, we need to put the initialization into a shared module. Christiaan Westerbeek has posted a wonderful solution using mssql; there's not much difference between a Function app and a web app in this respect.
I recommend using mssql (which uses tedious and generic-pool internally) instead of tedious-connection-pool, which doesn't seem to have been updated for 2 years.
Put the connection code in poolConfig.js under a SharedLib folder.
const sql = require('mssql');

const config = {
    pool: {
        max: 50 // default: 10
    },
    user: '',
    password: '',
    server: '',
    database: '',
    options: {
        encrypt: true // required for Azure SQL
    }
};

const poolPromise = new sql.ConnectionPool(config).connect()
    .then(pool => {
        console.log('Connected to MSSQL');
        return pool;
    })
    .catch(err => console.log('Database Connection Failed! Bad Config: ', err));

module.exports = {
    sql, poolPromise
};
Then load the module wherever you need to connect to SQL. Since we use await to get the ConnectionPool, the function should be async (the default for v2 JS functions).
const { poolPromise } = require('../SharedLib/poolConfig');

module.exports = async function (context, req) {
    const pool = await poolPromise;
    const result = await pool.request().query("");
    ...
};
Note that if the Function app is scaled out to multiple instances, a new pool will be created for each instance as well.
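One gotcha with caching a pool promise at module scope as above: if the very first connect fails, every later invocation awaits the same rejected promise. A small sketch of a retrying getter (makePoolGetter and connectFn are illustrative names; connectFn stands in for the mssql connect call from the shared module):

```javascript
// Sketch: a lazy pool getter that clears its cache on failure, so the next
// invocation retries instead of re-awaiting the same rejected promise.
// connectFn stands in for e.g. () => new sql.ConnectionPool(config).connect().
function makePoolGetter(connectFn) {
    let poolPromise = null;
    return function getPool() {
        if (!poolPromise) {
            poolPromise = Promise.resolve()
                .then(connectFn)
                .catch(err => {
                    poolPromise = null; // let the next invocation retry
                    throw err;
                });
        }
        return poolPromise;
    };
}

module.exports = { makePoolGetter };
```

A function would then call `const pool = await getPool();` instead of awaiting the module-level promise directly.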

Related

When should I use pool.end()

When I use pool.end() after a query I get the error message "Cannot read property 'rows' of undefined". It seems to me that I shouldn't be using pool.end() after queries have finished. So when should I use pool.end()?
Below is my code snippet:
const pool = new Pool({
    user: process.env.PGUSER,
    host: process.env.PGHOST,
    database: process.env.PGDATABASE,
    password: process.env.PGPASSWORD,
    port: process.env.PGPORT
});

// Display schedule on home page
app.get('/', (req, res) => {
    const displaySchedule = `SELECT * FROM schedule`;
    pool.query(displaySchedule, (err, results) => {
        if (err) {
            throw err;
        } else {
            res.render('index', { schedules: results.rows });
        }
    });
    //pool.end();
});
pool.end() shuts down a pool completely. In your case - a web scenario - you do not want to do this; otherwise you would have to create a new pool on every request, which defeats the purpose of pooling. (The "Cannot read property 'rows' of undefined" error appears because pool.end() runs before the asynchronous query callback fires, so the query fails and results is undefined.)
In your example, without calling pool.end, you are using the pool.query method. You are all set here and do not need any kind of client cleanup or pool ending.
The pool is usually a long-lived object in your application. You almost never have to shut it down yourself in a web application.
You will have to shut it down when you are creating pools dynamically or when you are attempting a graceful shutdown.
For example: in a testing environment, where you connect to a pool before the tests and disconnect after they have run, you call pool.end at the end on all dynamically created pools.
This GitHub issue describes the use of pool.end(): https://github.com/brianc/node-postgres/issues/1670

how pooling works in node-postgres module for node.js

I am trying to create a small app that queries a database in Node.js. For this I am using the following module: https://node-postgres.com/
I have managed to establish the connection with the Postgres database and it connects correctly.
(this is my code)
const express = require('express');
const fs = require('fs');
const app = express();
const port = 8080;
const { Pool } = require('pg');

const pool = new Pool({
    user: 'Inv',
    host: 'localhost',
    database: 'database',
    password: 'password',
    port: 3000,
    idleTimeoutMillis: 1000,
    connectionTimeoutMillis: 0,
});

pool.connect()
    .then(function (client) {
        let query = 'SELECT * FROM Clients';
        function query_db(query) {
            client.query(query)
                .then(function (res) {
                    console.log('query');
                })
                .catch(function (err) {
                    console.log(err.stack);
                })
                .finally(function () {
                    client.release();
                    console.log('client disconnect');
                });
        }
        query_db(query);
        return;
    });

console.log(pool.totalCount);
setInterval(() => { console.log(pool.totalCount); }, 100);

pool.on('error', (err, client) => {
    console.error('Unexpected error on idle client', err);
    process.exit(-1);
});

app.listen(port, () => {
    console.log(`\u001b[7mServer in: http://localhost:${port}\u001b[0m\n`);
});
My question is: how does pooling really work in this module?
From what I have learned so far, pooling exists so that you don't make a new connection for each client, since each connection has to be authenticated by the database, which is time-consuming and uses more server resources, slowing down the program. Instead, pooling keeps a group of connections open under the same database user.
Within the program I create the pool and it connects correctly to Postgres with pool.connect(), which I verify in the list of users connected to the database in pgAdmin 4. After this I make a query with client.query(query) and the query runs correctly. The problem is that, from what I understand of the node-postgres documentation, after finishing the query the client must be returned to the pool so that other clients can use the space it leaves, and this is done with client.release(). But when client.release() is called, the pool's user that was connected to Postgres disconnects... why is that?
Shouldn't the pool's user remain connected in Postgres and only free the slot for another client?
If the entire pool disconnects, isn't the very objective of making the pool lost?
Testing with client.release() omitted, this behavior stops, but then... if the client is never released, the limit of clients connected to the pool will be reached and new clients will be left waiting forever, right?
Also, according to the documentation, idleTimeoutMillis: 1000 indicates the time to wait before disconnecting an idle client, and it does so, but when it disconnects the client, as in the previous case, it disconnects the entire pool from Postgres...
So what is the real behavior of the pool? If what I understand is correct, then there is no difference between using the pool and using individual clients, right?
I'm sorry for so many questions, and in case some of them are very obvious or silly: I am somewhat new to Node and I have already searched ad nauseam on Google, but the documentation is very basic and minimal. Thanks for taking your time to read my doubts :'D

Node Lambda function accessing Aurora database times out with postgres

I'm trying to write a Lambda function (in Node.js) which executes a simple SQL query. The database is Aurora Postgres. I have set up the function configuration for VPC access, which worked fine with a MySQL database.
As you can see in the following, I'm using the "pg" node module (https://github.com/brianc/node-postgres).
app.js
// For example, this function times out
const { Client } = require('pg');

const config = {
    user: '<MY_DB_USER>',
    host: '<MY_HOST>',
    database: '<MY_DB_NAME>',
    password: '<MY_DB_PASSWORD>',
    port: <MY_PORT>
};

var postgresClient = new Client(config);

module.exports.handler = (event, context, cb) => {
    console.log('entered handler');
    postgresClient
        .connect()
        .then(() => {
            console.log('connected');
            return postgresClient.end();
        })
        .then(() => cb())
        .catch(cb);
};
Yet, when using a Postgres database, the Lambda function times out every time I invoke it. I have set up the subnets and the security group.
I have also tried using Pool from the pg module, with no success.
Note that this function runs locally with success... I can't figure out why the Lambda fails when it is deployed.

What is a simple command I can run to test if I can connect to PostgreSQL using node-postgres?

My file db/index.js
const { Pool } = require('pg');
const pool = new Pool();

module.exports = {
    query: (text, params, callback) => {
        return pool.query(text, params, callback);
    }
};
In my main file main.js I do:
const db = require('./db/index');
What command can I run on db to figure out if node-postgres is able to connect to my Postgres setup correctly?
To simply test if you can connect from node.js to pgsql database you can use the following snippet:
const { Pool } = require('pg')
const pool = new Pool()

pool.query('SELECT NOW()', (err, res) => {
    console.log(err, res)
    pool.end()
})

// or, inside an async function, you can use async/await instead of callbacks
const res = await pool.query('SELECT NOW()')
console.log(res)
await pool.end()
This should return the response in the form of a pg.Result object containing the current datetime.
node-postgres uses the same environment variables as libpq to connect to a PostgreSQL server, so to run the above code you can invoke it like so:
PGUSER=postgres PGHOST=127.0.0.1 PGPASSWORD=mysecretpassword PGDATABASE=postgres PGPORT=5432 node script.js
But you have to provide the connection details for your own database instance.
The default values for the environment variables used are:
PGHOST='localhost'
PGUSER=process.env.USER
PGDATABASE=process.env.USER
PGPASSWORD=null
PGPORT=5432
You can also provide the connection details programmatically, directly to either the Pool or Client instances. You can also use the connection string URI.
You can read more in the node-postgres documentation in the "connecting" section.

Correct way to connect to (a pool with) node-mongodb-native

With MongoDB, a suggestion was to always re-use the same database connection, and have a pool of connections to support some concurrency.
In node-mongodb-native 1.x you could configure the db and server object, creating a pool like so:
var server = new Server(
    config.host,
    config.port,
    {
        auto_reconnect: true,
        poolSize: 5 // <= here is the pool
    }
);

var db = new Db(
    config.database,
    server
);

db.open(function (err, db) {
    // ...
});
In 2.0 they deprecated everything except MongoClient for connecting:
MongoClient.connect(URI, callback);
Where do I add the pool options? Do I have a pool now automatically?
With 2.1 they go a step further and suggest using a generator: https://mongodb.github.io/node-mongodb-native/2.1/reference/ecmascript6/connecting/
Is this effectively using a separate connection for every action again? Is using pools obsolete?
MongoClient.connect takes an optional options parameter with a server field that lets you set the size of the connection pool:
const options = {
    server: {
        poolSize: 10
    }
};

MongoClient.connect(url, options, callback);
If you don't specify it, the default poolSize is 5. The server options are documented here.
