Oracle Database Connection Pool with Node.js

I am new to Node.js. I am trying to create connection pools for multiple databases, and I have (I think) successfully created them with the code below. I know that to execute queries I have to do something at "Connection Pool DBX success", but I can't figure out how to run queries against a particular pool, say crm1.execute or crm2.execute. What can I do here to achieve this? The only approach I can think of is writing a separate execute function for each database, which I know is wrong; I have to work with 15 databases, so writing 15 separate functions isn't feasible.
const config = require("../config/config");
const oracledb = require("oracledb");
const crm1 = config.crm1;
const crm2 = config.crm2;

const crm1pool = oracledb.createPool({
  user: crm1.user,
  password: crm1.password,
  connectString: crm1.connectString,
  poolMin: 1,
  poolMax: 10,
  poolTimeout: 300
}, (error, pool) => {
  if (error) {
    return console.log(error);
  }
  console.log("Connection Pool DB1 success");
});

const crm2pool = oracledb.createPool({
  user: crm2.user,
  password: crm2.password,
  connectString: crm2.connectString,
  poolMin: 1,
  poolMax: 10,
  poolTimeout: 300
}, (error, pool) => {
  if (error) {
    return console.log(error);
  }
  console.log("Connection Pool DB2 success");
});

There is a lot of node-oracledb documentation on pooling, with examples. Study those first.
Then you will find that giving each pool a poolAlias lets you easily choose which one to use:
await oracledb.createPool({
  user: 'hr',
  password: myhrpw,  // myhrpw contains the hr schema password
  connectString: 'localhost/XEPDB1',
  poolAlias: 'hrpool'
});

await oracledb.createPool({
  user: 'sh',
  password: myshpw,  // myshpw contains the sh schema password
  connectString: 'otherhost/OTHERDB',
  poolAlias: 'shpool'
});

const connection = await oracledb.getConnection('hrpool');
const result = await connection.execute(
  `SELECT manager_id, department_id, department_name
   FROM departments
   WHERE manager_id = :id`,
  [103]  // bind value for :id
);
console.log(result.rows);
await connection.close();  // release the connection back to its pool
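To avoid writing 15 separate execute functions, the acquire/execute/release cycle can be wrapped once. A minimal sketch, where makeExecutor is an illustrative name rather than a node-oracledb API; it works with any pool-like object that exposes getConnection():

```javascript
// A generic execute helper: acquire a connection from whichever pool is
// passed in, run the statement, and always release the connection.
// makeExecutor is an illustrative name, not part of node-oracledb.
function makeExecutor(pool) {
  return async function execute(sql, binds = []) {
    const connection = await pool.getConnection();
    try {
      return await connection.execute(sql, binds);
    } finally {
      await connection.close(); // returns the connection to the pool
    }
  };
}
```

With aliased pools you could then build one helper per database, e.g. `const crm1execute = makeExecutor(oracledb.getPool('crm1'));`, and call `crm1execute('SELECT ...')` anywhere.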

Related

sequelize.sync() does nothing in seed

const dotenv = require('dotenv').config();
const Op = require('sequelize').Op;
//logger here
const Product = require('../models/product');
const MainCourse = require('../models/main_course');
const Drink = require('../models/drink');
const SideDish = require('../models/side_dishes.js');
const Sequelize = require('sequelize');

var database;
var username;
var password;
var host;
if (process.env.NODE_ENV !== 'production') {
  database = process.env.DB_TABLE;
  username = process.env.DB_USER;
  password = process.env.DB_PASS;
  host = process.env.DB_HOST;
}

const sequelize = new Sequelize(
  database,
  username,
  password,
  {
    dialect: 'postgres',
    host: host,
    pool: {
      max: 40,
      min: 0,
      idle: 20000000,
      acquire: 100000000,
    },
    logging: console.log
  }
);

sequelize
  .authenticate()
  .then(async () => {
    sequelize.sync({ force: true, logging: console.log }).then(() => console.log('Synced DB'));
    console.log('Connection with Sequelize established successfully.');
  })
  .catch(err => {
    console.error('Unable to connect via Sequelize:', err);
  });
When I run this file through the terminal my result is:
Executing (default): SELECT 1+1 AS result
Connection with Sequelize established successfully.
Executing (default): SELECT 1+1 AS result
Connection with Sequelize established successfully.
Executing (default): SELECT 1+1 AS result
Synced DB
Why is the database not actually syncing even though the model files are imported? I'm trying to create a seed file where the database is dropped entirely before data is seeded. I want to authenticate the connection, drop all tables, recreate them, and then add the fake data. How do I make the database sync immediately after authenticating the connection?
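One way to guarantee that ordering is to await each step in sequence, rather than firing sync() inside authenticate()'s then callback without awaiting it. A minimal sketch, assuming an already-configured sequelize instance (the seed function name is illustrative):

```javascript
// A hedged sketch: awaiting each step makes sync start only after
// authenticate() has resolved, and seeding only after the tables exist.
async function seed(sequelize) {
  await sequelize.authenticate();         // verify the connection first
  await sequelize.sync({ force: true });  // drop and recreate all tables
  console.log('Synced DB');
  // ...insert the fake seed data here...
}
```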

Async pg pool query takes forever to finish

I am currently working on a login/register API for my database search engine. I am using express#4.17.1, pg#7.18.2, and PostgreSQL 12. The problem is that on one of the machines (Ubuntu 18.04 LTS) the query for the /register route resolves fine and the user is saved in the postgres db, but on Fedora 32 the await takes forever to resolve.
Here is the code:
db.js:
const { Pool } = require("pg");

const pool = new Pool({
  host: "localhost",
  user: "postgres",
  password: "postgres",
  port: 5432,
  database: "jimjones"
});

module.exports = pool;
register route in jwtAuth.js:
router.post("/register", validInfo, async (req, res) => {
  const { email, name, password } = req.body;
  try {
    const user = await pool.query("SELECT * FROM users WHERE user_email = $1", [
      email
    ]);
    // doesn't get past the first query
    if (user.rows.length > 0) {
      return res.status(401).json("User already exists!");
    }
    const salt = await bcrypt.genSalt(10);
    const bcryptPassword = await bcrypt.hash(password, salt);
    let newUser = await pool.query(
      "INSERT INTO users (user_name, user_email, user_password) VALUES ($1, $2, $3) RETURNING *",
      [name, email, bcryptPassword]
    );
    const jwtToken = jwtGenerator(newUser.rows[0].user_id);
    return res.json({ jwtToken });
  } catch (err) {
    console.error(err.message);
    res.status(500).send("Server error");
  }
});
The query:
const user = await pool.query("SELECT * FROM users WHERE user_email = $1", [
  email
]);
The req.body on Fedora 32 is parsed correctly, so it's not a firewall problem with the POST request.
Using Postman, the program hangs on this query (but only on Fedora 32). Both machines have the same tables in their databases, and the same SELECT in psql returns data on both.
Any suggestions on how to fix or debug this? Any help will be greatly appreciated.
Upgrading to pg#8.3.0 solved this issue.
The whole discussion:
https://github.com/brianc/node-postgres/issues/2069
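While debugging a hang like this, it can also help to make the pool fail fast instead of awaiting forever. A sketch of the same db.js with timeouts added; statement_timeout and connectionTimeoutMillis are standard pg Pool options, but the 5000 ms values here are arbitrary choices:

```javascript
const { Pool } = require("pg");

// Same pool as above, but a hung query or an exhausted pool now surfaces
// as an error instead of an await that never resolves.
const pool = new Pool({
  host: "localhost",
  user: "postgres",
  password: "postgres",
  port: 5432,
  database: "jimjones",
  statement_timeout: 5000,        // server-side cap per statement, in ms
  connectionTimeoutMillis: 5000   // give up if no connection is acquired
});

module.exports = pool;
```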

What is the best way for a node app to maintain connection with hundred or even thousands of databases using mongoose?

I am creating a multi-tenant SaaS app. I was advised by many to keep separate clients on separate databases, for better security and easier management.
How do we connect multiple databases to the Node app?
I know how to make my app run with a single database connection to mongodb, but not sure about multiple connections.
The mongoose docs mention the following solutions for multiple connections:
export schema pattern (https://mongoosejs.com/docs/connections.html#multiple_connections)
connection pools (which default to only 5 connections, which may not be ideal as I may have hundreds of clients in the future)
Another way, which I tried (and it works!), is connecting to mongodb during a Node API call and executing my logic, as shown below. The code below is a test route for registering a user with name and email. dbutils() is a function I call to connect to mongodb, using mongoose.connect(). I am not sure whether it is good practice to connect during the API call.
router.post('/:db/register', async (req, res, next) => {
  const startTime = new Date();
  try {
    if (!req.body.name) {
      throw new Error("Name required");
    }
    if (!req.body.email) {
      throw new Error("Email required");
    }
    await dbutils(req.params.db); // connect to db
    const session = await mongoose.startSession();
    session.startTransaction();
    const newUser = new User({
      name: req.body.name,
      email: req.body.email,
    });
    await newUser.save({ session });
    await session.commitTransaction();
    session.endSession();
    const endTime = new Date();
    const diff = endTime.getTime() - startTime.getTime();
    return res.json({
      newUser: {
        email: req.body.email,
        name: req.body.name
      },
      db: req.params.db,
      timeElapsed: diff,
    });
  } catch (ex) {
    return next(ex);
  }
});
My dbutils() code
const mongoose = require('mongoose');
const mongoURI = "mongodb://PC:27017,PC:27018,PC:27019";

module.exports = async function(db) {
  try {
    await mongoose.connect(
      `${mongoURI}/${db}`,
      {
        useNewUrlParser: true,
        useCreateIndex: true,
        useFindAndModify: false,
        useUnifiedTopology: true,
      }
    );
  } catch (ex) {
    throw ex;
  }
};
I would be very happy for any recommendation or solution to this problem. Thank you very much in advance for your answer.
It is never a good idea to connect to your DB inside an API call: you waste a lot of resources and delay the API response as well.
The best approach is to connect to the multiple databases when the application starts, with connection pooling configured.
You can specify which schema belongs to which connection, and maintain separate DB collections.
You can use the code below to work with multiple connections and pooling:
const connection1 = mongoose.createConnection('mongodb://username:password@host1:port1[?options]', {
  poolSize: 10
});
const connection2 = mongoose.createConnection('mongodb://username:password@host2:port2[?options]', {
  poolSize: 10
});
Models/Schema on connection 1 can be created as below:
//User schema on connection 1
const userSchema = new Schema({ ... });
const UserModel = connection1.model('User', userSchema);
module.exports = UserModel;
Models/Schema on connection 2 can be created as below:
//Product schema on connection 2
const productSchema = new Schema({ ... });
const ProductModel = connection2.model('Product', productSchema);
module.exports = ProductModel;
For better performance, you can also have a shared DB cluster for each DB, and use the cluster to connect to your database:
const conn = mongoose.createConnection('mongodb://[username:password@]host1[:port1][,host2[:port2],...[,hostN[:portN]]][/[database][?options]]', options);
For detailed information, please read Mongoose Multiple Connections and Connection Pooling.
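For hundreds of tenants, opening a full connection per database gets expensive. A minimal sketch of a per-tenant cache, where makeTenantRegistry and connectFn are illustrative names rather than mongoose APIs; with mongoose, connectFn could be something like `name => baseConnection.useDb(name, { useCache: true })`, which reuses the underlying sockets instead of opening new connections:

```javascript
// Cache one handle per tenant so repeated API calls reuse the first
// connection instead of reconnecting on every request.
// makeTenantRegistry and connectFn are illustrative, not mongoose APIs.
function makeTenantRegistry(connectFn) {
  const cache = new Map();
  return function tenantDb(name) {
    if (!cache.has(name)) {
      cache.set(name, connectFn(name)); // connect once per tenant
    }
    return cache.get(name);
  };
}
```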

I can't connect to snowflake using node js connector

I'm trying to connect to a Snowflake database, using the snowflake-sdk connector.
First I installed the snowflake-sdk, using the command line:
npm install snowflake-sdk
Then I followed all the instructions reported here.
I created the file index.js containing:
var snowflake = require('snowflake-sdk');
var connection = snowflake.createConnection({
  account: 'xxxx.east-us-2',
  username: 'MYUSERNAME',
  password: 'MYPASSWORD'
});
connection.connect(
  function(err, conn) {
    if (err) {
      console.error('Unable to connect: ' + err.message);
    } else {
      console.log('Successfully connected to Snowflake.');
    }
  }
);
Then I ran node index.js and got the connection error:
Unable to connect: Network error. Could not reach Snowflake.
I tried again, changing the account value to xxxx.east-us-2.azure.snowflakecomputing.com, but nothing changed.
Your account name should include the cloud provider as well.
Change the account name to:
var connection = snowflake.createConnection({
  account: 'xxxx.east-us-2.azure',
  username: 'MYUSERNAME',
  password: 'MYPASSWORD'
});
For full account names, refer to the docs.
The issue is with your account name. Please pass your account name as xxxx.east-us-2.azure
Here's the code I used in a tiny issue reproduction that I sent to the Snowflake support people.
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const snowflake = require("snowflake-sdk");
const Q = require("q");

const SNOWFLAKE_HOST_SUFFIX = 'snowflakecomputing.com';
const SNOWFLAKE_ACCOUNT = 'companyname';

function getSFConnection(connParams) {
  var d = Q.defer();
  let connection = snowflake.createConnection({
    account: connParams.account || SNOWFLAKE_ACCOUNT,
    username: connParams.user,
    password: connParams.password || '',
    database: connParams.name,
    warehouse: connParams.warehouse
  });
  connection.connect(function (err, conn) {
    if (err) {
      console.error('Unable to connect: ' + err.message);
      d.reject(err);
    } else {
      console.info('Successfully connected as id: ' + connection.getId());
      connection.close = function () {
        return disconnectSF(connection); // disconnectSF is defined elsewhere in the repro
      };
      d.resolve(connection);
    }
  });
  return d.promise;
}
and I used it like:
getSFConnection({
  user: 'username',
  account: 'companyname',
  password: 'password',
  name: '',
  warehouse: 'warehouse_name'
}).then...
Upon reflection, I wonder why I have the host suffix set but am not using it... but there it is.
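The Q deferred in that reproduction can be replaced with a plain native Promise. A minimal sketch of the same wrapper, where connectAsync is an illustrative name rather than a snowflake-sdk API:

```javascript
// Wrap the callback-style connect() in a native Promise so callers can
// await it. connectAsync is an illustrative name, not a snowflake-sdk API.
function connectAsync(connection) {
  return new Promise((resolve, reject) => {
    connection.connect((err, conn) => (err ? reject(err) : resolve(conn)));
  });
}
```

Usage would then be `const conn = await connectAsync(snowflake.createConnection({...}));`.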
The following is the right config for "snowflake-sdk": "^1.5.3":
var connection = snowflake.createConnection({
  account: 'xxx.us-east-1',
  username: 'yourUsername',
  password: 'yourPassword',
});
Do not specify the region.
region — Deprecated: https://docs.snowflake.com/en/user-guide/nodejs-driver-use.html

Node.js Sequelize does not close automatically when the script is done

I am facing a problem with my DB connection design in sequelize.js. What I want is a centralized connection and configuration file for my application's DB connection. Therefore I created a file named database.js as below.
const Sequelize = require("sequelize");
const dbConfig = require("../../config/database.json");
const db = {};

const sequelize = new Sequelize({
  dialect: dbConfig.dialect,
  database: dbConfig.database,
  username: dbConfig.username,
  password: dbConfig.password,
  host: dbConfig.host,
  port: dbConfig.port,
  operatorsAliases: false,
  logging: false,
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000
  }
});

db.Sequelize = Sequelize;
db.sequelize = sequelize;
module.exports = db;
Any script that needs the database just has to require the database.js file. However, when a script finishes, the process does not exit (the terminal hangs) because the sequelize connection is never closed.
I have tried calling close() in a finally block, but that breaks the other query scripts (if I call it in every query block) because they share the same instance: once the first query is done, the connection is closed.
sequelize
  .query("SELECT * FROM users WHERE id = ?", {
    replacements: [userId],
    type: sequelize.QueryTypes.SELECT,
    model: User,
    mapToModel: true
  })
  .then(users => {
    console.log(users);
  })
  .finally(() => {
    sequelize.close();
  });
I could close the connection in the last query, but then every time I add a new query that runs last, I would have to move the close() call into that new block.
I am looking for clean code that maintains the DB connection and automatically closes it once all scripts have executed.
sequelize.close() returns a promise, so use an async function and call:
await sequelize.close()
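Building on that, one pattern that keeps the close in a single place is an async entry point that runs every script and closes the shared instance in a finally block. A minimal sketch, where main and tasks are illustrative names rather than Sequelize APIs:

```javascript
// Run all query scripts through one entry point and close the shared
// sequelize instance exactly once, even if a script throws.
// main and tasks are illustrative names, not part of Sequelize.
async function main(sequelize, tasks) {
  try {
    for (const task of tasks) {
      await task(sequelize); // each script is an async function
    }
  } finally {
    await sequelize.close(); // lets the Node process exit cleanly
  }
}
```

New query scripts are then just added to the tasks array; none of them needs to know about closing the connection.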
