DB Connection close issue in Node.js using trireme-jdbc - node.js

I have developed and deployed a Node.js application in Apigee Edge that performs a few CRUD operations. To establish the DB connection I use the trireme-jdbc module. All CRUD operations work fine through trireme-jdbc, but I have a problem closing the DB session with the db.close() function: when I call db.close(), it does not close the currently active/open session in the pool. Is there another way to close the DB connection properly? I also want to close all active connections in the pool.
Any help will be appreciated. Below is sample code that establishes a connection, runs a select query, and calls db.close() to close the session. My database is OpenEdge/Progress.
var jdbc = require('trireme-jdbc');
var db = new jdbc.Database({
  url: connectionURL,
  properties: {
    user: 'XXXXXX',
    password: 'XXXXXX',
  },
  minConnections: 1,
  maxConnections: 2,
  idleTimeout: 10
});
db.execute('select * from users', function (err, result, rows) {
  console.log(err);
  console.log(result);
  rows.forEach(function (row) {
    console.log(row);
  });
  db.close(); // Used to close the DB session/connection.
});

Related

How to get data from an API into a database with different schemas?

Suppose you have an MS SQL Server with a database A {id, name, age} and I need to expose it through a REST API. Now I want to map the values into another database B {student_id, student_name, student_age} in PostgreSQL. How do I do it?
Also assume I have already made the API from database A, so now only the mapping is required.
I have read about Spring Boot one-to-one mappings, but I have no idea how to do it.
Hello this is Gulshan Negi
Well, to map values from an MS SQL Server database to a PostgreSQL database and expose them through a REST API, you can extract the data from the MS SQL Server database, transform it to match the schema of the PostgreSQL database, load it into PostgreSQL, and then set up a REST API that queries the data and returns the desired results. Before loading the data into PostgreSQL, make sure it has been transformed correctly and that potential issues, such as naming conflicts or data-type mismatches, have been resolved.
Thanks
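The transform step described above can be sketched in plain Node.js. This is just an illustration: the mapRow name is mine, and the field names come from the question's two schemas.

```javascript
// Transform a row from Database A's schema {id, name, age}
// into Database B's schema {student_id, student_name, student_age}.
function mapRow(row) {
  return {
    student_id: row.id,
    student_name: row.name,
    student_age: row.age,
  };
}

// Example input, standing in for rows fetched from MS SQL Server:
const fromA = [{ id: 1, name: 'Ana', age: 21 }, { id: 2, name: 'Ben', age: 23 }];
const forB = fromA.map(mapRow);
console.log(forB[0]); // { student_id: 1, student_name: 'Ana', student_age: 21 }
```

Running a pure function like this over the fetched rows before building the INSERT statements keeps the naming-conflict and type-mismatch fixes in one place.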
To map the values from MS SQL Server to a PostgreSQL database using Node.js, you can follow these steps:
1. Connect to the MS SQL Server using the appropriate library for Node.js (for example, the 'mssql' library).
2. Execute the SELECT statement on 'Database A' and fetch the results.
3. Close the connection to the MS SQL Server.
4. Connect to the PostgreSQL database using the appropriate library for Node.js (for example, the 'pg' library).
5. Use the results obtained from MS SQL Server to construct INSERT statements and execute them on 'Database B' in PostgreSQL.
Here is some example code to get you started:
const mssql = require('mssql');
const pg = require('pg');

// Connect to the MS SQL Server and execute the SELECT statement
const config = {
  user: 'username',
  password: 'password',
  server: 'mssqlserver',
  database: 'A'
};

mssql.connect(config, (err) => {
  if (err) {
    console.log(err);
    return;
  }
  const request = new mssql.Request();
  request.query('SELECT id, name, age FROM table', (err, result) => {
    if (err) {
      console.log(err);
      return;
    }
    // Close the connection to the MS SQL Server
    mssql.close();

    // Connect to the PostgreSQL database and execute the INSERT statements
    const pgConfig = {
      user: 'username',
      password: 'password',
      host: 'postgresqlhost',
      database: 'B',
      port: 5432
    };
    const pgClient = new pg.Client(pgConfig);
    pgClient.connect();

    // Run one parameterized INSERT per row. Collect the promises so the
    // client is only closed after every INSERT has actually finished;
    // calling pgClient.end() right after the loop would risk cutting
    // off queries that are still in flight.
    const inserts = result.recordset.map((row) => {
      return pgClient.query(
        'INSERT INTO table(student_id, student_name, student_age) VALUES ($1, $2, $3)',
        [row.id, row.name, row.age]
      );
    });
    Promise.all(inserts)
      .catch((err) => console.log(err))
      // Close the connection to the PostgreSQL database
      .finally(() => pgClient.end());
  });
});

Connecting Node JS app to GCP Cloud SQL - ReferenceError: Pool is not defined

So I have this small Node.js app with the following script, which I include in my HTML index page in order to connect to a Cloud SQL database in GCP and run a specific query so I can pass the values to a dropdown later:
try {
  pool = new Pool({
    user: "postgres",
    host: "/cloudsql/sfmcsms-d-970229:europe-west1:dsi-sfmc-sms-database",
    database: "postgres",
    password: "dsi-sfmc-sms-database",
    port: "5432",
  });
  console.log("Connection successfull!");
  pool.query("select * from ConfigParameter;", (error, results) => {
    if (error) {
      console.log(error);
    }
    qResult = results;
    console.log(qResult);
    // insert logic to populate dropdowns
  });
} catch (err) {
  console.log("Failed to start pool", err);
}
I'm still working on the logic to populate the dropdowns, but for now I'm focusing on establishing a successful connection before I get to that. However, every time I run the script, I get this particular error:
ReferenceError: Pool is not defined
I've been looking around for some possible answers but no luck.
Before using Pool you have to import it first, like this:
const { Pool } = require('pg')
And obviously node-postgres should be installed:
npm i pg

how pooling works in node-postgres module for node.js

I am trying to create a small app that queries a database in Node.js. For this I am using the following module: https://node-postgres.com/
I have managed to establish the connection with the Postgres database and it connects correctly.
(This is my code:)
const express = require('express');
const fs = require('fs');
const app = express();
const port = 8080;
const { Pool } = require('pg');

const pool = new Pool({
  user: 'Inv',
  host: 'localhost',
  database: 'database',
  password: 'password',
  port: 3000,
  idleTimeoutMillis: 1000,
  connectionTimeoutMillis: 0,
});

pool.connect()
  .then(function (client) {
    let query = 'SELECT * FROM Clients';
    function query_db(query) {
      client.query(query)
        .then(function (res) {
          console.log('query');
        })
        .catch(function (err) {
          console.log(err.stack);
        })
        .finally(function () {
          client.release();
          console.log('client disconect');
        });
    }
    query_db(query);
    return;
  });

console.log(pool.totalCount);
setInterval(() => { console.log(pool.totalCount); }, 100);

pool.on('error', (err, client) => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});

app.listen(port, () => {
  console.log(`\u001b[7mServer in: http://localhost:${port}\u001b[0m\n`);
});
My question is: how does pooling really work in this module?
From what I have learned so far, pooling exists so that you don't open a new connection for each client, since each connection has to be authenticated by the database, which takes time and consumes server resources and would slow down the program. Instead, the pool shares a group of connections under the same database user.
Within the program I create the pool and it connects correctly to Postgres with pool.connect(), which I verify against the list of users connected to the database in pgAdmin 4. After this I run a query with client.query(query) and it completes correctly. The problem is that, as I understand the node-postgres documentation, after finishing the query the client must be returned to the pool so that another client can use the slot left by the one that already made its transaction, and this is done with client.release(). But when client.release() is called, the pool's user disconnects from Postgres. Why is that?
Shouldn't the pool's connection stay open in Postgres and only free the slot for another client?
If the entire pool disconnects, isn't the whole point of having a pool lost?
Testing with client.release() omitted, this behavior stops, but then...
If the client is never released, the pool's client limit will be reached and new clients will be left waiting forever, right?
Also, according to the documentation, idleTimeoutMillis: 1000 sets how long to wait before disconnecting an idle client, and it does so, but when it disconnects the client, as in the previous case, it disconnects the whole pool from Postgres...
Then what is the real behavior of the pool? If what I understand is correct, then there is no difference between using the pool and using individual clients, right?
I'm sorry for so many questions, and in case some of them are very obvious or silly: I am somewhat new to Node and I have already searched ad nauseam on Google, but the documentation is very basic and minimal. Thanks for taking your time to read my doubts :'D
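To make the checkout/release semantics the question is asking about concrete, here is a toy pool. This is not node-postgres internals, just a sketch of the concept: release() returns a client to the idle set so the physical connection can be reused, while closing it would tear the connection down.

```javascript
// Toy connection pool illustrating checkout/release semantics.
// NOT node-postgres internals -- just a sketch of the concept.
class ToyPool {
  constructor(max) {
    this.max = max;   // maximum number of physical connections
    this.created = 0; // physical connections opened so far
    this.idle = [];   // released clients waiting to be reused
  }
  connect() {
    if (this.idle.length > 0) {
      return this.idle.pop();      // reuse an existing connection: no new auth
    }
    if (this.created < this.max) {
      this.created += 1;
      return { id: this.created }; // "open" a new physical connection
    }
    throw new Error('pool exhausted: all clients checked out');
  }
  release(client) {
    this.idle.push(client);        // back to the pool, still connected
  }
}

const pool = new ToyPool(2);
const a = pool.connect();          // opens connection #1
pool.release(a);                   // connection #1 stays open, now idle
const b = pool.connect();          // reuses connection #1 instead of opening #2
console.log(b.id, pool.created);   // prints: 1 1
```

In node-postgres itself, pool.query() performs this checkout-and-release for you for one-off queries, which is why the documentation recommends it unless you need a transaction on a single client.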

How to check if db connection is open? (mysql2 node.js)

Dear stackoverflowers:)
I am making my first MVC application on Node.js and have a problem:
Sometimes, when I don't make any queries to the database for a long time (about 3 minutes) and then try to run one, I get this error: "This socket has been ended by the other party".
I found out that this is because of the wait_timeout option in the MySQL config, which closes the connection if it is idle for longer than that value.
So am I right that I should check that the connection is open before every query to the database? And if so, how and where?
This my db connection file:
const mysql = require('mysql2');

// create the connection to the database
const connection = mysql.createConnection({
  host: 'localhost',
  user: 'mysql',
  password: 'mysql',
  database: 'qa'
});

exports.connect = (done) => {
  connection.connect((err) => {
    if (err) {
      console.log('db connection error');
    } else {
      done();
    }
  });
};

exports.connection = connection;
And this is the part of one of my models:
const db = require('../db');

exports.makeNewQuestion = async (topic, text, subsection, user) => {
  return db.connection.promise().execute(
    'INSERT INTO `questions` (`topic`, `text`, `subsection_id`, `user_id`) VALUES (?, ?, ?, ?)',
    [topic, text, subsection, user]
  );
};
As far as I know, you don't need to query an existing table to check the connection.
You can use something like:
select 1
If that works, you are connected.
Normally you don't need this approach, unless you need to leave the connection open in the long term.
Depending on the library you are using, you may receive a promise back. Every time you query your DB, you might want to check whether that promise was rejected and handle the problem accordingly.
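A minimal sketch of that idea, assuming a promise-based query function like the one mysql2's connection.promise() exposes. The isAlive name is mine; the stub query functions stand in for a real connection so the sketch runs without a database:

```javascript
// Hypothetical helper: returns true if a trivial query succeeds,
// false if the connection is dead. `queryFn` is any function that
// takes SQL and returns a promise (e.g. connection.promise().query).
async function isAlive(queryFn) {
  try {
    await queryFn('SELECT 1');
    return true;
  } catch (err) {
    return false;
  }
}

// Stub query functions standing in for a live and a dead connection:
const okQuery = async (sql) => [[{ '1': 1 }]];
const deadQuery = async (sql) => {
  throw new Error('This socket has been ended by the other party');
};

isAlive(okQuery).then((alive) => console.log(alive));   // true
isAlive(deadQuery).then((alive) => console.log(alive)); // false
```

The mysql-family drivers also expose a ping() on the connection for the same purpose, but a SELECT 1 wrapper like this works with any library that returns promises.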

How do I get a MongoDB import script to close the db after inserting results?

I am running a quick little Node.js script to find documents in one collection and insert them into another collection on the same DB. I came up with this, but it has no way to close, because I think it's still running open/async?
I have tried placing db.close() in various places and tried mongoClient.close(), with no luck, which had me thinking about trying to force a timeout for the async call. I added a connection timeout, but it did not have the desired behaviour.
var MongoClient = require('mongodb').MongoClient,
    assert = require('assert');
const async = require("async");

// Connection URL
var url = 'mongodb://localhost:27017/sourceDB';

// Use connect method to connect to the Server
MongoClient.connect(url, { connectTimeoutMS: "5" }, (err, db) => {
  db.collection('source.collection', function (err, col) {
    assert.equal(null, err);
    col.find().forEach(function (data) {
      console.log(data);
      db.collection('destination.collection').insertOne(data, function (err, res) {
        assert.equal(null, err);
      });
      console.log("Moved");
    });
  });
});
The script does well and picks up the collection and inserts, but the connection remains open.
It is not recommended to explicitly close the connection as shown by this SO thread.
Rather, allow the client library to manage the connection for you.
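That said, if you do want a one-off script like this to terminate, the usual shape is to wait for all the inserts and close in a finally block. Here is a minimal sketch of that pattern with a stub client in place of a real MongoClient, so it runs standalone; the withClient name is mine:

```javascript
// Sketch: run `work(client)`, then always close the client, even on error.
// `client` only needs connect() and close(), like mongodb's MongoClient.
async function withClient(client, work) {
  await client.connect();
  try {
    return await work(client);
  } finally {
    await client.close(); // runs whether work() resolved or threw
  }
}

// Stub client standing in for a real MongoClient:
const calls = [];
const stubClient = {
  connect: async () => calls.push('connect'),
  close: async () => calls.push('close'),
};

withClient(stubClient, async () => calls.push('copy documents'))
  .then(() => console.log(calls.join(' -> ')));
// prints: connect -> copy documents -> close
```

With a real driver, the work function would await the find/insert operations (rather than firing callbacks in a loop) so that close() only runs once every insert has finished.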
