Is it possible to modify an exported variable after it is exported? - node.js

I'm still new to node.js and I'm trying to create and export my database connection with mysqljs.
I'm having trouble because if something happens, like a database crash or a network problem, the connection needs to be re-created. I can't figure out how to update the previously exported connection in my app.
Is there a way to update an object exported by a module, like I did below?
Am I missing something, or am I using module.exports the wrong way?
I think it would be possible to use a function or a constructor, but I would have to edit every module that currently uses my dbh variable.
var reconnection_interval = 5000;
var db_config = {
    host     : "localhost",
    user     : "root",
    password : "",
    database : "test"
};
var dbh;

function handleDisconnect(conn) {
    conn.on('error', function(err) {
        // If the error is not fatal, we just ignore it.
        if (!err.fatal) { return; }

        logger.error('Database error: ' + err);
        logger.info('Trying to reconnect in ' + reconnection_interval / 1000 + 's');
        // Destroying the old connection.
        dbh = undefined;
        // Creating a new connection and trying to reconnect every <reconnection_interval>
        setTimeout(function() {
            dbh = mysql.createConnection(db_config);
            handleDisconnect(dbh);
            dbh.connect();
            // I want to update the module.exports value, so it becomes the new connection
            module.exports = dbh;
        }, reconnection_interval);
    });
}

dbh = mysql.createConnection(db_config);
handleDisconnect(dbh);
dbh.connect();
logger.info("Connected to database.");
module.exports = dbh;
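Reassigning module.exports later does not help, because each requirer received a copy of the original reference when it first called require. One way around this (a sketch, not from the original post) is to export one stable container object and swap a property on it when reconnecting; `fakeCreateConnection` below is a hypothetical stand-in for mysql.createConnection:

```javascript
// Hedged sketch: export a stable container and mutate its property on
// reconnect. Every module that did `var db = require('./db')` keeps the
// same object, so it always sees the current connection via db.handle.
function fakeCreateConnection(config) {   // stand-in for mysql.createConnection
  return { config: config, threadId: Math.random() };
}

const db = { handle: null };              // in the real module: module.exports = db;

function connect(config) {
  db.handle = fakeCreateConnection(config); // reconnect swaps the property...
  // ...but `db` itself, the thing consumers hold, never changes
}

connect({ host: 'localhost' });
const first = db.handle;
connect({ host: 'localhost' });           // simulate a reconnect
console.log(db.handle !== first);         // true: requirers see the fresh handle
```

Consumers then call `db.handle.query(...)` instead of `dbh.query(...)`, which is a smaller change than converting every caller to a function or constructor.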

Related

node-postgres library : How to avoid race condition with event driven database insertion?

I'm currently facing a race condition issue with DB insertion.
I have simplified my original code to try to explain the issue.
(I hope I didn't introduce errors in the rewriting process.)
** Main Program **
var pool = new pg.Pool({
    database: 'DB',
    user: 'User',
    password: 'Password'
});

conn.on('someEvent', async function(e) {
    var param1 = e.getHeader('param1');
    var param2 = e.getHeader('param2');
    var myObj = new MYOBJ(null);
    var params = {
        param1: param1,
        param2: param2
    };
    await myObj.insertInPQ(pool, params);
})
** A custom object I created to access the DB **
MYOBJ.prototype.insertInPQ = async function(pool, params) {
    var myObj = this;
    try {
        console.log("ENTERING myObj.insertInDB() with param1:" + params.param1 + " and param2:" + params.param2);
        var query = "INSERT INTO schema.SOMETABLE " +
                    "VALUES ($1,$2)";
        const { rows } = await pool.query(query, [params.param1, params.param2]);
        return true;
    }
    catch (err) {
        throw new Error("myObj.insertInDB() for param1:" + params.param1 + " and param2:" + params.param2 + " error:" + err);
    }
};
Let's now say eventA and eventB happen almost simultaneously, with:
eventA: param1=1, param2=2
eventB: param1=3, param2=4
On eventA:
the console will display: ENTERING myObj.insertInDB() with param1=1 and param2=2
The DB insertion will be fine.
On eventB:
the console will display: ENTERING myObj.insertInDB() with param1=3 and param2=4
But the Postgres logs will report inserting a row with the eventA values (1,2).
I don't understand how Node.js and the pg library run under the hood, or what I should do to avoid this kind of problem (which is, by the way, intermittent, and therefore very hard to troubleshoot).
Does anybody have a suggestion?
Regards,
Yaducam
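The post doesn't show where the cross-talk comes from, and as far as the pg API goes each pool.query call carries its own bound $1/$2 values. A common way two nearly simultaneous events end up inserting each other's values is sharing one mutable params object across async calls. A stand-alone sketch of that failure mode, with `fakeInsert` as a hypothetical stand-in for the pool.query call:

```javascript
// If `params` is ONE shared object mutated between events, both inserts read
// it only after an async wait, so both see the later event's values.
async function fakeInsert(log, params) {                 // stand-in for pool.query
  await new Promise(resolve => setTimeout(resolve, 10)); // simulate DB latency
  log.push([params.param1, params.param2]);              // values read AFTER the wait
}

async function main() {
  const log = [];
  const shared = {};                     // ONE params object reused by both events
  shared.param1 = 1; shared.param2 = 2;  // eventA's values
  const a = fakeInsert(log, shared);     // eventA's insert starts
  shared.param1 = 3; shared.param2 = 4;  // eventB overwrites the SAME object
  const b = fakeInsert(log, shared);     // eventB's insert starts
  await Promise.all([a, b]);
  return log;
}

main().then(rows => console.log(rows)); // [ [ 3, 4 ], [ 3, 4 ] ]
```

The simplified code above actually builds a fresh `params` object per event, which is the right pattern; if the real handler shares any state across events (a params object, a query builder, or a stateful myObj), giving each event its own fresh copy removes this kind of race.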

MongoDB connections not releasing after query

I made a Node app using MongoDB and Socket.IO, and when I insert some documents, at some point every operation stops working. I figured out that MongoDB stops inserting documents after some number of documents have been inserted (I don't know exactly how many). So I checked the connections with the mongodb shell:
db.serverStatus().connections
Before starting the node app, it reports:
{ "current" : 1, "available" : 3275, "totalCreated" : NumberLong(639) }
After inserting some docs:
{ "current" : 51, "available" : 3225, "totalCreated" : NumberLong(708) }
After turning off the node app:
{ "current" : 1, "available" : 3275, "totalCreated" : NumberLong(708) }
This is the server-side code. (I'm using an external MongoDB module, so it can be a little different from the original MongoDB module for node.js; this module is just a simple wrapper for MongoDB with a Promise-based API.)
const app = require('http').createServer();
const io = require('socket.io')(app);
const am2 = require('am2');
...
am2.connect('localhost', '...', { ... });

const sockets = {};
io.on('connection', (socket) => {
    // save the socket into the sockets object
    sockets[socket.id] = socket;
    // release the socket on disconnection
    socket.on('disconnect', () => {
        delete sockets[socket.id];
    });
    ...
    // pseudo doc-inserting event
    socket.on('/insert/some/docs', (data) => {
        am2.insert({ ... }).then(() => {
            socket.emit('/after/insert/some/docs', {});
        }).catch( ... );
    });
});
When a client emits the '/insert/some/docs' event, the server inserts a document into MongoDB. The first few tries work well, but after some insertions it stops working.
I think this happens because lots of connections are still alive after the insertion is done, but I don't know why. If it were an RDBMS like MySQL, every connection would have to be closed after the operation is done, but in MongoDB it should not be necessary (as far as I know).
I don't know why this is happening, so I would really appreciate a hand.
I solved this by releasing the cursor after getting data from MongoDB. Make sure to release it, or it exhausts your connection pool and stops your application from working.
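That fix can be sketched without a live database. The mock cursor below is hypothetical and only stands in for a real MongoDB cursor; the real driver exposes the same hasNext()/next()/close() calls, and the point is the try/finally so the pooled connection is released even on early exit or error:

```javascript
// Read up to `limit` documents, then ALWAYS close the cursor in `finally`
// so its pooled connection goes back to the pool.
async function readSome(cursor, limit) {
  const docs = [];
  try {
    while (docs.length < limit && await cursor.hasNext()) {
      docs.push(await cursor.next());
    }
  } finally {
    await cursor.close(); // release the connection back to the pool
  }
  return docs;
}

// Mock cursor standing in for collection.find(...)
function mockCursor(items) {
  let i = 0;
  let closed = false;
  return {
    hasNext: async () => i < items.length,
    next: async () => items[i++],
    close: async () => { closed = true; },
    isClosed: () => closed
  };
}

const cursor = mockCursor([{ a: 1 }, { a: 2 }, { a: 3 }]);
readSome(cursor, 2).then(docs => {
  console.log(docs.length, cursor.isClosed()); // 2 true
});
```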

A better way to structure a Mongoose connection module

I have refactored some code to place all my mongoose.createConnection(...) in a single file. This file is then required in other files that use connections to the various databases specified. The connections are lazily created and are used in both an http server and in utility scripts.
The connection file looks like this:
var mongoose = require("mongoose");

var serverString = "mongodb://localhost:27017";
var userDBString = "/USER";
var customerDBString = "/CUSTOMER";

var userConnection = null;
exports.getUserConnection = function () {
    if (userConnection === null) {
        userConnection = mongoose.createConnection(serverString + userDBString, { server: { poolSize: 4 } });
    }
    return userConnection;
};

var customerConnection = null;
exports.getCustomerConnection = function () {
    if (customerConnection === null) {
        customerConnection = mongoose.createConnection(serverString + customerDBString, { server: { poolSize: 4 } });
    }
    return customerConnection;
};
My models are stored in separate files (grouped by their DB) that look a bit like this:
exports.UserSchema = UserSchema; //Just assume I know how to define a valid schema
exports.UserModel = connection.getUserConnection().model("User", UserSchema);
Later, I use getUserConnection() to refer to the connection I have created, to actually do work with the model.
TL;DR
In utilities that use this connection format, I have to call
connection.getUserConnection().on("open", function() {
    logger.info("Opened User DB");
    //Do What I Need To Do
});
It is possible that in some scenarios the task processor will have already broadcast the open event; in others, it won't be guaranteed to have happened yet. I noticed that it doesn't queue work if the connection isn't open (specifically, dropCollection), so I feel stuck.
How can I be certain that the connection is open before proceeding given that I can't depend on subscribing to the open event before the task processor runs?
Is there a better pattern for centralizing the managing of multiple connections?
I can answer part of my own question
How can I be certain that the connection is open before proceeding
given that I can't depend on subscribing to the open event before the
task processor runs?
if (connection.getUserConnection().readyState !== 1) {
    logger.info("User connection was not open yet. Adding open listener");
    connection.getUserConnection().on("open", function () {
        logger.info("User open event received");
        doStuff();
    });
} else {
    logger.info("User is already open.");
    doStuff();
}

function doStuff() {
    logger.info("Doing stuff");
}
If you see a better way, then please comment or offer up an answer. I would still like to hear how other people manage connections without rebuilding the connection every time.
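One pattern that sidesteps the readyState check entirely (a sketch with hypothetical names, not mongoose-specific API) is to memoize a promise that resolves once the connection is open and have every caller await it: a promise that has already resolved still delivers its value to late subscribers, so it works whether the caller arrives before or after the open event.

```javascript
// Memoized "ready promise": openFn performs the one-time open handshake;
// every caller gets the same promise, resolved or not.
function makeLazyResource(openFn) {
  let readyPromise = null;
  return function getResource() {
    if (readyPromise === null) {
      readyPromise = openFn(); // opened once, on first demand
    }
    return readyPromise;
  };
}

// Stand-in for mongoose.createConnection + waiting for its 'open' event:
const getUserConnection = makeLazyResource(
  () => new Promise(resolve => setTimeout(() => resolve({ readyState: 1 }), 10))
);

// An early caller (before the open) and a late caller both just await:
getUserConnection().then(conn => console.log('early:', conn.readyState)); // early: 1
setTimeout(() => {
  getUserConnection().then(conn => console.log('late:', conn.readyState)); // late: 1
}, 50);
```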

node.js - Require Exports Variables Errors

I have a node.js script that does some database queries for me and works fine. The script is starting to get a bit long, so I thought I would break it up, and moving the database-connection code out to another file made sense.
Below is the code that I have moved into another file and then included with a require statement.
The issue I'm having is with the 'exports' commands at the bottom of the script. It appears the function 'dbHandleDisconnectUsers()' exports fine; however, the variable 'dbConnectionUsers' doesn't.
The script errors refer to methods of the object 'dbConnectionUsers' (I hope that's the correct terminology) missing, and give me the impression I'm not really passing a complete object. Note: I would include the exact errors, but I'm not in front of the machine.
var mysql = require('/usr/lib/node_modules/mysql');

// Users Database Configuration
var dbConnectionUsers;
var dbConfigurationUsers = ({
    host     : 'xxxxx',
    user     : 'xxxxx',
    password : 'xxxxx',
    database : 'xxxxxx',
    timezone : 'Asia/Singapore'
});

// Users Database Connection & Re-Connection
function dbHandleDisconnectUsers() {
    dbConnectionUsers = mysql.createConnection(dbConfigurationUsers);
    dbConnectionUsers.connect(function(err) {
        if (err) {
            console.log('Users Error Connecting to Database:', err);
        } else {
            dbConnectionUsers.query("SET SESSION TRANSACTION ISOLATION LEVEL SERIALIZABLE;");
            dbConnectionUsers.query("SET SESSION sql_mode = 'ANSI';");
            dbConnectionUsers.query("SET NAMES UTF8;");
            dbConnectionUsers.query("SET time_zone='Asia/Singapore';");
        }
    });
    dbConnectionUsers.on('error', function(err) {
        console.log('Users Database Protocol Connection Lost: ', err);
        if (err.code === 'PROTOCOL_CONNECTION_LOST') {
            dbHandleDisconnectUsers();
        } else {
            throw err;
        }
    });
}
dbHandleDisconnectUsers();

exports.dbHandleDisconnectUsers() = dbHandleDisconnectUsers();
exports.dbConnectionUsers = dbConnectionUsers;
In the core script I have this require statement:
var database = require('database-connect.js');
And I refer to the function/variable as
database.dbHandleDisconnectUsers()
database.dbConnectionUsers
Ignoring the syntax error that everybody else has pointed out in exports.dbHandleDisconnectUsers() = dbHandleDisconnectUsers(), I will point out that dbConnectionUsers is uninitialized.
JavaScript is a pass-by-copy-of-reference language, therefore these lines:
var dbConnectionUsers;
exports.dbConnectionUsers = dbConnectionUsers;
are essentially identical to
exports.dbConnectionUsers = undefined;
Even though you set dbConnectionUsers later, you are not affecting exports.dbConnectionUsers because it holds a copy of the original dbConnectionUsers reference.
It's similar, in primitive data types, to:
var x = 5;
var y = x;
x = 1;
console.log(x); // 1
console.log(y); // 5
For details on how require and module.exports work, I will refer you to a recent answer I posted on the same topic:
Behavior of require in node.js
It's odd that your function is working but your other variable isn't exporting. This shouldn't be the case.
When you export functions, you generally don't want to export them as evaluated functions (i.e. aFunction()). The only time you might is if you want to export whatever that function returns, or if you want to export an instance of a constructor function as part of your module.
The other thing, which is really odd and is mentioned in a comment above, is that you are trying to assign a value to exports.dbHandleDisconnectUsers(), which should be undefined and throw an error.
So, in other words: your code should not look like exports.whatever() = whatever().
Instead you should export both functions and other properties like this:
exports.dbHandleDisconnectUsers = dbHandleDisconnectUsers; // no evaluation ()
exports.dbConnectionUsers = dbConnectionUsers;
I don't know if this is the only thing wrong here, but this is definitely one thing that might be causing an execution error or two :)
Also, taking into consideration what Brandon has pointed out, you are initially exporting something undefined; in your script, you then overwrite the reference anyway.
What you should do instead is make a new object reference, which is persistent and has a property in it that you can update. ie:
var dbConnection = {users: null};
exports.dbConnection = dbConnection;
Then when you run your function:
function dbHandleDisconnectUsers() {
    dbConnection.users = mysql.createConnection(dbConfigurationUsers);
    dbConnection.users.connect(function(err) {
        if (err) {
            console.log('Users Error Connecting to Database:', err);
        } else {
            dbConnection.users.query("SET SESSION TRANSACTION ISOLATION LEVEL SERIALIZABLE;");
            dbConnection.users.query("SET SESSION sql_mode = 'ANSI';");
            dbConnection.users.query("SET NAMES UTF8;");
            dbConnection.users.query("SET time_zone='Asia/Singapore';");
        }
    });
    dbConnection.users.on('error', function(err) {
        console.log('Users Database Protocol Connection Lost: ', err);
        if (err.code === 'PROTOCOL_CONNECTION_LOST') {
            dbHandleDisconnectUsers();
        } else {
            throw err;
        }
    });
}
This way, the object reference of dbConnection is never overwritten.
You will then refer to your users db connection in your module as:
database.dbConnection.users
Your function should still work as you were intending on using it before with:
database.dbHandleDisconnectUsers();
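The difference between the two export styles can be demonstrated without mysql at all. In this sketch, `exportsA` and `exportsB` are hypothetical stand-ins for two modules' module.exports objects:

```javascript
// Exporting a variable copies its current reference; exporting a container
// object lets later updates show through.
const exportsA = {};
let conn;                                 // still undefined at export time
exportsA.conn = conn;                     // snapshot of `undefined`

const exportsB = {};
const dbConnection = { users: null };     // stable container
exportsB.dbConnection = dbConnection;

// "Later", the connections are established:
conn = { connected: true };               // exportsA.conn does NOT see this
dbConnection.users = { connected: true }; // exportsB.dbConnection DOES see this

console.log(exportsA.conn);                         // undefined
console.log(exportsB.dbConnection.users.connected); // true
```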

Keeping open a MongoDB database connection

In so many introductory examples of using MongoDB, you see code like this:
var MongoClient = require('mongodb').MongoClient;
MongoClient.connect("mongodb://localhost:port/adatabase", function(err, db)
{
/* Some operation... CRUD, etc. */
db.close();
});
If MongoDB is like any other database system, open and close operations are typically expensive time-wise.
So, my question is this: is it OK to call MongoClient.connect(...) once, assign the returned db value to some module global, have various functions in the module do various database-related work (inserting documents into collections, updating documents, etc.) when they're called by other parts of the application (thereby re-using that db value), and then, when the application is done, only then call close?
In other words, open and close are done once, not every time you need to do some database-related operation. You keep re-using the db object returned by the initial open/connect, and only dispose of it at the end, with close, when you're actually done with all your database-related work.
Obviously, since all the I/O is async, you'd make sure the last database operation completed before issuing the close. It seems like this should be OK, but I wanted to double-check just in case I'm missing something, as I'm new to MongoDB. Thanks!
Yes, that is fine and typical behavior: start your app, connect to the DB, do operations against the DB for a long time, maybe reconnect if the connection ever dies unexpectedly, and then just never close the connection (rely on the automatic close that happens when your process dies).
mongodb version ^3.1.8
Initialize the connection as a promise:
const MongoClient = require('mongodb').MongoClient
const uri = 'mongodb://...'
const client = new MongoClient(uri)
const connection = client.connect() // initialized connection
And then call the connection whenever you wish to perform an action on the database:
// if I want to insert into the database...
const connect = connection
connect.then(() => {
    const doc = { id: 3 }
    const db = client.db('database_name')
    const coll = db.collection('collection_name')
    coll.insertOne(doc, (err, result) => {
        if (err) throw err
    })
})
The currently accepted answer is correct in that you may keep the same database connection open to perform operations; however, it is missing details on how you can retry connecting if the connection closes. Below are two ways to reconnect automatically. They're in TypeScript, but they can easily be translated into plain Node.js if you need to.
Method 1: MongoClient Options
The simplest way to let MongoDB reconnect is to define reconnectTries in the options object passed into MongoClient. Any time a CRUD operation times out, the driver will use the parameters passed into MongoClient to decide how to retry (reconnect). Setting the option to Number.MAX_VALUE essentially makes it retry forever until it's able to complete the operation. You can check out the driver source code if you want to see which errors will be retried.
class MongoDB {
    private db: Db;

    constructor() {
        this.connectToMongoDB();
    }

    async connectToMongoDB() {
        const options: MongoClientOptions = {
            reconnectInterval: 1000,
            reconnectTries: Number.MAX_VALUE
        };
        try {
            const client = new MongoClient('uri-goes-here', options);
            await client.connect();
            this.db = client.db('dbname');
        } catch (err) {
            console.error(err, 'MongoDB connection failed.');
        }
    }

    async insert(doc: any) {
        if (this.db) {
            try {
                await this.db.collection('collection').insertOne(doc);
            } catch (err) {
                console.error(err, 'Something went wrong.');
            }
        }
    }
}
Method 2: Try-catch Retry
If you want more granular control over reconnecting, you can use a try-catch inside a while loop. For example, you may want to log an error when it has to reconnect, or do different things based on the type of error. This will also allow you to retry on more conditions than just the standard ones included with the driver. The insert method can be changed to the following:
async insert(doc: any) {
    if (this.db) {
        let isInserted = false;
        while (isInserted === false) {
            try {
                await this.db.collection('collection').insertOne(doc);
                isInserted = true;
            } catch (err) {
                // Add custom error handling if desired
                console.error(err, 'Attempting to retry insert.');
                try {
                    await this.connectToMongoDB();
                } catch {
                    // Do something if this fails as well
                }
            }
        }
    }
}
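The retry loop above can also be factored into a small standalone helper. This sketch (all names hypothetical, not from the mongodb driver) caps the number of attempts and delays between tries, which is safer than an unbounded while loop if the database stays down:

```javascript
// Generic bounded retry with a delay between attempts. `operation` stands in
// for an insertOne call; a reconnect step could run inside the catch.
async function retry(operation, { attempts = 3, delayMs = 10 } = {}) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await operation();
    } catch (err) {
      lastErr = err;
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
  throw lastErr; // give up instead of looping forever
}

// Demo: an operation that fails twice, then succeeds on the third attempt.
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 3) throw new Error('connection lost');
  return 'inserted';
};
retry(flaky).then(result => console.log(result, calls)); // inserted 3
```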
