I have a Node.js script that does some database queries for me and works fine. The script is starting to get a bit longer, so I thought I would break it up, and moving the database connection code out to another file seemed to make sense.
Below is the code I have moved into another file and then included with a require statement.
The issue I'm having is with the 'exports' statements at the bottom of the script. The function 'dbHandleDisconnectUsers()' appears to export fine, but the variable 'dbConnectionUsers' doesn't.
The script errors refer to methods of the object 'dbConnectionUsers' (I hope that's the correct terminology) being missing, which gives me the impression I'm not really passing a complete object. Note: I would include the exact errors, but I'm not in front of the machine.
var mysql = require('/usr/lib/node_modules/mysql');

// Users Database Configuration
var dbConnectionUsers;
var dbConfigurationUsers = {
    host     : 'xxxxx',
    user     : 'xxxxx',
    password : 'xxxxx',
    database : 'xxxxxx',
    timezone : 'Asia/Singapore'
};

// Users Database Connection & Re-Connection
function dbHandleDisconnectUsers() {
    dbConnectionUsers = mysql.createConnection(dbConfigurationUsers);
    dbConnectionUsers.connect(function(err) {
        if (err) {
            console.log('Users Error Connecting to Database:', err);
        } else {
            dbConnectionUsers.query("SET SESSION TRANSACTION ISOLATION LEVEL SERIALIZABLE;");
            dbConnectionUsers.query("SET SESSION sql_mode = 'ANSI';");
            dbConnectionUsers.query("SET NAMES UTF8;");
            dbConnectionUsers.query("SET time_zone='Asia/Singapore';");
        }
    });
    dbConnectionUsers.on('error', function(err) {
        console.log('Users Database Protocol Connection Lost: ', err);
        if (err.code === 'PROTOCOL_CONNECTION_LOST') {
            dbHandleDisconnectUsers();
        } else {
            throw err;
        }
    });
}

dbHandleDisconnectUsers();

exports.dbHandleDisconnectUsers() = dbHandleDisconnectUsers();
exports.dbConnectionUsers = dbConnectionUsers;
In the core script I have this require statement:
var database = require('./database-connect.js');
And I refer to the function/variable as
database.dbHandleDisconnectUsers()
database.dbConnectionUsers
Ignoring the syntax error that everybody else has pointed out in exports.dbHandleDisconnectUsers() = dbHandleDisconnectUsers(), I will point out that dbConnectionUsers is uninitialized.
JavaScript is a pass-by-copy-of-reference language, therefore these lines:
var dbConnectionUsers;
exports.dbConnectionUsers = dbConnectionUsers;
are essentially identical to
exports.dbConnectionUsers = undefined;
Even though you set dbConnectionUsers later, you are not affecting exports.dbConnectionUsers because it holds a copy of the original dbConnectionUsers reference.
With primitive data types, it's similar to:
var x = 5;
var y = x;
x = 1;
console.log(x); // 1
console.log(y); // 5
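The same thing happens with object references: reassigning the variable does not change what an earlier copy of the reference points to. A short illustration:
var a = { val: 1 };
var b = a;          // b copies the reference held by a
a = { val: 2 };     // a now points at a different object
console.log(b.val); // 1 -- b still refers to the original object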
For details on how require and module.exports work, I will refer you to a recent answer I posted on the same topic:
Behavior of require in node.js
It's odd that your function is working but your other variable isn't exporting. This shouldn't be the case.
When you export functions, you generally don't want to export them as evaluated functions (i.e. aFunction()). The only time you might is if you want to export whatever that function returns, or if you want to export an instance of a constructor function as part of your module.
The other thing, which is really odd and is mentioned in a comment above, is that you are trying to assign a value to exports.dbHandleDisconnectUsers(), which should be undefined and throw an error.
So, in other words: your code should not look like exports.whatever() = whatever().
Instead you should export both functions and other properties like this:
exports.dbHandleDisconnectUsers = dbHandleDisconnectUsers; // no evaluation ()
exports.dbConnectionUsers = dbConnectionUsers;
I don't know if this is the only thing wrong here, but this is definitely one thing that might be causing an execution error or two :)
Also, taking into consideration what Brandon has pointed out, you are initially exporting something undefined, and in your script you are overwriting the reference anyway.
What you should do instead is make a new object reference, which is persistent and has a property in it that you can update, i.e.:
var dbConnection = {users: null};
exports.dbConnection = dbConnection;
Then when you run your function:
function dbHandleDisconnectUsers() {
    dbConnection.users = mysql.createConnection(dbConfigurationUsers);
    dbConnection.users.connect(function(err) {
        if (err) {
            console.log('Users Error Connecting to Database:', err);
        } else {
            dbConnection.users.query("SET SESSION TRANSACTION ISOLATION LEVEL SERIALIZABLE;");
            dbConnection.users.query("SET SESSION sql_mode = 'ANSI';");
            dbConnection.users.query("SET NAMES UTF8;");
            dbConnection.users.query("SET time_zone='Asia/Singapore';");
        }
    });
    dbConnection.users.on('error', function(err) {
        console.log('Users Database Protocol Connection Lost: ', err);
        if (err.code === 'PROTOCOL_CONNECTION_LOST') {
            dbHandleDisconnectUsers();
        } else {
            throw err;
        }
    });
}
This way, the object reference of dbConnection is never overwritten.
You will then refer to your users db connection in your module as:
database.dbConnection.users
Your function should still work as you were intending to use it before, with:
database.dbHandleDisconnectUsers();
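For illustration, the core script might then look something like this; a minimal sketch, assuming the require path and the example query:
// core script (sketch; adjust the path to wherever database-connect.js lives)
var database = require('./database-connect.js');

database.dbHandleDisconnectUsers();             // establish / re-establish the connection
database.dbConnection.users.query('SELECT 1;'); // always read the live handle off the wrapper object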
Related
I have a client-side form that can create a document upon submission. I want to see if one of the input fields already exists on a Document in the DB though. This would then alert the user and ask them if they want to continue creating the record.
Client-side event
Template.createDoc.events({
    'click button[type=submit]'(e, template) {
        // This particular example checks whether a Doc with its `name` property set to `value` already exists
        const value = $('#name').val();
        const fieldName = 'name';
        const exists = Meteor.call('checkIfFieldExistsOnDoc', fieldName, value);
        if (exists) {
            if (confirm(`Doc with ${value} as its ${fieldName} already exists. Are you sure you want to continue creating Doc?`)) {
                // db.Docs.insert....
            }
        }
    }
});
Server-side Meteor Method
'checkIfFieldExistsOnDoc'(field, val) {
    if (this.isServer) {
        this.unblock();
        check(field, String);
        check(val, String);
        if (!this.userId) {
            throw new Meteor.Error('not-authorized', 'You are not authorized.');
        }
        const findObj = {};
        findObj[field] = val;
        const fieldsObj = {};
        fieldsObj[field] = 1;
        const doc = Docs.findOne(findObj, {fields: fieldsObj});
        return doc;
    }
},
My issue is that the client-side code always gets undefined back when calling the server method. I now understand why; however, I'm not keen on wrapping all of my subsequent client code into a callback yet.
So, any other ideas on how I can attempt to do this simple feature?
Also, I was thinking of having the client-side page's onCreated do a one-time server call to get ALL names for all Docs, storing them in memory, and then doing the check upon form submission against that. Obviously, this is inefficient and not scalable, although it would work.
Meteor.call on the client side is always an async call, so you need to implement a callback.
See docs: https://docs.meteor.com/api/methods.html#Meteor-call
Meteor.call('checkIfFieldExistsOnDoc', fieldName, value, function(error, result) {
    if (result) {
        if (confirm(`Doc with ${value} as its ${fieldName} already exists. Are you sure you want to continue creating Doc?`)) {
            // db.Docs.insert....
        }
    }
});
On the client, you can wrap any Meteor.call with a Promise and then use it with async/await. There are some packages on Atmosphere that do this for you too.
I've used this package for years: https://atmospherejs.com/deanius/promise
On the client I often just use await Meteor.callPromise(), which returns the response nicely.
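For reference, a hand-rolled version of that wrapper is only a few lines; this is a sketch, not tied to any particular package, and callAsync is just a name chosen here:
// sketch: wrap Meteor.call in a Promise by hand, then use async/await
function callAsync(name, ...args) {
    return new Promise((resolve, reject) => {
        Meteor.call(name, ...args, (error, result) => {
            if (error) reject(error);
            else resolve(result);
        });
    });
}

// usage inside an async event handler
const exists = await callAsync('checkIfFieldExistsOnDoc', fieldName, value);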
Here are a couple of the best write-ups on the many options available to you:
https://blog.meteor.com/using-promises-on-the-client-in-meteor-fb4f1c155f84
https://forums.meteor.com/t/meteor-methods-return-values-via-promise-async/42060
https://dev.to/jankapunkt/async-meteor-method-calls-24f9
I'm still new to Node.js and I'm trying to create and export my database connection with mysqljs.
I'm having trouble because if something happens, like the database crashing or a network problem, the connection needs to be re-created. I can't figure out how to update the previously exported connection in my app.
Is there a way to update an object exported by the module like I did?
Am I missing something or using module.exports in the wrong way?
I think it would be possible to use a function or a constructor, but I would have to edit every module that currently uses my dbh variable.
var mysql = require('mysql');
// (logger is assumed to be set up elsewhere in the project)

var reconnection_interval = 5000;

var db_config = {
    host     : "localhost",
    user     : "root",
    password : "",
    database : "test"
};

var dbh;

function handleDisconnect(conn)
{
    conn.on('error', function(err)
    {
        // If the error is not fatal, we just ignore it.
        if (!err.fatal) { return; }
        else
        {
            logger.error('Database error : ' + err);
            logger.info('Trying to reconnect in ' + reconnection_interval / 1000 + 's');
            // Destroying old connection.
            dbh = undefined;
            // Creating a new connection and trying to reconnect every <reconnection_interval>
            setTimeout(function() {
                dbh = mysql.createConnection(db_config);
                handleDisconnect(dbh);
                dbh.connect();
                // I want to update the module.exports value, so it becomes the new connection
                module.exports = dbh;
            }, reconnection_interval);
        }
    });
}

dbh = mysql.createConnection(db_config);
handleDisconnect(dbh);
dbh.connect();
logger.info("Connected to database.");
module.exports = dbh;
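The wrapper-object approach from the first answer above applies here as well: export one stable object once and only update a property on it, instead of re-assigning module.exports. A minimal, untested sketch, reusing db_config and reconnection_interval from the code above:
// sketch: export one stable object; consumers read db.handle each time
var mysql = require('mysql');

var db = { handle: null };

function connect() {
    db.handle = mysql.createConnection(db_config);
    db.handle.connect();
    db.handle.on('error', function(err) {
        if (!err.fatal) { return; }
        // re-create the connection after a delay; db.handle is updated in place
        setTimeout(connect, reconnection_interval);
    });
}

connect();
module.exports = db; // other modules use require('./db').handle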
I'm new to Node.js and trying to learn the basics by rebuilding an existing i2c sensor system.
I got it all running using named functions and async.series inside a single file. To make it reusable, I now want to create a class which I can then import. Unfortunately, I get some errors I don't understand.
class.js
const async = require('async');
const i2c = require('i2c-bus');

class Sensor {
    constructor (channel) {
        this.channel = channel;
        var self = this;
    }
    openBus (callback) {
        bus = i2c.open(self.channel, (err) => { callback() }); // shortened for Stack Overflow
    }
    closeBus (callback) {
        bus.close((err) => { callback() }); // also shortened for better readability
    }
    connection (callback) {
        /* first variation */
        async.series([openBus, closeBus], callback);
    }
    connection2 (callback) {
        /* second variation */
        async.series([this.openBus, this.closeBus], callback);
    }
}

module.exports = Sensor;
When I import the class, I can create a new sensor 'object' without any problem and call the functions directly using:
> var Sensor = require('./class.js');
> var mySensor = new Sensor(1);
> mySensor.openBus(foo);
> mySensor.closeBus(bar);
But if I go and try to call the wrapper functions, I get the following errors:
> mySensor.connection(foo);
ReferenceError: openBus is not defined (at 'connection')
> mySensor.connection2(foo);
ReferenceError: self is not defined (at 'openBus')
I believe those errors occur due to my lack of understanding of the correct usage of this and self. Sadly, I can't find any good read on that topic. Any help is highly appreciated.
UPDATE
The solution provided in the first two answers was in fact my first approach, before starting to use "self" (after some googling of the this/that trick).
Anyway, here is the output/error I get using "this.channel" instead:
> mySensor.connection2(foo);
TypeError: Cannot read property 'channel' of undefined (at openBus)
The line var self = this; is not saved anywhere and is therefore lost when the function ends (the constructor is a function).
Just remove that line in the constructor and use this everywhere instead of self.
It's true that the this keyword is a little tricky in JavaScript, but if you follow a reasonable approach, you should be fine.
You indeed have an issue with this and self.
Every member of the class has to be referred to via this. If you set a member in the constructor, e.g. this.name = "SO user";, you need to access it through this as well, e.g. console.log(this.name);
What you need here is
openBus (callback) {
    bus = i2c.open(this.channel, (err) => { callback() });
}
and then mySensor.connection2(foo) will work fine.
While I still don't fully understand the reason behind this, I fixed my code by getting rid of the ES6 class definition.
class.js
const i2c = require('i2c-bus');
const async = require('async');

function Sensor(channel) {
    let that = this; // make 'this' available in sub-function scope
    this.channel = channel;

    function openBus(cb) {
        // open the bus-connection
        bus = i2c.open(that.channel);
    }

    function closeBus(cb) {
        // close the bus-connection
    }

    function connection(cb) {
        async.series([openBus, closeBus], cb);
    }

    function getReading(cb) {
        async.until(
            function() {
                // loop condition, e.g. max tries to get a reading
            },
            function(cb) {
                connection(cb); // calling the nested connection routine
            },
            function(err) {
                // result handling
            }
        ); // end async.until
    } // end getReading

    return {
        getReading: getReading
    }; // make only 'getReading' available
}

module.exports = {
    Sensor: Sensor
}; // make 'Sensor' available
In the 'member' functions I can now use the 'class' variables of 'Sensor' by accessing them with 'that' (e.g. 'that.channel').
Detail:
function openBus(cb) {
    bus = i2c.open(that.channel);
}
If I used this instead of that, it would only work when calling openBus directly. In my example it's necessary to call openBus and closeBus in a sequential manner (for obvious reasons). Since async.series is additionally nested inside async.until (the sensor might need several tries to respond), the scope of this changes. By using that instead, I'm able to ignore the scope issue.
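For comparison, here is a sketch that keeps the ES6 class by binding the methods in the constructor, so the bare references handed to async.series keep their this. This is untested against the real sensor and the bus handling is simplified:
// sketch (untested on real hardware): keep the ES6 class, but bind the methods
const async = require('async');
const i2c = require('i2c-bus');

class Sensor {
    constructor (channel) {
        this.channel = channel;
        this.bus = null;
        // bind once, so the bare references passed to async.series keep their `this`
        this.openBus = this.openBus.bind(this);
        this.closeBus = this.closeBus.bind(this);
    }
    openBus (callback) {
        this.bus = i2c.open(this.channel, (err) => { callback(err); });
    }
    closeBus (callback) {
        this.bus.close((err) => { callback(err); });
    }
    connection (callback) {
        async.series([this.openBus, this.closeBus], callback);
    }
}

module.exports = Sensor;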
Comment:
Since the solution generally points to using nested async calls inside custom modules, I'll slightly alter the title of the initial question. I'm still hoping for better solutions and/or explanations, so I won't mark my own answer as accepted yet.
Is it possible to process a db.model.find() query inside a function and retrieve the result without using callbacks or promises with the mongoose library?
I need to make sure a user exists while a controller is running, and I can't move the current scope into a callback because of the large number of similar operations (for example, communication with the database). Also, I'm trying to follow the MVC model in my project, so I want to keep the helper libs (modules) in separate files. That's why I don't want to use any callbacks or promises: they would complicate everything even more than it already is.
For example, how should I rewrite the following code so that it executes successfully (if that's actually possible)? You can ignore the login model and controller; they are included to show how much more complicated things would get if this code were rewritten using callbacks.
user.js lib
var db = require('./lib/db');

class User {
    constructor(id) { // get user by id
        var result = db.models.user.findOne({_id: id}); // unsupported syntax in reality :(
        if (!result || result._id != id)
            return false;
        else {
            this.userInfo = result;
            return result;
        }
    }
}

module.exports = User;
login model
var user = require('./lib/user');
var model = {};

model.checkUserLogged = function(req) {
    if (!req.user.id || req.user.id == undefined)
        return false;
    if (!(this.user = new user(req.user.id)))
        return false;
    else
        return true;
};

module.exports = model;
login controller
var proxy = require('express').Router();

proxy.all('/login', function(req, res) {
    var model = require('./models/login');
    if (!model.checkUserLogged(req)) {
        console.log('User is not logged in!');
        res.render('unlogged', model);
    } else {
        console.log('User exists in database!');
        res.render('logged_in', model);
    }
});
Generator functions/yield, async/await (ES2017), and anything else can be used, as long as it solves the problem without nesting.
Thanks in advance.
There are two things wrong here:
Mongoose methods can't be called synchronously (and anyway, a synchronous call to a DB is not a good idea at all).
Neither async/await nor generators can be used in the constructor of an ES6 class. It is explained in this answer.
If you don't want nested code, an easy option could be to use async/await (currently available in Node.js behind a flag, not recommended for production). Since Mongoose methods return promises, they can be used with async/await.
But as I said, you can not do that in the constructor, so it has to be somewhere else.
As an example you could do something like this:
var proxy = require('express').Router();
var db = require('./lib/db');

proxy.all('/login', async function(req, res) {
    const result = await db.models.user.findOne({_id: req.user.id}).exec();
    if (!result) {
        console.log('User is not logged in!');
        return res.render('unlogged');
    }
    res.render('logged_in');
});
Old question, but I want to share a method for handling this that I didn't see in my first couple searches.
I want to get data from a model, run some logic and return the results from that logic. I need a promise wrapper around my call to the model.
Below is a slightly abstracted function that takes a model to run a mongoose/mongo query on, and a couple of params to help it do some logic. It then resolves with the value that is expected, or rejects.
export function promiseFunction(aField: string, aValue, model: Model<ADocument, {}>): Promise<aType> {
    return new Promise<aType>((resolve, reject) => {
        model.findOne({[aField]: aValue}, (err, theDocument) => {
            if (err) {
                reject(err.toString());
            } else if (theDocument && theDocument.someCheck === true) {
                resolve(theDocument.matchingTypeField);
            } else {
                reject("there was an error of some type");
            }
        });
    });
}
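A usage sketch for the function above; the field name, value, and UserModel are hypothetical placeholders:
// usage sketch -- 'email', the value, and UserModel are hypothetical placeholders
promiseFunction('email', 'user@example.com', UserModel)
    .then(value => console.log('matched:', value))
    .catch(err => console.error(err));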
In so many introductory examples of using MongoDB, you see code like this:
var MongoClient = require('mongodb').MongoClient;
MongoClient.connect("mongodb://localhost:port/adatabase", function(err, db)
{
/* Some operation... CRUD, etc. */
db.close();
});
If MongoDB is like any other database system, open and close operations are typically expensive time-wise.
So, my question is this: Is it OK to simply do the MongoClient.connect("... once, assign the returned db value to some module global, have various functions in the module do various database-related work (insert documents into collections, update documents, etc. etc.) when they're called by other parts of the application (and thereby re-use that db value), and then, when the application is done, only then do the close.
In other words, open and close are done once - not every time you need to go and do some database-related operation. And you keep re-using that db object that was returned during the initial open\connect, only to dispose of it at the end, with the close, when you're actually done with all your database-related work.
Obviously, since all the I/O is async, you'd make sure that the last database operation had completed before issuing the close. It seems like this should be OK, but I wanted to double-check just in case I'm missing something, as I'm new to MongoDB. Thanks!
Yes, that is fine and typical behavior. Start your app, connect to the db, do operations against the db for a long time, maybe re-connect if the connection ever dies unexpectedly, and then just never close the connection (rely on the automatic close that happens when your process dies).
mongodb version ^3.1.8
Initialize the connection as a promise:
const MongoClient = require('mongodb').MongoClient
const uri = 'mongodb://...'
const client = new MongoClient(uri)
const connection = client.connect() // initialized connection
And then call the connection whenever you wish to perform an action on the database:
// if I want to insert into the database...
const connect = connection
connect.then(() => {
    const doc = { id: 3 }
    const db = client.db('database_name')
    const coll = db.collection('collection_name')
    coll.insertOne(doc, (err, result) => {
        if (err) throw err
    })
})
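If you prefer async/await, the same one-time connection can be awaited before each operation. A short sketch, with the database and collection names as placeholders just like above:
// sketch: same pattern with async/await
async function insertDoc(doc) {
    await connection // resolves once the one-time connect has finished
    const coll = client.db('database_name').collection('collection_name')
    return coll.insertOne(doc)
}

insertDoc({ id: 3 }).catch(console.error)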
The current accepted answer is correct in that you may keep the same database connection open to perform operations, however, it is missing details on how you can retry to connect if it closes. Below are two ways to automatically reconnect. It's in TypeScript, but it can easily be translated into normal Node.js if you need to.
Method 1: MongoClient Options
The simplest way to allow MongoDB to reconnect is to define reconnectTries in the options object passed into MongoClient. Any time a CRUD operation times out, it will use the parameters passed into MongoClient to decide how to retry (reconnect). Setting the option to Number.MAX_VALUE essentially makes it retry forever until it's able to complete the operation. You can check out the driver source code if you want to see which errors will be retried.
import { Db, MongoClient, MongoClientOptions } from 'mongodb';

class MongoDB {
    private db: Db;

    constructor() {
        this.connectToMongoDB();
    }

    async connectToMongoDB() {
        const options: MongoClientOptions = {
            reconnectInterval: 1000,
            reconnectTries: Number.MAX_VALUE
        };
        try {
            const client = new MongoClient('uri-goes-here', options);
            await client.connect();
            this.db = client.db('dbname');
        } catch (err) {
            console.error(err, 'MongoDB connection failed.');
        }
    }

    async insert(doc: any) {
        if (this.db) {
            try {
                await this.db.collection('collection').insertOne(doc);
            } catch (err) {
                console.error(err, 'Something went wrong.');
            }
        }
    }
}
Method 2: Try-catch Retry
If you want more granular support on trying to reconnect, you can use a try-catch with a while loop. For example, you may want to log an error when it has to reconnect or you want to do different things based on the type of error. This will also allow you to retry depending on more conditions than just the standard ones included with the driver. The insert method can be changed to the following:
async insert(doc: any) {
    if (this.db) {
        let isInserted = false;
        while (isInserted === false) {
            try {
                await this.db.collection('collection').insertOne(doc);
                isInserted = true;
            } catch (err) {
                // Add custom error handling if desired
                console.error(err, 'Attempting to retry insert.');
                try {
                    await this.connectToMongoDB();
                } catch {
                    // Do something if this fails as well
                }
            }
        }
    }
}