Opening fewer connections with RethinkDB - Node.js

My question is: can I have fewer connections open when using RethinkDB? Right now I'm opening a new one every time I want to insert or get some data. I'm afraid that this is not the right thing to do. Is there any way I can open just one connection and use it instead? I'm using Node.js.

Yes. You can run multiple queries on a connection. That's the recommended way of doing things.

The best way is to use a connection pool. For Node.js, for example, we are using rethinkdb-pool.

I haven't looked into the open-source connection pools for RethinkDB, but I have a Node app that uses RethinkDB and will have a limited number of users, so I save my one connection to RethinkDB as a global variable and then use it for all queries.
'use strict';
var r = require('rethinkdb');
var rethinkConnect = null;

r.connect(
    {
        host: 'localhost',
        port: 28015,
        password: 'noneya'
    },
    function(err, conn) {
        if (err) {
            console.log(err);
        } else {
            rethinkConnect = conn;
        }
    }
);
Now all queries the Node.js server makes can use this connection. Keep in mind this code is async, so you can't make a query on the very next line after r.connect(). You could, however, use the payload of an inbound socket.io event as the parameters of a RethinkDB query.
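For example, here is a minimal sketch of that pattern, reusing the global rethinkConnect above and assuming a hypothetical 'addItem' socket.io event and an 'items' table:

// a minimal sketch, assuming a hypothetical 'addItem' socket.io event and an
// 'items' table; every handler reuses the single rethinkConnect from above
var io = require('socket.io')(3000);

io.on('connection', function(socket) {
    socket.on('addItem', function(payload) {
        if (!rethinkConnect) return; // connection not established yet
        r.table('items').insert(payload).run(rethinkConnect, function(err, result) {
            if (err) console.log(err);
            else socket.emit('itemAdded', result);
        });
    });
});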

I'd advise you to use rethinkdbdash, an "advanced Node.js RethinkDB driver" that includes connection pools and some other advanced features. It has many more stars and contributors than rethinkdb-pool.
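For reference, a rough sketch of what that looks like: rethinkdbdash manages its own connection pool, so queries call run() without an explicit connection (the table name here is just a placeholder).

// a minimal sketch with rethinkdbdash: the driver keeps a pool internally,
// so queries run without passing an explicit connection
var r = require('rethinkdbdash')({
    host: 'localhost',
    port: 28015
});

r.table('items').insert({ name: 'example' }).run()
    .then(function(result) { console.log(result); })
    .catch(function(err) { console.log(err); });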

Related

Quickly using up all connections on PostgreSQL in Node.js

I am running an app on GCP with Node.js and PostgreSQL (Cloud SQL, lowest tier, i.e. 25 connections) using the 'pg' package ("pg": "^8.7.3"). I am quite new to this configuration, so there may be some very basic errors here.
I configure my pg_client like this:
// CLOUD SQL POSTGRESQL DATABASE
const { Client, Pool } = require('pg')
const pg_client = new Pool({
    user: process.env.PG_USER,
    host: process.env.PG_HOST,
    database: process.env.PG_DB,
    password: process.env.PG_PWD,
    port: 5432,
})
and then, in order to copy the data from a NoSQL database with some 50,000+ items, I go through them pretty much like this. I know the code doesn't make perfect sense, but this is how the SQL calls are being made:
fiftyThousandOldItems.forEach(async (item) => {
    let nameId = await pg_client.query("SELECT id from some1000items where name='John'")
    pg_client.query("INSERT into items (id, name, url) VALUES (nameId, 1, 2)")
})
This does, however, quickly produce "sorry, too many clients already :: proc.c:362" and "error: remaining connection slots are reserved for non-replication superuser connections".
I have done similar runs before without experiencing this issue (but then with about 1000 items).
As far as I understand, I do not need to call pg_client.connect() and pg_client.release() (or is it .end()?) any longer, according to an SO answer I unfortunately can't find any more. Is this really correct? (When I tried to before, I ended up with a lot of other issues that caused other types of problems.)
So, my questions are:
What am I doing wrong? Do I need to use pg_client.connect() before every SQL-call and then pg_client.release() after every SQL-call? Or is it pg_client.end()?
Is there a way to have this handled automatically? The current approach doesn't seem very DRY and looks bug-prone.
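Note: per the node-postgres documentation, pool.query() checks a client out of the pool and releases it automatically, so no manual connect()/release() is needed for one-off queries. Here is a minimal sketch of processing the items sequentially instead of firing all 50,000 queries at once; the item.name and item.url fields are hypothetical:

// a minimal sketch, assuming hypothetical item.name / item.url fields;
// pool.query() acquires and releases a pooled client automatically, and
// awaiting inside a for...of loop keeps the 50,000 queries sequential
async function copyItems(items) {
    for (const item of items) {
        const found = await pg_client.query(
            'SELECT id FROM some1000items WHERE name = $1', [item.name]
        );
        await pg_client.query(
            'INSERT INTO items (id, name, url) VALUES ($1, $2, $3)',
            [found.rows[0].id, item.name, item.url]
        );
    }
}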

node-postgres pool management

I'm trying to connect Node.js to a PostgreSQL database; for that I'm using node-postgres.
var { Pool } = require('pg');

var pool = new Pool({
    user: username,
    password: password,
    host: server,
    database: database,
    max: 25
});

module.exports = {
    execute_query: function (query2) {
        // usage of query
        pool.query(query2, function(err, result) {
            return (result);
        });
    }
};
Then, in my application, the function execute_query is called in different places.
Locally it works, but I wonder how the pool is managed. Is this enough configuration to handle concurrent users if my application is used by different people?
Do I need to do anything else to ensure that I have clients in the pool?
Or should I use the old way of managing clients by hand?
I read the documentation of node-postgres and it says that pool.query is the simplest way, but it doesn't say how it manages the connections...
Do you have any information?
Thank you
is this enough configuration to manage concurrent users if my application is used by different people?
This is a very broad question and depends on more than one thing. Let me still give it a shot.
The number of connections in the pool is the number of active connections your server will maintain with the database. Each connection has a cost, since Postgres maintains each one as a separate process, so just focusing on the connection pool is not enough.
pgtune gives you a good recommendation for your postgresql.conf settings based on your hardware.
If you want to test your application, you can use jMeter or any other load-testing tool to see how it performs under a certain load.
Some good resources to read on the topic: stack overflow answer, postgres wiki
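As for how pool.query manages connections: it checks a client out of the pool, runs the query, and returns the client when the query finishes. Note also that execute_query as written returns nothing to its caller, because the return happens inside the callback. A rough sketch using the promise API instead (assuming a pg version that supports it) might look like this:

// a rough sketch, not the only way to do it: pool.query() without a callback
// returns a promise, and the pool handles checkout/release of the client
module.exports = {
    execute_query: function (query2) {
        return pool.query(query2).then(function (result) {
            return result.rows;
        });
    }
};

// usage elsewhere in the app:
// require('./db').execute_query('SELECT 1').then(function (rows) { console.log(rows); });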

What's the proper way of using Postgres connections in Node?

I was wondering if anyone can help me understand what the proper way of maintaining multiple connections to multiple postgres servers via https://github.com/brianc/node-postgres is.
Obviously when running a node server for long duration we want to make sure we keep everything clean with no leaks and so I am wondering what the proper pattern is.
Please remember that my Node server will need to connect to 7-8 Postgres servers.
https://github.com/brianc/node-postgres supports the idea of pools. I am wondering: do I just connect to all servers on initial Node server set up and maintain open connections and each function can ask for a pool when it needs to talk to a server?
In other words, am I supposed to call pg.connect every time I make a server query? (minus the var pg and var connectionString which could be global)
Can't I just have a single connection be on and ready?
var pg = require('pg');
var connectionString = "pg://brian:1234@localhost/postgres";

pg.connect(connectionString, function(err, client, done) {
    client.query('SELECT name FROM users WHERE email = $1', ['brian@example.com'], function(err, result) {
        assert.equal('brianc', result.rows[0].name);
        done();
    });
});
Code snippets are greatly appreciated.
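For what it's worth, a common pattern is to create one pool per server at startup and reuse those pools everywhere. Here is a minimal sketch using the newer Pool API rather than the pg.connect call above; the connection strings and pool names are placeholders:

// a minimal sketch, assuming hypothetical servers and credentials; one Pool
// per Postgres server, created once at startup and shared across the app
var { Pool } = require('pg');

var pools = {
    users:   new Pool({ connectionString: 'postgres://user:pass@users-db:5432/users' }),
    billing: new Pool({ connectionString: 'postgres://user:pass@billing-db:5432/billing' })
    // ...one entry per server, 7-8 in total
};

function findUserName(email) {
    // the pool checks out a client, runs the query, and returns the client
    return pools.users.query('SELECT name FROM users WHERE email = $1', [email]);
}

module.exports = { pools: pools, findUserName: findUserName };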

How to work with node.js and mongoDB

I read:
How do I manage MongoDB connections in a Node.js web application?
http://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html
How can I set up MongoDB on a Node.js server using node-mongodb-native in an EC2 environment?
And I am really confused. How should I work with MongoDB from Node.js? I'm a rookie, and my question may look stupid.
var mongodb = require('mongodb');
var db = new mongodb.MongoClient(new mongodb.Server('localhost', 27017));
db.open(function(err, dataBase) {
    // all code here?
    dataBase.close();
});
Or, every time I need something from the db, do I need to call:
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
    // all code here
    dataBase.close();
});
What is the difference between open and connect? I read in the manual that one means "initialize" and the other "connect", but what exactly does that mean? I assume both do the same thing, just in different ways, so when should I use one instead of the other?
I also want to ask: is it normal that MongoClient needs 4 sockets? I am running two of my web servers at the same time; here's a picture:
http://i43.tinypic.com/29mlr14.png
EDIT:
I want to mention that this isn't a problem (rather a doubt :D); my server works perfectly. I ask because I want to know if I am using the MongoDB driver correctly.
Right now I use the first option: I initialize the mongo driver at the beginning and put all the code inside.
I'd recommend trying the MongoDB tutorial they offer. I was in the same boat, but this breaks it down nicely. In addition, there's this article on github that explains the basics of DB connection.
In short, it does look like you're doing it right.
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
    // all code here
    var collection = dataBase.collection('users');
    var document1 = { 'name': 'John Doe' };
    collection.insert(document1, { w: 1 }, function(err, result) {
        console.log(err);
    });
    dataBase.close();
});
You can still sign up for the free course M101JS: MongoDB for Node.js Developers, provided by the MongoDB guys.
Here is a short description:
This course will go over basic installation, JSON, schema design,
querying, insertion of data, indexing and working with language
drivers. In the course, you will build a blogging platform, backed by
MongoDB. Our code examples will be in Node.js.
I had the same question. I couldn't find a proper answer in the Mongo documentation.
All the documentation says is to prefer creating a new db connection and then using open() (rather than using connect()):
http://docs.mongodb.org/manual/reference/method/connect/
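For reference, a rough sketch of the two forms with the old callback-style driver used in these examples: they end up handing you the same database object, and connect() is just the one-step convenience wrapper (the database name is a placeholder).

var mongodb = require('mongodb');

// form 1: construct a client around a Server, then open() it
var client = new mongodb.MongoClient(new mongodb.Server('localhost', 27017));
client.open(function(err, openedClient) {
    var db = openedClient.db('myDB');
    // use db here
});

// form 2: connect() does the same in one step from a connection string
mongodb.MongoClient.connect('mongodb://localhost:27017/myDB', function(err, db) {
    // use db here
});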

Connection to Mongodb-Native-Driver in express.js

I am using mongodb-native-driver in an express.js app. I have around 6 collections in the database, so I have created 6 js files, each with a collection as a JavaScript object (e.g. function collection(){}) and prototype functions handling all the manipulation on those collections. I thought this would be a good architecture.
But the problem I am having is how to connect to the database. Should I create a connection in each of these files and use it? I think that would be overkill, since connect in mongodb-native-driver creates a pool of connections, and having several of them would not be justified.
So how do I create a single connection pool and use it in all the collection js files? I want the connection to work like it's implemented in Mongoose. Let me know if any of my thought process on the architecture of the app is wrong.
Using Mongoose would solve these problems, but I have read in several places that it's slower than the native driver, and I would also prefer schema-less models.
Edit: I created a module out of the models. Each collection was in its own file, and it took the database as an argument. Now in the index.js file I called the database connection and kept a variable db once I got the database from the connection. (I used the auto-reconnect feature to make sure that the connection wasn't lost.) In the same index.js file I exported each of the collections like this:
exports.model1 = require('./model1')(db)
exports.model2 = require('./model2')(db)
This ensured that the database part was handled in just one module, and the app would just call the functions that each model js file exported, like save(), findById(), etc. (whatever you do in those functions is up to you to implement).
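For illustration, a model file in that setup might look roughly like this (the collection name and function bodies are hypothetical); it exports a function that receives the shared db from index.js:

// model1.js - a rough sketch; exports a function that takes the shared db
// object from index.js and returns that collection's operations
module.exports = function(db) {
    var collection = db.collection('model1');

    return {
        save: function(doc, callback) {
            collection.insert(doc, { w: 1 }, callback);
        },
        findById: function(id, callback) {
            collection.findOne({ _id: id }, callback);
        }
    };
};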
how to connect to the database?
In order to connect using the MongoDB native driver you need to do something like the following:
var util = require('util');
var mongodb = require('mongodb');
var client = mongodb.MongoClient;

var auth = {
    user: 'username',
    pass: 'password',
    host: 'hostname',
    port: 1337,
    name: 'databaseName'
};

var uri = util.format('mongodb://%s:%s@%s:%d/%s',
    auth.user, auth.pass, auth.host, auth.port, auth.name);

/** Connect to the Mongo database at the URI using the client */
client.connect(uri, { auto_reconnect: true }, function (err, database) {
    if (err) throw err;
    else if (!database) console.log('Unknown error connecting to database');
    else {
        console.log('Connected to MongoDB database server at:');
        console.log('\n\t%s\n', uri);
        // Create or access collections, etc. here using the database object
    }
});
A basic connection is set up like this. This is all I can give you going on just the basic description of what you want. Post up some code you've got so far to get more specific help.
Should I create a connection in each of this files and use them?
No.
So how do I create a single connection pool and use it in all the collections.js files?
You can create a single file with code like the above, let's call it dbmanager.js, that connects to the database. Export functions like createUser, deleteUser, etc., which operate on your database, then export that functionality like so:
module.exports = {
    createUser: function () { ; },
    deleteUser: function () { ; }
};
which you could then require from another file like so:
var dbman = require('./dbmanager');
dbman.createUser(userData); // using connection established in `dbmanager.js`
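Putting the pieces together, dbmanager.js might look roughly like this. This is only a sketch reusing the connection code above; the collection name and functions are placeholders, and note the caveat that db is not set until the async connect completes (see the race-condition discussion further down):

// dbmanager.js - a minimal sketch; connects once and shares the database
// object with every exported function
var mongodb = require('mongodb');
var db = null;

mongodb.MongoClient.connect('mongodb://localhost:27017/myDB', function (err, database) {
    if (err) throw err;
    db = database;
    // db is only set once this async connect completes; see the
    // race-condition discussion further down
});

module.exports = {
    createUser: function (userData, callback) {
        db.collection('users').insert(userData, { w: 1 }, callback);
    },
    deleteUser: function (userId, callback) {
        db.collection('users').remove({ _id: userId }, { w: 1 }, callback);
    }
};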
EDIT: Because we're dealing with JavaScript and a single thread, the native driver indeed automatically handles connection pooling for you. You can look for this in the StackOverflow links below for more confirmation of this. The OP does state this in the question as well. This means that client.connect should be called only once by an instance of your server. After the database object is successfully retrieved from a call to client.connect, that database object should be reused throughout the entire instance of your app. This is easily accomplished by using the module pattern that Node.JS provides.
My suggestion is to create a module or set of modules which serves as a single point of contact for interacting with the database. In my apps I usually have a single module which depends on the native driver, calling require('mongodb'). All other modules in my app will not directly access the database, but instead all manipulations must be coordinated by this database module.
This encapsulates all of the code dealing with the native driver into a single module or set of modules. The OP seems to think there is a problem with the simple code example I've posted, describing a problem with a "single large closure" in my example. This is all pretty basic stuff, so I'm adding clarification as to the basic architecture at work here, but I still do not feel the need to change any code.
The OP also seems to think that multiple connections could possibly be made here. This is not possible with this setup. If you create a module like I suggest above, then the first time require('./dbmanager') is called it will execute the code in the file dbmanager.js and return the module.exports object. The exports object is cached and is also returned on each subsequent call to require('./dbmanager'); however, the code in dbmanager.js will only be executed on the first require.
If you don't want to create a module like this then the other option would be to export only the database passed to the callback for client.connect and use it directly in different places throughout your app. I recommend against this however, regardless of the OPs concerns.
Similar, possibly duplicate Stackoverflow questions, among others:
How to manage mongodb connections in nodejs webapp
Node.JS and MongoDB, reusing the DB object
Node.JS - What is the right way to deal with MongoDB connections
As the accepted answer says, you should create only one connection for all incoming requests and reuse it, but the answer is missing a solution that will create and cache the connection. I wrote an Express middleware to achieve this - express-mongo-db. At first sight this task is trivial, and most people use this kind of code:
var db;

function createConnection(req, res, next) {
    if (db) { req.db = db; return next(); }
    client.connect(uri, { auto_reconnect: true }, function (err, database) {
        req.db = db = database;
        next();
    });
}
app.use(createConnection);
But this code leads you to a connection leak when multiple requests arrive at the same time and db is still undefined. express-mongo-db solves this by holding the incoming requests and calling connect only once, when the module is required (not when the first request arrives).
Hope you find it useful.
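The general idea behind that fix - not express-mongo-db's actual code, just a sketch of the technique, reusing client and uri from the snippet above - is to queue callers while the first connect is in flight, so connect() is only ever called once:

// a sketch of the "connect once, queue the rest" technique; this is not
// express-mongo-db's actual implementation, just the general idea
var db = null;
var connecting = false;
var waiting = [];

function getDb(callback) {
    if (db) return callback(null, db);
    waiting.push(callback);
    if (connecting) return; // a connect is already in flight
    connecting = true;
    client.connect(uri, function (err, database) {
        db = database;
        waiting.forEach(function (cb) { cb(err, db); });
        waiting = [];
    });
}

app.use(function (req, res, next) {
    getDb(function (err, database) {
        if (err) return next(err);
        req.db = database;
        next();
    });
});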
I just thought I would add my own method of MongoDB connection for others interested or having problems with different methods.
This method assumes you don't need authentication (I use this on localhost).
Authentication is still easy to implement.
var MongoClient = require('mongodb').MongoClient;
var Server = require('mongodb').Server;

var client = new MongoClient(new Server('localhost', 27017, {
    socketOptions: { connectTimeoutMS: 500 },
    poolSize: 5,
    auto_reconnect: true
}, {
    numberOfRetries: 3,
    retryMilliseconds: 500
}));

client.open(function(err, client) {
    if (err) {
        console.log("Connection Failed Via Client Object.");
    } else {
        var db = client.db("theDbName");
        if (db) {
            console.log("Connected Via Client Object . . .");
            db.logout(function(err, result) {
                if (!err) {
                    console.log("Logged out successfully");
                }
                client.close();
                console.log("Connection closed");
            });
        }
    }
});
Credit goes to Brad Dayley, who goes over this method in his book (pages 231-232).
