I'm trying to connect Node.js to a PostgreSQL database, and for that I'm using node-postgres.
var Pool = require('pg').Pool;

var pool = new Pool({
    user: username,
    password: password,
    host: server,
    database: database,
    max: 25
});

module.exports = {
    execute_query: function (query2) {
        // run the query through the pool; a client is checked out and released automatically,
        // and the resulting promise is returned to the caller
        return pool.query(query2);
    }
};
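For example, if the module above were saved as db.js (a placeholder file name, not given in the question), a caller could do:

var db = require('./db');

db.execute_query('SELECT NOW()').then(function (result) {
    console.log(result.rows);
});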
Then the function execute_query is called in different places in the application.
Locally it works, but I wonder how the pool is managed. Is this enough configuration to handle concurrent users if my application is used by different people?
Do I need to do anything else to ensure that I have clients in the pool?
Or should I use the old way of managing clients manually?
I read the documentation of node-postgres and it says that pool.query is the simplest way, but it doesn't say how it manages the connections...
Do you have any information ?
Thank you
is this enough configuration to manage concurrent users if my application is used by different people?
This is a very broad question, and the answer depends on more than one thing. Let me still give it a shot.
The number of connections in the pool is the number of active connections your server will maintain with the DB. Each connection has a cost, since Postgres maintains every connection as a separate process. So focusing on the connection pool alone is not enough.
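As a point of reference, here is a minimal sketch of the two ways node-postgres hands out those connections (the accounts table and the values below are only placeholders). pool.query checks an idle client out of the pool, runs the statement and releases the client for you; if all max clients are busy, the call simply waits in the pool's queue. An explicit pool.connect() / client.release() pair is only needed when several statements must share one client, e.g. a transaction:

async function transferExample(pool) {
    // pool.query: checkout, query and release are handled for you
    const { rows } = await pool.query('SELECT NOW()');
    console.log(rows);

    // explicit checkout: use when several statements must share one client (e.g. a transaction)
    const client = await pool.connect();
    try {
        await client.query('BEGIN');
        await client.query('UPDATE accounts SET balance = balance - $1 WHERE id = $2', [100, 1]);
        await client.query('COMMIT');
    } catch (err) {
        await client.query('ROLLBACK');
        throw err;
    } finally {
        client.release(); // always hand the client back to the pool
    }
}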
pgtune gives you a good recommendation on your postgresql.conf settings based on your hardware.
If you want to test your application, you can use JMeter or any other load-testing tool to see how it performs under a certain load.
Some good resources to read on the topic: stack overflow answer, postgres wiki
I am running an app on GCP with Node.js and PostgreSQL (Cloud SQL, lowest tier, i.e. 25 connections), using the 'pg' package ("pg": "^8.7.3"). I am quite new to this configuration, so there may be some very basic errors here.
I configure my pg_client like this:
// CLOUD SQL POSTGRESQL DATABASE
const { Client, Pool } = require('pg')
const pg_client = new Pool({
    user: process.env.PG_USER,
    host: process.env.PG_HOST,
    database: process.env.PG_DB,
    password: process.env.PG_PWD,
    port: 5432,
})
and then, in order to copy the data from a NoSQL database with some 50,000+ items, I go through them pretty much like this. I know the code doesn't make perfect sense, but this is how the SQL calls are being made:
fiftyThousandOldItems.forEach(async (item) => {
    let nameId = await pg_client.query("SELECT id from some1000items where name='John'")
    pg_client.query("INSERT into items (id, name, url) VALUES (nameId, 1, 2)")
})
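For contrast, a minimal sketch of the same loop rewritten so the statements run one at a time and use parameterized values (item.name and item.url are assumptions about the item shape, not taken from the question). forEach(async ...) fires every iteration immediately, so all 50,000+ queries are issued at once, whereas a plain for...of loop awaits each one, and pool.query handles the connect/release for you:

async function copyItems(fiftyThousandOldItems) {
    for (const item of fiftyThousandOldItems) {
        // look up the id first, then insert, awaiting each statement before starting the next
        const res = await pg_client.query(
            'SELECT id FROM some1000items WHERE name = $1',
            [item.name]
        );
        await pg_client.query(
            'INSERT INTO items (id, name, url) VALUES ($1, $2, $3)',
            [res.rows[0].id, item.name, item.url]
        );
    }
}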
This does, however, quickly produce "sorry, too many clients already :: proc.c:362" and "error: remaining connection slots are reserved for non-replication superuser connections".
I have done similar runs before without experiencing this issue (but then with about 1000 items).
As far as I understand, I do not need to call pg_client.connect() and pg_client.release() (or is it .end()?) any longer, according to an SO answer I unfortunately can't find any more. Is this really correct? (When I tried to before, I ended up with a lot of other issues that caused other types of problems.)
So, my questions are:
What am I doing wrong? Do I need to use pg_client.connect() before every SQL-call and then pg_client.release() after every SQL-call? Or is it pg_client.end()?
Is there a way to have this handled automatically? The current approach doesn't seem very DRY and looks bug-prone.
I am currently working on a web site using Angular, Node.Js, express and an Oracle database.
I'm still not familiar with all these technologies and I'm doing my best! My problem is that the Oracle database connection can't be exported. I searched about this and I think it's a promise thing. The only solution I found is to connect to the database every time I run a new SQL query, which is impractical for me because the project is big.
Is there any method I can use to make the database connection in a separate file and export it when I need it?
With a multi-user app, you should use a connection pool. You open it at app start, and then access it as needed - you don't have to pass anything around.
If you open the connection pool in your initialization routine:
const oracledb = require('oracledb');

await oracledb.createPool({
    user: dbConfig.user,
    password: dbConfig.password,
    connectString: dbConfig.connectString
});
console.log('Connection pool started');
then any other module that needs a connection can simply do:
const conn = await oracledb.getConnection();
const result = await conn.execute(statement, binds, opts);
await conn.close();
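In practice you would usually wrap that in try/finally so the connection always goes back to the pool even if the query throws; a minimal sketch (runStatement is just an illustrative name, and statement, binds and opts stand for whatever your module passes in):

async function runStatement(statement, binds, opts) {
    let conn;
    try {
        conn = await oracledb.getConnection();   // checks a connection out of the pool
        return await conn.execute(statement, binds, opts);
    } finally {
        if (conn) {
            await conn.close();                  // releases the connection back to the pool
        }
    }
}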
Read up on connection pools, sizing, and threads in the node-oracledb documentation on Connection Pooling.
Also see the series Build REST APIs for Node.js and its code https://github.com/oracle/oracle-db-examples/tree/master/javascript/rest-api. In particular, look at database.js.
My question is: can I have fewer connections open when using RethinkDB? Right now I'm opening a new one every time I want to insert or get some data. I'm afraid that this is not the right thing to do. Is there any way I can open just one and use it instead? I'm using Node.js.
Yes. You can run multiple queries on a connection. That's the recommended way of doing things.
The best way is to use a connection pool. For Node.js, for example, we are using rethinkdb-pool.
I haven't looked into the open-source connection pools for RethinkDB, but I have a Node app that uses RethinkDB and will have a limited number of users, so I save my one connection as a global var and then use it for all queries.
'use strict';
var r = require('rethinkdb');
var rethinkConnect = null;
r.connect(
    {
        host: 'localhost',
        port: 28015,
        password: 'noneya',
    },
    function(err, conn) {
        if (err) {
            console.log(err);
        } else {
            rethinkConnect = conn;
        }
    }
);
Now all queries the Node.js server makes can use this connection. Keep in mind this code is async, so you can't make a query on the very next line after r.connect(). You could, however, use the payload of an inbound socket.io event as the params of a RethinkDB query.
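For illustration, once rethinkConnect has been set, any handler can reuse it; a minimal sketch (the socket.io setup and the 'messages' table are placeholders, not part of the original answer):

// e.g. inside a socket.io handler, reusing the single stored connection
socket.on('newMessage', function(payload) {
    r.table('messages')
        .insert({ text: payload.text, created: new Date() })
        .run(rethinkConnect, function(err, result) {
            if (err) {
                console.log(err);
            }
        });
});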
I'd advise you to use rethinkdbdash, an "advanced Node.js RethinkDB driver" that includes connection pools and some other advanced features. It has many more stars and contributors than rethinkdb-pool.
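For reference, a minimal sketch of what that looks like (host, port and the 'messages' table are placeholders); rethinkdbdash maintains its own connection pool, so run() no longer needs a connection argument:

var r = require('rethinkdbdash')({
    host: 'localhost',
    port: 28015
});

// the driver picks a connection from its pool automatically
r.table('messages').insert({ text: 'hello' }).run()
    .then(function(result) { console.log(result); })
    .catch(function(err) { console.log(err); });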
I've just set up MongoDB and got hold of Mongoose as well, and while I was following a tutorial on connecting to the database I noticed that there's no user/password required by default, which led me to the following question:
Is that a security issue when moving to production? What security measures do I need to take? Would anyone be able to access MongoDB remotely? If so, how do I prevent that?
Indeed, you have stumbled upon a valid question.
Check out this source:
http://mongodb.github.io/node-mongodb-native/contents.html
and write code which does something of this sort:
var Db = require('mongodb').Db,
    Server = require('mongodb').Server,
    MongoClient = require('mongodb').MongoClient,
    BSON = require('mongodb').pure().BSON,
    assert = require('assert');

var db = new Db('integration_tests',
    new Server("127.0.0.1", 27017, {auto_reconnect: false, poolSize: 4}),
    {w: 0, native_parser: false});

// Establish connection to db
db.open(function(err, db) {
    assert.equal(null, err);

    // Add a user to the database
    db.addUser('user', 'name', function(err, result) {
        assert.equal(null, err);

        // Authenticate
        db.authenticate('user', 'name', function(err, result) {
            assert.equal(true, result);
            db.close();
        });
    });
});
Just adding a few more sources you might want to have a look at:
https://docs.mongodb.org/v3.0/administration/security-checklist/
https://docs.mongodb.org/manual/administration/security/
Hope this is a good starting point in your quest for productionizing MongoDB!
Yes, it is problematic if access is possible from the internet (i.e., the database is not firewalled, has a weak password or none at all, or is not bound to localhost only). Attackers might easily be able to access your DB and read all data. This is not just a theoretical threat; see http://www.cso.com.au/article/566040/students-find-40k-unprotected-mongodb-databases-8-million-telco-customer-records-exposed/ for a recent incident.
The mongodb developers provide a security checklist and also provide a security tutorial.
So, at least set a password and at best bind it to localhost only (also problematic if other users also have access to that machine) in order to prevent brute force attacks.
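As a rough sketch of that first step (run in the mongo shell against MongoDB 2.6+, where createUser is available; the user name and password are placeholders), you create an administrative user and then restart mongod with authentication turned on:

// in the mongo shell, before enabling authentication
var admin = db.getSiblingDB('admin');
admin.createUser({
    user: 'siteAdmin',                    // placeholder name
    pwd: 'choose-a-strong-password',      // placeholder password
    roles: [{ role: 'userAdminAnyDatabase', db: 'admin' }]
});
// then restart mongod with --auth (or security.authorization: enabled in mongod.conf),
// ideally bound to 127.0.0.1 only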
Security should be a multi-layered approach.
First of all, in production, I would recommend putting MongoDB on a separate physical machine.
I would restrict access to this machine via the firewall, such that only MongoDB traffic on port 27017 from the webservers can reach the MongoDB machine.
I would only allow ssh access to the MongoDB machine or the webservers from specific IP addresses which need access to them.
I would use only key-based authentication for SSH to both the Webservers and MongoDB machine.
I would completely block the MongoDB machine from accessing the internet, aside from NTP for time synchronization.
Although I feel the above steps are more important, I would enable MongoDB authentication.
This is a weak layer of security, though, since only the webservers should be able to access port 27017 on the MongoDB machine, and anyone who compromised the webserver would be able to extract the MongoDB password from the source code on the server.
If you have multiple MongoDB databases, you can use different authentication credentials for each database, to add a level of isolation between applications.
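For illustration, that per-database isolation could look like this in the mongo shell (the database, user and password values are placeholders):

// one user per application database, limited to that database only
var app1 = db.getSiblingDB('app1');
app1.createUser({
    user: 'app1_user',
    pwd: 'another-strong-password',
    roles: [{ role: 'readWrite', db: 'app1' }]
});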
In short, no, you don't need authentication, but it can add an extra layer of security. The other layers are far more important.
I was wondering if anyone can help me understand what the proper way of maintaining multiple connections to multiple postgres servers via https://github.com/brianc/node-postgres is.
Obviously, when running a Node server for a long duration, we want to keep everything clean with no leaks, so I am wondering what the proper pattern is.
Please remember that my Node server will need to connect to 7-8 Postgres servers.
https://github.com/brianc/node-postgres supports the idea of pools. I am wondering: do I just connect to all servers on initial Node server setup, maintain the open connections, and let each function ask for a pool when it needs to talk to a server?
In other words, am I supposed to call pg.connect every time I make a server query? (minus the var pg and var connectionString which could be global)
Can't I just have a single connection be on and ready?
var pg = require('pg');
var assert = require('assert');
var connectionString = "pg://brian:1234@localhost/postgres";
pg.connect(connectionString, function(err, client, done) {
    client.query('SELECT name FROM users WHERE email = $1', ['brian@example.com'], function(err, result) {
        assert.equal('brianc', result.rows[0].name);
        done();
    });
});
Code snippets are greatly appreciated.
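For what it's worth, a minimal sketch of one common pattern under these constraints: create one Pool per Postgres server at startup in a shared module, export that map, and let each function grab the pool it needs (the server names and connection strings are placeholders, and this uses the Pool API of newer node-postgres versions rather than the older pg.connect shown above):

// pools.js - created once when the Node server starts
var Pool = require('pg').Pool;

var pools = {
    users:   new Pool({ connectionString: 'postgres://user:pass@users-db:5432/users' }),
    billing: new Pool({ connectionString: 'postgres://user:pass@billing-db:5432/billing' })
    // ...one entry per Postgres server (7-8 in this case)
};

module.exports = pools;

// elsewhere in the app:
// var pools = require('./pools');
// pools.users.query('SELECT name FROM users WHERE email = $1', ['brian@example.com'])
//     .then(function(result) { console.log(result.rows); });

Each Pool opens connections lazily up to its max and releases clients back after every query, so nothing needs to be passed around or manually closed during normal operation.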