How do you connect to a localhost PostgreSQL Database? - node.js

I'm getting an authorization error when trying to connect to my postgres database through localhost. I can log in fine with sudo -u postgres psql. When trying to connect through my Express application (using both Sequelize and node-postgres) I get an error.
error:
error: password authentication failed for user "postgres"
.env:
PGUSER="postgres"
PGHOST=""
PGPASSWORD=""
PGDATABASE=""
PGPORT=""
sequelize config:
const sequelize = new Sequelize('postgres://postgres@localhost:5432/test', { ... });
I've edited my pg_hba.conf file as well, to accept all local connections with:
local   all   all   trust
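For reference, a minimal sketch of what the connection code could look like once the .env values are filled in (this assumes dotenv is used to load the file and keeps the database name test from the connection string above; note the @, not #, between the credentials and the host):

require('dotenv').config();

const { Pool } = require('pg');
const { Sequelize } = require('sequelize');

// node-postgres picks up PGUSER, PGHOST, PGPASSWORD, PGDATABASE and PGPORT
// from the environment automatically when no config object is passed.
const pool = new Pool();

// Sequelize takes the same values explicitly, here as a connection URI.
const sequelize = new Sequelize(
  `postgres://${process.env.PGUSER}:${process.env.PGPASSWORD}@${process.env.PGHOST}:${process.env.PGPORT}/${process.env.PGDATABASE}`
);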

Related

Depending on the HOST NAME value for the PostgreSQL database in my .env file I get different results

I have created a Docker container with PostgreSQL and a web server, and my app exposes an API to manage all the endpoints.
The link to my git repo is: https://github.com/JulioValderrama/store-front-project.git
But I am facing problems with the database connection: depending on the Postgres HOST value in my .env file, one of the two servers (a local one running on port 4000 and the web server running in the Docker container on port 3000) works while the other fails.
I have tried the following:
Local server running on port 4000
Docker web server running on port 3000
HOST "127.0.0.1"
Going to:
http://localhost:3000/
It works fine and I get a response. Now, when I try to hit any of my database API endpoints:
http://localhost:3000/products
I get the error:
"Could not get PRODUCTS. Error: Error: connect ECONNREFUSED 127.0.0.1:5432"
Going to:
http://localhost:4000
It works fine and I get a response. Now, when I try to hit any of my database API endpoints:
http://localhost:4000/products
It works! I get the list of all my products!!
HOST "postgres"
I put "postgres" because I read online that you have to set HOST to the name of the Postgres Docker service you created, which is my case. And it works for the remote (Docker) server.
Going to:
http://localhost:3000
It works fine and I get a response. Then, when I try to hit the database API:
http://localhost:3000/products
It works!! It gives me the list of my products!!
Going to:
http://localhost:4000
It works fine and I get a response. Then, when I try to hit the database API:
http://localhost:4000/products
It gives me the error:
"Could not get PRODUCTS. Error: Error: getaddrinfo ENOTFOUND postgres"
So it seems like there is an error when trying to connect to the database, caused by the server or the HOST name. I have no idea....
In docker-compose.yaml you have mapped your machine's port 5432 to the container's port 5432.
If your application runs inside a Docker container and wants to reach Postgres, use postgres (the service name) as the POSTGRES_HOST value.
If you are running the application outside the Docker environment, use 127.0.0.1 (or 0.0.0.0).
Also make sure you haven't installed Postgres locally:
brew uninstall postgres
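A minimal sketch of that idea with node-postgres (the POSTGRES_HOST name comes from the answer above; the other variable names are illustrative):

const { Pool } = require('pg');

// POSTGRES_HOST is 'postgres' when the app runs inside docker-compose
// (Docker's internal DNS resolves the service name) and '127.0.0.1'
// when the app runs directly on the host machine.
const pool = new Pool({
  host: process.env.POSTGRES_HOST,
  port: Number(process.env.POSTGRES_PORT || 5432),
  user: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
});

module.exports = pool;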

Why is the connection to Postgres from Node.js not working?

I have a back-end created in Node.js hosted on a Debian server using Postgres 14.
The Node.js app listens on port 5000, and when I try to connect to Postgres on port 5432 I get the following response from Postman:
"Password is invalid".
I tried the postgres role, and also creating a new role, but the problem is the same.
I installed Postgres on my Windows 10 computer and everything works fine on localhost, but not on the remote server.
My Node.js back-end and Postgres are on the same computer, but none of the roles I've tried will connect to the database.
There is a pg_hba.conf file under /etc/postgresql/14/ on your server; you have to add:
host    <database>    <username>    0.0.0.0/0    md5
This allows connections to that database for that user from any host, using password (md5) authentication.
Also, in postgresql.conf there is a listen_addresses setting; the value should be '*':
listen_addresses = '*'
This generally allows remote connections to the Postgres instance running on that server.
Guide https://coderwall.com/p/cr2a1a/allowing-remote-connections-to-your-postgresql-vps-installation
Don't forget that you have to reload Postgres after a config change (changing listen_addresses requires a full restart):
sudo systemctl reload postgresql
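Once the config is reloaded, a quick way to verify from the Node.js side is a one-off node-postgres connection test (a sketch; the host, role, password and database below are placeholders, not values from the question):

const { Client } = require('pg');

const client = new Client({
  host: 'your-server-ip',   // placeholder
  port: 5432,
  user: 'myrole',           // placeholder role
  password: 'mypassword',   // placeholder password
  database: 'mydb',         // placeholder database
});

client.connect()
  .then(() => client.query('SELECT version()'))
  .then((res) => {
    console.log(res.rows[0].version);
    return client.end();
  })
  .catch((err) => console.error('connection failed:', err.message));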

MongoDB How to fix: Error: couldn't connect to server 197.0.196.205:27017, connection attempt failed: SocketException

I'm trying to connect remotely to a MongoDB server from my local machine, but I'm having some issues.
On the remote server I modified the mongod.cfg file and changed the bindIp from 127.0.0.1 to 0.0.0.0 to allow access. In the same file I changed the security settings by adding Authentication: 'enabled'.
I created an admin user:
> use admin
> db.createUser({user: "root", pwd: "root", roles:["root"]})
I started MongoDB with the --auth flag:
> mongod --auth --port 27017
Once the server was up, I connected to it as administrator:
mongo 127.0.0.1:27017 -u "root" -p "root" --authenticationDatabase "admin"
Once I was connected, I created a normal user:
> use base
> db.createUser({user: "base", pwd: "base", roles:["dbOwner"]})
Then I disconnected from the mongo shell and reconnected with the new user's credentials:
> mongo 127.0.0.1/base -u "base" -p "base"
It worked properly on the remote server.
On the local machine I tried:
> mongo <ip address of the server>/base -u "base" -p "base"
I'm getting this error:
> mongo <ip address>:27017/base -u "base" -p "base"
[js] Error: couldn't connect to server <ip address>:27017, connection attempt failed: SocketException: Error connecting to <ip address>:27017 :: caused by :: Operation timed out :
connect#src/mongo/shell/mongo.js:344:17
#(connect):2:6
exception: connect failed
It could be that your local machine and the remote server use different versions of the MongoDB shell.
Which version of the mongo shell are you using on each?
You would also need to enclose the connection string in double quotes:
mongo "<ip address>:27017/base" -u "base" -p "base".
Another issue could be that you need to whitelist your IP or configure your firewall rules.
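Once port 27017 is reachable, the same credentials can also be checked from Node.js with the official mongodb driver (a sketch; the <ip address> placeholder is kept as in the question):

const { MongoClient } = require('mongodb');

// Same user, password and database as created above; the authentication
// database defaults to the one in the path ('base'), matching the shell login.
const uri = 'mongodb://base:base@<ip address>:27017/base';

async function main() {
  const client = new MongoClient(uri, { serverSelectionTimeoutMS: 5000 });
  try {
    await client.connect();
    console.log('connected');
  } finally {
    await client.close();
  }
}

main().catch((err) => console.error('connect failed:', err.message));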

connection error while connecting to AWS DocumentDB

I am getting the following error while connecting to AWS DocumentDB from Node.js:
connection error: { [MongoNetworkError: connection 1 to
docdb-2019-01-28-06-57-37.cluster-cqy6h2ypc0dj.us-east-1.docdb.amazonaws.com:27017
timed out] name: 'MongoNetworkError', errorLabels: [
'TransientTransactionError' ] }
Here is my Node.js file:
app.js
var mongoose = require('mongoose');
mongoose.connect('mongodb://abhishek:abhishek@docdb-2019-01-28-06-57-37.cluster-cqy6h2ypc0dj.us-east-1.docdb.amazonaws.com:27017/?ssl_ca_certs=rds-combined-ca-bundle.pem&replicaSet=rs0', {
  useNewUrlParser: true
});
var db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error:'));
db.once('open', function() {
  console.log("connected...");
});
By default, AWS DocumentDB is designed to accept connections only from within the same VPC.
So to connect a Node.js application from an EC2 instance in the same VPC, you need the PEM file, since SSL is enabled by default when the DB instance is created.
Step 1: run $ wget https://s3.amazonaws.com/rds-downloads/rds-combined-ca-bundle.pem in the required directory.
Step 2: change the mongoose connection to pass options pointing to the PEM file:
const fs = require('fs'); // needed for reading the CA bundle

mongoose.connect(database.url, {
  useNewUrlParser: true,
  ssl: true,
  sslValidate: false,
  sslCA: fs.readFileSync('./rds-combined-ca-bundle.pem')
})
  .then(() => console.log('Connection to DB successful'))
  .catch((err) => console.error(err, 'Error'));
Here I am using mongoose 5.4.0.
To connect from outside the VPC, please try to follow the doc below from AWS:
https://docs.aws.amazon.com/documentdb/latest/developerguide/connect-from-outside-a-vpc.html
Personally, I only tried connecting from within the VPC, and it worked fine.
Update:
To connect from Robo 3T outside the VPC, please follow this link:
AWS DocumentDB with Robo 3T (Robomongo)
Using AWS DocumentDB outside the VPC, for example from a development EC2 server or from your local machine, will give a connection error unless you use SSH tunneling or port forwarding.
SSH tunneling is simple. Use this command on your local machine:
ssh -i "ec2Access.pem" -L 27017:sample-cluster.node.us-east-1.docdb.amazonaws.com:27017 ubuntu@EC2-Host -N
In your application configuration use:
{
  uri: 'mongodb://:@127.0.0.1:27017/Db',
  useNewUrlParser: true,
  useUnifiedTopology: true,
  directConnection: true
}
Just make sure the tunneling EC2 instance itself can connect to the database.
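A sketch of how that configuration plugs into mongoose once the tunnel is up (<user> and <password> are the DocumentDB credentials, left blank in the snippet above; if TLS is enabled on the cluster, which is the default, the ssl/sslCA options from the earlier example are also needed):

const mongoose = require('mongoose');

// 127.0.0.1:27017 is the local end of the SSH tunnel opened above.
mongoose.connect('mongodb://<user>:<password>@127.0.0.1:27017/Db', {
  useNewUrlParser: true,
  useUnifiedTopology: true,
  directConnection: true
})
  .then(() => console.log('connected through the tunnel'))
  .catch((err) => console.error(err));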
If you decide to use port forwarding instead, the steps are:
0- In the EC2 security group, add an inbound rule for custom TCP on port 27017 from any source.
1- Go to your EC2 instance and install HAProxy:
$ sudo apt install haproxy
2- Edit the HAProxy configuration:
$ sudo nano /etc/haproxy/haproxy.cfg
3- At the end of the file add:
listen mongo
    bind 0.0.0.0:27017
    timeout connect 10s
    timeout client 1m
    timeout server 1m
    mode tcp
    server AWSmongo <database-host-url>:27017
4- Now restart HAProxy:
$ sudo service haproxy restart
5- Now you can access your database using:
{uri: 'mongodb://<database-user>:<database-pass>@<EC2-IP>:27017/<db>'}
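Before pointing the application at it, the forwarding can be checked with a short script using the official mongodb driver (a sketch; <database-user>, <database-pass>, <EC2-IP> and <db> are the same placeholders as above, and TLS options may also be required since DocumentDB enables TLS by default):

const { MongoClient } = require('mongodb');

const uri = 'mongodb://<database-user>:<database-pass>@<EC2-IP>:27017/<db>';

async function ping() {
  const client = new MongoClient(uri, { serverSelectionTimeoutMS: 5000 });
  try {
    await client.connect();
    await client.db('admin').command({ ping: 1 });
    console.log('HAProxy forwarding works');
  } finally {
    await client.close();
  }
}

ping().catch((err) => console.error(err.message));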

Failed to connect to MongoDB from shell / Robomongo, but connected via Node.js code

I have tried to connect to MongoDB from the shell using the following command:
mongo xxxxx.mongolab.com:47612/dbname -u user -p password
But I am getting the error: Error: 18 Authentication failed.
I get the same authentication failure when I try to connect from Robomongo.
But when I try to connect to MongoDB from my Node.js code, it works perfectly.
Node.js code:
mongoose.connect('mongodb://user:password@xxxxx.mongolab.com:47612/dbname',
  function (err) {
    if (err) throw err;
    console.log("db connected");
  });
I also tried changing the bind IP in mongod.conf from 127.0.0.1 to 0.0.0.0.
The issue still exists.
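One thing worth checking, since the Node.js URI above works: mongo shells from 3.4.6 onward accept the same connection-string form directly, which rules out differences in how the user, password and authentication database are passed on the command line (a hedged suggestion, not from the original thread; special characters in the password must be URL-encoded in this form):

mongo "mongodb://user:password@xxxxx.mongolab.com:47612/dbname"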
