GCP Postgres refusing connection from App Engine Node.js

I am following the tutorial for Strapi on GCP App Engine (Node.js standard environment) and am unable to get the app to start, because the connection is refused by the GCP Postgres instance (public IP): Error: connect ECONNREFUSED 127.0.0.1:5432.
Why I'm confused
GCP service account permissions: <project_name>@appspot.gserviceaccount.com (the App Engine default service account) has the Cloud SQL Client role, so this should apply to all App Engine services.
I have other App Engine services (Python) connecting successfully to other Postgres databases. This tells me I have the correct permissions, the Cloud SQL Admin API enabled, and the correct username/password.
The code works locally (Docker) when linked to the GCP Postgres database, but only with TCP routing, not with a Unix-socket SQL proxy:
../../cloud_sql_proxy -instances=<project_name>:europe-west1:<sql_instance_name>=tcp:5432 & (sleep 5 && yarn strapi start)
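(The Unix-socket variant that fails for me looks roughly like this; the -dir flag is the proxy's documented way to choose the socket directory, and the local /cloudsql path is my assumption:)
sudo mkdir -p /cloudsql && sudo chown $USER /cloudsql
../../cloud_sql_proxy -dir=/cloudsql -instances=<project_name>:europe-west1:<sql_instance_name> & (sleep 5 && yarn strapi start)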
I can login to the locally hosted Strapi app, add users, etc. and the changes are reflected in the GCP Postgres database.
The only difference between the local deployment (docker-compose.yml) and App Engine (app.yml) is how I set the environment variables.
# Dockerfile
FROM node:14-buster

# Install the Google Cloud SDK (used for the Cloud SQL proxy workflow)
RUN echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] http://packages.cloud.google.com/apt cloud-sdk main" | tee -a /etc/apt/sources.list.d/google-cloud-sdk.list \
    && curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key --keyring /usr/share/keyrings/cloud.google.gpg add - \
    && apt-get update -y \
    && apt-get install google-cloud-sdk -y

# Download the Cloud SQL proxy binary and make it executable
RUN wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy && chmod +x cloud_sql_proxy
# docker-compose.yml
version: "3.8"
services:
  dev:
    build: .
    ports:
      - "1337:1337"
    volumes:
      - .:/src
    command: ["yarn", "run", "start"]
    working_dir: /src
    environment:
      NODE_ENV: "production"
      DATABASE_NAME: '<database name>'
      DATABASE_USERNAME: '<username>'
      DATABASE_PASSWORD: '<password>'
      INSTANCE_CONNECTION_NAME: '<project_name>:europe-west1:<instance_name>'
# app.yml
runtime: nodejs14
instance_class: F2
service: strapi
env_variables:
  HOST: '0.0.0.0'
  NODE_ENV: 'local'
  DATABASE_NAME: '<database name>'
  DATABASE_USERNAME: '<username>'
  DATABASE_PASSWORD: '<password>'
  INSTANCE_CONNECTION_NAME: '<project_name>:europe-west1:<instance_name>'
beta_settings:
  cloud_sql_instances: '<project_name>:europe-west1:<instance_name>'
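As I understand it, with beta_settings.cloud_sql_instances set, App Engine standard should expose the instance as a Unix socket under /cloudsql/<INSTANCE_CONNECTION_NAME>. A throwaway debugging snippet (my own, not from the tutorial) to confirm the socket directory exists at startup:
// Temporary sanity check: log whether the Cloud SQL socket directory exists.
const fs = require('fs');
const socketDir = `/cloudsql/${process.env.INSTANCE_CONNECTION_NAME}`;
console.log(`${socketDir} exists:`, fs.existsSync(socketDir));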
The code that defines the connection in the Node.js project, from the Strapi tutorial:
module.exports = ({ env }) => ({
  defaultConnection: 'default',
  connections: {
    default: {
      connector: 'bookshelf',
      settings: {
        client: 'postgres',
        socketPath: `/cloudsql/${env('INSTANCE_CONNECTION_NAME')}`,
        database: env('DATABASE_NAME'),
        username: env('DATABASE_USERNAME'),
        password: env('DATABASE_PASSWORD'),
      },
      options: {},
    },
  },
});
What have I missed? What else can I check? Someone please help me end this insanity.

What fixed it for me was the following:
Go to the App Engine default service account and give it the following roles (as described here):
Cloud SQL Client
Cloud SQL Editor
Cloud SQL Admin
Change the socketPath key to host in the default connection settings:
module.exports = ({ env }) => ({
  defaultConnection: 'default',
  connections: {
    default: {
      connector: 'bookshelf',
      settings: {
        client: 'postgres',
        host: `/cloudsql/${env('INSTANCE_CONNECTION_NAME')}`, // <-- was socketPath
        database: env('DATABASE_NAME'),
        username: env('DATABASE_USERNAME'),
        password: env('DATABASE_PASSWORD'),
      },
      options: {},
    },
  },
});
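As far as I can tell, this works because knex/pg ignore the unrecognized socketPath key and fall back to 127.0.0.1:5432 (exactly the address in the original error), while the pg driver treats a host value that begins with / as a Unix-socket directory. A minimal sketch with node-postgres directly, assuming the same environment variables:
// Connect over the App Engine Unix socket; pg appends /.s.PGSQL.5432 to the host path.
const { Pool } = require('pg');
const pool = new Pool({
  host: `/cloudsql/${process.env.INSTANCE_CONNECTION_NAME}`,
  database: process.env.DATABASE_NAME,
  user: process.env.DATABASE_USERNAME,
  password: process.env.DATABASE_PASSWORD,
});
pool.query('SELECT NOW()').then(res => console.log(res.rows[0]));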

Related

Nodejs cannot connect to Postgresdb container

I am currently working on an Angular app with a REST API (Express, Node.js) and PostgreSQL. Everything worked well when hosted on my local machine. After testing, I moved the images to an Ubuntu server so the app can be hosted on an external port. I can access the Angular frontend at https://serveripaddress:80, but when trying to log in, the API does not connect to PostgreSQL. I get the error ERR_CONNECTION_REFUSED. Here is my docker-compose file:
version: '3.0'
services:
  db:
    image: postgres:9.6-alpine
    environment:
      POSTGRES_DB: myDatabase
      POSTGRES_PASSWORD: myPwd
      POSTGRES_PORT: 5432
      POSTGRES_HOST: db
    ports:
      - 5434:5432
    restart: always
    volumes:
      - ./postgres-data:/var/lib/postgresql/data
  backend: # name of the second service
    image: myid/nodeapi
    ports:
      - 3000:3000
    environment:
      POSTGRES_DB: myDatabase
      POSTGRES_PASSWORD: myPwd
      POSTGRES_PORT: 5432
      POSTGRES_HOST: db
    depends_on:
      - db
    command: bash -c "sleep 20 && node server.js"
  myapp-portal:
    image: myId/angular-app
    ports:
      - "80:80"
    depends_on:
      - backend
volumes:
  postgres-data:
The code to connect to the database:
const { Client } = require('pg')
const client = new Client({
  database: process.env.POSTGRES_DB,
  user: 'postgres',
  password: process.env.POSTGRES_PASSWORD,
  host: process.env.POSTGRES_HOST,
  port: process.env.POSTGRES_PORT
})
client.connect()
  .then(() => {
    console.log("db connected");
  })
  .catch(err => console.error("db connection error:", err.stack)); // surface failures instead of swallowing them
and the docker-compose log for the backend:
backend_1 | db connected
When I exec into the database container and connect with psql, I see that my database is created (I loaded it manually with pg_dump) with all the tables and data. My guess is that Node.js is connecting to the default Postgres database created at installation time. I had the same issue on my local machine, but I resolved it by creating a new server group in pgAdmin 4 and creating a new db on port 5434. I prefer not to do this on the server, as it defeats the purpose of Docker. Another thought is that perhaps Node.js attempts to connect to the database before it is up; that is why I added 'sleep 20', which worked on my local machine. Any thoughts on how I can fix this? TIA!
If you want to wait for the availability of a host and TCP port, you can use this script: https://github.com/vishnubob/wait-for-it
In your Dockerfile, copy the script into the container and make it executable:
COPY wait-for-it.sh .
RUN chmod +x wait-for-it.sh
Then, in your docker-compose file, run the script in the service that should wait:
entrypoint: bash -c "./wait-for-it.sh --timeout=0 service_name:service_port && node server.js"
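Applied to the compose file above, that would be something like this (assuming wait-for-it.sh is baked into the myid/nodeapi image; db:5432 is the service name and container-internal port, not the published 5434):
backend:
  image: myid/nodeapi
  depends_on:
    - db
  entrypoint: bash -c "./wait-for-it.sh --timeout=0 db:5432 && node server.js"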

How to connect a node docker container with postgres docker container

I have a CRUD app working on Node, on my local machine. It runs on Node, with Postgres as the database, using knex.js as a query builder, etc.
I have created a Dockerfile and a docker-compose file, and the containers start, but the Node container can't reach the Postgres container. I suspect it has to do with the environment variables, but I am not sure. Here is my Dockerfile:
FROM node:12
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm ci --only=production
# Bundle app source
COPY . .
ENV PORT=8080
EXPOSE 8080
CMD [ "npm", "start" ]
This is the docker-compose file:
version: '2'
services:
  postgres:
    image: postgres:alpine
    environment:
      POSTGRES_PASSWORD: password
      POSTGRES_USER: app
      POSTGRES_DB: db
  app:
    build: .
    depends_on:
      - "postgres"
    links:
      - "postgres"
    environment:
      DB_PASSWORD: 'password'
      DB_USER: 'app'
      DB_NAME: 'db'
      DB_HOST: 'postgres'
      PORT: 8080
    ports:
      - '8080:8080'
    command: npm start
Also, here is the knex.js file at the project root that handles the db connections based on the environment:
// Update with your config settings.
module.exports = {
  development: {
    client: 'pg',
    connection: 'postgres://localhost/db'
  },
  test: {
    client: 'pg',
    connection: 'postgres://localhost/test-db'
  }
};
Additionally, when I check the hosts file of the Node app inside Docker, I don't see anything mentioning the link to the postgres container. Any help would be appreciated, thanks.
Your Node application is not connecting because it is trying to connect to itself: you are referencing localhost. Your database is in a second container, which is not local, so you need to reference it by service name, which is postgres.
So, assuming your application handles authentication another way, your config would be something like this:
// Update with your config settings.
module.exports = {
  development: {
    client: 'pg',
    connection: 'postgres://postgres/db'
  },
  test: {
    client: 'pg',
    connection: 'postgres://postgres/test-db'
  }
};
If you can, you should use the environment variables you assigned to the app container.
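For example, a hedged sketch of a knexfile built from the variables already defined on the app service (DB_HOST, DB_USER, DB_PASSWORD, DB_NAME):
// Update with your config settings.
module.exports = {
  development: {
    client: 'pg',
    connection: {
      host: process.env.DB_HOST || 'postgres', // service name, not localhost
      user: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      database: process.env.DB_NAME
    }
  }
};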
docker-compose creates an internal network shared by the containers it launches.
Since app and postgres are two separate containers, they are considered two hosts. This causes app to look for postgres on its own container when you point it at localhost instead of at the postgres container.
You can solve this by simply replacing localhost with postgres in your knex.js file.

Create default mongoDB user with access to all databases

I am trying to create a local dev environment with Node.js (node:latest) and MongoDB (mongo:latest) on Docker, and I use Mongoose to connect to MongoDB. I would like to set up a very simple environment.
My docker-compose.yml
version: '3'
services:
  server:
    build: .
    container_name: cms_webserver
    ports:
      - "4000:4000"
    volumes:
      - ./api:/app/api
      - ./node_modules:/app/node_modules
    links:
      - mongo
  mongo:
    container_name: cms_mongo
    image: mongo
    environment:
      - MONGO_INITDB_ROOT_USERNAME=root
      - MONGO_INITDB_ROOT_PASSWORD=root
      - MONGO_INITDB_DATABASE=cms
    ports:
      - "27017:27017"
    volumes:
      - ./docker/mongod.conf:/etc/mongod.conf
      - ./data:/data/db
  adminmongo:
    image: mrvautin/adminmongo
    ports:
      - "1234:1234"
mongod.conf
systemLog:
  destination: file
  logAppend: true
  path: /var/log/mongodb/mongod.log
storage:
  dbPath: /var/lib/mongo
  journal:
    enabled: true
processManagement:
  fork: true # fork and run in background
  pidFilePath: /var/run/mongodb/mongod.pid # location of pidfile
  timeZoneInfo: /usr/share/zoneinfo
security:
  authorization: disabled
net:
  port: 27017
  bindIp: 127.0.0.1 # Enter 0.0.0.0,:: to bind to all IPv4 and IPv6 addresses or, alternatively, use the net.bindIpAll setting.
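(I am also not sure the mounted config is even read: as far as I know, the official mongo image does not load /etc/mongod.conf unless it is passed explicitly, so a command override along these lines would be needed; note that fork: true would also have to go, since a forking mongod makes the container exit.)
mongo:
  container_name: cms_mongo
  image: mongo
  # hypothetical override so mongod actually loads the mounted config
  command: ["mongod", "--config", "/etc/mongod.conf"]
  volumes:
    - ./docker/mongod.conf:/etc/mongod.conf
    - ./data:/data/db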
Whenever I rebuild the containers I remove the contents of data/. Each time I can see that the new user is added successfully:
Successfully added user: {
  "user" : "root",
  "roles" : [
    {
      "role" : "root",
      "db" : "admin"
    }
  ]
}
When I connect to MongoDB from Node I use
mongoose.connect('mongodb://mongo')
and get a success message.
When I try to connect to the default DB cms with
mongoose.connect('mongodb://root:root@mongo/cms')
I get an error:
SCRAM-SHA-1 authentication failed for root on cms from client 172.18.0.4:59142 ; UserNotFound: Could not find user root@cms
My first question is: how can I disable security? I found the option security: authorization: disabled, but even with it set I still get the problem.
The second question is in the topic title.
Run docker ps and check which container belongs to mongo, then use docker exec -it XXX bash to log into it. With the plain mongo command I can open a shell. I can also open one with
mongo -u root -proot admin, but not with mongo -u root -p admin (mongo does not ask me for a password).
As admin (root/root) I can create a new DB by running use cms; I can also add a record and check that the DB exists.
I can create a new user, say:
db.createUser({user: "test", pwd: "test", roles: [{role: "dbOwner", db: "cms"}]});
When I exit and try again to log in to the cms DB I created, I get an error:
MongoDB shell version v3.6.3
connecting to: mongodb://127.0.0.1:27017/cms
MongoDB server version: 3.6.3
2018-04-03T12:13:41.428+0000 E QUERY [thread1] Error: Authentication failed. :
DB.prototype._authOrThrow@src/mongo/shell/db.js:1608:20
@(auth):6:1
@(auth):1:2
exception: login failed
Is there anyone who knows what's going on and could explain to me what I am doing wrong, or even give me ready-to-use examples of how I can set up MongoDB locally? I am not very strong in DevOps and I am totally lost.
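A likely culprit, given the UserNotFound: Could not find user root@cms error: MONGO_INITDB_ROOT_USERNAME creates the root user in the admin database, so authenticating against cms fails unless the connection names admin as the authentication source. A hedged sketch:
// Authenticate root against the admin database (where the init scripts create it)
// while still selecting cms as the working database.
mongoose.connect('mongodb://root:root@mongo/cms?authSource=admin');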

Link nodejs app to Rethinkdb from another container

I made two containers, one for RethinkDB and one for a Node.js app.
I want to connect my Node.js app to this RethinkDB, but every time I try I get an error:
Error:{"message":"Failed to connect to localhost:58015\nFull error:\n{\"code\":\"ECONNREFUSED\"
But I can connect the same Node.js app, running without Docker, to the RethinkDB via the published port (58015).
My Docker Compose config looks like this:
# Rethink DB
rethink:
  build: docker/rethinkdb
  container_name: rethink
  ports:
    - 58080:8080
    - 58015:28015
    - 59015:29015

# NodeJS
nodejs:
  build: docker/nodejs
  container_name: nodejs
  ports:
    - 53000:3000
    - 55000:5000
  depends_on:
    - rethink
To connect my app to the db I set the host and port inside a JS config file
database: {
  servers: [
    {
      host: process.env.DB_PORT_28015_TCP_ADDR || 'localhost',
      port: process.env.DB_PORT_28015_TCP_PORT || 28015
    }
  ],
  name: 'atlas'
},
I tried with the RethinkDB port (28015) and with my published port (58015), without success.
I also tried to link the two containers with links and network_mode, without success.
None of the solutions I tried worked.
I think my RethinkDB container may not be ready when the Node.js app tries to connect, but I really don't understand the problem if that is not it.
The Node.js app is running with pm2.
How can I make this app connect to my db?
For your config, you should use:
# Rethink DB
rethink:
  build: docker/rethinkdb
  container_name: rethink
  ports:
    - 58080:8080
    - 58015:28015
    - 59015:29015

# NodeJS
nodejs:
  build: docker/nodejs
  container_name: nodejs
  ports:
    - 53000:3000
    - 55000:5000
  links:
    - rethink
  depends_on:
    - rethink
and in the JS code:
database: {
  servers: [
    {
      host: process.env.DB_PORT_28015_TCP_ADDR || 'rethink',
      port: process.env.DB_PORT_28015_TCP_PORT || 28015
    }
  ],
  name: 'atlas'
},
As far as I know, one Docker container will not see the other unless specifically linked and using the same net:
docker run \
--name ${NEWAPP} \
--restart=always \
--env MYAPPPAR=${PROJ} \
-v /var/log/docker/node/logs:/usr/src/app/log \
--link myapp_rethink_1:myapp_rethink_1 \
--net myapp_default \
-p ${PORT}:9000 \
-d ${NEWAPP}
So you need both --net and --link:
--link format is sourcecontainername:containeraliasname
--net so that containers can find each other via internal DNS / container name. You can check networks with 'docker network ls'.
When using newer versions of docker-compose, your services are configured to run on the same network.
The top-level service name in your docker-compose.yml becomes the hostname to specify when connecting to RethinkDB from your app:
# docker-compose.yml
version: '3.2'
services:
  web:
    build: .
    links:
      - db
    ...
  db:
    image: rethinkdb
    ...
Using the example above, you can connect to RethinkDB by using the host named 'db', within the app configuration file:
module.exports = {
  rethinkdb: {
    host: 'db',
    port: 28015
  }
};
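A hedged usage sketch with the official rethinkdb driver, assuming the config above is exported from a module at ./config (that path is hypothetical):
const r = require('rethinkdb');
const config = require('./config'); // hypothetical path to the module above

// Connect via the compose service name; falls back to an error log on failure.
r.connect({ host: config.rethinkdb.host, port: config.rethinkdb.port })
  .then(conn => {
    console.log('connected to RethinkDB');
    return conn.close();
  })
  .catch(err => console.error('connection failed:', err.message));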
