Docker: Unhandled Promise rejection when trying to access mongoDB using docker - node.js

So I was trying to deploy my very basic Node.js application with Docker on a remote Linux server (a DigitalOcean droplet), following this tutorial. I'm pretty new to deployment, so I'm surely making some basic error. Please bear with me for the long post.
The project code and the Docker/docker-compose config files I was using:
.env
MONGO_USERNAME=sammy
MONGO_PASSWORD=password
MONGO_DB=sharkinfo
PORT=8080
MONGO_PORT=27017
MONGO_HOSTNAME=127.0.0.1
Dockerfile
FROM node:10
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "node", "src/index.js"]
docker-compose.yml
version: '3'
services:
  nodejs:
    build: .
    env_file: .env
    environment:
      - MONGO_USERNAME=$MONGO_USERNAME
      - MONGO_PASSWORD=$MONGO_PASSWORD
      - MONGO_HOSTNAME=myDB
      - MONGO_PORT=$MONGO_PORT
      - MONGO_DB=$MONGO_DB
    ports:
      - "80:8080"
    networks:
      - app-network
  myDB:
    image: mongo:4.1.8-xenial
    env_file: .env
    environment:
      - MONGO_INITDB_ROOT_USERNAME=$MONGO_USERNAME
      - MONGO_INITDB_ROOT_PASSWORD=$MONGO_PASSWORD
    volumes:
      - dbdata:/data/db
    networks:
      - app-network
networks:
  app-network:
    driver: bridge
volumes:
  dbdata:
mongoose.js
const mongoose = require('mongoose');
const {
  MONGO_USERNAME,
  MONGO_PASSWORD,
  MONGO_HOSTNAME,
  MONGO_PORT,
  MONGO_DB
} = process.env;
const url = `mongodb://${MONGO_USERNAME}:${MONGO_PASSWORD}@${MONGO_HOSTNAME}:${MONGO_PORT}/${MONGO_DB}?authSource=admin`;
mongoose.connect(url, {
  useNewUrlParser: true,
  useCreateIndex: true,
  useFindAndModify: false
});
Then I built and started the containers with the commands (in order) docker-compose build and docker-compose up -d.
Both the Node and MongoDB containers show up in the list of running Docker processes. However, when I run docker logs node-application, I get an error:
Server is up on port 8080
(node:1) UnhandledPromiseRejectionWarning: MongoNetworkError: failed to connect to server [mydb:27017] on first connect [MongoNetworkError: connect ECONNREFUSED 172.19.0.3:27017]
at Pool.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/topologies/server.js:564:11)
at Pool.emit (events.js:198:13)
at Connection.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/connection/pool.js:317:12)
at Object.onceWrapper (events.js:286:20)
at Connection.emit (events.js:198:13)
at Socket.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/connection/connection.js:246:50)
at Object.onceWrapper (events.js:286:20)
at Socket.emit (events.js:198:13)
at emitErrorNT (internal/streams/destroy.js:91:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:59:3)
at process._tickCallback (internal/process/next_tick.js:63:19)
(node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:1) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
I've tried various tweaks to my code with no success and it continues to give the same error. What could be the issue here?
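The UnhandledPromiseRejectionWarning itself comes from calling mongoose.connect without a .catch, and the ECONNREFUSED usually means the app tried to connect before mongod was ready. One way to both surface the error and ride out a slow-starting database container is a small retry wrapper. This is a sketch with hypothetical helper names, not code from the tutorial:

```javascript
// Hypothetical helper: retries an async connect function a few times
// before giving up, so a slow-starting mongo container doesn't crash the app.
function connectWithRetry(connectFn, retries, delayMs) {
  return connectFn().catch((err) => {
    if (retries <= 0) throw err;
    console.log(`Connection failed (${err.message}); retrying in ${delayMs} ms`);
    return new Promise((resolve) =>
      setTimeout(() => resolve(connectWithRetry(connectFn, retries - 1, delayMs)), delayMs)
    );
  });
}

// Usage with the mongoose.js above would look roughly like:
// connectWithRetry(() => mongoose.connect(url, { useNewUrlParser: true }), 5, 2000)
//   .catch((err) => { console.error('Giving up:', err); process.exit(1); });
```

Even when the compose setup is correct, a guard like this makes startup ordering a non-issue.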

Related

Can not connect docker hosted SSMS from containerized GraphQL api

I am using a SQL Server image as the DB and trying to connect to it from a GraphQL API written with TypeORM. If I run docker-compose up -d, the DB runs fine (I can connect to it from local SSMS), but the API is not able to connect to it.
Dockerfile:
FROM node:14
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
COPY package*.json ./
RUN npm install
# Copy app source code
COPY . .
#Expose port and start application
EXPOSE 4000
CMD [ "npm", "run", "dev" ]
Docker-Compose File:
version: "2"
services:
  server:
    build: .
    ports:
      - "4000:4000"
    depends_on:
      - plhdashboardb
  plhdashboardb:
    container_name: plhdashboardb
    image: mcr.microsoft.com/mssql/server:2019-latest
    ports:
      - "1434:1433"
    environment:
      SA_PASSWORD: "I_AM_mr_React"
      ACCEPT_EULA: "Y"
I build it with docker build -t server . and run it with docker-compose up -d.
If I open http://localhost:4000/graphql, it never loads, but the server can be reached from local SSMS.
ormconfig.json:
{
  "type": "mssql",
  "host": "plhdashboardb",
  "username": "sa",
  "password": "I_AM_mr_React",
  "database": "plhdashboardb_Dev_Mock",
  "synchronize": true,
  "autoSchemaSync": true,
  "logging": false,
  "options": {
    "encrypt": true,
    "enableArithAbort": true
  },
  "entities": [
    "src/entity/**/*.ts"
  ],
  "migrations": [
    "src/migration/**/*.ts"
  ],
  "subscribers": [
    "src/subscriber/**/*.ts"
  ],
  "cli": {
    "entitiesDir": "src/entity",
    "migrationsDir": "src/migration",
    "subscribersDir": "src/subscriber"
  },
  "seeds": [
    "src/database/seeds/**/*{.ts,.js}"
  ],
  "factories": [
    "src/database/factories/**/*{.ts,.js}"
  ]
}
If I enter the container with docker exec -it server_server_1 bash and run npm run dev, I get this error:
(node:167) UnhandledPromiseRejectionWarning: ConnectionError: Failed to connect to localhost:1434 - Could not connect (sequence)
at Connection.<anonymous> (/usr/src/app/node_modules/mssql/lib/tedious/connection-pool.js:68:17)
at Object.onceWrapper (events.js:422:26)
at Connection.emit (events.js:315:20)
at Connection.EventEmitter.emit (domain.js:467:12)
at Connection.socketError (/usr/src/app/node_modules/mssql/node_modules/tedious/lib/connection.js:1290:12)
at /usr/src/app/node_modules/mssql/node_modules/tedious/lib/connection.js:1116:21
at SequentialConnectionStrategy.connect (/usr/src/app/node_modules/mssql/node_modules/tedious/lib/connector.js:87:14)
at Socket.onError (/usr/src/app/node_modules/mssql/node_modules/tedious/lib/connector.js:100:12)
at Socket.emit (events.js:315:20)
at Socket.EventEmitter.emit (domain.js:467:12)
(Use `node --trace-warnings ...` to show where the warning was created)
(node:167) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:167) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
[nodemon] clean exit - waiting for changes before restart
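One detail the trace shows: from inside the container, tedious is dialing localhost:1434, which is the host-mapped port, not the database container. Within the compose network the API should address the service by name on the container port 1433; the "1434:1433" mapping only matters for clients on the host, such as local SSMS. If some other configuration (for example an explicit createConnection call) is overriding ormconfig.json, the effective options should look like this sketch (changed keys only; 1433 is SQL Server's in-network default):

```json
{
  "type": "mssql",
  "host": "plhdashboardb",
  "port": 1433
}
```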

Connecting database (MongoDB) and backend (nodejs) running in docker containers [duplicate]

This question already has answers here:
MongoDB on with Docker "failed to connect to server [localhost:27017] on first connect "
(4 answers)
Closed 2 years ago.
First, I looked through several discussions of similar problems, and it still did not work.
I have a MongoDB docker container running, and I did port forwarding with the -p flag.
To be exact, this is the command I ran:
sudo docker run -t -d -p 27017:27017 --name mongo mongo-0000
docker ps shows container running
9d9040a7bd66 mongo-0000 "docker-entrypoint.s…" 4 minutes ago Up 4 minutes 0.0.0.0:27017->27017/tcp mongo
As suggested in another thread, I changed the mongodb bindIp from 127.0.0.1 to 0.0.0.0 (I tried both ways).
Then I try to start the backend container with a Node.js Express app. I have had the two working fine together on a VM, but not on Docker yet.
And I get the following error:
sudo docker run conduit-backend
Listening on port 3000
/ConduitReactApp/src/node_modules/mongodb/lib/server.js:261
process.nextTick(function() { throw err; })
^
Error [MongoError]: failed to connect to server [localhost:27017] on first connect
at Pool.<anonymous> (/ConduitReactApp/src/node_modules/mongodb-core/lib/topologies/server.js:313:35)
at Pool.emit (node:events:378:20)
at Connection.<anonymous> (/ConduitReactApp/src/node_modules/mongodb-core/lib/connection/pool.js:260:12)
at Object.onceWrapper (node:events:485:26)
at Connection.emit (node:events:378:20)
at Socket.<anonymous> (/ConduitReactApp/src/node_modules/mongodb-core/lib/connection/connection.js:162:49)
at Object.onceWrapper (node:events:485:26)
at Socket.emit (node:events:378:20)
at emitErrorNT (node:internal/streams/destroy:188:8)
at emitErrorCloseNT (node:internal/streams/destroy:153:3)
at processTicksAndRejections (node:internal/process/task_queues:81:21)
Emitted 'error' event on NativeConnection instance at:
at /ConduitReactApp/src/node_modules/mongoose/lib/connection.js:288:17
at NativeConnection.Connection.error (/ConduitReactApp/src/node_modules/mongoose/lib/connection.js:489:12)
at /ConduitReactApp/src/node_modules/mongoose/lib/connection.js:520:15
at /ConduitReactApp/src/node_modules/mongoose/lib/drivers/node-mongodb-native/connection.js:69:21
at /ConduitReactApp/src/node_modules/mongodb/lib/db.js:229:14
at Server.<anonymous> (/ConduitReactApp/src/node_modules/mongodb/lib/server.js:259:9)
at Object.onceWrapper (node:events:485:26)
at Server.emit (node:events:378:20)
at Pool.<anonymous> (/ConduitReactApp/src/node_modules/mongodb-core/lib/topologies/server.js:313:21)
at Pool.emit (node:events:378:20)
[... lines matching original stack trace ...]
at Socket.emit (node:events:378:20)
Also, inside app.js (in the backend app), the MongoDB connection code reads:
if (isProduction) {
  mongoose.connect(process.env.MONGODB_URI);
} else {
  mongoose.connect('mongodb://localhost/conduit');
  mongoose.set('debug', true);
}
What is still wrong here?
Firstly, you should run your mongodb image, and after that link the mongo container to the app container when you run it. For example:
docker run -p xxxx:xxxx --link mongo:mongo <image-name>
And to connect to mongodb, you should use a connection string like this:
'mongodb://mongo:27017/<db-name>'
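--link still works but is a legacy Docker feature; the modern equivalent is a user-defined network, which docker-compose creates by default. A hypothetical compose file for the two containers above:

```yaml
version: '3'
services:
  backend:
    image: conduit-backend
    ports:
      - "3000:3000"
    depends_on:
      - mongo
  mongo:
    image: mongo-0000
```

On the default network each service is reachable by its service name, so the same 'mongodb://mongo:27017/<db-name>' connection string works unchanged.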

MongoDB & nodejs with Docker "MongoTimeoutError: Server selection timed out after 30000 ms"

I am using MongoDB with a Node.js backend. I have a problem, and the docker logs for it are below.
{ MongoTimeoutError: Server selection timed out after 30000 ms
at Timeout.setTimeout [as _onTimeout] (/usr/src/app/node_modules/mongodb/lib/core/sdam/topology.js:897:9)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
name: 'MongoTimeoutError',
reason:
{ Error: connect ECONNREFUSED 127.0.0.1:27017
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1113:14)
name: 'MongoNetworkError',
errorLabels: [ 'TransientTransactionError' ],
[Symbol(mongoErrorContextSymbol)]: {} },
[Symbol(mongoErrorContextSymbol)]: {} }
(node:1) UnhandledPromiseRejectionWarning: MongoTimeoutError: Server selection timed out after 30000 ms
at Timeout.setTimeout [as _onTimeout] (/usr/src/app/node_modules/mongodb/lib/core/sdam/topology.js:897:9)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
(node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:1) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
This is my Dockerfile
#FROM node:10.12.0
FROM node:latest-alpine
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 3000
CMD [ "node", "app.js" ]
This is docker-compose.yml
version: "2"
services:
  app:
    container_name: app_test
    restart: always
    build: .
    ports:
      - "3000:3000"
    links:
      - mongo_test
    depends_on:
      - mongo_test
    networks:
      - nodeapp_network
  mongo_test:
    container_name: mongo_test
    image: mongo
    volumes:
      - ./data:/data/db
    ports:
      - "27017:27017"
    networks:
      - nodeapp_network
networks:
  nodeapp_network:
    driver: bridge
The following code connects to MongoDB via mongoose in Node.js:
mongoose.connect('mongodb://mongo_test:27017/collectionName', {useNewUrlParser: true});
I get the same result if I use localhost, 127.0.0.1, or an IP address instead of mongo_test.
I have consulted several blogs and Stack Overflow posts.
What can I do?
(P.S. My English is not good, so please understand even if the wording is awkward. Thank you.)
There's nothing wrong with the current configuration; after a few retries, app.js connects to the database.
Here's what is happening:
docker-compose launches the MongoDB service, but its port is not yet open
docker-compose starts app_test, and app_test tries to connect and fails (2 to 3 times)
After a few seconds, the MongoDB port is open on 27017
Since your config has restart: always, the Node.js app restarts and connects successfully to the database
To avoid these restarts, you can write it like this (note the callback must be async to use await):
setTimeout(async function () {
  try {
    await mongoose.connect('mongodb://mongo_test:27017/collectionName',
      { useNewUrlParser: true }
    );
  } catch (error) {
    console.log('Error connecting - retrying in 30 seconds');
  }
}, 30000); // Wait for 30 seconds and try to connect again
Explanation:
The connectTimeoutMS is a MongoDB connection option; it sets the number of milliseconds a socket stays inactive before closing during the connection phase of the driver. That is to say, when the application initiates a connection, when a replica set connects to new members, or when a replica set reconnects to members. The default value of 30000 milliseconds means the driver waits up to 30 seconds for a response from a MongoDB server. link
Compose does not wait until your mongo_test container is “ready”.
The problem of waiting for a database to be ready is really just a subset of a much larger problem of distributed systems. In production, your database could become unavailable or move hosts at any time. Your application needs to be resilient to these types of failures. link
Using depends_on in your app_test service still does not guarantee that the connection between your services will be established.
Solution:
You can update your code to connect to MongoDB:
const connectWithRetry = () => {
  mongoose
    .connect('mongodb://mongo_test:27017/collectionName', { useNewUrlParser: true })
    .then(() => console.log("successfully connected to DB"))
    .catch((e) => {
      console.log(e);
      setTimeout(connectWithRetry, 5000);
    });
};
connectWithRetry();
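An alternative on newer Compose versions is to gate startup on a healthcheck, so app_test is not started until mongod answers a ping. This is a sketch under the assumption that your Compose version supports condition-based depends_on (the Compose Spec does; the legacy version "2"/"3" formats vary), reusing this file's service names:

```yaml
services:
  app:
    depends_on:
      mongo_test:
        condition: service_healthy
  mongo_test:
    healthcheck:
      # the 'mongo' shell ships with the mongo:4.x images; newer images use 'mongosh'
      test: ["CMD", "mongo", "--eval", "db.adminCommand('ping')"]
      interval: 5s
      timeout: 5s
      retries: 5
```

Even with a healthcheck, an in-application retry remains the more robust fix, since the database can also become unavailable later.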

ERROR : [ioredis] Unhandled error event: Error: connect ETIMEDOUT

I am facing a redis connection timeout issue in my node application.
I have tried this code:
new Redis({
  connectTimeout: 10000
})
But it was of no use; it still gives this error:
[ioredis] Unhandled error event: Error: connect ETIMEDOUT
at Socket.<anonymous> (/code/node_modules/ioredis/lib/redis.js:291:21)
at Object.onceWrapper (events.js:313:30)
at emitNone (events.js:106:13)
at Socket.emit (events.js:208:7)
at Socket._onTimeout (net.js:407:8)
at ontimeout (timers.js:475:11)
at tryOnTimeout (timers.js:310:5)
at Timer.listOnTimeout (timers.js:270:5)
When you say container, I assume you mean Docker containers. If you're using a Docker network, you can always change the host to the name of your Docker container within the network. If you're using docker-compose, here's an idea of how that may be done:
version: '3'
services:
  app:
    image: app_image
    depends_on:
      - redis
    networks:
      - app_network
  redis:
    image: redis
    networks:
      - app_network
networks:
  app_network: {}
So in your app, you'd then do
redis = new Redis({
  host: 'redis'
})
(Note: the host option takes a hostname, not a redis:// URL; to use a URL, pass it directly as new Redis('redis://redis').)
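Separately, ioredis exposes a retryStrategy option for reconnect timing. The function below is a sketch of a capped linear backoff; it could be passed as new Redis({ host: 'redis', retryStrategy }) (the option name is from ioredis; the delay values are arbitrary choices):

```javascript
// Hypothetical backoff for ioredis's retryStrategy option:
// delay grows 50 ms per attempt, capped at 2000 ms.
function retryStrategy(times) {
  return Math.min(times * 50, 2000);
}
```

Returning a number tells ioredis to wait that many milliseconds before the next reconnect attempt.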

How to connect to postgres inside docker compose with NodeJs?

I have installed Mainflux on a cloud server using Docker. The Postgres DB is also running in a Docker container. I need to connect to the Postgres DB from Node.js programmatically. I found the "pg" module for connecting to Postgres, but I am unable to connect to the Postgres instance running in the Docker container. My code is below; please let me know how to connect to the Dockerized Postgres.
const pg = require('pg');
const conStringPri = 'postgres://mainflux:mainflux@MYIP/mainflux-things-db';
const Client = pg.Client;
const client = new Client({ connectionString: conStringPri });
client.connect();
client.query('CREATE DATABASE DB_Name')
  .then(() => client.end());
Getting error as below:
(node:8084) UnhandledPromiseRejectionWarning: Error: connect ECONNREFUSED MYIP:5432
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1191:14)
(node:8084) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block or by rejecting a promise which was not handled with .catch() (rejection id: 1)
(node:8084) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
(node:8084) UnhandledPromiseRejectionWarning: Error: Connection terminated unexpectedly
at Connection.con.once (D:\postgres_Nodejs\node_modules\pg\lib\client.js:200:9)
at Object.onceWrapper (events.js:313:30)
at emitNone (events.js:106:13)
at Connection.emit (events.js:208:7)
at Socket.<anonymous> (D:\postgres_Nodejs\node_modules\pg\lib\connection.js:76:10)
at emitOne (events.js:116:13)
at Socket.emit (events.js:211:7)
at TCP._handle.close [as _onclose] (net.js:561:12)
If you started Mainflux using the docker-compose configuration supplied at https://github.com/mainflux/mainflux/blob/master/docker/docker-compose.yml, then your PostgreSQL container doesn't have its port exposed to the host. In order to connect to the database, you need to expose this port.
Here's an example of how that part of the docker-compose file would look, with the things-db container exposing port 5432 (the default PostgreSQL port):
things-db:
  image: postgres:10.2-alpine
  container_name: mainflux-things-db
  restart: on-failure
  environment:
    POSTGRES_USER: mainflux
    POSTGRES_PASSWORD: mainflux
    POSTGRES_DB: things
  networks:
    - mainflux-base-net
  ports:
    - 5432:5432
So you will need to modify your docker-compose.yml.
Please note that the Mainflux docker compose has 2 PostgreSQL databases in 2 containers: things-db and users-db.
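Since the connection string in the question lost its quoting and its @ separator, here is a small helper (hypothetical, not part of pg) that assembles one from parts. pg's Client accepts the resulting string via the connectionString option:

```javascript
// Hypothetical helper: build a libpq-style connection URI from its parts.
// The '@' separates the credentials from the host; the question's code had
// this character garbled.
function pgConnectionString({ user, password, host, port, database }) {
  return `postgres://${user}:${password}@${host}:${port}/${database}`;
}

// Example values from the Mainflux compose file; host and port assume the
// 5432:5432 mapping shown in the answer above:
const conString = pgConnectionString({
  user: 'mainflux',
  password: 'mainflux',
  host: 'localhost',
  port: 5432,
  database: 'things',
});
// conString === 'postgres://mainflux:mainflux@localhost:5432/things'
```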
