Cannot connect to Docker-hosted SQL Server from containerized GraphQL API - node.js

I am using a SQL Server image as the DB and trying to connect to it from a GraphQL API written with TypeORM. If I run docker-compose up -d, the DB starts fine (I can connect to it from local SSMS), but the API cannot connect to it.
Dockerfile:
FROM node:14
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
COPY package*.json ./
RUN npm install
# Copy app source code
COPY . .
#Expose port and start application
EXPOSE 4000
CMD [ "npm", "run", "dev" ]
Docker-Compose File:
version: "2"
services:
server:
build: .
ports:
- "4000:4000"
depends_on:
- plhdashboardb
plhdashboardb:
container_name: plhdashboardb
image: mcr.microsoft.com/mssql/server:2019-latest
ports:
- "1434:1433"
environment:
SA_PASSWORD: "I_AM_mr_React"
ACCEPT_EULA: "Y"
I build the image with docker build -t server . and run everything with docker-compose up -d.
If I open http://localhost:4000/graphql, it never loads, but the SQL Server can still be reached from local SSMS.
ormconfig.json:
{
  "type": "mssql",
  "host": "plhdashboardb",
  "username": "sa",
  "password": "I_AM_mr_React",
  "database": "plhdashboardb_Dev_Mock",
  "synchronize": true,
  "autoSchemaSync": true,
  "logging": false,
  "options": {
    "encrypt": true,
    "enableArithAbort": true
  },
  "entities": [
    "src/entity/**/*.ts"
  ],
  "migrations": [
    "src/migration/**/*.ts"
  ],
  "subscribers": [
    "src/subscriber/**/*.ts"
  ],
  "cli": {
    "entitiesDir": "src/entity",
    "migrationsDir": "src/migration",
    "subscribersDir": "src/subscriber"
  },
  "seeds": [
    "src/database/seeds/**/*{.ts,.js}"
  ],
  "factories": [
    "src/database/factories/**/*{.ts,.js}"
  ]
}
If I enter the container with docker exec -it server_server_1 bash and run npm run dev, I get this error:
(node:167) UnhandledPromiseRejectionWarning: ConnectionError: Failed to connect to localhost:1434 - Could not connect (sequence)
at Connection.<anonymous> (/usr/src/app/node_modules/mssql/lib/tedious/connection-pool.js:68:17)
at Object.onceWrapper (events.js:422:26)
at Connection.emit (events.js:315:20)
at Connection.EventEmitter.emit (domain.js:467:12)
at Connection.socketError (/usr/src/app/node_modules/mssql/node_modules/tedious/lib/connection.js:1290:12)
at /usr/src/app/node_modules/mssql/node_modules/tedious/lib/connection.js:1116:21
at SequentialConnectionStrategy.connect (/usr/src/app/node_modules/mssql/node_modules/tedious/lib/connector.js:87:14)
at Socket.onError (/usr/src/app/node_modules/mssql/node_modules/tedious/lib/connector.js:100:12)
at Socket.emit (events.js:315:20)
at Socket.EventEmitter.emit (domain.js:467:12)
(Use `node --trace-warnings ...` to show where the warning was created)
(node:167) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:167) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
[nodemon] clean exit - waiting for changes before restart
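For reference, the error above shows the driver dialing localhost:1434, which from inside the server container is the API container itself; on the compose network the database is reachable as plhdashboardb on port 1433, since the "1434:1433" mapping only applies on the Docker host. A minimal sketch (not the poster's code, assuming TypeORM 0.2.x and the credentials from the compose file) that pins host and port explicitly instead of relying on whatever config is being picked up:
const { createConnection } = require("typeorm");

// Sketch only: options mirror ormconfig.json but set host/port explicitly
// for the compose network.
createConnection({
  type: "mssql",
  host: "plhdashboardb", // compose service name, resolvable from the server container
  port: 1433,            // container port; 1434 is only the host-side mapping
  username: "sa",
  password: "I_AM_mr_React",
  database: "plhdashboardb_Dev_Mock",
  options: { encrypt: true, enableArithAbort: true },
  entities: ["src/entity/**/*.ts"],
})
  .then(() => console.log("connected to SQL Server"))
  .catch((err) => console.error("connection failed", err));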

Related

Firebase Emulator - UI says emulators disconnected, command times out after 60 seconds

When I run firebase emulators:start in my project, the command has the following output:
i emulators: Starting emulators: functions, firestore, hosting
⚠ functions: The following emulators are not running, calls to these services from the Functions emulator will affect production: auth, database, pubsub, storage
⚠ Your requested "node" version "16" doesn't match your global version "18". Using node@18 from host.
⚠ firestore: Did not find a Cloud Firestore rules file specified in a firebase.json config file.
⚠ firestore: The emulator will default to allowing all reads and writes. Learn more about this option: https://firebase.google.com/docs/emulator-suite/install_and_configure#security_rules_configuration.
i firestore: Firestore Emulator logging to firestore-debug.log
i hosting: Serving hosting files from: public
✔ hosting: Local server: http://127.0.0.1:9001
i ui: Emulator UI logging to ui-debug.log
When I open the Emulator UI at localhost:4000, I get a message saying the emulators are disconnected (screenshot: Emulator UI). Despite this message, the emulators are still running and I can access the hosting at localhost:9001.
Eventually, the emulator command stops with this error message:
Error: TIMEOUT: Port 4000 on localhost was not active within 60000ms
despite the fact that I did open the emulator UI at port 4000.
This is at the top of ui-debug.log:
Web / API server started at localhost:4000
u [FetchError]: request to http://localhost:4400/emulators failed, reason: connect ECONNREFUSED ::1:4400
at ClientRequest.<anonymous> (/Users/patrick/.cache/firebase/emulators/ui-v1.7.0/server.bundle.js:326:16909)
at ClientRequest.emit (node:events:527:28)
at Socket.socketErrorListener (node:_http_client:454:9)
at Socket.emit (node:events:527:28)
at emitErrorNT (node:internal/streams/destroy:151:8)
at emitErrorCloseNT (node:internal/streams/destroy:116:3)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
type: 'system',
errno: 'ECONNREFUSED',
code: 'ECONNREFUSED'
}
Sometimes, but not all the time, before stopping it will give this message:
i ui: Stopping Emulator UI
⚠ Emulator UI has exited upon receiving signal: SIGINT
but with no clear explanation.
This issue has been raised on firebase-tools. Specifying the host as "127.0.0.1" instead of the default "localhost" also worked for me:
{
  "emulators": {
    "auth": {
      "port": 9099
    },
    "firestore": {
      "port": 8080
    },
    "hub": {
      "host": "127.0.0.1"
    },
    "ui": {
      "enabled": true,
      "host": "127.0.0.1"
    }
  }
}
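The ECONNREFUSED ::1:4400 in ui-debug.log is the hint: "localhost" resolved to the IPv6 address ::1 while the emulator hub was only listening on IPv4, which is why pinning "127.0.0.1" helps. A quick, purely illustrative Node check of what "localhost" resolves to on the machine:
// Illustrative only: list what "localhost" resolves to. If ::1 comes first,
// anything bound only to 127.0.0.1 will answer ECONNREFUSED on ::1.
const dns = require("dns");
dns.lookup("localhost", { all: true }, (err, addresses) => {
  if (err) throw err;
  console.log(addresses);
});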

Heroku | Node.js | Exception when debugging remotely

I have an express.js REST API server on Heroku.
I am trying to debug an error remotely, but I receive the exception below. The steps I took to debug the remote server are listed after the exception.
Can someone please help me figure out how to debug a remote node.js/express.js server?
The exception I receive:
events.js:291
throw er; // Unhandled 'error' event
^
Error: connection not allowed by ruleset
at Parser._onData (/usr/local/Cellar/heroku/7.47.11/lib/client/7.59.1/node_modules/@heroku/socksv5/lib/client.parser.js:122:21)
at Socket.Parser.__onData (/usr/local/Cellar/heroku/7.47.11/lib/client/7.59.1/node_modules/@heroku/socksv5/lib/client.parser.js:33:10)
at Socket.emit (events.js:314:20)
at addChunk (_stream_readable.js:297:12)
at readableAddChunk (_stream_readable.js:272:9)
at Socket.Readable.push (_stream_readable.js:213:10)
at TCP.onStreamRead (internal/stream_base_commons.js:188:23)
Emitted 'error' event on Client instance at:
at Parser.<anonymous> (/usr/local/Cellar/heroku/7.47.11/lib/client/7.59.1/node_modules/@heroku/socksv5/lib/client.js:132:10)
at Parser.emit (events.js:314:20)
at Parser._onData (/usr/local/Cellar/heroku/7.47.11/lib/client/7.59.1/node_modules/@heroku/socksv5/lib/client.parser.js:126:16)
at Socket.Parser.__onData (/usr/local/Cellar/heroku/7.47.11/lib/client/7.59.1/node_modules/@heroku/socksv5/lib/client.parser.js:33:10)
[... lines matching original stack trace ...]
at TCP.onStreamRead (internal/stream_base_commons.js:188:23) {
code: 'EACCES'
}
This is what I am trying:
On my local terminal, I run the heroku ps:forward 9229 --app MY_EXPRESS_APP command.
In my local VS Code, I have the launch.json configuration below. I tried both configurations in it (I understand I only need one, but I found both suggestions in other Stack Overflow posts, so I tried each).
"configurations": [
{
"address": "TCP/IP address of process to be debugged",
"localRoot": "${workspaceFolder}",
"name": "ONE: Attach to Remote",
"port": 9229,
"remoteRoot": "/app",
"request": "attach",
"skipFiles": [
"<node_internals>/**"
],
"type": "node"
},
{
"type": "node",
"request": "attach",
"name": "TWO: Remote Heroku: Debug Remote Server",
"address": "localhost",
"port": 9229,
"protocol": "inspector",
"localRoot": "${workspaceFolder}",
"remoteRoot": "/app"
}
]
Then I run the command below. When I start debugging with one of the configurations above, the debugger tries to connect and then pauses. After 2 or 3 minutes, I see the exception above in the terminal running this command.
$ heroku ps:forward 9229 --app MY_EXPRESS_APP
heroku ps:forward 9229 --app MY_EXPRESS_APP
Establishing credentials... done
SOCKSv5 proxy server started on port 1080
Listening on 9229 and forwarding to web.1:9229
Use CTRL+C to stop port fowarding
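One thing worth double-checking (an assumption on my part, since the Procfile is not shown): heroku ps:forward only tunnels traffic to web.1:9229, so attaching can only succeed if the dyno's Node process was actually started with the inspector listening on that port, e.g. with a start command along these lines, where server.js is a placeholder:
web: node --inspect=0.0.0.0:9229 server.js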

MongoDB & nodejs with Docker "MongoTimeoutError: Server selection timed out after 30000 ms"

I am using MongoDB with a NodeJS backend. I have a problem, and the docker logs for it are below.
{ MongoTimeoutError: Server selection timed out after 30000 ms
at Timeout.setTimeout [as _onTimeout] (/usr/src/app/node_modules/mongodb/lib/core/sdam/topology.js:897:9)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
name: 'MongoTimeoutError',
reason:
{ Error: connect ECONNREFUSED 127.0.0.1:27017
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1113:14)
name: 'MongoNetworkError',
errorLabels: [ 'TransientTransactionError' ],
[Symbol(mongoErrorContextSymbol)]: {} },
[Symbol(mongoErrorContextSymbol)]: {} }
(node:1) UnhandledPromiseRejectionWarning: MongoTimeoutError: Server selection timed out after 30000 ms
at Timeout.setTimeout [as _onTimeout] (/usr/src/app/node_modules/mongodb/lib/core/sdam/topology.js:897:9)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
(node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:1) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
This is my Dockerfile
#FROM node:10.12.0
FROM node:latest-alpine
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 3000
CMD [ "node", "app.js" ]
This is docker-compose.yml
version: "2"
services:
app:
container_name: app_test
restart: always
build: .
ports:
- "3000:3000"
links:
- mongo_test
depends_on:
- mongo_test
networks:
- nodeapp_network
mongo_test:
container_name: mongo_test
image: mongo
volumes:
- ./data:/data/db
ports:
- "27017:27017"
networks:
- nodeapp_network
networks:
nodeapp_network:
driver: bridge
The following code connects to MongoDB with mongoose in Node.js:
mongoose.connect('mongodb://mongo_test:27017/collectionName', {useNewUrlParser: true});
I get the same result if I use localhost, 127.0.0.1, or an IP address instead of mongo_test.
I have consulted several blogs and Stack Overflow posts.
What can I do?
(PS: My English is not great, so please bear with me if the wording is awkward. Thank you.)
There's nothing wrong with the current configuration; after a few retries, app.js connects to the database.
Here's what is happening:
docker-compose launches the MongoDB service, but its port is not open yet
docker-compose starts app_test, which tries to connect and fails (2 or 3 times)
After a few seconds, the MongoDB port 27017 is open
Since your config has restart: always, the Node.js app restarts and connects successfully to the database
To avoid these restarts, you can write it like this:
setTimeout(async function () {
  try {
    await mongoose.connect('mongodb://mongo_test:27017/collectionName',
      { useNewUrlParser: true }
    );
  } catch (error) {
    console.log('Error connecting - retrying in 30 seconds');
  }
}, 30000); // Wait for 30 seconds and try to connect again
Explanation:
connectTimeoutMS is a MongoDB connection option; it sets the number of milliseconds a socket stays inactive before closing during the connection phase of the driver, that is, when the application initiates a connection, when a replica set connects to new members, or when a replica set reconnects to members. The default value of 30000 milliseconds means the driver waits up to 30 seconds for a response from a MongoDB server. link
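For illustration only, both of these timeouts can be passed straight through mongoose.connect; serverSelectionTimeoutMS (the timeout behind the "Server selection timed out after 30000 ms" message, assuming a driver with the newer server-selection logic) sits alongside connectTimeoutMS:
// Illustrative sketch: raise the driver timeouts while mongo_test is still starting.
mongoose.connect('mongodb://mongo_test:27017/collectionName', {
  useNewUrlParser: true,
  connectTimeoutMS: 30000,         // socket inactivity limit during the connection phase
  serverSelectionTimeoutMS: 30000  // how long server selection keeps retrying before failing
}).catch((err) => console.log('initial connection failed', err));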
Compose does not wait until your mongo_test container is “ready”.
The problem of waiting for a database to be ready is really just a subset of a much larger problem of distributed systems. In production, your database could become unavailable or move hosts at any time. Your application needs to be resilient to these types of failures. link
Using depends_on in your app_test service is still no guarantee that the connection between your services can be established.
Solution:
You can update your code to connect to MongoDB:
const connectWithRetry = () => {
  mongoose
    .connect('mongodb://mongo_test:27017/collectionName', { useNewUrlParser: true })
    .then(() => console.log("successfully connected to DB"))
    .catch((e) => {
      console.log(e);
      setTimeout(connectWithRetry, 5000);
    });
};
connectWithRetry();

Docker: Unhandled Promise rejection when trying to access mongoDB using docker

So I was trying to deploy my very basic Node.js application using Docker on a remote Linux server (a DigitalOcean droplet), following this tutorial. I'm pretty new to deployment-related stuff, so I'm surely making some basic error. Please bear with me for the long post.
The project code and the Docker/docker-compose config files I was using are below:
.env
MONGO_USERNAME=sammy
MONGO_PASSWORD=password
MONGO_DB=sharkinfo
PORT=8080
MONGO_PORT=27017
MONGO_HOSTNAME=127.0.0.1
Dockerfile
FROM node:10
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "node", "src/index.js"]
docker-compose.yml
version: '3'
services:
  nodejs:
    build: .
    env_file: .env
    environment:
      - MONGO_USERNAME=$MONGO_USERNAME
      - MONGO_PASSWORD=$MONGO_PASSWORD
      - MONGO_HOSTNAME=myDB
      - MONGO_PORT=$MONGO_PORT
      - MONGO_DB=$MONGO_DB
    ports:
      - "80:8080"
    networks:
      - app-network
  myDB:
    image: mongo:4.1.8-xenial
    env_file: .env
    environment:
      - MONGO_INITDB_ROOT_USERNAME=$MONGO_USERNAME
      - MONGO_INITDB_ROOT_PASSWORD=$MONGO_PASSWORD
    volumes:
      - dbdata:/data/db
    networks:
      - app-network
networks:
  app-network:
    driver: bridge
volumes:
  dbdata:
mongoose.js
const mongoose = require('mongoose');
const {
  MONGO_USERNAME,
  MONGO_PASSWORD,
  MONGO_HOSTNAME,
  MONGO_PORT,
  MONGO_DB
} = process.env;
const url = `mongodb://${MONGO_USERNAME}:${MONGO_PASSWORD}@${MONGO_HOSTNAME}:${MONGO_PORT}/${MONGO_DB}?authSource=admin`;
mongoose.connect(url, {
  useNewUrlParser: true,
  useCreateIndex: true,
  useFindAndModify: false
});
Then I built and ran the Docker images using, in order, docker-compose build and docker-compose up -d.
Both the Node and MongoDB containers show up in the list of running Docker processes. However, when I do docker logs node-application, I get an error:
Server is up on port 8080
(node:1) UnhandledPromiseRejectionWarning: MongoNetworkError: failed to connect to server [mydb:27017] on first connect [MongoNetworkError: connect ECONNREFUSED 172.19.0.3:27017]
at Pool.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/topologies/server.js:564:11)
at Pool.emit (events.js:198:13)
at Connection.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/connection/pool.js:317:12)
at Object.onceWrapper (events.js:286:20)
at Connection.emit (events.js:198:13)
at Socket.<anonymous> (/usr/src/app/node_modules/mongodb-core/lib/connection/connection.js:246:50)
at Object.onceWrapper (events.js:286:20)
at Socket.emit (events.js:198:13)
at emitErrorNT (internal/streams/destroy.js:91:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:59:3)
at process._tickCallback (internal/process/next_tick.js:63:19)
(node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:1) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
I've tried various tweaks to my code with no success and it continues to give the same error. What could be the issue here?
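One hedged observation, in the spirit of the retry pattern from the previous answer: mongoose.connect returns a promise, so catching it and retrying covers the window where the myDB container is still starting and also removes the UnhandledPromiseRejectionWarning. A minimal sketch reusing the url built in mongoose.js above:
// Sketch only: retry until the myDB container accepts connections.
const connectWithRetry = () => {
  mongoose.connect(url, { useNewUrlParser: true, useCreateIndex: true, useFindAndModify: false })
    .then(() => console.log('connected to MongoDB'))
    .catch(() => setTimeout(connectWithRetry, 5000));
};
connectWithRetry();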

How to connect to postgres inside docker compose with NodeJs?

I have installed Mainflux on a cloud server using Docker. The Postgres DB is also running in a Docker container. I need to connect to the Postgres DB from Node.js programmatically. I found the "pg" module for connecting to Postgres, but I am unable to connect to the Postgres instance running in the Docker container. My code is below; please let me know how to connect to the "Docker Postgres".
const pg = require('pg');
const conStringPri = 'postgres://mainflux:mainflux@MYIP/mainflux-things-db';
const Client = pg.Client;
const client = new Client({ connectionString: conStringPri });
client.connect();
client.query('CREATE DATABASE DB_Name')
  .then(() => client.end());
I get the error below:
(node:8084) UnhandledPromiseRejectionWarning: Error: connect ECONNREFUSED MYIP:5432
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1191:14)
(node:8084) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block or by rejecting a promise which was not handled with .catch() (rejection id: 1)
(node:8084) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
(node:8084) UnhandledPromiseRejectionWarning: Error: Connection terminated unexpectedly
    at Connection.con.once (D:\postgres_Nodejs\node_modules\pg\lib\client.js:200:9)
    at Object.onceWrapper (events.js:313:30)
    at emitNone (events.js:106:13)
    at Connection.emit (events.js:208:7)
    at Socket.<anonymous> (D:\postgres_Nodejs\node_modules\pg\lib\connection.js:76:10)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at TCP._handle.close [as _onclose] (net.js:561:12)
If you started Mainflux using the docker-compose configuration supplied here https://github.com/mainflux/mainflux/blob/master/docker/docker-compose.yml , then your PostgreSQL container doesn't have its port exposed to the host. In order to connect to the database, you need to expose this port.
Here's an example of how that part of the docker-compose file would look, with the things-db container having port 5432 (the default PostgreSQL port) exposed:
things-db:
  image: postgres:10.2-alpine
  container_name: mainflux-things-db
  restart: on-failure
  environment:
    POSTGRES_USER: mainflux
    POSTGRES_PASSWORD: mainflux
    POSTGRES_DB: things
  networks:
    - mainflux-base-net
  ports:
    - 5432:5432
So you will need to modify your docker-compose.yml.
Please note that the Mainflux docker compose has 2 PostgreSQL databases in 2 containers: things-db and users-db.
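For completeness, a minimal pg sketch against the newly exposed port; it assumes the credentials and the things database name from the compose snippet above, with MYIP standing in for the server address exactly as in the question:
// Sketch only: connect through the exposed 5432 port with the pg module.
const { Client } = require('pg');
const client = new Client({
  connectionString: 'postgres://mainflux:mainflux@MYIP:5432/things'
});
client.connect()
  .then(() => client.query('SELECT 1'))
  .then((res) => { console.log(res.rows); return client.end(); })
  .catch((err) => { console.error(err); return client.end(); });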
