NodeJS and SQL Server: unable to connect when in a Docker container

I have a NodeJS app that works fine when run standalone on a server. When I run it in a docker container it displays the following error message:
ConnectionError: Failed to connect to TEXTREPLACED:undefined - socket hang up
at /app/node_modules/mssql/lib/tedious/connection-pool.js:71:17
at Connection.onConnect (/app/node_modules/tedious/lib/connection.js:1037:9)
at Object.onceWrapper (node:events:514:26)
at Connection.emit (node:events:394:28)
at Connection.emit (/app/node_modules/tedious/lib/connection.js:1065:18)
at Connection.socketError (/app/node_modules/tedious/lib/connection.js:1663:12)
at Connection.socketEnd (/app/node_modules/tedious/lib/connection.js:1693:12)
at Socket. (/app/node_modules/tedious/lib/connection.js:1433:14)
at Socket.emit (node:events:406:35)
at endReadableNT (node:internal/streams/readable:1331:12) {
code: 'ESOCKET',
originalError: ConnectionError: Failed to connect to TEXTREPLACED:undefined - socket hang up
at ConnectionError (/app/node_modules/tedious/lib/errors.js:13:12)
at Connection.socketError (/app/node_modules/tedious/lib/connection.js:1663:56)
at Connection.socketEnd (/app/node_modules/tedious/lib/connection.js:1693:12)
at Socket. (/app/node_modules/tedious/lib/connection.js:1433:14)
at Socket.emit (node:events:406:35)
at endReadableNT (node:internal/streams/readable:1331:12)
at processTicksAndRejections (node:internal/process/task_queues:83:21) {
code: 'ESOCKET'
}
}
My connection code is:
const sqlConfig = {
  user: 'LOGIN',
  password: 'PASSWORD',
  server: 'SERVER\\INSTANCE',
  database: 'DATABASE',
  debug: true,
  port: 1433,
  driver: 'tedious',
  pool: {
    idleTimeoutMillis: 1000
  },
  options: {
    port: 1433,
    enableArithAbort: true,
    encrypt: true,
    trustServerCertificate: true,
    instanceName: 'INSTANCE',
    database: 'DATABASE',
    debug: {
      packet: true,
      data: true,
      payload: true,
      token: true,
      log: true
    }
  }
};
const global_pool = new sql.ConnectionPool(sqlConfig);
var global_pool_con = null;
try {
  global_pool_con = global_pool.connect();
} catch (err) {
  console.log(err);
}
Dockerfile
# Use Node base image
FROM node:latest
#ports
EXPOSE 3000
EXPOSE 1433
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
USER node
CMD ["npm", "run", "release"]
The "release" script runs the initial js file.
Confusingly, the error shows "undefined" instead of a port number. I have used both host and bridge networking with ports 1433 and 3000 (HAPI) routed, and confirmed the ports are exposed in the Dockerfile.
Considering that it works when standalone, I'm presuming that a Docker setting somewhere is causing the issue.
Update: The TLS/SSL negotiation packet is being sent but is not received
State change: SentPrelogin -> SentTLSSSLNegotiation
Update: The SQL Server is displaying this error in the event logs when the dockerised app attempts to connect.
A fatal alert was generated and sent to the remote endpoint. This may result in termination of the connection. The TLS protocol defined fatal error code is 40. The Windows SChannel error state is 1205.
An TLS 1.2 connection request was received from a remote client application, but none of the cipher suites supported by the client application are supported by the server. The SSL connection request has failed.

Thanks to everyone for your help.
The resolution for this was to update the openssl.cnf file; it had additional lines from the source image that weren't required.
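For anyone hitting the same thing, here is a rough sketch of how that kind of change can be baked into the image. The specific directives below (MinProtocol/CipherString) are assumptions about the Debian-based node image, not necessarily the exact lines I removed; inspect /etc/ssl/openssl.cnf in your own image and adjust or delete what doesn't apply:
# Sketch only: relax the image's OpenSSL defaults so the SQL Server's older
# TLS cipher suites are accepted during the handshake.
FROM node:latest
RUN sed -i 's/^MinProtocol *=.*/MinProtocol = TLSv1/' /etc/ssl/openssl.cnf \
 && sed -i 's/^CipherString *=.*/CipherString = DEFAULT@SECLEVEL=1/' /etc/ssl/openssl.cnf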

Related

Error: No event 'socketConnect' in state 'SentPrelogin'

I am trying to connect to my SQL Server database from Node.js using knex, but I am facing this issue:
Error: No event 'socketConnect' in state 'SentPrelogin'
at Connection.dispatchEvent (C:\Users\temp\Documents\PRactice\node_modules\tedious\lib\connection.js:1281:26)
at Connection.socketConnect (C:\Users\temp\Documents\PRactice\node_modules\tedious\lib\connection.js:1303:10)
at C:\Users\temp\Documents\PRactice\node_modules\tedious\lib\connection.js:1145:12
at Socket.onConnect (C:\Users\temp\Documents\PRactice\node_modules\tedious\lib\connector.js:106:7)
at Socket.emit (events.js:314:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1131:10)
Emitted 'error' event on Connection instance at:
at Connection.dispatchEvent (C:\Users\temp\Documents\PRactice\node_modules\tedious\lib\connection.js:1281:12)
at Connection.socketConnect (C:\Users\temp\Documents\PRactice\node_modules\tedious\lib\connection.js:1303:10)
[... lines matching original stack trace ...]
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1131:10)
[nodemon] app crashed - waiting for file changes before starting...
My code is
var knex = require('knex')({
  client: 'mssql',
  version: "7_1",
  connection: {
    user: 'sa',
    password: 'Admin#123',
    server: 'localhost',
    database: 'Demo'
  }
});
knex.select("*").from("Country")
  .then(function (depts) {
    depts.forEach((dept) => { // use of Arrow Function
      console.log({ ...dept });
    });
  }).catch(function (err) {
    // All the error can be checked in this piece of code
    console.log(err);
  }).finally(function () {
    // To close the connection pool
    knex.destroy();
  });
You need to add a missing dependency:
npm install --save tedious
As of knex v0.95.0 you'll need to use the tedious library instead of mssql when connecting to an MSSQL database. According to the knex upgrade instructions:
MSSQL driver was completely reworked in order to address the multitude of connection pool, error handling and performance issues. Since the new implementation uses tedious library directly instead of mssql, please replace mssql with tedious in your dependencies if you are using a MSSQL database.
Installing the package above should resolve your issue. I also had to set the encrypt option to false when connecting to my local database to avoid this error:
ConnectionError: Failed to connect to localhost:1433 - self signed certificate
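For reference, a minimal sketch of the kind of config that worked for me locally (assuming knex >= 0.95 with tedious installed; as far as I can tell the nested options object is passed through to tedious, so encrypt / trustServerCertificate belong there; adjust names and credentials to your own setup):
// Sketch only: local-dev knex config with encryption disabled to avoid the
// self-signed certificate error. Values mirror the question's placeholders.
const knex = require('knex')({
  client: 'mssql',
  connection: {
    server: 'localhost',
    port: 1433,
    user: 'sa',
    password: 'Admin#123',
    database: 'Demo',
    options: {
      encrypt: false,               // local SQL Server without TLS
      trustServerCertificate: true  // alternative if encryption stays on
    }
  }
});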

How to connect to a MongoDB running on an AWS EC2 host (Linux) outside of Docker

Situation:
I prototyped a small web (node.js) app and dockerized it for deployment and replicability purposes.
The app speaks with a MongoDB running directly on the host.
Problem:
On the server (an AWS EC2 instance with only ports 80 and 443 open), I am not able to interact with MongoDB and I am wondering why.
docker run --net="host" -e 'NODE_ENV=production' -e 'MONGO_URI=mongodb://USER:PASSWORD!#172.31.32.1:27017/test_db' DOCKER_IMAGE
MongoDB connected ...
Warning: connect.session() MemoryStore is not
designed for a production environment, as it will leak
memory, and will not scale past a single process.
HTTP Server started on port 80
(node:1) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
MongoNetworkError: failed to connect to server [172.31.32.1:27017] on first connect [MongoNetworkError: connection timed out
at connectionFailureError (/app/server/node_modules/mongodb/lib/core/connection/connect.js:377:14)
at Socket.<anonymous> (/app/server/node_modules/mongodb/lib/core/connection/connect.js:287:16)
at Object.onceWrapper (events.js:284:20)
at Socket.emit (events.js:196:13)
at Socket._onTimeout (net.js:432:8)
at listOnTimeout (internal/timers.js:531:17)
at processTimers (internal/timers.js:475:7) {
name: 'MongoNetworkError',
[Symbol(mongoErrorContextSymbol)]: {}
}]
at Pool.<anonymous> (/app/server/node_modules/mongodb/lib/core/topologies/server.js:433:11)
at Pool.emit (events.js:196:13)
at /app/server/node_modules/mongodb/lib/core/connection/pool.js:571:14
at /app/server/node_modules/mongodb/lib/core/connection/pool.js:994:11
at /app/server/node_modules/mongodb/lib/core/connection/connect.js:40:11
at callback (/app/server/node_modules/mongodb/lib/core/connection/connect.js:262:5)
at Socket.<anonymous> (/app/server/node_modules/mongodb/lib/core/connection/connect.js:287:7)
at Object.onceWrapper (events.js:284:20)
at Socket.emit (events.js:196:13)
at Socket._onTimeout (net.js:432:8)
at listOnTimeout (internal/timers.js:531:17)
at processTimers (internal/timers.js:475:7) {
name: 'MongoNetworkError',
[Symbol(mongoErrorContextSymbol)]: {}
}
I initially tried localhost instead of the IP address, but that does not work: it throws an authentication error (which is somewhat strange). Since there is no host.docker.internal on Linux, I had to (temporarily) fall back to the explicit IP address, which I got via:
netstat -nr | grep '^0\.0\.0\.0' | awk '{print $2}'
What I find very strange is that I am not getting an authentication error but a timeout error, so to me it seems like the app is able to connect to Mongo. Also, the "MongoDB connected ..." message would indicate that, as it is produced by the following line in my server script:
mongoose
.connect(DB, { useNewUrlParser: true })
.then(console.log("MongoDB connected ..."))
.catch(err => console.log(err));
For completeness' sake, the same setup (i.e. a dockerized app and MongoDB running directly on the host) worked without a problem locally.
Also, I am able to enter the Mongo Shell on the server via mongo.
Any explanation or tip is appreciated!
If MongoDB is running outside of Docker, you can use the private IP of the EC2 instance to connect to MongoDB (running on the host) from inside Docker.
EC2_PRIVATE_IP=$(curl http://169.254.169.254/latest/meta-data/local-ipv4)
docker run --net="host" -e 'NODE_ENV=production' -e "MONGO_URI=mongodb://USER:PASSWORD!#${EC2_PRIVATE_IP}:27017/test_db" DOCKER_IMAGE
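Alternatively, on Docker 20.10+ you can map host.docker.internal yourself on Linux instead of hard-coding the private IP. A sketch, assuming mongod listens on an address reachable from the Docker bridge network (not just 127.0.0.1); USER/PASSWORD are placeholders as above:
# Sketch: host-gateway makes host.docker.internal resolve to the host.
docker run --add-host=host.docker.internal:host-gateway \
  -e 'NODE_ENV=production' \
  -e 'MONGO_URI=mongodb://USER:PASSWORD@host.docker.internal:27017/test_db' \
  DOCKER_IMAGE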

Nodejs Bull queue not connecting to Azure

When I change the settings inside the new Bull Queue object to the Azure credentials, I get an error in the console. When running the Bull queue locally, the application works perfectly fine; as soon as I switch the credentials to Azure, I get the error below. When running locally I start redis-server myself, but not when using the Azure credentials.
I have tried the example tutorial on the Azure website with Node.js and the redis npm package, and the Azure Redis cache works perfectly fine, so I am led to believe that I am doing something wrong in the config. I have also tried adding "maxRetriesPerRequest" and "enableReadyCheck" to the redis object; however, they have had no effect. I also make sure I execute the done function within the process function.
const queue = new Queue('sendQueue', {
  defaultJobOptions: { removeOnComplete: true },
  redis: {
    port: env.AZURE_REDIS_PORT,
    host: env.AZURE_REDIS_HOST,
    password: env.AZURE_REDIS_PASSWORD
  },
});
at Queue.<anonymous> (/Users/abc/Projects/Sean/dist/tasks/sendQueue.js:47:11)
at Queue.emit (events.js:208:15)
at Redis.emit (events.js:203:13)
at Redis.silentEmit (/Users/abc/Projects/Sean/node_modules/ioredis/built/redis/index.js:482:26)
at Socket.<anonymous> (/Users/abc/Projects/Sean/node_modules/ioredis/built/redis/event_handler.js:122:14)
at Object.onceWrapper (events.js:291:20)
at Socket.emit (events.js:203:13)
at emitErrorNT (internal/streams/destroy.js:91:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:59:3)
at processTicksAndRejections (internal/process/task_queues.js:77:11)
Error: read ECONNRESET
at TCP.onStreamRead (internal/stream_base_commons.js:183:27)
Try adding TLS configuration when using Azure Redis Cache; the servername should be the same value as the host. I did not manage to get a connection without it.
var notificationQueue = new Queue('notifications', {
  redis: {
    port: Number(process.env.REDIS_PORT),
    host: process.env.REDIS_HOST,
    password: process.env.REDIS_PASS,
    tls: {
      servername: process.env.REDIS_HOST
    }
  }
});
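If it still fails, it can help to verify the credentials and TLS settings with ioredis directly (Bull uses ioredis under the hood, as the stack trace shows). A minimal sketch, assuming the same REDIS_* environment variables and that the cache exposes TLS on Azure's default SSL port 6380:
// Sketch only: a bare ioredis connection to confirm the Azure host, password
// and TLS settings work outside of Bull.
const Redis = require('ioredis');

const client = new Redis({
  port: 6380,                                 // Azure Cache for Redis SSL port
  host: process.env.REDIS_HOST,
  password: process.env.REDIS_PASS,
  tls: { servername: process.env.REDIS_HOST }
});

client.ping()
  .then(res => console.log('PING ->', res))   // expect 'PONG'
  .catch(err => console.error(err));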

Parse Server connect to remote database over SSL

I have been using Parse Server locally without any issues until I enabled SSL. Now I have a number of unauthorized and other errors that prevent me from connecting to the remote DB.
The database is hosted with NodeChef, and they provide an sslCAFile which I have tried to add to my Parse Server config as advised here. However, I am still unable to connect and am getting the following errors in the terminal.
warn: Unabled to ensure uniqueness for user email addresses: Error:
unable to get issuer certificate
at Error (native)
at TLSSocket. (_tls_wrap.js:1000:38)
at emitNone (events.js:67:13)
at TLSSocket.emit (events.js:166:7)
at TLSSocket._finishInit (_tls_wrap.js:567:8)
error: Uncaught internal server error. { [MongoError: unable to get issuer certificate]
name: 'MongoError',
message: 'unable to get issuer certificate' } Error: unable to get issuer certificate
at Error (native)
at TLSSocket. (_tls_wrap.js:1000:38)
at emitNone (events.js:67:13)
at TLSSocket.emit (events.js:166:7)
at TLSSocket._finishInit (_tls_wrap.js:567:8)
My Parse Server config looks like this:
var api = new ParseServer({
  databaseURI: databaseUri || '',
  databaseOptions: {
    mongos: {
      ssl: true,
      sslValidate: true,
      allowConnectionsWithoutCertificates: true,
      sslCA: [fs.readFileSync('SSLCA.pem')] // cert from nodechef dashboard
    }
  },
  cloud: process.env.CLOUD_CODE_MAIN || __dirname + '/cloud/main.js',
  appId: process.env.APP_ID
});
I have tried changing the server URL to HTTPS on both the client and the server and every combination in between to no avail.
Many thanks.
The issue was that I still had a legacy JavaScript key set. If this optional param is set in the Parse Server initialization options, then it completely refuses to connect (quite rightly) using the new JS client, for which you only need an App ID and an initialisation function.
I can confirm that I didn't need a certificate to connect to my remote database at NodeChef using ?ssl=true in my connection string.
Although embarrassing, I'll leave this here; it may help someone.
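For anyone who prefers the connection-string route, a minimal sketch (USER/PASSWORD/HOST/PORT/DBNAME are placeholders; the rest mirrors the config above):
// Sketch: rely on ?ssl=true in the URI instead of passing an sslCA file.
var api = new ParseServer({
  databaseURI: 'mongodb://USER:PASSWORD@HOST:PORT/DBNAME?ssl=true',
  cloud: process.env.CLOUD_CODE_MAIN || __dirname + '/cloud/main.js',
  appId: process.env.APP_ID
});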

Error: Redis connection to 10.130.212.246:6379 failed - connect ETIMEDOUT

I have a Sails app running in AWS. When I run the code in development mode (sails lift --verbose) it works fine, and I am able to access it from the browser by typing the IP and port number (xx.xx.xxx.xx:1337/). But when I run the code in production mode (sails lift --prod --verbose) I am not able to access it by IP (xx.xx.xxx.xx), and when I try xx.xx.xxx.xx:1337 it gives me the error below.
Grunt :: Done, without errors.
Unable to parse HTTP body- error occurred:
Error: Redis connection to 10.130.212.246:6379 failed - connect ETIMEDOUT
at RedisClient.flush_and_error (/home/ubuntu/vka/node_modules/sails/node_modules/connect-redis/node_modules/redis/index.js:142:13)
at RedisClient.on_error (/home/ubuntu/vka/node_modules/sails/node_modules/connect-redis/node_modules/redis/index.js:180:10)
at Socket.<anonymous> (/home/ubuntu/vka/node_modules/sails/node_modules/connect-redis/node_modules/redis/index.js:95:14)
at Socket.emit (events.js:95:17)
at net.js:441:14
at process._tickDomainCallback (node.js:492:13) [Error: Redis connection to 10.130.212.246:6379 failed - connect ETIMEDOUT]
Unable to parse HTTP body- error occurred:
Error: Redis connection to 10.130.212.246:6379 failed - connect ETIMEDOUT
at RedisClient.flush_and_error (/home/ubuntu/vka/node_modules/sails/node_modules/connect-redis/node_modules/redis/index.js:142:13)
at RedisClient.on_error (/home/ubuntu/vka/node_modules/sails/node_modules/connect-redis/node_modules/redis/index.js:180:10)
at Socket.<anonymous> (/home/ubuntu/vka/node_modules/sails/node_modules/connect-redis/node_modules/redis/index.js:95:14)
at Socket.emit (events.js:95:17)
at net.js:441:14
at process._tickDomainCallback (node.js:492:13) [Error: Redis connection to 10.130.212.246:6379 failed - connect ETIMEDOUT]
Please suggest a possible solution.
Check whether you are hardcoding the host in session.js with an IP; change it to localhost.
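For example, in config/session.js (a sketch, assuming the connect-redis session adapter shown in the stack trace; key names may differ slightly between Sails versions):
// Sketch only: point the session store at the Redis running on the same box
// instead of a hard-coded private IP.
module.exports.session = {
  adapter: 'redis',
  host: 'localhost',
  port: 6379,
  secret: 'your-session-secret'
};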
For your question about accessing the app without port 1337: you need to set up a reverse proxy, say Nginx. Open up just port 80 for public access, configure Nginx to route requests coming to port 80 to your Sails app running on port 1337, and use something like pm2 or forever to keep the Sails app running.
Steps to set up Nginx as a reverse proxy are explained here: https://www.digitalocean.com/community/tutorials/how-to-set-up-a-node-js-application-for-production-on-ubuntu-14-04
