Connection problem to MongoDB with Node.js

I host a Mongo database on an Ubuntu server. I created an admin user so that I can connect from Node.js to create a database, add collections, etc. I can connect with MongoDB Compass without problems, but from Node.js Mongo returns an error.
connect function:
const mongoose = require("mongoose");
mongoose.set("strictQuery", true);
//connect to db
mongoose
  .connect("mongodb://" + process.env.DB_USER_PASS + "@2.56.247.250:27017/?authMechanism=DEFAULT")
  .then(() => console.log("Connected to the database"))
  .catch((err) => console.log("Connection error:", err));
Here is the error:
Connection error: MongoServerError: Authentication failed.
at Connection.onMessage (C:\Users\arnau\Desktop\messIO\node_modules\mongodb\lib\cmap\connection.js:230:30)
at MessageStream.<anonymous> (C:\Users\arnau\Desktop\messIO\node_modules\mongodb\lib\cmap\connection.js:61:60)
at MessageStream.emit (node:events:513:28)
at processIncomingData (C:\Users\arnau\Desktop\messIO\node_modules\mongodb\lib\cmap\message_stream.js:125:16)
at MessageStream._write (C:\Users\arnau\Desktop\messIO\node_modules\mongodb\lib\cmap\message_stream.js:33:9)
at writeOrBuffer (node:internal/streams/writable:392:12)
at _write (node:internal/streams/writable:333:10)
at Writable.write (node:internal/streams/writable:337:10)
at Socket.ondata (node:internal/streams/readable:766:22)
at Socket.emit (node:events:513:28) {
ok: 0,
code: 18,
codeName: 'AuthenticationFailed',
connectionGeneration: 0,
[Symbol(errorLabels)]: Set(2) { 'HandshakeError', 'ResetPool' }

The error AuthenticationFailed means the server rejected the credentials in your connection string, so the driver cannot authenticate.
Check for all details here: https://www.mongodb.com/docs/manual/reference/connection-string/
Potential problems:
Special characters: from the docs, if the username or password includes any of the characters $ : / ? # [ ] @, those characters must be converted using percent encoding.
username:password format: check that your environment variable is in the right format, with : between username and password, and no spaces.
Check your auth database: when you created the user, you either created it on the default database ("admin" is the default) or on a specific db after running the command use dbname. If you created it on a specific db, you may need to add that database name to the connection string as the auth database.
You can verify each of the points above by first making a connection with the mongosh command, to confirm that your connection string is fine.

Related

Connection timed out while connecting to AWS DocumentDB outside the VPC

I'm trying to create a very simple Node app that can use DocumentDB. I'm not using Cloud9 or Lambda; I'm coding locally. I was following this link https://docs.aws.amazon.com/documentdb/latest/developerguide/connect-from-outside-a-vpc.html and this link https://docs.aws.amazon.com/documentdb/latest/developerguide/connect-ec2.html
I created a poorly secured EC2 instance with the following inbound rules
port range | protocol | source    | security group
22         | TCP      | 0.0.0.0/0 | demoEC2
This demoEC2 security group has the following inbound rules
type | protocol | port range | source
SSH  | TCP      | 22         | 0.0.0.0/0
Then I created a DocumentDB cluster with 1 instance available that belongs to a security group that has the following inbound rules
type       | protocol | port range | source
custom TCP | TCP      | 27017      | demoEC2
After that, I open my terminal and created a tunnel:
ssh -i "mykeypair.pem" -L 27017:<CLUSTER ENDPOINT>:27017 ec2-user@<EC2 PUBLIC IPV4 DNS> -N
And, to test that the tunnel is working, I connect using the mongo shell:
> mongo "mongodb://<MASTER USERNAME>:<MASTER PASSWORD>@localhost:27017/<DATABASE>" --tls --tlsAllowInvalidHostnames --tlsCAFile rds-combined-ca-bundle.pem
MongoDB shell version v4.2.13
connecting to: mongodb://localhost:27017/<DATABASE>?compressors=disabled&gssapiServiceName=mongodb
2021-07-29T10:10:59.309+0200 W NETWORK [js] The server certificate does not match the host name. Hostname: localhost does not match docdb-2021-07-27-10-32-49.ctuxybn342pe.eu-central-1.docdb.amazonaws.com docdb-2021-07-27-10-32-49.cluster-ctuxybn342pe.eu-central-1.docdb.amazonaws.com docdb-2021-07-27-10-32-49.cluster-ro-ctuxybn342pe.eu-central-1.docdb.amazonaws.com , Subject Name: C=US,ST=Washington,L=Seattle,O=Amazon.com,OU=RDS,CN=docdb-2021-07-27-10-32-49.ctuxybn342pe.eu-central-1.docdb.amazonaws.com
Implicit session: session { "id" : UUID("63340995-54ad-471b-aa8d-85763f3c7281") }
MongoDB server version: 4.0.0
WARNING: shell and server versions do not match
Warning: Non-Genuine MongoDB Detected
This server or service appears to be an emulation of MongoDB rather than an official MongoDB product.
Some documented MongoDB features may work differently, be entirely missing or incomplete, or have unexpected performance characteristics.
To learn more please visit: https://dochub.mongodb.org/core/non-genuine-mongodb-server-warning.
rs0:PRIMARY>
However, when I try to connect in my node app:
const mongoose = require('mongoose');
const fs = require('fs');
const path = require('path');
const username = ...
const password = ...
const database = ...
const connstring = `mongodb://${username}:${password}@localhost:27017/${database}?tls=true&replicaSet=rs0&readPreference=secondaryPreferred`;
const certFile = path.resolve(__dirname, './rds-combined-ca-bundle.pem');
const certFileBuf = fs.readFileSync(certFile); //I tried this one in tlsCAFile option as well
mongoose.connect(connstring, {
  tlsCAFile: certFile,
  useNewUrlParser: true,
  tlsAllowInvalidHostnames: true,
})
  .then(() => console.log('Connection to DB successful'))
  .catch((err) => console.error(err, 'Error'));
I get a connection timeout error after a while:
> node .\index.js
(node:12388) [MONGODB DRIVER] Warning: Current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
MongoNetworkError: failed to connect to server [<CLUSTER ENDPOINT WITHOUT HAVING .cluster->:27017] on first connect [MongoNetworkTimeoutError: connection timed out
at connectionFailureError (D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\connection\connect.js:345:14)
at TLSSocket.<anonymous> (D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\connection\connect.js:313:16)
at Object.onceWrapper (events.js:421:28)
at TLSSocket.emit (events.js:315:20)
at TLSSocket.Socket._onTimeout (net.js:481:8)
at listOnTimeout (internal/timers.js:549:17)
at processTimers (internal/timers.js:492:7)]
at Pool.<anonymous> (D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\topologies\server.js:441:11)
at Pool.emit (events.js:315:20)
at D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\connection\pool.js:564:14
at D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\connection\pool.js:1013:9
at D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\connection\connect.js:32:7
at callback (D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\connection\connect.js:283:5)
at TLSSocket.<anonymous> (D:\projects\documentdb-connect\node_modules\mongoose\node_modules\mongodb\lib\core\connection\connect.js:313:7)
at Object.onceWrapper (events.js:421:28)
at TLSSocket.emit (events.js:315:20)
at TLSSocket.Socket._onTimeout (net.js:481:8)
at listOnTimeout (internal/timers.js:549:17)
at processTimers (internal/timers.js:492:7) Error
Since I could connect using the mongo shell, I think the tunnel is working, and I can even do some inserts through it. So why can't Mongoose connect? I also tried MongoClient (const MongoClient = require('mongodb').MongoClient with MongoClient.connect(same everything)), but it didn't work; I still get the same timeout error.
Turns out all I needed to do was pass the username and password through the options, not in the connection string (note that the working string below also drops replicaSet=rs0, so the driver won't try to reach the replica-set member hostnames directly, which isn't possible through the tunnel):
const connstring = `mongodb://localhost:27017/${database}`;
const certFile = path.resolve(__dirname, './rds-combined-ca-bundle.pem');
const certFileBuf = fs.readFileSync(certFile);
mongoose.connect(connstring, {
  tls: true,
  tlsCAFile: certFile,
  useNewUrlParser: true,
  tlsAllowInvalidHostnames: true,
  auth: {
    username,
    password
  }
});

Redis sentinel connection is timing out from nodeJS

I am trying to connect to a Redis Sentinel instance from Node.js using ioredis. I am not able to connect despite trying multiple of the available options. We have not configured a Sentinel password, but I am able to connect to the same Redis Sentinel instance from .NET Core using StackExchange.Redis. Please find the Node.js code below:
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import IORedis from 'ioredis';
async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  const ioredis = new IORedis({
    sentinels: [
      { host: 'sentinel-host-1' },
      { host: 'sentinel-host-2' },
      { host: 'sentinel-host-3' },
    ],
    name: 'mastername',
    password: 'password',
    showFriendlyErrorStack: true,
  });
  try {
    await ioredis.set('foo', 'bar');
  } catch (exception) {
    console.log(exception);
  }
  await app.listen(3000);
}
bootstrap();
Error we got is,
[ioredis] Unhandled error event: Error: connect ETIMEDOUT
node_modules\ioredis\built\redis\index.js:317:37)
at Object.onceWrapper (node:events:475:28)
at Socket.emit (node:events:369:20)
at Socket._onTimeout (node:net:481:8)
at listOnTimeout (node:internal/timers:557:17)
at processTimers (node:internal/timers:500:7)
Connection String used from .net core is below,
Redis_Configuration = "host-1,host-2,host-3,serviceName=mastername,password=password,abortConnect=False,connectTimeout=1000,responseTimeout=1000";
Answering this for the benefit of others. Everything is fine, but this Node.js package resolves the Redis instances to private IPs, which I cannot access from my local machine, so I had to put it behind a subnet group to make it work. FYI, the .NET Core package does not resolve to private IPs, hence I was able to access the instances from my local machine directly.
"The arguments passed to the constructor are different from the ones you use to connect to a single node"
Try replacing password with sentinelPassword.

Managed DigitalOcean Redis instance giving Redis AbortError

I set up managed Redis and managed Postgres on DigitalOcean. DigitalOcean gave me a .crt file; I don't know what to do with it, so I didn't do anything with it. Could this be the root of the problem below?
Or do I have to allow the Docker container to reach outside of the container over the rediss protocol?
I dockerized a Node app and then put the container onto my droplet. My droplet and managed Redis and Postgres are in the same region (SFO2). It connects to Redis using this URL:
url: 'rediss://default:REMOVED_THIS_PASSWORD@my-new-app-sfo2-do-user-5053627-0.db.ondigitalocean.com:25061/0',
I then ran my Docker container with docker run.
It then gives me error:
node_redis: WARNING: You passed "rediss" as protocol instead of the "redis" protocol!
events.js:186
throw er; // Unhandled 'error' event
^
AbortError: Connection forcefully ended and command aborted. It might have been processed.
at RedisClient.flush_and_error (/opt/apps/mynewapp/node_modules/redis/index.js:362:23)
at RedisClient.end (/opt/apps/mynewapp/node_modules/redis/lib/extendedApi.js:52:14)
at RedisClient.onPreConnectionEnd (/opt/apps/mynewapp/node_modules/machinepack-redis/machines/get-connection.js:157:14)
at RedisClient.emit (events.js:209:13)
at RedisClient.connection_gone (/opt/apps/mynewapp/node_modules/redis/index.js:590:14)
at Socket.<anonymous> (/opt/apps/mynewapp/node_modules/redis/index.js:293:14)
at Object.onceWrapper (events.js:298:28)
at Socket.emit (events.js:214:15)
at endReadableNT (_stream_readable.js:1178:12)
at processTicksAndRejections (internal/process/task_queues.js:80:21)
Emitted 'error' event on RedisClient instance at:
at /opt/apps/mynewapp/node_modules/redis/index.js:310:22
at Object.callbackOrEmit [as callback_or_emit] (/opt/apps/mynewapp/node_modules/redis/lib/utils.js:89:9)
at Command.callback (/opt/apps/mynewapp/node_modules/redis/lib/individualCommands.js:199:15)
at RedisClient.flush_and_error (/opt/apps/mynewapp/node_modules/redis/index.js:374:29)
at RedisClient.end (/opt/apps/mynewapp/node_modules/redis/lib/extendedApi.js:52:14)
[... lines matching original stack trace ...]
at processTicksAndRejections (internal/process/task_queues.js:80:21) {
code: 'NR_CLOSED',
command: 'AUTH',
args: [ 'REMOVED_I_DONT_KNOW_IF_THIS_IS_SENSITIVE' ]
The redis protocol is different from rediss: the latter uses a TLS connection. DigitalOcean Managed Redis requires connections to be made over TLS, so you have to use rediss. However, I couldn't find any info about the TLS certificate DigitalOcean provides for connecting to the Managed Redis service.
Based on your error message, I presume you're using the redis package. If that's the case, you can pass an empty TLS object in the options alongside the connection string, like so:
const Redis = require('redis')
const host = 'db-redis.db.ondigitalocean.com'
const port = '25061'
const username = 'user'
const password = 'secret'
const url = `${username}:${password}@${host}:${port}`
const client = Redis.createClient(url, {tls: {}})
Further reading/source:
SSL connections arrive for Redis on Compose
Connecting to IBM Cloud Databases for Redis from Node.js
I solved this. Below are snippets from config/env/production.js
Sockets
For sockets, to enable rediss you have to pass all the options in through adapterOptions, like this:
sockets: {
  onlyAllowOrigins: ['https://my-website.com'],
  // Pass these in as adapterOptions so they reach redis-adapter.
  // "rediss" URLs are not supported here and raise an error, so instead
  // pass an empty `tls` object. The options are moved into `adapterOptions`
  // here: https://github.com/balderdashy/sails-hook-sockets/blob/master/lib/configure.js#L128
  adapterOptions: {
    user: 'username',
    pass: 'password',
    host: 'host',
    port: 9999,
    db: 2, // pick a number
    tls: {},
  },
  adapter: '@sailshq/socket.io-redis',
},
Session
For session, pass an empty tls: {} object to the config:
session: {
  pass: 'password',
  host: 'host',
  port: 9999,
  db: 1, // pick a number not used by sockets
  tls: {},
  cookie: {
    secure: true,
    maxAge: 24 * 60 * 60 * 1000, // 24 hours
  },
},

Unable to retrieve data from PG DB (in Azure) using Sequelize

I am unable to retrieve data from a PG DB resource hosted in Azure. I am using Sequelize and Node.
I am able to connect to the DB hosted in Azure using the terminal and a GUI, and I can create a new DB with a table and some prepopulated fields as a proof of concept.
However, when I try to connect in my local and get the data, I get an empty array response ([ ]). If I hit the same endpoint in production, I get a 502 (after a while) with the following message displayed on the client:
Server Error.
There was an unexpected error in the request processing.
Some code below (it works with my local db configured the same way):
This is my DB config:
'use strict';
var Sequelize = require('sequelize');
var cfg = require('../config');
var sequelize = new Sequelize(cfg.db, cfg.username, cfg.password, {
  define: {
    timestamps: false
  },
  host: cfg.host,
  dialect: 'postgres',
  port: 5432
});
And this is my router code:
'use strict';
const express = require('express');
const router = express.Router();
var User = require('../../models/users-model');
router.get('/', (req, res) => {
  User.findAll().then(user => {
    res.json(user);
  });
});
module.exports = router;
Both locally and in prod I expect to get a JSON response with an array of User objects.
Locally, as explained, I get an empty array.
In production, as mentioned, it seems to time out and I finally get a 502 error response.
Any help is much appreciated!
Update: I managed to activate the app logs on Azure (it took me a while to find, as I'm quite new to the platform!) and now get this when I hit the endpoint in prod:
2019-08-12T12:52:06.355595892Z Unhandled rejection SequelizeConnectionRefusedError: connect ECONNREFUSED 127.0.0.1:5432
2019-08-12T12:52:06.355632393Z at connection.connect.err (/usr/src/app/server/node_modules/sequelize/lib/dialects/postgres/connection-manager.js:170:24)
2019-08-12T12:52:06.355637793Z at Connection.connectingErrorHandler (/usr/src/app/server/node_modules/pg/lib/client.js:191:14)
2019-08-12T12:52:06.355641493Z at emitOne (events.js:116:13)
2019-08-12T12:52:06.355645293Z at Connection.emit (events.js:211:7)
2019-08-12T12:52:06.355648693Z at Socket.reportStreamError (/usr/src/app/server/node_modules/pg/lib/connection.js:72:10)
2019-08-12T12:52:06.355652093Z at emitOne (events.js:116:13)
2019-08-12T12:52:06.355655393Z at Socket.emit (events.js:211:7)
2019-08-12T12:52:06.355658393Z at emitErrorNT (internal/streams/destroy.js:64:8)
2019-08-12T12:52:06.355661493Z at _combinedTickCallback (internal/process/next_tick.js:138:11)
2019-08-12T12:52:06.355664693Z at process._tickCallback (internal/process/next_tick.js:180:9)
After hours and hours, I hardcoded the data rather than getting it dynamically from my config files; probably I did not set up my Dockerfile properly and was not setting the ENV variable correctly.
Now I hit the PROD DB from my local machine and it seems to work! I would really appreciate it if someone could confirm that my problem lies at the configuration level with the NODE_ENV environment variable.
Dockerfile
# Node server serving Angular App
FROM node:8.11-alpine as node-server
WORKDIR /usr/src/app
COPY /server /usr/src/app/server
WORKDIR /usr/src/app/server
ENV NODE_ENV=prod
RUN npm install --production --silent
EXPOSE 80 443
CMD ["node", "index.js"]
Then in /config/index.js I have:
var env = process.env.NODE_ENV || 'global'
, cfg = require('./config.' + env);
module.exports = cfg;
So I understand that by setting NODE_ENV to prod in Docker, when the Node app starts on Azure it should pick up the config.prod.js file rather than the config.global.js file, right?
You can see how I implement this in the db.js file in the question.

MongoDB Auth Fails to find username on Bitnami MEAN Stack Image

I'm trying to run a web app (MEAN) on an Amazon EC2 instance but am encountering the following problem. Can anyone help me with this?
node app.js The Server has started on 9091
/opt/bitnami/apps/YelpCamp/node_modules/mongodb-core/lib/auth/scram.js:128
username = username.replace('=', "=3D").replace(',', '=2C');
^
TypeError: Cannot read property 'replace' of undefined
at executeScram (/opt/bitnami/apps/SomeApp/node_modules/mongodb-core/lib/auth/scram.js:128:24)
at /opt/bitnami/apps/SomeApp/node_modules/mongodb-core/lib/auth/scram.js:277:7
at _combinedTickCallback (internal/process/next_tick.js:73:7)
at process._tickCallback (internal/process/next_tick.js:104:9)
Mongoose can do auth in two ways:
1. Connection string:
mongoose.connect('mongodb://username:password@host:port/db')
where username and password are the credentials for that specific db, host is the host where your db is hosted (localhost or some domain/IP), port is the port mongo listens on (usually 27017), and db is the name of the db you want to connect to.
2. Using options. From the docs:
var options = {
  useMongoClient: true,
  auth: { authdb: 'admin' },
  user: 'myUsername',
  pass: 'myPassword',
};
mongoose.connect(uri, options);
I also faced the 'username undefined' error with the first approach, but succeeded with the second.
Reference: https://github.com/Automattic/mongoose/issues/4891
