I installed a local MongoDB instance with password authentication using Docker and connected it to my Node.js backend. Everything works fine on my laptop. The problem is that when I put the Dockerized Mongo and my backend on a VPS, I get a weird error from the backend when testing the endpoints: MongoError: command find requires authentication
I tried to investigate, and at first I thought there was a problem with the Mongo config file, so I ran this command: db._adminCommand( {getCmdLineOpts: 1}) and got this output:
{
    "argv" : [
        "mongod",
        "--auth",
        "--bind_ip_all"
    ],
    "parsed" : {
        "net" : {
            "bindIp" : "*"
        },
        "security" : {
            "authorization" : "enabled"
        }
    },
    "ok" : 1
}
which shows that authorization is enabled.
Also, I got no errors from my backend that would indicate a connection problem. On the contrary, while running this server:
@Configuration({
  ...config,
  acceptMimes: ["application/json"],
  httpPort: process.env.PORT || 8083,
  httpsPort: false, // CHANGE
  mongoose: [
    {
      id: "mydb",
      url: "mongodb://127.0.0.1:27017/mydb",
      connectionOptions: {
        user: process.env.USER_MONGO_MYDB,
        pass: process.env.PASSWORD_MONGO_MYDB
      }
    }
  ],
  componentsScan: [
    `${rootDir}/protocols/*.ts` // scan protocols directory
  ],
  mount: {
    "/rest": [
      `${rootDir}/controllers/**/*.ts`
    ],
    "/": [IndexCtrl]
  },
  views: {
    root: `${rootDir}/../views`,
    viewEngine: "ejs"
  },
  exclude: [
    "**/*.spec.ts"
  ]
})
export class Server {
  @Inject()
  app: PlatformApplication;
  @Configuration()
  settings: Configuration;
}
I got a log indicating that the connection was successful:
[2021-06-18T16:07:42.391] [INFO ] [TSED] - Connect to mongo database: mydb
(node:18398) DeprecationWarning: current URL string parser is deprecated, and will be removed in a future version. To use the new parser, pass option { useNewUrlParser: true } to MongoClient.connect.
(node:18398) [MONGODB DRIVER] Warning: Current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
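As a side note, the two deprecation warnings look unrelated to the auth error; assuming @tsed/mongoose forwards connectionOptions to mongoose.connect, they could presumably be silenced by adding the options the warnings mention, e.g.:
mongoose: [
  {
    id: "mydb",
    url: "mongodb://127.0.0.1:27017/mydb",
    connectionOptions: {
      user: process.env.USER_MONGO_MYDB,
      pass: process.env.PASSWORD_MONGO_MYDB,
      useNewUrlParser: true,     // suggested by the first warning
      useUnifiedTopology: true   // suggested by the second warning
    }
  }
],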
I have no idea where to investigate or how to solve the problem. Does anyone have any recommendations?
I'll put my Dockerfile, docker-compose and the other config files I used for the Mongo instance here, in case they are of any help:
docker-compose.yml:
version: "3"
services:
  mongodb:
    build: .
    container_name: mongodb
    environment:
      MONGO_INITDB_DATABASE: mydb
      MONGO_INITDB_ROOT_USERNAME: "${MONGO_INITDB_ROOT_USERNAME}"
      MONGO_INITDB_ROOT_PASSWORD: "${MONGO_INITDB_ROOT_PASSWORD}"
    volumes:
      - ./database:/data/db
      - ./log/:/var/log/mongodb/
      - ./mongod.conf:/etc/mongod.conf
    ports:
      - 27017:27017
    restart: unless-stopped
Dockerfile:
FROM mongo
COPY docker-entrypoint.sh /usr/bin/
COPY seed-data.js /docker-entrypoint-initdb.d/
COPY .env /docker-entrypoint-initdb.d/
COPY mongod.conf /etc/mongod.conf
mongod.conf:
# mongod.conf
# for documentation of all options, see:
#   http://docs.mongodb.org/manual/reference/configuration-options/

# Where and how to store data.
storage:
  dbPath: /var/lib/mongodb
  journal:
    enabled: true
#  engine:
#  mmapv1:
#  wiredTiger:

# where to write logging data.
systemLog:
  destination: file
  logAppend: true
  path: /var/log/mongodb/mongod.log

# network interfaces
net:
  port: 27017
  bindIp: 127.0.0.1

# how the process runs
processManagement:
  timeZoneInfo: /usr/share/zoneinfo

# security settings including user password protection
security:
  authorization: enabled

#operationProfiling:
#replication:
#sharding:

## Enterprise-Only Options:
#auditLog:
#snmp:
seed-data.js:
db.createUser(
  {
    user: "user",
    pwd: "userpassword",
    roles: [ "readWrite", "dbAdmin" ]
  }
)
docker-entrypoint.sh:
if [ "$MONGO_INITDB_ROOT_USERNAME" ] && [ "$MONGO_INITDB_ROOT_PASSWORD" ]; then
    rootAuthDatabase='admin'
    "${mongo[@]}" "$rootAuthDatabase" <<-EOJS
        db.createUser({
            user: $(_js_escape "$MONGO_INITDB_ROOT_USERNAME"),
            pwd: $(_js_escape "$MONGO_INITDB_ROOT_PASSWORD"),
            roles: [ { role: 'root', db: $(_js_escape "$rootAuthDatabase") } ]
        })
EOJS
fi
I solved the problem, and it was a really stupid mistake: I forgot to put the .env file back after pulling the repo onto the VPS. The backend received undefined for process.env.USER_MONGO_MYDB and process.env.PASSWORD_MONGO_MYDB and did not throw any Mongo connection error!
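To catch this kind of mistake earlier, here is a minimal sketch (my own addition, assuming the same variable names) of a guard that fails fast when the credentials are missing instead of silently passing undefined to the driver:
// check-env.js - hypothetical startup guard, run before bootstrapping the server
const required = ["USER_MONGO_MYDB", "PASSWORD_MONGO_MYDB"];
const missing = required.filter((name) => !process.env[name]);

if (missing.length > 0) {
  // fail fast instead of connecting without credentials
  console.error(`Missing environment variables: ${missing.join(", ")}`);
  process.exit(1);
}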
Related
I am posting this here as I have run out of options.
I am trying to make a connection from my Node app to MongoDB.
I am getting AuthenticationFailed: SCRAM-SHA-1 authentication failed, storedKey mismatch
My local environment works 100%. I dumped my local MongoDB (the app database and admin too) into the docker container.
I created my docker-compose.yml as below:
version: "1.0"
services:
mongodb:
image: mongo:3.4.7
container_name: MongoDB2
restart: unless-stopped
ports:
- '27017:27017'
app:
links:
- mongodb
depends_on:
- mongodb
image: eamello/gsd:myCore
ports:
- '8087:8087'
stdin_open: true
tty: true
volumes:
db:
networks:
node-webapp-network:
driver: bridge
My config.json file, which has the database connection details:
"myCore": {
"database": {
"url": "mongodb://mongodb:27017/myCore",
"options": {
"db": {
"native_parser": true
},
"server": {
"poolSize": 100,
"socketOptions": {
"keepAlive": 1000,
"connectTimeoutMS": 30000
}
},
"replset": {},
"user": "myAdmin",
"pass": "/WnUU5Jqithypb9970AfIQ==",
"auth": {
"authdb": "admin"
},
"queryLevel":{
"common":{
"maxTimeMS": 15000
}
}
}
},
I am 100% sure the user created in my admin database has the same password.
I checked and rechecked several times.
I also tried to add my user via a JavaScript file... it looks like the JavaScript was never executed.
db.createUser(
  {
    user: "myAdmin",
    pwd: "/WnUU5Jqithypb9970AfIQ==",
    roles: [
      {
        role: "userAdmin", db: "myCore"
      },
      {
        role: "readWrite", db: "myCore"
      }
    ]
  }
);
Since I can manage my MongoDB via Compass, I left this JavaScript aside.
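For completeness, a quick sanity check (my own sketch, not something from my original attempts) would be to authenticate directly from the mongo shell inside the MongoDB2 container:
// run via: docker exec -it MongoDB2 mongo admin
// db.auth returns 1 on success; a storedKey mismatch should fail here as well
db.auth("myAdmin", "/WnUU5Jqithypb9970AfIQ==")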
Does anyone have any clue why I am getting AuthenticationFailed: SCRAM-SHA-1 authentication failed, storedKey mismatch?
I changed some names above as this is a company issue. Thanks.
I just created a new Laravel 9 project and I'm using Lando.
I have followed the instructions from this post: https://sinnbeck.dev/posts/getting-vite-and-laravel-to-work-with-lando
Currently, the site is working and I can update PHP, CSS or JS code.
But there is no live reloading, and I get an error in my console about a missing source map for the CSS at http://localhost:3009/resources/css/app.css.map
Here is a link to the project: https://github.com/CrunchyArtie/warene-next
Here is my vite.config.js file:
import { defineConfig } from 'vite';
import laravel from 'laravel-vite-plugin';

export default defineConfig({
    plugins: [
        laravel({
            input: ['resources/css/app.css', 'resources/js/app.js'],
            refresh: true,
        }),
    ],
    server: {
        https: false,
        host: true,
        port: 3009,
        hmr: {host: 'localhost', protocol: 'ws'},
    },
});
and my .lando.yml file:
name: warene
recipe: laravel
config:
  webroot: ./public
  php: '8.1'
  xdebug: true
services:
  node:
    type: node:16
    scanner: false
    ports:
      - 3009:3009
    build:
      - npm install
tooling:
  dev:
    service: node
    cmd: npm run dev
  build:
    service: node
    cmd: npm run build
EDIT:
With this vite.config.js, the live reloading works:
import { defineConfig } from 'vite';
import laravel from 'laravel-vite-plugin';

export default defineConfig({
    plugins: [
        laravel({
            input: ['resources/css/app.css', 'resources/js/app.js'],
            refresh: true,
        }),
    ],
    server: {
        https: false,
        host: true,
        strictPort: true,
        port: 3009,
        hmr: {host: 'localhost', protocol: 'ws'},
        watch: {
            usePolling: true,
        }
    },
});
With css.devSourcemap: true, a source map file is generated and used.
With server.watch.usePolling: true, Vite detects file changes inside the Lando environment.
This is my vite.config.js file:
import { defineConfig } from 'vite';
import laravel from 'laravel-vite-plugin';

export default defineConfig({
    plugins: [
        laravel({
            input: ['resources/css/app.css', 'resources/js/app.js'],
            refresh: true,
        }),
    ],
    css: {
        devSourcemap: true,
    },
    server: {
        https: false,
        host: true,
        strictPort: true,
        port: 3009,
        hmr: {host: 'localhost', protocol: 'ws'},
        watch: {
            usePolling: true,
        }
    },
});
I'm looking for some help on how I can connect to MongoDB from Node.js when the two run in different containers.
I have three services set up in my docker compose:
webserver (irrelevant to question)
nodeJs
mongo database
The nodejs container is essentially an api which I can use to communicate with mongodb:
require('dotenv').config();
const express = require('express');
const cors = require('cors')
const app = express();

var MongoClient = require('mongodb').MongoClient;
var mongodb = require('mongodb');

app.use(express.json())
app.use(cors())

app.post('/api/fetch-items', (req, res) => {
    if (req.headers.apikey !== process.env.API_KEY) return res.sendStatus(401)

    // URL is in the format: mongodb://user:pwd@database:27017
    MongoClient.connect(process.env.MONGODB_URL, function(err, db) {
        if (err) return res.status(500).send(err);
        var dbo = db.db("db");
        dbo.collection("col").find({}).toArray(function(err, result) {
            if (err) return res.status(500).send(err);
            db.close();
            return res.status(200).send(result);
        });
    });
})

app.listen(4000)
This all works perfectly fine if I run node as a standalone container (not using docker-compose) and use localhost in the URL.
However, when I use the image in docker-compose I receive the response:
{
"name": "MongoNetworkError"
}
when sending a request to the API.
I am currently using the hostname 'database' in the URL and this does not work. I have also tried using localhost.
There are also no errors as a result of the command node server.
If needed my Dockerfile for the node server is:
FROM node:10-alpine
RUN mkdir -p /home/node/app/node_modules && chown -R node:node /home/node/app
WORKDIR /home/node/app
COPY package*.json ./
RUN chown node:node ./package*.json
USER node
RUN npm install
COPY --chown=node:node . .
EXPOSE 4000
CMD [ "node", "server" ]
My docker-compose.yml file:
version: "3.1"
services:
mongodb:
image: mongo
restart: always
container_name: database
ports:
- 27017:27017
environment:
MONGO_INITDB_ROOT_USERNAME: root
MONGO_INITDB_ROOT_PASSWORD: xxxxxxxx
# Web server stuff
node:
image: created-node-server
container_name: node
ports:
- 4000:4000
Finally, the output of docker network inspect:
[
    {
        "Name": "network_default",
        "Id": "3e51a90a23f2785cfc405243ad4c73991852f52826fd1cd0b14da5d4eaa180e4",
        "Created": "2021-01-12T01:07:42.656013002Z",
        "Scope": "local",
        "Driver": "bridge",
        "EnableIPv6": false,
        "IPAM": {
            "Driver": "default",
            "Options": null,
            "Config": [
                {
                    "Subnet": "172.23.0.0/16",
                    "Gateway": "172.23.0.1"
                }
            ]
        },
        "Internal": false,
        "Attachable": true,
        "Ingress": false,
        "ConfigFrom": {
            "Network": ""
        },
        "ConfigOnly": false,
        "Containers": {
            "418876a06c3f8fa430804ae77c66cca986a49dbc88374266346463f7f448baa7": {
                "Name": "database",
                "EndpointID": "ac08c5a439edd43e612723d269714e9dfbae29dbdb50790b61c66207287d70c8",
                "MacAddress": "02:42:ac:17:00:04",
                "IPv4Address": "172.23.0.4/16",
                "IPv6Address": ""
            },
            "7b6dcbb8f76618575c988a026ac0308075a116f79a2e58d8a146e33fb5d7674c": {
                "Name": "node",
                "EndpointID": "e6beb412a2fe97ae7d04d2484a7ca3634bfa37c82680becc412d1f44502da72f",
                "MacAddress": "02:42:ac:17:00:03",
                "IPv4Address": "172.23.0.3/16",
                "IPv6Address": ""
            },
            "f2ea250bccdb2c6a0c4d7818912ddbf29196eff072dad699e8dbcef466cd38a3": {
                "Name": "webserver",
                "EndpointID": "f6617aab4001032069e68300c5303fa730f3458e2fe0092ace45a9f67e16d7c5",
                "MacAddress": "02:42:ac:17:00:02",
                "IPv4Address": "172.23.0.2/16",
                "IPv6Address": ""
            }
        },
        "Options": {},
        "Labels": {
            "com.docker.compose.network": "default",
            "com.docker.compose.project": "proj",
            "com.docker.compose.version": "1.27.4"
        }
    }
]
Essentially, I am receiving the MongoNetworkError when trying to communicate with MongoDB through Node, both of which are Docker containers created using docker-compose.
I hope all of the above makes sense; sorry if it is a bit wordy, but I have tried to include as much info as possible. Comment if you need any more info.
Thanks :)
You just need to include an environment variable under the node service: MONGODB_URL=mongodb://database:27017
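For illustration, a minimal standalone sketch of the connection the node container would then make (the credentials and authSource=admin in the example URL are my assumption, matching the root user defined in the compose file):
// test-connection.js - hypothetical smoke test run inside the node container
var MongoClient = require('mongodb').MongoClient;

// e.g. MONGODB_URL=mongodb://root:xxxxxxxx@database:27017/?authSource=admin
var url = process.env.MONGODB_URL;

MongoClient.connect(url, { useNewUrlParser: true, useUnifiedTopology: true }, function (err, client) {
    if (err) return console.error('connection failed:', err.message);
    // mirror the query the API makes, just to confirm auth and networking work
    client.db('db').collection('col').find({}).toArray(function (err, result) {
        if (err) console.error('query failed:', err.message);
        else console.log('fetched', result.length, 'documents');
        client.close();
    });
});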
I'm trying to write a wrapper around pm2:
pm2.start(
  {
    apps: [
      {
        script: 'app.js',
        path: 'remote/path',
        name: 'App',
        autorestart: false,
        host: [123.456.789],
        username: 'root',
      },
    ],
  },
  err => {
    if (err) throw err
  },
)
It still seems to be trying to run app.js on my local machine, and not the host 123.456.789 - any idea what's going on?
host should be an array of strings: host: ['123.456.789'].
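For reference, a sketch of the original call with only that change applied:
pm2.start(
  {
    apps: [
      {
        script: 'app.js',
        path: 'remote/path',
        name: 'App',
        autorestart: false,
        host: ['123.456.789'], // an array of strings, not bare numbers
        username: 'root',
      },
    ],
  },
  err => {
    if (err) throw err
  },
)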
Every time I write something to the logs, pm2 restarts and I get a Service Unavailable error on my webpage.
To write to the logs I use:
winston.info('some info');
and in the app.js I have this:
winston.add(winston.transports.File, { name: 'app-info', maxFiles: 3, filename: 'logs/app-info.log', level: 'info' });
How can I prevent pm2 from restarting every time I write to the log?
I solved it by ignoring the log files and disabling watch:
watch: 'false',
ignore_watch : [ "*.log"],
So the full code is:
module.exports = {
  apps : [{
    name: "API-XXX",
    script: "/var/www/api/api-xxx-xxx/server.js",
    env: {
      NODE_ENV: "production",
    },
    env_production: {
      NODE_ENV: "production",
    },
    watch: 'false',
    ignore_watch : [ "*.log"],
  }],

  deploy : {
    production : {
      user : 'root',
      host : 'localhost',
      ref : 'origin/master',
      repo : 'https://github.com/yogithesymbian/api-xxx-nodejs.git',
      path : '/var/www/api/api-xxx-nodejs/',
      'pre-deploy-local': '',
      'post-deploy' : 'npm install && pm2 reload ecosystem.config.js --env production',
      'pre-setup': ''
    }
  }
};
Then run:
pm2 restart ecosystem.config.js --env production
I forget which GitHub discussion it was, but someone there suggested adding these lines.
You can resolve this problem by using --no-autorestart, for example:
pm2 start app.js --no-autorestart