Keep on getting MongoNetworkError connection 6 to xx.x.xx.xx:xxxxx closed - node.js

I keep getting the error below in an AWS Lambda function running Node.js 16 with the MongoDB v4 driver. It usually happens on a Lambda with high traffic; other Lambdas seem fine with the current setup.
MongoNetworkError: connection 6 to xx.x.xx.xx:xxxxx closed at Connection.onClose (/var/task/node_modules/mongodb/lib/cmap/connection.js:135:19)
MongoDB connection inside the lambda:
const MongoClient = require('mongodb').MongoClient;
const logger = require(''); // logger module path elided in the post
const log = logger(__filename);

const getDbClient = async (uri) => {
  try {
    log.info('Connecting to Mongo client...');
    const dbClient = await MongoClient.connect(uri);
    log.info('Connected to Mongo client');
    return dbClient;
  } catch (err) {
    log.error('Error encountered connecting to database: ', err);
    throw err;
  }
};

module.exports = {
  getDbClient
};
The MongoDB URI includes the option maxPoolSize=10, because I recently upgraded from driver v3 to v4: v4 defaults maxPoolSize to 100, while v3 defaulted it to 10. https://github.com/mongodb/node-mongodb-native/blob/HEAD/etc/notes/CHANGES_4.0.0.md#connection-pool-options
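For reference, the option rides on the connection string itself; the host, credentials, and database name below are placeholders:

mongodb+srv://user:password@cluster0.example.mongodb.net/mydb?retryWrites=true&w=majority&maxPoolSize=10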
MongoDB hardware:
3 x M4.XLarge (4 cores / 16 GB RAM)
This issue started after I upgraded the MongoDB driver from v3 to v4 and stopped checking inside the Lambda whether an existing connection could be reused, because in v4 this is apparently handled automatically.
In v3 I used MongoClient.isConnected() for that check.
Does anyone have an idea what could be causing this?

I don't have enough reputation to comment, so I'll just post an answer :D
I think your Lambda function is under extremely high traffic now, high enough to exceed MongoDB's connection pool size.
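Not part of the original thread, but a common mitigation for exactly this symptom (a sketch, assuming the v4 driver from the question): cache the client in module scope so warm Lambda invocations reuse one pool instead of opening a fresh connection per invocation.

const { MongoClient } = require('mongodb');

let cachedClient; // module scope: survives warm invocations of the same container

const getDbClient = async (uri) => {
  if (cachedClient) {
    return cachedClient; // reuse the existing pool
  }
  // maxPoolSize mirrors the maxPoolSize=10 URI option from the question.
  cachedClient = await MongoClient.connect(uri, { maxPoolSize: 10 });
  return cachedClient;
};

module.exports = { getDbClient };

If every invocation builds its own client, high concurrency multiplies connections well past whatever pool size a single client is limited to.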

Related

How can I verify I don't need the mLab add-on for my Heroku node.js app?

After reading through the mLab -> Atlas migration plan a few times, I decided I'd try a different way. My coding background is mainly asm on mcs51, so I'm something of a n00b in the node.js/mongo/heroku world. I barely understood half of the migration process.
So I wrote a small test app following this blog entry and then used what I'd learned to modify my actual app to talk to Atlas directly. I exported the collections from the old db to JSON, then imported them into the Atlas version to recreate the database. Everything appears to be working correctly; I don't see any data going into the old db and it looks like the new Atlas db is getting all the action.
But I'm leery of deleting the mLab add-on from Heroku until I've verified that it's truly not needed any more, because I'm pretty sure that I won't be able to recreate it if it turns out I've missed something.
So my question is, how can I ensure I'm no longer using the mLab add-on? I don't really understand what it was doing for me in the first place so I'm not sure how to verify I'm not using it any more.
Here are the relevant code snippets I'm using to access the Atlas db...
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/encodeURIComponent
function myEncode(str) {
  return encodeURIComponent(str).replace(/[!'()*]/g, function(c) {
    return '%' + c.charCodeAt(0).toString(16);
  });
}

const ATLASURI = process.env.ATLASURI;
const ATLASDB = process.env.ATLASDB;
const ATLASUSER = process.env.ATLASUSER;
const ATLASPW = myEncode(process.env.ATLASPW); // wrapper needed to handle strong passwords...

// "@" separates the credentials from the host in the connection string.
const dbURL = "mongodb+srv://" + ATLASUSER + ":" + ATLASPW + "@" + ATLASURI + "/" + ATLASDB + "?retryWrites=true&w=majority";

var GoogleStrategy = require('passport-google-oauth20').Strategy;
const { MongoClient } = require('mongodb');
const client = new MongoClient(dbURL, { useNewUrlParser: true, useUnifiedTopology: true });
// MongoDBStore is assumed to come from e.g. connect-mongodb-session, required elsewhere.
var store = new MongoDBStore({ uri: dbURL, collection: 'Sessions' });

var db = undefined;
client.connect(function(err) {
  if (err) { console.log("Error:\n" + String(err)); }
  db = client.db(ATLASDB); // client.db() is synchronous; no await needed
  console.log("Connected to db!");
  banner();
});
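Not an answer from the thread, but one way to approach the verification (a sketch; the env-var names come from the post, and the mLab check is an assumption, since the mLab Heroku add-on typically provisions its own config var such as MONGODB_URI):

// Log which backend each configured URI actually points at.
const mlabUri = process.env.MONGODB_URI;  // typically set by the mLab add-on
const atlasHost = process.env.ATLASURI;   // the Atlas cluster host from the post

console.log('mLab config var present:', Boolean(mlabUri));
if (mlabUri) {
  console.log('still pointing at mLab:', /mlab/i.test(mlabUri));
}
console.log('app connects to:', atlasHost);

If nothing in the codebase reads the add-on's config var (grep for it before deleting), the add-on is probably safe to remove; exporting one last backup of the old database first costs nothing.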

MongoDB queries are taking 2-3 seconds from Node.js app on Heroku

I am having major performance problems with MongoDB. Simple find() queries are sometimes taking 2,000-3,000 ms to complete in a database with fewer than 100 documents.
I am seeing this both with a MongoDB Atlas M10 instance and with a cluster I set up on Digital Ocean on VMs with 4 GB of RAM. When I restart my Node.js app on Heroku, the queries perform well (less than 100 ms) for 10-15 minutes, but then they slow down.
Am I connecting to MongoDB incorrectly, or querying incorrectly from Node.js? Please see my application code below. Or is this a lack of hardware resources in a shared VM environment?
Any help will be greatly appreciated. I've done all the troubleshooting I know how to do, with explain() and the mongo shell.
var Koa = require('koa'); //v2.4.1
var Router = require('koa-router'); //v7.3.0
var MongoClient = require('mongodb').MongoClient; //v3.1.3

var app = new Koa();
var router = new Router();
app.use(router.routes());

// Connect to MongoDB
async function connect() {
  try {
    var client = await MongoClient.connect(process.env.MONGODB_URI, {
      readConcern: { level: 'local' }
    });
    var db = client.db(process.env.MONGODB_DATABASE);
    return db;
  } catch (error) {
    console.log(error);
  }
}

// Add MongoDB to Koa's ctx object
connect().then(db => {
  app.context.db = db;
});

// Get the company's documents from a collection
router.get('/documents/:collection', async (ctx) => {
  try {
    var query = { company_id: ctx.state.session.company_id };
    var res = await ctx.db.collection(ctx.params.collection).find(query).toArray();
    ctx.body = { ok: true, docs: res };
  } catch (error) {
    ctx.status = 500;
    ctx.body = { ok: false };
  }
});

app.listen(process.env.PORT || 3000);
UPDATE
I am using MongoDB Change Streams and standard Server Sent Events to provide real-time updates to the application UI. I turned these off and now MongoDB appears to be performing well again.
Are MongoDB Change Streams known to impact read/write performance?
Change Streams do indeed affect the performance of your server, as noted in this SO question.
As mentioned in the accepted answer there,
The default connection pool size in the Node.js client for MongoDB is 5. Since each change stream cursor opens a new connection, the connection pool needs to be at least as large as the number of cursors.
const mongoConnection = await MongoClient.connect(URL, {poolSize: 100});
(Thanks to MongoDB Inc. for investigating this issue.)
You need to increase your pool size to get back your normal performance.
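As an illustration (my own sketch, using the v3-era poolSize option that matches the driver version in the question), sizing the pool to cover each open change stream plus ordinary queries could look like this:

const MongoClient = require('mongodb').MongoClient;

async function main() {
  // Each open change stream cursor holds one connection from the pool,
  // so the pool must cover all streams plus normal queries.
  const client = await MongoClient.connect(process.env.MONGODB_URI, {
    poolSize: 100 // v3 driver option; the v4+ driver renames it maxPoolSize
  });
  const db = client.db(process.env.MONGODB_DATABASE);

  const stream = db.collection('documents').watch();
  stream.on('change', change => {
    console.log('change event:', change.operationType);
  });
}

main().catch(console.error);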
I'd suggest you do more logging work. Queries that slow down a while after a restart might point to something worse than you think.
For a modern database/web app running on normal hardware, it's not easy to run into performance issues if you are doing things right. There might be a memory leak, some other unreleased resource, or network congestion.
IMHO, you might want to determine first whether it's a network problem. You can do that by enabling the slow query log on MongoDB and logging in your code where each query begins and ends.
If the network is fine and you see no slow queries on MongoDB, then something is going wrong inside your own application, and detailed logging should help you pin down where queries become slow.
Hope this helps.
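To make that logging concrete (a sketch of my own, not from the answer): timestamp both sides of each query in the app, and optionally use MongoDB's profiler to record anything slower than a threshold on the server side.

// Wrap a query with timing logs to separate app-side latency from DB latency.
async function timedFind(db, collectionName, query) {
  const started = Date.now();
  console.log('find on ' + collectionName + ' started at ' + new Date(started).toISOString());
  const docs = await db.collection(collectionName).find(query).toArray();
  console.log('find on ' + collectionName + ' took ' + (Date.now() - started) + ' ms');
  return docs;
}

// Server side: the standard "profile" command logs slow operations to
// system.profile; the 100 ms threshold here is an arbitrary choice.
// await db.command({ profile: 1, slowms: 100 });

Comparing the app-side duration against the server's slow-query log tells you whether the time is spent in MongoDB or somewhere in between.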

Share mongoDB instance in multiple modules using Express [duplicate]

This question already has answers here:
Setting up singleton connection with node.js and mongo
(5 answers)
Closed 5 years ago.
I'm new to Express. I'm currently working on a MongoDB connection and CRUD operations. I created a connection in connection.js and get the database instance in app.js. I need to share the same DB instance with register, login, and some other modules without calling connection.js again in each of them. In other words, what is the proper way to reuse a connection across multiple modules in Express?
Thanks in advance.
You can set up a db module like so:
// db.js
const assert = require("assert");
const { MongoClient } = require("mongodb");

let _db;

module.exports = {
  getDb,
  initDb
};

function initDb(callback) {
  // Connect to the db and set _db to the new connection.
  // (Reading the connection string from MONGODB_URI is an assumption.)
  MongoClient.connect(process.env.MONGODB_URI, function(err, client) {
    if (err) {
      return callback(err);
    }
    _db = client.db();
    callback(null, _db);
  });
}

function getDb() {
  assert.ok(_db, "Db has not been initialized. Please call initDb first.");
  return _db;
}
Then somewhere in your app initialization you can do:
// app.js
const dbModule = require("./db.js");

dbModule.initDb(function(err) {
  // more app init;
});
Then in your other modules, say login.js, do:
// login.js
const dbModule = require("./db");
const db = dbModule.getDb(); // at this point the db should have been initialized

Connection to PSQL RDS instance via a lambda function?

I'm using AWS and trying to connect to my PSQL RDS instance when the Lambda function runs. I'm using the pg npm module, and this is my code:
exports.handler = (event, context, callback) => {
  "use strict";
  const pg = require('pg');
  const connectionStr = "dbstr";
  var client = new pg.Client(connectionStr);
  client.connect(function(err) {
    if (err) {
      return callback(err); // return so the success callback below doesn't also fire
    }
    callback(null, 'Connection established');
  });
};
I've been researching for ages how to do this, but I can't really find anything specific. I've added an IAM role that allows VPC access for my Lambda, as the AWS tutorial describes, and I've even allowed all traffic in my VPC security group, but I still keep getting timeout errors like this:
"errorMessage": "2017-01-22T16:11:21.969Z 544e7fc4-e0bd-11e6-87e6-071c13fc2fc8 Task timed out after 30.00 seconds"
I've tested my function locally and it connects to the DB and does what I want just fine, but the Lambda doesn't, and I'm not too sure why.
Any ideas would be greatly appreciated!
Never mind, I've just solved it. Adding:
context.callbackWaitsForEmptyEventLoop = false;
to your Lambda function fixed it for me.
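For anyone landing here later, this is roughly where that line goes (my sketch of the asker's handler, not their final code):

exports.handler = (event, context, callback) => {
  // Return as soon as callback() fires instead of waiting for the event
  // loop to drain (an open pg connection would otherwise keep the Lambda
  // alive until it times out).
  context.callbackWaitsForEmptyEventLoop = false;

  const pg = require('pg');
  const client = new pg.Client("dbstr"); // connection string elided in the post

  client.connect(function(err) {
    if (err) {
      return callback(err);
    }
    callback(null, 'Connection established');
  });
};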

mongoose connection as a separate module in nodejs app

In my project I want to make a separate module for the mongoose connection, say connection.js:
var mongoose = require('mongoose');
mongoose.connect('mongodb://host:port/db');

// When the connection is open
mongoose.connection.on('connected', function () {
  console.log('Mongoose default connection open to localhost:27017');
});

// If the connection throws an error
mongoose.connection.on('error', function (err) {
  console.log('Mongoose default connection error: ' + err);
});

// When the connection is disconnected
mongoose.connection.on('disconnected', function () {
  console.log('Mongoose default connection disconnected');
});

module.exports = mongoose;
which I can import using require in another file, say file1.js, whenever necessary:
var connect_to_mongoose = require('./connection');
But I have come across a problem: since I/O in Node.js is async, how can I make sure the connection is successful before I use the connect_to_mongoose variable for queries, insertions, deletions, etc.?
My second question: after handling the above scenario, how can I manage multiple connections for multiple databases? Because, as far as I know (for practical reasons), in Mongoose one connection is dedicated to one DB only.
I think you should consider a few scenarios while working with MongoDB and Mongoose:
1. mongoose.connect opens a default connection as soon as the app starts.
2. You don't have to create a new connection every time if you are dealing with multiple tables / collections (whatever you call them).
3. If you are dealing with multiple databases, then you have to separate your MongoDB URLs, like mongoose.connect('mongodb://localhost/db1') and mongoose.connect('mongodb://localhost/db2').
But point no. 3 above would give you a warning: Trying to close an open connection.
To solve that, just use the following:
var db = mongoose.createConnection('mongodb://localhost/db1');
And close the connection after all your tasks are completed.
Cheers :)
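To make both halves of the question concrete (a minimal sketch of my own; the database names are placeholders): createConnection gives you one connection per database, each with its own events, and models bind to the connection they were created on.

// connections.js -- one createConnection per database
var mongoose = require('mongoose');

var db1 = mongoose.createConnection('mongodb://localhost/db1');
var db2 = mongoose.createConnection('mongodb://localhost/db2');

// Gate your first queries on the 'connected' event (or handle 'error').
db1.on('connected', function () { console.log('db1 connected'); });
db1.on('error', function (err) { console.log('db1 error: ' + err); });

// Models are bound to the connection they were created on.
var User = db1.model('User', new mongoose.Schema({ name: String }));
var Audit = db2.model('Audit', new mongoose.Schema({ msg: String }));

module.exports = { db1: db1, db2: db2, User: User, Audit: Audit };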
