How to properly connect to MongoDB using Cloud Functions? - node.js

I would like to connect to my Atlas cluster only once per instance running Cloud Functions.
Here is my code for an instance:
const MongoClient = require("mongodb").MongoClient;
const client = new MongoClient("myUrl", {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});

exports.myHttpMethod = functions.region("europe-west1").runWith({
  memory: "128MB",
  timeoutSeconds: 20,
}).https.onCall((data, context) => {
  console.log("Data is: ", data);
  client.connect(() => {
    const testCollection = client.db("myDB").collection("test");
    testCollection.insertOne(data);
  });
});
And I would like to avoid calling client.connect() in every function invocation, which seems like far too much overhead.
I would like to do something like this:
const MongoClient = require("mongodb").MongoClient;
const client = await MongoClient.connect("myUrl", {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});
const db = client.db("myDB");

exports.myHttpMethod = functions.region("europe-west1").runWith({
  memory: "128MB",
  timeoutSeconds: 20,
}).https.onCall((data, context) => {
  console.log("Data is: ", data);
  const testCollection = db.collection("test");
  testCollection.insertOne(data);
});
But I can't use await at the top level like this.
In my AWS Lambda functions (written in Python) I don't have this issue and am able to connect only once per instance, so I guess there is an equivalent here, but I don't know much JS / Node.js.

You can store your database client as a global variable. From the documentation:
Cloud Functions often recycles the execution environment of a previous invocation. If you declare a variable in global scope, its value can be reused in subsequent invocations without having to be recomputed.
Try refactoring the code as shown below:
import * as functions from "firebase-functions";
import { MongoClient } from "mongodb";

let client: MongoClient | null = null;

const getClient = async () => {
  if (!client) {
    const mClient = new MongoClient("[MONGODB_URI]", {});
    client = await mClient.connect();
    functions.logger.log("Connected to MongoDB");
  } else {
    functions.logger.log("Using existing MongoDB connection");
  }
  functions.logger.log("Returning client");
  return client;
};

export const helloWorld = functions.https.onRequest(
  async (request, response) => {
    const db = (await getClient()).db("[DATABASE]");
    const result = await db.collection("[COLLECTION]").findOne({});
    response.send("Hello from Firebase!");
  }
);
This should reuse the connection for that instance.
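For reference, the question's CommonJS setup can use the same pattern in plain JavaScript. This is a minimal sketch reusing the placeholder "myUrl", "myDB", and "test" names from the question:
const functions = require("firebase-functions");
const { MongoClient } = require("mongodb");

// Cached per instance; reused by later invocations on a warm instance
let client = null;

const getClient = async () => {
  if (!client) {
    // First invocation on this instance: connect once and cache the client
    client = await new MongoClient("myUrl").connect();
  }
  return client;
};

exports.myHttpMethod = functions
  .region("europe-west1")
  .runWith({ memory: "128MB", timeoutSeconds: 20 })
  .https.onCall(async (data, context) => {
    const testCollection = (await getClient()).db("myDB").collection("test");
    await testCollection.insertOne(data);
  });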

Related

Node.js and MongoAtlas - How can I connect to multiple databases in the same application?

I'm writing a Node.js CLI in which I have to read from one Mongo Atlas DB and write to another Mongo Atlas DB. I'll be reading documents from one DB and writing equivalent documents to the other DB, one document at a time. I have two separate connection files like this:
ReadDB.js:
require('dotenv').config();
const mongoose = require('mongoose');

const read_db_url = process.env.READDB_URI;

const readDB = async () => {
  try {
    await mongoose.connect(read_db_url, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
      dbName: "dbProd"
    });
  } catch (err) {
    console.error(err);
  }
}

module.exports = readDB
WriteDB.js:
require('dotenv').config();
const mongoose = require('mongoose');

const write_db_url = process.env.WRITEDB_URI;

const writeDB = async () => {
  try {
    await mongoose.connect(write_db_url, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
      dbName: "dbQA"
    });
  } catch (err) {
    console.error(err);
  }
}

module.exports = writeDB
This is what I have so far for the main application (cli.js):
cli.js:
require('dotenv').config();
const mongoose = require('mongoose');
const connectReadDB = require('./ReadDB.js');
const connectWriteDB = require('./WriteDB.js');

connectReadDB();
connectWriteDB();

const findProduct = async (productId) => {
  products = await Products.find({ _id: productId });
}
I guess my confusion is how Node.js will know which DB to read from to begin with. Will I need a separate set of models, one for reading and one for writing? How can I establish two simultaneous connections in the same Node.js app?
Mongoose handles connections via a connection pool: http://mongoosejs.com/docs/connections.html
You can use the server: {poolSize: 5} option to increase/decrease the pool size (the number of parallel connections).
If you need connections to different databases, look here: Mongoose and multiple database in single node.js project
Example of multiple connections:
const mongoose = require('mongoose')

const connection = mongoose.createConnection('mongodb://localhost/db1');
const connection2 = mongoose.createConnection('mongodb://localhost/db2');

const Schema = new mongoose.Schema({})

const model1 = connection.model('User', Schema);
const model2 = connection2.model('Item', Schema);

model1.find({}, function () {
  console.log("this will print out last");
});

model2.find({}, function () {
  console.log("this will print out first");
});
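Applied to the question's two-cluster setup, a minimal sketch could look like this (the Product schema, model names, and copyProduct helper are illustrative, not from the question):
require('dotenv').config();
const mongoose = require('mongoose');

// One independent connection per Atlas cluster
const readConn = mongoose.createConnection(process.env.READDB_URI, { dbName: 'dbProd' });
const writeConn = mongoose.createConnection(process.env.WRITEDB_URI, { dbName: 'dbQA' });

// Register the same schema on both connections; each model is bound to its own connection
const productSchema = new mongoose.Schema({}, { strict: false });
const ReadProduct = readConn.model('Product', productSchema);
const WriteProduct = writeConn.model('Product', productSchema);

// Read from one cluster, write the equivalent document to the other
const copyProduct = async (productId) => {
  const doc = await ReadProduct.findById(productId).lean();
  if (doc) await WriteProduct.create(doc);
};
So yes: you need one model per connection, and the connection a model was created from determines which database it reads from or writes to.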

node-media-server: session.reject() not working

I am trying to create an RTMP server with the npm package node-media-server (http://github.com/illuspas/Node-Media-Server). The server works fine, but I need to implement authentication in it. I am trying to check authentication on the "prePublish" event: I query the database and retrieve the user; if the user is found, they are allowed to stream, otherwise the session is rejected. But the problem is that the session doesn't stay rejected: the stream disconnects, then automatically reconnects, then disconnects again, and the loop goes on. How do I fix this problem?
Here is the code for the event:
const NodeMediaServer = require('node-media-server');
const config = require('./config').rtmp_server;
const db = require('./db');

const nms = new NodeMediaServer(config);

const getStreamKeyFromStreamPath = (path) => {
  const parts = path.split('/');
  return parts[parts.length - 1];
};

nms.on('prePublish', async (id, StreamPath, args) => {
  const session = nms.getSession(id);
  try {
    const streamKey = getStreamKeyFromStreamPath(StreamPath);
    const validStream = (
      await db.query('SELECT * FROM public."People" WHERE stream_key = $1', [streamKey])
    ).rows[0];
    console.log(validStream);
    if (validStream) {
      // do stuff
    } else {
      session.reject((reason) => {
        console.log(reason);
      });
    }
    console.log(
      '[NodeEvent on prePublish]',
      `id=${id} StreamPath=${StreamPath} args=${JSON.stringify(args)}`
    );
  } catch (err) {
    session.reject();
  }
});

module.exports = nms;
Here is the code of the entry point of the server:
require("dotenv").config();
const db = require("./db");
const nms = require("./nms");
// database connection
db.connect()
.then(() => {
console.log("Connected to database");
// start the rtmp server
nms.run();
})
.catch((err) => console.log(err.message));
Here is the db file:
const { Pool } = require('pg');

const connectionString = process.env.PG_CONNECTION_STRING;
const poolOptions = {
  host: process.env.PG_HOST,
  user: process.env.PG_USER,
  port: process.env.PG_PORT,
  password: process.env.PG_PASSWORD,
  database: process.env.PG_DATABASE,
};

// pg's Pool expects a config object, so the connection string goes in as a property
const pool = new Pool(process.env.NODE_ENV === 'production' ? { connectionString } : poolOptions);

module.exports = pool;
My attempts to solve the problem:
Instead of the async function, I tried to handle the database query using a callback, but it didn't work.
Before, I was calling session.reject() with no arguments; now I am passing a callback to it, but the behavior is still the same.
If you have any solution for that, please let me know.
Thanks in advance

How to use MongoDB change streams in Node.js to populate a new collection

I want to use MongoDB change streams to watch insertions/updates on a first collection and, when a condition is met, populate another collection with computed values extracted from the watched collection.
Following the MongoDB tutorial, I came to the following result:
require('dotenv').config();
const { MongoClient } = require('mongodb');
const stream = require('stream');
const es = require('event-stream');

async function monitorListingsUsingStreamAPI(client, pipeline = []) {
  const collection = client
    .db(process.env.MONGO_DB)
    .collection(process.env.COLLECTION_TO_MONITOR);
  const changeStream = collection.watch(pipeline);
  const collection_dest = client
    .db(process.env.MONGO_DB)
    .collection(process.env.COLLECTION_TO_POPULATE);
  changeStream.pipe(
    es.map(function (doc, next) {
      const { _id, ...data } = doc.fullDocument;
      const new_doc = { size: data.samples.length, data };
      (async () => {
        await collection_dest.insertOne(new_doc, next);
      })();
    }),
  );
}

async function main() {
  const uri = process.env.MONGO_DB_URI;
  const client = new MongoClient(uri, {
    useUnifiedTopology: true,
    useNewUrlParser: true,
  });
  try {
    // Connect to the MongoDB cluster
    await client.connect();
    const pipeline = [
      {
        $match: {
          operationType: 'insert',
          'fullDocument.samples': { $size: 3 },
        },
      },
    ];
    // Monitor new listings using the Stream API
    await monitorListingsUsingStreamAPI(client, pipeline);
  } catch (err) {
    // a bare try block is a syntax error; at minimum, log the failure
    console.error(err);
  }
}
It actually seems to work, but I used event-stream to pipe the MongoDB change stream into another stream, where I used an immediately-invoked anonymous async function to populate the second collection.
I wonder if this approach is correct? How would I use transform streams instead?
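On the transform-stream question: one way to drop event-stream is to use Node's built-in stream.Transform and stream.Writable in object mode. A hedged sketch, assuming the same changeStream and collection_dest variables as in monitorListingsUsingStreamAPI above:
const { Transform, Writable } = require('stream');

// Reshape each change event into the computed document
const toSummary = new Transform({
  objectMode: true,
  transform(change, _encoding, callback) {
    const { _id, ...data } = change.fullDocument;
    callback(null, { size: data.samples.length, data });
  },
});

// Insert each computed document into the destination collection
const toDestination = new Writable({
  objectMode: true,
  write(doc, _encoding, callback) {
    collection_dest.insertOne(doc).then(() => callback(), callback);
  },
});

changeStream.pipe(toSummary).pipe(toDestination);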

Mongoose not resolving callback queries?

I have been working on this project for 2 years now, and I'm thinking this was caused by a recent update, but I am wondering if there are any kind, intelligent Mongoose/NoSQL DBA souls out there who would do the awesome service of helping me track down and/or resolve this issue.
As you can see below, this is a simple Mongoose find query over Express to MongoDB. At a high level this is all rather evident, and for most devs the interactions will feel natural, as with any Mongo, Express, Node stack using Mongoose.
The issue is that, when I send this query, regardless of environment (it is a production project), it does not resolve.
The "data" seems to get lost somewhere, and therefore the query simply never resolves.
It's a simple setup, really a test endpoint, so help out, run it through, and send some feedback.
Greatly appreciated!
Model.js
const mongoose = require('mongoose');
const mongoosePaginate = require('mongoose-paginate');
const Schema = mongoose.Schema;

const TestSchema = new Schema({
  data: {
    type: String,
    unique: false,
    required: true
  },
}, {
  timestamps: true
});

TestSchema.plugin(mongoosePaginate);

module.exports = mongoose.model('Test', TestSchema);
Constructor.js
class Constructor {
  constructor() {}
  getAll() {
    return TestSchema.find({}, function (err, tests) {})
  }
}

module.exports = Constructor
db.js
let mongoose = require('mongoose')

// Connect to db
mongoose.connect('mongodb://localhost:27017/test', { useNewUrlParser: true, useUnifiedTopology: true }, err => {
  if (err)
    return console.log("Cannot connect to DB")
  connectionCallback()
  console.log("DB Connected")
});

let connectionCallback = () => {}

module.exports.onConnect = cb => {
  connectionCallback = cb
}
App.js
const express = require('express');
const app = express();
const ip = require('ip');
let db = require('./db')
const router = express.Router();
const port = 8888;
const http = require('http').createServer(app);

let ipAddress = 'localhost'; // only works on the local host
try {
  // will enable the server to be accessed from the network
  ipAddress = ip.address();
} catch (err) {
  console.error(err);
}

http.listen(port, ipAddress, () => {
  let message = [
    `Server is running at ${ipAddress}:${port}`,
  ];
  console.log(...message)
});

db.onConnect(() => {
  let Constructor = require("./pathTo/Constructor")
  let construct = new Constructor()
  app.use('/api', router.get('/test', function (req, res) { construct.getAll() }))
})
Your problem is with the Constructor.js getAll function: you both return the query and pass a callback, so the promise will never be resolved. You should either resolve the promise or return the response from the callback.
Resolve the promise:
class Constructor {
  constructor() {}
  async getAll() {
    return await TestSchema.find({})
  }
}

module.exports = Constructor
Return from callback:
class Constructor {
  constructor() {}
  getAll() {
    TestSchema.find({}, function (err, tests) {
      return tests;
    })
  }
}

module.exports = Constructor
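Whichever variant you choose, note that the /test route in App.js never awaits the result or sends a response. With the promise-based getAll, a hedged sketch of the route could be:
app.use('/api', router.get('/test', async function (req, res) {
  try {
    const tests = await construct.getAll();
    res.json(tests);
  } catch (err) {
    res.status(500).send(err.message);
  }
}));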
I ended up just scaling the project for production. I put the connectionCallback in a class and called it with the createConnection mongoose function.
Looks like this:
mongoose.Promise = global.Promise;

const url = 'mongodb://localhost/db'
const connection = mongoose.createConnection(url, options);

// load models
require('/models').connectionCallback();

module.exports = connection;
Please note, I am no longer using express!

How to cache or reuse MongoDB database connection in Serverless (AWS Lambda) application

I'm building a Serverless application using Node.js and MongoDB. I noticed that a lot of connections are created when handling a request, so I came up with a caching mechanism, but it doesn't seem to be working. Here is my code.
Connection file
'use strict';

const connectDatabase = mongoClient =>
  mongoClient.connect(process.env.MONGODB_STRING, { poolSize: 10 });

const createConnection = (mongoClient, dbConnection) => {
  return new Promise(async (resolve, reject) => {
    try {
      if (dbConnection) {
        console.log('===== Cached Connection =====');
        resolve(dbConnection);
      }
      console.log('===== New Connection =====');
      let dbPool = await connectDatabase(mongoClient);
      resolve(dbPool);
    } catch (error) {
      reject(error);
    }
  });
};

const closeDbConnection = conn => {
  if (conn) {
    return conn.close();
  }
};

module.exports = {
  connectDatabase: connectDatabase,
  createConnection: createConnection,
  closeDbConnection: closeDbConnection
};
Handler code
let admin = require('./lib/admin');
let helper = require('./lib/helper');
let connection = require('./lib/connection');
let mongo = require('mongodb');
let mongoClient = mongo.MongoClient;

let cachedDb = null;

let createUser = async (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;
  cachedDb = await connection.createConnection(mongoClient, cachedDb);
  admin.createUser(cachedDb, helper.parseEvent(event), callback);
};
I'm using a global variable cachedDb to store the database connection, but each and every time I make a request it logs ===== New Connection =====. Any idea how to achieve the caching?
Is there a better way to handle this?
The approach I like using here is a singleton class, like this:
export default class MyMongoConnection {
  static instance;

  static getInstance() {
    if (!MyMongoConnection.instance) {
      MyMongoConnection.instance = MyMongoConnection.createInstance();
    }
    return MyMongoConnection.instance;
  }

  static createInstance() {
    // Do whatever you need to create the instance
  }
}
So you import MyMongoConnection anywhere and just ask for the instance using .getInstance(); if it already exists, it is reused.
Hope this helps.
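For example, createInstance could be filled in with the MongoDB driver along these lines (a sketch; MONGODB_STRING is the environment variable from the question, and caching the connect() promise means concurrent handlers share a single connection attempt):
import { MongoClient } from "mongodb";

export default class MyMongoConnection {
  static instance;

  static getInstance() {
    if (!MyMongoConnection.instance) {
      MyMongoConnection.instance = MyMongoConnection.createInstance();
    }
    return MyMongoConnection.instance;
  }

  static createInstance() {
    // Connect once; the returned promise is what gets cached
    return new MongoClient(process.env.MONGODB_STRING).connect();
  }
}
Then in a handler: const client = await MyMongoConnection.getInstance();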
