Is declaring a Node.js redis client as a const in multiple helpers a safe way to use it? - node.js

This is a little hard to articulate, so I hope my title isn't too terrible.
I have a frontend/backend React/Node.js (REST API) web app that I want to add Redis support to, for storing and retrieving app-global settings and per-user settings (language preference, last login, and similar simple stuff). So I was considering adding a /settings branch to my backend REST API to push/pull this information from a Redis instance.
This is where my Node.js inexperience shows. I'm looking at using the ioredis client and it seems too easy. If I have a couple of helpers (more than one .js file that will call upon Redis), is constructing the client as a const in each safe to do? Or is reusing a single shared instance the recommended approach?
Here's a sample of what I'm thinking of doing. Imagine I had 3 helper modules that require access to the Redis client. Should I declare a client as a const in each? Or centralize it in a single helper module and get the client from there? Is there a disadvantage to either?
const config = require('./config.json');
const redis_url = config.redis_url;

// redis setup
const Redis = require('ioredis');
const redis = new Redis(redis_url);

module.exports = {
    test
};

// ioredis returns a promise when no callback is passed,
// so the result can simply be awaited and returned
async function test(id) {
    try {
        return await redis.get(id);
    } catch (err) {
        console.error(err);
        throw err;
    }
}
Thank you.

If no redis conflicts...
If the different "helper" modules you are referring to have no conflicts when interacting with Redis (such as overwriting or sharing the same Redis keys), then I can't see any reason not to use the same Redis instance (as outlined by garlicman) and export it to the different modules in which it is used.
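For example, a minimal sketch of the shared-instance approach (file and key names here are just for illustration): construct the client once in its own module and require that module wherever it's needed; Node's require cache guarantees every helper gets the same instance.
// redis-client.js - the client is constructed exactly once
const Redis = require('ioredis');
const config = require('./config.json');
module.exports = new Redis(config.redis_url);

// settings-helper.js - every require() of redis-client gets the same object
const redis = require('./redis-client');
async function getUserLanguage(userId) {
    return redis.get(`user:${userId}:language`);
}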
Otherwise use separate redis databases...
If you do require separate Redis database connections, Redis ships with 16 logical databases, so you can specify which one to connect to when creating a new instance; see below:
const redis = new Redis({ // set up config for connection to redis
    port: 6379,        // Redis port
    host: '127.0.0.1', // Redis host (must be a string)
    family: 4,         // 4 (IPv4) or 6 (IPv6)
    db: 10,            // Redis database to connect to
});

Normally what I would do (in Java, say) is implement an explicit class with singleton access to hold the connection and any connection error/reconnect handling.
All modules in Node.js are effectively singletons already (require caches them after the first load), but what I'll probably go with is a client class to hold the connection and my own access-related methods. Something like:
const config = require('./config.json');
const Redis = require("ioredis");

// named RedisClient so it doesn't shadow the ioredis constructor above
class RedisClient {
    constructor() {
        this.client = new Redis(config.redis_url);
    }
    get(key) {
        return this.client.get(key);
    }
    set(key, value, ttl) {
        let rp;
        if (ttl === 0) {
            rp = this.client.set(key, value);
        }
        else {
            // arrow function keeps `this` bound to the class instance
            rp = this.client.set(key, value)
                .then((res) => {
                    this.client.expire(key, ttl);
                    return res;
                });
        }
        return rp;
    }
}

// exporting an instance makes this a de-facto singleton via the require cache
module.exports = new RedisClient();
I'll probably include a data_init() method to check and preload an initial key/value structure on first connect.
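A sketch of what that data_init() might look like as a method on the class above (the key names and defaults are hypothetical placeholders):
// inside the RedisClient class above; keys and defaults are hypothetical
async data_init() {
    const initialized = await this.client.get('app:initialized');
    if (!initialized) {
        // preload the initial key/value structure on first connect
        await this.client.set('app:default_language', 'en');
        await this.client.set('app:initialized', '1');
    }
}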

Related

How to share a single promise-based RabbitMQ connection across files or controllers in Node.js instead of creating a new connection each time?

The amqplib library lets you create a RabbitMQ connection, and that object is then the gateway to everything else, such as creating a channel.
Suppose that I'm going for a Producer/Consumer pattern, where each time a user hits a specific route, a job is produced and sent to the RabbitMQ server, where it's processed by certain consumers (workers).
const amqp = require("amqplib");

app.post("/routethatdelegatesheavywork", async (req, res) => {
    const amqpServerLink = "link-to-cloudmq";
    const connection = await amqp.connect(amqpServerLink);
    const channel = await connection.createChannel();
    // do other stuff with channel
})
While this "works", I don't want to re-create that connection every time the controller is invoked, since doing so makes the producer very slow and it's really not how it's supposed to be done.
Here is where my problem comes in:
How do I initialize one connection and re-use it every time I need it?
I have tried to create a connection outside the controllers and use it when necessary, but that's not possible, since the connection is promise-based and await doesn't work at the entry point; it has to be inside an async function.
Although it is possible to use top-level await with ESM (ES modules), I don't want to do so, since I have written the whole application in CommonJS (require("package")), and changing that would mean going through a lot of files and converting every import/export to ESM.
So, is there any other way to create one (promise-based) connection and re-use it without having to migrate to ESM syntax?
Yes. Remember that modules loaded with require in Node.js are cached, so they effectively behave as singletons. Make a new amqpServerInterface module, and do:
const amqp = require("amqplib");

const amqpServerLink = "link-to-cloudmq";
const connection = amqp.connect(amqpServerLink);

// hand out the connection promise itself; callers await it
function connect() {
    return connection;
}

module.exports = {
    connect
}
Then in your controllers
const amqpServerInterface = require('./amqpServerInterface');

app.post("/routethatdelegatesheavywork", async (req, res) => {
    const connection = await amqpServerInterface.connect();
    const channel = await connection.createChannel();
    // do other stuff with channel
})
This will always return the same connection promise, which will resolve to the same connection.
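If creating a channel per request also turns out to be costly, the same caching trick extends one level up. A sketch, adding a cached channel promise to the same hypothetical amqpServerInterface module:
// amqpServerInterface.js (continued) - cache one channel promise as well
const channelPromise = connection.then(conn => conn.createChannel());

function getChannel() {
    return channelPromise;
}

module.exports = {
    connect,
    getChannel
}
A controller can then do const channel = await amqpServerInterface.getChannel(); and skip both the connect and the createChannel round-trips.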

Can you keep a PostgreSQL connection alive from within a Next.js API?

I'm using Next.js for my side project. I have a PostgreSQL database hosted on ElephantSQL. Inside the Next.js project, I have a GraphQL API set up, using the apollo-server-micro package.
Inside the file where the GraphQL API is set up (/api/graphql), I import a database helper module. In that module, I set up a pool connection and export a function which uses a client from the pool to execute a query and return the result. It looks something like this:
// import node-postgres module
import { Pool } from 'pg'

// set up pool connection using environment variables with a maximum of three active clients at a time
const pool = new Pool({ max: 3 })

// query function which uses the next available client to execute a single query and return results on success
export async function queryPool(query) {
    let payload
    try {
        // pool.query checks out a client, runs the query, and releases the client
        const res = await pool.query(query)
        payload = res.rows
    } catch (e) {
        console.error(e)
    }
    return payload
}
The problem I'm running into is that the Next.js API doesn't (always) keep the connection alive, but rather opens a new one (either for every connected user or maybe even for every API query), which results in the database quickly running out of connections.
I believe that what I'm trying to achieve is possible for example in AWS Lambda (by setting context.callbackWaitsForEmptyEventLoop to false).
It is very possible that I don't have a proper understanding of how serverless functions work and this might not be possible at all but maybe someone can suggest me a solution.
I have found a package called serverless-postgres and I wonder if that might solve it, but I'd prefer to use the node-postgres package instead, as it has much better documentation. Another option would probably be to move away from the integrated API functionality entirely and build a dedicated backend server which maintains the database connection, but obviously that would be a last resort.
I haven't stress-tested this yet, but it appears that the MongoDB Next.js example solves this problem by attaching the database connection to global in a helper function. The important bit in their example is here.
Since the pg connection is a bit more abstract than mongodb, it appears this approach just takes a few lines for us pg enthusiasts:
// eg, lib/db.js
const { Pool } = require("pg");

if (!global.db) {
    global.db = { pool: null };
}

export function connectToDatabase() {
    if (!global.db.pool) {
        console.log("No pool available, creating new pool.");
        global.db.pool = new Pool();
    }
    return global.db;
}
then in, eg, our API route, we can just:
// eg, pages/api/now
import { connectToDatabase } from "../../lib/db";

export default async (req, res) => {
    const { pool } = connectToDatabase();
    try {
        const time = (await pool.query("SELECT NOW()")).rows[0].now;
        res.end(`time: ${time}`);
    } catch (e) {
        console.error(e);
        res.status(500).end("Error");
    }
};

Is it necessary to close mongodb connection in nodejs?

I'm new to Node.js and MongoDB. On the MongoDB native driver website they close the connection after each request, but that seems very slow and problematic for high-traffic websites. I'm just curious to know: is it necessary to do that, or can I declare a global variable and reference the DB through it, like this:
var mongodbClient = require('mongodb').MongoClient;
var db;

function connect() {
    mongodbClient.connect('connection string', function (err, mdb) {
        db = mdb;
    });
}
connect();

function insert(query, collection, fn) {
    db.collection(collection)
        .insert(query, function (er, da) {
            fn(er, da);
        });
}

function find(query, collection, fn) {
    db.collection(collection)
        .find(query).toArray(function (er, da) {
            fn(er, da);
        });
}
I don't want to use mongoose and prefer to learn and understand what's going on under the hood.
The examples available in the documentation are not actually good for real-life use cases. If you are using a server framework, you can normally connect to Mongo once and share a reference to the connection throughout the application. I use hapi and connect to the server via a plugin, which allows me to store the handle to the open connection. This also allows you to clean up on server shutdown. There are many modules for managing Mongo, such as mongoose, waterline or wadofgum-mongodb, which I have recently written.
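Framework aside, the core of the idea is just: connect once at startup, stash the handle in a module, and don't start serving requests until the connection exists. A rough sketch with the plain driver (file names are illustrative, callback-era API as in the question):
// db.js - connect once, then share the handle
var MongoClient = require('mongodb').MongoClient;
var db = null;

exports.connect = function (url, done) {
    if (db) return done(null, db);
    MongoClient.connect(url, function (err, mdb) {
        if (err) return done(err);
        db = mdb;
        done(null, db);
    });
};

exports.get = function () {
    return db; // only safe to call after connect() has finished
};

// server.js - start listening only after the connection is ready,
// which avoids the race where handlers run while db is still undefined
var db = require('./db');
db.connect('connection string', function (err) {
    if (err) throw err;
    // start the HTTP server here; handlers can now use db.get()
});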

NodeJS Express Dependency Injection and Database Connections

Coming from a non-Node background, my first instinct is to define my service as such:
MyService.js
module.exports = function MyService(dbConnection) {
    // service uses the db
};
Now, I want one open db connection per request, so I define in middleware:
res.locals.db = openDbConnection();
And in some consuming Express api code:
api.js
var MyService = require('./services/MyService');
...
router.get('/foo/:id?', function (req, res) {
    var service = new MyService(res.locals.db);
});
Now, being that Node's preferred method of dependency injection is via the require(...) statement, it seems that I shouldn't be using the constructor of MyService for injection of the db.
So let's say I want to have
var db = require('db');
at the top of MyService and then use it somehow, like db.current... but how would I tie the db to the current res.locals object now that db is a module itself? What's a recommended way of handling this kind of thing in Node?
Updated Answer: 05/02/15
If you want to attach a DB connection to each request object and then use that connection in your service, the connection will have to be passed to MyService somehow. The example below shows one way of doing that. If we try to use db.current or something to that effect, we'll be storing state in our DB module. In my experience, that will lead to trouble.
Alternatively, I lay out the approach I've used (and still use) in this previous answer. What this means for this example is the following:
// api.js
var MyService = require('./services/MyService');
...
router.get('/foo/:id?', function (req, res) {
    MyService.performTask(req.params.id);
});
// MyService.js
var db = require('db');

module.exports = {
    performTask: function (id) {
        var connection = db.getOpenConnection();
        // Do whatever you want with the connection.
    }
}
With this approach, we've decoupled the DB module from the api/app/router modules and only the module that actually uses it will know it exists.
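The db module itself isn't shown here; one way it might look, keeping every pooling detail behind getOpenConnection (a sketch using node-postgres purely as a concrete example, since the answer itself is database-agnostic):
// db.js (sketch) - the only module that knows how connections are made
var Pool = require('pg').Pool;
var pool = new Pool(); // pg reads connection settings from environment variables

module.exports = {
    getOpenConnection: function () {
        // pool.connect() resolves with a client checked out from the pool;
        // the caller is responsible for calling client.release() when done
        return pool.connect();
    }
};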
Previous Answer: 05/01/15
What you're talking about could be done using an express middleware. Here's what it might look like:
var db = require('db');

// Attach a DB connection to each request coming in
router.use(function (req, res, next) {
    res.locals.db = db.getOpenConnection();
    next();
});
// Later on..
router.get('/foo/:id?', function (req, res) {
// We should now have something attached to res.locals.db!
var service = new MyService(res.locals.db);
});
I personally have never seen something like new MyService before in Express applications. That doesn't mean it can't be done, but you might consider an approach like this:
// router.js
var MyService = require('MyService');

router.get('/foo/:id?', function (req, res) {
    MyService.foo(res.locals.db);
});

// MyService.js
module.exports.foo = function (connection) {
    // I have a connection!
}

Efficiency of mongodb/mongoskin access by a multiple-modules approach?

I'm developing an Express app that provides a REST API; it uses MongoDB through mongoskin. I wanted a layer that splits routing from db access. I have seen an example that creates a database bridge by creating a module file, for example models/profiles.js:
var mongo = require('mongoskin'),
    db = mongo.db('localhost:27017/profiler'),
    profs = db.collection('profiles');

exports.examplefunction = function (info, cb) {
    // code that accesses the profs collection and does the query
}
Later, this module is required in the routing files.
My question is: if I use this approach of creating one module for each collection, will it be efficient? Do I have an issue of connecting and disconnecting multiple (unnecessary) times from Mongo by doing that?
I was thinking that maybe exporting the db variable from one module to the others that handle each collection would solve the supposed issue, but I'm not sure.
Use a single connection and then create your modules passing in the shared db instance. You want to avoid setting up separate db pools for each module. One way of doing this is to construct the module as a class:
var MyClass = function (db) {
    this.db = db;
};

// doQuery belongs on the prototype so instances created by build() can call it
MyClass.prototype.doQuery = function () {
};

exports.build = function (db) {
    return new MyClass(db);
};
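Wiring that up might then look like this (a sketch; the second collection module is hypothetical):
// app.js - open mongoskin once and hand the shared db to each module
var mongo = require('mongoskin');
var db = mongo.db('localhost:27017/profiler');

var profiles = require('./models/profiles').build(db);
var accounts = require('./models/accounts').build(db); // hypothetical second module

// both modules now share the same underlying connection pool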
