I would like to use knex as a query builder, but my project already handles its own connection pool.
I wish I could do something like:
const { Client } = require('pg')
const client = new Client()
await client.connect()
const knex = require('knex')({
client: 'pg',
connection: client,
})
Is there any way to provide knex with the pg client object, instead of letting it manage its own connection pool?
Workaround
I think there is a way to do this, but only on a per-query basis: you can use the connection() method, which accepts a connection instance as an argument.
In my case I required knex without passing a connection argument, and once the connection was established I saved the connection instance on my object. Then, whenever I need to use knex as a client, I pass that connection instance while building the query.
Code example
Here is a code example:
const knex = require('knex')({
client: 'pg' // PostgreSQL or whatever
});
const { Client } = require('pg');
const poolConnection = new Client(); // or any other way to create your connection here
await poolConnection.connect();
// when you have to query your db
const results = await knex('users')
.connection(poolConnection) // here pass the connection
.where({
email: 'test@tester.com'
})
.select('id', 'email', 'createdAt')
.offset(0)
.limit(1)
.first();
Use case with moleculer
As an example, I used this with Moleculer, which provides a database adapter that already uses a SQL client itself, so knex would otherwise open an additional connection to my DB. After retrieving the connection in my microservice, I used it inside knex in the same way described above.
"use strict";
const DbService = require("moleculer-db");
const SQLAdapter = require("moleculer-db-adapter-sequelize");
const Sequelize = require("sequelize");
// here requiring knex without an actual connection
const knex = require("knex")({
client: "pg"
});
module.exports = {
name: "users",
// implementing moleculer ORM
mixins: [DbService],
adapter: new SQLAdapter(process.env.POSTGRECONNECTIONSTRING),
model: {
name: "user",
define: {
id: {
type: Sequelize.INTEGER,
primaryKey: true,
autoIncrement: true
},
email: Sequelize.STRING,
password: Sequelize.STRING,
}
},
actions: {
findByIdRaw: {
params: {
id: "number"
},
handler(ctx) {
const { id } = ctx.params;
// use the connection pool instance
return knex("users")
.connection(this.connection)
.where({
id
})
.select("id", "email", "createdAt")
.offset(0)
.limit(1)
.first();
}
}
},
started() {
// getting the connection from the adapter
return this.adapter.db.connectionManager.getConnection()
.then((connection) => {
// saving connection
this.connection = connection;
return Promise.resolve();
});
}
};
Related documentation
Knex.js
Knex.js-connection
Related links
Moleculer
Moleculer ORM
No, unless you write your own custom dialect and override the connection-fetching functionality. Writing a custom dialect is described here: https://github.com/tgriesser/knex/blob/master/CONTRIBUTING.md#i-would-like-to-add-support-for-new-dialect-to-knex-is-it-possible
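For illustration, here is a rough, untested sketch of that custom-dialect route. It assumes knex accepts a Client subclass for the client option, that the PostgreSQL dialect is exposed at knex/lib/dialects/postgres, and that acquireRawConnection / destroyRawConnection are the hooks to override; check all of these against the knex version you actually use.
// Hypothetical sketch: hand knex connections from your project's own pg pool.
// The dialect path and method names are assumptions and may differ between knex versions.
const Client_PG = require('knex/lib/dialects/postgres');
const { Pool } = require('pg');

const myPool = new Pool(); // or wherever your project already keeps its pool

class ClientFromExternalPool extends Client_PG {
  // knex calls this whenever it needs a raw connection: check one out of our pool
  acquireRawConnection() {
    return myPool.connect();
  }
  // knex calls this when it is done with a connection: hand it back instead of closing it
  destroyRawConnection(connection) {
    connection.release();
  }
}

const knex = require('knex')({ client: ClientFromExternalPool });
Note that knex still wraps these connections in its own internal pool, so this only shares the underlying sockets; it does not remove knex's pooling layer entirely.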
Related
I am creating a multi-tenant SaaS app. I was advised by many to keep separate clients in separate databases, for better security and easier management.
How do we connect multiple databases to the Node app?
I know how to make my app run with a single database connection to mongodb, but not sure about multiple connections.
The mongoose docs mention the following solutions for multiple connections:
export schema pattern (https://mongoosejs.com/docs/connections.html#multiple_connections)
connection pools (which default to only 5 connections each, which may not be ideal as I may have hundreds of clients in the future)
Another way I tried (and it works!) is connecting to MongoDB during a Node API call and executing my logic, as shown below. The code below is a test route for registering a user with a name and email. dbutils() is a function I call to connect to MongoDB, using mongoose.connect(). I am not sure whether it is good practice to connect during the API call.
router.post('/:db/register', async (req,res, next) => {
const startTime = new Date();
try {
if(!req.body.name) {
throw new Error("Name required");
}
if(!req.body.email) {
throw new Error("Email required");
}
await dbutils(req.params.db);// connect to db
const session = await mongoose.startSession();
session.startTransaction();
const newUser = new User({
name: req.body.name,
email: req.body.email,
})
await newUser.save({session});
await session.commitTransaction();
session.endSession();
const endTime = new Date();
const diff = endTime.getTime() - startTime.getTime();
return res.json({
newUser: {
email: req.body.email,
name: req.body.name
},
db: req.params.db,
timeElapsed: diff,
});
} catch(ex) {
return next(ex);
}
})
My dbutils() code
const mongoose = require('mongoose');
const mongoURI = "mongodb://PC:27017,PC:27018,PC:27019";
module.exports = async function(db) {
try {
await mongoose.connect(
`${mongoURI}/${db}`,
{
useNewUrlParser: true,
useCreateIndex: true,
useFindAndModify: false,
useUnifiedTopology: true,
}
)
} catch(ex) {
throw ex
}
}
I would be very happy for any recommendation or solution to this problem. Thank you very much in advance for your answer.
It is never a good idea to connect to your DB inside an API call: you will waste a lot of resources and delay the API responses as well.
The best approach is to connect to the multiple databases when the application starts, along with a connection pooling configuration.
You can specify which schema belongs to which connection, and maintain separate DB collections.
You can use the code below to work with multiple connections and pooling:
const connection1 = mongoose.createConnection('mongodb://username:password@host1:port1[?options]',{
poolSize: 10
});
const connection2 = mongoose.createConnection('mongodb://username:password@host2:port2[?options]',{
poolSize: 10
});
Models/Schema on connection 1 can be created as below:
//User schema on connection 1
const userSchema = new Schema({ ... });
const UserModel = connection1.model('User', userSchema);
module.exports = UserModel;
Models/Schema on connection 2 can be created as below:
//Product schema on connection 2
const productSchema = new Schema({ ... });
const ProductModel = connection2.model('Product', productSchema);
module.exports = ProductModel;
For better performance and availability, you can also back each database with a shared cluster (replica set) and connect to the cluster as a whole:
const conn = mongoose.createConnection('mongodb://[username:password@]host1[:port1][,host2[:port2],...[,hostN[:portN]]][/[database][?options]]', options);
For detailed information, please read the Mongoose Multiple Connections and Connection Pooling documentation.
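As a small illustration of how the two exported models are then consumed together, here is a hedged sketch; the file paths and the owner field are assumptions made up for this example.
// Hypothetical usage: each model transparently uses its own connection's pool.
const UserModel = require('./models/user');       // registered on connection1
const ProductModel = require('./models/product'); // registered on connection2

async function getDashboard(userId) {
  const user = await UserModel.findById(userId);
  const products = await ProductModel.find({ owner: userId }).limit(10);
  return { user, products };
}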
I want to understand how to switch between databases within the global, promise-based mongoose connection.
My current connection is established this way in app.ts:
import * as mongoose from 'mongoose';
...
try {
await mongoose.createConnection(`mongodb://localhost:27017/db1`, {
useNewUrlParser: true,
})
console.log("Connected")
} catch (error) {
console.log(error)
}
And then I am accessing it in different files, e.g. some.model.ts:
import { Schema, Document, model } from 'mongoose';
const SomeSchema: Schema = new Schema({
name: { type: String, required: true },
owner: { type: String, required: true }
});
export default model('Some', SomeSchema);
According to the documentation:
So far we've seen how to connect to MongoDB using Mongoose's default connection. At times we may need multiple connections open to Mongo, each with different read/write settings, or maybe just to different databases for example. In these cases we can utilize mongoose.createConnection() which accepts all the arguments already discussed and returns a fresh connection for you.
const conn = mongoose.createConnection('mongodb://[username:password@]host1[:port1][,host2[:port2],...[,hostN[:portN]]][/[database][?options]]', options);
I can create multiple database connections like this
try {
const db1 = await mongoose.createConnection(`mongodb://localhost:27017/db1`, {
useNewUrlParser: true,
})
const db2 = await mongoose.createConnection(`mongodb://localhost:27017/db2`, {
useNewUrlParser: true,
})
console.log("Connected")
} catch (error) {
console.log(error)
}
I can see both connections in console.log(mongoose.connections).
But how can I specify which database should be used for the model in some.model.ts?
import { Schema, Document, model } from 'mongoose';
const SomeSchema: Schema = new Schema({
name: { type: String, required: true },
owner: { type: String, required: true }
});
export default SPECIFY_DATABASE.model('Some', SomeSchema);
I have found other questions like this, but in those the connections are created "locally"; I need to use the mongoose connections across many different files.
Thank you for your answers; if you need more explanation please let me know.
You need to actually return the connections, and then register each model with a specific connection. To clarify, you need:
something to create a (named, specific) connection
schemas
you create models by registering schemas to the given connections,
you also need something to orchestrate it.
For example, let's have a "db.js" file (I usually call mine "repo.js") with a single export: a function that returns the initialized database Promise.
You'd use it by importing the function and awaiting the db.
This is a bit of a longer example, so error handling etc. is omitted for brevity.
import { createConnections } from './create-connections';
import { UserSchema } from './users-schema';
import { PostSchema } from './posts-schema';
let db: any;
export function getDatabase(): Promise<any> {
if (db) return Promise.resolve(db);
return createDatabases();
}
async function createDatabases() {
const { db1, db2 } = await createConnections('mongodb://localhost/db1', 'mongodb://localhost/db2');
const UserModel = db1.model('users', UserSchema);
const PostModel = db2.model('posts', PostSchema);
db = {
UserModel,
PostModel,
// also if you need this
connections: {
db1,
db2,
}
}
return db;
}
Now, I've used './create-connections' here, which is almost what you have:
// create-connections.js
import { createConnection } from 'mongoose';
// You create connections by calling this function and giving it the URLs of the servers
export async function createConnections(url1, url2) {
const db1 = await createConnection(url1);
const db2 = await createConnection(url2);
return {
db1,
db2
}
}
Now, let's say you have two models, users and posts; here are their schemas.
// users schema
import { Schema, Document } from 'mongoose';
export const UserSchema: Schema = new Schema({
name: { type: String, required: true },
});
// posts schema
import { Schema, Document } from 'mongoose';
export const PostSchema: Schema = new Schema({
text: { type: String, required: true },
owner: { type: Schema.Types.ObjectId, required: true }
});
So now you need to bind it all together in that first file.
But how do you use it? As I've said, since it's async, you always import it and await the database, like a simple async getDatabase:
// some controller, route handler, service etc.
import { getDatabase } from './db';
router.get('/users', async (req, res) => {
const { UserModel: User } = await getDatabase();
const users = await User.find();
return res.json(users);
});
router.post('/posts', async (req, res) => {
const { text } = req.body;
const owner = req.user.id;
const { PostModel: Post } = await getDatabase();
const post = await Post.create({ text, owner });
return res.json(post);
});
I am using apollo-graphql with Postgres, and now I want to be able to fetch my backend data into the Apollo client. It's my first attempt at GraphQL, and I did the following:
i.) created an apollo-graphql server on localhost:4000, which also hosts the Apollo GraphQL Playground
ii.) Defined typeDefs and resolvers for my server
iii.) In typeDefs -> defined my schema
iv.) In resolvers -> just added a findAll query (tried both with attributes and with no parameters):
Query: {
me: () => account.findAll({attributes: ['user_id', 'username', 'email']})
}
v.) Then I required the Postgres dbIndex I defined using the Sequelize ORM into the server file (and used it in step iv above to query my DB)
vi.) In my dbIndex file, I authenticate to the DB using environment variables, get the 'Connected to DB' message, define the DB model, and export it.
After all six steps, in the Apollo Playground, I see null.
My list of files are below:
Server.js:
const {ApolloServer} = require('apollo-server');
const typeDefs = require('./schema');
const {account} = require('../database/dbIndex.js');
const resolvers = {
Query: {
me: () => account.findAll()
}
};
const server = new ApolloServer({
typeDefs,
resolvers
});
server.listen().then(({url}) => {
console.log(`Server ready at ${url}`);
});
dbIndex.js
const Sequelize = require('sequelize');
require('dotenv').config();
const sortDb = new Sequelize(
`${process.env.DATABASE}`,
process.env.DATABASE_USER,
process.env.DATABASE_PASSWORD,
{
dialect: 'postgres',
},
);
sortDb
.authenticate()
.then(() => {
console.log('Connected to DB');
})
.catch((err) => {
console.error('Unable to connect to DB', err);
});
const account = sortDb.define('account', {
user_id: {type: Sequelize.INTEGER},
username: {type: Sequelize.STRING},
email: {type: Sequelize.STRING}
});
module.exports.account = account;
schema.js
const {gql} = require('apollo-server');
const typeDef = gql
`
type Query {
"These are the queries we define in our query type"
me(user_id: ID): User
}
"How to define the structure of user? Below is an object type that does this:"
type User {
user_id: ID,
username: String,
email: String
}
`;
module.exports = typeDef;
Please help! Thanks in advance!
findAll in Sequelize returns an array of rows; try findOne with a specific query, or findById.
It returns null because the me field is declared to return a single User, so the properties of type User cannot be resolved from an array.
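As a sketch of what that could look like with the schema above (the me(user_id: ID) field) and the account model from dbIndex.js:
// Sketch: return a single row so the result matches the `User` return type of `me`.
const resolvers = {
  Query: {
    me: (parent, args) =>
      account.findOne({
        where: { user_id: args.user_id },
        attributes: ['user_id', 'username', 'email'],
      }),
  },
};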
I am trying to watch my MongoDB database; whenever a change occurs I want to apply an action. This is what I have tried:
var mongoose = require('mongoose');
//mongoose.connect('mongodb://localhost/test');
mongoose.Promise = global.Promise
mongoose.connect('mongodb://localhost:27017')
mongoose.connection.createCollection('people');
const Person = mongoose.model('Person', new mongoose.Schema({ name: String }));
Person.watch().
on('change', data => console.log(new Date(), data));
console.log(new Date(), 'Inserting doc');
Person.create({ name: 'john doe' });
console.log(new Date(), 'Inserted doc');
But I am getting the following error
node_modules/mongodb/lib/utils.js:132
throw err;
^
MongoError: $changeStream may not be opened on the internal admin
database
How can I fix this?
Change streams in MongoDB require a replica set to function.
According to the Mongoose docs:
To connect to a replica set you pass a comma delimited list of hosts
to connect to rather than a single host.
mongoose.connect('mongodb://[username:password@]host1[:port1][,host2[:port2],...[,hostN[:portN]]][/[database][?options]]' [, options]);
Full example
const { ReplSet } = require('mongodb-topology-manager');
const mongoose = require('mongoose');
run().catch(error => console.error(error));
async function run() {
// Make sure you're using mongoose >= 5.0.0
console.log(new Date(), `mongoose version: ${mongoose.version}`);
await setupReplicaSet();
// Connect to the replica set
const uri = 'mongodb://localhost:31000,localhost:31001,localhost:31002/' +
'test?replicaSet=rs0';
await mongoose.connect(uri);
// For this example, need to explicitly create a collection, otherwise
// you get "MongoError: cannot open $changeStream for non-existent database: test"
await mongoose.connection.createCollection('Person');
// Create a new mongoose model
const personSchema = new mongoose.Schema({
name: String
});
const Person = mongoose.model('Person', personSchema, 'Person');
// Create a change stream. The 'change' event gets emitted when there's a
// change in the database
Person.watch().
on('change', data => console.log(new Date(), data));
// Insert a doc, will trigger the change stream handler above
console.log(new Date(), 'Inserting doc');
await Person.create({ name: 'Axl Rose' });
console.log(new Date(), 'Inserted doc');
}
// Boilerplate to start a new replica set. You can skip this if you already
// have a replica set running locally or in MongoDB Atlas.
async function setupReplicaSet() {
const bind_ip = 'localhost';
// Starts a 3-node replica set on ports 31000, 31001, 31002, replica set
// name is "rs0".
const replSet = new ReplSet('mongod', [
{ options: { port: 31000, dbpath: `${__dirname}/data/db/31000`, bind_ip } },
{ options: { port: 31001, dbpath: `${__dirname}/data/db/31001`, bind_ip } },
{ options: { port: 31002, dbpath: `${__dirname}/data/db/31002`, bind_ip } }
], { replSet: 'rs0' });
// Initialize the replica set
await replSet.purge();
await replSet.start();
console.log(new Date(), 'Replica set started...');
}
Full example excerpted from https://thecodebarbarian.com/stock-price-notifications-with-mongoose-and-mongodb-change-streams
You can't: the change stream cursor is not available on system collections, or on any collections in the admin, local, and config databases. You could try configuring your connection so that you are not working against the admin DB.
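As a minimal sketch of that idea, assuming a local replica set named rs0 is already reachable on the default port (see the previous answer for how to start one), connect with an explicit database name instead of the bare mongodb://localhost:27017:
// Assumes a replica set called "rs0" is already running locally on port 27017.
const mongoose = require('mongoose');

async function main() {
  // Note the explicit database name ("test") in the URI.
  await mongoose.connect('mongodb://localhost:27017/test?replicaSet=rs0');
  await mongoose.connection.createCollection('people');

  const Person = mongoose.model('Person', new mongoose.Schema({ name: String }));
  Person.watch().on('change', data => console.log(new Date(), data));

  await Person.create({ name: 'john doe' });
}

main().catch(console.error);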
MongoDB change streams documentation
What I want is to have an arbitrary number of databases (50, for example) with the same collections (same schemas, exact same models, different data) and one Node.js (Express + Mongoose) web app.
Example simplified case:
I have:
a single web application (Express + Mongoose) with a User model.
50 domains and 50 databases, each with a users collection.
What behaviour I want to achieve:
a GET /api/users/ HTTP request comes in to one of the domains (test-domain-39.myapp.com)
the app gets the requested domain name (test-domain-39), and somehow mongoose understands that it should query database-39 when I just do User.find({isActive: true}) in users.controller
So I just want an abstraction: I pass the DB name to mongoose and continue to work with the User model (as we all usually do with a single DB connection), and mongoose, if needed, creates the connection (if it's the first request to that specific DB), keeps it alive in a connection pool for subsequent requests, and so on.
What's the most simple and efficient way to accomplish that?
Thanks in advance!
IMHO, while this is possible with MongoDB, I wouldn't advise maintaining a separate database for each domain, especially if you are expecting to have a huge number of them. Have you considered a multi-tenant model instead?
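To make that multi-tenant suggestion concrete, here is a rough sketch (the field names and URI are made up for illustration): a single shared database whose documents carry a domain field, with every query scoped to the tenant derived from the request's hostname.
// Single shared database: each document is tagged with its tenant (domain).
const express = require('express');
const mongoose = require('mongoose');

const app = express();

const userSchema = new mongoose.Schema({
  name: String,
  isActive: Boolean,
  domain: { type: String, index: true }, // tenant discriminator
});
const User = mongoose.model('User', userSchema);

app.get('/api/users', async (req, res) => {
  // e.g. "test-domain-39" from "test-domain-39.myapp.com"
  const tenant = req.hostname.split('.')[0];
  const users = await User.find({ domain: tenant, isActive: true });
  res.json(users);
});

mongoose
  .connect('mongodb://localhost:27017/myapp') // one database for all tenants
  .then(() => app.listen(3000));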
That said, if you do go with a separate database per domain, the sample code below adds the user 'Alex' into two different databases, "domainOne" and "domainTwo". Hope this helps.
var mongoose = require('mongoose');
var personSchema = { name: String, domain : String };
var baseUri = 'mongodb://localhost/';
var domains = ['domainOne', 'domainTwo']; // the two example databases
domains.forEach((domain) => {
var conn = mongoose.createConnection(baseUri + domain, (error) => {
if(error){
console.log('Oops! Database connection failed!');
return;
}
//Use the connection object to create your models,
//instead the mongoose object
//so that our data is saved into the database
//associated with this connection
var Person = conn.model('Person', personSchema);
//Lets add user 'Alex' into the database
(new Person({name : 'Alex', domain : domain })).save((error) => {
if(error){
console.log('Oops! Could not save person');
} else {
conn.close();
}
});
});
});
This is how I implemented my project:
// config/db.ts
import {createConnection} from 'mongoose'
const MONGO_URI = process.env.MONGO_URI
if (!MONGO_URI)
throw new Error(
'Please define the MONGO_URI environment variable inside .env'
)
const connections: any = {}
async function db(dbName: string) {
if (connections[dbName]) {
return connections[dbName]
} else {
connections[dbName] = createConnection(`${MONGO_URI}/${dbName}`)
return connections[dbName]
}
}
export default db
// models/Test.ts
import { Schema } from 'mongoose'
export interface ITest {
_id: Schema.Types.ObjectId
name: string
createdAt?: Date
}
const testSchema = new Schema<ITest>(
{
name: { type: String, required: true },
},
{ timestamps: true }
)
export default testSchema
// pages/api/test.ts
import nc from 'next-connect'
import db from '../../config/db'
import testSchema from '../../models/Test'
const handler = nc()
handler.get(
async (req: NextApiRequestExtended, res: NextApiResponseExtended) => {
try {
const conn = await db(req.headers['x-db-key'])
const model = await conn.model('Test', testSchema)
const data = await model.find({})
res.send(data)
} catch (error: any) {
res.status(500).json({ error: error.message })
}
}
)
export default handler
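As a usage note, the database is selected per request via the x-db-key header, so a hypothetical client call (host, port, and database name are placeholders) could look like this:
// Hypothetical client call: the "x-db-key" header picks the tenant database.
async function fetchTests() {
  const res = await fetch('http://localhost:3000/api/test', {
    headers: { 'x-db-key': 'tenant-42' }, // placeholder database name
  });
  return res.json();
}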