How can I use two different databases in one single node app? - node.js

I have installed an HBase client and a PostgreSQL client, but how do I connect to two databases in a single Node application?
const hbase = require('hbase');

export class Db {
    private config = { host: '0.0.0.0', port: 8080 };
    public client = new hbase.Client(this.config);
    private conn;

    constructor() {
        this.conn = new hbase.Client(this.config);
    }

    public connection() {
        this.conn.table('messages').exists((error: string, success: string) => {
            if (!success) {
                this.createTable('messages', 'message_data');
            }
        });
    }

    private createTable(TblName: string, CF: string) {
        this.conn.table(TblName).create(CF, function (error: string, success: string) {
            console.log(success);
            return success;
        });
    }
}

I would suggest creating two different classes, one for HBase and one for PostgreSQL, and using them in your application wherever needed.
Also, use dependency injection: pass the config into the constructor instead of defining it in the class. That way you can inject any DB configuration into an instance.
Here's a code example.
Create a class to manage the HBase connection:
const hbase = require('hbase');

export class HBaseDB {
    private conn;

    // The config object is injected through the constructor
    // instead of being hard-coded in the class.
    constructor(config) {
        this.conn = new hbase.Client(config);
    }

    public connection() {
        this.conn.table('messages').exists((error: string, success: string) => {
            if (!success) {
                this.createTable('messages', 'message_data');
            }
        });
    }

    private createTable(TblName: string, CF: string) {
        this.conn.table(TblName).create(CF, function (error: string, success: string) {
            console.log(success);
            return success;
        });
    }
}
Create a class to manage the PostgreSQL connection:
const pg = require('pg');

export class PostgresqlDB {
    private pool;
    private conn;

    // config contains database, username, password, etc.
    constructor(config) {
        this.pool = new pg.Pool(config);
        this.conn = undefined;
    }

    public async connection() {
        // If already connected, don't connect again; this saves time
        if (this.conn === undefined) {
            this.conn = await this.pool.connect();
        }
    }

    public async query(query) {
        await this.connection();
        const response = await this.conn.query(query);
        return response;
    }
}
Now you can use them in your code like this:

const { PostgresqlDB } = require('./PostgresqlDB');
const { HBaseDB } = require('./HBaseDB');

const hbaseDBConn = new HBaseDB({ host: '0.0.0.0', port: 8080 });
const pgDBConn = new PostgresqlDB({ database: 'test', user: 'test', password: 'test' });
Note: the code above is for understanding purposes only; you may need to add validations and correct some syntax for actual use.
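A natural next step for the injected configs is to read them from environment variables with development defaults. This is a minimal sketch; the variable names (HBASE_HOST, PG_DATABASE, etc.) are assumptions, not from the original post:

```typescript
// Hypothetical helper that builds both DB configs from environment variables,
// falling back to development defaults. All env var names are assumptions.
function loadConfigs(env: Record<string, string | undefined>) {
  return {
    hbase: {
      host: env.HBASE_HOST || '0.0.0.0',
      port: Number(env.HBASE_PORT || 8080),
    },
    pg: {
      database: env.PG_DATABASE || 'test',
      user: env.PG_USER || 'test',
      password: env.PG_PASSWORD || 'test',
    },
  };
}

const configs = loadConfigs(process.env);
// These would then be injected into the classes above:
//   new HBaseDB(configs.hbase);
//   new PostgresqlDB(configs.pg);
```

This keeps credentials out of the source and lets the same classes run against different databases per environment.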

Related

How to add JSON data to JetStream streams?

I have code written in node-nats-streaming and am trying to convert it to the newer JetStream. Part of the code looks like this:
import { Message, Stan } from 'node-nats-streaming';
import { Subjects } from './subjects';

interface Event {
    subject: Subjects;
    data: any;
}

export abstract class Listener<T extends Event> {
    abstract subject: T['subject'];
    abstract queueGroupName: string;
    abstract onMessage(data: T['data'], msg: Message): void;
    private client: Stan;
    protected ackWait = 5 * 1000;

    constructor(client: Stan) {
        this.client = client;
    }

    subscriptionOptions() {
        return this.client
            .subscriptionOptions()
            .setDeliverAllAvailable()
            .setManualAckMode(true)
            .setAckWait(this.ackWait)
            .setDurableName(this.queueGroupName);
    }

    listen() {
        const subscription = this.client.subscribe(
            this.subject,
            this.queueGroupName,
            this.subscriptionOptions()
        );
        subscription.on('message', (msg: Message) => {
            console.log(`Message received: ${this.subject} / ${this.queueGroupName}`);
            const parsedData = this.parseMessage(msg);
            this.onMessage(parsedData, msg);
        });
    }

    parseMessage(msg: Message) {
        const data = msg.getData();
        return typeof data === 'string'
            ? JSON.parse(data)
            : JSON.parse(data.toString('utf8'));
    }
}
As I searched through the documentation, it seems I can do something like the following:
import { connect } from "nats";

const nc = await connect(); // connection options omitted
const jsm = await nc.jetstreamManager();
const cfg = {
    name: "EVENTS",
    subjects: ["events.>"],
};
await jsm.streams.add(cfg);
But it seems only the name and subjects options are available, while my original code relies on a data property so it can handle JSON objects. Is there a way to convert this code to JetStream, or do I have to change the logic of the whole application as well?
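One point worth noting: the stream config only describes the stream itself; JetStream message payloads are raw byte arrays, so JSON serialization is done at publish/consume time (nats.js ships a JSONCodec() helper for exactly this). Below is a sketch of what such a codec does, written without the nats dependency so it stays self-contained; the surrounding publish/consume calls in the comments are illustrative:

```typescript
// JetStream payloads are Uint8Array; JSON is your own (de)serialization concern.
// This mirrors the idea of nats.js's JSONCodec() (a sketch, not the library code).
const jsonCodec = {
  encode<T>(data: T): Uint8Array {
    // Serialize to JSON, then to bytes
    return new TextEncoder().encode(JSON.stringify(data));
  },
  decode<T>(payload: Uint8Array): T {
    // Bytes back to JSON, then to an object
    return JSON.parse(new TextDecoder().decode(payload)) as T;
  },
};

// With the real library you would then do roughly:
//   const js = nc.jetstream();
//   await js.publish("events.created", jsonCodec.encode({ id: 1 }));
// and in a consumer callback:
//   const data = jsonCodec.decode(msg.data);
```

So the data property from the old Event interface does not move into the stream config; it becomes the encoded payload of each published message.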

Node.js & MongoDB: find data in a collection and insert it into another collection

I want to perform a database migration from an old model to a new one.
I have a MongoDB database with 2 collections: the first one represents the old model and the other one represents the model the data will be transferred to.
Thank you for sharing if there is a way to get the job done; perhaps a link or anything else that could help.
Note: for example, I get the data from large (an instance of the old model) and must transfer it to another instance width (step by step until I have the second model).
import dotenv from 'dotenv';
import { MongoClient } from 'mongodb';

dotenv.config();

async function main() {
    const MONGO_URL = process.env.MONGO_URL || '';
    const client = new MongoClient(MONGO_URL);
    try {
        await client.connect();
        console.log('server is connected');
        await listDatabases(client);
    } finally {
        await client.close();
    }
}

main().catch(console.error);

async function listDatabases(client: MongoClient) {
    const databases = await client.db('park').admin().listDatabases();
    console.log('Databases:');
    databases.databases.forEach((db: { name: any }) => console.log(` - ${db.name}`));
}

async function ...{
}
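A common pattern for this kind of migration is to stream documents from the old collection with a cursor, transform each one into the new shape, and insert it into the new collection. Below is a sketch: the transform function and its field names (name becoming title) are assumptions for illustration, not taken from the actual models, and the driver calls are shown only in comments:

```typescript
// Hypothetical transform from the old document shape to the new one.
// The field mapping (name -> title) is an assumption, not from the original post.
function transformDoc(oldDoc: { _id: unknown; name?: string }) {
  return {
    _id: oldDoc._id,          // keep the same id so the migration is idempotent
    title: oldDoc.name ?? "", // renamed field with a safe default
    migratedAt: new Date(),   // bookkeeping: when this doc was migrated
  };
}

// With the official MongoDB driver, the copy loop would then look roughly like:
//   const cursor = client.db('park').collection('oldModel').find({});
//   for await (const doc of cursor) {
//     await client.db('park').collection('newModel').insertOne(transformDoc(doc));
//   }
```

If no per-document logic is needed, an aggregation pipeline ending in a $merge (or $out) stage can do the same copy entirely server-side.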

DynamoDB DocumentClient does not return any data after put operation

I'm testing a lambda using the Serverless Framework with the sls offline command. This lambda should connect to my local DynamoDB (initialized with a docker-compose image) and put new data into Dynamo using aws-sdk, but I can never get the return value of the put().promise() call; if I use the get function I don't get any return either. I checked and the data is being inserted into DynamoDB. The code follows below:
import ILocationData, { CreateLocationDTO } from '#domain/location/data/ILocationData';
import { LocationEntity } from '#domain/location/entities/LocationEntity';
import { uuid } from 'uuidv4';
import DynamoDBClient from './DynamoDBClient';

export default class LocationProvider extends DynamoDBClient implements ILocationData {
    private tableName = 'Locations';

    public async createLocation(data: CreateLocationDTO): Promise<LocationEntity> {
        const toCreateLocation: LocationEntity = {
            ...data,
            locationId: uuid(),
            hasOffers: false,
        };
        try {
            const location = await this.client
                .put({
                    TableName: this.tableName,
                    Item: toCreateLocation,
                    ReturnValues: 'ALL_OLD',
                })
                .promise();
            console.log(location);
            return location.Attributes as LocationEntity;
        } catch (err) {
            console.log(err);
            return {} as LocationEntity;
        }
    }
}
DynamoDBClient.ts -> Class file
import * as AWS from 'aws-sdk';
import { DocumentClient } from 'aws-sdk/clients/dynamodb';

abstract class DynamoDBClient {
    public client: DocumentClient;
    private config = {};

    constructor() {
        if (process.env.IS_OFFLINE) {
            this.config = {
                region: process.env.DYNAMO_DB_REGION,
                accessKeyId: 'xxxx',
                secretAccessKey: 'xxxx',
                endpoint: process.env.DYNAMO_DB_ENDPOINT,
            };
        }
        this.client = new AWS.DynamoDB.DocumentClient(this.config);
    }
}

export default DynamoDBClient;
I assume locationId is your partition key, and you assign it uuid(), which is always unique, so your put operation will never update any existing item. With ReturnValues: 'ALL_OLD', a put returns something only when there is already an existing item with the same partition key, which gets overwritten by the newly provided item.
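The ALL_OLD semantics can be seen with a tiny in-memory stand-in (this is not the real DocumentClient, just the behavior described above):

```typescript
// In-memory stand-in illustrating DynamoDB's ReturnValues: 'ALL_OLD' for put.
// Not the real DocumentClient API; just the semantics.
type Item = { locationId: string; hasOffers?: boolean };

class FakeTable {
  private items = new Map<string, Item>();

  put(item: Item): { Attributes?: Item } {
    const old = this.items.get(item.locationId);
    this.items.set(item.locationId, item);
    // ALL_OLD: attributes are returned only when an existing item was overwritten.
    return old ? { Attributes: old } : {};
  }
}

const table = new FakeTable();
const first = table.put({ locationId: "abc", hasOffers: false });  // new item
const second = table.put({ locationId: "abc", hasOffers: true });  // overwrite
// first.Attributes is undefined; second.Attributes is the old item.
```

So for a create path with a fresh uuid there is nothing old to return; the simplest fix is to return toCreateLocation itself after the put succeeds.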

Nodejs Mongoose 'Operation `XXX.find()` buffering timed out after 10000ms'

index.ts is the entry point of this NodeJS program.
This is the code in index.ts:
import JobWorker from "./worker";
import { SwitchPlan } from "./jobs";
const worker = new JobWorker();
worker.addJob(SwitchPlan);
This is worker.ts:
import { CronJob } from "cron";
import mongoose from "mongoose";
import Config from "./config";
import logger from "./logger";

export default class JobWorker {
    private jobs: CronJob[];
    private config: {
        NAME: string;
        MONGO_URL: string;
    };

    constructor() {
        this.config = Config;
        this.connectDB();
        this.jobs = [];
    }

    public async connectDB(): Promise<void> {
        try {
            await mongoose.connect(this.config.MONGO_URL,
                { useUnifiedTopology: true, useNewUrlParser: true, useCreateIndex: true },
            );
            logger.info("\nMONGODB has been connected\n");
        } catch (err) {
            logger.error("ERROR occurred while connecting to the database");
        }
    }

    addJob(cronJob: CronJob) {
        this.jobs.push(cronJob);
    }
}
This is jobs.ts:
import moment from "moment";
import {
    DatabaseOperations, Vehicle,
    Plan1Doc, Plan1, VehicleDoc
} from "common-lib";
import logger from "../logger";
import { CronJob } from "cron";

const vehicleOps = new DatabaseOperations(Vehicle);

const SwitchPlan = new CronJob("25 * * * * *", async (): Promise<void> => {
    const date: Date = moment(new Date()).startOf("date").toDate();
    const expiringVehicles = vehicleOps.getAllDocuments(
        {
            "inspection.startTime": {
                "$gte": date, "$lte": moment(date).startOf("date").add(1, "day").toDate()
            }
        },
        {}, { pageNo: 0, limit: 0 }
    ).then((result: any) => {
        logger.info("dsada");
    }).catch((err: any) => {
        logger.info("ssd");
    });
});
SwitchPlan.start();

export { SwitchPlan };
I have omitted the parts of the code that are irrelevant to this problem. I ran this code through a debugger and there's no issue with the config; "MONGODB has been connected" is printed at the start of the program. However, the then block after getAllDocuments in jobs.ts is never reached, and it always goes into the error block with the message Operation vehicleinventories.find() buffering timed out after 10000ms. getAllDocuments uses MongoDB's find() method and is working correctly, because I use this method in other projects where I have no such issues.
So far I have tried deleting Mongoose from node_modules and reinstalling, and connecting to MongoDB running on localhost, but the issue remains unsolved.
EDIT: DatabaseOperations class:
import { Model, Schema } from "mongoose";

class DatabaseOperations {
    private dbModel: Model<any>;

    constructor(dbModel: Model<any>) {
        this.dbModel = dbModel;
    }

    getAllDocuments(
        query: any,
        projections: any,
        options: { pageNo: number; limit: number },
        sort?: any
    ): any {
        const offset = options.limit * options.pageNo;
        return this.dbModel
            .find(query, projections)
            .skip(offset)
            .limit(options.limit)
            .sort(sort ? sort : { createdAt: -1 })
            .lean();
    }
}
In your jobs.ts file you have the following line:
SwitchPlan.start();
This line runs the moment the file is required, hence before mongoose is connected and the models are defined. Could this be the issue?
Another thing I noted is that you are using mongoose.connect, which may be wrong, since mongoose.connect creates a global connection: with each new JobWorker you will be overriding the mongoose property with the previous connection.
Though I'm not sure what the implication is, it could be that your .find is using the old connection.
Since you are writing a class, I would recommend using mongoose.createConnection, which creates a new connection for each class instantiation.
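The first point, starting the cron job only after the connection resolves, can be sketched like this. The connectDB callback and the job objects below are placeholders standing in for the real mongoose/cron APIs:

```typescript
// Sketch of "connect first, then start the cron jobs".
// connectDB and Job are placeholders, not the real mongoose/cron APIs.
type Job = { started: boolean; start(): void };

async function bootstrap(connectDB: () => Promise<void>, jobs: Job[]) {
  await connectDB();                    // models are only safe to use after this
  for (const job of jobs) job.start();  // only now start the cron jobs
}

const job: Job = { started: false, start() { this.started = true; } };
bootstrap(async () => { /* mongoose.connect(...) would go here */ }, [job]);
// job.start() runs only after the (placeholder) connection has resolved,
// never synchronously at require time.
```

Applied to the original code, this means exporting SwitchPlan unstarted and calling start() from index.ts after JobWorker's connectDB promise resolves.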

Bookshelf circular dependency with ES5

I have the following code:
const bookshelf = require('../config/bookshelf');
const BaseModel = require('bookshelf-modelbase')(bookshelf);
const moment = require("moment");
const User = require("./User");
const Meta = require("./Meta");
const Log = require("./Log");

class Session extends BaseModel {
    get tableName() {
        return "sessions";
    }
    get hasTimestamps() {
        return false;
    }
    user() {
        return this.belongsTo(User);
    }
    meta() {
        return this.belongsTo(Meta);
    }
    logs() {
        return this.hasMany(Log);
    }
}

module.exports = Session;
and
const bookshelf = require('../config/bookshelf');
const BaseModel = require('bookshelf-modelbase')(bookshelf);
const Session = require("./Session");
const moment = require("moment");

class Log extends BaseModel {
    get tableName() {
        return "logs";
    }
    get hasTimestamps() {
        return false;
    }
    session() {
        return this.belongsTo(Session);
    }
    getDate() {
        return moment(this.get("date")).format("MMM DD, YYYY - HH:mm:ss");
    }
}

module.exports = Log;
The belongsTo relation works properly, but when I try hasMany, I get: "Unhandled rejection Error: A valid target model must be defined for the sessions hasMany relation".
I had a look at https://github.com/tgriesser/bookshelf/wiki/Plugin:-Model-Registry but it is being done using pre-ES5 syntax.
I guess I need to make sure the Log class is available before I set up the hasMany relationship, but I'm stuck here.
Any ideas?
Edit: Doing
logs() {
    var log = require("./Log");
    return this.hasMany(log);
}
works but it looks bad.
You may use the Bookshelf registry; it exactly fits your needs.
You can load the Bookshelf registry plugin like this:
const knex = require('knex')({
    client: 'pg',
    connection: {
        host: config.db_host,
        user: config.db_user, // user name for your database
        password: config.db_password, // user password
        database: config.db_database, // database name
        charset: 'utf8',
    },
});

const bookshelf = require('bookshelf')(knex);
bookshelf.plugin('registry');

module.exports = {
    knex,
    bookshelf,
};
Then, register your models in your module.exports like this:
module.exports = {
    session: db.bookshelf.model('Session', Session),
    log: db.bookshelf.model('Log', Log),
    db,
};
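The reason this fixes the circular dependency is that the registry lets relations reference models by name (e.g. this.hasMany('Log')), and the name is resolved lazily when the relation is invoked, after both files have loaded. The mechanism can be sketched with a plain Map (a stand-in for illustration, not the Bookshelf API):

```typescript
// Sketch of the lazy name-based lookup the registry plugin provides.
// A plain Map stand-in; not the real Bookshelf API.
const registry = new Map<string, unknown>();

function model(name: string, cls?: unknown): unknown {
  if (cls !== undefined) registry.set(name, cls); // register a model under a name
  const found = registry.get(name);               // resolve the name to the class
  if (found === undefined) throw new Error(`Model ${name} not registered`);
  return found;
}

// A relation can capture only the model *name* at definition time...
const logsRelation = () => model('Log');

// ...and the class can be registered later, after both modules have loaded:
class Log {}
model('Log', Log);
// Calling logsRelation() now resolves to Log, even though the relation
// was defined before Log existed.
```

Because nothing is looked up until the relation is actually called, neither file needs to require the other at load time, which is exactly what breaks the cycle.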
