I use Node.js, TypeScript, and TypeORM for a back-end project.
I need to connect to a different database in the middleware according to a parameter I send, and then run a query against that database.
ormconfig
[
{
"name": "default",
"type": "postgres",
"host": "localhost",
"port": 5432,
"username": "postgres",
"password": "12345",
"database": "dbOne"
},
{
"name": "second-connection",
"type": "postgres",
"host": "localhost",
"port": 5432,
"username": "postgres",
"password": "12345",
"database": "dbTwo"
}
]
Those are my connection settings.
After that, I try to connect in the middleware:
const connectionOptions = await getConnectionOptions("second-connection");
const conTwo = await createConnection(connectionOptions);
const managerTwo = getManager("second-connection");
const resultTwo = await managerTwo
.createQueryBuilder(SysCompany, "company")
.getOne();
console.log(resultTwo);
I think I can connect to the database, but I'm having trouble with the repository.
Error
EntityMetadataNotFound: No metadata for "SysCompany" was found.
import { Entity, Column, OneToMany } from "typeorm";

@Entity()
export class SysCompany extends CoreEntityWithTimestamp {
@Column({ length: 100 })
name: string;
// FK
// SysPersonnel
@OneToMany(type => SysPersonnel, personnel => personnel.sysCompany)
sysPersonnels: SysPersonnel[];
}
Maybe TypeORM cannot find your compiled JavaScript entity. I had that problem some time ago. You can do the following:
Check your output folder after you build the project. Is your SysCompany.js available?
Set the entities property in the configuration. It must contain the path to your JS entities. The TypeORM docs state that "Each entity must be registered in your connection options".
{
"name": "second-connection",
"type": "postgres",
"host": "localhost",
"port": 5432,
"username": "postgres",
"password": "12345",
"database": "dbTwo",
"entities": ["<path to entities>/**/*.js"]
}
I would also recommend using a JavaScript configuration file. Your ormconfig.js can then use __dirname (the directory name of the current module) to set the path. So if your directories look like this:
project/ormconfig.js
project/dist/entity/SysCompany.js
project/dist/entity/OtherEntity.js
You can use a configuration like this:
import {join} from "path";
...
entities: [
join(__dirname, "dist/entity/**/*.js")
],
...
You could also prevent duplication by using a base configuration object.
import {join} from "path";
const baseOptions = {
type: "postgres",
host: "localhost",
port: 5432,
username: "postgres",
password: "12345",
entities: [
join(__dirname, "dist/entity/**/*.js")
]
}
const defaultConfig = Object.assign({
name: "default",
database: "dbOne",
}, baseOptions);
const secondConfig = Object.assign({
name: "second-connection",
database: "dbTwo",
}, baseOptions);
module.exports = [ defaultConfig, secondConfig ];
In the file where you open the connection you can then require the configuration. Since ormconfig.js exports an array, pick the entry you need:
const [, secondConfig] = require("<path to file>/ormconfig");
const conTwo = await createConnection(secondConfig);
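Tying this back to the original question: once the connections exist, a middleware can pick one by name from a request parameter and run the query there. A minimal sketch, assuming an Express app, the TypeORM 0.2.x API, and that both connections were already created at startup; the db query parameter and the entity import path are made up for illustration:
import { Request, Response, NextFunction } from "express";
import { getConnection } from "typeorm";
import { SysCompany } from "./entity/SysCompany"; // import path is an assumption

export async function companyMiddleware(req: Request, res: Response, next: NextFunction) {
  try {
    // hypothetical query parameter, e.g. ?db=second
    const name = req.query.db === "second" ? "second-connection" : "default";
    // run the query from the question against the chosen connection
    const company = await getConnection(name)
      .manager
      .createQueryBuilder(SysCompany, "company")
      .getOne();
    res.locals.company = company;
    next();
  } catch (err) {
    next(err);
  }
}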
The simplest way to use multiple databases is to create different connections:
import {createConnections} from "typeorm";
const connections = await createConnections([{
name: "db1Connection",
type: "mysql",
host: "localhost",
port: 3306,
username: "root",
password: "admin",
database: "db1",
entities: [__dirname + "/entity/*{.js,.ts}"],
synchronize: true
}, {
name: "db2Connection",
type: "mysql",
host: "localhost",
port: 3306,
username: "root",
password: "admin",
database: "db2",
entities: [__dirname + "/entity/*{.js,.ts}"],
synchronize: true
}]);
This approach lets you connect to any number of databases, and each database will have its own configuration, entities, and overall ORM scope and settings.
For each connection a new Connection instance will be created. You must specify a unique name for each connection you create.
The connection options can also be loaded from an ormconfig file. You can load all connections from the ormconfig file:
import {createConnections} from "typeorm";
const connections = await createConnections();
or you can specify which connection to create by name:
import {createConnection} from "typeorm";
const connection = await createConnection("db2Connection");
When working with connections you must specify a connection name to get a specific connection:
import {getConnection} from "typeorm";
const db1Connection = getConnection("db1Connection");
// you can work with "db1" database now...
const db2Connection = getConnection("db2Connection");
// you can work with "db2" database now...
The benefit of this approach is that you can configure multiple connections with different login credentials, hosts, ports, and even database types. The downside is that you'll need to manage and work with multiple connection instances.
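One practical caveat when doing this per request: calling createConnection again for a name that already exists throws an error, so each named connection should be created once and then reused. A minimal sketch using the 0.2.x connection manager, assuming the ormconfig entries shown earlier (the helper name is made up):
import { getConnectionManager, getConnectionOptions, createConnection, Connection } from "typeorm";

// Hypothetical helper: reuse a named connection if it already exists,
// otherwise create it from the matching ormconfig entry.
async function getOrCreateConnection(name: string): Promise<Connection> {
  const manager = getConnectionManager();
  if (manager.has(name)) {
    const existing = manager.get(name);
    return existing.isConnected ? existing : existing.connect();
  }
  const options = await getConnectionOptions(name); // reads the entry with this name
  return createConnection(options);
}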
Related
I'm a newbie to TypeORM and trying to create a connection to the database. I read TypeORM's docs and found this code, which uses DataSource to create the connection:
import "reflect-metadata"
import { DataSource } from "typeorm"
import { Photo } from "./entity/Photo"
const AppDataSource = new DataSource({
type: "postgres",
host: "localhost",
port: 5432,
username: "root",
password: "admin",
database: "test",
entities: [Photo],
synchronize: true,
logging: false,
})
AppDataSource.initialize()
.then(() => {
// here you can start to work with your database
})
.catch((error) => console.log(error))
But when searching for some references in other sources, they use createConnection instead:
import { createConnection } from "typeorm"
createConnection({
type: "mysql",
host: "localhost",
port: 3306,
username: "root",
password: "mysql",
database: "mysql",
entities: [
__dirname + "/entity/*.ts"
],
synchronize: true,
logging: false
}).then(async connection => {
…
…
}).catch(error => console.log(error));
I'm a bit confused. Which of the two approaches above should I use to create a connection to the database?
For the next person who comes across this issue: createConnection has been deprecated in favor of new DataSource. See here:
https://typeorm.io/changelog#030httpsgithubcomtypeormtypeormpull8616-2022-03-17
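With 0.3.x, the multi-database setup shown earlier maps to one DataSource per database instead of createConnections. A rough sketch under that assumption (names like db1DataSource are made up for illustration):
import "reflect-metadata"
import { DataSource } from "typeorm"

export const db1DataSource = new DataSource({
  type: "mysql",
  host: "localhost",
  port: 3306,
  username: "root",
  password: "admin",
  database: "db1",
  entities: [__dirname + "/entity/*{.js,.ts}"],
  synchronize: true,
})

export const db2DataSource = new DataSource({
  type: "mysql",
  host: "localhost",
  port: 3306,
  username: "root",
  password: "admin",
  database: "db2",
  entities: [__dirname + "/entity/*{.js,.ts}"],
  synchronize: true,
})

// initialize both, then use db1DataSource.manager / db2DataSource.getRepository(...)
Promise.all([db1DataSource.initialize(), db2DataSource.initialize()])
  .then(() => {
    // both databases are ready here
  })
  .catch((error) => console.log(error))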
I'm having trouble setting up a project with the config npm package.
My config.js looks like this:
require('dotenv').config();
const { DB_HOST, DB_USERNAME, DB_PASSWORD, SENDGRID_API_KEY } = process.env;
const env = process.env.NODE_ENV;
const config = {
development: {
username: DB_USERNAME,
password: DB_PASSWORD,
database: "database_development",
host: DB_HOST,
dialect: "postgres",
sendgrid: {
base_url: 'https://localhost:3002',
sendgrid_api_key: SENDGRID_API_KEY,
sender_email: '',
enabled: true,
},
},
test: {
username: DB_USERNAME,
password: DB_PASSWORD,
database: "database_test",
host: DB_HOST,
dialect: "postgres",
sendgrid: {
base_url: 'https://localhost:3002',
sendgrid_api_key: SENDGRID_API_KEY,
sender_email: '',
enabled: true,
},
},
production: {
username: DB_USERNAME,
password: DB_PASSWORD,
database: "database_production",
host: DB_HOST,
dialect: "postgres",
sendgrid: {
base_url: 'https://localhost:3002',
sendgrid_api_key: SENDGRID_API_KEY,
sender_email: '',
enabled: true,
},
}
};
module.exports = config[env];
In one service driver file I have the following line:
const dialect = config.get('dialect');
I get the following error: "Error: Configuration property "dialect" is not defined".
I also tried using 'development.dialect' but that doesn't help either.
Is it possible that require('config') doesn't work?
In my Sequelize index.js file I've got
const config = require(__dirname + '/../config/config.js'); and that seems to work fine.
This GitHub repository provides a good example of using the config package:
https://github.com/basarbk/tdd-nodejs
Notes:
For different configs you must have a file named after whatever you use for NODE_ENV. For example, for "start": "cross-env NODE_ENV=production node index", create a file called production.js (a minimal layout sketch follows these notes).
Check the config folder in the root directory.
If you are on Windows, you should use cross-env.
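For reference, the config package looks for files named default.js and <NODE_ENV>.js inside a config/ folder and merges them; a single config.js that exports an object keyed by environment is not picked up, which is why config.get('dialect') fails. A minimal sketch of the expected layout (file contents are hypothetical):
// config/default.js - shared values
module.exports = {
  dialect: "postgres",
};

// config/development.js - merged over default.js when NODE_ENV=development
module.exports = {
  database: "database_development",
  host: "localhost",
};

// in a service file
const config = require("config");
const dialect = config.get("dialect");    // "postgres"
const database = config.get("database");  // "database_development"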
I just started learning NestJS and came from Laravel. How do I use a different database host when reading (SELECT statements) and writing (INSERT statements) within the same database?
In Laravel (the one I have experience with), it looks something like this:
'mysql' => [
'read' => [
'host' => ['192.168.1.1'],
],
'write' => [
'host' => ['196.168.1.2'],
],
'driver' => 'mysql',
'database' => 'database',
'username' => 'root',
'password' => '',
],
As you can see above, I am using the same database but different hosts for read/write. Is it possible to do the same with NestJS? I'm using TypeORM if it matters.
Sure, TypeORM supports read/write replication (see the TypeORM docs on replication for details).
Here's an example (the master instance is used for write operations, a random slave instance is used for reads):
{
type: "mysql",
replication: {
master: {
host: "192.168.1.1",
port: 3306,
username: "test",
password: "test",
database: "test"
},
slaves: [{
host: "192.168.1.2",
port: 3306,
username: "test",
password: "test",
database: "test"
}]
}
}
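In NestJS, this same configuration would typically be passed to TypeOrmModule.forRoot from the @nestjs/typeorm package; a rough sketch under that assumption (the entities glob is a placeholder):
import { Module } from "@nestjs/common";
import { TypeOrmModule } from "@nestjs/typeorm";

@Module({
  imports: [
    TypeOrmModule.forRoot({
      type: "mysql",
      replication: {
        // master handles writes, slaves handle reads (values from the example above)
        master: { host: "192.168.1.1", port: 3306, username: "test", password: "test", database: "test" },
        slaves: [{ host: "192.168.1.2", port: 3306, username: "test", password: "test", database: "test" }],
      },
      entities: [__dirname + "/**/*.entity{.ts,.js}"], // placeholder glob
    }),
  ],
})
export class AppModule {}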
I am using LoopBack to develop an API and I'm having a problem with per-environment config in LB3. I have a datasources.json file as below:
{
"SSOSystem": {
"host": "x.y.z.o",
"port": 27017,
"url": "mongodb://SSO-System-mongo:U5KckMwrWs9EGyAh#x.y.z.o/SSO-System",
"database": "SSO-System",
"password": "U5KckMwrWs9EGyAh",
"name": "SSOSystem",
"connector": "mongodb",
"user": "SSO-System-mongo"
}
}
and datasources.local.js as below
module.exports = {
SSOSytem: {
connector: 'mongodb',
hostname: process.env.SSO_DB_HOST || 'localhost',
port: process.env.SSO_DB_PORT || 27017,
user: process.env.SSO_DB_USERNAME,
password: process.env.SSO_DB_PASSWORD,
database: process.env.SSO_DB_NAME,
url: `mongodb://${process.env.SSO_DB_USERNAME}:${process.env.SSO_DB_PASSWORD}#${process.env.SSO_DB_HOST}/${process.env.SSO_DB_NAME}`
}
}
but when I run my app with the local env,
NODE_ENV=local node .
LoopBack only loads the datasources from the datasources.json file. Did I do something wrong in the datasources config? Does anyone have the same problem?
Many thanks,
Sorry, this was a typo on my part, not a problem with LoopBack: the key is SSOSystem in datasources.json but SSOSytem in datasources.local.js.
Background
I am creating a boilerplate Express application. I have configured a database connection using pg and Sequelize. When I add the CLI and try to run sequelize db:migrate I get this error:
ERROR: The dialect [object Object] is not supported. Supported
dialects: mssql, mysql, postgres, and sqlite.
Replicate
Generate a new express application. Install pg, pg-hstore, sequelize and sequelize-cli.
Run sequelize init.
Add a config.js file to the /config path that was created from sequelize init.
Create the connection in the config.js file.
Update the config.json file created by sequelize-cli.
Run sequelize db:migrate
Example
/config/config.js
const Sequelize = require('sequelize');
const { username, host, database, password, port } = require('../secrets/db');
const sequelize = new Sequelize(database, username, password, {
host,
port,
dialect: 'postgres',
operatorsAliases: false,
pool: {
max: 5,
min: 0,
acquire: 30000,
idle: 10000
}
});
module.exports = sequelize;
/config/config.json
{
"development": {
"username": "user",
"password": "pass",
"database": "db",
"host": "host",
"dialect": "postgres"
},
"test": {
"username": "user",
"password": "pass",
"database": "db",
"host": "host",
"dialect": "postgres"
},
"production": {
"username": "user",
"password": "pass",
"database": "db",
"host": "host",
"dialect": "postgres"
}
}
Problem
I expect the initial migrations to run but instead get an error,
ERROR: The dialect [object Object] is not supported. Supported
dialects: mssql, mysql, postgres, and sqlite.
Versions
Dialect: postgres
Dialect version: "pg":7.4.3
Sequelize version: 4.38.0
Sequelize-Cli version: 4.0.0
Package Json
"pg": "^7.4.3",
"pg-hstore": "^2.3.2",
"sequelize": "^4.38.0"
Installed globally
npm install -g sequelize-cli
Question
Now that the major rewrite has been released for sequelize, what is the proper way to add the dialect so the migrations will run?
It is important to note that my connection is working fine. I can query the database without problems, only sequelize-cli will not work when running migrations.
I ran into the same problem. There are a few things you need to change. First, I am not sure why you have two config/config.js files; I assume the second one is config.json. The reason you run into this problem is that
const sequelize = new Sequelize(database, username, password, {
host,
port,
dialect: 'postgres',
operatorsAliases: false,
pool: {
max: 5,
min: 0,
acquire: 30000,
idle: 10000
}
});
these lines of code are used by the Node server to access the database, not by sequelize-cli to run migrations. You need to follow the sequelize-cli instructions exactly. Here is the link: instructions
My code:
config/db.js
const {sequlize_cli} = require('../config.json');
module.exports = sequlize_cli;
config.json
{
"sequlize_cli":{
"development":{
"username":"root",
"password":"passowrd",
"database":"monitor",
"host":"127.0.0.1",
"dialect": "postgres"
},
"test": {
"username":"root",
"password":"passowrd",
"database":"monitor",
"host":"127.0.0.1",
"dialect": "postgres"
},
"production": {
"username":"root",
"password":"passowrd",
"database":"monitor",
"host":"127.0.0.1",
"dialect": "postgres"
}
}
}
The main point, I guess, is to export the JSON object directly instead of exporting a Sequelize object. In addition, this only seems to be a problem with Postgres; I tested with MySQL, and your code works perfectly there.
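Another common way to keep a single source of truth is a .sequelizerc file in the project root that points sequelize-cli at a JS config exporting the per-environment object; a rough sketch (paths and environment variable names are assumptions):
// .sequelizerc (project root)
const path = require("path");

module.exports = {
  config: path.resolve("config", "config.js"),
  "models-path": path.resolve("models"),
  "migrations-path": path.resolve("migrations"),
  "seeders-path": path.resolve("seeders"),
};

// config/config.js - the plain per-environment object the CLI expects
module.exports = {
  development: {
    username: process.env.DB_USERNAME,
    password: process.env.DB_PASSWORD,
    database: "db",
    host: process.env.DB_HOST,
    dialect: "postgres",
  },
};
The application can then build its own Sequelize instance from the same exported object, so the server and the CLI never drift apart.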