auto-generate models for sequelize - node.js

I'm wanting to start using Sequelize, a module that provides an ORM for MySQL in Node.js. I was wondering if it's possible to auto-generate the models like CakePHP does. In CakePHP, it reads the table's info and automatically creates the associations and fields with their types in the model. I'd really hate to have to completely map out all my tables by hand, as some are relatively large. Is there something out there that will do this for me? Or am I on my own to hand-type all the models out?

You can auto-generate models with sequelize-auto:
https://github.com/sequelize/sequelize-auto
It will generate a model for each of your tables.
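For example, a rough sketch of an install-and-run invocation (the host, database, user and password below are placeholders, and flag names can vary between versions, so check the repository's README):
npm install --save-dev sequelize-auto mysql2
# -h host, -d database, -u user, -x password, -p port, -e dialect, -o output directory
npx sequelize-auto -h localhost -d my_database -u my_user -x my_password -p 3306 -e mysql -o ./models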

Now you can use sequelize-automate to generate models automatically. sequelize-auto seems to have been unmaintained for a long time, and that package is out of date.
$ npm install sequelize-automate
$ ./node_modules/.bin/sequelize-automate --help
For example:
$ ./node_modules/.bin/sequelize-automate -t js -h localhost -d test -u root -p root -o models

Sequelizer - a desktop application to export Sequelize models automatically and visually.
It's a pretty impressive GUI client built with Electron.
Source: https://github.com/andyforever/sequelizer

You can use the sync method on each model.
Example:
Object.keys(db).forEach((modelName) => {
  db[modelName].sync().then(result => {
    // some logic
  }).catch(err => {
    // some logic
  })
});
sync() will create the table if it does not already exist.
Full script, index.js:
'use strict';

const fs = require("fs");
const path = require("path");
const Sequelize = require("sequelize");

const sequelize = new Sequelize(process.env.DB_DATABASE, process.env.DB_USER, process.env.DB_PASS, {
  host: process.env.DB_HOST,
  port: process.env.DB_PORT,
  dialect: 'mysql',
  operatorsAliases: false
});

const db = {};

// load every model file in this directory, skipping hidden files,
// this index file and the migration folders
fs
  .readdirSync(__dirname)
  .filter((file) => {
    return (file.indexOf(".") !== 0) && (file !== "index.js") && (file !== "migrations") && (file !== "redshift-migrations");
  })
  .forEach((file) => {
    const model = sequelize.import(path.join(__dirname, file));
    db[model.name] = model;
  });

// wire up associations, then sync each model with the database
Object.keys(db).forEach((modelName) => {
  if ("associate" in db[modelName]) {
    db[modelName].associate(db);
  }
  db[modelName].sync().then(result => {
    // some logic
  }).catch(err => {
    // some logic
  })
});

db.sequelize = sequelize;
db.Sequelize = Sequelize;

module.exports = db;
The script picks up every model file in the directory where you place this index.js.
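sync() also accepts options if you want existing tables updated to match changed model definitions; a minimal sketch (alter and especially force can modify or drop existing tables, so use them with care):
// sketch: synchronize all loaded models in one call instead of per model
// { alter: true } tries to change existing tables to match the models,
// { force: true } drops and re-creates them
db.sequelize.sync({ alter: true })
  .then(() => console.log('all models synchronized'))
  .catch((err) => console.error('sync failed', err));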

Install sequelize-cli and sequelize-auto as dev dependencies:
npm install --save-dev sequelize-cli
npm install --save-dev sequelize-auto
Then generate the models (replace <database> with your database name):
npx sequelize-auto -o "./database/models" -d <database> -h localhost -u root -p 3306 -x '' -e mysql

see https://github.com/sequelize/sequelize/issues/339
Sequelize provides methods to read the existing table names of a database. Furthermore, there is a method to read the structure of a table. Combined, these should make it possible to automate the creation of models.
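For reference, a rough sketch of those two methods (showAllTables and describeTable); this only dumps the table metadata, and turning it into model files is up to you:
const Sequelize = require('sequelize');

// placeholder connection string - replace with your own credentials
const sequelize = new Sequelize('mysql://my_user:my_password@localhost:3306/my_database');

async function dumpSchema() {
  const queryInterface = sequelize.getQueryInterface();
  const tables = await queryInterface.showAllTables();          // array of table names
  for (const table of tables) {
    const columns = await queryInterface.describeTable(table);  // column name -> { type, allowNull, ... }
    console.log(table, columns);
  }
}

dumpSchema().finally(() => sequelize.close());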

Related

Knex pool full on migration

I'm trying to get started with knex.js and I can't get migrations to work. Knex works fine for my API calls. Here's my setup:
knexfile.js
const env = process.env;

module.exports = {
  client: 'mysql',
  connection: {
    host: env.DB_HOST,
    database: env.DB_NAME,
    user: env.DB_USER,
    password: env.DB_PASSWORD,
    port: env.PORT
  },
  pool: {
    min: 0,
    max: 50
  },
  migrations: {
    directory: './db/migrations',
    tableName: 'knex_migrations'
  },
  seeds: {
    directory: './db/seeds'
  }
};
knex.js
const config = require('../knexfile.js');
module.exports = require('knex')(config);
events.js
const express = require('express');
const router = express.Router();
const knex = require('../../db/knex.js');

// GET api/events
router.get('/', (req, res) => {
  knex('events')
    .then(events => { res.send(events) })
    .catch(err => { console.log(err); })
});

module.exports = router;
and then I have a file in the migrations folder with:
exports.up = function(knex) {
  return knex.schema.createTable('users', function (t) {
    t.increments('id').primary()
    t.string('username').notNullable()
    t.string('password').notNullable()
    t.timestamps(false, true)
  }).then(() => { console.log('created users table') })
    .catch((err) => { throw err })
    .finally(() => { knex.destroy() })
};

exports.down = function(knex) {
  return knex.schema.dropTableIfExists('users')
};
When I run knex migrate:latest I get TimeoutError: Knex: Timeout acquiring a connection. The pool is probably full. Are you missing a .transacting(trx) call?
I know similar questions have been asked before, but I can't seem to find any that shed light on my particular situation. I've tried adding a knex.destroy() to the end of my GET request but that doesn't seem to help (it just makes the connection unusable if I add other request handlers below).
I did try checking the knex.client.pool in a finally clause at the end of the GET request. numUsed was 0, numFree was 1, numPendingAcquires and numPendingCreates were both 0. I do find it odd that numFree was only 1 given that my knexfile specifies max 50. Any advice greatly appreciated.
Following @technogeek1995's comment, the answer turned out to be adding require('dotenv').config({path: '../.env'}); to knexfile.js (in retrospect, this part seems obvious), and running the CLI from the same directory. Hope this helps someone else.
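In other words, the top of knexfile.js ends up looking roughly like this (the relative path to .env depends on the directory you run the CLI from):
// knexfile.js - load .env before reading process.env
require('dotenv').config({ path: '../.env' });

const env = process.env;

module.exports = {
  client: 'mysql',
  connection: {
    host: env.DB_HOST,
    database: env.DB_NAME,
    user: env.DB_USER,
    password: env.DB_PASSWORD,
    port: env.PORT
  },
  // ...pool, migrations and seeds unchanged from the original knexfile
};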

Sequelize on NodeJS/ExpressJS return an error when running CLI command db:migrate

I'm following the tutorial Using PostgreSQL and Sequelize to persist our data on Medium, and right now I'm stuck at the db:migrate step. It's returning this error:
Sequelize CLI [Node: 12.1.0, CLI: 5.4.0, ORM: 5.8.2]
Loaded configuration file "config.json".
Using environment "development".
ERROR: Error parsing url: undefined
As you can see, I'm using Node.js version 12.1.0, Sequelize CLI version 5.4.0 and Sequelize version 5.8.2, all of which are the latest versions.
Before running sequelize db:migrate, I first run SET DATABASE_URL=postgresql://[user[:password]@][netlocation][:port][/dbname] and it does not return any error.
But it returns the error after I run db:migrate.
I have already tried to track down the problem, but I can't find the answer yet.
Here is my ./Models/Index.js file.
'use strict';

require('dotenv').config();
import { readdirSync } from 'fs';
import { basename as _basename, join } from 'path';
import Sequelize from 'sequelize';

const basename = _basename(__filename);
const env = process.env.NODE_ENV || 'development';
const config = require(__dirname + '/../../config.json')[env];
const db = {};

let sequelize;
if (config.use_env_variable) {
  sequelize = new Sequelize(process.env[config.use_env_variable], config);
} else {
  sequelize = new Sequelize(config.database, config.username, config.password, config);
}

readdirSync(__dirname)
  .filter(file => {
    return (file.indexOf('.') !== 0) && (file !== basename) && (file.slice(-3) === '.js');
  })
  .forEach(file => {
    const model = sequelize['import'](join(__dirname, file));
    db[model.name] = model;
  });

Object.keys(db).forEach(modelName => {
  if (db[modelName].associate) {
    db[modelName].associate(db);
  }
});

db.sequelize = sequelize;
db.Sequelize = Sequelize;

export default db;
As you may notice, I converted it to ES6 syntax, which changed some of the code, but it didn't work before the conversion either. For all the rest of the files I followed the tutorial.
Here are the files that I think have a connection:
.env
DATABASE_URL=postgres://postgres:admin@localhost:5432/test_app
.sequelizerc
const path = require('path');

module.exports = {
  "config": path.resolve('./config.json'),
  "models-path": path.resolve('./app/Models'),
  "migrations-path": path.resolve('./migrations')
};
config.json
{
  "development": {
    "use_env_variable": "DATABASE_URL"
  },
  "test": {
    "use_env_variable": "DATABASE_URL"
  },
  "production": {
    "use_env_variable": "DATABASE_URL"
  }
}
If there are some files that I haven't included yet, please tell me, and please help me find the solution to this problem. Thank you.
OS: Windows 10
Basically, you are unable to set the environment variable DATABASE_URL successfully.
I am not a Windows guy, but this should do the job.
If you are using GitBash, then it is as simple as:
export DATABASE_URL=postgres://postgres@localhost:5432/database_name
and after that:
node_modules/.bin/sequelize db:migrate
EDIT:
I am not sure how to set this variable in gitbash and cmd.
Here is an alternate.
in config/config.json
"development": {
"username": "postgres"
"password": "postgres",
"database": "your_db_here",
"host": "127.0.0.1",
"dialect": "postgres"
},
Update these values to match your Postgres database.
and run:
node_modules/.bin/sequelize db:migrate
You cannot fetch values at runtime inside config.json. It has to be static.
You should either use config.json or env variables or roll your own like mentioned in another answer.
To use env variables, you will have to eschew config.json. Instead, in models/index.js, change
if (config.use_env_variable) {
  sequelize = new Sequelize(process.env[config.use_env_variable], config);
} else {
  sequelize = new Sequelize(config.database, config.username, config.password, config);
}
to
sequelize = new Sequelize(process.env.DATABASE_URL)
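Put together, a minimal sketch of that approach (assuming dotenv loads the .env file shown above and the URL includes everything needed to connect):
// models/index.js (simplified)
require('dotenv').config();
const Sequelize = require('sequelize');

// DATABASE_URL encodes user, password, host, port and database name
const sequelize = new Sequelize(process.env.DATABASE_URL, { dialect: 'postgres' });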
AFAIK Sequelize migrations are a different beast than the normal sequelize workflow.
It is reading config/config.json when it loads - so you cannot use system environment variables - it has to be a static json file.
What I do in my projects is have my config.js file make sure the config file is up to date with whatever settings I have.
I do this when the main program starts and also in package.json as follows:
(make sure to add npm-run-all to your package.json)
"scripts": {
"config": "node src/config.js",
"_migrate": "sequelize db:migrate",
"_migrate:status": "sequelize db:migrate:status",
"_migrate:undo": "sequelize db:migrate:undo",
"_seed": "sequelize db:seed:all",
"migrate": "npm-run-all config _migrate",
"migrate:status": "npm-run-all config _migrate:status",
"migrate:undo": "npm-run-all config _migrate:undo",
"seed": "npm-run-all config _seed"
},
config.js simply does something similar to this at the end of the file:
// Export sequelize config/config.json for easy compatibility with sequelize-cli
const fs = require('fs-extra'); // ensureDir comes from fs-extra, not the built-in fs module
const path = require('path');

const filepath = path.resolve(__dirname, '../../config');
const filename = path.join(filepath, 'config.json');

fs.ensureDir(filepath)
  .then(() => fs.writeFileSync(filename, JSON.stringify(sequelizeConfig, null, 2) + '\n'))
  .catch((err) => console.error(`Failed to write config: ${err}`));
sequelizeConfig should be the fully generated Sequelize config object. You can also have a generic one like you have now, and build upon it.
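For example, sequelizeConfig could be a hypothetical object built from environment variables, something along these lines:
// sketch: the object that gets serialized into config/config.json
// (the DB_* variable names here are just an example)
const sequelizeConfig = {
  development: {
    username: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    database: process.env.DB_NAME,
    host: process.env.DB_HOST || '127.0.0.1',
    dialect: 'postgres'
  }
  // test and production entries would follow the same shape
};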

knex.js - migration fails for dynamic schema name

Environment
Knex version: knex@0.16.3
Database + version: MySQL 5.7.24
OS: Linux (Manjaro)
Bug
I'm trying to set up an integration test suite using ava. As is known, ava spawns a separate Node process for each test file.
In order to keep the tests isolated, I need to re-create my database and run the migrations in every test file. I started bootstrapping and ended up with the following script:
import test from 'ava';
import cuid from 'cuid';
import knexFactory from 'knex';

import knexFile from './knexfile';

const knex = knexFactory(knexFile.development);

test.before(async t => {
  const schemaName = `foo_${cuid()}`;
  await knex.raw(`CREATE SCHEMA ${schemaName}`);
  try {
    await knex.migrate.latest({ schemaName });
  } catch (err) {
  }
  Object.assign(t.context, { schemaName });
});

test.after.always(async t => {
  const { schemaName } = t.context;
  await knex.raw(`DROP SCHEMA ${schemaName}`);
});

test('foo', () => {})
However, I'm running into the following error:
migration file "20190205110315_create_users_table.js" failed
migration failed with error: create table users (id int unsigned not null auto_increment primary key, email varchar(255), password text) - Table 'users' already exists
What is happening is that the .migrate.latest() call is ignoring its schemaName parameter and it's running against the configuration defined in the knexfile.js file, that is, the actual database.
The migration file is pretty straightforward:
exports.up = function(knex, Promise) {
  return knex.schema.createTable('users', table => {
    table.increments();
    table.string('email');
    table.text('password');
  })
};

exports.down = function(knex, Promise) {
  return knex.schema.dropTable('users');
};

Error trying to unit test Mongoose schema validation with Jest

I'm new to Jest and am trying to set up a simple test script:
"use strict"
// Local dependencies
const userModel = require('./user.model');
// Setting controllers
describe("Users endpoint", () => {
describe("Validating user schema", () => {
it("Should return an error if name property is missing", () => {
const user = new userModel();
user.validate((error) => {
expect(error.errors.name.kind).toBe("required");
});
});
});
});
When running Jest, I got the following error:
SyntaxError: Identifier 'global' has already been declared
2 |
3 | // Local dependencies
> 4 | const userModel = require('./user.model');
I search on Google and didn't find anything related to the 'global' identifier.
Any help would be really appreciated.
Thanks,
Steve
OK, so after digging a bit more, I figured out that the issue was a const named global inside my required script:
const global = require(path.join(__dirname, '..', 'config', 'config'));
If I change the name "global" to anything else (e.g. globalTest), it works.
So using "global" as an identifier seems to not be allowed.

Sequelize - Cannot read property 'list' of undefined

I'm just learning to use Sequelize for my Node.js project. In summary, my project is ExpressJS + TypeScript with Sequelize as the ORM and Webpack as the module bundler.
Below is my project structure.
src
-router
-server
--config
config.json
--controllers
index.ts
User.ts
--migrations
--models
index.js
user.js
--seeders
App.ts
index.ts
(sorry can not post picture yet, new user to stackoverflow)
I have built a simple route '/user' and expect it to call the user controller, which calls the Sequelize method findAll() from my models module, but instead I get an error saying Cannot read property 'list' of undefined. Below is my code:
models/index.js
const fs = require('fs');
const path = require('path');
const Sequelize = require('sequelize');

const basename = path.basename(module.filename);
const env = process.env.NODE_ENV || 'development';
const config = require(`${__dirname}/../config/config.json`)[env];
const db = {};

let sequelize;
if (config.use_env_variable) {
  sequelize = new Sequelize(process.env[config.use_env_variable]);
} else {
  sequelize = new Sequelize(
    config.database, config.username, config.password, config
  );
}

fs
  .readdirSync(__dirname)
  .filter(file =>
    (file.indexOf('.') !== 0) &&
    (file !== basename) &&
    (file.slice(-3) === '.js'))
  .forEach(file => {
    const model = sequelize.import(path.join(__dirname, file));
    db[model.name] = model;
  });

Object.keys(db).forEach(modelName => {
  if (db[modelName].associate) {
    db[modelName].associate(db);
  }
});

db.sequelize = sequelize;
db.Sequelize = Sequelize;

export default db;
models/user.js
export default function(sequelize, DataTypes) {
  var user = sequelize.define('user', {
    username: DataTypes.STRING,
    name: DataTypes.STRING,
    email: DataTypes.STRING,
    password: DataTypes.STRING,
    phone: DataTypes.STRING,
    wallet: DataTypes.DECIMAL
  }, {
    classMethods: {
      associate: function(models) {
        // associations can be defined here
        user.hasMany(models.top_up);
      }
    }
  });
  return user;
};
controllers/User.ts
let user = require('../models').user;

export default {
  list(req, res) {
    return user
      .findAll()
      .then(topUp => res.status(200).send(topUp))
      .catch(error => res.status(400).send(error));
  }
};
controllers/Index.ts
import users from './User'

export default {
  users
}
router/router.ts
import * as express from 'express';

const userController = require('../server/controllers').users;

// Init express router
let router = express.Router();

// Setting API URL
router.get('/', (req, res, next) => {
  res.json({
    message: 'Hello World!'
  });
});

router.get('/about', (req, res, next) => {
  res.send('<p>This is about about</p>');
});

router.get('/user', userController.list());

export default router
FYI, all of my project configuration for starting the Express server, compiling TypeScript and bundling with Webpack is already fine, and the other routes for '/' and '/about' work fine. I know there is something I'm missing; I'm still new to Sequelize. Thanks for the help.
TL;DR: server/controllers/index.ts does not export a binding named users, it exports a binding named default.
You are importing the controllers/Index.ts module using the require function. In a CommonJS environment, this imports the entire module.exports object. As currently transpiled by TypeScript, every named export of the required module is exposed as a property of the import.
An export default clause implies an export named default. As per the ES Module specification, there is a shorthand for importing the default export of a module. That shorthand is
import userController from '../server/controllers';
On the other hand, the syntax
import userController = require('../server/controllers');
or
const userController = require('../server/controllers'); // (or let or var)
imports an object with a property corresponding to each export. In this case it has the shape
{ default }
So if you use require, you need to write
const userController = require('../server/controllers').default;
or
import userController from '../server/controllers';
All ES Module style exports are named, including the default which is named default.
To illustrate this, consider the following, more verbose but semantically identical form
import {default as userController} from '../server/controllers';
If you would prefer to stick with CommonJS style exports, eschewing ES Modules when working in NodeJS, the idiomatic approach is to export a single object as the entire module (the object returned by require).
You may write
// ../server/controllers/index.ts
export = {
  list(req, res) {
    return user
      .findAll()
      .then(topUp => res.status(200).send(topUp))
      .catch(error => res.status(400).send(error));
  }
};
Personally, I would stick with what you have and write
import userController from '../server/controllers';
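For completeness, a rough sketch of how router/router.ts could then consume that default export (assuming controllers/index.ts still default-exports { users }); note that Express expects the handler to be passed by reference rather than invoked, so the userController.list() call from the question would also change:
// router/router.ts - sketch only
import * as express from 'express';
import controllers from '../server/controllers';

const router = express.Router();

// pass the handler by reference instead of calling it at registration time
router.get('/user', controllers.users.list);

export default router;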
