I'm following the instructions here to use Ghost as an NPM module, and attempting to set up Ghost for production.
I'm running the Google Cloud SQL proxy locally. When I run NODE_ENV=production knex-migrator init --mgpath node_modules/ghost I get this error message:
NAME: RollbackError
CODE: ER_ACCESS_DENIED_ERROR
MESSAGE: ER_ACCESS_DENIED_ERROR: Access denied for user 'root'@'cloudsqlproxy~[SOME_IP_ADDRESS]' (using password: NO)
Running knex-migrator init --mgpath node_modules/ghost on its own works just fine, and I can launch the app locally with no problems. It's only when I try to set up the app for production that I run into problems.
EDIT: I can connect to the DB via MySQL Workbench using the same credentials I'm using in the config below.
Here's my config.production.json (with private data removed):
{
"production": {
"url": "https://MY_PROJECT_ID.appspot.com",
"fileStorage": false,
"mail": {},
"database": {
"client": "mysql",
"connection": {
"socketPath": "/cloudsql/MY_INSTANCE_CONNECTION_NAME",
"user": "USER",
"password": "PASSWORD",
"database": "DATABASE_NAME",
"charset": "utf8"
},
"debug": false
},
"server": {
"host": "0.0.0.0",
"port": "2368"
},
"paths": {
"contentPath": "content/"
}
}
}
And app.yaml:
runtime: nodejs
env: flex
manual_scaling:
instances: 1
env_variables:
MYSQL_USER: ******
MYSQL_PASSWORD: ******
MYSQL_DATABASE: ******
# e.g. my-awesome-project:us-central1:my-cloud-sql-instance-name
INSTANCE_CONNECTION_NAME: ******
beta_settings:
# The connection name of your instance on its Overview page in the Google
# Cloud Platform Console, or use `YOUR_PROJECT_ID:YOUR_REGION:YOUR_INSTANCE_NAME`
cloud_sql_instances: ******
# Setting to keep gcloud from uploading files not required for deployment
skip_files:
- ^(.*/)?#.*#$
- ^(.*/)?.*~$
- ^(.*/)?.*\.py[co]$
- ^(.*/)?.*/RCS/.*$
- ^(.*/)?\..*$
- ^(.*/)?.*\.ts$
- ^(.*/)?config\.development\.json$
The file ghost.prod.config.js isn't something Ghost recognises - I'm not sure where that file name came from, but Ghost < 1.0 used config.js with all environments in one file, and Ghost >= 1.0 uses config.<env>.json with each environment in its own file.
Your config.production.json file doesn't contain your MySQL connection info, and therefore the knex-migrator tool is not able to connect to your DB.
If you merge the contents of ghost.prod.config.js into config.production.json, this should work fine.
Your config.production.json should look something like this:
{
"url": "https://something.appspot.com",
"database": {
"client": "mysql",
"connection": {
"socketPath": "path",
"user": "user",
"password": "password",
"database": "dbname",
"charset": "utf8"
}
}
}
The caveat here is that the new JSON format cannot contain code or logic, only explicit values, e.g. process.env.PORT || "2368" is no longer permitted.
Instead, you'll need to use either arguments or environment variables to provide dynamic configuration. Documentation for how to use environment variables is here: https://docs.ghost.org/docs/config#section-running-ghost-with-config-env-variables
E.g. NODE_ENV=production port=[your port] database__connection__user=[your user] ...etc... knex-migrator init --mgpath node_modules/ghost
You'd need to add an environment variable for every dynamic variable in the config.
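To illustrate how the double-underscore names map onto nested config keys, here's a simplified sketch (Ghost uses nconf for this internally; this is not Ghost's actual implementation):

```javascript
// Sketch: map Ghost-style double-underscore env var names onto nested
// config keys, e.g. database__connection__user -> database.connection.user.
function envToConfig(env) {
  const config = {};
  for (const [key, value] of Object.entries(env)) {
    const parts = key.split('__');
    if (parts.length < 2) continue; // only double-underscore names are nested config
    let node = config;
    for (const part of parts.slice(0, -1)) {
      node = node[part] = node[part] || {};
    }
    node[parts[parts.length - 1]] = value;
  }
  return config;
}

const conf = envToConfig({
  database__client: 'mysql',
  database__connection__user: 'ghost',
  NODE_ENV: 'production', // ignored: no double underscore
});
```

So database__connection__user on the command line ends up in the same place as the "user" key nested under "connection" in config.production.json.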
I figured out the problem.
My config file shouldn't have a top-level "production" property. It should look like this:
{
"url": "https://MY_PROJECT_ID.appspot.com",
"fileStorage": false,
"mail": {},
"database": {
"client": "mysql",
"connection": {
"socketPath": "/cloudsql/MY_INSTANCE_CONNECTION_NAME",
"user": "USER",
"password": "PASSWORD",
"database": "DATABASE_NAME",
"charset": "utf8"
},
"debug": false
},
"server": {
"host": "0.0.0.0",
"port": "8080"
},
"paths": {
"contentPath": "content/"
}
}
It now overrides the default config. The only remaining issue is that you can't use knex-migrator with the "socketPath" property set, even though it's needed to run the app in the cloud.
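One possible workaround (a sketch, assuming the Cloud SQL proxy is listening locally on 127.0.0.1:3306) is to swap "socketPath" for a TCP "host"/"port" pair while running migrations, then restore "socketPath" before deploying:

```json
"database": {
  "client": "mysql",
  "connection": {
    "host": "127.0.0.1",
    "port": 3306,
    "user": "USER",
    "password": "PASSWORD",
    "database": "DATABASE_NAME",
    "charset": "utf8"
  }
}
```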
Related
Well, when I enter heroku bash and try to run npx typeorm migration:run, it just throws an error:
What's weird is that locally it works when the DATABASE is on localhost like this in .env file:
DATABASE_URL=postgres://postgres:docker@localhost:5432/gittin
This is my ormconfig.js:
module.exports = {
"type": "postgres",
"url": process.env.DATABASE_URL,
"entities": ["dist/entities/*.js"],
"cli": {
"migrationsDir": "src/database/migrations",
"entitiesDir": "src/entities"
}
}
Yes, I added the heroku postgres addon to the app.
PS: If needed, this is the repo of the project: https://github.com/joaocasarin/gittin
As I was discussing with Carlo in the comments, I had to add the ssl property in ormconfig.js, but not simply set it to true in production. According to this, I had to use { rejectUnauthorized: false } in production mode, and false otherwise.
So the ormconfig.js is like this right now:
module.exports = {
"type": "postgres",
"ssl": process.env.NODE_ENV === 'production' ? { rejectUnauthorized: false } : false,
"url": process.env.DATABASE_URL,
"entities": ["dist/entities/*.js"],
"cli": {
"migrationsDir": "src/database/migrations",
"entitiesDir": "src/entities"
}
}
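The ternary just switches the ssl option on NODE_ENV; as a quick self-contained sketch of that logic:

```javascript
// Sketch: compute the ssl option the same way the ormconfig above does.
// In production, keep SSL on but skip certificate verification (Heroku's
// postgres uses a self-signed cert); otherwise disable SSL entirely.
function sslFor(nodeEnv) {
  return nodeEnv === 'production' ? { rejectUnauthorized: false } : false;
}
```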
i have this ormconfig.json:
{
"type": "postgres",
"host": "db-pg",
"port": 5432,
"username": "spirit",
"password": "api",
"database": "emasa_ci",
"synchronize": true,
"logging": false,
"entities": ["dist/src/entity/**/*.js"],
"migrations": ["dist/src/migration/**/*.js"],
"subscribers": ["dist/src/subscriber/**/*.js"],
"cli": {
"entitiesDir": "dist/src/entity",
"migrationsDir": "dist/src/migration",
"subscribersDir": "dist/src/subscriber"
}
}
and have this env:
SERVER_PORT=4000
DB_HOST=db-pg
DB_PORT=5432
DB_USER=spirit
DB_PASS=api
DB_NAME=emasa_ci
but .env values can't be referenced from a .json file, so I don't know how to use my environment variables in my ORM config.
There is good documentation.
If you want to dig into the source code, there is a class ConnectionOptionsReader, which looks for an ormconfig file (with extension env, js, cjs, ts, json, yml, yaml, or xml) or for a .env file. See the load function for more details.
1. The easiest way is to add a line in your .env file, like this:
TYPEORM_URL=postgres://user:pass@host:port/dbname
Or use this sample. TypeORM will parse the .env file using dotenv.
Here you can find all available env variables.
2. If you read your .env file before TypeORM initialization, you can already use your env variables - for example in a JavaScript file instead of ormconfig.json. Just export an object like this from ormconfig.js:
module.exports = {
"type": "postgres",
"host": process.env.DB_HOST,
"port": process.env.DB_PORT,
"username": process.env.DB_USER,
"password": process.env.DB_PASS,
"database": process.env.DB_NAME,
"synchronize": true,
"logging": false,
"entities": ["dist/src/entity/**/*.js"],
"migrations": ["dist/src/migration/**/*.js"],
"subscribers": ["dist/src/subscriber/**/*.js"],
"cli": {
"entitiesDir": "dist/src/entity",
"migrationsDir": "dist/src/migration",
"subscribersDir": "dist/src/subscriber"
}
};
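If you go the ormconfig.js route, keep in mind that every process.env value is a string. Here's a small sketch (the variable names are taken from the .env above) that coerces the port to a number before handing the options to TypeORM:

```javascript
// Sketch: build the TypeORM options from environment variables, coercing
// the port to a number (process.env values are always strings).
function buildOrmConfig(env) {
  return {
    type: 'postgres',
    host: env.DB_HOST || 'localhost',
    port: parseInt(env.DB_PORT || '5432', 10),
    username: env.DB_USER,
    password: env.DB_PASS,
    database: env.DB_NAME,
  };
}

// Example with the values from the question's .env file:
const options = buildOrmConfig({
  DB_HOST: 'db-pg',
  DB_PORT: '5432',
  DB_USER: 'spirit',
  DB_PASS: 'api',
  DB_NAME: 'emasa_ci',
});
```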
Another example
Since ormconfig is deprecated, I suggest another approach using TypeORM DataSource.
I'm using Heroku to deploy my server, so I created an environment file, named .development.env, containing the same variable as the one defined on my Heroku dyno:
DATABASE_URL="postgres://user:password@localhost:5432/main"
Note that you can place this file anywhere in your project tree.
Then I created a datasource file:
import dotenv from 'dotenv';
import { DataSource } from 'typeorm';
// Load env file
dotenv.config({ path: '../api/.development.env' });
// DataSource definition
const AppDataSource = new DataSource({
type: 'postgres',
url: process.env.DATABASE_URL,
logging: true,
entities: ['../api/dist/**/*.entity.js'],
migrations: ['./migrations/*.js'],
subscribers: [],
});
export default AppDataSource;
This way, you can store in your environment your database connection settings.
You could hide ormconfig.json and put your secrets directly in there, or vice versa, load TypeORM's configuration from your .env file. Is there a precise reason why you need them to be separated? If so, we can work out a solution.
I am using LoopBack to develop an API, and I'm having a problem with per-environment config in LB3. I have a datasources.json file as below:
{
"SSOSystem": {
"host": "x.y.z.o",
"port": 27017,
"url": "mongodb://SSO-System-mongo:U5KckMwrWs9EGyAh@x.y.z.o/SSO-System",
"database": "SSO-System",
"password": "U5KckMwrWs9EGyAh",
"name": "SSOSystem",
"connector": "mongodb",
"user": "SSO-System-mongo"
}
}
and datasources.local.js as below
module.exports = {
SSOSytem: {
connector: 'mongodb',
hostname: process.env.SSO_DB_HOST || 'localhost',
port: process.env.SSO_DB_PORT || 27017,
user: process.env.SSO_DB_USERNAME,
password: process.env.SSO_DB_PASSWORD,
database: process.env.SSO_DB_NAME,
url: `mongodb://${process.env.SSO_DB_USERNAME}:${process.env.SSO_DB_PASSWORD}@${process.env.SSO_DB_HOST}/${process.env.SSO_DB_NAME}`
}
}
but when I run my app with the local env
NODE_ENV=local node .
LoopBack only loads datasources from the datasources.json file. Did I do something wrong in the datasources config? Does anyone have the same problem as me?
Many thanks,
Sorry, this was my typo, not a problem with LoopBack: the datasource is named SSOSystem in datasources.json but SSOSytem in datasources.local.js.
Background
I am creating a boilerplate express application. I have configured a database connection using pg and sequelize. When I add the CLI and try to run sequelize db:migrate, I get this error:
ERROR: The dialect [object Object] is not supported. Supported
dialects: mssql, mysql, postgres, and sqlite.
Replicate
Generate a new express application. Install pg, pg-hstore, sequelize and sequelize-cli.
Run sequelize init.
Add a config.js file to the /config path that was created from sequelize init.
Create the connection in the config.js file.
Update the config.json file created by sequelize-cli.
Run sequelize db:migrate
Example
/config/config.js
const Sequelize = require('sequelize');
const { username, host, database, password, port } = require('../secrets/db');
const sequelize = new Sequelize(database, username, password, {
host,
port,
dialect: 'postgres',
operatorsAliases: false,
pool: {
max: 5,
min: 0,
acquire: 30000,
idle: 10000
}
});
module.exports = sequelize;
/config/config.json
{
"development": {
"username": "user",
"password": "pass",
"database": "db",
"host": "host",
"dialect": "postgres"
},
"test": {
"username": "user",
"password": "pass",
"database": "db",
"host": "host",
"dialect": "postgres"
},
"production": {
"username": "user",
"password": "pass",
"database": "db",
"host": "host",
"dialect": "postgres"
}
}
Problem
I expect the initial migrations to run, but instead I get an error:
ERROR: The dialect [object Object] is not supported. Supported
dialects: mssql, mysql, postgres, and sqlite.
Versions
Dialect: postgres
Dialect version: "pg":7.4.3
Sequelize version: 4.38.0
Sequelize-Cli version: 4.0.0
Package Json
"pg": "^7.4.3",
"pg-hstore": "^2.3.2",
"sequelize": "^4.38.0"
Installed globally
npm install -g sequelize-cli
Question
Now that the major rewrite has been released for sequelize, what is the proper way to add the dialect so the migrations will run?
It is important to note that my connection is working fine. I can query the database without problems, only sequelize-cli will not work when running migrations.
I ran into the same problem. There are a few things you need to change. First, note that the second config file shown is config.json, not a second config.js. The reason you run into this problem is that
const sequelize = new Sequelize(database, username, password, {
host,
port,
dialect: 'postgres',
operatorsAliases: false,
pool: {
max: 5,
min: 0,
acquire: 30000,
idle: 10000
}
});
these lines of code are used by the node server to access the DB, not by sequelize-cli to run migrations. You need to follow the sequelize-cli instructions exactly. Here is the link: instruction
My code:
config/db.js
const { sequelize_cli } = require('../config.json');
module.exports = sequelize_cli;
config.json
{
  "sequelize_cli": {
    "development": {
      "username": "root",
      "password": "password",
      "database": "monitor",
      "host": "127.0.0.1",
      "dialect": "postgres"
    },
    "test": {
      "username": "root",
      "password": "password",
      "database": "monitor",
      "host": "127.0.0.1",
      "dialect": "postgres"
    },
    "production": {
      "username": "root",
      "password": "password",
      "database": "monitor",
      "host": "127.0.0.1",
      "dialect": "postgres"
    }
  }
}
The main point, I guess, is to export the JSON object directly instead of exporting a Sequelize object. In addition, this seems to be a problem only with postgres - I tested your code with mysql and it works perfectly.
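To make the distinction concrete, here's a minimal sketch of the two shapes (values are placeholders): sequelize-cli wants a plain per-environment settings object, while the app itself constructs a Sequelize instance from those values:

```javascript
// Sketch: what sequelize-cli expects from its config file - plain
// per-environment settings, nothing constructed.
const cliConfig = {
  development: {
    username: 'user',
    password: 'pass',
    database: 'db',
    host: '127.0.0.1',
    dialect: 'postgres',
  },
};

// The app, by contrast, uses `new Sequelize(...)` at runtime to build an
// instance from these same values. Exporting *that instance* to the CLI is
// what produces "The dialect [object Object] is not supported".
// (The constructor call is omitted here so the sketch stays self-contained.)

module.exports = cliConfig;
```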
I am trying to deploy a node app on AWS.
When I launch the app locally it works perfectly fine because my app can access postgres.
However, when I upload my server, it can't connect to the database.
My app uses loopback.io.
Here is the server/config.json :
{
"restApiRoot": "/api",
"host": "0.0.0.0",
"port": 3000,
"remoting": {
"context": false,
"rest": {
"handleErrors": false,
"normalizeHttpPath": false,
"xml": false
},
"json": {
"strict": false,
"limit": "100kb"
},
"urlencoded": {
"extended": true,
"limit": "100kb"
},
"cors": false
},
"legacyExplorer": false,
"logoutSessionsOnSensitiveChanges": true
}
And here is /server/datasources.json
{
"db": {
"name": "db",
"connector": "memory"
},
"postgres": {
"host": "localhost",
"port": 5432,
"url": "",
"database": "postgres",
"password": "postgresseason",
"name": "postgres",
"user": "postgres",
"connector": "postgresql"
}
}
I have done research and I think I have to change a URL so it doesn't look for a "local" path, but I haven't managed to make it work.
I tried using the url postgres://postgres:postgresseason@db:5432/postgres without success.
The error I am getting are either :
Web server listening at: http://0.0.0.0:8080
Browse your REST API at http://0.0.0.0:8080/explorer
Connection fails: Error: getaddrinfo ENOTFOUND db db:5432
It will be retried for the next request.
Or :
Web server listening at: http://0.0.0.0:3000
Browse your REST API at http://0.0.0.0:3000/explorer
Connection fails: Error: connect ECONNREFUSED 127.0.0.1:5432
It will be retried for the next request.
Any help how to make it work?
Thanks
You need to make sure the postgres server is installed and reachable from AWS.
By default it cannot reach your locally installed postgres (without complicated port forwarding, etc.).
If you are using EC2 you can install a postgres server locally and use localhost.
Or set up postgres in another AWS service like this one: https://aws.amazon.com/rds/postgresql/
Just make sure the nodejs server / service has the required permissions to reach and query postgres.
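For example, if you move the database to RDS, the postgres entry in server/datasources.json would point at the instance endpoint instead of localhost (the hostname below is a placeholder):

```json
"postgres": {
  "host": "mydb.xxxxxxxxxxxx.eu-west-1.rds.amazonaws.com",
  "port": 5432,
  "database": "postgres",
  "password": "postgresseason",
  "name": "postgres",
  "user": "postgres",
  "connector": "postgresql"
}
```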