I am using Sails.js v0.10. I configured connections.js and models.js, changed the model setting to connection: 'localMongodbServer', and ran npm install sails-mongo. After all this, it shows the error:
var des = Object.keys(dbs[collectionName].schema).length === 0 ?
^
TypeError: Cannot read property 'schema' of undefined
at Object.module.exports.adapter.describe (app1_test/node_modules/sails-mongo/lib/adapter.js:70:48)
If I change collections.js to adapter.js, it shows the error:
[err] In model (model1), invalid connection :: someMongodbServer
[err] Must contain an `adapter` key referencing the adapter to use.
Without seeing code, I can only assume a few things:
You're starting a new Sails.js v0.10 project.
You don't have your configuration set up properly.
If this isn't the case, let me know so I can update the answer appropriately.
I have a boilerplate for v0.10 that has a few things baked into it, so you can see how it's done. See that repo here.
connections.js is the appropriate filename; it was changed in v0.10.
First make sure sails-mongo is installed.
# From your project root, run
npm install sails-mongo --save
Next you need to define your connection and tell Sails which adapter to use for models by default. Here is an example of what connections.js and models.js should look like.
connections.js
module.exports.connections = {
  mongodb: {
    adapter  : 'sails-mongo',
    host     : 'localhost',
    port     : 27017,
    user     : '',
    password : '',
    database : 'yourdevdb'
  }
};
models.js
module.exports.models = {
  // Your app's default connection,
  // i.e. the name of one of your app's connections (see `config/connections.js`).
  // (defaults to localDiskDb)
  connection: 'mongodb'
};
You can also specify your connections in config/local.js to avoid committing sensitive data to your repository. This is how you do it.
You don't need to specify all of the contents, as local.js will override what's defined in connections.js; Sails combines the two.
local.js
module.exports = {
  connections: {
    mongodb: {
      host     : 'localhost',
      port     : 27017,
      user     : '',
      password : '',
      database : 'yourdevdb'
    }
  }
};
You can even define your adapter in a single model, for instances where you need one model to talk to a different database type. You do this by specifying adapter (and its config) in the model itself:
module.exports = {
  adapter: 'myothermongodb',
  config: {
    user: 'root',
    password: 'thePassword',
    database: 'testdb',
    host: '127.0.0.1'
  }
};
If you are working with v0.10, you need to install sails-mongo from the v0.10 branch on GitHub, because the Waterline adapter API changed in v0.10. In your package.json put
"sails-mongo": "https://github.com/balderdashy/sails-mongo/archive/v0.10.tar.gz"
then run npm install.
In config/connections.js you should have the MongoDB adapter described, and in your config/models.js this adapter must be referenced.
That's it; sails lift should work after that.
I am trying to dockerize a Strapi app with a MongoDB Atlas database. The issue I am facing is that the database file in /config is not reading the variables from the .env file.
.env file
HOST=0.0.0.0
PORT=1337
DATABASE_HOST=xyz.mongodb.net
DATABASE_USERNAME=abc-admin
DATABASE_PASSWORD=12345xyz
ADMIN_JWT_SECRET=abcd1234
Database connection code
const {
DATABASE_HOST,
DATABASE_USERNAME,
DATABASE_PASSWORD
} = process.env;
module.exports = ({ env }) => ({
  defaultConnection: 'default',
  connections: {
    default: {
      connector: 'mongoose',
      settings: {
        host: env('DATABASE_HOST', process.env.DATABASE_HOST),
        srv: env.bool('DATABASE_SRV', true),
        port: env.int('DATABASE_PORT', 27017),
        database: env('DATABASE_NAME', 'xyz-dev'),
        username: env('DATABASE_USERNAME', process.env.DATABASE_USERNAME),
        password: env('DATABASE_PASSWORD', process.env.DATABASE_PASSWORD)
      },
      options: {
        authenticationDatabase: env('AUTHENTICATION_DATABASE', null),
        ssl: env.bool('DATABASE_SSL', true)
      }
    }
  }
});
I have tried it with process.env and without it in the above file, but when I run the image after building it, it shows the error below:
error Error connecting to the Mongo database. URI does not have
hostname, domain name and tld
Any idea what I am doing wrong here? Thanks
One option is to use dotenv. You need to import dotenv and run dotenv.config() before you can start using env variables, so change your code to:
import dotenv from "dotenv";
dotenv.config();
// your code that uses process.env
The other option is to define all those env variables at the OS level. On Unix you can add them to your ~/.bashrc file.
Here's a more elaborate answer to your question (after reading your comments). Creating a .env file means just that: you created a file. It doesn't get loaded automatically. It's a typical convention on Unix machines, but it has no relation to Node whatsoever.
What you need to do is somehow parse the content of that file (which is plain text), convert it to key-value pairs, and pass it to Node. There are many packages for this; the one Amit showed is dotenv. It does all the work for you, and at the end your variables are injected into process.env.
The simplest way would be to install this package (from npm) and use it as described. But if you cannot modify the code in any way, you can simply parse the content of the file with a script and then start the node server. Here's an example (taken from npm scripts: read .env file):
"scripts": {
  "example": "some-lib --argument --domain $(grep DOMAIN .env | cut -d '=' -f2)"
}
The drawbacks are that this doesn't work across operating systems, and that a dedicated library is far better tested than a manual script.
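To make that tradeoff concrete, here is a minimal sketch of the kind of parsing dotenv does for you. This is illustrative only; the real library also handles quoting, multiline values, and many edge cases:

```javascript
// Minimal sketch of .env parsing: turn KEY=VALUE lines into an object,
// then inject the pairs into process.env (illustration, not a dotenv replacement).
function parseEnv(text) {
  const out = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // skip blanks and comments
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue;                           // skip malformed lines
    out[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return out;
}

// Example content, inlined here so the snippet is self-contained:
const sample = 'HOST=0.0.0.0\nPORT=1337\n# a comment\nDATABASE_HOST=xyz.mongodb.net';
const parsed = parseEnv(sample);
for (const [key, value] of Object.entries(parsed)) {
  if (!(key in process.env)) process.env[key] = value; // don't clobber real env vars
}
```

This is roughly what dotenv.config() does against the .env file in your working directory, which is why the import must run before any code that reads process.env.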
This is background on the same error.
The difference in this situation is that the pg client is constructed via db-migrate. In the process of upgrading to Node 14, in order to resolve compiler warnings, npm packages were updated as follows:
"db-migrate": "^0.11.12",
"db-migrate-pg": "^1.2.2",
"pg": "^8.5.1",
which caused deployment to heroku to now fail with this error:
psql: FATAL: no pg_hba.conf entry for host "...", user "...", database "...", SSL off
From the background, the postgres client (pg) must be changed to establish its connection using ssl = { rejectUnauthorized: false }. However, the client is created not directly but via db-migrate.
We are not using the documented command-line form of db-migrate, but instead have a wrapper migration script. This script does not invoke the command-line db-migrate but instead calls DBMigrate.getInstance(true, options);, where those options are programmatically generated per-environment. Thus we circumvent the documented file-based configuration of db-migrate.
Per the db-migrate docs, DATABASE_URL was intended for use with heroku but that assumes a command-line invocation of db-migrate will honor the environment variable and then ignore (non .rc-) file-based configuration. Since we have the wrapper script, we instead depend on DATABASE_URL via another documented db-migrate mechanism:
Note that if the settings for an environment are represented by a single string that string will be parsed as a database URL.
Thus we use this programmatic configuration generated in our wrapper script:
const options = {
  env: "your_env_name_here",
  config: {
    your_env_name_here: process.env.DATABASE_URL
  },
};
const dbmigrate = DBMigrate.getInstance(true, options);
How do I specify the required ssl configuration through db-migrate using DATABASE_URL?
It turns out that letting db-migrate parse our Heroku-provided DATABASE_URL was itself the problem: Heroku's infrastructure requires ssl = { rejectUnauthorized: false }, and that setting cannot be expressed in the URL alone. The pg-connection-string library will parse the URL for you, and the code snippet below translates the parsed URL into db-migrate's long-form configuration, adding the necessary Heroku ssl setting.
const parse = require('pg-connection-string').parse;
const r = parse(process.env.DATABASE_URL);

const options = {
  env: "your_env_name_here",
  config: {
    your_env_name_here: {
      driver: "pg",
      user: r.user,
      password: r.password,
      host: r.host,
      database: r.database,
      port: r.port,
      ssl: { rejectUnauthorized: false },
    },
  },
};
const dbmigrate = DBMigrate.getInstance(true, options);
This re-enabled our Heroku deploy on Node 14 with the updated npm libraries for postgres and db-migrate.
I am using CircleCI to run some tests involving the database. My server uses Node and Sequelize as the ORM. Whenever I run my tests, I get the following error:
SequelizeConnectionError: no pg_hba.conf entry for host X, user X, database X, SSL off
How do I fix this error?
I found the solution. In my model index file, where I create the Sequelize instance, I needed to add the following options:
const sequelize = new Sequelize(
  database_url,
  {
    ssl: true,
    dialectOptions: {
      ssl: true,
    },
  },
);
This fixed it. The key is to set both ssl options; the problem was that Sequelize was not using SSL.
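For managed hosts that use self-signed certificates (Heroku is the common case, as in the pg_hba answers above), ssl: true alone may still fail. This is a hedged sketch of an expanded options object only, not a drop-in for any particular codebase:

```javascript
// Sketch of Sequelize options for providers with self-signed certificates.
// rejectUnauthorized: false tells the TLS layer to accept that cert chain.
const herokuStyleOptions = {
  dialect: 'postgres',
  ssl: true,
  dialectOptions: {
    ssl: {
      require: true,           // refuse to connect without TLS
      rejectUnauthorized: false // accept the provider's self-signed chain
    }
  }
};

// Usage (assuming DATABASE_URL is set in the environment):
// const sequelize = new Sequelize(process.env.DATABASE_URL, herokuStyleOptions);
```

Whether the plain ssl: true form or this expanded object is needed depends on how strictly the host validates certificates.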
I am trying to run my first migration, which creates a single table in a Heroku Postgres database.
When I run knex migrate:latest --env development I receive the error
Error: Unable to acquire a connection
Things I've tried:
adding ?ssl=true to the end of my connection string stored in process.env.LISTINGS_DB_URL as I'm aware this is sometimes a requirement to connect with heroku
setting the env variable PGSSLMODE=require
I also stumbled across this article, where someone commented that knex will not accept keys based on environment. However, I'm attempting to follow along with this tutorial, which indicates that it does, and I've seen numerous other references that reinforce that.
I'll also add that I've been able to connect to the database from my application and from external clients. I'm only encountering this error when trying to run the knex migration.
Furthermore, I've tried identifying how I can check what is being sent as the connection string. While looking at the knex documentation:
How do I debug FAQ
If you pass {debug: true} as one of the options in your initialize
settings, you can see all of the query calls being made.
Can someone help guide me in how I actually do this? Or have I already successfully done that in my knexfile.js?
Relevant files:
// knex.js:
var environment = process.env.NODE_ENV || 'development';
var config = require('../knexfile.js')[environment];
module.exports = require('knex')(config);
// knexfile.js:
module.exports = {
  development: {
    client: 'pg',
    connection: process.env.LISTINGS_DB_URL,
    migrations: {
      directory: __dirname + '/db/migrations'
    },
    seeds: {
      directory: __dirname + '/db/seeds'
    },
    debug: true
  },
  staging: {
    client: 'postgresql',
    connection: {
      database: 'my_db',
      user: 'username',
      password: 'password'
    },
    pool: {
      min: 2,
      max: 10
    },
    migrations: {
      tableName: 'knex_migrations'
    }
  },
  production: {
    client: 'postgresql',
    connection: {
      database: 'my_db',
      user: 'username',
      password: 'password'
    },
    pool: {
      min: 2,
      max: 10
    },
    migrations: {
      tableName: 'knex_migrations'
    }
  }
};
As noted by @hhoburg in comments below, the error Error: Unable to acquire a connection is a generic message indicating something is incorrect with the Knex client configuration. See here.
In my case, Knex wasn't referencing process.env.LISTINGS_DB_URL in knexfile.js because:
that variable was set in my .env file
the dotenv module wasn't being referenced/called by Knex
The correct way of setting this up is detailed in the knex issue tracker here.
Step 1:
First install dotenv:
npm i dotenv --save
Create a .env file in the root of your project, add:
DATABASE_URL=postgres://...
Step 2:
In the beginning of your knexfile.js, add:
require('dotenv').config();
Change the postgres connection to something like:
{
  client: 'postgresql',
  connection: process.env.DATABASE_URL,
  pool: {
    min: 0,
    max: 15
  },
  migrations: {
    directory: ...
  },
  seeds: {
    directory: ...
  }
}
I'm not sure if this will help, but I ran into the same issue today in my local environment. After much searching, I found that this is the new error message for an invalid connection configuration or a missing connection pool. After fiddling with it for too long, I switched my connection to use my .env file for the configuration environment; I had been using a hard-coded string ('dev') in my knex.js file, which didn't work for some reason.
Is your .env file working properly? Have you tried adjusting the pool settings, and are you positive your username and password are correct for the staging and production databases?
I hope that link helps!
If you are getting this error in Node.js, try removing this line:
myDb.destroy().then();
I received this same error in the same situation. It turns out I had forgotten to provision a database before migrating, so there was nothing to connect to.
To fix this error,
Before running:
heroku run knex migrate:latest
I ran this command:
heroku addons:create heroku-postgresql
and that worked nicely.
I got this error when trying to update data in the database before running the corresponding migration.
I am using sails-cbes with Sails and Couchbase. When I try to lift Sails, I get the error below:
error: A hook (orm) failed to load!
error: Error: Failed to connect to the Couchbase/ElasticSearch clients { [Error: failed to connect to bucket] code: 25 }
This is my connections.js file:
// config/connections.js
module.exports.connections = {
  cb: {
    adapter: 'sails-cbes',
    host: '127.0.0.1',
    port: 8091,
    user: 'Administrator',
    pass: 'word2pass',
    operationTimeout: 60 * 1000, // 60s
    bucket: {
      name: 'default',
    }
  }
};
My best guess is that the bucket does not exist on the Couchbase cluster when you start the application, as the current implementation does not create it at startup.
You have to create the bucket manually, matching the configuration, before starting the application.
Also, I don't see any mention of Elasticsearch, and it is a necessary component for this setup: the querying functionality is implemented on top of it. I didn't test this, but the app probably won't even run without it, failing in a similar fashion.