Error: a hook ('orm') failed to load when lifting a Sails app using MongoDB - Node.js

I get this error: a hook ('orm') failed to load when trying to lift a Sails app using MongoDB. This is my connections.js file:
module.exports.connections = {
  mongodb: {
    adapter: 'sails-mongo',
    host: 'localhost',
    port: 27017,
    database: 'mydb1'
  }
};
And this is my models.js file:
module.exports.models = {
  connection: 'mongodb'
};
And this is my local.js file:
module.exports = {
  connections: {
    mongodb: {
      host: 'localhost',
      port: 27017,
      database: 'mydb1'
    }
  }
};
Sails v0.10.1
Any idea why this could be happening?
Thanks

All you need to do is start MongoDB.
Navigate to: path_to_your_mongodb_installation_folder\MongoDB\bin\mongod.exe
Example: C:\Program Files\MongoDB 2.6 Standard\bin
Let us know if this helps!
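If you want to confirm that MongoDB is actually reachable before lifting Sails, a quick standalone check with the Node MongoDB driver can help. This is a minimal sketch, not part of the original answer; it assumes npm install mongodb with a 2.x-era driver (whose connect callback receives the db handle) and reuses the mydb1 database from the question:
// check-mongo.js - connectivity sketch (assumes `npm install mongodb`, 2.x driver)
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/mydb1', function (err, db) {
  if (err) {
    console.error('MongoDB is not reachable:', err.message);
    process.exit(1);
  }
  console.log('MongoDB is up, so the orm hook should be able to load');
  db.close();
});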

Adding
models: {
  connection: 'your_mongodb_connection_name'
}
in both config/env/development.js and config/env/production.js fixed the problem for me.
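For the setup in the question (Sails 0.10.x with a connection named mongodb), the environment file would look something like the sketch below; the file path and the connection name come from the question and answer above, the rest is an assumption about your project layout:
// config/env/development.js (repeat the same block in config/env/production.js)
module.exports = {
  models: {
    // point the ORM at the connection defined in config/connections.js
    connection: 'mongodb'
  }
};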

For those for whom it didn't work on Heroku after deploying, try changing the Node.js, Sails and sails-mongo versions and check again.
It worked for me after changing the versions to the following:
1. Node.js - 14.15.1
2. Sails - ^1.5.2
3. sails-mongo - ^2.0.0
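In package.json terms that combination would look roughly like this (a sketch; only the version numbers come from the answer above, and Heroku reads the engines field to pick the Node.js version):
{
  "engines": {
    "node": "14.15.1"
  },
  "dependencies": {
    "sails": "^1.5.2",
    "sails-mongo": "^2.0.0"
  }
}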

Related

Connect a PostgreSQL database on a remote Ubuntu server (Docker Compose) to a Node.js Express backend

I have a small personal project: a Flutter application with a Node.js Express backend and a PostgreSQL database.
At the moment the database is hosted locally on my PC, but since I have an Ubuntu server I would like the database to live there.
So I created a Docker container with my PostgreSQL database in it.
However, I'm a bit stuck now: I don't know how to create a database instance on my remote server and make it communicate with my application...
Here is my ormconfig.ts file; I suppose this is where I have to make changes...
import { join } from "path";
import { ConnectionOptions } from "typeorm";
import { PostEntity } from "./database/entity/post.entity";
import { UserEntity } from "./database/entity/user.entity";

const connectionOptions: ConnectionOptions = {
  type: "postgres",
  host: "localhost",
  port: 5432,
  username: "postgres",
  password: "pg",
  database: "test",
  entities: [UserEntity, PostEntity],
  synchronize: true,
  dropSchema: false,
  migrationsRun: true,
  logging: false,
  logger: "debug",
  migrations: [join(__dirname, "src/migration/**/*.ts")],
};

export = connectionOptions;
Thanks a lot!
Unsure of your network setup with your Ubuntu server, but realistically it should be something like:
import { join } from "path";
import { ConnectionOptions } from "typeorm";
import { PostEntity } from "./database/entity/post.entity";
import { UserEntity } from "./database/entity/user.entity";

const connectionOptions: ConnectionOptions = {
  type: "postgres",
  host: UBUNTU_SERVER_ADDRESS,
  port: POSTGRES_DOCKER_PORT,
  username: "postgres",
  password: "pg",
  database: "test",
  entities: [UserEntity, PostEntity],
  synchronize: true,
  dropSchema: false,
  migrationsRun: true,
  logging: false,
  logger: "debug",
  migrations: [join(__dirname, "src/migration/**/*.ts")],
};

export = connectionOptions;
You'll need to make sure that the Postgres Docker instance has opened ports to connect to. E.g.:
docker run -d -p 5432:5432 ...other-args postgres:latest
Make sure your Ubuntu server has correctly configured firewall and network settings to allow remote access on port 5432.
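Before wiring TypeORM up, it can also help to confirm the container is reachable from the backend at all. Below is a minimal sketch using the pg driver; the credentials are the ones from the question, and UBUNTU_SERVER_ADDRESS is a placeholder for your server's address:
// check-postgres.js - connectivity sketch (assumes `npm install pg`)
const { Client } = require("pg");

const client = new Client({
  host: "UBUNTU_SERVER_ADDRESS", // replace with the server's IP or hostname
  port: 5432,                    // the host port published with `docker run -p 5432:5432`
  user: "postgres",
  password: "pg",
  database: "test",
});

client
  .connect()
  .then(() => client.query("SELECT 1"))
  .then(() => console.log("PostgreSQL is reachable"))
  .catch((err) => console.error("Connection failed:", err.message))
  .then(() => client.end());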

Cannot connect to DeepStream Node.js server if using any custom plugins

So, if I start my DeepStream server like this:
const { Deepstream } = require('@deepstream/server')
const server = new Deepstream()
server.start()
it works just fine and I can connect to it from my frontend app like this:
const { DeepstreamClient } = require('@deepstream/client')
const client = new DeepstreamClient('192.168.88.238:6020')
client.login()
but if I add a MongoDB or RethinkDB storage instance (NPM - RethinkDB):
const { Deepstream } = require('@deepstream/server')
const server = new Deepstream({
  storage: {
    name: 'rethinkdb',
    options: {
      host: 'localhost',
      port: 28015
    }
  }
})
// start the server
server.start()
I get the following error message when trying to reach my ds server.
(I've also tried to connect via WSS:// instead of WS://)
So hi everybody who has the same problem as me...
I figured it out!
First of all, what the npm package's documentation says about using the MongoDB driver is completely out of date.
This is how they say you should use the npm package:
var Deepstream = require( 'deepstream.io' ),
    MongoDBStorageConnector = require( 'deepstream.io-storage-mongodb' ),
    server = new Deepstream();
server.set( 'storage', new MongoDBStorageConnector( {
  connectionString: 'mongodb://test:test@paulo.mongohq.com:10087/munchkin-dev',
  splitChar: '/'
}));
server.start();
INSTEAD OF ALL THIS!
You don't really need 'deepstream.io-storage-mongodb' at all, because it's an old module, and you don't need to use it this way...
The correct usage of the MongoDB connector:
const { Deepstream } = require('@deepstream/server');
const server = new Deepstream({
  storage: {
    name: "mongodb",
    options: {
      connectionString: MONGO_CONNECTION_STRING,
      splitChar: '/'
    }
  }
});
server.start();
or you can also create a config.yaml file for all of this:
storage:
  name: mongodb
  options:
    connectionString: 'MONGO_CONNECTION_STRING'
    # optional database name, defaults to `deepstream`
    database: 'DATABASE_NAME'
    # optional table name for records without a splitChar
    # defaults to deepstream_docs
    defaultCollection: 'COLLECTION_NAME'
    # optional character that's used as part of the
    # record names to split it into a table and an id part
    splitChar: '/'
and pass it to the Deepstream constructor as below:
const { Deepstream } = require('@deepstream/server');
const server = new Deepstream('config.yaml');
server.start();
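To confirm the storage plugin is actually being used, a quick client-side test can write a record and read it back. This is a sketch only; the server address is the one from the question, and the record name books/moby-dick is made up for illustration:
// verify-storage.js - assumes the server above is running with the MongoDB connector
const { DeepstreamClient } = require('@deepstream/client');
const client = new DeepstreamClient('192.168.88.238:6020');

async function main() {
  await client.login();

  // with splitChar '/', this record should land in a "books" collection in MongoDB
  const record = client.record.getRecord('books/moby-dick');
  await record.whenReady();
  record.set({ title: 'Moby Dick', author: 'Herman Melville' });

  console.log('record written:', record.get());
}

main().catch(console.error);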

How to set and map environment data in LoopBack?

I have followed the LoopBack documentation to set up deployment and environment-specific configuration:
https://loopback.io/doc/en/lb3/Environment-specific-configuration.html
There is a place where I need to get the data source connector; for that, I configured the data source and access it throughout the application, and it works fine.
module.exports.DataSources = app.dataSources.hmsDs.connector;
But after configuring the application for production deployment as the documentation says, I get this error:
module.exports.DataSources = app.dataSources.applianceDs.connector;
^
TypeError: Cannot read property 'connector' of undefined
datasource.production.js
module.exports = {
  applianceDs: {
    hostname: process.env.DB_HOST,
    port: process.env.DB_PORT || 27017,
    database: 'applianceDB',
    user: process.env.DB_USER || "",
    password: process.env.DB_PASSWORD || "",
    name: "applianceDs",
    connector: 'mongodb',
  }
}
and when I tried to access the data source as follows,
var { applianceDs } = require('../datasource.production');
module.exports.DataSources = applianceDs.connector;
I get this error:
AssertionError [ERR_ASSERTION]: User is referencing a dataSource that does not exist: "applianceDs"
Steps I have followed in the terminal:
$ export PRODUCTION=true
$ export DB_HOST="127.0.0.1"
$ export DB_USER="me"
$ export DB_PASSWORD="0000"
$ nodemon app.js
Which step am I missing? Is there something hidden in LoopBack?
Please help.
Thanks
Some advice which can help:
Configure the NODE_ENV="production" env variable. I'm not sure, but LoopBack uses this value as the config file postfix.
Try to pass the env variables directly to the run command, as they may not be inherited from the console (to check this, print process.env[VARIABLE_NAME]).
To pass env variables to the command:
PRODUCTION=true DB_HOST="127.0.0.1" DB_USER="me" DB_PASSWORD="0000" nodemon app.js
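As a quick sanity check for the second point, you could log the variables at the top of app.js before LoopBack boots (a sketch; the variable names are the ones used in the question):
// at the top of app.js - confirm the process actually sees the variables
console.log('NODE_ENV    =', process.env.NODE_ENV);
console.log('DB_HOST     =', process.env.DB_HOST);
console.log('DB_USER     =', process.env.DB_USER);
console.log('DB_PASSWORD =', process.env.DB_PASSWORD ? '(set)' : '(missing)');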

Mongoose not connecting on Ubuntu 14.04

I've got a node app built on Hapi using MongoDB and mongoose. Locally, I can use the app without issue. I can connect to the db, add data, find it, etc.
I've created an Ubuntu 14.04 x64 droplet on Digital Ocean.
I can ssh into my droplet and verify that my db is there with the correct name. I'm using dokku-alt to deploy and I have linked the db name to the app using dokku's mongodb:link appName mydb
I was having issues once I deployed the app where it would hang and eventually time out. After a lot of debugging and commenting out code, I found that any time I try to hit Mongo like this, the app hangs:
var User = request.server.plugins.db.User;
User
  .findOne({ id: request.auth.credentials.profile.raw.id })
  .exec(function(err, user){
    // do something
  });
Without this, the app loads fine, albeit without data. So my thought is that mongoose is never properly connecting.
I'm using grunt-shell-spawn to run a script which checks if mongo is already running, if not it starts it up. I'm not 100% certain that this is needed on the droplet, but I was having issues locally where mongo was already running... script:
/startMongoIfNotRunning.sh
if pgrep mongod; then
  echo running;
else
  mongod --quiet --dbpath db/;
fi
exit 0;
/Gruntfile.js
shell: {
  make_dir: {
    command: 'mkdir -p db'
  },
  mongodb: {
    command: './startMongoIfNotRunning.sh',
    options: {
      stdin: false,
    }
  }
},
And here's how I'm defining the database location:
/index.js
server.register([
  { register: require('./app/db'), options: { url: process.env.MONGODB_URL || 'mongodb://localhost:27017/mydb' } },
  ....
/app/db/index.js
var mongoose = require('mongoose');
var _ = require('lodash-node');
var models = require('require-all')(__dirname + '/models');

exports.register = function(plugin, options, next) {
  mongoose.connect(options.url, function() {
    next();
  });
  var db = mongoose.connection;
  plugin.expose('connection', db);
  _.forIn(models, function(value, key) {
    plugin.expose(key, value);
  });
};

exports.register.attributes = {
  name: 'db'
};
My app is looking for db files in db/. Could it be that dokku's mongodb:link appName mydb linked it to the wrong location? Perhaps process.env.MONGODB_URL is not being set correctly? I really don't know where to go from here.
It turns out the solution to my problem was adding an entry to the hosts file of my droplet pointing at the MongoDB URL:
127.0.0.1 mongodb.myurl.com
For some reason, linking the db to my app with Dokku didn't add this bit. I would have thought it was automatic. The app container's hosts file did get a mongodb entry when I linked the db to the app.
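If you run into a similar silent hang, logging the URL being used and Mongoose's connection events makes this kind of failure much easier to spot. Here is a sketch that could be dropped into the db plugin above (the event names are standard Mongoose connection events):
// inside exports.register, after calling mongoose.connect(options.url, ...)
console.log('connecting to', options.url);

mongoose.connection.on('connected', function () {
  console.log('mongoose connected');
});
mongoose.connection.on('error', function (err) {
  console.error('mongoose connection error:', err);
});
mongoose.connection.on('disconnected', function () {
  console.log('mongoose disconnected');
});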

Replica Set not working as expected

I have configured it as below, and my MongoDB doesn't need a username or password:
mongo: {
  module: 'sails-mongo',
  url: "mongodb://127.0.0.1:27017/mydb",
  replSet: {
    servers: [
      {
        host: "127.0.0.1",
        port: 27018
      },
      {
        host: "127.0.0.1",
        port: 27019
      }
    ],
    options: { connectWithNoPrimary: true, rs_name: "rs0" }
  }
}
It works fine, meaning I do not get a connection error and I am able to query. But when I bring down 127.0.0.1:27017, 127.0.0.1:27018 becomes PRIMARY (as confirmed by rs.status()). After this, I am no longer able to run any query and keep getting the following:
Error: no open connections
I am sure that I set up the replica set on my local machine correctly, as I used the MongoDB native driver to test the same scenario (bring down PRIMARY, a SECONDARY takes over as PRIMARY) and there was no problem.
var url = 'mongodb://127.0.0.1:27017,127.0.0.1:27018,127.0.0.1:27019/mydb?w=0&wtimeoutMS=5000&replicaSet=sg1&readPreference=secondary';
mongodb.MongoClient.connect(url, function(err, result) {
  if(err || result === undefined || result === null) {
    throw err;
  } else {
    db = result;
  }
});
OK, I found the answer. This message was emitted because of session.js. I commented out everything in that file and now it is working. The reason, I guess, is that session.js only points to a single host, which is the original PRIMARY; when you bring down that MongoDB PRIMARY, session.js can no longer connect, so it throws an exception. I also tried putting all the replica set hosts into the MongoDB URL string in session.js (mongodb://127.0.0.1:27017,127.0.0.1:27018,127.0.0.1:27019/mydb), but then sails lift failed. With only a single host it is fine.
Now, if I need to store session info, I need to start another MongoDB instance and point session.js at that new instance.
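For reference, pointing the session store at a separate single-host MongoDB instance would look roughly like the sketch below in config/session.js for Sails 0.10.x; port 27020 and the database/collection names are made-up examples, and the mongo adapter requires connect-mongo to be installed:
// config/session.js - sketch only, standalone mongod for sessions on port 27020
module.exports.session = {
  secret: 'your-session-secret',
  adapter: 'mongo',          // requires `npm install connect-mongo`
  host: '127.0.0.1',
  port: 27020,               // separate instance, not part of the replica set
  db: 'sails_sessions',
  collection: 'sessions'
};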
