Mongoose not connecting on Ubuntu 14.04 - node.js

I've got a node app built on Hapi using MongoDB and mongoose. Locally, I can use the app without issue. I can connect to the db, add data, find it, etc.
I've created an Ubuntu 14.04 x64 droplet on Digital Ocean.
I can ssh into my droplet and verify that my db is there with the correct name. I'm using dokku-alt to deploy and I have linked the db name to the app using dokku's mongodb:link appName mydb
I was having issues once I deployed the app where it would hang and eventually timeout. After a lot of debugging and commenting out code I found that any time I try to hit mongo like this the app will hang:
var User = request.server.plugins.db.User;
User
  .findOne({ id: request.auth.credentials.profile.raw.id })
  .exec(function(err, user){
    // do something
  });
Without this, the app loads fine, albeit without data. So my thought is that mongoose is never properly connecting.
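One way to test that theory is to listen for Mongoose's connection events and log the URL the app is actually handed; if 'error' fires, or nothing fires at all, the connect call never succeeded. This is a diagnostic sketch rather than part of the original app, and it assumes the same MONGODB_URL fallback used further down:
var mongoose = require('mongoose');

// Log the URL the app is actually trying to reach.
console.log('Connecting to', process.env.MONGODB_URL || 'mongodb://localhost:27017/mydb');

mongoose.connection.on('connected', function() {
  console.log('mongoose connected');
});
mongoose.connection.on('error', function(err) {
  console.error('mongoose connection error:', err);
});
mongoose.connection.on('disconnected', function() {
  console.log('mongoose disconnected');
});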
I'm using grunt-shell-spawn to run a script that checks whether mongod is already running and starts it if not. I'm not 100% certain this is needed on the droplet, but I was having issues locally where mongo was already running. The script:
/startMongoIfNotRunning.sh
if pgrep mongod; then
  echo running;
else
  mongod --quiet --dbpath db/;
fi
exit 0;
/Gruntfile.js
shell: {
  make_dir: {
    command: 'mkdir -p db'
  },
  mongodb: {
    command: './startMongoIfNotRunning.sh',
    options: {
      stdin: false,
    }
  }
},
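For context, here is a minimal sketch of the Gruntfile wrapper this shell config would sit inside; the loadNpmTasks and registerTask lines are assumptions about how the tasks get wired up, not copied from the original:
module.exports = function(grunt) {
  grunt.initConfig({
    shell: {
      // ... the make_dir and mongodb tasks shown above
    }
  });

  // grunt-shell-spawn provides the shell:* tasks used above.
  grunt.loadNpmTasks('grunt-shell-spawn');

  // Assumed ordering: create the db directory, then start mongod if needed.
  grunt.registerTask('startdb', ['shell:make_dir', 'shell:mongodb']);
};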
And here's how I'm defining the database location:
/index.js
server.register([
{ register: require('./app/db'), options: { url: process.env.MONGODB_URL || 'mongodb://localhost:27017/mydb' } },
....
/app/db/index.js
var mongoose = require('mongoose');
var _ = require('lodash-node');
var models = require('require-all')(__dirname + '/models');
exports.register = function(plugin, options, next) {
  mongoose.connect(options.url, function() {
    next();
  });
  var db = mongoose.connection;
  plugin.expose('connection', db);
  _.forIn(models, function(value, key) {
    plugin.expose(key, value);
  });
};

exports.register.attributes = {
  name: 'db'
};
My app is looking for db files in db/. Could it be that dokku's mongodb:link appName mydb linked it to the wrong location? Perhaps process.env.MONGODB_URL is not being set correctly? I really don't know where to go from here.

It turns out the solution to my problem was adding an entry to my droplet's hosts file pointing at the MongoDB URL:
127.0.0.1 mongodb.myurl.com
For some reason, linking the db to my app with Dokku didn't add this entry on the droplet itself; I would have thought that was automatic. The app container's hosts file did get a mongodb entry when I linked the db to the app.

Related

role postgres does not exist - attempting to connect to psql inside Express app

There are questions similar to mine, but I'm still having trouble figuring out this issue.
I've added the psql bin path to my bash_profile (on Mac) and I can run psql with:
psql -U postgres
The problem is that I can't connect from inside my Express app. Following the instructions in the Express documentation, I have the following code:
const dotenv = require('dotenv').config();
var pgp = require('pg-promise')();
var db = pgp(
`postgres://
${process.env.DB_USER}:${process.env.DB_PASSWORD}
@
${process.env.DB_HOST}:5432/${process.env.DB_DATABASE}`
);
module.exports = db;
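For comparison, pg-promise also accepts the whole connection string built on a single line; a minimal sketch using the same environment variables (this is not from the original question) would be:
const pgp = require('pg-promise')();

// Keep the URI on one line so no stray whitespace or newlines end up
// inside the user name, password or host.
const connectionString =
  `postgres://${process.env.DB_USER}:${process.env.DB_PASSWORD}` +
  `@${process.env.DB_HOST}:5432/${process.env.DB_DATABASE}`;

const db = pgp(connectionString);
module.exports = db;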
Then I try to make the db connection in app.js with:
var db = require('./models/dbConnection/postgres')
db.one('SELECT $1 AS value', 123)
  .then(function (data) {
    console.log('DATA:', data.value)
  })
  .catch(function (error) {
    console.log('ERROR:', error)
  })
I have defined my environment variables in my .env file as follows:
DB_CONLIMIT=50
DB_HOST=127.0.0.1
DB_USER=postgres
DB_PASSWORD=
DB_DATABASE=mydatabase
When I run the app, I get:
[nodemon] starting node ./bin/www local-library:server Listening
on port 3000 +0ms ERROR: error: role "
postgres" does not exist
at Parser.parseErrorMessage
Looking at the roles in psql when I run it from the terminal:
postgres=# \du
List of roles
Role name | Attributes | Member of
-----------+------------------------------------------------------------+-----------
postgres | Superuser, Create role, Create DB, Replication, Bypass RLS | {}
What am I missing?

Creating sub connections with azure-mobile-apps and NodeJS

I'm trying to create an API using Node.js, Express and azure-mobile-apps to do some data synchronisation between an Ionic 3 mobile app (which uses a local SQLite database) and a Microsoft SQL database.
My API has to create a synchronisation connection for each mobile application. Each application is linked to its own remote database. For example, if user_01 wants to synchronise his data, he is linked to his client_01 database. So each time this happens, the API has to spin up a new listener running on a different port.
here is an example : https://zupimages.net/up/19/36/szhu.png
The problem is that I'm not able to create more than one connection with azure-mobile-apps. The first one always works, but the second, third, etc. still use the first connection I instantiated. I've looked into the app stack and everything seems fine.
Is this an issue with azure-mobile-apps, or did I misunderstand something about Express?
Thanks for your responses!
var azureMobileApps = require('azure-mobile-apps');
var express = require('express');

module.exports = {
  async createConnection(client) {
    try {
      let app = express();
      mobileApp = azureMobileApps({
        homePage: true,
        swagger: true,
        data: {
          server: '****',
          user: client.User,
          password: client.Password,
          port: '1443',
          database: client.Database,
          provider: 'mssql',
          dynamicSchema: false,
          options: {
            encrypt: false
          }
        }
      });

      await mobileApp.tables.import('./tables');
      await mobileApp.tables.initialize();

      app.listen(global.portCounter);
      app.use(mobileApp);

      console.log(app._router.stack);
      console.log('Listening on port ', global.portCounter);
      global.portCounter++;
    } catch (error) {
      console.log(error);
    }
  }
}
It's working now. The thing is, it's impossible to do multiple connections with the azure-mobile-apps SDK for Node.js.
I had to use worker threads, which seem to isolate the memory in a sub-process of sorts.
Hope it can help somebody one day.
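For anyone trying the same route, here is a minimal sketch of the worker-thread idea using Node's worker_threads module; the file names and message payload are illustrative, not from the original answer, and connection-worker.js is assumed to run the createConnection logic shown above:
// main.js - spawn one worker per client so each azure-mobile-apps
// instance lives in its own isolated memory.
const { Worker } = require('worker_threads');

function startClientWorker(client, port) {
  const worker = new Worker('./connection-worker.js', {
    workerData: { client: client, port: port }  // hypothetical payload
  });
  worker.on('message', function(msg) {
    console.log('worker says:', msg);
  });
  worker.on('error', function(err) {
    console.error('worker failed:', err);
  });
  return worker;
}

// connection-worker.js would read the payload and call createConnection:
// const { workerData, parentPort } = require('worker_threads');
// createConnection(workerData.client).then(function() {
//   parentPort.postMessage('listening on ' + workerData.port);
// });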

Hapi.js Catbox Redis returning "server.cache is not a function"

So I'm like 99% sure I'm just screwing up something dumb here.
I'm trying to set up Catbox to cache objects to Redis. I have Redis up and running and I can hit it with RDM (a Sequel Pro-like utility for Redis), but Hapi is not cooperating.
I register the redis catbox cache like so:
const server = new Hapi.Server({
  cache: [
    {
      name: 'redisCache',
      engine: require('catbox-redis'),
      host: 'redis',
      partition: 'cache',
      password: 'devpassword'
    }
  ]
});
I am doing this in server.js. After this block of code I go on to register some more plugins and start the server. I also export the server at the end of the file:
module.exports = server;
Then in my routes file, I am attempting to set up a testing route like so:
{
  method: 'GET',
  path: '/cacheSet/{key}/{value}',
  config: { auth: false },
  handler: function(req, res) {
    const testCache = server.cache({
      cache: 'redisCache',
      expireIn: 1000
    });
    testCache.set(req.params.key, req.params.value, 1000, function(e) {
      console.log(e);
      res(Boom.create(e.http_code, e.message));
    })
    res(req.params.key + " " + req.params.value);
  }
},
Note: My routes are in an external file, and are imported into server.js where I register them.
If I comment out all the cache stuff on this route, the route runs fine and returns my params.
If I run this with the cache stuff, at first I got "server not defined". So I then added
const server = require('./../server.js');
to import the server.
Now when I run this, I get "server.cache is not a function" and a 500 error.
I don't understand what I'm doing wrong. My guess is that I'm importing server, but perhaps it's the object without all the configs set so it's unable to use the .cache method. However this seems wrong because .cache should always be a default method with the default memory cache, so even if my cache registration isn't active yet, server.cache should theoretically still be a method.
I know it has to be something basic I'm messing up, but what?
I was correct. I was doing something stupid. It had to do with how I was exporting my server. I modified my structure to pull out the initial server creation and make it more modular. Now I am simply exporting JUST the server like so:
'use strict';

const Hapi = require('hapi');

const server = new Hapi.Server({
  cache: [
    {
      name: 'redisCache',
      engine: require('catbox-redis'),
      host: 'redis',
      partition: 'cache',
      password: 'devpassword'
    }
  ]
});

module.exports = server;
I then import that into my main server file (now index.js, previously server.js) and everything runs fine. I can also import it into any other file (in this case my routes file) and access the server's methods, as sketched below.
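As a rough illustration of that layout (file names follow the description above; the cache call mirrors the test route from earlier, with the Catbox policy option spelled expiresIn):
// routes.js - pull in the shared, fully configured server instance
const server = require('./server');

const testCache = server.cache({
  cache: 'redisCache',   // the named connection registered in server.js
  expiresIn: 1000        // Catbox policies take expiresIn (milliseconds)
});

// index.js - the main entry point registers plugins and routes against
// that same instance and then starts it:
// const server = require('./server');
// server.route(routes);
// server.start(function(err) { /* ... */ });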
Redis is happily storing keys and Hapi is happily not giving me errors.
Leaving here in case anyone else runs into a dumb mistake like this.

Replica Set not working as expected

I have configured it as below, and my MongoDB doesn't need a username or password:
mongo: {
  module: 'sails-mongo',
  url: "mongodb://127.0.0.1:27017/mydb",
  replSet: {
    servers: [
      {
        host: "127.0.0.1",
        port: 27018
      },
      {
        host: "127.0.0.1",
        port: 27019
      }
    ],
    options: {connectWithNoPrimary: true, rs_name: "rs0"}
  }
}
It's working fine, meaning I don't get a connection error and I am able to run queries. But when I bring down 127.0.0.1:27017, 127.0.0.1:27018 becomes PRIMARY, as rs.status() confirms. After this, I am no longer able to run any query and keep getting the following:
Error: no open connections
I am sure that I set up the replica set on my local machine correctly, since I used the MongoDB native driver to test the same scenario (bring down the PRIMARY and let a SECONDARY take over as PRIMARY) and there was no problem:
var url = 'mongodb://127.0.0.1:27017,127.0.0.1:27018,127.0.0.1:27019/mydb?w=0&wtimeoutMS=5000&replicaSet=sg1&readPreference=secondary';
mongodb.MongoClient.connect(url, function(err, result) {
  if(err || result === undefined || result === null) {
    throw err;
  } else {
    db = result;
  }
});
OK, I found the answer. The message was emitted because of session.js. I commented out everything in that file and now it is working. The reason, I guess, is that session.js points at only a single host, the original PRIMARY; when that PRIMARY goes down, session.js can no longer connect and throws the exception. I also tried putting all of the replica set's hosts into the MongoDB URL string in session.js (mongodb://127.0.0.1:27017,127.0.0.1:27018,127.0.0.1:27019/mydb), but then "sails lift" failed. With only a single host it is fine.
For now, if I need to store session info, I have to start another MongoDB instance and point session.js at that new instance, as sketched below.
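A rough sketch of that workaround; the config/session.js keys shown are an assumption about the Sails connect-mongo adapter of that era, so verify them against your Sails version:
// config/session.js - point the session store at a separate standalone
// MongoDB instance so it is unaffected by the replica set failover.
module.exports.session = {
  adapter: 'mongo',        // assumed adapter name for connect-mongo
  host: '127.0.0.1',
  port: 27020,             // hypothetical standalone instance for sessions
  db: 'sails_sessions',
  collection: 'sessions'
};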

Deploying Node/Mongo to OpenShift

Hello, I'm trying to get a Node/Mongo service going on OpenShift; here's what it looks like:
var db = new mongodb.Db('myServiceName',
  new mongodb.Server('mongodb://$OPENSHIFT_MONGODB_DB_HOST', '$OPENSHIFT_MONGODB_DB_PORT', {}));
db.open(function (err, db_p) {
  if (err) { throw err; }
  db.authenticate('$USER', '$PASS', function (err, replies) {
    if (err) { throw err; }
    // should be connected and authenticated.
    // ...
The app was created using rhc:
$ rhc create-app myServiceName nodejs-0.10 mongodb-2.4
The console shows the app was started and is running, but a cURL request returns a 503.
My logs don't show an error; however, the DB is obviously not live. Can anyone help?
If your mongodb driver supports connection with username/password, then use OPENSHIFT_MONGODB_DB_URL instead of OPENSHIFT_MONGODB_DB_HOST
OPENSHIFT_MONGODB_DB_URL gives you this format:
mongodb://admin:password@127.4.99.1:27017/
and OPENSHIFT_MONGODB_DB_HOST gives you this format:
an IP address, e.g. 127.4.99.1
So you can just use OPENSHIFT_MONGODB_DB_URL to connect and authenticate at the same time
with mongoskin, you can just do this:
var db = require('mongoskin').db(process.env.OPENSHIFT_MONGODB_DB_URL + 'dbname'+ '?auto_reconnect=true',
{safe: true, strict: false}
);
It looks like you are attempting to connect to a server literally named "$OPENSHIFT_MONGODB_DB_HOST", which is not a valid URL.
Instead, you'll probably want to read the value of the OPENSHIFT_MONGODB_DB_HOST environment variable to find your connection information:
process.env.OPENSHIFT_MONGODB_DB_HOST
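For example, here is a minimal sketch of the question's connection code with the environment variables actually read at runtime; the driver calls mirror the snippet above, and the exact variable names for the username, password and port are assumptions based on OpenShift's standard cartridge variables:
var mongodb = require('mongodb');

var db = new mongodb.Db('myServiceName',
  new mongodb.Server(process.env.OPENSHIFT_MONGODB_DB_HOST,
    parseInt(process.env.OPENSHIFT_MONGODB_DB_PORT, 10), {}));

db.open(function (err, db_p) {
  if (err) { throw err; }
  // Credentials also come from the environment rather than literal strings.
  db.authenticate(process.env.OPENSHIFT_MONGODB_DB_USERNAME,
    process.env.OPENSHIFT_MONGODB_DB_PASSWORD,
    function (err, ok) {
      if (err) { throw err; }
      // connected and authenticated
    });
});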
I have some additional notes up here: https://www.openshift.com/blogs/getting-started-with-mongodb-on-nodejs-on-openshift
