role postgres does not exist - attempting to connect to psql inside Express app - node.js

There are questions similar to mine, but I'm still having trouble figuring out this issue.
I've added the psql bin path to my .bash_profile (on Mac), and I can run psql with:
psql -U postgres
The problem is that I can't connect from inside my Express app. Following the instructions in the Express documentation, I have the following code:
const dotenv = require('dotenv').config();
var pgp = require('pg-promise')();
var db = pgp(
  `postgres://
${process.env.DB_USER}:${process.env.DB_PASSWORD}
@
${process.env.DB_HOST}:5432/${process.env.DB_DATABASE}`
);
module.exports = db;
Then I try to make the db connection in app.js with:
var db = require('./models/dbConnection/postgres')

db.one('SELECT $1 AS value', 123)
  .then(function (data) {
    console.log('DATA:', data.value)
  })
  .catch(function (error) {
    console.log('ERROR:', error)
  })
I have defined my environment variables in my .env file as follows:
DB_CONLIMIT=50
DB_HOST=127.0.0.1
DB_USER=postgres
DB_PASSWORD=
DB_DATABASE=mydatabase
When I run the app, I get:
[nodemon] starting node ./bin/www
local-library:server Listening on port 3000 +0ms
ERROR: error: role "
postgres" does not exist
    at Parser.parseErrorMessage
Looking at my roles in psql when I run \du in the terminal:
postgres=# \du
List of roles
Role name | Attributes | Member of
-----------+------------------------------------------------------------+-----------
postgres | Superuser, Create role, Create DB, Replication, Bypass RLS | {}
What am I missing?
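One likely culprit, judging from the error itself: the quoted role name breaks across two lines (role " / postgres"), which suggests the multi-line template literal is embedding newline characters into the connection string, so the role sent to Postgres is "\npostgres" rather than "postgres". A minimal sketch of the difference, with placeholder values standing in for the .env variables:

```javascript
// Placeholder values standing in for DB_USER, DB_PASSWORD, DB_HOST, DB_DATABASE.
const env = { DB_USER: 'postgres', DB_PASSWORD: '', DB_HOST: '127.0.0.1', DB_DATABASE: 'mydatabase' };

// Broken: line breaks inside a template literal become part of the string,
// so the user name Postgres receives starts with "\n".
const broken = `postgres://
${env.DB_USER}:${env.DB_PASSWORD}
@
${env.DB_HOST}:5432/${env.DB_DATABASE}`;
console.log(broken.includes('\n')); // true

// Fixed: keep the whole URL on one line.
const fixed = `postgres://${env.DB_USER}:${env.DB_PASSWORD}@${env.DB_HOST}:5432/${env.DB_DATABASE}`;
console.log(fixed); // postgres://postgres:@127.0.0.1:5432/mydatabase
```

Passing the single-line URL to pgp(...) should then authenticate as role postgres rather than "\npostgres".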

Related

Node.js (TypeScript) Postgres client does not appear to run insert statement on Pool.query, and callback does not execute

Hopefully this is a minimal viable example:
routes.ts
import express, { Request, Response } from "express";
import { QueryResult, Pool } from "pg";

const pool = new Pool({
  user: process.env.DOCKER_USER,
  host: "localhost",
  database: process.env.DOCKER_DB,
  password: process.env.DOCKER_PASSWORD,
  port: 5432,
});

const router = express.Router();

router.post("/log/hub", (req: Request, res: Response) => {
  console.log("Made it here!");
  const username = "PLACEHOLDER";
  const json = { a: "b" };
  const now = new Date();
  const cachedId = "12345";
  pool.query(
    "INSERT INTO data_actions (username, json_payload, cache_id, action_timestamp) VALUES ($1, $2, $3, $4)",
    [username, json, cachedId, now],
    (error: Error, results: QueryResult) => {
      if (error) {
        throw error;
      }
      res.status(201).send(results);
    }
  );
});

export = router;
echo $DOCKER_USER,$DOCKER_DB,$DOCKER_PASSWORD
produces
docker, docker, docker
I have verified in the client that these values match the values printed above and that they match the credentials set up in the Docker container for Postgres. I'm able to connect to the database on port 5432 with pgAdmin4 and view the database I expect, which looks like this:
List of relations
Schema | Name | Type | Owner
--------+------------------+-------+--------
public | data_actions | table | docker
(1 row)
SELECT * FROM data_actions;
produces
username | json_payload | cache_id | action_timestamp
----------+--------------+----------+------------------
(0 rows)
I am able to reach the Express router endpoint at localhost:5000, so everything downstream is not the problem. I think the issue lies somewhere in how I'm using pg. Does anything obvious stand out here? I have a feeling I'm missing something small, and I'm banging my head against the keyboard trying to figure out what's going wrong.
EDIT: The relation list shows 1 row while the select statement returns 0 rows; this is because I deleted the row I had inserted from pgAdmin before posting here. Sorry for the red herring.
Try updating the pg module. Old versions might not work with newer Node.js versions.
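For reference, a sketch of what checking and updating might look like (assuming npm):

```shell
# Show the version of pg installed in this project
npm ls pg
# Show the latest published version for comparison
npm view pg version
# Update pg within the project
npm install pg@latest
```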

How to set and map environment data in loopback js?

I have followed the LoopBack documentation to set up deployment- and environment-specific configuration:
https://loopback.io/doc/en/lb3/Environment-specific-configuration.html
There is a place where I need to get the data source connector; for that I have configured it and access the data source throughout the application, and it is working fine.
module.exports.DataSources = app.dataSources.hmsDs.connector;
But after configuring the application for production deployment as the documentation says, I am getting an error:
module.exports.DataSources = app.dataSources.applianceDs.connector;
^
TypeError: Cannot read property 'connector' of undefined
datasource.production.js
module.exports = {
  applianceDs: {
    hostname: process.env.DB_HOST,
    port: process.env.DB_PORT || 27017,
    database: 'applianceDB',
    user: process.env.DB_USER || "",
    password: process.env.DB_PASSWORD || "",
    name: "applianceDs",
    connector: 'mongodb',
  }
}
And when I tried to access the data source as follows,
var { applianceDs } = require('../datasource.production');
module.exports.DataSources = applianceDs.connector;
I get an error like this:
AssertionError [ERR_ASSERTION]: User is referencing a dataSource that does not exist: "applianceDs"
Steps that I have followed in the terminal:
$ export PRODUCTION=true
$ export DB_HOST="127.0.0.1"
$ export DB_USER="me"
$ export DB_PASSWORD="0000"
$ nodemon app.js
Which step am I missing? Is there something hidden in LoopBack?
Please help.
Thanks
Some advice that may help:
Set the NODE_ENV="production" environment variable. I'm not sure, but LoopBack takes this value as the config file suffix.
Try passing the env variables to the run command directly, as they may not be inherited from the console (to check this point, print process.env[VARIABLE_NAME]).
To pass env variables to the command:
PRODUCTION=true DB_HOST="127.0.0.1" DB_USER="me" DB_PASSWORD="0000" nodemon app.js
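To check the second point, a small sketch (the file name check-env.js is hypothetical) that prints whether each variable actually reached the Node process:

```javascript
// check-env.js -- run with: PRODUCTION=true DB_HOST=127.0.0.1 node check-env.js
// Prints each variable the app relies on, or "(not set)" if it never reached the process.
['NODE_ENV', 'PRODUCTION', 'DB_HOST', 'DB_PORT', 'DB_USER', 'DB_PASSWORD'].forEach(function (name) {
  console.log(name + ' = ' + (process.env[name] === undefined ? '(not set)' : process.env[name]));
});
```

If a variable prints "(not set)" here, the datasource config will see undefined for it, which matches the "dataSource that does not exist" assertion.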

NodeJS connection to MongoDB (on AWS C9) returns undefined db object

I'm learning how to code on Amazon's Cloud 9, and I'm trying to connect to a MongoDB database from NodeJS, but when I call the connect() function the db returned in the callback is undefined. However, I can see that the data exists when I run show dbs in the mongo shell, and the mongo server is running without issue, and the connect() function itself doesn't throw any errors.
Here is my index.js file containing the MongoDB connection (it executes when I run npm start in the terminal):
var MongoClient = require('mongodb').MongoClient;

// Connect to the db
MongoClient.connect("mongodb://localhost:27017/testdb", function (err, db) {
  if (err) { console.log(err) }
  else {
    console.log("Connected!")
    console.log(db.name)
  }
  db.close()
})
mongo shell confirms the database exists (sample is the name of the collection):
use testdb
switched to db testdb
db.sample.find()
{ "_id" : ObjectId("5aed5fc7a44ab7d8a4efce2f"), "name" : "Luckyfield" }
mongo server is running as it says "waiting for connections on port 27017", and whenever I run the index.js file this server records the connection opening and closing in the terminal:
connection accepted from 127.0.0.1:43820 #16 (1 connection now open)
2018-05-05T11:07:16.128+0000 I NETWORK [conn16] received client metadata from 127.0.0.1:43820 conn16: { driver: { name: "nodejs", version: "3.0.7" }, os: { type: "Linux", name: "linux", architecture: "x64", version: "4.9.91-40.57.amzn1.x86_64" }, platform: "Node.js v6.14.1, LE, mongodb-core: 3.0.7" }
2018-05-05T11:07:16.138+0000 I NETWORK [conn16] end connection 127.0.0.1:43820 (0 connections now open)
When I replace db.name with just db, the console shows a MongoClient object full of data, though I don't know what to make of it. I also tried inserting and querying documents, but again db seems to be undefined.
In summary, when I run npm start in a separate terminal, the index.js file executes and prints undefined in the console, even though the connection is properly established and the data actually exists. Thank you for any help!
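No answer was recorded here, but one likely explanation given the driver version in the server log (3.0.7): in the 3.x driver, the connect() callback's second argument is a MongoClient rather than a Db, so db.name is undefined. A sketch of the 3.x shape (assuming a local mongod on 27017, as in the question):

```javascript
var MongoClient = require('mongodb').MongoClient;

// In driver 3.x the callback receives a MongoClient, not a Db.
MongoClient.connect("mongodb://localhost:27017", function (err, client) {
  if (err) { return console.log(err); }
  console.log("Connected!");
  var db = client.db("testdb");   // select the database explicitly
  console.log(db.databaseName);   // the Db property is databaseName, not name
  db.collection("sample").find().toArray(function (err, docs) {
    console.log(docs);
    client.close();               // close() lives on the client in 3.x
  });
});
```

This would also explain why printing db showed "a MongoClient object full of data".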

Error when attempting knex seed:run after successful knex migrate:latest for remote database

I'm getting the following error when attempting to run knex seed:run against my remote Postgres database (not localhost): Knex:Error Pool2 - Error: connect ECONNREFUSED 127.0.0.1:5432.
I am able to run knex migrate:latest successfully and can see that the tables are created on my Postgres server, but when I try to seed I get that error. I've run the same migrations/seed file against my local configuration without a problem, but when I attempt to seed my Heroku Postgres instance, it throws this error (I'm not running my local pg service when seeding the new db, which is likely why it's refusing the connection).
Any thoughts on why it is attempting to connect to localhost instead of the specified db? Sample of my file provided below:
var User = require("./models/User");
var Project = require("./models/Project");

exports.seed = function(knex, Promise) {
  console.log(knex.client.config.connection); // This returns the correct db info.
  return knex('user').del()
    .then(function() {
      return knex('project').del()
    }).then(function() {
      return new User({id: 1, firstName: "James", lastName: "Lee", phone: "123-456-2000", email: "test@test.com"}).save(null, {method: "insert"});
    }).then(function() {
      return new Project({id: 1, name: "Test"}).save(null, {method: "insert"});
    })
};
This turned out to be due to how I was setting up the migrations and seeds. The configurations were actually pulling from two different places: one had the correct SSL settings in place, the other (the seed file) did not. Adding the correct settings in both places resolved the issue.
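For anyone hitting the same thing, a sketch of keeping a single shared production config in knexfile.js, so migrate:latest and seed:run cannot pull from different places (the DATABASE_URL name and directory paths are illustrative; Heroku Postgres typically requires SSL):

```javascript
// knexfile.js -- one source of truth for both migrate:latest and seed:run
module.exports = {
  production: {
    client: 'pg',
    connection: {
      connectionString: process.env.DATABASE_URL, // illustrative env var name
      ssl: { rejectUnauthorized: false },         // Heroku Postgres typically requires SSL
    },
    migrations: { directory: './migrations' },
    seeds: { directory: './seeds' },
  },
};
```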

Mongoose not connecting on Ubuntu Ubuntu 14.04

I've got a node app built on Hapi using MongoDB and mongoose. Locally, I can use the app without issue. I can connect to the db, add data, find it, etc.
I've created an Ubuntu 14.04 x64 droplet on Digital Ocean.
I can ssh into my droplet and verify that my db is there with the correct name. I'm using dokku-alt to deploy and I have linked the db name to the app using dokku's mongodb:link appName mydb
I was having issues once I deployed the app where it would hang and eventually timeout. After a lot of debugging and commenting out code I found that any time I try to hit mongo like this the app will hang:
var User = request.server.plugins.db.User;

User
  .findOne({ id: request.auth.credentials.profile.raw.id })
  .exec(function(err, user){
    // do something
  });
Without this, the app loads fine, albeit without data. So my thought is that mongoose is never properly connecting.
I'm using grunt-shell-spawn to run a script which checks whether mongo is already running and starts it if not. I'm not 100% certain this is needed on the droplet, but I was having issues locally where mongo was already running... The script:
/startMongoIfNotRunning.sh
if pgrep mongod; then
  echo running;
else
  mongod --quiet --dbpath db/;
fi
exit 0;
/Gruntfile.js
shell: {
  make_dir: {
    command: 'mkdir -p db'
  },
  mongodb: {
    command: './startMongoIfNotRunning.sh',
    options: {
      stdin: false,
    }
  }
},
And here's how I'm defining the database location:
/index.js
server.register([
{ register: require('./app/db'), options: { url: process.env.MONGODB_URL || 'mongodb://localhost:27017/mydb' } },
....
/app/db/index.js
var mongoose = require('mongoose');
var _ = require('lodash-node');
var models = require('require-all')(__dirname + '/models');

exports.register = function(plugin, options, next) {
  mongoose.connect(options.url, function() {
    next();
  });
  var db = mongoose.connection;
  plugin.expose('connection', db);
  _.forIn(models, function(value, key) {
    plugin.expose(key, value);
  });
};

exports.register.attributes = {
  name: 'db'
};
My app is looking for db files in db/. Could it be that dokku's mongodb:link appName mydb linked it to the wrong location? Perhaps process.env.MONGODB_URL is not being set correctly? I really don't know where to go from here.
It turns out the solution to my problem was adding an entry to the hosts file of my droplet to point to the MongoDB URL:
127.0.0.1 mongodb.myurl.com
For some reason, linking the db to my app with Dokku didn't add this bit. I would have thought that it was automatic. The app container's hosts file did get a mongodb entry when I linked the db to the app.
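The fix above as commands on the droplet (a sketch; mongodb.myurl.com is the placeholder hostname from the answer):

```shell
# Append the host entry that Dokku's link step did not add
echo "127.0.0.1 mongodb.myurl.com" | sudo tee -a /etc/hosts
# Confirm it now resolves locally
getent hosts mongodb.myurl.com
```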
