Local Postgres connection not returning anything - Node.js

I’m trying to perform a simple query on a local database. I expect this query to return the schema names of the database.
I am running Postgres version 13.1 and I installed it by following the steps shown here: https://postgresapp.com/
As per the guidelines on the Postgres Wiki, I'm including my config file changes; the only settings I edited manually were to enable logging.
This computer is running macOS Big Sur version 11.0.1.
I'm using Node.js. Postgres is running on port 5432, and I can access it with psql.
The relevant changes I've made are the following:
Endpoint in server.js:
router.post('/mock_call', async (ctx) => {
  try {
    console.log('sup')
    await sql.mockCall()
    ctx.body = {
      status: "It's good",
      data: 'good'
    }
  } catch (e) {
    console.log(e)
    ctx.body = {
      status: "Failed",
      data: e
    }
    ctx.res.statusCode = 422;
  }
})
SQL File:
require("openssl")
const { Pool, Client } = require('pg');
const client = new Client({
user: 'user1',
host: 'localhost',
database: 'postgres',
password: 'mypass',
port: 5432,
});
module.exports = {
  mockCall: function () {
    console.log('mockCall begin')
    client.connect(err => {
      if (err) {
        console.error('connection error', err.stack)
      } else {
        console.log('connected')
      }
    })
    console.log('before query')
    client.query("SELECT schema_name FROM information_schema.schemata", (err, res) => {
      if (err) {
        console.log('theres an error:')
        console.log(err)
      }
      console.log('theres a response:')
      console.log(res)
      for (let row of res.rows) {
        console.log(JSON.stringify(row));
      }
      client.end();
    });
  }
}
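(As an aside, and not the root cause found in the answer below: mockCall never returns a promise, so the await in the route resolves before the query finishes. A minimal sketch of a promise-based variant, using pg's built-in promise support:)

// Sketch only: a promise-returning mockCall, so `await sql.mockCall()`
// actually waits for the query to finish.
module.exports = {
  mockCall: async function () {
    await client.connect();
    const res = await client.query('SELECT schema_name FROM information_schema.schemata');
    for (const row of res.rows) {
      console.log(JSON.stringify(row));
    }
    await client.end();
  }
}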
Logs that actually get printed out when I hit the endpoint on localhost:
sup
mockCall begin
before query
The PostgreSQL logs are not helpful; it's as if the server never gets hit (screenshot omitted).
This exact project and code work on my personal local computer, and the query goes through as expected. It also used to work on a Heroku server I had set up. The only difference with the Heroku server is that the connection is made like so:
const client = new Client({
  connectionString: process.env.DATABASE_URL,
  ssl: {
    rejectUnauthorized: false
  }
});
This connection had been working on a server I'd had for over a year. My database was running out of space, so I upgraded from a hobby database to a standard plan on Heroku, and the app continued to work. A couple of weeks after this upgrade, I pushed a new commit that included a couple of new features, and this broke the PostgreSQL connection. After this push I immediately checked out my last working commit and pushed that one; the issue, however, was still there.
I currently have the program running on my personal local computer, but I need to move it back to Heroku as quickly as possible. The pictures and logs I've included above are the result of running my app locally on my friend's computer, which seems to be having the same issue I'm having on Heroku, so I'm hoping that if I figure out the issue on his local computer I'll be able to solve what's going on in Heroku.
These are the logs that get printed out from my personal local computer, which is working (screenshot omitted).
Edits:
Running psql -d postgres -U user1 -h localhost -p 5432 successfully connects me to the database on the command line.
The new feature I added was a new endpoint for my app's customers. This commit works fine on my personal local computer, so I don't think it's an issue with the new features. Additionally, I've since reverted to my previous commit, which used to work, so none of that new code is present anymore.
I'm running the entire app locally on my friend's computer. I set up Postgres from scratch just as I did a year ago on my computer. However, now only my personal local computer is working.
I haven't changed anything in pg_hba.conf on either setup; both look the same (screenshot omitted).
At first I thought the problem was with Heroku, since my local app was working fine. However, after reaching out and talking with support for a couple of days, they said:
Hi there,
It looks like your application is able to successfully connect to the database, but something else in the application or framework is preventing the data from being retrieved. Unfortunately, as this is an application issue it falls outside the nature of the Heroku Support policy. I recommend searching our Knowledge Base or asking the community on Stack Overflow for more answers.

Turns out I was using an old version of pg, 7.8. I upgraded to 8.5 and now it works.
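For anyone diagnosing a similar silent hang, a quick first check is to print the versions actually in play. A minimal sketch (the filename is hypothetical, and the npm command in the comment is simply the usual upgrade path):

// check-versions.js (hypothetical filename) -- run with: node check-versions.js
console.log('node:', process.version);
console.log('pg:', require('pg/package.json').version);
// If pg reports 7.x, upgrading is the usual npm step:
//   npm install pg@^8.5.0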

Related

How do I resolve this Mongo Server Selection Error?

I am quite new to backend development and databases. I have a solution currently using MongoDB. It was working fine till yesterday, when I started getting the connect ETIMEDOUT 13.37.254.237:27017 error. Nothing in the URI path was changed or tampered with; it just started, and I have not been able to sort it out.
Is there any help available, please?
I have created another cluster and it's working well, but my initial cluster, which holds live client data, is still not connecting.
My connection code
I have used the following connection code, but it has not worked. It was connecting fine all through yesterday, but today, without any changes to the code, it couldn't connect to my MongoDB:
mongoose.connect(process.env.MONGO_URI, { useNewUrlParser: true, useUnifiedTopology: true });

const connectDB = async () => {
  try {
    const conn = await mongoose.connect(process.env.MONGO_URL);
    console.log(`MongoDB Connected: ${conn.connection.host}`);
  } catch (error) {
    console.log(error);
    process.exit(1);
  }
};
My mongoose connection and the timed-out error (screenshot omitted).
Connecting to MongoDB from a system whose IP address keeps changing can cause this kind of issue. It can also be due to your network connection. So I would advise you to:
1. Allow connections from any IP address (but make sure your URI is not made public, to avoid attacks/access from unwanted users).
2. Check your network status (data).
3. Run the mongo URI on your Atlas.
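If the blocker really is the IP allowlist or the network, mongoose will appear to hang for the driver's default 30-second server-selection window before surfacing ETIMEDOUT. A minimal sketch that fails fast so the underlying error shows up sooner, assuming the same MONGO_URI environment variable and a mongoose version that passes serverSelectionTimeoutMS through (mongoose 5 with useUnifiedTopology, or mongoose 6+):

const mongoose = require('mongoose');

const connectDB = async () => {
  try {
    // Fail after ~5s instead of the default 30s server-selection timeout.
    const conn = await mongoose.connect(process.env.MONGO_URI, {
      serverSelectionTimeoutMS: 5000,
    });
    console.log(`MongoDB Connected: ${conn.connection.host}`);
  } catch (error) {
    console.log(error); // ETIMEDOUT here usually points at network/allowlist
    process.exit(1);
  }
};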

Heroku Postgres timeout and SSL issues on API calls with Node

I'm trying to deploy my REST API, built with Node and PostgreSQL, to Heroku. So I created my application on Heroku and created its own database. At this point I tried to commit, and the build worked correctly. That is, until I tried to make some calls to the API. If the API call I make has an incorrect method, or doesn't exist, it gives me the correct error; but when the call is correct, there is a 503 error, with code H12 and a description saying it is a timeout error. Here is the code of one of the calls to the database that I'm testing:
router.get('/allpoints', async (req, res) => {
  try {
    const points = await pool.query(
      `SELECT nome, latitudine, longitudine
       FROM luogo`);
    res.json(points.rows);
  } catch (err) {
    console.error(err.message);
  }
});
Here is the information about how I connect to the database:
const pool = new Pool({
  connectionString: process.env.DATABASE_URL || 'postgresql://postgres:psw@localhost:5432/campione',
  ssl: process.env.DATABASE_URL ? true : false
})

module.exports = pool;
The build on Heroku seems to work properly.
I read this question: Heroku h12 Timeout Error with PG / Node.js
It says that you have to put res.end() where there is no res.json(), but here there is a res.json(). So I thought the issue could be that the route was catching an error and couldn't give anything back. I changed the console.log(err) to res.json(err), and the API responded with an SSL "self signed certificate" error. At this point, in the second file, I set ssl to false by default, but then it gave me an error because there was no SSL. I searched for a really long time for a solution, but I have not yet been able to fix the issue.
Does someone know what I should change? Thanks in advance.
This option in the database config may be useful:
ssl: {
  rejectUnauthorized: false,
}
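Applied to the pool configuration above, that could look like the following sketch (keeping the asker's placeholder local connection string):

const { Pool } = require('pg');

const pool = new Pool({
  connectionString: process.env.DATABASE_URL || 'postgresql://postgres:psw@localhost:5432/campione',
  // On Heroku (DATABASE_URL set): keep SSL but skip certificate verification,
  // since Heroku's certificates look self-signed to the client. Locally: no SSL.
  ssl: process.env.DATABASE_URL ? { rejectUnauthorized: false } : false,
});

module.exports = pool;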

Issue connecting to Heroku Postgres from Node + Express, and also locally

New to Node here, trying for the last 3 days straight, no clue.
Read all similar issues and tried literally everything I could find, no luck.
Suspecting something uncommon, or something specific to my machine or code.
Issue: trying to fetch data from a Postgres DB in Node.js, to the console (at least), so as to render it later as HTML.
The database has a table named students on Heroku, with records.
Locally on my macOS, I also have Postgres installed, with simple data in a table called students.
I couldn't fetch any data: no error, and no idea how to track it!
Tried creating the connection with Pool and Client; also followed the Heroku guide here exactly.
Tried literally everything that other users have encountered.
The DATABASE_URL environment variable is OK; if I echo $DATABASE_URL in Terminal:
postgres://xnlkikdztxosk:kuadf76d555dfab0a6c159b8404e2ac254f581639c09079baae4752a7b64a@ec3-52-120-48-116.compute-1.amazonaws.com:5432/uytnmb7fvbg1764
When I run node app.js, the server starts OK on port 3000. I can use Postman on the root '/' and it works; it returns the JSON info and logs to the console.
If I try Postman on '/students', it tries forever: no results, no error, nothing.
Tried also with my local installation of Postgres; same thing.
My modules are OK, and I ran npm install several times.
Thought it could be my Mac firewall, so I turned it off completely.
Also tried this, but nothing prints out, or I have no idea where to find it:
process.on('uncaughtException', function (err) {
  console.log(err);
});
A guide or steps to follow in order to track down issues like this would be highly appreciated.
app.js file:
const express = require('express')
const bodyParser = require('body-parser')
const { Client } = require('pg');

const app = express()
const PORT = 3000

app.use(bodyParser.json())
app.use(
  bodyParser.urlencoded({
    extended: true,
  })
)

const client = new Client({
  connectionString: process.env.DATABASE_URL,
  ssl: {
    rejectUnauthorized: false
  }
});
client.connect();

app.get('/', (req, res) => {
  res.json({ info: 'Info: This is the root directory' });
  console.log('main directory')
})

app.get('/students', (req, res) => {
  client.query('SELECT * FROM students;', (err, res) => {
    if (err) throw err;
    for (let row of res.rows) {
      console.log(JSON.stringify(row));
      console.log('WHOOOOOO, finally!');
    }
    client.end();
  });
});

app.listen(PORT, function () {
  console.log('Server running on port 3000');
});
Well, my Node version was for some reason v14. Not sure how that happened, but the most stable version on the Node site is 12, so I installed v12, and the connection to pg worked locally and remotely on Heroku.
This is just to highlight what worked for me after trying 4 days straight.
However, that may trigger a different issue for you, like the one I'm facing:
DeprecationWarning: Implicit disabling of certificate verification is deprecated and will be removed in pg 8. Specify `rejectUnauthorized: true` to require a valid CA or `rejectUnauthorized: false` to explicitly opt out of MITM protection.
All answers found so far point to the pg module being fixed in v7.18.1, but for some reason I can't force package.json to take that version; it jumps me to version 7.18.2.
Tried that, along with the latest version 8.3: same issue with Heroku, but locally the message doesn't show.
Not a big deal though; the connection works for now, until I figure it out.
I think the issue here is that you don't send back any response in the /students route. Notice that in the / route you have a res.json which sends back a response, but in the /students route I don't see where your response is sent, and that's why you wait forever.
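A minimal sketch of that fix: send the rows back, and rename the query callback's parameter so it no longer shadows Express's res (in the original, the inner res makes the outer response object unreachable):

app.get('/students', (req, res) => {
  client.query('SELECT * FROM students;', (err, result) => {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    // Respond to the HTTP request instead of only logging.
    res.json(result.rows);
  });
});

Note that client.end() is also dropped here; closing the shared client after the first request would break every request after it.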

Connection to Redis instance closed every couple of minutes

I'm running a Node.js app on Google Cloud that uses a Redis caching server. It was running fine for a couple of months, but it suddenly started throwing connection errors and occasionally stops responding.
The app runs in the standard environment and connects to the VM running the Redis instance via a VPC connector. I suspect it is a networking issue, because the problem doesn't seem to appear when I run the Node.js app from my own computer (connected to the same Redis server) or when the app runs in a flex environment and connects to the subnetwork directly. However, I'd prefer the app to run in the standard environment, because as far as I know that's the only way to force the traffic over HTTPS.
When I monitor via redis-cli, the server just doesn't receive any commands once the connection has failed.
Timeout in redis.conf is set to 0.
Redis version: 5.0.5
Here's the Redis code. I don't think it is the issue, though; it was running without issue a couple of weeks ago.
const redis = require('redis')

const redisOptions = {
  host: process.env.REDIS_IP,
  port: process.env.REDIS_PORT,
  password: process.env.REDIS_PASS,
  enable_offline_queue: false,
}

const client = redis.createClient(redisOptions.host, redisOptions.port)

// Log any errors
client.on('error', function(error) {
  console.log('Error:')
  console.log(error)
})

module.exports = client
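(One detail worth flagging, as an observation rather than a confirmed cause of the drops: node_redis's positional signature is createClient([port[, host[, options]]]), so the call above passes the arguments in the wrong order, and the password and enable_offline_queue options are never passed at all. A minimal sketch passing the options object directly, which node_redis also accepts:)

// Sketch: pass the whole options object so password and
// enable_offline_queue actually take effect.
const client = redis.createClient(redisOptions)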
These errors regularly show up in the Google App Engine log. When they occur, commands sent to Redis do not show up in the logs.
A 2019-08-31T12:42:27.162834Z { Error: Redis connection to 10.128.15.197:6379 failed - read ETIMEDOUT
A 2019-08-31T12:42:27.162868Z     at TCP.onStreamRead (internal/stream_base_commons.js:111:27) errno: 'ETIMEDOUT', code: 'ETIMEDOUT', syscall: 'read' }
I have seen the same issue many times with different databases. You have already found the issue: the number of open connections is a limited and costly resource. Try the following pattern (it is just an example):
// Inside your db module
function dbCall(userFunc) {
  const client = anyDb.createClient(host, port, ...);
  userFunc(client, () => { client.quit(); /* client.close() or whatever */ });
}
// Usage
dbCall((client, done) => {
  client.doSomethingWithCallback(..., () => {
    // user code
    done();
  });
});

dbCall((client, done) => {
  client.doSomePromise(...)
    .finally(done);
});

Bitnami Meanstack Mongoose Connection

I created a simple service on Ubuntu 16.04 with MongoDB, Node, and Express to return data to an Angular 2 app.
I have a file called server.js that connects to a local MongoDB instance with a database called game and a collection called players. It works fine installed on my local machine. However, I am trying to deploy it with Bitnami's MEAN stack image on Amazon EC2 (bleh, mouthful). I have set the ports correctly according to this guide, and I can connect to it remotely. However, I can't get mongoose to connect to any database. Here is my code that works on my local machine:
mongoose.connect('mongodb://localhost:27017/game');

router.route('/player')
  .get(function(req, res) {
    console.log(mongoose.connection.readyState);
    Player.find({"player": user, "password": password}, function(err, Test) {
      if (err)
        res.send(err);
      res.json(Test);
    });
  });
And here is my adjusted code for the MEAN stack image:
mongoose.connect('mongodb://root:My-Root-Password@127.0.0.1:27017/game');

router.route('/player')
  .get(function(req, res) {
    console.log(mongoose.connection.readyState);
    Player.find({"player": user, "password": password}, function(err, Test) {
      if (err)
        res.send(err);
      res.json(Test);
    });
  });
On my local machine I get a value of 1 from the console.log, and a value of zero on the MEAN stack image. I'm not sure how to connect to Bitnami's Mongo instance with mongoose. I have checked that game exists and has the data I want.
I found a fix, although I don't yet fully understand it. It came from the guide I posted here. First I had to uncomment the section of mongodb.conf that says noauth = true and comment out the line that says auth = true. I then restarted Mongo and created a new user with permissions to read and write the database I want to use, like this:
db.createUser({
  user: "NEW USERNAME",
  pwd: "NEW PASSWORD",
  roles: [
    {
      "role": "readWrite",
      "db": "game"
    }
  ]
})
After creating the user, I reverted my changes to noauth = true and auth = true, and restarted MongoDB. Then I was able to connect with mongoose like this:
mongoose.connect('mongodb://NEW USERNAME:NEW PASSWORD@127.0.0.1:27017/game');
With MongoDB 3.0, a new authentication mechanism was added (more details in the links below).
Authentication information: https://docs.mongodb.com/manual/core/authentication/
How to use the new authentication mechanism: https://www.mongodb.com/blog/post/improved-password-based-authentication-mongodb-30-scram-explained-part-2
Due to this, the guide provides that workaround to get the connection to the database working. Now that you have created that user with "readWrite" privileges on your database, you are able to use it.
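One related gotcha, in case a connection like this still fails: if the user was created while game was the current database, MongoDB authenticates against that database, and it sometimes has to be spelled out in the connection string. A hedged sketch with an explicit authSource (placeholder credentials as above):

mongoose.connect('mongodb://NEW_USERNAME:NEW_PASSWORD@127.0.0.1:27017/game?authSource=game');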
