Node.js (TypeScript) Postgres client does not appear to run INSERT statement on Pool.query, and callback does not execute

Hopefully this is a minimal reproducible example.
routes.ts:
import express, { Request, Response } from "express";
import { QueryResult, Pool } from "pg";

const pool = new Pool({
  user: process.env.DOCKER_USER,
  host: "localhost",
  database: process.env.DOCKER_DB,
  password: process.env.DOCKER_PASSWORD,
  port: 5432,
});

const router = express.Router();

router.post("/log/hub", (req: Request, res: Response) => {
  console.log("Made it here!");
  const username = "PLACEHOLDER";
  const json = { a: "b" };
  const now = new Date();
  const cachedId = "12345";
  pool.query(
    "INSERT INTO data_actions (username, json_payload, cache_id, action_timestamp) VALUES ($1, $2, $3, $4)",
    [username, json, cachedId, now],
    (error: Error, results: QueryResult) => {
      if (error) {
        throw error;
      }
      res.status(201).send(results);
    }
  );
});

export = router;
echo $DOCKER_USER,$DOCKER_DB,$DOCKER_PASSWORD
produces
docker, docker, docker
I have verified in the client that these values match the values printed above and that they match the credentials set up in the Docker container for Postgres. I'm able to connect to the database on port 5432 with pgAdmin4 and view the database I expect, which looks like this:
          List of relations
 Schema |     Name     | Type  | Owner
--------+--------------+-------+--------
 public | data_actions | table | docker
(1 row)
SELECT * FROM data_actions;
produces
username | json_payload | cache_id | action_timestamp
----------+--------------+----------+------------------
(0 rows)
I am able to reach the Express router endpoint at localhost:5000, so everything downstream is not the problem. I think the issue lies somewhere with how I'm using pg. Does anything obvious stand out here? I have a feeling I'm missing something small, and I'm banging my head against the keyboard trying to figure out what's going wrong.
EDIT: The relation list shows 1 row where the SELECT statement returns 0 rows. This is because I deleted the row I inserted from pgAdmin before I posted here. Sorry for the red herring.

Try updating the pg module. Old versions might not work with newer Node.js versions.
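Separately from the pg version: throwing inside the query callback means a failed INSERT crashes the handler without ever sending a response, which can look like "the callback does not execute." pg's promise API makes the error path explicit. A minimal sketch, with the query text taken from the question and the pool injected so the snippet is self-contained (the route wiring in comments is a hypothetical rewrite, not the asker's code):

```javascript
// Sketch: the same INSERT using pg's promise API instead of a callback.
// `pool` is anything with a promise-returning query(text, params) method,
// so this can be exercised without a live database.
async function insertAction(pool, { username, json, cachedId, now }) {
  // pool.query returns a promise when no callback is passed
  const result = await pool.query(
    "INSERT INTO data_actions (username, json_payload, cache_id, action_timestamp) VALUES ($1, $2, $3, $4)",
    [username, json, cachedId, now]
  );
  return result.rowCount; // rows inserted (1 on success)
}

// In the route, report failures instead of throwing (hypothetical wiring):
// router.post("/log/hub", async (req, res) => {
//   try {
//     res.status(201).send({ inserted: await insertAction(pool, { ... }) });
//   } catch (err) {
//     console.error(err);
//     res.status(500).send("insert failed");
//   }
// });
```

With this shape, a bad column name or credential surfaces as a logged error and a 500 response rather than a silent hang.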

Related

Render Hosted Postgres database throws ECONNRESET on any connection attempt Node Express App

I am unable to query my Postgres DB in Render. I researched this issue online and found that it was a common problem but none of the solutions I found worked.
Here is my server-side code (I am using Node.js, Express, TypeScript and Postgres):
import postgres, { RowList, Row } from 'postgres'
import appconfig from '../app.config'

type Query = (sql: string) => Promise<RowList<Row[]>>

const query: Query = async (sql) => {
  try {
    const q = postgres({
      host: appconfig.database.host,
      port: appconfig.database.port,
      database: appconfig.database.schema,
      username: appconfig.database.username,
      password: appconfig.database.password,
    })
    const res = await q`${sql}`
    return res
  } catch (err) {
    throw err
  }
}

export default query
I receive the following error every time and have not had a successful attempt. It's worth noting that I have no issues connecting from pgAdmin on the same PC with the same credentials:
Error: read ECONNRESET
    at TCP.onStreamRead (node:internal/stream_base_commons:217:20)
    at cachedError (C:\Users\xxxxx\repos\one-watch\node_modules\postgres\cjs\src\query.js:171:23)
    at new Query (C:\Users\xxxxx\repos\one-watch\node_modules\postgres\cjs\src\query.js:36:24)
    at sql (C:\Users\xxxxx\repos\one-watch\node_modules\postgres\cjs\src\index.js:111:11)
    at C:\Users\xxxxx\repos\one-watch\src\database\query.ts:15:24
I have never used Postgres before; all of my database experience has been with MySQL up to this point. I don't expect this is a Postgres problem, though, but potentially just a nuance of Render's implementation of their Postgres service. Any help would be greatly appreciated, thanks!
The only related articles I've found, like this one, describe cases where at least some connection attempts succeed. In my case they are all failing.
Adding
?sslmode=no-verify
at the end of the URL worked for me.
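If you'd rather not hand-edit the string, Node's built-in URL class can append that parameter without clobbering anything already in the query string. A small sketch (the connection strings below are made-up examples, not Render's actual format):

```javascript
// Sketch: append sslmode=no-verify to an existing connection URL.
// Works whether or not the URL already carries query parameters.
function withNoVerify(connectionString) {
  const url = new URL(connectionString);
  url.searchParams.set("sslmode", "no-verify");
  return url.toString();
}
```

Note that no-verify keeps the connection encrypted but skips certificate verification, so it's a workaround for self-signed or proxy certificates rather than a full TLS setup.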

MongoDB Quick Start fails, keeps returning "null" on Terminal

Hi, I am self-learning MongoDB (with Node.js). Totally new to programming.
My first Node.js application doesn't return the MongoDB document like it's supposed to.
What I want to achieve:
To work with the native MongoDB driver, and to complete the quick start procedure on the MongoDB website: https://www.mongodb.com/docs/drivers/node/current/quick-start/
What I have tried so far:
Installed node & npm correctly;
Installed mongodb@4.8 correctly;
Initialized all these via Terminal;
Set up Atlas, obtained connection string.
Still, when I put the template (obtained from the MongoDB quick start tutorial) into my server.js file and entered "npx nodemon app.js" to test, it returned "null".
Here's code I put into server.js: (all account & password typed in correctly)
const { MongoClient } = require("mongodb");

// const uri = "mongodb://localhost:27017";
const uri = "mongodb+srv://<myClusterUsername>:<myPassword>@cluster0.fytvkcs.mongodb.net/?retryWrites=true&w=majority";

const client = new MongoClient(uri);

async function run() {
  try {
    const database = client.db('sample_mflix');
    const movies = database.collection('movies');
    // Query for a movie that has the title 'Back to the Future'
    const query = { title: 'Back to the Future' };
    const movie = await movies.findOne(query);
    console.log(movie);
  } finally {
    // Ensures that the client will close when you finish/error
    await client.close();
  }
}

run().catch(console.dir);
As you can see, I also tried the uri localhost:27017, but the output on my Terminal is still "null".
According to MongoDB, it was supposed to return such online sample doc:
{
  _id: ...,
  plot: 'A young man is accidentally sent 30 years into the past...',
  genres: [ 'Adventure', 'Comedy', 'Sci-Fi' ],
  ...
  title: 'Back to the Future',
  ...
}
Your help would be appreciated! Thanks very much!
You should open the folder in Visual Studio Code like this: [screenshot]
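Whatever the editor setup, note that findOne resolves to null when no document matches the filter, so a null result usually means the sample_mflix data set isn't loaded (or the query ran against the wrong cluster) rather than a code error. A small sketch of that distinction, using a stand-in collection object so it runs without Atlas:

```javascript
// Sketch: null from findOne means "no matching document", not a failure.
// `collection` is any object with a promise-returning findOne(filter) method.
async function describeLookup(collection, title) {
  const doc = await collection.findOne({ title });
  return doc === null ? "no match (is the sample data loaded?)" : doc.title;
}
```

In the Atlas UI, "Load Sample Dataset" is what creates sample_mflix; without it, the quick-start query has nothing to find.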

Cannot connect to DeepStream Node.js server if using any custom plugins

So, if I use my DeepStream server as the following
    const {Deepstream} = require('@deepstream/server')
    const server = new Deepstream()
    server.start()
it's working just fine. I can connect to it from my frontend app like the following
    const {DeepstreamClient} = require('@deepstream/client')
    const client = new DeepstreamClient('192.168.88.238:6020')
    client.login()
but if I add a MongoDB storage instance or RethinkDB
NPM - RethinkDB
    const {Deepstream} = require('@deepstream/server')
    const server = new Deepstream({
      storage: {
        name: 'rethinkdb',
        options: {
          host: 'localhost',
          port: 28015
        }
      }
    })
    // start the server
    server.start()
I get the following error message when trying to reach my ds server.
(I've also tried to connect via WSS:// instead of WS://)
So hi everybody who has the same problem as me...
I figured it out!
First of all, what the npm package's documentation says about the usage of the MongoDB driver is completely out of date.
This is how they say you should use the npm package:
    var Deepstream = require( 'deepstream.io' ),
        MongoDBStorageConnector = require( 'deepstream.io-storage-mongodb' ),
        server = new Deepstream();
    server.set( 'storage', new MongoDBStorageConnector( {
        connectionString: 'mongodb://test:test@paulo.mongohq.com:10087/munchkin-dev',
        splitChar: '/'
    }));
    server.start();
INSTEAD OF ALL THIS!
You don't really need 'deepstream.io-storage-mongodb', because it's an old module, and you don't really need to use it this way...
The correct usage of the MongoDB connector:
    const {Deepstream} = require('@deepstream/server');
    const server = new Deepstream({
      storage: {
        name: "mongodb",
        options: {
          connectionString: MONGO_CONNECTION_STRING,
          splitChar: '/'
        }
      }
    });
    server.start();
or you can also create a config.yaml file from all this:
    storage:
      name: mongodb
      options:
        connectionString: 'MONGO_CONNECTION_STRING'
        # optional database name, defaults to `deepstream`
        database: 'DATABASE_NAME'
        # optional table name for records without a splitChar
        # defaults to deepstream_docs
        defaultCollection: 'COLLECTION_NAME'
        # optional character that's used as part of the
        # record names to split it into a table and an id part
        splitChar: '/'
and pass it to the Deepstream constructor as below:
    const {Deepstream} = require('@deepstream/server');
    const server = new Deepstream('config.yaml');
    server.start();

Can you keep a PostgreSQL connection alive from within a Next.js API?

I'm using Next.js for my side project. I have a PostgreSQL database hosted on ElephantSQL. Inside the Next.js project, I have a GraphQL API set up, using the apollo-server-micro package.
Inside the file where the GraphQL API is set up (/api/graphql), I import a database helper-module. Inside that, I set up a pool connection and export a function which uses a client from the pool to execute a query and return the result. This looks something like this:
// import node-postgres module
import { Pool } from 'pg'

// set up pool connection using environment variables with a maximum of three active clients at a time
const pool = new Pool({ max: 3 })

// query function which uses next available client to execute a single query and return results on success
export async function queryPool(query) {
  let payload
  // checkout a client
  try {
    // try executing queries
    const res = await pool.query(query)
    payload = res.rows
  } catch (e) {
    console.error(e)
  }
  return payload
}
The problem I'm running into is that the Next.js API doesn't (always) keep the connection alive but rather opens up a new one (either for every connected user or maybe even for every API query), which results in the database quickly running out of connections.
I believe that what I'm trying to achieve is possible for example in AWS Lambda (by setting context.callbackWaitsForEmptyEventLoop to false).
It is very possible that I don't have a proper understanding of how serverless functions work and this might not be possible at all but maybe someone can suggest me a solution.
I have found a package called serverless-postgres and I wonder if that might be able to solve it but I'd prefer to use the node-postgres package instead as it has much better documentation. Another option would probably be to move away from the integrated API functionality entirely and build a dedicated backend-server, which maintains the database connection but obviously this would be a last resort.
I haven't stress-tested this yet, but it appears that the mongodb next.js example solves this problem by attaching the database connection to global in a helper function. The important bit in their example is here.
Since the pg connection is a bit more abstract than mongodb, it appears this approach just takes a few lines for us pg enthusiasts:
// eg, lib/db.js
const { Pool } = require("pg");

if (!global.db) {
  global.db = { pool: null };
}

export function connectToDatabase() {
  if (!global.db.pool) {
    console.log("No pool available, creating new pool.");
    global.db.pool = new Pool();
  }
  return global.db;
}
then in, eg, our API route, we can just:
// eg, pages/api/now
export default async (req, res) => {
  const { pool } = connectToDatabase();
  try {
    const time = (await pool.query("SELECT NOW()")).rows[0].now;
    res.end(`time: ${time}`);
  } catch (e) {
    console.error(e);
    res.status(500).end("Error");
  }
};
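The core of this trick is easy to verify in isolation: because the pool is stashed on global, repeated calls (even from separately evaluated copies of the module) reuse the same object. A reduced sketch with the pool factory injected, so it runs without pg or a database:

```javascript
// Sketch: the global-cache pattern from the answer, with the Pool constructor
// injected as `createPool` so the reuse behavior can be checked directly.
function connectToDatabase(createPool) {
  if (!global.db) {
    global.db = { pool: null };
  }
  if (!global.db.pool) {
    global.db.pool = createPool(); // only runs on the first call
  }
  return global.db;
}
```

One caveat: in a serverless deployment each runtime instance has its own global, so this bounds connections per instance rather than globally; it still prevents the per-request pool churn described in the question.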

TypeScript, NodeJs/Express and Mongo (with VS 2013)

I'm trying to create a Node (and Express) based app which is functionally similar to a WebAPI app using Mongo as the data store. I'm trying to use TypeScript (and also VS 2013).
I've gotten it to work, and now I'm trying to clean it up.
First, Express 4.1.1 is available, but no typings for it are available yet.
Second, what's the proper way to access Mongo, both making a connection and querying a collection, such that it's async the way Node would like it to be (so I'm not blocking, etc.)? Should I be using q/Promises?
Third, what's the proper way (in this setup) to access a method in another file? I've wrestled for a while with import/export/require/module/class to get to what I think seems OK, but what a pain to get there. Here's what I have:
app.ts, with some imports, etc. and a reference:
    /// <reference path='./scripts/typings/node/node.d.ts' />
    import express = require('express');
    import http = require('http');
    import path = require('path');
    import badgeApi = require('./routes/api/Badge');

    var app = express();
    ...
    new badgeApi.Badge();
    app.get('/badge', badgeApi.Badge.ListAll);
    ...
And then, some generic DB code:
    /// <reference path='../../scripts/typings/mongodb/mongodb.d.ts' />
    import util = require('util');
    import mongodb = require('mongodb');

    export var ConnectDataStore: (dbname: string, dbuser: string, dbpass: string) => mongodb.Db;

    ConnectDataStore = (dbname: string, dbuser: string = 'xxxx', dbpass: string = 'xxxx') => {
      var baseMongoConnect: string = 'xxxx';
      var mongoConnect = util.format(baseMongoConnect, dbuser, dbpass, dbname);
      var DB: mongodb.Db;
      mongodb.MongoClient.connect(mongoConnect, (err, db) => {
        if (!err) {
          DB = db;
          return DB;
        }
        else {
          throw err;
        }
      });
    }
And then Badge.ts:
    import util = require('util');
    import express = require('express');
    import mongodb = require('mongodb');
    import datastore = require('./DataStore');

    export class Badge {
      private static DB: mongodb.Db = null;
      constructor() {
        Badge.DB = datastore.ConnectDataStore('credential', 'xxxx', 'xxxx')
      }
      public static ListAll(req: express.Request, res: express.Response): void {
        Badge.DB.collection('badge', (collErr, coll) => {
          coll.find().toArray((arrayErr, badges) => {
            if (badges.length > 0) {
              res.jsonp(200, badges);
            }
            else {
              res.send(200, 'No Badges');
            }
          });
        });
      }
    }
Now I know this doesn't exactly work: ConnectDataStore is wrong (it doesn't return a Db as it's supposed to). I was starting to work around that by having it return a promise when I noticed I was using Express 3.5.2, not 4.1.1, and it all finally got to me: this stuff is not very solid.
Of course, I'm probably missing something, but I don't know what!
Any direction around this soup would be appreciated!
Until there's a definition file for Express 4+, I'd suggest you use JavaScript for the core of the application (as it shouldn't need to change much) and use TypeScript for things like application logic (controllers, etc.). In a Node.js app, it's easy to mix and match.
I'd use a promise to handle the MongoDB connection, or make the connection early enough that the application hasn't actually started serving web pages until the MongoDB connection has been successfully established.
The 3.0 typings for Express may work with the 4+ version in many places, as the signatures are often the same.
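For the connection question, one common promise-based shape caches the connection behind a promise so every caller can simply await it; the names here (connectDataStore, the injected connect function) are illustrative stand-ins for the question's ConnectDataStore and mongodb.MongoClient.connect, not a definitive implementation:

```javascript
// Sketch: cache the database connection behind a promise. The first call
// starts connecting; later calls await the same in-flight/settled promise.
// `connect` is injected so the sketch runs without a real MongoDB server.
let dbPromise = null;
function connectDataStore(connect, url) {
  if (!dbPromise) {
    dbPromise = connect(url); // kicked off exactly once
  }
  return dbPromise;
}
```

Badge.ListAll would then await connectDataStore(...) (or chain .then) instead of reading a static DB field that may not have been assigned yet when the request arrives.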
