Close Database Connection (Firebase+Node.js)

I am currently working on Node.js and Firebase, and I am stuck on an issue: after writing data to the database, my program does not exit in the terminal. Is there a way I can end it?
I guess I may need to close the Firebase connection, but I have no idea how to do that.
Here is the program:
module.exports = {
    create_firebase_entry: function () {
        console.log("Firebase Entry")
        return new Promise((resolve, reject) => {
            var secondaryAppConfig = {
                credential: admin.credential.cert(serviceAccount),
                databaseURL: "firebase_database_url"
            };
            if (admin.apps.length === 0) {
                secondary = admin.initializeApp(secondaryAppConfig, "secondary");
            }
            // Retrieve the database.
            var db = secondary.database();
            var ref = db.ref("data_records");
            var usersRef = ref.child("My_Data");
            // Settle the promise once the write completes (or fails)
            usersRef.set({ Data1: "Trail Data" }).then(resolve, reject)
        })
    }
}

The script opens a connection to the Firebase Realtime Database, and then performs asynchronous read and write operations over that connection. Since the script has no way to know when you're done, you will have to signal this yourself.
The most common way to do this (that I know of) is to call process.exit() when you're done. E.g.
usersRef.set({ Data1: "Trail Data" })
    .then(function() {
        process.exit(0);
    });
Also see
How to properly exit firebase-admin nodejs script when all transaction is completed
Exit from a NodeJS script when all asynchronous tasks are done
node process doesn't exit after firebase once
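Aside from process.exit(), firebase-admin apps also expose a delete() method that releases the app's resources, which lets the process exit on its own. A minimal sketch (writeAndShutdown is a hypothetical helper; app stands in for admin.app('secondary') from the question):

```javascript
// Sketch: let the process exit naturally by deleting the admin app.
// `usersRef` and `app` are assumed to come from the firebase-admin SDK;
// app.delete() returns a promise that resolves once resources are freed.
async function writeAndShutdown(usersRef, app) {
    await writeEntry(usersRef);
    await app.delete(); // releases the connections held by the app
}

async function writeEntry(usersRef) {
    return usersRef.set({ Data1: "Trail Data" });
}
```

With the question's code this would be called as writeAndShutdown(usersRef, admin.app('secondary')).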

I agree with Frank, but I think calling process.exit(0) inside a .finally() block gives a stronger guarantee: if an error occurs, the .then() callback never runs, and the process never exits.
usersRef.set({ Data1: "Trail Data" })
    .then(result => {
        // ...
    })
    .catch(err => {
        // ...
    })
    .finally(() => {
        process.exit(0)
    })
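The guarantee .finally() provides can be checked with plain promises, no Firebase required (names here are illustrative, not from the question's code):

```javascript
// cleanup runs whether the task resolves or rejects, which is why
// .finally() is a safer place for process.exit(0) than .then() alone.
function runWithCleanup(task, cleanup) {
    return Promise.resolve()
        .then(task)
        .catch(err => console.error(err.message)) // swallow for the demo
        .finally(cleanup);
}
```

Replacing cleanup with () => process.exit(0) reproduces the pattern above.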

Related

node-postgres script does not end after it is finished

I have a script called scheduler() which runs some queries on a Postgres database and then finishes. The problem is that the script hangs in the terminal and does not exit afterwards.
Here is my code:
scheduler.js
const {pool} = require("./db");

const scheduler = async function() {
    try {
        await auctionCheck();
        return pool.end().then(() => {
            console.log('Pool\'s closed');
            return;
        })
    } catch (e) {
        console.error(e)
        return;
    }
}

return scheduler();
db.js
const {Pool} = require("pg");

const pool = new Pool({
    connectionString: process.env.DATABASE_URL,
});

pool.on('error', (err, client) => {
    console.error('Unexpected error on idle client', err)
    process.exit(-1)
})

pool.connect();

module.exports = {pool};
When I didn't have the pool.end() call, my script didn't exit, but it does not exit with the end() call either; the "Pool's closed" log is never printed. The node-postgres documentation says this is how to close connections (I assume the script isn't finishing because of an open DB connection). I basically want to run some functions periodically and then have the script finish, but currently it just stays alive. What am I doing wrong?
It seems the reason was that db.js called pool.connect(). When I then called pool.end(), the pool couldn't close because not all of its clients had been returned: the one checked out in db.js was never released. When I removed pool.connect() from db.js, the process exited as it should.
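The underlying rule is that pool.end() can only settle once every client checked out with pool.connect() has been released. A sketch of a helper that enforces this discipline (withClient is hypothetical, not part of node-postgres):

```javascript
// Acquire a client, run the work, and always release the client,
// even if the work throws. A client that is never released is
// exactly what made pool.end() hang in the question above.
async function withClient(pool, work) {
    const client = await pool.connect();
    try {
        return await work(client);
    } finally {
        client.release(); // hand the client back so end() can finish
    }
}
```

With node-postgres this would be used as withClient(pool, c => c.query('SELECT 1')) before calling pool.end().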

Wait for DB initialization before running Jest tests

I have to add some features, and their corresponding tests, to an API backend I developed some months ago using NodeJS. Checking the test cases I wrote back then, I've come across this code:
index-route.spec.js
const app = require('../../app');
const request = require('supertest');

function delay() {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            resolve();
        }, 3000);
    });
}

beforeAll(async () => {
    await delay();
});
// START TESTS
The reason for this delay is to give the application time to initialize the database connection (I use the Sequelize ORM). This is done in a file (db.js) required from the application's main file (app.js):
db.js
async function run() {
    try {
        // Creates database if it doesn't exist
        await DB_init();
        // Test database connection
        await DB_test();
        // Connection was OK, sync tables and relationships
        await DB_sync();
        // Check if the DB is empty and, in this case, fill the basic data
        await DB_init_values();
    } catch (error) {
        log.error("Couldn't connect to the database '" + dbName + "':", error);
        process.exit(1);
    }
}

// Init the DB
run();
This works, but now that I have to add more tests, I'd like to refactor this code to remove the artificial delay. How can I wait for the database initialization to finish before the tests start? It seems it's not possible to require a module synchronously, is it?
Cheers,
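One way to remove the delay (a sketch; createInitializer and its names are hypothetical) is to have db.js export the initialization promise itself instead of calling run() fire-and-forget, so the spec file can await it in beforeAll:

```javascript
// Wrap the async init steps and expose the promise. db.js would export
// { ready: run() }, and the spec file would replace the fixed delay with
// `beforeAll(() => require('../../db').ready)`.
function createInitializer(steps) {
    let initialized = false;
    const ready = (async () => {
        for (const step of steps) {
            await step(); // e.g. DB_init, DB_test, DB_sync, DB_init_values
        }
        initialized = true;
    })();
    return { ready, isInitialized: () => initialized };
}
```

Jest waits for the promise returned from beforeAll, so the tests start exactly when initialization finishes rather than after a guessed 3 seconds.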

Why does my Node.js MS SQL script never end unless I CTRL+C out of it?

The await keyword can only be used inside an async function, so I created an async main() and executed it at the global level. Everything runs correctly, but then the program sits in the event loop and never ends. I could add process.exit(), but that seems heavy-handed.
const mssql = require('mssql');

;(async function main() {
    console.log("Started");
    try {
        await mssql.connect(process.env.CONNECTION_STRING);
        const result = await mssql.query`SELECT CHECKSUM('a')`;
        console.dir(result);
        console.log("Success!");
    } catch (err) {
        console.log(err);
        console.log("Failed!");
    }
    console.log("Finished");
})();
I assume this has to do with the mssql module peculiarities more than anything.
Is the connection closed automatically after the query, or not?
If not, I think the problem is that you need to close the connection; once it is closed, the process will end.
I hope this helps!
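For the mssql module specifically, the global pool can be closed with mssql.close(); putting it in a finally block covers both success and failure. A generic sketch (runOnce is a hypothetical helper; client stands in for the mssql module):

```javascript
// Connect, do the work, and always close, so no open socket keeps
// the event loop alive and the process can exit without process.exit().
async function runOnce(client, work) {
    await client.connect();
    try {
        return await work();
    } finally {
        await client.close(); // with mssql this tears down the global pool
    }
}
```

In the question's script this would be awaited as runOnce(mssql, async () => { const result = await mssql.query`SELECT CHECKSUM('a')`; console.dir(result); }).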

How to handle errors using the Tedious driver with Node

The Problem:
I keep running into issues with typos in my SQL queries and the tedious driver isn't helping much. When there are errors, it does a fine job telling me the type of error and what went wrong, but the stack trace never contains any of my project files. This is making it difficult to debug, especially when I'm making multiple queries in the same function. I'm currently using the mssql library, but experienced the same problem using tedious directly. Plus, mssql isn't mentioned in the error's stack at all, so I figure it's not the problem.
What I've Tried:
Looking for issues on the tedious GitHub page
Added a conn.on('error') listener, which doesn't appear to do anything.
Added process.on() listeners for 'uncaughtException' and 'unhandledRejection'.
Moved the initialization of the ConnectionPool into its own function to try and catch the error there. No luck.
Some Code:
Here's a barebones example of what I'm using to run the queries:
async function dbConnect() {
    const conn = await new ConnectionPool(dbConfig).connect();
    conn.on('error', error => {
        console.error('*Connection Error:', error);
    });
    return conn;
}

async function runQuery() {
    const conn = await dbConnect();
    await conn.request().query(`SELECT Thing FROM Place`);
    await conn.close();
}

runQuery()
    .then(results => {
        console.log(results);
        process.exit(0);
    })
    .catch(error => {
        console.error(error);
        process.exit(1);
    });
The Error:
Hopefully, this is something that's possible. If not, is there a different Node driver for SQL Server that someone's come up with?
Edit: Added an error listener on request
async function dbConnect() {
    const conn = await new ConnectionPool(dbConfig).connect();
    const request = new Request(conn);
    conn.on('error', error => {
        console.error('*Connection Error:', error);
    });
    request.on('error', error => {
        console.error('*Request Error:', error);
    });
    return request;
}

async function runQuery() {
    const request = await dbConnect();
    await request.query(`SELECT Thing FROM Place`);
}
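One workaround (a sketch, not part of mssql or tedious) is to create an Error at the call site before awaiting the query. Its stack trace then points at your own file, and the driver's message can be appended when the query rejects:

```javascript
// Capture a stack trace where the query is issued, since errors that
// surface from the driver's internals contain no user frames.
async function tracedQuery(conn, sql) {
    const callSite = new Error(`Query failed: ${sql}`); // stack recorded here
    try {
        return await conn.request().query(sql);
    } catch (err) {
        callSite.message += `: ${err.message}`;
        throw callSite; // rethrown error's stack points at this file
    }
}
```

This keeps the driver's error text while making the trace show which of several queries in a function actually failed.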

mongoose.connection.collections.collection.drop() throws error every other time

I am setting up testing with Jest for a Node/Express/Mongo project. I have tried to write a function to clear collections so each test starts with a clean slate:
const clearCollection = (collectionName, done) => {
    const collection = mongoose.connection.collections[collectionName]
    collection.drop(err => {
        if (err) throw new Error(err)
        else done()
    })
}

beforeEach(done => {
    clearCollection('users', done)
})
And another try, with promises:
const clearCollection = collectionName => {
    const collection = mongoose.connection.collections[collectionName]
    return collection.drop()
}

beforeEach(async () => {
    await clearCollection('users')
})
The problem is that they both alternate between working and throwing an error. Every time I save the file, it either works perfectly or throws an error, alternating each time. The error is always one of:
MongoError: cannot perform operation: a background operation is currently running for collection auth_test.users
MongoError: ns not found
I can get it to work 100% of the time (limited by the stack anyway) by making clearCollection() call itself inside a catch(), but this feels so wrong:
const clearCollection = collectionName => {
    const collection = mongoose.connection.collections[collectionName]
    return collection.drop()
        .catch(() => clearCollection(collectionName))
}
I don't know why mongoose.connection.collections.<collection>.drop() randomly throws errors, but there is a simple way to remove all the documents in Mongoose, which works just fine for resetting the collection before tests:
beforeAll(async () => {
    await User.remove({})
})
Works every time.
I was having a similar issue while trying to drop the database at the beginning of the tests and populate it right after. In the first run, the collections would be created; in the next, I would get a 'database is in the process of being dropped.' error; and it kept alternating like this.
I was also using an in-memory MongoDB, running $ run-rs -v 4.0.0 -s (https://www.npmjs.com/package/run-rs) in a separate terminal window before running my Mocha tests. I'm on Mongoose 5.2.0 and Mocha 5.1.1.
I've found out that Mongoose does not necessarily execute the drop commands immediately. Instead, it schedules them for when the connection is up.
So there can be a race condition: the promise of the drop command resolves and the code moves on to creating your collections, but the drop hasn't actually finished running yet, so creating the new collections fails. The drop then completes, which means that the next time you run your tests the database (or collection) is already dropped, and the inserts succeed again.
So, this is how I've solved it.
Run in the before hook:
test.dropDb = function(done) {
    this.timeout(0)
    let database = your_MongoDB_URI
    mongoose.connect(database, function (err) {
        if (err) throw err;
        mongoose.connection.db.dropDatabase(function(err, result) {
            console.log('database dropping was scheduled');
        });
    })
    mongoose.connection.on('connected', function() {
        setTimeout(function() { done() }, 3000);
    });
}
Run in a nested before hook:
test.createDb = function(done) {
    this.timeout(0)
    let createDb = require('./create-db.js')
    createDb() // here I call all my MyCollections.insertMany(myData)...
        .then(function() {
            done()
        })
        .catch(function(err) {
            assert(null == err)
        });
}
Run in the after hook
test.afterAll = function(done) {
    mongoose.connection.close()
        .then(function() {
            done()
        })
}
I don't know why I had to explicitly close the connection in the after hook. I guess it has something to do with the way run-rs is programmed, but I didn't look into it. I just know that, in my case, closing the connection after the test was mandatory to get the expected behavior.
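A hedged alternative to the fixed 3-second timeout above: mongoose.connect() has returned a promise since Mongoose 5, so the hooks can await actual completion instead of sleeping. A sketch, with db standing in for a mongoose-like handle:

```javascript
// Await the connection and the drop themselves rather than guessing
// how long they take with setTimeout.
async function resetDatabase(db, uri) {
    await db.connect(uri);   // resolves once the connection is up
    await db.dropDatabase(); // resolves once the drop has completed
}
```

With real Mongoose this corresponds to awaiting mongoose.connect(database) and then mongoose.connection.db.dropDatabase() inside an async before hook.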
