I am new to MongoDB. I am trying to query different collections, and when I fetch data from the category collection (the equivalent of a SELECT * in SQL) it throws: MongoError: pool destroyed.
My understanding is that some find({}) is creating a pool, and that pool is being destroyed.
The code I am using inside the model is below:
const MongoClient = require('mongodb').MongoClient;
const dbConfig = require('../configurations/database.config.js');
export const getAllCategoriesApi = (req, res, next) => {
  return new Promise((resolve, reject) => {
    let finalCategory = [];
    const client = new MongoClient(dbConfig.url, { useNewUrlParser: true });
    client.connect(err => {
      const collection = client.db(dbConfig.db).collection("categories");
      debugger
      if (err) throw err;
      let query = { CAT_PARENT: { $eq: '0' } };
      collection.find(query).toArray(function(err, data) {
        if (err) return next(err);
        finalCategory.push(data);
        resolve(finalCategory);
        // db.close();
      });
      client.close();
    });
  });
}
What I have found is that when I use
let query = { CAT_PARENT: { $eq: '0' } };
collection.find(query).toArray(function(err, data) {})
find(query) returns data, but with {} or with $gte/$gt it throws the pool error.
The code I have written in the controller is below:
import { getAllCategoriesApi } from '../models/fetchAllCategory';
const redis = require("redis");
const client = redis.createClient(process.env.REDIS_PORT);

export const getAllCategoriesListData = (req, res, next, query) => {
  // Try fetching the result from Redis first in case we have it cached
  return client.get(`allstorescategory:${query}`, (err, result) => {
    if (result) {
      // Key exists in the Redis store
      res.send(result);
    } else {
      // Key does not exist in the Redis store
      getAllCategoriesApi(req, res, next).then(function (data) {
        const responseJSON = data;
        // Cache the API response in the Redis store for an hour
        client.setex(`allstorescategory:${query}`, 3600, JSON.stringify({ source: 'Redis Cache', responseJSON }));
        res.send(responseJSON);
      }).catch(function (err) {
        console.log(err);
      });
    }
  });
}
Can anyone tell me what mistake I am making here, and how I can fix the pool issue? Thanks in advance.
I assume that toArray is asynchronous, i.e. it invokes the callback passed in as results become available (read from the network).
If this is true, the client.close() call is going to execute before the results have been read, which likely yields your error.
The close call needs to happen after you have finished iterating the results.
Separately from this, you should probably not be creating the client instance in the request handler like this. Client instances are expensive to create (they must talk to all of the servers in the deployment before they can actually perform queries) and generally should be created per running process rather than per request.
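Putting both fixes together, a minimal sketch (untested, assuming the same dbConfig shape as above) might look like this: create the client once at module load, reuse it per request, and let the promise returned by toArray() complete before anything is closed:

const MongoClient = require('mongodb').MongoClient;
const dbConfig = require('../configurations/database.config.js');

// Create the client once per process and reuse it for every request.
const client = new MongoClient(dbConfig.url, { useNewUrlParser: true });
const clientReady = client.connect();

export const getAllCategoriesApi = (req, res, next) => {
  return clientReady
    .then(() => {
      const collection = client.db(dbConfig.db).collection('categories');
      // Without a callback, toArray() returns a promise, so nothing
      // can close the client before the results have been read.
      return collection.find({ CAT_PARENT: { $eq: '0' } }).toArray();
    })
    .then(data => [data]); // same shape as the original finalCategory array
};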
I am trying to create an array of JSON objects from a SQL Server query using Node.js. Calling JSON.stringify on each row gives me the results I am looking for as individual JSON objects, but not in an array. Ideally I just want to write these results to a file as a JSON array. Any ideas appreciated.
I tried the following:
const sql = require('mssql');
const fs = require('fs');

const config = {
  // creds removed
};

sql.connect(config, err => {
  console.log(err);
  const request = new sql.Request();
  request.stream = true; // You can set streaming differently for each request
  request.query(fs.readFileSync('./new-query.sql').toString()); // or request.execute(procedure)
  request.on('row', row => {
    console.log(JSON.stringify(row));
  });
  request.on('error', err => {
    console.log(err);
    // May be emitted multiple times
  });
  request.on('done', result => {
    // console.log('done emitted', result);
    sql.close();
  });
});
As always, reading the library notes in more detail led me to a solution. Using the mssql library you can work with the entire returned recordset instead of streaming row by row. The following code works as I wanted:
const sql = require('mssql');
const fs = require('fs');

const config = {
  // creds removed
};

sql.connect(config).then(() => {
  return sql.query(fs.readFileSync('./new-query.sql').toString());
}).then(result => {
  console.log(JSON.stringify(result.recordsets[0]));
}).catch(err => {
  // ... error checks
});

sql.on('error', err => {
  // ... error handler
});
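Since the original goal was a file rather than the console, the last step could write the array out instead of logging it; a small variation on the block above (the output path ./results.json is just an example):

sql.connect(config).then(() => {
  return sql.query(fs.readFileSync('./new-query.sql').toString());
}).then(result => {
  // result.recordsets[0] is already an array of row objects, so a single
  // stringify call produces the JSON array the question asked for.
  fs.writeFileSync('./results.json', JSON.stringify(result.recordsets[0], null, 2));
}).catch(err => {
  console.log(err);
});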
I'm just starting to use MongoDB without Mongoose (to get away from the schemas), and I wanted to create a simple module with various exported functions to use in the rest of my app. I've pasted the code below.
The problem I'm having is that databasesList.databases comes back as undefined, and I'm not sure why. There should be two databases on my cluster, and one collection in each database.
As a tangential question, I thought maybe I would check the collections instead (now commented out), but though I found this page (https://docs.mongodb.com/manual/reference/method/db.getCollectionNames/), the function getCollectionNames seems not to exist. Now I'm wondering if I'm using the wrong documentation and that is why my databases come back undefined.
const client = new MongoClient(uri);

const connection = client.connect(function (err, database) {
  if (err) throw err;
  else if (!database) console.log('Unknown error connecting to database');
  else {
    console.log('Connected to MongoDB database server');
  }
});

module.exports = {
  getDatabaseList: function() {
    console.log('start ' + client);
    databasesList = client.db().admin().listDatabases();
    //collectionList = client.db().getCollectionNames();
    //console.log("Collections: " + collectionList);
    console.log("Databases: " + databasesList.databases);
    //databasesList.databases.forEach(db => console.log(` - ${db.name}`));
  }
}
Your code is correct; you just need to change a few things:
module.exports = {
  getDatabaseList: async function() {
    console.log('start ' + client);
    const databasesList = await client.db().admin().listDatabases();
    //collectionList = await client.db().getCollectionNames();
    //console.log("Collections: " + collectionList);
    console.log("Databases: " + databasesList.databases);
    databasesList.databases.forEach(db => console.log(` - ${db.name}`));
  }
}
You have to declare the function async and use await as well.
The async and await keywords enable asynchronous, promise-based behaviour to be written in a cleaner style, avoiding the need to explicitly configure promise chains.
You can use this modular approach to build your database access code:
index.js: runs your database application code, like listing database names, listing collection names, and reading from a collection.
const connect = require('./database');
const dbFunctions = require('./dbFunctions');

const start = async function() {
  const connection = await connect();
  console.log('Connected...');
  const dbNames = await dbFunctions.getDbNames(connection);
  console.log(dbNames.databases.map(e => e.name));
  const colls = await dbFunctions.getCollNames(connection, 'test');
  console.log(colls.map(e => e.name));
  console.log(await dbFunctions.getDocs(connection, 'test', 'test'));
};

start();
database.js: creates the connection object. This connection is used for all your database access code. In general, a single connection creates a connection pool, and this can be used throughout a small application.
const { MongoClient } = require('mongodb');
const url = 'mongodb://localhost:27017/';
const opts = { useUnifiedTopology: true };
async function connect() {
  console.log('Connecting to db server...');
  return await MongoClient.connect(url, opts);
}

module.exports = connect;
dbFunctions.js: various functions to access database details and collection details, and to query a specific collection.
module.exports = {
  // return list of database names
  getDbNames: async function(conn) {
    return await conn.db().admin().listDatabases({ nameOnly: true });
  },
  // return collections list as an array for a given database
  getCollNames: async function(conn, db) {
    return await conn.db(db).listCollections().toArray();
  },
  // return documents as an array for a given database and collection
  getDocs: async function(conn, db, coll) {
    return await conn.db(db).collection(coll).find().toArray();
  }
}
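One thing the index.js above never does is release the pool. If the process should exit cleanly once the work is done, a small variation (a sketch, not part of the original answer) closes the client in a finally block:

const start = async function() {
  const connection = await connect();
  try {
    const dbNames = await dbFunctions.getDbNames(connection);
    console.log(dbNames.databases.map(e => e.name));
  } finally {
    // Close the client (and its pool) so the process can exit.
    await connection.close();
  }
};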
I am running a Node server with the node-postgres (pg) package.
I wrote a program which sends n queries (for instance 20,000) at once to my Postgres database.
When I do this with several clients who each want to run 20,000 queries at once, there is no parallelism. That means the requests of the second client are queued until the first client has finished all of its queries.
Is this normal behavior for Postgres? If yes, how can I prevent one user from getting all the resources (while the others have to wait) if there is no parallelism?
This is my code:
const express = require('express');
const app = express();
const { Pool } = require("pg");
const pool = new Pool();

function benchmark() {
  pool.connect((err, client, done) => {
    if (err) throw err;
    client.query("SELECT * from member where m_id = $1", [1], (err, res) => {
      done();
      if (err) {
        console.log(err.stack);
      } else {
        console.log(res.rows[0]);
      }
    });
  });
}

app.get('/', function(req, res) {
  for (let i = 0; i < 20000; i++) {
    benchmark();
  }
});
First you need to create a connection pool. Here's an example with node's pg in a separate module (node-pg-sql.js) for convenience:
node-pg-sql.js:
const { Pool } = require('pg');
const pool = new Pool(fileNameConfigPGSQL); // your pool config object

module.exports = {
  query: (text, params, callback) => {
    const start = Date.now();
    return pool.query(text, params, (err, res) => {
      const duration = Date.now() - start;
      // console.log('executed query', { text, duration, rows: res.rowCount })
      callback(err, res);
    });
  },
  getClient: (callback) => {
    pool.connect((err, client, done) => {
      // monkey patch: remember the last query run on this client
      const query = client.query.bind(client);
      client.query = function () {
        client.lastQuery = arguments;
        return query.apply(client, arguments);
      };
      // 5 second timeout
      const timeout = setTimeout(() => {
        // console.error('A client has been checked out for more than 5 seconds!')
        // console.error(`The last executed query on this client was: ${client.lastQuery}`)
      }, 5000);
      const release = (err) => {
        // the 'done' method returns the client to the pool
        done(err);
        // clear the timeout
        clearTimeout(timeout);
        // restore the query method to its pre-monkey-patch state
        client.query = query;
      };
      callback(err, client, release);
    });
  }
}
In your postgresql.conf (on Linux normally under /var/lib/pgsql/data/postgresql.conf) set max_connections to the desired value:
max_connections = 300
Keep in mind:
Each PostgreSQL connection consumes RAM for managing the connection or the client using it. The more connections you have, the more RAM you will be using that could instead be used to run the database.
While increasing max_connections, you need to increase shared_buffers and kernel.shmmax as well in order for the client-connection increase to be effective, as shown below.
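For illustration only (the right values depend on your RAM and workload), the two settings live in different files:

# postgresql.conf - illustrative values
max_connections = 300
shared_buffers = 2GB

# /etc/sysctl.conf - raise the kernel shared-memory cap to match
kernel.shmmax = 4294967296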
Whenever you want to run a query from one of your routes/endpoints, just require the separate client-pool file like:
const db = require('../../../node-pg-sql');

module.exports = (router) => {
  router.get('/someRoute', (req, res) => {
    console.log(`*****************************************`);
    console.log(`Testing pg..`);
    let sqlSelect = `SELECT EXISTS (
                       SELECT 1
                       FROM pg_tables
                       WHERE schemaname = 'someschema'
                     )`;
    // pass an empty params array so the callback lands in the right argument slot
    db.query(sqlSelect, [], (errSelect, responseSelect) => {
      if (errSelect) {
        /* INFO: Error while querying table */
        console.log(`*****************************************`);
        console.log(`ERROR WHILE CHECKING CONNECTION: ${errSelect}`);
      } else {
        // INFO: No error from database
        console.log(`*****************************************`);
        console.log(`CONNECTION TO PGSQL WAS SUCCESSFUL..`);
        res.json({ success: true, message: responseSelect, data: responseSelect.rows[0].exists });
      }
    });
  });
}
EDIT:
"there is no parallelism.."
Node is asynchronous; you can either work with promises or spawn more clients/pools and tune your max_connections (as explained in my answer, but keep the performance of your host machine in mind). Still, with multiple clients each running around 20,000 queries, they won't resolve with a result instantly or in parallel. What is the exact goal you are trying to achieve?
"Is this normal behavior for Postgres?"
This is due to Node's event loop as well as certain performance limitations of the host machine running Postgres.
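To make the 20,000 queries from the question actually overlap instead of being fired blindly in a loop, one option (a sketch using pg's promise API; the pool size of 20 is an arbitrary choice) is to collect the promises from pool.query and await them together:

const { Pool } = require('pg');
// max caps how many queries run concurrently; the rest queue inside the pool.
const pool = new Pool({ max: 20 });

async function benchmark(n) {
  const tasks = [];
  for (let i = 0; i < n; i++) {
    tasks.push(pool.query('SELECT * FROM member WHERE m_id = $1', [1]));
  }
  // Queries run concurrently up to the pool size rather than strictly one by one.
  const results = await Promise.all(tasks);
  return results.map(r => r.rows[0]);
}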
I am trying to write a simple function to grab the id of a specific instance based on matching criteria from MongoDB, using the official Node package 'mongodb'.
My function works in the sense that I can console.log the data, but I am unable to return the data to use it as I intended, as you can see below.
const mongo = require('mongodb').MongoClient;
const url = 'mongodb://localhost:27017';

// Function for finding the database id of a device based on deviceKey.
// The database is written into the code under the const 'db', as is the collection.
async function fetchId(deviceKey) {
  const client = await mongo.connect(url, { useNewUrlParser: true });
  const db = client.db('telcos');
  const collection = db.collection('device');
  try {
    await collection.find({"deviceKey": deviceKey}).toArray((err, response) => {
      if (err) throw err;
      console.log(response[0]._id); // << works, logs _id
      return response[0]._id;       // << does nothing... ?
    });
  } finally {
    client.close();
  }
}

// # fetchId() USAGE EXAMPLE
//
// fetchId(112233); << include deviceKey to extract id
//
// returns database id of device with deviceKey 112233

// Run test on fetchId() to see if it works
fetchId("112233")
  .then(function(id) {
    console.dir(id); // << undefined
  })
  .catch(function(error) {
    console.log(error);
  });
Why does my test return undefined but my console.log() inside the function works?
It looks like you're combining callback code with async/await code in an odd way. Your function fetchId isn't returning anything at all, which is why you don't see id after fetching.
try {
  const response = await collection.find(...).toArray();
  return response[0]._id;
}...
If we weren't able to await collection.find(...).toArray() and needed to manually convert this from using callbacks to promises, we'd have to do something like:
function fetchId(id) {
  // this function returns a promise
  return new Promise((resolve, reject) => {
    ...
    collection.find(...).toArray((err, response) => {
      // within the callback, returning values doesn't do anything
      if (err) return reject(err);
      return resolve(response[0]._id);
    });
  });
}
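Putting the pieces together, the whole function could be written without any callbacks at all (a sketch using the same database and collection names as the question):

const mongo = require('mongodb').MongoClient;
const url = 'mongodb://localhost:27017';

async function fetchId(deviceKey) {
  const client = await mongo.connect(url, { useNewUrlParser: true });
  try {
    const response = await client.db('telcos')
      .collection('device')
      .find({ deviceKey: deviceKey })
      .toArray();
    // Returning from the async function resolves the promise the caller awaits.
    return response[0]._id;
  } finally {
    // Runs after the query has completed, so closing here is safe.
    client.close();
  }
}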
You are returning a value, but it is being handled as if a promise were returned. Please try this code; I have not tested it.
const mongo = require('mongodb').MongoClient;
const url = 'mongodb://localhost:27017';

// Function for finding the database id of a device based on deviceKey.
async function fetchId(deviceKey) {
  // the executor must be async to use await inside it
  return new Promise(async (resolve, reject) => {
    const client = await mongo.connect(url, { useNewUrlParser: true });
    const db = client.db('telcos');
    const collection = db.collection('device');
    try {
      await collection.find({"deviceKey": deviceKey}).toArray((err, response) => {
        if (err) throw err;
        console.log(response[0]._id);
        return resolve(response[0]._id); // resolve() hands the value to the promise
      });
    } catch (error) {
      return reject(error);
    } finally {
      client.close();
    }
  });
}
// # fetchId() USAGE EXAMPLE
//
// fetchId(112233); << include deviceKey to extract id
//
// returns database id of device with deviceKey 112233

// Run test on fetchId() to see if it works
fetchId("112233")
  .then(function(id) {
    console.dir(id); // now logs the _id instead of undefined
  })
  .catch(function(error) {
    console.log(error);
  });
I'm trying to create web services from a SQL Server database using Node.js. In the frontend, when I call those two web services simultaneously, it throws the error Global connection already exists. Call sql.close() first.
Any solution?
var express = require('express');
var router = express.Router();
var sql = require("mssql");

router.get('/Plant/:server/:user/:password/:database', function(req, res, next) {
  const user = req.params.user;
  const password = req.params.password;
  const server = req.params.server;
  const database = req.params.database;

  // config for your database
  var config = {
    user: user,
    password: password,
    server: server,
    database: database
  };

  sql.connect(config, function (err) {
    // create Request object
    var request = new sql.Request();
    // query the database and get the records
    request.query("SELECT distinct PlantName FROM MachineryStateTable",
      function (err, recordset) {
        if (err) console.log(err)
        else {
          for (i = 0; i < recordset.recordsets.length; i++) {
            res.send(recordset.recordsets[i])
          }
        }
        sql.close();
      });
  });
});

router.get('/Dep/:server/:user/:password/:database/:plantname', function(req, res, next) {
  const user = req.params.user;
  const password = req.params.password;
  const server = req.params.server;
  const database = req.params.database;
  const plantname = req.params.plantname;

  // config for your database
  var config = {
    user: user,
    password: password,
    server: server,
    database: database
  };

  sql.connect(config, function (err) {
    // create Request object
    var request = new sql.Request();
    // query the database and get the records
    // (note: interpolating plantname directly into the SQL string is open to injection)
    request.query("SELECT distinct DepName FROM MachineryStateTable where PlantName= '" + plantname + "'",
      function (err, recordset) {
        if (err) console.log(err)
        else {
          for (i = 0; i < recordset.recordsets.length; i++) {
            res.send(recordset.recordsets[i])
          }
          sql.close();
        }
      });
  });
});

module.exports = router;
You have to create a connection pool. Try this:
new sql.ConnectionPool(config).connect().then(pool => {
  return pool.request().query("SELECT * FROM MyTable")
}).then(result => {
  let rows = result.recordset;
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.status(200).json(rows);
  sql.close();
}).catch(err => {
  res.status(500).send({ message: `${err}` });
  sql.close();
});
From the documentation, the close method should be used on the connection, not on the required module. So it should be used like:

var connection = new sql.Connection({
  user: '...',
  password: '...',
  server: 'localhost',
  database: '...'
});

connection.close();
Also, a couple of suggestions:
1. Putting res.send in a loop isn't a good idea. You could reply with the entire recordsets, or operate on them, store the result in a variable, and send that back.
2. Try using promises instead of callbacks; it makes the flow neater.
You must use ConnectionPool. The following function returns a recordset with my query results.
async function execute2(query) {
  return new Promise((resolve, reject) => {
    new sql.ConnectionPool(dbConfig).connect().then(pool => {
      return pool.request().query(query);
    }).then(result => {
      resolve(result.recordset);
      sql.close();
    }).catch(err => {
      reject(err);
      sql.close();
    });
  });
}
Works fine in my code!
If this problem still bothers you, change the core API:
go to node_modules\mssql\lib\base.js
and at line 1723, add the code below before the if condition:
globalConnection = null
In case someone comes here trying to find out how to use a SQL Server pool connection with parameters:
var executeQuery = function(res, query, parameters) {
  new sql.ConnectionPool(sqlConfig).connect().then(pool => {
    // create request object
    var request = new sql.Request(pool);
    // add parameters
    parameters.forEach(function(p) {
      request.input(p.name, p.sqltype, p.value);
    });
    // query the database
    request.query(query, function(err, result) {
      res.send(result);
      sql.close();
    });
  });
}
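A hypothetical call, reusing the table from the question above (the parameter name and value are made up for illustration, and res comes from the enclosing route handler):

const sql = require('mssql');

executeQuery(res,
  'SELECT DISTINCT DepName FROM MachineryStateTable WHERE PlantName = @plantname',
  [{ name: 'plantname', sqltype: sql.NVarChar, value: 'Plant1' }]
);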
Don't read their documentation; I don't think it was written by someone who actually uses the library :) Also don't pay any attention to the names of things: a 'ConnectionPool' doesn't seem to actually be a connection pool of any sort. If you try to create more than one connection from a pool, you will get an error. This is the code that I eventually got working:
const sql = require('mssql');

// (run inside an async function)
let pool = new sql.ConnectionPool(config); // some object that lets you connect ONCE
let cnn = await pool.connect();            // create the single allowed connection on this 'pool'
let result = await cnn.request().query(query);
console.log('result:', result);
cnn.close(); // close your connection
return result;
This code can be run multiple times in parallel and seems to create multiple connections and correctly close them.
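A pattern that sidesteps the "global connection already exists" error entirely is to create a single pool promise per process and share it across all routes (a sketch, assuming a config object like the ones above):

// pool.js - create the pool once and export the promise
const sql = require('mssql');
const config = { /* user, password, server, database */ };

const poolPromise = new sql.ConnectionPool(config)
  .connect()
  .then(pool => {
    console.log('Connected to MSSQL');
    return pool;
  });

module.exports = { sql, poolPromise };

// in any route handler:
// const { poolPromise } = require('./pool');
// const pool = await poolPromise;
// const result = await pool.request().query('SELECT 1 AS ok');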