I am trying to create an array of JSON objects from a SQL Server query using Node.js. Using JSON.stringify on each row, I get the results I am looking for as individual JSON objects, but not in an array. Ideally I just want to write these results to a file as a JSON array. Any ideas appreciated.
I tried the following:
const sql=require('mssql');
const fs=require('fs');
const config = {
  // Creds removed
};
sql.connect(config, err => {
console.log(err);
const request = new sql.Request()
request.stream = true // You can set streaming differently for each request
request.query(fs.readFileSync('./new-query.sql').toString()); // or request.execute(procedure)
request.on('row', row => {
console.log(JSON.stringify(row));
})
request.on('error', err => {
console.log(err);
// May be emitted multiple times
})
request.on('done', result => {
// console.log('done emitted', result);
sql.close();
})
})
As always, reading the library notes in more detail led me to a solution. Using the mssql library you can actually work on the entire returned recordset, as opposed to every individual row. The following code works as I wanted:
const sql = require('mssql')
const fs=require('fs');
const config = {
// cred removed
};
sql.connect(config).then(() => {
return sql.query(fs.readFileSync('./new-query.sql').toString())
}).then(result => {
console.log(JSON.stringify(result.recordsets[0]))
}).catch(err => {
// ... error checks
})
sql.on('error', err => {
// ... error handler
})
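Since the original goal was to write the results to a file as a JSON array, the console.log line can be swapped for a file write. A minimal sketch, assuming ./results.json as the output path:
sql.connect(config).then(() => {
  return sql.query(fs.readFileSync('./new-query.sql').toString());
}).then(result => {
  // result.recordsets[0] is already an array of row objects,
  // so stringifying it produces the JSON array the question asks for
  fs.writeFileSync('./results.json', JSON.stringify(result.recordsets[0], null, 2));
}).catch(err => {
  console.log(err);
});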
I have a Node.js API using Express that connects to a SQL Server, and I want to use it in an Angular project. I make use of two files, a routes file and a controllers file. My routes file is as follows:
module.exports = (app) => {
const UsrContrllr = require('../Controllers/users.controllers');
//1. GET ALL USERS
app.get('/api/users', UsrContrllr.func1);
//2. POST NEW USER
app.post('/api/user/new', UsrContrllr.func2);
};
And my controllers file is given below:
const mssql = require('mssql');
exports.func1 = (req, res) =>
{
// Validate request
console.log(`Fetching RESPONSE`);
// create Request object
var request = new mssql.Request();
// query to the database and get the records
const queryStr = `SELECT * FROM USERS`;
request.query(queryStr, function (err, recordset) {
if (err) console.log(err)
else {
if (recordset.recordset.toString() === '') {
res.send('Oops!!! Required data not found...');
}
else {
// send records as a response
res.send(recordset);
}
};
});
};
exports.func2 = (req, res) =>
{
// Validate request
console.log(`INSERTING RECORD ${req}`);
// create Request object
var request = new mssql.Request();
// query to the database and get the records
const queryStr = `INSERT INTO GDUSERS (USERCODE, PASSWORD, LANGUAGE, USERCLASS, FIRSTNAME, LASTNAME, CONTACTNO) VALUES ('${req.body.usercode}', '${req.body.password}', 'EN', '0', '${req.body.firstname}', '${req.body.lastname}', '${req.body.contactno}');`;
request.query(queryStr, function (err, recordset) {
if (err) console.log(err)
else {
if (recordset.recordset.toString() == '') {
res.send('Oops!!! Required data not found...');
}
else {
// Send records as response
res.send(recordset);
}
};
});
};
The GET request works well, but when I try to run the POST request directly from the Angular application, I get an error stating
Cannot GET URL/api/user/new
The Angular code in my Angular project is:
signup() {
let headers = new Headers({ 'Content-Type': 'application/json' });
let options = new RequestOptions({ headers: headers });
console.log(this.user); //User details come from a form
this.http.post("URL", this.user, options)
.subscribe(
(err) => {
if(err) console.log(err);
console.log("Success");
});
}
I'm not sure whether the Angular code I'm using is right or not, and I don't know where I'm going wrong. How does one send an HTTP POST request from an Angular project?
This is the way I handled my user signup with http.post calls. My approach is slightly different when signing up a user because I am using a promise instead of an observable (which I normally use for my service calls), but I will show you both ways.
createUser(user: User): Promise<string> {
  const promise = new Promise<string>((resolve, reject) => {
    const userForPost = this.createUserForPost(user);
    this.http.post(environment.backendUrl + '/api/user/signup', userForPost, this.config).toPromise<HttpConfig>()
      .then(createdUser => {
        resolve('user created'); // resolve here so the caller's .then fires; adapt the value to your response shape
      }).catch(error => {
        console.log(error);
        reject(error); // reject so the caller's .catch fires
      });
  });
  return promise;
}
Here is another example with an observable:
createForumPost(forumPost: ForumPost) {
  this.http.post<{ message: string, forumPostId: string }>(environment.backendUrl + '/api/forumPosts', forumPost).subscribe((responseData) => {
    const id = responseData.forumPostId;
    forumPost.id = id;
  });
}
I defined my URL somewhere else and then just use environment.backendUrl + 'path' to define my path (the same as the path in your backend controller).
This is one of my first answers here on SO, so I am sorry if it is a bit messy. I hope I was able to help with my examples :)
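For completeness, a usage sketch of the promise-based createUser above, assuming it lives in an injected service referenced as this.userService:
this.userService.createUser(this.user)
  .then(() => console.log('Success'))
  .catch(err => console.log(err));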
I am new to MongoDB. I am trying to query from different collections, and when I fetch data from the category collection (effectively a select * on the collection) it throws the error MongoError: pool destroyed.
As per my understanding, some find({}) call is creating a pool and that pool is being destroyed.
The code I am using inside the model is below:
const MongoClient = require('mongodb').MongoClient;
const dbConfig = require('../configurations/database.config.js');
export const getAllCategoriesApi = (req, res, next) => {
return new Promise((resolve, reject ) => {
let finalCategory = []
const client = new MongoClient(dbConfig.url, { useNewUrlParser: true });
client.connect(err => {
const collection = client.db(dbConfig.db).collection("categories");
debugger
if (err) throw err;
let query = { CAT_PARENT: { $eq: '0' } };
collection.find(query).toArray(function(err, data) {
if(err) return next(err);
finalCategory.push(data);
resolve(finalCategory);
// db.close();
});
client.close();
});
});
}
My finding here is that when I use
let query = { CAT_PARENT: { $eq: '0' } };
collection.find(query).toArray(function(err, data) {})
with find(query) it returns data, but with {} or $gte/$gt it throws the pool error.
The code I have written in the controller is below:
import { getAllCategoriesListApi } from '../models/fetchAllCategory';
const redis = require("redis");
const client = redis.createClient(process.env.REDIS_PORT);
export const getAllCategoriesListData = (req, res, next, query) => {
// Try fetching the result from Redis first in case we have it cached
return client.get(`allstorescategory:${query}`, (err, result) => {
// If that key exist in Redis store
if (false) {
res.send(result)
} else {
// Key does not exist in Redis store
getAllCategoriesListApi(req, res, next).then( function ( data ) {
const responseJSON = data;
// Save the Wikipedia API response in Redis store
client.setex(`allstorescategory:${query}`, 3600, JSON.stringify({ source: 'Redis Cache', responseJSON }));
res.send(responseJSON)
}).catch(function (err) {
console.log(err)
})
}
});
}
Can anyone tell me what mistake I am making here, and how I can fix the pool issue?
Thanks in advance.
I assume that toArray is asynchronous, i.e. it invokes the callback passed in once the results become available (read from the network).
If this is true, the client.close() call is going to execute before the results have been read, which likely yields your error.
The close call needs to be done after you have finished iterating the results.
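A minimal sketch of that fix against the model code in the question: move client.close() into the toArray callback so it runs only after the data has been read.
client.connect(err => {
  if (err) return reject(err);
  const collection = client.db(dbConfig.db).collection("categories");
  const query = { CAT_PARENT: { $eq: '0' } };
  collection.find(query).toArray(function(err, data) {
    if (err) return next(err);
    finalCategory.push(data);
    resolve(finalCategory);
    client.close(); // safe here: the cursor has been fully read
  });
});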
Separately from this, you should probably not be creating the client instance in the request handler like this. Client instances are expensive to create (they must talk to all of the servers in the deployment before they can actually perform queries) and generally should be created per running process rather than per request.
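A sketch of that per-process approach (the module layout is an assumption): create the client once at startup and reuse it in every handler.
const MongoClient = require('mongodb').MongoClient;
const dbConfig = require('../configurations/database.config.js');

// Created once per process, not per request
const client = new MongoClient(dbConfig.url, { useNewUrlParser: true });
const clientReady = client.connect(); // promise that resolves once connected

export const getAllCategoriesApi = async (req, res, next) => {
  await clientReady;
  const collection = client.db(dbConfig.db).collection('categories');
  return collection.find({ CAT_PARENT: { $eq: '0' } }).toArray();
};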
I am trying to write a simple function to grab the id of a specific instance based on matching criteria from mongodb using the official node package 'mongodb'.
My function works in the sense that I can console.log the data, but I am unable to return the data to use it as I intended, as you can see.
const mongo = require('mongodb').MongoClient;
const url = 'mongodb://localhost:27017';
// Function for finding database id of device based on deviceKey, The database is written into
// the code under the const 'db' as is the collection.
async function fetchId(deviceKey) {
const client = await mongo.connect(url, { useNewUrlParser: true });
const db = client.db('telcos');
const collection = db.collection('device');
try {
await collection.find({"deviceKey": deviceKey}).toArray((err, response) => {
if (err) throw err;
console.log(response[0]._id); // << works logs _id
return response[0]._id; // << does nothing... ?
})
} finally {
client.close();
}
}
// # fetchId() USAGE EXAMPLE
//
// fetchId(112233); < include deviceKey to extract id
//
// returns database id of device with deviceKey 112233
// Run test on fetchId() to see if it works
fetchId("112233")
.then(function(id) {
console.dir(id); // << undefined
})
.catch(function(error) {
console.log(error);
});
Why does my test return undefined but my console.log() inside the function works?
It looks like you're combining callback code with async/await code in an odd way. Your function fetchId isn't returning anything at all, which is why you don't see id after fetching.
try {
  const response = await collection.find(...).toArray()
  return response[0]._id
}...
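Putting that together, the whole function can drop the callback entirely (a sketch, not tested against a live database):
async function fetchId(deviceKey) {
  const client = await mongo.connect(url, { useNewUrlParser: true });
  try {
    const collection = client.db('telcos').collection('device');
    const response = await collection.find({ deviceKey: deviceKey }).toArray();
    return response[0]._id; // the returned value is what the caller's .then receives
  } finally {
    client.close();
  }
}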
If we weren't able to await collection.find(...).toArray() and needed to manually convert this from using callbacks to promises, we'd have to do something like:
function fetchId (id) {
// this function returns a promise
return new Promise((resolve, reject) => {
...
collection.find(...).toArray((err, response) => {
// within the callback, returning values doesn't do anything
if (err) return reject(err);
return resolve(response[0]._id);
})
});
}
You are returning a value, but it is being handled as though a promise were returned. Please try this code; I have not tested it.
const mongo = require('mongodb').MongoClient;
const url = 'mongodb://localhost:27017';
// Function for finding database id of device based on deviceKey, The database is written into
// the code under the const 'db' as is the collection.
async function fetchId(deviceKey) {
  // Note: an async promise executor is generally an antipattern, but it keeps
  // the original shape while making the awaits below legal
  return new Promise(async (resolve, reject) => {
    try {
      const client = await mongo.connect(url, { useNewUrlParser: true });
      const db = client.db('telcos');
      const collection = db.collection('device');
      collection.find({"deviceKey": deviceKey}).toArray((err, response) => {
        client.close(); // close only after the results have been delivered
        if (err) return reject(err);
        console.log(response[0]._id); // logs the _id
        return resolve(response[0]._id); // resolves the promise with the _id
      });
    }
    catch (error) {
      return reject(error);
    }
  });
}
// # fetchId() USAGE EXAMPLE
//
// fetchId(112233); < include deviceKey to extract id
//
// returns database id of device with deviceKey 112233
// Run test on fetchId() to see if it works
fetchId("112233")
.then(function(id) {
console.dir(id); // logs the _id once the promise resolves
})
.catch(function(error) {
console.log(error);
});
I've been using node-oracledb for a few months and I've managed to achieve what I have needed to so far.
I'm currently working on a search app that could potentially return about 2m rows of data from a single call. To ensure I don't get a disconnect from the browser and the server, I thought I would try queryStream so that there is a constant flow of data back to the client.
I implemented the queryStream example as-is, and this worked fine for a few hundred thousand rows. However, when the returned rows is greater than one million, Node runs out of memory. By logging and watching both client and server log events, I can see that client is way behind the server in terms of rows sent and received. So, it looks like Node is falling over because it's buffering so much data.
It's worth noting that at this point, my queryStream implementation is within a req/res function called via Express.
To return the data, I do something like....
stream.on('data', function (data) {
  rowcount++;
  let obj = new myObjectConstructor(data);
  res.write(JSON.stringify(obj.getJson()));
});
I've been reading about how streams and pipe can help with flow, so what I'd like is to pipe the results from the query to (a) help with flow and (b) pipe the results through other functions before sending them back to the client.
E.g.
function getData(req, res){
var stream = myQueryStream(connection, query);
stream
.pipe(toSomeOtherFunction)
.pipe(yetAnotherFunction)
.pipe(res);
}
I've spent a few hours trying to find a solution or example that allows me to pipe results, but I'm stuck and need some help.
Apologies if I'm missing something obvious, but I'm still getting to grips with Node and especially streams.
Thanks in advance.
There's a bit of an impedance mismatch here. The queryStream API emits rows of JavaScript objects, but what you want to stream to the client is a JSON array. You basically have to add an open bracket to the beginning, a comma between rows, and a close bracket at the end.
I'll show you how to do this in a controller that uses the driver directly as you have done, instead of using separate database modules as I advocate in this series.
const oracledb = require('oracledb');

async function get(req, res, next) {
  try {
    const conn = await oracledb.getConnection();
    const stream = await conn.queryStream('select * from employees', [], {outFormat: oracledb.OBJECT});

    res.writeHead(200, {'Content-Type': 'application/json'});
    res.write('[');

    let firstRow = true;

    stream.on('data', (row) => {
      // Prefix a comma to every row after the first so the array stays valid JSON
      if (firstRow) {
        firstRow = false;
      } else {
        res.write(',');
      }
      res.write(JSON.stringify(row));
    });

    stream.on('end', () => {
      res.end(']');
    });

    stream.on('close', async () => {
      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });

    stream.on('error', async (err) => {
      next(err);

      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });
  } catch (err) {
    next(err);
  }
}

module.exports.get = get;
Once you get the concepts, you can simplify things a bit with a reusable Transform class which allows you to use pipe in the controller logic:
const oracledb = require('oracledb');
const { Transform } = require('stream');

class ToJSONArray extends Transform {
  constructor() {
    super({objectMode: true});

    this.push('[');
  }

  _transform (row, encoding, callback) {
    if (this._prevRow) {
      this.push(JSON.stringify(this._prevRow));
      this.push(',');
    }

    this._prevRow = row;

    callback(null);
  }

  _flush (done) {
    if (this._prevRow) {
      this.push(JSON.stringify(this._prevRow));
    }

    this.push(']');

    delete this._prevRow;

    done();
  }
}

async function get(req, res, next) {
  try {
    const toJSONArray = new ToJSONArray();
    const conn = await oracledb.getConnection();
    const stream = await conn.queryStream('select * from employees', [], {outFormat: oracledb.OBJECT});

    res.writeHead(200, {'Content-Type': 'application/json'});

    stream.pipe(toJSONArray).pipe(res);

    stream.on('close', async () => {
      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });

    stream.on('error', async (err) => {
      next(err);

      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });
  } catch (err) {
    next(err);
  }
}

module.exports.get = get;
Rather than writing your own logic to create a JSON stream, you can use JSONStream to convert an object stream to (stringified) JSON, before piping it to its destination (res, process.stdout etc) This saves the need to muck around with .on('data',...) events.
In the example below, I've used pipeline from node's stream module rather than the .pipe method: the effect is similar (with better error handling, I think). To get objects from oracledb.queryStream, you can specify the option {outFormat: oracledb.OUT_FORMAT_OBJECT} (docs). Then you can make arbitrary modifications to the stream of objects produced. This can be done using a transform stream, made perhaps using through2-map, or, if you need to drop or split rows, through2. Below, the stream is sent to process.stdout after being stringified as JSON, but you could equally send it to express's res.
require('dotenv').config() // config from .env file
const JSONStream = require('JSONStream')
const oracledb = require('oracledb')
const { pipeline } = require('stream')
const map = require('through2-map') // see https://www.npmjs.com/package/through2-map
oracledb.getConnection({
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
connectString: process.env.CONNECT_STRING
}).then(connection => {
pipeline(
connection.queryStream(`
select dual.*,'test' as col1 from dual
union select dual.*, :someboundvalue as col1 from dual
`
,{"someboundvalue":"test5"} // binds
,{
prefetchRows: 150, // for tuning
fetchArraySize: 150, // for tuning
outFormat: oracledb.OUT_FORMAT_OBJECT
}
)
,map.obj((row,index) => {
row.arbitraryModification = index
return row
})
,JSONStream.stringify() // false gives ndjson
,process.stdout // or send to express's res
,(err) => { if(err) console.error(err) }
)
})
// [
// {"DUMMY":"X","COL1":"test","arbitraryModification":0}
// ,
// {"DUMMY":"X","COL1":"test5","arbitraryModification":1}
// ]
I'm trying to create web services using Node.js from a SQL Server database. In the frontend, when I call those two web services simultaneously, it throws the error Global connection already exists. Call sql.close() first.
Any solution?
var express = require('express');
var router = express.Router();
var sql = require("mssql");
router.get('/Plant/:server/:user/:password/:database', function(req, res, next) {
user = req.params.user;
password = req.params.password;
server = req.params.server;
database = req.params.database;
// config for your database
var config = {
user: user,
password: password,
server: server,
database:database
};
sql.connect(config, function (err) {
// create Request object
var request = new sql.Request();
// query to the database and get the records
request.query("SELECT distinct PlantName FROM MachineryStateTable"
, function (err, recordset) {
if (err) console.log(err)
else {
for(i=0;i<recordset.recordsets.length;i++) {
res.send(recordset.recordsets[i])
}
}
sql.close();
});
});
});
router.get('/Dep/:server/:user/:password/:database/:plantname', function(req, res, next) {
user = req.params.user;
password = req.params.password;
server = req.params.server;
database = req.params.database;
plantname = req.params.plantname;
// config for your database
var config = {
user: user,
password: password,
server: server,
database:database
};
sql.connect(config, function (err) {
// create Request object
var request = new sql.Request();
// query to the database and get the records
request.query("SELECT distinct DepName FROM MachineryStateTable where PlantName= '"+plantname+"'"
, function (err, recordset) {
if (err) console.log(err)
else {
for(i=0;i<recordset.recordsets.length;i++) {
res.send(recordset.recordsets[i])
}
sql.close();
}
});
});
});
module.exports = router;
You have to create a connection pool.
Try this:
new sql.ConnectionPool(config).connect().then(pool => {
return pool.request().query("SELECT * FROM MyTable")
}).then(result => {
let rows = result.recordset
res.setHeader('Access-Control-Allow-Origin', '*')
res.status(200).json(rows);
sql.close();
}).catch(err => {
res.status(500).send({ message: `${err}`})
sql.close();
});
From the documentation, the close method should be used on the connection, and not on the required module. So it should be used like:
var connection = new sql.Connection({
user: '...',
password: '...',
server: 'localhost',
database: '...'
});
connection.close();
Also, a couple of suggestions:
1. Putting res.send in a loop isn't a good idea. You could reply with the entire recordsets, or do operations over them, store the result in a variable, and send that back (see the sketch below).
2. Try using promises instead of callbacks; it would make the flow neater.
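A sketch of the first suggestion applied to the Plant route from the question (error handling kept minimal):
request.query("SELECT distinct PlantName FROM MachineryStateTable",
  function (err, result) {
    if (err) return console.log(err);
    // Send all recordsets in a single response instead of res.send in a loop
    res.send(result.recordsets);
    sql.close();
  });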
You must use ConnectionPool.
The following function returns a recordset with my query results:
async function execute2(query) {
return new Promise((resolve, reject) => {
new sql.ConnectionPool(dbConfig).connect().then(pool => {
return pool.request().query(query)
}).then(result => {
resolve(result.recordset);
sql.close();
}).catch(err => {
reject(err)
sql.close();
});
});
}
Works fine in my code!
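An illustrative call site for execute2 (the route path and query are assumptions):
router.get('/plants', async (req, res) => {
  try {
    const rows = await execute2('SELECT distinct PlantName FROM MachineryStateTable');
    res.status(200).json(rows);
  } catch (err) {
    res.status(500).send({ message: `${err}` });
  }
});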
If this problem still bothers you, then change the core API: go to node_modules\mssql\lib\base.js and, at line 1723, add the code below before the if condition.
globalConnection = null
In case someone comes here trying to find out how to use a SQL Server pool connection with parameters:
var executeQuery = function(res,query,parameters){
new sql.ConnectionPool(sqlConfig).connect().then(pool =>{
// create request object
var request = new sql.Request(pool);
// Add parameters
parameters.forEach(function(p) {
request.input(p.name, p.sqltype, p.value);
});
// query to the database
request.query(query,function(err,result){
res.send(result);
sql.close();
});
})
}
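A hypothetical call for the helper above (table, column, and value are illustrative; the @usercode placeholder must match the parameter name):
executeQuery(res, 'SELECT * FROM GDUSERS WHERE USERCODE = @usercode', [
  { name: 'usercode', sqltype: sql.VarChar, value: 'jdoe' }
]);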
Don't read their documentation; I don't think it was written by someone who actually uses the library :) Also, don't pay any attention to the names of things: a 'ConnectionPool' doesn't seem to actually be a connection pool of any sort. If you try to create more than one connection from a pool, you will get an error. This is the code that I eventually got working:
const sql = require('mssql');

// The awaits below must run inside an async function, so they are wrapped in one here
async function runQuery(query) {
  let pool = new sql.ConnectionPool(config); // some object that lets you connect ONCE
  let cnn = await pool.connect(); // create single allowed connection on this 'pool'
  let result = await cnn.request().query(query);
  console.log('result:', result);
  cnn.close(); // close your connection
  return result;
}
This code can be run multiple times in parallel and seems to create multiple connections and correctly close them.
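To illustrate that claim, a small sketch using the runQuery wrapper above (the queries are illustrative):
Promise.all([
  runQuery('SELECT 1 AS one'),
  runQuery('SELECT 2 AS two'),
]).then(results => console.log('completed', results.length, 'queries'));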