We're working on a Node/Express web app with a Postgres database, using the node-postgres package. We followed the instructions in this question, and have our query working when written this way:
exports.getByFileNameAndColName = function query(data, cb) {
    const values = data.columns.map(function map(item, index) {
        return '$' + (index + 2);
    });
    const params = [];
    params.push(data.fileName);
    data.columns.forEach(function iterate(element) {
        params.push(element);
    });
    db.query('SELECT * FROM columns ' +
        'INNER JOIN files ON columns.files_id = files.fid ' +
        'WHERE files.file_name = $1 AND columns.col_name IN (' + values.join(', ') + ')',
        params, cb
    );
};
data is an object containing a string fileName and an array of column names columns.
We want this query to extract information from our 'columns' and 'files' tables for a dynamic number of columns.
db.query takes as parameters (query, args, cb), where query is the SQL query, args is an array of parameters to pass into the query, and cb is the callback function executed with the database results.
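For example, with a hypothetical input of data = { fileName: 'report.csv', columns: ['age', 'name'] }, the code above boils down to this call:

// generated placeholders: $2, $3 (offset by one for the file name at $1)
db.query(
    'SELECT * FROM columns ' +
    'INNER JOIN files ON columns.files_id = files.fid ' +
    'WHERE files.file_name = $1 AND columns.col_name IN ($2, $3)',
    ['report.csv', 'age', 'name'],
    cb
);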
So the code written in this way returns the correct data, but (we think) it's ugly. We've tried different ways of passing the parameters into the query, but this is the only format that has successfully returned data.
Is there a cleaner/simpler way to pass in our parameters? (E.g., is there any way to pass parameters in a form that node-postgres will accept without having to create an additional array from our array + non-array elements?)
Asking this because:
perhaps there's a better way to use the node-postgres package/we're using it incorrectly, and
if this is the correct way to solve this type of issue, then this code supplements the answer in the question referenced above.
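For reference, the array-building above can also be written more compactly in plain JavaScript; this changes nothing about how node-postgres receives the parameters:

// same params and placeholders as above, built with concat/map
const params = [data.fileName].concat(data.columns);
const values = data.columns.map(function (item, index) {
    return '$' + (index + 2);
});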
Hello, I tried to interpret what you meant by "but (we think) it's ugly", and I believe my response answers your question.
In the same question you reference, you will find this response,
in which the user uses pg-promise with its special-case variable formatting.
In your case it may look something like this, using a shared connection. In your example I would actually recommend a plain db.query; I'm just using the shared connection to show how I extended the "ugly":
exports.getByFileNameAndColName = function query(data, cb) {
    var sco; // shared connection object
    db.connect()
        .then(function (obj) {
            sco = obj;
            // $1 receives the file name; $2^ injects the pre-formatted
            // CSV of column names as raw text
            return sco.query('SELECT * FROM columns ' +
                'INNER JOIN files ON columns.files_id = files.fid ' +
                'WHERE files.file_name = $1 AND columns.col_name IN ($2^)',
                [data.fileName, pgp.as.csv(data.columns)]);
        }, function (reason) {
            console.log(reason);
        })
        .done(function () {
            if (sco) {
                sco.done();
                cb();
            }
        });
};
Now again, I'm not sure what you meant by "ugly", but in my use case the return format was something like this:
{
    column: [
        {
            id: data,
            data: data,
            col_name: data,
            files_id: data,
            fid: data,
            files_name: data
        }, ...
    ]
}
And in my case I really wanted this:
{
    column: [
        {
            id: data,
            data: data,
            col_name: data,
            files_id: data
        }, ...
    ],
    file: [
        {
            fid: data,
            files_name: data
        }, ...
    ]
}
So in order to do that, I took the same shared connection and added an extra variable to manage the results. This may not answer your question, or I might just be on to something, but I suggest looking into pg-promise; it could be helpful for advanced queries and formatting.
My question was asking whether there was a way to use the node-postgres library that cleaned up our params-creation code before the query. However, from the several deleted answers as well as the remaining one, it seems that we're being ornery, that those few extra lines aren't a big deal, and that this is the best way to write this code. So I'm marking this question "answered", although it now appears it wasn't the greatest question and perhaps we shouldn't have asked it in the first place.
I am trying to write a function that lets me insert a value into a Firebird database. The query works well, but I get no callback to tell me that the insert went well.
This is the first time I am using a Firebird connector. In the past, when using MySQL connectors, I recall getting some sort of callback when inserting new values. Right now I am using the node-firebird library by Henri Gourvest to accomplish this:
https://github.com/hgourvest/node-firebird/
I tried adding 'RETURNING FEATURE_ID' at the end, but an error "Cursor is not open" was thrown. The feature ID is generated by a trigger.
Any advice would be very kind.
pool.get(function(error, db) {
    if (error) {
        console.log(error)
        res.status(403)
        res.send()
    }
    else {
        var date = moment(req.body.date, "DD/MM/YYYY")
        var values = " VALUES ('" + date.format("MM/DD/YYYY") + "','Requested','" + req.body.type + "','" + req.body.description + "','" + req.body.memo + "')"
        var query = 'INSERT INTO "Features" (FEATURE_REQUESTDATE, FEATURE_STATUS, FEATURE_TYPE, FEATURE_DESCRIPTION, FEATURE_MEMO)' + values
        db.query(query, function(err, result) {
            if (result) { // why is there no result here?
                res.status(200)
                res.send('ok')
            }
            if (err) {
                console.log(err)
                res.status(403)
                res.send('error')
            }
        })
        db.detach();
    }
})
I tried adding 'RETURNING FEATURE_ID' at the end, but an error "Cursor is not open" was thrown.
Sure, there can be no cursor. Cursors (AKA rowsets) are only created by queries - SELECT-type SQL statements.
As stated in the Firebird documentation, statements with a RETURNING clause are not of query type; they are of procedure-call type. You should execute them as you would regular DELETE-type statements, then read the PARAMETERS of the executed statement.
Right now I am using the node-firebird library by Henri Gourvest to accomplish this: https://github.com/hgourvest/node-firebird/
Any advice would be very kind.
There are two pieces of advice.
NEVER splice your data values into the SQL command text. It makes your program very fragile: it will give you all kinds of data conversion errors, and it also opens highways for database corruption caused by unexpected (erroneous or malicious) user input. See http://bobby-tables.com/ and http://en.wikipedia.org/wiki/SQL_injection
"Use the source, Luke." The library you mentioned is open-source, so you should check the examples in that library. Henri is known to be laconic about documentation, but he supplies his libraries with vast sets of examples and/or tests. Both suit you here, since they use the library, so you can simply read how its creator intended it to be used. This particular library has tests, and tests are always examples of intended use.
So you go into the test folder, and there you see the run.js file. Open it.
https://github.com/hgourvest/node-firebird/blob/master/test/run.js
Now press Ctrl+F and search for the word "RETURNING". One of its occurrences should be exactly the test for the SQL feature you need.
Here it is, the very FIRST occurrence of it in the library text you already have on your machine. Granted, the first occurrence adds the complexity of working with BLOBs, which you do not need right off, so I would rather quote the THIRD example in the library you downloaded. But even the first example shows you how to properly execute queries with values and with RETURNING clauses.
function test_insert(next) {
    ....skip.......

    // Insert record without blob
    database.query('INSERT INTO test (ID, NAME, CREATED, PARENT) VALUES(?, ?, ?, ?) RETURNING ID', [3, 'Firebird 3', now, 862304020112911], function(err, r) {
        assert.ok(!err, name + ': insert without blob (buffer) (1) ' + err);
        assert.ok(r['id'] === 3, name + ': without blob (buffer) returning value');
        next();
    });

    // Insert record without blob (without returning value)
    database.query('INSERT INTO test (ID, NAME, CREATED) VALUES(?, ?, ?)', [4, 'Firebird 4', '2014-12-12 13:59'], function(err, r) {
        assert.ok(!err, name + ': insert without blob (buffer) (2) ' + err);
        assert.ok(err === undefined, name + ': insert without blob + without returning value');
        next();
    });
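Applied to the code in the question, that pattern would look roughly like this (a sketch, untested; it keeps the question's column names and response handling, and assumes node-firebird lowercases the returned field name, as it does with r['id'] in the test above):

var date = moment(req.body.date, "DD/MM/YYYY")
db.query(
    'INSERT INTO "Features" (FEATURE_REQUESTDATE, FEATURE_STATUS, FEATURE_TYPE, FEATURE_DESCRIPTION, FEATURE_MEMO) ' +
    'VALUES (?, ?, ?, ?, ?) RETURNING FEATURE_ID',
    [date.format("MM/DD/YYYY"), 'Requested', req.body.type, req.body.description, req.body.memo],
    function(err, result) {
        if (err) {
            console.log(err)
            res.status(403)
            res.send('error')
        } else {
            // the trigger-generated id comes back in the result object,
            // e.g. result.feature_id
            res.status(200)
            res.send('ok')
        }
    }
)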
I'm using pg-promise and I want to make multiple inserts into one table. I've seen some solutions like Multi-row insert with pg-promise and How do I properly insert multiple rows into PG with node-postgres?, and I could use pgp.helpers.concat to concatenate multiple queries.
But now I need to insert a lot of measurements into a table, more than 10,000 records, and https://github.com/vitaly-t/pg-promise/wiki/Performance-Boost says:
"How many records you can concatenate like this - depends on the size of the records, but I would never go over 10,000 records with this approach. So if you have to insert many more records, you would want to split them into such concatenated batches and then execute them one by one."
I read the whole article, but I can't figure out how to "split" my inserts into batches and then execute them one by one.
Thanks!
UPDATE
Best is to read the following article: Data Imports.
As the author of pg-promise, I was compelled to finally provide the right answer to the question, as the one published earlier didn't really do it justice.
In order to insert a massive/infinite number of records, your approach should be based on method sequence, which is available within tasks and transactions.
var cs = new pgp.helpers.ColumnSet(['col_a', 'col_b'], {table: 'tableName'});
// returns a promise that resolves with the next array of data objects
// while there is data, or with an empty array when no more data is left
function getData(index) {
    return new Promise((resolve, reject) => {
        if (/* still have data for the index */) {
            // - resolve with the next array of data
        } else {
            // - resolve with an empty array, if no more data left
            // - reject, if something went wrong
        }
    });
}
function source(index) {
    var t = this;
    return getData(index)
        .then(data => {
            if (data.length) {
                // while there is still data, insert the next bunch:
                var insert = pgp.helpers.insert(data, cs);
                return t.none(insert);
            }
            // returning nothing/undefined ends the sequence
        });
}

db.tx(t => t.sequence(source))
    .then(data => {
        // success
    })
    .catch(error => {
        // error
    });
This is the best approach to inserting a massive number of rows into the database, from both the performance point of view and that of load throttling.
All you have to do is implement your function getData according to the logic of your app, i.e. where your large data is coming from, based on the index of the sequence, to return some 1,000 - 10,000 objects at a time, depending on the size of objects and data availability.
See also some API examples:
spex -> sequence
Linked and Detached Sequencing
Streaming and Paging
Related question: node-postgres with massive amount of queries.
And in cases where you need to acquire generated id-s of all the inserted records, you would change the two lines as follows:
// return t.none(insert);
return t.map(insert + ' RETURNING id', [], a => +a.id);
and
// db.tx(t => t.sequence(source))
db.tx(t => t.sequence(source, {track: true}))
just be careful, as keeping too many record id-s in memory can create an overload.
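To make the getData contract concrete, here is a minimal sketch that pages over a plain in-memory array (the allRecords array and PAGE_SIZE are hypothetical; in a real app the data would come from wherever your large data source is):

const PAGE_SIZE = 5000; // hypothetical batch size, within the 1,000 - 10,000 guideline
const allRecords = [/* {col_a: ..., col_b: ...}, ... */]; // hypothetical data source

function getData(index) {
    const start = index * PAGE_SIZE;
    // resolves with the next page of data, or with an empty array
    // once no data is left, which is what ends the sequence
    return Promise.resolve(allRecords.slice(start, start + PAGE_SIZE));
}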
I think the naive approach would work.
Try to split your data into multiple pieces of 10,000 records or less.
I would try splitting the array using the solution from this post.
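For instance, a simple chunk helper would do; this is plain JavaScript, not part of pg-promise:

// split an array into pieces of at most `size` elements
function chunk(arr, size) {
    const result = [];
    for (let i = 0; i < arr.length; i += size) {
        result.push(arr.slice(i, i + size));
    }
    return result;
}

const splittedData = chunk(values, 10000);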
Then, multi-row insert each array with pg-promise and execute them one by one in a transaction.
Edit: Thanks to @vitaly-t for the wonderful library and for improving my answer.
Also don't forget to wrap your queries in a transaction, or else it
will deplete the connections.
To do this, use the batch function from pg-promise to resolve all queries asynchronously:
// split your array here to get splittedData
// splittedData = [..., [{col_a: 'a1', col_b: 'b1'}, {col_a: 'a2', col_b: 'b2'}]]
const cs = new pgp.helpers.ColumnSet(['col_a', 'col_b'], {table: 'tmp'});

const queries = [];
for (let i = 0; i < splittedData.length; i++) {
    queries.push(pgp.helpers.insert(splittedData[i], cs));
}

db.tx(function (t) {
    // execute all the generated inserts as one batch, inside a transaction
    return t.batch(queries.map(query => t.none(query)));
})
    .then(function (data) {
        // all records inserted successfully!
    })
    .catch(function (error) {
        // error;
    });
I have a scenario in which I need to insert multiple records. I have a table with the structure id (a foreign key from another table), key (char), value (char). The input that needs to be saved is an array of the above data. For example, I have some array objects like:
var lst = [];
var obj = {};
obj.id = 123;
obj.key = 'somekey';
obj.value = '1234';
lst.push(obj);

obj = {};
obj.id = 123;
obj.key = 'somekey1';
obj.value = '12345';
lst.push(obj);
In MS SQL, I would have created a TVP and passed it in. I don't know how to achieve this in Postgres.
So now what I want to do is save all the items from the list in a single query in Postgres, using the pg-promise library. I'm not able to find any documentation on this / understand it from the documentation. Any help appreciated. Thanks.
I am the author of pg-promise.
There are two ways to insert multiple records. The first, and most typical way is via a transaction, to make sure all records are inserted correctly, or none of them.
With pg-promise it is done in the following way:
db.tx(t => {
    const queries = lst.map(l => {
        return t.none('INSERT INTO table(id, key, value) VALUES(${id}, ${key}, ${value})', l);
    });
    return t.batch(queries);
})
    .then(data => {
        // SUCCESS
        // data = array of null-s
    })
    .catch(error => {
        // ERROR
    });
You initiate a transaction with method tx, then create all INSERT query promises, and then resolve them all as a batch.
The second approach is by concatenating all insert values into a single INSERT query, which I explain in detail in Performance Boost. See also: Multi-row insert with pg-promise.
For more examples see Tasks and Transactions.
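For reference, a minimal sketch of that single-query approach using the helpers namespace, with the lst array and columns from the question (table here stands in for your real table name):

const cs = new pgp.helpers.ColumnSet(['id', 'key', 'value'], {table: 'table'});

// generates one multi-row INSERT statement for the entire array
const query = pgp.helpers.insert(lst, cs);

db.none(query)
    .then(() => {
        // success
    })
    .catch(error => {
        // error
    });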
Addition
It is worth pointing out that in most cases we do not insert a record id, but rather have it generated automatically. Sometimes we want to get the new id-s back, and in other cases we don't care.
The examples above resolve with an array of null-s, because batch resolves with an array of individual results, and method none resolves with null, according to its API.
Let's assume that we want to generate the new id-s, and that we want to get them all back. To accomplish this we would change the code to the following:
db.tx(t => {
    const queries = lst.map(l => {
        return t.one('INSERT INTO table(key, value) VALUES(${key}, ${value}) RETURNING id',
            l, a => +a.id);
    });
    return t.batch(queries);
})
    .then(data => {
        // SUCCESS
        // data = array of new id-s;
    })
    .catch(error => {
        // ERROR
    });
i.e. the changes are:
we do not insert the id values
we replace method none with one, to get one row/object from each insert
we append RETURNING id to the query to get the value
we add a => +a.id to do the automatic row transformation. See also pg-promise returns integers as strings to understand what that + is for.
UPDATE-1
For a high-performance approach via a single INSERT query see Multi-row insert with pg-promise.
UPDATE-2
A must-read article: Data Imports.
I am trying to generate a response that returns the same collection sorted by 3 different columns. Here's the code I currently have:
var findRoute = router.route("/find")

findRoute.get(function(req, res) {
    Box.find(function(err, boxes) {
        res.json(boxes)
    }).sort("-itemCount");
});
As you can see, we're making a single GET request, querying for the Boxes, and then sorting them by itemCount at the end. This does not work for me because the request returns only a single JSON collection, sorted by itemCount.
What can I do if I want to return two more collections sorted by, say, name and size properties -- all in the same request?
Create an object to encapsulate the information and chain your find queries, like:
var findRoute = router.route("/find");
var json = {};

findRoute.get(function(req, res) {
    Box.find(function(err, boxes) {
        json.boxes = boxes;
        Collection2.find(function (error, coll2) {
            json.coll2 = coll2;
            Collection3.find(function (error, coll3) {
                json.coll3 = coll3;
                res.json(json);
            }).sort("-size");
        }).sort("-name");
    }).sort("-itemCount");
});
Just make sure to do the appropriate error checking.
This is kind of ugly and makes your code somewhat difficult to read. Try to adapt this logic using modules like async, or even promises (Q and bluebird are good examples).
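For example, a sketch of the same logic flattened with promises (this assumes a Mongoose version where exec() without a callback returns a promise):

findRoute.get(function (req, res) {
    Promise.all([
        Box.find().sort("-itemCount").exec(),
        Collection2.find().sort("-name").exec(),
        Collection3.find().sort("-size").exec()
    ]).then(function (results) {
        // one response carrying all three sorted collections
        res.json({ boxes: results[0], coll2: results[1], coll3: results[2] });
    }).catch(function (err) {
        // appropriate error checking here
        res.status(500).json(err);
    });
});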
If I understand correctly, you want something like this: Return several collections with mongodb
Tell me if that helps.
Bye.
Have you tried?

Box.find().sort("-itemCount").exec(function(err, boxes) {
    res.json(boxes)
});

Also, for sorting your results based on 2 or more fields, you can use:

.sort({name: 1, size: -1})
Let me know if that helps.
I read all the documentation, and this seemingly simple operation seems completely ignored throughout the entire README.
Currently, I am trying to run a SELECT query and console.log the results, but it simply returns a database object. How do I view the results of my query in the Node console?
exports.runDB = function() {
    db.serialize(function() {
        console.log(db.run('SELECT * FROM archive'));
    });
    db.close();
}
run does not have retrieval capabilities. You need to use all, each, or get.
According to the documentation for all:
Note that it first retrieves all result rows and stores them in
memory. For queries that have potentially large result sets, use the
Database#each function to retrieve all rows or Database#prepare
followed by multiple Statement#get calls to retrieve a previously
unknown amount of rows.
As an illustration:
db.all('SELECT url, rowid FROM archive', function(err, table) {
    console.log(table);
});
That will return all entries in the archive table as an array of objects.
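And for potentially large result sets, the row-by-row alternative that the quoted documentation mentions looks like this with Database#each (same archive table as above):

// the callback runs once per row, rather than once with the whole table
db.each('SELECT url, rowid FROM archive', function(err, row) {
    console.log(row);
});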