I'd like to pass dictionaries with column names as keys, thus avoiding declaring the column names within the query itself (typing them directly).
Assume I have a table User with two columns:
idUser (INT)
fullName (VARCHAR)
To create a record using node-postgres, I need to declare the column names within the query, like so:
var idUser = 2;
var fullName = "John Doe";

var query = 'INSERT INTO User(idUser, fullName) VALUES ($1, $2)';
database.query(query, [idUser, fullName], function(error, result) {
    callback(error, result.rows);
    database.end();
});
I'd prefer it if there were a way to just pass a dictionary and have it infer the column names from the keys. If there's an easy trick, I'd like to hear it.
E.g. something like this:
var values = {
    idUser: 2,
    fullName: "John Doe"
};

var query = 'INSERT INTO User VALUES ($1)';
database.query(query, [values], function(error, result) {
    callback(error, result.rows);
    database.end();
});
A complete example of doing it with pg-promise:
const pgp = require('pg-promise')(/*options*/);

const cn = 'postgres://username:password@host:port/database';
const db = pgp(cn);

const values = {
    idUser: 2,
    fullName: 'John Doe'
};
// generating the insert query:
const query = pgp.helpers.insert(values, null, 'User');
//=> INSERT INTO "User"("idUser","fullName") VALUES(2,'John Doe')
db.none(query)
    .then(data => {
        // success;
    })
    .catch(error => {
        // error;
    });
And with a focus on high performance, it would change to this:
// generating a set of columns from the object (only once):
const cs = new pgp.helpers.ColumnSet(values, {table: 'User'});
// generating the insert query:
const query = pgp.helpers.insert(values, cs);
//=> INSERT INTO "User"("idUser","fullName") VALUES(2,'John Doe')
There's no support for key-value pairs in the INSERT statement, so it cannot be done with native SQL.
However, the node-postgres extras page mentions multiple SQL generation tools; for example, Squel.js parameters can be used to construct SQL in a way very close to what you're looking for:
squel.insert()
    .into("User")
    .setFieldsRows([
        { idUser: 2, fullName: "John Doe" }
    ])
    .toParam()

// => { text: 'INSERT INTO User (idUser, fullName) VALUES (?, ?)',
//      values: [ 2, 'John Doe' ] }
My case was a bit special, as I had a field named order in the JSON object, which is a reserved keyword in SQL. Therefore I had to wrap every column name in double quotes using a JSONify() function.
Also note the numberedParameters argument as well as the double quotes around the 'Messages' string.
import { insert } from 'squel'; // assuming squel's insert() is imported like this
import { pool } from './connection';

function JSONify(obj: Record<string, any>) {
    const o: Record<string, any> = {};
    for (const key in obj) {
        o['"' + key + '"'] = obj[key]; // wrap each column name in double quotes
    }
    return o;
}
// I have a table named "Messages" with the columns order and name
// I also supply the createdAt and updatedAt timestamps just in case
const messages = [
    {
        order: 0,
        name: 'Message with index 0',
        createdAt: new Date().toISOString(),
        updatedAt: new Date().toISOString(),
    },
];
// Create the insert statement
const insertStatement = insert({ numberedParameters: true })
    .into('"Messages"')
    .setFieldsRows(messages.map((message) => JSONify(message)))
    .toParam();

console.log(insertStatement);
// Notice the quotes wrapping the table and column names
// => { text: 'INSERT INTO "Messages" ("order", "name", "createdAt", "updatedAt") VALUES ($1, $2, $3, $4)',
//      values: [ 0, 'Message with index 0', '2022-07-22T13:51:27.679Z', '2022-07-22T13:51:27.679Z' ] }

// Create
await pool.query(insertStatement.text, insertStatement.values);
See the Squel documentation for more details.
And this is how I create the pool object if anyone is curious.
import { Pool } from 'pg';
import { DB_CONFIG } from './config';
export const pool = new Pool({
    user: DB_CONFIG[process.env.NODE_ENV].username,
    host: DB_CONFIG[process.env.NODE_ENV].host,
    database: DB_CONFIG[process.env.NODE_ENV].database,
    password: DB_CONFIG[process.env.NODE_ENV].password,
    port: DB_CONFIG[process.env.NODE_ENV].port,
});
I have a simple SELECT query that is returning an unusable result. I am using pg-promise in node.js
[
    {
        "function_name": "(f10d1988-4db5-49de-97ab-0c8b15bedfa7,image.jpg,Image)"
    },
    {
        "function_name": "(f10d1988-4db5-49de-97ab-0c8b15bedfa7,image2.jpg,Image 2)"
    }
]
but I was expecting a basic json structure like
[
    {
        id: '',
        title: '',
        image: ''
    },
    { ...etc }
]
Why is it doing this? How do I get a normalized result?
My query looks like the below:
CREATE OR REPLACE FUNCTION get_photos(
    title_param TEXT
)
RETURNS TABLE(
    id UUID,
    image varchar(200),
    title varchar(200)
) AS
$func$
BEGIN
    RETURN QUERY SELECT
        i.id,
        i.image,
        i.title
    FROM images AS i
    WHERE i.title = title_param;
END;
$func$ LANGUAGE PLPGSQL;
Here is my db connector setup, almost all defaults.
require('dotenv').config();
const Promise = require('bluebird');
const pg = require('pg-promise')({
    promiseLib: Promise
});

const config = {
    user: process.env.USER,
    host: process.env.HOST,
    database: process.env.DATABASE,
    password: process.env.PASSWORD
};

const db = pg(config);
export default db;
Here is the express endpoint that is calling the function:
export const getData = async (req, res) => {
    const { title } = req.query;
    let data;
    try {
        data = await db.many('SELECT function_name($1)', [title]);
    } catch (err) {
        data = err;
    }
    res.send(data);
};
EDIT
I ran the query manually instead of through a function and the data returned correctly which means that there is an issue with my TABLE() return. What could possibly cause this issue?
images = await db.many(`
    SELECT
        p.id,
        p.img,
        p.type,
        p.title
    FROM photos p
    WHERE p.type = '${type}';
`, [type]);
Because the function is defined as returning a table, you need to use it like a table:
SELECT * FROM function_name($1)
Use func as the query method:
data = await db.func('function_name', [title]);
It assumes you return a table, and so will work for you by default.
And for stored procedures, there's the proc method.
Also, your parameter formatting for the images query is wrong, see Named Parameters:
IMPORTANT: Never use the reserved ${} syntax inside ES6 template strings ...
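To illustrate why, here's a tiny stand-in for named-parameter substitution. This is NOT pg-promise's real implementation (which also handles escaping and many more types); the point is that the library must receive the literal ${type} text, which only survives when the query is written with regular quotes rather than backticks:

```javascript
// Illustrative stand-in only - real substitution/escaping is done by pg-promise.
function formatNamed(query, values) {
    return query.replace(/\$\{(\w+)\}/g, (match, name) => {
        const v = values[name];
        return typeof v === 'string' ? "'" + v.replace(/'/g, "''") + "'" : String(v);
    });
}

// Single quotes here, so ${type} reaches the formatter as literal text;
// inside a backtick template string, JavaScript would swallow it first.
const sql = formatNamed(
    'SELECT p.id, p.title FROM photos p WHERE p.type = ${type}',
    { type: 'Image' }
);
console.log(sql); // SELECT p.id, p.title FROM photos p WHERE p.type = 'Image'
```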
I just started using GraphQL with MySQL, and I would like to know if it is possible to use a name in the GraphQL query that differs from the column name in my database.
For example, I have a table users with the columns userName and password. When I define the type for the schema, I have the following:
const unidadesMedidaInternaType = new GraphQLObjectType({
    name: 'unidadesMedidaInterna',
    fields: () => ({
        userName: { type: GraphQLID },
        password: { type: GraphQLString }
    })
});
the resolver:
resolve (parent, args) {
    return pool.query(`SELECT * FROM users`);
}
So I have to query like this:
{
    users {
        userName,
        password
    }
}
I would like to have different names in the query, like this:
{
    users {
        Name,
        secret
    }
}
I tried changing the names of the fields in the type definition, but the result of the query is full of null values.
In order to have different names in the queries, you have two options:
Option 1: Use aliases to run the query:
You can run your query with aliases like
{
    users {
        Name: userName,
        secret: password
    }
}
In this case you are just renaming the fields at execution time, so the original names will still be available to query.
Option 2: Map the query result to the GraphQLObject type.
First rename the fields:
const unidadesMedidaInternaType = new GraphQLObjectType({
    name: 'unidadesMedidaInterna',
    fields: () => ({
        Name: { type: GraphQLID },
        secret: { type: GraphQLString }
    })
});
Then map the result of the query to match the fields:
async resolve (parent, args) {
    // pool.query is asynchronous, so await the result before reading fields
    const result = await pool.query(`SELECT * FROM users`);
    // If the result of the query is an array, then you have to map its items
    return { Name: result.userName, secret: result.password };
}
I'm trying to batch together a database update to re-populate a mongo collection. I've created an object to hold the properties needed to lookup the data from an external source, and then add it back to a MongoDb collection.
The array looks like this:
const pops = [
    {
        table: 'SFAccounts',
        label: 'Account__c',
        createListName: 'Accounts'
    },
    {
        table: 'SFTimes',
        label: 'CusTime__c',
        createListName: 'Time'
    }
];
I then want to create a function that takes 'table', 'label', and 'createListName', and does something basically like this:
async function processData(table, label, createListName) {
    // Get some info from Salesforce
    const dataFromSF = await getMetaDataFromSalesForce(table)
    // Extract the parts I actually need
    const relevantBits = dataFromSF.filter(field => field.name === label)
    // Create a new list in the db
    const createResult = await List.create({ name: createListName, values: relevantBits })
    return createResult
}
The end goal is to get to something like
await Promise.all(processData(pops))
which will await all the tables being pulled and populated into the database.
If you change the args of processData:
async function processData({ table, label, createListName }) {
    // Get some info from Salesforce
    const dataFromSF = await getMetaDataFromSalesForce(table)
    // Extract the parts I actually need
    const relevantBits = dataFromSF.filter(field => field.name === label)
    // Create a new list in the db
    const createResult = await List.create({ name: createListName, values: relevantBits })
    return createResult
}
it's just await Promise.all(pops.map(processData));
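Put together, with the Salesforce and database work replaced by a placeholder so the pattern itself is visible (getMetaDataFromSalesForce and List.create are stubbed out here):

```javascript
// Stub: the real version would call getMetaDataFromSalesForce and List.create.
async function processData({ table, label, createListName }) {
    return createListName + ' <- ' + table + '.' + label;
}

const pops = [
    { table: 'SFAccounts', label: 'Account__c', createListName: 'Accounts' },
    { table: 'SFTimes', label: 'CusTime__c', createListName: 'Time' }
];

// One promise per entry, all awaited together; results come back in order.
Promise.all(pops.map(processData)).then(results => {
    console.log(results);
    // [ 'Accounts <- SFAccounts.Account__c', 'Time <- SFTimes.CusTime__c' ]
});
```

Note that each element of pops is passed as the single argument to processData, which the destructuring signature then unpacks.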
I have a large dataset that I want to insert into a postgres db, I can achieve this using pg-promise like this
function batchUpload(req, res, next) {
    var data = req.body.data;

    var cs = new pgp.helpers.ColumnSet(['firstname', 'lastname', 'email'], { table: 'customer' });
    var query = pgp.helpers.insert(data, cs);

    db.none(query)
        .then(data => {
            // success;
        })
        .catch(error => {
            // error;
            return next(error);
        });
}
The dataset is an array of objects like this:
[
    {
        firstname: 'Lola',
        lastname: 'Solo',
        email: 'mail@solo.com',
    },
    {
        firstname: 'hello',
        lastname: 'world',
        email: 'mail@example.com',
    },
    {
        firstname: 'mami',
        lastname: 'water',
        email: 'mami@example.com',
    }
]
The challenge is that I have a column added_at, which isn't included in the dataset and cannot be null. How do I add a timestamp for each record insertion to the query?
As per the ColumnConfig syntax:
const col = {
    name: 'added_at',
    def: () => new Date() // default to the current Date/Time
};

const cs = new pgp.helpers.ColumnSet(['firstname', 'lastname', 'email', col], { table: 'customer' });
Alternatively, you can define it in a number of other ways, as ColumnConfig is very flexible.
Example:
const col = {
    name: 'added_at',
    mod: ':raw', // use raw-text modifier, to inject the string directly
    def: 'now()' // use now() for the column
};
or you can use property init to set the value dynamically:
const col = {
    name: 'added_at',
    mod: ':raw', // use raw-text modifier, to inject the string directly
    init: () => {
        return 'now()';
    }
};
See the ColumnConfig syntax for details.
P.S. I'm the author of pg-promise.
If I were to perform this query with mongoose:
Schema.find({
    _id: {
        $in: ['abcd1234', 'abcd1234', 'abcd1234']
    }
});
The query will only return something like:
[{
    'property1': 'key1',
    'property2': 'key2'
}]
With the array only having one object, obviously because I passed in all the same IDs. However, I actually want the duplicate objects returned. How can I do this?
Mongo itself will only return objects with no duplicates. But you can then build an array of objects with duplicates from that.
For example, if array is the array of objects returned by Mongo - in this case:
var array = [{
    _id: 'abcd1234',
    property1: 'key1',
    property2: 'key2'
}];
and ids is your list of IDs that you want with duplicates - in your case:
var ids = ['abcd1234', 'abcd1234', 'abcd1234'];
then you can do:
var objects = {};
array.forEach(o => objects[o._id] = o);
var dupArray = ids.map(id => objects[id]);
Now dupArray should contain the objects with duplicates.
Full example:
var ids = ['abcd1234', 'abcd1234', 'abcd1234'];
Schema.find({_id: {$in: ids}}, function (err, array) {
    if (err) {
        // handle error
    } else {
        var objects = {};
        array.forEach(o => objects[o._id] = o);
        var dupArray = ids.map(id => objects[id]);
        // here you have objects with duplicates in dupArray:
        console.log(dupArray);
    }
});