I'm trying to run two parameterised insert queries using node-postgres: the first one specifies the primary key column, the second doesn't.
The second query, even though it doesn't specify the primary key column, fails with a duplicate primary key error.
My pg table:
CREATE TABLE teams (
id serial PRIMARY KEY,
created_by int REFERENCES users,
name text,
logo text
);
Code that reproduces this issue:
var pg = require('pg');
var insertWithId = 'INSERT INTO teams(id, name, created_by) VALUES($1, $2, $3) RETURNING id';
var insertWithoutId = 'INSERT INTO teams(name, created_by) VALUES($1, $2) RETURNING id';
pg.connect(process.env.POSTGRES_URI, function (err, client, releaseClient) {
client.query(insertWithId, [1, 'First Team', 1], function (err, result) {
releaseClient();
if (err) {
throw err;
}
console.log('first team created');
});
});
pg.connect(process.env.POSTGRES_URI, function (err, client, releaseClient) {
client.query(insertWithoutId, ['Second Team', 1], function (err, result) {
releaseClient();
if (err) {
console.log(err);
}
});
});
And the output of running this:
first team created
{ [error: duplicate key value violates unique constraint "teams_pkey"]
name: 'error',
length: 173,
severity: 'ERROR',
code: '23505',
detail: 'Key (id)=(1) already exists.',
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: 'public',
table: 'teams',
column: undefined,
dataType: undefined,
constraint: 'teams_pkey',
file: 'nbtinsert.c',
line: '406',
routine: '_bt_check_unique' }
From what I gather reading the node-postgres source, parameterised queries are treated as prepared queries, which get cached if they reuse a name parameter; but from digging around its source, it doesn't seem to think that my queries have a name property.
Does anyone have any ideas on how this could be avoided?
The first insert supplies a value for id, so the serial is not incremented: the sequence behind it is still at 1 after the first insert. The second insert does not supply a value for id, so the serial (= 1) is used, which is a duplicate. The best solution is to only use the second statement, and let the application use the returned id, if needed.
In short: don't interfere with serials.
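A minimal sketch of that approach, reusing the callback style from the question (error handling trimmed):
// Let the serial assign the id, and read it back via RETURNING.
var insert = 'INSERT INTO teams(name, created_by) VALUES($1, $2) RETURNING id';
client.query(insert, ['First Team', 1], function (err, result) {
  if (err) { throw err; }
  console.log('team created with id', result.rows[0].id);
});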
If you need to correct the next value for a sequence, you can use something like the statement below.
SELECT setval('teams_id_seq', (SELECT MAX(id) FROM teams));
A single row can be inserted like this:
client.query("insert into tableName (name, email) values ($1, $2)", ['john', 'john@gmail.com'], callBack)
This approach automatically escapes any special characters.
How do I insert multiple rows at once?
I need to implement this:
"insert into tableName (name, email) values ('john', 'john@gmail.com'), ('jane', 'jane@gmail.com')"
I could just use JS string operations to compile such rows manually, but then I'd need to handle escaping of special characters somehow.
Use pg-format like below.
var format = require('pg-format');
var values = [
[7, 'john22', 'john22@gmail.com', '9999999922'],
[6, 'testvk', 'testvk@gmail.com', '88888888888']
];
client.query(format('INSERT INTO users (id, name, email, phone) VALUES %L', values),[], (err, result)=>{
console.log(err);
console.log(result);
});
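As a usage note, pg-format also supports %I for escaping identifiers (while %s inserts a plain, unquoted string), so a dynamic table name can be handled too; the table name here is hypothetical:
// %I escapes the identifier; %L expands the nested array into ('…','…'),('…','…') groups.
var sql = format('INSERT INTO %I (id, name, email, phone) VALUES %L', 'users', values);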
One other way using PostgreSQL json functions:
client.query('INSERT INTO table (columns) ' +
'SELECT m.* FROM json_populate_recordset(null::your_custom_type, $1) AS m',
[JSON.stringify(your_json_object_array)], function(err, result) {
if (err) {
console.log(err);
} else {
console.log(result);
}
});
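For concreteness, a minimal sketch of that approach against the teams table from the first question; the composite type name team_input is an assumption, created once up front:
// Run once: CREATE TYPE team_input AS (name text, created_by int);
var teams = [
  { name: 'First Team', created_by: 1 },
  { name: 'Second Team', created_by: 1 }
];
client.query(
  'INSERT INTO teams (name, created_by) ' +
  'SELECT m.name, m.created_by FROM json_populate_recordset(null::team_input, $1) AS m',
  [JSON.stringify(teams)],
  function (err, result) {
    if (err) { console.log(err); } else { console.log(result.rowCount + ' rows inserted'); }
  }
);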
Following this article, Performance Boost from the pg-promise library, and its suggested approach:
// Concatenates an array of objects or arrays of values, according to the template,
// to use with insert queries. Can be used either as a class type or as a function.
//
// template = formatting template string
// data = array of either objects or arrays of values
function Inserts(template, data) {
if (!(this instanceof Inserts)) {
return new Inserts(template, data);
}
this.rawType = true;
this.toPostgres = function () {
return data.map(d=>'(' + pgp.as.format(template, d) + ')').join(',');
};
}
An example of using it, exactly as in your case:
var users = [['John', 23], ['Mike', 30], ['David', 18]];
db.none('INSERT INTO Users(name, age) VALUES $1', Inserts('$1, $2', users))
.then(data=> {
// OK, all records have been inserted
})
.catch(error=> {
// Error, no records inserted
});
And it will work with an array of objects as well:
var users = [{name: 'John', age: 23}, {name: 'Mike', age: 30}, {name: 'David', age: 18}];
db.none('INSERT INTO Users(name, age) VALUES $1', Inserts('${name}, ${age}', users))
.then(data=> {
// OK, all records have been inserted
})
.catch(error=> {
// Error, no records inserted
});
UPDATE-1
For a high-performance approach via a single INSERT query see Multi-row insert with pg-promise.
UPDATE-2
The information here is quite old now; see the latest syntax for Custom Type Formatting. What used to be _rawDBType is now rawType, and formatDBType was renamed to toPostgres.
You are going to have to generate the query dynamically. Although possible, this is risky and could easily lead to SQL injection vulnerabilities if you do it wrong. It's also easy to end up with off-by-one errors between the index of your parameters in the query and the parameters you're passing in.
That being said, here is an example of how you could write this, assuming you have an array of users that looks like {name: string, email: string}:
client.query(
  // node-postgres uses numbered $1, $2, … placeholders, so generate one pair per row
  `INSERT INTO table_name (name, email) VALUES ${users.map((_, i) => `($${2 * i + 1}, $${2 * i + 2})`).join(',')}`,
  users.reduce((params, u) => params.concat([u.name, u.email]), []),
  callBack,
)
An alternative approach is to use a library like @databases/pg (which I wrote):
await db.query(sql`
INSERT INTO table_name (name, email)
VALUES ${sql.join(users.map(u => sql`(${u.name}, ${u.email})`), ',')}
`)
@databases requires the query to be tagged with sql and uses that to ensure any user data you pass is always automatically escaped. This also lets you write the parameters inline, which I think makes the code much more readable.
Using the npm module postgres (porsager/postgres), which has Tagged Template Strings at its core:
https://github.com/porsager/postgres#multiple-inserts-in-one-query
const users = [{
name: 'Murray',
age: 68,
garbage: 'ignore'
},
{
name: 'Walter',
age: 80,
garbage: 'ignore'
}]
sql`insert into users ${ sql(users, 'name', 'age') }`
// Is translated to:
insert into users ("name", "age") values ($1, $2), ($3, $4)
// Here you can also omit column names which will use all object keys as columns
sql`insert into users ${ sql(users) }`
// Which results in:
insert into users ("name", "age", "garbage") values ($1, $2, $3), ($4, $5, $6)
Just thought I'd post this since it's brand new out of beta, and I've found it to have a better philosophy as a SQL library. IMHO it would be preferable over the other postgres/node libraries posted in other answers.
I know I am late to the party, but what worked for me was a simple map.
I hope this will help someone looking for the same thing.
let sampleQuery = array.map(myRow =>
`('${myRow.column_a}','${myRow.column_b}') `
)
let res = await pool.query(`INSERT INTO public.table(column_a, column_b) VALUES ${sampleQuery} `)
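Note that interpolating values straight into the SQL string like this is open to SQL injection if the rows come from user input. A minimal sketch of the same insert with numbered placeholders instead (same column names as above):
// Build "($1, $2), ($3, $4), …" plus a flat parameter list, so pg escapes the values.
let placeholders = array.map((_, i) => `($${2 * i + 1}, $${2 * i + 2})`).join(', ');
let params = array.flatMap(myRow => [myRow.column_a, myRow.column_b]);
let res = await pool.query(`INSERT INTO public.table(column_a, column_b) VALUES ${placeholders}`, params);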
client.query("insert into tableName (name, email) values ($1, $2),($3, $4)", ['john', 'john@gmail.com', 'john', 'john@gmail.com'], callBack)
Doesn't that help?
Furthermore, you can manually generate the query string:
"insert into tableName (name, email) values (" + var1 + "," + var2 + "),(" + var3 + "," + var4 + ")"
If you read https://github.com/brianc/node-postgres/issues/530, you can see the same implementation.
I'm working on a web app using React, Node, Express, Massive, and PostgreSQL, and I'm having trouble performing one specific query:
SELECT
COUNT(DISTINCT cu.user_id) AS ucount,
COUNT(DISTINCT p.project_id) AS pcount,
COUNT(DISTINCT t.task_id) AS tcount,
c.clique_id, c.clique_name, c.admin_id, c.created_on
FROM cliques c
FULL OUTER JOIN cliques_users cu ON cu.clique_id = c.clique_id
FULL OUTER JOIN users u ON u.user_id = cu.user_id
FULL OUTER JOIN projects p ON p.clique_id = cu.clique_id
FULL OUTER JOIN tasks t ON t.clique_id = cu.clique_id
WHERE c.clique_id IN (
SELECT cu.clique_id FROM cliques_users cu WHERE cu.user_id = 3
)
GROUP BY c.clique_id;
Note: I'm using 3 in the subquery just for testing.
I'm using Postico to test my SQL statements, and this query returns the results I expect. But when this is done on the app itself, requesting the data by hitting an endpoint, the server throws an error:
{ error: function count(integer) does not exist at ...
name: 'error',
length: 225,
severity: 'ERROR',
code: '42883',
detail: undefined,
hint: 'No function matches the given name and argument types. You might need to add explicit type casts.',
position: '55',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'parse_func.c',
line: '528',
routine: 'ParseFuncOrColumn' }
The callback function that runs when the endpoint is hit looks like this:
(req, res, next) => {
req.app.get( 'db' )
.clique.getCliqueSummaryQuery( req.params.user_id )
.then( response => {
res.status(200).json(response);
})
.catch( err => {
console.log( 'getMyCliquesInfo failed: ', err );
res.status(500).json( err );
});
}
getCliqueSummaryQuery() runs the query, passing a URL parameter as a variable that replaces the 3 I've hardcoded for testing. The error occurs with both the variable and the hardcoded value. I've copied the query straight from Postico to my SQL file.
Anyone know why it works one way and not the other?
I have one field that holds comma-separated IDs, and I want to search for a given ID within it. Here is my code:
.get(function(req, res) {
knex.select('*')
.from('exam')
.whereRaw('? = any(regexp_split_to_array(student_id))', [req.params.id])
.then(function(rows) {
//return res.send(rows);
console.log(rows);
})
.catch(function(error) {
console.log(error)
});
});
While using Knex, it gives an error like this:
{ error: function regexp_split_to_array(text) does not exist
name: 'error',
length: 220,
severity: 'ERROR',
code: '42883',
detail: undefined,
hint: 'No function matches the given name and argument types. You might need to add explicit type casts.',
position: '37',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'parse_func.c',
line: '523',
routine: 'ParseFuncOrColumn'
}
In the student_id column I have IDs like this: 33,34,35,36.
In req.params.id I get only one single ID, e.g. 35.
So I want the rows which include the ID 35, in the same table.
That is, I want only two rows (2, 3), because they include ID = 35.
Assuming you are using a PostgreSQL database (I saw phpPgAdmin in the screenshot), you can use the regexp_split_to_array function to convert your string to an array (obviously :) and then search the resulting array using any.
In SQL terms, it can be written like this:
select '35' = any(regexp_split_to_array('33,34,35,36', E','));
In your query, you can replace your .whereRaw call with
.whereRaw("? = any(regexp_split_to_array(student_id, E','))", [req.params.id])
But keep in mind, this can be a performance-heavy request, as the string split runs for every row. A better way of doing this (assuming your project needs to keep array values in one row) is to store student_id as an array type, add a GIN index on the student_id column, and perform search operations like this:
select * from table where student_id @> '{35}';
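A minimal sketch of that migration and the resulting Knex lookup, assuming the column can safely be converted to int[] (the index name is hypothetical):
// One-time migration: convert the CSV column to int[] and index it.
await knex.raw(
  "ALTER TABLE exam ALTER COLUMN student_id TYPE int[] " +
  "USING regexp_split_to_array(student_id, E',')::int[]"
);
await knex.raw('CREATE INDEX exam_student_id_gin ON exam USING gin (student_id)');
// The lookup then becomes an indexable array-containment test.
const rows = await knex.select('*')
  .from('exam')
  .whereRaw('student_id @> ?::int[]', [`{${req.params.id}}`]);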
I am trying to have my code INSERT a row into my table called thoughtentries. It is in the public schema. I am able to run this command while connected to my database using psql:
INSERT INTO thoughtentries VALUES('12/17/2016 14:10', 'hi');
The first column is of character type with length 17. The second column is of type text.
When I have my code attempt to INSERT using the same command above, I get this error in my log:
ERROR: relation "thoughtentries" does not exist at character 13
STATEMENT: INSERT INTO thoughtentries VALUES('12/17/2016 14:11', 'hi');
I am using pg and pg-format to format the command. Here is my code to do this:
client.connect(function (err) {
if (err) throw err
app.listen(3000, function () {
console.log('listening on 3000')
})
var textToDB = format('INSERT INTO thoughtentries VALUES(%s, %s);', timestamp, "'hi'")
client.query(textToDB, function (err, result) {
if (err) {
console.log(err)
}
console.log(result)
client.end(function (err) {
if (err) throw err
})
})
})
How do I go about fixing this?
Have you verified that the table was, in fact, created in the public schema?
SELECT *
FROM information_schema.tables
WHERE table_name = 'thoughtentries';
Once you have verified that, I see two possible explanations remaining:
You are connecting to a different database by mistake. Verify, in the same session, with:
select current_database();
Your search_path setting does not include the public schema. If that's the case, you can schema-qualify the table to fix it: public.thoughtentries (as shown in the sketch below).
How does the search_path influence identifier resolution and the "current schema"
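A minimal sketch of the schema-qualified insert applied to the code from the question; it also switches to placeholders, since pg-format's %s inserts values unquoted (which is why the question pre-quotes 'hi'):
client.query(
  'INSERT INTO public.thoughtentries VALUES ($1, $2)',
  [timestamp, 'hi'],
  function (err, result) {
    if (err) { console.log(err); }
    else { console.log(result.rowCount + ' row inserted'); }
  }
)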
Aside: Save timestamps as data type timestamp, not character(17).
Actually, don't use character(n) at all:
Any downsides of using data type "text" for storing strings?