sequelize findAll sort order in nodejs

I'm trying to fetch a list of objects from the database with Sequelize as follows, and I want the results to come back in the same order as the ids I pass in the where clause.
exports.getStaticCompanies = function () {
return Company.findAll({
where: {
id: [46128, 2865, 49569, 1488, 45600, 61991, 1418, 61919, 53326, 61680]
},
attributes: ['id', 'logo_version', 'logo_content_type', 'name', 'updated_at']
});
};
But the problem is that after rendering, the data comes back in the following order.
46128, 53326, 2865, 1488, 45600, 61680, 49569, 1418, ....
As far as I can tell, it's sorted neither by id nor by name. Please help me figure out how to solve this.

In Sequelize you can easily add ORDER BY clauses.
exports.getStaticCompanies = function () {
return Company.findAll({
where: {
id: [46128, 2865, 49569, 1488, 45600, 61991, 1418, 61919, 53326, 61680]
},
// Add order conditions here....
order: [
['id', 'DESC'],
['name', 'ASC'],
],
attributes: ['id', 'logo_version', 'logo_content_type', 'name', 'updated_at']
});
};
See how I've added the order array?
order: [
['COLUMN_NAME_EXAMPLE', 'ASC'], // Sorts by COLUMN_NAME_EXAMPLE in ascending order
],
Edit:
You might have to order the objects once they've been received inside the .then() promise. Check out this question about ordering an array of objects based on a custom order:
How do I sort an array of objects based on the ordering of another array?
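For instance, a minimal sketch of that post-fetch reordering (assuming ids is the same array you passed to the where clause):
var ids = [46128, 2865, 49569, 1488, 45600, 61991, 1418, 61919, 53326, 61680];
Company.findAll({ where: { id: ids } }).then(function (companies) {
  // Reorder the fetched rows to match each id's position in the original array
  companies.sort(function (a, b) {
    return ids.indexOf(a.id) - ids.indexOf(b.id);
  });
  return companies;
});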

If you want to sort data in either ascending or descending order based on a particular column using Sequelize, use the order option with sequelize.literal as follows:
// Orders by the specified column and direction
order: sequelize.literal('column_name direction')
e.g. order: sequelize.literal('timestamp DESC')
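A rough sketch of where that fits, assuming the Company model from the question and ordering by its updated_at column:
Company.findAll({
  // Newest rows first; the literal string is passed straight into ORDER BY
  order: sequelize.literal('updated_at DESC')
});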

If you are using MySQL, you can use the ORDER BY FIELD(id, ...) approach:
Company.findAll({
where: {id : {$in : companyIds}},
order: sequelize.literal("FIELD(company.id,"+companyIds.join(',')+")")
})
Keep in mind that it might be slow, but it should still be faster than sorting manually in JS.
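Note that the $in alias was removed in Sequelize v5 (unless operator aliases are re-enabled); in newer versions the same idea might look like this sketch, assuming companyIds holds only numeric ids since they are interpolated into the literal:
const { Op } = require('sequelize');

Company.findAll({
  where: { id: { [Op.in]: companyIds } },
  // FIELD() returns each id's position in the list, and MySQL sorts by that position
  order: sequelize.literal('FIELD(company.id,' + companyIds.join(',') + ')')
});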

You can accomplish this in a very back-handed way with the following code:
exports.getStaticCompanies = function () {
var ids = [46128, 2865, 49569, 1488, 45600, 61991, 1418, 61919, 53326, 61680]
return Company.findAll({
where: {
id: ids
},
attributes: ['id', 'logo_version', 'logo_content_type', 'name', 'updated_at'],
order: sequelize.literal('(' + ids.map(function(id) {
return '"Company"."id" = \'' + id + '\'');
}).join(', ') + ') DESC')
});
};
This is somewhat limited because it's got very bad performance characteristics past a few dozen records, but it's acceptable at the scale you're using.
This will produce a SQL query that looks something like this:
[...] ORDER BY ("Company"."id"='46128', "Company"."id"='2865', "Company"."id"='49569', [...])

This may be a little late, but I want to mention another approach.
Sorting based on [46128, 2865, 49569, 1488, 45600, 61991, 1418, 61919, 53326, 61680] can be done using the ARRAY_POSITION function of PostgreSQL.
const arr = [46128, 2865, 49569, 1488, 45600, 61991, 1418, 61919, 53326, 61680];
const ord = [sequelize.literal(`ARRAY_POSITION(ARRAY[${arr}]::integer[], "id")`)];
return Company.findAll({
where: {
id: arr
},
attributes: ['id', 'logo_version', 'logo_content_type', 'name', 'updated_at'],
order: ord,
});

I don't think this is possible in Sequelize's order clause, because as far as I can tell, those clauses are meant to be binary operations applicable to every element in your list. (This makes sense, too, as it's generally how sorting a list works.)
So an order clause can do something like order a list by repeatedly asking "which of these 2 elements is older?", whereas your ordering is not reducible to a binary comparison (compare_bigger(1,2) => 2) but is just an arbitrary sequence (2,4,11,2,9,0).
When I hit this issue with findAll, here was my solution (sub in your returned results for numbers):
var numbers = [2, 20, 23, 9, 53];
var orderIWant = [2, 23, 20, 53, 9];
orderIWant.map(x => { return numbers.find(y => { return y === x })});
Which returns [2, 23, 20, 53, 9]. I don't think there's a better tradeoff we can make. You could iterate in place over your ordered ids with findOne, but then you're doing n queries when 1 will do.

By default, databases order their output by the natural ordering of the values in the ORDER BY fields.
If your desired order is different, you can add an order_field to the SELECT and give it a value based upon the value of id:
case
when id=46128 then 0
when id=2865 then 1
when id=49569 then 2
end as order_field
and order by order_field.
If there are lots of values, you can insert them in their original order into a temporary table with an identity primary key order_field, and inner join your select to that temporary table on your value field, ordering by order_field.
I don't know how to do this in Sequelize, but I found answers here on how it does the things I needed.
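For what it's worth, here is a hedged sketch of how such a CASE expression could be passed through Sequelize with sequelize.literal, assuming the Company model from the question:
Company.findAll({
  where: { id: [46128, 2865, 49569] },
  // Map each id to its desired position, then sort by that computed value
  order: sequelize.literal(
    'CASE WHEN id = 46128 THEN 0 WHEN id = 2865 THEN 1 WHEN id = 49569 THEN 2 END'
  )
});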

This worked for me when I used double quotes ("") surrounding the property name.
For debugging, you can see what query Sequelize generates and then try running it directly in the DB console.
In my case I was not able to sort the data by the updatedAt column.
Code snippet:
exports.readAll = (req, res) => {
console.log("Inside ReadAll Data method");
let data;
if (!req.body) {
data = CompanyModel.findAll({ order: [[sequelize.literal('"updatedAt"'), 'DESC']]});
} else {
data = CompanyModel.findAll({ order: [[sequelize.literal('"updatedAt"'), 'DESC']]});
}
data
.then((data) => {
res.send(
data
);
})
.catch((err) => {
res.status(500).send({
message: err.message || "Some error occurred while retrieving data.",
});
});
};
SQL query generated in my case:
Inside ReadAll Data method
Executing (default): SELECT "company_name", "company_id", "on_record", "createdAt", "updatedAt" FROM "companies" AS "company" ORDER BY "updatedAt" DESC;
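If you are not sure where the generated query shows up, Sequelize prints it through its logging option; a minimal sketch, assuming database, username and password are your own connection details:
const sequelize = new Sequelize(database, username, password, {
  dialect: 'postgres',
  // Log every generated SQL statement to the console
  logging: console.log
});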

In case anyone is using the uuid type, here is the example by @Agniveer from above, slightly modified:
const ids = [
'f01a057e-5646-4527-a219-336804317246',
'ee900087-4910-42b4-a559-06aea7b4e250',
'b363f116-1fc5-473a-aed7-0ceea9beb14d'
];
const idsFormat = `'${ids.join("','")}'`;
const order = [sequelize.literal(`ARRAY_POSITION(ARRAY[${idsFormat}]::uuid[], "<insert_table_name>"."id")`)];
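Putting that together, a sketch assuming the same Company model as above with a uuid id column:
return Company.findAll({
  where: { id: ids },
  attributes: ['id', 'logo_version', 'logo_content_type', 'name', 'updated_at'],
  order
});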

Related

How to update entities in a table from an array?

I'm trying to figure out how to update many elements at once. Suppose I have the following array:
[
{
id: 100,
order: 1,
},
{
id: 101,
order: 2,
},
{
id: 102,
order: 3,
},
]
I then transform this array, replacing the values of order. The resulting array becomes the following:
[
{
id: 102,
order: 1,
},
{
id: 101,
order: 2,
},
{
id: 100,
order: 3,
},
]
I use this on the frontend to render a list in the appropriate order, based on the value of order.
But how can I update these 3 entities in my database?
I can obviously make 3 UPDATE statements:
const promises = [];
newArray.forEach(({ id, order }) => {
promises.push(
// executeMutation is just a custom query builder
executeMutation({
query: `UPDATE my_table SET order = ${order} WHERE id = ${id}`
})
)
})
await Promise.all(promises)
But is it possible to do this in one query?
You can do this using the UNNEST function. First you'll need to handle the query parameters properly: https://www.atdatabases.org/ does this for you; otherwise you need to separately pass a string with placeholders and then the values. If you use @databases, the code could look like:
await database.query(sql`
UPDATE my_table
SET order = updates_table.order
FROM (
SELECT
UNNEST(${newArray.map(v => v.id)}::INT[]) as id,
UNNEST(${newArray.map(v => v.order)}::INT[]) as order
) AS updates_table
WHERE my_table.id = updates_table.id
`);
The trick here is that UNNEST lets you take an array for each column and turn that into a kind of temporary table. You can then use that table to filter & update the records.
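If you're on plain node-postgres rather than @databases, a roughly equivalent parameterized version might look like this (a sketch, assuming a connected pg client and quoting the order column because order is a reserved word):
const ids = newArray.map(v => v.id);
const orders = newArray.map(v => v.order);

// node-postgres serializes JS arrays into Postgres array parameters
await client.query(
  `UPDATE my_table
   SET "order" = updates_table."order"
   FROM (
     SELECT UNNEST($1::int[]) AS id, UNNEST($2::int[]) AS "order"
   ) AS updates_table
   WHERE my_table.id = updates_table.id`,
  [ids, orders]
);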

Insert multiple values into a column of a table in PostgreSQL [duplicate]

A single row can be inserted like this:
client.query("insert into tableName (name, email) values ($1, $2) ", ['john', 'john#gmail.com'], callBack)
This approach automatically comments out any special characters.
How do i insert multiple rows at once?
I need to implement this:
"insert into tableName (name, email) values ('john', 'john#gmail.com'), ('jane', 'jane#gmail.com')"
I can just use js string operators to compile such rows manually, but then i need to add special characters escape somehow.
Use pg-format like below.
var format = require('pg-format');
var values = [
[7, 'john22', 'john22@gmail.com', '9999999922'],
[6, 'testvk', 'testvk@gmail.com', '88888888888']
];
client.query(format('INSERT INTO users (id, name, email, phone) VALUES %L', values),[], (err, result)=>{
console.log(err);
console.log(result);
});
One other way using PostgreSQL json functions:
client.query('INSERT INTO table (columns) ' +
'SELECT m.* FROM json_populate_recordset(null::your_custom_type, $1) AS m',
[JSON.stringify(your_json_object_array)], function(err, result) {
if (err) {
console.log(err);
} else {
console.log(result);
}
});
Following this article: Performance Boost from pg-promise library, and its suggested approach:
// Concatenates an array of objects or arrays of values, according to the template,
// to use with insert queries. Can be used either as a class type or as a function.
//
// template = formatting template string
// data = array of either objects or arrays of values
function Inserts(template, data) {
if (!(this instanceof Inserts)) {
return new Inserts(template, data);
}
this.rawType = true;
this.toPostgres = function () {
return data.map(d=>'(' + pgp.as.format(template, d) + ')').join(',');
};
}
An example of using it, exactly as in your case:
var users = [['John', 23], ['Mike', 30], ['David', 18]];
db.none('INSERT INTO Users(name, age) VALUES $1', Inserts('$1, $2', users))
.then(data=> {
// OK, all records have been inserted
})
.catch(error=> {
// Error, no records inserted
});
And it will work with an array of objects as well:
var users = [{name: 'John', age: 23}, {name: 'Mike', age: 30}, {name: 'David', age: 18}];
db.none('INSERT INTO Users(name, age) VALUES $1', Inserts('${name}, ${age}', users))
.then(data=> {
// OK, all records have been inserted
})
.catch(error=> {
// Error, no records inserted
});
UPDATE-1
For a high-performance approach via a single INSERT query see Multi-row insert with pg-promise.
UPDATE-2
The information here is quite old now, see the latest syntax for Custom Type Formatting. What used to be _rawDBType is now rawType, and formatDBType was renamed into toPostgres.
You are going to have to generate the query dynamically. Although possible, this is risky and could easily lead to SQL injection vulnerabilities if you do it wrong. It's also easy to end up with off-by-one errors between the index of your parameters in the query and the parameters you're passing in.
That being said, here is an example of how you could write this, assuming you have an array of users that looks like {name: string, email: string}:
client.query(
// node-postgres expects $1-style numbered placeholders, so generate them per row
`INSERT INTO table_name (name, email) VALUES ${users.map((_, i) => `($${2 * i + 1}, $${2 * i + 2})`).join(',')}`,
users.reduce((params, u) => params.concat([u.name, u.email]), []),
callBack,
)
An alternative approach is to use a library like @databases/pg (which I wrote):
await db.query(sql`
INSERT INTO table_name (name, email)
VALUES ${sql.join(users.map(u => sql`(${u.name}, ${u.email})`), ',')}
`)
@databases requires the query to be tagged with sql and uses that to ensure any user data you pass is always automatically escaped. This also lets you write the parameters inline, which I think makes the code much more readable.
Using npm module postgres (porsager/postgres) which has Tagged Template Strings at the core:
https://github.com/porsager/postgres#multiple-inserts-in-one-query
const users = [{
name: 'Murray',
age: 68,
garbage: 'ignore'
},
{
name: 'Walter',
age: 80,
garbage: 'ignore'
}]
sql`insert into users ${ sql(users, 'name', 'age') }`
// Is translated to:
insert into users ("name", "age") values ($1, $2), ($3, $4)
// Here you can also omit column names which will use all object keys as columns
sql`insert into users ${ sql(users) }`
// Which results in:
insert into users ("name", "age", "garbage") values ($1, $2, $3), ($4, $5, $6)
Just thought I'd post this since it's brand new out of beta, and I've found it to have a better philosophy as a SQL library. IMHO, it would be preferable to the other postgres/node libraries posted in other answers.
I know I am late to the party, but what worked for me was a simple map.
I hope this helps someone looking for the same thing.
let sampleQuery = array.map(myRow =>
`('${myRow.column_a}','${myRow.column_b}') `
)
let res = await pool.query(`INSERT INTO public.table(column_a, column_b) VALUES ${sampleQuery} `)
client.query("insert into tableName (name, email) values ($1, $2),($3, $4) ", ['john', 'john#gmail.com','john', 'john#gmail.com'], callBack)
doesn't help?
Futher more, you can manually generate a string for query:
insert into tableName (name, email) values (" +var1 + "," + var2 + "),(" +var3 + ", " +var4+ ") "
if you read here, https://github.com/brianc/node-postgres/issues/530 , you can see the same implementation.

Knex.js multiple orderBy() columns

Is it possible to do multiple orderBy() columns?
knex
.select()
.table('products')
.orderBy('id', 'asc')
The orderBy() chainable only takes a single column key and a sort value, but how can I order by multiple columns?
You can call .orderBy multiple times to order by multiple columns:
knex
.select()
.table('products')
.orderBy('name', 'desc')
.orderBy('id', 'asc')
The original answer is technically correct and useful, but my intention was to find a way to programmatically apply the orderBy() function multiple times. Here is the actual solution I went with, for reference:
var sortArray = [
{'field': 'title', 'direction': 'asc'},
{'field': 'id', 'direction': 'desc'}
];
knex
.select()
.table('products')
.modify(function(queryBuilder) {
_.each(sortArray, function(sort) {
queryBuilder.orderBy(sort.field, sort.direction);
});
})
Knex offers a modify function which allows the queryBuilder to be operated on directly. An array iterator then calls orderBy() multiple times.
The Knex orderBy function also receives an array:
knex('users').orderBy(['email', 'age', 'name'])
or
knex('users').orderBy(['email', { column: 'age', order: 'desc' }])
or
knex('users').orderBy([{ column: 'email' }, { column: 'age', order: 'desc' }])
You can use the following solution to solve your problem:
const builder = knex.table('products');
sortArray.forEach(
({ field, direction }) => builder.orderBy(field, direction)
);
orderBy accepts an array of type:
[
{column: 'id', order: 'asc'},
{column: 'name', order: 'desc'},
{column: 'created_at', order: 'desc'},
]
I have a function that takes a param from the request:
sort=id,name,-created_at
and builds an array that is passed to the query builder.
columns is an array with the accepted table column names.
sort(model, sorts, columns) {
  // Split the request param and drop any keys that aren't accepted table columns
  // (ignoring a leading '-' and stray spaces when validating)
  const validSorts = sorts
    .split(',')
    .map((sort) => sort.replace(' ', ''))
    .filter((sort) => columns.includes(sort.replace('-', '')));

  // A leading '-' means descending order, otherwise ascending
  return validSorts.map((sort) => {
    if (sort.startsWith('-')) {
      return { column: model.tableName + '.' + sort.replace('-', ''), order: 'desc' };
    }
    return { column: model.tableName + '.' + sort, order: 'asc' };
  });
}
and then use it like this in the query
const sortsArr = sort(model, sorts, model.columns);
knex('users').orderBy(sortsArr)

Order Bookshelf.js fetch by related column value

I'm using Bookshelf.js/Knex.js, fetching a model (call it user) with a related child model (call it company). Can I order by a field on the child model, such as company.name?
Also, if that's possible, can I multi-sort, say company.name descending then lastName ascending?
Here's my current code, which only works on root model fields. qb.orderBy('company.name', 'desc') doesn't work.
users.query(function(qb) {
qb.orderBy('lastName', 'asc');
})
.fetch({withRelated: ['company']})
.then(success, error);
Try the following:
users
.fetch({withRelated: [
{
'company': function(qb) {
qb.orderBy("name");
}
}
]})
.then(success, error);
I got the idea from https://github.com/tgriesser/bookshelf/issues/361
You can do it like this without the need for a function:
users.query(function(qb) {
qb.query('orderBy', 'lastName', 'asc');
})
.fetch({withRelated: ['company']})
.then(success, error);
Found here: Sort Bookshelf.js results with .orderBy()
I think I solved it by doing this:
let postHits =
await posts
.query(qb => qb
.innerJoin('post_actor_rel', function () {
this.on('post.id', '=', 'post_actor_rel.post_id');
})
.innerJoin('actor', function () {
this.on('post_actor_rel.actor_id', '=', 'actor.id');
})
.orderByRaw('actor.name ASC')
.groupBy('id')
)
.fetchPage({
withRelated: ['roles', 'themes', 'activity_types', 'subjects', 'educational_stages', 'images', 'documents', 'actors'],
limit,
offset
},
);
I modify the query by inner joining with the desired tables and, after sorting (using orderByRaw, since I will need to add some more sorting that I think is not possible with orderBy), I group by the post's id to get rid of the duplicate rows. The only problem is that it's not defined which actor name (of several possible) is used for the sorting.
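One way to make that deterministic, sketched under the assumption that sorting each post by its alphabetically first actor name is acceptable, is to aggregate the joined names before ordering:
let postHits = await posts
  .query(qb => qb
    .innerJoin('post_actor_rel', function () {
      this.on('post.id', '=', 'post_actor_rel.post_id');
    })
    .innerJoin('actor', function () {
      this.on('post_actor_rel.actor_id', '=', 'actor.id');
    })
    // MIN() picks one actor name per post, so the sort key is well defined
    .orderByRaw('MIN(actor.name) ASC')
    .groupBy('id')
  )
  .fetchPage({ withRelated: ['actors'], limit, offset });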
