Sequelize join when key is inside a JSONB field - node.js

Is there a way to use include (which actually performs a SQL join) to another Model, where the key is INSIDE a JSONB field? For example:
Item { id: INTEGER, someJsonbField: JSONB }
(item example: { id: 1, someJsonbField: { storeId: 2 } })
Then, for getting all of the items of store with id 2, you write something like this:
Item.findAll({ include: { model: 'Store', key: 'someJsonbField.storeId', ... } })
Of course, in a real-world scenario storeId should be a column on Item directly, but purely for the purpose of this question - how could it be done?
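One possible approach, sketched here and not verified against the poster's models: Sequelize cannot declare an association whose foreign key lives inside a JSONB column, but you can filter on the JSONB key with the Postgres containment operator, and fall back to a raw query when you actually need the joined Store rows. Op.contains and sequelize.query are standard Sequelize APIs; the "Items"/"Stores" table names below are assumptions about the default naming.

const { Op } = require('sequelize');

// Items whose JSONB column contains { storeId: 2 } (Postgres @> operator)
const items = await Item.findAll({
  where: {
    someJsonbField: { [Op.contains]: { storeId: 2 } }
  }
});

// If the joined Store rows are needed, a raw query is the simplest reliable route:
const [rows] = await sequelize.query(
  `SELECT i.*, s.id AS "storeId"
     FROM "Items"  i
     JOIN "Stores" s ON s.id = (i."someJsonbField"->>'storeId')::int
    WHERE s.id = :storeId`,
  { replacements: { storeId: 2 } }
);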

Related

knex js query many to many

I'm having trouble with Node & knex.js.
I'm trying to build a mini blog with posts, plus the ability to add multiple tags to a post.
I have a Post model with the following properties:
id SERIAL PRIMARY KEY NOT NULL,
name TEXT,
Second, I have a Tags model that is used for storing tags:
id SERIAL PRIMARY KEY NOT NULL,
name TEXT
And I have a many-to-many table, post_tags, that references posts & tags:
id SERIAL PRIMARY KEY NOT NULL,
post_id INTEGER NOT NULL REFERENCES posts ON DELETE CASCADE,
tag_id INTEGER NOT NULL REFERENCES tags ON DELETE CASCADE
I have managed to insert tags and create a post with tags.
But when I want to fetch the post data with its tags attached, I run into trouble.
Here is the problem:
const data = await knex.select('posts.name as postName', 'tags.name as tagName')
  .from('posts')
  .leftJoin('post_tags', 'posts.id', 'post_tags.post_id')
  .leftJoin('tags', 'tags.id', 'post_tags.tag_id')
  .where('posts.id', id);
This query returns the following result:
[
{
postName: 'Post 1',
tagName: 'Youtube',
},
{
postName: 'Post 1',
tagName: 'Funny',
}
]
But I want the result to be formatted & returned like this:
{
postName: 'Post 1',
tagName: ['Youtube', 'Funny'],
}
Is that even possible with a query, or do I have to format the data manually?
One way of doing this is to use some kind of aggregate function. If you're using PostgreSQL:
const data = await knex.select('posts.name as postName', knex.raw('ARRAY_AGG (tags.name) tags'))
.from('posts')
.innerJoin('post_tags', 'posts.id', 'post_tags.post_id')
.innerJoin('tags', 'tags.id', 'post_tags.tag_id')
.where('posts.id', id)
.groupBy("postName")
.orderBy("postName")
.first();
->
{ postName: 'post1', tags: [ 'tag1', 'tag2', 'tag3' ] }
For MySQL:
const data = await knex.select('posts.name as postName', knex.raw('GROUP_CONCAT(tags.name) as tags'))
.from('posts')
.innerJoin('post_tags', 'posts.id', 'post_tags.post_id')
.innerJoin('tags', 'tags.id', 'post_tags.tag_id')
.where('posts.id', id)
.groupBy("postName")
.orderBy("postName")
.first()
.then(res => Object.assign(res, { tags: res.tags.split(',')}))
There are no arrays in MySQL, and GROUP_CONCAT will just concatenate all the tags into a single string, so we need to split them manually.
->
RowDataPacket { postName: 'post1', tags: [ 'tag1', 'tag2', 'tag3' ] }
The result is correct, as that is how SQL works - it returns rows of data. SQL has no concept of returning anything other than a table (think CSV data or an Excel spreadsheet).
There are some interesting things you can do in SQL to concatenate the tags into a single string, but that is not really what you want. Either way you will need to add a post-processing step.
With your current query you can simply do something like this:
function formatter (result) {
  let set = {};
  result.forEach(row => {
    if (set[row.postName] === undefined) {
      // first row for this post: keep it and turn its single tagName into an array
      set[row.postName] = row;
      set[row.postName].tagName = [set[row.postName].tagName];
    }
    else {
      // subsequent rows for the same post only contribute their tag
      set[row.postName].tagName.push(row.tagName);
    }
  });
  return Object.values(set);
}
// ...
query.then(formatter);
This shouldn't be slow as you're only looping through the results once.

JSONB Query Sequelize

I have one table in Postgres and its structure is:
ID
Name
Details
Context
CreatedDate
Context is a JSONB field and CreatedDate is a timestamp.
I am saving data in Context this way: {"trade": {"id": 102}, "trader": {"id": 100}}
I am trying to select records based on the trader id inside Context, and this is my query:
this.findAll({
  where: {
    context: {
      $contains: {
        trader: [{ id: '100' }]
      }
    }
  }
})
I tried nested keys as well, but that yields no results either.
this.findAll({
  where: {
    'context.trader.id': {
      $eq: '100'
    }
  }
})
Can you please suggest how I can select the records based on my structure?
Following on from that, how can I filter on two conditions at once, for example also adding CreatedDate to this where clause?
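For what it's worth, here is a sketch of how this is usually written with modern Sequelize operators (Op.contains replaces the legacy $contains alias); the attribute names context and createdDate are assumptions based on the columns listed above:

const { Op } = require('sequelize');

const records = await this.findAll({
  where: {
    // containment match against {"trade": {...}, "trader": {"id": 100}}:
    // trader is a plain object (not an array) and the id is numeric
    context: { [Op.contains]: { trader: { id: 100 } } },
    // a second condition is just another key in the same where clause
    createdDate: { [Op.gte]: new Date('2021-01-01') }
  }
});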

SQL all rows in child 'categories' and child child 'categories' : recursive?

I'm having trouble writing a query that solves the following problem, which I believe needs some kind of recursion:
I have a table with houses, each of them having a specific house_type, e.g. house, bungalow, villa, etc. The house_types inherit from each other, as declared in a table called house_types.
table: houses
id | house_type
1 | house
2 | villa
3 | bungalow
etcetera...
table: house_types
house_type | parent
house | null
villa | house
bungalow | villa
etcetera...
In this logic, a bungalow is also a villa and a villa is also a house. So when I want to get all villas, houses 2 and 3 should show up; when I want to get all houses, houses 1, 2 and 3 should show up; and when I want all bungalows, only house 3 should show up.
Is a recursive query the answer and how should I work this out. I use knex/objection.js in a node.js application.
Here is a recursive CTE that gets every pair in the hierarchy:
with recursive house_types as (
select 'house' as housetype, null as parent union all
select 'villa', 'house' union all
select 'bungalow', 'villa'
),
cte(housetype, alternate) as (
select housetype, housetype as alternate
from house_types
union all
select ht.housetype, cte.alternate
from cte join
house_types ht
on cte.housetype = ht.parent
)
select *
from cte;
(The house_types CTE is just to set up the data.)
You can then join this to other data to get any level of the hierarchy.
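If you want to run that from Node without waiting for knex to gain native recursive-CTE support, one option (a sketch using the question's table and column names; knex.raw exists on every knex version) is to send the whole statement through knex.raw and join the CTE to houses:

const result = await knex.raw(
  `with recursive cte(housetype, alternate) as (
     select house_type, house_type from house_types
     union all
     select ht.house_type, cte.alternate
     from cte
     join house_types ht on cte.housetype = ht.parent
   )
   select h.*
   from houses h
   join cte on cte.housetype = h.house_type
   where cte.alternate = ?`,
  ['villa']
);
// with the pg driver the matching houses are in result.rows; other drivers shape the result differently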
To start with, Gordon Linoff's answer is awesome. I'm just here to add the specifics of how to do this with knex / objection.js.
That sounds like pretty nasty DB design. I would denormalise the type data so that queries are easier to make without recursive common table expressions (knex doesn't support them currently).
Anyway, here is some runnable code showing how to set up the objection.js models and denormalise the type info on the JavaScript side, so you can make the queries you are trying to do: https://runkit.com/mikaelle/stackoverflow-43554373
Since Stack Overflow likes answers to contain the code as well, I'll copy-paste it here too. The example uses sqlite3 as the DB backend, but the same code also works with Postgres.
const _ = require('lodash');
require("sqlite3");
const knex = require("knex")({
  client: 'sqlite3',
  connection: ':memory:'
});
const { Model } = require('objection');

// init schema and test data
await knex.schema.createTable('house_types', table => {
  table.string('house_type');
  table.string('parent').references('house_types.house_type');
});
await knex.schema.createTable('houses', table => {
  table.increments('id');
  table.string('house_type').references('house_types.house_type');
});

await knex('house_types').insert([
  { house_type: 'house', parent: null },
  { house_type: 'villa', parent: 'house' },
  { house_type: 'bungalow', parent: 'villa' }
]);
await knex('houses').insert([
  { id: 1, house_type: 'house' },
  { id: 2, house_type: 'villa' },
  { id: 3, house_type: 'bungalow' }
]);

// show initial data from DB
await knex('houses')
  .join('house_types', 'houses.house_type', 'house_types.house_type');

// create models
class HouseType extends Model {
  static get tableName() { return 'house_types'; }

  // http://vincit.github.io/objection.js/#relations
  static get relationMappings() {
    return {
      parent: {
        relation: Model.HasOneRelation,
        modelClass: HouseType,
        join: {
          from: 'house_types.parent',
          to: 'house_types.house_type'
        }
      }
    };
  }
}

class House extends Model {
  static get tableName() { return 'houses'; }

  // http://vincit.github.io/objection.js/#relations
  static relationMappings() {
    return {
      houseType: {
        relation: Model.HasOneRelation,
        modelClass: HouseType,
        join: {
          from: 'houses.house_type',
          to: 'house_types.house_type'
        }
      }
    };
  }
}

// get all houses and all house types with recursive eager loading
// http://vincit.github.io/objection.js/#eager-loading
JSON.stringify(
  await House.query(knex).eager('houseType.parent.^'), null, 2
);

// however the code above doesn't really allow you to filter queries nicely
// and is pretty inefficient, so as far as I know a recursive WITH query is the
// only way to do this nicely with pure SQL; since knex doesn't currently support
// them, we can first denormalize the house type hierarchy
// (and maybe cache it if the data is not changing much)
const allHouseTypes = await HouseType.query(knex).eager('parent.^');

// initialize house types with empty arrays
const denormalizedTypesByHouseType = _(allHouseTypes)
  .keyBy('house_type')
  .mapValues(() => [])
  .value();

// create denormalized type array for every type
allHouseTypes.forEach(houseType => {
  // every type should be returned with the exact type, e.g. a bungalow is a bungalow
  denormalizedTypesByHouseType[houseType.house_type].push(houseType.house_type);
  let parent = houseType.parent;
  while (parent) {
    // a bungalow is also a villa, so when searching for villas, bungalows are returned too
    denormalizedTypesByHouseType[parent.house_type].push(houseType.house_type);
    parent = parent.parent;
  }
});

// just to see that denormalization did work as expected
console.log(denormalizedTypesByHouseType);

// all villas
JSON.stringify(
  await House.query(knex).whereIn('house_type', denormalizedTypesByHouseType['villa']),
  null, 2
);

Replacing an object in an object array in Redux Store using Javascript/Lodash

I have an object array in a reducer that looks like this:
[
  { id: 1, name: 'Mark', email: 'mark@email.com' },
  { id: 2, name: 'Paul', email: 'paul@gmail.com' },
  { id: 3, name: 'Sally', email: 'sally@email.com' }
]
Below is my reducer. So far, I can add a new object to the currentPeople reducer via the following:
const INITIAL_STATE = { currentPeople: [] };

export default function(state = INITIAL_STATE, action) {
  switch (action.type) {
    case ADD_PERSON:
      return { ...state, currentPeople: [...state.currentPeople, action.payload] };
  }
  return state;
}
But here is where I'm stuck. Can I UPDATE a person via the reducer using lodash?
If I sent an action payload that looked like this:
{ id: 1, name: 'Eric', email: 'eric@email.com' }
Would I be able to replace the object with the id of 1 with the new fields?
Yes you can absolutely update an object in an array like you want to. And you don't need to change your data structure if you don't want to. You could add a case like this to your reducer:
case UPDATE_PERSON:
  return {
    ...state,
    currentPeople: state.currentPeople.map(person => {
      if (person.id === action.payload.id) {
        return action.payload;
      }
      return person;
    }),
  };
This can be shortened as well, using implicit returns and a ternary:
case UPDATE_PERSON:
  return {
    ...state,
    currentPeople: state.currentPeople.map(person => (person.id === action.payload.id) ? action.payload : person),
  };
Mihir's idea about mapping your data to an object with normalizr is certainly a possibility and technically it'd be faster to update the user with the reference instead of doing the loop (after initial mapping was done). But if you want to keep your data structure, this approach will work.
Also, mapping like this is just one of many ways to update the object, and it requires browser support for Array.prototype.map(). You could use lodash's indexOf() to find the index of the user you want (this is nice because it stops looping when it succeeds, instead of continuing as .map() does); once you have the index, you can overwrite the object directly at that index. Make sure you don't mutate the Redux state, though: you'll need to be working on a clone if you want to assign like this: clonedArray[foundIndex] = action.payload;. A minimal sketch of that variant follows below.
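A minimal sketch of that index-based approach, assuming the UPDATE_PERSON action type, action.payload shape and INITIAL_STATE from the question (lodash's findIndex is used here to locate the person by id):

import _ from 'lodash';

export default function(state = INITIAL_STATE, action) {
  switch (action.type) {
    case UPDATE_PERSON: {
      const currentPeople = [...state.currentPeople];  // work on a clone, never mutate state
      const foundIndex = _.findIndex(currentPeople, { id: action.payload.id });
      if (foundIndex !== -1) {
        currentPeople[foundIndex] = action.payload;    // overwrite directly by index
      }
      return { ...state, currentPeople };
    }
    default:
      return state;
  }
}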
This is a good candidate for data normalization. You can effectively replace your data with the new one, if you normalize the data before storing it in your state tree.
This example is straight from Normalizr.
[{
  id: 1,
  title: 'Some Article',
  author: {
    id: 1,
    name: 'Dan'
  }
}, {
  id: 2,
  title: 'Other Article',
  author: {
    id: 1,
    name: 'Dan'
  }
}]
Can be normalized this way-
{
  result: [1, 2],
  entities: {
    articles: {
      1: {
        id: 1,
        title: 'Some Article',
        author: 1
      },
      2: {
        id: 2,
        title: 'Other Article',
        author: 1
      }
    },
    users: {
      1: {
        id: 1,
        name: 'Dan'
      }
    }
  }
}
What's the advantage of normalization?
You get to extract the exact part of your state tree that you want.
For instance, you have an array of objects containing information about articles. If you want to select a particular object from that array, you'll have to iterate through the entire array; the worst case is that the desired object is not present in the array at all. To overcome this, we normalize the data.
To normalize the data, store the unique identifiers of each object in a separate array. Let's call that array result.
result: [1, 2, 3 ..]
And transform the array of objects into an object keyed by id (see the second snippet). Call that object entities.
Ultimately, to access the object with id 1, simply do this- entities.articles["1"].
If you want to replace the old data with new data, you can do this-
entities.articles["1"] = newObj;
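For completeness, a sketch of how the normalized shape above is produced with the normalizr package (the entity names articles and users mirror the example; originalData stands for the array from the first snippet):

import { normalize, schema } from 'normalizr';

const user = new schema.Entity('users');
const article = new schema.Entity('articles', { author: user });

const normalizedData = normalize(originalData, [article]);
// normalizedData.result            -> [1, 2]
// normalizedData.entities.articles -> { '1': { id: 1, title: 'Some Article', author: 1 }, ... }
// replacing an article is then a direct assignment:
// normalizedData.entities.articles['1'] = newObj;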
Use the native splice method of arrays:
/* Find the item index using lodash */
var index = _.indexOf(currentPeople, _.find(currentPeople, { id: 1 }));
/* Replace the item at that index using splice */
currentPeople.splice(index, 1, { id: 1, name: 'Mark', email: 'mark@email.com' });

Serializing JSON array of foreign keys using sequelize

If I have a JSON object as follows
{
name: "John Smith",
tags: [1, 2, 3]
}
Each number in the tags array above is a foreign key associated with a Tag record.
What's the most efficient way of creating this record using sequelize?
Guessing your relations look like this:
Person.hasMany(Tag);
Tag.hasMany(Person);
You could create the Person and later on find those tags and assign them to it:
Person.create(obj)
  .success(function (person) {
    Tag.findAll({ where: { id: obj.tags } })
      .success(function (tags) {
        person.setTags(tags);
      });
  });
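Note that the .success() callbacks are from older Sequelize versions. A rough modern equivalent (async/await and belongsToMany; the person_tags join-table name is an assumption) would be:

Person.belongsToMany(Tag, { through: 'person_tags' });
Tag.belongsToMany(Person, { through: 'person_tags' });

async function createPersonWithTags(obj) {
  const person = await Person.create({ name: obj.name });
  const tags = await Tag.findAll({ where: { id: obj.tags } }); // id IN (1, 2, 3)
  await person.setTags(tags);                                  // writes the rows in the join table
  return person;
}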
