jsonb query in AREL - ruby-on-rails-4.2

I'm fairly new to AREL, and have had trouble finding references to using Postgres JSONB with AREL.
How might I convert this SQL into an AREL statement?
SELECT
tag,
count(tag)
FROM
people p,
jsonb_array_elements(p.data->'tags') tag
WHERE p.data->'tags' ? '#{tag}'
GROUP BY tag

Related

How do I write this query without using raw query in sequelize?

I would like to do a bulk update in Sequelize. Unfortunately, it seems that Sequelize does not support bulk updates. I am using sequelize-typescript, if that helps, and PostgreSQL 14.
My query in raw SQL looks like this
UPDATE feed_items SET tags = val.tags FROM
(
VALUES ('ddab8ce7-afa3-824f-7b65-edfb53a71764'::uuid,ARRAY[]::VARCHAR(255)[]),
('ece9f2fc-2a09-4a95-16ce-07293b0a14d2'::uuid,ARRAY[]::VARCHAR(255)[])
) AS val(id, tags) WHERE feed_items.id = val.id
I would like to generate this query from a given array of (string, array) values. The tags column is implemented as a string array in my table.
Is there a way to generate the above query without using raw query?
Or an SQL injection safe way of generating the above query?
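One injection-safe approach is to generate the statement with numbered bind parameters, so no row data is ever spliced into the SQL text. Only buildBulkTagUpdate below is concrete; the sequelize.query call shown at the end assumes an initialized Sequelize instance:

```typescript
// Sketch: build UPDATE ... FROM (VALUES ...) with $n bind placeholders.
type TagUpdate = { id: string; tags: string[] };

function buildBulkTagUpdate(updates: TagUpdate[]): { sql: string; bind: unknown[] } {
  if (updates.length === 0) throw new Error("no rows to update");
  const bind: unknown[] = [];
  const rows = updates.map((u) => {
    // Push the values and reference them by position ($1, $2, ...), so no
    // user-supplied data ever appears in the SQL string itself.
    bind.push(u.id, u.tags);
    const n = bind.length;
    return `($${n - 1}::uuid, $${n}::varchar(255)[])`;
  });
  const sql =
    "UPDATE feed_items SET tags = val.tags " +
    `FROM (VALUES ${rows.join(", ")}) AS val(id, tags) ` +
    "WHERE feed_items.id = val.id";
  return { sql, bind };
}

// Usage (hypothetical): $n placeholders map to Sequelize's bind option,
// not replacements:
//   const { sql, bind } = buildBulkTagUpdate(updates);
//   await sequelize.query(sql, { bind });
```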

jOOQ MySQL query

select question.*,
question_option.id
from question
left join question_option on question_option.question_id = question.id;
How do I write question.* in jOOQ instead of specifying all the entity variables?
You can use the fields() or asterisk() methods from the jOOQ generated objects, which extend TableImpl.
For example, if you just want to query the fields of a record:
dsl.select(QUESTION.fields()).from...
If you need fields from the join too:
dsl.select(QUESTION.asterisk(), QUESTION_OPTION.ID).from...
I assume that you generate the metamodel, so you can also append the join column to the projection:
dsl.select(QUESTION.fields()).select(QUESTION_OPTION.ID)...

How to add a raw PostgreSQL function to a query builder join in TypeORM?

I am using TypeORM on a NestJS service written in TypeScript.
I am trying to use PostgreSQL's jsonb_each function as a table in a join to create a query similar to this:
select * from customers c left join jsonb_each(c.some_json_column) as d
When I put jsonb_each(c.some_json_column) in TypeORM's leftJoin function, TypeORM wraps it in double quotes, which makes PostgreSQL treat it as a table literally named jsonb_each(c.some_json_column) and throw an error.
How can I force TypeORM to use jsonb_each(c.some_json_column) as a raw value in the leftJoin function, without wrapping it in double quotes?
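A common fallback when the query builder can't express a function in a join is to hand-build that one statement and run it through TypeORM's raw query API. In the sketch below, the table and column names are compile-time constants chosen by the developer, never user input, so the interpolation is safe; any row values would go through the parameters argument of query():

```typescript
// Sketch: render the raw join SQL once, outside the query builder.
function jsonbEachJoinSql(table: string, column: string): string {
  // "on true" makes the LEFT JOIN condition explicit; Postgres requires one.
  return `select * from ${table} c left join jsonb_each(c.${column}) as d on true`;
}

// Usage (assumes an initialized DataSource or EntityManager):
//   const rows = await dataSource.query(jsonbEachJoinSql("customers", "some_json_column"));
```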

How to expose a native SQL function as a predicate

I have a table in my database which stores a list of string values as a jsonb field.
create table rabbits_json (
rabbit_id bigserial primary key,
name text,
info jsonb not null
);
insert into rabbits_json (name, info) values
('Henry','["lettuce","carrots"]'),
('Herald','["carrots","zucchini"]'),
('Helen','["lettuce","cheese"]');
I want to filter my rows checking if info contains a given value.
In SQL, I would use the ? operator:
select * from rabbits_json where info ? 'carrots';
If my googling skills are fine today, I believe this is not yet implemented in jOOQ:
https://github.com/jOOQ/jOOQ/issues/9997
How can I use a native predicate in my query to write an equivalent query in JOOQ?
For anything that's not supported natively in jOOQ, you should use plain SQL templating, e.g.
Condition condition = DSL.condition("{0} ? {1}", RABBITS_JSON.INFO, DSL.val("carrots"));
Unfortunately, in this specific case, you will run into this issue here. With a JDBC PreparedStatement, you still cannot use ? for anything other than bind variables. As a workaround, you can:
Use Settings.statementType == STATIC_STATEMENT to prevent using a PreparedStatement in this case
Use the jsonb_exists_any function (not indexable) instead of ?, see https://stackoverflow.com/a/38370973/521799
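Putting the two pieces together, a sketch (assuming a jOOQ code-generated RABBITS_JSON table and an existing DSLContext; jsonb_exists(jsonb, text) is the single-key function behind the ? operator, while jsonb_exists_any backs ?|):

```java
import org.jooq.Condition;
import org.jooq.DSLContext;
import org.jooq.Result;
import org.jooq.impl.DSL;
// import static com.example.generated.Tables.RABBITS_JSON; // hypothetical generated table

class RabbitQueries {
    // info ? 'value' rewritten as jsonb_exists(info, 'value'): same semantics,
    // but no literal ? reaches the JDBC PreparedStatement.
    static Result<?> withTag(DSLContext dsl, String tag) {
        Condition hasTag = DSL.condition(
            "jsonb_exists({0}, {1})", RABBITS_JSON.INFO, DSL.val(tag));
        return dsl.selectFrom(RABBITS_JSON).where(hasTag).fetch();
    }
}
```

Note the trade-off mentioned above: unlike the ? operator, the function form is not backed by a jsonb GIN index.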

Storing schema-less complex json object in Cassandra

I have a schema-less JSON object that I wish to store in Cassandra using spring-cassandra. I learned that Cassandra supports the Map type, but it doesn't accept Map<String, Object> as a data model.
I need to query on the fields of the JSON, so storing it as a blob is out of the question. Is there any way I can do this?
PS: I've looked at Storing JSON object in CASSANDRA, the answer didn't seem applicable to my use case as my json could be very complex.
Did you look at UDTs (user-defined types)?
You can define a UDT like this:
CREATE TYPE my_json(
property1 text,
property2 int,
property3 map<text, text>,
property4 map<int, another_json_type>,
...
)
And then in Java use Map<String, UserType>
Note: UserType comes from the Java driver: https://github.com/datastax/java-driver/blob/2.1/driver-core/src/main/java/com/datastax/driver/core/UserType.java
You cannot create a user type in Java; you can only get it from the metadata of your table, see this: https://github.com/datastax/java-driver/blob/3.0/driver-core/src/test/java/com/datastax/driver/core/UserTypesTest.java#L62-L81
1) One solution is to integrate Solr search and index this table first.
2) Then write a Solr analyzer to parse the JSON and put it under various fields in Solr while indexing.
3) Next, use a Solr-supported query like select * from table where solr_query = '{search expression syntax}'
