How to insert into a table with select, specifying insert columns, in jOOQ? - jooq

I'm using jOOQ to generate SQL.
Here is the resulting query:
insert into MY_TABLE -- I want INSERT INTO(firstField,secondField)
select
?,
?
where not exists (
select 1
from MY_TABLE
where (
firstField = ?
)
)
returning id
MY_TABLE DDL:
create table IF NOT EXISTS MY_TABLE
(
id SERIAL PRIMARY KEY,
firstField int not null,
secondField int not null
)
I can't make jOOQ add the field names after insert into MY_TABLE.
My builder:
JooqBuilder.default()
.insertInto(table("MY_TABLE"))
.select(
select(
param(classOf[Int]), // 1
param(classOf[Int]), // 2
)
.whereNotExists(select(inline(1))
.from(table("MY_TABLE"))
.where(
DSL.noCondition()
.and(field("firstField", classOf[Long]).eq(0L))
)
)
).returning(field("id")).getSQL
I've tried
.insertInto(table("MY_TABLE"), field("firstField"), field("secondField"))
UPD:
I was confused by a compiler exception.
The right solution is:
```scala
JooqBuilder.default()
.insertInto(table("MY_TABLE"),
field("firstField",classOf[Int]),
field("secondField",classOf[Int])
)
.select(
select(
param(classOf[Int]),
param(classOf[Int])
)
.whereNotExists(select(inline(1))
.from(table("MY_TABLE"))
.where(
DSL.noCondition()
.and(field("firstField", classOf[Long]).eq(0L))
)
)
).returning(field("id")).getSQL
The thing is that jOOQ takes the field types from insertInto and doesn't compile if the select field types don't match.
I've tried
.insertInto(table("MY_TABLE"),
field("firstField"),
field("secondField")
)
and it didn't compile, since the field types didn't match
.select(
select(
param(classOf[Int]), // 1
param(classOf[Int]) // 2
)
I've added types to the insertInto fields and got a match: two Ints in the insert, two Ints in the select.
jOOQ generated the expected query:
insert into MY_TABLE (firstField, secondField)
select
?,
?
where not exists (
select 1
from MY_TABLE
where (
firstField = ?
)
)

jOOQ just generates exactly the SQL you tell it to generate. You're not listing firstField,secondField in jOOQ, so jOOQ doesn't list them in SQL. To list them in jOOQ, just add:
// ...
.insertInto(table("MY_TABLE"), field("firstField", classOf[Long]), ...)
// ...
Obviously, even without using the code generator, you can reuse expressions by assigning them to local variables:
val t = table("MY_TABLE")
val f1 = field("firstField", classOf[Long])
val f2 = field("secondField", classOf[Long])
And then:
// ...
.insertInto(t, f1, f2)
// ...
Using the code generator
Note that if you were using the code generator, which jOOQ recommends, your query would be much simpler:
ctx.insertInto(MY_TABLE, MY_TABLE.FIRST_FIELD, MY_TABLE.SECOND_FIELD)
.values(v1, v2)
.onDuplicateKeyIgnore()
.returningResult(MY_TABLE.ID)
.fetch();

Related

MssqlRow to json string without knowing structure and data type on compile time [duplicate]

Using PostgreSQL I can have multiple rows of json objects.
select (select ROW_TO_JSON(_) from (select c.name, c.age) as _) as jsonresult from employee as c
This gives me this result:
{"age":65,"name":"NAME"}
{"age":21,"name":"SURNAME"}
But in SqlServer when I use the FOR JSON AUTO clause it gives me an array of json objects instead of multiple rows.
select c.name, c.age from customer c FOR JSON AUTO
[{"age":65,"name":"NAME"},{"age":21,"name":"SURNAME"}]
How can I get the same result format in SQL Server?
By constructing separate JSON in each individual row:
SELECT (SELECT [age], [name] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
FROM customer
There is an alternative form that doesn't require you to know the table structure (but likely has worse performance because it may generate a large intermediate JSON):
SELECT [value] FROM OPENJSON(
(SELECT * FROM customer FOR JSON PATH)
)
No known structure needed, with better performance:
SELECT c.id, jdata.*
FROM customer c
cross apply
(SELECT * FROM customer jc where jc.id = c.id FOR JSON PATH , WITHOUT_ARRAY_WRAPPER) jdata (jdata)
Same as Barak Yellin's answer, but lazier:
1 - Create this proc:
CREATE PROC PRC_SELECT_JSON(@TBL VARCHAR(100), @COLS VARCHAR(1000)='D.*') AS BEGIN
EXEC('
SELECT X.O FROM ' + @TBL + ' D
CROSS APPLY (
SELECT ' + @COLS + '
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
) X (O)
')
END
2 - You can use either all columns or specific columns:
CREATE TABLE #TEST ( X INT, Y VARCHAR(10), Z DATE )
INSERT #TEST VALUES (123, 'TEST1', GETDATE())
INSERT #TEST VALUES (124, 'TEST2', GETDATE())
EXEC PRC_SELECT_JSON '#TEST'
EXEC PRC_SELECT_JSON '#TEST', 'X, Y'
If you're using PHP, add SET NOCOUNT ON; as the first line (why?).

Compare uuid and string on TypeORM query builder

I want to join 2 tables where user.id = photo.userId, but the problem is that userId on the photo table is varchar and that can't change. So I wrote a query builder to do the join, and the problem is here:
....
.where(user.id = photo.userId)
....
This query throws an error: operator does not exist: uuid = character varying
Is there any way to make this work?
Note: My project is a NestJS API, using TypeORM and PostgreSQL.
EDIT
I already have the Photo result and use it on a subQuery:
query = query
.where(qb => {
const subQuery = qb.subQuery()
.select('user.id')
.from(User, 'user')
.where('user.id = photo.userId')
.getQuery();
return 'EXISTS' + subQuery;
});
https://www.postgresqltutorial.com/postgresql-cast/
where (user.id::VARCHAR = photo.userId)
Thank you for the help, finally the best solution I found was to create a postgres function as indicated here and then call it in the code like this:
query = query
.where(qb => {
const subQuery = qb.subQuery()
.select('user.id')
.from(User, 'user')
.where('user.id = uuid_or_null(photo.userId)') // here
.getQuery();
return 'EXISTS' + subQuery;
});
First off, the conversion of 'I' to 'i' (upper to lower) in userId is exactly what would be expected, as identifiers are lower-cased unless double-quoted. Avoid double-quoting if possible, because once an identifier is created that way you must double-quote it every time it is used.
Secondly, the type uuid has some strange and unexpected formatting rules. You can compare a string::uuid to a uuid as expected, but uuid::text may not compare equal to a string, as uuid::text always formats as hhhhhhhh-hhhh-hhhh-hhhh-hhhhhhhhhhhh (where h is a hex digit). The dashes are often removed when storing the value as a string. So reverse the typical order: cast the string as uuid. See the following example:
create table id_uuid (id uuid, col1 text);
create table id_str (id text, col1 text);
insert into id_uuid(id, col1) values(gen_random_uuid(),'Id defined as uuid');
insert into id_str (id, col1)
select replace(id::text,'-',''),'Id defined as string'
from id_uuid;
select * from id_uuid;
select * from id_str;
select *
from id_uuid u
join id_str s
on (u.id::text = s.id);
select *
from id_uuid u
join id_str s
on (u.id = s.id::uuid);

Oracle - query to retrieve CLOB value under multiple tags with the same name

I have a table T with CLOB column called XML_CLOB
Value in the column likes following:
<reportName>
<string>REPORT_A</string>
<string>REPORT_B</string>
<string>REPORT_C</string>
</reportName>
I'm trying to retrieve the string values from this CLOB column and return them in different rows. If I use
xmltype(xml_clob).extract('//reportName/string/text()').getstringval()
it outputs 'REPORT_AREPORT_BREPORT_C' in a single row.
I also tried
extractValue(xmltype(xml_clob), '//reportName/string[1]')
but the problem is I don't know how many child values are under the tag.
Is there anyway I can retrieve in different rows like:
1 REPORT_A
2 REPORT_B
3 REPORT_C
Many thanks in advance~
Oracle Setup:
CREATE TABLE table_name (xml_clob CLOB );
INSERT INTO table_name VALUES (
'<reportName>
<string>REPORT_A</string>
<string>REPORT_B</string>
<string>REPORT_C</string>
</reportName>'
);
Query 1:
SELECT x.string
FROM table_name t,
XMLTable('/reportName/string'
PASSING XMLType( t.xml_clob )
COLUMNS string VARCHAR2(50) PATH '/'
) x
Query 2:
SELECT EXTRACTVALUE( s.COLUMN_VALUE, '/string' ) AS string
FROM table_name t,
TABLE(
XMLSequence(
EXTRACT(
XMLType( t.xml_clob ),
'/reportName/string'
)
)
) s;
Output:
STRING
--------
REPORT_A
REPORT_B
REPORT_C
WITH test_table AS
(SELECT xmltype('<reportName>
<string>REPORT_A</string>
<string>REPORT_B</string>
<string>REPORT_C</string>
</reportName>' ) xml_clob
FROM dual
)
SELECT x.*
FROM test_table,
xmltable('/reportName/string'
passing test_table.xml_clob
columns report_name VARCHAR2(100) path 'text()') x

SQL Pivot to Return more than 1 value

Here's the issue. I have 2 tables that I am currently using in a pivot to return a single value, MAX(Date). I have been asked to return additional values associated with that particular MAX(Date). I know I can do this with an OVER PARTITION but it would require me doing about 8 or 9 LEFT JOINS to get the desired output. I was hoping there is a way to get my existing PIVOT to return these values. More specifically, let's say each MAX(Date) has a data source and we want that particular source to become part of the output. Here is a simple sample of what I am talking about:
Create table #Email
(
pk_id int not null identity primary key,
email_address varchar(50),
optin_flag bit default(0),
unsub_flag bit default(0)
)
Create table #History
(
pk_id int not null identity primary key,
email_id int not null,
Status_Cd char(2),
Status_Ds varchar(20),
Source_Cd char(3),
Source_Desc varchar(20),
Source_Dttm datetime
)
Insert into #Email
Values
('test@test.com',1,0),
('blank@blank.com',1,1)
Insert into #History
values
(1,'OP','OPT-IN','WB','WEB','1/2/2015 09:32:00'),
(1,'OP','OPT-IN','WB','WEB','1/3/2015 10:15:00'),
(1,'OP','OPT-IN','WB','WEB','1/4/2015 8:02:00'),
(2,'OP','OPT-IN','WB','WEB','2/1/2015 07:22:00'),
(2,'US','UNSUBSCRIBE','EM','EMAIL','3/2/2015 09:32:00'),
(2,'US','UNSUBSCRIBE','ESP','SERVICE PROVIDER','3/2/2015 09:55:00'),
(2,'US','UNSUBSCRIBE','WB','WEB','3/2/2015 10:15:00')
;with dates as
(
select
email_id,
[OP] as [OptIn_Dttm],
[US] as [Unsub_Dttm]
from
(
select
email_id,
status_cd,
source_dttm
from #history
) as src
pivot (min(source_dttm) for status_cd in ([OP],[US])) as piv
)
select
e.pk_id as email_id,
e.email_address,
e.optin_flag,
/*WANT TO GET THE OPTIN SOURCE HERE*/ /*<-------------*/
d.OptIn_Dttm,
e.unsub_flag,
d.Unsub_Dttm
/*WANT TO GET THE UNSUB SOURCE HERE*/ /*<-------------*/
from #Email e
left join dates d on e.pk_id = d.email_id
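One way to carry the source along with each date, without a second PIVOT, is the windowed approach the question itself mentions. This is a sketch (not from the original thread) built on the question's #Email/#History tables; it assumes you want the row whose source_dttm is the one the pivot would have picked, and uses ROW_NUMBER instead of several extra LEFT JOINs:

```sql
;with ranked as
(
    select
        email_id,
        status_cd,
        source_desc,
        source_dttm,
        -- one row per (email, status); rn = 1 is the latest event
        row_number() over (partition by email_id, status_cd
                           order by source_dttm desc) as rn
    from #History
)
select
    e.pk_id        as email_id,
    e.email_address,
    e.optin_flag,
    op.source_desc as OptIn_Source,
    op.source_dttm as OptIn_Dttm,
    e.unsub_flag,
    us.source_desc as Unsub_Source,
    us.source_dttm as Unsub_Dttm
from #Email e
left join ranked op
    on op.email_id = e.pk_id and op.status_cd = 'OP' and op.rn = 1
left join ranked us
    on us.email_id = e.pk_id and us.status_cd = 'US' and us.rn = 1
```

Note that the question's pivot uses min(source_dttm); to mirror that exactly, order by source_dttm asc instead of desc. Any other column from #History (Source_Cd, Status_Ds, ...) can be pulled through the same joins without further changes.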

In T-SQL, can I compare two strings "MY String" and "my string" and show they are different?

I need to do a query between two tables and find non-matching fields.
Table 1 field locations has "my String"
Table 2 field locations has "MY string"
They are equal as text but not by capitalization; I need this to return false.
Having the following data:
DECLARE #TableOne TABLE
(
[ID] TINYINT
,[Value] VARCHAR(12)
)
DECLARE #TableTwo TABLE
(
[ID] TINYINT
,[Value] VARCHAR(12)
)
INSERT INTO #TableOne ([ID], [Value])
VALUES (1,'my String')
INSERT INTO #TableTwo ([ID], [Value])
VALUES (1,'MY String')
You can use a case-sensitive collation like this:
SELECT [TO].[Value]
,[TW].[Value]
FROM #TableOne [TO]
INNER JOIN #TableTwo [TW]
ON [TO].[ID] = [TW].[ID]
AND [TO].[Value] <> [TW].[Value]
COLLATE Latin1_General_CS_AS
or use HASH functions like this:
SELECT [TO].[Value]
,[TW].[Value]
FROM #TableOne [TO]
INNER JOIN #TableTwo [TW]
ON [TO].[ID] = [TW].[ID]
WHERE HASHBYTES('SHA1', [TO].[Value]) <> HASHBYTES('SHA1', [TW].[Value])
DECLARE #Table1 AS TABLE (FieldName VARCHAR(100))
DECLARE #Table2 AS TABLE (FieldName VARCHAR(100))
INSERT INTO #Table1 (FieldName) VALUES ('MY Location')
INSERT INTO #Table2 (FieldName) VALUES ('My Location')
With the default case-insensitive collation, this matches and returns results:
SELECT * FROM #Table1 AS T1
INNER JOIN #Table2 AS T2
ON T1.FieldName = T2.FieldName
With a case-sensitive collation specified, this will not match:
SELECT * FROM #Table1 AS T1
INNER JOIN #Table2 AS T2
ON T1.FieldName = T2.FieldName COLLATE Latin1_General_CS_AS_KS_WS
Microsoft article on collation
