Selecting from existing Records with jOOQ

Is it possible with jOOQ to create a VALUES() expression from existing Records without losing meta information?
Something like:
List<ExampleEntityRecord> records = ...
select()
    .from(values(/* records */))
    .where(EXAMPLE_ENTITY.EXAMPLE_FIELD.eq("some value"))
Currently I am using values(*records.map { it.valuesRow() }.toTypedArray()). However, the result is a Table<RecordN</* ... */>> without meta information about the fields.

You can use the DSL.table(Result<R>) or DSL.table(R...) convenience methods, which maintain the <R extends Record> type in the resulting Table<R> type:
ExampleEntityRecord[] records = ...
select()
    .from(table(records).as(EXAMPLE_ENTITY))
    .where(EXAMPLE_ENTITY.EXAMPLE_FIELD.eq("some value"))
Note that the overload accepting a Collection<?> does something slightly different, unfortunately.
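For example, if your records are in a List (as in the question) rather than an array, converting to an array first makes sure the record-type-preserving overload is chosen. A minimal Java sketch, assuming a DSLContext named ctx and the generated EXAMPLE_ENTITY table:
// Convert the List to an array so that DSL.table(R...) is used,
// not the Collection<?> overload
ctx.select()
   .from(table(records.toArray(new ExampleEntityRecord[0])).as(EXAMPLE_ENTITY))
   .where(EXAMPLE_ENTITY.EXAMPLE_FIELD.eq("some value"))
   .fetch();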

Related

Use JOOQ Multiset with custom RecordMapper - How to create Field<List<String>>?

Suppose I have two tables USER_GROUP and USER_GROUP_DATASOURCE. I have a classic relation where one userGroup can have multiple dataSources, and a DataSource is simply a String.
For various reasons (mainly compatibility with the rest of the codebase, and being explicit about what's happening), I have a custom RecordMapper that creates a Java UserGroup POJO. This mapper sometimes creates POJOs containing data only from the USER_GROUP table, and sometimes also the left-joined dataSources.
Currently, I am trying to write the Multiset query along with the custom record mapper. My query thus far looks like this:
List<UserGroup> userGroups = ctx
    .select(
        asterisk(),
        multiset(
            select(USER_GROUP_DATASOURCE.DATASOURCE_ID)
            .from(USER_GROUP_DATASOURCE)
            .where(USER_GROUP.ID.eq(USER_GROUP_DATASOURCE.USER_GROUP_ID))
        ).as("datasources").convertFrom(r -> r.map(Record1::value1))
    )
    .from(USER_GROUP)
    .where(condition)
    .fetch(new UserGroupMapper());
Now my question is: How to create the UserGroupMapper? I am stuck right here:
public class UserGroupMapper implements RecordMapper<Record, UserGroup> {
    @Override
    public UserGroup map(Record rec) {
        UserGroup grp = new UserGroup(
            rec.getValue(USER_GROUP.ID),
            rec.getValue(USER_GROUP.NAME),
            rec.getValue(USER_GROUP.DESCRIPTION),
            javaParseTags(USER_GROUP.TAGS)
        );
        // Convention: if we have an additional field "datasources", we assume
        // it to be a list of dataSources to be filled in
        if (rec.indexOf("datasources") >= 0) {
            // How to make `rec.getValue` return my List<String>????
            List<String> dataSources = ?????
            grp.dataSources.addAll(dataSources);
        }
        return grp;
    }
}
My guess is to have something like List<String> dataSources = rec.getValue(..) where I pass in a Field<List<String>>, but I have no clue how I could create such a Field<List<String>> with something like DSL.field().
How to get a type-safe reference to your field from your RecordMapper
There are mostly two ways to do this:
Keep a reference to your multiset() field definition somewhere, and reuse that. Keep in mind that every jOOQ query is a dynamic SQL query, so you can use this feature of jOOQ to assign arbitrary query fragments to local variables (or return them from methods) in order to improve code reuse; see the sketch after this list.
You can just raw type cast the value and not care about type safety. It's always an option, even if not the cleanest one.
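A minimal sketch of the first option, reusing the names from the query above (the local variable name is an assumption):
// Assign the projected multiset field to a local variable (or return it
// from a method), so both the query and the RecordMapper can reference it
Field<List<String>> datasources = multiset(
        select(USER_GROUP_DATASOURCE.DATASOURCE_ID)
        .from(USER_GROUP_DATASOURCE)
        .where(USER_GROUP.ID.eq(USER_GROUP_DATASOURCE.USER_GROUP_ID)))
    .as("datasources")
    .convertFrom(r -> r.map(Record1::value1));

// Project `datasources` in the select(...) clause, then read it back
// type-safely inside the mapper:
List<String> dataSources = rec.get(datasources);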
How to improve your query
Unless you're re-using that RecordMapper several times for different types of queries, why not use Java's type inference instead? The main reason you're not getting type information in your output is your asterisk() usage. But what if you did this instead:
List<UserGroup> userGroups = ctx
    .select(
        USER_GROUP, // Instead of asterisk()
        multiset(
            select(USER_GROUP_DATASOURCE.DATASOURCE_ID)
            .from(USER_GROUP_DATASOURCE)
            .where(USER_GROUP.ID.eq(USER_GROUP_DATASOURCE.USER_GROUP_ID))
        ).as("datasources").convertFrom(r -> r.map(Record1::value1))
    )
    .from(USER_GROUP)
    .where(condition)
    .fetch(r -> {
        UserGroupRecord ug = r.value1();
        List<String> list = r.value2(); // Type information available now
        // ...
    });
There are other ways than the above, which uses jOOQ 3.17+'s support for Table as SelectField. In jOOQ 3.16+, for example, you can use row(USER_GROUP.fields()) instead.
The important part is that you avoid the asterisk() expression, which removes type safety. You could even convert the USER_GROUP to your UserGroup type using USER_GROUP.convertFrom(r -> ...) when you project it:
List<UserGroup> userGroups = ctx
    .select(
        USER_GROUP.convertFrom(r -> ...),
        // ...
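A hedged sketch completing that idea, combined with the reusable datasources field from the earlier sketch (the UserGroupRecord getters and the constructor arguments are assumptions based on the mapper above):
List<UserGroup> userGroups = ctx
    .select(
        // Ad-hoc converter: turn the UserGroupRecord into the POJO directly
        USER_GROUP.convertFrom(r -> new UserGroup(
            r.getId(),
            r.getName(),
            r.getDescription(),
            javaParseTags(r.getTags()))),
        datasources
    )
    .from(USER_GROUP)
    .where(condition)
    .fetch(r -> {
        UserGroup grp = r.value1();
        grp.dataSources.addAll(r.value2());
        return grp;
    });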

How to use BQL In Operator with Select2 query / PXProjection and list of values

I am trying to replicate the following type of SQL query that you can perform in SQL Server...the important part here is the WHERE clause:
Select InventoryCD from InventoryItem WHERE InventoryCD IN ('123123', '154677', '445899', '998766')
It works perfectly using the IN3<> operator and a series of string constants:
i.e. And<InventoryItem.inventoryCD, In3<constantA,constantB,constantC>,
However, I need to be able to do this with an arbitrarily long list of values in an array, and I need to be able to set the values dynamically at runtime.
I'm not sure what type I need to pass in to the IN<> statement in my PXProjection query. I have been playing around with the following approach, but this throws a compiler error.
public class SOSiteStatusFilterExt : PXCacheExtension<SOSiteStatusFilter>
{
    public static bool IsActive()
    {
        return true;
    }

    public abstract class searchitemsarray : PX.Data.IBqlField
    {
    }

    [PXUnboundDefault()]
    public virtual string[] Searchitemsarray { get; set; }
}
I think maybe I need an array of PXString objects? I'm really not sure, and there isn't any documentation that is helpful. Can anyone help?
This shows how to do it with a regular PXSelect: https://asiablog.acumatica.com/2017/11/sql-in-operator-in-bql.html
But I need to be able to pass in the correct type using Select2...
For reference I will post here the example mentioned by Hugues in the comments.
If you need to generate a query with an arbitrary list of values generated at runtime like this:
Select * from InventoryItem InventoryItem
Where InventoryItem.InventoryCD IN ('123123', '154677', '445899', '998766')
Order by InventoryItem.InventoryCD
You would write something like this:
Object[] values = new String[] { "123123", "154677", "445899", "998766" };
InventoryItem item = PXSelect<InventoryItem,
    Where<InventoryItem.inventoryCD,
        In<Required<InventoryItem.inventoryCD>>>>.Select(Base, values);
Please note that the In<> operator is available only with a Required<> parameter, and you need to pass the array of possible values manually to the Select(…) method parameters. So you need to fill this array with your list before calling the Select method.
Also, the Required<> parameter should be used only in the BQL statements that are directly executed in the application code. The data views that are queried from the UI will not work if they contain Required<> parameters.
I ended up creating 100 variables and using the BQL OR operator, i.e.
And2<Where<InventoryItem.inventoryCD, Equal<CurrentValue<SOSiteStatusFilterExt.Pagefilter1>>,
Or<InventoryItem.inventoryCD, Equal<CurrentValue<SOSiteStatusFilterExt.Pagefilter2>>,
Or<InventoryItem.inventoryCD, Equal<CurrentValue<SOSiteStatusFilterExt.Pagefilter3>>,
Or<InventoryItem.inventoryCD, Equal<CurrentValue<SOSiteStatusFilterExt.Pagefilter4>>,
etc...etc...
You can then set the value of Pagefilter1, 2, etc. inside the FieldSelecting event for SOSiteStatusFilter.inventory, as an example. The key insight here, which isn't obvious to the uninitiated in Acumatica, is that all variables parameterized in SQL Server via BQL are nullable. If a variable is null when the query is run, the framework disables it in the SQL procedure call using a "bit flipping" approach. In normal T-SQL this would throw an error, but Acumatica's framework handles a NULL field equality by disabling that variable inside the SQL procedure before the equality is evaluated.
Performance with this approach was very good, especially because we are querying on an indexed key field (InventoryCD isn't technically the primary key but it is basically a close second).
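As a hedged illustration of the FieldSelecting approach mentioned above: the sketch below assumes the Pagefilter1, Pagefilter2, ... fields are declared as string properties on SOSiteStatusFilterExt, that the handler lives in a graph extension on the graph owning the filter (SOOrderEntry is an assumption here), and that GetRuntimeValues is a hypothetical source of the runtime list:
public class SOOrderEntryExt : PXGraphExtension<SOOrderEntry>
{
    protected virtual void SOSiteStatusFilter_Inventory_FieldSelecting(
        PXCache cache, PXFieldSelectingEventArgs e)
    {
        var row = e.Row as SOSiteStatusFilter;
        if (row == null) return;

        var ext = cache.GetExtension<SOSiteStatusFilterExt>(row);
        string[] values = GetRuntimeValues(); // hypothetical runtime list

        // Spread the values across the filter fields; filters left null are
        // disabled by the framework when the query runs
        ext.Pagefilter1 = values.Length > 0 ? values[0] : null;
        ext.Pagefilter2 = values.Length > 1 ? values[1] : null;
        // ...and so on for the remaining Pagefilter fields
    }

    private string[] GetRuntimeValues()
    {
        // Hypothetical: obtain the arbitrary list of values at runtime
        return new[] { "123123", "154677", "445899", "998766" };
    }
}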

Is it possible to do a Lookup using Kiba

Is it possible to do a "Lookup" with Kiba.
Since it's quite a normal process in a etl.
Could you show a demo if yes, thanks.
Yes, a lookup can be done with Kiba!
For a tutorial, see this live coding session I recorded, in which I create a lookup transform that looks up extra fields from a given field by tapping into the MovieDB database.
Leveraging this example, you could for instance implement a simple ActiveRecord lookup using a block transform:
# assuming you have a 'country_iso_2' field in the row above
transform do |row|
  country = Country.where(iso_2: row['country_iso_2']).first
  row['country_name'] = country.try(:name) || 'Unknown'
  row
end
or you could extract a more reusable class transform that you would call like this:
transform ActiveRecordLookup, model: Country,
  lookup_on: 'country_iso_2',
  fetch_fields: { 'name' => 'country_name' }
transform DefaultValue, 'name' => 'Unknown'
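A hypothetical implementation of such a class transform, matching the call above (ActiveRecordLookup is not part of Kiba itself; process(row) is the contract Kiba expects of class transforms):
class ActiveRecordLookup
  def initialize(model:, lookup_on:, fetch_fields:)
    @model = model
    @lookup_on = lookup_on
    @fetch_fields = fetch_fields
  end

  def process(row)
    # Look up the record whose column matches the row field, then copy
    # the requested fields into the row under their target names
    record = @model.where(@lookup_on => row[@lookup_on]).first
    @fetch_fields.each do |source_field, target_field|
      row[target_field] = record && record.public_send(source_field)
    end
    row
  end
end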
Obviously, if you need to handle large volumes, you will have to implement some improvements (e.g. caching, bulk reading).
Hope this helps!

How to convert Rep[T] to T in slick 3.0?

I am using code generated by the Slick code generator.
My table has more than 22 columns, hence it uses an HList.
It generates 1 type and 1 function:
type AccountRow
def AccountRow(uuid: java.util.UUID, providerid: String, email: Option[String], ...):AccountRow
How do I write compiled insert code from generated code?
I tried this:
val insertAccountQueryCompiled = {
  def q(uuid: Rep[UUID], providerId: Rep[String], email: Rep[Option[String]], ...) =
    Account += AccountRow(uuid, providerId, email, ...)
  Compiled(q _)
}
I need to convert Rep[T] to T for AccountRow function to work. How do I do that?
Thank you
TL;DR: Not possible.
Explanation
There are two levels of abstraction in Slick: Querys and DBIOActions.
When you're dealing with Querys, you work with your schema definitions, rows, and Reps; it's all very constrained, as it's the closest level of abstraction to the actual DB you're using. A Rep refers to a hypothetical value in the database, not in your program.
Then you have DBIOActions, which are the next level: not just the definition of a query, but its execution. You usually get DBIOActions when getting information out of a query, like with the result method, or (TADAN!) when inserting rows.
Inserts and updates are not queries, so what you're trying to do is not possible. You're dealing with DBIOAction stuff (the += method) and Query stuff (the Rep types). The only way to use a queried value inside a DBIOAction is to execute the Query, obtain a DBIOAction, and then compose both actions using flatMap or a for comprehension (which is the same thing).
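To illustrate that composition, a hedged, self-contained sketch (the Users table is hypothetical, not the generated Account code; the point is that flatMap hands you the materialized String, not a Rep[String]):
import slick.jdbc.PostgresProfile.api._
import scala.concurrent.ExecutionContext.Implicits.global

// Hypothetical table for illustration
class Users(tag: Tag) extends Table[(Int, String)](tag, "users") {
  def id    = column[Int]("id", O.PrimaryKey)
  def email = column[String]("email")
  def *     = (id, email)
}
val users = TableQuery[Users]

// `result` turns the Query into a DBIOAction; flatMap then gives us the
// materialized value, which we can use to build the insert's row
val copyEmail: DBIO[Int] =
  users.filter(_.id === 1).map(_.email).result.head.flatMap { email =>
    users += ((2, email))
  }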

Is it possible to do data type conversion on SQLBulkUpload from IDataReader?

I need to grab a large amount of data from one set of tables and SQLBulkInsert it into another set... unfortunately the source tables are ALL varchar(max), and I would like the destination to be the correct type. Some tables are in the millions of rows... and (for far too pointless political reasons to go into) we can't use SSIS.
On top of that, some "bool" values are stored as "Y/N", some "0/1", some "T/F" some "true/false" and finally some "on/off".
Is there a way to overload IDataReader to perform type conversion? Would need to be on a per-column basis I guess?
An alternative (and it might be the best solution) is to put a mapper in place (perhaps AutoMapper or custom) and use EF to load from one object and map into the other? This would provide a lot of control but also require a lot of boilerplate code for every property :(
In the end I wrote a base wrapper class that holds the SQLDataReader and implements the IDataReader methods by simply delegating to the corresponding SQLDataReader method.
Then inherit from the base class and override GetValue on a per-case basis, looking for the column names that need translating:
public override object GetValue(int i)
{
    var landingColumn = GetName(i);
    string landingValue = base.GetValue(i).ToString();
    object stagingValue = null;

    switch (landingColumn)
    {
        case "D4DTE": stagingValue = landingValue.FromStringDate(); break;
        case "D4BRAR": stagingValue = landingValue.ToDecimal(); break;
        default:
            stagingValue = landingValue;
            break;
    }

    return stagingValue;
}
Works well, is extensible, and very fast thanks to SQLBulkUpload. OK, so there's a small maintenance overhead, but since the source columns will very rarely change, this doesn't really affect anything.
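For the mixed boolean encodings mentioned in the question, a hypothetical ToBool extension in the same style as FromStringDate/ToDecimal above (those are custom helpers, not framework methods) could be added and called from another case branch:
using System;

public static class BoolParsing
{
    // Normalizes the source encodings ("Y/N", "0/1", "T/F",
    // "true/false", "on/off") into a real bool for the staging column
    public static bool ToBool(this string value)
    {
        switch (value?.Trim().ToLowerInvariant())
        {
            case "y": case "1": case "t": case "true": case "on":
                return true;
            case "n": case "0": case "f": case "false": case "off":
                return false;
            default:
                throw new FormatException($"Unrecognized boolean value: '{value}'");
        }
    }
}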
