I can define nested classes (like embedded classes in JPA), following Learning Slick 2, like this:
case class Person(name: Name, address: Address)
case class Name(given: String, family: String)
case class Address(street: String, city: String)
class Directory(tag: Tag) extends Table[Person](tag, "directory") {
  def givenName = column[String]("given_name")
  def familyName = column[String]("family_name")
  def street = column[String]("street")
  def city = column[String]("city")
  def * = (name, address) <> (Person.tupled, Person.unapply)
  def name = (givenName, familyName) <> (Name.tupled, Name.unapply)
  def address = (street, city) <> (Address.tupled, Address.unapply)
}
I would rather not repeat the definitions of street, city and address in every table that embeds Address. I would like to write something like the code below, but it does not compile, for the obvious reason that column is a method on Table, etc. Is it possible to reuse column definitions somehow?
object Adresses {
  def street = column[String]("street")
  def city = column[String]("city")
  def address = (street, city) <> (Address.tupled, Address.unapply)
}
class Directory(tag: Tag) extends Table[Person](tag, "directory") {
  def givenName = column[String]("given_name")
  def familyName = column[String]("family_name")
  def * = (name, Adresses.address) <> (Person.tupled, Person.unapply) // address from Adresses
  def name = (givenName, familyName) <> (Name.tupled, Name.unapply)
}
You can make Adresses a trait that extends the Table class (yes, that works):
trait Adresses[T] extends Table[T] {
  def street = column[String]("street")
  def city = column[String]("city")
  def address = (street, city) <> (Address.tupled, Address.unapply)
}
class Directory(tag: Tag) extends Table[Person](tag, "directory") with Adresses[Person] {
  def givenName = column[String]("given_name")
  def familyName = column[String]("family_name")
  def * = (name, address) <> (Person.tupled, Person.unapply) // address from Adresses
  def name = (givenName, familyName) <> (Name.tupled, Name.unapply)
}
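The same trait can then be mixed into any other table that embeds an address. For example (a sketch; the Company case class and the companies table are made up for illustration):
case class Company(companyName: String, address: Address)

class Companies(tag: Tag) extends Table[Company](tag, "companies") with Adresses[Company] {
  def companyName = column[String]("company_name")
  // street, city and address all come from the Adresses trait
  def * = (companyName, address) <> (Company.tupled, Company.unapply)
}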
I have the following three models:
class User(AbstractUser):
    username = None
    name = models.CharField(max_length=255)
    email = models.EmailField(unique=True)
    account_type = models.CharField(
        choices=AccountType.choices,
        default=AccountType.GENERAL,
        max_length=8,
    )
    first_name = None
    last_name = None

    USERNAME_FIELD = "email"
    REQUIRED_FIELDS = ["name"]

    objects = UserManager()

    @property
    def total_likes(self):
        queryset = (
            self.works.filter(visibility=Visibility.PUBLIC)
            .select_related()
            .annotate(likes=models.Count("social"))
        )
        return (
            queryset.aggregate(models.Sum("likes")).get("likes__sum")
            or 0
        )

    def __str__(self) -> str:
        return self.name


class WorkImage(models.Model):
    user = models.ForeignKey(
        to=User, on_delete=models.CASCADE, related_name="works"
    )
    title = models.CharField(max_length=255)
    image = models.ImageField(upload_to=work_image_directory_path)

    @property
    def likes(self):
        return self.social.count()

    def __str__(self) -> str:
        return f"{self.user.name}'s {self.work_image_type} {self.title}"


class WorkImageSocial(models.Model):
    work = models.ForeignKey(
        to=WorkImage, on_delete=models.CASCADE, related_name="social"
    )
    liked_by = models.ForeignKey(
        to=User, on_delete=models.CASCADE, related_name="liked_works"
    )

    def __str__(self) -> str:
        return f"{self.liked_by.name}=>{self.work.title}"
I am writing a custom ordering filter to get the "most liked" users. What I want to do is annotate each user with a field called "likes" holding the sum of all likes on their work images. How do I achieve this?
In our integration tests we want to compare every field of an object returned by a REST controller with an object constructed in the test.
This example illustrates the problem:
class RestIntegrationTest extends Specification {

    def "Should return contracts"() {
        when:
        def actual = callRestController()

        then:
        // compare all fields of actual with "contract"
        actual == new Contract(
            number: "123",
            signDate: "2017-04-01",
            address: new Address(
                name: "Foobar",
                street: "Foostreet",
                city: "Frankfurt",
                zip: "60486"
            ),
            persons: [new Person(name: "Christian")]
        )
    }

    def callRestController() {
        return new Contract(
            number: "123",
            signDate: "2017-04-01",
            address: new Address(
                name: "Foobar",
                street: "Wrong Street",
                city: "Frankfurt",
                zip: "60486"
            ),
            persons: [new Person(name: "Frank")]
        )
    }

    static class Contract {
        String number
        String signDate
        Address address
        Person[] persons
    }

    static class Address {
        String name
        String street
        String city
        String zip
    }

    static class Person {
        String name
    }
}
As output we would expect something like this:
address.street "Wrong Street" != "Foostreet"
persons[0].name "Frank" != "Christian"
Breaking the assertion into multiple "==" lines would produce the desired output, but that is not practical since some of the objects are quite large.
You can try Groovy's @EqualsAndHashCode:
import groovy.transform.EqualsAndHashCode

@EqualsAndHashCode
static class Address {
    String name
    String street
    String city
    String zip
}
You can use Unitils' assertReflectionEquals:
http://unitils.sourceforge.net/tutorial-reflectionassert.html
It's not comprehensive but may be sufficient for your needs:
def compareFields( obj1, obj2, propName = null ) {
    obj1.properties.each {
        // recurse into arrays, comparing element by element
        if ( it.value instanceof Object[] ) {
            def obj2Len = obj2."${it.key}".length
            it.value.eachWithIndex { collObj, idx ->
                if ( idx + 1 <= obj2Len )
                    compareFields( collObj, obj2."${it.key}"[idx], "${it.key}[${idx}]" )
            }
        }
        // recurse into nested non-JDK objects
        if ( !it.value.class.getCanonicalName().contains( 'java' ) ) {
            compareFields( it.value, obj2."${it.key}", it.key )
        }
        // print a diff line for mismatching simple (JDK) values
        if ( it.value.class.getCanonicalName().contains( 'java' ) &&
             it.key != 'class' &&
             it.value <=> obj2."${it.key}" ) {
            println "${propName ? "$propName." : ''}${it.key}: '${it.value}' != '" + obj2."${it.key}" + "'"
        }
    }
}
I have a table with a column of type date. This column accepts null values, so I declared it as an Option (see the field perDate below). The issue is that the implicit conversion between java.time.LocalDate and java.sql.Date is apparently incorrect, as reading from this table when perDate is null fails with the error:
slick.SlickException: Read NULL value (null) for ResultSet column <computed>
This is the Slick table definition, including the implicit function:
import java.sql.Date
import java.time.LocalDate
class FormulaDB(tag: Tag) extends Table[Formula](tag, "formulas") {
  def sk = column[Int]("sk", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def descrip = column[Option[String]]("descrip")
  def formula = column[Option[String]]("formula")
  def notes = column[Option[String]]("notes")
  def periodicity = column[Int]("periodicity")
  def perDate = column[Option[LocalDate]]("per_date")(localDateColumnType)

  def * = (sk, name, descrip, formula, notes, periodicity, perDate) <>
    ((Formula.apply _).tupled, Formula.unapply)

  implicit val localDateColumnType = MappedColumnType.base[Option[LocalDate], Date](
    {
      case Some(localDate) => Date.valueOf(localDate)
      case None => null
    }, {
      sqlDate => if (sqlDate != null) Some(sqlDate.toLocalDate) else None
    }
  )
}
Actually, your implicit conversion between java.time.LocalDate and java.sql.Date is not incorrect.
I faced the same error, and after some research I found that the node created by the Slick SQL compiler is actually of type MappedJdbcType[Scala.Option -> LocalDate], not Option[LocalDate].
That is why, when the mapping compiler creates the column converter for your perDate column, it creates a base ResultConverter rather than an option ResultConverter.
Here is the Slick code for the base converter:
def base[T](ti: JdbcType[T], name: String, idx: Int) = (ti.scalaType match {
  case ScalaBaseType.byteType => new BaseResultConverter[Byte](ti.asInstanceOf[JdbcType[Byte]], name, idx)
  case ScalaBaseType.shortType => new BaseResultConverter[Short](ti.asInstanceOf[JdbcType[Short]], name, idx)
  case ScalaBaseType.intType => new BaseResultConverter[Int](ti.asInstanceOf[JdbcType[Int]], name, idx)
  case ScalaBaseType.longType => new BaseResultConverter[Long](ti.asInstanceOf[JdbcType[Long]], name, idx)
  case ScalaBaseType.charType => new BaseResultConverter[Char](ti.asInstanceOf[JdbcType[Char]], name, idx)
  case ScalaBaseType.floatType => new BaseResultConverter[Float](ti.asInstanceOf[JdbcType[Float]], name, idx)
  case ScalaBaseType.doubleType => new BaseResultConverter[Double](ti.asInstanceOf[JdbcType[Double]], name, idx)
  case ScalaBaseType.booleanType => new BaseResultConverter[Boolean](ti.asInstanceOf[JdbcType[Boolean]], name, idx)
  case _ => new BaseResultConverter[T](ti.asInstanceOf[JdbcType[T]], name, idx) {
    override def read(pr: ResultSet) = {
      val v = ti.getValue(pr, idx)
      if(v.asInstanceOf[AnyRef] eq null) throw new SlickException("Read NULL value ("+v+") for ResultSet column "+name)
      v
    }
  }
}).asInstanceOf[ResultConverter[JdbcResultConverterDomain, T]]
Unfortunately I have no solution for this problem. What I suggest as a workaround is to map your perDate property as follows:
import java.sql.Date
import java.time.LocalDate
class FormulaDB(tag: Tag) extends Table[Formula](tag, "formulas") {
  def sk = column[Int]("sk", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def descrip = column[Option[String]]("descrip")
  def formula = column[Option[String]]("formula")
  def notes = column[Option[String]]("notes")
  def periodicity = column[Int]("periodicity")
  def perDate = column[Option[Date]]("per_date")

  def toLocalDate(time: Option[Date]): Option[LocalDate] = time.map(t => t.toLocalDate)
  def toSQLDate(localDate: Option[LocalDate]): Option[Date] = localDate.map(localDate => Date.valueOf(localDate))

  private type FormulaEntityTupleType = (Int, String, Option[String], Option[String], Option[String], Int, Option[Date])

  private val formulaShapedValue = (sk, name, descrip, formula, notes, periodicity, perDate).shaped[FormulaEntityTupleType]

  private val toFormulaRow: (FormulaEntityTupleType => Formula) = { formulaTuple =>
    Formula(formulaTuple._1, formulaTuple._2, formulaTuple._3, formulaTuple._4, formulaTuple._5, formulaTuple._6, toLocalDate(formulaTuple._7))
  }

  private val toFormulaTuple: (Formula => Option[FormulaEntityTupleType]) = { formulaRow =>
    Some((formulaRow.sk, formulaRow.name, formulaRow.descrip, formulaRow.formula, formulaRow.notes, formulaRow.periodicity, toSQLDate(formulaRow.perDate)))
  }

  def * = formulaShapedValue <> (toFormulaRow, toFormulaTuple)
}
Hopefully the answer does not come too late.
I'm pretty sure the problem is that you're returning null from your mapping function instead of None.
Try rewriting your mapping function as a function from LocalDate to Date:
implicit val localDateColumnType = MappedColumnType.base[LocalDate, Date](
  { localDate => Date.valueOf(localDate) },
  { sqlDate => sqlDate.toLocalDate }
)
Alternately, mapping from Option[LocalDate] to Option[Date] should work:
implicit val localDateColumnType =
  MappedColumnType.base[Option[LocalDate], Option[Date]](
    { localDateOption => localDateOption.map(Date.valueOf) },
    { sqlDateOption => sqlDateOption.map(_.toLocalDate) }
  )
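With the plain LocalDate-to-Date mapping from the first variant in scope, Slick lifts it into optional columns automatically, so the column definition itself can stay unchanged (a minimal sketch, assuming that implicit is visible in the table class):
def perDate = column[Option[LocalDate]]("per_date") // Slick derives the Option mapping from the implicit LocalDate <-> Date column type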
I need to collect report data from a master-detail relation. Here is a simplified example:
case class Person(id: Int, name: String)
case class Order(id: String, personId: Int, description: String)

class PersonTable(tag: Tag) extends Table[Person](tag, "person") {
  def id = column[Int]("id")
  def name = column[String]("name")
  override def * = (id, name) <> (Person.tupled, Person.unapply)
}

class OrderTable(tag: Tag) extends Table[Order](tag, "order") {
  def id = column[String]("id")
  def personId = column[Int]("personId")
  def description = column[String]("description")
  override def * = (id, personId, description) <> (Order.tupled, Order.unapply)
}

val persons = TableQuery[PersonTable]
val orders = TableQuery[OrderTable]
case class PersonReport(nameToDescription: Map[String, Seq[String]])
/** Some complex function that cannot be expressed in SQL or
  * in Slick's #join.
  */
def myScalaCondition(person: Person): Boolean =
  person.name.contains("1")
// Doesn't compile:
// val reportDbio1: DBIO[PersonReport] =
//   (for { allPersons <- persons.result
//          person <- allPersons
//          if myScalaCondition(person)
//          descriptions <- orders.
//            filter(_.personId === person.id).
//            map(_.description).result
//   } yield (person.name, descriptions)
//   ).map(s => PersonReport(s.toMap))
val reportDbio2: DBIO[PersonReport] =
  persons.result.flatMap { allPersons =>
    val dbios = allPersons.
      filter(myScalaCondition).map { person =>
        orders.
          filter(_.personId === person.id). // === for the column comparison
          map(_.description).result.map { seq => (person.name, seq) }
      }
    DBIO.sequence(dbios)
  }.map(ps => PersonReport(ps.toMap))
This is far from straightforward, and when I need to collect master-detail data three levels deep it becomes incomprehensible.
Is there a better way?
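One way to reduce the nesting (a sketch, not an authoritative answer; reportDbio3 is a made-up name and, as in the code above, an implicit ExecutionContext is assumed to be in scope) is to run a single detail query with inSet for all the ids that pass the Scala filter and group the rows in memory:
val reportDbio3: DBIO[PersonReport] =
  persons.result.flatMap { allPersons =>
    val selected = allPersons.filter(myScalaCondition)
    orders.
      filter(_.personId inSet selected.map(_.id)). // one query for all selected persons
      map(o => (o.personId, o.description)).
      result.
      map { rows =>
        val byId = rows.groupBy(_._1) // personId -> rows for that person
        PersonReport(selected.map(p => p.name -> byId.getOrElse(p.id, Seq.empty).map(_._2)).toMap)
      }
  }
This issues two queries in total instead of one per person, and the in-memory grouping extends more naturally to a third level than nested DBIO.sequence calls do.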
Is there a way in Groovy to do something like this:
class Person {
    def name, surname
}

void aMethod(anotherBean) {
    def bean = retrieveMyBean()
    bean.properties = anotherBean.properties
}
The properties property is final; is there another way to achieve this shortcut?
properties is a virtual property; you have to call the individual setters. Try this:
def values = [name: 'John', surname: 'Lennon']
values.each { key, value ->
    p.setProperty( key, value )
}
Or, using the MOP:
Object.metaClass.putAllProperties = { Map values ->
    values.each { key, value ->
        delegate.setProperty( key, value )
    }
}

Person p = new Person()
p.putAllProperties( [name: 'John', surname: 'Lennon'] )
[EDIT] To achieve what you want, you must loop over the properties. This blog post describes how to do that:
def copyProperties(def source, def target) {
    target.metaClass.properties.each {
        if (source.metaClass.hasProperty(source, it.name) && it.name != 'metaClass' && it.name != 'class')
            it.setProperty(target, source.metaClass.getProperty(source, it.name))
    }
}
If you don't have any special reason not to, just use named parameters:
def p = new Person(name: 'John', surname: 'Lennon')
After the question was updated:
static copyProperties(from, to) {
    from.properties.each { key, value ->
        if (to.hasProperty(key) && !(key in ['class', 'metaClass']))
            to[key] = value
    }
}