Slick 3.0 Mapping to a List

I have a mapping definition in Slick as below:
class MyTable(tag: Tag) extends Table[MyTableRow](tag, "myTable") {
  def id = column[Option[Long]]("id", O.PrimaryKey, O.AutoInc)
  def otherTableIds = column[Seq[Long]]("ids")
}
The other table looks like:
class MyOtherTable(tag: Tag) extends Table[MyOtherTableRow](tag, "myOtherTable") {
  def id = column[Option[Long]]("id", O.PrimaryKey, O.AutoInc)
  def myTableId = // this is a foreign key to MyTable
}
The def otherTableIds should be populated from another table, MyOtherTable. MyTable has a one-to-many relationship with MyOtherTable. What I want Slick to do is fetch only the ids of the MyOtherTable rows when I load the MyTable row for a given id!
Any clues?
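For reference, a minimal sketch of the kind of query being asked for, assuming MyOtherTable exposes a myTableId foreign-key column pointing back at MyTable (the column name, its type, and the profile import are assumptions, not part of the original code):

import slick.driver.H2Driver.api._ // or whichever Slick 3 profile the project uses

// Hedged sketch: load only the child ids for a given parent id.
// Assumes MyOtherTable defines: def myTableId = column[Long]("myTableId")
val myOtherTables = TableQuery[MyOtherTable]

def otherTableIdsFor(parentId: Long): DBIO[Seq[Option[Long]]] =
  myOtherTables
    .filter(_.myTableId === parentId)
    .map(_.id)
    .result

Running the action with db.run(otherTableIdsFor(someId)) returns just the ids without materialising the full child rows.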

Related

How to connect to Cassandra at application startup

I have a Play application which needs to connect to Cassandra. I am using Datastax's driver to connect to Cassandra.
I am able to connect to the db from a controller. The code snippet is below (the full code is from http://manuel.kiessling.net/setting-up-a-scala-sbt-multi-project-with-cassandra-connectivity-and-migrations):
val cluster = new Cluster.Builder().
  addContactPoints(uri.hosts.toArray: _*).
  withPort(uri.port).
  withQueryOptions(new QueryOptions().setConsistencyLevel(defaultConsistencyLevel)).build
val session = cluster.connect
session.execute(s"USE ${uri.keyspace}")
session
I am using the above code in a controller as follows:
class UserController @Inject()(cc: ControllerComponents)(implicit exec: ExecutionContext) extends AbstractController(cc) {
  def addUser = Action.async { implicit request => {
    println("addUser controller called")
    println("testing database connection")
    val uri = CassandraConnectionUri("cassandra://localhost:9042/killrvideo")
    println(s"got uri object ${uri.host}, ${uri.hosts}, ${uri.port}, ${uri.keyspace}")
    val session = Helper.createSessionAndInitKeyspace(uri)
    val resultSet = session.execute(s"select * from users")
    val row = resultSet.one()
    println(s"got row $row")
    val user = User(UUID.randomUUID(), UserProfile(true, Some("m@m.com"), Some("m"), Some("c")))
    ...
}
Though the code works, I suppose I shouldn't be connecting to the database from within a controller. I should connect to the database when the Play application starts and inject the connection into the controller. But I don't know how to do this. Is this the right way to create a database application in Play?
Short description:
It's not good practice to connect to C* from a controller class. It is encouraged to have a separate repository/storage class for DB access. You create a DB-access class and inject that class into your controller class's constructor.
Here is an open-source sample application that I followed to create my own Cassandra application: Play-Framework-Cassandra-Example. You can follow this project.
Long description:
Here are some basic concepts how to do it:
Step 1:
Define the DB configuration in the application.conf file:
db {
  keyspace = "persons"
  table = "person_info"
  preparedStatementCacheSize = 100
  session {
    contactPoints = ["127.0.0.1"]
    queryOptions {
      consistencyLevel = "LOCAL_QUORUM"
    }
  }
}
Step 2: Create a singleton class to maintain the connection to the Cassandra DB:
class CassandraConnectionProvider @Inject()(config: Configuration) extends Provider[CassandraConnection] {
  override def get(): CassandraConnection = {
    val hosts = config.getStringList("db.session.contactPoints")
    val keyspace = config.getString("db.keyspace")
    // Use the Cluster Builder if you need to add username/password and handle SSL or tweak the connection
    ContactPoints(hosts.asScala).keySpace(keyspace)
  }
}
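For this provider to be injectable, it still has to be bound in a Guice module. A small sketch, assuming Guice is the DI framework; the module name is illustrative and not taken from the linked project:

import com.google.inject.AbstractModule

// Hypothetical module: binds CassandraConnection to the provider above so it
// can be injected into repositories and controllers as an eager singleton.
class CassandraModule extends AbstractModule {
  override def configure(): Unit = {
    bind(classOf[CassandraConnection])
      .toProvider(classOf[CassandraConnectionProvider])
      .asEagerSingleton()
  }
}

Play picks the module up once it is listed in application.conf via play.modules.enabled += "CassandraModule" (adjust the package prefix to wherever the class actually lives).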
Step 3: Now create a repository class where you can perform CRUD operations on the DB.
class PhantomPersonRepository @Inject()(config: Configuration, connection: CassandraConnection, ec: ExecutionContext)
  extends CassandraTable[PhantomPersonRepository, Person] with PersonRepository[Future] {

  // See https://github.com/outworkers/phantom/wiki/Using-the-Database-class-and-understanding-connectors
  implicit val session: Session = connection.session
  implicit val keySpace: KeySpace = connection.provider.space
  override val tableName: String = config.getString("db.table").getOrElse("person_info")
  implicit val executionContext: ExecutionContext = ec

  object id extends UUIDColumn(this) with PartitionKey
  object firstName extends StringColumn(this) {
    override def name: String = "first_name"
  }
  object lastName extends StringColumn(this) {
    override def name: String = "last_name"
  }
  object studentId extends StringColumn(this) {
    override def name: String = "student_id"
  }
  object gender extends EnumColumn[Gender.Value](this)

  override implicit val monad: Monad[Future] = cats.instances.future.catsStdInstancesForFuture

  override def create(person: Person): Future[Person] =
    insert.value(_.id, person.id)
      .value(_.firstName, person.firstName)
      .value(_.lastName, person.lastName)
      .value(_.studentId, person.studentId)
      .value(_.gender, person.gender)
      .consistencyLevel_=(ConsistencyLevel.LOCAL_QUORUM)
      .future()
      .map(_ => person)

  // https://github.com/outworkers/phantom/wiki/Querying#query-api
  override def find(personId: UUID): Future[Option[Person]] =
    select.where(_.id eqs personId)
      .consistencyLevel_=(ConsistencyLevel.LOCAL_QUORUM)
      .one()

  override def update(person: Person): Future[Person] = create(person)
.....
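For context, the PersonRepository[F[_]] abstraction that this class implements, and that the controller below depends on, is not shown here. A rough sketch of its likely shape, inferred from the methods used above rather than copied from the example project:

// Hypothetical shape of the repository abstraction: the phantom-backed class
// above and any test double can both implement it; F is Future in production.
trait PersonRepository[F[_]] {
  implicit def monad: Monad[F]
  def create(person: Person): F[Person]
  def find(personId: UUID): F[Option[Person]]
  def update(person: Person): F[Person]
}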
Step 4: Now inject this repository class into your controller class and access the DB:
@Singleton
class PersonController @Inject()(personRepo: PersonRepository[Future])(implicit ec: ExecutionContext) extends Controller {

  def create: Action[JsValue] = Action.async(parse.json) { request =>
    onValidationSuccess[CreatePerson](request.body) { createPerson =>
      val person = Person(UUID.nameUUIDFromBytes(createPerson.studentId.getBytes()), createPerson.firstName,
        createPerson.lastName, createPerson.studentId, createPerson.gender.toModel)
      personRepo.find(person.id).flatMap {
        case None => personRepo.create(person).map(createdPerson => Created(createdPerson.toJson))
        case Some(existing) => Future.successful(Conflict(existing.toJson))
      }.recover { case _ => ServiceUnavailable }
    }
  }
.....
Hope this helps. All code credit goes to calvinlfer.

Cassandra phantom-dsl derived column is missing in create database generated queries

I have the following table definition:
import com.outworkers.phantom.builder.primitives.Primitive
import com.outworkers.phantom.dsl._

abstract class DST[V, P <: TSP[V], T <: DST[V, P, T]] extends Table[T, P] {
  object entityKey extends StringColumn with PartitionKey {
    override lazy val name = "entity_key"
  }

  abstract class entityValue(implicit ev: Primitive[V]) extends PrimitiveColumn[V] {
    override lazy val name = "entity_value"
  }
}
In the concrete table subclass:
abstract class SDST[P <: TSP[String]] extends DST[String, P, SDST[P]] {
  override def tableName: String = "\"SDS\""
  object entityValue extends entityValue
}
The Database class:
class TestDatabase(override val connector: CassandraConnection) extends Database[TestDatabase](connector) {
  object SDST extends SDST[SDSR] with connector.Connector {
    override def fromRow(r: Row): SDSR =
      SDSR(entityKey(r), entityValue(r))
  }
}
The create table query generated by phantom-dsl looks like this:
database.create()
c.o.phantom Executing query: CREATE TABLE IF NOT EXISTS test."SDS" (entity_key text,PRIMARY KEY (entity_key))
As you can see, the derived column is missing from the generated create table DDL.
Please let me know if I am missing something in the implementation.
The omitted class definitions, such as SDSR and TSP, are simple case classes.
Thanks
Phantom doesn't currently support table-to-table inheritance. The reasons behind that decision are complexities inherent in the macro API that we rely on to power the DSL.
This feature is planned for a future release, but until then we do not expect this to work, because the table helper macro simply does not read inherited columns.
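Until that lands, one workaround is to declare the column directly in the concrete table so the macro can see it. A sketch under that assumption; it mirrors the column from the question and is not an official phantom feature:

abstract class SDST[P <: TSP[String]] extends DST[String, P, SDST[P]] {
  override def tableName: String = "\"SDS\""

  // Redeclared here (instead of inherited) so the table helper macro
  // includes entity_value in the generated CREATE TABLE statement.
  object entityValue extends StringColumn {
    override lazy val name = "entity_value"
  }
}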

Slick schema extension with ActiveSlick

I have defined Person and Org (organization) schemas as:
trait Schema { this: Tables with TableQueries with Profile =>
  import jdbcDriver.simple._

  class PersonsTable(tag: Tag) extends EntityTable[Player](tag, "PERSON") {
    def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
    def dateCreated = column[Date]("DATE_CREATED")
    def firstName = column[String]("FIRST_NAME")
    def lastName = column[String]("LAST_NAME")
  }

  class OrgTable(tag: Tag) extends EntityTable[Player](tag, "ORG") {
    def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
    def dateCreated = column[Date]("DATE_CREATED")
    def name = column[String]("NAME")
  }
}
I soon realized that the schemas share common fields like id (the primary key) and dateCreated, so I grouped those fields together so that I don't have to repeat them:
trait Schema { this: Tables with TableQueries with Profile =>
  import jdbcDriver.simple._

  class PersonsTable(tag: Tag) extends EntityTable[Player](tag, "PERSON") with CommonSchema#CommonFields {
    def firstName = column[String]("FIRST_NAME")
    def lastName = column[String]("LAST_NAME")
  }

  class OrgTable(tag: Tag) extends EntityTable[Player](tag, "ORG") with CommonSchema#CommonFields {
    def name = column[String]("NAME")
  }
}

trait CommonSchema { this: Tables with TableQueries with Profile =>
  import jdbcDriver.simple._

  trait CommonFields { this: Table[_] =>
    def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
    def dateCreated = column[Date]("DATE_CREATED")
  }
}
I could not get this to compile. The compiler complained "CommonSchema is not a legal prefix for a constructor". How do I get it to work? I am using ActiveSlick with this setup, and I am using a generic jdbcDriver for these classes as a placeholder until the final moment when I run these classes with a specific Slick driver, e.g. scala.slick.driver.H2Driver.simple._. Thanks
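The error is about using the type projection CommonSchema#CommonFields as a parent: a projection is not a legal constructor prefix. One common way around it (a sketch, not verified against ActiveSlick specifically) is to have Schema extend CommonSchema so that CommonFields can be mixed in by its simple name:

trait CommonSchema { this: Tables with TableQueries with Profile =>
  import jdbcDriver.simple._

  trait CommonFields { this: Table[_] =>
    def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
    def dateCreated = column[Date]("DATE_CREATED")
  }
}

// Schema now inherits CommonFields, so the tables can mix it in directly.
trait Schema extends CommonSchema { this: Tables with TableQueries with Profile =>
  import jdbcDriver.simple._

  class PersonsTable(tag: Tag) extends EntityTable[Player](tag, "PERSON") with CommonFields {
    def firstName = column[String]("FIRST_NAME")
    def lastName = column[String]("LAST_NAME")
  }

  class OrgTable(tag: Tag) extends EntityTable[Player](tag, "ORG") with CommonFields {
    def name = column[String]("NAME")
  }
}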

Accessing a property with object."${property}"

I'm working on some dynamic filtering, and have this:
class Filterable {
    def statusId
    def secondaryFilterable
}
...
def filter = new Filter(validIds: [1], fieldName: 'statusId')
...
class Filter {
    def validIds = [] as Set
    def fieldName

    private boolean containsFieldValue(input) {
        def fieldValue = input."${fieldName}"
        return fieldValue in validIds
    }
}
This works just fine for one property. However, now I need to filter by the secondary filterable, with something like:
def filter = new Filter(validIds: [1], fieldName: 'secondaryFilterable.statusId')
This throws a groovy.lang.MissingPropertyException. Any advice?
Quoted property access treats the dot as part of the property name, not as a path separator.
A simple solution would be:
...
def fieldValue = fieldName.split(/\./).inject(input){ parent, property -> parent?."$property" }
...
This will recursively look up the field value using dot notation for child properties.
I put up a working example here on the Groovy web console.

groovy: Have a field name, need to set value and don't want to use switch

I have an object with several fields,
class TestObj {
    def field1
    def field2
}
I have a pair of values, v1 = "field1" and v2 = "value2". I would like to set v2 into the appropriate field based on the name of v1, but I'd prefer not to do it with a switch or if statements. I keep thinking there has to be a much "groovier" way of achieving the result than doing something like this:
def setValues(def fieldName, def fieldVal) {
    if (fieldName.equals("field1")) {
        field1 = fieldVal
    }
    if (fieldName.equals("field2")) {
        field2 = fieldVal
    }
}
I've tried doing this:
def setValues(def fieldName, def fieldVal) {
    this['${fieldName}'] = fieldVal
}
However, that fails, saying there's no property ${fieldName}.
Thanks.
You can use GStrings when you access a field dynamically, like this:
def obj = new TestObj()
def fieldToUpdate = 'field1'
obj."$fieldToUpdate" = 3
In Groovy you don't have to define a property in order to have a property. Use getProperty and setProperty, Groovy's property access hooks:
class TestObj {
    def properties = [:]

    def getProperty(String name) { properties[name] }
    void setProperty(String name, value) { properties[name] = value }

    void setValues(def fieldName, def fieldVal) { setProperty(fieldName, fieldVal) }
}
def test = new TestObj()
test.anyField = "anyValue"
println test.anyField
test.setValues("field1", "someValue")
println test.field1
test.setValues("field2", "anotherValue")
println test.field2
