Cassandra phantom-dsl: derived column is missing from the generated CREATE TABLE query

I have the following table definition:
import com.outworkers.phantom.builder.primitives.Primitive
import com.outworkers.phantom.dsl._
abstract class DST[V, P <: TSP[V], T <: DST[V, P, T]] extends Table[T, P] {

  object entityKey extends StringColumn with PartitionKey {
    override lazy val name = "entity_key"
  }

  abstract class entityValue(implicit ev: Primitive[V]) extends PrimitiveColumn[V] {
    override lazy val name = "entity_value"
  }
}
In the concrete table subclass:
abstract class SDST[P <: TSP[String]] extends DST[String, P, SDST[P]] {
  override def tableName: String = "\"SDS\""

  object entityValue extends entityValue
}
Database class:
class TestDatabase(override val connector: CassandraConnection) extends Database[TestDatabase](connector) {

  object SDST extends SDST[SDSR] with connector.Connector {
    override def fromRow(r: Row): SDSR =
      SDSR(entityKey(r), entityValue(r))
  }
}
The CREATE TABLE query generated by phantom-dsl looks like this:
database.create()
c.o.phantom Executing query: CREATE TABLE IF NOT EXISTS test."SDS" (entity_key text,PRIMARY KEY (entity_key))
As you can see, the derived column entity_value is missing from the CREATE TABLE DDL.
Please let me know if I am missing something in the implementation.
Omitted class definitions like SDSR and TSP are simple case classes.
Thanks

Phantom doesn't currently support table-to-table inheritance. The reason behind that decision is the complexity inherent in the macro API that we rely on to power the DSL.
This feature is planned for a future release, but until then we do not expect this to work: the table helper macro simply does not pick up columns that are inherited from another table class.
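In the meantime, a workaround consistent with that limitation is to declare every column directly in the concrete table class, so the macro can see all of them. Below is a minimal sketch, assuming the same SDSR case class and connector setup as in the question; the flattened SDSTable name is illustrative, not part of phantom's API:

import com.outworkers.phantom.dsl._

// Hypothetical flattened table: every column lives in the concrete class,
// so the table helper macro can enumerate all of them when generating the DDL.
abstract class SDSTable extends Table[SDSTable, SDSR] {
  override def tableName: String = "\"SDS\""

  object entityKey extends StringColumn with PartitionKey {
    override lazy val name = "entity_key"
  }

  object entityValue extends StringColumn {
    override lazy val name = "entity_value"
  }
}

With both columns visible in one class, database.create() should emit both entity_key and entity_value in the generated CREATE TABLE statement.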

How to define custom JS objects in ScalaJS

The Phaser game library has an API where you pass a custom object when starting a game scene (docs). This data object can be any JavaScript object at all and can be retrieved from within the scene via the scene's settings. My question is: how do I define this object in the Phaser facades in a generic way, and define a strongly typed version in my own code?
So far I have just referenced the object as a js.Object in the phaser APIs and cast it to my own type when the scene is created:
@js.native
trait ScenePlugin extends js.Object {
  def start(key: SceneKey, data: js.UndefOr[js.Object] = js.undefined): ScenePlugin = js.native
}

@js.annotation.ScalaJSDefined
class LevelConfig(
  val key: LevelKey,
  val loadingImage: Option[AssetKey] = None) extends js.Object
@ScalaJSDefined
class LoadScene extends Scene {
  private val loader = new SceneLoader(scene = this)
  private var levelConfig: LevelConfig = _

  override def preload(): Unit = {
    levelConfig = sys.settings.data.asInstanceOf[LevelConfig]
  }
  ...
}
This works, but I'm not happy with it because I have to cast the data object. Any mismatch in the actual object passed to ScenePlugin.start() will only surface as errors at runtime, so I may as well have just used vanilla JS. Also, my LevelConfig cannot be a case class: I get the compile error "Classes and objects extending js.Any may not have a case modifier", which I don't fully understand.
Has anyone dealt with this situation before and what did you do to get around it? I'm guessing the issue stems from the library which is being used so perhaps I need to create some kind of wrapper around Phaser's Scene class to deal with this? I'm quite new to ScalaJS and am looking to improve my understanding so any explanations with solutions would be much appreciated (and upvoted). Thanks very much!
I followed Justin du Coeur's comment suggestion of modifying the Phaser facades. I defined a non-native trait for a SceneData object and updated the native Scene facade with two type members that subclasses of Scene must override. Phaser scenes are abstract and intended to be overridden, so I think this works well:
@js.native
class Scene(config: SceneConfig) extends js.Object {
  type Key <: SceneKey
  type Data <: SceneData

  def scene: ScenePlugin = js.native
  def data: Data = js.native
  def preload(): Unit = js.native
  def create(): Unit = js.native
  def update(time: Double, delta: Double): Unit = js.native
}

object Scene {
  trait SceneKey { def value: String }
  implicit def keyAsString(id: SceneKey): String = id.value

  trait SceneData extends js.Object
}
@js.native
trait ScenePlugin extends js.Object {
  def start[S <: Scene](id: String, data: js.UndefOr[S#Data] = js.undefined): ScenePlugin = js.native
}
And here's a simplified example of a scene in my game:
class LoadScene extends Scene(LoadScene.Config) {
  override type Key = LoadId.type
  override type Data = GameAssets

  override def preload(): Unit = {
    createLoadBar()
    loadAssets(data)
  }

  private def createLoadBar(): Unit = { ... }
  private def loadAssets(config: GameAssets): Unit = { ... }

  override def create(): Unit = {
    scene.start[GameScene](GameId)
  }
}

object LoadScene {
  case object LoadId extends SceneKey { val value = "loading" }
  val Config: SceneConfig = ...
}
I quite like this because it's now impossible to start a scene with another scene's config type.
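To make that concrete, here is a hedged sketch of what the compiler now rejects; GameScene (with Data = GameAssets) and a separate MenuData type are assumed example definitions, not part of the answer above:

// Assuming GameScene declares `override type Data = GameAssets`:
scene.start[GameScene](GameId, gameAssets)  // compiles: GameAssets is a GameScene#Data
// scene.start[GameScene](GameId, menuData) // rejected at compile time: MenuData is not a GameScene#Data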

How to convert a Kotlin data class object to map?

Is there any easy way or any standard library method to convert a Kotlin data class object to a map/dictionary of its properties by property names? Can reflection be avoided?
I was using the Jackson method, but it turns out the performance of this is terrible on Android for the first serialization (github issue here). And it's dramatically worse for older Android versions (see benchmarks here).
But you can do this much faster with Gson. Conversion in both directions is shown here:
import com.google.gson.Gson
import com.google.gson.reflect.TypeToken

val gson = Gson()

// convert a data class to a map
fun <T> T.serializeToMap(): Map<String, Any> {
    return convert()
}

// convert a map to a data class
inline fun <reified T> Map<String, Any>.toDataClass(): T {
    return convert()
}

// convert an object of type I to type O
inline fun <I, reified O> I.convert(): O {
    val json = gson.toJson(this)
    return gson.fromJson(json, object : TypeToken<O>() {}.type)
}
// example usage
data class Person(val name: String, val age: Int)

fun main() {
    val person = Person("Tom Hanley", 99)

    val map = mapOf(
        "name" to "Tom Hanley",
        "age" to 99
    )

    val personAsMap: Map<String, Any> = person.serializeToMap()
    val mapAsPerson: Person = map.toDataClass()
}
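One caveat worth knowing about this JSON round-trip (my own observation, not from the answer above): the target type Map<String, Any> gives Gson no number type information, so by default numeric values come back as Double:

fun main() {
    val map = Person("Tom Hanley", 99).serializeToMap()
    println(map["age"]) // prints 99.0 -- Gson decodes untyped JSON numbers as Double
}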
This extension function uses reflection, but maybe it'll help someone like me coming across this in the future:
import kotlin.reflect.full.memberProperties

inline fun <reified T : Any> T.asMap(): Map<String, Any?> {
    val props = T::class.memberProperties.associateBy { it.name }
    return props.keys.associateWith { props[it]?.get(this) }
}
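A quick usage sketch, reusing the Person example class from above:

fun main() {
    val map = Person("Tom Hanley", 99).asMap()
    println(map) // e.g. {age=99, name=Tom Hanley} (property order is not guaranteed)
}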
I had the same use case today for testing, and ended up using the Jackson ObjectMapper to convert a Kotlin data class into a Map. Runtime performance is not a big concern in my case. I haven't checked in detail, but I believe it uses reflection under the hood; since that happens behind the scenes, it isn't a concern here.
For example:

import com.fasterxml.jackson.core.type.TypeReference
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper

val objectMapper = jacksonObjectMapper()

val dataclass = DataClass(p1 = 1, p2 = 2)
val dataclassAsMap = objectMapper.convertValue(dataclass,
    object : TypeReference<Map<String, Any>>() {})
// expect dataclassAsMap == mapOf("p1" to 1, "p2" to 2)
kotlinx.serialization has an experimental Properties format that makes it very simple to convert Kotlin classes into maps and vice versa:
import kotlinx.serialization.ExperimentalSerializationApi
import kotlinx.serialization.Serializable
import kotlinx.serialization.properties.Properties
import kotlinx.serialization.properties.encodeToMap
import kotlinx.serialization.properties.decodeFromMap

@ExperimentalSerializationApi
@Serializable
data class Category(
    val id: Int,
    val name: String,
    val icon: String,
    val numItems: Long
) {
    // the map representation of this class
    val asMap: Map<String, Any> by lazy { Properties.encodeToMap(this) }

    companion object {
        // factory to create Category from a map
        fun from(map: Map<String, Any>): Category =
            Properties.decodeFromMap(map)
    }
}
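A usage sketch (the example values are made up for illustration):

fun main() {
    val category = Category(id = 1, name = "Tools", icon = "tools.png", numItems = 42L)

    val map = category.asMap
    println(map) // e.g. {id=1, name=Tools, icon=tools.png, numItems=42}

    val roundTripped = Category.from(map)
    println(roundTripped == category) // true
}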
The closest you can get is with delegated properties stored in a map.
Example (from link):
class User(val map: Map<String, Any?>) {
    val name: String by map
    val age: Int by map
}
Using this with data classes may not work very well, however.
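A usage sketch of the delegated-property approach:

fun main() {
    val user = User(mapOf("name" to "John Doe", "age" to 25))
    println(user.name) // John Doe
    println(user.age)  // 25
}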
Kpropmap is a reflection-based library that attempts to make working with Kotlin data classes and Maps easier. It has the following capabilities that are relevant:
Can transform maps to and from data classes, though note that if all you need is converting from a data class to a Map, just use reflection directly as per @KenFehling's answer.
data class Foo(val a: Int, val b: Int)

// Data class to Map
val propMap = propMapOf(foo)

// Map to data class
val foo1 = propMap.deserialize<Foo>()
Can read and write Map data in a type-safe way by using the data class KProperty's for type information.
Given a data class and a Map, can do other neat things like detect changed values and extraneous Map keys that don't have corresponding data class properties.
Represent "partial" data classes (kind of like lenses). For example, say your backend model contains a Foo with 3 required immutable properties represented as vals. However, you want to provide an API to patch Foo instances. As it is a patch, the API consumer will only send the updated properties. The REST API layer for this obviously cannot deserialize directly to the Foo data class, but it can accept the patch as a Map. Use kpropmap to validate that the Map has the correct types, and apply the changes from the Map to a copy of the model instance:
data class Foo(val a: Int, val b: Int, val c: Int)

val f = Foo(1, 2, 3)
val p = propMapOf("b" to 5)
val f1 = p.applyProps(f) // f1 = Foo(1, 5, 3)
Disclaimer: I am the author.

groovy immutable object with parent class

I have two immutable Groovy classes that share a few values, which I'm trying to abstract into a parent class. However, when I create the following, the second assertion always fails. Although everything compiles correctly and no error is thrown at runtime, when I assign the parent property in the constructor it is never set, resulting in a null value. I haven't found any documentation that forbids this, but I'm wondering: is this even possible? I've tried a number of combinations of annotations and class types (e.g. removing abstract from the parent), but nothing seems to work short of removing the @Immutable annotation altogether.
abstract class TestParent {
    String parentProperty1
}

@ToString(includeNames = true)
@Immutable
class TestChild extends TestParent {
    String childProperty1
    String childProperty2
}

class TestCase {

    @Test
    void TestOne() {
        TestChild testChild = new TestChild(
            childProperty1: "childOne",
            childProperty2: "childTwo",
            parentProperty1: "parentOne"
        )

        assert testChild
        assert testChild.parentProperty1
    }
}
Based on the code for the ImmutableASTTransformation, the map-arg constructor added by the createConstructorMapCommon method does not include a call to super(args) in the method body, which means that immutable classes are self-contained by default.
If you still want this, you need to use composition instead of inheritance. Here is an example of how you can do it:
import groovy.transform.*

@TupleConstructor
class A {
    String a
}

@Immutable(knownImmutableClasses = [A])
class B {
    @Delegate A base
    String b
}

def b = new B(base: new A("a"), b: "b")
assert b.a
I hope this will help :)
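To check that the composed class really stays immutable, a small sketch (b2 is just an illustrative variable name):

// Reassigning a property of an @Immutable class fails at runtime
def b2 = new B(base: new A("a"), b: "b")
try {
    b2.b = "changed"
    assert false, "should not be reachable"
} catch (ReadOnlyPropertyException expected) {
    // @Immutable generates read-only properties
}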

Groovy Compile time AST transformation: Assignment to a field

I'm currently trying to implement some Groovy compile time AST transformations, but I ran into trouble:
How do I specify an AST transformation for an assignment statement to a field? That is, the AST transformation should transform the following code:
class MyClass {
    @MyTransformation
    String myField

    public void init() {
    }
}
into something like
class MyClass {
    String myField

    public void init() {
        this.myField = "initialized!"
    }
}
I tried it with this AST builder invocation:
def ast = new AstBuilder().buildFromSpec {
    expression {
        declaration {
            variable "myField"
            token "="
            constant "initialized!"
        }
    }
}
But after inserting the resulting statement into the init method of the declaring class, it instead produced a local variable declaration, as in
java.lang.Object myField = "initialized!"
I looked through the examples in the AstBuilder test case, but they only cover field declarations in the class body, not assignments to fields. My own attempts using fieldNode all resulted in compiler errors. I set the compile phase to INSTRUCTION_SELECTION; I think this should be fine.
How do I achieve this? A solution based on the AstBuilder#buildFromSpec method is preferred, but any help would be highly appreciated.
I usually recommend not using the AST builder. It's good for prototyping, but you don't really control what it generates. In particular, here, it's not capable of handling the fact that the variable expression you create should reference the field node. The AST builder is very nice for learning about the AST, but it shouldn't be used in production code IMHO.
Here is a self-contained example that demonstrates how you can achieve what you want. The code inside @ASTTest would correspond to your transform code:
import groovy.transform.ASTTest
import org.codehaus.groovy.ast.expr.BinaryExpression
import org.codehaus.groovy.ast.expr.VariableExpression
import org.codehaus.groovy.ast.expr.ConstantExpression
import org.codehaus.groovy.ast.stmt.ExpressionStatement
import org.codehaus.groovy.syntax.Token
import org.codehaus.groovy.syntax.Types

class MyClass {
    String myField

    @ASTTest(phase = SEMANTIC_ANALYSIS, value = {
        def classNode = node.declaringClass
        def field = classNode.getDeclaredField('myField')
        def assignment = new BinaryExpression(
            new VariableExpression(field),
            Token.newSymbol(Types.EQUAL, 0, 0),
            new ConstantExpression('initialized!')
        )
        node.code.addStatement(new ExpressionStatement(assignment))
    })
    public void init() {
    }
}

def c = new MyClass()
c.init()
println c.myField
Hope this helps!
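For completeness, here is a hedged sketch of what the same logic might look like inside a real local AST transformation; the MyTransformation class name is illustrative, and you would still need to declare the @MyTransformation annotation with @GroovyASTTransformationClass pointing at this class:

import org.codehaus.groovy.ast.*
import org.codehaus.groovy.ast.expr.*
import org.codehaus.groovy.ast.stmt.*
import org.codehaus.groovy.control.CompilePhase
import org.codehaus.groovy.control.SourceUnit
import org.codehaus.groovy.syntax.Token
import org.codehaus.groovy.syntax.Types
import org.codehaus.groovy.transform.ASTTransformation
import org.codehaus.groovy.transform.GroovyASTTransformation

@GroovyASTTransformation(phase = CompilePhase.SEMANTIC_ANALYSIS)
class MyTransformation implements ASTTransformation {

    @Override
    void visit(ASTNode[] nodes, SourceUnit source) {
        // nodes[0] is the annotation node, nodes[1] the annotated field
        def field = (FieldNode) nodes[1]
        def init = field.declaringClass.getDeclaredMethod('init', Parameter.EMPTY_ARRAY)

        // this.myField = "initialized!" -- building the VariableExpression from
        // the FieldNode makes the reference resolve to the field, not a local
        def assignment = new BinaryExpression(
            new VariableExpression(field),
            Token.newSymbol(Types.EQUAL, -1, -1),
            new ConstantExpression('initialized!')
        )
        ((BlockStatement) init.code).addStatement(new ExpressionStatement(assignment))
    }
}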

Populating field from enum table

I have the following tables:
Entity
id, name, categoryid
21, "Blah", 1

EntityCategory (enum table)
id, name
1, "New Blahs"
I have a FK relationship between Entities->categoryid and EntityCategories->id
I have generated SubSonic classes for both, as well as a corresponding model object for Entity:
class Entity { ID, Name, CategoryName }
I am trying to return the Model.Entity type with the category name filled in, i.e.:
public Entity GetEntityByName(string name){
    return new
        Select(
            Entity.IdColumn,
            Entity.NameColumn,
            EntityCategory.NameColumn)
        .From(Entity.Schema)
        .InnerJoin(Tables.EntityCategory)
        .Where(Entity.NameColumn).IsEqualTo(name)
        .ExecuteSingle<Model.Entity>();
}
Needless to say, this is not working: I actually get a Model.Entity with its Name set to the EntityCategory name.
If you use SubSonic 3.0 you can do this with projection:
var result = from e in db.Entities
             where e.ID == 1
             select new Entity {
                 ID = e.ID,
                 Name = e.Name,
                 CategoryName = (CategoryName)e.CategoryID
             };
With SubSonic 2.x, I'd say make it easy on yourself and extend the partial class with a read-only enum property:
public partial class Entity {
    public CategoryName CategoryName {
        get { return (CategoryName)this.CategoryID; }
    }
}
