How to convert a Kotlin data class object to map?

Is there any easy way or any standard library method to convert a Kotlin data class object to a map/dictionary of its properties by property names? Can reflection be avoided?

I was using the Jackson method, but it turns out its performance is terrible on Android for the first serialization (github issue here), and it's dramatically worse on older Android versions (see benchmarks here).
But you can do this much faster with Gson. Conversion in both directions is shown here:
import com.google.gson.Gson
import com.google.gson.reflect.TypeToken

val gson = Gson()

// convert a data class to a map
fun <T> T.serializeToMap(): Map<String, Any> {
    return convert()
}

// convert a map to a data class
inline fun <reified T> Map<String, Any>.toDataClass(): T {
    return convert()
}

// convert an object of type I to type O
inline fun <I, reified O> I.convert(): O {
    val json = gson.toJson(this)
    return gson.fromJson(json, object : TypeToken<O>() {}.type)
}

// example usage
data class Person(val name: String, val age: Int)

fun main() {
    val person = Person("Tom Hanley", 99)
    val map = mapOf(
        "name" to "Tom Hanley",
        "age" to 99
    )

    val personAsMap: Map<String, Any> = person.serializeToMap()
    val mapAsPerson: Person = map.toDataClass()
}

This extension function uses reflection, but maybe it'll help someone like me coming across this in the future:
import kotlin.reflect.full.memberProperties

inline fun <reified T : Any> T.asMap(): Map<String, Any?> {
    val props = T::class.memberProperties.associateBy { it.name }
    return props.keys.associateWith { props[it]?.get(this) }
}

I had the same use case today for testing and ended up using the Jackson object mapper to convert a Kotlin data class into a Map. Runtime performance is not a big concern in my case. I haven't checked in detail, but I believe it uses reflection under the hood; that's not a concern for me, since it happens behind the scenes.
For example:
import com.fasterxml.jackson.core.type.TypeReference

val dataclass = DataClass(p1 = 1, p2 = 2)
val dataclassAsMap = objectMapper.convertValue(dataclass, object : TypeReference<Map<String, Any>>() {})
// expect dataclassAsMap == mapOf("p1" to 1, "p2" to 2)
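The reverse direction is the same call; a minimal sketch, assuming objectMapper has jackson-module-kotlin registered so the data class constructor can be invoked:
val roundTripped = objectMapper.convertValue(dataclassAsMap, DataClass::class.java)
// expect roundTripped == dataclass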

kotlinx.serialization has an experimental Properties format that makes it very simple to convert Kotlin classes into maps and vice versa:
import kotlinx.serialization.ExperimentalSerializationApi
import kotlinx.serialization.Serializable
import kotlinx.serialization.properties.Properties
import kotlinx.serialization.properties.encodeToMap
import kotlinx.serialization.properties.decodeFromMap

@ExperimentalSerializationApi
@Serializable
data class Category constructor(
    val id: Int,
    val name: String,
    val icon: String,
    val numItems: Long
) {
    // the map representation of this class
    val asMap: Map<String, Any> by lazy { Properties.encodeToMap(this) }

    companion object {
        // factory to create Category from a map
        fun from(map: Map<String, Any>): Category =
            Properties.decodeFromMap(map)
    }
}
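A quick usage sketch for the class above (the values are made up):
val category = Category(id = 1, name = "Fruit", icon = "fruit.png", numItems = 42L)
val categoryMap = category.asMap       // keys: id, name, icon, numItems
val restored = Category.from(categoryMap) // restored == category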

The closest you can get is with delegated properties stored in a map.
Example (from link):
class User(val map: Map<String, Any?>) {
    val name: String by map
    val age: Int by map
}
Using this with data classes may not work very well, however.
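A sketch of the limitation, using a hypothetical UserData class (not from the linked docs): the delegated properties cannot be primary-constructor parameters, so copy() and toString() only see the backing map, and a missing key fails only when the property is first read.
data class UserData(val map: Map<String, Any?>) {
    val name: String by map // not a constructor property: invisible to copy()/toString()
    val age: Int by map
}

val u = UserData(mapOf("name" to "Tom", "age" to 99))
println(u.name) // Tom
println(u)      // UserData(map={name=Tom, age=99}) -- only the map is printed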

Kpropmap is a reflection-based library that attempts to make working with Kotlin data classes and Maps easier. It has the following capabilities that are relevant:
Can transform maps to and from data classes, though note if all you need is converting from a data class to a Map, just use reflection directly as per @KenFehling's answer.
data class Foo(val a: Int, val b: Int)
// Data class to Map
val propMap = propMapOf(foo)
// Map to data class
val foo1 = propMap.deserialize<Foo>()
Can read and write Map data in a type-safe way by using the data class's KProperty instances for type information.
Given a data class and a Map, can do other neat things like detect changed values and extraneous Map keys that don't have corresponding data class properties.
Represent "partial" data classes (kind of like lenses). For example, say your backend model contains a Foo with 3 required immutable properties represented as vals. However, you want to provide an API to patch Foo instances. As it is a patch, the API consumer will only send the updated properties. The REST API layer for this obviously cannot deserialize directly to the Foo data class, but it can accept the patch as a Map. Use kpropmap to validate that the Map has the correct types, and apply the changes from the Map to a copy of the model instance:
data class Foo(val a: Int, val b: Int, val c: Int)
val f = Foo(1, 2, 3)
val p = propMapOf("b" to 5)
val f1 = p.applyProps(f) // f1 = Foo(1, 5, 3)
Disclaimer: I am the author.

Related

Empty set after collectAsList, even though it is not empty inside the transformation operator

I am trying to figure out if I can work with Kotlin and Spark,
and use the former's data classes instead of Scala's case classes.
I have the following data class:
data class Transaction(var context: String = "", var epoch: Long = -1L, var items: HashSet<String> = HashSet()) :
    Serializable {
    companion object {
        @JvmStatic
        private val serialVersionUID = 1L
    }
}
And the relevant part of the main routine looks like this:
val transactionEncoder = Encoders.bean(Transaction::class.java)
val transactions = inputDataset
    .groupByKey(KeyExtractor(), KeyExtractor.getKeyEncoder())
    .mapGroups(TransactionCreator(), transactionEncoder)
    .collectAsList()

transactions.forEach { println("collected Transaction=$it") }
With TransactionCreator defined as:
class TransactionCreator : MapGroupsFunction<Tuple2<String, Timestamp>, Row, Transaction> {
    companion object {
        @JvmStatic
        private val serialVersionUID = 1L
    }

    override fun call(key: Tuple2<String, Timestamp>, values: MutableIterator<Row>): Transaction {
        val seq = generateSequence { if (values.hasNext()) values.next().getString(2) else null }
        val items = seq.toCollection(HashSet())
        return Transaction(key._1, key._2.time, items).also { println("inside call Transaction=$it") }
    }
}
However, I think I'm running into some sort of serialization problem,
because the set ends up empty after collection.
I see the following output:
inside call Transaction=Transaction(context=context1, epoch=1000, items=[c])
inside call Transaction=Transaction(context=context1, epoch=0, items=[a, b])
collected Transaction=Transaction(context=context1, epoch=0, items=[])
collected Transaction=Transaction(context=context1, epoch=1000, items=[])
I've tried a custom KryoRegistrator to see if it was a problem with Kotlin's HashSet:
class MyRegistrator : KryoRegistrator {
    override fun registerClasses(kryo: Kryo) {
        kryo.register(HashSet::class.java, JavaSerializer()) // kotlin's HashSet
    }
}
But it doesn't seem to help.
Any other ideas?
Full code here.
It does seem to be a serialization issue.
The documentation of Encoders.bean states (Spark v2.4.0):
collection types: only array and java.util.List currently, map support is in progress
Porting the Transaction data class to Java and changing items to a java.util.List seems to help.
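If porting to Java is not desirable, an untested alternative sketch is to keep the Kotlin data class but declare items as a MutableList, which is java.util.List at the JVM level and therefore covered by Encoders.bean:
data class Transaction(
    var context: String = "",
    var epoch: Long = -1L,
    var items: MutableList<String> = ArrayList() // java.util.List on the JVM
) : java.io.Serializable
TransactionCreator would then need to collect into a list instead of a HashSet, e.g. seq.toCollection(ArrayList()).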

How to define custom JS objects in ScalaJS

The phaser game library has an API where you pass a custom object when starting a game scene (docs). This data object can be any javascript object at all and can be retrieved from within the scene from the scene's settings. My question is how do I define this object in the phaser facades in a generic way and define a strongly typed version in my own code?
So far I have just referenced the object as a js.Object in the phaser APIs and cast it to my own type when the scene is created:
@js.native
trait ScenePlugin extends js.Object {
  def start(key: SceneKey, data: js.UndefOr[js.Object] = js.undefined): ScenePlugin
}

@js.annotation.ScalaJSDefined
class LevelConfig(
    val key: LevelKey,
    val loadingImage: Option[AssetKey] = None) extends js.Object

@ScalaJSDefined
class LoadScene extends Scene {
  private val loader = new SceneLoader(scene = this)
  private var levelConfig: LevelConfig = _

  override def preload(): Unit = {
    levelConfig = sys.settings.data.asInstanceOf[LevelConfig]
  }

  ...
}
This works but I'm not happy with it because I have to cast the data object. Any errors with the actual object that gets passed to the ScenePlugin.start() will cause errors during runtime and I may as well have just used vanilla JS. Also, my LevelConfig cannot be a case class as I get the compile error Classes and objects extending js.Any may not have a case modifier which I don't fully understand.
Has anyone dealt with this situation before and what did you do to get around it? I'm guessing the issue stems from the library which is being used so perhaps I need to create some kind of wrapper around Phaser's Scene class to deal with this? I'm quite new to ScalaJS and am looking to improve my understanding so any explanations with solutions would be much appreciated (and upvoted). Thanks very much!
I followed Justin du Coeur's comment suggestion of modifying the Phaser facades. I defined a non-native trait for a SceneData object and updated the native Scene facade to have two types which subclasses of Scene must override. Phaser scenes are abstract and intended to be overridden, so I think this works well:
class Scene(config: SceneConfig) extends js.Object {
  type Key <: SceneKey
  type Data <: SceneData

  def scene: ScenePlugin = js.native
  def data: Data = js.native
  def preload(): Unit = js.native
  def create(): Unit = js.native
  def update(time: Double, delta: Double): Unit = js.native
}

object Scene {
  trait SceneKey { def value: String }
  implicit def keyAsString(id: SceneKey): String = id.value

  trait SceneData extends js.Object
}

@js.native
trait ScenePlugin extends js.Object {
  def start[S <: Scene](id: String, data: js.UndefOr[S#Data] = js.undefined): ScenePlugin = js.native
}
And here's a simplified example of a scene in my game:
class LoadScene extends Scene(LoadScene.Config) {
  override type Key = LoadId.type
  override type Data = GameAssets

  override def preload(): Unit = {
    createLoadBar()
    loadAssets(data)
  }

  private def createLoadBar(): Unit = { ... }

  private def loadAssets(config: GameAssets): Unit = { ... }

  override def create(): Unit = {
    scene.start[GameScene](GameId)
  }
}

object LoadScene {
  case object LoadId extends SceneKey { val value = "loading" }

  val Config: SceneConfig = ...
}
I quite like this because it's now impossible to start a scene with another scene's config type.

Cassandra phantom-dsl derived column is missing in create database generated queries

I have the following table definition:
import com.outworkers.phantom.builder.primitives.Primitive
import com.outworkers.phantom.dsl._

abstract class DST[V, P <: TSP[V], T <: DST[V, P, T]] extends Table[T, P] {

  object entityKey extends StringColumn with PartitionKey {
    override lazy val name = "entity_key"
  }

  abstract class entityValue(implicit ev: Primitive[V]) extends PrimitiveColumn[V] {
    override lazy val name = "entity_value"
  }
}
In the concrete table subclass:
abstract class SDST[P <: TSP[String]] extends DST[String, P, SDST[P]] {
  override def tableName: String = "\"SDS\""

  object entityValue extends entityValue
}
Database class
class TestDatabase(override val connector: CassandraConnection) extends Database[TestDatabase](connector) {
  object SDST extends SDST[SDSR] with connector.Connector {
    override def fromRow(r: Row): SDSR =
      SDSR(entityKey(r), entityValue(r))
  }
}
The CREATE TABLE query generated by phantom-dsl looks like this:
database.create()
c.o.phantom Executing query: CREATE TABLE IF NOT EXISTS test."SDS" (entity_key text,PRIMARY KEY (entity_key))
As you can see, the derived column is missing from the CREATE TABLE DDL.
Please let me know if I am missing something in the implementation.
The omitted class definitions (SDSR and TSP) are simple case classes.
Thanks
Phantom doesn't currently support table-to-table inheritance. The reason behind that decision is the complexity inherent in the macro API that we rely on to power the DSL.
This feature is planned for a future release, but until then we do not expect this to work, as the table helper macro simply does not read inherited columns.

NSArrayController NSTableView Core Data Binding integers

I have an NSTableView bound to an NSArrayController, which in turn is bound to Core Data. The table displays integer values from core data nicely, but if I edit the numbers in the table I get an error:
Unacceptable type of value for attribute: property = "armorclass"; desired type = NSNumber; given type = NSTaggedPointerString; value = 10.
Any suggestions of how I can convert this pointer string back to an Int16 before the Array Controller tries to save it back to Core Data?
I wrote the following ValueTransformer but it's not working properly. I always get the error: Cannot find value transformer with name StringIntegerValueTransformer
class StringIntegerValueTransformer: ValueTransformer {
    override class func transformedValueClass() -> AnyClass { // What do I transform
        return String.self as! AnyClass
    }

    override class func allowsReverseTransformation() -> Bool { // Can I transform back?
        return false
    }

    override func transformedValue(_ value: Any?) -> Any? {
        if let val = value {
            return String(describing: val)
        }
        return "nil"
    }

    override func reverseTransformedValue(_ value: Any?) -> Any? { // Revert transformation
        if let val = value {
            return val as? Int16
        }
        return nil
    }
}
To register the value transformer, override init in the AppDelegate:
override init() {
    let stringIntegerValueTransformer = StringIntegerValueTransformer()
    ValueTransformer.setValueTransformer(stringIntegerValueTransformer,
                                         forName: NSValueTransformerName(rawValue: "StringIntegerValueTransformer"))
    super.init()
}
And consider this note from the documentation
Your NSValueTransformer subclasses are not automatically listed in the Interface Builder bindings inspector. When inspecting a binding you can enter the name that the value transformer is registered with, but the functionality will not be present in Interface Builder’s test mode. When your application is compiled and run the transformer will be used
Add a number formatter to the text field.

Why I cannot refer to a nested object from val or typealias referring to an object?

Consider the following code:
object SomeObjectA {
    object SomeObjectB {
        val a = "test"
    }
}

val X = SomeObjectA
typealias Y = SomeObjectA

SomeObjectA.SomeObjectB // works
X.SomeObjectB           // error
Y.SomeObjectB           // error
I cannot refer to a nested object (in an outer object) using val or typealias which are referring to the outer object. Why?
The compiler error actually comes from Java; the Kotlin objects are compiled to Java classes like this:
public final class SomeObjectA {
    private SomeObjectA() {/**/}
    public static final SomeObjectA INSTANCE = new SomeObjectA();

    public static final class SomeObjectB {
        private SomeObjectB() {/**/}
        public static final SomeObjectB INSTANCE = new SomeObjectB();
    }
}
SomeObjectA.SomeObjectB is compiled to the following Java code:
SomeObjectA.SomeObjectB.INSTANCE;
and SomeObjectA alone is compiled to:
SomeObjectA.INSTANCE
Kotlin compiles to the JVM, and Java doesn't allow accessing nested classes via an instance reference; if you do, the compiler reports the error "Error: java: unexpected type required: class,package found: variable", for example:
SomeObjectA a = SomeObjectA.INSTANCE;
SomeObjectB b = a.SomeObjectB.INSTANCE; // error
//                ^--- compiler doesn't know where to go: package/class or variable?
For the code below, the Kotlin compiler surfaces that Java error as: "Error: Kotlin: Nested object 'SomeObjectB' accessed via instance reference".
val a = SomeObjectA
val b = a.SomeObjectB
//        ^--- Error
Type aliases do not introduce new types. They are equivalent to the corresponding underlying types.
so the two declarations below are equivalent:
val a = SomeObjectA
typealias a2 = SomeObjectA
So, to avoid the typealias causing the same unnecessary compiler error, Kotlin doesn't allow accessing nested objects through a typealias either.
What you described happens because SomeObjectA in your example is simultaneously the name of an object and the name of its class.
So to access SomeObjectB, you need to use the <classname>.<classname> syntax. That is why X.SomeObjectB doesn't compile (<object>.<classname> is unsupported).
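As a side note (a sketch, not from the original answers), binding the nested object itself does compile, because the right-hand side stays in <classname>.<classname> form:
val Z = SomeObjectA.SomeObjectB // resolves the nested object directly
fun demo() = println(Z.a)       // prints "test"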
P.S. This doesn't really explain your second problem with typealias. It looks like a bug to me, but I'm not sure.
