I am trying to navigate from one activity to another and pass an object from an entity table I have, using Serializable:
@Entity(tableName = "client")
data class ClientEntity(........) : Serializable {
    @PrimaryKey(autoGenerate = true)
    @ColumnInfo(name = "id")
    var id: Int? = null
}
The navigation happens when the button is clicked.
The Intent implementation inside the button's onClickListener:
val clientEntity = ClientEntity(......)
val intent = Intent(this, ProfileActivity::class.java)
intent.putExtra("clientEntity", clientEntity)
startActivity(intent)
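For reference, the extra would be read back on the receiving side roughly like this (a minimal sketch; the key matches the putExtra call above, but ProfileActivity's real code isn't shown in the question):
class ProfileActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Read the Serializable extra back under the same key used by putExtra.
        val clientEntity = intent.getSerializableExtra("clientEntity") as? ClientEntity
    }
}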
So when I try to navigate, I get this error:
java.lang.RuntimeException: Failure from system
I searched for this error, and all the answers do the same thing I already do, so I don't know the cause of the error.
My preview of a composable is not working and displays this render problem:
java.lang.ClassCastException: class com.android.layoutlib.bridge.android.BridgeContext cannot be cast to class android.app.Application
My preview code :
@Preview
@Composable
fun MainScreenPrev() {
    val context = LocalContext.current
    val application = context.applicationContext as Application
    val navController = rememberNavController()
    val myViewModel = MyViewModel(application)
    MainScreen(navController = navController, myViewModel = myViewModel)
}
MyViewModel extends AndroidViewModel, which takes an Application as a constructor parameter. Most examples online show this as the proper way to get the Application inside a composable. Is there a better practice to get this preview to work?
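For context, in a preview LocalContext.current is layoutlib's BridgeContext, and its applicationContext is not an android.app.Application, which is why the cast throws. One hedged workaround (a sketch, not necessarily best practice; it assumes the preview can show a stub instead) is to branch on LocalInspectionMode from androidx.compose.ui.platform, which is true while a preview is being rendered:
@Preview
@Composable
fun MainScreenPrev() {
    val navController = rememberNavController()
    if (LocalInspectionMode.current) {
        // Rendering inside the IDE preview: skip the Application cast and
        // show stub content instead (stub left to the reader).
    } else {
        val application = LocalContext.current.applicationContext as Application
        MainScreen(navController = navController, myViewModel = MyViewModel(application))
    }
}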
I'm trying to instantiate a Room database in my main activity in Android Studio, following codelabs and tutorials, but my app always crashes. Here's a part of the code:
My database (AppDatabase.kt):
@Database(entities = [Device::class], version = 1, exportSchema = false)
abstract class AppDatabase : RoomDatabase() {
    abstract fun deviceDao(): DeviceDao

    companion object {
        @Volatile
        private var INSTANCE: AppDatabase? = null

        fun getDatabase(context: Context): AppDatabase {
            return INSTANCE ?: synchronized(this) {
                val instance = Room.databaseBuilder(
                    context.applicationContext,
                    AppDatabase::class.java,
                    "item_database"
                )
                    .fallbackToDestructiveMigration()
                    .build() // <---- The crash occurs here
                INSTANCE = instance
                return instance
            }
        }
    }
}
And here's the activity from which I'm trying to instantiate it:
class NavigationActivity : AppCompatActivity() {
    private lateinit var binding: ActivityNavigationBinding
    private val db by lazy { AppDatabase.getDatabase(this) }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = ActivityNavigationBinding.inflate(layoutInflater)
        setContentView(binding.root)
        Log.d("instantiation", "$db") // <----- Called from here
        val navView: BottomNavigationView = binding.navView
        val navController = findNavController(R.id.nav_host_fragment_activity_navigation)
        val appBarConfiguration = AppBarConfiguration(
            setOf(
                R.id.navigation_devices, R.id.navigation_logs, R.id.navigation_settings
            )
        )
        setupActionBarWithNavController(navController, appBarConfiguration)
        navView.setupWithNavController(navController)
    }
}
Finally, here's the error message, which doesn't help me much:
Caused by: java.lang.NullPointerException: Attempt to invoke virtual method 'java.lang.String java.lang.Package.getName()' on a null object reference
at androidx.room.Room.getGeneratedImplementation(Room.java:82)
at androidx.room.RoomDatabase$Builder.build(RoomDatabase.java:1486)
at AppDatabase$Companion.getDatabase(AppDatabase.kt:24)
I tried a lot of things, including ViewModel, Repository and more, but got the crash systematically, at the same point.
Here's also the part of my build.gradle file where I import Room; maybe I'm wrong about some version or something...
plugins {
    id 'com.android.application'
    id 'org.jetbrains.kotlin.android'
    id 'kotlin-android'
    id 'kotlin-kapt'
}
[...]
def roomVersion = "2.4.2"
implementation("androidx.room:room-runtime:$roomVersion")
kapt("androidx.room:room-compiler:$roomVersion")
implementation "androidx.room:room-ktx:$roomVersion"
Make sure the package declaration at the top of the file is present, for example:
package com.macrosystems.clean.ui.core.view
import android.content.Context
import android.content.Intent
import android.graphics.Color
etc...
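Applied to the question's code, the fix is to make sure AppDatabase.kt begins with a package line; without one, the class's Package is null, and Room.getGeneratedImplementation() calls Package.getName() on it, which is exactly the NullPointerException in the stack trace. A sketch (the package name below is illustrative):
package com.example.devices.data // hypothetical; use your project's package

import android.content.Context
import androidx.room.Database
import androidx.room.Room
import androidx.room.RoomDatabase

@Database(entities = [Device::class], version = 1, exportSchema = false)
abstract class AppDatabase : RoomDatabase() {
    // ... same body as in the question
}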
Code A is from CameraXBasic.
I can't completely understand the code private val volumeDownReceiver = object : BroadcastReceiver().
I thought Code B would work as well, but in fact it fails.
What does the keyword object mean in Kotlin?
Code A
private val volumeDownReceiver = object : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        ...
    }
}
Code B
private val volumeDownReceiver = BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        ...
    }
}
In Code A, val volumeDownReceiver = object : BroadcastReceiver() creates an object of an anonymous class that inherits from BroadcastReceiver.
In Code B, val volumeDownReceiver = BroadcastReceiver() tries to instantiate a new instance of an abstract class, and that's why it fails.
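To make the inheritance explicit, Code A is roughly equivalent to declaring and instantiating a named subclass; the object expression just skips giving the subclass a name (the class name below is made up for illustration):
class VolumeDownReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        // handle the broadcast, as in Code A's onReceive body
    }
}

private val volumeDownReceiver = VolumeDownReceiver()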
Edit: link to docs: https://kotlinlang.org/docs/reference/object-declarations.html#object-expressions
I am trying to figure out if I can work with Kotlin and Spark, and use the former's data classes instead of Scala's case classes.
I have the following data class:
data class Transaction(var context: String = "", var epoch: Long = -1L, var items: HashSet<String> = HashSet()) :
    Serializable {
    companion object {
        @JvmStatic
        private val serialVersionUID = 1L
    }
}
And the relevant part of the main routine looks like this:
val transactionEncoder = Encoders.bean(Transaction::class.java)
val transactions = inputDataset
    .groupByKey(KeyExtractor(), KeyExtractor.getKeyEncoder())
    .mapGroups(TransactionCreator(), transactionEncoder)
    .collectAsList()
transactions.forEach { println("collected Transaction=$it") }
With TransactionCreator defined as:
class TransactionCreator : MapGroupsFunction<Tuple2<String, Timestamp>, Row, Transaction> {
    companion object {
        @JvmStatic
        private val serialVersionUID = 1L
    }

    override fun call(key: Tuple2<String, Timestamp>, values: MutableIterator<Row>): Transaction {
        val seq = generateSequence { if (values.hasNext()) values.next().getString(2) else null }
        val items = seq.toCollection(HashSet())
        return Transaction(key._1, key._2.time, items).also { println("inside call Transaction=$it") }
    }
}
However, I think I'm running into some sort of serialization problem, because the set ends up empty after collection.
I see the following output:
inside call Transaction=Transaction(context=context1, epoch=1000, items=[c])
inside call Transaction=Transaction(context=context1, epoch=0, items=[a, b])
collected Transaction=Transaction(context=context1, epoch=0, items=[])
collected Transaction=Transaction(context=context1, epoch=1000, items=[])
I've tried a custom KryoRegistrator to see if it was a problem with Kotlin's HashSet:
class MyRegistrator : KryoRegistrator {
    override fun registerClasses(kryo: Kryo) {
        kryo.register(HashSet::class.java, JavaSerializer()) // Kotlin's HashSet
    }
}
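(For reference, a registrator like this is wired in through standard Spark configuration keys; a sketch, assuming the session is built in code:)
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
    // Kryo must be the active serializer for the registrator to be consulted.
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .config("spark.kryo.registrator", MyRegistrator::class.java.name)
    .getOrCreate()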
But it doesn't seem to help.
Any other ideas?
Full code here.
It does seem to be a serialization issue.
The documentation of Encoders.bean states (Spark v2.4.0):
collection types: only array and java.util.List currently, map support is in progress
Porting the Transaction data class to Java and changing items to a java.util.List seems to help.
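For illustration, the reworked bean looks roughly like this in Kotlin (the answer actually ported it to Java, so treating the Kotlin form as equivalent is an assumption here); Kotlin's MutableList compiles down to java.util.List, which Encoders.bean supports:
data class Transaction(
    var context: String = "",
    var epoch: Long = -1L,
    var items: MutableList<String> = ArrayList() // java.util.List at the bytecode level
) : Serializable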
When I applied ParDo.of(new ParDoFn()) to the PCollection named textInput, the program threw this exception. But the program runs normally when I delete .apply(ParDo.of(new ParDoFn())).
// SparkRunner
private static void testHadoop(Pipeline pipeline) {
    Class<? extends FileInputFormat<LongWritable, Text>> inputFormatClass =
        (Class<? extends FileInputFormat<LongWritable, Text>>)
            (Class<?>) TextInputFormat.class;
    @SuppressWarnings("unchecked") // hdfs://localhost:9000
    HadoopIO.Read.Bound<LongWritable, Text> readPTransform_1 = HadoopIO.Read.from(
        "hdfs://localhost:9000/tmp/kinglear.txt",
        inputFormatClass,
        LongWritable.class,
        Text.class);
    PCollection<KV<LongWritable, Text>> textInput = pipeline.apply(readPTransform_1)
        .setCoder(KvCoder.of(WritableCoder.of(LongWritable.class), WritableCoder.of(Text.class)));

    // OutputFormat
    @SuppressWarnings("unchecked")
    Class<? extends FileOutputFormat<LongWritable, Text>> outputFormatClass =
        (Class<? extends FileOutputFormat<LongWritable, Text>>)
            (Class<?>) TemplatedTextOutputFormat.class;
    @SuppressWarnings("unchecked")
    HadoopIO.Write.Bound<LongWritable, Text> writePTransform = HadoopIO.Write.to(
        "hdfs://localhost:9000/tmp/output", outputFormatClass, LongWritable.class, Text.class);

    textInput.apply(ParDo.of(new ParDoFn())).apply(writePTransform.withoutSharding());
    pipeline.run().waitUntilFinish();
}
Which Spark version are you running on top of? In my experience, the error you're getting is thrown by Spark 2.x's AccumulatorV2; the Spark runner currently supports Spark 1.6.
I was facing a similar issue when I created a custom accumulator that extends org.apache.spark.util.AccumulatorV2. The cause was improper logic in the override def isZero: Boolean method: copyAndReset by default calls copy() and then reset(), after which isZero() must return true.
If you look at the AccumulatorV2 source, here is where the check happens:
// Called by Java when serializing an object
final protected def writeReplace(): Any = {
  if (atDriverSide) {
    if (!isRegistered) {
      throw new UnsupportedOperationException(
        "Accumulator must be registered before send to executor")
    }
    val copyAcc = copyAndReset()
    assert(copyAcc.isZero, "copyAndReset must return a zero value copy")
    copyAcc.metadata = metadata
    copyAcc
  } else {
    this
  }
}
Specifically, this part:
val copyAcc = copyAndReset()
assert(copyAcc.isZero, "copyAndReset must return a zero value copy")
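To illustrate the fix, here is a hedged sketch of a custom accumulator whose isZero is consistent with reset(), so the assertion above passes (the class and its element type are made up for illustration):
import org.apache.spark.util.AccumulatorV2

class StringListAccumulator : AccumulatorV2<String, MutableList<String>>() {
    private val list = ArrayList<String>()

    // Must return true for the instance produced by copyAndReset(),
    // i.e. immediately after reset() on a fresh copy.
    override fun isZero(): Boolean = list.isEmpty()

    override fun copy(): StringListAccumulator {
        val acc = StringListAccumulator()
        acc.list.addAll(list)
        return acc
    }

    override fun reset() = list.clear()

    override fun add(v: String) { list.add(v) }

    override fun merge(other: AccumulatorV2<String, MutableList<String>>) {
        list.addAll(other.value())
    }

    override fun value(): MutableList<String> = list
}
It would then be registered on the driver with SparkContext.register before being used on executors.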
Hope it helps. Happy sparking!