KeeperErrorCode = Unimplemented for / - apache-pulsar

I'm trying to initialize metadata with Pulsar 2.9.1 and ZooKeeper 3.6.3, and I'm getting the following error:
Exception in thread "main" org.apache.pulsar.metadata.api.MetadataStoreException: org.apache.zookeeper.KeeperException$UnimplementedException: KeeperErrorCode = Unimplemented for /
at org.apache.pulsar.metadata.impl.ZKMetadataStore.<init>(ZKMetadataStore.java:89)
at org.apache.pulsar.metadata.impl.MetadataStoreFactoryImpl.newInstance(MetadataStoreFactoryImpl.java:52)
at org.apache.pulsar.metadata.impl.MetadataStoreFactoryImpl.createExtended(MetadataStoreFactoryImpl.java:36)
at org.apache.pulsar.metadata.api.extended.MetadataStoreExtended.create(MetadataStoreExtended.java:43)
at org.apache.pulsar.PulsarClusterMetadataSetup.initMetadataStore(PulsarClusterMetadataSetup.java:338)
at org.apache.pulsar.PulsarClusterMetadataSetup.main(PulsarClusterMetadataSetup.java:203)
Caused by: org.apache.zookeeper.KeeperException$UnimplementedException: KeeperErrorCode = Unimplemented for /
at org.apache.zookeeper.KeeperException.create(KeeperException.java:106)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:54)
at org.apache.zookeeper.ZooKeeper.addWatch(ZooKeeper.java:3192)
at org.apache.pulsar.metadata.impl.PulsarZooKeeperClient.access$3301(PulsarZooKeeperClient.java:74)
at org.apache.pulsar.metadata.impl.PulsarZooKeeperClient$23.call(PulsarZooKeeperClient.java:1130)
at org.apache.pulsar.metadata.impl.PulsarZooKeeperClient$23.call(PulsarZooKeeperClient.java:1124)
at org.apache.pulsar.metadata.impl.PulsarZooKeeperClient$ZooWorker.syncCallWithRetries(PulsarZooKeeperClient.java:1529)
at org.apache.pulsar.metadata.impl.PulsarZooKeeperClient.addWatch(PulsarZooKeeperClient.java:1124)
at org.apache.pulsar.metadata.impl.ZKMetadataStore.<init>(ZKMetadataStore.java:82)
... 5 more
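For context: the call that fails in the trace is ZooKeeper.addWatch, which ZKMetadataStore issues against the root path at startup. Persistent watches were only added in ZooKeeper 3.6.0, and a server that doesn't implement that opcode answers with KeeperErrorCode = Unimplemented, so it is worth confirming which server actually answers on that address. A minimal probe, assuming a plain ZooKeeper 3.6+ client on the classpath (the connection string is a placeholder, and the watch mode is an assumption):

import java.util.concurrent.CountDownLatch;
import org.apache.zookeeper.AddWatchMode;
import org.apache.zookeeper.Watcher;
import org.apache.zookeeper.ZooKeeper;

public class AddWatchProbe {
    public static void main(String[] args) throws Exception {
        CountDownLatch connected = new CountDownLatch(1);
        // Point this at the same ensemble Pulsar uses (placeholder address).
        ZooKeeper zk = new ZooKeeper("localhost:2181", 30000, event -> {
            if (event.getState() == Watcher.Event.KeeperState.SyncConnected) {
                connected.countDown();
            }
        });
        connected.await();
        // Same kind of persistent watch ZKMetadataStore registers on "/".
        // A pre-3.6.0 server replies with KeeperErrorCode = Unimplemented here.
        zk.addWatch("/", event -> {}, AddWatchMode.PERSISTENT_RECURSIVE);
        System.out.println("addWatch succeeded; this server supports persistent watches");
        zk.close();
    }
}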

Related

NoSuchMethodError trying to ingest HDFS data into Elasticsearch

I'm using Spark 3.1.2, Scala 2.12, Hadoop 3.1.1.3.1.2-50, Elasticsearch 7.10.1 (due to license issues), and CentOS 7
to try to ingest JSON data in gzip files located on HDFS into Elasticsearch using Spark streaming.
I get the following error:
Logical Plan:
FileStreamSource[hdfs://pct/user/papago-mlops-datalake/raw/mt-log/engine=n2mt/year=2022/date=0430/hour=00]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:356)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:244)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(Lorg/apache/spark/sql/SparkSession;Lorg/apache/spark/sql/execution/QueryExecution;Lscala/Function0;)Ljava/lang/Object;
at org.elasticsearch.spark.sql.streaming.EsSparkSqlStreamingSink.addBatch(EsSparkSqlStreamingSink.scala:62)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$16(MicroBatchExecution.scala:586)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:584)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:584)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:226)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:194)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:57)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:188)
at org.apache.spark.sql.execution.streaming.StreamExecution.$anonfun$runStream$1(StreamExecution.scala:334)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:317)
... 1 more
ApplicationMaster host: ac3m8x2183.bdp.bdata.ai
ApplicationMaster RPC port: 39673
queue: batch
start time: 1654588583366
final status: FAILED
tracking URL: https://gemini-rm2.bdp.bdata.ai:9090/proxy/application_1654575947385_29572/
user: papago-mlops-datalake
Exception in thread "main" org.apache.spark.SparkException: Application application_1654575947385_29572 finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1269)
at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1627)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
using
implementation("org.elasticsearch:elasticsearch-hadoop:8.2.2")
implementation("com.typesafe:config:1.4.2")
implementation("org.apache.spark:spark-sql_2.12:3.1.2")
testImplementation("org.scalatest:scalatest_2.12:3.2.12")
testRuntimeOnly("com.vladsch.flexmark:flexmark-all:0.61.0")
compileOnly("org.apache.spark:spark-sql_2.12:3.1.2")
compileOnly("org.apache.spark:spark-core_2.12:3.1.2")
compileOnly("org.apache.spark:spark-launcher_2.12:3.1.2")
compileOnly("org.apache.spark:spark-streaming_2.12:3.1.2")
compileOnly("org.elasticsearch:elasticsearch-spark-30_2.12:8.2.2")
libraries. I tried using ES-Hadoop version 7.10.1, but for Spark 3.0 the ES-Spark artifact only goes down to 7.12.0, and I still get the same error.
My code is pretty simple:
def main(args: Array[String]): Unit = {
  // Set the log level to only print errors
  Logger.getLogger("org").setLevel(Level.ERROR)

  val spark = SparkSession
    .builder()
    .config(ConfigurationOptions.ES_NET_HTTP_AUTH_USER, elasticsearchUser)
    .config(ConfigurationOptions.ES_NET_HTTP_AUTH_PASS, elasticsearchPass)
    .config(ConfigurationOptions.ES_NODES, elasticsearchHost)
    .config(ConfigurationOptions.ES_PORT, elasticsearchPort)
    .appName(appName)
    .master(master)
    .getOrCreate()

  val streamingDF: DataFrame = spark.readStream
    .schema(jsonSchema)
    .format("org.apache.spark.sql.execution.datasources.json.JsonFileFormat")
    .load(pathToJSONResource)

  streamingDF.writeStream
    .outputMode(outputMode)
    .format(destination)
    .option("checkpointLocation", checkpointLocation)
    .start(indexAndDocType)
    .awaitTermination()

  // Stop the session
  spark.stop()
}
}
If I can't use the ES-Hadoop libraries, is there another way I can go about ingesting JSON into ES from HDFS?
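If the connector versions can't be aligned, one fallback pattern (a sketch, not the ES-Hadoop API) is to keep Spark Structured Streaming for the HDFS read and index each micro-batch through the plain Elasticsearch low-level REST client via foreachBatch. The host, index name, paths, and stand-in schema below are assumptions:

import org.apache.http.HttpHost;
import org.apache.spark.api.java.function.VoidFunction2;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.StructType;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.RestClient;

public class HdfsJsonToEs {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder().appName("hdfs-json-to-es").getOrCreate();

        // Stand-in schema; substitute the real jsonSchema from the question.
        StructType schema = new StructType().add("message", "string");

        Dataset<Row> stream = spark.readStream()
                .schema(schema)
                .json("hdfs://host/path/to/json"); // placeholder path

        // Runs on the driver once per micro-batch; collectAsList() is only
        // reasonable for modest batch sizes.
        VoidFunction2<Dataset<Row>, Long> indexBatch = (batch, batchId) -> {
            StringBuilder bulk = new StringBuilder();
            for (String doc : batch.toJSON().collectAsList()) {
                bulk.append("{\"index\":{\"_index\":\"my-index\"}}\n") // placeholder index
                    .append(doc).append('\n');
            }
            if (bulk.length() == 0) {
                return;
            }
            try (RestClient client = RestClient.builder(
                    new HttpHost("es-host", 9200, "http")).build()) { // placeholder host
                Request request = new Request("POST", "/_bulk");
                request.setJsonEntity(bulk.toString());
                client.performRequest(request);
            }
        };

        stream.writeStream()
                .option("checkpointLocation", "/tmp/es-checkpoint") // placeholder
                .foreachBatch(indexBatch)
                .start()
                .awaitTermination();
    }
}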

spring-integration-smb : jcifs.smb.SmbException: The parameter is incorrect while connecting to NAS

I encountered a problem when connecting to a NAS shared directory using spring-integration-smb.
The problem is that I was able to connect to another shared NAS directory, but for the pre-prod NAS I encounter this problem.
Also, the shared server administrator confirmed that both directories have the same configuration.
You will find the stack trace encountered below:
07 mars 2022;14:49:50.702 [scheduling-1] WARN jcifs.smb.SmbTransportImpl - Disconnecting transport while still in use Transport12[NAS03/XXXXXXXX:445,state=5,signingEnforced=false,usage=1]: [SmbSession[credentials=XXXXXXXXXX,targetHost=nas03,targetDomain=null,uid=0,connectionState=2,usage=1]]
07 mars 2022;14:49:50.702 [scheduling-1] WARN jcifs.smb.SmbSessionImpl - Logging off session while still in use SmbSession[credentials=XXXXXXXXX,targetHost=nas03,targetDomain=null,uid=0,connectionState=3,usage=1]:[SmbTree[share=PPD,service=null,tid=4,inDfs=false,inDomainDfs=false,connectionState=0,usage=2]]
07 mars 2022;14:49:50.737 [scheduling-1] ERROR o.s.i.handler.LoggingHandler - org.springframework.messaging.MessagingException: Problem occurred while synchronizing '' to local directory; nested exception is org.springframework.messaging.MessagingException: Failure occurred while copying '/test.csv' from the remote to the local directory; nested exception is org.springframework.core.NestedIOException: Failed to read resource [/test.csv].; nested exception is jcifs.smb.SmbException: The parameter is incorrect.
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.synchronizeToLocalDirectory(AbstractInboundFileSynchronizer.java:348)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizingMessageSource.doReceive(AbstractInboundFileSynchronizingMessageSource.java:267)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizingMessageSource.doReceive(AbstractInboundFileSynchronizingMessageSource.java:69)
at org.springframework.integration.endpoint.AbstractFetchLimitingMessageSource.doReceive(AbstractFetchLimitingMessageSource.java:47)
at org.springframework.integration.endpoint.AbstractMessageSource.receive(AbstractMessageSource.java:142)
at org.springframework.integration.endpoint.SourcePollingChannelAdapter.receiveMessage(SourcePollingChannelAdapter.java:212)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:444)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.pollForMessage(AbstractPollingEndpoint.java:413)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.lambda$createPoller$4(AbstractPollingEndpoint.java:348)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.lambda$execute$0(ErrorHandlingTaskExecutor.java:57)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.execute(ErrorHandlingTaskExecutor.java:55)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.lambda$createPoller$5(AbstractPollingEndpoint.java:341)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:95)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:264)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.springframework.messaging.MessagingException: Failure occurred while copying '/BE1_2_MOUVEMENTS_Valorisation_20211231_20220218_164451.csv' from the remote to the local directory; nested exception is org.springframework.core.NestedIOException: Failed to read resource [/BE1_2_MOUVEMENTS_Valorisation_20211231_20220218_164451.csv].; nested exception is jcifs.smb.SmbException: The parameter is incorrect.
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.copyRemoteContentToLocalFile(AbstractInboundFileSynchronizer.java:551)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.copyFileToLocalDirectory(AbstractInboundFileSynchronizer.java:488)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.copyIfNotNull(AbstractInboundFileSynchronizer.java:403)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.transferFilesFromRemoteToLocal(AbstractInboundFileSynchronizer.java:386)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.lambda$synchronizeToLocalDirectory$0(AbstractInboundFileSynchronizer.java:342)
at org.springframework.integration.file.remote.RemoteFileTemplate.execute(RemoteFileTemplate.java:452)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.synchronizeToLocalDirectory(AbstractInboundFileSynchronizer.java:341)
... 21 more
Caused by: org.springframework.core.NestedIOException: Failed to read resource [/BE1_2_MOUVEMENTS_Valorisation_20211231_20220218_164451.csv].; nested exception is jcifs.smb.SmbException: The parameter is incorrect.
at org.springframework.integration.smb.session.SmbSession.read(SmbSession.java:188)
at org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizer.copyRemoteContentToLocalFile(AbstractInboundFileSynchronizer.java:545)
... 27 more
Caused by: jcifs.smb.SmbException: The parameter is incorrect.
at jcifs.smb.SmbTransportImpl.checkStatus2(SmbTransportImpl.java:1467)
at jcifs.smb.SmbTransportImpl.checkStatus(SmbTransportImpl.java:1578)
at jcifs.smb.SmbTransportImpl.sendrecv(SmbTransportImpl.java:1027)
at jcifs.smb.SmbTransportImpl.send(SmbTransportImpl.java:1549)
at jcifs.smb.SmbSessionImpl.send(SmbSessionImpl.java:409)
at jcifs.smb.SmbTreeImpl.send(SmbTreeImpl.java:472)
at jcifs.smb.SmbTreeConnection.send0(SmbTreeConnection.java:404)
at jcifs.smb.SmbTreeConnection.send(SmbTreeConnection.java:318)
at jcifs.smb.SmbTreeConnection.send(SmbTreeConnection.java:298)
at jcifs.smb.SmbTreeHandleImpl.send(SmbTreeHandleImpl.java:130)
at jcifs.smb.SmbTreeHandleImpl.send(SmbTreeHandleImpl.java:117)
at jcifs.smb.SmbFile.withOpen(SmbFile.java:1775)
at jcifs.smb.SmbFile.withOpen(SmbFile.java:1744)
at jcifs.smb.SmbFile.queryPath(SmbFile.java:793)
at jcifs.smb.SmbFile.exists(SmbFile.java:879)
at jcifs.smb.SmbFile.isFile(SmbFile.java:1102)
at org.springframework.integration.smb.session.SmbSession.read(SmbSession.java:182)
... 28 more
Here is my code:
@Bean
public SmbSessionFactory smbSessionFactory() {
    VaultResponse vaultResponse = vaultTemplate
        .opsForKeyValue(vaultPath, VaultKeyValueOperationsSupport.KeyValueBackend.KV_2)
        .get(vaultSecretsPath.toLowerCase());
    SmbSessionFactory smbSession = new SmbSessionFactory();
    smbSession.setHost(properties.getNasHost());
    smbSession.setPort(properties.getNasPort());
    smbSession.setDomain(properties.getNasDomain());
    if (vaultResponse != null) {
        Map<String, Object> data = vaultResponse.getData();
        smbSession.setUsername(data != null && data.get("nasUsername") != null ? (String) data.get("nasUsername") : "");
        smbSession.setPassword(data != null && data.get("nasPassword") != null ? (String) data.get("nasPassword") : "");
    }
    smbSession.setShareAndDir(properties.getNasShareAndDir());
    smbSession.setReplaceFile(true);
    smbSession.setSmbMinVersion(DialectVersion.SMB1);
    smbSession.setSmbMaxVersion(DialectVersion.SMB311);
    return smbSession;
}
Thank you in advance.
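For what it's worth, this kind of SmbException can be chased outside Spring Integration by reading the same file with jcifs-ng (the SMB client library underneath spring-integration-smb) directly. A minimal sketch, where the host, share, credentials, and pinned dialect range are assumptions:

import java.util.Properties;
import jcifs.CIFSContext;
import jcifs.config.PropertyConfiguration;
import jcifs.context.BaseContext;
import jcifs.smb.NtlmPasswordAuthenticator;
import jcifs.smb.SmbFile;

public class SmbProbe {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Pinning the dialect range narrows down negotiation problems; the wide
        // SMB1..SMB311 range in the question is one variable worth eliminating.
        props.setProperty("jcifs.smb.client.minVersion", "SMB202");
        props.setProperty("jcifs.smb.client.maxVersion", "SMB311");

        CIFSContext ctx = new BaseContext(new PropertyConfiguration(props))
                .withCredentials(new NtlmPasswordAuthenticator("DOMAIN", "user", "password"));

        // Exercises the same exists()/isFile() path that fails in the trace.
        try (SmbFile file = new SmbFile("smb://nas03/PPD/test.csv", ctx)) {
            System.out.println("exists=" + file.exists() + ", isFile=" + file.isFile());
        }
    }
}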

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession$ : Redis Client

I am using a Redis client to store constants for a project, but when I execute the following code and use the result in a SparkSession, I get an error. Can anybody help me out? It's a Maven project.
Here is my code:
val sparkSession = SparkSession.builder.appName(getAppName).enableHiveSupport().getOrCreate()
sparkSession

val r = new RedisClient("localhost", 6379)

def getAppName: String = {
  val a = r.get("xyz")
  val v = new String(a.get)
  logger.info("******** xyz ************" + v)
  v
}
I am getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession$
at com.hpe.cdp.common.SparkCommon$.createSparkSession(abc.scala:14)
at com.hpe.cdp.common.SparkCommon$.readHiveDataDelta(abc.scala:20)
at com.hpe.cdp.common.CdpEapObjectCall$.saveDataCsvHiveDelta(def.scala:16)
at com.hpe.cdp.CdpEapObject$.main(def.scala:16)
at com.hpe.cdp.CdpEapObject.main(def.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 5 more
I am using the following Maven dependency:
<!-- https://mvnrepository.com/artifact/org.redis/scala-redis -->
<dependency>
    <groupId>org.redis</groupId>
    <artifactId>scala-redis_2.11</artifactId>
    <version>0.0.13</version>
</dependency>
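As a side note, a NoClassDefFoundError for org.apache.spark.sql.SparkSession$ at runtime usually means the Spark artifacts are missing from the runtime classpath. A sketch of the dependency that would at least need to be present at compile time (the version and Scala suffix are assumptions chosen to match scala-redis_2.11; provided scope assumes the jar is launched through spark-submit rather than plain java):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.4.8</version>
    <scope>provided</scope>
</dependency>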

Kafka Connect Sink to Cassandra :: java.lang.VerifyError: Bad return type

I'm trying to set up a Kafka Connect sink to collect data from a topic into a Cassandra table using the DataStax connector: https://downloads.datastax.com/#akc
Running a standalone worker directly on the broker, on Kafka 0.10.2.2-1:
name=dse-sink
connector.class=com.datastax.kafkaconnector.DseSinkConnector
tasks.max=1
datastax-java-driver.advanced.protocol.version = V4
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
plugin.path=/usr/share/java/kafka-connect-dse/kafka-connect-dse-1.2.1.jar
topics=connect-test
contactPoints=172.16.0.48
loadBalancing.localDc=datacenter1
port=9042
ignoreErrors=true
topic.connect-test.cdrs.test.mapping= kafkakey=key, value=value
topic.connect-test.cdrs.test.consistencyLevel=LOCAL_QUORUM
But I get the following error:
[2019-12-23 16:58:43,165] ERROR Task dse-sink-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask)
java.lang.VerifyError: Bad return type
Exception Details:
Location:
com/fasterxml/jackson/databind/cfg/MapperBuilder.streamFactory()Lcom/fasterxml/jackson/core/TokenStreamFactory; #7: areturn
Reason:
Type 'com/fasterxml/jackson/core/JsonFactory' (current frame, stack[0]) is not assignable to 'com/fasterxml/jackson/core/TokenStreamFactory' (from method signature)
Current Frame:
bci: #7
flags: { }
locals: { 'com/fasterxml/jackson/databind/cfg/MapperBuilder' }
stack: { 'com/fasterxml/jackson/core/JsonFactory' }
Bytecode:
0x0000000: 2ab4 0002 b600 08b0
at com.fasterxml.jackson.databind.json.JsonMapper.builder(JsonMapper.java:114)
at com.datastax.dsbulk.commons.codecs.json.JsonCodecUtils.getObjectMapper(JsonCodecUtils.java:36)
at com.datastax.kafkaconnector.codecs.CodecSettings.init(CodecSettings.java:131)
at com.datastax.kafkaconnector.state.LifeCycleManager.lambda$buildInstanceState$9(LifeCycleManager.java:423)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.HashMap$ValueSpliterator.forEachRemaining(HashMap.java:1625)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at com.datastax.kafkaconnector.state.LifeCycleManager.buildInstanceState(LifeCycleManager.java:457)
at com.datastax.kafkaconnector.state.LifeCycleManager.lambda$startTask$0(LifeCycleManager.java:106)
at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
at com.datastax.kafkaconnector.state.LifeCycleManager.startTask(LifeCycleManager.java:101)
at com.datastax.kafkaconnector.DseSinkTask.start(DseSinkTask.java:74)
at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:244)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:145)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:139)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:182)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
No additional errors on the Cassandra or Kafka side.
I see an active connection on the Cassandra node, but nothing arrives in the keyspace.
Any idea why?
IMHO this is a problem caused by the use of the JSON internal converters with BigDecimal data (see related SO question). As described in the following blog post, the internal.key.converter and internal.value.converter are deprecated since Kafka 2.0 and shouldn't be explicitly set. Can you comment out all the internal. properties and re-try, as shown below?
P.S. Also see how JSON + Decimal handling has changed in Kafka 2.4
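Concretely, that means commenting out these four lines in the worker config above:

#internal.key.converter=org.apache.kafka.connect.json.JsonConverter
#internal.value.converter=org.apache.kafka.connect.json.JsonConverter
#internal.key.converter.schemas.enable=false
#internal.value.converter.schemas.enable=false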

WebLogic 12c migration issue - No available router to destination

I have an application which was running on WebLogic 10.3. I migrated the application to WebLogic 12c with Java 1.8 (the earlier Java version was 1.6).
When deploying the application, I am getting the below exception in the logs:
2017-06-29 13:44:21,480 - INFO (Configuration.java:1547) - Configured SessionFactory: null
2017-06-29 13:44:21,527 - INFO (NamingHelper.java:26) - JNDI InitialContext properties:{java.naming.provider.url=t3s://xxxx.xxx.xxx.com:4040, java.naming.factory.initial=weblogic.jndi.WLInitialContextFactory}
2017-06-29 13:44:22,097 - ERROR (NamingHelper.java:33) - Could not obtain initial context
javax.naming.CommunicationException: t3s://xxxx.xxx.xxx.com:4040: Destination 10.xx.xx.xx, 4040 unreachable; nested exception is:
java.io.IOException: An existing connection was forcibly closed by the remote host; No available router to destination [Root exception is java.net.ConnectException: t3s://xxxx.xxx.xxx.com:4040: Destination 10.xx.xx.xx, 4040 unreachable; nested exception is:
java.io.IOException: An existing connection was forcibly closed by the remote host; No available router to destination]
at weblogic.jndi.internal.ExceptionTranslator.toNamingException(ExceptionTranslator.java:40)
at weblogic.jndi.WLInitialContextFactoryDelegate.toNamingException(WLInitialContextFactoryDelegate.java:808)
at weblogic.jndi.WLInitialContextFactoryDelegate.getInitialContext(WLInitialContextFactoryDelegate.java:365)
at weblogic.jndi.Environment.getContext(Environment.java:319)
at weblogic.jndi.Environment.getContext(Environment.java:288)
at weblogic.jndi.WLInitialContextFactory.getInitialContext(WLInitialContextFactory.java:117)
at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:684)
at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:313)
at javax.naming.InitialContext.init(InitialContext.java:244)
at javax.naming.InitialContext.<init>(InitialContext.java:216)
at org.hibernate.util.NamingHelper.getInitialContext(NamingHelper.java:28)
at org.hibernate.connection.DatasourceConnectionProvider.configure(DatasourceConnectionProvider.java:52)
at org.hibernate.connection.ConnectionProviderFactory.newConnectionProvider(ConnectionProviderFactory.java:124)
at org.hibernate.connection.ConnectionProviderFactory.newConnectionProvider(ConnectionProviderFactory.java:56)
at org.hibernate.cfg.SettingsFactory.createConnectionProvider(SettingsFactory.java:414)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:62)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2073)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1298)
at com.xxxxx.k2.util.HibernateHelper.init(HibernateHelper.java:19)
at com.xxxxx.k2.servlet.StartupServlet.contextInitialized(StartupServlet.java:49)
at weblogic.servlet.internal.EventsManager$FireContextListenerAction.run(EventsManager.java:678)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.provider.WlsSubjectHandle.run(WlsSubjectHandle.java:57)
at weblogic.servlet.internal.EventsManager.executeContextListener(EventsManager.java:243)
at weblogic.servlet.internal.EventsManager.notifyContextCreatedEvent(EventsManager.java:200)
at weblogic.servlet.internal.EventsManager.notifyContextCreatedEvent(EventsManager.java:185)
at weblogic.servlet.internal.WebAppServletContext.preloadResources(WebAppServletContext.java:1838)
at weblogic.servlet.internal.WebAppServletContext.start(WebAppServletContext.java:2876)
at weblogic.servlet.internal.WebAppModule.startContexts(WebAppModule.java:1661)
at weblogic.servlet.internal.WebAppModule.start(WebAppModule.java:823)
at weblogic.application.internal.ExtensibleModuleWrapper$StartStateChange.next(ExtensibleModuleWrapper.java:360)
at weblogic.application.internal.ExtensibleModuleWrapper$StartStateChange.next(ExtensibleModuleWrapper.java:356)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:42)
at weblogic.application.internal.ExtensibleModuleWrapper.start(ExtensibleModuleWrapper.java:138)
at weblogic.application.internal.flow.ModuleListenerInvoker.start(ModuleListenerInvoker.java:124)
at weblogic.application.internal.flow.ModuleStateDriver$3.next(ModuleStateDriver.java:216)
at weblogic.application.internal.flow.ModuleStateDriver$3.next(ModuleStateDriver.java:211)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:42)
at weblogic.application.internal.flow.ModuleStateDriver.start(ModuleStateDriver.java:73)
at weblogic.application.internal.flow.StartModulesFlow.activate(StartModulesFlow.java:24)
at weblogic.application.internal.BaseDeployment$2.next(BaseDeployment.java:729)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:42)
at weblogic.application.internal.BaseDeployment.activate(BaseDeployment.java:258)
at weblogic.application.internal.SingleModuleDeployment.activate(SingleModuleDeployment.java:48)
at weblogic.application.internal.DeploymentStateChecker.activate(DeploymentStateChecker.java:165)
at weblogic.deploy.internal.targetserver.AppContainerInvoker.activate(AppContainerInvoker.java:80)
at weblogic.deploy.internal.targetserver.BasicDeployment.activate(BasicDeployment.java:226)
at weblogic.deploy.internal.targetserver.BasicDeployment.activateFromServerLifecycle(BasicDeployment.java:418)
at weblogic.management.deploy.internal.DeploymentAdapter$1.doActivate(DeploymentAdapter.java:51)
at weblogic.management.deploy.internal.DeploymentAdapter.activate(DeploymentAdapter.java:200)
at weblogic.management.deploy.internal.AppTransition$2.transitionApp(AppTransition.java:30)
at weblogic.management.deploy.internal.ConfiguredDeployments.transitionApps(ConfiguredDeployments.java:240)
at weblogic.management.deploy.internal.ConfiguredDeployments.activate(ConfiguredDeployments.java:169)
at weblogic.management.deploy.internal.ConfiguredDeployments.deploy(ConfiguredDeployments.java:123)
at weblogic.management.deploy.internal.DeploymentServerService.resume(DeploymentServerService.java:210)
at weblogic.management.deploy.internal.DeploymentServerService.start(DeploymentServerService.java:118)
at weblogic.server.AbstractServerService.postConstruct(AbstractServerService.java:78)
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.glassfish.hk2.utilities.reflection.ReflectionHelper.invoke(ReflectionHelper.java:1017)
at org.jvnet.hk2.internal.ClazzCreator.postConstructMe(ClazzCreator.java:388)
at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:430)
at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:456)
at org.glassfish.hk2.runlevel.internal.AsyncRunLevelContext.findOrCreate(AsyncRunLevelContext.java:225)
at org.glassfish.hk2.runlevel.RunLevelContext.findOrCreate(RunLevelContext.java:82)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2488)
at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:98)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getService(ServiceLocatorImpl.java:606)
at org.jvnet.hk2.internal.ThreeThirtyResolver.resolve(ThreeThirtyResolver.java:77)
at org.jvnet.hk2.internal.ClazzCreator.resolve(ClazzCreator.java:231)
at org.jvnet.hk2.internal.ClazzCreator.resolveAllDependencies(ClazzCreator.java:254)
at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:413)
at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:456)
at org.glassfish.hk2.runlevel.internal.AsyncRunLevelContext.findOrCreate(AsyncRunLevelContext.java:225)
at org.glassfish.hk2.runlevel.RunLevelContext.findOrCreate(RunLevelContext.java:82)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2488)
at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:98)
at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:87)
at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.oneJob(CurrentTaskFuture.java:1162)
at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.run(CurrentTaskFuture.java:1147)
at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:548)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:311)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:263)
Caused by: java.net.ConnectException: t3://xxxx.xxx.xxx.com:4040: Destination 10.xx.xx.xx, 4040 unreachable; nested exception is:
java.io.IOException: An existing connection was forcibly closed by the remote host; No available router to destination
at weblogic.rjvm.RJVMFinder.findOrCreateInternal(RJVMFinder.java:241)
at weblogic.rjvm.RJVMFinder.findOrCreate(RJVMFinder.java:169)
at weblogic.rjvm.ServerURL.findOrCreateRJVM(ServerURL.java:177)
at weblogic.jndi.WLInitialContextFactoryDelegate.getInitialContext(WLInitialContextFactoryDelegate.java:350)
... 82 more
Caused by: java.rmi.ConnectException: Destination 10.xx.xx.xx, 4040 unreachable; nested exception is:
java.io.IOException: An existing connection was forcibly closed by the remote host; No available router to destination
at weblogic.rjvm.ConnectionManager.bootstrap(ConnectionManager.java:490)
at weblogic.rjvm.ConnectionManager.bootstrap(ConnectionManager.java:328)
at weblogic.rjvm.RJVMManager.findOrCreateRemoteInternal(RJVMManager.java:300)
at weblogic.rjvm.RJVMManager.findOrCreate(RJVMManager.java:204)
at weblogic.rjvm.RJVMFinder.findOrCreateRemoteServer(RJVMFinder.java:263)
at weblogic.rjvm.RJVMFinder.findOrCreateInternal(RJVMFinder.java:225)
... 85 more
Error message when opening the home page:
Error 500--Internal Server Error
java.lang.Exception: java.lang.UnsupportedOperationException
at com.icesoft.faces.context.View.reportException(View.java:318)
at com.icesoft.faces.context.View.servePage(View.java:200)
at com.icesoft.faces.webapp.http.core.SingleViewServer.service(SingleViewServer.java:84)
at com.icesoft.faces.webapp.http.common.ServerProxy.service(ServerProxy.java:43)
at com.icesoft.faces.webapp.http.servlet.MainSessionBoundServlet$4.service(MainSessionBoundServlet.java:187)
at com.icesoft.faces.webapp.http.servlet.BasicAdaptingServlet.service(BasicAdaptingServlet.java:51)
at com.icesoft.faces.webapp.http.servlet.PathDispatcher.service(PathDispatcher.java:55)
at com.icesoft.faces.webapp.http.servlet.SessionDispatcher.service(SessionDispatcher.java:100)
at com.icesoft.faces.webapp.http.servlet.PathDispatcher.service(PathDispatcher.java:55)
at com.icesoft.faces.webapp.http.servlet.MainServlet.service(MainServlet.java:205)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:844)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:280)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:254)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:136)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:346)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:25)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3436)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3402)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.provider.WlsSubjectHandle.run(WlsSubjectHandle.java:57)
at weblogic.servlet.internal.WebAppServletContext.doSecuredExecute(WebAppServletContext.java:2285)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2201)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1572)
at weblogic.servlet.provider.ContainerSupportProviderImpl$WlsRequestExecutor.run(ContainerSupportProviderImpl.java:255)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:311)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:263)
Caused by: java.lang.UnsupportedOperationException
at javax.faces.context.FacesContext.getExceptionHandler(FacesContext.java:284)
at com.sun.faces.lifecycle.Phase.doPhase(Phase.java:119)
at com.sun.faces.lifecycle.RestoreViewPhase.doPhase(RestoreViewPhase.java:116)
at com.sun.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:118)
at com.icesoft.faces.webapp.http.core.JsfLifecycleExecutor.apply(JsfLifecycleExecutor.java:50)
at com.icesoft.faces.context.View$2$1.respond(View.java:85)
at com.icesoft.faces.webapp.http.servlet.ServletRequestResponse.respondWith(ServletRequestResponse.java:242)
at com.icesoft.faces.context.View$2.serve(View.java:119)
at com.icesoft.faces.context.View.servePage(View.java:192)
... 29 more
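Separately, the root failure in the first trace is the JNDI InitialContext creation over t3s, which can be tested outside the deployment with a minimal standalone client. A sketch, assuming a WebLogic client jar (e.g. wlthint3client.jar) on the classpath, with the placeholder URL from the logs:

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.InitialContext;

public class T3sProbe {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
        env.put(Context.PROVIDER_URL, "t3s://xxxx.xxx.xxx.com:4040"); // placeholder from the logs
        // If this throws the same CommunicationException, the failure is at the
        // network/SSL level (listen address, channel, trust store), not in the app.
        Context ctx = new InitialContext(env);
        System.out.println("InitialContext created: " + ctx);
        ctx.close();
    }
}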
