In my Java code, I call this method with the simple command below:
ItemSimilarityDriver.main(new String[] {
    "--input", InFile,
    "--output", OutPath,
    "--master", masterUrl,
    "--filter1", "purchase",
    "--filter2", "view",
    "--inDelim", ",",
    "--itemIDColumn", "2",
    "--rowIDColumn", "0",
    "--filterColumn", "1"});
I then get the following error when running it:
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
java.lang.ExceptionInInitializerError
at java.net.SocketPermission$1.run(SocketPermission.java:1217)
at java.net.SocketPermission$1.run(SocketPermission.java:1209)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.SocketPermission.initEphemeralPorts(SocketPermission.java:1208)
at java.net.SocketPermission.<clinit>(SocketPermission.java:235)
at sun.security.util.SecurityConstants.<clinit>(SecurityConstants.java:259)
at sun.security.provider.PolicyFile$4.run(PolicyFile.java:688)
at sun.security.provider.PolicyFile$4.run(PolicyFile.java:684)
at java.security.AccessController.doPrivileged(Native Method)
at sun.security.provider.PolicyFile.initStaticPolicy(PolicyFile.java:684)
at sun.security.provider.PolicyFile.initPolicyFile(PolicyFile.java:510)
at sun.security.provider.PolicyFile.init(PolicyFile.java:464)
at sun.security.provider.PolicyFile.<init>(PolicyFile.java:322)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:374)
at java.security.Policy.getPolicyNoCheck(Policy.java:195)
at java.security.ProtectionDomain.implies(ProtectionDomain.java:272)
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:350)
at java.security.AccessController.checkPermission(AccessController.java:559)
at ch.qos.logback.core.util.Loader$1.run(Loader.java:50)
at ch.qos.logback.core.util.Loader$1.run(Loader.java:47)
at java.security.AccessController.doPrivileged(Native Method)
at ch.qos.logback.core.util.Loader.<clinit>(Loader.java:46)
at ch.qos.logback.classic.util.ContextInitializer.findURLOfDefaultConfigurationFile(ContextInitializer.java:119)
at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:148)
at org.slf4j.impl.StaticLoggerBinder.init(StaticLoggerBinder.java:85)
at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:55)
at org.slf4j.LoggerFactory.bind(LoggerFactory.java:129)
at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:108)
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:302)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:276)
at org.apache.log4j.Category.<init>(Category.java:56)
at org.apache.log4j.Logger.<init>(Logger.java:36)
at org.apache.log4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:62)
at org.apache.log4j.Logger.getLogger(Logger.java:40)
at org.apache.mahout.sparkbindings.package$.<init>(package.scala:40)
at org.apache.mahout.sparkbindings.package$.<clinit>(package.scala)
at org.apache.mahout.drivers.MahoutSparkDriver.start(MahoutSparkDriver.scala:81)
at org.apache.mahout.drivers.ItemSimilarityDriver$.start(ItemSimilarityDriver.scala:128)
at org.apache.mahout.drivers.ItemSimilarityDriver$.process(ItemSimilarityDriver.scala:211)
at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:116)
at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:114)
at scala.Option.map(Option.scala:145)
at org.apache.mahout.drivers.ItemSimilarityDriver$.main(ItemSimilarityDriver.scala:114)
at org.apache.mahout.drivers.ItemSimilarityDriver.main(ItemSimilarityDriver.scala)
I have tried to add
grant {
    permission java.security.AllPermission;
};
in java.policy, but it still has no effect. The problem seems to be related to permissions, but I really don't know exactly what it is.
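I am also not sure whether the java.policy file I edited is even the one the JVM loads. If it isn't, my understanding is that a policy file can be forced explicitly at startup (the path and main class below are placeholders):
java -Djava.security.policy==/path/to/java.policy -cp <classpath> MyMainClass
As far as I know, a single '=' appends the file to the default policy, while '==' makes it the only policy in effect.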
Please help me!
Related
I am trying to use the following command from a DSE client node in order to submit a Scala application fat jar to a single-node DataStax environment.
dse -u cassandra -p <password> spark-submit --class my.package.bde.TestSparkApp target/big-data-engine-0.0.1-jar-with-dependencies.jar --master dse://10.0.0.4?
Also, in /etc/dse/spark-defaults.conf, among other settings, I have put the following lines:
spark.cassandra.connection.host 10.0.0.4
spark.cassandra.auth.username cassandra
spark.cassandra.auth.password <pass>
com.datastax.bdp.fs.client.authentication.basic.username cassandra
com.datastax.bdp.fs.client.authentication.basic.password <pass>
However, the output of the command is the following.
Does anybody know how it could be fixed?
WARN 2019-05-24 09:01:58,116 com.datastax.driver.core.NettyUtil: Found Netty's native epoll transport in the classpath, but epoll is not available. Using NIO instead.
java.lang.NoSuchMethodError: Method io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.efdnonblock()I name or signature does not match
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
WARN 2019-05-24 09:02:02,079 hive.ql.metadata.Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1059)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:137)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:136)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:136)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:133)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:171)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:62)
at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:466)
at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:377)
at org.apache.spark.sql.SQLImplicits.localSeqToDatasetHolder(SQLImplicits.scala:213)
at tech.metis.bde.TestSparkApp$.delayedEndpoint$tech$metis$bde$TestSparkApp$1(TestSparkApp.scala:35)
at tech.metis.bde.TestSparkApp$delayedInit$body.apply(TestSparkApp.scala:20)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at tech.metis.bde.TestSparkApp$.main(TestSparkApp.scala:20)
at tech.metis.bde.TestSparkApp.main(TestSparkApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.DseSparkSubmit$.org$apache$spark$deploy$DseSparkSubmit$$runMain(DseSparkSubmit.scala:662)
at org.apache.spark.deploy.DseSparkSubmit$.doRunMain$1(DseSparkSubmit.scala:167)
at org.apache.spark.deploy.DseSparkSubmit$.submit(DseSparkSubmit.scala:192)
at org.apache.spark.deploy.DseSparkSubmit$.main(DseSparkSubmit.scala:106)
at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106)
at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
... 57 common frames omitted
Caused by: java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 63 common frames omitted
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Got exception: com.datastax.bdp.fs.client.DseFsConnectionException Could not connect to DSEFS. Last endpoint tried: 10.0.0.4:5598
at org.apache.hadoop.hive.metastore.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:1213)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:106)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:140)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:146)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:601)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
... 68 common frames omitted
WARN 2019-05-24 09:02:02,216 io.netty.channel.ChannelInitializer: Failed to initialize a channel. Closing: [id: 0x61f0844e]
java.lang.VerifyError: Bad invokespecial instruction: current class isn't assignable to reference class.
Exception Details:
Location:
com/datastax/bdp/fs/rest/client/StartTlsClientHandler.decode(Lio/netty/channel/ChannelHandlerContext;Lio/netty/handler/codec/http/HttpObject;Ljava/util/List;)V #4: invokespecial
Reason:
Error exists in the bytecode
Bytecode:
0x0000000: 2a2b 2c2d b701 72b2 003c 2db9 0178 0100
0x0000010: 04a3 0007 04a7 0004 03b6 017b 2db9 017e
0x0000020: 0100 9a00 1c2d 03b9 0181 0200 c001 373a
0x0000030: 042d b901 8401 002a 2b19 04b7 0186 b1
Stackmap Table:
same_locals_1_stack_item_frame(#24,Object[#56])
full_frame(#25,{Object[#2],Object[#80],Object[#396],Object[#372]},{Object[#56],Integer})
same_frame(#62)
at com.datastax.bdp.fs.rest.client.RestClientConnection$ClientChannelInitializer.initChannel(RestClientConnection.scala:568)
at com.datastax.bdp.fs.rest.client.RestClientConnection$ClientChannelInitializer.initChannel(RestClientConnection.scala:563)
ERROR 2019-05-24 09:02:04,434 hive.log: Got exception: com.datastax.bdp.fs.client.DseFsConnectionException Could not connect to DSEFS. Last endpoint tried: 10.0.0.4:5598
com.datastax.bdp.fs.client.DseFsConnectionException: Could not connect to DSEFS. Last endpoint tried: 10.0.0.4:5598
at com.datastax.bdp.fs.client.RestClientProxy.<init>(RestClientProxy.scala:159)
at com.datastax.bdp.fs.client.DseFsClient.liftedTree1$1(DseFsClient.scala:62)
at com.datastax.bdp.fs.client.DseFsClient.<init>(DseFsClient.scala:62)
at com.datastax.bdp.fs.hadoop.DseFileSystem.initialize(DseFileSystem.scala:118)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2653)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:362)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:104)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:140)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:146)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:601)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1059)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:137)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:136)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:136)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:133)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:171)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:62)
at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:466)
at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:377)
at org.apache.spark.sql.SQLImplicits.localSeqToDatasetHolder(SQLImplicits.scala:213)
at tech.metis.bde.TestSparkApp$.delayedEndpoint$tech$metis$bde$TestSparkApp$1(TestSparkApp.scala:35)
at tech.metis.bde.TestSparkApp$delayedInit$body.apply(TestSparkApp.scala:20)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at tech.metis.bde.TestSparkApp$.main(TestSparkApp.scala:20)
at tech.metis.bde.TestSparkApp.main(TestSparkApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.DseSparkSubmit$.org$apache$spark$deploy$DseSparkSubmit$$runMain(DseSparkSubmit.scala:662)
at org.apache.spark.deploy.DseSparkSubmit$.doRunMain$1(DseSparkSubmit.scala:167)
at org.apache.spark.deploy.DseSparkSubmit$.submit(DseSparkSubmit.scala:192)
at org.apache.spark.deploy.DseSparkSubmit$.main(DseSparkSubmit.scala:106)
at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106)
at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala)
Caused by: com.datastax.bdp.fs.rest.client.RestConnectException: Failed to open connection to 10.0.0.4:5598
at com.datastax.bdp.fs.rest.client.RestClientConnection$$anonfun$2.applyO
io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:405)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:906)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.ExtendedClosedChannelException: null
at io.netty.channel.AbstractChannel$AbstractUnsafe.ensureOpen(...)(Unknown Source)
ERROR 2019-05-24 09:02:04,438 hive.log: Converting exception to MetaException
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1062)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:137)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:136)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:136)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:133)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:171)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:62)
at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:466)
at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:377)
at org.apache.spark.sql.SQLImplicits.localSeqToDatasetHolder(SQLImplicits.scala:213)
at tech.metis.bde.TestSparkApp$.delayedEndpoint$tech$metis$bde$TestSparkApp$1(TestSparkApp.scala:35)
at tech.metis.bde.TestSparkApp$delayedInit$body.apply(TestSparkApp.scala:20)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at tech.metis.bde.TestSparkApp$.main(TestSparkApp.scala:20)
at tech.metis.bde.TestSparkApp.main(TestSparkApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.DseSparkSubmit$.org$apache$spark$deploy$DseSparkSubmit$$runMain(DseSparkSubmit.scala:662)
at org.apache.spark.deploy.DseSparkSubmit$.doRunMain$1(DseSparkSubmit.scala:167)
at org.apache.spark.deploy.DseSparkSubmit$.submit(DseSparkSubmit.scala:192)
at org.apache.spark.deploy.DseSparkSubmit$.main(DseSparkSubmit.scala:106)
at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106)
at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala)
Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
... 54 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 60 more
Caused by: MetaException(message:Got exception: com.datastax.bdp.fs.client.DseFsConnectionException Could not connect to DSEFS. Last endpoint tried: 10.0.0.4:5598)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:1213)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:106)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:140)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:146)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:601)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
... 65 more
Part of the output has been omitted.
The most probable cause is that you have compiled your fat jar with Spark or Scala versions different from those used by DSE, so it contains incompatible components.
Usually, it's enough to specify only the following dependency in your build file (it's located in the DataStax repository):
<dependency>
    <groupId>com.datastax.dse</groupId>
    <artifactId>dse-spark-dependencies</artifactId>
    <version>6.0.7</version>
    <scope>provided</scope>
</dependency>
and that's all (of course, you still need to add your own application dependencies): just compile, get the jar file, and run. You can find more examples of building for DSE in the DataStax repo.
Also, please note that everything you specify after the name of the jar file is treated as application parameters, so you may need to move --master before the name of the jar file.
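For example, something like this (a sketch that reuses the command from the question; only the argument order changes):
dse -u cassandra -p <password> spark-submit --class my.package.bde.TestSparkApp --master dse://10.0.0.4? target/big-data-engine-0.0.1-jar-with-dependencies.jar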
Getting this error whenever I build:
Error:java.lang.UnsupportedOperationException: Unresolved local class: com/toppr/dubbio/v3/base/BaseSocketActivity$serviceConnection$1
at org.jetbrains.kotlin.descriptors.NotFoundClasses$classes$1.invoke(NotFoundClasses.kt:44)
at org.jetbrains.kotlin.descriptors.NotFoundClasses$classes$1.invoke(NotFoundClasses.kt:32)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$MapBasedMemoizedFunction.invoke(LockBasedStorageManager.java:408)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$MapBasedMemoizedFunctionToNotNull.invoke(LockBasedStorageManager.java:483)
at org.jetbrains.kotlin.descriptors.NotFoundClasses.getClass(NotFoundClasses.kt:101)
at org.jetbrains.kotlin.serialization.deserialization.TypeDeserializer$typeConstructor$1.invoke(TypeDeserializer.kt:120)
at org.jetbrains.kotlin.serialization.deserialization.TypeDeserializer.typeConstructor(TypeDeserializer.kt:124)
at org.jetbrains.kotlin.serialization.deserialization.TypeDeserializer.simpleType(TypeDeserializer.kt:82)
at org.jetbrains.kotlin.serialization.deserialization.TypeDeserializer.type(TypeDeserializer.kt:70)
at org.jetbrains.kotlin.serialization.deserialization.TypeDeserializer.type$default(TypeDeserializer.kt:62)
at org.jetbrains.kotlin.serialization.deserialization.MemberDeserializer.loadProperty(MemberDeserializer.kt:67)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedMemberScope.computeProperties(DeserializedMemberScope.kt:123)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedMemberScope.access$computeProperties(DeserializedMemberScope.kt:35)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedMemberScope$properties$1.invoke(DeserializedMemberScope.kt:61)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedMemberScope$properties$1.invoke(DeserializedMemberScope.kt:35)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$MapBasedMemoizedFunction.invoke(LockBasedStorageManager.java:408)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$MapBasedMemoizedFunctionToNotNull.invoke(LockBasedStorageManager.java:483)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedMemberScope.getContributedVariables(DeserializedMemberScope.kt:137)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedClassDescriptor$DeserializedClassMemberScope.getContributedVariables(DeserializedClassDescriptor.kt:232)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedMemberScope.addFunctionsAndProperties(DeserializedMemberScope.kt:185)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedMemberScope.computeDescriptors(DeserializedMemberScope.kt:153)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedClassDescriptor$DeserializedClassMemberScope$allDescriptors$1.invoke(DeserializedClassDescriptor.kt:218)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedClassDescriptor$DeserializedClassMemberScope$allDescriptors$1.invoke(DeserializedClassDescriptor.kt:211)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedLazyValue.invoke(LockBasedStorageManager.java:323)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedNotNullLazyValue.invoke(LockBasedStorageManager.java:370)
at org.jetbrains.kotlin.serialization.deserialization.descriptors.DeserializedClassDescriptor$DeserializedClassMemberScope.getContributedDescriptors(DeserializedClassDescriptor.kt:223)
at org.jetbrains.kotlin.resolve.scopes.ResolutionScope$DefaultImpls.getContributedDescriptors$default(ResolutionScope.kt:40)
at org.jetbrains.kotlin.resolve.scopes.SubstitutingScope$_allDescriptors$2.invoke(SubstitutingScope.kt:36)
at org.jetbrains.kotlin.resolve.scopes.SubstitutingScope$_allDescriptors$2.invoke(SubstitutingScope.kt:30)
at kotlin.SynchronizedLazyImpl.getValue(Lazy.kt:130)
at org.jetbrains.kotlin.resolve.scopes.SubstitutingScope.get_allDescriptors(SubstitutingScope.kt)
at org.jetbrains.kotlin.resolve.scopes.SubstitutingScope.getContributedDescriptors(SubstitutingScope.kt:80)
at org.jetbrains.kotlin.resolve.scopes.ResolutionScope$DefaultImpls.getContributedDescriptors$default(ResolutionScope.kt:40)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassMemberScope.computeExtraDescriptors(LazyClassMemberScope.kt:71)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassMemberScope$extraDescriptors$1.invoke(LazyClassMemberScope.kt:58)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassMemberScope$extraDescriptors$1.invoke(LazyClassMemberScope.kt:47)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedLazyValue.invoke(LockBasedStorageManager.java:323)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedNotNullLazyValue.invoke(LockBasedStorageManager.java:370)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassMemberScope.getContributedDescriptors(LazyClassMemberScope.kt:64)
at org.jetbrains.kotlin.resolve.DescriptorUtils.getAllDescriptors(DescriptorUtils.java:578)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.resolveMemberHeaders(LazyClassDescriptor.java:569)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.doForceResolveAllContents(LazyClassDescriptor.java:539)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.lambda$new$4(LazyClassDescriptor.java:229)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedLazyValue.invoke(LockBasedStorageManager.java:323)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.forceResolveAllContents(LazyClassDescriptor.java:535)
at org.jetbrains.kotlin.resolve.lazy.ForceResolveUtil.doForceResolveAllContents(ForceResolveUtil.java:75)
at org.jetbrains.kotlin.resolve.lazy.ForceResolveUtil.forceResolveAllContents(ForceResolveUtil.java:41)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension$doAnalysis$1.invoke(PartialAnalysisHandlerExtension.kt:66)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension$doAnalysis$1.invoke(PartialAnalysisHandlerExtension.kt:34)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension.doForEachDeclaration(PartialAnalysisHandlerExtension.kt:119)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension.doForEachDeclaration(PartialAnalysisHandlerExtension.kt:133)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension.doAnalysis(PartialAnalysisHandlerExtension.kt:61)
at org.jetbrains.kotlin.kapt3.AbstractKapt3Extension.doAnalysis(Kapt3Extension.kt:145)
at org.jetbrains.kotlin.cli.jvm.compiler.TopDownAnalyzerFacadeForJVM.analyzeFilesWithJavaIntegration(TopDownAnalyzerFacadeForJVM.kt:104)
at org.jetbrains.kotlin.cli.jvm.compiler.TopDownAnalyzerFacadeForJVM.analyzeFilesWithJavaIntegration$default(TopDownAnalyzerFacadeForJVM.kt:83)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler$analyze$1.invoke(KotlinToJVMBytecodeCompiler.kt:376)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler$analyze$1.invoke(KotlinToJVMBytecodeCompiler.kt:67)
at org.jetbrains.kotlin.cli.common.messages.AnalyzerWithCompilerReport.analyzeAndReport(AnalyzerWithCompilerReport.kt:96)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler.analyze(KotlinToJVMBytecodeCompiler.kt:367)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler.compileModules(KotlinToJVMBytecodeCompiler.kt:132)
at org.jetbrains.kotlin.cli.jvm.K2JVMCompiler.doExecute(K2JVMCompiler.kt:158)
at org.jetbrains.kotlin.cli.jvm.K2JVMCompiler.doExecute(K2JVMCompiler.kt:61)
at org.jetbrains.kotlin.cli.common.CLICompiler.execImpl(CLICompiler.java:107)
at org.jetbrains.kotlin.cli.common.CLICompiler.execImpl(CLICompiler.java:51)
at org.jetbrains.kotlin.cli.common.CLITool.exec(CLITool.kt:92)
at org.jetbrains.kotlin.incremental.IncrementalJvmCompilerRunner.runCompiler(IncrementalJvmCompilerRunner.kt:353)
at org.jetbrains.kotlin.incremental.IncrementalJvmCompilerRunner.runCompiler(IncrementalJvmCompilerRunner.kt:86)
at org.jetbrains.kotlin.incremental.IncrementalCompilerRunner.compileIncrementally(IncrementalCompilerRunner.kt:213)
at org.jetbrains.kotlin.incremental.IncrementalCompilerRunner.compile(IncrementalCompilerRunner.kt:83)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.execIncrementalCompiler(CompileServiceImpl.kt:515)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.access$execIncrementalCompiler(CompileServiceImpl.kt:96)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$compile$$inlined$ifAlive$lambda$2.invoke(CompileServiceImpl.kt:399)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$compile$$inlined$ifAlive$lambda$2.invoke(CompileServiceImpl.kt:96)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$doCompile$$inlined$ifAlive$lambda$2.invoke(CompileServiceImpl.kt:892)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$doCompile$$inlined$ifAlive$lambda$2.invoke(CompileServiceImpl.kt:96)
at org.jetbrains.kotlin.daemon.common.DummyProfiler.withMeasure(PerfUtils.kt:137)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.checkedCompile(CompileServiceImpl.kt:919)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.doCompile(CompileServiceImpl.kt:891)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.compile(CompileServiceImpl.kt:398)
at sun.reflect.GeneratedMethodAccessor91.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:346)
at sun.rmi.transport.Transport$1.run(Transport.java:200)
at sun.rmi.transport.Transport$1.run(Transport.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[KOTLIN] deleting /Users/mayankgupta/StudioProjects/toppr-android-chat/app/build/tmp/kapt3/incrementalData/debug on error
[KOTLIN] deleting /Users/mayankgupta/StudioProjects/toppr-android-chat/app/build/tmp/kapt3/incrementalData/debug on error
After a clean and then a build, it works fine. The first build attempt always throws this error.
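For reference, the sequence that reliably works for me is the following (the task name is an assumption for a standard Gradle Android setup; use whatever build task you normally run):
./gradlew clean
./gradlew assembleDebug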
I am using an Amazon machine to run the PySpark code.
Code in the PySpark shell:
a = open("test.txt")
s = sc.parallelize(a)
print(s.count())
This works; I am not able to use sc.textFile("test.txt") directly due to some issues.
Code in the Python file:
from pyspark import SparkContext

sc = SparkContext()
with open("test.txt") as f:
    s = sc.parallelize(f)
    print(s.count())
When I run it with spark-submit, I get a "Name or service not known" error:
ubuntu@10-0-0-32:~/Deepak/projects$ spark-submit test1.py
16/06/12 03:44:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/12 03:44:59 ERROR : 10-0-0-32: 10-0-0-32: Name or service not known
java.net.UnknownHostException: 10-0-0-32: 10-0-0-32: Name or service not known
at java.net.InetAddress.getLocalHost(InetAddress.java:1496)
at tachyon.util.network.NetworkAddressUtils.getLocalIpAddress(NetworkAddressUtils.java:355)
at tachyon.util.network.NetworkAddressUtils.getLocalHostName(NetworkAddressUtils.java:320)
at tachyon.conf.TachyonConf.<init>(TachyonConf.java:122)
at tachyon.conf.TachyonConf.<init>(TachyonConf.java:111)
at tachyon.Version.<clinit>(Version.java:27)
at tachyon.Constants.<clinit>(Constants.java:328)
at tachyon.hadoop.AbstractTFS.<clinit>(AbstractTFS.java:63)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:383)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2364)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2375)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:1362)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:1340)
at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:491)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:209)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: 10-0-0-32: Name or service not known
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:922)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1316)
at java.net.InetAddress.getLocalHost(InetAddress.java:1492)
... 40 more
16/06/12 03:44:59 ERROR SparkContext: Error initializing SparkContext.
java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider tachyon.hadoop.TFS could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:224)
at java.util.ServiceLoader.access$100(ServiceLoader.java:181)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:377)
at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2364)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2375)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:1362)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:1340)
at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:491)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:209)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ExceptionInInitializerError
at tachyon.Constants.<clinit>(Constants.java:328)
at tachyon.hadoop.AbstractTFS.<clinit>(AbstractTFS.java:63)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:383)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
... 27 more
Caused by: java.lang.RuntimeException: java.net.UnknownHostException: 10-0-0-32: 10-0-0-32: Name or service not known
at org.spark-project.guava.base.Throwables.propagate(Throwables.java:160)
at tachyon.util.network.NetworkAddressUtils.getLocalIpAddress(NetworkAddressUtils.java:398)
at tachyon.util.network.NetworkAddressUtils.getLocalHostName(NetworkAddressUtils.java:320)
at tachyon.conf.TachyonConf.<init>(TachyonConf.java:122)
at tachyon.conf.TachyonConf.<init>(TachyonConf.java:111)
at tachyon.Version.<clinit>(Version.java:27)
... 35 more
Caused by: java.net.UnknownHostException: 10-0-0-32: 10-0-0-32: Name or service not known
at java.net.InetAddress.getLocalHost(InetAddress.java:1496)
at tachyon.util.network.NetworkAddressUtils.getLocalIpAddress(NetworkAddressUtils.java:355)
... 39 more
Caused by: java.net.UnknownHostException: 10-0-0-32: Name or service not known
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:922)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1316)
at java.net.InetAddress.getLocalHost(InetAddress.java:1492)
... 40 more
16/06/12 03:44:59 WARN MetricsSystem: Stopping a MetricsSystem that is not running
Traceback (most recent call last):
File "/home/ubuntu/Deepak/projects/test1.py", line 2, in <module>
sc = SparkContext("local", "test1", pyFiles=['test1.py'])
File "/home/ubuntu/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
File "/home/ubuntu/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/context.py", line 172, in _do_init
File "/home/ubuntu/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/context.py", line 235, in _initialize_context
File "/home/ubuntu/spark-1.6.0-bin-hadoop2.4/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
File "/home/ubuntu/spark-1.6.0-bin-hadoop2.4/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider tachyon.hadoop.TFS could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:224)
at java.util.ServiceLoader.access$100(ServiceLoader.java:181)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:377)
at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2364)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2375)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:1362)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:1340)
at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:491)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:209)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ExceptionInInitializerError
at tachyon.Constants.<clinit>(Constants.java:328)
at tachyon.hadoop.AbstractTFS.<clinit>(AbstractTFS.java:63)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:383)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
... 27 more
Caused by: java.lang.RuntimeException: java.net.UnknownHostException: 10-0-0-32: 10-0-0-32: Name or service not known
at org.spark-project.guava.base.Throwables.propagate(Throwables.java:160)
at tachyon.util.network.NetworkAddressUtils.getLocalIpAddress(NetworkAddressUtils.java:398)
at tachyon.util.network.NetworkAddressUtils.getLocalHostName(NetworkAddressUtils.java:320)
at tachyon.conf.TachyonConf.<init>(TachyonConf.java:122)
at tachyon.conf.TachyonConf.<init>(TachyonConf.java:111)
at tachyon.Version.<clinit>(Version.java:27)
... 35 more
Caused by: java.net.UnknownHostException: 10-0-0-32: 10-0-0-32: Name or service not known
at java.net.InetAddress.getLocalHost(InetAddress.java:1496)
at tachyon.util.network.NetworkAddressUtils.getLocalIpAddress(NetworkAddressUtils.java:355)
... 39 more
Caused by: java.net.UnknownHostException: 10-0-0-32: Name or service not known
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:922)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1316)
at java.net.InetAddress.getLocalHost(InetAddress.java:1492)
... 40 more
I added the hostname to the /etc/hosts file.
Previously I had it as:
IP ubuntu(username) alias_name
I changed it to:
IP hostname alias_name
The confusing part is that, since I am using an Amazon machine, my IP and hostname are the same.
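For illustration, the kind of entry I mean is roughly this (the 10.0.0.32 address is my assumption, inferred from the hostname 10-0-0-32 in the error; it should be the machine's actual private IP):
127.0.0.1   localhost
10.0.0.32   10-0-0-32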
When a large class with 2000+ lines is added to @PrepareForTest, the following error is encountered:
java.lang.IllegalStateException: Failed to transform class with name com.gnx.company.UtilityClass Reason: javassist.bytecode.Utf8Info cannot be cast to javassist.bytecode.ClassInfo
at org.powermock.core.classloader.MockClassLoader.loadMockClass(MockClassLoader.java:247)
at org.powermock.core.classloader.MockClassLoader.loadModifiedClass(MockClassLoader.java:177)
at org.powermock.core.classloader.DeferSupportingClassLoader.loadClass(DeferSupportingClassLoader.java:68)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at sun.reflect.generics.factory.CoreReflectionFactory.makeNamedType(CoreReflectionFactory.java:114)
at sun.reflect.generics.visitor.Reifier.visitClassTypeSignature(Reifier.java:125)
at sun.reflect.generics.tree.ClassTypeSignature.accept(ClassTypeSignature.java:49)
at sun.reflect.annotation.AnnotationParser.parseSig(AnnotationParser.java:432)
at sun.reflect.annotation.AnnotationParser.parseClassValue(AnnotationParser.java:413)
at sun.reflect.annotation.AnnotationParser.parseClassArray(AnnotationParser.java:715)
at sun.reflect.annotation.AnnotationParser.parseArray(AnnotationParser.java:522)
at sun.reflect.annotation.AnnotationParser.parseMemberValue(AnnotationParser.java:348)
at sun.reflect.annotation.AnnotationParser.parseAnnotation2(AnnotationParser.java:283)
at sun.reflect.annotation.AnnotationParser.parseAnnotations2(AnnotationParser.java:117)
at sun.reflect.annotation.AnnotationParser.parseAnnotations(AnnotationParser.java:70)
at java.lang.Class.initAnnotationsIfNecessary(Class.java:3271)
at java.lang.Class.getAnnotation(Class.java:3219)
at org.junit.internal.MethodSorter.getDeclaredMethods(MethodSorter.java:52)
at org.junit.internal.runners.TestClass.getAnnotatedMethods(TestClass.java:45)
at org.junit.internal.runners.MethodValidator.validateTestMethods(MethodValidator.java:71)
at org.junit.internal.runners.MethodValidator.validateStaticMethods(MethodValidator.java:44)
at org.junit.internal.runners.MethodValidator.validateMethodsForDefaultRunner(MethodValidator.java:50)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.validate(PowerMockJUnit44RunnerDelegateImpl.java:108)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.<init>(PowerMockJUnit44RunnerDelegateImpl.java:70)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl.<init>(PowerMockJUnit47RunnerDelegateImpl.java:42)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit49RunnerDelegateImpl.<init>(PowerMockJUnit49RunnerDelegateImpl.java:25)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.powermock.modules.junit4.common.internal.impl.JUnit4TestSuiteChunkerImpl.createDelegatorFromClassloader(JUnit4TestSuiteChunkerImpl.java:149)
at org.powermock.modules.junit4.common.internal.impl.JUnit4TestSuiteChunkerImpl.createDelegatorFromClassloader(JUnit4TestSuiteChunkerImpl.java:39)
at org.powermock.tests.utils.impl.AbstractTestSuiteChunkerImpl.createTestDelegators(AbstractTestSuiteChunkerImpl.java:218)
at org.powermock.modules.junit4.common.internal.impl.JUnit4TestSuiteChunkerImpl.<init>(JUnit4TestSuiteChunkerImpl.java:59)
at org.powermock.modules.junit4.common.internal.impl.AbstractCommonPowerMockRunner.<init>(AbstractCommonPowerMockRunner.java:32)
at org.powermock.modules.junit4.PowerMockRunner.<init>(PowerMockRunner.java:33)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.junit.internal.builders.AnnotatedBuilder.buildRunner(AnnotatedBuilder.java:29)
at org.junit.internal.builders.AnnotatedBuilder.runnerForClass(AnnotatedBuilder.java:21)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.<init>(JUnit4TestReference.java:33)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestClassReference.<init>(JUnit4TestClassReference.java:25)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestLoader.createTest(JUnit4TestLoader.java:48)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestLoader.loadTests(JUnit4TestLoader.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:452)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
Caused by: java.lang.ClassCastException: javassist.bytecode.Utf8Info cannot be cast to javassist.bytecode.ClassInfo
at javassist.bytecode.ConstPool.getClassInfo(ConstPool.java:232)
at javassist.bytecode.stackmap.Tracer.doOpcode148_201(Tracer.java:631)
at javassist.bytecode.stackmap.Tracer.doOpcode(Tracer.java:81)
at javassist.bytecode.stackmap.MapMaker.make(MapMaker.java:187)
at javassist.bytecode.stackmap.MapMaker.make(MapMaker.java:199)
..............
The UtilityClass consists of 2300 lines of code. There are static methods inside the class that need to be accessed, so I need to add the class to @PrepareForTest.
Does anyone know how to fix this issue?
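For reference, the setup described above would look roughly like the following sketch. UtilityClass is the 2300-line class from the question; the test class name and the lookup(...) method are hypothetical, added only for illustration:
import static org.mockito.Mockito.when;
import static org.powermock.api.mockito.PowerMockito.mockStatic;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
// Preparing the class is what triggers the javassist instrumentation
// that fails with the ClassCastException in the trace above.
@PrepareForTest(UtilityClass.class)
public class UtilityClassTest
{
    @Test
    public void stubsStaticMethod()
    {
        mockStatic(UtilityClass.class);
        // lookup(...) is a hypothetical static method, used only for illustration
        when(UtilityClass.lookup("key")).thenReturn("stubbed");

        // ... exercise code under test that calls UtilityClass.lookup(...)
    }
}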
I am new to Arquillian and want to get some basic testing working (inject a bean and assert it does something).
Exception:
-------------------------------------------------------------------------------
Test set: com.walterjwhite.test.TestCase
-------------------------------------------------------------------------------
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 1.231 sec <<< FAILURE!
test(com.walterjwhite.test.TestCase) Time elapsed: 0.02 sec <<< ERROR!
java.lang.RuntimeException: Could not inject members
at org.jboss.arquillian.testenricher.cdi.CDIInjectionEnricher.injectClass(CDIInjectionEnricher.java:113)
at org.jboss.arquillian.testenricher.cdi.CDIInjectionEnricher.enrich(CDIInjectionEnricher.java:61)
at org.jboss.arquillian.impl.enricher.ClientTestEnricher.enrich(ClientTestEnricher.java:61)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.invokeObservers(EventContextImpl.java:98)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:80)
at org.jboss.arquillian.impl.client.ContainerDeploymentContextHandler.createContext(ContainerDeploymentContextHandler.java:133)
at org.jboss.arquillian.impl.client.ContainerDeploymentContextHandler.createBeforeContext(ContainerDeploymentContextHandler.java:115)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:87)
at org.jboss.arquillian.impl.TestContextHandler.createTestContext(TestContextHandler.java:82)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:87)
at org.jboss.arquillian.impl.TestContextHandler.createClassContext(TestContextHandler.java:68)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:87)
at org.jboss.arquillian.impl.TestContextHandler.createSuiteContext(TestContextHandler.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.jboss.arquillian.impl.core.ObserverImpl.invoke(ObserverImpl.java:90)
at org.jboss.arquillian.impl.core.EventContextImpl.proceed(EventContextImpl.java:87)
at org.jboss.arquillian.impl.core.ManagerImpl.fire(ManagerImpl.java:126)
at org.jboss.arquillian.impl.core.ManagerImpl.fire(ManagerImpl.java:106)
at org.jboss.arquillian.impl.EventTestRunnerAdaptor.before(EventTestRunnerAdaptor.java:85)
at org.jboss.arquillian.junit.Arquillian$4.evaluate(Arquillian.java:210)
at org.jboss.arquillian.junit.Arquillian.multiExecute(Arquillian.java:303)
at org.jboss.arquillian.junit.Arquillian.access$300(Arquillian.java:45)
at org.jboss.arquillian.junit.Arquillian$5.evaluate(Arquillian.java:228)
at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.jboss.arquillian.junit.Arquillian$2.evaluate(Arquillian.java:173)
at org.jboss.arquillian.junit.Arquillian.multiExecute(Arquillian.java:303)
at org.jboss.arquillian.junit.Arquillian.access$300(Arquillian.java:45)
at org.jboss.arquillian.junit.Arquillian$3.evaluate(Arquillian.java:187)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.jboss.arquillian.junit.Arquillian.run(Arquillian.java:127)
at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:35)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:115)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:97)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.maven.surefire.booter.ProviderFactory$ClassLoaderProxy.invoke(ProviderFactory.java:103)
at $Proxy0.invoke(Unknown Source)
at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:150)
at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcess(SurefireStarter.java:91)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:69)
Caused by: java.lang.NullPointerException
at org.jboss.arquillian.testenricher.cdi.CDIInjectionEnricher.getBeanManager(CDIInjectionEnricher.java:51)
at org.jboss.arquillian.testenricher.cdi.CDIInjectionEnricher.injectClass(CDIInjectionEnricher.java:100)
... 71 more
TestCase class:
@RunWith(Arquillian.class)
public class TestCase
{
    @Deployment
    public static JavaArchive createDeployment()
    {
        return ShrinkWrap.create(JavaArchive.class)
                .addClasses(TestEntity.class, Implementation.class)
                .addAsManifestResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml"));
    }

    @Inject
    Implementation implementation;

    @Test
    public void test() throws Exception
    {
        final TestEntity testEntity = implementation.create();
        Assert.assertNotNull(testEntity);
    }
}
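For context, Implementation here is an ordinary CDI bean; a minimal sketch (the method body below is assumed, not taken from the original post) could be:
public class Implementation
{
    // Hypothetical body; only the names Implementation and TestEntity
    // come from the test class above.
    public TestEntity create()
    {
        return new TestEntity();
    }
}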
When I run this, I get a NullPointerException: the bean manager is null. It seems I am missing a step, but from the examples, this looks like all I should need.
Any ideas?
Walter
It looks to me like you are trying to use the latest snapshot. Currently the Arquillian repository is being restructured, so unless you have a specific reason for tracking HEAD, I recommend using Alpha5.
You can see working CDI examples in the Arquillian showcase. http://github.com/arquillian/arquillian-showcase
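A quick sanity check (my suggestion, not from the original answer) is to dump the archive contents and confirm that beans.xml is actually packaged; ShrinkWrap archives print their entries via toString(true):
JavaArchive jar = ShrinkWrap.create(JavaArchive.class)
        .addClasses(TestEntity.class, Implementation.class)
        .addAsManifestResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml"));
// Prints the archive layout (e.g. /META-INF/beans.xml), so a missing
// beans.xml -- and hence a disabled CDI bean archive -- is easy to spot.
System.out.println(jar.toString(true));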
I had a similar problem with Arquillian 1.0.4.Final: "java.lang.RuntimeException: Could not inject members" plus a stack trace, on tests that had been running for a year. The JUnit tests worked when I ran the test classes one by one, but failed when running all tests at once, either as a JUnit run of the whole module or as a Maven site test of the whole project. In the end, nothing solved the problem except resetting the Arquillian version back to 1.0.3.Final!