In the process of my daily work, I am trying to use the WSDL and XSD from this article:
http://www.ibm.com/developerworks/webservices/library/ws-restwsdl/
as a template from which to generate some Java code. I want to use the generated Java code to validate, in some way, that my hand-rolled (to-be) WSDL and schema are reasonably sane. The problem is, when I run something like:
...WSDL2Java --noBuildXML --unpack-classes -uri booklist.wsdl -wv 2.0
I get this exception:
[java] Exception in thread "main" org.apache.axis2.wsdl.codegen.CodeGenerationException: Error parsing WSDL
[java] at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.<init>(CodeGenerationEngine.java:159)
[java] at org.apache.axis2.wsdl.WSDL2Code.main(WSDL2Code.java:35)
[java] at org.apache.axis2.wsdl.WSDL2Java.main(WSDL2Java.java:24)
[java] Caused by: java.lang.NullPointerException
[java] at org.apache.xerces.impl.xs.opti.SchemaParsingConfig.setFeature(Unknown Source)
[java] at org.apache.xerces.parsers.BasicParserConfiguration.<init>(BasicParserConfiguration.java:261)
[java] at org.apache.xerces.impl.xs.opti.SchemaParsingConfig.<init>(Unknown Source)
[java] at org.apache.xerces.impl.xs.opti.SchemaParsingConfig.<init>(Unknown Source)
[java] at org.apache.xerces.impl.xs.traversers.XSDHandler.<init>(XSDHandler.java:340)
[java] at org.apache.xerces.impl.xs.traversers.XSDHandler.<init>(XSDHandler.java:347)
[java] at org.apache.xerces.impl.xs.XMLSchemaValidator.<init>(XMLSchemaValidator.java:1086)
[java] at org.apache.xerces.parsers.StandardParserConfiguration.configurePipeline(StandardParserConfiguration.java:673)
[java] at org.apache.xerces.parsers.StandardParserConfiguration.reset(StandardParserConfiguration.java:627)
[java] at org.apache.xerces.parsers.StandardParserConfiguration.parse(StandardParserConfiguration.java:502)
[java] at org.apache.xerces.parsers.StandardParserConfiguration.parse(StandardParserConfiguration.java:585)
[java] at org.apache.xerces.parsers.XMLParser.parse(XMLParser.java:147)
[java] at org.apache.xerces.parsers.DOMParser.parse(DOMParser.java:221)
[java] at org.apache.woden.internal.DOMWSDLReader.getDocument(DOMWSDLReader.java:735)
[java] at org.apache.woden.internal.DOMWSDLReader.retrieveSchema(DOMWSDLReader.java:629)
[java] at org.apache.woden.internal.DOMWSDLReader.parseSchemaImport(DOMWSDLReader.java:380)
[java] at org.apache.woden.internal.BaseWSDLReader.parseTypes(BaseWSDLReader.java:573)
[java] at org.apache.woden.internal.BaseWSDLReader.parseDescription(BaseWSDLReader.java:429)
[java] at org.apache.woden.internal.DOMWSDLReader.readWSDL(DOMWSDLReader.java:185)
[java] at org.apache.woden.internal.DOMWSDLReader.readWSDL(DOMWSDLReader.java:158)
[java] at org.apache.axis2.description.WSDL20ToAxisServiceBuilder.readInTheWSDLFile(WSDL20ToAxisServiceBuilder.java:1225)
[java] at org.apache.axis2.description.WSDL20ToAxisServiceBuilder.readInTheWSDLFile(WSDL20ToAxisServiceBuilder.java:1176)
[java] at org.apache.axis2.description.WSDL20ToAxisServiceBuilder.<init>(WSDL20ToAxisServiceBuilder.java:153)
[java] at org.apache.axis2.description.WSDL20ToAllAxisServicesBuilder.<init>(WSDL20ToAllAxisServicesBuilder.java:53)
[java] at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.<init>(CodeGenerationEngine.java:102)
[java] ... 2 more
Can I not generate the code from files merely sitting in the current directory? (I have done this before, but it was a WSDL 1.1 file which did not have external supporting schema files)
Is there a known problem with this example WSDL?
Is there a bug in Axis2 (version 1.5) WSDL2Java? (Obviously, it would be nice to get an error message about the missing data instead of a NullPointerException being thrown.)
Related
I am not able to connect to a BigQuery table from Spark on GCP.
https://cloud.google.com/hadoop/bigquery-connector
I already tried the steps in the link above, providing the project ID, dataset name, and table name, but still no success. When I try to print the data from my code, I get the error below:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/cloud/hadoop/io/bigquery/BigQueryConfiguration
at Main.main(Main.scala:27)
at Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.cloud.hadoop.io.bigquery.BigQueryConfiguration
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 14 more
You are probably missing the spark-bigquery-connector from your classpath; you can add it with the following parameter:
gcloud dataproc jobs submit spark --cluster "$MY_CLUSTER" --jars gs://spark-lib/bigquery/spark-bigquery-latest.jar ...
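Once the connector jar is on the classpath, reading a table looks roughly like the sketch below. This is a minimal example, not the poster's code: the app name and the table reference my-project.my_dataset.my_table are placeholders, and it assumes the spark-bigquery-connector's "bigquery" data source format.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BigQueryReadExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("bigquery-read-example")
                .getOrCreate();

        // Read a BigQuery table via the spark-bigquery-connector.
        // "my-project.my_dataset.my_table" is a placeholder reference.
        Dataset<Row> df = spark.read()
                .format("bigquery")
                .option("table", "my-project.my_dataset.my_table")
                .load();

        df.show();
        spark.stop();
    }
}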
I have a cluster with CDH 5.8.4. I'm running a Spark Streaming application which reads and writes data from/to HBase using the Cloudera spark-hbase connector, namely the HBaseContext.
When I start the application, I supply the principal and the keytab to the spark-submit script.
I'm seeing that after 7 days the application crashes with an error about the expiration of the Kerberos ticket related to the HBase context. This is the error from the executors' log:
ERROR executor.Executor: Exception in task 0.0 in stage 544265.0 (TID 1149098)
org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:326)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:157)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:61)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
at org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl.restart(TableRecordReaderImpl.java:91)
at org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl.initialize(TableRecordReaderImpl.java:169)
at org.apache.hadoop.hbase.mapreduce.TableRecordReader.initialize(TableRecordReader.java:134)
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase$1.initialize(TableInputFormatBase.java:211)
at org.apache.spark.rdd.NewHadoopRDD$$anon$1.<init>(NewHadoopRDD.scala:164)
at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:129)
at org.apache.hadoop.hbase.spark.NewHBaseRDD.compute(NewHBaseRDD.scala:34)
at org.apache.hadoop.hbase.spark.NewHBaseRDD.compute(NewHBaseRDD.scala:25)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.token.SecretManager$InvalidToken: Token has expired
at sun.reflect.GeneratedConstructorAccessor58.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:327)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1593)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:315)
... 30 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): Token has expired
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.readStatus(HBaseSaslRpcClient.java:155)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:222)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1783)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1589)
Does anyone know how to solve this issue?
Thanks in advance,
Beniamino
We (Splice Machine) had the same issue with a customer. Ours was caused by https://issues.apache.org/jira/browse/SPARK-12646. We wrote some code to fix the _HOST issue, and we also upgraded to Spark 2.2 to get around it.
You should not rely on an external ticket cache for distributed jobs. The best solution is to ship a keytab with your application or rely on a keytab being deployed on all nodes where your Spark task may be executed.
// Log in from the keytab shipped with the application, then create the
// HBase connection under those Kerberos credentials.
UserGroupInformation.loginUserFromKeytab("name@xyz.com", keyTab);
connection = ConnectionFactory.createConnection(conf);
With your approach above, you would need to do something like the following after obtaining the UserGroupInformation instance:
ugi.doAs(new PrivilegedExceptionAction<Void>() {
    @Override
    public Void run() throws Exception {
        // Runs with the Kerberos credentials of the logged-in UGI;
        // 'connection' is assumed to be a field, as in the snippet above.
        connection = ConnectionFactory.createConnection(conf);
        ...
        return null;
    }
});
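(Note: the snippet uses PrivilegedExceptionAction rather than PrivilegedAction because ConnectionFactory.createConnection throws a checked IOException, which a plain PrivilegedAction.run() cannot propagate.)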
When I run the main class I get these errors:
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at org.apache.cassandra.cql.jdbc.CassandraDriver.<clinit>(CassandraDriver.java:52)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:188)
at oracledbtest.CqlJdbcTestBasic.main(CqlJdbcTestBasic.java:19)
Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
How can I create a new driver connection to Cassandra?
What exactly are you trying to do? The error simply means that you do not have the slf4j jar on your classpath.
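Once slf4j-api (plus a binding such as slf4j-simple) is on the classpath alongside the cassandra-jdbc driver, opening a connection looks roughly like this sketch; the host, port, and keyspace below are placeholders:
import java.sql.Connection;
import java.sql.DriverManager;

public class CassandraConnectExample {
    public static void main(String[] args) throws Exception {
        // Register the driver (this is the step that fails without slf4j).
        Class.forName("org.apache.cassandra.cql.jdbc.CassandraDriver");
        // "mykeyspace" is a placeholder; 9160 is Cassandra's Thrift port.
        Connection con = DriverManager.getConnection(
                "jdbc:cassandra://localhost:9160/mykeyspace");
        // ... execute statements here ...
        con.close();
    }
}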
I've configured the HTTP Connector in server.xml, adding some SSL features. I tried to set keyAlias to the name of the alias for a certain certificate (not the private key of the keystore). Then, when I start JBoss, I get something like:
2012-04-12 17:01:37,236 ERROR [org.apache.coyote.http11.Http11Protocol] Error initializing endpoint
java.io.IOException: Alias name <somealias> does not identify a key entry
I'm new to SSL configuration and core web security concepts as well. Thanks for your patience.
Edit: complete stacktrace follows:
at org.apache.tomcat.util.net.jsse.JSSESocketFactory.getKeyManagers(JSSESocketFactory.java:412)
at org.apache.tomcat.util.net.jsse.JSSESocketFactory.init(JSSESocketFactory.java:378)
at org.apache.tomcat.util.net.jsse.JSSESocketFactory.createSocket(JSSESocketFactory.java:135)
at org.apache.tomcat.util.net.JIoEndpoint.init(JIoEndpoint.java:497)
at org.apache.tomcat.util.net.JIoEndpoint.start(JIoEndpoint.java:514)
at org.apache.coyote.http11.Http11Protocol.start(Http11Protocol.java:203)
at org.apache.catalina.connector.Connector.start(Connector.java:1146)
at org.jboss.web.tomcat.service.JBossWeb.startConnectors(JBossWeb.java:601)
at org.jboss.web.tomcat.service.JBossWeb.handleNotification(JBossWeb.java:638)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.mx.notification.NotificationListenerProxy.invoke(NotificationListenerProxy.java:153)
at $Proxy46.handleNotification(Unknown Source)
at org.jboss.mx.util.JBossNotificationBroadcasterSupport.handleNotification(JBossNotificationBroadcasterSupport.java:127)
at org.jboss.mx.util.JBossNotificationBroadcasterSupport.sendNotification(JBossNotificationBroadcasterSupport.java:108)
at org.jboss.system.server.ServerImpl.sendNotification(ServerImpl.java:916)
at org.jboss.system.server.ServerImpl.doStart(ServerImpl.java:497)
at org.jboss.system.server.ServerImpl.start(ServerImpl.java:362)
at org.jboss.Main.boot(Main.java:200)
at org.jboss.Main$1.run(Main.java:508)
at java.lang.Thread.run(Thread.java:662)
It looks like you are not importing your keys properly. I'd recommend you review your steps against these two documents:
http://docs.jboss.org/jbossweb/3.0.x/ssl-howto.html
A shorter version is here
http://www.agentbob.info/agentbob/79-AB.html
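To double-check whether the alias is actually a key entry (the cause of the IOException above), a small diagnostic like the following sketch can help; the keystore path, password, and alias are placeholders:
import java.io.FileInputStream;
import java.security.KeyStore;

public class KeyAliasCheck {
    public static void main(String[] args) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        // Placeholder keystore path and password.
        try (FileInputStream in = new FileInputStream("keystore.jks")) {
            ks.load(in, "changeit".toCharArray());
        }
        String alias = "somealias"; // placeholder alias
        // A key entry holds a private key plus certificate chain; a plain
        // certificate entry will not work as the connector's keyAlias.
        System.out.println("isKeyEntry: " + ks.isKeyEntry(alias));
        System.out.println("isCertificateEntry: " + ks.isCertificateEntry(alias));
    }
}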
I have tried database connection pooling using BoneCP with Struts, but on running the program I got the following log. Please help.
Thanks in advance.
Feb 19, 2012 4:52:22 PM org.apache.catalina.core.StandardContext loadOnStartup
SEVERE: Servlet /DbcpDemo threw load() exception
javax.servlet.UnavailableException: Initializing application data source org.apache.struts.action.DATA_SOURCE
at org.apache.struts.action.ActionServlet.initModuleDataSources(ActionServlet.java:812)
at org.apache.struts.action.ActionServlet.init(ActionServlet.java:335)
at javax.servlet.GenericServlet.init(GenericServlet.java:212)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1173)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:993)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4421)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4734)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Don't use Struts to configure a data source. Struts is a presentation framework. It has nothing to do with databases.
See http://tomcat.apache.org/tomcat-7.0-doc/jndi-datasource-examples-howto.html for how to declare and use a DataSource in an application deployed on Tomcat.
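As a rough illustration of that how-to, the lookup in application code would be something like the sketch below; the resource name jdbc/MyDB is a placeholder that must match the <Resource> entry in Tomcat's context.xml:
import java.sql.Connection;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class DataSourceLookup {
    public static Connection getConnection() throws Exception {
        // Look up the container-managed connection pool via JNDI.
        Context initCtx = new InitialContext();
        Context envCtx = (Context) initCtx.lookup("java:comp/env");
        DataSource ds = (DataSource) envCtx.lookup("jdbc/MyDB"); // placeholder name
        return ds.getConnection();
    }
}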