I'm trying to set up a basic Titan example. Following the docs, I tried running bin/gremlin-server.sh -i com.thinkaurelius.titan titan-all 1.0.0, which throws:
Could not install the dependency: java.io.FileNotFoundException: /usr/share/titan/ext/titan-all/plugin/titan-all-1.0.0.jar (No such file or directory)
java.lang.RuntimeException: java.io.FileNotFoundException: /usr/share/titan/ext/titan-all/plugin/titan-all-1.0.0.jar (No such file or directory)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.apache.tinkerpop.gremlin.groovy.util.DependencyGrabber.getAdditionalDependencies(DependencyGrabber.groovy:165)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.apache.tinkerpop.gremlin.groovy.util.DependencyGrabber.copyDependenciesToPath(DependencyGrabber.groovy:99)
at org.apache.tinkerpop.gremlin.server.util.GremlinServerInstall.main(GremlinServerInstall.java:38)
Caused by: java.io.FileNotFoundException: /usr/share/titan/ext/titan-all/plugin/titan-all-1.0.0.jar (No such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:219)
at java.util.zip.ZipFile.<init>(ZipFile.java:149)
at java.util.jar.JarFile.<init>(JarFile.java:166)
at java.util.jar.JarFile.<init>(JarFile.java:130)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.apache.tinkerpop.gremlin.groovy.util.DependencyGrabber.getAdditionalDependencies(DependencyGrabber.groovy:148)
... 3 more
I also tried it from gremlin.sh:
root@ubuntu:/usr/share/titan# bin/gremlin.sh
         \,,,/
         (o o)
-----oOOo-(3)-oOOo-----
plugin activated: aurelius.titan
plugin activated: tinkerpop.server
plugin activated: tinkerpop.utilities
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/share/titan/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/titan/lib/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14:45:44 INFO org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph - HADOOP_GREMLIN_LIBS is set to: /usr/share/titan/lib
plugin activated: tinkerpop.hadoop
plugin activated: tinkerpop.tinkergraph
gremlin> :install com.thinkaurelius.titan titan-all 1.0.0
==>java.io.FileNotFoundException: /usr/share/titan/ext/titan-all/plugin/titan-all-1.0.0.jar (No such file or directory)
gremlin>
I've confirmed that Groovy has the file:
root@ubuntu:/usr/share/titan# ls ~/.groovy/grapes/com.thinkaurelius.titan/titan-all/jars
titan-all-1.0.0.jar
So now I'm stumped... Has anyone come across this before?
EDIT: Some notes on how I got here...
My first attempt at getting this working was to use the all-inclusive zip file as per the docs. I changed gremlin-server.yaml to:
graph: conf/titan-cassandra-es.properties
That threw:
407 [main] WARN org.apache.tinkerpop.gremlin.server.GremlinServer - Graph [graph] configured at [conf/titan-cassandra-es.properties] could not be instantiated and will not be available in Gremlin Server. GraphFactory message: Configuration must contain a valid 'gremlin.graph' setting
java.lang.RuntimeException: Configuration must contain a valid 'gremlin.graph' setting
OK, a simple Google search tells me I need to add this to conf/titan-cassandra-es.properties:
gremlin.graph=com.thinkaurelius.titan.core.TitanFactory
At which point, I get:
484 [main] WARN org.apache.tinkerpop.gremlin.server.GremlinServer - Graph [graph] configured at [conf/titan-cassandra-es.properties] could not be instantiated and will not be available in Gremlin Server. GraphFactory message: GraphFactory could not instantiate this Graph implementation [class com.thinkaurelius.titan.core.TitanFactory]
java.lang.RuntimeException: GraphFactory could not instantiate this Graph implementation [class com.thinkaurelius.titan.core.TitanFactory]
This leads me to believe that I'm missing com.thinkaurelius.titan.core.TitanFactory, which is curious, since $TITAN_HOME/lib does in fact contain titan-all-1.0.0.jar. So I assumed (perhaps wrongly) that I needed to run the titan-all install to make it actually load the jars...
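For reference, a minimal titan-cassandra-es.properties for this Cassandra/Elasticsearch pair looks roughly like this (a sketch; the storage/index keys mirror the shipped file, and the hostnames are placeholders):
gremlin.graph=com.thinkaurelius.titan.core.TitanFactory
storage.backend=cassandrathrift
storage.hostname=127.0.0.1
index.search.backend=elasticsearch
index.search.hostname=127.0.0.1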
The basic install for Titan is to unzip titan-1.0.0-hadoop1.zip. That is it!
Download it from http://titandb.io
http://s3.thinkaurelius.com/docs/titan/1.0.0/getting-started.html
It is already packaged with the Titan plugins, so you don't need to install them into the Gremlin Console or Gremlin Server.
If you want to try the Titan Server, there is a pre-packaged titan.sh script which automatically starts Cassandra and Elasticsearch with the server.
http://s3.thinkaurelius.com/docs/titan/1.0.0/server.html#_getting_started
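For example, running this from the unzipped distribution directory starts Cassandra, Elasticsearch, and Gremlin Server together:
bin/titan.sh start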
For anyone who comes across this strangeness: read the whole stack trace. It turns out that waaay at the bottom it actually had the real issue; it couldn't connect to Cassandra because I had not enabled Thrift.
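For the record, enabling Thrift is a Cassandra-side setting (a sketch; whether Thrift is on by default depends on the Cassandra version):
# cassandra.yaml: Titan 1.0's cassandrathrift backend needs the Thrift RPC server enabled
start_rpc: true
Or enable it on a running node with: nodetool enablethrift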
I am experimenting with Spark reading from a Kafka topic, following the "Structured Streaming + Kafka Integration Guide".
Spark version: 3.2.1
Scala version: 2.12.15
Following the guide's instructions for including the dependencies with spark-shell, I start my shell:
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.1
However, once I run something like the following in my shell:
val df = spark.readStream.format("kafka").option("kafka.bootstrap.servers","http://HOST:PORT").option("subscribe", "my-topic").load()
I get the following exception:
java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArraySerializer
Any ideas how to overcome this issue?
My assumption was that with --packages, all dependencies would be loaded as well. But this does not seem to be the case. From the logs, I assume the package gets loaded successfully, including the kafka-clients dependency:
org.apache.spark#spark-sql-kafka-0-10_2.12 added as a dependency
resolving dependencies :: org.apache.spark#spark-submit-parent-3b04f646-471c-4cc8-88fb-7e32bc3226ed;1.0
confs: [default]
found org.apache.spark#spark-sql-kafka-0-10_2.12;3.2.1 in central
found org.apache.spark#spark-token-provider-kafka-0-10_2.12;3.2.1 in central
found org.apache.kafka#kafka-clients;2.8.0 in central
found org.lz4#lz4-java;1.7.1 in central
found org.xerial.snappy#snappy-java;1.1.8.4 in central
found org.slf4j#slf4j-api;1.7.30 in central
found org.apache.hadoop#hadoop-client-runtime;3.3.1 in central
found org.spark-project.spark#unused;1.0.0 in central
found org.apache.hadoop#hadoop-client-api;3.3.1 in central
found org.apache.htrace#htrace-core4;4.1.0-incubating in central
found commons-logging#commons-logging;1.1.3 in central
found com.google.code.findbugs#jsr305;3.0.0 in central
found org.apache.commons#commons-pool2;2.6.2 in central
The logs seem fine, but you can try including the kafka-clients dependency in the --packages argument as well.
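For example (a sketch; the kafka-clients version matches the one resolved in your logs):
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.1,org.apache.kafka:kafka-clients:2.8.0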
Otherwise, I'd suggest creating an uber jar instead of downloading the libraries every time you submit the app.
I'm trying to connect Spark to my Elasticsearch cluster with SSL.
Setup
Spark 2.4.0 from CDH 6.3.2 (Cloudera)
ElasticSearch 7.6.1 (Open Distro)
elasticsearch-hadoop-7.6.1.jar
Considering that:
1) I already managed to authenticate Logstash with SSL and a manually created PKCS12 keystore
2) The Spark-to-ES connection works without security
Here is the Spark conf provided:
spark.es.nodes=node1
spark.es.port=9200
spark.es.net.ssl=true
spark.es.net.ssl.keystore.location= ===> see below for what I tried
spark.es.net.ssl.keystore.type=PKCS12
spark.es.net.ssl.cert.allow.self.signed=true
spark.es.net.http.auth.user=admin
spark.es.net.http.auth.pass=admin
spark.es.nodes.wan.only=false //tried true
Doing:
spark.read.format("org.elasticsearch.spark.sql")
.option("es.query", "?q=*:*")
.load("spark/docs")
.show
====================================================
Filesystem values tried with spark.es.net.ssl.keystore.location (after copying admin.pkcs12 to all nodes):
file:///PATH/certs/admin.pkcs12
Error:
org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
... elided
Caused by: org.elasticsearch.hadoop.EsHadoopIllegalStateException: Cannot initialize SSL - Get Key failed: null
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:175)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.getSSLContext(SSLSocketFactory.java:160)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSocket(SSLSocketFactory.java:129)
at org.apache.commons.httpclient.HttpConnection.open(HttpConnection.java:707)
at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:387)
at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:323)
at org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport.doExecute(CommonsHttpTransport.java:685)
at org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport.execute(CommonsHttpTransport.java:664)
at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:116)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:432)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:428)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:388)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:392)
at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:168)
at org.elasticsearch.hadoop.rest.RestClient.mainInfo(RestClient.java:745)
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverClusterInfo(InitializationUtils.java:330)
... 61 more
Caused by: java.security.UnrecoverableKeyException: Get Key failed: null
at sun.security.pkcs12.PKCS12KeyStore.engineGetKey(PKCS12KeyStore.java:435)
at java.security.KeyStore.getKey(KeyStore.java:1023)
at sun.security.ssl.SunX509KeyManagerImpl.<init>(SunX509KeyManagerImpl.java:133)
at sun.security.ssl.KeyManagerFactoryImpl$SunX509.engineInit(KeyManagerFactoryImpl.java:70)
at javax.net.ssl.KeyManagerFactory.init(KeyManagerFactory.java:256)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadKeyManagers(SSLSocketFactory.java:217)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:173)
... 78 more
Caused by: java.lang.NullPointerException
at sun.security.pkcs12.PKCS12KeyStore.engineGetKey(PKCS12KeyStore.java:374)
... 84 more
====================================================
I copied a valid keystore, admin.pkcs12, to HDFS => /user/company/ with 777 rights (as I'm writing this: is that too permissive, as with ssh keys?)
//returns true
FileSystem.get(spark.sparkContext.hadoopConfiguration).exists(new Path("hdfs://namenode:8020/user/company/admin.pkcs12"))
HDFS values tried with spark.es.net.ssl.keystore.location:
hdfs:///namenode:8020/user/company/admin.pkcs12
hdfs://namenode:8020/user/company/admin.pkcs12
/user/company/admin.pkcs12
Error:
org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
... elided
Caused by: org.elasticsearch.hadoop.EsHadoopIllegalStateException: Cannot initialize SSL - Expected to find keystore file at [...] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI.
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:175)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.getSSLContext(SSLSocketFactory.java:160)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSocket(SSLSocketFactory.java:129)
at org.apache.commons.httpclient.HttpConnection.open(HttpConnection.java:707)
at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:387)
at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:323)
at org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport.doExecute(CommonsHttpTransport.java:685)
at org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport.execute(CommonsHttpTransport.java:664)
at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:116)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:432)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:428)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:388)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:392)
at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:168)
at org.elasticsearch.hadoop.rest.RestClient.mainInfo(RestClient.java:745)
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverClusterInfo(InitializationUtils.java:330)
... 61 more
Caused by: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Expected to find keystore file at [...] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI.
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadKeyStore(SSLSocketFactory.java:195)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadKeyManagers(SSLSocketFactory.java:215)
at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:173)
I tried JKS too.
What am I missing?
I was getting this error because of the missing keystore password:
spark.es.net.ssl.keystore.pass=PASSWORD
With that set, this location works:
file:///PATH/certs/admin.pkcs12
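Putting it together, the working conf is (a sketch assembled from the settings above; host, path, and password are placeholders):
spark.es.nodes=node1
spark.es.port=9200
spark.es.net.ssl=true
spark.es.net.ssl.keystore.location=file:///PATH/certs/admin.pkcs12
spark.es.net.ssl.keystore.pass=PASSWORD
spark.es.net.ssl.keystore.type=PKCS12
spark.es.net.ssl.cert.allow.self.signed=true
spark.es.net.http.auth.user=admin
spark.es.net.http.auth.pass=admin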
I have upgraded the WebLogic version on a Linux server by changing the wl_home path in setDomainEnv.sh from 12.1.2 to 12.1.3 and restarting. When restarting, it gives the errors below.
I'd appreciate it if anyone can give me an idea about this.
java.lang.IllegalAccessError: tried to access method com.bea.logging.LogBufferHandler.bufferLogObject(Ljava/lang/Object;)V from class weblogic.logging.log4j.WLLog4jMemoryBufferAppender
java.lang.IllegalStateException: Unable to perform operation: post construct on weblogic.diagnostics.lifecycle.LoggingServerService
java.lang.IllegalArgumentException: While attempting to resolve the dependencies of weblogic.diagnostics.lifecycle.DiagnosticFoundationService errors were found
java.lang.IllegalStateException: Unable to perform operation: resolve on weblogic.diagnostics.lifecycle.DiagnosticFoundationService
java.lang.IllegalArgumentException: While attempting to resolve the dependencies of com.oracle.injection.integration.CDIIntegrationService errors were found
java.lang.IllegalStateException: Unable to perform operation: resolve on com.oracle.injection.integration.CDIIntegrationService
Use the wllog4j.jar that is part of the WLS 12.1.3 build, i.e. the one in the WLS_HOME/server/lib directory.
If your domain-home/lib folder holds the older wllog4j.jar that was part of 12.1.2, you will face this issue.
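A sketch of the swap (WL_HOME points at the 12.1.3 install, DOMAIN_HOME at your domain; back up first):
# move the stale 12.1.2 jar aside, then copy in the 12.1.3 one
mv $DOMAIN_HOME/lib/wllog4j.jar $DOMAIN_HOME/lib/wllog4j.jar.12.1.2.bak
cp $WL_HOME/server/lib/wllog4j.jar $DOMAIN_HOME/lib/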
I am setting up JMeter with Groovy for Cassandra DB.
However, I cannot fix these errors.
Could you help me?
Response message: javax.script.ScriptException:
org.codehaus.groovy.control.MultipleCompilationErrorsException:
startup failed: General error during class generation:
java.lang.NoClassDefFoundError: Unable to load class
com.datastax.driver.core.Session due to missing dependency
org/apache/cassandra/transport/Message$Request
You need to add:
The Cassandra Java driver itself (cassandra-driver-core)
All its dependencies, to wit:
asm-5.0.3.jar
asm-analysis-5.0.3.jar
asm-commons-5.0.3.jar
asm-tree-5.0.3.jar
asm-util-5.0.3.jar
guava-19.0.jar
jffi-1.2.16.jar
jffi-1.2.16-native.jar
jnr-constants-0.9.9.jar
jnr-ffi-2.1.7.jar
jnr-posix-3.0.44.jar
jnr-x86asm-1.0.2.jar
metrics-core-3.2.2.jar
netty-buffer-4.0.56.Final.jar
netty-codec-4.0.56.Final.jar
netty-common-4.0.56.Final.jar
netty-handler-4.0.56.Final.jar
netty-transport-4.0.56.Final.jar
slf4j-api-1.7.25.jar
to the JMeter classpath.
So you will need to (see the command-line sketch after these steps):
Download cassandra-driver-core-3.6.0.jar
Download all the aforementioned dependencies
Copy the driver and the dependencies to the "lib" folder of your JMeter installation
Restart JMeter to pick the libraries up
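For example, from the command line (a sketch; the Maven Central path follows the coordinates above, and JMETER_HOME is wherever JMeter is installed):
# fetch the driver and drop it into JMeter's lib folder; repeat for each dependency
wget https://repo1.maven.org/maven2/com/datastax/cassandra/cassandra-driver-core/3.6.0/cassandra-driver-core-3.6.0.jar
cp cassandra-driver-core-3.6.0.jar "$JMETER_HOME/lib/"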
More information: Cassandra Load Testing with Groovy
I'm trying to run Nutch and load the crawled data into Cassandra.
I've got these dependencies in my sbt file:
"org.apache.gora" % "gora-cassandra" % "0.3",
"org.apache.nutch" % "nutch" % "2.2.1",
"com.datastax.cassandra" % "cassandra-driver-core" % "2.1.2"
and am kicking off the job
ToolRunner.run(NutchConfiguration.create(), new Crawler(), Array("urls"));
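Spelled out with the imports the call relies on (a sketch; the class locations match the stack trace below):
import org.apache.hadoop.util.ToolRunner
import org.apache.nutch.crawl.Crawler
import org.apache.nutch.util.NutchConfiguration

// run the Nutch crawl over the seed-URL directory "urls"
ToolRunner.run(NutchConfiguration.create(), new Crawler(), Array("urls"))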
But I am hitting a slightly vague error.
EDIT: updated to show the full logs from the start of the request.
[Ljava.lang.String;@526950c7
****file:/home/abdev/Working/Qordaoba/gl/web-crawling-services/crawling-services/urls
[error] play - Cannot invoke the action, eventually got an error: java.lang.RuntimeException: job failed: name=generate: null, jobid=job_local_0002
[error] application -
! #6kemm159h - Internal server error, for (POST) [/nutch/job] ->
play.api.Application$$anon$1: Execution exception[[RuntimeException: job failed: name=generate: null, jobid=job_local_0002]]
at play.api.Application$class.handleError(Application.scala:296) ~[play_2.11-2.3.6.jar:2.3.6]
at play.api.DefaultApplication.handleError(Application.scala:402) [play_2.11-2.3.6.jar:2.3.6]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$3$$anonfun$applyOrElse$4.apply(PlayDefaultUpstreamHandler.scala:320) [play_2.11-2.3.6.jar:2.3.6]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$3$$anonfun$applyOrElse$4.apply(PlayDefaultUpstreamHandler.scala:320) [play_2.11-2.3.6.jar:2.3.6]
at scala.Option.map(Option.scala:145) [scala-library-2.11.1.jar:na]
Caused by: java.lang.RuntimeException: job failed: name=generate: null, jobid=job_local_0002
at org.apache.nutch.util.NutchJob.waitForCompletion(NutchJob.java:54) ~[nutch-2.2.1.jar:na]
at org.apache.nutch.crawl.GeneratorJob.run(GeneratorJob.java:199) ~[nutch-2.2.1.jar:na]
at org.apache.nutch.crawl.Crawler.runTool(Crawler.java:68) ~[nutch-2.2.1.jar:na]
at org.apache.nutch.crawl.Crawler.run(Crawler.java:152) ~[nutch-2.2.1.jar:na]
at org.apache.nutch.crawl.Crawler.run(Crawler.java:250) ~[nutch-2.2.1.jar:na]
In Cassandra, the keyspace webpage and the tables sc, p, and f are being created before the error is thrown.
EDIT: If I put all of the jars below (sorry, it's a long list, I know) in my lib folder, then the job runs, and the first few logs are about connecting to Cassandra. I don't see those logs when I'm trying to use just the SBT dependencies.
Logs when running with below jar files:
SLF4J: The following set of substitute loggers may have been accessed
SLF4J: during the initialization phase. Logging calls during this
SLF4J: phase were not honored. However, subsequent logging calls to these
SLF4J: loggers will work as normally expected.
SLF4J: See also http://www.slf4j.org/codes.html#substituteLogger
SLF4J: org.webjars.WebJarExtractor
[info] Compiling 5 Scala sources and 1 Java source to /home/abdev/Working/Qordaoba/gl/web-crawling-services/crawling-services/target/scala-2.11/classes...
14/12/10 07:31:03 INFO play: Application started (Dev)
14/12/10 07:31:03 INFO slf4j.Slf4jLogger: Slf4jLogger started
[Ljava.lang.String;@3a6f1296
14/12/10 07:31:05 INFO connection.CassandraHostRetryService: Downed Host Retry service started with queue size -1 and retry delay 10s
14/12/10 07:31:05 INFO service.JmxMonitor: Registering JMX me.prettyprint.cassandra.service_Test Cluster:ServiceType=hector,MonitorType=hector
14/12/10 07:31:06 INFO crawl.InjectorJob: InjectorJob: Using class org.apache.gora.cassandra.store.CassandraStore as the Gora storage class.
14/12/10 07:31:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/12/10 07:31:06 INFO input.FileInputFormat: Total input paths to process : 1
Full list of jar files:
activation-1.1.jar
antlr-3.2.jar
aopalliance-1.0.jar
apache-cassandra-1.2.19.jar
apache-cassandra-clientutil-1.2.19.jar
apache-cassandra-thrift-1.2.19.jar
apache-nutch-2.2.1.jar
asm-3.2.jar
avro-1.3.3.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.1.jar
commons-cli-1.2.jar
commons-codec-1.2.jar
commons-codec-1.4.jar
commons-collections-3.2.1.jar
commons-configuration-1.6.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.1.jar
commons-io-2.4.jar
commons-lang-2.6.jar
commons-logging-1.1.1.jar
commons-math-2.1.jar
commons-net-1.4.1.jar
compress-lzf-0.8.4.jar
concurrentlinkedhashmap-lru-1.3.jar
cql-internal-only-1.4.1.zip
crawler-commons-0.2.jar
cxf-api-2.5.2.jar
cxf-common-utilities-2.5.2.jar
cxf-rt-bindings-xml-2.5.2.jar
cxf-rt-core-2.5.2.jar
cxf-rt-frontend-jaxrs-2.5.2.jar
cxf-rt-transports-common-2.5.2.jar
cxf-rt-transports-http-2.5.2.jar
elasticsearch-0.19.4.jar
geronimo-javamail_1.4_spec-1.7.1.jar
geronimo-stax-api_1.0_spec-1.0.1.jar
gora-cassandra-0.3.jar
gora-core-0.3.jar
guava-11.0.2.jar
guava-13.0.1.jar
hadoop-core-1.2.0.jar
hamcrest-core-1.3.jar
hector-core-1.1-4.jar
high-scale-lib-1.1.2.jar
hsqldb-2.2.8.jar
httpclient-4.1.1.jar
httpcore-4.1.jar
icu4j-4.0.1.jar
jackson-core-asl-1.8.8.jar
jackson-core-asl-1.9.2.jar
jackson-jaxrs-1.7.1.jar
jackson-mapper-asl-1.8.8.jar
jackson-mapper-asl-1.9.2.jar
jackson-xc-1.7.1.jar
jamm-0.2.5.jar
jaxb-api-2.2.2.jar
jaxb-impl-2.2.3-1.jar
jbcrypt-0.3m.jar
jdom-1.1.jar
jersey-core-1.8.jar
jersey-json-1.8.jar
jersey-server-1.8.jar
jettison-1.3.1.jar
jetty-6.1.26.jar
jetty-client-6.1.26.jar
jetty-sslengine-6.1.26.jar
jetty-util5-6.1.26.jar
jetty-util-6.1.26.jar
jline-0.9.1.jar
jline-1.0.jar
json-simple-1.1.jar
jsr305-1.3.9.jar
jsr311-api-1.1.1.jar
junit-4.11.jar
juniversalchardet-1.0.3.jar
libthrift-0.7.0.jar
log4j-1.2.16.jar
lucene-analyzers-3.6.0.jar
lucene-core-3.6.0.jar
lucene-highlighter-3.6.0.jar
lucene-memory-3.6.0.jar
lucene-queries-3.6.0.jar
lz4-1.1.0.jar
metrics-core-2.2.0.jar
neethi-3.0.1.jar
org.osgi.core-4.0.0.jar
org.restlet.ext.jackson-2.0.5.jar
org.restlet-2.0.5.jar
oro-2.0.8.jar
paranamer-2.2.jar
paranamer-ant-2.2.jar
paranamer-generator-2.2.jar
qdox-1.10.1.jar
serializer-2.7.1.jar
servlet-api-2.5-6.1.14.jar
servlet-api-2.5-20081211.jar
slf4j-api-1.6.6.jar
slf4j-api-1.7.2.jar
slf4j-log4j12-1.6.1.jar
slf4j-log4j12-1.7.2.jar
snakeyaml-1.6.jar
snappy-java-1.0.5.jar
snaptree-0.1.jar
solr-solrj-3.4.0.jar
spring-aop-3.0.6.RELEASE.jar
spring-asm-3.0.6.RELEASE.jar
spring-beans-3.0.6.RELEASE.jar
spring-context-3.0.6.RELEASE.jar
spring-core-3.0.6.RELEASE.jar
spring-expression-3.0.6.RELEASE.jar
spring-web-3.0.6.RELEASE.jar
stax2-api-3.1.1.jar
stax-api-1.0.1.jar
stax-api-1.0-2.jar
thrift-python-internal-only-0.7.0.zip
tika-core-1.3.jar
woodstox-core-asl-4.1.1.jar
wsdl4j-1.6.2.jar
wstx-asl-3.2.7.jar
xercesImpl-2.9.1.jar
xml-apis-1.3.04.jar
xmlenc-0.52.jar
xmlParserAPIs-2.6.2.jar
xmlschema-core-2.0.1.jar
zookeeper-3.3.1.jar
Thanks,
Brent