Getting this error when I try to run a jar file that I created: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Appender - log4j

I am getting an exception when I try to run the jar file that I created for my Java application. I am using log4j for logging, and I created a custom log that records transactions per cron job.
I run the jar file through a shell script, with the properties files kept outside the jar. The command I use is
java -jar app.jar $1
The Java application has two properties files:
1) app.properties
2) sublog4j.properties
The sublog4j.properties file has entries like this:
log4j.appender.log=package.CustomFileAppender
log4j.appender.log.File=/serverpath/error.log
I have a gut feeling that I am getting the error because of package.CustomFileAppender. It is a Java class inside app.jar, but I don't know how to create a custom appender and reference it from the external properties file.
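For context, a custom log4j 1.x appender only needs to be a compiled class on the runtime classpath, and an external configuration file is normally pointed at through the log4j.configuration system property. A minimal sketch of such a launch, assuming the external file sits under /serverpath/ like the log file itself (the exact path is an assumption):
# hedged sketch: load the external log4j 1.x configuration at launch
java -Dlog4j.configuration=file:/serverpath/sublog4j.properties -jar app.jar $1
The full stack trace is: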
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Appender
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2615)
at java.lang.Class.getMethod0(Class.java:2856)
at java.lang.Class.getMethod(Class.java:1668)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Appender
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

I was getting this error because I was not building the jar file correctly. What I did was create an executable jar file and also include the source when I created the jar. That is how the issue got resolved.
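More generally, NoClassDefFoundError: org/apache/log4j/Appender means the log4j jar itself was missing from the runtime classpath, so it either has to be bundled into the executable jar or supplied explicitly at launch. A hedged sketch of the explicit option (the log4j jar version and the main class name are assumptions):
# -jar ignores -cp, so name the main class instead of using java -jar
java -cp app.jar:log4j-1.2.17.jar com.example.Main $1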

Related

Spark 2.4.6 without Hadoop: A JNI error has occurred

On my Windows machine, I am trying to use Spark 2.4.6 without Hadoop, using
spark-2.4.6-bin-without-hadoop-scala-2.12.tgz
After setting SPARK_HOME, HADOOP_HOME, and SPARK_DIST_CLASSPATH with information from the post linked here,
when I try to start the spark-shell I get this error:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
The link referenced above, and many others, point to SPARK_DIST_CLASSPATH, but I already have this in my system variables as:
$HADOOP_HOME;$HADOOP_HOME\etc\hadoop*;$HADOOP_HOME\share\hadoop\common\lib*;$HADOOP_HOME\share\hadoop\common*;$HADOOP_HOME\share\hadoop\hdfs*;$HADOOP_HOME\share\hadoop\hdfs\lib*;$HADOOP_HOME\share\hadoop\hdfs*;$HADOOP_HOME\share\hadoop\yarn\lib*;$HADOOP_HOME\share\hadoop\yarn*;$HADOOP_HOME\share\hadoop\mapreduce\lib*;$HADOOP_HOME\share\hadoop\mapreduce*;$HADOOP_HOME\share\hadoop\tools\lib*;
I also have this line in Spark's spark-env.sh:
export SPARK_DIST_CLASSPATH=$(C:\opt\spark\hadoop-2.7.3\bin\hadoop classpath)
HADOOP_HOME = C:\opt\spark\hadoop-2.7.3
SPARK_HOME = C:\opt\spark\spark-2.4.6-bin-without-hadoop-scala-2.12
When I tried the Spark 2.4.5 build that came with Hadoop, it seemed to work just fine. This tells me there is something wrong with the way I have my Hadoop set up. What am I missing here?
Thanks!
Found a solution here.
Go to your ~/.bashrc
Add export SPARK_DIST_CLASSPATH=$(hadoop classpath)
Apply the environment using source ~/.bashrc
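Since the missing class here is org.slf4j.Logger, which the without-hadoop Spark build expects to pick up from the Hadoop jars, it is worth confirming that hadoop classpath resolves and that the slf4j jars are where it points. A hedged check, assuming a standard Hadoop 2.7.3 layout:
# print the classpath Spark will inherit through SPARK_DIST_CLASSPATH
hadoop classpath
# the slf4j jars normally sit under share/hadoop/common/lib
ls "$HADOOP_HOME/share/hadoop/common/lib" | grep slf4j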

Spark Submit error when running a JAR from Azure Databricks

I'm trying to issue a spark-submit from the Azure Databricks jobs scheduler and am currently stuck with the error below. The error says: File file:/tmp/spark-events does not exist. I need some pointers to understand whether we need to create this directory in the Azure Blob location (which is my storage layer) or in the Azure DBFS location.
The link below is not clear about where to create the directory when running spark-submit from the Azure Databricks jobs scheduler.
SparkContext Error - File not found /tmp/spark-events does not exist
Error:
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Warning: Ignoring non-Spark config property: eventLog.rolloverIntervalSeconds
Exception in thread "main" java.lang.ExceptionInInitializerError
at com.dta.dl.ct.qm.hbase.reverse.pipeline.HBaseVehicleMasterLoad.main(HBaseVehicleMasterLoad.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: File file:/tmp/spark-events does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:97)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:580)
at com.dta.dl.ct.qm.hbase.reverse.pipeline.HBaseVehicleMasterLoad$.<init>(HBaseVehicleMasterLoad.scala:32)
at com.dta.dl.ct.qm.hbase.reverse.pipeline.HBaseVehicleMasterLoad$.<clinit>(HBaseVehicleMasterLoad.scala)
... 13 more
You need to create this folder on the driver node before collecting event logs (that's by design).
To do so, one way could be to add the property spark.history.fs.logDirectory (present in the spark-defaults.conf file) in a global init script, as described here.
Please make sure that the folder defined in that property exists and can be accessed from the driver node.
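A minimal sketch of such a global (cluster-scoped) init script, assuming the default file:/tmp/spark-events event-log location is kept:
#!/bin/bash
# runs on every node at cluster start, including the driver,
# so the local event-log directory exists before the SparkContext is created
mkdir -p /tmp/spark-events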

Karaf bundle using ch.qos.logback, log4j: org/apache/logging/log4j/util/ReflectionUtil

I have an OSGi 6.0.0 project which uses logback and slf4j.
Currently I am implementing a bundle which depends on an external jar that uses
org.apache.logging.log4j
During command-line execution of the bundle, my Karaf console freezes and displays this error:
Exception in thread "Karaf local console user karaf" java.lang.NoClassDefFoundError: org/apache/logging/log4j/util/ReflectionUtil
at org.apache.logging.log4j.core.impl.ThrowableProxy.<init>(ThrowableProxy.java:145)
at org.apache.logging.log4j.core.impl.ThrowableProxy.<init>(ThrowableProxy.java:125)
at org.apache.logging.log4j.core.impl.MutableLogEvent.getThrownProxy(MutableLogEvent.java:338)
at org.ops4j.pax.logging.log4j2.internal.PaxLoggingEventImpl.getThrowableStrRep(PaxLoggingEventImpl.java:76)
at org.apache.karaf.log.core.internal.KarafLogEvent.<init>(KarafLogEvent.java:45)
at org.apache.karaf.log.core.internal.LogServiceImpl.doAppend(LogServiceImpl.java:177)
at org.ops4j.pax.logging.log4j2.internal.PaxAppenderProxy.doAppend(PaxAppenderProxy.java:65)
at org.ops4j.pax.logging.log4j2.appender.PaxOsgiAppender.append(PaxOsgiAppender.java:82)
at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:156)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:129)
at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:120)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:84)
at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:448)
at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:433)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:403)
at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:63)
at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:146)
at org.ops4j.pax.logging.log4j2.internal.PaxLoggerImpl.doLog0(PaxLoggerImpl.java:151)
at org.ops4j.pax.logging.log4j2.internal.PaxLoggerImpl.doLog(PaxLoggerImpl.java:144)
at org.ops4j.pax.logging.log4j2.internal.PaxLoggerImpl.error(PaxLoggerImpl.java:192)
at org.ops4j.pax.logging.internal.TrackingLogger.error(TrackingLogger.java:96)
at org.ops4j.pax.logging.slf4j.Slf4jLogger.error(Slf4jLogger.java:953)
at org.apache.karaf.shell.support.ShellUtil.logException(ShellUtil.java:152)
at org.apache.karaf.shell.impl.console.ConsoleSessionImpl.doExecute(ConsoleSessionImpl.java:474)
at org.apache.karaf.shell.impl.console.ConsoleSessionImpl.run(ConsoleSessionImpl.java:407)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.logging.log4j.util.ReflectionUtil not found by org.apache.logging.log4j.api [112]
at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1639)
at org.apache.felix.framework.BundleWiringImpl.access$200(BundleWiringImpl.java:80)
at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(BundleWiringImpl.java:2053)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.felix.framework.BundleWiringImpl.getClassByDelegation(BundleWiringImpl.java:1414)
at org.apache.felix.framework.BundleWiringImpl.searchImports(BundleWiringImpl.java:1660)
at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1590)
at org.apache.felix.framework.BundleWiringImpl.access$200(BundleWiringImpl.java:80)
at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(BundleWiringImpl.java:2053)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 27 more
I found the solution: the class org/apache/logging/log4j/util/ReflectionUtil no longer exists as of log4j 2.9.0.
I just have to use a version lower than that one.
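If in doubt about which log4j-api is actually being resolved, a hedged way to check whether a given jar still ships the class (the jar file name and version below are assumptions):
# list the jar entries and look for the class that was removed in 2.9.0
unzip -l log4j-api-2.8.2.jar | grep ReflectionUtil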

Exception while creating an image using the JAI library from a shell script

When I work with JAI from Eclipse (with all the classes specified) it works fine, but when I bundle everything into a jar, create a shell script to launch it, and try to run that script, I have a problem with javax.media.jai.OperationRegistry looking for an initialization file.
Has anyone else seen this problem?
Exception:
Exception in thread "AWT-EventQueue-0" java.lang.ExceptionInInitializerError
at javax.media.jai.TiledImage.<init>(TiledImage.java:259)
at javax.media.jai.TiledImage.<init>(TiledImage.java:222)
at org.syntec.ivb.img.SingleShot.<init>(SingleShot.java:75)
at org.syntec.ivb.jni.GetCapture.<init>(GetCapture.java:21)
at org.syntec.ivb.application.SampleWindow$4.actionPerformed(SampleWindow.java:178)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2012)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2335)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:404)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252)
at java.awt.Component.processMouseEvent(Component.java:6288)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3267)
at java.awt.Component.processEvent(Component.java:6053)
at java.awt.Container.processEvent(Container.java:2045)
at java.awt.Component.dispatchEventImpl(Component.java:4649)
at java.awt.Container.dispatchEventImpl(Container.java:2103)
at java.awt.Component.dispatchEvent(Component.java:4475)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4633)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4297)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4227)
at java.awt.Container.dispatchEventImpl(Container.java:2089)
at java.awt.Window.dispatchEventImpl(Window.java:2587)
at java.awt.Component.dispatchEvent(Component.java:4475)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:675)
at java.awt.EventQueue.access$300(EventQueue.java:96)
at java.awt.EventQueue$2.run(EventQueue.java:634)
at java.awt.EventQueue$2.run(EventQueue.java:632)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:105)
at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:116)
at java.awt.EventQueue$3.run(EventQueue.java:648)
at java.awt.EventQueue$3.run(EventQueue.java:646)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:105)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:645)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:275)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:200)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:190)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:185)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:177)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:138)
Caused by: java.lang.RuntimeException: Registry initialization file not found.
at javax.media.jai.OperationRegistry.initializeRegistry(OperationRegistry.java:365)
at javax.media.jai.JAI.<clinit>(JAI.java:566)
... 41 more
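The "Registry initialization file not found" message usually indicates that JAI's registryFile.jai descriptor, which ships inside jai_core.jar (typically under META-INF), was lost when everything was repacked into a single jar. A hedged check against the bundled jar (its name is an assumption):
# the registry descriptor should still be present after bundling
unzip -l app.jar | grep registryFile.jai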

Hadoop HDFS test running issue - org.apache.hadoop.conf.Configuration NoClassDefFoundError

I'm working with Hadoop 0.21.0 and trying to run the hdfs_test application that comes alongside the C API library. After many problems I was able to compile hdfs_test. Now when I run it:
./hdfs_test
I'm getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:153)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
... 1 more
Can't construct instance of class org.apache.hadoop.conf.Configuration
Oops! Failed to connect to hdfs!
Any help is appreciated, thanks.
Like any other Java program, you need the dependencies on the classpath or inside the jar. Hadoop also has a HADOOP_CLASSPATH variable to tell the cluster where to find dependencies in map-reduce tasks. Also see How to run a Hadoop program?
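For hdfs_test specifically, libhdfs launches an in-process JVM and takes its classpath from the CLASSPATH environment variable, so every required Hadoop jar (including commons-logging, which provides the missing LogFactory) has to be listed there before running the binary. A hedged sketch, assuming a standard 0.21.0 layout under $HADOOP_HOME:
# build CLASSPATH from the Hadoop jars and their bundled dependencies
export CLASSPATH=$HADOOP_HOME/conf
for jar in $HADOOP_HOME/*.jar $HADOOP_HOME/lib/*.jar; do
  CLASSPATH=$CLASSPATH:$jar
done
./hdfs_test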
