log4j runtime NoClassDefFoundError - android-studio

I am using log4j in an Android jar module. I can build the jar file and run it in Android Studio successfully.
My Gradle config is:
implementation 'log4j:log4j:1.2.17'
But when I run the jar file from the command line:
java -jar test.jar
I get the error below:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Logger
at com.yeetor.Main.<clinit>(Main.java:39)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
Why does it run in Android Studio but not from the command line?

Refer to this SO answer: to generate a single jar file that includes its dependencies, the dependency should be declared with "compile" rather than "implementation"; you will then get a bigger jar file that bundles all dependencies. By default, Gradle does not include dependencies inside the jar file; each dependency stays as its own separate jar.
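As a sketch of that approach (the main-class name is taken from the stack trace above; on newer Gradle versions you would use `implementation` plus `configurations.runtimeClasspath` instead of `configurations.compile`), the jar task can be told to bundle the dependencies:

```groovy
// build.gradle -- hypothetical "fat" jar sketch
apply plugin: 'java'

dependencies {
    compile 'log4j:log4j:1.2.17'   // "compile" rather than "implementation"
}

jar {
    manifest {
        // Entry point taken from the stack trace in the question
        attributes 'Main-Class': 'com.yeetor.Main'
    }
    // Unpack each dependency jar into the output jar
    from {
        configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
    }
}
```

After `gradle jar`, the resulting jar should run standalone with `java -jar`.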

Related

spark-submit dependency conflict

I'm trying to submit a jar to Spark, but my jar contains dependencies that conflict with Spark's built-in jars (snakeyaml and others).
Is there a way to tell Spark to prefer whatever dependencies my project has over the jars inside /jars?
UPDATE
When I run spark-submit, I get the following exception:
Caused by: java.lang.NoSuchMethodError: javax.validation.BootstrapConfiguration.getClockProviderClassName()Ljava/lang/String;
at org.hibernate.validator.internal.xml.ValidationBootstrapParameters.<init>(ValidationBootstrapParameters.java:63)
at org.hibernate.validator.internal.engine.ConfigurationImpl.parseValidationXml(ConfigurationImpl.java:540)
at org.hibernate.validator.internal.engine.ConfigurationImpl.buildValidatorFactory(ConfigurationImpl.java:337)
at javax.validation.Validation.buildDefaultValidatorFactory(Validation.java:110)
at org.hibernate.cfg.beanvalidation.TypeSafeActivator.getValidatorFactory(TypeSafeActivator.java:501)
at org.hibernate.cfg.beanvalidation.TypeSafeActivator.activate(TypeSafeActivator.java:84)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.hibernate.cfg.beanvalidation.BeanValidationIntegrator.integrate(BeanValidationIntegrator.java:132)
... 41 more
which is caused by Spark shipping an older version of the validation-api (validation-api-1.1.0.Final.jar).
My project depends on the newer version, and it does get bundled into my jar (javax.validation:validation-api:jar:2.0.1.Final:compile).
I submit using this command:
/spark/bin/spark-submit --conf spark.executor.userClassPathFirst=true --conf spark.driver.userClassPathFirst=true
but I still get the same exception.
If you are building your jar using SBT, you need to exclude the classes that are already on the cluster, for example:
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
You do that by adding "provided", which means these classes are already provided in the environment where you run the jar.
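A minimal build.sbt sketch of that idea (the spark-sql line is an assumption added for illustration; list whichever Spark modules your project actually uses):

```scala
// build.sbt -- mark cluster-provided Spark artifacts as "provided"
// so an assembled fat jar leaves them out
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.2.0" % "provided"  // hypothetical extra module
)
```

Dependencies marked "provided" are still on the compile classpath, but sbt-assembly excludes them from the output jar, so the cluster's own copies are used at runtime.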
Not sure if you are using SBT, but I used this in build.sbt via sbt-assembly when I had all sorts of dependency conflicts at one stage. See below; maybe this will help.
This is controlled by setting the following confs to true:
spark.driver.userClassPathFirst
spark.executor.userClassPathFirst
I had issues with 2 jars, and this is what I ended up doing: I copied the required jars to a directory and used the extraClassPath option:
spark-submit --conf spark.driver.extraClassPath="C:\sparkjars\validation-api-2.0.1.Final.jar;C:\sparkjars\gson-2.8.6.jar" myspringbootapp.jar
From the documentation: spark.driver.extraClassPath specifies extra classpath entries to prepend to the classpath of the driver.

Android Studio issues with linking Gradle to an Android.mk file

I am trying to add .so files to a dlib library in Android Studio. I have followed these steps:
Migrate from ndkCompile
If you're using the deprecated ndkCompile, you should migrate to using either CMake or ndk-build. Because ndkCompile generates an intermediate Android.mk file for you, migrating to ndk-build may be a simpler choice.
To migrate from ndkCompile to ndk-build, proceed as follows:
Compile your project with ndkCompile at least once by selecting Build > Make Project. This generates the Android.mk file for you.
Locate the auto-generated Android.mk file by navigating to project-root/module-root/build/intermediates/ndk/debug/Android.mk.
Relocate the Android.mk file to some other directory, such as the same directory as your module-level build.gradle file. This makes sure that Gradle doesn't delete the script file when running the clean task.
Open the Android.mk file and edit any paths in the script such that they are relative to the current location of the script file.
Link Gradle to the Android.mk file .
Disable ndkCompile by opening the gradle.properties file and removing the following line:
android.useDeprecatedNdk = true
Apply your changes by clicking Sync Project in the toolbar.
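The "Link Gradle to the Android.mk file" step above can be sketched as follows in the module-level build.gradle (the path assumes you relocated Android.mk next to this build file, as the previous step suggests):

```groovy
// Module-level build.gradle -- point Gradle's ndk-build integration
// at the relocated Android.mk script
android {
    externalNativeBuild {
        ndkBuild {
            path 'Android.mk'   // adjust if you moved the file elsewhere
        }
    }
}
```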
However, I am getting this error saying the project can't be configured:
org.gradle.api.ProjectConfigurationException: A problem occurred configuring project ':dlib'.
at org.gradle.configuration.project.LifecycleProjectEvaluator.addConfigurationFailure(LifecycleProjectEvaluator.java:94)
at org.gradle.configuration.project.LifecycleProjectEvaluator.notifyAfterEvaluate(Li
Caused by: java.lang.NullPointerException
Any help much appreciated.

Apache POI Word and IntelliJ

How do I add the jar files to IntelliJ properly? The program works in IntelliJ but not as a .jar file.
I have tried adding the files and exporting them, but I don't know what to do; I've been going around in circles for the last 6 hours. It's obviously something to do with the runtime, but I don't know anything about that, and either there is not much information on it or (most probably) I'm not googling the right things.
This is the error:
Caused by: java.lang.NoClassDefFoundError: org/apache/poi/xwpf/usermodel/XWPFDocument
at Methods.word_output.createDocx(word_output.java:28)
at Controllers.mainController.createReport(mainController.java:466)
... 53 more
Caused by: java.lang.ClassNotFoundException: org.apache.poi.xwpf.usermodel.XWPFDocument
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 55 more
A good option is to use a Gradle build.gradle file in the project's main directory and then import the project into IntelliJ via "Import project", choosing that file. This way IntelliJ resolves all the necessary dependencies for you.
A sample minimal Gradle build file is:
apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    compile 'org.apache.poi:poi:3.15-beta1'
    compile 'org.apache.poi:poi-ooxml:3.15-beta1'
    testCompile "junit:junit:[4.12,)"
}
Ideally you would follow the layout of Gradle builds and put your sources in src/main/java and your tests in src/test/java.
As a bonus you gain the possibility to build the project on the commandline/CI/... whatever!
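Since the original problem was the program failing when run outside the IDE, the sample build file could also be extended with the application plugin (a sketch; `Controllers.mainController` is only a guess at the entry point, taken from the stack trace):

```groovy
// Addition to build.gradle -- lets Gradle launch the program itself
apply plugin: 'application'

// Hypothetical entry point; replace with the class containing main()
mainClassName = 'Controllers.mainController'
```

With that in place, `gradle run` launches the program with the POI jars resolved onto the runtime classpath, sidestepping the manual jar-export step entirely.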

Setting up Spark on IntelliJ for contributing to Spark

I have lost a lot of time trying to set up Spark in IntelliJ on my local machine.
Goal: to run SparkPi.scala without any errors.
Steps taken:
git clone `https://github.com/apache/spark`
Import the project into IntelliJ as a Maven project
build/mvn -DskipTests clean package
Navigate to the examples folder and modify pom.xml (change occurrences of provided and test to compile)
Open SparkPi.scala and add `.master("local[4]")` to the Spark session builder
Right-click and run SparkPi
The error I am faced with:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/collect/MapMaker
at org.apache.spark.SparkContext.<init>(SparkContext.scala:271)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2257)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:822)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:814)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:814)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: com.google.common.collect.MapMaker
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 13 more
You need to rebuild the project under IntelliJ. Sad but true: IntelliJ is unable to simply reuse the Maven-built infrastructure.
However, it does use part of the command-line mvn structure: you do need to run mvn first.
As for the Google MapMaker class: it means the dependencies were not downloaded properly and are not available. This should be resolved after the full rebuild.

Groovy: How does @Grab inclusion differ from classpath inclusion?

1. Generally, how is @Grape/@Grab inclusion different from classpath inclusion?
2. Specifically, what might cause the following behavior difference?
I've got a requirement on xpp3 which I express as:
// TestScript.groovy
@Grab(group='xpp3', module='xpp3', version='1.1.3.4.O')
import org.xmlpull.v1.XmlPullParserFactory;
println "Created: " + XmlPullParserFactory.newInstance()
Running $ groovy TestScript.groovy fails with
Caught: org.xmlpull.v1.XmlPullParserException: caused by: org.xmlpull.v1.XmlPullParserException:
If, however, I manually add the .jar fetched by Grape to my Groovy classpath:
$ groovy -cp ~/.groovy/grapes/xpp3/xpp3/jars/xpp3-1.1.3.4.O.jar \
TestScript.groovy
... then everything works.
Grab uses Ivy to fetch the specified library (plus all of its dependencies) from the Maven central repository. It then adds the downloaded libraries to the classpath of the loader that's running the current script.
Adding the jar to the classpath just adds the specified jar to the system classpath.
As there are no dependencies in this example, it's probably a requirement that the library be loaded by the system classloader.
To check this, try adding
@GrabConfig(systemClassLoader=true)
@Grab(group='xpp3', module='xpp3', version='1.1.3.4.O')
instead of the one-line Grab you currently have.
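Put together, the corrected script would look like this (a sketch of the suggested fix; the annotations use `@`, and Grape needs network access or a populated local cache the first time it runs):

```groovy
// TestScript.groovy -- force the grabbed jar onto the system classloader
@GrabConfig(systemClassLoader=true)
@Grab(group='xpp3', module='xpp3', version='1.1.3.4.O')
import org.xmlpull.v1.XmlPullParserFactory

println "Created: " + XmlPullParserFactory.newInstance()
```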
