Log4j 2 failing to configure

I have a Jetty 9 based application that I inherited. I am trying to put in some logging functionality. I am using slf4j with log4j2. I have added the appropriate jars for this using Maven:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.7.22</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>2.7</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-api</artifactId>
  <version>2.7</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.7</version>
</dependency>
Unfortunately, the application keeps failing to find the log4j2.xml file:
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
I have placed copies of that XML file in the webapps folder, the main folder under my server, and in the folder where the libraries are -- all places on the web application's classpath. The application still doesn't find it.
Can someone please advise on how I can make this application find the XML file?

Please take a look at the Log4j2 manual page for web applications.
By default Log4j2 looks in the WEB-INF folder of your web application.
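In a Maven webapp that typically means src/main/webapp/WEB-INF/log4j2.xml, or src/main/resources/log4j2.xml so the file ends up in WEB-INF/classes on the classpath. The manual also recommends adding the log4j-web artifact when running in a servlet container such as Jetty. As a sanity check, a minimal log4j2.xml that logs to the console (the pattern and level here are just examples):
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- console appender writing to stdout -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>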

Related

add custom dependency to Spark

I want to add <JsonTemplateLayout eventTemplateUri="classpath:LogstashJsonEventLayoutV1.json" charset="UTF-8"/> to log4j2.xml for my Spark configuration,
but I don't have the JsonTemplateLayout dependency. How can I add this dependency to Spark?
You need to make sure that both the log4j-core and log4j-layout-template-json modules are included in your project; JsonTemplateLayout lives in log4j-layout-template-json, which was introduced in Log4j 2.14.0. Add the following dependencies to your project:
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.17.1</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-layout-template-json</artifactId>
  <version>2.17.1</version>
</dependency>
Adjust the version numbers to match the Log4j version you are using; all Log4j artifacts should share the same version, and it must be at least 2.14.0 for this layout.
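For reference, a minimal sketch of where the layout from the question would sit in log4j2.xml (the appender name is illustrative; LogstashJsonEventLayoutV1.json is one of the templates bundled with log4j-layout-template-json):
<Appenders>
  <!-- console appender emitting Logstash-style JSON events -->
  <Console name="JsonConsole" target="SYSTEM_OUT">
    <JsonTemplateLayout eventTemplateUri="classpath:LogstashJsonEventLayoutV1.json" charset="UTF-8"/>
  </Console>
</Appenders>
When submitting to a cluster, the jar also has to reach the driver and executors, for example by passing the same Maven coordinates to spark-submit via --packages.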

Log4j vulnerability in 3rd-party applications like Apache Zookeeper

Apache Zookeeper uses log4j 1.2, which is vulnerable to RCE.
To rectify this issue we planned to exclude log4j 1.2 and include the log4j 2.17.1 core and api artifacts in the dependency.
It doesn't help. Can somebody please suggest how to exclude jars from third-party libraries?
I am getting this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/jmx/HierarchyDynamicMBean
at org.apache.zookeeper.jmx.ManagedUtil.registerLog4jMBeans(ManagedUtil.java:50)
at org.apache.zookeeper.server.ZooKeeperServerMain.initializeAndRun(ZooKeeperServerMain.java:91)
at org.apache.zookeeper.server.ZooKeeperServerMain.main(ZooKeeperServerMain.java:61)
at org.apache.zookeeper.server.quorum.QuorumPeerMain.initializeAndRun(QuorumPeerMain.java:125)
at org.apache.zookeeper.server.quorum.QuorumPeerMain.main(QuorumPeerMain.java:79)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.jmx.HierarchyDynamicMBean
at java.net.URLClassLoader.findClass(URLClassLoader.
We tried this:
<dependencies>
  <dependency>
    <groupId>org.apache.zookeeper</groupId>
    <artifactId>zookeeper</artifactId>
    <version>3.5.1-alpha</version>
    <exclusions>
      <exclusion>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.17.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-1.2-api</artifactId>
    <version>2.17.1</version>
  </dependency>
</dependencies>
I believe I figured it out but I haven't tested this for long enough.
Considering this was applied to a v3.6.1 Zookeeper server, a summary of what needs to be done is:
1. Delete the old log4j libraries from Zookeeper:
   - log4j-1.2.17.jar
   - log4j-1.2.17.LICENSE.txt (removing the license file is obviously not strictly necessary)
2. Add recent log4j libraries that have the fix for the Log4Shell vulnerability:
   - the log4j2 bridge that is backward compatible with log4j 1.x: log4j-1.2-api-2.17.1.jar
   - the necessary log4j libraries: log4j-api-2.17.1.jar and log4j-core-2.17.1.jar
3. Modify Zookeeper's server environment options file (e.g. /zookeeper/conf/server_jvm.properties) by adding the following lines:
   -Dlog4j.configuration=/incorta/IncortaAnalytics/IncortaNode/zookeeper/conf/log4j.properties (a pointer for log4j2 to the existing log4j 1.x configuration file; see the reference below for more details)
   -Dzookeeper.jmx.log4j.disable=true (disables Zookeeper's JMX dependency on log4j 1.x; thanks to Piotr for mentioning that tip on this question)
This keeps the slf4j libraries shipped with Zookeeper, because changing those to a log4j2-compatible version wasn't a pleasant experience for me.
Instead, I upgraded the log4j 1.x libraries to log4j2, adding the bridge library so that Zookeeper's outdated slf4j libraries can use the recent log4j2 ones.
Reference
Update: Using JDK 11, we faced a weird error where our Zookeeper client couldn't connect to Zookeeper, and the solution was to remove the slf4j-log4j12 binder from our classpath.
Zookeeper is apparently trying to directly access Log4j 1.2 internal classes, which no longer exist in log4j-1.2-api (cf. source code).
You can:
either set the system property zookeeper.jmx.log4j.disable to true
or upgrade to a newer version (e.g. 3.5.9), which will detect the absence of the HierarchyDynamicMBean class automatically.
You should upgrade anyway since the alpha version you are using has several security vulnerabilities: cf. Maven Repository.
The following dependency configuration seems to have worked for me:
<dependency>
  <groupId>org.apache.zookeeper</groupId>
  <artifactId>zookeeper</artifactId>
  <version>3.7.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.17.1</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-api</artifactId>
  <version>2.17.1</version>
</dependency>
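One caveat worth checking (an assumption, based on Zookeeper 3.7 logging through slf4j): with slf4j-log4j12 excluded and no other binding present, slf4j falls back to its no-op logger and Zookeeper's own log output is silently dropped. Routing it to Log4j 2 requires the binder as well:
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>2.17.1</version>
</dependency>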

activating asciidoctor diagram extension plantuml in maven pom.xml

I use Asciidoctor and Maven with the asciidoctor-maven-plugin.
I use AsciidoctorJ and asciidoctorj-diagram.
In the source of the document to render I now have:
plantuml::input.puml[]
With mvn -X compile I get no information in the console :-(
In the generated document I see the source line verbatim, i.e. it is not rendered at all.
What is the problem?
<plugin>
  <groupId>org.asciidoctor</groupId>
  <artifactId>asciidoctor-maven-plugin</artifactId>
  <version>${version.admvpl}</version>
  <dependencies>
    <dependency>
      <groupId>org.jruby</groupId>
      <artifactId>jruby-complete</artifactId>
      <version>${version.jrcm}</version>
    </dependency>
    <dependency>
      <groupId>org.asciidoctor</groupId>
      <artifactId>asciidoctorj-pdf</artifactId>
      <version>${version.adjpdf}</version>
    </dependency>
    <dependency>
      <groupId>org.asciidoctor</groupId>
      <artifactId>asciidoctorj-diagram</artifactId>
      <version>${version.addia}</version>
    </dependency>
  </dependencies>
Do I have to add something like:
<configuration>
  <requires>
    <require>asciidoctor-diagram</require>
  </requires>
</configuration>
On the internet I only find configurations for Gradle and gems, but not for Maven. The problem occurs in both the HTML and PDF backends.
You can find a fully working example here: https://github.com/asciidoctor/asciidoctor-maven-examples/tree/master/asciidoctor-diagram-example
Yes, you have to both add the dependency on asciidoctorj-diagram and specify the requires/require in the configuration, as you did in your question.
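Putting the two together, a sketch of the plugin section (using the version properties from the question; the jruby-complete and asciidoctorj-pdf dependencies from the question would stay alongside):
<plugin>
  <groupId>org.asciidoctor</groupId>
  <artifactId>asciidoctor-maven-plugin</artifactId>
  <version>${version.admvpl}</version>
  <dependencies>
    <dependency>
      <groupId>org.asciidoctor</groupId>
      <artifactId>asciidoctorj-diagram</artifactId>
      <version>${version.addia}</version>
    </dependency>
  </dependencies>
  <configuration>
    <!-- load the asciidoctor-diagram Ruby extension before rendering -->
    <requires>
      <require>asciidoctor-diagram</require>
    </requires>
  </configuration>
</plugin>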

Spark metric system sink errors after adding hadoop-aws dependency jar to pom file

I'm using HDP version 2.6.3 with Spark 2.2 (not HDP cloud) and I'm trying to write to S3 from an IntelliJ project. I have no problems writing to the S3 bucket from the shell on one of my data nodes, but when I try to test my app on my local machine in IntelliJ, I get an error (ERROR MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantiated) after adding the hadoop-aws jar dependency to my pom file. Does anyone know if there is any nuance to how this dependency needs to be added? If I put it above the Spark dependencies in my pom, I get different errors about missing Spark classes, so the order seems to matter.
I had the same problem and solved it by excluding the Jackson libraries from the hadoop-aws dependency.
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-aws</artifactId>
  <version>${hadoop.version}</version>
  <exclusions>
    <exclusion> <!-- declare the exclusion here -->
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
    </exclusion>
  </exclusions>
</dependency>
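An alternative to per-dependency exclusions is to force one Jackson version for the whole build through dependencyManagement, so hadoop-aws and Spark resolve to the same artifacts. A sketch, where ${jackson.version} is a placeholder you would set to the version your Spark release expects:
<dependencyManagement>
  <dependencies>
    <!-- pin all three Jackson artifacts to a single version -->
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>${jackson.version}</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>${jackson.version}</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>${jackson.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>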

Caused by: java.lang.NoSuchMethodError: com.datastax.driver.core.TypeCodec.getJavaType()Lcom/google/common/reflect/TypeToken;

I am creating an application using the following dependencies, which are all the latest versions so far.
<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <version>2.0.0.M5</version>
</parent>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.10</artifactId>
  <version>2.0.5</version>
</dependency>
<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-core</artifactId>
  <version>3.3.0</version>
</dependency>
I get this exception
"Caused by: java.lang.NoSuchMethodError: com.datastax.driver.core.TypeCodec.getJavaType()Lcom/google/common/reflect/TypeToken;"
After doing some research, I found the reason: the spark-cassandra-connector_2.10 jar also contains com.datastax.driver class files, including its own TypeCodec.class, which differs from the TypeCodec.class in cassandra-driver-core.
I have two solutions so far:
1. Use maven-shade-plugin to exclude class files from the jar (a sketch is shown after this answer). However, this requires a lot of extra work, and for some reason it only works if I compile the project to a jar and add that jar as a dependency in my project. I don't think this is a good solution.
2. Remove the /com/datastax/driver folder and files directly from the jar, using this command:
zip -d /Users/cicidi/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/2.0.5/spark-cassandra-connector_2.10-2.0.5.jar /com/datastax/driver/*
And it works! You then need to add this jar to your project directly instead of resolving it through Maven (you can use your local Maven repository, but it won't work if the jar is downloaded again).
I didn't find any answer on the internet. I know some smart people will eventually fix this issue properly, but until then I am posting this answer to help whoever wants to fix this problem immediately.
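For reference, a sketch of what option 1 could look like with maven-shade-plugin, filtering the embedded driver classes out of the connector (the filter pattern mirrors the folder removed in option 2; treat it as untested):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <!-- drop the driver classes embedded in the connector jar -->
            <artifact>com.datastax.spark:spark-cassandra-connector_2.10</artifact>
            <excludes>
              <exclude>com/datastax/driver/**</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>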
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.10</artifactId>
  <version>2.0.0-M1</version>
</dependency>
has no version conflict with
<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-core</artifactId>
  <version>3.3.0</version>
</dependency>
According to the question "Why is the Cassandra Java Driver embedded in Spark Cassandra Connector artifacts?" in https://github.com/datastax/spark-cassandra-connector/blob/master/doc/FAQ.md, it's difficult to use these two libraries together, but I try to avoid the problem by explicitly specifying the dependencies in my pom.
There are some points to pay attention to:
The cassandra-driver-* dependencies must be put before spark-cassandra-connector, and driver version 3.1.4 may only work with connector 2.0.0-M1.
<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-core</artifactId>
  <version>3.1.4</version>
</dependency>
<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-mapping</artifactId>
  <version>3.1.4</version>
</dependency>
<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-extras</artifactId>
  <version>3.1.4</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>2.0.0-M1</version>
</dependency>
