ERROR Appenders contains an invalid element or attribute "Http" - log4j

I am using the Log4j Http appender to send data to Splunk from Mule CloudHub. During the build it throws the error:
ERROR Appenders contains an invalid element or attribute Http
and I am not seeing the data in Splunk.
The error happens with this Log4j configuration:
<Http name="Splunktest" url="myurl" token="mytoken"
disableCertificateValidation="true"></Http>
The Maven build throws the error above. Mule runtime version: 3.8.4.
Did anyone else face the same error?
Entire Log4j for reference
<!--These are some of the loggers you can enable.
There are several more you can find in the documentation.
Besides this log4j configuration, you can also use Java VM environment variables
to enable other logs like network (-Djavax.net.debug=ssl or all) and
Garbage Collector (-XX:+PrintGC). These will be appended to the console, so you will
see them in the mule_ee.log file. -->
<Appenders>
  <RollingFile name="file"
               fileName="${sys:mule.home}${sys:file.separator}logs${sys:file.separator}splunk.log"
               filePattern="${sys:mule.home}${sys:file.separator}logs${sys:file.separator}splunk-%i.log">
    <PatternLayout pattern="%d [%t] %-5p %c - %m%n" />
    <SizeBasedTriggeringPolicy size="10 MB" />
    <DefaultRolloverStrategy max="10"/>
  </RollingFile>
  <Http name="Splunktest" url="myurl" token="mytoken" disableCertificateValidation="true"/>
</Appenders>
<Loggers>
  <!-- Http Logger shows wire traffic on DEBUG -->
  <AsyncLogger name="org.mule.module.http.internal.HttpMessageLogger" level="WARN"/>
  <!-- JDBC Logger shows queries and parameters values on DEBUG -->
  <AsyncLogger name="com.mulesoft.mule.transport.jdbc" level="WARN"/>
  <!-- CXF is used heavily by Mule for web services -->
  <AsyncLogger name="org.apache.cxf" level="WARN"/>
  <!-- Apache Commons tend to make a lot of noise which can clutter the log -->
  <AsyncLogger name="org.apache" level="WARN"/>
  <!-- Reduce startup noise -->
  <AsyncLogger name="org.springframework.beans.factory" level="WARN"/>
  <!-- Mule classes -->
  <AsyncLogger name="org.mule" level="INFO"/>
  <AsyncLogger name="com.mulesoft" level="INFO"/>
  <!-- Reduce DM verbosity -->
  <AsyncLogger name="org.jetel" level="WARN"/>
  <AsyncLogger name="Tracking" level="WARN"/>
  <AsyncRoot level="INFO">
    <AppenderRef ref="file" />
  </AsyncRoot>
  <AsyncLogger name="splunk.logger" level="INFO">
    <AppenderRef ref="Splunktest" />
  </AsyncLogger>
</Loggers>

The Http appender is not included in the log4j2 version shipped with Mule runtime 3.8.4.
As far as I know the latest log4j2 version used in runtime 3.x.x is 2.8.2,
and as you can see from the code here, it doesn't define any Http appender.
The Http appender was introduced in log4j2 2.10.0 (code here), so you have two options:
bundle log4j2 version 2.10.0 in your application and try to configure the classloader override as explained here
extract the Http appender class and its dependencies from version 2.10.0, package them as a jar and import that into your project, see picture below:
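For the first option, bundling a newer log4j2 might look something like this in the application's pom.xml. This is a sketch: the coordinates are the standard log4j2 artifacts, but whether the classloader override actually takes effect depends on your Mule runtime setup.

```xml
<!-- Sketch: bundle a log4j2 version that includes the Http appender (2.10.0+) -->
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-api</artifactId>
  <version>2.10.0</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.10.0</version>
</dependency>
```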
Hope this helps ...


GCP Databricks - Send logs from application jar to Cloud Logging

This is what I am trying to do
Objective: I have a jar file which I am installing on my cluster and then invoking as part of a job. I want to redirect logs from my application (jar) to Cloud Logging.
I have followed this logback guide.
Everything works fine when I run the code from my local machine; I can see logs in Cloud Logging. However, as soon as I create the jar and run it on Databricks, logs are not redirected to Cloud Logging and are only captured by the console.
I think this is what's happening: Apache Spark uses log4j, which gets initialized before my jar is loaded. When Spark loads my jar, slf4j finds the log4j binding on the classpath first and uses it instead of the logback implementation.
Is there a way I can fix this issue?
Code:
import org.slf4j.{Logger, LoggerFactory}

object App {
  def test_slf4j(): Unit = {
    val logger: Logger = LoggerFactory.getLogger(getClass.getName)
    logger.info("using slf4j")
  }

  def main(args: Array[String]): Unit = {
    test_slf4j()
  }
}
logback.xml
<configuration>
  <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
    <!-- Optional: filter logs at or above a level -->
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <level>INFO</level>
    </filter>
    <log>application.log</log> <!-- Optional: default java.log -->
    <resourceType>gae_app</resourceType> <!-- Optional: default auto-detected, fallback: global -->
    <enhancer>com.example.logging.logback.enhancers.ExampleEnhancer</enhancer> <!-- Optional -->
    <flushLevel>INFO</flushLevel> <!-- Optional: default ERROR -->
  </appender>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <Pattern>%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n</Pattern>
    </encoder>
  </appender>
  <logger name="org.apache" level="INFO" additivity="false">
    <appender-ref ref="CLOUD" />
    <appender-ref ref="CONSOLE" />
  </logger>
  <root level="info">
    <appender-ref ref="CLOUD" />
    <appender-ref ref="CONSOLE" />
  </root>
</configuration>
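One knob worth trying for the classpath-ordering theory above: Spark can be asked to prefer classes from the application jar over its own bundled logging jars. These are real Spark configuration properties, but this is only a sketch of a possible fix; whether Databricks honors them depends on the cluster type.

```properties
# Sketch: prefer the application jar's classes (and its slf4j binding)
# over Spark's own; set in the cluster's Spark config
spark.driver.userClassPathFirst true
spark.executor.userClassPathFirst true
```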

Log4j RollingFileAppender writing to previously rolled files

I'm using log4j 2.17.1.
Log4j is rolling files daily but will sometimes write to files it has already rolled. In some cases it is going back several days.
Example:
app.log.2022-01-03 has been overwritten with data from 2022-01-04.
app.log.2022-01-04 has been overwritten with data from 2022-01-10.
app.log.2022-01-11 has been overwritten with data from 2022-01-17.
Is there anything wrong with my configuration here? I just want it to roll everyday.
<Configuration>
  <Appenders>
    <RollingFile name="A1" append="true" fileName="/var/log/app/app.log">
      <PatternLayout pattern="%d{yyyy-MM-dd hh:mm:ss} [%t] %-5p %c %x %m%n" />
      <FilePattern>/var/log/app/app.log.%d{yyyy-MM-dd}</FilePattern>
      <Policies>
        <TimeBasedTriggeringPolicy />
      </Policies>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="A1" />
    </Root>
  </Loggers>
</Configuration>
I believe my issue was caused by multiple LoggerContexts attempting to write to the same RollingFileAppender.
I resolved this problem by following the steps in the Log4j Web Application usage:
https://logging.apache.org/log4j/2.x/manual/webapp.html
I had several web applications deployed to a single JBoss instance. Each app had their own copy of the log4j jars. I moved the log4j jars out of the wars and into a JBoss module to get them into the server classloader.
Then I followed the directions on the Logging Separation page:
https://logging.apache.org/log4j/2.x/manual/logsep.html
and I configured a single LoggerContext for JBoss:
Place the logging jars in the container's classpath and set the system property log4j2.contextSelector to org.apache.logging.log4j.core.selector.BasicContextSelector. This will create a single LoggerContext using a single configuration that will be shared across all applications.
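Concretely, setting the selector for a JBoss instance might look like this in standalone.conf. The property name comes from the Log4j documentation; the exact file to edit is an assumption that depends on your container setup.

```shell
# Append to JAVA_OPTS so every deployed app shares one LoggerContext
JAVA_OPTS="$JAVA_OPTS -Dlog4j2.contextSelector=org.apache.logging.log4j.core.selector.BasicContextSelector"
```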

Get rid of "log4j:WARN No appenders could be found for logger"

I have a simple question, hope I will get a simple answer.
I need a log4j2 xml which will dump ALL logs no matter where they are generated from. Now, the funny thing is that I see all the logs I do not want to see, but logs from my own classes produce the dreaded "log4j:WARN No appenders could be found for logger".
My simple log xml file:
<?xml version="1.0" encoding="UTF-8"?>
<!-- Console Appender -->
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout
pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
</Console>
<!-- File Appender -->
<File name="File"
fileName="./log/abc.log">
<PatternLayout
pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
</File>
</Appenders>
<category name="com.abc.def.config.AppInitializer">
<priority value="DEBUG" />
<appender-ref ref="File" />
</category>
<category name="com.oli">
<priority value="DEBUG" />
<appender-ref ref="File" />
</category>
<Loggers>
<Root level="trace">
<AppenderRef ref="Console" />
<AppenderRef ref="File" />
</Root>
</Loggers>
Can somebody improve this xml file so that I am able to see the logs generated by my class "com.abc.def.config.AppInitializer" in the log file ?
Note, more logs is not bad for me, but missing logs are absolutely not an option. The ultimate goal is to "filter out messages that we do not need" rather than "filter in messages we need".
The message log4j:WARN No appenders could be found for logger is not a Log4j2 warning.
It is coming from a log4j-1.2.x jar that is still on the classpath somewhere.
When migrating to Log4j2, include the log4j-1.2-api bridge jar and make sure to remove any log4j-1.2.x jars from the classpath.
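For completeness, a minimal log4j2.xml combining the appenders from the question with proper log4j2 Logger syntax might look like the sketch below. The log4j 1.x <category>/<priority> elements in the question are not valid log4j2 and are replaced by <Logger> elements here.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
    </Console>
    <File name="File" fileName="./log/abc.log">
      <PatternLayout pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
    </File>
  </Appenders>
  <Loggers>
    <!-- log4j2 equivalents of the log4j 1.x <category> elements -->
    <Logger name="com.abc.def.config.AppInitializer" level="debug" additivity="false">
      <AppenderRef ref="File" />
    </Logger>
    <Logger name="com.oli" level="debug" additivity="false">
      <AppenderRef ref="File" />
    </Logger>
    <Root level="trace">
      <AppenderRef ref="Console" />
      <AppenderRef ref="File" />
    </Root>
  </Loggers>
</Configuration>
```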
From the config file you provided this seems quite good. You should see your log messages on the console as well as in the file.
The warning you get at the very beginning already gives you a hint: the system is not able to find your config file. So how did you name it, and where did you put it? Log4j2 looks for its configuration in the following order:
1. Log4j will inspect the "log4j.configurationFile" system property and, if set, will attempt to load the configuration using the ConfigurationFactory that matches the file extension.
2. If no system property is set, the properties ConfigurationFactory will look for log4j2-test.properties in the classpath.
3. If no such file is found, the YAML ConfigurationFactory will look for log4j2-test.yaml or log4j2-test.yml in the classpath.
4. If no such file is found, the JSON ConfigurationFactory will look for log4j2-test.json or log4j2-test.jsn in the classpath.
5. If no such file is found, the XML ConfigurationFactory will look for log4j2-test.xml in the classpath.
6. If a test file cannot be located, the properties ConfigurationFactory will look for log4j2.properties on the classpath.
7. If a properties file cannot be located, the YAML ConfigurationFactory will look for log4j2.yaml or log4j2.yml on the classpath.
8. If a YAML file cannot be located, the JSON ConfigurationFactory will look for log4j2.json or log4j2.jsn on the classpath.
9. If a JSON file cannot be located, the XML ConfigurationFactory will try to locate log4j2.xml on the classpath.
10. If no configuration file could be located, the DefaultConfiguration will be used. This will cause logging output to go to the console.
This list is taken from the Log4j2 documentation.
Hope that helps. If not, feel free to post your code (a GitHub link would be nice) so we can check in more depth.
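For the first step in the lookup order, pointing log4j2 at an explicit file is often the quickest sanity check. This is a sketch: app.jar and the config path are placeholders for your own artifact and file.

```shell
# Bypass the classpath search entirely and name the config file directly
java -Dlog4j.configurationFile=path/to/log4j2.xml -jar app.jar
```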

log4j2- ERROR Appenders contains an invalid element or attribute "Flume"

I am trying to use the Flume appender of log4j2, but the following errors appear when I run the program.
2016-01-20 16:36:42,436 main ERROR Appenders contains an invalid element or attribute "Flume"
2016-01-20 16:36:42,436 main ERROR Appenders contains an invalid element or attribute "Flume"
2016-01-20 16:36:42,446 main ERROR Unable to locate appender "eventLogger" for logger config "root"
2016-01-20 16:36:42,446 main ERROR Unable to locate appender "eventLogger" for logger config "root"
The log4j2.xml file is:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n" />
    </Console>
    <Flume name="eventLogger" compress="false" type="Avro">
      <Agent host="192.168.8.50" port="41414"/>
    </Flume>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Console" />
      <AppenderRef ref="eventLogger" />
    </Root>
  </Loggers>
</Configuration>
And in .java code
LoggerContext context = (org.apache.logging.log4j.core.LoggerContext) LogManager.getContext(false);
File file = new File("src/log4j2.xml");
context.setConfigLocation(file.toURI());
How can I figure out the problem? Maybe log4j2 doesn't work properly.
I encountered a similar situation where even the Appender element was not recognized under Appenders. The fix is to add strict="true" in the configuration, like below:
<Configuration status="WARN" strict="true">
Do you have the log4j flume jar in your classpath? If you set status to debug you should see more information.
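On that last point: the Flume appender ships in a separate artifact, so the configuration alone is not enough. A sketch of the Maven dependency follows; the version is an example contemporary with the question and should match your log4j-core version.

```xml
<!-- The FlumeAppender lives in its own artifact, separate from log4j-core -->
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-flume-ng</artifactId>
  <version>2.5</version>
</dependency>
```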

Configure Log4j in Camel context

Is it possible to configure a Camel route to send a message to a specific log4j logger? For example, I have the following logger:
<logger name="com.me.log.mylogger" additivity="false">
<level value="debug" />
<appender-ref ref="file_appender_messages" />
</logger>
file_appender_messages is just a RollingFileAppender.
I then try to log it using the following in my Camel context:
<to uri="log:com.me.log.mylogger?level=INFO" />
But it outputs on the command line instead of the log file specified in file_appender_messages:
25-Oct-2012 11:46:44 org.apache.camel.processor.CamelLogger log
INFO: [MESSAGE BODY]
I would like to be able to use different loggers for messages from different sources. I could do it in my message processors, but ideally it could be configured in the route XML. Can it be done?
Camel has been using slf4j for some time now, so you first have to configure slf4j to use log4j as its backend. In Maven, add the following dependencies:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.6.1</version>
</dependency>
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.16</version>
</dependency>
I fixed this by defining the Logger as a Bean in my application XML file
<bean id="myLogger" class="org.apache.log4j.Logger" factory-method="getLogger">
  <constructor-arg value="com.me.log.mylogger"/>
</bean>
Then in my route, when I want to log the message I just direct it to the relevant method on the Bean
<to uri="bean:myLogger?method=info" />
