Solr 7.5: log to Application Insights? (Azure)

I'm trying to get my Solr instance to log to Application Insights. Solr is running in Docker, and I tried to follow something similar to this:
Solr to Application Insights
However, that example uses an older version of Solr that uses log4j, not log4j2.
I've tried downloading the Application Insights jars into a folder (/opt/solr/server/lib) and loading them via plugins. The logs suggest the jars are loaded, but I still get an error about my log4j2.xml config saying the class for the Insights appender cannot be found.
The config is as follows:
<Configuration packages="com.microsoft.applicationinsights.log4j.v2">
  <Appenders>
    <ApplicationInsightsAppender name="aiAppender" instrumentationKey="key-here">
      <PatternLayout>
        <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) [%X{collection} %X{shard} %X{replica} %X{core}] %c{1.} %m%n</Pattern>
      </PatternLayout>
    </ApplicationInsightsAppender>
......
Any ideas where I'm going wrong?

I believe the following is the configuration for log4j (1.x):
<appender name="aiAppender"
          class="com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender">
  <param name="instrumentationKey" value="[APPLICATION_INSIGHTS_KEY]" />
</appender>
<root>
  <priority value="trace" />
  <appender-ref ref="aiAppender" />
</root>
Reference:
https://learn.microsoft.com/en-gb/azure/azure-monitor/app/java-trace-logs
Try this and see if it helps.
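For the log4j2.xml shown in the question, the appender also has to be referenced from a logger before anything is sent to Application Insights. A minimal sketch of that wiring, reusing the aiAppender defined above (the root level here is an assumption), would be:
<Loggers>
  <Root level="info">
    <AppenderRef ref="aiAppender" />
  </Root>
</Loggers>
This only takes effect once the appender class itself is visible to log4j2 when Solr starts, which is what the "class cannot be found" error is about.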

Related

GCP Databricks - Send logs from application jar to Cloud Logging

This is what I am trying to do
Objective: I have a jar file which I am installing on my cluster and then invoking as part of a job. I want to redirect logs from my application (jar) to Cloud Logging.
I have followed this logback guide.
Everything works fine when I run the code from my local machine; I can see logs in Cloud Logging. However, as soon as I create the jar and run it on Databricks, the logs are not redirected to Cloud Logging; they are only captured by the console.
I think this is what's happening: Apache Spark uses log4j, and it gets initialized before my jar is loaded. When my jar is loaded by Spark, slf4j finds the log4j binding already on the classpath and uses it rather than the logback implementation (one way to check this is sketched after the logback.xml below).
Is there a way I can fix this issue?
Code:
import org.slf4j.{Logger, LoggerFactory}

object App {
  def test_slf4j(): Unit = {
    val logger: Logger = LoggerFactory.getLogger(getClass.getName)
    logger.info("using slf4j ")
  }

  def main(args: Array[String]): Unit = {
    test_slf4j()
  }
}
logback.xml
<configuration>
  <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
    <!-- Optional: filter logs at or above a level -->
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <level>INFO</level>
    </filter>
    <log>application.log</log> <!-- Optional: default java.log -->
    <resourceType>gae_app</resourceType> <!-- Optional: default auto-detected, fallback: global -->
    <enhancer>com.example.logging.logback.enhancers.ExampleEnhancer</enhancer> <!-- Optional -->
    <flushLevel>INFO</flushLevel> <!-- Optional: default ERROR -->
  </appender>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <Pattern>%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n</Pattern>
    </encoder>
  </appender>
  <logger name="org.apache" level="INFO" additivity="false">
    <appender-ref ref="CLOUD" />
    <appender-ref ref="CONSOLE" />
  </logger>
  <root level="info">
    <appender-ref ref="CLOUD" />
    <appender-ref ref="CONSOLE" />
  </root>
</configuration>
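One way to check whether logback is being initialized at all on the cluster is to turn on logback's own status output (a diagnostic sketch, not a fix). If no logback status lines show up in the driver output when the job runs, slf4j has bound to Spark's log4j and the logback.xml above is never read, which would confirm the theory described above.
<configuration debug="true">
  <!-- debug="true" makes logback print its own initialization status to the console -->
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="CONSOLE" />
  </root>
</configuration>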

Log4j RollingFileAppender writing to previously rolled files

I'm using log4j 2.17.1.
Log4j is rolling files daily but will sometimes write to files it has already rolled. In some cases it is going back several days.
Example:
app.log.2022-01-03 has been overwritten with data from 2022-01-04.
app.log.2022-01-04 has been overwritten with data from 2022-01-10.
app.log.2022-01-11 has been overwritten with data from 2022-01-17.
Is there anything wrong with my configuration here? I just want it to roll every day.
<Configuration>
  <Appenders>
    <RollingFile name="A1" append="true" fileName="/var/log/app/app.log">
      <PatternLayout pattern="%d{yyyy-MM-dd hh:mm:ss} [%t] %-5p %c %x %m%n" />
      <FilePattern>/var/log/app/app.log.%d{yyyy-MM-dd}</FilePattern>
      <Policies>
        <TimeBasedTriggeringPolicy />
      </Policies>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="A1" />
    </Root>
  </Loggers>
</Configuration>
I believe my issue was caused by multiple LoggerContexts attempting to write to the same RollingFileAppender.
I resolved this problem by following the steps on the Log4j web applications page:
https://logging.apache.org/log4j/2.x/manual/webapp.html
I had several web applications deployed to a single JBoss instance. Each app had its own copy of the log4j jars. I moved the log4j jars out of the WARs and into a JBoss module to get them into the server classloader.
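A rough sketch of such a module descriptor (module.xml) is below; the module name and jar versions are assumptions and depend on the installation:
<?xml version="1.0" encoding="UTF-8"?>
<module xmlns="urn:jboss:module:1.1" name="org.apache.logging.log4j">
  <resources>
    <!-- the exact jar names/versions depend on your log4j version -->
    <resource-root path="log4j-api-2.17.1.jar" />
    <resource-root path="log4j-core-2.17.1.jar" />
  </resources>
</module>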
Then I followed the directions on the Logging Separation page:
https://logging.apache.org/log4j/2.x/manual/logsep.html
and I configured a single LoggerContext for JBoss:
Place the logging jars in the container's classpath and set the system property log4j2.contextSelector to org.apache.logging.log4j.core.selector.BasicContextSelector. This will create a single LoggerContext using a single configuration that will be shared across all applications.
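As a sketch, one way to set that property for the whole JBoss instance is a system-properties entry in standalone.xml (exact placement depends on the JBoss version; passing -Dlog4j2.contextSelector=... on the JVM command line works as well):
<system-properties>
  <property name="log4j2.contextSelector"
            value="org.apache.logging.log4j.core.selector.BasicContextSelector" />
</system-properties>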

The old log files didn't get removed when using logback RollingFileAppender

My logback.xml file is as follows, and I set MaxHistory=1 in order to delete old log files and keep the logs for only one day. But I found the old files didn't get removed; I can still see them, e.g. app.log.2019-02-11 and app.log.2019-02-12.
<configuration>
  <property name="APP_NAME" value="logbacktest-logs" />
  <property name="LOG_HOME" value="/tmp/${APP_NAME}" />
  <property name="ENCODER_PATTERN" value="%d %C.%method:%L _ %msg%n" />
  <contextName>${APP_NAME}</contextName>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>${ENCODER_PATTERN}</pattern>
    </encoder>
  </appender>
  <appender name="APP_APPEND" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>${LOG_HOME}/app.log.%d{yyyy-MM-dd}</fileNamePattern>
      <MaxHistory>1</MaxHistory>
    </rollingPolicy>
    <encoder>
      <pattern>${ENCODER_PATTERN}</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT" />
    <appender-ref ref="APP_APPEND" />
  </root>
</configuration>
This is part of my log:
12:11:32,358 |-INFO in c.q.l.core.rolling.DefaultTimeBasedFileNamingAndTriggeringPolicy - Roll-over at midnight.
That roll-over happens at midnight, which means that if your application is not up at that time, it never gets a chance to delete the old files.
I faced the same issue when I set maxHistory to 5: sometimes it deleted the old log files, but when my application died it could not delete them because it never got the chance.
Referring to the logback docs, you can use this:
<cleanHistoryOnStart> true </cleanHistoryOnStart>
If set to true, archive removal will be executed on appender start up. By default this property is set to false.
Archive removal is normally performed during roll over. However, some applications may not live long enough for roll over to be triggered. It follows that for such short-lived applications archive removal may never get a chance to execute. By setting cleanHistoryOnStart to true, archive removal is performed at appender start up.
Modify the rollingPolicy in your logback.xml to:
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
  <fileNamePattern>${LOG_HOME}/app.log.%d{yyyy-MM-dd}</fileNamePattern>
  <maxHistory>1</maxHistory>
  <cleanHistoryOnStart>true</cleanHistoryOnStart>
</rollingPolicy>

Get rid of "log4j:WARN No appenders could be found for logger"

I have a simple question and hope I will get a simple answer.
I need a log4j2 XML config which will dump ALL logs, no matter where they are generated from. Now, the funny thing is that I see all the logs that I do not want to see, but the logging from my own code only produces the dreaded "log4j:WARN No appenders could be found for logger".
My simple log xml file:
<?xml version="1.0" encoding="UTF-8"?>
<!-- Console Appender -->
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout
pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
</Console>
<!-- File Appender -->
<File name="File"
fileName="./log/abc.log">
<PatternLayout
pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
</File>
</Appenders>
<category name="com.abc.def.config.AppInitializer">
<priority value="DEBUG" />
<appender-ref ref="File" />
</category>
<category name="com.oli">
<priority value="DEBUG" />
<appender-ref ref="File" />
</category>
<Loggers>
<Root level="trace">
<AppenderRef ref="Console" />
<AppenderRef ref="File" />
</Root>
</Loggers>
Can somebody improve this xml file so that I am able to see the logs generated by my class "com.abc.def.config.AppInitializer" in the log file?
Note: more logs are not a problem for me, but missing logs are absolutely not an option. The ultimate goal is to "filter out messages that we do not need" rather than "filter in messages we need".
The error message log4j:WARN No appenders could be found for logger is not a Log4j2 warning.
It is coming from a log4j-1.2.x jar that is still on the classpath somewhere.
When migrating to Log4j2, include the log4j-1.2-api jar and make sure to remove any log4j-1.2.x jars from the classpath.
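Separately, the posted file mixes Log4j 1.x elements (category, priority, appender-ref) into what should be a Log4j 2 configuration, and it is missing the Configuration/Appenders wrapper. A sketch of a pure Log4j 2 version of the same appenders and loggers, keeping the question's names and levels and saved as log4j2.xml on the classpath, would look roughly like this:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration>
  <Appenders>
    <!-- Console Appender -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
    </Console>
    <!-- File Appender -->
    <File name="File" fileName="./log/abc.log">
      <PatternLayout pattern="%d{yyyy-MMM-dd HH:mm:ss a} [%t] %-5level %logger{36} - %msg%n" />
    </File>
  </Appenders>
  <Loggers>
    <!-- Log4j 2 uses Logger elements instead of the 1.x category/priority elements;
         these loggers also inherit the Root appenders unless additivity="false" is set -->
    <Logger name="com.abc.def.config.AppInitializer" level="debug">
      <AppenderRef ref="File" />
    </Logger>
    <Logger name="com.oli" level="debug">
      <AppenderRef ref="File" />
    </Logger>
    <Root level="trace">
      <AppenderRef ref="Console" />
      <AppenderRef ref="File" />
    </Root>
  </Loggers>
</Configuration>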
The config file you provided seems quite good. You should see your log messages on the console as well as in the file.
The warning you get at the very beginning already gives you a hint: the system is not able to find your config file. So how did you name it, and where did you put it? The way log4j2 looks for your configuration is the following:
1. Log4j will inspect the "log4j.configurationFile" system property and, if set, will attempt to load the configuration using the ConfigurationFactory that matches the file extension.
2. If no system property is set, the properties ConfigurationFactory will look for log4j2-test.properties in the classpath.
3. If no such file is found, the YAML ConfigurationFactory will look for log4j2-test.yaml or log4j2-test.yml in the classpath.
4. If no such file is found, the JSON ConfigurationFactory will look for log4j2-test.json or log4j2-test.jsn in the classpath.
5. If no such file is found, the XML ConfigurationFactory will look for log4j2-test.xml in the classpath.
6. If a test file cannot be located, the properties ConfigurationFactory will look for log4j2.properties on the classpath.
7. If a properties file cannot be located, the YAML ConfigurationFactory will look for log4j2.yaml or log4j2.yml on the classpath.
8. If a YAML file cannot be located, the JSON ConfigurationFactory will look for log4j2.json or log4j2.jsn on the classpath.
9. If a JSON file cannot be located, the XML ConfigurationFactory will try to locate log4j2.xml on the classpath.
10. If no configuration file could be located, the DefaultConfiguration will be used. This will cause logging output to go to the console.
This list is taken from the Log4j2 documentation.
Hope that helps. If not, feel free to post your code (a GitHub link would be nice) so we can check in more depth.

How to make Log4j write only to the console

My console appender in Log4j writes to server.log as well as to the console. How do I make it write only to the console?
Currently it is:
<appender name="console" class="org.apache.log4j.ConsoleAppender">
  <param name="Target" value="System.out" />
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p %M (%C{1}:%L) – %m%n" />
  </layout>
</appender>
Most probably that is not what is happening. Your console appender writes only to the console, but you have defined a file appender somewhere else. Keep in mind that this appender could be attached to any ancestor of the logger where the logging occurs, including the root logger, and any logger can have several appenders. If so, logging will also go to that file appender.
Another possibility is to check that the application and you are using the same log4j configuration file. Sometimes log4j picks up a different configuration file than the one you think it uses.
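To make the first point concrete, this is the kind of pattern to look for in the log4j XML; the FILE appender name and file path here are made up for illustration:
<appender name="FILE" class="org.apache.log4j.FileAppender">
  <param name="File" value="server.log" />
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p %c - %m%n" />
  </layout>
</appender>
<root>
  <priority value="info" />
  <appender-ref ref="console" />
  <appender-ref ref="FILE" /> <!-- an extra reference like this is what sends the same output to a file -->
</root>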
