How to configure log4j and Selenium Grid?

When I launch a Grid hub using Ant, all logging goes to the console by default. Is there a way to alter Selenium Grid's build.xml file and wire a log4j logger into it? I understand that I can pass a log file to Ant itself using the -logfile option, but that file would be overwritten every time the grid is launched. I want the log files to be renamed automatically once a threshold is reached, for easier maintenance. Any help, and if possible some examples of how to do it, would be greatly appreciated (I am new to log4j, which is why I am asking for a sample for this specific need).

You need to specify the location of the log4j JAR as well as its configuration file when starting up Selenium. In the following example, I've updated the classpath to load the log4j.hub.properties configuration file from the /etc/selenium directory and log4j.jar from the /usr/lib/selenium directory. I've also set up some additional log files:
java -classpath /etc/selenium:/usr/lib/selenium/log4j.jar:/usr/lib/selenium/selenium-server-standalone.jar -Dlog4j.configuration=log4j.hub.properties org.openqa.grid.selenium.GridLauncher -role hub -log /var/log/selenium/hub.debug.log > /var/log/selenium/hub.output.log 2> /var/log/selenium/hub.error.log &
Then you can put something like the following in log4j.hub.properties to achieve what you are looking for:
log4j.rootLogger=ALL, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/var/log/selenium/hub.log
log4j.appender.file.Append=true
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} <%p> <%t> %m%n
You might want to look at this page for more information about how Selenium logging works.
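Since the question specifically asks for log files that are rolled over once a threshold is reached, a RollingFileAppender is a closer fit than the plain FileAppender above. A minimal sketch, reusing the same path and pattern (the 10 MB threshold and backup count are illustrative values, not anything Selenium requires):
log4j.rootLogger=ALL, file
# roll the file once it reaches 10 MB and keep the five most recent rolled files
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/selenium/hub.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} <%p> <%t> %m%n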

Related

flyway commandline, How to log the messages in a log file during migration

This question is about logging the messages to a log file while migrating through the Flyway command line. I went through the Stack Overflow links below and followed the steps they mention, but couldn't work out the steps needed to see the messages in a log file.
How configure logging for Flyway command line
Flyway logging with log4j?
Flyway logging with Logback
I have placed log4j-1.2.17.jar, logback-classic-1.1.7.jar, logback-core-1.1.7.jar and slf4j-api-1.7.21.jar under the flyway/lib folder and placed logback.xml in the conf location (I also tried moving it out of the conf location).
The lib folder is on the classpath in the flyway and flyway.cmd files.
But I always see the debug messages on stdout, and no log file is being created.
Flyway version 4.2.0
Could someone share the steps needed to write the log messages to a log file during migrate/info?
Keep logback.xml inside the conf directory, then edit the launcher script for your environment: flyway (Linux/macOS) or flyway.cmd (Windows).
In flyway, replace the line
CP="$INSTALLDIR/lib/*:$INSTALLDIR/drivers/*"
with
CP="$INSTALLDIR/conf:$INSTALLDIR/lib/*:$INSTALLDIR/drivers/*"
In flyway.cmd, replace the line
%JAVA_CMD% -cp "%INSTALLDIR%\lib\*;%INSTALLDIR%\drivers\*" org.flywaydb.commandline.Main %*
with
%JAVA_CMD% -cp "%INSTALLDIR%\conf;%INSTALLDIR%\lib\*;%INSTALLDIR%\drivers\*" org.flywaydb.commandline.Main %*
(note that on Windows the classpath entries are separated by semicolons)
Explanation:
The conf directory is not on the classpath in the launcher scripts, so it has to be added there for logback.xml to be readable from the classpath.
Wherever you put the logback.xml file, that location has to be on the classpath.
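For completeness, a minimal logback.xml sketch that sends the output to a file (the file name and pattern here are illustrative choices, not something Flyway prescribes):
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <!-- the log file is created relative to the working directory unless an absolute path is given -->
    <file>flyway-migration.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="debug">
    <appender-ref ref="FILE" />
  </root>
</configuration>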

Logstash 5 configure log4j logging for itself (not as plugin)

This is just for future reference since I solved it myself.
When I switched from Logstash 2.x to 5.x, I was dealing with this warning (while running my Logstash from the path D:\somepath\logstash-5.0.1):
Could not find log4j2 configuration at path /somepath/logstash-5.0.1/config/log4j2.properties. Using default config which logs to console
After some searching on the internet and digging in the Ruby code (in the extracted Logstash) I found out the following:
it is necessary to use path.settings (as mentioned many times), and
the file or directory must be given correctly as a URL path.
Finally I ran my Logstash as:
logstash.bat --path.settings=file://D:/somepath/logstash-5.0.1/config
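Once the settings directory is resolved, the log4j2.properties inside it drives Logstash's own logging. As a point of reference, a minimal sketch in Log4j 2's properties format (the appender name and pattern are illustrative; Logstash 5.x ships its own default file):
status = error
name = LogstashConfig

# console appender; a RollingFile appender could be defined here instead to log to files
appender.console.type = Console
appender.console.name = console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

rootLogger.level = info
rootLogger.appenderRef.console.ref = console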

Configure log4j for an external jar

In my project (myProject) I use an external jar (external.jar). Both of them log via log4j.jar. With the help of a log4j.properties file (located in myProject) I can configure logging from myProject. How can I configure the log levels of the logging coming from external.jar without changing that jar file?
Simply adding the package from external.jar (let's say org.external) to the properties file as
log4j.logger.org.external=ERROR does not make any difference.
Here I have found the solution.
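For reference, per-package levels are normally tuned as below (org.external stands in for the library's real package names). If an entry like this has no effect, starting the JVM with -Dlog4j.debug=true makes log4j print which configuration file it actually loaded, for example one bundled inside external.jar:
# project-wide default sent to a simple console appender
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n
# only errors from the external library's packages
log4j.logger.org.external=ERROR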

How to get rid of "Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties" message?

I am trying to suppress the message
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
when I run my Spark app. I've redirected the INFO messages successfully, but this message keeps showing up. Any ideas would be greatly appreciated.
Even simpler: cd into SPARK_HOME/conf, then mv log4j.properties.template log4j.properties, then open log4j.properties and change all INFO to ERROR. Here SPARK_HOME is the root directory of your Spark installation.
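Assuming the stock template, that edit boils down to the root entry in the renamed log4j.properties looking something like this (the rest of the template can stay as shipped):
# root logger quieted from INFO to ERROR; output still goes to the template's console appender
log4j.rootCategory=ERROR, console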
Some may be using HDFS as their Spark storage backend and will find the logging messages are actually generated by HDFS. To alter this, go to the HADOOP_HOME/etc/hadoop/log4j.properties file and simply change hadoop.root.logger=INFO,console to hadoop.root.logger=ERROR,console. Once again, HADOOP_HOME is the root of your Hadoop installation; for me this was /usr/local/hadoop.
Okay, so I've figured out a way to do this. Basically, I initially had my own log4j.xml that was being used, and that is when we were seeing this message. Once I switched to my own log4j.properties file, the message went away.
If you put a log4j.properties file under both main/resources and test/resources, this also occurs. In that case, deleting the file from test/resources and keeping only the one in main/resources fixes the issue.
None of the answers above worked for me using SBT. It turns out you need to explicitly define an appender in your log4j.properties, such as:
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss} %-5p %c{1}:%L - %m%n
log4j.rootLogger=WARN, stdout
log4j.logger.org.apache.spark=WARN, stdout
log4j.logger.com.yourcompany=INFO, stdout
Put this in your resources directory and Bob's your uncle!

WARN No appenders could be found for logger (org.apache.accumulo.start.classloader.AccumuloClassLoader)

Does anyone know how to get rid of the following warnings when starting accumulo:
log4j:WARN No appenders could be found for logger (org.apache.accumulo.start.classloader.AccumuloClassLoader).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
I am running Accumulo 1.4.0, Hadoop 0.20.2 and ZooKeeper 3.3.3. I understand this warning happens because the class cannot find the log4j.properties file, and yes, I have read http://logging.apache.org/log4j/1.2/manual.html. My log4j.properties file has the following lines, copied from an Accumulo 1.4.3 log4j file (I don't have the option to upgrade my system to 1.4.3):
# default logging properties:
# by default, log everything at INFO or higher to the console
log4j.rootLogger=INFO,A1
# hide Jetty junk
log4j.logger.org.mortbay.log=WARN,A1
# hide "Got brand-new compresssor" messages
log4j.logger.org.apache.hadoop.io.compress=WARN,A1
# hide junk from TestRandomDeletes
log4j.logger.org.apache.accumulo.server.test.TestRandomDeletes=WARN,A1
# hide almost everything from zookeeper
log4j.logger.org.apache.zookeeper=ERROR,A1
# hide AUDIT messages in the shell, alternatively you could send them to a different logger
log4j.logger.org.apache.accumulo.core.util.shell.Shell.audit=WARN,A1
# Send most things to the console
log4j.appender.A1=org.apache.log4j.ConsoleAppender
log4j.appender.A1.layout.ConversionPattern=%d{ISO8601} [%-8c{2}] %-5p: %m%n
log4j.appender.A1.layout=org.apache.log4j.PatternLayout
I have put this log4j file everywhere: in the accumulo/bin folder, in the accumulo/conf folder, and in the accumulo/lib folder, but I cannot get rid of this warning (I know it has to go on the Accumulo classpath, but I don't know where that is). I also can't pass a log4j.configuration option to the java command, because the accumulo executable comes prebuilt (I just run it).
Thanks in advance for the help.
EDIT: Below is the result of an "accumulo classpath" command on my system:
[admin-cloud#NODE1 bin]$ echo $ACCUMULO_HOME
/accumulo/accumulo-1.4.0
[admin-cloud#NODE1 bin]$ accumulo classpath
Accumulo List of classpath items are:
file:/accumulo/accumulo-1.4.0/lib/commons-collections-3.2.jar
file:/accumulo/accumulo-1.4.0/lib/commons-configuration-1.5.jar
file:/accumulo/accumulo-1.4.0/lib/log4j-1.2.16.jar
file:/accumulo/accumulo-1.4.0/lib/libthrift-0.6.1.jar
file:/accumulo/accumulo-1.4.0/lib/commons-jci-core-1.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-lang-2.4.jar
file:/accumulo/accumulo-1.4.0/lib/commons-logging-api-1.0.4.jar
file:/accumulo/accumulo-1.4.0/lib/accumulo-server-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/accumulo-start-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-jci-fam-1.0.jar
file:/accumulo/accumulo-1.4.0/lib/jline-0.9.94.jar
file:/accumulo/accumulo-1.4.0/lib/examples-simple-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/cloudtrace-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-logging-1.0.4.jar
file:/accumulo/accumulo-1.4.0/lib/accumulo-core-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-io-1.4.jar
file:/zookeeper/zookeeper-3.3.6/zookeeper-3.3.6.jar
file:/hadoop/hadoop-0.20.2/conf/
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-examples.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-test.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-tools.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-ant.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-core.jar
file:/hadoop/hadoop-0.20.2/lib/log4j-1.2.15.jar
file:/hadoop/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar
file:/hadoop/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar
file:/hadoop/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar
file:/hadoop/hadoop-0.20.2/lib/mockito-all-1.8.0.jar
file:/hadoop/hadoop-0.20.2/lib/jetty-6.1.14.jar
file:/hadoop/hadoop-0.20.2/lib/oro-2.0.8.jar
file:/hadoop/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar
file:/hadoop/hadoop-0.20.2/lib/junit-3.8.1.jar
file:/hadoop/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar
file:/hadoop/hadoop-0.20.2/lib/commons-codec-1.3.jar
file:/hadoop/hadoop-0.20.2/lib/core-3.1.1.jar
file:/hadoop/hadoop-0.20.2/lib/jets3t-0.6.1.jar
file:/hadoop/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar
file:/hadoop/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar
file:/hadoop/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar
file:/hadoop/hadoop-0.20.2/lib/jetty-util-6.1.14.jar
file:/hadoop/hadoop-0.20.2/lib/commons-net-1.4.1.jar
file:/hadoop/hadoop-0.20.2/lib/commons-logging-1.0.4.jar
file:/hadoop/hadoop-0.20.2/lib/commons-cli-1.2.jar
file:/hadoop/hadoop-0.20.2/lib/xmlenc-0.52.jar
file:/hadoop/hadoop-0.20.2/lib/kfs-0.2.2.jar
file:/hadoop/hadoop-0.20.2/lib/commons-el-1.0.jar
Line 84 of bin/accumulo in Apache Accumulo 1.4.0 sets the variable XML_FILES to $ACCUMULO_HOME/conf and then adds XML_FILES to the CLASSPATH variable which is later passed to the java command.
https://svn.apache.org/repos/asf/accumulo/tags/1.4.0/bin/accumulo
It sounds like you have a misconfiguration of ACCUMULO_HOME, either in your shell environment or in $ACCUMULO_HOME/conf/accumulo-env.sh.
I was troubleshooting an installation someone else set up that was having the same problem. The solution was simply that there actually was no log4j.properties in the conf directory! So I just copied over one of the log4j.properties files from the conf/examples directory, restarted, and everything worked like it should.
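For anyone in the same situation, the fix amounts to something like the following; the examples subdirectory shown is illustrative, so use whichever profile your installation actually ships:
# copy an example config into the conf directory that bin/accumulo puts on the classpath
cp $ACCUMULO_HOME/conf/examples/512MB/standalone/log4j.properties $ACCUMULO_HOME/conf/log4j.properties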

Resources