Log4j logs hidden or swallowed when deployed to CloudFoundry

I've just deployed my first app to CloudFoundry, and I use log4j. When I deploy the app to a local Tomcat server, the logs print just fine. But when I use the "vmc logs" command to get the logs from the instance on CloudFoundry, I only get the Tomcat initialization logs and this message:
log4j:WARN No appenders could be found for logger (org.springframework.web.context.ContextLoader).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Nothing further that I've logged with log4j is visible. System.out.println messages show up, but no log4j messages.
I've placed my log4j.properties file in my WEB-INF directory, and here are its contents:
# Set root logger level to INFO and its only appender to A1.
log4j.rootLogger=INFO, A1
# A1 is set to be a ConsoleAppender.
log4j.appender.A1=org.apache.log4j.ConsoleAppender
# A1 uses PatternLayout.
log4j.appender.A1.layout=org.apache.log4j.PatternLayout
log4j.appender.A1.layout.ConversionPattern=%-5p %-35c{1} %m%n
log4j.logger.org.springframework=WARN
My logger object is created in my classes as you would probably expect:
import org.apache.log4j.Logger;
private static Logger log = Logger.getLogger(MyClass.class);
Any suggestions as to what configuration I'm missing to have my log4j logs show up in my CloudFoundry logs? Or am I retrieving them incorrectly?

Is Log4j set to output to STDOUT by default? 'vmc logs' will only return the contents of STDOUT, STDERR, and the staging log files.
If your app is logging to a different file, use 'vmc files' to view the content.
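If you want to be explicit about where the output goes, log4j's ConsoleAppender has a Target option; a minimal sketch based on the properties file above:
log4j.appender.A1=org.apache.log4j.ConsoleAppender
# ConsoleAppender writes to System.out by default; setting Target makes this explicit
log4j.appender.A1.Target=System.out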

Related

Which logger should I use to get my data in Cloud Logging

I am running a PySpark job using Cloud Dataproc and want to log information using Python's logging module. The goal is then to push these logs to Cloud Logging.
From this question, I learned that I can achieve this by adding a logfile to the fluentd configuration, which is located at /etc/google-fluentd/google-fluentd.conf.
However, when I look at the log files in /var/log, I cannot find the files that contain my logs. I've tried using the default python logger and the 'py4j' logger.
import logging

logger = logging.getLogger()
logger = logging.getLogger('py4j')
Can anyone shed some light on which logger I should use, and which file should be added to the fluentd configuration?
Thanks
tl;dr
This is not natively supported yet, but it will be in a future version of Cloud Dataproc. In the interim, there is a manual workaround.
Workaround
First, make sure you are sending the Python logs to the correct log4j logger from the Spark context. To do this, declare your logger as:
import pyspark
sc = pyspark.SparkContext()
logger = sc._jvm.org.apache.log4j.Logger.getLogger(__name__)
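Once declared this way, the usual log4j level methods are proxied through py4j to the JVM, so calls such as logger.info(...) and logger.warn(...) will flow through the driver's log4j configuration described below.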
The second part involves a workaround that isn't natively supported yet. If you look at Spark's log4j properties file under
/etc/spark/conf/log4j.properties
on the master node of your cluster, you can see how log4j is configured for Spark. Currently, it looks like the following:
# Set everything to be logged to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
# Settings to quiet third party logs that are too verbose
...
Note that this means log4j logs are sent only to the console. The Dataproc agent will pick up this output and return it as the job driver output. However, in order for fluentd to pick up the output and send it to Google Cloud Logging, log4j needs to write to a local file. Therefore you will need to modify the log4j properties as follows:
# Set everything to be logged to the console and a file
log4j.rootCategory=INFO, console, file
# Set up console appender.
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
# Set up file appender.
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/spark/spark-log4j.log
log4j.appender.file.MaxFileSize=512KB
log4j.appender.file.MaxBackupIndex=3
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
# Settings to quiet third party logs that are too verbose
...
If you set the file to /var/log/spark/spark-log4j.log as shown above, the default fluentd configuration on your Dataproc cluster should pick it up. If you want to set the file to something else, you can follow the instructions in this question to get fluentd to pick up that file.

logger.isDebugEnabled() always returns false

I am new to log4j, and I have the following log4j.properties file in my Java application. I am working on this in WebSphere 6.1.
log4j.properties file:
log4j.rootLogger=info, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.package_name=debug
However, this works only if I am using a single project as part of my application. If there are multiple projects and I want to use the logging facility, logger.isDebugEnabled() always returns false. Can anybody suggest a solution for this?
Thanks in advance
WebSphere uses one classloader per EAR by default. If you have several web modules or EJB modules and several log4j files, only one is loaded by the classloader.
See A Powerful, Easy-to-Use Logging System for how to configure log4j with several projects in an EAR.
# Set root logger level to INFO and appender to STDOUT.
log4j.rootLogger=INFO, STDOUT
#------------------------------------STDOUT-----------------------------------#
# STDOUT is set to be a ConsoleAppender.
log4j.appender.STDOUT=org.apache.log4j.ConsoleAppender
# STDOUT uses PatternLayout.
log4j.appender.STDOUT.layout=org.apache.log4j.PatternLayout
log4j.appender.STDOUT.layout.ConversionPattern=%d %-5p (%c.java:%L).%M - %m%n
log4j.appender.STDOUT.Encoding=UTF-8
#-----------------------------------------------------------------------------#
# Specify the logging level for loggers from other libraries
log4j.logger.org.apache.commons.beanutils.BeanUtils=DEBUG
log4j.logger.org.apache.struts.action=DEBUG
log4j.logger.org.apache.struts.tiles=DEBUG
log4j.logger.org.apache.struts.util.ModuleUtils=DEBUG
log4j.logger.org.apache.struts.util.RequestUtils=DEBUG
log4j.logger.org.apache.struts.util.PropertyMessageResources=ERROR
log4j.logger.com.ibm._jsp=DEBUG
Maybe you are missing the log4j.logger. prefix for each particular package.
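For example, the last line of the properties file above would become (a sketch; package_name stands in for your application's real package):
# note the log4j.logger. prefix and the conventional upper-case level name
log4j.logger.package_name=DEBUG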
See more on log4j at http://logging.apache.org/log4j/1.2/manual.html

WARN No appenders could be found for logger (org.apache.accumulo.start.classloader.AccumuloClassLoader)

Does anyone know how to get rid of the following warnings when starting Accumulo:
log4j:WARN No appenders could be found for logger (org.apache.accumulo.start.classloader.AccumuloClassLoader).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
I am running Accumulo 1.4.0, Hadoop 0.20.2, and ZooKeeper 3.3.3. I understand this warning happens because the class cannot find the log4j.properties file, and yes, I have read http://logging.apache.org/log4j/1.2/manual.html. My log4j.properties file has the following lines, copied from an Accumulo 1.4.3 log4j file (I don't have the option to upgrade my system to 1.4.3):
# default logging properties:
# by default, log everything at INFO or higher to the console
log4j.rootLogger=INFO,A1
# hide Jetty junk
log4j.logger.org.mortbay.log=WARN,A1
# hide "Got brand-new compresssor" messages
log4j.logger.org.apache.hadoop.io.compress=WARN,A1
# hide junk from TestRandomDeletes
log4j.logger.org.apache.accumulo.server.test.TestRandomDeletes=WARN,A1
# hide almost everything from zookeeper
log4j.logger.org.apache.zookeeper=ERROR,A1
# hide AUDIT messages in the shell, alternatively you could send them to a different logger
log4j.logger.org.apache.accumulo.core.util.shell.Shell.audit=WARN,A1
# Send most things to the console
log4j.appender.A1=org.apache.log4j.ConsoleAppender
log4j.appender.A1.layout=org.apache.log4j.PatternLayout
log4j.appender.A1.layout.ConversionPattern=%d{ISO8601} [%-8c{2}] %-5p: %m%n
I have put this log4j file everywhere: in the accumulo/bin folder, in the accumulo/conf folder, and in the accumulo/lib folder, but I cannot get rid of this warning (I know it has to go on the Accumulo classpath, but I don't know where that is). I also can't pass a log4j.configuration option to the JVM, because the accumulo executable comes pre-built (I just run it).
Thanks in advance for the help.
EDIT: Below is the result of an "accumulo classpath" command on my system:
[admin-cloud@NODE1 bin]$ echo $ACCUMULO_HOME
/accumulo/accumulo-1.4.0
[admin-cloud@NODE1 bin]$ accumulo classpath
Accumulo List of classpath items are:
file:/accumulo/accumulo-1.4.0/lib/commons-collections-3.2.jar
file:/accumulo/accumulo-1.4.0/lib/commons-configuration-1.5.jar
file:/accumulo/accumulo-1.4.0/lib/log4j-1.2.16.jar
file:/accumulo/accumulo-1.4.0/lib/libthrift-0.6.1.jar
file:/accumulo/accumulo-1.4.0/lib/commons-jci-core-1.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-lang-2.4.jar
file:/accumulo/accumulo-1.4.0/lib/commons-logging-api-1.0.4.jar
file:/accumulo/accumulo-1.4.0/lib/accumulo-server-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/accumulo-start-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-jci-fam-1.0.jar
file:/accumulo/accumulo-1.4.0/lib/jline-0.9.94.jar
file:/accumulo/accumulo-1.4.0/lib/examples-simple-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/cloudtrace-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-logging-1.0.4.jar
file:/accumulo/accumulo-1.4.0/lib/accumulo-core-1.4.0.jar
file:/accumulo/accumulo-1.4.0/lib/commons-io-1.4.jar
file:/zookeeper/zookeeper-3.3.6/zookeeper-3.3.6.jar
file:/hadoop/hadoop-0.20.2/conf/
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-examples.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-test.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-tools.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-ant.jar
file:/hadoop/hadoop-0.20.2/hadoop-0.20.2-core.jar
file:/hadoop/hadoop-0.20.2/lib/log4j-1.2.15.jar
file:/hadoop/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar
file:/hadoop/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar
file:/hadoop/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar
file:/hadoop/hadoop-0.20.2/lib/mockito-all-1.8.0.jar
file:/hadoop/hadoop-0.20.2/lib/jetty-6.1.14.jar
file:/hadoop/hadoop-0.20.2/lib/oro-2.0.8.jar
file:/hadoop/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar
file:/hadoop/hadoop-0.20.2/lib/junit-3.8.1.jar
file:/hadoop/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar
file:/hadoop/hadoop-0.20.2/lib/commons-codec-1.3.jar
file:/hadoop/hadoop-0.20.2/lib/core-3.1.1.jar
file:/hadoop/hadoop-0.20.2/lib/jets3t-0.6.1.jar
file:/hadoop/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar
file:/hadoop/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar
file:/hadoop/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar
file:/hadoop/hadoop-0.20.2/lib/jetty-util-6.1.14.jar
file:/hadoop/hadoop-0.20.2/lib/commons-net-1.4.1.jar
file:/hadoop/hadoop-0.20.2/lib/commons-logging-1.0.4.jar
file:/hadoop/hadoop-0.20.2/lib/commons-cli-1.2.jar
file:/hadoop/hadoop-0.20.2/lib/xmlenc-0.52.jar
file:/hadoop/hadoop-0.20.2/lib/kfs-0.2.2.jar
file:/hadoop/hadoop-0.20.2/lib/commons-el-1.0.jar
Line 84 of bin/accumulo in Apache Accumulo 1.4.0 sets the variable XML_FILES to $ACCUMULO_HOME/conf and then adds XML_FILES to the CLASSPATH variable, which is later passed to the java command.
https://svn.apache.org/repos/asf/accumulo/tags/1.4.0/bin/accumulo
It sounds like you have a misconfiguration of ACCUMULO_HOME, either through your shell environment or in $ACCUMULO_HOME/conf/accumulo-env.sh.
I was troubleshooting an installation someone else had set up that was having the same problem. My solution was simply that there actually was no log4j.properties in the conf directory! So I just copied one of the log4j.properties files from the conf/examples directory, restarted, and everything worked like it should.
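Something along these lines (a hedged sketch; the exact examples subdirectory depends on which memory profile your installation uses):
cp $ACCUMULO_HOME/conf/examples/512MB/standalone/log4j.properties $ACCUMULO_HOME/conf/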

Cannot set log4j logger level with Tomcat 6

I am using log4j-1.2.16.jar with Tomcat 6. The log4j config is handled with a log4j.xml file in the WEB-INF\classes directory. I turned log4j debug on. When it processes a level element for a logger, I get output like:
log4j: Level value for root is [WARNING].
log4j: root level set to DEBUG
As a result, all debug messages are logged even though they are not wanted.
Any ideas why this could be happening?
The level should be WARN, not WARNING. Since WARNING is not a valid log4j level, log4j falls back to the default root level of DEBUG, as the debug output above shows.
(It would be nice if log4j.debug gave an invalid level message.)
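For reference, the corrected root element of log4j.xml would look like this (a minimal sketch; the CONSOLE appender name is an assumption about your configuration):
<root>
    <!-- WARN is the valid log4j level name; WARNING falls back to DEBUG -->
    <level value="WARN"/>
    <appender-ref ref="CONSOLE"/>
</root>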

Grails log4j logging question on Linux

I have been very frustrated about this.
I am trying to do the following:
Log all application-related messages that are INFO or above to application.log
Understand what controls the configuration for catalina.out
Log only WARN to catalina.out
I am running my server on Ubuntu with the default Tomcat configuration, which includes a conf directory with a logging.properties file. I renamed this file to l.p so it wouldn't conflict. (Not sure if this is a good idea.)
In my config file, I have:
def catalinaBase = System.properties.getProperty('catalina.base')
if (!catalinaBase) catalinaBase = '.' // just in case
def logDirectory = "${catalinaBase}${File.separator}logs"
println "Log Directory: ${logDirectory}"
log4j = {
    appenders {
        rollingFile name: 'applog',
                    file: "${logDirectory}${File.separator}application.log",
                    layout: pattern(conversionPattern: '%d{dd-MM-yyyy HH:mm:ss,SSS} %5p %c{1} - %m%n'),
                    maxFileSize: 1024
    }
    error 'org.codehaus.groovy.grails.web.servlet',        // controllers
          'org.codehaus.groovy.grails.web.pages',          // GSP
          'org.codehaus.groovy.grails.web.sitemesh',       // layouts
          'org.codehaus.groovy.grails.web.mapping.filter', // URL mapping
          'org.codehaus.groovy.grails.web.mapping',        // URL mapping
          'org.codehaus.groovy.grails.commons',            // core / classloading
          'org.codehaus.groovy.grails.plugins',            // plugins
          'org.codehaus.groovy.grails.orm.hibernate',      // hibernate integration
          'org.springframework',
          'org.hibernate',
          'net.sf.ehcache.hibernate'
    warn 'org.mortbay.log'
    info applog: 'grails.app'
    root {
        info 'applog'
    }
}
As a result, I am getting three logs:
catalina.2011-01-17.log catalina.out localhost.2011-01-17.log
The catalina.out has the following output:
Log Directory: /var/lib/tomcat6/logs
log4j:WARN No appenders could be found for logger (org.apache.commons.beanutils.PropertyUtils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
log4j:ERROR WARNING: Exception occured configuring log4j logging: Cannot invoke org.apache.log4j.FileAppender.setFile - argument type mismatch
I do NOT see application.log in the log directory. Any help would be appreciated; I am really frustrated about this.
One more thing: on Windows everything comes out to the console, and application.log is created in the .grails\1.3.5\projects\<appnmae>\tomcat directory.
Your problem is obviously a type mismatch: log4j is expecting a String, but you're giving it a GString. Try replacing:
"${logDirectory}${File.separator}application.log"
With this:
"${logDirectory}${File.separator}application.log".toString()
EDIT: Please read this BUG
