Unable to generate logs in PCF UI console - log4j

I have configured loggers in my Spring Boot project using a log4j.properties file. I can generate logs locally, but I am not able to see them in the PCF UI console.
I have set the file name as below.
log4j.appender.fileAppender.File=demoApplication.log
Could anyone please help me set the right path in this log4j properties file so that the logs appear in the PCF UI console?

I encountered this problem too.
This is my solution (put it in your .properties file; you can change the pattern as you like):
log4j.rootLogger=debug, stdout
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss:SSS} | %m%n
I found it on pivotal.io and it works for me.

How to configure log4j to send logs to stdout as JSON in ksqldb?

I'm using ksqldb deployed inside a Kubernetes pod, which automatically sends console output to Logstash and Kibana.
I'm trying to read the server logs, but the output is hard to parse, especially when a stack trace occurs.
It would be better if I could output my logs as JSON, so I looked at this log4j.properties but could not find how to do it.
This is the default file.
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n
log4j.appender.streams=org.apache.log4j.ConsoleAppender
log4j.appender.streams.layout=org.apache.log4j.PatternLayout
log4j.appender.streams.layout.ConversionPattern=[%d] %p %m (%c:%L)%n
I tried to use the Confluent JSON layout, but it did not work as expected.
log4j.appender.stdout.layout=io.confluent.common.logging.log4j.StructuredJsonLayout
I had some ideas of how to do it, but most of them seemed too complicated for something so simple.
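For reference, one approach sometimes used with log4j 1.x is to swap in a JSON layout from a third-party jar. This is only a sketch, assuming the jsonevent-layout library (which provides `net.logstash.log4j.JSONEventLayoutV1`) is added to the ksqlDB server's classpath; it is not something the stock distribution ships with:

```properties
# Sketch: emit each log event as a single JSON object on stdout.
# Assumes the jsonevent-layout jar is on the server classpath.
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=net.logstash.log4j.JSONEventLayoutV1
```

The same classpath caveat applies to Confluent's StructuredJsonLayout: the layout class must actually be loadable by the process, or log4j silently falls back.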

PySpark Logging

I'm trying to view the Spark logs in GCP Stackdriver; below is a screenshot of the Stackdriver console.
As the screenshot shows, Spark INFO messages are being mapped to ERROR in Stackdriver, and I'm not sure why.
I'm using log4j for Spark custom logging. Below are the log properties.
# Root logger option
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=INFO
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
Any suggestions on how to map Spark INFO logs to Stackdriver INFO?
Stackdriver infers the severity from how the message arrives. In this case you have
log4j.appender.console.target=System.err
so everything written to System.err is reported as ERROR. Try adding a severity field to your log entries with the proper value and Stackdriver will read it as you specify; the accepted values and the expected entry format are listed in the Cloud Logging LogSeverity documentation.
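A minimal sketch of that idea, keeping the existing console appender: route output to System.out (so it is no longer blanket-tagged as ERROR) and keep the level at the start of each line, where a logging-agent parser can map it to a severity. The exact parser configuration depends on your environment, so treat this as an assumption-laden starting point:

```properties
# Sketch: write to stdout instead of stderr so Stackdriver stops
# defaulting every line to ERROR severity.
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
# %-5p puts the log4j level (INFO, WARN, ...) first on each line,
# where an agent-side parser can pick it up as the severity field.
log4j.appender.console.layout.ConversionPattern=%-5p %d{yyyy-MM-dd HH:mm:ss} %c{1}:%L - %m%n
```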

How can I config log4j for send wowza logs?

I want to send the Wowza logs to a remote server that has Logstash installed. For that, I want to use log4j, but it doesn't work. This is my config:
# Access appender (UDP) - uncomment and add to rootCategory list on first line
log4j.appender.serverAccessUDP=com.wowza.wms.logging.UDPAppender
log4j.appender.serverAccessUDP.remoteHost=x.x.x.x
log4j.appender.serverAccessUDP.port=5678
log4j.appender.serverAccessUDP.layout=com.wowza.wms.logging.ECLFPatternLayout
log4j.appender.serverAccessUDP.layout.Fields=x-severity,x-category,x-event;date,time,c-client-$
log4j.appender.serverAccessUDP.layout.OutputHeader=true
log4j.appender.serverAccessUDP.layout.QuoteFields=false
log4j.appender.serverAccessUDP.layout.Delimeter=tab
As far as I know, and as Wowza has said, there is no direct Logstash support yet (see the answers to your question on the Wowza forums: https://www.wowza.com/community/questions/46469/how-can-i-config-log4j-for-send-wowza-logs.html).
Our solution was to send the logs to the Logstash server via syslog; Logstash has a syslog input plugin.
log4j.rootCategory=INFO, serverAccess
# Console appender
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=com.wowza.wms.logging.ECLFPatternLayout
log4j.appender.stdout.layout.Fields=x-severity,x-category,x-event,x-ctx,x-comment
log4j.appender.stdout.layout.OutputHeader=false
log4j.appender.stdout.layout.QuoteFields=false
log4j.appender.stdout.layout.Delimiter=space
#SyslogAppender
log4j.appender.serverAccess=org.apache.log4j.net.SyslogAppender
log4j.appender.serverAccess.Facility=LOCAL1
log4j.appender.serverAccess.FacilityPrinting=false
log4j.appender.serverAccess.Header=true
log4j.appender.serverAccess.syslogHost={SYSLOG_IP_PORT}
log4j.appender.serverAccess.layout=org.apache.log4j.PatternLayout
log4j.appender.serverAccess.layout.ConversionPattern={HOSTNAME}: [%d{yyyy-MM-dd HH:mm:ss.SSS}] %5p [%t] --- %c : %m%n
The {SYSLOG_IP_PORT} placeholder is the target Logstash server's ip:port.
{HOSTNAME} should also be replaced. If you want to be able to tell from the message which instance or server sent it, fill it with that server's hostname, for example.
The "Console appender" section is optional; it is sometimes useful for debugging and can be commented out in production.
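On the Logstash side, the receiving end of this setup can be sketched with the syslog input plugin. The port and output below are illustrative assumptions, not part of the original answer; the port must match whatever you put in {SYSLOG_IP_PORT}:

```conf
# Sketch: Logstash pipeline receiving the Wowza SyslogAppender messages.
input {
  syslog {
    port => 5140   # illustrative; must match the port in {SYSLOG_IP_PORT}
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }   # illustrative destination
}
```

Note that log4j's SyslogAppender sends over UDP; the Logstash syslog input listens on both TCP and UDP by default.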

Log4j log custom level for specific user(s) in production

In the production environment I'm only logging WARN and ERROR messages.
I need to implement a mechanism that logs ALL log levels for (a) specific user(s).
I'm handling requests coming in from outside (not necessarily via a servlet), and from the beginning I can retrieve the user's name, so I can put it in the MDC and it will appear in the log messages.
However, I need to check whether the user's name is in a predefined list of users (this list can be updated dynamically at runtime), and if it is, I need to enable logging for that user only from, e.g., TRACE level (meaning all of TRACE, DEBUG, INFO, WARN and ERROR).
Is this possible?
UPDATE #1: We're using Spring Boot Log4j
UPDATE #2: Log4j config
LOG_PATTERN=%d{yyyy-MM-dd HH:mm:ss.SSS} %X{context} ${PID} %5p %-10X{username} [%t] - %c{1}(%L): %m%n
log4j.rootCategory=INFO, amqp
log4j.category.org.springframework=WARN
log4j.category.com.acme=DEBUG
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=logs/app.log
log4j.appender.file.MaxFileSize=100MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=${LOG_PATTERN}
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=${LOG_PATTERN}
log4j.appender.amqp=org.springframework.amqp.rabbit.log4j.AmqpAppender
log4j.appender.amqp.host=localhost
log4j.appender.amqp.port=5671
log4j.appender.amqp.username=username
log4j.appender.amqp.password=password
log4j.appender.amqp.virtualHost=vhost
log4j.appender.amqp.exchangeName=logging.pub
log4j.appender.amqp.exchangeType=topic
log4j.appender.amqp.routingKeyPattern=%c.%p
log4j.appender.amqp.declareExchange=false
log4j.appender.amqp.durable=true
log4j.appender.amqp.autoDelete=false
log4j.appender.amqp.contentType=text/plain
log4j.appender.amqp.generateId=false
log4j.appender.amqp.senderPoolSize=2
log4j.appender.amqp.maxSenderRetries=30
log4j.appender.amqp.layout=org.apache.log4j.PatternLayout
log4j.appender.amqp.layout.ConversionPattern=${LOG_PATTERN}
After all the clarifications, the short answer is: no.
The very first check done by log4j's trace/debug/.../error methods is the logger level. If the log event's level is below the logger's configured level, nothing else is done and the logging method simply returns. Message formatting, processing the diagnostic context and triggering appenders all come later, and only if there is something to process. In log4j 1 this cannot be changed by configuration.
In log4j 2 there are global filters, which are checked before the logger-level test. Although I don't know whether they are a perfect fit for this kind of task, I'd definitely start looking there. If you can upgrade to log4j 2, check them out.
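If moving to Log4j 2 is an option, its built-in DynamicThresholdFilter is a global filter built for roughly this case: it compares a ThreadContext (MDC) key against configured values and lowers the threshold per user. A sketch (the key name, username and levels are illustrative; note the key/value pairs are fixed in the config file, so a list that changes at runtime would still need a custom filter):

```xml
<!-- Sketch of a log4j2.xml: events from user "alice" pass at TRACE,
     everyone else is held to the WARN default. -->
<Configuration>
  <DynamicThresholdFilter key="username" defaultThreshold="WARN"
                          onMatch="ACCEPT" onMismatch="NEUTRAL">
    <KeyValuePair key="alice" value="TRACE"/>
  </DynamicThresholdFilter>
  <Appenders>
    <Console name="CONSOLE" target="SYSTEM_OUT">
      <PatternLayout pattern="%d %5p %X{username} %c{1} - %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="warn">
      <AppenderRef ref="CONSOLE"/>
    </Root>
  </Loggers>
</Configuration>
```

Because context-wide filters run before the logger-level check, an ACCEPT result lets the matching user's events through even though the root logger is set to WARN.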

Using Log4j JMS appender to connect with ActiveMQ

I am trying to add logging to our Mule application with ActiveMQ so that all messages are written to a SQL Server database. This is the content of my log4j properties file in Mule:
log4j.rootCategory=INFO, console, jms
log4j.logger.org.apache.activemq=INFO, stdout
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%-5p %d [%t] %c: %m%n
# Mule classes
log4j.logger.org.mule=INFO
log4j.logger.com.mulesoft=INFO
# Your custom classes
log4j.logger.com.mycompany=DEBUG
# JMS Appender
log4j.appender.jms=org.apache.log4j.net.JMSAppender
log4j.appender.jms.InitialContextFactoryName=org.apache.activemq.jndi.ActiveMQInitialContextFactory
log4j.appender.jms.ProviderURL=tcp://sjc04-wduatesb1:9162
#tcp//localhost:61616
log4j.appender.jms.TopicBindingName=logTopic
log4j.appender.jms.TopicConnectionFactoryBindingName=ConnectionFactory
As you can see, I am trying to use the JMS appender. I also have a JNDI config file pointing to the topic to read from; the contents of that file, stored in $MULE_HOME/conf/, are:
topic.logTopic=logTopic
However, I find that even though messages are getting enqueued and dequeued on the topic, they are not being written to the database. Does anybody have any ideas or suggestions as to where I am going wrong?
One way to solve this is to create a subscriber to the topic that puts each message on a queue, and then have a queue consumer write the messages to the database.