Is the rollover logic of TimeBasedRollingPolicy correct? - log4j

The documentation says that 'if file names haven't changed, no rollover' occurs; the file name is derived from the FileNamePattern string.
I have two observations:
1) If the appender has no message to write today, it will not roll the file even after the triggering time has passed (i.e. we are left with a log file that was last modified yesterday).
2) If yesterday's log file is 0 KB (meaning nothing was logged yesterday) and the appender has messages to write today, it still rolls the empty file and writes the data to a newly created log file.
I wanted to discuss whether both of the above cases are correctly implemented by the TimeBasedRollingPolicy class, or whether the implementation should be changed.
My implementation strategy for the first scenario would be: if FileNamePattern is set to %d{dd-MM-yyyy}, then at midnight the file should be rolled regardless of whether the appender has data to write, as long as yesterday's file is non-empty.
For the second case, if yesterday's file is 0 KB, meaning no message was logged yesterday, the appender should write into the same file, because the main purpose of rolling is to back up the logs, and if the file is empty, is it worth rolling it?
Take the configuration in the log4j.properties file below as a reference for discussing the two scenarios above.
Sample log4j.properties
####### Root Logger ########################################
log4j.rootLogger=ERROR,CA,FA
############################################################
################### APPENDERS ##############################
############################################################
# CA is set to be a ConsoleAppender
log4j.appender.CA=org.apache.log4j.ConsoleAppender
log4j.appender.CA.layout=org.apache.log4j.PatternLayout
log4j.appender.CA.layout.ConversionPattern=%d %p %t %c: %m%n
# FA is set to be a FileAppender
log4j.appender.FA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.FA.RollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.FA.RollingPolicy.FileNamePattern=.\\logs\\application.log-%d{dd-MM-yyyy}
log4j.appender.FA.File=.\\logs\\application.log
log4j.appender.FA.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.FA.layout.ConversionPattern=%d %p %t %c: %m%n
log4j.appender.FA.Append=true
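For context on the first observation: TimeBasedRollingPolicy only evaluates its trigger when an event is appended, so if nothing is logged, the rollover check never runs. The file-name-comparison rule from the documentation can be sketched in plain Java (a simplified illustration of the idea, not the actual log4j source):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class RolloverCheckDemo {
    // Rollover is due when the file name computed for "now" differs from
    // the name computed at the last check -- mirroring the documented rule
    // "if file names haven't changed, no rollover".
    static boolean shouldRollover(Date lastCheck, Date now, SimpleDateFormat pattern) {
        return !pattern.format(lastCheck).equals(pattern.format(now));
    }

    public static void main(String[] args) {
        SimpleDateFormat daily = new SimpleDateFormat("dd-MM-yyyy");
        Date yesterday = new Date(System.currentTimeMillis() - 24L * 60 * 60 * 1000);
        Date now = new Date();
        // A day boundary has passed, so the computed names differ:
        System.out.println(shouldRollover(yesterday, now, daily)); // true
        // But in log4j this check only happens on the append path, so with
        // no log events the stale file is never rolled (scenario 1).
    }
}
```

This is why some applications schedule a periodic heartbeat log message: it guarantees an append happens shortly after midnight, which forces the trigger check to run.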

Related

Databricks Log4J Custom Appender Not Working as expected

I'm trying to figure out how a custom appender should be configured in a Databricks environment, but I cannot figure it out.
While the cluster is running, the driver logs show the time as 'unknown' for my custom log file, and when the cluster is stopped, the custom log file does not appear at all in the log file list.
#appender configuration
log4j.appender.bplm=com.databricks.logging.RedactionRollingFileAppender
log4j.appender.bplm.layout=org.apache.log4j.PatternLayout
log4j.appender.bplm.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.appender.bplm.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}-bplm.log.gz
log4j.appender.bplm.rollingPolicy.ActiveFileName=logs/log4j-bplm.log
log4j.logger.com.myPackage=INFO,bplm
Above configuration was added to following files
"/databricks/spark/dbconf/log4j/executor/log4j.properties"
"/databricks/spark/dbconf/log4j/driver/log4j.properties"
"/databricks/spark/dbconf/log4j/master-worker/log4j.properties"
After the above configuration was added to the files mentioned, there are two issues I cannot figure out.
1 - When the cluster is running, if I go to the driver logs, I can see my custom log file in the list, correctly populated, but the time column is displayed as 'unknown'.
2 - When the cluster is stopped, if I go to the driver logs, my custom appender's files are not displayed (stdout, stderr, and log4j-active are displayed).
I also tried different FileNamePatterns, but the issues mentioned above seem to happen with every configuration I tried:
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}.bplm.log.gz - appender1
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j.bplm-%d{yyyy-MM-dd-HH}.log.gz - appender2
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/bplm-log4j-%d{yyyy-MM-dd-HH}.log.gz - appender3
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/bplm.log4j-%d{yyyy-MM-dd-HH}.log.gz - appender4
log4j.appender.bplm.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}.log.bplm.gz - appender5
log4j.appender.bplm7.rollingPolicy.FileNamePattern=logs/log4j-bplm-%d{yyyy-MM-dd-HH}.log.gz - appender7
log4j.appender.bplm8.rollingPolicy.FileNamePattern=logs/log4j-%d{yyyy-MM-dd-HH}-bplm.log.gz - appender8
I also tried putting *-active in ActiveFileName, but the result was the same:
log4j.appender.custom.rollingPolicy.FileNamePattern=/tmp/custom/logs/log4j-bplm-%d{yyyy-MM-dd-HH}.log.gz
log4j.appender.custom.rollingPolicy.ActiveFileName=/tmp/custom/logs/log4j-bplm-active.log

write spark application ID with spark logs

I have been researching this for a month and couldn't find a good solution. The default Spark logs don't contain the application ID.
The default logs contain: "time":,"date":,"level":,"thread":,"message"
I tried to customize the log4j properties but couldn't find a way. I am a newbie to the big data area.
My default log4j.properties file is
log4j.rootLogger=${hadoop.root.logger}
hadoop.root.logger=INFO,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
Does anyone know a solution? I'd appreciate your help, even a small hint.
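One possible approach (a sketch, not a built-in Spark feature; the MDC key name appId is my own choice) is to push the application ID into log4j's mapped diagnostic context from the application code, then reference it in the layout with %X:

```
# ConversionPattern referencing an MDC key named "appId" (illustrative)
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2} appId=%X{appId}: %m%n
```

In the application, once the SparkContext is available, call org.apache.log4j.MDC.put("appId", sc.applicationId) so that events logged afterwards carry the ID. Note that MDC in log4j 1.x is thread-scoped (inherited by child threads), so framework threads started before the put may not see the value.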

using winston daily rotate without adding datePattern to file name

In my Node.js logging I am using the winston-daily-rotate-file module, and it works very well. However, I have a question about file naming: is it possible to not have the datePattern in the current file's name, but to add it once the file gets rotated?
Example:
my_log.log ----- current one
my_log_2017-10-18.log ----- yesterday
my_log_2017-10-17.log ----- the day before yesterday
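Recent versions of winston-daily-rotate-file (this was not available in older releases) expose createSymlink/symlinkName options, which keep an undated symlink pointing at the current dated file. A sketch of the transport options (file names are illustrative):

```
{
  filename: 'my_log_%DATE%.log',   // rotated files carry the date
  datePattern: 'YYYY-MM-DD',
  createSymlink: true,             // maintain an undated pointer...
  symlinkName: 'my_log.log'        // ...that acts as the "current" file
}
```

This is not exactly a rename-on-rotate, but it gives a stable my_log.log path for tailing while archived files keep the date suffix.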

Does logstash update the .sincedb file after a log file is completely processed or during the reading process?

Does Logstash update the .sincedb file after a log file is read to the end, or during the reading process?
For example:
Say there is a directory being monitored by Logstash, and a file (say file1.log, with a maximum offset / file size of 10000) is copied into it.
Does the .sincedb file get updated (or created, if not already present) with the info for file1.log only when Logstash reaches offset 10000?
I would expect Logstash to update the .sincedb file on a regular basis, but what I have noticed is that it gets updated/created only after a file is completely read.
The logstash file input plugin will write the sincedb file on a regular basis based on the sincedb_write_interval setting.
By default, the sincedb database is written every 15 seconds.
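For reference, the interval is configurable on the file input; a minimal sketch (the path is illustrative):

```
input {
  file {
    path => "/var/log/app/*.log"
    # flush sincedb every 5 seconds instead of the 15-second default
    sincedb_write_interval => 5
  }
}
```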

How can I write my properties file if I want to get a log file every hours using log4j?

I have my properties file working, but what should I do if I want to put the log files in a folder derived from the date?
For example, today is 2015-12-29; at 10:30 I started my Java project with the following log4j.properties:
log4j.appender.inforlog=org.apache.log4j.DailyRollingFileAppender
log4j.appender.inforlog.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.inforlog.File=D:/inforLogs/2015/12/searchrecord
When it gets to 11:00, there will be a log file named searchrecord.2015-12-29-10 in "D:/inforLogs/2015/12/". But when it gets to 2016-01-01, the log file will still be placed in "D:/inforLogs/2015/12/". I want it to go into "D:/inforLogs/2016/01/" instead by writing the properties file properly. What should I do?
I have resolved the problem myself; here is the properties file:
log4j.appender.inforlog.DatePattern='s/'yyyy'/'MM'/searchrecord-'dd'_'HH'.log'
log4j.appender.inforlog.File=D:/inforLog
This works because DailyRollingFileAppender appends the expanded DatePattern to the File value when rolling: the quoted literal 's/' turns D:/inforLog into D:/inforLogs/, and the unquoted yyyy and MM parts of the pattern create the year/month directories, e.g. D:/inforLogs/2016/01/searchrecord-01_10.log.
