Define Logging Based on packages using BasicConfigurator - log4j

I am trying to figure out how to separate my log files by package when using BasicConfigurator.
In my log4j.properties I used to have logger-to-appender mappings like:
log4j.logger.com.cambiahealth.engine.common.aspect=,memberservices
log4j.logger.com.cambiahealth.engine.rest.family=,familyservice
I tried the following, but it doesn't seem to separate out the requests going to a particular file:
FileAppender fa = new FileAppender();
fa.setName("abc");
fa.setFile("/usr/regence/mylog.log");
fa.setLayout(new PatternLayout("%d %-5p [%c{1}] %m%n"));
fa.setThreshold(Level.INFO);
fa.setAppend(true);
fa.activateOptions();
BasicConfigurator.configure(fa);
System.out.println("The logger abc is initialized");
Logger log = Logger.getLogger("com.cambiahealth.engine.rest.family");
log.addAppender(fa);

FileAppender xyz = new FileAppender();
xyz.setName("claims");
xyz.setFile("/usr/regence/myClaims.log");
xyz.setLayout(new PatternLayout("%d %-5p [%c{1}] %m%n"));
xyz.setThreshold(Level.INFO);
xyz.setAppend(true);
xyz.activateOptions();
BasicConfigurator.configure(xyz);
System.out.println("The logger xyz is initialized");
Logger.getLogger("com.xyz.claim").addAppender(xyz);

I figured it out. I had to remove the BasicConfigurator.configure() calls, and it all works now.
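For reference, here is a minimal sketch of the working setup with the BasicConfigurator calls removed: each package-level logger gets its own FileAppender. The class name LoggingSetup and the helper method newFileAppender are made up for illustration, and setAdditivity(false) is an extra assumption so the package loggers don't also forward their events to the root logger's appenders.
import org.apache.log4j.FileAppender;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class LoggingSetup {

    public static void main(String[] args) {
        // One appender per package; no BasicConfigurator.configure() anywhere.
        Logger familyLogger = Logger.getLogger("com.cambiahealth.engine.rest.family");
        familyLogger.addAppender(newFileAppender("familyservice", "/usr/regence/mylog.log"));
        familyLogger.setAdditivity(false); // assumption: keep these events out of the root logger

        Logger claimsLogger = Logger.getLogger("com.xyz.claim");
        claimsLogger.addAppender(newFileAppender("claims", "/usr/regence/myClaims.log"));
        claimsLogger.setAdditivity(false);
    }

    private static FileAppender newFileAppender(String name, String file) {
        FileAppender fa = new FileAppender();
        fa.setName(name);
        fa.setFile(file);
        fa.setLayout(new PatternLayout("%d %-5p [%c{1}] %m%n"));
        fa.setThreshold(Level.INFO);
        fa.setAppend(true);
        fa.activateOptions(); // must be called after the options are set
        return fa;
    }
}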

Related

Hazelcast-jet: got error when enriching stream using direct lookup

I am following the documentation to try out how to enrich an unbounded stream by directly looking up from an IMap. I have two maps:
Product: Map<String, Product> (ProductId as key)
Seller: Map<String, Seller> (SellerId as key)
Both Product and Seller are very simple classes:
public class Product implements DataSerializable {
    String productId;
    String sellerId;
    int price;
    ...

public class Seller implements DataSerializable {
    String sellerId;
    int revenue;
    ...
I have two data generators that keep pushing data to the two maps. The event journal is enabled for both maps, and I have verified that it works fine.
I want to enrich the stream event of Product map with Seller map. Here is a snippet of my code:
IMap<String, Seller> sellerIMap = jetClient.getMap(SellerDataGenerator.SELLER_MAP);
StreamSource<Product> productStreamSource = Sources.mapJournal(ProductDataGenerator.PRODUCT_MAP,
        Util.mapPutEvents(), Util.mapEventNewValue(), START_FROM_CURRENT);

Pipeline p = Pipeline.create();
p.drawFrom(productStreamSource)
 .withoutTimestamps()
 .groupingKey(Product::getSellerId)
 .mapUsingIMap(sellerIMap, (product, seller) -> new EnrichedProduct(product, seller))
 .drainTo(getSink());

try {
    JobConfig jobConfig = new JobConfig();
    jobConfig.addClass(TaskSubmitter.class).addClass(Seller.class)
             .addClass(Product.class).addClass(EnrichedProduct.class);
    jobConfig.setName(Constants.BASIC_TASK);
    Job job = jetClient.newJob(p, jobConfig);
} finally {
    jetClient.shutdown();
}
When the job was submitted, I got the following error:
com.hazelcast.spi.impl.operationservice.impl.Invocation - [172.31.33.212]:80 [jet] [3.1] Failed asynchronous execution of execution callback: com.hazelcast.util.executor.DelegatingFuture$DelegatingExecutionCallback#77ac0407 for call Invocation{op=com.hazelcast.map.impl.operation.GetOperation{serviceName='hz:impl:mapService', identityHash=1939050026, partitionId=70, replicaIndex=0, callId=-37944, invocationTime=1570410704479 (2019-10-07 01:11:44.479), waitTimeout=-1, callTimeout=60000, name=sellerMap}, tryCount=250, tryPauseMillis=500, invokeCount=1, callTimeoutMillis=60000, firstInvocationTimeMs=1570410704479, firstInvocationTime='2019-10-07 01:11:44.479', lastHeartbeatMillis=0, lastHeartbeatTime='1970-01-01 00:00:00.000', target=[172.31.33.212]:80, pendingResponse={VOID}, backupsAcksExpected=0, backupsAcksReceived=0, connection=null}
I tried with one and with two instances in my cluster and got the same error message. I couldn't figure out the root cause.
It seems that your problem is a ClassNotFoundException, even though you added the appropriate classes to the job. The objects you store in the IMap exist independently of your Jet job, and when the event journal source asks for them, Jet's IMap code tries to deserialize them and fails because Jet doesn't have your domain model classes on its classpath.
To move on, add a JAR with the classes you use in the IMap to Jet's classpath. We are looking for a solution that will remove this requirement.
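For example, assuming your domain model classes are packaged in their own JAR (call it product-domain.jar, a placeholder name), you can copy that JAR into the lib directory of the Jet distribution and add it to the classpath in bin/common.sh in the same way Log4j is added in the steps below:
CLASSPATH="$JET_HOME/lib/product-domain.jar:$CLASSPATH"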
The reason you don't see the exception stack trace in the log output is the default java.util.logging setup you end up with when you don't explicitly add a more flexible logging module such as Log4j.
The next version of Jet's packaging will improve this aspect. Until that time you can follow these steps:
Go to the lib directory of Jet's distribution package and download Log4j into it:
$ cd lib
$ wget https://repo1.maven.org/maven2/log4j/log4j/1.2.17/log4j-1.2.17.jar
Edit bin/common.sh to add the module to the classpath. Towards the end of the file there is a line
CLASSPATH="$JET_HOME/lib/hazelcast-jet-3.1.jar:$CLASSPATH"
You can duplicate this line and replace hazelcast-jet-3.1 with log4j-1.2.17.
At the end of common.sh there is a multi-line command that constructs the JAVA_OPTS variable. Add "-Dhazelcast.logging.type=log4j" and "-Dlog4j.configuration=file:$JET_HOME/config/log4j.properties" to the list.
Create a file log4j.properties in the config directory:
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %5p [%c{1}] [%t] - %m%n
# Change this level to debug to diagnose failed cluster formation:
log4j.logger.com.hazelcast.internal.cluster=info
log4j.logger.com.hazelcast.jet=info
log4j.rootLogger=info, stdout

How can I include xml configuration in logback.groovy

I'm writing a Spring Boot app and need the flexibility of controlling my logback configuration using Groovy. In Spring Boot all I have to do is create src/main/resources/logback.groovy and it is automatically used for configuration.
What I would like to do though is start with Spring Boot's default logback configuration, and just override or modify settings as needed.
If I were using logback.xml instead of logback.groovy I could do something like the following.
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<include resource="org/springframework/boot/logging/logback/base.xml"/>
<logger name="org.springframework.web" level="DEBUG"/>
</configuration>
Is there something similar to the include line above that I can use in logback.groovy? I could look at the contents of base.xml and its other included files to see how to replicate this manually, but it would add a bit of boilerplate code I'd like to avoid.
Thanks for any help you can provide.
There's an online tool that translates a given logback.xml file to the equivalent logback.groovy. In your case it resulted in:
//
// Built on Thu Jul 16 09:35:34 CEST 2015 by logback-translator
// For more information on configuration files in Groovy
// please see http://logback.qos.ch/manual/groovy.html
// For assistance related to this tool or configuration files
// in general, please contact the logback user mailing list at
// http://qos.ch/mailman/listinfo/logback-user
// For professional support please see
// http://www.qos.ch/shop/products/professionalSupport
import static ch.qos.logback.classic.Level.DEBUG
logger("org.springframework.web", DEBUG)
When it comes to <include>, it's not supported in Groovy configurations.
How do you feel about reloading the configuration instead of adding to or overriding it?
You can create a Spring bean that checks whether a logback file exists in a location you specify and, if it does, reloads the configuration from that file.
Example
@Component
public class LoggingHelper {

    public static final String LOGBACK_GROOVY = "logback.groovy";

    @PostConstruct
    public void resetLogging() {
        String configFolder = System.getProperty("config.folder");
        Path loggingConfigFile = Paths.get(configFolder, LOGBACK_GROOVY);
        if (Files.exists(loggingConfigFile) && Files.isReadable(loggingConfigFile)) {
            LoggerContext loggerContext = (LoggerContext) LoggerFactory.getILoggerFactory();
            ContextInitializer ci = new ContextInitializer(loggerContext);
            loggerContext.reset();
            try {
                ci.configureByResource(loggingConfigFile.toUri().toURL());
            } catch (JoranException e) {
                // StatusPrinter will handle this
            } catch (MalformedURLException e) {
                System.err.println("Unable to configure logger " + loggingConfigFile);
            }
            StatusPrinter.printInCaseOfErrorsOrWarnings(loggerContext);
        }
    }
}
I am using this snippet at the start of my logback.groovy file:
import ch.qos.logback.classic.joran.JoranConfigurator
import org.xml.sax.InputSource
def configurator = new JoranConfigurator()
configurator.context = context
def xmlString = '<?xml version="1.0" encoding="UTF-8"?>\n<configuration>\n <include resource="org/springframework/boot/logging/logback/base.xml"/>\n</configuration>'
configurator.doConfigure(new InputSource(new StringReader(xmlString)))
Contrary to the documentation, which states that "Everything you can do using XML in configuration files, you can do in Groovy with a much shorter syntax", include is not possible with Groovy out of the box. However, thanks to a bug ticket that was opened in 2014, there are a couple of workarounds. I am including them here (slightly edited), but all credit goes to "Yih Tsern" from the original JIRA bug:
logback.groovy:
include(new File('logback-fragment.groovy'))

root(DEBUG, ["CONSOLE"])

def include(File fragmentFile) {
    GroovyShell shell = new GroovyShell(
            getClass().classLoader,
            binding,
            new org.codehaus.groovy.control.CompilerConfiguration(scriptBaseClass: groovy.util.DelegatingScript.name))
    Script fragment = shell.parse(fragmentFile.text)
    fragment.setDelegate(this)
    fragment.run()
}
logback-fragment.groovy:
// NOTE: No auto-import
import ch.qos.logback.core.*
import ch.qos.logback.classic.encoder.*

appender("CONSOLE", ConsoleAppender) {
    encoder(PatternLayoutEncoder) {
        pattern = "%d [%thread] %level %mdc %logger{35} - %msg%n"
    }
}
Given the workaround and a pull-request to add the feature, I'm not sure why the functionality hasn't been added to Logback core yet.

Log4net: Why is Pattern Layout ignored in RenderedMessage?

I'm debugging a custom log4net Appender that I've written. My setup uses this standard PatternLayout in my config file:
<layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
</layout>
In my Append() method implementation, calling RenderLoggingEvent() returns a properly formatted message. The loggingEvent's RenderedMessage, however, only contains the %message bit.
RenderLoggingEvent() returns:
"2015-06-09 14:09:37,382 [Main Thread] INFO MyConsole.Program
[(null)] - Thread test\r\n"
loggingEvent.RenderedMessage contains:
"Thread test"
Is this how RenderedMessage is supposed to work?
I need to render the message outside of the Appender so I'd rather not use its RenderLoggingEvent() method. Is there a more direct way to get the rendered message from the LoggingEvent instance?
You say you need to 'render the message outside of the Appender', but the Layout is associated with the appender.
Assuming you can work around this and access the layout somehow, then you can call the Format method on the layout:
PatternLayout layout = … ;
string result;
using (StringWriter writer = new StringWriter())
{
layout.Format(writer, loggingEvent);
result = writer.ToString();
}
Test output:
2015-06-09 14:06:43,357 [14] ERROR test [NDC] - Test Message
When you are extending the AppenderSkeleton to write your custom Appender, you already have the layout as a property at hand. You can then write a method in your Appender:
Here's my code (in TextBoxAppender.cs), inheriting AppenderSkeleton:
private string GetLayoutedMessage(LoggingEvent loggingEvent)
{
    using (StringWriter writer = new StringWriter())
    {
        Layout.Format(writer, loggingEvent);
        return writer.ToString();
    }
}

How to avoid CRLF (Carriage Return and Line Feed) in Logback - CWE 117

I'm using Logback and I need to avoid CRLF (Carriage Return and Line Feed) characters when I log a user parameter.
I tried to add my class, which extends ClassicConverter, to the static map PatternLayout.defaultConverterMap, but it didn't work.
Thank you,
You should create a custom layout, as described in the Logback documentation.
Custom layout:
package com.foo.bar;

import ch.qos.logback.classic.PatternLayout;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.CoreConstants;

public class RemoveCRLFLayout extends PatternLayout {

    @Override
    public String doLayout(ILoggingEvent event) {
        // Strip CR/LF from the formatted event. This also removes the trailing %n,
        // so append the line separator again to keep one event per line.
        return super.doLayout(event).replaceAll("(\\r|\\n)", "") + CoreConstants.LINE_SEPARATOR;
    }
}
Logback configuration:
<encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
    <layout class="com.foo.bar.RemoveCRLFLayout">
        <pattern>%d %t %-5p %logger{16} - %m%n</pattern>
    </layout>
</encoder>
For a quick solution we used a %replace expression in our pattern to replace line feeds and carriage returns found in the message.
Note this example is using a Spring Boot property to set the pattern, but you can use %replace in your Logback config file the same way.
logging:
pattern:
console: "%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger - %replace(%msg){'\n|\r', '_'}%n"
(A custom converter would have been my first choice, but I had trouble getting it to work with Spring Boot and Spring Cloud Config. If you want to learn more about that approach, search the logback docs for conversionRule.)
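For anyone who wants to try the converter route, here is a minimal sketch of what it could look like. The class name CRLFConverter, its package, and the conversion word sanitizedMsg are made-up placeholders, and this sketch is not taken from the answers above:
// Register the converter in logback.xml with a conversionRule, e.g.
//   <conversionRule conversionWord="sanitizedMsg"
//                   converterClass="com.example.logging.CRLFConverter"/>
// and then use %sanitizedMsg in place of %msg in the pattern.
package com.example.logging;

import ch.qos.logback.classic.pattern.ClassicConverter;
import ch.qos.logback.classic.spi.ILoggingEvent;

public class CRLFConverter extends ClassicConverter {

    @Override
    public String convert(ILoggingEvent event) {
        // Replace CR/LF in the rendered message so user input cannot forge extra log lines (CWE-117).
        return event.getFormattedMessage().replaceAll("[\r\n]", "_");
    }
}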
For reference, the %n conversion word is registered in ch.qos.logback.classic.PatternLayout:
defaultConverterMap.put("n", LineSeparatorConverter.class.getName());
ch.qos.logback.classic.pattern.LineSeparatorConverter simply returns the line separator constant:
public String convert(ILoggingEvent event) {
    return CoreConstants.LINE_SEPARATOR;
}
and ch.qos.logback.core.CoreConstants initializes that constant from the line.separator system property:
public static final String LINE_SEPARATOR = System.getProperty("line.separator");
So the proper way to ensure a fixed line ending is the line.separator property. java.lang.System.lineSeparator() is backed by the same property:
lineSeparator = props.getProperty("line.separator");

Getting values from Log4Net configuration

I have implemented a custom log4net appender by extending the AppenderSkeleton class. It was as simple as anyone could ask for and works perfectly.
My problem is that I had to hardcode a few values, and I'd like to move them from my code into the configuration of the appender. Since log4net knows how it is configured, I think there should be a way to ask log4net for its configuration.
My appender could look something like this:
<appender name="MyLogAppender" type="xxx.yyy.zzz.MyLogAppender">
<MyProperty1>property</MyProperty1>
<MyProperty2>property</MyProperty2>
<MyProperty3>property</MyProperty3>
</appender>
How do I get the values of MyProperty1-3 so I can use them inside my appender?
Thanks in advance
Roalnd
It depends a bit on the type but for simple types you can do the following:
Define a property like this:
// the value you assign to the field will be the default value for the property
private TimeSpan flushInterval = new TimeSpan(0, 5, 0);
public TimeSpan FlushInterval
{
get { return this.flushInterval; }
set { this.flushInterval = value; }
}
This you can configure as follows:
<appender name="MyLogAppender" type="xxx.yyy.zzz.MyLogAppender">
<flushInterval value="02:45:10" />
</appender>
This certainly works for string, bool, int and TimeSpan.
Note: If your setting requires some logic to be activated (e.g. creating a timer), you can implement this in the ActivateOptions method.
