Using log4net to create logfiles that can be viewed with SvcTraceViewer.exe - log4net

What is the best way to log to a file using log4net in the correct format (correct XML, correct timestamp format, custom data in the correct format, correct attributes; basically exactly the same way XmlWriterTraceListener does it) so that it can be viewed in the Service Trace Viewer Tool (SvcTraceViewer.exe)?

If I wanted to do this, I would write a custom layout. I have not looked at the details yet, but I would write a class that derives from XmlLayoutBase. I need some more time to dig into it...
You could also write your own appender, but I think in this case it makes more sense to write a layout class.
Edit: Maybe writing your own appender is a good idea after all. In that case you could use the System.ServiceModel.Diagnostics.DiagnosticTrace class. I am not sure yet whether that is the way to go. I do not have much time right now, but I will look into this.
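In the meantime, here is a very rough, untested sketch of the layout approach. The class name E2ETraceEventLayout is made up, and the contents of the System element are my best guess at the E2ETraceEvent format that XmlWriterTraceListener emits, so compare the output against a real .svclog file before relying on it:

using System;
using System.Diagnostics;
using System.Xml;
using log4net.Core;
using log4net.Layout;

// Untested sketch: write each event in the E2ETraceEvent format that
// SvcTraceViewer.exe expects. Attach it to a FileAppender.
public class E2ETraceEventLayout : XmlLayoutBase
{
    private const string E2ENs = "http://schemas.microsoft.com/2004/06/E2ETraceEvent";
    private const string SysNs = "http://schemas.microsoft.com/2004/06/windows/eventlog/system";

    protected override void FormatXml(XmlWriter writer, LoggingEvent loggingEvent)
    {
        writer.WriteStartElement("E2ETraceEvent", E2ENs);

        writer.WriteStartElement("System", SysNs);
        writer.WriteElementString("EventID", SysNs, "0");
        writer.WriteElementString("Type", SysNs, "3");

        writer.WriteStartElement("SubType", SysNs);
        writer.WriteAttributeString("Name", loggingEvent.Level.Name);
        writer.WriteString("0");
        writer.WriteEndElement();

        // 8 is the numeric trace level XmlWriterTraceListener writes for Information;
        // a real implementation would map log4net levels properly.
        writer.WriteElementString("Level", SysNs, "8");

        writer.WriteStartElement("TimeCreated", SysNs);
        writer.WriteAttributeString("SystemTime", loggingEvent.TimeStamp.ToUniversalTime().ToString("o"));
        writer.WriteEndElement();

        writer.WriteStartElement("Source", SysNs);
        writer.WriteAttributeString("Name", loggingEvent.LoggerName);
        writer.WriteEndElement();

        writer.WriteStartElement("Correlation", SysNs);
        writer.WriteAttributeString("ActivityID", Trace.CorrelationManager.ActivityId.ToString("B"));
        writer.WriteEndElement();

        // Domain and ThreadName are only approximations of ProcessName/ThreadID.
        writer.WriteStartElement("Execution", SysNs);
        writer.WriteAttributeString("ProcessName", loggingEvent.Domain);
        writer.WriteAttributeString("ThreadID", loggingEvent.ThreadName);
        writer.WriteEndElement();

        writer.WriteElementString("Channel", SysNs, "");
        writer.WriteElementString("Computer", SysNs, Environment.MachineName);
        writer.WriteEndElement(); // System

        writer.WriteStartElement("ApplicationData", E2ENs);
        writer.WriteString(loggingEvent.RenderedMessage);
        writer.WriteEndElement();

        writer.WriteEndElement(); // E2ETraceEvent
    }
}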

Not an answer, but I asked a question earlier today about logging and WCF, and one of the things I wanted to know about was the Service Trace Viewer. All of the examples I have seen describe the XML files consumed by the Service Trace Viewer being generated via System.Diagnostics TraceSources and the System.Diagnostics XmlWriterTraceListener. Anyway, if I get any answers on my post you might find them useful.

Here is an idea:
You could write a custom log4net appender and have it write messages (indirectly) to the XmlWriterTraceListener. Inside the Append method you simply forward the message to System.Diagnostics.
Here is one example of a custom Appender.
In the example, Append is overridden. It is passed a LoggingEvent object. For your purposes (getting the log4net output routed to a format that SvcTraceViewer can read), you could write your output to System.Diagnostics, having first configured it to log via the XmlWriterTraceListener. You could write using the Trace.Write* methods, the Trace.Trace* methods, or via a TraceSource.
For TraceSources, you could use the logger name (which is available on the LoggingEvent) as the TraceSource name. You could then configure a TraceSource in your app.config file for each log4net logger name that you want to go into the XML file. Your Append logic might then look something like this:
protected override void Append(LoggingEvent le)
{
    // LoggerName holds the log4net logger name; RenderedMessage is the formatted message text.
    TraceSource ts = new TraceSource(le.LoggerName);
    ts.TraceEvent(LogLevelToTraceEventType(le.Level), 0, le.RenderedMessage);
}
This might give you what you want. Note that I have not actually done this, so I cannot say whether it is a good idea, but it certainly seems like it would work.
Sorry for being brief, but I am trying to finish this before I have to leave.
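Expanding on that a little, a fuller (still untested) sketch of the appender might look like this. The class name TraceSourceAppender and the helper LogLevelToTraceEventType are made up for illustration:

using System.Diagnostics;
using log4net.Appender;
using log4net.Core;

// Untested sketch: routes log4net events to a System.Diagnostics TraceSource
// named after the log4net logger, so an XmlWriterTraceListener can pick them up.
public class TraceSourceAppender : AppenderSkeleton
{
    protected override void Append(LoggingEvent le)
    {
        // Creating a TraceSource per event is wasteful; a real implementation
        // would cache one TraceSource per logger name.
        TraceSource ts = new TraceSource(le.LoggerName);
        ts.TraceEvent(LogLevelToTraceEventType(le.Level), 0, le.RenderedMessage);
    }

    // Hypothetical helper mapping log4net levels onto TraceEventType values.
    private static TraceEventType LogLevelToTraceEventType(Level level)
    {
        if (level >= Level.Fatal) return TraceEventType.Critical;
        if (level >= Level.Error) return TraceEventType.Error;
        if (level >= Level.Warn)  return TraceEventType.Warning;
        if (level >= Level.Info)  return TraceEventType.Information;
        return TraceEventType.Verbose;
    }
}

Each TraceSource created this way would also need to be wired to the XmlWriterTraceListener in app.config, roughly like the following (the source name MyApp.SomeLogger is just a placeholder for one of your log4net logger names):

<system.diagnostics>
  <sources>
    <source name="MyApp.SomeLogger" switchValue="Information, ActivityTracing">
      <listeners>
        <add name="xml" type="System.Diagnostics.XmlWriterTraceListener"
             initializeData="app_tracelog.svclog" />
      </listeners>
    </source>
  </sources>
  <trace autoflush="true" />
</system.diagnostics>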

Related

How can I change the log level on the fly with tracing?

As a developer I would like to adjust the log level on the fly. For example, I don't want to log debug! events while everything is going fine, but when something goes wrong I would like to raise the log level without restarting the application. I checked the documentation and could not find an example, so I would like to know whether this is possible.
// how can I change the max_level of the subscriber after it was initialised?
use tracing::{debug, Level};

let subscriber = tracing_subscriber::fmt().with_max_level(Level::INFO).finish();
tracing::subscriber::set_global_default(subscriber).expect("failed to set global subscriber");
debug!("some log message"); // filtered out while the max level is INFO
You can use a reload handle for exactly this. Save the reload handle in some global state and call it to change the log level. See the example under "Reloading a Filtered Layer" in the tracing_subscriber::reload documentation.
Use log4rs instead. It reads its configuration from a file, which you can change on the fly.
I use it in one of my projects, and it works like a charm.

How to change the Apache POI SXSSFWorkbook default temporary file name

I am using POI's SXSSFWorkbook class to create extremely large workbooks. Multiple instances of my application may be running concurrently, so I thought it prudent to append the process ID to the default temporary file name. I don't know how to do that, and could not find any recent code examples.
Can anyone point me to an example, or outline what has to be done? I see there is a static TempFile.createTempFile method. Should I be executing that, using a class override, before instantiating the SXSSFWorkbook class? Or after?
I also saw there is a DefaultTempFileCreationStrategy class, but could not find examples of how to use it either.
The main class that Apache POI uses for this is TempFile.
The method you'll want to call is TempFile.setTempFileCreationStrategy.
What you'll need to do is create your own class implementing the TempFileCreationStrategy interface. This is nice and simple, with just two methods: createTempDirectory and createTempFile.
To get an idea of what's involved, you can look at the source code for DefaultTempFileCreationStrategy online here. It's pretty simple; just put in whatever logic your own needs require in terms of threading and naming.

Logstash Shipper configuration for Jira

I am running Jira and Confluence within my company, and I would like the log files to be shipped to Kibana.
This is very easy to do, but I do not want to rewrite the Grok filters. I cannot imagine that nobody has done this already.
Does anybody have an example of a Logstash shipper configuration? Most of the logging, like catalina.log, is standard.
Please help me with examples.
One would think that Java application logs only come in one form, but my experience is that there often are subtle differences. Sometimes the thread name is in square brackets and sometimes in parentheses, sometimes the thread name goes first and other times after the logger name, and so on. This gets more painful as you attempt to parse more than one type of log.
Instead of messing with various filters to join multiline messages and grok all the fields I strongly favor using the Log4j layout in github.com/logstash/log4j-jsonevent-layout to produce JSON-based logs that Logstash can read directly without any filters. Apart from not having to maintain filters you get all fields from each log event. Since I don't know what your catalina.log looks like I can't say what you'd be missing by parsing its contents instead of using the JSON layout.
The drawback is that it's a bit more work deployment-wise. You obviously have to deploy the layout jar file itself, but it also has a couple of dependencies of its own (net.minidev:json-smart and commons-lang:commons-lang) that you need to make available.

log4net configuration: Can I refer to the same layout in several appenders?

I want to send log messages to several files (i.e. different appenders) based upon some property of the message.
The problem is that each appender needs to specify quite a verbose layout (containing a complicated conversionPattern and a couple of converters). I have ended up duplicating this configuration in each appender. This works, but it is not ideal: it makes the config much longer than I would like, and it is a pain to have to update three complicated bits of configuration whenever the layout changes.
I want to be able to define the layout once and have all my appenders refer back to that one definition (in the same way that several loggers can refer to the same appender). But perhaps there is a better way to achieve my goal of reducing duplication in the configuration?
My google-foo is weak and I could not find an answer. Can anyone here help?
TIA.
I am sorry, but unless you are ready to code your own layout class there is currently no way to avoid the copy-pasted configuration.
You can inherit from LayoutSkeleton to get started, and either build your layout in code or use an alternate configuration file (I don't think log4net would be kind to a dangling layout configuration in its config file).
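To illustrate the LayoutSkeleton route: the (hypothetical, untested) class below bakes the pattern and converters into code once, so each appender's configuration shrinks to a single layout element referencing it. SharedLayout, MyConverter and the pattern itself are placeholders:

using System.IO;
using log4net.Core;
using log4net.Layout;

// Hypothetical shared layout: the pattern and custom converters live here once,
// and every appender just references this class in its <layout> element.
public class SharedLayout : LayoutSkeleton
{
    // The complicated conversionPattern goes here (this one is only a placeholder).
    private readonly PatternLayout inner =
        new PatternLayout("%date [%thread] %-5level %logger - %message%newline");

    public SharedLayout()
    {
        // Register custom converters once, e.g.:
        // inner.AddConverter("myConverter", typeof(MyConverter));
        inner.ActivateOptions();
    }

    public override void ActivateOptions()
    {
        // Nothing to configure from XML; everything is baked into the code.
    }

    public override void Format(TextWriter writer, LoggingEvent loggingEvent)
    {
        inner.Format(writer, loggingEvent);
    }
}

Each appender would then just contain <layout type="MyNamespace.SharedLayout, MyAssembly" /> (names are placeholders), so a change to the pattern only has to be made in one place.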

How to disable any log for a thread

I am using org.apache.log4j.Logger for logging and I am developing a jsp just for monitoring purposes.
This jsp uses classes that write INFO-level logs that are not interesting for this monitoring, but are annoying, since I want to execute this jsp very often.
So, my question is this:
Is there any way to disable these INFO logs just for the thread where my jsp is running?
If there is not, maybe this approach might fit the bill:
Is there any way to tell log4j that the level for one given class is FATAL just for a few milliseconds?
There's no easy way to do this using simple log4j config.
However, you can have your code install a custom filter on the appropriate logger. See the interface reference here:
http://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/spi/Filter.html
1. Construct an object that implements the filter you want.
2. Find the logger using LogManager.getLogger("loggername").
3. Insert the filter.
Is the thread-id being logged? If so, can't you just use grep -v to remove the annoying lines that are for that thread and INFO level?
