How do I use "Load Remote Log4J File" functionality of Chainsaw v2? - log4j

I'm trying to set up the Chainsaw viewer, but I'm not really getting how it's supposed to work.
This is the XML file in the Java project to be logged (i.e. the one I want to watch in Chainsaw v2):
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration >
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/" debug="true">
<plugin name="XMLSocketReceiver" class="org.apache.log4j.net.XMLSocketReceiver">
<param name="decoder" value="org.apache.log4j.xml.UtilLoggingXMLDecoder"/>
<param name="Port" value="4000"/>
<param name="threshold" value="ALL"/>
</plugin>
<root>
<priority value="debug"/>
</root>
</log4j:configuration>
Here's a screenshot of the Chainsaw option menu:

A couple of things:
The latest developer snapshot of Chainsaw has a lot of new features, including a reworked configuration UI that should make setup simpler (the File > Load Chainsaw configuration menu option). You can get it here: http://people.apache.org/~sdeboy
The log4j.xml file used by the application generating the logging needs an 'appender' entry, not a 'receiver' entry. The Chainsaw configuration will contain a 'receiver' entry once you have it set up (it 'receives' events generated by an 'appender'), which again I would suggest doing via the configuration UI. Just choose the option to save the config file from the configuration screen, and check the box that says 'always start Chainsaw with this configuration'.
You can use a SocketAppender/SocketHubAppender on the application logging side, or a FileAppender of some kind. If you choose to use a FileAppender, Chainsaw's configuration screen can read in your application-side log4j.xml and generate the correct configuration for you.
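For example, an application-side log4j.xml using a SocketAppender might look like the sketch below. Note this pairs with a SocketReceiver in Chainsaw (not the XMLSocketReceiver from the question), and the host and port values here are assumptions that must match whatever receiver you configure:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <!-- Sends serialized LoggingEvents to a Chainsaw SocketReceiver -->
  <appender name="chainsaw" class="org.apache.log4j.net.SocketAppender">
    <param name="RemoteHost" value="localhost"/>
    <param name="Port" value="4445"/>
    <param name="ReconnectionDelay" value="10000"/>
  </appender>
  <root>
    <priority value="debug"/>
    <appender-ref ref="chainsaw"/>
  </root>
</log4j:configuration>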
If you have additional questions, feel free to send them here or to the log4j users mailing list, available here: http://logging.apache.org/log4j/1.2/mail-lists.html

Related

Azure Log Streaming with NLog

I'm trying to get NLog working with the Azure web app Log Stream.
The logs do appear if I don't use NLog and just use System.Diagnostics.Trace.WriteLine.
However, if I use the Trace target type in my nlog.config, it doesn't show the trace logs ...
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<targets>
<target xsi:type="Trace" name="trace" layout="${message}" />
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="trace" />
</rules>
</nlog>
I can't see anything that I'm doing differently from the accepted answer here ...
How to integrate NLog to write log to Azure Streaming log
Note that I cut down that nlog.config file to just show the Trace target - I normally also have a File target; I've tried with and without it.
I've logged onto the deployed Azure website, and the NLog config file was uploaded successfully. I'm deploying using GitHub deployment.
I have logging in Azure set to use just file system logging, set to verbose.
Any ideas?
It turns out that when Visual Studio enabled Application Insights (a recent addition to the project), it inserted an NLog config section into my web.config, which meant my nlog.config file wasn't being used at all. I fixed it by removing that NLog config section from web.config and copying the Application Insights target into my nlog.config file instead. The 'Trace' target type now works as expected and appears in the Azure Streaming Logs.
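For reference, an nlog.config carrying both the Application Insights target and the Trace target might look roughly like this (a sketch assuming the Microsoft.ApplicationInsights.NLogTarget package provides the target; the target and rule names are illustrative):

<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <!-- Registers the target shipped with the Application Insights NLog package -->
    <add assembly="Microsoft.ApplicationInsights.NLogTarget" />
  </extensions>
  <targets>
    <target xsi:type="ApplicationInsightsTarget" name="aiTarget" />
    <target xsi:type="Trace" name="trace" layout="${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="trace,aiTarget" />
  </rules>
</nlog>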
The NLog Trace target only performs Trace.WriteLine for Debug-level log events.
Maybe try the custom MyTraceTarget shown here:
https://github.com/NLog/NLog/issues/1968
Update: NLog 4.5 adds a new rawWrite setting for the Trace target so it always performs WriteLine independent of LogLevel. See also https://github.com/NLog/NLog/wiki/Trace-target
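With that setting, the target from the question becomes (per the wiki page above; requires NLog 4.5 or later):

<target xsi:type="Trace" name="trace" layout="${message}" rawWrite="true" />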

Service Fabric Default Publish Profile other than Local.xml

Our company is developing our new applications using Service Fabric.
A common problem we have: multiple developers use queues, databases, and storage on remote servers, and each one has a different configuration for these. All the settings are stored in an ApplicationParameters file per environment; for local development there is a single Local.5Node.xml. It is very common for developers to check in their credentials and overwrite others' when we get the latest version of these files.
I'm trying to customize the Service Fabric deployment script 'Deploy-FabricApplication.ps1' to use a custom publish profile depending on the Windows credentials of the logged-in user. I can achieve that by updating the deployment file, and it works well when we deploy using Publish, but it seems the default behavior of Service Fabric when we hit F5 (debug) is to overwrite the parameters with the specific Local.5Node.xml application parameters.
I explored all the Service Fabric .ps1 files and couldn't find where this is defined. I guess it is defined in a .targets file, so I don't know how to avoid this default behaviour.
Is there any other approach to using custom publish profiles on local development machines, other than Local.5Node.xml?
I actually just ran into this while setting up some team-specific environments. I borrowed information from the following sources:
Web Config Transformation
Replace String In File With MSBUILD
I added multiple parameter files based on what was needed for the different teams, each containing their specific resource settings.
I also added a Local.1Node.Template.xml and Local.5Node.Template.xml. I even removed Local.1Node.xml and Local.5Node.xml from source control and set them to be ignored, while leaving them in the projects so that Visual Studio doesn't think they are missing. The contents of the 1Node template (5Node is the same except for replacing 1Node with 5Node) are as follows:
<?xml version="1.0" encoding="utf-8"?>
<PublishProfile xmlns="http://schemas.microsoft.com/2015/05/fabrictools">
<ClusterConnectionParameters />
<ApplicationParameterFile Path="..\ApplicationParameters\Local.1Node.$(Configuration).xml" />
</PublishProfile>
I then edited the sfproj file for the Service Fabric project to contain the following MSBuild Task and Target:
<UsingTask TaskName="ReplaceFileText" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <InputFilename ParameterType="System.String" Required="true" />
    <OutputFilename ParameterType="System.String" Required="true" />
    <MatchExpression ParameterType="System.String" Required="true" />
    <ReplacementText ParameterType="System.String" Required="true" />
  </ParameterGroup>
  <Task>
    <Reference Include="System.Core" />
    <Using Namespace="System" />
    <Using Namespace="System.IO" />
    <Using Namespace="System.Text.RegularExpressions" />
    <Code Type="Fragment" Language="cs">
      <![CDATA[
      File.WriteAllText(
          OutputFilename,
          Regex.Replace(File.ReadAllText(InputFilename), MatchExpression, ReplacementText)
      );
      ]]>
    </Code>
  </Task>
</UsingTask>
<Target Name="UpdateProfile" BeforeTargets="UpdateServiceFabricApplicationManifest">
  <ReplaceFileText InputFilename="PublishProfiles\Local.1Node.Template.xml" OutputFilename="PublishProfiles\Local.1Node.xml" MatchExpression="\$\(Configuration\)" ReplacementText="$(Configuration)" />
  <ReplaceFileText InputFilename="PublishProfiles\Local.5Node.Template.xml" OutputFilename="PublishProfiles\Local.5Node.xml" MatchExpression="\$\(Configuration\)" ReplacementText="$(Configuration)" />
</Target>
The final step was to set up the different build configurations for the teams. I created FT1-Debug through FT6-Debug based on the Debug configuration in the Service Fabric service project and the Service Fabric host project. I left all of my other projects alone.
At this point everyone on the different teams can debug locally with the correct configuration for the cluster they are working in, just by changing the build configuration and pressing F5.
The VS extension for Service Fabric defines a hard-coded publish profile when we debug the solution in Visual Studio: it checks how many nodes the cluster has and links to Local.1Node.xml or Local.5Node.xml accordingly.
To accomplish similar results, we ended up using custom application parameters per developer, with each developer updating the publish profile (Local.5Node.xml) to point to their respective application parameter file.
It is not as automated as the requested feature, but it solves the main problem.

XML transformation not working

I have installed the SlowCheetah extension and NuGet package into my console app project. I used the context menu to add a UAT build configuration and updated a test setting to check that the value is being transformed.
Unfortunately it's not: when I try to preview the transform via the context menu, it just shows me the non-transformed App.config.
What steps can I check to see why this extension is not working?
In the main App.config I have specified an appSetting.
<appSettings>
  <add key="TomTestTransform" value="LOCAL" />
</appSettings>
In the App.UAT.config I override it:
<appSettings>
  <add key="TomTestTransform" value="UAT" />
</appSettings>
When I preview the transform, or build and check the configuration output, it's always using the non-transformed version. The setting equals LOCAL.
You need to use xdt: attributes to match and adapt the elements, like so:
<?xml version="1.0" encoding="utf-8" ?>
<!-- For more information on using transformations
see the web.comfig examples at http://go.microsoft.com/fwlink/?LinkId=214134. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
<appSettings>
<add key="TomTestTransform"
value="UAT"
xdt:Transform="Replace"
xdt:Locator="Match(key)" />
</appSettings>
</configuration>
With xdt:Locator="Match(key)" you are telling the processor to match the add element based on the key attribute, and xdt:Transform="Replace" applies the replace logic to the whole matched element.
There is an MSDN entry on the possible XML transformations, which is also applicable to SlowCheetah transformations, as they are based on the same "technology".
Additionally, the extension overview also has some good documentation!

What is the best way to centralize logging with NLog?

I have been assigned a project with a lot of poorly written code that is based around SharePoint.
It consists of about 15 subprojects: some are Windows services, some web services, some web applications running inside SharePoint, some web parts, and even console applications. They all run on the same server and call each other.
There are already many issues in production but they are hard to trace down.
The original developer must have been a fan of either Salinger or Pokémon series judging by his tireless effort to catch all exceptions. Unfortunately, none of them get reported or logged, ever.
My current task is to introduce logging into the whole project so I can find now-invisible exceptions, follow tangled recurring calls, and have at least some stack traces. I decided to go with NLog, seeing that it's active and cool, as opposed to log4net, which is perfectly fine but somewhat less fancy for my taste.
Because the components are tightly coupled, I want to centralize logging in one file so related errors don't get scattered across the hard drive. Therefore, I am looking to have two or three different log files with five or more projects writing to each of them more or less simultaneously.
What is the best way to configure NLog to centralize logging? Should I have a config file for each project, or should related projects share them? Where should I put config file to log from SharePoint webparts? Am I going to face any permission issues?
I'm using SharePoint 2007.
The easiest way to centralize is probably to simply log to a database, one benefit being that multiple applications can write to a database more easily than to the same log file. For each application, configure NLog to log to the Database target, using the same Database target configuration parameters for each. Your NLog.config file might look something like this:
<?xml version="1.0" encoding="utf-8" ?>
<!--
This file needs to be put in the application directory. Make sure to set
'Copy to Output Directory' option in Visual Studio.
-->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
autoReload="true"
internalLogLevel="Debug"
internalLogFile="nlog_log.log">
<targets async="true">
<target name="sqlexpress" xsi:type="Database">
<connectionString>
Data Source=.\SQLEXPRESS;Initial Catalog=LoggingDB;Integrated Security=True;
</connectionString>
<commandText>
insert into LogTable(DateTime,Logger,LogLevel,Message,ProcessId,ManagedThreadId) values (#DateTime,#Logger,#LogLevel,#Message,#ProcessId,#ManagedThreadId);
</commandText>
<parameter name="#DateTime" layout="${date:format=yyyy\-MM\-dd HH\:mm\:ss.fff}"/>
<parameter name="#Logger" layout="${logger}"/>
<parameter name="#LogLevel" layout="${level}"/>
<parameter name="#Message" layout="${message}"/>
</target>
</target>
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="sqlexpress" />
</rules>
</nlog>
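Note that this config assumes a matching LogTable already exists in LoggingDB; a possible schema (the column types here are assumptions) would be:

CREATE TABLE LogTable (
    Id int IDENTITY(1,1) PRIMARY KEY,  -- surrogate key, not referenced by the config
    [DateTime] datetime NOT NULL,
    Logger nvarchar(256),
    LogLevel nvarchar(16),
    Message nvarchar(max),
    ProcessId int,
    ManagedThreadId int
);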
You could certainly log to files in addition to (or instead of) logging to a database.
I am not familiar with doing this from SharePoint, so I can't comment on any configuration or permission issues that you might run into there.
Here is a link I found where there is a discussion of getting NLog to work in a SharePoint environment:
http://nlog-forum.1685105.n2.nabble.com/Is-anyone-using-NLog-with-SharePoint-td2171451.html
That link appears to put you at the top of the NLog forum instead of the specific post. Search for this text in the forum "Is anyone using NLog with SharePoint" and you should find the right post.
Good luck!
You could also just leverage the existing logging infrastructure in SharePoint and write to the ULS logs. This way your log information can be viewed in a complete context using the ULS log viewer. For SharePoint 2007, see this blog on how to write to the ULS log:
SharePoint Trace Logs and the Unified Logging Service (ULS)
With SharePoint 2010 it has become even easier with improvements to the SPDiagnosticsService class where you can use the new WriteTrace method.
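A minimal sketch of writing to ULS via that API in SharePoint 2010 (the category name, severities, and message here are illustrative):

using Microsoft.SharePoint.Administration;

// Writes one trace entry to the ULS logs under a custom category
SPDiagnosticsService.Local.WriteTrace(
    0,                                              // trace id (0 = none)
    new SPDiagnosticsCategory("MyApp",              // hypothetical category name
        TraceSeverity.Medium, EventSeverity.Information),
    TraceSeverity.Medium,
    "Something happened: {0}",
    "details");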
Personally, I log exceptions to the Windows Event Log, and I use NLog for logging details, debug information or tracing.
Since NLog can be easily switched on and off, I only activate it when I'm debugging or when I need to inspect an exception in production. I never was a big fan of the default tracing functionality in .NET.
I prefer simple plain text log files. Although logging to a database works great if you don't have too many "log lines" implemented in your code.
I feel like we are working on the same project! Multiple projects consisting of web projects, core dll projects, console apps, services, etc. Unfortunately I'm not working in sharepoint like you are, but I can describe how I am trying to centralize our logging.
We have one core .NET Framework project. This is where I placed our wrapper class for the log. This project also holds the NLog dlls and the NLog config file. In this core project file you can add the following, which automatically copies the config when you build projects with a dependency on this core project:
<None Include="Logging\NLog.config">
  <Link>NLog.config</Link>
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
We found that some web projects that don't compile to a dll don't automatically pull in the config file, so we leave that up to the build process. This helps centralize our logging so you only have to manage a single config across them all.
Also, remember that when you create a logger per class, the logger name has the namespace in it, so you can define targets that filter by namespace if you want different settings for particular projects.
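In NLog that is just the standard per-class logger, which picks up the full type name (including namespace) automatically:

using NLog;

public class OrderService   // hypothetical class
{
    // Logger name becomes e.g. "MyCompany.Orders.OrderService"
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();
}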
As for centralizing where the logs end up, we chose to use a file target and specify the full path. This is because on our servers the applications run off the C: drive, but we have a larger D: drive which can store the logs. In production we also have multiple servers, so we use Splunk to aggregate all of our logs.
If Splunk is out of the question and you are on a distributed system, a database sounds like a good idea, as suggested above. If you don't want to stand up a SQL instance, there are target wrappers for MongoDB as well.
Hopefully that's helpful; I'm curious if anyone has suggestions or opinions on how I'm doing it as well!

SharePoint and Log4Net

I'm looking for best practices for integrating log4net with SharePoint for web requests, feature activation, and all the timer stuff.
I have several subprojects in my farm, and I would like to have only one Log4Net.config file.
[Edit]
Not only do I need to configure log4net for the web application, which is easy to do (I use global.asax and a log4net.config file, so I can modify log settings without reloading the web app), but I also need to log asynchronous events:
Event Handler (like ItemAdded)
Timer Jobs
...
I implemented this recently and came up with a solution that worked for me.
Deploy your log4net config file to the 12 hive and the log4net dll into the GAC using a globally scoped solution. Then, in your application code, explicitly initialize log4net from the location of your global file. This allows you to log from feature receivers, timer jobs and web application code.
[assembly: log4net.Config.XmlConfigurator(ConfigFile =
    @"C:\Program Files\Common Files\Microsoft Shared\" +
    @"Web Server Extensions\12\CONFIG\log4net.config", Watch = true)]
See here: http://www.codeproject.com/KB/sharepoint/SharepointLog4Net.aspx
Firstly, you will need to modify the web.config where your SharePoint virtual directory resides, because you'll need to add SafeControl entries to trust the log4net assembly. You can update web.config programmatically using the SPWebConfigModification class in a feature receiver (a sketch follows). As you have to modify web.config anyway, you may want to consider including your log4net config inside it rather than setting up an external log4net config.
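A sketch of adding the SafeControl entry with SPWebConfigModification from a web-application-scoped feature receiver (the assembly strong name below is a placeholder; substitute the real strong name of your log4net dll):

// Inside FeatureActivated of a web-application-scoped feature receiver
SPWebApplication webApp = (SPWebApplication)properties.Feature.Parent;

SPWebConfigModification mod = new SPWebConfigModification
{
    Path  = "configuration/SharePoint/SafeControls",
    Name  = "SafeControl[@Assembly='log4net']",
    Owner = "Log4NetFeature",   // hypothetical owner tag used to find/remove the entry later
    Type  = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode,
    Value = "<SafeControl Assembly='log4net, Version=..., Culture=neutral, PublicKeyToken=...' " +
            "Namespace='log4net' TypeName='*' Safe='True' />"
};
webApp.WebConfigModifications.Add(mod);
webApp.Update();
webApp.WebService.ApplyWebConfigModifications();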
However, if you'd still like to do this, it may work if you add the following to the web.config file:
<configuration ...>
  ...
  <configSections>
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler,log4net" />
  </configSections>
  <log4net configSource="log4Net.config" />
  ...
</configuration>
The log4net.config file should then be able to live alongside your web.config. As Nat says, you could deploy this file as a solution package.
Assuming you are attempting to run at minimal trust, you will need to update your Code Access Security file to include the log4net assemblies as well. All of your custom SharePoint code should then automatically use your log4net configuration.
You could release the config file as part of the solution package(s) to the 12 hive (use STSDev to create the packages). This would give you a set location for the config, and any changes to it can be released in a controlled manner (i.e. no need for manual edits; just roll back and re-install the solution).
I developed a log4net feature and packaged it in a wsp file. The feature receiver adds an HttpModule to the web.config, and the HttpModule loads log4net.config from the layouts directory when the application start event is raised.
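A sketch of such a module (the class name and config path are assumptions; ConfigureAndWatch is the stock log4net initializer):

using System.IO;
using System.Web;
using System.Web.Hosting;

public class Log4NetModule : IHttpModule   // hypothetical module name
{
    private static bool _configured;
    private static readonly object Sync = new object();

    public void Init(HttpApplication context)
    {
        // Init runs once per pooled HttpApplication; configure log4net only once
        lock (Sync)
        {
            if (_configured) return;
            string path = HostingEnvironment.MapPath("/_layouts/log4net.config");
            log4net.Config.XmlConfigurator.ConfigureAndWatch(new FileInfo(path));
            _configured = true;
        }
    }

    public void Dispose() { }
}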
