Azure Log Streaming with NLog

I'm trying to get NLog working with the Azure web app Log Stream.
The logs do appear if I don't use NLog and just call System.Diagnostics.Trace.WriteLine directly.
However, if I use the Trace target type in my nlog.config, the trace logs don't show up:
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <target xsi:type="Trace" name="trace" layout="${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="trace" />
  </rules>
</nlog>
I can't see anything that I'm doing differently from the accepted answer here ...
How to integrate NLog to write log to Azure Streaming log
Note that I cut the nlog.config file down to just the Trace target; normally I also have a File target, and I've tried with and without it.
I've logged onto the deployed Azure website, and the nlog.config file has been uploaded successfully. I'm deploying via GitHub deployment.
In Azure I have logging set to use file system logging only, with the level set to verbose.
Any ideas?

It turns out that when Visual Studio enabled Application Insights (something I had recently added to the project), it inserted an nlog config section into my web.config. That meant my nlog.config file wasn't being used at all. I fixed it by removing the nlog section from web.config and copying the Application Insights target into my nlog.config file instead. The Trace target type now works as expected, and the output appears in the Azure Streaming Logs.
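For anyone hitting the same issue, here is a rough sketch of what the merged nlog.config can look like after the move. The ApplicationInsightsTarget type and Microsoft.ApplicationInsights.NLogTarget assembly names are assumptions based on what the Application Insights NuGet package normally registers; copy the exact names from the section Visual Studio put into your web.config.
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <!-- Assumed assembly/target names; take the exact ones from the nlog
       section that Visual Studio generated in web.config. -->
  <extensions>
    <add assembly="Microsoft.ApplicationInsights.NLogTarget" />
  </extensions>
  <targets>
    <target xsi:type="Trace" name="trace" layout="${message}" />
    <target xsi:type="ApplicationInsightsTarget" name="aiTarget" />
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="trace,aiTarget" />
  </rules>
</nlog>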

The NLog Trace target only performs Trace.WriteLine for Debug-level log events.
Maybe try the custom MyTraceTarget shown here:
https://github.com/NLog/NLog/issues/1968
Update: NLog 4.5 adds a new setting, rawWrite, for the NLog Trace target so it always performs WriteLine independent of LogLevel. See also https://github.com/NLog/NLog/wiki/Trace-target
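As a minimal sketch of the rawWrite setting (assuming NLog 4.5 or later), the target line from the question becomes:
<!-- rawWrite="true" makes the target always call Trace.WriteLine,
     independent of LogLevel. -->
<target xsi:type="Trace" name="trace" rawWrite="true" layout="${message}" />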

Related

Problem with NLog / NuGet adding duplicate sections to app.config

If I create a console app, for example with VS2019, then add the NLog and NLog.Schema NuGet packages, I configure NLog in app.config and it looks like this:
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://www.nlog-project.org/schemas/NLog.xsd NLog.xsd" autoReload="true" throwExceptions="true" throwConfigExceptions="true"
      internalLogLevel="Off" internalLogToConsoleError="false" internalLogFile="c:\temp\nlog-internal.log">
...
My problem is that if a new version of NLog comes out and it's updated via NuGet, it insists on adding a fairly empty section to my app.config. If I don't notice this, the app will not run because there are two nlog sections.
Is there a way to prevent this, apart from being more careful and checking after updates? I am thinking of putting the config in a separate file to see if that helps.
My problem is that if a new version of NLog comes out and it's updated via NuGet,
This only happens when you update the NLog.Schema package, so you could skip updating that package. The schema package is not used for NLog's runtime behavior; it only provides the NLog.xsd for Intellisense.
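As a sketch of the separate-file approach mentioned in the question: keep app.config free of any <nlog> section and put the configuration in a standalone NLog.config in the output directory, which NLog picks up automatically. The file target shown here is just a placeholder.
<?xml version="1.0" encoding="utf-8" ?>
<!-- Standalone NLog.config; set 'Copy to Output Directory' so it lands next
     to the executable, where NLog looks for it by default. -->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true">
  <targets>
    <target xsi:type="File" name="file" fileName="c:\temp\app.log"
            layout="${longdate} ${level} ${logger} ${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="file" />
  </rules>
</nlog>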

Refer a connectionstring from web config in another config file

I am using NLog to write logs to my database.
I have created a file, NLog.config, which writes logs to a text file as of now.
To write the logs to a database, I am following this tutorial.
However, the connection strings for different environments can only be modified in Web.config (I am using Azure App Services). Is there any way I can reference the connection string from web.config in NLog.config?
TIA
If you're not using ASP.NET Core (but "full" ASP.NET), you could use ${appsetting:name=..}.
Install NLog.Extended with NuGet and use ${appsetting:name=..} in your config file.
e.g.
<target name="database"
        type="Database"
        connectionString="${appsetting:name=myConnectionString}" />
See also the ${appsetting} documentation
NB: It can only read <appSettings> and not <connectionStrings>
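Given that limitation, a workaround sketch is to duplicate the connection string as an <appSettings> entry in web.config, which is what the layout renderer above reads (the key name myConnectionString and the value are placeholders):
<!-- web.config: a copy of the connection string exposed as an app setting so
     ${appsetting:name=myConnectionString} can read it. -->
<appSettings>
  <add key="myConnectionString"
       value="Data Source=myServer;Initial Catalog=MyDb;Integrated Security=True;" />
</appSettings>
Azure App Service application settings with the same key should then still override this value per environment.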

How to override web.config values in custom section in Azure Web App?

It is possible in Azure Web App to override web.config AppSettings section easily. E.g. if I have the following web.config:
<appSettings>
  <add key="AllowedCORSOrigin" value="http://localhost:26674"/>
</appSettings>
I can override it in the App Settings section of the Azure portal.
I also have a custom section in the web.config:
<AdWordsApi>
  <add key="OAuth2RefreshToken" value="TOKEN" />
</AdWordsApi>
Is it possible to override it somehow as well? I have tried AdWordsApi.OAuth2RefreshToken and AdWordsApi:OAuth2RefreshToken, but that does not work that easily.
P.S. It would also be interesting to know whether this is possible with other custom sections, e.g. if I want a different authentication mode on the server:
<system.web>
  <authentication mode="None" />
</system.web>
The short answer is that it is not possible.
The mechanism you describe only works with App Settings and Connection Strings. At a high level, the way it works is:
Your Azure App Settings become environment variables.
At runtime, a special module sets those dynamically in the .NET config system. Note that the physical web.config is never modified.
It would be hard to make such a mechanism work on arbitrary config sections, as those could not be affected dynamically without modifying the physical file.
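A workaround sketch consistent with that: move the value into <appSettings>, where the override mechanism does apply, and read it from there instead of from the custom section (the key name here is hypothetical; an Azure App Setting with the same name would override it at runtime).
<!-- web.config: value promoted from the custom section into appSettings so the
     portal's App Settings override applies; the key name is just an example. -->
<appSettings>
  <add key="AdWordsApi.OAuth2RefreshToken" value="TOKEN" />
</appSettings>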
If you are using Visual Studio, use web.config transformations to change configuration settings depending on whether you are running locally or deploying to Azure:
How to Transform Web.config
In simple terms, you create one or more build configurations (typically Debug and Release). In your Visual Studio solution, right-click your existing web.config file and click "Add Config Transform"; this will create Web.Debug.config and Web.Release.config files which you can then customise with specific settings depending on the environment. Link this with your Azure build configuration and you can then have any combination of settings for local and remote deployment.
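For example, a Web.Release.config along these lines (a sketch; the value would be whatever production needs) replaces the custom section at publish time, so the deployed web.config already carries the environment-specific value:
<?xml version="1.0" encoding="utf-8"?>
<!-- Web.Release.config: applied during publish with the Release configuration. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <AdWordsApi xdt:Transform="Replace">
    <add key="OAuth2RefreshToken" value="PRODUCTION_TOKEN" />
  </AdWordsApi>
</configuration>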
This is old, but I'm leaving this reference to how Azure Resource Manager can potentially be used to solve this.
You can transform the values in a VSTS release by doing the following steps in App.Release.config:
1. Add xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform" to the configuration element:
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
</configuration>
2. Add xdt:Transform="Replace" to the custom section, like below:
<AdWordsApi xdt:Transform="Replace">
  <add key="OAuth2RefreshToken" value="TOKEN" />
</AdWordsApi>
3. Create a variable in the release pipeline, e.g. OAuth2RefreshToken.
4. Then use it in the config file as follows:
<AdWordsApi xdt:Transform="Replace">
  <add key="OAuth2RefreshToken" value="#{OAuth2RefreshToken}#" />
</AdWordsApi>
If you are adding anything to the web.config appSettings, you can override it in Azure App Service using the variable prefix:
Key Name: APPSETTING_AllowedCORSOrigin
Value: http://localhost:26674
https://learn.microsoft.com/en-us/azure/app-service/reference-app-settings?tabs=kudu%2Cdotnet#variable-prefixes

How do I use "Load Remote Log4J File" functionality of Chainsaw v2?

I'm trying to set up the Chainsaw viewer and I'm not really getting how it's supposed to work.
This is the XML file in the Java project to be logged (i.e. the one I want to watch in Chainsaw v2):
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration >
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/" debug="true">
  <plugin name="XMLSocketReceiver" class="org.apache.log4j.net.XMLSocketReceiver">
    <param name="decoder" value="org.apache.log4j.xml.UtilLoggingXMLDecoder"/>
    <param name="Port" value="4000"/>
    <param name="threshold" value="ALL"/>
  </plugin>
  <root>
    <priority value="debug"/>
  </root>
</log4j:configuration>
Here's a screenshot of the Chainsaw option menu:
A couple of things:
The latest developer snapshot of Chainsaw has a lot of new features, including a reworked configuration UI that should make it simpler (File, Load Chainsaw configuration menu option). You can get it here: http://people.apache.org/~sdeboy
The log4j.xml file used by the application generating the logging needs to have an 'appender' entry, not a 'receiver' entry. The Chainsaw configuration will contain a 'receiver' entry once you have it set up (it 'receives' events generated by an 'appender'); again, I would suggest doing this via the configuration UI. Just choose the option to save the config file from the configuration screen, and check the box that says 'always start Chainsaw with this configuration'.
You can use a SocketAppender/SocketHubAppender on the application logging side, or a FileAppender of some kind. If you choose to use a FileAppender, Chainsaw's configuration screen can read in your application-side log4j.xml and generate the correct configuration for you.
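As a rough sketch of the application-side log4j.xml described above (assuming log4j 1.2; the host, port, and appender name are placeholders, and the matching receiver is defined in Chainsaw's own configuration, not here):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<!-- Application-side config: a SocketAppender that ships events to Chainsaw.
     4560 is the conventional SocketAppender/SocketReceiver port; adjust as needed. -->
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <appender name="chainsaw" class="org.apache.log4j.net.SocketAppender">
    <param name="RemoteHost" value="localhost"/>
    <param name="Port" value="4560"/>
  </appender>
  <root>
    <priority value="debug"/>
    <appender-ref ref="chainsaw"/>
  </root>
</log4j:configuration>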
If you have additional questions, feel free to send them here or to the log4j users mailing list, available here: http://logging.apache.org/log4j/1.2/mail-lists.html

What is the best way to centralize logging with NLog?

I have been assigned a project with a lot of poorly written code that is based around SharePoint.
It consists of about 15 subprojects, some of them being windows services, some web services, some web applications running inside of SharePoint, some being webparts and even console applications. They all run on the same server and call each other.
There are already many issues in production but they are hard to trace down.
The original developer must have been a fan of either Salinger or Pokémon series judging by his tireless effort to catch all exceptions. Unfortunately, none of them get reported or logged, ever.
My current task is to introduce logging into the whole project so I could find now-invisible exceptions, follow tangled recurring calls and have some stack traces at least. I decided to go with NLog, seeing it's active and cool, as opposed to log4net which is perfectly fine but somewhat not as fancy to my taste.
Because the components are tightly coupled, I want to centralize logging in one file so related errors don't get scattered across the hard drive. Therefore, I am looking to have two or three different log files with five or more projects writing to each of them more or less simultaneously.
What is the best way to configure NLog to centralize logging? Should I have a config file for each project, or should related projects share them? Where should I put config file to log from SharePoint webparts? Am I going to face any permission issues?
I'm using SharePoint 2007.
The easiest way to centralize is probably to simply log to a database; one benefit is that multiple applications can write to a database more easily than to the same log file. For each application, configure NLog to log to the Database target, using the same Database target configuration parameters for each. Your NLog.config file might look something like this:
<?xml version="1.0" encoding="utf-8" ?>
<!--
  This file needs to be put in the application directory. Make sure to set
  the 'Copy to Output Directory' option in Visual Studio.
-->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Debug"
      internalLogFile="nlog_log.log">
  <targets async="true">
    <target name="sqlexpress" xsi:type="Database">
      <connectionString>
        Data Source=.\SQLEXPRESS;Initial Catalog=LoggingDB;Integrated Security=True;
      </connectionString>
      <commandText>
        insert into LogTable(DateTime,Logger,LogLevel,Message,ProcessId,ManagedThreadId) values (@DateTime,@Logger,@LogLevel,@Message,@ProcessId,@ManagedThreadId);
      </commandText>
      <!-- One parameter for every placeholder referenced in commandText -->
      <parameter name="@DateTime" layout="${date:format=yyyy\-MM\-dd HH\:mm\:ss.fff}"/>
      <parameter name="@Logger" layout="${logger}"/>
      <parameter name="@LogLevel" layout="${level}"/>
      <parameter name="@Message" layout="${message}"/>
      <parameter name="@ProcessId" layout="${processid}"/>
      <parameter name="@ManagedThreadId" layout="${threadid}"/>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="sqlexpress" />
  </rules>
</nlog>
You could certainly log to files in addition to (or instead of) logging to a database.
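For instance, a shared file target sketch along these lines could sit alongside (or replace) the database target above; the path, layout, and archive settings are placeholders:
<!-- Shared file target: several processes write to the same rolling file.
     concurrentWrites allows multiple processes to append to one file. -->
<target name="sharedfile" xsi:type="File"
        fileName="c:\logs\central.log"
        concurrentWrites="true"
        archiveEvery="Day"
        maxArchiveFiles="14"
        layout="${longdate}|${processname}|${level}|${logger}|${message} ${exception:format=ToString}" />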
I am not familiar with doing this from SharePoint, so I can't comment on any configuration or permission issues that you might run into there.
Here is a link I found where there is a discussion of getting NLog to work in a SharePoint environment:
http://nlog-forum.1685105.n2.nabble.com/Is-anyone-using-NLog-with-SharePoint-td2171451.html
That link appears to put you at the top of the NLog forum instead of the specific post. Search for this text in the forum "Is anyone using NLog with SharePoint" and you should find the right post.
Good luck!
You could also just leverage the existing logging infrastructure in SharePoint and write to the ULS logs. This way your log information can be viewed in full context using the ULS log viewer. For SharePoint 2007, see this blog on how to write to the ULS log:
SharePoint Trace Logs and the Unified Logging Service (ULS)
With SharePoint 2010 it has become even easier with improvements to the SPDiagnosticsService class where you can use the new WriteTrace method.
Personally I log Exceptions to the Event Logger. And I use NLog for logging details, debug information or tracing.
Since NLog can be easily switched on and off I only activate it when I'm debugging or when I need to inspect an exception in production. I never was a big fan of the default tracing functionality in .NET.
I prefer simple plain text log files. Although logging to a database works great if you don't have too many "log lines" implemented in your code.
I feel like we are working on the same project! Multiple projects consisting of web projects, core DLL projects, console apps, services, etc. Unfortunately I'm not working in SharePoint like you are, but I can describe how I am trying to centralize our logging.
We have one core .NET Framework project. This is where I placed our wrapper class for the logger. This project also holds the NLog DLLs and the NLog config file. In this core project file you can add the following, which automatically copies the config when you build projects that depend on this core project:
<None Include="Logging\NLog.config">
  <Link>NLog.config</Link>
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
We found that some web projects that don't compile a dll don't automatically pull in the config file, so we will leave that up to the build process. This helps centralize our logging so you only have to manage a single config across them all.
In addition, remember that when you create a logger per class, the logger name will contain the namespace, so you can write rules that filter on namespace and route particular projects to their own targets if you want different settings for them. For example:
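A rules sketch along these lines (the namespace and target names are hypothetical) sends one project's output to its own target while everything else goes to the shared one:
<rules>
  <!-- Route one project's namespace to its own target and stop there. -->
  <logger name="MyCompany.WebApp.*" minlevel="Debug" writeTo="webAppFile" final="true" />
  <!-- Everything else goes to the shared target. -->
  <logger name="*" minlevel="Info" writeTo="sharedFile" />
</rules>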
As for centralizing where the logs end up, we chose to use a file target and specify the full path. This is because on our servers the applications run off C:\ but we have a larger D:\ drive which can store the logs. In production we also have multiple servers, so we are using Splunk to aggregate all of our logs.
If Splunk is out of the question and you are on a distributed system, a database sounds like a good idea, as suggested above. If you don't want to stand up a SQL instance, there are target wrappers for MongoDB as well.
Hopefully helpful, I'm curious if anyone has suggestions or opinions on how I'm doing it as well!
