I'm trying to configure archiving of logs using the NLog File target.
I have the following configuration:
<target xsi:type="File"
name="file"
fileName="${basedir}/Logs/${appdomain:format={1\}}.${shortdate}.log"
archiveFileName="${basedir}/Logs/archives/${appdomain:format={1\}}.Archive.{#}.zip"
archiveEvery="Day"
archiveNumbering="Date"
maxArchiveFiles="30"
enableArchiveFileCompression="True"
encoding="UTF-8"
layout="${longdate} PID:${processid} ${uppercase:${level}} ${callsite} - ${message:withException=false}${onexception:${newline}EXCEPTION DETAILS\:
${newline}################################################################################${newline}${exception: format=ToString,Data: maxInnerExceptionLevel=10: innerFormat=Type,Message,Data }${newline}################################################################################${newline}}" />
The NLog version is 4.4.12 (the behaviour was the same for other NLog 4.x.x versions).
But this configuration doesn't work, or at least not every time: it either doesn't create archives at all or skips many days. I think the main reason is that our application can finish executing before the archive is created. We start the application many times a day, but each run lasts only a few seconds, producing a log file of about 20MB-30MB per day.
Is there any way to wait until archiving is complete, or some other approach to force archiving to work?
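For the "wait until complete" part, I assume something along these lines is the right direction (a minimal sketch; LogManager.Flush and LogManager.Shutdown are standard NLog 4.x calls, though I'm not certain they trigger the archive move itself):

using System;
using NLog;

class Program
{
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

    static void Main()
    {
        Logger.Info("Application starting");

        // ... short-lived application work ...

        // Block until all targets have written pending log events.
        LogManager.Flush(TimeSpan.FromSeconds(15));

        // Close all targets cleanly before the process exits.
        LogManager.Shutdown();
    }
}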
I'm trying to get NLog working with the Azure web app Log Stream.
The logs do appear if I don't use NLog and just use System.Diagnostics.Trace.WriteLine.
However, if I use the Trace target type in my nlog.config, it doesn't show the trace logs ...
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<targets>
<target xsi:type="Trace" name="trace" layout="${message}" />
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="trace" />
</rules>
</nlog>
I can't see anything that I'm doing differently from the accepted answer here ...
How to integrate NLog to write log to Azure Streaming log
Note that I cut down that nlog.config file to just show the Trace target - I do normally also have a File target, and I've tried with and without it.
I've logged onto the deployed Azure website, and the nlog.config file has been uploaded successfully. I'm deploying using GitHub deployment.
I have the logging set in Azure to just use the file system logging, and I have that set to verbose.
Any ideas?
It turns out that when Visual Studio enabled Application Insights (a recent thing I added to the project), it had inserted an nlog config section into my web.config. This had then meant that my nlog.config file wasn't being used at all. I've fixed it by removing that nlog config section from my web.config, and copying the Application Insights target into my nlog.config file instead. The 'Trace' target type now works as expected and is appearing in the Azure Streaming Logs.
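For anyone hunting for the same thing, the block to look for in web.config is roughly this (a generic sketch of an NLog config section, not the exact content Application Insights generates):

<configSections>
  <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />
</configSections>
<nlog>
  <!-- Targets and rules defined here take effect instead of nlog.config,
       because NLog checks the application config file first. -->
</nlog>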
The NLog Trace target only performs Trace.WriteLine for Debug-level log events.
Maybe try the custom MyTraceTarget shown here:
https://github.com/NLog/NLog/issues/1968
Update: NLog 4.5 adds a new setting, rawWrite, for the NLog Trace target so that it always performs WriteLine regardless of LogLevel. See also https://github.com/NLog/NLog/wiki/Trace-target
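With that setting, the Trace target from the question's config would just gain one attribute (a sketch assuming NLog 4.5; rawWrite is described on the linked wiki page):

<target xsi:type="Trace" name="trace" rawWrite="true" layout="${message}" />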
Our company is developing our new applications using Service Fabric.
A common problem we have: multiple developers use queues, databases, and storage hosted on remote servers, and each developer has a different configuration for these. All the settings are stored in an ApplicationParameters file per environment; for local development there is a single Local.5Node.xml. It is very common for developers to check in their credentials and overwrite others' when we get the latest version of these files.
I'm trying to customize the Service Fabric deployment script 'Deploy-FabricApplication.ps1' to use a custom PublishProfile depending on the Windows credentials of the logged-in user. I can achieve that by updating the deployment file, and it works well when we deploy using publish, but it seems that the default behavior of Service Fabric when we hit F5 (debug) is to overwrite the parameters with the specific Local.5Node.xml application parameters.
I explored all the Service Fabric .ps1 files and couldn't find where this is defined. I guess it is defined in a .targets file, so I don't know how to avoid this default behaviour.
Is there any other approach for using custom PublishProfiles on local development machines, other than Local.5Node.xml?
I actually just ran into this while setting up some team-specific environments. I borrowed information from the following sources:
Web Config Transformation
Replace String In File With MSBUILD
I added multiple parameter files based on what was needed for the different teams, each one containing their specific resource settings.
I also added a Local.1Node.Template.xml and a Local.5Node.Template.xml. I even removed Local.1Node.xml and Local.5Node.xml from source control and set them to be ignored, while leaving them in the projects so that Visual Studio doesn't think they are truly missing. The contents of the 1Node template (the 5Node one is the same except for replacing 1Node with 5Node) are as follows:
<?xml version="1.0" encoding="utf-8"?>
<PublishProfile xmlns="http://schemas.microsoft.com/2015/05/fabrictools">
<ClusterConnectionParameters />
<ApplicationParameterFile Path="..\ApplicationParameters\Local.1Node.$(Configuration).xml" />
</PublishProfile>
I then edited the sfproj file for the Service Fabric project to contain the following MSBuild Task and Target:
<UsingTask TaskName="ReplaceFileText" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <InputFilename ParameterType="System.String" Required="true" />
    <OutputFilename ParameterType="System.String" Required="true" />
    <MatchExpression ParameterType="System.String" Required="true" />
    <ReplacementText ParameterType="System.String" Required="true" />
  </ParameterGroup>
  <Task>
    <Reference Include="System.Core" />
    <Using Namespace="System" />
    <Using Namespace="System.IO" />
    <Using Namespace="System.Text.RegularExpressions" />
    <Code Type="Fragment" Language="cs">
      <![CDATA[
        // Replace every regex match in the input file and write the result to the output file.
        File.WriteAllText(
            OutputFilename,
            Regex.Replace(File.ReadAllText(InputFilename), MatchExpression, ReplacementText)
        );
      ]]>
    </Code>
  </Task>
</UsingTask>
<Target Name="UpdateProfile" BeforeTargets="UpdateServiceFabricApplicationManifest">
<ReplaceFileText InputFilename="PublishProfiles\Local.1Node.Template.xml" OutputFilename="PublishProfiles\Local.1Node.xml" MatchExpression="\$\(Configuration\)" ReplacementText="$(Configuration)" />
<ReplaceFileText InputFilename="PublishProfiles\Local.5Node.Template.xml" OutputFilename="PublishProfiles\Local.5Node.xml" MatchExpression="\$\(Configuration\)" ReplacementText="$(Configuration)" />
</Target>
The final step was to set up the different build configurations for the teams. I created FT1-Debug through FT6-Debug based on the Debug configuration in the Service Fabric service project and the Service Fabric host project. I left all of my other projects alone.
At this point, everyone on the different teams can debug locally with the correct configuration for the cluster they are working in, just by changing the build configuration and pressing F5 to debug.
The VS extension for Service Fabric defines a hard-coded publish profile when we debug the solution using Visual Studio: it checks how many nodes the cluster has and links to Local.1Node.xml or Local.5Node.xml accordingly.
To accomplish similar results, we ended up using custom application parameters per developer, and each developer updates the publish profile (Local.5Node.xml) to point to their respective application parameter file.
It is not as automated as the requested feature, but it solves the main problem.
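For illustration, each developer's Local.5Node.xml then looks something like this (the per-developer parameter file name, Local.5Node.jsmith.xml, is hypothetical):

<?xml version="1.0" encoding="utf-8"?>
<PublishProfile xmlns="http://schemas.microsoft.com/2015/05/fabrictools">
  <ClusterConnectionParameters />
  <!-- Point at this developer's own parameter file instead of the shared one -->
  <ApplicationParameterFile Path="..\ApplicationParameters\Local.5Node.jsmith.xml" />
</PublishProfile>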
I tried out the log4net sample from Apache under C# 2.0 as a console app. It seems to create 10 rolling files before it ever needs to roll. I created basically the exact same app, but I use XmlConfigurator.Configure(fileName) instead of putting the config inside the app.config as in the example. In my app, no extra rolling files are created ahead of time. In the Apache sample, 10 files appear immediately.
Has anyone else ever noticed this?
I can post my app details, but I am mostly interested in the Apache sample C# console app.
Thanks,
Chris
Doh!
This was just silly. The sample specifies a super tiny rollover size.
The issue went away with:
<maximumFileSize value="10MB" />
<maxSizeRollBackups value="10" />
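For context, here is a sketch of where those two elements sit in a typical RollingFileAppender configuration (the file name and layout pattern are placeholders):

<appender name="RollingFile" type="log4net.Appender.RollingFileAppender">
  <file value="app.log" />
  <appendToFile value="true" />
  <rollingStyle value="Size" />
  <!-- With a sensible size, new rolling files only appear when a roll is actually needed -->
  <maximumFileSize value="10MB" />
  <maxSizeRollBackups value="10" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>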
I have been assigned a project with a lot of poorly written code that is based around SharePoint.
It consists of about 15 subprojects, some of them being Windows services, some web services, some web applications running inside of SharePoint, some web parts, and even console applications. They all run on the same server and call each other.
There are already many issues in production, but they are hard to track down.
The original developer must have been a fan of either Salinger or the Pokémon series, judging by his tireless effort to catch all exceptions. Unfortunately, none of them ever get reported or logged.
My current task is to introduce logging into the whole project so I could find now-invisible exceptions, follow tangled recurring calls and have some stack traces at least. I decided to go with NLog, seeing it's active and cool, as opposed to log4net which is perfectly fine but somewhat not as fancy to my taste.
Because the components are tightly coupled, I want to centralize logging in one file so related errors don't get scattered across the hard drive. Therefore, I am looking to have two or three different log files with five or more projects writing to each of them more or less simultaneously.
What is the best way to configure NLog to centralize logging? Should I have a config file for each project, or should related projects share them? Where should I put config file to log from SharePoint webparts? Am I going to face any permission issues?
I'm using SharePoint 2007.
The easiest way to centralize is probably to simply log to a database; one benefit is that multiple applications can write to a database more easily than to the same log file. For each application, configure NLog to log to the Database target, using the same Database target configuration parameters for each. Your NLog.config file might look something like this:
<?xml version="1.0" encoding="utf-8" ?>
<!--
This file needs to be put in the application directory. Make sure to set
'Copy to Output Directory' option in Visual Studio.
-->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
autoReload="true"
internalLogLevel="Debug"
internalLogFile="nlog_log.log">
<targets async="true">
<target name="sqlexpress" xsi:type="Database">
<connectionString>
Data Source=.\SQLEXPRESS;Initial Catalog=LoggingDB;Integrated Security=True;
</connectionString>
<commandText>
insert into LogTable(DateTime,Logger,LogLevel,Message,ProcessId,ManagedThreadId) values (#DateTime,#Logger,#LogLevel,#Message,#ProcessId,#ManagedThreadId);
</commandText>
<parameter name="#DateTime" layout="${date:format=yyyy\-MM\-dd HH\:mm\:ss.fff}"/>
<parameter name="#Logger" layout="${logger}"/>
<parameter name="#LogLevel" layout="${level}"/>
<parameter name="#Message" layout="${message}"/>
</target>
</target>
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="sqlexpress" />
</rules>
</nlog>
You could certainly log to files in addition to (or instead of) logging to a database.
I am not familiar with doing this from SharePoint, so I can't comment on any configuration or permission issues that you might run into there.
Here is a link I found where there is a discussion of getting NLog to work in a SharePoint environment:
http://nlog-forum.1685105.n2.nabble.com/Is-anyone-using-NLog-with-SharePoint-td2171451.html
That link appears to put you at the top of the NLog forum instead of the specific post. Search for this text in the forum "Is anyone using NLog with SharePoint" and you should find the right post.
Good luck!
You could also just leverage the existing logging infrastructure in SharePoint and write to the ULS logs. This way your log information can be viewed in its complete context using a ULS log viewer. For SharePoint 2007, see this blog post on how to write to the ULS log:
SharePoint Trace Logs and the Unified Logging Service (ULS)
With SharePoint 2010 it has become even easier with improvements to the SPDiagnosticsService class, where you can use the new WriteTrace method.
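A minimal sketch of that SharePoint 2010 call (the category name, id, and message are placeholders):

using Microsoft.SharePoint.Administration;

// Define a diagnostics category for your application.
var category = new SPDiagnosticsCategory("MyApp", TraceSeverity.Medium, EventSeverity.Information);

// Write a trace message to the ULS logs via the local diagnostics service.
SPDiagnosticsService.Local.WriteTrace(0, category, TraceSeverity.Medium, "Something happened in {0}", "InvoiceService");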
Personally, I log exceptions to the Event Log, and I use NLog for logging details, debug information, and tracing.
Since NLog can easily be switched on and off, I only activate it when I'm debugging or when I need to inspect an exception in production. I was never a big fan of the default tracing functionality in .NET.
I prefer simple plain-text log files. Logging to a database works great too, as long as you don't have too many "log lines" implemented in your code.
I feel like we are working on the same project! Multiple projects consisting of web projects, core DLL projects, console apps, services, etc. Unfortunately I'm not working in SharePoint like you are, but I can describe how I am trying to centralize our logging.
We have one core .NET Framework project. This is where I placed our wrapper class for the logger. This project also holds the NLog DLLs and the NLog config file. In this core project file you can add the following, which automatically copies the config when you build projects with a dependency on this core project.
<None Include="Logging\NLog.config">
  <Link>NLog.config</Link>
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
We found that some web projects that don't compile to a DLL don't automatically pull in the config file, so we leave that up to the build process. This helps centralize our logging, so you only have to manage a single config file across them all.
Also remember that when you create a logger per class, the logger name contains the namespace, so you can create specific targets that filter based on namespace if you want different settings for particular projects (see the sketch below).
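A small sketch of what that looks like in practice (namespace and class names are hypothetical):

using NLog;

namespace MyCompany.Billing
{
    public class InvoiceService
    {
        // The logger name becomes "MyCompany.Billing.InvoiceService", so a rule
        // such as <logger name="MyCompany.Billing.*" ... /> can target just this project.
        private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

        public void Process()
        {
            Logger.Debug("Processing invoice");
        }
    }
}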
As for centralizing where the logs end up, we chose to use a file target and specify the full path. This is because on our servers the applications run off the C:\ drive, but we have a larger D:\ drive that can store the logs. In production we also have multiple servers, so we use Splunk to aggregate all of our logs.
If Splunk is out of the question and you are on a distributed system, a database sounds like a good idea, as suggested above. If you don't want to stand up a SQL instance, there are target wrappers for MongoDB as well.
Hopefully that's helpful; I'm curious whether anyone has suggestions or opinions on how I'm doing it as well!
I have an application that I am migrating to Azure. Currently I use web.config transformations to manage the database connection string for the dev/staging/prod environments. What is the best way to manage these multiple connection strings in Azure?
In cases where it doesn't matter if the developer can see production credentials, you can use the built-in Visual Studio 2010 config transformations. If this is what you're looking for, follow these steps:
1. Navigate to your Azure project folder in file explorer.
2. Make a copy of ServiceConfiguration.cscfg.
3. Rename the copy to ServiceConfiguration.Base.cscfg.
4. For each build configuration (e.g. Dev, Staging, Production), create a ServiceConfiguration.<build config name>.cscfg file. In these files, you can use the normal config transformation syntax.
5. Open your .ccproj file in a text editor.
6. Find the following node,
<ItemGroup>
  <ServiceDefinition Include="ServiceDefinition.csdef" />
  <ServiceConfiguration Include="ServiceConfiguration.cscfg" />
</ItemGroup>
and replace it with this (you will have to edit this block to match your build configs):
<ItemGroup>
  <ServiceDefinition Include="ServiceDefinition.csdef" />
  <ServiceConfiguration Include="ServiceConfiguration.cscfg" />
  <None Include="ServiceConfiguration.Base.cscfg">
    <DependentUpon>ServiceConfiguration.cscfg</DependentUpon>
  </None>
  <None Include="ServiceConfiguration.Dev.cscfg">
    <DependentUpon>ServiceConfiguration.cscfg</DependentUpon>
  </None>
  <None Include="ServiceConfiguration.Staging.cscfg">
    <DependentUpon>ServiceConfiguration.cscfg</DependentUpon>
  </None>
  <None Include="ServiceConfiguration.Production.cscfg">
    <DependentUpon>ServiceConfiguration.cscfg</DependentUpon>
  </None>
</ItemGroup>
7. Add the following at the end of the .ccproj file, just above </Project>:
<Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets" />
<Target Name="BeforeBuild">
  <TransformXml Source="ServiceConfiguration.Base.cscfg" Transform="ServiceConfiguration.$(Configuration).cscfg" Destination="ServiceConfiguration.cscfg" />
</Target>
8. If you're using a CI server that doesn't have Visual Studio 2010 installed, you'll probably have to copy the C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\Web folder and its contents from a development machine to the server.
Update: As #SolarSteve noted, you might have to add a namespace to your ServiceConfiguration.*.cscfg files. Here's an example of ServiceConfiguration.Base.cscfg:
<sc:ServiceConfiguration serviceName="MyServiceName" osFamily="1" osVersion="*" xmlns:sc="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <sc:Role name="MyRoleName">
    <sc:Instances count="1" />
    <sc:ConfigurationSettings>
      <sc:Setting name="DataConnectionString" value="xxx" />
    </sc:ConfigurationSettings>
  </sc:Role>
</sc:ServiceConfiguration>
Personally, we:
Dropped web.config transformations completely.
Retrieve settings from the cscfg.
Keep the development version of the cscfg pointing to the local development environment (that's what is stored in version control).
Supply secure credentials for production SQL Azure and storage while deploying to production.
For a sample of a settings-management class that scans application settings and the cloud environment for configuration values, you can check out the open-source Lokad.CQRS for Windows Azure project (see CloudSettingsProvider).
You can use CloudConfigurationManager in Azure SDK 1.7: http://msdn.microsoft.com/en-us/LIBRARY/microsoft.windowsazure.cloudconfigurationmanager
This starts by looking in the ServiceConfiguration.cscfg (e.g. ServiceConfiguration.Cloud.cscfg) for the config setting. If it isn't there, it falls back to web.config and app.config.
For example,
CloudConfigurationManager.GetSetting("StorageConnectionString")
will look in the appropriate cscfg file for the StorageConnectionString setting, then search web.config and then app.config.
We have a number of environments (local dev inside the dev fabric, local dev outside the dev fabric, testing, and release, which has two versions: release/prod and release/staging) and 20 projects, some of which need some variability in config settings. We solved this problem by creating a tiny "config" project with subfolders that match the environments. During every compile, we copy the files from the subfolder matching the current build into the root folder of the config project (see the sketch below).
All other projects link to the config project for .config files. We also use partial config files to avoid the insanity of repeating the same info across the various environments.
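A rough sketch of that copy step as an MSBuild target (the Environments folder layout and item names are hypothetical):

<Target Name="CopyEnvironmentConfigs" BeforeTargets="BeforeBuild">
  <ItemGroup>
    <!-- Pick up the config files for the current build configuration, e.g. Environments\Staging\*.config -->
    <EnvConfigFiles Include="Environments\$(Configuration)\*.config" />
  </ItemGroup>
  <!-- Copy them into the config project's root, where the other projects link to them -->
  <Copy SourceFiles="@(EnvConfigFiles)" DestinationFolder="$(MSBuildProjectDirectory)" OverwriteReadOnlyFiles="true" />
</Target>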
Hope this helps
I had the same requirement for transforming ServiceConfiguration.
I went with the answer from jmac (thank you!), but had trouble with the namespace in the Base version:
<ServiceConfiguration serviceName="TestCloud2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
After a bit more poking around, I found this post by Andrew Patterson (thank you!).
So my resulting transform file:
<asc:ServiceConfiguration serviceName="TestCloud2" xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform" xmlns:asc="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
  <asc:Role name="WebRole1">
    <asc:Instances count="1" />
    <asc:ConfigurationSettings>
      <asc:Setting name="LoggingStorage" value="UseDevelopmentStorage=true" xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
    </asc:ConfigurationSettings>
  </asc:Role>
</asc:ServiceConfiguration>