How do I only keep the most recent n entries in a log4net sql table? - log4net

I am using log4net to log to a SQL table. I'd like to be able to keep either only the most recent n days, or only the most recent n entries, in the table. Is this possible with log4net?

Log4net does not have this capability built in, but such a task is probably best handled as a scheduled job, e.g. in SSIS (if you're running MS SQL Server) or a similar tool.
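For the scheduled-job route, here is a sketch of the kind of cleanup statement such a job could run (T-SQL; the [Id] column and the 28-day/10000-row limits are assumptions for illustration, not taken from the question):

```sql
-- Keep only the most recent 28 days:
DELETE FROM [Log] WHERE [Date] < DATEADD(dd, -28, GETDATE());

-- Or keep only the most recent 10000 entries
-- (assumes an identity column [Id] on the table):
DELETE FROM [Log]
WHERE [Id] NOT IN (SELECT TOP 10000 [Id] FROM [Log] ORDER BY [Date] DESC);
```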

I figured it out: for the AdoNetAppender, I set the commandText to:
<commandText value="INSERT INTO Log ([Date],[Thread],[Level],[Logger],[Message],[Exception]) VALUES (#log_date, #thread, #log_level, #logger, #message, #exception); DELETE FROM [Log] WHERE [Date] < DATEADD(dd, -28, GETDATE())" />
It feels hacky, but it works. I'll post here if I find a neater solution.

I know I'm late to the party... but looking at ilivewithian's solution I would agree with Peter Lillevold's observation that having it cause additional load in the logging process is undesirable.
Wouldn't it also be possible to use a trigger in the database to auto-delete the older items? Sure, you would need a DB that supports triggers, but it seems like most modern ones (including open source ones like SQLite and PostgreSQL) do.
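For example, a rough sketch of such a trigger in SQL Server syntax (table and column names taken from the answer above; the 28-day window is just an illustration):

```sql
-- Prune entries older than 28 days after each insert.
CREATE TRIGGER trg_Log_Prune ON [Log]
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    DELETE FROM [Log] WHERE [Date] < DATEADD(dd, -28, GETDATE());
END
```

Note that, like the commandText hack, this still runs on every insert; it just moves the work to the database side rather than the appender's command.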

Related

Solr Question about Loading Changes to Schema

I'm new to Solr and received the following error when adding a document through pysolr:
pysolr.SolrError: Solr responded with an error (HTTP 400): [Reason: ERROR: [doc=bc4aa768-6f35-4888-80e0-1578d9971b3c] Error adding field 'periodical_nlm'='2984692R' msg=For input string: "2984692R"]
I ended up finding out that the first periodical_nlm value added was 404536.0, so I assumed it was a type issue. In Python I then cast every periodical_nlm explicitly to string before adding 2984692R. However, the error persisted.
I Googled a bit and found that I should probably explicitly tell Solr that I want that field to be a string. I've not gotten very "hands on" with the schema yet, so I just had some questions:
(1) There appear to be two schema files: managed-schema in the directory for the core and managed-schema in the conf folder of the core. I'm assuming that the initialized schema which is in use is the one in the conf folder?
(2) Which do I update in order for things to proceed smoothly? I attempted adding the following to the schema file in the core directory but the error persisted:
<field name="periodical_nlm" type="string" indexed="true" stored="true" required="false" multiValued="false" />
Do I need to rerun some initialization process or add something to the conf file separately?
Thank you so much and please let me know if you need more info. I'm running on a Windows 10 Home x64 platform (not sure if that's important if there are any command-line things I need to run...).
As long as you reload the core after changing the managed-schema file under conf, you should be fine. Be aware that you should do this before indexing content - so you might need to clean out the index by deleting everything, then changing the schema and re-indexing your content. Changing the schema does not change content that has already been indexed.
Otherwise your assumption is correct. Schemaless mode, where the field type is determined from the format of the first value submitted (not from its type, since all values are effectively strings when submitted; Solr guesses the type by applying a hierarchy of pattern matches), is useful for prototyping. When you move to production you should always define the schema explicitly to avoid issues like the one you've seen here.
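Since the first submitted value fixes the guessed type, it can also help to normalize values on the client before the first pysolr add call. A minimal Python sketch (the normalize_fields helper is hypothetical, not part of pysolr):

```python
def normalize_fields(doc, string_fields):
    """Coerce the listed fields to str so Solr's schemaless
    type guessing never locks them into a numeric field type."""
    out = dict(doc)
    for field in string_fields:
        value = out.get(field)
        if value is None:
            continue
        # A float-typed ID like 404536.0 would otherwise be sent
        # as "404536.0"; strip the spurious fractional part first.
        if isinstance(value, float) and value.is_integer():
            value = int(value)
        out[field] = str(value)
    return out
```

You would run each document through this before solr.add(), so both 404536.0 and "2984692R" arrive as plain strings.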

Creating dynamic log files via Log4Net by process ID & date without rename policy

I have been searching Stack Overflow and the net for a solution to my particular use case, but have not yet found one.
Log4Net 1.2.10
IIS 6.1
.NET application using CastleCore
My requirement is to create a unique file name per process ID per hour, with the process ID, date and hour in the filename. If the process ID changes within that hour, a new file should be created for the new process ID and date/hour.
I have a service running on a web farm in IIS. It has four worker processes running simultaneously: two app pools with two processes per pool, so four log files are written to simultaneously.
My current config (below) appears to work up to a point. The files get created and are logged to simultaneously, great! The logging is quite heavy: over the course of an hour it will log around 200MB per process ID.
During the course of the hour, one of the files simply stops being written to. I have set MinimalLock and the process IDs differ, which should prevent any deadlocks, race conditions or collisions.
My current config is below
<appender name="WSG_file_appender" type="WSG.Logger.LogAppender,WSG.Logger">
<file type="log4net.Util.PatternString" value="../../WSG/IWSGServices-[%processid]" />
<datePattern value="-dd.MM.yyyy-HH'.log'" />
<staticLogFileName value="false"/>
<rollingStyle value="Date" />
<appendToFile value="true" />
<maximumFileSize value="500MB" />
<maxSizeRollBackups value="50" />
<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="[%d] [%t] %-5p [%m]%n" />
</layout>
</appender>
The format of the log that is generated is:
IWSGServices[8977]-23.03.15.14.log
This is what I want: in the above example, that is the log for process ID 8977 for 1400 hours. At 1500, a new log gets generated for each process ID in use, with the new hour stamp.
When the logging stops in log4net debug mode I get this error many times:
log4net:ERROR [LogAppender] Failed to write [].
System.ArgumentOutOfRangeException: Count cannot be less than zero.
Parameter name: count
at System.String.CopyTo(Int32 sourceIndex, Char[] destination, Int32 destinationIndex, Int32 count)
at System.IO.StreamWriter.Write(String value)
at log4net.Util.QuietTextWriter.Write(String value)
I have not found any similar cases of this error with log4net, so comparable examples are limited. I suspect there is some I/O issue, or that log4net (at least this older version) is not capable of fulfilling my use case. However, I am not convinced the latest versions will be either. I have not upgraded log4net yet due to reported issues between later versions and Castle Core implementations.
Setting staticLogFileName to true results in filenames being created without the date stamp, i.e. WSGServices[8699], which then on the hour get renamed to, for example, IWSGServices[8977]-23.03.15.14.log. This is not what I want: for this use case I do not want files to be renamed, as I have a log reader which sees the renamed file as a new file and parses it again.
I have tried a large number of combinations, including with and without locking models, rolling styles and date patterns, and the config above is the closest I have come. It's possible this config would work if I rolled per minute, but that would result in over 3000 log files per day, which is not practical.
I'd like to achieve this under the current version of log4net I am using. However, if I patch to 1.2.11 there is a PreserveLogFileNameExtension parameter, although the file still gets renamed, just not after the extension.
One suggestion I looked at was setting the IIS pool setting Disable Overlap Recycle to false but this didn't make any difference as would only come into play on recycles of the app pool.
Another alternative is to log both processes from each app pool to a single log file, although given the volume of writes I think this would just lead to further collisions.
Any help appreciated in case I have missed something or if anyone has used this in current or later log4net versions.

log4net: Roll by date, enumerate file by number

Previously, my team had been using log4net to roll by maximum file size, and it was generating files like:
MyLog.log
MyLog.log.1
MyLog.log.2
etc.
Recently, we've switched to
<rollingStyle value="Date"/>
<MaxSizeRollBackups value="14"/>
<datePattern value="yyyyMMdd"/>
Which now produces filenames like:
MyLog.log
MyLog.log20130324
MyLog.log20130323
etc.
Is it possible to roll by date and yet still have the log files enumerated like before? If so, how would I do this? I looked all over Log4net's website, but couldn't find a good reference on how to do this.
I don't think you can do this without creating a custom appender. For the RollingFileAppender,
if the rollingStyle is set to Date or Composite (the default), then the next output file name is generated from the base file name + the current time formatted by datePattern.
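Short of a custom appender, the closest built-in behaviour I'm aware of is Composite rolling, which rolls by both date and size and numbers the backups within each date period (e.g. MyLog.log20130324.1); it's not the pure numeric enumeration you had before. A hedged config sketch, with placeholder file name and limits:

```xml
<!-- Sketch only: Composite rolling numbers backups within each date
     period rather than across all dates. -->
<appender name="RollingFile" type="log4net.Appender.RollingFileAppender">
  <file value="MyLog.log" />
  <appendToFile value="true" />
  <rollingStyle value="Composite" />
  <datePattern value="yyyyMMdd" />
  <maximumFileSize value="10MB" />
  <maxSizeRollBackups value="14" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date %-5level %message%newline" />
  </layout>
</appender>
```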

How to switch between multiple log4net log files using C# code?

I'm using log4net and would like to have 2 logs:
- BasketballCustomer.log, for all Customers that play Basketball;
- ChessCustomer.log, for all Customers that play Chess.
For each customer, whether he/she plays Basketball or Chess is only known at runtime.
I would like to have each log configured separately, about log file name, size, number, log level, etc.
Also, I'd prefer such set up done by C# code, not config file.
How could I do that?
I searched the net; there are some articles, but none meets my requirements exactly:
- Log4Net and multiple log files talks about multiple log files, but does not switch between them at runtime;
- Configure Log4net to write to multiple files is similar, but it's done in the config file....
Please kindly suggest, many thanks!
You can do this by using an environment variable in the log4net.config and then setting the value of the environment variable from your C# code.
So somewhere in your C# class, do something like:
Environment.SetEnvironmentVariable("log_file_name", "MyLogFileName");
And then in the log4net.config that is used, specify the value to the name of the environment variable. The syntax would be something like this:
<param name="File" value="${log_file_name}.log" />

Issue with SubSonic and Multiple Providers

I have two Subsonic-generated Data Access Layers for 2 different databases that I use in one project and so I have the following in my web.config:
<SubSonicService>
<providers>
<add name="BLLDB" type="SubSonic.SqlDataProvider, SubSonic" connectionStringName="BLLDB" generatedNamespace="BLLDB" useSPs="true" />
<add name="BLLDB2" type="SubSonic.SqlDataProvider, SubSonic" connectionStringName="BLLDB2" generatedNamespace="BLLDB2" useSPs="true" />
</providers>
</SubSonicService>
yet, any time I call the code for either DAL, it always ends up using the second data provider listed ("BLLDB2"), and so gives an error like "Invalid object name 'dbo.Users'" when it should be reading from "BLLDB" (despite me explicitly specifying "BLLDB" in the Select())
e.g. check the following code for the "BLLDB" DAL:
Dim mySelect As New SubSonic.Select(Databases.BLLDB)
mySelect.From(Of User)()
"mySelect.ProviderName" returns a string value: "BLLDB2"
whereas "Databases.BLLDB" returns a string value: "BLLDB"
what gives??
One of your providers is failing. SubSonic is not good about telling you why and where it's failing.
I usually debug in a couple of ways.
Use only one provider at a time and comment out the other one. Check whether you are able to see the namespace. If both of them load fine, then you at least know it's not the database.
Check whether any of your tables start with -, _, or numbers. This can cause it to fail as well.
Let me know how it goes.
You can specify which provider is default by using:
<SubSonicService defaultProvider="BLLDB">
I am new to SubSonic (using version 2.2 with C#). I spent a lot of time trying to get the same configuration as Stimpy to work. I think I figured it out; my solution is below. If this is correct, it would be great to add this information to the Select Queries documentation, so others can resolve this multiple-data-provider issue earlier.
Here's the web.config
<SubSonicService enableTrace="false" templateDirectory="">
<providers>
<clear/>
<add name="DB1" type="SubSonic.SqlDataProvider, SubSonic" connectionStringName="DB1" excludeProcedureList="*" generatedNamespace="DB1" includeTableList="TableA" tableBaseClass="RepositoryRecord"/>
<add name="DB2" type="SubSonic.SqlDataProvider, SubSonic" connectionStringName="DB2" excludeProcedureList="*" generatedNamespace="DB2" includeTableList="TableB,TableC" tableBaseClass="RepositoryRecord"/>
</providers>
</SubSonicService>
Here's the statement that doesn't work (taken from the Select Queries documentation examples). SQL Server can't find "TableA" because it's looking for it in DB2 instead of DB1:
DB1.TableA doesntWork = new Select().From<DB1.TableA>().
Where("idCol").IsEqualTo(1).ExecuteSingle<DB1.TableA>();
(My assumption had been that the dataProvider for each table would have been generated as a property of the table class).
Here's the modification to make this work:
Select mySelect = DB1.DB.Select();
DB1.TableA works = mySelect.From<DB1.TableA>().
Where("idCol").IsEqualTo(1).ExecuteSingle<DB1.TableA>();
OR, this also works:
DB1.TableA worksAlso = new Select(DataService.GetInstance(Databases.DB1)).From<DB1.TableA>().
Where("idCol").IsEqualTo(1).ExecuteSingle<DB1.TableA>();
It seems that if you have a single dataProvider, OR you specify in the SubSonicService config that the default dataProvider is the one you're trying to use, everything works fine:
<SubSonicService enableTrace="false" defaultProvider="DB1" templateDirectory="">
But, if you leave out the "defaultProvider", it defaults to the last one in the providers list (in this case, DB2).
Another important piece of information for the multiple DAL case - if you generate the code into different folders for each provider, be sure to “include” in your project only ONE of the "AllStructs.cs" files that automatically gets generated into each folder (otherwise, compile errors).
FYI: kudos to the developers of this open source alternative to Codesmith. So far (other than this issue), it's been easy to get started and make it work (especially with SubStage). Also, I think it will end up being a lighter weight, cost-effective solution for my clients. Thank you!
I am using SubSonic 2.2 with SubStage. You need to change the namespaces for the multiple providers respectively.
One important thing: it will automatically generate multiple AllStructs.vb files, one per provider. You need to include a single AllStructs.vb, just delete the others, and it will work.
here would be the configuration settings at web config file.
<connectionStrings>
<add name="aspnetdb" connectionString="Data Source=(local); Database=aspnetdb; Integrated Security=true;"/>
<add name="office" connectionString="Data Source=(local); Database=office; Integrated Security=true;"/>
</connectionStrings>
<SubSonicService defaultProvider="aspnetdb" enableTrace="false" templateDirectory="">
<providers>
<clear/>
<add name="aspnetdb" type="SubSonic.SqlDataProvider, SubSonic" connectionStringName="aspnetdb"/>
<add name="office" type="SubSonic.SqlDataProvider, SubSonic" connectionStringName="office"/>
</providers>
</SubSonicService>
Hi all, and thanks for the replies... Rob, if I specify the default provider as "BLLDB", it ends up doing the same thing, but for BLLDB instead of BLLDB2;
i.e. it only reads from BLLDB and not BLLDB2.
CodeToGlory, I'm using the latest DLLs and usually have no issues running SubSonic until I came across this problem.
Would it be possible for you to post up the web.config entries you use for subsonic where you have both providers working please?
Thanks a million!
P.S. It's also rather odd that when I specify which data provider to use directly in my Select() function:
e.g.
Dim mySelect As New SubSonic.Select(Databases.BLLDB)
mySelect.From(Of User)()
and then I do:
"mySelect.ProviderName", this returns a string value: "BLLDB2" (not correct)
whereas when I output the value for "Databases.BLLDB" this returns a string value: "BLLDB" (correct)
This could actually be the key to the problem...
Just answering my own question here... it seems it is indeed NOT possible to use multiple providers when working in VB.NET.
This possibly works only in C# (as CodeToGlory mentioned), as I've tested several different scenarios from scratch and cannot get two SubSonic-generated DALs to work side by side.
I'll have to hack my own together for one of them. Cheers for the advice though!
