Is it possible to use the ${shortdate} in the internalLogFile?

Is it possible to use the ${shortdate} in the internalLogFile?
<nlog internalLogFile="C:\Logs\${shortdate}_nlog.log">
  <targets>
    <target name="logfile"
            fileName="C:/logs/${shortdate}_dev.log" />
  </targets>
</nlog>
I'm getting the expected dated logfile, but the internal log file is named ...
${shortdate}_nlog.log

Short answer: No.
Longer answer: The internal logger file name is just a plain string. It is read in during initialization, and the XmlLoggingConfiguration class only ensures that the directory exists. The FileTarget, by contrast, treats its fileName as a Layout, so the value you provide is converted using LayoutRenderers such as ${shortdate}.

https://github.com/NLog/NLog/issues/581#issuecomment-74923718
My understanding from reading their comments is that the internal logging should be simple, stable, and used sparingly. Typically you only turn it on when trying to figure out what's going wrong with your setup.
You can still dynamically name your internal log file based on the date and time if you want. However, it won't have the same rollover effect a target file would; it simply keeps whatever date and time the logger was initialized with, as far as I can tell.
DateTime dt = DateTime.Now;
NLog.Common.InternalLogger.LogFile = @"C:\CustomLogs\NLog_Internal\internal_NLogs" + dt.ToString("yyyy-MM-dd") + ".txt";

Related

Log statement is repeating in log files

I have to log to two different files, so I have created two appenders. One is for a basic log that records only a little information.
The second appender is dynamic: depending on one parameter, the log file name will be different. Both scenarios are working fine.
Now I have just found that the log statements are getting duplicated.
That means the first time it writes one line, the second time two lines, the third time three, and so on. My program runs every 20 seconds. If I close the program and run it again it does not repeat, but if it keeps running every 20 seconds it starts repeating log lines.
I am using log4j, creating the logger and adding the appender to it. Everything is done in code; I am not using any configuration file. Below is one of them.
static Logger loggerCustom = Logger.getLogger("CustomLog");
PatternLayout plt = new PatternLayout();
plt.setConversionPattern("%-7p %d [%t] %c %x - %m%n");
FileAppender fh = new FileAppender(plt, "logs\\" + strDate + "\\CustomLog.log");
loggerCustom.addAppender(fh);
loggerCustom.setAdditivity(false);
The above issue has been resolved by adding the line below before appending the appender:
loggerCustom.removeAllAppenders()
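For illustration, a minimal sketch of where that call goes, assuming the same logger and appender setup as the snippet above (the class and method names are made up for this example):
import java.io.IOException;
import org.apache.log4j.FileAppender;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class CustomLogConfig {
    static Logger loggerCustom = Logger.getLogger("CustomLog");

    // Called at the start of every 20-second run; strDate is the date-based folder name used above.
    static void configure(String strDate) throws IOException {
        // Remove any appenders left over from the previous run. Without this,
        // each run attaches one more FileAppender, and every statement is then
        // written once per attached appender (1 line, then 2, then 3, ...).
        loggerCustom.removeAllAppenders();

        PatternLayout plt = new PatternLayout();
        plt.setConversionPattern("%-7p %d [%t] %c %x - %m%n");
        FileAppender fh = new FileAppender(plt, "logs\\" + strDate + "\\CustomLog.log");
        loggerCustom.addAppender(fh);
        loggerCustom.setAdditivity(false);
    }
}
With this in place exactly one FileAppender stays attached to the logger, so each statement is written only once per run.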

Output other than .txt

I'm looking to build a simple program that will modify existing output files from another program, so I don't have to open that program and enter a bunch of data the long way. The program is very specific to my domain, and its output files use an extension named .wcc. However, when I change the extension of one of these output files to .txt, I get half gibberish:
ÿÿ WPointÿÿ WPolygonÿÿ  WQuadrilateralÿÿ  WMemberDataÿÿ
WLoadÿÿ WLStandardMembersÿÿ WLSavedDesignSettingsÿÿ WLSavedFormatSettingsÿÿ  WLSavedViewSettingsÿÿ WLSavedProjectSettingsÿÿ  WLSavedSettingsÿÿ  WLSavedLoadSettingsÿÿ WLSavedDefaultSettingsÿÿ WLineÿÿ WProductÿÿ WBeamDataÿÿ  WColumnDataÿÿ
WJoistDataÿÿ
WWallStudDataÿÿ WSupportingMemberDataÿÿ WSavedAnalysisSettingsÿÿ WSavedGravityDesignSettingsÿÿ WSavedPreferencesSettingsÿÿ WNotchÿÿ WIJoistÿÿ WFloorCWC37 ÀAE LumberS-P-F No.1/No.2 # À# lumwall.cww ÿÿÿÿ1.2.3.1.Mur_1_EX-D ÿÿÿÿÿÿ B Cÿÿ B C €? 4C 4C   Neige #F #F ÈC ÿÿÿ
WLStandardMembersÿÿ "
There are also musical notes and perpendicular signs which I can't copy paste here. I can sorta read the text, but still not enough to make modifications via txt file. What type of file could this be? Is it even possible to do what I'm trying to do? Thanks!
I am surprised that you are trying to open a .wcc file as a text file (its contents - as you have seen - don't lend themselves to being read that way); however, the attempt to open the file as a .txt file seems to be specific to your domain.
I noticed part of your question is as follows: "What type of file could this be?"
You are right in thinking that the .wcc file is a rather obscure file type - we don't think about that file type a lot (or are not conscious of it existing). A .wcc file is a WinCam 2000 Cache file that allows WinCam 2000 movies to be previewed in the slide browser - these were often generated by older WinCam 2000 screen recording and editing programs.
Again, the file extension is very rare these days (a Google search only returns ~700 results). But, it appears you have a program that is producing the file, which - as you are saying - "is quite specific to your domain". You may be out of luck with regard to opening them for modification purposes.
Supposedly, you can convert .wac files to .wav files, which are much more relevant to today's technology (and definitely alterable from code); however, without knowing the purpose of the file, e.g. what you are trying to do with it domain-side, I can't say whether this will suit your needs.
Also, the comments above are correct: changing a file's extension will not convert it to that file type. Typically, converter software is needed to convert files.

Log4net hourly rollover but with datePattern of seconds

I'm trying to write a log which creates a new file every hour, which can be done simply by setting the datePattern to yyyyMMddHH, but I need the datePattern (or at least the filename) to consist of yyyyMMddHHmmss while still rolling over every hour.
Obviously, when I set the pattern to yyyyMMddHHmmss it gives the filename I want, but the rollover then happens every second.
I've searched all over but couldn't find an answer.
Thanks for the assistance.
Log4net does not support this. You could copy the source code of the rolling file appender and implement this feature yourself. As far as I can tell, you cannot derive from the class and override the behavior, since the date pattern is used in private methods.

Relative path for JMeter XML Schema?

I'm using JMeter 2.6, and have the following setup for my test:
-
|-test.jmx
|-myschema.xsd
I've set up an XML Schema Assertion, and typed "myschema.xsd" in the File Name field. Unfortunately, this doesn't work:
HTTP Request
Output schema : error: line=1 col=114 schema_reference.4:
Failed to read schema document 'myschema.xsd', because
1) could not find the document;
2) the document could not be read;
3) the root element of the document is not <xsd:schema>.
I've tried adding several things to the path, including ${__P(user.dir)} (points to the home dir of the user) and ${__BeanShell(pwd())} (doesn't return anything). I got it working by giving the absolute path, but the script is supposed to be used by others, so that's no good.
I could make it use a property value defined in the command line, but I'd like to avoid it as well, for the same reason.
How can I correctly point the Assertion to the schema under these circumstances?
Looks like in this situation you have to do one of the following:
validate your XML against the XSD manually: simply use the corresponding Java code in e.g. a BeanShell Assertion or BeanShell PostProcessor (see the sketch right after this list);
here is a pretty nice solution: https://stackoverflow.com/a/16054/993246 (as well you can use any other you want for this);
dig into JMeter's sources and amend the way the XML Schema file is obtained so that it supports variables in the path (the File Name field) - like CSV Data Set Config does;
but the previous way seems to be much easier;
run your JMeter test scenario from a shell script or Ant task which first copies your xsd into JMeter's /bin dir before execution - that way at least the XML Schema Assertion can be used "as is".
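For the first option, here is a rough, self-contained Java sketch of the kind of validation code you would adapt into a BeanShell Assertion (the file names are illustrative; inside JMeter you would feed it the sampler's response data rather than a file on disk):
import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class XsdCheck {
    public static void main(String[] args) throws Exception {
        // Relative paths are resolved against the JVM's current working directory
        // (for JMeter that is usually its bin folder, which is why copying the
        // xsd into bin - the last option above - also works).
        File xsd = new File("myschema.xsd");
        File xml = new File("response.xml"); // illustrative stand-in for the sampler response

        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new StreamSource(xsd));
        Validator validator = schema.newValidator();
        validator.validate(new StreamSource(xml)); // throws SAXException on the first violation
        System.out.println("response.xml is valid against myschema.xsd");
    }
}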
Perhaps if you find any other/better way - please share it.
Hope this helps.
Summary: in the end I've used http://path.to.schema/myschema.xsd as the File Name parameter in the Assertion.
Explanation: following Alies Belik's advice, I've found that the code for setting up the schema looks something like this:
DocumentBuilderFactory parserFactory = DocumentBuilderFactory.newInstance();
...
parserFactory.setAttribute("http://java.sun.com/xml/jaxp/properties/schemaSource", xsdFileName);
where xsdFileName is a string (the attribute string is actually a constant, I inlined it for readability).
According to e.g. this page, the attribute, when given in the form of a String, is interpreted as a URI - which includes HTTP URLs. Since I already have the schema accessible over HTTP, I've opted for this solution.
Add 'myschema.xsd' to the \bin directory of your apache-jmeter installation, next to 'ApacheJMeter.jar', or set the 'File Name' of the 'XML Schema Assertion' to the path of 'myschema.xsd' relative to that starting point.
E.g.
JMeter: C:\Users\username\programs\apache-jmeter-2.13\bin\ApacheJMeter.jar
Schema: C:\Users\username\workspace\yourTest\schema\myschema.xsd
File Name: ..\..\..\workspace\yourTest\schema\myschema.xsd

Perl program structure for parsing

I've got a question about program architecture.
Say you've got 100 different log files with different formats and you need to parse and put that info into an SQL database.
My view of it is like:
use a general config file like:
program1->name1("apache",/var/log/apache.log) (modulename,path to logfile1)
program2->name2("exim",/var/log/exim.log) (modulename,path to logfile2)
....
sqldb->configuration
use something like a module (1 file per program) type1.module (regexp, logstructure(somevariables), sql(tables and functions))
fork or thread processes (don't know what is better on Linux now) for different programs.
So the question is: is my view of this correct? Should I use one module per program (web/MTA/iptables),
or is there a better way? I think some regexps would be the same, like date/time/IP/URL. What should I do with those? Or what have I missed?
Example: MTA exim4 mainlog entry:
2011-04-28 13:16:24 1QFOGm-0005nQ-Ig <= exim@mydomain.org.ua H=localhost (exim.mydomain.org.ua) [127.0.0.1]:51127 I=[127.0.0.1]:465 P=esmtpsa X=TLS1.0:DHE_RSA_AES_256_CBC_SHA1:32 CV=no A=plain_server:spam S=763 id=1303985784.4db93e788cb5c@mydomain.org.ua T="test" from <exim@exim.mydomain.org.ua> for test@domain.ua
Everything highlighted above is already parsed and will be put into the sqldb.incoming table. I currently have a structure in Perl to hold every parsed variable, like $exim->{timestamp} or $exim->{host}->{ip}.
My program will do something like tail -f on the file and parse it line by line.
Flexibility: let's say I want to add support for an Apache server (just the timestamp, user IP and file downloaded). All I need to know is which logfile to parse, what the regexp should be and what the SQL structure should be. So I'm planning to have this as a module: just fork or thread the main process with parameters (logfile, filetype). Maybe later I would add some options for what not to parse (maybe some log level is low and you just don't see much there).
I would do it like this:
Create a config file that is formatted like this: appname:logpath:logformatname
Create a collection of Perl classes that inherit from a base parser class.
Write a script which loads the config file and then loops over its contents, passing each entry to the appropriate handler object.
If you want an example of steps 1 and 2, we have one on our project. See MT::FileMgr and MT::FileMgr::* here.
The log-monitoring tool wots could do a lot of the heavy lifting for you here. It runs as a daemon, watching as many log files as you could want, running any combination of perl regexes over them and executing something when matches are found.
I would be inclined to modify wots itself (which its licence freely allows) to support a database write method - have a look at its existing handle_* methods.
Most of the hard work has already been done for you, and you can tackle the interesting bits.
I think File::Tail is a nice fit.
You can make an array of File::Tail objects and poll them with select like this:
use File::Tail;
while (1) {
    ($nfound, $timeleft, @pending) =
        File::Tail::select(undef, undef, undef, $timeout, @files);
    unless ($nfound) {
        # timeout - do something else here, if you need to
    } else {
        foreach (@pending) {
            # here you can handle log messages depending on filename
            print $_->{"input"}." (".localtime(time).") ".$_->read;
        }
    }
}
(adapted from the Perl File::Tail documentation)
