Splunk rewrites xml input incorrectly - log4j

I have a number of applications that I want to log to Splunk. I will be sending the data in an XML format via a UDP listener. The data that is being sent looks like:
<log4j:event logger="ASP.global_asax" level="INFO" timestamp="1303830487907" thread="15">
<log4j:message>New session started</log4j:message>
<log4j:properties>
<log4j:data name="log4japp" value="4ef113dd-9-129483040292873753(4644)" />
<log4j:data name="log4jmachinename" value="W7-SUN-JSTANTON" />
</log4j:properties>
</log4j:event>
However when it is processed by Splunk it appears like:
Apr 26 16:18:09 127.0.0.1 <log4j:message>New session started</log4j:message><log4j:properties><log4j:data name="log4japp" value="4ef113dd-9-129483040292873753(4644)"/><log4j:data name="log4jmachinename" value="W7-SUN-JSTANTON"/></log4j:properties></log4j:event>
Basically it looks like Splunk has overwritten the opening node with the datetime at which it received the event, and as a result is losing the log level data. The applications that are sending it are using NLog with a log4j type target (with a Log4JXmlEventLayout layout). I have configured the sourcetype as log4jxml (custom name) but I think I need to tell it not to do something with the date/time field in the props.conf file (but not too sure what that something is).
I am also using the Windows version of Splunk so the file paths are slightly different to the online manuals.
Any help would be most welcome.

It turns out I was doing 2 things wrong (maybe more but I have not found those yet).
In the inputs.conf file I need to add the following to my input definition:
no_priority_stripping = true
no_appending_timestamp = true
The second thing I was doing wrong was to put these files in
C:\Program Files\Splunk\etc\system\local\
when they SHOULD have been put in
C:\Program Files\Splunk\etc\apps\search\local\
I hope that this helps somebody else out

Related

Configure Logstash to wait before parsing a file

I wonder if you can configure logstash in the following way:
Background Info:
Every day I get a xml file pushed to my server, which should be parsed.
To indicate a complete file transfer, afterwards I get an empty .ctl (custom file) transferred to the same folder.
The files both have the following name schema 'feedback_{year}{yearday}_UTC{hoursminutesseconds}_51.{extension}' (e.g. feedback_16002_UTC235953_51.xml). So they have the same file name, but one is a .xml file and the other is a .ctl file.
Question:
Is there a way to configure logstash to wait to parse the xml file until the corresponding .ctl file is present?
EDIT:
Is there maybe a way to achieve that with filebeat?
EDIT2:
It would also be enough to be able to configure logstash in a way that it will wait x minutes before starting to process a new file, if that is easier.
Thanks for any help in advance
Your problem is that you don't want to start the parser before the file transfer has been completed. So, why don't you push the data to a file (file-complete.xml) when you find your flag file (empty.ctl)?
Here is the possible logic for a script that runs using crontab:
if [ -e empty.ctl ]; then
    : > file-complete.xml                # clear file-complete.xml
    cat file.xml >> file-complete.xml    # add the content of file.xml to file-complete.xml
    rm empty.ctl                         # remove the flag file
fi
This way, you'd need to parse the data from file-complete.xml. I think it is simpler to debug and configure.
Hope it helps,
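The crontab logic above, as an added example in Python rather than the answer's own script: it scans a drop folder for .ctl flags, atomically moves each completed .xml into the directory Logstash watches, and leaves alone any .xml whose flag has not arrived yet. The directory names, the regex for the name schema and the sample files are constructed assumptions.

```python
import os
import re
import tempfile
from pathlib import Path

# Name schema from the question: feedback_{yy}{yearday}_UTC{hhmmss}_51.{ext},
# e.g. feedback_16002_UTC235953_51.xml; the regex below is my reading of it.
NAME_RE = re.compile(r"^feedback_\d{5}_UTC\d{6}_51$")


def release_completed_feeds(incoming: str, ready: str) -> list:
    """Move every .xml whose .ctl flag has arrived from `incoming` into `ready`
    (the directory the Logstash file input is pointed at), then delete the flag.
    Returns the released stems; an .xml without its .ctl is left untouched."""
    incoming_p, ready_p = Path(incoming), Path(ready)
    ready_p.mkdir(parents=True, exist_ok=True)
    released = []
    for ctl in sorted(incoming_p.glob("*.ctl")):
        stem = ctl.stem
        if not NAME_RE.match(stem):
            continue  # unrelated file dropped into the folder: ignore it
        xml = incoming_p / f"{stem}.xml"
        if not xml.is_file():
            continue  # flag arrived before its data file: retry on the next run
        # os.replace is an atomic rename on the same filesystem, so Logstash
        # never observes a half-written file in `ready`.
        os.replace(xml, ready_p / xml.name)
        ctl.unlink()
        released.append(stem)
    return released


def demo() -> dict:
    """Constructed scenario: one completed transfer, one still in flight."""
    with tempfile.TemporaryDirectory() as tmp:
        inc, rdy = Path(tmp, "incoming"), Path(tmp, "ready")
        inc.mkdir()
        (inc / "feedback_16002_UTC235953_51.xml").write_text("<feedback/>")
        (inc / "feedback_16002_UTC235953_51.ctl").touch()
        (inc / "feedback_16003_UTC000112_51.xml").write_text("<feedback>partial")
        released = release_completed_feeds(str(inc), str(rdy))
        return {
            "released": released,
            "ready": sorted(p.name for p in rdy.iterdir()),
            "left": sorted(p.name for p in inc.iterdir()),
        }


if __name__ == "__main__":
    print(demo())
```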

Best way to manually periodically import log files into Graylog using logstash

I'm currently using logstash to import dozens of log files from different webapps into Graylog. It works great; the files are tagged so I know from which webapp they originate.
I can't change the webapp thus I can't add a GELF appender to the log4j conf of the webapp. The idea is to periodically retrieve the log files, parse them and import them with logstash into Graylog.
My problem is how do I make sure I don't import a log event I've already imported.
For example, I have a log file that has a log pattern that increments: log.1, log.2, etc. So I'll have log events that could be in log.1 the first time and 2 weeks later when I reimport them they'll maybe be in log.3.
I'm afraid I can't handle that with logstash's file input "sincedb_path" and "start_position".
So here are a few options I've gathered and I'd like your input about them, if anyone encountered the same issue:
- Use a logstash filter dropping all events before a certain date; requires keeping an index of the last log date of every file imported (potentially 50+) and a lot of configuration writing
- Use of a drool rule in GrayLog to refuse logs with timestamps prior to the last log received for a given type
- Ask to change the log pattern to be something like log.date instead of a log pattern that renames files (but I'd rather avoid this one)
Any other idea?
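As an added example (not part of the question or any answer), the bookkeeping behind the first option in Python: a per-webapp high-water mark on the event timestamp, plus digests of the lines carrying the boundary timestamp so that events sharing a timestamp are neither lost nor duplicated on re-import. The log line format, the webapp name and the sample lines are constructed assumptions.

```python
import hashlib
import re

# Assumed log4j/log4net "%d" style prefix, e.g. "2016-02-01 10:15:30,123 INFO ...".
# The format sorts lexicographically in time order, so plain string comparison works.
TS_RE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) ")


def import_new_events(webapp, lines, state):
    """Return only the lines not yet imported for `webapp`.
    state[webapp] is the high-water mark: the newest timestamp imported so far
    plus a digest of every line carrying exactly that timestamp, so events that
    share the boundary timestamp are neither dropped nor duplicated when the same
    rotated file (log.1 -> log.3) is read again. Persist `state` between runs."""
    mark = state.setdefault(webapp, {"ts": "", "seen": set()})
    fresh, keep_prev = [], False  # keep_prev: continuation lines follow a kept event
    for line in lines:
        m = TS_RE.match(line)
        if not m:  # stack-trace or other continuation line: follows its event's fate
            if keep_prev:
                fresh.append(line)
            continue
        ts, digest = m.group(1), hashlib.sha256(line.encode()).hexdigest()
        if ts < mark["ts"] or (ts == mark["ts"] and digest in mark["seen"]):
            keep_prev = False
            continue  # already imported on an earlier run
        if ts == mark["ts"]:
            mark["seen"].add(digest)
        else:
            mark["ts"], mark["seen"] = ts, {digest}
        fresh.append(line)
        keep_prev = True
    return fresh


def demo():
    """Constructed sample: a first import, then a re-import of the rotated file."""
    state = {}
    first_run = [
        "2016-02-01 10:15:30,123 INFO  Request /a",
        "2016-02-01 10:15:31,000 WARN  Slow query",
        "2016-02-01 10:15:31,000 INFO  Request /b",
    ]
    second_run = first_run + [
        "2016-02-01 10:15:31,000 INFO  Request /c",  # shares the boundary timestamp
        "2016-02-01 10:15:32,500 ERROR Boom",
        "    at com.example.Foo(Foo.java:12)",  # continuation line of the event above
    ]
    first = import_new_events("shop", first_run, state)
    return {"first": len(first), "second": import_new_events("shop", second_run, state)}
```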

Old logs are not imported into ES by logstash

When I start logstash, the old logs are not imported into ES.
Only the new request logs are recorded in ES.
Now I've seen this in the doc.
Even if I set the start_position=>"beginning", old logs are not inserted.
This only happens when I run logstash on linux.
If I run it with the same config, old logs are imported.
I don't even need to set start_position=>"beginning" on windows.
Any idea about this ?
When you read an input log into Logstash, Logstash will keep a record of the position it read on this file; that's called sincedb.
Where to write the sincedb database (keeps track of the current position of monitored log files).
The default will write sincedb files to some path matching "$HOME/.sincedb*"
So, if you want to import old log files, you must delete all the .sincedb* at your $HOME.
Then, you need to set
start_position=>"beginning"
at your configuration file.
Hope this can help you.
Please see this line also.
This option only modifies "first contact" situations where a file is new and not seen before. If a file has already been seen before, this option has no effect.

how to switch log4net multiple log files using C# code?

I'm using log4net, I would like to have 2 logs,
- BasketballCustomer.log, for all Customers that play Basketball;
- ChessCustomer.log, for all Customers that play Chess,
while for each customer, whether he/she plays Basketball or Chess is only known at runtime.
I would like to have each log configured separately, about log file name, size, number, log level, etc.
Also, I'd prefer such set up done by C# code, not config file.
How could I do that?
I tried searching on the net; there are some articles but none meet exactly my requirements:
- Log4Net and multiple log files talked about multiple log files but it does not toggle during runtime;
- Configure Log4net to write to multiple files is similar but it's done in the config file....
Please kindly suggest, many thanks!
You can do this by using an environment variable in the log4net.config and then setting the value of the environment variable through the C# code.
So somewhere in your C# class, do something like:
Environment.SetEnvironmentVariable("log_file_name", "MyLogFileName");
And then in the log4net.config that is used, specify the value to the name of the environment variable. The syntax would be something like this:
<param name="File" value="${log_file_name}.log" />

log4 net Dynamic file name assigning

I want to know how to dynamically assign a file name using Log4net. My application is such that 10 different files should be dynamically created based on user input, and later, based on the name, the corresponding file needs to be picked up and information written to it.
For example, in my application, based on my business requirement, for every xml file a corresponding log file with the same name as the xml file should be created. Later, whenever I do any modification to the xml file, an entry needs to be made in the corresponding log file.
Please help. I am having trouble getting control of the appropriate log to write to it.
I have not done this, but there are probably a number of ways of doing it, so this may not be the best way, but it should work:
public void OpenLogFile(string fileName)
{
    // Layout: date, thread, padded level, then the message and a newline.
    log4net.Layout.ILayout layout = new log4net.Layout.PatternLayout("%d [%t]%-5p : - %m%n");
    // Appender writing to the requested file with that layout.
    log4net.Appender.FileAppender appender = new log4net.Appender.FileAppender(layout, fileName);
    appender.Threshold = log4net.Core.Level.Info;
    // Attach the appender to the root logger programmatically (no XML config).
    log4net.Config.BasicConfigurator.Configure(appender);
}
Then just call OpenLogFile when you need to switch files.
You might need to tweak the layout or appender type.
A big disadvantage of this method is that you lose the xml configuration and the ability to change settings at runtime. So a better way might be to configure your appender in the xml file to use a property,
e.g.
<file type="log4net.Util.PatternString" value="Logfiles\Log_For_%property{MyLogFileName}" />
Then in your code you could change the property
log4net.GlobalContext.Properties["MyLogFileName"] = ...;
The tricky bit is to get log4net to reload itself. I haven't read the documentation on this, so I don't know if there is a way of forcing a reload. It might work if you just call log4net.Config.XmlConfigurator.ConfigureAndWatch again. Otherwise it should work if you opened the xml file and saved it again (without needing to change anything).
Hope this helps.
