Handle compressed logs in Logstash

I have zip files which contain log files. My scenario: I need to extract the log files from the zip archives and then parse them with Logstash. Could anyone guide me? That would be very helpful. Thanks for your valuable time.

Logstash itself doesn't have any input plugins to process log files inside zip archives, but as long as you extract the files yourself into a directory that you've configured Logstash to look for log files in, you'll be fine. The standard file input plugin accepts wildcards, so you could say
input {
  file {
    path => ["/some/path/*.log"]
  }
}
to have Logstash process all files in the /some/path directory (you'd probably want to assign a type to the messages too).
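For instance, a minimal sketch of that type assignment (the type name "ziplog" is just an illustration):
input {
  file {
    path => ["/some/path/*.log"]
    type => "ziplog"
  }
}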
If the names of the files inside the zip archives aren't unique, you'll have to make sure you don't overwrite existing files when you unzip the archives. Or, probably better, extract each archive into a directory of its own (see the shell sketch after the configuration below) and have Logstash process the whole tree:
input {
  file {
    path => ["/some/path/*/*.log"]
  }
}
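A minimal shell sketch of that per-archive extraction (the /incoming source directory is an assumption; /some/path is the tree from the configuration above):
# extract every zip into a directory of its own, named after the archive
for z in /incoming/*.zip; do
  d="/some/path/$(basename "$z" .zip)"
  mkdir -p "$d"
  unzip -q "$z" -d "$d"
done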
Logstash isn't capable of deleting processed logs or moving them out of the way, so that's something you'll have to take care of too.

Related

How to parse multiple log files using logstash?

I have a few log files like the ones below:
data_log_01.log
data_log_02.log
data_log_03.log
data_log_04.log
Is there any way that I can parse these logs one by one using a single config file in logstash?
How about using the file input plugin with a wildcard?
An example configuration could look like this, assuming your log files are located in /home/of/your/logs/:
input {
  file {
    path => [
      "/home/of/your/logs/*.log"
    ]
  }
}
The path value has to be an absolute path! You might want to see the docs on using path.

Configure Logstash to wait before parsing a file

I wonder if you can configure Logstash in the following way:
Background Info:
Every day I get an XML file pushed to my server, which should be parsed.
To indicate a complete file transfer, I afterwards get an empty .ctl (custom control file) transferred to the same folder.
Both files follow the name schema 'feedback_{year}{yearday}_UTC{hoursminutesseconds}_51.{extension}' (e.g. feedback_16002_UTC235953_51.xml), so they share the same file name, but one ends in .xml and the other in .ctl.
Question:
Is there a way to configure Logstash to wait with parsing the XML file until the corresponding .ctl file is present?
EDIT:
Is there maybe a way to achieve that with Filebeat?
EDIT2:
It would also be enough to be able to configure Logstash so that it waits x minutes before starting to process a new file, if that is easier.
Thanks for any help in advance
Your problem is that you don't want to start the parser before the file transfer has been completed. So, why not push the data to a file (file-complete.xml) once you find your flag file (empty.ctl)?
Here is possible logic for a script that runs via crontab:
if empty.ctl exists:
    clear file-complete.xml
    add the content of file.xml to file-complete.xml
    remove empty.ctl
This way, you'd need to parse the data from file-complete.xml. I think it is simpler to debug and configure.
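A minimal shell sketch of that logic, using the file names from the pseudocode above (the /incoming directory is an assumption):
#!/bin/sh
# run from crontab; only publish the XML once the .ctl flag file exists
if [ -e /incoming/empty.ctl ]; then
  cat /incoming/file.xml > /incoming/file-complete.xml  # clears and refills in one step
  rm /incoming/empty.ctl
fi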
Hope it helps,

Use Log4View on a folder with zipped logfiles

I tried to use Log4View for the first time, so my question is basic-level. Is it possible to point Log4View at a folder which has zipped log files in it? And if yes, how can I configure Log4View correctly? I tried but can't find an example of how to do that.
c:\Folder1\zippedLogfiles001.zip
c:\Folder1\zippedLogfiles002.zip
c:\Folder1\zippedLogfiles003.zip
... (up to 300 logfiles)
I heard it is possible for Log4View to read automatically from a folder, so that the log files don't need to be unzipped manually.
Log4View doesn't work with zipped log files.
To open a log file with Log4View, it has to be a regular, uncompressed text or xml file.
Ulrich

How to zip list of files + append custom string to each file on Linux on the fly

I need a Linux solution, as I've figured out a way to do this on Windows by modifying a C# implementation, but I am not sure where to start on Linux. I would like to be able to do the following from the command line:
Run a command providing a list of files to be zipped, an output path, and a custom string
The custom string should be automatically appended to the end of the internal data of each file but not written to the original file. I want it all handled in-stream / in memory.
The data stream is fed to the zip utility and the zip file is created at the output location with 0 compression (store only).
Explanation: this custom string is used as a watermark to uniquely identify the files in the zip.
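No answer was recorded for this question, but a minimal shell sketch of the idea could look like this (the script name, the /dev/shm staging directory, and the argument layout are all assumptions; the stock zip CLI can't assign names to stdin entries, so a tmpfs directory stands in for "in memory"):
#!/bin/sh
# usage: watermark-zip.sh OUTPUT.zip "CUSTOM STRING" FILE...
out="$1"; mark="$2"; shift 2
tmp=$(mktemp -d /dev/shm/wmzip.XXXXXX)  # tmpfs, so the copies never touch disk
for f in "$@"; do
  cat "$f" > "$tmp/$(basename "$f")"
  printf '%s' "$mark" >> "$tmp/$(basename "$f")"  # append watermark; original file untouched
done
zip -0 -j "$out" "$tmp"/*  # -0 = store only (no compression), -j = junk paths
rm -rf "$tmp"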

Logstash file input plugin

Currently I am using the file input plugin to go over my log archive, but it is not the right solution for me because the file input plugin inherently expects a file to be a stream of events, not a static file. This is causing a great deal of problems for me because my log archive has 100,000+ log files, and Logstash opens a handle on all these files, which are never going to change.
I am facing the following problems:
1) Logstash fails with the problem mentioned in SO.
2) With that many open file handles, the log archive storage is getting very slow.
Does anybody know a way to tell Logstash to treat files as static, or to stop keeping a file handle on a file once it has been processed?
In a Logstash Jira bug I was told to write my own plugin, along with some other suggestions which won't help me much.
The Logstash file input can process static files. You need to add this configuration:
file {
  path => "/your/logs/path"
  start_position => "beginning"
}
After adding start_position, Logstash reads the file from the beginning. Please refer to the documentation for more information. Remember that this option only modifies "first contact" situations where a file is new and has not been seen before; if a file has already been seen, this option has no effect. Otherwise, you have to set your sincedb_path to /dev/null so Logstash forgets which files it has already seen.
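Putting both options together, a sketch could look like this (the wildcard path is an assumption):
input {
  file {
    path => "/your/logs/path/*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # forget read positions, so files are re-read on restart
  }
}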
For the first question, I have an answer in the comment: please try raising the maximum number of open files.
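For instance (the limit value is just an illustration):
# raise the per-process open-file limit in the shell that starts Logstash
ulimit -n 65536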
As my suggestion, you can try to write a script that copies log files into the directory Logstash monitors and constantly moves them out again; you'll have to estimate the time Logstash needs to process a log file.
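A hypothetical staging loop along those lines (the paths and the sleep interval are assumptions):
# feed the archive through the watched directory one file at a time
for f in /archive/*.log; do
  cp "$f" /logstash/watch/
  sleep 300  # rough estimate of how long Logstash needs per file
  rm "/logstash/watch/$(basename "$f")"
done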
Also look out for the following: turn on -v and --debug for Logstash and you may see messages like
{:timestamp=>"2016-05-06T18:47:35.896000+0530",
:message=>"_discover_file: /datafiles/server.log:
**skipping because it was last modified more than 86400.0 seconds ago**",
:level=>:debug, :file=>"filewatch/watch.rb", :line=>"330",
:method=>"_discover_file"}
The solution is to touch the file or change the ignore_older setting.
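For example, raising ignore_older above the age of the oldest files you still want read (the path and the value are assumptions):
input {
  file {
    path => "/datafiles/*.log"
    ignore_older => 604800  # one week, instead of the 86400 seconds seen in the log above
  }
}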
