How to compare float in logstash?

Here is the pipeline config I am testing with:
input {
  file {
    path => "/tmp/test1.log"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  if [a] == 1.1 {
    stdout {}
  }
}
I echo a test log line into the file:
echo '{"a": 1.1,"b": "test"}' >> /tmp/test1.log
But there is no output in the console, and the condition if [a] == "1.1" does not work either.
Does anybody know how to compare a float?
Thanks!

Alternatively, you can use custom Ruby code to compare the float:
ruby {
  code => '
    if event.get("[a]") == 1.1
      event.set("isFloat", "true")
    end
  '
}
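If the comparison still misbehaves, note that exact float equality is fragile in general; comparing within a small tolerance is safer. A minimal standalone sketch of the idea in plain Ruby, with a hash standing in for the Logstash event API (the tolerance value is an assumption):

```ruby
# A parsed JSON value for "a": 1.1 arrives as a Ruby Float.
event = { "a" => 1.1 }

value = event["a"]
# Compare within a tolerance instead of relying on exact equality.
is_match = value.is_a?(Numeric) && (value - 1.1).abs < 1e-9
puts is_match  # prints true
```

Inside the ruby filter, the same check would wrap the value returned by event.get("[a]").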

Related

Logstash can't read file

I'm using Logstash for the first time, on Windows 10.
I'm trying to read a multiline JSON file from an input/ folder like this:
input
{
  file
  {
    codec => multiline
    {
      pattern => '^/{'
      negate => true
      what => previous
    }
    path => "/input/*.json"
    exclude => "*.gz"
  }
}
filter
{
  mutate
  {
    replace => [ "message", "%{message}" ]
    gsub => [ 'message', '\n', '' ]
  }
  if [message] =~ /^{.*}$/
  {
    json { source => message }
  }
}
output
{
  file
  {
    path => "/output/output.json"
  }
}
The problem is that when I launch the Logstash instance I get this error:
WARN logstash.inputs.file - failed to open
/input/sample.json: \input\sample.json
I already replaced LS_GROUP with adm in startup.options and tried replacing "/" with "\"; it didn't work.
I also ran "chmod 777" on my JSON file, but that changed nothing.
Any idea?
Using / should work fine. Alternatively, try writing the path with escaped backslashes:
path => "\\input\\*.json"
Make sure you give the complete path.
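For what it's worth, a fully qualified path with forward slashes usually works for the file input on Windows; a sketch (the drive letter and folder here are assumptions, not from the question):

```
input {
  file {
    # Full absolute path; forward slashes are fine on Windows
    path => "C:/logstash/input/*.json"
  }
}
```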

Multiple If in single filter

if [CREATION_DATE] == ""
{
  mutate {
    convert => [ "CREATION_DATE", "string" ]
  }
}
else
{
  date {
    locale => "en"
    match => [ "CREATION_DATE", "dd-MMM-yy hh.mm.ss.SSS a" ]
    target => "CREATION_DATE"
  }
}
if [SUBMITTED_DATE] == ""
{
  mutate {
    convert => [ "SUBMITTED_DATE", "string" ]
  }
}
else
{
  date {
    locale => "en"
    match => [ "SUBMITTED_DATE", "dd-MMM-yy hh.mm.ss.SSS a" ]
    target => "SUBMITTED_DATE"
  }
}
if [LAST_MODIFIED_DATE] == ""
{
  mutate {
    convert => [ "LAST_MODIFIED_DATE", "string" ]
  }
}
else
{
  date {
    locale => "en"
    match => [ "LAST_MODIFIED_DATE", "dd-MMM-yy hh.mm.ss.SSS a" ]
    target => "LAST_MODIFIED_DATE"
  }
}
I get output only when all three fields (CREATION_DATE, SUBMITTED_DATE, LAST_MODIFIED_DATE) are in date format. If any of them is a string, that log line does not make it through.
For example, my input is:
12-JUL-13 11.33.56.259 AM,12-JUL-13 03.59.36.136 PM,12-JUL-13 04.00.05.584 PM
14-JUL-13 11.33.56.259 AM,11-JUL-13 04.00.05.584 PM
Output comes through successfully for
12-JUL-13 11.33.56.259 AM,12-JUL-13 03.59.36.136 PM,12-JUL-13 04.00.05.584 PM
but not for the second line.
In short, Logstash indexes a line only when all three if clauses see dates.
Help me out. Thanks in advance!
The issue with your if statements is pointed out in the comments by @Fairy and @alain-collins.
if [CREATION_DATE] == ""
does not check whether that field exists; it checks whether it is an empty string.
Instead, you could use a regex check to see if there is any content in the field:
if [CREATION_DATE] =~ /.*/
and apply your date filter when this returns true.
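Putting that together for one field, the branch could look like this (a sketch using the field name from the question; note that /.*/ also matches an empty string, so it only guards against the field being absent entirely):

```
filter {
  # =~ on a missing field is false, so this skips events
  # that have no CREATION_DATE at all.
  if [CREATION_DATE] =~ /.*/ {
    date {
      locale => "en"
      match  => [ "CREATION_DATE", "dd-MMM-yy hh.mm.ss.SSS a" ]
      target => "CREATION_DATE"
    }
  }
}
```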
The issue was solved when I changed the input format:
(New format) 11-JUL-13 06.36.33.425000000 PM,13-JUL-13 06.36.33.425000000 PM,,
instead of
(Old format) 11-JUL-13 06.36.33.425000000 PM,13-JUL-13 06.36.33.425000000 PM,"",
But my original question is still open.
I posted this because the workaround might be useful to others.
Thanks!!!

How to remove an xml attribute value in logstash before passing to elasticsearch?

Here is the XML file. The test-method tag with name="reEnterURL" repeats with each test; I don't want to pass it to Elasticsearch. Only the element with status="PASS" and name="newProfile" should be passed to Elasticsearch.
</test-method>
<test-method status="PASS" signature="reEnterURL()[pri:0, instance:testSuite.DriverScript_Transformation#6fc757]" name="reEnterURL" is-config="true" duration-ms="1107" started-at="2015-12-30T15:55:24Z" finished-at="2015-12-30T15:55:26Z">
<reporter-output>
</reporter-output>
</test-method>
<test-method status="PASS" signature="newProfile()[pri:0,instance:testSuite.DriverScript_Transformation#6fc757]" name="newProfile" duration-ms="818999" started-at="2015-12-30T15:55:26Z" finished-at="2015-12-30T16:09:05Z">
<reporter-output>
</reporter-output>
</test-method>
Here is the conf file:
input {
  file {
    path => ["file-path/testng-result.xml"]
    start_position => "beginning"
    type => "suitefile"
  }
}
filter
{
  if [type] == "suitefile"
  {
    xml
    {
      source => "message"
      remove_namespaces => "true"
      # XPath attributes are addressed with @, not #
      xpath => ["//test-method/@name", "Name",
                "//test-method/@status", "Status"]
      store_xml => false
    }
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch_http {
    host => "ip address"
  }
}
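This question went unanswered in the thread, but one hedged approach (an assumption on my part, not from the thread) is to discard the unwanted events after the xml filter has extracted the attributes, using Logstash's drop filter. This assumes each test-method arrives as its own event (e.g. via a multiline codec); xpath targets are arrays, so the in operator is used for the comparison:

```
filter {
  # Sketch: discard events whose extracted Name contains reEnterURL,
  # so only the newProfile test method reaches Elasticsearch.
  if "reEnterURL" in [Name] {
    drop { }
  }
}
```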

Collect Data from log files with logstash

I'm trying to use Logstash to collect data from a log file for a NETASQ firewall that contains many lines, but I cannot parse my data correctly. I don't know whether there is a standard to follow, but I started like this:
input {
  stdin { }
  file {
    type => "FireWall"
    path => "/var/log/file.log"
    start_position => 'beginning'
  }
}
filter {
  grok {
    match => [ "message", "%{SYSLOGTIMESTAMP:date} %{WORD:id}" ]
  }
}
output {
  stdout { }
  elasticsearch {
    cluster => "logstash"
  }
}
The first line of my file.log looks like this:
Feb 27 04:02:23 id=firewall time="2015-02-27 04:02:23" fw="GVGM-NEWYORK"
tz=+0200 startime="2015-02-27 04:02:22" pri=5 confid=01 slotlevel=2 ruleid=57
srcif="Vlan2" srcifname="SSSSS" ipproto=udp dstif="Ethernet0"
dstifname="out" proto=teredo src=192.168.21.12 srcport=52469
srcportname=ephemeral_fw_udp dst=94.245.121.253 dstport=3544
dstportname=teredo dstname=teredo.ipv6.microsoft.com.nsatc.net
action=block logtype="filter"#015
And finally, how can I collect data from the other lines? Please give me a pointer just to get started. Thanks, all.
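Since everything after the timestamp in the sample line is key=value pairs (some quoted), Logstash's kv filter is a natural fit after the grok; a sketch building on the config above (the kvpairs field name is an assumption for illustration):

```
filter {
  grok {
    match => [ "message", "%{SYSLOGTIMESTAMP:date} %{GREEDYDATA:kvpairs}" ]
  }
  kv {
    # Splits key=value pairs; quoted values like fw="GVGM-NEWYORK"
    # are handled by default.
    source => "kvpairs"
  }
}
```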

Parse error for float values

I'm trying to extract a float value from a log line, but the logstash mutate filter rounds the value and converts it to an integer.
The log line is
f413e89e-8c2f-e411-97a5-005056820dbe|0,0033
and the configuration file is
input {
  file {
    path => "log.txt"
  }
}
filter {
  grok {
    match => ["message", "%{UUID:request_object_id}[/|]%{LOCALNUM:total_time}"]
  }
  mutate {
    gsub => ["total_time", "[,]", "."]
    convert => [ "total_time", "float" ]
  }
}
output {
  elasticsearch { host => localhost }
}
LOCALNUM is a custom pattern:
(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:[,][0-9]+)?)|(?:[,][0-9]+)))
(it uses "," instead of "." in floating-point numbers).
With this configuration, total_time comes out as 0 instead of 0.0033.
Looking at the logstash source code, the mutate filter does this:
convert(event) if @convert
gsub(event) if @gsub
So it runs convert before gsub. Split your mutate into two separate mutate blocks and it will fix your problem.
mutate {
  gsub => ["total_time", "[,]", "."]
}
mutate {
  convert => [ "total_time", "float" ]
}
Oh, I found my mistake. I used two separate mutate blocks, one for gsub and the other for convert, and that solved the problem.
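The order matters because parsing "0,0033" as a float stops at the first non-numeric character, the comma. A standalone Ruby sketch of the two orderings:

```ruby
raw = "0,0033"

# Convert before gsub: parsing stops at the comma, leaving 0.0.
wrong = raw.to_f

# Gsub before convert: replace the comma first, then parse.
right = raw.gsub(",", ".").to_f

puts wrong   # prints 0.0
puts right   # prints 0.0033
```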
