Logstash grok can't parse Sophos logs - logstash

I followed this article to send my Sophos UTM logs to ELK.
When I start the Logstash service, logs are received from the Sophos firewall, but I get the error below on every received log:
[ERROR][logstash.codecs.json ][input][919266bd10e76503d980d2b94f51c5195b09c1afe56017a2ecf62d9c60086291] JSON parse error, original data now in message field {:message=>"Could not set field 'ip' on object 'my sophos ip' to value '127.0.0.1'. This is probably due to trying to set a field like [foo][bar] = someValue when [foo] is not either a map or a string", :exception=>Java::OrgLogstash::Accessors::InvalidFieldSetException, :data=>"{"#timestamp":"2022-08-26T10:38:05.455937+04:30","#Version":"1","message":"08:26-10:38:05 my-fw-name ulogd[32435]: id="2002" sever>.............
Edit : This is the log sample that i am trying to filter :
Sep 7 10:27:49 10.211.254.6 2022: 09:07-10:27:48 sepahbod-main-fw ulogd[7421]: id="2000" severity="info" sys="SecureNet" sub="packetfilter" name="Packet logged" action="log" fwrule="62446" initf="eth6" srcmac="b4:14:89:71:bc:90" dstmac="00:1a:8c:36:6f:84" srcip="10.139.0.14" dstip="192.168.7.5" proto="17" length="58" tos="0x00" prec="0x00" ttl="126" srcport="59212" dstport="53"
and this is the grok filter that I am using.
Can anyone help me to solve this issue?
Sorry for my bad English.
Thanks
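For what it's worth, the sample line above is a syslog-style header followed by key="value" pairs, so one common approach is a grok for the header plus a kv filter for the pairs. This is a rough sketch only (field names such as fw_ip, utm_ts and kv_pairs are illustrative, not from the original config):
filter {
  grok {
    # split the syslog-style header from the key="value" payload
    match => {
      "message" => "%{SYSLOGTIMESTAMP:syslog_ts} %{IPORHOST:fw_ip} %{YEAR}: %{DATA:utm_ts} %{SYSLOGHOST:fw_name} ulogd\[%{POSINT:pid}\]: %{GREEDYDATA:kv_pairs}"
    }
  }
  kv {
    # the kv filter strips the surrounding double quotes from values by default
    source => "kv_pairs"
  }
}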

Related

How to parse a syslog message using Logstash

Hi, I have a syslog made up of two events:
Jul 6 13:24:27 NODE1 zeus.eventd[14176]: pools/POOL nodes/IP:3000 nodefail Node NODE2 has failed - A monitor has detected a failure
Jul 6 13:24:34 NODE1 zeus.eventd[14176]: pools/POOL nodes/IP:3000 nodeworking Node NODE2 is working again
I would like to pull NODE2 from the syslog and add it as a field in the index, along with nodefail/nodeworking.
Currently my input/grok is:
syslog {
  grok_pattern => "%{SYSLOGLINE}"
}
with no filter; however, all of the info I need is populated in a "message" field, so I am unable to use it in Elastic.
I know the position of what I want in the syslog line; I just need to pull it out and add it as a field.
Is anyone able to show me the input/filter config I need in order to achieve this?
Thanks,
TheCube
Edit: The message fields look like this:
zeus.eventd 14176 - - SERIOUS pools/POOL nodes/IP:3000 nodefail Node NODENAME has failed - A monitor has detected a failure
zeus.eventd 14176 - - INFO pools/POOL nodes/IP:3000 nodeworking Node NODENAME is working again
You can use the dissect filter plugin on the message field created while parsing with %{SYSLOGLINE}:
dissect {
  mapping => {
    "message" => "%{} %{} %{status} %{} %{node_name} %{}"
  }
}
Or a second grok filter, applied on the message field created while parsing with %{SYSLOGLINE}, with this pattern:
^pools/POOL nodes/IP:\d+ %{WORD:status} Node %{WORD:node_name}
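Wired into the pipeline, that second grok could look roughly like this (a sketch; it assumes %{SYSLOGLINE} has already populated the message field):
filter {
  grok {
    # runs on the message field left over from %{SYSLOGLINE}
    match => {
      "message" => "^pools/POOL nodes/IP:\d+ %{WORD:status} Node %{WORD:node_name}"
    }
  }
}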
In both cases, with the logs given in your question, you get those results:
"status":"nodefail"
"node_name":"NODE2"
"status":"nodeworking"
"node_name":"OFSVDBM101"

How to add bytes, session and source parameters in Kibana to visualise Suricata logs?

I redirected all the logs (Suricata logs here) to Logstash using rsyslog. I used a template for rsyslog as below:
template(name="json-template"
         type="list") {
    constant(value="{")
    constant(value="\"#timestamp\":\"")     property(name="timereported" dateFormat="rfc3339")
    constant(value="\",\"#version\":\"1")
    constant(value="\",\"message\":\"")     property(name="msg" format="json")
    constant(value="\",\"sysloghost\":\"")  property(name="hostname")
    constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
    constant(value="\",\"facility\":\"")    property(name="syslogfacility-text")
    constant(value="\",\"programname\":\"") property(name="programname")
    constant(value="\",\"procid\":\"")      property(name="procid")
    constant(value="\"}\n")
}
For every incoming message, rsyslog will interpolate log properties into a JSON-formatted message and forward it to Logstash, listening on port 10514.
Reference link: https://devconnected.com/monitoring-linux-logs-with-kibana-and-rsyslog/
(I have also configured Logstash as mentioned in the above reference link.)
I am getting all the columns in Kibana Discover (as mentioned in the json-template of rsyslog), but I also require bytes, session and source columns in Kibana, which I am not getting here. I have attached a snapshot of the columns I am getting in Kibana.
Available fields (or say columns) in Kibana are:
#timestamp
#version
_type
facility
host
message
procid
programname
sysloghost
_id
_index
_score
severity
Please let me know how to add bytes, session and source to the available fields in Kibana. I require these parameters for further drill-down in Kibana.
EDIT: I have added how my "/var/log/suricata/eve.json" looks (which I need to visualize in Kibana).
For bytes, I will use (bytes_toserver + bytes_toclient), which are available inside flow.
Session I need to calculate.
Source_IP I will use as the source.
{"timestamp":"2020-05 04T14:16:55.000200+0530","flow_id":133378948976827,"event_type":"flow","src_ip":"0000:0000:0000:0000:0000:0000:0000:0000","dest_ip":"ff02:0000:0000:0000:0000:0001:ffe0:13f4","proto":"IPv6-ICMP","icmp_type":135,"icmp_code":0,"flow":{"pkts_toserver":1,"pkts_toclient":0,"bytes_toserver":87,"bytes_toclient":0,"start":"2020-05-04T14:16:23.184507+0530","end":"2020-05-04T14:16:23.184507+0530","age":0,"state":"new","reason":"timeout","alerted":false}}
Direct answer
Read the grok docs in detail.
Then head over to the grok debugger with some sample logs to figure out expressions. (There's also a grok debugger built into Kibana's Dev Tools nowadays.)
This list of grok patterns might come in handy, too.
A better way
Use Suricata's JSON log instead of the syslog format, and use Filebeat instead of rsyslog. Filebeat has a Suricata module out of the box.
Sidebar: Parsing JSON logs
In Logstash's filter config section:
filter {
  json {
    source => "message"
    # you probably don't need the "message" field if it parses OK
    # remove_field => "message"
  }
}
[Edit: added JSON parsing]

Create a custom grok pattern

I was working with Logstash to structure the following type of logs:
14 Apr 2020 22:49:02,868 [INFO] 1932a8e0-3892-4bae-81e3-1fc1850dff55-LPmAoB (coral-client-orchestrator-41786) hub_delivery_audit: RequestContext{CONTAINER_ID=200414224842439045902810201AZ, TRACKING_ID=TSTJ8N7GLBS0ZZW, PHYSICAL_ATTRIBUTES=PhysicalAttributes(length=Dimension(value=30.0, unit=CM, type=null), width=Dimension(value=30.0, unit=CM, type=null), height=Dimension(value=30.0, unit=CM, type=null), scaleWeight=Weight(value=5.0, unit=kg, type=null)), SHIP_METHOD=AMZN_US_PRIME, ADDRESS_ID=LDI7ICATBZNOAQNW634MG057BMA07370713J4ZQ1VGOMB7KPXTQ2EIA2OX4CKT7L, CUSTOMER_ID=A07370713J4ZQ1VGOMB7K, REQUEST_STATE=UNKNOWN, RESPONSE=GetAccessPointsForHubDeliveryOutput(destinationLocation=null, fallBackLocation=null, capability=null), IS_COMMERCIAL_ATTRIBUTE_PRESENT=false}
and I wanted to extract the following data out of it:
CONTAINER_ID
TRACKING_ID
PHYSICAL_ATTRIBUTES
SHIP_METHOD
ADDRESS_ID
REQUEST_STATE
RESPONSE
But I'm not able to figure out an appropriate filter for such a large log event. I've tried using https://grokdebug.herokuapp.com/ and went through the Logstash grok documentation as well, but still couldn't extract the required fields. I could only come up with this:
%{MONTHDAY:monthday} %{MONTH:month} %{YEAR:year} %{TIME:time} \[%{LOGLEVEL:logLevel}\] %{HOSTNAME}
Please suggest an approach for this, and how to extract the required fields directly without creating extra fields like time and date.
I have tried the following grok pattern
{CONTAINER_ID=%{DATA:container_id}, TRACKING_ID=%{DATA:tracking_id}, PHYSICAL_ATTRIBUTES=PhysicalAttributes%{DATA:physical_attributes} SHIP_METHOD=%{DATA:ship_method}, ADDRESS_ID=%{DATA:address_id}, CUSTOMER_ID=%{DATA:customer_id}, REQUEST_STATE=%{DATA:request_state}, RESPONSE=%{GREEDYDATA:response}(?=,)
in the grok debugger (https://grokdebug.herokuapp.com/).
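One hedged way forward is to anchor on the literal key names inside RequestContext rather than parsing the whole line; grok matches anywhere in the string, so the timestamp prefix can be ignored entirely. A sketch only, with illustrative field names:
filter {
  grok {
    match => {
      # each %{DATA} stops at the next literal key, so nested commas are fine
      "message" => "CONTAINER_ID=%{DATA:container_id}, TRACKING_ID=%{DATA:tracking_id}, PHYSICAL_ATTRIBUTES=%{DATA:physical_attributes}, SHIP_METHOD=%{DATA:ship_method}, ADDRESS_ID=%{DATA:address_id}, CUSTOMER_ID=%{DATA:customer_id}, REQUEST_STATE=%{DATA:request_state}, RESPONSE=%{DATA:response}, IS_COMMERCIAL"
    }
  }
}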

Elasticsearch, Logstash, Kibana and Grok: How do I break apart the message?

I created a filter to break apart our log files and am having the following issue: I'm not able to figure out how to save the parts of the "message" to their own field or tag or whatever you call it. I'm 3 days new to Logstash and have had zero luck finding someone here who knows it.
So, for example, let's say this is your log line in a log file:
2017-12-05 [user:edjm1971] msg:This is a message from the system.
And what you want to do is get the value of the user and set that into some index mapping so you can search for all logs that were by that user. Also, you should see the information from the message in its own fields in Kibana.
My pipeline.conf file for Logstash looks like:
grok {
  match => {
    "message" => "%{TIMESTAMP_ISO8601:timestamp} [sid:%{USERNAME:sid} msg:%{DATA:message}"
  }
  add_tag => [ "foo_tag", "some_user_value_from_sid_above" ]
}
Now when I run the logger to create logs, the data gets over to ES and I can see it in Kibana, but I don't see foo_tag at all with the sid value.
How exactly do I use this to create the new tag that gets stored into ES, so I can see the data I want from the message?
Note: using regex tools, it all appears to parse the log formats fine, and the Logstash log does not spit out errors when processing.
Also, for the Logstash mapping, it is using some auto-defined mapping, as the path value is nil.
I'm not clear on how to create a mapping for this either.
Guidance is greatly appreciated.
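For reference, a hedged sketch of a filter that matches the sample line above and carries the captured user into a tag via a sprintf reference (field names are illustrative; note that the sample uses [user:...] rather than [sid:...], and grok's literal square brackets must be escaped):
filter {
  grok {
    match => {
      # the date has no time component, so a custom capture is used instead of TIMESTAMP_ISO8601
      "message" => "(?<log_date>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}) \[user:%{USERNAME:user}\] msg:%{GREEDYDATA:msg_text}"
    }
    # sprintf references are resolved, so the tag carries the captured value
    add_tag => [ "foo_tag", "%{user}" ]
  }
}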

Logstash grok filter: parsing custom application logs

I'm trying to parse my application logs using Logstash filters. The log file contents are like below:
17 May 2016 11:45:53,391 [tomcat-http--10] INFO com.visa.vrm.aop.aspects.LoggingAspect - RTaBzeTuarf |macBook|com.visa.vrm.admin.controller.OrgController|getOrgs|1006
I'm trying to create a dashboard (line chart) using Logstash and want to show the activities on it. E.g. a request comes in from some server with a correlation id, and I have to see which class it calls, with the corresponding method, and how long it took to execute.
The message is like:
correlation id | server-name | class name | method name | time taken
Log file, e.g.:
RTaBzeTuarf |macBook|com.visa.vrm.admin.controller.OrgController|getOrgs|1006
I'm unable to create grok patterns/filters for the above message. Can someone advise me on this?
Try this:
(?<timestamp>%{MONTHDAY} %{MONTH} %{YEAR} %{HOUR}:%{MINUTE}:%{SECOND}) \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} (?<logger>[A-Za-z0-9$_.]+) - %{GREEDYDATA:correlationId}\|%{GREEDYDATA:servername}\|%{GREEDYDATA:className}\|%{GREEDYDATA:methodName}\|%{NUMBER:time}$
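Dropped into a pipeline, that pattern would sit in the filter section roughly like this (a sketch; the mutate convert is an optional extra so the duration can be charted as a number):
filter {
  grok {
    match => {
      "message" => "(?<timestamp>%{MONTHDAY} %{MONTH} %{YEAR} %{HOUR}:%{MINUTE}:%{SECOND}) \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} (?<logger>[A-Za-z0-9$_.]+) - %{GREEDYDATA:correlationId}\|%{GREEDYDATA:servername}\|%{GREEDYDATA:className}\|%{GREEDYDATA:methodName}\|%{NUMBER:time}$"
    }
  }
  mutate {
    # make the execution time numeric so Kibana can chart it
    convert => { "time" => "integer" }
  }
}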
