Send email to different users based on some pattern in Logstash - logstash

We can send email notifications to a particular email address, but I want to send email to different addresses based on some pattern in the logs.
For example, say I have three users with email addresses:
userOne@something.com receives mail if the log contains [userOneModule]
userTwo@something.com receives mail if the log contains [userTwoModule]
userThree@something.com receives mail if the log contains [userThreeModule]
The Logstash version used is 1.3.3.
Is this possible in Logstash, or is there any workaround to achieve something like this?
This is my configuration. Although both 'Security' and 'Portal' match, the email is sent to only one.
When I keep only one kind of log, say Security logs or Portal logs, it works, but when I keep both kinds of logs it only sends email for one of them.
output {
  if [module] == "Security" {
    email {
      to => "userOne@somemail.com"
      from => "dummy2161@somemail.com"
      match => ["%{message}", "severity,ERROR"]
      subject => "Error Occurred"
      body => "%{message}"
      via => "smtp"
      options => {
        starttls => "true"
        smtpIporHost => "smtp.gmail.com"
        port => "587"
        userName => "dummy2161@somemail.com"
        password => "*******"
        authenticationType => "plain"
      }
    }
  }
  if [module] == "Portal" {
    email {
      to => "userTwo@somemail.com"
      from => "dummy2161@gmail.com"
      match => ["%{message}", "severity,ERROR"]
      subject => "Error Occurred"
      body => "%{message}"
      via => "smtp"
      options => {
        starttls => "true"
        smtpIporHost => "smtp.gmail.com"
        port => "587"
        userName => "dummy2161@somemail.com"
        password => "*****"
        authenticationType => "plain"
      }
    }
  }
}
Thanks

You can either store the recipient email address in a field (using conditionals or grok filters to assign the value) and refer to that field in the email output's to parameter, or you can wrap multiple email outputs in conditionals.
Using a field for storing the address:
filter {
  # If the module name is the same as the recipient address's local part
  mutate {
    add_field => { "recipient" => "%{modulename}@example.com" }
  }
  # Otherwise you might have to use conditionals.
  if [modulename] == "something" {
    mutate {
      add_field => { "recipient" => "someuser@example.com" }
    }
  } else {
    mutate {
      add_field => { "recipient" => "otheruser@example.com" }
    }
  }
}
output {
  email {
    to => "%{recipient}"
    ...
  }
}
Wrapping outputs in conditionals:
output {
  if [modulename] == "something" {
    email {
      to => "someuser@example.com"
      ...
    }
  } else {
    email {
      to => "otheruser@example.com"
      ...
    }
  }
}
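Applied to the original question, a minimal sketch of the first approach (reusing the [module] field and the addresses from the question; the remaining email settings stay as in your config) could look like this:
filter {
  if [module] == "Security" {
    mutate { add_field => { "recipient" => "userOne@somemail.com" } }
  } else if [module] == "Portal" {
    mutate { add_field => { "recipient" => "userTwo@somemail.com" } }
  }
}
output {
  email {
    to => "%{recipient}"
    # ... same from/match/subject/body/via/options as in the question
  }
}
This way a single email output covers both modules, and adding a new module only needs another else if branch in the filter.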

Related

How to map array inside message in Logstash HTTP Output

I am using Logstash to update by query existing Elasticsearch documents with an additional field that contains aggregate values extracted from a PostgreSQL table.
I use the elasticsearch output to load one index using document_id, and the http output to update another index that has a different document_id, but I am receiving errors:
[2023-02-08T17:58:12,086][ERROR][logstash.outputs.http ][main][b64f19821b11ee0df1bd165920785876cd6c5fab079e27d39bb7ee19a3d642a4] [HTTP Output Failure] Encountered non-2xx HTTP code 400 {:response_code=>400, :url=>"http://localhost:9200/medico/_update_by_query", :event=>#LogStash::Event:0x19a14c08}
This is my pipeline configuration:
input {
  jdbc {
    # Postgres jdbc connection string to our database, mydb
    jdbc_connection_string => "jdbc:postgresql://handel:5432/mydb"
    statement_filepath => "D:\ProgrammiUnsupported\logstash-7.15.2\config\nota_sede.sql"
  }
}
filter {
  aggregate {
    task_id => "%{idCso}"
    code => "
      map['idCso'] = event.get('idCso')
      map['noteSede'] ||= []
      map['noteSede'] << {
        'id' => event.get('idNota'),
        'tipo' => event.get('tipoNota'),
        'descrizione' => event.get('descrizione'),
        'data' => event.get('data'),
        'dataInizio' => event.get('dataInizio'),
        'dataFine' => event.get('dataFine')
      }
      event.cancel()"
    push_previous_map_as_event => true
    timeout => 60
    timeout_tags => ['_aggregatetimeout']
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
  # this works
  elasticsearch {
    hosts => "https://localhost:9200"
    document_id => "STRUTTURA_%{idCso}"
    index => "struttura"
    action => "update"
    user => "user"
    password => "password"
    ssl => true
    cacert => "/usr/share/logstash/config/ca.crt"
  }
  http {
    url => "http://localhost:9200/medico/_update_by_query"
    user => "elastic"
    password => "changeme"
    http_method => "post"
    format => "message"
    content_type => "application/json"
    message => '{
      "query": {
        "term": {
          "idCso": "%{idCso}"
        }
      },
      "script": {
        "source": "ctx._source.noteSede=params.noteSede",
        "lang": "painless",
        "params": {
          "noteSede": "%{noteSede}"
        }
      }
    }'
  }
}
The stdout output shows me the documents sent to the output, like this:
{
  "query" => {
    "term" => {
      "idCso" => "859119"
    }
  },
  "script" => {
    "source" => "ctx._source.noteSede=params.noteSede",
    "lang" => "painless",
    "params" => {
      "noteSede" => "{dataFine=null, dataInizio=2020-02-13, descrizione=?, tipo=DB, id=6390644, data=2020-02-13 12:26:58.409},{dataFine=null, dataInizio=2020-02-13, descrizione=?, tipo=DE, id=6390645, data=2020-02-13 12:26:58.41}"
    }
  }
}
How could I set the noteSede array field into the message for _update_by_query?
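One possible approach (a sketch, not from the thread): in the message above, %{noteSede} is rendered as a Ruby/Java-style hash string and is wrapped in quotes, so params.noteSede reaches Elasticsearch as an invalid string, which is a likely cause of the 400. Assuming the logstash-filter-json_encode plugin is installed (bin/logstash-plugin install logstash-filter-json_encode), you could serialize the array to a JSON string after the aggregate filter and interpolate that field unquoted (noteSede_json is just a name chosen for this example):
filter {
  # turn the noteSede array into a proper JSON string
  json_encode {
    source => "noteSede"
    target => "noteSede_json"
  }
}
and in the http output message use:
"params": {
  "noteSede": %{noteSede_json}
}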

LogStash - Parse http filter result

As always, the official docs lack examples.
I have a filter that calls an API and has to add fields by parsing the API result:
http {
  url => "rest/api/subnet/check_subnet_from_ip/"
  query => { "ip" => "%{[source][ip]}" }
  verb => GET
}
API response is something like this:
{ "name": "SUBNET1", "cidr": "192.168.0.0/24" }
I need to add new fields with these results.
I also need to handle an empty result {}.
I can't find any example of parsing the results.
Thanks.
Your response is a JSON document, so you need to use the json filter to parse it; look at the json filter documentation for all the available options.
But basically you will need something like this:
http {
  url => "rest/api/subnet/check_subnet_from_ip/"
  query => { "ip" => "%{[source][ip]}" }
  verb => GET
  target_body => api_response
}
json {
  source => "api_response"
  # put the parsed fields under [api_response] so they can be referenced later
  target => "api_response"
}
To add new fields you need to use the mutate filter; look at the mutate filter documentation for all the available options.
To add a new field you need something like this:
mutate {
  add_field => { "newFieldName" => "newFieldValue" }
}
Or to add a new field with the value from an existing field:
mutate {
  add_field => { "newFieldName" => "%{existingField}" }
}
Considering a response in the format:
{ "name": "SUBNET1", "cidr": "192.168.0.0/24" }
and the fact that you need to check for empty responses, you will also need conditionals, so your pipeline should look something like this:
http {
  url => "rest/api/subnet/check_subnet_from_ip/"
  query => { "ip" => "%{[source][ip]}" }
  verb => GET
  target_body => api_response
}
json {
  source => "api_response"
  target => "api_response"
}
if [api_response][name] {
  mutate {
    add_field => { "[source][subnet][name]" => "%{[api_response][name]}" }
  }
}
if [api_response][cidr] {
  mutate {
    add_field => { "[source][subnet][cidr]" => "%{[api_response][cidr]}" }
  }
}
This will check whether the fields name and cidr exist, and if they do it will add the new fields.
You can also rename the fields instead if you want; just use this mutate configuration:
mutate {
  rename => { "[api_response][name]" => "subnet_name" }
}
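For the empty-response case from the question: an empty {} body simply produces no name or cidr fields, so the conditionals above skip their mutate blocks. If the body is not valid JSON at all, the json filter tags the event with _jsonparsefailure, which you can also test for:
if "_jsonparsefailure" in [tags] {
  # the API response could not be parsed; drop or tag the event here as needed
}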

Logstash configuration for word extraction

I am new to Logstash manipulations and I have no idea how to do the below.
I have a sample data as below:
Column:Type
Incident Response P3
Incident Resolution L1.5 P2
...
I want to extract the words 'Response' and 'Resolution' into a new column 'SLA Type'.
I'm looking for something very similar to the SQL statement below:
case when Type like '%Resolution%' then Resolution
when Type like '%Response%' then Response
end as SLA_Type
How do I manipulate this in Logstash?
Below is my conf. I'm using an API input.
input {
  http_poller {
    urls => {
      snowinc => {
        url => "https://service-now.com"
        user => "your_user"
        password => "yourpassword"
        headers => { Accept => "application/json" }
      }
    }
    request_timeout => 60
    metadata_target => "http_poller_metadata"
    schedule => { cron => "* * * * * UTC" }
    codec => "json"
  }
}
filter {
  json { source => "result" }
  split { field => ["result"] }
  date {
    match => ["[result][sys_created_on]", "yyyy-MM-dd HH:mm:ss"]
    target => "sys_created_on"
  }
}
output {
  elasticsearch {
    hosts => ["yourelastuicIP"]
    index => "incidentsnow"
    action => "update"
    document_id => "%{[result][number]}"
    doc_as_upsert => true
  }
  stdout { codec => rubydebug }
}
The output from the API JSON URL looks like the below:
{"result": [
  {
    "made_sla": "true",
    "Type": "incident resolution p3",
    "sys_updated_on": "2019-12-23 05:00:00",
    "number": "INC0010275",
    "category": "Network"
  },
  {
    "made_sla": "true",
    "Type": "incident resolution l1.5 p4",
    "sys_updated_on": "2019-12-24 07:00:00",
    "number": "INC0010567",
    "category": "DB"
  }
]}
You can use the following filter block in your pipeline to add a new field if a word is present in another field.
if "response" in [Type] {
  mutate {
    add_field => { "SLA_Type" => "Response" }
  }
}
if "resolution" in [Type] {
  mutate {
    add_field => { "SLA_Type" => "Resolution" }
  }
}
If the word response is present in the field Type, a new field named SLA_Type with the value Response will be added to your document; the same will happen with resolution.
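If you want the two checks to be mutually exclusive, like the CASE expression in the question, you can chain them with else if instead (a sketch reusing the same Type field):
if "resolution" in [Type] {
  mutate {
    add_field => { "SLA_Type" => "Resolution" }
  }
} else if "response" in [Type] {
  mutate {
    add_field => { "SLA_Type" => "Response" }
  }
}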

Issue in renaming Json parsed field in Logstash

I am parsing a JSON log file in Logstash. There is a field named #person.name. I tried to rename this field before sending it to Elasticsearch. I also tried to remove the field, but I couldn't remove or delete it, and because of that my data is not getting indexed in Elasticsearch.
Error recorded in elasticsearch
MapperParsingException[Field name [#person.name] cannot contain '.']
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:276)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:221)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parse(ObjectMapper.java:196)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:308)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:221)
at org.elasticsearch.index.mapper.object.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:138)
at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:119)
at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:100)
at org.elasticsearch.index.mapper.MapperService.parse(MapperService.java:435)
at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.applyRequest(MetaDataMappingService.java:257)
at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.execute(MetaDataMappingService.java:230) at org.elasticsearch.cluster.service.InternalClusterService.runTasksForExecutor(InternalClusterService.java:458)
at org.elasticsearch.cluster.service.InternalClusterService$UpdateTask.run(InternalClusterService.java:762)
My Logstash config
input {
  beats {
    port => 11153
  }
}
filter {
  if [type] == "person_get" {
    ## Parsing the JSON input with the json filter
    json {
      source => "message"
    }
    mutate {
      rename => { "#person.name" => "#person-name" }
      remove_field => [ "#person.name" ]
    }
    fingerprint {
      source => ["ResponseTimestamp"]
      target => "fingerprint"
      key => "78787878"
      method => "SHA1"
      concatenate_sources => true
    }
  }
}
output {
  if [type] == "person_get" {
    elasticsearch {
      index => "logstash-person_v1"
      hosts => ["xxx.xxx.xx:9200"]
      document_id => "%{fingerprint}" # !!! prevent duplication
    }
    stdout {
      codec => rubydebug
    }
  }
}
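The MapperParsingException comes from Elasticsearch 2.x, which does not accept dots in field names. A possible workaround (a sketch, assuming the logstash-filter-de_dot plugin is installed via bin/logstash-plugin install logstash-filter-de_dot) is to let the de_dot filter rewrite the dotted field name right after the json filter, instead of the mutate rename:
filter {
  json {
    source => "message"
  }
  # rewrites the listed field names, e.g. #person.name becomes #person_name
  de_dot {
    fields => ["#person.name"]
    separator => "_"
  }
}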

logstash mail to multiple recipients

In my Logstash conf filter, I assigned multiple email IDs to a variable (variable name = targetmailid), and in the output I am trying to use that variable. However, the email IDs are not being assigned to the CC field in the output. Please suggest.
filter {
  kv {
    field_split => ","
    value_split => ":"
  }
  if [ref_type] =~ /tag/ {
    ruby {
      code => "tag = event['ref']
               targetmailid = 'testuser1@mail.com,testuser2@mail.com,testuser3@mail.com'"
    }
  }
}
output {
  if "tagcreate" in [tags] {
    email {
      body => "test message"
      from => "admin@emil.com"
      to => "admin2@email.com"
      cc => "targetmailid"
      subject => "test mail"
      options => {
        smtpIporHost => "smtp"
        port => 25
      }
    }
  }
}
You need to use the sprintf format %{...} like this:
email {
  body => "test message"
  from => "admin@emil.com"
  to => "admin2@email.com"
  cc => "%{targetmailid}"   # <-- modify this
  subject => "test mail"
  options => {
    smtpIporHost => "smtp"
    port => 25
  }
}
UPDATE
Also make sure to modify the following part:
if [ref_type] =~ /tag/ {
  ruby {
    code => "event['targetmailid'] = 'testuser1@mail.com,testuser2@mail.com,testuser3@mail.com'"
  }
} else {
  mutate {
    add_field => { "targetmailid" => "" }
  }
}
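Note that the event['targetmailid'] syntax only works on older Logstash versions; from Logstash 5.x onwards the ruby filter has to go through the event API instead, for example:
ruby {
  code => "event.set('targetmailid', 'testuser1@mail.com,testuser2@mail.com,testuser3@mail.com')"
}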
