Logstash windows logs - logstash

I am having trouble finding the logs for Logstash. I configured my Windows servers to forward logs via NXLog to rsyslog on my Linux machine, but now I don't know where the logs are stored. I have looked in the /var/log/ directory but nothing is there.
Although I am receiving the logs from my Windows hosts in Kibana, can anyone please help me? Also, my hosts are showing up under both their FQDN and NetBIOS name. I cannot attach an image as I do not have enough reputation, so can someone please assist me?
Thanks

When you started Logstash, which config file did you use (the file specified after the -f flag)?
In that .conf file there is an input {} section that shows the file path (path => file/path/for/logs) that Logstash uses to look for logs.
Alternatively, you may be sending the data received over TCP directly to Elasticsearch. You can query that using curl (or a web browser). It should be something like:
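For reference, a minimal input section of that kind might look like the sketch below — the file path and port here are placeholders, not your actual values:

```conf
input {
  # read local log files; the path is a placeholder
  file {
    path => "/var/log/mylogs/*.log"
  }
  # or receive syslog forwarded from nxlog/rsyslog; the port is a placeholder
  tcp {
    port => 5514
    type => "syslog"
  }
}
```

If your config only has network inputs (tcp/udp/syslog) and no file output, nothing will ever appear under /var/log/ — the events go straight to whatever is in the output {} section.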
curl -XGET 'http://localhost:9200/_search?pretty'

Related

Owncloud: Log-Files for login-attempts

Does Owncloud log successful/failed login attempts to any files? I would like to ingest these into Splunk to analyze potential attacks against our system.
Thanks a lot in advance!
You should check out the App and Add-On for Nextcloud on Splunkbase:
App for Nextcloud
Add-On for Nextcloud
From the Splunkbase page:
Successful and failed logins and ratio of the same
Installation docs: https://intranet.graabek.com/cloud/index.php/s/Lc9oXkaWNmQHBqG#pdfviewer
Splunk Pro Tip™: check Splunkbase to see if your data source of choice already has an app/add-on available :)
You can generate a log file in Owncloud by following the steps at https://doc.owncloud.com/server/admin_manual/troubleshooting/providing_logs_and_config_files.html . However, this is most likely going to be formatted for human use, not for processing by Splunk.
I suggest you look at the syslog logging option described at https://doc.owncloud.com/server/admin_manual/configuration/server/logging/logging_configuration.html and send that syslog data to Splunk. You can configure Splunk for syslog listening at https://docs.splunk.com/Documentation/Splunk/latest/Data/Monitornetworkports
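As a sketch, the syslog option is switched on in Owncloud's config/config.php roughly like this — treat the tag and loglevel values as assumptions and check the logging documentation above for the options valid in your version:

```php
<?php
// config/config.php (fragment)
$CONFIG = array (
  // ... your existing settings ...
  'log_type'   => 'syslog',    // write log entries to syslog instead of owncloud.log
  'syslog_tag' => 'ownCloud',  // tag prepended to the syslog messages (assumed default)
  'loglevel'   => 1,           // lower values log more; verify which level covers login events
);
```

Once entries land in syslog, your existing syslog-to-Splunk pipeline picks them up with no Owncloud-specific work.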

OSSEC server or wazuh server to Logstash to Qradar pipeline

In my present lab setup I have a few Windows and Linux machines with the OSSEC agent installed, sending logs to an OSSEC server.
From the OSSEC server I am forwarding the logs via syslog output to Logstash.
In Logstash I am not doing any modification; I simply forward the plain log to QRadar as received (I verified this). It has the alert level, rule and event, but QRadar shows a single log source, which is the Logstash server.
From Logstash I send the logs as syslog to QRadar.
Ideally, all machines sending logs to OSSEC should be listed as log sources in QRadar, but that is not happening.
What am I doing wrong here? Any help is appreciated. I followed this link https://www.ibm.com/support/knowledgecenter/en/SS42VS_DSM/t_DSM_guide_OSSEC_cfg.html, except that instead of sending logs directly to QRadar I placed Logstash in between.
I do not see anything wrong. If you have Logstash between your devices and QRadar, then the only log source QRadar knows about is your Logstash server — it is the only service sending data to it.
If you want to see your OSSEC devices listed as log sources in QRadar, I think you will need to ship the logs directly to QRadar.
Edit: I do not know QRadar very well, but if it is possible to use tags or custom fields to identify a log source, maybe you can add a custom field in your Logstash pipeline and QRadar could use this field to know that the log source is not your Logstash server but another device.
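A minimal sketch of such a custom field in a Logstash filter — the field name and the assumption that the original sender's hostname is available in the host field are both placeholders you would adjust to whatever QRadar can actually parse:

```conf
filter {
  # copy the hostname parsed from the incoming syslog event into a
  # custom field so the original sender survives the relay hop
  mutate {
    add_field => { "original_host" => "%{host}" }
  }
}
```

Whether QRadar maps that field back to a per-device log source depends on its DSM/parsing configuration, so verify on the QRadar side.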

Logs and ELK stack on separate servers, how to display the logs?

I have logs on one server and the ELK stack -- Elasticsearch, Kibana, Logstash -- on another. I want to display the logs, which are on the 1st server, in Kibana, which is on the 2nd one.
How can I do that? Is logstash capable of retrieving logs over the internet from a remote server?
Your best bet is to use Filebeat on server one and send the logs to the logstash server.
https://www.elastic.co/products/beats/filebeat
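A minimal filebeat.yml for this setup might look like the following — the log paths and the Logstash host/port are placeholders, and the exact keys depend on your Filebeat version:

```yaml
filebeat.inputs:
  - type: log                     # tail plain log files on server one
    paths:
      - /var/log/myapp/*.log      # placeholder path

output.logstash:
  hosts: ["logstash.example.com:5044"]  # placeholder host; 5044 is the conventional Beats port
```

On the Logstash side you would pair this with a matching beats input, e.g. `input { beats { port => 5044 } }`.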
Logstash can't access remote files; you'll have to use a shipper, like Beaver, with a UDP or TCP transport:
Beaver documentation.

Collectd not pushing data to Logstash

My collectd setup is not pushing data to Logstash, and I am not sure what the problem is.
I ran tcpdump on my collectd server, and it is not even sending any requests, so I think the problem may be with collectd. Does anyone have an idea of what is wrong here?
Note: there is no block in the server firewall.
I don't see anything immediately wrong with your collectd config, but try setting your Server tag to "localhost". That way you'll be sure to bind your server to your local interface.
Change the "collectd" to collectd (without the quotes) in your sixth-to-last line.

Outputting UDP From Logstash

I have some logs that I want to use Logstash to collate. It will be the Logstash agent as a shipper on the source servers, sending to a central logging server. I want to use UDP from the shipper to the central log server so I am totally isolated should the logger fail (I don't want the 70-odd production servers affected in any way). The only way I can see to use UDP for transport is to output using syslog format. Does anyone know a way to output UDP natively from Logstash? (The docs hint at it, but I must be missing something on the TCP side?)
My understanding is that the typical approach would be to use something a bit more lightweight (rsyslog, or even Lumberjack) running on the remote machine, that is sending data to your central logstash server.
It sounds like you want Logstash agents (forwarders) to be running on each server and sending the data to the broker. This blog post (also linked as an example below) gives a nice explanation of why they chose not to use Logstash forwarders on their remote servers: they eat too much RAM.
I use a setup where rsyslog sends UDP data to logstash directly. I've also used a setup where rsyslog was the receiving log server, and aggregated the logs into separate logfiles (based on server hostname) and then logstash read from those files.
For some example configs see the following:
Logstash with Lumberjack and ES
Logstash with rsyslog and ES
I suggest the rsyslog approach, as the chances that rsyslog is already running on your server is higher than Lumberjack. Also there is quite a bit more documentation on rsyslog than Lumberjack. Start off simple, and add more complexity later on, if you feel the need.
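For the rsyslog approach, forwarding everything over UDP to the central Logstash server is essentially a one-line rule — the hostname and port below are placeholders:

```conf
# /etc/rsyslog.d/50-forward.conf
# a single @ means UDP; @@ would mean TCP
*.* @logstash.example.com:5514
```

On the Logstash side, a matching `input { udp { port => 5514 type => "syslog" } }` would receive these messages.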
You can output UDP natively using the UDP output plugin
http://logstash.net/docs/1.2.1/outputs/udp
By setting the codec field, you can choose whatever output format you like (e.g. JSON)
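A minimal sketch of that output section — the host, port and codec here are placeholder choices:

```conf
output {
  udp {
    host => "central-logger.example.com"  # placeholder central server
    port => 5514                          # placeholder port
    codec => json                         # pick whatever codec your receiver expects
  }
}
```

Since UDP is fire-and-forget, this also gives you the isolation you asked for: the shippers keep running even if the central logger is down, at the cost of silently dropped events.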
