Send log to a specific Graylog Index via Nxlog configuration - graylog2

I am currently using NXLog to send server logs to a Graylog2 server, and all of the messages are going to the default index in Graylog. I am trying to send the messages to a specific index, configurable from the NXLog conf file.

This cannot be achieved through the NXLog configuration alone. It can, however, be solved with the Streams functionality provided by Graylog (http://docs.graylog.org/en/2.4/pages/streams.html). Create a stream with a rule that identifies the input source, then route the matching messages to the index set that is selected when the stream is created.
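For reference, a minimal sketch of what the NXLog side could look like, assuming GELF over UDP via the xm_gelf extension; the file path, host, port, and the index_hint field name are placeholders (on older NXLog versions the OutputType may be GELF rather than GELF_UDP). A Graylog stream rule can then match on the extra field (it arrives as _index_hint), and the stream's index set determines which index the messages land in.

# nxlog.conf (sketch)
<Extension gelf>
    Module  xm_gelf
</Extension>

<Input app_log>
    Module  im_file
    File    "/var/log/myapp/app.log"
</Input>

<Output graylog>
    Module      om_udp
    Host        graylog.example.com
    Port        12201
    OutputType  GELF_UDP
    # extra field for the Graylog stream rule to match on
    Exec        $index_hint = "myapp";
</Output>

<Route to_graylog>
    Path app_log => graylog
</Route>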

Related

Sending Python App logs directly to SOLR

I am currently using Python logging with a file handler to write the logs. However, I was wondering if there is a way to send the logs straight to an external system such as SOLR? I see there is a way of sending them to Logstash, but I would prefer to send them straight to SOLR. Is there any way to do that?
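Not an answer from the thread, just a sketch of one possible approach: a custom logging.Handler that posts each record to Solr's JSON update endpoint. The Solr URL, the core name ("logs"), and the document fields are assumptions, and committing on every record is only for illustration - in practice you would batch the documents or rely on Solr's autoCommit.

import json
import logging
import urllib.request

class SolrHandler(logging.Handler):
    """Posts each log record as a document to a Solr core."""

    def __init__(self, solr_url="http://localhost:8983/solr/logs"):
        super().__init__()
        # /update/json/docs accepts arbitrary JSON documents (Solr 5+)
        self.update_url = solr_url.rstrip("/") + "/update/json/docs?commit=true"

    def emit(self, record):
        doc = {
            "message": self.format(record),
            "level": record.levelname,
            "logger": record.name,
            "timestamp": record.created,
        }
        data = json.dumps(doc).encode("utf-8")
        req = urllib.request.Request(
            self.update_url, data=data,
            headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(req, timeout=5)
        except Exception:
            self.handleError(record)

logger = logging.getLogger("myapp")
logger.addHandler(SolrHandler())
logger.setLevel(logging.INFO)
logger.info("hello solr")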

Logstash not sending data to Elasticsearch

I have a Logstash config which reads nginx logs and sends the data to Elasticsearch.
The log file contains requests to two endpoints, /index and /info. Logstash parses the /index entries and sends the data to ES, but it doesn't do the same for the /info entries.
The issue is that when I run Logstash in the foreground using /opt/logstash/bin/logstash to debug the problem, it is able to parse both endpoints and send data to ES, whereas as a daemon it only does so for the /index endpoint.
I don't understand what I'm doing wrong here; can anybody please point me in the right direction?

How can I send log4j logs to an arbitrary program listening on a socket

I am using log4j 1.2.
How can I send log4j logs to an arbitrary program listening on a socket? I tried the following options:
SocketAppender - it expects a SocketNode to be listening on the port.
TelnetAppender - but it sends logs to a read-only port.
What I am really looking for is to send log4j logs to Flume. I know that log4j 2.x has a FlumeAppender, but I am not sure whether it works with log4j 1.2.
If Flume runs on the same machine where the log4j logs are stored, then there is no need to send the logs to Flume; instead, configure Flume to read those logs directly. For that, try configuring an Exec source with a tail command. tail will print the log lines one by one (I guess Flume somehow redirects the stdout to an internal file descriptor or something like that) and Flume will receive those lines as input data.
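A minimal sketch of such an Exec source, with an assumed agent name (agent1) and log path; the logger sink is only there to make the example self-contained and would be replaced by whatever sink you actually need:

# flume.conf (sketch)
agent1.sources = taillog
agent1.channels = mem
agent1.sinks = out

agent1.sources.taillog.type = exec
# tail -F keeps following the file across log rotation
agent1.sources.taillog.command = tail -F /var/log/myapp/app.log
agent1.sources.taillog.channels = mem

agent1.channels.mem.type = memory
agent1.channels.mem.capacity = 10000

agent1.sinks.out.type = logger
agent1.sinks.out.channel = mem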
I found that org.apache.flume.clients.log4jappender.Log4jAppender uses Avro to send logs to a Flume agent running locally on the machine.
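If you go that route, a hedged log4j 1.2 properties sketch would look like the following; it assumes flume-ng-log4jappender (and its dependencies) is on the application classpath and that a Flume Avro source is listening on the given host and port, which are placeholders:

log4j.rootLogger = INFO, flume
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
# keep logging locally even if the Flume agent is unreachable
log4j.appender.flume.UnsafeMode = true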

How do you change the logging level in Apache Tomcat and reduce the size of catalina.out

I am using the Pusher chat application.
In Pusher I am using webhook client events for chatting over a presence channel.
The problem is that whenever users are chatting, my webhook API gets called and log lines are written just as frequently. I want to stop these logs: because of them my catalina.out file keeps growing, and my server's disk usage grows proportionately.
To quiet httpclient I have used the line below in my log4j properties file:
log4j.logger.httpclient=WARN
In the same way, I want to know the solution for Pusher.
Thanks.
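(Not from the thread, but the same pattern applies: raise the level of whatever logger name actually appears in the noisy catalina.out lines. The package name below is only an assumption - check the logger names printed in your log and adjust it.)

# assumed package name for the Pusher client library - verify against your log output
log4j.logger.com.pusher=WARN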

Can logstash read directly from remote log?

I am new to Logstash and have been reading about it for a couple of days. Like most people, I am trying to set up a centralized logging system, store the data in Elasticsearch, and later use Kibana to visualize it. My application is deployed on many servers, so I need to fetch logs from all of them. Installing logstash-forwarder on all those machines and configuring them seems to be a very tedious task (I will do it if this is the only way). Is there a way for Logstash to access those logs by mentioning the URL to the logs somewhere in the conf file, instead of logstash-forwarder shipping them to Logstash? FYI, my application is deployed on Tomcat and the logs are accessible via URL http://:8080/application_name/error.log.
Not directly, but there are a few close workarounds - the idea is to create a program/script that uses curl (or its equivalent) to effectively perform a "tail -f" of the remote log file, and then run that output into Logstash.
Here's a bash script that does the trick:
url-tail.sh
This bash script monitors a URL for changes and prints its tail to standard output. It acts like the Linux "tail -f" command. It can be helpful for tailing logs that are accessible over HTTP.
https://github.com/maksim07/url-tail
Another similar one is here:
https://gist.github.com/habibutsu/5420781
There are others out there, written in PHP or Java: Tail a text file on a web server via HTTP
Once you have that running, the question is how to get it into Logstash - you could:
Pipe it into stdin and use the stdin input
Have it append to a file and use the file input
Use the pipe input to run the command itself and read from the stdout of the command
The devil is in the details though, particularly with Logstash, so you'll need to experiment, but this approach should work for you.
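As an illustration of the third option, a hedged Logstash config sketch, assuming the url-tail.sh script above is installed at /usr/local/bin/url-tail.sh, that app-host stands in for the Tomcat host from the question, and a Logstash 2.x-style elasticsearch output:

input {
  pipe {
    # run the tailing script and read its stdout line by line
    command => "/usr/local/bin/url-tail.sh http://app-host:8080/application_name/error.log"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}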
