Implementing server logs with splunk - log4j

Folks!
I'm trying to get my server logs into my Splunk Cloud instance. Can you please explain how to implement this? I have set up Splunk with the Universal Forwarder and my client-side logs are working fine, but how do I get the server-side logs in? I know about the log4j.properties file, but what do I write in it (and in the other files) so that the server logs show up on the Splunk side as well?
If you could help in simple terms, that would be helpful.
Thank you so much!

I'm not sure I totally understand your question. Anyway, I think our Java Logging Libraries may be helpful for you - we support Log4j.

You can log server logs the same way you are logging client logs: write them to a file on the server and add that file's path to inputs.conf on the Universal Forwarder.
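For example, a minimal sketch assuming log4j 1.x, with the file path, index and sourcetype as placeholders you'd adapt to your setup:

# log4j.properties on the server - write server logs to a rolling file
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/myapp/server.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} [%t] %-5p %c - %m%n

# inputs.conf on the Universal Forwarder, e.g. $SPLUNK_HOME/etc/system/local/inputs.conf
[monitor:///var/log/myapp/server.log]
sourcetype = log4j
index = main

Restart the forwarder after editing inputs.conf so it picks up the new monitor stanza.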

Related

How to integrate logstash in nodejs application?

I'm working on a Node application and my main goal is to maintain the logs (error, info) of the backend part in Logstash so that I can do some analysis of which API is breaking and why. I'm new to Logstash and I have read some basics of Logstash and the Elastic Stack. I want to achieve the following:
Integrate Logstash to maintain the logs.
Read the logs to analyse the breaking changes.
I don't want to integrate Elasticsearch and Kibana. I tried winston-logstash but it's not working, and that library's source code is not maintainable either. If anyone knows how to implement the above in a Node.js application, please let me know.
If your Node.js app runs as a Docker container, you can use the gelf logging driver and just log to console/stdout in Node.js; it will get routed to Logstash.
Keep in mind Logstash is really just for transformation/enrichment/filtering/etc.; you still probably want to output the log events (from Logstash) to an underlying logging solution, e.g. Elasticsearch.
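A minimal sketch of that setup, with the hostname, port and output path as placeholders: point the container's gelf driver at Logstash, and have a Logstash pipeline receive GELF and write to plain files (since you don't want Elasticsearch/Kibana):

docker run --log-driver=gelf --log-opt gelf-address=udp://logstash.example.com:12201 my-node-app

# logstash.conf
input {
  gelf {
    # must match the port in gelf-address above
    port => 12201
  }
}
output {
  file {
    # one plain log file per day
    path => "/var/log/app/%{+YYYY-MM-dd}-app.log"
  }
}

Inside that pipeline you can add filter blocks (grok, json, etc.) to pull out the API name and error details you want to analyse.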

How to read multiple servers log files in graylog?

In our application we want to read logs from two different servers, i.e. Apache Tomcat and JBoss, and monitor them. I have tried searching online for how to configure this, but I'm not able to understand clearly how I can implement it in Graylog. Please help. Thank you.
You can send logs from an arbitrary number of applications and systems to Graylog (even on the same input).
Simply configure your applications and systems to send logs to Graylog and create an appropriate input for them.
See http://docs.graylog.org/en/2.1/pages/sending_data.html for some hints.
Hope you were able to send your logs to the Graylog server. The article Centralized logging using Graylog will help newbies get started with Graylog; it explains use cases like integrating Apache, Nginx, and MySQL slow-query logs into Graylog, and covers the various ways of sending logs (via syslog, the Graylog Apache module, Filebeat, etc.) that most articles miss out on explaining in detail.
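As one concrete option, a Filebeat sketch that ships both servers' log files to a Graylog Beats input. The paths, host and port are placeholders, and older Filebeat versions use filebeat.prospectors instead of filebeat.inputs:

# filebeat.yml
filebeat.inputs:
  - type: log
    paths:
      - /opt/tomcat/logs/catalina.out
  - type: log
    paths:
      - /opt/jboss/standalone/log/server.log

output.logstash:
  # point this at a Beats input created in Graylog
  hosts: ["graylog.example.com:5044"]

Create the Beats input in Graylog on that port first; once events arrive you can separate the two servers with streams or by the source field.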

logging to ELK stack from karaf

I've been working on getting an ELK stack set up so our logs are centralized and easier to check, but I'm running into a bit of a snag.
I've modified a few of our java programs to use the socket appender from log4j and it's worked great each time. Now I'm trying to add it to karaf to have all of our karaf logs recorded but it doesn't seem to be working.
I added:
log4j.rootLogger=INFO, logstash, osgi:*
# Logstash appender
log4j.appender.logstash=org.apache.log4j.net.SocketAppender
log4j.appender.logstash.Port=PORT
log4j.appender.logstash.RemoteHost=HOST
log4j.appender.logstash.ReconnectionDelay=10000
to {karaf_home}/etc/org.ops4j.pax.logging.cfg (with the correct port/host, obviously) and then restarted Karaf just to make sure (something I read said it would pick up changes automatically, but I didn't know if I trusted that, so I restarted anyway). Still, nothing seems to be making it from Karaf to our ELK stack. When I do log:display on the Karaf console I see plenty of messages being written to the log, but none of them show up in ELK.
Any clue as to why this may not be working for karaf, but is working for other projects using the same appender?
You should have a look at Karaf Decanter. It already contains connectors that can be used to send logs to an ELK stack; decanter-collector-log is probably what you are looking for.
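A rough sketch of getting Decanter going from the Karaf console (the version number and feature names below are examples; check the Decanter documentation for the ones matching your Karaf release):

karaf@root()> feature:repo-add mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.4.0/xml/features
karaf@root()> feature:install decanter-collector-log
karaf@root()> feature:install decanter-appender-elasticsearch

The log collector hooks into the events pax-logging is already producing, so in that setup the SocketAppender entries above shouldn't be needed at all.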

Where are the application logs for node.js on AWS OpsWorks

This is probably something really easy, but I can't find where Node.js is logging on AWS OpsWorks. I ssh into an instance and confirm node is running and listening on port 80, yet the usual /var/log/nodejs directory does not exist, and the log directory that seems to have been created in my application root is empty. Any help appreciated.
This is something I experienced recently. The Monit configuration for Node.js apps doesn't send the logs anywhere, so you can't access them.
A similar thread explains the problem and how you can fix it:
Node.js OpsWorks Layer console logs
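The short version of that thread: the Monit stanza OpsWorks generates starts node without redirecting stdout/stderr, so the output is simply dropped. A sketch of the kind of redirect you end up adding - the file name and every path below are assumptions, adjust them to your instance:

# e.g. /etc/monit.d/node_web_app.monitrc (hypothetical name/paths)
check process node_web_app with pidfile /var/run/node_web_app.pid
  start program = "/bin/bash -c 'cd /srv/www/my_app/current; node server.js >> /var/log/nodejs/my_app.log 2>&1 & echo $! > /var/run/node_web_app.pid'"
  stop program = "/bin/bash -c 'kill $(cat /var/run/node_web_app.pid)'"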

how to send linux log files to central server and look at them through web interface?

I have a couple of linux servers and logrotate and rsyslog are taking care of all log files. Now I was wondering whether the following is possible:
keep log files locally (present)
send log events to a centralized server (should be possible with logrotate, right?)
make log events on centralized server browse & searchable
So here are my questions:
How do I set up logrotate and rsyslog (on 'client' and 'server') to accomplish this configuration? (A sketch for the rsyslog part is included after the edit below.)
Does someone know of a good (open-source) web interface that would work with this setup?
EDIT:
It seems like what I want to accomplish exists for syslog-ng: http://www.debianhelp.co.uk/syslog-ng.htm
There are tons of log analysers with a web interface, for example:
http://www.xpolog.com/
http://www.splunk.com
etc.
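For the rsyslog part (logrotate only rotates files; the forwarding itself is rsyslog's job), a minimal client/server sketch with the hostname and storage path as placeholders:

# on each client, e.g. /etc/rsyslog.d/forward.conf
# @@ means TCP, a single @ means UDP
*.* @@logserver.example.com:514

# on the central server, in /etc/rsyslog.conf
$ModLoad imtcp
$InputTCPServerRun 514
# write each sending host's events into its own directory
$template PerHost,"/var/log/remote/%HOSTNAME%/syslog.log"
*.* ?PerHost

Local files stay where they are, and logrotate keeps handling rotation of both the local and the centralized copies.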
