I want to try the ELK stack independently instead of using the JHipster Console.
I want to understand how a Spring Boot microservice is integrated with Logstash.
As far as I can see, only the jhipster.logstash enabled flag is set in the application.yml file. Which Java file in the JHipster framework integrates the application logs with Logstash?
It's LoggingConfiguration.java, and it uses the logstash-logback-encoder appender library.
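For reference, JHipster builds the appender programmatically rather than in logback.xml. A minimal sketch of the same idea, assuming logstash-logback-encoder is on the classpath; the host/port values are illustrative, not necessarily JHipster's defaults:

```java
import ch.qos.logback.classic.AsyncAppender;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import net.logstash.logback.appender.LogstashTcpSocketAppender;
import net.logstash.logback.encoder.LogstashEncoder;
import org.slf4j.LoggerFactory;

public class LogstashSetup {

    public static void addLogstashAppender() {
        LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();

        // TCP appender that ships JSON log events to Logstash
        LogstashTcpSocketAppender appender = new LogstashTcpSocketAppender();
        appender.setName("LOGSTASH");
        appender.setContext(context);
        appender.addDestination("localhost:5000"); // assumed Logstash TCP input

        LogstashEncoder encoder = new LogstashEncoder();
        encoder.setContext(context);
        encoder.start();
        appender.setEncoder(encoder);
        appender.start();

        // Wrap in an async appender so logging never blocks request threads
        AsyncAppender asyncAppender = new AsyncAppender();
        asyncAppender.setContext(context);
        asyncAppender.addAppender(appender);
        asyncAppender.start();

        context.getLogger(Logger.ROOT_LOGGER_NAME).addAppender(asyncAppender);
    }
}
```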
Is it possible to run a Spring Boot REST API service on top of Node.js instead of Tomcat,
or, if not Node.js, which other servers can we run our Spring Boot REST application on?
Please help me figure it out.
Node.js is a server runtime for JavaScript code. It cannot run a Java web application, which needs a JVM (Java Virtual Machine). Before you ask: no, Node.js cannot run a JVM either; it is just not made for that. To understand what I mean, it's like wanting a car to run with an outboard engine... definitely not possible.
So no, you can't run your Spring Boot REST application on top of a Node.js server.
If you don't want to use Tomcat, there are other options for you:
https://blog.idrsolutions.com/2015/04/top-10-open-source-java-and-javaee-application-servers/
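Also, if you stay with Spring Boot's embedded-server model, you don't need a separate server at all: you can swap the embedded Tomcat for another supported servlet container such as Jetty or Undertow in the build file. A minimal Maven sketch using the standard Spring Boot starters:

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <exclusions>
        <!-- drop the default embedded Tomcat -->
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-tomcat</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- use embedded Jetty instead -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jetty</artifactId>
</dependency>
```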
I'm working on a Node application, and my main goal is to maintain the logs (error, info) of the backend part in Logstash so that I can analyze which API is breaking and why. I'm new to Logstash, and I have read some basics of Logstash and the Elastic Stack. I want to achieve the following:
Integrate Logstash to maintain the logs.
Read the logs to analyze the breaking changes.
I don't want to integrate Elasticsearch and Kibana. I tried winston-logstash, but it's not working, and that library's source code isn't maintainable either. If anyone knows how to implement the above in a Node.js application, please let me know.
If your Node.js app runs as a Docker container, you can use the gelf logging driver and then just log to the console/stdout in Node.js, and it will get routed to Logstash.
Keep in mind that Logstash is really just for transformation/enrichment/filtering/etc. You still probably want to output the log events (from Logstash) to an underlying logging store, e.g. Elasticsearch.
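A minimal sketch of that setup, assuming a Logstash gelf input listening on UDP port 12201; the hostname, port, and image name are placeholders:

```
# Run the Node.js container with the gelf log driver pointing at Logstash
docker run --log-driver gelf \
  --log-opt gelf-address=udp://logstash-host:12201 \
  my-node-app
```

And the matching Logstash pipeline; stdout is just for verification, swap it for your real output later:

```
input {
  gelf {
    port => 12201
  }
}
output {
  stdout { codec => rubydebug }
}
```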
I've been asked to configure an ELK stack in order to manage logs from several applications of a certain client. I have the stack hosted and working on a Red Hat 7 server (followed this cookbook), and a test instance in a virtual machine with Ubuntu 16.04 (followed this other cookbook), but I've hit a roadblock and cannot seem to get through it. Kibana is rather new for me, and maybe I don't fully understand the way it works. In addition, the client's most important application is a JHipster-managed application, another tool I am not familiar with.
Up until now, everything I've found about JHipster and Logstash tells me to install the full ELK stack using Docker (which I haven't, and would rather avoid in order to keep the configuration I've already made). Kibana deployed through that method comes with a preconfigured dashboard tuned for displaying the information that the application sends with its native configuration, activated in application.yml via logstash: enabled: true.
So... my questions would be: Can I get that preconfigured JHipster dashboard imported into my preexisting Kibana deployment? Where is the data logged by the application stored? Can I expect a human-readable format? Is there any other way of testing that the configuration is working, since I don't have any traffic going through the test instance into the VM?
Since that JHipster app is not the only one I care about, I want dashboards and inputs from other applications to be displayed as well, most probably using Filebeat.
Any reference to useful information is appreciated.
Yes, you can. Take a look at this repository: https://github.com/jhipster/jhipster-console/tree/master/jhipster-console
The Kibana exports (in JSON format) are stored in that repository, along with the load.sh script.
The script adds the configuration by pushing the exported objects via the API. As you can infer, your existing dashboards are not affected by this, so you can keep your existing configuration.
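In essence, the loading boils down to pushing each exported JSON object with curl. A hedged sketch, assuming a Kibana 4/5-era setup where saved objects live in the .kibana Elasticsearch index; check the actual load.sh for the exact endpoint, document types, and IDs:

```
# Hypothetical example: import one exported dashboard into the .kibana index
curl -XPUT 'http://localhost:9200/.kibana/dashboard/jhipster-dashboard' \
     -H 'Content-Type: application/json' \
     -d @jhipster-dashboard.json
```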
Some years ago we deployed several OSGi-based Spring Integration (SI) applications in Virgo. However, SI has apparently moved away from OSGi. So, in the absence of the Virgo container, what is the best way to run an SI app in production now? Say, a simple app that monitors a file-system location and loads file data into Oracle? Is it just java -jar?
Spring Boot makes it easy to create stand-alone, production-grade Spring-based applications that you can "just run" with java -jar. You can run small production applications this way, and you can consider using Spring Cloud on top of it if you need more infrastructure.
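For the file-monitoring scenario from the question, a minimal sketch, assuming spring-boot-starter-integration and spring-integration-file (SI 5.x Java DSL) on the classpath; the directory path is illustrative, and the Oracle insert is left out:

```java
import java.io.File;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.dsl.Files;

@SpringBootApplication
public class FileLoaderApplication {

    public static void main(String[] args) {
        // runs via plain `java -jar`, no external container needed
        SpringApplication.run(FileLoaderApplication.class, args);
    }

    @Bean
    public IntegrationFlow filePollingFlow() {
        return IntegrationFlows
                .from(Files.inboundAdapter(new File("/data/inbox")), // assumed inbox dir
                      e -> e.poller(Pollers.fixedDelay(5000)))       // poll every 5 s
                .handle(message -> {
                    // Here you would parse the file and write to Oracle,
                    // e.g. via JdbcTemplate; omitted in this sketch.
                    System.out.println("Picked up " + message.getPayload());
                })
                .get();
    }
}
```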
If you are looking for a container, then think about Pivotal tc Server (formerly SpringSource tc Server), an enterprise version of Apache Tomcat. It is a drop-in replacement for Apache Tomcat that's optimized for Spring.
I am going to write a J2EE application, and the application will be deployed in Tomcat.
The requirement is that the server and the application must send SNMP traps to an external NMS.
The details of my application are:
J2EE application
Deployed in Tomcat v7
The server is Red Hat Linux 6.2
We need to send traps for all three of the above (the application, Tomcat, and the Linux server).
Can we write our own agent using snmp4j for the above requirement, and how will the SNMP agent know when to send a trap to the NMS?
Thanks in advance for support.
Yes, you can. For that you need to extend the logging framework. For instance, you can use the logback framework and extend it with a custom appender that contains the snmp4j agent code and forwards each log event as a trap. Moreover, logback has a nice, easy way to format log events and to filter out logs that are not necessary, among other features. Switching Tomcat's logging to logback takes a few simple steps. However, I'm not sure you can really send a trap for arbitrary issues on the Linux server itself; I believe that would be a tedious task. You might look at a syslog server with monitoring features for that part instead.
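To illustrate the custom-appender idea, a hedged sketch, assuming logback-classic and snmp4j on the classpath; the NMS address, community string, and enterprise OID are placeholders you would replace with your real values:

```java
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;
import org.snmp4j.CommunityTarget;
import org.snmp4j.PDU;
import org.snmp4j.Snmp;
import org.snmp4j.mp.SnmpConstants;
import org.snmp4j.smi.OID;
import org.snmp4j.smi.OctetString;
import org.snmp4j.smi.TimeTicks;
import org.snmp4j.smi.UdpAddress;
import org.snmp4j.smi.VariableBinding;
import org.snmp4j.transport.DefaultUdpTransportMapping;

public class SnmpTrapAppender extends AppenderBase<ILoggingEvent> {

    private Snmp snmp;
    private CommunityTarget target;

    @Override
    public void start() {
        try {
            snmp = new Snmp(new DefaultUdpTransportMapping());
            target = new CommunityTarget();
            target.setCommunity(new OctetString("public"));            // assumed community
            target.setAddress(new UdpAddress("nms.example.com/162"));  // assumed NMS address
            target.setVersion(SnmpConstants.version2c);
            super.start();
        } catch (Exception e) {
            addError("Could not initialize SNMP transport", e);
        }
    }

    @Override
    protected void append(ILoggingEvent event) {
        if (!event.getLevel().isGreaterOrEqual(Level.ERROR)) {
            return; // only forward errors as traps
        }
        try {
            PDU pdu = new PDU();
            pdu.setType(PDU.TRAP);
            // standard v2c trap varbinds: sysUpTime and trap OID
            pdu.add(new VariableBinding(SnmpConstants.sysUpTime, new TimeTicks(0)));
            pdu.add(new VariableBinding(SnmpConstants.snmpTrapOID,
                    new OID("1.3.6.1.4.1.99999.0.1"))); // placeholder enterprise OID
            // attach the log message itself
            pdu.add(new VariableBinding(new OID("1.3.6.1.4.1.99999.1"),
                    new OctetString(event.getFormattedMessage())));
            snmp.send(pdu, target);
        } catch (Exception e) {
            addError("Failed to send SNMP trap", e);
        }
    }
}
```

You would then register the appender in logback.xml (an <appender> element with this class, attached to the root logger) so every qualifying log event produces a trap.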