Integrating apache-kafka client with java application as log appender - log4j

Is anyone aware of an open-source Log4j appender implementation that can write log events to Kafka?

There is a KafkaLog4jAppender already included in the Kafka project.
You'll find the API docs for it here: http://people.apache.org/~joestein/kafka-0.7.1-incubating-docs/kafka/producer/KafkaLog4jAppender.html (for 0.7.1)
and the corresponding source code here: https://github.com/apache/kafka/blob/0.7.1/core/src/main/scala/kafka/producer/KafkaLog4jAppender.scala
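As a sketch, the appender can be wired up in log4j.properties. Property names differ between Kafka versions (0.7.x used a ZooKeeper connect string, later versions a broker list), so treat the names below as assumptions to check against the docs for your Kafka version:

```properties
# Hypothetical log4j.properties sketch -- verify property names for your Kafka version
log4j.rootLogger=INFO, KAFKA

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
# 0.7.x style: ZooKeeper connect string, target topic, and serializer
log4j.appender.KAFKA.zkConnect=localhost:2181
log4j.appender.KAFKA.Topic=my-app-logs
log4j.appender.KAFKA.SerializerClass=kafka.serializer.StringEncoder
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d %p %c - %m%n
```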

Related

ActiveMQ 5.16.2 Log4J

I tried to search for Log4j security issues affecting ActiveMQ 5.16.2 but could not find any information.
Does ActiveMQ version 5.16.2 have the Log4j security problems?
If so, what should be done to close this gap?
You might review the ActiveMQ users mailing list to stay up-to-date on the latest project activity. All official support and development activities are discussed and documented on the mailing lists. This is standard practice for all Apache projects.
There's also a "News" section of the ActiveMQ website that may have relevant information like this update regarding CVE-2021-44228.

Replacement of log4j with slf4j

I have implemented log4j in my project for logging, but as we all know it is slower than SLF4J. That's why I want to upgrade to SLF4J. How can I replace log4j with SLF4J?
I have created my own framework where I implemented log4j throughout the whole project. I want to replace that with SLF4J, but I am not able to find a proper method.
This answer is a bit too long to post as a comment, so I am posting it as an answer.
Extending @sazzad's answer: SLF4J is a logging facade, and it requires an underlying logging implementation such as log4j, Log4j 2, Logback, or commons-logging.
So, which logging implementation are you planning to use?
If you are planning to use log4j itself as the underlying implementation, then you need the slf4j-log4j12 jar in your application, and that's it. (Make sure not to use both slf4j-log4j12 and log4j-over-slf4j at the same time, as that causes an infinite loop.)
If you are planning to use another implementation such as Logback, then you need the log4j-over-slf4j jar plus the binding jar for that implementation. See the SLF4J documentation on bridging legacy APIs.
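For the first route (SLF4J facade over log4j 1.2), the Maven wiring might look like this; the version numbers below are placeholders, so substitute current ones:

```xml
<!-- Sketch: SLF4J facade with the log4j 1.2 binding; versions are placeholders -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.7.36</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.7.36</version>
</dependency>
<!-- Do NOT also add log4j-over-slf4j here: combined with slf4j-log4j12 it loops -->
```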

Logging in common codes for both flink(slf4j) and spark(log4j) platform

I am writing some code that is supposed to run (as a jar) on both the Flink and Spark platforms. However, these two platforms use different logging APIs. (Flink uses log4j as the logging framework, but SLF4J as the API.) In this case, what is the best practice for logging in the common code?
I tried the Log4j 2 API in this common code, but it cannot log anything in Flink.
My idea now would be to obtain the logging context with the log4j API from the SLF4J context (which is already launched by Flink). Is that correct?
Thanks
Definitely a safe way to go would be to use SLF4J from a shared common library.
Since SLF4J is a logging facade, you don't have to force your users to use the same logging framework you're using in your library. See the user manual on this point:
Authors of widely-distributed components and libraries may code against the SLF4J interface in order to avoid imposing a logging framework on their end-user. Thus, the end-user may choose the desired logging framework at deployment time by inserting the corresponding slf4j binding on the classpath, which may be changed later by replacing an existing binding with another on the class path and restarting the application. This approach has proven to be simple and very robust.
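The facade idea the manual describes can be illustrated with plain Java (no SLF4J jars involved; the interface and class names below are made up for illustration): the shared library talks only to an interface, and the "binding" is chosen by whoever assembles the application, whether that is Flink, Spark, or anything else.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

// A tiny stand-in for a logging facade such as SLF4J (illustrative only).
interface LogFacade {
    void info(String msg);
}

// One possible "binding", analogous to slf4j-log4j12 or logback-classic;
// this one simply records messages in memory.
class ListBinding implements LogFacade {
    final List<String> messages = new ArrayList<>();

    @Override
    public void info(String msg) {
        messages.add(msg);
    }
}

// Shared library code: it depends only on the facade, never on a concrete
// framework, so each host platform can supply whatever binding it likes.
class SharedLibrary {
    private final LogFacade log;

    SharedLibrary(LogFacade log) {
        this.log = log;
    }

    String process(String input) {
        log.info("processing " + input);
        return input.toUpperCase(Locale.ROOT);
    }
}

public class FacadeDemo {
    public static void main(String[] args) {
        ListBinding binding = new ListBinding();
        SharedLibrary lib = new SharedLibrary(binding);
        System.out.println(lib.process("flink"));    // prints FLINK
        System.out.println(binding.messages.get(0)); // prints "processing flink"
    }
}
```

The point of the sketch is the dependency direction: `SharedLibrary` never names `ListBinding`, just as library code written against `org.slf4j.Logger` never names log4j or Logback.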

Annotation configuration for kafka High level consumer

I wonder if there's a way to configure the kafka-integration Inbound Channel Adapter explained at https://github.com/spring-projects/spring-integration-kafka using Java annotations instead of XML; I don't quite get a clue on it. I was able to configure the Message Driven Channel Adapter, but now I need one that doesn't re-read consumed messages when the server is restarted.
See the main Spring Integration Documentation for configuring with annotations; also see the Java DSL.
We are adding more and more Java configuration examples; see here for an example.
The Kafka high-level message source would be configured as an @InboundChannelAdapter @Bean.
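As a rough sketch of that pattern (not tested against a specific spring-integration-kafka release; the concrete message-source class and its wiring vary by version, so the helper method here is a placeholder, not a real API):

```java
// Sketch only: the generic @InboundChannelAdapter @Bean pattern from Spring Integration.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.InboundChannelAdapter;
import org.springframework.integration.annotation.Poller;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.core.MessageSource;

@Configuration
@EnableIntegration
public class KafkaInboundConfig {

    @Bean
    @InboundChannelAdapter(value = "fromKafka", poller = @Poller(fixedDelay = "1000"))
    public MessageSource<?> kafkaSource() {
        // Build the version-specific high-level consumer message source here.
        // Committing consumed offsets (e.g. via auto-commit) is what prevents
        // re-reading already-consumed messages after a restart.
        return buildKafkaMessageSource(); // hypothetical helper, not a real API
    }

    private MessageSource<?> buildKafkaMessageSource() {
        throw new UnsupportedOperationException("wire up your version-specific source");
    }
}
```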

Couchdb and log4j

So I am thinking about making the change to CouchDB. I googled it but couldn't find any documentation saying it supports log4j. What I am trying to do is have a logger send logs to a database and have it store them. From there I want to be able to retrieve the logs and display them. Does CouchDB support this?
First off, you need to learn about CouchApp and install CouchLog.
Second, you will have to write your own log4j appender to send the log entries in the format that CouchLog expects.
I hacked together a log4j appender that did this a long time ago; the code is lost, so don't ask for it. But it only took about 30 minutes of coding to get working.
There is a NoSQL appender that currently supports CouchDB and MongoDB:
NoSQLAppender
The NoSQLAppender writes log events to a NoSQL database using an internal lightweight provider interface. Provider implementations currently exist for MongoDB and Apache CouchDB, and writing a custom provider is quite simple.
(source: https://logging.apache.org/log4j/2.x/manual/appenders.html)
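A Log4j 2 configuration using that appender with the CouchDB provider might look like the following (shaped after the example in the Log4j 2 appenders manual; the server, database, and credential values are placeholders, and the CouchDB provider module must be on the classpath, so verify against the manual for your Log4j 2 version):

```xml
<Configuration status="error">
  <Appenders>
    <!-- NoSql appender with the CouchDB provider; connection values are placeholders -->
    <NoSql name="couchdbAppender">
      <CouchDb databaseName="applicationDb" protocol="https"
               server="couch.example.org"
               username="loggingUser" password="abc123" />
    </NoSql>
  </Appenders>
  <Loggers>
    <Root level="warn">
      <AppenderRef ref="couchdbAppender" />
    </Root>
  </Loggers>
</Configuration>
```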
