Camel Log Component - java

I'm looking for a logging solution that is based on SLF4J so I can bind to any underlying implementation I want at runtime (for right now I'm thinking log4j). Since I am planning to have my backend routed via Apache Camel, I figured Camel must have some solution for logging.
It does - here.
But from that page description I can't tell if camel-log is for pushing internal (Camel) messages (errors, exceptions, infos, etc.) to SLF4J, or for me to use as a SLF4J "wrapper", or both.
Hence my question: is camel-log for enabling Camel messaging (so I can see what Camel is doing under the hood) or is it a component that pushes my application's messages onto a route? Or both?!?
Thanks in advance!

The Camel log component (http://camel.apache.org/log.html) is for logging exchanges. In recent versions of Camel it uses SLF4J, so you can choose the underlying logging implementation in the usual SLF4J way.
You can enable tracing on the CamelContext to 'see what Camel is doing under the hood'.
For your own logging you can just use SLF4J inside your code as usual.
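As a rough illustration (the route, endpoint and logger names below are made up for the example), the log: endpoint logs the exchange itself, while your own code keeps using the plain SLF4J API; both end up in whatever backend SLF4J is bound to at runtime, e.g. log4j:

    import org.apache.camel.builder.RouteBuilder;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class OrderRoute extends RouteBuilder {
        private static final Logger LOG = LoggerFactory.getLogger(OrderRoute.class);

        @Override
        public void configure() {
            from("direct:orders")
                // Camel's log component: logs the exchange at INFO level under the given logger name
                .to("log:com.example.orders?level=INFO&showBody=true")
                .process(exchange -> {
                    // Your application's own messages go straight through SLF4J
                    LOG.debug("Processing order: {}", exchange.getIn().getBody(String.class));
                });
        }
    }

Tracing (e.g. context.setTracing(true) on the CamelContext, or the equivalent in your XML/Spring configuration) is what shows Camel's own step-by-step activity, separately from the log component.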

Related

Jetty 11 changed the logging to SLF4J - how to access it?

We understand that Jetty 11 has basically changed logging compared to version 10 (no internal Jetty logging classes; moreover, Jetty 11 is committed to using SLF4J as its base logging).
The problem
We have a rudimentary knowledge of SLF4J (we have used it before and we've even read the Jetty 11 SLF4J sources, too), but currently we don't see any way to "teach" Jetty 11 a new logging implementation (i.e. there are no "setLogging()" methods in the Jetty 11 source code as there were before).
Global (Jetty) parameters, alas, can't be our solution just yet.
The state (aka our requirements)
We have already solved the "RequestLog" outputs of Jetty, no problems there; what we need are the "normal" Jetty log outputs.
We need to control (many) modules/jars etc. via a unified logging.
Our logging is simple but requires that no output happens on the console (stdout/stderr, etc.). In the best case the logging also receives the Exception/RuntimeException instance.
Therefore, we need to route the Jetty output from the "Jetty server" through our internal logging. Using SLF4J? If there is no other way (and we see no other way so far), gladly.
Switching back to Jetty 10, sadly, is not an option.
Could this be solved in any way we are not aware of (yet)? Any idea would be very appreciated, thank you!
The switch from Jetty logging to Slf4j was actually done in Jetty 10.0.0.
slf4j was designed for unified logging; it can capture, into a single logging implementation, all of the logging events generated from libraries that use ...
slf4j API
java.util.logging API
log4j1 API
log4j2 API
commons-logging API
logback API
org.apache.juli.logging API
and if you use the slf4j 2.x series, there's even rudimentary support for capturing the java.lang.System.Logger API.
With slf4j, you have 2 categories of jar files to think about.
Bridge API JARs
These are slf4j based JARs that merely capture the above logging events and route them to slf4j. You can choose 0..n of these JARs to use.
There are dozens of options here.
Here are some common ones:
jcl-over-slf4j - captures Jakarta Commons Logging events and sends them to slf4j
jul-to-slf4j - captures Java Util Logging events and sends them to slf4j
log4j-over-slf4j - captures Log4j 1.x events and sends them to slf4j
log4j-to-slf4j - captures Log4j 2.x API events and sends them to slf4j (this bridge is shipped by the Log4j 2 project)
osgi-over-slf4j - captures osgi logging bundle events and sends them to slf4j
See http://www.slf4j.org/legacy.html
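One caveat worth knowing: jul-to-slf4j is the one bridge in that list that also needs a one-time programmatic install (or an equivalent handler entry in a java.util.logging configuration file); the other bridges work purely by being on the classpath. A minimal sketch:

    import org.slf4j.bridge.SLF4JBridgeHandler;

    public class JulBridgeSetup {
        public static void main(String[] args) {
            // Remove java.util.logging's default console handler, then route
            // all j.u.l. records through slf4j from here on.
            SLF4JBridgeHandler.removeHandlersForRootLogger();
            SLF4JBridgeHandler.install();

            java.util.logging.Logger.getLogger("demo").info("now captured by slf4j");
        }
    }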
Implementation Binding JAR
These are implementations of the slf4j-api; the one you pick is the final binding of all logging events. It is the thing that decides what to do with each logging event (e.g. write it to disk, ignore it, send it to a logging database, etc.).
You have many choices here as well; here are some common JARs to pick from (pick only one!):
logback-classic - slf4j to Logback (Eclipse Jetty group's favorite logging implementation)
slf4j-jdk14 - slf4j to Java Util Logging
slf4j-log4j12 - slf4j to Log4j 1.2.x
log4j-slf4j-impl - slf4j to Log4j 2.x (see https://logging.apache.org/log4j/2.x/log4j-slf4j-impl/)
slf4j-jcl - slf4j to Jakarta Commons Logging
jetty-slf4j-impl - Jetty 10+ implementation of the slf4j api
See: http://www.slf4j.org/manual.html#swapping
Since Jetty 10.0.x, jetty-slf4j-impl has existed; it provides an out-of-the-box implementation that simply writes to System.err (aka STDERR), with decent filtering by level via the usual jetty-logging.properties.
See https://search.maven.org/artifact/org.eclipse.jetty/jetty-slf4j-impl
Important advice
Don't use multiple binding implementations. Narrow it down to 1 binding implementation and purge all other logging implementation jars.
Don't accidentally create a loop by introducing a Bridge API JAR and a Binding Implementation JAR for the same logging technology (e.g. using the bridge log4j-over-slf4j and the binding slf4j-log4j12 at the same time).
There is no "configuration" to wire up these binding or bridge jars, their mere existence in the classloader is enough to make them work. See the slf4j manual on how that works.
We have already solved the "RequestLog" outputs of Jetty, no problems there; what we need are the "normal" Jetty log outputs.
Interesting, this is "solved" by actually using slf4j, as that's the only non-deprecated implementation of RequestLog.Writer in Jetty 10 and Jetty 11.
The way this works is that Slf4jRequestLogWriter emits events to a single named logger (the name of which you can configure via Slf4jRequestLogWriter.setLoggerName(String)) using the slf4j-api. The events then reach the logging implementation and are routed wherever that logging configuration decides, based on that logger name (file, with rolling, syslog, sent to a different system for aggregation, logstash, etc.).
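For embedded Jetty 10/11 that wiring looks roughly like this (the port and the logger name "access.log" are just placeholders for the example):

    import org.eclipse.jetty.server.CustomRequestLog;
    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.server.Slf4jRequestLogWriter;

    public class RequestLogSetup {
        public static void main(String[] args) throws Exception {
            Server server = new Server(8080);

            Slf4jRequestLogWriter writer = new Slf4jRequestLogWriter();
            // Pick any logger name; your slf4j implementation's configuration can
            // then route that logger to its own file, syslog, aggregator, etc.
            writer.setLoggerName("access.log");

            server.setRequestLog(new CustomRequestLog(writer, CustomRequestLog.EXTENDED_NCSA_FORMAT));
            server.start();
            server.join();
        }
    }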
Did you really implement your own RequestLog.Writer instead of just using your preferred logging library? (Libraries like logback, log4j2, log4j1, and even java.util.logging can easily create separate log files for RequestLog events.)
⚠️ Note: do not use logback-access for RequestLog at this time (It does not fully support jakarta.servlets yet, and has many bugs that result in bad request log data. See open PR at https://github.com/qos-ch/logback/pull/532)

Using two logging framework in same Spring application

We have a common service module which uses legacy Log4j for logging. We need to use this module as a dependency in a new Spring Boot application. In the new application we are trying to set up SLF4J with Logback as the logging framework, which is recommended since Log4j is old; however, we are observing that the log messages go to different log files. I think this is happening because our common module uses Log4j while we are using Logback in the new module. Which approach should we use? Having log messages in two different files will make it difficult to read and debug issues. Shall I configure Log4j and Logback to use the same file? Is that safe? Or should we use Log4j in the new application as well and drop Logback?
I would strongly recommend using a logging facade, which is what you already do with SLF4J.
That means Logback in combination with SLF4J is a perfect choice. SLF4J serves as a simple facade for various logging frameworks: it allows you to redirect log messages from legacy logging frameworks so they behave as if they had been made to the SLF4J API in the first place.
Adding the appropriate bridging module (log4j-over-slf4j) to your classpath should be all you have to do for "installation".
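Concretely (a sketch, assuming the real log4j 1.x jar is excluded from the dependency tree so log4j-over-slf4j can take its place), the legacy module's Log4j calls and your new SLF4J calls then land in the same Logback configuration, and therefore the same file:

    public class UnifiedLoggingDemo {
        // Legacy-style logger, as the common service module obtains it
        private static final org.apache.log4j.Logger LEGACY =
                org.apache.log4j.Logger.getLogger(UnifiedLoggingDemo.class);

        // New-style SLF4J logger used by the Spring Boot code
        private static final org.slf4j.Logger LOG =
                org.slf4j.LoggerFactory.getLogger(UnifiedLoggingDemo.class);

        public static void main(String[] args) {
            LEGACY.info("written via the Log4j 1.x API, rerouted through log4j-over-slf4j to Logback");
            LOG.info("written via the SLF4J API, handled by the same Logback appenders");
        }
    }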

ApacheKafka Appender logback

I just started playing with Apache Kafka. I set the whole thing up, and now I am trying to introduce sending logs to Kafka via a log appender in an already existing Java application. This application uses Logback as its logging library. So, I guess that makes it impossible for me to use kafka.producer.KafkaLog4jAppender? They are not compatible? I am getting IncompatibleClassException. Is there maybe another solution? Thanks a lot!!
You can use the logback-kafka appender that is available.
Check this link.
It is a good Logback appender that sends log messages to Kafka. It has a powerful, lightweight approach that allows you to make full use of the latest Kafka producer.
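If you want to see what such an appender boils down to (or avoid the extra dependency), a bare-bones Logback appender that forwards formatted events to Kafka is only a few lines. This is a sketch, not the linked library; the topic name "app-logs" and the bootstrap address are placeholders, and a production appender would also guard against the Kafka client's own logging feeding back into itself:

    import ch.qos.logback.classic.spi.ILoggingEvent;
    import ch.qos.logback.core.AppenderBase;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    import java.util.Properties;

    public class SimpleKafkaAppender extends AppenderBase<ILoggingEvent> {

        private KafkaProducer<String, String> producer;

        @Override
        public void start() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            producer = new KafkaProducer<>(props);
            super.start();
        }

        @Override
        protected void append(ILoggingEvent event) {
            // Send the formatted message asynchronously; "app-logs" is a placeholder topic.
            producer.send(new ProducerRecord<>("app-logs", event.getFormattedMessage()));
        }

        @Override
        public void stop() {
            if (producer != null) {
                producer.close();
            }
            super.stop();
        }
    }

The class is then referenced from logback.xml like any other appender, which keeps the existing Logback setup untouched.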

Logs pipeline from AEM (CQ5) to the Log4j server

Is it possible to transfer the AEM (v5.6) logs to the Log4j server? Or is there some best practice for centralized AEM logging?
I haven't dealt much with AEM. It seems the Apache Sling Logging services configure only the FileAppender.
For CQ 5.5 there is no log4j.xml; all the CRX loggers need to be configured using Sling config. We can use the Apache Sling Logging Logger Configuration (https://docs.adobe.com/docs/en/aem/6-0/deploy/configuring/monitoring-and-maintaining.html#Create%20a%20Custom%20Log%20File) or Logback (http://sling.apache.org/documentation/development/logging.html#logback-integration).
AEM provides the loggers out of the box, as you have seen. Typically, if you want centralized log handling I would suggest mounting a super-high-speed shared volume and having all instances log there, just for performance/speed reasons.
You might want to check out this:
http://adobe-consulting-services.github.io/acs-aem-commons/features/syslog-appender.html
It is AEM6 only, but you could look at the code and do it that way.
This uses logback, not log4j, but it should solve your problem.
The other option, really, is to write your own logging service, use that, and configure the appender in code. I do not see the normal XML files anywhere; I'm curious whether you come across them.
Let me know if I can help further.

Intercept Log Messages slf4j

I have an application which uses SLF4J as the logging facade. Now I would like to intercept all the error messages before they are handed off to the underlying logging system. Is it possible to do that with SLF4J? I looked through the documentation and see that we can change the appenders in the implementation (like log4j) to achieve this, but can we do this at the facade level itself? The interception does basic stuff like incrementing a global counter for the number of error messages, etc.
There is nothing in the SLF4J API for doing this.
But if your logging implementation is Logback, you can do it using a filter such as a TurboFilter.
If you're not using Logback, you'll probably need something like an AspectJ interceptor.
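For the Logback route, a minimal TurboFilter that counts error-level events before they reach any appender could look like this (a sketch; register it in logback.xml via a turboFilter element pointing at the class):

    import ch.qos.logback.classic.Level;
    import ch.qos.logback.classic.Logger;
    import ch.qos.logback.classic.turbo.TurboFilter;
    import ch.qos.logback.core.spi.FilterReply;
    import org.slf4j.Marker;

    import java.util.concurrent.atomic.AtomicLong;

    public class ErrorCountingTurboFilter extends TurboFilter {

        private static final AtomicLong ERROR_COUNT = new AtomicLong();

        public static long errorCount() {
            return ERROR_COUNT.get();
        }

        @Override
        public FilterReply decide(Marker marker, Logger logger, Level level,
                                  String format, Object[] params, Throwable t) {
            if (level != null && level.isGreaterOrEqual(Level.ERROR)) {
                ERROR_COUNT.incrementAndGet();
            }
            // NEUTRAL lets normal processing (level checks, appenders) continue.
            return FilterReply.NEUTRAL;
        }
    }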
