The message attribute in the JSON log contains backslashes - Java

I am using Spring Boot in a project and am currently exploring its logging behaviour. For that I'm using the Zipkin service.
My logback.xml is as follows:
<appender name="STASH"
class="net.logstash.logback.appender.LogstashTcpSocketAppender">
<destination>192.168.0.83:5000</destination>
<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
<providers>
<mdc /> <!-- MDC variables on the Thread will be written as JSON fields -->
<context /> <!--Outputs entries from logback's context -->
<version /> <!-- Logstash JSON format version, the @version field in the output -->
<logLevel />
<loggerName />
<pattern>
<pattern>
{
"serviceName": "zipkin-demo"
}
</pattern>
</pattern>
<threadName />
<message />
<logstashMarkers />
<arguments />
<stackTrace />
</providers>
</encoder>
</appender>
<root level="info">
<appender-ref ref="STASH" />
</root>
Now, from what I have understood, to log in JSON you need a custom logger implementation, which I have done as follows:
public class CustomLogger {
public static void info(Logger logger, Message message, String msg) {
log(logger.info(), message, msg).log();
}
private static JsonLogger log(JsonLogger logger, Message message, String msg) {
try {
if (message == null) {
return logger.field("Message", "Empty message").field("LogTime", new Date() );
}
logger
.field("message", message.getMessage())
.field("id", message.getId());
StackTraceElement ste = (new Exception()).getStackTrace()[2];
logger.field(
"Caller",
ste.getClassName() + "." + ste.getMethodName() + ":" + ste.getLineNumber());
return logger;
} catch (Exception e) {
return logger.exception("Error while logging", e).stack();
}
}
}
My message class is:
public class Message {
private int id;
private String message;
// constructor, getters and setters
}
Now in my controller class I'm using my custom logger as:
static com.savoirtech.logging.slf4j.json.logger.Logger log = com.savoirtech.logging.slf4j.json.LoggerFactory.getLogger(Controller.class.getClass());
Message msg = new Message(1, "hello");
CustomLogger customLogger = new CustomLogger();
customLogger.info(log, msg, "message");
My logs end up in Kibana as:
"message": "{\"message\":\"hello\",\"id\":1,\"Caller\":\"<class_name>:65\",\"level\":\"INFO\",\"thread_name\":\"http-nio-3333-exec-2\",\"class\":\"<custom_logger_class>\",\"logger_name\":\"java.lang.Class\",\"#timestamp\":\"2018-07-20 19:02:14.090+0530\",\"mdc\":{\"X-B3-TraceId\":\"554c43b0275c3430\",\"X-Span-Export\":\"true\",\"X-B3-SpanId\":\"368702fffa40d2cd\"}}",
Instead of this, I would like the message part of the log to appear as a JSON object rather than an escaped string. Where am I going wrong?
I have searched extensively for a way to do this, but to no avail. Is it even possible?

I finally got it working. I just changed the Logstash config file and added:
input
{
tcp
{
port => 5000
type => "java"
codec => "json"
}
}
filter
{
if [type]== "java"
{
json
{
source => "message"
}
}
}
The filter part was missing earlier.
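As an alternative to parsing the escaped string on the Logstash side, logstash-logback-encoder can emit the fields as top-level JSON directly, since the posted encoder already includes the <arguments /> provider. A minimal sketch, replacing the custom JSON-string logger with plain SLF4J calls (the field names here are illustrative, not from the original code):
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static net.logstash.logback.argument.StructuredArguments.keyValue;

public class StructuredLoggingExample {
    private static final Logger log = LoggerFactory.getLogger(StructuredLoggingExample.class);

    public static void main(String[] args) {
        // Each keyValue(...) argument is written by the <arguments /> provider
        // as its own JSON field, so nothing ends up as an escaped string.
        log.info("user registered", keyValue("id", 1), keyValue("message", "hello"));
    }
}
With this approach the json filter in Logstash is not strictly needed, because the event already arrives as structured JSON.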

Related

logback create new instance of appender

I work with Logback and I have several loggers.
I have created a Custom Appender:
public class LogListenerAppender extends AppenderBase<ILoggingEvent> {
private List<LogListener> listeners;
public LogListenerAppender() {
listeners = new ArrayList<>();
}
public void addListener(LogListener listener){
listeners.add(listener);
System.out.println("Current listener: " + listeners.size());
}
/**
* Send the LogEvent to all Listeners
* @param eventObject the LogEventObject
*/
@Override
protected void append(ILoggingEvent eventObject) {
for(LogListener listener : listeners){
listener.receiveLogMessage(eventObject);
}
}
}
This appender is used to add listeners to a logger in order to catch log messages.
Now in my logback.xml file, I have created the Appender as LISTENER
<!--Custom Listener Appender-->
<appender name="LISTENER" class="path.to.LogListenerAppender"/>
And some Logger:
<logger name="TestLogger">
<appender-ref ref="LISTENER" />
</logger>
<logger name="MainLogger">
<appender-ref ref="LISTENER" />
</logger>
In the code I add LogListener to the Logger:
public static void main(String[] args){
Logger testLogger = LoggerFactory.getLogger("TestLogger");
Logger mainLogger = LoggerFactory.getLogger("MainLogger");
addListenerToLogger(testLogger, new LogListener(Level.TRACE) {
@Override
public void log(String message, long timestamp) {
System.out.println("TEST LOG: " + message);
}
});
addListenerToLogger(mainLogger, new LogListener(Level.TRACE) {
@Override
public void log(String message, long timestamp) {
System.out.println("MAIN LOG: " + message);
}
});
testLogger.info("Hello");
}
private static void addListenerToLogger(Logger logger, LogListener loglistener){
ch.qos.logback.classic.Logger log = (ch.qos.logback.classic.Logger) logger;
LogListenerAppender appender = (LogListenerAppender)log.getAppender("LISTENER");
appender.addListener(loglistener);
}
The desired output is:
TEST LOG: Hello
But the output is:
TEST LOG: Hello
MAIN LOG: Hello
And the System.out.println("Current listener: " + listeners.size()); in the LogListenerAppender prints 2.
My problem now is that Logback uses the same instance of LogListenerAppender for all loggers that use <appender-ref ref="LISTENER"/>.
But I need a new LogListenerAppender for every logger.
How can I configure Logback so that it creates a new instance every time?
My idea is to create an appender for every logger, like:
<appender name="LISTENER1" class="path.to.LogListenerAppender"/>
<appender name="LISTENER2" class="path.to.LogListenerAppender"/>
//etc...
<logger name="TestLogger">
<appender-ref ref="LISTENER1" />
</logger>
<logger name="MainLogger">
<appender-ref ref="LISTENER2" />
</logger>
//etc...
But I hope there is an easier way.
You can see why this is occurring in your code here:
LogListenerAppender appender = (LogListenerAppender)log.getAppender("LISTENER");
Your code just gets whatever appender you've created in memory tagged with the reference "LISTENER". Every listener in that appender's list will receive every event that reaches it, regardless of which logger the event came from.
Perhaps try adding a String parameter to your addListenerToLogger method, like so:
private static void addListenerToLogger(Logger logger, LogListener loglistener, String appenderRef){
ch.qos.logback.classic.Logger log = (ch.qos.logback.classic.Logger) logger;
LogListenerAppender appender = (LogListenerAppender)log.getAppender(appenderRef);
appender.addListener(loglistener);
}
This way you can pass the appropriate appender ref (e.g. "LISTENER1") to the method to retrieve the appropriate appender.
It would also be worthwhile to choose meaningful refs, e.g.:
<logger name="TestLogger">
<appender-ref ref="TEST" />
</logger>
<logger name="MainLogger">
<appender-ref ref="MAIN" />
</logger>
This is for the sake of readability and maintainability, but that's a style choice more than a technical decision.
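If declaring one appender per logger in XML feels too verbose, another option is to skip the XML entirely and attach a fresh appender instance programmatically. A rough sketch, reusing the asker's LogListenerAppender and LogListener types:
import ch.qos.logback.classic.Logger;
import org.slf4j.LoggerFactory;

public final class ListenerAttacher {

    private ListenerAttacher() {}

    /** Creates a brand-new LogListenerAppender for this logger only. */
    public static void attachListener(String loggerName, LogListener listener) {
        Logger logger = (Logger) LoggerFactory.getLogger(loggerName);

        LogListenerAppender appender = new LogListenerAppender();
        appender.setContext(logger.getLoggerContext()); // appender needs the context before start()
        appender.setName("LISTENER-" + loggerName);     // unique name per logger
        appender.addListener(listener);
        appender.start();

        logger.addAppender(appender);                   // only this logger sees this instance
        logger.setAdditive(false);                      // optional: keep events from bubbling up to root
    }
}
Because every call builds its own appender instance, TestLogger and MainLogger no longer share a listener list.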

Using SLF4J in Akka

I followed the Akka Logging documentation for using SLF4J with Logback in my Akka application, but I have a problem with logging in non-actor classes: I cannot log the MDC values (e.g. sourceActorSystem) in my non-actor classes with an SLF4J logger. Here is my simple application:
1- pom.xml:
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-slf4j_2.12</artifactId>
<version>2.5.18</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.3</version>
</dependency>
2- application.conf:
akka {
loggers = ["akka.event.slf4j.Slf4jLogger"]
loglevel = "DEBUG"
logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
}
3- logback.xml which uses sourceThread, akkaSource, and sourceActorSystem MDCs:
<configuration scan="true" scanPeriod="30 seconds" >
<contextName>akka-sample</contextName>
<jmxConfigurator />
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss} %-5level [%contextName] [%X{sourceThread}] [%X{sourceActorSystem}] [%X{akkaSource}] %logger{40} -| %msg %rEx%n</pattern>
</encoder>
</appender>
<root level="info">
<appender-ref ref="STDOUT" />
</root>
</configuration>
4- UserActor: an actor class that logs via a DiagnosticLoggingAdapter:
public class UserActor extends AbstractActor {
DiagnosticLoggingAdapter logger = Logging.getLogger(this);
Handler handler = new Handler(); // non-actor class
public static Props props() {
return Props.create(UserActor.class, UserActor::new);
}
@Override
public Receive createReceive() {
return receiveBuilder()
.match(UserMessages.Register.class, register -> {
logger.info("{} received", register);
// Delegate message-handling to a non-actor class
handler.onRegister(register, getSelf(), getSender());
})
.build();
}
}
5- Handler: a non-actor class that logs via an SLF4J logger. Each UserActor instance has a Handler field to which it delegates message handling:
public class Handler {
org.slf4j.Logger logger = org.slf4j.LoggerFactory.getLogger(getClass());
public void onRegister(UserMessages.Register register, ActorRef self, ActorRef sender) {
logger.info("Handling {}", register);
sender.tell(new UserMessages.Registered(), self);
}
}
6- Simple application:
public class Application {
public static void main(String [] args) throws InterruptedException {
ActorSystem system = null;
try {
system = ActorSystem.create("akka-system");
ActorRef user1 = system.actorOf(UserActor.props(), "user-1");
ActorRef user2 = system.actorOf(UserActor.props(), "user-2");
user2.tell(UserMessages.Register.builder().userId("1").build(), user1);
Thread.sleep(2000);
} finally {
system.terminate();
}
}
}
Log Output:
18:07:30 INFO [akka-sample] [] [] [] akka.event.slf4j.Slf4jLogger -| Slf4jLogger started
18:07:30 INFO [akka-sample] [akka-system-akka.actor.default-dispatcher-4] [akka-system] [akka://akka-system/user/user-2] x.y.akka.UserActor -| UserMessages.Register(userId=1) received
18:07:30 INFO [akka-sample] [] [] [] x.y.akka.Handler -| Handling UserMessages.Register(userId=1)
As you can see, only the log line from UserActor (the actor class) has the sourceThread, akkaSource, and sourceActorSystem MDC values; in Handler (the non-actor class) they are empty. Also, if I change akka.event.DiagnosticLoggingAdapter in UserActor to org.slf4j.Logger, these MDC values become empty there as well.
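One workaround (not necessarily the only one) is to populate the SLF4J MDC yourself before delegating to the non-actor class, since akka.event.slf4j.Slf4jLogger only fills those keys for events that travel through Akka's logging bus. A hedged sketch, modifying the UserActor from step 4 and reusing the same MDC key names as the logback.xml pattern above:
import akka.actor.AbstractActor;
import org.slf4j.MDC;

public class UserActor extends AbstractActor {
    private final Handler handler = new Handler();

    @Override
    public Receive createReceive() {
        return receiveBuilder()
            .match(UserMessages.Register.class, register -> {
                // Copy the Akka-specific context into the plain SLF4J MDC
                // so that Handler's org.slf4j.Logger can render it too.
                MDC.put("akkaSource", getSelf().path().toString());
                MDC.put("sourceActorSystem", getContext().getSystem().name());
                MDC.put("sourceThread", Thread.currentThread().getName());
                try {
                    handler.onRegister(register, getSelf(), getSender());
                } finally {
                    MDC.clear(); // avoid leaking context onto the dispatcher thread
                }
            })
            .build();
    }
}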

How to create multiple log files programmatically in log4j2?

I am developing a Java application which communicates with lots of devices. For each device I need to create a different log file to log its communication with the device. This is the wrapper class I developed. It creates two log files, but the data is written only to the first one; the second file is created but nothing is written to it. The output that should go to the second file goes to the console. If I uncomment createRootLogger() in the constructor, nothing is written to either file and everything goes to the console. I have gone through the log4j2 documentation, but it is poorly written with very few code samples. Here is my wrapper class; where is the error? I am using log4j-api-2.9.0.jar and log4j-core-2.9.0.jar.
package xyz;
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.Logger;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.appender.ConsoleAppender;
import org.apache.logging.log4j.core.config.Configuration;
import org.apache.logging.log4j.core.config.Configurator;
import org.apache.logging.log4j.core.config.builder.api.*;
import java.util.Hashtable;
public class LogManager
{
static protected LogManager m_clsInstance = null;
protected Hashtable<String, Logger> m_clsLoggers = new Hashtable<String, Logger>();
private LogManager()
{
//createRootLogger();
}
/**
* getInstance is used to get a reference to the singleton class object.
*/
static synchronized public LogManager getInstance()
{
try
{
if (m_clsInstance == null)
{
m_clsInstance = new LogManager();
//Configurator.setRootLevel(Level.TRACE);
}
}
catch (Exception xcpE)
{
System.err.println(xcpE);
}
return m_clsInstance;
}
static public Logger getLogger(String sLogger)
{
try
{
return getInstance().m_clsLoggers.get(sLogger);
}
catch (Exception xcpE)
{
System.err.println(xcpE);
}
return null;
}
public Logger createLogger(String strName, String sPath, int nBackupSize, long lngMaxSize, String strPattern, String strLevel)
{
try
{
ConfigurationBuilder builder = ConfigurationBuilderFactory.newConfigurationBuilder();
builder.setStatusLevel(Level.getLevel(strLevel));
builder.setConfigurationName("RollingBuilder"+strName);
// create a console appender
AppenderComponentBuilder appenderBuilder = builder.newAppender("Stdout", "CONSOLE").addAttribute("target",
ConsoleAppender.Target.SYSTEM_OUT);
appenderBuilder.add(builder.newLayout("PatternLayout")
.addAttribute("pattern", strPattern));
builder.add( appenderBuilder );
// create a rolling file appender
LayoutComponentBuilder layoutBuilder = builder.newLayout("PatternLayout")
.addAttribute("pattern", strPattern);
ComponentBuilder triggeringPolicy = builder.newComponent("Policies")
// .addComponent(builder.newComponent("CronTriggeringPolicy").addAttribute("schedule", "0 0 0 * * ?"))
.addComponent(builder.newComponent("SizeBasedTriggeringPolicy").addAttribute("size", lngMaxSize));
appenderBuilder = builder.newAppender("rolling"+strName, "RollingFile")
.addAttribute("fileName", sPath)
.addAttribute("filePattern", "d:\\trash\\archive\\rolling-%d{MM-dd-yy}.log.gz")
.add(layoutBuilder)
.addComponent(triggeringPolicy);
builder.add(appenderBuilder);
// create the new logger
builder.add( builder.newLogger( strName, Level.getLevel(strLevel) )
.add( builder.newAppenderRef( "rolling"+strName ) )
.addAttribute( "additivity", false ) );
Configuration clsCnfg = (Configuration) builder.build();
LoggerContext ctx = Configurator.initialize(clsCnfg);
Logger clsLogger = ctx.getLogger(strName);
m_clsLoggers.put(strName, clsLogger);
return clsLogger;
}
catch (Exception xcpE)
{
System.err.println(xcpE);
}
return null;
}
protected void createRootLogger()
{
try
{
ConfigurationBuilder builder = ConfigurationBuilderFactory.newConfigurationBuilder();
builder.setStatusLevel(Level.getLevel("TRACE"));
builder.setConfigurationName("rootConfig");
// create a console appender
AppenderComponentBuilder appenderBuilder = builder.newAppender("Stdout", "CONSOLE").addAttribute("target",
ConsoleAppender.Target.SYSTEM_OUT);
appenderBuilder.add(builder.newLayout("PatternLayout")
.addAttribute("pattern", "[%d{yyyy-MMM-dd HH:mm:ss:SSS}][%-5p %l][%t] %m%n"));
builder.add( appenderBuilder );
builder.add( builder.newRootLogger( Level.getLevel("TRACE"))
.add( builder.newAppenderRef( "Stdout") ) );
Configuration clsCnfg = (Configuration) builder.build();
LoggerContext ctx = Configurator.initialize(clsCnfg);
Logger clsLogger = ctx.getRootLogger();
m_clsLoggers.put("root", clsLogger);
}
catch (Exception xcpE)
{
System.err.println(xcpE);
}
}
static public void main(String args[])
{
//Logger clsLogger = setLogger();
Logger clsLogger = Emflex.LogManager.getInstance().createLogger(
"AnsiAmrController_" + 5555,
"d:\\trash\\LogManagerTest5555.log",
10,
100000000,
"[%d{yyyy-MMM-dd HH:mm:ss:SSS}][%-5p %l][%t] %m%n",
"TRACE"
);
Logger clsLogger2 = Emflex.LogManager.getInstance().createLogger(
"AnsiAmrController_" + 6666,
"d:\\trash\\LogManagerTest6666.log",
10,
100000000,
"[%d{yyyy-MMM-dd HH:mm:ss:SSS}][%-5p %l][%t] %m%n",
"TRACE"
);
for (int i=0;i<100;i++)
{
clsLogger.error("Testing - ["+i+"]");
clsLogger2.error("Testing - ["+(i*i)+"]");
}
}
}
You said your objective is:
For each device I need to create a different log file to log its communication with the device.
There are many different ways to accomplish this without programmatic configuration. Programmatic configuration is bad because it forces you to depend on the logging implementation rather than the public interface.
For example you could use a context map key in conjunction with a Routing Appender to separate your logs, similar to the example I gave in another answer. Note that in the other answer I used the variable as the folder where the log is stored but you can use it for the log name if you wish.
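As a rough sketch of that context-map approach (the key name, file path, and class are illustrative, not taken from the linked answer): the application puts a device id into the ThreadContext, and a Routing appender keyed on $${ctx:deviceId} sends each device's events to its own file, e.g. fileName="logs/${ctx:deviceId}.log".
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class DeviceCommunicator {
    private static final Logger log = LogManager.getLogger(DeviceCommunicator.class);

    void talkTo(String deviceId) {
        ThreadContext.put("deviceId", deviceId);   // picked up by <Routes pattern="$${ctx:deviceId}">
        try {
            log.info("sending handshake");         // lands in logs/<deviceId>.log
        } finally {
            ThreadContext.remove("deviceId");      // don't leak the key to unrelated logging
        }
    }
}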
Another way to do what you want would be to use a MapMessage as shown in the log4j2 manual.
Yet another way would be to use markers in combination with a RoutingAppender. Here is some example code for this approach:
package example;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.Marker;
import org.apache.logging.log4j.MarkerManager;
public class LogLvlByMarkerMain {
private static final Logger log = LogManager.getLogger();
private static final Marker DEVICE1 = MarkerManager.getMarker("DEVICE1");
private static final Marker DEVICE2 = MarkerManager.getMarker("DEVICE2");
public static void main(String[] args) {
log.info(DEVICE1, "The first device got some input");
log.info(DEVICE2, "The second device now has input");
}
}
Configuration:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
<Appenders>
<Routing name="MyRoutingAppender">
<Routes pattern="$${marker:}">
<Route>
<File
fileName="logs/${marker:}.txt"
name="appender-${marker:}">
<PatternLayout>
<Pattern>[%date{ISO8601}][%-5level][%t] %m%n</Pattern>
</PatternLayout>
</File>
</Route>
</Routes>
</Routing>
<Console name="STDOUT" target="SYSTEM_OUT">
<PatternLayout pattern="[%date{ISO8601}][%-5level][%t] %m%n" />
</Console>
</Appenders>
<Loggers>
<Logger name="example" level="TRACE" additivity="false">
<AppenderRef ref="STDOUT" />
<AppenderRef ref="MyRoutingAppender" />
</Logger>
<Root level="WARN">
<AppenderRef ref="STDOUT" />
</Root>
</Loggers>
</Configuration>
Output:
This will generate two log files, DEVICE1.txt and DEVICE2.txt.
The first log will contain only messages that were marked as DEVICE1 and the second will contain only DEVICE2 logs.
I.e. the first log contains:
[2017-09-21T09:52:04,171][INFO ][main] The first device got some input
and the second contains:
[2017-09-21T09:52:04,176][INFO ][main] The second device now has input
The approach for log4j2 when it is initialized programmatically and the configuration is modified later is different, and you are trying to add a dynamic appender and logger using only the initialization approach.
So, first you should initialize your root logger using the initialization approach, which looks correct in your code.
After that, add the dynamic appender and logger by modifying the current configuration; a rough sketch follows.
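For reference, a minimal sketch of that second step (adding an appender and logger to the already-initialized context at runtime); the appender type, file name, and logger name are placeholders, and the builder calls assume log4j2 2.7+:
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.appender.FileAppender;
import org.apache.logging.log4j.core.config.Configuration;
import org.apache.logging.log4j.core.config.LoggerConfig;
import org.apache.logging.log4j.core.layout.PatternLayout;

public class DynamicLoggerFactory {

    public static org.apache.logging.log4j.Logger addDeviceLogger(String name, String fileName) {
        LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
        Configuration config = ctx.getConfiguration();

        PatternLayout layout = PatternLayout.newBuilder()
                .withPattern("[%d{yyyy-MMM-dd HH:mm:ss:SSS}][%-5p][%t] %m%n")
                .build();

        // Build and register a file appender dedicated to this device.
        FileAppender appender = FileAppender.newBuilder()
                .withName("file-" + name)
                .withFileName(fileName)
                .withLayout(layout)
                .build();
        appender.start();
        config.addAppender(appender);

        // Non-additive logger so the device log doesn't also go to the root appenders.
        LoggerConfig loggerConfig = new LoggerConfig(name, Level.TRACE, false);
        loggerConfig.addAppender(appender, Level.TRACE, null);
        config.addLogger(name, loggerConfig);
        ctx.updateLoggers();

        return ctx.getLogger(name);
    }
}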
Adding on to D.B's answer:
I had trouble making this write to a file (and yes, I tried using log4j2 version 2.8.1, but it still didn't work).
To make it work I edited this part
<Root level="WARN">
<AppenderRef ref="STDOUT" />
</Root>
to this:
<Root level="WARN">
<AppenderRef ref="STDOUT" />
<AppenderRef ref="MyRoutingAppender" />
</Root>
And since the root logger level is set to WARN
<Root level="WARN">
and we are trying to log at INFO
log.info(DEVICE$, "The $ device now has input");
the INFO log won't be written (WARN only prints WARN, ERROR, and FATAL; see the log4j documentation on logging levels), so you can simply change
log.info() --> log.warn()
just as a proof of concept.
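Alternatively, instead of changing the calls to log.warn(), the root logger's threshold can be lowered so INFO events pass through (a sketch of the same config with only the level attribute changed):
<Root level="INFO">
<AppenderRef ref="STDOUT" />
<AppenderRef ref="MyRoutingAppender" />
</Root>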

GKE & Stackdriver: Java logback logging format?

I have a project running Java in a docker image on Kubernetes. Logs are automatically ingested by the fluentd agent and end up in Stackdriver.
However, the format of the logs is wrong: multiline logs get split into separate log lines in Stackdriver, and all logs have the "INFO" log level, even when they are really warnings or errors.
I have been searching for information on how to configure Logback to output the correct format for this to work properly, but I can find no such guide in the Google Stackdriver or GKE documentation.
My guess is that I should be outputting JSON of some form, but where do I find information on the format, or even a guide on how to properly set up this pipeline.
Thanks!
This answer contained most of the information I needed: https://stackoverflow.com/a/39779646
I have adapted the answer to fit my exact question, and to fix some weird imports and code that seems to have been deprecated.
logback.xml:
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
<layout class="my.package.logging.GCPCloudLoggingJSONLayout">
<pattern>%-4relative [%thread] %-5level %logger{35} - %msg</pattern>
</layout>
</encoder>
</appender>
<root level="INFO">
<appender-ref ref="STDOUT"/>
</root>
</configuration>
GCPCloudLoggingJSONLayout:
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.PatternLayout;
import ch.qos.logback.classic.spi.ILoggingEvent;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Map;
import static ch.qos.logback.classic.Level.DEBUG_INT;
import static ch.qos.logback.classic.Level.ERROR_INT;
import static ch.qos.logback.classic.Level.INFO_INT;
import static ch.qos.logback.classic.Level.TRACE_INT;
import static ch.qos.logback.classic.Level.WARN_INT;
/**
* GKE fluentd ingestion detective work:
* https://cloud.google.com/error-reporting/docs/formatting-error-messages#json_representation
* http://google-cloud-python.readthedocs.io/en/latest/logging-handlers-container-engine.html
* http://google-cloud-python.readthedocs.io/en/latest/_modules/google/cloud/logging/handlers/container_engine.html#ContainerEngineHandler.format
* https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/logging/google/cloud/logging/handlers/_helpers.py
* https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry
*/
public class GCPCloudLoggingJSONLayout extends PatternLayout {
private static final ObjectMapper objectMapper = new ObjectMapper();
@Override
public String doLayout(ILoggingEvent event) {
String formattedMessage = super.doLayout(event);
return doLayoutInternal(formattedMessage, event);
}
/**
* For testing without having to deal with the complexity of super.doLayout()
* Uses formattedMessage instead of event.getMessage()
*/
private String doLayoutInternal(String formattedMessage, ILoggingEvent event) {
GCPCloudLoggingEvent gcpLogEvent =
new GCPCloudLoggingEvent(formattedMessage, convertTimestampToGCPLogTimestamp(event.getTimeStamp()),
mapLevelToGCPLevel(event.getLevel()), event.getThreadName());
try {
// Add a newline so that each JSON log entry is on its own line.
// Note that it is also important that the JSON log entry does not span multiple lines.
return objectMapper.writeValueAsString(gcpLogEvent) + "\n";
} catch (JsonProcessingException e) {
return "";
}
}
private static GCPCloudLoggingEvent.GCPCloudLoggingTimestamp convertTimestampToGCPLogTimestamp(
long millisSinceEpoch) {
int nanos =
((int) (millisSinceEpoch % 1000)) * 1_000_000; // strip out just the milliseconds and convert to nanoseconds
long seconds = millisSinceEpoch / 1000L; // remove the milliseconds
return new GCPCloudLoggingEvent.GCPCloudLoggingTimestamp(seconds, nanos);
}
private static String mapLevelToGCPLevel(Level level) {
switch (level.toInt()) {
case TRACE_INT:
return "TRACE";
case DEBUG_INT:
return "DEBUG";
case INFO_INT:
return "INFO";
case WARN_INT:
return "WARN";
case ERROR_INT:
return "ERROR";
default:
return null; /* This should map to no level in GCP Cloud Logging */
}
}
/* Must be public for Jackson JSON conversion */
public static class GCPCloudLoggingEvent {
private String message;
private GCPCloudLoggingTimestamp timestamp;
private String thread;
private String severity;
public GCPCloudLoggingEvent(String message, GCPCloudLoggingTimestamp timestamp, String severity,
String thread) {
super();
this.message = message;
this.timestamp = timestamp;
this.thread = thread;
this.severity = severity;
}
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
public GCPCloudLoggingTimestamp getTimestamp() {
return timestamp;
}
public void setTimestamp(GCPCloudLoggingTimestamp timestamp) {
this.timestamp = timestamp;
}
public String getThread() {
return thread;
}
public void setThread(String thread) {
this.thread = thread;
}
public String getSeverity() {
return severity;
}
public void setSeverity(String severity) {
this.severity = severity;
}
/* Must be public for JSON marshalling logic */
public static class GCPCloudLoggingTimestamp {
private long seconds;
private int nanos;
public GCPCloudLoggingTimestamp(long seconds, int nanos) {
super();
this.seconds = seconds;
this.nanos = nanos;
}
public long getSeconds() {
return seconds;
}
public void setSeconds(long seconds) {
this.seconds = seconds;
}
public int getNanos() {
return nanos;
}
public void setNanos(int nanos) {
this.nanos = nanos;
}
}
}
@Override
public Map<String, String> getDefaultConverterMap() {
return PatternLayout.defaultConverterMap;
}
}
As I said earlier, the code was originally from another answer; I have just cleaned it up slightly to fit my use case better.
Google has provided a Logback appender for Stackdriver; I have enhanced it a bit to include the thread name in the logging labels so that logs can be searched more easily.
pom.xml
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-logging-logback</artifactId>
<version>0.116.0-alpha</version>
</dependency>
logback-spring.xml
<springProfile name="prod-gae">
<appender name="CLOUD"
class="com.google.cloud.logging.logback.LoggingAppender">
<log>clicktrade.log</log>
<loggingEventEnhancer>com.jimmy.clicktrade.arch.CommonEnhancer</loggingEventEnhancer>
<flushLevel>WARN</flushLevel>
</appender>
<root level="info">
<appender-ref ref="CLOUD" />
</root>
</springProfile>
CommonEnhancer.java
public class CommonEnhancer implements LoggingEventEnhancer {
@Override
public void enhanceLogEntry(Builder builder, ILoggingEvent e) {
builder.addLabel("thread", e.getThreadName());
}
}
Surprisingly, the Logback appender in the Maven repository doesn't align with the GitHub repo source code; I needed to dig into the JAR source code for that. The latest published version is 0.116.0-alpha, while GitHub appears to have version 0.120:
https://github.com/googleapis/google-cloud-java/tree/master/google-cloud-clients/google-cloud-contrib/google-cloud-logging-logback
You can use the glogging library.
Simply add it as a dependency and use the provided layout in logback.xml:
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
<layout class="io.github.aaabramov.glogging.GoogleLayout">
<!-- You have a choice which JSON encoder to use. Or create your own via implementing JsonEncoder interface -->
<json>io.github.aaabramov.glogging.JacksonEncoder</json>
<!-- OR -->
<!-- <json>io.github.aaabramov.glogging.GsonEncoder</json> -->
<!-- Optionally append "${prefix}/loggerName" labels -->
<appendLoggerName>true</appendLoggerName>
<!-- Optionally configure prefix for labels -->
<prefix>com.yourcompany</prefix>
<!-- Provide message pattern you like. -->
<!-- Note: there is no need anymore to log timestamps & levels to the message. Google will pick them up from specific fields. -->
<pattern>%message %xException{10}</pattern>
</layout>
</encoder>
</appender>
<appender name="ASYNCSTDOUT" class="ch.qos.logback.classic.AsyncAppender">
<appender-ref ref="STDOUT"/>
</appender>
<!-- Configure logging levels -->
<logger name="com.github" level="DEBUG"/>
<root level="DEBUG">
<appender-ref ref="ASYNCSTDOUT"/>
</root>
</configuration>
It will produce messages in the format that GSL will gladly accept:
{"timestamp":{"seconds":1629642099,"nanos":659000000},"severity":"DEBUG","message":"debug","labels":{"io.github.aaabramov/name":"Andrii","io.github.aaabramov/loggerName":"io.github.aaabramov.glogging.App"}}
Disclaimer: I am the author of this library.
Credits: https://stackoverflow.com/a/44168383/5091346

How to add a PatternLayout to the root logger at runtime?

I am using Logback as the backend for SLF4J. Currently, I configure the logger using a logback.xml file. My issue is that sensitive information is being logged (outside of my control) and I want to mask it. To mask the information, I have written a custom PatternLayout class that essentially does:
@Override
public String doLayout(ILoggingEvent event) {
String message = super.doLayout(event);
Matcher matcher = sensitiveInfoPattern.matcher(message);
if (matcher.find()) {
message = matcher.replaceAll("XXX");
}
return message;
}
My issue is that I need to tell logback to use this custom pattern layout. I don't want to add this to the XML configuration however. My current configuration looks like this:
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<layout class="com.my.MaskingPatternLayout"> <!-- here -->
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</layout>
</encoder>
</appender>
<root level="info">
<appender-ref ref="STDOUT" />
</root>
</configuration>
In XML, my desired configuration would look like this (but I don't want to use XML):
Hello Max, I hope you are using Log4j 2.x, because this solution uses the plugin approach introduced in Log4j 2.x. First you should create a package where you are going to put your plugin classes, and put these two classes there:
my.log4j.pluggins.CustomConfigurationFactory :
@Plugin(name = "CustomConfigurationFactory", category = ConfigurationFactory.CATEGORY)
@Order(value = 0)
public class CustomConfigurationFactory extends ConfigurationFactory {
private Configuration createConfiguration(final String name,
ConfigurationBuilder<BuiltConfiguration> builder) {
System.out.println("init logger");
builder.setConfigurationName(name);
builder.setStatusLevel(Level.INFO);
builder.setPackages("my.log4j.pluggins");
AppenderComponentBuilder appenderBuilder = builder.newAppender(
"Stdout", "CONSOLE").addAttribute("target",
ConsoleAppender.Target.SYSTEM_OUT);
appenderBuilder
.add(builder
.newLayout("PatternLayout")
.addAttribute("pattern", "%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %myMsg%n"));
builder.add(appenderBuilder);
builder.add(builder.newRootLogger(Level.TRACE).add(
builder.newAppenderRef("Stdout")));
return builder.build();
}
@Override
protected String[] getSupportedTypes() {
String[] supportedExt = { "*" };
return supportedExt;
}
@Override
public Configuration getConfiguration(ConfigurationSource source) {
ConfigurationBuilder<BuiltConfiguration> builder = newConfigurationBuilder();
return createConfiguration(source.toString(), builder);
}
@Override
public Configuration getConfiguration(String name, URI configLocation) {
ConfigurationBuilder<BuiltConfiguration> builder = newConfigurationBuilder();
return createConfiguration(name, builder);
}
}
my.log4j.pluggins.SampleLayout :
@Plugin(name = "CustomConverter", category = "Converter")
@ConverterKeys({"myMsg"})
public class SampleLayout extends LogEventPatternConverter {
protected SampleLayout(String name, String style) {
super(name, style);
}
public static SampleLayout newInstance(){
return new SampleLayout("custConv", "custConv");
}
@Override
public void format(LogEvent event, StringBuilder stringBuilder) {
//replace the %myMsg by XXXXX if sensitive
if (sensitive()){
stringBuilder.append("XXXX");}
else {
stringBuilder.append(event.getMessage().getFormattedMessage());}
}
}
The CustomConfigurationFactory class is responsible for creating the log4j configuration, and the line where builder.setPackages("my.log4j.pluggins") is called is important in order to scan that package and pick up the converter plugin, which is SampleLayout.
The second class is responsible for formatting the new key '%myMsg' in the pattern that contains my sensitive message; this converter class checks whether the message is sensitive and acts accordingly.
Before you start logging you should configure your log4j like this:
ConfigurationFactory.setConfigurationFactory(new CustomConfigurationFactory());
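For the Logback side of the original question (attaching the masking layout to the root logger at runtime without touching the XML), a minimal sketch along these lines should work, assuming com.my.MaskingPatternLayout is the custom layout shown above:
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.ConsoleAppender;
import ch.qos.logback.core.encoder.LayoutWrappingEncoder;
import com.my.MaskingPatternLayout;
import org.slf4j.LoggerFactory;

public class MaskingLoggerSetup {

    public static void install() {
        LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();

        // The custom layout that masks sensitive data in doLayout().
        MaskingPatternLayout layout = new MaskingPatternLayout();
        layout.setContext(context);
        layout.setPattern("%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n");
        layout.start();

        LayoutWrappingEncoder<ILoggingEvent> encoder = new LayoutWrappingEncoder<>();
        encoder.setContext(context);
        encoder.setLayout(layout);
        encoder.start();

        ConsoleAppender<ILoggingEvent> appender = new ConsoleAppender<>();
        appender.setContext(context);
        appender.setName("STDOUT");
        appender.setEncoder(encoder);
        appender.start();

        Logger root = context.getLogger(Logger.ROOT_LOGGER_NAME);
        root.detachAndStopAllAppenders(); // drop any XML-configured appenders first
        root.addAppender(appender);
    }
}
Calling install() once at startup replaces the root logger's appenders with one that runs every message through the masking layout.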
