Log4j2 custom email subject from Map - java

I have some applications installed at my customers' sites and I configured the SMTP appender to receive error emails.
Unfortunately, I need a way to tell which customer an email is coming from.
I'm trying to set a parameter in the map so that it shows up in the subject of the email. I can only set this parameter after my app has started and the DB is up:
String[] parametri = {username};
MapLookup.setMainArguments(parametri);
and my log4j2.xml is:
<SMTP name="Mailer" subject="${sys:logPath} - ${map:0}" to="${receipients}"
from="${from}" smtpHost="${smtpHost}" smtpPort="${smtpPort}"
smtpProtocol="${smtpProtocol}" smtpUsername="${smtpUser}"
smtpPassword="${smtpPassword}" smtpDebug="false" bufferSize="200"
ignoreExceptions="false">
</SMTP>
The subject is the relevant part. Unfortunately, the subject is not replaced by Log4j and remains as it is.
What am I doing wrong?
Thanks

Currently, the SmtpAppender class (actually its helper SmtpManager) creates a MimeMessage object once and reuses it for all messages to be sent. The message subject is initialized only once. The lookup is done only once when your configuration is read.
I suggest you raise a feature request on the Log4j2 Jira issue tracker for your use case.
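Until such a feature exists, one possible workaround (just a sketch, not an official Log4j mechanism; the property name customerName is only an example) is to publish the value as a system property once it is known and then force a reconfiguration, so that a ${sys:customerName} lookup in the subject is resolved again when the SMTP appender is rebuilt:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.LoggerContext;

public final class CustomerSubject {

    // Call this once the application is up and the customer name is known.
    public static void apply(String username) {
        // Referenced from log4j2.xml, e.g. subject="${sys:logPath} - ${sys:customerName}"
        System.setProperty("customerName", username);

        // Rebuild the configuration so the SMTP appender (and its subject) is created again.
        LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
        ctx.reconfigure();
    }
}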

Note: Log4j 2.6+ supports this natively; you need Java 7+ for it.
I created a freely usable solution for Log4j2 (and also for Java 6) with an ExtendedSmtpAppender that supports a PatternLayout in the subject.
If you still use log4j 1.x (original question), simply replace your log4j-1.x.jar with log4j-1.2-api-2.x.jar - and log4j-core-2.x.jar + log4j-api-2.x.jar of course.
You get it from Maven Central as de.it-tw:log4j2-extras (this requires Java 7+ and log4j 2.8+).
If you are restricted to Java 6 (and thus log4j 2.3), use de.it-tw:log4j2-Java6-extras instead.
See also the GitLab project: https://gitlab.com/thiesw/log4j2-extras (or https://gitlab.com/thiesw/log4j2-Java6-extras)
Additionally, it supports burst summarizing, so you will not get 1000 error emails within a few seconds or minutes. Use case: send all ERROR logs via email to support/developers; with a broken network or database this can otherwise produce hundreds of copies of the same error email.
This appender does the following:
the first occurrence is emailed immediately
all following similar ERROR logs are buffered for a certain time (similarity and time are configurable)
after that time has passed, a summary email with summary info (number of events, time span) and the first and last event is sent
Example configuration (inside <Appenders>):
<SMTPx name="ErrorMail" smtpHost="mailer.xxxx.de" smtpPort="25"
from="your name <noReply#xxx.de>" to="${errorEmailAddresses}"
subject="[PROJECT-ID, ${hostName}, ${web:contextPath}] %p: %c{1} - %m%notEmpty{ =>%ex{short})}"
subjectWithLayout="true" bufferSize="5"
burstSummarizingSeconds="300" bsCountInSubject="S" bsMessageMaskDigits="true"
bsExceptionOrigin="true" >
<PatternLayout pattern="-- %d %p %c [%.20t,%x] %m%n" charset="UTF-8" /> <!-- SMTP uses fixed charset for message -->
</SMTPx>
<Async name="AsyncErrorMail" blocking="false" errorRef="Console">
<AppenderRef ref="ErrorMail"/>
</Async>
See also https://issues.apache.org/jira/browse/LOG4J2-1192.

Related

Spring Boot Log4j2 Custom Hybrid Log Layout - Mix of Pattern Layout and JSON Layout

I would like to know how to print my application logs to the console in a specific format of my choice.
Our ELK stack's Filebeat daemon is configured to recognise only those Kubernetes pod logs that match this pattern - appender.console.layout.pattern = %d{ISO8601} - %-5level: %msg%n
This is done to keep track of all incoming requests and some attributes of the responses. Generally the msg part in the above pattern contains HTTP requests and responses. Now I have built a new microservice in Spring Boot that has no HTTP interactions; it consumes messages from Kafka and processes them, so the logs are mostly application log statements and exceptions.
If I follow the above pattern, my exceptions will be logged as plain strings and I cannot index the logs or filter on any keys in Kibana. To solve this, I need to log the msg part as JSON, just like Log4j2's JSON layout.
I tried putting the following in the log4j2.properties file. I get a nice JSON object for each log statement, but Filebeat won't pick it up since it is configured to accept only logs in the previously specified format.
log4j2.appender.console.json.type = JsonTemplateLayout
log4j2.appender.console.json.eventTemplateUri = classpath:EcsLayout.json
Could anyone help me arrive at a solution where I log in the accepted format, with the msg part being a JSON object that looks like the following?
{
"#timestamp": "2017-05-25T19:56:23.370Z",
"ecs.version": "1.2.0",
"log.level": "ERROR",
"message": "Hello, error!",
"process.thread.name": "main",
"log.logger": "org.apache.logging.log4j.JsonTemplateLayoutDemo",
"error.type": "java.lang.RuntimeException",
"error.message": "test",
"error.stack_trace": "java.lang.RuntimeException: test\n\tat org.apache.logging.log4j.JsonTemplateLayoutDemo.main(JsonTemplateLayoutDemo.java:11)\n"
}
In essence, my log statement should be
2022-11-23T15:50:05,802 - ERROR : {"#timestamp":"2017-05-25T19:56:23.370Z","ecs.version":"1.2.0","log.level":"ERROR","message":"Hello, error!","process.thread.name":"main","log.logger":"org.apache.logging.log4j.JsonTemplateLayoutDemo","error.type":"java.lang.RuntimeException","error.message":"test","error.stack_trace":"java.lang.RuntimeException: test\n\tat org.apache.logging.log4j.JsonTemplateLayoutDemo.main(JsonTemplateLayoutDemo.java:11)\n"}
I tried using PatternLayout and JsonLayout, but what I need is the custom hybrid layout described above.
You can use the logback-classic dependency for that (note that this switches the logging backend from Log4j2 to Logback). First add this dependency:
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
</dependency>
Then add this appender and logger to logback.xml:
<appender name="json" class="ch.qos.logback.core.ConsoleAppender">
<layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
<jsonFormatter
class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
<prettyPrint>true</prettyPrint>
</jsonFormatter>
<timestampFormat>yyyy-MM-dd' 'HH:mm:ss.SSS</timestampFormat>
</layout>
</appender>
<logger name="jsonLogger" level="TRACE">
<appender-ref ref="json" />
</logger>
After that, you can call it like this:
Logger logger = LoggerFactory.getLogger("jsonLogger");
logger.info("test message");
I used PatternLayout with the following pattern in log4j2.properties:
appender.console.layout.pattern = %d{ISO8601} - %level: {"timestamp":"%d{ISO8601}","thread.id":"%T","thread.name":"%t","log.level":"%-5level","log.logger":"%fqcn","message":"%m","error.class_name":"%throwable{short.className}","error.method_name":"%throwable{short.methodName}","error.file_name":"%throwable{short.fileName}","error.line_number":"%throwable{short.lineNumber}","error.message":"%throwable{short.message}","error.localized_message":"%throwable{short.localizedMessage}"}%n
One caveat: if the message or exception text itself contains quotes or newlines, the resulting msg part will not be valid JSON; escaping the message (for example with Log4j2's %enc{%m}{JSON} converter) avoids that.
Reference: https://logging.apache.org/log4j/2.x/manual/layouts.html#PatternLayout

Limit number of emails per time from Log4j2 SMTPAppender

I use Apache Log4j2 and its SMTPAppender in an application. It's configured to send email notifications for events of level ERROR or above. Usually this works great.
But recently I had a batch processing situation in which thousands of ERROR lines were logged in a time interval of 5 minutes. My inbox was flooded with thousands of emails and our mail server blacklisted the affected application server...
To avoid such a mishap: Can we apply a maximum limit to the number of emails sent per time interval?
E.g. I'd like SMTPAppender to not send more than 20 emails per hour. If this limit is exceeded, further ERROR/FATAL lines should be aggregated into a single email which is sent as soon as the 20-per-hour limit allows another one.
Is there a Log4j2-standard way to achieve that? How did you solve this task in your apps using Log4j2?
You can use BurstFilter. These are the parameters (from the documentation):
level (String) - Level of messages to be filtered. Anything at or below this level will be filtered out if maxBurst has been exceeded. The default is WARN, meaning any messages higher than WARN will be logged regardless of the size of a burst.
rate (float) - The average number of events per second to allow.
maxBurst (integer) - The maximum number of events that can occur before events are filtered for exceeding the average rate. The default is 10 times the rate.
onMatch (String) - Action to take when the filter matches. May be ACCEPT, DENY or NEUTRAL. The default value is NEUTRAL.
onMismatch (String) - Action to take when the filter does not match. May be ACCEPT, DENY or NEUTRAL. The default value is DENY.
<Appenders>
<SMTP> <!-- parameters omitted for brevity -->
<BurstFilter level="ERROR" rate="16" maxBurst="100"/>
</SMTP>
</Appenders>
Note that rate is measured in events per second, so the example above allows a burst of up to 100 ERROR events and roughly 16 per second on average afterwards; for something closer to the 20-per-hour limit from the question you would need a much smaller, fractional rate (around 0.006).

Temporarily increase log4j2 logger level in multi-threaded service

Here is a long question for you Log4j2 gurus.
I have a service that:
has very strict performance requirements
is instrumented with a lot of logging calls using log4j2.
A typical call is gated, like:
if ( LOG.isInfoEnabled() ) {
LOG.info("everything's fine");
}
Because of the number of log messages and the performance needs, the service will generally run with logging set to WARN (i.e., not many messages).
However, I have been asked to build in a parameter to the service call that, if given, will cause it to:
Temporarily increase the logging level to whatever was requested in the parameter (e.g., INFO or TRACE)
Add a WriterAppender to capture the logging in a PrintWriter.
Append the PrintWriter log data to the request response.
It seems clear, due to the gating I put around each logging call, that I need to actually increase the logging level temporarily, like this:
LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
Configuration cfg = ctx.getConfiguration();
LoggerConfig loggerCfg = cfg.getLoggerConfig("com.mycompany.scm");
loggerCfg.setLevel(logLevel);
.. other code to add `WriterAppender` ...
ctx.updateLoggers();
But I have an immediate problem with that: it causes the logging to ALSO go to the log file of the service. That might not be the end of the world, but I'd like to avoid it, if possible.
I avoided it by having the default logging go through appenders that filter by level, so that even if the logging level is raised, no messages more detailed than wanted are written to the default log file (like this, from my .properties file):
appenders=scm_warn, scm_info
appender.scm_warn.type = Console
appender.scm_warn.name = SCM_WARN
appender.scm_warn.layout.type = PatternLayout
appender.scm_warn.layout.pattern = %d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
appender.scm_warn.filter.threshold.type = ThresholdFilter
appender.scm_warn.filter.threshold.level = warn
appender.scm_info.type = Console
appender.scm_info.name = SCM_INFO
appender.scm_info.layout.type = PatternLayout
appender.scm_info.layout.pattern = %d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
appender.scm_info.filter.threshold.type = ThresholdFilter
appender.scm_info.filter.threshold.level = info
loggers = coreConfigurator
logger.coreConfigurator.name = com.mycompany.scm
logger.coreConfigurator.level = warn
# do not let configurator log messages get processed by the client application's parent or root logger
logger.coreConfigurator.additivity = false
logger.coreConfigurator.appenderRefs = core
logger.coreConfigurator.appenderRef.core.ref = SCM_WARN
... that way, even if the logging level gets increased, the extra messages will not go to the main log file (I only want them to go to my PrintWriter).
And now, the question!
How can I temporarily increase the log level (like I try to do in the code above) for the current thread only?
If there are three (3) simultaneous calls to the service, I...
... want each added Appender to only write log messages generated by the thread that created the Appender.
... want each added Appender to be removed after the request that added it finishes.
... want the logging level to get reset back to what it was, as long as there are no other requests with this logging parameter turned on still in process.
Ideally, I think it sounds like I want each thread to have a completely separate logging context. Is that possible? Any thoughts on how to do all this?
You could potentially use a custom Context Selector to have a different context per thread, but that would probably cause issues when multiple threads want to write to the same log file, so it is likely not a viable option.
The alternative is to write a custom Appender that uses a ThreadLocal to store a StringWriter. If a StringWriter has not been established for the thread, the appender will skip logging. This custom Appender should be added in the Log4j config file, so it's always there and receiving log entries.
That way you enable logging for a particular thread by creating and assigning a StringWriter to the ThreadLocal, running the code, then clearing the ThreadLocal and getting the logged information from the StringWriter. Since there initially is no StringWriter for any thread, the appender will do nothing, so it shouldn't affect performance in any noticeable way.
You'd still have to do the level-escalation you're already doing, with filters on the other appenders.
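Here is a minimal, untested sketch of such an appender. The class name ThreadCaptureAppender and the startCapture/stopCapture helpers are made up for illustration, and it assumes Log4j 2.11+ (for the AbstractAppender constructor used); remember to add its package to the configuration's packages attribute so the plugin is discovered.
import java.io.Serializable;
import java.io.StringWriter;
import java.nio.charset.StandardCharsets;

import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Core;
import org.apache.logging.log4j.core.Layout;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.Property;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
import org.apache.logging.log4j.core.layout.PatternLayout;

@Plugin(name = "ThreadCaptureAppender", category = Core.CATEGORY_NAME,
        elementType = Appender.ELEMENT_TYPE, printObject = true)
public final class ThreadCaptureAppender extends AbstractAppender {

    // One capture buffer per thread; null means "not capturing".
    private static final ThreadLocal<StringWriter> CAPTURE = new ThreadLocal<>();

    private ThreadCaptureAppender(String name, Layout<? extends Serializable> layout) {
        super(name, null, layout, true, Property.EMPTY_ARRAY);
    }

    @PluginFactory
    public static ThreadCaptureAppender createAppender(
            @PluginAttribute("name") String name,
            @PluginElement("Layout") Layout<? extends Serializable> layout) {
        return new ThreadCaptureAppender(name,
                layout == null ? PatternLayout.createDefaultLayout() : layout);
    }

    // Application code calls these around the request it wants to capture.
    public static void startCapture() {
        CAPTURE.set(new StringWriter());
    }

    public static String stopCapture() {
        StringWriter sw = CAPTURE.get();
        CAPTURE.remove();
        return sw == null ? "" : sw.toString();
    }

    @Override
    public void append(LogEvent event) {
        StringWriter sw = CAPTURE.get();
        if (sw == null) {
            return; // no writer registered for this thread, do nothing
        }
        sw.write(new String(getLayout().toByteArray(event), StandardCharsets.UTF_8));
    }
}
Application code would then call ThreadCaptureAppender.startCapture() at the beginning of the instrumented request and append the result of stopCapture() to the response, while the level escalation still happens as described above.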
You could:
access the class logger and change the log level (as you suggested) - note that this snippet compiles against the log4j 1.x API; a Log4j2 equivalent using Configurator is sketched after this list:
LogManager.getLogger(Class.forName("your.class.package")).setLevel(Level.FATAL);
use a different logger; just configure two different loggers.
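For completeness, here is a minimal Log4j2 sketch of the first option using Configurator (the logger name com.mycompany.scm and the WARN default are taken from the question's configuration). Note that this changes the level globally, not per thread, so the ThreadLocal appender described above is still needed for per-thread capture.
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.config.Configurator;

public final class LogLevelSwitch {

    // Raise the level for the duration of a diagnostic request ...
    public static void raise(Level level) {
        Configurator.setLevel("com.mycompany.scm", level);
    }

    // ... and restore the configured default afterwards.
    public static void restore() {
        Configurator.setLevel("com.mycompany.scm", Level.WARN);
    }
}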

Is it possible to buffer output to Log4j Mail Appender?

In many service applications where log4j is used in combination with a mail appender like this:
log4j.rootLogger=ERROR, MAIL
log4j.appender.MAIL=org.apache.log4j.net.SMTPAppender
log4j.appender.MAIL.BufferSize=1
log4j.appender.MAIL.SMTPHost=smtp.example.com
log4j.appender.MAIL.From=noreply@example.com
log4j.appender.MAIL.To=developer@example.com
log4j.appender.MAIL.Subject=Exception
log4j.appender.MAIL.layout=org.apache.log4j.PatternLayout
log4j.appender.MAIL.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %5r %-5p [%t] %c{2} - %m%n
.. the probability that several exceptions of the same type will occur, possibly continuing until the issue is resolved, is very high.
Is there a way of buffering output so that mails are only sent within given time intervals?
A nice solution would be to group exceptions by type and the class they occurred in, but simply appending to a local log and sending a rolled log at timed intervals would suffice. The idea is not to receive hundreds or thousands of error reports from a stressed system, but rather to receive the vital information in time and then not be bothered by the subsequent occurrences.
Is there a solution without implementing a TriggeringEventEvaluator?
You could try this one: https://github.com/reaktor/log4j-email-throttle
Simple, but works fine for buffering.

Which Appenders should be used in distributed system? How to configure them?

I am trying to add a logging component to a distributed system. It is written in AspectJ to avoid changing the current source code. I use the socket appender to send logs, but I'd like to try something more effective.
I've heard I should use JMSAppender and AsyncAppender, but I failed to configure them. Should I create a Receiver which gathers logs and passes them to a database and to a GUI (I use Chainsaw)?
I tried to follow tutorial 1 and tutorial 2, but they aren't clear enough.
Edit:
In a small demo I've prepared, I send 6 logs per request (simulating 3 components):
[2012-08-08 15:40:28,957] [request1344433228957] [Component_A] [start]
[2012-08-08 15:40:32,050] [request1344433228957] [Component_B] [start]
[2012-08-08 15:40:32,113] [request1344433228957] [Component_C] [start]
[2012-08-08 15:40:32,113] [request1344433228957] [Component_C] [end - throwing]
[2012-08-08 15:40:32,144] [request1344433228957] [Component_B] [end]
[2012-08-08 15:40:32,175] [request1344433228957] [Component_A] [end]
I'm using the SocketAppender, so my log4j.properties is:
log4j.rootLogger=DEBUG, server
log4j.appender.server=org.apache.log4j.net.SocketAppender
log4j.appender.server.Port=4712
log4j.appender.server.RemoteHost=localhost
log4j.appender.server.ReconnectionDelay=1000
so I run
>java -classpath log4j-1.2.17.jar org.apache.log4j.net.SimpleSocketServer 4712 log4j-server.properties
with this configuration:
log4j.rootLogger=DEBUG, CA, FA
#
log4j.appender.CA=org.apache.log4j.ConsoleAppender
log4j.appender.CA.layout=org.apache.log4j.PatternLayout
log4j.appender.CA.layout.ConversionPattern=[%d] [%t] [%c] [%m]%n
#
log4j.appender.FA=org.apache.log4j.FileAppender
log4j.appender.FA.File=report.log
log4j.appender.FA.layout=org.apache.log4j.PatternLayout
log4j.appender.FA.layout.ConversionPattern=[%d] [%t] [%c] [%m]%n
Then I send my logs from file to Chainsaw:
It is absolutely basic, but I want to learn how to do it better. First of all, I'd like to send the logs asynchronously. Then create a very simple Receiver which, for example, can pass the logs to a file.
I tried to follow the tutorials listed above, but I failed. So the question is: could you provide some example configuration? Examples of the Receiver.java and log4j.properties files?
I would use NFS or CDFS and mount a drive on all the machines. Have each application instance write to a different file. You will be able to find all the logs in one directory (or drive) no matter how many machines you use.
I wouldn't use NFS or CDFS over a global WAN with high latency, e.g. > 50 ms round trip. In that case I have used JMS (but I didn't use log4j).
My two cents: whatever you do, make sure you use an asynchronous mechanism to deliver your logs to the receiver, otherwise it will eventually stall your apps. Another point: to deliver logs reliably you should consider a failover mechanism built into the appender itself - receivers may go offline for a short or long time, and if you care about the logs, failover is definitely required. We have built a system similar to the one you describe (sorry for the ad), but if you like you can use our appender (look in the downloads); it's free and comes with the sources. There is also a video tutorial. It has failover and a flexible asynchronous mechanism, plus a backup fallback.
How many appenders should you use? One appender per JVM will do just fine. Config files should probably be per JVM; I'm not sure how you intend to implement the receiver, but in any case the appenders need to find your receiver, which is usually at least a host/port pair. Regarding the database, my experience with RDBMS is very sour (we are moving to NoSQL), but if you don't go above a couple of hundred million records, most commercial databases will do with some effort. Not a simple task, I must say - it took us a couple of years to build the commercial-quality system you have just drawn with a few skinny rectangles :)
Finally I've found out how to configure it. I put 2 files into the src folder.
jndi.properties
topic.logTopic=logTopic
and log4j-jms.properties
log4j.rootLogger=INFO, stdout, jms
## Be sure that ActiveMQ messages are not logged to 'jms' appender
log4j.logger.org.apache.activemq=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=
## Configure 'jms' appender. You'll also need jndi.properties file in order to make it work
log4j.appender.jms=org.apache.log4j.net.JMSAppender
log4j.appender.jms.InitialContextFactoryName=org.apache.activemq.jndi.ActiveMQInitialContextFactory
log4j.appender.jms.ProviderURL=tcp://localhost:61616
log4j.appender.jms.TopicBindingName=logTopic
log4j.appender.jms.TopicConnectionFactoryBindingName=ConnectionFactory
Then I run my program with VM argument
-Dlog4j.configuration=log4j-jms.properties
and receive logs in class Receiver.java
import java.io.PrintWriter;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;

import javax.jms.Connection;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageListener;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.command.ActiveMQObjectMessage;
import org.apache.log4j.spi.LoggingEvent;

public class Receiver implements MessageListener {

    PrintWriter pw = new PrintWriter("result.log");
    Connection conn;
    Session sess;
    MessageConsumer consumer;

    public Receiver() throws Exception {
        ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        conn = factory.createConnection();
        sess = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
        conn.start();
        consumer = sess.createConsumer(sess.createTopic("logTopic"));
        consumer.setMessageListener(this);
    }

    public static void main(String[] args) throws Exception {
        new Receiver();
    }

    public void onMessage(Message message) {
        try {
            // The JMSAppender publishes each LoggingEvent as a serialized object message.
            LoggingEvent event = (LoggingEvent) ((ActiveMQObjectMessage) message).getObject();
            DateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss,SSS");
            String nowAsString = df.format(new Date(event.getTimeStamp()));
            pw.println("[" + nowAsString + "]" +
                    " [" + event.getThreadName() + "]" +
                    " [" + event.getLoggerName() + "]" +
                    " [" + event.getMessage() + "]");
            pw.flush();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I'd recommend syslog and the built-in syslog appender. Use TCP for reliable logging (plus the Async appender, maybe) or UDP for fire-and-forget logging.
I have an rsyslog config if you need it.
