I am using WebLogic and Log4j for my Struts application. Since the Action class is not thread-safe, I assume Action instances are cached by WebLogic and re-used for every HTTP request.
In this case, if multiple clients access the same Action class, I assume the events logged by Log4j from the different requests will be interleaved in the output.
The log information would not be sequential and would be very difficult to interpret.
How do I resolve such issues?
First, I would like to correct some terminology in your question. Struts is an MVC framework; WebLogic is a Java EE container. The Action functionality and life cycle do not depend on the container; they are provided by Struts alone.
You are right: since multiple requests can be processed concurrently, your log will contain a mixture of log messages produced by different requests.
The typical solution is to include the thread name in each log entry (Log4j supports this via the layout pattern) and then use grep on Unix or find on Windows to filter out only the relevant messages.
Here is an example of layout configuration that causes log4j to print thread name:
<layout class="org.apache.log4j.EnhancedPatternLayout">
<param name="ConversionPattern" value="%-5p %-23d{ISO8601}{GMT} [%t] %x: %c{1}(%C{1}.%M:%L) - %m%n"/>
</layout>
[%t] does the job.
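Since the pattern above also includes %x (the Log4j NDC), you can additionally push a per-request marker so that entries from the same pooled thread but different requests stay distinguishable. A minimal sketch, assuming Struts 1 and Log4j 1.x; the class name and the use of the session id as the marker are only illustrations:

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.log4j.Logger;
import org.apache.log4j.NDC;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

public class MyAction extends Action {
    private static final Logger log = Logger.getLogger(MyAction.class);

    public ActionForward execute(ActionMapping mapping, ActionForm form,
            HttpServletRequest request, HttpServletResponse response) throws Exception {
        // Push a per-request marker; it is printed wherever %x appears in the layout.
        NDC.push(request.getSession().getId());
        try {
            log.debug("processing request");
            return mapping.findForward("success");
        } finally {
            NDC.pop();
            NDC.remove(); // container threads are pooled, so always clean up the NDC
        }
    }
}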
I have the following situation:
- two HA singleton apps on JBoss EAP
- each one with its own log4j.properties file:
App1:
log4j.rootLogger=DEBUG, App1
log4j.appender.App1=org.apache.log4j.RollingFileAppender
log4j.appender.App1.append=true
log4j.appender.App1.File=${jboss.server.log.dir}/App1.log
log4j.appender.App1.MaxFileSize=10MB
log4j.appender.App1.MaxBackupIndex=10
log4j.appender.App1.threshold=TRACE
log4j.appender.App1.layout=org.apache.log4j.PatternLayout
log4j.appender.App1.layout.ConversionPattern=[%-5p] [%t] %d{yyyy MMM dd HH:mm:ss,SSS} (%C:%F:%L) - %m%n
log4j.logger.org.hibernate=DEBUG, App1
log4j.logger.com.arjuna=DEBUG, App1
log4j.logger.com.sun=ERROR,App1
log4j.logger.com.sun.xml.ws.transport.http.client.HttpTransportPipe=DEBUG,App1
App2:
log4j.rootLogger=DEBUG, App2
log4j.appender.App2=org.apache.log4j.RollingFileAppender
log4j.appender.App2.append=true
log4j.appender.App2.File=${jboss.server.log.dir}/App2.log
log4j.appender.App2.MaxFileSize=10MB
log4j.appender.App2.MaxBackupIndex=10
log4j.appender.App2.threshold=TRACE
log4j.appender.App2.layout=org.apache.log4j.PatternLayout
log4j.appender.App2.layout.ConversionPattern=[%-5p] [%t] %d{yyyy MMM dd HH:mm:ss,SSS} (%C:%F:%L) - %m%n
log4j.logger.org.hibernate=DEBUG, App2
log4j.logger.com.arjuna=DEBUG, App2
log4j.logger.com.sun=ERROR,App2
log4j.logger.com.sun.xml.ws.transport.http.client.HttpTransportPipe=DEBUG,App2
Both are installed and running, but with this issue:
All Hibernate logs from App2 are written to App1.log.
Also, all com.sun.xml... logs from App2 are written to App1.log.
And none (neither Hibernate nor com.sun.xml) are written to App2.log.
Logging for both org.hibernate and com.sun is managed inside the applications; at server level they are set to ERROR, so nothing is written to server.log.
Also, if I disable App1, those two categories from App2 are still logged into App1's log file.
Clearly there is something I am missing or simply don't know.
My problem is that I need log4j to write logs only where I tell it to.
Can anyone advise me? And I really don't like the idea of using JBoss logging settings (custom appenders in the console or in standalone-ha.xml).
I just want to use log4j…
Thanks to all.
Hibernate is shipped with JBoss (under modules\system\layers\base\org\hibernate) and internally uses org.jboss.logging, which is also shipped with JBoss. Hence my guess is that logging initialization is done only once for the whole JVM, since org.jboss.logging is loaded through the module classloader rather than through your application classloaders, and static fields are classloader-specific.
I would recommend (although your requirement sounds strange to me) shipping Hibernate and log4j inside your applications (i.e. in the lib/ folder of each application). Then Hibernate is loaded twice: once by the classloader of the first application and a second time by the classloader of the second application. This way the static fields exist twice as well.
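To illustrate the classloader point, here is a small standalone sketch (not JBoss-specific; the jar path is a placeholder you would have to adjust) showing that static state exists once per classloader rather than once per JVM:

import java.net.URL;
import java.net.URLClassLoader;

public class ClassLoaderDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point it at any log4j jar on disk.
        URL log4jJar = new URL("file:/path/to/log4j-1.2.17.jar");
        ClassLoader cl1 = new URLClassLoader(new URL[] { log4jJar }, null);
        ClassLoader cl2 = new URLClassLoader(new URL[] { log4jJar }, null);
        Class<?> lm1 = cl1.loadClass("org.apache.log4j.LogManager");
        Class<?> lm2 = cl2.loadClass("org.apache.log4j.LogManager");
        // Two classloaders yield two independent copies of the class, and therefore
        // two independent sets of static fields, i.e. two independent configurations.
        System.out.println(lm1 == lm2); // prints false
    }
}

Bundling the libraries inside each deployment gives you the same kind of separation inside JBoss.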
I know there have been a lot of questions related to this, but I couldn't find one that matches the scenario I'm looking at, so here's the question.
Current logging setup: the logging is coded against SLF4J with the Log4j 1.2 binding, and a DailyRollingFileAppender is used.
The program: a multi-threaded backend Java program that processes data from a table and calls the relevant web services.
A new requirement came in to have the log file name based on a certain piece of data, let's call it match_code. With this, whenever a thread is processing, say, MatchA, the log file that thread uses should be MatchA.log.
I googled for a while and understand that I will need to programmatically adjust the Log4j configuration when the process starts; the question is how I should change the log file name setting without affecting other settings such as the pattern.
I'm open to switching to Log4j 2.x if that can solve my problem, but so far I've had no luck finding samples for this.
Any suggestion is appreciated. Thank you.
UPDATE on what I've tried
I tried using System.setProperty to dynamically set the log file. Here's the properties setting:
log4j.appender.file.File=/log/${logfile.name}.log
In the main class, I added these lines before anything else:
static {
    System.setProperty("logfile.name", "output");
}

private static Logger logger = LoggerFactory.getLogger(Engine.class);
I added this right after the process finds data to be processed:
System.setProperty("logfile.name",match_code+"_output");
where match_code is a value from the database, such as 'MatchA'.
The result: the main class DID get its log named output.log. However, when I put in data to test, the logging still goes to output.log and no new log file is created based on the data.
I'm not sure I understand your problem: you want the same log messages to go to different log files, depending on the data you are processing?
If you use Logback, you can do it with a combination of MDC + SiftingAppender. For example, in your code you can do:
(in code)
MDC.put("match_code", "MatchA");
logger.debug("whatever message"); // just log it normally
(in logback.xml)
<appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator>
<key>match_code</key>
<defaultValue>unknown</defaultValue>
</discriminator>
<sift>
<appender name="FILE-${match_code}" class="ch.qos.logback.core.FileAppender">
<file>${match_code}.log</file>
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>%d [%thread] %level %mdc %logger{35} - %msg%n</pattern>
</layout>
</appender>
</sift>
</appender>
Please note that here I am using Logback as the logging backend. There is no official sifting appender for Log4j 1. I seem to remember something similar in Log4j 2. I hope this can serve as a starting point for your search if you really insist on using Log4j.
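For completeness: in Log4j 2 the analogous mechanism is the RoutingAppender keyed on a ThreadContext (MDC) entry such as ${ctx:match_code} in log4j2.xml. The Java side would look roughly like the sketch below; MatchProcessor and the key name are illustrative, and the matching Routing appender still has to be configured separately:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class MatchProcessor {
    private static final Logger logger = LogManager.getLogger(MatchProcessor.class);

    public void process(String matchCode) {
        // Tag the current thread with the match code before processing, so a
        // RoutingAppender configured on ${ctx:match_code} can pick the target file.
        ThreadContext.put("match_code", matchCode); // e.g. "MatchA"
        try {
            logger.debug("processing {}", matchCode); // would be routed to MatchA.log
        } finally {
            ThreadContext.remove("match_code"); // threads are reused, so always clean up
        }
    }
}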
The settings I have specified in log4j are as follows:
log4j.appender.F2=org.apache.log4j.RollingFileAppender
log4j.appender.F2.File=E\:/Documentum/logs/dflogger.log
log4j.appender.F2.MaxFileSize=10MB
log4j.appender.F2.layout=org.apache.log4j.PatternLayout
log4j.appender.F2.layout.ConversionPattern=%d{ABSOLUTE} %5p [%t] %c - %m%n
log4j.logger.com.myorg.mytbo.tbo=DEBUG,F2
I am trying to log information from inside a TBO (defined in the package com.myorg.mytbo.tbo), which is essentially a JAR deployed on a JBoss application server inside EMC Documentum Content Server. The Documentum-specific details shouldn't be a concern, since it still uses org.apache.log4j. When the TBO runs, it creates dflogger.log but does not append any information to it.
I suspect several possible reasons, for example:
The file is being created in read-only mode.
Or maybe there is some issue with logging for com.myorg.mytbo.tbo; there is a somewhat similar thread about specifying the PARENT_FIRST option in WebSphere. But I am using JBoss, so if the issue is similar, can anyone tell me whether I need to modify any JBoss settings?
I'm looking for the simplest way to trace the execution time of SQL queries generated by Hibernate.
Unfortunately it cannot be done in the traditional way, just by setting show_sql or the Hibernate logger, because the monitored application is multi-threaded in the production environment and the SQL timing should be traced only for one service, the one that is most problematic.
'Service' means some component running within a Spring application. In my case it is safe to say that it corresponds to a thread: the thread does not change during the invocation. The service implementation is a Java method, and this method calls other methods, components, etc., everything in one thread. It is possible for me to change the source of one method and deploy it, but I cannot release the whole application.
Unfortunately, AspectJ cannot be used as-is, because I cannot change the whole application, recompile it, or plug anything into the JVM.
Also unfortunately, the DB administrators cannot turn on SQL query tracing; they don't know how to do it.
Please help: how can I trace Hibernate's SQL execution time without digging through the whole application? What is the simplest way?
Facts: Hibernate 3.x, Spring 2.x, Java 1.5
Here is how I would do it, assuming you're using Logback as your logging framework.
Make sure you have scanning enabled in your Logback configuration:
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false" scan="true" scanPeriod="30 seconds" >
Make sure your file appender includes the thread name in the output (%t):
<encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
<pattern>%d{yyyy-MM-dd HH:mm:ss} %-5p %C{1} %t - %m%n</pattern>
</encoder>
Start with SQL logging turned off:
<logger name="org.hibernate.SQL" level="OFF">
<appender-ref ref="FILE_APPENDER"/>
</logger>
<logger name="sql-log" level="OFF">
<appender-ref ref="FILE_APPENDER"/>
</logger>
Once your application is up and running, and you're ready to execute your test, edit the logback.xml file of the deployed application, and change those levels to 'DEBUG'. Then execute your tests. When tests are done, set those levels back to 'OFF'.
Look through the log output, and identify the thread name of interest, then grep for that thread name:
grep "thread-1-pool-7" debug.log > sqldebug.log
A bit cumbersome, but it will work.
You are not very specific about the filtering criteria: do you want to filter SQL statements by thread/HTTP session, or those issued from a given service ("sql times tracing should be done only for one service")?
Nevertheless, everything can be done at the logging framework level. First you need to enable logging of all queries, and then filter out the non-interesting ones. I am assuming you are using Logback (BTW, Spring 2.x and Java 1.5 are becoming obsolete...):
Per thread
Implement a Logback filter and discard logs coming from threads you are not interested in. Alternatively, use a SiftingAppender with the thread id as a key, so that all logs from a given thread are dispatched to a separate file.
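A minimal sketch of such a filter, assuming Logback classic; the hard-coded thread name is only an example, and the filter would be attached to the relevant appender via a <filter class="..."/> element:

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;

public class SingleThreadFilter extends Filter<ILoggingEvent> {
    public FilterReply decide(ILoggingEvent event) {
        // Keep events logged from the thread of interest, drop everything else.
        return "pool-1-thread-7".equals(event.getThreadName())
                ? FilterReply.NEUTRAL
                : FilterReply.DENY;
    }
}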
Per HTTP session
This is a bit tricky because you need access to the HTTP session/session id at the logging framework level. The easy way is to use MDC (see example: Logging user activity in web app). Once you have the session id, you can filter in a similar way to the Per thread section.
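As a rough illustration of that MDC approach (the filter and key names are mine, not from the linked post): a servlet filter can put the session id into the MDC at the start of each request, after which it can be printed with %X{sessionId} in the pattern and filtered just like the thread name above:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import org.slf4j.MDC;

public class SessionIdMdcFilter implements Filter {

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        // Expose the HTTP session id to the logging framework for this request.
        MDC.put("sessionId", ((HttpServletRequest) req).getSession(true).getId());
        try {
            chain.doFilter(req, res);
        } finally {
            MDC.remove("sessionId"); // container threads are pooled, so clean up
        }
    }

    public void init(FilterConfig filterConfig) { }

    public void destroy() { }
}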
Per service
It's not obvious what you mean. Do you want to log only SQL queries issued from a given class, or those from a class plus all the methods and classes it calls? In both cases you need to examine the call stack while filtering, which isn't very efficient. AspectJ also has the within() pointcut designator, but I guess that is too heavyweight for you.
If this is what you want to achieve, please clarify your question; I have some other ideas.
I created a default project using Google App Engine for Java, and when I deploy my application on the Google server I get the following warning message for the first request.
log4j:WARN No appenders could be found for logger (DataNucleus.Connection).
log4j:WARN Please initialize the log4j system properly.
Logging is working fine, but some requests are delayed by this problem.
How can I configure it correctly?
What makes you sure that this is delaying some of your requests?
GAE does not function like a standard servlet container. Behind the scenes it unloads any webapps that are idle, and then loads them in again only when it gets a new request for that webapp. This is basically equivalent to doing a complete redeploy of your application, and it doesn't even begin until after GAE has received the request. Thus any request that triggers a load operation will be noticeably delayed compared to subsequent requests.
But there are a whole lot of things going on that are contributing to the delay, and I think an uninitialized log4j setup is not making much of an actual difference.
This message means that no log4j configuration was found.
You have to provide a configuration for log4j, for example in a file named log4j.properties on the application classpath.
An example configuration would be:
log4j.rootLogger=WARN, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %t %c{1}:%M:%L - %m%n
If your project is Maven-based, then the best place to put log4j.properties is src/main/resources.