Understanding logging - java

I have a question about java.util.logging. It's mostly about understanding it, because I already got it to work, but I'm still not quite sure why it works.
So I have a big and very old application with many threads and without any synchronization (not invented by me, I'm just the poor maintainer). I ported the proprietary logging to java.util.logging. At startup I read a configuration file:
String logname = Ini.getProperty("BoxLog", "boxlog.properties");
logManager = LogManager.getLogManager();
try {
    logManager.readConfiguration(new FileInputStream(logname));
    LOGGER.info("Read log configuration file " + logname);
} catch (SecurityException e) { ... }
  catch (IOException e) { ... }   // readConfiguration / FileInputStream also throw IOException
I stepped through it with a debugger, and everything looks fine here: no exception is thrown and the correct file is read. But the format for the log handler is not updated, so the log line here is output in the default format given in the system-wide configuration.
Because of the many threads, there are a few log outputs in another object that use the default configuration; they are written before my configuration is read. After I removed those log statements, everything worked.
The documentation for readConfiguration() says: "Reinitialize the logging properties and reread the logging configuration." So I assumed that after those few lines were logged and the correct configuration was then read, every further log output would use the configured format. But it did not: every further log output in any class was still in the default format. It seems to me that, when the configuration is read, already instantiated loggers and their handlers are not reinitialized.
So this is the question: how does this really work? Did I miss something, or misunderstand something? Or is there a bug in the documentation? Or what else?

The documentation says for readConfiguration(): Reinitialize the logging properties and reread the logging configuration. So I assumed that after these few lines of logging happened, and then after the correct configuration was read, every further logging would be in the given format.
You are running into JDK-8033661 readConfiguration does not cleanly reinitialize the logging system and JDK-5035854 LogManager.readConfiguration does not properly modify existing loggers.
This is resolved in JDK 9 by the addition of LogManager.updateConfiguration(InputStream, Function<String,BiFunction<String,String,String>>).
This is covered in detail in Use of readConfiguration method in logging activities.
If you don't have access to JDK 9, you can use the java.util.logging.config.class option along with the java.util.logging.config.file option to create custom code that works around the limitations of the bugs listed above.
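The idea is that you start the JVM with -Djava.util.logging.config.class=<your class>, and the LogManager instantiates that class during logging initialization, before the application's own loggers exist, so the configuration it installs applies everywhere. A minimal sketch of such a class, where the class name and the system property used to locate the file are made up for illustration:

import java.io.FileInputStream;
import java.io.IOException;
import java.util.logging.LogManager;

// Sketch only: run the JVM with -Djava.util.logging.config.class=BoxLogConfig
public class BoxLogConfig {
    public BoxLogConfig() throws IOException {
        // "boxlog.config" is an illustrative property name for locating the file.
        String logname = System.getProperty("boxlog.config", "boxlog.properties");
        FileInputStream in = new FileInputStream(logname);
        try {
            // Executed before application loggers are created, so every logger
            // and handler picks up this configuration instead of the default one.
            LogManager.getLogManager().readConfiguration(in);
        } finally {
            in.close();
        }
    }
}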
Every further logging in any class was still in the default format
If you are using the SimpleFormatter then that is covered in SimpleFormatter ignoring the java.util.logging.SimpleFormatter.format property.
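One workaround that is often suggested for that is to set the format as a system property before the first SimpleFormatter is created, either with -Djava.util.logging.SimpleFormatter.format=... on the command line or very early in the code. A sketch, where the pattern itself is only an example:

public class LogFormatSetup {
    public static void init() {
        // Must run before any SimpleFormatter instance is constructed,
        // because the format string is read when the formatter is created.
        System.setProperty("java.util.logging.SimpleFormatter.format",
                "%1$tF %1$tT %4$s %3$s - %5$s%6$s%n");
    }
}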

Related

WebSphere Liberty Profile Verbose Class Logging - where is its output?

I'm getting an error in my program about a class not being found. I have double (and triple) checked and the class is definitely in my jar - it's finding other classes from the same jar just fine.
To help with debugging this, I want to turn on verbose class loading logging as described here:
http://java.dzone.com/articles/how-use-verbose-options-java
That doesn't say how exactly to turn on this option if you're using WebSphere Liberty Profile, though, so I looked around some more and found this:
http://www-01.ibm.com/support/knowledgecenter/SSD28V_8.5.5/com.ibm.websphere.wlp.doc/ae/twlp_admin_customvars.html
This says that I need to place the line in ${server.config.dir}/jvm.options.
So I wrote a simple file which consists only of:
-verbose:class
And I saved that to wlp/usr/servers/defaultServer/jvm.options, which means the new file is in the same directory as my apps folder, my logs folder, and my server.xml.
I stopped my server and started it back up and looked in the logs directory. It generated the same logs as always, nothing new: console.log, messages.log, status.log, and trace.log. I checked all of these log files and none of them have anything like the output from my first link.
So I don't think I'm doing this properly. Here are the three points where I think I may have gone wrong:
Was my file too simple? Is there more that I need to put in it than just -verbose:class? Does that perhaps need to be nested in something? Are there more parameters that I must have? Prior to this I didn't have any jvm.options file at all, so I assume that it'll use default values for anything I'm not explicit about.
Did I put the file in the proper place? As far as I can tell from the documentation, I think I put it in the proper spot, but the docs are a bit less explicit than I would like.
Am I looking in the right place for the logs? What will the name of the log file be? Where will it be placed? I assumed it would be in the logs directory just like all of the other logs generated by WebSphere Liberty Profile, but maybe I'm incorrect?
While writing the third bullet for my question, I realized that console.log was actually a new file that didn't previously exist, and I hadn't actually checked what was in it. I just opened it up and lo and behold, it's exactly the class loading logs that I was looking for.
So to recap, here are the answers to my bullets:
You can have a file with nothing but -verbose:class
You save it to wlp/usr/servers/<server name>/jvm.options
The output is in wlp/usr/servers/<server name>/logs/console.log

Why would logback ever READ the log file it is writing to?

In a java application that uses logback, and runs under Windows 7, the sysinternals process monitor (www.sysinternals.com) shows that the java process is READING the application's log file. Why would this be?
Our app is having issues and logback came under scrutiny when this was discovered. We have since found that this is irrelevant to our issues, but I would still like to understand it.
I had thought that an appender would only append to the end of the log file, as its name implies, and am surprised and embarrassed to see this, especially after I insisted it couldn't possibly be true.
Can someone explain why logback would need to READ a logfile? I can categorically state that none of the application's code reads the file.

Best way to interact with application logs output from log4j (while also are being updated by application)

Maybe it is simpler than I think, but I am confused about the following:
I want to be able to present to a user (in a graphical interface) the logs produced by Log4j.
I could just read the files as they are and present them, but I was wondering whether there is a standard way to do it that also picks up any updates happening at the same time from other parts of the application that log concurrently.
There could be multiple log4j files, i.e. from a rolling appender.
The presentation should also work while no logging is happening,
i.e. as a view of the logs up to that point.
UPDATE:
I am constrained to Java 6
You can use Java 7's NIO.2 libraries to get notified when one of multiple files in a directory gets modified, and then reread and display it:
http://blogs.oracle.com/thejavatutorials/entry/watching_a_directory_for_changes
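A minimal sketch of that approach (the "logs" directory name is illustrative):

import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

public class LogDirWatcher {
    public static void main(String[] args) throws Exception {
        Path dir = Paths.get("logs");   // directory containing the log4j files
        WatchService watcher = FileSystems.getDefault().newWatchService();
        dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE,
                StandardWatchEventKinds.ENTRY_MODIFY);
        while (true) {
            WatchKey key = watcher.take();   // blocks until something changes
            for (WatchEvent<?> event : key.pollEvents()) {
                if (event.kind() == StandardWatchEventKinds.OVERFLOW) {
                    continue;
                }
                Path changed = dir.resolve((Path) event.context());
                System.out.println("Reread and display: " + changed);
            }
            if (!key.reset()) {
                break;   // directory is no longer accessible
            }
        }
    }
}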
Have you tried the following tools:
Chainsaw
Xpolog
Perhaps add a database appender (JDBCAppender) and present the log entries from that?
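If you go that route, a rough sketch of wiring up log4j 1.x's JDBCAppender programmatically could look like this; the driver, JDBC URL, credentials and table are all placeholders, and a properties-file configuration would work just as well:

import org.apache.log4j.Logger;
import org.apache.log4j.jdbc.JDBCAppender;

public class DbLoggingSetup {
    public static void install() {
        JDBCAppender db = new JDBCAppender();
        db.setDriver("org.h2.Driver");     // placeholder JDBC driver
        db.setURL("jdbc:h2:~/logdb");      // placeholder database
        db.setUser("sa");
        db.setPassword("");
        // The SQL is expanded with the usual PatternLayout conversion characters.
        db.setSql("INSERT INTO LOGS (logged_at, level, logger, message) "
                + "VALUES ('%d{yyyy-MM-dd HH:mm:ss}', '%p', '%c', '%m')");
        Logger.getRootLogger().addAppender(db);
    }
}

The GUI can then simply query the table, which sidesteps the rolling-file problem.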
From the official log4j documentation:
Is there a way to get log4j to automatically reload a configuration file if it changes?
Yes. Both the DOMConfigurator and the PropertyConfigurator support automatic reloading
through the configureAndWatch method. See the API documentation for more details.
PropertyConfigurator#configureAndWatch
DOMConfigurator#configureAndWatch
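For example, a minimal sketch where the file name and the 5-second check interval are illustrative:

import org.apache.log4j.PropertyConfigurator;

public class Log4jAutoReload {
    public static void init() {
        // Re-checks the file every 5 seconds and reloads it if it changed.
        PropertyConfigurator.configureAndWatch("log4j.properties", 5000L);
    }
}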
For an on-demand reload of the log4j config from a GUI, I would suggest exposing it via a servlet in your Java EE application, so that the whole file can be edited in a web page (a text area, maybe); once saved, you overwrite your existing log4j file and reload the log4j config.
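A rough sketch of such a servlet; the config path and form field name are illustrative, and it has no security or validation, which a real version would need:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.log4j.PropertyConfigurator;

public class Log4jConfigServlet extends HttpServlet {
    private static final String CONFIG_PATH = "/opt/myapp/log4j.properties"; // placeholder

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // Overwrite the config file with the text submitted from the web page ...
        String newConfig = req.getParameter("config");
        Writer out = new OutputStreamWriter(new FileOutputStream(CONFIG_PATH), "UTF-8");
        try {
            out.write(newConfig);
        } finally {
            out.close();
        }
        // ... then reload it so the running application picks up the changes.
        PropertyConfigurator.configure(CONFIG_PATH);
        resp.getWriter().println("log4j configuration reloaded");
    }
}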
Maybe you could think about more "OS-level" solution.
I don't know if you are using Windows or Linux, but on Linux there is the really nice command "tail".
So you could use ProcessBuilder to create an OS process that runs something like "tail -f yourLogFile.txt".
Then read the standard output of the returned Process (via its getInputStream()). Reading the stream blocks while waiting for new output from the process and unblocks as soon as output is available, giving you immediate feedback and the possibility to read the latest changes of the log file.
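A sketch of that idea (Linux only; the log file name is illustrative, and Process.destroy() is used to stop tail when done):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class TailFollower {
    public static void main(String[] args) throws Exception {
        Process tail = new ProcessBuilder("tail", "-f", "yourLogFile.txt")
                .redirectErrorStream(true)
                .start();
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(tail.getInputStream()));
        try {
            String line;
            // readLine() blocks until tail prints a new line.
            while ((line = reader.readLine()) != null) {
                System.out.println(line);   // hand the line to the UI instead
            }
        } finally {
            reader.close();
            tail.destroy();                 // terminates the tail process
        }
    }
}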
However, you might have problems shutting this process down from Java.
You should be able to send a SIGTERM signal to it if you know the process id. Or you could start a different process that looks up the id of the "tail" process and kills it via the "kill" command or something similar.
Also, I am not sure whether a similar tool is available on Windows, if that is your platform.
If you write your own simple appender and have your application include that appender in its log4j configuration, your appender will be called whenever events are logged, and you can choose to display the event messages, timestamps, etc. in a UI.
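A minimal sketch of such an appender for log4j 1.x; the LogListener callback interface is made up for illustration, and a real UI appender would also have to marshal the update onto the UI thread:

import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

public class UiAppender extends AppenderSkeleton {

    // Illustrative callback the UI implements to receive log lines.
    public interface LogListener {
        void onLogEvent(String formattedMessage);
    }

    private final LogListener listener;

    public UiAppender(LogListener listener) {
        this.listener = listener;
    }

    @Override
    protected void append(LoggingEvent event) {
        // Push the timestamp, level and rendered message to the UI component.
        listener.onLogEvent(event.getTimeStamp() + " " + event.getLevel()
                + " " + event.getRenderedMessage());
    }

    @Override
    public void close() {
        // nothing to release
    }

    @Override
    public boolean requiresLayout() {
        return false;
    }
}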
Try the XpoLog log4j/log4net connector. It parses the data automatically and has a predefined set of dashboards for it. Follow these steps:
Download and install XpoLog from here
Add the log4j data using the log4j data connector from here and deploy the log4j app here

Adding Log4J appenders programmatically

OK, so I have this stupid library I'm using (Documentum DFC) that checks whether Logger.getRootLogger().getAllAppenders().hasMoreElements() == false and, if so, resets my root logger level to WARN, which destroys my logging after that. So, in an effort to stop this, I'm attempting to add an appender to the root logger just to see if I can get it to stop doing that. However, even when I call
Logger.getRootLogger().addAppender(new ConsoleAppender());
that check still comes up false. Has anyone run into this?
I'm using whatever log4j version comes with JBoss 6; it doesn't say in the jar file name.
I have similar problems. I can add an appender that writes to an in-memory string, but it never takes effect.
To me it seems that JBoss uses/modifies log4j in a way that makes this kind of code modification impossible; see also https://issues.jboss.org/browse/JBAS-9318
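For completeness, outside a container that replaces the logging implementation, programmatic registration in plain log4j 1.x normally looks roughly like this sketch (a ConsoleAppender with a PatternLayout, attached to the root logger):

import org.apache.log4j.ConsoleAppender;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class RootAppenderSetup {
    public static void install() {
        // Give the appender a layout; a ConsoleAppender created with the no-arg
        // constructor has no layout and will not write anything.
        ConsoleAppender console = new ConsoleAppender(
                new PatternLayout("%d{ISO8601} %-5p [%c] %m%n"));
        Logger.getRootLogger().addAppender(console);
    }
}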

java web app cannot create log files on glassfish v3 (using log4j)

As far as I can tell, it can write to the console in the production environment, and it can also create log files on my local GlassFish. Any thoughts on this issue? Thank you!
First of all, check whether log4j is configured correctly. You won't get any log files if you didn't ask for them.
Second, as ckuetbach suggested, check whether your paths have adequate permissions. If you have problems of this kind, you should see log4j's error trace in GlassFish's log files. It won't halt your application, since logging systems are supposed to be non-intrusive, but a trace should exist, and from it you will know what (if anything) went wrong.
I think it may be the file permissions or absolute paths in the log4j config.
Give this a try:
Put log4j.xml in domains/<domain name>/lib/classes.
Create the log folder and set its permissions as expected by the log4j config, e.g. ${com.sun.aas.instanceRoot}/applogs/myapplication.log
