I have a Java UDF installed in DB2 10.5 and everything works fine.
But I have some catch blocks in which I would like to log some information along with the stack trace. The question is: how can I log it, and in which DB2 log file will it appear?
I tried using System.out.println and looking in the db2diag log file, but nothing was printed there.
In a Java routine for Db2-LUW, output from System.out.println will never appear in the Db2 diagnostics file, and you would be wise not to try to force it there.
There's more than one way to handle it.
One way is for your routine to redirect stdout before calling System.out.println.
Example
System.setOut(new PrintStream(new FileOutputStream("java_routine_log.txt")));
In the above example the filename is unqualified, so by default the file will appear in the instance diagnostics directory (by default ~${DB2INSTANCE}/sqllib/db2dump).
Another way is to use a configurable logging framework which lets you control logging locations and other details for tracing.
Other ways exist also.
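Putting the pieces together, a minimal sketch of a catch block that redirects System.out to a log file and writes the stack trace there (the class and helper names are illustrative, not part of any DB2 API):

```java
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintStream;

public class UdfLogging {

    // Illustrative helper: redirect System.out to a log file and dump the error.
    static void logToFile(Throwable t) {
        try {
            // Open in append mode so repeated routine invocations do not truncate the log.
            PrintStream ps = new PrintStream(
                    new FileOutputStream("java_routine_log.txt", true), true);
            System.setOut(ps);
            System.out.println("Routine failure: " + t.getMessage());
            t.printStackTrace(System.out);
        } catch (FileNotFoundException e) {
            // Nowhere left to report to if the log file cannot be opened.
        }
    }

    public static void main(String[] args) {
        try {
            // Stand-in for the body of the UDF.
            throw new IllegalStateException("demo failure");
        } catch (IllegalStateException e) {
            logToFile(e);
        }
    }
}
```

Because the filename is unqualified, inside a real routine the file would land in the instance diagnostics directory.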
Related
I currently have a program that, among other things, produces log output. The problem is that the mappers all log to wherever they happen to be running. I want all of this log output to end up in a single file. I am using log4j to log information.
I was thinking that it might be possible to somehow stream that data as a string from the Mapper back to the main function and log it there. Is something like this possible? Is there a better way to consolidate logs?
Each map or reduce task's log is written to the local file system of the task tracker node on which it was executed. The log goes to a 'userlog' directory defined by HADOOP_LOG_DIR, with a subdirectory named by the task attempt ID.
These log files can be accessed through the job tracker web GUI: each task detail page has a link you can click to view the log content. But in stock Apache Hadoop, to the best of my knowledge, there is no tool or mechanism to consolidate logs from different nodes.
Some vendors have their own utilities for this purpose. I recently found one from MapR that seems a good solution, though I have not tried it myself: http://doc.mapr.com/display/MapR/Centralized+Logging
I am not sure the idea of picking up the logs and feeding them into a single map task will work, since you would need to know the task and attempt ID inside the task's Java class to pick up its own log file after it completes.
OK, I ended up solving this problem by using MultipleOutputs. I set up one stream for the actual output and one for the log results my code produced. This allowed me to log things manually by sending output through the log output stream. I then had a script consolidate the log files into a single file. While the automatically generated logs stayed where they originally were, this solution let me send the log messages from my code to a single location.
An example log statement using MultipleOutputs:
mout.write("logOutput", new Text("INFO: "), new Text("reducertest"), "templogging/logfile");
I've made a small desktop application in Java for OS X. I've packaged it into a .app using JarBundler. Everything runs fine on my computer.
When I send the .app to someone else (also running a Mac), the app opens and closes immediately. Is there a log file of some kind I can get from their computer (which I have full access to)? Is there a way to get System.out.println statements or similar to show up in that file?
Execute the application from the console; any errors will be printed to the standard error stream.
Please avoid using System.out.println() statements in the application. The method is synchronized and results in poor performance, and you may not be able to retrieve the output depending on who captures the console.
Use a logging facade like SLF4J backed by a logger like log4j with a file appender. The file appender writes to a file, and you can get your debug statements and stack traces from there.
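If pulling in SLF4J/log4j is not an option, the JDK's built-in java.util.logging gives the same file-appender effect; a minimal sketch (the file name app.log and the messages are assumptions):

```java
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class FileLogDemo {
    public static void main(String[] args) throws IOException {
        Logger logger = Logger.getLogger("app");
        // FileHandler plays the role of a log4j file appender; "app.log" is arbitrary.
        FileHandler handler = new FileHandler("app.log", true);
        handler.setFormatter(new SimpleFormatter());
        logger.addHandler(handler);
        logger.info("application started");
        try {
            throw new RuntimeException("boom");
        } catch (RuntimeException e) {
            // The stack trace lands in app.log instead of a console nobody captures.
            logger.log(Level.SEVERE, "unexpected failure", e);
        }
        handler.close();
    }
}
```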
I wrote a simple Java app which I placed in the startup folder so that it starts when the computer boots. What is the easiest way to make it open a command-line window or something in which I can see the System.out.println output, instead of it just running in the background?
You should familiarize yourself with logging frameworks such as Logback and log4j. Instead of using System.out.println you call the logging API, and the library routes all messages to preconfigured appenders such as the console or a file.
In your case you can configure your application to log to the console while developing and switch to a file when you configure the application to run at startup.
This won't really open a new command-line window on startup; instead it will store all messages in a predefined file on disk, which is actually even better.
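A minimal log4j 1.2 properties sketch of that setup (the appender name, file name, and pattern are assumptions):

```properties
# Root logger: INFO level, writing to the FILE appender.
log4j.rootLogger=INFO, FILE

# Rolling file appender keeps app.log from growing without bound.
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=app.log
log4j.appender.FILE.MaxFileSize=1MB
log4j.appender.FILE.MaxBackupIndex=3
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n

# During development, swap FILE for a console appender instead:
# log4j.rootLogger=DEBUG, CONSOLE
# log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
# log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
# log4j.appender.CONSOLE.layout.ConversionPattern=%-5p %m%n
```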
You can use the Log4j API for logging details to predefined outputs. It is far better than using System.out.println: it is lightweight and also very simple to configure, letting you send logs to different files in whatever output format you want.
The log4j API is available at http://logging.apache.org/log4j/1.2/
Hope this works for you.
Enjoy!
Maybe it is simpler than I think, but I am confused about the following:
I want to be able to present to a user (in a graphical interface) the logs produced by Log4j.
I could just read the files as they are and present them, but I was wondering if there is a standard way to do it that also picks up any updates made at the same time by other parts of the application that log concurrently.
There could be multiple log4j files, e.g. with a rolling appender.
The presentation could also happen while no logging is taking place, i.e. a view of the logs up to the present.
UPDATE:
I am constrained to Java 6.
You can use Java 7's NIO2 libraries to get notified when one of several files in a directory gets modified, and then reread and display it:
http://blogs.oracle.com/thejavatutorials/entry/watching_a_directory_for_changes
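A minimal sketch of that NIO2 watch approach (note this needs Java 7, so it will not help under the Java 6 constraint; the directory and file names are made up):

```java
import java.nio.file.*;
import java.util.ArrayList;
import java.util.List;

public class LogDirWatcher {

    // Block until something changes in dir, then return the names of the touched files.
    static List<String> awaitChanges(Path dir) throws Exception {
        try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
            dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE,
                                  StandardWatchEventKinds.ENTRY_MODIFY);
            WatchKey key = watcher.take(); // blocks until the logger touches a file
            List<String> names = new ArrayList<>();
            for (WatchEvent<?> event : key.pollEvents()) {
                names.add(event.context().toString());
            }
            key.reset();
            return names;
        }
    }

    public static void main(String[] args) throws Exception {
        final Path dir = Files.createTempDirectory("logs");
        // Simulate a rolling appender writing a new log file from another thread.
        new Thread(new Runnable() {
            public void run() {
                try {
                    Thread.sleep(500); // give awaitChanges time to register the watch
                    Files.write(dir.resolve("app.log"), "INFO hello\n".getBytes());
                } catch (Exception ignored) { }
            }
        }).start();
        System.out.println("changed: " + awaitChanges(dir)); // list contains "app.log"
    }
}
```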
Have you tried the following tools:
Chainsaw
Xpolog
Perhaps add a database appender (JDBCAppender) and present the log entries from that?
From the official log4j documentation:
Is there a way to get log4j to automatically reload a configuration file if it changes?
Yes. Both the DOMConfigurator and the PropertyConfigurator support automatic reloading
through the configureAndWatch method. See the API documentation for more details.
PropertyConfigurator#configureAndWatch
DOMConfigurator#configureAndWatch
For on-demand reloading of the log4j config from a GUI, I would suggest exposing it via a servlet in your J2EE application, so that the whole file can be edited in a web page (a text area, maybe); once saved, you can overwrite your existing log4j file and reload the config.
Maybe you could think about more "OS-level" solution.
I don't know if you are using Windows or Linux, but on Linux there is this really nice command, tail.
So you could use ProcessBuilder to create an OS process that runs something like "tail -f yourLogFile.txt".
And then read the InputStream of the returned Process, which carries the process's standard output. Reading the stream will block while waiting for new output from the process, and will unblock immediately when output is available, giving you immediate feedback and the ability to read the latest changes to the log file.
However, you might have problems shutting this process down from Java.
You should be able to send a SIGTERM signal to it if you know the process ID. Or you could start another process that looks up the ID of the tail process and kills it via the kill command or similar.
Also, I am not sure whether a similar tool is available on Windows, if that is your platform.
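A sketch of that tail approach (Unix only; the helper method and file names are made up, and reading the process's InputStream rather than its OutputStream is the key detail):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class TailFollower {

    // Follow a file with `tail -f` and return once maxLines lines have been read.
    static List<String> follow(Path file, int maxLines) throws Exception {
        Process tail = new ProcessBuilder("tail", "-f", file.toString()).start();
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(tail.getInputStream()))) {
            String line;
            // readLine() blocks until tail emits the next line from the log file.
            while (lines.size() < maxLines && (line = reader.readLine()) != null) {
                lines.add(line);
            }
        } finally {
            tail.destroy(); // sends SIGTERM, so no manual `kill` is needed
        }
        return lines;
    }

    public static void main(String[] args) throws Exception {
        Path log = Files.createTempFile("demo", ".log");
        Files.write(log, "first line\nsecond line\n".getBytes());
        System.out.println(follow(log, 2));
    }
}
```

Process.destroy() also addresses the shutdown concern above: it terminates the tail child without you needing to look up its PID.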
If you write your own simple appender and include it in your log4j configuration, your appender will be called whenever events are written to the other appenders, and you can choose to display the event messages, timestamps, etc. in a UI.
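The answer above describes a custom log4j appender; the same idea using only the JDK's java.util.logging (a custom Handler standing in for the appender, with all class and logger names made up for illustration) looks like this:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class UiLogHandler extends Handler {

    // In a real GUI this would be a text component updated on the UI thread.
    public final List<String> captured = new ArrayList<>();

    @Override
    public void publish(LogRecord record) {
        captured.add(record.getLevel() + " " + record.getMessage());
    }

    @Override public void flush() { }
    @Override public void close() { }

    public static void main(String[] args) {
        Logger logger = Logger.getLogger("demo");
        UiLogHandler handler = new UiLogHandler();
        logger.addHandler(handler);
        logger.info("visible in the UI");
        System.out.println(handler.captured); // [INFO visible in the UI]
    }
}
```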
Try the XpoLog log4j/log4net connector. It parses the data automatically and has a predefined set of dashboards for it:
Follow the steps below:
Download and install XpoLog from here
Add the log4j data using the log4j data connector from here and
deploy the log4j app here
I found a bug in an application that completely freezes the JVM. The produced stacktrace would provide valuable information for the developers and I would like to retrieve it from the Java console. When the JVM crashes, the console is frozen and I cannot copy the contained text anymore.
Is there a way to pipe the Java console directly to a file, or some other means of accessing the console output of a Java application?
Update: I forgot to mention, without changing the code. I am a manual tester.
Update 2: This is under Windows XP and it's actually a web start application. Piping the output of javaws jnlp-url does not work (empty file).
Actually, one can activate tracing in the Java Control Panel. This will pipe anything that ends up in the Java console into a tracing file.
The log files will end up in:
<user.home>/.java/deployment/log on Unix/Linux
<User Application Data Folder>\Sun\Java\Deployment\log on Windows
/~/Library/Caches/Java/log on OS X
(If you can modify the code) you can set the System.out field to a different value:
System.setOut(new PrintStream(new FileOutputStream(fileName)));
If you are running a script (invoking the program via java) from Unix you could do:
/path/to/script.sh >& path/to/output.log
In Mac 10.8.2, the logs can be found at /Users/<userName>/Library/Application Support/Oracle/Java/Deployment/log/.
First you have to enable logging from the Java Control Panel: the "Enable logging" option is on the "Advanced" tab. The Java Control Panel can be started from System Preferences.
A frozen console probably means a deadlock (it could also mean repeated throwing of an exception). You can get a stack dump using jstack; jps may make finding the process easier.
Try this guide; it works for me. It also shows how to set System.setOut(fileStream) and System.setErr(fileStream).