I want to implement logging to openlog.nsf (the OpenLog functionality from the Domino API) in an application.
However, in the current setup of the application, logging (to the Domino console) only occurs when it is enabled for the whole application via a configuration property, e.g.:
public static void writeToConsole(String msg) {
    if (getDebugMode()) {
        System.out.println(msg);
    }
}
usage:
writeToConsole("hello world");
I am wondering how I could rewrite the writeToConsole method to utilize the XspOpenLogLogUtil class?
XspOpenLogUtil.logErrorEx(Throwable, String, Level, Document) will allow you to pass a custom message.
Also XspOpenLogUtil.getOpenLogItem() gets a handle on the OpenLogItem object. From there you can use any of the inner methods.
See this page https://wiki.openntf.org/pages/viewpage.action?pageId=6586418 (all the method names are the same; the class in ODA is just XspOpenLogUtil instead of OpenLogUtil).
In my apps I usually have a wrapper method handleException(Throwable t) which calls XspOpenLogUtil.logError() anyway, and that's what I would recommend. It gives greater flexibility for handling, e.g., different logging levels like this or changing the logging framework, should you so wish in the future.
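A minimal sketch of how that could look, assuming a wrapper class of your own (the import package for XspOpenLogUtil and the behaviour of logErrorEx with a null Throwable are assumptions; check them against your ODA release):

import java.util.logging.Level;
import org.openntf.domino.xsp.XspOpenLogUtil; // package is an assumption; adjust to your ODA version

public class AppLogger {

    // Wrapper recommended above: one place to change levels or swap logging frameworks later
    public static void handleException(Throwable t) {
        XspOpenLogUtil.logError(t);
    }

    // writeToConsole rewritten to log via OpenLog instead of System.out,
    // still honouring the application-wide debug flag
    public static void writeToConsole(String msg) {
        if (getDebugMode()) {
            // logErrorEx(Throwable, String, Level, Document) lets you pass a custom message;
            // if a null Throwable is not accepted, use XspOpenLogUtil.getOpenLogItem() and its methods instead
            XspOpenLogUtil.logErrorEx(null, msg, Level.INFO, null);
        }
    }

    private static boolean getDebugMode() {
        return true; // placeholder for the existing configuration property
    }
}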
Related
I am reading some GWT code.
Basically, in this project they get constant values, like button text, from a properties file.
So they have an interface LocalizableResource and get an instance of it like this:
public interface LocalizableResource extends Constants {
    public static class Util {
        public static LocalizableResource getInstance() {
            return GWT.create(LocalizableResource.class);
        }
    }
    String lblName_text_1();
}
and use this instance to get the text for a button:
String buttonText = LocalizableResource.Util.getInstance().lblName_text_1();
Button b = new Button(buttonText);
In Java we cannot instantiate an interface, so how is GWT doing this?
I don't know much about deferred binding and GWT.
That's the beauty of GWT and one of the ways it manages multiple clients, which is a core advantage of the GWT framework.
http://www.gwtproject.org/doc/latest/DevGuideCodingBasicsDeferred.html
Deferred binding is a feature of the GWT compiler that works by generating many versions of code at compile time, only one of which needs to be loaded by a particular client during bootstrapping at runtime. Each version is generated on a per browser basis, along with any other axis that your application defines or uses. For example, if you were to internationalize your application using GWT’s Internationalization module, the GWT compiler would generate various versions of your application per browser environment, such as “Firefox in English”, “Firefox in French”, “Internet Explorer in English”, etc… As a result, the deployed JavaScript code is compact and quicker to download than hand coded JavaScript, containing only the code and resources it needs for a particular browser environment.
A tag interface that facilitates locale-sensitive, compile-time binding of constant values supplied from properties files. Using GWT.create(class) to "instantiate" an interface that extends Constants returns an instance of an automatically generated subclass that is implemented using values from a property file selected based on locale. more info
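As a concrete sketch of how the pieces fit together (all names and the properties files below are illustrative, not taken from the project above):

import com.google.gwt.core.client.GWT;
import com.google.gwt.i18n.client.Constants;

// MyMessages.properties    : saveLabel = Save
// MyMessages_fr.properties : saveLabel = Enregistrer
public interface MyMessages extends Constants {
    String saveLabel(); // bound to the "saveLabel" key of the properties file for the active locale
}

// Call site, e.g. inside an EntryPoint:
// MyMessages messages = GWT.create(MyMessages.class); // returns the generated subclass for the client's locale
// Button save = new Button(messages.saveLabel());

The compiler, not the runtime, decides which generated implementation each browser/locale permutation downloads, which is why GWT.create() needs a class literal rather than a runtime value.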
I created a wrapper around Log.i.
public class MyLog {
    public static int i(String tag, String message) {
        //Do Stuff
        return Log.i(tag, message);
    }
}
Now I'd like all my consumers (com.myapp.package1, com.myapp.package2, etc.) to use MyLog.i instead of Log.i. Currently, we're just enforcing this by convention and documentation. Are there more aggressive programmatic ways to discourage or prevent developers from calling Log.i from within certain packages?
Not within the language itself. This is something better done by either your build system or your source control system. A lint rule would work, with your build system set to error out or your source control set to reject diffs that fail the rule.
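As a sketch of such a rule using a custom Android Lint detector (the lint API below is the UAST-based one from recent Android Gradle plugin versions; treat the exact signatures as assumptions and check them against the lint version you build with):

import java.util.Arrays;
import java.util.List;

import com.android.tools.lint.detector.api.Category;
import com.android.tools.lint.detector.api.Detector;
import com.android.tools.lint.detector.api.Implementation;
import com.android.tools.lint.detector.api.Issue;
import com.android.tools.lint.detector.api.JavaContext;
import com.android.tools.lint.detector.api.Scope;
import com.android.tools.lint.detector.api.Severity;
import com.android.tools.lint.detector.api.SourceCodeScanner;
import com.intellij.psi.PsiMethod;

import org.jetbrains.uast.UCallExpression;

// Flags direct calls to android.util.Log so the build (or a pre-commit hook) can reject them.
public class DirectLogDetector extends Detector implements SourceCodeScanner {

    public static final Issue ISSUE = Issue.create(
            "DirectLogUsage",
            "Direct android.util.Log call",
            "Use MyLog instead of android.util.Log so logging stays centralised.",
            Category.CORRECTNESS, 6, Severity.ERROR,
            new Implementation(DirectLogDetector.class, Scope.JAVA_FILE_SCOPE));

    @Override
    public List<String> getApplicableMethodNames() {
        return Arrays.asList("v", "d", "i", "w", "e", "wtf");
    }

    @Override
    public void visitMethodCall(JavaContext context, UCallExpression call, PsiMethod method) {
        if (context.getEvaluator().isMemberInClass(method, "android.util.Log")) {
            context.report(ISSUE, call, context.getLocation(call),
                    "Use com.myapp.MyLog instead of android.util.Log");
        }
    }
}

A simpler alternative is a Checkstyle or PMD rule that bans the android.util.Log import outright in the packages you care about.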
I would like to add a unique identifier to log statements, so I am able to add documentation (externally, e.g. a wiki) to every log statement, so a user can quickly access the message related documentation using the id. The logging framework I would like to use is SLF4J/logback.
I was not able to find documentation about related approaches except for some bits regarding auditing frameworks.
There is the Marker concept which I thought could be usable for ID injection, or I could just add the ID to the message text itself.
How would I add IDs to the logging statements "the right way"? Are there possibilities I didn't think of?
EDIT
The term unique ID just means there should be an identifier per log statement. A developer adds such an ID manually, e.g. to a table/enum/whatever, which could be done incorrectly.
Such an ID has to be stable so documentation can be based on it. So the ID itself is not what I am wondering about.
My question is: what would be the right way of pushing the ID to the logger together with the message text? Would Markers be suited for this kind of requirement, should I embed the ID into the message text or is there some other possibility?
So, basically, would I use
logger.info(IDMarkers.DB_CONNECTION_FAILED, "no connection to the database");
or instead just
logger.info("[{}] no connection to the database", LogIDs.DB_CONNECTION_FAILED);
The first approach has the advantage that showing the IDs is up to the logging system and its configuration.
SLF4J has Markers: http://www.slf4j.org/apidocs/org/slf4j/Marker.html
Unfortunately, Markers are advertised for a different purpose, but you can still use them to uniquely mark logging statements.
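A minimal sketch of the Marker variant (the marker name and the pattern hint are just illustrations):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.Marker;
import org.slf4j.MarkerFactory;

public class MarkerIdExample {

    // One marker per documented log statement; the marker name is the stable ID
    public static final Marker DB_CONNECTION_FAILED = MarkerFactory.getMarker("EV-1234");

    private static final Logger logger = LoggerFactory.getLogger(MarkerIdExample.class);

    public static void main(String[] args) {
        // With logback, %marker in the encoder pattern prints the ID without touching the message text
        logger.info(DB_CONNECTION_FAILED, "no connection to the database");
    }
}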
A more cumbersome solution is MDC:
MDC.put("MsgId", "EV-1234");
log.info("no connection to the database");
MDC.remove("MsgId");
or with structured logging via the fluent API (requires SLF4J 2.0.0):
logger.atDebug()
.addKeyValue("MsgId", "EV-1234")
.log("Temperature changed.");
Unique is only unique within some scope; eventually even every int or long value will be used.
So think about what "uniqueness" means to you, then use a wrapper that ensures your logging is handled with that ID inserted.
Note that with SLF4J you are dealing with an interface that makes a number of logging APIs consistent. This means you probably won't have the option to subclass or inject your own implementation of the interface to ensure consistent logging, so you will be constrained to techniques like wrapping your logging API (preferably through the "consistent" interface).
package mypackage.log;

public class LoggerWrapper implements org.slf4j.Logger {

    private final org.slf4j.Logger logger;

    public LoggerWrapper(org.slf4j.Logger logger) {
        this.logger = logger;
    }

    public String getUniqueId() {
        return ...; // however you derive the statement ID
    }

    public void info(String message, Object... params) {
        logger.info(String.format("[%s] %s", getUniqueId(), message), params);
    }

    // ... implement all the other methods ...
}
And this means that you will have to make your own LoggerFactory wrapper too:
public Logger getLogger(String name) {
    return new LoggerWrapper(org.slf4j.LoggerFactory.getLogger(name));
}
The code above has a few warts (I haven't actually tested it), but hopefully it gives you the idea.
So I'm writing a web service architecture which includes FunctionProvider classes which do the actual processing of requests, and a main Endpoint class which receives and delegates requests to the proper FunctionProvider.
I don't know exactly which FunctionProviders will be available at runtime, so I need to be able to 'register' (if that's the right word) them with my main Endpoint class, and query them to see if they match an incoming request.
public class MyFunc implements FunctionProvider {
    static {
        MyEndpoint.register(new MyFunc());
    }

    public boolean matchesRequest(Request req) {...}

    public void processRequest(Request req) {...}
}

public class MyEndpoint {
    private static ArrayList<FunctionProvider> functions = new ArrayList<FunctionProvider>();

    public static void register(FunctionProvider provider) {
        functions.add(provider);
    }

    public void doPost(Request request) {
        //find the FunctionProvider in functions
        //matching the request
    }
}
I've really not done much reflective Java like this (and the above is likely wrong, but hopefully demonstrates my intentions).
What's the nicest way to implement this without getting hacky?
Do not let the FunctionProviders register themselves. Bootstrap the endpoint through some application init call with a list of FunctionProviders (see the sketch below). That way you can also configure priority (what if two providers both claim they can process a request?). The way you have set it up now, you need to invoke the class somehow just to trigger the static initializer, which is too indirect.
If detecting whether or not a FunctionProvider supports a given request is trivial, consider making it part of configuration: if this is in the request, map it to that FunctionProvider. This would separate concerns a bit better. If the detection is complicated, consider doing it in separate classes from the FunctionProvider.
By configuring a delegate/function pointer you could possibly avoid needing a FunctionProvider altogether (I'm not sure if/how Java supports delegates).
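A minimal sketch of that bootstrapping approach, reusing the FunctionProvider and Request types from the question (everything else is illustrative):

import java.util.ArrayList;
import java.util.List;

public class MyEndpoint {

    // Providers are supplied at init time, in priority order, instead of registering themselves
    private final List<FunctionProvider> functions = new ArrayList<FunctionProvider>();

    public MyEndpoint(List<FunctionProvider> providers) {
        functions.addAll(providers);
    }

    public void doPost(Request request) {
        for (FunctionProvider provider : functions) {
            if (provider.matchesRequest(request)) { // first match wins, so list order defines priority
                provider.processRequest(request);
                return;
            }
        }
        // no provider matched: return an error or a default response here
    }
}

// Application init, e.g. in a ServletContextListener or your DI configuration:
// MyEndpoint endpoint = new MyEndpoint(Arrays.asList(new MyFunc(), new OtherFunc()));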
I'm configuring the logging for a Java application. What I'm aiming for is two logs: one for all messages and one for just messages above a certain level.
The app uses the java.util.logging.* classes: I'm using it as is, so I'm limited to configuration through a logging.properties file.
I don't see a way to configure two FileHandlers differently: the docs and examples I've seen set properties like:
java.util.logging.FileHandler.level = INFO
whereas I want two different Handlers logging at different levels to different files.
Any suggestions?
http://java.sun.com/j2se/1.4.2/docs/guide/util/logging/overview.html is helpful. You can only set one Level for any individual logger (as you can tell from the setLevel() method on the logger). However, you can take the lower of the two levels, and then filter programmatically.
Unfortunately, you can't do this just with the configuration file. To switch with just the configuration file you would have to switch to something like log4j, which you've said isn't an option.
So I would suggest altering the logging in code, with Filters, with something like this:
import java.util.logging.Filter;
import java.util.logging.Level;
import java.util.logging.LogRecord;

class LevelFilter implements Filter {

    private final Level level;

    public LevelFilter(Level level) {
        this.level = level;
    }

    public boolean isLoggable(LogRecord record) {
        // publish only records at or above the configured level
        return record.getLevel().intValue() >= level.intValue();
    }
}
And then on the second handler, do setFilter(new LevelFilter(Level.INFO)) or whatever. If you want it to be file-configurable, you could use a logging properties setting you've made up yourself and read it with the normal Properties methods.
I think the configuration code for setting up the two file handlers and the programmatic code is fairly simple once you have the design, but if you want more detail add a comment and I'll edit.
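For illustration, a minimal programmatic setup along those lines (file names and levels are made up; it reuses the LevelFilter above):

import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class DualLogSetup {
    public static void main(String[] args) throws IOException {
        Logger root = Logger.getLogger("");
        root.setLevel(Level.ALL); // the logger itself must let everything through

        FileHandler allMessages = new FileHandler("all.log");
        allMessages.setLevel(Level.ALL);
        allMessages.setFormatter(new SimpleFormatter());

        FileHandler importantOnly = new FileHandler("important.log");
        importantOnly.setLevel(Level.ALL);
        importantOnly.setFilter(new LevelFilter(Level.INFO)); // the filter does the per-handler cut-off
        importantOnly.setFormatter(new SimpleFormatter());

        root.addHandler(allMessages);
        root.addHandler(importantOnly);

        Logger demo = Logger.getLogger(DualLogSetup.class.getName());
        demo.fine("appears in all.log only");
        demo.warning("appears in both files");
    }
}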
I think you should be able to just subclass a handler and then override the methods to allow output to go to multiple files depending on the level of the message. This would be done by overriding the publish() method.
Alternatively, if you have to use the system-provided FileHandler, you could do a setFilter() on it to inject your own filter into the mix and, in that filter code, send ALL messages to your other file and return true if the LogRecord level is INFO or higher, causing FileHandler.publish() to write it to the real file.
I'm not sure this is the way you should be using filters but I can't see why it won't work.
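A sketch of that filter-based variant (class and file names are illustrative; it works because FileHandler.publish() is public):

import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Filter;
import java.util.logging.Level;
import java.util.logging.LogRecord;

// Attach this to the "important" FileHandler via setFilter(); it forwards every record
// to a second handler that writes the catch-all file, then lets only INFO and above through.
public class TeeFilter implements Filter {

    private final FileHandler everything;

    public TeeFilter(String allMessagesFile) throws IOException {
        everything = new FileHandler(allMessagesFile);
        everything.setLevel(Level.ALL);
    }

    @Override
    public boolean isLoggable(LogRecord record) {
        everything.publish(record); // ALL messages go to the other file
        return record.getLevel().intValue() >= Level.INFO.intValue();
    }
}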