I want to customize my log messages in Tomcat 6, and have created a class LogFormatter which looks like this:
import java.util.Date;
import java.util.logging.Formatter;
import java.util.logging.LogRecord;

public class LogFormatter extends Formatter {
    @Override
    public String format(LogRecord record) {
        StringBuilder sb = new StringBuilder();
        sb.append("LOLCAT--")
          .append(new Date(record.getMillis()))
          .append(" \t")
          .append(record.getThreadID())
          .append(" \t")
          .append(record.getSourceMethodName())
          .append(" \t")
          .append(record.getSourceClassName())
          .append(" \t")
          .append(record.getLevel().getLocalizedName())
          .append(": ")
          .append(formatMessage(record))
          .append(System.getProperty("line.separator"));
        return sb.toString();
    }
}
I've packed this into a .jar and placed it in ${catalina.home}/lib.
In my logging.properties file I've added the following:
1catalina.org.apache.juli.FileHandler.level = FINE
1catalina.org.apache.juli.FileHandler.directory = ${catalina.base}/logs
1catalina.org.apache.juli.FileHandler.prefix = lolcat.
1catalina.org.apache.juli.FileHandler.formatter = my.package.LogFormatter
After several attempts with different packaging and different configs, I decided to try the built-in "org.apache.juli.OneLineFormatter", and this works perfectly. So the config should be fine.
The question remains, why doesn't Tomcat6 load my class?
I found the solution.
After reading about Tomcat and class loading, I found that there is an order which Tomcat follows. It goes like this:
Bootstrap (/jre/lib/ext) -> System (/catalina-home/bin/) -> Common (/catalina-home/lib) -> Webapps.
tomcat-juli.jar, which contains the logging classes, is loaded in the "System" step, so when you place other logging classes in Common, they are ignored because the logging framework has already been loaded.
The solution is then to place the .jar somewhere that is loaded before tomcat-juli.jar, i.e. in /jre/lib/ext.
Edit:
It's not always a great idea to keep it in the JRE folder, so I found that the best solution is to put it in an endorsed directory:
-Djava.endorsed.dirs=${catalina_home}/endorsed
This endorsed directory is loaded before the System class loading step.
Related
Has anyone tried the plugin to build an executable war/jar using Tomcat 9?
I attempted to do so, however I ran into:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.catalina.startup.Catalina.setConfig(Ljava/lang/String;)V
at org.apache.tomcat.maven.runner.Tomcat7Runner.run(Tomcat7Runner.java:240)
at org.apache.tomcat.maven.runner.Tomcat7RunnerCli.main(Tomcat7RunnerCli.java:204)
I looked at the source and changed Catalina.setConfig() to Catalina.setConfigFile() based on docs here. After doing so the .extract dir is just empty:
use extractDirectory:.extract
populateWebAppWarPerContext warValue:ROOT.war|ROOT
populateWebAppWarPerContext contextValue/warFileName:ROOT/ROOT.war
webappWarPerContext entry key/value: ROOT/ROOT.war
expand to file:.extract/webapps/ROOT.war
Exception in thread "main" java.lang.Exception: FATAL: impossible to
create directories:.extract/webapps at
org.apache.tomcat.maven.runner.Tomcat7Runner.extract(Tomcat7Runner.java:586)
at
org.apache.tomcat.maven.runner.Tomcat7Runner.run(Tomcat7Runner.java:204)
at
org.apache.tomcat.maven.runner.Tomcat7RunnerCli.main(Tomcat7RunnerCli.java:204)
... although there are a ROOT.war, server.xml, and web.xml in the *-exec-war.jar.
Is there a better way to be creating exec-jars with embedded tomcat 9?
For those looking for a solution: it was fairly straightforward to check out the code for the plugin and make a few changes to get this to work. Namely:
Update the POM to change the dependencies to Tomcat 9
Fix compile errors, which generally stem from deprecated methods. The lookup on these methods can be found here. For example:
- container.setConfig( serverXml.getAbsolutePath() );
+ container.setConfigFile( serverXml.getAbsolutePath() );
... and ...
- staticContext.addServletMapping( "/", "staticContent" );
+ staticContext.addServletMappingDecoded( "/", "staticContent" );
There are a few others, but they are generally not difficult to resolve. After doing so I updated my app's POM to use the modified version and was able to generate a Tomcat 9 exec jar.
I would love to hear what others are doing here. I know some are programmatically initializing Tomcat via a new Tomcat() instance, but I'm curious what other ready-made solutions exist. Thanks
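For reference, the programmatic route mentioned above can be quite small. Here is a minimal sketch using Tomcat 9's embedded API; the port and the webapp directory are placeholders, not taken from the original posts:
import java.io.File;

import org.apache.catalina.startup.Tomcat;

public class EmbeddedTomcat {
    public static void main(String[] args) throws Exception {
        Tomcat tomcat = new Tomcat();
        tomcat.setPort(8080);
        // Since Tomcat 9, a connector is only created once it is requested.
        tomcat.getConnector();

        // Deploy an exploded webapp directory at the root context.
        tomcat.addWebapp("", new File("src/main/webapp").getAbsolutePath());

        tomcat.start();
        tomcat.getServer().await();
    }
}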
For future searches, one solution is to use DirResourceSet or JarResourceSet.
// webResourceRoot is typically a new StandardRoot(context); context comes from tomcat.addWebapp(...)
String webAppMount = "/WEB-INF/classes";
WebResourceSet webResourceSet;
if (!isJar()) {
    // Running from an exploded classes directory (IDE / target/classes)
    webResourceSet = new DirResourceSet(webResourceRoot, webAppMount, getResourceFromFs(), "/");
} else {
    // Running from inside the executable jar
    webResourceSet = new JarResourceSet(webResourceRoot, webAppMount, getResourceFromJarFile(), "/");
}
webResourceRoot.addJarResources(webResourceSet);
context.setResources(webResourceRoot);
public static boolean isJar() {
    // Inside a jar there is no "/" directory entry on the classpath
    URL resource = Main.class.getResource("/");
    return resource == null;
}

public static String getResourceFromJarFile() {
    // When launched with java -jar, the classpath property is the jar itself
    File jarFile = new File(System.getProperty("java.class.path"));
    return jarFile.getAbsolutePath();
}

public static String getResourceFromFs() {
    // When running from exploded classes, "/" resolves to the classes directory
    URL resource = Main.class.getResource("/");
    return resource.getFile();
}
When adding the webapp, use the root path "/" as the docBase:
tomcat.addWebapp("", "/")
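For completeness, here is a rough sketch of how the undeclared webResourceRoot and context above are usually wired together with Tomcat 9's embedded API; the DirResourceSet arguments here are placeholders, and the real selection between DirResourceSet and JarResourceSet is the one shown earlier:
import java.io.File;

import org.apache.catalina.Context;
import org.apache.catalina.WebResourceRoot;
import org.apache.catalina.WebResourceSet;
import org.apache.catalina.startup.Tomcat;
import org.apache.catalina.webresources.DirResourceSet;
import org.apache.catalina.webresources.StandardRoot;

public class ResourceWiring {
    public static Context configure(Tomcat tomcat) throws Exception {
        // Root path "/" as the docBase, as suggested above.
        Context context = tomcat.addWebapp("", "/");

        // This is the webResourceRoot referenced in the snippet above.
        WebResourceRoot webResourceRoot = new StandardRoot(context);

        // Placeholder: in the real code this is the DirResourceSet/JarResourceSet
        // choice shown earlier.
        WebResourceSet webResourceSet = new DirResourceSet(
                webResourceRoot, "/WEB-INF/classes",
                new File("target/classes").getAbsolutePath(), "/");

        webResourceRoot.addJarResources(webResourceSet);
        context.setResources(webResourceRoot);
        return context;
    }
}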
Credits to:
https://nkonev.name/post/101
I have a Spark Scala program which loads a jar I wrote in Java. From that jar a static function is called, which tries to read a serialized object from a file but throws a java.lang.ClassNotFoundException for the Pattern class.
Running the Spark program locally works, but on the cluster workers it doesn't. It's especially weird because, before I try to read from the file, I instantiate a Pattern object and there are no problems.
I am sure that the Pattern objects I wrote in the file are the same as the Pattern objects I am trying to read.
I've checked the jar in the slave machine and the Pattern class is there.
Does anyone have any idea what the problem might be? I can add more detail if it's needed.
This is the Pattern class
import java.io.Serializable;
import java.util.List;

// Token, Relation and StringUtils come from the project's other classes/dependencies.
public class Pattern implements Serializable {

    private static final long serialVersionUID = 588249593084959064L;

    public static enum RelationPatternType { NONE, LEFT, RIGHT, BOTH };

    RelationPatternType type;
    String entity;
    String pattern;
    List<Token> tokens;
    Relation relation = null;

    public Pattern(RelationPatternType type, String entity, List<Token> tokens, Relation relation) {
        this.type = type;
        this.entity = entity;
        this.tokens = tokens;
        this.relation = relation;
        if (this.tokens != null)
            this.pattern = StringUtils.join(" ", this.tokens.toString());
    }
}
I am reading the file from S3 the following way:
AmazonS3 s3Client = new AmazonS3Client(credentials);
S3Object confidentPatternsObject = s3Client.getObject(new GetObjectRequest("xxx", "confidentPatterns"));
InputStream objectData = confidentPatternsObject.getObjectContent();
ObjectInputStream ois = new ObjectInputStream(objectData);
confidentPatterns = (Map<Pattern, Tuple2<Integer, Integer>>) ois.readObject();
Later edit: I checked the classpath at runtime and the path to the jar was not there. I added it for the executors, but I still have the same problem. I don't think that was the cause, as the Pattern class is inside the jar that is calling the readObject function.
I would suggest adding this kind of method to print the classpath resources before the call, to make sure that everything is fine from the caller's point of view:
// Requires java.net.URL and java.net.URLClassLoader; LOG is whatever logger the class already uses.
public static void printClassPathResources() {
    final ClassLoader cl = ClassLoader.getSystemClassLoader();
    final URL[] urls = ((URLClassLoader) cl).getURLs();
    LOG.info("Print all classpath resources under the currently running class");
    for (final URL url : urls) {
        LOG.info(url.getFile());
    }
}
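The same check is often more telling on the executor side, where the ClassNotFoundException is actually thrown. A minimal sketch, assuming a JavaSparkContext named jsc and that the method above lives in a hypothetical ClasspathDebug utility class:
import java.util.Arrays;

import org.apache.spark.api.java.JavaSparkContext;

public final class ExecutorClasspathCheck {
    // Runs the classpath dump once inside an executor JVM rather than on the driver,
    // so the executor's view of the classpath can be compared with the driver's.
    public static void dumpExecutorClasspath(JavaSparkContext jsc) {
        jsc.parallelize(Arrays.asList(1), 1)
           .foreach(x -> ClasspathDebug.printClassPathResources());
    }
}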
This is a sample configuration for Spark 1.5:
--conf "spark.driver.extraLibrayPath=$HADOOP_HOME/*:$HBASE_HOME/*:$HADOOP_HOME/lib/*:$HBASE_HOME/lib/htrace-core-3.1.0-incubating.jar:$HDFS_PATH/*:$SOLR_HOME/*:$SOLR_HOME/lib/*" \
--conf "spark.executor.extraLibraryPath=$HADOOP_HOME/*" \
--conf "spark.executor.extraClassPath=$(echo /your directory of jars/*.jar | tr ' ' ',')
As described in this troubleshooting guide: Class Not Found: Classpath Issues
Another common issue is seeing "class not defined" when compiling Spark programs. This is a slightly confusing topic because Spark is actually running several JVMs when it executes your process, and the path must be correct for each of them. Usually this comes down to correctly passing around dependencies to the executors. Make sure that when running you include a fat jar containing all of your dependencies (I recommend using sbt assembly) in the SparkConf object used to make your Spark context. You should end up writing a line like this in your Spark application:
val conf = new SparkConf().setAppName(appName).setJars(Seq(System.getProperty("user.dir") + "/target/scala-2.10/sparktest.jar"))
This should fix the vast majority of class not found problems. Another option is to place your dependencies on the default classpath on all of the worker nodes in the cluster. This way you won’t have to pass around a large jar.
The only other major source of class-not-found issues stems from different versions of the libraries in use. For example, if you don't use identical versions of the common libraries in your application and in the Spark server, you will end up with classpath issues. This can occur when you compile against one version of a library (like Spark 1.1.0) and then attempt to run against a cluster with a different or out-of-date version (like Spark 0.9.2). Make sure that you are matching your library versions to whatever is being loaded onto executor classpaths. A common example of this would be compiling against an alpha build of the Spark Cassandra Connector and then attempting to run using classpath references to an older version.
Let's say there is a jar main.jar which depends on two other jars, dep1.jar and dep2.jar. Both dependencies are on the classpath in the MANIFEST.MF of main.jar. Each of the dependency jars has a directory foo inside, with a file bar.txt within:
dep1.jar
|
\--foo/
|
\--bar.txt
dep2.jar
|
\--foo/
|
\--bar.txt
Here is a main class of main.jar:
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.StaticApplicationContext;
import org.springframework.core.io.Resource;

public class App {
    public static void main(String[] args) {
        ApplicationContext ctx = new StaticApplicationContext();
        Resource barResource = ctx.getResource("classpath:foo/bar.txt");
    }
}
Which of two bar.txt files will be loaded? Is there a way to specify in a resource URL a jar the file should be loaded from?
Which one you get is undefined. However, you can use
Resource[] barResource = ctx.getResources("classpath*:foo/bar.txt");
to get them both (all). The URL in the Resource will tell you which jar they are in (though I don't recommend you start programming based on that information).
Flip a quarter; that's the one you'll get. Most likely it will be the one highest alphabetically, so in your case the one inside dep1.jar. The files both have identical resource paths (foo/bar.txt), and while it might look like this should throw a compile-time exception, it will not, because both jars simply get packaged up and this specific file is never compiled or inspected, as it is a .txt file.
You wouldn't expect a compile time exception as resource loading is a run time process.
You can't specify which jar the resource will come from in code, and this is a common issue, particularly when someone bundles something like log4j.properties into a jar file.
What you can do is specify the order of jars in your classpath, and it will pick up the resource from the first one in the list. This is tricky in itself, because when you are using something like Ivy or Maven for classpath dependencies you are not in control of the ordering in the classpath (in the Eclipse plugins, at any rate).
The only reliable solution is to call the resources something different, or put them in separate packages.
The specification says that the first class/resource on the class path is taken (AFAIK).
However I would try:
Dep1Class.class.getResource("/foo/bar.txt");
Dep2Class.class.getResource("/foo/bar.txt");
Class.getResource cannot take resources from another jar, as opposed to the system class loader.
With a bit of luck, you will not need to play with ClassLoaders and hava a different class loader load dep2.jar.
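If you do need to see every copy of the resource on the class path without Spring, plain ClassLoader.getResources can enumerate them all; a small sketch, not part of the original answer:
import java.net.URL;
import java.util.Enumeration;

public class ListBarResources {
    public static void main(String[] args) throws Exception {
        // getResources returns every foo/bar.txt visible to this loader, one URL per jar
        Enumeration<URL> urls = ListBarResources.class.getClassLoader().getResources("foo/bar.txt");
        while (urls.hasMoreElements()) {
            // e.g. jar:file:/.../dep1.jar!/foo/bar.txt and jar:file:/.../dep2.jar!/foo/bar.txt
            System.out.println(urls.nextElement());
        }
    }
}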
As @Sotirios said, you can get all resources with the same name using ctx.getResources(...), with code such as:
ApplicationContext ctx = new StaticApplicationContext();
Resource[] resources = ctx.getResources("classpath*:/foo/bar.txt");
for (Resource resource : resources) {
    System.out.println("resource file: " + resource.getURL());
    // Use getInputStream() rather than new FileInputStream(resource.getFile()):
    // resources inside jars cannot be resolved to a java.io.File.
    InputStream is = resource.getInputStream();
    Scanner scanner = new Scanner(is);
    while (scanner.hasNextLine()) {
        System.out.println(scanner.nextLine());
    }
    scanner.close();
}
Hello stackoverflow'ers!
I am trying to list all classes implementing an interface in a specific package.
I came across multiple solutions and tried the following:
Using Reflections:
AllCommands = new ArrayList<ICommand>();
Reflections reflections = new Reflections(this.getClass().getPackage().getName());
commandClasses = reflections.getSubTypesOf(ICommand.class);
for (Class<? extends ICommand> c : commandClasses) {
    AllCommands.add((ICommand) c.newInstance());
}
Using extcos:
AllCommands = new ArrayList<ICommand>();
ComponentScanner scanner = new ComponentScanner();
scanner.getClasses(new ComponentQuery() {
    @Override
    protected void query() {
        select().
            from(this.getClass().getPackage().getName()).
            andStore(thoseImplementing(ICommand.class).into(commandClasses)).
            returning(none());
    }
});
for (Class<? extends ICommand> c : commandClasses) {
    AllCommands.add((ICommand) c.newInstance());
}
Both solutions successfully list all my classes when debugging in NetBeans (7.3), but when I compile the jar and execute it, the commandClasses collection seems to stay empty. I am not used to Java, but I guess it has something to do with the classpath, and with debugging taking the classes from the \target\classes folder rather than from the jar.
Can someone help me out with a solution that would allow me to list classes in the .jar?
Is it possible that this.getClass().getPackage().getName() just has to change to something that points into the jar?
I figured it out myself. Using this.getClass().getPackage().getName() as the package name only points at the files in the classes folder; to reference classes in the jar I needed to replace the '.' with '/'. Thank you anyway.
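For reference, Reflections can usually also be made jar-safe by handing it explicit URLs instead of a bare package name. A sketch, assuming the org.reflections 0.9.x API used in the question and the question's ICommand interface; the CommandLoader class name is made up:
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

import org.reflections.Reflections;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;

public class CommandLoader {
    public static List<ICommand> loadCommands() throws Exception {
        // ClasspathHelper resolves the package to concrete URLs, which also works
        // when the classes live inside the running jar rather than target/classes.
        String pkg = CommandLoader.class.getPackage().getName();
        Reflections reflections = new Reflections(new ConfigurationBuilder()
                .setUrls(ClasspathHelper.forPackage(pkg))
                .setScanners(new SubTypesScanner()));

        List<ICommand> commands = new ArrayList<ICommand>();
        Set<Class<? extends ICommand>> found = reflections.getSubTypesOf(ICommand.class);
        for (Class<? extends ICommand> c : found) {
            commands.add(c.newInstance());
        }
        return commands;
    }
}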
I have an issue with a project I am trying to deliver using the one-jar packager to simplify the deployment process.
Without the packaging everything works fine and the logging configuration is loaded perfectly, but with the packaging only part of the configuration is applied.
So, here is the logging.properties I use:
handlers= java.util.logging.ConsoleHandler, java.util.logging.FileHandler
.level= INFO
java.util.logging.FileHandler.pattern = C:\\MyPath\\logging.csv
java.util.logging.FileHandler.limit = 50000
java.util.logging.FileHandler.count = 1
java.util.logging.FileHandler.formatter = my.package.logging.Formatter
java.util.logging.ConsoleHandler.level = INFO
java.util.logging.ConsoleHandler.formatter = my.package.logging.Formatter
And in my main class, here is how I load it:
import java.util.logging.LogManager;

public class MainClass {
    public static void main(final String[] args) {
        try {
            LogManager.getLogManager().readConfiguration(
                    new MainClass().getClass().getResourceAsStream("logging.properties"));
            // main process goes here.
        } catch (Exception e) {
            // Exception handling
        }
    }
}
The log level as well as the FileHandler pattern are picked up correctly, because the logging ends up in the correct file, but as raw XML output, which makes me think that the formatter is not loaded, as it normally outputs a CSV format.
Could it be related to a classpath issue? Does anyone know how to handle this?
It could be that in your jars you have more than one logging.properties file, with similar but slightly different settings. When you combine them with one-jar, the order changes and one of them gets hidden. Do a "jar -tf *.jar | grep logging.properties" and see what you see.
If that doesn't work, can you try unjarring the one-jar result into a directory structure and then running with the directory on the classpath instead of the jar? That will let you see whether it is something to do with the jar, and also let you inspect the logging.properties you have in the one-jar and see if it matches what you expect.
Use LogManager.getLogManager().readConfiguration(LogManager.class.getResourceAsStream("/logging.debug.properties"));
(note the extra slash).
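A slightly more defensive variant of that call, which fails loudly when the resource cannot be found inside the one-jar instead of passing null to readConfiguration; a sketch using the question's logging.properties name:
import java.io.InputStream;
import java.util.logging.LogManager;

public class LoggingBootstrap {
    public static void configure() throws Exception {
        // The leading slash resolves the resource from the classpath root,
        // which is where one-jar exposes files packed at the top of the jar.
        InputStream in = LoggingBootstrap.class.getResourceAsStream("/logging.properties");
        if (in == null) {
            throw new IllegalStateException("logging.properties not found on the classpath");
        }
        try {
            LogManager.getLogManager().readConfiguration(in);
        } finally {
            in.close();
        }
    }
}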