spring-boot: configuring different versions of a properties file with a hardcoded filename?

Building a Spring Boot app, we depend on a 3rd-party jar file
that expects to find a properties file with a hardcoded filename (say xyz.properties) on the classpath,
and will read its properties from there.
We need, though, to "switch in" different versions of this properties file, depending on the environment in which we deploy the jar file.
So we would preferably need to add to the classpath a directory external to the jar file, where we can put the properties file.
Googling this, I find other people with a similar issue,
but no simple, clean solution for it.
It seems to me the Spring properties model assumes you only care about the property names and their values
(picking them up from System.getProperties())
and don't really care which properties file each value comes from.
This may be fine when building your own code along that model,
but it may not fit so well when depending on 3rd-party solutions, as in our use case.
The simplest workaround I found is to "explode" the Spring Boot jar file,
then copy the desired properties files into WEB-INF/classes,
then start the app with the JarLauncher.
Just wondering if there is a better way, without the need to "explode" it?
Is my understanding above correct, or have I overlooked some Spring feature that already supports this use case?

Hopefully this still works: with a small trick you can set your own classpath:
java -cp "./conf/:yourBoot.jar" org.springframework.boot.loader.JarLauncher
and then you can place your config in an external directory (as you already suggested).
See the original question: Add jar file to spring-boot classpath at runtime
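For what it's worth, the reason the trick works is that a library loading a hardcoded xyz.properties by name typically resolves it through the classpath, so whichever classpath entry lists the file first (here ./conf/) wins. Here is a minimal sketch of that kind of lookup; the filename comes from the question, but how the 3rd-party jar actually reads it is an assumption:

import java.io.InputStream;
import java.util.Properties;

public class XyzPropertiesLookup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Resolved against the classpath: with -cp "./conf/:yourBoot.jar",
        // a ./conf/xyz.properties shadows any copy packaged inside the jar.
        try (InputStream in = XyzPropertiesLookup.class.getResourceAsStream("/xyz.properties")) {
            if (in != null) {
                props.load(in);
            }
        }
        System.out.println(props);
    }
}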

Which jars exactly are needed to embed Neo4j?

EDIT: This question is not about how to solve dependencies using Ant / Maven / Gradle or whatnot.
I'm trying to use Neo4j and I'm a bit confused by the docs as to what I need to embed a very simple "Hello, world!" Neo4j example in an app.
I've read in several places that Neo4j was lightweight and that only one (and now two) jars were needed.
For example here: http://highscalability.com/neo4j-graph-database-kicks-buttox
we can read: "Small footprint. Neo4j is a single <500k jar with one dependency (the Java Transaction API)."
This is precisely one of the reasons I'm interested in embedding Neo4j...
So I downloaded the community edition (GPL) of Neo4j and read the explanation here:
http://docs.neo4j.org/chunked/stable/tutorials-java-embedded-setup.html
which says: "Extract a Neo4j download zip/tarball, and use the jar files found in the lib/ directory."
Now that's more than concise, and I've found old messages saying that the "wording was changed". At one point, apparently, all that Neo4j needed was one jar (which is one of the reasons I was interested in embedding Neo4j, btw). But now apparently it's two, because there's a dependency on some Java Transaction API (which one? a .jar shipped with Neo4j?).
The problem is that if I look in that lib/ dir I've got quite some things:
1115454 lib/neo4j-kernel-1.6.1.jar
153707 lib/neo4j-graph-algo-1.6.1.jar
222791 lib/neo4j-shell-1.6.1.jar
8865464 lib/scala-library-2.9.0-1.jar
43530 lib/neo4j-jmx-1.6.1.jar
590503 lib/neo4j-kernel-1.6.1-tests.jar
23954 lib/neo4j-community-1.6.1.jar
28023 lib/neo4j-udc-1.6.1.jar
1517975 lib/neo4j-cypher-1.6.1.jar
51662 lib/neo4j-graph-matching-1.6.1.jar
16030 lib/geronimo-jta_1.1_spec-1.1.1.jar
143177 lib/neo4j-lucene-index-1.6.1.jar
1466301 lib/lucene-core-3.5.0.jar
118875 lib/server-api-1.6.1.jar
92850 lib/org.apache.servicemix.bundles.jline-0.9.94_1.jar
And in system/lib:
27461 system/lib/blueprints-neo4j-graph-1.1.jar
72650 system/lib/jettison-1.3.jar
628626 system/lib/rrd4j-2.0.7.jar
17985 system/lib/asm-analysis-3.2.jar
177174 system/lib/jetty-util-6.1.25.jar
109043 system/lib/commons-io-1.4.jar
755981 system/lib/neo4j-server-1.6.1.jar
35910 system/lib/gremlin-java-1.4.jar
46367 system/lib/jsr311-api-1.1.1.jar
36551 system/lib/asm-util-3.2.jar
206035 system/lib/commons-beanutils-core-1.8.0.jar
227122 system/lib/jackson-core-asl-1.8.3.jar
33094 system/lib/asm-commons-3.2.jar
17308 system/lib/jcl-over-slf4j-1.6.1.jar
21878 system/lib/asm-tree-3.2.jar
12359 system/lib/log4j-over-slf4j-1.6.1.jar
.
. (skipped a few jars from system/lib here)
.
If my Emacs-fu is strong enough, the jars above weigh nearly 17 MB (not that "embeddable")... And I didn't even paste all the jars from system/lib/.
So what is the minimum set of .jar files (and which are they) that I need to embed Neo4j and run a simple "Hello, world!" example?
I'm confused by the official doc saying: "... use the jar files found in the lib/ directory".
Surely I don't need all of them right?
Basically, you need only neo4j-kernel-1.6.1.jar (and the mentioned transaction API, geronimo-jta_1.1_spec).
However, this will give you only the basic functionality. If you want to use other parts, like indexing, querying, management tools, etc., you would need other jars.
The absolute minimum to run the kernel is
neo4j-kernel.jar
jta.jar
The rest is Cypher, Lucene indexing and other stuff.
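For reference, a minimal sketch of an embedded "Hello, world!" against the 1.6-era API, which should run with just those two jars on the classpath (the store directory name is made up for the example):

import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Transaction;
import org.neo4j.kernel.EmbeddedGraphDatabase;

public class HelloNeo4j {
    public static void main(String[] args) {
        // Only neo4j-kernel and the JTA jar should be needed for this much.
        GraphDatabaseService db = new EmbeddedGraphDatabase("target/hello-db");
        Transaction tx = db.beginTx();
        try {
            Node node = db.createNode();
            node.setProperty("message", "Hello, world!");
            System.out.println(node.getProperty("message"));
            tx.success();
        } finally {
            tx.finish();
        }
        db.shutdown();
    }
}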

How to track down a JAR with a corrupt JAR index that causes InvalidJarIndexException in Tomcat in Eclipse

I am working on the development of an application which, among other pieces of code, contains a number of servlets. The development environment I use is Eclipse (3.2.1, which is rather old) in which I run a Tomcat server (5.5.23, rather old as well) using the Eclipse Tomcat Wrapper plug-in for the task. All this runs on a RedHat 5.2 Linux system.
The Java runtime I use is JDK 1.6.0(21), which I upgraded to (from a previous JDK 1.5 version) quite recently. As far as I can recall, the software combination above (together with the application I'm working with) did actually work: I could start the Tomcat server, it came up without errors or complaints, and the application's servlets were available on port 8080.
However, something has changed somewhere (could be in the application jarfiles themselves, I'm suspicious of essentially everything on the host to be the root cause of this). Now, when I try to start up the Tomcat server, I get the error sun.misc.InvalidJarIndexException in the console output. This happens for the following classes and methods:
org.apache.commons.modeler.Registry registerComponent (happens 3 times)
org.apache.catalina.core.StandardServer initialize (happens once)
org.apache.catalina.connector.Connector start (happens twice)
I did find this Stack Overflow question regarding how to find the JAR of a Java class useful, and I did run find /usr -name \*name-of-suspected-jar\*.jar a few times to track down a number of suggested offending JARs. I also tried to check the runtime configuration of the Tomcat server in Eclipse, but I could not really match the JAR files on the system with the CLASSPATH of either the Tomcat runtime setup or the CLASSPATH used in the environment when starting Eclipse. That effort probably requires some more rigor on my part, but before doing that (and that is why I'm not posting all the gory CLASSPATH details here right now), I read up on exactly what InvalidJarIndexException really is about.
So, JAR files may contain an optional INDEX.LIST file which contains information about what classes (and methods?) are to be found in the JAR file. The idea is to short-circuit the search through all JARs on the CLASSPATH, which is useful in a number of circumstances. The problem is that when the INDEX.LIST file happens to be corrupt (or is believed to be corrupt), loading of the class is given up completely (the class loader does not fall back to searching all JARs on the CLASSPATH) and an InvalidJarIndexException is thrown. To make things messier, the order in which JARs are searched might affect how the class loader treats the INDEX.LIST file: the INDEX.LIST file of one JAR might refer to other JARs, and if those referred-to JARs are not in sync with the first JAR's INDEX.LIST file, the class loader fails with this InvalidJarIndexException error.
So (according to this Stack Overflow question), it seems this error can be thrown not only because a JAR file has a corrupt INDEX.LIST; it can even be thrown for a JAR that has a valid INDEX.LIST, or legitimately lacks an INDEX.LIST, simply because a previously searched JAR has confused the class loader. (To put it another way, as things are, this exception might be thrown even for "innocent" non-corrupted JAR files due to offenders elsewhere on the system.)
So, after writing a mere novel, here comes my main set of questions:
What is the best way to track down the precise .jar file for which each InvalidJarIndexException is thrown?
What is the best way to check whether a randomly picked .jar file has an INDEX.LIST file and, if so, whether that file is valid (that is, non-corrupt)? What tools exist for this task?
Is there an efficient way to automatically deduce the search order of .jar files? I can try to follow the CLASSPATH manually but to be honest, that is error prone and tedious.
Is there an efficient way to figure out what .jar file there is in a search order which might confuse the class loader to accuse innocent, non-corrupt .jar files later in the search to have incorrect INDEX.LIST files?
Disclaimer: I know I run old versions of software (even if I have the latest updates for my RedHat 5.2 installed), and I know the knee-jerk reaction for many people is to suggest that I don't put any effort whatsoever into debugging this but instead upgrade to more recent versions of Tomcat, Eclipse and Linux (Java is recent, though). The reason I would prefer not to is that, after looking into things, I've found it rather messy to do an upgrade or to try to install a separate modern Tomcat or Eclipse next to the RHEL 5.2-provided Tomcat/Eclipse I use today. Also, I consider this kind of troubleshooting an opportunity to learn some useful nitty-gritty details about Java and its associated tools and features. Figuring out how the class loading works and what causes it to throw this InvalidJarIndexException on my system would be very educational!
(But if this troubleshooting fails, I'll seriously consider to use a modern Linux, Eclipse and Tomcat... I promise)
Take the following steps to diagnose the problem:
Add an exception breakpoint in Eclipse (it's the J with an exclamation mark icon), and set it to halt for caught and uncaught exceptions of type InvalidJarIndexException.
Start debugging your program.
Eclipse will halt at your exception breakpoint, when the InvalidJarIndexException is thrown. Even without the source for URLClassPath, you will still be able to inspect the variables on the stack leading to the exception, including the name of the class that URLClassPath is attempting to locate. Knowing the name of the class should significantly narrow the list of JAR's you need to examine.
Perhaps you've locally added a new class to a package and the contents of that package are described by the index file in a stale JAR on your classpath?
Try Tattletale, which is a good reporting tool for jars. What I did in this case was to eliminate INDEX.LIST from the jars one by one until I no longer got the InvalidJarIndexException.
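If you just want to know which jars carry an index at all, a small sketch along these lines should do (the directory to scan is whatever lib directory you suspect; validating the index contents against the referenced jars is a separate, harder exercise):

import java.io.File;
import java.util.jar.JarFile;

public class FindJarIndexes {
    public static void main(String[] args) throws Exception {
        File libDir = new File(args.length > 0 ? args[0] : ".");
        File[] files = libDir.listFiles();
        if (files == null) {
            return;
        }
        for (File f : files) {
            if (!f.getName().endsWith(".jar")) {
                continue;
            }
            JarFile jar = new JarFile(f);
            try {
                // JAR indexes live in META-INF/INDEX.LIST when present.
                if (jar.getEntry("META-INF/INDEX.LIST") != null) {
                    System.out.println(f.getName() + " contains INDEX.LIST");
                }
            } finally {
                jar.close();
            }
        }
    }
}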

Control the classpath ordering of jars in WEB-INF/lib on Tomcat 5?

I have a legacy web app running in Tomcat 5.0.
This web app has two jars in WEB-INF/lib, let's say Foo-2.0.jar and Bar-2.0.jar. Bar-2.0.jar actually includes a Foo-1.0.jar inside of it. Bar is also a dead project, meaning no upgrading, no source, but still important to the application.
The latest release of this application requires Foo-2.0.jar for some other stuff. Having both Foo-1.0.jar and Foo-2.0.jar on the classpath creates a conflict, specifically a NoClassDefFoundError-type of error, where a class that was added later in 2.0 cannot be found in 1.0, etc.
In Eclipse, the simple solution is to right-click on your project, go to Properties > Java Build Path > Order and Export, and move Foo-2.0.jar above Bar-2.0.jar so it's resolved first.
How does one accomplish this type of classpath ordering for jars in WEB-INF/lib in Tomcat?
Tomcat 5's classloading precedence for webapps is roughly as follows: first the bootstrap/system (JRE/lib, then Tomcat's internal classes), then the webapp libraries (first WEB-INF/classes, then WEB-INF/lib), then the common libraries (first Tomcat/common, then Tomcat/lib) and finally the webapp-shared libraries (Tomcat/shared).
So to get Foo-2.0.jar loaded before Bar-2.0.jar, the best you can do is move Bar-2.0.jar from WEB-INF/lib to Tomcat/common or Tomcat/shared.
The JARs aren't loaded in alphabetical order of their names. At least, there's no spec which says that. Renaming them to change the alphabetical filename order makes no sense.
Strip Foo-1.0.jar out of Bar-2.0.jar. As it is, it's just trouble waiting to happen, both for development (you need to fudge the dev environments) and for deployment.
Put Foo-1.0.jar into $CATALINA_HOME/common/endorsed (or any other place where it will be loaded after Foo-2.0.jar).
You don't, as this feature isn't available in Tomcat. If both Foo-1.0.jar and Foo-2.0.jar are needed in the classpath at the same time, you will need some major classpath reorganization.
If Bar-2.0 can work with Foo-2.0, then the best thing to do would be to rebuild Bar-2.0 yourself without a Foo-1.0.jar inside of it.
It is possible to set the Class-Path in the manifest of the jar. http://java.sun.com/developer/Books/javaprogramming/JAR/basics/manifest.html
I can't really promise it will solve the problem, but can be worth a try.
To have Foo-2.0.jar take precedence over Bar-2.0.jar, update Bar-2.0.jar with the content of Foo-2.0.jar (overwriting the .class files it already contains) and delete Foo-2.0.jar from the war.
-cp A.jar:B.jar has the effect that the content of A.jar acts like a layer over B.jar, so you get the same effect by overwriting B.jar's content with A.jar's.
This is a bit hacky but might work. Change the name of Foo-2.0.jar to be alphabetically ahead of Bar-2.0.jar, say AFoo-2.0.jar.
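Whichever route you take, it helps to verify at runtime which jar a given class is actually being served from. A small sketch (the class name below is just a placeholder for one of the Foo classes that changed between 1.0 and 2.0):

public class WhichJar {
    public static void main(String[] args) throws Exception {
        // Hypothetical class name; replace with a class that exists only in Foo-2.0.
        String name = args.length > 0 ? args[0] : "com.example.foo.SomeFooClass";
        Class<?> clazz = Class.forName(name);
        String resource = "/" + name.replace('.', '/') + ".class";
        // Prints the jar (or directory) the class file was loaded from.
        System.out.println(clazz.getResource(resource));
    }
}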

search paths where one native library depends on another

I'm using JNA and Java but I think this question affects any native-to-nonnative bridge.
I have a Java application which relies on lib1.dylib, and lib1.dylib relies on lib2.dylib.
I want to put everything inside my .app file on Mac. I can easily put lib1.dylib inside and set java.library.path (or use NativeLibrary.addSearchPath()) to tell the JVM where to find lib1.dylib. The trouble is, I don't know how to communicate that lib1.dylib's dependencies are also in the location I provided. The result is that lib1 is loaded fine, but then lib2 can't be found, since it's not on the operating system's library path.
Does anyone know how I can overcome this problem? I imagine it must come up often in big projects with large numbers of shared libraries.
I've come across this problem before, and have just run into it again today. You may be able to get around it by adding the VM argument "-Djava.library.path=/path/to/other/libs", but I seem to remember Java only uses that to search for the initial library and then uses the system PATH to look for any dependencies.
A few solutions I've tried before:
1) Use System.load(absolutePath) on the dependent library before loading your own library (see the sketch after this list). This doesn't make your program ultra-portable, though, unless you always know where that library is going to be.
2) In a case where lib1 depends on lib2, I actually used SetCurrentDirectory (Windows, not sure of the Mac equivalent) in the native code before it linked to any of the dependent libs, and that seemed to work. Again, requires knowing where the other libs are.
3) On Windows, you could dump the dependent libraries into c:\windows\system32, and it finds them.
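As a sketch of option (1), assuming both dylibs sit in a known directory inside the bundle (the directory and property name here are made up for illustration):

public class NativeLoader {
    static {
        // Hypothetical location; point this at wherever the bundle keeps the dylibs.
        String libDir = System.getProperty("app.native.dir",
                "/Applications/MyApp.app/Contents/Resources/Java");
        // Load the dependency first so the dynamic linker can resolve it...
        System.load(libDir + "/lib2.dylib");
        // ...then the library your Java code actually binds to via JNA.
        System.load(libDir + "/lib1.dylib");
    }
}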
A few helpful posts on a similar topic (Windows-specific, but I think the problem is the same):
http://www.realityinteractive.com/rgrzywinski/archives/000219.html
http://www.velocityreviews.com/forums/t387618-jni-library-path.html
I've found a solution for MacOSX based on the idea in (2) from Stew:
Using Mac's JarBundler (or the Ant task of the same name), set the workingdirectory variable to $JAVAROOT and make sure your dylibs are in the Contents/Resources/Java part of the .app. If you do this, the dynamic linker will find all the dependency dylibs because that will be the current directory. Java will also find the original dylib (the one that has all the dependencies) for the same reason.
Ant code:
<target name="package_mac_app" depends="package_jar, compile_native" description="bundle the runnable jar into a Mac Application -- requires JarBundler ANT Task">
<taskdef name="jarbundler" classname="net.sourceforge.jarbundler.JarBundler"/>
<echo message="CREATING MAC .app EXECUTABLE"/>
<jarbundler dir="${dist}"
name="${appname}"
mainclass="myPackage.myMainClass"
icon="${icon_location}"
jvmversion="1.5+"
infostring="${appname}"
shortname="${appshortname}"
bundleid="${com.mycompany.mydepartment.myprogram}"
jar="${run_jar_location}"
workingdirectory="$JAVAROOT">
<javafilelist dir="${dylib_location}" files="my-lib.dylib"/>
<javafilelist dir="${dylib_location}" files="dependent-lib.dylib"/>
</jarbundler>
</target>

GWT 1.6 project war layout - mixing source code & compiler-generated artifacts?

Having just wrapped up a GWT-1.5 based project, I'm taking a look at what we'll have to do to migrate to 1.6. I'm very surprised to see that GWT seems to want to write its compiled output to the war directory, where you would normally have items under source control.
What's the reason behind this? Did Google really think this was a good idea? Is there a workaround to keep source code separate from compiler-generated artifacts? Is there some other reason for this that I'm missing?
EDIT:
It's been suggested that I use the -war option to specify an output directory. I wrote some Ant scripts and have this mostly working. I've had to copy my static resources such as HTML, JSPs, etc. into this directory (I'm using target/war, Maven-style). Is that what most people are doing? Or are you just letting GWT write its output into your source-code-controlled war dir and telling your VCS to ignore the non-version-controlled files? It occurred to me that there might be some benefit to letting GWT write to this dir directly, since then Jetty could automatically notice changes to JSPs, HTML, etc., and avoid having to do a copy to make these changes visible.
Use the "-war" option to control where the output goes.
FYI: The Wiki has the design doc which will, hopefully, give you a bit of insight as to what they were thinking.
See also the Release Notes which discuss the new project layout, as well as some things to watch out for with this change.
Salvador Diaz has provided an excellent solution to this.
Yep, look at the -war option which may help.
What I'm doing (which may not be as clean as Maven, and I don't use -war) is putting my entire project dir on SVN, and then ignoring the subdir that holds the JS and other compiled output, along with the classes dir. That way I have everything else under source control, including the libs, which is what I wanted. So another team member can just check out the whole project from SVN, compile, and be ready to go.
