I have a Java server application which uses many libraries (Netty, Guava, etc.). I always export this application as a single .jar. When I run the application in Eclipse, I have no problems. But if I start the app from the console (Windows or Ubuntu, it doesn't matter), I see a strange problem: ALL socket connection setups take far too long. For example, a simple HTTP connection via HttpAsync (or a RabbitMQ connection, etc.) takes 1-2 minutes to establish. But once the connection is established, data is sent and received quickly. I can't figure out what the problem is. As mentioned, I use Eclipse for development.
As you know, you can export a project in three different ways in Eclipse:
Extract required libraries into JAR.
Package required libraries into JAR.
Copy required libraries into sub folder next to JAR.
So, when I used the second option, I had the problem. When I switched to the third option (all .jars in a folder next to the main .jar), the problem was solved.
Generally there is no big difference between options 2 and 3 (in option 2 all the .jars are just inside one jar). I thought the cause was the extra time needed to load new classes from the jars at execution time, but the problem occurs not only at startup but for every new connection.
Can someone explain this behavior?
UPD: Eclipse Luna. It doesn't matter which OS I'm using (Windows or Ubuntu), and it doesn't even matter which JVM (I tried different Oracle JDKs and even OpenJDK).
This answer covers the difference in performance when packaging libraries into the JAR vs. extracting them into the JAR, and the difference when running from Eclipse vs. running from the console.
Difference in performance when packaging into the JAR vs. extracting into the JAR:
Extract required libraries into JAR:
What it does:
With this option Eclipse extracts all the classes from the referenced JARs and packages them into the generated JAR.
If you open the JAR you will find NO referenced JARs packaged inside; instead, all the classes from the referenced JARs are laid out according to their package structure at the root of the JAR. This is the key performance difference compared to "Package required libraries into JAR", where there is the additional cost of parsing the embedded JARs and loading them into memory at runtime.
When exporting a JAR through Eclipse, this is the best option if performance is a concern. It is also a scalable option because you can ship this single JAR.
MANIFEST.MF: the main thing to note in this file is your main class. When you run the JAR, you are directly running the class you need:
Main-Class: com.my.jar.TestSSL
Package required libraries into JAR:
What it does:
In this option Eclipse will:
package all the referenced JARs into the generated JAR.
employ Eclipse's JAR-in-JAR loading mechanism via org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader; you can also see the org.eclipse.jdt.internal.jarinjarloader package just under the root directory of the generated JAR.
This is, of course, the additional cost that comes with this option: when you run the JAR, it is not your main class that is executed first but JarRsrcLoader, which then loads your main class and the other libraries, all of which are packaged as referenced JARs. See the MANIFEST.MF section below.
MANIFEST.MF: the main thing to note in this file is your main class. When you run the JAR, JarRsrcLoader runs first and does the rest of the job:
Rsrc-Main-Class: com.cgi.tmi.TestSSL
Main-Class: org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader
As for the last Eclipse export option, "Copy required libraries into sub folder next to JAR", I don't think it is a very scalable solution because it imposes a file-system dependency, so I would say don't use it.
Difference in performance when running from Eclipse vs. running from the console:
When you run the application from Eclipse it is quite similar to the first export option: Eclipse doesn't need to parse and load embedded JARs at runtime.
This is, however, a fairly minor point; the key consideration is Eclipse JAR export option 1 vs. option 2.
Final words:
Use "Extract required libraries into JAR" for exporting JAR and you will see substantial performance gain.
It is highly improbable that your socket connections are lasting long when you run from console because JVM runs code then it would have same or very comparable performance when running from Eclipse and console (considering same Java versions in both case). You could be feeling because of packaged JAR performance. Try extracted JAR and you should be fine.
Also, consider the amount of logging you are doing. When running through, depending upon configuration Eclipse may mask a lot of logging and hence saving you i/o time.
Do understand how classes are accessed from JAR class path, which is like additional computational cost when you are referencing classes from JAR.
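A small sketch of that check (the Netty class name is just an example taken from the question's library list; substitute any class you want to inspect):

import java.net.URL;

public class WhereFrom {
    public static void main(String[] args) throws Exception {
        // Ask the class's protection domain where its defining archive lives.
        Class<?> c = Class.forName("io.netty.channel.Channel");
        URL location = c.getProtectionDomain().getCodeSource().getLocation();
        System.out.println(c.getName() + " was loaded from " + location);
    }
}

Run it both from Eclipse and from the exported JAR and compare the printed locations.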
Since we don't know the exact structure of your JAR, here is a more general explanation (assuming you run your application with java -jar your_app.jar).
Case "Copy required libraries into sub folder next to JAR":
if a class needs to be loaded, the class loader (after checking the runtime JAR) first looks in your_app.jar for the required class
if the class is not found there, it traverses all the JAR files in the subfolder
all JAR files can be kept in the filesystem cache for subsequent reads
Case "Package required libraries into JAR":
if a class needs to be loaded, the Eclipse class loader JarRsrcLoader (after checking the runtime JAR) first looks in your_app.jar for the required class
if the class is not found there, it traverses all the embedded JAR files, which means they first have to be decompressed from your_app.jar before their content can be read
the embedded JAR files are not kept in the filesystem cache for subsequent reads (as they are not files in the filesystem)
If you have a larger number of huge embedded library JARs, this can slow down class loading (but only the first time a class is loaded by a class loader).
You can see the difference in class loading if you compare the output of
java -verbose:class -jar your_app_external_library_jars.jar
with
java -verbose:class -jar your_app_embedded_library_jars.jar
The performance might be improved by generating an INDEX.LIST file for each JAR file (e.g. your_app.jar and the embedded library JARs).
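On classic JDKs the jar tool's i option generates such an index for an existing archive (the file name here is just the placeholder used above):

jar -i your_app.jar

This writes a META-INF/INDEX.LIST entry into the JAR, which the class loader can consult instead of scanning every archive.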
This can happen because when you go with the "uber jar" approach, some metadata may be lost.
It's just an example, but if you download a couple of common libraries and look inside their jars, you will find a few files with the same name under the same META-INF folder.
Those files might be important, and when Eclipse repackages things for you, it might not do a decent job of merging such files.
That is what might be happening to you.
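As an illustration of why those META-INF entries matter (a generic example, not tied to any particular library from the question): many frameworks discover implementations through META-INF/services descriptors via ServiceLoader, and if descriptors with the same name are dropped or overwritten during repackaging, the lookup silently comes back empty. JDBC drivers, for instance, register themselves in META-INF/services/java.sql.Driver:

import java.sql.Driver;
import java.util.ServiceLoader;

public class ServiceCheck {
    public static void main(String[] args) {
        // If the META-INF/services/java.sql.Driver descriptor was lost while
        // repackaging, this loop prints nothing and the driver is never found.
        for (Driver d : ServiceLoader.load(Driver.class)) {
            System.out.println("Found driver: " + d.getClass().getName());
        }
    }
}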
With the 2nd approach, you have all the dependency jars inside the main jar.
So none of the dependency jars is loaded unless it is required.
Whereas with the 3rd option, your main jar and the other dependency jars are independent files (unlike the 2nd way), so the classes needed for connections are loaded from them and are readily available.
Try adding a log statement or System.out.println to a class in a dependency jar to see this in action, as in the sketch below.
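A minimal sketch of that idea (the class name is hypothetical): add a static initializer to a class inside the dependency jar and watch when its message appears.

// Placed inside the dependency jar (hypothetical class name):
public class ConnectionHelper {
    static {
        // Printed the first time this class is loaded by the class loader,
        // which shows when the dependency jar is actually read.
        System.out.println("ConnectionHelper loaded at " + System.currentTimeMillis());
    }

    public static void connect(String host, int port) {
        // ... connection logic ...
    }
}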
Related
I'm re-using a standalone Swing-based Java class which backs up and restores mysql databases.
I've tested running it from a Windows batch file (.bat) on my dev system, and it works there.
But if I run the batch file on a different Windows machine, I get a "main class not found" exception.
However, when I run the command directly on the command line, it works.
The command in the batch file to run it is:
java -cp lda-services.jar;bip-services-1.6.0.0-SNAPSHOT.jar;decryptor-1.6.0.0-SNAPSHOT.jar;slf4j-api-1.7.31.jar;commons-io-2.6.jar com.ilcore.util.SosaMaintenanceJFrame
The SosaMaintenanceJFrame class is contained in the lda-services jar.
Here's the error message:
Error: Could not find or load main class com.ilcore.util.SosaMaintenanceJFrame
Caused by: java.lang.ClassNotFoundException: com.ilcore.util.SosaMaintenanceJFrame
The class is definitely in the jar file, as I've extracted the jar and seen it there.
Any thoughts on why this is happening? I need to run it from a batch file so the user can just click on it to run it.
Most likely explanation
Your paths are relative, which means that the batch file isn't going to work unless you run it from the right place. In general, having a batch file that has an invisible rider stapled to it with: "I break in mysterious ways if not run from the appropriate dir" is a crappy batch file - make it better.
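One way to make it better is to resolve the jar paths relative to the batch file itself using %~dp0, which expands to the drive and directory of the .bat file (with a trailing backslash), so the script works no matter which directory it is launched from. A sketch using the jars from the question:

java -cp "%~dp0lda-services.jar;%~dp0bip-services-1.6.0.0-SNAPSHOT.jar;%~dp0decryptor-1.6.0.0-SNAPSHOT.jar;%~dp0slf4j-api-1.7.31.jar;%~dp0commons-io-2.6.jar" com.ilcore.util.SosaMaintenanceJFrame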
Better solution
Or, even better, get rid of it. You don't need batch files to distribute java programs.
Proper ways to distribute java programs:
The modern take is very very different from what you have here: JREs are dead, you must ship an installer that does the whole thing, notably including a java runtime (no longer called a JRE, and one you ship and keep up to date if relevant). That's perhaps a bridge too far for what you're doing here. Relevant tools include jlink.
A slightly less modern take involves jars with manifests:
Your jar file should contain a manifest. This manifest must contain 2 relevant entries:
Class-Path: lda-services.jar bip-services-1.6.0.0-SNAPSHOT.jar decryptor-1.6.0.0-SNAPSHOT.jar slf4j-api-1.7.31.jar commons-io-2.6.jar
and
Main-Class: com.ilcore.util.SosaMaintenanceJFrame
You can use jar's -m switch, or just include the manifest yourself (it's just a file in the jar): it lives at META-INF/MANIFEST.MF and it's a text file where each line is an entry, and an entry consists of a key: value pair.
When a jar contains this, just double-clicking the jar or running java -jar thejar.jar will take care of it all: Java will load the stated jars as part of the classpath, and these, crucially, are resolved relative to the directory the jar is in, so it DOES work when you launch it from elsewhere, i.e. if you do:
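For example, assuming those two entries are saved in a text file called manifest.txt and your compiled classes are under bin (both names are placeholders), something like this produces the jar:

jar cfm lda-services.jar manifest.txt -C bin .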
C:
CD \
java -jar "c:\Program Files\MyApp\myapp.jar"
it works fine, whereas that batch script would fail due to being in the wrong place.
Build systems let you define the manifest too; check your build system's docs for how to do this. It'll be easy, and there are tons of tutorials if you search the web for e.g. 'manifest executable jar maven' or whatnot.
You can consider making a shaded jar. But I wouldn't.
A shaded jar takes all your dependencies and packs them into your main jar, so that there is only one jar. There is now no need for a Class-Path entry (the jar you run is obviously already on the classpath and there's nothing else to include) and your app is shipped as 'just' a single jar file.
But this is mostly a red herring: there are no consumer JREs anymore, so you've taken the user experience from a D- to a D. If you actually care about giving your users a nice experience, there's no getting around an installation process of some sort, and once you have that, having the separate jars is no longer a problem. Separate jars are less hairy when signed jars are involved, are much easier to keep up to date, and have a significantly faster turnaround (when you build your stuff and want to ship what you built, shading takes ages, so it's nice to cut that step out). The faster your CI system tells you about failing tests, the better.
Meet in the middle
You don't have to upgrade to modules and the like. What you can do instead is use something like launch4j. The aim is to end up with a zip file along with the installation instructions: Make a dir somewhere. unzip this zip in it. Doubleclick 'myapp.exe'. Done.
The zip would contain an entire JRE, all your jar file deps, your main app, and an exe file which launch4j made for you, that launches your app using the JRE packed into the zip. This means you know exactly which JRE is being used, and it'll work even on systems that didn't have one installed yet (which, these days, should be all of them - the notion of 'end user downloads a JRE from oracle and the user + oracle work together to keep that thing up to date and security-issue-free' is dead).
The fact that it's an EXE is nice: Now if the user e.g. alt+tabs through their apps, they get your app, with your name, and your icon, instead of 'javaw.exe' with an ugly coffee mug logo.
When I try running it from the jar file generated by Maven, however, I get a "class not found" exception.
Even if you didn't get that error, you'd get another one unless you'd used Maven Shade, as that's the only way you're going to run that with a single jar. My guess as to why that particular error occurs is that the app class you're attempting to run is in fact in one of the *SNAPSHOT* jars.
Consider a Java program, launched from a main method, that needs something from tools.jar. In this case, some utility code for connecting to JMX services. Do we have any choice but to wrap it in a shell script that uses -cp to manage the class path? We'd much rather use a MANIFEST.MF classpath.
from http://java.sun.com/developer/Books/javaprogramming/JAR/basics/manifest.html
the URLs in the Class-Path header are given relative to the URL of the JAR file of the applet or application.
I do not believe you have a choice other than a shell wrapper to get tools.jar onto your classpath, unless you write a custom class loader internally that allows you to find external jars, as sketched below.
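A minimal sketch of that custom class-loader route, assuming a JDK 8-style layout where tools.jar sits in the JDK's lib directory (the lookup class is simply one that ships in tools.jar; adjust for your own JMX utility code):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class ToolsJarLoader {
    public static void main(String[] args) throws Exception {
        // java.home usually points at the JRE inside the JDK, so tools.jar is one level up.
        File toolsJar = new File(System.getProperty("java.home"), "../lib/tools.jar");
        URLClassLoader loader = new URLClassLoader(
                new URL[] { toolsJar.getCanonicalFile().toURI().toURL() },
                ToolsJarLoader.class.getClassLoader());
        // com.sun.tools.attach.VirtualMachine is one of the classes shipped in tools.jar.
        Class<?> vm = Class.forName("com.sun.tools.attach.VirtualMachine", true, loader);
        System.out.println("Loaded " + vm.getName() + " from " + toolsJar.getCanonicalPath());
    }
}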
If incorporating the classes from the dependency jar is an option, I'd go with creating a "Runnable JAR file". Basically you extract the classes from it and package them together with your own classes in the JAR. That eliminates the need for a wrapper script.
To do that in Eclipse, select your project, then File -> Export -> Java -> Runnable JAR file; that option requires that you have run the main class at least once, so Eclipse has a launch configuration to use when building the produced JAR.
I would like to ship my application as a self-contained jar file. The jar file should contain all the class files, as well as two shared libraries. One of these shared libraries is written for the JNI and is essentially an indirection to the other one (which is 100% C).
I have first tried running my jar file without the libraries, but having them accessible through the LD_LIBRARY_PATH environment variable. That worked fine.
I then put the JNI library into the jar file. I have read about loading libraries from jar files by copying them first to some temporary directory, and that worked well for me (note that the 100% C library was, I suppose, loaded as before).
Now I want to put both libraries into the jar, but I don't understand how I can make sure that they will both be loaded. Sure I can copy them both to a temporary directory, but when I load the "indirection" one, it always gives me:
java.lang.UnsatisfiedLinkError: /tmp/.../libindirect.so: /libpure.so: cannot open shared object file: No such file or directory
I've tried to force the JVM to load the "100% C" library first by explicitly calling System.load(...) on its temporary file, but that didn't work any better. I suspect the dynamic linker looks for it when resolving the dependencies of libindirect.so and doesn't care about what the JVM has already loaded.
Can anyone help me on that one?
Thanks
One way would be to spawn another Java process from the first, generating the appropriate invocation script.
The jar is invoked by the user
The libraries are extracted to a temp directory
A (bash) script is written to the temp directory
this sets/exports the necessary environment variables
this launches the second JRE instance
The code makes the script executable
The code invokes the script
I know, spawning two JRE instances to launch one app would not be my first choice either.
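A rough sketch of that relauncher idea, assuming the two libraries are packaged at the root of the jar as libpure.so and libindirect.so, and that the real entry point is a hypothetical com.example.RealMain:

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class Relauncher {
    public static void main(String[] args) throws Exception {
        // 1. Extract both shared libraries from the jar into a temp directory.
        Path tmp = Files.createTempDirectory("natives");
        for (String lib : new String[] { "libpure.so", "libindirect.so" }) {
            try (InputStream in = Relauncher.class.getResourceAsStream("/" + lib)) {
                Files.copy(in, tmp.resolve(lib), StandardCopyOption.REPLACE_EXISTING);
            }
        }

        // 2. Write a small shell script that exports LD_LIBRARY_PATH so the
        //    dynamic linker can resolve libindirect.so's dependency on libpure.so,
        //    then launches the real application in a second JVM.
        String jar = new java.io.File(Relauncher.class.getProtectionDomain()
                .getCodeSource().getLocation().toURI()).getAbsolutePath();
        Path script = tmp.resolve("run.sh");
        Files.write(script, java.util.Arrays.asList(
                "#!/bin/sh",
                "export LD_LIBRARY_PATH=\"" + tmp + ":$LD_LIBRARY_PATH\"",
                "exec java -cp \"" + jar + "\" com.example.RealMain \"$@\""));

        // 3. Make the script executable and invoke it.
        script.toFile().setExecutable(true);
        new ProcessBuilder(script.toString()).inheritIO().start().waitFor();
    }
}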
If you are using Eclipse IDE, then this answer might help you.
I had the same problem in Eclipse on Windows: I couldn't add the dependent .class files from the JNI.
After searching for a while I learned that it's a known bug in Eclipse. To work around it, I ported all the code to the NetBeans IDE.
Can not add all the classes files from the JNI folder in Eclipse (JAVA, Windows 7)
Most of the time, developers have a hard time debugging issues related to class loading, for reasons like:
1. The classpath contains two different jars with the same class in different versions.
2. General class-loading (class loader hierarchy) issues.
Although we could use the jar utility to delve into each and every jar, that is extremely tedious and error prone.
Is there a tool or some other mechanism to resolve this kind of issue?
Class loading is not simple in realistic setups either; consider, for example, how WebLogic loads classes for a particular EAR file.
Give Tattletale a try; it works with both Ant and Maven:
The tool will provide you with reports that can help you:
Identify dependencies between JAR files
Find missing classes from the classpath
Spot if a class/package is located in multiple JAR files
Spot if the same JAR file is located in multiple locations
With a list of what each JAR file requires and provides
Verify the SerialVersionUID of a class
Find similar JAR files that have different version numbers
Find JAR files without a version number
Find unused JAR archives
Identify sealed / signed JAR archives
Locate a class in a JAR file
Get the OSGi status of your project
Remove black listed API usage
I find running Java in verbose mode quite handy for resolving class path errors.
It will show you what classes and jars are being loaded by the program.
It can be a quick first step to try fix the problem without using a debugging program.
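For example (the jar name is a placeholder):

java -verbose:class -jar your_app.jar

Each loaded class is printed together with the JAR it came from, which makes duplicate or unexpected jars on the classpath easy to spot.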
I have a few batch Java command-line applications which are planned to be deployed as:
batch_apps
  app_1
    batch1.jar
    run_batch1.sh
  app_2
    batch2.jar
    run_batch3.sh
  {...etc...}
What would be the best practice for organizing a shared library pool, for example for log4j:
batch_apps
  app_1
    batch1.jar
    run_batch1.sh
  app_2
    batch2.jar
    run_batch3.sh
  libs
    log4j.jar
    ojdbc.jar
?
And include an individual log4j.xml in each app's own jar file?
I understand I would need to add 'libs' to the classpath, either in the manifests or in run_batchX.sh
(which way is preferable?)
I am mostly wondering what would be the most efficient setup performance-wise.
thanks
Having a shared libs directory at the root of your install dir is definitely the way to go. Since libs will be loaded in memory once, when the JVM launches, there is no impact on performance whatever solution you choose.
I would not put the classpath in the jar files, as this would force you to change your jars if you need to relocate your lib dir. Editing a script is much simpler.
I would not include the log4j conf file in your jar files either, for the same reason.
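A sketch of what such a launch script could look like with the layout above, using a classpath wildcard for the shared libs directory (the main class name is hypothetical; relative paths assume the script is run from inside app_1):

java -cp "batch1.jar:../libs/*" com.example.Batch1Main

The * wildcard expands to all .jar files in that directory (supported since Java 6); on Windows the path separator would be ; instead of :.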
It appears your applications don't share a single JVM instance. (i.e. They are individually started via 'java -jar batch1.jar' or some such.) Therefore, sharing library .jar files only saves you DISK space not RAM.
If the apps are not sharing a single JVM then ease-of-deployment should take precedence over disk space. (Your time is worth more than a few "wasted" MB.)
In this instance I would recommend making each application self contained, in either a single .jar file including all its libraries, or a single folder of .jar files. (i.e. Put the libs for each app in a folder for that app.)
If the apps were sharing a single JVM then I would recommend the shared library folder option.
You can use the Java extension mechanism. Place them in JAVA_HOME/lib/ext and they will be accessible to all apps. Of course, this may not be the best for all deployments, but it's certainly easier.
This doesn't directly answer your question, but I have tried the approach you propose and would now create a single jar per application instead (see how to do it with Ant). That way, there is no need to include anything on the classpath:
java -jar myApp.jar
is all you need. I find it cleaner but everything is debatable.
It doesn't make any difference from a performance point-of-view since each application is run inside its own JVM.
The only downside is that some libraries will be present in each jar file. That only costs more space on the HD, but these days MBs are pretty cheap :-) I'd trade simplicity (no external lib folder) and no jar hell (not placing your jars inside the Java ext folder) for storage cost any time. Unless your application includes terabytes of libraries, I think it's fine.
For the log4j configuration file, I would place one default file inside the jar but provide a sample config file (log4j-custom.xml.sample) that someone can modify and specify on the command line:
java -Dlog4j.configuration=log4j-custom.xml -jar myApp.jar