To speed up the startup time of the JVM, the Sun developers decided it would be a good idea to precompile the standard runtime classes for a platform during installation of the JVM. These precompiled classes can be found, e.g., at:
$JAVA_HOME/jre/bin/client/classes.jsa
My company currently develops a Java standalone application which brings its own JRE, so it would be a fantastic option to speed up our application start time by adding our own application classes to this jsa file, too.
I don't believe the JSA file was created by magic, so: How is it created? And how can I trick the JVM into incorporating my own classes?
EDIT: I already found out the following:
The classes.jsa is created by the command
java -Xshare:dump
The list of classes to incorporate in the dump can be found in $JAVA_HOME/jre/lib/classlist.
I even managed to add my own classes here (and to add them into the rt.jar for java to find them), and to generate my own checksum below the classlist file.
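For reference, entries in the classlist file are slash-separated binary class names, one per line; the last line here is a placeholder standing in for one of my own added classes:
java/lang/Object
java/lang/String
com/mycompany/MyClass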
The final problem is: only classes in the packages java, com.sun, and org.w3c seem to be recognized; if I leave the same classes in their original packages, they won't be loaded. I searched the whole OpenJDK source for pointers about this, but it seems to have something to do with protection domains. If someone is interested enough in this topic and knowledgeable enough, please add some pointers for me to investigate further.
As of Java 8u40 (and Embedded Java 8u51), Java now supports Application Class Data Sharing (AppCDS), i.e. your own classes in the shared archive. On our embedded Java, we've found a startup improvement of >40%! Pretty awesome for almost no work on our part...
https://blogs.oracle.com/thejavatutorials/entry/jdk_8u40_released
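For anyone looking for the concrete steps, here is a minimal sketch of the three-phase AppCDS workflow on 8u40; the JAR name and main class are placeholders:
# 1. Trial run that records every class the application loads
java -XX:+UnlockCommercialFeatures -XX:+UseAppCDS -XX:DumpLoadedClassList=app.lst -cp app.jar com.example.Main
# 2. Dump the shared archive from the recorded class list
java -XX:+UnlockCommercialFeatures -XX:+UseAppCDS -Xshare:dump -XX:SharedClassListFile=app.lst -XX:SharedArchiveFile=app.jsa -cp app.jar
# 3. Production run mapping in the archive
java -XX:+UnlockCommercialFeatures -XX:+UseAppCDS -Xshare:on -XX:SharedArchiveFile=app.jsa -cp app.jar com.example.Main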
You were almost there; you only need a couple more steps to make it work. To add your own classes to the classes.jsa you need the following:
The fully qualified names of your classes (you have this)
The classpath of these classes (you have it)
Know how to recalculate the checksum (you have it)
Dump the new file, providing the classpath of the classes you are now precompiling along with the Java classes.
Run the program, providing the same classpath that you used to dump the new classes.jsa
To provide the classpath where the classes you are adding to the classlist live, use the -Xbootclasspath/a option. It appends the given directories/JARs to the places the JVM searches for boot classes. The default space for the classes.jsa is quite small; if you need to increase it, you can use the -XX:SharedReadWriteSize and -XX:SharedReadOnlySize options. Your dump command will look similar to this:
java -Xshare:dump -Xbootclasspath/a:C:/myfiles/directoryA/;C:/myfiles/directoryB/;C:/myJars/myJar.jar;
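If the dump aborts because the shared regions are too small, the size options mentioned above can be raised; the values here are arbitrary examples:
java -Xshare:dump -XX:SharedReadWriteSize=30m -XX:SharedReadOnlySize=30m -Xbootclasspath/a:C:/myfiles/directoryA/;C:/myfiles/directoryB/;C:/myJars/myJar.jar;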
The last step is just to run the Java application normally, remembering to turn on the share mode. You also need to add the -Xbootclasspath/a option exactly as you did for the dump; note that the JVM options must come before the main class name. It will look similar to this:
java -Xshare:on -Xbootclasspath/a:C:/myfiles/directoryA/;C:/myfiles/directoryB/;C:/myJars/myJar.jar; MyApp
Now every class that you put on the classlist is shared with other JVM instances running on the same machine.
Interesting idea. As I read it, though, it's used for sharing class data across VMs and for speeding up class loading, not compilation. I'm not sure how much of a boost you would get, but it might be worth a try if you already have a big lag at startup (though the VM already tries to mitigate that).
As for trying it yourself, it appears this file is normally created when the Sun VM is installed, but you can also control it. Some details are in this older Sun Java 5 Class Data Sharing document (which you may have already seen?). Some Sun Java 6 docs also mention it a few times, but don't add much to the documentation. It seems it was originally an IBM VM feature. And, to continue the link dump, it's explained a bit in this article.
I don't personally know much about it, so I don't know how you might control it. You can regenerate it, but I don't think it's intended for you to put custom stuff into. Also, even if you can "trick" it, that would probably violate a Sun/Oracle license of some sort (you can't mess with rt.jar and redistribute, for instance). And, all that said, I doubt you would see a serious improvement in startup time unless you have thousands or tens of thousands of classes in your app.
(And this isn't really an answer, I know, but it was too big to fit in a comment, and I found the question interesting so I investigated a bit and put links here in case anyone finds the same info useful.)
It took a little figuring out, but I have four Java 8 VMs (version 1.8.0_162) running with shared classes. The following script was used to set up and test sharing, and with a little modification it could be used elsewhere:
#!/bin/bash
# Libraries to load
LIBS1="./lib/protobuf-java-2.6.1.jar:\
./lib/jetty-server-9.2.18.v20160721.jar:./lib/jetty-util-9.2.18.v20160721.jar:./lib/servlet-api-3.1.jar:./lib/jetty-http-9.2.18.v20160721.jar:./lib/jetty-io-9.2.18.v20160721.jar:\
./lib/derby.jar:\
./lib/json-simple-1.1.1.jar:"
LIBS2=":./lib/GTFS.jar"
# Uncomment these lines for the first phase where you are determining the classes to archive. During this phase aim to get as many classes loaded as possible
# which means loading a schedule and retrieving the stop list and next vehicle information
#
#APPCDS="-Xshare:off -XX:+UnlockCommercialFeatures -XX:+UseAppCDS -XX:DumpLoadedClassList=../GtfsAppCds.lst"
#java -Xmx512m $APPCDS -Dderby.system.home=database -classpath $LIBS1$LIBS2 com.transitrtd.GtfsOperatorManager
# Uncomment these lines when the class list is created and run to create the shared archive. Classes marked as unverifiable will need to be removed from the
# archived class list in GtfsAppCds.lst and the lines below run again. LIBS2 above contains jars which are left out of the archive. These are jars which change
# frequently and would therefore cause the archive to be frequently rebuilt.
#
#APPCDS="-Xshare:dump -XX:+UnlockCommercialFeatures -XX:+UseAppCDS -XX:SharedClassListFile=../GtfsAppCds.lst -XX:SharedArchiveFile=../GtfsAppCds.jsa"
#java -Xmx512m $APPCDS -classpath $LIBS1
# Uncomment these lines when wishing to verify the application is using the shared archive.
#
#APPCDS="-Xshare:on -XX:+UnlockCommercialFeatures -XX:+UseAppCDS -XX:SharedArchiveFile=../GtfsAppCds.jsa -verbose:class"
#java -Xmx512m $APPCDS -Dderby.system.home=database -classpath $LIBS1$LIBS2 com.transitrtd.GtfsOperatorManager
Note that the shared archive file (i.e. the .jsa file) is architecture-dependent and will need to be built on each target platform type.
Also, if a JAR uses sealed packages, a security exception is thrown; see
https://docs.oracle.com/javase/tutorial/deployment/jar/sealman.html
for information on sealed packages. This was the case above with derby.jar, but the problem could be solved by unpacking the JAR file, replacing Sealed: true with Sealed: false in the manifest, and repacking it.
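A sketch of that unpack/edit/repack cycle, assuming a Unix-like shell and a manifest that carries a Sealed: true attribute:
# unpack the jar into a scratch directory
mkdir derby-unsealed && cd derby-unsealed
jar xf ../derby.jar
# flip the Sealed attribute in the manifest
sed -i 's/^Sealed: true/Sealed: false/' META-INF/MANIFEST.MF
# repack, reusing the edited manifest
jar cfm ../derby-unsealed.jar META-INF/MANIFEST.MF .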
JARs built with older versions of Java cannot be used in a shared archive; in the case above, the Derby version needed to be upgraded from 10.10 to 10.14 to benefit.
Related
I am currently working on a small project. The idea is to use jQAssistant to fill the Neo4j database so that the data can be used by a REST API. The plan is to upload a JAR, WAR, or EAR to a Java backend so that it can be scanned (scan -f) and then start the Neo4j server on port 7474.
What I already have tried:
1. Trying to execute "scan" and "server" with Java ProcessBuilder and Runtime.
2. Importing JQAssistant Commandline Neo4jv3 - 1.6.0 with Gradle and trying to use the run method in Main.class with the command-line arguments (scan -f foldername).
Server start works without any problems in both cases, but scanning is a huge problem. It does not seem to scan the specified folder correctly. The jqassistant folder which has been created does not contain any scanned data.
I assume that the root of the problem is the plugins folder and the variables JQASSISTANT_HOME and JQASSISTANT_OPTS appearing in the jqassistant.cmd and .sh files.
Is it actually possible to execute "server" and especially "scan" within java code?
It is possible to use jQAssistant from Java code, but I'd not recommend it, as the underlying APIs are subject to change. What remains backwards compatible across releases are the command-line arguments, so going for the Main class as described in your question should be safe for a while. This approach is also used by the Gradle integration provided by Kontext E (http://techblog.kontext-e.de/jqassistant-with-gradle/).
Assuming that you're encountering the same problem with missing data when using the provided shell scripts for Windows/Linux: a common issue is that for scanning folders containing Java classes you need to specify a scope:
scan -f java:classpath::build/classes/main
The java:classpath prefix provides a hint that the folder shall be treated as a classpath element, see http://buschmais.github.io/jqassistant/doc/1.6.0/#_scanner and http://buschmais.github.io/jqassistant/doc/1.6.0/#cli:scan.
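For comparison, the same scan and server start through the distribution's own shell script (the script path depends on where the distribution was unpacked) would be:
bin/jqassistant.sh scan -f java:classpath::build/classes/main
bin/jqassistant.sh server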
I've been working on trying to get COBOL and Java to interact with each other on the mainframe, and have run into trouble specifically with the cob2 command, the z/OS UNIX equivalent of the compiler.
I haven't seen many user experiences with this compiler online, so I was wondering if I asked a more direct question, people would reveal their insight.
IBM has several examples of Java calling COBOL DLLs either directly or indirectly, but they ultimately boil down to: compile the COBOL as a DLL, use System.load, compile the Java, and run. These examples haven't worked for me, for the following reasons.
When using cob2 with the -c option, it is supposed to generate a .o object file. This has not happened for me, although it did generate an empty .lst file. I was able to work around this by simply skipping the -c step and compiling and linking using this series of commands:
sh ${COB2HOME}/bin/cob2 -o ${DIR}/c2jcr.o -qdll,thread,case=mixed ${DIR}/c2jcr.cbl
${COB2HOME}/bin/cob2 -o ${DIR}/libc2jcr.so -bdll,case=mixed ${DIR}/c2jcr.o \
    ${JAVAHOME}/bin/j9vm/libjvm.x ${COB2HOME}/lib/igzcjava.x
This appears to produce the .so library that is required to link with the Java program, but upon investigation of the load, and during the run, the system reports that the LE CSECT CEESTART is not there.
Am I missing something in my cob2 library that has these LE modules, or somewhere in my scripting? I tried pulling in loads from the mainframe compiled with the LE modules intact and ENTRY CEESTART explicitly stated in the link step, but could not get any further than "UnsatisfiedLinkError" with "Internal Error".
Any wisdom is greatly appreciated, especially if you've gone down a completely different route to call COBOL from Java. Thank you very much.
After conferring with IBM, it turns out I had a couple things missing.
You must have a STEPLIB environment variable set to the location of your COBOL compiler on the mainframe, so it can find the IGYCRCTL module.
Second, as with other COBOL 5+ compiles, you must allocate a gargantuan amount of space in order to compile; 2 GB is not enough. Since I don't have permission to reallocate this in Unix, I ran a BPXBATCH job with REGION=0M.
After those two changes, -c compiles came out normally. The "workaround" I provided in the question is completely incorrect. You must use:
sh ${COB2HOME}/bin/cob2 -c -qdll,thread,case=mixed ${DIR}/${COBPROG}.cbl
as your compile step, and the rest is just linkage.
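Putting it all together, the working sequence looks roughly like this; the STEPLIB dataset name is a placeholder for your site's COBOL compiler library:
# point z/OS UNIX at the COBOL compiler load library (placeholder dataset name)
export STEPLIB=IGY.SIGYCOMP
# compile to an object file
sh ${COB2HOME}/bin/cob2 -c -qdll,thread,case=mixed ${DIR}/${COBPROG}.cbl
# link the object into a DLL, binding in the JVM and the COBOL-Java stubs
sh ${COB2HOME}/bin/cob2 -o ${DIR}/lib${COBPROG}.so -bdll,case=mixed ${COBPROG}.o \
    ${JAVAHOME}/bin/j9vm/libjvm.x ${COB2HOME}/lib/igzcjava.x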
MATLAB is configured to search its static Java class path before searching the user-modifiable dynamic path. Unfortunately, the static path contains quite a number of very old public libraries, so if you are trying to use a newer version you may end up loading the wrong implementation and get errors.
For instance, the static path contains an old copy of google-collections.jar, which has long been supplanted by Google's Guava library and which shares some class names with it (e.g. com.google.common.base.Objects). As a result, if you invoke a Guava method that uses a newer method of one of these classes, you will get surprising NoSuchMethodErrors because the google-collections JAR is found first.
As of R2012b, MATLAB lets you specify additional jars to add to the static path by putting a javaclasspath.txt file in your preferences folder, but that adds jars to the end of the path, and doesn't let you override jars that are built into MATLAB.
So what is the best way around this?
I got an official response from Mathworks:
As of MATLAB R2013a (also in R2012b), classes can be added to the front of the static Java class path by including the following line in javaclasspath.txt:
<before>
Any directory that is after this line in javaclasspath.txt will be added to the front of the static Java class path. This is an undocumented use of javaclasspath.txt as of R2013a.
But overall in MATLAB, the ability to add classes to the front of the static Java classpath is not available through javaclasspath.txt in MATLAB 8.0 (R2012b).
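For example, a javaclasspath.txt that forces a newer Guava build ahead of MATLAB's bundled classes might look like this (the JAR path is hypothetical):
<before>
C:\Users\user\Documents\jars\guava-14.0.jar
Everything listed after the <before> line ends up in front of MATLAB's built-in static path entries.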
MATLAB searches for classpath.txt in the following order:
In the startup directory. As of MATLAB 8.0 (R2012b) a warning will be shown if the file is found there and it will be ignored.
In the first directory on the MATLABPATH environment variable. (This environment variable is used in the bin/matlab shell script on Linux and in general is not used by the end-user).
In the toolbox/local directory.
Although the MATLABPATH environment variable of point 2 is normally not used by end-users we can use it in a workaround to allow reading a custom classpath.txt outside of the toolbox/local directory.
On Windows:
You will need to create the MATLABPATH environment variable. The first directory on it should be the directory with your custom classpath.txt, and you will also need to add the toolbox\local directory as the second entry. So from a cmd prompt you could do:
set MATLABPATH=c:\Users\user\Documents\myMATLABClasspath;c:\Program Files\MATLAB\R2012b\toolbox\local
matlab.exe
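On Linux, a sketch of the same workaround (the paths are examples) would be:
export MATLABPATH=$HOME/myMATLABClasspath:/usr/local/MATLAB/R2012b/toolbox/local
matlab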
One hack that appears to work is to add the JAR to the top of the classpath.txt file that can be found in your MATLAB installation's toolbox/local folder. Unfortunately, this file is automatically generated and may get rewritten at some unspecified time, such as when you install new toolboxes, so this approach would require you to have some way of noticing when this happens and reapplying the hack.
If you're distributing a JAR that's intended to be used with MATLAB, it may be better to use ProGuard as described at http://code.google.com/p/guava-libraries/wiki/UsingProGuardWithGuava.
If you specify that all of your classes and their (public) fields and methods are to be preserved and include guava as a program jar (not a library), then it will rename all of guava's methods and update your compiled bytecode to reference the new names.
It seems a bit hackish, but depending on the audience, it may be significantly easier than teaching your users about the static vs. dynamic classpath, and it won't break any MATLAB code that depends on the old behavior.
Instead of obfuscating the package as suggested by @user2443532, I have found it easier to "shade" the conflicting package - unless you actually need obfuscation. One easy way to do this is to build your package using Maven and the maven-shade-plugin. Internal calls are modified automatically, so you don't need to modify any of the Java code.
Direct calls from Matlab will need to be modified - for example, calls to com.opensource.Class become shaded.com.opensource.Class.
For more info on shading, see What is the maven-shade-plugin used for, and why would you want to relocate Java packages?
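As a sketch, the relevant pom.xml fragment for the relocation in the example above might look like this (plugin version omitted; pick a current one):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- rewrite com.opensource.* to shaded.com.opensource.* -->
            <pattern>com.opensource</pattern>
            <shadedPattern>shaded.com.opensource</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>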
We are developing a fairly large project and have many dependencies. Recently, we ran into an issue with a conflict between two of them, agileAPI.jar and axis.jar. Both are 3rd party libraries.
The code in question depends directly on agileAPI.jar. If I build it with just that in the build path, everything that depends on it works correctly.
As soon as I add axis.jar to the build path (just adding it, not writing code that depends on it), everything goes wrong. Some of the code that depended on the first library is now throwing exceptions from the second library. It is as if the first library is picking and choosing methods to call from the second library, instead of wherever it was calling them from previously.
I have code in the project that needs axis.jar directly, so I can't just remove it from the build path. I need to find a way to have these two exist in the same build path, but ignore each other.
It should be noted that both libraries coexisted prior to a recent upgrade of Agile. I have been working with Oracle's support team to try and resolve this. After two weeks, though, I am looking for other sources of help.
Our environment is Windows and Eclipse, although in testing this, it also occurs when running java from a command line. Our JDK is 1.5.0_22.
Any help would be appreciated.
Thank you,
David
EDIT:
As requested, here are the stack traces that we see. The first stack trace is printed by code beyond my control:
java.lang.NoSuchMethodError: org.apache.axis.description.OperationDesc.setStyle(Lorg/apache/axis/constants/Style;)V
at com.agile.webfs.components.fileserver.client.FileServerSoapBindingStub._initOperationDesc1(FileServerSoapBindingStub.java:37)
at com.agile.webfs.components.fileserver.client.FileServerSoapBindingStub.<clinit>(FileServerSoapBindingStub.java:20)
at com.agile.webfs.components.fileserver.client.FileServerWSServiceLocator.getFileServer(FileServerWSServiceLocator.java:43)
at com.agile.webfs.client.IFSLocator.getRemoteFileServer(IFSLocator.java:128)
at com.agile.webfs.client.IFSLocator.getConnection(IFSLocator.java:101)
at com.agile.api.pc.EJBLookup.createFileSession(EJBLookup.java:444)
at com.agile.api.pc.EJBLookup.getFileSession(EJBLookup.java:432)
at com.agile.api.pc.attachment.IFSOutputStream.getFileSession(IFSOutputStream.java:133)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:87)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:115)
at com.agile.api.pc.TableAttachment.uploadFile(TableAttachment.java:886)
at com.agile.api.pc.TableAttachment$AddFiles2Action.doSdkAction(TableAttachment.java:724)
at com.agile.api.common.SDKAction.run(SDKAction.java:23)
at com.agile.api.common.OracleAuthenticator.doAs(OracleAuthenticator.java:131)
at com.agile.api.common.Security.doAs(Security.java:54)
at com.agile.api.common.Security.doAs(Security.java:109)
at com.agile.api.pc.TableAttachment.addFiles2(TableAttachment.java:483)
at com.agile.api.pc.TableAttachment.createNewBlob2(TableAttachment.java:459)
at com.agile.api.pc.TableAttachment.doCreateServerRowWithParam(TableAttachment.java:363)
at com.agile.api.pc.Table.createTableRow(Table.java:238)
at com.agile.api.pc.TableAttachment.createTableRow(TableAttachment.java:169)
at com.agile.api.pc.Table.createRow(Table.java:202)
at com.[snip].updateAttachments(VaultImportService.java:3068)
at com.[snip].processIncorporatedFile(VaultImportService.java:926)
at com.[snip].processPdxFile(VaultImportService.java:532)
at com.[snip].processPdxRequest(VaultImportService.java:388)
at com.[snip].<init>(VaultImportService.java:299)
at com.[snip].main(VaultImportService.java:3660)
After the exception bubbles up and we catch it, the stacktrace that we print looks like:
at com.agile.api.pc.Session.createError(Session.java:1772)
at com.agile.api.pc.EJBLookup.createFileSession(EJBLookup.java:454)
at com.agile.api.pc.EJBLookup.getFileSession(EJBLookup.java:432)
at com.agile.api.pc.attachment.IFSOutputStream.getFileSession(IFSOutputStream.java:133)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:87)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:115)
at com.agile.api.pc.TableAttachment.uploadFile(TableAttachment.java:886)
at com.agile.api.pc.TableAttachment$AddFiles2Action.doSdkAction(TableAttachment.java:724)
at com.agile.api.common.SDKAction.run(SDKAction.java:23)
at com.agile.api.common.OracleAuthenticator.doAs(OracleAuthenticator.java:131)
at com.agile.api.common.Security.doAs(Security.java:54)
at com.agile.api.common.Security.doAs(Security.java:109)
at com.agile.api.pc.TableAttachment.addFiles2(TableAttachment.java:483)
at com.agile.api.pc.TableAttachment.createNewBlob2(TableAttachment.java:459)
at com.agile.api.pc.TableAttachment.doCreateServerRowWithParam(TableAttachment.java:363)
at com.agile.api.pc.Table.createTableRow(Table.java:238)
at com.agile.api.pc.TableAttachment.createTableRow(TableAttachment.java:169)
at com.agile.api.pc.Table.createRow(Table.java:202)
at com.[snip].updateAttachments(VaultImportService.java:3068)
at com.[snip].processIncorporatedFile(VaultImportService.java:926)
at com.[snip].processPdxFile(VaultImportService.java:532)
at com.[snip].processPdxRequest(VaultImportService.java:388)
at com.[snip].<init>(VaultImportService.java:299)
at com.[snip].main(VaultImportService.java:3660)
In both cases, the line "at com.agile.api.pc.Table.createRow(Table.java:202)" is the agileAPI call that I am making. I have removed our package structure, as it identifies the company that I work for. They value privacy and security.
I'd advise you to check these two things first:
Open the axis.jar file with some zip utility, like 7-Zip or WinRAR. See if there's a folder called "services" in the META-INF folder in the JAR. If there is, it's possible that the axis.jar file specifies implementations for specific interfaces that somehow don't interoperate with agileAPI. Do the same for agileAPI.jar, since it might itself declare an interface implementation that axis doesn't like.
Open both agileAPI.jar and axis.jar with a zip utility, then check whether there are packages with the same name. If there are none, it isn't a naming conflict. If there are one or more, open the corresponding folders and repeat the check recursively. If you end up with at least one class with the same name in the same package across the two JARs, it's probably a naming conflict.
That should catch the most obvious issues; a command sketch for automating the duplicate-class check follows below. If none of this is the case, we'll need to look deeper.
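A quick way to automate the duplicate-class check from point 2, assuming a Unix-like shell with unzip available:
# list the class entries of each jar, then print the names present in both
unzip -Z1 agileAPI.jar | grep '\.class$' | sort > agile.txt
unzip -Z1 axis.jar | grep '\.class$' | sort > axis.txt
comm -12 agile.txt axis.txt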
A way to solve such classpath issues is to use a module system such as OSGi or the NetBeans Platform module system, where each module has its own classloader.
An application I have written uses several third-party JARs. Sometimes only a small portion of an entire 50 kB to 1.7 MB JAR is used: one or two method calls or classes.
What is the best way to reduce the JAR sizes? Should I download the sources and build a JAR with just the classes I need? What existing tools can help automate this (e.g. I briefly looked at http://code.google.com/p/jarjar/)?
Thank you
Edit 1:
I would like to lower the size of my third-party 'official' JARs, like swingx-1.6.jar (1.4 MB), set-3.6 (1.7 MB), glazedlists-1.8.jar (820 kB), etc., so that they contain only the bare minimum of classes I need.
Edit 2:
Minimizing a JAR by hand or by using a program like ProGuard is further complicated if the library uses reflection.
Injection with Google Guice does not work anymore after obfuscation with ProGuard.
The answer by cletus on another post is very good: How to determine which classes are used by a Java program?
ProGuard would be an option. It can eliminate unused classes and methods, and you can also use it to obfuscate, which can further reduce the size of your final JAR. Be aware that class loading by name is liable to break unless care is taken to keep the affected classes unobfuscated.
I've found ProGuard quite effective, though it can be a bit cryptic to understand at the outset. I don't have any experience with similar tools to offer a comparison.
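As a sketch, a minimal ProGuard configuration that shrinks without obfuscating might look like this; the JAR names and entry point are hypothetical:
-injars  swingx-1.6.jar
-injars  myapp.jar
-outjars myapp-min.jar
-libraryjars <java.home>/lib/rt.jar
-dontobfuscate
# keep the entry point so ProGuard can trace which classes are reachable
-keep public class com.example.MyApp {
    public static void main(java.lang.String[]);
}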
First of all, using only one class from a JAR file does not mean that this class does not use other classes from that JAR.
One option, if you use open-source JARs, is to get the sources of that JAR, attach them to your project, remove the unnecessary stuff, and build the result yourself.
You could add GenJar as an Ant task and use it to build the JAR. As it says on the library's home page:
GenJar is a specialized Ant task that builds jar files based on class dependencies rather than simply the contents of a directory.
You can find it on SourceForge.