An application I have written uses several third-party jars. Sometimes only a small portion of an entire 50 kB to 1.7 MB jar is used - one or two function calls or classes.
What is the best way to reduce the jar sizes? Should I download the sources and build a jar with just the classes I need? What existing tools can help automate this (e.g. I briefly looked at http://code.google.com/p/jarjar/)?
Thank you
Edit 1:
I would like to reduce the size of my third-party 'official' jars like swingx-1.6.jar (1.4 MB), swt-3.6.jar (1.7 MB), glazedlists-1.8.jar (820 kB), etc., so that they only contain the bare minimum of classes I need.
Edit 2:
Minimizing a jar by hand or with a program like ProGuard is further complicated if the library uses reflection.
Injection with Google Guice no longer works after obfuscation with ProGuard.
The answer by cletus on another post, "How to determine which classes are used by a Java program?", is very good.
ProGuard would be an option. It can eliminate unused classes and methods. You can also use it to obfuscate, which can further reduce the size of your final jar. Be aware that class loading by name is liable to break unless care is taken to keep the affected classes unobfuscated.
I've found ProGuard quite effective - it can be a bit cryptic to understand at the outset, but I don't have any experience with similar tools to offer a comparison.
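For reference, a shrink-only ProGuard configuration might look roughly like the sketch below. The jar names, the entry-point class, and the reflectively loaded package are placeholders rather than anything from the question; the -keep rules are what stop classes loaded by name (or injected by Guice) from being stripped.

# Hypothetical shrink-only configuration; adjust jar names and class names to your project.
-injars application.jar
-injars swingx-1.6.jar
-outjars application-min.jar
-libraryjars <java.home>/lib/rt.jar

# Shrink only; skip obfuscation so reflection and readable stack traces keep working.
-dontobfuscate

# Keep the entry point so ProGuard can trace everything reachable from it.
-keep public class com.example.Main {
    public static void main(java.lang.String[]);
}

# Keep anything loaded by name or injected reflectively (e.g. Guice-managed classes).
-keep class com.example.injected.** { *; }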
First of all, if you use only one class from a JAR file, this does not mean that this class does not use other classes from that JAR.
One option, if you use open-source JARs, is to get the sources of that JAR, attach them to your project, remove the unnecessary stuff, and build the result yourself.
You could add GenJar as an Ant task and use it to build the JAR. As it says on the library's home page,
GenJar is a specialized Ant task that builds jar files based on class dependencies rather than simply the contents of a directory.
You can find it on SourceForge.
Related
I have a Scala Akka application which connects to HBase (currently CDP, earlier HDP), deployed on Rancher. We never faced any trouble when connecting to HDP HBase. Since the recent HDP to CDP change, with the same image, we are getting a "no such method" error on one of the dependencies' classes in one of the containers, whereas another container of the same image connects to HBase properly, even though the jar exists in the same image and on the classpath.
One noticeable difference is a change in the order of the classpath.
Does a change in the classpath order affect which jars' classes are used?
Would Java libraries/classes load in a different order when they hit faster CPU cycles at startup?
What could be the reason for such a "no such method found" error?
It certainly can, if the same class file is present in different classpath entries. For example, if your classpath is: java -cp a.jar:b.jar com.foo.App, and:
a.jar:
pkg/SomeClass.class
b.jar:
pkg/SomeClass.class
Then this can happen - usually because one of the jars on your classpath is an older version than the other, or, the same problem but more complicated: one of the jars on your classpath contains a whole heap of different libraries all squished together, and one of those components is an older version.
There are some basic hygiene rules to observe:
Don't squish jars together. If you have 500 deps, put 500 entries on your classpath. We have tools to manage this stuff; use them. Don't make striped jars, uber jars, etc.
Use dependency trackers to check whether there are version differences in your dependency chain. If your app depends on, say, 'hibernate' and 'jersey', and they both depend on Google's Guava libraries, but hibernate imports v26 and jersey imports v29, that's problematic. Be aware of it and ensure that you explicitly decide which version ends up being used. Presumably, you'd want to explicitly pick v29 and perhaps check that hibernate also runs on v29*. If it doesn't, you have bigger problems. They are fixable (with modular classloaders), but not easily.
*) Neither hibernate nor jersey actually depend on guava, I'm just using them as hypothetical examples.
For example, if you use Maven, check out the enforcer plugin (groupId: org.apache.maven.plugins, artifactId: maven-enforcer-plugin).
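As an illustration (not taken from the question's build), the enforcer plugin's built-in dependencyConvergence rule fails the build when two dependencies pull in different versions of the same artifact; mvn dependency:tree is also useful for seeing where each version comes from.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-dependency-convergence</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- Fails the build if the dependency tree contains conflicting versions. -->
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>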
My bet is that there is another version of the jar somewhere in CDP, and occasionally it is loaded before the version that you ship with your project, causing the error.
So, when your container starts, try logging from which location the conflicting class is loaded. This question might help you: Determine which JAR file a class is from
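As a small sketch (the class name is a placeholder for whichever class appears in the error), you can ask the JVM where a class was actually loaded from:

import java.security.CodeSource;

public class ClassOrigin {
    public static void main(String[] args) throws ClassNotFoundException {
        // Replace with the class named in the "no such method" error.
        Class<?> clazz = Class.forName("com.example.SomeDependencyClass");
        CodeSource source = clazz.getProtectionDomain().getCodeSource();
        System.out.println(clazz.getName() + " loaded from "
                + (source != null ? source.getLocation() : "bootstrap/unknown"));
    }
}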
Our project has started recently, and we wanted to know some of the best practices followed in industry. We have generated a lot of DTO (getters and setters) code for web services using JAXB. We keep all the DTO classes along with the regular POJOs (with logic written in them); the project looks large due to this auto-generated code, and code coverage also counts these classes.
I am keen to know whether these classes should go into a jar file on the classpath or stay as classes in the project.
Thanks in Advance,
Madhavi
If your project uses Maven (or something similar) I would advise having the code generation and the generated code in a separate module of a multi module project.
This way the generated stuff is out of the way of the hand crafted code. You can also set up your Maven build process to then build this module first and the rest of the code can rely on the resulting artefact, be it a jar or something else.
You could also regenerate the generated code on each new build this way, although this can be a lengthy process, depending on the service.
Generated files should not be mixed with your written files.
A common approach is to generate them into the target folder, e.g. target/generated-sources or something similar. Of course, if they are rarely changed, you could also put them in a jar file that you import into your project.
I think it's better to keep them in a jar, as it's auto-generated code and no one is supposed to change it. Whenever it is regenerated, include the new jar.
We are developing a fairly large project and have many dependencies. Recently, we ran into an issue with a conflict between two of them, agileAPI.jar and axis.jar. Both are 3rd party libraries.
The code in question depends directly on agileAPI.jar. If I build it with just that in the build path, everything that depends on it works correctly.
As soon as I add axis.jar to the build path (just adding it, not writing code that depends on it), everything goes wrong. Some of the code that depended on the first library is now throwing exceptions from the second library. It is as if the first library is picking and choosing methods to call from the second library, instead of wherever it was calling them from before.
I have code in the project that needs axis.jar directly, so I can't just remove it from the build path. I need to find a way to have these two exist in the same build path, but ignore each other.
It should be noted that both libraries coexisted prior to a recent upgrade with agile. I have been working with Oracle's support team to try and resolve this. After two weeks, though, I am looking for other sources of help.
Our environment is Windows and Eclipse, although in testing this, it also occurs when running java from a command line. Our JDK is 1.5.0_22.
Any help would be appreciated.
Thank you,
David
EDIT:
As requested, here are the stack traces that we see. The first stack trace is printed by code beyond my control:
java.lang.NoSuchMethodError: org.apache.axis.description.OperationDesc.setStyle(Lorg/apache/axis/constants/Style;)V
at com.agile.webfs.components.fileserver.client.FileServerSoapBindingStub._initOperationDesc1(FileServerSoapBindingStub.java:37)
at com.agile.webfs.components.fileserver.client.FileServerSoapBindingStub.<clinit>(FileServerSoapBindingStub.java:20)
at com.agile.webfs.components.fileserver.client.FileServerWSServiceLocator.getFileServer(FileServerWSServiceLocator.java:43)
at com.agile.webfs.client.IFSLocator.getRemoteFileServer(IFSLocator.java:128)
at com.agile.webfs.client.IFSLocator.getConnection(IFSLocator.java:101)
at com.agile.api.pc.EJBLookup.createFileSession(EJBLookup.java:444)
at com.agile.api.pc.EJBLookup.getFileSession(EJBLookup.java:432)
at com.agile.api.pc.attachment.IFSOutputStream.getFileSession(IFSOutputStream.java:133)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:87)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:115)
at com.agile.api.pc.TableAttachment.uploadFile(TableAttachment.java:886)
at com.agile.api.pc.TableAttachment$AddFiles2Action.doSdkAction(TableAttachment.java:724)
at com.agile.api.common.SDKAction.run(SDKAction.java:23)
at com.agile.api.common.OracleAuthenticator.doAs(OracleAuthenticator.java:131)
at com.agile.api.common.Security.doAs(Security.java:54)
at com.agile.api.common.Security.doAs(Security.java:109)
at com.agile.api.pc.TableAttachment.addFiles2(TableAttachment.java:483)
at com.agile.api.pc.TableAttachment.createNewBlob2(TableAttachment.java:459)
at com.agile.api.pc.TableAttachment.doCreateServerRowWithParam(TableAttachment.java:363)
at com.agile.api.pc.Table.createTableRow(Table.java:238)
at com.agile.api.pc.TableAttachment.createTableRow(TableAttachment.java:169)
at com.agile.api.pc.Table.createRow(Table.java:202)
at com.[snip].updateAttachments(VaultImportService.java:3068)
at com.[snip].processIncorporatedFile(VaultImportService.java:926)
at com.[snip].processPdxFile(VaultImportService.java:532)
at com.[snip].processPdxRequest(VaultImportService.java:388)
at com.[snip].<init>(VaultImportService.java:299)
at com.[snip].main(VaultImportService.java:3660)
After the exception bubbles up and we catch it, the stacktrace that we print looks like:
at com.agile.api.pc.Session.createError(Session.java:1772)
at com.agile.api.pc.EJBLookup.createFileSession(EJBLookup.java:454)
at com.agile.api.pc.EJBLookup.getFileSession(EJBLookup.java:432)
at com.agile.api.pc.attachment.IFSOutputStream.getFileSession(IFSOutputStream.java:133)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:87)
at com.agile.api.pc.attachment.IFSOutputStream.copyFrom(IFSOutputStream.java:115)
at com.agile.api.pc.TableAttachment.uploadFile(TableAttachment.java:886)
at com.agile.api.pc.TableAttachment$AddFiles2Action.doSdkAction(TableAttachment.java:724)
at com.agile.api.common.SDKAction.run(SDKAction.java:23)
at com.agile.api.common.OracleAuthenticator.doAs(OracleAuthenticator.java:131)
at com.agile.api.common.Security.doAs(Security.java:54)
at com.agile.api.common.Security.doAs(Security.java:109)
at com.agile.api.pc.TableAttachment.addFiles2(TableAttachment.java:483)
at com.agile.api.pc.TableAttachment.createNewBlob2(TableAttachment.java:459)
at com.agile.api.pc.TableAttachment.doCreateServerRowWithParam(TableAttachment.java:363)
at com.agile.api.pc.Table.createTableRow(Table.java:238)
at com.agile.api.pc.TableAttachment.createTableRow(TableAttachment.java:169)
at com.agile.api.pc.Table.createRow(Table.java:202)
at com.[snip].updateAttachments(VaultImportService.java:3068)
at com.[snip].processIncorporatedFile(VaultImportService.java:926)
at com.[snip].processPdxFile(VaultImportService.java:532)
at com.[snip].processPdxRequest(VaultImportService.java:388)
at com.[snip].<init>(VaultImportService.java:299)
at com.[snip].main(VaultImportService.java:3660)
In both cases, the line "at com.agile.api.pc.Table.createRow(Table.java:202)" is the agileAPI call that I am making. I have removed our package structure, as it identifies the company that I work for. They value privacy and security.
I'd advise you to check these two things first:
Open the axis.jar file with some zip utility, like 7-Zip or WinRar. See if there's a folder called "services" in the META-INF folder in the jar. If there is, it's possible that the axis.jar file specifies implementations for specific interfaces that somehow don't interoperate with agileAPI. Also do the same for agileAPI.jar, since it might itself declare an interface implementation that axis doesn't like.
Open both agileAPI.jar and axis.jar with a zip utility, then check whether there are packages with the same name. If there are none, it isn't a naming conflict. If there are one or more, open the corresponding folders and do the same check recursively. If you end up with at least one class with the same name in the same package across the two jars, it's probably a naming conflict.
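To automate the second check, a small utility along these lines (the jar paths are placeholders; it uses only the standard java.util.jar API, so it also runs on JDK 1.5) lists the class entries that appear in both jars:

import java.util.Enumeration;
import java.util.HashSet;
import java.util.Iterator;
import java.util.Set;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class DuplicateClassFinder {
    static Set<String> classEntries(String jarPath) throws Exception {
        Set<String> names = new HashSet<String>();
        JarFile jar = new JarFile(jarPath);
        try {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                String name = entries.nextElement().getName();
                if (name.endsWith(".class")) {
                    names.add(name);
                }
            }
        } finally {
            jar.close();
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder paths; point these at the actual jar locations.
        Set<String> overlap = classEntries("agileAPI.jar");
        overlap.retainAll(classEntries("axis.jar")); // keep only entries present in both jars
        for (Iterator<String> it = overlap.iterator(); it.hasNext(); ) {
            System.out.println(it.next());
        }
    }
}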
That should catch the most obvious issues. If none of this is the case, we'll need to look deeper.
A way to solve such classpath issues is to use a module system such as OSGi or the NetBeans Platform module system where each module has its own classloader.
I have a project to which I want to add plugins. I have all the interfaces/factories/etc. set up (my gateway interface is called ApplicationMonitorFactory); I just need a way to locate/activate the plugins. My configuration file is a Java properties file.
I think what I need to do is:
find a good way to specify a set of one or more plugins
for each plugin, run it
1. find a good way to specify a set of one or more plugins
something like:
application.plugins=foo-monitor.jar,bar-monitor.jar
I think maybe it's just best to specify a list of jar files; for each jar file specified, the implication is that it contains one or more classes which implement ApplicationMonitorFactory, and these are the ones that will be instantiated. (I might also add an annotation @ApplicationMonitorPlugin so that a .jar file can have a test ApplicationMonitorFactory that does not get instantiated.)
Does this sound reasonable?
2. for each plugin, run it
I did this once a while back, and if I remember right I think I need to use a custom classloader to add the appropriate .jar file to the classpath dynamically. Or is there an easier way?
Could I suggest using OSGi instead? If it's a server-side project, something like Apache Karaf gives you quite a lot out of the box in terms of plugin deployment and specification.
To answer the questions based on what you have at the moment:
1. find a good way to specify a set of one or more plugins
The properties file approach is fine. You may want to just be able to drop plugins into a folder that you monitor if you want hot deploy. Having just one jar file per plugin does limit plugin developers to packaging all of their dependencies into a single jar file (the Maven Shade plugin is useful for this). The annotation approach should work (it is the approach that Servlet 3.0 uses). Using OSGi, you'd have a manifest file with a Bundle-Activator property that would reference the plugin class that should be instantiated.
2. for each plugin, run it
Yes, you would need to fire up a classloader for the jar files. This is where things get a bit hairier. It's easy enough to do, but class loading has all sorts of gotchas. This is where OSGi would really help, even though it is a bit of an upfront cost.
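To illustrate the classloader step, here is a rough sketch. It assumes your ApplicationMonitorFactory interface is on the host classpath and, rather than a custom annotation, uses the JDK's ServiceLoader (Java 6+), which picks up implementations listed by each plugin jar in a META-INF/services file named after the fully qualified interface:

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ServiceLoader;

public class PluginLoader {
    // pluginList is the value of the application.plugins property,
    // e.g. "foo-monitor.jar,bar-monitor.jar".
    public static void loadPlugins(String pluginList) throws Exception {
        for (String jarName : pluginList.split(",")) {
            URL jarUrl = new File(jarName.trim()).toURI().toURL();
            // One classloader per plugin jar, delegating to the application classloader.
            URLClassLoader loader = new URLClassLoader(
                    new URL[] { jarUrl }, PluginLoader.class.getClassLoader());
            // Finds every ApplicationMonitorFactory implementation the jar declares;
            // what you then call on each factory depends on your own interface.
            for (ApplicationMonitorFactory factory
                    : ServiceLoader.load(ApplicationMonitorFactory.class, loader)) {
                System.out.println("Found plugin factory: " + factory.getClass().getName());
            }
        }
    }
}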
I am looking for a replacement for javadeps, which I used to use to generate sections of a Makefile to specify which classes depended on which source files.
Unfortunately javadeps itself has not been updated in a while, and cannot parse generic types or static imports.
The closest thing I've found so far is Dependency Finder. It almost does what I need but does not match non-public classes to their source files (as the source filename does not match the class name.) My current project has an interface whose only client is an inner class of a package-private class, so this is a significant problem.
Alternatively if you are not aware of a tool that does this, how do you do incremental compilation in large Java projects using command-line tools? Do you compile a whole package at a time instead?
Notes:
javadeps is not to be confused with jdepend, which is for a very different purpose.
This question is a rewrite of "Tool to infer dependencies for a java project" which seemed to be misunderstood by 2 out of 3 responders.
I use the <depend> task in Ant, which is OK but not 100% trustworthy. Supposedly JavaMake can do this dependency analysis, but it seems to be rarely updated and the download page is only sometimes available.
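For reference, a typical invocation looks roughly like this (paths are placeholders), placed before the <javac> call so that stale class files are deleted first; closure="yes" removes dependent classes transitively, though, as the Ant docs note, changes such as inlined compile-time constants can still slip through:

<!-- Hypothetical paths; run this before <javac> in the compile target. -->
<depend srcdir="src"
        destdir="build/classes"
        cache="build/depcache"
        closure="yes"/>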