For reasons I don't even want to begin to get into... I have a Maven hierarchy that looks like the one below. In a nutshell, everything requires commonslang3, except one ancient artifact that requires commonslang2.
We have no issues with compile or runtime, the dependencies work as expected. The challenge we are having is at development time.
We'd like to ensure everyone on the team uses the commonslang3 APIs, but occasionally (because of the ancient artifact and Eclipse auto suggest), someone accidentally uses the commonslang2 APIs.
Normally, we would just force the desired version in our POM, but commonslang is a special snowflake. The package name changed between commonslang2 and commonslang3, which means we would get compile failures if we excluded the older library. E.g.,
org.apache.commons.lang3.StringUtils
org.apache.commons.lang.StringUtils
My question is this: how can I configure Maven/Eclipse to use commonslang2 as needed during compile, but not populate it in the Eclipse class auto-suggest list? My desired end state is that someone types 'stringuti' + Ctrl + Space, and the only option they see is commonslang3. I am aware that each developer can remove individual classes via Window->Preferences->Java->Appearance->Type Filters, but that is not a viable solution for two reasons: 1) it's a large team with frequently changing resources... 2) I need an entire artifact removed, as in hundreds of classes.
Example Tree:
MyWar
-- MyModuleJar1
-- ...
-- MyModuleJar2
   -- LibA
      -- commonslang
      -- ...
   -- LibB
      -- commonslang3
      -- ...
   -- LibC
      -- commonslang3
      -- ...
-- ...
In Eclipse:
Window->Preferences->Java->Appearance->Type Filters
Add org.apache.commons.lang.*
Because you want to affect auto-complete, which is a function of the IDE, you are forced to change the setting in the IDE. You can export the preferences and share them with the rest of the team.
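If you go the export route, the filter ends up as a single properties-style line in the exported .epf file, something like the following (preference key from memory, so treat it as an assumption; verify by exporting from a workspace that already has the filter configured):

/instance/org.eclipse.jdt.ui/org.eclipse.jdt.ui.typefilter.enabled=org.apache.commons.lang.*;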
There is not much you can do about it in Eclipse other than the type filters @JustinKSU mentioned.
But with Maven you can use Takari access rules to prevent accidentally compiling against transitive dependencies. Of course this comes with a plethora of caveats, one ironically being that the Eclipse JDT compiler has to be used instead of plain javac.
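A sketch of what that configuration might look like (option names from memory, so treat this as an assumption and check the Takari lifecycle docs):

<plugin>
  <groupId>io.takari.maven.plugins</groupId>
  <artifactId>takari-lifecycle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <!-- the JDT compiler is required for access rules -->
    <compilerId>jdt</compilerId>
    <!-- fail compilation when code references a transitive dependency directly -->
    <transitiveDependencyReference>error</transitiveDependencyReference>
  </configuration>
</plugin>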
Related
I have a Scala Akka application which connects to HBase (currently CDP, previously HDP), deployed on Rancher. We never had any trouble connecting to HDP HBase. Since the recent HDP-to-CDP change, with the same image, we get a "no such method" error on one of the dependency's classes in one container, whereas another container of the same image connects to HBase properly, even though the jar exists in the same image and on the classpath.
One noticeable difference is a change in the order of the classpath.
Does a change in classpath order affect which jars' classes are used?
Would Java libraries/classes load in a different order when they hit faster CPU cycles at startup?
What could be the reason for such a "no such method found" error?
It certainly can, if the same class file is present in different classpath entries. For example, if your classpath is: java -cp a.jar:b.jar com.foo.App, and:
a.jar:
pkg/SomeClass.class
b.jar:
pkg/SomeClass.class
Then this can happen, usually because one of the jars on your classpath is an older version than the other; or, the same problem but more complicated: one of the jars on your classpath contains a whole heap of different libraries all squished together, and one of those components is an older version.
There are some basic hygiene rules to observe:
Don't squish jars together. If you have 500 deps, put 500 entries on your classpath. We have tools to manage this stuff; use them. Don't make shaded jars, uber jars, etc.
Use dependency trackers to check if there are version differences in your dependency chain. If your app depends on, say, 'hibernate' and 'jersey', and they both depend on google's guava libraries, but hibernate imports v26 and jersey imports v29, that's problematic. Be aware of it and ensure that you explicitly decide which version ends up making it. Presumably, you'd want to explicitly pick v29 and perhaps check that hibernate also runs on v29*. If it doesn't, you have bigger problems. They are fixable (with modular classloaders), but not easily.
*) Neither hibernate nor jersey actually depend on guava, I'm just using them as hypothetical examples.
For example, if you use Maven, check out the enforcer plugin (groupId: org.apache.maven.plugins, artifactId: maven-enforcer-plugin).
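A minimal sketch (plugin version omitted) of its built-in dependencyConvergence rule, which fails the build whenever two different versions of the same artifact meet in the dependency tree:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-dependency-convergence</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>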
My bet is that there is another version of the jar somewhere in CDP, and occasionally it is loaded before the version that you ship with your project, causing the error.
So, when your container starts, try logging from which location the conflicting class is loaded. This question might help you: Determine which JAR file a class is from
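Something along these lines works (a sketch; substitute whichever class is actually throwing the error, and note that getCodeSource() can return null for JDK bootstrap classes):

// log where the JVM actually loaded the suspect class from
Class<?> c = Class.forName("org.apache.hadoop.hbase.client.Connection"); // hypothetical example class
java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
System.out.println(c.getName() + " loaded from " + (src != null ? src.getLocation() : "<bootstrap>"));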
I am currently writing a JavaFX application contained within 1 module and I want to use Javadoc to document all of my code. However, I am noticing that I can't seem to generate Javadocs for packages that have not been exported out of the module in module-info.java. On one hand, that makes sense. Non-exported packages aren't part of the public API. On the other hand, I feel like surely there should be options to enable documentation of internal APIs hidden in non-exported packages, but I've had no success in enabling them.
As this is a Maven project, I've tried the following options with the maven-javadoc-plugin:
<show>private</show>
<additionalOptions>-private</additionalOptions>
<additionalOptions>--show-module-contents all --show-packages all --show-types private</additionalOptions>
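Each of these was placed in the maven-javadoc-plugin's configuration, roughly like so (a sketch; my full config is in the gist linked below):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <configuration>
    <show>private</show>
    <additionalOptions>
      <additionalOption>--show-module-contents all</additionalOption>
      <additionalOption>--show-packages all</additionalOption>
      <additionalOption>--show-types private</additionalOption>
    </additionalOptions>
  </configuration>
</plugin>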
None of these work (and I am pretty sure 1 and 2 are the exact same thing). They only show a bit more info on one package that I've exported to another specific module. If I don't have these options, the Modules section of the Javadoc is completely blank with the exception of the module name.
I've done lots of Googling and no one on the Internet seems to bring this issue up. Maybe my Google-Fu is just off? I feel like there's just some silly undocumented flag that I haven't found yet because it can't be the case that you have to export the packages to get Javadocs for them, right?
My project consists of only one module containing 8 packages. None of them need to be fully exported out yet. Only one package containing my JavaFX files needs to be exported to javafx.graphics and that's the only one that gets picked up by Javadoc when I enable <show>private</show>.
Here is a gist of my module and Maven config, if anybody needs it:
https://gist.github.com/urbenlegend/753de7bec598fd07d6b5c0b0ef02d1d0
I am invoking Javadoc generation via mvn compile javadoc:javadoc
Anyone here have any tips? Thanks in advance!
How can you strip Java or Scala annotations programmatically?
One of the deliverables for my current project is my code, plus whatever dependencies my code relies on, as an uber-jar. We're building an SDK, therefore any of the included dependencies need to be renamed so as not to interfere with the SDK client's dependencies (meaning if we're using Apache Commons version X and they're using version Y, there aren't any conflicts). We used sbt-assembly to rename the packages our code relies on, for example org.apache.* -> com.mycompany.org.apache.*
Our project is largely written in scala and we're using sbt to build. Recently I determined that shading causes problems with the ScalaSignature annotation. This in turn causes build problems when the uber-jar is used in the client's project.
I have determined that removing ScalaSignature resolves the problem. I used ProGuard to do this, however that tool is overkill for what we are trying to achieve. I'm just curious if anyone out there has found a tool that can strip specified annotations. Or even better, a tool that will rename the packages while preserving ScalaSignature in a way that scalac is happy with.
You might want to check the Javassist library, which allows you to get and add annotations dynamically.
Following that, it should theoretically be possible to use AnnotationsAttribute in the following way (I haven't tried this myself, but let us know if it works):
MethodInfo minfo = m.getMethodInfo();
AnnotationsAttribute attr = (AnnotationsAttribute)
        minfo.getAttribute(AnnotationsAttribute.invisibleTag);
// note: getAttribute returns null if the method has no invisible annotations
Annotation[] annotations = attr.getAnnotations();
// keep everything except the annotation you want to strip
Annotation[] kept = java.util.Arrays.stream(annotations)
        .filter(a -> !a.getTypeName().equals("scala.reflect.ScalaSignature"))
        .toArray(Annotation[]::new);
attr.setAnnotations(kept);
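Note that ScalaSignature is (as far as I know) a runtime-visible, class-level annotation, so for your case you would read AnnotationsAttribute.visibleTag from the ClassFile rather than from a MethodInfo, then write the modified bytecode back out, roughly like this (class name hypothetical):

ClassPool pool = ClassPool.getDefault();
CtClass cc = pool.get("com.mycompany.org.apache.SomeClass"); // hypothetical shaded class
AnnotationsAttribute classAttr = (AnnotationsAttribute)
        cc.getClassFile().getAttribute(AnnotationsAttribute.visibleTag);
// ... filter the annotation array as above, then:
cc.writeFile("out/classes"); // writes the modified .class file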
I'm looking for a way to parameterise a multi-module build such that I can replace/specify certain files (e.g. UML files) used during the build in order to produce different output.
The build procedure stays the same, but I want to be able to produce different output depending on the input UML model.
I have a multi-module project that builds several jars based upon a UML model. The POM structure looks as follows:
+ generation
- mod1
- mod2
- mod3
The root POM (generation) generates Java source code (.java files) based upon a UML model stored in the /uml directory. Afterwards the modules (mod1...3) compile distinct subsets of this source code and package the output as jars.
I want to reuse this build procedure and apply it to different UML models.
How can I reuse the entire generation, compilation and packaging procedure defined in the multi-module project in other Maven projects?
# Generate jars based upon the foo UML model
+ generation-foo
/uml/foo.uml
# Generate jars based upon the bar UML model
+ generation-bar
/uml/bar.uml
Update
I could use profiles in the generation project in order to define the different input UML models and then just activate the relevant one, but I would lose traceability that way.
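For illustration, the profile approach would look something like this (property and profile names hypothetical), activated with mvn install -Pfoo or -Pbar:

<profiles>
  <profile>
    <id>foo</id>
    <properties>
      <uml.model>${project.basedir}/uml/foo.uml</uml.model>
    </properties>
  </profile>
  <profile>
    <id>bar</id>
    <properties>
      <uml.model>${project.basedir}/uml/bar.uml</uml.model>
    </properties>
  </profile>
</profiles>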
Perhaps a completely new approach would be a better idea ... any suggestions?
Conceptually, I would say, Maven is designed around the POM file, which is a model of the project being built. It is not so much a description of a process that applies a function to an input and produces an output based on it.
That being said, there is a lot possible with properties in the POM, which can then be passed along on the command line: -Dproperty=value. It looks as if you would be able to pass the property to whatever process is generating the source code.
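For example (property name hypothetical), the generation step could read the model location from a property with a sensible default:

<properties>
  <!-- default model; override on the command line -->
  <uml.model>${project.basedir}/uml/foo.uml</uml.model>
</properties>

The code-generation plugin would then reference ${uml.model} in its configuration, and a different model could be supplied per build:

mvn clean install -Duml.model=/path/to/bar.uml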
I would express some caution, though. I'm seeing some possible red flags in the overall design that you describe. If modules (regardless of their inheritance relationship) pass along files/folders, that should preferably go through installation.
So, if you were to do that, you would end up with a version of the parent project in your local repository without really knowing what it represents. Which parameters were used? And how will a user of that artifact then deal with that?
I'm not saying this won't work, but it may get hairy and not play entirely well within more traditional Maven implementations.
I'm not entirely sure I understand your use cases, but you might want to look at:
POM inheritance: defining as much as you can in the parent module (different groups of modules can have the same parent)
Maven profiles: you can activate them based on all sorts of potential conventions, even the project name
Maven archetypes: finally, based on what you're saying, this may be the only solution: a reusable project template
I am looking for a replacement for javadeps, which I used to use to generate sections of a Makefile to specify which classes depended on which source files.
Unfortunately javadeps itself has not been updated in a while, and cannot parse generic types or static imports.
The closest thing I've found so far is Dependency Finder. It almost does what I need but does not match non-public classes to their source files (as the source filename does not match the class name.) My current project has an interface whose only client is an inner class of a package-private class, so this is a significant problem.
Alternatively if you are not aware of a tool that does this, how do you do incremental compilation in large Java projects using command-line tools? Do you compile a whole package at a time instead?
Notes:
javadeps is not to be confused with jdepend, which is for a very different purpose.
This question is a rewrite of "Tool to infer dependencies for a java project" which seemed to be misunderstood by 2 out of 3 responders.
I use the <depend> task in Ant, which is OK but not 100% trustworthy. Supposedly JavaMake can do this dependency analysis, but it seems to be rarely updated and the download page is only sometimes available.
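For reference, my <depend> setup looks roughly like this (paths hypothetical; closure="true" cascades staleness through indirect class dependencies):

<depend srcdir="src" destdir="build/classes" cache="build/depcache" closure="true"/>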