Is there a code analysis tool for Java for asserting package dependencies/non-dependencies?
I have a project where dependencies have crept in between packages that should not know about each other - in particular back-dependencies in relationships that should have been one-way.
I'd like to specify which pairs of packages (and in which direction) are allowed to depend on each other. Alternatively, specifying packages that are NOT allowed to depend on each other - or not in both directions - would help. Ideally this would be something that could run as part of an Ant build and/or JUnit run and fail the build if new dependencies that violate the rules are introduced. We'd then add this to our CI process.
It would also be useful to be able to specify dependencies at the level of groups of packages (for example, all packages and sub-packages in 'web' CAN depend on any package in 'api' or its sub-packages).
Some specifics about my particular project, in case they're relevant:
The Java version is 1.7
The build process is based around Ant (can upgrade to latest version if needed)
Testing is with JUnit (can upgrade to latest version if needed)
You need a tool like JDepend. There is an example of how to use it with JUnit.
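For example, a JUnit test using JDepend's constraint API might look like the sketch below (the package names and classes directory are placeholders for your project, and the jdepend jar must be on the test classpath):

```java
import jdepend.framework.DependencyConstraint;
import jdepend.framework.JDepend;
import jdepend.framework.JavaPackage;
import junit.framework.TestCase;

public class DependencyRulesTest extends TestCase {

    public void testWebMayDependOnApiButNotViceVersa() throws Exception {
        JDepend jdepend = new JDepend();
        jdepend.addDirectory("build/classes"); // compiled classes to analyze

        // Declare the allowed dependencies: web -> api, and nothing else.
        DependencyConstraint constraint = new DependencyConstraint();
        JavaPackage web = constraint.addPackage("com.example.web");
        JavaPackage api = constraint.addPackage("com.example.api");
        web.dependsUpon(api);

        jdepend.analyze();
        // Fails if the analyzed packages do not match the declared constraint,
        // e.g. if 'api' has grown a back-dependency on 'web'.
        assertTrue("package dependency rules violated",
                jdepend.dependencyMatch(constraint));
    }
}
```

Run from your normal Ant junit target, this fails the build as soon as a forbidden dependency is introduced. Note that dependencyMatch expects the constraint to describe the analyzed packages and their dependencies exactly, so in practice you list every allowed pair.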
My library depends on another library; let's call it "lib". I want to test my library with multiple versions of lib, in an automated manner.
Test if my library compiles for each version of lib.
Run JUnit 5 tests for each version of lib.
Are there any existing solutions for this?
I could write a script that changes the version number of lib in my pom.xml and executes mvn compile and mvn surefire:test. I could also use profiles and automate this with a script. I was hoping there is a better way, through something like a Maven plugin.
Maven focuses on reproducible builds, meaning that if you repeat the build at a later date you should get the same results - which in turn requires that the dependency versions are fixed.
This fundamental mindset is what you want to challenge. Maven won't like it even if it is for a good reason, and you will most likely need to have a separate full run for each version instead of looping inside Maven.
The way I would approach this is to have a bill-of-materials POM with a dependencyManagement section listing the exact versions you want, generated on the local filesystem before each run, and then to orchestrate a run for each version you want to test.
You can also leverage your build system and have a repository which orchestrates this. GitHub Actions can do matrix builds, which might be what you need.
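For example, if the lib version is lifted into a Maven property (the property name and version numbers here are invented for illustration), each run can override it from the command line, and a CI matrix can fan out over the versions you care about:

```xml
<!-- pom.xml fragment: the version comes from a property with a default -->
<properties>
  <lib.version>1.2.0</lib.version>
</properties>
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>lib</artifactId>
    <version>${lib.version}</version>
  </dependency>
</dependencies>
```

```yaml
# .github/workflows/matrix.yml fragment: one job per lib version
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        lib-version: ["1.2.0", "1.3.0", "2.0.0"]
    steps:
      - uses: actions/checkout@v4
      - run: mvn -Dlib.version=${{ matrix.lib-version }} test
```

Locally the same property drives a plain shell loop, e.g. for v in 1.2.0 1.3.0 2.0.0; do mvn -Dlib.version=$v test; done.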
I'm trying to convert existing Java projects with Maven and Eclipse into Java 9+ modules. The projects have unit tests and the unit tests have test dependencies. I need the test dependencies to be available in the test code, but I don't want them exposed to the rest of the world in the published modules.
I think Testing in the Modular World describes the Maven solutions well. In summary, one solution is to create one module-info.java in the main source folder and another in the test folder. The file in the main folder declares the real dependencies; the file in the test folder adds the test dependencies on top.
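For reference, the two descriptors in that setup look roughly like this (module, package, and dependency names are invented for illustration):

```java
// src/main/java/module-info.java - the real, published descriptor
module com.example.mylib {
    exports com.example.mylib;
}
```

```java
// src/test/java/module-info.java - used only when compiling tests;
// repeats the main descriptor and adds the test-only dependencies
open module com.example.mylib {
    exports com.example.mylib;
    requires org.junit.jupiter.api;
    requires jdk.httpserver; // JDK module needed only by a test
}
```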
The solution works well in Maven and I can build and run tests from the command line. However, when I import the project into Eclipse as a Maven project it balks. Eclipse complains that "build path contains duplicate entry module-info" and refuses to build the project at all.
Using the other suggested solution in the article with a module-info.test containing --add-reads has no effect and the build fails in both Maven and Eclipse as the tests can't find their dependencies.
To make matters more complex, I need to import the test dependencies from Maven, but I also need to import standard Java modules that are not used by the main code. For example, one unit test relies on the built-in web server provided by the jdk.httpserver module, and since that is part of the JDK, any magic applied to the Maven test dependencies will miss it.
Is there a solution for this that works in Maven and Eclipse (latest versions)? It sounds like a very common problem and the module system has been around for a while by now.
Note that I really don't want to change the project settings in Eclipse. I can fiddle with plugins in the pom files, but adding a manual routine where all developers need to edit the generated/imported project settings manually is not an option.
EDIT:
There is an open Eclipse bug report for this, see Eclipse bug 536847. It seems it is not supported yet, but perhaps someone can suggest a workaround?
The Eclipse emulation of Maven's multiple-classpaths-per-project feature has been broken for a very long time. The symptom is that you can have non-test classes using test dependencies just fine.
Essentially, Eclipse considers each project to have a single classpath instead of two parallel ones, which causes setups like this one to misbehave.
I would suggest splitting each of the problematic projects in two: one with the actual sources and one with the test sources (the latter depending on the former). This avoids the Eclipse bug and also allows you to compile your tests with the newest version of Java while building your application for an older version.
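In Maven terms the split is just an ordinary dependency from the test project on the main one; a sketch with invented artifact ids and versions:

```xml
<!-- mylib-tests/pom.xml fragment: a plain compile-scope dependency on the
     main project, plus the test libraries - none of which leak into the
     published mylib artifact -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>mylib</artifactId>
    <version>1.0.0</version>
  </dependency>
  <dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>5.10.2</version>
  </dependency>
</dependencies>
```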
I've recently returned to working on a Scala project, after having spent some time working with the nodejs ecosystem. After getting used to package managers like npm and yarn, I feel like I need to rethink the tools/processes I've been using for Java/Scala. With that in mind, there are several problems which appear to exist in the JVM world for which I'd like to know if there's some automated solution:
Given some list of dependencies without versions (but with group, module), is there some automated way to detect what the valid combinations (if any exist) of versions for each dependency are? Specifically, ensuring that there are no conflicting transitive dependencies?
I believe Java Modules should reduce/eliminate this issue, but I'm currently limited to Java 8 so can't use them.
Aside from manually changing version numbers in my build.gradle, is there any automated way to update a dependency from cli?
For example, I can run yarn add <package>@<version> to record the new version of a nodejs library I depend on and install it in one step - does anything similar exist for JVM projects?
Are there any tools similar to updtr for Java/Scala projects? Basically: a tool that will automatically try to update my dependencies and run the tests with the new versions, rolling back if anything fails.
In case it matters, I'm currently using Gradle as my build tool in a Scala 2.11 project, but I'm curious to know about any answers that would apply to any mixed-language project using any build tool. Ultimately I just want to avoid checking every one of my dependencies against every other dependency by hand - anything else is an extra nicety.
I can answer only point 3 of your question, and even this - only partially.
You might be interested in Gradle Versions Plugin by Ben Manes.
This plugin does not update your dependencies (and accordingly it has no test-running + rollback functionality).
However, it will list all the dependencies that can be upgraded, like this (one of several possible report formats):
The following dependencies are using the latest integration version:
 - backport-util-concurrent:backport-util-concurrent:3.1
 - backport-util-concurrent:backport-util-concurrent-java12:3.1

The following dependencies exceed the version found at the integration revision level:
 - com.google.guava:guava [99.0-SNAPSHOT <- 3.0]
     http://code.google.com/p/google-guice/
 - com.google.inject.extensions:guice-multibindings [2.0 -> 3.0]
     http://code.google.com/p/google-guice/

Gradle updates:
 - Gradle: [4.6 -> 4.7 -> 4.8-rc-2]
Source: Report format
Moreover, the plugin can be configured as to which versions are not considered valid upgrade targets:
boolean rejected = ['alpha', 'beta', 'rc', 'cr', 'm'].any { qualifier ->
    selection.candidate.version ==~ /(?i).*[.-]${qualifier}[.\d-]*/
}
Source: Revisions
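For completeness, applying the plugin is a one-liner (the version shown is just an example; check the plugin page for the current one):

```groovy
// build.gradle
plugins {
    id 'com.github.ben-manes.versions' version '0.51.0'
}
```

Then ./gradlew dependencyUpdates prints the report.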
Eclipse Oxygen; Windows 7; the final JDK 9 release from September 21; JUnit 4.12; and an existing application. As a starting point, the application compiles and runs, and all JUnit tests show green. Now we use Eclipse to generate the file module-info.java. The outcome looks like:
module ch.commcity.topsort {
    exports ch.commcity.topsort;
    requires junit;
}
But it fails with the error: "junit cannot be resolved to a module".
The question is: How to tell the file that junit does not define any module and it should be run in compatibility mode?
How to tell the file that junit does not define any module and it should be run in compatibility mode?
Your question seems to be based on several misconceptions:
You cannot tell module-info.java whether JUnit defines a module or not. If a module says it requires another module, then the compiler expects that module to be present - there is no way around that.
Whether JUnit 4 comes as a module or not is not overly important - as long as you place it on the module path, it will end up being treated as a module (possibly as an automatic one).
There is no "compatibility mode". You can continue to write code as you did before the module system (almost), but once you're creating module declarations you need to play by its rules.
I recommend giving the outstanding State of the Module System a thorough read and then asking yourself what exactly you are trying to accomplish. Are you really trying to create a module that depends on JUnit? Or was that just accidental, because you use its API for testing? If the latter, your module should not depend on it - instead, your IDE / build tool needs to figure out how to compile and run your tests.
Expansion on "a module that depends on JUnit"
The module system does not classify dependencies as "compile" or "test" - if a module requires another module, it has to be present, always. That means a module that requires junit would force the presence of JUnit. Unless the module provides testing-related features, that is most certainly wrong.
In other words, requires junit is akin to adding JUnit to a project's POM, using the compile scope.
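To illustrate what a build tool does behind the scenes: the module descriptor stays free of JUnit, and the test compilation patches the module on the command line instead. A sketch using the module name from the question (jar paths, output directories, and the test class name are placeholders):

```shell
# Compile the tests "into" the module without touching module-info.java:
# --patch-module adds the test sources to the module, and --add-reads lets
# the module read the automatic 'junit' module just for this compilation.
javac --module-path out/main:lib/junit-4.12.jar:lib/hamcrest-core-1.3.jar \
      --patch-module ch.commcity.topsort=src/test/java \
      --add-modules junit \
      --add-reads ch.commcity.topsort=junit \
      -d out/test \
      src/test/java/ch/commcity/topsort/TopSortTest.java
```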
First, please update your Java 9 support for Eclipse Oxygen or use the latest available release candidate build for Eclipse 4.7.1a (to be released on October 11, 2017).
To add a library or a container to the modulepath of a Java 9 project in Eclipse, open the project's Java Build Path dialog. On the Libraries tab, select the node Modulepath and add your library to it. Make sure to first remove that library from the Classpath node if it is already present there.
As mentioned by others, in your case, the JUnit libraries will be considered as automatic modules.
How to tell the file that junit does not define any module and it should be run in compatibility mode?
The module junit generated in your module-info would be an automatic module, converted from its artifact. You need to make sure that the JUnit jar (junit:junit:4.12) is available on the modulepath of your project, and the module will then be resolved by itself.
To make sure of that, check that the dependencies of your project/module as configured in the IDE include junit:junit:4.12.
I recently discovered that BlackBerry treats all classes with the same fully-qualified name as identical--regardless of whether they are in entirely different apps or not--causing apps that use different versions of our shared libraries to break when they are installed on the same phone.
To solve this problem, we are planning on changing the package names to include a version number, then building. Can someone explain how, using Bamboo, I can insert a step in our build process that:
changes certain package names
replaces all code references to the old package name with references to the new package name?
A great tool that is made especially for the task of changing the fully qualified names of Java classes in jar files is jarjar. It can be used easily from within Ant, or alternatively from a shell script.
I have never used Bamboo, but I assume it should work there, too. Of course, there may be some special restrictions in that environment (concerning bytecode manipulation) that I don't know about.
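For illustration, an Ant target using jarjar's rule element might look like this (jar locations and the version suffix are placeholders):

```xml
<target name="versioned-jar">
  <taskdef name="jarjar"
           classname="com.tonicsystems.jarjar.JarJarTask"
           classpath="lib/jarjar.jar"/>
  <jarjar jarfile="dist/mylib-1.2.jar">
    <zipfileset src="build/mylib.jar"/>
    <!-- com.example.mylib.Foo becomes com.example.mylib.v1_2.Foo -->
    <rule pattern="com.example.mylib.**" result="com.example.mylib.v1_2.@1"/>
  </jarjar>
</target>
```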
I'm not familiar with Bamboo, and you did not include much information about your build system. If you are using Maven, you could use the shade plugin:
This plugin provides the capability to package the artifact in an uber-jar, including its dependencies and to shade - i.e. rename - the packages of some of the dependencies.
The second example here shows how to configure package renaming. The resulting jar file would then have to be processed by rapc, as in Chris Lercher's comment on his answer. It should be possible to integrate this into a Maven build as well, using the exec plugin.
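A minimal relocation setup for the shade plugin looks like this (the package names are placeholders for your own):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.example.mylib</pattern>
            <shadedPattern>com.example.mylib.v1_2</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Shade rewrites both the moved class files and the bytecode references to them, so code references to the old package name are updated automatically.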