Eclipse + m2e + junit5 - already possible? - java

I tried to get Eclipse 2018-09 (plus the patch with Java 11 support), m2e, and JUnit 5 working together.
As recommended in the junit5-modular-world example, I introduced a second module-info.java under test/java.
Eclipse's reaction astonished me:
I could not save that file after changing it; it was only saved when I closed Eclipse entirely.
Re-opening bewildered Eclipse: it could not show any details of the project hosting multiple module-info.java files, just the project name.
Presumably Eclipse identifies one project with one Java module, while mvn test obviously compiles and executes a different module than the one created by mvn install.
I tried every option I could think of. For now I have had to give up and fall back to JUnit 4.12.
Do you know of a better solution?

A secondary module-info.java in the test source folder is not supported by Eclipse at this time (although Eclipse's behaviour when you try it should probably be improved).
For now, you probably won't need it at all:
Maven puts dependencies that are mentioned in module-info.java on the module path; all others (e.g. test-only dependencies like JUnit) go on the class path and thus become part of the unnamed module. When the tests are compiled, Maven adds command-line options (--add-reads modulename=ALL-UNNAMED) so that the test code, which is treated as part of the module defined in the main source folder, can still read the unnamed module; JUnit is therefore visible to the test code.
Eclipse Photon and later support this behaviour as well.
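As a minimal sketch (with a hypothetical module name), a main descriptor like the one below leaves JUnit on the class path because it is not listed, and the tests are compiled with --add-reads com.example.app=ALL-UNNAMED:

module com.example.app {
    // dependencies declared here go on the module path;
    // junit is deliberately not listed, so it stays on the class path (unnamed module)
    requires java.sql;   // example of a regular, non-test dependency
}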
Some background regarding the secondary test module-info.java: maven-compiler-plugin has supported it since version 3.8 (see https://www.mail-archive.com/announce#maven.apache.org/msg00866.html, implemented in https://issues.apache.org/jira/browse/MCOMPILER-341), but I'm not aware of a matching maven-surefire-plugin release, so I don't think you could currently run these kinds of tests with Maven.
Implementing support for a secondary test module-info.java in Eclipse may be possible, as long as it is a strict superset of the primary module-info.java in the main source folder, or perhaps as long as both specify the same module and their contents are merged as in the "pro" build tool https://github.com/forax/pro. But nobody has worked on that yet.
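As an illustration of the superset idea (hypothetical module and package names; again, this is exactly what Eclipse does not support yet), the two descriptors would name the same module, with the test one only adding entries:

// src/main/java/module-info.java
module com.example.lib {
    exports com.example.lib;
}

// src/test/java/module-info.java - same module, with test-only additions
open module com.example.lib {
    exports com.example.lib;
    requires org.junit.jupiter.api;
}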
What will probably never be supported in Eclipse is a secondary test module-info.java that specifies a different module, as Eclipse assumes that one Java project belongs to exactly one module. But that shouldn't matter: such tests can only use public and exported code of the main sources, so they can simply be put into their own Maven module.

Related

Java Gradle missing library module that exists or has been imported -> Task :compileJava

I have been facing these issues in so many JavaFX Gradle-based projects in the IntelliJ IDEA IDE. It has pushed me to the point where I had to manually download library files and make them part of my projects as a workaround.
My Gradle projects keep failing whenever I run > Task :compileJava in the IDE. In the particular case that made me ask this, I successfully imported the Socket.IO library from Maven with implementation 'io.socket:socket.io-client:2.0.1', wrote a bit of sample code for it, and added
requires engine.io.client;
requires socket.io.client;
in the module-info file. When it's time to run, it fails stating:
error: module not found: socket.io.client
requires socket.io.client;
error: module not found: engine.io.client
requires engine.io.client;
I have tried JDK 13, 16, and 17 to see if I am missing something, but it keeps failing to run. I have noticed this as a trend in my previous JavaFX projects as well, where I managed to get away with workarounds.
So if anyone understands what's wrong with the Gradle setup, please help.
This answer outlines an approach rather than a concrete solution.
socket.io.client and engine.io.client are not module names.
The socket.io-client library is not Java platform modularized (as far as I can tell), so it will be an automatic module.
The name of the module is derived from the jar name. I don't know the exact translation, as the jar name contains . and - characters which may (or may not) be remapped to make the module name valid. Try the exact jar file name first. There can be only one module per jar.
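If you want to check the derived name rather than guess, one option (a small sketch; the jar path is an assumption and must point at the file in your local Gradle or Maven cache) is to ask the module system directly:

import java.lang.module.ModuleFinder;
import java.nio.file.Path;

public class PrintAutomaticModuleName {
    public static void main(String[] args) {
        // Prints the module name the JDK derives for the jar passed on the command line,
        // e.g. the socket.io-client jar that Gradle downloaded.
        ModuleFinder.of(Path.of(args[0]))
                .findAll()
                .forEach(ref -> System.out.println(ref.descriptor().name()));
    }
}

I believe jar --describe-module --file=<the jar> prints the same information from the command line.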
In addition to requiring the right name, the jar needs to be on the module path. Maven will do this automatically for automatic modules; Gradle will not. I am not a Gradle expert, so I won't provide advice on how to do that in Gradle.
If you use the right name in module-info and ensure the jar is on the module path, then it may work, or it may be incompatible with the Java module system in ways that are not easily fixable by you (i.e. the broken module must be fixed by the module maintainers).
You can raise an issue with the library maintainers asking them to add module-info.java files to their modules and to document how to use their libraries in a Java module environment.
If the library you are trying to use is incompatible with the Java module system when used as a module, then you could try making your project non-modular by deleting module-info.java from your project and adding appropriate command-line switches. To understand how to do this, refer to documentation on non-modular projects at openjfx.io.

Maven Eclipse plugin bug? failing to auto-add in-workspace dependency JAR to Run Configuration module-path

So here is a quick rundown of my situation:
I have two Java projects: one in Java 8 (so not modular) and one in Java 11 that is modular.
The modular/not-modular issue may not be relevant but for the sake of clarity, I've stated it.
For reference, the Java 8 is a game library I made, and the Java 11 is the game implementation I'm making.
I need to reference the Java 8 library from my Java 11 game project.
Both projects are Maven projects, and I have my dependency defined in my game's POM file.
I'm using the latest version of Eclipse (2020-03, 4.15.0) and Maven 3.6.3 with Java 11.0.7 (Oracle JDK).
My Problem:
My understanding is that my Java 8 library project becomes an automatic module. Adding it into my Java 11 game project module-info file works (with a warning about the name being unstable, but no issue) and I can compile my game project code with no issues in Eclipse.
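For context, the game project's descriptor looks roughly like this (module names are placeholders; the library name is the automatic one derived from the Java 8 jar):

module my.game {
    // automatic module derived from the Java 8 library jar;
    // Eclipse warns that this name is unstable
    requires my.library;
}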
When I attempt to run the game, I get Module <my-library> not found, required by <my-game>. Now, since Maven is managing the dependencies, it should just work.
How can I get my game to run?
I Can Fix It Three Ways...
First, I can simply manually add the library project's JAR file (in its target folder) to the Run Configuration module-path of my game project.
Second, I can delete the library project from my workspace. This means Maven then goes and gets the JAR from the local m2 repo (it's been installed with mvn install). In this situation Maven DOES automatically add the JAR to the Run Configuration module-path correctly.
Third, I can change the version of the library project in its POM file; as in option two, the workspace project then no longer satisfies the dependency and Maven looks for the JAR in the local m2 repo.
But...
All three of these options seem to me like they should be unnecessary. This feels like a bug with Maven failing to add the in-workspace project dependency to the module path in the Run Configuration in Eclipse.
To be fair, it is a Maven Eclipse plugin feature that automatically detects when one of the in-workspace projects is a dependency and uses that "live" version instead of the m2 repo version. This is very handy for these situations where development on a library is happening in parallel.
But until this bug is fixed (or unless it's not a bug and I'm missing something), this caused me a ton of frustration. I've posted this in hopes of helping anyone else who may be facing the same issue.

Test dependencies for white-box unit testing Java modules with Maven and Eclipse

I'm trying to convert existing Java projects with Maven and Eclipse into Java 9+ modules. The projects have unit tests and the unit tests have test dependencies. I need the test dependencies to be available in the test code, but I don't want them exposed to the rest of the world in the published modules.
I think Testing in the Modular World describes the Maven solutions well. In summary one solution is to create one module-info.java in the main source folder and another in the testing folder. The file in the main folder has the real dependencies. The file in the test folder adds the test dependencies.
The solution works well in Maven and I can build and run tests from the command line. However, when I import the project into Eclipse as a Maven project it balks. Eclipse complains that "build path contains duplicate entry module-info" and refuses to build the project at all.
Using the other suggested solution in the article with a module-info.test containing --add-reads has no effect and the build fails in both Maven and Eclipse as the tests can't find their dependencies.
To make matters more complex, I need to import the test dependencies from Maven, but I also need to import standard Java modules that are not used by the main code. For example, one unit test relies on the built-in web server provided by jdk.httpserver, and since that is part of the JDK, any magic done on the test dependencies will miss it.
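For illustration, a whitebox test descriptor of the kind the article describes could look like this (module and package names are placeholders); note the JDK module that only the tests need:

// src/test/java/module-info.java - same module name as in src/main/java
open module com.example.lib {
    exports com.example.lib;
    requires org.junit.jupiter.api;  // test-only Maven dependency
    requires jdk.httpserver;         // JDK module used only by a unit test
}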
Is there a solution for this that works in Maven and Eclipse (latest versions)? It sounds like a very common problem and the module system has been around for a while by now.
Note that I really don't want to change the project settings in Eclipse. I can fiddle with plugins in the pom files, but adding a manual routine where all developers need to edit the generated/imported project settings manually is not an option.
EDIT:
There is an open Eclipse bug report for this, see Eclipse bug 536847. It seems it is not supported yet, but perhaps someone can suggest a workaround?
The Eclipse emulation of Maven's multiple-classpaths-per-project feature has been broken for a very long time. The symptom is that you can have non-test classes using test dependencies just fine.
Essentially Eclipse considers each project to have a single classpath instead of two parallel ones, which causes things like this to ... not do the right thing.
I would suggest splitting each of the problematic projects into two. One with the actual sources and one with the test sources (depending on the actual source). This will avoid the Eclipse bug and also allow you to use the newest version of Java for your tests while having your application built for an older version of Java.
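If the new test project is itself modular, a minimal sketch of its descriptor (placeholder names) would simply require the main module, so the tests exercise the library only through its exported API:

// module-info.java of the separate test project
open module com.example.lib.test {
    requires com.example.lib;
    requires org.junit.jupiter.api;
}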

How to test compiled JAR file using Gradle

I had code that worked correctly when executed during standard unit testing, but didn't work when it was compiled into a jar and added as a dependency of another project.
It wasn't an issue to find the root cause and fix it, but it made me think about how I can test a freshly built jar artifact before deploying it anywhere, to make sure it will work for end users and other projects. I googled this topic for several hours but didn't find anything close to it.
Maybe I'm totally wrong and trying to achieve something weird, but I cannot figure out another way to verify compiled packages and be confident that they will work for others.
Some details about the project: a simple Java library with a few classes, using Gradle 5.5 as the build system and Travis CI as the CI/CD tool. For testing I'm using TestNG, but I can easily switch to JUnit if required.
If you're curious about the code that was not working when compiled into the package, here is a simplified version:
public String readResourceByURI() throws IOException, URISyntaxException
{
    return new String(Files.readAllBytes(Paths.get(ClassLoader.getSystemClassLoader().getResource("resource.txt").toURI())));
}
This function will throw java.nio.file.FileSystemNotFoundException if packaged into the jar file. But as I said the problem is not with the code...
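For reference (not the point of the question, but it shows why packaging changes the behaviour), a jar-safe way to read the same resource is to stream it instead of resolving a file-system path; this sketch assumes java.io.InputStream and java.nio.charset.StandardCharsets imports:

public String readResourceByStream() throws IOException {
    // Works from a directory on the class path and from inside a jar alike,
    // because it never converts the resource URL into a file-system Path.
    try (InputStream in = ClassLoader.getSystemClassLoader().getResourceAsStream("resource.txt")) {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
}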
Ideally, I want to create a build pipeline that produces jar artifacts, which are then tested, and if the tests are successful those jars are automatically deployed to a repository (Maven and/or Bintray).
At the moment all tests are executed before jar creation, and as a result there is a chance that the compiled code inside the jar package will not work due to packaging.
So, to simplify my question I'm looking for a Gradle configuration that can execute unit tests on a freshly made jar file.
That's what I came up with:
test {
    // Add a dependency on the jar task, since it will be the main target for testing
    dependsOn jar
    // Rearrange the test classpath: add the compiled JAR instead of the main classes
    classpath = project.sourceSets.test.output + configurations.testRuntimeClasspath + files(jar.archiveFile)
    useTestNG()
}
Here I'm changing the default classpath for the test task by combining the folder with test classes, the runtime dependencies, and the compiled JAR file. I'm not sure if it's the correct way to do it...
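One cheap sanity check for this setup (a sketch, not required; SomeLibraryClass is a placeholder for any class that lives in the library jar) is a test that asserts the class under test was actually loaded from the jar rather than from a classes directory:

import org.testng.Assert;
import org.testng.annotations.Test;

public class LoadedFromJarTest {
    @Test
    public void classUnderTestComesFromTheJar() {
        // getResource on the class file reveals where the class was loaded from
        String location = SomeLibraryClass.class
                .getResource("SomeLibraryClass.class").toString();
        Assert.assertTrue(location.startsWith("jar:"), location);
    }
}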
I don't think there is a good way to detect this kind of problem in a unit test. It is the kind of problem that is normally found in an integration test.
If your artifact / deliverable is a library, integration tests don't normally make a lot of sense. However, you could spend some time to create a sample or test application that uses your library, which you can then write an integration test for.
You would need to ask yourself whether there are enough potential errors of this kind to warrant doing that:
I don't imagine that you will make this particular mistake again soon.
Other problems of this nature might include assumptions in your library about the OS platform or (occasionally) Java versions ... which can only really be tested by running an application on the different platforms.
Maybe the pragmatic answer is to recognize that you cannot (or cannot afford to) test everything.
Having said that, one possible approach might be to choose a free-standing test runner (packaged as a non-GUI Java application). Then get Gradle to run the test runner as a scripted task with the JARs for your library and the unit tests on the classpath.
In Gradle you can try to execute a scripted task that runs code from your jar. It is a bit convoluted, but it works on a simple proof of concept.
In the main Gradle project, create a subproject 'child'.
Add information about it in settings.gradle:
include 'child'
In build.gradle add this:
task externalTest(type: Copy) {
    // Copy the unit tests into the child project so they run against the packaged jar
    from 'src/test'
    into 'child/src/test'
}
// Build the parent jar first, then let the child project compile and test against it
externalTest.dependsOn(jar)
externalTest.finalizedBy(':child:build')
And in child/build.gradle add the parent jar as a dependency:
dependencies {
    compile files('../build/libs/parent.jar')
}
Now, running externalTest in the main project builds the parent jar, copies the tests into the child project, and then builds and tests the child project against that jar.

Java 9 Modules and JUnit 4

Eclipse Oxygen; Windows 7; JDK 9 final (from 9/21); JUnit 4.12; and an existing application. As a starting point, the application can be compiled and executed, and all JUnit tests show green. Now we use Eclipse to generate the file module-info.java. The outcome looks like:
module ch.commcity.topsort {
    exports ch.commcity.topsort;
    requires junit;
}
But with the error: junit cannot be resolved to module.
The question is: How to tell the file that junit does not define any module and it should be run in compatibility mode?
How to tell the file that junit does not define any module and it should be run in compatibility mode?
Your question seems to be based on several misconceptions:
You can not tell module-info.java whether JUnit defines a module or not. If a module says it requires another module then the compiler expects that module to be present - no way around that.
Whether JUnit 4 comes as a module or not is not overly important - as long as you place it on the module path, it will end up being treated as a module (possibly as an automatic one).
There is no "compatibility mode". You can continue to write code as you did before the module system (almost), but once you're creating module declarations you need to play by its rules.
I recommend giving the outstanding State of the Module System a thorough read and then asking yourself what exactly you are trying to accomplish. Are you really trying to create a module that depends on JUnit? Or was that just accidental because you use its API for testing? If the latter, you should not depend on it - instead your IDE / build tool needs to figure out how to compile and run your tests.
Expansion on "a module that depends on JUnit"
The module system does not classify dependencies as "compile" or "test" - if a module requires another module, it has to be present, always. That means a module that requires junit would force the presence of JUnit. Unless the module provides testing-related features, that is most certainly wrong.
In other words, requires junit is akin to adding JUnit to a project's POM, using the compile scope.
First, please update your Java 9 support for Eclipse Oxygen or use the latest available release candidate build for Eclipse 4.7.1a (to be released on October 11, 2017).
To add a library or a container to the modulepath of a Java 9 project in Eclipse, open the project's Java Build Path dialog. On the Libraries tab, select the node Modulepath and add your library to it. Make sure to first remove that library from the Classpath node if it is already present there.
As mentioned by others, in your case, the JUnit libraries will be considered as automatic modules.
How to tell the file that junit does not define any module and it should be run in compatibility mode?
The module junit as named in the module-info would be an automatic module, converted from its artifact. You need to make sure that the jar for JUnit (junit:junit:4.12) is available on the module path of your project; the module will then be resolved on its own.
To make sure of the above, check that the dependencies of your project/module as configured in the IDE include junit:junit:4.12.
