I want to build a "toJavaCode()" on my model that would generated the required Java source code to generate that model (never mind the reasons or if it should or shouldn't be done, nor the compatibility issues that may occur).
I'm at a loss as to how to test this. I'm using Maven, but generate-sources won't really work for me, since my server needs to be up for proper bulk testing. I do get the server up during the "test" goal, but generate-sources is just too early.
On the other hand, while I can use the built-in compiler (from tools.jar in the JDK) to do this, I don't know how to pack the generated code into a jar for testing (or load that jar).
Any ideas?
You can use the JavaCompiler API to compile your source files and set up a classloader to load the compiled classes in your test (sample code). tools.jar has to be on the classpath, which is the case when running on a JDK.
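A minimal sketch of that approach, assuming the generated source has been written to a file first (class and method names here are made up for illustration):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompileAndLoad
{
    public static Class<?> compileAndLoad(File sourceFile, File outputDir, String className) throws Exception
    {
        // The system compiler is only available on a JDK (backed by tools.jar on Java 8 and earlier)
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        int result = compiler.run(null, null, null, "-d", outputDir.getPath(), sourceFile.getPath());
        if (result != 0)
            throw new IllegalStateException("compilation failed: " + sourceFile);
        // Load the freshly compiled class with a throwaway classloader rooted at outputDir
        URLClassLoader loader = new URLClassLoader(new URL[] { outputDir.toURI().toURL() });
        return loader.loadClass(className);
    }
}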
If your generated code is stable for any given class, you could use an annotation processor to generate the source code and compile it in the same javac run as the annotated class.
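To give the flavor of that option, here is a bare-bones processor skeleton (the annotation name com.example.GenerateModel and the generated class are hypothetical):

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("com.example.GenerateModel")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class ModelProcessor extends AbstractProcessor
{
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv)
    {
        for (TypeElement annotation : annotations) {
            for (Element e : roundEnv.getElementsAnnotatedWith(annotation)) {
                String generated = e.getSimpleName() + "Model";
                try (Writer w = processingEnv.getFiler()
                        .createSourceFile("com.example." + generated).openWriter()) {
                    // The emitted source is compiled by javac in the same run
                    w.write("package com.example;\npublic class " + generated + " {}\n");
                } catch (IOException ex) {
                    processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, ex.toString());
                }
            }
        }
        return true;
    }
}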
You can add Ant tasks to your Maven build. That's a way to do something 'out of the classical order' during a Maven build, like binding a javac Ant task to Maven's test phase.
(Sorry, I'm as confused as your commenter matt b, but the embedded Ant tasks are your Swiss Army knife here.)
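For instance, the maven-antrun-plugin lets you bind an Ant target to whatever phase suits you; a rough sketch (the phase choice and paths are assumptions for illustration):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <!-- Bind the Ant tasks to a later phase than generate-sources -->
      <phase>process-test-classes</phase>
      <goals><goal>run</goal></goals>
      <configuration>
        <target>
          <javac srcdir="${project.build.directory}/generated-test-sources"
                 destdir="${project.build.directory}/test-classes"
                 includeantruntime="false"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>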
Related
I had code that worked correctly when executed during standard unit testing, but failed once it was compiled into a jar and added as a dependency of another project.
Finding the root cause and fixing it wasn't the issue, but it made me wonder how I can test a freshly built jar artifact before deploying it anywhere, to make sure it will work for end users and other projects. I googled this topic for several hours, but didn't find anything even close.
Maybe I'm totally wrong and trying to achieve something weird, but I cannot figure out another way to verify compiled packages and be confident that they will work for others.
Some details about the project: a simple Java library with a few classes, using Gradle 5.5 as the build system and travis-ci as the CI/CD tool; for testing I'm using TestNG, but I can easily switch to JUnit if required.
If you are curious about the code that failed when compiled into the package, here is a simplified version:
public String readResourceByURI() throws IOException, URISyntaxException
{
    return new String(Files.readAllBytes(Paths.get(ClassLoader.getSystemClassLoader().getResource("resource.txt").toURI())));
}
This function throws java.nio.file.FileSystemNotFoundException when packaged into a jar file. But as I said, the problem is not with the code...
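For reference (the asker's point stands that the question isn't about this code), the usual jar-safe variant reads the resource as a stream instead of converting it to a filesystem path; a sketch assuming Java 9+ for InputStream.readAllBytes:

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public String readResourceAsStream() throws IOException
{
    // Streams work for resources inside jars; Paths.get(uri) needs a mounted file system
    try (InputStream in = ClassLoader.getSystemClassLoader().getResourceAsStream("resource.txt"))
    {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
}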
Ideally, I want to create a build pipeline that produces jar artifacts, tests them, and, if the tests are successful, automatically deploys those jars to a repository (Maven and/or Bintray).
At the moment all tests are executed before the jar is created, so there is a chance that the compiled code inside the jar package will not work due to packaging.
So, to simplify my question: I'm looking for a Gradle configuration that can execute unit tests on a freshly made jar file.
Here's what I came up with:
test {
    // Add a dependency on the jar task, since it will be the main target for testing
    dependsOn jar
    // Rearrange the test classpath: add the compiled JAR instead of the main classes
    classpath = project.sourceSets.test.output + configurations.testRuntimeClasspath + files(jar.archiveFile)
    useTestNG()
}
Here I'm changing the default classpath for the test task by combining the folder with test classes, the runtime dependencies, and the compiled JAR file. Not sure if it's the correct way to do it...
I don't think there is a good way to detect this kind of problem in a unit test. It is the kind of problem that is normally found in an integration test.
If your artifact / deliverable is a library, integration tests don't normally make a lot of sense. However, you could spend some time to create a sample or test application that uses your library, which you can then write an integration test for.
You would need to ask yourself whether there are enough potential errors of this kind to warrant doing that:
I don't imagine that you will make this particular mistake again soon.
Other problems of this nature might include assumptions in your library about the OS platform or (occasionally) Java versions ... which can only really be tested by running an application on the different platforms.
Maybe the pragmatic answer is to recognize that you cannot (or cannot afford to) test everything.
Having said that, one possible approach might be to choose a free-standing test runner (packaged as a non-GUI Java application). Then get Gradle to run the test runner as a scripted task with the JARs for your library and the unit tests on the classpath.
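A hedged sketch of that idea using the JUnit Platform Console Launcher as the free-standing runner (the artifact version and task name are assumptions):

configurations {
    consoleLauncher
}

dependencies {
    // Free-standing, non-GUI test runner
    consoleLauncher 'org.junit.platform:junit-platform-console-standalone:1.6.2'
}

task testAgainstJar(type: JavaExec) {
    dependsOn jar, testClasses
    // Put the built jar (not the main classes) plus the test classes on the classpath
    classpath = files(jar.archiveFile) +
            sourceSets.test.output +
            configurations.testRuntimeClasspath +
            configurations.consoleLauncher
    main = 'org.junit.platform.console.ConsoleLauncher'
    args '--scan-classpath'
}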
In Gradle you can try a scripted task that runs code from your jar. It's a bit convoluted, but it works on a simple proof of concept.
In the main Gradle project, create a subproject 'child'.
Add information about it in settings.gradle:
include 'child'
In the parent's build.gradle add this:
// A Copy task runs at execution time (the original inline copy { } ran at configuration time)
task externalTest(type: Copy) {
    dependsOn jar
    from 'src/test'
    into 'child/src/test'
    // Build (and test) the child against the freshly built parent jar
    finalizedBy ':child:build'
}
And in child/build.gradle, add the parent jar as a dependency:
dependencies {
    compile files('../build/libs/parent.jar')
}
Now when the main project is built, the child project will be built after the jar is created.
Since GWT works strictly on Java source code, and Annotation Processors / JSR 269 also generate Java source code, would there be a way, in Maven, to have javac process the files using the Annotation Processors and save the generated Java source code somewhere, such that GWT can then use it itself? That would save the work of reproducing the Annotation Processors' implementation in a GWT generator.
According to this question, assuming the answer is still relevant, it would be best to use the maven-processor-plugin to process annotations. The documentation says that you can specify an "outputDirectory". And this question says you should use the copy-resources goal of the maven-resources-plugin to make the source available to GWT.
Assuming all of this is right, my question is: how do you tell Maven that it should compile the code with javac and run the other plugins (maven-processor-plugin / maven-resources-plugin) before running the "GWT Maven Plugin"? (Or would that always happen in that order anyway, for some reason?)
There are many ways to configure your Maven build. Here are a few of them:
let the maven-compiler-plugin handle Java compilation and annotation processing, and configure it to output the generated sources in addition to compiling them. Then use the build-helper-maven-plugin to add the generated sources directory to the project sources (or resources) for later consumption by the gwt-maven-plugin. That means the build-helper-maven-plugin has to run between the compile and prepare-package phases.
use the maven-processor-plugin to run the annotation processors and output the generated sources, and make sure they're added to the project sources. Then disable annotation processing for the maven-compiler-plugin using <proc>none</proc>.
use the maven-compiler-plugin twice: once to run the annotation processors (with <proc>only</proc>), and once to compile the files (with <proc>none</proc>). Basically, the first execution is equivalent to using the maven-processor-plugin; a sketch of this option follows below.
You shouldn't need to use resources:copy-resources.
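As a rough sketch of the third option (the execution IDs, phase, and output directory are assumptions; adjust to taste):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <executions>
    <!-- First execution: annotation processing only, keeping the generated sources -->
    <execution>
      <id>annotation-processing</id>
      <phase>generate-sources</phase>
      <goals><goal>compile</goal></goals>
      <configuration>
        <proc>only</proc>
        <generatedSourcesDirectory>${project.build.directory}/generated-sources/apt</generatedSourcesDirectory>
      </configuration>
    </execution>
    <!-- Second execution: plain compilation with annotation processing disabled -->
    <execution>
      <id>default-compile</id>
      <configuration>
        <proc>none</proc>
      </configuration>
    </execution>
  </executions>
</plugin>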
In my IDEA project a Scala module depends on a Java module. When I try to compile the Scala module, only scalac is triggered. It compiles both Java and Scala sources.
I'd like scalac to compile only the Scala module, because javac is much faster for Java sources (and my Java project is a big one).
How to make IDEA use different compiler for different modules?
My workaround is to (for each dependency on a Java module):
Delete module dependency in project configuration
Add dependency to appropriate compile output directory "MyJavaModule/target/classes"
Obviously I'm not happy with that, because every time I reimport the Maven project I need to repeat all of this to get fast compilation. I hope somebody knows a better way.
Clarification: I'd like to stress, that tools like SBT or Maven don't solve my problem. It is not about compilation alone. It's about compilation in IDEA, required for things like Scala Worksheet or running unit tests from IDEA. My goal is to have full range of IDEA niceties (syntax highlighting, intelligent auto-completion, auto-imports, etc) with compilation speed of SBT. Now I have to either tolerate long compilation times (due to dependencies to my Java module) or to use bare-bones REPL and testing in SBT.
Randall Schulz has asked the right question in the comment: "Why does it matter which tool does the compilation?"
Up until now I believed that IDEA needs to compile all classes itself if you want to use its nice features (like IDEA's Scala Console or running tests from within it). I was wrong.
In fact, IDEA will pick up classes compiled by any other tool (like the great SBT for instance). You just need to assure that all classes are up-to-date before using any of IDEA's helpful features. The best way to do it is:
launch continuous incremental compilation in the background (for example by issuing "~ compile" in SBT)
remove the "make" step in IDEA's run configurations
That's all! You can then use all the cool features of IDEA (not only syntax highlighting and code completion, but also auto-imports in the Scala Console and quickly running selected unit tests) without switching between different windows.
That's the workflow I missed until now! Thanks to everybody for all the comments about the issue.
You should look at using a dependency management suite like Apache Ivy or Apache Maven. Then put your Java source in a separate artifact, and have your Scala project be dependent on the Java project artifact.
If you go the Maven route, there is a Scala plugin.
Probably the simplest way to get compiled Scala and Java files is SBT - Simple Build Tool. Just create a project (+ add dependencies and so on) and compile it. Scala + Java compilation works out of the box. I've switched to SBT from Maven.
If you have a complex POM or if you have another reason not to migrate to SBT, you can try to configure the POM. Just adding (and possibly configuring) the Scala plugin should be enough. I hope it will not break the Java support.
A project was using various libraries, e.g. a.jar, b.jar, c.jar, d.jar, etc.
Some of the jars have been refactored and are now ab.jar and cd.jar, etc.
What I need is an automatic way to find which jars in my installation are now obsolete and I can delete them.
Is this possible?
So with LooseJar you can detect unused jar files by adding:
-javaagent:loosejar.jar
to your java command when you invoke it from the command line (or as a VM option in Eclipse). I guess this isn't technically automatic, because lines of code that dynamically load classes at runtime will need to be invoked in order for LooseJar to know that the class, and therefore the jar, is needed. A good method might be to invoke your unit tests with this Java agent (assuming your unit tests have good code coverage).
The best way is to use Maven. If dependencies are defined in Maven you can just run mvn dependency:tree to retrieve the needed information. Please refer to this article for details.
If you do not use Maven you probably have to use tools like JDepend. But be careful: such tools cannot really retrieve all dependencies. It is impossible to detect a dependency on a dynamically loaded class, or on an API called by reflection, using static analysis only. A full solution may be achieved only if you run your application, test it with all possible scenarios, and check which classes are loaded by the class loader. If you have 100% test coverage, you can run your application using the option -verbose:class and then run all unit tests against it. You will get a list of all loaded classes. Now put this list into a file and write a shell script that analyses the class list and transforms it into a jar list.
Thinking that the answer to this is pretty obvious but here it goes:
When I am working on a small project for school (in java) I compile it.
At my co-op we are using Ant to build our project.
I think that compiling is a subset of building. Is this correct? What is the difference between building and compiling?
The "Build" is a process that covers all the steps required to create a "deliverable" of your software. In the Java world, this typically includes:
Generating sources (sometimes).
Compiling sources.
Compiling test sources.
Executing tests (unit tests, integration tests, etc).
Packaging (into jar, war, ejb-jar, ear).
Running health checks (static analyzers like Checkstyle, Findbugs, PMD, test coverage, etc).
Generating reports.
So as you can see, compiling is only a (small) part of the build (and the best practice is to fully automate all the steps with tools like Maven or Ant and to run the build continuously which is known as Continuous Integration).
Some of the answers I see here are out-of-context and make more sense if this were a C/C++ question.
Short version:
"Compiling" is turning .java files into .class files
'Building" is a generic term that includes compiling and other tasks.
"Building" is a generic term describes the overall process which includes compiling. For example, the build process might include tools which generate Java code or documentation files.
Often there will be additional phases, like "package" which takes all your .class files and puts them into a .jar, or "clean" which cleans out .class files and temporary directories.
Compiling is the act of turning source code into object code.
Linking is the act of combining object code with libraries into a raw executable.
Building is the sequence composed of compiling and linking, with possibly other tasks such as installer creation.
Many compilers handle the linking step automatically after compiling source code.
In simple words:
Compilation translates Java code (human readable) into bytecode, so the virtual machine understands it.
Building puts all the compiled parts together and creates (builds) an executable.
A build is a compiled version of a program.
To compile means to convert (a program) into machine code or a lower-level form in which the program can be executed.
In Java: a build is a lifecycle that contains a sequence of named phases.
For example, Maven has three build lifecycles; the following is the default build lifecycle.
◾validate - validate the project is correct and all necessary information is available
◾compile - compile the source code of the project
◾test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
◾package - take the compiled code and package it in its distributable format, such as a JAR.
◾integration-test - process and deploy the package if necessary into an environment where integration tests can be run
◾verify - run any checks to verify the package is valid and meets quality criteria
◾install - install the package into the local repository, for use as a dependency in other projects locally
◾deploy - done in an integration or release environment, copies the final package to the remote repository for sharing with other developers and projects.
Actually you are doing the same thing. Ant is a build system based on XML configuration files that can do a wide range of tasks related to compiling software. Compiling your Java code is just one of those tasks. There are many others, such as copying files around, configuring servers, assembling zips and jars, and compiling other languages such as C.
You don't need Ant to compile your software. You can do it manually, as you are doing at school. Another alternative to Ant is a product called Maven. Both Ant and Maven do the same thing, but in quite different ways.
Look up Ant and Maven for more details.
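To make the comparison concrete, here is a minimal, hypothetical Ant build.xml where compiling is just one target among several (paths and names are illustrative):

<project name="sample" default="jar">
    <!-- Compiling: just one task in the build -->
    <target name="compile">
        <mkdir dir="build/classes"/>
        <javac srcdir="src" destdir="build/classes" includeantruntime="false"/>
    </target>
    <!-- Building goes further: package the classes into a jar -->
    <target name="jar" depends="compile">
        <jar destfile="build/sample.jar" basedir="build/classes"/>
    </target>
</project>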
In Eclipse and IntelliJ, the build process consists of the following steps:
cleaning the previous packages,
validate,
compile,
test,
package,
integration,
verify,
install,
deploy.
Compiling is just converting the source code to binary; building is compiling plus linking in any other files needed into the build directory.