Always Using Maven `clean` Goal?

Should the Maven clean goal be applied to every build as a best practice to avoid stale .class files? Or is always using clean unnecessary, since Maven is smart enough to know which source code needs to be re-compiled based on changes, regardless of the goal (compile, install, etc.)?
Gareth Davis pointed out the potential danger of forgetting to run a clean after renaming a class - https://stackoverflow.com/a/4662536/409976.
Example:
Compile one module (not the whole project): Foo.java -> target/Foo.class
Rename it to Bar.java & re-compile the module -> target/{Foo.class, Bar.class}
Re-compile main
BOOM: code in another module that relied on the Foo class should have failed to compile, but it still builds, because the stale Foo.class is still in target/ (Foo no longer exists in source, but we never cleaned).
I'm looking into whether the absence of clean will improve build performance on a guest VM within a shared directory. However, I'm not aware of all of the consequences of not always calling the clean step first.

Running clean before any build is always a good practice: the stale-class scenario you mention is entirely possible and, if encountered, can create problems in your final artifact.
I'm looking into whether the absence of clean will improve build performance on a guest VM.
By performance, my best guess is that you mean time efficiency. By avoiding the clean goal you can save time, since Maven will only rebuild the classes that changed and leave the untouched ones alone. So, technically speaking, yes, you will save some time in the build process.
However, I'm not aware of all of the consequences of not always calling the clean step first.
Class staleness is by far the most common issue you are likely to encounter.
From experience I can share a couple of things. If you are using an IDE and your project structure is somewhat large and complex (a couple of dependent modules, etc.), then running mvn clean can sometimes confuse the IDE's management of the project's build path, and you may run into ClassNotFoundExceptions. It is not a big issue, but you then have to do Project -> Maven -> Update Project to set the build path right.
(Edit: if you are running the build from the command line, this issue will never arise.)
On your dev machine it's all good with mvn install, but when you are deploying a build to a production or test server, it's best to run mvn clean before mvn install, to make sure no issues arise and all classes are built fresh before being wrapped into a jar/war.
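If you want clean to run on every build without having to remember to type it, one option is to bind the clean goal to an early lifecycle phase in the pom.xml. A minimal sketch, assuming the standard maven-clean-plugin; note that this removes target/ on every invocation, so you give up the incremental-build time savings discussed above:

```xml
<build>
  <plugins>
    <!-- Run clean automatically at the start of every build -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-clean-plugin</artifactId>
      <executions>
        <execution>
          <id>auto-clean</id>
          <phase>initialize</phase>
          <goals>
            <goal>clean</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```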


How do I bypass compiling in Maven?

My project has a complicated maven structure, with lots of sub projects, and 66 pom.xml files. But for my current task, I'm only modifying one resource file, a .xsl file that I need to test lots of small changes. The build takes 9 minutes, and it spends most of its time recompiling my java source files that haven't changed. Is there any way to use maven to package and build my .war file without recompiling my java source code?
This is a tricky one. Maven is not good at building incrementally.
My first shot would be to try @gaborsch's suggestion. But be careful: the results can be inconsistent, depending on your changes. You need to experiment to figure out whether this works for you.
The second shot would be to speed up the build. You can try to build in parallel, or you can only partly build your multi-module project if only parts are affected (and you are building Snapshot versions).
Gradle is much better at building incrementally (I am a Maven guy, but I have to admit it). But switching to Gradle is probably not the right way to go.
I will suggest what I answered to a similar question.
mvn package -pl my-child-module
This command will build only the my-child-module project.
It won't skip compilation, but it will at least avoid building the unwanted modules.
-pl, --projects Build specified reactor projects instead of all projects
-am, --also-make If project list is specified, also build projects required by the list
-amd, --also-make-dependents If project list is specified, also build projects that depend on projects on the list
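If the Java sources really haven't changed, you can also try the compiler plugin's skip switch (behaviour may depend on your maven-compiler-plugin version); it can be flipped from the command line with -Dmaven.main.skip=true, or in the pom:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Skip compiling main sources; existing classes in target/
         are packaged as-is. Only safe when no .java file changed. -->
    <skipMain>true</skipMain>
  </configuration>
</plugin>
```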

Bind javadoc and sources goals to deploy phase before deploy goal is run

I would like my project to generate javadoc and sources artifacts only when I activate the deploy phase, not when I ask for mvn install. That is because I only need those artifacts when deploying, and I want to save time when not deploying.
Therefore, I thought about binding the maven-source-plugin goal to the deploy phase.
But, I need those artifacts to exist at the time the deploy goal runs. Thus, the source and javadoc generation goals must run before the deploy goal. Unfortunately, the goal from the packaging is executed first (as documented).
I am aware the usual advice is to define a "release" profile and configure the javadoc plugin (and related ones) only there. But this seems needlessly complicated for my simple use case: I would then need to remember to activate the release profile exactly when I ask for a deploy, whereas I would prefer the right plugins to be activated automatically depending on the phase I request.
I am surprised this does not seem to be considered possible or even desirable by Maven (as it does not seem to allow for a goal to run in the deploy phase but before the deploy goal). Did I miss something, is this possible? Or, is there any reason not to do it in the way I consider? (Otherwise, I am thinking about introducing a feature request.)
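For reference, a minimal sketch of the "release" profile workaround mentioned above (plugin versions omitted for brevity): the sources and javadoc jars are attached during the package phase, but only when the profile is active, e.g. mvn deploy -Prelease:

```xml
<profiles>
  <profile>
    <id>release</id>
    <build>
      <plugins>
        <!-- Attach a -sources jar -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-source-plugin</artifactId>
          <executions>
            <execution>
              <id>attach-sources</id>
              <goals>
                <goal>jar-no-fork</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
        <!-- Attach a -javadoc jar -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-javadoc-plugin</artifactId>
          <executions>
            <execution>
              <id>attach-javadocs</id>
              <goals>
                <goal>jar</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```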

How to use Jenkins job(s) with dependent git repositories and multiple branches

We have 2 git repositories, Platform and US (we have other geo-specific ones as well which is why they are split, but they are not necessarily relevant here). US depends on Platform.
We are using git-flow (meaning new features are in their own branches like feature/some-product, develop branch is somewhat more stable and represents QA-ready builds, master branch is stable and for releases). (If a feature has both Platform and US parts, there will be a branch in each with the same name.) We decided that the Jenkins jobs for the features should not run mvn deploy because we don't want to publish them to the snapshot repository and probably shouldn't run mvn install because we don't want a different feature branch to grab it from Jenkins's local repo (this we are less sure about though). We believe they should only make sure everything compiles and that the unit tests pass (mvn verify).
This is where the problem comes in, because these are separate git repositories and we are not doing anything with the compiled jar (install or deploy),
how can we safely expose the compiled jars from the Platform job to the US job without exposing them to other developers or jobs (or is this even a concern if only doing mvn install), or
how can one Jenkins job build Platform and US for a specific branch together?
If we only have a single actively developed branch (or we were using subversion) this would not be an issue.
Some ideas we have (and concerns with each)
For feature branches use a different version (e.g., 8.1.0-SNAPSHOT-some-product).
This seems like a lot of work for every feature branch.
It seems like it'd clog up the local repo with "stale" jars, and we would need to worry about purging them.
Somehow use git submodule to checkout Platform's and US's feature/some-product and either use mvn verify --reactor or a simple pom file with the top level projects as modules.
How to make Jenkins add the submodules?
If the submodules were already there, there would need to be a whole git repo for this, which seems redundant.
--reactor doesn't always work.
How to supply the pom file?
Just do mvn install.
feature/other-thing may exist only in US, so after Platform feature/some-product publishes to Jenkins's local repository (which may be very different from Platform develop, which US feature/other-thing would normally be built against), we think it would cause US feature/other-thing to fail (or pass!) in a false sense, since compiling it against Platform develop could give a different result.
I have not had to address this issue personally... here are my thoughts on how I would look at it:
If you MUST only have one job for both branches (a bad idea), you can use parameterized build plugin to pass in the text string "US" or "Platform" and have logic in a shell script that will check out the relevant repo's branch.
HOWEVER, this eliminates the ability to have repo polling kickoff the build. You would have to set up a build schedule on a cron and you would get a new build no matter what, even if the repo hasn't changed (unless your batch / shell script is smart enough to check for changes).
I don't see any reason NOT to have two separate Jenkins Jobs, one for each branch.
If one job needs access to the .jars (a.k.a. the build artifacts), you can always reference another job's artifacts via that job's "LATEST" URL on the Jenkins server. Be sure the jobs specify which artifacts need to be archived.
The way I ended up solving this is using the Maven versions plugin. I had to make sure all the modules were managed dependencies in the top-level project, but that may have been a separate issue. Also, I am sure of this: the US project will need to explicitly declare its version even if it is the same as the parent's.
They both poll git but the Platform job also triggers US if it built successfully.
The way the versions plugin works requires you to do this in two steps. Add two "Invoke top-level Maven targets" steps to the job; the second is clean deploy. The first differs slightly between Platform and US.
Platform: mvn versions:set -DnewVersion=yourBranchName-${project.version}.
US: mvn versions:update-parent -DparentVersion=yourBranchName-${project.version} versions:set -DnewVersion=yourBranchName-${project.version}
If the branch only exists on the US repository, then obviously don't make the Platform one, and the US one is the same command as what the Platform one's would have been.
One final issue I ran into: originally I had the new version as ${project.version}-yourBranchName, but the repository the job was deploying to only accepted snapshots, and because the version no longer ended in -SNAPSHOT, the deploy failed with error code 400.

What is the point of a Maven goal?

I come from a Ruby on Rails background and I am learning Java Spring MVC right now. When I try to compile my code using Maven in STS it wants me to include a goal. All the guides I read on this seem vague and unclear. What is the point of a goal? Why can't I just compile my source code and run it?
You have to understand that Maven is complex and modular.
It does not have any concept of "compiling" your code or "running" your code. What it does is trigger plugins in the order of the build lifecycle.
Running a Maven goal triggers a particular Maven plugin. For example, running mvn compile triggers the Maven compiler plugin.
This all seems to be overkill for someone just starting out in Java, and there are many "why can't Maven just do what it's told" questions on SO.
Most of these stem from a fundamental misunderstanding of what Maven is. It is not (strictly speaking) designed to "compile and run" your code. It is designed to carry out a number of preconfigured steps in a particular order.
When it comes to "running your code", this gets even trickier:
to run your code with the Maven exec plugin, call mvn exec:java. You obviously need to configure the plugin first.
to run your code as a webapp with an embedded Tomcat server, call mvn tomcat7:run-war.
to run your code as a GWT application in dev mode on an embedded webserver, call mvn gwt:run.
What all these goals have in common is that they trigger a specific, pre-configured, plugin that carries out the task you have asked for.
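For example, before mvn exec:java will do anything useful, the exec-maven-plugin needs at least a main class configured. A minimal sketch (com.example.App is a placeholder, not a real class from the question):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <configuration>
    <!-- Placeholder: replace with your application's entry point -->
    <mainClass>com.example.App</mainClass>
  </configuration>
</plugin>
```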
It is further worth pointing out that Maven is designed to compile, test and install your code, not really to run it. So whilst many plugins for running code do exist, the real strength of Maven comes from being able to automate the compiling, testing and deployment of code to a Maven repository.
As a final note, the massive red warning at the top of the page you linked says:
Apache Maven 1.x has reached its end of life, and is no longer supported.
Take heed. We're now on Maven 3.2.X. Don't read documentation for ancient and obsolete versions. This will serve no purpose but to confuse you.

Is it safe to run continuous builds with "mvn verify" instead of "mvn clean verify"

We are running our continuous builds on Hudson currently with "mvn clean verify". That is what we always did and so we never questioned it.
The question is: Is it safe to run a continuous build with "mvn verify" only?
So that would mean the maven-compiler-plugin would only compile classes that got changed since the last build and save precious time.
Will the feedback be the same quality as with "clean" or are there any drawbacks to be expected?
The product being tested is a typical Java web application with lots of generated code (JSPs, reports). There is also code around using Dependency Injection.
No, it's not safe! The Maven compiler plugin is not smart enough to figure out that the API of a class A has changed and that it should check all other classes which use this API, too. It will only compile A and create a jar with a lot of broken classes.
Note: It's usually better to run mvn clean in advance and then run the build/verify/compile/install. That allows you to run the second command several times without cleaning all the time.
