We have two Maven projects: a WAR application which contains our JSPs and presentation-layer code, and a shared library which contains our business-layer code.
Before migrating to Maven we had the shared library as a project reference of the WAR application. Whenever we built or debugged the WAR application in NetBeans, the shared library was compiled and built automatically and any new changes were picked up.
With Maven, it looks like any time we make a change to the shared library we now need to build the shared library project BEFORE debugging. Is there any way to retain the efficiency of the old method?
When we debug the WAR application is there any way to have Maven build the shared library dependency (local jar project) automatically whenever we debug?
In Eclipse it's just a matter of having auto build/hot deploy set up correctly. Just make sure to have the Maven plugin installed and use it as the builder for the project.
I don't imagine NetBeans is much different.
OK, I finally accomplished this using the maven-invoker-plugin:
<plugin>
  <artifactId>maven-invoker-plugin</artifactId>
  <version>1.6</version>
  <configuration>
    <projectsDirectory>../</projectsDirectory>
    <pomIncludes>
      <pomInclude>project1/pom.xml</pomInclude>
      <pomInclude>project2/pom.xml</pomInclude>
    </pomIncludes>
    <goals>
      <goal>install</goal>
    </goals>
  </configuration>
  <executions>
    <execution>
      <id>build-deps</id>
      <goals>
        <goal>install</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I highly doubt this is best practice, but it got the job done. You could put this in its own profile to make sure it doesn't interfere with automated building.
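For example, a minimal sketch of that profile approach (the profile id is made up; the plugin block is the one shown above):

<profiles>
  <profile>
    <id>build-deps</id>
    <build>
      <plugins>
        <!-- maven-invoker-plugin configuration from above goes here -->
      </plugins>
    </build>
  </profile>
</profiles>

You would then activate it only for local debugging, e.g. with mvn install -Pbuild-deps, so CI builds are untouched.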
The way that I've worked around this is to have a parent POM that encapsulates all of the sub-components:
myProject
+- myProject-web
+- myProject-bl
+- myProject-da
\- myProject-domain
Each is a separate project that builds into its own Maven artifact. Then the parent POM simply includes the other projects as modules:
<modules>
<module>myProject-web</module>
<module>myProject-bl</module>
<module>myProject-da</module>
<module>myProject-domain</module>
</modules>
Now, whenever you build myProject, maven will rebuild the sub-components if they are out of date. Since maven builds each sub-component individually, you will have each part in your Maven repo, accessible by any other project that needs to use one or more parts of your project.
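For completeness, a minimal sketch of the parent myProject/pom.xml (groupId and version are placeholders); the WAR module then pulls in its siblings as ordinary Maven dependencies:

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>myProject</artifactId>
  <version>1.0-SNAPSHOT</version>
  <!-- an aggregator/parent must use pom packaging -->
  <packaging>pom</packaging>
  <modules>
    <module>myProject-web</module>
    <module>myProject-bl</module>
    <module>myProject-da</module>
    <module>myProject-domain</module>
  </modules>
</project>

<!-- in myProject-web/pom.xml -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>myProject-bl</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>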
I am developing a project which needs a client's personal jar, and it needs to be deployed on a pipeline of tools which are out of our control (sadly). One of the tools in this pipeline is SonarQube.
To build and deploy we have to use Maven.
I put the jar into a folder of the project and tried various ways to actually make it work.
The first (working) way was to declare it as a system-scoped dependency with a systemPath pointing to the folder in the project. It compiled, worked and everything, but SonarQube apparently hates systemPath and made us take it out.
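For reference, the system-scoped declaration looked roughly like this (coordinates and path are placeholders):

<dependency>
  <groupId>com.example.stuff</groupId>
  <artifactId>ClientJar</artifactId>
  <version>1.0</version>
  <scope>system</scope>
  <!-- the jar checked into a folder of the project -->
  <systemPath>${project.basedir}/lib/ClientJar-1.0.jar</systemPath>
</dependency>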
After a bit of searching, we added the maven-install-plugin to our POM, bound an install-file execution to the validate phase and configured it to generate the dependency. This seems to work locally if I first run mvn validate and THEN mvn clean package. Otherwise, it tries to look for the jar in the main repository and fails. If I comment out the <dependency> entry and leave only the plugin active, I noticed it executes the plugin and installs the jar into the local repository, but the build fails because it cannot resolve packages and classes inside the jar. If I then put the <dependency> entry back in, everything works, because it now finds the jar in the repository.
While this solution works, it doesn't suit me because the repository is emptied every once in a while, and to restart everything I would need two commits, one knowingly failing, just to install the jar.
I tried adding a <repository> entry instead, pointing to a project directory where I would store the necessary jar. That works just fine on my PC, but utterly fails on the pipeline, which looks at the main repository only (I guess it is some configuration on the pipeline, but I can't really tell, it being outside my control).
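The repository entry I tried was along these lines (id and directory name are placeholders; the jar has to be laid out inside that directory in standard repository layout):

<repositories>
  <repository>
    <id>project-local</id>
    <url>file://${project.basedir}/local-repo</url>
  </repository>
</repositories>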
I was actually able to do it with the maven-install-plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-install-plugin</artifactId>
  <configuration>
    <groupId>com.example.stuff</groupId>
    <artifactId>ClientJar</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>
    <file>${project.basedir}/src/main/resources/ClientJar-1.0.jar</file>
    <generatePom>true</generatePom>
  </configuration>
  <executions>
    <execution>
      <id>install-client-jar</id>
      <phase>validate</phase>
      <goals>
        <goal>install-file</goal>
      </goals>
    </execution>
  </executions>
</plugin>
And declaring the dependency:
<!-- Client jar -->
<dependency>
<groupId>example</groupId>
<artifactId>ClientJar</artifactId>
<version>1.0</version>
</dependency>
If you notice, the tricky part here was actually that the groupId differs between the dependency and the plugin declaration. I do not know if this difference is due to the configuration of their Artifactory server, but it seems to work locally too.
Also, you need to explicitly run mvn validate in the pipeline.
I am building an Eclipse project that consists of a number of plug-ins that are packed together. I have created POM files for each component and a main POM for the project. Something like this:
projectDir\releng\pom.xml <-- Parent project
projectDir\proj1\pom.xml <-- Child project 1
projectDir\proj2\pom.xml <-- Child project 2
My build currently works by calling the parent POM which builds everything. Until now I have been building using 0.0.1-SNAPSHOT as the version of the parent POM, and in each Eclipse plug-in I have 0.0.1.qualifier as the version in the MANIFEST.MF file.
I now want to promote my latest version to 0.1.0. From my understanding, this means that I have to go over ALL of my POM files AND MANIFEST.MF files and upgrade the version in both of them (since, while the version is defined in the parent POM, it is referenced in all child POMs).
Is this the correct way to do this or is there a way to automate the whole process and not make mistakes?
P.S. There is the Maven Release plugin but this won't work with Eclipse.
For the version update step of a release process, there is the tycho-versions-plugin, which knows how to consistently update the POMs and manifests.
Just go to the root of your parent/aggregator module and call
mvn org.eclipse.tycho:tycho-versions-plugin:set-version -DnewVersion="0.1.0"
This will update the version of the parent project and of all child projects that have the same/equivalent version as the parent project. In your case, that is all projects, because the Eclipse version 0.0.1.qualifier is considered equivalent to 0.0.1-SNAPSHOT in Tycho.
For the remaining steps of the release process (tagging, building, pushing tags, etc.) just call the appropriate SCM or Maven commands, e.g. from a script. I haven't tried to use the maven-release-plugin for this (and apparently no-one else has).
Please have a look here: Unleash Maven Plugin - Tycho Releases
The Unleash Maven Plugin is implemented as an alternative to the Maven Release Plugin and has a Tycho feature which should do exactly what you need. Furthermore, it is much more flexible, failure tolerant and has an integrated rollback feature.
I will publish some blog posts soon to promote and explain this plugin.
Just some hints on how we implemented it.
It can be done with an extra plugin that transforms the versions in MANIFEST.MF and *.product files. This plugin needs to be a lifecycle participant (@Component(role = AbstractMavenLifecycleParticipant.class)); the reason for this is that it must transform and commit before the release plugin starts to look for modifications. It must also transform back after the release.
The mojo-executor plugin saves a good deal of work since it can call the replacer, build-helper and SCM plugins from inside your plugin.
Another important gotcha is that you need to disable the hard-coded clean invocation that Tycho does, by configuring the release plugin to tell the clean plugin to skip execution.
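A sketch of that last point, assuming you skip the clean plugin via its standard skip property (adjust if your clean execution is bound differently):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <!-- extra arguments passed to the forked release build;
         maven.clean.skip makes the clean plugin skip execution -->
    <arguments>-Dmaven.clean.skip=true</arguments>
  </configuration>
</plugin>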
Hope this helps.
There is a new feature in tycho-1.1.0 (unreleased at the time of this post) that should support what you're trying to do.
If you've configured your POM correctly for the standard maven-release-plugin and added the dependency on Tycho 1.1.0, you can customize your build as follows [1]:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-release-plugin</artifactId>
<version>2.5.3</version>
<configuration>
<preparationGoals>org.eclipse.tycho:tycho-versions-plugin:${tycho-version}:update-eclipse-metadata org.apache.maven.plugins:maven-scm-plugin:1.9.5:add org.apache.maven.plugins:maven-scm-plugin:1.9.5:checkin</preparationGoals>
<completionGoals>org.eclipse.tycho:tycho-versions-plugin:${tycho-version}:update-eclipse-metadata org.apache.maven.plugins:maven-scm-plugin:1.9.5:add org.apache.maven.plugins:maven-scm-plugin:1.9.5:checkin</completionGoals>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-scm-plugin</artifactId>
<executions>
<execution>
<id>default-cli</id>
<goals>
<goal>add</goal>
<goal>checkin</goal>
</goals>
<configuration>
<includes>**/META-INF/MANIFEST.MF, **/feature.xml, **/*.product</includes>
<excludes>**/target/**</excludes>
<message>Changing the Eclipse files versions</message>
</configuration>
</execution>
</executions>
</plugin>
[1] This is taken directly from a tutorial that describes this new feature:
https://wiki.eclipse.org/Tycho/Release_Workflow
I've got what seems like a corner case for Eclipse/Maven and "Resolve dependencies from workspace projects". My project has a mix of written code and generated code, with the generated code coming from a dependency which uses JAXWS.
The problem is that if I check "Resolve dependencies", Eclipse/Maven ignores any JAR dependencies and tries to resolve everything by only looking at the workspace, which results in Eclipse showing errors like "Package/Class not found" (related to the generated code) even though the project will build fine with Maven from the command line.
On the other hand, if I uncheck it, it resolves everything by only looking at the JARs in the Maven repository. The second option generally works, but when I do something like Ctrl-click on a class or variable, I get the Class File Editor and "Source not found", which isn't terribly useful. Also, it can get out of sync if I edit code in the IDE but don't run mvn install afterwards.
I suppose this is mainly an inconvenience with Eclipse but it's annoying. I am considering resolving this by modifying the Maven dependencies to build with source (or debug) but I can't necessarily do this with everything. Is the "Resolve dependencies" option intended to work exclusively one way or the other as I've described?
You might want to have a look at the build helper maven plugin.
You can configure it like this:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<id>add-source</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>target/generated-sources</source>
<source>target/jaxws/wsimport/java</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
This will tell the Eclipse Maven integration to look at the generated sources and include them in your project classpath.
You can also add the generated sources manually to your classpath in Eclipse (right-click on the generated folder -> Add to Build Path).
I think that since you want to reference files that only exist after a build, you need to somehow force the build to happen before you need the references resolved. You could cheat by just doing a build from within Eclipse. That would leave the generated source files in place, ready to be referenced. I think, however, that the Maven philosophy would have you move the generated code to another Maven artifact entirely. That would let you separate the lifecycle of the two groups of code, so that when you're ready to use Eclipse to edit the hand-written code, references to generated classes are resolved because you've already generated that code in the build of a separate, independent module.
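In that setup, the hand-written project would just declare an ordinary dependency on the generated-code module, for example (coordinates are hypothetical):

<!-- in the hand-written project's pom.xml -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>myapp-ws-client</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>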
I know this is an old issue. But I encountered the same thing in Juno with an updated "m2e-wtp" plugin. So I'm answering solely for other readers' benefit.
This was only happening in WAR projects. The only thing that eventually resolved it was removing the ".settings" folder under the WAR project's folder and restarting Eclipse.
I have a multimodule maven setup for my project, made of 5 modules, which includes a GWT webapp.
It is also an Eclipse multi-project workspace, so I created an additional project, containing only a POM, which lists the other projects (siblings on the file system) as child modules.
I'm also a new maven user, so I might be doing something wrong. =)
The gwt module uses the following plugin
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>gwt-maven-plugin</artifactId>
<version>2.4.0</version>
<executions>
<execution>
<goals>
<goal>generateAsync</goal>
<goal>compile</goal>
</goals>
</execution>
</executions>
<configuration>
<hostedWebapp>war</hostedWebapp>
<runTarget>GWT.html</runTarget>
</configuration>
</plugin>
When I run mvn package on the POM project I get the expected behaviour: projects are built in the correct order, and the war is fine.
When I run mvn gwt:run, though, Maven tries to find a GWT app in each module, failing on the first one (the parent), which doesn't even declare nor manage the GWT plugin.
If I run mvn -fn gwt:run, the build fails on every other project, finally finding a GWT app in the gwt module and displaying it.
How do I correctly run the app on hosted mode? Is this the correct behavior?
I do not want the GWT module to be the parent module (if that's possible), because the project has multiple target platforms, producing the GWT web frontend, an executable Java jar backend and, in the future, also an Android app, and most of the code is shared (not only the model). Is a single POM structure recommended for such a setup, or am I failing at Maven?
Are profiles what I need? If so, should I declare the same profile id in each module? And how would I prevent gwt:run from being triggered on them anyway?
What should the setup of such a project be? Is this the correct setup?
Additional information
Modules are
pom: declares modules model, logic, analyze, gwt, tests
model: no dependencies
logic: no dependencies
analyze: depends on model, logic
gwt: depends on model, logic
tests: depends on model, logic, analyze, gwt (contains global tests,
not unit tests)
If I run gwt:run on the gwt module I get the error
Could not resolve dependencies for project
djjeck.gwt:djjeck.gwt:war:0.0.1-SNAPSHOT:
Could not find artifact djjeck.model:djjeck.model:jar:0.0.1-SNAPSHOT
This is from djjeck.gwt/pom.xml
<dependency>
<groupId>djjeck.model</groupId>
<artifactId>djjeck.model</artifactId>
<version>0.0.1-SNAPSHOT</version>
<scope>compile</scope>
</dependency>
A com.model-0.0.1-SNAPSHOT.jar is inside the war lib folder, both packed and unpacked, and also inside djjeck.model/target.
Go to the webapp module and then run mvn gwt:run.
You may use profiles to speed up compilation time: one profile could, for example, GWT-compile only for Gecko and English, with draftCompile enabled.
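A rough sketch of such a profile (profile id and module name are made up; restricting user.agent and locale is done in a separate development .gwt.xml that this profile points at):

<profiles>
  <profile>
    <id>dev</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>gwt-maven-plugin</artifactId>
          <configuration>
            <!-- faster, unoptimized compile for development -->
            <draftCompile>true</draftCompile>
            <modules>
              <!-- a dev .gwt.xml that inherits the real module and pins user.agent/locale -->
              <module>com.example.AppDev</module>
            </modules>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

Activate it with mvn package -Pdev while developing.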
Have a look at maven GWT plugin multi-module setup if you're still having problems.
As I was also struggling with GWT dev mode and a Maven project with multiple sub-modules/projects, I created an example and uploaded it to GitHub. You can find it at:
https://github.com/steinsag/gwt-maven-example
The readme on the above page shows how to run it via Maven. Features of this example are:
multiple modules
not using GWT's embedded Jetty, but its own Tomcat 7 server
startup of Tomcat7 and GWT hosted mode possible via documented Maven commands
I hope this helps a bit to have at least a working example to start from.
Up until now we used Ant in my company. Whenever we wanted to send the application to the client we ran a special Ant script that packaged all our source code with all jar libraries and Ant itself, along with a simple batch file.
Then the client could put the files on a computer with no network access at all (and not even Ant) and run the batch file. As long as the computer had a valid JDK the batch script would compile all the code using the jars and create a WAR/EAR that would finally be deployed by the client on the application server.
Lately we migrated to Maven 2, but I haven't found a way to do the same thing. I have seen the Maven Assembly Plugin, but that just creates source distributions or binary ones. Our scenario is actually a mix, since it contains our source code but binary jars of the libraries we use (e.g. Spring, Hibernate).
So is it possible to create with Maven a self-contained assembly/release/package that one can use on a computer with no network access at all? That means all libraries should be contained inside.
Extra bonus if Maven itself is contained inside as well, but this is not a strict requirement. The final package should be easily compiled by just one command (easy for a system administrator to perform).
I was thinking of writing my own Maven plugin for this but I suspect that somebody has already encountered this.
From your dev environment, if you include the following under build plugins
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
and invoke mvn assembly:assembly, you would get yourApp-version-jar-with-dependencies.jar in the target folder. This is a self-sufficient jar, and with a Main-Class entry in MANIFEST.MF, anybody can double-click it and run the application.
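That Main-Class entry can be set through the assembly plugin's archive configuration, for example (the class name is a placeholder):

<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- written into MANIFEST.MF of the assembled jar -->
        <mainClass>com.example.Main</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>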
You might try this approach:
Use mvn ant:ant to create Ant build scripts from the Maven project
Make sure Ant is a project dependency
Use the assembly plugin to build an Ant-based system
Or, plan B:
Use mvn ant:ant to create Ant build scripts from the Maven project
Make sure Ant is a project dependency
Write a "bootstrap class" to call Ant and run the build
Use appassembler to build a scripted build-and-install environment
In plan B, you'd write scripts to set up a source tree someplace from the packaged source jars, and then use the appassembler-generated bat or sh scripts to call the bootstrap and build via Ant. Your bootstrap can do anything you need to do before or after the build.
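A minimal appassembler sketch for plan B might look like this (the bootstrap class and script name are hypothetical):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>appassembler-maven-plugin</artifactId>
  <configuration>
    <programs>
      <program>
        <!-- bootstrap class that sets up the source tree and invokes Ant -->
        <mainClass>com.example.build.Bootstrap</mainClass>
        <name>build</name>
      </program>
    </programs>
  </configuration>
</plugin>

Running appassembler:assemble then generates sh/bat wrapper scripts plus a repository directory holding the jars the bootstrap needs on its classpath.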
Hope this helps.
Perhaps an answer that I submitted for a similar question could be of some assistance. See Can maven collect all the dependant jars for a project to help with application deployment? The one piece missing is how to include the source code in the assembly. I have to imagine that there is some way to manage that with the assembly plugin. This also doesn't address the inclusion of Maven in the distribution.
What was the reason for moving from Ant to Maven? It sounds like you had everything worked out well with the Ant solution, so what is Maven buying you here?
If it is just dependency management, there are techniques for leveraging Maven from Ant that give you the best of both worlds.
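For example, the Maven Ant Tasks can resolve the dependencies declared in a pom.xml from an Ant build (a rough sketch; target and path ids are placeholders):

<project name="offline-build" xmlns:artifact="antlib:org.apache.maven.artifact.ant">
  <target name="compile">
    <!-- resolve the dependencies declared in pom.xml into an Ant path -->
    <artifact:dependencies pathId="compile.classpath">
      <pom file="pom.xml"/>
    </artifact:dependencies>
    <javac srcdir="src/main/java" destdir="target/classes" classpathref="compile.classpath"/>
  </target>
</project>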
The source plugin will give you a jar containing the source of a project via source:jar. You could then use the assembly plugin to combine the source jars from your internal projects (using the sources to reference these source jars) and the binary jars from the external projects into one distribution.
However, as for turning this into a compilable unit, I have no suggestions. You could certainly bundle Maven, but you'd need to create a bundle containing all the plugins you need to build your project! I don't know of any existing tool to do that.
This is how I do it... in the build section of the POM, add this:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<executions>
<execution>
<id>attach-sources</id>
<phase>verify</phase>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
And then in the profiles section add this bit:
<profiles>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
And when I do mvn install it builds the jar and also installs a jar of the source alongside it.