Currently my deploy workflow involves manually (i.e. in a script) cd-ing into each Maven project directory and running mvn install. The problem is with local resources, i.e. other in-house code that I've written and am actively developing/maintaining: I don't know how to tell Maven to build those resources itself when they are missing. Ideally, each time I need to re-package the top-level application, it would rebuild any libraries it depends on that have at least one modified file.
If your (multi-module) project uses other in-house resources, what you actually need might not be to rebuild all those resources all the time, but to use a local maven repository. It can be a simple repository where resources are deployed using ssh or an HTTP transport (see the deploy plugin), or a real artifact manager such as Archiva, Artifactory or Nexus.
A repository manager does more than just hold your deployed artifacts: it can also clean up obsolete snapshots once the corresponding release has been made, and serve as a local cache for other repositories, including central.
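As an illustration, here is a minimal sketch of deploying to an in-house repository over ssh; the host, path and wagon version are placeholders, not something from the question:

    <!-- in the project's pom.xml -->
    <distributionManagement>
      <repository>
        <id>internal</id>
        <url>scp://build.example.com/var/maven/repo</url>
      </repository>
    </distributionManagement>
    <build>
      <extensions>
        <!-- the ssh wagon is not bundled with recent Maven releases -->
        <extension>
          <groupId>org.apache.maven.wagon</groupId>
          <artifactId>wagon-ssh</artifactId>
          <version>3.5.3</version>
        </extension>
      </extensions>
    </build>

With this in place, mvn deploy pushes the artifact to the shared location, and other projects can consume it through a matching <repository> entry in their POM or settings.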
Have a parent POM which contains all your modules. When you build the parent, all the modules that are part of the parent POM will be built as well.
You can inherit many things from the parent as long as you declare the parent in your child POM.
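A minimal sketch, with made-up artifact IDs and both files elided to the relevant elements:

    <!-- parent/pom.xml -->
    <groupId>com.example</groupId>
    <artifactId>app-parent</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
      <module>core-lib</module>
      <module>web-app</module>
    </modules>

    <!-- parent/core-lib/pom.xml -->
    <parent>
      <groupId>com.example</groupId>
      <artifactId>app-parent</artifactId>
      <version>1.0-SNAPSHOT</version>
    </parent>
    <artifactId>core-lib</artifactId>

Running mvn install from the parent directory builds every module in dependency order, so a library that changed is rebuilt before the application that uses it.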
Consider setting up Jenkins to automatically build your code. Jenkins has a useful feature that will rebuild projects that depend on newly built artifacts. Builds can be automatically triggered by simply committing your code.
If you're using several development machines (or working in a team), combine Jenkins with Nexus (other options: Artifactory, Archiva) to provide a common store for shared artifacts. These tools were designed to support Maven builds.
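Once a repository manager is in place, each machine's settings.xml can route all downloads through it. A sketch, where the URL is a placeholder for your own Nexus instance:

    <settings>
      <mirrors>
        <mirror>
          <id>company-nexus</id>
          <!-- route every repository, including central, through the manager -->
          <mirrorOf>*</mirrorOf>
          <url>http://nexus.example.com/repository/maven-public/</url>
        </mirror>
      </mirrors>
    </settings>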
Simple console Maven artifacts with shared dependencies (some also provide public APIs in addition to their own classes) living on the same production server. How best to organise/install them on the production server?
My instinct is a single folder holding all (version-numbered) jars, i.e. a 'flattened'/dependency-populated 'repository', however:
(a) I can't see how such a folder would have its population grown, on a per-dependency basis, from the Maven deployment repository.
(b) I can't see how a jar's manifest classpath would change from the default 'lib/...,lib/...' (i.e. relative to the 'main' jar, which is sensible for dev/test using Eclipse) to just '...,...'.
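For concreteness, a typical maven-jar-plugin configuration that produces such a 'lib/' prefix looks like this (not necessarily exactly what I have):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
            <addClasspath>true</addClasspath>
            <!-- this prefix is what would have to change for production -->
            <classpathPrefix>lib/</classpathPrefix>
          </manifest>
        </archive>
      </configuration>
    </plugin>

Dropping classpathPrefix (or setting it empty) would give the bare '...,...' form.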
What is the recommended best practice for organising this on the production server?
Googling 'maven production classpath' (amongst others) turned up http://blog.armstrongconsulting.com/?p=232 which seems related but is light on detail.
Any pointers?
How experienced are you with Maven? If you are, the process described in the blog you mention is pretty straightforward, even without going into detail.
Re (a): Dependencies are downloaded from a remote Maven repository into a local Maven repository on demand. This defaults to ${user.home}/.m2/repository, or to whatever <localRepository> specifies at the beginning of your settings.xml. See Introduction to Repositories. So there's no need for a single 'flattened'/dependency-populated 'repository' folder.
A local repository can also be populated manually with the install:install-file goal, but this can be a cumbersome process if there are many artifacts to install.
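For example (the file name and coordinates here are placeholders):

    mvn install:install-file -Dfile=some-lib-1.2.jar \
        -DgroupId=com.example -DartifactId=some-lib \
        -Dversion=1.2 -Dpackaging=jar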
See Maven, Available Plugins for what the mentioned plugin:goals do.
My company runs an internal Ivy repository on a NAS, where we put all the dependencies for our projects. A few days ago, the repo was rebuilt completely from scratch, using a combination of Java code and calls to the Ant task ivy-install.
I personally designed and performed the rebuild, but faced an important issue. While I used mvnrepository.com as a reference for importing common open-source projects (e.g. Spring, Hibernate), packages were often available only in not-widely-known Maven repositories I had to Google for.
Every time that happened (e.g. with hibernate-spatial) I had to do some manual work: register the remote source in ivy-settings.xml and run ivy-install again by hand.
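That registration looks roughly like this (the resolver name and URL are made up for illustration):

    <ivysettings>
      <settings defaultResolver="main"/>
      <resolvers>
        <chain name="main">
          <ibiblio name="central" m2compatible="true"/>
          <!-- hand-registered source, e.g. for hibernate-spatial -->
          <ibiblio name="hibernate-spatial-repo" m2compatible="true"
                   root="http://repo.example.org/maven2/"/>
        </chain>
      </resolvers>
    </ivysettings>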
This made me think: does a tool even exist that graphically (e.g. via a web interface) helps an Ivy administrator search multiple remote sources (registered by the user) and download artifacts to the local repository? As if I could type the <dependency> tag, or the org, name and rev attributes, and it would list the available sources; when I need a package in the local repo, I'd simply click and it would get published.
I have a Maven 3 multi-module project (~50 modules) which is stored in Git. Multiple developers are working on this code and building it, and we also have automated build machines that run cold builds on every push.
Most individual changesets alter code in a fairly small number of modules, so it's a waste of time to rebuild the entire source tree with every change. However, I still want the final result of running the parent project build to be the same as if it had built the entire codebase. And I don't want to start manually versioning modules, as this would become a nightmare of criss-crossing version updates.
What I would like to do is add a plugin which intercepts some step in build or install and takes a hash of the module contents (ideally pulled from Git), then looks in a shared binary repository for an artifact stored under that hash. If one is found, it uses that artifact and doesn't even execute the full build. If it finds nothing in the cache, it performs the build as normal, then stores its artifact in the cache. It would also be good to rebuild any modules which have dependencies (direct or transitive) which themselves had a cache miss.
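For the hashing part, Git can already provide a stable content hash per module directory; a sketch (the module path is made up):

    # the ID of the module's tree object at the current commit changes
    # if and only if some file under that directory changed
    git rev-parse HEAD:some-module

A cache keyed on that ID, plus the IDs of the module's in-reactor dependencies so that misses propagate, would behave the way I describe.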
Is there anything out there which does anything like this already? If not, what would be the cleanest way to go about adding it to Maven? It seems like plugins might be able to accomplish it, but for a couple of pieces I'm having trouble finding the right way to hook into Maven. Specifically:
How can you intercept the "install" goal to check the cache, and only invoke the module's 'native' install goal on a cache miss?
How should a plugin pass state from one module to another regarding which cache misses have occurred in order to force rebuilds of dependencies with changes?
I'm also open to completely different ways to achieve the same end result (fewer redundant builds) although the more drastic the solution the less value it has for me in the near term.
I have previously implemented a more complicated solution involving artifact version manipulation and deployment to a private Maven repository. However, I think the following will fit your needs better and is somewhat simpler:
Split your build into multiple builds (e.g. a single build per module using Maven's -pl argument).
Set up parent-child relationships between these builds. (Bamboo even has additional support for figuring out Maven dependencies, but I'm not sure how it works.)
Configure Maven's settings.xml to use a different local repository location: specify a new directory inside your build working directory (see the sketch after this list). Docs: https://maven.apache.org/guides/mini/guide-configuring-maven.html
Use the mvn install goal to ensure newly built artifacts are added to that local repository.
Use Bamboo artifact sharing to expose built artifacts from the local repository; you should probably filter this to include only the package(s) you're interested in.
Set dependent builds to download all artifacts from their parent builds and put them into the proper subdirectory of the local repository (which is customized to be inside the working directory).
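A sketch of that local-repository override (the .repository path is just an example):

    <!-- settings.xml used by the build plan -->
    <settings>
      <localRepository>/path/to/bamboo/working/dir/.repository</localRepository>
    </settings>

    # or as a one-off on the command line, relative to the working directory
    mvn -Dmaven.repo.local=.repository clean install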
This should even work for feature branch builds thanks to the way Bamboo handles parent-child relations for branch builds.
Note that this implies that Maven will redownload all other dependencies, so you should use a proxy private Maven repository on local network, such as Artifactory or Nexus.
If you want, I can also describe the more complicated scenario I've already implemented that involves modifying artifact versions and deploying to private Maven repository.
Jenkins' Maven integration allows you to manage/minimize dependent builds, which can be triggered:
whenever a SNAPSHOT dependency is built (determined by Maven)
after other projects are built (manually via Jenkins jobs)
And if you do a 'mvn deploy' to save the build into your corporate Maven repo, then you don't have to worry about dependencies when builds run on slave Jenkins machines. The result is that no module is ever built unless it or one of its dependencies has changed.
Hopefully you can apply these principles to a solution with Bamboo.
The problem is, in our company we have a project with multiple sub-modules; however, one of the sub-modules is just a collection of API declarations and is meant for other (3rd-party) projects to use. I want to keep it as a sub-module because it is easier to maintain and build (dependency and property inheritance). Other sub-modules in this project also depend on it.
The question I have is whether there exists a good practice or a nice way to execute a deploy phase that uploads just this sub-module to a different repository (it can be duplicated too) without it having a dependency on the parent POM.
What I have already tried:
I have already checked deploy:deploy-file, but the problem is SNAPSHOT builds. We want to be able to publish both SNAPSHOT and release builds, and snapshots go to a different repository than releases, but the deploy-file goal can only take a single url parameter. I do not wish to use a different profile for the snapshot deploy. Then I tried to use the Maven build-helper plugin and its regex-property goal to change the repository URL when the version is a SNAPSHOT, but was unable to do so because of plugin and regex limitations.
The last option is to write a plugin for this, but I wish to know if there is a more elegant way to solve it the "Maven way".
You can deploy this module separately, but only for SNAPSHOTs; for a release it does not make sense. The deployment of a single module can be done via:
mvn -pl TheModuleYouWouldLikeToDeploy deploy
Maybe you need to add the option -am (--also-make, which additionally builds the projects your module requires), like:
mvn -am -pl TheModuleYouWouldLikeToDeploy deploy
Apart from that, your approach sounds wrong: if you are using a multi-module build, why not deploy the whole build via mvn deploy? It might be better to let a CI tool like Jenkins do the job.
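Note that when you deploy through the regular deploy phase (rather than deploy-file), the snapshot-vs-release routing the question struggles with comes for free: Maven picks the target repository based on whether the version ends in -SNAPSHOT. A sketch, with placeholder IDs and URLs:

    <distributionManagement>
      <repository>
        <id>releases</id>
        <url>http://repo.example.com/releases/</url>
      </repository>
      <snapshotRepository>
        <id>snapshots</id>
        <url>http://repo.example.com/snapshots/</url>
      </snapshotRepository>
    </distributionManagement>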
One of the most time-consuming tasks Jenkins performs during every build is downloading the artifacts into its local repository, which it then deletes.
While deleting my own artifacts is fine, I don't understand the necessity of deleting 3rd-party artifacts which were previously downloaded into its local Maven repository (.m2).
Is there any way to prevent Jenkins from deleting the local repository before a build?
Thanks
You should install a Maven repository manager (MRM) like Sonatype Nexus, JFrog Artifactory or Apache Archiva; the downloads will then be local to your network and very fast. Using an MRM is pretty much considered a necessity for any serious usage of Maven, or of any build tool with declarative dependency management, since it allows you to cache artifacts as well as upload your own libraries and share them across your developers and your CI builds.
If that is still not enough, you can disable the private-repository deletion per build, or even share one repository across builds, but that reduces the stability of the builds since you are now mixing state between them and therefore introducing interdependencies.
While I agree with Manfred's recommendation to use a Maven repository manager, I'd also recommend looking at how you manage the Maven local repository:
Prevent Jenkins from Installing Artifact to Local Maven Repository
When is it safe to delete the local Maven repository?
Ivy, Ant, Jenkins - Is it a good idea to do an <ivy:cleancache> on Jenkins builds?
Maven does not normally purge the local repository; I'm guessing you have a periodic task that does this.