My company runs an internal Ivy repository on a NAS, where we put all the dependencies for our projects. A few days ago the repo was rebuilt completely from scratch, using a combination of Java code and calls to the Ant ivy-install task.
I personally designed and performed the rebuild, but faced an important issue. While I used mvnrepository.com as a reference for importing common open source projects (e.g. Spring, Hibernate), packages were often only available in not-widely-known Maven repositories I had to Google for.
Every time I found a package belonging to an "unknown" Maven source (e.g. with hibernate-spatial), I had to do some manual work: register the remote source in ivy-settings.xml and run ivy-install again by hand.
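For context, registering such a source means adding another resolver to the chain in ivy-settings.xml; a minimal sketch, where the NAS patterns and the remote repository URL are placeholders:

<ivysettings>
    <settings defaultResolver="main"/>
    <resolvers>
        <chain name="main">
            <!-- internal repository on the NAS (patterns are hypothetical) -->
            <filesystem name="internal">
                <ivy pattern="/mnt/nas/ivy/[organisation]/[module]/[revision]/ivy.xml"/>
                <artifact pattern="/mnt/nas/ivy/[organisation]/[module]/[revision]/[artifact]-[revision].[ext]"/>
            </filesystem>
            <!-- the "unknown" remote Maven repository found by Googling -->
            <ibiblio name="extra-remote" m2compatible="true" root="https://repo.example.org/maven2/"/>
        </chain>
    </resolvers>
</ivysettings>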
This made me think: does a tool exist that graphically (e.g. via a web interface) helps an Ivy administrator search multiple remote sources (registered by the user) and download artifacts to the local repository? Something where I could type the <dependency> tag, or the org, name and rev attributes, and it would list the available sources; when I need a package in the local repo, I simply click and it gets published.
Related
Why use Maven when you have such a quantity of local jars?
So we have a client that has a lot of private and custom jars.
For example, commons-langMyCompanyCustom.jar, which is commons-lang.jar with 10 more classes in it.
So on their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse and have a Maven build with the public ones, but we do not have permission to add their jars to our organizational repository.
So we want to use the good things Maven offers, like compile, test, build an uber-jar, add static code analysis, generate javadoc and sources jars, etc., instead of doing these things one by one with the help of Eclipse.
So we have 70 jars, some of them public: when I get the effective POM in their environment, I find 50 of them in Maven Central, but the other 20 are what I call "custom" jars. I searched for a solution, of course, but only found this:
<dependency>
    <groupId>sample</groupId>
    <artifactId>com.sample</artifactId>
    <version>1.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/src/main/resources/yourJar.jar</systemPath>
</dependency>
So for all 20 of them I would have to add this in the development Maven profile?
Is there an easy way, like in Gradle, where you can add a whole folder of jars to the existing dependencies?
Also, installing them one by one into every developer's repository is not acceptable.
Please forget the system scope as mentioned before! Too problematic...
Ideally:
Ideally, all your developers would have access to a repository manager in your or your client's organization (if possible).
A central environment for your System Integration Testing, maybe?
Alternatively, you may have a central environment for testing where all the dependencies are provided. This approach can be used to simulate how a compilation would work as if it were in your client's environment. Plus, you only set up the jars once.
So on their environment we use 100% Maven without local dependencies. But on our site we have the jars for development in Eclipse and have a Maven build with the public ones, but we do not have permission to add their jars to our organizational repository.
According to what you're saying in the above-quoted excerpt, I believe you want the provided scope set in your build's pom.xml, assuming that the dependencies will be present in the client's setup.
Especially as you indicate that the organization doesn't give you permission to add their jars to your repository, I would use the provided scope.
As stated in the Maven docs, the definition of a provided dependency is as follows:
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
So basically you assume that these dependencies will be present in your client's setup. However, this has some limitations: you can build solutions independently, but you cannot test them locally because you won't have the dependencies on your workstation.
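A minimal sketch of such a declaration; the coordinates below are hypothetical placeholders for one of the 20 custom jars:

<dependency>
    <groupId>com.client.internal</groupId>
    <artifactId>commons-langMyCompanyCustom</artifactId>
    <version>1.0</version>
    <scope>provided</scope>
</dependency>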
If you won't even have access to the jars to configure your central environment, ask whether your client can provide a DEV/SIT environment.
None of the above? Inherit a parent pom.
To avoid the whole constant copy-paste process for every single (related) project, Maven has tools to centralize dependency and plugin configuration; one of them is inheriting the configuration of a parent POM. As explained in the documentation, this is quite simple:
First, create a project with just a pom.xml in which you define everything you wish to centralize (watch out: certain items have slight differences in their constructs);
Set the packaging tag to pom: <packaging>pom</packaging>;
In the POMs that have to inherit these configurations, set the parent's coordinates inside <parent> ... </parent> (the documentation is very clear on this);
Now, every time you update any "global" POM configuration, only the parent version has to be updated in every project. As a result, you only need to configure everything once.
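A minimal sketch of the two sides, with placeholder coordinates:

Parent pom.xml:

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>company-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>
    <!-- centralized plugin and dependency configuration goes here,
         e.g. inside <dependencyManagement> and <pluginManagement> -->
</project>

Child pom.xml:

<project>
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.example</groupId>
        <artifactId>company-parent</artifactId>
        <version>1.0.0</version>
    </parent>
    <artifactId>my-module</artifactId>
</project>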
You can also combine this with the above-mentioned solutions to find the mix that best fits your needs.
But there is a big Maven world out there, so I advise a good read of its docs to get to know your options. I mention these situations because I've been in a situation similar to the one you seem to be in now.
Good luck!
Another alternative is the project RepoTree.
This one creates a Maven repository directory (not a server) from another directory which contains just the .jars. In other words, it creates the necessary .pom files and directory structure. It takes into account only the precise information from metadata contained in the archives (MANIFEST.MF, pom.xml).
Utility to recursively install artifacts from a directory into a local Maven repository. Based on Aether 1.7.
This is 5 years old, but should still work fine.
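For orientation, such a tool lays files out in the standard Maven repository convention, mapping G:A:V coordinates to paths; a hypothetical artifact com.example:foo:1.0 would end up as:

com/example/foo/1.0/foo-1.0.pom
com/example/foo/1.0/foo-1.0.jar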
TL;DR: MavenHoe creates a Maven repository server (not a directory) which serves the artifacts from a directory, guessing what you ask for if needed. The purpose is to avoid complicated version synchronization: it simply takes whatever is closest to the requested G:A:V.
I have moved the MavenHoe project, which almost got lost with the decline of Google Code, to GitHub. Therefore I put it here, in the form of a full answer, for availability:
One of the options you have when dealing with conditions like these is to take whatever comes in the form of a directory of .jars and treat it as a repository.
Some time ago I wrote a tool for that purpose. My situation was that we were building JBoss EAP, recompiling every single dependency.
That resulted in thousands of .jars which were most often the same as their Central counterparts (plus security and bug fixes).
I needed the tests to run against these artifacts rather than the Central ones; however, the Maven coordinates were the same.
Therefore, I wrote this "Maven repository/proxy" which provided the artifact if it found something that could be it, and if not, it proxied the request to Central.
It can derive the G:A:V from three sources:
MANIFEST.MF
META-INF/.../pom.xml
Location of the file in the directory, in combination with a configuration file like this:
jboss-managed.jar org/jboss/man/ jboss-managed 2.1.0.SP1 jboss-managed-2.1.0.SP1.jar
getopt.jar gnu-getopt/ getopt 1.0.12-brew getopt-1.0.12-brew.jar
jboss-kernel.jar org/jboss/microcontainer/ jboss-kernel 2.0.6.GA jboss-kernel-2.0.6.GA.jar
jboss-logging-spi.jar org/jboss/logging/ jboss-logging-spi 2.1.0.GA jboss-logging-spi-2.1.0.GA.jar
...
The first column is the file name in the .zip; then come the groupId (with either slashes or dots), artifactId, version, and artifact file name, respectively.
Your 70 files would be listed in this file.
See more information at this page:
https://rawgit.com/OndraZizka/MavenHoe/master/docs/README.html
The project is available at https://github.com/OndraZizka/MavenHoe.
Feel free to fork and push further, if you don't find anything better.
Simple console Maven artifacts with shared dependencies (some also provide public APIs in addition to their own classes) living on the same production server. How best to organise/install them on the production server?
My instinct is a single folder holding all (version-numbered) jars (i.e. a 'flattened', dependency-populated 'repository'), however:
(a) I can't see how such a folder's population would grow, on a dependency basis, from the Maven deployment repository;
(b) nor how a jar's manifest classpath would change from the default 'lib/...,lib/...' (i.e. relative to the 'main' jar, sensible for dev/test using Eclipse) to just '...,...'.
What is the recommended best practice for organising this on the production server?
Googling 'maven production classpath' (amongst others) turned up http://blog.armstrongconsulting.com/?p=232, which seems related but is light on detail.
Any pointers?
How experienced are you with Maven? If you are, the process described in the blog you mention is pretty straightforward, even without going into details.
Re (a): Dependencies are downloaded from a remote Maven repository into a local Maven repository on demand, by default into ${user.home}/.m2/repository, or wherever <localRepository> at the beginning of your settings.xml points. See Introduction to Repositories. So there's no need for a single 'flattened', dependency-populated 'repository' folder.
A local repository can also be populated manually with the install:install-file goal, but this can be a cumbersome process if there are many artifacts to install.
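For reference, a single manual installation looks roughly like this (the file path and coordinates are placeholders):

mvn install:install-file -Dfile=path/to/yourJar.jar \
    -DgroupId=com.example -DartifactId=your-artifact \
    -Dversion=1.0 -Dpackaging=jar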
See Maven, Available Plugins for what the mentioned plugin:goals do.
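Re (b): the manifest's Class-Path entries are typically controlled through the maven-jar-plugin; a hedged sketch of switching the prefix:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <archive>
            <manifest>
                <addClasspath>true</addClasspath>
                <!-- an empty prefix yields '...,...' entries; 'lib/' yields 'lib/...,lib/...' -->
                <classpathPrefix></classpathPrefix>
            </manifest>
        </archive>
    </configuration>
</plugin>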
I have a Maven 3 multi-module project (~50 modules) which is stored in Git. Multiple developers are working on this code and building it, and we also have automated build machines that run cold builds on every push.
Most individual changes alter code in a fairly small number of modules, so it's a waste of time to rebuild the entire source tree on every change. However, I still want the final result of running the parent project build to be the same as if it had built the entire codebase. And I don't want to start manually versioning modules, as this would become a nightmare of criss-crossing version updates.
What I would like to do is add a plugin which intercepts some step of build or install and takes a hash of the module contents (ideally pulled from Git), then looks in a shared binary repository for an artifact stored under that hash. If one is found, it uses that artifact and doesn't even execute the full build. If it finds nothing in the cache, it performs the build as normal and then stores its artifact in the cache. It should also rebuild any modules whose dependencies (direct or transitive) themselves had a cache miss.
Is there anything out there which does anything like this already? If not, what would be the cleanest way to add it to Maven? It seems like plugins might be able to accomplish it, but for a couple of pieces I'm having trouble finding the right way to attach to Maven. Specifically:
How can you intercept the "install" goal to check the cache, and only invoke the module's 'native' install goal on a cache miss?
How should a plugin pass state from one module to another regarding which cache misses have occurred in order to force rebuilds of dependencies with changes?
I'm also open to completely different ways to achieve the same end result (fewer redundant builds) although the more drastic the solution the less value it has for me in the near term.
I have previously implemented a more complicated solution involving artifact version manipulation and deployment to a private Maven repository. However, I think the following will fit your needs better and is somewhat simpler:
Split your build into multiple builds (e.g., a single build per module using Maven's -pl argument).
Set up parent-child relationships between these builds. (Bamboo even has additional support for figuring out Maven dependencies, but I'm not sure how it works.)
Configure Maven's settings.xml to use a different local repository location: specify a new directory inside your build working directory (see the sketch after this list). See the docs: https://maven.apache.org/guides/mini/guide-configuring-maven.html
Use the mvn install goal to ensure newly built artifacts are added to the local repository.
Use Bamboo artifact sharing to expose built artifacts from the local repository; you should probably filter this to include only the package(s) you're interested in.
Set dependent builds to download all artifacts from parent builds and put them into the proper subdirectory of the local repository (which is customized to be in the working directory).
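A minimal settings.xml sketch for the third step; the path is a placeholder for a directory inside the build working directory:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
    <!-- hypothetical path: point this inside the build's working directory -->
    <localRepository>/path/to/build-workdir/.repository</localRepository>
</settings>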
This should even work for feature branch builds thanks to the way Bamboo handles parent-child relations for branch builds.
Note that this implies that Maven will redownload all other dependencies, so you should use a proxy private Maven repository on local network, such as Artifactory or Nexus.
If you want, I can also describe the more complicated scenario I've already implemented that involves modifying artifact versions and deploying to private Maven repository.
The Jenkins Maven plugin allows you to manage/minimize dependent builds, which can be triggered:
whenever a SNAPSHOT dependency is built (determined by Maven), or
after other projects are built (manually, via Jenkins jobs).
And if you do a 'mvn deploy' to save the build into your corporate Maven repo, then you don't have to worry about dependencies when builds run on Jenkins slave machines. The result is that no module is ever built unless it, or one of its dependencies, has changed.
Hopefully you can apply these principles to a solution with Bamboo.
Currently my deploy workflow involves manually (i.e. in a script) cd-ing into each Maven project directory and running mvn install. The problem is that for local resources, i.e. other in-house code that I've written and am actively developing/maintaining, I don't know how to tell Maven to build those resources itself when they are missing. Ideally, each time I need to re-package the top-level application, it would rebuild any libraries it depends on that have at least one modified file.
If your (multi-module) project uses other in-house resources, what you actually need might not be to rebuild all those resources all the time, but to use a local Maven repository. It can be a simple repository where resources are deployed over ssh or an HTTP transport (see the deploy plugin), or a real artifact manager such as Archiva, Artifactory or Nexus.
A repository manager does more than just hold your deployed artifacts, it can also clean the obsolete snapshots once the corresponding release has been made, and serve as a local cache for other repositories, including central.
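A hedged sketch of wiring a project to such a repository through distributionManagement (the ids and URLs are placeholders); with this in place, mvn deploy publishes each library there:

<distributionManagement>
    <repository>
        <id>internal-releases</id>
        <url>https://repo.example.org/releases</url>
    </repository>
    <snapshotRepository>
        <id>internal-snapshots</id>
        <url>https://repo.example.org/snapshots</url>
    </snapshotRepository>
</distributionManagement>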
Have a parent POM which lists all your modules. When you build the parent, all the modules that are part of the parent POM file will be built as well.
You can inherit many things from the parent, as long as you reference the parent in your child POMs.
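A minimal aggregator sketch, with placeholder module names:

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>app-parent</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
        <!-- building the parent builds these in dependency order -->
        <module>shared-lib</module>
        <module>top-level-app</module>
    </modules>
</project>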
Consider setting up Jenkins to automatically build your code. Jenkins has a useful feature that will rebuild projects that depend on newly built artifacts. Builds can be automatically triggered by simply committing your code.
If you're using several development machines (or working in a team), combine Jenkins with Nexus (other options: Artifactory, Archiva) to provide a common store for shared artifacts. These tools were designed to support Maven builds.
It seems as though Ivy, Maven, Grape, and other dependency managers link to the same integrated repositories.
1) What do these different dependency managers have in common in terms of the way resources are checked and downloaded?
2) When I have a package name in an Ivy or Maven file, how can I find the curators of that package? Where are these remote Java resources unified and managed?
I'm not asking for the "development lifecycle" scope of information here; rather, I want to know specifically how Grape/Maven/Ivy are capable of playing nicely together, i.e., what is the standard for resolving remote Java repositories?
It's the Maven Central repository that ties these tools together. Tools like Ivy can be configured to use their own repository format, but they default to using the public Maven repository for downloads.
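For example, Ivy's Maven-compatible resolver is the ibiblio resolver; with m2compatible set and no explicit root, it resolves against Maven Central in the Maven 2 layout, roughly:

<ibiblio name="central" m2compatible="true"/>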
Maven Central is maintained by Sonatype, the company founded by Maven's creator and the maker of the Nexus repository manager. It is estimated that it will soon host 90% of Java's open source libraries.
Maven Central can be searched using the following URL:
http://search.maven.org
And the following guide gives information on how one can upload artifacts:
http://maven.apache.org/guides/mini/guide-central-repository-upload.html
Information on the ownership of modules is normally available from the module's POM file. Artifacts are also signed, using PGP, to prove ownership.
In conclusion, while Maven may be only one of several dependency management clients, it has certainly established itself as the de facto standard for server-side repository management.