Maven: where is the proper place to set internal repo site - java

It seems to me there are two places where I might want to set the internal Maven repo:
In Maven's settings.xml, inside the mirror tag;
In the project's pom file, inside the repository and pluginRepository tags.
The question is, which one is right? Or should I put the internal repo in both places?
Thanks,
John

I don't recommend putting repository definitions in your pom. I have been bitten many times by a pom in my dependency list that included repository definitions pointing to repositories I can't reach.
By putting these in settings.xml, you allow each developer the freedom to control which repositories are used when running a build. Since developers sometimes work disconnected or across a VPN, it can be desirable for this list of repositories to be different from machine to machine.
Remember also that the POM becomes immutable once a release is performed, effectively making the repository URL you defined permanent. Placing it in settings.xml allows your future team members the freedom to move the repository (or remove it).
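For illustration, a per-developer settings.xml can declare the internal repository (and plugin repository) in a profile; the profile id and URL below are placeholders:

<settings>
  <profiles>
    <profile>
      <id>internal-repo</id>
      <repositories>
        <repository>
          <id>internal</id>
          <url>https://repo.example.com/maven2</url>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>internal</id>
          <url>https://repo.example.com/maven2</url>
        </pluginRepository>
      </pluginRepositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>internal-repo</activeProfile>
  </activeProfiles>
</settings>

A <mirror> entry with the same URL is an alternative when you want every request routed through the internal repository.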

In both :-(
You put the server entry (ID plus login credentials) in Maven's settings.xml and the repository definition (the same ID plus the URL) in your project's pom.xml. It's cumbersome, but it keeps your credentials out of shared files.
Maven docs state:
The repositories for download and deployment are defined by the
repositories and distributionManagement elements of the POM. However,
certain settings such as username and password should not be
distributed along with the pom.xml. This type of information should
exist on the build server in the settings.xml.
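A minimal sketch of that split, assuming a repository id of internal-releases (the id, URL and credentials are placeholders); the credentials live in settings.xml under a <server> whose id matches the <repository> id in the pom:

<!-- settings.xml -->
<servers>
  <server>
    <id>internal-releases</id>
    <username>deployer</username>
    <password>secret</password>
  </server>
</servers>

<!-- pom.xml -->
<repositories>
  <repository>
    <id>internal-releases</id>
    <url>https://repo.example.com/releases</url>
  </repository>
</repositories>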

Related

How to add 70 local jars to a Maven project?

Why use Maven when you have such a quantity of local jars?
We have a client that has a lot of private and custom jars.
For example commons-langMyCompanyCustom.jar, which is commons-lang.jar with 10 more classes in it.
So on their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse and a Maven build with the public ones, and we do not have permission to add their jars to our organizational repository.
So we want to use the good Maven things: compile, test, build an uber-jar, add static code analysis, generate javadocs and sources-jars, etc., instead of doing these things one by one with the help of Eclipse.
So we have 70 jars; some of them are public. If I get the effective pom on their environment I find 50 of them in Maven Central, but the other 20 are what I call "custom" jars. I searched for a solution, of course, but only found this:
<dependency>
    <groupId>sample</groupId>
    <artifactId>com.sample</artifactId>
    <version>1.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/src/main/resources/yourJar.jar</systemPath>
</dependency>
So for all 20 of them I have to add this in the development Maven profile??
Is there an easy way, like in Gradle, where you can add a whole folder of jars to the existing dependencies?
Also, installing them one by one in every developer's repo is not acceptable.
Please forget the system scope as mentioned before! Too problematic...
Ideally:
Ideally, all your developers have access to a repository manager in your or their organization.
A central environment for your System Integration Testing, maybe?
Alternatively, you may have a central environment for testing where all the dependencies are provided. This approach can be used to simulate how a compilation would work as if it were in your client's environment. Plus, you only set up the jars once.
So on their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse and a Maven
build with the public ones, and we do not have permission to add
their jars to our organizational repository.
According to what you're saying in the above-quoted excerpt, I believe you want to set the provided scope in your build's pom.xml, assuming that the dependencies will be present in the client's setup.
Especially as you indicate that the organization doesn't give you permission to add their jars to your repository, I would use the provided scope.
As stated in the Maven docs, the definition of a provided dependency is as follows:
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
So basically you assume that these dependencies will be present in your client's setup. However, this has some limitations: you can build independently, but you cannot test locally because you won't have the dependencies on your workstation.
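As a hedged sketch, one of the 20 custom jars might then be declared like this (the coordinates are made up; they have to match whatever the client's repository uses):

<dependency>
    <groupId>com.client.custom</groupId>
    <artifactId>commons-lang-mycompany-custom</artifactId>
    <version>1.0</version>
    <scope>provided</scope>
</dependency>

With provided, the jar is on the compile and test classpath but is not packaged, so the build assumes the client environment supplies it at runtime.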
If you won't even have access to the jars to configure your central environment, ask if your client can provide a DEV/SIT environment.
None of the above? Inherit a parent pom.
To avoid the constant copy-paste process for every single (related) project, Maven has tools to centralize dependency and plugin configuration; one of them is inheriting the configuration of a parent pom. As explained in the documentation, it is quite simple:
First you create a project with just a pom.xml where you define everything you wish to centralize (watch out, certain items have slight differences in their constructs);
Set the packaging to pom: <packaging>pom</packaging>;
In the poms that have to inherit these configurations, set the parent configuration tags in <parent> ... </parent> (the documentation is very clear on this);
Now every time you update any "global" pom configuration, only the parent version has to be updated in every project. As a result, you only need to configure everything once.
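A minimal sketch of such a setup (all coordinates below are placeholders):

<!-- parent pom.xml -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>company-parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-lang3</artifactId>
        <version>3.12.0</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</project>

<!-- child pom.xml -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>company-parent</artifactId>
  <version>1.0.0</version>
</parent>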
You can also apply this together with the above-mentioned solutions, combining them to find what fits your needs best.
But there is a big Maven world out there, so I advise a good read of its docs to learn what else is possible. I remember these approaches because I've been in a situation similar to the one you seem to be in now.
Good luck!
Another alternative is the project RepoTree.
This one creates a Maven repository directory (not a server) from another directory which contains just the .jars. In other words, it creates the necessary .pom files and directory structure. It takes into account only the precise information from metadata contained in the archives (MANIFEST.MF, pom.xml).
Utility to recursively install artifacts from a directory into a local Maven repository. Based on Aether 1.7.
This is 5 years old, but it should still work fine.
TL;DR: MavenHoe creates a Maven repository server (not a directory) which serves the artefacts from a directory, guessing what you ask for if needed. The purpose is to avoid complicated version synchronizing - it simply takes whatever is closest to the requested G:A:V.
I have moved the MavenHoe project, which almost got lost with the decline of Google Code, to GitHub. Therefore I put it here, for availability, in the form of a full answer:
One of the options you have when dealing with conditions like that is to take whatever comes in the form of a directory of .jars and treat it as a repository.
Some time ago I wrote a tool for that purpose. My situation was that we were building JBoss EAP and recompiled every single dependency.
That resulted in thousands of .jars which were most often the same as their Maven Central counterparts (plus security and bug fixes).
I needed the tests to run against these artifacts rather than the Central ones. However, the Maven coordinates were the same.
Therefore, I wrote this "Maven repository/proxy" which provided the artifact if it found something that could be it, and proxied the request to Central otherwise.
It can derive the G:A:V from three sources:
MANIFEST.MF
META-INF/.../pom.xml
Location of the file in the directory, in combination with a configuration file like this:
jboss-managed.jar org/jboss/man/ jboss-managed 2.1.0.SP1 jboss-managed-2.1.0.SP1.jar
getopt.jar gnu-getopt/ getopt 1.0.12-brew getopt-1.0.12-brew.jar
jboss-kernel.jar org/jboss/microcontainer/ jboss-kernel 2.0.6.GA jboss-kernel-2.0.6.GA.jar
jboss-logging-spi.jar org/jboss/logging/ jboss-logging-spi 2.1.0.GA jboss-logging-spi-2.1.0.GA.jar
...
The first column is the filename in the .zip; then the groupId (with either slashes or dots), artifactId, version, and artifact file name, respectively.
Your 70 files would be listed in this file.
See more information at this page:
https://rawgit.com/OndraZizka/MavenHoe/master/docs/README.html
The project is available here.
Feel free to fork and push further, if you don't find anything better.

Managing and importing Ivy artifacts from global Maven repositories

My company runs an internal Ivy repository on a NAS. There we put all the dependencies for our projects. A few days ago, the repo was rebuilt completely from scratch, using a combination of Java code and calls to the Ant task ivy-install.
I personally designed and performed the rebuild, but faced an important issue. While I used mvnrepository.com as a reference for importing common open source projects (e.g. Spring, Hibernate), packages were often only available in not-widely-known Maven repositories I had to Google for.
Every time I found a package belonging to an "unknown" Maven source (e.g. with hibernate-spatial), I had to do some manual work: register the remote source in ivy-settings.xml and run ivy-install again by hand.
This made me think: does a tool exist that graphically (e.g. via a web interface) helps the Ivy administrator search for artifacts across multiple remote sources (registered by the user) and download them to the local repository? Something like: I type the <dependency> tag, or the org, name and rev attributes, and it lists the available sources; when I need to download a package to the local repo, I simply click and it gets published.
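(For reference, the manual step described above looks roughly like this in the Ivy settings file; the resolver name and URL are made up:)

<ivysettings>
  <settings defaultResolver="main"/>
  <resolvers>
    <chain name="main">
      <ibiblio name="central" m2compatible="true"/>
      <ibiblio name="hibernate-spatial-repo" m2compatible="true"
               root="https://repo.example.org/maven2"/>
    </chain>
  </resolvers>
</ivysettings>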

Lock dependencies by secure checksum in Maven

When I distribute a source project with a POM, I can define dependencies with version strings. The build will download those dependencies if they are not in the local repository, and it will even verify the checksums of the downloaded files against the metadata from the same repository (-C).
However, the build will download those dependencies from a number of public repositories (and proxies), and my users are at the mercy of those public services to return unmodified files.
I would like a way to record the checksums of all my build dependencies and ship them together with the POM (so I can be sure the files are unaltered, without needing to ship a copy of my local repository to builders).
Is there a Maven way to do this? Similarly, is there an easy way to archive my dependencies (a copy of the local repository with all used artifacts and metadata files) so I can repeat my build even when the central repositories fail, or ship them to offline customers?
(Both without the need for a repository proxy, if possible. I know I can build something to do that; I just wonder if there is infrastructure in Maven for this already. Maybe shipping a local repository which contains only the metadata files, or similar?)
NB: I am not looking for createChecksum on the generated artifacts, but for locking the checksums of the used dependencies. I found a maven-create-checksums plugin, but no corresponding verifier.
There is BitcoinjEnforcerRules, which appears to be a plugin for Maven's enforcer plugin (that's right, it's a plugin for a plugin). It works similarly to gradle-witness, which means it compares the checksums of (hopefully) all used artifact dependencies against a list you have previously created.
While it's better than nothing, an ideal solution would be if one could simply set the checksum (or the OpenPGP key used to sign the POM, in case it's a snapshot artifact) for the POM of direct dependencies, and the POM of those dependencies would contain the checksum of its artifacts and the checksum (/OpenPGP key, if snapshot) of the POMs of its own direct dependencies, and so on.
Note that if checksums are used, they should be tied to a particular Maven repository, since the same (non-changing, i.e. non-snapshot) artifact may have different checksums in different repositories.
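Regarding the second half of the question (archiving the dependencies for repeatable or offline builds), a hedged sketch: resolve everything into a dedicated local repository directory and ship or archive that directory, for example:

mvn dependency:go-offline -Dmaven.repo.local=./offline-repo

Later builds can then run offline against that directory with mvn -o -Dmaven.repo.local=./offline-repo package. This addresses repeatability, not checksum locking.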

Maven not looking for plugin in repository

I have a small problem: I'm looking to use both the default Maven repository and another repository for my organization. When I go to compile, it throws a whole list of warnings that packages aren't available. Then at the very end of the error it lists the places it looked: it checks my local repository (.m2/) and my organization's repository, but it won't check the original default repository. Has anyone run into this issue before?
Have you checked the repositories that are configured in your MAVEN_HOME/conf/settings.xml file? All the repos you are using should be listed there.
You will need this config file to include your organization's repo, but you will also have to add the Apache (Central) one when you override the default.
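For example, since the failing artifacts are plugins, a pluginRepositories section (inside a profile in settings.xml, or directly in the pom) can list both Central and your organization's repo; the organization URL below is a placeholder:

<pluginRepositories>
  <pluginRepository>
    <id>central</id>
    <url>https://repo.maven.apache.org/maven2</url>
  </pluginRepository>
  <pluginRepository>
    <id>org-plugins</id>
    <url>https://repo.mycompany.example/maven2</url>
  </pluginRepository>
</pluginRepositories>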

How do I prevent Maven 2 from searching remote repositories for specific local dependencies?

How do I prevent Maven 2 from searching remote repositories for specific dependencies that are in the local repository only?
How do I prevent Maven 2 from searching remote repositories for specific dependencies that are in the local repository only
Well, actually, Maven won't unless:
they are SNAPSHOT dependencies, in which case this is the expected behavior;
they are missing a .pom file, in which case you can provide or generate it (see the questions below).
Related questions
How do I stop Maven 2.x from trying to retrieve non-existent pom.xml files for dependencies every build?
Maven install-file won’t generate pom.xml
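If a dependency is only missing its .pom, one way to generate it is to re-install the jar with install:install-file and let it create a minimal pom (the path and coordinates below are placeholders):

mvn install:install-file -Dfile=path/to/yourJar.jar \
    -DgroupId=com.example -DartifactId=your-jar -Dversion=1.0 \
    -Dpackaging=jar -DgeneratePom=true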
Set up Nexus as a repository manager.
Add additional remote proxied repositories if necessary.
Add your local hosted repository (hosted on the Nexus server).
Define a group of repositories in the correct search sequence, with your local repos first.
Change your builds to point at the Nexus group URL (use mirrorOf=* in your settings.xml, as in the sketch after this list).
Run your build and let Nexus manage the local vs. remote dependency resolution.
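A sketch of that mirror entry in settings.xml; the Nexus group URL is a placeholder for your own instance:

<mirrors>
  <mirror>
    <id>nexus</id>
    <mirrorOf>*</mirrorOf>
    <url>https://nexus.mycompany.example/repository/maven-public/</url>
  </mirror>
</mirrors>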
Use fixed version numbers in your POM for your remote dependencies or the local versions you want to fetch from the local repository.
Maven tries to be friendly and fetch the latest and greatest of whatever has no version number specified.
As a quick fix, to avoid waiting for downloads each time you build, you can use mvn -o to force an offline build; then Maven will not lose time trying to fetch new versions.
The answer of #crowne is also very good advice, especially setting up your own Nexus and making sure all remote repos are configured there, so you never get an unpleasant surprise when a repo disappears some day.
To prevent Maven from checking remote repositories at all, you can use the -o flag; otherwise, Maven will check that any snapshot dependencies are up to date. You can use a repository manager such as Nexus to get fine-grained control over dependency resolution. The repository section in your pom.xml or settings.xml also has an updatePolicy element that lets you configure how often Maven checks for updated dependencies.
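For example, a hedged repository definition that never re-checks releases and only checks snapshots daily (the id and URL are placeholders):

<repository>
  <id>internal</id>
  <url>https://repo.example.com/maven2</url>
  <releases>
    <updatePolicy>never</updatePolicy>
  </releases>
  <snapshots>
    <updatePolicy>daily</updatePolicy>
  </snapshots>
</repository>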
