Maven: repositories

I have a small Java project in a version control system (git), shared by 4 developers. I'm thinking about using Maven in this project as a build tool.
Dependency management is a wanted feature, but I don't want:
- automatic updates of dependencies (as this could break my software).
- to rely on an Internet connection to download dependencies from a remote repository and be able to compile my code.
Therefore, the questions:
1) May I configure Maven to use local dependencies (e.g. jars shared in the VCS)? I don't have dependencies shared among several projects, and my dependencies will rarely be updated, so using Maven repositories is not worth it to me imho.
2) If I choose to use a Maven repository, may I configure one in my local network? I don't want a remote repository mirror or a portal to the remote repository. I want a standalone repository with my dependencies, located at a server in my local network.
3) If I use the default Maven approach with the remote repository, could I turn off dependency updates after all dependencies are downloaded the first time?
Thanks in advance for any help.

Answer to 1:
Yes, you can; search for system-scope dependencies. BUT: it is not a good idea, because you give up one of Maven's key features.
<dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>app-lib</artifactId>
    <version>3.1.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/libs/app-lib-3.1.0.jar</systemPath>
</dependency>
Answer to 2:
Yes you can:
- Artifactory
- Nexus
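As a rough sketch of what pointing a build at such a server could look like (the id and URL are placeholders, not real coordinates), the repository can be declared in the pom or in a settings.xml profile:
<!-- Hypothetical internal repository hosted on your local network -->
<repositories>
    <repository>
        <id>internal-repo</id>
        <url>http://nexus.mycompany.local:8081/repository/maven-public/</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>
</repositories>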
Answer to 3:
Yes, you can. For that case you can use the --offline flag (-o for short), OR the better approach: release all your dependencies, i.e. depend only on fixed, non-SNAPSHOT versions.
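If you prefer not to pass the flag on every build, offline mode can also be switched on permanently in settings.xml; a minimal sketch, assuming you really want every build on that machine to be offline:
<!-- ~/.m2/settings.xml: make every build run in offline mode -->
<settings>
    <offline>true</offline>
</settings>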
Some thoughts:
You want to use a dependency-management system without using dependency management, which sounds strange to me.
If you fear that changes within your libs may break your code, just don't use SNAPSHOTs.
Try a versioning scheme. We use
x.y.z
If z changes in a release, the jar should be compatible.
If y changes, you'll have to change your code.
If x changes... well, everything needs to be renewed.

Your concern about being dependent on Internet connectivity is a valid one, but I don't think it's as bad as you think.
After a dependency is downloaded from the Central Repository, it is saved to a cache on your hard drive (located at "~/.m2/repository"). From then on, the copy in the cache is used and Internet connectivity is no longer required to compile your application.
When you compile your first project in Maven, it will have to download a crap-load of stuff. But after that, all subsequent compilations will go much faster and they won't need to download anything.
Also, Maven's versioning scheme makes it so that all "release" versions of a dependency cannot change once they are deployed to the Central repository. For example, if I'm using version "2.2" of "commons-io", I know that this version of the library will always stay the same. A change cannot be made without releasing a new version.
"Snapshot" versions, however, can change. If a library's version is a snapshot, then it will end in "-SNAPSHOT" (for example, "1.2-SNAPSHOT" means the library will eventually be released as "1.2"). I don't think the Central repository allows snapshot builds though. You shouldn't use them in production code anyway.

I thought that Internet connectivity was only needed for the first compile, but I get several download messages whenever I change code. Messages like these:
Downloading: http://repo.maven.apache.org/maven2/org/eclipse/core/resources/maven-metadata.xml
Downloading: http://repository.springsource.com/maven/bundles/external/org/eclipse/core/resources/maven-metadata.xml
Downloading: http://repository.springsource.com/maven/bundles/release/org/eclipse/core/resources/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/eclipse/core/resources/maven-metadata.xml
Why is that? Is Maven looking for updates in these repositories or is there another reason to download these metadata xmls often?

Related

Maven proxy configuration with Nexus

I am using Nexus and have configured maven-proxy and maven-hosted repositories, added them to a group repository, and I am using that group through settings.xml. Now, when a new dependency is added to the pom, maven-proxy goes to Maven Central and downloads it. However, I do not want this.
My goal is to stop relying on Maven Central completely, but I know it won't work until my hosted repository contains everything that Maven needs.
The issue is that Maven plugins like compiler, clean, jar, etc. download tons of dependencies of their own. If I remove the connection to the proxy repository, how do I get that whole list, and how do I make sure I put whatever is needed into my hosted repository?
Should I even try to put such artifacts in my hosted repo? Is there a better approach?
You cannot really do that manually, the number of artifacts is way too large.
You can let Maven download all needed artifacts, then copy that from a remote to a hosted repository and work with that (until you need something new).
But it is still painful. I would not do that.
If your concern is security, I would use an open source security scanner instead of blocking internet access altogether.
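If you do decide to pre-populate a hosted repository anyway, one hedged starting point is the maven-dependency-plugin's dependency:go-offline goal, which downloads the project's dependencies and most of the required plugins into the local repository so you can see what the build actually pulls in. Pinning the plugin version (the version below is just an example) keeps the result reproducible across the team:
<build>
    <pluginManagement>
        <plugins>
            <!-- Run with: mvn dependency:go-offline -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
                <version>3.1.1</version>
            </plugin>
        </plugins>
    </pluginManagement>
</build>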

Maven versioning of internal dependencies

Let's assume that we have a project which uses Maven and has some dependencies which are developed in the same company/team or even by the same person. It is clear that when some developer wants to compile the project, the specified dependencies will be fetched from the repo and downloaded locally if they are not there yet.
Now let's assume the following scenario:
The developer does not care about the dependencies, and either:
- the version of the dependency is x.x.x-SNAPSHOT => Maven will fetch the latest version from the repo every 24 hours (by default). Problem: if that version is not compatible with your project, you basically don't even know what happened, because you did not change anything in your project. The only possible solution here is to compile and manage the dependency locally.
- the version of the dependency is a fixed "x.x.x.y" => Maven will fetch exactly this version and nothing else. So, to update this dependency, I need to change the version. Problem: it seems to mean that every time this dependency gets some changes and code is pushed to the server, the version must be changed. But that sounds just ridiculous.
Possible solution:
It seems that the only possible solution in this case is to handle the internal dependencies manually (get the source from repo and compile locally). However, these two issues are bothering me:
This solution breaks the whole idea of maven which should fetch all the dependencies.
This solution will bring difficulties for the developers who just want to start development on the project, but do not care about those dependencies (because those dependencies are not used in the project part that they are working on).
Is there better solution?
You can also keep control over the latest stable dependencies by making Maven work with your own administered repository. That way you can be sure the whole crew has the plugins, frameworks, etc. at the versions needed. Our team uses a TeamCity server for its builds; .m2/settings.xml is configured to work with our TeamCity server's repository as well, and the team lead controls all the versions.

How to add 70 local jars to a Maven project?

Why use Maven when you have such a quantity of local jars?
So we have a client that have a lot of private jars and custom jars.
For example commons-langMyCompanyCustom.jar, which is commons-lang.jar with 10 more classes added to it.
So on their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse and have Maven build with the public ones, but we do not have permission to add their jars in our organizational repository.
So we want to use the good Maven features: compile, test, build an uber-jar, add static code analysis, generate javadocs, sources jars, etc., rather than doing these things one by one with the help of Eclipse.
So we have 70 jars, some of them public: if I get the effective pom in their environment, I find 50 of them in Maven Central, but the other 20 are what I call "custom" jars. I searched for a solution, of course, but only found this:
<dependency>
    <groupId>sample</groupId>
    <artifactId>com.sample</artifactId>
    <version>1.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/src/main/resources/yourJar.jar</systemPath>
</dependency>
So for all 20 of them I have to add this in the development Maven profile?
Is there an easy way, like in Gradle, where you can add a whole folder of jars to the existing dependencies?
Also installing one by one in every developer's repo is not acceptable.
Please forget the system scope as mentioned before! Too problematic...
Ideally:
Ideally, all your developers have access to a repository manager in your or their organization (if possible).
A central environment for your System Integration Testing, maybe?
Alternatively, you may have a central environment for testing where all the dependencies are provided. This approach can be used to simulate how the compilation would behave in your client's environment. Plus, you only set up the jars once.
So on their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse and have
Maven build with the public ones, but we do not have permission to add
their jars in our organizational repository.
According to what you're saying in the above-quoted excerpt, I believe you want these dependencies set in your build's pom.xml while assuming that in the client's setup they will be present.
Especially, as you indicate that the organization doesn't give you permission to add their jars in your repository, I would use the provided scope.
As stated in the Maven docs, the definition of a provided dependency is as follows:
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
So basically you assume that these dependencies will be present in your client's setup. However, this has some limitations: you can build the solution independently, but you cannot test it locally because you won't have the dependencies on your workstation.
If you won't even have access to the jars to configure your central environment ask if your client can provide a DEV/SIT environment.
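A hedged sketch of such a declaration, using made-up coordinates for one of the client's custom jars (the client environment is assumed to supply the jar itself):
<!-- Hypothetical coordinates for a client-provided custom jar -->
<dependency>
    <groupId>com.client</groupId>
    <artifactId>commons-lang-mycompany-custom</artifactId>
    <version>1.0</version>
    <scope>provided</scope>
</dependency>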
None of the above? Inherit a parent pom.
To avoid the constant copy-paste process for every single (related) project, Maven has tools to centralize dependency and plugin configuration; one of them is inheriting the configuration of a parent pom. As explained in the documentation, it is quite simple:
- First you create a project with just a pom.xml, where you define everything you wish to centralize (watch out, certain items have slight differences in their constructs);
- Set the packaging tag to pom: <packaging>pom</packaging>;
- In the poms that have to inherit this configuration, set the parent coordinates inside <parent> ... </parent> (the documentation is very clear on this);
- Now, every time you update any "global" pom configuration, only the parent version has to be updated in every project. As a result, you only need to configure everything once.
You can also apply this together with the above-mentioned approaches, combining them into the solution that best fits your needs.
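As a minimal sketch with invented coordinates (com.mycompany:my-parent and my-app are placeholders, just to show where the pieces go), the parent and a child referencing it could look like this:
<!-- Parent: my-parent/pom.xml -->
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mycompany</groupId>
    <artifactId>my-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>
    <dependencyManagement>
        <dependencies>
            <!-- centralized dependency versions go here -->
        </dependencies>
    </dependencyManagement>
</project>
<!-- Child pom.xml: inherits everything defined in the parent -->
<project>
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.mycompany</groupId>
        <artifactId>my-parent</artifactId>
        <version>1.0.0</version>
    </parent>
    <artifactId>my-app</artifactId>
</project>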
But there is a big Maven world out there, so I advise a good read of its docs to learn what else is possible. I remember these situations because I've been in a situation similar to the one you seem to be in now.
Good luck!
Another alternative is the project RepoTree.
This one creates a Maven repository directory (not a server) from another directory which contains just the .jars. In other words, it creates the necessary .pom files and directory structure. It takes into account only the precise information from metadata contained in the archives (MANIFEST.MF, pom.xml).
Utility to recursively install artifacts from a directory into a local Maven repository. Based on Aether 1.7.
This is 5 years old, but still should work fine.
TL;DR: MavenHoe creates a Maven repository server (not a directory) which serves the artefacts from a directory, guessing what you ask for if needed. The purpose is to avoid complicated version synchronizing - it simply takes whatever is closest to the requested G:A:V.
I have moved the MavenHoe project, which almost got lost with the decline of Google Code, to Github. Therefore I put it here for availability in the form of a full answer:
One of the options you have when dealing with conditions like that is to take whatever comes in form of a directory with .jar's and treat it as a repository.
Some time ago I wrote a tool for that purpose. My situation was that we were building JBoss EAP and recompiled every single dependency.
That resulted in thousands of .jars which were most often the same as their Central counterpart (plus security and bug fixes).
I needed the tests to run against these artifacts rather than the Central ones. However, the Maven coordinates were the same.
Therefore, I wrote this "Maven repository/proxy" which provided the artifact if it found something that could be it, and if not, it proxied the request to Central.
It can derive the G:A:V from three sources:
MANIFEST.MF
META-INF/.../pom.xml
Location of the file in the directory, in combination with a configuration file like this:
jboss-managed.jar org/jboss/man/ jboss-managed 2.1.0.SP1 jboss-managed-2.1.0.SP1.jar
getopt.jar gnu-getopt/ getopt 1.0.12-brew getopt-1.0.12-brew.jar
jboss-kernel.jar org/jboss/microcontainer/ jboss-kernel 2.0.6.GA jboss-kernel-2.0.6.GA.jar
jboss-logging-spi.jar org/jboss/logging/ jboss-logging-spi 2.1.0.GA jboss-logging-spi-2.1.0.GA.jar
...
The first column is the filename in the .zip; then the groupId (with either slashes or dots), artifactId, version, and artifact file name, respectively.
Your 70 files would be listed in this file.
See more information at this page:
https://rawgit.com/OndraZizka/MavenHoe/master/docs/README.html
The project is available here.
Feel free to fork and push further, if you don't find anything better.

Apache Maven: where do dependencies and libraries installed locally end up?

I'm building a Java project that has a dependency on a library. Running mvn.bat clean install on the library produced the target subdirectories as expected, and the outer project built fine with mvn.bat clean install as well.
What's not expected is that when I deleted the entire directory of the library, the outer project still built fine, although the library it depends on was gone.
How does this work?
UPDATE: Turns out Maven keeps some sort of cache in %USERPROFILE%\.m2.
You are most likely thinking of your local repository where everything you install locally (and maven downloads for you from the central repository) is put for later usage.
The behavior you describe is intentional: it allows building A once and then letting B reference it whenever needed, without having to recompile A every time. This is usually very desirable, especially in teams or with large code bases.
Note, that for changing code you should be using -SNAPSHOT artifacts. They are treated slightly differently.
Your dependencies are always downloaded into .m2/repository.
If you want some predictability about the libraries downloaded within your team, you can put a repository manager like Nexus in place: https://repository.apache.org/index.html#welcome
Instead of downloading dependencies from Maven central, your developers will download their dependencies from this repository manager.
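As an aside, the local cache mentioned above (.m2/repository) can be relocated via settings.xml if you want it somewhere more convenient; the path below is only an example:
<!-- ~/.m2/settings.xml: override the default local repository location -->
<settings>
    <localRepository>D:/maven-cache/repository</localRepository>
</settings>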

How do I prevent Maven 2 from searching remote repositories for specific local dependencies?

How do I prevent Maven 2 from searching remote repositories for specific dependencies that are in the local repository only?
How do I prevent Maven 2 from searching remote repositories for specific dependencies that are in the local repository only?
Well, actually, Maven won't unless:
- they are SNAPSHOT dependencies, in which case this is the expected behavior;
- they are missing a .pom file, in which case you can provide it or generate it (see the questions below).
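If you choose to provide the .pom by hand, a minimal one dropped next to the jar in the local repository only needs the coordinates and the packaging; a sketch reusing the app-lib coordinates from the first answer above:
<!-- ~/.m2/repository/com/mycompany/app-lib/3.1.0/app-lib-3.1.0.pom -->
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mycompany</groupId>
    <artifactId>app-lib</artifactId>
    <version>3.1.0</version>
    <packaging>jar</packaging>
</project>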
Related questions
How do I stop Maven 2.x from trying to retrieve non-existent pom.xml files for dependencies every build?
Maven install-file won’t generate pom.xml
- Set up Nexus as a repository manager.
- Add additional remote proxied repositories if necessary.
- Add your local hosted repository (hosted on the Nexus server).
- Define a group of repositories in the correct search sequence, with your local repos first.
- Change your builds to point at the Nexus group URL (use mirrorOf=* in your settings.xml), as sketched below.
- Run your build and let Nexus manage the local vs. remote dependency resolution.
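A rough sketch of the settings.xml side of that (the URL is a placeholder for wherever your Nexus group is actually exposed):
<!-- settings.xml: route every repository request through the Nexus group -->
<settings>
    <mirrors>
        <mirror>
            <id>nexus</id>
            <mirrorOf>*</mirrorOf>
            <url>http://nexus.mycompany.local:8081/repository/maven-public/</url>
        </mirror>
    </mirrors>
</settings>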
Use fixed version numbers in your POM for your remote dependencies or the local versions you want to fetch from the local repository.
Maven tries to be friendly and fetch the latest and greatest of whatever has no version number specified.
As a quick fix, to avoid waiting for downloads from the Internet each time you build, you can use mvn -o to force an offline build; then Maven will not lose time trying to fetch new versions.
crowne's answer is also very good advice, especially setting up your own Nexus and making sure all remote repos are configured there, so you will never have unpleasant surprises when a repo disappears some day.
To prevent Maven from checking remote repositories at all, you can use the -o flag. Otherwise, Maven will check that any snapshot dependencies are up-to-date. You can use a dependency manager such as Nexus to get fine-grained control over dependency resolution. The repository section in your pom.xml or settings.xml file also has an updatePolicy element that allows you to configure how often Maven will check for updated dependencies.
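For illustration, a hedged sketch of an updatePolicy configuration on a repository declaration (the id and URL are placeholders): releases are never re-checked once cached, while snapshot metadata is refreshed at most daily:
<repository>
    <id>internal-repo</id>
    <url>http://nexus.mycompany.local:8081/repository/maven-public/</url>
    <releases>
        <updatePolicy>never</updatePolicy>
    </releases>
    <snapshots>
        <updatePolicy>daily</updatePolicy>
    </snapshots>
</repository>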
