I'm building a Java project that depends on a library. Building the library with mvn.bat clean install produced the target subdirectories as expected, and the outer project then built fine with mvn.bat clean install as well.
What's not expected is that when I deleted the entire directory of the library, the outer project still built fine, although the library it depends on was gone.
How does this work?
UPDATE: Turns out Maven keeps some sort of cache in %USERPROFILE%\.m2.
You are most likely thinking of your local repository, where everything you install locally (and everything Maven downloads for you from the central repository) is kept for later use.
The behavior you describe is intentional: it lets you build A once and then let B reference it whenever needed, without having to recompile A every time. This is usually very desirable, especially in teams or with large code bases.
Note that for code that is still changing you should be using -SNAPSHOT artifacts; they are treated slightly differently.
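For example, after mvn install an artifact with groupId com.example, artifactId app-lib and version 1.0 (hypothetical coordinates) ends up under the standard layout of that local repository:

%USERPROFILE%\.m2\repository\com\example\app-lib\1.0\app-lib-1.0.jar

and later builds resolve it from there, even if the original source directory is gone.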
Your dependencies are always downloaded into .m2/repository.
If you want some predictability about the libraries your team downloads, you can put a repository manager like Nexus in place: https://repository.apache.org/index.html#welcome
Instead of downloading dependencies from Maven Central, your developers will then download their dependencies from this repository manager.
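A minimal sketch of how that is usually wired up in each developer's settings.xml (the URL is a placeholder for your own repository manager):

<settings>
  <mirrors>
    <mirror>
      <id>company-repo</id>
      <!-- route every repository request through the internal repository manager -->
      <mirrorOf>*</mirrorOf>
      <url>https://repo.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>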
Related
I use the dbus-java library in my own library. It depends on unix-java and some others, and those jars are not present in any Maven repo.
How would I explicitly depend on all of these?
I see several options:
upload the jars to a Maven repo myself (though it's not clear to me how to preserve their groupId)
package all the jars into mine (which is obviously bad)
write in the README: "apt-get install dbus-java-bin" plus what to include in the classpath... but that makes me really sad :(
Note: I come from Ruby land, so I'm relatively new to all these Maven repos and confused by missing jars everywhere. In Ruby I could always be sure I'd be able to retrieve all the gems either from rubygems or from a specified git repo (usually on GitHub).
Could you explain what the best way is to distribute such libraries?
What I would do is to download the jars from the net and install them in my local-global repository.
(By this I mean the repository that is not local on my machine, but local to the company, often this is managed by Nexus).
You just need to supply a minimal POM with a <groupId>, <artifactId> and <version> for each jar and deploy it with:
mvn deploy
Then, in your own pom, you point to them in your dependencies list.
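If you don't want to write a POM by hand for each jar, mvn deploy:deploy-file can generate one on the fly; a sketch, where the file name, coordinates, repositoryId and URL are all placeholders for your own values:

mvn deploy:deploy-file \
  -Dfile=dbus-java-2.7.jar \
  -DgroupId=org.freedesktop.dbus \
  -DartifactId=dbus-java \
  -Dversion=2.7 \
  -Dpackaging=jar \
  -DrepositoryId=company-thirdparty \
  -Durl=https://repo.example.com/repository/thirdparty/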
By the way, if you wonder what the groupId should be, you have two options:
com.yourcompany.thirdparty
or
com.whatever.the.original.groupid.is.groupId
I am new to using GitHub and have been trying to figure out this question by looking at other people's repositories, but I cannot figure it out. When people fork/clone repositories on GitHub to their local computers to develop on a project, is it expected that the cloned project is complete (i.e., that it has all of the files it needs to run properly)?

For example, if I were to use a third-party library in the form of a .jar file, should I include that .jar file in the repository so that my code is ready to run when someone clones it, or is it better to just make a note that you are using such-and-such third-party libraries, so that the user has to download those libraries themselves before they begin work? I am just trying to figure out the best practices for my code commits.
Thanks!
Basically it is as Chris said.
You should use a build system that has a package manager. This way you specify which dependencies you need and it downloads them automatically. Personally I have worked with Maven and Ant, so here is my experience:
Apache Maven:
First, a word about Maven: it is not a package manager, it is a build system. It just includes a package manager, because for Java folks downloading the dependencies is part of the build process.
Maven comes with a nice set of defaults. This means you just use the archetype plugin to create a project ("mvn archetype:create" on the CLI). Think of an archetype as a template for your project. You can choose whatever archetype suits your needs best; in case you use some framework, there is probably an archetype for it, otherwise the simple-project archetype will be your choice. Afterwards your code goes into src/main/java, your test cases go into src/test/java, and "mvn install" will build everything. Dependencies can be added to the POM in Maven's dependency format. http://search.maven.org/ is the place to look for dependencies; if you find one there, you can simply copy the XML snippet into your pom.xml (which has been created for you by Maven's archetype system).
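For instance, pulling JUnit in for the tests under src/test/java is just a matter of copying a snippet like this (the version is only an example) into the <dependencies> section of the generated pom.xml:

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
    <scope>test</scope>
  </dependency>
</dependencies>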
In my experience, Maven is the fastest way to get a project with dependencies and test execution set up. I have also never seen a Maven build that worked on my machine fail somewhere else (except on computers with year-old Java versions). The charm is that Maven's default lifecycle (or build cycle) covers all the usual needs, and there are plugins for almost everything. You do have a big problem, however, as soon as you want to do something that is not covered by Maven's lifecycle; I have only ever run into that in mixed-language projects. As soon as you need anything but Java, you're screwed.
Apache Ivy:
I've only ever used it together with Apache Ant. Ivy is a package manager; Ant provides the build system, and Ivy is integrated into Ant as a plugin. While Maven usually works out of the box, Ant requires you to write your build file manually. This allows for greater flexibility than Maven, but comes at the price of yet another file to write and maintain. Ant files are basically as complicated as any source code, which means you should comment and document them; otherwise you will not be able to maintain your build process later on.
Ivy itself is as easy to use as Maven's dependency system: you have an XML file which defines your dependencies. As with Maven, you can find the appropriate XML snippets on Maven Central (http://search.maven.org/).
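A minimal ivy.xml along the same lines (organisation and module name are placeholders) might look like this:

<ivy-module version="2.0">
  <info organisation="com.example" module="myapp"/>
  <dependencies>
    <!-- same JUnit artifact as in the Maven example above -->
    <dependency org="junit" name="junit" rev="4.12"/>
  </dependencies>
</ivy-module>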
As a summary, I recommend Maven in case you have a simple Java Project. Ant is for cases where you need to do something special in your build.
I have a Maven 3 multi-module project (~50 modules) which is stored in Git. Multiple developers are working on this code and building it, and we also have automated build machines that run cold builds on every push.
Most individual changes alter code in a fairly small number of modules, so it's a waste of time to rebuild the entire source tree for every change. However, I still want the final result of running the parent project build to be the same as if it had built the entire codebase. And I don't want to start manually versioning modules, as this would become a nightmare of criss-crossing version updates.
What I would like to do is add a plugin which intercepts some step in build or install, takes a hash of the module contents (ideally pulled from Git), and then looks in a shared binary repository for an artifact stored under that hash. If one is found, it uses that artifact and doesn't even execute the full build. If it finds nothing in the cache, it performs the build as normal and then stores its artifact in the cache. It would also be good to rebuild any modules whose dependencies (direct or transitive) themselves had a cache miss.
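To make the idea concrete, here is a rough sketch (in no way a working solution) of what such a cache-check mojo could look like, written against the standard maven-plugin-annotations API; the package, goal name, cache layout and skip mechanism are all hypothetical, and hashing from Git rather than the working tree is left open:

package com.example.buildcache;                      // hypothetical plugin; all names are illustrative

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.security.MessageDigest;
import java.util.Arrays;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

/** Checks a shared cache for an artifact built from identical sources. */
@Mojo(name = "check-cache", defaultPhase = LifecyclePhase.VALIDATE)
public class CheckCacheMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", readonly = true)
    private MavenProject project;

    /** Shared cache location; hypothetical parameter. */
    @Parameter(property = "buildcache.dir", defaultValue = "${user.home}/.build-cache")
    private File cacheDir;

    @Override
    public void execute() throws MojoExecutionException {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            hashDirectory(new File(project.getBasedir(), "src"), md);
            String hash = toHex(md.digest());
            File cached = new File(cacheDir, project.getArtifactId() + "-" + hash + ".jar");
            if (cached.exists()) {
                getLog().info("Cache hit for " + project.getArtifactId() + " (" + hash + ")");
                // The hard part: convincing the rest of the lifecycle to skip work.
                // One (imperfect) option is setting properties that later plugins honour,
                // e.g. maven.main.skip / maven.test.skip, and attaching the cached jar.
            } else {
                getLog().info("Cache miss for " + project.getArtifactId() + " - building normally");
            }
        } catch (Exception e) {
            throw new MojoExecutionException("Cache check failed", e);
        }
    }

    private void hashDirectory(File dir, MessageDigest md) throws IOException {
        File[] children = dir.listFiles();
        if (children == null) {
            return;
        }
        Arrays.sort(children); // stable order so the hash is reproducible
        for (File child : children) {
            if (child.isDirectory()) {
                hashDirectory(child, md);
            } else {
                md.update(Files.readAllBytes(child.toPath()));
            }
        }
    }

    private static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}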
Is there anything out there which does anything like this already? If not, what would be the cleanest way to go about adding it to Maven? It seems like plugins might be able to accomplish it, but for a couple of pieces I'm having trouble finding the right way to hook into Maven. Specifically:
How can you intercept the "install" goal to check the cache, and only invoke the module's 'native' install goal on a cache miss?
How should a plugin pass state from one module to another regarding which cache misses have occurred in order to force rebuilds of dependencies with changes?
I'm also open to completely different ways to achieve the same end result (fewer redundant builds) although the more drastic the solution the less value it has for me in the near term.
I have previously implemented a more complicated solution with artifact version manipulation and deployment to a private Maven repository. However, I think this will fit your needs better and is somewhat simpler:
- Split your build into multiple builds (e.g., a single build per module using Maven's -pl argument).
- Set up parent-child relationships between these builds. (Bamboo even has additional support for figuring out Maven dependencies, but I'm not sure how it works.)
- Configure Maven's settings.xml to use a different local repository location - specify a new directory inside your build working directory (see the sketch after this list). See the docs: https://maven.apache.org/guides/mini/guide-configuring-maven.html
- Use the mvn install goal to ensure newly built artifacts are added to that local repository.
- Use Bamboo artifact sharing to expose built artifacts from the local repository - you should probably filter this to include only the package(s) you're interested in.
- Set dependent builds to download all artifacts from their parent builds and put them into the proper subdirectory of the local repository (which is customized to live inside the working directory).
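A hedged sketch of that settings.xml override (the paths are placeholders for whatever your build working directory is), passed to the build with the -s flag:

<settings>
  <!-- keep this build's artifacts in a repository inside the working directory -->
  <localRepository>/path/to/build/working/dir/.repository</localRepository>
</settings>

mvn -s /path/to/build/working/dir/settings.xml clean install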
This should even work for feature branch builds thanks to the way Bamboo handles parent-child relations for branch builds.
Note that this implies that Maven will redownload all other dependencies, so you should use a proxy private Maven repository on local network, such as Artifactory or Nexus.
If you want, I can also describe the more complicated scenario I've already implemented that involves modifying artifact versions and deploying to private Maven repository.
The Jenkins plugin allows you to manage/minimize dependent builds, triggered either:
- whenever a SNAPSHOT dependency is built (determined by Maven)
- after other projects are built (manually, via Jenkins jobs)
And if you do a 'mvn deploy' to save the build into your corporate Maven repo then you don't have to worry about dependencies when builds run on slave Jenkins machines. The result is that no module is ever built unless it or one of its dependencies has changed.
Hopefully you can apply these principles to a solution with Bamboo.
I have a small Java project in a version control system (git), shared by 4 developers. I'm thinking about using Maven in this project as a build tool.
Dependency management is a wanted feature, but I don't want:
- automatic updates of dependencies (as this could break my software).
- to rely on an Internet connection to download dependencies from a remote repository and be able to compile my code.
Therefore, the questions:
1) May I configure Maven to use local dependencies (e.g. jars checked into the VCS)? I don't have many dependencies shared among several projects, and my dependencies will rarely be updated, so using Maven repositories is not worth it to me, imho.
2) If I choose to use a Maven repository, may I configure one in my local network? I don't want a remote repository mirror or a portal to the remote repository. I want a standalone repository with my dependencies, located at a server in my local network.
3) If I use the default Maven approach with the remote repository, could I turn off dependency updates after all dependencies are downloaded the first time?
Thanks in advance for any help.
Answer to 1:
Yes you can; google for system-scope dependencies. BUT: it is not a good idea, because you give up one of Maven's key features.
<dependency>
  <groupId>com.mycompany</groupId>
  <artifactId>app-lib</artifactId>
  <version>3.1.0</version>
  <scope>system</scope>
  <systemPath>${project.basedir}/libs/app-lib-3.1.0.jar</systemPath>
</dependency>
Answer to 2:
Yes you can:
- Artifactory
- Nexus
Answer to 3:
Yes you can. For that case you can use the --offline flag or, the better approach, make sure all your dependencies are proper releases rather than SNAPSHOTs.
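For example, from the command line (-o is the short form of --offline), or permanently via settings.xml:

mvn -o clean install

<settings>
  <offline>true</offline>
</settings>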
Some thoughts:
Wanting to use a dependency-management system without using dependency management sounds strange to me.
If you fear that changes within your libs may break your code, just don't use SNAPSHOTs.
Try a versioning scheme. We use
x.y.z
if z changes in a release, the jar should be compatible.
if y changes, you'll have to change your code.
if x changes... well, everything needs to be renewed.
Your concern about being dependent on Internet connectivity is a valid one, but I don't think it's as bad as you think.
After a dependency is downloaded from the Central Repository, it is saved to a cache on your hard drive (located at "~/.m2/repository"). From then on, the copy in the cache is used and Internet connectivity is no longer required to compile your application.
When you compile your first project in Maven, it will have to download a crap-load of stuff. But after that, all subsequent compilations will go much faster and they won't need to download anything.
Also, Maven's versioning scheme makes it so that all "release" versions of a dependency cannot change once they are deployed to the Central repository. For example, if I'm using version "2.2" of "commons-io", I know that this version of the library will always stay the same. A change cannot be made without releasing a new version.
"Snapshot" versions, however, can change. If a library's version is a snapshot, then it will end in "-SNAPSHOT" (for example, "1.2-SNAPSHOT" means the library will eventually be released as "1.2"). I don't think the Central repository allows snapshot builds though. You shouldn't use them in production code anyway.
I thought that Internet connectivity was only needed for the first compile, but I get several download messages whenever I change code. Messages like these:
Downloading: http://repo.maven.apache.org/maven2/org/eclipse/core/resources/maven-metadata.xml
Downloading: http://repository.springsource.com/maven/bundles/external/org/eclipse/core/resources/maven-metadata.xml
Downloading: http://repository.springsource.com/maven/bundles/release/org/eclipse/core/resources/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/eclipse/core/resources/maven-metadata.xml
Why is that? Is Maven looking for updates in these repositories or is there another reason to download these metadata xmls often?
I'm new to Maven, using the m2e plugin for Eclipse. I'm still wrapping my head around Maven, but it seems like whenever I need to import a new library, like java.util.List, now I have to manually go through the hassle of finding the right repository for the jar and adding it to the dependencies in the POM. This seems like a major hassle, especially since some jars can't be found in public repositories, so they have to be uploaded into the local repository.
Am I missing something about Maven in Eclipse? Is there a way to automatically update the POM when Eclipse automatically imports a new library?
I'm trying to understand how using Maven saves time/effort...
You picked a bad example: java.util.List is part of the Java standard runtime and is available regardless of your Maven configuration.
With that in mind, if you wanted to add something external, say Log4j, then you would need to add a project dependency on Log4j. Maven would then take the dependency information and create a "signature" to search for, first in the local cache, and then in the external repositories.
Such a signature might look like
groupId:artifactId:version
or perhaps
groupId:artifactId:version:classifier
This identifies a maven "module" which will then be downloaded and configured into your system. Once in place it adds all of the classes within the module to your configured project.
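For instance, the classic Log4j 1.x artifact lives on Central under the coordinates log4j:log4j:1.2.17, so pulling it in is a single entry in the POM's <dependencies> section:

<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
</dependency>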
Maven principally saves time in downloading and organizing JAR files in your build. By defining a "standard" project layout and a "standard" build order, Maven eliminates a lot of the guesswork in the "why isn't my project building" sweepstakes. Also, you can use neat commands like "mvn dependency:tree" to print out a list of all the JARs your project depends on, recursively.
Warning note: If you are using the M2E plugin and Eclipse, you may also run into problems with the plugin itself. The 1.0 version (hosted at eclipse.org) was much less friendly than the previous 0.12 version (hosted at Sonatype). You can get around this to some extent by downloading and installing the "standalone" version of Maven from apache (maven.apache.org) and running Maven from the command line. This is actually much more stable than trying to run Maven inside Eclipse (in my personal experience) and may save you some pain as you try to learn about Maven.