I have a small problem: I'm looking to use both the default Maven repository and another repository for my organization. When I compile, Maven throws a whole list of warnings that packages aren't available, and at the very end of the error it lists the places it looked. It checks my local repository (.m2/) and my organization's repository, but it won't check the original default repository. Has anyone run into this issue before?
Have you checked the repositories configured in your MAVEN_HOME/conf/settings.xml file? All the repos you are using should be listed there.
You will need this config file to include your organization's repo, but you will have to add the default central (Apache) one as well, since you override the default when you list repositories there.
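A minimal sketch of the relevant part of settings.xml, assuming an organization repository id and URL (both are placeholders here):

<settings>
    <profiles>
        <profile>
            <id>default-repos</id>
            <repositories>
                <!-- keep the default central repository when overriding -->
                <repository>
                    <id>central</id>
                    <url>https://repo.maven.apache.org/maven2</url>
                </repository>
                <!-- your organization's repository -->
                <repository>
                    <id>my-org-repo</id>
                    <url>https://repo.example.com/maven2</url>
                </repository>
            </repositories>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>default-repos</activeProfile>
    </activeProfiles>
</settings>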
Related
Why use Maven when you have such a quantity of local jars?
So we have a client that has a lot of private and custom jars.
For example, commons-langMyCompanyCustom.jar, which is commons-lang.jar with 10 more classes in it.
So on their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse and have Maven build with the public ones, but we do not have permission to add their jars to our organizational repository.
So we want to use the good parts of Maven: compile, test, build an uber-jar, run static code analysis, generate javadoc and sources jars, etc., rather than doing these things one by one with the help of Eclipse.
So we have 70 jars, some of them public. Looking at the effective pom on their environment, I found 50 of them in Maven Central, but the other 20 are what I call "custom" jars. I searched for a solution, of course, but only found this:
<dependency>
    <groupId>sample</groupId>
    <artifactId>com.sample</artifactId>
    <version>1.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/src/main/resources/yourJar.jar</systemPath>
</dependency>
So for all 20 of them I have to add this in the development Maven profile?
Is there an easy way, like in Gradle, where you can add a whole folder with its dependencies to the existing ones?
Also, installing them one by one in every developer's repo is not acceptable.
Please forget the system scope as mentioned before! Too problematic...
Ideally:
Ideally, all your developers have access to a repository manager in your or their organization (if possible).
A central environment for your System Integration Testing, maybe?
Alternatively, you may have a central environment for testing where all the dependencies are provided. This approach can be used to simulate how a compilation would work as if it were in your client's environment. Plus, you only set up the jars once.
So on their environment we use 100% Maven without local dependencies.
But on our site we have the jars for development in Eclipse and have
Maven build with the public ones, but we do not have permission to add
their jars in our organizational repository.
According to what you're saying in the above-quoted excerpt, I believe you want to set the provided scope in your build's pom.xml, assuming that in the client's setup the dependencies will be present.
Especially as you indicate that the organization doesn't give you permission to add their jars to your repository, I would use the provided scope.
As stated in the Maven docs, the definition of a provided dependency is as follows:
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
So basically you assume that these dependencies will be present in your client's setup. However, this has some limitations: you can build solutions independently, but you cannot test them locally because you won't have the dependencies on your workstation.
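For instance, one of the client's custom jars could be declared roughly like this (the coordinates are hypothetical, modelled on the commons-langMyCompanyCustom.jar example):

<dependency>
    <groupId>com.client.custom</groupId>
    <artifactId>commons-langMyCompanyCustom</artifactId>
    <version>1.0</version>
    <scope>provided</scope>
</dependency>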
If you won't even have access to the jars to configure your central environment, ask if your client can provide a DEV/SIT environment.
None of the above? Inherit a parent pom.
To avoid the constant copy-paste for every single (related) project, Maven has tools to centralize dependency and plugin configuration; one of them is inheriting the configuration of a parent pom. As explained in the Maven documentation, it is quite simple:
First you create a project with just a pom.xml where you define everything you wish to centralize (watch out, certain items have slight differences in their constructs);
Set the packaging to pom: <packaging>pom</packaging>;
In the poms that have to inherit these configurations, set the parent coordinates in <parent> ... </parent> (the documentation is very clear on this);
Now every time you update any "global" pom configuration, only the parent version has to be updated in every project. As a result, you only need to configure everything once (a sketch follows below).
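A minimal sketch of the two pieces, with placeholder coordinates:

<!-- parent pom.xml: packaged as pom, holds the centralized configuration -->
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>company-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>
    <!-- shared properties, dependencyManagement, pluginManagement, etc. go here -->
</project>

<!-- in each inheriting project's pom.xml -->
<parent>
    <groupId>com.example</groupId>
    <artifactId>company-parent</artifactId>
    <version>1.0.0</version>
</parent>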
You can also combine this with the above-mentioned solutions to find whatever fits your needs best.
But there is a big Maven world out there, so I advise a good read of its docs to explore your options further. I remember these situations because I've been in a similar position to the one you seem to be in now.
Good luck!
Another alternative is the project RepoTree.
This one creates a Maven repository directory (not a server) from another directory which contains just the .jars. In other words, it creates the necessary .pom files and directory structure. It takes into account only the precise information from metadata contained in the archives (MANIFEST.MF, pom.xml).
Utility to recursively install artifacts from a directory into a local Maven repository. Based on Aether 1.7.
This is 5 years old, but it should still work fine.
TL;DR: MavenHoe creates a Maven repository server (not a directory) which serves the artefacts from a directory, guessing what you ask for if needed. The purpose is to avoid complicated version synchronizing - it simply takes whatever is closest to the requested G:A:V.
I have moved the MavenHoe project, which almost got lost with the decline of Google Code, to GitHub. Therefore I put it here, in the form of a full answer, for availability:
One of the options you have when dealing with conditions like that is to take whatever comes in the form of a directory with .jars and treat it as a repository.
Some time ago I have written a tool for that purpose. My situation was that we were building JBoss EAP and recompiled every single dependency.
That resulted in thousands of .jars which were most often the same as their Central counterpart (plus security and bug fixes).
I needed the tests to run against these artifacts rather than the Central ones. However, the Maven coordinates were the same.
Therefore, I wrote this "Maven repository/proxy" which provided the artifact if it found something that could be it, and if not, it proxied the request to Central.
It can derive the G:A:V from three sources:
MANIFEST.MF
META-INF/.../pom.xml
Location of the file in the directory, in combination with a configuration file like this:
jboss-managed.jar org/jboss/man/ jboss-managed 2.1.0.SP1 jboss-managed-2.1.0.SP1.jar
getopt.jar gnu-getopt/ getopt 1.0.12-brew getopt-1.0.12-brew.jar
jboss-kernel.jar org/jboss/microcontainer/ jboss-kernel 2.0.6.GA jboss-kernel-2.0.6.GA.jar
jboss-logging-spi.jar org/jboss/logging/ jboss-logging-spi 2.1.0.GA jboss-logging-spi-2.1.0.GA.jar
...
The first column is the filename in the .zip; then come the groupId (with either slashes or dots), artifactId, version and artifact file name, respectively.
Your 70 files would be listed in this file.
See more information at this page:
https://rawgit.com/OndraZizka/MavenHoe/master/docs/README.html
The project is available here.
Feel free to fork and push further, if you don't find anything better.
It seems to me there are two places where I might want to set the internal Maven repo:
In Maven's settings.xml, inside the mirror tag;
In the project's pom file, inside the repository and pluginRepository tags.
The question is, which one is right? Or shall I put the internal repo in both places?
Thanks,
John
I don't recommend putting repository definitions in your pom. I have been bitten many times by a pom in my dependency list that included definitions for repositories I can't reach.
By putting these in settings.xml, you allow each developer the freedom to control which repositories are used when running a build. Since developers sometimes work disconnected or across a VPN, it can be desirable for this list of repositories to be different from machine to machine.
Remember also that the POM becomes immutable once a release is performed, effectively making the repository URL you defined permanent. Placing it in settings.xml allows your future team members the freedom to move the repository (or remove it).
In both :-(
You add the repo definition (server ID, URL and login credentials) to Maven's settings.xml and add a reference to that repo (by ID) inside your project's pom.xml. It's cumbersome, but it keeps your credentials out of shared files (a sketch follows the quote below).
Maven docs state:
The repositories for download and deployment are defined by the repositories and distributionManagement elements of the POM. However, certain settings such as username and password should not be distributed along with the pom.xml. This type of information should exist on the build server in the settings.xml.
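A rough sketch of that split, assuming a repository with the id internal-repo (the URL and credentials are placeholders):

<!-- settings.xml: only the credentials, keyed by the server id -->
<servers>
    <server>
        <id>internal-repo</id>
        <username>deployer</username>
        <password>secret</password>
    </server>
</servers>

<!-- pom.xml: the repository itself, referenced by the same id, with no credentials -->
<repositories>
    <repository>
        <id>internal-repo</id>
        <url>https://repo.example.com/maven2</url>
    </repository>
</repositories>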
I tried all the suggested solutions I found on Stack Overflow, but they didn't solve the issue with Maven repositories in IntelliJ IDEA. The problem is that I can't find the needed jars in the local repository, even if I update it. The central repository cannot be updated. Just for example: in a web app I use the servlet API (the jar is found in the local repo, but the version is 2.5), JSTL and JDBC. If I don't create a Maven project, I just add all the external libraries to the project manually. But in the case of a Maven project I add nothing manually and instead try to create the dependency through Alt + Ins when writing the class. Result: the needed jars are not in the local repository.
What I tried:
1. Installed/deleted a couple of versions of Maven (the current one is 3.2.2)
2. Defined the local repository in settings.xml (the tag is missing by default)
3. Updated the local repository
4. Added the dependency manually in pom.xml, but IDEA didn't pick it up
Moreover, when I created my first Maven project in IDEA from the web-app archetype, it didn't have the needed folder structure but started downloading a number of jars. The current version of IDEA is 13.0. If somebody has faced this problem, please help me eliminate it.
But in the case of Maven-project I do not add nothing but try to
create dependency through Alt + Ins when writing the class. Result -
there are not needed jars in local repository
You actually have to perform a Maven install (mvn install); this downloads the jars from wherever they are hosted into your local repository. Just writing the dependency in the pom doesn't actually download them.
This is how Maven works; it has nothing to do with IntelliJ.
How do I prevent Maven 2 from searching remote repositories for specific dependencies that are in the local repository only?
How do I prevent Maven 2 from searching remote repositories for specific dependencies that are in the local repository only
Well, actually, Maven won't unless:
they are SNAPSHOT dependencies, in which case this is the expected behavior;
they are missing a .pom file, in which case you can provide it or generate it (see the questions below).
Related questions
How do I stop Maven 2.x from trying to retrieve non-existent pom.xml files for dependencies every build?
Maven install-file won’t generate pom.xml
Set up Nexus as a repository manager.
Add additional remote proxied repositories if necessary.
Add your local hosted repository (hosted on the Nexus server).
Define a group of repositories in the correct search sequence, with your local repos first.
Change your builds to point at the Nexus group URL (use mirrorOf=* in your settings.xml; see the sketch below).
Run your build and let Nexus manage the local vs. remote dependency resolution.
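A minimal sketch of the mirror setting mentioned in the last two steps (the Nexus host name is a placeholder):

<!-- settings.xml: send every repository request through the Nexus group -->
<mirrors>
    <mirror>
        <id>nexus</id>
        <mirrorOf>*</mirrorOf>
        <url>http://your-nexus-host/nexus/content/groups/public</url>
    </mirror>
</mirrors>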
Use fixed version numbers in your POM for your remote dependencies or the local versions you want to fetch from the local repository.
Maven tries to be friendly and fetch the latest and greatest of whatever has no version number specified.
As a quick fix, so you aren't waiting on downloads from the internet each time you build, you can use mvn -o to force an offline build; then Maven will not lose time trying to fetch new versions.
The answer from @crowne is also very good advice, especially setting up your own Nexus and making sure all remote repos are configured there, so you will never have unpleasant surprises when a repo disappears some day.
To prevent Maven from checking remote repositories at all, you can use the -o flag. Otherwise, Maven will check that any snapshot dependencies are up-to-date. You can use a dependency manager such as Nexus to get fine-grained control over dependency resolution. The repository section in your pom.xml or settings.xml file also has an updatePolicy element that allows you to configure how often Maven will check for updated dependencies.
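For illustration, a repository definition with an explicit update policy might look like this ("daily" is the default; "always", "never" and "interval:X" minutes are the other accepted values):

<repository>
    <id>central</id>
    <url>https://repo.maven.apache.org/maven2</url>
    <snapshots>
        <enabled>true</enabled>
        <updatePolicy>daily</updatePolicy>
    </snapshots>
</repository>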
I am using M2Eclipse (0.10.0, Maven 3) in my projects. I can add a Maven dependency using m2eclipse, but the dependency jars can't be downloaded. Instead, a file named [JAR_Name].jar.lastUpdated is created in each local repo folder. The content of this file is something like:
http://[REPO_URL]/central/=1276221188566
Even using the Maven 3 command line, the jars can't be downloaded. Any idea how this could happen?
First off, the presence of the "lastUpdated" file is irrelevant. We need to know the debug output (mvn -X dependency:tree). You mentioned you are using a repository manager and mirroring every request to it, so set up settings.xml according to this guide. If you just specified the mirror element with the repository manager location and the repositories (URLs) you want to proxy, without the profile enabled (which effectively changes the policy for getting snapshots), you would see something like "central repository disabled" messages in your debug log. After you fix it, it should work. A sketch of the kind of configuration the guide describes follows.
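Roughly, the settings.xml setup looks like this (the host name and ids are placeholders; the mirror routes everything to the repository manager, and the profile re-enables release and snapshot resolution for central):

<mirrors>
    <mirror>
        <id>nexus</id>
        <mirrorOf>*</mirrorOf>
        <url>http://hostname/nexus/content/groups/public</url>
    </mirror>
</mirrors>
<profiles>
    <profile>
        <id>nexus</id>
        <repositories>
            <repository>
                <id>central</id>
                <!-- the URL is a dummy; the mirror above intercepts the request -->
                <url>http://central</url>
                <releases><enabled>true</enabled></releases>
                <snapshots><enabled>true</enabled></snapshots>
            </repository>
        </repositories>
    </profile>
</profiles>
<activeProfiles>
    <activeProfile>nexus</activeProfile>
</activeProfiles>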
EDIT: You can always take the group repository URL from your Maven settings.xml, e.g. http://hostname/nexus/content/groups/public, append the path to the artifact, like "org/apache/maven/someartifact/maven-metadata.xml", and see if Nexus can proxy the request and serve what you want. If this works, then the reason must be in either the Maven settings or the pom definition.
I have the same problem. I don't know a 'real' solution, but whenever something isn't working I scan for .lastUpdated files in my local repo and delete them. Then things usually work again. (I think that might be due to a badly configured Nexus, but unfortunately I don't have access to the Nexus config.)