I am currently working on several projects, each of which has different Artifactory settings. So, when switching between them I have to go to the Maven folder and change the settings file to point at the correct corporate repository. I'm pretty sure this is inefficient, and most likely someone has already found a simple solution to this problem. Maybe there is some IDEA setting I need to switch, or Maven itself has some way to store them?
So, my question is: does anyone know a simple way to use several settings.xml files in the same Maven home folder?
A quick Google and Stack Overflow search hasn't turned up any results so far.
I expect that when I open another project in IDEA it will work with its own Artifactory instead of using the one from the previous project (since these are from different companies, it is obviously not correct to use the same one).
The answer below doesn't quite work for me, since I'm running the project via IDEA and it downloads artifacts using a single settings.xml.
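For what it's worth, from the command line I can pick the settings file per invocation, something like:

mvn -s ./companyA-settings.xml clean install

but that doesn't help for builds started from IDEA.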
The usual way to work is to either:
put all repositories you need for all of your projects into your settings.xml so that it works for all projects.
use a Nexus/Artifactory that proxies all the repositories you need and put that as mirror into your settings.xml.
If a repository should really be accessible to only one project, you can put it into the pom.xml of that project.
I don't understand how other developers could object to putting the repositories into the pom.xml, because they need the very same repositories to run the project (if not, please comment and explain).
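For illustration, a per-project repository declared in the pom.xml looks roughly like this (ids and URLs are placeholders):

<repositories>
    <repository>
        <id>company-a-releases</id>
        <url>https://artifactory.company-a.example/maven2</url>
    </repository>
</repositories>

and the mirror variant in settings.xml, routing everything through one proxy:

<mirrors>
    <mirror>
        <id>corp-proxy</id>
        <mirrorOf>*</mirrorOf>
        <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
</mirrors>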
I am using Nexus and have configured maven-proxy and maven-hosted repositories, added them to a group repository, and am using that group through settings.xml. Now when a new dependency is added in the pom, the proxy repository goes to Maven Central and downloads it. However, I do not want this.
My goal is to stop relying on Maven Central completely, but I know it won't work until my hosted repository contains everything that Maven needs.
The issue is that Maven plugins like compiler, clean, jar, etc. download tons of dependencies of their own. If I cut the connection to the proxy, how do I get that full list, and how do I make sure I put whatever is needed into my hosted repository?
Should I even try to put such artifacts in my hosted repo? Is there any better approach?
You cannot really do that manually; the number of artifacts is way too large.
You can let Maven download all needed artifacts, then copy them from the remote repository to the hosted one and work with that (until you need something new).
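If you do try that, the Maven dependency plugin can pre-fetch most of what a build needs into a clean directory that you can then import into the hosted repository; a sketch (the staging path is just an example, and note that go-offline is known to miss some plugin-level dependencies):

mvn -Dmaven.repo.local=./staging-repo dependency:go-offline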
But it is still painful. I would not do that.
If your concern is security, I would use an open source security scanner instead of blocking internet access altogether.
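For example (assuming the open-source OWASP dependency-check plugin; any comparable scanner works), a one-off scan can be run without touching the pom:

mvn org.owasp:dependency-check-maven:check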
Why use Maven when you have such a quantity of local jars?
So we have a client that has a lot of private and custom jars.
For example, commons-langMyCompanyCustom.jar, which is commons-lang.jar with 10 more classes in it.
So on their environment we use 100% Maven without local dependencies.
But on our side we have the jars for development in Eclipse, and we have a Maven build with the public ones, but we do not have permission to add their jars to our organizational repository.
So we want to use the Maven goodies: compile, test, build an uber-jar, add static code analysis, generate javadoc and sources jars, etc., rather than doing these things one by one with the help of Eclipse.
So we have 70 jars, some of them public. When I look at the effective POM in their environment, I find 50 of them in Maven Central, but the other 20 are what I call "custom" jars. I searched for a solution, of course, but found only this:
<dependency>
    <groupId>sample</groupId>
    <artifactId>com.sample</artifactId>
    <version>1.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/src/main/resources/yourJar.jar</systemPath>
</dependency>
So for all 20 of them I have to add this to the development Maven profile?
Is there an easy way, like in Gradle, where you can add a whole folder of jars to the existing dependencies?
Also, installing them one by one into every developer's repository is not acceptable.
Please forget the system scope as mentioned before! Too problematic...
Ideally:
Ideally, all your developers have access to a repository manager in your or their organization (if possible).
A central environment for your System Integration Testing, maybe?
Alternatively, you may have a central environment for testing where all the dependencies are provided. This approach can be used to simulate how a compilation would work as if it were in your client's environment. Plus, you only set up the jars once.
So on their environment we use 100% Maven without local dependencies. But on our side we have the jars for development in Eclipse and have a Maven build with the public ones, but we do not have permission to add their jars to our organizational repository.
According to what you're saying in the above-quoted excerpt, I believe you want a scope set in your build's pom.xml that assumes the dependencies will be present in the client's setup.
Especially, as you indicate that the organization doesn't give you permission to add their jars in your repository, I would use the provided scope.
As stated in the Maven docs, the definition of a provided dependency is as follows:
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
So basically you assume that these dependencies will be present in your client's setup. However, this has some limitations, meaning you can build the solution independently but cannot test it locally, because you won't have the dependencies on your workstation.
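For illustration, one of the 20 custom jars would then be declared like this (the coordinates are made up; the client's environment is assumed to supply the actual jar at runtime):

<dependency>
    <groupId>com.client.custom</groupId>
    <artifactId>commons-lang-custom</artifactId>
    <version>1.0</version>
    <scope>provided</scope>
</dependency>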
If you won't even have access to the jars to configure your central environment, ask if your client can provide a DEV/SIT environment.
None of the above? Inherit a parent pom.
To avoid the whole constant copy-paste process for every single (related) project, Maven has tools to centralize dependency and plugin configuration; one of them is inheriting the configuration of a parent pom. As explained in the documentation, it is quite simple:
First you create a project with just a pom.xml where you define everything you wish to centralize (watch out, certain items have slight differences in their constructs);
Set the packaging tag to pom: <packaging>pom</packaging>;
In the poms that have to inherit these configurations, set the parent configuration tags in <parent> ... </parent> (the documentation is very clear about this); see the sketch after this list.
Now every time you update any "global" pom configuration, only the parent version has to be updated in every project. As a result, you only need to configure everything once.
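A minimal sketch of the idea, with made-up coordinates. The parent pom declares the shared configuration and the pom packaging:

<groupId>com.example</groupId>
<artifactId>company-parent</artifactId>
<version>1.0.0</version>
<packaging>pom</packaging>

and every project that should inherit it declares the parent block:

<parent>
    <groupId>com.example</groupId>
    <artifactId>company-parent</artifactId>
    <version>1.0.0</version>
</parent>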
You can also apply this together with the above-mentioned solutions, combining them to find the one that best fits your needs.
But there is a big Maven world out there, so I advise a good read of its docs to learn all your possibilities. I remember these situations because I've been in a situation similar to the one you seem to be in now.
Good luck!
Another alternative is the project RepoTree.
This one creates a Maven repository directory (not a server) from another directory which contains just the .jars. In other words, it creates the necessary .pom files and directory structure. It takes into account only the precise information from metadata contained in the archives (MANIFEST.MF, pom.xml).
Utility to recursively install artifacts from a directory into a local Maven repository. Based on Aether 1.7.
This is 5 years old, but it should still work fine.
TL;DR: MavenHoe creates a Maven repository server (not a directory) which serves the artifacts from a directory, guessing what you asked for if needed. The purpose is to avoid complicated version synchronization: it simply serves whatever is closest to the requested G:A:V.
I have moved the MavenHoe project, which almost got lost with the decline of Google Code, to GitHub. Therefore I put it here, for availability, in the form of a full answer:
One of the options you have when dealing with conditions like these is to take whatever comes, in the form of a directory of .jars, and treat it as a repository.
Some time ago I wrote a tool for that purpose. My situation was that we were building JBoss EAP and recompiled every single dependency.
That resulted in thousands of .jars which were most often the same as their Central counterparts (plus security and bug fixes).
I needed the tests to run against these artifacts rather than the Central ones. However, the Maven coordinates were the same.
Therefore, I wrote this "Maven repository/proxy" which provided the artifact if it found something that could be it, and if not, it proxied the request to Central.
It can derive the G:A:V from three sources:
MANIFEST.MF
META-INF/.../pom.xml
Location of the file in the directory, in combination with a configuration file like this:
jboss-managed.jar org/jboss/man/ jboss-managed 2.1.0.SP1 jboss-managed-2.1.0.SP1.jar
getopt.jar gnu-getopt/ getopt 1.0.12-brew getopt-1.0.12-brew.jar
jboss-kernel.jar org/jboss/microcontainer/ jboss-kernel 2.0.6.GA jboss-kernel-2.0.6.GA.jar
jboss-logging-spi.jar org/jboss/logging/ jboss-logging-spi 2.1.0.GA jboss-logging-spi-2.1.0.GA.jar
...
The first column is the filename in the .zip; then groupId (with either slashes or dots), artifactId, version, and artifact file name, respectively.
Your 70 files would be listed in this file.
See more information at this page:
https://rawgit.com/OndraZizka/MavenHoe/master/docs/README.html
The project is available here.
Feel free to fork and push further, if you don't find anything better.
I use the dbus-java library in my own library. It depends on unix-java and some others. Those jars are not present in any Maven repo.
How would I explicitly depend on all of these?
I see several options:
publish the jars to a Maven repo myself (though it's not clear to me how to preserve their groupId?)
package all the jars into mine (which is obviously bad)
write in the README: "apt-get install dbus-java-bin" and what to include in the classpath... but that makes me really sad :(
Note: I came from Ruby land, so I'm relatively new to all these weird Maven repos and confused by the missing jars everywhere. In Ruby I was always sure that I would be able to retrieve all the gems either from rubygems or from a specified git repo (usually on GitHub).
Could you explain the best way to distribute such libraries?
What I would do is to download the jars from the net and install them in my local-global repository.
(By this I mean the repository that is not local on my machine, but local to the company, often this is managed by Nexus).
You just need to set a pom for each jar with
<groupId>, <artifactId> and <version>,
then push it there with
mvn deploy
Then, in your own pom, you point to them in your dependencies list.
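For a third-party jar without its own pom, a sketch of that upload (URL, repositoryId and coordinates are placeholders for whatever your Nexus uses):

mvn deploy:deploy-file \
    -Dfile=dbus-java.jar \
    -DgroupId=org.freedesktop.dbus \
    -DartifactId=dbus-java \
    -Dversion=2.7 \
    -Dpackaging=jar \
    -Durl=https://nexus.example.com/repository/thirdparty/ \
    -DrepositoryId=thirdparty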
By the way, if you wonder what the groupId should be, you have two options:
com.yourcompany.thirdparty
or
com.whatever.the.original.groupid.is.groupId
I am very new to the industry, so apologies in advance for the very likely stupidity of the question.
In the team, we work with IntelliJ IDEA 13 as IDE and use Maven 3 for our projects. We provide a few online services and portals, and I'm just starting to work on one: the project has several dependencies that are shared with other older projects; some are JAR archives, some are WARs...
To my question re: how to edit those dependencies locally (editing a resources.properties file was the case I had in mind), my tutor suggested turning the dependencies into snapshots and working with those.
What I managed to do was create a copy of the appropriate folders in my local repository and change wherever the version of the dependency was in the name or in the files, then modify my pom.xml files.
Now, this works perfectly if I open the JAR/WAR and edit some file, but I'd like to be able to do it from my IDE; the fact that I can't suggests I'm probably doing this the wrong way. Do I need to somehow unpack the dependency to be able to do so? Is my entire approach wrong?
P.S.: I would ask someone in my office, but oddly enough none of those who could help are at work today anymore!
If none of your colleagues was able to help you, I am afraid there might be something else hidden.
However, let's try it!
I am guessing, here, that your resources.properties is part of its own project: a project handled by Maven and declared as a dependency in one of your main projects.
I am also guessing that your main projects are WARs (webapps mostly: services, portals) and the JARs are libraries, configuration, etc.
Therefore, I am guessing that your webapps reference some libraries as Maven dependencies, at a specific version.
That said, IntelliJ (and other IDEs) can easily handle modification of JARs and WARs related to each other via Maven, as long as the versioning is meaningful.
Note: Having -SNAPSHOT at the end of the version number tells Maven NOT to cache the package. By contrast, a definitive version number is considered released and is only fetched from the cache. This is important because with a SNAPSHOT you can publish an unlimited number of times and are guaranteed to get the latest version.
Note: Running mvn clean install publishes a package into your local Maven repository (generally located in ~/.m2), where it is only available to you.
The general good practice is to have, in all the development branches of your DVCS, all the projects you own and modify often as SNAPSHOTs (don't be too greedy; it depends on the situation), and during a release (Maven has a specific plugin for that) to change all the versions to final ones, assigned at that precise moment (you never know whether you will need a minor version bump or a major one).
Your code then always carries the SNAPSHOT number of your expected next release.
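For illustration, with made-up coordinates: the often-modified library's pom carries the snapshot version,

<groupId>com.example</groupId>
<artifactId>shared-lib</artifactId>
<version>1.3.0-SNAPSHOT</version>

and the webapp that consumes it references exactly the same version in its dependencies:

<dependency>
    <groupId>com.example</groupId>
    <artifactId>shared-lib</artifactId>
    <version>1.3.0-SNAPSHOT</version>
</dependency>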
Finally, I think that in your case, if you choose to change the pom.xml of one of your libraries to a SNAPSHOT, you should change the pom.xml of the root project to correspond.
If the dependency version is the same, you can add your library as a module within IntelliJ and the IDE will figure out that the Maven dependency and the Java module are the same entity.
I don't even know if that helps you (I'm not even sure it's clear), but I hope it will make you ask more questions about what you need. Your co-workers will probably be able to help you more.
I have a Maven 2 project, with a pom.xml and a profiles.xml file at the same level.
The project configuration is provided by Maven profile properties:
dbhost=${dbhost}
dbport=${dbport}
# etc.
Locally, each developer customizes his build in profiles.xml. It works well.
For continuous integration, a CI profiles.xml has been put on our SCM server (at the same level as the pom.xml).
The problem is that Hudson simply ignores this file during the Maven build, even though -P hudsonprofile is correctly set.
If the same profile is moved directly into the pom.xml, or into the global settings.xml, the build works. So we already have a solution.
I also know that the profiles.xml file is deprecated, but I would like to understand why the behavior is different between the Hudson build and my local build...
Note: Hudson and my local build use the same version of Maven (2.2.1).
Sounds like a classpath problem to me. Why would Hudson not notice the profiles.xml? The only reason I can think of is that Hudson uses a different classpath than you would expect.
A best practice (at least in my experience) is to try to build the project from the command line on your CI server (where Hudson runs). If that works, then Hudson should work too, unless you have configured Maven in Hudson weirdly.
Also, adjusting Maven's settings.xml is not that bad. At least, if you don't expect it to change too much. Even so, it is fixed quickly.
I think the best solution is defining the profiles directly in your pom.xml file, both for CI specifically and generically for local builds. The devs can then override any profile settings in their own personal settings.xml for local builds.

This has the added benefit of not having to check in a profiles.xml that will not work for the devs, forcing them to modify a versioned file and remember not to check in their changes. It also has the benefit of making your build not dependent on a deprecated Maven feature; after all, I would not count on the behavior of a deprecated feature in the first place. Hopefully this is an elegant solution which uses ideas from what you know already works.
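A rough sketch of what that could look like in the pom.xml; the profile id matches the -P flag from the question, and the property values are placeholders:

<profiles>
    <profile>
        <id>hudsonprofile</id>
        <properties>
            <dbhost>ci-db.example.com</dbhost>
            <dbport>5432</dbport>
        </properties>
    </profile>
</profiles>

Devs then override dbhost/dbport for their local builds through a profile in their personal settings.xml.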