How can you get all the dependencies of a MavenProject (including transitive ones) using Aether?
I have seen numerous examples where you specify the GAV coordinates and Aether resolves the artifact and all of its dependencies. That is all fine. However, if your plugin is supposed to be invoked from the same project whose dependencies you are trying to resolve, this does not seem to work (or perhaps I am doing it wrong). Could somebody please give me a working example of how to do it?
I have tried the example with jcabi-aether shown in this SO post.
Try the utility class Classpath from jcabi-aether:
Collection<File> jars = new Classpath(
    this.getProject(),
    new File(this.session.getLocalRepository().getBasedir()),
    "test" // the scope you're interested in
);
You will get a list of JARs and directories which are in "test" scope in the current Maven project your plugin is in.
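As a small usage sketch (not part of the original answer), the resolved files can be joined into a classpath string:

// Build a platform-specific classpath string from the resolved files
StringBuilder classpath = new StringBuilder();
for (File jar : jars) {
    if (classpath.length() > 0) {
        classpath.append(File.pathSeparator);
    }
    classpath.append(jar.getAbsolutePath());
}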
If you're interested in getting a list of Artifacts instead of Files, use the Aether class directly:
// "repo" is the local repository directory, e.g. new File(this.session.getLocalRepository().getBasedir())
Aether aether = new Aether(this.getProject(), repo);
Set<Artifact> artifacts = new HashSet<Artifact>();
for (Artifact dep : this.getProject().getDependencyArtifacts()) {
    artifacts.addAll(aether.resolve(dep, JavaScopes.COMPILE));
}
In my Maven project, I've got an XML file in resources. Depending on some input parameter, I want the file to be adapted before it is packaged into a JAR or WAR. Of course, the original file shall not be touched.
It is not an option to create multiple XML files and select a suitable one, for example with Spring profiles, as there can be numerous combinations of contents in the XML file.
So, I thought of creating a Maven plugin that manipulates the file before packaging. Probably I need to manipulate the file after Maven has copied it to the target folder but before Maven packages it into the JAR/WAR.
@Mojo(name = "manipulate-xml", defaultPhase = LifecyclePhase.PREPARE_PACKAGE)
public class MyMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", required = true, readonly = true)
    MavenProject project;

    @Parameter(property = "option")
    String option;

    public void execute() throws MojoExecutionException {
        if (option.equals("optionA")) {
            // get file from target and manipulate
        } else if (option.equals("optionB")) {
            // get file from target and manipulate
        }
    }
}
Then, I could embed the Maven plugin into my project and build the project with
mvn clean package -Doption=optionA
However, now I am stuck. I do not know how to get the file from target, or even whether this is the right approach.
Besides, is it possible during packaging to prevent some dependencies from being packaged into the JAR/WAR?
I appreciate any help.
Depending on what manipulating means, you can use the capabilities of the Maven Resources Plugin (https://maven.apache.org/plugins/maven-resources-plugin/index.html).
If you need to modify some simple values inside the XML, use properties in the XML and let the resources plugin replace them during the build. The values for the build can either be in the pom.xml or given to Maven via -Dproperty=value.
If you want to select different files, define multiple Maven profiles; in each one, configure the resources plugin to copy only the wanted files, then select the correct profile for the build.
If the built-in possibilities are not enough, you might even program your own filter for the resources plugin, which might be easier than writing a full-fledged custom Maven plugin.
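If the custom Mojo from the question above is the way you go, a minimal sketch of its execute() body could read the already-copied file from the output directory and rewrite it in place, reusing the project and option parameters shown in the question (the file name my-config.xml and the replacement logic are placeholders, not from the question):

public void execute() throws MojoExecutionException {
    // The resources have already been copied to target/classes by the process-resources
    // phase, so in prepare-package we can edit the copy without touching src/main/resources.
    File xml = new File(project.getBuild().getOutputDirectory(), "my-config.xml"); // assumed file name
    try {
        List<String> lines = Files.readAllLines(xml.toPath(), StandardCharsets.UTF_8);
        List<String> adapted = new ArrayList<String>();
        for (String line : lines) {
            // Placeholder manipulation: adapt the line depending on the chosen option
            adapted.add("optionA".equals(option)
                ? line.replace("@MODE@", "A")
                : line.replace("@MODE@", "B"));
        }
        Files.write(xml.toPath(), adapted, StandardCharsets.UTF_8);
    } catch (IOException e) {
        throw new MojoExecutionException("Could not adapt " + xml, e);
    }
}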
Let's say that there are two local Maven artifacts with the same groupId but with different artifactIds.
The different artifactId should make each Maven artifact unique.
However, if both of these unique artifacts contain a class with the same name, that class will not be unique, because in Java it is referenced by its package and class name, and in the discussed case neither the package (derived from the groupId) nor the class name is unique.
This results in ambiguity as to which class will be used.
Upon testing, it seems that the dependency declared first in the pom.xml file will be used.
The Questions Are
What is the best practice to solve/avoid this issue?
Why does maven's artifactId coordinate contribute to the uniqueness of a maven artifact within the repository but not inside the java code?
Example Code:
Maven - Same Class Name Same GroupId Different ArtifactId
Project1 is the first artifact.
Project2 is the second artifact.
"Projects User" is the artifact/project that will depend on both Project1 & Project2.
Project1 & Project2 both have a class named Utilities.
The class Utilities has a static method, public static String getDescription(), that returns a string containing the current project's artifact coordinates as well as the project name.
The String returned by Utilities.getDescription() is printed to see whether an error occurs somewhere and how the ambiguity is resolved.
The output depends on which dependency was declared first in the pom.xml file of the "Projects User" artifact.
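To make the collision concrete (hypothetical names, not the actual example projects), both artifacts would ship a class like the one below, identical in package and name and differing only in what it returns; at runtime the JVM simply uses whichever copy appears first on the classpath:

// This exact package and class name exists in BOTH Project1 and Project2,
// because both derive the package from the shared groupId only.
package com.example;

public class Utilities {
    public static String getDescription() {
        // Project1's copy; Project2's copy returns its own coordinates instead
        return "com.example:project1:1.0 - Project1";
    }
}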
Edit: Follow-up Question
Is there an archetype that will create the Java package using both the artifactId and groupId, instead of having to do it manually every time?
What is the best practice to solve/avoid this issue?
We include the groupId and artifactId as the base package in the module. This way it is not possible to have the same class in two modules as the packages would be different.
e.g.
<groupId>net.openhft</groupId>
<artifactId>chronicle-bytes</artifactId>
has everything under the package
package net.openhft.chronicle.bytes;
Also if you know the package of a class you know which JAR it must be in.
If you have a class that two JARs need, I suggest creating a common module that they both depend on.
Note: it is general practice to use your company domain name (and possibly a division within it) as the base of your package. Maven recommends using your domain name as your groupId, and if you release to Maven Central this is now a requirement. The above strategy supports both recommendations.
Why does maven's artifactId coordinate contribute to the uniqueness of a maven artifact within the repository but not inside the java code?
Maven doesn't take any notice of the contents of the JAR.
@Peter, following your lead on suggesting best practices to avoid this issue:
groupId: it is required to uniquely identify your project. Use the reverse of your domain name, e.g.:
com.github.dibyaranjan
The artifactId is the name of the JAR without the version.
To distinguish two classes from different JARs, create the package as groupId.artifactId.
For example, if I create a project TestDummy and want the name of the JAR to be TestDummy-1.1, then my package would look like:
com.github.dibyaranjan.testdummy
The class would look like - com.github.dibyaranjan.testdummy.MyClass
For reference, visit https://maven.apache.org/guides/mini/guide-naming-conventions.html
I'm currently writing a custom Maven plugin for generating an XML file in a multi-module Maven project.
My Maven structure is pretty standard: one parent project and one module per project component inside the parent project folder:
-- Parent
-- module A
-- module B
-- module C
I need to list, by module, a set of classes flagged by a custom annotation.
I already wrote a set of custom annotations and an annotation processor to create an XML file at compile time in the corresponding module's output directory (${project.build.outputDirectory}).
Now I need to merge each module's XML into one file, but I don't know how to access the modules from within my Maven plugin, other than having each path set as a parameter (I don't like this method).
Any idea how to do this?
Can Maven plugins traverse project modules?
Thank you in advance.
To get the list of all projects you can use:
// "session" is the current MavenSession, e.g. injected via @Parameter(defaultValue = "${session}", readonly = true)
List<MavenProject> projectList = session.getProjectDependencyGraph().getSortedProjects();
If one of your goals is correctly executed, you will get everything you need. Every MavenProject gives you access to its base directory via getBasedir(), etc.
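Putting this together, an aggregator-style Mojo could walk all modules and collect the per-module files before merging them. A rough sketch, assuming the annotation processor writes a file named annotated-classes.xml into each module's output directory (both the goal name and the file name are made up):

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import org.apache.maven.execution.MavenSession;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

// aggregator = true makes the goal run once for the whole multi-module build
@Mojo(name = "merge-xml", defaultPhase = LifecyclePhase.PREPARE_PACKAGE, aggregator = true)
public class MergeXmlMojo extends AbstractMojo {

    @Parameter(defaultValue = "${session}", readonly = true, required = true)
    private MavenSession session;

    public void execute() {
        List<File> moduleFiles = new ArrayList<File>();
        for (MavenProject module : session.getProjectDependencyGraph().getSortedProjects()) {
            // Assumed file name produced by the annotation processor in each module
            File xml = new File(module.getBuild().getOutputDirectory(), "annotated-classes.xml");
            if (xml.isFile()) {
                moduleFiles.add(xml);
            }
        }
        getLog().info("Found " + moduleFiles.size() + " per-module XML files to merge");
        // ... merge the collected files with your preferred XML API and write the result
    }
}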
After some research, it seems that MavenProject.getCollectedProjects() will return the list of projects being manipulated by a goal execution in a multi-module project.
Is it possible to load remote artifacts via Maven during runtime, e.g. using a specific (Maven) ClassLoader?
For my use case, a legacy piece of software uses a URLClassLoader to pull in a JAR containing some resource files during start-up of a test framework.
The problem is that we currently just use a fixed URL pointing to the repository and do not actually use Maven artifact resolution at all.
Adding this to the project's dependencies is not an option because we want to refer to a specific version from an external configuration file (to run the test framework with different versions of our packaged use cases without changing code).
I hope you get what I want to achieve. It doesn't have to be the prettiest solution; instead of relying on a fixed URL pattern, I'd like to depend on the local Maven setup.
You may use Eclipse Aether (http://www.eclipse.org/aether) to resolve and download the JAR artifacts from Maven repositories using GAV coordinates.
Then use a regular URLClassLoader with the JAR you've downloaded.
You can find some examples there: https://github.com/eclipse/aether-demo/blob/master/aether-demo-snippets/
But basically, what you should do is the following:
DefaultServiceLocator locator = MavenRepositorySystemUtils.newServiceLocator();
locator.addService(RepositoryConnectorFactory.class, BasicRepositoryConnectorFactory.class);
locator.addService(TransporterFactory.class, FileTransporterFactory.class);
locator.addService(TransporterFactory.class, HttpTransporterFactory.class);
RepositorySystem system = locator.getService(RepositorySystem.class);
DefaultRepositorySystemSession session = MavenRepositorySystemUtils.newSession();
LocalRepository localRepo = new LocalRepository("/path/to/your/local/repo");
session.setLocalRepositoryManager(system.newLocalRepositoryManager(session, localRepo));
// Set the coordinates of the artifact to download
Artifact artifact = new DefaultArtifact("<groupId>", "<artifactId>", "jar", "<version>");
ArtifactRequest artifactRequest = new ArtifactRequest();
artifactRequest.setArtifact(artifact);
// Search in central repo
artifactRequest.addRepository(new RemoteRepository.Builder("central", "default", "http://repo1.maven.org/maven2/").build());
// Also search in your custom repo
artifactRequest.addRepository(new RemoteRepository.Builder("your-repository", "default", "http://your.repository.url/").build());
// Actually resolve (and download if necessary) the artifact
ArtifactResult artifactResult = system.resolveArtifact(session, artifactRequest);
artifact = artifactResult.getArtifact();
// Create a classloader with the downloaded artifact.
ClassLoader classLoader = new URLClassLoader(new URL[] { artifact.getFile().toURI().toURL() });
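As a usage sketch (class and resource names are placeholders), the downloaded JAR can then be used like any other classpath entry:

// Load a class or a resource from the freshly resolved JAR
Class<?> useCase = classLoader.loadClass("com.example.usecases.MyUseCase");
InputStream testData = classLoader.getResourceAsStream("config/test-data.xml");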
If I have the information for a Maven artifact (groupId, artifactId, version), how can I programmatically (using Java) retrieve that artifact from my local repository?
Specifically, I need to be able to connect to the Maven repository and create/retrieve an org.apache.maven.artifact.Artifact so I can retrieve the file associated with it.
I have looked into the m2e source code, but MavenImpl.java (which provides artifact resolution) is way more complex than what I need, and it is difficult to understand how the connection to the repository works.
You'll probably want to look at Aether. See the Wiki for examples.
You can construct a URL from the given information and download the file (note, replace the '.' in the <groupId> with '/'):
<repositoryUrl>/<groupId>/<artifactId>/<version>/<artifactId>-<version>.<type>
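For a local repository the same pattern gives you the file path directly. A minimal sketch, assuming the default ~/.m2/repository location and junit:junit:4.12 as an example coordinate:

// Build the path of an artifact inside the default local repository (~/.m2/repository)
String groupId = "junit";
String artifactId = "junit";
String version = "4.12";
String type = "jar";

File localRepo = new File(System.getProperty("user.home"), ".m2/repository");
File artifactFile = new File(localRepo,
    groupId.replace('.', '/') + "/" + artifactId + "/" + version + "/"
        + artifactId + "-" + version + "." + type);

if (!artifactFile.isFile()) {
    // Not in the local repository yet; it would have to be downloaded from a remote repository first
}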
This is how we do it in jcabi-aether:
final File repo = this.session.getLocalRepository().getBasedir();
final Collection<Artifact> deps = new Aether(this.getProject(), repo).resolve(
    new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
    JavaScopes.RUNTIME
);
Give it a list of remote repositories, the location of a local repo, and the Maven coordinates of the artifact. As the name suggests, the library uses Aether from Sonatype under the hood.