We use an internal Ivy repository and are in the process of moving away from Ant/Ivy and moving everything to Gradle. I have my Ivy repository set up in Gradle like so:
repositories {
    ivy {
        url "${ivy_repository_url}"
        layout "pattern", {
            ivy "repository/[organisation]/[module]/[revision]/[artifact].[ext]"
            artifact "repository/[organisation]/[module]/[revision]/[artifact].[ext]"
            m2compatible = true
        }
        credentials {
            username "${ivy_repository_username}"
            password "${ivy_repository_password}"
        }
    }
}
Upon executing a task it resolves as expected; however, in some cases where the naming convention doesn't match, it pulls down everything in the repository, so I end up with a lot of extra artifacts: javadocs.zip, sources.zip, and everything else.
To get around this I want to download the specific jar files to a temp folder first and then compile against them locally, but I have no clue how to tell Gradle to download a file that is named differently from the module name.
Example:
ivyFiles 'net.sourceforge.jtidy:jtidy:r938@jar'
downloads just jtidy-r938.jar from the net.sourceforge.jtidy repository, but something like
ivyFiles 'com.gargoylesoftware:htmlunit:2.7@jar'
would pull htmlunit-2.7.jar but not the file htmlunit-core-js-2.7.jar.
If I omit the @jar it reverts to reading the ivy.xml file and I am left with all the junk plus dependencies, which I am trying to avoid. I have tried the following with no success:
ivyFiles('com.gargoylesoftware:htmlunit:2.7') {
    artifact {
        name = 'htmlunit-core-js'
        type = 'jar'
    }
}
There must be a way to do this.
Thank you
I figured it out. This is what I ended up doing
ivyFiles('com.gargoylesoftware:htmlunit:2.7') {
    transitive = false
    artifact {
        name = 'htmlunit-core-js'
        extension = 'jar'
        type = 'jar'
    }
}
Now only the htmlunit-core-js-2.7.jar file was downloaded.
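For reference, here is a minimal sketch of how the surrounding pieces might look; the ivyFiles configuration and the fetchIvyFiles task names are assumptions based on the snippets above, not part of the original build:
configurations {
    // Custom configuration that holds only the hand-picked artifacts declared via ivyFiles(...).
    ivyFiles
}

// Copies whatever the ivyFiles configuration resolves into a temp folder,
// so the build can compile against local files as originally planned.
task fetchIvyFiles(type: Copy) {
    from configurations.ivyFiles
    into "$buildDir/ivy-temp"
}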
In my Maven project, I've got an XML file in resources. Depending on some input parameter, I want the file to be adapted before it is packaged into a jar or war. Of course, the original file shall not be touched.
It is not an option to create multiple XML files and select a suitable one, for example with Spring profiles, because there can be numerous combinations of contents in the XML file.
So I thought of creating a Maven plugin that manipulates the file before packaging. Presumably I need to manipulate the file after Maven has copied it to the target folder but before Maven packages it into the jar/war.
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "manipulate-xml", defaultPhase = LifecyclePhase.PREPARE_PACKAGE)
public class MyMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", required = true, readonly = true)
    MavenProject project;

    @Parameter(property = "option")
    String option;

    public void execute() throws MojoExecutionException {
        if (option.equals("optionA")) {
            // get file from target and manipulate
        } else if (option.equals("optionB")) {
            // get file from target and manipulate
        }
    }
}
Then I could embed the Maven plugin into my project and build the project with
mvn clean package -Doption=optionA
However, now I am stuck. I do not know how to get the file from target, or even whether this is the right approach.
Besides, is it possible during packaging to prevent some dependencies from being packaged into the jar/war?
I appreciate any help.
Depending on what "manipulating" means, you can use the capabilities of the Maven Resources Plugin (https://maven.apache.org/plugins/maven-resources-plugin/index.html).
If you need to modify some simple values inside the XML, use properties in the XML file and let the resources plugin replace them during the build. The values can be set either in the pom.xml or passed to Maven via -Dproperty=value.
If you want to select different files, define multiple Maven profiles; in each one, configure the resources plugin to copy only the wanted files, and then select the correct profile in the build.
If the built-in possibilities are not enough, you could even write your own filter for the resources plugin; that might be easier than writing a full-fledged custom Maven plugin.
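If you do stick with the custom Mojo approach instead, a minimal sketch of the "get file from target" step could look like the fragment below (it goes inside execute(), needs java.io.File imported, and the resource name config.xml is only a placeholder for illustration):
// Locate the resource after Maven has copied it to target/classes.
File outputDir = new File(project.getBuild().getOutputDirectory());
File xmlFile = new File(outputDir, "config.xml"); // placeholder resource name
if (!xmlFile.isFile()) {
    throw new MojoExecutionException("Expected resource not found: " + xmlFile);
}
// read xmlFile, apply the option-specific changes, and write it back in place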
I am trying to build my project on an offline machine (this is a requirement). I have created a local Maven repository (it's just a folder with the appropriate structure) and successfully built everything else.
I do it the following way:
1) Run gradle install (the Maven plugin's goal),
then check for errors by hand. If I see that some library was not found, then
2) I take its Maven coordinates and copy it to this machine by hand from my own machine.
It works, except for Antlr. I am getting the following message:
>gradle install
:generateGrammarSource
FAILURE: Build failed with an exception.
* What went wrong:
Could not resolve all dependencies for configuration ':antlr'.
> Could not download antlr4.jar (org.antlr:antlr4:4.5)
> Could not get resource 'http://repo.maven.apache.org/maven2/org/antlr/antlr4/4.5/antlr4-4.5.jar'
> Could not HEAD 'http://repo.maven.apache.org/maven2/org/antlr/antlr4/4.5/antlr4-4.5.jar'.
> Connection to http://repo.maven.apache.org refused
The error message is the same as always, but this time putting the jar into the local Maven repository does not help.
How can I overcome this? How do I configure Antlr to eat from the local Maven repo?
UPDATE
File is present in
MYHOME\.gradle\caches\modules-2\files-2.1\org.antlr\antlr4\4.5\af4a530e3cd7fa03636645d8077145eefac12907\antlr4-4.5.jar
and in
MYHOME\.m2\repository\org\antlr\antlr4\4.5\antlr4-4.5.jar
In the Maven case, the accompanying files are also present.
UPDATE 2
Note that it says
Could not resolve all dependencies for configuration ':antlr'.
and the Antlr dependency is added by
antlr "org.antlr:antlr4:4.5" // use ANTLR version 4
i.e. not compile and not testCompile. Maybe this is the clue? Maybe there is a way to configure repositories specifically for the antlr configuration?
UPDATE 3
In those cases where I was resolving successfully, it was writing:
Could not resolve all dependencies for configuration ':compile'.
> Could not resolve net.coobird:thumbnailator:0.4.8.
Required by:
com.cireca.overlaywidget:OverlayWidget:1.0-SNAPSHOT
> Could not resolve net.coobird:thumbnailator:0.4.8.
> Could not get resource 'https://repo1.maven.org/maven2/net/coobird/thumbnailator/0.4.8/thumbnailator-0.4.8.pom'.
> Could not GET 'https://repo1.maven.org/maven2/net/coobird/thumbnailator/0.4.8/thumbnailator-0.4.8.pom'.
> Connection to https://repo1.maven.org refused
UPDATE 4
Strange thing. I noticed that my config looks redundant:
repositories {
    maven { url "http://repo.maven.apache.org/maven2" }
    maven { url "https://oss.sonatype.org/content/repositories/snapshots/" }
    maven { url "http://maven-eclipse.github.io/maven" }
    mavenLocal()
    flatDir {
        dirs 'lib'
    }
    maven { url "http://repo.maven.apache.org/maven2" }
    maven { url "https://oss.sonatype.org/content/repositories/snapshots/" }
    maven { url "http://maven-eclipse.github.io/maven" }
}
I changed this to
repositories {
    mavenLocal()
    flatDir {
        dirs 'lib'
    }
    maven { url "http://repo.maven.apache.org/maven2" }
    maven { url "https://oss.sonatype.org/content/repositories/snapshots/" }
    maven { url "http://maven-eclipse.github.io/maven" }
}
And after that it started to ask for different libraries. First it asked for
org.antlr:antlr4-runtime:4.5
and I fed it successfully, but then it asked for
org.antlr:antlr-runtime:3.5.2
and I can't feed it (same situation).
Maven contacts the repositories specified in your settings.xml. If you want to avoid external repositories, you need to mirror everything.
Actually, the best approach for working in an environment that is not connected to the internet is to set up your own Nexus/Artifactory. The easiest way to fill that repository for offline use is to connect it to the internet once, build everything, and then disconnect. Then you have a local copy of everything that is relevant for you.
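On the Gradle side, a minimal sketch of a repositories block that resolves only through such an internal mirror (the Nexus URL below is hypothetical); combined with Gradle's --offline flag this forces resolution from the mirror and the local caches only:
repositories {
    // Hypothetical internal mirror; replace with your own Nexus/Artifactory URL.
    maven { url "http://nexus.internal.example/repository/maven-public/" }
}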
I removed the remote repository references completely from that machine (before this, the remote repositories just had lower priority).
After that, Gradle started to report what it was missing in the form of explicit local paths, and I found that it was lacking some parent (?) artifacts.
For example, for Guava it wanted com.google.guava:guava-parent, and for antlr it wanted org.antlr:antlr-master. These names were not reported when the remote repositories were present in the config.
If anybody can explain what happened in a more detailed way, I will accept his/her answer.
In my Gradle project, I need to define two types of repositories: a flat directory and a Maven repository (a local Nexus server).
I can get both working separately, but can't get them to play nicely together. Ideally, I would have Gradle look at the local directory first, then at the Maven repository.
Current Setup
I have defined the repositories in my build.gradle like this:
repositories {
    flatDir dirs: "${rootProject.projectDir}/libs"
    flatDir dirs: "${rootProject.projectDir}/test.libs"
    maven {
        credentials {
            username nexus_username
            password nexus_password
        }
        url "http://nexus-server/nexus/content/groups/public"
    }
}
In the libs (and test.libs) directory, the jar file names may or may not have versioning (but when using a flatDir repository, I believe that is irrelevant):
libs\activation.jar
libs\imap.jar
....
libs\<closed_source_framework>.jar
....
libs\gson-2.2.4.jar
libs\stax-utils.jar
.....
The reason I can't use our local Nexus server for everything is <closed_source_framework>.jar; most of the dependencies in the libs folder come packaged with that distribution, and I can't reliably get the version information to pull them from Nexus.
Now, one of the other teams is publishing their jars to the Nexus server and I'd like to be able to pull their jars from Nexus, so I have (re)defined my dependencies in my build.gradle:
dependencies {
    // Grab other-team jar files from Nexus server
    compile "other-team-group:other-team-jar-1:version"
    compile "other-team-group:other-team-jar-2:version"
    compile "other-team-group:other-team-jar-3:version"
    // Grab everything else from 'flatDir'
    compile name: 'activation'
    compile name: 'imap'
    ...
    compile name: 'gson-2.2.4'
    compile name: 'stax-utils'
    .....
}
The Problem
So now comes my problem. I had expected Gradle to search the repositories in the order specified in my build.gradle, meaning it would look in the local libs folder first and then go to Nexus only if it couldn't find a dependency locally. What I'm seeing instead is that Gradle also queries the Nexus server for the jar files that are already in the local libs folder. Obviously, this is slowing down my build (I have ~30 dependencies defined).
Some Info
Output from gradle properties command, to show repository information:
.....
repositories: [org.gradle.api.internal.artifacts.repositories.DefaultFlatDirArtifactRepository_Decorated@18814b1b, org.gradle.api.internal.artifacts.repositories.DefaultMavenArtifactRepository_Decorated@6dff028]
.....
Output from gradle --info compileJava, to show that Gradle is doing a lookup to Nexus:
.....
// Successfully find the other team jar files in Nexus, this is okay
Download http://nexus-server/nexus/content/groups/public/other-team-group/other-team-jar-1/version/other-team-jar-1-version.pom
Download http://nexus-server/nexus/content/groups/public/other-team-group/other-team-jar-2/version/other-team-jar-2-version.pom
Download http://nexus-server/nexus/content/groups/public/other-team-group/other-team-jar-3/version/other-team-jar-3-version.pom
.....
// Continues looking in Nexus for jar files that should be found in local libs folder
Resource missing. [HTTP GET: http://nexus-server/nexus/content/groups/public//activation//activation-.pom]
Resource missing. [HTTP HEAD: http://nexus-server/nexus/content/groups/public//activation//activation-.jar]
Resource missing. [HTTP GET: http://nexus-server/nexus/content/groups/public///imap//imap-.pom]
Resource missing. [HTTP HEAD: http://nexus-server/nexus/content/groups/public//imap//imap-.jar]
.....
Bottom Line
How can I get Gradle to stop looking at the Maven repository for jar files that I know it will only find locally?
I also posted this question over on the Gradle forums. I have copy/pasted the solution below.
Gradle will prefer an artifact with a pom/ivy descriptor over an artifact without one. I think this is why Gradle continues searching after it finds a match in the flatDir repository. This may or may not solve your problem, but you could use a FileCollectionDependency instead of a ModuleDependency.
Eg:
ext {
    libs = "${rootProject.projectDir}/libs"
    testLibs = "${rootProject.projectDir}/test.libs"
}

dependencies {
    compile files("${libs}/activation.jar", "${libs}/imap.jar")
    compile files("${testLibs}/gson-2.2.4.jar")
    ...
}
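If listing ~30 jars individually becomes tedious, a fileTree-based variant along the same lines (a sketch; adjust the directories and includes to your layout) picks up every jar in the local folders without ever touching the Maven repository:
dependencies {
    // Every jar dropped into libs/ and test.libs/ becomes a plain file dependency.
    compile fileTree(dir: "${rootProject.projectDir}/libs", include: '*.jar')
    testCompile fileTree(dir: "${rootProject.projectDir}/test.libs", include: '*.jar')
}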
How can you get all the dependencies of a MavenProject (including transitive ones) using Aether?
I have seen numerous examples where you specify the GAV and it resolves the artifact and all its dependencies. This is all fine. However, if your plugin is supposed to be invoked from the same project whose dependencies you're trying to resolve, this does not seem to work (or perhaps I am doing it wrong). Could somebody please give me a working example of how to do it?
I have tried the example with jcabi-aether shown in this SO post.
Try using the utility class Classpath from jcabi-aether:
Collection<File> jars = new Classpath(
    this.getProject(),
    new File(this.session.getLocalRepository().getBasedir()),
    "test" // the scope you're interested in
);
You will get a list of JARs and directories which are in "test" scope in the current Maven project your plugin is in.
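The snippet assumes the usual project and session objects are injected into the Mojo; a hedged sketch of the corresponding field declarations (using the standard Maven plugin annotations) would be:
import org.apache.maven.execution.MavenSession;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

// Assumed injected fields backing this.getProject() and this.session above.
@Parameter(defaultValue = "${project}", readonly = true, required = true)
private MavenProject project;

@Parameter(defaultValue = "${session}", readonly = true, required = true)
private MavenSession session;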
If you're interested in getting a list of Artifacts instead of Files, use the Aether class directly:
Aether aether = new Aether(this.getProject(), repo);
Set<Artifact> artifacts = new HashSet<Artifact>();
for (Artifact dep : this.getProject().getDependencyArtifacts()) {
    artifacts.addAll(aether.resolve(dep, JavaScopes.COMPILE));
}
I would like to use the Gradle Tomcat plugin in order to do integration tests with Gradle. The current project relies on some .properties files underneath the running Tomcat's catalina.base directory (this cannot be changed because another dependent project relies on them as well).
Does anybody know how to deploy those files to the embedded tomcat instance?
I figured out it's just a simple copy task issue. Here's my solution:
task copyDMConfigFiles << {
    def srcDir = new File('src/test/resources/conf')
    if (!srcDir.isDirectory())
        println "Outlet configuration files missing!!!"
    def buildDir = new File('build/tmp/tomcatRunWar/conf')
    if (!buildDir.isDirectory()) {
        println "Outlet target directory missing. Creating one"
        buildDir.mkdirs()
    }
    copy {
        from(srcDir)
        into(buildDir)
        include '**/*.properties'
        include '**/*.xml'
    }
    copy {
        from('src/main/webapp/WEB-INF')
        into('build/tmp/tomcatRunWar/work/Tomcat/localhost/digitalmedia/WEB-INF')
        include 'web.xml'
        include 'dispatcherservlet.xml'
    }
}
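To make sure the files are in place before the server starts, the copy task presumably needs to run ahead of the Tomcat task; one way (assuming the plugin's tomcatRunWar task, as the paths above suggest) is:
// Ensure the config files are copied before the embedded Tomcat starts.
tomcatRunWar.dependsOn copyDMConfigFiles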