So, I have a fairly basic Gradle project. Everything works fine through Gradle using either "test" or "build". However, when I go to actually import this project into IntelliJ, everything seems to be added as expected except none of the dependencies make it into the build process. The Gradle tool window clearly lists the dependencies used and such, but actually trying to use these ends up with none of the packages being found. I've tried resyncing it, cleaning and rebuilding, etc. But it always ends up with none of the dependencies actually being there in IntelliJ's build process. Building from Gradle is fine.
How can I resolve this? I tried using the "idea" plugin with the "ideaModule" task. This puts all the dependencies in place as I expect, but it also overwrites all of the changes I made (such as excluded/included source files), and I'd need to remember to run it manually any time I change dependencies.
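For reference, here is roughly what that workaround looks like; a minimal sketch of build.gradle with the "idea" plugin applied (the downloadJavadoc/downloadSources options are only illustrative, my real build doesn't depend on them):

apply plugin: 'idea'

idea {
    module {
        // standard options of the idea plugin, shown here only for illustration
        downloadJavadoc = true
        downloadSources = true
    }
}

Running gradle ideaModule then regenerates the .iml file with the dependencies in it, which is exactly what clobbers the manual module changes I mentioned.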
How can I get IntelliJ to actually recognize and build with the dependencies in the Gradle build file?
Related
I'm trying to run some unit tests with Apache Maven. I hoped this would be as simple as running the test "goal". But when I did that, Maven complained that it could not download some dependencies and thus couldn't run my tests. That would be fine, except that I have no idea why it decided I need those dependencies; they are not in my pom.xml, and I doubt they're in my transitive dependencies either. (I'm not sure about that last part; they very well might be in my transitive dependencies.)
Luckily, Maven has the perfect tool for this: dependency:tree will tell us exactly which dependency is getting pulled in by what. Except for the small problem that Maven thinks to itself, "in order to build the tree, I have to resolve the dependencies first", so it tries (and fails) to download those very same dependencies so that it can build the part of the tree that's under them.
So now I don't have a tree, and I have no idea how to proceed from here.
How exactly do you think Maven could resolve transitive dependencies (i.e. dependencies of dependencies) without resolving the direct dependencies first? Especially for the "test" goal, the "test" dependency scope also has to be used, which covers more than the default "compile" scope.
You can use the goal dependency:go-offline to prepare for offline mode. Maven then downloads all required dependencies. The detailed docs for that are at https://maven.apache.org/plugins/maven-dependency-plugin/go-offline-mojo.html
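For example, something along these lines should populate the local repository before you disconnect (the test goal here is just an example, use whatever goal you actually run):

mvn dependency:go-offline
mvn -o test

The first command downloads the dependencies and plugins the build needs; the second runs the tests in offline mode (-o) against the now-populated local repository.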
You could also have a look at this answer to get another opinion on going offline.
The main problem is that Maven downloads dependencies on demand. You can check that by triggering different lifecycle phases, like mvn initialize, mvn validate, mvn compile, mvn package, and watching what Maven tries to download. Sometimes it is possible to figure out project dependencies by analysing the project object model (POM), and sometimes it is not, especially when plugins define their own dependencies either implicitly or explicitly. Some examples:
we may ask maven-dependency-plugin to download something via dependency:copy-dependencies
exec-maven-plugin has similar functionality: Running Java programs with the exec goal
maven-invoker-plugin may run POMs which are part of the project but not part of the reactor.
In short: no Maven plugin will be able to download all required dependencies. The only "reliable" way to go offline is to run the target goal first and only then go offline. Unfortunately, even in that case some weird things may happen, especially when you or the dependency authors are using snapshot versions, version ranges, third-party repositories, etc. (my own preference is to run Maven with the -llr flag to make it more reliable).
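As a rough sketch of that "run the target goal first, then go offline" approach (the goal names are just examples, substitute whatever you actually run):

mvn -llr clean test
mvn -o clean test

The first invocation runs the real build while online so the local repository ends up with everything that goal needs (-llr selects the legacy local repository behaviour, which avoids some resolution quirks); the second reruns the same goal in offline mode (-o).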
For several months we've been using Buildship 1.x plus some manual .launch/tasks to build our Eclipse/WTP config files per development environment. I am currently attempting to migrate to Buildship 2 (which I'm hoping will rid us of the need for the manual bits).
However, when I import the projects (which have no Eclipse config files at this point) via the Buildship/Gradle import, the subprojects are included as 'Libraries' rather than as 'Projects' (see image below). In contrast, if I use Gradle's eclipse task to generate the Eclipse config files (i.e. .classpath), the configuration ends up as I would expect. Is this a current limitation of Buildship, or do I need to do something differently in my Gradle files to coerce Buildship into bringing them in as Projects?
Ultimately I don't know that I should care about this difference, but I do know that I'm getting compiler errors saying classes from the subprojects are missing from the classpath. As long as I can fix that issue, I'm perfectly happy.
Potentially helpful info
settings.gradle:
rootProject.name = 'projectroot'
include 'Project2.0'
project(':Project2.0').name = 'projectx'
include 'the-platform'
include 'the-platform:central-repo:central-repo-common'
include 'the-platform:central-repo:central-repo-model'
include 'the-platform:central-repo:central-repo-persist'
include 'the-platform:central-repo:central-repo-service'
Project2.0/build.gradle (snippet):
dependencies {
...
compile project(':the-platform:central-repo:central-repo-common')
compile project(':the-platform:central-repo:central-repo-model')
compile project(':the-platform:central-repo:central-repo-persist')
compile project(':the-platform:central-repo:central-repo-service')
...
}
Hmmm, never mind. My intuition that the difference in behavior between Buildship and Gradle's eclipse plugin was responsible for my classpath issues turned out to be incorrect. Something else (as yet unexplained) must have been the issue, as it is working correctly now.
I'm having some trouble importing my project from its build.gradle file.
It's not able to find classes from hamcrest-core-1.3.jar.
gradle clean build
runs successfully in the terminal.
My environment is
Intellij 2016.3.1
Gradle 2.14.1
IntelliJ was able to resolve the reference once, but the resolution went away on restart. When it could resolve hamcrest, it later failed to resolve the pigunit jar instead. I also got a NoSuchMethodError once (the class was loaded from the wrong jar).
Invalidate Caches also didn't work.
I tried importing the project from scratch multiple times.
Please let me know if any other information is needed.
For some reason, org.hamcrest.CoreMatchers is loaded from mockito-all-1.10.19.jar instead of hamcrest-core-1.3.jar. I played around with the order of these two jars in the module's libraries.
Please refer to the screenshot below.
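Update: one thing I'm considering (not yet confirmed to fix this) is keeping the duplicated Hamcrest classes out of the Gradle dependency graph entirely, e.g. by switching from mockito-all to mockito-core and excluding its Hamcrest dependency. The coordinates below are only illustrative, not copied from my actual build.gradle:

dependencies {
    testCompile 'org.hamcrest:hamcrest-core:1.3'
    // mockito-core, unlike mockito-all, does not bundle the Hamcrest classes;
    // excluding org.hamcrest leaves only the explicit hamcrest-core 1.3 above
    testCompile('org.mockito:mockito-core:1.10.19') {
        exclude group: 'org.hamcrest'
    }
}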
I have several gradle projects in my eclipse workspace. For the sake of simplicity I'm only really interested in 2 of them, let's just use A and B for this.
So the problem I'm having is that Project A has an included dependency on JBoss, which pulls in javax validation-api 1.0.0.GA, and Project B has a dependency on javax validation-api 1.1.0.Final. Since Gradle itself resolves the conflict by using the newer library first, B is happy when built by Gradle. But Eclipse itself shows errors, which are very distracting while editing.
The correct version of the validation-api jar ends up in B's class path but the problem is that the Gradle IDE plugin changes the project(':A') dependency to a project reference, and Eclipse seems to give the project reference precedence over the external jar. So the old jar is preferred by extension.
I tried adding { exclude module: 'validation-api' } in B's build.gradle for the dependency on A, which works according to the output of 'gradle dependencies'. However, since Eclipse only gets as far as making it a project reference, it won't exclude the jar and the problem remains.
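For clarity, this is roughly how that exclusion looks in B's build.gradle (with ':A' standing in for the real project path):

dependencies {
    compile(project(':A')) {
        // drop the old validation-api that A pulls in transitively
        exclude module: 'validation-api'
    }
}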
Also, per this question, I tried adding { transitive = false } and the same thing happens. I don't think even the hack proposed there would work for me, since the .classpath contains a single reference to the Gradle container, so there's nothing to remove.
I've managed to get around this by explicitly including a reference to the correct version of the jar from my gradle cache and then moving it above the Gradle Classpath Container so that eclipse sees that version first.
My question is: Is there a better/more generic way to do this? Preferably one that I can commit to source control without breaking other people's builds or requiring them to manually modify paths or properties somewhere? There is another project with what appears to be a similar issue so something I can fix in the build.gradle file would be awesome.
Worst case scenario, I could probably switch to IntelliJ if that behaves itself better than the Eclipse-Gradle integration?
These kinds of transitive dependency issues are a long-standing problem with Gradle-Eclipse integration (both in the STS tooling and in the command-line-generated .classpath metadata from Gradle's eclipse plugin). The problem is the way that Eclipse computes transitive classpaths.
Only recently did we find a reasonable solution to this problem. Actually, there are now two solutions, one better than the other, but depending on your situation you might want to use either of them.
The first solution is a bug fix that changes the classpath order of project dependencies so that they are no longer 'preferred' over jar dependencies (PR-74). To get this fix you may need to install the Gradle tooling from a snapshot update site, because the fix went in after 3.6.3.
This solution doesn't fix the real problem (you still have the 'wrong' stuff on the classpath), but it makes it less likely to cause real problems in your projects.
The second solution is to enable the 'Custom Tooling API model' (PR-55) introduced in STS 3.6.3. This is a bit experimental and only works for recent versions of Gradle, at least 1.12, though you are probably better off with 2.x. It also only works for projects that have 'Dependency management' enabled (if it is not enabled, you are using the .classpath generated by Gradle's eclipse plugin, which has the same 'broken' classpath issues as the STS tooling).
The 'custom tooling model' is really the better solution in principle, as it fixes the way Gradle classpaths get mapped to Eclipse projects: project dependencies are no longer exported, and each project gets its own classpath with dependency conflict resolution taken into account.
To enable this go to "Window >> Preferences >> Gradle" and enable checkbox "Use Custom Tooling Model".
Is there a way to configure Gradle to shorten the folder names of its cached dependencies?
From the Gradle user guide it does not appear to be possible, but I figured I'd check with others.
My use case: the 'idea' Gradle plugin helps with setting up module dependencies. A problem arises when the module classpath becomes 'too long' for cmd.exe (I'm not trying to discuss those limitations here). IDEA loads the project just fine, but it's unable to run my program since it states the classpath is too long.
Since this is not an IDEA problem, I figured it would be lovely if there were a way Gradle could cache dependencies using shorter folder names.
Example
from: C:\.gradle\caches\modules-2\files-2\com.google.application\application\2.0\SVABNSAVSASAMNVSMAVSASN\application.jar
Option 1, to: C:\.gradle\caches\modules-2\files-2\c.g.a\a\2.0\SVABNSAVSASAMNVSMAVSASN\application.jar
Option 2, to: C:\.gradle\caches\modules-2\files-2\co.go.ap\ap\2.0\[tinyurl-equivalent]\application.jar
Option 3, to: C:\.g\c\m-2\f-2\c.g.a\a\2.0\[tinyurl-equivalent]\application.jar
Option 4, to: C:\.g\[tinyurl-equivalent]\application.jar
I do know that IDEA recognizes the long classpath and prompts to enable a dynamic classpath, but this has been known to cause other problems (some invoked apps cannot see the full classpath), so I'd like to avoid that IDEA option.
As of Gradle 2.1, shortening dependency cache paths isn't supported. There are ideas around symlinking or copying dependencies into the project, but nothing concrete has materialized.
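One partial mitigation, if it helps in your setup, is to move the Gradle user home itself to a very short path; this does not shorten the hashed per-artifact directories, it only shortens the prefix, so it mostly helps when the default location under the user profile is in play. For example, on Windows:

set GRADLE_USER_HOME=C:\g

Gradle then keeps its caches under C:\g\caches\... instead of the default %USERPROFILE%\.gradle.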