I am migrating a project to Dagger 1.2.2. I'd like to override some dependencies for functional tests. To do that, I included the dagger-compiler as a dependency of the androidTest build as well:
apt "com.squareup.dagger:dagger-compiler:$daggerVersion"
compile "com.squareup.dagger:dagger:$daggerVersion"
androidTestApt "com.squareup.dagger:dagger-compiler:$daggerVersion"
Now the compiler complains that it cannot find a class (I guess because both builds now contain the transitive dependencies of dagger-compiler):
Error:Execution failed for task ':app:compileDebugAndroidTestJava'.
> java.lang.NoClassDefFoundError: javax/inject/Scope
Looking around GitHub, it seems the approach should work without manually excluding anything.
Never mind. Actually reading the whole build file helps a lot.
Because of previous dependency juggling, I had a directive that explicitly excluded the missing dependency:
configurations {
androidTestCompile.exclude(group:'javax.inject')
}
Removing that fixed it.
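For reference, a minimal sketch of the relevant part of the build.gradle after removing that exclusion (Dagger version variable as above; everything else unchanged):
dependencies {
    compile "com.squareup.dagger:dagger:$daggerVersion"
    apt "com.squareup.dagger:dagger-compiler:$daggerVersion"
    androidTestApt "com.squareup.dagger:dagger-compiler:$daggerVersion"
}
// No configurations { androidTestCompile.exclude(group: 'javax.inject') } block anymore,
// so javax.inject.Scope stays on the androidTest compile classpath.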
My gradle project contains 3 sub-projects with one source file each:
root-project\
sub-project-abstract\
...AbstractFoo.java
sub-project-commons\
...ConcreteFoo.java (extends AbstractFoo)
sub-project-main\
...Main.java (instantiates ConcreteFoo)
build.gradle of sub-project-commons:
dependencies {
implementation(project(":sub-project-abstract"))
}
build.gradle of sub-project-main:
dependencies {
implementation(project(":sub-project-commons"))
}
The Main class in sub-project-main is aware of ConcreteFoo; however, compilation fails with "cannot access AbstractFoo".
For some reason, I expected sub-project-commons to "export" ConcreteFoo and AbstractFoo, since it's an implementation dependency. In other words, from the perspective of sub-project-main, AbstractFoo is a transitive dependency.
However, this doesn't seem to be the case.
I know that I could probably make it work by explicitly adding sub-project-abstract as a direct dependency of sub-project-main. However, that's something I want to avoid due to the nature of the commons project (my actual project contains up to 10 subprojects, and it should be possible to reuse the commons project without declaring a dependency on sub-project-abstract every single time the commons project is referenced).
Is there a way to make the Main-class aware of AbstractFoo without directly declaring sub-project-abstract as a dependency (but indirectly via sub-project-commons)?
This is expected behavior for the implementation configuration. You should apply the Java Library Plugin and use the api configuration.
The key difference between the standard Java plugin and the Java Library plugin is that the latter introduces the concept of an API exposed to consumers. A library is a Java component meant to be consumed by other components. It’s a very common use case in multi-project builds [emphasis added], but also as soon as you have external dependencies.
The plugin exposes two configurations that can be used to declare dependencies: api and implementation. The api configuration should be used to declare dependencies which are exported by the library API, whereas the implementation configuration should be used to declare dependencies which are internal to the component.
[...]
Dependencies appearing in the api configurations will be transitively exposed to consumers of the library, and as such will appear on the compile classpath of consumers. Dependencies found in the implementation configuration will, on the other hand, not be exposed to consumers, and therefore not leak into the consumers' compile classpath. [...]
In sub-project-commons (Kotlin DSL):
plugins {
...
`java-library`
}
...
dependencies {
api(project(":sub-project-abstract"))
}
...
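For completeness, a sketch of the same change in the Groovy DSL used in the question; sub-project-main can keep its existing dependency on sub-project-commons:
// sub-project-commons/build.gradle
plugins {
    id 'java-library'
}

dependencies {
    // api (instead of implementation) exposes AbstractFoo to consumers of this project
    api project(':sub-project-abstract')
}

// sub-project-main/build.gradle can stay as it is:
dependencies {
    implementation project(':sub-project-commons')
}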
Micronaut documentation says:
For test resources which make use of Testcontainers, you may extend
the base AbstractTestContainersProvider class.
My question is: how do I properly add this class to the classpath of the test resources source set? (I am using Gradle.)
You will need to add the following dependencies to your build.gradle file:
dependencies {
testResourcesImplementation platform("io.micronaut:micronaut-bom:3.6.1")
testResourcesImplementation "io.micronaut.testresources:micronaut-test-resources-testcontainers"
}
(Note that I'm importing the Micronaut BOM so that you don't have to specify the test resources version, but you could also specify the version directly.)
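For context, here is a sketch of how those dependencies might sit in a build.gradle that applies the Micronaut test resources Gradle plugin; the plugin versions below and the src/testResources/java location are assumptions based on the Micronaut Gradle plugin defaults, not taken from the question:
plugins {
    // assumed plugin versions, adjust to your setup
    id 'io.micronaut.application' version '3.6.2'
    id 'io.micronaut.test-resources' version '3.6.2'
}

dependencies {
    testResourcesImplementation platform("io.micronaut:micronaut-bom:3.6.1")
    testResourcesImplementation "io.micronaut.testresources:micronaut-test-resources-testcontainers"
}

// A provider extending AbstractTestContainersProvider would then live in the
// test resources source set, e.g. src/testResources/java.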
I just updated my Hibernate dependencies in my gradle build file from:
implementation 'org.hibernate:hibernate-core:5.4.12.Final'
implementation 'org.hibernate.validator:hibernate-validator:6.0.18.Final'
implementation 'org.hibernate:hibernate-c3p0:5.4.21.Final'
to:
implementation 'org.hibernate:hibernate-core:5.5.7.Final'
implementation 'org.hibernate.validator:hibernate-validator:7.0.1.Final'
implementation 'org.hibernate:hibernate-c3p0:5.5.7.Final'
I already saw that the validation API has been moved from javax.* to jakarta.*, and I guess it has something to do with that. However, I was not able to find out which dependencies are in conflict in this case and what I would have to change to make it compatible.
Can someone help me there?
I solved it by adding 'javax.validation:validation-api:2.0.1.Final' to my dependencies. Can anyone explain to me why this is explicitly required? Are parts of the validation API still in the javax package?
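For reference, a sketch of the resulting dependencies block with that addition (versions as in the question):
dependencies {
    implementation 'org.hibernate:hibernate-core:5.5.7.Final'
    implementation 'org.hibernate.validator:hibernate-validator:7.0.1.Final'
    implementation 'org.hibernate:hibernate-c3p0:5.5.7.Final'
    // workaround: hibernate-core 5.5 (non-jakarta) still references the javax.validation API
    implementation 'javax.validation:validation-api:2.0.1.Final'
}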
I ran into the same exception. It is thrown from the TypeSafeActivator class.
So, according to https://hibernate.org/orm/releases/5.5/:
Hibernate ORM 5.5 adds new artifacts with the artifact id suffix
"-jakarta" like hibernate-core-jakarta.
How can a task be associated to a specific dependency configuration?
If I look at the "23.5. Dependency management" section of the official Gradle Java plugin documentation, it states that, for example, the compileTestJava task uses the testCompile configuration.
I just wanted to know how I could achieve that.
Gradle creates these configurations automatically.
If you define a source set, a bunch of things get created by convention:
sourceSets {
thing
}
This will define the configurations thingCompile and thingRuntime, and the tasks compileThingJava, processThingResources, and thingClasses.
You might want to look at gradle tasks --all and gradle dependencies.
If you want to add dependencies to these configurations, it is usually preferable to use the generated ones.
You may of course also create your own configuration and have a generated one extend from it: configurations { thingCompile.extendsFrom(myConfig) }
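Putting that together, a small Groovy DSL sketch; the names thing and myConfig and the example coordinates are placeholders, and note that on newer Gradle versions the generated configurations are thingImplementation/thingRuntimeOnly rather than thingCompile/thingRuntime:
sourceSets {
    thing
}

configurations {
    myConfig
    // let the generated configuration pick up everything declared on the custom one
    thingCompile.extendsFrom(myConfig)
}

dependencies {
    thingCompile 'com.google.guava:guava:18.0'   // declared directly on the generated configuration
    myConfig 'junit:junit:4.12'                  // also ends up on thingCompile via extendsFrom
}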
My project depends on the Pax Exam framework. I declare, among others, these dependencies on Pax (PAX_EXAM_VERSION = 3.4.0):
compile group: 'org.ops4j.pax.exam', name: 'pax-exam-junit4', version: PAX_EXAM_VERSION
compile group: 'org.ops4j.pax.exam', name: 'pax-exam-container-native', version: PAX_EXAM_VERSION
Both of these depend on org.ops4j.pax.exam:pax-exam-spi, which is the module causing my issue.
So, when I try to build my project, the error reported is this one:
Could not resolve org.ops4j.pax.exam:pax-exam-spi:3.4.0
...
Could not parse POM http://repo.maven.apache.org/maven2/org/ops4j/pax/exam/pax-exam-spi/3.4.0/pax-exam-spi-3.4.0.pom
Unable to resolve version for dependency 'com.google.guava:guava:jar'
I have tried:
Excluding pax-exam-spi from the transitive dependencies of the modules I depend on (but note I still need its classes in order to compile), adding Guava to my first-level dependencies, and then making pax-exam-spi a first-level dependency with transitive = false (this doesn't work; same problem as above; see the sketch below).
The same as above, but instead of setting transitive = false, using artifact-only notation, like this:
compile "org.ops4j.pax.exam:pax-exam-spi:${PAX_EXAM_VERSION}#jar"
I know the root of the problem is that the Guava version is not declared in the pax-exam-spi POM, but in its parent, exam, which only declares the Guava version(s) to use in two different profiles' dependencyManagement sections (this works in Maven because one of the profiles is activated if the property glassfish.release is NOT set, and the other if that property IS set). However, knowing this has not been useful so far :(
Please let me know if there's a not-so-hacky way to make sure Gradle includes the pax-exam-spi jar on my classpath but does not even try to parse its POM (in particular, referring to a hard-coded path to the jar is out of the question!).