How to merge source sets while sharing dependencies with each other - java

I'd like to publish a library with two different API versions where both use the same core code underneath. I tried shading/shadowing but struggled to get the visibility right (I'd like to hide the core code from the API user). So I wanted to achieve my goals by having different source sets and configurations:
sourceSets {
    // the `main` source set acts as the common code base for `api` and `api2`
    api {
        java {
            srcDir 'src/api/java'
            // Includes classes from `main`:
            compileClasspath += sourceSets.main.output
            runtimeClasspath += sourceSets.main.output
        }
    }
    api2 {
        java {
            srcDir 'src/api2/java'
            // Includes classes from `main`:
            compileClasspath += sourceSets.main.output
            runtimeClasspath += sourceSets.main.output
        }
    }
}
configurations {
    common {
        canBeResolved = true
        canBeConsumed = false
    }
    // These are the configurations used both for consumption via `project(...)` and for publishing:
    exposedApi {
        canBeResolved = true
        canBeConsumed = true
        extendsFrom common
    }
    exposedApi2 {
        canBeResolved = true
        canBeConsumed = true
        extendsFrom common
    }
}
task apiJar(type: Jar) {
    group = 'build'
    from configurations.exposedApi
    baseName = 'api'
}

task api2Jar(type: Jar) {
    group = 'build'
    from configurations.exposedApi2
    baseName = 'api2'
}
publishing {
    publications {
        api(MavenPublication) {
            artifact apiJar
            artifactId 'mylib-api'
        }
        api2(MavenPublication) {
            artifact api2Jar
            artifactId 'mylib-api2'
        }
    }
}
dependencies {
    common sourceSets.main.output
    exposedApi sourceSets.api.output
    exposedApi2 sourceSets.api2.output
}
If I want to use one of these APIs I can simply depend on project(path: ':mylib', configuration: 'exposedApi2') or on one of the published Maven artifacts, and it works nicely.
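For reference, the consumer side would then look roughly like this (the consuming project's configuration name is an assumption):

// in the consuming project's build.gradle
dependencies {
    implementation project(path: ':mylib', configuration: 'exposedApi2')
}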
But as soon as I change classes in the main source set to internal in order to achieve proper encapsulation of the main code, the API code won't compile anymore:
Cannot access 'SomeClassInMain': it is internal in '' (<-- yes, it really shows nothing in the '')
I also tried to merge the source sets into one, so there is technically no separate main source set anymore:
sourceSets {
    api {
        java {
            srcDirs('src/api/java', 'src/main/java')
        }
    }
    api2 {
        java {
            srcDirs('src/api2/java', 'src/main/java')
        }
    }
}
That works as intended: no compilation errors, calls from the API into main work as expected, and the classes in main can even keep internal visibility. But unfortunately IntelliJ does not pick up the fact that the classes in main are really part of the same source set. I get an error (Unresolved reference: SomeClassInMain) in the IDE every time I mention a class from the main sources, and of course auto-completion doesn't work either, which makes the solution not really practical in the end.
So just to sum up the goal:
- it's important that the main sources are accessible to the API
- but not to the user of the API (or of the Maven publication) – the only thing the user should be facing is the API
- if possible, I'd like to not put the API and main code in separate, separately published modules, for encapsulation reasons
- I tried a shading/shadowing (fat/uber JAR) approach but haven't managed to reduce the visibility of the main sources to internal
I'm new to the topic of these complicated kinds of build configurations, so maybe I have simply chosen the wrong approach. Maybe there's a better one that I haven't yet managed to find?
Many, many thanks in advance!

Related

Package dependencies into jar

I'm trying to develop a game plugin (Old School RuneScape). I want to add in org.json so that it's easier to read/write game states and stuff, but I can't seem to figure out how to get it to package org.json with my plugin. It compiles fine, but doesn't run with that package. Any help?
This is what my plugin.gradle.kts looks like
version = "4.0.0"
project.extra["PluginName"] = "Plugin Name"
project.extra["PluginDescription"] = "Misc QOL fixes I wanted"
repositories{
mavenCentral()
}
dependencies{
// https://mavenlibs.com/maven/dependency/org.json/json
compileOnly(group = "org.json", name = "json", version = "20220320")
}
tasks {
    jar {
        manifest {
            attributes(
                mapOf(
                    "Plugin-Version" to project.version,
                    "Plugin-Id" to nameToId(project.extra["PluginName"] as String),
                    "Plugin-Provider" to project.extra["PluginProvider"],
                    "Plugin-Description" to project.extra["PluginDescription"],
                    "Plugin-License" to project.extra["PluginLicense"]
                )
            )
        }
    }
}
edit: I tried compileOnly, implementation, testImplementation, all with the same error: "ClassNotFoundException: org.json.JSONObject"
You are using the wrong configuration: you should use "implementation" instead of "compileOnly", as per this documentation.
The gist of it is that "compileOnly" means these libraries are only needed at compile time, and not at runtime, so they are not included in the jar, as the jar is used at runtime. The "implementation" configuration means these libraries are needed both at compile time and at runtime. Alternatively, you could also use "runtimeOnly" to indicate the package is only needed at runtime, but I don't know if that would work with your project.
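A minimal sketch of that change in plugin.gradle.kts. Note that the plain jar task never bundles dependency classes on its own, so if the plugin loader does not put implementation dependencies on the runtime classpath for you, merging the runtime classpath into the jar (second block; an assumption about your loader, not a confirmed requirement) is a common workaround:

dependencies {
    // https://mavenlibs.com/maven/dependency/org.json/json
    implementation(group = "org.json", name = "json", version = "20220320")
}

tasks.jar {
    // Only needed if the loader expects dependency classes inside the plugin jar itself:
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    from(configurations.runtimeClasspath.get().map { if (it.isDirectory) it else zipTree(it) })
}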

Problem when auto generating gRPC stub files using Gradle

I've been asked to implement some gRPC classes for a college course, and have run into some problems when generating the java classes from one source proto file.
Some background first: it's a fairly basic service, with a simple method that receives an id and returns a phone and an email. This is the proto file (BuscarData means FetchData, sorry for the leftover untranslated name!):
syntax = 'proto3';

option java_multiple_files = true;
option java_generic_services = true;

package uy.edu.um.sd20;

message DataRequest {
    int32 id = 1;
}

message DataResponse {
    string phone = 1;
    string email = 2;
}

service DataRepo {
    rpc BuscarData (DataRequest) returns (DataResponse);
}
The idea I had was to generate the classes with gradle plugins. My build.gradle:
plugins {
    id 'java'
    id "com.google.protobuf" version '0.8.8'
}

apply plugin: 'java'

group 'org.example'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

dependencies {
    testCompile group: 'junit', name: 'junit', version: '4.12'
    compile group: 'com.google.protobuf', name: 'protobuf-java', version: '3.11.4'
    implementation 'io.grpc:grpc-netty-shaded:1.29.0'
    implementation 'io.grpc:grpc-protobuf:1.29.0'
    implementation 'io.grpc:grpc-stub:1.29.0'
}

sourceSets {
    main {
        proto {
            srcDir 'src/main/proto'
        }
        java {
            srcDirs 'src/main/java', 'generated-sources/main/java'
        }
    }
}

protobuf {
    protoc {
        artifact = 'com.google.protobuf:protoc:3.11.0'
    }
    plugins {
        grpc {
            artifact = 'io.grpc:protoc-gen-grpc-java:1.29.0'
        }
    }
    generateProtoTasks.generatedFilesBaseDir = 'generated-sources'
    generateProtoTasks {
        all().each { task ->
            // Here you can configure the task
        }
        ofSourceSet('main')
    }
}
From what I understood, everything's there: the grpc and protoc dependencies, the plugin that enables protoc to compile gRPC (protoc-gen-grpc), and where to deposit the generated files.
However, there are two problems:
- the generated sources are not marked as a source root or anything like that, meaning they cannot be imported from other classes
- if I'm not mistaken, the generated sources should include a skeleton of DataRepoImpl so that I can add the code needed for BuscarData. However, they didn't. Or maybe I should create it myself, extending from DataRepo.java, but I couldn't test that due to problem n°1.
I've added a screenshot of the project file structure (not reproduced here).
As you can see, quite a lot (if not all) of the Gradle settings are copy-pasted and scavenged from many different websites. I hope I was careful enough not to repeat any imports. There are similar questions, and I tried the solutions there, but no luck. One example, from which I learned I had to include the gen-grpc plugin: another SO question
Any tip regarding anything else is welcome! I'm new to stackoverflow question-asking, so I may have made mistakes regarding the specificity or aim of the question.
Thanks!
Franri.
For 1), the plugin should feed the generated files as input to the Java compile tasks even if you do not explicitly add 'generated-sources/main/java' in the sourceSets configuration. Version 0.8.8 has been around for a while; you can try a newer version, as there may have been minor fixes for things you are hitting.
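For example, simply requesting a newer release of the plugin (the version below is just an example of a later release, not a specific recommendation):

plugins {
    id 'java'
    id "com.google.protobuf" version "0.8.18"
}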
For 2), you did not add the grpc plugin to each generateProto task. It should be
generateProtoTasks {
    all().each { task ->
        task.plugins { grpc {} }
    }
    ofSourceSet('main')
}
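Putting both points together, the protobuf block would look roughly like this (artifact versions carried over from the question):

protobuf {
    protoc {
        artifact = 'com.google.protobuf:protoc:3.11.0'
    }
    plugins {
        grpc {
            artifact = 'io.grpc:protoc-gen-grpc-java:1.29.0'
        }
    }
    generateProtoTasks {
        all().each { task ->
            // Attach the grpc plugin so the service base classes
            // (e.g. a DataRepoGrpc stub with an abstract DataRepoImplBase) are generated:
            task.plugins { grpc {} }
        }
    }
}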

How to make Gradle fail the build if a file dependency is not found?

I have a Gradle build that has some dependencies of the form
compile files('path/to/local/lib.jar')
(the build is being migrated - eventually these will be replaced)
The build failed because one of these paths was incorrectly specified. But it failed due to a compile error - it looked like Gradle silently ignored the missing dependency.
Is there a simple option or switch that will force Gradle to fail the build if any dependency (particularly local file dependencies) cannot be resolved (e.g., file missing)?
Edit: to clarify further:
If a dependency cannot be found in the configured repositories, Gradle will fail the build when attempting to resolve it, as expected.
BUT - if a dependency is defined as "compile files ....", and the file specified does not exist at build time, Gradle will IGNORE that error, and attempt compilation anyway. That seems spectacularly wrong-headed and inconsistent default behaviour.
My question is - is there a Gradle option or switch or environment variable or system property that I can set to force Gradle to verify that file dependencies exist? (E.g., behave in a sane and rational way?)
This is a bit of an old thread, but given that none of the currently proposed solutions actually works, and the solution appears to be trivial (collating two of them), I am leaving it here for future reference.
The point here is that we simply want to ensure that the files do exist, so we can just use the exists() method of the File class:
task ensureDepsExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        Set<File> impFiles = configurations.implementation.resolve()
        impFiles.forEach { f ->
            if (!f.exists()) {
                ant.fail "${f} could not be found"
            }
        }
    }
}

compileJava.dependsOn ensureDepsExist
The canBeResolved() call is required, or Gradle will complain that configurations dependencies cannot be resolved.
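A variant of the same idea that avoids flipping the flag on implementation (which Gradle marks non-resolvable by design) is to walk a configuration that is already resolvable, such as compileClasspath. A sketch with a hypothetical task name; adjust the configuration name to whatever your build compiles against:

task ensureResolvableDepsExist {
    doLast {
        configurations.compileClasspath.resolve().each { f ->
            if (!f.exists()) {
                // Fail fast with a clear message instead of a confusing compile error later
                throw new GradleException("${f} could not be found")
            }
        }
    }
}
compileJava.dependsOn ensureResolvableDepsExist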
Here's how you can check transitive dependencies using Gradle 7.3 (example: fail if the project depends on log4j directly or transitively).
Kotlin DSL
configurations {
    all {
        resolutionStrategy {
            eachDependency {
                if (requested.name == "log4j") {
                    throw RuntimeException("Project depends on log4j")
                }
            }
        }
    }
}
Groovy DSL
configurations.all {
    resolutionStrategy.eachDependency { DependencyResolveDetails details ->
        if (details.requested.name == 'log4j') {
            throw new RuntimeException("Project depends on log4j")
        }
    }
}
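With either DSL, the check runs whenever the configuration is resolved, so a matching dependency fails the build at resolution time instead of surfacing as a later error.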
You could do something as shown below. It is not a built-in Gradle feature but does not require code to check each dependency specifically (it checks everything in the compile configuration):
apply plugin: 'java'

dependencies {
    compile files('lib/abc.jar')
    compile files('lib/def.jar')
}

task checkDependencies() {
    doLast {
        configurations.compile.each { file ->
            assert file.exists()
        }
    }
}

compileJava.dependsOn checkDependencies
To fail the build you can:
ant.fail('message why it failed')
Then you can craft a condition and fail the build with a nice message ;)
I would suggest creating a task that first brings the file into the project, with a condition that checks whether the file is available; if not, throw a Gradle exception and fail the build with a message, and execute that task first in the execution phase.
I have no chance to test it now, but it could be something like this; correct me if any syntax is wrong, but you should get the idea.
def yourDep = file($/\path\to\your\dependency/$)

task bringDeps {
    doLast {
        if (yourDep.exists()) {
            copy {
                from yourDep
                into "$projectDir/depsOrSmthg"
            }
        } else {
            ant.fail('message why it failed')
        }
    }
}
task ensureDependenciesExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        DependencySet deps = configurations.implementation.getDependencies()
        Set<File> impFiles = configurations.implementation.resolve()
        deps.each { d ->
            boolean depWasResolved = impFiles.any { impFile -> impFile.name.find(".*${d.name}.*${d.version}") }
            if (!depWasResolved) {
                println "${d} was not resolved"
                assert depWasResolved
            }
        }
    }
}

compileJava.dependsOn ensureDependenciesExist

Gradle generates Querydsl metadata twice via different annotation processors

I have a Gradle build script. I want said script to generate QueryDSL metadata, and that metadata should be generated under the build/generated-sources/metamodel folder.
The problem I am facing at the moment is that the metamodel is being generated not just once, but twice. Along with the desired target, it is also generated in the "default" build/classes/..., resulting in a "duplicate class" error.
sourceSets {
    generated.java.srcDirs = ['build/generated-sources/metamodel']
    main {
        java { srcDir 'src/main/java' }
    }
    test {
        java { srcDir 'src/main/test' }
    }
}
configurations { querydslapt }

dependencies {
    compile 'org.hibernate:hibernate-entitymanager:5.2.3.Final',
            'org.hibernate.javax.persistence:hibernate-jpa-2.1-api:1.0.0.Final-redhat-1',
            'com.querydsl:querydsl-jpa:4.1.3'
    // ... others, non-hibernate/querydsl ...
    querydslapt 'com.querydsl:querydsl-apt:4.1.3'
}
task generateSources(type: JavaCompile, group: 'build', description: 'Generates the QueryDSL query types') {
    source = sourceSets.main.java
    classpath = configurations.compile + configurations.querydslapt
    options.compilerArgs = ['-proc:only',
                            '-processor', 'com.querydsl.apt.hibernate.HibernateAnnotationProcessor']
    destinationDir = sourceSets.generated.java.srcDirs.iterator().next()
}

compileJava {
    dependsOn generateSources
    source generateSources.destinationDir
}
According to the Gradle trace, the problem appears to be that there are two annotation processors in the mix: first the HibernateAnnotationProcessor, and second a JPAAnnotationProcessor, which eventually generates the duplicate class. And I can't figure out why; the build script looks ok-ish. I know it might be guesswork, but I am grateful for any suggestions. I even cleaned my Gradle cache, just in case. It might not even be a pure build-script-related issue; the behavior persists even if I run the script via console.
Gist, basically exactly what I "should" need
(older) Post regarding this issue
This thread's solution works for me. The idea is to hook the annotation processor into javac; the HibernateAnnotationProcessor can be declared via compilerArgs, roughly like:
dependencies {
    compile 'org.hibernate:hibernate-entitymanager:5.2.3.Final',
            'org.hibernate.javax.persistence:hibernate-jpa-2.1-api:1.0.0.Final-redhat-1',
            'com.querydsl:querydsl-jpa:4.1.4',
            'com.querydsl:querydsl-apt:4.1.4'
    // other
}

ext {
    generatedSourcesDir = file("build/generated-sources/metamodel")
}

sourceSets {
    main {
        java {
            srcDir 'src/main/java'
            srcDir generatedSourcesDir
        }
    }
    test {
        java { srcDir 'src/main/test' }
    }
}

compileJava {
    doFirst {
        generatedSourcesDir.mkdirs()
    }
    options.compilerArgs += ['-s', generatedSourcesDir,
                             '-processor', 'com.querydsl.apt.hibernate.HibernateAnnotationProcessor']
}
But I still wonder why the first approach does not work (runs two annotation processors), so any idea is still highly appreciated.
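One unverified guess: javac auto-discovers any annotation processor that ends up on the compile classpath (via its META-INF/services registration), in addition to the processor named explicitly in the generation task. If that is what happens here, disabling annotation processing for the main compilation in the first setup should confirm it. A diagnostic sketch, not a tested fix:

compileJava {
    dependsOn generateSources
    source generateSources.destinationDir
    // Diagnostic only: suppress all annotation processing during the main
    // compilation so nothing can generate the metamodel a second time.
    options.compilerArgs += ['-proc:none']
}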

How to add custom Antlr output path to main sourceset in Gradle?

So, I'm new to Gradle and Java in general and having quite a few problems. Because of some other weird difficulties with IntelliJ, I want to change the path that ANTLR outputs the generated code to. This was easy to change:
generateGrammarSource {
    outputDirectory = file("src/temp/generated-code")
}
However, now I'm having great difficulty actually getting it to compile into my "main" and "test" source sets. I basically just want to extend the main and test source sets to include these files. I tried doing that with something like:
sourceSets {
    generated {
        java {
            srcDir 'src/temp/generated-code'
        }
    }
    main {
        compileClasspath += generated.output
        runtimeClasspath += generated.output
    }
    test {
        compileClasspath += generated.output
        runtimeClasspath += generated.output
    }
}
However, doing this doesn't give the compilation of the generated code access to the dependencies. So compilation fails because it can't use any of the stuff in the antlr packages.
Is there any easy way to add these dependencies, OR, just force the main and test source sets to somehow include the generated code?
I ended up figuring this out in a deceptively easy way:
sourceSets {
    main {
        java {
            srcDirs = ["src/main/java", "src/temp/generated-code"]
        }
    }
}
Though I did have to add this for proper clean up:
clean.doFirst {
    delete "src/temp"
}
I feel like there is probably a better way to do it than passing these path names around everywhere, but it seems to work fine.
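One such variant, sketched here without having verified it against the ANTLR plugin's task wiring, keeps the generated code under build/ so the standard clean task removes it without the extra hook (antlrGeneratedDir is just a local name for illustration):

def antlrGeneratedDir = file("$buildDir/generated-src/antlr/main")

generateGrammarSource {
    outputDirectory = antlrGeneratedDir
}

// Compile the generated code as part of the main source set:
sourceSets.main.java.srcDir antlrGeneratedDir

// Make sure the grammar task runs before the compilation that consumes its output:
compileJava.dependsOn generateGrammarSource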
