How to add custom Antlr output path to main sourceset in Gradle? - java

So, I'm new to Gradle and Java in general and having quite a few problems. Because of some other weird difficulties with IntelliJ, I want to change the path that ANTLR outputs the generated code to. This was easy to change:
generateGrammarSource {
    outputDirectory = file("src/temp/generated-code")
}
However, now I'm having great difficulty actually getting it to compile into my "main" and "test" source sets. I basically just want to extend the main and test source sets to include these files. I tried doing that with something like:
sourceSets {
    generated {
        java {
            srcDir 'src/temp/generated-code'
        }
    }
    main {
        compileClasspath += generated.output
        runtimeClasspath += generated.output
    }
    test {
        compileClasspath += generated.output
        runtimeClasspath += generated.output
    }
}
However, doing this doesn't give the compilation of the generated code access to the project's dependencies. So compilation fails because it can't use any of the stuff in the antlr packages.
Is there any easy way to add these dependencies, OR, just force the main and test source sets to somehow include the generated code?
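One way to attack the dependency half of this is a sketch along the following lines, assuming the java plugin and a source set named generated as above; the configuration names follow Gradle's <sourceSet>Implementation convention, so treat them as an assumption to check against your Gradle version:

```groovy
// Hypothetical sketch: give the `generated` source set the same
// dependencies as `main` by extending main's configurations.
configurations {
    generatedImplementation.extendsFrom implementation
    generatedRuntimeOnly.extendsFrom runtimeOnly
}
```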

I ended up figuring this out in a deceptively easy way:
sourceSets {
    main {
        java {
            srcDirs = ["src/main/java", "src/temp/generated-code"]
        }
    }
}
Though I did have to add this for proper clean up:
clean.doFirst {
    delete "src/temp"
}
I feel like there is probably a better way to do it than passing these path names around everywhere, but it seems to work fine.
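A variant of the same idea that avoids repeating the path literal is a sketch like the one below, assuming the ANTLR plugin's generateGrammarSource task as configured in the question; in recent Gradle versions, passing the task to srcDir should also give compileJava an implicit dependency on it:

```groovy
sourceSets {
    main {
        java {
            // The task's output directory becomes a source dir, and the
            // task dependency is carried along automatically.
            srcDir generateGrammarSource
        }
    }
}
```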

Related

How to merge source sets while sharing dependencies to each other

I'd like to publish a library with two different API versions where both use the same core code underneath. I tried shading/shadowing but struggled to get the visibility right (I'd like to hide the core code from the API user). So I wanted to achieve my goals by having different source sets and configurations:
sourceSets {
    // the `main` source set acts as the common code base for `api` and `api2`
    api {
        java {
            srcDir 'src/api/java'
            // Includes classes from `main`:
            compileClasspath += sourceSets.main.output
            runtimeClasspath += sourceSets.main.output
        }
    }
    api2 {
        java {
            srcDir 'src/api2/java'
            // Includes classes from `main`:
            compileClasspath += sourceSets.main.output
            runtimeClasspath += sourceSets.main.output
        }
    }
}
configurations {
    common {
        canBeResolved = true
        canBeConsumed = false
    }
    // These are the configurations used both for being consumed with `project(...)` or published:
    exposedApi {
        canBeResolved = true
        canBeConsumed = true
        extendsFrom common
    }
    exposedApi2 {
        canBeResolved = true
        canBeConsumed = true
        extendsFrom common
    }
}
task apiJar(type: Jar) {
    group = 'build'
    from configurations.exposedApi
    baseName = 'api'
}
task api2Jar(type: Jar) {
    group = 'build'
    from configurations.exposedApi2
    baseName = 'api2'
}
publishing {
    publications {
        api(MavenPublication) {
            artifact apiJar
            artifactId 'mylib-api'
        }
        api2(MavenPublication) {
            artifact api2Jar
            artifactId 'mylib-api2'
        }
    }
}
dependencies {
    common sourceSets.main.output
    exposedApi sourceSets.api.output
    exposedApi2 sourceSets.api2.output
}
If I want to use one of these APIs I can easily use project(path: ':mylib', configuration: 'exposedApi2') or use one of the published Maven artifacts and it works nicely.
But as soon as I change classes in the main source set to internal in order to achieve proper encapsulation of the main code, the API code won't compile anymore:
Cannot access 'SomeClassInMain': it is internal in '' (<-- yes, it really shows nothing in the '')
I also tried to merge the source sets into one, so there is technically no real main source set anymore:
sourceSets {
    api {
        java {
            srcDirs('src/api/java', 'src/main/java')
        }
    }
    api2 {
        java {
            srcDirs('src/api2/java', 'src/main/java')
        }
    }
}
That now works as intended: no compilation errors, calls from the API to main work as expected, and the classes in main even keep their internal visibility. But unfortunately IntelliJ does not seem to pick up the fact that the classes in main are really part of the same source set. I get an error (Unresolved reference: SomeClassInMain) in the IDE every time I mention a class from the main sources, and of course no auto-completion works either, making the solution not really practical in the end.
So just to sum up the goal:
- it's important that the main sources are accessible to the API
- but not to the user using the API (or the Maven publication) – the only thing the user should be facing is the API
- if possible, I'd like to not put the API and main code in separate modules and publish them separately, for encapsulation reasons
- I tried a shading/shadowing (fat/uber JAR) approach but I haven't managed to reduce the visibility to internal in the main sources
I'm new to the topic of these complicated kinds of build configurations so maybe I simply have chosen the wrong approach. Maybe there's a better one which I haven't yet managed to find?
Many, many thanks in advance!
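For the encapsulation goal specifically, one commonly suggested alternative is sketched below. It does mean a separate core module, which the question hopes to avoid, so treat it as one option rather than a definitive answer: with the java-library plugin, an implementation dependency keeps the core module off the consumer's compile classpath, so API users cannot reference core classes directly.

```groovy
// mylib-api/build.gradle -- hypothetical two-module layout
plugins {
    id 'java-library'
}
dependencies {
    // `implementation` (as opposed to `api`) hides :mylib-core from
    // consumers' compile classpath; it is still present at runtime.
    implementation project(':mylib-core')
}
```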

gradle javaexec error "Resolving configuration 'apiElements' directly is not allowed" - Gradle 5.4.1

I am new to Gradle and trying to migrate an existing system build from Ant to Gradle.
As part of this I need to run a Java program on every file in a directory. The directory contains XML files; the Java code parses each .xml file, performs some business-specific transformation, and writes a corresponding .java file (these Java files are then compiled into classes and packaged in the final jar).
Below is the function I wrote in my Gradle build:
private runJavaFile(String dirPath) {
    FileTree tree = fileTree(dir: dirPath, include: '**/*.xml')
    tree.each {
        def xmlfile = it.path
        def javaFile = it.path.replaceFirst(".xml", ".java")
        javaexec { // getting error on this line
            classpath configurations.all
            main = 'XmlToJavaParser'
            args = ["$xmlfile", "$javaFile", 'Java']
        }
    }
}
I am calling this function from a Gradle task, passing the path of the directory that contains the XML files to be parsed.
While running the task, I get the error below:
> Resolving configuration 'apiElements' directly is not allowed
Any help would be appreciated.
Let me know if any more information is needed.
In Gradle, a configuration represents a group of artifacts and their dependencies. You typically have several configurations depending on what you want to do. For instance, you could have one where you declare which dependencies are needed for compilation, which are only needed at runtime, or which are needed for running a particular Java application.
In your case, you are saying that the classpath to the XmlToJavaParser class is "all configurations combined" and that doesn't really make sense. You are also not allowed to do that as some configurations from the Java plugin are not resolvable like this, which is why you get an error.
So to fix it, you should declare your own configuration for XmlToJavaParser. You can then declare dependencies for it like you normally do. Example (using the Groovy DSL):
configurations {
    xmlJavaParser {
        canBeResolved = true
        canBeConsumed = false
    }
}
dependencies {
    xmlJavaParser "org.example:xml-java-parser:1.0" // or whatever you need
}
private runJavaFile(String dirPath) {
    // ...
    javaexec {
        classpath = configurations.xmlJavaParser // The configuration is referenced here
        main = 'XmlToJavaParser'
        args = ["$xmlfile", "$javaFile", 'Java']
    }
}
There are also other ways to go about it. But the main point is to not use configurations.all as a classpath.
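One of those other ways is sketched below (XmlToJavaParser and the xmlJavaParser configuration are the names used above; the task name is hypothetical): model the conversion as a dedicated JavaExec task rather than calling javaexec from a helper function, which also lets Gradle wire the task into the build graph.

```groovy
task convertXml(type: JavaExec) {
    // Resolvable configuration declared as in the answer above,
    // instead of the unresolvable `configurations.all`.
    classpath = configurations.xmlJavaParser
    main = 'XmlToJavaParser'
    args 'some/file.xml', 'some/file.java', 'Java'
}
```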

How to make Gradle fail the build if a file dependency is not found?

I have a Gradle build that has some dependencies of the form
compile files('path/to/local/lib.jar')
(the build is being migrated - eventually these will be replaced)
The build failed because one of these paths was incorrectly specified. But it failed due to a compile error - it looked like Gradle silently ignored the missing dependency.
Is there a simple option or switch that will force Gradle to fail the build if any dependency (particularly local file dependencies) cannot be resolved (eg., file missing)?
Edit: to clarify further:
If a dependency cannot be found in the configured repositories, Gradle will fail the build when attempting to resolve them, as expected.
BUT - if a dependency is defined as "compile files ....", and the file specified does not exist at build time, Gradle will IGNORE that error, and attempt compilation anyway. That seems spectacularly wrong-headed and inconsistent default behaviour.
My question is - is there a Gradle option or switch or environment variable or system property that I can set to force Gradle to verify that file dependencies exist? (E.g,, behave in a sane and rational way?)
This is a bit of an old thread, but given that none of the currently proposed solutions actually works, and the solution appears to be trivial (collating two of them), I am leaving it here for future reference.
The point here is that we simply want to ensure that the files do exist, so we can just use the exists() method of the File class:
task ensureDepsExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        Set<File> impFiles = configurations.implementation.resolve()
        impFiles.forEach { f ->
            if (!f.exists()) {
                ant.fail "${f} could not be found"
            }
        }
    }
}
compileJava.dependsOn ensureDepsExist
The canBeResolved() call is required, or Gradle will complain that the configuration's dependencies cannot be resolved.
Here's how you can check transitive dependencies using Gradle 7.3 (example: Fail if the project depends on log4j directly or transitively).
Kotlin DSL
configurations {
    all {
        resolutionStrategy {
            eachDependency {
                if (requested.name == "log4j") {
                    throw RuntimeException("Project depends on log4j")
                }
            }
        }
    }
}
Groovy DSL
configurations.all {
    resolutionStrategy.eachDependency { DependencyResolveDetails details ->
        if (details.requested.name == 'log4j') {
            throw new RuntimeException("Project depends on log4j")
        }
    }
}
You could do something as shown below. It is not a built-in Gradle function but does not require code to check each dependency specifically (it checks all in the compile configuration):
apply plugin: 'java'
dependencies {
    compile files('lib/abc.jar')
    compile files('lib/def.jar')
}
task checkDependencies() {
    doLast {
        configurations.compile.each { file ->
            assert file.exists()
        }
    }
}
compileJava.dependsOn checkDependencies
To fail the build you can:
ant.fail('message why it failed')
Then you can craft a condition and fail the build with a nice message ;)
I would suggest creating a task that first brings the file into the project, with a condition that checks whether the file is available; if not, throw a Gradle exception and fail the build with a message. Execute that task first in the execution phase.
I have no chance to test it now but it could be something like this, correct me if any syntax is wrong - but you should get the idea.
def yourDep = file($/\path\to\your\dependency/$)
task bringDeps {
    doLast {
        if (yourDep.exists()) {
            copy {
                from yourDep
                into "$projectDir/depsOrSmthg"
            }
        } else {
            ant.fail('message why it failed')
        }
    }
}
task ensureDependenciesExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        DependencySet deps = configurations.implementation.getDependencies()
        Set<File> impFiles = configurations.implementation.resolve()
        deps.each { d ->
            boolean depWasResolved = impFiles.any { impFile -> impFile.name.find(".*${d.name}.*${d.version}") }
            if (!depWasResolved) {
                println "${d} was not resolved"
                assert depWasResolved
            }
        }
    }
}
compileJava.dependsOn ensureDependenciesExist

Gradle generates Querydsl metadata twice via different annotation processors

I have a Gradle build script. I want said script to generate QueryDSL metadata. That metamodel should be generated under the build/generated-sources/metamodel folder.
The problem I am facing at the moment is that the metamodel is not being generated once, but twice. Along with the desired target, it is also generated in the "default" build/classes/... location, resulting in a "duplicate class" error.
sourceSets {
    generated.java.srcDirs = ['build/generated-sources/metamodel']
    main {
        java { srcDir 'src/main/java' }
    }
    test {
        java { srcDir 'src/main/test' }
    }
}
configurations { querydslapt }
dependencies {
    compile 'org.hibernate:hibernate-entitymanager:5.2.3.Final',
            'org.hibernate.javax.persistence:hibernate-jpa-2.1-api:1.0.0.Final-redhat-1',
            'com.querydsl:querydsl-jpa:4.1.3'
    // ... others, non-hibernate/querydsl ...
    querydslapt 'com.querydsl:querydsl-apt:4.1.3'
}
task generateSources(type: JavaCompile, group: 'build', description: 'Generates the QueryDSL query types') {
    source = sourceSets.main.java
    classpath = configurations.compile + configurations.querydslapt
    options.compilerArgs = ['-proc:only',
                            '-processor', 'com.querydsl.apt.hibernate.HibernateAnnotationProcessor']
    destinationDir = sourceSets.generated.java.srcDirs.iterator().next()
}
compileJava {
    dependsOn generateSources
    source generateSources.destinationDir
}
According to the Gradle trace, the problem appears to be that there are two annotation processors in the mix: first the HibernateAnnotationProcessor, and second a JPAAnnotationProcessor, which eventually generates the duplicate class. And I can't figure out why; the build script looks ok-ish. I know it might be guesswork, but I am grateful for any suggestions. I even cleaned my Gradle cache, just in case. It might not even be a pure build-script-related issue, since the behavior persists even if I run the script via console.
Gist, basically exactly what I "should" need
(older) Post regarding this issue
This thread's solution works for me. The idea is to hook the annotation processor into javac itself; the HibernateAnnotationProcessor can be declared via compilerArgs, roughly like:
dependencies {
    compile 'org.hibernate:hibernate-entitymanager:5.2.3.Final',
            'org.hibernate.javax.persistence:hibernate-jpa-2.1-api:1.0.0.Final-redhat-1',
            'com.querydsl:querydsl-jpa:4.1.4',
            'com.querydsl:querydsl-apt:4.1.4'
    // other
}
ext {
    generatedSourcesDir = file("build/generated-sources/metamodel")
}
sourceSets {
    main {
        java {
            srcDir 'src/main/java'
            srcDir generatedSourcesDir
        }
    }
    test {
        java { srcDir 'src/main/test' }
    }
}
compileJava {
    doFirst {
        generatedSourcesDir.mkdirs()
    }
    options.compilerArgs += ['-s', generatedSourcesDir,
                             '-processor', 'com.querydsl.apt.hibernate.HibernateAnnotationProcessor']
}
But I still wonder why the first approach does not work (runs two annotation processors), so any idea is still highly appreciated.
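For what it's worth, on later Gradle versions the usual route is to drop the hand-rolled JavaCompile task and put the processor on the built-in annotationProcessor configuration. A sketch only: the jpa classifier selects QueryDSL's JPAAnnotationProcessor rather than the Hibernate one used above, so treat the coordinates and processor choice as assumptions to verify for your setup.

```groovy
dependencies {
    // Processor goes on the annotation processor path, not the compile classpath.
    annotationProcessor 'com.querydsl:querydsl-apt:4.1.4:jpa'
}
compileJava {
    // Direct generated sources to the metamodel folder instead of build/classes.
    options.annotationProcessorGeneratedSourcesDirectory = file("build/generated-sources/metamodel")
}
```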

Gradle: custom source set as dependency for the main and test ones

I've created custom source set in Gradle project to keep all generated code:
sourceSets {
    generated {
        java {
            srcDir 'src/generated/java'
        }
        resources {
            srcDir 'src/generated/resources'
        }
    }
}
I want to make the result of this source set's code compilation available at compile and run time for main and test source sets.
What's the right semantic way to do it in Gradle?
UPDATE:
The approach suggested in "How do I add a new sourceset to Gradle?" doesn't work for me; I still get java.lang.ClassNotFoundException when I launch my app (though compilation and unit tests run fine). Here is what I tried:
sourceSets {
    main {
        compileClasspath += sourceSets.generated.output
        runtimeClasspath += sourceSets.generated.output
    }
    test {
        compileClasspath += sourceSets.generated.output
        runtimeClasspath += sourceSets.generated.output
    }
}
sourceSets {
    main {
        compileClasspath += generated.output
        runtimeClasspath += generated.output
    }
}
Same for the test source set.
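One thing worth checking when the ClassNotFoundException only appears at launch: the classpath additions above affect compilation and test runs, but not what ends up in the application's jar or distribution. A minimal sketch of also packaging the generated output:

```groovy
jar {
    // Bundle the generated source set's compiled classes and resources
    // into the main jar so they exist on the runtime classpath at launch.
    from sourceSets.generated.output
}
```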
