Recreation of JPA metamodels - java

I want to use JPA meta-models in my project. I added the required dependency, configured generation on the JavaCompile task, and the meta-models are generated successfully. But if I try to run the code again, it doesn't compile. It fails with:
Error:java: Problem with Filer: Attempt to recreate a file for type project.models.AdministrationUser_
for every single meta-model. I am running it in IDEA as a Spring Boot run configuration. If I use the Gradle bootRun task it runs just fine, no problem, but I need the IDEA run configuration because I need to set active profiles. This also suggests that the problem is probably not in the code but somewhere in the run configuration; I have tried changing several things, but I'm just firing blanks.
I'm using Gradle 5.4.1, IDEA 2019.2 and Java 11.
Here are important parts of my build.gradle file:
dependencies {
    annotationProcessor("javax.xml.bind:jaxb-api")
    annotationProcessor("org.hibernate:hibernate-jpamodelgen")
}
tasks.withType(JavaCompile) {
    options.annotationProcessorGeneratedSourcesDirectory = file("src/generated/java")
}
sourceSets {
    generated {
        java {
            srcDirs = ['src/generated/java']
        }
    }
}
Something similar was already asked here, but one answer suggests deleting hibernate-jpamodelgen, which (if I understand it correctly) seems like an absurd solution, because it won't work if you delete it. The other answer suggests using some Maven plugin, so that's not an option for me either.
I've been stuck on this problem for a long time, I have no one to talk to about it, and I'm completely out of ideas, so I'm pretty desperate and any help will be much appreciated.

First, make some modifications to your build.gradle.
You are pointing to the wrong directory in the sourceSets section. Fix it:
sourceSets {
    generated {
        java {
            srcDirs = ['src/generated']
        }
    }
}
Then make further improvements:
With Gradle 5.2 and IntelliJ 2019.1, the annotationProcessor configuration is the only piece of configuration you need. See https://github.com/tbroyer/gradle-apt-plugin/blob/v0.21/README.md
Get rid of
tasks.withType(JavaCompile) {
    options.annotationProcessorGeneratedSourcesDirectory = file("src/generated/java")
}
sourceSets {
    generated {
        java {
            srcDirs = ['src/generated/java']
        }
    }
}
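With those two blocks removed, the processor output goes to Gradle's default generated-sources location under the build directory, and IntelliJ 2019.1+ picks it up on Gradle import (per the linked README). A minimal sketch of what should remain (versions omitted, as in the question):
dependencies {
    // the annotationProcessor entries alone are enough for Gradle 5.x to run
    // hibernate-jpamodelgen and generate the JPA meta-models
    annotationProcessor("javax.xml.bind:jaxb-api")
    annotationProcessor("org.hibernate:hibernate-jpamodelgen")
}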
Lastly, consider delegating your build and test tasks to Gradle (this is now the default). Go to File -> Settings -> Build, Execution, Deployment -> Build Tools -> Gradle -> Build and run using Gradle
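As an aside, if the only reason for preferring the IDEA run configuration is setting active profiles, they can also be passed to the Gradle bootRun task; a hedged sketch (the profile name dev is purely illustrative):
bootRun {
    // hypothetical profile name; forwards the active profiles to the Spring Boot application
    args = ['--spring.profiles.active=dev']
}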

Related

JHipster - Where to place additional modules/plugins to be included in build.gradle

I've created a simple app based on JHipster and I had to add some libraries to build.gradle to develop what I wanted. That worked correctly. Then I had to regenerate my app, since I wanted to add/remove features of JHipster itself. Because my build.gradle was modified and JHipster wanted to regenerate it from its source, I was wondering if there is a place where I can put my stuff so that it would still be there whenever I have to regenerate again.
In the files generated by JHipster one can spot lines like:
//jhipster-needle-gradle-plugins - JHipster will add additional gradle plugins here
So, if I understand it correctly, JHipster has an additional "registry" of modifications which should be applied to the original template of the file. But where is it, and how do I use it so that JHipster knows that something, although modified from the source, is OK and should stay (I mean in the diff view)?
Let's say I need an additional plugin for JavaCC, which would be:
id "ca.coglinc2.javacc" version "3.0.0"
and an additional section of build.gradle like this:
compileJavacc {
    arguments = [
        grammar_encoding: 'UTF-8',
        static: "false",
        debug_parser: "false",
        debug_lookahead: "false"
    ]
    inputDirectory = file('src/main/javacc')
    outputDirectory = file('src/main/generated/javacc')
}
compileJava.dependsOn processResources,compileJavacc
bootWar.dependsOn war
processResources.dependsOn cleanResources,bootBuildInfo
bootBuildInfo.mustRunAfter cleanResources
sourceSets {
    main {
        java {
            srcDirs 'src/main/java'
            srcDirs 'src/main/generated/javacc'
        }
    }
    test {
        java {
            srcDirs 'src/test/java'
        }
    }
}
Where should I put this so that whenever I have to regenerate the code it is still there and not removed by JHipster?
There is basically no place inside build.gradle where you can put your code. What works quite well in most cases is to create your own Gradle script file (like custom.gradle) and just apply that file from the main build.gradle (there are already some apply from ... lines there).
That way you can quite easily manage your custom settings even when JHipster wants to overwrite build.gradle.
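For example, the JavaCC configuration from the question could be moved into such a script. A hedged sketch, with gradle/custom.gradle as a purely illustrative file name:
// build.gradle: the only line to re-add after each regeneration
apply from: 'gradle/custom.gradle'

// gradle/custom.gradle
// Note: the plugins { id "ca.coglinc2.javacc" version "3.0.0" } entry still has to be merged
// into the main build.gradle, because applied scripts cannot use the versioned plugins DSL.
compileJavacc {
    inputDirectory = file('src/main/javacc')
    outputDirectory = file('src/main/generated/javacc')
}
compileJava.dependsOn compileJavacc
sourceSets.main.java.srcDirs += 'src/main/generated/javacc'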

Mapstruct AnnotationProcessor with IntelliJ and Gradle

I am trying to get the Mapstruct annotation processor to work in IntelliJ in a Gradle project.
Ideally, I would expect all configuration to be in the Gradle file, so that anyone could just import the project into IntelliJ and get a complete setup without having to set any preferences manually.
But I am okay with compromises on that.
I am using IntelliJ 2018.3 and Gradle 5.0 with Java 11 (i.e. the latest and greatest). The Mapstruct version is 1.2.0.FINAL.
What I have done:
Configured the Mapstruct annotation processor in my build.gradle:
compile "org.mapstruct:mapstruct-jdk8:${mapstruct_version}"
annotationProcessor "org.mapstruct:mapstruct-processor:${mapstruct_version}"
Selected "Delegate IDE build/run actions to Gradle" in the Preferences under "Build, Execution, Deployment -> Build Tools -> Gradle -> Runner"
In the directory build/classes/java/main/com/myapp/mypackage/mapper/ I see a MyMapperImpl.class and a MyMapperImpl.java, so code generation seems to work.
Now I would expect that when I select my annotated abstract MyMapper class and press Ctrl+H, the generated MyMapperImpl appears in the hierarchy view.
If I manually mark build/classes/java/main/ as a "generated sources" directory (which I really don't want to have to do, see above), the class still does not appear in the hierarchy. But the source code is marked with a lot of errors, as no classes from my project are found, apparently.
Needless to say: I can flawlessly run tests that use the mapper, both from IntelliJ and the command line.
Use this; my team is also using MapStruct and we have it in our build.gradle. You will need to bring in the idea plugin for Gradle as well:
def generatedSources = "$buildDir/generated"
def generatedOutputDir = file("$generatedSources")
/*
create generated .java files in a different folder than classes
In IntelliJ 2016.3.x: Enable Annotation Processing, then set generated sources,
relative to module output dir, at path '../../generated'
*/
compileJava {
    doFirst {
        generatedOutputDir.exists() || generatedOutputDir.mkdirs()
        options.compilerArgs = [
            '-s', "${generatedSources}"
        ]
    }
}
idea {
    module {
        downloadSources = true
        // tell IntelliJ where to find generated sources
        sourceDirs += generatedOutputDir
    }
}
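The idea { module { ... } } block above requires the Gradle idea plugin to be applied; a minimal sketch of that, assuming the plugins DSL is used at the top of build.gradle:
plugins {
    id 'java'
    id 'idea' // needed for the idea { module { ... } } configuration above
}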
You will be able to run your code even without the Gradle runner with this workaround.

Gradle: Add dependency from Java to Native compilation

I am trying to set up Gradle for a proper JNI compilation, so I first need to build a shared library (with the c plugin), and then compile and test the Java code (which consumes the library).
Here is a sample of the build.gradle related to the native compilation:
model {
    components {
        yli(NativeLibrarySpec) {
            sources {
                c {
                    source {
                        srcDir 'src/main/c'
                        include "Yli.c"
                        commonFolders.each {
                            include "$it/**/*.c"
                        }
                    }
                }
            }
            buildTypes {
                release
            }
        }
    }
}
What is the best way to tell Gradle that compileJava should wait for the build of the NativeLibrarySpec?
Edit: When I try to add
compileJava.dependsOn(yliSharedLibrary)
I get the following error during gradle build:
* What went wrong:
A problem occurred evaluating root project 'yli'.
> Could not get unknown property 'sharedLibrary' for root project 'yli' of type org.gradle.api.Project.
Note: I used the command 'gradle tasks' in order to find the name of the task: 'yliSharedLibrary'.
I played around with this and discovered that you can access the tasks created by the software model within closures. For example, if you want to depend on one of the native tasks, you can do so with:
compileJava.dependsOn { yliNativeCompileTask }
Of course, if you want the Java task to come after the native one, but not force an actual dependency between them, you can use mustRunAfter():
compileJava.mustRunAfter { yliNativeCompileTask }
This syntax also works for declared inputs and outputs:
compileJava.inputs.files { yliNativeCompileTask }
Note that if you tie the inputs of a task to the outputs of another task, you don't have to explicitly declare a dependsOn. Gradle infers the task dependency.
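For instance, a sketch of relying on that inference (yliNativeCompileTask is still a placeholder name; the real task name depends on the component and binary):
// feeding the native compile task's outputs into compileJava's inputs lets Gradle
// infer the task dependency, so no explicit dependsOn is needed
compileJava.inputs.files { yliNativeCompileTask.outputs.files }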
Disclaimer: I don't know if this is the correct way to do this, or how far you can take this approach.
One final thing: the old native software model is being replaced by a new set of native plugins based on Gradle's original model. It should be much easier to integrate Java projects with these new plugins, but you may want to wait until the plugins have been fully fleshed out before attempting a migration.

Gradle main source code substitution

I'm trying to generate pre-processed source code in Android. I'm applying some regexes to my code through a Gradle task and copying the modified code to a new folder within the build folder. This works properly, but after preprocessing the code and setting the source for the Android task, Gradle complains about duplicated classes. I want to replace the main srcDir in some specific cases (specifically when the build is a release build), but I can't override the path for the classes to avoid the code duplication. How can I achieve this?
My Gradle task is as follows:
task filterComments(type: Copy) {
    from "$projectDir/src/main/java"
    into "$projectDir/build/generated-src"
    filter { line -> line.replaceAll('LoremIpsumDolor', 'LOREMIPSUMDOLOR') }
}
tasks.withType(JavaCompile) { task ->
    if (task.name.contains("compileRelease")) {
        task.dependsOn filterComments
        task.source "$projectDir/build/generated-src"
    }
}
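One possible cause worth noting: JavaCompile's source(...) method appends to the task's existing sources, while setSource(...) replaces them. A hedged sketch of the replacing variant, assuming the generated tree is a complete substitute for the original sources (on Android, other generated sources such as BuildConfig may then need to be re-added):
tasks.withType(JavaCompile) { task ->
    if (task.name.contains("compileRelease")) {
        task.dependsOn filterComments
        // setSource replaces the task's source instead of appending to it
        task.setSource(fileTree("$projectDir/build/generated-src"))
    }
}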

Annotation processor in Gradle outputs source files to build/classes making javadoc fail. How to fix it?

I have an annotation processor that is automatically picked up by the Java compiler at build time (using SPI). During a Gradle build, the generated Java sources of this annotation processor are put in build/classes, as Gradle tells the annotation processor that this is the place to output generated source files.
When the standard javadoc task is run, it tries to create javadoc for all files in build/classes, including *.java. This fails because javadoc only expects *.class files, making the whole build fail.
So my question is:
Is this a Gradle bug/feature?
How do I fix it/make it work?
It seems the problem is that the generated source files are not picked up, making javadoc fail because it has nothing to process.
I'm posting the solution here in case somebody is experiencing the same problem:
The problem with compile-time source generation in Gradle is that the generated sources are not automatically picked up by javadoc. This is a problem if all your sources are auto-generated: the build will fail with an error saying that no sources could be processed. Otherwise your build will succeed, but you will have no javadoc for your generated Java sources.
The root problem here is Gradle's poor integration with sources that are both generated and compiled during the same compile step. To remedy this I changed my build files as follows.
project layout:
rootproject
rootproject/annotationProcessor
rootproject/userOfAnnotationProcessor
build file of userOfAnnotationProcessor:
def generatedSources = "$buildDir/generated-src"
def generatedOutputDir = file("$generatedSources")
compileJava {
    doFirst {
        generatedOutputDir.exists() || generatedOutputDir.mkdirs()
        options.compilerArgs = [
            '-s', "${generatedSources}"
        ]
    }
}
sourceSets {
    main {
        java {
            srcDirs += generatedOutputDir
        }
    }
}
javadoc {
    source = sourceSets.main.resources
}
compileJava.dependsOn clean
The trick here is not to add your generated sources to a custom source set, or else we'll run into trouble when trying to build aggregated javadoc in our root project. However, this solution has the nasty side effect that our generated sources are added twice for some reason when building again after a first clean+build. The solution here is to always do a clean+build.
Now when doing an aggregate javadoc build, we'd like our generated source javadoc to be part of it as well.
This is what our rootproject build file looks like:
def exportedProjects = [
    ":annotationProcessor",
    ":userOfAnnotationProcessor",
]
task alljavadoc(type: Javadoc) {
    source exportedProjects.collect { project(it).sourceSets.main.allJava }
    classpath = files(exportedProjects.collect { project(it).sourceSets.main.compileClasspath })
    destinationDir = file("${buildDir}/docs/javadoc")
}
alljavadoc.dependsOn(":userOfAnnotationProcessor:compileJava")
If we had used a custom source set previously, Gradle would now start complaining about source set properties not being found. Why? I don't know... A last important thing to note is that our alljavadoc depends on the compilation step of userOfAnnotationProcessor; this is needed to make sure our generated source files are there when the aggregated javadoc is built.
I hope I've helped somebody with this explanation!
I am not quite sure whether it is a bug or not, but as a workaround just filter the sources of javadoc.
Depending on what your build script looks like, it should be something like this:
task myJavadocs(type: Javadoc) {
    classpath = sourceSets.main.output.filter { it -> !it.name.endsWith('.java') }
}
