JHipster - Where to place additional modules/plugins to be included in build.gradle - java

I've created a simple app based on JHipster and had to add some libraries to build.gradle to develop what I wanted. That worked correctly. Then I had to regenerate my app because I wanted to add/remove features of JHipster itself. Since my build.gradle was modified and JHipster wanted to regenerate it from its template, I'm wondering whether there is a place where I can put my own additions so they survive whenever I have to regenerate again.
In the files JHipster generates, one can spot lines like:
//jhipster-needle-gradle-plugins - JHipster will add additional gradle plugins here
If I understand this correctly, JHipster has some additional "registry" of modifications that should be applied to the original template of the file. But where is it, and how do I use it so that JHipster knows that a change to the generated file is intentional and should stay (I mean in the diff view)?
Let's say I need an additional plugin for JavaCC, which would be:
id "ca.coglinc2.javacc" version "3.0.0"
and an additional section of build.gradle like this:
compileJavacc {
    arguments = [
        grammar_encoding: 'UTF-8',
        static: "false",
        debug_parser: "false",
        debug_lookahead: "false"
    ]
    inputDirectory = file('src/main/javacc')
    outputDirectory = file('src/main/generated/javacc')
}
compileJava.dependsOn processResources,compileJavacc
bootWar.dependsOn war
processResources.dependsOn cleanResources,bootBuildInfo
bootBuildInfo.mustRunAfter cleanResources
sourceSets {
    main {
        java {
            srcDirs 'src/main/java'
            srcDirs 'src/main/generated/javacc'
        }
    }
    test {
        java {
            srcDirs 'src/test/java'
        }
    }
}
Where should I put this so that whenever I have to regenerate the code it is still there and not removed by JHipster?

There is basically no dedicated place where you can put your code. What works quite well in most cases is to create your own Gradle script file (like custom.gradle) and apply that file from the main build.gradle (there are already some apply from ... lines in it).
With that you can quite easily keep your custom settings even when JHipster wants to overwrite build.gradle.
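For example, a minimal sketch (the file name custom.gradle is just a suggestion; the plugins { id "ca.coglinc2.javacc" version "3.0.0" } entry has to stay in build.gradle, since the plugins DSL can't be used from an applied script):
// build.gradle -- the one line to re-add after each regeneration
apply from: 'custom.gradle'

// custom.gradle -- JHipster never regenerates this file
compileJavacc {
    arguments = [grammar_encoding: 'UTF-8', static: "false"]
    inputDirectory = file('src/main/javacc')
    outputDirectory = file('src/main/generated/javacc')
}
sourceSets.main.java.srcDir 'src/main/generated/javacc'
compileJava.dependsOn compileJavacc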


Recreation of jpa metamodels

I want to use JPA meta-models in my project. I added the required dependency, added generation to the JavaCompile task, and the meta-models are successfully generated. If I want to run the code again, it doesn't compile. It fails with:
Error:java: Problem with Filer: Attempt to recreate a file for type project.models.AdministrationUser_
for every single meta-model. I am running it in IDEA as a Spring Boot run configuration. If I use the Gradle bootRun task it runs just fine, no problem, but I need the IDEA run configuration because I need to set active profiles. This also suggests that the problem is probably not in the code but somewhere in the run task configuration, but I have no idea what to change; I've tried several things and I'm just firing blanks.
I'm using Gradle 5.4.1, IDEA 2019.2 and Java 11.
Here are the important parts of my build.gradle file:
dependencies {
    annotationProcessor("javax.xml.bind:jaxb-api")
    annotationProcessor("org.hibernate:hibernate-jpamodelgen")
}
tasks.withType(JavaCompile) {
    options.annotationProcessorGeneratedSourcesDirectory =
        file("src/generated/java")
}
sourceSets {
    generated {
        java {
            srcDirs = ['src/generated/java']
        }
    }
}
Something similar was already asked here, but one answer suggests deleting hibernate-jpamodelgen, which (if I understand it correctly) seems like an absurd solution, because it won't work if you delete it. The other answer suggests using some Maven plugin, so that's not an option for me either.
I've been stuck on this problem for a long time, have no one to talk to about it, and I'm completely out of ideas, so I'm pretty desperate and any help will be much appreciated.
First, make modifications to your build.gradle.
You are pointing to the wrong directory in the sourceSets section. Fix it:
sourceSets {
    generated {
        java {
            srcDirs = ['src/generated']
        }
    }
}
Then make further improvements:
With Gradle 5.2 and IntelliJ 2019.1, annotationProcessor is the only piece of configuration you need. See https://github.com/tbroyer/gradle-apt-plugin/blob/v0.21/README.md
Get rid of
tasks.withType(JavaCompile) {
    options.annotationProcessorGeneratedSourcesDirectory =
        file("src/generated/java")
}
sourceSets {
    generated {
        java {
            srcDirs = ['src/generated/java']
        }
    }
}
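What remains is essentially just the annotation processor declaration; a minimal sketch of the part that stays (same coordinates as in the question):
dependencies {
    annotationProcessor("javax.xml.bind:jaxb-api")
    annotationProcessor("org.hibernate:hibernate-jpamodelgen")
    // no custom output directory and no extra source set needed
}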
Lastly, consider delegating your build and test tasks to Gradle (this is now the default). Go to File -> Settings -> Build, Execution, Deployment -> Build Tools -> Gradle -> Build and run using Gradle

Mapstruct AnnotationProcessor with IntelliJ and Gradle

I am trying to get the Mapstruct annotation processor to work in IntelliJ in a Gradle project.
Ideally, I would expect all configuration to be in the Gradle file, so that anyone could just import the project into IntelliJ and get a complete setup without having to set any preferences manually.
But I am okay with compromises on that.
I am using IntelliJ 2018.3 and Gradle 5.0 with Java 11 (i.e. the latest and greatest). The Mapstruct version is 1.2.0.FINAL.
What I have done:
Configured the Mapstruct annotation processor in my build.gradle:
compile "org.mapstruct:mapstruct-jdk8:${mapstruct_version}"
annotationProcessor "org.mapstruct:mapstruct-processor:${mapstruct_version}"
Selected "Delegate IDE build/run actions to Gradle" in the Preferences under "Build, Execution, Deployment -> Build Tools -> Gradle -> Runner"
In the directory build/classes/java/main/com/myapp/mypackage/mapper/ I see a MyMapperImpl.class and a MyMapperImpl.java, so code generation seems to work.
Now I would expect that when I select my annotated abstract MyMapper class and press Ctrl+H, the generated MyMapperImpl appears in the hierarchy view.
If I manually mark build/classes/java/main/ as a "generated sources" directory (which I really don't want to have to do, see above), the class still does not appear in the hierarchy. Moreover, the source code is marked with a lot of errors, as apparently no classes from my project are found.
Needless to say: I can flawlessly run tests that use the mapper, both from IntelliJ and the command line.
Use this; my team is also using Mapstruct and we have this in our
build.gradle. You will need to bring in the idea plugin for Gradle as well:
def generatedSources = "$buildDir/generated"
def generatedOutputDir = file("$generatedSources")

/*
 * Create the generated .java files in a different folder than the classes.
 * In IntelliJ 2016.3.x: enable annotation processing, then set generated sources,
 * relative to the module output dir, at path '../../generated'
 */
compileJava {
    doFirst {
        generatedOutputDir.exists() || generatedOutputDir.mkdirs()
        options.compilerArgs = [
            '-s', "${generatedSources}"
        ]
    }
}

idea {
    module {
        downloadSources = true
        // tell IntelliJ where to find the generated sources
        sourceDirs += generatedOutputDir
    }
}
With this workaround you will be able to run your code even without the Gradle runner.
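For completeness, a rough sketch of the surrounding pieces this snippet assumes (the Mapstruct coordinates are the ones from the question; the idea plugin is the one mentioned above):
apply plugin: 'java'
apply plugin: 'idea'   // required for the idea { module { ... } } block

dependencies {
    compile "org.mapstruct:mapstruct-jdk8:${mapstruct_version}"
    annotationProcessor "org.mapstruct:mapstruct-processor:${mapstruct_version}"
}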

Outsource java packages for reuse in other projects with gradle

I've got a project, and within it I developed a bunch of classes which are kept very abstract so I can use them in other projects. How should I extract the package so that I can include it via Gradle or through the IDE in the end?
Also, the reusable package content is still in development, so I want to work on it in parallel.
Can anybody tell me how to solve this?
In your build.gradle, use a custom build task to collect only the package that you want:
task customBuild(type: Jar) {
    from("src/main/java/abstract/") {
        into "abstract"
    }
    version = ""
    baseName = "myClasses"
}
It will build a jar file inside build/libs/ with your jar name.
Now you can copy the jar anywhere you want and include it in another project this way:
dependencies {
    compile fileTree(dir: "your/jar/location", includes: ["myJar.jar"])
}

Access file in JUnit test in Gradle environment

Right now I have got a Java library which has a test class. In that class I want to access some files located on my hard disk.
The build.gradle looks like this:
apply plugin: 'java'

dependencies {
    testCompile 'junit:junit:4.11'
}
My file is under java_lib/src/test/assets/file.xml and the Java class is under java_lib/src/test/java/<package_name>.java
Therefore I execute
final InputStream resourceAsStream = this.getClass().getResourceAsStream("assets/file.xml");
Unfortunately I get null back. What am I doing wrong?
To get things rolling you need to add the following to the Gradle file:
task copyTestResources(type: Copy) {
    from "${projectDir}/src/test/resources"
    into "${buildDir}/classes/test"
}
processTestResources.dependsOn copyTestResources
What it basically does is copy all the files in the src/test/resources directory to build/classes/test, since this.getClass().getClassLoader().getResourceAsStream(".") points to build/classes/test.
The issue is already known to Google and they want to fix it in Android Studio 1.2 (since they need IntelliJ 14 for that, and it seems it will be included in Android Studio 1.2).
Try placing file.xml under src/test/resources and use this.getClass().getResourceAsStream("file.xml") (without the folder prefix)
The problem appears to be that the assets folder is not part of the test runtime classpath, hence this.getClass().getResourceAsStream("assets/file.xml") wouldn't be able to resolve the path as you expected.
By default, the test resources folder in a Gradle Java project is src/test/resources (the same as in a Maven Java project). You can override it to an assets folder if you wish by adding this in the project's build.gradle file:
sourceSets.test {
    resources.srcDirs = ["src/test/assets"]
}
In build.gradle, add this:
sourceSets.test {
    resources.srcDirs = ["src/test"]
}
In your code, access your resource like this:
getClass().getClassLoader().getResourceAsStream("assets/file.xml");
Works for me.
Thanks for pointing out the Google issue, I've been looking all day for this...
In "Android Studio 1.1 RC 1" (gradle build tool 1.1.0-rc1) there is no need to add the work around to the gradle file, but your you have to execute the test from the gradle task menu (or command prompt)!
This worked for me (3 years later, Gradle 4.10):
subprojects {
    junitPlatformTest.dependsOn processTestResources
}

How can I import one Gradle script into another?

I have a complex Gradle script that wraps up a load of functionality around building and deploying a number of NetBeans projects to a number of environments.
The script works very well, but in essence it is all configured through half a dozen maps holding project and environment information.
I want to abstract the tasks away into another file, so that I can simply define my maps in a simple build file, and import the tasks from the other file. In this way, I can use the same core tasks for a number of projects and configure those projects with a simple set of maps.
Can anyone tell me how I can import one Gradle file into another, in a similar manner to Ant's import task? I've trawled Gradle's docs to no avail so far.
Additional Info
After Tom's response below, I thought I'd try and clarify exactly what I mean.
Basically I have a Gradle script which runs a number of subprojects. However, the subprojects are all NetBeans projects, and come with their own ant build scripts, so I have tasks in Gradle to call each of these.
My problem is that I have some configuration at the top of the file, such as:
projects = [
    [name: "MySubproject1", shortname: "sub1", env: "mainEnv", cvs_module: "mod1"],
    [name: "MySubproject2", shortname: "sub2", env: "altEnv", cvs_module: "mod2"]
]
I then generate tasks such as:
projects.each({
    task "checkout_$it.shortname" << {
        // Code to, for example, check the module out from CVS using config from 'it'.
    }
})
I have many of these sorts of task-generation snippets, and all of them are generic: they depend entirely on the config in the projects list.
So what I want is a way to put this in a separate script and import it in the following sort of way:
projects = [
    [name: "MySubproject1", shortname: "sub1", env: "mainEnv", cvs_module: "mod1"],
    [name: "MySubproject2", shortname: "sub2", env: "altEnv", cvs_module: "mod2"]
]
import("tasks.gradle") // This will import and run the script so that all tasks are generated for the projects given above.
So, in this example, tasks.gradle will contain all the generic task generation code and will be run for the projects defined in the main build.gradle file. In this way, tasks.gradle can be used by all large projects that consist of a number of sub-projects with NetBeans Ant build files.
There is a new feature in Gradle 0.9: you can use the apply from: 'other.gradle' statement.
Read my question about the same thing: Is there a way to split/factor out common parts of a Gradle build
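As a minimal sketch of how the question's setup could be split (illustrative only; on newer Gradle versions the shared list has to be declared via ext):
// build.gradle
ext.projects = [
    [name: "MySubproject1", shortname: "sub1", env: "mainEnv", cvs_module: "mod1"],
    [name: "MySubproject2", shortname: "sub2", env: "altEnv", cvs_module: "mod2"]
]
apply from: 'tasks.gradle'

// tasks.gradle -- the generic task generation, driven by the list above
projects.each { p ->
    task "checkout_${p.shortname}" {
        doLast {
            // check the module p.cvs_module out of CVS using the settings in p
        }
    }
}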
The answer to the question turned out to be the plugins system: you can add the desired functionality in a set of plugins, which can be Groovy files located in the directory buildSrc/src/main/groovy. Plugins can also be bundled as a jar, though I haven't tried this.
Details here: Custom Plugins
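A minimal sketch of such a buildSrc plugin (class and task names here are purely illustrative):
// buildSrc/src/main/groovy/MyConventionsPlugin.groovy
import org.gradle.api.Plugin
import org.gradle.api.Project

class MyConventionsPlugin implements Plugin<Project> {
    void apply(Project project) {
        // generate tasks, apply common configuration, etc.
        project.task('checkoutAll') {
            doLast {
                println "configured by MyConventionsPlugin"
            }
        }
    }
}

// build.gradle -- classes from buildSrc are visible here
apply plugin: MyConventionsPlugin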
Well, it is hard to tell what serves you best without actually seeing your build file.
I would assume that setting up your environment as a multi-project build should provide the abstraction you are looking for.
In your project root build.gradle you define all your domain-specific stuff as well as the things that apply to all your subprojects:
repositories {
    add(new org.apache.ivy.plugins.resolver.FileSystemResolver()) {
        name = 'destRepo'
        addIvyPattern(file(project.properties['repo.dest.dir']).absolutePath + '/[organisation]/[module]/ivys/ivy(-[revision]).xml')
        addArtifactPattern(file(project.properties['repo.dest.dir']).absolutePath + '/[organisation]/[module]/[type]s/[artifact](-[revision]).[ext]')
        descriptor = 'optional'
        checkmodified = true
    }
    ...
}
...
subprojects {
    sourceCompatibility = 1.5
    targetCompatibility = 1.5
    group = 'my.group'
    version = '1.0'
    uploadArchives {
        uploadDescriptor = true
        repositories {
            add rootProject.repositories.destRepo
        }
    }
    apply { type my.group.gradle.api.plugins.MyPlugin }
    ...
}
dependsOnChildren()
The project root directory might also contain a gradle.properties file where you define properties used by your projects:
buildDirName=staging
repo.dest.dir=/var/repo
...
Then, in an additional file in your project root named settings.gradle, you actually point to your subprojects:
include 'my-first-component',
'my-second-component'
...
project(':my-first-component').projectDir = new File(rootDir, 'path/to/first/component')
project(':my-second-component').projectDir = new File(rootDir, 'path/to/second/component')
...
Each sub-project directory contains a build.gradle file containing the sub-project specific stuff only.
Whether you invoke Gradle from your project root or from a sub-project directory, Gradle will automatically take into account all the definitions made in the various files.
Also note that no compile task will be executed for your project root as long as you don't load any plugin beyond the default plugin at the root level.
This is an example for Kotlin DSL (build.gradle.kts).
apply(from = "scripts/my-script.gradle.kts")
scripts/my-script.gradle.kts:
println(
    """
    I am defined at the top level of the script and
    executed at the configuration phase of the build process
    """
)
tasks.create("MyTask") {
    println(
        """
        I am defined in a task and
        run at the configuration phase of the build process
        """
    )
    doLast {
        // ...
    }
}
See this answer and this answer for how to import a function from another script in Kotlin DSL.
Based on this similar question/answer, the easiest solution I've found after searching for days is using buildscript.sourceFile. It correctly gives the file being run rather than the pwd/cwd/parent file of said process. I feel like this would solve your issue.
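A minimal sketch of that idea (buildscript.sourceFile is the real Gradle property; the directory names are illustrative):
// inside an applied script such as tasks.gradle
def scriptDir = buildscript.sourceFile.parentFile
// resolve files relative to this script's location instead of the project root
def templateDir = new File(scriptDir, 'templates')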
