Review of gitlab CI using yml - java

Hi, I am using the following GitLab CI YAML file to set up my pipeline. The project is a Maven Java project, but I am not able to run all the steps successfully. Here is the GitLab YAML:
image: maven:3.5-jdk-8

variables:
  MAVEN_CLI_OPTS: "-s .m2/settings.xml --batch-mode"
  MAVEN_OPTS: "-Dmaven.repo.local=.m2/repository"

include:
  - template: Security/SAST.gitlab-ci.yml

cache:
  paths:
    - .m2/settings.xml

# Define stages
# Stages group various steps into one block,
# if any step fails, the entire stage fails
stages:
  - validate
  - compile
  - SonarQube
  - test

validate:
  stage: validate
  script:
    - mvn validate

compile:
  stage: compile
  script:
    - mvn $MAVEN_CLI_OPTS compile

sonarqube-check:
  image: maven:3.6.3-jdk-11
  stage: SonarQube
  variables:
    SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"
  script:
    - mvn sonar:sonar -Dsonar.projectKey=key -Dsonar.host.url=url -Dsonar.login=id
  allow_failure: true

spotbugs-sast:
  variables:
    COMPILE: "false"
    SECURE_LOG_LEVEL: "debug"
  artifacts:
    reports:
      sast: gl-sast-report.json

#spotbugs-sast:
#  variables:
#    SECURE_LOG_LEVEL: "debug"
#FAIL_NEVER: 1

test:
  image: maven:3.5-jdk-8
  stage: test
  script:
    - mkdir -p /opt/path/conf/project/
    - echo ${CI_PROJECT_DIR}
    - cp "${CI_PROJECT_DIR}/project.properties" "/opt/path/conf/project/"
    - mvn $MAVEN_CLI_OPTS test -B
But I am getting errors in the sonarqube-check, spotbugs-sast and test stages.
In sonarqube-check, it is showing an error that it failed to resolve the project dependencies, with a list of jar files:
The following artifacts could not be resolved: webpay:webpay-client:jar:4.0.4, mpienhanced:mpienhanced:jar:1.0.0, webpay:webpay-mpi:jar:4.3.9, webpay:matrix-mpi:jar:1.27.4, webpay:vbv-matrix:jar:1.12.1, webpay:xercesImpl:jar:2.12.0, webpay:xss4j:jar:0.0.1, webpay:xmlParserAPIs:jar:2.11.0, webpay:webpay-mpi-util:jar:4.2.2
In spotbugs-sast, I am getting this error:
[INFO] [Find Security Bugs] [2022-01-13T10:41:39Z] ▶ Found 1 analyzable projects.
[FATA] [Find Security Bugs] [2022-01-13T10:41:39Z] ▶ lstat /root/.m2/repository: no such file or directory
In the test stage, it is not able to find the properties file at the path mentioned in the config file. I have tried placing the properties file in all the likely places and specifying the path, but with no luck.
Can someone please help me resolve these issues? Thanks in advance.
Let me know if any additional info is required.

You could try going back to a documentation example and tweak it incrementally until it matches the one in your question.
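For reference, here is a minimal sketch along the lines of the documented Maven examples (the job names, stages and settings.xml path are placeholders to adapt to your project); note that it caches .m2/repository rather than .m2/settings.xml:

image: maven:3.5-jdk-8

variables:
  MAVEN_CLI_OPTS: "-s .m2/settings.xml --batch-mode"
  MAVEN_OPTS: "-Dmaven.repo.local=.m2/repository"

cache:
  paths:
    - .m2/repository   # cache the downloaded dependencies, not the settings file

build:
  stage: build
  script:
    - mvn $MAVEN_CLI_OPTS compile

test:
  stage: test
  script:
    - mvn $MAVEN_CLI_OPTS test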
But be warned: spotbugs-sast won't be analyzing Java for much longer.
See GitLab 14.10 (April 2022)
Faster, easier Java scanning in SAST
GitLab Static Application Security Testing (SAST) now uses Semgrep to scan Java code, building on previous support for Go (introduced in GitLab 14.4) and for JavaScript, TypeScript, and Python (introduced in GitLab 13.12).
The Semgrep-based analyzer runs significantly faster—up to 7 times faster in our testing than the existing analyzer that’s based on SpotBugs.
It also doesn’t need to compile your code before scanning, so it’s much simpler to use than SpotBugs.
The Static Analysis and Vulnerability Research teams worked together to translate rules to the Semgrep format, preserving most existing rules.
We also updated, refined, and tested the rules as we converted them.
If you use the GitLab-managed SAST template (SAST.gitlab-ci.yml), both Semgrep and SpotBugs now run whenever Java code is found.
In GitLab Ultimate, the Security Dashboard combines findings from the two analyzers, so you won’t see duplicate vulnerability reports.
In GitLab 15.0, as we announced, we’ll change the GitLab-managed SAST template (SAST.gitlab-ci.yml) to only run the Semgrep-based analyzer for Java code.
The SpotBugs-based analyzer will still scan other JVM languages like Groovy, Kotlin, and Scala.
If you have any questions, feedback, or issues with the new Semgrep-based Java scanning, please file an issue, we’ll be glad to help.
See Documentation and Issue.
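If you want to rely on Semgrep for Java right away, the documented SAST_EXCLUDED_ANALYZERS variable can be used to skip SpotBugs. A sketch (verify the variable and its value against your GitLab version):

include:
  - template: Security/SAST.gitlab-ci.yml

variables:
  # sketch: skip the SpotBugs-based analyzer and let Semgrep handle Java
  SAST_EXCLUDED_ANALYZERS: "spotbugs"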
And GitLab 15.1 (June 2022) adds improvements which could help:
Static Analysis analyzer updates
GitLab Static Analysis includes many security analyzers that the GitLab Static Analysis team actively manages, maintains, and updates. The following analyzer updates were published during the 15.1 release milestone. These updates bring additional coverage, bug fixes, and improvements.
Secret Detection analyzer updated for better offline support and easier debugging. See CHANGELOG for details.
Improve logging
Use checked-out copy of the repository if git fetch fails
Fall back to scanning the latest commit if automatic diff detection fails
SpotBugs analyzer updated to SpotBugs version 4.7.0 and find-sec-bugs version 1.12.0. See CHANGELOG for details.
Update gradle and grails to support Java 17
Set Java 17 as the system-wide default version
Use ‘assemble’ task for Gradle projects, instead of ‘build’, to support custom GRADLE_CLI_OPTS (see issue #299872)
Add additional detection rules
If you include the GitLab-managed SAST template (SAST.gitlab-ci.yml), you don’t need to do anything to receive these updates. However, if you override or customize your own CI/CD template, you need to update your CI/CD configurations.
To remain on a specific version of any analyzer, you can pin to a minor version of an analyzer. Pinning to a previous version prevents you from receiving automatic analyzer updates and requires you to manually bump your analyzer version in your CI/CD template.
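As a sketch of what pinning can look like (the image path and tag below are hypothetical examples; check the analyzer documentation for real values):

include:
  - template: Security/SAST.gitlab-ci.yml

spotbugs-sast:
  # hypothetical pin to a specific analyzer version instead of the rolling major tag
  image: "$SECURE_ANALYZERS_PREFIX/spotbugs:2.28"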
For previous changes, see last month’s updates.
See Documentation and Issue.

Related

spring boot 3 native fail with javaAgent configured

I must say that I am pretty new to GraalVM (like most people nowadays).
I have been following this guide created by Josh Long to create a Spring Boot native application.
In the guide it explains that we can use the Java agent while executing tests in order to create the reflect-config.json and other files that are "given as an input to the GraalVM compiler". This is also explained in the Spring Boot documentation here and also here.
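For context, activating the agent for test runs typically looks something like this native-maven-plugin configuration (a sketch, not necessarily the exact change in my commit; the plugin version is omitted):

<!-- sketch: enable the GraalVM tracing agent during test execution so that
     reflect-config.json and the other metadata files get generated -->
<plugin>
  <groupId>org.graalvm.buildtools</groupId>
  <artifactId>native-maven-plugin</artifactId>
  <configuration>
    <agent>
      <enabled>true</enabled>
    </agent>
  </configuration>
</plugin>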
I created a repo (Which is a clone of the one from the guide), and just made one commit in order to "activate the java agent while executing the test" (in the history of the branch just look at the last commit).
You can get it like this:
git clone https://github.com/clembo590/issues.git --branch issue_with_agent
When executing mvn clean -P native native:compile I get this error:
Caused by: com.oracle.svm.core.util.VMError$HostedError: New Method or Constructor found as reachable after static analysis: public java.lang.String com.example.aot.properties.DemoProperties.name()
I looked on the web to see if I could find something about this issue:
I found one issue on GitHub about it, but I do not know how to fix it in the case of a Spring Boot app.
Thank you for your help.

How can a consistent Java code format be enforced?

I'm looking for a way to force developers to use the same Java code formatting rules. My requirements are:
Gradle integration
Task that checks if code is correctly formatted. This will be used on CI to cause a build failure if incorrectly formatted code is submitted
Task that fixes incorrectly formatted code (nice-to-have)
IntelliJ integration
Incorrectly formatted code can be fixed within the IDE via the "Reformat Code" action
Code that is generated by the IDE (e.g. getter/setter generation) conforms to the rules
Supports the OpenJDK/Oracle Java formatting rules
Currently I'm using Spotless with the following configuration
spotless {
    java {
        toggleOffOn()
        eclipse().configFile("${project.rootDir}/tools/eclipse-java-formatter.xml")
        indentWithSpaces()
        removeUnusedImports()
    }
}
For IntelliJ integration, I've installed the Eclipse Code Formatter plugin and configured it to use the same rules as Spotless.
This approach meets all of the requirements above except for 2.2 i.e. any code generated by IntelliJ must be reformatted before it conforms to the formatting rules. A further problem is that the imports seem to be arbitrarily reordered when code is reformatted. This generates a lot of spurious changes which makes pull requests more difficult to review.
Is there another approach (e.g. CheckStyle) that does not suffer from these shortcomings?
You could use Google Java Format, which has plugins for the aforementioned IDEs (IntelliJ IDEA, Eclipse), provides integrations with tools such as Maven, Gradle, or SBT, and provides means to run the formatter as a pre-commit hook or when pushing the code to GitHub with GitHub Actions.
In their README they also mention the imports issue and how to fix it for IntelliJ IDEA, and more insights are provided, e.g. on how to handle it with the Spotless Gradle plugin, with the Maven Spotless plugin, or in GitHub Actions.
A drawback for your specific case may be that the tool enforces the Google Java style guide, which was praised and recommended by the Oracle Java team as described in the Oracle Java magazine. It also provides the option to use the AOSP code style.
Below is a snippet for the Spotless Gradle configuration, considering import ordering:
spotless {
    java {
        importOrder() // standard import order
        importOrder('java', 'javax', 'com.acme', '') // or importOrderFile
        // You probably want an empty string at the end - all of the
        // imports you didn't specify explicitly will go there.
        removeUnusedImports()
        googleJavaFormat() // has its own section below
        eclipse() // has its own section below
        prettier() // has its own section below
        clangFormat() // has its own section below
        licenseHeader '/* (C) $YEAR */' // or licenseHeaderFile
    }
}
Checkstyle supports most of your requirements.
Gradle Integration:
plugins {
    id 'checkstyle'
}

checkstyle {
    toolVersion = checkstyleVersion
    config = rootProject.resources.text.fromFile(checkstyleRulesRootPath) // Loads configuration from a file
    ignoreFailures = false // Causes the build to fail
    maxWarnings = 0 // Fails even for warnings
}
It does not fix code automatically (AFAIK).
IntelliJ integration:
There's a Checkstyle plugin which you can configure to display warnings as you code.
You can also configure IntelliJ autoformatting to use these rules.
Formatting rules
Here is the configuration for Oracle/Sun specifications in checkstyle.
I think you can use the p3c plugin.
I use the formatter-maven-plugin and import-maven-plugin.
Those plugins have validate/check goals that I use in our CI tool to validate incoming PRs.
They also have Gradle variants; check them out here.
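A rough sketch of the Maven side (group/artifact IDs and goals should be double-checked against the plugin documentation; the formatter plugin shown here is the net.revelc.code one):

<!-- sketch: fail the build during validation if the code is not formatted -->
<plugin>
  <groupId>net.revelc.code.formatter</groupId>
  <artifactId>formatter-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>validate</goal>
      </goals>
    </execution>
  </executions>
</plugin>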
I can help you here. You have asked about two main problems:
Incorrectly formatted code can be fixed within the IDE via the "Reformat Code" action
For this, you need to write a code template. We use a specific code template in our organisation. Now, you need to import this XML code template under Settings > Code Style.
Now the code will by default be formatted the way the template has been written.
Or use Ctrl+Shift+F, the Eclipse shortcut (enabled in IntelliJ).
Support for the OpenJDK/Oracle Java formatting rules can be taken care of within the same template. Please refer to their code template, provided as the default one in Eclipse.
Code that is generated by the IDE (e.g. getter/setter generation) conforms to the rules
This link will help. Please explore more on how to write the custom code templates.
To prevent developers from pushing incorrectly formatted code, you need to write a Git hook. It will not allow code to be pushed unless it complies with the basic rules provided in the hook's custom script. Nobody needs to do anything in their local IntelliJ; the hook acts on the remote Git repo. Here is one supporting link.
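As a client-side variant of the same idea, a pre-commit hook can run the formatting check before the commit is created (a sketch assuming a Gradle project with a Spotless check task; a server-side pre-receive hook would follow the same pattern):

#!/bin/sh
# .git/hooks/pre-commit (sketch): block the commit if formatting is off
./gradlew spotlessCheck --quiet || {
  echo "Code is not formatted correctly. Run './gradlew spotlessApply' and try again."
  exit 1
}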
I have kept the information brief here because it is mostly a matter of the customized rules that go into your code templates.
Other questions:
Task that checks if code is correctly formatted. This will be used on CI to cause a build failure if incorrectly formatted code is submitted.
Once you restrict committed code using Git hooks, there should never be any unformatted code in the repo, so you don't need this as part of the CI build.
It could be done by providing a pipeline script that triggers a check against the Git location of your code, but that looks like a tedious thing to me.
Hope all your questions are answered.

Is it possible to use jQAssistant as a tool inside a java application?

I am currently working on a small project. The idea is to use jQAssistant to fill the Neo4j database so that the data can be used by a REST API. The plan is to upload a jar, war or ear to a Java backend so that it can be scanned (scan -f) and then start the Neo4j server on port 7474.
What I have already tried:
1. Trying to execute "scan" and "server" with Java ProcessBuilder and Runtime.
2. Importing JQAssistant Commandline Neo4jv3 - 1.6.0 with gradle and trying to use the run-Method in Main.class with the commandline arguments (scan -f foldername).
Server-start works without any problems in both cases, but scanning is a huge problem. It does not seem to scan the specified folder correctly. The jqassistant-folder which has been created does not have any scanned data.
I assume that the root of the problem is the plugins folder and the variables JQASSISTANT_HOME and JQASSISTANT_OPTS appearing in the jqassistant.cmd and .sh files.
Is it actually possible to execute "server" and especially "scan" within java code?
It is possible to use jQAssistant from Java code but I'd not recommend it as the underlying APIs are subject to change. What remains downwards compatible over releases are the command line arguments, so going for the Main class as described in your question should be safe for a while. This approach is also used by the Gradle integration provided by Kontext E (http://techblog.kontext-e.de/jqassistant-with-gradle/).
I assume that you're encountering the same problem with missing data when using the provided shell scripts for Windows/Linux. A common issue is that, for scanning folders containing Java classes, you need to specify a scope:
scan -f java:classpath::build/classes/main
The java:classpath prefix provides a hint that the folder shall be treated as a classpath element, see http://buschmais.github.io/jqassistant/doc/1.6.0/#_scanner and http://buschmais.github.io/jqassistant/doc/1.6.0/#cli:scan.

Maven build optimization - prevent building *-fat.jar locally

Right now I see this in my project:
I have a pretty optimized maven build using:
mvn -offline -T 9 package exec:java -DskipTests
The offline flag prevents it from looking for updates, -T 9 uses 9 threads, and -DskipTests skips the tests, but I wonder if there is a flag I can use to prevent it from creating the *-fat.jar.
I figure the fat.jar is a big file, and if I avoid creating it until I need it, I might save some time.
Maven does not create something like "-fat.jar" by default. There must be a specific definition in the pom.xml: the maven-assembly-plugin or maven-shade-plugin is what does it.
So you need to change your pom.xml and define special profiles: one (default) which will create the "-fat.jar" and one which will not.
Then you will be able to run something like "mvn package -Pmy-no-fat-profile" to avoid the "-fat.jar" creation.
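A rough sketch of that profile setup (the profile names are just examples, and the shade configuration is trimmed):

<!-- sketch: keep the fat jar behind a profile that is active by default -->
<profiles>
  <profile>
    <id>fat-jar</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <!-- shading configuration goes here -->
        </plugin>
      </plugins>
    </build>
  </profile>
  <profile>
    <id>my-no-fat-profile</id>
    <!-- no shade/assembly plugin bound here, so no fat jar is produced -->
  </profile>
</profiles>

Activating -Pmy-no-fat-profile also deactivates the fat-jar profile, because activeByDefault profiles are switched off as soon as another profile is activated on the command line.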

Where should I configure code generation in NPM packages?

Disclaimer: I am the author of Jsonix and the Jsonix Schema Compiler, and I'm trying to figure out the canonical way the Jsonix Schema Compiler should be integrated in an NPM package.json.
The jsonix-schema-compiler NPM package provides a Java-based tool for code generation. If the jsonix-schema-compiler is installed as a dependency, it can be used to generate XML<->JS mappings. The invocation is something like:
java -jar node_modules/jsonix-schema-compiler/lib/jsonix-schema-compiler-full.jar
schema.xsd
This generates a JavaScript file like Mappings.js which is basically a part of the module's code.
Ideally, the jsonix-schema-compiler invocation above (java -jar ... and so on) should be executed during the module build. But it must be executed after the module's dependencies are installed (otherwise node_modules/jsonix-schema-compiler will be missing).
My question is - where should I canonically configure code generation in NPM packages?
Right now I'm doing it in the postinstall scripts, something like:
{
  ...
  "dependencies": {
    "jsonix": "x.x.x",
    "jsonix-schema-compiler": "x.x.x"
  },
  "devDependencies": {
    "nodeunit": "~0.8.6"
  },
  "scripts": {
    "postinstall": "java -jar node_modules/jsonix-schema-compiler/lib/jsonix-schema-compiler-full.jar schema.xsd",
    "test": "nodeunit src/test/javascript/tests.js"
  }
}
However, having read this:
tl;dr Don't use install. Use a .gyp file for compilation, and
prepublish for anything else.
You should almost never have to explicitly set a preinstall or install
script. If you are doing this, please consider if there is another
option.
I am now confused about whether postinstall is also OK.
All I want to do is to be able to execute a certain command-line command after dependencies are installed but before other things (like tests or publish). How should I canonically do it?
Typically people are running things like coffeescript-to-javascript compilers, Ecmascript 6->5 transpilers, and minifiers as a build step, which is what it sounds like you're doing.
The difference between doing it pre-publish and post-install is that a prepublish script is probably going to be run in your checked-out directory, so it's reasonable to assume that java is available and various dev-dependencies are available; while the post-install script would be run after every install, and will fail if java (etc.) is not available, as on a minimalist docker image. So you should put your build step in a prepublish or similar script.
Personally, what I like to do is define a script 'mypublish' in package.json that ensures all tests pass, runs the build, ensures build artefacts exist, and then runs npm publish. I find this more intuitive than prepublish, which is meant to be used as an "I'm about to publish" hook, not a "do the build before publishing" hook.
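For example, something like this (a sketch reusing the schema-compiler invocation from the question; the script names are just examples):

{
  "scripts": {
    "build": "java -jar node_modules/jsonix-schema-compiler/lib/jsonix-schema-compiler-full.jar schema.xsd",
    "prepublish": "npm run build",
    "mypublish": "npm test && npm run build && npm publish",
    "test": "nodeunit src/test/javascript/tests.js"
  }
}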
Here is a package.json that uses this setup: https://github.com/reid/node-jslint/blob/master/package.json and here's the Makefile with the prepublish target: https://github.com/reid/node-jslint/blob/master/Makefile
Let me know if you have more questions; I'm kind of rambling because there are many legitimate ways to get it done, as long as you avoid postinstall scripts. ;-)
