Execute Maven Plugin in every Build - java

I have a question I can't answer by myself.
Introduction:
We've developed a corporate pom which enforces some rules on projects, provides some company-wide profiles, distribution management, etc.
Most of our developers use the corporate pom, but not everyone does, and furthermore not all of them migrate old projects within the next development cycle.
So I decided to write a Maven plugin which simply checks whether someone builds a project on the development stage (not QA/prod) and whether the user uses our corporate pom.
Problem:
I want to force the plugin to execute on every "mvn clean install" or something similar without configuring the plugin within the project pom.
My first guess was the settings.xml, but unfortunately you can't execute plugins via settings.xml.
Does someone have a solution?
Question in short: Force a plugin execution on every build without declaring the plugin in the project pom and/or on the command line!

In general you don't control the developer's system, so trying to solve this with the settings.xml or with the Maven distribution is not the way to go. You must look for a place which is used by every project. One such system could be the SCM, but adding hooks there is probably hard.
Assuming you have an artifact repository manager, that is the place to do these kinds of checks. I know both Nexus and Artifactory can validate the uploaded files, and they are especially strong in analyzing the pom.xml.
So your focus should not be on trying to solve this with a Maven plugin, but on a common place in the infrastructure.

As Robert mentioned, it may be better to validate this kind of rule on the infrastructure instead of on the developer machine.
A pre-receive hook in Git does the trick.
#!/bin/bash
#
# Hook simply validates whether the corporate pom is used
# and rejects the push when the corporate pom is absent
echo "### Validate commit against company rules... ###"
corppom_artefactId='corporate-pom'

# A pre-receive hook receives "<oldrev> <newrev> <refname>" lines on stdin
# (positional parameters are only passed to the update hook)
while read oldrev newrev refname; do
    pom=$(git ls-tree --full-name -r "${newrev}" | grep 'pom.xml' | awk '{ print $3 }')
    if [ -z "${pom}" ]; then
        # No pom.xml found: not a Maven project, so it's okay
        # that the corporate pom is missing
        continue
    else
        if ! git cat-file blob "${pom}" | grep -q "${corppom_artefactId}"; then
            echo "### NO CORPORATE POM... Bye Bye ###"
            # Rejecting the push
            exit 1
        else
            echo "### CORPORATE POM IS USED. GREAT! ###"
        fi
    fi
done
Be aware that this script is just an example! It will not work on multi-module projects, and furthermore it is not coded very well. But as a solution approach it is sufficient.
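To try the hook out, it has to go into the hooks directory of the server-side repository; a minimal sketch, assuming a bare repository under /srv/git/project.git (the path and file name are assumptions):
# pre-receive hooks only run in the receiving (server-side) repository
cp validate-corporate-pom.sh /srv/git/project.git/hooks/pre-receive
chmod +x /srv/git/project.git/hooks/pre-receive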


Gitlab CI + maven: Use another repo as local dependency [duplicate]

What is the best way to carry artifacts (jar, class, war) among projects when using Docker containers in the CI phase?
Let me explain my issue in detail, please don't stop reading... =)
GitLab project1
- unit tests
- etc...
- package
GitLab project2
- unit test
- etc...
- build (failing)
  - here I need one artifact (jar) generated in project1
Current scenario / comments
- I'm using Docker, so in each .gitlab-ci.yml I'll have independent containers
- All is working fine in project1
- If I use "shell" instead of Docker in my .gitlab-ci.yml, I can keep the jar file from project1 on disk and use it when project2 kicks off the build
- Today my trigger that calls project2 when project1 finishes is working nicely
- My artifact is not an RPM, so I'll not add it into my repo
Possible solutions
- I can commit the artifact of project1 and check it out when I need to build project2
- I need to study whether the cache feature from GitLab is designed for this purpose (gitlab 8.2.1, How to use cache in .gitlab-ci.yml)
In GitLab Silver and Premium, there is the $CI_JOB_TOKEN available, which allows the following .gitlab-ci.yml snippet:
build_submodule:
  image: debian
  stage: test
  script:
    - apt update && apt install -y unzip
    - curl --location --output artifacts.zip "https://gitlab.example.com/api/v4/projects/1/jobs/artifacts/master/download?job=test&job_token=$CI_JOB_TOKEN"
    - unzip artifacts.zip
  only:
    - tags
However, if you do not have a Silver or higher GitLab subscription but rely on the free tier, it is also possible to use the API and pipeline triggers.
Let's assume we have project A building app.jar which is needed by project B.
First, you will need an API Token.
Go to the Profile settings / Access Tokens page to create one, then store it as a variable in project B. In my example it's GITLAB_API_TOKEN.
In the CI / CD settings of project B, add a new trigger, for example "Project A built". This will give you a token which you can copy.
Open project A's .gitlab-ci.yml and copy the trigger_build: snippet shown in the trigger section of project B's CI / CD settings.
Project A:
trigger_build:
  stage: deploy
  script:
    - "curl -X POST -F token=TOKEN -F ref=REF_NAME https://gitlab.example.com/api/v4/projects/${PROJECT_B_ID}/trigger/pipeline"
Replace TOKEN with that token (better, store it as a variable in project A -- then you will need to make it token=${TRIGGER_TOKEN_PROJECT_B} or something), and replace REF_NAME with your branch (e.g. master).
Then, in project B, we can write a section which only builds on triggers and retrieves the artifacts.
Project B:
download:
  stage: deploy
  only:
    - triggers
  script:
    # YAML single quotes outside, shell double quotes inside, so that the
    # shell actually expands ${GITLAB_API_TOKEN} (shell single quotes would not)
    - 'curl -O --header "PRIVATE-TOKEN: ${GITLAB_API_TOKEN}" "https://gitlab.example.com/api/v4/projects/${PROJECT_A_ID}/jobs/${REMOTE_JOB_ID}/artifacts/${REMOTE_FILENAME}"'
If you know the artifact path, then you can replace ${REMOTE_FILENAME} with it, for example build/app.jar. The project ID can be found in the CI / CD settings.
I extended the script in project A to pass the remaining information as documented in the trigger settings section:
Add variables[VARIABLE]=VALUE to an API request. Variable values can be used to distinguish between triggered pipelines and normal pipelines.
So the trigger passes the REMOTE_JOB_ID and the REMOTE_FILENAME, but of course you can modify this as you need it:
curl -X POST \
-F token=TOKEN \
-F ref=REF_NAME \
-F "variables[REMOTE_FILENAME]=build/app.jar" \
-F "variables[REMOTE_JOB_ID]=${CI_JOB_ID}" \
https://gitlab.example.com/api/v4/projects/${PROJECT_B_ID}/trigger/pipeline
Hello, you should take a look at a script named get-last-successful-build-artifact.sh, developed by morph027:
https://gitlab.com/morph027/gitlab-ci-helpers
This script allows you to download an artifact and unzip it in the project root. It uses the GitLab API to retrieve the latest successful build and downloads the corresponding artifact. You can combine multiple artifacts and unzip them wherever you want just by updating the script a little.
I'm also currently starting a PHP library to handle build artifacts, but it's at a very early stage and tied to Laravel for the moment.
For the moment there is no easy way to handle artifact usage between projects; you must build your own using these tools.
I think using the shell executor is not the right solution; it's very dangerous because you can't verify the files on the server that get used during the build!
Hope this helps :)
"carry artifacts (jar, class, war) among projects"
That should be what the Package Registry is for.
With GitLab 13.3 (August 2020), it is now available for free!
Package Registry now available in Core
A year and a half ago, we expanded our support for Java projects and developers by building Maven support directly into GitLab. Our goal was to provide a standardized way to share packages and have version control across projects.
Since then, we’ve invested to build out the Package team further while working with our customers and community to better understand your use cases. We also added support for Node, C#/.NET, C/C++, Python, PHP, and Go developers.
Your increased adoption, usage, and contributions to these features have allowed us to expand our vision to a more comprehensive solution, integrated into our single application, which supports package management for all commonly-used languages and binary formats.
This goal could not have been achieved without the explicit support of the GitLab community.
As part of GitLab’s stewardship promises, we are excited to announce that the basic functionality for each package manager format is now available in the GitLab Core Edition.
This means that if you use npm, Maven, NuGet, Conan, PyPI, Composer or Go modules, you will be able to:
- Use GitLab as a private (or public) package registry
- Authenticate using your GitLab credentials, personal access token, or job token
- Publish packages to GitLab
- Install packages from GitLab
- Search for packages hosted on GitLab
- Access an easy-to-use UI that displays package details and metadata and allows you to download any relevant files
- Ensure that your contributions are available for ALL GitLab users
We look forward to hearing your feedback and continuing to improve these features with all of our users.
See Documentation and Issue.
See this video.
Cool, found my snippet being referenced here ;)
Is it possible to use get-last-successful-build-artifact.sh without a private token (in a world-readable repository)? E.g. to share an artifact download command without exposing your token.
Yes, just add it as a secret variable in project settings -> pipelines -> secret variables.
As of this writing, artifacts cannot be shared across projects, only within the pipeline. See https://docs.gitlab.com/ee/ci/yaml/README.html#artifacts
However, there is an open feature request to enable this facility, which is not yet implemented:
https://gitlab.com/gitlab-org/gitlab-ce/issues/14728

Jenkins Deploy Artifacts to Nexus using Deploy to Maven Repository

We have a scenario to deploy the artifact generated from a Maven build to Nexus. The Jenkins job would run the goals clean package. The artifact should go to the SNAPSHOT repo if the pom.xml has a SNAPSHOT version. If the pom.xml has a release version, the artifact should go to the release repo. Any idea how we can achieve this using the Deploy to Maven Repository plugin? As of now I am using the below script in Execute Shell.
#!/bin/bash
# exit 1 if the version string contains "SNAPSHOT" (case-insensitive), else exit 0
var1=$1
var2="SNAPSHOT"
if [[ $(echo "$var1" | grep -i "$var2" | wc -l | tr -d ' ') -gt 0 ]]; then
    exit 1
else
    exit 0
fi
In the Flexible Publish post-build action, I am using an Execute Shell conditional action. Based on the result of the script, I would execute the Deploy to Maven Repository post-build action. This can only help to deploy to the release repo. Is there any better way of doing it?
I assume that if you cannot update the pom files in the repositories, you have two options:
There is a Maven Project plugin, which allows you to add a new post-build action Deploy artifacts to Maven repository. It allows you to set the repository URL and name, along with a few other options. Setting a repository with a snapshot policy as the target one will result in a successful upload of snapshot artifacts. Note that
- the step is available only if the build type is Maven build (2/3)
- the upload will fail with a Bad Request error if you try to upload a release artifact
In case adding a plugin is not an option, you can use a dirty hack and alter the pom file on the fly as the first build step via something like sed. That's risky and should not be used unless absolutely necessary.
To update all builds at once, I'd recommend either using some plugin (the Configuration Slicing plugin is an option) or altering the config.xml files directly via a script from the CLI and then using "Reload Configuration" in Jenkins.
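As a rough sketch of the scripted CLI approach (the Jenkins home path and the sed substitution are assumptions, adjust them to your installation):
# back up, then edit every job's config.xml in place (the URL swap is a placeholder)
for f in /var/lib/jenkins/jobs/*/config.xml; do
    cp "$f" "$f.bak"
    sed -i 's|https://old-nexus.example.com|https://nexus.example.com|g' "$f"
done
# afterwards, use "Reload Configuration from Disk" under Manage Jenkins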
I believe that this functionality is built into Maven itself; you can specify a different <repository> and <snapshotRepository> in your <distributionManagement> block. (See docs)
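mvn deploy then routes automatically: versions ending in -SNAPSHOT go to <snapshotRepository>, everything else to <repository>. If you cannot touch the pom at all, a possible alternative is the deploy plugin's alternative-repository parameters; a sketch, assuming maven-deploy-plugin 2.8+ (which uses the id::layout::url format; the 3.x plugin drops the layout part) and placeholder IDs/URLs:
# pick the target repositories on the command line instead of in the pom
mvn clean deploy \
    -DaltSnapshotDeploymentRepository=snapshots::default::https://nexus.example.com/repository/maven-snapshots/ \
    -DaltReleaseDeploymentRepository=releases::default::https://nexus.example.com/repository/maven-releases/
Maven still picks by version: -SNAPSHOT builds go to the first URL, release builds to the second.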

How do I know which project is requesting a specific jar from Maven

I'm using Eclipse and recently upgraded all my projects to use the latest version of a library.
However, in the Maven repository I can still see the old version of the library.
I've manually deleted the old library from the Maven repository, but it keeps coming back.
I am sure all the projects in Eclipse point to the new version: I've checked all my pom.xml, I've used the "Dependency Hierarchy" tool, etc.
Is there a way to know which project is telling Maven to download the old version of the library?
Many thanks!
You can use the Maven dependency plugin's tree goal:
mvn dependency:tree
and filter using the includes option, which uses the pattern [groupId]:[artifactId]:[type]:[version].
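For example, to show only the branches of the tree that lead to the old version (the coordinates are placeholders for your library):
mvn dependency:tree -Dincludes=com.example:old-library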
Re: "and I have many". Perform the following in the topmost directory:
find . -name "pom.xml" -type f -exec mvn dependency:tree -f {} ';' | grep '^\[.*\] [-+\\\|].*'
Syntax details may vary from Bash to Bash.
Hint: Try it in a bottommost project directory first to ensure that it runs properly as intended. Since you have many projects it may take a while to finish and to recognize possible errors only then.
You can use the command below to get a tree of all dependencies and then find out where the specific artifact is coming from.
You can pipe it through grep to show only the related entries if you are on a Linux/Unix-based OS.
mvn dependency:tree
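For example (the artifact name is a placeholder):
mvn dependency:tree | grep -i old-library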
Thanks guys, appreciated, but it certainly is not an easy way. It looks like you have to do it project by project (and I have many). Plus most of my poms reference poms in other folders, and it's not able to process that either.

Maven building only changed files

Let's say I have a module structure like the one below
Modules
->utils
->domain
->client
->services
->deploy (this is at the module level)
Now to launch the client I need to build all the modules, i.e. utils, domain, client and services, because I am loading the jars of all the above modules to finally launch the client.
And all the jars get assembled in the deploy module.
My question is: if I change anything in services, for example, is there a way that, when running a build from deploy, Maven could recognise it has to build only services, and hence build it and deploy it in the deploy folder?
If you only call "mvn install" without "clean", the compiler plugin will compile only modified classes.
For Git:
mvn install -amd -pl "$(git status | grep -E 'modified:|deleted:|added:' | awk '{print $2}' | cut -f1 -d/ | sort -u | paste -sd, -)"
(-pl expects a comma-separated list, hence the sort -u | paste -sd, - at the end, which de-duplicates the module names and joins them with commas.)
OR
In your .bashrc file (.bashrc can be found in the home directory as ~/.bashrc; create it if it doesn't exist), add the following function.
mvn_changed_modules(){
    # use return, not exit: this function runs in your interactive shell
    [ -z "$1" ] && echo "Expected command : mvn_changed_modules (install/build/clean or any maven command)" && return 0
    # collect the changed top-level module directories as a comma-separated list
    modules=$(git status | grep -E 'modified:|deleted:|added:' | awk '{print $2}' | cut -f1 -d/ | sort -u | paste -sd, -)
    if [ -z "$modules" ]; then
        echo "No changes (modified / deleted / added) found"
    else
        echo -e "Changed modules are : $modules\n\n"
        mvn "$1" -amd -pl "$modules"
    fi
}
Then, after restarting your bash (command prompt), you can just use the following command from the root directory itself:
smilyface#machine>ProjectRootDir]$ mvn_changed_modules install
How it works
As per the question, mvn install -amd -pl services is the command to use when "some changes were done in the services module". So, first get the module names from the changed file(s) and pass them as input to the mvn install command.
Say, for example, below is a list of modified files (output of git status):
services/pom.xml
services/ReadMe.txt
web/src/java/com/some/Name.java
Then services and web are the module names which need to be built / compiled / installed.
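With the comma-joined module list from above, the function would then effectively run:
mvn install -amd -pl services,web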
Within a multi-module build you can use:
mvn -pl ChangedModule compile
from the root module to compile only the given ChangedModule. The compiler plugin will only compile the files which have changed. But the module you have changed may require a recompile of other modules which depend on the ChangedModule. This can be achieved by using the following:
mvn -amd -pl ChangedModule compile
where -amd means "also make dependents". This will work without installing the whole set of modules into the local repository via mvn install.
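A minimal sketch of the difference, assuming a reactor where services depends on utils:
# compiles only utils
mvn -pl utils compile
# compiles utils plus everything that depends on it, e.g. services
mvn -amd -pl utils compile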
After trying and using the aforementioned advice, I've met the following problems:
- Maven install (without clean) still takes a lot of time, which for several projects can be 10-20s of extra time.
- sebasjm's solution is fast and useful (I was using it for a couple of months), but if you have several changed projects, rebuilding all of them every time (even if you hadn't changed anything) is a huge waste of time.
What really worked for me is comparing source modification dates against the .jar modification date in the local repository. And if you check only VCS-changed files (see sebasjm's answer), the date comparison won't take noticeable time (for me it was less than 1s for 100 changed files).
The main benefit of such an approach is a very accurate rebuild of only the really changed projects.
The main problem is that the modification date comparison is a bit more than a one-liner script.
For those who want to try it but are too lazy to write such a script themselves, I'm sharing my version of it: https://github.com/bugy/rebuilder (linux/windows).
It can do some additional useful things, but the main idea and central algorithm are as explained above.
If you are using SVN and *nix, run this from the root module:
mvn install -amd -pl $(svn st | colrm 1 8 | sed 's /.* ' | xargs echo | sed 's- -,:-g' | sed 's ^ : ')
I had the same frustration, and I also wrote a project at the time - alas, it is not available, but I found people who implemented something similar:
for example - https://github.com/erickzanardo/maven-watcher
It uses Node.js and assumes a Maven project, but should work on Windows and Unix alike.
The idea of my implementation is to watch for changes and then compile what changed - kind of like nodemon.
So for example
When a java file changes - I compile the module
When a class file or jar changes - I do something else (for example copy the jar under Tomcat and restart Tomcat)
And the two are unrelated... so if the java compilation failed, there would be no reason for the jar file to update... and it's quite stable.
I have used it on a project with 23K .java files and it worked smoothly.
It took the watch process a couple of seconds to start - but then it would only run when a change was detected, so the overall experience was nice.
The next step I intended to add is similar to your SVN support - list the modified files and use them as initialization.
Important to note - if compilation fails, it will retry on the next modification. So if you are modifying multiple jars and the compilation fails while you are still writing code, it will retry compiling everything on each code change until it compiles successfully.
If you'd like, I can try to find my old project, fix it up a bit and publish it...
mvn clean install - run a full build
mvn install - compile only changed files and prepare the war/jars and other binaries
mvn compile - compile only changed files
So mvn compile is the fastest, but if you run/debug your project with the war/jars it might not show those changes.
The question and the answers posted so far do not take the dependency tree into account. What if the utils module is changed? We need to rebuild it (retest it, at least) and all the modules depending on it.
Ways to do so:
- https://github.com/avodonosov/hashver-maven-plugin/
- https://github.com/vackosar/gitflow-incremental-builder/
- Gradle Enterprise, a commercial service which provides a build cache, in particular for Maven
- Migrate to newer build tools like Gradle or Bazel, which support build caches out of the box.

could the first ever maven build be made offline?

The problem: you have a zipped Java project distribution, which depends on several libraries like spring-core, spring-context, jackson, testng and slf4j. The task is to make the thing buildable offline. It's okay to create a project-scope local repo with all the required library jars.
I've tried to do that. It looks like even though the project contains the jars it requires for javac and runtime, the build would still require internet access: Maven would still go out to the network to fetch most of its own plugins it requires for the build. I assume that Maven is run with an empty .m2 directory (as this may be the first launch of the build, which may be an offline build). And no, I am not okay with distributing a full Maven repo snapshot along with the project itself, as this looks like an utter mess to me.
A bit of background: the broader task is to create a Windows portable-style JDK/IntelliJ IDEA distribution which goes along with the project and allows for some minimal Java coding/running inside the IDE with minimal configuration and minimal internet access. The project is targeted towards students in a computer class, with little or no control over the system configuration. It is desirable to keep the console build system intact for the offline mode, but I guess that Maven is overly dependent on the network, so I may have to ditch it in favor of good old Ant.
So, what's your opinion: could we make the first ever Maven build completely offline? My gut feeling is that the initial Maven distribution just contains the bare minimum required to pull essential plugins off the main repo and is not fully functional without seeing the main repo at least once.
Maven has a '-o' switch which allows you to build offline:
-o,--offline Work offline
Of course, you will need to have your dependencies already cached into your $HOME/.m2/repository for this to build without errors. You can load the dependencies with:
mvn dependency:go-offline
I tried this process and it doesn't seem to fully work. I did a:
rm -rf $HOME/.m2/repository
mvn dependency:go-offline # lot of stuff downloaded
# unplugged my network
# develop stuff
mvn install # errors from missing plugins
What did work however is:
rm -rf $HOME/.m2/repository
mvn install # while still online
# unplugged my network
# develop stuff
mvn install
You could run mvn dependency:go-offline on a brand new .m2 repo for the project concerned. This should download everything that Maven needs to be able to run offline. If these files are then put into a project-scope local repo, you should be able to achieve what you want. I haven't tried this, though.
Specify a local repository location, either within the settings.xml file with <localRepository>...</localRepository> or by running mvn with the -Dmaven.repo.local=... parameter.
After the initial project build, all necessary artifacts should be cached locally, and you can reference this repository location the same way while running other Maven builds in offline mode (mvn -o ...).
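Putting the pieces together, a minimal sketch (./project-repo is an assumed path; as reported above, dependency:go-offline alone may miss some plugins, hence the additional online install):
# while online: prime a project-scoped repository
mvn -Dmaven.repo.local=./project-repo dependency:go-offline
mvn -Dmaven.repo.local=./project-repo install
# later: build fully offline against that repository
mvn -o -Dmaven.repo.local=./project-repo install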
