Installing a local jar for an automated pipeline and SonarQube: how to do it with no command and no systemPath?

I am developing a project which needs a client's personal jar, and it needs to be deployed on a pipeline of tools which are out of our control (sadly). One of the tools in this pipeline is SonarQube.
To build and deploy we have to use Maven.
I put the jar into a folder of the project and tried various ways to actually make it work.
The first (working) way was to declare it as a system-scoped dependency with a systemPath pointing to the folder inside the project. It compiled, worked and everything, but SonarQube apparently hates systemPath and made us take it away.
After a bit of searching, we added a maven-install-plugin to our pom, bound an install-file execution to the validate phase and configured it to install the jar as the dependency. This seems to work locally if I first run mvn validate and THEN mvn clean package. Otherwise, it tries to look for the jar in the main repository and fails. If I comment out the dependency tag and leave only the plugin active, I noticed it executes the plugin and installs the jar into the local repository, but the build fails because it cannot resolve the packages and classes inside the jar. If I then put the dependency tag back in, everything works, because it now finds the jar in the repository.
While this solution works, it doesn't suit me, because the repository will be emptied every once in a while, and to restart everything I would need two commits, one knowingly failing, just to install the jar.
I also tried adding a repository tag instead, pointing to a project directory where I would store the necessary jar. That works just fine on my PC, but utterly fails on the pipeline, which looks at the main repository only (I guess it is some configuration on the pipeline, but I can't really tell, it being outside my control).

I was actually able to do it with the maven-install-plugin:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<configuration>
<groupId>com.example.stuff</groupId>
<artifactId>ClientJar</artifactId>
<version>1.0</version>
<packaging>jar</packaging>
<file>${project.basedir}/src/main/resources/ClientJar-1.0.jar</file>
<generatePom>true</generatePom>
</configuration>
<executions>
<execution>
<id>install-client-jar</id>
<phase>validate</phase>
<goals>
<goal>install-file</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
And having the dependency:
<!-- Client jar -->
<dependency>
<groupId>example</groupId>
<artifactId>ClientJar</artifactId>
<version>1.0</version>
</dependency>
If you notice, the tricky part here is actually that the groupId is different between the dependency and the plugin declaration. I do not know if this difference is due to the configuration of their Artifactory server, but it seems to work locally too.
Also, you need to explicitly run mvn validate in the pipeline.
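In practice the pipeline step ends up as two separate invocations, along these lines (the exact CI syntax will differ; this is just a sketch):
# install the bundled client jar into the local repository first
mvn validate
# then build as usual; the dependency now resolves from the local repository
mvn clean package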

Related


Reload of multi-module maven project changes
Setting
Imagine a multi-module Maven project. The project structure is:
pom.xml            // parent pom
|-- pom.xml        // submodule_1
|-- pom.xml        // submodule_2
|   ...
|-- pom.xml        // submodule_7
For example, submodule_5 has submodule_6 and submodule_7 as dependencies. submodule_5 can be built to construct a WAR file which can be deployed. Spring-Boot-Devtools provides the feature of automatic restart whenever there is a change to submodule_5's classpath.
Whenever the application is run using:
mvn spring-boot:run
And changes are made to submodule_5 (depending on which IDE you use, the classpath gets updated: for Eclipse automatically, for IntelliJ when pressing Ctrl+F9), spring-boot automatically restarts the application and the changes are applied. Changes to submodule_6 or submodule_7 don't trigger the automatic restart.
Questions
Is there a way to make it so that whenever you make changes in submodule_6 or submodule_7, they force a restart and therefore apply the changes?
Spring-boot-devtools uses two classloaders: the "Base Classloader" and the "Restart Classloader". Is it so that on initial start of the application submodule_6 and submodule_7 get added to the "Base Classloader" whilst submodule_5 is kept in the "Restart Classloader", making it so that whenever submodule_5 forces a restart it uses the versions of submodule_6 and submodule_7 out of the "Base Classloader"?
You may specify additional folders to be watched by spring-boot-devtools, in application.properties:
spring.devtools.restart.additional-paths=../submodule_6,../submodule_7
See Spring's documentation on using-boot-devtools-restart-additional-paths.
To fix this problem I started running the application from within IntelliJ, without having to add:
spring.devtools.restart.additional-paths=../submodule_6,../submodule_7
IntelliJ and spring-boot seem to work together very well. The reason it was not working for me in the first place was because I was working from the command line at first.
Difference between command line and IDE
So spring-boot-devtools uses two classloaders to load an application. Jars are loaded once into the "base classloader"; your application is loaded into the "restart classloader". This last classloader restarts every time there is a change on the classpath.
When running submodule_5 from the command line, Maven builds submodule_6 and submodule_7 and adds their jars to the build of submodule_5. Whenever changes are made in submodule_6 or submodule_7, spring-boot won't even notice, since it's only watching submodule_5 and already has the jars it needs. Even if you specifically tell it to also watch those submodules, it still won't rebuild them; it'll just keep using the jars it already has loaded in the "base classloader" (this is my assumption, I'm not 100% certain of the way it works).
When running submodule_5 from the IDE, it won't create the jars of submodule_6 and submodule_7. It will just use their classpaths. This means that changes anywhere in your entire project's classpath (all submodules) will trigger the automatic restart and the changes will be applied.
EXTRA
When running from the IDE, changes to resources like html files, css files, xml files, etc. won't trigger a restart, since this is not a change in the classpath. But the changes will still be visible.
I tried spring.devtools.restart.additional-paths, and in any case it is useless: a source change restarts the application, but it doesn't help, because the application doesn't have the target/classes of the other modules on its classpath during execution.
With spring-boot:run executed in quite recent IntelliJ versions: it works out of the box.
With spring-boot:run executed on the command line: there are at least two cases.
Case 1) we want to execute spring-boot:run from the module that has the Spring Boot main class (submodule_5 in the OP's question).
We need to add, in the plugin configuration of its pom.xml, the additional folders of compiled classes that we want the spring-boot plugin to be aware of:
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<folders>
<folder>
../submodule_6/target/classes
</folder>
<folder>
../submodule_7/target/classes
</folder>
</folders>
</configuration>
</plugin>
</plugins>
</build>
Case 2) we want to execute spring-boot:run from the parent module.
This works only when the multi-module pom is also the parent of the modules.
We need to make two changes:
First, add the spring-boot plugin declaration with the skip flag in the parent pom:
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.4.1</version>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
</plugins>
</build>
Then add this in the pom.xml of the module that has the Spring Boot main class (submodule_5 in the OP's question):
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<skip>false</skip>
</configuration>
</plugin>
</plugins>
</build>
We can now start the application from the parent pom with :
mvn -pl submodule_5 -am spring-boot:run
FYI, these Maven flags run the goal on submodule_5 after building its dependencies (-pl selects the module, -am also makes its dependencies), while the skip flag in the multi-module/parent pom.xml prevents spring-boot:run from executing there.

Maven Shade - add local JAR file [duplicate]

I already found an answer here on Stack Overflow how to include a 3rd party JAR in a project without installing it to a "local repository":
Can I add jars to maven 2 build classpath without installing them?
But, when I use the Maven Shade Plugin to create a JAR that includes all the dependencies of the project as well, the 3rd party JAR is not included automatically.
How can I make the Maven Shade Plugin add such a 3rd party JAR in to the shaded JAR?
As per the answer I got, I made it work. What I did was add this snippet to the beginning of my pom.xml:
<repositories>
<repository>
<id>repo</id>
<url>file://${basedir}/repo</url>
</repository>
</repositories>
Then added a dependency for my project, also to pom.xml:
<dependencies>
<dependency>
<groupId>dummy</groupId>
<artifactId>dummy</artifactId>
<version>0.0.0</version>
<scope>compile</scope>
</dependency>
</dependencies>
And then ran a command line to add a package to 'repo':
mvn org.apache.maven.plugins:maven-install-plugin:2.3.1:install-file
-Dfile=<my-jar>.jar -DgroupId=dummy -DartifactId=dummy
-Dversion=0.0.0 -Dpackaging=jar -DlocalRepositoryPath=`pwd`/repo/
(Not sure if the repo path needs to be a full path, but didn't want to take chances.)
The contents of the repo subdirectory is now:
repo/dummy/dummy/0.0.0/dummy-0.0.0.jar
repo/dummy/dummy/0.0.0/dummy-0.0.0.pom
repo/dummy/dummy/maven-metadata-local.xml
Now I can check this in to version control, and have no local or remote dependencies.
But, when I use the Maven Shade Plugin to create a JAR that includes all the dependencies of the project as well, the 3rd party JAR is not included automatically.
Yes, because system-scoped dependencies are assumed to be always present (this is exactly what the system scope is about), so they won't be included. People actually don't understand what system scope dependencies are; they just keep abusing them (yes, this is abuse), and then get side effects and wonder why (as Brian pointed out in his answer).
I already wrote many, many, really many times about this here on SO, and in 99% of the cases, system scoped dependencies should be avoided. And I'll repeat what the Dependency Scopes mini guide says one more time:
system: This dependency is required in some phase of your project's lifecycle, but is system-specific. Use of this scope is discouraged: This is considered an "advanced" kind of feature and should only be used when you truly understand all the ramifications of its use, which can be extremely hard if not actually impossible to quantify. This scope by definition renders your build non-portable. It may be necessary in certain edge cases. The system scope includes the <systemPath> element which points to the physical location of this dependency on the local machine. It is thus used to refer to some artifact expected to be present on the given local machine and not in a repository; and whose path may vary machine-to-machine. The systemPath element can refer to environment variables in its path: ${JAVA_HOME} for instance.
So, instead of using the system scope, either:
Add your libraries to your local repository via install:install-file. This is a quick and dirty way to get things working; it might be an option if you're alone, but it makes your build non-portable.
Install and run an "enterprise repository" like Nexus, Archiva, or Artifactory and add your libraries via deploy:deploy-file (a sketch of that command follows below). This is the ideal scenario.
Set up a file-based repository as described in this previous answer and put your libraries in there. This is the best compromise if you don't have a corporate repository but need to work as a team and don't want to sacrifice portability.
Please, stop using the system scope.
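For completeness, the deploy-file variant mentioned above looks roughly like this; the repository id, URL and coordinates are placeholders, and the repositoryId must match a <server> entry with credentials in your settings.xml:
mvn deploy:deploy-file -Dfile=<my-jar>.jar \
  -DgroupId=dummy -DartifactId=dummy -Dversion=0.0.0 -Dpackaging=jar \
  -DrepositoryId=my-company-repo -Durl=https://repo.example.com/releases/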
The Maven addjars plugin solves this problem - see
http://code.google.com/p/addjars-maven-plugin/wiki/UsagePage
I used <resources> to include my lib directory with all its jars, i.e.:
<build>
<resources>
<resource>
<directory>${project.basedir}</directory>
<includes>
<include>lib/*.jar</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
If you only need a quick and dirty solution, you can add the content of the extracted jar file to your src/main/resources directory.
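A rough sketch of that quick-and-dirty approach (third-party.jar is a placeholder name):
# unpack the third-party jar so its classes get packaged inside your own artifact
unzip third-party.jar -d src/main/resources
# optionally drop the jar's own metadata to avoid clashing with yours
rm -rf src/main/resources/META-INF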

Automatically install a project in the local repository?

I'm trying to work around a maven bug MDEP-187 ( https://issues.apache.org/jira/browse/MDEP-187 ) by not using workspace resolution.
This forces me to do a mvn install for all my dependencies. I'm doing this by creating a launch configuration in Eclipse with the install goal.
The problem is that I have to create a launch config for every project in my multi-project workspace, and on top of the install I have to manually call every launch config and run it. Which just doesn't work.
Is it possible to automatically install a project in the local repository (whenever I update my code)?
If you don't need to run dependency:copy in Eclipse, you can use the following work-around:
Add a profile to your pom.xml, something like this:
<profiles>
<profile>
<id>copy</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.10</version>
<executions>
[...]
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
Enable workspace resolution in Eclipse.
Then Eclipse will not use dependency:copy, but you can use dependency:copy with command line: mvn install -P copy.
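The elided [...] part is specific to the original build; a typical dependency:copy execution looks roughly like this (coordinates and output directory are placeholders):
<execution>
<id>copy-deps</id>
<phase>package</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<!-- placeholder coordinates: list each artifact you want copied -->
<artifactItem>
<groupId>com.example</groupId>
<artifactId>some-library</artifactId>
<version>1.0</version>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
</artifactItem>
</artifactItems>
</configuration>
</execution>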
I did go with #khmarbaise's solution:
"But then you need to handle the whole thing via maven-assembly-plugin, which can create archives / folders with all the dependencies. Apart from that, a Swing UI must be started somehow, which will need some kind of shell script / batch file which you can create by using appassembler-maven-plugin... And it sounds like you need to go for a multi-module project in Maven, because you might have parts like core, ui, etc. which need to be combined in the end."
#khmarbaise I was under the impression that the assembly-plugin didn't support putting dependencies in a lib/ folder (just putting everything in one big jar), but after a little bit of trying I just got myself a zip with a runnable jar and my dependencies in a lib/ folder. Tomorrow I'm going to read a bit more about the assembly-plugin. I'm happy ;-)
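For anyone wondering what the zip with a runnable jar plus a lib/ folder can look like, here is a minimal assembly descriptor sketch (the file name, format and paths are placeholders):
<!-- src/main/assembly/dist.xml: zip with the project jar at the root and dependencies under lib/ -->
<assembly>
<id>dist</id>
<formats>
<format>zip</format>
</formats>
<includeBaseDirectory>false</includeBaseDirectory>
<dependencySets>
<dependencySet>
<outputDirectory>lib</outputDirectory>
<useProjectArtifact>false</useProjectArtifact>
</dependencySet>
</dependencySets>
<fileSets>
<fileSet>
<directory>${project.build.directory}</directory>
<outputDirectory>/</outputDirectory>
<includes>
<include>*.jar</include>
</includes>
</fileSet>
</fileSets>
</assembly>
The maven-assembly-plugin is then pointed at this descriptor via <descriptors> and bound to the package phase; to make the main jar runnable against lib/, the maven-jar-plugin's manifest configuration (<addClasspath> with <classpathPrefix>lib/</classpathPrefix>) is typically used as well.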

Eclipse/Maven and "Resolve dependencies from workspace projects" can't mix jars and source?

I've got what seems like a corner case for Eclipse/Maven and "Resolve dependencies from workspace projects". My project has a mix of written code and generated code, with the generated code coming from a dependency which uses JAXWS.
The problem is that if I check "Resolve dependencies", Eclipse/Maven ignores any JAR dependencies and tries to resolve everything by only looking at the workspace, which results in Eclipse showing errors like "Package/Class not found" (related to the generated code) even though the project will build fine with Maven from the command line.
On the other hand, if I uncheck it, it resolves everything by only looking at the JARs in the Maven repository. The second option generally works, but when I do something like Ctrl-click on a class or variable, I get the Class File Editor and "Source not found", which isn't terribly useful. Also, it can get out of sync if I edit code in the IDE but don't run "maven install" after that.
I suppose this is mainly an inconvenience with Eclipse but it's annoying. I am considering resolving this by modifying the Maven dependencies to build with source (or debug) but I can't necessarily do this with everything. Is the "Resolve dependencies" option intended to work exclusively one way or the other as I've described?
You might want to have a look at the build helper maven plugin.
You can configure it like this:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<id>add-source</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>target/generated-sources</source>
<source>target/jaxws/wsimport/java</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
This will tell your Eclipse Maven plugin to have a look at the generated sources and include them in your project classpath.
You can also add the generated sources manually to your classpath in eclipse. (right-click on the generated folder -> add to build path)
I think that, since you want to reference files that only exist after a build, you somehow have to force the build to happen before you need the references resolved. You could cheat by just doing a build from within Eclipse. That would leave the generated source files in place, ready to be referenced. I think, however, that the Maven philosophy would have you move the generated code into another Maven artifact entirely. That would let you separate the lifecycle of the two groups of code, so that when you're ready to use Eclipse to edit the hand-coded code, references to generated classes are resolved because you've already generated that code in the build of a separate, independent module.
I know this is an old issue. But I encountered the same thing in Juno with an updated "m2e-wtp" plugin. So I'm answering solely for other readers' benefit.
This was only happening in war projects. The only thing that eventually resolved it was removing the ".settings" folder under the war project's folder and restarting Eclipse.

Creating a self-contained source release with Maven

Up until now we used Ant in my company. Whenever we wanted to send the application to the client, we ran a special Ant script that packaged all our source code with all the jar libraries and Ant itself, along with a simple batch file.
Then the client could put the files on a computer with no network access at all (and not even Ant) and run the batch file. As long as the computer had a valid JDK, the batch script would compile all the code using the jars and create a WAR/EAR that would finally be deployed by the client on the application server.
Lately we migrated to Maven 2, but I haven't found a way to do the same thing. I have seen the Maven assembly plugin, but it just creates source distributions or binary ones. Our scenario is actually a mix, since it contains our source code but binary jars of the libraries we use (e.g. Spring, Hibernate).
So is it possible to create with Maven a self-contained assembly/release/package that one can run on a computer with no network access at all? That means that all libraries should be contained inside.
Extra bonus if Maven itself is contained inside as well, but this is not a strict requirement. The final package should be easily compiled by just one command (easy for a system administrator to perform).
I was thinking of writing my own Maven plugin for this but I suspect that somebody has already encountered this.
From your dev environment, if you include the following under build plugins
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
and invoke mvn assembly:assembly, you would get yourApp-version-with-dependencies.jar in the target folder. This is a self-sufficient jar, and with a Main-Class MANIFEST.MF entry, anybody can double-click and run the application.
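The Main-Class entry can be set from the same plugin configuration; a sketch, where the class name is a placeholder for your own entry point:
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<archive>
<manifest>
<!-- placeholder: replace with your application's main class -->
<mainClass>com.example.app.Main</mainClass>
</manifest>
</archive>
</configuration>
</plugin>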
You might try this approach:
Use mvn ant:ant to create Ant build scripts from a Maven project
Make sure Ant is a project dependency
Use the assembly plugin to build an Ant system
or plan b:
Use mvn ant:ant to create Ant build scripts from a Maven project
Make sure Ant is a project dependency
Write a "bootstrap class" to call Ant and run the build
Use appassembler to build a scripted build and install environment
In plan b, you'd write scripts to set up a source tree someplace from the packaged source jars, and then use the appassembler-generated bat or sh scripts to call the bootstrap and build via Ant. Your bootstrap can do anything you need to do before or after the build.
Hope this helps.
Perhaps an answer that I submitted for a similar question could be of some assistance. See Can maven collect all the dependant jars for a project to help with application deployment? The one piece missing is how to include the source code in the assembly. I have to imagine that there is some way to manage that with the assembly plugin. This also doesn't address the inclusion of Maven in the distribution.
What was the reason for moving from Ant to Maven? It sounds like you had everything worked out well with the Ant solution, so what is Maven buying you here?
If it is just dependency management, there are techniques for leveraging Maven from Ant that give you the best of both worlds.
The source plugin will give you a jar containing the source of a project ("source:jar"). You could then use the assembly plugin to combine the source jars from your internal projects (using the sources to reference these source jars) and the binary jars from the external projects into one distribution.
However, as for turning this into a compilable unit, I have no suggestions. You could certainly bundle Maven, but you'd need to create a bundle containing all the plugins you need to build your project! I don't know of any existing tool to do that.
This is how I do it... In the build part of the pom, add this:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<executions>
<execution>
<id>attach-sources</id>
<phase>verify</phase>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
And then in the profiles section, add this bit:
<profiles>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
And when I do a Maven install, it builds the jar and also installs a jar of the source.
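For reference, with the configuration above a plain install is enough to get the sources jar attached, and the release profile can be activated explicitly when needed:
# the verify-phase execution attaches <artifactId>-<version>-sources.jar
mvn clean install
# or, with the release profile from the profiles section
mvn clean install -P release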
