How to set VM options for JLink launcher executable - java

When using jlink, a bin/java file is generated. This executable accepts VM options specified on the command line in the usual way (such as -Dsystem.property=value or -Xmx1G).
jlink also provides a --launcher option to create an executable that can be run directly, instead of having to invoke the bin/java executable with a module name.
How do I make the launcher executable pre-configured to use my choice of JVM options?

You can use the add-options jlink plugin.
For example, if you want to set -Xmx:
jlink --add-options="-Xmx100m" ...
To see a list of jlink plugins, run jlink --list-plugins.
The add-options plugin is currently documented (JDK14) as follows:
Plugin Name: add-options
Option: --add-options=<options>
Description: Prepend the specified <options> string, which may include
whitespace, before any other options when invoking the virtual machine
in the resulting image.
Beware that some of the plugins are, apparently, unstable (including add-options): https://docs.oracle.com/en/java/javase/12/tools/jlink.html
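For illustration, here is a sketch of a full invocation that combines --add-options with --launcher (the module path, the module name myapp and the main class com.example.Main are assumptions, not taken from the question):
jlink --module-path target/modules \
      --add-modules myapp \
      --launcher myapp=myapp/com.example.Main \
      --add-options="-Xmx100m -Dsystem.property=value" \
      --output target/image
The generated bin/myapp launcher then starts the VM with those options already applied.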

There are a few ways of going about this, but mostly I'm going to concentrate on the default Java ways.
Actual answer - use JPackage. JLink just produces a runtime image; JPackage is your distributable:
Support for native packaging formats to give the end user a more natural installation experience. Specifically, the tool will support the following formats:
Windows: msi, exe
macOS: pkg, app in a dmg (drag the app into the Applications directory)
Linux: deb, rpm
The application will be installed in the typical default directory for each platform unless the end-user specifies an alternate directory during the installation process (for example, on Linux the default directory will be /opt).
The ability to specify JDK and application arguments at packaging time that will be used when launching the application (see the jpackage sketch after this list)
The ability to package applications in ways that integrate into the native platform, for example:
Setting file associations to allow launching an application when a file with an associated suffix is opened
Launching from a platform-specific menu group, such as Start menu items on Windows
Option to specify update rules for installable packages (such as in rpm/deb)
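Referring back to the point about specifying JDK arguments at packaging time: as a sketch (the module name myapp, main class com.example.Main and the jlink image path are assumptions), JVM options can be baked in with --java-options:
jpackage --name myapp \
         --module myapp/com.example.Main \
         --runtime-image target/image \
         --java-options "-Xmx100m"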
1) Specify an @args file
You can make an @args file that you deploy (bundled) with your jlink app, and reference it when starting the application:
java @args -m module/main
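For example, an args file (here hypothetically named vmargs, deployed next to the image) could contain one option per line:
-Xmx1G
-Dsystem.property=value
and the application would then be started with java @vmargs -m module/main.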
2) Use the new environment variable
JDK_JAVA_OPTIONS=--add-opens java.base/java.lang=...... -Xmx1G -Djdk.logging.provider=
https://docs.oracle.com/javase/9/tools/java.htm#JSWOR624
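A minimal sketch (the options shown are placeholders); the runtime image's java launcher picks the variable up automatically:
export JDK_JAVA_OPTIONS="-Xmx1G -Dsystem.property=value"
bin/java -m module/main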
3) Use JLink/JMod to specify the main class in the module
https://maven.apache.org/plugins/maven-jmod-plugin/plugin-info.html
<plugin>
<artifactId>maven-jmod-plugin</artifactId>
<version>3.0.0-alpha-1</version>
<extensions>true</extensions>
<configuration>
<module>
<mainClass>mainClass</mainClass>
</module>
</configuration>
</plugin>
4) Use JLink to create a custom launcher / edit JLINK_VM_OPTIONS
<plugin>
<artifactId>maven-jlink-plugin</artifactId>
<version>3.0.0-alpha-2-SNAPSHOT</version>
<extensions>true</extensions>
<configuration>
<noHeaderFiles>true</noHeaderFiles>
<noManPages>true</noManPages>
<stripDebug>true</stripDebug>
<verbose>true</verbose>
<compress>2</compress>
<launcher>customjrelauncher=module/mainClass</launcher>
</configuration>
<dependencies>
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm</artifactId>
<version>${maven.asm.version}</version>
</dependency>
</dependencies>
</plugin>
In the generated .sh/.bat launcher there is a variable where you can specify any custom add-ons (default VM options).
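For reference, the shell launcher that jlink generates looks roughly like this (a sketch; the exact contents vary by platform and JDK version), and the JLINK_VM_OPTIONS line is where the default VM options go:
#!/bin/sh
JLINK_VM_OPTIONS="-Xmx100m"
DIR=`dirname $0`
$DIR/java $JLINK_VM_OPTIONS -m module/mainClass "$@"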
You can also specify the main class descriptor in your module-info using moditect if you are using it:
<plugin>
<groupId>org.moditect</groupId>
<artifactId>moditect-maven-plugin</artifactId>
<executions>
<execution>
<id>add-module-infos</id>
<phase>package</phase>
<goals>
<goal>add-module-info</goal>
</goals>
<configuration>
<overwriteExistingFiles>true</overwriteExistingFiles>
<module>
<mainClass>mainClassLocation</mainClass>
</module>
</configuration>
</execution>
</executions>
</plugin>

Related

Passing an options/arguments file to Maven compiler plugin

The javac command can be configured with a file by specifying that file on the command line with @:
javac @compileargs
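Such an argument file simply lists options (and optionally source file names); for example, a hypothetical compileargs might contain:
-Xlint:all
-parameters
-encoding UTF-8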
I want to use that syntax in Maven so I can collect parts of the command line arguments in such a file instead of Maven's pom.xml.
The Maven compiler plugin does not seem to have a specific tag for that, so I tried compilerArgs:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.6.1</version>
<configuration>
<compilerArgs>
<arg>@compile-args</arg>
</compilerArgs>
<fork>true</fork>
</configuration>
</plugin>
But then javac complains:
javac: invalid flag: @compile-args
Usage: javac <options> <source files>
use --help for a list of possible options
If I get the actual command Maven is executing (with -X) and call that myself it works, though.
I recently had a similar problem with spaces in compiler options so I assume a similar process is screwing with me here.
Background info: the maven-compiler plugin depends on the plexus compiler.
If the build process gets forked, it takes all specified arguments and creates a temporary config file of its own (see the code). That argument file will also include the user-defined argument file, but the documentation points out that:
Use of the at sign (@) to recursively interpret files is not supported.
This means referencing an options file from Maven is not possible.

Spring boot additional Crash Command

According to the Spring Boot documentation, it's possible to define additional commands when using a remote shell based on CRaSH.
Default locations for these commands are classpath*:/commands/, classpath*:/crash/commands/
A property can be used to override the default locations but in the provided example, the custom command is located in resources.
In my opinion, custom commands (at least java commands) shouldn't be located in resources but in src/main/java.
It works fine when defining a custom path in resources, but how can I define a custom path in src/main/java? I haven't found a way to do it so far!
If they're under src/main/java, they'll be compiled automatically, which is not what you need. My solution was to simulate that directory as a resources folder, which in short translates to:
configure the compiler plugin to ignore that particular folder
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
<excludes>
<exclude>crash/commands/*</exclude>
</excludes>
</configuration>
</plugin>
copy the files just like any regular resources in the target directory
<resource>
<directory>src/main/java/crash/commands</directory>
<targetPath>crash/commands</targetPath>
<filtering>false</filtering>
</resource>
Minor update & disclaimer:
As you may already know, there are a couple of closures which are executed on login/logout. At least with v1.3.1, which is what I'm blindly inheriting from spring-boot, it will pick the first login.groovy it finds in the classpath. My project's artifact is packaged in an RPM along with all the other dependencies. Since its name begins with r, it comes after crash.shell-1.3.1.jar which is where the defaults reside, so I had to do the following small hack to make it pick up my own scripts instead of the default ones:
<!-- hack to make CRaSH pick up login.groovy from our jar instead of the default one -->
<finalName>0_${project.artifactId}-${project.version}</finalName>
You can try to put your command at src/main/resources/commands/

Maven AppAssembler not finding class

I'm attempting to modify an existing Java/Tomcat app for deployment on Heroku following their tutorial, and I'm running into some issues with AppAssembler not finding the entry class. Running target/bin/webapp (or deploying to Heroku) results in Error: Could not find or load main class org.stopbadware.dsp.Main
Executing java -cp target/classes:target/dependency/* org.stopbadware.dsp.Main runs properly however. Here's the relevant portion of pom.xml:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>appassembler-maven-plugin</artifactId>
<version>1.1.1</version>
<configuration>
<assembleDirectory>target</assembleDirectory>
<programs>
<program>
<mainClass>org.stopbadware.dsp.Main</mainClass>
<name>webapp</name>
</program>
</programs>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>assemble</goal>
</goals>
</execution>
</executions>
</plugin>
My guess is mvn package is causing AppAssembler to not use the correct classpath, any suggestions?
Your artifact's packaging must be set to jar, otherwise the main class is not found.
<pom>
...
<packaging>jar</packaging>
...
</pom>
The artifact itself is added at the end of the classpath, so nothing other than a JAR file will have any effect.
Try:
mvn clean package jar:jar appassembler:assemble
I was able to solve this by adding "$BASEDIR"/classes to the CLASSPATH line in the generated script. Since the script gets rewritten on each call of mvn package, I wrote a short script that calls mvn package and then adds the needed classpath entry.
Obviously a bit of a hack, but after 8+ hours of attempting a more "proper" solution this will have to do for now. Will certainly entertain any more elegant ways of correcting the classpath suggested here.
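For reference, a minimal sketch of such a wrapper (assuming the generated launcher is target/bin/webapp and GNU sed; the exact script is illustrative, not the one from this answer):
#!/bin/bash
# Rebuild, then append "$BASEDIR"/classes to the CLASSPATH line of the generated launcher
mvn package
sed -i 's|^CLASSPATH=.*|&:"$BASEDIR"/classes|' target/bin/webapp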
I was going through that tutorial some time ago and had a very similar issue. I came up with a slightly different approach which works very nicely for me.
First of all, as mentioned before, you need to keep your POM's packaging as jar (<packaging>jar</packaging>) - thanks to that, the appassembler plugin will generate a JAR file from your classes and add it to the classpath, so your error will go away.
Please note that in this tutorial Tomcat is instantiated from the application source directory. In many cases that is enough, but note that with this approach you will not be able to use Servlet @WebServlet annotations, as /WEB-INF/classes in the sources is empty and Tomcat will not be able to scan your servlet classes. So the HelloServlet servlet from that tutorial will not work unless you add some additional Tomcat initialization (resource configuration) as described here (BTW, you will find more SO questions talking about that resource configuration).
I took a slightly different approach:
I run the org.apache.maven.plugins:maven-war-plugin plugin (exploded goal) during package and use the generated directory as the source directory of my application. With that approach my web application directory has /WEB-INF/classes "populated" with classes. That in turn allows Tomcat to perform its scanning job correctly (i.e. Servlet @WebServlet annotations will work).
I also had to change the source of my application in the launcher class:
public static void main(String[] args) throws Exception {
// Web application is generated in directory name as specified in build/finalName
// in maven pom.xml
String webappDirLocation = "target/embeddedTomcatSample/";
Tomcat tomcat = new Tomcat();
// ... remaining code does not change
Changes I added to the POM - the maven-war-plugin included just before the appassembler plugin:
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.5</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>exploded</goal>
</goals>
</execution>
</executions>
</plugin>
...
Please note that the exploded goal is called.
I hope that small change will help you.
One more comment on that tutorial and the Maven build: note that the tutorial was written to show how simple it is to build an application and run it on Heroku. However, that is not the best approach to the Maven build.
The Maven recommendation is to adhere to producing one artifact per POM. In your case there should be two artifacts:
Tomcat launcher
Tomcat web application
Both should be built as separate POMs and referenced as modules from your parent POM. If you look at the complexity of that tutorial, it does not make much sense to split it into two modules. But if your application gets more and more complex (and the launcher gets some additional configuration etc.) it will make a lot of sense to make that "split". As a matter of fact, there are some "Tomcat launcher" libraries already created, so alternatively you could use one of them.
You can set the CLASSPATH_PREFIX environment variable:
export CLASSPATH_PREFIX=target/classes
which will get prepended to the classpath of the generated script.
The first thing is that you are using an old version of the appassembler-maven-plugin; the current version is 1.3.
What I don't understand is why you are defining the
<assembleDirectory>target</assembleDirectory>
folder. There is a good default value for that, so usually you don't need it. Apart from that, you don't need to define an explicit execution bound to the package phase, because the appassembler-maven-plugin is bound to the package phase by default.
Furthermore you can use the useWildcardClassPath configuration option to make your classpath shorter.
<configuration>
<useWildcardClassPath>true</useWildcardClassPath>
<repositoryLayout>flat</repositoryLayout>
...
</configuration>
The error shown when calling the generated script comes from the fact that the location of the repository folder, where all the dependencies are located, is different from the location defined in the generated script.

Managing multiple Java modules with external resource dependencies

Suppose I have a set of n Java libraries, each with a conf and a resources folder, and then I have a Java project X that depends on some of these n Java libraries. How do I make it so that when X is built, all the dependent conf and resources folders are copied and merged into the dist folder? No - I don't want them to be embedded in the jars.
Obviously, there will be issues with duplicate filenames, but let's assume all files have distinct names.
Edit: An additional and related question: how do I make it so that project X can detect the conf and resources folders of all the dependent projects during the development phase, without needing to copy them over to project X's folder? For example, I'd like Netbeans to be able to find the resources that the referenced libraries use when I click "Run" on X's main method.
Edit2: Here's a hypothetical example of a project setup:
**Library 1:** Image Processing
conf: Processing configurations, log4j
resources: Training sets, etc.
**Library 2:** Machine Learning
conf: Training parameters, log4j
resources: Dependent C++ batch files (i.e. system calls)
**Library 3:** Reporting Tool
resources: Reporting templates
**Library 4:** Text Mining Toolkit
conf: Encoding, character sets, heuristics
resources: Helper PHP scripts
**Executable Project 1:**
Uses Library 1 to process images
Uses Library 2 to do machine learning on processed images
Uses Library 3 to make reports
**Executable Project 2:**
Uses Library 4 to do text mining
Uses Library 2 to do machine learning on collected textual information
Uses Library 3 to make reports
We can assume Executable Projects 1 and 2 can use different parameters for their constituent libraries once deployed.
Take a look at the maven-dependency-plugin, which can copy the dependencies to a particular location.
<project>
[...]
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.5.1</version>
<executions>
<execution>
<id>copy</id>
<phase>package</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/alternateLocation</outputDirectory>
<destFileName>optional-new-name.jar</destFileName>
</artifactItem>
</artifactItems>
<outputDirectory>${project.build.directory}/wars</outputDirectory>
<overWriteReleases>false</overWriteReleases>
<overWriteSnapshots>true</overWriteSnapshots>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
[...]
</project>
I see the following in your example. Let me use Library 1 as an example.
Library 1: Image Processing
conf: Processing configurations, log4j
resources: Training sets, etc.
You have Library 1, which contains processing configuration, and that sounds to me like runtime configuration. This means it should be part of the created jar (src/main/resources is the location for such things). The same goes for the log4j configuration: just put it into the jar (src/main/resources of the project).
Now coming to resources: the training set. If you make a separate Maven project which contains a training set, this will produce a single artifact that can later be integrated into Executable Project 1. If you have several training sets, you can create different artifacts and use them as usual dependencies, or use the maven-dependency-plugin (or maybe the maven-remote-resources-plugin) to use them in your projects.
With this setup you can deploy Library 1 into your local repository and of course into a repository manager and use it as a dependency.
You can use the same approach to handle Library 2, 3 etc.
Maybe you can take a look at the maven-remote-resources-plugin (I'm not sure if this helps).

Environment Variable with Maven

I've ported a project from Eclipse to Maven and I need to set an environment variable to make my project work.
In Eclipse, I go to "Run -> Run configurations" and, under the tab "environment", I set "WSNSHELL_HOME" to the value "conf".
How can I do this with Maven?
You can just pass it on the command line, as
mvn -DmyVariable=someValue install
[Update] Note that the order of parameters is significant - you need to specify any options before the command(s).[/Update]
Within the POM file, you may refer to system variables (specified on the command line, or in the pom) as ${myVariable}, and environment variables as ${env.myVariable}. (Thanks to commenters for the correction.)
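For example (a sketch; the wsnshell.home property name is made up for illustration):
<properties>
  <!-- ${env.WSNSHELL_HOME} reads the environment variable; ${myVariable} would read a -DmyVariable system property -->
  <wsnshell.home>${env.WSNSHELL_HOME}</wsnshell.home>
</properties>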
Update2
OK, so you want to pass your system variable to your tests. If - as I assume - you use the Surefire plugin for testing, the best option is to specify the needed system variable(s) within the pom, in your plugins section, e.g.
<build>
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
...
<configuration>
...
<systemPropertyVariables>
<WSNSHELL_HOME>conf</WSNSHELL_HOME>
</systemPropertyVariables>
</configuration>
</plugin>
...
</plugins>
</build>
The -D properties will not be reliably propagated from the Surefire plugin to your test (I do not know why it works with Eclipse). When using Maven on the command line, use the argLine property to wrap your property. This will pass it to your test:
mvn -DargLine="-DWSNSHELL_HOME=conf" test
Use System.getProperty to read the value in your code. Have a look at this post about the difference between System.getenv and System.getProperty.
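A minimal sketch of the reading side (the default value "conf" is an assumption):
// Reads the value passed as a system property, e.g. -DWSNSHELL_HOME=conf or via argLine;
// System.getenv("WSNSHELL_HOME") would read an environment variable instead.
String wsnshellHome = System.getProperty("WSNSHELL_HOME", "conf");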
You could wrap your maven command in a bash script:
#!/bin/bash
export YOUR_VAR=thevalue
mvn test
unset YOUR_VAR
For environment variables in Maven, you can use the configuration below.
http://maven.apache.org/surefire/maven-surefire-plugin/test-mojo.html#environmentVariables
http://maven.apache.org/surefire/maven-failsafe-plugin/integration-test-mojo.html#environmentVariables
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
...
<configuration>
<includes>
...
</includes>
<environmentVariables>
<WSNSHELL_HOME>conf</WSNSHELL_HOME>
</environmentVariables>
</configuration>
</plugin>
Following the documentation from @Kevin's answer, the configuration below worked for me for setting an environment variable with the maven-surefire-plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<environmentVariables>
<WSNSHELL_HOME>conf</WSNSHELL_HOME>
</environmentVariables>
</configuration>
</plugin>
Another solution would be to set MAVEN_OPTS (or other environment variables) in ${user.home}/.mavenrc (or %HOME%\mavenrc_pre.bat on windows).
Since Maven 3.3.1 there are new possibilities to set mvn command line parameters, if this is what you actually want:
${maven.projectBasedir}/.mvn/maven.config
${maven.projectBasedir}/.mvn/jvm.config
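For example, a ${maven.projectBasedir}/.mvn/maven.config containing just the argument from this question (a sketch) applies it to every mvn invocation in that project:
-DWSNSHELL_HOME=conf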
There is a Maven plugin called properties-maven-plugin. It provides a goal set-system-properties to set system variables. This is especially useful if you have a file containing all these properties: you can read the property file and set its entries as system properties.
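A sketch of that goal's configuration (the version and default phase binding are assumptions; check the plugin documentation):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>properties-maven-plugin</artifactId>
  <version>1.0.0</version>
  <executions>
    <execution>
      <goals>
        <goal>set-system-properties</goal>
      </goals>
      <configuration>
        <properties>
          <property>
            <name>WSNSHELL_HOME</name>
            <value>conf</value>
          </property>
        </properties>
      </configuration>
    </execution>
  </executions>
</plugin>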
In your code, add:
System.getProperty("WSNSHELL_HOME")
Modify or add the property value from the Maven command:
mvn clean test -DargLine=-DWSNSHELL_HOME=yourvalue
If you want to run it in Eclipse, add VM arguments in your Debug/Run configurations
Go to Run -> Run configurations
Select Tab Arguments
Add in section VM Arguments
-DWSNSHELL_HOME=yourvalue
you don't need to modify the POM
You can pass some of the arguments through the _JAVA_OPTIONS variable.
For example, define a variable for maven proxy flags like this:
_JAVA_OPTIONS="-Dhttp.proxyHost=$http_proxy_host -Dhttp.proxyPort=$http_proxy_port -Dhttps.proxyHost=$https_proxy_host -Dhttps.proxyPort=$http_proxy_port"
And then use mvn clean install (it will automatically pick up _JAVA_OPTIONS).
I suggest using the amazing tool direnv. With it you can inject environment variables once you cd into the project. These steps worked for me:
.envrc file
source_up
dotenv
.env file
_JAVA_OPTIONS="-DYourEnvHere=123"
As someone might end up here changing their global Java options, I want to say that defining _JAVA_OPTIONS is a bad idea. Instead, define the MAVEN_OPTS environment variable, which will still be picked up automatically by Maven but won't override everything the way _JAVA_OPTIONS does (e.g. IDE VM options).
MAVEN_OPTS="-DmyVariable=someValue"
