How to disable sql-maven-plugin file copy to temp folder? - java

Here is a piece of plugin configuration:
<configuration>
  <srcFiles>
    <srcFile>src${file.separator}integration-test${file.separator}resources${file.separator}sql${file.separator}schema.sql</srcFile>
  </srcFiles>
</configuration>
Everything works, but I see this in the log:
[INFO] --- sql-maven-plugin:1.5:execute (create-tables) @ smsfinance-server ---
[INFO] Executing file: C:\Users\User\AppData\Local\Temp\schema.915861870sql
Is there a way to disable copying?

No, this is not possible at the moment.
Reading the source code of version 1.5, the SQL Maven Plugin copies the source files to a temporary directory in order to handle filtering, even when filtering is disabled. Filtering is controlled by the enableFiltering attribute.
You could create an issue on their GitHub page asking to skip the copy when filtering is disabled (which is the default).
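For reference, enableFiltering sits directly in the plugin's configuration; a minimal sketch (using the same srcFile as above, written with forward slashes, which Maven accepts on Windows as well):
<configuration>
  <!-- false is the default; as of 1.5 the file is still copied to a temp directory either way -->
  <enableFiltering>false</enableFiltering>
  <srcFiles>
    <srcFile>src/integration-test/resources/sql/schema.sql</srcFile>
  </srcFiles>
</configuration>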

Workaround
If your SQL files live on an SSD but the temp directory does not, you can point the temp directory at the SSD as well: add -Djava.io.tmpdir=/ssd-drive/tmp to the Maven command line. This does not disable the copying, but it speeds up SQL execution.
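For example, assuming the schema script is executed as part of verify (adjust the goal to your build):
mvn verify -Djava.io.tmpdir=/ssd-drive/tmp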

Related

Maven run Jetty Plugin by command line specifying contextPath

I'm on IntelliJ IDEA CE and I'm running a war application by means of the Maven Jetty Plugin.
I don't have the plugin in my pom.xml (and I don't want to), so I'm running the web server directly with this command:
mvn org.eclipse.jetty:jetty-maven-plugin:9.4.26.v20200117:run-exploded
It works fine, but it doesn't apply the contextPath specified in the XML file src/main/webapp/META-INF/context.xml.
I would like to specify the right contextPath from the terminal command.
The documentation doesn't say anything specific about this.
The attempts I've made (none successful) are the following:
mvn org.eclipse.jetty:jetty-maven-plugin:9.4.26.v20200117:run-exploded -Dproject.artifactId='/project'
mvn org.eclipse.jetty:jetty-maven-plugin:9.4.26.v20200117:run-exploded -DcontextPath='/project'
mvn org.eclipse.jetty:jetty-maven-plugin:9.4.26.v20200117:run-exploded -Dconfiguration.webApp.contextPath="/project"
mvn org.eclipse.jetty:jetty-maven-plugin:9.4.26.v20200117:run-exploded -Djetty.configuration.webApp.contextPath="/project"
What am I missing?
This is ultimately a generic Maven tip, not something Jetty specific.
In other words, it is about how to figure out what you can do with a Maven plugin.
$ mvn org.eclipse.jetty:jetty-maven-plugin:9.4.26.v20200117:help
...(snip)...
jetty:help
Display help information on jetty-maven-plugin.
Call mvn jetty:help -Ddetail=true -Dgoal=<goal-name> to display parameter
details.
So let's see what the details are for the run-exploded goal ...
$ mvn org.eclipse.jetty:jetty-maven-plugin:9.4.26.v20200117:help -Ddetail=true -Dgoal=run-exploded
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------< org.apache.maven:standalone-pom >-------------------
[INFO] Building Maven Stub Project (No POM) 1
[INFO] --------------------------------[ pom ]---------------------------------
[INFO]
[INFO] --- jetty-maven-plugin:9.4.26.v20200117:help (default-cli) @ standalone-pom ---
[INFO] Jetty :: Jetty Maven Plugin 9.4.26.v20200117
Jetty maven plugins
jetty:run-exploded
This goal is used to assemble your webapp into an exploded war and
automatically deploy it to Jetty.
Once invoked, the plugin runs continuously, and can be configured to scan for
changes in the pom.xml and to WEB-INF/web.xml, WEB-INF/classes or WEB-INF/lib
and hot redeploy when a change is detected.
You may also specify the location of a jetty.xml file whose contents will be
applied before any plugin configuration. This can be used, for example, to
deploy a static webapp that is not part of your maven build.
Available parameters:
contextHandlers
List of other contexts to set up. Consider using instead the <jettyXml>
element to specify external jetty xml config file. Optional.
contextXml
Location of a context xml configuration file whose contents will be
applied to the webapp AFTER anything in <webApp>.Optional.
dumpOnStart (Default: false)
Use the dump() facility of jetty to print out the server configuration to
logging
User property: dumponStart
excludedGoals
List of goals that are NOT to be used
httpConnector
A ServerConnector to use.
jettyXml
Comma separated list of a jetty xml configuration files whose contents
will be applied before any plugin configuration. Optional.
loginServices
List of security realms to set up. Consider using instead the <jettyXml>
element to specify external jetty xml config file. Optional.
nonBlocking (Default: false)
Determines whether or not the server blocks when started. The default
behavior (false) will cause the server to pause other processes while it
continues to handle web requests. This is useful when starting the server
with the intent to work with it interactively. This is the behaviour of
the jetty:run, jetty:run-war, jetty:run-war-exploded goals.
If true, the server will not block the execution of subsequent code. This
is the behaviour of the jetty:start and default behaviour of the
jetty:deploy goals.
reload (Default: automatic)
reload can be set to either 'automatic' or 'manual' if 'manual' then the
context can be reloaded by a linefeed in the console if 'automatic' then
traditional reloading on changed files is enabled.
User property: jetty.reload
requestLog
A RequestLog implementation to use for the webapp at runtime. Consider
using instead the <jettyXml> element to specify external jetty xml config
file. Optional.
scanIntervalSeconds (Default: 0)
The interval in seconds to scan the webapp for changes and restart the
context if necessary. Ignored if reload is enabled. Disabled by default.
Required: Yes
User property: jetty.scanIntervalSeconds
server
A wrapper for the Server object
skip (Default: false)
Skip this mojo execution.
User property: jetty.skip
stopKey
Key to provide when stopping jetty on executing java -DSTOP.KEY=<stopKey>
-DSTOP.PORT=<stopPort> -jar start.jar --stop
stopPort
Port to listen to stop jetty on executing -DSTOP.PORT=<stopPort>
-DSTOP.KEY=<stopKey> -jar start.jar --stop
supportedPackagings
Per default this goal support only war packaging. If your project use an
other type please configure it here.
systemProperties
System properties to set before execution. Note that these properties will
NOT override System properties that have been set on the command line or
by the JVM. They WILL override System properties that have been set via
systemPropertiesFile. Optional.
systemPropertiesFile
File containing system properties to be set before execution Note that
these properties will NOT override System properties that have been set on
the command line, by the JVM, or directly in the POM via systemProperties.
Optional.
User property: jetty.systemPropertiesFile
useProvidedScope (Default: false)
Whether or not to include dependencies on the plugin's classpath with
<scope>provided</scope> Use WITH CAUTION as you may wind up with duplicate
jars/classes.
war (Default: ${project.build.directory}/${project.build.finalName})
The location of the war file.
Required: Yes
webApp
An instance of org.eclipse.jetty.webapp.WebAppContext that represents the
webapp. Use any of its setters to configure the webapp. This is the
preferred and most flexible method of configuration, rather than using the
(deprecated) individual parameters like 'tmpDirectory', 'contextPath' etc.
This tells you that the webApp configuration element is where you set the contextPath.
Unfortunately, that's a complex object and you cannot specify that on the command line.
So edit your pom.xml to include it.
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.eclipse.jetty</groupId>
        <artifactId>jetty-maven-plugin</artifactId>
        <configuration>
          <webApp>
            <contextPath>/foo</contextPath>
          </webApp>
        </configuration>
      </plugin>
      ...
See also How to define complex Maven properties in the command line
Here is the link:
https://www.eclipse.org/jetty/documentation/jetty-9/index.html#jetty-maven-plugin
Here is the command line:
mvn jetty:run -Dcontext=/abc
This command line is for the following pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.rahul.soAnswer</groupId>
  <artifactId>jetty-run</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>jetty-answer</name>
  <packaging>war</packaging>
  <properties>
    <maven.compiler.source>11</maven.compiler.source>
    <maven.compiler.target>11</maven.compiler.target>
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>org.eclipse.jetty</groupId>
        <artifactId>jetty-maven-plugin</artifactId>
        <version>9.4.44.v20210927</version>
        <configuration>
          <webApp>
            <contextPath>${context}</contextPath>
          </webApp>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
You can add more detail to the configuration as per your application's needs.

Ivy Install task fails with JSCH SFTP error 4 first time, but is successful on subsequent attempts

I am trying to use the ANT Ivy install task to copy a library from one repository to the other.
Some example code within my ANT target:
<ivy:install organisation="testOrg" module="testModuleName" revision="1.2.3" from="fromRepo" to="toRepo"/>
The fromRepo and toRepo are defined in a local ivysettings.xml file.
The resolve (from fromRepo) of the library is successful but the install to toRepo fails, with an SFTP Code 4 error.
impossible to install testOrg#testModuleName;1.2.3: java.io.IOException: Failure
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.put(SFTPRepository.java:164)
at org.apache.ivy.plugins.repository.AbstractRepository.put(AbstractRepository.java:130)
at org.apache.ivy.plugins.resolver.RepositoryResolver.put(RepositoryResolver.java:234)
at org.apache.ivy.plugins.resolver.RepositoryResolver.publish(RepositoryResolver.java:215)
at org.apache.ivy.core.install.InstallEngine.install(InstallEngine.java:150)
at org.apache.ivy.Ivy.install(Ivy.java:537)
at org.apache.ivy.ant.IvyInstall.doExecute(IvyInstall.java:102)
at org.apache.ivy.ant.IvyTask.execute(IvyTask.java:271)
...
Caused by: 4: Failure
at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2833)
at com.jcraft.jsch.ChannelSftp.mkdir(ChannelSftp.java:2142)
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.mkdirs(SFTPRepository.java:186)
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.mkdirs(SFTPRepository.java:184)
at org.apache.ivy.plugins.repository.sftp.SFTPRepository.put(SFTPRepository.java:160)
... 37 more
However, if I simply run the same target again, the install completes successfully!
It seems to be some issue with creating a directory, from com.jcraft.jsch.ChannelSftp.mkdir(ChannelSftp.java:2142) in the stacktrace.
After the 1st run, the testOrg/testModuleName directory exists (only testOrg existed previously).
The 2nd time it is run, the testOrg/testModuleName/1.2.3 directory is created (along with the library artifacts).
If, after the 1st run, I delete the testOrg/testModuleName directory it created, it will continue to return the code 4 error.
My ANT library directory contains jsch-0.1.50.jar, which I assume is being used to upload to the destination Ivy server.
In addition I am using:
Ant 1.8.4
Ivy 2.4.0
Java 1.7.0_80
By debugging the Ivy SFTP source code that creates the new directories on the destination toRepo repository, I was able to see why this was happening.
The code is in the method SFTPRepository.mkdirs(), which recursively calls itself to create each directory in the path if it does not exist.
For my example the directory being uploaded was:
/toRepo/testOrg/testModuleName//1.2.3/
You can see the double slash: // in the middle of the path.
This meant that the mkdirs() method tried to create the testModuleName directory twice. The 2nd attempt failed, which caused the code 4 error.
The reason there is a double slash in the path is that there is no branch for this artifact.
Within my Ivy settings file, the sftp resolver patterns (for my toRepo repository) were configured as:
<ivy pattern="/toRepo/[organisation]/[module]/[branch]/[revision]/ivy-[revision].xml"/>
<artifact pattern="/toRepo/[organisation]/[module]/[branch]/[revision]/[artifact]-[revision].[ext]"/>
The /[branch]/ part of the pattern is what was generating the // in the path.
There are 2 configurations, one for the ivy.xml file itself and the other for all other artifacts.
Ivy patterns allow the use of parentheses for optional parts of the pattern.
So changing my configuration to:
<ivy pattern="/toRepo/[organisation]/[module](/[branch])/[revision]/ivy-[revision].xml"/>
<artifact pattern="/toRepo/[organisation]/[module](/[branch])/[revision]/[artifact]-[revision].[ext]"/>
fixed the issue, and the Ivy install functioned as expected.
This means that for artifacts where no branch is defined (like 3rd-party artifacts), the branch directory will not be included in the path.
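For context, a rough sketch of how the corrected patterns sit inside an sftp resolver in ivysettings.xml (the host, user and resolver name below are placeholders, not the original configuration):
<resolvers>
  <sftp name="toRepo" host="repo.example.com" user="deploy">
    <!-- the (/[branch]) group is skipped entirely when no branch is set, avoiding the double slash -->
    <ivy pattern="/toRepo/[organisation]/[module](/[branch])/[revision]/ivy-[revision].xml"/>
    <artifact pattern="/toRepo/[organisation]/[module](/[branch])/[revision]/[artifact]-[revision].[ext]"/>
  </sftp>
</resolvers>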

Configure multiple Jacoco reports for Sonarqube with Maven

I am building a SpringBoot/Maven project. Jacoco is configured to check coverage and creates 3 reports:
from unit tests (surefire) target/jacoco.exec (xml: target/site/jacoco)
from integration tests (failsafe) target/jacoco-it.exec (xml: target/site/jacoco-it)
aggregate (jacoco-aggregate) target/jacoco-aggregate.exec (xml: target/site/jacoco-aggregate)
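For reference, the first two of these reports typically come from a jacoco-maven-plugin configuration along these lines (default destFile and output locations assumed; the aggregate execution and the coverage check rules are omitted, and this is only a sketch, not necessarily the exact setup in question):
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <!-- agent for surefire; writes target/jacoco.exec by default -->
    <execution>
      <id>prepare-agent</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <!-- unit-test report; XML/HTML under target/site/jacoco by default -->
    <execution>
      <id>report</id>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
    <!-- agent for failsafe; writes target/jacoco-it.exec by default -->
    <execution>
      <id>prepare-agent-integration</id>
      <goals>
        <goal>prepare-agent-integration</goal>
      </goals>
    </execution>
    <!-- integration-test report; XML/HTML under target/site/jacoco-it by default -->
    <execution>
      <id>report-integration</id>
      <goals>
        <goal>report-integration</goal>
      </goals>
    </execution>
  </executions>
</plugin>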
Without any additional configuration, Sonarqube seems to create its own aggregate:
[INFO] 13:14:27.153 Analysing C:\Users\code\dras-mt\target\jacoco-it.exec
[INFO] 13:14:27.248 Analysing C:\Users\code\dras-mt\target\jacoco.exec
[INFO] 13:14:27.414 Analysing C:\Users\code\dras-mt\target\sonar\jacoco-merged.exec
And it warns about deprecated use of properties:
[DEBUG] 13:14:27.000 Property 'sonar.junit.reportsPath' is deprecated and will be ignored, as property 'sonar.junit.reportPaths' is also set.
(they are not set via my pom.xml)
The problem with the generated report is that Sonarqube seems to show the aggregated coverage but only counts the unit tests.
My question is how to get a consistent report (coverage matching the test count) and, ideally, how to export all 3 reports to Sonarqube (Sonarqube's own aggregate could be the 4th).
I tried to use this property:
<properties>
  ....
  <sonar.coverage.jacoco.xmlReportPaths>
    ${project.build.directory}/site/jacoco-aggregate/,${project.build.directory}/site/jacoco/,${project.build.directory}/site/jacoco-it/
  </sonar.coverage.jacoco.xmlReportPaths>
</properties>
But this property seems to be completely ignored (I also tried absolute and relative paths, setting it via -D, and a single path), so the log output (even with -X) does not change.
(First I tried this: https://community.sonarsource.com/t/coverage-test-data-importing-jacoco-coverage-report-in-xml-format/12151)
So how do I correctly configure the sonar-maven-plugin (3.7.0.1746) to show the 3 reports?

Jenkins ERROR: Failed to create /usr/share/tomcat7/.m2 on Maven project

I am running Jenkins ver. 2.60.2 and it doesn't seem possible, within a Maven Job, to define a local repository not in /usr/share/tomcat7/.m2.
Here are my attempts:
I created a Global Maven settings.xml and a Settings file with the Config File Management Plugin, each containing:
<settings>
  <localRepository>/srv/maven/.m2/repository</localRepository>
  ...
</settings>
I created a new Maven project and tried to make the job see that file by attempting all of the following:
a) Defining either Settings file or Global settings file (I created two identical files) within the build step.
b) Adding a Pre-step Provide Configuration files, and then using the variable MY_SETTINGS either in the Goals and options or MAVEN_OPTS (roughly as sketched after this list).
c) Using Provide Configuration files within the build environment (and using MY_SETTINGS in the same way as in the previous step).
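For b) and c), referencing the provided file means passing the exported variable to Maven, presumably something like this in Goals and options (or with the equivalent --settings/--global-settings flag, depending on which file is provided):
clean install -s $MY_SETTINGS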
However, none of these seems to work. The job always fails, trying to use the default Maven repository location (/usr/share/tomcat7/.m2), which I have no idea how to re-define:
provisioning config files...
copy managed file [MYFILE settings] to file:/srv/webapps/jenkins/jobs/testJob/workspace@tmp/config3408982272576109420tmp
provisioning config files...
copy managed file [MYFILE settings] to file:/srv/webapps/jenkins/jobs/testJob/workspace@tmp/config2203063037747373567tmp
Parsing POMs
using global settings config with name MYFILE settings
Replacing all maven server entries not found in credentials list is true
Deleting 1 temporary files
ERROR: Failed to create /usr/share/tomcat7/.m2
Finished: FAILURE
Do you know how to make this work within a Maven Job type in Jenkins?

Maven plugin conditional execution based on a previous plugin's execution (maven-compiler-plugin)

Is it possible to conditionally execute a goal in the compile phase, based on whether the maven-compiler-plugin actually detected source changes and therefore compiled and produced new class files?
My use case would be to do things like run the findbugs or jacoco plugins only when there is new bytecode in the project.
Currently, I unconditionally run findbugs by hooking it into the compile phase:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>findbugs-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>findbugs-check-compile</id>
      <phase>compile</phase>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
However, if I repeatedly execute "mvn package" I get:
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ my-prj ---
[INFO] Nothing to compile - all classes are up to date
[INFO] >>> findbugs-maven-plugin:3.0.1:check (findbugs-check-compile) > :findbugs @ my-prj >>>
[INFO] --- findbugs-maven-plugin:3.0.1:findbugs (findbugs) @ my-prj ---
[INFO] Fork Value is true
[INFO] Done FindBugs Analysis....
Notice how maven-compiler-plugin detects "Nothing to compile - all classes are up to date". I'd like to only execute findbugs:check afterwards if this is NOT the case (or equivalently, I'd like to SKIP the "findbugs:check" goal execution if this indeed IS the case and nothing has changed).
NOTE 1: I know about profiles and conditional activation based on things like OS / architecture / system properties / etc, but my understanding is that these are evaluated early when maven starts, and cannot change later during the build.
NOTE 2: I've also seen maven-ant-plugin mentioned, but I'd like to just skip the extra plugin's execution altogether. I don't want to add an antrun execution just to be able to skip findbugs.
NOTE 3: I need to be able to do this for multiple plugins, not just findbugs
To be honest, I don't think there is a way to do this; there are only workarounds that don't exactly match your need (based on profiles, a property set in one profile being picked up by another, etc.).
Something close to this is usually achieved with tools like Jenkins: you set up a basic job, e.g. running the package (compile) step, and if it succeeds, completes, or you see something being generated, you trigger a post-build job to execute FindBugs.
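For illustration, the profile-based workaround mentioned above usually amounts to toggling a plugin's skip property, roughly like this (with the limitation the question already notes: profile activation is decided when the build starts, not after compilation):
<profiles>
  <profile>
    <id>skip-analysis</id>
    <properties>
      <!-- findbugs-maven-plugin reads its skip flag from this user property -->
      <findbugs.skip>true</findbugs.skip>
    </properties>
  </profile>
</profiles>
It would then have to be activated explicitly, e.g. mvn package -Pskip-analysis.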
