CodeCov cannot find reports - java

I am trying to get Codecov to run and process the reports generated by Jacoco for my multi-project Java Gradle build. However, when I run the codecov script (bash <(curl -s https://codecov.io/bash)), I get the following output:
x> No CI provider detected.
Testing inside Docker? http://docs.codecov.io/docs/testing-with-docker
Testing with Tox? https://docs.codecov.io/docs/python#section-testing-with-tox
project root: .
Yaml found at: .codecov.yml
==> Running gcov in . (disable via -X gcov)
==> Python coveragepy not found
==> Searching for coverage reports in:
+ .
--> No coverage report found.
Please visit http://docs.codecov.io/docs/supported-languages
I have verified that the reports are created by Jacoco in build/reports/jacoco/codeCoverageReport, and that the XML report in fact exists.
I set up the Jacoco reporting following the guide here (GitHub). The main difference between my Gradle code and the code in that repository is that I have xml.destination "${buildDir}/reports/jacoco/report.xml" excluded, because Gradle fails to process the build with it included.
.codecov.yml
codecov:
  require_ci_to_pass: true
coverage:
  precision: 3
  round: up
  range: "70...100"
  status:
    project: true
    patch: yes
    changes: no
parsers:
  gcov:
    branch_detection:
      conditional: yes
      loop: yes
      method: yes
      macro: no
comment:
  layout: "reach,diff,flags,tree"
  behavior: default
  require_changes: false

I figured it out. Running bash <(curl -s https://codecov.io/bash) -h listed the available options, and there I found a -f <file> option to specify the exact report file to use.
From here, I simply use that in my Travis file to get it to upload correctly:
bash <(curl -s https://codecov.io/bash) -f build/reports/jacoco/codeCoverageReport/codeCoverageReport.xml
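For reference, the full local sequence then looks like this (a sketch; codeCoverageReport is assumed to be the Gradle task that produces the XML report at the path above):

# regenerate the Jacoco XML report, then point the uploader at the exact file
./gradlew codeCoverageReport
bash <(curl -s https://codecov.io/bash) -f build/reports/jacoco/codeCoverageReport/codeCoverageReport.xml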

I am using Maven with Java 15.
Add this to pom.xml (under the build section):
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.6</version>
  <executions>
    <execution>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <execution>
      <id>report</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Add this to .travis.yml:
script:
  - mvn clean package
after_success:
  - bash <(curl -s https://codecov.io/bash)
Worked well for me.
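To check locally that the report the uploader needs is actually produced, you can run something like this (a sketch; target/site/jacoco/jacoco.xml is the default output of the jacoco-maven-plugin report goal):

# build and generate the coverage report, then confirm the XML exists
mvn clean package
ls target/site/jacoco/jacoco.xml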

Related

Maven signed jar gives warning "unsigned application"

I'm using Maven for a project that creates a JAR that's embedded in my web application to sign PDF documents using a smartcard.
In my pom.xml I use the maven-jarsigner-plugin as follows:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jarsigner-plugin</artifactId>
  <version>1.4</version>
  <executions>
    <execution>
      <id>sign</id>
      <goals>
        <goal>sign</goal>
      </goals>
    </execution>
    <execution>
      <id>verify</id>
      <goals>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <keystore>/path/to/my/keystore.jks</keystore>
    <alias>my-key-alias</alias>
    <storepass>********</storepass>
    <keypass>********</keypass>
    <verbose>true</verbose>
    <certs>true</certs>
    <arguments>
      <argument>-tsa</argument>
      <argument>https://timestamp.geotrust.com/tsa</argument>
    </arguments>
  </configuration>
</plugin>
The project builds fine, without any errors. 99% of the output is just [INFO] messages, except for some [WARNING] messages from the Maven Shade plugin:
[WARNING] maven-shade-plugin has detected that some .class files
[WARNING] are present in two or more JARs. When this happens, only
[WARNING] one single version of the class is copied in the uberjar.
[WARNING] Usually this is not harmful and you can skip these
[WARNING] warnings, otherwise try to manually exclude artifacts
[WARNING] based on mvn dependency:tree -Ddetail=true and the above
[WARNING] output
When I manually check the resulting jar using the CLI jarsigner it is fine:
Niels-MBP:target niels$ jarsigner -verify my-applet.jar
jar verified.
The jar also verifies without problems on other computers. However, when I include the jar in my web application, users get the message: "security warning: Do you want to run this application? An unsigned application from the location above is requesting permission to run."
UPDATE: When I run jarsigner with the -verbose option, all .class files are marked sm (signature verified, entry is listed in manifest) but are missing the k code (at least one certificate was found in the keystore). This may be the cause of the error. END UPDATE
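For reference, the check described in the update uses jarsigner's standard verification flags; passing the keystore is what lets jarsigner print the k code:

# verify against the keystore; entries signed by a certificate found in the
# keystore are marked "k" in addition to "sm"
jarsigner -verify -verbose -certs -keystore /path/to/my/keystore.jks my-applet.jar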
The page is served over HTTPS. The jar is on the same domain (even the same folder) as the HTML page and included like this:
<script src="https://www.java.com/js/deployJava.js"></script>
<script>
  var attributes = {
    id: 'myApplet',
    code: 'nl.company.project.applet.MyAppletApplet',
    archive: 'my-applet.jar',
    width: 200,
    height: 200
  };
  deployJava.runApplet(attributes, null, '1.7');
</script>
Any help with this would be appreciated!
Niels
The company where I purchased the code signing certificate - Xolphin - tracked down the problem for me. It had something to do with an incorrectly added certificate/alias in the keystore. I recreated the keystore and the problem is gone.
For others facing the same warning: make sure that you uncheck 'Keep temporary files on my computer' in your Java settings (System Preferences -> Java -> Temporary Internet Files -> Settings). Cached copies kept me searching after the problem was already fixed, even though I used different filenames for different versions of my JAR file.

Cannot load UIMA PEAR package through the package installer GUI

I have a problem related to installing a UIMA PEAR package containing an Annotator component. I am using the PearPackagingMavenPlugin for the job with the following setup:
<plugin>
  <groupId>org.apache.uima</groupId>
  <artifactId>PearPackagingMavenPlugin</artifactId>
  <version>2.6.0</version>
  <extensions>true</extensions>
  <executions>
    <execution>
      <phase>package</phase>
      <configuration>
        <!-- PEAR file component classpath settings -->
        <classpath>$main_root/bin</classpath>
        <!-- PEAR file main component descriptor -->
        <mainComponentDesc>desc/S4DocumentUimaAnnotator.xml</mainComponentDesc>
        <!-- PEAR file component ID -->
        <componentId>S4DocumentAnnotator</componentId>
        <!-- PEAR file UIMA datapath settings -->
        <datapath>$main_root/resources</datapath>
      </configuration>
      <goals>
        <goal>package</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I have constructed a special Maven profile that builds the project into a bin directory instead of target, so all my compiled classes are there; that is why I have pointed the plugin's classpath setting at $main_root/bin.
Finally when I load the built pear package I get the following error:
Verification of S4DocumentAnnotator failed =>
org.apache.uima.resource.ResourceInitializationException: The class com.ontotext.s4.api.components.uima.S4DocumentUimaAnnotator is not a valid Analysis Component. You must specify an Annotator, CAS Consumer, Collection Reader, or CAS Multiplier. If you are calling ResourceManager.setExtensionClassPath, this error can also be caused if you have put UIMA framework jar files on the extension classpath, which is not allowed. (Descriptor: file:/home/ceco/s4_stuff/my_pear/S4DocumentAnnotator/desc/S4DocumentUimaAnnotator.xml)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initializeAnalysisComponent(PrimitiveAnalysisEngine_impl.java:228)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initialize(PrimitiveAnalysisEngine_impl.java:170)
at org.apache.uima.impl.AnalysisEngineFactory_impl.produceResource(AnalysisEngineFactory_impl.java:94)
at org.apache.uima.impl.CompositeResourceFactory_impl.produceResource(CompositeResourceFactory_impl.java:62)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:279)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:331)
at org.apache.uima.UIMAFramework.produceAnalysisEngine(UIMAFramework.java:448)
at org.apache.uima.pear.tools.InstallationTester.testAnalysisEngine(InstallationTester.java:218)
at org.apache.uima.pear.tools.InstallationTester.doTest(InstallationTester.java:113)
at org.apache.uima.pear.tools.InstallationController.verifyComponentInstallation(InstallationController.java:1110)
at org.apache.uima.pear.tools.InstallationController.verifyComponent(InstallationController.java:1993)
at org.apache.uima.tools.pear.install.InstallPear.installPear(InstallPear.java:389)
at org.apache.uima.tools.pear.install.InstallPear.access$000(InstallPear.java:80)
at org.apache.uima.tools.pear.install.InstallPear$RunInstallation.run(InstallPear.java:109)
at java.lang.Thread.run(Thread.java:744)
I do not understand why the UIMA jars are not supposed to be packaged, when the whole idea of a PEAR package is to be self-contained and not depend on the system it is run on.
This is what I would try:
The class com.ontotext.s4.api.components.uima.S4DocumentUimaAnnotator is not a valid Analysis Component
Check that S4DocumentUimaAnnotator is valid. Unzip the PEAR and check the xml.
If you are calling ResourceManager.setExtensionClassPath, this error can also be caused if you have put UIMA framework jar files on the extension classpath, which is not allowed.
Did you try to print the extension classpath?
Otherwise you could try to use a plain Java version of the PEAR, meaning: manually unzip it and create a normal Java project from it.
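A minimal sketch of that inspection, assuming the PEAR keeps the desc/ layout from the plugin configuration above:

# a PEAR is a plain zip archive: extract it and read the main descriptor
unzip S4DocumentAnnotator.pear -d S4DocumentAnnotator
cat S4DocumentAnnotator/desc/S4DocumentUimaAnnotator.xml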
One common issue I ran into several times, and which produces exactly this error message, is that my PEAR contains uimaj-common.jar in the lib folder. Have you checked this?
I know this one is old, but what I think you should do is set the scope of the UIMA dependencies to provided. The PEAR needs only its own dependencies to run; the environment it'll be used in should have the UIMA dependencies set up to use all the UIMA features and PEARs.
Provided dependencies won't be copied to the lib folder of the PEAR.
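One way to check which scope the UIMA artifacts actually resolve to (a sketch using the maven-dependency-plugin's tree goal):

# list the resolved UIMA dependencies and their scopes; anything that should
# come from the runtime environment must show up as "provided"
mvn dependency:tree -Dincludes=org.apache.uima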
There are a couple of things that could be going on, but going by your error message, the first thing I would determine is whether or not the uimaj-core jar file is being excluded from the build of the PEAR file. It should be. (Take a look at the error message above.) I just ran into this problem myself, and I got around it by adding this to my POM:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <!-- Copy the dependencies to the lib folder for the PEAR to copy. -->
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <stripVersion>true</stripVersion>
        <outputDirectory>${basedir}/lib</outputDirectory>
        <overWriteReleases>false</overWriteReleases>
        <overWriteSnapshots>true</overWriteSnapshots>
        <includeScope>runtime</includeScope>
        <!-- An exception happens when using a PEAR if the archive includes this jar. -->
        <excludeArtifactIds>uimaj-core</excludeArtifactIds>
      </configuration>
    </execution>
  </executions>
</plugin>
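To confirm the exclusion worked, you can list the PEAR's lib folder after packaging (a sketch; MyComponent.pear stands in for whatever your componentId produces):

# no uimaj-core.jar should appear in the PEAR's lib/ directory
unzip -l target/MyComponent.pear | grep -i uimaj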
I recently posted a parent POM file and a project-specific POM file to my Gist. You're welcome to take a look at what I'm doing. (One of the things I'm doing is copying over my desc directory, which contains my AE's descriptor XML file, to the root of my project directory, so that the maven-pear plugin can copy it into the PEAR archive properly.)
Parent POM:
https://gist.github.com/software-mariodiana/d46e10fca53dc6e6c0f16e20563476b8
Project-specific POM:
https://gist.github.com/software-mariodiana/e9a0f0f03a49d33dcc32655170fd4841
Good luck!

Build executable JAR for Gatling load test

I am new to Gatling (2.1.2) and want to do a small prototype project to show to my colleagues.
According to the quick start page, there are several ways I can run a simulation with Gatling:
1. decompress the Gatling bundle into a folder and drop my simulation files into the user-files/simulations folder. bin/gatling.sh will compile and run the simulation files.
2. use the gatling-maven-plugin to execute the simulation.
3. create a project with the gatling-highcharts-maven-archetype, and run the Engine class.
and I found these problems:
For 1, it is hard to add dependencies for the simulation classes. I have to figure out which jars are needed and drop them into the lib folder.
For 2, it requires Maven to be installed.
For 3, it only runs from an IDE.
I just want a simple executable JAR file with all the dependencies bundled together (my simulation, Gatling and third-party libraries), so I can run it from any machine (like EC2 instances).
Is there a way to achieve this?
Update 1:
I tried method 3, but moved all the project files from the test folder to main, and used maven-assembly-plugin to build a jar with dependencies. When I tried to run the jar, I got the following error:
Exception in thread "main" java.lang.ExceptionInInitializerError
at Engine$.delayedEndpoint$Engine$1(Engine.scala:7)
at Engine$delayedInit$body.apply(Engine.scala:4)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at Engine$.main(Engine.scala:4)
at Engine.main(Engine.scala)
Caused by: java.nio.file.FileSystemNotFoundException
at com.sun.nio.zipfs.ZipFileSystemProvider.getFileSystem(ZipFileSystemProvider.java:171)
at com.sun.nio.zipfs.ZipFileSystemProvider.getPath(ZipFileSystemProvider.java:157)
at java.nio.file.Paths.get(Paths.java:143)
at io.gatling.core.util.PathHelper$.uri2path(PathHelper.scala:32)
at IDEPathHelper$.<init>(IDEPathHelper.scala:7)
at IDEPathHelper$.<clinit>(IDEPathHelper.scala)
... 11 more
I guess this is something to do with the Gatling configuration, but I don't know what has gone wrong.
I tried to do something similar. I could not use Maven either. I will try to remember how I did this.
1) I configured maven-assembly-plugin to generate a single JAR with dependencies like this:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
You need to ensure all required libraries (Gatling, the Scala runtime, the Zinc compiler) are present on your resulting classpath.
2) Check the scope of your dependencies, as Maven packs only classes defined with scope=compile by default. The simplest way is probably to use no test dependencies.
3) Create a launch script, e.g. launch.sh. It should contain something like this:
#!/bin/sh
USER_ARGS="-Dsomething=$1"
COMPILATION_CLASSPATH=`find -L ./target -maxdepth 1 -name "*.jar" -type f -exec printf :{} ';'`
JAVA_OPTS="-server -XX:+UseThreadPriorities -XX:ThreadPriorityPolicy=42 -Xms512M -Xmx2048M -XX:+HeapDumpOnOutOfMemoryError -XX:+AggressiveOpts -XX:+OptimizeStringConcat -XX:+UseFastAccessorMethods -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv6Addresses=false ${JAVA_OPTS}"
java $JAVA_OPTS $USER_ARGS -cp $COMPILATION_CLASSPATH io.gatling.app.Gatling -s your.simulation.FullClassName
To explain, I took Gatling's own launch script for inspiration. Note mainly the presence of the target directory in the classpath definition.
4) Copy your compiled target directory and launch.sh into a single directory and distribute this (e.g. as an archive). Then you can run the scenarios by executing ./launch.sh.
I know this is not a standard solution, but it worked for me. Hopefully it will help you too. If you have any problems or tips for improvement, please share them with us.
I know this is a bit late, but I faced roughly the same problem described here, except that instead of Maven I used Gradle. I guess the approach is the same: a mix of the first solution and something of my own.
First, define a Gradle build file with the Gatling dependencies and a task to build a fat jar:
apply plugin: 'scala'

version = '0.1'

dependencies {
    compile group: 'io.gatling', name: 'gatling-test-framework', version: '2.1.7'
    compile group: 'com.typesafe.akka', name: 'akka-actor_2.11', version: '2.4.7'
    compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.7'
}

repositories {
    mavenCentral()
    mavenLocal()
}

task fatJar(type: Jar) {
    manifest {
        attributes 'Implementation-Title': 'Preparing test',
                'Implementation-Version': version,
                'Main-Class': 'io.gatling.app.Gatling'
    }
    baseName = project.name + '-all'
    from { configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } } {
        exclude 'META-INF/MANIFEST.MF'
        exclude 'META-INF/*.SF'
        exclude 'META-INF/*.DSA'
        exclude 'META-INF/*.RSA'
    }
    with jar
}
That task, executed as
gradle clean build fatJar
will generate a self-contained jar which runs the Gatling main class by default. Telling it which test you want to run is done with the standard '-s' parameter, as shown below.
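For example (a sketch; the jar name comes from the baseName and version defined above, and Gradle writes it to build/libs by default):

# build the fat jar, then run one specific simulation with -s
gradle clean build fatJar
java -jar build/libs/test-project-all-0.1.jar -s FunctionalTestSimulation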
So the last step is to create, if you want, a script to run it. I will "steal" the script from the first answer and change it a bit:
#!/bin/sh
if [ -z "$1" ];
then
    echo "Test config tool"
    echo
    echo "Running Parameters : "
    echo
    echo " <Config file> : Test definition file. Required"
    echo
    exit 0;
fi
USER_ARGS="-DCONFIG_FILE=$1"
JAVA_OPTS="-server -XX:+UseThreadPriorities -XX:ThreadPriorityPolicy=42 -Xms512M -Xmx2048M -XX:+HeapDumpOnOutOfMemoryError -XX:+AggressiveOpts -XX:+OptimizeStringConcat -XX:+UseFastAccessorMethods -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv6Addresses=false ${JAVA_OPTS}"
java $JAVA_OPTS $USER_ARGS -jar test-project-all-0.1.jar -s FunctionalTestSimulation -nr
In my case I run the same test with different, easy-to-configure parameters, so my simulation is always the same.
All my Scala files are compiled by Gradle and packaged in the jar, which means they are on the classpath; changing the "FunctionalTestSimulation" name to a script variable makes it easy to adapt this script to something more generic.
I guess making a Maven version would be easy.
Hope that helps somebody.
Update with folder structure
After a request, here is a small draft of the folder structure for the project:
test-project
|_ build.gradle
|_ src
   |_ main
      |_ scala
      |_ resources
|_ runSimulation.sh
|_ configFile.conf
When I have time I will provide a link to my GitHub with a working one.
Cheers
You can always create a simple Java class that starts Gatling with Gatling.fromArgs. With this setup you can have everything in just one executable jar. Let this class be the jar's mainClass instead of "io.gatling.app.Gatling". This example is for a Scala simulation class "my.package.MySimulation".
import scala.Option;

import io.gatling.app.Gatling;
import io.gatling.core.scenario.Simulation;

public class StartSimulation {
    public static void main(String[] args) {
        Gatling.fromArgs(new String[]{}, new Option<Class<Simulation>>() {
            private static final long serialVersionUID = 1L;

            @Override
            public int productArity() {
                return 0;
            }

            @Override
            public Object productElement(int arg0) {
                return null;
            }

            @SuppressWarnings("unchecked")
            @Override
            public Class<Simulation> get() {
                try {
                    return (Class<Simulation>) Class.forName("my.package.MySimulation");
                } catch (ClassNotFoundException e) {
                    e.printStackTrace();
                }
                return null;
            }

            @Override
            public boolean isEmpty() {
                return false;
            }

            @Override
            public boolean canEqual(Object o) {
                return false;
            }
        });
    }
}
I had a similar issue; I fixed it as follows:
Inside the Gatling bundle there is a bin/ directory; take a look at gatling.sh. You will see that it simply adds certain configuration folders to the classpath and then runs the io.gatling.app.Gatling class from gatling-compiler-<version_number>.jar. So all you need to do is build a jar that includes the compiler, add the configurations and tests to the classpath, and run io.gatling.app.Gatling.
steps:
add compiler dependency:
<dependency>
  <groupId>io.gatling</groupId>
  <artifactId>gatling-compiler</artifactId>
  <version>${gatling.version}</version>
</dependency>
create jar with dependencies:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4.1</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <finalName>${project.build.finalName}</finalName>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
create test jar (this includes your gatling tests)
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <version>2.4</version>
  <executions>
    <execution>
      <goals>
        <goal>test-jar</goal>
      </goals>
      <configuration>
        <excludes>
          <exclude>src/test/resources/*</exclude>
        </excludes>
        <finalName>${project.build.finalName}</finalName>
      </configuration>
    </execution>
  </executions>
</plugin>
create a package out of your configuration. You can use the Maven assembly plugin for that. What I usually do is create a separate module that handles creating the package for different environments. This package contains your gatling.conf, logback.xml and all the other resources your application wants, including test data.
Now you basically have three packages: application.jar, application-tests.jar and application-conf.zip.
Unzip application-conf.zip, then copy application.jar and application-tests.jar into the same folder.
In this folder, you need to create an empty target/test-classes/ directory. In my case it was required; I think you can somehow change that in gatling.conf, but I am not sure how.
Run:
java -cp ".:application-tests.jar:application.jar" io.gatling.app.Gatling
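Putting the steps together (a sketch using the artifact names from the steps above):

# unpack the configuration and place the jars next to it
unzip application-conf.zip
# Gatling expects this directory to exist, even if it stays empty
mkdir -p target/test-classes
java -cp ".:application-tests.jar:application.jar" io.gatling.app.Gatling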
I use IntelliJ IDEA and fixed this by right-clicking the scala folder > Mark Directory as > Test Sources Root. Now execute "Engine" and you will be all good!
I've recently blogged about this in Creating a versionable, self-contained (fat-/uber-) JAR for Gatling tests; the source can be found in jamietanna/fat-gatling-jar.
For a Maven project, the steps would be as follows.
The main thing you need is to add the dependency on gatling-charts-highcharts:
<project>
  <!-- ... -->
  <dependencies>
    <dependency>
      <groupId>io.gatling.highcharts</groupId>
      <artifactId>gatling-charts-highcharts</artifactId>
      <version>${gatling.version}</version>
    </dependency>
  </dependencies>
</project>
Next, you need to make sure your Gatling scenarios/simulations are in src/main instead of src/test.
Finally, you can use the maven-shade-plugin to build an executable JAR which uses Gatling's CLI runner as the mainClass:
<project>
  <!-- ... -->
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.1.1</version>
        <configuration>
          <filters>
            <!-- https://stackoverflow.com/a/6743609 -->
            <filter>
              <artifact>*:*</artifact>
              <excludes>
                <exclude>META-INF/*.DSA</exclude>
                <exclude>META-INF/*.SF</exclude>
                <exclude>META-INF/*.RSA</exclude>
              </excludes>
            </filter>
          </filters>
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <mainClass>io.gatling.app.Gatling</mainClass>
                </transformer>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
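After packaging, the shaded JAR replaces the main artifact by default, so the whole test suite can be run from a single file (a sketch; the jar and simulation names are placeholders for your own):

mvn clean package
# run one simulation from the self-contained JAR
java -jar target/my-gatling-tests-1.0.jar -s my.package.MySimulation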

exec-maven-plugin says cannot run specified program, even though it is on the PATH

Edit 20140716:
Solution found
tl;dr = exec-maven-plugin does not recognise .cmd files, but only .bat files, as executable scripts. Rename grunt.cmd --> grunt.bat, bower.cmd --> bower.bat, etc. as a workaround.
Having done npm install -g grunt-cli on my system, grunt is most certainly on the PATH.
When I run maven install, however, this doesn't seem to register:
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec
(build-spa-bower) on project foobar: Command execution failed.
Cannot run program "grunt" (in directory "C:\workspace\foobar\src\main\spa"):
CreateProcess error=2, The system cannot find the file specified -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException:
Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec
(build-spa-bower) on project foobar: Command execution failed.
Just to be sure, I have executed this:
cd C:\workspace\foobar\src\main\spa
grunt build
... in the same terminal as I issued the Maven command above, and grunt executes just fine.
Does exec-maven-plugin use the PATH environment variable, or does it need to be told that this executable exists in some other way?
EDIT:
This documentation suggests that executables on PATH should be found, so it stumps me further.
I dug into the source code of exec-maven-plugin and found this snippet.
From the source of ExecMojo#getExecutablePath:
CommandLine toRet;
if ( OS.isFamilyWindows() && exec.toLowerCase( Locale.getDefault() ).endsWith( ".bat" ) )
{
    toRet = new CommandLine( "cmd" );
    toRet.addArgument( "/c" );
    toRet.addArgument( exec );
}
else
{
    toRet = new CommandLine( exec );
}
I compared this to another plugin that ran grunt tasks from Maven, and found this:
if (isWindows()) {
    command = "cmd /c " + command;
}
... and that worked for me. So essentially the latter worked because all commands on Windows were prepended with cmd /c, whereas exec-maven-plugin did not do this, because it only did so for files ending in .bat.
Looking in C:\Users\USER\AppData\Roaming\npm, I see:
node_modules (folder)
grunt (unix script file)
grunt.cmd (windows script file)
When I rename grunt.cmd --> grunt.bat, this solves the problem, and exec-maven-plugin is able to run the command.
(This also applies to other executables created using npm install -g, such as bower and yo.)
In addition to bguiz's answer, which would be the best solution, I've created a workaround using Maven profiles, bypassing the problem.
This is a temporary solution until the exec-maven-plugin bug gets fixed.
Please upvote the bug report here: http://jira.codehaus.org/browse/MEXEC-118
Edit: The bug is resolved; you can point to 1.4-SNAPSHOT to fix it.
<project>
  (...)
  <profiles>
    <profile>
      <id>grunt-exec-windows</id>
      <activation>
        <os>
          <family>Windows</family>
        </os>
      </activation>
      <build>
        <plugins>
          <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>${exec-maven-plugin.version}</version>
            <executions>
              <execution>
                <id>grunt-default</id>
                <phase>generate-resources</phase>
                <configuration>
                  <executable>cmd</executable>
                  <arguments>
                    <argument>/C</argument>
                    <argument>grunt</argument>
                  </arguments>
                </configuration>
                <goals>
                  <goal>exec</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>
  </profiles>
</project>
I had the same issue with 1.5.0 of the plugin.
The cause in my case was spaces in my user name, resulting in a grunt path of C:\Users\My name with spaces\AppData\Roaming\npm.
When I moved the contents of the npm directory to a path without spaces, it worked.

How to deploy a node.js app with maven?

Most of our team consists of Java developers, and therefore the whole build / deployment / dependency management system is built on top of Maven. We use CI, so every build runs unit tests (with Karma and PhantomJS for the frontend, and jasmine-node for the backend). I've managed to configure a Karma Maven plugin for this purpose.
This does not solve the issue of downloading node.js dependencies from package.json on build. I need to deploy my node.js / express app in existing environment, so the perfect scenario would be:
pull from the repo (done automatically with maven build)
npm install (that is - downloading dependencies from node package registry)
running tests
I was trying to find a nodejs package for maven, but to be honest - as a node.js developer I do not feel very confident when it comes to choosing the right tools, since I'm not able to distinguish a bad maven plugin from a decent one.
Maybe using a shell plugin and invoking npm install from the terminal is a better choice?
What's your opinion?
You've got two choices:
https://github.com/eirslett/frontend-maven-plugin to let maven download your npm modules from your package.json and let it automagically install node and npm all along
https://github.com/mulesoft/npm-maven-plugin to let maven download your npm packages that you have specified in the pom.xml (link dead as of April 2020, seems to be discontinued)
As a hacky but still feasible solution, you could, as you've mentioned yourself, use something like maven-antrun-plugin to actually execute npm with Maven.
All approaches have their pros and cons, but frontend-maven-plugin seems to be the most often used approach. It assumes that your CI server can download arbitrary packages from the internet, whereas the "hacky" solution should also work when your CI server has no connection to the internet at all (besides proxying the central Maven repo).
I think you can find the answer in Grunt and the many available plugins.
I'm actually working on a web project where the client side is made with AngularJS. Nevertheless, I think the deployment process may partially answer your question.
In your pom.xml, you can do something like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.5</version>
  <executions>
    <execution>
      <id>exec-gen-sources</id>
      <phase>generate-sources</phase>
      <configuration>
        <target name="Build Web">
          <exec executable="cmd" dir="${project.basedir}"
              failonerror="true" osfamily="windows">
            <arg line="/c npm install" />
          </exec>
          <exec executable="cmd" dir="${project.basedir}"
              failonerror="true" osfamily="windows">
            <arg line="/c bower install --no-color" />
          </exec>
          <exec executable="cmd" dir="${project.basedir}"
              failonerror="true" osfamily="windows">
            <arg line="/c grunt release --no-color --force" />
          </exec>
        </target>
      </configuration>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>
The first part is the npm install task: downloading dependencies from the node package registry.
The second part is the bower install task: downloading other dependencies with bower (in my case AngularJS, but you might not need this part).
The third part is the Grunt release part: launching a Grunt task that includes Karma unit testing.
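For reference, the three parts correspond to running the following from the project root (same flags as in the XML above):

npm install
bower install --no-color
grunt release --no-color --force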
You can find documentation about Grunt here. There are many available plugins like Karma unit testing.
I hope this helped you.
I made the npm process work for my AngularJS 2 + Spring Boot application with exec-maven-plugin. I don't use bower and grunt, but I think you can make them work with exec-maven-plugin too, after looking at the antrun example above from Pear.
Below is my pom.xml example for exec-maven-plugin. My app has package.json and all the AngularJS .ts files under src/main/resources, so I run npm from that path. I run npm install for the dependencies and npm run tsc for the .ts conversion to .js.
pom.xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>exec-npm-install</id>
      <phase>generate-sources</phase>
      <configuration>
        <workingDirectory>${project.basedir}/src/main/resources</workingDirectory>
        <executable>npm</executable>
        <arguments>
          <argument>install</argument>
        </arguments>
      </configuration>
      <goals>
        <goal>exec</goal>
      </goals>
    </execution>
    <execution>
      <id>exec-npm-run-tsc</id>
      <phase>generate-sources</phase>
      <configuration>
        <workingDirectory>${project.basedir}/src/main/resources</workingDirectory>
        <executable>npm</executable>
        <arguments>
          <argument>run</argument>
          <argument>tsc</argument>
        </arguments>
      </configuration>
      <goals>
        <goal>exec</goal>
      </goals>
    </execution>
  </executions>
</plugin>
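Manually, those two executions are equivalent to (run from the directory holding package.json):

cd src/main/resources
npm install    # fetch the dependencies
npm run tsc    # compile .ts to .js via the "tsc" script in package.json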
One little gotcha with this is running the Maven build from Eclipse on Windows or Mac. It works perfectly fine in Eclipse on Linux, and even in the Windows command window, but when the build runs in Eclipse on Windows it fails to understand npm and complains about not finding the file. The weird thing is that npm works fine in the Windows command window. To work around this I created an npm.bat file on the system path. In my case node.js and npm are installed under C:\Program Files\nodejs. After putting this batch file in place, everything works fine.
npm.bat
@echo off
set arg1=%1
set arg2=%2
C:\Progra~1\nodejs\npm.cmd %arg1% %arg2%
For Mac, I got the same issue in Eclipse. The thing is that node.js and npm are installed under /usr/local/bin, so to solve it I made symbolic links to /usr/local/bin/node and /usr/local/bin/npm under /usr/bin. However, /usr/bin is protected by the security policy, so I did that after booting from the recovery disk.
Since 2015, there is an alternative to the frontend-maven-plugin mentioned in
Christian Ulbrich's excellent answer:
https://github.com/aseovic/npm-maven-plugin
Usage
Basically, all you have to do to use it is to put it into your POM as usual (and use "extensions:true"):
<build>
  <plugins>
    <plugin>
      <groupId>com.seovic.maven.plugins</groupId>
      <artifactId>npm-maven-plugin</artifactId>
      <version>1.0.4</version>
      <extensions>true</extensions>
    </plugin>
    [...]
  </plugins>
</build>
The plugin will then automatically bind to the Maven lifecycle. Then, you can put a script into your package.json, such as:
"scripts":
{
"package": "npm pack",
[...]
}
and the npm script "package" will run automatically as part of the Maven build lifecycle phase "package".
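With that in place, a plain Maven invocation is enough to trigger the matching npm script (the script name has to match the lifecycle phase):

# runs the "package" script from package.json during Maven's package phase
mvn package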
Compared to frontend-maven-plugin
Just like frontend-maven-plugin, it will run npm scripts inside a maven project. There are two important differences:
frontend-maven-plugin will (and must) download and install npm itself. npm-maven-plugin uses (and requires) an installed version of npm.
frontend-maven-plugin requires you to describe every npm invocation in the POM (as an "execution" section). In contrast, npm-maven-plugin simply extends the Maven build lifecycle to automatically execute an npm script with the same name for each lifecycle phase (clean, install etc.). That means there is no npm-specific configuration in the POM - it's all taken from package.json.
Personally, I prefer the npm-maven-plugin's approach because it requires less configuration in the POM - POMs have a tendency to get bloated, and anything that counters that helps. Also, putting the npm invocations into package.json feels more natural and allows reusing them when invoking npm directly.
Admittedly, even with the frontend-maven-plugin you can [and probably should] define all npm invocations as scripts in package.json, and invoke these scripts from the POM, but there is still the temptation to put them directly into the POM.
