Maven surefire plugin doesn't find unit tests - java

Yes, the Java classes containing the tests are named correctly (they end with Tests).
I tried adding the following configuration to pom.xml:
<configuration>
<includes>
<include>**/*Test.java</include>
</includes>
</configuration>
The tests are located under the following structure: /src/test/packagename/JavaClassTest.java, where packagename is the same package as the class under test beneath src/main/java.
I'm using JUnit 5 (Jupiter) and maven-surefire-plugin 2.22.2.
And I still get the following output from mvn test:
--- maven-surefire-plugin:2.22.2:test (default-test) @ <project-name> ---
[INFO] No tests to run.
What am I doing wrong?

Is there any reason why you configure only certain tests? By default the Surefire plugin should pick up all test classes in the test root and its subdirectories.
You could also point to a specific file like
src/test/ArchTest.java
to see whether it is the "include" in your configuration or something else. I am not sure the wildcards work the way you expect them to. See Maven <include> wildcard match on partial folder name.
Based on this you might try out
<configuration>
<includes>
<include>/**/*Test.java</include>
</includes>
</configuration>
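Also, since you mention the test class names end with Tests (plural), an include list covering both suffixes, mirroring Surefire 2.22's default patterns, might be worth a try (just a sketch, not a confirmed fix):
<configuration>
  <includes>
    <include>**/*Test.java</include>
    <include>**/*Tests.java</include>
  </includes>
</configuration>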

Related

How to use custom delimiter when creating filtered fileSets in a maven artefact?

I'm trying to create a custom Maven archetype that generates a basic Java handler for AWS Lambda. One of the files in my archetype-resources is a serverless.yml file, as we are looking to deploy this handler using the Serverless Framework. I want this file to be part of a filtered="true" fileSet because I want to pre-populate certain fields based on the project groupId, artifactId etc. Here's a sample:
service: cmpy-prefix-${groupId}-${artifactId}-service
# exclude the code coverage files and circle ci files
package:
  exclude:
    - coverage/**
    - .circleci/**
...
provider:
  ...
  environment:
    S3_BUCKET_NAME: ${self:provider.stage}-cmpy-bkt
And I added this file to src/main/resources/META-INF/maven/archetype-metadata.xml as follows:
<fileSet encoding="UTF-8" filtered="true" packaged="false">
<directory></directory>
<includes>
<include>serverless.yml</include>
</includes>
</fileSet>
Now my problem is that the serverless.yml file contains ${self:provider.stage}, which interferes when I run archetype:generate for this archetype, and the error I get is:
org.apache.velocity.exception.ParseErrorException: Encountered ":provider.stage}-cmpy-bkt\...
I tried to set the <delimiters> for the maven-resources-plugin in the pom.xml of my archetype, to no avail. Essentially, I added the following to the pom of the archetype project:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>${org.apache.maven.plugins.maven-resources-plugin.version}</version>
<configuration>
<addDefaultExcludes>false</addDefaultExcludes>
<delimiters>$[*]</delimiters>
</configuration>
</plugin>
</plugins>
</build>
But I still face the same problem when I try to generate a new project using this archetype. The maven archetype plugin seems to be ignoring the delimiter.
Any advice/help on how I can fix this will be immensely appreciated.
Found the solution. I had not realised I could add Velocity directives in my archetype files.
See this other Stack Overflow post for hints: Maven archetype strips comments
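For reference, a minimal sketch of the kind of Velocity trick that post points at (an illustration, not the exact final file): define a variable holding a literal dollar sign at the top of the template and use it wherever Serverless variables must survive the archetype's Velocity processing:
#set( $D = '$' )
service: cmpy-prefix-${groupId}-${artifactId}-service
provider:
  environment:
    S3_BUCKET_NAME: ${D}{self:provider.stage}-cmpy-bkt
When archetype:generate renders the template, ${D} expands to a plain $, so the generated serverless.yml ends up containing ${self:provider.stage} literally.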

Maven Filtering parameters in file

I have been looking over the maven-war-plugin and how to configure it. Here is my situation: I have a web application that is distributed to several production facilities. Two files in this web app are customized for each facility: /js/config.js and /META-INF/context.xml.
I have my project in a typical maven structure:
/src
|--/main
   |--webapp
      |--/js
      |  |--config.js
      |  |--properties
      |     |--plant.properties
      |--/META-INF
         |--context.xml
I've left out non-essential directories for brevity.
The config.js has been altered to contain a "parameter" I want substituted:
var Config = {
...
system_title: '${plant_name} - Audit System',
...
}
The relevant portion of my pom is:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>3.2.0</version>
<configuration>
<filters>
src/main/webapp/js/properties/fayetteville.properties
</filters>
<failOnMissingWebXml>false</failOnMissingWebXml>
<webResources>
<resource>
<directory>src/main/webapp/js</directory>
<filtering>true</filtering>
<exclude>**/properties</exclude>
</resource>
</webResources>
</configuration>
</plugin>
When I run "mvn clean package", I would expect to see ${plant_name} replaced with what is in my properties file. In this case, my properties file contains a single key-value pair:
plant_name=Austin
But I am not seeing the substitution. The resulting config.js in the target folder still contains ${plant_name} as does the config.js in the resulting war file.
I really don't want to use profiles if possible. Eventually, I want the build process to use a list of properties files to do this for all plants.
From my research, including a number of SO questions and answers, I feel I have things configured correctly.
What might I be doing wrong?
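One detail that might be worth double-checking (an observation, not part of the original post): the maven-war-plugin's list parameters are normally written with wrapped child elements, i.e. <filter> entries inside <filters> and <exclude> entries inside an <excludes> element on the resource. A sketch of that shape, reusing the paths from the question:
<configuration>
  <filters>
    <filter>src/main/webapp/js/properties/fayetteville.properties</filter>
  </filters>
  <failOnMissingWebXml>false</failOnMissingWebXml>
  <webResources>
    <resource>
      <directory>src/main/webapp/js</directory>
      <filtering>true</filtering>
      <excludes>
        <exclude>**/properties</exclude>
      </excludes>
    </resource>
  </webResources>
</configuration>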

How to port an Eclipse Java project to another PC and compile it from Shell?

I have created a Java project in Eclipse and successfully executed it directly from Eclipse on my Windows PC. Now I have to run the same Java program on a Linux server.
I tried copying the .class files from my PC to the server and running them, but it didn't work. After that I copied the whole project, ran javac MyProject.java from the shell, and got the following errors:
RecordImportBatch.java:2: error: package org.apache.commons.io does not exist
import org.apache.commons.io.FileUtils;
...
RecordImportBatch.java:3: error: package org.neo4j.graphdb does not exist
import org.neo4j.graphdb.RelationshipType;
which I guess are caused by not including the jar files in the compile command.
There are many jar files included in this project, and as a Java newbie I haven't yet found a way to compile from the shell the project that works in Eclipse.
Does anyone know if there is a way to get the appropriate compile command directly from Eclipse and just paste it into the shell, or do I have to include all jars 'manually'? If that is the case, does anyone know how to include all jars placed in a lib directory that is located in the same folder as MyProject.java?
Thank you!
If you are just learning Java, this suggestion may be something of a challenge, but it would be good for you to use Maven to build your project, which requires reorganizing your source files and directories. Then use the assembly plugin to create a zip that includes all dependencies. To run your program, you then just do something like:
unzip myapp.zip
cd myapp
java -cp "lib/*" com.blah.MyApp
(you might need to adjust the syntax of the /* part, using single quotes, or removing quotes depending on your shell)
Here is a snippet for the assembly plugin (general purpose... nothing hardcoded other than version, and the path which follows conventions). This goes in pom.xml:
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2</version>
<configuration>
<descriptors>
<descriptor>src/main/assembly/distribution.xml</descriptor>
</descriptors>
<appendAssemblyId>false</appendAssemblyId>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<!-- this is used for inheritance merges -->
<phase>package</phase>
<!-- append to the packaging phase. -->
<goals>
<goal>single</goal>
<!-- goals == mojos -->
</goals>
</execution>
</executions>
</plugin>
And here is an example assembly file (this goes in src/main/assembly/distribution.xml relative to pom.xml):
<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd"
>
<id>${artifact.version}</id>
<formats>
<format>zip</format>
</formats>
<files>
<file>
<!-- an example script instead of using "java -cp ..." each time -->
<source>${project.basedir}/src/main/bin/run.sh</source>
<outputDirectory>.</outputDirectory>
<destName>run.sh</destName>
<fileMode>0754</fileMode>
</file>
</files>
<fileSets>
<fileSet>
<directory>${project.basedir}/src/main/resources/</directory>
<outputDirectory>/res/</outputDirectory>
<includes>
<!-- just examples... -->
<include>*.sql</include>
<include>*.properties</include>
</includes>
</fileSet>
<fileSet>
<directory>config/</directory>
<outputDirectory>/config/</outputDirectory>
</fileSet>
</fileSets>
<dependencySets>
<dependencySet>
<outputDirectory>/lib</outputDirectory>
<excludes>
<!-- add redundant/useless files here -->
</excludes>
</dependencySet>
</dependencySets>
</assembly>
Also, Eclipse has a "jar packager" utility in the GUI, but I found it not very good when I used it a few years ago. And I don't think it handles dependencies, so you would need to take my "-cp" argument above and add all the jars, or put them in your lib directory yourself.
Also there is this: http://fjep.sourceforge.net/ but I have never used it; I just found it now while quickly looking up the Eclipse jar packager. In its tutorial, the last line (showing how to run it) looks like:
> java -jar demorun_fat.jar
Hello
If what you need to do is compile and run your program in Eclipse on your PC and transfer the compiled result to the Linux machine, then use File -> Export -> Java -> Runnable JAR file and choose the packaging most suitable for you.
The technologically simplest option is to use "Copy required libraries into a sub-folder next to the jar", but then you need to distribute by zipping the files together and unzipping them on the Linux box.
I would strongly recommend using some kind of build tool; the de facto standards are Ant and Maven, but you can find several alternatives. Both are quite trivial to set up for a smaller project, and using them is also a piece of cake (note that Eclipse can also generate a basic Ant build.xml file for you).
For instance, it could be one command to run your whole project:
> ant run
Buildfile: build.xml
clean:
compile:
[mkdir] Created dir: ...
[javac] Compiling N source file to ...
run:
[java] Running application...
main:
BUILD SUCCESSFUL
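For completeness, here is a minimal sketch of the kind of build.xml meant above (the targets, paths and main class are illustrative, not taken from the original project):
<project name="MyProject" default="run" basedir=".">
  <property name="src.dir" value="src"/>
  <property name="build.dir" value="build/classes"/>
  <!-- classpath: every jar in lib plus the compiled classes -->
  <path id="classpath">
    <fileset dir="lib" includes="*.jar"/>
    <pathelement location="${build.dir}"/>
  </path>
  <target name="compile">
    <mkdir dir="${build.dir}"/>
    <javac srcdir="${src.dir}" destdir="${build.dir}" classpathref="classpath" includeantruntime="false"/>
  </target>
  <target name="run" depends="compile">
    <!-- the main class is a placeholder -->
    <java classname="com.blah.MyApp" classpathref="classpath" fork="true"/>
  </target>
</project>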

Maven Eclipse plugin doesn't regard Maven failsafe plugin?

I'm using Maven to build my project and also to generate my Eclipse project settings. The eclipse:eclipse goal generates the .classpath file for Eclipse, covering the dependencies and other project settings like the source directory, test source directory and so on. Now I have added the Maven failsafe plugin and defined a <testSourceDirectory>/test/integration</testSourceDirectory> beside my normal (JUnit) test directory.
test/unit -> contains my JUnit test cases, which are executed in the Maven "test" phase
test/integration -> contains my integration (possibly also JUnit) test cases, executed in the Maven "integration-test" phase.
Works fine, BUT the eclipse plugin won't consider my <testSourceDirectory> and won't add it as an entry to my .classpath file :-( Is there a way to get the eclipse plugin to add the classpath entry from the failsafe plugin? I already tried the following:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<configuration>
<additionalConfig>
<file>
<name>.classpath</name>
<content>
<![CDATA[<classpathentry kind="src" path="test/integration" output="build/compile/test-classes"/>]]>
</content>
</file>
</additionalConfig>
</configuration>
</plugin>
But this results in an overridden .classpath file with the above entry as its single line.. :-(
Does someone have a good idea how to solve it?
cheers, Yellomen
Have you tried specifying your integration dir in sourceIncludes as described here?
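For reference, a sketch of what that suggestion could look like in the POM, assuming the plugin's sourceIncludes parameter (I have not verified that this actually gets test/integration into the generated .classpath):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-eclipse-plugin</artifactId>
  <configuration>
    <sourceIncludes>
      <sourceInclude>test/integration/**</sourceInclude>
    </sourceIncludes>
  </configuration>
</plugin>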

Building the WAR with m2eclipse in combination with WTP (handling webResources)

I have a situation where I have a web application that is built using maven (i.e., maven-war-plugin). For each code modification, we have had to manually launch maven and restart the application server. Now, to reduce build cycle overhead, I want to use WTP to publish the webapp.
Now, we have resource processing with Maven, and there are some additional Maven tasks defined in our POM when building the webapp. Therefore m2eclipse seems like a natural solution.
I have gotten far enough that the Maven builder is running these tasks and filtering resources correctly. However, when I choose "Run on Server", the WAR file does not look like it would if I built it in Maven.
I am guessing that this is because WTP actually builds the WAR, and not the m2eclipse builder. So even though we have configured the maven-war-plugin in our POM, those settings are not used.
Below is a snippet with our maven-war-plugin configuration. What is configured under "webResources" is not picked up, it appears:
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1-alpha-2</version>
<configuration>
<outputDirectory>${project.build.directory}</outputDirectory>
<workDirectory>${project.build.directory}/work</workDirectory>
<webappDirectory>${project.build.webappDirectory}</webappDirectory>
<cacheFile>${project.build.webappDirectory}/webapp-cache.xml</cacheFile>
<filteringDeploymentDescriptors>true</filteringDeploymentDescriptors>
<nonFilteredFileExtensions>
<nonFilteredFileExtension>pdf</nonFilteredFileExtension>
<nonFilteredFileExtension>png</nonFilteredFileExtension>
<nonFilteredFileExtension>gif</nonFilteredFileExtension>
<nonFilteredFileExtension>jsp</nonFilteredFileExtension>
</nonFilteredFileExtensions>
<webResources>
<!-- Add generated WSDL:s and XSD:s for the web service api. -->
<resource>
<directory>${project.build.directory}/jaxws/wsgen/wsdl</directory>
<targetPath>WEB-INF/wsdl</targetPath>
<filtering>false</filtering>
<includes>
<include>**/*</include>
</includes>
</resource>
</webResources>
</configuration>
Do I need to reconfigure these resources to be handled elsewhere, or is there a better solution?
To fill in an answer to my own question if someone else comes across the same problem, I ended up adding the following to my webapp project:
<resource>
<directory>${project.build.directory}/jaxws/wsgen/wsdl</directory>
<filtering>true</filtering>
<targetPath>${project.basedir}/src/main/webapp/WEB-INF/wsdl</targetPath>
<includes>
<include>**/*</include>
</includes>
</resource>
(inside the resources element under build).
It works fine since my WSDL files are generated in the generate-resources phase and placed in target/jaxws/wsgen/wsdl. They are then copied into src/main/webapp/WEB-INF/wsdl, where the WTP builder picks them up when building the WAR file.
Note: I should mention that I now get some problems with the Eclipse plugin for Maven (i.e., mvn eclipse:eclipse), because apparently you are not allowed to have absolute paths in targetPath. I haven't found a satisfactory workaround yet...
I'm not sure (filtered) web resources are supported yet, see MNGECLIPSE-1149. The issue has a patch (and a workaround) that could work for you. Also have a look at the hack from this thread.
WebResources are supported in m2e-wtp 0.12 and later versions (compatible with Eclipse Helios and Indigo).
For more details, see http://community.jboss.org/en/tools/blog/2011/05/03/m2eclipse-wtp-0120-new-noteworthy
