Given the following Maven project:
root-project
    database
    server
I am able to configure the exec-maven-plugin inside the server sub-module to run the application found there. However, if someone updates a peer sub-module (e.g. database), I get runtime errors. What I would like instead is some mechanism that would:
Build root-project and its sub-modules when I build the "current project" - I don't mind which project this is; it could be root-project or server.
Run server when I run the "current project".
This way, I can initiate all project-wide operations from a single point instead of having to context-switch between the two projects.
I tried configuring the exec-maven-plugin at root-project to do this, but <classpath/> resolves to root-project's classpath instead of the desired server classpath.
I'm wondering if approaching this from the opposite end is possible (configuring the maven-compiler-plugin in server to build root-project and its dependencies), but I'm not sure how to do so. I am also worried that this might set off an endless loop, with root-project trying to build server and server trying to build root-project.
I hope I understood your question correctly; it looks like any other multi-module project to me:
I've created a project available from https://github.com/johanwitters/stackoverflow-mavenExec
Parent: stackoverflow-mavenExec
Child module 1: database
Child module 2: server
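For reference, the parent POM aggregates the two modules roughly like this (a minimal sketch; see the linked repository for the exact file):
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.johanw.stackoverflow.mavenExec</groupId>
    <artifactId>stackoverflow-mavenExec</artifactId>
    <version>0.1-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
        <module>database</module>
        <module>server</module>
    </modules>
</project>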
Module "database" has 1 class com.johanw.stackoverflow.database.Database defined as:
package com.johanw.stackoverflow.database;

public class Database {
    public static String DATBASE_NAME = "The name";
}
Module "server" depends on module "database".
<dependency>
    <groupId>com.johanw.stackoverflow.mavenExec</groupId>
    <artifactId>database</artifactId>
    <version>0.1-SNAPSHOT</version>
</dependency>
Module "server" has a exec-maven-plugin plugin defined as:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <id>install</id>
            <phase>install</phase>
            <goals>
                <goal>java</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <mainClass>com.johanw.stackoverflow.server.Test</mainClass>
    </configuration>
</plugin>
This runs the class com.johanw.stackoverflow.server.Test available from "server", which is defined as:
package com.johanw.stackoverflow.server;

import com.johanw.stackoverflow.database.Database;

public class Test {
    public static void main(String[] args) {
        System.out.println("Hello " + Database.DATBASE_NAME);
    }
}
You mentioned "I don't mind which project this is, it could be root-project or server". So, for the above to work, you'll need to build the root project. When you do, root (parent) project stackoverflow-mavenExec using ...
mvn clean install
... it will eventually run com.johanw.stackoverflow.server.Test and output:
Hello The name
If you want to only run the server main class, run the following command from within the root directory:
mvn exec:java -pl server -Dexec.mainClass=com.johanw.stackoverflow.server.Test
If your goal is to switch between a clean install and running the server, you can remove the exec-maven-plugin definition from the server POM and run, respectively:
mvn clean install
and
mvn exec:java -pl server -Dexec.mainClass=com.johanw.stackoverflow.server.Test
I hope this helps.
Related
I have a strange problem after having set up my build pipeline in Eclipse via Maven. I will use my Sass compilation as an example, but this extends to my whole pipeline (JS merging, font copying, etc.). Some snippets to boot:
pom.xml
<plugins>
    <plugin>
        <groupId>com.github.eirslett</groupId>
        <artifactId>frontend-maven-plugin</artifactId>
        <version>1.0</version>
        <executions>
            [...]
            <execution>
                <id>gulp build css</id>
                <goals>
                    <goal>gulp</goal>
                </goals>
                <configuration>
                    <arguments>css</arguments>
                    <srcdir>${project.basedir}/src/main/resources/sass</srcdir>
                    <outputdir>${project.build.directory}/generated-resources/static/css</outputdir>
                </configuration>
            </execution>
        </executions>
    </plugin>
</plugins>
gulpfile.js
[...]
gulp.task('css', function() {
    var scssSource = 'src/main/resources/sass/main.scss';
    var cssTarget = 'target/generated-resources/static/css/';
    // compile sass files
    var sassStream = gulp.src(scssSource).pipe(
        sass().on('error', sass.logError));
    // autoprefix and minify
    sassStream.pipe(autoprefixer({
        browsers: ['last 2 versions', 'ie >= 10']
    })).pipe(concat('main.css')).pipe(gulp.dest(cssTarget)).pipe(
        rename(function(path) {
            path.basename += ".min";
        })).pipe(cleanCSS()).pipe(gulp.dest(cssTarget));
});
I confirmed that the build pipeline is working, i.e. it produces the correct output. Looking at the Maven console in Eclipse, I can see the build running successfully. I can even open the generated CSS files through the IDE in "target/generated-resources/static/css" and confirm the changes made.
However, they are not populated (hot deployed) to a running application. After restarting the application, the changes are visible. I tried refreshing the folder manually in Eclipse, but that yielded no success.
If I trigger the build manually, instead of relying on the watch mechanism of the Maven plugin, the same thing happens. However, a refresh of the output folder (generated-resources here) does propagate the changes to the running application.
This is extremely annoying, since it delays front-end development a lot (running the build manually, refreshing the folder, etc.). I can also confirm that hot deployment itself works: static JS/CSS files (which are not part of the gulp build) can be changed, with the result immediately visible in the running application.
You must refresh the Eclipse workspace for hot deployment to pick up the generated files.
Alternatively, you can configure Eclipse to check for changes automatically by going to:
Window->Preferences->General->Workspace and check "Refresh automatically"
Or (depending on your Eclipse version)
Window->Preferences->General->Workspace and check "Refresh using native hooks or polling"
I have multiple projects using similar step definitions. Hence, I keep all step definitions in a single project and add it as a dependency JAR in Maven.
When I run using the Maven command, it says:
You can implement missing steps with the snippets below:
@When("^Import a canvas from \"(.*?)\" to project \"(.*?)\"$")
public void import_a_canvas_from_to_project(String arg1, String arg2) throws Throwable {
    // Write code here that turns the phrase above into concrete actions
    throw new PendingException();
}
But when I add the package in the same project, it works fine. (It even works in Eclipse across different projects.) Is there any way to run such scenarios from Maven and Jenkins?
I am using the Eclipse IDE. The Maven command I use is:
mvn -DprofileTest=cucumberID clean -P cucumberID test
cucumberID is my profile name.
I added the following profile in pom.xml:
<profile>
    <id>cucumberID</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <version>2.11</version>
                <artifactId>maven-surefire-plugin</artifactId>
                <configuration>
                    <testFailureIgnore>true</testFailureIgnore>
                    <includes>
                        <include>step_definitions/LoginTest.java</include>
                    </includes>
                    <parallel>classes</parallel>
                    <threadCount>3</threadCount>
                    <useFile>true</useFile>
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>
You haven't specified how you are running your test suite, but assuming that you have @CucumberOptions somewhere, you can just point to the other projects' packages like this:
@CucumberOptions(... glue = {
    "com.company.test.package1", "com.company2.test.package2", ...})
Use the classpath: prefix for the package name to solve this.
For example:
@CucumberOptions(glue = { "classpath:com.example.test.steps" })
If you use Maven's preferred way of creating packages, i.e. you "mvn package" your code and then "mvn install" the package, you'll be able to run the tests from the external library without changes in the class annotated with @CucumberOptions.
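To make that concrete, the installed step-definition artifact is then declared in the consuming test project as an ordinary dependency; a minimal sketch (the coordinates below are placeholders, not the asker's actual artifact):
<dependency>
    <groupId>com.example.shared</groupId>
    <artifactId>shared-step-definitions</artifactId>
    <version>1.0.0</version>
    <scope>test</scope>
</dependency>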
Attempting to modify an existing Java/Tomcat app for deployment on Heroku following their tutorial and running into some issues with AppAssembler not finding the entry class. Running target/bin/webapp (or deploying to Heroku) results in Error: Could not find or load main class org.stopbadware.dsp.Main
Executing java -cp target/classes:target/dependency/* org.stopbadware.dsp.Main runs properly however. Here's the relevant portion of pom.xml:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>appassembler-maven-plugin</artifactId>
    <version>1.1.1</version>
    <configuration>
        <assembleDirectory>target</assembleDirectory>
        <programs>
            <program>
                <mainClass>org.stopbadware.dsp.Main</mainClass>
                <name>webapp</name>
            </program>
        </programs>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>assemble</goal>
            </goals>
        </execution>
    </executions>
</plugin>
My guess is that mvn package is causing AppAssembler to not use the correct classpath. Any suggestions?
Your artifact's packaging must be set to jar, otherwise the main class is not found.
<project>
    ...
    <packaging>jar</packaging>
    ...
</project>
The artifact itself is added at the end of the classpath, so nothing other than a JAR file will have any effect.
Try:
mvn clean package jar:jar appassembler:assemble
I was able to solve this by adding "$BASEDIR"/classes to the CLASSPATH line in the generated script. Since the script gets rewritten on each call of mvn package, I wrote a short script that calls mvn package and then adds the needed classpath entry.
Obviously a bit of a hack, but after 8+ hours of attempting a more "proper" solution this will have to do for now. I will certainly entertain any more elegant ways of correcting the classpath suggested here.
I was going through that tutorial some time ago and had a very similar issue. I came up with a slightly different approach which works very nicely for me.
First of all, as mentioned before, you need to keep your POM's packaging as jar (<packaging>jar</packaging>) - thanks to that, the JAR file built from your classes will be added to the classpath by the appassembler plugin, and your error will go away.
Please note that in this tutorial Tomcat is instantiated from the application source directory. In many cases that is enough, but with that approach you will not be able to utilize Servlet @WebServlet annotations, as /WEB-INF/classes in the sources is empty and Tomcat will not be able to scan your servlet classes. So the HelloServlet servlet from that tutorial will not work unless you add some additional Tomcat initialization (resource configuration) as described here (BTW, you will find more SO questions talking about that resource configuration).
I took a slightly different approach:
I run the org.apache.maven.plugins:maven-war-plugin plugin (exploded goal) during the package phase and use the generated directory as the source directory of my application. With that approach my web application directory has /WEB-INF/classes "populated" with classes, which in turn allows Tomcat to perform its scanning job correctly (i.e. Servlet @WebServlet annotations will work).
I also had to change the source of my application in the launcher class:
public static void main(String[] args) throws Exception {
    // Web application is generated in directory name as specified in build/finalName
    // in maven pom.xml
    String webappDirLocation = "target/embeddedTomcatSample/";
    Tomcat tomcat = new Tomcat();
    // ... remaining code does not change
The changes I added to the POM - the maven-war-plugin included just before the appassembler plugin:
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.5</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>exploded</goal>
            </goals>
        </execution>
    </executions>
</plugin>
...
Please note that the exploded goal is called.
I hope that small change will help you.
One more comment on that tutorial and the Maven build: note that the tutorial was written to show how simple it is to build an application and run it on Heroku. However, that is not the best approach to a Maven build.
Maven's recommendation is to adhere to producing one artifact per POM. In your case there should be two artifacts:
Tomcat launcher
Tomcat web application
Both should be built as separate POMs and referenced as modules from your parent POM. If you look at the complexity of that tutorial, it does not make much sense to split it into two modules. But if your application gets more and more complex (and the launcher gets some additional configuration, etc.) it will make a lot of sense to make that "split". As a matter of fact, there are some "Tomcat launcher" libraries already out there, so alternatively you could use one of them.
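As an illustration only (the group, artifact, and module names here are made up, not taken from the question), the parent POM of such a split could look roughly like this:
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>webapp-parent</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
        <module>webapp</module>
        <module>launcher</module>
    </modules>
</project>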
You can set the CLASSPATH_PREFIX environment variable:
export CLASSPATH_PREFIX=target/classes
which will get prepended to the classpath of the generated script.
The first thing is that you are using an old version of the appassembler-maven-plugin; the current version is 1.3.
What I don't understand is why you are defining the
<assembleDirectory>target</assembleDirectory>
folder. There is a good default value for that, so usually you don't need it. Apart from that, you don't need to define an explicit execution bound to the package phase, because the appassembler-maven-plugin is bound to the package phase by default.
Furthermore you can use the useWildcardClassPath configuration option to make your classpath shorter.
<configuration>
    <useWildcardClassPath>true</useWildcardClassPath>
    <repositoryLayout>flat</repositoryLayout>
    ...
</configuration>
And the error shown when calling the generated script comes from the location of the repository folder (where all the dependencies are placed) being different from the one defined in the generated script.
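Putting those suggestions together, a sketch of what the plugin section could look like (version 1.3 and the wildcard/flat options are from this answer, the main class and program name from the question; the execution omits an explicit phase, relying on the default binding described above):
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>appassembler-maven-plugin</artifactId>
    <version>1.3</version>
    <executions>
        <execution>
            <goals>
                <goal>assemble</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <useWildcardClassPath>true</useWildcardClassPath>
        <repositoryLayout>flat</repositoryLayout>
        <programs>
            <program>
                <mainClass>org.stopbadware.dsp.Main</mainClass>
                <name>webapp</name>
            </program>
        </programs>
    </configuration>
</plugin>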
I have a problem with a service I am trying to write. I am trying to create a service that runs in the background on a Windows system but uses Java. I have seen several ways of doing this, but decided on one method that seemed to meet my requirements. The service checks a database for items it needs to work on; when it finds an item in the DB, it runs some system commands to take care of it.
I found a way to use the tomcat7.exe file to run a JAR as a service, and that worked pretty well for basic stuff. Anything I write and compile into my JAR file - we can call it "myService.jar" - works well enough. The problem is that we already have several classes for accessing the DB and running commands that are precompiled in a library called BGLib-1.0.jar.
I have used this library in writing several jenkins plugins and had no problems calling functions from it. They all work fine when I create an hpi file and deploy it in Jenkins. There the compiler (Eclipse using Maven) packages the BGLib jar in with the plugin jar and Jenkins figures out how to get them to see one another.
When I build my service jar, however, it doesn't work when I deploy it.
I run a command like this to install the Tomcat exe, renamed to myService.exe:
d:\myService\bin>myService.exe //IS//myService --Install=D:\myService\bin\myService.exe --Description="run some commands Java Service" --Jvm=auto --Classpath=D:\myService\jar\myService.jar;D:\myService\jar\BGLib-1.0.jar --StartMode=jvm --StartClass=com.myCompany.myService.myService --StartMethod=windowsService --StartParams=start --StopMode=jvm --StopClass=com.myCompany.myService.myService --StopMethod=windowsService --StopParams=stop --LogPath=D:\myService\logs --StdOutput=auto --StdError=auto
When I deploy this with code solely within myService.jar, the service behaves as expected, but when I try to call functions from BGLib-1.0.jar I get nothing. The JVM appears to crash or become unresponsive. Debugging is a little tricky, but it looks like I am getting class-not-found errors.
I tried adding the entry below to the POM file to see if changing the classpath entry in the manifest would help, but it didn't change the manifest. I am still kind of clueless as to how the manifest file works; any documentation on that would be cool. I have been to Maven's site and it doesn't seem to have comprehensive documentation on the tags available. Is there something I need to change in the manifest to get my JAR to see external classes? Or is there something I can add that will get Maven to compile the classes from that JAR in with my JAR?
thanks in advance.
<configuration>
    <archive>
        <manifest>
            <addClasspath>true</addClasspath>
            <mainClass>com.myCompany.myService.myService</mainClass>
            <customClasspathLayout>BGLib-1.0.jar</customClasspathLayout>
        </manifest>
    </archive>
</configuration>
To answer mainly the question in the title: you can use the Maven Shade Plugin to include dependencies in your final JAR. You can even relocate the class files (e.g. change the package name) within the final JAR so that the included classes don't conflict with a different version of the shaded dependency on the classpath. Not sure if this is the best solution for your particular problem, though.
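A minimal sketch of such a shade configuration (the relocation pattern is illustrative, since the question doesn't state BGLib's package name; the main class is the one from the question):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- keep the Main-Class entry in the shaded JAR's manifest -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.myCompany.myService.myService</mainClass>
                    </transformer>
                </transformers>
                <relocations>
                    <!-- optional: move the bundled BGLib classes to a private package (pattern is hypothetical) -->
                    <relocation>
                        <pattern>com.bglib</pattern>
                        <shadedPattern>shaded.com.bglib</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>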
You can use the maven-dependency-plugin unpack-dependencies goal to include the contents of a dependency in the resulting artifact.
An example of how to do this would be:
<plugin>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>${project.artifactId}-fetch-deps</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>unpack-dependencies</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.build.outputDirectory}</outputDirectory>
                <stripVersion>true</stripVersion>
                <excludeTransitive>true</excludeTransitive>
                <includeArtifactIds>protobuf-java</includeArtifactIds>
            </configuration>
        </execution>
    </executions>
</plugin>
This will expand (flatten) the protobuf-java dependency and include its contents in the artifact generated by your build; in your case you would list BGLib's artifactId in includeArtifactIds instead.
It looks to me like you actually want to use the appassembler-maven-plugin; otherwise I'd go for the maven-shade-plugin.
I'm translating an Ant script to Maven 2 and I have this problem: the Ant script uses a pretty simple Java class to encrypt files, this way:
<target name="encrypt">
<java classname="DESEncrypter">
<classpath>
<pathelement path="...classpath for this thing..." />
</classpath>
<arg line="fileToEncrypt.properties fileEncrypted.properties" />
</java>
</target>
This DESEncrypter is a compiled class whose source doesn't belong to the project I am converting but is used similarly in other projects. I should probably create a Maven plugin for this to make it reusable, but I don't want to do that now. My question is: in which directory do I put the DESEncrypter class, and how do I invoke it? Using the exec:java plugin, maybe? I don't think the encrypter belongs in the src, test, or resources directories.
Obviously, I don't want to include the encrypter class in the final product, just the encrypted files.
My question is: in which directory do I put the DESEncrypter class, and how do I invoke it? Using the exec:java plugin, maybe? I don't think the encrypter belongs in the src, test, or resources directories.
A very straightforward solution would be to use the Maven AntRun Plugin. Regarding the location of your encrypter, you could either:
put it in a separate module that you could declare as a dependency of the plugin (see this example, and the sketch after this list)
keep it in the current module, in the source tree, and configure the Maven JAR Plugin to exclude it using excludes.
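A minimal sketch of the first option with the Maven AntRun Plugin, assuming the encrypter has been packaged as the hypothetical artifact com.example:des-encrypter (phase and versions are illustrative):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.8</version>
    <dependencies>
        <!-- hypothetical module containing the compiled DESEncrypter class -->
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>des-encrypter</artifactId>
            <version>1.0</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <phase>process-resources</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <target>
                    <!-- maven.plugin.classpath exposes the plugin's own dependencies to the Ant task -->
                    <java classname="DESEncrypter" classpathref="maven.plugin.classpath" fork="true">
                        <arg line="fileToEncrypt.properties fileEncrypted.properties" />
                    </java>
                </target>
            </configuration>
        </execution>
    </executions>
</plugin>
Because the encrypter is a dependency of the plugin rather than of the project, it stays out of the final artifact, which matches the requirement above.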
The third obvious answer (apart from exec:java and antrun) is GMaven, which lets you execute Groovy code either from an external class or inline from your POM. So if you only need a one-liner, embedding it in your POM is a quick and easy way to implement things (otherwise you should use an external script). BTW, if you don't know Groovy: it's basically Java with some additional syntactic sugar.
Here's a sample configuration (of course you have to replace the artifact and class you use):
<plugin>
    <groupId>org.codehaus.groovy.maven</groupId>
    <artifactId>gmaven-plugin</artifactId>
    <dependencies>
        <dependency>
            <groupId>your.library.com</groupId>
            <artifactId>your-library</artifactId>
            <version>1.2.3</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <phase>process-classes</phase>
            <!-- Or any other phase -->
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <source><![CDATA[
                    import com.encryption.*;
                    new Encrypter().encrypt(
                        new File(project.build.outputDirectory,
                            'fileToEncrypt.properties'),
                        new File(project.build.outputDirectory,
                            'encryptedFile.properties')
                    )
                ]]></source>
            </configuration>
        </execution>
    </executions>
</plugin>
(By making the encryption artifact a plugin dependency, you keep it out of your deployed dependencies, but this holds true for antrun and exec:java also)
You might want to just use the AntRun plugin; it should let you accomplish anything from Ant with a minimum amount of fuss.
You would need a dependency on the class/JAR you are using, but by giving it a scope of test or provided, it won't be packaged into your final product.
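For example, the dependency declaration could look like this (the coordinates are placeholders for wherever the DESEncrypter class ends up living):
<dependency>
    <groupId>com.example</groupId>
    <artifactId>des-encrypter</artifactId>
    <version>1.0</version>
    <!-- provided scope keeps the encrypter out of the packaged artifact -->
    <scope>provided</scope>
</dependency>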