I want to build an OSGi-compliant multi-module application where I have all the required bundles in 3 folders after compilation. I am using maven-bundle-plugin and maven-scr-plugin to create the bundles.
What I want is to run this application in an OSGi container (Equinox) with a single command, ideally using a script. For this I believe I have to create a config.ini file listing all the bundles in the application.
Is there a way to generate this file at Maven build time? Or is there a better way to get all the bundles into some folder structure so that the app can be run straight away?
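One common approach for the second part (getting all the bundles into one folder) is to let the maven-dependency-plugin copy the runtime dependencies into a single directory during the build; a minimal sketch, where the output directory and the chosen phase are just illustrative assumptions:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-bundles</id>
      <!-- copy every runtime dependency (your bundles) into one folder -->
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- illustrative output location; point it wherever your launcher expects the bundles -->
        <outputDirectory>${project.build.directory}/bundles</outputDirectory>
        <includeScope>runtime</includeScope>
      </configuration>
    </execution>
  </executions>
</plugin>
The config.ini itself is then usually produced from a filtered template resource or a small script that lists the jars in that directory.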
You can use the maven-pax-plugin with Pax Runner in your OSGi Maven project. Check this tutorial for the details.
<plugin>
<!-- Pax Runner Maven plugin -->
<groupId>org.ops4j</groupId>
<artifactId>maven-pax-plugin</artifactId>
<version>1.4</version>
<configuration>
<!-- Pax Runner version -->
<runner>1.4.0</runner>
<!-- OSGi framework type (equinox, felix, knopflerfish) -->
<framework>equinox</framework>
<provision>
<param>--log=debug</param>
<param>--workingDirectory=target/runner</param>
<!-- bundles that should be installed -->
<param>mvn:org.osgi/org.osgi.compendium/4.1.0#2</param>
<param>mvn:org.apache.felix/org.apache.felix.eventadmin/1.2.2#3</param>
<param>mvn:org.apache.felix/org.apache.felix.log/1.0.0#3</param>
</provision>
</configuration>
</plugin>
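With a configuration like this you can typically launch the provisioned framework straight from Maven, e.g. with mvn install pax:provision (the exact goal names may differ between plugin versions, so check the maven-pax-plugin documentation).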
Just have a look at Tycho and its different packaging types (e.g. eclipse-application).
http://www.eclipse.org/tycho/
http://wiki.eclipse.org/Tycho/Packaging_Types
It is used for many commercial and open source applications.
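For a rough idea of how Tycho hooks into a build: it is enabled as a build extension in the parent POM, and the modules then use Tycho packaging types. A minimal sketch, where the version is only an example:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>tycho-maven-plugin</artifactId>
  <version>2.7.5</version>
  <extensions>true</extensions>
</plugin>
Individual bundle modules then declare, e.g., <packaging>eclipse-plugin</packaging> instead of the usual jar packaging.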
I wrote a Maven plugin that by default creates a dist folder under target containing a ready-to-use Equinox with all Maven dependencies. Equinox is wrapped with YAJSW, so you can use the generated Equinox package as a test server. Please see the plugin usage page: http://www.everit.org/eosgi-maven-plugin/
The documentation is still a bit sparse, so if you have any questions please do not hesitate to ask.
A short step-by-step guide:
Check out https://github.com/everit-org/osgi-samples-simple (user:guest, pass: guest)
Run "mvn install". This will generate a testing equinox environment at target/eosgi-itests-dist/equinox in the module tests/core.
In case you want to have a simple equinox server without the testing modules you can run the command "mvn eosgi:dist" on the tests/core module.
Edit:
A new cookbook will be available soon that contains a much more detailed step-by-step guide. The URL is http://cookbook.everit.org
I have a Java project with Spring Boot and JavaFX added through Maven. The code compiles, and I can even execute the fat jar without the JavaFX SDK on the computer. But when I try to execute it in IntelliJ it results in
Error: JavaFX runtime components are missing, and are required to run this application
I have seen this output in many questions and in most of those cases the jar wasn't built at all or code compilation failed.
But in this scenario mvn package works with no errors and I can execute the JAR with java -jar <jar_name>. To rule out the possibility that I have the JavaFX SDK installed somewhere, I tried it in a VM with only the JRE installed.
pom.xml
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-controls</artifactId>
<version>11.0.2</version>
</dependency>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-fxml</artifactId>
<version>11.0.2</version>
</dependency>
As for plugins, I use spring-boot-maven-plugin and maven-compiler-plugin.
Attempted Solutions
--1--
I tried the solution that said to add the following plugin:
<plugin>
<groupId>org.openjfx</groupId>
<artifactId>javafx-maven-plugin</artifactId>
<version>0.0.6</version>
<executions>
<execution>
<id>default-cli</id>
<configuration>
<mainClass>com.example.demofx.Starter</mainClass>
</configuration>
</execution>
</executions>
</plugin>
But what it does is add the ability to run with: mvn clean javafx:run
I need to execute it from IntelliJ in order to debug the code, because debugging with System.out.println statements isn't efficient.
--2--
Trying a modular build with a module-info.java containing the following:
module com.example.demofx {
requires javafx.controls;
requires javafx.fxml;
// all other required modules including spring
opens com.example.demofx to javafx.fxml;
exports com.example.demofx;
}
This might have worked, but because some of the old dependencies do not work properly with a modularized build, it results in lots of breaking changes to the codebase.
Edit:
I forgot to mention the environment:
JDK - 11.0.8
IntelliJ IDEA - 2021.2.2
Added the second solution I tried.
This is more a troubleshooting and research guide than an actual fix. Fixes for environmental issues are difficult to provide in a Stack Overflow context. However, if you study the information here, it might help you fix your project.
Recommended troubleshooting approach
Use the IntelliJ new JavaFX project wizard. Ensure that it works in your environment, then gradually add components from your current project into the working project, checking that everything still works after each small addition.
Debugging when executing via the JavaFX maven plugin
I think the above recommendation is the preferred approach; however, you can alternatively get the following to work:
"run with: mvn clean javafx:run"
"I need to execute it from IntelliJ in order to debug the code"
See:
intellij idea : how to debug a java:fx maven project?
I also think you can just right-click on the Maven target for javafx:run and select Debug. I am not sure, as I don't use the JavaFX Maven plugin.
Creating fat jars for JavaFX applications
"the fat jar"
This is not a recommended configuration, but if you really must do it, you can review:
Maven Shade JavaFX runtime components are missing
That answer doesn't discuss getting such a configuration to work in conjunction with an IDEA run/debug configuration, so it may not assist you.
If you do continue with a fat jar, I would not advise using a module-info, as you will be running code off the classpath anyway.
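If you do go the fat-jar route, the usual trick (discussed in the linked answer) is to launch via a main class that does not extend javafx.application.Application, which avoids the runtime-components check. A minimal sketch, where AppLauncher and DemoFxApplication are hypothetical class names:
// Hypothetical launcher: because this class does NOT extend Application,
// the "JavaFX runtime components are missing" check is not triggered.
public class AppLauncher {
    public static void main(String[] args) {
        // DemoFxApplication is your real class that extends javafx.application.Application
        javafx.application.Application.launch(DemoFxApplication.class, args);
    }
}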
Modular versus non-modular JavaFX applications
If you don't use a fat jar, getting all the module dependencies correct for Spring is tricky anyway because Spring is not currently architected to directly support modules well. Spring 6 will be designed to work well with modules, though I think you should be able to get Spring 5 to work if you try hard enough (I have got it to work in the past for some applications).
Alternatively, you can have just the JavaFX components as modules and run the rest off the classpath. See, for example, the "Non-modular with Maven" approach at openjfx.io. Note that in that approach the JDK and JavaFX modules are still loaded as modules from the module path; it is only Spring and your application that do not provide a module-info.java file and run off the classpath.
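Another workaround that sometimes helps when running a non-modular setup from the IDE is to add the JavaFX modules explicitly in the run configuration's VM options, for example --module-path <path-to-javafx-sdk>/lib --add-modules javafx.controls,javafx.fxml (the path is a placeholder; this assumes a JavaFX SDK is available locally).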
Creating runtime images for JavaFX applications
I also advise studying:
these resources for the creation of an appropriate runtime image.
As I learned from some Maven tutorials, we can import Jetty as a Maven plugin like this.
<plugin>
<groupId>org.mortbay.jetty</groupId>
<artifactId>maven-jetty-plugin</artifactId>
<version>6.1.10</version>
<configuration>
<scanIntervalSeconds>5</scanIntervalSeconds>
</configuration>
</plugin>
And we can run the plugin like this:
$ mvn jetty:run
We can also change the port, the context path, and lots of other settings in this plugin.
As I understand it, we can use Jetty as a server like Tomcat and deploy an application through it.
But the thing I don't understand is: what is the actual enterprise use of Jetty in Maven?
From the official documentation:
The Jetty Maven plugin is useful for rapid development and testing. You can add it to any webapp project that is structured according to the Maven defaults. The plugin can then periodically scan your project for changes and automatically redeploy the webapp if any are found. This makes the development cycle more productive by eliminating the build and deploy steps: you use your IDE to make changes to the project, and the running web container automatically picks them up, allowing you to test them straight away.
However (and maybe this addresses what you call "enterprise use"):
While the Jetty Maven Plugin can be very useful for development we do not recommend its use in a production capacity. In order for the plugin to work it needs to leverage many internal Maven apis and Maven itself is not a production deployment tool. We recommend either the traditional distribution deployment approach or using embedded Jetty.
The main usage is for testing. Jetty can also be started programmatically (see this example Java code), which means you can start the server directly from your code and, for instance, interact with your REST API.
You can also use it for easier deployment of small applications: just package everything into a JAR that starts the server from its main method when executed via java -jar your-app.jar. Then you don't need anything installed except Java.
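A minimal sketch of such a main method with embedded Jetty (this assumes a recent Jetty 9/10 with the jetty-server and jetty-servlet artifacts and the javax.servlet API; the servlet here is just a placeholder for your own application):
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlet.ServletHolder;

public class EmbeddedJettyMain {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);                 // port is arbitrary here

        // register servlets on a context; HelloServlet stands in for your own servlets/REST resources
        ServletContextHandler context = new ServletContextHandler();
        context.setContextPath("/");
        context.addServlet(new ServletHolder(new HelloServlet()), "/*");
        server.setHandler(context);

        server.start();
        server.join();                                    // block until the server stops
    }

    // trivial placeholder servlet
    static class HelloServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws java.io.IOException {
            resp.getWriter().println("Hello from embedded Jetty");
        }
    }
}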
As a side note, I currently work in Clojure (a JVM language based on Lisp), and many people deploy their application as a JAR that internally runs embedded Jetty, because this way it also starts a REPL which you can connect to remotely to debug the application while it is running.
I don't know exactly what you mean by enterprise use, but let's say you're developing a web application and it's a Maven project.
Each time you want to test whether the web application works correctly, you need to deploy the web archive (WAR) on a web server, e.g. Jetty or Tomcat. Usually this involves a couple of manual steps like:
Start the web server
Deploy the WAR on it
Where the Maven plugin comes in handy is that it allows you to just execute
mvn jetty:run-war
and it does all these steps automatically for you in a single command, saving you lots of time. The plugin is even able to redeploy the application once it notices changes have been made.
I have a multi-module Maven setup for my project, made of 5 modules, which includes a GWT webapp.
It is also an Eclipse multi-project workspace, so I created an additional project containing only a POM, which lists the other projects (siblings on the file system) as child modules.
I'm also a new Maven user, so I might be doing something wrong. =)
The gwt module uses the following plugin:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>gwt-maven-plugin</artifactId>
<version>2.4.0</version>
<executions>
<execution>
<goals>
<goal>generateAsync</goal>
<goal>compile</goal>
</goals>
</execution>
</executions>
<configuration>
<hostedWebapp>war</hostedWebapp>
<runTarget>GWT.html</runTarget>
</configuration>
</plugin>
When I run mvn package on the pom project I get the expected behaviour: projects are built in the correct order, and the WAR is fine.
When I run mvn gwt:run, though, Maven tries to find a GWT app in each module, failing on the first one (the parent), which doesn't even declare or manage the GWT plugin.
If I run mvn -fn gwt:run, the build fails on every other project, finally finding a GWT app in the gwt module and displaying it.
How do I correctly run the app in hosted mode? Is this the correct behavior?
I do not want the GWT module to be the parent module (if that's possible), because the project has multiple target platforms, producing the GWT web frontend, an executable JAR backend, and in the future also an Android app, and they share most of the code (not only the model). Is a single-POM structure recommended for such a setup, or am I failing at Maven?
Are profiles what I need? If so, should I declare the same profile id in each module? And how would I prevent the gwt:run command from being triggered on them anyway?
What should the setup of such a project be? Is this the correct setup?
Additional information
Modules are
pom: declares modules model, logic, analyze, gwt, tests
model: no dependencies
logic: no dependencies
analyze: depends on model, logic
gwt: depends on model, logic
tests: depends on model, logic, analyze, gwt (contains global tests, not unit tests)
If I run gwt:run on the gwt module I get the error:
Could not resolve dependencies for project
djjeck.gwt:djjeck.gwt:war:0.0.1-SNAPSHOT:
Could not find artifact djjeck.model:djjeck.model:jar:0.0.1-SNAPSHOT
This is from djjeck.gwt/pom.xml
<dependency>
<groupId>djjeck.model</groupId>
<artifactId>djjeck.model</artifactId>
<version>0.0.1-SNAPSHOT</version>
<scope>compile</scope>
</dependency>
A com.model-0.0.1-SNAPSHOT.jar is inside the war lib folder, both packed and unpacked, and also inside djjeck.model/target.
Go to the webapp module and then run mvn gwt:run.
You may use profiles to speed up compilation: for example, one profile could GWT-compile only for one user agent (gecko) and the English locale, with draftCompile enabled; see the sketch below.
Have a look at maven GWT plugin multi-module setup if you're still having problems.
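If gwt:run in the gwt module fails with the unresolved SNAPSHOT error shown above, run mvn install from the parent first so the sibling modules (djjeck.model and friends) land in your local repository, then run mvn gwt:run from the gwt module.
For the profile idea, a rough sketch (the profile id is just an example; draftCompile is a gwt-maven-plugin option that skips compiler optimizations for faster development builds):
<profile>
  <id>gwt-dev</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>gwt-maven-plugin</artifactId>
        <configuration>
          <!-- faster, unoptimized compiles for development -->
          <draftCompile>true</draftCompile>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>
Restricting the compile to a single browser (user.agent) and a single locale is done in the module's .gwt.xml file rather than in the POM.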
As I was also struggling with GWT dev mode and a Maven project with multiple sub-modules/projects, I created an example and uploaded it to GitHub. You can find it at:
https://github.com/steinsag/gwt-maven-example
The README on the above page shows how to run it via Maven. Features of this example are:
multiple modules
not using GWT's embedded Jetty, but its own Tomcat 7 server
startup of Tomcat 7 and GWT hosted mode is possible via documented Maven commands
I hope this helps a bit to have at least a working example to start from.
I need to build a Java project. The project should include two modules: domain and web.
The domain module contains all the entities, the business logic and Hibernate integration.
The web module should depend on the domain module and contain a web application using Apache Wicket.
I wonder about the Maven usage.
Should I create a project and modules using Maven? If so, how?
What kinds of archetypes are relevant for my project and modules?
Which is the better experience: creating the project myself or using Maven?
I am using IntelliJ.
I'm assuming you don't need a server for others to access your code, but rather you want to use Maven/Ant for internal project organization, dependency resolution, and source organization.
Should I create a project and modules using maven?
Yes, either Maven or Ant will be useful for any non-trivial Java or Java EE project with external dependencies and build/testing requirements.
If so, how?
Either Ant or Maven will allow you to easily set up a platform-independent "build" file, so that you can resolve dependencies, build your JAR executables, and run unit tests in order by issuing a single command, rather than with multiple clicks in different plugins of whatever the IDE of the month is. You can do this in Eclipse by using the Maven plugin to create a new Maven project, or, as you suggest, by creating an artifact and running the regular mvn install.
What kind of archetype are relevant for my project and modules?
To learn, use maven-archetype-quickstart.
For a regular (simple) Java EE web app, try maven-archetype-webapp.
There is also a J2EE archetype.
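For example, a webapp skeleton can be generated non-interactively with something like this (the group and artifact ids are placeholders):
mvn archetype:generate -DgroupId=com.example -DartifactId=mywebapp -DarchetypeArtifactId=maven-archetype-webapp -DinteractiveMode=false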
What is better experience - create the project myself or use maven?
A simple, robust, 3-step method for setting up a Maven project:
1) Use Maven archetypes to create and set up your "hello world" project.
2) Import the Maven project into your IDE as a Java project.
3) Edit/refine/fix code in your IDE, but use Maven to build and test the whole application.
Update: external web frameworks
Creating a Wicket (or GWT, or any other framework) oriented web app is best done by following specific tutorials for the framework itself. To add the framework libraries, just paste the Maven dependency info into your pom.xml like this, and run a "mvn install" command:
<dependency>
<groupId>org.apache.wicket</groupId>
<artifactId>wicket-core</artifactId>
<version>1.5.3</version>
</dependency>
I'd recommend you use Maven. The reasons why I use Maven:
IDE agnostic. You can use IDEA, Eclipse, or some other IDE.
Dependency management
Powerful plugin system
You can manually create three Maven modules:
app.parent with pom packaging and no parent.
app.domain with jar packaging and app.parent parent
app.web with war packaging and app.parent parent
and import app.parent into IDEA.
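A minimal parent POM for that layout might look roughly like this (group id and version are placeholders):
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>app.parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <!-- the jar and war modules listed above -->
    <module>app.domain</module>
    <module>app.web</module>
  </modules>
</project>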
Also check out the Wicket quick-start Maven archetype creation page: http://wicket.apache.org/start/quickstart.html
I'm using OSGi for my latest project at work, and it's pretty beautiful as far as modularity and functionality go.
But I'm not happy with the development workflow. Eventually, I plan to have 30-50 separate bundles, arranged in a dependency graph - supposedly, this is what OSGi is designed for. But I can't figure out a clean way to manage dependencies at compile time.
Example: You have bundles A and B. B depends on packages defined in A. Each bundle is developed as a separate Java project.
In order to compile B, A has to be on the javac classpath.
Do you:
Reference the file system location of project A in B's build script?
Build A and throw the jar into B's lib directory?
Rely on Eclipse's "referenced projects" feature and always use Eclipse's classpath to build (ugh)
Use a common "lib" directory for all projects and dump the bundle jars there after compilation?
Set up a bundle repository, parse the manifest from the build script and pull down the required bundles from the repository?
No. 5 sounds the cleanest, but also like a lot of overhead.
My company has 100+ bundle projects and we use Eclipse to manage the dependencies. However, I don't recommend the "Required Plugins" approach to managing the dependencies. Your best bet is to create Plugin Projects. Export just the packages from each project that you want to be visible. Then on the import side do the following:
Open the Manifest editor
Go to the Dependencies tab
In the bottom left is a section called "Automated Management of Dependencies"
Add any plugins that the current plugin depends on there
Once you have code written, you can click the "add dependencies" link on that tab to auto-compute the imported packages.
If you run from Eclipse, this gets done automatically for you when you execute.
The benefit of this approach is that your built bundles use only the OSGi-defined package import/export mechanism, as opposed to something specific to Eclipse.
If you want to learn more, I'd recommend going to this site and ordering the book. It's excellent.
http://equinoxosgi.org/
Well, do what you should have done a long time ago: separate implementation and API. OK, this is not always that easy on existing systems, but that model gives you a huge bang for your buck. Once your API is in a separate (much more stable) bundle/jar, you can compile the clients and implementations against that bundle/jar.
One of the key qualities of a successful bundle is that it makes as few assumptions about the outside world as possible. This implies you do not have to compile against the bundles you run against at runtime; I prefer to try hard not to do that. You should only compile against a bundle's minimum set of dependencies. If assumptions are made, they are explicit as imported packages and the use of services. Well-designed OSGi systems attempt to use services for all inter-bundle communication. Not only does this model get rid of class-loading issues, it also makes your build setup more decoupled.
Unfortunately, most code is written as libraries that have a rather wide interface, because they hand-code lots of the functionality that services provide out of the box, like factories and listeners. Such code has a tight relationship between implementation and API, so you have to have the same classes on the class path during compilation and in OSGi. One solution to this problem is to include this kind of code inside the bundle that uses it (but make sure no objects of this library leak to other bundles). It costs a bit of extra memory, but it saves you from some headaches.
So with OSGi, try to create systems relying on services and compile against their service API, not an implementation bundle.
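To make that concrete, here is a minimal sketch of the service pattern using the plain OSGi API (GreetingService and GreetingServiceImpl are hypothetical names; the interface lives in the exported API bundle, the activator and implementation in a provider bundle that clients never compile against):
// --- in the API bundle, in an exported package ---
public interface GreetingService {
    String greet(String name);
}

// --- in the provider bundle, in a private package ---
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {
    public void start(BundleContext context) {
        // publish the implementation under its service interface;
        // clients look it up (or have it injected) by the interface only
        context.registerService(GreetingService.class, new GreetingServiceImpl(), null);
    }

    public void stop(BundleContext context) {
        // services registered by this bundle are unregistered automatically when it stops
    }
}

class GreetingServiceImpl implements GreetingService {
    public String greet(String name) {
        return "Hello, " + name;
    }
}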
Basically, you can use:
source dependency (with Eclipse's "referenced projects")
binary dependency (using the jar of bundle A)
But since binary dependency is much cleaner, it is also the kind of dependency best managed by a release-management framework like Maven.
And you can integrate Maven into your Eclipse project through m2eclipse.
The Maven plugin to use would then be: maven-bundle-plugin, that you can see in action in:
Using maven to create an osgi bundle (osgi felix sample)
Bundle Plugin for Maven
Getting the benefits of maven-bundle-plugin in other project types
How to build OSGi bundles using Maven Bundle Plugin
Consider this more real-world example using Felix' Log Service implementation.
The Log Service project is comprised of a single package: org.apache.felix.log.impl.
It has a dependency on the core OSGi interfaces as well as a dependency on the compendium OSGi interfaces for the specific log service interfaces. The following is its POM file:
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>org.apache.felix</groupId>
<artifactId>org.apache.felix.log</artifactId>
<packaging>bundle</packaging>
<name>Apache Felix Log Service</name>
<version>0.8.0-SNAPSHOT</version>
<description>
This bundle provides an implementation of the OSGi R4 Log service.
</description>
<dependencies>
<dependency>
<groupId>${pom.groupId}</groupId>
<artifactId>org.osgi.core</artifactId>
<version>0.8.0-incubator</version>
</dependency>
<dependency>
<groupId>${pom.groupId}</groupId>
<artifactId>org.osgi.compendium</artifactId>
<version>0.9.0-incubator-SNAPSHOT</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<extensions>true</extensions>
<configuration>
<instructions>
<Export-Package>org.osgi.service.log</Export-Package>
<Private-Package>org.apache.felix.log.impl</Private-Package>
<Bundle-SymbolicName>${pom.artifactId}</Bundle-SymbolicName>
<Bundle-Activator>${pom.artifactId}.impl.Activator</Bundle-Activator>
<Export-Service>org.osgi.service.log.LogService,org.osgi.service.log.LogReaderService</Export-Service>
</instructions>
</configuration>
</plugin>
</plugins>
</build>
</project>
There is a sixth option, which I've used for several projects: use a single Eclipse project (not a plug-in project, just an ordinary Java project) and put all the source code in there. A build file associated with the project simply compiles all the code in a single pass and subsequently creates bundles out of the compiled classes (using Bnd from Ant or from the soon-to-be-released Bndtools).
This has the downside that it does not honor visibility at development and compile time, but the upside that it's a really simple development model that gives you very fast build and deploy times.