I want to accomplish what I think these directions describe.
I want to use an external, Maven-based project from an RCP platform application.
I've used the New Project wizard to build a new Maven-based application that includes a Maven-based module. I've added my external dependencies to the Maven-based module.
I've also added the publicPackages section to my module's pom.
When I right-click on the module and go to ProjectProperties->PublicPackages I can see the correct packages listed with check marks.
My maven module builds just fine.
However, when I try to add the maven-module as a dependency of another module the packages listed in PublicPackages are not found.
If I peek inside the nbm I can see the jars I wanted exposed are under netbeans\modules\ext
Is there some way to build a maven-module that wraps another maven project?
The nbm-maven-plugin docs include an example which sounds a lot like what I want to do:
Public packages declaration
By default, all your module's packages (and classes) are private to the given module. If you want to expose any API to other modules, you will need to declare those public packages in your pom.xml. This includes not only your own classes but also any 3rd-party library classes that are packaged with your module and are to be exposed for reuse by other modules.
For example:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>nbm-maven-plugin</artifactId>
    <version>3.8.1</version>
    <extensions>true</extensions>
    <configuration>
        <publicPackages>
            <publicPackage>org.foo.api</publicPackage>
            <publicPackage>org.apache.commons.*</publicPackage>
        </publicPackages>
    </configuration>
</plugin>
Here the package org.foo.api is made public (but not the org.foo.api.impl package), as well as any package starting with org.apache.commons, so both the org.apache.commons.io and org.apache.commons.exec packages are exposed to the outside.
I am clearly not interpreting those docs correctly, because this behavior is not what I'm seeing.
Related
I tried to compile and obfuscate two projects, where one depends on the other and both are built with the Spring Boot Maven plugin.
Let's call them for the sake of simplicity main and util projects.
The build has two stages. In the first stage, the util project is built; in the second stage, the main project, which depends on the util project.
My problem is that Spring boot maven plugin creates nested jars. (https://docs.spring.io/spring-boot/docs/current/reference/html/executable-jar.html)
So if I first repackage the projects with the Spring Boot Maven plugin and then obfuscate the repackaged jar, which contains both the util and the main projects, ProGuard extracts the repackaged jar's content, and that extracted content contains the util jar. ProGuard then won't obfuscate the content of this util jar, because it is a jar and not a set of class files.
If I instead first obfuscate the util project with ProGuard and then repackage with the Spring Boot Maven plugin, the obfuscation is done, but when I try to compile the main project it won't find the necessary symbols in the jar produced from the util project.
So how do I obfuscate projects repackaged with the Spring Boot Maven plugin?
In your main module declare both proguard-maven-plugin and spring-boot-maven-plugin (ProGuard's package-phase execution must run before the Spring Boot plugin's repackage goal). Declare them only in the module that is the entry point; there is no need to declare them in submodules or the parent pom.
Then, in your ProGuard plugin config, add the modules you would like to obfuscate besides the main jar:
<configuration>
    <assembly>
        <inclusions>
            <inclusion>
                <groupId>com.yourgroupid</groupId>
                <artifactId>submodule</artifactId>
            </inclusion>
        </inclusions>
    </assembly>
    ...
And in spring-boot-maven-plugin add an exclusion, so Spring does not overwrite the already obfuscated jar:
<configuration>
    <mainClass>com.yourapp.App</mainClass>
    <excludeGroupIds>com.yourgroupid</excludeGroupIds>
</configuration>
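Putting the ordering requirement into pom form, the build section of the main module might look like the sketch below. The com.github.wvengen:proguard-maven-plugin coordinates and its proguard goal are an assumption (the answer does not name the exact plugin); plugins bound to the same package phase run in pom declaration order, so ProGuard is declared before Spring Boot:

```xml
<build>
    <plugins>
        <!-- Assumed coordinates: com.github.wvengen:proguard-maven-plugin.
             Declared first so its package-phase execution runs before repackage. -->
        <plugin>
            <groupId>com.github.wvengen</groupId>
            <artifactId>proguard-maven-plugin</artifactId>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>proguard</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <!-- Runs second within the package phase: repackages the already
             obfuscated jar into the executable Spring Boot format -->
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <executions>
                <execution>
                    <goals>
                        <goal>repackage</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```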
Hope that it will work for you!
I have a Maven project called A, and it depends on another jar file called B.jar. Both A and B.jar contain the same class, but in different versions. During the Maven build, the classes in B.jar always overwrite those in A. How can I make Maven take only the classes in A, not B?
I think you have a real design problem with your Maven modules.
A JAR is not designed to exclude some of its classes when it is used by another JAR.
Why don't you provide a way in the B JAR to choose the implementation class at runtime?
You could allow it in multiple ways: a property, an interface to implement, etc.
That way, you could specify the class to use in the client application.
You should think in terms of an API to be implemented by client classes, not in terms of overwriting classes.
It doesn't mean that you cannot do it with Maven, but it seems intricate, unnatural, error-prone and badly designed.
That said, here are some ideas for solving it with Maven.
You could configure the maven-jar-plugin to specify the class to exclude from the packaged jar:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>3.0.2</version>
    <configuration>
        <excludes>
            <!-- path pattern relative to the classes directory,
                 not a dotted .java name -->
            <exclude>a/b/c/MyclassToExclude.class</exclude>
        </excludes>
    </configuration>
</plugin>
Here is the reference documentation.
But by doing that, the class will not be available in the JAR at all.
That makes the B JAR unusable on its own if the excluded class is required in B.
You could package the JAR with a specific classifier to avoid this problem.
You would then have the classic jar that contains everything, and a jar-for-a that contains everything but this famous duplicate class.
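A sketch of that classifier idea, with hypothetical class and classifier names: a second maven-jar-plugin execution produces an extra artifact without the duplicate class, while the default jar keeps everything:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>3.0.2</version>
    <executions>
        <!-- Extra execution: same classes, minus the duplicate,
             published under the "for-a" classifier -->
        <execution>
            <id>jar-for-a</id>
            <phase>package</phase>
            <goals>
                <goal>jar</goal>
            </goals>
            <configuration>
                <classifier>for-a</classifier>
                <excludes>
                    <exclude>a/b/c/MyclassToExclude.class</exclude>
                </excludes>
            </configuration>
        </execution>
    </executions>
</plugin>
```

Project A would then declare its dependency on B with `<classifier>for-a</classifier>` to pick up the reduced jar.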
But really, I think that you should really think about your design.
I have trouble importing classes from an existing Spring Boot application into my new application after the structure of the built jar file was changed.
The jar file was changed so that the application's own classes are now located in BOOT-INF/classes rather than at the root of the jar file.
But with a normal Maven dependency on this Spring Boot application, I cannot import its existing classes into the new classes in my new application.
It worked just fine before they changed the structure...
The solution here is to refactor your code so that the classes you depend on in both applications live in a separate project.
You can then use these classes by importing the dependency in both your projects:
<dependency>
    <groupId>org.example</groupId>
    <artifactId>example-shared</artifactId>
</dependency>
Make sure that you're not using the Spring Boot Maven plugin in this newly made shared project, and you should probably not use any Spring Boot starters either, since they pull in a lot of dependencies you may not need.
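A minimal pom for that shared project might look like the sketch below (the version number is an assumption; the coordinates match the dependency snippet above). The point is plain jar packaging with no spring-boot-maven-plugin, so the classes stay at the root of the jar:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>example-shared</artifactId>
    <version>1.0.0</version>
    <packaging>jar</packaging>
    <!-- deliberately no spring-boot-maven-plugin: the artifact stays a
         plain, consumable jar instead of a BOOT-INF/classes layout -->
</project>
```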
I found out that it is actually possible to use a Spring Boot application as a dependency, even though it most likely is not recommended. But in some cases it just makes things easier.
This solution means that you can not use the executable archive.
"The executable archive cannot be used as a dependency as the executable jar format packages application classes in BOOT-INF/classes. This means that they cannot be found when the executable jar is used as a dependency."
The solution to my question is to add a classifier to the spring-boot-maven-plugin configuration, like this for Maven:
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <classifier>exec</classifier>
    </configuration>
</plugin>
or like this for Gradle:
bootRepackage {
    classifier = 'exec'
}
I have a multimodule maven setup for my project, made of 5 modules, which includes a GWT webapp.
It is also an Eclipse multi-project workspace, so I created an additional project, containing only a pom, which lists the other projects (siblings on the file system) as child modules.
I'm also a new maven user, so I might be doing something wrong. =)
The gwt module uses the following plugin
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>gwt-maven-plugin</artifactId>
    <version>2.4.0</version>
    <executions>
        <execution>
            <goals>
                <goal>generateAsync</goal>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <hostedWebapp>war</hostedWebapp>
        <runTarget>GWT.html</runTarget>
    </configuration>
</plugin>
When I run mvn package on the pom project I get the expected behaviour: projects are built in the correct order, and the war is fine.
When I run mvn gwt:run, though, Maven tries to find a GWT app in each module, failing on the first one (the parent), which doesn't even declare or manage the GWT plugin.
If I run mvn -fn gwt:run, the build fails on every other project before finally finding a GWT app in the gwt module and displaying it.
How do I correctly run the app on hosted mode? Is this the correct behavior?
I do not want the GWT module to be the parent module (if that's possible), because the project has multiple target platforms, producing the GWT web frontend, an executable Java jar backend and, in the future, also an Android app, and most of the code (not only the model) is shared. Is a single-pom structure recommended for such a setup, or am I failing at Maven?
Are profiles what I need? If I do, should I declare the same profile id on each module? How would I prevent the trigger of gwt:run command on them anyway?
What should the setup of such a project be? Is this the correct setup?
Additional information
Modules are
pom: declares modules model, logic, analyze, gwt, tests
model: no dependencies
logic: no dependencies
analyze: depends on model, logic
gwt: depends on model, logic
tests: depends on model, logic, analyze, gwt (contains global tests, not unit tests)
If I run gwt:run on the gwt module I get the error
Could not resolve dependencies for project
djjeck.gwt:djjeck.gwt:war:0.0.1-SNAPSHOT:
Could not find artifact djjeck.model:djjeck.model:jar:0.0.1-SNAPSHOT
This is from djjeck.gwt/pom.xml
<dependency>
    <groupId>djjeck.model</groupId>
    <artifactId>djjeck.model</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <scope>compile</scope>
</dependency>
A com.model-0.0.1-SNAPSHOT.jar is inside the war lib folder, both packed and unpacked, and also inside djjeck.model/target.
Go to the webapp module and then run mvn gwt:run.
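Concretely, that sequence might look like the commands below (assuming the parent pom is in the current directory and the webapp module lives in a djjeck.gwt subdirectory). The initial mvn install also addresses the "Could not find artifact djjeck.model" error, since it puts the sibling snapshots into the local repository where the single-module gwt:run can resolve them:

```
mvn install        # build all modules and install the snapshots locally
cd djjeck.gwt
mvn gwt:run        # hosted mode, now only for the webapp module
```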
You may use profiles to speed up compilation time: one profile could, for example, GWT-compile only for gecko and English, plus draftCompile.
Have a look at maven GWT plugin multi-module setup if you're still having problems.
As I was also struggling with GWT dev mode and a Maven project with multiple sub-modules/projects, I created an example and uploaded it to GitHub. You can find it at:
https://github.com/steinsag/gwt-maven-example
The readme on the above page shows how to run it via Maven. Features of this example are:
multiple modules
not using GWT's embedded Jetty, but its own Tomcat 7 server
startup of Tomcat 7 and GWT hosted mode possible via documented Maven commands
I hope this helps a bit to have at least a working example to start from.
I'm using OSGi for my latest project at work, and it's pretty beautiful as far as modularity and functionality.
But I'm not happy with the development workflow. Eventually, I plan to have 30-50 separate bundles, arranged in a dependency graph - supposedly, this is what OSGi is designed for. But I can't figure out a clean way to manage dependencies at compile time.
Example: You have bundles A and B. B depends on packages defined in A. Each bundle is developed as a separate Java project.
In order to compile B, A has to be on the javac classpath.
Do you:
Reference the file system location of project A in B's build script?
Build A and throw the jar into B's lib directory?
Rely on Eclipse's "referenced projects" feature and always use Eclipse's classpath to build (ugh)
Use a common "lib" directory for all projects and dump the bundle jars there after compilation?
Set up a bundle repository, parse the manifest from the build script and pull down the required bundles from the repository?
No. 5 sounds the cleanest, but also like a lot of overhead.
My company has 100+ bundle projects and we use Eclipse to manage the dependencies. However, I don't recommend the "Required Plugins" approach to managing dependencies. Your best bet is to create Plug-in Projects. Export just the packages from each project that you want to be visible. Then, on the import side, do the following:
Open the Manifest editor
Go to the Dependencies tab. In the bottom left is a section called "Automated Management of Dependencies"
Add any plugins that the current plugin depends on there
Once you have code written, you can click the "add dependencies" link on that tab to auto-compute the imported packages.
If you run from Eclipse, this gets done automatically for you when you execute.
The benefit of this approach is that your built bundles use only the OSGi-defined package import/export mechanism, as opposed to something Eclipse-specific.
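The resulting manifests then express dependencies purely in OSGi terms. A minimal sketch, with hypothetical bundle and package names:

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.consumer
Bundle-Version: 1.0.0
Import-Package: com.example.provider.api;version="[1.0,2.0)"
```

Import-Package keeps the wiring at the package level, so any bundle exporting com.example.provider.api can satisfy it; Require-Bundle (what the "Required Plugins" approach generates) ties you to one specific bundle instead.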
If you want to learn more, I'd recommend going to this site and ordering the book. It's excellent.
http://equinoxosgi.org/
Well, do what you should have done a long time ago: separate implementation and API. OK, this is not always easy on existing systems, but that model has a huge bang for the buck. Once your API is in a separate (much more stable) bundle/jar, you can compile the clients and implementations against that bundle/jar.
One of the key qualities of a successful bundle is that it makes as few assumptions about the outside world as possible. This implies you do not have to compile against the bundles you run against at runtime; I prefer to try hard not to. You should compile only against a bundle's minimum set of dependencies. Any assumptions that are made should be explicit as imported packages and the use of services. Well-designed OSGi systems attempt to use services for all inter-bundle communication. Not only does this model get rid of class-loading issues, it also makes your build setup more decoupled.
Unfortunately, most code is written as libraries that have a rather wide interface, because they hand-code lots of the functionality that services provide out of the box, like factories and listeners. Such code has a tight relationship between implementation and API, so you have to have the same version on the class path during compilation and in OSGi. One solution to this problem is to include this kind of code inside the bundle using it (but make sure no objects of this library leak to other bundles). A bit of extra memory consumption, but it saves you from some headaches.
So with OSGi, try to create systems relying on services and compile against their service API, not an implementation bundle.
Basically, you can use:
source dependency (with Eclipse's "referenced projects")
binary dependency (using the jar of bundle A)
But since binary dependency is much cleaner, it is also the kind of dependency best managed by a release management framework like maven.
And you can integrate maven in your Eclipse project through m2eclipse.
The Maven plugin to use would then be: maven-bundle-plugin, that you can see in action in:
Using maven to create an osgi bundle (osgi felix sample)
Bundle Plugin for Maven
Getting the benefits of maven-bundle-plugin in other project types
How to build OSGi bundles using Maven Bundle Plugin
Consider this more real-world example, using Felix's Log Service implementation.
The Log Service project consists of a single package: org.apache.felix.log.impl.
It has a dependency on the core OSGi interfaces as well as a dependency on the compendium OSGi interfaces for the specific log service interfaces. The following is its POM file:
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.apache.felix</groupId>
    <artifactId>org.apache.felix.log</artifactId>
    <packaging>bundle</packaging>
    <name>Apache Felix Log Service</name>
    <version>0.8.0-SNAPSHOT</version>
    <description>
        This bundle provides an implementation of the OSGi R4 Log service.
    </description>
    <dependencies>
        <dependency>
            <groupId>${pom.groupId}</groupId>
            <artifactId>org.osgi.core</artifactId>
            <version>0.8.0-incubator</version>
        </dependency>
        <dependency>
            <groupId>${pom.groupId}</groupId>
            <artifactId>org.osgi.compendium</artifactId>
            <version>0.9.0-incubator-SNAPSHOT</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <extensions>true</extensions>
                <configuration>
                    <instructions>
                        <Export-Package>org.osgi.service.log</Export-Package>
                        <Private-Package>org.apache.felix.log.impl</Private-Package>
                        <Bundle-SymbolicName>${pom.artifactId}</Bundle-SymbolicName>
                        <Bundle-Activator>${pom.artifactId}.impl.Activator</Bundle-Activator>
                        <Export-Service>org.osgi.service.log.LogService,org.osgi.service.log.LogReaderService</Export-Service>
                    </instructions>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
There is a sixth option, which I've used for several projects: use a single Eclipse project (not a plug-in project, just an ordinary Java project) and put all source code in there. A build file associated with the project simply compiles all code in a single pass and subsequently creates bundles out of the compiled classes (using Bnd from Ant or from the soon-to-be-released BndTools).
This has the downside that it does not honor visibility at development and compile time, but the upside that it's a really simple development model that gives you very fast build and deploy times.
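With Bnd, each bundle is then described by a small descriptor that cuts its classes out of the single compiled tree. A sketch with hypothetical bundle and package names, one descriptor per bundle:

```
# bundle-a.bnd -- fed the shared classes directory; Bnd pulls out
# only the packages named below and writes the bundle's manifest
Bundle-SymbolicName: org.example.a
Bundle-Version: 1.0.0
Export-Package: org.example.a.api
Private-Package: org.example.a.impl
```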