What I'm trying to accomplish is the following:
I have a server with the following structure.
bin
apis
services
etc...
I want to define an API that contains an aspect to be used by services. Say:
@Aspect
public class AuthorizationAspect {
    @Pointcut("call(* *()) && @annotation(Authorization)")
    public void cutAuthorize() { }

    @Before("cutAuthorize()")
    public void callFromAuthorizeBefore() {
        System.out.println("Test");
    }
}
Then I define the service, annotate the methods I want with @Authorization, and they get advised by that aspect.
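For illustration, here is a minimal sketch of the annotation and an annotated service method (the annotation and class names here are purely hypothetical):
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Marker annotation matched by @annotation(Authorization) above; RUNTIME retention
// is the safe choice so it stays visible to the weaver and to any advice that
// inspects it at runtime.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Authorization { }

// In a separate file, inside a service JAR compiled against the API (scope "provided"):
public class AccountService {
    @Authorization
    public void deleteAccount() {
        // a call to this method should trigger callFromAuthorizeBefore()
    }
}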
Things you should know:
Services only use the API to compile their code, so the scope is "provided", since the API will already be on the server.
Service JARs are loaded dynamically, so they will reside in another classloader.
My question is: how can I do this? How do I define my Maven artifacts to accomplish that?
I noticed that the AspectJ plugin has a weaveDependencies section, but that would also bundle all of the API's classes into the service JAR (something I want to avoid). Is this the right move?
Thanks in advance,
Rui
Take a look at how it's done in jcabi-aspects. You declare/compile your aspects in the API JAR and then reference that JAR as an aspect library, the same way com.jcabi:jcabi-aspects is used in this example:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <configuration>
        <aspectLibraries>
            <aspectLibrary>
                <groupId>com.jcabi</groupId>
                <artifactId>jcabi-aspects</artifactId>
            </aspectLibrary>
        </aspectLibraries>
    </configuration>
</plugin>
It's OK to have your aspects JAR in provided (or runtime) scope.
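For completeness, the service module would then declare the aspect-bearing API roughly like this (the coordinates are illustrative):
<dependency>
    <groupId>com.example</groupId>
    <artifactId>server-api</artifactId>
    <version>1.0.0</version>
    <scope>provided</scope>
</dependency>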
We want to build a plugin-style main program based on Spring. The main program can load other Spring JARs and non-Spring JARs as plugins. Each plugin is based on IPlugin, and the plugin's IPlugin class is the same as the main program's IPlugin class.
We got non-Spring plugins working via URLClassLoader, but the same approach does not work for Spring plugins.
In the TestPlugin project, the implementation is named PluginTest and executes SpringApplication.run(PluginTest.class, args); in its init(String[]) method.
A ClassNotFoundException occurred when loading the PluginTest class (because of the Spring Boot JAR structure):
String pluginClassName = "com.example.demo.PluginTest";
c = newClassLoader.loadClass(pluginClassName);
So we replaced the POM's plugins section with the following:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>3.1.0</version>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <finalName>${project.artifactId}-${project.version}</finalName>
        <appendAssemblyId>false</appendAssemblyId>
        <attach>false</attach>
        <archive>
            <manifest>
                <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
            </manifest>
        </archive>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>
But then we got another error: Caused by: java.lang.NoClassDefFoundError: com/zaxxer/hikari/HikariConfig (JDBC is used in the test project).
I don't know how to properly load other Spring JARs at runtime.
The whole code of the plugin loader:
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.stereotype.Component;

@Component
public class Initializer implements ApplicationRunner
{
    @Override
    public void run(ApplicationArguments args)
    {
        String jarPath = "e:/tmp/demo-0.0.1-SNAPSHOT.jar";
        File file = new File(jarPath);
        IPlugin p;
        try
        {
            // Create a child classloader for the plugin JAR, parented to the current context classloader.
            ClassLoader oldClassLoader = Thread.currentThread().getContextClassLoader();
            URLClassLoader newClassLoader = new URLClassLoader(new URL[]{file.toURI().toURL()}, oldClassLoader);
            Thread.currentThread().setContextClassLoader(newClassLoader);

            // Load and instantiate the plugin entry class reflectively, then boot it.
            String pluginClassName = "com.example.demo.PluginTest";
            Class<?> c = newClassLoader.loadClass(pluginClassName);
            Object pluginTest = c.newInstance();
            p = (IPlugin) pluginTest;
            p.init(new String[]{
                    "--spring.config.location=e:/tmp/application.properties"
            });
        }
        catch (Exception e)
        {
            e.printStackTrace();
        }
    }
}
Thanks!
In short: if your requirement is to load external JARs at runtime and extend your Spring application context with them, then this is not possible, for a couple of reasons:
The Spring Application Context is built at application startup by scanning the Classpath, instantiating all beans and wiring them to each other (plus - doing a lot of fancy stuff in addition). Once setup, that application context is mostly static and will not be reevaluated.
It's possible to change a whole lot in Spring via configuration as code (e.g. how database transactions work, or enabling/disabling certain Spring features). Modifying that configuration at runtime would therefore be extremely hard, as all possible combinations of changes would need to be considered and back-integrated into the existing context. In addition, there are potential error cases that can't be resolved while the application is running (e.g. suppose you introduce a circular dependency: that would normally prevent the application from even starting up, but now the application is already running, so what should happen?).
There are massive security and stability issues with dynamically loading and executing code from external JARs at runtime.
Even if Spring were able to take care of all these things, there would still be the problem that the application itself would need to be implemented dynamically (e.g. never cache references to other beans or information locally, and consider that beans may simply not be there yet).
In short, Spring is not designed for that kind of dynamicity. If that is what you really need, consider application platforms that are more suitable for this sort of requirement. OSGi (Spring DM was built on OSGi) might be a solution, but be warned: building OSGi applications involves gigantic complexity and overhead, which the platform demands from application developers in order to solve the challenges mentioned above.
I would instead really recommend evaluating whether you can work with a model that lives without dynamic code loading, as this makes things a lot easier. For instance, Spring has a very usable auto-configuration system built in that requires absolutely minimal overhead. What's necessary, though, is that your libraries are present on the classpath at runtime. For more details, you can read the documentation here.
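To make that concrete, here is a minimal sketch of a Spring Boot auto-configuration in the pre-2.7 style (the PluginApi type and bean are hypothetical; the class would additionally be registered in META-INF/spring.factories under the org.springframework.boot.autoconfigure.EnableAutoConfiguration key):
import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@ConditionalOnClass(PluginApi.class) // only activates when the library is on the classpath
public class PluginAutoConfiguration {

    @Bean
    @ConditionalOnMissingBean // lets the application override the bean with its own
    public PluginApi pluginApi() {
        return new PluginApi();
    }
}
This way the "plugin" JAR just has to be on the classpath at startup; no runtime classloading is involved.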
I want to write a piece of Java code that can be executed with two different dependencies (or two versions of the same dependency), namely org.apache.poi. The code must run on a system with version 2 as well as version 3 of org.apache.poi.
Unfortunately, some interfaces changed between versions 2 and 3, the code must be built slightly differently for each, and there is no way to upgrade both systems to the same org.apache.poi version.
So my questions are:
Is there a way to compile the code with both versions to not run into compiler errors?
Is there a way to execute the right code based on the available org.apache.poi version?
What would be an appropriate approach to solve this issue?
As an amendment:
I'm building code that must work with two applications, each of which provides the interface in a different version (the Maven scope of the dependency is provided).
If I declare both dependencies in Maven, it picks one of them, and one of the if clauses below fails to compile because either Cell.CELL_TYPE_STRING or CellType.STRING is not available in the chosen dependency.
I would like the code to work regardless of which dependency is present in the application.
// working with old poi interface
if (cell != null && cell.getCellType() == Cell.CELL_TYPE_STRING
        && cell.getRichStringCellValue().getString().trim().equals(cellContent)) {
    return row;
}
// working with new poi interface
if (cell != null && cell.getCellType() == CellType.STRING
        && cell.getRichStringCellValue().getString().trim().equals(cellContent)) {
    return row;
}
This is probably opinion-based, but it seems legitimate.
First, you will have to create a common interface that you will use to do your job.
Second, you will have to create adapter classes that implement that interface and do the required job using a particular version of the POI library.
Third, you will write an adapter factory that returns the proper adapter instance.
Each adapter should provide an isSupported method that detects whether that adapter can be used, based on which POI classes are currently loaded (detect them by reflection; there must be some version-specific classes or other markers).
Then you will put each adapter into a separate Maven module, so each module can be compiled independently (thus you will have no class conflicts). Each module will have the POI dependency in provided scope, in the version that adapter is going to support.
Either each module registers itself with the factory in your main module, or the factory itself detects all available adapters (like @ComponentScan in Spring).
Then you will pack everything into a single app bundle. The main module will use only the common interface. All in all, it will be a kind of extensible plugin system; a minimal sketch follows.
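A rough sketch of what that could look like (all names are hypothetical, each type would live in its own module, and the string-cell check mirrors the code from the question):
// Common interface (step one); it deliberately exposes no POI types.
public interface PoiAdapter {
    boolean isSupported();
    boolean isStringCell(Object cell);
}

// Adapter compiled against the newer POI, in its own module.
public class ModernPoiAdapter implements PoiAdapter {
    @Override
    public boolean isSupported() {
        try {
            // CellType only exists in the newer POI, so it works as a version marker.
            Class.forName("org.apache.poi.ss.usermodel.CellType");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    @Override
    public boolean isStringCell(Object cell) {
        org.apache.poi.ss.usermodel.Cell c = (org.apache.poi.ss.usermodel.Cell) cell;
        return c.getCellType() == org.apache.poi.ss.usermodel.CellType.STRING;
    }
}

// Factory (step three): the first adapter whose POI version is present wins.
public final class PoiAdapterFactory {
    public static PoiAdapter get() {
        for (PoiAdapter adapter : java.util.Arrays.asList(
                new ModernPoiAdapter() /* , new LegacyPoiAdapter(), ... */)) {
            if (adapter.isSupported()) {
                return adapter;
            }
        }
        throw new IllegalStateException("No adapter matches the POI version on the classpath");
    }
}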
I do not think there is a single "best way".
Nonetheless, we faced a similar issue in a few of our apps that share a common library. I ended up with a variant of @Antoniossss's approach, except that the library itself does not use dependency injection (the parent app may or may not; the library is free of it).
To be more specific: due to transitive dependencies, some of our apps need a certain version of Apache Lucene (e.g. 7.x.y or newer) and others are stuck on older versions (5.5.x).
So we needed a way to build one of our libs against each of those versions, using Maven in our case.
What we ended up with uses the following principles:
We share some code, which is common between all versions of Lucene
We have specific code, for each target version of Lucene that has an incompatible API (e.g. package change, non existing methods, ...)
We build as many JARs as there are supported versions of Lucene, with a naming scheme such as groupId:artifact-luceneVersion:version
Where the lib is used, direct access to the Lucene API is replaced by access to our specific classes
For example, in Lucene v5 there is an org.apache.lucene.analysis.synonym.SynonymFilterFactory facility. In v7 the same functionality is implemented using org.apache.lucene.analysis.synonym.SynonymGraphFilterFactory, i.e. the same package, but a different class.
What we end up with is providing a com.mycompany.SynonymFilterFactoryAdapter. In the v5 JAR this class extends the Lucene v5 class, and respectively for v7 or any other version.
In the final app, we always instantiate the com.mycompany object, that behaves just the same as the native org.apache class.
Project structure
The build system being maven, we build it as follow
project root
|- pom.xml
|-- shared
|---|- src/main/java
|---|- src/test/java
|-- v5
|---|- pom.xml
|-- v7
|---|- pom.xml
Root pom
The root pom is a classic multimodule pom, but it does not declare the shared folder (notice that the shared folder has no pom).
<modules>
    <module>v5</module>
    <module>v7</module>
</modules>
The shared folder
The shared folder stores all non-version-specific code and the tests. On top of that, when a version-specific class is needed, the shared code does not code against the API of that class (i.e. it does not import org.apache.VersionSpecificStuff); it codes against com.mycompany.VersionSpecificStuffAdapter.
The implementation of this adapter is left to the version-specific folders.
Version specific folders
The v5 folder declares in its artifactId the Lucene version it compiles against, and of course declares that version as a dependency:
....
<artifactId>myartifact-lucene-5.5.0</artifactId>
....
<dependency>
    <groupId>org.apache.lucene</groupId>
    <artifactId>lucene-analyzers-common</artifactId>
    <version>5.5.0</version>
</dependency>
But the real "trick" is that it declares an external source folder for classes and tests using the build-helper-maven-plugin : see below how the source code from the shared folder is imported "as if" it was from this project itself.
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>3.0.0</version>
            <executions>
                <execution>
                    <id>add-5.5.0-src</id>
                    <phase>generate-sources</phase>
                    <goals>
                        <goal>add-source</goal>
                    </goals>
                    <configuration>
                        <sources>
                            <source>../shared/src/main/java</source>
                        </sources>
                    </configuration>
                </execution>
                <execution>
                    <id>add-5.5.0-test</id>
                    <phase>generate-test-sources</phase>
                    <goals>
                        <goal>add-test-source</goal>
                    </goals>
                    <configuration>
                        <sources>
                            <source>../shared/src/test/java</source>
                        </sources>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
For the whole implementation to work, it provides the Adapter implementations in its own source folder src/main/java, e.g.
package com.mycompany;

public class VersionSpecificStuffAdapter extends org.apache.VersionSpecificStuff {
}
If both the v5 and the v7 package do it the same way, then client code using the com.mycompany.xxxAdapter will always compile, and under the hood, get the corresponding implementation of the library.
This is one way to do it. You can also, as already suggested, define your whole new interfaces and have clients of your lib code against your own interface. This is kind of cleaner, but depending on the case, may imply more work.
In your edit, you mention referring to constants that are not defined the same way, e.g. CellType.TYPE_XX.
In the version-specific code, you could produce another constant, MyCellType.TYPE_XX, that duplicates the actual constant under a stable name.
In the case of an enum, you could instead create a CellTypeChecker util with a method isCellTypeXX(cell) that is implemented in a version-specific way.
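For instance, the v5-flavoured JAR could ship something like this (class and method names are illustrative; the v7 JAR would provide the same class comparing against CellType.STRING instead):
package com.mycompany;

import org.apache.poi.ss.usermodel.Cell;

// Version-specific implementation for the old POI API, compiled in the old-POI module.
public final class CellTypeChecker {
    public static boolean isStringCell(Cell cell) {
        return cell != null && cell.getCellType() == Cell.CELL_TYPE_STRING;
    }
}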
v7 folder
It's pretty much the same structure; you just swap what changed between v5 and v7.
Caveats
This may not always scale.
If you have hundreds of types you need to adapt, this is cumbersome to say the least.
If you have 2 or more libs you need to cross-compile against (e.g. mylib-poi-1.0-lucene-5.5-guava-19-....) it's a mess.
If you have final classes to adapt, it gets harder.
You have to test to make sure every JAR has all the adapters. I do that by testing each adapted class in the shared test folder.
I've tried to add more information to my Swagger documentation, but I'm having some issues with the @ApiModelProperty annotation in particular.
No matter what I try, it just doesn't work. The plugin is generating the swagger.json correctly and all the @ApiOperation annotations are working for the REST resources, but for the model part it only introspects the model classes' properties and doesn't look at the annotations above them.
Here is how the plugin is configured:
<plugin>
    <groupId>com.github.kongchen</groupId>
    <artifactId>swagger-maven-plugin</artifactId>
    <version>3.1.5</version>
    <configuration>
        <apiSources>
            <apiSource>
                <locations>
                    <location>com.example.rest.resources</location>
                    <location>com.example.rest.model</location>
                </locations>
                <swaggerDirectory>${project.build.directory}/generated-sources</swaggerDirectory>
                <basePath>/path/to/the/api</basePath>
                <info>
                    <title>My RESTful API Documentation</title>
                    <version>${project.version}</version>
                </info>
            </apiSource>
        </apiSources>
    </configuration>
    <executions>
        <execution>
            <phase>generate-sources</phase>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
</plugin>
If I have for example:
@ApiModelProperty(example = "test example")
public String test;
It will generate the test property but it won't create any example or any other property that I set up in that annotation. The same is happening when using it in a getter, so I think that's not the problem.
Am I doing anything wrong? Also, I looked at Kongchen's example project and I couldn't see anything special to make it work.
I was messing with the code again, and I've found that the problem is in the structure of the project. It has different modules, with a profile for general development and a profile just for the RESTful API documentation.
At some point I had started building the projects with mvn clean package, and since an older version of the project was already installed in the local repository, that version was being used to create the documentation, which is why it never changed. After I ran mvn clean install on the main source code, I could see the annotations take effect.
I'm sorry guys, this was beyond any information I could have given about the documentation project, since it was about the whole structure I'm using. But at least I'll keep this answer so the next person may be aware of this.
Thank you for your attention!
Maybe you forgot the #ApiModel annotation on your Model classes?
Like:
@ApiModel
public class PostRequest {
    @ApiModelProperty(example = "test example")
    public String test;
}
Or your model package does not match what's given in the pom.xml.
I'm trying to upgrade our Spring version and use the Spring IO Platform BOM to do so, but a few of our classes have gone missing (moved into other artifacts) or are no longer dependencies of something I was pulling in. I'm trying to find out which artifact they were originally part of (one example is CSVStrategy). Some of these classes, such as WhitespaceTokenizer, have over a dozen artifact names that could be supplying them, and in order to find the correct upgrade path I need to figure out where each is currently coming from.
One possible way could be to get the resource (class) location. If the class comes from a JAR file, you would at least get the JAR name. From that you should be able to identify the Maven artifact.
someClass.getProtectionDomain().getCodeSource().getLocation().toURI();
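Wrapped in a throwaway main method, that could look like this (the class name is a placeholder; substitute whatever class you are hunting, e.g. CSVStrategy):
public class WhereIsIt {
    public static void main(String[] args) throws Exception {
        // Loading by name avoids a compile-time dependency on the class being located.
        Class<?> someClass = Class.forName("com.example.SomeLostClass");
        // Prints the JAR (or directory) the class was loaded from.
        System.out.println(someClass.getProtectionDomain().getCodeSource().getLocation().toURI());
    }
}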
Or with a ResourceLoader and a logger you could print a list of all classes on the classpath / servlet-path.
@Autowired
ResourceLoader resourceLoader;

public void printResourceLocations() throws IOException {
    PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver(resourceLoader);
    Resource[] resources = resolver.getResources("classpath*:com/**/*.class");
    for (Resource resource : resources) {
        // getURI() also works for entries inside JARs, where getFile() would fail.
        log.info(resource.getURI().toString());
    }
}
I have used JBoss Tattletale for this type of task in the past. I don't think it's actively maintained any longer; however, it still works for me. Here's the config I use. Note that I had to add this to my POM's build section, even though the report goal seems to imply it is a report plugin.
<plugin>
    <groupId>org.jboss.tattletale</groupId>
    <artifactId>tattletale-maven</artifactId>
    <!-- The version of the plugin you want to use -->
    <version>1.2.0.Beta2</version>
    <executions>
        <execution>
            <goals>
                <goal>report</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <!-- This is the location which will be scanned for generating tattletale reports -->
        <source>${project.build.directory}/${project.artifactId}/WEB-INF/lib</source>
        <!-- This is where the reports will be generated -->
        <destination>${project.build.directory}/site/tattletale</destination>
    </configuration>
</plugin>
You could also try jHades. I haven't had a chance to use it yet; it is on my list of things to investigate.
I'm building multiple complex web services with base XSD types from all kinds of standards (GML, SWE, XLINK, etc.). Now I would like to break the compilation up into multiple steps, preferably one for each of the standards I'm using.
Advantages:
1) I can create tooling libraries for each of the standards, which I can re-use in all of my web services.
2) I can make use of the power of the JAXB2 Basics plugin, which seems to work very nicely with the maven-jaxb2-plugin (org.jvnet.jaxb2.maven2) and create, for instance, interface bindings. This is in contrast with the jaxws-maven-plugin.
The final step would be using org.jvnet.jax-ws-commons:maven-jaxb2-plugin to create the actual web service that I can implement in an EJB (or call as a client).
Now, the org.jvnet.jaxb2.maven2:maven-jaxb2-plugin allows me to refer to episodes by means of their Maven coordinates, as part of its configuration, like this:
<episodes>
    <episode>
        <groupId>org.example</groupId>
        <artifactId>jaxb2-basics-test-episodes-a</artifactId>
    </episode>
</episodes>
How can I do this by means of the org.jvnet.jax-ws-commons:maven-jaxb2-plugin? I've searched a lot, and experimented like this:
<plugin>
    <groupId>org.jvnet.jax-ws-commons</groupId>
    <artifactId>maven-jaxb2-plugin</artifactId>
    <version>2.1</version>
    <executions>
        <execution>
            <goals>
                <goal>wsimport</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <wsdlDirectory>src/main/resources/</wsdlDirectory>
        <wsdlFiles>
            <wsdlFile>example.wsdl</wsdlFile>
        </wsdlFiles>
        <xjcArgs>
            <xjcArg>-b</xjcArg>
            <xjcArg>../cpt-xsd/target/generated-sources/xjc/META-INF/sun-jaxb.episode</xjcArg>
        </xjcArgs>
        <verbose>true</verbose>
    </configuration>
</plugin>
This takes the episode file from the target directory of the (compiled) JAXB-dependent project. It sometimes even fails during the Maven build (I have not yet figured out why).
I've tried to use catalog files to make a mapping (I think I saw somewhere a catalog mapping that took Maven coordinates as the destination), but I haven't succeeded yet.
Are you aware of the OGC Schemas and Tools Project? (Disclaimer: I'm the author.)
Now, to your question. My guess would be that org.jvnet.jax-ws-commons:maven-jaxb2-plugin does not support the "Maven coordinates" as you call them. This was a feature I've specifically implemented for my org.jvnet.jaxb2.maven2:maven-jaxb2-plugin (disclaimer: I'm the author).
On the other hand, an episode file is nothing but a JAXB binding file. So you can simply extract this file from the JAR artifact (for instance, using the maven-dependency-plugin) and then include it more or less like you already do. Just don't point at directories in other modules; that is not reliable.
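A rough sketch of that extraction step, assuming the episode lives in the cpt-xsd artifact mentioned above (the coordinates and output path are illustrative, not tested):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack-episode</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>org.example</groupId>
                        <artifactId>cpt-xsd</artifactId>
                        <version>${project.version}</version>
                        <includes>META-INF/sun-jaxb.episode</includes>
                        <outputDirectory>${project.build.directory}/episodes</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>
The wsimport -b argument would then point at ${project.build.directory}/episodes/META-INF/sun-jaxb.episode instead of reaching into another module's target directory.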