Exchange vars between API and software's core - java

I am developing screenshot software which can load plugins from JAR files. Those are developed against the API package, which is made of interfaces to implement, so someone who wants to write a plugin does not have to use the full source code.
This works well for adding actions (upload to host X, for example), but what if I want to send data the other way around, from a plugin TO the core? How am I supposed to do this?
The only solution I can think of would be to use callbacks, but I don't find that very clean...
By the way, is my approach of using interfaces that developers implement, which I then instantiate, correct? Or is there a better way?

Your solution is the most common way to implement such a scenario. You give plugins an instance of a class (instantiated by the core) and they can store it for future use (e.g. to pass data to the core or trigger another action). Conventionally, the names of such classes end with Context (e.g. BundleContext, PluginContext, etc.).
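For illustration, a minimal sketch of the Context approach; the names (PluginContext, ScreenshotPlugin, UploadPlugin) are hypothetical, not taken from your actual API:

// PluginContext.java -- part of the API jar, implemented by the core
public interface PluginContext {
    // plugins push data back to the core through methods like this
    void publishResult(String key, Object value);
}

// ScreenshotPlugin.java -- part of the API jar, implemented by plugin developers
public interface ScreenshotPlugin {
    // the core hands each plugin its context once; the plugin may keep the reference
    void init(PluginContext context);
    void onScreenshotTaken(byte[] pngData);
}

// UploadPlugin.java -- inside a plugin jar
public class UploadPlugin implements ScreenshotPlugin {
    private PluginContext context;

    @Override
    public void init(PluginContext context) {
        this.context = context;
    }

    @Override
    public void onScreenshotTaken(byte[] pngData) {
        // ... upload the image somewhere, then report back to the core
        context.publishResult("upload.url", "https://example.org/abc123");
    }
}

The core instantiates UploadPlugin (via reflection or ServiceLoader), calls init() with its own PluginContext implementation, and from then on data flows in both directions.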
Another pattern is to use a sort of Mediator class: a class with some static methods that plugins can use to send data to the core or trigger actions. I don't like it and it's not a very clean solution, but it makes it much easier for plugin developers to access the API, as they don't need to store the context instance and respect its life cycle. This pattern is used widely in the IntelliJ IDEA architecture.
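A rough sketch of that variant, again with made-up names; the core registers itself once and plugins call the static methods without holding any reference:

// Core.java -- part of the API jar
public final class Core {
    public interface Listener {
        void onPluginData(String pluginId, Object data);
    }

    private static volatile Listener listener;

    private Core() {}

    // called once by the core application at startup
    public static void register(Listener l) { listener = l; }

    // called by plugins from anywhere, at any time
    public static void send(String pluginId, Object data) {
        Listener l = listener;
        if (l != null) {
            l.onPluginData(pluginId, data);
        }
    }
}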
As you're developing a plugin-based system, I highly recommend taking a look at the OSGi architecture and APIs. They can be helpful in this regard.

Related

Targeting identical classes in different packages

I have created a library which supports an application. However, in the newest version of the application the developer has changed the package structure without changing the class names.
So version 1 of the application has classX in package A but version 2 has classX in package B. How can I develop my library in a way which allows supporting both of these in the same build?
Edit: My library is dependent on the application, not the other way around.
That is a bad decision. If you still want to make it work, you need to provide skeleton classes with the old structure and delegate calls to the new version of the class, but it would get very dirty.
It is better not to provide backward compatibility if you are firm about the renaming decision.
Short answer: You can't.
Real answer: Your library should be able to exist independently of any application that uses it. The purpose of a library is to provide a set of reusable, modular code that you can use in any application. If your library is directly dependent on application classes, then a redesign should be seriously considered, because your dependencies are backwards. For example, have A.classX and B.classX both implement some interface (or extend some class) that your library provides, then have the application pass instances of those objects, or their Class objects, to the library.
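A minimal sketch of that inversion; the interface and method names below are illustrative, not taken from your actual library:

// SupportedAppClass.java -- provided by the library; the only type the library references
public interface SupportedAppClass {
    String identify();
}

// LibraryService.java -- library code works purely against the interface, so package
// moves in the application no longer matter
public final class LibraryService {
    public void handle(SupportedAppClass obj) {
        System.out.println("Handling " + obj.identify());
    }
}

// In application version 1, A.classX implements SupportedAppClass (version 2 does the
// same for B.classX), and the application passes instances into the library:
//     LibraryService service = new LibraryService();
//     service.handle(new A.classX());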
If your "library" can't be designed this way then consider integrating it into application code, making it a direct part of the application, and come up with a better team workflow for you, the other developer, and others to work on the same project together.
Quick fix answer: Do not provide backward compatibility, as Jigar Joshi states in his answer.
Bad answer: You could hack a fragile solution together with reflection if you really had to. But please note that the "real answer" is going to last in the long run. You are already seeing the issues with the design you have currently chosen (hence your question), and a reflection based solution isn't going to prevent that from happening again (or even be reliable).
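For completeness, this is roughly what the fragile reflection fallback would look like; the package and class names mirror the question, everything else is illustrative:

// Try the old location first, then the new one. Every classX member you touch
// afterwards also has to go through reflection, which is why this rots quickly.
static Object createClassX() throws Exception {
    Class<?> clazz;
    try {
        clazz = Class.forName("A.classX");   // application version 1 layout
    } catch (ClassNotFoundException e) {
        clazz = Class.forName("B.classX");   // application version 2 layout
    }
    return clazz.getDeclaredConstructor().newInstance();
}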

How should I design a plugin system in a layered Java EE Application?

I have a Java EE based REST api application. It has a layered architecture like the following:
Resources (JAX-RS resources)
Object Validation
Object Mapper
Service Layer
Repository Layer
JPA Entities
Everything is wired using Spring dependency injection.
I need to design this core application in such a way that it allows other external developers to write extensions/plugins and override or extend any minor or major functionality in the core. Think of it like a WordPress CMS in Java EE, if that helps. How would you design a plugin system around the current architecture?
One obvious way that I can think of is to override or add new functionality in the proper resource (with validation and object mapper), service, repository and entity, and create a jar + xml out of it. But I want to make sure that the plugin developer has to write the absolute minimum amount of code to get the new functionality working, while reusing as much of the core code as possible.
Assume you want to create a WordPress-like blog post extension that lets you create blog posts with a few extra fields that don't exist in the core yet. What would be the simplest and cleanest way to design the current Java EE app so that it's easy for the plugin/extension developers? Are there any patterns that could be useful, like the strategy or template method pattern?
Are there any open source Java CMS that follow the model using Spring/JPA and standard technologies?
I think you mean to extend the functionality rather than override the core. Typical architectures of this kind define up front which concerns can be overridden (keeping them separate from the core) and make provisions for that. The Eclipse framework achieves this using a combination of its plugin extension and extension-point mechanism. This is taken further using OSGi bundling.
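To make that concrete for the blog-post example from the question, one possible shape of an extension point; all names here (BlogPost, BlogPostExtension, BlogPostService) are hypothetical, and discovery could just as well use java.util.ServiceLoader instead of Spring:

import java.util.List;
import java.util.Map;

import org.springframework.stereotype.Service;

// Extension point published by the core; plugin jars contribute implementations,
// e.g. by registering them as Spring beans.
public interface BlogPostExtension {
    // called by the core service before the entity is persisted
    void beforeSave(BlogPost post, Map<String, Object> extraFields);
}

@Service
public class BlogPostService {
    private final List<BlogPostExtension> extensions;

    // Spring injects every BlogPostExtension bean found on the classpath
    public BlogPostService(List<BlogPostExtension> extensions) {
        this.extensions = extensions;
    }

    public void create(BlogPost post, Map<String, Object> extraFields) {
        for (BlogPostExtension extension : extensions) {
            extension.beforeSave(post, extraFields);
        }
        // ... validate, map and persist via the existing service/repository layers
    }
}

A plugin then only implements BlogPostExtension and ships it as a jar; the core never has to know about the extra fields.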
Another alternative is to break the application down into smaller independent modules/services. All you need to do is host these modules on an ESB/application integrator (like Mule or Spring Integration) and allow users to configure their own version of the routing/transformation. An extension would then mean creating new transformers which get added to the message flow.

How to use different versions of a class in the same application?

I'm currently working on a Java application which should have the capability to use different versions of a class at the same time (because of multi-tenancy support). I was wondering, is there any good approach to manage this? My basic approach is to have an interface, let's say Car, and implement the different versions as CarV1, CarV2, and so on. Every version gets its own class.
My approach seems kind of weird, I think, but I didn't find any literature on this topic; I actually don't know what I should search for.
The interface idea is prudent. Combine it with a factory that can produce the required implementation instance depending on some external input, e.g. the tenant id. If you don't need to support multiple tenants in the same running instance of the application, you could also use something like the JDK's ServiceLoader, which allows a file-based configuration approach.
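A minimal sketch of such a factory, assuming the tenant id arrives with each request; Car, CarV1 and CarV2 are the names from the question, the tenant ids are illustrative:

// Car.java -- the common interface
public interface Car {
    void drive();
}

// CarV1.java / CarV2.java -- the versioned implementations
class CarV1 implements Car { public void drive() { /* version 1 behaviour */ } }
class CarV2 implements Car { public void drive() { /* version 2 behaviour */ } }

// CarFactory.java -- picks the implementation based on external input (here, the tenant id)
public final class CarFactory {
    private CarFactory() {}

    public static Car forTenant(String tenantId) {
        switch (tenantId) {
            case "tenant-a": return new CarV1();
            case "tenant-b": return new CarV2();
            default: throw new IllegalArgumentException("Unknown tenant: " + tenantId);
        }
    }
}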
If you are running in an application server, consider just firing up multiple instances, each configured for a different client. The server will then take care of the separation of instances just fine.
Otherwise, if you really think you need multiple implementations at the same time (at runtime) in a non-Java EE application, this is a tricky problem. Maybe you want to take a look at OSGi containers, which provide features for having multiple versions of a class. However, an approach like this adds significant complexity if you are not already familiar with it.
In theory you can handle this using multiple class loaders, as JBoss does, for example.
BUT: I would strongly advise against implementing this yourself. This is a rather complicated matter and easily gotten wrong. If you are talking about a web application, you can instead create one web app instance per tenant. If you are working on a stand-alone app, you should check whether running one instance per tenant might be feasible.

Java - keeping multi-version application from splitting codebase

I am writing an application that will ship in several different versions (initially around 10 variations of the code base will exist, and will need to be maintained). Of course, 98% or so of the code will be the same amongst the different systems, and it makes sense to keep the code base intact.
My question is: what would be the preferred way to do this? Say I have a class (MyClass) that is different in some versions (MyClassDifferent), and that class is referenced in a couple of places. I would like that reference to change depending on what version of the application I am compiling, rather than having to split all the classes referring to MyClassDifferent too. Preprocessor macros would be nice, but they bloat the code and, as far as I know, only proof-of-concept implementations are available.
I am considering something like a factory-pattern, coupled with a configuration file for each application. Does anyone have any tips or pointers?
You are on the right track: Factory patterns, configuration etc.
You could also put the system specific features in separate jar files and then you would only need to include the appropriate jar alongside your core jar file.
I'd second your factory approach, and you should have a closer look at Maven or Ant (depending on what you are using).
You can deploy the different configuration files that determine which classes are used based on parameters/profiles.
Preprocessor macros like C/C++ has are not directly available for Java. It might be possible to emulate them via build scripts, but I'd not go down that road. My suggestion is to stick with the factory approach.
Fortunately, you have several options:
1) ServiceLoader (built into Java 6): put your API class, like MyClass, in a jar, then compile your application against this API. Then put a separate implementation of MyClass in a separate jar containing /META-INF/services/com.foo.MyClass (see the sketch after this list). You can then maintain several versions of your application simply by keeping a "distribution" of jars. Your "main" class is just a bunch of ServiceLoader calls.
2) same architecture as 1) but replacing META-INF services with Spring or Guice configuration
3) OSGI
4) your solution
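A minimal sketch of option 1; com.foo.MyClass is the placeholder name from above, everything else is illustrative:

// MyClass.java -- in the API jar
package com.foo;

public interface MyClass {
    void doWork();
}

// MyClassVariantA.java -- in one implementation jar, together with a text file
// META-INF/services/com.foo.MyClass containing the single line "com.foo.impl.MyClassVariantA"
package com.foo.impl;

public class MyClassVariantA implements com.foo.MyClass {
    @Override
    public void doWork() { /* variant-specific behaviour */ }
}

// Main.java -- the application asks ServiceLoader for whatever implementation is on the classpath
import java.util.ServiceLoader;

public class Main {
    public static void main(String[] args) {
        for (com.foo.MyClass service : ServiceLoader.load(com.foo.MyClass.class)) {
            service.doWork();
        }
    }
}

Which variant runs is then decided purely by which implementation jar you ship in a given distribution.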
Look up the AbstractFactory design pattern, "Dependency Injection", and "Inversion of Control". Martin Fowler writes about these here.
Briefly, you ship JAR files with all the needed components. For each service point that can be customized, you define an interface for the service. Then you write one or more implementations of that interface. To create a service object, you ask an AbstractFactory for it, e.g.:
AbstractFactory factory = new AbstractFactory();
...
ServiceXYZ s = factory.newServiceXYZ();
s.doThis();
s.doThat();
Inside your AbstractFactory you construct the appropriate ServiceXYZ object using the Java reflection method Class.forName(), followed by newInstance() on the resulting Class object. (Doing it this way means you don't have to have the ServiceXYZ class in the jar files unless it makes sense. You can also build the objects normally.)
The actual class names are read in from a properties file unique to each site.
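A possible shape for such a factory; the property key, file name and the ServiceXYZ type are illustrative placeholders:

import java.io.InputStream;
import java.util.Properties;

public class AbstractFactory {
    private final Properties config = new Properties();

    public AbstractFactory() {
        // site.properties is deployed per site and maps service names to implementation
        // classes, e.g. serviceXYZ.impl=com.example.DefaultServiceXYZ
        try (InputStream in = AbstractFactory.class.getResourceAsStream("/site.properties")) {
            config.load(in);
        } catch (Exception e) {
            throw new IllegalStateException("Could not load site.properties", e);
        }
    }

    public ServiceXYZ newServiceXYZ() {
        try {
            String className = config.getProperty("serviceXYZ.impl");
            return (ServiceXYZ) Class.forName(className).getDeclaredConstructor().newInstance();
        } catch (Exception e) {
            throw new IllegalStateException("Could not create ServiceXYZ", e);
        }
    }
}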
You can roll your own solution easily enough, or use a framework like Spring, Guice, or Pico.

Sandboxing Java / Groovy / Freemarker Code - Preventing execution of specific methods

I'm developing a system that allows developers to upload custom groovy scripts and freemarker templates.
I can provide a certain level of security at a very high level with the default Java security infrastructure, i.e. prevent code from accessing the filesystem or network. However, I need to restrict access to specific methods.
My plan was to modify the Groovy and Freemarker runtimes to read annotations that would either whitelist or blacklist certain methods; however, this would force me to maintain a forked version of their code, which is not desirable.
All I essentially need to be able to do is prevent the execution of specific methods when called from Groovy or Freemarker. I've considered a hack that would look at the call stack, but this would be a massive speed hit (and is quite messy).
Does anyone have any other ideas for implementing this?
You can do it by subclassing the GroovyClassLoader and enforcing your constraints within an AST visitor. This post explains how to do it: http://hamletdarcy.blogspot.com/2009/01/groovy-compile-time-meta-magic.html
Also, the code referenced there is in the samples folder of the Groovy 1.6 installer.
You should have a look at the groovy-sandbox project from kohsuke. Also have a look at his blog post here on this topic and at what his solution addresses: sandboxing, but with a performance drawback.
OSGi is great for this. You can partition your code into bundles and set exactly what each bundle exposes, and to what other bundles. Would that work for you?
You might also consider the java-sandbox (http://blog.datenwerke.net/p/the-java-sandbox.html), a recently developed library that allows you to securely execute untrusted code from within Java.
Also see: http://blog.datenwerke.net/2013/06/sandboxing-groovy-with-java-sandbox.html
