Template pattern extension for subclasses with additional methods - java

Best explained by an example:
interface Plane {
    public void flapsExtended();
    public void engineFullThrottle();
    public void takeOff();
    public void landed();
}
class Spitfire implements Plane {
    // implements the four Plane methods
}
class P51Mustang implements Plane {
    // implements the four Plane methods
}
So far my code was doing a good job. But after WW2 ended, newer jets with retractable landing gear appeared.
So I added a new class for the F22, which needs to call retractLandingGear() and extendLandingGear() between the takeOff and landed phases.
example:
class F22 implements Plane {
    public void flapsExtended() { }
    public void engineFullThrottle() { }
    public void takeOff() { }
    public void retractLandingGear() { }
    public void extendLandingGear() { }
    public void landed() { }
}
Now, how can I plug the F22 into that legacy code (and legacy planes :) )?

If you fully control all the code and are able to modify it, just modify the old classes so the new methods perform no-ops.
However, if you can't modify the old classes, you will have to create a new sub-interface with the extra methods, use instanceof checks to see which version of the interface an object implements, and cast accordingly.
As of Java 8, this is also solved by adding default implementations of the new methods directly to the interface.
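A minimal sketch of the sub-interface approach (JetPlane, ControlTower, and runFlight are hypothetical names used only for illustration):
// Hypothetical sub-interface for planes with retractable landing gear
interface JetPlane extends Plane {
    void retractLandingGear();
    void extendLandingGear();
}

// The F22 would then declare "implements JetPlane" instead of "implements Plane".
// Legacy code that only knows Plane checks and casts where the extra steps belong:
class ControlTower {
    void runFlight(Plane plane) {
        plane.flapsExtended();
        plane.engineFullThrottle();
        plane.takeOff();
        if (plane instanceof JetPlane) {
            ((JetPlane) plane).retractLandingGear();
        }
        // ... cruise ...
        if (plane instanceof JetPlane) {
            ((JetPlane) plane).extendLandingGear();
        }
        plane.landed();
    }
}
// With Java 8+, an alternative is to declare retractLandingGear()/extendLandingGear()
// as default no-op methods on Plane itself, so legacy planes need no changes at all.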

Related

What pattern should be used, strategy?

I have a service which needs to handle two types of meal.
@Service
class MealService {
    private final List<MealStrategy> strategies;

    MealService(…) {
        this.strategies = strategies;
    }

    void handle() {
        var foo = …;
        var bar = …;
        strategies.forEach(s -> s.remove(foo, bar));
    }
}
There are two strategies, BurgerStrategy and PastaStrategy. Both implement the Strategy interface, which has one method called remove that takes two parameters.
The BurgerStrategy class retrieves meals of enum type BURGER from the database, iterates over them, and performs some operations; the PastaStrategy does similar work.
The question is: does it make sense to call this Strategy and implement it this way, or not?
Also, how should I handle code duplication between those two strategies, say when both share the same private methods? Does it make sense to create a Helper class or something?
does it make sense to call it Strategy and implement it this way or not
I think these classes, BurgerStrategy and PastaStrategy, have common behaviour. The Strategy pattern is used when you want to inject one strategy and use it. However, you are iterating through all behaviours; you did not select one strategy and stick with it. So, in my honest opinion, it is better to avoid the word Strategy here.
So the Strategy pattern would look like this. I am sorry, I am not a Java guy, so let me show it via C#, but I've added comments on how the code would look in Java.
This is our abstraction of strategy:
public interface ISoundBehaviour
{
    void Make();
}
and its concrete implementations:
public class DogSound : ISoundBehaviour // implements in Java
{
    public void Make()
    {
        Console.WriteLine("Woof");
    }
}
public class CatSound : ISoundBehaviour
{
    public void Make()
    {
        Console.WriteLine("Meow");
    }
}
And then we stick with one behaviour that can also be replaced:
public class Dog
{
    ISoundBehaviour _soundBehaviour;

    public Dog(ISoundBehaviour soundBehaviour)
    {
        _soundBehaviour = soundBehaviour;
    }

    public void Bark()
    {
        _soundBehaviour.Make();
    }

    public void SetAnotherSound(ISoundBehaviour anotherSoundBehaviour)
    {
        _soundBehaviour = anotherSoundBehaviour;
    }
}
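For reference, a rough Java translation of the same sketch (same hypothetical names, just Java syntax):
public interface SoundBehaviour {
    void make();
}

public class DogSound implements SoundBehaviour {
    @Override
    public void make() {
        System.out.println("Woof");
    }
}

public class CatSound implements SoundBehaviour {
    @Override
    public void make() {
        System.out.println("Meow");
    }
}

public class Dog {
    private SoundBehaviour soundBehaviour;

    public Dog(SoundBehaviour soundBehaviour) {
        this.soundBehaviour = soundBehaviour;
    }

    public void bark() {
        soundBehaviour.make(); // delegate to the injected strategy
    }

    public void setAnotherSound(SoundBehaviour anotherSoundBehaviour) {
        this.soundBehaviour = anotherSoundBehaviour;
    }
}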
how to handle duplications of the code in those two services, let’s say both share the same private methods.
You can create one abstract base class. The basic idea is to put the common logic into that base class and declare an abstract method in it. Why? Because this way each subclass supplies the particular logic for its concrete case. Let me show an example.
An abstract class which has common behaviour:
public abstract class BaseMeal
{
    // I am not a Java guy, but if I am not mistaken, in Java,
    // if you do not want a method to be overridden, you should use the `final` keyword
    public void CommonBehaviourHere()
    {
        // put here code that can be shared among subclasses to avoid code duplication
    }

    public abstract void UnCommonBehaviourShouldBeImplementedBySubclass();
}
And its concrete implementations:
public class BurgerSubclass : BaseMeal // extends in Java
{
    public override void UnCommonBehaviourShouldBeImplementedBySubclass()
    {
        throw new NotImplementedException();
    }
}
public class PastaSubclass : BaseMeal // extends in Java
{
    public override void UnCommonBehaviourShouldBeImplementedBySubclass()
    {
        throw new NotImplementedException();
    }
}
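For reference, the same idea in Java might look roughly like this (a sketch; the shared method is marked final so subclasses cannot override it):
public abstract class BaseMeal {

    // Shared logic lives here; final prevents subclasses from overriding it
    public final void commonBehaviourHere() {
        // code shared among subclasses to avoid duplication
    }

    // Each subclass supplies its own variant of the uncommon part
    protected abstract void unCommonBehaviourShouldBeImplementedBySubclass();
}

public class BurgerSubclass extends BaseMeal {
    @Override
    protected void unCommonBehaviourShouldBeImplementedBySubclass() {
        // burger-specific logic goes here
    }
}

public class PastaSubclass extends BaseMeal {
    @Override
    protected void unCommonBehaviourShouldBeImplementedBySubclass() {
        // pasta-specific logic goes here
    }
}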

Java: Is child overriding parent discouraged?

I was wondering if it's frowned upon, when designing a framework to be used by others, for a class to provide some function as default behaviour and expect its customers to override it if necessary. An example would be something like the following:
public class RecordProcessor<T extends Record> {
    // ...
    public void process() {
        // process record logic
    }
}
Consumers of this library create their concrete classes to process their own records of type T.
Now I want to add a function called preProcess() to offer consumers the ability to preprocess their records. It would then look something like this:
public class RecordProcessor<T extends Record> {
    // ...
    public void process() {
        preProcess();
        // process record logic
    }

    public void preProcess() {
        // By default no preprocessing
    }
}
I know I can make preProcess an abstract method, but I don't want to, for a couple of reasons:
Not all customers need to preprocess their records
We have a pipeline structure that autodeploys pushed code, so making RecordProcessor an abstract class would immediately break our customers' applications.
Is making preProcess do nothing in the parent class and letting child classes override it considered bad practice? If not, what is the best way to let customers know that they now have the power to preprocess records? Through Javadoc?
One approach is to mark the public method as final (but this might also break existing apps) and allow protected hook methods to be overridden. For example:
public class RecordProcessor<T extends Record> {
    // ...
    public final void process() {
        doPreProcess();
        doProcess();
        doPostProcess();
    }

    protected void doPreProcess() {
        // By default no preprocessing
        return;
    }

    protected void doProcess() {
        // some default implementation
    }

    protected void doPostProcess() {
        // By default no postprocessing
        return;
    }
}
Having some documentation should make it natural for other developers to recognize the optional extension methods.
I don't see anything wrong with having a hook method which does nothing. However, it should contain a return statement so static analysis tools won't complain.
UPDATE: in order to avoid breaking existing apps, if possible mark the existing method as deprecated and introduce a new method. For example:
public class RecordProcessor<T extends Record> {
    // ...
    public final void execute() {
        doPreProcess();
        doProcess();
        doPostProcess();
    }

    /** @deprecated use the execute() method instead. */
    @Deprecated
    public void process() {
        doProcess();
    }

    protected void doPreProcess() {
        // By default no preprocessing
        return;
    }

    protected void doProcess() {
        // some default implementation
    }

    protected void doPostProcess() {
        // By default no postprocessing
        return;
    }
}
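For illustration, a consumer would then override only the hooks it needs (the subclass name below is hypothetical):
public class AuditedRecordProcessor<T extends Record> extends RecordProcessor<T> {
    @Override
    protected void doPreProcess() {
        // consumer-specific preprocessing, e.g. validation or logging
    }

    @Override
    protected void doPostProcess() {
        // consumer-specific bookkeeping after processing
    }
    // doProcess() is inherited unchanged
}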
Prefer composition over inheritance. If you want your clients to add custom preprocessing, do it by delegating to a separate object.
public interface RecordPreProcessor<T extends Record> {
    public void process(T record);
}

public class RecordProcessor<T extends Record> {
    private RecordPreProcessor<T> recordPreProcessor = null;

    public void setRecordPreProcessor(RecordPreProcessor<T> recordPreProcessor) {
        this.recordPreProcessor = recordPreProcessor;
    }

    public void process() {
        // 'record' below stands for whichever record this processor is currently handling
        if (recordPreProcessor != null) recordPreProcessor.process(record);
        // process record logic
    }
}
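Usage is then plain delegation, for example (a sketch; MyRecord is a hypothetical Record subtype):
RecordProcessor<MyRecord> processor = new RecordProcessor<>();
processor.setRecordPreProcessor(rec -> {
    // client-supplied preprocessing for each record
});
processor.process();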
No, overriding is not discouraged in Java.
The language allows overriding.
The language makes all methods overridable by default.
The Java class library includes examples of the same pattern.
Your approach is one reasonable way to allow subclasses to extend the behavior of their parent class. There are alternatives, such as passing a behavior as an object. However, there is no one true way.
One way you could improve your code is to mark preProcess() as protected. It's an implementation detail of the class. You don't want just anyone holding a RecordProcessor to decide they can call preProcess() by itself, right?
public class RecordProcessor<T extends Record> {
...
protected void preProcess() {
^^^^^^^^^
// By default no preprocessing
}
}
Another way to improve this is to consider whether you intend anyone to create an instance of the superclass RecordProcessor. If you don't, make the class abstract to prevent that. The class name can express that too, if you like or if your coding guidelines call for it.
public abstract class AbstractRecordProcessor<T extends Record> {
^^^^^^^^ ^^^^^^^^
...
protected void preProcess() {
// By default no preprocessing
}
}
One common way to document such methods is with the phrase "The default implementation does nothing. Subclasses may override this method ...". For example, below is the documentation for java.util.concurrent.FutureTask.done(). You can find more examples by searching for the first sentence of that phrase online.
public class FutureTask<V> implements RunnableFuture<V> {
    // ...

    /**
     * Protected method invoked when this task transitions to state
     * {@code isDone} (whether normally or via cancellation). The
     * default implementation does nothing. Subclasses may override
     * this method to invoke completion callbacks or perform
     * bookkeeping. Note that you can query status inside the
     * implementation of this method to determine whether this task
     * has been cancelled.
     */
    protected void done() { }
}
What I ended up doing, which I also thought was pretty good and was inspired by @tsolakp, was simply creating a child class of RecordProcessor, called something like PreprocessRecordProcessor. This has no way of interfering with existing code because nothing existing was touched. The class would look something like this:
public abstract class PreprocessRecordProcessor<T extends Record> extends RecordProcessor<T> {
    // ...
    public void process() {
        preProcess();
        super.process();
    }

    protected abstract void preProcess();
}
And if customers of this library would like to add their own logic, they can simply extend this class and will be forced to provide the preprocessing logic (as opposed to having the option to provide it, which could lead to unexpected results if they forget).
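For illustration, a consumer would then extend it like this (hypothetical subclass and record type):
public class AuditRecordProcessor extends PreprocessRecordProcessor<AuditRecord> {
    @Override
    protected void preProcess() {
        // mandatory preprocessing for AuditRecord, e.g. validation
    }
}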

How to use double interfaces in multiple implementing classes in Java?

So let's assume I have the following defined:
public interface IExportTool {
    void export(IReport iReport);
}
And then attempting to use it:
public class KibanaExporter implements IExportTool {
    public void export(IReport kibana) {
        kibana = (Kibana) kibana;
        ((Kibana) kibana).toJSON();
    }
}
But there are other classes that would end up doing the same kind of thing:
public class MetricExporter implements IExportTool {
    public void export(IReport metric) {
        metric = (Metric) metric;
        ((Metric) metric).toJSON(); // might be something else here like toXML etc
    }
}
Please note that both Kibana and Metric are implementing IReport<KibanaRow> and IReport<MetricRow> respectively, while the IReport interface looks like:
public interface IReport<T> {
    void addRow(T row);
}
I don't like all this casting; it doesn't feel right, and it doesn't give me autocomplete either. Any suggestions on how to do this properly?
From what you've posted, it's clear that both Kibana and Metric are subtypes of IReport.
In that case, you can make the interface generic:
interface IExportTool<R extends IReport<?>> {
    void export(R iReport);
}
And then change the implementations in this fashion:
public class KibanaExporter implements IExportTool<Kibana> {
    public void export(Kibana kibana) {
        kibana.toJSON();
    }
}
And:
public class MetricExporter implements IExportTool<Metric> {
    public void export(Metric metric) {
        metric.toJSON();
    }
}
This version lets the compiler understand and validate that only instances of subtypes of IReport will ever be passed to export(). Code using this is checked at compile time, such that new MetricExporter().export() can only be called with an object of type Metric, and new KibanaExporter().export() only with an object of type Kibana.
And with that, type casts are no longer needed.
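A quick usage sketch (assuming Kibana and Metric have no-arg constructors):
KibanaExporter kibanaExporter = new KibanaExporter();
kibanaExporter.export(new Kibana());      // compiles

MetricExporter metricExporter = new MetricExporter();
metricExporter.export(new Metric());      // compiles
// metricExporter.export(new Kibana());   // rejected by the compiler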

Cannot override method and cannot access field while using idiom "Providing a default interface implementation"

Here is the code:
IDefaultInterface.aj:
public interface IDefaultInterface {
    public void m1();

    static aspect Impl {
        public int f1;

        public void IDefaultInterface.m1() {
        }
    }
}
DefaultInterfaceClass.java:
public class DefaultInterfaceClass implements IDefaultInterface {
    @Override
    public void m1() {
    }

    void mm() {
        f1 = 9;
    }
}
In the second piece of code I'm trying to override the m1() method and access the f1 field. The compiler allows neither.
How can I overcome these limitations?
Additional thoughts: I would not wonder so much if "AspectJ in Action", 2nd edition, hadn't said about this idiom that the effect should be the same "as extending the default implementation for both (if multiple inheritance was allowed in Java)". I believe most people associate multiple inheritance with C++, so why not provide the semantics people are used to?
I'm not fluent in AspectJ, but I see a couple of questionable things: your aspect is trying to define a non-abstract method in an interface, and your class is trying to access field f1 as if it owns the field, when you've declared f1 on the aspect. I'm not quite sure what you're trying to do here, but I don't think you're going about it in the right way.
First of all, I misspelled the f1 declaration. It should be:
public int IDefaultInterface.f1;
It solves access field problem.
The second problem is solved by using following code:
public interface IDefaultInterface {
    public void m1();

    public static interface Impl extends IDefaultInterface {
        static aspect Implementation {
            public int IDefaultInterface.Impl.f1;

            public void IDefaultInterface.Impl.m1() {
            }
        }
    }
}
And then:
public class DefaultInterfaceClass implements IDefaultInterface.Impl ....

Java Polymorphism - Selecting correct method based on subtype

Given the following Class and Service layer signatures:
public class PersonActionRequest {
    PersonVO person;
    // ... other fields
}
public class MyServiceLayerClass {
    public void requestAction(PersonActionRequest request) {
        PersonVO abstractPerson = request.getPerson();
        // call appropriate executeAction method based on subclass of PersonVO
    }

    private void executeAction(PersonVO person) {}
    private void executeAction(EmployeeVO employee) {}
    private void executeAction(ManagerVO manager) {}
    private void executeAction(UnicornWranglerVO unicornWrangler) {}
}
As discussed here, Java will select the best method based on type information at compile time (i.e., it will always select executeAction(PersonVO person)).
What's the most appropriate way to select the correct method?
The internet tells me that using instanceof gets me slapped. However, I don't see an appropriate way to select the method without explicitly casting abstractPerson to one of the other concrete types.
EDIT: To Clarify - The VO passed in is a simple ValueObject exposed for web clients to instantiate and pass in. By convention it doesn't have methods on it, it's simply a data structure with fields.
For this reason, calling personVO.executeAction() is not an option.
Thanks
Marty
If executeAction was a method in a base class or interface that was common to PersonVO, EmployeeVO, ManagerVO and UnicornWranglerVO, you could just call abstractPerson.executeAction() instead of having multiple overridden methods.
Your principal obstacle to polymorphism here seems to be the 'dumb-struct' data object plus 'manager class' service non-pattern. The 'more polymorphic' approach would be for executeAction() to be a method that the various person implementations override.
Assuming that can't change, the way you do multiple dispatch in Java is with visitor-looking callbacks.
public interface PersonVisitor {
    void executeAction(EmployeeVO employee);
    void executeAction(ManagerVO manager);
    void executeAction(UnicornWranglerVO unicornWrangler);
}

public abstract class PersonVO {
    public abstract void accept(PersonVisitor visitor);
}

public class EmployeeVO extends PersonVO {
    @Override
    public void accept(PersonVisitor visitor) {
        visitor.executeAction(this);
    }
}

public class MyServiceLayerClass implements PersonVisitor {
    public void requestAction(PersonActionRequest request) {
        PersonVO abstractPerson = request.getPerson();
        abstractPerson.accept(this);
    }

    public void executeAction(EmployeeVO employee) {}
    public void executeAction(ManagerVO manager) {}
    public void executeAction(UnicornWranglerVO unicornWrangler) {}
}
You could change the way you are approaching the design and use a Visitor, passing the executor into the Person and have the person type determine which to call.
The Visitor pattern is often used to overcome Java lacking double-dispatch.
I would explicitly cast the abstractPerson. Not only does it ensure the JVM gets the right method, it makes it a hell of a lot easier to read and ensure you know what's going on.
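For completeness, the explicit-cast approach described here would look something like the sketch below; it is easy to read, but it must be kept in sync by hand whenever a new PersonVO subtype is added, and the most specific types have to be checked first:
public void requestAction(PersonActionRequest request) {
    PersonVO abstractPerson = request.getPerson();
    if (abstractPerson instanceof UnicornWranglerVO) {
        executeAction((UnicornWranglerVO) abstractPerson);
    } else if (abstractPerson instanceof ManagerVO) {
        executeAction((ManagerVO) abstractPerson);
    } else if (abstractPerson instanceof EmployeeVO) {
        executeAction((EmployeeVO) abstractPerson);
    } else {
        executeAction(abstractPerson);
    }
}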
