I have a service which needs to handle two types of meals.
@Service
class MealService {

    private final List<MealStrategy> strategies;

    MealService(…) {
        this.strategies = strategies;
    }

    void handle() {
        var foo = …;
        var bar = …;
        strategies.forEach(s -> s.remove(foo, bar));
    }
}
There are two strategies, ‘BurgerStrategy’ and ‘PastaStrategy’. Both implement the Strategy interface, which has one method called remove that takes two parameters.
The BurgerStrategy class retrieves meals of enum type burger from the database, iterates over them, and performs some operations. The PastaStrategy does something similar.
The question is: does it make sense to call this Strategy and to implement it this way, or not?
Also, how should I handle duplication of code between those two strategies, say when both share the same private methods? Does it make sense to create a Helper class or something?
does it make sense to call it Strategy and implement it this way or not
I think these classes ‘BurgerStrategy’ and ‘PastaStrategy’ have common behaviour. The Strategy pattern is used when you want to inject one strategy and use it. However, you are iterating through all behaviours; you never select one strategy and stick with it. So, in my honest opinion, it is better to avoid the word Strategy here.
The Strategy pattern would look like this. I am sorry, I am not a Java guy, so let me show it in C#, but I've added comments on how the code could look in Java.
This is our abstraction of strategy:
public interface ISoundBehaviour
{
    void Make();
}
and its concrete implementations:
public class DogSound : ISoundBehaviour // implements in Java
{
    public void Make()
    {
        Console.WriteLine("Woof");
    }
}

public class CatSound : ISoundBehaviour
{
    public void Make()
    {
        Console.WriteLine("Meow");
    }
}
And then we stick with one behaviour that can also be replaced:
public class Dog
{
    ISoundBehaviour _soundBehaviour;

    public Dog(ISoundBehaviour soundBehaviour)
    {
        _soundBehaviour = soundBehaviour;
    }

    public void Bark()
    {
        _soundBehaviour.Make();
    }

    public void SetAnotherSound(ISoundBehaviour anotherSoundBehaviour)
    {
        _soundBehaviour = anotherSoundBehaviour;
    }
}
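For readers who prefer Java, the same sketch might look roughly like this (a minimal translation of the C# above; each public class would live in its own file, and the names simply mirror the example):
public interface SoundBehaviour {
    void make();
}

public class DogSound implements SoundBehaviour {
    @Override
    public void make() {
        System.out.println("Woof");
    }
}

public class Dog {
    private SoundBehaviour soundBehaviour;

    public Dog(SoundBehaviour soundBehaviour) {
        this.soundBehaviour = soundBehaviour;
    }

    public void bark() {
        soundBehaviour.make();   // always delegates to the one injected strategy
    }

    public void setAnotherSound(SoundBehaviour anotherSoundBehaviour) {
        this.soundBehaviour = anotherSoundBehaviour;
    }
}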
how to handle duplications of the code in those two services, let’s say both share the same private methods.
You can create one abstract base class. The basic idea is to put the common logic into that common base class and then declare an abstract method in it. Why? By doing this, each subclass provides the particular logic for its concrete case. Let me show an example.
An abstract class which has common behaviour:
public abstract class BaseMeal
{
    // I am not a Java guy, but if I am not mistaken, in Java,
    // if you do not want a method to be overridden, you should use the `final` keyword
    public void CommonBehaviourHere()
    {
        // put code here that can be shared among subclasses to avoid code duplication
    }

    public abstract void UnCommonBehaviourShouldBeImplementedBySubclass();
}
And its concrete implementations:
public class BurgerSubclass : BaseMeal // extends in Java
{
    public override void UnCommonBehaviourShouldBeImplementedBySubclass()
    {
        throw new NotImplementedException();
    }
}

public class PastaSubclass : BaseMeal // extends in Java
{
    public override void UnCommonBehaviourShouldBeImplementedBySubclass()
    {
        throw new NotImplementedException();
    }
}
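The Java equivalent would be roughly the following (a minimal sketch; the names are only illustrative). Note the final keyword on the shared method, as mentioned in the comment above:
public abstract class BaseMeal {

    // final: subclasses cannot override the shared logic
    public final void commonBehaviourHere() {
        // put code here that can be shared among subclasses to avoid duplication
    }

    // each subclass must supply its own implementation of this step
    protected abstract void unCommonBehaviour();
}

class BurgerSubclass extends BaseMeal {
    @Override
    protected void unCommonBehaviour() {
        // burger-specific logic
    }
}

class PastaSubclass extends BaseMeal {
    @Override
    protected void unCommonBehaviour() {
        // pasta-specific logic
    }
}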
Related
I am trying to implement the Template Method pattern, but I need a slight variation that I don't think is best practice.
I have the following structure of classes
abstract class AbsClass {
    public void algorithm() {
        step1();
        step2();
    }

    private void step1() {
        // implementation
    }

    protected abstract void step2();
}

class A extends AbsClass {
    protected void step2() {
        // With implementation
    }
}

class B extends AbsClass {
    protected void step2() {
        // No implementation needed
    }
}
In the real case I have about 4 classes, and one of them doesn't need an implementation for the second step. I don't think leaving the method empty would be good practice. I was thinking of putting a comment in it (saying there is no need for an implementation), but I don't think that would be the right solution either.
Is there another approach I am not seeing?
We should not force a design pattern; here it is better to prefer composition over inheritance.
In the code in the question we have a method defined in a class, but the method actually has no behaviour. Forcing a method onto a class where it does not belong is not a good idea.
Below is one possible implementation where you do not force a method onto a class it does not belong to. It is based on the Strategy pattern, but I would still say: follow design principles, let the pattern suit your problem, and do not force a pattern to fit your solution.
public class AlgorithmClass {
    private Strategy strategy;

    public void setStrategy(Strategy strategy) {
        this.strategy = strategy;
    }

    public void algorithm() {
        step1();
        step2();
    }

    private void step1() {
        // implementation
    }

    private void step2() {
        if (this.strategy != null) {
            this.strategy.execute();
        }
    }
}

public interface Strategy {
    public void execute();
}

public class Strategy1 implements Strategy {
    public void execute() {
        // implement one of your strategies for step 2
    }
}

public class Strategy2 implements Strategy {
    public void execute() {
        // implement another of your strategies for step 2
    }
}
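A short usage sketch, assuming the classes above: the case that needs real behaviour for step 2 sets a strategy, and the case that needs nothing simply never sets one, so step2() quietly does nothing instead of relying on an empty override.
public class Client {
    public static void main(String[] args) {
        // step 2 has real behaviour: inject a strategy
        AlgorithmClass withStep2 = new AlgorithmClass();
        withStep2.setStrategy(new Strategy1());
        withStep2.algorithm();    // runs step1(), then Strategy1.execute()

        // step 2 not needed: just don't set a strategy
        AlgorithmClass withoutStep2 = new AlgorithmClass();
        withoutStep2.algorithm(); // runs step1() only
    }
}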
I agree with @Vinay Avasthi's answer, but I want to reinforce it.
Hook Method
Hook methods are defined in the base class and provide a default implementation. They can be overridden, but they don't have to be.
From Template-Method-Pattern Wikipedia page:
Template method's abstract class may also define hook methods that may be overridden by subclasses. These have a no-op implementation in the abstract class, but provide a "hook" on which to "hang" implementations.
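Applied to the code in the question, that would mean turning step2() into a hook: a concrete no-op method rather than an abstract one, so only the subclasses that actually need step 2 override it (a minimal sketch):
abstract class AbsClass {
    public void algorithm() {
        step1();
        step2();
    }

    private void step1() {
        // common implementation
    }

    // hook method: no-op by default, overridden only where needed
    protected void step2() {
        // empty method body - intentionally does nothing
    }
}

class A extends AbsClass {
    @Override
    protected void step2() {
        // real implementation for step 2
    }
}

class B extends AbsClass {
    // no override needed; the inherited no-op hook is used
}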
Improvement
What you should do is leave a comment in the method body, like // empty method body, so that someone reading your code (and maybe your future self) knows that this method has not been forgotten.
Java's Default Methods
There is a second way to implement the Template-Method-Pattern in Java. Since Java 8 it is possible to have default method implementations in an interface.
If your methods do not depend on state it could look like:
interface AbsClass {
    default void algorithm() {
        step1();
        step2();
    }

    default void step1() {
        // implementation or empty
    }

    default void step2() {
        // empty body in default case
    }
}

class B implements AbsClass { }
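With this approach, class B picks up the whole algorithm unchanged: calling new B().algorithm() runs step1() and the no-op step2(), and a class that does need step 2 simply overrides that one default method.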
I think it is absolutely fine. If the default behavior of step2 is to do nothing, then you can have an empty method in the base class and override it in the child classes that need it.
I was wondering if it's frowned upon, when designing a framework to be used by others, for a class to provide some function as default behavior and expect its customers to override it if necessary. An example would be something like the following:
public class RecordProcessor<T extends Record> {
    // ...
    public void process() {
        // process record logic
    }
}
Consumers of this library create their own concrete classes to process their records of type T.
Now I want to add a function called preProcess() to offer the ability for the consumers to preprocess their records. It would then look something like this:
public class RecordProcessor<T extends Record> {
    // ...
    public void process() {
        preProcess();
        // process record logic
    }

    public void preProcess() {
        // By default no preprocessing
    }
}
I know I can make preProcess an abstract function, but I don't want to, for a couple of reasons:
Not all customers need to preprocess their records
We have a pipeline structure that autodeploys pushed code, so making RecordProcessor an abstract class would immediately break our customers' applications.
Is making preProcess do nothing in the parent class and letting child classes override it considered bad practice? If not, what is the best way to let customers know that they now have the power to preprocess records? Through Javadocs?
One approach is to mark the public method as final (but this might also break existing apps) and allow protected hook methods to be overridden. For example:
public class RecordProcessor<T extends Record> {
    // ...
    public final void process() {
        doPreProcess();
        doProcess();
        doPostProcess();
    }

    protected void doPreProcess() {
        // By default no preprocessing
        return;
    }

    protected void doProcess() {
        // some default implementation
    }

    protected void doPostProcess() {
        // By default no postprocessing
        return;
    }
}
Having some documentation should make it natural for other developers to recognize the optional extension methods.
I don't see anything wrong with having a hook method which does nothing. However, it should contain a return statement so static analysis tools won't complain.
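A consumer would then override only the hooks it needs. For example (a hypothetical subclass, assuming the class above):
public class AuditingRecordProcessor<T extends Record> extends RecordProcessor<T> {
    @Override
    protected void doPreProcess() {
        // consumer-specific preprocessing goes here
    }
    // doProcess() and doPostProcess() keep their inherited defaults
}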
UPDATE: in order to avoid breaking existing apps, if possible mark the existing method as deprecated and introduce a new method. For example:
public class RecordProcessor<T extends Record> {
    // ...
    public final void execute() {
        doPreProcess();
        doProcess();
        doPostProcess();
    }

    @Deprecated // use execute() method instead
    public void process() {
        doProcess();
    }

    protected void doPreProcess() {
        // By default no preprocessing
        return;
    }

    protected void doProcess() {
        // some default implementation
    }

    protected void doPostProcess() {
        // By default no postprocessing
        return;
    }
}
Prefer composition over inheritance. If you want your clients to add custom preprocessing, then do it by delegating to a separate object.
public interface RecordPreProcessor<T extends Record> {
    public void process(T record);
}

public class RecordProcessor<T extends Record> {
    private RecordPreProcessor<T> recordPreProcessor = null;

    public void setRecordPreProcessor(RecordPreProcessor<T> recordPreProcessor) {
        this.recordPreProcessor = recordPreProcessor;
    }

    public void process(T record) {
        if (recordPreProcessor != null) recordPreProcessor.process(record);
        // process record logic
    }
}
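Since RecordPreProcessor declares a single method, a client could supply the preprocessing step as a lambda. A usage sketch, assuming a hypothetical MyRecord type and the classes above:
RecordProcessor<MyRecord> processor = new RecordProcessor<>();
processor.setRecordPreProcessor(record -> {
    // custom preprocessing for this record
});
processor.process(new MyRecord()); // preprocessing runs first, then the record logic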
No, overriding is not discouraged in Java.
The language allows overriding.
The language makes all methods overridable by default.
The Java class library includes examples of the same pattern.
Your approach is one reasonable way to allow subclasses to extend the behavior of their parent class. There are alternatives, such as passing a behavior as an object. However, there is no one true way.
One way you could improve your code is to mark preProcess() as protected. It's an implementation detail of the class. You don't want just anyone holding a RecordProcessor to decide they can call preProcess() by itself, right?
public class RecordProcessor<T extends Record> {
...
protected void preProcess() {
^^^^^^^^^
// By default no preprocessing
}
}
Another way to improve this is to consider whether you intend anyone to create an instance of the superclass RecordProcessor. If you don't, make the class abstract to prevent that. The class name can express that too, if you like or if your coding guidelines call for it.
public abstract class AbstractRecordProcessor<T extends Record> {
^^^^^^^^ ^^^^^^^^
...
protected void preProcess() {
// By default no preprocessing
}
}
One common way to document such methods is with the phrase "The default implementation does nothing. Subclasses may override this method ...". For example, below is the documentation for java.util.concurrent.FutureTask.done(). You can find more examples by searching for the first sentence of that phrase online.
public class FutureTask<V> implements RunnableFuture<V> {
    ...
    /**
     * Protected method invoked when this task transitions to state
     * {@code isDone} (whether normally or via cancellation). The
     * default implementation does nothing. Subclasses may override
     * this method to invoke completion callbacks or perform
     * bookkeeping. Note that you can query status inside the
     * implementation of this method to determine whether this task
     * has been cancelled.
     */
    protected void done() { }
}
What I ended up doing, which I also thought was pretty good, inspired by @tsolakp, was simply creating a child class of RecordProcessor, called something like PreprocessRecordProcessor. This has no way of interfering with existing code because nothing existing was touched. The class would look something like this:
public abstract class PreprocessRecordProcessor<T extends Record> extends RecordProcessor<T> {
    // ...
    public void process() {
        preProcess();
        super.process();
    }

    protected abstract void preProcess();
}
And if customers of this library would like to add their own logic, they can simply extend this class, and they'd be forced to provide preprocessing logic (as opposed to merely having the option to provide it, which could lead to unexpected results if they forgot to).
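A consumer of the library would then use it roughly like this (a sketch with a hypothetical MyRecord type):
public class MyRecordProcessor extends PreprocessRecordProcessor<MyRecord> {
    @Override
    protected void preProcess() {
        // mandatory preprocessing for MyRecord - the compiler forces this override
    }
}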
source: https://en.wikipedia.org/wiki/Factory_method_pattern
Does this diagram really depict the Factory Method pattern?
Why do we need the Creator? Look at this code example:
interface Product {
    public String getName();
}

class ConcreteProduct1 implements Product {
    @Override
    public String getName() {
        return "I'm product 1";
    }
}

class ConcreteProduct2 implements Product {
    @Override
    public String getName() {
        return "I'm product 2!";
    }
}

// CREATOR HERE
interface Creator {
    public Product createProduct(String productClass);
}

class ConcreteCreator implements Creator {
    @Override
    public Product createProduct(String productClass) {
        if (productClass.equals("1"))
            return new ConcreteProduct1();
        else if (productClass.equals("2"))
            return new ConcreteProduct2();
        else
            return null;
    }
}

public class Test {
    public static void main(String[] args) {
        Creator c = new ConcreteCreator();
        Product product = c.createProduct("1");
        System.out.print(product.getName());
    }
}
Code without Creator interface:
class ConcreteCreator {
    public Product createProduct(String productClass) {
        if (productClass.equals("1"))
            return new ConcreteProduct1();
        else if (productClass.equals("2"))
            return new ConcreteProduct2();
        else
            return null;
    }
}

public class Test {
    public static void main(String[] args) {
        ConcreteCreator c = new ConcreteCreator();
        Product product = c.createProduct("1");
        System.out.print(product.getName());
    }
}
So why do we need the Creator interface? Is it in case I add another factory method in the future? If so, is it still the Factory Method pattern, or is it the Abstract Factory pattern? Could you give me some code examples with extensions to my Creator interface, and an implementation of ConcreteCreator which uses two methods?
Also, how about a generic Creator? It looks much simpler than many type-specific Creators:
interface Product {
    public String getName();
}

class ConcreteProduct implements Product {
    @Override
    public String getName() {
        return "I'm product 1";
    }
}

interface Moveable {
    public String move();
}

class Car implements Moveable {
    @Override
    public String move() {
        return "moving...";
    }
}

interface Creator<T> {
    public T create();
}

class ConcreteCreatorProducts implements Creator<Product> {
    @Override
    public Product create() {
        return new ConcreteProduct();
    }
}

class ConcreteCreatorCar implements Creator<Car> {
    @Override
    public Car create() {
        return new Car();
    }
}

public class Test {
    public static void main(String[] args) {
        Creator<Product> productCreator = new ConcreteCreatorProducts();
        Product product = productCreator.create();
        Creator<Car> carCreator = new ConcreteCreatorCar();
        Car car = carCreator.create();
    }
}
In your example, you don't need a Creator interface, unless you want to have multiple implementations and swap between them. But the diagram is actually describing a slightly different pattern than you've implemented.
The way the factory method pattern is described there is based on the original design patterns book. It's a bit odd today, as it uses subclassing to configure a class, when we would encourage the use of composition instead. So, the diagram does show the factory method pattern, but different from the way it's described in many other places.
The factory method pattern is:
Define an interface for creating an object, but let subclasses decide
which class to instantiate. The Factory method lets a class defer
instantiation it uses to subclasses.
In the original pattern, Creator isn't an interface. By 'interface', they mean the factory method that Creator defines, not interfaces like Java has.
The factory method doesn't need a parameter. Instead of different types being returned based on the parameter, there are different types returned based on the subclass created.
Also, you wouldn't call createProduct from main, but from methods within Creator. Creator is the user of the factory method, so it defines a factory method, that may be abstract, and some other methods that use that method.
See the Java examples on the wikipedia page. The MazeGame class is the Creator. The constructor is used as the anOperation method, and there are multiple subclasses for creating different kinds of rooms.
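To make that concrete, here is a minimal Java sketch of the classic form, reusing the Product classes from the question (this Creator is an abstract class rather than the interface from the question, and the method names are only illustrative):
abstract class Creator {
    // the "anOperation"-style method: it uses the factory method itself
    public String describeNewProduct() {
        Product product = createProduct();   // instantiation is deferred to the subclass
        return "Created: " + product.getName();
    }

    // the factory method: no parameter, the subclass decides the concrete type
    protected abstract Product createProduct();
}

class Product1Creator extends Creator {
    @Override
    protected Product createProduct() {
        return new ConcreteProduct1();
    }
}

class Product2Creator extends Creator {
    @Override
    protected Product createProduct() {
        return new ConcreteProduct2();
    }
}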
Code is written so that human readers understand it.
This means that you as a programmer sometimes use the means of the language not because it is absolutely mandatory, but because it is the best way to communicate your intention.
As soon as you declare that something is an interface, you make it clear that there is no "base class" - only an interface - and that any specific implementation is a subtle detail that is not really important to people dealing with the corresponding objects.
In other words: yes, it is perfectly possible to implement a factory pattern where the part responsible for creating the actual objects is not an interface but a fixed class. Especially for "internal" factories (ones that are not exposed to a public API and a wide range of "different" end users), that is probably even the more common approach. (The code I write contains many factories, and few of them follow the above approach of "interfacing" almost everything.)
Beyond that, keep in mind that programming is often about balancing different requirements. For example, you might (again, to communicate intent) decide to declare a class that provides certain functionality as final, so that nobody gets the idea of extending that specific class. But doing so means that users of that API are suddenly constrained in their choice of mocking frameworks, as mocking final classes is not something you can do easily. When you are then consuming this API and want to write unit tests, you are very happy that the public API relies on interfaces, not classes, because you can always mock interfaces - but, as said, final classes can cause headaches.
I am designing my game application and have run into some OOP design troubles.
I want to know some patterns which can help me, because Java does not offer multiple inheritance of classes. I will describe my problem below, and also explain why multiple interfaces don't help me at all. Let's go.
What I want is for a class to be a set of features. By feature I mean a construction like:
field a;
field b;
field c;
method m1(){
// use, and change fields a,b,c;
}
method m2(){
// use, and change fields a,b,c;
}
//etc
So basically a feature is a set of methods and their corresponding fields, which makes it very close to a Java interface.
When I say that a class implements "feature1", I mean that this class contains ALL the fields the feature needs and has implementations of all the feature-related methods.
The tricky part begins when a class implements two features. There is a chance that two different features contain similar fields (fields with the same name); the case where such fields have different types is out of scope here. What I want is "feature naming tolerance": if methodA() from feature A changes the field "common_field", then methodB() from feature B, which also uses "common_field", will see that change.
So, I want to create a set of features (basically interfaces) and their implementations. After that I want to create classes which extend multiple features, without any copy-paste or other cruft.
But I can't write this code in Java:
public static interface Feature1 {
    public void method1();
}

public static interface Feature2 {
    public void method2();
}

public static class Feature1Impl implements Feature1 {
    int feature1Field;
    int commonField;

    @Override
    public void method1() {
        feature1Field += commonField;
        commonField++;
    }
}

public static class Feature2Impl implements Feature2 {
    int feature2Field;
    int commonField;

    @Override
    public void method2() {
        commonField++;
    }
}

public static class MyFeaturedClass extends Feature1Impl, Feature2Impl implements Feature1, Feature2 {
}
So, as you can see, the problem is really complex.
Below I'll describe why some standard approaches don't work here.
1) Use something like this:
public static class MyFeaturesClass implements Feature1, Feature2 {
    Feature1 feature1;
    Feature2 feature2;

    @Override
    public void method2() {
        feature2.method2();
    }

    @Override
    public void method1() {
        feature1.method1();
    }
}
OK, this is a really nice approach, but it does not provide "feature field name tolerance": calling method2 will not change the field "commonField" in the object corresponding to feature1.
2) Use another design. Why would you even need such an approach?
OK. In my game there is a "unit" concept. A unit is a MOVABLE and ALIVE object.
Movable objects have a position and a move() method. Alive objects have hp and takeDamage() and die() methods.
There are objects in my game that are only MOVABLE, but not alive.
There are also ALIVE objects in my game that are not movable (buildings, for example).
And when I implement movable and alive as classes that implement interfaces, I really don't know what my Unit class should extend. Either way I end up copy-pasting.
The example above is really simple; in reality I need a lot of different features for different game mechanics, and I will have a lot of different objects with different properties.
What I actually tried is:
Map<Field,Object> fields;
Every object in my game has such a Map, and any method can be applied to any object. A method's implementation just takes the fields it needs from the map, does its job, and changes some of them. The problem with this approach is performance. First of all, I don't want to use the Double and Integer wrapper classes for double and int fields, and second, I want direct access to the fields of my objects (not through the map object).
Any suggestions?
PS. What I want as a result:
class A implements Feature1, Feature2, Feature3, Feature4, Feature5 {
    // all features have corresponding FeatureNImpl implementations;
    // features 1-2-3 share some fields, as do features 3-4 and features 5-1;
    // a really fast implementation with "shared field tolerance" is needed.
}
One possibility is to add another layer of interfaces: an XXXProvider interface could be defined for each possible common field, declaring a getter and setter for it.
A feature implementation class would require the needed providers in its constructor. All access to common fields is done through these references.
A concrete game object class would implement the needed provider interfaces and feature interfaces. Through aggregation, it would add the feature implementations (passing this as the provider) and delegate the feature calls to them.
E.g.
public interface Feature1 {
    void methodF1();
}

public interface Feature2 {
    void methodF2();
}

public interface FieldAProvider {
    int getA();
    void setA(int a);
}

public class Feature1Impl implements Feature1 {
    private FieldAProvider _a;

    public Feature1Impl(FieldAProvider a) {
        _a = a;
    }

    @Override
    public void methodF1() {
        _a.setA(_a.getA() * 2);
    }
}

// Similar for Feature2Impl

public class GameObject implements Feature1, Feature2, FieldAProvider {
    int _fieldA;
    Feature1 _f1;
    Feature2 _f2;

    GameObject() {
        _f1 = new Feature1Impl(this);
        _f2 = new Feature2Impl(this);
    }

    @Override
    public int getA() {
        return _fieldA;
    }

    @Override
    public void setA(int a) {
        _fieldA = a;
    }

    @Override
    public void methodF1() {
        _f1.methodF1();
    }

    @Override
    public void methodF2() {
        _f2.methodF2();
    }
}
However, I don't think this is an optimal solution.
I know that an interface must be public. However, I don't want that.
I want my implemented methods to only be accessible from their own package, so I want my implemented methods to be protected.
The problem is I can't make the interface or the implemented methods protected.
What is a work around? Is there a design pattern that pertains to this problem?
From the Java guide, an abstract class wouldn't do the job either.
read this.
"The public access specifier indicates that the interface can be used by any class in any package. If you do not specify that the interface is public, your interface will be accessible only to classes defined in the same package as the interface."
Is that what you want?
Your class can use package protection and still implement an interface:
class Foo implements Runnable
{
public void run()
{
}
}
If you want some methods to be protected/package-private and others not, it sounds like your classes have more than one responsibility and should be split into multiple classes.
Edit after reading comments to this and other responses:
If you are somehow thinking that the visibility of a method affects the ability to invoke that method, think again. Without going to extremes, you cannot prevent someone from using reflection to identify your class's methods and invoke them. However, this is a non-issue: unless someone is trying to crack your code, they're not going to invoke random methods.
Instead, think of private / protected methods as defining a contract for subclasses, and use interfaces to define the contract with the outside world.
Oh, and to the person who decided my example should use K&R bracing: if it's specified in the Terms of Service, sure. Otherwise, can't you find anything better to do with your time?
When I have butted up against this, I use a package-accessible inner or nested class to implement the interface, pushing the implemented method out of the public class.
Usually it's because I have a class with a specific public API which must implement something else to get its job done (quite often because the something else was a callback disguised as an interface <grin>) - this happens a lot with things like Comparable. I don't want the public API polluted with the (forced public) interface implementation.
Hope this helps.
Also, if you truly want the methods accessed only by the package, you don't want the protected scope specifier, you want the default (omitted) scope specifier. Using protected will, of course, allow subclasses to see the methods.
BTW, I think that the reason interface methods are inferred to be public is because it is very much the exception to have an interface which is only implemented by classes in the same package; they are very much most often invoked by something in another package, which means they need to be public.
This question is based on a wrong statement:
I know that an interface must be public
Not really, you can have interfaces with default access modifier.
The problem is I can't make the interface or the implemented methods protected
Here it is:
C:\oreyes\cosas\java\interfaces>type a\*.java
a\Inter.java
package a;
interface Inter {
public void face();
}
a\Face.java
package a;
class Face implements Inter {
public void face() {
System.out.println( "face" );
}
}
C:\oreyes\cosas\java\interfaces>type b\*.java
b\Test.java
package b;
import a.Inter;
import a.Face;
public class Test {
public static void main( String [] args ) {
Inter inter = new Face();
inter.face();
}
}
C:\oreyes\cosas\java\interfaces>javac -d . a\*.java b\Test.java
b\Test.java:2: a.Inter is not public in a; cannot be accessed from outside package
import a.Inter;
^
b\Test.java:3: a.Face is not public in a; cannot be accessed from outside package
import a.Face;
^
b\Test.java:7: cannot find symbol
symbol : class Inter
location: class b.Test
Inter inter = new Face();
^
b\Test.java:7: cannot find symbol
symbol : class Face
location: class b.Test
Inter inter = new Face();
^
4 errors
C:\oreyes\cosas\java\interfaces>
Hence, this achieves what you wanted: preventing the interface and class from being used outside the package.
Here's how it could be done using abstract classes.
The only inconvenience is that it forces you to subclass.
As per the Java guide, you should follow that advice "most" of the time, but I think in this situation it will be OK.
public abstract class Ab {
protected abstract void method();
abstract void otherMethod();
public static void main( String [] args ) {
Ab a = new AbImpl();
a.method();
a.otherMethod();
}
}
class AbImpl extends Ab {
protected void method(){
System.out.println( "method invoked from: " + this.getClass().getName() );
}
void otherMethod(){
System.out.println("This time \"default\" access from: " + this.getClass().getName() );
}
}
Here's another solution, inspired by the C++ Pimpl idiom.
If you want to implement an interface, but don't want that implementation to be public, you can create a composed object of an anonymous inner class that implements the interface.
Here's an example. Let's say you have this interface:
public interface Iface {
public void doSomething();
}
You create an object of the Iface type, and put your implementation in there:
public class IfaceUser {
private int someValue;
// Here's our implementor
private Iface impl = new Iface() {
public void doSomething() {
someValue++;
}
};
}
Whenever you need to invoke doSomething(), you invoke it on your composed impl object.
I just came across this while trying to build a protected method with the intention of it only being used in a test case. I wanted to delete test data that I had stuffed into a DB table. In any case I was inspired by @Karl Giesing's post. Unfortunately it did not work, but I did figure out a way to make it work using a protected inner class.
The interface:
package foo;
interface SomeProtectedFoo {
int doSomeFoo();
}
The inner class is then defined as protected in the public class:
package foo;
public class MyFoo implements SomePublicFoo {
// public stuff
protected class ProtectedFoo implements SomeProtectedFoo {
public int doSomeFoo() { ... }
}
protected ProtectedFoo pFoo;
protected ProtectedFoo gimmeFoo() {
return new ProtectedFoo();
}
}
You can then access the protected method only from other classes in the same package, as my test code shows:
package foo;
public class FooTest {
MyFoo myFoo = new MyFoo();
void doProtectedFoo() {
myFoo.pFoo = myFoo.gimmeFoo();
myFoo.pFoo.doSomeFoo();
}
}
A little late for the original poster, but hey, I just found it. :D
You can go with encapsulation instead of inheritance.
That is, create your class (which won't inherit anything) and in it, have an instance of the object you want to extend.
Then you can expose only what you want.
The obvious disadvantage of this is that you must write explicit pass-through methods for everything you want exposed, and it won't be a subclass...
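A minimal sketch of that idea (all names here are made up for illustration): the wrapper composes an instance of the class you would otherwise have extended and forwards only the methods you want callers to see.
// the class we would otherwise have extended
class FullImplementation {
    public void allowedOperation() { /* ... */ }
    public void internalOperation() { /* ... */ }
}

// the wrapper: exposes only what it chooses to
public class LimitedView {
    private final FullImplementation delegate = new FullImplementation();

    public void allowedOperation() {
        delegate.allowedOperation();   // explicit pass-through
    }
    // internalOperation() is deliberately not exposed
}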
I would just create an abstract class. There is no harm in it.
With an interface you want to define methods that can be exposed by a variety of implementing classes.
Having an interface with protected methods just wouldn't serve that purpose.
I am guessing your problem can be solved by redesigning your class hierarchy.
One way to get around this is (depending on the situation) to just make an anonymous inner class that implements the interface that has protected or private scope. For example:
public class Foo {
interface Callback {
void hiddenMethod();
}
public Foo(Callback callback) {
}
}
Then in the user of Foo:
public class Bar {
    private Foo.Callback callback = new Foo.Callback() {
        @Override public void hiddenMethod() { ... }
    };

    private Foo foo = new Foo(callback);
}
This saves you from having the following:
public class Bar implements Foo.Callback {
    private Foo foo = new Foo(this);

    // uh-oh! the method is public!
    @Override public void hiddenMethod() { ... }
}
I think you can use this now with the Java 9 release. From the OpenJDK notes for Java 9:
Support for private methods in interfaces was briefly in consideration
for inclusion in Java SE 8 as part of the effort to add support for
Lambda Expressions, but was withdrawn to enable better focus on higher
priority tasks for Java SE 8. It is now proposed that support for
private interface methods be undertaken thereby enabling non abstract
methods of an interface to share code between them.
refer https://bugs.openjdk.java.net/browse/JDK-8071453
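As a small sketch of what that enables (Java 9 or later; the names are made up): a private interface method can hold code shared by the default methods without becoming part of the public contract.
interface Greeter {
    default void greetMorning(String name) {
        print("Good morning, " + name);
    }

    default void greetEvening(String name) {
        print("Good evening, " + name);
    }

    // Java 9+: private interface method shared by the default methods above
    private void print(String message) {
        System.out.println(message);
    }
}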