The problem I am facing is as below -
I am using a 3rd party library, say Editor, which has an interface, EditorActions, with these methods -
create(), edit(), delete().
I do not want to expose EditorActions's methods in my implementation. So my interface will have methods like -
myCreate(), myEdit(), myDelete() which in turn should call the EditorActions methods.
EditorActions is only an interface, the implementation is internal to the library.
How do I link the 2 interfaces without implementing either of them?
thanks for all your help
You can do this by exposing the methods that you want people to use in an abstract class, and then forcing people to implement the specific methods that you want them to.
You can then use the methods from the EditorActions interface as well as the methods that you force your implementations to implement.
public abstract class AbstractEditorActions {

    private EditorActions ea;

    public AbstractEditorActions(EditorActions ea) {
        this.ea = ea;
    }

    // In this method, you can use the methods
    // from the interface and from this abstract class.
    // Make the method final so people don't break
    // the implementation.
    public final void yourExposedMethod() {
        // code
        this.toImplement();
        ea.edit(); // delegate to whichever EditorActions method you need
    }

    protected abstract void toImplement();
}
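A minimal usage sketch (the subclass name and the body of toImplement() are only illustrative):

public class ConcreteEditorActions extends AbstractEditorActions {

    public ConcreteEditorActions(EditorActions ea) {
        super(ea);
    }

    @Override
    protected void toImplement() {
        // behavior supplied by the subclass; callers only ever
        // see yourExposedMethod() from the abstract class
    }
}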
Assuming you obtain an instance of EditorActions from the library you could do this:
public class FooActions implements MyEditorActions, EditorActions {

    private EditorActions internal;

    public FooActions(EditorActions internal) {
        this.internal = internal;
    }

    @Override
    public void create() {
        internal.create();
    }

    @Override
    public void myCreate() {
        // do stuff
        this.create();
    }
}
What this does is wrap the instance of the library object with an object that implements the same interface as well as yours. Then, you just expose the object as whatever interface you want it to be.
FooActions actions = new FooActions(libraryInstance); // libraryInstance obtained from the library
EditorActions a1 = actions;   // a1 only shows library methods
MyEditorActions a2 = actions; // a2 only shows your methods
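For completeness, MyEditorActions here is assumed to be your own interface, along the lines of:

public interface MyEditorActions {
    void myCreate();
    void myEdit();
    void myDelete();
}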
How do I link the 2 interfaces without implementing either of them?
You can't. You are trying to do automatic magic here. Don't do magic. You have to implement either one of them no matter what.
Or, you'll have to implement your own reflection plumbing (or AOP somehow) to create classes on the fly. The latter is no trivial matter, and it is typically overkill and a red flag of over-engineering just to avoid writing what amounts to a plain old delegate.
OTOH, if you only want to "expose" a subset of the methods provided by a third-party interface A (say, for example, only the getter methods), you could almost trivially create (by good old elbow grease or a reflection library) an interface B that only exposes the subset of methods you desire.
interface DirtyThirdPartyInterface
{
    StupidCrap getSomeStupidCrap();
    void setStupidCrap(StupidCrap stupidCrap);
}

interface MySanitizedInterface
{
    StupidCrap getSomeStupidCrap();
    // the setter is not part of this interface
}
Then with, say, Spring AOP or something similar, or one of the several reflection libraries out there, you could auto-generate an implementation of MySanitizedInterface as an AOP interceptor that simply proxies calls to the getter (via reflection) on to the getter in the 3rd party interface.
But again, that's a lot of crap (not to mention 3rd party library dependencies) just to avoid what amounts to simple hand-coding. It is rare to find a real-world case that justifies all that plumbing malarkey. If I were to run into something like that, the first thing I would think is "red flag". YMMV of course.
Interfaces are great from a flexibility standpoint. But in cases where an interface is used by a large number of clients, adding new methods to the interface while keeping the old methods intact will break all clients' code, as the new methods won't be present in the clients. As shown below:
public interface CustomInterface {
    public void method1();
}

public class CustomImplementation implements CustomInterface {
    @Override
    public void method1() {
        System.out.println("This is method1");
    }
}
If at some point later in time, we add another method to this interface all clients' code will break.
public interface CustomInterface {
    public void method1();
    public void method2();
}
To avoid this we have to explicitly implement new methods in all clients' code.
So I think of interfaces and this scenario as following:
Interfaces once written are like carvings in stone. They are rarely supposed, or expected, to change. And if they do, the change comes with a huge cost (rewriting all the client code) which programmers should be ready for.
In continuation of the point above, is it possible to write interfaces that can stand the test of time?
How is such a scenario handled in interfaces where you expect additional functionality in the future? That is, anticipating change in the contract by which all clients are bound.
EDIT: Default method is indeed a nice addition to Java Interfaces which a lot of people have mentioned in their answers. But my question was more in the context of code design. And how forcing method implementation on the client is an intrinsic character of an interface. But this contract between an interface and a client seems fragile as functionality will eventually evolve.
One solution to this problem was introduced in Java 8 in the form of default methods in interfaces. It made it possible to add new methods to existing Java SE interfaces without breaking existing code, since a default implementation is supplied for each new method.
For example, the widely used Iterable interface (a superinterface of the Collection interface) gained two new default methods - default void forEach(Consumer<? super T> action) and default Spliterator<T> spliterator().
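Applied to the question's example, a rough sketch of how CustomInterface could evolve without breaking CustomImplementation (the body of method2() is only illustrative):

public interface CustomInterface {
    public void method1();

    // Added later: existing implementations keep compiling because
    // they inherit this default body.
    default void method2() {
        System.out.println("default implementation of method2");
    }
}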
public interface CustomInterface {
    public void method1();
}

public interface CustomInterface2 extends CustomInterface {
    public void method2();
}
Other than default methods, you can use interface inheritance as shown above, by which the new interface will have all the previous methods along with the new ones; use this new interface wherever it is required.
Java 8 has introduced default implementation for methods. These implementations reside in the interface. If a new method with a default implementation is created in an interface that is already implemented by many classes, there is no need to modify all the classes, but only the ones that we want to have a different implementation for the newly defined method than the default one.
Now, what about older Java versions? Here we can have another interface that extends the first one. After that, classes that we want to implement the newly-declared method will be changed to implement the new interface. As shown below.
public interface IFirst {
    void method1();
}

public class ClassOne implements IFirst {
    public void method1() { ... }
}

public class ClassTwo implements IFirst {
    public void method1() { ... }
}
Now, we want method2() declared, but it should only be implemented by ClassOne.
public interface ISecond extends IFirst {
    void method2();
}

public class ClassOne implements ISecond {
    public void method1() { ... }
    public void method2() { ... }
}

public class ClassTwo implements IFirst {
    public void method1() { ... }
}
This approach will be OK in most cases, but it has downsides as well. For example, suppose we want method3() (and only that one) for ClassTwo; we will need a new interface IThird. If later we want to add method4() that should be implemented by both ClassOne and ClassTwo (but not by a ClassThree that also implements IFirst), we will need to change both ISecond and IThird.
There rarely is a "magic bullet" when it comes to programming. In the case of interfaces, it is best if they don't change. This isn't always the case in real-life situations. That is why it is advised that interfaces offer just "the contract" (the must-have functionality), and that abstract classes be used where possible.
A future interface change shouldn't break anything that has been working -- if it does, it's a different interface. (It may deprecate things, though, and a full release cycle after deprecation it may become acceptable for those methods to throw an UnsupportedOperationException.)
To add things to an interface, the cleanest answer is to derive a new interface from it. That will allow using objects implementing the new behaviors with code expecting the old ones, while letting the user declare appropriately and/or typecast to get access to the new features. It's a bit annoying since it may require instanceof tests, but it's the most robust approach, and it's the one you'll see in many industry standards.
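A small sketch of that instanceof approach, reusing the CustomInterface and CustomInterface2 names from above (the process method itself is only illustrative):

void process(CustomInterface obj) {
    obj.method1();
    if (obj instanceof CustomInterface2) {   // the new capability is optional
        ((CustomInterface2) obj).method2();  // only call it when it is there
    }
}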
Interfaces are contracts between the developer and clients, so you're right - they are carved in stone and should not be changed. Therefore, an interface should expose (= demand) only the basic functionality that's absolutely required from a class.
Take the List interface for example. There are many implementations of lists in Java, many of which evolve over time (better under-the-hood algorithms, improved memory storage), but the basic "concept" of a list - add an item, search for an item, remove an item - should not and will not ever change.
So, to your question: Instead of writing interfaces which classes implement, you can use abstract classes. Interfaces are basically purely-abstract classes, in the sense that they do not provide any built-in functionality. However, one can add new, non-abstract methods to an abstract class that clients will not be required to implement (override).
Take this abstract class (= interface) for example:
abstract class BaseQueue {
    abstract public Object pop();
    abstract public void push(Object o);
    abstract public int length();

    public void clearEven() {}
}

public class MyQueue extends BaseQueue {
    @Override
    public Object pop() { ... }
    ...
}
Just like in interfaces, every class that extends BaseQueue is contractually bound to implement the abstract methods. The clearEven() method, however, is not an abstract method (and already comes with an empty implementation), so the client is not forced to implement it, or even use it.
That means that you can leverage the power of abstract classes in Java in order to create non-contractually-binding methods. You can add other methods to the base class in the future as much as you like, provided that they are not abstract methods.
I think your question is more about design and techniques, so the Java 8 answers are a bit misleading. This problem was known long before Java 8, so there are other solutions for it.
First, there is no absolutely cost-free way to solve the problem. The size of the inconvenience that comes from interface evolution depends on how the library is used and how deliberate your design is.
1) No technique will help if you designed an interface and forgot to include a mandatory method in it. Plan your design better and try to anticipate how clients will use your interfaces.
Example: Imagine a Machine interface that has a turnOn() method but misses a turnOff() method. Introducing the new method with a default empty implementation in Java 8 will prevent compilation errors but will not really help, because calling the method will have no effect. Providing a working implementation is sometimes impossible because an interface has no fields and no state.
2) Different implementations usually have things in common. Don't be afraid to keep common logic in a parent class. Derive your library classes from this parent class. This encourages library clients to derive their own implementations from your parent class as well. Now you can make small changes to the interface without breaking everything.
Example: You decided to include an isTurnedOn() method in your interface. With a base class, you can write a default method implementation that makes sense. Classes that do not inherit from the parent class still need to provide their own implementations, but since the method is straightforward, it will be easy for them.
3) Upgrading the functionality is usually achieved by extending the interfaces. There's no reason to force library clients to implement a bunch of new methods because they may not need them.
Example: You decided to add a stayIdle() method to your interface. It makes sense for classes in your library, but not for custom client classes. Since this functionality is new, it's better to create a new interface that extends Machine and use it where it's needed, as in the sketch below.
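A rough sketch of points 2 and 3 (BaseMachine and IdlingMachine are invented names, not from any library):

public interface Machine {
    void turnOn();
    void turnOff();
    boolean isTurnedOn();
}

// Point 2: common state and logic live in a parent class, so adding
// isTurnedOn() to the interface required no change in the subclasses.
public abstract class BaseMachine implements Machine {
    private boolean on;

    @Override public void turnOn() { on = true; }
    @Override public void turnOff() { on = false; }
    @Override public boolean isTurnedOn() { return on; }
}

// Point 3: new, optional functionality goes into an extending interface
// instead of forcing every Machine implementation to provide it.
public interface IdlingMachine extends Machine {
    void stayIdle();
}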
I was assigned to a project, and it is my job to add a feature to the already existing system. This functionality needs to be added to two separate classes. Both of these classes extend the same superclass, but it does not make sense to add the feature to this superclass. What is the best way to implement the same functionality in these two separate classes without too much code duplication? The simple way would be implementing this functionality in a static class and then using the static methods in the two classes that need this extra functionality, but that sort of seems like bad design.
Is there any sort of design I can use to implement something like this, or does running into this problem just point to a larger issue in the hierarchy that should be fixed rather than worked around?
Java does not have stand-alone "static" classes, so that's a non-starter since it's not even possible. As for use of static methods, that's fine if you're talking about stateless utility methods.
Myself, I guess I'd solve this with composition and interfaces:
Create an interface for the functionality that I desire
Create concrete instance(s) of this interface
Give the two classes fields of the interface
Plus getter and setter methods for the interface.
If the classes must expose the new behaviors themselves, have them implement the interface as well, and then obtain the behaviors by "indirection": the interface methods simply call the corresponding methods of the contained object.
I'm sorry that this answer is somewhat vague and overly general. If you need more specific advice from me or from anyone else here, then consider telling us more of the specifics of your problem.
Determine what common features of these two classes the new functionality relies on. Then, extract those features to an interface, modify the two classes to implement that interface, and put the new functionality code in its own class (or possibly a static method somewhere, e.g. NewFeature.doTheThing(NewFeaturable toWhat)) and make it operate on those interfaces.
If the existing classes have to obtain information from / call methods related to the "new feature", then give them a NewFeature field that is an instance of the new feature class and have them interact with that object. Pseudo-ish code:
interface NewFeaturable {
    int getRelevantInfo ();
}

class NewFeature {
    final NewFeaturable object;
    NewFeature (NewFeaturable object) { this.object = object; }
    void doSomething () { int x = object.getRelevantInfo(); ... }
}

class ExistingClass extends Base implements NewFeaturable {
    final NewFeature feature;
    ExistingClass () { ...; feature = new NewFeature(this); }
    @Override public int getRelevantInfo () { ... }
    void doSomethingNew () { feature.doSomething(); }
}
Be wary of new NewFeature(this) there, as subclasses of ExistingClass will not be fully constructed when it is called. If it's an issue, consider deferring initialization of feature until it is needed.
A lot of the specifics depend on your exact situation, but hopefully you get the general idea.
I have seen that many libraries like Spring use a lot of interfaces with single methods in them, like BeanNameAware, etc.
And the implementer class will implement many interfaces with single methods.
In what scenarios does it make sense to keep single method interfaces? Is it done to avoid making one single interface bulky, for example ResultSet? Or is there some design standard which advocates the use of this type of interface?
With Java 8, keeping single method interfaces is quite useful, since single method interfaces allow the use of lambda expressions ("closures") and method references ("function pointers"). So, whenever your code is written against a single method interface, the client code may hand in a lambda or a method (with a signature compatible with the method declared in the single method interface) instead of having to create an anonymous class. In contrast, if you make an interface with more than one method, the client code will not have that possibility; it must always use a class that implements all methods of the interface.
So as a common guideline, one can say: if a class that only exposes a single method to the client code might be useful to some client, then using a single method interface for that method is a good idea. A counter example to this is the Iterator interface: here, it would not be useful to have only a next() method without a hasNext() method. Since a class that only implements one of these methods is of no use, splitting this interface is not a good idea here.
Example:
interface SingleMethod { // The single method interface
    void foo(int i);
}

class X implements SingleMethod { // A class implementing it (and probably other ones)
    public void foo(int i) { ... }
}

class Y { // An unrelated class that has methods with matching signatures
    void bar(int i) { ... }
    static void bar2(int i) { ... }
}

class Framework { // A framework that uses the interface
    // Takes a single method object and does something with it
    // (probably invoking the method)
    void consume(SingleMethod m) { ... }
}

class Client { // Client code that uses the framework
    void useFramework() {
        Framework f = ...;
        X x = new X();
        Y y = new Y();

        f.consume(x); // Fine, also in Java 7

        // Java 8
        // ALL these calls are only possible since SingleMethod has only ONE method!
        f.consume(y::bar);  // Simply hand in a method. Object y is bound implicitly
        f.consume(Y::bar2); // Static methods are fine, too
        f.consume(i -> { System.out.println(i); }); // Lambda expression. Super concise.

        // The above could even be more concise,
        // presenting all the beauty of the recent Java changes:
        f.consume(System.out::println);

        // This is the only way if the interface has MORE THAN ONE method:
        // calling Y.bar2 without that closure stuff (super verbose)
        f.consume(new SingleMethod() {
            @Override public void foo(int i) { Y.bar2(i); }
        });
    }
}
Interfaces with only one (or few) methods is the key to the highly useful Strategy pattern, which is "some design standard which advocates the use of these type of interfaces".
Another common scenario is when you want a callback. Foo calls Bar as an asynchronous task, and when Bar is finished with something, the result is sent back to Foo using a callback -- which can be an interface containing only one method. (An example of this is the many listeners in Android, Event Listeners in Swing...)
Also, consider two classes that are tightly coupled with one another (let's call them Foo and Bar). Foo uses nearly all of Bar's methods, but Bar only needs a few of those from Foo. Foo can implement FooInterface, which is then passed to Bar. Now the coupling is looser, because Bar only knows about FooInterface and does not care about the other methods the implementing class contains.
In what scenarios does it make sense to keep single method interfaces?
In such scenarios as when you need an interface with only one method.
Interfaces are used to encapsulate a common behavior of several classes. So if you have several places in your code where you need to call only a limited set of class methods, it's time to introduce an interface. The number of methods depends on what exactly you need to call. Sometimes you need one method, sometimes two or more, sometimes you don't need methods at all. What matters is that you can separate behavior from implementation.
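For illustration (all names invented): code that only needs to save something depends on a narrow, single-method interface rather than on the concrete class.

interface Saver {
    void save(String data);
}

class FileStore implements Saver { // has many other methods too
    @Override public void save(String data) { /* write to disk */ }
    public void compact() { /* unrelated behaviour */ }
}

class ReportWriter {
    private final Saver saver; // only the behaviour it actually needs
    ReportWriter(Saver saver) { this.saver = saver; }
    void write() { saver.save("report"); }
}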
The "Favor Composition over Inheritance" principle from the Head First Design Patterns book recommends this approach for adding functionality to a class dynamically. Let's take the case below:
public interface Quackable {
    public void quack();
}

public class Quacks implements Quackable {
    public void quack() {
        // quack behavior
    }
}

public class DontQuack implements Quackable {
    public void quack() {
        // don't quack
    }
}

public class QuackableDuck {
    Quackable quack; // add behavior dynamically
}
So QuackableDuck class can add feature dynamically.
quack = new Quacks();
//or
quack = new DontQuack();
Similarly, you can add multiple behaviors to the class dynamically.
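Putting it together, a sketch of how QuackableDuck might delegate (the setter and the forwarding quack() method are assumptions, not the book's exact listing):

public class QuackableDuck {
    private Quackable quack; // behavior can be swapped at runtime

    public void setQuackBehavior(Quackable quack) { this.quack = quack; }

    public void quack() {
        quack.quack(); // delegate to whatever behavior is currently set
    }
}

QuackableDuck duck = new QuackableDuck();
duck.setQuackBehavior(new Quacks());
duck.quack(); // quacks
duck.setQuackBehavior(new DontQuack());
duck.quack(); // stays silent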
You create interfaces not according to the number of methods in them, but to define the behaviour that components of your system expect from their neighbors in order to deliver a single responsibility. If you follow this simple principle/rule, you might or might not end up with single method interfaces, depending on the responsibility you are defining. I like to keep tests stupidly simple and the application very flexible, so I usually have many of those.
I have two classes A and B which both implement the interface Z. Now, for some functions of interface Z (Z.f1, Z.f2, Z.f3, ...), class A should only act as a dispatcher to an object of class B.
public class A implements Z {

    private B b; // instantiated in constructor of A

    @Override
    public String f1(int p) {
        return b.f1(p);
    }
    ...
Is there a generic way to do this in Java?
If you mean that method f1() is declared in interface Z, the pattern you want to implement is called wrapper or decorator.
In Java you can create a generic implementation using dynamic proxies, introduced in Java 1.4.
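As a sketch of that idea (Delegator and forward are made-up names; this handler blindly forwards every call, whereas a real decorator would intercept the methods it wants to change):

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public final class Delegator {

    // Returns a proxy that implements the given interface and forwards
    // every call to the supplied target object.
    @SuppressWarnings("unchecked")
    public static <T> T forward(Class<T> iface, T target) {
        InvocationHandler handler =
                (proxy, method, args) -> method.invoke(target, args);
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[] { iface }, handler);
    }
}

// usage, with the B instance from the question:
// Z dispatcher = Delegator.forward(Z.class, new B());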
I don't think so. But sometimes your IDE can assist in creating all the simple methods that delegate the calls. And sometimes you can find third-party classes to do this. For example, Guava (http://code.google.com/p/guava-libraries/) has a ton of ForwardingXXX classes, which, by default, delegate everything to something else. For example, ForwardingMap delegates all calls to another Map. You only need to override the methods that you do NOT want to delegate.
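For example, overriding just one method of Guava's ForwardingMap might look roughly like this (the logging behaviour is only an illustration):

import java.util.Map;
import com.google.common.collect.ForwardingMap;

// A map that forwards everything to the wrapped map except get(),
// which is overridden to log the lookup first.
public class LoggingMap<K, V> extends ForwardingMap<K, V> {

    private final Map<K, V> delegate;

    public LoggingMap(Map<K, V> delegate) { this.delegate = delegate; }

    @Override
    protected Map<K, V> delegate() { return delegate; }

    @Override
    public V get(Object key) {
        System.out.println("looking up " + key);
        return super.get(key); // forwarded to the wrapped map
    }
}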
I know that this is the purpose of an interface, and that a class can be declared abstract to escape from it.
But is there any use in implementing all the methods that we declare in an interface? Will that not increase the weight and complexity of the code if we keep on defining all the methods even when they are not relevant for that class? Why is it designed so?
The idea of an interface in Java is very much like a contract (and perhaps seen in retrospect this should have been the name of the concept)
The idea is that the class implementing the interface solemnly promises to provide all the things listed in the contract so that any use of a class implementing the interface is guaranteed to have that functionality available.
In my experience this facility is one of the things that makes it possible to build cathedrals in Java.
What you are criticizing is exactly the goal interfaces achieve.
If you don't want to implement an interface, don't declare your class implementing it.
will that not increase the weight and complexity of the code if we keep on defining all the methods even when they are not relevant for that class?
When you program against an interface, you want the concrete object behind it to implement all of its methods. If your concrete object doesn't need, or cannot implement, all interface methods, you probably have a design issue to fix.
When any piece of code receives an instance of an interface without knowing what class is behind it, that piece of code should be assured of the ability to call any method in an interface. This is what makes an interface a contract between the callers and the providers of the functionality. The only way to achieve that is to require all non-abstract classes implementing the interface to provide implementations for all its functions.
There are two general ways to deal with the need to not implement some of the functionality:
Adding a tester method and an implementation that throws UnsupportedOperationException, and
Splitting your interface as needed into parts so that all method of a part could be implemented.
Here is an example of the first approach:
public interface WithOptionalMethods {
    void Optional1();
    void Optional2();
    boolean implementsOptional1();
    boolean implementsOptional2();
}

public class Impl implements WithOptionalMethods {

    public void Optional1() {
        System.out.println("Optional1");
    }

    public void Optional2() {
        throw new UnsupportedOperationException();
    }

    public boolean implementsOptional1() {
        return true;
    }

    public boolean implementsOptional2() {
        return false;
    }
}
Here is an example of the second approach:
public interface Part1 {
    void Optional1();
}

public interface Part2 {
    void Optional2();
}

public class Impl implements Part1 {
    public void Optional1() {
        System.out.println("Optional1");
    }
}
will that not increase the weight and complexity of the code if we keep on defining all the methods even when they are not relevant for that class?
Yes, you are right, it will. That is why it is a best practice to follow the Interface Segregation Principle, which recommends not forcing clients to depend on interfaces that they don't use. So you should never have one "fat" interface with many methods, but rather many small interfaces, each grouping the methods that serve a specific behavior or sub-module.
This way clients of an interface implement only the needed methods without ever being forced into implementing methods they don't need.
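A rough illustration with invented names: instead of one fat device interface, each client implements only the slice it actually uses.

// Instead of one fat interface ...
interface MultiFunctionDevice {
    void print(String doc);
    void scan(String doc);
    void fax(String doc);
}

// ... split it so each implementation provides only what it needs.
interface Printer { void print(String doc); }
interface Scanner { void scan(String doc); }

class SimplePrinter implements Printer {
    @Override public void print(String doc) { System.out.println("printing " + doc); }
    // not forced to provide scan() or fax()
}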
It may relate to the Liskov Substitution Principle.
So, having A implement B means that you can use an A wherever a B is needed, and, to make that work without problems, A must have at least the same methods as B.
Please keep in mind that mine is not a "proper" answer, as it's not based on official sources!
When implementing an interface, we still have to define every method it declares, but for the methods we don't need we can simply provide an empty body.