This seems like a basic Java question.
I have one interface, Pipeline, which has a method execute(Stage).
Then I create a sub-interface that extends Pipeline, say BookPipeline, and I would like its method to be execute(BookStage).
BookStage extends from Stage.
This kind of definition does not pass the Java compiler.
Any suggestions?
You may want to consider using generics.
public interface Pipeline<T extends Stage> {
public void execute(T stage);
}
public interface BookPipeline extends Pipeline<BookStage> {
@Override
public void execute(BookStage stage);
}
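To see the generic version in action, here is a minimal runnable sketch; BookStage.title() and the lambda implementation are invented for the demo:

```java
// Hypothetical minimal types to exercise the generic Pipeline idea above;
// BookStage.title() is invented for the demo.
class Stage { }

class BookStage extends Stage {
    String title() { return "book stage"; }
}

interface Pipeline<T extends Stage> {
    void execute(T stage);
}

interface BookPipeline extends Pipeline<BookStage> { }

public class PipelineDemo {
    static String lastTitle = "";

    public static void main(String[] args) {
        // BookPipeline has a single abstract method, so a lambda works here.
        BookPipeline bp = stage -> lastTitle = stage.title();
        bp.execute(new BookStage()); // compiles: the parameter type is BookStage
        // bp.execute(new Stage());  // would NOT compile: a Stage is not a BookStage
        System.out.println(lastTitle);
    }
}
```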
In addition to what @Jeffrey wrote as a possible solution, it is important to understand why you cannot do it.
Assume you had an interface Pipeline with a method execute(Stage), and an extending interface BookPipeline with execute(BookStage).
Also assume you have some class Conc that implements BookPipeline.
Consider the following
Pipeline p = new Conc();
p.execute(new Stage());
What will happen? Conc.execute expects a BookStage but receives a plain Stage: it is unsafe.
Java wants to avoid this, and thus prevents such situations in the first place.
The rule is that an extending class/interface can add behavior, but not reduce it.
Just to elaborate on @amit's answer, the code snippet is unsafe because the Conc.execute method takes a BookStage as a parameter, and this would try to squeeze a Stage in its place (and of course, not all Stages are BookStages).
However, imagine we wanted to go the other way, that is, make the parameter type of BookPipeline.execute a super type of Stage, such as Object.
So just to clarify, we would have:
interface Pipeline
{
void execute(Stage s);
}
interface BookPipeline extends Pipeline
{
@Override
void execute(Object s);
}
And where Conc implements BookPipeline:
Pipeline p = new Conc();
p.execute(new Stage());
This would, in theory, be safe, because Liskov substitutability has not been violated: we could safely pass a Stage into any implementation that takes a Stage parameter or greater. This is known as contravariance. Java does not support contravariant argument types; however, there are languages that do.
Your original question relates to covariant argument types, which are unsafe for the reasons specified (strangely enough, though, a language called Eiffel allows this).
Java does however support covariant return types. Imagine Pipeline had a
Stage getAStage();
it would be perfectly legal for BookPipeline to override this method like so:
@Override
BookStage getAStage();
Then imagine we had:
public void someMethodSomewhere(Pipeline p)
{
Stage s = p.getAStage();
//do some dance on Stage
}
Assuming we had some class Donc which implemented Pipeline and overrode getAStage() exactly as it is defined in Pipeline (so still returning Stage), both of these calls are OK:
someMethodSomewhere(new Conc());
someMethodSomewhere(new Donc());
Because we can always put a Stage or anything less (e.g. BookStage) in a variable of type Stage.
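The covariant-return discussion above can be condensed into a small runnable sketch (Conc's role is played by a constructor reference here; all names are illustrative):

```java
// Illustrative sketch of covariant return types; names mirror the discussion.
class Stage { }
class BookStage extends Stage { }

interface Pipeline {
    Stage getAStage();
}

interface BookPipeline extends Pipeline {
    @Override
    BookStage getAStage(); // narrowing the return type is legal in Java
}

public class CovariantReturnDemo {
    static boolean demo() {
        BookPipeline conc = BookStage::new; // an implementation returning a BookStage
        Pipeline p = conc;                  // callers expecting Pipeline still work
        Stage s = p.getAStage();            // static type Stage, dynamic type BookStage
        return s instanceof BookStage;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // true
    }
}
```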
So to reword the rule to relate specifically to method overriding: an extending class/interface that overrides methods can only make those methods more general in what they accept and more specific in what they return (although in the case of Java, only more specific return types are allowed).
Just remember, PECS - Producer Extends, Consumer Super (Joshua Bloch, Effective Java)
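As a quick illustration of PECS, a copy method reads from a producer with ? extends and writes to a consumer with ? super (the Stage/BookStage names are reused for continuity; this is a sketch, not library code):

```java
import java.util.ArrayList;
import java.util.List;

// A small PECS sketch: a producer is read with "? extends",
// a consumer is written with "? super".
public class PecsDemo {
    static class Stage { }
    static class BookStage extends Stage { }

    // Producer Extends: we only read Stages out of src.
    // Consumer Super: we only put Stages into dst.
    static void copy(List<? extends Stage> src, List<? super Stage> dst) {
        for (Stage s : src) {
            dst.add(s);
        }
    }

    public static void main(String[] args) {
        List<BookStage> books = new ArrayList<>();
        books.add(new BookStage());
        List<Object> sink = new ArrayList<>(); // Object is a super type of Stage
        copy(books, sink);
        System.out.println(sink.size()); // 1
    }
}
```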
Minimal working example:
static void foo(boolean bar){
some code A
if(bar){
some code B
}
else{
some code C
}
some code D
}
Here we use the parameter bar to determine the method's behavior, not to actually do something with its value. As a result we redundantly check the value of bar. The method that calls foo() knows the value of bar, since it actually passed it as a parameter. A simple alternative would be:
static void foo1(){
A;B;D;
}
static void foo2(){
A;C;D
}
The result is that we have redundant code. Now we could put A and D into methods, but what if they manipulate several variables? Java doesn't have methods with multiple return values. Even assuming we could put them into methods, we would still have foo1 looking like a();b();d() and foo2 looking like a();c();d(). My current solution to this issue is to create a functional interface for b() and c(), then to define foo as
static void foo(BCinterface baz){ A; baz.run(); D; }
The issue is that every time I want to write a method with slightly different behaviors, I have to define an interface for the methods where they differ. I know in other languages there are function pointers. Is there any way to achieve something similar in java without having to define an interface every time? Or is there some practice to avoid having these kinds of situations come up in the first place?
In fact, I think your very first code snippet is the best and most readable solution.
bar is used to determine what the method will do, so what? Why try to move this logic to the caller of foo? There is no point. If I were trying to read the caller of foo, do I need to know how foo works (given it's well named)? No. Because I'm only interested in what happens in the caller of foo. Abstraction is a good thing, not a bad thing. So my advice is, leave it as that.
If you really want to extract the logic, you don't need a new functional interface every time. The java.util.function and java.lang packages already provide some functional interfaces. Just use them. For example, in your specific case, BCInterface can be replaced by Runnable.
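For instance, a sketch of foo using Runnable, with a(), b(), c(), d() as invented stand-ins that just record which steps ran:

```java
// Sketch of the questioner's foo using the existing Runnable interface
// instead of a custom BCinterface; a(), b(), c(), d() are stand-ins.
public class RunnableDemo {
    static StringBuilder trace = new StringBuilder();

    static void a() { trace.append("A"); }
    static void b() { trace.append("B"); }
    static void c() { trace.append("C"); }
    static void d() { trace.append("D"); }

    static void foo(Runnable middle) {
        a();
        middle.run(); // the only step that varies
        d();
    }

    public static void main(String[] args) {
        foo(RunnableDemo::b); // runs A, B, D
        foo(RunnableDemo::c); // then A, C, D
        System.out.println(trace); // ABDACD
    }
}
```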
Your way of solving duplicated invocations seems overcomplicated.
To provide a distinct behavior at a specific step of a processing/algorithm, you can simply use the template method pattern, which relies on abstract methods and polymorphism:
In software engineering, the template method pattern is a behavioral design pattern that defines the program skeleton of an algorithm in an operation, deferring some steps to subclasses. It lets one redefine certain steps of an algorithm without changing the algorithm's structure.
Of course you will have to remove all these static modifiers, which prevent you from taking advantage of OOP features.
The boolean parameter is no longer required either.
Define in a base class Foo a foo() method that defines the general behavior and relies on an abstract method, and let the subclasses define the abstract method's implementation.
public abstract class Foo{
public abstract void specificBehavior();
public void foo(){
a();
specificBehavior();
d();
}
public void a(){
...
}
public void d(){
...
}
}
Now subclasses :
public class FooOne extends Foo {
public void specificBehavior(){
...
}
}
public class FooTwo extends Foo {
public void specificBehavior(){
...
}
}
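A runnable version of this template-method sketch, with step bodies that just log their names (the log field is added for the demo):

```java
// Minimal runnable sketch of the template method pattern described above;
// the step bodies just record their names in a log.
abstract class Foo {
    final StringBuilder log = new StringBuilder();

    public abstract void specificBehavior();

    // The template method: fixed skeleton, one varying step.
    public void foo() {
        a();
        specificBehavior();
        d();
    }

    public void a() { log.append("a"); }
    public void d() { log.append("d"); }
}

class FooOne extends Foo {
    public void specificBehavior() { log.append("b"); }
}

class FooTwo extends Foo {
    public void specificBehavior() { log.append("c"); }
}

public class TemplateMethodDemo {
    public static void main(String[] args) {
        Foo one = new FooOne();
        one.foo();
        System.out.println(one.log); // abd
    }
}
```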
I have read Item 16 from Effective Java and
Prefer composition over inheritance?, and am now trying to apply it to code written a year ago, when I first started getting to know Java.
I am trying to model an animal, which can have traits (e.g. Swimming, Carnivorous) and get different types of food.
public class Animal {
private final List<Trait> traits = new ArrayList<Trait>();
private final List<Food> eatenFood = new ArrayList<Food>();
}
In Item 16 composition-and-forwarding reuseable approach is suggested:
public class ForwardingSet<E> implements Set<E> {
private final Set<E> s;
public ForwardingSet(Set<E> s) {this.s = s;}
//implement all interface methods
public void clear() {s.clear();}
//and so on
}
public class InstrumentedSet<E> extends ForwardingSet<E> {
//counter for how many elements have been added since set was created
}
I can implement ForwardingList<E>, but I am not sure how I would apply it twice for the Animal class. Right now Animal has many methods like the ones below for traits, and likewise for eatenFood. This seems awkward to me.
public boolean addTrait (Trait trait) {
return traits.add(trait);
}
public boolean removeTrait (Trait trait) {
return traits.remove(trait);
}
How would you redesign the Animal class?
Should I keep it as it is or try to apply ForwardingList?
There is no reason you'd want to specialize a List for this problem. You are already using Composition here, and it's pretty much what I would expect from the class.
Composition is basically creating a class which has one (or usually more) members. Forwarding is effectively having your methods simply make a call to one of the objects it holds, to handle it. This is exactly what you're already doing.
Anyhow, the methods you mention are exactly the sort of methods I would expect for a class that has-a Trait. I would expect similar addFood / removeFood sorts of methods for the food. If they're wrong, they're the exact sort of wrong that pretty much everyone does.
IIRC (my copy of Effective Java is at work): ForwardingSet exists simply because you cannot safely extend a class that wasn't explicitly designed to be extended. If self-usage patterns etc. aren't documented, you can't reasonably delegate calls to super methods, because you don't know whether addAll calls add repeatedly in the default implementation. You can, however, safely delegate calls, because the object you are delegating to will never call back into the wrapper object. This absolutely doesn't apply here; you're already delegating calls to the list.
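For reference, a rough sketch of what such a forwarding wrapper looks like; CountingList is an invented name, and only a couple of List's methods are shown, so this is a sketch rather than a full ForwardingList:

```java
import java.util.ArrayList;
import java.util.List;

// Rough sketch of a forwarding wrapper in the spirit of Item 16's ForwardingSet;
// it delegates every call to the wrapped list (only two methods shown).
class CountingList<E> {
    private final List<E> inner;
    private int addCount = 0;

    CountingList(List<E> inner) { this.inner = inner; }

    boolean add(E e) {
        addCount++;          // instrumentation added by the wrapper
        return inner.add(e); // delegation, not inheritance
    }

    int addCount() { return addCount; }
    int size() { return inner.size(); }
}

public class ForwardingDemo {
    public static void main(String[] args) {
        CountingList<String> traits = new CountingList<>(new ArrayList<>());
        traits.add("Swimming");
        traits.add("Carnivorous");
        System.out.println(traits.addCount()); // 2
    }
}
```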
I have a class hierarchy where cousins share very similar functionality. For example:
Node
Statement
FunctionCallStatement
Expression
FunctionCallExpression
FunctionCallStatement and FunctionCallExpression share a very similar API, but I cannot express that in pure class terms with a single-inheritance hierarchy. So, I've created an IsFunctionCall Interface which both of these implement. I can now declare a method which takes either a FunctionCallStatement or a FunctionCallExpression as follows:
void <T extends Node & IsFunctionCall> doSomething(T node) { ... }
This all works very nicely.
Unfortunately, I've now found myself faced with a rather awkward problem. I have a Node; I know dynamically that it must be either a FunctionCallStatement or a FunctionCallExpression; I need to pass that Node into the doSomething() method above. I cannot find a way to upcast it to an appropriate type.
Right now I'm using a chain of instanceof to determine which class the Node is and to cast it to the appropriate concrete type, but that's butt-ugly. The only other way I know to make this work is to make an IsNode interface and have everything that currently expects a Node expect an IsNode instead; this would allow me to declare a union interface that implements IsNode and IsFunctionCall and let me do away without the generics above. But that's a hell of a lot of work and is still pretty ugly.
Is there an alternative way to do this?
(Note: example above is a simplified version of my actual code.)
Update: I tried the following piece of evil:
@SuppressWarnings("unchecked")
private <S extends Node & IsFunctionCall> S castNode(Node node)
{
return (S) node;
}
and then:
doSomething(castNode(node));
I got some very strange error messages. It would appear that the type inference used to determine the S of castNode() will not match against the T in the declaration of doSomething(); it's using the concrete type only and setting S to Node. Which of course does not match doSomething()'s declared type. Very peculiar.
Update update:
This appears to be a close duplicate of How should I cast for Java generic with multiple bounds?. My situation is slightly different because my bounds include an object and an interface, while the one in the other question has two interfaces, but it's still applicable.
Looks like I need to go and reengineer my entire application. Sigh.
Any admin, feel free to close this as a duplicate...
I think the way out of this, although not exactly elegant, is to have a few overloads for doSomething:
void doSomething(FunctionCallStatement node) ...
void doSomething(FunctionCallExpression node) ...
You are using the interface to flag functionality; how about passing as an argument a reference to the FunctionCallInterface, which offers access to the function-call abstraction?
doSomething won't have to know the actual implementation type as long as it can access the relevant information and call relevant methods on the implementation objects.
public class FunctionCallStatement extends Statement implements FunctionCallInterface {
}
void doSomething(FunctionCallInterface node) {
}
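A runnable sketch of this approach, with invented stand-ins for the question's AST classes (the shared interface is named IsFunctionCall here to match the question, and functionName() is an invented capability method):

```java
// Sketch of dispatching through the shared interface rather than instanceof;
// all names are illustrative stand-ins for the question's AST classes.
interface IsFunctionCall {
    String functionName();
}

class Node { }
class Statement extends Node { }
class Expression extends Node { }

class FunctionCallStatement extends Statement implements IsFunctionCall {
    public String functionName() { return "stmtCall"; }
}

class FunctionCallExpression extends Expression implements IsFunctionCall {
    public String functionName() { return "exprCall"; }
}

public class InterfaceDispatchDemo {
    // No generics needed: accept the capability interface directly.
    static String doSomething(IsFunctionCall call) {
        return call.functionName();
    }

    public static void main(String[] args) {
        System.out.println(doSomething(new FunctionCallStatement()));  // stmtCall
        System.out.println(doSomething(new FunctionCallExpression())); // exprCall
    }
}
```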
I can't seem to figure out why a method call I'm trying to make doesn't work.
I've looked much around SO before asking this, and while there are (many) threads about similar problems, I couldn't find one that quite fits my problem..
I have the following code:
(in file Processor.java:)
public interface Processor
{
Runner<? extends Processor> getRunner();
}
(in file Runner.java:)
public interface Runner<P extends Processor>
{
int runProcessors(Collection<P> processors);
}
(in some other file, in some method:)
Collection<? extends Processor> processorsCollection = ...;
Runner<? extends Processor> runner = ...;
runner.runProcessors(processorsCollection);
IntelliJ marks the last line as an error:
"runProcessors(java.util.Collection<P>) in Runner cannot be applied to (java.util.Collection<? extends Processor>)".
I can't figure out what's wrong with what I did, especially since the error message is not quite clear.
any suggestions?
thanks.
Both your collection and your runner allow for anything that extends Processor. But you can't guarantee they're the same.
Collection might be Collection<Processor1> and Runner be Runner<Processor2>.
Whatever method contains that code needs to be generic (I forget the exact syntax, but I'm sure you can find it!)
static <T extends Processor<T>> void foo() {
Collection<T> procColl = ...
Runner<T> runner = ...
runner.runProc(procColl);
}
Edit:
@newAcct makes an excellent point: you need to genericize (is that a word?) your Processor. I've updated my code snippet above to reflect this important change.
public interface Processor<P extends Processor<P>>
{
Runner<P> getRunner();
}
public interface Runner<P extends Processor<P>>
{
int runProcessors(Collection<P> processors);
}
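Putting the typed method and the genericized interfaces together, a runnable sketch (ConcreteProcessor and the trivial counting runner are invented for the demo):

```java
import java.util.Collection;
import java.util.List;

// Runnable sketch of the self-referential generic Processor/Runner pairing;
// ConcreteProcessor and the counting runner are illustrative.
interface Processor<P extends Processor<P>> {
    Runner<P> getRunner();
}

interface Runner<P extends Processor<P>> {
    int runProcessors(Collection<P> processors);
}

class ConcreteProcessor implements Processor<ConcreteProcessor> {
    public Runner<ConcreteProcessor> getRunner() {
        return procs -> procs.size(); // trivial runner: just count the processors
    }
}

public class ProcessorRunnerDemo {
    // The typed method suggested above: collection and runner share one T.
    static <T extends Processor<T>> int runAll(Collection<T> procs, Runner<T> runner) {
        return runner.runProcessors(procs);
    }

    public static void main(String[] args) {
        List<ConcreteProcessor> procs =
                List.of(new ConcreteProcessor(), new ConcreteProcessor());
        Runner<ConcreteProcessor> runner = procs.get(0).getRunner();
        System.out.println(runAll(procs, runner)); // 2
    }
}
```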
You have not made your situation clear and you're not showing us any of the code of the methods or of how you get the objects, so we don't really know what you're trying to do.
Your code is not type-safe. As @glowcoder mentioned, there is no way of knowing that the parameter of Collection is the same as the parameter of Runner. If you believe they are indeed the same, then that is based on code you're not showing us (i.e., what happens in "..."?)
You have written Processor's getRunner() method with a return type that has a wildcard parameter. This says that, when called, it will return a Runner with some unknown parameter that it determines. This doesn't make much sense and is probably not what you wanted.
Also depending on what you are doing, the runProcessors method could possibly take a less strict bound. For example, perhaps <? extends P> or even <? extends Processor> if you don't need to modify the collection.
I'm trying to evolve an API. As part of this evolution I need to change the return type of a method to a subclass (specialize) in order for advanced clients to be able to access the new functionality.
Example:
public interface Entity {
boolean a();
}
public interface Intf1 {
Entity entity();
}
public interface Main {
Intf1 intf();
}
I now want to have ExtendedEntity, Intf2 and Main like this:
public interface ExtendedEntity extends Entity {
boolean b();
}
public interface Intf2 extends Intf1 {
ExtendedEntity entity();
}
public interface Main {
Intf2 intf();
}
However, since a method's return type is part of its binary signature, clients already compiled against the previous version of the code show linkage errors (method not found, IIRC).
What I would like to do is add a method to Main with a different return type. The two methods (one that returns the super type and one that returns the subtype) should be mapped to the same implementation method (which returns the subtype). Note: as far as I understand, this is allowed by the JVM, but not by the Java spec.
My solution, which seems to work, abuses (I have no other word for it) the Java class system to add the required interface.
public interface Main_Backward_Compatible {
Intf1 intf();
}
public interface Main extends Main_Backward_Compatible{
Intf2 intf();
}
Now old clients' invokevirtual lookup will resolve to a method with the correct return type (since one exists in the type hierarchy), and the implementation that actually runs will be the one that returns the subtype Intf2.
This seems to work. In all the tests I could devise (barring reflection - but I don't care about that bit) it did work.
Will it always work? Is my reasoning (about the invokevirtual) correct?
And another, related, question - are there tools to check "real" binary compatibility? The only ones I've found look at each method by itself, but fail to consider type hierarchy.
Thanks,
Ran.
Edit - Tools I've tried and found "not so good" (do not take into account type hierarchy):
Clirr 0.6.
IntelliJ "APIComparator" plugin.
Edit2 - Of course, my clients are barred from creating implementation classes to my interfaces (think services). However, if you want the example to be complete, think abstract class (for Main) instead of interface.
This was long enough that I admit I didn't read everything scrupulously, but it seems like you might actually want to leverage generics here. If you parameterize Intf1, I think you can maintain binary compatibility while introducing specializations:
public interface Intf1<T extends Entity> {
T entity(); //erasure is still Entity so binary compatibility
}
public interface Intf2 extends Intf1<ExtendedEntity> { //if even needed
}
public interface Main {
Intf1<ExtendedEntity> intf(); //erasure is still Intf1, the raw type
}
Edit #1: There are some caveats when trying to maintain binary compatibility. See the Generics Tutorial chapters 6 and 10 for more information.
Edit #2:
You can extend this concept to typing Main as well:
public interface Main<T extends Entity, I extends Intf1<T>> {
I intf(); //still has the same erasure as it used to, so binary compatible
}
Old clients would then be able to use the raw Main type as they used to with no recompilation needed, and new clients would type their references to Main:
Main<ExtendedEntity, Intf2> myMain = Factory.getMeAMain();
Intf2 intf = myMain.intf();
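A self-contained sketch of this typed-Main idea; the factory method and the lambda implementations are invented purely to make the example run:

```java
// Sketch of the typed-Main idea; getMeAMain and the concrete implementations
// are invented here purely to make the example run.
interface Entity { boolean a(); }
interface ExtendedEntity extends Entity { boolean b(); }

interface Intf1<T extends Entity> {
    T entity(); // erasure is still Entity
}

interface Intf2 extends Intf1<ExtendedEntity> { }

interface Main<T extends Entity, I extends Intf1<T>> {
    I intf(); // erasure is still Intf1, the raw type
}

public class BinaryCompatDemo {
    static Main<ExtendedEntity, Intf2> getMeAMain() {
        Intf2 impl = () -> new ExtendedEntity() {
            public boolean a() { return true; }
            public boolean b() { return true; }
        };
        return () -> impl;
    }

    public static void main(String[] args) {
        Main<ExtendedEntity, Intf2> myMain = getMeAMain();
        Intf2 intf = myMain.intf();            // no cast needed for new clients
        System.out.println(intf.entity().b()); // true
    }
}
```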
We ended up not needing this solution, but we proved it working before that.
It would be simpler not to change the existing interfaces at all. Anyone using your new interface will be writing new code anyway.
Implementations of the existing Main.intf() signature can return an instance of Intf2.
Optionally, you could provide a new accessor that does not require casting:
public interface Main2 extends Main {
Intf2 intf2();
}
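A sketch of how an implementation could satisfy both accessors with one underlying object (Impl and its field are invented for the demo; Intf2 narrows entity()'s return type via a covariant return):

```java
// Sketch of the suggested Main2: old clients keep calling intf() and may cast,
// new clients call intf2(); all implementations here are illustrative.
interface Entity { }
interface ExtendedEntity extends Entity { }

interface Intf1 { Entity entity(); }
interface Intf2 extends Intf1 { ExtendedEntity entity(); } // covariant return

interface Main { Intf1 intf(); }
interface Main2 extends Main { Intf2 intf2(); }

public class Main2Demo {
    static class Impl implements Main2 {
        private final Intf2 value = () -> new ExtendedEntity() { };
        public Intf1 intf() { return value; }  // old signature, unchanged
        public Intf2 intf2() { return value; } // new accessor, no cast needed
    }

    public static void main(String[] args) {
        Main2 m = new Impl();
        System.out.println(m.intf() == m.intf2()); // same object, two views
    }
}
```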