When using Lombok @Data (which adds @EqualsAndHashCode), it adds a canEqual method:
protected boolean canEqual(Object other) {
return other instanceof Exercise;
}
which is only called once:
if (!other.canEqual((Object)this)) return false;
I searched and found discussions about the access level:
If you implement equals and hashCode in a non-final class, the safest thing we can do is add canEqual the way we do. Since we don't add any field, the costs, especially if the method is protected, are slim.
But why do we need this generated method? Can't it be inlined?
The canEqual method is defined in a paper entitled How to Write an Equality Method in Java. It allows equality to be redefined at several levels of a class hierarchy while keeping the equals contract:
The idea is that as soon as a class redefines equals (and hashCode), it should also explicitly state that objects of this class are never equal to objects of some superclass that implement a different equality method. This is achieved by adding a method canEqual to every class that redefines equals.
Seems like it was introduced in Lombok 0.10, as described in the @EqualsAndHashCode documentation:
NEW in Lombok 0.10: Unless your class is final and extends java.lang.Object, lombok generates a canEqual method which means JPA proxies can still be equal to their base class, but subclasses that add new state don't break the equals contract.
And the documentation goes a bit further, referencing the paper quoted above:
The complicated reasons for why such a method is necessary are explained in this paper: How to Write an Equality Method in Java. If all classes in a hierarchy are a mix of scala case classes and classes with lombok-generated equals methods, all equality will 'just work'. If you need to write your own equals methods, you should always override canEqual if you change equals and hashCode.
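To make the idea concrete, here is a minimal hand-written sketch of the pattern described in the paper (the Point/ColoredPoint names are illustrative, not generated by Lombok): each class that redefines equals also overrides canEqual, so the object on the other side of the comparison gets to veto equality, which keeps equals symmetric across the hierarchy.
class Point {
    private final int x, y;

    Point(int x, int y) { this.x = x; this.y = y; }

    public boolean canEqual(Object other) {
        return other instanceof Point;
    }

    @Override public boolean equals(Object other) {
        if (!(other instanceof Point)) return false;
        Point that = (Point) other;
        // The other object gets to veto the comparison: a ColoredPoint
        // will refuse to be equal to a plain Point.
        return that.canEqual(this) && this.x == that.x && this.y == that.y;
    }

    @Override public int hashCode() { return 31 * x + y; }
}

class ColoredPoint extends Point {
    private final String color;

    ColoredPoint(int x, int y, String color) { super(x, y); this.color = color; }

    @Override public boolean canEqual(Object other) {
        return other instanceof ColoredPoint;
    }

    @Override public boolean equals(Object other) {
        if (!(other instanceof ColoredPoint)) return false;
        ColoredPoint that = (ColoredPoint) other;
        return that.canEqual(this) && this.color.equals(that.color) && super.equals(that);
    }

    @Override public int hashCode() { return 31 * super.hashCode() + color.hashCode(); }
}
A subclass that adds no state (a JPA proxy, for example) simply inherits both equals and canEqual and still compares equal to its base class, which is exactly the case the Lombok documentation mentions.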
Related
Why does the Collection interface have equals(Object o) and hashCode(), given that any implementation will have those by default (inherited from Object)?
From the Collection JavaDoc:
While the Collection interface adds no stipulations to the general contract
for the Object.equals, programmers who implement the Collection
interface "directly" (in other words, create a class that is a
Collection but is not a Set or a List) must exercise care if they
choose to override the Object.equals. It is not necessary to do so,
and the simplest course of action is to rely on Object's
implementation, but the implementor may wish to implement a "value
comparison" in place of the default "reference comparison." (The List
and Set interfaces mandate such value comparisons.)
The general contract for the Object.equals method states that equals
must be symmetric (in other words, a.equals(b) if and only if
b.equals(a)). The contracts for List.equals and Set.equals state that
lists are only equal to other lists, and sets to other sets. Thus, a
custom equals method for a collection class that implements neither
the List nor Set interface must return false when this collection is
compared to any list or set. (By the same logic, it is not possible to
write a class that correctly implements both the Set and List
interfaces.)
and
While the Collection interface adds no stipulations to the general contract for the Object.hashCode method, programmers should take note that any class that overrides the Object.equals method must also override the Object.hashCode method in order to satisfy the general contract for the Object.hashCode method. In particular, c1.equals(c2) implies that c1.hashCode()==c2.hashCode().
To answer your specific question, why does it have these methods: it's done simply for convenience, so the Javadoc can give hints as to what implementers should do with these methods (e.g. comparing equality of values rather than references).
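For example, the "value comparison" mandated by List, and the list/set asymmetry described in the quote above, can be observed directly; a small sketch (assuming Java 9+ for List.of):
import java.util.*;

public class CollectionEqualsDemo {
    public static void main(String[] args) {
        List<Integer> arrayList = new ArrayList<>(List.of(1, 2, 3));
        List<Integer> linkedList = new LinkedList<>(List.of(1, 2, 3));
        Set<Integer> set = new HashSet<>(List.of(1, 2, 3));

        // List.equals mandates value comparison: same elements, same order.
        System.out.println(arrayList.equals(linkedList)); // true

        // A list is never equal to a set, even with the same elements.
        System.out.println(arrayList.equals(set)); // false
        System.out.println(set.equals(arrayList)); // false
    }
}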
To add to the other great answers: the Collection interface declares the equals method so that it can spell out how comparing two collection instances is supposed to work. From the Java 8 documentation:
More generally, implementations of the various Collections Framework
interfaces are free to take advantage of the specified behavior of
underlying Object methods wherever the implementor deems it
appropriate.
So you don't redeclare methods from the Object class for any reason other than making the Javadoc more definitive. This is also why those methods are not counted among the abstract methods of an interface.
Moreover, in Java 8, along the same line of reasoning, default methods for the Object class's methods are not allowed and will generate a compile error. I believe this was done to prevent this type of confusion. So if you try to create a default method called hashCode(), for example, it will not compile.
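For instance, something along these lines is rejected by the compiler (the interface name is just for illustration; the exact error wording varies by compiler, but javac reports that the default method overrides a member of java.lang.Object):
interface Broken {
    // Compile error: a default method cannot override a method of java.lang.Object
    default int hashCode() {
        return 42;
    }

    // An abstract redeclaration is still allowed (List does this for equals and
    // hashCode), because it only adds documentation, not behavior:
    // int hashCode();
}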
Here is a more in-depth explanation of this behavior in Java 8, from the Lambda FAQ:
An interface cannot provide a default implementation for any of the
methods of the Object class. This is a consequence of the “class wins”
rule for method resolution: a method found on the superclass chain
always takes precedence over any default methods that appear in any
superinterface. In particular, this means one cannot provide a default
implementation for equals, hashCode, or toString from within an
interface.
This seems odd at first, given that some interfaces actually define
their equals behavior in documentation. The List interface is an
example. So, why not allow this?
One reason is that it would become more difficult to reason about when
a default method is invoked. The current rules are simple: if a class
implements a method, that always wins over a default implementation.
Since all instances of interfaces are subclasses of Object, all
instances of interfaces have non-default implementations of equals,
hashCode, and toString already. Therefore, a default version of these
on an interface is always useless, and it may as well not compile.
Another reason is that providing default implementations of these
methods in an interface is most likely misguided. These methods
perform computations over the object’s state, but the interface, in
general, has no access to state; only the implementing class has
access to this state. Therefore, the class itself should provide the
implementations, and default methods are unlikely to be useful.
Just to add to the great answers above: it makes sense to have the equals or hashCode methods declared in the interface in this scenario:
Collection<Whatever> list1 = getArrayList();
Collection<Whatever> list2 = getAnotherArrayList();
if(list1.equals(list2)){
// do something
}
In the absence of the equals method in the interface, we'll be forced to use concrete types, which is generally not a good practice:
ArrayList<Whatever> list1 = getArrayList();
ArrayList<Whatever> list2 = getAnotherArrayList();
if(list1.equals(list2)){
// do something
}
Default methods are a nice new tool in our Java toolbox. However, I tried to write an interface that defines a default version of the toString method. Java tells me that this is forbidden, since methods declared in java.lang.Object may not be defaulted. Why is this the case?
I know that there is the "base class always wins" rule, so by default (pun ;), any default implementation of an Object method would be overwritten by the method from Object anyway. However, I see no reason why there shouldn't be an exception for methods from Object in the spec. Especially for toString it might be very useful to have a default implementation.
So, what is the reason why Java designers decided to not allow default methods overriding methods from Object?
This is yet another of those language design issues that seems "obviously a good idea" until you start digging and you realize that it's actually a bad idea.
This mail has a lot on the subject (and on other subjects too.) There were several design forces that converged to bring us to the current design:
The desire to keep the inheritance model simple;
The fact that once you look past the obvious examples (e.g., turning AbstractList into an interface), you realize that inheriting equals/hashCode/toString is strongly tied to single inheritance and state, and interfaces are multiply inherited and stateless;
That it potentially opened the door to some surprising behaviors.
You've already touched on the "keep it simple" goal; the inheritance and conflict-resolution rules are designed to be very simple (classes win over interfaces, derived interfaces win over superinterfaces, and any other conflicts are resolved by the implementing class.) Of course these rules could be tweaked to make an exception, but I think you'll find when you start pulling on that string, that the incremental complexity is not as small as you might think.
Of course, there's some degree of benefit that would justify more complexity, but in this case it's not there. The methods we're talking about here are equals, hashCode, and toString. These methods are all intrinsically about object state, and it is the class that owns the state, not the interface, who is in the best position to determine what equality means for that class (especially as the contract for equality is quite strong; see Effective Java for some surprising consequences); interface writers are just too far removed.
It's easy to pull out the AbstractList example; it would be lovely if we could get rid of AbstractList and put the behavior into the List interface. But once you move beyond this obvious example, there are not many other good examples to be found. At root, AbstractList is designed for single inheritance. But interfaces must be designed for multiple inheritance.
Further, imagine you are writing this class:
class Foo implements com.libraryA.Bar, com.libraryB.Moo {
// Implementation of Foo, that does NOT override equals
}
The Foo writer looks at the supertypes, sees no implementation of equals, and concludes that to get reference equality, all he need do is inherit equals from Object. Then, next week, the library maintainer for Bar "helpfully" adds a default equals implementation. Oops! Now the semantics of Foo have been broken by an interface in another maintenance domain "helpfully" adding a default for a common method.
Defaults are supposed to be defaults. Adding a default to an interface where there was none (anywhere in the hierarchy) should not affect the semantics of concrete implementing classes. But if defaults could "override" Object methods, that wouldn't be true.
So, while it seems like a harmless feature, it is in fact quite harmful: it adds a lot of complexity for little incremental expressivity, and it makes it far too easy for well-intentioned, harmless-looking changes to separately compiled interfaces to undermine the intended semantics of implementing classes.
It is forbidden to define default methods in interfaces for methods in java.lang.Object, since the default methods would never be "reachable".
Default interface methods can be overridden in classes implementing the interface, and the class implementation of the method has a higher precedence than the interface implementation, even if the method is implemented in a superclass. Since all classes inherit from java.lang.Object, the methods in java.lang.Object would have precedence over the default method in the interface and be invoked instead.
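The same "class wins" rule can be seen with a method that is not from Object; a small sketch (the names are made up for the example):
interface Greeter {
    default String greet() { return "hello from the interface default"; }
}

class BaseGreeter {
    public String greet() { return "hello from the superclass"; }
}

public class ClassWinsDemo extends BaseGreeter implements Greeter {
    public static void main(String[] args) {
        // Prints "hello from the superclass": the method inherited from the
        // class always takes precedence over the interface default. Object's
        // equals/hashCode/toString would win in exactly the same way.
        System.out.println(new ClassWinsDemo().greet());
    }
}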
Brian Goetz from Oracle provides a few more details on the design decision in this mailing list post.
To give a very pedantic answer, it is only forbidden to define a default method for a public method from java.lang.Object. There are 11 methods to consider, which can be categorized in three ways to answer this question.
Six of the Object methods cannot have default methods because they are final and cannot be overridden at all: getClass(), notify(), notifyAll(), wait(), wait(long), and wait(long, int).
Three of the Object methods cannot have default methods for the reasons given above by Brian Goetz: equals(Object), hashCode(), and toString().
Two of the Object methods can have default methods, though the value of such defaults is questionable at best: clone() and finalize().
public class Main {
    public static void main(String... args) {
        new FOO().clone();
        new FOO().finalize();
    }

    interface ClonerFinalizer {
        default Object clone()  { System.out.println("default clone"); return this; }
        default void finalize() { System.out.println("default finalize"); }
    }

    static class FOO implements ClonerFinalizer {
        @Override
        public Object clone() {
            return ClonerFinalizer.super.clone();
        }

        @Override
        public void finalize() {
            ClonerFinalizer.super.finalize();
        }
    }
}
I cannot see into the heads of the Java language authors, so we may only guess. But I see many reasons for this decision and agree with it completely.
The main reason for introducing default methods is to be able to add new methods to interfaces without breaking the backward compatibility of older implementations. The default methods may also be used to provide "convenience" methods without the necessity to define them in each of the implementing classes.
None of these applies to toString and other methods of Object. Simply put, default methods were designed to provide the default behavior where there is no other definition. Not to provide implementations that will "compete" with other existing implementations.
The "base class always wins" rule has its solid reasons, too. It is supposed that classes define real implementations, while interfaces define default implementations, which are somewhat weaker.
Also, introducing ANY exception to a general rule causes unnecessary complexity and raises other questions. Object is (more or less) a class like any other, so why should it have different behaviour?
All in all, the solution you propose would probably bring more cons than pros.
The reasoning is very simple: Object is the base class for all Java classes. So even if we had one of Object's methods defined as a default method in some interface, it would be useless, because Object's method will always be used. That is why, to avoid confusion, we cannot have default methods that override Object class methods.
In this post I suggested a solution that uses an interface and an anonymous class. However, there is one thing left to implement: the hashCode and equals methods.
However, I found it is hard to implement equals for an anonymous class that implements an interface. In that example the interface is Pair<L,R>, and a factory method Pairs.makePair returns an anonymous implementation of it. Suppose I added an equals implementation. Users may implement their own Pair<L,R> classes with different equals code, so the call userobj.equals(makepairobj) will enter their code, while makepairobj.equals(userobj) will enter mine. Because I have no control over their code, it is hard to make sure equals is symmetric, which is required for a good implementation.
I believe this problem is common to other cases as well, so I would like to know how this issue is generally addressed.
EDIT:
In a typical class, the implementation of equals checks the parameter type to make sure it is the same as its own. This guarantees that only the implementing code will be called to compare the objects. However, an anonymous class does not have a name and cannot check the type with instanceof. All I can do is make sure it is an instance of the implemented interface/class, which is not enough to prevent the above scenario.
You can use this.getClass() (with either == or isAssignableFrom()) to compare the types.
Edit
As in:
@Override
public boolean equals(Object obj) {
    if (obj != null && getClass() == obj.getClass()) {
        // compare the relevant fields here and return the result
    }
    return false;
}
Usually, when you make an interface like this, it requires the implementing classes to implement equals and hashCode to follow some convention. For example, if you look at the java.util.List interface, it requires lists to be equal iff they have the same length and equal elements in the same order, and it specifies a formula for calculating the hashCode based on the hash codes of the elements.
So then "it is hard to make sure equals is symmetric" should not be a problem.
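Applied to the Pair example from the question, that means the equals contract is defined by the interface, and every implementation, anonymous or not, compares against the interface type rather than its own class. Here is a sketch: the Pair and Pairs.makePair names come from the question, but the equals and hashCode bodies are my own illustration of that List-style convention.
import java.util.Objects;

interface Pair<L, R> {
    L getLeft();
    R getRight();
    // Documented contract: two Pairs are equal iff their left and right
    // components are equal, regardless of the implementing class.
}

final class Pairs {
    static <L, R> Pair<L, R> makePair(L left, R right) {
        return new Pair<L, R>() {
            @Override public L getLeft()  { return left; }
            @Override public R getRight() { return right; }

            @Override public boolean equals(Object o) {
                if (!(o instanceof Pair)) return false;   // interface type, not getClass()
                Pair<?, ?> other = (Pair<?, ?>) o;
                return Objects.equals(getLeft(), other.getLeft())
                        && Objects.equals(getRight(), other.getRight());
            }

            @Override public int hashCode() {
                return Objects.hash(getLeft(), getRight());
            }
        };
    }
}
Symmetry then holds as long as user implementations follow the same documented convention, which is exactly how different List implementations manage to stay mutually equal.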
The problem you've encountered is a sign that an anonymous class is the wrong way to implement this.
Anonymous classes are simply a shorthand way of implementing an interface or extending a class. It's purely syntactic sugar, with no extra functionality or other advantage. They were (perhaps mistakenly) intended to make your code simpler and more readable. If an anonymous class complicates your code instead, don't use it.
Many cases that used to be a good fit for anonymous classes are now better served by lambdas. If a class has two or three methods, trying to put it in an anonymous class makes your code hard to read; it should be an inner class anyway.
When writing one's own classes, is it always necessary to override equals(Object o)?
If I don't, will it automatically check that all the fields are the same? Or does it just check if the two variables point to the same object?
If one is writing a class that is going to have its objects be compared in some way, then one should override the equals and hashCode methods.
Not providing an explicit equals method will result in inheriting the behavior of the equals method from the superclass, and if the superclass is the Object class, it will be the behavior set forth in the Java API Specification for the Object class.
The general contract for providing an equals method can be found in the documentation for the Object class, specifically, the documentation of the equals and hashCode methods.
Only override equals() if it makes sense. But obviously, if you override equals() you need to ensure that the hashCode() contract isn't broken, meaning that if two objects are equal they must have the same hash code.
When does it make sense? When Object.equals() is insufficient. That method basically comes down to reference identity, meaning two objects are the same object so:
a.equals(b) iff a == b
Numbers are an obvious example of where it makes sense, because one Integer(10) should equal another Integer(10).
Another example could be when you're representing database records. Let's say you have Student records with a unique integer ID; then a sufficient implementation of equals might simply compare the ID fields, as sketched below.
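A sketch of that database-record case, assuming a Student class with a unique id field (the class and field names are just for illustration):
import java.util.Objects;

final class Student {
    private final int id;       // unique database ID
    private final String name;  // not part of equality

    Student(int id, String name) {
        this.id = id;
        this.name = name;
    }

    @Override public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Student)) return false;
        return this.id == ((Student) o).id;
    }

    @Override public int hashCode() {
        // Must be derived from the same field(s) as equals so that
        // equal objects always have equal hash codes.
        return Objects.hash(id);
    }
}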
The equals method for class Object implements the most discriminating possible equivalence relation on objects; that is, for any non-null reference values x and y, this method returns true if and only if x and y refer to the same object (x == y has the value true).
To test whether two objects are equal in the sense of equivalence (containing the same information), you must override the equals() method. You should always override the equals() method if the identity operator is not appropriate for your class.
Note that it is generally necessary to override the hashCode method whenever this method is overridden, so as to maintain the general contract for the hashCode method, which states that equal objects must have equal hash codes.
While you shouldn't rely on an IDE, Eclipse provides this canned functionality by pressing alt + shift + s and selecting the equals and hashCode menu options. There is also a toString option. Effective Java by Josh Bloch has good info on this subject. The link will take you to the chapter hosted on Google Books that discusses this topic.