Abstracting iterative object conversions - Java

I have a collection of object types that I wish to convert freely between. Let us call them A through F.
The relationship between the objects might look something like this:
A -- B -- C -- F
     |         |
     D -- E ----
What this means is that A can be converted to B but not directly to C; if you want to convert A to C, you have to convert it to B first, then from B to C.
I'm having trouble coming up with an extensible and flexible implementation of this that would allow the easy addition of new types and the conversions that go with them. Even after combing through every design pattern I could find I'm no closer to coming up with an answer.
Initially, I had one Converter class that looked something like this
public class Converter {
    public B convA2B(A in) {...}
    public A convB2A(B in) {...}
    public C convB2C(B in) {...}
    // etc.
}
This proved to be unwieldy as I tried to add more types. Next, I tried having multiple converter objects, all extending the same abstract class:
public abstract class Converter {
    final static int typeA = 0;
    final static int typeB = 1;
    final static int typeC = 2;

    int type;

    public abstract A toA(Object in);
    public abstract B toB(Object in);
    public abstract Object fromA(A in);
    public abstract Object fromB(B in);
    ...

    public Object convertTo(int type, Object data) {
        switch (type) {
            case 0: return new A().toA(data);
            // etc. etc.
        }
    }
}
Essentially what would happen is each converter would convert the data to the next object type in the path, before passing that data on to the next converter.
i.e., if I wanted to convert from A to C, A.toC(x) would call B.toC(A.toB(x)).
This didn't work because each converter type needed some basic understanding of the relationships between all the types in order to know which converter to call next, which meant adding new converters became quite difficult in places and could even lead to infinite loops if handled poorly.
What should I do? Many of the design patterns I read about seem to be close to what I'm looking for, like mediator, chain of responsibility, interpreter, but I'm not certain how to adapt them to do what I want.

Interesting problem. This is what I came up with:
A base abstract Model class from which A, B, C... will extend:
public abstract class Model {
protected Set<Class<? extends Model>> targets;
public abstract Set<Class<? extends Model>> targets ();
public abstract <T extends Model> T convert (Class<T> target);
}
The B class, for example (since it has the most connections):
public class B extends Model {
public B () {
}
@Override
public Set<Class<? extends Model>> targets () {
if (targets == null) {
targets = new HashSet<> ();
targets.add (A.class);
targets.add (C.class);
targets.add (D.class);
}
return targets;
}
@SuppressWarnings ("unchecked")
@Override
public <T extends Model> T convert (Class<T> target) {
if (target == A.class) {
return (T)toA ();
}
if (target == C.class) {
return (T)toC ();
}
return (T)toD ();
}
private A toA () {
A a = new A ();
// your conversion code
return a;
}
private C toC () {
C c = new C ();
// your conversion code
return c;
}
private D toD () {
D d = new D ();
// your conversion code
return d;
}
}
And your converter class:
public class Converter {
public Converter () {
}
public <S extends Model, T extends Model> T run (S source, Class<T> target) throws Exception {
if (!source.targets ().contains (target)) {
throw new Exception ("Inconvertible types.");
}
return source.convert (target);
}
}
And finally, in your code, what you'll do is:
B b = new B ();
Converter converter = new Converter ();
try {
C c = converter.run (b, C.class);
F f = converter.run (c, F.class);
E e = converter.run (f, E.class);
} catch (Exception e) {
e.printStackTrace ();
}

This solution is the same thing as your first idea, but it neatly sidesteps the unwieldiness of the combinatorial explosion by treating Converters as objects that can be combined.
Define interface Converter<A, B> as a converter from A to B. This is going to end up the same as Function<A, B>, but we are attaching some extra semantics to it, so we may as well not confuse them:
@FunctionalInterface
interface Converter<A, B> {
B convert(A a);
}
Define a bunch of Converters. For each class, only define Converters for their direct neighbors (Converter<A, B>, but no Converter<A, C>).
Converter<A, B> aToB = a -> { ... };
Converter<B, C> bToC = b -> { ... };
Converter<B, D> bToD = b -> { ... };
// etc.
Now, to get from A to D, we need to combine Converters somehow. This is easily done by adding to Converter:
@FunctionalInterface
interface Converter<A, B> {
B convert(A a);
default <C> Converter<C, B> after(Converter<? super C, ? extends A> pre) {
return c -> this.convert(pre.convert(c));
}
default <C> Converter<A, C> then(Converter<? super B, ? extends C> post) {
return a -> post.convert(this.convert(a));
}
}
Now, you can write
// For a mathematician
Converter<A, C> aToC = bToC.after(aToB);
// If you aren't a mathematician
Converter<A, C> aToC = aToB.then(bToC);
You shouldn't store these compositions in a static somewhere, because then you'll get a hard-to-manage combinatorial explosion, one for every path in your graph. Instead, the code simply creates them as-needed with after and then. Adding a new type involves adding new Converters for its immediate neighbors in the graph.
If you don't feel like using Converter, you can use java.util.function.Function, which is the same thing (but convert is apply, after is compose, and then is andThen). The reason I wrote a new interface is that Converter is semantically different from Function. A Converter is a part of your specific universe of types; a Function is a function between any two types. Although they have the same code, they mean different things (you could say that Converter<A, B> extends Function<A, B> though).
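For comparison, the same composition can be expressed directly with java.util.function.Function; the tiny A/B/C classes below and the increment/double conversion bodies are placeholders of my own, not the asker's real conversions:

```java
import java.util.function.Function;

public class FunctionCompositionDemo {
    // Minimal stand-ins for the example's types; the conversion logic
    // (increment, double) is purely illustrative.
    static class A { final int v; A(int v) { this.v = v; } }
    static class B { final int v; B(int v) { this.v = v; } }
    static class C { final int v; C(int v) { this.v = v; } }

    static final Function<A, B> aToB = a -> new B(a.v + 1);
    static final Function<B, C> bToC = b -> new C(b.v * 2);

    // andThen corresponds to then, compose corresponds to after;
    // both build the same A -> C conversion.
    static final Function<A, C> viaThen = aToB.andThen(bToC);
    static final Function<A, C> viaAfter = bToC.compose(aToB);

    public static void main(String[] args) {
        System.out.println(viaThen.apply(new A(1)).v);   // (1 + 1) * 2 = 4
        System.out.println(viaAfter.apply(new A(1)).v);  // 4
    }
}
```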

Related

How is it possible that a method with generic return type with a bound can be assigned to a variable outside of that bound?

Suppose I have the following structure:
public interface A {
}
public interface B {
}
public interface B1 extends B {
}
public interface B2 extends B {
}
public class C implements A, B1 {
private final String s;
public C(final String s) {
this.s = s;
}
}
public class D implements A, B2 {
private final Exception e;
public D(final Exception e) {
this.e = e;
}
}
public class SomeClass<T> {
private final T t;
private final Exception e;
public SomeClass(final T t, final Exception e) {
this.t = t;
this.e = e;
}
public <U extends B> U transform(final java.util.function.Function<T, ? extends U> mapper1, final java.util.function.Function<Exception, ? extends U> mapper2) {
return t == null ? mapper2.apply(e) : mapper1.apply(t);
}
}
When now we do the following in another class:
public class AnotherClass {
public static void main(final String[] args) {
SomeClass<String> someClass = new SomeClass<>("Hello World!", null);
// this line is what is bothering me
A mappedResult = someClass.transform(C::new, D::new);
}
}
The code compiles without any problems. Why does the code compile? How is it possible that the type of 'mappedResult' can be A, even though the generic U in the method is declared to be a subtype of B?
Ok, so based on the comments on the question and some discussion with other people, there was a major point that I missed that might need addressing and that actually explains the answer given in the comments.
It's clear that the following compiles:
Object mappedResult = someClass.transform(C::new, D::new);
And yet Object is not a subtype of B, of course. The bound will ensure that the types of C and D (in this case) are subtypes of B, but they can be other types as well, thanks to the other interfaces both C and D implement. The compiler will check what types they are and look at the most specific type(s) they have in common. In this case, that is both A and B, so the type is derived to be A & B. Therefore, assigning this result to A is possible, because the compiler derives that the result is an A as well.
The bound does provide some restrictions regarding the input, but not regarding the output and not regarding to the types of variables to which you can assign the result. That is what I was confused about before.
Another way to see this is the following: if the method had been defined as follows:
public <U> U transform(final java.util.function.Function<T, ? extends U> mapper1, final java.util.function.Function<Exception, ? extends U> mapper2) {
return t == null ? mapper2.apply(e) : mapper1.apply(t);
}
then the result can still be assigned to an A or a B when calling it as before. The bound had no influence on that. All it ensures here is that both mapper functions need to map to a result that is a subtype of U. With the bound, that becomes a subtype of U, which is itself a subtype of B. But the fact that the result is a subtype of A doesn't change the fact that it is also a subtype of B. Therefore, the result can be assigned to either type.
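A compact sketch of this inference at work, reproducing the question's declarations so it is self-contained; all three assignments below compile from the same call:

```java
import java.util.function.Function;

public class InferenceDemo {
    // The question's type hierarchy, reproduced for the sketch.
    interface A {}
    interface B {}
    interface B1 extends B {}
    interface B2 extends B {}
    static class C implements A, B1 { C(String s) {} }
    static class D implements A, B2 { D(Exception e) {} }

    static class SomeClass<T> {
        private final T t;
        private final Exception e;
        SomeClass(T t, Exception e) { this.t = t; this.e = e; }
        <U extends B> U transform(Function<T, ? extends U> mapper1,
                                  Function<Exception, ? extends U> mapper2) {
            return t == null ? mapper2.apply(e) : mapper1.apply(t);
        }
    }

    public static void main(String[] args) {
        SomeClass<String> someClass = new SomeClass<>("Hello World!", null);
        // U is inferred so the result type fits each target while still
        // satisfying U extends B (C and D have both A and B in common).
        A asA = someClass.transform(C::new, D::new);
        B asB = someClass.transform(C::new, D::new);
        Object asObject = someClass.transform(C::new, D::new);
        System.out.println(asA instanceof C);  // true: t was non-null
    }
}
```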

How to use Constructors/Casting with Generic Types

I have a parent class, Parent, with two child classes, A and B. I have another class, Wrapper<Type1,Type2>, that contains an interface, Function<Type1,Type2>, which is supposed to transform an A into a B or a B into an A.
If I define
new Wrapper<A,B>(new Function<A,B>(){public B transform(A a){return new B(a);}});
outside of the Wrapper class, then this works fine.
I run into the problem that I can't instantiate a generic type when I want to define a default Function for the default constructor public Wrapper() within the Wrapper<Type1,Type2> class itself.
Eclipse recommends casting from Type1 to Type2, but the problem is that A can't be cast to B because they are sibling classes. I do have constructors for Parent(Parent), A(B), and B(A), so it would be great if I could implement a generic constructor somehow. How can I work around this?
public class Parent {
protected int value = 0;
public void setValue(int x){ value = x; }
public int getValue(){ return value; }
public Parent(){}
public Parent(A a){setValue(a.getValue());}
public Parent(B b){setValue(b.getValue());}
public Parent(Parent p){setValue(p.getValue());}
}
public class A extends Parent{
public A(){ setValue(1); }
public A(B b){ setValue( b.getValue()); }
}
public class B extends Parent{
public B(){ setValue(2); }
public B(A a){ setValue(a.getValue()); }
}
public interface Function <Type1 extends Parent, Type2 extends Parent> {
public Type2 transform(Type1 t);
}
public class Wrapper<Type1 extends Parent, Type2 extends Parent> {
Function<Type1,Type2> function;
public Wrapper(Function<Type1,Type2> x){ function = x; }
public Wrapper(){
function = new Function<Type1,Type2>(){
public Type2 transform(Type1 t){
///I want to use constructor Type2(t), given that they both extend Parent
//return new Type2( t);
return (Type2) t; ///causes an error because can't cast from A to B
}
};
}
public Type2 transform(Type1 t){
return function.transform(t);
}
}
public class Main {
public static void main(String[] args){
///Start with custom function. This part works.
Wrapper<A,B> wrapper = new Wrapper<A,B>(
new Function<A,B>(){
public B transform(A a){
///Want to use constructor B(a)
///Can't cast A to B
return new B(a);
}
}
);
A a = new A();
B b = wrapper.transform(a);
///This part works
System.out.println(b.getValue());
///Next try the default Function
wrapper = new Wrapper<A,B>();
b = wrapper.transform(a); ///This part causes the error, as wrapper attempts to cast from A to B
System.out.println(b.getValue());
}
}
Edit:
My question is unique in scope and implementation from the suggested duplicate. E.g., the structure of my code is a simple parent with two sibling child classes. The structure in the possible duplicate is more intricate, involving multiple generations and child classes that are disheveled in a confusing way. I'm not sure what that code is attempting to do, and the answer didn't help me understand my own question in the slightest as it seemed particular to the distinct structure of the other question.
There's no way to make a "generic" constructor. The solution closest to your current implementation is to instantiate objects in your function. As this is in any case the responsibility of the caller (in your design), that's easy:
Wrapper<A, B> wrapper = new Wrapper<A, B>((a) -> new B(a));
But where the default Wrapper() constructor would be called, you can instead have the caller pass Class objects for Type1 and Type2:
public Wrapper(Class<Type1> type1Class, Class<Type2> type2Class) {
this.function = (object1) -> {
try {
return type2Class.getConstructor(type1Class).newInstance(object1);
} catch (Exception e) {
throw new RuntimeException(e);
}
};
}
With both of the above, your main method will look like the following:
public static void main(String... args) {
Wrapper<A, B> wrapper = new Wrapper<A, B>((a) -> new B(a));
A a = new A();
B b = wrapper.transform(a);
System.out.println(b.getValue());
wrapper = new Wrapper<A, B>(A.class, B.class);
b = wrapper.transform(a);
System.out.println(b.getValue());
}
And this runs without any type cast errors.
The Java 1.7 versions of the above lambda expressions:
Wrapper<A, B> wrapper = new Wrapper<A, B>(new Function<A, B>() {
@Override
public B transform(A a) {
return new B(a);
}
});
And:
this.function = new Function<Type1, Type2>() {
@Override
public Type2 transform(Type1 object1) {
try {
return type2Class.getConstructor(type1Class).newInstance(object1);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
};
Since it's not possible to create an instance using a generic type parameter, we must work around it. I'll restrict the answer to Java 7, which I gather from the comments you're working with. Here is my suggestion:
public interface Transform<P extends Parent> {
P with(int value);
}
public static void main(String[] args) {
Transform<B> transformToB = new Transform<B>() {
@Override
public B with(int value) {
return new B(value);
}
};
A a = new A();
B b = transformToB.with(a.getValue());
System.out.println(b.getValue());
}
How does it work?
We have an interface Transform<P extends Parent> defining the method with. This method has one parameter, corresponding to the only field of the classes you defined. Using this value, the method has to return an instance of some P which extends Parent. Looking at the implementation of transformToB, it creates a B instance by calling the (newly added) constructor B(int value).
Why another constructor?
Declaring constructors like B(A a) or A(B b) results in a circular dependency between these classes; A and B aren't loosely coupled. By declaring a constructor which takes only a value, we initialize the state without having to know where this value comes from.
Likewise, declaring constructors like Parent(A a) and Parent(B b) introduces dependencies in Parent on its subclasses. Following this approach, Parent would need to provide a constructor for each subclass.
Possible extension:
If value is just one example among many other fields, we do not want to define a constructor like A(int value1, int value2, String value3, ...) with many parameters. Instead, we could use the default constructor A() and do the transformation like this:
interface Transform<From extends Parent, To extends Parent> {
To from(From f);
}
public static void main(String[] args) {
Transform<A, B> transformToB = new Transform<A, B>() {
@Override
public B from(A a) {
B b = new B();
b.setValue(a.getValue());
b.setValue2(a.getValue2());
b.setValue3(a.getValue3());
return b;
}
};
A a = new A();
B b = transformToB.from(a);
System.out.println(b.getValue());
}
This last approach is also applicable if A and B have different fields. In case B has a field String value4 we could add an additional line to transformToB like b.setValue4(a.getValue3()+"#"+a.getValue2());.

Polymorphic method return type down-casting in java

So I don't know if this is possible; I've tried searching, but maybe my search terms are off. Basically I'm wondering: is there a way to create a generic function/method in a super class that returns the downcast object?
class A {
public <downcasted type (in this example B if called from a B instance)> test() {
return this;
}
}
class B extends A { }
B b = new B().test();
basically having test() return the B instance as type B, even though the function/method is declared purely in the parent class?
I know I can cast the variable, though having many functions, some of which may return Lists of the class type, etc., becomes troublesome. I also realize I could @Override the function in B and do a "return (B) super.test()" thing, but again, wrapping many functions is tedious and makes updating the base class's code more painful.
I also know you can do
"class A<T extends A>"
and then define B as
"class B extends A<B>"
but then if you want to make a "C" that extends "B" it breaks.
So is this type of behavior possible? If so, what is it called and how do I implement it?
An example as to where this behavior could be useful would be any base data structures you want to make extendable like an N-Ary Tree that you extend into oct/quad tree structure and/or an extended class that adds a "Name" and "Attributes" or something for a xml-like node.
Edit:
This seems to work (as far as the linter is concerned); it's a bit more work to implement the base methods, but it's got the desired end result as far as I can tell. That said, when I attempt to run it, it gives me a "cannot find symbol: class type" error. :S
static class D extends auto {
final Class type = getClass();
@SuppressWarnings("unchecked")
public <T extends type> T test() {
return (T)type.cast(this);
}
}
static class E extends D { }
static class F extends E { }
static {
D d = new D().test();
E e = new E().test();
F f = new F().test();
}
Update
There is a simpler way, which seems to work:
class Alpha {
@SuppressWarnings("unchecked")
<T extends Alpha> T test() {
return (T) this;
}
}
class B extends Alpha { }
However, that does not support method chaining.
Original post
You need test() to return a subtype of A, rather than A itself. In order to do this, the signature of the A class could be this:
class A<T extends A<?>> {
@SuppressWarnings("unchecked")
public T test() {
return (T) this;
}
}
If you create a class B extending A, you will need B.test() to return an instance of B, without needing to override test() returning a specific type. You could then do something like this:
class B<T extends B<?>> extends A<T> { }
Now T is a subclass of B, and because test()'s return type is T, it will return a B instance. Further subclassing can be done in the same way:
class C<T extends C<?>> extends B<T> { }
And statements like this will work:
C<?> c = new C<>().test();

Java tagged union / sum types

Is there any way to define a sum type in Java? Java seems to naturally support product types directly, and I thought enums might allow it to support sum types, and inheritance looks like maybe it could do it, but there is at least one case I can't resolve.
To elaborate, a sum type is a type which can have exactly one of a set of different types, like a tagged union in C.
In my case, I'm trying to implement haskell's Either type in Java:
data Either a b = Left a | Right b
but at the base level I'm having to implement it as a product type, and just ignore one of its fields:
public class Either<L,R>
{
private L left = null;
private R right = null;
public static <L,R> Either<L,R> right(R right)
{
return new Either<>(null, right);
}
public static <L,R> Either<L,R> left(L left)
{
return new Either<>(left, null);
}
private Either(L left, R right) throws IllegalArgumentException
{
this.left = left;
this.right = right;
if (left != null && right != null)
{
throw new IllegalArgumentException("An Either cannot be created with two values");
}
if (left == right)
{
throw new IllegalArgumentException("An Either cannot be created without a value");
}
}
.
.
.
}
I tried implementing this with inheritance, but I have to use a wildcard type parameter, or equivalent, which Java generics won't allow:
public class Left<L> extends Either<L,?>
I haven't used Java's Enums much, but while they seem the next best candidate, I'm not hopeful.
At this point, I think this might only be possible by type-casting Object values, which I would hope to avoid entirely, unless there's a way to do it once, safely, and be able to use that for all sum types.
Make Either an abstract class with no fields and only one constructor (private, no-args, empty) and nest your "data constructors" (left and right static factory methods) inside the class so that they can see the private constructor but nothing else can, effectively sealing the type.
Use an abstract method either to simulate exhaustive pattern matching, overriding appropriately in the concrete types returned by the static factory methods. Implement convenience methods (like fromLeft, fromRight, bimap, first, second) in terms of either.
import java.util.Optional;
import java.util.function.Function;
public abstract class Either<A, B> {
private Either() {}
public abstract <C> C either(Function<? super A, ? extends C> left,
Function<? super B, ? extends C> right);
public static <A, B> Either<A, B> left(A value) {
return new Either<A, B>() {
@Override
public <C> C either(Function<? super A, ? extends C> left,
Function<? super B, ? extends C> right) {
return left.apply(value);
}
};
}
public static <A, B> Either<A, B> right(B value) {
return new Either<A, B>() {
@Override
public <C> C either(Function<? super A, ? extends C> left,
Function<? super B, ? extends C> right) {
return right.apply(value);
}
};
}
public Optional<A> fromLeft() {
return this.either(Optional::of, value -> Optional.empty());
}
}
Pleasant and safe! No way to screw it up. Because the type is effectively sealed, you can rest assured that there will only ever be two cases, and every operation ultimately must be defined in terms of the either method, which forces the caller to handle both of those cases.
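For completeness, here is a sketch of how the fromRight and bimap mentioned above might be written, again purely in terms of either; the answer's class is reproduced so the sketch is self-contained, and the bimap shape is my own guess at what was intended:

```java
import java.util.Optional;
import java.util.function.Function;

abstract class Either<A, B> {
    private Either() {}

    public abstract <C> C either(Function<? super A, ? extends C> left,
                                 Function<? super B, ? extends C> right);

    public static <A, B> Either<A, B> left(A value) {
        return new Either<A, B>() {
            @Override
            public <C> C either(Function<? super A, ? extends C> left,
                                Function<? super B, ? extends C> right) {
                return left.apply(value);
            }
        };
    }

    public static <A, B> Either<A, B> right(B value) {
        return new Either<A, B>() {
            @Override
            public <C> C either(Function<? super A, ? extends C> left,
                                Function<? super B, ? extends C> right) {
                return right.apply(value);
            }
        };
    }

    // Mirror image of the answer's fromLeft.
    public Optional<B> fromRight() {
        return this.either(value -> Optional.empty(), Optional::of);
    }

    // Map over whichever side is present, rebuilding via the factory methods.
    public <C, D> Either<C, D> bimap(Function<? super A, ? extends C> f,
                                     Function<? super B, ? extends D> g) {
        return this.either(a -> Either.<C, D>left(f.apply(a)),
                           b -> Either.<C, D>right(g.apply(b)));
    }
}
```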
Regarding the problem you had trying to do class Left<L> extends Either<L,?>, consider the signature <A, B> Either<A, B> left(A value). The type parameter B doesn't appear in the parameter list. So, given a value of some type A, you can get an Either<A, B> for any type B.
A standard way of encoding sum types is Boehm–Berarducci encoding (often referred to by the name of its cousin, Church encoding) which represents an algebraic data type as its eliminator, i.e., a function that does pattern-matching. In Haskell:
left :: a -> (a -> r) -> (b -> r) -> r
left x l _ = l x
right :: b -> (a -> r) -> (b -> r) -> r
right x _ r = r x
match :: (a -> r) -> (b -> r) -> ((a -> r) -> (b -> r) -> r) -> r
match l r k = k l r
-- Or, with a type synonym for convenience:
type Either a b r = (a -> r) -> (b -> r) -> r
left :: a -> Either a b r
right :: b -> Either a b r
match :: (a -> r) -> (b -> r) -> Either a b r -> r
In Java this would look like a visitor:
public interface Either<A, B> {
<R> R match(Function<A, R> left, Function<B, R> right);
}
public final class Left<A, B> implements Either<A, B> {
private final A value;
public Left(A value) {
this.value = value;
}
public <R> R match(Function<A, R> left, Function<B, R> right) {
return left.apply(value);
}
}
public final class Right<A, B> implements Either<A, B> {
private final B value;
public Right(B value) {
this.value = value;
}
public <R> R match(Function<A, R> left, Function<B, R> right) {
return right.apply(value);
}
}
Example usage:
Either<Integer, String> result = new Left<Integer, String>(42);
String message = result.match(
errorCode -> "Error: " + errorCode.toString(),
successMessage -> successMessage);
For convenience, you can make a factory for creating Left and Right values without having to mention the type parameters each time; you can also add a version of match that accepts Consumer<A> left, Consumer<B> right instead of Function<A, R> left, Function<B, R> right if you want the option of pattern-matching without producing a result.
Alright, so the inheritance solution is definitely the most promising. The thing we would like to do is class Left<L> extends Either<L, ?>, which we unfortunately cannot do because of Java's generics rules. However, if we make the concession that the type of a Left or Right must encode the "alternate" possibility, we can do this:
public class Left<L, R> extends Either<L, R>
Now, we would like to be able to convert Left<Integer, A> to Left<Integer, B>, since it doesn't actually use that second type parameter. We can define a method to do this conversion internally, thus encoding that freedom into the type system.
public <R1> Left<L, R1> phantom() {
return new Left<L, R1>(contents);
}
Complete example:
public class EitherTest {
public abstract static class Either<L, R> {}
public static class Left<L, R> extends Either<L, R> {
private L contents;
public Left(L x) {
contents = x;
}
public <R1> Left<L, R1> phantom() {
return new Left<L, R1>(contents);
}
}
public static class Right<L, R> extends Either<L, R> {
private R contents;
public Right(R x) {
contents = x;
}
public <L1> Right<L1, R> phantom() {
return new Right<L1, R>(contents);
}
}
}
Of course, you'll want to add some functions for actually accessing the contents, and for checking whether a value is Left or Right so you don't have to sprinkle instanceof and explicit casts everywhere, but this should be enough to get started, at the very least.
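As a starting point, the accessors alluded to might look like this (a sketch of my own; the Optional-returning API is one choice among several):

```java
import java.util.Optional;

public class EitherAccessorsDemo {
    public abstract static class Either<L, R> {
        public boolean isLeft()  { return this instanceof Left; }
        public boolean isRight() { return this instanceof Right; }

        // Optional-returning accessors keep instanceof checks and casts
        // in one place instead of sprinkled across call sites.
        @SuppressWarnings("unchecked")
        public Optional<L> fromLeft() {
            return isLeft() ? Optional.of(((Left<L, R>) this).contents)
                            : Optional.empty();
        }

        @SuppressWarnings("unchecked")
        public Optional<R> fromRight() {
            return isRight() ? Optional.of(((Right<L, R>) this).contents)
                             : Optional.empty();
        }
    }

    public static class Left<L, R> extends Either<L, R> {
        private final L contents;
        public Left(L x) { contents = x; }
        public <R1> Left<L, R1> phantom() { return new Left<L, R1>(contents); }
    }

    public static class Right<L, R> extends Either<L, R> {
        private final R contents;
        public Right(R x) { contents = x; }
        public <L1> Right<L1, R> phantom() { return new Right<L1, R>(contents); }
    }

    public static void main(String[] args) {
        Either<String, Integer> e = new Left<>("boom");
        System.out.println(e.isLeft());                // true
        System.out.println(e.fromLeft().orElse("?"));  // boom
    }
}
```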
Inheritance can be used to emulate sum types (Disjoint unions), but there are a few issues you need to deal with:
You need to take care to keep others from adding new cases to your type. This is especially important if you want to exhaustively handle every case you might encounter. It's possible with a non-final super class, and package-private constructor.
The lack of pattern matching makes it quite difficult to consume a value of this type. If you want a compiler-checked way to guarantee that you've exhaustively handled all cases, you need to implement a match function yourself.
You're forced into one of two styles of API, neither of which is ideal:
All cases implement a common API, throwing errors on the API they don't support themselves. Consider Optional.get(). Ideally, this method would only be available on a disjoint type whose value is known to be some rather than none. But there's no way to do that, so it's an instance member of the general Optional type. It throws NoSuchElementException if you call it on an optional whose "case" is "none".
Each case has a unique API that tells you exactly what it's capable of, but that requires a manual type check and cast every time you wish to call one of these subclass-specific methods.
Changing "cases" requires new object allocation (and adds pressure on the GC if done often).
TL;DR: Functional programming in Java is not a pleasant experience.
Let me suggest a very different solution, that does not make use of inheritance / abstract classes / interfaces. On the downside, it requires some effort for each new "sum type" defined. However, I think it has many advantages : it is safe, it only uses basic concepts, it feels natural to use, and allows for more than 2 "subtypes".
Here is a proof of concept for binary trees because it's more practical than "Either", but you can just use the comments as the guideline to build your own sum type.
public class Tree {
// 1) Create an enum listing all "subtypes" (there may be more than 2)
enum Type { Node, Leaf }
// 2) Create a static class for each subtype (with the same name for clarity)
public static class Node {
Tree l,r;
public Node(Tree l, Tree r) {
this.l = l;
this.r = r;
}
}
public static class Leaf {
int label;
public Leaf(int label) {
this.label = label;
}
}
// 3) Each instance must have:
// One variable to indicate which subtype it corresponds to
Type type;
// One variable for each of the subtypes (only one will be different from null)
Leaf leaf;
Node node;
// 4) Create one constructor for each subtype (it could even be private)
public Tree(Node node) {
this.type = Type.Node;
this.node = node;
}
// 5) Create one "factory" method for each subtype (not mandatory but quite convenient)
public static Tree newNode(Tree l, Tree r) {
return new Tree(new Node(l,r));
}
public Tree(Leaf leaf) {
this.type = Type.Leaf;
this.leaf = leaf;
}
public static Tree newLeaf(int label) {
return new Tree(new Leaf(label));
}
// 6) Create a generic "matching" function with one argument for each subtype
// (the constructors ensure that no "null pointer exception" can be raised)
public <T> T match(Function<Node,T> matchNode, Function<Leaf,T> matchLeaf) {
switch (type) {
case Node:
return matchNode.apply(node);
case Leaf:
return matchLeaf.apply(leaf);
}
return null;
}
// 7) Have fun !
// Note that matchings are quite natural to write.
public int size() {
return match(
node -> 1 + node.l.size() + node.r.size(),
leaf -> 1
);
}
public String toString() {
return match(
node -> {
String sl = node.l.toString();
String sr = node.r.toString();
return "Node { "+sl+" , "+sr+" }";
},
leaf -> "Leaf: "+leaf.label
);
}
public static void main(String [] args) {
Tree node1 = Tree.newNode(Tree.newLeaf(1),Tree.newLeaf(2));
Tree node2 = Tree.newNode(node1,Tree.newLeaf(3));
System.out.println(node2.size());
System.out.println(node2);
}
}
Feel free to express criticism, I'm genuinely interested in this topic and will be happy to learn more.
How about
import java.util.Optional;
public interface Either<L, R> {
default Optional<L> left() { return Optional.empty();}
default Optional<R> right() { return Optional.empty();}
static <L, R> Either<L, R> fromLeft(L left) {
return new Either<L, R>() {
@Override public Optional<L> left() { return Optional.of(left); }
};
}
static <L, R> Either<L, R> fromRight(R right) {
return new Either<L, R>() {
@Override public Optional<R> right() { return Optional.of(right); }
};
}
}
The difference to other solutions proposed here is not deep, but stylistic.
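A usage sketch (the answer's interface is reproduced so the example is self-contained):

```java
import java.util.Optional;

public class OptionalEitherDemo {
    interface Either<L, R> {
        default Optional<L> left() { return Optional.empty(); }
        default Optional<R> right() { return Optional.empty(); }

        static <L, R> Either<L, R> fromLeft(L left) {
            return new Either<L, R>() {
                @Override public Optional<L> left() { return Optional.of(left); }
            };
        }

        static <L, R> Either<L, R> fromRight(R right) {
            return new Either<L, R>() {
                @Override public Optional<R> right() { return Optional.of(right); }
            };
        }
    }

    public static void main(String[] args) {
        Either<String, Integer> err = Either.fromLeft("oops");
        Either<String, Integer> ok = Either.fromRight(42);
        // The side that was not constructed stays Optional.empty().
        System.out.println(err.left().orElse("?"));  // oops
        System.out.println(ok.right().orElse(-1));   // 42
    }
}
```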

Java abstracting the use of instanceof for collecting classes from List

Suppose I have the following hierarchy of classes:
public class MainClass {
}
class A extends MainClass {
}
class B extends MainClass {
}
class C extends MainClass {
}
Now suppose I have a List<MainClass> classes which looks like:
{A, MainClass, A, B, B, MainClass, C, A, C, B, A}
I want to be able to pick out sublists of objects by their class. For example, I would like to extract only those objects in the list whose class is exactly A (but not MainClass). As such, using isAssignableFrom(A.class) will not work for me.
My current method looks like:
public <T extends MainClass> List<T> getClasses(List<MainClass> classes, Class classToCollect) {
List<T> subclasses = new ArrayList<T>();
for (MainClass clazz : classes) {
if (clazz.getClass().isInstance(classToCollect)) {
subclasses.add((T)clazz);
}
}
return subclasses;
}
This still doesn't work and passes back an empty list. What gives here?
The condition should look like this:
for (MainClass obj : classes) {
if (classToCollect.isInstance(obj)) {
subclasses.add((T)obj);
}
}
The name clazz is misleading, because it is actually an object.
You can further improve type safety of your code by using Class<T> in the method header:
public <T extends MainClass> List<T> getClasses(List<MainClass> classes, Class<T> classToCollect) {
...
}
Demo on ideone.
Note: This would not work if you pass MainClass.class as the second argument (Thanks, JB Nizet, for a great comment).
Using Java 8 you could express it this way:
public <T> List<T> getClasses(List<? super T> instances, Class<T> classToCollect) {
return instances.stream()
.filter(c -> c.getClass() == classToCollect)
.map(c -> (T) c)
.collect(Collectors.toList());
}
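A self-contained sketch of that method in action, with a hypothetical MainClass/A/B hierarchy; note the exact-class comparison excludes MainClass instances when collecting A:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FilterDemo {
    static class MainClass {}
    static class A extends MainClass {}
    static class B extends MainClass {}

    // Exact-class filter, as in the answer: an A matches A.class,
    // but a plain MainClass does not.
    @SuppressWarnings("unchecked")
    static <T> List<T> getClasses(List<? super T> instances, Class<T> classToCollect) {
        return instances.stream()
                .filter(c -> c.getClass() == classToCollect)
                .map(c -> (T) c)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<MainClass> items =
                Arrays.asList(new A(), new MainClass(), new B(), new A());
        System.out.println(getClasses(items, A.class).size());  // 2
    }
}
```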
