Let's say I have a method that applies multiple functions to a value.
Example usage:
String value = "a string with numb3r5";
Function<String, List<String>> fn1 = ...
Function<List<String>, String> fn2 = ...
Function<String, List<Integer>> fn3 = ...
InputConverter<String> converter = new InputConverter<>(value);
List<Integer> ints = converter.convertBy(fn1, fn2, fn3);
Is it possible to make it apply multiple functions with various inputs and return values?
I've tried using wildcards, but this doesn't work.
public class InputConverter<T> {
private final T src;
public InputConverter(T src) {
this.src = src;
}
public <R> R convertBy(Function<?, ?>... functions) {
R value = (R) src;
for (Function<?, ?> function : functions)
value = (R) function.apply(value); // compile error: apply(capture of ?) cannot accept this argument
return value;
}
}
You can build a chain of Functions using andThen, like the following:
Function<String, List<Integer>> functionChain = fn1.andThen(fn2).andThen(fn3);
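For instance, with hypothetical stand-ins for fn1, fn2, and fn3 (the question elides their bodies), the chain type-checks end to end:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ChainDemo {
    // Compose the three hypothetical functions into one String -> List<Integer> pipeline.
    static List<Integer> run(String input) {
        Function<String, List<String>> fn1 = s -> Arrays.asList(s.split(" "));
        Function<List<String>, String> fn2 = l -> String.join("", l);
        Function<String, List<Integer>> fn3 =
                s -> s.chars().boxed().collect(Collectors.toList());

        // andThen keeps every intermediate type checked at compile time.
        Function<String, List<Integer>> chain = fn1.andThen(fn2).andThen(fn3);
        return chain.apply(input);
    }

    public static void main(String[] args) {
        System.out.println(run("ab cd")); // four character codes survive the round trip
    }
}
```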
You can achieve nearly the same thing by using raw types
@SuppressWarnings({"unchecked", "rawtypes"})
public <R> R convertBy(Function... functions) {
Function functionsChain = Function.identity();
for (Function function : functions) {
functionsChain = functionsChain.andThen(function);
}
return (R) functionsChain.apply(src);
}
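A minimal sketch of that raw-typed convertBy in context, again with hypothetical fn1/fn2/fn3 implementations (not from the question), might look like this:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class RawChainDemo {
    static class InputConverter<T> {
        private final T src;
        InputConverter(T src) { this.src = src; }

        // Raw types silence the generics checker; correctness is the caller's job.
        @SuppressWarnings({"unchecked", "rawtypes"})
        public <R> R convertBy(Function... functions) {
            Function chain = Function.identity();
            for (Function f : functions) {
                chain = chain.andThen(f);
            }
            return (R) chain.apply(src);
        }
    }

    static int demo() {
        // Hypothetical functions matching the question's signatures.
        Function<String, List<String>> fn1 = s -> Arrays.asList(s.split(" "));
        Function<List<String>, String> fn2 = l -> String.join("", l);
        Function<String, List<Integer>> fn3 =
                s -> s.chars().boxed().collect(Collectors.toList());

        InputConverter<String> converter = new InputConverter<>("a string with numb3r5");
        List<Integer> ints = converter.convertBy(fn1, fn2, fn3);
        return ints.size(); // one element per non-space character
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

Note that a mismatched function order fails only at runtime with a ClassCastException, which is the price of the raw types.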
Otherwise, the only other option I see is to use the same pattern as Optional or Stream, as suggested in the comments:
List<Integer> ints = converter.convertBy(fn1)
.convertBy(fn2)
.convertBy(fn3)
.get();
// with this implementation
public static class InputConverter<T> {
private final T src;
public InputConverter(T src) {
this.src = src;
}
public <R> InputConverter<R> convertBy(Function<T, R> function) {
return new InputConverter<>(function.apply(src));
}
public T get() {
return src;
}
}
I am trying to create a two dimensional list in java.
My first and preferred method is as so:
List<List<Integer>> seqList = IntStream.range(0, n)
.mapToObj(ArrayList<Integer>::new)
.collect(Collectors.toList());
However, for some reason this method takes too long and I get a timeout.
On the other hand, when I tried to create the two dimensional list using java 7 like so, there was no timeout.
List<List<Integer>> seqList = new ArrayList<>();
for(int i = 0; i < n; i++) {
seqList.add(new ArrayList<>());
}
I am trying to use Java 8 streams as much as possible. Could someone explain to me why my Java 8 code is taking too long, and what I can do to make it run in the same time complexity as the Java 7 code?
This is an alternative way to do it.
int n = 10;
List<List<Integer>> seqList = Stream.<List<Integer>>generate(() -> new ArrayList<>())
.limit(n).collect(Collectors.toList());
Thanks to Jacob G I was able to see the problem.
The call .mapToObj(ArrayList<Integer>::new) was creating ArrayLists of varying capacity: it is equivalent to .mapToObj(i -> new ArrayList<Integer>(i)). Allocating the backing array for each new ArrayList takes longer and longer as i grows, hence the timeout. The better code is as follows:
List<List<Integer>> seqList2 = IntStream.range(0, n)
.mapToObj(i -> new ArrayList<Integer>())
.collect(Collectors.toList());
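To see why the constructor reference binds to the capacity constructor, here is a small sketch with a hypothetical Probe class (not from the question) that records which argument the stream passed in:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class CapacityDemo {
    // A stand-in class whose int constructor mirrors ArrayList(int initialCapacity).
    static class Probe {
        final int capacity;
        Probe(int capacity) { this.capacity = capacity; }
    }

    static int lastCapacity(int n) {
        List<Probe> probes = IntStream.range(0, n)
                .mapToObj(Probe::new)       // same shape as ArrayList<Integer>::new
                .collect(Collectors.toList());
        // The stream index leaked in as the constructor argument.
        return probes.get(n - 1).capacity;
    }

    public static void main(String[] args) {
        System.out.println(lastCapacity(10)); // prints 9, not a fixed default
    }
}
```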
The relative cost of the streaming APIs will be high, even with the correction. This can be seen by walking through the many steps which are performed. The complexity is quite extraordinary.
The code examples, below, are from the IBM Java SE Runtime Environment version 1.8.
// A walkthrough of the operation:
//
// "Create a list of lists by collecting the results of applying the ArrayList
// initializer to the stream of 'int' values ranging from 0 to 10."
static {
List<List<Integer>> seqList = IntStream.range(0, 10)
.mapToObj( ArrayList<Integer>::new )
.collect( Collectors.toList() );
}
// First step: Create an 'int' Stream.
//
// Roughly, create an 'int' iterator, then wrap that in an 'int' stream.
//
// The iterator is responsible for knowing the initial and final values
// over the range of iteration, and for providing basic iteration.
//
// However, 'mapToObj' is part of the streaming API. The iterator
// must be put into a stream to access that API.
// The 'int' stream factory method.
//
// Fan out to 'RangeIntSpliterator.init' and to 'StreamSupport.intStream'.
//
// The 'int' stream is created with 'parallel' set to false.
class IntStream {
public static IntStream range(int startInclusive, int endExclusive) {
if ( startInclusive >= endExclusive ) {
return empty();
} else {
return StreamSupport.intStream(
new Streams.RangeIntSpliterator(startInclusive, endExclusive, false),
false );
}
}
}
// The 'int' iterator type.
//
// After setup, 'forEachRemaining' will be used to perform
// the 'int' iteration.
class RangeIntSpliterator implements Spliterator.OfInt {
protected int from;
protected final int upTo;
protected int last;
RangeIntSpliterator(int from, int upTo, boolean closed) {
this( from, upTo, (closed ? 1 : 0) );
}
void forEachRemaining(Consumer<? super Integer> action);
void forEachRemaining(IntConsumer consumer);
}
// The 'int' stream factory method.
//
// Fan out to 'IntPipeline.Head<>.init'. 'IntPipeline.Head' extends
// 'IntPipeline', which extends 'AbstractPipeline'.
//
// 'IntPipeline.mapToObj' creates a stream of 'ArrayList' instances
// out of the stream of 'int' instances.
class StreamSupport {
public static IntStream intStream(Spliterator.OfInt spliterator, boolean parallel) {
return new IntPipeline.Head<>(
spliterator,
StreamOpFlag.fromCharacteristics(spliterator),
parallel );
}
}
class IntPipeline.Head<E_IN> extends IntPipeline<E_IN> {
Head(Spliterator<Integer> source, int sourceFlags, boolean parallel) {
super(source, sourceFlags, parallel);
}
}
class IntPipeline<E_IN>
extends AbstractPipeline<E_IN, Integer, IntStream>
implements IntStream {
IntPipeline(Spliterator<Integer> source, int sourceFlags, boolean parallel) {
super(source, sourceFlags, parallel);
}
<U> Stream<U> mapToObj(IntFunction<? extends U> mapper);
}
class AbstractPipeline {
AbstractPipeline(Spliterator<?> source, int sourceFlags, boolean parallel) {
this.previousStage = null;
this.sourceSpliterator = source;
this.sourceStage = this;
this.sourceOrOpFlags = ( sourceFlags & StreamOpFlag.STREAM_MASK );
this.combinedFlags = ( (~(sourceOrOpFlags << 1)) & StreamOpFlag.INITIAL_OPS_VALUE );
this.depth = 0;
this.parallel = parallel;
}
}
// Second step: Create a second stream by composing the 'int' stream with the ArrayList
// initializer.
//
// Fan out to 'ReferencePipeline.StatelessOp'. 'StatelessOp' extends 'ReferencePipeline',
// which extends 'AbstractPipeline'.
class IntPipeline {
@Override
public final <U> Stream<U> mapToObj(IntFunction<? extends U> mapper) {
Objects.requireNonNull(mapper);
return new ReferencePipeline.StatelessOp<Integer, U>(
this,
StreamShape.INT_VALUE,
(StreamOpFlag.NOT_SORTED | StreamOpFlag.NOT_DISTINCT) ) {
Sink<Integer> opWrapSink(int flags, Sink<U> sink) {
return new Sink.ChainedInt<U>(sink) {
public void accept(int t) {
downstream.accept( mapper.apply(t) );
}
};
}
};
}
}
class StatelessOp<E_IN, E_OUT> extends ReferencePipeline<E_IN, E_OUT> {
StatelessOp(AbstractPipeline<?, E_IN, ?> upstream, StreamShape inputShape, int opFlags) {
super(upstream, opFlags);
assert upstream.getOutputShape() == inputShape;
}
}
abstract class ReferencePipeline<P_IN, P_OUT>
extends AbstractPipeline<P_IN, P_OUT, Stream<P_OUT>>
implements Stream<P_OUT> {
ReferencePipeline(Supplier<? extends Spliterator<?>> source, int sourceFlags) {
super(source, sourceFlags);
}
}
abstract class AbstractPipeline<E_IN, E_OUT, S extends BaseStream<E_OUT, S>>
extends PipelineHelper<E_OUT> implements BaseStream<E_OUT, S> {
AbstractPipeline(AbstractPipeline<?, E_IN, ?> previousStage, int opFlags) {
if ( previousStage.linkedOrConsumed ) {
throw new IllegalStateException(MSG_STREAM_LINKED);
}
previousStage.linkedOrConsumed = true;
previousStage.nextStage = this;
this.previousStage = previousStage;
this.sourceOrOpFlags = opFlags & StreamOpFlag.OP_MASK;
this.combinedFlags = StreamOpFlag.combineOpFlags(opFlags, previousStage.combinedFlags);
this.sourceStage = previousStage.sourceStage;
if ( opIsStateful() ) {
sourceStage.sourceAnyStateful = true;
}
this.depth = previousStage.depth + 1;
}
}
// Third step: Obtain the collector which is to be used by the 'int' stream.
//
// Note use of 'CH_ID', which marks the collector as an 'identity finisher'.
class Collectors {
static final Set<Collector.Characteristics> CH_ID =
Collections.unmodifiableSet( EnumSet.of(Collector.Characteristics.IDENTITY_FINISH) );
public static <T> Collector<T, ?, List<T>> toList() {
return new CollectorImpl<>(
(Supplier<List<T>>) ArrayList::new,
List::add,
(left, right) -> { left.addAll(right); return left; },
CH_ID);
}
}
class CollectorImpl<T, A, R> implements Collector<T, A, R> {
private final Supplier<A> supplier;
private final BiConsumer<A, T> accumulator;
private final BinaryOperator<A> combiner;
private final Function<A, R> finisher;
private final Set<Characteristics> characteristics;
CollectorImpl(
Supplier<A> supplier,
BiConsumer<A, T> accumulator,
BinaryOperator<A> combiner,
Function<A,R> finisher,
Set<Characteristics> characteristics) {
this.supplier = supplier;
this.accumulator = accumulator;
this.combiner = combiner;
this.finisher = finisher;
this.characteristics = characteristics;
}
CollectorImpl(
Supplier<A> supplier,
BiConsumer<A, T> accumulator,
BinaryOperator<A> combiner,
Set<Characteristics> characteristics) {
this(supplier, accumulator, combiner, castingIdentity(), characteristics);
}
}
// Fourth step: Start collection.
//
// Push the collector through 'ReduceOps.makeRef'.
class ReferencePipeline {
public final <R, A> R collect(Collector<? super P_OUT, A, R> collector) {
A container;
if ( isParallel() &&
(collector.characteristics().contains(Collector.Characteristics.CONCURRENT)) &&
(!isOrdered() ||
collector.characteristics().contains(Collector.Characteristics.UNORDERED))) {
container = collector.supplier().get();
BiConsumer<A, ? super P_OUT> accumulator = collector.accumulator();
forEach(u -> accumulator.accept(container, u));
} else {
container = evaluate( ReduceOps.makeRef(collector) );
}
return collector.characteristics().contains(Collector.Characteristics.IDENTITY_FINISH)
? (R) container
: collector.finisher().apply(container);
}
}
class ReduceOps {
public static <T, I> TerminalOp<T, I> makeRef(Collector<? super T, I, ?> collector) {
Supplier<I> supplier = Objects.requireNonNull(collector).supplier();
BiConsumer<I, ? super T> accumulator = collector.accumulator();
BinaryOperator<I> combiner = collector.combiner();
class ReducingSink extends Box<I> implements AccumulatingSink<T, I, ReducingSink> {
public void begin(long size) {
state = supplier.get();
}
public void accept(T t) {
accumulator.accept(state, t);
}
public void combine(ReducingSink other) {
state = combiner.apply(state, other.state);
}
}
return new ReduceOp<T, I, ReducingSink>(StreamShape.REFERENCE) {
public ReducingSink makeSink() {
return new ReducingSink();
}
};
}
}
class ReduceOp<T, R, S extends AccumulatingSink<T, R, S>> implements TerminalOp<T, R> {
private final StreamShape inputShape;
ReduceOp(StreamShape shape) {
inputShape = shape;
}
}
// Fifth step: Walk into the stream API.
class ReferencePipeline {
<R> R evaluate(TerminalOp<E_OUT, R> terminalOp) {
assert ( getOutputShape() == terminalOp.inputShape() );
if ( linkedOrConsumed ) {
throw new IllegalStateException(MSG_STREAM_LINKED);
}
linkedOrConsumed = true;
return ( isParallel()
? terminalOp.evaluateParallel( this, sourceSpliterator( terminalOp.getOpFlags() ) )
: terminalOp.evaluateSequential( this, sourceSpliterator( terminalOp.getOpFlags() ) ) );
}
}
class AbstractPipeline {
Spliterator<E_OUT> sourceStageSpliterator() {
if ( this != sourceStage ) {
throw new IllegalStateException();
}
if ( linkedOrConsumed ) {
throw new IllegalStateException(MSG_STREAM_LINKED);
}
linkedOrConsumed = true;
if ( sourceStage.sourceSpliterator != null ) {
Spliterator<E_OUT> s = sourceStage.sourceSpliterator;
sourceStage.sourceSpliterator = null;
return s;
} else if ( sourceStage.sourceSupplier != null ) {
Spliterator<E_OUT> s = (Spliterator<E_OUT>) sourceStage.sourceSupplier.get();
sourceStage.sourceSupplier = null;
return s;
} else {
throw new IllegalStateException(MSG_CONSUMED);
}
}
}
class ReduceOp {
public <P_IN> R evaluateSequential(
PipelineHelper<T> helper,
Spliterator<P_IN> spliterator) {
return helper.wrapAndCopyInto( makeSink(), spliterator ).get();
}
}
class AbstractPipeline {
final <P_IN, S extends Sink<E_OUT>> S wrapAndCopyInto(S sink, Spliterator<P_IN> spliterator) {
copyInto( wrapSink( Objects.requireNonNull(sink) ), spliterator );
return sink;
}
<P_IN> Sink<P_IN> wrapSink(Sink<E_OUT> sink) {
Objects.requireNonNull(sink);
for ( AbstractPipeline p = AbstractPipeline.this; p.depth > 0; p = p.previousStage ) {
sink = p.opWrapSink( p.previousStage.combinedFlags, sink );
}
return (Sink<P_IN>) sink;
}
}
class StatelessOp {
Sink<Integer> opWrapSink(int flags, Sink<U> sink) {
return new Sink.ChainedInt<U>(sink) {
public void accept(int t) {
downstream.accept( mapper.apply(t) );
}
};
}
}
// Sixth step: Perform the actual iteration and collection.
//
// Ignoring 'begin' and 'end', iteration and collection occurs in the call
// to 'forEachRemaining'.
class AbstractPipeline {
<P_IN> void copyInto(Sink<P_IN> wrappedSink, Spliterator<P_IN> spliterator) {
Objects.requireNonNull(wrappedSink);
if ( !StreamOpFlag.SHORT_CIRCUIT.isKnown( getStreamAndOpFlags() ) ) {
wrappedSink.begin( spliterator.getExactSizeIfKnown() );
spliterator.forEachRemaining(wrappedSink);
wrappedSink.end();
} else {
copyIntoWithCancel(wrappedSink, spliterator);
}
}
}
class RangeIntSpliterator implements Spliterator.OfInt {
void forEachRemaining(Consumer<? super Integer> action) {
if ( action instanceof IntConsumer ) {
forEachRemaining((IntConsumer) action);
} else {
if ( Tripwire.ENABLED ) {
Tripwire.trip(getClass(), "{0} calling Spliterator.OfInt.forEachRemaining((IntConsumer) action::accept)");
}
forEachRemaining((IntConsumer) action::accept);
}
}
void forEachRemaining(IntConsumer consumer) {
Objects.requireNonNull(consumer);
int i = from;
final int hUpTo = upTo;
int hLast = last;
from = upTo;
last = 0;
while ( i < hUpTo ) {
consumer.accept(i++);
}
if ( hLast > 0 ) {
consumer.accept(i);
}
}
}
// Seventh step: For each iteration, unwind and perform the mapping and
// collection operations.
class Sink.ChainedInt<U> { // the anonymous sink created by 'opWrapSink'
public void accept(int t) {
downstream.accept( mapper.apply(t) );
}
}
class ArrayList {
public ArrayList(int initialCapacity) {
// ...
}
}
class ReducingSink {
public void accept(T t) {
accumulator.accept(state, t);
}
}
class ArrayList {
public boolean add(E e) {
// ...
}
}
// Eighth step: Walking out with the return value.
IntPipeline$4(AbstractPipeline<E_IN,E_OUT,S>).wrapAndCopyInto(S, Spliterator<P_IN>)
-- returns a 'ReducingSink' instance.
ReduceOps$3(ReduceOps$ReduceOp<T,R,S>).evaluateSequential(PipelineHelper<T>, Spliterator<P_IN>)
-- returns the 'ArrayList' instance.
IntPipeline$4(AbstractPipeline<E_IN,E_OUT,S>).evaluate(TerminalOp<E_OUT,R>)
-- returns the 'ArrayList' instance.
IntPipeline$4(ReferencePipeline<P_IN,P_OUT>).collect(Collector<? super P_OUT,A,R>)
-- returns the 'ArrayList' instance.
Tester.main
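Stripped of the spliterator and sink machinery, the whole walkthrough above reduces to something like this plain loop (a simplification for illustration, not the actual JDK code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntFunction;

public class WalkthroughSummary {
    // What the steps above boil down to, once the sinks are unwound.
    static List<List<Integer>> collectRange(int n, IntFunction<List<Integer>> mapper) {
        List<List<Integer>> container = new ArrayList<>(); // Collectors.toList supplier
        for (int i = 0; i < n; i++) {                      // RangeIntSpliterator.forEachRemaining
            container.add(mapper.apply(i));                // mapping sink feeding the ReducingSink
        }
        return container;                                  // identity finisher returns as-is
    }

    public static void main(String[] args) {
        System.out.println(collectRange(10, ArrayList::new).size()); // 10
    }
}
```

The stream version pays for flag computation, pipeline construction, and sink wrapping before it ever reaches this loop, which is the overhead the answer is describing.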
Is there a way in Java to have a map where the type parameter of a value is tied to the type parameter of a key? What I want to write is something like the following:
public class Foo {
// This declaration won't compile - what should it be?
private static Map<Class<T>, T> defaultValues;
// These two methods are just fine
public static <T> void setDefaultValue(Class<T> clazz, T value) {
defaultValues.put(clazz, value);
}
public static <T> T getDefaultValue(Class<T> clazz) {
return defaultValues.get(clazz);
}
}
That is, I can store any default value against a Class object, provided the value's type matches that of the Class object. I don't see why this shouldn't be allowed since I can ensure when setting/getting values that the types are correct.
EDIT: Thanks to cletus for his answer. I don't actually need the type parameters on the map itself since I can ensure consistency in the methods which get/set values, even if it means using some slightly ugly casts.
You're not trying to implement Joshua Bloch's typesafe heterogeneous container pattern, are you? Basically:
public class Favorites {
private Map<Class<?>, Object> favorites =
new HashMap<Class<?>, Object>();
public <T> void setFavorite(Class<T> klass, T thing) {
favorites.put(klass, thing);
}
public <T> T getFavorite(Class<T> klass) {
return klass.cast(favorites.get(klass));
}
public static void main(String[] args) {
Favorites f = new Favorites();
f.setFavorite(String.class, "Java");
f.setFavorite(Integer.class, 0xcafebabe);
String s = f.getFavorite(String.class);
int i = f.getFavorite(Integer.class);
}
}
From Effective Java (2nd edition) and this presentation.
The question and the answers made me come up with this solution: Type-safe object map. Here is the code. Test case:
import static org.junit.Assert.*;
import java.util.ArrayList;
import java.util.List;
import org.junit.Test;
public class TypedMapTest {
private final static TypedMapKey<String> KEY1 = new TypedMapKey<String>( "key1" );
private final static TypedMapKey<List<String>> KEY2 = new TypedMapKey<List<String>>( "key2" );
@Test
public void testGet() throws Exception {
TypedMap map = new TypedMap();
map.set( KEY1, null );
assertNull( map.get( KEY1 ) );
String expected = "Hallo";
map.set( KEY1, expected );
String value = map.get( KEY1 );
assertEquals( expected, value );
map.set( KEY2, null );
assertNull( map.get( KEY2 ) );
List<String> list = new ArrayList<String> ();
map.set( KEY2, list );
List<String> valueList = map.get( KEY2 );
assertEquals( list, valueList );
}
}
This is the Key class. Note that the type T is never used in this class! It's purely for the purpose of type casting when reading the value out of the map. The field key only gives the key a name.
public class TypedMapKey<T> {
private String key;
public TypedMapKey( String key ) {
this.key = key;
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + ( ( key == null ) ? 0 : key.hashCode() );
return result;
}
@Override
public boolean equals( Object obj ) {
if( this == obj ) {
return true;
}
if( obj == null ) {
return false;
}
if( getClass() != obj.getClass() ) {
return false;
}
TypedMapKey<?> other = (TypedMapKey<?>) obj;
if( key == null ) {
if( other.key != null ) {
return false;
}
} else if( !key.equals( other.key ) ) {
return false;
}
return true;
}
@Override
public String toString() {
return key;
}
}
TypedMap.java:
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
public class TypedMap implements Map<Object, Object> {
private Map<Object, Object> delegate;
public TypedMap( Map<Object, Object> delegate ) {
this.delegate = delegate;
}
public TypedMap() {
this.delegate = new HashMap<Object, Object>();
}
@SuppressWarnings( "unchecked" )
public <T> T get( TypedMapKey<T> key ) {
return (T) delegate.get( key );
}
@SuppressWarnings( "unchecked" )
public <T> T remove( TypedMapKey<T> key ) {
return (T) delegate.remove( key );
}
public <T> void set( TypedMapKey<T> key, T value ) {
delegate.put( key, value );
}
// --- Only calls to delegates below
public void clear() {
delegate.clear();
}
public boolean containsKey( Object key ) {
return delegate.containsKey( key );
}
public boolean containsValue( Object value ) {
return delegate.containsValue( value );
}
public Set<java.util.Map.Entry<Object, Object>> entrySet() {
return delegate.entrySet();
}
public boolean equals( Object o ) {
return delegate.equals( o );
}
public Object get( Object key ) {
return delegate.get( key );
}
public int hashCode() {
return delegate.hashCode();
}
public boolean isEmpty() {
return delegate.isEmpty();
}
public Set<Object> keySet() {
return delegate.keySet();
}
public Object put( Object key, Object value ) {
return delegate.put( key, value );
}
public void putAll( Map<? extends Object, ? extends Object> m ) {
delegate.putAll( m );
}
public Object remove( Object key ) {
return delegate.remove( key );
}
public int size() {
return delegate.size();
}
public Collection<Object> values() {
return delegate.values();
}
}
No, you can't do it directly. You'll need to write a wrapper class around Map<Class, Object> to enforce that Object will be instanceof Class.
It's possible to create a class which stores a map of type-safe keys to values, and cast when necessary. The cast in the get method is safe: after using new Key<CharSequence>(), it's not possible to safely cast it to Key<String> or Key<Object>, so the type system enforces the correct usage of the class.
The Key class needs to be final, as otherwise a user could override equals and cause type-unsafety if two elements with different types were to be equal. Alternatively, it's possible to make equals final if you want to use inheritance despite the issues with it.
public final class TypeMap {
private final Map<Key<?>, Object> m = new HashMap<>();
public <T> T get(Key<? extends T> key) {
// Safe, as it's not possible to safely change the Key generic type,
// the hash map cannot be accessed by a user, and this class is final
// to prevent serialization attacks.
@SuppressWarnings("unchecked")
T value = (T) m.get(key);
return value;
}
public <T> void put(Key<? super T> key, T value) {
m.put(key, value);
}
public static final class Key<T> {
}
}
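A usage sketch, with the TypeMap above condensed inline so the example compiles standalone (the demo values are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public class TypeMapDemo {
    // Condensed copy of the TypeMap from the answer above.
    static final class TypeMap {
        private final Map<Key<?>, Object> m = new HashMap<>();
        @SuppressWarnings("unchecked")
        <T> T get(Key<? extends T> key) { return (T) m.get(key); }
        <T> void put(Key<? super T> key, T value) { m.put(key, value); }
        static final class Key<T> {}
    }

    static String demo() {
        TypeMap map = new TypeMap();
        TypeMap.Key<String> name = new TypeMap.Key<>();
        TypeMap.Key<Integer> count = new TypeMap.Key<>();
        map.put(name, "hello");
        map.put(count, 42);
        // map.put(name, 42);  // would not compile: Integer is not a String
        return map.get(name) + ":" + map.get(count); // no casts at the call site
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```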
You can use the two classes below: a map class (GenericMap) and a key class (GenericKey).
For example:
// Create a key, including the type definition
public static final GenericKey<HttpServletRequest> REQUEST = new GenericKey<>(HttpServletRequest.class, "HttpRequestKey");
public void example(HttpServletRequest requestToSave)
{
GenericMap map = new GenericMap();
// Saving value
map.put(REQUEST, requestToSave);
// Getting value
HttpServletRequest request = map.get(REQUEST);
}
Advantages
It forces the user to put and get the correct types, via compile-time errors
It does the casting for you internally
The generic key saves you from writing the class type each time you call put(..) or get(..)
No typo mistakes, as there would be with plain String keys
GenericMap
public class GenericMap
{
private Map<String, Object> storageMap;
protected GenericMap()
{
storageMap = new HashMap<String, Object>();
}
public <T> T get(GenericKey<T> key)
{
Object value = storageMap.get(key.getKey());
if (value == null)
{
return null;
}
return key.getClassType().cast(value);
}
/**
* @param key GenericKey object with generic type - T (it can be any type)
* @param object value to put in the map; the type of 'object' must be T
*/
public <T> void put(GenericKey<T> key, T object)
{
T castedObject = key.getClassType().cast(object);
storageMap.put(key.getKey(), castedObject);
}
@Override
public String toString()
{
return storageMap.toString();
}
}
GenericKey
public class GenericKey<T>
{
private Class<T> classType;
private String key;
@SuppressWarnings("unused")
private GenericKey()
{
}
public GenericKey(Class<T> iClassType, String iKey)
{
this.classType = iClassType;
this.key = iKey;
}
public Class<T> getClassType()
{
return classType;
}
public String getKey()
{
return key;
}
@Override
public String toString()
{
return "[classType=" + classType + ", key=" + key + "]";
}
}
The type parameter T must be declared on the class itself, so each instance is bound to a single type. The following example works:
public class Test<T> {
private Map<Class<T>, T> defaultValues = new HashMap<>();
public void setDefaultValue(Class<T> clazz, T value) {
defaultValues.put(clazz, value);
}
public T getDefaultValue(Class<T> clazz) {
return defaultValues.get(clazz);
}
}
Alternatively, you can use Paul Tomblin's answer, and wrap the Map with your own object which will enforce this type of generics.
To be precise, I am trying to flatten a tree and I am stuck on trying to get the values of private attributes in a generic class using a generic function.
I have attached the classes to show how the tree is structured exactly. But it looks something like this (the middle child is itself a node):
   /|\
  1 | 6
   /|\
  5 4 9
I am going to paste my attempt at the end. First, let me introduce the classes:
Triple simply stores three values of the same type.
public class Triple<V> {
private final V l, m, r;
public Triple(V l, V m, V r) {
this.l = l;
this.m = m;
this.r = r;
}
public V left() { return l; }
public V middle() { return m; }
public V right() { return r; }
}
Straightforward interface:
public interface Function<P, R> {
R apply(P p);
}
Now, for a tricky class. This one is simply a type that stores a value of one of two types, but not both.
public class EitherOr<A,B> {
// Constructs a left-type EitherOr
public static <A, B> EitherOr<A, B> left(A a) {
return new EitherOr<>(a, null);
}
// Constructs a right-type EitherOr
public static <A, B> EitherOr<A, B> right(B b) {
return new EitherOr<>(null, b);
}
private final A a;
private final B b;
private EitherOr(A a, B b) {
this.a = a; this.b = b;
}
public<T> T ifLeft(Function<A,T> f) {
return f.apply(a);
}
public<T> T ifRight(Function<B,T> f) {
return f.apply(b);
}
public boolean isLeft() {
return b == null;
}
}
I know this is getting long, but bear with me. This class implements the tree structure.
public interface Tree<T> {
EitherOr<T, Triple<Tree<T>>> get();
static final class Leaf<T> implements Tree<T> {
public static <T> Leaf<T> leaf (T value) {
return new Leaf<T>(value);
}
private final T t;
public Leaf(T t) { this.t = t; }
@Override
public EitherOr<T, Triple<Tree<T>>> get() {
return EitherOr.left(t);
}
}
static final class Node<T> implements Tree<T> {
public static <T> Tree<T> tree (T left, T middle, T right) {
return new Node<T>(Leaf.leaf(left), Leaf.leaf(middle), Leaf.leaf(right));
}
private final Triple<Tree<T>> branches;
public Node(Tree<T> left, Tree<T> middle, Tree<T> right) {
this.branches = new Triple<Tree<T>>(left, middle, right);
}
@Override
public EitherOr<T, Triple<Tree<T>>> get() {
return EitherOr.right(branches);
}
}
}
Alright. Here is my idea for flattening:
public class MyFlattenTree<T> implements FlattenTree<T> {
public List<T> flattenInOrder(Tree<T> tree) {
List<T> list = new ArrayList<T>();
EitherOr<T, Triple<Tree<T>>> EitherOr;
EitherOr = tree.get();
// it is a leaf
if (EitherOr.isLeft()) {
// This is where the problem lies:
// I don't know how to get the value using a function f
list.add((T) EitherOr.ifLeft(f));
return list;
}
else {
// basically recursively go through the tree somehow
}
return null;
}
}
As I said, I am stuck with trying to retrieve the value in the EitherOr class using the Function interface. I am thinking of implementing the Function interface and writing a function for "apply" that just gets the value, but I am not sure how to do that. Any help would be appreciated. Thanks!
So, here is your flattenInOrder method:
public List<T> flattenInOrder(final Tree<T> tree) {
final EitherOr<T, Triple<Tree<T>>> EitherOr = tree.get();
if (EitherOr.isLeft()) {
return Collections.singletonList(EitherOr.ifLeft(this.ifLeftFunction));
}
return EitherOr.ifRight(this.ifRightFunction);
}
Quite simple, assuming that:
ifLeftFunction yields a single element (since EitherOr<T, Triple<Tree<T>>> has a single T element if it is "left")
... and:
ifRightFunction yields a collection of elements (since EitherOr<T, Triple<Tree<T>>> has a list of T elements if it is "right")
Let's look into these functions now:
ifLeftFunction is... basic. I want to extract a T from... a T.
final Function<T, T> ifLeftFunction = new Function<T, T>() {
@Override
public T apply(final T t) {
return t;
}
};
ifRightFunction is slightly more complex: it has to be recursive and collect all Ts from the Tree it's browsing:
final Function<Triple<Tree<T>>, List<T>> ifRightFunction = new Function<Triple<Tree<T>>, List<T>>() {
@Override
public List<T> apply(final Triple<Tree<T>> t) {
final List<T> res = new ArrayList<>();
res.addAll(MyFlattenTree.this.flattenInOrder(t.left()));
res.addAll(MyFlattenTree.this.flattenInOrder(t.middle()));
res.addAll(MyFlattenTree.this.flattenInOrder(t.right()));
return res;
}
};
And... you're done!
Sample working code:
public class MyFlattenTree<T> {
private final Function<Triple<Tree<T>>, List<T>> ifRightFunction = new Function<Triple<Tree<T>>, List<T>>() {
@Override
public List<T> apply(final Triple<Tree<T>> t) {
final List<T> res = new ArrayList<>();
res.addAll(MyFlattenTree.this.flattenInOrder(t.left()));
res.addAll(MyFlattenTree.this.flattenInOrder(t.middle()));
res.addAll(MyFlattenTree.this.flattenInOrder(t.right()));
return res;
}
};
private final Function<T, T> ifLeftFunction = new Function<T, T>() {
@Override
public T apply(final T t) {
return t;
}
};
public static void main(final String[] args) {
final Tree<String> tree = new Node<>(new Leaf<>("1"), new Node<>(new Leaf<>("5"), new Leaf<>("4"), new Leaf<>("9")), new Leaf<>("6"));
System.out.println(new MyFlattenTree<String>().flattenInOrder(tree));
}
public List<T> flattenInOrder(final Tree<T> tree) {
final EitherOr<T, Triple<Tree<T>>> EitherOr = tree.get();
if (EitherOr.isLeft()) {
return Collections.singletonList(EitherOr.ifLeft(this.ifLeftFunction));
}
return EitherOr.ifRight(this.ifRightFunction);
}
}
Note that I'm creating the exact Tree you're featuring as an example in your question in the main method here:
public static void main(final String[] args) {
final Tree<String> tree = new Node<>(new Leaf<>("1"), new Node<>(new Leaf<>("5"), new Leaf<>("4"), new Leaf<>("9")), new Leaf<>("6"));
System.out.println(new MyFlattenTree<String>().flattenInOrder(tree));
}
Output: [1, 5, 4, 9, 6]
Cheers ;)
So, after playing around with Java generics a bit, to get a deeper understanding of their capabilities, I decided to try to implement the curried version of the composition function, familiar to functional programmers. Compose has the type (in functional languages) (b -> c) -> (a -> b) -> (a -> c). Currying arithmetic functions wasn't too hard, since they are just polymorphic, but compose is a higher-order function, and it's proven taxing to my understanding of generics in Java.
Here is the implementation I've created so far:
public class Currying {
public static void main(String[] argv){
// Basic usage of currying
System.out.println(add().ap(3).ap(4));
// Next, lets try (3 * 4) + 2
// First lets create the (+2) function...
Fn<Integer, Integer> plus2 = add().ap(2);
// next, the times 3 function
Fn<Integer, Integer> times3 = mult().ap(3);
// now we compose them into a multiply by 2 and add 3 function
Fn<Integer, Integer> times3plus2 = compose().ap(plus2).ap(times3);
// now we can put in the final argument and print the result
// without compose:
System.out.println(plus2.ap(times3.ap(4)));
// with compose:
System.out.println(times3plus2.ap(new Integer(4)));
}
public static <A,B,C>
Fn<Fn<B,C>, // (b -> c) -> -- f
Fn<Fn<A,B>, // (a -> b) -> -- g
Fn<A,C>>> // (a -> c)
compose(){
return new Fn<Fn<B,C>,
Fn<Fn<A,B>,
Fn<A,C>>> () {
public Fn<Fn<A,B>,
Fn<A,C>> ap(final Fn<B,C> f){
return new Fn<Fn<A,B>,
Fn<A,C>>() {
public Fn<A,C> ap(final Fn<A,B> g){
return new Fn<A,C>(){
public C ap(final A a){
return f.ap(g.ap(a));
}
};
}
};
}
};
}
// curried addition
public static Fn<Integer, Fn<Integer, Integer>> add(){
return new Fn<Integer, Fn<Integer, Integer>>(){
public Fn<Integer,Integer> ap(final Integer a) {
return new Fn<Integer, Integer>() {
public Integer ap(final Integer b){
return a + b;
}
};
}
};
}
// curried multiplication
public static Fn<Integer, Fn<Integer, Integer>> mult(){
return new Fn<Integer, Fn<Integer, Integer>>(){
public Fn<Integer,Integer> ap(final Integer a) {
return new Fn<Integer, Integer>() {
public Integer ap(final Integer b){
return a * b;
}
};
}
};
}
}
interface Fn<A, B> {
public B ap(final A a);
}
The implementations of add, mult, and compose all compile just fine, but I find myself having a problem when it comes to actually using compose. I get the following error for line 12 (the first usage of compose in main):
Currying.java:12: ap(Fn<java.lang.Object,java.lang.Object>) in
Fn<Fn<java.lang.Object,java.lang.Object>,Fn<Fn<java.lang.Object,java.lang.Object>,Fn<java.lang.Object,java.lang.Object>>>
cannot be applied to (Fn<java.lang.Integer,java.lang.Integer>)
Fn<Integer,Integer> times3plus2 = compose().ap(plus2).ap(times3);
I assume this error is because generic types are invariant, but I am not sure how to solve the problem. From what I've read, wildcard type variables can be used to alleviate invariance in some cases, but I'm not sure how to use it here or even whether it will be useful.
Disclaimer: I have no intention of writing code like this in any real project. This is a fun "can it be done" kind of thing. Also, I made the variable names brief in defiance of standard Java practice, because otherwise this example becomes even more of an incomprehensible wall of text.
The basic problem here is that in the original call to compose(), there is no way for the compiler to infer the bindings of A, B, and C, so it assumes them all to be Object. You can fix it by specifying the type bindings explicitly:
Fn<Integer, Integer> times3plus2 =
Currying.<Integer, Integer, Integer>compose().ap(plus2).ap(times3);
Of course, then you lose the clarity that comes from type inference. If you need type inference, you could define some intermediate classes to do the inferring:
public static ComposeStart compose() {
return new ComposeStart();
}
class ComposeStart {
public <B,C> ComposeContinuation<B,C> ap(Fn<B,C> f) {
return new ComposeContinuation<B, C>(f);
}
}
class ComposeContinuation<B, C> {
private final Fn<B,C> f;
ComposeContinuation(Fn<B,C> f) {
this.f = f;
}
public <A> Fn<A,C> ap(final Fn<A,B> g) {
return new Fn<A,C>() {
public C ap(A a) {
return f.ap(g.ap(a));
}
};
}
}
However, then the intermediate steps of currying are no longer Fns.
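For completeness, here is a self-contained sketch of how the inferring variant reads at the call site. The Fn interface and the two helper classes are copied from the answer above; the ComposeDemo class name and the plus2/times3 functions are illustrative, not part of the original post:

```java
// Sketch only: Fn, ComposeStart, and ComposeContinuation follow the answer;
// ComposeDemo, plus2, and times3 are illustrative names.
interface Fn<A, B> {
    B ap(A a);
}

class ComposeStart {
    public <B, C> ComposeContinuation<B, C> ap(Fn<B, C> f) {
        return new ComposeContinuation<B, C>(f);
    }
}

class ComposeContinuation<B, C> {
    private final Fn<B, C> f;

    ComposeContinuation(Fn<B, C> f) {
        this.f = f;
    }

    public <A> Fn<A, C> ap(final Fn<A, B> g) {
        return new Fn<A, C>() {
            public C ap(A a) {
                return f.ap(g.ap(a));
            }
        };
    }
}

public class ComposeDemo {
    static ComposeStart compose() {
        return new ComposeStart();
    }

    public static void main(String[] args) {
        Fn<Integer, Integer> plus2 = new Fn<Integer, Integer>() {
            public Integer ap(Integer a) { return a + 2; }
        };
        Fn<Integer, Integer> times3 = new Fn<Integer, Integer>() {
            public Integer ap(Integer a) { return a * 3; }
        };
        // No explicit type witness needed: each ap call infers its own bindings.
        Fn<Integer, Integer> times3plus2 = compose().ap(plus2).ap(times3);
        System.out.println(times3plus2.ap(4)); // prints 14
    }
}
```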
Thanks to Russell Zahniser's insight that I wasn't giving Java enough information to work with, I changed the layout a bit so that we instantiate a "Composer" object with the appropriate type variables filled in. Here is my current working solution:
interface Fn<A, B> {
public B ap(final A a);
}
public class Currying {
public static void main(String[] argv){
// Basic usage of currying
System.out.println(add().ap(3).ap(4));
// Next, let's try (3 * 4) + 2
// First, let's create the (+2) function...
Fn<Integer, Integer> plus2 = add().ap(2);
// next, the times 3 function
Fn<Integer, Integer> times3 = mult().ap(3);
// now we compose them into a "multiply by 3, then add 2" function
Fn<Integer, Integer> times3plus2 = new Composer<Integer,Integer,Integer>()
.compose().ap(plus2).ap(times3);
// without compose
System.out.println(plus2.ap(times3.ap(4)));
// with compose
System.out.println(times3plus2.ap(4));
}
static class Composer<A,B,C> {
public
Fn<Fn<B,C>, // (b -> c) -> -- f
Fn<Fn<A,B>, // (a -> b) -> -- g
Fn<A,C>>> // (a -> c)
compose(){
return new Fn<Fn<B,C>,
Fn<Fn<A,B>,
Fn<A,C>>> () {
public Fn<Fn<A,B>,
Fn<A,C>> ap(final Fn<B,C> f){
return new Fn<Fn<A,B>,
Fn<A,C>>() {
public Fn<A,C> ap(final Fn<A,B> g){
return new Fn<A,C>(){
public C ap(final A a){
return f.ap(g.ap(a));
}
};
}
};
}
};
}
}
public static Fn<Integer, Fn<Integer, Integer>> add(){
return new Fn<Integer, Fn<Integer, Integer>>(){
public Fn<Integer,Integer> ap(final Integer a) {
return new Fn<Integer, Integer>() {
public Integer ap(final Integer b){
return a + b;
}
};
}
};
}
public static Fn<Integer, Fn<Integer, Integer>> mult(){
return new Fn<Integer, Fn<Integer, Integer>>(){
public Fn<Integer,Integer> ap(final Integer a) {
return new Fn<Integer, Integer>() {
public Integer ap(final Integer b){
return a * b;
}
};
}
};
}
}
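As an aside (not part of the original answers): on Java 8 and later, the standard java.util.function.Function sidesteps the invariance problem entirely, because compose and andThen are declared with bounded wildcards, so plain library calls achieve the same composition with full inference. A minimal sketch:

```java
import java.util.function.Function;

// Sketch: the same (3 * 4) + 2 composition using the standard library.
public class StdlibComposeDemo {
    public static void main(String[] args) {
        Function<Integer, Integer> plus2 = a -> a + 2;
        Function<Integer, Integer> times3 = a -> a * 3;
        // compose applies its argument first: plus2.compose(times3) is a -> plus2(times3(a))
        Function<Integer, Integer> times3plus2 = plus2.compose(times3);
        System.out.println(times3plus2.apply(4)); // prints 14
    }
}
```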
I've implemented this functionality myself in terms of a generic length-n chain of function calls.
public static final <X, Y> Chainer<X, Y> chain(
final Function<X, Y> primary
) {
return new Chainer<X, Y>(primary);
}
private static final class FunctionChain<IN, OUT> implements Function<IN, OUT> {
@SuppressWarnings("rawtypes")
private final List<Function> chain = new LinkedList<Function>();
private FunctionChain(@SuppressWarnings("rawtypes") final List<Function> chain) {
this.chain.addAll(chain);
}
@SuppressWarnings("unchecked")
@Override
public OUT apply(final IN in) {
Object ret = in;
for (final Function<Object, Object> f : chain) {
ret = f.apply(ret);
}
return (OUT) ret;
}
}
public static final class Chainer<IN, OUT> {
@SuppressWarnings("rawtypes")
private final LinkedList<Function> functions = new LinkedList<Function>();
@SuppressWarnings("unchecked")
private Chainer(@SuppressWarnings("rawtypes") final Function func) {
then(func);
}
@SuppressWarnings("unchecked")
public <OUT2> Chainer<IN, OUT2> then(final Function<OUT, OUT2> func) {
if (func instanceof FunctionChain) {
functions.addAll(((FunctionChain<?, ?>)func).chain);
} else {
functions.add(func);
}
return (Chainer<IN, OUT2>) this;
}
@SuppressWarnings("unchecked")
public Function<IN, OUT> build() {
// If empty, it's a noop function. If one element, there's no need for a chain.
return new FunctionChain<IN, OUT>(functions);
}
}
public static final <X, Y, Z> Function<X, Z> combine(
final Function<X, Y> primary,
final Function<Y, Z> secondary
) {
return chain(primary).then(secondary).build();
}
I would contend that this qualifies as a greater abuse of generics, since the Chainer class uses generics only to ensure that each subsequent .then() call is properly typed based on the last function supplied, while the functions themselves are stored in a raw list in what is known to be a safe calling order. But it does work, and it generalizes well: chain(first).then(second).then(third).then(fourth).build() is completely valid with this approach.
Just to be explicit: this is based on Function from Guava, but it should port over to any Function interface just fine.
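As a runnable illustration, here is a condensed port of the same chain/then/build pattern to java.util.function.Function. The ChainDemo class name and the example functions are my own illustrative choices; the shape of the API follows the answer above:

```java
import java.util.LinkedList;
import java.util.function.Function;

// Condensed sketch of the Chainer pattern against java.util.function.Function.
public class ChainDemo {
    public static final class Chainer<IN, OUT> {
        @SuppressWarnings("rawtypes")
        private final LinkedList<Function> functions = new LinkedList<>();

        private Chainer(Function<IN, OUT> func) {
            functions.add(func);
        }

        // Each then() re-types the chain; the backing list stays raw but is
        // only ever applied in insertion order, which keeps it type-safe in practice.
        @SuppressWarnings("unchecked")
        public <OUT2> Chainer<IN, OUT2> then(Function<OUT, OUT2> func) {
            functions.add(func);
            return (Chainer<IN, OUT2>) this;
        }

        @SuppressWarnings({"unchecked", "rawtypes"})
        public Function<IN, OUT> build() {
            return in -> {
                Object ret = in;
                for (Function f : functions) {
                    ret = f.apply(ret);
                }
                return (OUT) ret;
            };
        }
    }

    public static <X, Y> Chainer<X, Y> chain(Function<X, Y> primary) {
        return new Chainer<>(primary);
    }

    public static void main(String[] args) {
        Function<String, Integer> length = String::length;
        Function<Integer, Integer> square = n -> n * n;
        Function<Integer, String> show = n -> "result=" + n;
        Function<String, String> f = chain(length).then(square).then(show).build();
        System.out.println(f.apply("abcd")); // prints result=16
    }
}
```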