Java generic parameter itself using generics? - java

I'm trying to figure out how to structure a program using Java's generics, and wondering if I am doing something fundamentally wrong or just missing a simple bug in my code.
Say I have a generic class:
public interface Handler<T>{
public void process(T t);
}
Another generic class takes Handler as a generic parameter (pseudo code):
public interface Processor<S extends Handler<T>>{ //<== Error: cannot find symbol 'T'
public void addHandler(S u);
public void process(T t);
}
An abstract implementation providing boiler-plate implementations:
public abstract class ProcessorImpl<.....> implements Processor<.....>{
...
}
Think of a processor as an object that dispatches requests to process data to any number of handlers. Specific instances can be variations of process pipelines, intercepting filters, event systems, etc.
I'd like to be able to use it like the following:
Handler<String> myHandler1 = new HandlerImpl<String>();
Handler<String> myHandler2 = new HandlerImpl<String>();
Handler<Integer> myHandler3 = new HandlerImpl<Integer>();
Processor<Handler<String>> proc = new ProcessorImpl<Handler<String>>();
proc.addHandler(myHandler1);
proc.addHandler(myHandler2);
proc.addHandler(myHandler3); // this should be an error!
I can't get it to work. On paper it looks like it should be trivial, any ideas?
Thanks

Each type parameter is only defined within the class that declares it, so T isn't defined or available in the Processor interface.
You probably want to have Processor be:
public interface Processor<T>{
public void addHandler(Handler<? super T> u);
public void process(T t);
}
Here you are declaring a Processor that can only handle events/input of a particular type, e.g. String, Integer, etc. So the following statement will be valid:
Processor<String> proc = ...
proc.addHandler(new Handler<String>()); // valid
proc.addHandler(new Handler<Object>()); // valid, as Strings are Objects too
proc.addHandler(new Handler<Integer>()); // invalid, not a String handler
proc.process("good"); // valid
proc.process(1); // invalid, not a String
If Processor is intended to handle types at runtime and makes a dynamic dispatch based on the appropriate runtime type, then you can declare proc (in the last example) as Processor<?>. Then all the statements are valid.
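For completeness, here is a minimal sketch of an implementation of that interface (ProcessorImpl and its field names are illustrative, not from the question):
import java.util.ArrayList;
import java.util.List;

public class ProcessorImpl<T> implements Processor<T> {
    // handlers registered for this processor's element type
    private final List<Handler<? super T>> handlers = new ArrayList<>();

    @Override
    public void addHandler(Handler<? super T> u) {
        handlers.add(u);
    }

    @Override
    public void process(T t) {
        // dispatch the value to every registered handler
        for (Handler<? super T> h : handlers) {
            h.process(t);
        }
    }
}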

These changes should work:
public interface Processor<T, S extends Handler<T>>
and
class ProcessorImpl<T, S extends Handler<T>>
implements Processor<T, S>
and
Processor<String, Handler<String>> proc = new ProcessorImpl<String, Handler<String>>();
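With that declaration, the misuse from the question is caught at compile time:
proc.addHandler(myHandler1); // OK, a Handler<String>
proc.addHandler(myHandler3); // compile error: Handler<Integer> does not match Handler<String>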

It shouldn't work: with T = String, handlers of Integer are not allowed.
At compile time, your class will have the method process(String t) and not process(Integer t).

Related

Java generic type not working in this case

This class displays information:
// display line numbers from a file
display(getLineNumber(myFile));
// display users from DB
display(getUsersName(myDBRepository));
etc...
I wanted to make a generic interface, so I can externalize the code that display information.
Then I could do something like:
myInformationElements.stream().forEach(e -> display(e.getValue()));
Here is what I have so far (not working):
public interface InformationElement {
public <T> String getValue (T param);
}
public class NbFileLineInformationElement implements InformationElement{
@Override
public <File> String getValue(File param) {
return *same code as in getLineNumber(myFile)*;
}
}
public class UserInformationElement implements InformationElement{
@Override
public <UserRepository> String getValue(UserRepository param) {
return *same code as in getUsersName(myDBRepository)*;
}
}
Here my generic type is not working: File is not recognized as java.io.File (same for my JPA repository). What am I doing wrong here?
Is this the best practice for my needs ?
You've defined type parameters File and UserRepository that are shadowing the class names File and UserRepository. This is one of the surprises of naming type parameters the same as existing classes. The type parameters don't represent the classes, and they don't have bounds, so the compiler can only assume they have Object methods.
This is not the best practice. When implementing generic methods, the methods must remain generic and at least as wide-open with respect to bounds. To be able to restrict what the type parameter means later, define it on the class/interface, and let subclasses supply what it's supposed to mean for that specific implementation with a type argument.
The best solution here is to move InformationElement's type parameter to the class, and to supply type arguments in your subclasses. The methods are no longer generic, but they do use the type parameters defined on the interface/classes.
interface InformationElement<T> {
public String getValue (T param);
}
class NbFileLineInformationElement implements InformationElement<File>{
@Override
public String getValue(File param) {
return /*same code as in getLineNumber(myFile)*/;
}
}
class UserInformationElement implements InformationElement<UserRepository>{
@Override
public String getValue(UserRepository param) {
return /*same code as in getUsersName(myDBRepository)*/;
}
}
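A quick usage sketch (myFile and myDBRepository stand for the objects from the question):
InformationElement<File> lineInfo = new NbFileLineInformationElement();
InformationElement<UserRepository> userInfo = new UserInformationElement();
display(lineInfo.getValue(myFile));
display(userInfo.getValue(myDBRepository));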

Type safe tunneling of user data between method calls

I am developing an API allowing users to transfer data using a certain protocol. Throughout the communication, two events occur: EventA and EventB. B is optional, but strongly related to A. They occur in the sequence (AB?)*. These events are exposed to the user as a hook call to an interface:
interface IEventHandler {
void eventAOccured(EventAData aData);
void eventBOccured(EventBData bData);
}
Now I want the user to be able to pass some data about event A to the hook of event B while keeping the interface stateless. First I thought something like
interface IEventHandler<U> {
U eventAOccured(EventAData aData);
void eventBOccured(EventBData bData, U userData);
}
Unfortunately, as generics do not offer runtime information (not without reflection, at least), the API has no way to call eventBOccured, as the type of its second parameter is not known at compile time. Introducing a marker interface and U extends IMarker solves this, but does not spare the upcast I wanted to avoid. IMHO, if I went with the upcast I could simply pass Object and get the same thing.
I am pretty sure (Java) generics are the wrong tool here. Am I missing something? How would you tackle the problem?
I would do it this way:
package com.stackoverflow.user3590895.questions24098455;
public class EventData implements IEventData{
//default is type A
private int type=IEventData.TYPE_A;
@Override
public int getType() {
System.out.print("Event [type:"+type+"]!");
return type;
}
}
//-------------------------------
package com.stackoverflow.user3590895.questions24098455;
public interface IEventData {
public static char TYPE_A='A';
public static char TYPE_B='B';
public int getType();
}
//-------------------------------
package com.stackoverflow.user3590895.questions24098455;
public interface IEventHandler<U> {
U eventOccured(IEventData data);
}
Use case:
- raise an event of your choice
- the API catches the event and can determine its type by calling getType()
- for an event of type B, inherit from EventData and create a new EventDataB class; you can add B-specific methods and data to it
Give me feedback.
If I understand correctly, you could use a wrapper around your data containing its class definition. Something like:
class Wrapper<U extends IMarker> {
U data;
Class<U> clazz;
// constructor, getters and setters omitted
}
Your event interface would be
interface IEventHandler<U extends IMarker> {
Wrapper<U> eventAOccured(EventAData aData);
void eventBOccured(EventBData bData, Wrapper<U> userData);
}
That way, the Wrapper class is known, and the API can call it without problems. And in eventBOccured you have the right type for your upcast.
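A rough sketch of how the API side could then thread the wrapper from event A to event B (the drive method is hypothetical, not part of the answer):
<U extends IMarker> void drive(IEventHandler<U> handler, EventAData aData, EventBData bData) {
    Wrapper<U> userData = handler.eventAOccured(aData); // event A produces the user data
    handler.eventBOccured(bData, userData);             // event B gets it back with the right type
}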

Do we need an interface/contract if we cannot generalize method parameters

I want to create an interface having two methods, say uploadFile and downloadFile. While I only need the implementors to implement these two methods, I am not sure about, and don't want to dictate, what arguments these methods need to take. I mean, different implementors may ask for different parameters. In that case, should I still go ahead and create an interface, making the above methods var-arg methods like below?
boolean uploadFile(Object ... parameters)
OutputStream downloadFile(Object ... parameters)
Or is there an even better approach? Is it even right to create an interface if I cannot generalize the method parameters? I am only sure about the method names and, say, the return types.
This might be a use case for generics. Consider the following arrangement of classes - here we define an abstract "parameter" type and reference this in the interface. Concrete classes work with a particular parameter set.
abstract class HandlerParams {
}
interface FileHandler<T extends HandlerParams> {
boolean uploadFile(T parameters);
OutputStream downloadFile(T parameters);
}
Example implementations:
class URLParams extends HandlerParams {
// whatever...
}
class URLFileHandler implements FileHandler<URLParams> {
@Override
public boolean uploadFile(URLParams parameters) {
// ...
}
@Override
public OutputStream downloadFile(URLParams parameters) {
// ...
}
}
I must admit, I'm struggling to imagine scenarios where this arrangement would be that helpful. I suppose you could have something that works with file handlers, but it feels a little artificial:
class SomethingThatUsesFileHandlers {
public <T extends HandlerParams> void doSomething(FileHandler<T> handler,
T params) {
handler.downloadFile(params);
}
}
If you have to call with different parameter types / counts based on the implementor's type, you have two common choices:
- Generalize the parameters themselves into a separate type. This helps you unify the interface at the cost of static type checking.
- Forego the interface altogether. If you need static type checking, the choice that you suggest (leaving the interface out) is valid.
Here is how you implement the first approach:
interface HandlerParameters {
void setValue(String name, Object value);
Object getValue(String name);
String[] getNames();
}
interface UploadDownloadHandler {
boolean uploadFile(HandlerParameters parameters);
OutputStream downloadFile(HandlerParameters parameters);
HandlerParameters makeParameters();
}
The caller can call makeParameters to make an empty parameter block, populate parameter values as needed, and proceed to calling uploadFile or downloadFile.
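A sketch of that calling sequence (the parameter names are purely illustrative):
UploadDownloadHandler handler = ...; // some concrete implementation
HandlerParameters params = handler.makeParameters();
params.setValue("remotePath", "/uploads/report.pdf");
params.setValue("overwrite", Boolean.TRUE);
boolean ok = handler.uploadFile(params);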
I think this is still OK as you at least have the uploadFile and downloadFile methods defined in your contract. But it allows too many possibilities because you define Object... as parameters of the two methods. Maybe a better approach is to define a few concrete options for these parameters and stick to them. You can do this through several overloaded versions of these two methods e.g.
boolean uploadFile(File)
or
boolean uploadFile(File...)
or
boolean uploadFile(File[])
and then do the same for the
downloadFile method.
Perhaps you should use a generic interface?
public interface XXXX<T> {
boolean uploadFile(T... parameters);
OutputStream downloadFile(T... parameters);
}

Type-safe method reflection in Java

Is there any practical way to reference a method on a class in a type-safe manner? A basic example is if I wanted to create something like the following utility function:
public Result validateField(Object data, String fieldName,
ValidationOptions options) { ... }
In order to call it, I would have to do:
validateField(data, "phoneNumber", options);
Which forces me to either use a magic string, or declare a constant somewhere with that string.
I'm pretty sure there's no way to get around that with the stock Java language, but is there some kind of (production grade) pre-compiler or alternative compiler that may offer a work around? (similar to how AspectJ extends the Java language) It would be nice to do something like the following instead:
public Result validateField(Object data, Method method,
ValidationOptions options) { ... }
And call it with:
validateField(data, Person.phoneNumber.getter, options);
As others mention, there is no real way to do this... and I've not seen a precompiler that supports it. The syntax would be interesting, to say the least. Even in your example, it could only cover a small subset of the potential reflective possibilities that a user might want, since it won't handle non-standard accessors or methods that take arguments, etc.
Even if it's impossible to check at compile time, if you want bad code to fail as soon as possible then one approach is to resolve referenced Method objects at class initialization time.
Imagine you have a utility method for looking up Method objects that maybe throws error or runtime exception:
public static Method lookupMethod(Class<?> c, String name, Class<?>... args) {
// do the lookup or throw an unchecked exception with a really good error message
try { return c.getMethod(name, args); }
catch (NoSuchMethodException e) { throw new IllegalArgumentException("No method " + name + " on " + c.getName(), e); }
}
Then in your classes, have constants to preresolve the methods you will use:
public class MyClass {
private static final Method GET_PHONE_NUM = MyUtils.lookupMethod( PhoneNumber.class, "getPhoneNumber" );
....
public void someMethod() {
validateField(data, GET_PHONE_NUM, options);
}
}
At least then it will fail as soon as MyClass is loaded the first time.
I use reflection a lot, especially bean property reflection and I've just gotten used to late exceptions at runtime. But that style of bean code tends to error late for all kinds of other reasons, being very dynamic and all. For something in between, the above would help.
There isn't anything in the language yet - but part of the closures proposal for Java 7 includes method literals, I believe.
I don't have any suggestions beyond that, I'm afraid.
Check out https://proxetta.jodd.org/refs/methref. It uses the Jodd proxy library (Proxetta) to proxy your type. Not sure about its performance characteristics, but it does provide type safety.
An example: Suppose Str.class has method .boo(), and you want to get its name as the string "boo":
String methodName = Methref.of(Str.class).name(Str::boo);
There's more to the API than the example above: https://oblac.github.io/jodd-site/javadoc/jodd/methref/Methref.html
Is there any practical way to reference a method on a class in a type-safe manner?
First of all, reflection is type-safe. It is just that it is dynamically typed, not statically typed.
So, assuming that you want a statically typed equivalent of reflection, the theoretical answer is that it is impossible. Consider this:
Method m;
if (arbitraryFunction(obj)) {
m = obj.getClass().getDeclaredMethod("foo", ...);
} else {
m = obj.getClass().getDeclaredMethod("bar", ...);
}
Can we do this so that runtime type exceptions cannot happen? In general NO, since this would entail proving that arbitraryFunction(obj) terminates. (This is equivalent to the Halting Problem, which is proven to be unsolvable in general, and is intractable using state-of-the-art theorem proving technology ... AFAIK.)
And I think that this road-block would apply to any approach where you could inject arbitrary Java code into the logic that is used to reflectively select a method from an object's class.
To my mind, the only moderately practical approach at the moment would be to replace the reflective code with something that generates and compiles Java source code. If this process occurs before you "run" the application, you've satisfied the requirement for static type-safety.
I was more asking about reflection in which the result is always the same, i.e. Person.class.getMethod("getPhoneNumber", null) would always return the same method, and it's entirely possible to resolve it at compile time.
What happens if after compiling the class containing this code, you change Person to remove the getPhoneNumber method?
The only way you can be sure that you can resolve getPhoneNumber reflectively is if you can somehow prevent Person from being changed. But you can't do that in Java. Runtime binding of classes is a fundamental part of the language.
(For the record, if you did that for a method that you called non-reflectively, you would get an IncompatibleClassChangeError of some kind when the two classes were loaded ...)
It has been pointed out that in Java 8 and later you could declare your validator something like this:
public Result validateField(Object data,
SomeFunctionalInterface function,
ValidationOptions options) { ... }
where SomeFunctionalInterface corresponds to the (loosely speaking) common signature of the methods you are validating.
Then you can call it with a method reference; e.g.
validateField(data, SomeClass::someMethod, options)
This approach is statically type-safe. You will get a compilation error if SomeClass doesn't have someMethod or if it doesn't conform to SomeFunctionalInterface.
But you can't use a string to denote the method name. Looking up a method by name would entail either reflection ... or something else that side-steps static (i.e. compile time / load time) type safety.
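For example, if the fields being validated are exposed via ordinary getters, java.util.function.Function can play the role of SomeFunctionalInterface. A sketch (Result.ok() and the validation body are placeholders, not part of the answer):
public <T, R> Result validateField(T data, Function<T, R> getter, ValidationOptions options) {
    R value = getter.apply(data); // statically type-checked access to the field
    // ... validate value against options ...
    return Result.ok();           // placeholder result
}

// call site, checked by the compiler:
Result r = validateField(person, Person::getPhoneNumber, options);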
Java lacks the syntactic sugar to do something as nice as Person.phoneNumber.getter. But if Person is an interface, you could record the getter method using a dynamic proxy. You could record methods on non-final classes as well using CGLib, the same way Mockito does it.
MethodSelector<Person> selector = new MethodSelector<Person>(Person.class);
selector.select().getPhoneNumber();
validateField(data, selector.getMethod(), options);
Code for MethodSelector: https://gist.github.com/stijnvanbael/5965609
Inspired by mocking frameworks, we could dream up the following syntax:
validator.validateField(data, options).getPhoneNumber();
Result validationResult = validator.getResult();
The trick is the generic declaration:
class Validator {
public <T> T validateField(T data, ValidationOptions options) {...}
}
Now the return type of the method is the same as your data object's type and you can use code completion (and static checking) to access all the methods, including the getter methods.
As a downside, the code isn't quite intuitive to read, since the call to the getter doesn't actually get anything, but instead instructs the validator to validate the field.
Another possible option would be to annotate the fields in your data class:
class FooData {
@Validate(new ValidationOptions(...))
private PhoneNumber phoneNumber;
}
And then just call:
FooData data;
validator.validate(data);
to validate all fields according to the annotated options.
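Note that annotation values must be compile-time constants, so the options would have to be expressed as annotation attributes rather than a constructor call. Roughly (the attribute names are made up for illustration):
import java.lang.annotation.*;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface Validate {
    boolean required() default true;
    String pattern() default ""; // e.g. a regex the field's string form must match
}

class FooData {
    @Validate(pattern = "\\+?[0-9 ]+")
    private PhoneNumber phoneNumber;
}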
The framework picklock lets you do the following:
class Data {
private PhoneNumber phoneNumber;
}
interface OpenData {
PhoneNumber getPhoneNumber(); //is mapped to the field phoneNumber
}
Object data = new Data();
PhoneNumber number = ObjectAccess
.unlock(data)
.features(OpenData.class)
.getPhoneNumber();
This works in a similar way for setters and private methods. Of course, this is only a wrapper for reflection, but the exception occurs at unlocking time, not at call time. If you need it at build time, you could write a unit test with:
assertThat(Data.class, providesFeaturesOf(OpenData.class));
I found a way to get the Method instance using Lambdas. It works only on interface methods though currently.
It works using net.jodah:typetools which is a very lightweight library.
https://github.com/jhalterman/typetools
public final class MethodResolver {
private interface Invocable<I> {
void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable;
}
interface ZeroParameters<I, R> extends Invocable<I> {
R invoke(I instance) throws Throwable;
@Override
default void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable {
invoke(instance);
}
}
public static <I, R> Method toMethod0(ZeroParameters<I, R> call) {
return toMethod(ZeroParameters.class, call, 1);
}
interface OneParameters<I, P1, R> extends Invocable<I> {
R invoke(I instance, P1 p1) throws Throwable;
@Override
default void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable {
invoke(instance, param(parameterTypes[1]));
}
}
public static <I, P1, R> Method toMethod1(OneParameters<I, P1, R> call) {
return toMethod(OneParameters.class, call, 2);
}
interface TwoParameters<I, P1, P2, R> extends Invocable<I> {
R invoke(I instance, P1 p1, P2 p2) throws Throwable;
@Override
default void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable {
invoke(instance, param(parameterTypes[1]), param(parameterTypes[2]));
}
}
public static <I, P1, P2, R> Method toMethod2(TwoParameters<I, P1, P2, R> call) {
return toMethod(TwoParameters.class, call, 3);
}
private static final Map<Class<?>, Object> parameterMap = new HashMap<>();
static {
parameterMap.put(Boolean.class, false);
parameterMap.put(Byte.class, (byte) 0);
parameterMap.put(Short.class, (short) 0);
parameterMap.put(Integer.class, 0);
parameterMap.put(Long.class, (long) 0);
parameterMap.put(Float.class, (float) 0);
parameterMap.put(Double.class, (double) 0);
}
@SuppressWarnings("unchecked")
private static <T> T param(Class<?> type) {
return (T) parameterMap.get(type);
}
private static <I> Method toMethod(Class<?> callType, Invocable<I> call, int responseTypeIndex) {
Class<?>[] typeData = TypeResolver.resolveRawArguments(callType, call.getClass());
Class<?> instanceClass = typeData[0];
Class<?> responseType = responseTypeIndex != -1 ? typeData[responseTypeIndex] : Void.class;
AtomicReference<Method> ref = new AtomicReference<>();
I instance = createProxy(instanceClass, responseType, ref);
try {
call.invokeWithParams(instance, typeData);
} catch (final Throwable e) {
throw new IllegalStateException("Failed to call no-op proxy", e);
}
return ref.get();
}
@SuppressWarnings("unchecked")
private static <I> I createProxy(Class<?> instanceClass, Class<?> responseType,
AtomicReference<Method> ref) {
return (I) Proxy.newProxyInstance(MethodResolver.class.getClassLoader(),
new Class[] {instanceClass},
(proxy, method, args) -> {
ref.set(method);
return parameterMap.get(responseType);
});
}
}
Usage:
Method method = MethodResolver.toMethod2(SomeIFace::foobar);
System.out.println(method); // public abstract example.Result example.SomeIFace.foobar(java.lang.String,boolean)
Method get = MethodResolver.<Supplier, Object>toMethod0(Supplier::get);
System.out.println(get); // public abstract java.lang.Object java.util.function.Supplier.get()
Method accept = MethodResolver.<IntFunction, Integer, Object>toMethod1(IntFunction::apply);
System.out.println(accept); // public abstract java.lang.Object java.util.function.IntFunction.apply(int)
Method apply = MethodResolver.<BiFunction, Object, Object, Object>toMethod2(BiFunction::apply);
System.out.println(apply); // public abstract java.lang.Object java.util.function.BiFunction.apply(java.lang.Object,java.lang.Object)
Unfortunately you have to create a new interface and method based on the parameter count and whether the method returns void or not.
However, if you have a somewhat fixed/limited method signature/parameter types, then this becomes quite handy.

Higher-kinded generics in Java

Suppose I have the following class:
public class FixExpr {
Expr<FixExpr> in;
}
Now I want to introduce a generic argument, abstracting over the use of Expr:
public class Fix<F> {
F<Fix<F>> in;
}
But Eclipse doesn't like this:
The type F is not generic; it cannot be parametrized with arguments <Fix<F>>
Is this possible at all or have I overlooked something that causes this specific instance to break?
Some background information: in Haskell this is a common way to write generic functions; I'm trying to port this to Java. The type argument F in the example above has kind * -> * instead of the usual kind *. In Haskell it looks like this:
newtype Fix f = In { out :: f (Fix f) }
I think what you're trying to do is simply not supported by Java generics. The simpler case of
public class Foo<T> {
public T<String> bar() { return null; }
}
also does not compile using javac.
Since Java does not know at compile-time what T is, it can't guarantee that T<String> is at all meaningful. For example if you created a Foo<BufferedImage>, bar would have the signature
public BufferedImage<String> bar()
which is nonsensical. Since there is no mechanism to force you to only instantiate Foos with generic Ts, it refuses to compile.
Maybe you can try Scala, a functional language running on the JVM that supports higher-kinded generics.
[ EDIT by Rahul G ]
Here's how your particular example roughly translates to Scala:
trait Expr[+A]
trait FixExpr {
val in: Expr[FixExpr]
}
trait Fix[F[_]] {
val in: F[Fix[F]]
}
In order to pass a type parameter, the type definition has to declare that it accepts one (it has to be generic). Apparently, your F is not a generic type.
UPDATE: The line
F<Fix<F>> in;
declares a variable of type F which accepts a type parameter, the value of which is Fix, which itself accepts a type parameter, the value of which is F. F isn't even defined in your example. I think you may want
Fix<F> in;
That will give you a variable of type Fix (the type you did define in your example) to which you are passing a type parameter with value F. Since Fix is defined to accept a type parameter, this works.
UPDATE 2: Reread your title, and now I think you might be trying to do something similar to the approach presented in "Towards Equal Rights for Higher-Kinded Types" (PDF alert). If so, Java doesn't support that, but you might try Scala.
Still, there are ways to encode higher-kinded generics in Java. Please have a look at the higher-kinded-java project.
Using this as a library, you can modify your code like this:
public class Fix<F extends Type.Constructor> {
Type.App<F, Fix<F>> in;
}
You should probably add an @GenerateTypeConstructor annotation to your Expr class:
@GenerateTypeConstructor
public class Expr<S> {
// ...
}
This annotation generates ExprTypeConstructor class.
Now you can process your Fix of Expr like this:
class Main {
void run() {
runWithTyConstr(ExprTypeConstructor.get);
}
<E extends Type.Constructor> void runWithTyConstr(ExprTypeConstructor.Is<E> tyConstrKnowledge) {
Expr<Fix<E>> one = Expr.lit(1);
Expr<Fix<E>> two = Expr.lit(2);
// convertToTypeApp method is generated by annotation processor
Type.App<E, Fix<E>> oneAsTyApp = tyConstrKnowledge.convertToTypeApp(one);
Type.App<E, Fix<E>> twoAsTyApp = tyConstrKnowledge.convertToTypeApp(two);
Fix<E> oneFix = new Fix<>(oneAsTyApp);
Fix<E> twoFix = new Fix<>(twoAsTyApp);
Expr<Fix<E>> addition = Expr.add(oneFix, twoFix);
process(addition, tyConstrKnowledge);
}
<E extends Type.Constructor> void process(
Fix<E> fixedPoint,
ExprTypeConstructor.Is<E> tyConstrKnowledge) {
Type.App<E, Fix<E>> inTyApp = fixedPoint.getIn();
// convertToExpr method is generated by annotation processor
Expr<Fix<E>> in = tyConstrKnowledge.convertToExpr(inTyApp);
for (Fix<E> subExpr: in.getSubExpressions()) {
process(subExpr, tyConstrKnowledge);
}
}
}
It looks as if you may want something like:
public class Fix<F extends Fix<F>> {
private F in;
}
(See the Enum class, and questions about its generics.)
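A tiny illustration of that self-bounded pattern, analogous to how java.lang.Enum<E extends Enum<E>> is declared (IntExpr is just a made-up subclass):
// each concrete subclass supplies itself as the type argument
class IntExpr extends Fix<IntExpr> {
    int value;
}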
There is a roundabout way to encode higher kinded types in Java as pointed out by Victor. The gist of it is to introduce a type H<F, T> to encode F<T>. This can then be used to encode fixed point of functors (i.e. Haskell's Fix type):
public interface Functor<F, T> {
<R> H<F, R> map(Function<T, R> f);
}
public static record Fix<F extends H<F, T> & Functor<F, T>, T>(F f) {
public Functor<F, Fix<F, T>> unfix() {
return (Functor<F, Fix<F, T>>) f;
}
}
From here you can go on and implement catamorphisms over initial algebras:
public interface Algebra<F, T> extends Function<H<F, T>, T> {}
public static <F extends H<F, T> & Functor<F, T>, T> Function<Fix<F, T>, T> cata(Algebra<F, T> alg) {
return fix -> alg.apply(fix.unfix().map(cata(alg)));
}
See my GitHub repo for working code, including some example algebras. (Note: IDEs like IntelliJ struggle with the code, although it compiles and runs just fine with Java 15.)
