Java bean mapper: expected capture but provided Object

Please note: even though I mention Dozer in this question, I believe it's really just a pure Java generics question at heart. There may be a Dozer-specific solution out there, but I think anyone with a strong working knowledge of Java (11) generics/captures/erasure should be able to help me out!
Java 11 and Dozer here. Dozer is great for applying default bean-mapping rules to fields with matching names, but any time you have specialized, custom mapping logic you need to implement a Dozer CustomConverter and register it. That would be great, except the Dozer API for CustomConverter isn't genericized, is monolithic, and leads to nasty code like this:
public class MyMonolithicConverter implements CustomConverter {
    @Override
    public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
        if (sourceClass.isAssignableFrom(Widget.class)) {
            Widget widget = (Widget) source;
            if (destinationClass.isAssignableFrom(Fizz.class)) {
                Fizz fizz = (Fizz) destination;
                // write code for mapping widget -> fizz here
            } else if (destinationClass.isAssignableFrom(Foo.class)) {
                // write code for mapping widget -> foo here
            }
            // ... etc.
        } else if (sourceClass.isAssignableFrom(Baz.class)) {
            // write all the if-else-ifs and mappings for baz -> ??? here
        }
    }
}
So again: monolithic, not genericized and leads to large, complex nested if-else-if blocks. Eek.
I'm trying to make this a wee bit more palatable:
public abstract class BeanMapper<SOURCE, TARGET> {
    private final Class<SOURCE> sourceClass;
    private final Class<TARGET> targetClass;

    protected BeanMapper(Class<SOURCE> sourceClass, Class<TARGET> targetClass) {
        this.sourceClass = sourceClass;
        this.targetClass = targetClass;
    }

    public abstract TARGET map(SOURCE source);

    public boolean matches(Class<?> otherSourceClass, Class<?> otherTargetClass) {
        return sourceClass.equals(otherSourceClass) && targetClass.equals(otherTargetClass);
    }
}
Then, an example of it in action:
public class SignUpRequestToAccountMapper extends BeanMapper<SignUpRequest, Account> {
    private final PlaintextEncrypter encrypter;

    public SignUpRequestToAccountMapper(PlaintextEncrypter encrypter) {
        super(SignUpRequest.class, Account.class);
        this.encrypter = encrypter;
    }

    @Override
    public Account map(SignUpRequest signUpRequest) {
        return Account.builder()
            .username(signUpRequest.getRequestedName())
            .email(signUpRequest.getEmailAddr())
            .givenName(signUpRequest.getFirstName())
            .surname(signUpRequest.getLastName())
            .dob(DateUtils.toDate(signUpRequest.getBirthDate()))
            .passwordEnc(encrypter.saltPepperAndEncrypt(signUpRequest.getPasswordPlaintext()))
            .build();
    }
}
And now a way to invoke the correct source -> target mapper from inside my Dozer converter:
public class DozerConverter implements CustomConverter {
    private Set<BeanMapper<?, ?>> beanMappers;

    @Override
    public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
        BeanMapper<?, ?> mapper = beanMappers.stream()
            .filter(beanMapper -> beanMapper.matches(sourceClass, destinationClass))
            .findFirst()
            .orElseThrow();
        // compiler error here:
        return mapper.map(source);
    }
}
I really like this design/API approach; however, I get a compiler error on that mapper.map(source) line at the very end:
"Required type: capture of ?; Provided: Object"
What can I do to fix this compiler error? I'm not married to this API/approach, but I do like the simplicity it adds over the MyMonolithicConverter example above, which is the approach Dozer sort of forces on you. It is important to note that I am using Dozer elsewhere for simple bean mappings so I would prefer to use a CustomConverter impl and leverage Dozer for this instead of bringing in a whole other dependency/library for these custom/complex mappings. If Dozer offers a different solution I might be happy with that as well. Otherwise I just need to fix this capture issue. Thanks for any help here!

The issue seems to come from the beanMappers. You have a set of mappers of various types. The compiler cannot infer what types the found mapper will have.
You can make the compiler believe you by casting the result and suppressing the warning it gives you.
Casting to <?,?> isn't going to help, so I've added type parameters to the convert method. At least it can then be assumed that when you get a BeanMapper<S,T>, map will indeed return a T for an S source.
class DozerConverter {
    private Set<BeanMapper<Object, Object>> beanMappers;

    public <S, T> T convert(S source, Class<?> destinationClass, Class<?> sourceClass) {
        @SuppressWarnings("unchecked")
        BeanMapper<S, T> mapper = (BeanMapper<S, T>) beanMappers.stream()
            .filter(beanMapper -> beanMapper.matches(sourceClass, destinationClass))
            .findFirst()
            .orElseThrow();
        return mapper.map(source);
    }
}
I'm afraid you're going to have to call it like so:
TARGET-TYPE target = dozerConverter.<SOURCE-TYPE,TARGET-TYPE>convert(...);
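If you would rather keep implementing Dozer's CustomConverter interface, so the converter can still be registered with Dozer as-is, a minimal sketch of the same idea is to keep the original convert signature and confine the unchecked cast to its body (the class name here is illustrative; matches() is what makes the cast safe at runtime):
public class DelegatingDozerConverter implements CustomConverter {

    private final Set<BeanMapper<?, ?>> beanMappers;

    public DelegatingDozerConverter(Set<BeanMapper<?, ?>> beanMappers) {
        this.beanMappers = beanMappers;
    }

    @Override
    @SuppressWarnings("unchecked")
    public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
        BeanMapper<Object, Object> mapper = (BeanMapper<Object, Object>) beanMappers.stream()
            .filter(m -> m.matches(sourceClass, destinationClass))
            .findFirst()
            .orElseThrow(() -> new IllegalStateException(
                "No BeanMapper registered for " + sourceClass + " -> " + destinationClass));
        return mapper.map(source);
    }
}
Since the caller only ever sees Object going in and Object coming out, no explicit type arguments are needed at the call site.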

Related

Generic overload at compile-time

Is it possible to design a way to call different method overloads at compile time?
Let's say I have this little class:
@RequiredArgsConstructor
public class BaseValidator<T> {
    private final T newValue;
}
Now I need methods that return different objects (depending on T), like this:
private StringValidator getValidator() {
    return new StringValidator(newValue);
}

private IntegerValidator getValidator() {
    return new IntegerValidator(newValue);
}
In the end, I want a call hierarchy that is very fluent and looks like this:
new BaseValidator("string")
.getValidator() // which returns now at compile-time a StringValidator
.checkIsNotEmpty();
//or
new BaseValidator(43)
.getValidator() // which returns now a IntegerValidator
.checkIsBiggerThan(42);
And in my "real" case (I have a very specific way to update objects and a lot of conditions for every object, so the chance of a copy-and-paste issue is very high; the wizard forces all developers to implement it in exactly this way):
I tried different ways: complex generics inside the validators, or playing around with the generics. My last approach looks like this:
public <C> C getValidator() {
    return (C) getValidation(newValue);
}

private ValidationString getValidation(String newValue) {
    return new ValidationString(newValue);
}

private ValidationInteger getValidation(Integer newValue) {
    return new ValidationInteger(newValue);
}
What is the trick?
// Edit: I want this at compile time, not with instanceof checks at runtime.
What is the trick?
Not to do it like this.
Provide static factory methods:
class BaseValidator<T> {
static ValidationString getValidation(String newValue) {
return new ValidationString(newValue);
}
static ValidationInteger getValidation(Integer newValue) {
return new ValidationInteger(newValue);
}
}
class ValidationString extends BaseValidator<String> { ... }
class ValidationInteger extends BaseValidator<Integer> { ... }
Although I consider this to be odd: you are referring to subclasses inside the base class. Such cyclical dependencies make the code hard to work with, especially when it comes to refactoring, but also perhaps in initialization.
Instead, I would suggest creating a utility class to contain the factory methods:
class Validators {
private Validators() {}
static ValidationString getValidation(String newValue) {
return new ValidationString(newValue);
}
static ValidationInteger getValidation(Integer newValue) {
return new ValidationInteger(newValue);
}
}
which has no such cycles.
A really important thing to realize about generics is that they are nothing more than a way of making explicit casts implicit (and then checking that all of these implicit casts are type-safe).
In other words, this:
List<String> list = new ArrayList<>();
list.add("foo");
System.out.println(list.get(0).length());
is just a nicer way of writing:
List list = new ArrayList();
list.add((String) "foo");
System.out.println(((String) list.get(0)).length());
Whilst <String> looks like it is part of the type, it is basically just an instruction to the compiler to squirt in a load of casts.
Generic classes with different type parameters all have the same methods. This is the specific difficulty in your approach: you can't make the BaseValidator<String>.getValidator() return something with a checkIsNotEmpty method (only), and the BaseValidator<Integer>.getValidator() return something with a checkIsGreaterThan method (only).
Well, it isn't quite true to say you can't. With your attempt involving the method-scoped type variable (<C> C getValidator()), you can write:
new BaseValidator<>("string").<StringValidator>getValidator().checkIsNotEmpty()
(assuming StringValidator has the checkIsNotEmpty method on it)
But:
Let's not mince words: it is ugly.
Worse than being ugly, it isn't type safe. You can equally write:
new BaseValidator<>("string").getValidator().checkIsGreaterThan(42)
which is nonsensical, but allowed by the compiler. The problem is that the return type is chosen at the call site: you will either have to return null (and get a NullPointerException when you try to invoke the following method); or return some non-null value and risk a ClassCastException. Either way: not good.
What you can do, however, is to make a generic validator a parameter of the method call. For example:
interface Validator<T> {
    boolean validate(T value);
}

class BaseValidator<T> {
    private final T value;

    BaseValidator(T value) {
        this.value = value;
    }

    BaseValidator<T> validate(Validator<T> v) {
        v.validate(this.value); // real code would collect failures or throw here
        return this;
    }
}
And invoke like so, demonstrating how you can chain method calls to apply multiple validations:
new BaseValidator<>("")
.validate(s -> !s.isEmpty())
.validate(s -> s.matches("pattern"))
...
new BaseValidator<>(123)
.validate(v -> v >= 0)
...
We decided to add more class steps. You can go the generic way or a way with explicit types (in these examples, String). Our requirements for all update methods (we have many database objects ...) are a little complicated. We want only one update method (for each DB object), which:
Ignores fields that are null.
Ignores fields that are equal to the "old" value.
Validates the not-ignored fields.
Saves only when no validation issues occur.
Doing that with many if-blocks is possible, but not really readable, and copy-paste mistakes have a high probability.
Our code look like this:
private void update(@NonNull final User.UpdateFinalStep params) {
    UpdateWizard.update(dbUserService.get(params.getId()))
        .field(params.getStatus())
            .withGetter(DbUser::getAccountStatus)
            .withSetter(DbUser::setAccountStatus)
            .finishField()
        .field(Optional.ofNullable(params.getUsername())
                .map(String::toLowerCase)
                .orElse(null))
            .withGetter(DbUser::getUsername)
            .withSetter(DbUser::setUsername)
            .beginValidationOfField(FieldName.USERNAME)
                .notEmptyAndMatchPattern(USERNAME_PATTERN, () -> this.checkUniqueUsername(params.getUsername(), params.getId()))
            .endValidation()
        .field(params.getLastName())
            .withGetter(DbUser::getLastname)
            .withSetter(DbUser::setLastname)
            .beginValidationOfField(FieldName.USER_LASTNAME)
                .notEmptyAndMatchPattern(LAST_NAME_PATTERN)
            .endValidation()
        .field(params.getFirstName())
            .withGetter(DbUser::getFirstname)
            .withSetter(DbUser::setFirstname)
            .beginValidationOfField(FieldName.USER_FIRSTNAME)
                .notEmptyAndMatchPattern(FIRST_NAME_PATTERN)
            .endValidation()
        .save(dbUserService::save);
}
This is very readable and allows adding a new field in a very simple way. With the generics, we don't give a developer the chance to make a mistake.
Note that accountStatus and username point to different classes.
In the end, we can use the update method in a very fluent way:
userService.startUpdate()
.withId(currentUserId)
.setStatus(AccountStatus.INACTIVE)
.finallyUpdate();

How to implement a Gson equivalent of @JsonUnwrapped

I know Gson doesn't come with a similar feature, but is there a way to add support for unwrapping JSON fields the way @JsonUnwrapped does?
The goal is to allow a structure like:
public class Person {
public int age;
public Name name;
}
public class Name {
public String first;
public String last;
}
to be (de)serialized as:
{
"age" : 18,
"first" : "Joey",
"last" : "Sixpack"
}
instead of:
{
"age" : 18,
"name" : {
"first" : "Joey",
"last" : "Sixpack"
}
}
I understand it could get fairly complex, so I'm not looking for a full solution, just some high-level guidelines if this is even doable.
I've made a crude implementation of a deserializer that supports this. It is fully generic (type-independent), but also expensive and fragile, and I will not be using it for anything serious. I am posting it only to show others what I've got, in case they end up needing to do something similar.
public class UnwrappingDeserializer implements JsonDeserializer<Object> {

    // This Gson needs to be identical to the global one, sans this deserializer, to prevent infinite recursion
    private Gson delegate;

    public UnwrappingDeserializer(Gson delegate) {
        this.delegate = delegate;
    }

    @Override
    public Object deserialize(JsonElement json, Type type, JsonDeserializationContext context) throws JsonParseException {
        Object def = delegate.fromJson(json, type); // Gson doesn't care about unknown fields
        Class<?> raw = GenericTypeReflector.erase(type);
        Set<Field> unwrappedFields = ClassUtils.getAnnotatedFields(raw, GsonUnwrap.class);
        for (Field field : unwrappedFields) {
            AnnotatedType fieldType = GenericTypeReflector.getExactFieldType(field, type);
            field.setAccessible(true);
            try {
                Object fieldValue = deserialize(json, fieldType.getType(), context);
                field.set(def, fieldValue);
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
        return def;
    }
}
It can then be registered globally via new GsonBuilder().registerTypeHierarchyAdapter(Object.class, new UnwrappingDeserializer(new Gson())).create() or for a specific type via registerTypeAdapter.
Notes:
A real implementation should recursively check the entire class structure for the presence of GsonUnwrap, cache the result in a concurrent map, and only go through this procedure if it needs to. Otherwise it should just return def immediately
It should also cache discovered annotated fields to avoid scanning the hierarchy each time
GenericTypeReflector is coming from GeAnTyRef
ClassUtils#getAnnotatedFields is my own implementation, but it doesn't do anything special - it just gathers declared fields (via Class#getDeclaredFields) recursively for the class hierarchy
GsonUnwrap is just a simple custom annotation
I presume a similar thing can be done for serialization as well. Examples linked from Derlin's answer can be a starting point.
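For what it's worth, a serialization counterpart could follow the same pattern. The sketch below is untested and reuses the same assumed pieces (the GsonUnwrap annotation, GenericTypeReflector, and ClassUtils.getAnnotatedFields); it also assumes the Java field name matches the serialized property name:
public class UnwrappingSerializer implements JsonSerializer<Object> {

    // Like the deserializer, this delegate must not have this serializer registered, to prevent infinite recursion
    private Gson delegate;

    public UnwrappingSerializer(Gson delegate) {
        this.delegate = delegate;
    }

    @Override
    public JsonElement serialize(Object src, Type type, JsonSerializationContext context) {
        JsonObject root = delegate.toJsonTree(src, type).getAsJsonObject();
        Class<?> raw = GenericTypeReflector.erase(type);
        for (Field field : ClassUtils.getAnnotatedFields(raw, GsonUnwrap.class)) {
            // pull the nested object out and copy its members up one level
            JsonElement nested = root.remove(field.getName());
            if (nested != null && nested.isJsonObject()) {
                for (Map.Entry<String, JsonElement> entry : nested.getAsJsonObject().entrySet()) {
                    root.add(entry.getKey(), entry.getValue());
                }
            }
        }
        return root;
    }
}
It would be registered the same way as the deserializer above.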
Currently there is no easy way to do that. Here are some pointers and alternative ways to make it work anyway.
GsonFire: GsonFire implements some useful features missing from Gson. While it does not yet offer automatic wrapping/unwrapping, it may be a good starting point for creating your custom logic.
If you only need serialization, you can add getters for first and last in Person and use @ExposeMethodResult to serialize them. Unfortunately, setters are not supported (cf. Is it possible to use setters when Gson deserializes a JSON?).
Another way to support serialization is to follow the advice from How to move fields to parent object.
Custom TypeAdapters: one of the only ways to support both serialization and deserialization is to create custom TypeAdapters. This won't be generic, but it will suit your use case.
The thread Serialize Nested Object as Attributes already gives you examples, so I won't repeat them here.

Associating a generic type with Enum in Java

I am creating a store for user preferences, and there are a fixed number of preferences that users can set values for. The names of the preferences (settings) are stored as an Enum:
public enum UserSettingName {
FOO,
BAR,
ETC
}
What I would like to be able to do is store a value type with the name so that the service will store the user's value with the correct Java type. For example, FOO might be a Long, and BAR might be a String. Up until now, we were storing all values as Strings, and then manually casting the values into the appropriate Java type. This has led to try/catch blocks everywhere, when it makes more sense to have only one try/catch in the service. I understand that Enums cannot have generic types, so I have been playing around with:
public enum UserSettingName {
    FOO(Long.class),
    BAR(String.class),
    ETC(Baz.class);

    private final Class<?> type;

    private UserSettingName(Class<?> type) {
        this.type = type;
    }

    public Class<?> getType() {
        return this.type;
    }
}
I have a generic UserSetting object that has public T getSettingValue() and public void setSettingValue(T value) methods that should return and set the value with the correct type. My problem comes from trying to specify that generic type T when I create or retrieve a setting because I can't do something like:
new UserSetting<UserSettingName.FOO.getType()>(UserSettingName.FOO, 123L)
Sorry if this isn't exactly clear, I can try to clarify if it's not understood.
Thanks!
UPDATE
Both the setting name and value are coming in from a Spring MVC REST call:
public ResponseEntity<String> save(#PathVariable Long userId, #PathVariable UserSettingName settingName, #RequestBody String settingValue)
So I used the Enum because Spring casts the incoming data automatically.
Firstly you have to step back and think about what you're trying to achieve, and use a standard pattern or language construct to achieve it.
It's not entirely clear what you're going after here but from your approach it almost certainly looks like you're reinventing something which could be done in a much more straightforward manner in Java. For example, if you really need to know and work with the runtime classes of objects, consider using the reflection API.
On a more practical level - what you're trying to do here isn't possible with generics. Generics are a compile-time language feature - they are useful for avoiding casting everything explicitly from Object and give you type-checking at compilation time. You simply cannot use generics in this way, i.e. setting T as some value UserSettingName.Foo.getType() which is only known at runtime.
Look at how it's done by Netty:
http://netty.io/wiki/new-and-noteworthy.html#type-safe-channeloption
They did it by using typed constants:
http://grepcode.com/file/repo1.maven.org/maven2/io.netty/netty-all/4.0.0.Beta1/io/netty/channel/ChannelOption.java#ChannelOption
EDIT:
public interface ChannelConfig {
    ...
    <T> boolean setOption(ChannelOption<T> option, T value);
    ...
}

public class ChannelOption<T> {
    public static final ChannelOption<Integer> SO_TIMEOUT =
            new ChannelOption<Integer>("SO_TIMEOUT");
    ...
}
EDIT2: you can transform it like:
class Baz {}
class UserSettingName<T> {
public static final UserSettingName<Baz> ETC = new UserSettingName<Baz>();
}
class UserSetting {
public <T> UserSetting(UserSettingName<T> name, T param) {
}
}
public class Test {
public static void main(String[] args) {
new UserSetting(UserSettingName.ETC, new Baz());
}
}
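To see why typed constants pay off, here is a minimal sketch of a store built on the UserSettingName<T> constants above (the class and method names are illustrative, not from any existing API) that hands values back with the correct static type:
class UserSettings {

    private final Map<UserSettingName<?>, Object> values = new HashMap<>();

    public <T> void set(UserSettingName<T> name, T value) {
        values.put(name, value);
    }

    @SuppressWarnings("unchecked")
    public <T> T get(UserSettingName<T> name) {
        // the unchecked cast is safe as long as values only ever enter through set()
        return (T) values.get(name);
    }
}
With that, settings.set(UserSettingName.ETC, new Baz()) compiles, settings.set(UserSettingName.ETC, 123L) does not, and Baz baz = settings.get(UserSettingName.ETC) needs no cast or try/catch.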
Enums are not the answer here. If you find yourself repeating code everywhere you could just create a utility class and encapsulate all the try/catch logic there. That would cut down on your code redundancy without majorly impacting your current code.
public class Util
{
    public static MyObject getObjectFromString(String s)
    {
        try
        {
            // MyObject.valueOf is a placeholder for whatever parsing/conversion logic applies
            return MyObject.valueOf(s);
        }
        catch(Exception e)
        {
            return null;
        }
    }
}
Then use as follows:
MyObject myObj = Util.getObjectFromString(string);

Xstream: Implicitly ignoring all fields

How do I tell Xstream to serialize only fields which are annotated explicitly and ignore the rest?
I am trying to serialize a Hibernate persistent object, and all the proxy-related fields get serialized, which I don't want in my XML.
e.g.
<createdBy class="com..domain.Users " reference="../../values/createdBy"/>
is not something I want in my xml.
Edit: I don't think I made this question clear. A class may inherit from a base class over which I have no control (as in Hibernate's case), and therefore I have no control over the base class properties.
public class A {
    private String ShouldNotBeSerialized;
}

public class B extends A {
    @XStreamAlias("1")
    private String ThisShouldbeSerialized;
}
In this case, when I serialize class B, the base class field ShouldNotBeSerialized will also get serialized. This is not something I want. In most circumstances I will not have control over class A.
Therefore I want to omit all fields by default and serialize only fields for which I explicitly specify the annotation. I want to avoid what GaryF is doing, where I need to explicitly specify the fields I need to omit.
You can omit fields with the @XStreamOmitField annotation. Straight from the manual:
@XStreamAlias("message")
class RendezvousMessage {

    @XStreamOmitField
    private int messageType;

    @XStreamImplicit(itemFieldName = "part")
    private List<String> content;

    @XStreamConverter(SingleValueCalendarConverter.class)
    private Calendar created = new GregorianCalendar();

    public RendezvousMessage(int messageType, String... content) {
        this.messageType = messageType;
        this.content = Arrays.asList(content);
    }
}
I can take no credit for this answer, just sharing what I have found. You can override the wrapMapper method of the XStream class to achieve what you need.
This link explains in detail: http://pvoss.wordpress.com/2009/01/08/xstream/
Here is the code you need if you don't want the explanation:
// Set up the XStream object so that it ignores any undefined tags
XStream xstream = new XStream() {
    @Override
    protected MapperWrapper wrapMapper(MapperWrapper next) {
        return new MapperWrapper(next) {
            @Override
            public boolean shouldSerializeMember(Class definedIn, String fieldName) {
                if (definedIn == Object.class) {
                    return false;
                }
                return super.shouldSerializeMember(definedIn, fieldName);
            }
        };
    }
};
You might want to do all your testing before you implement this code because the exceptions thrown by the default XStream object are useful for finding spelling mistakes.
There was already a ticket for the XStream people:
Again, this is by design. XStream is a serialization tool, not a data
binding tool. It is made to serialize Java objects to XML and back. It
will write anything into XML that is necessary to recreate an equal
object graph. The generated XML can be tweaked to some extend by
configuration for convenience, but this is already an add-on. What you
like to do can be done by implementing a custom mapper, but that's a
question for the user's list and cannot be handled here.
http://jira.codehaus.org/browse/XSTR-569
I guess the only direct way is to dive into writing a MapperWrapper and exclude all fields you have not annotated. Sounds like a feature request for XStream.
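Building on the wrapMapper snippet above, a rough, untested sketch of such a mapper might look like this; here @XStreamAlias is treated as the marker for "serialize this field", so adjust the check to whichever annotations you actually use:
XStream xstream = new XStream() {
    @Override
    protected MapperWrapper wrapMapper(MapperWrapper next) {
        return new MapperWrapper(next) {
            @Override
            public boolean shouldSerializeMember(Class definedIn, String fieldName) {
                try {
                    // only serialize fields that explicitly carry the annotation
                    return definedIn.getDeclaredField(fieldName)
                            .isAnnotationPresent(XStreamAlias.class);
                } catch (NoSuchFieldException e) {
                    return false;
                }
            }
        };
    }
};
As with the snippet above, this silently drops everything unannotated, so the usual XStream errors that help you catch spelling mistakes will no longer show up.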

Type-safe method reflection in Java

Is there any practical way to reference a method on a class in a type-safe manner? A basic example is if I wanted to create something like the following utility function:
public Result validateField(Object data, String fieldName,
ValidationOptions options) { ... }
In order to call it, I would have to do:
validateField(data, "phoneNumber", options);
Which forces me to either use a magic string, or declare a constant somewhere with that string.
I'm pretty sure there's no way to get around that with the stock Java language, but is there some kind of (production grade) pre-compiler or alternative compiler that may offer a workaround (similar to how AspectJ extends the Java language)? It would be nice to do something like the following instead:
public Result validateField(Object data, Method method,
ValidationOptions options) { ... }
And call it with:
validateField(data, Person.phoneNumber.getter, options);
As others mention, there is no real way to do this, and I've not seen a precompiler that supports it. The syntax would be interesting, to say the least. Even in your example, it could only cover a small subset of the potential reflective possibilities a user might want, since it won't handle non-standard accessors or methods that take arguments, etc.
Even if it's impossible to check at compile time, if you want bad code to fail as soon as possible then one approach is to resolve referenced Method objects at class initialization time.
Imagine you have a utility method for looking up Method objects that maybe throws error or runtime exception:
public static Method lookupMethod(Class<?> c, String name, Class<?>... args) {
    try {
        return c.getMethod(name, args);
    } catch (NoSuchMethodException e) {
        // throw an unchecked exception with a really good error message
        throw new IllegalArgumentException("No such method " + name + " on " + c.getName(), e);
    }
}
Then in your classes, have constants to preresolve the methods you will use:
public class MyClass {
private static final Method GET_PHONE_NUM = MyUtils.lookupMethod( PhoneNumber.class, "getPhoneNumber" );
....
public void someMethod() {
validateField(data, GET_PHONE_NUM, options);
}
}
At least then it will fail as soon as MyClass is loaded the first time.
I use reflection a lot, especially bean property reflection and I've just gotten used to late exceptions at runtime. But that style of bean code tends to error late for all kinds of other reasons, being very dynamic and all. For something in between, the above would help.
There isn't anything in the language yet - but part of the closures proposal for Java 7 includes method literals, I believe.
I don't have any suggestions beyond that, I'm afraid.
Check out https://proxetta.jodd.org/refs/methref. It uses the Jodd proxy library (Proxetta) to proxy your type. Not sure about its performance characteristics, but it does provide type safety.
An example: Suppose Str.class has method .boo(), and you want to get its name as the string "boo":
String methodName = Methref.of(Str.class).name(Str::boo);
There's more to the API than the example above: https://oblac.github.io/jodd-site/javadoc/jodd/methref/Methref.html
Is any practical way to reference a method on a class in a type-safe manner?
First of all, reflection is type-safe. It is just that it is dynamically typed, not statically typed.
So, assuming that you want a statically typed equivalent of reflection, the theoretical answer is that it is impossible. Consider this:
Method m;
if (arbitraryFunction(obj)) {
m = obj.getClass().getDeclaredMethod("foo", ...);
} else {
m = obj.getClass().getDeclaredMethod("bar", ...);
}
Can we do this so that runtime type exceptions cannot happen? In general NO, since this would entail proving that arbitraryFunction(obj) terminates. (This is equivalent to the Halting Problem, which is proven to be unsolvable in general, and is intractable using state-of-the-art theorem proving technology ... AFAIK.)
And I think that this road-block would apply to any approach where you could inject arbitrary Java code into the logic that is used to reflectively select a method from an object's class.
To my mind, the only moderately practical approach at the moment would be to replace the reflective code with something that generates and compiles Java source code. If this process occurs before you "run" the application, you've satisfied the requirement for static type-safety.
I was more asking about reflection in which the result is always the same, i.e. Person.class.getMethod("getPhoneNumber", null) would always return the same method, and it's entirely possible to resolve it at compile time.
What happens if after compiling the class containing this code, you change Person to remove the getPhoneNumber method?
The only way you can be sure that you can resolve getPhoneNumber reflectively is if you can somehow prevent Person from being changed. But you can't do that in Java. Runtime binding of classes is a fundamental part of the language.
(For the record, if you did that for a method that you called non-reflectively, you would get an IncompatibleClassChangeError of some kind when the two classes were loaded ...)
It has been pointed out that in Java 8 and later you could declare your validator something like this:
public Result validateField(Object data,
SomeFunctionalInterface function,
ValidationOptions options) { ... }
where SomeFunctionalInterface corresponds to the (loosely speaking) common signature of the methods you are validating.
Then you can call it with a method reference; e.g.
validateField(data, SomeClass::someMethod, options)
This approach is statically type-safe. You will get a compilation error if SomeClass doesn't have someMethod or if it doesn't conform to SomeFunctionalInterface.
But you can't use a string to denote the method name. Looking up a method by name would entail either reflection ... or something else that side-steps static (i.e. compile time / load time) type safety.
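A minimal sketch of that approach for no-argument getters; FieldGetter is a made-up functional interface and the validation body is a placeholder, since the real rules would live in ValidationOptions:
@FunctionalInterface
interface FieldGetter<T, V> {
    V get(T target);
}

public static <T, V> Result validateField(T data, FieldGetter<T, V> getter, ValidationOptions options) {
    V value = getter.get(data); // statically checked: the method reference must match T and V
    // ... apply the rules from options to value and build a Result ...
    return null; // placeholder
}
Calling validateField(person, Person::getPhoneNumber, options) then fails to compile if Person has no getPhoneNumber method.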
Java lacks the syntactic sugar to do something as nice as Person.phoneNumber.getter. But if Person is an interface, you could record the getter method using a dynamic proxy. You could record methods on non-final classes as well using CGLib, the same way Mockito does it.
MethodSelector<Person> selector = new MethodSelector<Person>(Person.class);
selector.select().getPhoneNumber();
validateField(data, selector.getMethod(), options);
Code for MethodSelector: https://gist.github.com/stijnvanbael/5965609
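For interface types, the recording part of such a selector can be sketched with a plain JDK dynamic proxy (the gist additionally handles classes via CGLib); this is an illustration of the idea, not the gist's actual code:
class MethodSelector<T> {

    private final Class<T> type;
    private Method recorded;

    MethodSelector(Class<T> type) {
        this.type = type;
    }

    @SuppressWarnings("unchecked")
    public T select() {
        // return a no-op proxy that simply remembers the last method invoked on it
        return (T) Proxy.newProxyInstance(type.getClassLoader(), new Class<?>[] {type},
                (proxy, method, args) -> {
                    recorded = method;
                    return null; // fine for object-returning getters; primitive returns would need a default value
                });
    }

    public Method getMethod() {
        return recorded;
    }
}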
Inspired by mocking frameworks, we could dream up the following syntax:
validator.validateField(data, options).getPhoneNumber();
Result validationResult = validator.getResult();
The trick is the generic declaration:
class Validator {
    public <T> T validateField(T data, ValidationOptions options) {...}
}
Now the return type of the method is the same as your data object's type and you can use code completion (and static checking) to access all the methods, including the getter methods.
As a downside, the code isn't quite intuitive to read, since the call to the getter doesn't actually get anything, but instead instructs the validator to validate the field.
Another possible option would be to annotate the fields in your data class:
class FooData {
    @Validate(new ValidationOptions(...))
    private PhoneNumber phoneNumber;
}
And then just call:
FooData data;
validator.validate(data);
to validate all fields according to the annotated options.
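A rough sketch of what that reflective validate call could do; Validate, ValidationOptions and Result come from this thread, and applyOptions is a placeholder for the actual rule evaluation:
public <T> List<Result> validate(T data) {
    List<Result> results = new ArrayList<>();
    for (Field field : data.getClass().getDeclaredFields()) {
        Validate rules = field.getAnnotation(Validate.class); // requires the annotation to have runtime retention
        if (rules == null) {
            continue; // only annotated fields are validated
        }
        field.setAccessible(true);
        try {
            results.add(applyOptions(field.get(data), rules)); // applyOptions is a placeholder
        } catch (IllegalAccessException e) {
            throw new IllegalStateException(e);
        }
    }
    return results;
}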
The framework picklock lets you do the following:
class Data {
private PhoneNumber phoneNumber;
}
interface OpenData {
PhoneNumber getPhoneNumber(); //is mapped to the field phoneNumber
}
Object data = new Data();
PhoneNumber number = ObjectAccess
.unlock(data)
.features(OpenData.class)
.getPhoneNumber();
This works in a similar way for setters and private methods. Of course, this is only a wrapper for reflection, but the exception occurs at unlocking time, not at call time. If you need it at build time, you could write a unit test with:
assertThat(Data.class, providesFeaturesOf(OpenData.class));
I found a way to get the Method instance using lambdas. Currently it works only on interface methods, though.
It works using net.jodah:typetools which is a very lightweight library.
https://github.com/jhalterman/typetools
public final class MethodResolver {
private interface Invocable<I> {
void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable;
}
interface ZeroParameters<I, R> extends Invocable<I> {
R invoke(I instance) throws Throwable;
@Override
default void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable {
invoke(instance);
}
}
public static <I, R> Method toMethod0(ZeroParameters<I, R> call) {
return toMethod(ZeroParameters.class, call, 1);
}
interface OneParameters<I, P1, R> extends Invocable<I> {
R invoke(I instance, P1 p1) throws Throwable;
@Override
default void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable {
invoke(instance, param(parameterTypes[1]));
}
}
public static <I, P1, R> Method toMethod1(OneParameters<I, P1, R> call) {
return toMethod(OneParameters.class, call, 2);
}
interface TwoParameters<I, P1, P2, R> extends Invocable<I> {
R invoke(I instance, P1 p1, P2 p2) throws Throwable;
@Override
default void invokeWithParams(I instance, Class<?>[] parameterTypes) throws Throwable {
invoke(instance, param(parameterTypes[1]), param(parameterTypes[2]));
}
}
public static <I, P1, P2, R> Method toMethod2(TwoParameters<I, P1, P2, R> call) {
return toMethod(TwoParameters.class, call, 3);
}
private static final Map<Class<?>, Object> parameterMap = new HashMap<>();
static {
parameterMap.put(Boolean.class, false);
parameterMap.put(Byte.class, (byte) 0);
parameterMap.put(Short.class, (short) 0);
parameterMap.put(Integer.class, 0);
parameterMap.put(Long.class, (long) 0);
parameterMap.put(Float.class, (float) 0);
parameterMap.put(Double.class, (double) 0);
}
@SuppressWarnings("unchecked")
private static <T> T param(Class<?> type) {
return (T) parameterMap.get(type);
}
private static <I> Method toMethod(Class<?> callType, Invocable<I> call, int responseTypeIndex) {
Class<?>[] typeData = TypeResolver.resolveRawArguments(callType, call.getClass());
Class<?> instanceClass = typeData[0];
Class<?> responseType = responseTypeIndex != -1 ? typeData[responseTypeIndex] : Void.class;
AtomicReference<Method> ref = new AtomicReference<>();
I instance = createProxy(instanceClass, responseType, ref);
try {
call.invokeWithParams(instance, typeData);
} catch (final Throwable e) {
throw new IllegalStateException("Failed to call no-op proxy", e);
}
return ref.get();
}
@SuppressWarnings("unchecked")
private static <I> I createProxy(Class<?> instanceClass, Class<?> responseType,
AtomicReference<Method> ref) {
return (I) Proxy.newProxyInstance(MethodResolver.class.getClassLoader(),
new Class[] {instanceClass},
(proxy, method, args) -> {
ref.set(method);
return parameterMap.get(responseType);
});
}
}
Usage:
Method method = MethodResolver.toMethod2(SomeIFace::foobar);
System.out.println(method); // public abstract example.Result example.SomeIFace.foobar(java.lang.String,boolean)
Method get = MethodResolver.<Supplier, Object>toMethod0(Supplier::get);
System.out.println(get); // public abstract java.lang.Object java.util.function.Supplier.get()
Method accept = MethodResolver.<IntFunction, Integer, Object>toMethod1(IntFunction::apply);
System.out.println(accept); // public abstract java.lang.Object java.util.function.IntFunction.apply(int)
Method apply = MethodResolver.<BiFunction, Object, Object, Object>toMethod2(BiFunction::apply);
System.out.println(apply); // public abstract java.lang.Object java.util.function.BiFunction.apply(java.lang.Object,java.lang.Object)
Unfortunately you have to create a new interface and method based on the parameter count and whether the method returns void or not.
However, if you have a somewhat fixed/limited method signature/parameter types, then this becomes quite handy.
