What is @AutoAnnotation for? How could it be used?

In https://dagger.dev/multibindings.html, I read about @AutoAnnotation. It links to https://github.com/google/auto/blob/master/value/src/main/java/com/google/auto/value/AutoAnnotation.java.
It is also mentioned in
https://github.com/google/auto/blob/57dfad360306619a820e6aae4a14a1aa67c29299/value/userguide/howto.md#annotation
I have read about it, but I can't understand it.
I managed to access it from my Android code using
implementation 'com.google.auto.value:auto-value:1.5.2'
kapt 'com.google.auto.value:auto-value:1.5.2'
And also
android.defaultConfig.javaCompileOptions.annotationProcessorOptions.includeCompileClasspath = true
But I don't understand how it could be used. Is there a good tutorial on using it?

@AutoAnnotation automatically generates a class that implements an annotation interface in the same way the JDK does.
Dagger map keys
When using a Dagger Multibindings map that uses a custom annotation as its key, Dagger will install the instance T (or a Provider&lt;T&gt;) into the returned map, using the annotation instance itself as the key. To make this clearer:
@MapKey
@interface YourAnnotation {
    String foo();
}

@Provides @YourAnnotation(foo = "bar") YourClass getYourClassForBar() { /* ... */ }

// Dagger will create a multibinding that would allow you to inject this:
@Inject Map<YourAnnotation, YourClass> map;
If the only thing that matters here is foo, you could also use unwrapValue to make the map keyed by String instead of YourAnnotation, but let's assume you want this because you want YourAnnotation to have multiple values in the future. But where do the implementations of YourAnnotation come from, and how are you supposed to call get on the map?
Annotations at runtime
When you annotate a Java element (frequently a class, method, or field), the VM supplies a particular implementation of that annotation interface. From the Java tutorial:
@interface ClassPreamble {
    String author();
    String date();
    int currentRevision() default 1;
    String lastModified() default "N/A";
    String lastModifiedBy() default "N/A";
    // Note use of array
    String[] reviewers();
}

// [...]

@ClassPreamble(
    author = "John Doe",
    date = "3/17/2002",
    currentRevision = 6,
    lastModified = "4/12/2004",
    lastModifiedBy = "Jane Doe",
    // Note array notation
    reviewers = {"Alice", "Bob", "Cindy"}
)
public class Generation3List extends Generation2List { /* ... */ }
In this usage, Generation3List has one Annotation of type ClassPreamble. If the annotation is retained at runtime (i.e. ClassPreamble itself is annotated with @Retention(RUNTIME)), you can get to it via Generation3List.class.getAnnotations() or Generation3List.class.getAnnotation(ClassPreamble.class). (There are also declared counterparts that handle superclass annotations differently.)
Once you get to an instance of ClassPreamble, you can use methods like author() and date() to retrieve the data. However, ClassPreamble is an interface, and the implementation of that annotation is internal to the VM. That makes it difficult to create your own arbitrary instance of ClassPreamble at runtime.
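For example, here is a minimal self-contained sketch of reading an annotation back at runtime, using a trimmed-down ClassPreamble (with @Retention(RUNTIME) added so the annotation survives to runtime):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class PreambleDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @interface ClassPreamble {
        String author();
        String date();
    }

    @ClassPreamble(author = "John Doe", date = "3/17/2002")
    static class Generation3List {}

    // getAnnotation returns the VM's implementation of the interface, or null
    public static String authorOf(Class<?> c) {
        ClassPreamble p = c.getAnnotation(ClassPreamble.class);
        return p == null ? null : p.author();
    }

    public static void main(String[] args) {
        System.out.println(authorOf(Generation3List.class)); // John Doe
    }
}
```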
Conforming annotation implementations
Because YourAnnotation and ClassPreamble are interfaces, you could simply write an implementation yourself. However, that implementation is unlikely to have equals and hashCode behavior matching the VM's implementation, which may vary between JREs and on Android. Fortunately, the implementation of equals and hashCode is very closely prescribed in the docs for Annotation:
The hash code of an annotation is the sum of the hash codes of its members (including those with default values), as defined below: The hash code of an annotation member is (127 times the hash code of the member-name as computed by String.hashCode()) XOR the hash code of the member-value, as defined below [...]
Returns true if the specified object represents an annotation that is logically equivalent to this one. In other words, returns true if the specified object is an instance of the same annotation type as this instance, all of whose members are equal to the corresponding member of this annotation, as defined below [...]
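These prescribed rules can be checked directly against the VM's own implementation. A self-contained sketch using a hypothetical single-member annotation:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class AnnotationHashDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @interface Tag {
        String foo();
    }

    @Tag(foo = "bar")
    static class Annotated {}

    // The hash prescribed by java.lang.annotation.Annotation for a
    // single String member named "foo": (127 * nameHash) XOR valueHash
    public static int prescribedHash(String value) {
        return (127 * "foo".hashCode()) ^ value.hashCode();
    }

    public static int vmHash() {
        return Annotated.class.getAnnotation(Tag.class).hashCode();
    }

    public static void main(String[] args) {
        System.out.println(prescribedHash("bar") == vmHash()); // true
    }
}
```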
It is possible to implement these rules manually, but doing so would be difficult, and it would impose a burden whenever the structure of YourAnnotation or ClassPreamble changes. Though there are reflective solutions to this problem, @AutoAnnotation generates the code for a conforming implementation automatically:
public class YourAnnotations {
    @AutoAnnotation public static YourAnnotation yourAnnotation(String foo) {
        return new AutoAnnotation_YourAnnotations_yourAnnotation(foo);
    }
}
public class ClassPreambles {
    @AutoAnnotation public static ClassPreamble classPreamble(
            String author,
            String date,
            int currentRevision,
            String lastModified,
            String lastModifiedBy,
            String[] reviewers) {
        return new AutoAnnotation_ClassPreambles_classPreamble(
            author,
            date,
            currentRevision,
            lastModified,
            lastModifiedBy,
            reviewers);
    }
}
With AutoAnnotation's generated implementation, you can call get on the map that Dagger Multibindings generates (or provide test implementations you control) without having to deal with Annotation-specific hashCode XORs or equals rules. This is useful beyond Dagger and tests, but because Dagger uses annotation instances in its maps, it makes sense that you might need to use AutoAnnotation to create similar instances.
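The conforming equals/hashCode behavior is exactly what makes map lookups by annotation key work. A self-contained sketch (plain JDK, no Dagger; names are hypothetical) showing that two separately obtained annotation instances with equal members act as the same map key:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.HashMap;
import java.util.Map;

public class AnnotationKeyDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @interface YourAnnotation {
        String foo();
    }

    @YourAnnotation(foo = "bar")
    static class A {}

    // A second element carrying an annotation with identical member values
    @YourAnnotation(foo = "bar")
    static class B {}

    public static String lookup() {
        Map<YourAnnotation, String> map = new HashMap<>();
        map.put(A.class.getAnnotation(YourAnnotation.class), "value-for-bar");
        // A different instance with equal members finds the entry, because
        // annotation equals/hashCode are defined on member values
        return map.get(B.class.getAnnotation(YourAnnotation.class));
    }

    public static void main(String[] args) {
        System.out.println(lookup()); // value-for-bar
    }
}
```

An @AutoAnnotation-generated factory plays the role of B here: it gives you a conforming instance to pass to map.get.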

Why can a Java record's canonical constructor not have more restrictive access than the record level?

I have a situation where I want record instances for a specific type to only be creatable using a factory method in a separate class within the same package. The reason is that before creating the record I need to perform a significant amount of validation.
The record is intended to be a dumb data carrier of its validated fields, but the validation cannot take place in the record's constructor because we require access to some elaborate validator objects to actually perform the validation.
Since passing the validator objects to the record constructor would mean they form part of the record state, we cannot use the record constructor to perform the validation.
And so I extracted the validation out into its own factory and coded up something like this (a factory class and a record in the same package):
package some.package;

// imports.....

@Component
class SomeRecordFactory {
    private final SomeValidator someValidator;
    private final SomeOtherValidator someOtherValidator;
    // Rest of the fields
    // ....
    // constructor
    // ....

    public SomeRecord create(...) {
        someValidator.validate(....);
        someOtherValidator.validate(....);
        // .... other validation
        return new SomeRecord(...);
    }
}

package some.package;

public record SomeRecord(...) {
    /* package-private */ SomeRecord {
    }
}
For whatever reason, the above does not work, with IntelliJ complaining:
Compact constructor access level cannot be more restrictive than the record access level (public)
I can avoid the issue by using a normal class (which allows for a single package-private constructor) but would like to more accurately model the data as a record.
Why does this restriction exist for records? Are there any plans to remove this restriction in the future?
I asked the question on the amber mailing list (http://mail.openjdk.java.net/pipermail/amber-dev/2020-December.txt).
The question was posed:
What exactly is the reason that the canonical constructor must have the
same access as the record?
And the answer given was (emphasis mine):
Records are named tuples, they are defined only by their components, in a transparent manner i.e. no encapsulation. From a tuple, you can access to the value of each component and from all component values, you can create a tuple.
The idea is that, in a method, if you are able to see a record, you can create it. Thus the canonical constructor has the same visibility as the record itself.
So the restriction exists to support the design goal that if someone has an instance of a record, they should be able to deconstruct it and then reconstruct it with the canonical constructor. As a corollary, this requires the canonical constructor to have the same access as the record itself.
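The deconstruct-and-reconstruct idea can be sketched with a hypothetical record: anyone who can see the record can rebuild an equal instance from its components via the canonical constructor:

```java
public class RecordRoundTrip {

    record Point(int x, int y) {}

    public static boolean roundTripsEqual(Point p) {
        // Deconstruct via the accessors, reconstruct via the canonical constructor
        Point copy = new Point(p.x(), p.y());
        return p.equals(copy);
    }

    public static void main(String[] args) {
        System.out.println(roundTripsEqual(new Point(3, 4))); // true
    }
}
```

Restricting the canonical constructor's access would break exactly this round trip for anyone who can see the record.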
Q: Why does this restriction exist for records?
There isn't an explicit justification for that decision in JEP 359 or in the JLS, but I think it is implied by this excerpt from the JEP:
"Because records make the semantic claim of being transparent carriers for their data ..."
A "transparent carrier" means (to me [1]) that records are designed to have a minimal abstraction boundary. Restricting the access of a constructor implies (to me) an additional abstraction boundary.
In addition, I suspect that record constructors with more restrictive access modifiers could impede or complicate intended use-cases for records in future versions of Java.
Anyway, my take is that if you want fancy stuff like that you should be declaring a class rather than a record.
[1] Transparent is the opposite of opaque, and abstract data types are typically opaque by design. Obviously, this is just my take on what the JEP authors meant.
Q: Are there any plans to remove this restriction in the future?
I am not aware of any. There are no (public) open Java Bugs or RFEs about this.
Indeed, all of the JDK bugs relating to this topic were to ensure that the Java 15+ specifications made the restriction clear. There is no suggestion that the restriction happened by accident or oversight.
You can't do exactly what you want here - public record with hidden constructor (or more generally - a record whose constructor has more restrictive access - as you've already pointed out!).
But if you tweak the requirements a bit you can achieve a similarly desired outcome.
The trick is to make the record itself hidden (private or package-private) and only expose the factory class:
// package-private
record SomeRecord(String key, String value) {}

public final class SomeRecordFactory {
    public SomeRecord create(String value) {
        return new SomeRecord("key", value);
    }
}
Then you're forced to create instances like:
SomeRecordFactory factory = new SomeRecordFactory();
var myObject = factory.create("my value");
NOTE the use of var - I can't use SomeRecord here because it is hidden (package-private in this case; it could be made private by nesting it in the factory class).
If you want to expose a Type (e.g. in cases where you can't use var such as method/constructor args) you can provide a sealed interface representing the type and only permit the hidden record:
public sealed interface SomeType permits SomeRecord {
    String key();
    String value();
}

// package-private
record SomeRecord(String key, String value) implements SomeType {}

public final class SomeRecordFactory {
    public SomeType create(String value) {
        return new SomeRecord("key", value);
    }
}
Then you can create instances without var like:
SomeRecordFactory factory = new SomeRecordFactory();
SomeType myObject = factory.create("my value");
The downside to this approach is that users won't have access to the record type, so they can't use newer features like pattern matching (exhaustive switch on the sealed interface plus record deconstruction).
You'd have to do that within your package (you could argue this is a good thing, as the implementation stays hidden from the users of your API!).
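For example, within the declaring package (collapsed into one class here for brevity) you can still match on the concrete record type; this sketch uses an instanceof pattern (Java 16+, sealed types need 17+):

```java
public class InPackageMatching {

    sealed interface SomeType permits SomeRecord {
        String key();
        String value();
    }

    record SomeRecord(String key, String value) implements SomeType {}

    // Within the declaring package we can still match on the concrete record
    public static String describe(SomeType t) {
        if (t instanceof SomeRecord r) {
            return r.key() + "=" + r.value();
        }
        return "unknown";
    }

    public static void main(String[] args) {
        System.out.println(describe(new SomeRecord("key", "my value"))); // key=my value
    }
}
```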

How can we access methods generated by ByteBuddy in compilation time?

I wrote this example:
<E> E someCreateMethod(Class<E> clazz) throws Exception {
    Class<? extends E> dynamicType = new ByteBuddy()
            .subclass(clazz)
            .name("NewEntity")
            .method(named("getNumber"))
            .intercept(FixedValue.value(100))
            .defineField("stringVal", String.class, Visibility.PRIVATE)
            .defineMethod("getStringVal", String.class, Visibility.PUBLIC)
            .intercept(FieldAccessor.ofBeanProperty())
            .make()
            .load(clazz.getClassLoader(), ClassLoadingStrategy.Default.WRAPPER)
            .getLoaded();
    return dynamicType.newInstance();
}
And I would like to use it to get the redefined number attribute:
Integer num = someCreateMethod(EntityExample.class).getNumber(); //(1)
Or to get the newly defined stringVal attribute:
String sVal = someCreateMethod(EntityExample.class).getStringVal(); //(2)
My problem is that (1) works pretty fine, while (2) doesn't. I get the following error:
Error:(40, 67) java: cannot find symbol
symbol: method getStringVal()
Also, is it possible to do something like this with a dynamic generated class:
NewEntity newEntity = someCreateMethod(EntityExample.class);
Integer num = newEntity.getNumber();
String sVal = newEntity.getStringVal();
?
EDIT: I appreciate your help; this example was my first attempt at using the Byte Buddy library. I figured out that defineMethod actually defines an implementation of an interface method, rather than just adding an arbitrary method to the class. So I decided to explain here what exactly I'm trying to accomplish.
For every Date attribute in a class E, I want to add two more fields (and their respective getters and setters), let's say (attribute name)InitialDate and (attribute name)FinalDate, so that I can use interval functionality for every date in E.
I was wondering if I could use code generation to add those methods without having to create subclasses for every E.
PS: E can't be changed; it belongs to a legacy module.
PS2: I don't know how many date attributes there would be in each entity E, but the new attributes and methods would be created using conventions (for example __FirstDay, __LastDay), as shown below:
NewA a = eb.create(A.class);
a.getDeadLine(); //inherited
a.getDeadLineFirstDay(); //added
a.getDeadLineLastDay(); //added
NewA b = eb.create(B.class);
b.getBirthday(); //inherited
b.getBirthdayFirstDay(); //added
b.getBirthdayLastDay(); //added
b.getAnniversary(); //inherited
b.getAnniversaryFirstDay(); //added
b.getAnniversaryLastDay(); //added
PS3: Is what I'm trying to accomplish even possible with ByteBuddy or at all? Is there another way?
PS4: Should my EDIT have been a new question?
You need E to be a superclass or interface that includes the methods you are trying to call -- you will not be able to resolve subtype methods that do not exist on E.
This is not a Byte Buddy issue; this is an issue of your class design -- you should design and group the functionality you intend to generate into abstractable parts, so it can be exposed via types that are meaningful at compile time.
For example, we could use a supertype 'ValueProvider' and then use ByteBuddy to define an IntConstantProvider.
public interface ValueProvider<T> {
    public T getValue();
}

Class<? extends ValueProvider<Integer>> dynamicType = new ByteBuddy()
        .subclass(clazz)
        .name("ConstantIntProvider")
        .method(named("getValue"))
        .intercept(FixedValue.value(100))
        // etc.
Your prototype had 3 separate functionalities (if we consider unreferenced private fields to be the stub of some intended behavior) with no obvious abstraction to encompass them. This could be better designed as 3 simple atomic behaviors, for which the abstractions would be obvious.
You could use reflection to find arbitrary methods on an arbitrary dynamically-defined class, but this is not really meaningful from a coding or design POV (how does your code know which methods to call? If it does know, why not use a type to express that?), nor is it very performant.
FOLLOWING EDIT TO QUESTION -- Java Bean properties work by reflection, so the example of finding "related properties" (such as First/ Last Date) from known properties is not unreasonable.
However, you could consider using a DateInterval(firstDay, lastDay) class so that only one supplementary property is needed per base property.
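A minimal sketch of that suggestion (names are hypothetical):

```java
import java.time.LocalDate;

public class DateIntervalDemo {

    // One supplementary property per base date property instead of two
    record DateInterval(LocalDate firstDay, LocalDate lastDay) {
        boolean contains(LocalDate d) {
            return !d.isBefore(firstDay) && !d.isAfter(lastDay);
        }
    }

    public static void main(String[] args) {
        DateInterval march = new DateInterval(
                LocalDate.of(2024, 3, 1), LocalDate.of(2024, 3, 31));
        System.out.println(march.contains(LocalDate.of(2024, 3, 15))); // true
    }
}
```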
As Thomas points out, Byte Buddy generates classes at runtime, so your compiler cannot validate their existence at compile time.
What you can do is apply your code generation at build time. If your EntityExample.class exists in a specific module, you can enhance this module with the Byte Buddy Maven or Gradle plugin and then, after enhancement, allow your compiler to validate their existence.
What you can also do is define interfaces like

interface StringVal {
    String getStringVal();
}

which you can ask Byte Buddy to implement in your subclass. This allows your compiler to validate the method's existence if you represent your subclass as this interface.
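The same compile-time-visibility idea can be shown without Byte Buddy, using a JDK dynamic proxy: the runtime-generated class is invisible to the compiler, but because the variable is typed by the interface, javac can validate the call to getStringVal():

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class StringValDemo {

    interface StringVal {
        String getStringVal();
    }

    public static StringVal makeStringVal(String value) {
        InvocationHandler h = (proxy, method, args) -> {
            if (method.getName().equals("getStringVal")) {
                return value;
            }
            throw new UnsupportedOperationException(method.getName());
        };
        // The generated proxy class does not exist at compile time, but the
        // interface type lets the compiler check getStringVal()
        return (StringVal) Proxy.newProxyInstance(
                StringValDemo.class.getClassLoader(),
                new Class<?>[] {StringVal.class},
                h);
    }

    public static void main(String[] args) {
        System.out.println(makeStringVal("hello").getStringVal()); // hello
    }
}
```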
Other than that, your compiler is doing exactly what it is supposed to do: telling you that you are calling a method that does not exist (at that time).

How to safely serialize a lambda?

Although it is possible to serialize a lambda in Java 8, it is strongly discouraged; even serializing inner classes is discouraged. The reason given is that lambdas may not deserialize properly on another JRE. However, doesn't this mean that there is a way to safely serialize a lambda?
For example, say I define a class to be something like this:
public class MyClass {
    private String value;
    private Predicate<String> validateValue;

    public MyClass(String value, Predicate<String> validate) {
        this.value = value;
        this.validateValue = validate;
    }

    public void setValue(String value) {
        if (!validateValue.test(value)) throw new IllegalArgumentException();
        this.value = value;
    }

    public void setValidation(Predicate<String> validate) {
        this.validateValue = validate;
    }
}
If I declared an instance of the class like this, I should not serialize it:
MyClass obj = new MyClass("some value", (s) -> !s.isEmpty());
But what if I made an instance of the class like this:
// Could even be a static nested class
public class IsNonEmpty implements Predicate<String>, Serializable {
    @Override
    public boolean test(String s) {
        return !s.isEmpty();
    }
}

MyClass isThisSafeToSerialize = new MyClass("some string", new IsNonEmpty());
Would this now be safe to serialize? My instinct says that yes, it should be safe, since there's no reason that interfaces in java.util.function should be treated any differently from any other random interface. But I'm still wary.
It depends on which kind of safety you want. It's not the case that serialized lambdas cannot be shared between different JREs. They have a well-defined persistent representation, the SerializedLambda. When you study how it works, you'll find that it relies on the presence of the defining class, which will have a special method that reconstructs the lambda.
What makes it unreliable is the dependency on compiler-specific artifacts, e.g. the synthetic target method, which has a generated name, so simple changes like the insertion of another lambda expression or recompiling the class with a different compiler can break compatibility with existing serialized lambda expressions.
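For reference, a lambda can be made serializable without any library support via an intersection cast. A minimal round-trip sketch (reliable only within the same build of the defining class, for the reasons above):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Predicate;

public class SerializableLambdaDemo {

    @SuppressWarnings("unchecked")
    public static Predicate<String> roundTrip() {
        // The intersection cast makes the compiler emit a serializable lambda
        Predicate<String> p = (Predicate<String> & Serializable) s -> !s.isEmpty();
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(p);
            }
            try (ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                return (Predicate<String>) in.readObject();
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Predicate<String> copy = roundTrip();
        System.out.println(copy.test("x") && !copy.test("")); // true
    }
}
```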
However, using manually written classes isn’t immune to this. Without an explicitly declared serialVersionUID, the default algorithm will calculate an id by hashing class artifacts, including private and synthetic ones, adding a similar compiler dependency. So the minimum to do, if you want reliable persistent forms, is to declare an explicit serialVersionUID.
Or you turn to the most robust form possible:
public enum IsNonEmpty implements Predicate<String> {
    INSTANCE;

    @Override
    public boolean test(String s) {
        return !s.isEmpty();
    }
}
Serializing this constant does not store any properties of the actual implementation, besides its class name (and the fact that it is an enum, of course) and a reference to the name of the constant. Upon deserialization, the actual unique instance of that name will be used.
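A round trip demonstrates this singleton property. The enum is repeated here so the sketch is self-contained:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.function.Predicate;

public class EnumPredicateDemo {

    public enum IsNonEmpty implements Predicate<String> {
        INSTANCE;

        @Override
        public boolean test(String s) {
            return !s.isEmpty();
        }
    }

    public static Object roundTrip(Object o) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(o);
            }
            try (ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                return in.readObject();
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // Deserialization yields the same unique constant, not a copy
        System.out.println(roundTrip(IsNonEmpty.INSTANCE) == IsNonEmpty.INSTANCE); // true
    }
}
```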
Note that serializable lambda expressions may create security issues because they open an alternative way of getting hold of an object that allows invoking the target methods. However, this applies to all serializable classes, as all variants shown in your question and this answer allow deliberately deserializing an object that can invoke the encapsulated operation. But with explicitly serializable classes, the author is usually more aware of this fact.

Java Annotation - array of Objects or toString values

I need to write an Annotation to exclude certain values from a result set.
Background:
Distinct values are selected from fields and are listed in a ComboBox. Some of the legacy values are Deprecated and I don't want to show them, even if they are returned by JDBC's SELECT DISTINCT(). It's like a mini-framework where people can build select queries by clicking values from ComboBoxes.
I tried the following (the code does not compile - commented lines are the ways I tried to solve the problem):
public enum JobType {
    //...
    S,
    //...
}

public @interface Exclude {
    Object[] values(); // Invalid type
    Enum[] values();   // Invalid type again
    String[] values(); // Accepts but see the following lines
}
@Table(name = "jobs_view")
public class JobSelectionView extends View {
    //...
    @Exclude(values = {JobType.S.toString()}) // Not a constant expression
    @Exclude(values = {JobType.S.name()})     // Not a constant expression ?!?!?!
    @Exclude(values = {"S"})                  // Works but... come on!
    @Enumerated(value = EnumType.STRING)
    @Column(name = "type")
    private JobType type;
    //...
}
I don't like using {"S"}, any suggestions?
But if I declare JobType[] values(), then I won't be able to reuse @Exclude for other enum types.
This is the best way to do what you want, though. Here's the thing:
The class Enum is, itself, meaningless.
It only gains meaning when subclassed. Let's say you want to add another filter, say Color (your own custom Color enum, not java.awt.Color). Obviously, the thing your filtration class is doing is very different when filtering out JobType than it would be when filtering out Color!
Therefore, the best thing to do is to have each different type of enum you're trying to filter in its own argument, e.g.
public @interface Exclude {
    JobType[] jobs() default {};
    Color[] colors() default {};
    Foo[] foos() default {};
    Quux[] quuxes() default {};
}
This will accomplish two things:
Make it easy for you to do different filtration behavior for each different filter type.
Make the @Exclude annotation more legible by sorting the different parameters into different groups.
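A self-contained sketch of the typed approach (JPA annotations omitted; names are hypothetical), reading the typed annotation back via reflection to filter the values shown:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class ExcludeDemo {

    enum JobType { A, S, Z }

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    @interface Exclude {
        JobType[] jobs() default {};
    }

    static class JobSelectionView {
        @Exclude(jobs = {JobType.S})
        JobType type;
    }

    // Filter the candidate values using the typed annotation read via reflection
    public static List<JobType> allowedValues(Class<?> view, String field) {
        try {
            Exclude ex = view.getDeclaredField(field).getAnnotation(Exclude.class);
            List<JobType> excluded = Arrays.asList(ex.jobs());
            return Arrays.stream(JobType.values())
                    .filter(t -> !excluded.contains(t))
                    .collect(Collectors.toList());
        } catch (NoSuchFieldException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(allowedValues(JobSelectionView.class, "type")); // [A, Z]
    }
}
```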
The Javadoc for Enum.name() says:
Returns the name of this enum constant, exactly as declared in its enum declaration. Most programmers should use the toString() method in preference to this one, as the toString method may return a more user-friendly name. This method is designed primarily for use in specialized situations where correctness depends on getting the exact name, which will not vary from release to release.
I suggest you point the people in your company to the Open/Closed principle and explain why violating it would be particularly damaging in this case.
