I have been looking through the MapStruct documentation without any success.
I am implementing a mapping between my domain classes and my DTO classes using MapStruct. In my domain classes, I do not want to expose setters for my fields (for many reasons, but that's not the topic of my question).
However, when I want to convert an ItemDto into an Item, I get the following message:
Error:(17, 21) java: Property "name" has no write accessor in my.example.Item.
However, my class Item has a business method void changeName(String newName) that I would like my mapper to use.
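For illustration, my Item class looks roughly like this (simplified sketch, not the full class):

public class Item {

    private String name;

    // no setName(String) on purpose; state changes go through business methods
    public void changeName(String newName) {
        this.name = newName;
    }

    public String getName() {
        return name;
    }
}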
The code of my mapper is:
@Mapper
public interface MyMapper {

    @Mapping(source = "nameDto", target = "name")
    Item map(ItemDto dto);
}
My question is quite simple: how do I tell MapStruct to use changeName as the write accessor?
Thanks for your help.
In order to implement something like that you would have to write your own custom AccessorNamingStrategy.
If your domain objects all follow the same changeXXX pattern, then a simple implementation can look like this:
import javax.lang.model.element.ExecutableElement;

import org.mapstruct.ap.spi.DefaultAccessorNamingStrategy;

public class CustomAccessorNamingStrategy extends DefaultAccessorNamingStrategy {

    @Override
    public boolean isSetterMethod(ExecutableElement method) {
        String methodName = method.getSimpleName().toString();
        return methodName.startsWith( "change" ) && methodName.length() > 6;
    }

    @Override
    public String getPropertyName(ExecutableElement getterOrSetterMethod) {
        String methodName = getterOrSetterMethod.getSimpleName().toString();
        if ( methodName.startsWith( "change" ) ) {
            return IntrospectorUtils.decapitalize( methodName.substring( 6 ) );
        }
        return super.getPropertyName( getterOrSetterMethod );
    }
}
You can of course adapt the CustomAccessorNamingStrategy to fit your needs. Keep in mind that it would be applied to all objects, including the ItemDto.
More information about it can be found here in the MapStruct documentation.
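To have MapStruct pick the strategy up, it has to be registered through the Java service loader mechanism (the standard SPI approach described in the docs; adjust the package name to your own): put a file named META-INF/services/org.mapstruct.ap.spi.AccessorNamingStrategy on the annotation processor path, containing the fully qualified name of the implementation as its single line:

my.example.CustomAccessorNamingStrategy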
Related
I am trying to implement a simple Java event-handler lambda for AWS. It receives SQS events and should make appropriate updates to the DynamoDB table.
One of the attributes in this table is a status field that has 4 defined states; therefore I wanted to use an enum class in Java and map it to this attribute.
Under AWS SDK v1 I could use the @DynamoDBTypeConvertedEnum annotation. But it does not exist anymore in v2. Instead, there is @DynamoDbConvertedBy(), which receives a converter class reference. There is also an EnumAttributeConverter class which should work nicely with it.
But for some reason, it does not work. The following is a snip from my current code:
@Data
@DynamoDbBean
@NoArgsConstructor
public class Task {

    @Getter(onMethod_ = {@DynamoDbPartitionKey})
    String id;

    ...

    @Getter(onMethod_ = {@DynamoDbConvertedBy(EnumAttributeConverter.class)})
    ExportTaskStatus status;
}
The enum looks as follows:
@RequiredArgsConstructor
public enum TaskStatus {
    @JsonProperty("running") PROCESSING(1),
    @JsonProperty("succeeded") COMPLETED(2),
    @JsonProperty("cancelled") CANCELED(3),
    @JsonProperty("failed") FAILED(4);

    private final int order;
}
With this, I get the following exception when launching the application:
Class 'class software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter' appears to have no default constructor thus cannot be used with the BeanTableSchema
For anyone else coming here, it looks to me like just dropping the annotation from the enum attribute altogether works just fine, i.e. the SDK applies the built-in attribute converters implicitly. This is also mentioned in this GitHub issue. My own class looks like this (Brand is an enum here), and the enum is converted without any issues when fetching items.
@Value
@Builder(toBuilder = true)
@DynamoDbImmutable(builder = User.UserBuilder.class)
public class User {

    @Getter(onMethod = @__({@DynamoDbPartitionKey}))
    String id;

    Brand brand;

    ...
}
How can I use Java Enums with Amazon DynamoDB and AWS SDK v2?
Although the documentation doesn't state it, the DynamoDbConvertedBy annotation requires any AttributeConverter you supply to have a parameterless default constructor.
Unfortunately for you and me, whoever wrote many of the built-in AttributeConverter classes decided to instantiate them with static create() methods instead of constructors (maybe they're singletons under the covers? I don't know). This means anyone who wants to use these helpful constructor-less classes like InstantAsStringAttributeConverter and EnumAttributeConverter needs to wrap them in custom wrapper classes that simply delegate to a converter instantiated with create(). For a non-generic class like InstantAsStringAttributeConverter, this is easy. Just create a wrapper class that delegates to the instance obtained from create() and refer to that instead:
import java.time.Instant;

import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.InstantAsStringAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class InstantAsStringAttributeConverterWithConstructor implements AttributeConverter<Instant> {

    private static final InstantAsStringAttributeConverter CONVERTER = InstantAsStringAttributeConverter.create();

    @Override
    public AttributeValue transformFrom(Instant instant) {
        return CONVERTER.transformFrom(instant);
    }

    @Override
    public Instant transformTo(AttributeValue attributeValue) {
        return CONVERTER.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<Instant> type() {
        return CONVERTER.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return CONVERTER.attributeValueType();
    }
}
Then you update your annotation to point to that class instead of the actual underlying library class.
But wait, EnumAttributeConverter is a generic class, which means you need to go one step further. First, you need to create a version of the converter that wraps the official version but relies on a constructor taking in the type instead of static instantiation:
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class EnumAttributeConverterWithConstructor<T extends Enum<T>> implements AttributeConverter<T> {

    private final EnumAttributeConverter<T> converter;

    public EnumAttributeConverterWithConstructor(final Class<T> enumClass) {
        this.converter = EnumAttributeConverter.create(enumClass);
    }

    @Override
    public AttributeValue transformFrom(T t) {
        return this.converter.transformFrom(t);
    }

    @Override
    public T transformTo(AttributeValue attributeValue) {
        return this.converter.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<T> type() {
        return this.converter.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return this.converter.attributeValueType();
    }
}
But that only gets us halfway there: now we need a subclass of our custom class for each enum type we want to convert:
public class ExportTaskStatusAttributeConverter extends EnumAttributeConverterWithConstructor<ExportTaskStatus> {
    public ExportTaskStatusAttributeConverter() {
        super(ExportTaskStatus.class);
    }
}

@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)
public ExportTaskStatus getStatus() { return this.status; }
Or the Lombok-y way:
@Getter(onMethod_ = {@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)})
ExportTaskStatus status;
It's a pain. It's a pain that could be solved with a little bit of tweaking and a tiny bit of reflection in the AWS SDK, but it's where we're at right now.
I am thinking that your annotations might actually be the problem here. I would remove all annotations that mention a constructor, and instead, write out your own constructor(s). For both Task and TaskStatus.
The dynamodb-enhanced SDK does this out of the box.
When you declare a @DynamoDbBean, the DefaultAttributeConverterProvider supplies a long list of converters between Java types and DynamoDB attributes, including an EnumAttributeConverter which is used if type.rawClass().isEnum() is true. So you don't need to worry about it.
If you ever wanted to extend the number of converters, you would need to add the converterProviders annotation parameter, and declare the default one (or omit it), as well as any other providers you want.
Example:
@DynamoDbBean(converterProviders = { DefaultAttributeConverterProvider.class, MyCustomAttributeConverterProvider.class })
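For completeness, a custom provider is just an implementation of AttributeConverterProvider. A rough sketch (MyCustomAttributeConverterProvider is the hypothetical name used in the example above; here it simply hands out an EnumAttributeConverter for TaskStatus and defers everything else to the remaining providers in the chain):

import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverterProvider;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter;

public class MyCustomAttributeConverterProvider implements AttributeConverterProvider {

    @Override
    @SuppressWarnings("unchecked")
    public <T> AttributeConverter<T> converterFor(EnhancedType<T> enhancedType) {
        // Serve a converter only for the types we handle; returning null lets
        // the other registered providers (e.g. the default one) take over.
        if (enhancedType.rawClass().equals(TaskStatus.class)) {
            return (AttributeConverter<T>) EnumAttributeConverter.create(TaskStatus.class);
        }
        return null;
    }
}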
Solution based on watkinsmatthewp's answer:
public class TaskStatusConverter implements AttributeConverter<TaskStatus> {

    @Delegate
    private final EnumAttributeConverter<TaskStatus> converter;

    public TaskStatusConverter() {
        converter = EnumAttributeConverter.create(TaskStatus.class);
    }
}
Task status attribute looks like this:
@Getter(onMethod_ = {@DynamoDbConvertedBy(TaskStatusConverter.class)})
TaskStatus status;
I wonder if there is a possibility to pass values dynamically to an annotation attribute.
I know that annotations are not designed to be modified, but I'm using Hibernate Filters and the conditions to apply are not static in my case.
I think the only solution is to use libraries that read and modify bytecode, such as Javassist or ASM, but it would be much better if there were another solution.
PS: The difficulty in my case is that I need to modify an annotation's attribute value, but the libraries I mentioned above allow creating bytecode, not editing it, which is why I'm looking for another solution.
Thanks in advance.
I don't know if it integrates nicely with your frameworks, but I would like to suggest the following:
Create an annotation which receives a Class that implements the validation rule
Create an interface which the annotation can receive
Create an implementation for the interface which has the logic for your rule
Add the annotations to your model class
Create an annotation processor which applies the validation for each annotated field
I wrote the following example in Groovy, but using standard Java libs and idiomatic Java. Warn me if anything is unreadable:
import java.lang.annotation.*

// Our Rule interface
interface Rule<T> { boolean isValid(T t) }

// Here is the annotation which can receive a Rule class
@Retention(RetentionPolicy.RUNTIME)
@interface Validation { Class<? extends Rule> value() }

// An implementation of our Rule, in this case, for a Person's name
class NameRule implements Rule<Person> {
    PersonDAO dao = new PersonDAO()

    boolean isValid(Person person) {
        Integer mode = dao.getNameValidationMode()
        if (mode == 1) { // Don't hardcode numbers; use enums
            return person.name ==~ "[A-Z]{1}[a-z ]{2,25}" // regex matching
        } else if (mode == 2) {
            return person.name ==~ "[a-zA-Z]{1,25}"
        }
    }
}
After these declarations, the usage:
// Our model with an annotated field
class Person {
    @Validation(NameRule.class)
    String name
}

// Here we are mocking a database select to get the rule saved in the database
// Don't use hardcoded numbers, stick to an enum or anything else
class PersonDAO { Integer getNameValidationMode() { return 1 } }
The processing of the annotations:
// Here we get each annotation and process it against the object
class AnnotationProcessor {
    String validate(Person person) {
        def annotatedFields = person.class.declaredFields.findAll { it.annotations.size() > 0 }
        for (field in annotatedFields) {
            for (annotation in field.annotations) {
                Rule rule = annotation.value().newInstance()
                if (!rule.isValid(person)) {
                    return "Error: name is not valid"
                } else {
                    return "Valid"
                }
            }
        }
    }
}
And tests:
// These two must pass
assert new AnnotationProcessor().validate(
new Person(name: "spongebob squarepants") ) == "Error: name is not valid"
assert new AnnotationProcessor().validate(
new Person(name: "John doe") ) == "Valid"
Also, take a look at GContracts; it provides an interesting validation-through-annotations model.
Annotation parameters are hard coded constants in the classfile. So the only way to change them is to generate a new classfile.
Unfortunately, I'm not familiar with Hibernate, so I can't suggest the best option in your specific case.
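To illustrate the point (my own example; MyFilter is a made-up annotation): annotation attribute values must be compile-time constants, so anything computed at runtime is rejected by the compiler.

public class FilterExample {

    static final String STATIC_CONDITION = "status = 'ACTIVE'";

    // Compiles: the value is a compile-time constant.
    @MyFilter(condition = STATIC_CONDITION)
    void queryActive() { }

    // Does not compile: the value is only known at runtime.
    // @MyFilter(condition = System.getProperty("filter.condition"))
    // void queryDynamic() { }
}

// Made-up annotation, only for the illustration above.
@interface MyFilter {
    String condition();
}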
// Base DAO
public abstract class BaseDAO<T extends BaseDTO> {

    public void update(T t) throws DBException {
        Field[] fieldsToInsert = t.getClass().getDeclaredFields();
        // code to update database object academic or event
    }

    public Integer create(T t) throws DBException {
        Field[] fieldsToInsert = t.getClass().getDeclaredFields();
        // code to create academic or event in database
        return null; // return the generated ID
    }
}

// Concrete DAOs
public class AcademicDAO extends BaseDAO<AcademicDTO> {
    // provide implementation
}

public class EventDAO extends BaseDAO<EventDTO> {
    // provide implementation
}

// Transfer objects
public class AcademicDTO extends BaseDTO {
    String title;
    String surname;
    // getters and setters
}

public class BaseDTO {
    protected Integer ID;

    public Integer getID() {
        return ID;
    }

    public void setID(Integer ID) {
        this.ID = ID;
    }
}
Hello guys, I have some sample code that follows the above structure to create a small Java application to manage academics and events. It loosely follows this pattern.
1 - You experts are more familiar with this pattern than I am. I would like to understand why generics are used here so that DAOs can extend and implement a generic base class. It would be great if someone could show, with an example, how generics are advantageous in this case.
2 - I have also noticed the use of java.lang.reflect.Field. Is there a link between generics and Fields?
I would like to document the DAO pattern in an academic report, but I am finding it difficult to understand how generics and reflection (Field) play a part here. Do they support flexibility and loose coupling?
The code you've provided is a reusable set of logic to load and persist entities. Many times, in an application of non-trivial size, you'll wind up persisting many different types of objects. In this example, you can define as many objects as necessary, but only define the logic to actually save and load once. By asking the DTO what Field objects are there, it can get at the data to help construct queries for loading and saving.
Generics allow you to use this pattern while maintaining type safety. AcademicDAO can only handle AcademicDTO. You can't use AcademicDAO to store EventDTO. Generics allow the instance of the class to rely on a more specific type when dealing with the Field objects. If you didn't have generics, the BaseDAO would take Object, and you wouldn't be able to access any methods except those that Object provides because the JVM wouldn't know what class is provided, so it has to limit its knowledge to that of Object. Using getClass().getDeclaredFields() bypasses that limitation because getClass() returns the actual class of the Object parameter.
Field is just a way to use reflection to access the values of the properties in each DTO. If you had to access the fields directly, with getTitle(), you couldn't reuse a generic base class to do your persistence. What would happen when you needed to access EventDTO? You would have to provide logic for that. Field allows you to skip that logic.
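As a rough sketch of what that buys you (my own illustration, assuming column names match field names and the table name is supplied by the caller), the generic base DAO can derive its SQL from the declared fields alone:

import java.lang.reflect.Field;
import java.util.StringJoiner;

public abstract class ReflectiveBaseDAO<T extends BaseDTO> {

    // Builds e.g. "UPDATE academic SET title = ?, surname = ? WHERE ID = ?"
    // from whatever fields the concrete DTO declares.
    public String buildUpdateSql(T t, String tableName) {
        StringJoiner assignments = new StringJoiner(", ");
        for (Field field : t.getClass().getDeclaredFields()) {
            assignments.add(field.getName() + " = ?");
        }
        return "UPDATE " + tableName + " SET " + assignments + " WHERE ID = ?";
    }
}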
Edit:
To explain what I mean by accessing getID, you could do the following within BaseDAO because T is known to be a BaseDTO with a getID() method defined:
public abstract class BaseDAO<T extends BaseDTO> {
    public boolean update(T t) throws DBException {
        Integer id = t.getID();
        Field[] fields = t.getClass().getDeclaredFields();
        // Assuming you have a db object to execute queries using bind variables:
        boolean success = db.execute("UPDATE table SET ... WHERE id = ?", id.intValue());
        return success;
    }
}
If you had this instead (in a non-generic class):
public boolean update(Object o) throws DBException {
    // This line doesn't work, since Object doesn't have a getID() method.
    Integer id = o.getID();
    Field[] fields = o.getClass().getDeclaredFields();
    boolean success = db.execute("UPDATE table SET ... WHERE id = ?", id.intValue());
    return success;
}
You'd have to look through those Field objects, or ask for the ID field and assume it existed.
For question 1. The use of generics allows the same implementations of update and create to be used regardless of the type of the DTO. Consider if you didn't use generics. Then the best you could do for the parameter type of update would be BaseDTO, but then you could call
academicDAO.update( eventDTO )
which doesn't make sense. With the code as you have it, this would be a type error. So the main advantage is: better type checking.
For question 2. The use of Fields allows a single implementation of update and create to work on DTO object of various concrete types.
I need to do some cleanup of invisible characters (\r\n) and HTML tags for specific getters on my entities.
I've been trying to use mix-ins to modify what's returned from the entity, but I'm not sure how I can reference the target class in my mix-in so I can add the cleanup logic there. From my tests it seems my method is not even called.
This is what I have so far, but it never gets called:
public abstract class BookMixIn {
    @JsonProperty
    public String getTitle() {
        return StringUtils.deleteWhitespace(getTitle());
    }
}
public class Book {
    private String title;
    // getter/setters omitted...
}
And the ObjectMapper config:
mapper.getSerializationConfig().addMixInAnnotations(com.company.Book.class,
        com.company.BookMixIn.class);
mapper.configure(SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS, false);
String tmp = mapper.writeValueAsString(book);
log.info(tmp);
Can this be accomplished via MixIns?
Thanks
Jackson mix-ins are purely for associating annotations; they are not used for adding behavior (code).
So they would not help you here.
But a simple way that would work (possibly using a mix-in too) is to add an annotation for a custom serializer, which can apply whatever filtering is needed:
@JsonSerialize(using=MyCoolSerializer.class) public String getTitle() { }
So either add that to the POJO, if possible; if not, associate it using a mix-in.
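A minimal sketch of such a serializer (MyCoolSerializer is just a placeholder name; this assumes the Jackson 1.x API used elsewhere in the question, and the cleanup shown is only an example):

import java.io.IOException;

import org.codehaus.jackson.JsonGenerator;
import org.codehaus.jackson.map.JsonSerializer;
import org.codehaus.jackson.map.SerializerProvider;

public class MyCoolSerializer extends JsonSerializer<String> {

    @Override
    public void serialize(String value, JsonGenerator jgen, SerializerProvider provider) throws IOException {
        if (value == null) {
            jgen.writeNull();
        } else {
            // Strip line breaks; add HTML-tag stripping or other cleanup as needed.
            jgen.writeString(value.replaceAll("[\\r\\n]", ""));
        }
    }
}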
If you are running Jackson 1.9, this works:
BookCleaner cleanBook = new BookCleaner(book);
mapper.getSerializationConfig().addMixInAnnotations(Book.class, BookMixIn.class);
mapper.writeValueAsString(cleanBook);

@JsonSerialize
class BookCleaner {
    private Book book;

    public BookCleaner(final Book book) { this.book = book; }

    @JsonUnwrapped
    public Book getBook() { return book; }

    @JsonProperty("title")
    public String getCleanTitle() { return cleanup(getBook().getTitle()); }
}

public interface BookMixIn {
    @JsonIgnore public String getTitle();
}
I don't think it works like this; the class or interface is just used as a signature.
You could use AspectJ to modify the return value, but it might be easier to just create a decorator and serialize that instead of the underlying object.
Alternatively, you could create specific getters for the "safe" versions of things and use the @JsonProperty annotation to give it the name you need, and use @JsonIgnore on the "non-safe" getters.
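For instance (my own sketch; the clean() helper that strips \r\n and HTML tags is assumed):

public class Book {

    private String title;

    @JsonIgnore
    public String getTitle() {
        return title;
    }

    // Serialized under the original property name, but with the cleaned value.
    @JsonProperty("title")
    public String getSafeTitle() {
        return clean(title);
    }
}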
How do I tell Xstream to serialize only fields which are annotated explicitly and ignore the rest?
I am trying to serialize a hibernate persistent object and all proxy related fields get serialized which I don’t want in my xml.
e.g.
<createdBy class="com..domain.Users " reference="../../values/createdBy"/>
is not something I want in my xml.
Edit: I don't think I made this question clear. A class may inherit from a base class over which I have no control (as in Hibernate's case), including the base class properties.
public class A {
    private String ShouldNotBeSerialized;
}

public class B extends A {
    @XStreamAlias("1")
    private String ThisShouldbeSerialized;
}
In this case when I serialize class B, the base class field ShouldNotBeSerialized will also get serialized. This is not something I want. In most circumstances I will not have control on class A.
Therefore I want to omit all fields by default and serialize only fields for which I explicitly specify the annotation. I want to avoid what GaryF is doing, where I need to explicitly specify the fields I need to omit.
You can omit fields with the @XStreamOmitField annotation. Straight from the manual:
@XStreamAlias("message")
class RendezvousMessage {

    @XStreamOmitField
    private int messageType;

    @XStreamImplicit(itemFieldName = "part")
    private List<String> content;

    @XStreamConverter(SingleValueCalendarConverter.class)
    private Calendar created = new GregorianCalendar();

    public RendezvousMessage(int messageType, String... content) {
        this.messageType = messageType;
        this.content = Arrays.asList(content);
    }
}
I can take no credit for this answer, just sharing what I have found. You can override the wrapMapper method of the XStream class to achieve what you need.
This link explains in detail: http://pvoss.wordpress.com/2009/01/08/xstream/
Here is the code you need if you don't want the explanation:
// Set up the XStream object so that it ignores any undefined tags
XStream xstream = new XStream() {
    @Override
    protected MapperWrapper wrapMapper(MapperWrapper next) {
        return new MapperWrapper(next) {
            @Override
            public boolean shouldSerializeMember(Class definedIn, String fieldName) {
                if (definedIn == Object.class) {
                    return false;
                }
                return super.shouldSerializeMember(definedIn, fieldName);
            }
        };
    }
};
You might want to do all your testing before you implement this code because the exceptions thrown by the default XStream object are useful for finding spelling mistakes.
There was already a ticket for the XStream people:
Again, this is by design. XStream is a serialization tool, not a data
binding tool. It is made to serialize Java objects to XML and back. It
will write anything into XML that is necessary to recreate an equal
object graph. The generated XML can be tweaked to some extend by
configuration for convenience, but this is already an add-on. What you
like to do can be done by implementing a custom mapper, but that's a
question for the user's list and cannot be handled here.
http://jira.codehaus.org/browse/XSTR-569
I guess the only direct way is to dive into writing a MapperWrapper and exclude all fields you have not annotated. Sounds like a feature request for XStream.
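A rough sketch of that idea, adapting the wrapMapper override from the earlier answer (my own untested adaptation; it assumes you only want fields carrying @XStreamAlias):

XStream xstream = new XStream() {
    @Override
    protected MapperWrapper wrapMapper(MapperWrapper next) {
        return new MapperWrapper(next) {
            @Override
            public boolean shouldSerializeMember(Class definedIn, String fieldName) {
                try {
                    // Serialize only fields explicitly annotated with @XStreamAlias.
                    return definedIn.getDeclaredField(fieldName)
                            .isAnnotationPresent(XStreamAlias.class);
                } catch (NoSuchFieldException e) {
                    // Field not declared directly on this class; fall back to the default handling.
                    return super.shouldSerializeMember(definedIn, fieldName);
                }
            }
        };
    }
};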