I have an entity with a field of an enum type:
@Column(name = "COL_NAME")
@Convert(converter = EnumConverter.class)
private COLNAME colname;
I need a generic converter (I don't want to write a new converter for each enum I'll have in my entities):
import java.lang.reflect.*;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

@Converter(autoApply = false)
public class EnumConverter implements AttributeConverter<Object, String> {

    @Override
    public String convertToDatabaseColumn(Object attribute) {
        String valueToConvert = attribute.toString();
        // do something with valueToConvert
        return valueToConvert;
    }

    @Override
    public Object convertToEntityAttribute(String dbData) {
        // build the object to return from the dbData read from the DB
        return null; // placeholder: recovering the enum constant is the open question below
    }
}
In convertToEntityAttribute I tried Enum.valueOf, but this method needs the enum class. How can I find it? ... assuming this is the correct approach at all.
Thanks
P.S. Googling, I found some approaches that minimize the code written, but in every case I must still write one class per enum, and I don't want that. Is it possible to avoid?
So, in essence, you're asking how to inject the type of the annotated property into the AttributeConverter. I'm afraid that's impossible with vanilla JPA.
If you're using Hibernate, you could use a composite user type instead. See here, specifically section 4.4, 'Type Parameterization'. You'd end up with something like:
@Type(type = "com.example.ConvertibleEnumType", parameters = @Parameter(name = "lookup", value = "com.example.MyEnum"))
private MyEnum property;
and you'd still have to rely heavily on reflection inside your custom ConvertibleEnumType definition, but it would work: you would be able to read the value of lookup inside setParameterValues.
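For illustration, a minimal sketch of the parameter-reading part, assuming Hibernate's ParameterizedType contract (the class body is hypothetical and elides the rest of the user type methods a real implementation must provide):
import java.util.Properties;
import org.hibernate.usertype.ParameterizedType;

// Sketch only: a real ConvertibleEnumType must also implement the remaining
// user type methods; this shows just how the "lookup" parameter becomes an
// enum Class via reflection.
public class ConvertibleEnumType implements ParameterizedType {

    private Class<? extends Enum<?>> enumClass;

    @Override
    @SuppressWarnings("unchecked")
    public void setParameterValues(Properties parameters) {
        String lookup = parameters.getProperty("lookup");
        try {
            this.enumClass = (Class<? extends Enum<?>>) Class.forName(lookup);
        } catch (ClassNotFoundException e) {
            throw new IllegalArgumentException("Unknown enum class: " + lookup, e);
        }
    }
}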
(TBH, personally I'd still consider a separate converter per enum, using e.g. the approach described here, to be the cleaner solution; a sketch of that pattern follows.)
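For reference, a minimal sketch of that per-enum pattern (the class names are mine, not from the linked answer): a generic base class carries all the logic, and each enum needs only a trivial subclass:
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

// Generic base: does the actual conversion using the enum class handed in
// by the subclass constructor.
public abstract class AbstractEnumConverter<E extends Enum<E>>
        implements AttributeConverter<E, String> {

    private final Class<E> enumClass;

    protected AbstractEnumConverter(Class<E> enumClass) {
        this.enumClass = enumClass;
    }

    @Override
    public String convertToDatabaseColumn(E attribute) {
        return attribute == null ? null : attribute.name();
    }

    @Override
    public E convertToEntityAttribute(String dbData) {
        return dbData == null ? null : Enum.valueOf(enumClass, dbData);
    }
}

// One tiny subclass per enum, which is exactly the boilerplate the question
// hoped to avoid.
@Converter(autoApply = true)
public class MyEnumConverter extends AbstractEnumConverter<MyEnum> {
    public MyEnumConverter() {
        super(MyEnum.class);
    }
}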
I am trying to implement a simple Java event-handler Lambda for AWS. It receives SQS events and should make the appropriate updates to a DynamoDB table.
One of the attributes in this table is a status field that has four defined states, so I wanted to use a Java enum and map it to this attribute.
Under AWS SDK v1 I could use the @DynamoDBTypeConvertedEnum annotation. But it does not exist anymore in v2. Instead, there is @DynamoDbConvertedBy(), which receives a converter class reference. There is also an EnumAttributeConverter class which should work nicely with it.
But for some reason, it does not work. The following is a snippet from my current code:
@Data
@DynamoDbBean
@NoArgsConstructor
public class Task {

    @Getter(onMethod_ = {@DynamoDbPartitionKey})
    String id;

    ...

    @Getter(onMethod_ = {@DynamoDbConvertedBy(EnumAttributeConverter.class)})
    ExportTaskStatus status;
}
The enum looks as follows:
@RequiredArgsConstructor
public enum TaskStatus {
    @JsonProperty("running") PROCESSING(1),
    @JsonProperty("succeeded") COMPLETED(2),
    @JsonProperty("cancelled") CANCELED(3),
    @JsonProperty("failed") FAILED(4);

    private final int order;
}
With this, I get the following exception when launching the application:
Class 'class software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter' appears to have no default constructor thus cannot be used with the BeanTableSchema
For anyone else coming here, it looks to me like just dropping the annotation altogether works just fine, i.e. the SDK applies the provided attribute converters implicitly. This is also mentioned in this GitHub issue. My own class looks like this (Brand is an enum here), and the enum is converted without any issues when fetching items.
@Value
@Builder(toBuilder = true)
@DynamoDbImmutable(builder = User.UserBuilder.class)
public class User {

    @Getter(onMethod = @__({@DynamoDbPartitionKey}))
    String id;

    Brand brand;

    ...
}
How can I use Java Enums with Amazon DynamoDB and AWS SDK v2?
Although the documentation doesn't state it, the @DynamoDbConvertedBy annotation requires any AttributeConverter you supply to have a parameterless default constructor.
Unfortunately for you and me, whoever wrote many of the built-in AttributeConverter classes decided to use static create() methods to instantiate them instead of a constructor (maybe they're singletons under the covers? I don't know). This means anyone who wants to use these helpful constructor-less classes like InstantAsStringAttributeConverter and EnumAttributeConverter needs to wrap them in custom wrapper classes that simply parrot the converter we instantiate using create(). For a non-generically-typed class like InstantAsStringAttributeConverter, this is easy. Just create a wrapper class that parrots the instance you new up with create() and refer to that instead:
// Imports assume the v2 enhanced-client packages, matching the exception above.
import java.time.Instant;
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.InstantAsStringAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class InstantAsStringAttributeConverterWithConstructor implements AttributeConverter<Instant> {

    private static final InstantAsStringAttributeConverter CONVERTER = InstantAsStringAttributeConverter.create();

    @Override
    public AttributeValue transformFrom(Instant instant) {
        return CONVERTER.transformFrom(instant);
    }

    @Override
    public Instant transformTo(AttributeValue attributeValue) {
        return CONVERTER.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<Instant> type() {
        return CONVERTER.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return CONVERTER.attributeValueType();
    }
}
Then you update your annotation to point to that class instead of the actual underlying library class.
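For instance, a hedged usage sketch (the field name createdAt is made up, mirroring the Lombok style used elsewhere in this question):
@Getter(onMethod_ = {@DynamoDbConvertedBy(InstantAsStringAttributeConverterWithConstructor.class)})
Instant createdAt;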
But wait: EnumAttributeConverter is a generically typed class, which means you need to go one step further. First, you need to create a version of the converter that wraps the official version but relies on a constructor taking in the type instead of static instantiation:
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class EnumAttributeConverterWithConstructor<T extends Enum<T>> implements AttributeConverter<T> {

    private final EnumAttributeConverter<T> converter;

    public EnumAttributeConverterWithConstructor(final Class<T> enumClass) {
        this.converter = EnumAttributeConverter.create(enumClass);
    }

    @Override
    public AttributeValue transformFrom(T t) {
        return this.converter.transformFrom(t);
    }

    @Override
    public T transformTo(AttributeValue attributeValue) {
        return this.converter.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<T> type() {
        return this.converter.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return this.converter.attributeValueType();
    }
}
But that only gets us halfway there; now we need to generate, for each enum type we want to convert, a version that subclasses our custom class:
public class ExportTaskStatusAttributeConverter extends EnumAttributeConverterWithConstructor<ExportTaskStatus> {
    public ExportTaskStatusAttributeConverter() {
        super(ExportTaskStatus.class);
    }
}

@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)
public ExportTaskStatus getStatus() { return this.status; }
Or the Lombok-y way:
@Getter(onMethod_ = {@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)})
ExportTaskStatus status;
It's a pain. It's a pain that could be solved with a little bit of tweaking and a tiny bit of reflection in the AWS SDK, but it's where we're at right now.
I am thinking that your annotations might actually be the problem here. I would remove all annotations that mention a constructor and instead write out your own constructor(s), for both Task and TaskStatus.
The dynamodb-enhanced SDK does this out of the box.
When you declare a @DynamoDbBean, the DefaultAttributeConverterProvider provides a long list of possible ways to convert attributes between Java types, including an EnumAttributeConverter which is used if type.rawClass().isEnum() is true. So you don't need to worry about it.
If you ever wanted to extend the number of converters, you would need to add the converterProviders annotation parameter, and declare the default one (or omit it), as well as any other providers you want.
Example:
@DynamoDbBean(converterProviders = { DefaultAttributeConverterProvider.class, MyCustomAttributeConverterProvider.class })
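For completeness, a hedged sketch of what MyCustomAttributeConverterProvider could look like (the name comes from the example above; returning null is how a provider defers types it does not handle to the next provider in the chain):
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverterProvider;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter;

// Hypothetical provider serving a single hand-built converter.
public class MyCustomAttributeConverterProvider implements AttributeConverterProvider {

    @Override
    @SuppressWarnings("unchecked")
    public <T> AttributeConverter<T> converterFor(EnhancedType<T> type) {
        if (type.rawClass().equals(TaskStatus.class)) {
            return (AttributeConverter<T>) EnumAttributeConverter.create(TaskStatus.class);
        }
        return null; // defer to DefaultAttributeConverterProvider
    }
}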
Solution based on watkinsmatthewp's answer:
public class TaskStatusConverter implements AttributeConverter<TaskStatus> {

    @Delegate
    private final EnumAttributeConverter<TaskStatus> converter;

    public TaskStatusConverter() {
        converter = EnumAttributeConverter.create(TaskStatus.class);
    }
}
The task status attribute looks like this:
@Getter(onMethod_ = {@DynamoDbConvertedBy(TaskStatusConverter.class)})
TaskStatus status;
I am building Spring Boot WebFlux REST API functionality that needs to work with data carrying Java type information (let's consider String, Integer, and Double for example) as part of JSON requests/responses. The attribute representing the Java type must be persistable in MongoDB as well (which should not be a problem once JSON can work with such an attribute). I have the following model class and type enumeration, which the REST API uses to serialize/deserialize JSON messages.
@Getter
@ToString
@EqualsAndHashCode(exclude = "id")
@Document(collection = "core_scheme")
@JsonDeserialize(builder = SchemeModel.Builder.class)
@Builder(builderClassName = "Builder", toBuilder = true, setterPrefix = "with")
public class SchemeModel {

    @Id
    private final String id;

    @Field(name = "userId") private final String userId;
    @Field(name = "date") private final String creationDate;
    @Field(name = "properties") private final Map<String, SchemeTypes> properties;
}
public enum SchemeTypes {
    INTEGER, STRING, DOUBLE
}
Serialization and deserialization work well. The problem is that when I want to resolve the real Java types stored inside the Map<String, SchemeTypes> properties map, I need to do a mapping similar to this (just an abstraction, not real code):
SchemeTypes.INTEGER => Java Integer class
SchemeTypes.STRING => Java String class
SchemeTypes.DOUBLE => Java Double class
Is there any simpler way to represent the Java types stored inside the model class and used within serialized/deserialized JSON, so that I can directly deduce the Java type without additional validation that it is a valid Java type? For example, if the types enumerated inside the mentioned enum had exactly the same naming as the real Java types, I could do the following without any mapping:
public void deduceClass(SchemeTypes type) throws ClassNotFoundException {
    Class<?> myClass = Class.forName(type.toString());
}
Note that I am looking for a solution that works out of the box (so I don't have to validate the types provided by the user). If such a solution would be harder to implement than the mentioned mapping, I will stick with the mapping.
If you weren't persisting this entity, you could actually map each SchemeTypes constant directly to the corresponding class, like the following:
public enum SchemeTypes {
    INTEGER(Integer.class), STRING(String.class), DOUBLE(Double.class);

    private final Class<?> clazz;

    private SchemeTypes(Class<?> clazz) {
        this.clazz = clazz;
    }

    public Class<?> getClazz() {
        return clazz;
    }
}
But as you are persisting this, it could cause some issues on deserialization.
To overcome this, maybe you can store not the SchemeTypes instance directly but just the name of the enum constant, like the following:
private final Map<String, String> properties;
and find the corresponding clazz value with a static method on the enum, like the following:
public static Class<?> findClazzFor(String schemeTypeName) {
    return SchemeTypes.valueOf(schemeTypeName).getClazz();
}
Nevertheless, I think the cleanest solution would be keeping the SchemeTypes-to-class mapping somewhere as a one-to-one map, and retrieving the corresponding class for a provided scheme type as in the getClazz method above.
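A hedged sketch of that one-to-one map (SchemeTypeClasses is a hypothetical holder class; Map.of requires Java 9+):
import java.util.Map;

// Keeps the persisted enum free of class references while still giving a
// single place to resolve SchemeTypes to Java classes.
public final class SchemeTypeClasses {

    private static final Map<SchemeTypes, Class<?>> CLASSES = Map.of(
            SchemeTypes.INTEGER, Integer.class,
            SchemeTypes.STRING, String.class,
            SchemeTypes.DOUBLE, Double.class);

    private SchemeTypeClasses() {
    }

    public static Class<?> classFor(SchemeTypes type) {
        return CLASSES.get(type);
    }
}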
I'm looking for a way to bind a Type to specific entity fields during the entity manager configuration phase. I need it to be able to apply extra "rules" to a target entity field from an external source, without changes to the entity class.
So basically I'm trying to avoid the hardcoded @Type annotation approach below:
@Type(type = "foo.package.MyType", parameters = {
    @Parameter(name = "fooProperty", value = "fooValue")
})
private String someField;
Instead, I would like to set the Type for someField programmatically while building the model.
Here's one way I've seen before. It is a little low-level, so I suspect there is a cleaner way to do this.
This uses a custom Persister in Hibernate to allow us to substitute the type while the SessionFactory (EntityManagerFactory) is being created.
First, the @Persister annotation is used to declare the custom Persister:
@Entity
@Persister(impl = MyPersister.class)
public class EntityWithPersister {

    private String someField;
    // ...
}
Then normally the custom persister should extend SingleTableEntityPersister in Hibernate. If the entity is using a different @Inheritance(strategy), then it may need to extend JoinedSubclassEntityPersister or UnionSubclassEntityPersister instead.
This offers the chance to change a type at the point of construction, for example:
public class MyPersister extends SingleTableEntityPersister {

    public MyPersister(PersistentClass persistentClass,
                       EntityDataAccess cacheAccessStrategy,
                       NaturalIdDataAccess naturalIdRegionAccessStrategy,
                       PersisterCreationContext creationContext)
            throws HibernateException {
        super(modify(persistentClass), cacheAccessStrategy,
                naturalIdRegionAccessStrategy, creationContext);
    }

    private static PersistentClass modify(PersistentClass persistentClass) {
        SimpleValue value = (SimpleValue) persistentClass
                .getProperty("someField").getValue();
        value.setTypeName(MyType.class.getName());
        return persistentClass;
    }
}
If you need to access more of the context you are in, creationContext.getSessionFactory() is probably a good starting point.
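If the external rules also need to pass parameters (like the fooProperty/fooValue pair from the question), here is a hedged sketch of the idea, assuming SimpleValue.setTypeParameters(java.util.Properties) is available in your Hibernate version, placed inside the same modify method:
// Hypothetical addition to modify(): supply type parameters programmatically,
// mirroring what @Type's parameters attribute would have declared.
Properties params = new Properties();
params.setProperty("fooProperty", "fooValue");
value.setTypeParameters(params);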
I have a structure of objects representing a Questionnaire and I need to serialize it to JSON.
One class in the structure is OpenQuestion, and this class uses generics with two parameters.
The problem starts when one of the types used is Date: the date is serialized incorrectly, as a long.
Class code:
public class OpenQuestion<valueType, validationType> extends AbstractQuestion implements Serializable {
    private valueType value;
    private validationType minValue;
    private validationType maxValue;
    ...
}
I saw how to serialize a date in a hash map when the hash map always uses a Date, but in this case I use the class with String, Integer, or Date.
Any idea how to solve this?
Thanks
You can add a @JsonTypeInfo annotation for this. There are two ways of using it:
Get it to automatically add a type annotation to your object, so it knows what to deserialize it as.
Add a custom type resolver to handle this for you.
The first will make your JSON ugly, but requires very little extra code and doesn't force you to write custom serializers. The latter is more difficult, but will result in cleaner JSON. Overall, the problem is partly that one of your types (Date) isn't modelled in JSON, so you'll probably need it to be serialized as an integer or String type in your JSON.
The former option looks a bit like this:
@JsonTypeInfo(use = Id.CLASS, include = As.WRAPPER_PROPERTY)
private validationType minValue;
This should encode, say, a String value as something like:
{ __type = "java.lang.String", value = "Hello, World" }
No promises on that being accurate as this is mostly from memory!
It depends. If you do know the expected type, you just pass a generic type reference:
OpenQuestion<Value, Validation> v = objectMapper.readValue(json,
        new TypeReference<OpenQuestion<Value, Validation>>() { });
as that clues Jackson in as to the expected type.
If you do not know it, then the other answer shows how to use @JsonTypeInfo.
As pointed out by @MiserableVariable, Jackson serializes (most) date fields as (numeric long) timestamps by default. You can override this behavior in a number of ways.
If using your own instance of ObjectMapper, override a property to write dates as ISO-8601:
objectMapper.configure(SerializationConfig.Feature.WRITE_DATES_AS_TIMESTAMPS, false);
If using your own instance of ObjectMapper, to have dates written in your own custom format:
objectMapper.setDateFormat(myDateFormat); // 1.8 and above
objectMapper.getSerializationConfig().setDateFormat(myDateFormat); // for earlier versions (deprecated for 1.8+)
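For illustration, a possible myDateFormat (the pattern here is an arbitrary example, not a recommendation):
// Any java.text.DateFormat will do; SimpleDateFormat is the usual choice.
DateFormat myDateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
objectMapper.setDateFormat(myDateFormat);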
To leave the default serialization behavior for most fields, but override it for certain fields on certain objects, use a custom serializer:
public class MyBean implements Serializable {
    private Date postDate;
    // ... constructors, etc

    @JsonSerialize(using = MyCustomDateSerializer.class)
    public Date getPostDate() {
        return postDate;
    }
}
public class MyCustomDateSerializer extends JsonSerializer<Date> {
    @Override
    public void serialize(final Date date, final JsonGenerator generator,
            final SerializerProvider provider) throws IOException,
            JsonProcessingException {
        generator.writeString(yourRepresentationHere);
    }
}
All of this information is available in the Jackson Documentation, with the bulk of it in the section dealing with date handling.
How do I tell XStream to serialize only fields which are explicitly annotated and ignore the rest?
I am trying to serialize a Hibernate persistent object, and all proxy-related fields get serialized, which I don't want in my XML.
e.g.
<createdBy class="com..domain.Users " reference="../../values/createdBy"/>
is not something I want in my XML.
Edit: I don't think I made this question clear. A class may inherit from a base class over which I have no control (as in Hibernate's case), including the base class properties.
public class A {
    private String ShouldNotBeSerialized;
}

public class B extends A {
    @XStreamAlias("1")
    private String ThisShouldbeSerialized;
}
In this case, when I serialize class B, the base class field ShouldNotBeSerialized will also get serialized. This is not something I want, and in most circumstances I will not have control over class A.
Therefore I want to omit all fields by default and serialize only the fields for which I explicitly specify the annotation. I want to avoid what GaryF is doing, where I need to explicitly specify the fields to omit.
You can omit fields with the @XStreamOmitField annotation. Straight from the manual:
#XStreamAlias("message")
class RendezvousMessage {
#XStreamOmitField
private int messageType;
#XStreamImplicit(itemFieldName="part")
private List<String> content;
#XStreamConverter(SingleValueCalendarConverter.class)
private Calendar created = new GregorianCalendar();
public RendezvousMessage(int messageType, String... content) {
this.messageType = messageType;
this.content = Arrays.asList(content);
}
}
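One caveat worth adding: as far as I know, XStream does not pick these annotations up automatically unless told to, so they need to be registered first. A hedged usage sketch:
XStream xstream = new XStream();
xstream.processAnnotations(RendezvousMessage.class); // or xstream.autodetectAnnotations(true)
String xml = xstream.toXML(new RendezvousMessage(42, "first", "second"));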
I can take no credit for this answer, just sharing what I have found. You can override the wrapMapper method of the XStream class to achieve what you need.
This link explains in detail: http://pvoss.wordpress.com/2009/01/08/xstream/
Here is the code you need if you don't want the explanation:
// Set up the XStream object so that it ignores any undefined tags
XStream xstream = new XStream() {
    @Override
    protected MapperWrapper wrapMapper(MapperWrapper next) {
        return new MapperWrapper(next) {
            @Override
            public boolean shouldSerializeMember(Class definedIn, String fieldName) {
                if (definedIn == Object.class) {
                    return false;
                }
                return super.shouldSerializeMember(definedIn, fieldName);
            }
        };
    }
};
You might want to do all your testing before you implement this code because the exceptions thrown by the default XStream object are useful for finding spelling mistakes.
There was already a ticket for the XStream people:
Again, this is by design. XStream is a serialization tool, not a data
binding tool. It is made to serialize Java objects to XML and back. It
will write anything into XML that is necessary to recreate an equal
object graph. The generated XML can be tweaked to some extent by
configuration for convenience, but this is already an add-on. What you
like to do can be done by implementing a custom mapper, but that's a
question for the user's list and cannot be handled here.
http://jira.codehaus.org/browse/XSTR-569
I guess the only direct way is to dive into writing a MapperWrapper and exclude all fields you have not annotated. Sounds like a feature request for XStream.
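A hedged sketch of that idea, building on the wrapMapper override from the earlier answer: treat a field as serializable only when it carries an XStream annotation (here @XStreamAlias; extend the check to whichever annotations you use):
import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.annotations.XStreamAlias;
import com.thoughtworks.xstream.mapper.MapperWrapper;

// Only serialize fields that are explicitly annotated; everything else,
// including fields inherited from uncontrolled base classes, is skipped.
XStream xstream = new XStream() {
    @Override
    protected MapperWrapper wrapMapper(MapperWrapper next) {
        return new MapperWrapper(next) {
            @Override
            public boolean shouldSerializeMember(Class definedIn, String fieldName) {
                if (definedIn == Object.class) {
                    return false;
                }
                try {
                    return definedIn.getDeclaredField(fieldName)
                            .isAnnotationPresent(XStreamAlias.class);
                } catch (NoSuchFieldException e) {
                    return super.shouldSerializeMember(definedIn, fieldName);
                }
            }
        };
    }
};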