I'm writing my first Java EE 6 web app as a learning exercise. I'm not using a framework, just JPA 2.0, EJB 3.1 and JSF 2.0.
I have a custom Converter that converts the value selected in a SelectOne component back to the corresponding JPA Entity. I'm using an InitialContext.lookup to obtain a reference to a Session Bean that finds the relevant Entity.
I'd like to create a generic Entity Converter so I don't have to create a converter per Entity. I thought I'd create an Abstract Entity and have all Entities extend it. Then create a Custom Converter for the Abstract Entity and use it as the converter for all Entities.
Does that sound sensible and/or practicable?
Would it make more sense not to have an abstract entity, just a converter that converts any entity? In that case I'm not sure how I'd obtain a reference to the appropriate Session Bean.
I've included my current converter because I'm not sure I'm obtaining a reference to my Session Bean in the most efficient manner.
package com.mycom.rentalstore.converters;
import com.mycom.rentalstore.ejbs.ClassificationEJB;
import com.mycom.rentalstore.entities.Classification;
import javax.faces.application.FacesMessage;
import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.convert.Converter;
import javax.faces.convert.ConverterException;
import javax.faces.convert.FacesConverter;
import javax.naming.InitialContext;
import javax.naming.NamingException;
@FacesConverter(forClass = Classification.class)
public class ClassificationConverter implements Converter {
private InitialContext ic;
private ClassificationEJB classificationEJB;
@Override
public Object getAsObject(FacesContext context, UIComponent component, String value) {
try {
ic = new InitialContext();
classificationEJB = (ClassificationEJB) ic.lookup("java:global/com.mycom.rentalstore_RentalStore_war_1.0-SNAPSHOT/ClassificationEJB");
} catch (NamingException e) {
throw new ConverterException(new FacesMessage(String.format("Cannot obtain InitialContext - %s", e)), e);
}
try {
return classificationEJB.getClassificationById(Long.valueOf(value));
} catch (Exception e) {
throw new ConverterException(new FacesMessage(String.format("Cannot convert %s to Classification - %s", value, e)), e);
}
}
@Override
public String getAsString(FacesContext context, UIComponent component, Object value) {
return String.valueOf(((Classification) value).getId());
}
}
I am using the JSF 2.0 view map:
@FacesConverter("entityConverter")
public class EntityConverter implements Converter {
private static final String key = "com.example.jsf.EntityConverter";
private static final String empty = "";
private Map<String, Object> getViewMap(FacesContext context) {
Map<String, Object> viewMap = context.getViewRoot().getViewMap();
@SuppressWarnings({ "unchecked", "rawtypes" })
Map<String, Object> idMap = (Map) viewMap.get(key);
if (idMap == null) {
idMap = new HashMap<String, Object>();
viewMap.put(key, idMap);
}
return idMap;
}
@Override
public Object getAsObject(FacesContext context, UIComponent c, String value) {
if (value.isEmpty()) {
return null;
}
return getViewMap(context).get(value);
}
@Override
public String getAsString(FacesContext context, UIComponent c, Object value) {
if (value == null) {
return empty;
}
String id = ((Persistent) value).getId().toString();
getViewMap(context).put(id, value);
return id;
}
}
Well I had the same problem today, and I solved it by creating a generic ConversionHelper and using it in the converter.
For this purpose I have an EntityService, which is a generic SLSB that I use to perform simple CRUD operations for any entity type. Also, my entities implement a PersistentEntity interface, which has getId and setId methods, and I stick to simple primary keys. That's it.
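Roughly, those two supporting pieces look like this (a simplified sketch, not the exact code):
// PersistentEntity.java
public interface PersistentEntity {
    Long getId();
    void setId(Long id);
}
// EntityService.java
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless
public class EntityService {

    @PersistenceContext
    private EntityManager em;

    // Generic finder used by the converters: load any entity by class and primary key.
    public <T> T find(Class<T> entityClass, Object id) {
        return em.find(entityClass, id);
    }
}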
In the end my converter looks like this:
@FacesConverter(value = "userConverter", forClass = User.class)
public class UserConverter implements Converter {
@Override
public Object getAsObject(FacesContext ctx, UIComponent component, java.lang.String value) {
return ConversionHelper.getAsObject(User.class, value);
}
@Override
public String getAsString(FacesContext ctx, UIComponent component, Object value) {
return ConversionHelper.getAsString(value);
}
}
And my conversion helper looks like this:
public final class ConversionHelper {
private ConversionHelper() {
}
public static <T> T getAsObject(Class<T> returnType, String value) {
if (returnType == null) {
throw new NullPointerException("Trying to getAsObject with a null return type.");
}
if (value == null) {
throw new NullPointerException("Trying to getAsObject with a null value.");
}
Long id = null;
try {
id = Long.parseLong(value);
} catch (NumberFormatException e) {
throw new ConverterException("Trying to getAsObject with a wrong id format.");
}
try {
Context initialContext = new InitialContext();
EntityService entityService = (EntityService) initialContext.lookup("java:global/myapp/EntityService");
T result = (T) entityService.find(returnType, id);
return result;
} catch (NamingException e) {
throw new ConverterException("EntityService not found.");
}
}
public static String getAsString(Object value) {
if (value instanceof PersistentEntity) {
PersistentEntity result = (PersistentEntity) value;
return String.valueOf(result.getId());
}
return null;
}
}
Now creating converters for simple JPA entities is just a matter of duplicating a converter and changing three parameters.
This is working well for me, but I don't know if it is the best approach in terms of style and performance. Any tips would be appreciated.
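For example, a converter for another (hypothetical) Product entity would differ only in the converter name, the forClass value, and the entity class passed to ConversionHelper:
import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.convert.Converter;
import javax.faces.convert.FacesConverter;

// Hypothetical converter for a Product entity: only the converter name,
// the forClass value and the class passed to ConversionHelper change.
@FacesConverter(value = "productConverter", forClass = Product.class)
public class ProductConverter implements Converter {

    @Override
    public Object getAsObject(FacesContext ctx, UIComponent component, String value) {
        return ConversionHelper.getAsObject(Product.class, value);
    }

    @Override
    public String getAsString(FacesContext ctx, UIComponent component, Object value) {
        return ConversionHelper.getAsString(value);
    }
}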
My solution is the following:
@ManagedBean
@SessionScoped
public class EntityConverterBuilderBean {
private static Logger logger = LoggerFactory.getLogger(EntityConverterBuilderBean.class);
@EJB
private GenericDao dao;
public GenericConverter createConverter(String entityClass) {
return new GenericConverter(entityClass, dao);
}
}
public class GenericConverter implements Converter {
private static final Logger logger = LoggerFactory.getLogger(GenericConverter.class);
private Class clazz;
private GenericDao dao;
public GenericConverter(String clazz, GenericDao dao) {
try {
this.clazz = Class.forName(clazz);
this.dao = dao;
} catch (Exception e) {
logger.error("cannot get class: " + clazz, e);
throw new RuntimeException(e);
}
}
public Object getAsObject(javax.faces.context.FacesContext facesContext, javax.faces.component.UIComponent uiComponent, java.lang.String s) {
Object ret = null;
if (!"".equals(s)) {
Long id = new Long(s);
ret = dao.findById(clazz, id);
}
return ret;
}
public String getAsString(javax.faces.context.FacesContext facesContext, javax.faces.component.UIComponent uiComponent, java.lang.Object o) {
if (o != null) {
return ((SimpleEntity) o).getId() + "";
} else {
return "";
}
}
}
and in pages:
<h:selectOneMenu id="x" value="#{controller.x}"
converter="#{entityConverterBuilderBean.createConverter('com.test.model.TestEntity')}">
Use Seam Faces; it provides a Converter class that does what you want.
org.jboss.seam.faces.conversion.Converter
While it's a JBoss project, Seam 3 works fine with Glassfish 3.1 and newer.
http://seamframework.org/Seam3/FacesModule
On 3.1 it does have a couple of additional dependencies; see http://blog.ringerc.id.au/2011/05/using-seam-3-with-glassfish-31.html
Try this using Seam Faces from Seam 3.
@Named("DocTypeConverter")
public class DocumentTypeConverter implements Converter, Serializable {
private static final long serialVersionUID = 1L;
@Inject
private DocumentTypeSessionEJB proDocTypeSb;
@Override
public Object getAsObject(FacesContext context, UIComponent component,
String value) {
DocumentType result = null;
if (value != null && !value.trim().equals("")) {
try {
result = (DocumentType) proDocTypeSb.findById(DocumentType.class, value);
} catch(Exception exception) {
throw new ConverterException(new FacesMessage(FacesMessage.SEVERITY_ERROR, "Conversion Error", "Not a valid value"));
}
}
return result;
}
@Override
public String getAsString(FacesContext context, UIComponent component,
Object value) {
String result = null;
if (value != null && value instanceof DocumentType){
DocumentType docType = (DocumentType) value;
result = docType.getId();
}
return result;
}
}
(UPDATED FOR JSF 2.3)
I am using something like this:
@FacesConverter(value = "entityConverter", managed = true)
public class EntityConverter implements Converter<Object> {
@Inject
private EntityManager entityManager;
@Override
public Object getAsObject(FacesContext context, UIComponent component, String value) {
Class<?> entityType = component.getValueExpression("value").getType(context.getELContext());
Class<?> idType = entityManager.getMetamodel().entity(entityType).getIdType().getJavaType();
Converter idConverter = context.getApplication().createConverter(idType);
Object id = idConverter.getAsObject(context, component, value);
return entityManager.getReference(entityType, id);
}
@Override
public String getAsString(FacesContext context, UIComponent component, Object value) {
Object id = entityManager.getEntityManagerFactory().getPersistenceUnitUtil().getIdentifier(value);
Converter idConverter = context.getApplication().createConverter(id.getClass());
return idConverter.getAsString(context, component, id);
}
}
In the template, use <f:converter binding="#{entityConverter}" />.
To complement Craig Ringer's answer, you can use the generic org.jboss.seam.faces.conversion.ObjectConverter from the Seam 3 Faces module.
You can grab the code here: https://github.com/seam/faces/blob/develop/impl/src/main/java/org/jboss/seam/faces/conversion/ObjectConverter.java
It uses two HashMaps (one of them used in reverse) and stores its objects in the conversation.
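The idea boils down to something like the following (a rough sketch of the pattern only, not the actual Seam source; see the link above for that):
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;
import javax.enterprise.context.ConversationScoped;
import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.convert.Converter;
import javax.inject.Named;

// Two-map idea: hand each entity an opaque id on the way to the page and look it
// up again on postback. Requires sensible equals()/hashCode() on the entities.
@Named
@ConversationScoped
public class TwoWayObjectConverter implements Converter, Serializable {

    private final Map<String, Object> idToObject = new HashMap<String, Object>();
    private final Map<Object, String> objectToId = new HashMap<Object, String>();

    @Override
    public String getAsString(FacesContext context, UIComponent component, Object value) {
        if (value == null) {
            return "";
        }
        String id = objectToId.get(value);
        if (id == null) {
            id = UUID.randomUUID().toString();
            objectToId.put(value, id);
            idToObject.put(id, value);
        }
        return id;
    }

    @Override
    public Object getAsObject(FacesContext context, UIComponent component, String value) {
        return (value == null || value.isEmpty()) ? null : idToObject.get(value);
    }
}
Such a bean-managed converter is referenced from the page with <f:converter binding="#{twoWayObjectConverter}" />.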
Problem I am trying to solve
I am trying to implement enum mapping for Hibernate. So far I have researched the available options, and both @Enumerated(EnumType.ORDINAL) and @Enumerated(EnumType.STRING) seemed inadequate for my needs. @Enumerated(EnumType.ORDINAL) seems very error-prone, as a mere reordering of enum constants can mess the mapping up, and @Enumerated(EnumType.STRING) does not suffice either, as the database I work with is already full of values to be mapped, and these values are not what I would like my enum constants to be named (the values are foreign-language strings / integers).
Currently, all these values are being mapped to String / Integer properties. At the same time the properties should only allow for a restricted sets of values (imagine meetingStatus property allowing for Strings: PLANNED, CANCELED, and DONE. Or another property allowing for a restricted set of Integer values: 1, 2, 3, 4, 5).
My idea was to replace the implementation with enums to improve the type safety of the code. A good example where the String / Integer implementation could cause errors is String method parameter representing such value - with String, anything goes there. Having an Enum parameter type on the other hand introduces compile time safety.
My best approach so far
The only solution that seemed to fulfill my needs was to implement a custom javax.persistence.AttributeConverter with the @Converter annotation for every enum. As my model would require quite a few enums, writing a custom converter for each of them started to seem like madness really quickly. So I searched for a generic solution to the problem: how to write a generic converter for any type of enum. The following answer was of great help here: https://stackoverflow.com/a/23564597/7024402. The code example in the answer provides a somewhat generic implementation, yet a separate converter class is still needed for every enum. The author of the answer also continues:
"The alternative would be to define a custom annotation, patch the JPA provider to recognize this annotation. That way, you could examine the field type as you build the mapping information and feed the necessary enum type into a purely generic converter."
And that's what I think I would be interested in. Unfortunately, I could not find any more information about that, and I would need a little more guidance to understand what needs to be done and how it would work with this approach.
Current Implementation
public interface PersistableEnum<T> {
T getValue();
}
public enum IntegerEnum implements PersistableEnum<Integer> {
ONE(1),
TWO(2),
THREE(3),
FOUR(4),
FIVE(5),
SIX(6);
private int value;
IntegerEnum(int value) {
this.value = value;
}
@Override
public Integer getValue() {
return value;
}
}
public abstract class PersistableEnumConverter<E extends PersistableEnum<T>, T> implements AttributeConverter<E, T> {
private Class<E> enumType;
public PersistableEnumConverter(Class<E> enumType) {
this.enumType = enumType;
}
@Override
public T convertToDatabaseColumn(E attribute) {
return attribute.getValue();
}
@Override
public E convertToEntityAttribute(T dbData) {
for (E enumConstant : enumType.getEnumConstants()) {
if (enumConstant.getValue().equals(dbData)) {
return enumConstant;
}
}
throw new EnumConversionException(enumType, dbData);
}
}
@Converter
public class IntegerEnumConverter extends PersistableEnumConverter<IntegerEnum, Integer> {
public IntegerEnumConverter() {
super(IntegerEnum.class);
}
}
This is how I was able to achieve the partially generic converter implementation.
GOAL: Getting rid of the need to create a new converter class for every enum.
Luckily, you do not need to patch Hibernate for this.
You can declare an annotation like the following:
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import java.sql.Types;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
@Target({METHOD, FIELD})
@Retention(RUNTIME)
public @interface EnumConverter
{
Class<? extends PersistableEnum<?>> enumClass() default IntegerEnum.class;
int sqlType() default Types.INTEGER;
}
A hibernate user type like the following:
import java.io.Serializable;
import java.lang.annotation.Annotation;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import java.util.Objects;
import java.util.Properties;
import org.hibernate.HibernateException;
import org.hibernate.engine.spi.SharedSessionContractImplementor;
import org.hibernate.usertype.DynamicParameterizedType;
import org.hibernate.usertype.UserType;
public class PersistableEnumType implements UserType, DynamicParameterizedType
{
private int sqlType;
private Class<? extends PersistableEnum<?>> clazz;
@Override
public void setParameterValues(Properties parameters)
{
ParameterType reader = (ParameterType) parameters.get(PARAMETER_TYPE);
EnumConverter converter = getEnumConverter(reader);
sqlType = converter.sqlType();
clazz = converter.enumClass();
}
private EnumConverter getEnumConverter(ParameterType reader)
{
for (Annotation annotation : reader.getAnnotationsMethod()){
if (annotation instanceof EnumConverter) {
return (EnumConverter) annotation;
}
}
throw new IllegalStateException("The PersistableEnumType should be used with #EnumConverter annotation.");
}
@Override
public int[] sqlTypes()
{
return new int[] {sqlType};
}
@Override
public Class<?> returnedClass()
{
return clazz;
}
@Override
public boolean equals(Object x, Object y) throws HibernateException
{
return Objects.equals(x, y);
}
@Override
public int hashCode(Object x) throws HibernateException
{
return Objects.hashCode(x);
}
@Override
public Object nullSafeGet(ResultSet rs,
String[] names,
SharedSessionContractImplementor session,
Object owner) throws HibernateException, SQLException
{
Object val = null;
if (sqlType == Types.INTEGER) val = rs.getInt(names[0]);
if (sqlType == Types.VARCHAR) val = rs.getString(names[0]);
if (rs.wasNull()) return null;
for (PersistableEnum<?> pEnum : clazz.getEnumConstants())
{
if (Objects.equals(pEnum.getValue(), val)) return pEnum;
}
throw new IllegalArgumentException("Can not convert " + val + " to enum " + clazz.getName());
}
@Override
public void nullSafeSet(PreparedStatement st,
Object value,
int index,
SharedSessionContractImplementor session) throws HibernateException, SQLException
{
if (value == null) {
st.setNull(index, sqlType);
}
else {
PersistableEnum<?> pEnum = (PersistableEnum<?>) value;
if (sqlType == Types.INTEGER) st.setInt(index, (Integer) pEnum.getValue());
if (sqlType == Types.VARCHAR) st.setString(index, (String) pEnum.getValue());
}
}
@Override
public Object deepCopy(Object value) throws HibernateException
{
return value;
}
@Override
public boolean isMutable()
{
return false;
}
@Override
public Serializable disassemble(Object value) throws HibernateException
{
return Objects.toString(value);
}
@Override
public Object assemble(Serializable cached, Object owner) throws HibernateException
{
return cached;
}
@Override
public Object replace(Object original, Object target, Object owner) throws HibernateException
{
return original;
}
}
And then, you can use it:
import org.hibernate.annotations.Type;
@Entity
@Table(name="TST_DATA")
public class TestData
{
...
@EnumConverter(enumClass = IntegerEnum.class, sqlType = Types.INTEGER)
@Type(type = "com.example.converter.PersistableEnumType")
@Column(name="INT_VAL")
public IntegerEnum getIntValue()
...
@EnumConverter(enumClass = StringEnum.class, sqlType = Types.VARCHAR)
@Type(type = "com.example.converter.PersistableEnumType")
@Column(name="STR_VAL")
public StringEnum getStrValue()
...
}
See also chapter 5.3.3, Extending Hibernate with UserTypes, in the excellent book "Java Persistence with Hibernate" by Bauer, King and Gregory.
Simplifying:
import com.pismo.apirest.mvc.enums.OperationType;
import com.pismo.apirest.mvc.enums.support.PersistableEnum;
import java.util.Objects;
import java.util.Optional;
import java.util.stream.Stream;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
@SuppressWarnings("unused")
public interface EnumsConverters {
@RequiredArgsConstructor
abstract class AbstractPersistableEnumConverter<E extends Enum<E> & PersistableEnum<I>, I> implements AttributeConverter<E, I> {
private final E[] enumConstants;
public AbstractPersistableEnumConverter(@NonNull Class<E> enumType) {
enumConstants = enumType.getEnumConstants();
}
@Override
public I convertToDatabaseColumn(E attribute) {
return Objects.isNull(attribute) ? null : attribute.getId();
}
@Override
public E convertToEntityAttribute(I dbData) {
return fromId(dbData, enumConstants);
}
public E fromId(I idValue) {
return fromId(idValue, enumConstants);
}
public static <E extends Enum<E> & PersistableEnum<I>, I> E fromId(I idValue, E[] enumConstants) {
return Objects.isNull(idValue) ? null : Stream.of(enumConstants)
.filter(e -> e.getId().equals(idValue))
.findAny()
.orElseThrow(() -> new IllegalArgumentException(
String.format("Does not exist %s with ID: %s", enumConstants[0].getClass().getSimpleName(), idValue)));
}
}
@Converter(autoApply = true)
class OperationTypeConverter extends AbstractPersistableEnumConverter<OperationType, Integer> {
public OperationTypeConverter() {
super(OperationType.class);
}
}
}
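Note that this variant assumes a PersistableEnum interface keyed by getId() rather than getValue(), along these lines (the OperationType constants below are purely illustrative):
// PersistableEnum.java -- assumed shape, keyed by getId()
public interface PersistableEnum<I> {
    I getId();
}
// OperationType.java -- illustrative values only
public enum OperationType implements PersistableEnum<Integer> {
    CREDIT(1),
    DEBIT(2);

    private final Integer id;

    OperationType(Integer id) {
        this.id = id;
    }

    @Override
    public Integer getId() {
        return id;
    }
}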
I have tried many times to create something like this.
Generating a converter for each enum on the fly is not the problem, but then they would all have the same class. The main problem lies in org.hibernate.boot.internal.MetadataBuilderImpl#applyAttributeConverter(java.lang.Class<? extends javax.persistence.AttributeConverter>, boolean).
If a converter is already registered, we get an exception.
public void addAttributeConverterInfo(AttributeConverterInfo info) {
if ( this.attributeConverterInfoMap == null ) {
this.attributeConverterInfoMap = new HashMap<>();
}
final Object old = this.attributeConverterInfoMap.put( info.getConverterClass(), info );
if ( old != null ) {
throw new AssertionFailure(
String.format(
"AttributeConverter class [%s] registered multiple times",
info.getConverterClass()
)
);
}
}
Perhaps we could change org.hibernate.boot.internal.BootstrapContextImpl, but I'm sure that would create overly complex and inflexible code.
First, I'm not sure whether it's a good idea to do all this.
The goal is to create interfaces with annotations that hide legacy position-based string access to a configuration database, without having to implement each interface.
Declaratively configured interface:
public interface LegacyConfigItem extends ConfigDbAccess{
@Subfield(length=3)
String BWHG();
@Subfield(start = 3, length=1)
int BNKST();
@Subfield(start = 4, length=1)
int BEINH();
:
}
Base interface for runtime identification
public interface ConfigDbAccess{
}
Dummy implementation without functionality, may change.
public class EmptyImpl {
}
Bean factory and MethodInvocation interceptor to handle the unimplemented methods:
@Component
public class InterfaceBeanFactory extends DefaultListableBeanFactory {
protected static final int TEXT_MAX = 400;
@Autowired
private EntityRepo entityRepo;
public <T> T getInstance(Class<T> legacyInterface, String key) {
ProxyFactory factory = new ProxyFactory(new EmptyImpl());
factory.setInterfaces(legacyInterface);
factory.setExposeProxy(true);
factory.addAdvice(new MethodInterceptor() {
@Override
public Object invoke(MethodInvocation invocation) throws Throwable {
Key keyAnnotation = invocation.getThis().getClass().getAnnotation(Key.class);
String annotationKey = keyAnnotation.key().toUpperCase();
String ptart = invocation.getMethod().getDeclaringClass().getSimpleName();
Vpt result = entityRepo.getOne(new EntityId(ptart.toUpperCase(), annotationKey));
Subfield sub = invocation.getMethod().getAnnotation(Subfield.class);
//TODO: Raise missing Subfield annotation
int start = sub.start();
int length = sub.length();
if (start + length > TEXT_MAX) {
//TODO: Raise invalid Subfield config
}
String value = result.getTextField().substring(start,start+length);
return value;
}
});
return (T) factory.getProxy();
}
@Override
protected Map<String, Object> findAutowireCandidates(String beanName, Class<?> requiredType, DependencyDescriptor descriptor) {
Map<String, Object> map = super.findAutowireCandidates(beanName, requiredType, descriptor);
if (ConfigDbAccess.class.isAssignableFrom(requiredType )) {
:
@SpringBootApplication
public class JpaDemoApplication {
@Autowired
private ApplicationContext context;
public static void main(String[] args) {
SpringApplication app = new SpringApplication(JpaDemoApplication.class);
// app.setApplicationContextClass(InterfaceInjectionContext.class);
app.run(args);
}
public class InterfaceInjectionContext extends AnnotationConfigApplicationContext {
public InterfaceInjectionContext() {
super(new InterfaceBeanFactory());
}
}
So far I have got all this stuff working, except that when I try to set the application's context class to my DefaultListableBeanFactory, I kill the Spring Boot starter web. The application starts, injects the autowired fields with my intercepted pseudo implementation, and then ends.
I think I'm doing something wrong with registering the DefaultListableBeanFactory, but I've no idea how to do it right.
To get this answered:
M. Deinum pointed me to a much simpler solution:
Instead of creating a BeanFactory, I installed a BeanPostProcessor with this functionality.
@RestController
public class DemoRestController {
@Autowired
VptService vptService;
@ConfigItem(key="KS001")
private PrgmParm prgmKs001;
@ConfigItem(key="KS002")
private PrgmParm prgmKs002;
public DemoRestController() {
super();
}
Where the ConfigItem annotation defines the injection point.
Next I created a CustomBeanPostProcessor which scans all incoming beans for
fields having a ConfigItem annotation:
@Component
public class CustomBeanPostProcessor implements BeanPostProcessor {
public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
for (Field field : bean.getClass().getDeclaredFields()) {
ConfigItem cfgDef = field.getAnnotation(ConfigItem.class);
if (cfgDef != null) {
Object instance = getInstance(field.getType(), cfgDef.key());
boolean accessible = field.isAccessible();
field.setAccessible(true);
try {
field.set(bean, instance);
} catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalAccessException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
field.setAccessible(accessible);
}
}
return bean;
}
The getInstance(field.getType(), cfgDef.key()) call creates a proxy with the MethodInterceptor, which does the work.
There are a lot of things to finalize, but all in all it looks good to me.
I am trying to marshal an object and I want all the fields to be attributes. The normal fields are fine with the @XStreamAsAttribute annotation, but I have two of them with a converter. When I marshal, those two come out as child elements instead of attributes...
@XStreamAlias(value="sinistre")
public class ObjetMetierSinistreDto {
@XStreamAlias(value="S_sinistreEtat")
@XStreamAsAttribute
private String etat;
@XStreamAsAttribute
@XStreamAlias(value="S_sinistreDateSurv")
@XStreamConverter(value=JodaDateConverter.class)
private LocalDate dateSurvenanceDossier;
...
The converter:
public class JodaDateConverter implements Converter {
@Override
@SuppressWarnings("unchecked")
public boolean canConvert(final Class type) {
return (type != null) && LocalDate.class.getPackage().equals(type.getPackage());
}
@Override
public void marshal(final Object source, final HierarchicalStreamWriter writer,
final MarshallingContext context) {
writer.setValue(source.toString().replace("-", "/"));
}
@Override
@SuppressWarnings("unchecked")
public Object unmarshal(final HierarchicalStreamReader reader,
final UnmarshallingContext context) {
try {
final Class requiredType = context.getRequiredType();
final Constructor constructor = requiredType.getConstructor(Object.class);
return constructor.newInstance(reader.getValue());
} catch (final Exception e) {
throw new RuntimeException(String.format(
"Exception while deserializing a Joda Time object: %s", context.getRequiredType().getSimpleName()), e);
}
}
}
and the result:
<sinistre S_sinistreEtat="S">
<S_sinistreDateSurv>2015/02/01</S_sinistreDateSurv>
</sinistre>
and what I would like:
<sinistre S_sinistreEtat="S"
S_sinistreDateSurv="2015/02/01"/>
I finally found how to solve this problem!
The JodaDateConverter should not implement Converter but extend AbstractSingleValueConverter (like the DateConverter from XStream).
Then you just need to override canConvert() and fromString() and you are good to go!
Example:
public class JodaDateConverter extends AbstractSingleValueConverter {
@Override
@SuppressWarnings("unchecked")
public boolean canConvert(final Class type) {
return (type != null) && LocalDate.class.getPackage().equals(type.getPackage());
}
@Override
public Object fromString(String str) {
String separator;
if(str.contains(":")){
separator = ":";
} else if(str.contains("/")){
separator = "/";
} else if(str.contains("-")){
separator = "-";
} else {
throw new RuntimeException("The date must contains ':' or '/' or '-'");
}
String[] date = str.split(separator);
if(date.length < 3){
throw new RuntimeException("The date must contains hour, minute and second");
}
return new LocalDate(Integer.valueOf(date[0]),Integer.valueOf(date[1]),Integer.valueOf(date[2]));
}
}
I have a hibernate-mapped Java object, JKL, which is full of a bunch of normal hibernate-mappable fields (like strings and integers).
I've added a new embedded field to it (which lives in the same table, not a separate mapping), asdf, which is a fj.data.Option<ASDF>. I've made it an Option to make it clear that this field may not actually contain anything (as opposed to having to handle null every time I access it).
How do I set up the mapping in my JKL.hbm.xml file? I'd like hibernate to automatically convert a null in the database to a none of fj.data.Option<ASDF> when it retrieves the object. It should also convert a non-null instance of ASDF to a some of fj.data.Option<ASDF>.
Is there any other trickery that I have to do?
I would suggest introducing FunctionalJava's Option in the accessors (getter and setter), while leaving Hibernate to handle a simple java field which is allowed to be null.
For example, for an optional Integer field:
// SQL
CREATE TABLE `JKL` (
`JKL_ID` INTEGER PRIMARY KEY,
`MY_FIELD` INTEGER DEFAULT NULL
)
You can map a Hibernate private field directly:
// Java
@Column(nullable = true)
private Integer myField;
You could then introduce Option at the accessor boundary:
// Java
public fj.data.Option<Integer> getMyField() {
return fj.data.Option.fromNull(myField);
}
public void setMyField(fj.data.Option<Integer> value) {
myField = value.toNull();
}
Does that work for your needs?
You can use Hibernate's custom mapping types. Documentation is here. Here is an analogous example of mapping Scala's Option to a Hibernate mapping.
Simply put, you would need to extend the org.hibernate.UserType interface. You could also create a generic-typed base class with a JKL-typed sub-type, similar to what you see in the Scala example.
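As a minimal sketch (assuming the Hibernate 3 style UserType signatures used elsewhere on this page, and with placeholder class names), a generic base type for a simple Option<Integer> column could look roughly like this; the embedded ASDF case from the question would need additional component handling:
// FjOptionUserType.java -- generic base class (placeholder name)
import java.io.Serializable;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;
import fj.data.Option;

public abstract class FjOptionUserType implements UserType {

    private final int sqlType;

    protected FjOptionUserType(int sqlType) {
        this.sqlType = sqlType;
    }

    public int[] sqlTypes() {
        return new int[] { sqlType };
    }

    public Class returnedClass() {
        return Option.class;
    }

    // NULL in the database becomes Option.none(), anything else Option.some(value).
    public Object nullSafeGet(ResultSet rs, String[] names, Object owner)
            throws HibernateException, SQLException {
        Object value = rs.getObject(names[0]);
        return rs.wasNull() ? Option.none() : Option.some(value);
    }

    // Option.none() (or null) is written as SQL NULL, Option.some(x) as x.
    public void nullSafeSet(PreparedStatement st, Object value, int index)
            throws HibernateException, SQLException {
        Option<?> option = (Option<?>) value;
        if (option == null || option.isNone()) {
            st.setNull(index, sqlType);
        } else {
            st.setObject(index, option.some());
        }
    }

    public boolean equals(Object x, Object y) { return x == null ? y == null : x.equals(y); }
    public int hashCode(Object x) { return x == null ? 0 : x.hashCode(); }
    public Object deepCopy(Object value) { return value; }
    public boolean isMutable() { return false; }
    public Serializable disassemble(Object value) { return (Serializable) value; }
    public Object assemble(Serializable cached, Object owner) { return cached; }
    public Object replace(Object original, Object target, Object owner) { return original; }
}
// FjOptionIntegerType.java -- concrete type for an Option<Integer> column,
// referenced from JKL.hbm.xml via type="com.example.FjOptionIntegerType"
public class FjOptionIntegerType extends FjOptionUserType {
    public FjOptionIntegerType() {
        super(Types.INTEGER);
    }
}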
I think using a getter/setter is simpler, but here's an example of what I did to make it work:
(It works fine for numbers and strings, but not for dates: error with the @Temporal annotation.)
import com.cestpasdur.helpers.PredicateHelper;
import com.google.common.annotations.VisibleForTesting;
import com.google.common.base.Optional;
import org.apache.commons.lang.ObjectUtils;
import org.apache.commons.lang.StringUtils;
import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;
import org.joda.time.DateTime;
import java.io.Serializable;
import java.sql.*;
public class OptionUserType implements UserType {
@Override
public int[] sqlTypes() {
return new int[]{
Types.NULL
};
}
@Override
public Class returnedClass() {
return Optional.class;
}
@Override
public boolean equals(Object o, Object o2) throws HibernateException {
return ObjectUtils.equals(o, o2);
}
@Override
public int hashCode(Object o) throws HibernateException {
assert (o != null);
return o.hashCode();
}
@Override
public Optional<? extends Object> nullSafeGet(ResultSet rs, String[] names, Object owner) throws HibernateException, SQLException {
return Optional.fromNullable(rs.getObject(names[0]));
}
@VisibleForTesting
void handleDate(PreparedStatement st, Date value, int index) throws SQLException {
st.setDate(index, value);
}
@VisibleForTesting
void handleNumber(PreparedStatement st, String stringValue, int index) throws SQLException {
Double doubleValue = Double.valueOf(stringValue);
st.setDouble(index, doubleValue);
}
@Override
public void nullSafeSet(PreparedStatement st, Object value, int index) throws SQLException {
if (value != null) {
if (value instanceof Optional) {
Optional optionalValue = (Optional) value;
if (optionalValue.isPresent()) {
String stringValue = String.valueOf(optionalValue.get());
if (StringUtils.isNotBlank(stringValue)) {
if (PredicateHelper.IS_DATE_PREDICATE.apply(stringValue)) {
handleDate(st, new Date(DateTime.parse(stringValue).getMillis()), index);
} else if (StringUtils.isNumeric(stringValue)) {
handleNumber(st, stringValue, index);
} else {
st.setString(index, optionalValue.get().toString());
}
} else {
st.setString(index, null);
}
} else {
System.out.println("else Some");
}
} else {
//TODO replace with Preconditions guava
throw new IllegalArgumentException(value + " is not implemented");
}
} else {
st.setString(index, null);
}
}
@Override
public Object deepCopy(Object o) throws HibernateException {
return o;
}
@Override
public boolean isMutable() {
return false;
}
@Override
public Serializable disassemble(Object o) throws HibernateException {
return (Serializable) o;
}
@Override
public Object assemble(Serializable serializable, Object o) throws HibernateException {
return serializable;
}
@Override
public Object replace(Object original, Object target, Object owner) throws HibernateException {
return original;
}
}
I have a lot of classes: UNO, HAV, MAS, KOS.
I want to create a factory pattern.
validator.load("UNO").validate();
I need to dynamically load classes into the validator class and return an instance
(dynamically set the name of the class and return an instance).
My problem is: how can I return the instance of a class if I have incompatible types?
I don't know what to write as the return type of the method.
The main problem is in the Validator class.
public SegmentAbstract load(String str) {
AND
return SegmentAbsClass.forName(identify);
Main class
try{
validator.load("UNO").validate();
}catch(Exception e){
System.out.print("No class ");
}
Abstract Class (SegmentAbstract)
public abstract class SegmentAbstract {
public abstract Boolean validate();
}
Class UNO
public class UNO extends SegmentAbstract {
public Boolean validate() {
System.out.print("UNO!!");
return true;
}
}
Class Validator
public class Validator {
public SegmentAbstract load(String str) {
String identify = str.substring(0, 3);
try {
return SegmentAbsClass.forName(identify);
}
catch(Exception e) {
return this;
}
}
}
Try this:
public interface Validator {
boolean validate(Object obj);
}
public final class ValidatorFactory {
private ValidatorFactory(){}
public static Validator load(String type){
try {
Class<?> clazz = Class.forName(type);
if (Arrays.asList(clazz.getInterfaces()).contains(Validator.class)){
return (Validator) clazz.newInstance();
}
throw new IllegalArgumentException("Provided class doesn't implement Validator interface");
} catch (Exception e) {
throw new IllegalArgumentException("Wrong class provided", e);
}
}
}
Maybe this will help?
I would do something like this:
// ISegment.java
public interface ISegment {
Boolean validate();
}
// Uno.java
public class Uno implements ISegment {
public Boolean validate() {
System.out.print("UNO!!");
return true;
}
}
// SegmentFactory.java
public final class SegmentFactory {
public static enum Supported {
UNO("uno", Uno.class), /* ... */, HAV("hav", Hav.class);
private final Class<?> clazz;
private final String name;
private Supported(final String name, final Class<?> clazz) {
this.name = name;
this.clazz = clazz;
}
public Class<?> getClazz() {
return clazz;
}
public static Supported of(final String name) {
for (final Supported s : values()) {
if (s.name.equals(name)) {
return s;
}
}
return null; // a default one
}
}
public static ISegment create(final Supported supp) throws Exception {
if (supp == null) {
return null;
}
return (ISegment) supp.getClazz().newInstance();
}
private SegmentFactory() {
// avoid instantiation
}
}
usage:
final ISegment sa = SegmentFactory.create(SegmentFactory.Supported.of("uno"));
sa.validate();
Not tested!!
Take a look here. Briefly, the idea is to create a map in your factory class (Map<String,String>, key is identifier, value is fully qualified class name), and add supported classes during initialization. Then you use reflection to instantiate an object in your factory method. Also, you can avoid reflection by using Map<String, SegmentAbstract> instead of Map<String,String> and adding public abstract getNewSegment() to your SegmentAbstract class.
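For illustration, a rough sketch of that map-based factory (the registered class names are placeholders):
import java.util.HashMap;
import java.util.Map;

// Map-based factory: identifier -> fully qualified class name, instantiated via reflection.
public class Validator {

    private final Map<String, String> registry = new HashMap<String, String>();

    public Validator() {
        // Register the supported segment types during initialization.
        registry.put("UNO", "com.example.segments.Uno");
        registry.put("HAV", "com.example.segments.Hav");
    }

    public SegmentAbstract load(String str) throws Exception {
        String identifier = str.substring(0, 3);
        String className = registry.get(identifier);
        if (className == null) {
            throw new IllegalArgumentException("Unsupported segment: " + identifier);
        }
        // Reflection-based instantiation; requires a public no-arg constructor.
        return (SegmentAbstract) Class.forName(className).newInstance();
    }
}
Usage then stays the same as in the question: validator.load("UNO").validate();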