Problem I am trying to solve
I am trying to implement enum mapping for Hibernate. So far I have researched the available options, and both @Enumerated(EnumType.ORDINAL) and @Enumerated(EnumType.STRING) seem inadequate for my needs. @Enumerated(EnumType.ORDINAL) is very error-prone, as merely reordering the enum constants can break the mapping, and @Enumerated(EnumType.STRING) does not suffice either, because the database I work with is already full of values to be mapped, and these values are not what I would like my enum constants to be named (the values are foreign-language strings / integers).
Currently, all these values are mapped to String / Integer properties. At the same time, the properties should only allow a restricted set of values (imagine a meetingStatus property allowing the Strings PLANNED, CANCELED, and DONE, or another property allowing a restricted set of Integer values: 1, 2, 3, 4, 5).
My idea was to replace the implementation with enums to improve the type safety of the code. A good example of where the String / Integer implementation could cause errors is a String method parameter representing such a value: with String, anything goes there. An enum parameter type, on the other hand, introduces compile-time safety.
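For illustration, a minimal sketch of the difference (the MeetingStatus enum and the setter names are hypothetical, not part of my current code):

// Any String compiles, including typos such as "PLANED":
void setMeetingStatusUnsafe(String meetingStatus) { }

// Only the declared constants compile:
enum MeetingStatus { PLANNED, CANCELED, DONE }
void setMeetingStatusSafe(MeetingStatus meetingStatus) { }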
My best approach so far
The only solution that seemed to fulfill my needs was to implement a custom javax.persistence.AttributeConverter with the @Converter annotation for every enum. As my model would require quite a few enums, writing a custom converter for each of them quickly started to seem like madness. So I searched for a generic solution to the problem: how to write a generic converter for any type of enum. The following answer was of great help here: https://stackoverflow.com/a/23564597/7024402. The code example in the answer provides a somewhat generic implementation, yet a separate converter class is still needed for every enum. The author of the answer also continues:
"The alternative would be to define a custom annotation, patch the JPA provider to recognize this annotation. That way, you could examine the field type as you build the mapping information and feed the necessary enum type into a purely generic converter."
And that's what I think I would be interested in. Unfortunately, I could not find any more information about that, and I would need a little more guidance to understand what needs to be done and how it would work with this approach.
Current Implementation
public interface PersistableEnum<T> {
T getValue();
}
public enum IntegerEnum implements PersistableEnum<Integer> {
ONE(1),
TWO(2),
THREE(3),
FOUR(4),
FIVE(5),
SIX(6);
private int value;
IntegerEnum(int value) {
this.value = value;
}
@Override
public Integer getValue() {
return value;
}
}
public abstract class PersistableEnumConverter<E extends PersistableEnum<T>, T> implements AttributeConverter<E, T> {
private Class<E> enumType;
public PersistableEnumConverter(Class<E> enumType) {
this.enumType = enumType;
}
@Override
public T convertToDatabaseColumn(E attribute) {
return attribute.getValue();
}
@Override
public E convertToEntityAttribute(T dbData) {
for (E enumConstant : enumType.getEnumConstants()) {
if (enumConstant.getValue().equals(dbData)) {
return enumConstant;
}
}
throw new EnumConversionException(enumType, dbData);
}
}
@Converter
public class IntegerEnumConverter extends PersistableEnumConverter<IntegerEnum, Integer> {
public IntegerEnumConverter() {
super(IntegerEnum.class);
}
}
This is how I was able to achieve the partially generic converter implementation.
GOAL: Getting rid of the need to create new converter class for every enum.
Luckily, you do not have to patch Hibernate for this.
You can declare an annotation like the following:
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import java.sql.Types;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
@Target({METHOD, FIELD})
@Retention(RUNTIME)
public @interface EnumConverter
{
Class<? extends PersistableEnum<?>> enumClass() default IntegerEnum.class;
int sqlType() default Types.INTEGER;
}
And a Hibernate user type like the following:
import java.io.Serializable;
import java.lang.annotation.Annotation;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import java.util.Objects;
import java.util.Properties;
import org.hibernate.HibernateException;
import org.hibernate.engine.spi.SharedSessionContractImplementor;
import org.hibernate.usertype.DynamicParameterizedType;
import org.hibernate.usertype.UserType;
public class PersistableEnumType implements UserType, DynamicParameterizedType
{
private int sqlType;
private Class<? extends PersistableEnum<?>> clazz;
@Override
public void setParameterValues(Properties parameters)
{
ParameterType reader = (ParameterType) parameters.get(PARAMETER_TYPE);
EnumConverter converter = getEnumConverter(reader);
sqlType = converter.sqlType();
clazz = converter.enumClass();
}
private EnumConverter getEnumConverter(ParameterType reader)
{
for (Annotation annotation : reader.getAnnotationsMethod()){
if (annotation instanceof EnumConverter) {
return (EnumConverter) annotation;
}
}
throw new IllegalStateException("The PersistableEnumType should be used with #EnumConverter annotation.");
}
@Override
public int[] sqlTypes()
{
return new int[] {sqlType};
}
@Override
public Class<?> returnedClass()
{
return clazz;
}
@Override
public boolean equals(Object x, Object y) throws HibernateException
{
return Objects.equals(x, y);
}
@Override
public int hashCode(Object x) throws HibernateException
{
return Objects.hashCode(x);
}
@Override
public Object nullSafeGet(ResultSet rs,
String[] names,
SharedSessionContractImplementor session,
Object owner) throws HibernateException, SQLException
{
Object val = null;
if (sqlType == Types.INTEGER) val = rs.getInt(names[0]);
if (sqlType == Types.VARCHAR) val = rs.getString(names[0]);
if (rs.wasNull()) return null;
for (PersistableEnum<?> pEnum : clazz.getEnumConstants())
{
if (Objects.equals(pEnum.getValue(), val)) return pEnum;
}
throw new IllegalArgumentException("Can not convert " + val + " to enum " + clazz.getName());
}
@Override
public void nullSafeSet(PreparedStatement st,
Object value,
int index,
SharedSessionContractImplementor session) throws HibernateException, SQLException
{
if (value == null) {
st.setNull(index, sqlType);
}
else {
PersistableEnum<?> pEnum = (PersistableEnum<?>) value;
if (sqlType == Types.INTEGER) st.setInt(index, (Integer) pEnum.getValue());
if (sqlType == Types.VARCHAR) st.setString(index, (String) pEnum.getValue());
}
}
@Override
public Object deepCopy(Object value) throws HibernateException
{
return value;
}
@Override
public boolean isMutable()
{
return false;
}
@Override
public Serializable disassemble(Object value) throws HibernateException
{
return Objects.toString(value);
}
@Override
public Object assemble(Serializable cached, Object owner) throws HibernateException
{
return cached;
}
@Override
public Object replace(Object original, Object target, Object owner) throws HibernateException
{
return original;
}
}
And then, you can use it:
import org.hibernate.annotations.Type;
@Entity
@Table(name="TST_DATA")
public class TestData
{
...
@EnumConverter(enumClass = IntegerEnum.class, sqlType = Types.INTEGER)
@Type(type = "com.example.converter.PersistableEnumType")
@Column(name="INT_VAL")
public IntegerEnum getIntValue()
...
@EnumConverter(enumClass = StringEnum.class, sqlType = Types.VARCHAR)
@Type(type = "com.example.converter.PersistableEnumType")
@Column(name="STR_VAL")
public StringEnum getStrValue()
...
}
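The StringEnum used above is assumed to be just another PersistableEnum implementation, for example along these lines (the constants and database values are made up):

public enum StringEnum implements PersistableEnum<String> {
    PLANNED("PLANNED_DB_VALUE"),
    CANCELED("CANCELED_DB_VALUE"),
    DONE("DONE_DB_VALUE");

    private final String value;

    StringEnum(String value) {
        this.value = value;
    }

    @Override
    public String getValue() {
        return value;
    }
}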
See also chapter 5.3.3, "Extending Hibernate with UserTypes", in the excellent book "Java Persistence with Hibernate" by Bauer, King, and Gregory.
Simplifying:
import com.pismo.apirest.mvc.enums.OperationType;
import com.pismo.apirest.mvc.enums.support.PersistableEnum;
import java.util.Objects;
import java.util.Optional;
import java.util.stream.Stream;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
#SuppressWarnings("unused")
public interface EnumsConverters {
#RequiredArgsConstructor
abstract class AbstractPersistableEnumConverter<E extends Enum<E> & PersistableEnum<I>, I> implements AttributeConverter<E, I> {
private final E[] enumConstants;
public AbstractPersistableEnumConverter(#NonNull Class<E> enumType) {
enumConstants = enumType.getEnumConstants();
}
#Override
public I convertToDatabaseColumn(E attribute) {
return Objects.isNull(attribute) ? null : attribute.getId();
}
#Override
public E convertToEntityAttribute(I dbData) {
return fromId(dbData, enumConstants);
}
public E fromId(I idValue) {
return fromId(idValue, enumConstants);
}
public static <E extends Enum<E> & PersistableEnum<I>, I> E fromId(I idValue, E[] enumConstants) {
return Objects.isNull(idValue) ? null : Stream.of(enumConstants)
.filter(e -> e.getId().equals(idValue))
.findAny()
.orElseThrow(() -> new IllegalArgumentException(
String.format("Does not exist %s with ID: %s", enumConstants[0].getClass().getSimpleName(), idValue)));
}
}
@Converter(autoApply = true)
class OperationTypeConverter extends AbstractPersistableEnumConverter<OperationType, Integer> {
public OperationTypeConverter() {
super(OperationType.class);
}
}
}
I have tried many times to create something similar.
Generating a converter for each enum on the fly is not a problem, but then they will all have the same class. The main problem lies in org.hibernate.boot.internal.MetadataBuilderImpl#applyAttributeConverter(java.lang.Class<? extends javax.persistence.AttributeConverter>, boolean).
If a converter is already registered, we get an exception:
public void addAttributeConverterInfo(AttributeConverterInfo info) {
if ( this.attributeConverterInfoMap == null ) {
this.attributeConverterInfoMap = new HashMap<>();
}
final Object old = this.attributeConverterInfoMap.put( info.getConverterClass(), info );
if ( old != null ) {
throw new AssertionFailure(
String.format(
"AttributeConverter class [%s] registered multiple times",
info.getConverterClass()
)
);
}
}
Perhaps we could change org.hibernate.boot.internal.BootstrapContextImpl, but I am sure that would create overly complex and inflexible code.
Related
I have a custom bean serializer that I'd like to apply, but when I do, Jackson no longer includes null properties.
The following code reproduces the issue:
import java.io.IOException;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationConfig;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.ser.BeanSerializerModifier;
import lombok.Value;
public class Test {
@Value
public static class Contact {
String first;
String middle;
String last;
String email;
}
public static void main(String[] args) throws Exception {
Contact contact = new Contact("Bob", null, "Barker", null);
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new SimpleModule() {
@Override public void setupModule(SetupContext context) {
super.setupModule(context);
context.addBeanSerializerModifier(new BeanSerializerModifier() {
@Override public JsonSerializer<?> modifySerializer(SerializationConfig config, BeanDescription desc, JsonSerializer<?> serializer) {
// return serializer;
return new JsonSerializer<Object>() {
@Override public void serialize(Object value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
((JsonSerializer<Object>) serializer).serialize(value, gen, serializers);
}};
}
});
}
});
System.out.println(
mapper.writerWithDefaultPrettyPrinter().writeValueAsString(contact)
);
}
}
The above code does nothing other than register a 'custom' serializer (that just delegates back to the original serializer), yet it produces JSON without the null properties:
{ "first" : "Bob", "last" : "Barker" }
If you comment out the return new JsonSerializer<Object>() {...} and instead return the passed-in serializer as-is (return serializer;), then Jackson serializes the null properties:
{ "first" : "Bob", "middle" : null, "last" : "Barker", "email"
: null }
I have read over many seemingly related SO articles, but none have led me to a solution yet. I've tried explicitly setting the mapper to Include.ALWAYS on serialization, with no luck.
My only lead is a comment in the JavaDoc for JsonSerializer:
NOTE: various serialize methods are never (to be) called
with null values -- caller must handle null values, usually
by calling {@link SerializerProvider#findNullValueSerializer} to obtain
serializer to use.
This also means that custom serializers cannot be directly used to change
the output to produce when serializing null values.
I am using Jackson version 2.11.2.
My question is: How can I write a custom serializer and have Jackson respect its usual Include directives with regard to null property serialization?
Context Info: My actual custom serializer's job is to conditionally hide properties from serialization. I have a custom annotation, @JsonAuth, that is meta-annotated with @JacksonAnnotationsInside @JsonInclude(Include.NON_EMPTY), which my custom serializer (a ContextualSerializer) looks for in an overridden isEmpty method, returning true (treat as empty) if authorization is lacking. The end result is that I have an annotation that can be applied to properties and will hide a property from serialization if the client is not authorized. Except ... usage of the custom serializer has the unintended side effect of dropping all null properties.
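For reference, the annotation itself looks roughly like this (a sketch; the @Target/@Retention choices are assumptions, the meta-annotations are the ones mentioned above):

import com.fasterxml.jackson.annotation.JacksonAnnotationsInside;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Target({ElementType.FIELD, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@JacksonAnnotationsInside
@JsonInclude(Include.NON_EMPTY)
public @interface JsonAuth {
}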
Update: Jackson's BeanPropertyWriter.serializeAsField(...) method will completely ignore any custom serializer assigned to the property if the value is null.
I was able to override this behavior by writing a small extension to the class, which allowed my "isAuthorized" logic to preempt the null check:
public class JsonAuthPropertyWriter extends BeanPropertyWriter {
private final Predicate<Object> authFilter;
private JsonAuthPropertyWriter(BeanPropertyWriter delegate, Predicate<Object> authFilter) {
super(delegate);
this.authFilter = authFilter;
// set null serializer or authorized null values disappear
super.assignNullSerializer(NullSerializer.instance);
}
@Override
public void serializeAsField(
Object bean,
JsonGenerator gen,
SerializerProvider prov) throws Exception {
boolean authorized = authFilter.test(bean);
if (!authorized) return;
super.serializeAsField(bean, gen, prov);
}
}
And I injected these custom BeanPropertyWriters using a BeanSerializerModifier:
private static class JsonAuthBeanSerializerModifier extends BeanSerializerModifier {
@Override
public List<BeanPropertyWriter> changeProperties(
SerializationConfig config,
BeanDescription beanDesc,
List<BeanPropertyWriter> beanProperties
) {
for (int i = 0; i < beanProperties.size(); i++) {
BeanPropertyWriter beanPropertyWriter = beanProperties.get(i);
JsonAuth jsonAuth = beanPropertyWriter.findAnnotation(JsonAuth.class);
if (jsonAuth != null) {
Predicate<Object> authPredicate = ...
beanProperties.set(i, new JsonAuthPropertyWriter(beanPropertyWriter, authPredicate));
}
}
return beanProperties;
}
}
I may be misunderstanding what you want, but this approach seems useful:
import com.fasterxml.jackson.annotation.JsonFilter;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectWriter;
import com.fasterxml.jackson.databind.ser.BeanPropertyWriter;
import com.fasterxml.jackson.databind.ser.FilterProvider;
import com.fasterxml.jackson.databind.ser.PropertyWriter;
import com.fasterxml.jackson.databind.ser.impl.SimpleBeanPropertyFilter;
import com.fasterxml.jackson.databind.ser.impl.SimpleFilterProvider;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.HashMap;
import java.util.Map;
public class Test2 {
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@interface JsonAuth {
}
#JsonFilter("myFilter")
public static class Contact {
#JsonAuth
String first;
#JsonAuth
String middle;
#JsonAuth
String last;
String email;
public Contact(String first, String middle, String last, String email) {
this.first = first;
this.middle = middle;
this.last = last;
this.email = email;
}
public String getFirst() {
return first;
}
public void setFirst(String first) {
this.first = first;
}
public String getMiddle() {
return middle;
}
public void setMiddle(String middle) {
this.middle = middle;
}
public String getLast() {
return last;
}
public void setLast(String last) {
this.last = last;
}
public String getEmail() {
return email;
}
public void setEmail(String email) {
this.email = email;
}
}
public static Map<String,Boolean> fieldSerialisationCount = new HashMap<>();
public static void main(String[] args) throws Exception {
Contact contact = new Contact("Bob", null, "Barker", null);
ObjectMapper mapper = new ObjectMapper();
FilterProvider filters = new SimpleFilterProvider().addFilter("myFilter", new SimpleBeanPropertyFilter() {
@Override
protected boolean include(BeanPropertyWriter writer) {
return super.include(writer) && isAuthed(writer);
}
@Override
protected boolean include(PropertyWriter writer) {
return super.include(writer) && isAuthed(writer);
}
private boolean isAuthed(PropertyWriter writer) {
if (!writer.getMember().hasAnnotation(JsonAuth.class)) {
return true;
} else {
return fieldSerialisationCount.compute(writer.getName(), (n, b) -> b == null ? true : !b); // check auth here
}
}
});
mapper.setFilterProvider(filters);
ObjectWriter writer = mapper.writer(filters).withDefaultPrettyPrinter();
System.out.println(
writer.writeValueAsString(contact)
);
System.out.println(
writer.writeValueAsString(contact)
);
System.out.println(
writer.writeValueAsString(contact)
);
}
}
It serialises annotated fields every other time, just as an example of a filter using persistent state.
Please let me know whether this works for you.
By the way, I agree that Jackson has the problem you describe, and I don't know how to solve it, so this is a work-around rather than an answer to your original question.
I am trying to mask sensitive data while serializing using Jackson.
I have tried using @JsonSerialize and a custom annotation, @Mask.
Mask.java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface Mask {
String value() default "XXX-DEFAULT MASK FORMAT-XXX";
}
Employee.java
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import java.util.Map;
public class Employee {
#Mask(value = "*** The value of this attribute is masked for security reason ***")
#JsonSerialize(using = MaskStringValueSerializer.class)
protected String name;
#Mask
#JsonSerialize(using = MaskStringValueSerializer.class)
protected String empId;
#JsonSerialize(using = MaskMapStringValueSerializer.class)
protected Map<Category, String> categoryMap;
public Employee() {
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getEmpId() {
return empId;
}
public void setEmpId(String empId) {
this.empId = empId;
}
public Map<Category, String> getCategoryMap() {
return categoryMap;
}
public void setCategoryMap(Map<Category, String> categoryMap) {
this.categoryMap = categoryMap;
}
}
Category.java
public enum Category {
@Mask
CATEGORY1,
@Mask(value = "*** This value of this attribute is masked for security reason ***")
CATEGORY2,
CATEGORY3;
}
MaskMapStringValueSerializer.java
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import java.io.IOException;
import java.util.Map;
public class MaskMapStringValueSerializer extends JsonSerializer<Map<Category, String>> {
@Override
public void serialize(Map<Category, String> map, JsonGenerator jsonGenerator, SerializerProvider serializerProvider) throws IOException {
jsonGenerator.writeStartObject();
for (Category key : map.keySet()) {
Mask annot = null;
try {
annot = key.getClass().getField(key.name()).getAnnotation(Mask.class);
} catch (NoSuchFieldException e) {
e.printStackTrace();
}
if (annot != null) {
jsonGenerator.writeStringField(((Category) key).name(), annot.value());
} else {
jsonGenerator.writeObjectField(((Category) key).name(), map.get(key));
}
}
jsonGenerator.writeEndObject();
}
}
MaskStringValueSerializer.java
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.BeanProperty;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.ser.ContextualSerializer;
import com.fasterxml.jackson.databind.ser.std.StdSerializer;
import java.io.IOException;
public class MaskStringValueSerializer extends StdSerializer<String> implements ContextualSerializer {
private Mask annot;
public MaskStringValueSerializer() {
super(String.class);
}
public MaskStringValueSerializer(Mask logMaskAnnotation) {
super(String.class);
this.annot = logMaskAnnotation;
}
public void serialize(String s, JsonGenerator jsonGenerator, SerializerProvider serializerProvider) throws IOException {
if (annot != null && s != null && !s.isEmpty()) {
jsonGenerator.writeString(annot.value());
} else {
jsonGenerator.writeString(s);
}
}
public JsonSerializer<?> createContextual(SerializerProvider serializerProvider, BeanProperty beanProperty) throws JsonMappingException {
Mask annot = null;
if (beanProperty != null) {
annot = beanProperty.getAnnotation(Mask.class);
}
return new MaskStringValueSerializer(annot);
}
}
MaskValueTest.java
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.Map;
public class MaskValueTest {
public static void main(String args[]) throws Exception{
Employee employee = new Employee();
employee.setName("John Doe");
employee.setEmpId("1234567890");
Map<Category, String> catMap = new HashMap<>();
catMap.put(Category.CATEGORY1, "CATEGORY1");
catMap.put(Category.CATEGORY2, "CATEGORY2");
catMap.put(Category.CATEGORY3, "CATEGORY3");
employee.setCategoryMap(catMap);
ObjectMapper mapper = new ObjectMapper();
System.out.println(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(employee));
}
}
Output -
{
"name" : "*** The value of this attribute is masked for security reason ***",
"empId" : "XXX-DEFAULT MASK FORMAT-XXX",
"categoryMap" : {
"CATEGORY1" : "XXX-DEFAULT MASK FORMAT-XXX",
"CATEGORY2" : "*** The value of this attribute is masked for security reason ***",
"CATEGORY3" : "CATEGORY3"
}
}
The result is as expected; however, this is static masking.
The intention was to mask only when needed, e.g. when printing to the logs, where all this sensitive data should be masked.
If I have to send this JSON for document indexing, where the values should stay as they are, this implementation fails.
I am looking for an annotation-based solution, where I can use two different instances of ObjectMapper initialized with JsonSerializers.
This can be an implementation of what Andreas suggested:
Create a class MaskAnnotationIntrospector which extends JacksonAnnotationIntrospector and overrides its findSerializer method, like this:
public class MaskAnnotationIntrospector extends JacksonAnnotationIntrospector {
@Override
public Object findSerializer(Annotated am) {
Mask annotation = am.getAnnotation(Mask.class);
if (annotation != null)
return MaskingSerializer.class;
return super.findSerializer(am);
}
}
Therefore, you can have two instances of ObjectMapper. Add the MaskAnnotationIntrospector to the one with which you want to mask (e.g. for logging purposes):
mapper.setAnnotationIntrospector(new MaskAnnotationIntrospector());
The other instance, which does not have the MaskAnnotationIntrospector set, does not mask anything during serialization.
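A minimal usage sketch of the two-mapper setup (the variable names are arbitrary, and employee is the object being serialized):

// Masking mapper, e.g. for logging:
ObjectMapper loggingMapper = new ObjectMapper();
loggingMapper.setAnnotationIntrospector(new MaskAnnotationIntrospector());

// Plain mapper, e.g. for document indexing; no introspector, so no masking:
ObjectMapper indexingMapper = new ObjectMapper();

System.out.println(loggingMapper.writeValueAsString(employee));  // masked
System.out.println(indexingMapper.writeValueAsString(employee)); // original values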
P.S. MaskAnnotationIntrospector can extend either JacksonAnnotationIntrospector or NopAnnotationIntrospector, but the latter does not provide any implementation of the findSerializer method: calling super.findSerializer(am) simply returns null, and as a direct result other Jackson annotations (such as @JsonIgnore) are discarded. Extending the former solves this problem.
Remove the @JsonSerialize annotations, and put the logic of how to handle the @Mask annotation in a Module, e.g. have it add an AnnotationIntrospector.
You can now choose whether or not to call registerModule(Module module).
As for writing the module, I'll leave that up to you. If you have any questions about that, ask another Question.
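As a starting point, here is a rough sketch of such a module, assuming the MaskAnnotationIntrospector shown in the answer above (the class name MaskIntrospectorModule is made up):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;

public class MaskIntrospectorModule extends SimpleModule {
    @Override
    public void setupModule(SetupContext context) {
        super.setupModule(context);
        // Adds the introspector that routes @Mask fields to the masking serializer.
        context.insertAnnotationIntrospector(new MaskAnnotationIntrospector());
    }
}

// Only the mapper that registers the module performs masking:
ObjectMapper maskingMapper = new ObjectMapper().registerModule(new MaskIntrospectorModule());
ObjectMapper plainMapper = new ObjectMapper();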
Instead of having MaskStringValueSerializer.java, you can create a module to bundle the serializer and register the module with the ObjectMapper whenever you want, which will eventually allow you to have two different instances of ObjectMapper.
Create a module to bundle the serializer
public class MaskingModule extends SimpleModule {
private static final String NAME = "CustomIntervalModule";
private static final VersionUtil VERSION_UTIL = new VersionUtil() {};
public MaskingModule() {
super(NAME, VERSION_UTIL.version());
addSerializer(MyBean.class, new MaskMapStringValueSerializer());
}
}
Register the module with ObjectMapper and use it
ObjectMapper objectMapper = new ObjectMapper().registerModule(new MaskingModule());
System.out.println(objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(employee));
Also, you can extend the ObjectMapper, register the module, and use it:
public class CustomObjectMapper extends ObjectMapper {
public CustomObjectMapper() {
registerModule(new MaskingModule());
}
}
CustomObjectMapper customObjectMapper = new CustomObjectMapper();
System.out.println(customObjectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(employee));
Why don't you use two fields, one for the original value and one for the masked value? For example, in this case you could use String name and String maskedName; then for logging you can use the masked value.
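A rough sketch of that idea (the field and method names are only examples):

public class Employee {
    private String name;

    public String getName() { return name; }

    // Derived, masked view of the same value, intended only for logging.
    public String getMaskedName() {
        return name == null ? null : "XXX-MASKED-XXX";
    }
}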
I have a hibernate-mapped Java object, JKL, which is full of a bunch of normal hibernate-mappable fields (like strings and integers).
I'm adding a new embedded field to it (which lives in the same table -- not a mapping), asdf, which is a fj.data.Option<ASDF>. I've made it an Option to make it clear that this field may not actually contain anything (as opposed to having to handle null every time I access it).
How do I set up the mapping in my JKL.hbm.xml file? I'd like hibernate to automatically convert a null in the database to a none of fj.data.Option<ASDF> when it retrieves the object. It should also convert a non-null instance of ASDF to a some of fj.data.Option<ASDF>.
Is there any other trickery that I have to do?
I would suggest introducing FunctionalJava's Option in the accessors (getter and setter), while leaving Hibernate to handle a simple java field which is allowed to be null.
For example, for an optional Integer field:
// SQL
CREATE TABLE `JKL` (
`JKL_ID` INTEGER PRIMARY KEY,
`MY_FIELD` INTEGER DEFAULT NULL
)
You can map a Hibernate private field directly:
// Java
@Column(nullable = true)
private Integer myField;
You could then introduce Option at the accessor boundary:
// Java
public fj.data.Option<Integer> getMyField() {
return fj.data.Option.fromNull(myField);
}
public void setMyField(fj.data.Option<Integer> value) {
myField = value.toNull();
}
Does that work for your needs?
You can use Hibernate's custom mapping types. Documentation is here. Here is an analogous example of mapping Scala's Option to a Hibernate mapping.
Simply put, you would need to extend the org.hibernate.UserType interface. You could also create a generic-typed base class with a JKL-typed sub-type, similar to what you see in the Scala example.
I think using a getter/setter is simpler, but here's an example of what I did to make it work:
(It works fine for numbers and strings, but not for dates (error with the @Temporal annotation).)
import com.cestpasdur.helpers.PredicateHelper;
import com.google.common.annotations.VisibleForTesting;
import com.google.common.base.Optional;
import org.apache.commons.lang.ObjectUtils;
import org.apache.commons.lang.StringUtils;
import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;
import org.joda.time.DateTime;
import java.io.Serializable;
import java.sql.*;
public class OptionUserType implements UserType {
@Override
public int[] sqlTypes() {
return new int[]{
Types.NULL
};
}
@Override
public Class returnedClass() {
return Optional.class;
}
@Override
public boolean equals(Object o, Object o2) throws HibernateException {
return ObjectUtils.equals(o, o2);
}
@Override
public int hashCode(Object o) throws HibernateException {
assert (o != null);
return o.hashCode();
}
@Override
public Optional<? extends Object> nullSafeGet(ResultSet rs, String[] names, Object owner) throws HibernateException, SQLException {
return Optional.fromNullable(rs.getObject(names[0]));
}
@VisibleForTesting
void handleDate(PreparedStatement st, Date value, int index) throws SQLException {
st.setDate(index, value);
}
@VisibleForTesting
void handleNumber(PreparedStatement st, String stringValue, int index) throws SQLException {
Double doubleValue = Double.valueOf(stringValue);
st.setDouble(index, doubleValue);
}
@Override
public void nullSafeSet(PreparedStatement st, Object value, int index) throws SQLException {
if (value != null) {
if (value instanceof Optional) {
Optional optionalValue = (Optional) value;
if (optionalValue.isPresent()) {
String stringValue = String.valueOf(optionalValue.get());
if (StringUtils.isNotBlank(stringValue)) {
if (PredicateHelper.IS_DATE_PREDICATE.apply(stringValue)) {
handleDate(st, new Date(DateTime.parse(stringValue).getMillis()), index);
} else if (StringUtils.isNumeric(stringValue)) {
handleNumber(st, stringValue, index);
} else {
st.setString(index, optionalValue.get().toString());
}
} else {
st.setString(index, null);
}
} else {
System.out.println("else Some");
}
} else {
//TODO replace with Preconditions guava
throw new IllegalArgumentException(value + " is not implemented");
}
} else {
st.setString(index, null);
}
}
@Override
public Object deepCopy(Object o) throws HibernateException {
return o;
}
@Override
public boolean isMutable() {
return false;
}
@Override
public Serializable disassemble(Object o) throws HibernateException {
return (Serializable) o;
}
@Override
public Object assemble(Serializable serializable, Object o) throws HibernateException {
return serializable;
}
@Override
public Object replace(Object original, Object target, Object owner) throws HibernateException {
return original;
}
}
I'd like to use ltree from PostgreSQL contrib in one of my projects, and I'd like to use some kind of ORM layer (Hibernate, EclipseLink) on top of the database. I didn't find anything useful about using this type with persistence. I guess I have to extend the current PostgreSQL dialect with a new type and the corresponding operators. However, I don't really know where to start and what the correct way to do this is. ltree works very much like a string, so I guess I should start from a string representation.
Can somebody give me suggestions and/or links to examples that do similar things? I couldn't find a complete tutorial yet.
this:
#Column(name = "dir", nullable = false, columnDefinition = "ltree")
#Type(type = "ru.zen0n.hibernate.types.LTreeType")
private String path;
and this:
package ru.zen0n.hibernate.types;
import java.io.Serializable;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;
public class LTreeType implements UserType {
@Override
public int[] sqlTypes() {
return new int[] {Types.OTHER};
}
@SuppressWarnings("rawtypes")
@Override
public Class returnedClass() {
return String.class;
}
@Override
public boolean equals(Object x, Object y) throws HibernateException {
return x.equals(y);
}
@Override
public int hashCode(Object x) throws HibernateException {
return x.hashCode();
}
@Override
public Object nullSafeGet(ResultSet rs, String[] names, Object owner)
throws HibernateException, SQLException {
return rs.getString(names[0]);
}
@Override
public void nullSafeSet(PreparedStatement st, Object value, int index)
throws HibernateException, SQLException {
st.setObject(index, value, Types.OTHER);
}
@Override
public Object deepCopy(Object value) throws HibernateException {
return new String((String)value);
}
@Override
public boolean isMutable() {
return false;
}
@Override
public Serializable disassemble(Object value) throws HibernateException {
return (Serializable)value;
}
@Override
public Object assemble(Serializable cached, Object owner)
throws HibernateException {
return cached;
}
@Override
public Object replace(Object original, Object target, Object owner)
throws HibernateException {
// TODO Auto-generated method stub
return deepCopy(original);
}
}
Here are some pointers that may help you. This is not a complete answer, but it is a bit too much for a comment.
Both Hibernate and EclipseLink implement the JPA standard but also have some extensions of their own and allow for custom behavior.
I haven't done any work with ltrees specifically, but I think you could just use the Java String class and add an annotation to the column to override the normal column type used (this can be done in standard JPA).
#Column(columnDefinition="ltree")
private String myLtreeValue;
If a String on the Java side is not sufficient, you can create your own class and write a converter class for it. You can see an example of that in this question. This would be persistence-provider dependent.
When performing queries with conditions involving ltree values, you would probably be best off using the JPA @NamedNativeQuery annotation to define queries involving the special operators. For an example, see here.
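For illustration, a hedged sketch of what such a query could look like (the entity, table and column names are made up; <@ is the ltree "is descendant of" operator):

@Entity
@NamedNativeQuery(
    name = "TreeNode.findDescendants",
    query = "SELECT * FROM tree_node WHERE path <@ 'Top.Science'",
    resultClass = TreeNode.class)
public class TreeNode {
    @Id
    private Long id;

    @Column(name = "path", columnDefinition = "ltree")
    private String path;
    // getters and setters omitted
}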
Oracle supports the use of VARRAYS and NESTED TABLE data types, allowing multivalued attributes. (http://www.orafaq.com/wiki/NESTED_TABLE)
I am currently using Hibernate 3 as my ORM framework, but I can't see how I can map Hibernate to a NESTED TABLE/VARRAY data type in my database.
I looked at defining custom types in Hibernate, with no success. (Can Hibernate even handle the "COLUMN_VALUE" Oracle keyword necessary to unnest the subtable?)
Does anyone know how to implement these data types in Hibernate?
Thank you all for your help.
-- TBW.
Here is a Hibernate UserType for Oracle's TABLE OF NUMBERS.
OracleNativeExtractor can be found here: https://community.jboss.org/wiki/MappingOracleXmlTypeToDocument . Replace the string YOUR_CUSTOM_ARRAY_TYPE with your own type name.
import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;
import org.apache.commons.lang.ArrayUtils;
import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;
import java.io.Serializable;
import java.sql.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public class ArrayUserType
implements UserType, Serializable {
private static final OracleNativeExtractor EXTRACTOR = new OracleNativeExtractor();
@Override
public int[] sqlTypes() {
return new int[]{Types.ARRAY};
}
@Override
public Class returnedClass() {
return List.class;
}
@Override
public boolean equals(Object x, Object y) throws HibernateException {
if (x == null && y == null) return true;
else if (x == null && y != null) return false;
else return x.equals(y);
}
@Override
public int hashCode(Object x) throws HibernateException {
return x.hashCode();
}
@Override
public Object nullSafeGet(ResultSet rs, String[] names, Object owner) throws HibernateException, SQLException {
return Arrays.asList(ArrayUtils.toObject(((ARRAY) rs.getObject(names[0])).getLongArray()));
}
@Override
public void nullSafeSet(PreparedStatement st, Object value, int index) throws HibernateException, SQLException {
ARRAY array = null;
if (value != null) {
Connection nativeConn = EXTRACTOR.getNativeConnection(st.getConnection());
ArrayDescriptor descriptor =
ArrayDescriptor.createDescriptor("YOUR_CUSTOM_ARRAY_TYPE", nativeConn);
array = new ARRAY(descriptor, nativeConn, ((List<Long>) value).toArray(new Long[]{}));
}
st.setObject(index, array);
}
@Override
public Object deepCopy(Object value) throws HibernateException {
if (value == null) return null;
return new ArrayList<Long>((List<Long>) value);
}
@Override
public boolean isMutable() {
return false;
}
public Object assemble(Serializable _cached, Object _owner)
throws HibernateException {
return _cached;
}
public Serializable disassemble(Object _obj)
throws HibernateException {
return (Serializable) _obj;
}
@Override
public Object replace(Object original, Object target, Object owner) throws HibernateException {
return deepCopy(original);
}
}
I hope I'm wrong and that you find a better answer in your research, but this feature is not supported in Hibernate. Hibernate relies on standard JDBC to talk to a database and these features are not part of the standard. They are Oracle extensions.
That said, I can think of a few workarounds:
1) Implement your own UserType. With your specific user type, you'll have a chance to manipulate the values provided by the database (or about to be sent to the database). But that will only work if Oracle provides this value as one of these java.sql.Types: http://download.oracle.com/javase/1.5.0/docs/api/java/sql/Types.html
2) The other option is to use JDBC directly, through the use of a Hibernate worker. See this example of a Worker: https://github.com/hibernate/hibernate-core/blob/master/hibernate-core/src/test/java/org/hibernate/test/jdbc/GeneralWorkTest.java
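For illustration, a rough sketch of the worker approach (the table, column and id value are invented; inside execute you are free to use plain JDBC and Oracle-specific types):

session.doWork(new org.hibernate.jdbc.Work() {
    @Override
    public void execute(Connection connection) throws SQLException {
        // Unnest the nested-table column with TABLE(...) and read COLUMN_VALUE directly.
        PreparedStatement ps = connection.prepareStatement(
            "SELECT t.column_value FROM my_table p, TABLE(p.my_numbers) t WHERE p.id = ?");
        try {
            ps.setLong(1, 42L);
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getLong(1));
            }
        } finally {
            ps.close();
        }
    }
});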
That said, I think that you have to weigh the solutions and re-evaluate whether you really need a nested table.