Injection of stateless session bean into custom JsonDeserializer fails - java

I am building an application providing a JAX-RS REST service using JPA (EclipseLink). When exposing User entities over JSON, I use the @XmlTransient annotation on some fields (e.g. the password field) to hide them from the JSON representation. When receiving a create or update (POST/PUT) operation, I would like to repopulate the missing fields so that JPA performs the operation correctly.
My current approach is a custom JsonDeserializer that deserializes the User and adds the missing fields. For this I would like to inject (using @Inject) a UserFacadeREST bean which handles the JPA work. However, the injection fails and the bean instance is null (which then, of course, causes a NullPointerException).
My UserFacadeREST bean is annotated as follows:
@Stateless
@LocalBean
@Path(UserFacadeREST.PATH)
public class UserFacadeREST extends AbstractFacade<User> {
    // ...
}
My UserDeserializer (custom JsonDeserializer):
public class UserDeserializer extends JsonDeserializer<User> {

    @Inject
    private UserFacadeREST userFacade;

    @Override
    public User deserialize(JsonParser parser, DeserializationContext context)
            throws IOException, JsonProcessingException {
        JsonNode node = parser.getCodec().readTree(parser);
        int userId = (Integer) ((IntNode) node.get("userID")).numberValue();
        System.out.println(userId);
        User user = userFacade.find(userId); // This line produces the NullPointerException
        return user;
    }
}
which I then use on my User entity with @JsonDeserialize:
@Entity
@Table(name = "User")
@XmlRootElement
@JsonDeserialize(using = UserDeserializer.class)
public class User implements Serializable {
    // ...
}
I have included a beans.xml file in my WEB-INF folder with bean-discovery-mode set to all. What am I missing?

Jon Peterson pointed me in the right direction. I finally chose to implement the 'hackish' solution, in a way. Note that there are basically two options here (if you know another one, please let me know!). Short version:
Hackish solution (the one I chose): look up the bean programmatically using javax.enterprise.inject.spi.CDI.current().select(UserFacadeREST.class).get(), as described in the accepted answer of the question Jon mentioned, or
Better (clean) solution (but also more elaborate): redesign the logic to fill the missing fields after deserialization, as suggested by Jon.
So for my question, the solution looks as follows:
1.
import javax.enterprise.inject.spi.CDI;

public class UserDeserializer extends JsonDeserializer<User> {

    private final UserFacadeREST userFacade =
            CDI.current().select(UserFacadeREST.class).get();

    // Rest as before
}
2. In this case, the deserialize method of my JsonDeserializer constructs a User that holds only the userID. In every request method I then have to examine each incoming User and replace it with the actual entity by calling EntityManager.find(User.class, user.getUserID()). This means more effort in the business logic: every time you work with a User in a request method, you first have to run a query to get the 'full' User object. In the first solution, this query is hidden from the business logic and already happens in the JsonDeserializer.
public class UserDeserializer extends JsonDeserializer<User> {

    @Override
    public User deserialize(JsonParser parser, DeserializationContext context)
            throws IOException, JsonProcessingException {
        JsonNode node = parser.getCodec().readTree(parser);
        int userId = (Integer) ((IntNode) node.get("userID")).numberValue();
        // Placeholder User holding only the ID; must be replaced in the business logic
        return new User(userId);
    }
}
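To make the second option concrete, here is a minimal sketch of what the lookup in a request method could look like (the Order resource and its getUser/setUser accessors are hypothetical; userFacade.find is assumed to load the full entity, as in AbstractFacade):
@PUT
@Consumes("application/json")
public Response updateOrder(Order order) {
    // The deserializer produced a placeholder User carrying only the ID
    User placeholder = order.getUser();
    // Replace it with the fully populated entity before doing any JPA work
    order.setUser(userFacade.find(placeholder.getUserID()));
    // ... remaining business logic
    return Response.ok().build();
}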

I'm not super familiar with CDI, but some quick Googling leads me to believe that bean-discovery-mode should be either all, annotated, or none (true is not a valid value). Reference
If that doesn't fix it, it might be the same issue Spring would have: you have to declare your UserDeserializer as a bean for dependency injection to be applied.
EDIT: Just found this other question that is basically the same issue you are having.
Ultimately, you probably need to just redesign the logic to call userFacade after deserialization.

Related

Mass Assignment: Insecure Binder Configuration with Fortify Java 1.8 EJB

I am trying to solve some vulnerability issues, and there is one I couldn't solve. I tried adding the @Valid annotation to the sync method, but I get the same error. This is the description from Fortify:
The framework binder used for binding the HTTP request parameters to
the model class has not been explicitly configured to allow, or
disallow certain attributes.
To ease development and increase productivity, most modern frameworks
allow an object to be automatically instantiated and populated with
the HTTP request parameters whose names match an attribute of the
class to be bound. Automatic instantiation and population of objects
speeds up development, but can lead to serious problems if implemented
without caution. Any attribute in the bound classes, or nested
classes, will be automatically bound to the HTTP request parameters.
Therefore, malicious users will be able to assign a value to any
attribute in bound or nested classes, even if they are not exposed to
the client through web forms or API contracts.
The error is reported on this line:
public ResponseClass sync(@BeanParam MyClassRequest request) throws Exception {
MyClassResource.java
@Api(tags = "Relay")
@Stateless
public class MyClassResource extends AbstractService<MyClassRequest, ResponseClass> {

    @EJB
    private MyClassService myClassService;

    @POST
    @Path("/api/v1/service")
    @Produces({"application/json"})
    @ApiOperation(value = "Processes Conn",
            response = ResponseClass.class, responseContainer = "ResponseClass", hidden = true)
    @Override
    public ResponseClass sync(@BeanParam MyClassRequest request) throws Exception {
        myClassService.processFeed(request);
        return new RelayResponse(HttpStatuses.ACCEPTED.getStatus());
    }
}
MyClassRequest.java
In this file I have tried @FormParam("ccc"), but I get the same error:
public class MyClassRequest extends RelayRequest {

    public MyClassRequest() {
        super.setMessageType("not required");
    }

    private String myData;
    private String conneRid;
    private String connectionCreatedDate;
    // ...
}
If someone could give me a hint on how to solve this, I would really appreciate it.
Do you expect all fields to be present in the request? You are using the @Valid annotation, but there are no validation annotations in the MyClassRequest model. Try adding annotations such as @JsonIgnore for non-mandatory fields, or @JsonInclude on the class. If that does not help, also try explicitly adding @JsonProperty to each field.
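As a hedged illustration of that suggestion (the exact annotations that satisfy Fortify depend on the binder in use, and the choice of which fields to expose or hide here is arbitrary):
public class MyClassRequest extends RelayRequest {

    @NotNull                     // bean-validation constraint, checked by @Valid
    @JsonProperty("myData")      // explicitly allow binding of this field
    private String myData;

    @JsonProperty("conneRid")
    private String conneRid;

    @JsonIgnore                  // explicitly disallow binding of this field
    private String connectionCreatedDate;
}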

Spring MVC: issue between xml and annotation configurations

I have created a simple controller
@GetMapping("/playerAccount")
public Iterable<PlayerAccount> getPlayerAccounts(com.querydsl.core.types.Predicate predicate) {
    return repository.findAll(predicate);
}
When I call the GET /playerAccount API, I get an IllegalStateException: "No primary or default constructor found for interface com.querydsl.core.types.Predicate" (thrown by org.springframework.web.method.annotation.ModelAttributeMethodProcessor#createAttribute).
After some (deep!) digging, I found out that if I delete the following line in my spring.xml file:
<mvc:annotation-driven />
And if I add the following line in my Spring.java file:
@EnableWebMvc
then the problem disappears.
I really don't understand why. What could be the cause of this? I thought these were truly equivalent (one being an XML-based configuration, the other being Java/annotation-based).
I read this documentation on combining Java and XML configuration, but I didn't see anything relevant there.
Edit:
From the (few) comments/answers I got so far, I understand that using a Predicate in my API may not be the best choice.
Although I would really like to understand the nature of the bug, I first want to address the initial issue I'm trying to solve:
Let's say I have a MyEntity entity composed of 10 different fields (with different names and types) that I would like to search easily. If I create the following (empty) interface:
public interface MyEntityRepository extends JpaRepository<MyEntity, Long>, QuerydslPredicateExecutor<MyEntity> {
}
then without any other code (apart from the XML configuration), I can easily search for a myEntity entity in the database.
Now I just want to expose that functionality to a REST endpoint. And ideally, if I add a new field to MyEntity, I want the API to automatically work with that new field, just like MyEntityRepository does, without modifying the controller.
I thought this was the purpose of Spring Data and a good approach, but please tell me if there's a better / more common way of creating a search API for a given entity.
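For reference, Spring Data's Querydsl web support is designed for exactly this: it can bind request parameters to a Predicate derived from the entity's fields. A minimal sketch (assuming spring-data-commons with Querydsl on the classpath):
@GetMapping("/playerAccount")
public Iterable<PlayerAccount> getPlayerAccounts(
        @QuerydslPredicate(root = PlayerAccount.class) Predicate predicate) {
    // e.g. GET /playerAccount?name=foo binds the 'name' field of PlayerAccount
    return repository.findAll(predicate);
}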
I didn't see that it returned an exception; that's why I thought it was a dependency problem.
Try to make your code look like this, and it will work:
@RestController
public class MyClass {

    @Autowired
    private MyRepository repository; // not final: field injection cannot assign final fields

    @GetMapping("/playerAccount")
    public Iterable<PlayerAccount> getPlayerAccounts() {
        return repository.findAll();
    }
}
If you have a parameter in your request, you add @RequestParam.
Code time (yaaaaaay):
@RestController
public class MyClass {

    @Autowired
    private MyRepository repository;

    @GetMapping("/playerAccount")
    public Iterable<PlayerAccount> getPlayerAccounts(@RequestParam(required = false) Long id) {
        // findAllById returns an Iterable, which matches the declared return type
        // (findById would return an Optional<PlayerAccount> instead)
        return repository.findAllById(Collections.singletonList(id));
    }
}
PS: the request should keep the same variable name, e.g.
.../playerAccount?id=6

How to properly convert domain entities to DTOs while considering scalability & testability

I have read several articles and Stack Overflow posts about converting domain objects to DTOs and have tried them out in my code. When it comes to testing and scalability, I always face some issues. I know the following three possible solutions for converting domain objects to DTOs. Most of the time I am using Spring.
Solution 1: Private method in the service layer for converting
The first possible solution is to create a small "helper" method in the service layer code which converts the retrieved database object to my DTO object.
@Service
public class MyEntityService {

    public SomeDto getEntityById(Long id) {
        SomeEntity dbResult = someDao.findById(id);
        SomeDto dtoResult = convert(dbResult);
        // ... more logic happens
        return dtoResult;
    }

    private SomeDto convert(SomeEntity entity) {
        // ... object creation, using getters/setters for the conversion
    }
}
Pros:
easy to implement
no additional class needed for the conversion -> the project doesn't get cluttered with extra classes
Cons:
problems when testing, as new SomeEntity() is used in the private method; if the object is deeply nested, I have to provide an adequate result for my when(someDao.findById(id)).thenReturn(alsoDeeplyNestedObject) to avoid NullPointerExceptions if the conversion also dissolves the nested structure
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
My second solution is to add an extra constructor to my DTO that converts the entity in the constructor.
public class SomeDto {

    // ... some attributes

    public SomeDto(SomeEntity entity) {
        this.attribute = entity.getAttribute();
        // ... nested conversion & conversion of lists and arrays
    }
}
Pros:
no additional class needed for the conversion
the conversion is hidden in the DTO -> the service code stays smaller
Cons:
usage of new SomeDto() in the service code, and therefore I have to provide the correct nested object structure as a result of my someDao mocking.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
I recently saw that Spring offers a class for conversion purposes, Converter<S, T>, but this solution stands for any externalized class that does the conversion. With this solution I inject the converter into my service code and call it when I want to convert the domain entity to my DTO.
Pros:
easy to test as I can mock the result during my test case
separation of tasks -> a dedicated class is doing the job
Cons:
doesn't "scale" that well as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO to entity and entity to DTO)
Do you have more solutions for my problem, and how do you handle it? Do you create a new converter for every new domain object, and can you "live" with the number of classes in the project?
Thanks in advance!
Solution 1: Private method in the service layer for converting
I guess Solution 1 will not work well, because your DTOs are domain-oriented rather than service-oriented. Thus it is likely that they are used in different services. A mapping method therefore does not belong to any one service and should not be implemented in one. How would you re-use the mapping method in another service?
The first solution would work well if you used dedicated DTOs per service method. But more about this at the end.
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
In general a good option, because you can see the DTO as an adapter to the entity. In other words: the DTO is another representation of an entity. Such designs often wrap the source object and provide methods that give you another view of the wrapped object.
But a DTO is a data transfer object, so it might be serialized sooner or later and sent over a network, e.g. using Spring's remoting capabilities. In this case the client that receives the DTO must deserialize it and thus needs the entity classes on its classpath, even if it only uses the DTO's interface.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
Solution 3 is the one I would also prefer. But I would create a Mapper<S,T> interface that is responsible for mapping from source to target and vice versa, e.g.:
public interface Mapper<S, T> {
    T map(S source);
    S map(T target);
}
The implementation can be done using a mapping framework like ModelMapper.
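A minimal sketch of such an implementation backed by ModelMapper (UserEntity and UserDto are placeholder names, not from the original post):
import org.modelmapper.ModelMapper;

public class UserMapper implements Mapper<UserEntity, UserDto> {

    private final ModelMapper modelMapper = new ModelMapper();

    @Override
    public UserDto map(UserEntity source) {
        return modelMapper.map(source, UserDto.class);
    }

    @Override
    public UserEntity map(UserDto target) {
        return modelMapper.map(target, UserEntity.class);
    }
}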
You also said that a converter for each entity
doesn't "scale" that well as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO to entity and entity to DTO)
I doubt that you only have to create two converters or one mapper per DTO, because your DTO is domain-oriented.
As soon as you start to use it in another service, you will notice that the other service usually cannot or should not return all the values that the first service does. You will end up implementing another mapper or converter for each additional service.
This answer would get too long if I started on the pros and cons of dedicated versus shared DTOs, so I can only ask you to read my blog post on the pros and cons of service layer designs.
EDIT
About the third solution: where do you prefer to put the call for the mapper?
In the layer above the use cases. DTOs are data transfer objects, because they pack data into data structures that are best for the transfer protocol. Thus I call that layer the transport layer.
This layer is responsible for mapping use case's request and result objects from and to the transport representation, e.g. json data structures.
EDIT
I see you're OK with passing an entity as a DTO constructor parameter. Would you also be OK with the opposite? I mean, passing a DTO as an entity constructor parameter?
A good question. The opposite would not be OK for me, because it would introduce a dependency from the entity to the transport layer. This would mean that a change in the transport layer could impact the entities, and I don't want changes in more detailed layers to impact more abstract layers.
If you need to pass data from the transport layer to the entity layer you should apply the dependency inversion principle.
Introduce an interface that returns the data through a set of getters, let the DTO implement it, and use this interface in the entity's constructor. Keep in mind that this interface belongs to the entity's layer and thus should not have any dependencies on the transport layer.
                             interface
+-----+   implements   ||   +------------+   uses   +--------+
| DTO | ---------------||-> | EntityData | <------- | Entity |
+-----+                ||   +------------+          +--------+
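In code, the arrangement sketched above could look like this (names mirror the diagram and are illustrative):
// Entity layer: the abstraction the entity depends on
public interface EntityData {
    String getName();
}

// Transport layer: the DTO implements the entity-layer interface
public class UserDto implements EntityData {
    private String name;

    @Override
    public String getName() {
        return name;
    }
}

// Entity layer: constructed from the abstraction, never from the DTO type
public class UserEntity {
    private final String name;

    public UserEntity(EntityData data) {
        this.name = data.getName();
    }
}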
I like the third solution from the accepted answer.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
And I create my DtoConverter in this way:
BaseEntity marker class:
public abstract class BaseEntity implements Serializable {
}
AbstractDto marker class:
public class AbstractDto {
}
GenericConverter interface:
public interface GenericConverter<D extends AbstractDto, E extends BaseEntity> {

    E createFrom(D dto);

    D createFrom(E entity);

    E updateEntity(E entity, D dto);

    default List<D> createFromEntities(final Collection<E> entities) {
        return entities.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }

    default List<E> createFromDtos(final Collection<D> dtos) {
        return dtos.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }
}
CommentConverter interface:
public interface CommentConverter extends GenericConverter<CommentDto, CommentEntity> {
}
CommentConverter class implementation:
@Component
public class CommentConverterImpl implements CommentConverter {

    @Override
    public CommentEntity createFrom(CommentDto dto) {
        CommentEntity entity = new CommentEntity();
        updateEntity(entity, dto);
        return entity;
    }

    @Override
    public CommentDto createFrom(CommentEntity entity) {
        CommentDto dto = new CommentDto();
        if (entity != null) {
            dto.setAuthor(entity.getAuthor());
            dto.setCommentId(entity.getCommentId());
            dto.setCommentData(entity.getCommentData());
            dto.setCommentDate(entity.getCommentDate());
            dto.setNew(entity.getNew());
        }
        return dto;
    }

    @Override
    public CommentEntity updateEntity(CommentEntity entity, CommentDto dto) {
        if (entity != null && dto != null) {
            entity.setCommentData(dto.getCommentData());
            entity.setAuthor(dto.getAuthor());
        }
        return entity;
    }
}
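Usage is then straightforward; for example, converting a loaded list of entities in a service (the repository call is a hypothetical stand-in):
@Autowired
private CommentConverter commentConverter;

public List<CommentDto> getComments() {
    List<CommentEntity> entities = commentRepository.findAll(); // hypothetical repository
    return commentConverter.createFromEntities(entities);
}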
I ended up NOT using a magical mapping library or an external converter class, but just added a small bean of my own with convert methods from each entity to each DTO I need. The reason is that the mapping was:
either stupidly simple, where I would just copy some values from one field to another, perhaps with a small utility method,
or quite complex, and it would be more complicated to write down as custom parameters to some generic mapping library than to just write out that code. This is, for example, the case where the client can send JSON, but under the hood it is transformed into entities, and when the client retrieves the parent object of these entities again, it is converted back into JSON.
This means I can just call .map(converter::convert) on any collection of entities to get back a stream of my DTOs.
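A sketch of what such a hand-written converter bean could look like (Book and BookDto are placeholder names, not from the original post):
@Component
public class BookConverter {

    public BookDto convert(Book entity) {
        BookDto dto = new BookDto();
        dto.setId(entity.getId());
        dto.setTitle(entity.getTitle());
        return dto;
    }
}

// usage: books.stream().map(bookConverter::convert).collect(Collectors.toList())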
Is it scalable to have it all in one class? Well, the custom configuration for this mapping would have to be stored somewhere even if I used a generic mapper. The code is generally extremely simple, except for a handful of cases, so I'm not too worried about this class exploding in complexity. I'm also not expecting to have dozens more entities, but if I did, I might group these converters in one class per subdomain.
Adding a base class to my entities and DTOs so I can write a generic converter interface and implement it per class just isn't needed (yet?) for me.
In my opinion the third solution is the best one. Yes, for each entity you'll have to create two new converter classes, but when it comes time for testing you won't have a lot of headaches. You should never choose the solution that makes you write less code at the beginning and then much more when it comes to testing and maintaining that code.
Another point: if you use the second approach and your entity has lazy dependencies, your DTO can't tell whether a dependency is loaded unless you inject the EntityManager into the DTO and use it to check; I don't like that approach, because a DTO shouldn't know anything about the EntityManager. As a solution I personally prefer converters, but at the same time I prefer to have multiple DTO classes for the same entity. For example, if I am 100% sure that a User entity will be loaded without its corresponding Company, then there should be a UserDto that doesn't have a CompanyDto as a field. At the same time, if I know that the User entity will be loaded with its correlated Company, I use an aggregate pattern, something like a UserCompanyDto class that contains a UserDto and a CompanyDto as fields.
For my part, I prefer option 3 with a third-party library such as ModelMapper or MapStruct. I also use it through an interface in a util package, because I don't want any external tool or library to interact directly with my code.
Definition:
public interface MapperWrapper {
    <T> T performMapping(Object source, Class<T> destination);
}

@Component
public class ModelMapperWrapper implements MapperWrapper {

    private final ModelMapper mapper;

    public ModelMapperWrapper() {
        this.mapper = new ModelMapper();
    }

    @Override
    public <T> T performMapping(Object source, Class<T> destination) {
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT);
        return mapper.map(source, destination);
    }
}
Then I can test it easily:
Testing:
@SpringJUnitWebConfig(TestApplicationConfig.class)
class ModelMapperWrapperTest implements WithAssertions {

    private final MapperWrapper mapperWrapper;

    @Autowired
    public ModelMapperWrapperTest(MapperWrapper mapperWrapper) {
        this.mapperWrapper = mapperWrapper;
    }

    @BeforeEach
    void setUp() {
    }

    @Test
    void givenModel_whenMapModelToDto_thenReturnsDto() {
        var model = new DummyModel();
        model.setId(1);
        model.setName("DUMMY_NAME");
        model.setAge(25);
        var modelDto = mapperWrapper.performMapping(model, DummyModelDto.class);
        assertAll(
                () -> assertThat(modelDto.getId()).isEqualTo(String.valueOf(model.getId())),
                () -> assertThat(modelDto.getName()).isEqualTo(model.getName()),
                () -> assertThat(modelDto.getAge()).isEqualTo(String.valueOf(model.getAge()))
        );
    }

    @Test
    void givenDto_whenMapDtoToModel_thenReturnsModel() {
        var modelDto = new DummyModelDto();
        modelDto.setId("1");
        modelDto.setName("DUMMY_NAME");
        modelDto.setAge("25");
        var model = mapperWrapper.performMapping(modelDto, DummyModel.class);
        assertAll(
                () -> assertThat(model.getId()).isEqualTo(Integer.valueOf(modelDto.getId())),
                () -> assertThat(model.getName()).isEqualTo(modelDto.getName()),
                () -> assertThat(model.getAge()).isEqualTo(Integer.valueOf(modelDto.getAge()))
        );
    }
}
After that it would be very easy to switch to another mapping library. I could also have created an abstract factory or used a strategy pattern here.

Patterns: Populate instance from Parameters and export it to XML

I'm building a simple RESTful service, and to achieve that I need two things:
Get an instance of my resource (i.e. Book) from the request parameters, so that instance can be persisted
Build an XML document from that instance to send the representation to clients
Right now, I'm doing both things in my POJO class:
public class Book implements Serializable {

    private Long id;

    public Book(Form form) {
        // Initializing attributes
        id = Long.parseLong(form.getFirstValue(Book.CODE_ELEMENT));
    }

    public Element toXml(Document document) {
        // Getting an XML representation of the Book
        Element bookElement = document.createElement(BOOK_ELEMENT);
        // ... populate child elements
        return bookElement;
    }
}
I remember an OO principle saying that behavior should live where the data is, but now my POJO depends on the Request and XML APIs, and that doesn't feel right (on top of that, the class carries persistence annotations).
Is there any standard approach/pattern to solve that issue?
EDIT:
The libraries I'm using are Restlet and Objectify.
I agree with you when you say that behavior should be where the data is. But at the same time, as you say, I just don't feel comfortable polluting a POJO interface with methods used only for serialization (which can grow considerably depending on the formats you want to support: JSON, XML, etc.).
1) Build an XML document from that instance to send the representation to the clients
In order to decouple the object from serialization logic, I would adopt the Strategy Pattern:
interface BookSerializerStrategy {
    String serialize(Book book);
}

public class XmlBookSerializerStrategy implements BookSerializerStrategy {
    public String serialize(Book book) {
        // Do something to serialize your book as XML.
        return null; // placeholder
    }
}

public class JsonBookSerializerStrategy implements BookSerializerStrategy {
    public String serialize(Book book) {
        // Do something to serialize your book as JSON.
        return null; // placeholder
    }
}
Your POJO would then become:
public class Book implements Serializable {

    private Long id;
    private BookSerializerStrategy serializer;

    public String serialize() {
        return serializer.serialize(this);
    }

    public void setSerializer(BookSerializerStrategy serializer) {
        this.serializer = serializer;
    }
}
Using this approach you isolate the serialization logic in one place and don't pollute your POJO with it. Additionally, by returning a String you don't need to couple your POJO to the Document and Element classes.
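A short usage sketch of the strategy in a resource method (hedged; the Form-based constructor comes from the question):
Book book = new Book(form);                          // populate from request parameters
book.setSerializer(new XmlBookSerializerStrategy()); // pick the representation
String xml = book.serialize();                       // Book -> XML string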
2) Get an instance of my resource (i.e. Book) from request parameters, so I can get that instance to be persisted
Finding a pattern to handle the deserialization is more complex, in my opinion. I really don't see a better way than to create a factory with static methods in order to remove this logic from your POJO.
Another approach to both of your questions would be something like what JAXB uses: two different objects, an Unmarshaller in charge of deserialization and a Marshaller for serialization. Since Java 1.6, JAXB ships with the JDK.
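For illustration, a minimal JAXB round trip (assuming Book is annotated with @XmlRootElement):
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.bind.JAXBContext;

JAXBContext context = JAXBContext.newInstance(Book.class);

// Book -> XML
StringWriter out = new StringWriter();
context.createMarshaller().marshal(book, out);

// XML -> Book
Book parsed = (Book) context.createUnmarshaller()
        .unmarshal(new StringReader(out.toString()));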
Finally, those are just suggestions. I've actually become really interested in your question and curious about other possible solutions.
Are you using Spring, or any other framework, in your project? If you used Spring, it would take care of serialization for you, as well as assigning request params to method params (parsing as needed).

How can I tell jackson to ignore a property for which I don't have control over the source code?

Long story short, one of my entities has a GeometryCollection that throws an exception when you call getBoundary (the why of this is another book; for now, let's say this is the way it works).
Is there a way I can tell Jackson not to include that specific getter? I know I can use @JsonIgnore when I own/control the code, but that is not the case here; Jackson ends up reaching this point through continuous serialization of the parent objects. I saw a filtering option in the Jackson documentation. Is that a plausible solution?
Thanks!
You can use Jackson Mix-ins. For example:
class YourClass {
    public int ignoreThis() { return 0; }
}
With this mix-in:
abstract class MixIn {
    @JsonIgnore abstract int ignoreThis(); // we don't need it!
}
With this:
objectMapper.getSerializationConfig().addMixInAnnotations(YourClass.class, MixIn.class);
Edit:
Thanks to the comments: with Jackson 2.5+, the API has changed and the mix-in should be registered with objectMapper.addMixIn(Class<?> target, Class<?> mixinSource)
Another possibility: if you want to ignore all unknown properties, you can configure the mapper as follows:
mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
Using the Java class:
new ObjectMapper().configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
Using the annotation:
@JsonIgnoreProperties(ignoreUnknown = true)
The annotation-based approach is better, but sometimes manual operation is needed. For this purpose you can use ObjectWriter's withoutAttribute method.
ObjectMapper mapper = new ObjectMapper()
        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
ObjectWriter writer = mapper.writer().withoutAttribute("property1").withoutAttribute("property2");
String jsonText = writer.writeValueAsString(sourceObject);
Mix-in annotations work pretty well here, as already mentioned. Another possibility beyond the per-property @JsonIgnore is to use @JsonIgnoreType if you have a type that should never be included (i.e. if all instances of GeometryCollection properties should be ignored). You can then either add it directly (if you control the type) or use a mix-in, like:
@JsonIgnoreType
abstract class MixIn { }
// and then register the mix-in, either via SerializationConfig or by using SimpleModule
This can be more convenient if you have lots of classes that all have a single 'IgnoredType getContext()' accessor or so (which is the case for many frameworks).
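A minimal registration sketch (assuming Jackson 2.x, where addMixIn replaces the older SerializationConfig route):
ObjectMapper mapper = new ObjectMapper();
// Every property of type GeometryCollection is now skipped during serialization
mapper.addMixIn(GeometryCollection.class, MixIn.class);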
I had a similar issue, but it was related to Hibernate's bidirectional relationships. I wanted to show one side of the relationship and programmatically ignore the other, depending on which view I was dealing with. If you can't do that, you end up with nasty StackOverflowErrors. For instance, if I had these objects:
public class A {
    Long id;
    String name;
    List<B> children;
}

public class B {
    Long id;
    A parent;
}
I would want to programmatically ignore the parent field in B if I were looking at A, and ignore the children field in A if I were looking at B.
I started off using mix-ins to do this, but that very quickly becomes horrible; you have so many useless classes lying around that exist solely to format data. I ended up writing my own serializer to handle this in a cleaner way: https://github.com/monitorjbl/json-view.
It allows you to programmatically specify which fields to ignore:
ObjectMapper mapper = new ObjectMapper();
SimpleModule module = new SimpleModule();
module.addSerializer(JsonView.class, new JsonViewSerializer());
mapper.registerModule(module);

List<A> list = getListOfA();
String json = mapper.writeValueAsString(JsonView.with(list)
        .onClass(B.class, match()
                .exclude("parent")));
It also lets you easily specify very simplified views through wildcard matchers:
String json = mapper.writeValueAsString(JsonView.with(list)
        .onClass(A.class, match()
                .exclude("*")
                .include("id", "name")));
In my original case, the need for simple views like this was to show the bare minimum about the parent/child, but it also became useful for our role-based security: less privileged views of objects needed to return less information about the object.
All of this comes from the serializer, but I was using Spring MVC in my app. To have it properly handle these cases, I wrote an integration that you can drop into existing Spring controller classes:
@Controller
public class JsonController {

    private JsonResult json = JsonResult.instance();

    @Autowired
    private TestObjectService service;

    @RequestMapping(method = RequestMethod.GET, value = "/bean")
    @ResponseBody
    public List<TestObject> getTestObject() {
        List<TestObject> list = service.list();
        return json.use(JsonView.with(list)
                .onClass(TestObject.class, Match.match()
                        .exclude("int1")
                        .include("ignoredDirect")))
                .returnValue();
    }
}
Both are available on Maven Central. I hope it helps someone else out there; this is a particularly ugly problem with Jackson that didn't have a good solution for my case.
If you want to ALWAYS exclude certain properties for any class, you can use the setMixInResolver method:
@JsonIgnoreProperties({"id", "index", "version"})
abstract class MixIn {
}

mapper.setMixInResolver(new ClassIntrospector.MixInResolver() {
    @Override
    public Class<?> findMixInClassFor(Class<?> cls) {
        return MixIn.class;
    }

    @Override
    public ClassIntrospector.MixInResolver copy() {
        return this;
    }
});
One more good option here is to use @JsonFilter.
Some details here: Feature: JSON Filter
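A minimal sketch of the filter approach (assuming Jackson 2.x; the filter id "propFilter" is arbitrary):
@JsonFilter("propFilter")
class Bean {
    public int keepMe = 1;
    public int dropMe = 2;
}

// elsewhere:
FilterProvider filters = new SimpleFilterProvider()
        .addFilter("propFilter", SimpleBeanPropertyFilter.serializeAllExcept("dropMe"));
String json = new ObjectMapper().writer(filters).writeValueAsString(new Bean());
// json is {"keepMe":1}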
