Spring Data MongoDB - ignore empty objects - java

I'm using Spring Data with a MongoDB to save some documents. When saving documents, I would like that Mongo does not contain empty objects. (How) can this be achieved?
Say I have the following main class:
@Document(collection = "main_doc")
public class MainDoc {

    @Id
    private String id;

    private String title;
    private SubDoc subDoc;
}
that contains an object of the following class:
public class SubDoc {
    private String title;
    private String info;
}
Now if I would try to save the following object:
MainDoc main = new MainDoc();
main.setTitle("someTitle");
main.setSubDoc(new SubDoc());
Note: in reality I do not control the fact that the SubDoc is set like this. It can either be empty or filled in. What I want is that if an element's properties/fields are all NULL, it will not be stored in mongo at all.
This results in something like this in mongo:
{
"_id" : "5a328f9a-6118-403b-a3a0-a55ce52099f3",
"title": "someTitle",
"subDoc": {}
}
What I would like is that if an element contains only null properties, they aren't saved at all, so for the above example I would want the following result:
{
"_id" : "5a328f9a-6118-403b-a3a0-a55ce52099f3",
"title": "someTitle"
}
Saving of documents is done with the help of a repository as following:
@NoRepositoryBean
public interface MainRepo extends CrudRepository<MainDoc, String> {
    // save inherited
}
Thanks in advance.

One thing you can do here is to write a custom converter for MainDoc:
public class MainDocConverter implements Converter<MainDoc, DBObject> {

    @Override
    public DBObject convert(final MainDoc source) {
        final BasicDBObject dbObject = new BasicDBObject();
        ...
        if (/* check that subDoc is not null and not empty */) {
            dbObject.put("subDoc", source.getSubDoc());
        }
        return dbObject;
    }
}
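The emptiness check left as a placeholder in the converter could be implemented with reflection, for example. This is only a sketch: `EmptyCheck`, `isNullOrEmpty`, and `SubDocExample` are illustrative names of my own, not part of Spring Data, and primitive fields (which can never be null) would need extra handling.

```java
import java.lang.reflect.Field;

public class EmptyCheck {

    // Hypothetical stand-in for SubDoc, used only to demonstrate the check.
    public static class SubDocExample {
        public String title;
        public String info;
    }

    // Returns true when the object is null or every declared field is null.
    public static boolean isNullOrEmpty(Object obj) {
        if (obj == null) {
            return true;
        }
        for (Field field : obj.getClass().getDeclaredFields()) {
            field.setAccessible(true);
            try {
                if (field.get(obj) != null) {
                    return false;
                }
            } catch (IllegalAccessException e) {
                return false; // cannot inspect; treat as non-empty
            }
        }
        return true;
    }
}
```

Inside the converter this would replace the placeholder: `if (!EmptyCheck.isNullOrEmpty(source.getSubDoc())) { dbObject.put("subDoc", source.getSubDoc()); }`.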
You can register it in a @Configuration class, for example:
@Configuration
@EnableMongoRepositories(basePackages = {"package"})
public class MongoConfig {

    private final MongoDbFactory mongoDbFactory;

    public MongoConfig(final MongoDbFactory mongoDbFactory) {
        this.mongoDbFactory = mongoDbFactory;
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        final MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory, getDefaultMongoConverter());
        return mongoTemplate;
    }

    @Bean
    public MappingMongoConverter getDefaultMongoConverter() throws Exception {
        final MappingMongoConverter converter = new MappingMongoConverter(
                new DefaultDbRefResolver(mongoDbFactory), new MongoMappingContext());
        converter.setCustomConversions(new CustomConversions(Arrays.asList(new MainDocConverter())));
        return converter;
    }
}
If you don't want to write a custom converter for your object, you can use the default one and modify its output a bit:
final Document document = (Document) getDefaultMongoConverter().convertToMongoType(mainDoc);
if (/* subDoc is null or empty */) {
    document.remove("subDoc");
}
mongoTemplate().save(document);
Admittedly this is not the cleanest way: as others have noted, an empty object is normally stored as {}, but a converter can handle your case.

Related

Hibernate Search JsonB indexing

I am struggling with indexing a jsonb column into an Elasticsearch backend, using Hibernate Search 6.0.2.
This is my entity:
@Data
@NoArgsConstructor
@Entity
@Table(name = "examples")
public class Example {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private UUID id;

    @NotNull
    @Column(name = "fields")
    @Type(type = "jsonb")
    private Map<String, Object> fields;
}
and this is my programmatic mapping of elasticsearch backend for Hibernate Search:
@Configuration
@RequiredArgsConstructor
public class ElasticsearchMappingConfig implements HibernateOrmSearchMappingConfigurer {

    private final JsonPropertyBinder jsonPropertyBinder;

    @Override
    public void configure(HibernateOrmMappingConfigurationContext context) {
        var mapping = context.programmaticMapping();
        var exampleMapping = mapping.type(Example.class);
        exampleMapping.indexed();
        exampleMapping.property("fields").binder(jsonPropertyBinder);
    }
}
I based my custom property binder implementation on Hibernate Search 6.0.2 documentation.
@Component
public class JsonPropertyBinder implements PropertyBinder {

    @Override
    public void bind(PropertyBindingContext context) {
        context.dependencies().useRootOnly();
        var schemaElement = context.indexSchemaElement();
        var userMetadataField = schemaElement.objectField("metadata");
        context.bridge(Map.class, new Bridge(userMetadataField.toReference()));
    }

    @RequiredArgsConstructor
    private static class Bridge implements PropertyBridge<Map> {

        private final IndexObjectFieldReference fieldReference;

        @Override
        public void write(DocumentElement target, Map bridgedElement, PropertyBridgeWriteContext context) {
            var map = target.addObject(fieldReference);
            ((Map<String, Object>) bridgedElement).forEach(map::addValue);
        }
    }
}
I am aware that documentation defines multiple templates for what an Object in Map can be (like in MultiTypeUserMetadataBinder example), but I really do not know what can be inside. All I know, it is a valid json and my goal is to put it into Elasticsearch as valid json structure under "fields": {...}
In my case jsonB column may contain something like this:
{
    "testString": "298",
    "testNumber": 123,
    "testBoolean": true,
    "testNull": null,
    "testArray": [5, 4, 3],
    "testObject": {
        "testString": "298",
        "testNumber": 123,
        "testBoolean": true,
        "testNull": null,
        "testArray": [5, 4, 3]
    }
}
but it throws an exception:
org.hibernate.search.util.common.SearchException: HSEARCH400609: Unknown field 'metadata.testNumber'.
I have also set dynamic_mapping to true in my spring application:
...
spring.jpa.properties.hibernate.search.backend.hosts=127.0.0.3:9200
spring.jpa.properties.hibernate.search.backend.dynamic_mapping=true
...
Any other ideas on how I can approach this problem? Or maybe I made an error somewhere?
I am aware that documentation defines multiple templates for what an Object in Map can be (like in MultiTypeUserMetadataBinder example), but I really do not know what can be inside. All I know, it is a valid json and my goal is to put it into Elasticsearch as valid json structure under "fields": {...}
If you don't know what the type of each field is, Hibernate Search won't be able to help much. If you really want to stuff that into your index, I'd suggest declaring a native field and pushing the JSON as-is. But then you won't be able to apply predicates to the metadata fields easily, except using native JSON.
Something like this:
@Component
public class JsonPropertyBinder implements PropertyBinder {

    @Override
    public void bind(PropertyBindingContext context) {
        context.dependencies().useRootOnly();
        var schemaElement = context.indexSchemaElement();
        // CHANGE THIS
        IndexFieldReference<JsonElement> userMetadataField = schemaElement.field(
                "metadata",
                f -> f.extension(ElasticsearchExtension.get())
                        .asNative().mapping("{\"type\": \"object\", \"dynamic\":true}"))
                .toReference();
        context.bridge(Map.class, new Bridge(userMetadataField));
    }

    @RequiredArgsConstructor
    private static class Bridge implements PropertyBridge<Map> {

        private static final Gson GSON = new Gson();
        private final IndexFieldReference<JsonElement> fieldReference;

        @Override
        public void write(DocumentElement target, Map bridgedElement, PropertyBridgeWriteContext context) {
            // CHANGE THIS
            target.addValue(fieldReference, GSON.toJsonTree(bridgedElement));
        }
    }
}
Alternatively, you can just declare all fields as strings. Then all features provided by Hibernate Search on string types will be available. But of course things like range predicates or sorts will lead to strange results on numeric values (2 is before 10, but "2" is after "10").
Something like this:
@Component
public class JsonPropertyBinder implements PropertyBinder {

    @Override
    public void bind(PropertyBindingContext context) {
        context.dependencies().useRootOnly();
        var schemaElement = context.indexSchemaElement();
        var userMetadataField = schemaElement.objectField("metadata");
        // ADD THIS
        userMetadataField.fieldTemplate(
                "userMetadataValueTemplate_default",
                f -> f.asString().analyzer("english")
        );
        context.bridge(Map.class, new Bridge(userMetadataField.toReference()));
    }

    @RequiredArgsConstructor
    private static class Bridge implements PropertyBridge<Map> {

        private final IndexObjectFieldReference fieldReference;

        @Override
        public void write(DocumentElement target, Map bridgedElement, PropertyBridgeWriteContext context) {
            var map = target.addObject(fieldReference);
            // CHANGE THIS
            ((Map<String, Object>) bridgedElement).forEach((key, value) -> map.addValue(key, String.valueOf(value)));
        }
    }
}
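The ordering caveat mentioned above (2 sorts before 10 numerically, but "2" sorts after "10" lexicographically) is plain Java behavior and easy to verify:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SortOrderDemo {
    public static void main(String[] args) {
        List<Integer> numbers = new ArrayList<>(Arrays.asList(10, 2, 33));
        List<String> strings = new ArrayList<>(Arrays.asList("10", "2", "33"));
        numbers.sort(null);          // natural (numeric) order
        strings.sort(null);          // lexicographic order
        System.out.println(numbers); // [2, 10, 33]
        System.out.println(strings); // [10, 2, 33]
    }
}
```

The same comparison rules apply to range predicates and sorts on string-typed index fields, which is why the all-strings approach can give surprising results.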

reactive repository throws exception when saving a new object

I am using r2dbc, r2dbc-h2 and experimental spring-boot-starter-data-r2dbc
implementation 'org.springframework.boot.experimental:spring-boot-starter-data-r2dbc:0.1.0.M1'
implementation 'org.springframework.data:spring-data-r2dbc:1.0.0.RELEASE' // starter-data provides old version
implementation 'io.r2dbc:r2dbc-h2:0.8.0.RELEASE'
implementation 'io.r2dbc:r2dbc-pool:0.8.0.RELEASE'
I have created reactive repositories
public interface IJsonComparisonRepository extends ReactiveCrudRepository<JsonComparisonResult, String> {}
Also added a custom script that creates a table in H2 on startup
@SpringBootApplication
public class JsonComparisonApplication {

    public static void main(String[] args) {
        SpringApplication.run(JsonComparisonApplication.class, args);
    }

    @Bean
    public CommandLineRunner startup(DatabaseClient client) {
        return (args) -> client
                .execute(() -> {
                    var resource = new ClassPathResource("ddl/script.sql");
                    try (var is = new InputStreamReader(resource.getInputStream())) {
                        return FileCopyUtils.copyToString(is);
                    } catch (IOException e) {
                        throw new RuntimeException(e);
                    }
                })
                .then()
                .block();
    }
}
My r2dbc configuration looks like this
@Configuration
@EnableR2dbcRepositories
public class R2dbcConfiguration extends AbstractR2dbcConfiguration {

    @Override
    public ConnectionFactory connectionFactory() {
        return new H2ConnectionFactory(
                H2ConnectionConfiguration.builder()
                        .url("mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE")
                        .username("sa")
                        .build());
    }
}
My service where I perform the logic looks like this
@Override
public Mono<JsonComparisonResult> updateOrCreateRightSide(String comparisonId, String json) {
    return updateComparisonSide(comparisonId, storedComparisonResult -> {
        storedComparisonResult.setRightSide(json);
        return storedComparisonResult;
    });
}

private Mono<JsonComparisonResult> updateComparisonSide(String comparisonId,
        Function<JsonComparisonResult, JsonComparisonResult> updateSide) {
    return repository.findById(comparisonId)
            .defaultIfEmpty(createResult(comparisonId))
            .filter(result -> ComparisonDecision.NONE == result.getDecision()) // if not NONE, it was found and completed
            .switchIfEmpty(Mono.error(new NotUpdatableCompleteComparisonException(comparisonId)))
            .map(updateSide)
            .flatMap(repository::save);
}

private JsonComparisonResult createResult(String comparisonId) {
    LOGGER.info("Creating new comparison result: {}.", comparisonId);
    var newResult = new JsonComparisonResult();
    newResult.setDecision(ComparisonDecision.NONE);
    newResult.setComparisonId(comparisonId);
    return newResult;
}
The domain looks like this
@Table("json_comparison")
public class JsonComparisonResult {

    @Column("comparison_id")
    @Id
    private String comparisonId;

    @Column("left")
    private String leftSide;

    @Column("right")
    private String rightSide;

    // @Enumerated(EnumType.STRING) - no support for now
    @Column("decision")
    private ComparisonDecision decision;

    private String differences;
}
The problem is that when I try to add any object to the database it fails with the exception
org.springframework.dao.TransientDataAccessResourceException: Failed to update table [json_comparison]. Row with Id [4] does not exist.
at org.springframework.data.r2dbc.repository.support.SimpleR2dbcRepository.lambda$save$0(SimpleR2dbcRepository.java:91) ~[spring-data-r2dbc-1.0.0.RELEASE.jar:1.0.0.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:96) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:73) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoUsingWhen$MonoUsingWhenSubscriber.deferredComplete(MonoUsingWhen.java:276) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxUsingWhen$CommitInner.onComplete(FluxUsingWhen.java:536) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onComplete(Operators.java:1858) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators.complete(Operators.java:132) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoEmpty.subscribe(MonoEmpty.java:45) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
For some reason, during save, the SimpleR2dbcRepository library class does not consider objectToSave new, and the subsequent update fails because the row does not actually exist.
// SimpleR2dbcRepository#save
@Override
@Transactional
public <S extends T> Mono<S> save(S objectToSave) {
    Assert.notNull(objectToSave, "Object to save must not be null!");
    if (this.entity.isNew(objectToSave)) { // not new
        ....
    }
}
Why is this happening, and what is the problem?
TL;DR: How should Spring Data know if your object is new or whether it should exist?
Relational Spring Data Repositories (both, JDBC and R2DBC) must differentiate on [Reactive]CrudRepository.save(…) whether the given object is new or whether it exists in your database. Performing a save(…) operation results either in an INSERT or UPDATE statement. Issuing the wrong statement either causes a primary key violation or a no-op as standard SQL does not have a way to express an upsert.
Spring Data JDBC and R2DBC both use, by default, the presence or absence of the @Id value. Generated primary keys are a widely used mechanism. If the primary key is provided, the entity is considered to already exist; if the id value is null, the entity is considered new.
Read more in the reference documentation about Entity State Detection Strategies.
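The default strategy boils down to something like the following (an illustrative sketch only; `SaveStrategy` and `forId` are my names, not the actual Spring Data code):

```java
public class SaveStrategy {

    public enum Statement { INSERT, UPDATE }

    // Simplified mirror of the default entity-state detection:
    // a null @Id value means "new" -> INSERT; a non-null value
    // means "already persisted" -> UPDATE.
    public static Statement forId(Object id) {
        return id == null ? Statement.INSERT : Statement.UPDATE;
    }
}
```

Since JsonComparisonResult always has its comparisonId set before save, this strategy picks UPDATE, which then fails because the row does not exist yet.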
You have to implement Persistable because you've provided the @Id value yourself. The library needs to figure out whether the row is new or whether it should already exist. If your entity implements Persistable, then save(…) will use the outcome of isNew() to determine whether to issue an INSERT or UPDATE.
For example:
public class Product implements Persistable<Integer> {

    @Id
    private Integer id;

    private String description;
    private Double price;

    @Transient
    private boolean newProduct;

    @Override
    @Transient
    public boolean isNew() {
        return this.newProduct || id == null;
    }

    public Product setAsNew() {
        this.newProduct = true;
        return this;
    }
}
Maybe you should consider this:
Choose data type of your id/Primary Key as INT/LONG and set it to AUTO_INCREMENT (something like below):
CREATE TABLE PRODUCT(id INT PRIMARY KEY AUTO_INCREMENT NOT NULL, modelname VARCHAR(30) , year VARCHAR(4), owner VARCHAR(50));
In your post request body, do not include id field.
Removing the @Id value made save(…) issue an INSERT statement.

How can I use #JsonTypeInfo and #JsonSubTypes to instantiate Classes with different configurations?

I want to create a config file that will allow me to define different data generators, each of which will need a different configuration. But, they all share the same method, generateRow, so these classes can all implement an interface. I'm using Jackson version 2.9.4.
To illustrate, here's two sample config files:
{
"data": {
"type": "standard",
"config": {
"rows": 1000,
"columns": 10
}
}
}
and
{
"data": {
"type": "totalSize",
"config": {
"sizeInBytes": 1073741824,
"cellDensityInBytes": 12,
"columns": 5
}
}
}
The first data generator simply creates a file with the given number of rows and columns, the second generator creates a file of a pre-defined size, determining the number of rows needed to satisfy the configured variables (i.e., number of columns and cell density).
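The arithmetic behind the second generator could be sketched like this (my assumption about the intended formula; `RowCount` and `rowsFor` are hypothetical names, and the floor-division rounding rule is a guess):

```java
public class RowCount {

    // Hypothetical helper: how many rows fit into sizeInBytes when each
    // row has `columns` cells of `cellDensityInBytes` bytes each
    // (integer division, rounded down).
    public static long rowsFor(long sizeInBytes, int columns, int cellDensityInBytes) {
        return sizeInBytes / ((long) columns * cellDensityInBytes);
    }

    public static void main(String[] args) {
        // Values from the sample config: 1 GiB, 5 columns, 12 bytes per cell.
        System.out.println(rowsFor(1073741824L, 5, 12)); // 17895697
    }
}
```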
So, I created an interface:
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonSubTypes.Type;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.annotation.JsonTypeInfo.As;
import com.fasterxml.jackson.annotation.JsonTypeInfo.Id;

@JsonTypeInfo(use = Id.NAME, include = As.PROPERTY, property = IGenerateRows.PROPERTY, defaultImpl = StandardRowGenerator.class)
@JsonSubTypes(value = { @Type(StandardRowGenerator.class) })
public interface IGenerateRows {

    public static final String PROPERTY = "type";

    public String[] generateRow();
}
And I have at least one concrete implementation:
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonTypeName;

@JsonTypeName(value = StandardRowGenerator.TYPE)
public class StandardRowGenerator {

    public static final String TYPE = "standard";

    private static final String ROWS = "rows";
    private static final String COLUMNS = "columns";

    @JsonProperty(value = ROWS, required = true)
    private int rows;

    @JsonProperty(value = COLUMNS, required = true)
    private int columns;
}
What I cannot figure out, is how to handle the config node of a data generator node in my configuration file. How would I correctly set up my concrete classes to define the properties they need to generate data?
In my bootstrap code, I instantiate the entire config object as follows:
new ObjectMapper().readValue(inputStream, DataGeneratorConfig.class);
For brevity, I've omitted getters and setters, and the rest of the config file which isn't pertinent to the question at-hand. If I can provide any additional details or code, let me know.
I'm a little unsure about the underlying implementation of your classes and what data they are generating, etc.
But you are along the right lines. I've pushed what I think is a working example of what you are looking for to this repo; note that it uses https://projectlombok.org/ to generate the POJOs because I'm lazy.
https://github.com/Flaw101/jackson-type-info
It will ignore the "data" node. This is mostly because, again, I'm lazy; the entities could be wrapped in a Data class to handle it. The ObjectMapper in the test enables the features required for this.
It will read/write the data of the config classes, in line with the examples you've specified.
There are no quick wins for automagically deserializing the data. You could maybe just write it to a Map -> Object, but that's incredibly messy, and with tools like Lombok or IDE class generation, creating these entities should take seconds.
The RowGenerator interface (renamed from IGenerateRows) now looks like:
@JsonTypeInfo(use = Id.NAME, include = As.PROPERTY, property = RowGenerator.PROPERTY, defaultImpl = StandardRowGenerator.class)
@JsonSubTypes(value = { @Type(StandardRowGenerator.class), @Type(TotalSizeGeneartor.class) })
@JsonRootName(value = "data")
public interface RowGenerator {

    public static final String PROPERTY = "type";

    Config getConfig();
}
And Config is just a marker interface for the concrete impls.
public interface Config {
}
The StandardRowGenerator now becomes:
@JsonTypeName(value = StandardRowGenerator.TYPE)
@Data
public class StandardRowGenerator implements RowGenerator {

    public static final String TYPE = "standard";

    private StandardConfig config;

    @Data
    public static class StandardConfig implements Config {
        private int rows;
        private int columns;
    }
}
And similarly for the total-size generator:
@JsonTypeName(value = TotalSizeGeneartor.TYPE)
@Data
public class TotalSizeGeneartor implements RowGenerator {

    public static final String TYPE = "totalSize";

    private TotalSizeConfig config;

    @Data
    public static class TotalSizeConfig implements Config {
        private long sizeInBytes;
        private int cellDensityInBytes;
        private int columns;
    }
}
These could be improved with better generic type information, so that callers can get concrete references to the config.
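As a sketch of that improvement (my addition, not code from the linked repo), the interface could carry the concrete Config type as a type parameter; Jackson's polymorphic handling still targets the raw type, so the annotations above would stay as they are:

```java
// Marker interface, as in the answer above.
interface Config {}

// Sketch: the generic parameter gives callers typed access
// to the config without casting.
interface RowGenerator<C extends Config> {
    C getConfig();
}

class StandardConfig implements Config {
    int rows;
    int columns;
}

class StandardRowGenerator implements RowGenerator<StandardConfig> {

    private final StandardConfig config = new StandardConfig();

    @Override
    public StandardConfig getConfig() {
        return config; // e.g. getConfig().rows compiles without a cast
    }
}
```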
The test class reads your two configs from the resources folder, deserializes them to objects and back to strings, compares the before/after, and asserts that there are no null or empty properties and that the interfaces resolve to the correct implementations.
Note: this uses assertThat from AssertJ.
public class JacksonTest {

    private ObjectMapper mapper;
    private String json;

    @Before
    public void setup() throws Exception {
        mapper = new ObjectMapper();
        mapper.configure(SerializationFeature.WRAP_ROOT_VALUE, true);
        mapper.configure(DeserializationFeature.UNWRAP_ROOT_VALUE, true);
    }

    @Test
    public void testDeserStandard() throws Exception {
        json = StringUtils.deleteWhitespace(
                new String(Files.readAllBytes(Paths.get("src/main/resources/standard.json")), StandardCharsets.UTF_8));
        RowGenerator generator = mapper.readValue(json, RowGenerator.class);
        assertThat(generator).hasNoNullFieldsOrProperties().isExactlyInstanceOf(StandardRowGenerator.class);
        assertThat(generator.getConfig()).hasNoNullFieldsOrProperties().isExactlyInstanceOf(StandardConfig.class);
        assertThat(json).isEqualTo(mapper.writeValueAsString(generator));
        System.out.println(generator);
    }

    @Test
    public void testDeserTotalsize() throws Exception {
        json = StringUtils.deleteWhitespace(
                new String(Files.readAllBytes(Paths.get("src/main/resources/totalsize.json")), StandardCharsets.UTF_8));
        RowGenerator generator = mapper.readValue(json, RowGenerator.class);
        assertThat(generator).hasNoNullFieldsOrProperties().isExactlyInstanceOf(TotalSizeGeneartor.class);
        assertThat(generator.getConfig()).hasNoNullFieldsOrProperties().isExactlyInstanceOf(TotalSizeConfig.class);
        assertThat(json).isEqualTo(mapper.writeValueAsString(generator));
        System.out.println(generator);
    }
}

Spring Boot: Wrapping JSON response in dynamic parent objects

I have a REST API specification that talks with back-end microservices, which return the following values:
On "collections" responses (e.g. GET /users) :
{
users: [
{
... // single user object data
}
],
links: [
{
... // single HATEOAS link object
}
]
}
On "single object" responses (e.g. GET /users/{userUuid}) :
{
user: {
... // {userUuid} user object data
}
}
This approach was chosen so that single responses would be extensible (for example, if GET /users/{userUuid} gets an additional query parameter down the line, such as ?detailedView=true, we would have additional request information).
Fundamentally, I think it is an OK approach for minimizing breaking changes between API updates. However, translating this model to code is proving very arduous.
Let's say that for single responses, I have the following API model object for a single user:
public class SingleUserResource {

    private MicroserviceUserModel user;

    public SingleUserResource(MicroserviceUserModel user) {
        this.user = user;
    }

    public String getName() {
        return user.getName();
    }

    // other getters for fields we wish to expose
}
The advantage of this method is that we can expose only the fields from the internally used models for which we have public getters, but not others. Then, for collections responses I would have the following wrapper class:
public class UsersResource extends ResourceSupport {

    @JsonProperty("users")
    public final List<SingleUserResource> users;

    public UsersResource(List<MicroserviceUserModel> users) {
        // add each user as a SingleUserResource
    }
}
For single object responses, we would have the following:
public class UserResource {

    @JsonProperty("user")
    public final SingleUserResource user;

    public UserResource(SingleUserResource user) {
        this.user = user;
    }
}
This yields JSON responses which are formatted as per the API specification at the top of this post. The upside of this approach is that we only expose those fields that we want to expose. The heavy downside is that I have a ton of wrapper classes flying around that perform no discernible logical task aside from being read by Jackson to yield a correctly formatted response.
My questions are the following:
How can I possibly generalize this approach? Ideally, I would like to have a single BaseSingularResponse class (and maybe a BaseCollectionsResponse extends ResourceSupport class) that all my models can extend, but seeing how Jackson derives the JSON keys from the object definitions, I would have to use something like Javassist to add fields to the base response classes at runtime - a dirty hack that I would like to stay as far away from as humanly possible.
Is there an easier way to accomplish this? Unfortunately, I may have a variable number of top-level JSON objects in the response a year from now, so I cannot use something like Jackson's SerializationConfig.Feature.WRAP_ROOT_VALUE because that wraps everything into a single root-level object (as far as I am aware).
Is there perhaps something like @JsonProperty at the class level (as opposed to just the method and field level)?
There are several possibilities.
You can use a java.util.Map:
List<UserResource> userResources = new ArrayList<>();
userResources.add(new UserResource("John"));
userResources.add(new UserResource("Jane"));
userResources.add(new UserResource("Martin"));
Map<String, List<UserResource>> usersMap = new HashMap<String, List<UserResource>>();
usersMap.put("users", userResources);
ObjectMapper mapper = new ObjectMapper();
System.out.println(mapper.writeValueAsString(usersMap));
You can use an ObjectWriter to wrap the response, which you can use like below:
ObjectMapper mapper = new ObjectMapper();
ObjectWriter writer = mapper.writer().withRootName(root);
result = writer.writeValueAsString(object);
Here is a proposition for generalizing this serialization.
A class to handle simple object:
public abstract class BaseSingularResponse {

    private String root;

    protected BaseSingularResponse(String rootName) {
        this.root = rootName;
    }

    public String serialize() {
        ObjectMapper mapper = new ObjectMapper();
        ObjectWriter writer = mapper.writer().withRootName(root);
        String result = null;
        try {
            result = writer.writeValueAsString(this);
        } catch (JsonProcessingException e) {
            result = e.getMessage();
        }
        return result;
    }
}
A class to handle collection:
public abstract class BaseCollectionsResponse<T extends Collection<?>> {

    private String root;
    private T collection;

    protected BaseCollectionsResponse(String rootName, T aCollection) {
        this.root = rootName;
        this.collection = aCollection;
    }

    public T getCollection() {
        return collection;
    }

    public String serialize() {
        ObjectMapper mapper = new ObjectMapper();
        ObjectWriter writer = mapper.writer().withRootName(root);
        String result = null;
        try {
            result = writer.writeValueAsString(collection);
        } catch (JsonProcessingException e) {
            result = e.getMessage();
        }
        return result;
    }
}
And a sample application:
public class Main {

    private static class UsersResource extends BaseCollectionsResponse<ArrayList<UserResource>> {
        public UsersResource() {
            super("users", new ArrayList<UserResource>());
        }
    }

    private static class UserResource extends BaseSingularResponse {

        private String name;
        private String id = UUID.randomUUID().toString();

        public UserResource(String userName) {
            super("user");
            this.name = userName;
        }

        public String getUserName() {
            return this.name;
        }

        public String getUserId() {
            return this.id;
        }
    }

    public static void main(String[] args) throws JsonProcessingException {
        UsersResource userCollection = new UsersResource();
        UserResource user1 = new UserResource("John");
        UserResource user2 = new UserResource("Jane");
        UserResource user3 = new UserResource("Martin");
        System.out.println(user1.serialize());
        userCollection.getCollection().add(user1);
        userCollection.getCollection().add(user2);
        userCollection.getCollection().add(user3);
        System.out.println(userCollection.serialize());
    }
}
You can also use the Jackson annotation @JsonTypeInfo at the class level:
@JsonTypeInfo(include = As.WRAPPER_OBJECT, use = JsonTypeInfo.Id.NAME)
Personally I don't mind the additional DTO classes: you only need to create them once, and there is little to no maintenance cost. And if you need to do MockMVC tests, you will most likely need the classes to deserialize your JSON responses to verify the results.
As you probably know the Spring framework handles the serialization/deserialization of objects in the HttpMessageConverter Layer, so that is the correct place to change how objects are serialized.
If you don't need to deserialize the responses, it is possible to create a generic wrapper, and a custom HttpMessageConverter (and place it before MappingJackson2HttpMessageConverter in the message converter list). Like this:
public class JSONWrapper {

    public final String name;
    public final Object object;

    public JSONWrapper(String name, Object object) {
        this.name = name;
        this.object = object;
    }
}

public class JSONWrapperHttpMessageConverter extends MappingJackson2HttpMessageConverter {

    @Override
    protected void writeInternal(Object object, Type type, HttpOutputMessage outputMessage) throws IOException, HttpMessageNotWritableException {
        // cast is safe because this is only called when supports returns true
        JSONWrapper wrapper = (JSONWrapper) object;
        Map<String, Object> map = new HashMap<>();
        map.put(wrapper.name, wrapper.object);
        super.writeInternal(map, type, outputMessage);
    }

    @Override
    protected boolean supports(Class<?> clazz) {
        return clazz.equals(JSONWrapper.class);
    }
}
You then need to register the custom HttpMessageConverter in a Spring configuration class that extends WebMvcConfigurerAdapter, by overriding configureMessageConverters(). Be aware that doing this disables the default auto-detection of converters, so you will probably have to add the defaults yourself (check the Spring source for WebMvcConfigurationSupport#addDefaultHttpMessageConverters() to see them). If you extend WebMvcConfigurationSupport instead of WebMvcConfigurerAdapter, you can call addDefaultHttpMessageConverters() directly. Personally I prefer WebMvcConfigurationSupport over WebMvcConfigurerAdapter when I need to customize anything, but there are some minor implications to doing this, which you can read about in other articles.
Jackson doesn't have a lot of support for dynamic/variable JSON structures, so any solution that accomplishes something like this is going to be pretty hacky, as you mentioned. As far as I know and from what I've seen, the standard and most common method is using wrapper classes like you are currently. The wrapper classes do add up, but if you get creative with your inheritance you may be able to find some commonalities between classes and thus reduce the number of wrapper classes. Otherwise you might be looking at writing a custom framework.
I guess you are looking for a custom Jackson serializer. With a simple implementation, the same object can be serialized into different structures.
Some examples:
https://stackoverflow.com/a/10835504/814304
http://www.davismol.net/2015/05/18/jackson-create-and-register-a-custom-json-serializer-with-stdserializer-and-simplemodule-classes/

Jackson JSON, filtering properties by path

I need to filter bean properties dynamically on serialization.
@JsonView isn't an option for me.
Assume my Bean (as Json notation):
{
id: '1',
name: 'test',
children: [
{ id: '1.1', childName: 'Name 1.1' },
{ id: '1.2', childName: 'Name 1.2' }
]
}
I want to write the JSON with the following properties:
// configure the ObjectMapper to only serialize this properties:
[ "name", "children.childName" ]
The expected JSON result is:
{
name: 'test',
children: [
{ childName: 'Name 1.1' },
{ childName: 'Name 1.2' }
]
}
Finally I will create an annotation (#JsonFilterProperties) to use with Spring in my RestControllers, something like this:
#JsonFilterProperties({"name", "children.childName"}) // display only this fields
#RequestMapping("/rest/entity")
#ResponseBody
public List<Entity> findAll() {
return serviceEntity.findAll(); // this will return all fields populated!
}
Well, it's tricky but doable. You can do this using Jackson's filter feature (http://wiki.fasterxml.com/JacksonFeatureJsonFilter) with some minor alterations. To start, we are going to use the class name as the filter id; this way you won't have to add @JsonFilter to every entity you use:
public class CustomIntrospector extends JacksonAnnotationIntrospector {
    @Override
    public Object findFilterId(AnnotatedClass ac) {
        return ac.getRawType();
    }
}
Next step: make sure that a superclass's filter also applies to all of its subclasses:
public class CustomFilterProvider extends SimpleFilterProvider {
    @Override
    public BeanPropertyFilter findFilter(Object filterId) {
        Class id = (Class) filterId;
        BeanPropertyFilter f = null;
        // Walk up the class hierarchy until a registered filter is found
        while (id != Object.class && f == null) {
            f = _filtersById.get(id.getName());
            id = id.getSuperclass();
        }
        // Part from superclass
        if (f == null) {
            f = _defaultFilter;
            if (f == null && _cfgFailOnUnknownId) {
                throw new IllegalArgumentException("No filter configured with id '" + filterId + "' (type " + filterId.getClass().getName() + ")");
            }
        }
        return f;
    }
}
A custom version of ObjectMapper that utilizes our custom classes:
public class JsonObjectMapper extends ObjectMapper {
    CustomFilterProvider filters;

    public JsonObjectMapper() {
        filters = new CustomFilterProvider();
        filters.setFailOnUnknownId(false);
        this.setFilters(this.filters);
        this.setAnnotationIntrospector(new CustomIntrospector());
    }

    /* You can change the methods below as you see fit. */
    public JsonObjectMapper addFilterAllExceptFilter(Class clazz, String... property) {
        filters.addFilter(clazz.getName(), SimpleBeanPropertyFilter.filterOutAllExcept(property));
        return this;
    }

    public JsonObjectMapper addSerializeAllExceptFilter(Class clazz, String... property) {
        filters.addFilter(clazz.getName(), SimpleBeanPropertyFilter.serializeAllExcept(property));
        return this;
    }
}
Now take a look at MappingJackson2HttpMessageConverter: you will see that it uses one instance of ObjectMapper internally, ergo you cannot use it if you want different configurations simultaneously (for different requests). You need a request-scoped ObjectMapper and an appropriate message converter that uses it:
public abstract class DynamicMappingJacksonHttpMessageConverter extends MappingJackson2HttpMessageConverter {

    // Spring will override this method with one that provides a request-scoped bean
    @Override
    public abstract ObjectMapper getObjectMapper();

    @Override
    public void setObjectMapper(ObjectMapper objectMapper) {
        // We don't need this anymore
    }

    /* Additionally, you need to override all methods that use the objectMapper attribute and change them to use the getObjectMapper() method instead */
}
Add some bean definitions:
<bean id="jsonObjectMapper" class="your.package.name.JsonObjectMapper" scope="request">
    <aop:scoped-proxy/>
</bean>

<mvc:annotation-driven>
    <mvc:message-converters>
        <bean class="your.package.name.DynamicMappingJacksonHttpMessageConverter">
            <lookup-method name="getObjectMapper" bean="jsonObjectMapper"/>
        </bean>
    </mvc:message-converters>
</mvc:annotation-driven>
And the last part is to implement something that will detect your annotation and perform the actual configuration. For that you can create an @Aspect. Something like:
@Aspect
public class JsonResponseConfigurationAspect {
    @Autowired
    private JsonObjectMapper objectMapper;

    @Around("@annotation(jsonFilterProperties)")
    public Object around(ProceedingJoinPoint joinPoint) throws Throwable {
        /* Here you will have to determine the return type and annotation value from the joinPoint object. */
        /* See http://stackoverflow.com/questions/2559255/spring-aop-how-to-get-the-annotations-of-the-adviced-method for more info */
        /* If you want to use things like 'children.childName' you will have to use reflection to determine the 'children' type, and so on. */
    }
}
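Stripped of the AOP wiring (in the aspect you would obtain the Method via ((MethodSignature) joinPoint.getSignature()).getMethod()), reading the annotation value boils down to plain reflection. A self-contained sketch, where the annotation name comes from the question and the Controller class is illustrative:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class AnnotationLookupExample {

    // Annotation sketched after the question's @JsonFilterProperties
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface JsonFilterProperties {
        String[] value();
    }

    // Stand-in for the annotated controller method
    public static class Controller {
        @JsonFilterProperties({"name", "children.childName"})
        public void findAll() {}
    }

    // Returns the annotation's property list, or an empty array if absent
    public static String[] filterProperties(Class<?> type, String methodName) throws Exception {
        Method method = type.getMethod(methodName);
        JsonFilterProperties annotation = method.getAnnotation(JsonFilterProperties.class);
        return annotation == null ? new String[0] : annotation.value();
    }
}
```

The aspect would then feed these property names into the JsonObjectMapper's filter methods before proceeding with the join point.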
Personally, I use this in a different way. I don't use annotations and just do the configuration manually:
@Autowired
private JsonObjectMapper objectMapper;

@RequestMapping("/rest/entity")
@ResponseBody
public List<Entity> findAll() {
    objectMapper.addFilterAllExceptFilter(Entity.class, "name", "children");
    objectMapper.addFilterAllExceptFilter(EntityChildren.class, "childName");
    return serviceEntity.findAll();
}
P.S. This approach has one major flaw: you cannot add two different filters for one class.
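Outside the Spring wiring, the underlying filter mechanism can be tried with a stock ObjectMapper. A sketch using the question's field names; note the explicit @JsonFilter annotations, which the CustomIntrospector above would make unnecessary:

```java
import com.fasterxml.jackson.annotation.JsonFilter;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ser.impl.SimpleBeanPropertyFilter;
import com.fasterxml.jackson.databind.ser.impl.SimpleFilterProvider;

import java.util.List;

public class FilterExample {

    @JsonFilter("entityFilter")
    public static class Entity {
        public String id;
        public String name;
        public List<Child> children;
    }

    @JsonFilter("childFilter")
    public static class Child {
        public String id;
        public String childName;
    }

    public static String serialize(Entity entity) throws Exception {
        // Keep only "name" and "children" on Entity, only "childName" on Child
        SimpleFilterProvider filters = new SimpleFilterProvider()
                .addFilter("entityFilter",
                        SimpleBeanPropertyFilter.filterOutAllExcept("name", "children"))
                .addFilter("childFilter",
                        SimpleBeanPropertyFilter.filterOutAllExcept("childName"));
        return new ObjectMapper().writer(filters).writeValueAsString(entity);
    }
}
```

This produces the filtered JSON from the question, with the id fields dropped at both levels.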
There's a Jackson plugin called Squiggly that does exactly this:
String filter = "name,children[childName]";
ObjectMapper mapper = Squiggly.init(this.objectMapper, filter);
mapper.writeValue(response.getOutputStream(), myBean);
You could integrate it into a MessageConverter or similar, driven by annotations, as you see fit.
If you have a fixed number of possible options, then there is a static solution too: @JsonView
public interface NameAndChildName {}

@JsonView(NameAndChildName.class)
@ResponseBody
public List<Entity> findAll() {
    return serviceEntity.findAll();
}

public class Entity {
    public String id;

    @JsonView(NameAndChildName.class)
    public String name;

    @JsonView({NameAndChildName.class, SomeOtherView.class})
    public List<Child> children;
}

public class Child {
    @JsonView(SomeOtherView.class)
    public String id;

    @JsonView(NameAndChildName.class)
    public String childName;
}
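The same views also work outside Spring, applied directly through the ObjectMapper. A minimal sketch using the Child bean from above (with illustrative field values); disabling DEFAULT_VIEW_INCLUSION ensures that fields without any @JsonView annotation are excluded rather than included by default:

```java
import com.fasterxml.jackson.annotation.JsonView;
import com.fasterxml.jackson.databind.MapperFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonViewExample {

    public interface NameAndChildName {}

    public static class Child {
        public String id = "1.1";  // no view: excluded once default inclusion is off

        @JsonView(NameAndChildName.class)
        public String childName = "Name 1.1";
    }

    public static String serialize(Child child) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // By default, properties without @JsonView are included in every view;
        // turn that off so only view-annotated fields are written
        mapper.disable(MapperFeature.DEFAULT_VIEW_INCLUSION);
        return mapper.writerWithView(NameAndChildName.class).writeValueAsString(child);
    }
}
```

In Spring MVC the same toggle is available as spring.jackson.mapper.default-view-inclusion or via the MVC Jackson configuration.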