MongoDB complex object versioning with Spring Boot - java

I'm trying to identify the best way to do MongoDB object versioning.
Based on the MongoDB document versioning pattern, the recommendation is to store revisions in a history collection and keep the current version in the main collection. With that approach, each revision contains the complete object rather than a diff.
Then I went through Ways to implement data versioning in MongoDB, which recommends storing all the revisions inside a single document held in a separate history collection.
Therefore, I'm trying to implement my own object versioning for the following document model, due to its complexity.
Invoice.java
public class Invoice {
    private long invoiceId;
    private String userName;
    private String userId;
    private LocalDateTime createdDate;
    private LocalDateTime lastModifiedDate;
    private List<String> operationalUnits;
    private List<BodyModel> body;
    private List<ReviewModel> reviews;
    private BigDecimal changedPrice;
    private String submitterId;
    private LocalDateTime submittedTime;
    private String approverId;
    private LocalDateTime approvedTime;
}
BodyModel.java
public class BodyModel {
    private String packageId;
    private List<ItemModel> items;
    private List<String> reviews;
}
ReviewModel.java
public class ReviewModel {
    private String publishedTime;
    private String authorName;
    private String authorId;
    private String text;
}
ItemModel.java
public class ItemModel {
    private String itemNumber;
    private String description;
    private String brandId;
    private String packId;
    private List<String> reviews;
}
ER Diagram (Simplified)
At the moment, I'm using the Javers library. However, Javers treats the Invoice model as the main entity and the other models (BodyModel, ReviewModel, ItemModel) as separate value objects. As a result, instead of creating a single revision document, it creates separate documents for the value objects. Additionally, it always reconstructs the current object from the base version plus all changes, which leads to very long read times. I also ran into a value-object issue with Javers; refer to this question for more info: MongoDB document version update issue with JaVers
The following are the issues I have if I'm going to create my own implementation using Spring Boot.
If I'm going to put a revisionId in each revision (as shown in the object below), how do I find the current revisionId to include when MongoDB save() is called?
{
    _id: <ObjectId>,
    invoiceId: <invoiceId>,
    invoice: <Invoice>,
    revisionId: <revisionId>
}
For this, I can keep a revisionId field in the Invoice model which is updated when saving to the main collection, and at the same time used to save the revision into the history collection. Is this the best way to do it in this case?
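To make the idea concrete, here is a minimal, framework-free sketch of computing the next revisionId and building the history entry; all names are illustrative, and the field names mirror the revision document shown above. In MongoDB itself you would typically bump the counter atomically (e.g. with a findAndModify plus $inc) so concurrent writers cannot pick the same id.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative helper: given the revisionId currently stored on the main
// document (0 if the invoice is new), build the history entry to insert.
public class RevisionHelper {

    public static Map<String, Object> nextRevision(long invoiceId,
                                                   Object invoiceSnapshot,
                                                   long currentRevisionId) {
        long nextId = currentRevisionId + 1;
        Map<String, Object> entry = new HashMap<>();
        entry.put("invoiceId", invoiceId);
        entry.put("invoice", invoiceSnapshot);
        entry.put("revisionId", nextId);
        return entry;
    }
}
```

The same nextId would be written back to the revisionId field of the main document in the same logical operation.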
If I'm only going to store diffs, then how do I obtain the current version of the object?
For this, it seems essential to fetch the current object from the main collection and compare it with the new version (which I already have, because that's what I'm going to store in the main collection) to create the diff. The diff can then be stored in the history collection and the new version in the main collection. Is this the best way to store diffs?
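As a rough illustration of that compare step, a shallow diff over documents flattened to field/value maps might look like the following (a sketch only; a real implementation would need to recurse into the nested body and reviews lists):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// Compares the old and new versions of a document (flattened to
// field -> value maps) and keeps only the fields that changed.
// The result is what would be stored in the history collection.
public class DiffHelper {

    public static Map<String, Object> diff(Map<String, Object> oldDoc,
                                           Map<String, Object> newDoc) {
        Map<String, Object> changed = new HashMap<>();
        for (Map.Entry<String, Object> e : newDoc.entrySet()) {
            if (!Objects.equals(oldDoc.get(e.getKey()), e.getValue())) {
                changed.put(e.getKey(), e.getValue());
            }
        }
        // Record fields removed in the new version as explicit nulls.
        for (String key : oldDoc.keySet()) {
            if (!newDoc.containsKey(key)) {
                changed.put(key, null);
            }
        }
        return changed;
    }
}
```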
In both scenarios, AOP aspects can be used to intercept the save() method of the base repository. But I'm not primarily concerned with coding/implementation details; I would like to know which method or methods would be efficient for storing revisions for a data model such as the one above (I'd also like to discuss the methods I've mentioned).
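Whether done with an @Aspect or with a plain decorator around the repository, the interception flow is the same; here is a framework-free decorator sketch (all names hypothetical) that mimics what an @Around advice on save() would do:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical repository abstraction; in the real application this would be
// the Spring Data repository backed by the main collection.
interface InvoiceStore {
    void save(Object invoice);
}

// Decorator that archives the previous version before delegating the save.
class VersioningStore implements InvoiceStore {
    private final InvoiceStore delegate;
    final List<Object> history = new ArrayList<>();
    private Object current; // stands in for a lookup in the main collection

    VersioningStore(InvoiceStore delegate) {
        this.delegate = delegate;
    }

    @Override
    public void save(Object invoice) {
        if (current != null) {
            history.add(current); // archive old version to "history"
        }
        current = invoice;
        delegate.save(invoice); // write new version to "main"
    }
}
```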

Related

Change table on runtime - Spring API Rest

I have the following entity, which maps to the m1 table of my database.
@Entity(name = "m1")
@Data
public class Information {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private int id;
    private String date;
    private Double weight_1;
    private Double weight_2;
    private Double weight_3;
    private Double weight_4;
    private int working;
}
So, when I call the REST API it returns the information corresponding to the m1 table. My controller is the following (a simple controller that returns all the information):
@Controller
@RequestMapping(path = "/information")
public class InformationController {

    @Autowired
    private InformationRepository repository;

    @GetMapping(path = "/all")
    public @ResponseBody List<Information> getAllInformations() {
        // This returns a JSON or XML with the users
        return repository.findAll();
    }
}
The question is: is there any way to change the m1 table name at runtime? For example, can I put the table name in the request path and have the REST API pick it up?
Maybe this is impossible and I'm going about it the wrong way; I don't know.
EDIT: I mean, can I change the table that the REST API reads from by putting the table I want in the URL/path that I call? For example, in my case the default table/entity the API reads from is m1; could I call http://localhost:8080/information/especifictable/all/, where especifictable is the table I want to receive data from, and have the API take that URL parameter and replace the default m1 with especifictable?
I don't know if I have explained it well.
Such a design would only make sense if there were two tables in the DB that look the same, and if that is the case, there is something wrong with your DB design.
Basically, it is not possible, to the best of my knowledge.
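For completeness: JPA entities cannot switch tables at runtime, but if several tables really do share a schema, the usual workaround is to drop to a native query and validate the table name against a whitelist (table names cannot be bound as JDBC parameters, so unchecked input would be an SQL injection hole). A sketch of just the query-building part, with hypothetical table names:

```java
import java.util.Set;

// Builds a native query for one of a fixed set of identically-shaped tables.
public class TableQueryBuilder {
    private static final Set<String> ALLOWED = Set.of("m1", "m2", "m3");

    public static String selectAll(String table) {
        if (!ALLOWED.contains(table)) {
            throw new IllegalArgumentException("Unknown table: " + table);
        }
        return "SELECT * FROM " + table;
    }
}
```

The resulting string would then be executed with, for example, a JdbcTemplate and a shared RowMapper instead of the Spring Data repository.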

Adding attributes to existing object

I have an existing REST application which caches a POJO (e.g. a Trade object) in Ehcache, and many other applications use it. Some send the Trade object to a REST service so that it can be persisted to the cache and DB, and some perform get operations on this cache through a REST service.
public class Trade implements Serializable {
    private static final long serialVersionUID = -92565215465632589L;
    private String tradeNo = "";
    private String isin = "";
    private String quantity = "";
    // ... getters and setters
}
Now I want to add one more component to our application which uses many of the above Trade attributes, plus many new ones I want to add as part of the new functionality. I don't want to add the new attributes to the existing Trade POJO, as that would impact existing code. Shall I create a new POJO which extends Trade, adds the new attributes, and persist this new POJO to the cache? I would end up with two almost identical objects in the cache with that approach :-( Is any other good approach or design pattern available?
public class ExtendedTrade extends Trade {
    private String operation = "";
    private String dealType = "";
    private String identifier = "";
    // ... getters and setters
}
Above is the ExtendedTrade I was describing in my approach.
Also, please suggest a design that avoids caching these two similar types of object.
Embedding (maybe with the delegate pattern) seems more solid under the circumstances.
public class ExtendedTrade {
    private Trade trade;
    private String operation = "";
    private String dealType = "";
    private String identifier = "";
    // ... getters and setters
}
Consider:
whether the existing Trade can be extended to an (abstract) ExtendedTrade.
whether an ExtendedTrade2 might come into existence, with other attributes.
whether you need to patch existing attributes of Trade.
I certainly won't insist that this is better.
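To show how the embedding variant keeps existing consumers working, here is a minimal sketch (Trade trimmed down to two fields for brevity): ExtendedTrade forwards the Trade accessors it needs to the wrapped instance, so the cached Trade class stays untouched.

```java
// Simplified Trade from the question (only two fields, for brevity).
class Trade implements java.io.Serializable {
    private String tradeNo = "";
    private String isin = "";

    String getTradeNo() { return tradeNo; }
    void setTradeNo(String tradeNo) { this.tradeNo = tradeNo; }
    String getIsin() { return isin; }
}

// Wraps a Trade instead of extending it; new attributes live here, and the
// Trade attributes are exposed by delegation.
class ExtendedTrade {
    private final Trade trade;
    private String operation = "";

    ExtendedTrade(Trade trade) { this.trade = trade; }

    String getTradeNo() { return trade.getTradeNo(); } // delegation
    String getOperation() { return operation; }
    void setOperation(String operation) { this.operation = operation; }
}
```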

Spring data elasticsearch GeoPoint with spring mvc

I'm using Elasticsearch to store data (spring-data-elasticsearch), and I need to store a geolocation in my document. The class structure is in the following format.
@Document(indexName = "outlet")
public class OutletIndex implements IESMapper {

    @Id
    private String path;
    private String name;

    @GeoPointField
    private GeoPoint geoPoint;

    // setters and getters
}
Since the GeoPoint class has no setters, it does not work with the @ModelAttribute annotation in a Spring MVC controller. Because I need to receive the value from the front end, I updated the class to:
@Document(indexName = "outlet")
public class OutletIndex implements IESMapper {

    @Id
    private String path;
    private String name;

    @GeoPointField
    private GeoPoint geoPoint;
    private String geoLocation;

    public void setGeoLocation(String geoLocation) {
        this.geoLocation = geoLocation;
        if (geoLocation != null && !geoLocation.trim().isEmpty()) {
            String[] loc = geoLocation.split(",");
            this.geoPoint = new GeoPoint(Double.parseDouble(loc[0]), Double.parseDouble(loc[1]));
        }
    }
    // setters and getters
}
An additional field holds the string representation, and its setter also updates the GeoPoint.
Is there any better approach to do this?
EDIT: One more doubt: is there any way to use a comma-separated string as the geo_point?
It seems you are using the geo_point data type with data in Elasticsearch in the format location: "latVal,lonVal". This is one of the valid formats Elasticsearch allows for geo_point.
Elasticsearch simply returns the data in the format you stored it in. For the same geo_point type in your ES schema, you can store multiple formats across different documents, and when you retrieve them ES will return each in the format it was stored in.
This causes issues, since different formats for the same type must be handled specially in a type-safe language like Java. You can do one of two things: ensure a consistent format throughout (both while indexing and retrieving), or handle each corner case on the application side.
To avoid all this mess, a good rule of thumb I follow is to use the same format as the one expected by the Java client. In this case I would not use any custom serialization/deserialization logic; instead, it would be better to save the location in the format location: {"lat": latVal, "lon": lonVal} (the GeoPoint class expects a double lat and a double lon).
If you ensure this, you will no longer need to think about the multiple formats you might receive and their corner cases, and you will avoid a lot of confusion.
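A cleaner alternative to the extra string field on the entity is a Spring Converter<String, GeoPoint> registered with MVC, so @ModelAttribute binding works directly. The parsing core is the same either way; shown framework-free below (a sketch, where the real convert() method would end with new GeoPoint(lat, lon)):

```java
// Parses "lat,lon" into a latitude/longitude pair, the same logic a
// Converter<String, GeoPoint> would use inside its convert() method.
public class GeoPointParser {

    public static double[] parse(String value) {
        if (value == null || value.trim().isEmpty()) {
            throw new IllegalArgumentException("Empty geo point");
        }
        String[] parts = value.split(",");
        if (parts.length != 2) {
            throw new IllegalArgumentException("Expected \"lat,lon\": " + value);
        }
        return new double[] {
            Double.parseDouble(parts[0].trim()),
            Double.parseDouble(parts[1].trim())
        };
    }
}
```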

Spring Data Neo4j not mapping Class fields to node properties

I do have a Repository
@Repository
public interface PointOfInterestRepository extends GraphRepository<Poi> {
    // currently empty
}
with no custom methods defined, so I use the predefined ones such as save(T... entities).
And I have my Poi class as follows
@NodeEntity(label = "PointOfInterest")
public class Poi {

    @JsonIgnore
    @GraphId
    Long neo4jId;

    @JsonManagedReference("node-poi")
    @JsonProperty("node")
    @Relationship(type = "BELONGS_TO", direction = Relationship.UNDIRECTED)
    private Node node;

    @JsonProperty("id")
    @Property(name = "poiID")
    private final String id;

    @JsonProperty("uris")
    @Property(name = "uris")
    private final Set<URI> correspondingURIs = new HashSet<>();

    /* Some more stuff I skip here */
}
with getters for the fields.
Currently I am able to save such Pois to Neo4j and retrieve them, but when I try to work with those nodes in the database via Cypher, it appears that the fields aren't mapped to Neo4j properties.
I thought spring-data-neo4j would convert my class fields to neo4j graph properties. Am I wrong with that?
Note: the save calls seem to work well. After saving I can see the nodes in the database, and calling findAll() returns all the saved nodes (Pois) with the correct values. But within the database itself, I cannot see any properties/fields.
The problem is the final fields. SDN cannot write values back to the entity when it is loaded from the graph because these fields are final (SDN uses only the default no-args constructor), so final fields are not supported.
Removing the final modifiers should fix this.

How to create objects from different data sources/formats

I'm working on a project where I need to create objects from different data sources/formats, and I would like to know the best way to organize the source code to make this easy.
Let's say I have a class User and I want the ability to create objects from database data and from JSON. The purpose is to let users of my app browse data both online and offline. I'm using GSON and ORMLite. In addition, the fields in JSON and in the database may differ, but the "main" fields are the same. Is it a good idea to create a class which contains all the properties/fields from both JSON and the database? Something similar to the class below:
@DatabaseTable(tableName = "user", daoClass = UserDaoImpl.class)
public class User {

    public static final String ID_FIELD_NAME = "id";
    public static final String USER_LOGIN_FIELD_NAME = "login";
    public static final String USER_EMAIL_FIELD_NAME = "email";
    public static final String SERIALIZED_COUNTRY_FIELD_NAME = "user_county";

    // DB & JSON
    @DatabaseField(generatedId = true, columnName = ID_FIELD_NAME)
    int id;

    // DB & JSON
    @DatabaseField(columnName = USER_LOGIN_FIELD_NAME)
    String login;

    // DB & JSON
    @DatabaseField(columnName = USER_EMAIL_FIELD_NAME)
    String email;

    // Only JSON
    @SerializedName(SERIALIZED_COUNTRY_FIELD_NAME)
    String country;

    public User() {
    }
}
Is it a good idea to create class which contains all properties/fields from JSON and database?
I think the short answer is yes. You can certainly use the same objects to represent the data in the database and via JSON.
You will get into problems when you need to change the data representation in the database but don't want to change your JSON API, or vice versa. Then you will need two separate classes and a mapping function between them.
But if you can get away with one class, then that's the best way.
Consider the factory pattern: you can use it to abstract the creation of concrete User objects as a function of the data source.
Make User an interface, and have a data-source-specific implementation of User for each type of data source.
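A sketch of that factory idea, with the JSON input assumed to be pre-parsed into a map (e.g. by GSON); all names are illustrative, and only the shared "main" fields are shown:

```java
import java.util.Map;

// Minimal User with only the shared "main" fields.
class User {
    final int id;
    final String login;
    final String email;

    User(int id, String login, String email) {
        this.id = id;
        this.login = login;
        this.email = email;
    }
}

// Factory that hides which data source a User came from. Each source maps
// its own field names onto the shared model.
class UserFactory {

    // Database row: column names from the question's ORMLite mapping.
    static User fromDbRow(Map<String, Object> row) {
        return new User((Integer) row.get("id"),
                        (String) row.get("login"),
                        (String) row.get("email"));
    }

    // JSON object already parsed into a map; numbers may arrive as doubles.
    static User fromJson(Map<String, Object> json) {
        return new User(((Number) json.get("id")).intValue(),
                        (String) json.get("login"),
                        (String) json.get("email"));
    }
}
```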
