Spring Boot - partial update best practice? - java

I am using Spring Boot v2 with a Mongo database. I was wondering what the best way is to do partial updates on the data model. Say I have a model with x attributes; depending on the request I may only want to update 1, 2, or all x of those attributes. Should I expose an endpoint for each type of update operation, or is it possible to expose one endpoint and do it in a generic way? Note that I will need to be able to validate the contents of the request attributes (e.g. tel no must be numbers only).
Thanks,

HTTP PATCH is a nice way to update a resource by specifying only the properties that have changed.
The following blog explains it very well:
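As a rough illustration of the idea (a minimal sketch, not taken from the blog; the Project and ProjectRepository names are borrowed from the answer below), a @PatchMapping handler can merge only the JSON fields that were actually sent onto the stored document using Jackson's readerForUpdating:
// imports: com.fasterxml.jackson.databind.JsonNode, com.fasterxml.jackson.databind.ObjectMapper, java.io.IOException
@RestController
@RequestMapping("/projects")
public class ProjectPatchController {

    @Autowired
    private ProjectRepository projectRepository;

    @Autowired
    private ObjectMapper objectMapper;

    @PatchMapping("/{id}")
    public ResponseEntity<Project> patch(@PathVariable String id, @RequestBody JsonNode patch) {
        return projectRepository.findById(id)
                .map(existing -> {
                    try {
                        // Fields absent from the request body are left untouched on the stored document.
                        Project merged = objectMapper.readerForUpdating(existing).readValue(patch);
                        return ResponseEntity.ok(projectRepository.save(merged));
                    } catch (IOException e) {
                        return ResponseEntity.badRequest().<Project>build();
                    }
                })
                .orElse(ResponseEntity.notFound().build());
    }
}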

You can actually expose just one endpoint. This is the situation I had a few months ago:
I wanted people to be able to modify any (or even all) fields of a Projects document (who am I to force the users to manually supply all fields, lol). So I have my model,
Project.java:
package com.foxxmg.jarvisbackend.models;
//imports
@Document(collection = "Projects")
public class Project {

    @Id
    public String id;
    public String projectTitle;
    public String projectOverview;
    public Date startDate;
    public Date endDate;
    public List<String> assignedTo;
    public String progress;

    //constructors
    //getters & setters
}
I have my repository:
ProjectRepository.java
package com.foxxmg.jarvisbackend.repositories;
//imports
@Repository
public interface ProjectRepository extends MongoRepository<Project, String>, QuerydslPredicateExecutor<Project> {
    // please note: we are going to use the findById(String) method inherited from MongoRepository for updating
    //other abstract methods
}
Now to my Controller, ProjectController.java:
package com.foxxmg.jarvisbackend.controllers;
//imports
@RestController
@RequestMapping("/projects")
@CrossOrigin("*")
public class ProjectController {

    @Autowired
    private ProjectRepository projectRepository;

    @PutMapping("update/{id}")
    public ResponseEntity<Project> update(@PathVariable("id") String id, @RequestBody Project project) {
        Optional<Project> optionalProject = projectRepository.findById(id);
        if (optionalProject.isPresent()) {
            Project p = optionalProject.get();
            if (project.getProjectTitle() != null)
                p.setProjectTitle(project.getProjectTitle());
            if (project.getProjectOverview() != null)
                p.setProjectOverview(project.getProjectOverview());
            if (project.getStartDate() != null)
                p.setStartDate(project.getStartDate());
            if (project.getEndDate() != null)
                p.setEndDate(project.getEndDate());
            if (project.getAssignedTo() != null)
                p.setAssignedTo(project.getAssignedTo());
            return new ResponseEntity<>(projectRepository.save(p), HttpStatus.OK);
        } else {
            return new ResponseEntity<>(HttpStatus.NOT_FOUND);
        }
    }
}
That will allow partial update in MongoDB with Spring Boot.
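The question also asks about validating the incoming attributes (e.g. a tel no that must be digits only). One hedged option is bean validation on a dedicated update-request object; the telNo field and the pattern below are purely illustrative and not part of the Project model above:
// import javax.validation.constraints.Pattern; (Spring Boot 2 / javax namespace)
public class ProjectUpdateRequest {

    private String projectTitle;

    // @Pattern treats null as valid, which suits partial updates: an absent field is simply skipped.
    @Pattern(regexp = "\\d{7,15}", message = "tel no must contain digits only")
    private String telNo;

    // getters & setters
}

// In the controller, @Valid on the body triggers the constraints:
// public ResponseEntity<Project> update(@PathVariable("id") String id,
//                                       @Valid @RequestBody ProjectUpdateRequest request) { ... }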

If you are using Spring Data MongoDB, you have two options: use a MongoDB repository or use the MongoTemplate.
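For the MongoTemplate option, a partial update can be written as a targeted $set so only the supplied fields are touched. A minimal sketch, reusing the Project model from the answer above (the service wrapper is made up):
// imports: org.springframework.data.mongodb.core.MongoTemplate,
//          org.springframework.data.mongodb.core.query.Query, Criteria, Update
@Service
public class ProjectUpdateService {

    @Autowired
    private MongoTemplate mongoTemplate;

    public void updateTitle(String id, String newTitle) {
        Query query = new Query(Criteria.where("_id").is(id));
        // Only projectTitle is written; every other field stays as stored.
        Update update = new Update().set("projectTitle", newTitle);
        mongoTemplate.updateFirst(query, update, Project.class);
    }
}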

Related

How to use Mongo Auditing and a UUID as id with Spring Boot 2.2.x?

I would like to have documents stored with a UUID id and createdAt / updatedAt fields. My solution was working with Spring Boot 2.1.x. After I upgraded from Spring Boot 2.1.11.RELEASE to 2.2.0.RELEASE, my test for MongoAuditing failed with createdAt = null. What do I need to do to get the createdAt field filled again?
This is not just a test problem. I ran the application and it shows the same behaviour as my test: all auditing fields stay null.
I have a Configuration to enable MongoAuditing and UUID generation:
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {

    @Bean
    public GenerateUUIDListener generateUUIDListener() {
        return new GenerateUUIDListener();
    }
}
The listener hooks into onBeforeConvert - I guess that's where the trouble starts.
public class GenerateUUIDListener extends AbstractMongoEventListener<IdentifiableEntity> {

    @Override
    public void onBeforeConvert(BeforeConvertEvent<IdentifiableEntity> event) {
        IdentifiableEntity entity = event.getSource();
        if (entity.isNew()) {
            entity.setId(UUID.randomUUID());
        }
    }
}
The document itself (I dropped the getters and setters):
@Document
public class MyDocument extends InsertableEntity {
    private String name;
}

public abstract class InsertableEntity extends IdentifiableEntity {

    @CreatedDate
    @JsonIgnore
    private Instant createdAt;
}

public abstract class IdentifiableEntity implements Persistable<UUID> {

    @Id
    private UUID id;

    @JsonIgnore
    public boolean isNew() {
        return getId() == null;
    }
}
A complete minimal example (including a test) can be found here: https://github.com/mab/auditable
With 2.1.11.RELEASE the test succeeds; with 2.2.0.RELEASE it fails.
For me the best solution was to switch from event-based UUID generation to a callback-based one. By implementing Ordered we can make the new callback run after the AuditingEntityCallback.
public class IdEntityCallback implements BeforeConvertCallback<IdentifiableEntity>, Ordered {

    @Override
    public IdentifiableEntity onBeforeConvert(IdentifiableEntity entity, String collection) {
        if (entity.isNew()) {
            entity.setId(UUID.randomUUID());
        }
        return entity;
    }

    @Override
    public int getOrder() {
        return 101;
    }
}
I registered the callback with the MongoConfiguration. For a more general solution you might want to take a look at how the AuditingEntityCallback is registered by the MongoAuditingBeanDefinitionParser.
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {

    @Bean
    public IdEntityCallback registerCallback() {
        return new IdEntityCallback();
    }
}
MongoTemplate works in the following way in doInsert():
1. this.maybeEmitEvent - emits an event (onBeforeConvert, onBeforeSave and such) that any AbstractMappingEventListener can catch and act upon, like you did with GenerateUUIDListener
2. this.maybeCallBeforeConvert - calls the before-convert callbacks, such as the Mongo auditing one
as you can see in the source code of MongoTemplate.class (lines 831-832):
protected <T> T doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
    BeforeConvertEvent<T> event = new BeforeConvertEvent(objectToSave, collectionName);
    T toConvert = ((BeforeConvertEvent) this.maybeEmitEvent(event)).getSource(); // emit event
    toConvert = this.maybeCallBeforeConvert(toConvert, collectionName); // call some before-convert handlers
    ...
}
MongoAuditing marks createdAt only on new entities, by checking entity.isNew() == true.
Because your code already sets the id (the UUID), the entity is not considered new and createdAt is not populated.
You can do one of the following (ordered from best to worst):
1. Forget about the UUID and use String for your id; let Mongo itself create and manage its entities' ids (this is how MongoTemplate actually works, lines 811-812).
2. Keep the UUID at the code level and convert from/to String when inserting into and retrieving from the db.
3. Create a custom repository like in this post.
4. Stay on 2.1.11.RELEASE.
5. Set createdAt/updatedAt in GenerateUUIDListener as well as the id (rename it NewEntityListener or something), i.e. implement the auditing yourself.
6. Implement new isNew() logic that doesn't depend only on the entity id (see the sketch below).
In version 2.1.11.RELEASE the order of those two methods was flipped (MongoTemplate.class lines 804-805), which is why your code worked fine.
As a more abstract point, events are by nature send-and-forget (async compatible), so it is very bad practice to mutate the object itself inside a listener - there is NO guarantee about the order of computation, if any.
This is why the auditing is built on callbacks and not events, and that's why Pivotal doesn't (need to) keep the order stable between versions.
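For the last option, a hedged sketch of an isNew() that does not depend on the id alone: keep a transient flag that starts as true and is cleared once the entity has been read from or written to MongoDB (the flag and the listener class name are made up for the example; the listener has to be registered as a bean, like GenerateUUIDListener):
public abstract class IdentifiableEntity implements Persistable<UUID> {

    @Id
    private UUID id;

    @Transient // org.springframework.data.annotation.Transient - never persisted
    private boolean isNew = true;

    public UUID getId() {
        return id;
    }

    public void setId(UUID id) {
        this.id = id;
    }

    @Override
    @JsonIgnore
    public boolean isNew() {
        return isNew;
    }

    public void markNotNew() {
        this.isNew = false;
    }
}

// Clear the flag after load and after save, so a pre-set UUID no longer makes the entity look "not new".
public class NewStateListener extends AbstractMongoEventListener<IdentifiableEntity> {

    @Override
    public void onAfterConvert(AfterConvertEvent<IdentifiableEntity> event) {
        event.getSource().markNotNew();
    }

    @Override
    public void onAfterSave(AfterSaveEvent<IdentifiableEntity> event) {
        event.getSource().markNotNew();
    }
}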

Why is Spring Boot Starter Mongodb Reactive not saving the List fields of my entities?

I can't figure out why the List fields of my entities are not being persisted when saved using Spring Boot MongoDB Reactive.
This is what my Contact entity looks like:
@Data
@Document
public class Contact {

    private String id;

    @DBRef
    private User owner;

    private List<String> messageIds;
    private Message lastMessage;
    private LocalDateTime lastMessageAt;

    public boolean addMessageId(String message) {
        if (messageIds == null) {
            messageIds = new ArrayList<>();
        }
        return messageIds.add(message);
    }
}
I save it using this repository:
public interface ContactRepository extends ReactiveMongoRepository<Contact, String> {
    Mono<Contact> findByOwnerId(String ownerId);
}
Everything but the messageIds list is getting persisted just fine. I already switched from directly holding a list of Messages to holding their ids, but that didn't help either.
The debugger also shows the messageIds list populated right before the repo save call.
I've already searched for a solution without any luck. Am I missing the obvious here?
It really was that simple:
I forgot to subscribe to the inner observables that saved my entities in my reactive chain.
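In other words: a Mono or Flux returned by the reactive repository does nothing until something subscribes to it, so a save that is created inside an operator but never wired into the chain is silently dropped. A hedged sketch of the difference (the surrounding service methods are made up):
// Broken: the inner save Mono is created but never subscribed, so nothing is persisted.
public Mono<Contact> addMessageBroken(String contactId, String messageId) {
    return contactRepository.findById(contactId)
            .map(contact -> {
                contact.addMessageId(messageId);
                contactRepository.save(contact); // returned Mono is dropped here
                return contact;
            });
}

// Fixed: flatMap keeps the save inside the chain, so subscribing to the outer Mono also runs the save.
public Mono<Contact> addMessage(String contactId, String messageId) {
    return contactRepository.findById(contactId)
            .flatMap(contact -> {
                contact.addMessageId(messageId);
                return contactRepository.save(contact);
            });
}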

What is the best way to get response without HATEOAS?

I'm trying to fetch entities via Spring Data JPA & Data REST without HATEOAS.
The situation is that I use the HATEOAS form, but sometimes I need a pure JSON response.
So I create the JSON by defining a controller path separate from the repository's endpoint and by creating a separate DTO class.
This is my code:
@RepositoryRestController
public class MetricController {

    @Autowired
    private MetricRepository metricRepository;

    @RequestMapping(method = RequestMethod.GET, value = "/metrics/in/{id}")
    @ResponseBody
    public MetricDTO getMetric(@PathVariable Long id) {
        return MetricDTO.fromEntity(metricRepository.getOne(id));
    }
}
@RepositoryRestResource
public interface MetricRepository extends JpaRepository<Metric, Long> { }

@Setter
@Getter
@NoArgsConstructor
@AllArgsConstructor
public class MetricDTO {

    private SourceType sourceType;
    private String metricTypeField;
    private String metricType;
    private String instanceType;
    private String instanceTypeField;
    private List<String> metricIdFields;
    private List<String> valueFields;
    private Map<String, String> virtualFieldValueEx;

    public static MetricDTO fromEntity(Metric metric) {
        return new MetricDTO(
                metric.getSourceType(),
                metric.getMetricTypeField(),
                metric.getMetricType(),
                metric.getInstanceType(),
                metric.getInstanceTypeField(),
                metric.getMetricIdFields(),
                metric.getValueFields(),
                metric.getVirtualFieldValueEx()
        );
    }
}
That's the way I do it, but I expect there are better options and patterns.
The question is: is this the best way?
HATEOAS (Hypermedia as the Engine of Application State) is a constraint of the REST application architecture.
It basically says that anyone who consumes your REST endpoints can navigate between them with the help of links.
Let's take your example:
| HTTP Method | Relation (rel) | Link |
| --- | --- | --- |
| GET | Up | /metrics/in |
| GET | Self | /metrics/in/{id} |
| GET | SourceType | /sourceType/{id} |
| GET | metricIdFields | a URL for each entry in the JSON array |
| DELETE | Delete | /employee/{employeeId} |
Use the org.springframework.hateoas.Links class to create such links in your DTOs.
In your DTO, add:
public class MetricDTO {

    private Links links;

    // getters and setters
    // inside your setters add SELF, GET and DELETE links for the current resource
}
https://www.baeldung.com/spring-hateoas-tutorial
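A hedged alternative to a hand-rolled Links field is to let the response type extend RepresentationModel and build the links with WebMvcLinkBuilder (assuming Spring HATEOAS 1.x; the MetricResource name and the extra mapping are made up for the example):
// imports: org.springframework.hateoas.RepresentationModel,
//          static org.springframework.hateoas.server.mvc.WebMvcLinkBuilder.linkTo / methodOn
public class MetricResource extends RepresentationModel<MetricResource> {
    private String metricType;
    // other fields, getters & setters
}

// In the controller, attach the links while building the response:
@RequestMapping(method = RequestMethod.GET, value = "/metrics/in/{id}/resource")
@ResponseBody
public MetricResource getMetricResource(@PathVariable Long id) {
    MetricResource resource = new MetricResource();
    // copy the fields from MetricDTO.fromEntity(metricRepository.getOne(id)) here
    resource.add(linkTo(methodOn(MetricController.class).getMetric(id)).withSelfRel());
    return resource;
}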

Update entity in redis with spring-data-redis

I'm currently using Redis (3.2.100) with Spring Data Redis (1.8.9) and the Jedis connector.
When I use the save() function on an existing entity, Redis deletes my entity and re-creates it.
In my case I need to keep the existing entity and only update some of its attributes (I have another thread which reads the same entity at the same time).
In the Spring documentation (https://docs.spring.io/spring-data/data-redis/docs/current/reference/html/#redis.repositories.partial-updates) I found the partial update feature. Unfortunately, the example in the documentation uses the update() method of RedisTemplate, but that method does not exist.
So, have you ever used Spring Data Redis partial updates?
Is there another way to update a Redis entity without deleting it first?
Thanks
To get the RedisKeyValueTemplate, you can do:
@Autowired
private RedisKeyValueTemplate redisKVTemplate;

redisKVTemplate.update(entity)
You should use RedisKeyValueTemplate to make partial updates.
Well, for context: the docs link above and the Spring Data tests (link) actually made zero contribution to the resulting solution.
Consider the following entity:
@RedisHash(value = "myservice/lastactivity")
@Data
@AllArgsConstructor
@NoArgsConstructor
@Builder
public class LastActivityCacheEntity implements Serializable {

    @Id
    @Indexed
    @Size(max = 50)
    private String user;

    private long lastLogin;
    private long lastProfileChange;
    private long lastOperation;
}
Let's assume that:
We don't want to do a complex read-write exercise on every update, i.e. something like:
entity = lastActivityCacheRepository.findByUser(userId);
lastActivityCacheRepository.save(LastActivityCacheEntity.builder()
        .user(entity.getUser())
        .lastLogin(entity.getLastLogin())
        .lastProfileChange(entity.getLastProfileChange())
        .lastOperation(entity.getLastOperation()).build());
What if some 100 rows popped up? Then on every update the entity has to be fetched and saved - quite inefficient, although it would still work out.
We also don't want complex exercises with the opsForHash + ObjectMapper + configuring-beans approach - it's quite hard to implement and maintain (for example, link).
So we're about to use something like:
@Autowired
private RedisKeyValueTemplate redisTemplate;

void partialUpdate(LastActivityCacheEntity update) {
    var partialUpdate = PartialUpdate
            .newPartialUpdate(update.getUser(), LastActivityCacheEntity.class);
    if (update.getLastLogin() > 0)
        partialUpdate.set("lastLogin", update.getLastLogin());
    if (update.getLastProfileChange() > 0)
        partialUpdate.set("lastProfileChange", update.getLastProfileChange());
    if (update.getLastOperation() > 0)
        partialUpdate.set("lastOperation", update.getLastOperation());
    redisTemplate.update(partialUpdate);
}
and the thing is - it doesn't really work for this case.
That is, the values do get updated, but you cannot see the new property values later on via a repository entity lookup: a subsequent lastActivityCacheRepository.findAll() will return the unchanged properties.
Here's the solution:
LastActivityCacheRepository.java:
@Repository
public interface LastActivityCacheRepository extends CrudRepository<LastActivityCacheEntity, String>, LastActivityCacheRepositoryCustom {
    Optional<LastActivityCacheEntity> findByUser(String user);
}
LastActivityCacheRepositoryCustom.java:
public interface LastActivityCacheRepositoryCustom {
    void updateEntry(String userId, String key, long date);
}
LastActivityCacheRepositoryCustomImpl.java
@Repository
public class LastActivityCacheRepositoryCustomImpl implements LastActivityCacheRepositoryCustom {

    @Autowired
    private RedisKeyValueTemplate redisKeyValueTemplate;

    @Override
    public void updateEntry(String userId, String key, long date) {
        redisKeyValueTemplate.update(new PartialUpdate<>(userId, LastActivityCacheEntity.class)
                .set(key, date));
    }
}
And finally a working sample:
void partialUpdate(LastActivityCacheEntity update) {
    if (lastActivityCacheRepository.findByUser(update.getUser()).isEmpty()) {
        lastActivityCacheRepository.save(LastActivityCacheEntity.builder().user(update.getUser()).build());
    }
    if (update.getLastLogin() > 0) {
        lastActivityCacheRepository.updateEntry(update.getUser(),
                "lastLogin",
                update.getLastLogin());
    }
    if (update.getLastProfileChange() > 0) {
        lastActivityCacheRepository.updateEntry(update.getUser(),
                "lastProfileChange",
                update.getLastProfileChange());
    }
    if (update.getLastOperation() > 0) {
        lastActivityCacheRepository.updateEntry(update.getUser(),
                "lastOperation",
                update.getLastOperation());
    }
}
all credits to Chris Richardson and his src
If you don't want to type your field names as strings in the updateEntry method, you can use the Lombok annotation @FieldNameConstants on your entity class. This creates field name constants for you, and you can then access your field names like this:
...
if (update.getLastOperation() > 0) {
    lastActivityCacheRepository.updateEntry(update.getUser(),
            LastActivityCacheEntity.Fields.lastOperation, // <- instead of "lastOperation"
            update.getLastOperation());
}
...
This makes refactoring the field names more bug-proof.

Java Spring REST API Handling Many Optional Parameters

I'm currently messing around with a Spring Boot REST API project for instructional purposes. I have a rather large table with 22 columns loaded into a MySQL database and am trying to give the user the ability to filter the results by multiple columns (let's say 6 for the purposes of this example).
I am currently extending a Repository and have initialized methods such as findByParam1, findByParam2, findByParam1OrderByParam2Desc, etc., and have verified that they work as intended. My question is: what is the best way to let the user leverage all 6 optional RequestParams without writing a ridiculous number of conditionals/repository method variants? For example, I want the user to be able to hit home/get-data/ to get all results, home/get-data?param1=xx to filter on param1, and potentially home/get-data?param1=xx&param2=yy...&param6=zz to filter on all the optional parameters.
For reference, here is what the relevant chunk of my controller looks like (roughly).
#RequestMapping(value = "/get-data", method = RequestMethod.GET)
public List<SomeEntity> getData(#RequestParam Map<String, String> params) {
String p1 = params.get("param1");
if(p1 != null) {
return this.someRepository.findByParam1(p1);
}
return this.someRepository.findAll();
}
My issue so far is that, proceeding this way, I will basically need n! methods in my repository to support this functionality, with n being the number of fields/columns I want to filter on. Is there a better way to handle this, perhaps one where I filter 'in place' as I check the Map to see which filters the user actually populated?
EDIT: So I'm currently implementing a 'hacky' solution that might be related to J. West's comment below. I assume that the user specifies all n parameters in the request URL, and if they do not (for example, they specify p1-p4 but not p5 and p6) I generate SQL that just matches the missing params against LIKE '%'. It would look something like...
@Query("select u from User u where u.p1 = :p1 and u.p2 = :p2 ... and u.p6 = :p6")
List<User> findWithComplicatedQueryAndSuch(...);
and in the controller I would detect whether p5 and p6 are null in the Map and, if so, simply change them to the String '%'. I'm sure there is a more precise and intuitive way to do this, although I haven't been able to find anything of the sort yet.
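A hedged sketch of that wildcard workaround (the names are invented for the example; note the comparison has to be LIKE rather than = for '%' to act as a catch-all, and LIKE '%' will still not match rows where the column is NULL):
// Repository: missing params arrive as "%" so their LIKE clause matches every row.
@Query("select u from User u where u.p1 like :p1 and u.p2 like :p2")
List<User> findWithOptionalParams(@Param("p1") String p1, @Param("p2") String p2);

// Controller: default any absent filter to the catch-all wildcard.
String p1 = params.getOrDefault("param1", "%");
String p2 = params.getOrDefault("param2", "%");
return this.someRepository.findWithOptionalParams(p1, p2);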
You can do this easily with a JpaSpecificationExecutor and a custom Specification: https://spring.io/blog/2011/04/26/advanced-spring-data-jpa-specifications-and-querydsl/
I would replace the HashMap with a DTO containing all the optional GET params and then build the Specification based on that DTO; obviously you can also keep the HashMap and build the Specification based on it.
Basically:
public class VehicleFilter implements Specification<Vehicle> {

    private String art;
    private String userId;
    private String vehicle;
    private String identifier;

    @Override
    public Predicate toPredicate(Root<Vehicle> root, CriteriaQuery<?> query, CriteriaBuilder cb) {
        ArrayList<Predicate> predicates = new ArrayList<>();
        if (StringUtils.isNotBlank(art)) {
            predicates.add(cb.equal(root.get("art"), art));
        }
        if (StringUtils.isNotBlank(userId)) {
            predicates.add(cb.equal(root.get("userId"), userId));
        }
        if (StringUtils.isNotBlank(vehicle)) {
            predicates.add(cb.equal(root.get("vehicle"), vehicle));
        }
        if (StringUtils.isNotBlank(identifier)) {
            predicates.add(cb.equal(root.get("identifier"), identifier));
        }
        return predicates.size() <= 0 ? null : cb.and(predicates.toArray(new Predicate[predicates.size()]));
    }

    // getter & setter
}
And the controller:
#RequestMapping(value = "/{ticket}/count", method = RequestMethod.GET)
public long getItemsCount(
#PathVariable String ticket,
VehicleFilter filter,
HttpServletRequest request
) throws Exception
{
return vehicleService.getCount(filter);
}
Service:
@Override
public long getCount(VehicleFilter filter) {
    return vehicleRepository.count(filter);
}
Repository:
@Repository
public interface VehicleRepository extends JpaRepository<Vehicle, Integer>, JpaSpecificationExecutor<Vehicle> {
}
Just a quick example adapted from company code, you get the idea!
Another solution with less coding would be to use the QueryDsl integration with Spring MVC.
With this approach, all your request parameters are automatically resolved to domain properties and appended to your query.
For reference, check the documentation https://spring.io/blog/2015/09/04/what-s-new-in-spring-data-release-gosling#querydsl-web-support and the example project https://github.com/spring-projects/spring-data-examples/tree/master/web/querydsl
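As a rough illustration (a minimal sketch, assuming the repository also extends QuerydslPredicateExecutor<User> and QueryDSL code generation is configured), the controller can bind request parameters straight to a Predicate:
// imports: com.querydsl.core.types.Predicate,
//          org.springframework.data.querydsl.binding.QuerydslPredicate
@RestController
public class UserQueryController {

    @Autowired
    private UserRepository userRepository; // extends JpaRepository<User, Long>, QuerydslPredicateExecutor<User>

    // e.g. /get-data?firstName=John&city=Boston is turned into a Predicate automatically
    @GetMapping("/get-data")
    public Iterable<User> getData(@QuerydslPredicate(root = User.class) Predicate predicate) {
        return userRepository.findAll(predicate);
    }
}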
You can do it even more easily using the Query by Example (QBE) technique, as long as your repository extends JpaRepository: that interface extends QueryByExampleExecutor, which provides a findAll method taking an Example<T> as an argument.
This approach fits your scenario well: the entity has a lot of fields and you want the user to fetch the rows matching a filter expressed as a subset of the entity's fields and their required values.
Let's say the entity is User (like in your example) and you want an endpoint for fetching users whose attribute values equal the ones specified. That can be accomplished with the following code:
Entity class:
@Entity
public class User implements Serializable {

    @Id
    private Long id;

    private String firstName;
    private String lastName;
    private Integer age;
    private String city;
    private String state;
    private String zipCode;

    // getters & setters
}
Controller class:
@RestController
public class UserController {

    private final UserRepository repository;

    public UserController(UserRepository repository) {
        this.repository = repository;
    }

    @GetMapping
    public List<User> getMatchingUsers(@RequestBody User userFilter) {
        return repository.findAll(Example.of(userFilter));
    }
}
Repository class:
@Repository
public interface UserRepository extends JpaRepository<User, Long> {
}
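If exact matching is too strict, Example.of can also take an ExampleMatcher to relax it (ignore case, match substrings, skip null probe fields); a small hedged usage sketch:
// imports: org.springframework.data.domain.Example, org.springframework.data.domain.ExampleMatcher
ExampleMatcher matcher = ExampleMatcher.matching()
        .withIgnoreCase()
        .withStringMatcher(ExampleMatcher.StringMatcher.CONTAINING)
        .withIgnoreNullValues();

List<User> users = repository.findAll(Example.of(userFilter, matcher));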
