Update Specific Fields with Spring Data Rest and MongoDB - java

I'm using Spring Data MongoDB and Spring Data Rest to create a REST API which allows GET, POST, PUT and DELETE operations on my MongoDB database and it's all working fine except for the update operations (PUT). It only works if I send the full object in the request body.
For example I have the following entity:
@Document
public class User {

    @Id
    private String id;

    private String email;
    private String lastName;
    private String firstName;
    private String password;
    ...
}
To update the lastName field, I have to send the whole user object, including the password, which is obviously very wrong.
If I only send the field to update, all the others are set to null in my database. I even tried adding a @NotNull constraint on those fields, and now the update won't even happen unless I send all of the user object's fields.
I searched for a solution here but only found the following post, which has no solution: How to update particular field in mongo db by using MongoRepository Interface?
Is there a way to implement this?

Spring Data Rest uses Spring Data repositories to automatically retrieve and manipulate persistent data using Rest calls (check out https://docs.spring.io/spring-data/rest/docs/current/reference/html/#reference).
When using Spring Data MongoDB, you have the MongoOperations interface which is used as a repository for your Rest endpoints.
However, MongoOperations currently does not support updates of specific fields.
PS: It would be awesome if they added this feature, like @DynamicUpdate in Spring Data JPA.
But this doesn't mean it can't be done; here's the workaround I used when I had this issue.
Firstly let me explain what we're going to do:
We will create a controller which will override all the PUT operations so that we can implement our own update method.
Inside that update method, we will use MongoTemplate, which does have the ability to update specific fields.
N.B. We don't want to redo these steps for each model in our application, so we will determine which model to update dynamically. To do that we will create a utility class. [This is optional]
Let's start by adding the org.reflections API to our project dependencies; it allows us to get all the classes that carry a specific annotation (@Document in our case):
<dependency>
    <groupId>org.reflections</groupId>
    <artifactId>reflections</artifactId>
    <version>0.9.12</version>
</dependency>
Then create a new class called UpdateUtility, add the following methods, and replace the MODEL_PACKAGE attribute with your own package containing your entities:
import java.util.HashMap;
import java.util.Set;

import org.reflections.Reflections;
import org.springframework.data.mongodb.core.mapping.Document;

public class UpdateUtility {

    private static final String MODEL_PACKAGE = "com.mycompany.myproject.models";
    private static boolean initialized = false;
    private static final HashMap<String, Class<?>> classContext = new HashMap<>();

    private static void init() {
        if (!initialized) {
            Reflections reflections = new Reflections(MODEL_PACKAGE);
            // Get all the classes annotated with @Document in the specified package
            Set<Class<?>> classes = reflections.getTypesAnnotatedWith(Document.class);
            for (Class<?> model : classes) {
                classContext.put(model.getSimpleName().toLowerCase(), model);
            }
            initialized = true;
        }
    }

    public static Class<?> getClassFromType(String type) throws Exception {
        init();
        if (classContext.containsKey(type)) {
            return classContext.get(type);
        }
        throw new Exception("Type " + type + " does not exist!");
    }
}
Using this utility class we can retrieve the model class to update from its type.
E.g. UpdateUtility.getClassFromType("user") will return User.class.
Now let's create our controller:
import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.web.bind.annotation.*;

@RestController // needed so Spring registers the mapping
public class UpdateController {

    @Autowired
    private MongoTemplate mongoTemplate;

    @PutMapping("/{type}/{id}")
    public Object update(@RequestBody HashMap<String, Object> fields,
                         @PathVariable(name = "type") String type,
                         @PathVariable(name = "id") String id) {
        try {
            Class<?> classType = UpdateUtility.getClassFromType(type); // Get the domain class from the type in the request
            Query query = new Query(Criteria.where("id").is(id));      // Update the document with the given ID
            Update update = new Update();
            // Iterate over the sent fields and add them to the update object
            for (Map.Entry<String, Object> entry : fields.entrySet()) {
                update.set(entry.getKey(), entry.getValue());
            }
            mongoTemplate.updateFirst(query, update, classType); // Do the update
            return mongoTemplate.findById(id, classType);        // Return the updated document
        } catch (Exception e) {
            // Handle your exception
            return null;
        }
    }
}
Now we're able to update only the fields we send, while the rest of the document and the other REST calls stay unchanged.
So in your case, the call would be:
PUT http://MY-DOMAIN/user/MY-USER-ID
{ "lastName": "My new last name" }
PS: You can improve it by adding the possibility to update specific fields in nested objects...
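As a hedged aside beyond the original answer: because Update.set passes keys straight through, MongoDB's dot notation should already cover nested objects; the address.city field below is purely hypothetical.
// Minimal sketch (assumption: the User document embeds an Address with a city field).
// A request body key of "address.city" reaches Update.set unchanged and updates only that nested value.
Update update = new Update();
update.set("address.city", "Paris");
mongoTemplate.updateFirst(new Query(Criteria.where("id").is(id)), update, User.class);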

Related

Javers not recognizing insert as an initial change

Working on a SpringBoot application using MongoDB as a persistent store.
Using spring data and MongoRepository to access MongoDB.
Using Javers to provide auditing.
If I use mongoRepository.insert(document) followed later by a mongoRepository.save(document) and then use javers to query the changes to that document, javers does not detect the differences between the object inserted and the object saved. It reports only a single change as if the save call was the original object persisted.
If I replace the insert call with a save and let spring data handle whether or not to insert or update, javers reports the expected change.
Example:
Consider the following:
@JaversSpringDataAuditable
public interface SomeDocumentRepository extends MongoRepository<SomeDocument, String> {
}

@Builder
@Data
@Document(collection = "someDocuments")
public class SomeDocument {

    @Id
    private String id;

    private String status;
}
@Service
public class SomeDocumentService {

    @Autowired
    private SomeDocumentRepository someDocumentRepository;

    public SomeDocument insert(SomeDocument doc) {
        return someDocumentRepository.insert(doc);
    }

    public SomeDocument save(SomeDocument doc) {
        return someDocumentRepository.save(doc);
    }
}

@Service
public class AuditService {

    @Autowired
    private Javers javers;

    public List<Change> getStatusChangesById(String documentId) {
        JqlQuery query = QueryBuilder
                .byInstanceId(documentId, SomeDocument.class)
                .withChangedProperty("status")
                .build();
        return javers.findChanges(query);
    }
}
If I call my service as follows:
var doc = SomeDocument.builder().status("new").build();
doc = someDocumentService.insert(doc);
doc.setStatus("change1");
doc = someDocumentService.save(doc);
and then call the audit service to get the changes:
auditService.getStatusChangesById(doc.getId());
I get a single change with "left" set to a blank and "right" set to "change1".
If I call "save" instead of "insert" like:
var doc = SomeDocument.builder().status("new").build();
doc = someDocumentService.save(doc);
doc.setStatus("change1");
doc = someDocumentService.save(doc);
and then call the audit service to get the changes, I get two changes: the most recent with "left" set to "new" and "right" set to "change1", and a second change with "left" set to "" and "right" set to "new".
Is this a bug?
That's a good point. In case of Mongo, Javers covers only the methods from the CrudRepository interface. See https://github.com/javers/javers/blob/master/javers-spring/src/main/java/org/javers/spring/auditable/aspect/springdata/JaversSpringDataAuditableRepositoryAspect.java
Looks like MongoRepository#insert() should also be covered by the aspect.
Feel free to contribute a PR to javers, I will merge it. If you want to discuss the design first - please create a discussion here https://github.com/javers/javers/discussions
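Until insert() is covered, a hedged workaround (my suggestion, not part of the answer above) is to either persist the initial version with save(), or commit the freshly inserted document to JaVers manually; the author string "system" below is a placeholder.
// Sketch: record the initial state explicitly after an insert so the later save() diffs against it.
SomeDocument doc = someDocumentRepository.insert(newDoc);
javers.commit("system", doc); // javers is the autowired org.javers.core.Javers bean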

How to update only particular fields in an entity using a Spring Boot JPA update API

Here is my code.
Entity class:
@Entity
public class Book
{
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private long bookId;

    private double bookPrice;
    private String bookTitle;
    private String bookAuthor;
    private String bookLanguage;
    private LocalDate publicationDate;
    private String publisher;
    private byte[] bookImage;
    private long isbn;
    private int bookQuantity;
}
Controller class:
@PutMapping("/updatebooks")
public ResponseEntity<ApiResponse> updateBook(@RequestBody BookDto bookDto)
        throws DataNotFoundException
{
    return ResponseEntity.ok(bookService.updateBook(bookDto));
}
Service class:
@Override
public ApiResponse updateBook(BookDto bookDto) throws DataNotFoundException
{
    Book book = bookRepository.findById(bookDto.getBookId())
            .orElseThrow(() -> new DataNotFoundException("Book not found"));
    book.setBookAuthor(bookDto.getBookAuthor());
    book.setBookLanguage(bookDto.getBookLanguage());
    book.setBookPrice(bookDto.getBookPrice());
    book.setBookTitle(bookDto.getBookTitle());
    book.setIsbn(bookDto.getIsbn());
    book.setPublicationDate(bookDto.getPublicationDate());
    book.setPublisher(bookDto.getPublisher());
    bookRepository.save(book);
    return new ApiResponse(HttpStatus.OK.value(), "Updation successful");
}
So through Postman I want to update the bookAuthor field alone, and the other fields have to stay the same as they are in the database. But when I update just one field, the others are automatically set to null, even though I only want to change that one field.
Here you can see that I'm only updating the bookAuthor field, but the others simply change to null. So how can I update only particular fields and keep the others as they are in the database?
(Pre-update and post-update database screenshots omitted.)
This is not really a Spring Boot JPA problem. The issue is in the way the updateBook method in the service has been implemented: it sets all the fields from the DTO on the persisted book entity without checking whether dto.get* returns null.
A very simple but crude solution to this problem would be an if (dto.get* != null) check for each field before setting the value on the book entity, i.e.:
if (bookDto.getBookAuthor() != null)
    book.setBookAuthor(bookDto.getBookAuthor());
if (bookDto.getBookLanguage() != null)
    book.setBookLanguage(bookDto.getBookLanguage());
if (bookDto.getBookPrice() != 0)
    book.setBookPrice(bookDto.getBookPrice());
...
This leaves the updateBook service method generic, allowing one or more fields to be updated without worrying about the others being overwritten. If it makes your service method too noisy, the DTO-to-entity conversion can be extracted into its own method for clarity.
For more advanced use cases, or if your entity/DTO has more than a handful of fields or nested objects as fields, this becomes cumbersome. In such scenarios you may want to handcraft a separate implementation, perhaps using reflection to map fields from the DTO to the entity when the field value is not null. Reflection, however, is likely to be slow and/or error prone if not done properly.
There are, however, libraries and frameworks that make such conversion easier for the more common use cases. At the simplest, Spring Framework's BeanUtils.copyProperties method provides a way to map values from one object to another, and its last argument (ignored properties) can be used to list field names to skip. This Stack Overflow answer shows how to write a generic method that collects the property names that are null in the source object and then passes that list to BeanUtils.copyProperties:
public static String[] getNullPropertyNames(Object source) {
    final BeanWrapper src = new BeanWrapperImpl(source);
    PropertyDescriptor[] pds = src.getPropertyDescriptors();
    Set<String> emptyNames = new HashSet<>();
    for (PropertyDescriptor pd : pds) {
        Object srcValue = src.getPropertyValue(pd.getName());
        if (srcValue == null) emptyNames.add(pd.getName());
    }
    return emptyNames.toArray(new String[0]);
}
Then, in your service method:
Book book = bookRepository.findById(bookDto.getBookId())
        .orElseThrow(() -> new DataNotFoundException("Book not found"));
BeanUtils.copyProperties(bookDto, book, getNullPropertyNames(bookDto));
bookRepository.save(book);
For further advanced use cases you can use frameworks such as MapStruct or ModelMapper, which can be configured to skip null properties, as in the sketch below.
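For example, a hedged MapStruct sketch (the BookMapper interface and method name are my own; BookDto and Book come from the question, and the null-skipping strategy is standard MapStruct configuration):
import org.mapstruct.BeanMapping;
import org.mapstruct.Mapper;
import org.mapstruct.MappingTarget;
import org.mapstruct.NullValuePropertyMappingStrategy;

@Mapper(componentModel = "spring")
public interface BookMapper {

    // Copies only the non-null DTO properties onto the existing entity; everything else is left untouched
    @BeanMapping(nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.IGNORE)
    void updateBookFromDto(BookDto dto, @MappingTarget Book entity);
}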
Don't set all the fields from the DTO. Just do findById, set the new bookAuthor on the entity, and save.
@Modifying
@Query(nativeQuery = true, value = "update book set book_author = :bookAuthor where id = :bookId")
int updateBookAuthor(@Param("bookAuthor") String bookAuthor, @Param("bookId") Long bookId);
You can do something like this in your BookRepository, then invoke it from the service class like:
bookRepository.updateBookAuthor(bookAuthor, bookId)
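One hedged note from me: Spring Data JPA @Modifying queries need to run inside a transaction, so the calling service method is typically annotated with @Transactional; the method below (name and wrapping are my own) is just a sketch of that.
@Transactional
public ApiResponse updateBookAuthor(BookDto bookDto) {
    // Issues a single-column UPDATE directly; no entity is loaded or dirty-checked
    bookRepository.updateBookAuthor(bookDto.getBookAuthor(), bookDto.getBookId());
    return new ApiResponse(HttpStatus.OK.value(), "Updation successful");
}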
Or modify your service class method like the following...
@Override
public ApiResponse updateBook(BookDto bookDto) throws DataNotFoundException
{
    Book book = bookRepository.findById(bookDto.getBookId())
            .orElseThrow(() -> new DataNotFoundException("Book not found"));
    book.setBookAuthor(bookDto.getBookAuthor());
    bookRepository.save(book);
    return new ApiResponse(HttpStatus.OK.value(), "Updation successful");
}

Dynamic generate GraphQL schema supports

Is it possible to dynamically create a GraphQL schema?
We store the data in MongoDB and there is a possibility of new fields getting added. We do not want any code change for a newly added field in the MongoDB document.
Is there any way we can generate the schema dynamically?
The schema is defined in code, but for Java (schema as POJO), when a new attribute is added you have to update and recompile the code, then archive and deploy the jar again. Is there any way to generate the schema from the data instead of pre-defining it?
Currently we are using java related projects (graphql-java, graphql-java-annotations) for GraphQL development.
You could use graphql-spqr; it allows you to auto-generate a schema based on your service classes. In your case, it would look like this:
public class Pojo {

    private Long id;
    private String name;

    // whatever Ext is, any (complex) object would work fine
    private List<Ext> exts;
}

public class Ext {
    public String something;
    public String somethingElse;
}
Presumably, you have a service class containing your business logic:
public class PojoService {

    // this could also return List<Pojo> or whatever is applicable
    @GraphQLQuery(name = "pojo")
    public Pojo getPojo() {...}
}
To expose this service, you'd just do the following:
GraphQLSchema schema = new GraphQLSchemaGenerator()
                .withOperationsFromSingleton(new PojoService())
                .generate();
You could then fire a query such as:
query test {
    pojo {
        id
        name
        exts {
            something
            somethingElse
        }
    }
}
No need for strange wrappers or custom code of any kind, nor sacrificing type safety. Works with generics, dependency injection, or any other jazz you may have in your project.
Full disclosure: I'm the author of graphql-spqr.
After some days of investigation, I found it is hard (or very costly) to generate the schema dynamically in Java.
So, from another angle, I think we can use a Map as a compromise to accomplish that.
POJO/Entity
public class POJO {

    @GraphQLField
    private Long id;

    @GraphQLField
    private String name;

    // ...

    @GraphQLField
    private GMap exts;
}
GMap is a customized Map (because Map/HashMap is a JDK class that cannot be exposed as a GraphQL schema directly, only extended).
GMap
public class GMap extends HashMap<String, String> {

    @GraphQLField
    public String get(@GraphQLName("key") String key) {
        return super.get(key);
    }
}
Retrieve data from the client:
// query script
query test
{
    your_method
    {
        id
        name
        exts {
            get(key: "ext") // an extended attribute added someday
        }
    }
}

// result
{
    "errors": [],
    "data":
    {
        "list":
        [
            {"id": 1, "name": "name1", "exts": {"get": "ext1"}},
            {"id": 2, "name": "name2", "exts": {"get": "ext2"}}
        ]
    }
}

Updating with Morphia Optimistic locking

Hi, consider the following example:
Resource:
@PUT
@Path("{id}")
public Response update(@PathParam(value = "id") final String id, final Person person) {
    final Person existing = service.getPerson(id); // renamed to avoid clashing with the "person" parameter
    final EntityTag etag = new EntityTag(Integer.toString(existing.hashCode()));
    // If-Match is required
    ResponseBuilder builder = request.evaluatePreconditions(etag);
    if (builder != null) {
        throw new DataHasChangedException("Person data has changed: " + id);
    }
    service.updatePerson(id, person.getName());
    // ...
}
Service:
public void updatePerson(final String id, final String name) {
    final Query<Person> findQuery = morphiaDataStore.createQuery(Person.class).filter("id ==", id);
    UpdateOperations<Person> operation = morphiaDataStore.createUpdateOperations(Person.class).set("name", name);
    morphiaDataStore.findAndModify(findQuery, operation);
}
Person:
@Entity("person")
public class Person {

    @Id
    private ObjectId id;

    @Version
    private Long version;

    private String name;
    ...
}
I do check whether the provided etag matches the person in the database. However, this check is done in the resource itself. I don't think this is safe, since the update happens after the check and another thread could have gone through the check in the meantime. How can this be solved correctly? Any example or advice is appreciated.
Morphia already implements optimistic-locking via #Version annotation.
http://mongodb.github.io/morphia/1.3/guides/annotations/#version
#Version marks a field in an entity to control optimistic locking. If the versions change in the database while modifying an entity (including deletes) a ConcurrentModificationException will be thrown. This field will be automatically managed for you – there is no need to set a value and you should not do so. If another name beside the Java field name is desired, a name can be passed to this annotation to change the document’s field name.
I see you have already use the annotation in your example. Make sure the clients include the version of the document as part of the request so you can also pass it to morphia.
I'm not sure whether findAndModify handles it (I would think it does), but I'm sure save does.
Assuming the object person contains the new name and version that the client was looking at, you can do directly something like this to update the record:
morphiaDataStore.save(person);
If there was another save before this client's update, the versions will no longer match and a ConcurrentModificationException will be thrown with this message:
Entity of class %s (id='%s',version='%d') was concurrently updated
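As a hedged sketch of my own (not from the answer above), the resource can translate that exception into an HTTP 412 so the client knows its version is stale; whether 412 or 409 fits better depends on your API conventions.
try {
    // person carries the version the client originally fetched
    morphiaDataStore.save(person);
} catch (java.util.ConcurrentModificationException e) {
    // Another writer bumped @Version in the meantime
    return Response.status(Response.Status.PRECONDITION_FAILED).build();
}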

Spring JPA - Best way to update multiple fields

I'm new to using JPA and trying to transition my code from JdbcTemplate to JPA. Originally I updated a subset of my columns by taking in a map of the columns with their values and created the SQL Update string myself and executed it using a DAO. I was wondering what would be the best way to do something similar using JPA?
EDIT:
How would I transform this code from my DAO to something equivalent in JPA?
public void updateFields(String userId, Map<String, String> fields) {
    StringBuilder sb = new StringBuilder();
    for (Entry<String, String> entry : fields.entrySet()) {
        sb.append(entry.getKey());
        sb.append("='");
        sb.append(StringEscapeUtils.escapeEcmaScript(entry.getValue()));
        sb.append("', ");
    }
    String str = sb.toString();
    if (str.length() > 2) {
        str = str.substring(0, str.length() - 2); // remove ", "
        String sql = "UPDATE users_table SET " + str + " WHERE user_id=?";
        jdbcTemplate.update(sql, new Object[] { userId },
                new int[] { Types.VARCHAR });
    }
}
You have to read more about JPA, for sure :)
Once an entity is in the persistence context, it is tracked by the JPA provider until the end of the persistence context's life or until EntityManager#detach() is called. When the transaction finishes (commits), the state of the managed entities in the persistence context is synchronized with the database and all changes are written.
If your entity is new, you can simply put it in the persistence context by invoking EntityManager#persist().
In your case (update of an existing entity), you have to get a row from the database and turn it into a managed entity. This can be done in many ways, but the simplest is to call EntityManager#find(), which returns a managed entity. The returned object is placed in the current persistence context, so if there is an active transaction you can change whatever property you like (except the primary key) and just finish the transaction by committing (or, if this is a container-managed transaction, by simply completing the method).
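A minimal sketch of that flow, assuming a resource-local EntityManager and a User entity with an email field (both assumptions of mine; with container-managed transactions you would rely on @Transactional instead of getTransaction()):
entityManager.getTransaction().begin();
User user = entityManager.find(User.class, userId);  // returns a managed entity
user.setEmail("new@example.com");                    // hypothetical field; change any non-id property
entityManager.getTransaction().commit();             // dirty checking issues the UPDATE on commit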
update
After your comment I can see your point. I think you should redesign your app to fit JPA standards and capabilities. Anyway, if you already have a map of <attribute_name, attribute_value> pairs, you can make use of something called the Metamodel. Simple usage is shown below. This is a naive implementation and works well only with basic attributes; you should take care of relationships etc. yourself (more information about an attribute is available via attr.getJavaType() or attr.getPersistentAttributeType()).
Metamodel meta = entityManager.getMetamodel();
EntityType<User> user_ = meta.entity(User.class);
CriteriaBuilder cb = entityManager.getCriteriaBuilder();
CriteriaUpdate<User> update = cb.createCriteriaUpdate(User.class);
Root<User> e = update.from(User.class);
for (Attribute<? super User, ?> attr : user_.getAttributes()) {
    if (map.containsKey(attr.getName())) {
        // use the String-based set(...) and look the value up by attribute name
        update.set(attr.getName(), map.get(attr.getName()));
    }
}
update.where(cb.equal(e.get("id"), idOfUser));
entityManager.createQuery(update).executeUpdate();
Please note that criteria update queries are available in JPA only since version 2.1.
Here you can find more information about metamodel generation.
Alternatively to metamodel you can just use java reflection mechanisms.
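A minimal reflection-based sketch of that alternative (my own illustration; it assumes the map keys match the entity's field names and that the entity is already managed, so dirty checking persists the change on commit):
import java.lang.reflect.Field;
import java.util.Map;

public static void applyFields(Object entity, Map<String, String> fields) throws ReflectiveOperationException {
    for (Map.Entry<String, String> entry : fields.entrySet()) {
        // Look up the field by name and overwrite its value on the managed entity
        Field field = entity.getClass().getDeclaredField(entry.getKey());
        field.setAccessible(true);
        field.set(entity, entry.getValue());
    }
}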
JPA handles the update. Retrieve the dataset as an entity using the EntityManager, change the value, and call persist. This will store the changed data in your DB.
In case you are using Hibernate (as the JPA provider), here's an example.
Entity
@Entity
@Table(name = "PERSON")
public class Person {

    @Id @GeneratedValue(strategy = GenerationType.IDENTITY)
    private int id;

    @Column(name = "NAME", nullable = false)
    private String name;

    // other fields...
}
DAO
public interface PersonDao {
    Person findById(int id);
    void persist(Person person);
    ...
}
DaoImpl
@Repository("personDao")
public class PersonDaoImpl extends AnAbstractClassWithSessionFactory implements PersonDao {

    public Person findById(int id) {
        return (Person) getSession().get(Person.class, id);
    }

    public void persist(Person person) {
        getSession().persist(person);
    }
}
Service
@Service("personService")
@Transactional
public class PersonServiceImpl implements PersonService {

    @Autowired
    PersonDao personDao;

    @Override
    public void createAndPersist(SomeSourceObject object) {
        // create a Person object and populate it from the source object
        Person person = new Person();
        person.setName(object.name);
        ...
        personDao.persist(person);
    }

    @Override
    public Person findById(int id) {
        return personDao.findById(id);
    }

    public void doSomethingWithPerson(Person person) {
        person.setName(person.getName() + " HELLO ");
        // here, since we are in a transaction, there is no need to explicitly call update/merge;
        // it will be updated in the db as soon as the method completes successfully,
        // OR the changes will be undone if the transaction fails/is rolled back
    }
}
The JPA documentation is indeed a good resource for the details.
From a design point of view, if you have a web interface, I tend to suggest adding one more service delegate layer (e.g. PersonDelegateService) which maps the data received from the UI to the person entity (and vice versa for display, populating the view object from the person entity) and delegates to the service for the actual person entity processing.
