Hibernate check deletion constraints - java

I am using Spring with Hibernate to manage the models in my application. The models are linked to each other (one-to-many, many-to-many, almost all kinds of relationships), and now I have a problem when deleting one entity that is being used by other entities. I want to show a detailed message stating exactly which other objects (type, name) are using the entity I am about to delete, not the generic constraint-violation message that Hibernate throws at me.
For example: Car --> Person, House --> Person; then when I delete one Person who has a car and house, the message will show "There are Car (named Ford Mustang) and House (named MyHouse) linked to this Person".
1. Is there any method in Hibernate that supports this requirement? I guess there is no implementation for such a specific requirement.
2. If no utility is available for this problem, I am thinking about the solution below:
- in each entity class (i.e. Person), I will define checking methods that detect links from this entity to other entities, for example:
class Person {
    // Properties

    // Checking methods: return the type and name of linked objects
    public Map<String, String> getLinkedCars() {
        // Query the DB to get linked cars;
        // return a Map of class name to object name, e.g. <Car, Ford Mustang>
    }

    public Map<String, String> getLinkedHouses() {
        // Query the DB to get linked houses;
        // return a Map of class name to object name, e.g. <House, MyHouse>
    }
}
- and then, in the service layer, before deleting a Person entity, I will use reflection to collect the results of the checking methods (whose names start with "getLinked") and build the detailed error message.
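A rough sketch of that reflection step in plain Java; the maps here are hard-coded stand-ins for the real DB queries, purely for illustration:

```java
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

// Hypothetical entity with two checking methods, as described above.
class Person {
    public Map<String, String> getLinkedCars() {
        Map<String, String> m = new HashMap<>();
        m.put("Car", "Ford Mustang"); // stands in for a real DB query
        return m;
    }

    public Map<String, String> getLinkedHouses() {
        Map<String, String> m = new HashMap<>();
        m.put("House", "MyHouse"); // stands in for a real DB query
        return m;
    }
}

public class DeletionChecker {
    // Collect the results of every no-arg method whose name starts with "getLinked".
    @SuppressWarnings("unchecked")
    public static Map<String, String> collectLinks(Object entity) {
        Map<String, String> links = new HashMap<>();
        for (Method m : entity.getClass().getMethods()) {
            if (m.getName().startsWith("getLinked") && m.getParameterCount() == 0) {
                try {
                    links.putAll((Map<String, String>) m.invoke(entity));
                } catch (ReflectiveOperationException e) {
                    // skip methods we cannot call; real code should log this
                }
            }
        }
        return links;
    }

    public static void main(String[] args) {
        System.out.println("Cannot delete Person, linked objects: "
                + collectLinks(new Person()));
    }
}
```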
Is this solution good, in terms of performance and of MVC conventions (given that I would have to query data inside a model class)?
Thank you for your help.

One (not so simple) approach is to scan your entity class for @OneToMany or @ManyToMany annotated fields and perform the checks yourself, so that a neat error message can be shown to the user. The following sample code assumes you only annotate the fields, not the getter methods, e.g.:
public class Person {
    @OneToMany(...)
    private List<House> houses;
    //...
}
First get the list of all fields using reflection:
Field[] fields = Person.class.getDeclaredFields();
Then iterate and check for @OneToMany or @ManyToMany annotations:
for (Field f : fields) {
    if (f.getAnnotation(OneToMany.class) != null ||
        f.getAnnotation(ManyToMany.class) != null) {
        // Here you know f has to be checked before the person is deleted ...
    }
}
The value of a field of a particular person object can be obtained using:
Person p = // fetch a person ..
Field f = // assume f is the "List<House> houses" field
f.setAccessible(true); // required, since the field is private
List<House> houses = (List<House>) f.get(p);
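Putting the pieces together, here is a self-contained sketch of the scan. Since javax.persistence may not be on the classpath, a minimal stand-in @OneToMany annotation is defined locally; it exists only so the example compiles and is not the real JPA annotation:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Stand-in for javax.persistence.OneToMany, so the sketch is self-contained.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface OneToMany {}

class House {
    String name;
    House(String name) { this.name = name; }
}

class Person {
    @OneToMany
    private List<House> houses = new ArrayList<>(Arrays.asList(new House("MyHouse")));
    private String name = "Joe"; // not a relationship, ignored by the scan
}

public class RelationScanner {
    // Return the names of all fields annotated as to-many relationships.
    public static List<String> relationFields(Class<?> entityClass) {
        List<String> result = new ArrayList<>();
        for (Field f : entityClass.getDeclaredFields()) {
            if (f.getAnnotation(OneToMany.class) != null) {
                result.add(f.getName());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(RelationScanner.relationFields(Person.class)); // [houses]
    }
}
```

In real code you would check both OneToMany.class and ManyToMany.class from javax.persistence, then read each matched field's value as shown above to build the message.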

I had a similar problem: I had to check whether an entity could be safely deleted, to avoid foreign-key constraint violations. This is how I solved it:
First, I created an annotation to mark the entity that needs to be checked before deletion:
@Target({ElementType.TYPE})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Constraint(validatedBy = SafeDeleteValidator.class)
public @interface SafeDelete {
    String message() default "{lima.jefferson.SafeDelete.message}";
    Class<?>[] groups() default { };
    Class<? extends Payload>[] payload() default { };
}
Then I created another annotation to be applied to any method that will be used to check if the entity can be deleted:
@Target({ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface CheckForDelete {
}
In the entity class I used these annotations like this:
@SafeDelete(message = "Couldn't delete this entity due to...")
public class MyEntity {
    @CheckForDelete
    public Boolean checkForDelete() {
        // Insert your business logic here
        return true;
    }
}
And finally, the validator for the SafeDelete annotation:
public class SafeDeleteValidator implements ConstraintValidator<SafeDelete, Object> {

    @Override
    public void initialize(SafeDelete constraintAnnotation) {
    }

    @Override
    public boolean isValid(Object object, ConstraintValidatorContext context) {
        Method[] methods = object.getClass().getMethods();
        return Arrays.stream(methods)
                .filter(m -> m.getAnnotation(CheckForDelete.class) != null)
                .map(m -> {
                    try {
                        return (Boolean) m.invoke(object);
                    } catch (ReflectiveOperationException e) {
                        return false; // a check we cannot run counts as a failure
                    }
                })
                .reduce(true, (a, b) -> a && b);
    }
}
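The stream pipeline in isValid can be exercised without the Bean Validation machinery. A self-contained sketch with a stand-in @CheckForDelete annotation and two hypothetical business rules:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Arrays;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface CheckForDelete {}

class MyEntity {
    @CheckForDelete
    public Boolean noOpenOrders() { return true; }    // stand-in business rule

    @CheckForDelete
    public Boolean noLinkedHouses() { return false; } // this one vetoes deletion
}

public class SafeDeleteDemo {
    // Same pipeline as the validator: every annotated check must return true.
    public static boolean canDelete(Object object) {
        return Arrays.stream(object.getClass().getMethods())
                .filter(m -> m.getAnnotation(CheckForDelete.class) != null)
                .map(m -> {
                    try {
                        return (Boolean) m.invoke(object);
                    } catch (ReflectiveOperationException e) {
                        return false; // treat an uncallable check as "not safe"
                    }
                })
                .reduce(true, (a, b) -> a && b);
    }

    public static void main(String[] args) {
        System.out.println(SafeDeleteDemo.canDelete(new MyEntity())); // false
    }
}
```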
Then you can follow the answer of this question to apply the validation to deletion only.


How to pass different annotations with the same property as a method parameter and have the ability to use that property?

CASE
I have a class whose responsibility is to compare field by field two instances of a DTO class for a certain subset of all DTO instance fields and collect "pretty" names of the fields whose values are different. Other fields may be added in the future and they may or may not be included in this comparison.
To allow for this expansion, this is currently implemented as follows: a field that needs to be included in the comparison logic is annotated with a custom annotation and its pretty name is passed as the annotation parameter.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface ComparableField {
    String prettyName();
}

public class Dto {
    private String field1; // not included in the comparison logic

    @ComparableField(prettyName = "Pretty field 2")
    private String field2;

    @ComparableField(prettyName = "Pretty field 3")
    private String field3;
}
The aforementioned class uses reflection to iterate through all the fields of the DTO class and check if that annotation is present, compares the values, and, if different, adds the pretty name to a set.
public Set<String> getAnnotatedDifferentFields(Dto dto1, Dto dto2) {
    Set<String> result = new HashSet<>();
    Field[] declaredFields = dto1.getClass().getDeclaredFields();
    Arrays.stream(declaredFields)
            .filter(field -> field.isAnnotationPresent(ComparableField.class))
            .forEach(field -> {
                field.setAccessible(true);
                if (!field.get(dto1).equals(field.get(dto2))) {
                    result.add(field.getAnnotation(ComparableField.class).prettyName());
                }
            });
    return result;
}
I omitted exception handling and other complications on purpose.
PROBLEM
There is a new requirement now: the "comparable" fields will belong to logical groups, and each group should produce its own result set.
QUESTION
Is there an elegant way to implement this requirement along the same lines? I was thinking of having different annotations for different logical groups and calling the method with the required annotation as a parameter:
public Set<String> getAnnotatedDifferentFields(Dto dto1, Dto dto2, Class<? extends Annotation> annotationClass) {
    Set<String> result = new HashSet<>();
    Field[] declaredFields = dto1.getClass().getDeclaredFields();
    Arrays.stream(declaredFields)
            .filter(field -> field.isAnnotationPresent(annotationClass))
            .forEach(field -> {
                field.setAccessible(true);
                if (!field.get(dto1).equals(field.get(dto2))) {
                    result.add(field.getAnnotation(annotationClass).prettyName()); // compile error here
                }
            });
    return result;
}
But this of course gives me a compile error, because annotationClass is now generic and might not have the prettyName property. There is also no such thing as annotation inheritance, whereby I could create a parent annotation with the property and several child annotations. I also thought about annotating my custom annotations with another annotation, but that would not work either.
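For what it's worth, one workaround that does compile is to read the prettyName attribute reflectively instead of through the annotation type. This sketch assumes every group annotation declares a prettyName() attribute; GroupA and GroupB are hypothetical group annotations standing in for the real ones:

```java
import java.lang.annotation.Annotation;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.HashSet;
import java.util.Set;

// Two hypothetical group annotations, each declaring the same attribute.
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD)
@interface GroupA { String prettyName(); }

@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD)
@interface GroupB { String prettyName(); }

class Dto {
    @GroupA(prettyName = "Pretty field 2") String field2 = "x";
    @GroupB(prettyName = "Pretty field 3") String field3 = "y";
}

public class GroupedComparer {
    // Read prettyName() reflectively, so any annotation class declaring that
    // attribute works without needing a common supertype.
    public static Set<String> prettyNames(Class<?> dtoClass,
                                          Class<? extends Annotation> annotationClass) {
        Set<String> result = new HashSet<>();
        for (Field field : dtoClass.getDeclaredFields()) {
            if (field.isAnnotationPresent(annotationClass)) {
                try {
                    Annotation a = field.getAnnotation(annotationClass);
                    result.add((String) annotationClass.getMethod("prettyName").invoke(a));
                } catch (ReflectiveOperationException e) {
                    // annotation without prettyName(): skip it
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(GroupedComparer.prettyNames(Dto.class, GroupA.class));
    }
}
```

The cost is that the "prettyName attribute must exist" contract is enforced only at runtime, not by the compiler.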

Creating multiple aliases for the same QueryDSL path in Spring Data

I have a generic Spring Data repository interface that extends QuerydslBinderCustomizer, allowing me to customize the query execution. I am trying to extend the basic equality testing built into the default repository implementation, so that I can perform other query operations using Spring Data REST. For example:
GET /api/persons?name=Joe%20Smith // This works by default
GET /api/persons?nameEndsWith=Smith // This requires custom parameter binding.
The problem I am having is that every alias of an entity path I create seems to override the preceding alias bindings.
@NoRepositoryBean
public interface BaseRepository<T, ID extends Serializable>
        extends PagingAndSortingRepository<T, ID>, QueryDslPredicateExecutor<T>, QuerydslBinderCustomizer {

    @Override
    @SuppressWarnings("unchecked")
    default void customize(QuerydslBindings bindings, EntityPath entityPath) {
        Class<T> model = entityPath.getType();
        Path<T> root = entityPath.getRoot();
        for (Field field : model.getDeclaredFields()) {
            if (field.isSynthetic()) continue;
            Class<?> fieldType = field.getType();
            if (fieldType.isAssignableFrom(String.class)) {
                // This binding works by itself, but not after the next one is added
                bindings.bind(Expressions.stringPath(root, field.getName()))
                        .as(field.getName() + "EndsWith")
                        .first((path, value) -> path.endsWith(value));
                // This binding overrides the previous one
                bindings.bind(Expressions.stringPath(root, field.getName()))
                        .as(field.getName() + "StartsWith")
                        .first((path, value) -> path.startsWith(value));
            }
        }
    }
}
Is it possible to create more than one alias for the same field? Can this be accomplished in a generic way?
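The symptom is consistent with the bindings being stored per entity path, like a map keyed by the path: registering a second binding for the same key replaces the first, regardless of the alias. A plain-Java illustration of that mental model (not QueryDSL code; the registry here is hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public class BindingOverrideDemo {
    // Hypothetical registry keyed by the entity path, as the symptom suggests.
    public static Map<String, String> register() {
        Map<String, String> bindings = new HashMap<>();
        bindings.put("person.name", "EndsWith binding");
        bindings.put("person.name", "StartsWith binding"); // same key: replaces the first
        return bindings;
    }

    public static void main(String[] args) {
        System.out.println(BindingOverrideDemo.register().get("person.name")); // StartsWith binding
    }
}
```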
You can create a transient property bound to QueryDSL this way:
@Transient
@QueryType(PropertyType.SIMPLE)
public String getNameEndsWith() {
    return null; // whatever code; the returned value is never used
}
If you are using the QueryDSL annotation processor, you will see the "nameEndsWith" in the metadata Qxxx class, so you can bind it like any persisted property, but without persisting it.

How to get MongoRepository to return correct type when there are multiple types in the collection [duplicate]

Having two types of entities, that are mapped to two Java classes in the single MongoDB collection:
@Document
public class Superclass { ... }

@Document(collection = "superclass")
public class Subclass extends Superclass { ... }
and two repositories for those entities:
public interface SuperclassRepository extends MongoRepository<Superclass, String> {}
public interface SubclassRepository extends MongoRepository<Subclass, String> {}
MongoRepositories don't handle the inheritance of the entities correctly. While querying for all Subclass objects (e.g. SubclassRepository.findAll()) the result set contains Superclass objects, that are instantiated (or at least had been tried to be instantiated) with null values for fields that are part of the Subclass, but are not part of the Superclass.
The expected result would be that SubclassRepository should return only Subclass objects, while SuperclassRepository should return Superclass and Subclass objects. It works this way in Spring Data JPA.
Has anyone encountered this bug and has any solution on how to fix it?
I encountered the same issue.
I took a look at the source code and, to my surprise, this is simply not implemented: the collection name and the entity class are added, but the _class property is never inserted into the final query.
Thinking about it, I realized: how would MongoDB know that SubClass1 or SubClass2 derives from SuperClass?
So I overrode the SimpleMongoRepository class and created my own factory that uses that class instead of the default SimpleMongoRepository.
Here is what I added:
public MySimpleMongoRepository(MongoEntityInformation<T, ID> metadata, MongoOperations mongoOperations) {
    Assert.notNull(mongoOperations);
    Assert.notNull(metadata);
    this.entityInformation = metadata;
    this.mongoOperations = mongoOperations;

    // Collect the entity class and all of its known subtypes, then restrict
    // every query to documents whose _class is one of them.
    Reflections reflections = new Reflections("com.cre8techlabs.entity");
    Set<String> subTypes = reflections.getSubTypesOf(entityInformation.getJavaType())
            .stream().map(Class::getName).collect(Collectors.toSet());
    subTypes.add(entityInformation.getJavaType().getName());
    this.baseClassQuery = Criteria.where("_class").in(subTypes.toArray());
}
And here is an example implementation of a find:
public T findOne(ID id) {
    Assert.notNull(id, "The given id must not be null!");
    Query q = getIdQuery(id).addCriteria(baseClassQuery);
    return mongoOperations.findOne(q, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
It works for me; I am just afraid that it takes a little longer.
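The effect of that _class criteria can be illustrated in plain Java: given documents tagged with their class name, only those whose tag is in the allowed set (the requested type plus its subtypes) come back. The in-memory "collection" and class names below are made up for the demo:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class ClassFilterDemo {
    // Simulates "_class in (type, subtypes...)" over in-memory documents,
    // where each document is a pair {id, _class}.
    public static List<String> findByClass(List<String[]> docs, Set<String> allowedClasses) {
        List<String> ids = new ArrayList<>();
        for (String[] doc : docs) {
            if (allowedClasses.contains(doc[1])) {
                ids.add(doc[0]);
            }
        }
        return ids;
    }

    public static void main(String[] args) {
        List<String[]> collection = new ArrayList<>();
        collection.add(new String[]{"1", "Superclass"});
        collection.add(new String[]{"2", "Subclass"});

        // SubclassRepository's criteria: only the subtype itself.
        Set<String> subclassOnly = new HashSet<>();
        subclassOnly.add("Subclass");
        System.out.println(ClassFilterDemo.findByClass(collection, subclassOnly)); // [2]
    }
}
```

Without the criteria, the query by id alone would also deserialize the Superclass document, which is exactly the bug described in the question.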

MongoDB Auto-Incrementing id field with java mongo driver?

I have spent the whole day trying to find an answer to the question:
"How do I add an auto-incrementing id field to an entity class?"
I am using Morphia (a type-safe Java library for MongoDB). After a couple of hours of digging through source code and googling, I found the LongIdEntity class in the org.mongodb.morphia.utils package. Based on this class, I implemented the following solution. See below:
City class:
@Entity
public class City {
    @Id
    private Long id;
}
Hotel class:
@Entity
public class Hotel {
    @Id
    private Long id;
}
CityLongIdEntity class:
public class CityLongIdEntity extends LongIdEntity {
    public CityLongIdEntity(Datastore ds) {
        super(ds);
    }
}
HotelLongIdEntity class:
public class HotelLongIdEntity extends LongIdEntity {
    public HotelLongIdEntity(Datastore ds) {
        super(ds);
    }
}
DAO implementation:
CityDAO class:
public class CityDAO extends BasicDAO<City, Long> {
    public CityDAO(Datastore ds) {
        super(ds);
    }

    @Override
    public Key<City> save(City c) {
        // If the entity has no id yet, persist the helper entity first and
        // use the long it generates as the id.
        if (c.getId() == null) {
            CityLongIdEntity ent = new CityLongIdEntity(getDs());
            getDs().save(ent);
            c.setId(ent.getMyLongId());
        }
        return getDs().save(c);
    }
}
HotelDAO class:
public class HotelDAO extends BasicDAO<Hotel, Long> {
    public HotelDAO(Datastore ds) {
        super(ds);
    }

    @Override
    public Key<Hotel> save(Hotel h) {
        if (h.getId() == null) {
            HotelLongIdEntity ent = new HotelLongIdEntity(getDs());
            getDs().save(ent);
            h.setId(ent.getMyLongId());
        }
        return getDs().save(h);
    }
}
Or you can see all this code on GitHub.
The UML diagram is also available:
All this code works as expected and I am happy with it, but I have a couple of questions:
As you can see, for each entity I need to create an additional entity; for example, for the entity City I created CityLongIdEntity (this entity is a crucial part of the auto-incrementing functionality). In this case, if my app has 20 entities (City, Address, Hotel, User, Room, Order etc.), I will need to create 40 classes! I am afraid this is a "code smell". Am I right?
Also, the entity does not know about EntityNameLongIdEntity, and EntityNameLongIdEntity has no idea what the entity is. Only the specific EntityDAO class combines and uses those classes together. Is that OK, or is it again a code smell?
Each EntityDAO class extends BasicDAO and overrides the save() method. The difference between the overridden save() methods of the different DAO classes is minimal. I am afraid that this is code duplication, and a code smell again. Am I right?
Please share your opinion.
We require numeric IDs on some entities, but our implementation is a little different:
We use a regular ObjectId on all entities. Where required, we add a numeric ID.
There is a dedicated AutoIncrementEntity, which keeps a counter for different keys — that would be your class name.
We don't use DAOs but a generic save method, where we check if we have an instance of a class with a numeric ID. If that ID hasn't been set, we fetch one and update the AutoIncrementEntity. The relevant method isn't used at the moment — let me know if it's totally unclear and I'll finish that code.
Two more things from my implementation, which might be a little confusing:
You can always provide a starting number, so our numeric IDs could be 1000, 1001, 1002,... instead of 1, 2, 3,...
The key in the AutoIncrementEntity isn't required to be a class, it could also be a subset. For example, you want to number employees within a company, then the key for employees of company A would be employee-A, for company B company-B,...
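The per-key counter behind that AutoIncrementEntity can be sketched in memory. A real implementation would do the increment atomically in MongoDB (e.g. via findAndModify on the counter document), but the bookkeeping, including the optional starting number and per-subset keys, looks like this:

```java
import java.util.HashMap;
import java.util.Map;

public class CounterSketch {
    // One counter per key: a class name, or a subset key such as "employee-A".
    private final Map<String, Long> counters = new HashMap<>();

    // Returns the next numeric ID for the key, seeding the counter with
    // startValue the first time the key is seen.
    public synchronized long nextId(String key, long startValue) {
        long next = counters.getOrDefault(key, startValue - 1) + 1;
        counters.put(key, next);
        return next;
    }

    public static void main(String[] args) {
        CounterSketch seq = new CounterSketch();
        System.out.println(seq.nextId("City", 1000));    // 1000
        System.out.println(seq.nextId("City", 1000));    // 1001
        System.out.println(seq.nextId("employee-A", 1)); // 1: independent key
    }
}
```

One counter document per key replaces the 20 helper-entity classes from the question: the key is just a string, so no new class is needed per entity.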

Passing dynamic parameters to an annotation

I wonder if there is a possibility to pass dynamic values to an annotation attribute.
I know that annotations are not designed to be modified, but I am using Hibernate Filters, and the conditions to apply are not static in my case.
I think the only solution is to use libraries that read and modify bytecode, such as Javassist or ASM, but it would be much better if there were another solution.
PS: The difficulty in my case is that I need to modify annotation attribute values, and the libraries I mentioned above allow creating classes, not editing them; that is why I am looking for another solution.
Thanks in advance.
I don't know if it integrates nicely with your frameworks, but I would like to suggest the following:
Create an annotation which receives a Class that implements the validation rule
Create an interface which the annotation can receive
Create an implementation for the interface which has the logic for your rule
Add the annotations to your model class
Create an annotation processor which applies the validation for each annotated field
I wrote the following example in Groovy, but using standard Java libs and idiomatic Java. Warn me if anything is unreadable:
import java.lang.annotation.*

// Our Rule interface
interface Rule<T> { boolean isValid(T t) }

// Here is the annotation which can receive a Rule class
@Retention(RetentionPolicy.RUNTIME)
@interface Validation { Class<? extends Rule> value() }

// An implementation of our Rule, in this case, for a Person's name
class NameRule implements Rule<Person> {
    PersonDAO dao = new PersonDAO()

    boolean isValid(Person person) {
        Integer mode = dao.getNameValidationMode()
        if (mode == 1) { // Don't hardcode numbers; use enums
            return person.name ==~ "[A-Z]{1}[a-z ]{2,25}" // regex matching
        } else if (mode == 2) {
            return person.name ==~ "[a-zA-Z]{1,25}"
        }
    }
}
After these declarations, the usage:
// Our model with an annotated field
class Person {
    @Validation(NameRule.class)
    String name
}

// Here we are mocking a database select to get the rule saved in the database
// Don't use hardcoded numbers; stick to an enum or anything else
class PersonDAO { Integer getNameValidationMode() { return 1 } }
The processing of the annotations:
// Here we get each annotation and process it against the object
class AnnotationProcessor {
    String validate(Person person) {
        def annotatedFields = person.class.declaredFields.findAll { it.annotations.size() > 0 }
        for (field in annotatedFields) {
            for (annotation in field.annotations) {
                Rule rule = annotation.value().newInstance()
                if (!rule.isValid(person)) {
                    return "Error: name is not valid"
                } else {
                    return "Valid"
                }
            }
        }
    }
}
And tests:
// These two must pass
assert new AnnotationProcessor().validate(
        new Person(name: "spongebob squarepants")) == "Error: name is not valid"
assert new AnnotationProcessor().validate(
        new Person(name: "John doe")) == "Valid"
Also, take a look at GContracts; it provides an interesting validation-through-annotations model.
Annotation parameters are hard-coded constants in the class file, so the only way to change them is to generate a new class file.
Unfortunately, I'm not familiar with Hibernate, so I can't suggest the best option in your specific case.
