My JPA repositories extend a custom interface that carries annotations for handling authorization in a generic way.
public interface MultiTenantCrudRepo<T, ID> extends CrudRepository<T, ID>
This interface adds @PreAuthorize, @PostAuthorize, @PreFilter and @PostFilter annotations to the methods of CrudRepository.
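The question doesn't show the interface body, but for illustration it might look like the sketch below; the tenantId property, the principal type, and the exact security expressions are assumptions, not the original code:

import java.util.List;
import org.springframework.data.repository.CrudRepository;
import org.springframework.security.access.prepost.PostFilter;
import org.springframework.security.access.prepost.PreAuthorize;

public interface MultiTenantCrudRepo<T, ID> extends CrudRepository<T, ID> {

    // illustrative: filter results down to the caller's tenant
    // (assumes both the entities and the principal expose a tenantId)
    @Override
    @PostFilter("filterObject.tenantId == principal.tenantId")
    List<T> findAll();

    // illustrative: only admins may clear the table
    @Override
    @PreAuthorize("hasRole('ADMIN')")
    void deleteAll();
}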
Further, for some entities, I have the need to implement soft deletion. For this purpose, I created a "SoftDeleteRepository" like this:
public interface SoftDeleteRepository<T extends BaseEntity<I> & SoftDeletable, I extends Serializable> extends CrudRepository<T, I> {

    @Query("update #{#entityName} e set e.isDeleted = true where e.id = ?#{#entity.id}")
    @Modifying
    @Override
    public void delete(@Param("entity") T entity);
}
You can see it adds @Query annotations to implement the functionality I need.
Both interfaces work independently as expected, but when a repository requires both features (authorization and soft deletion), like this
public interface FooRepo extends SoftDeleteRepository<Foo, Long>, MultiTenantCrudRepo<Foo, Long> {
it seems like only the annotations of the first interface after "extends" are effective. So in this case, I get a FooRepo that supports soft deletion but performs no authorization checks.
What is the best way to get both to work?
I guess this is tricky to do because it would effectively be multiple inheritance, which Java does not support; for example, see this.
What would be chosen if there were two of the same annotations with different parameters, for example?
Many frameworks - like Spring Data - do just fine when checking for inherited annotations, but presumably only when there is no multiple inheritance and/or no conflicting annotations. These frameworks might use reflection to walk up the "implements" tree, but might follow only one path because of the ambiguity above - or, if well implemented, throw an exception.
Because of this I am afraid you need to do something like:
public interface SoftDeleteMultitenantRepository
extends MultiTenantCrudRepo<Foo, Long> {
// a copy of your soft delete method here (fleshed out in the sketch below)
}
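Fleshed out, that combined interface might look like this; the soft delete query is copied from the question, while the @PreAuthorize expression is illustrative:

public interface SoftDeleteMultitenantRepository
        extends MultiTenantCrudRepo<Foo, Long> {

    // soft delete re-declared here, together with an illustrative security check
    @Query("update #{#entityName} e set e.isDeleted = true where e.id = ?#{#entity.id}")
    @Modifying
    @PreAuthorize("hasRole('ADMIN')")
    @Override
    void delete(@Param("entity") Foo entity);
}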
Is there any way to intercept or change the document before MongoRepository.save(e) updates it?
I am trying to push a subproperty inside an array in a document. I have tried to manipulate DBObjects by implementing a custom converter, but the $push operation did not work there.
I think that to make it work I have to implement something like mongoOperation.update(dbObjectMatch, dbObjectUpdate).
I found that MongoRepository.save(document) doesn't support partial updates, i.e. writing only the changes to an existing document. I want to know the internal code of MongoRepository.save so I can override the default behavior.
I have implemented MyRepositoryCustom, where I can override save by extending it in MyRepository, which extends MongoRepository<T, ID extends Serializable>, and then used mongoTemplate.updateFirst(query, update, Clazz.class) to achieve what I am looking for, but I am not satisfied.
You have multiple, slightly different questions:
From your title:
Change/Override Default Behaviour of Mongorepository Save() ( S save(S var1)) Method
You can use custom implementations to override the behavior of existing methods in Spring Data repositories. See the reference documentation for how to do that. Your last paragraph suggests you already do that. Unfortunately, you don't tell us why you aren't satisfied with this.
Is there any way to intercept/change before Mongorepository Save() ( S save(S var1)) method For document update.
Yes, a Spring Data MongoDB repository fires various lifecycle events for this purpose. Once again, see the reference documentation for details.
I want to know the Internal code of Mongorepository Save
What you are looking for is SimpleMongoRepository, which delegates almost all of its work to MongoTemplate.
You're looking for Lifecycle Events.
Overriding repository base methods allows you to interact with the domain object itself, but the mapping happens inside MappingMongoConverter.
Saving an object will fire events such as
BeforeConvertEvent
BeforeSaveEvent
AfterSaveEvent
These events carry a reference to your saved object. BeforeSaveEvent additionally exposes the mapped representation (DBObject) of your object, which you can change or enhance before it is written.
You can listen to these events by subclassing AbstractMongoEventListener or by configuring a listener bean such as ApplicationListener<BeforeSaveEvent>.
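For instance, a listener that enriches the document just before it is written could look like the sketch below; MyDocument and the lastModified field are illustrative, and this assumes the 1.x API where the event exposes a DBObject (later versions expose a Document via getDocument()):

import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.stereotype.Component;

@Component
public class MyDocumentSaveListener extends AbstractMongoEventListener<MyDocument> {

    @Override
    public void onBeforeSave(BeforeSaveEvent<MyDocument> event) {
        // the DBObject is the mapped representation about to be written;
        // MyDocument is a placeholder for your own document class
        event.getDBObject().put("lastModified", System.currentTimeMillis());
    }
}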
You can override individual methods of MongoRepository through a fragment, by making sure the method signature in your custom interface matches the one you want to override.
E.g., if you want to override the save() method, create a new interface like
interface CustomizedSave<T> {
<S extends T> S save(S entity);
}
Implement this interface:
@Component
class CustomizedSaveImpl<T> implements CustomizedSave<T> {

    // This is optional; I added it just to show you can use it here
    private final MongoTemplate mongoTemplate;

    @Autowired
    public CustomizedSaveImpl(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public <S extends T> S save(S entity) {
        // Your custom implementation, e.g. persist via the injected template
        mongoTemplate.save(entity);
        return entity;
    }
}
Have your repository interface extend both the base repository and the customized interface:
interface UserRepository extends CrudRepository<User, Long>, CustomizedSave<User> {
}
This will ensure only the save method gets overridden and the rest stay as they are.
Docs: https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#repositories.single-repository-behavior
I'm building a Spring Boot application containing more than 10 domain classes which need to be persisted in a SQL database.
The problem is that I need to create an interface for every single domain class, so something like this for each one:
public interface BehandelaarRepo extends CrudRepository<BehandelCentrum, Long> {
}
Is there any way to decrease the number of repositories by using some kind of design pattern or whatever? Any suggestions?
You can actually make it somewhat easier for yourself by using generics the same way Spring Data JPA does it:
public interface JpaRepository<T, ID extends Serializable> {
    public <S extends T> S save(S object);
}
The trick is that you can use all subclasses, and you're getting that class back as well. I always create one superclass, so I get rid of my ID generic:
@MappedSuperclass
public class JpaObject {

    @Id
    @GeneratedValue
    private Long id;

    // .... created, last updated, general stuff here ....
}
I create my @Entity classes as subclasses of this JpaObject.
Second step: create my super interface for future usage of special queries:
@NoRepositoryBean
public interface Dao<T extends JpaObject> extends JpaRepository<T, Long> {
}
Next step: the generic Dao, which looks a bit silly and always remains empty:
@Repository
public interface GenericDao extends Dao<JpaObject> {
}
Now take a close look at that save method in CrudRepository / JpaRepository:
public <S extends T> S save(S object);
Any object extending JpaObject (S extends JpaObject) can now be given as a parameter to all methods, and the return type is the same class as your parameter.
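Usage then looks like this (assuming a GenericDao injected as genericDao; Invoice and Customer stand in for your own JpaObject subclasses):

Invoice invoice = genericDao.save(new Invoice());     // returns Invoice
Customer customer = genericDao.save(new Customer());  // returns Customer

The compiler infers S from the argument, so no casts are needed.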
(Aziz, if it's more convenient, this can also be explained in Dutch :P Greetings from Zwolle)
Well, I had a similar problem. I resolved it by creating a new layer, namely a
RepositoryManager (or ModelService) singleton that holds all the repo interfaces and the methods that use them.
If you want, you can implement a generic save method (then call the class ModelService) that resolves model types through reflection and chooses the corresponding repository; see the sketch below.
It was also handy for decoupling the cache implementation (I used Spring Cache).
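A minimal sketch of that idea, assuming a map from entity class to repository; the lookup strategy and the class name are mine, not the original author's (BehandelaarRepo and BehandelCentrum are from the question):

import java.util.HashMap;
import java.util.Map;
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Service;

@Service
public class ModelService {

    // each entity type registers the repository that handles it
    private final Map<Class<?>, CrudRepository<?, Long>> repositories = new HashMap<>();

    public ModelService(BehandelaarRepo behandelaarRepo /* , other repos */) {
        repositories.put(BehandelCentrum.class, behandelaarRepo);
    }

    @SuppressWarnings("unchecked")
    public <T> T save(T entity) {
        CrudRepository<T, Long> repo =
                (CrudRepository<T, Long>) repositories.get(entity.getClass());
        if (repo == null) {
            throw new IllegalArgumentException("No repository for " + entity.getClass());
        }
        return repo.save(entity);
    }
}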
I'm trying to write a generic SecurePagingAndSorting repository which will check security on CRUD operations to save repeating the same PreAuthorize (with different authorities) throughout all JPA repositories.
Here is a simple example below:
@NoRepositoryBean
public interface SecuredPagingAndSortingRepository<T, ID extends Serializable> extends PagingAndSortingRepository<T, ID> {

    @Override
    @PreAuthorize("hasPermission(#id, domainType, 'delete')")
    void delete(ID id);
}
Now it's the domainType argument that's the problem here: since this is a generic interface, it can't be hard-coded. What is the best approach to get the domain type from repositories that derive from SecuredPagingAndSortingRepository?
The best solution I see is writing your own implementation of the PermissionEvaluator interface and then injecting it into the security context, replacing the default one.
Should you try this route, extending the class AclPermissionEvaluator will save you lots of code already managed by Spring, and ensures backward compatibility.
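A skeleton of that approach might look like this; the decision logic is a placeholder:

import java.io.Serializable;
import org.springframework.security.access.PermissionEvaluator;
import org.springframework.security.core.Authentication;

public class RepositoryPermissionEvaluator implements PermissionEvaluator {

    @Override
    public boolean hasPermission(Authentication authentication,
            Object targetDomainObject, Object permission) {
        // placeholder: inspect the entity instance and the requested permission
        return false;
    }

    @Override
    public boolean hasPermission(Authentication authentication,
            Serializable targetId, String targetType, Object permission) {
        // placeholder: this variant receives the id and the domain type name,
        // which is exactly what the generic repository needs
        return false;
    }
}

It is then registered on the method security expression handler, e.g. via DefaultMethodSecurityExpressionHandler.setPermissionEvaluator(...).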
I settled on this solution in the end. PreAuthorize has the facility to use any bean within the SpEL expression via the @ character.
@Override
@PreAuthorize("hasPermission(#id, @security.getDeletePermission(#id, #this.this))")
void delete(ID id);
So when the 'security' bean's getDeletePermission method is called, the #this.this argument evaluates to the SimpleJpaRepository in question. This allows us to determine the concrete repository and return the desired permission name.
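For completeness, a hypothetical sketch of that 'security' bean; the way the permission name is derived here is illustrative only:

import org.springframework.stereotype.Component;

@Component("security")
public class RepositorySecurity {

    // id and repository arrive via #id and #this.this in the SpEL expression
    public String getDeletePermission(Object id, Object repository) {
        // illustrative: derive a permission name from the repository class
        return repository.getClass().getSimpleName() + ":delete";
    }
}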
I would like to create a Spring Data JPA repository with custom behavior, and implement that custom behavior using Specifications. I have gone through the Spring Data JPA documentation for implementing custom behavior in a single repository to set this up, except there is no example of using a Spring Data Specification from within a custom repository. How would one do this, if even possible?
I do not see a way to inject something into the custom implementation that takes a specification. I thought I would be tricky and inject the CRUD repository portion of the repository into the custom portion, but that results in a circular instantiation dependency.
I am not using QueryDSL. Thanks.
I guess the primary source of inspiration could be how SimpleJpaRepository handles specifications. The key spots to have a look at are:
SimpleJpaRepository.getQuery(…) - it basically creates a CriteriaQuery and bootstraps a select using a JPA Root. Whether the latter applies to your use case is up to you; I think the former definitely will.
SimpleJpaRepository.applySpecificationToCriteria(…) - it basically takes the artifacts produced in getQuery(…) (i.e. the Root and the CriteriaQuery) and applies the given Specification to exactly these artifacts.
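Putting those two pieces together, a custom fragment could build and execute the query itself. This is a sketch; Foo and the fragment names are placeholders:

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Root;
import org.springframework.data.jpa.domain.Specification;

public class FooRepositoryCustomImpl implements FooRepositoryCustom {

    @PersistenceContext
    private EntityManager em;

    @Override
    public List<Foo> findBySpecification(Specification<Foo> spec) {
        // mirror SimpleJpaRepository: create the CriteriaQuery and Root...
        CriteriaBuilder builder = em.getCriteriaBuilder();
        CriteriaQuery<Foo> query = builder.createQuery(Foo.class);
        Root<Foo> root = query.from(Foo.class);
        // ...then apply the Specification to exactly these artifacts
        query.where(spec.toPredicate(root, query, builder));
        return em.createQuery(query).getResultList();
    }
}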
This is not using Specification, so I'm not sure it's relevant to you, but one way I was able to inject custom behavior is as follows.
The basic structure is as follows:
i. Create a generic interface for the set of entity classes which are modeled after a generic parent entity. Note that this step is optional; in my case I needed this hierarchy, but it's not necessary.
public interface GenericRepository<T> {
// add any common methods to your entity hierarchy objects,
// so that you don't have to repeat them in each of the children entities
// since you will be extending from this interface
}
ii. Extend a specific repository from the generic one (step i) and from JpaRepository:
public interface MySpecificEntityRepository extends GenericRepository<MySpecificEntity>, JpaRepository<MySpecificEntity, Long> {
// add all methods based on column names, entity graphs or JPQL that you would like to
// have here in addition to what's offered by JpaRepository
}
iii. Use the above repository in your service implementation class
Now, the Service class may look like this,
public interface GenericService<T extends GenericEntity, ID extends Serializable> {
// add specific methods you want to extend to user
}
The generic implementation class can be as follows,
public abstract class GenericServiceImpl<T extends GenericEntity, J extends JpaRepository<T, Long> & GenericRepository<T>> implements GenericService<T, Long> {
// constructor takes in specific repository
public GenericServiceImpl(J genericRepository) {
// save this to local var
}
// using the above repository, specific methods are programmed
}
The specific implementation class can be:
public class MySpecificEntityServiceImpl extends GenericServiceImpl<MySpecificEntity, MySpecificEntityRepository> implements MySpecificEntityService {

    // the specific repository is autowired
    @Autowired
    public MySpecificEntityServiceImpl(MySpecificEntityRepository genericRepository) {
        super(genericRepository);
        this.genericRepository = genericRepository;
    }
}
Is it typical to name DAOs in the following way:
UserDAO - interface
UserDAOImpl - implements UserDAO
I am wondering if it's standard to use the suffix 'Impl' for the implementation or if something more meaningful is the best practice. Thanks.
That is generally what I use. Sometimes a Default prefix, as in DefaultUserDAO, might make more sense if you're creating an interface that you expect others to implement while you provide the reference implementation.
Most of the time I feel those two can be used interchangeably but in some situations one provides a little more clarity than the other.
There are two conventions that I've seen:
FooDao for the interface and FooDaoImpl for the implementation
IFooDao for the interface and FooDao for the implementation
The former has its roots in CORBA; the latter is a Microsoft COM/.NET convention. (Thanks to Pascal for the correction.)
"Don't Repeat the DAO" is a fine idea. I personally think that article is more complex than it needs to be. There's a way to do it without reflection in finders that I happen to prefer. If you use Hibernate, query by example can be a great way to do it simply. The interface would look more like this:
package persistence;
import java.io.Serializable;
import java.util.List;
public interface GenericDao<T, K extends Serializable>
{
T find(K id);
List<T> find();
List<T> find(T example);
List<T> find(String queryName, String [] paramNames, Object [] bindValues);
K save(T instance);
void update(T instance);
void delete(T instance);
}
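With Hibernate, the example-based finder can then be implemented along these lines; the sessionFactory wiring is assumed, and the class is left abstract so the remaining interface methods can be filled in separately:

import java.io.Serializable;
import java.util.List;
import org.hibernate.SessionFactory;
import org.hibernate.criterion.Example;

public abstract class HibernateGenericDao<T, K extends Serializable>
        implements GenericDao<T, K> {

    private final SessionFactory sessionFactory;

    protected HibernateGenericDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    @Override
    @SuppressWarnings("unchecked")
    public List<T> find(T example) {
        // query by example: matches the probe's non-null properties
        return sessionFactory.getCurrentSession()
                .createCriteria(example.getClass())
                .add(Example.create(example))
                .list();
    }
}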
First of all - you may not really need a DAO class for each of your classes. The Don't repeat the DAO! article explains what a generic DAO is. Wondering how to name boilerplate code is not productive.
Now, when you have a generic DAO, you could go for:
DAO (interface)
SessionDAO and EntityManagerDAO - for using either Session or EntityManager
And, of course, use the DAO only through its interface; you can then easily switch between implementations.
(I actually prefer it lowercased - Dao, although it's an abbreviation; and the Impl suffix)
I've also been a fan of the GenericDao and GenericDaoImpl convention, with some support from generic helper classes in case save or delete require extra actions for some persistent classes:
public interface PersistListener<T> {
void onPersist(T item);
}
Similar constructs can also be used for deletion (see the mirror interface below). This is especially useful if you need some kind of event log to write each activity to and you don't want to use AOP for that.
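The deletion counterpart could simply mirror the persist listener:

public interface DeleteListener<T> {
    void onDelete(T item);
}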
My GenericDaoImpl would look something like this:
public class GenericDaoImpl<T> extends HibernateTemplate {

    private GenericInterfaceHandler persistListeners;

    public void setPersistListeners(List<PersistListener> listeners) {
        this.persistListeners = new GenericInterfaceHandler( listeners,
            PersistListener.class );
    }

    // hibernate updates the key to the object itself
    public T save(T item) {
        getSession().save( item );
        List<PersistListener<T>> listeners = this.persistListeners.getAll( item );
        for ( PersistListener<T> listener : listeners )
            listener.onPersist( item );
        return item;
    }

    // ...
}
What the persistListeners helper in the above example does is find the PersistListener whose generic type matches the class given as a parameter. If such a listener is found, the call is delegated to the proper listener(s). My GenericInterfaceHandler can also return only the most specific handler, or only the handler for the given class if present.
If you are interested, I could also post the GenericInterfaceHandler implementation, as it's quite a powerful construct on many occasions.