This is my project structure
MODEL
TodoItem.java - this is an interface
TodoType1 - this implements the interface
TodoType2 - this implements the interface
Repo
TodoRepo.java - extends JPA repo with <TodoItem, Integer>
Controller (uses the TodoRepo for CRUD operations)
request 1 - needs to work with todotype1
request 2 - needs to work with todotype2
I am a bit confused: how should I go about using qualifiers here? Should I create different repositories for each type?
TodoRepo.java - extends JPA repo with <TodoItem, Integer>
Here TodoItem is an interface. Spring Data JPA cannot tell which entity it should handle, because two classes implement the TodoItem interface. Declaring a concrete entity class instead of the interface avoids the error.
I think you need to create two different repositories, and then use the @Autowired annotation to inject the desired bean into your controller.
This will inject the appropriate repository implementation (TodoType1Repo or TodoType2Repo) into your controller based on the value of the @Qualifier annotation.
More about @Qualifier: https://www.baeldung.com/spring-qualifier-annotation
@Qualifier("todoType1Repo")
@Repository
public interface TodoType1Repo extends JpaRepository<TodoType1, Integer> {}
@Qualifier("todoType2Repo")
@Repository
public interface TodoType2Repo extends JpaRepository<TodoType2, Integer> {}
@Autowired
@Qualifier("todoType1Repo")
private TodoType1Repo todoType1Repo;
@Autowired
@Qualifier("todoType2Repo")
private TodoType2Repo todoType2Repo;
public void handleRequest1() {
// Use todoType1Repo to perform CRUD operations on TodoType1 objects
}
public void handleRequest2() {
// Use todoType2Repo to perform CRUD operations on TodoType2 objects
}
Related
I implemented a decorator to customize the mapping of an entity, let's say MappingDecoratorA, which is an abstract class that implements my MapStruct MapperA interface.
public abstract class MappingDecoratorA implements MapperA {
...}
@Mapper
@DecoratedWith(MappingDecoratorA.class)
public interface MapperA {
}
In another mapper, let's say MapperB, I use MapperA; MapperB uses CDI:
@Mapper(uses = { MapperA.class }, componentModel = "cdi")
public interface MapperB {
}
MapStruct generates two implementations of MapperA, MapperAImpl and MapperAImpl_. In my situation the injection mechanism doesn't know which implementation to use, and the result is an ambiguity exception listing the two implementations.
Does MapStruct offer a solution for my problem?
When using a non-default componentModel you have to use it for all the mappers, especially if you want to reuse them. Otherwise the chosen component model won't know how to inject and create the mappers.
So a solution for your problem would be to do
@Mapper(componentModel = "cdi")
@DecoratedWith(MappingDecoratorA.class)
public interface MapperA {
}
I have a requirement where, based on the active profile, I need to inject two different classes into the DAO layer to perform CRUD operations. Let's say we have class A and class B for profiles a and b respectively. Currently the DAO layer avoids an if/else only because the service layer calls two different methods, saveA() and saveB(), depending on the profile. Is there a way to make this more generic, so that based on the profile (or a class reference) I can instantiate a different entity as well as the matching JPA repository? I tried to use
<T extends Parent> T factoryMethod(Class<T> clazz) throws Exception {
    return clazz.getDeclaredConstructor().newInstance();
}
but this also forces me to cast the returned object. I tried creating a parent P for both class A and class B and using it instead, but got confused when injecting the entity types into JpaRepository.
I tried creating a SimpleJpaRepository, but that didn't work because there are overridden methods in ARepository and BRepository.
Or,
is there a way I can use the same entity class for two different tables? That would solve it: for one profile I have one set of columns, whereas for the second profile I have a different set.
This is what I am expecting. Would it be possible? Or is what I am doing now correct?
public void doStuff(Parent entity) {
    JpaRepository repo;
    if (entity instanceof A) {
        // use ARepository
        repo = applicationContext.getBean(ARepository.class);
    } else {
        // use BRepository
        repo = applicationContext.getBean(BRepository.class);
    }
    repo.save(entity);
    repo.flush();
}
You can create a utility like the following: the key is the class type of the entity and the value is the repository.
private final Map<Class<? extends Parent>, JpaRepository> repoMapping = new HashMap<>();

@PostConstruct
public void init() {
    repoMapping.put(A.class, applicationContext.getBean(ARepository.class));
    repoMapping.put(B.class, applicationContext.getBean(BRepository.class));
}

public JpaRepository getRepo(Class<? extends Parent> clazz) {
    return repoMapping.get(clazz);
}
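Outside Spring, the same map-based dispatch can be sketched with plain Java. In this sketch, Consumer<Parent> stands in for the repository beans, and Parent/A/B mirror the hypothetical class names from the question:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

class Parent {}
class A extends Parent {}
class B extends Parent {}

// Sketch of the class-keyed lookup; Consumer<Parent> stands in for the
// JpaRepository beans that a real Spring context would supply.
class RepoRegistry {
    private final Map<Class<? extends Parent>, Consumer<Parent>> repoMapping = new HashMap<>();

    void register(Class<? extends Parent> type, Consumer<Parent> repo) {
        repoMapping.put(type, repo);
    }

    // Dispatch on the entity's concrete class -- no if/else chain needed.
    void save(Parent entity) {
        repoMapping.get(entity.getClass()).accept(entity);
    }
}

public class Main {
    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        RepoRegistry registry = new RepoRegistry();
        registry.register(A.class, e -> log.append("savedA;"));
        registry.register(B.class, e -> log.append("savedB;"));
        registry.save(new A());
        registry.save(new B());
        System.out.println(log);
    }
}
```

The lookup replaces the if/else entirely; adding a third entity type is one more put() in init(), with no change to the dispatch code.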
Using Couchbase Enterprise Edition 5.0.0 build 2873 and Spring Data Couchbase 2.1.2, I am getting the error explained at the bottom of the question. Skip ahead if you just need the short story.
If you want a little more explanation, here it comes:
Imagine that the CrudRepository methods are fine for me; I don't need to add any more methods.
What would the repository look like? Would I declare an empty repository just like this?
@Repository
public interface PersonRepository extends CrudRepository<Person, String> {
// Empty
}
Or would I use CrudRepository directly as my base repository? I don't think so, because you need to pass the type and ID to CrudRepository.
But the question is not that. Here comes the question:
How does Spring know how to instantiate PersonRepository, given that there is no implementation of that base repository? Take a look at the PersonService interface and the PersonServiceImpl implementation.
Interface:
@Service
public interface PersonService {
Person findOne (String id);
List<Person> findAll();
//...
}
Implementation:
public class PersonServiceImpl implements PersonService {

    // This is the variable for the repository
    @Autowired
    private PersonRepository personRepository;

    public Person findOne(String id) {
        return personRepository.findOne(id);
    }

    public List<Person> findAll() {
        List<Person> personList = new ArrayList<>();
        Iterator<Person> it = personRepository.findAll().iterator();
        while (it.hasNext()) {
            personList.add(it.next());
        }
        return personList;
    }
    //...
}
Is it really enough to declare an empty PersonRepository extending CrudRepository? Is there no need to implement or say anything about each method of CrudRepository? At least something to tell Spring about some constructor...
These doubts all arise because I am getting this error when Spring tries to inject the personRepository variable:
Error creating bean with name 'personRepository': Could not resolve matching constructor (hint: specify index/type/name arguments for simple parameters to avoid type ambiguities).
Apparently, it is asking for some class that defines at least the constructor of the implementation. How do I tell Spring to avoid the type ambiguities mentioned in the error?
As for the repository, you'll need to extend CouchbaseRepository if you're strictly working with Couchbase; that way your repository will expose lower-level Couchbase objects/functionality.
For example
public interface PersonRepository extends CouchbaseRepository<Person, String> {
}
For your service, you don't need to define findOne() and findAll(), those are strictly the responsibility of the repository.
For example
public interface PersonService {
void doOperationOnPerson(String personId);
}
@Service
public class PersonServiceImpl implements PersonService {

    @Autowired
    PersonRepository personRepo;

    @Override
    public void doOperationOnPerson(String personId) {
        Person person = personRepo.findOne(personId);
        // do operation
    }
}
Note that the @Service annotation goes on the implementation. (It should actually work either way, but I think having the annotation on the implementation is more proper.)
If you need to define custom queries then it should be done inside the repository itself.
You also might need to define an empty constructor on your Person class if you have a non-default constructor.
I suggest you read more about Spring Data.
I would like to create a Spring Data JPA repository with custom behavior, and implement that custom behavior using Specifications. I have gone through the Spring Data JPA documentation for implementing custom behavior in a single repository to set this up, except there is no example of using a Spring Data Specification from within a custom repository. How would one do this, if even possible?
I do not see a way to inject something into the custom implementation that takes a specification. I thought I would be tricky and inject the CRUD repository portion of the repository into the custom portion, but that results in a circular instantiation dependency.
I am not using QueryDSL. Thanks.
I guess the primary source of inspiration could be how SimpleJpaRepository handles specifications. The key spots to have a look at are:
SimpleJpaRepository.getQuery(…) - it basically creates a CriteriaQuery and bootstraps a select using a JPA Root. Whether the latter applies to your use case is up to you; I think the former definitely will.
SimpleJpaRepository.applySpecificationToCriteria(…) - it basically uses the artifacts produced in getQuery(…) (i.e. the Root and the CriteriaQuery) and applies the given Specification to exactly these artifacts.
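Stripped of the JPA machinery, the shape of applySpecificationToCriteria(…) is: build the query artifacts first, then let the specification contribute the filtering criteria. A minimal plain-Java sketch of that shape (Spec and InMemoryRepo are illustrative names, not Spring types):

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Illustrative stand-in for Spring's Specification: it contributes a
// predicate, while the repository owns the query it is applied to.
interface Spec<T> {
    Predicate<T> toPredicate();
}

class InMemoryRepo<T> {
    private final List<T> rows;

    InMemoryRepo(List<T> rows) {
        this.rows = rows;
    }

    // Mirrors getQuery(...) + applySpecificationToCriteria(...): the repo
    // builds the "query" (here a stream) and the spec supplies the criteria.
    List<T> findAll(Spec<T> spec) {
        return rows.stream()
                   .filter(spec.toPredicate())
                   .collect(Collectors.toList());
    }
}

public class Main {
    public static void main(String[] args) {
        InMemoryRepo<Integer> repo = new InMemoryRepo<>(List.of(1, 2, 3));
        System.out.println(repo.findAll(() -> n -> n > 1)); // [2, 3]
    }
}
```

A custom repository implementation can follow the same split: it constructs the CriteriaQuery and Root itself, then hands both to the Specification's toPredicate(…).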
This is not using Specification, so I'm not sure if it's relevant to you, but one way I was able to inject custom behavior is as follows.
The basic structure:
i. Create a generic interface for the set of entity classes that are modeled after a generic parent entity. Note: this is optional. In my case I needed this hierarchy, but it isn't necessary.
public interface GenericRepository<T> {
// add any common methods to your entity hierarchy objects,
// so that you don't have to repeat them in each of the children entities
// since you will be extending from this interface
}
ii. Extend a specific repository from the generic one (step i) and from JpaRepository:
public interface MySpecificEntityRepository extends GenericRepository<MySpecificEntity>, JpaRepository<MySpecificEntity, Long> {
// add all methods based on column names, entity graphs or JPQL that you would like to
// have here in addition to what's offered by JpaRepository
}
iii. Use the above repository in your service implementation class
Now, the service class may look like this:
public interface GenericService<T extends GenericEntity, ID extends Serializable> {
// add specific methods you want to extend to user
}
The generic implementation class can be as follows:
public abstract class GenericServiceImpl<T extends GenericEntity, J extends JpaRepository<T, Long> & GenericRepository<T>> implements GenericService<T, Long> {
// constructor takes in specific repository
public GenericServiceImpl(J genericRepository) {
// save this to local var
}
// using the above repository, specific methods are programmed
}
The specific implementation class can be:
public class MySpecificEntityServiceImpl extends GenericServiceImpl<MySpecificEntity, MySpecificEntityRepository> implements MySpecificEntityService {
// the specific repository is autowired
@Autowired
public MySpecificEntityServiceImpl(MySpecificEntityRepository genericRepository) {
super(genericRepository);
this.genericRepository = (MySpecificEntityRepository) genericRepository;
}
}
So let's say we have a couple of entities we want to persist using DAO objects. We implement the right interfaces, so we end up with
class JdbcUserDao implements UserDao{
//...
}
class JdbcAddressDao implements AddressDao{
//...
}
So if I want to be able to switch persistence implementations from JDBC to JPA (for example) and vice versa, I'd need to have a JPAUserDao and a JPAAddressDao... meaning if I had 20 entities and decided to switch implementations (using a DI container), I'd have to swap every JDBC implementation for a JPA one in code.
Now it could be that I misunderstood how DAO works, but... If I just had
class JdbcDaoImpl implements UserDao,AddressDao{
//...
}
I'd then have all the JDBC implementations in one class, and switching implementations would be a piece of cake. Also, the DaoImpl count would equal the number of DAO interfaces. Why not just group them by implementation (JDBC, JTA, JPA...) and have everything under one class?
Thanks in advance.
Having a single class implement every DAO interface in your entire application would be a rather bad design.
A more typical pattern is to have a BaseDAO interface (also often called GenericDAO) and have a JPABaseDAO, JDBCBaseDAO etc. These base classes will contain methods like find/get/read, save/store/persist, update/modify and delete/remove/purge.
Specific DAO interfaces like UserDAO then inherit from BaseDAO, and concrete implementations like JPAUserDAO extend JPABaseDAO.
A BaseDAO interface could look like this:
public interface BaseDAO <T> {
T getByID(Long ID);
T save(T type);
T update(T type);
void delete(T type);
}
And a UserDAO interface:
public interface UserDAO extends BaseDAO<User> {
List<User> getAllAuthorized();
}
Bare bones example of a JPABaseDAO implementing this interface:
@Stateless
public class JPABaseDAO<T> implements BaseDAO<T> {

    @PersistenceContext
    private EntityManager entityManager;

    private final Class<T> entityType;

    @SuppressWarnings("unchecked")
    public JPABaseDAO() {
        this.entityType = ((Class<T>) ((ParameterizedType) getClass().getGenericSuperclass()).getActualTypeArguments()[0]);
    }

    @Override
    public T getByID(Long ID) {
        return entityManager.find(entityType, ID);
    }

    @Override
    public T save(T type) {
        entityManager.persist(type); // persist() returns void, so return the now-managed instance
        return type;
    }

    @Override
    public T update(T type) {
        return entityManager.merge(type);
    }

    @Override
    public void delete(T type) {
        entityManager.remove(entityManager.contains(type) ? type : entityManager.merge(type));
    }
}
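The constructor above relies on a type-token trick that only works when the DAO is instantiated through a concrete subclass: getGenericSuperclass() must see JPABaseDAO<User>, not a raw JPABaseDAO. A JPA-free sketch of just that mechanism (BaseDao/UserDao/User are illustrative names):

```java
import java.lang.reflect.ParameterizedType;

// Minimal, JPA-free version of the entityType resolution in JPABaseDAO.
abstract class BaseDao<T> {
    final Class<T> entityType;

    @SuppressWarnings("unchecked")
    BaseDao() {
        // Works only from a concrete subclass such as UserDao below;
        // instantiating the base class directly would not carry the
        // type argument and this cast would fail.
        this.entityType = (Class<T>) ((ParameterizedType) getClass()
                .getGenericSuperclass()).getActualTypeArguments()[0];
    }
}

class User {}
class UserDao extends BaseDao<User> {}

public class Main {
    public static void main(String[] args) {
        System.out.println(new UserDao().entityType.getSimpleName()); // User
    }
}
```

This is why each concrete DAO must name its entity explicitly in the extends clause; the base class reads the type from exactly that declaration.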
And some sample UserDAO implementation that would inherit from it:
@Stateless
public class JPAUserDAO extends JPABaseDAO<User> implements UserDAO {

    @PersistenceContext
    private EntityManager entityManager;

    @Override
    public List<User> getAllAuthorized() {
        return entityManager.createNamedQuery("User.getAllAuthorized", User.class)
                            .getResultList();
    }
}
In practice the base class can often do some other things transparently, for instance checking if an entity implements some kind of Auditable interface, and automatically setting the date and user that modified it, etc.
When using EJB to implement your DAOs, one strategy for changing implementations is to put all JDBC implementations in one package and all JPA implementations in another, then include only one implementation package in your build.
The whole point of Dependency Injection is to make switching between implementations easier and to decouple the consumer from the provider. Hence all DI frameworks provide some way to "group" several implementations (here your JDBC group and your JPA group) and switch them in one place.
Also: the number of consumers (in your case, business logic working on users and addresses) is usually higher than the number of DAOs, so the DI framework will decouple most of the stuff for you anyway. Assume 50 business beans, two interfaces, and two implementations per interface (four total): even basic DI takes care of the 50, and grouping halves the remaining rest.
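The "switch them in one place" idea can be sketched without any DI framework: a single hand-rolled module picks the implementation group, and consumers only ever see the interfaces. All class names here mirror the question; in a real setup Spring or CDI profiles would do the choosing:

```java
interface UserDao { String backend(); }
interface AddressDao { String backend(); }

class JdbcUserDao implements UserDao { public String backend() { return "jdbc"; } }
class JdbcAddressDao implements AddressDao { public String backend() { return "jdbc"; } }
class JpaUserDao implements UserDao { public String backend() { return "jpa"; } }
class JpaAddressDao implements AddressDao { public String backend() { return "jpa"; } }

// The single place where the persistence group is chosen; everything else
// depends only on UserDao/AddressDao, so swapping 20 DAOs is still one edit.
class DaoModule {
    final UserDao userDao;
    final AddressDao addressDao;

    DaoModule(String group) {
        if ("jpa".equals(group)) {
            userDao = new JpaUserDao();
            addressDao = new JpaAddressDao();
        } else {
            userDao = new JdbcUserDao();
            addressDao = new JdbcAddressDao();
        }
    }
}

public class Main {
    public static void main(String[] args) {
        DaoModule module = new DaoModule("jpa");
        System.out.println(module.userDao.backend() + "/" + module.addressDao.backend());
    }
}
```

This keeps one implementation class per DAO interface (good for cohesion) while still giving the one-line switch the question is after.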
There are definitely ways to implement the DAO pattern in a widely technology-agnostic manner, such that switching persistence technologies or even mixing multiple technologies becomes feasible. This article presents one implementation scheme, including source code on GitHub:
http://codeblock.engio.net/?p=180