I have a requirement where, based on the active profile, I need to inject two different classes into the DAO layer to perform CRUD operations. Let's say we have class A and class B for profiles a and b respectively. Currently I use an if/else on the profile in the service layer to call two different methods: 1. saveA(), 2. saveB(). Is there any way to make this more generic, so that based on the profile (or a class reference) I can instantiate the right entity as well as the right JPA repository? I tried to use
<T extends Parent> T factoryMethod(Class<T> clazz) throws Exception {
    return (T) clazz.newInstance();
}
but this also forces me to cast the returned object. I tried creating a parent class P for both A and B and using it instead, but got confused when injecting the entity types into JpaRepository.
I tried creating a SimpleJpaRepository, but that didn't work because there are overridden methods in ARepository and BRepository.
Or,
is there a way I can use the same entity class for two different tables? That would also solve it: for one profile I have one set of columns, whereas for the second profile I have a different set.
This is what I am expecting. Would it be possible? Or is what I am doing now correct?
public <T extends Parent> void doStuff(T entity) {
    GenericRepository repo;
    if (entity instanceof A) {
        // use ARepository
        repo = applicationContext.getBean(ARepository.class);
    } else {
        // use BRepository
        repo = applicationContext.getBean(BRepository.class);
    }
    repo.save(entity);
    repo.flush();
}
You can create a utility method like the following. The key is the class type of the entity and the value is the repository:
Map<Class<? extends Parent>, JpaRepository> repoMapping = new HashMap<>();

@PostConstruct
public void init() {
    repoMapping.put(A.class, applicationContext.getBean(ARepository.class));
    repoMapping.put(B.class, applicationContext.getBean(BRepository.class));
}

public JpaRepository getRepo(Class<? extends Parent> clazz) {
    return repoMapping.get(clazz);
}
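The same idea can be sketched without Spring at all; the map replaces the if/else entirely. Everything below (Repo, ARepo, BRepo, RepoRegistry) is hypothetical and only illustrates the class-keyed dispatch:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-ins for the entity hierarchy and repositories
class Parent {}
class A extends Parent {}
class B extends Parent {}

interface Repo<T extends Parent> {
    // returns a label here only so the dispatch is observable
    String save(T entity);
}

class ARepo implements Repo<A> {
    public String save(A entity) { return "saved by ARepo"; }
}

class BRepo implements Repo<B> {
    public String save(B entity) { return "saved by BRepo"; }
}

// Maps an entity class to the repository that handles it
class RepoRegistry {
    private final Map<Class<? extends Parent>, Repo<? extends Parent>> mapping = new HashMap<>();

    <T extends Parent> void register(Class<T> type, Repo<T> repo) {
        mapping.put(type, repo);
    }

    @SuppressWarnings("unchecked") // safe: register() pairs each key with a matching repo
    <T extends Parent> String save(T entity) {
        Repo<T> repo = (Repo<T>) mapping.get(entity.getClass());
        if (repo == null) {
            throw new IllegalArgumentException("No repository for " + entity.getClass());
        }
        return repo.save(entity);
    }
}
```

In a Spring application the register(...) calls would live in the @PostConstruct method shown above, with the real repositories fetched from the ApplicationContext.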
Related
I'm learning how to extend Spring's CrudRepository interface to create a repository for an entity. But I'm having trouble implementing more complicated queries that use values that aren't hard-coded. Here's a contrived example. The HQL is not valid, but it shows what I'm trying to do:
import mypackage.DogTypeEnum;

public interface myRepository extends CrudRepository<Dog, Integer> {
    int oldAge = 10; // years - old for a dog

    @Query("SELECT dog FROM Dog dog WHERE dog.age > oldAge AND dog.type = DogTypeEnum.poodle")
    public List<Dog> findOldPoodles();
}
So in the above example, I'm trying to query for all dogs of type poodle that are over a certain age threshold. I don't want to hard-code either poodle or the value 10, because these values will be used elsewhere in the code as well and I want to avoid duplication. I don't want to require the caller to pass those values in as parameters either.
Is there a way to do this?
You could create an interface that your repository extends, like this:
// Only complex queries
public interface MyRepositoryCustom {
    List<Dog> findOldPoodles();
}

// Your repository must extend MyRepositoryCustom
public interface MyRepository extends CrudRepository<Dog, Integer>, MyRepositoryCustom {
    // Declare query methods
}
// More complex queries
public class MyRepositoryImpl implements MyRepositoryCustom {

    @PersistenceContext
    private EntityManager em;

    public List<Dog> findOldPoodles() {
        Query query = em.createQuery("SELECT dog FROM Dog dog WHERE dog.age > :oldAge AND dog.type = :type");
        query.setParameter("oldAge", 10);
        query.setParameter("type", DogTypeEnum.poodle.name());
        return query.getResultList();
    }
}
Remember that all Java class names should start with an upper-case letter.
This link can help you: Spring repositories
In my case I never have this problem; if you structure your project as the following package architecture shows, you won't have it either:
View (Angular, JSP, JSF...) -- APP
Controller -- APP
Services -- Main Core
DAO -- Main Core
Entities -- Main Core
This way you make a more modular, scalable, maintainable and comprehensible application.
It doesn't matter what technology you use in your view; you just have to invoke the correct method on the service.
Here, in the services package, you could have a service like this:
@Service
public class ServiceDog implements Serializable {

    @Autowired
    private MyRepository myRepository;

    int oldAge = 10;

    public List<Dog> findOldPoodles() throws ServicioException {
        return myRepository.findAllByAgeGreaterThanAndType(oldAge, DogTypeEnum.poodle);
    }
}
Now you can use all the advantages you get from the spring-data-jpa query derivation reference, or write a simpler JPQL query.
This is a simple example, but this way you make sure that each DAO speaks with only one entity (it will only have the methods needed for that entity, like save, delete, etc., with no dependencies on other DAOs), and the services are the ones that call the different DAOs and perform the necessary actions on them.
Hope this helps.
I have read several articles and Stack Overflow posts about converting domain objects to DTOs and have tried them out in my code. When it comes to testing and scalability, I always face some issues. I know the following three possible solutions for converting domain objects to DTOs. Most of the time I am using Spring.
Solution 1: Private method in the service layer for converting
The first possible solution is to create a small "helper" method in the service layer code that converts the retrieved database object into my DTO object.
@Service
public class MyEntityService {

    public SomeDto getEntityById(Long id) {
        SomeEntity dbResult = someDao.findById(id);
        SomeDto dtoResult = convert(dbResult);
        // ... more logic happens
        return dtoResult;
    }

    public SomeDto convert(SomeEntity entity) {
        // ... object creation, using getters/setters for the conversion
    }
}
Pros:
easy to implement
no additional class for the conversion needed -> project doesn't blow up with entities
Cons:
problems when testing: since new SomeEntity() is used in the conversion method, if the object is deeply nested I have to provide an adequate result for my when(someDao.findById(id)).thenReturn(alsoDeeplyNestedObject) stub to avoid NullPointerExceptions when the conversion also traverses the nested structure
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
My second solution would be to add an additional constructor to my DTO that converts the entity in the constructor.
public class SomeDto {
    // ... some attributes

    public SomeDto(SomeEntity entity) {
        this.attribute = entity.getAttribute();
        // ... nested conversion & conversion of lists and arrays
    }
}
Pros:
no additional class for the conversion needed
conversion hidden in the DTO -> service code is smaller
Cons:
usage of new SomeDto() in the service code, and therefore I have to provide the correct nested object structure as a result of my someDao mocking.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
I recently saw that Spring offers a class for conversion purposes: Converter<S, T>, but this solution stands for any externalized class that does the conversion. With this solution I inject the converter into my service code and call it when I want to convert the domain entity to my DTO.
Pros:
easy to test as I can mock the result during my test case
separation of tasks -> a dedicated class is doing the job
Cons:
doesn't "scale" that well as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO to entity and entity to DTO)
Do you have more solutions for my problem and how do you handle it? Do you create a new Converter for every new domain object and can "live" with the amount of classes in the project?
Thanks in advance!
Solution 1: Private method in the service layer for converting
I guess Solution 1 will not work well, because your DTOs are domain-oriented and not service-oriented. Thus it is likely that they are used in different services. So a mapping method does not belong to one service and therefore should not be implemented in one service. How would you re-use the mapping method in another service?
The first solution would work well if you used dedicated DTOs per service method. But more about this at the end.
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
In general a good option, because you can see the DTO as an adapter to the entity. In other words: the DTO is another representation of an entity. Such designs often wrap the source object and provide methods that give you another view on the wrapped object.
But a DTO is a data transfer object, so it might be serialized sooner or later and sent over a network, e.g. using Spring's remoting capabilities. In this case the client that receives this DTO must deserialize it, and thus needs the entity classes on its classpath, even if it only uses the DTO's interface.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
Solution 3 is the solution that I also would prefer. But I would create a Mapper<S,T> interface that is responsible for mapping from source to target and vice versa. E.g.
public interface Mapper<S, T> {
    T map(S source);
    // distinct name needed: map(S) and map(T) would clash after type erasure
    S mapReverse(T target);
}
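As an illustration, a hand-rolled implementation of such a Mapper interface for a hypothetical Person entity and PersonDto (no mapping framework involved) could look like this. Note that two overloads T map(S) and S map(T) would have the same erasure and not compile, so the reverse direction gets its own name here:

```java
// Hypothetical entity and DTO, only to demonstrate one Mapper<S, T> implementation
class Person {
    final String name;
    final int age;
    Person(String name, int age) { this.name = name; this.age = age; }
}

class PersonDto {
    final String name;
    final int age;
    PersonDto(String name, int age) { this.name = name; this.age = age; }
}

interface Mapper<S, T> {
    T map(S source);
    S mapReverse(T target);
}

// Maps in both directions, as the interface demands
class PersonMapper implements Mapper<Person, PersonDto> {
    public PersonDto map(Person source) {
        return new PersonDto(source.name, source.age);
    }

    public Person mapReverse(PersonDto target) {
        return new Person(target.name, target.age);
    }
}
```

In practice the body of such a mapper is what a framework like ModelMapper generates for you; writing one by hand only pays off while the number of entities is small.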
The implementation can be done using a mapping framework like modelmapper.
You also said that a converter for each entity
doesn't "scale" that well as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO to entity and entity to DTO)
I doubt that you will only have to create two converters (or one mapper) per DTO, because your DTO is domain-oriented.
As soon as you start to use it in another service, you will recognize that the other service usually should not, or cannot, return all the values that the first service does.
You will start to implement another mapper or converter for each other service.
This answer would get too long if I started on the pros and cons of dedicated versus shared DTOs, so I can only ask you to read my blog post on the pros and cons of service layer designs.
EDIT
About the third solution: where do you prefer to put the call for the mapper?
In the layer above the use cases. DTOs are data transfer objects, because they pack data in data structures that are best for the transfer protocol. Thus I call that layer the transport layer.
This layer is responsible for mapping use case's request and result objects from and to the transport representation, e.g. json data structures.
EDIT
I see you're ok with passing an entity as a DTO constructor parameter. Would you also be ok with the opposite? I mean, passing a DTO as an Entity constructor parameter?
A good question. The opposite would not be ok for me, because I would then introduce a dependency in the entity to the transport layer. This would mean that a change in the transport layer can impact the entities and I don't want changes in more detailed layers to impact more abstract layers.
If you need to pass data from the transport layer to the entity layer you should apply the dependency inversion principle.
Introduce an interface that will return the data through a set of getters, let the DTO implement it and use this interface in the entities constructor. Keep in mind that this interface belongs to the entity's layer and thus should not have any dependencies to the transport layer.
                          interface
+-----+  implements   ||  +------------+   uses   +--------+
| DTO | --------------||->| EntityData | <------- | Entity |
+-----+               ||  +------------+          +--------+
I like the third solution from the accepted answer.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
And I create the DtoConverter this way:
The BaseEntity marker class:
public abstract class BaseEntity implements Serializable {
}
The AbstractDto marker class:
public class AbstractDto {
}
GenericConverter interface:
public interface GenericConverter<D extends AbstractDto, E extends BaseEntity> {

    E createFrom(D dto);

    D createFrom(E entity);

    E updateEntity(E entity, D dto);

    default List<D> createFromEntities(final Collection<E> entities) {
        return entities.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }

    default List<E> createFromDtos(final Collection<D> dtos) {
        return dtos.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }
}
CommentConverter interface:
public interface CommentConverter extends GenericConverter<CommentDto, CommentEntity> {
}
CommentConverter class implementation:
@Component
public class CommentConverterImpl implements CommentConverter {

    @Override
    public CommentEntity createFrom(CommentDto dto) {
        CommentEntity entity = new CommentEntity();
        updateEntity(entity, dto);
        return entity;
    }

    @Override
    public CommentDto createFrom(CommentEntity entity) {
        CommentDto dto = new CommentDto();
        if (entity != null) {
            dto.setAuthor(entity.getAuthor());
            dto.setCommentId(entity.getCommentId());
            dto.setCommentData(entity.getCommentData());
            dto.setCommentDate(entity.getCommentDate());
            dto.setNew(entity.getNew());
        }
        return dto;
    }

    @Override
    public CommentEntity updateEntity(CommentEntity entity, CommentDto dto) {
        if (entity != null && dto != null) {
            entity.setCommentData(dto.getCommentData());
            entity.setAuthor(dto.getAuthor());
        }
        return entity;
    }
}
I ended up NOT using a magical mapping library or an external converter class, but simply added a small bean of my own with convert methods from each entity to each DTO I need. The reason is that the mapping was:
either stupidly simple, so I would just copy some values from one field to another, perhaps with a small utility method,
or quite complex, and more complicated to express in the custom configuration of a generic mapping library than by just writing out the code. This is, for example, the case where the client can send JSON, but under the hood it is transformed into entities, and when the client retrieves the parent object of these entities again, it's converted back into JSON.
This means I can just call .map(converter::convert) on any collection of entities to get back a stream of my DTOs.
Is it scalable to have it all in one class? Well, the custom configuration for this mapping would have to be stored somewhere even if I used a generic mapper. The code is generally extremely simple, except for a handful of cases, so I'm not too worried about this class exploding in complexity. I'm also not expecting dozens more entities, but if that happened I might group these converters into a class per subdomain.
Adding a base class to my entities and DTOs so I can write a generic converter interface and implement it per class just isn't needed (yet?) either, for me.
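A minimal sketch of such a hand-written converter bean (Order, OrderDto and OrderConverter are hypothetical names, only for illustration):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical entity and DTO pair
class Order {
    final long id;
    final String item;
    Order(long id, String item) { this.id = id; this.item = item; }
}

class OrderDto {
    final String label;
    OrderDto(String label) { this.label = label; }
}

// The "small bean of my own": one convert method per entity/DTO pair,
// plus a stream-based helper for collections
class OrderConverter {
    OrderDto convert(Order order) {
        // stupidly simple mapping: copy values, maybe combine a few fields
        return new OrderDto(order.id + ": " + order.item);
    }

    List<OrderDto> convertAll(List<Order> orders) {
        return orders.stream().map(this::convert).collect(Collectors.toList());
    }
}
```

In a Spring application this class would simply carry a @Component annotation and be injected wherever the conversion is needed.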
In my opinion the third solution is the best one. Yes, for each entity you'll have to create two new converter classes, but when it comes time for testing you won't have a lot of headaches. You should never choose the solution that causes you to write less code at the beginning and then much more when it comes to testing and maintaining that code.
Another point: if you use the second approach and your entity has lazy dependencies, your DTO can't tell whether a dependency is loaded unless you inject the EntityManager into the DTO and use it to check, and I don't like that approach because a DTO shouldn't know anything about the EntityManager. As a solution I personally prefer converters, but at the same time I prefer to have multiple DTO classes for the same entity. For example, if I am 100% sure that a User entity will be loaded without its corresponding Company, then there should be a UserDto that doesn't have a CompanyDto field. At the same time, if I know that the User entity will be loaded with its correlated Company, then I use an aggregate pattern, something like a UserCompanyDto class that contains a UserDto and a CompanyDto as fields.
For my part, I prefer option 3 with a third-party library such as ModelMapper or MapStruct. I also use it through an interface in a util package, because I don't want any external tool or library to interact directly with my code.
Definition:
public interface MapperWrapper {
    <T> T performMapping(Object source, Class<T> destination);
}

@Component
public class ModelMapperWrapper implements MapperWrapper {

    private final ModelMapper mapper;

    public ModelMapperWrapper() {
        this.mapper = new ModelMapper();
    }

    @Override
    public <T> T performMapping(Object source, Class<T> destination) {
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT);
        return mapper.map(source, destination);
    }
}
Then I can test it easily:
Testing:
@SpringJUnitWebConfig(TestApplicationConfig.class)
class ModelMapperWrapperTest implements WithAssertions {

    private final MapperWrapper mapperWrapper;

    @Autowired
    public ModelMapperWrapperTest(MapperWrapper mapperWrapper) {
        this.mapperWrapper = mapperWrapper;
    }

    @BeforeEach
    void setUp() {
    }

    @Test
    void givenModel_whenMapModelToDto_thenReturnsDto() {
        var model = new DummyModel();
        model.setId(1);
        model.setName("DUMMY_NAME");
        model.setAge(25);
        var modelDto = mapperWrapper.performMapping(model, DummyModelDto.class);
        assertAll(
                () -> assertThat(modelDto.getId()).isEqualTo(String.valueOf(model.getId())),
                () -> assertThat(modelDto.getName()).isEqualTo(model.getName()),
                () -> assertThat(modelDto.getAge()).isEqualTo(String.valueOf(model.getAge()))
        );
    }

    @Test
    void givenDto_whenMapDtoToModel_thenReturnsModel() {
        var modelDto = new DummyModelDto();
        modelDto.setId("1");
        modelDto.setName("DUMMY_NAME");
        modelDto.setAge("25");
        var model = mapperWrapper.performMapping(modelDto, DummyModel.class);
        assertAll(
                () -> assertThat(model.getId()).isEqualTo(Integer.valueOf(modelDto.getId())),
                () -> assertThat(model.getName()).isEqualTo(modelDto.getName()),
                () -> assertThat(model.getAge()).isEqualTo(Integer.valueOf(modelDto.getAge()))
        );
    }
}
After that it would be very easy to switch to another mapper library. I could also have created an abstract factory, or used the strategy pattern.
I'm building a Spring Boot application containing more than 10 domain classes which need to be persisted in a SQL database.
The problem is that I need to create an interface for every single domain class, so something like this for each one:
public interface BehandelaarRepo extends CrudRepository<BehandelCentrum, Long> {
}
Is there any way to decrease the number of repositories by using some kind of design pattern or whatever? Any suggestions?
You can actually make it somewhat easier for yourself by using generics the same way Spring Data JPA does:
public interface JpaRepository<T, ID extends Serializable> {
    public <S extends T> S save(S object);
}
The trick is that you can use all subclasses, and you get that same class back as well. I always create one superclass, so I get rid of my ID generic:
@MappedSuperclass
public class JpaObject {

    @Id
    @GeneratedValue
    private Long id;

    // (.... created, last updated, general stuff here....)
}
I create my @Entity classes as subclasses of this JpaObject.
Second step: create my super-interface for future usage of special queries:
@NoRepositoryBean
public interface Dao<T extends JpaObject> extends JpaRepository<T, Long> {
}
Next step: the generic DAO, which admittedly looks a bit silly and remains empty at all times:
@Repository
public interface GenericDao extends Dao<JpaObject> {
}
Now have a sharp look at that save method in CrudRepository / JpaRepository:
public <S extends T> S save(S object);
Any object extending JpaObject (S extends JpaObject) can now be given as a parameter to all methods, and the return type is the same class as your parameter.
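A framework-free sketch of why this generic signature avoids casts (Dog and InMemoryDao are hypothetical, purely to demonstrate the <S extends T> S save(S object) trick):

```java
// Minimal stand-ins for the hierarchy described above
class JpaObject {}
class Dog extends JpaObject {}

interface Dao<T extends JpaObject> {
    // the return type follows the argument's static type, as in CrudRepository
    <S extends T> S save(S object);
}

class InMemoryDao implements Dao<JpaObject> {
    public <S extends JpaObject> S save(S object) {
        // pretend to persist, then hand the same (sub)typed object back
        return object;
    }
}
```

At the call site the subtype survives without any cast: Dog saved = new InMemoryDao().save(new Dog());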
(Aziz, if it's easier, this can also be explained in Dutch :P Greetings from Zwolle)
Well, I had a similar problem. I resolved it by creating a new layer, namely a RepositoryManager (or ModelService) singleton that holds all the repository interfaces and the methods that use them.
If you want, you can implement a generic save method (then call that class ModelService) that resolves the model type through reflection and chooses the corresponding repository.
It was also handy for decoupling the cache implementation (I used Spring Cache).
I would like to create a Spring Data JPA repository with custom behavior, and implement that custom behavior using Specifications. I have gone through the Spring Data JPA documentation for implementing custom behavior in a single repository to set this up, except there is no example of using a Spring Data Specification from within a custom repository. How would one do this, if even possible?
I do not see a way to inject something into the custom implementation that takes a specification. I thought I would be tricky and inject the CRUD repository portion of the repository into the custom portion, but that results in a circular instantiation dependency.
I am not using QueryDSL. Thanks.
I guess the primary source of inspiration could be how SimpleJpaRepository handles specifications. The key spots to have a look at are:
SimpleJpaRepository.getQuery(…) - it basically creates a CriteriaQuery and bootstraps a select using a JPA Root. Whether the latter applies to your use case is up to you. I think the former will definitely apply.
SimpleJpaRepository.applySpecificationToCriteria(…) - it basically uses the artifacts produced in getQuery(…) (i.e. the Root and the CriteriaQuery) and applies the given Specification to exactly these artifacts.
This is not using Specification, so I'm not sure if it's relevant to you, but one way I was able to inject custom behavior is as follows.
The basic structure is:
i. Create a generic interface for the set of entity classes that are modeled after a generic parent entity. Note: this is optional; in my case I had a need for this hierarchy, but it's not necessary.
public interface GenericRepository<T> {
    // add any common methods to your entity hierarchy objects,
    // so that you don't have to repeat them in each of the children entities,
    // since you will be extending from this interface
}
ii. Extend a specific repository from the generic one (step i) and from JpaRepository:
public interface MySpecificEntityRepository extends GenericRepository<MySpecificEntity>, JpaRepository<MySpecificEntity, Long> {
    // add all methods based on column names, entity graphs or JPQL that you would like to
    // have here in addition to what's offered by JpaRepository
}
iii. Use the above repository in your service implementation class.
The service interface may look like this:
public interface GenericService<T extends GenericEntity, ID extends Serializable> {
    // add the specific methods you want to expose to the user
}
The generic implementation class can be as follows,
public abstract class GenericServiceImpl<T extends GenericEntity, J extends JpaRepository<T, Long> & GenericRepository<T>> implements GenericService<T, Long> {

    // constructor takes in the specific repository
    public GenericServiceImpl(J genericRepository) {
        // save this to a local field
    }

    // using the above repository, the specific methods are implemented
}
The specific implementation class can be:
public class MySpecificEntityServiceImpl extends GenericServiceImpl<MySpecificEntity, MySpecificEntityRepository> implements MySpecificEntityService {

    // the specific repository is autowired
    @Autowired
    public MySpecificEntityServiceImpl(MySpecificEntityRepository genericRepository) {
        super(genericRepository);
        this.genericRepository = (MySpecificEntityRepository) genericRepository;
    }
}
I am working on a web service application designed using the Spring framework, where I have different entity classes (ActivityA, ActivityB, ActivityC, ...) which inherit from a base class Activity.
I have written different service APIs in the service layer for the base class and all the child classes (namely ActivityService, ActivityAService, ActivityBService, ActivityCService, ...).
All the methods that operate the same way for each activity are put in the base class's service API (ActivityService), and the rest in their respective services.
I generally know which object I am working on, and I call the respective service API. But in one particular case I have an Activity object (and don't know which child class it is) and have to call a method that is implemented differently for each entity type.
PROBLEM: Is there a way I can call a different service based on the entity object I have? (The object I have is an entity, not a service, and I can't hard-code anything to get the service object.)
But in a particular case I have the activity object (don't know which child class it is) and have to write a method which is different for all entity objects.
Just make the base class abstract and define an abstract method that each sub class has to implement:
public abstract class ActivityService{
public abstract Foo processEntity(Entity entity);
}
PROBLEM: Is there a way I can call a different service based on the entity object I have? (The object I have is an entity, not a service, and I can't hard-code anything.)
This is a situation you should try to avoid. Usually, you should only send an entity to a service that knows what to do with it, and not to a bunch of services, one of which is in charge. What I'd do is use a dispatcher service that keeps a map of the classes each service is in charge of. It uses logic like this:
private Map<Class<? extends Entity>, Class<? extends ActivityService>> serviceMap =
        new ConcurrentHashMap<>();

private ApplicationContext context;

private ActivityService getServiceForEntity(Entity e) {
    Class<? extends Entity> entityClass = e.getClass();
    Class<? extends ActivityService> serviceClass = serviceMap.get(entityClass);
    if (serviceClass == null) {
        for (Map.Entry<Class<? extends Entity>, Class<? extends ActivityService>> entry
                : serviceMap.entrySet()) {
            if (entry.getKey().isAssignableFrom(entityClass)) {
                serviceClass = entry.getValue();
                break;
            }
        }
        if (serviceClass == null) {
            // throw a suitable exception
        }
        serviceMap.put(entityClass, serviceClass);
    }
    return context.getBean(serviceClass);
}
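The lookup above can be exercised without Spring by mapping to service names instead of bean classes; ActivityA, SpecialActivityA and ServiceDispatcher below are hypothetical. The point is the isAssignableFrom fallback: an unregistered subclass resolves to its nearest registered ancestor's service, and the result is then cached for subsequent calls:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical entity hierarchy
class Entity {}
class ActivityA extends Entity {}
class SpecialActivityA extends ActivityA {} // never registered directly

class ServiceDispatcher {
    // maps entity class -> service name (a real dispatcher would map to bean classes)
    private final Map<Class<?>, String> serviceMap = new ConcurrentHashMap<>();

    void register(Class<?> entityType, String serviceName) {
        serviceMap.put(entityType, serviceName);
    }

    String resolve(Entity e) {
        Class<?> entityClass = e.getClass();
        String service = serviceMap.get(entityClass);
        if (service == null) {
            // walk the registrations looking for a superclass match
            for (Map.Entry<Class<?>, String> entry : serviceMap.entrySet()) {
                if (entry.getKey().isAssignableFrom(entityClass)) {
                    service = entry.getValue();
                    break;
                }
            }
            if (service == null) {
                throw new IllegalArgumentException("No service for " + entityClass);
            }
            serviceMap.put(entityClass, service); // cache so the walk runs only once
        }
        return service;
    }
}
```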