In my Spring Boot microservice I have the following DTO and entity:
@Entity
class Order {

    @Id
    private Long id;

    @OneToMany
    private List<Product> products;

    // getters and setters

    public Product getMainProduct() {
        final String someBusinessValue = "A";
        return products.stream()
                .filter(product -> someBusinessValue.equals(product.getSomeBusinessValue()))
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("No product found with someBusinessValue 'A'"));
    }
}

class OrderRequest {

    private List<ProductDto> productDtos;

    // getters and setters

    public ProductDto getMainProductDto() {
        final String someBusinessValue = "A";
        return productDtos.stream()
                .filter(productDto -> someBusinessValue.equals(productDto.getSomeBusinessValue()))
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("No productDto found with someBusinessValue 'A'"));
    }
}
As you can see, both the entity and the DTO contain a small business-logic method that picks the "main" product out of the list of all products. This "main" product is needed in many parts of the code (only in the service layer). After adding these methods I received the following review comment:
Design-wise you made this method (in the DTO) and the one in the DB entity tightly coupled through all layers. The method is used in services only, which means the general architectural rules for layers must apply, in this particular case the "separation of concerns" rule. Essentially, the service layer does not need to know about the controller's request parameter, and the controller shouldn't be coupled to what's returned from the service layer; otherwise tight coupling occurs. Here's a schematic example of how it should look:
class RestController {

    @Autowired
    private ItemService itemService;

    public CreateItemResponse createItem(CreateItemRequest request) {
        CreateItemDTO createItemDTO = toCreateItemDTO(request);
        ItemDTO itemDTO = itemService.createItem(createItemDTO);
        return toResponse(itemDTO);
    }
}
In effect, the proposal is to create another intermediate DTO (OrderDto) that additionally contains a main-product field:
class OrderDto {

    private List<ProductDto> productDtos;
    private ProductDto mainProductDto;

    // getters and setters
}
The OrderRequest would be converted into this OrderDto in the controller before being passed to the service layer, and the Order entity would be converted to it in the service layer itself before being worked with. The method for obtaining the main product would then live in a mapper. Schematically it looks like this:
---OrderRequest---> Controller (convert OrderRequest to OrderDto)
---OrderDto---> Service (convert Order entity to OrderDto) <---OrderEntity--- JpaRepository
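For illustration, a minimal sketch of what such a mapper might look like (OrderMapper, toDto and findMain are my own placeholder names, not part of the reviewer's proposal):
import java.util.List;

// Hypothetical mapper sketch: the "main product" rule lives here instead of in the entity/DTO.
public class OrderMapper {

    private static final String MAIN_PRODUCT_VALUE = "A";

    public OrderDto toDto(OrderRequest request) {
        OrderDto dto = new OrderDto();
        dto.setProductDtos(request.getProductDtos());
        dto.setMainProductDto(findMain(request.getProductDtos()));
        return dto;
    }

    private ProductDto findMain(List<ProductDto> productDtos) {
        return productDtos.stream()
                .filter(p -> MAIN_PRODUCT_VALUE.equals(p.getSomeBusinessValue()))
                .findFirst()
                .orElseThrow(() -> new IllegalStateException(
                        "No product found with someBusinessValue 'A'"));
    }
}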
In practice this leads to a lot of additional conversions inside the service and some awkward work. For example, when updating an entity you now have two intermediate DTOs: one coming from the controller with the update data, and another produced in the service layer from the entity fetched from the repository. You have to copy the state of one into the other, then map the result back onto the entity and perform the update.
In addition, refactoring the entire microservice this way would take a lot of time.
So the question is: is it a bad approach to have such methods in the entity and the incoming DTO, given that they are used only in the service layer, and is there another way to refactor this?
If you need any clarification, please let me know. Thank you in advance!
Related
Hello everyone, I'm new to the Spring world. I want to know how to use a converter to update an object instead of updating each field one by one with getters and setters. Right now in my controller I have:
@PostMapping("/edit/{userId}")
public Customer updateCustomer(@RequestBody Customer newCustomer, @PathVariable final String userId) {
    return customerService.update(userId, newCustomer);
}
and this is how I'm updating the customer object:
@Override
public Customer update(String id, Customer newCustomer) {
    Customer customer = customerRepository.findById(id).get();
    customer.setFirstName(newCustomer.getFirstName());
    customer.setLastName(newCustomer.getLastName());
    customer.setEmail(newCustomer.getEmail());
    return customerRepository.save(customer);
}
Instead of using set and get each time, I want to use a converter.
The approach of passing the entity's id as a path variable when updating isn't really right. Think about it: you already have a @RequestBody, so why not include the id inside that body too? Why specify a path variable for it?
Now, if you have the full Customer with its id coming from the body, you don't have to make any call to your repository, because Hibernate already puts it into a persistent state based on its id, and a simple
public Customer update(Customer newCustomer) {
    return customerRepository.save(newCustomer);
}
should work.
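With the id travelling in the body, the controller can then drop the path variable. A sketch of what the mapping could look like (the route is just an example, adjust it to your conventions):
@PutMapping("/edit")
public Customer updateCustomer(@RequestBody Customer newCustomer) {
    // the id is inside the body now, so no @PathVariable is needed
    return customerService.update(newCustomer);
}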
Q: What is a persistent state?
A: A persistent entity has been associated with a database table row and is being managed by the currently running Persistence Context. (customerRepository.findById() just asks the DB whether the entity with the specified id exists and puts it into a persistent state. Hibernate manages this whole process as long as you have an @Id-annotated field and it is filled; in other words:
Customer customer = new Customer();
customer.setId(1);
is ALMOST the same thing as:
Customer customer = customerRepository.findById(1).get();
)
TIP: In any case, you shouldn't expose a model (entity) in the controller layer, in case you didn't know. Why? Let's say your Customer model can have multiple permissions. One possible structure could look like this:
@Entity
public class Customer {

    // private fields here

    @OneToMany(mappedBy = "customer" /* , other configs here */)
    private List<Permission> permissions;
}
and
@Entity
public class Permission {

    @Id
    private Long id;
    private String name;
    private String creationDate;

    @ManyToOne /* configs here */
    private Customer customer;
}
You can see there is a cross-reference between the Customer and Permission entities, which will eventually lead to a stack overflow exception when serializing (if this isn't clear, think of a recursive function with no stop condition that calls itself over and over again => stack overflow; the same thing happens here).
What can you do? Create a so-called DTO class that the client receives instead of the model. How do you design this DTO? Think about what the user NEEDS to know.
1) Is "creationDate" from Permission a necessary field for the user? Not really.
2) Is "id" from Permission a necessary field for the user? In some cases yes, in others, not.
A possible CustomerDTO could look like this:
public class CustomerDTO
{
private String firstName;
private String lastName;
private List<String> permissions;
}
Notice that I'm using a List<String> instead of a List<Permission> for the customer's permissions; the strings are in fact the permissions' names.
public CustomerDTO convertModelToDto(Customer customer) {
    // hard way
    CustomerDTO customerDTO = new CustomerDTO();
    customerDTO.setFirstName(customer.getFirstName());
    customerDTO.setLastName(customer.getLastName());
    customerDTO.setPermissions(
            customer.getPermissions()
                    .stream()
                    .map(permission -> permission.getName())
                    .collect(Collectors.toList())
    );

    // easy way => using a ModelMapper
    customerDTO = modelMapper.map(customer, CustomerDTO.class);

    return customerDTO;
}
Use ModelMapper to map one model into another.
First define a function that maps the source data into the target model. Keep it as a utility you can use wherever you want.
public static <T> void merge(T source, T target) {
    ModelMapper modelMapper = new ModelMapper();
    modelMapper.getConfiguration().setMatchingStrategy(MatchingStrategies.STRICT);
    modelMapper.map(source, target);
}
Then use merge for mapping the data:
Customer customer = customerRepository.findById(id).get();
merge(newCustomer, customer);
customerRepository.save(customer);
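Note that with STRICT matching the merge above also copies null fields from the source onto the target. If you only want non-null values from the request to overwrite the entity, ModelMapper can (to my knowledge, in the 2.x versions) be told to skip nulls:
ModelMapper modelMapper = new ModelMapper();
modelMapper.getConfiguration().setMatchingStrategy(MatchingStrategies.STRICT);
modelMapper.getConfiguration().setSkipNullEnabled(true); // null source fields no longer overwrite the target
modelMapper.map(source, target);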
Add the ModelMapper dependency in pom.xml:
<dependency>
<groupId>org.modelmapper</groupId>
<artifactId>modelmapper</artifactId>
<version>2.3.4</version>
</dependency>
I have a User entity and a Role entity. The fields are not important other than the fact that the User entity has a role_id field that corresponds to the id of its respective role. Since Spring Data R2DBC doesn't handle relations between entities, I am turning to the DTO approach. I am very new to R2DBC and reactive programming as a whole, and I cannot for the life of me figure out how to convert the Flux<User> that my repository's findAll() method returns into a Flux<UserDto>. My UserDto class is extremely simple:
@Data
@RequiredArgsConstructor
public class UserDto {
    private final User user;
    private final Role role;
}
Here is the UserMapper class I'm trying to make:
@Service
@RequiredArgsConstructor
public class UserMapper {

    private final RoleRepository roleRepo;

    public Flux<UserDto> map(Flux<User> users) {
        // ???
    }
}
How can I get this mapper to convert a Flux<User> into a Flux<UserDto> containing the user's respective role?
Thanks!
Assuming your RoleRepository has a findById() method or similar to find a Role given its ID, and your user object has a getRoleId(), you can just do it via a standard map call:
return users.map(u -> new UserDto(u, roleRepo.findById(u.getRoleId())));
Or in the case where findById() returns a Mono:
return users.flatMap(u -> roleRepo.findById(u.getRoleId()).map(r -> new UserDto(u, r)));
You may of course want to add additional checks if it's possible that getRoleId() could return null.
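For example, one possible null-safe variant (just a sketch; adapt the fallback to whatever makes sense in your domain):
return users.flatMap(u -> u.getRoleId() == null
        ? Mono.just(new UserDto(u, null))                              // no role assigned: keep the user, role stays null
        : roleRepo.findById(u.getRoleId()).map(r -> new UserDto(u, r)));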
Converting the data from business object to database object:
private static UserDAO covertUserBOToBUserDAO(UserBO userBO) {
    return new UserDAO(userBO.getUserId(), userBO.getName(), userBO.getMobileNumber(),
            userBO.getEmailId(), userBO.getPassword());
}
Converting the data from database object to business object:
private static Mono<UserBO> covertUserDAOToBUserBO(UserDAO userDAO) {
    return Mono.just(new UserBO(userDAO.getUserId(), userDAO.getName(),
            userDAO.getMobileNumber(), userDAO.getEmailId(), userDAO.getPassword()));
}
Now the service's getAllUsers works asynchronously:
public Flux<UserBO> getAllUsers(){
return userRepository.findAll().flatMap(UserService::covertUserDAOToBUserBO);
}
Since flatMap is asynchronous, we get the benefit of asynchronous operation even for converting the object from DAO to BO.
Similarly, for saving data I tried the following:
public Mono<UserBO> saveUser(UserBO userBO) {
    return userRepository.save(covertUserBOToBUserDAO(userBO))
            .flatMap(UserService::covertUserDAOToBUserBO);
}
I'm currently messing around with a Spring Boot REST API project for instructional purposes. I have a rather large table with 22 columns loaded into a MySQL database and am trying to give the user the ability to filter the results by multiple columns (let's say 6 for the purposes of this example).
I am currently extending a Repository and have defined methods such as findByParam1, findByParam2, findByParam1OrderByParam2Desc, etc., and have verified that they work as intended. My question is: what is the best way to let the user leverage all 6 optional request params without writing a ridiculous number of conditionals/repository method variants? For example, I want the user to be able to hit home/get-data/ to get all results, home/get-data?param1=xx to filter on param1, and potentially home/get-data?param1=xx&param2=yy...&param6=zz to filter on all the optional parameters.
For reference, here is what the relevant chunk of my controller looks like (roughly).
@RequestMapping(value = "/get-data", method = RequestMethod.GET)
public List<SomeEntity> getData(@RequestParam Map<String, String> params) {
    String p1 = params.get("param1");
    if (p1 != null) {
        return this.someRepository.findByParam1(p1);
    }
    return this.someRepository.findAll();
}
My issue so far is that, proceeding this way, I will basically need n! methods in my repository to support this functionality, with n being the number of fields/columns I want to filter on. Is there a better way to handle this, perhaps one where I can filter the repository 'in place' as I check the Map to see which filters the user actually populated?
EDIT: I'm currently implementing a 'hacky' solution that might be related to J. West's comment below. I assume the user specifies all n parameters in the request URL, and if they do not (for example, they specify p1-p4 but not p5 and p6) I generate SQL that matches the non-included params against LIKE '%'. It would look something like...
@Query("select u from User u where u.p1 like :p1 and u.p2 like :p2 ... and u.p6 like :p6")
List<User> findWithComplicatedQueryAndSuch(@Param("p1") String p1, @Param("p2") String p2, /* ... */ @Param("p6") String p6);
and in the Controller, I would detect if p5 and p6 were null in the Map and if so, simply change them to the String '%'. I'm sure there is a more precise and intuitive way to do this, although I haven't been able to find anything of the sort yet.
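For illustration, the defaulting in the controller could look roughly like this (a sketch against the Map-based controller above, using the parameter names from my example):
// default any filter the user didn't send to the SQL wildcard so its LIKE clause matches everything
String p5 = params.getOrDefault("param5", "%");
String p6 = params.getOrDefault("param6", "%");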
You can do this easily with a JpaSpecificationExecutor and a custom Specification: https://spring.io/blog/2011/04/26/advanced-spring-data-jpa-specifications-and-querydsl/
I would replace the HashMap with a DTO containing all the optional GET params and then build the Specification based on that DTO; obviously you can also keep the HashMap and build the Specification from it.
Basically:
public class VehicleFilter implements Specification<Vehicle> {

    private String art;
    private String userId;
    private String vehicle;
    private String identifier;

    @Override
    public Predicate toPredicate(Root<Vehicle> root, CriteriaQuery<?> query, CriteriaBuilder cb) {
        ArrayList<Predicate> predicates = new ArrayList<>();
        if (StringUtils.isNotBlank(art)) {
            predicates.add(cb.equal(root.get("art"), art));
        }
        if (StringUtils.isNotBlank(userId)) {
            predicates.add(cb.equal(root.get("userId"), userId));
        }
        if (StringUtils.isNotBlank(vehicle)) {
            predicates.add(cb.equal(root.get("vehicle"), vehicle));
        }
        if (StringUtils.isNotBlank(identifier)) {
            predicates.add(cb.equal(root.get("identifier"), identifier));
        }
        return predicates.isEmpty() ? null : cb.and(predicates.toArray(new Predicate[predicates.size()]));
    }

    // getters & setters
}
And the controller:
@RequestMapping(value = "/{ticket}/count", method = RequestMethod.GET)
public long getItemsCount(
        @PathVariable String ticket,
        VehicleFilter filter,
        HttpServletRequest request
) throws Exception {
    return vehicleService.getCount(filter);
}
Service:
@Override
public long getCount(VehicleFilter filter) {
    return vehicleRepository.count(filter);
}
Repository:
@Repository
public interface VehicleRepository extends JpaRepository<Vehicle, Integer>, JpaSpecificationExecutor<Vehicle> {
}
Just a quick example adapted from company code, you get the idea!
Another solution with less coding would be to use QueryDsl integration with Spring MVC.
By using this approach all your request parameters will be automatically resolved to one of your domain properties and appended to your query.
For reference check the documentation https://spring.io/blog/2015/09/04/what-s-new-in-spring-data-release-gosling#querydsl-web-support and the example project https://github.com/spring-projects/spring-data-examples/tree/master/web/querydsl
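A rough sketch of what that can look like (my own sketch, assuming Querydsl is on the classpath and a Q-class is generated for the User entity; in older Spring Data versions the executor interface is named QueryDslPredicateExecutor):
import com.querydsl.core.types.Predicate;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;
import org.springframework.data.querydsl.binding.QuerydslPredicate;

// repository gains Querydsl support
public interface UserRepository extends JpaRepository<User, Long>, QuerydslPredicateExecutor<User> {
}

// controller: query-string parameters whose names match entity properties are bound to a Predicate automatically
@GetMapping("/get-data")
public Iterable<User> getData(@QuerydslPredicate(root = User.class) Predicate predicate) {
    return this.userRepository.findAll(predicate);
}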
You can do this even more easily using the Query by Example (QBE) technique if your repository extends the JpaRepository interface, since JpaRepository extends the QueryByExampleExecutor interface, which provides a findAll method that takes an Example<T> as an argument.
This approach fits your scenario well: your entity has a lot of fields, and you want the user to fetch the entries matching a filter expressed as a subset of the entity's fields together with the values that have to match.
Let's say the entity is User (as in your example) and you want an endpoint that fetches the users whose attribute values equal the ones specified. That can be accomplished with the following code:
Entity class:
@Entity
public class User implements Serializable {

    @Id
    private Long id;
    private String firstName;
    private String lastName;
    private Integer age;
    private String city;
    private String state;
    private String zipCode;
}
Controller class:
@RestController
public class UserController {

    private final UserRepository repository;

    public UserController(UserRepository repository) {
        this.repository = repository;
    }

    @GetMapping
    public List<User> getMatchingUsers(@RequestBody User userFilter) {
        return repository.findAll(Example.of(userFilter));
    }
}
Repository class:
@Repository
public interface UserRepository extends JpaRepository<User, Long> {
}
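By default, null fields of the probe object are ignored and strings must match exactly. If you also need partial or case-insensitive matching, you can pass an ExampleMatcher (a sketch, to be dropped into the controller method above):
ExampleMatcher matcher = ExampleMatcher.matching()
        .withIgnoreCase()
        .withStringMatcher(ExampleMatcher.StringMatcher.CONTAINING); // substring match on String fields
return repository.findAll(Example.of(userFilter, matcher));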
I'm currently working on a Spring Boot API that interfaces with a MongoRepository, but I'm having trouble understanding how the JSON being passed becomes a Document for storage in Mongo. I currently have a simple API that stores groups of users:
@Document
@JsonInclude
public class Group {

    @Id
    @JsonView(Views.Public.class)
    private String id;

    @JsonView(Views.Public.class)
    private String name;

    @JsonView(Views.Public.class)
    private Set<GroupMember> groupMembers = new HashSet<>();
}
There are also setter and getter methods for each of the fields, although I don't know how necessary those are either (see questions at the end).
Here is the straightforward component I'm using:
@Component
@Path("/groups")
@Api(value = "/groups", description = "Group REST")
public class Groups {

    @Autowired
    private GroupService groupService;

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    @ApiOperation(value = "Get all Groups", response = Group.class, responseContainer = "List")
    @JsonView(Views.Public.class)
    public List<Group> getAllGroups() {
        return groupService.getAllGroups();
    }

    @POST
    @Produces(MediaType.APPLICATION_JSON)
    @Consumes(MediaType.APPLICATION_JSON)
    @ApiOperation(value = "Create a Group", response = Group.class)
    @JsonView(Views.Detailed.class)
    public Group submitGroup(Group group) {
        return groupService.addGroup(group);
    }
}
Finally, I have a Service class:
@Service
public class GroupServiceImpl implements GroupService {

    @Autowired
    private GroupRepository groupRepository;

    @Override
    public Group addGroup(Group group) {
        group.setId(null);
        return groupRepository.save(group);
    }

    @Override
    public List<Group> getAllGroups() {
        return groupRepository.findAll();
    }
}
GroupRepository is simply an interface that extends MongoRepository<Group, String>.
Now, when I actually make a call to the POST method, with a body containing:
{
"name": "group001",
"groupMembers": []
}
I see that it properly inserts this group with a generated Mongo id. However, if I try to insert GroupMember objects inside the list, I receive a null pointer exception. From this, I have a few questions:
How does Spring Boot (Jackson?) know which fields to deserialize from the JSON being passed? I tested this after deleting the getter and setter methods, and it still works.
How does Spring Boot handle nested objects, such as the Set inside the class? I tested with List instead of Set and it worked, but I have no idea why. My guess is that for each object that is both declared in my class and listed in my JSON object, Spring Boot calls a constructor it magically created behind the scenes, and one doesn't exist for the Set interface.
Suppose I'm adamant about using a Set (the same user shouldn't show up twice anyway). What tools can I use to get Spring Boot to work as expected?
It seems to me that a lot of the things that happen in Spring are very behind-the-scenes, which makes it difficult for me to understand why things work when they do. Not knowing why things work makes it difficult to construct things from scratch, which makes it feel as though I'm hacking together a project rather than actually engineering one. So my last question is something like, is there a guide that explains the wiring behind the scenes?
Finally, this is my first time working with Spring... so please excuse me if my questions are entirely off the mark, but I would appreciate any answers nonetheless.
I'm trying to use the DomainClassConverter from Spring Data to load entities in a controller and then use these entities in a service.
The problem is that I get a LazyInitializationException when I access a lazy-loaded collection from my service.
Adding the @Transactional annotation to the controller does not help; I guess the conversion happens before the controller transaction starts.
Is there a way to use this converter in this kind of use case? Can I reattach the entity to the current session somehow?
My controller:
@RestController
@RequestMapping("/api/quotation-requests/{id}/quotation")
public class QuotationResource {

    @RequestMapping(value = "/lines", method = RequestMethod.POST, params = "freeEntry")
    @Timed
    public ResponseEntity<PricingLineDTO> addFreeEntryLine(@PathVariable("id") Quotation quotation,
                                                           @RequestBody PricingLineDTO pricingLineTo) {
        PricingLine pricingLine = conversionService.convert(pricingLineTo, PricingLine.class);
        pricingLine = quotationService.addFreeLineToQuotation(quotation, pricingLine);
        return new ResponseEntity<>(conversionService.convert(pricingLine, PricingLineDTO.class), HttpStatus.OK);
    }
}
The service:
@Service
@Transactional
public class QuotationService {

    public PricingLine addFreeLineToQuotation(Quotation quotation, PricingLine pricingLine) {
        // the next line throws: org.hibernate.LazyInitializationException: failed to lazily initialize a collection
        // of role: x.y.z.Quotation.pricingLines, could not initialize proxy
        quotation.getPricingLines().add(pricingLine);
        return pricingLine;
    }
}
And the entity
@Entity
public class Quotation {

    @OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private List<PricingLine> pricingLines = new ArrayList<>();
}
If it is not possible, what is my best alternative?
make the controller transactional, inject data repositories into it, and still offer a service API that takes entity parameters instead of IDs.
Cons: the controller becomes transactional, which is commonly considered bad practice, and it introduces boilerplate code.
make the service API take IDs as parameters, keeping the controller out of the transaction.
Cons: the API becomes harder to read and can be error-prone, since every entity is referred to as a plain "Long", especially when a service method needs several entities as input. For example:
void addUserToGroup(Long userId, Long groupId)
One could easily swap the parameters by mistake. (A fuller sketch of this alternative is shown below.)
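For completeness, alternative 2 applied to the quotation example would look roughly like this (just a sketch; QuotationRepository and the not-found handling are my own assumptions):
@Service
@Transactional
public class QuotationService {

    @Autowired
    private QuotationRepository quotationRepository; // assumed repository name

    public PricingLine addFreeLineToQuotation(Long quotationId, PricingLine pricingLine) {
        // the entity is loaded inside the service transaction, so its lazy collection can be initialized safely
        Quotation quotation = quotationRepository.findById(quotationId)
                .orElseThrow(() -> new IllegalArgumentException("Quotation not found: " + quotationId));
        quotation.getPricingLines().add(pricingLine);
        return pricingLine;
    }
}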