I created a couple of DTOs and a MapStruct interface for getting the User data:
public class UserDto {
private Long id;
private CountryDto country;
}
public class CountryDto {
private Long id;
private String name;
private List<TimeZoneDto> timeZones = new ArrayList<TimeZoneDto>();
}
@Mapper
public interface UserMapper {
UserMapper INSTANCE = Mappers.getMapper(UserMapper.class);
// TODO exclude country timezones
UserDto mapToDto(User entity);
}
I would like to modify UserMapper so that the CountryDto timeZones list is excluded, producing JSON like this:
{
"id":1,
"country": {
"id": 182,
"name":"Australia"
}
}
I finally found a solution for this: adding the line below in UserMapper did the trick:
@Mapping(target = "country.timeZones", ignore = true)
UserDto mapToDto(User entity);
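For reference, the whole mapper with that nested-target ignore in place looks roughly like this (recent MapStruct versions support the nested "country.timeZones" target path):
import org.mapstruct.Mapper;
import org.mapstruct.Mapping;
import org.mapstruct.factory.Mappers;

@Mapper
public interface UserMapper {

    UserMapper INSTANCE = Mappers.getMapper(UserMapper.class);

    // leave the nested CountryDto.timeZones list unmapped
    @Mapping(target = "country.timeZones", ignore = true)
    UserDto mapToDto(User entity);
}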
I had a class like:
public class EmailAddress {
public String value;
public String tld() {...}
public String host() {...}
public String mailbox() {...}
}
Now I use this class in an Object / Entity:
@Entity
public class Customer {
public String name;
public EmailAddress mail;
}
Now, when I expose a REST service for Customer, I get this format:
{
"id": 1,
"name": "Test",
"email": {
"value": "test@test.de"
}
}
But I only want "email": "test@test.de", like this:
{
"id": 1,
"name": "Test",
"email": "test@test.de"
}
What must I do? I am using Spring Boot and Hibernate entities.
Thank you for any support.
You should use a DTO class in request handling and map between the DTO and the entity, e.g.:
public class CustomerDTO {
private Integer id;
private String name;
private String email;
}
You should use Data Transfer Objects (DTOs) for your (REST) APIs.
The DTOs only contain the fields the interface should provide (or receive).
When receiving objects from the client, and before returning objects from your controller, you can convert between the DTOs and your domain model (which could be your JPA entity classes).
Example of a controller method. We assume you get an object from a user editor that contains all the data you want to update in your database objects, and you return the updated customer DTO:
@PutMapping
public CustomerDto updateCustomer(@RequestBody CustomerEditorDto updatedCustomerDto) {
Customer updatedCustomer = CustomerConverter.convert(updatedCustomerDto);
updatedCustomer = customerService.updateCustomer(updatedCustomer);
return CustomerConverter.convert(updatedCustomer);
}
and your Converter class:
@NoArgsConstructor(access = AccessLevel.PRIVATE)
public class CustomerConverter {
public static CustomerDto convert(Customer customer) {
CustomerDto result = null;
if (customer != null) {
// TODO: set fields in result-dto
}
return result;
}
public static Customer convert(CustomerEditorDto customer) {
Customer result = null;
if (customer != null) {
// TODO set fields in result;
}
return result;
}
}
and here are the DTOs
@Getter
@Setter
public class CustomerDto {
private Integer id;
private String name;
private String email;
}
@Getter
@Setter
public class CustomerEditorDto {
private Integer id;
private String firstName;
private String lastName;
private String email;
private String otherPropertyOrStuff;
}
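The TODO bodies in CustomerConverter above could be filled in roughly like this for the entity-to-DTO direction. This is only a sketch: it relies on the public name and mail fields shown in the question's Customer entity, and the id mapping is left out because the entity doesn't expose one.
public static CustomerDto convert(Customer customer) {
    CustomerDto result = null;
    if (customer != null) {
        result = new CustomerDto();
        result.setName(customer.name);
        // flatten the EmailAddress value object into a plain string
        result.setEmail(customer.mail != null ? customer.mail.value : null);
        // result.setId(...) would follow the same pattern once the entity exposes an id
    }
    return result;
}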
This way you can separate the API model from your JPA entities. You can use the same models for input and output. You can even use a different model to work with inside your services and then finally convert it into your JPA entities before persisting the data (or after reading it).
There are tools which can take care of the conversion, like MapStruct.
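For example, a minimal MapStruct sketch for this case might look like the following; the source = "mail.value" path assumes the entity field is called mail, as in the question, and componentModel = "spring" simply makes the mapper injectable:
import org.mapstruct.Mapper;
import org.mapstruct.Mapping;

@Mapper(componentModel = "spring")
public interface CustomerMapper {

    // flattens the nested EmailAddress into the plain email string of the DTO
    @Mapping(source = "mail.value", target = "email")
    CustomerDto toDto(Customer customer);
}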
* The above annotations @Getter, @Setter, ... are from Project Lombok and are very handy for generating boilerplate code automatically.
I found another, easier solution: use a JsonSerializer on the entity property:
@JsonSerialize(using = EmailAddressSerializer.class)
private EmailAddress email;
The serializer class:
public class EmailAddressSerializer extends StdSerializer<EmailAddress> {
public EmailAddressSerializer() {
super(EmailAddress.class);
}
protected EmailAddressSerializer(Class<EmailAddress> t) {
super(t);
}
@Override
public void serialize(EmailAddress email,
JsonGenerator jsonGenerator,
SerializerProvider serializerProvider) throws IOException {
jsonGenerator.writeString(email.value);
}
}
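If the Customer should also be readable from JSON in the same flat form (e.g. for POST requests), a matching deserializer can be registered the same way. This is only a sketch; the EmailAddressDeserializer name is made up here, and it assumes EmailAddress has a no-args constructor, which the question does not show:
@JsonDeserialize(using = EmailAddressDeserializer.class)
private EmailAddress email;

public class EmailAddressDeserializer extends StdDeserializer<EmailAddress> {

    public EmailAddressDeserializer() {
        super(EmailAddress.class);
    }

    @Override
    public EmailAddress deserialize(JsonParser parser, DeserializationContext context) throws IOException {
        // assumes a no-args constructor on EmailAddress
        EmailAddress address = new EmailAddress();
        address.value = parser.getValueAsString();
        return address;
    }
}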
I'm using the Java Validation API to validate fields in my Note class:
@Entity
@Data
@NoArgsConstructor
@AllArgsConstructor
@Table(name = "note")
public class Note {
@Id
@Column(name = "id", nullable = false)
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@Column(name = "date", columnDefinition = "DATE")
private LocalDate date;
@NotBlank(message = "Enter a topic")
@Column(name = "topic")
private String topic;
@NotBlank(message = "Content can't be empty")
@Column(name = "content")
private String content;
@Column(name = "type")
private NoteType noteType;
@NotNull
@ManyToOne(fetch = FetchType.LAZY, cascade = {CascadeType.DETACH, CascadeType.MERGE, CascadeType.PERSIST, CascadeType.REFRESH})
@JoinColumn(name = "user_id")
@JsonIgnore
private User user;
}
NoteService:
@Service
@AllArgsConstructor
public class NoteService {
@Autowired
private NoteRepository noteRepository;
@Autowired
private UserRepository userRepository;
public void addNote(@Valid Note note) {
note.setUser(getLoggedInUser());
if (validateNote(note)) {
noteRepository.save(note);
}
}
public List<Note> getNotes() {
return getLoggedInUser().getNotes();
}
public Note editNote(Note newNote, Long id) {
noteRepository.editNoteById(newNote, id);
return newNote;
}
public List<Note> getNotesByTopic(String topic) {
List<Note> notes = noteRepository.getNotesByTopicAndUser(topic, getLoggedInUser());
return notes;
}
public boolean validateNote(Note note) {
return validateNoteType(note.getNoteType())
&& note.getDate() != null;
}
public boolean validateNoteType(NoteType type) {
return type.equals(NoteType.NOTE)
|| type.equals(NoteType.SKILL);
}
public User getLoggedInUser() {
return userRepository.findByEmail(SecurityContextHolder.getContext().getAuthentication().getName());
}
}
Test:
@ExtendWith(MockitoExtension.class)
@ExtendWith(SpringExtension.class)
class NoteServiceTest {
@Mock
private NoteRepository noteRepositoryMock;
@Mock
private UserRepository userRepositoryMock;
@Mock
SecurityContext mockSecurityContext;
@Mock
Authentication authentication;
private NoteService noteService;
@BeforeEach
void setUp() {
noteService = new NoteService(noteRepositoryMock, userRepositoryMock);
Mockito.when(mockSecurityContext.getAuthentication()).thenReturn(authentication);
SecurityContextHolder.setContext(mockSecurityContext);
}
@Test
void shouldAddNote() {
LocalDate date = LocalDate.now();
Note note = new Note(0L, date, "test", "", NoteType.NOTE, null);
noteService.addNote(note);
Mockito.verify(noteRepositoryMock).save(note);
}
}
The field user in the Note class is annotated with @NotNull, and I'm passing a null user to this note, but the note is still getting saved. The same thing happens when I pass an empty string. Any idea why that is happening? I'm new to unit testing.
"I'm new to unit testing" - your perfectly valid question has nothing to do with unit testing.
@NotNull does nothing on its own. It's actually a contract stating the following:
A data member (or anything else annotated with @NotNull, like local variables and parameters) should not be null.
For example, instead of this:
/**
* @param obj should not be null
*/
public void MyShinyMethod(Object obj)
{
// Some code goes here.
}
You can write this:
public void MyShinyMethod(@NotNull Object obj)
{
// Some code goes here.
}
P.S.
It is usually appropriate to use some kind of annotation processor at compile time, or something that processes the annotation at runtime. I don't really know much about annotation processing, but I am sure Google knows :-)
You need to activate validation on your service class with the @Validated annotation so that parameter validation kicks in.
@Service
@AllArgsConstructor
@Validated
public class NoteService {
...
See "Spring @Validated in service layer" and "Spring Boot: How to test a service in JUnit with @Validated annotation?" for more details.
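For the test in the original question this means a plain Mockito test will not trigger the @Validated proxy; a Spring-backed test is needed. A rough sketch (it assumes a Spring Boot context can start with your security setup; the exception class lives in javax.validation or jakarta.validation depending on your Boot version):
@SpringBootTest
class NoteServiceValidationTest {

    @MockBean
    private NoteRepository noteRepositoryMock;

    @MockBean
    private UserRepository userRepositoryMock;

    @Autowired
    private NoteService noteService;

    @Test
    void shouldRejectInvalidNote() {
        Note note = new Note(0L, LocalDate.now(), "test", "", NoteType.NOTE, null);

        // the @Validated proxy rejects the invalid note before the method body runs
        assertThrows(ConstraintViolationException.class, () -> noteService.addNote(note));
        Mockito.verify(noteRepositoryMock, Mockito.never()).save(note);
    }
}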
If for some reason you need to manually perform the validation you can always do something like this:
@Component
public class MyValidationImpl {
private final LocalValidatorFactoryBean validator;
public MyValidationImpl (LocalValidatorFactoryBean validator) {
this.validator = validator;
}
public void validate(Object o) {
Set<ConstraintViolation<Object>> set = validator.validate(o);
if (!set.isEmpty()) {
throw new IllegalArgumentException(
set.stream().map(x -> String.join(" ", x.getPropertyPath().toString(), x.getMessage())).collect(
Collectors.joining("\n\t")));
}
}
}
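A possible call site, sketched under the assumption that this validator bean is injected into the service alongside the repositories (which also changes the service constructor):
@Service
@AllArgsConstructor
public class NoteService {

    private final NoteRepository noteRepository;
    private final UserRepository userRepository;
    private final MyValidationImpl myValidation;

    public void addNote(Note note) {
        note.setUser(getLoggedInUser());
        myValidation.validate(note); // throws IllegalArgumentException on any violation
        if (validateNote(note)) {
            noteRepository.save(note);
        }
    }

    // ... rest of the service unchanged
}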
Your noteRepository is mocked, so it's not actually calling save on a real repository.
Mockito.verify(noteRepositoryMock).save(note);
All you are verifying here is that a call to save is made, not that it was successful.
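If you want the unit test itself to prove the note is invalid, you can run Bean Validation directly, without any Spring context. A small sketch (the Validation, Validator and ConstraintViolation types come from javax.validation or jakarta.validation depending on your Boot version):
@Test
void shouldReportConstraintViolations() {
    Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
    Note note = new Note(0L, LocalDate.now(), "test", "", NoteType.NOTE, null);

    Set<ConstraintViolation<Note>> violations = validator.validate(note);

    // @NotBlank on content and @NotNull on user both show up here
    assertFalse(violations.isEmpty());
}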
I want to use DTOs to communicate with Angular, but it doesn't work. I want to create a POST request to add data from my application to the database using the DTO model.
You can see my errors in the picture:
My class Customer:
@Entity
@Table(name = "customer")
public class Customer {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private int id;
@Column(name = "name")
private String name;
@OneToMany
private List<Ticket> ticket;
...
Class CustomerDto:
public class CustomerDto {
private String name;
private List<TicketDto> ticket;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public List<TicketDto> getTicket() {
return ticket;
}
public void setTicket(List<TicketDto> ticket) {
this.ticket = ticket;
}
}
Class CustomerController:
@Autowired
CustomerService customerService;
@PostMapping(value = "/customers/create")
public Customer postCustomer(@RequestBody CustomerDto customerDto, List<TicketDto> ticketDtos) {
//ArrayList<TicketDto> tickets = new ArrayList<>();
ticketDtos.add(customerDto.getName());
ticketDtos.add(customerDto.getTicket());
Customer _customer = customerService.save(new Customer(customerDto.getName(), ticketDtos ));
return _customer;
}
CustomerService:
public interface CustomerService {
void save(CustomerDto customerDto, List<TicketDto> ticketDtos);
}
CustomerServiceImpl:
@Service
public class CustomerServiceImpl implements CustomerService {
@Autowired
CustomerRepository repository;
@Override
public void save(CustomerDto customerDto, List<TicketDto> ticketDtos) {
Customer customer = new Customer();
customer.setName(customerDto.getName());
customer.setTicket(customerDto.getTicket());
List<Ticket> tickets = new ArrayList<>();
for (TicketDto ticketDto : ticketDtos) {
Ticket ticket = new Ticket();
ticket.setDestinationCity(ticketDto.getDepartureCity());
ticket.setDestinationCity(ticketDto.getDestinationCity());
tickets.add(ticket);
}
}
Since your CustomerServiceImpl takes a CustomerDto (which already carries the ticket DTOs), you need to change the method in your controller as below:
Class CustomerController:
@Autowired
CustomerService customerService;
@PostMapping(value = "/customers/create")
public Customer postCustomer(@RequestBody CustomerDto customerDto) {
Customer _customer = customerService.save(customerDto);
return _customer;
}
And update CustomerServiceImpl as:
@Service
public class CustomerServiceImpl implements CustomerService {
@Autowired
CustomerRepository repository;
// change save to return the saved customer
@Override
public Customer save(CustomerDto customerDto) {
Customer customer = new Customer();
customer.setName(customerDto.getName());
// customer.setTicket(customerDto.getTicket()); // remove this
List<Ticket> tickets = new ArrayList<>();
for (TicketDto ticketDto : customerDto.getTicket()) {
Ticket ticket = new Ticket();
ticket.setDepartureCity(ticketDto.getDepartureCity()); // assuming Ticket has a matching departureCity setter
ticket.setDestinationCity(ticketDto.getDestinationCity());
tickets.add(ticket);
}
customer.setTicket(tickets); // add this to set the tickets on the customer
return repository.save(customer);
}
Obviously, you need to change your interface as well:
public interface CustomerService {
Customer save(CustomerDto customerDto);
}
For entity-DTO conversion, you can use the ModelMapper or MapStruct library.
With the help of these libraries, you can easily convert from DTO to entity and from entity to DTO. After adding either dependency, you can use it right away.
Here is how to use it:
Define a ModelMapper bean in your Spring configuration:
@Bean
public ModelMapper modelMapper() {
return new ModelMapper();
}
Suppose we need to convert a List<TicketDto> to a List<Ticket>; we can do it simply like this:
List<TicketDto> ticketDtos = .... // suppose it is holding some data
List<Ticket> tickets = ticketDtos.stream()
.map(tkt -> modelMapper.map(tkt, Ticket.class))
.collect(Collectors.toList());
It is very simple to use: modelMapper.map(sourceObject, DestinationClass.class).
I used Java 8 code here, but you can use any approach. I hope it helps.
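Putting it together with the controller from the accepted answer, the service could delegate the whole conversion to the injected ModelMapper. This is a sketch; it relies on matching property names so ModelMapper can map the nested ticket list implicitly:
@Service
public class CustomerServiceImpl implements CustomerService {

    @Autowired
    private CustomerRepository repository;

    @Autowired
    private ModelMapper modelMapper;

    @Override
    public Customer save(CustomerDto customerDto) {
        // maps the name and the ticket list in one go, based on matching property names
        Customer customer = modelMapper.map(customerDto, Customer.class);
        return repository.save(customer);
    }
}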
I have a parent class, User.java, and two classes, FacebookUser.java and TwitterUser.java. They are entities returned depending on the type column in the database, using a DiscriminatorColumn. I want to write a correct mapper to map a User that could be an instance of FacebookUser or TwitterUser. I have the following mapper, which does not seem to work as intended; it only maps the User parent, not the children:
@Mapper
public interface UserMapper {
public static UserMapper INSTANCE = Mappers.getMapper(UserMapper.class);
User map(UserDTO userDTO);
@InheritInverseConfiguration
UserDTO map(User user);
List<UserDTO> map(List<User> users);
FacebookUser map(FacebookUserDTO userDTO);
@InheritInverseConfiguration
FacebookUserDTO map(FacebookUser user);
TwitterUser map(TwitterUserDTO userDTO);
@InheritInverseConfiguration
TwitterUserDTO map(TwitterUser user);
}
Then I use:
UserDTO userDto = UserMapper.INSTANCE.map(user);
Classes to map:
@Entity
@Table(name = "users")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(name = "type", discriminatorType = DiscriminatorType.STRING, length = 10)
@DiscriminatorValue(value = "Local")
public class User {
@Column
private String firstName;
@Column
private String lastName;
///... setters and getters
}
@Entity
@DiscriminatorValue(value = "Facebook")
public class FacebookUser extends User {
@Column
private String userId;
///... setters and getters
}
@Entity
@DiscriminatorValue(value = "Twitter")
public class TwitterUser extends User {
@Column
private String screenName;
///... setters and getters
}
The DTOs:
public class UserDTO {
private String firstName;
private String lastName;
///... setters and getters
}
public class FacebookUserDTO extends UserDTO {
private String userId;
///... setters and getters
}
public class TwitterUserDTO extends UserDTO {
private String screenName;
///... setters and getters
}
Also, this happens if I have a list of users mixed with Facebook users, Twitter users, and basic users.
Let's say I have the following users:
User user = new User ("firstName","lastName");
User fbUser = new FacebookUser ("firstName","lastName","userId");
User twUser = new TwitterUser ("firstName","lastName","screenName");
List<User> users = new ArrayList<>();
users.add(user);
users.add(fbUser);
users.add(twUser);
//Then:
List<UserDTO> dtos = UserMapper.INSTANCE.map(users);
I get only firstName and lastName but not screenName or userId.
Any solution for this?
Currently, it seems this is not yet available as a MapStruct feature: Support for Type-Refinement mapping (or Downcast Mapping).
I asked the question in their Google group: https://groups.google.com/forum/?fromgroups#!topic/mapstruct-users/PqB-g1SBTPg
and found that I need to do manual mapping using a default method inside the interface (for Java 8+).
I also ran into another issue with mapping the parent, which made this almost not applicable, so I wrote one more empty class called LocalUserDTO that is a child of the parent class.
So the code becomes like the following:
@Mapper
public interface UserMapper {
public static UserMapper INSTANCE = Mappers.getMapper(UserMapper.class);
LocalUser map(LocalUserDTO userDTO);
@InheritInverseConfiguration
LocalUserDTO map(LocalUser user);
List<UserDTO> map(List<User> users);
FacebookUser map(FacebookUserDTO userDTO);
@InheritInverseConfiguration
FacebookUserDTO map(FacebookUser user);
TwitterUser map(TwitterUserDTO userDTO);
@InheritInverseConfiguration
TwitterUserDTO map(TwitterUser user);
default UserDTO map(User user) {
if (user instanceof FacebookUser) {
return this.map((FacebookUser) user);
} else if (user instanceof TwitterUser) {
return this.map((TwitterUser) user);
} else {
return this.map((LocalUser) user);
}
}
@InheritInverseConfiguration
default User map(UserDTO userDTO) {
if (userDTO instanceof FacebookUserDTO) {
return this.map((FacebookUserDTO) userDTO);
} else if (userDTO instanceof TwitterUserDTO) {
return this.map((TwitterUserDTO) userDTO);
} else {
return this.map((LocalUserDTO) userDTO);
}
}
}
I was googling about this exact issue yesterday but couldn't find anything on the internet, so let me leave this here.
As the issue mentioned in the former answer was solved earlier this year, MapStruct now officially supports downcasting via @SubclassMapping.
@Mapper
public interface UserMapper {
@Named("UserToDTO")
@SubclassMapping(source = FacebookUser.class, target = FacebookUserDTO.class)
@SubclassMapping(source = TwitterUser.class, target = TwitterUserDTO.class)
UserDTO toDTO(User source);
}
However, you can't put a parameterized type, i.e. List<User>, in that annotation, and MapStruct doesn't actually apply the specification above to lists. Hence, one more method is needed.
@Mapper
public interface UserMapper {
@Named("UserToDTOList")
@IterableMapping(qualifiedByName = "UserToDTO")
List<UserDTO> toDTO(List<User> source);
@Named("UserToDTO")
@SubclassMapping(source = FacebookUser.class, target = FacebookUserDTO.class)
@SubclassMapping(source = TwitterUser.class, target = TwitterUserDTO.class)
UserDTO toDTO(User source);
}
This also applies to a field in a class. In my case, we have a list field like that in a class. At first I only added @SubclassMapping, but after checking the code MapStruct generated, I found that it wasn't using that method for the list conversion. Adding @IterableMapping fixed that.
Consider a UsersWrapper class with a list field in it.
public class UsersWrapper {
List<User> list;
}
And a UsersWrapperDTO.
public class UsersWrapperDTO {
List<UserDTO> list;
}
Then, we can use one more method in the mapper.
@Mapper
public interface UserMapper {
@Mapping(source = "source.list", target = "list", qualifiedByName = "UserToDTOList")
UsersWrapperDTO toDTO(UsersWrapper source);
@Named("UserToDTOList")
@IterableMapping(qualifiedByName = "UserToDTO")
List<UserDTO> toDTO(List<User> source);
@Named("UserToDTO")
@SubclassMapping(source = FacebookUser.class, target = FacebookUserDTO.class)
@SubclassMapping(source = TwitterUser.class, target = TwitterUserDTO.class)
UserDTO toDTO(User source);
}
MapStruct actually generates very readable code, so if you find it doesn't work the way you thought, checking the generated code first is also an option.
I wasn't familiar with MapStruct, so it took me a whole afternoon to put the pieces together! It does seem simple, though, and it makes sense that I need to combine these two annotations. Hopefully this post will save people some time.
I am trying to create a Spring Boot application using MongoDB and a REST controller, connecting objects together using DBRef instead of classic JPA annotations like OneToMany. The purpose is to print all the bookmarks for a specific account. The list of bookmarks is looked up by the username, but it doesn't seem to work.
These are my classes:
@Document
public class Account {
@DBRef
private Set<Bookmark> bookmarkSet = new HashSet<>();
@Id
private String id;
@JsonIgnore
private String username;
private String password;
public Account(String username, String password) {
this.username = username;
this.password = password;
}
public void setBookmarkSet(Set<Bookmark> bookmarkSet) {
this.bookmarkSet = bookmarkSet;
}
public String getId() {
return id;
}
}
@Document
public class Bookmark {
@DBRef
@JsonIgnore
private Account account;
@Id
private String id;
private String uri;
private String description;
public Bookmark(Account account, String uri, String description) {
this.account = account;
this.uri = uri;
this.description = description;
}
public Account getAccount() {
return account;
}
public String getId() {
return id;
}
public String getUri() {
return uri;
}
public String getDescription() {
return description;
}
}
Repositories:
public interface AccountRepository extends MongoRepository<Account, Long> {
Optional<Account> findOneByUsername(String username);
}
public interface BookmarkRepository extends MongoRepository<Bookmark, Long> {
Collection<Bookmark> findByAccountUsername(String username);
}
And RestController:
@RestController
@RequestMapping("/{userId}/bookmarks")
public class BookmarkRestController {
private final AccountRepository accountRepository;
private final BookmarkRepository bookmarkRepository;
@Autowired
public BookmarkRestController(AccountRepository accountRepository, BookmarkRepository bookmarkRepository) {
this.accountRepository = accountRepository;
this.bookmarkRepository = bookmarkRepository;
}
@RequestMapping(value = "/{bookmarkId}", method = RequestMethod.GET)
Bookmark readBookmark(@PathVariable String userId, #PathVariable Long bookmarkId) {
this.validateUser(userId);
return bookmarkRepository.findOne(bookmarkId);
}
@RequestMapping(method = RequestMethod.GET)
Collection<Bookmark> readBookmarks(@PathVariable String userId) {
this.validateUser(userId);
return this.bookmarkRepository.findByAccountUsername(userId);
}
private void validateUser(String userId) {
this.accountRepository.findOneByUsername(userId).orElseThrow(() -> new UserNotFoundException(userId));
}
}
After I run the application I get this error:
Invalid path reference account.username! Associations can only be pointed to directly or via their id property!
I'm not sure you have the right schema design. I assume you've modeled your objects based on a relational database model, where the data is normalised and split across multiple tables, with relationships captured using ids. With MongoDB you can structure and store your data with the hierarchy simply contained within a single document.
So in your example the Bookmark would not be a Document itself, but would be a subdocument of the Account. Remove the @Document annotation from the Bookmark object, and the @DBRef annotations, and simply store the Bookmarks within the Account document.
This would give you a schema more like this:
{
"_id": 1,
"bookmarkSet": [
{
"uri": "http://www.foo.com",
"description": "foo"
},
{
"uri": "http://www.bar.com",
"description": "bar"
}
],
"username": "John",
"password": "password"
}
*Note: if you make the bookmarks subdocuments, you can remove the _id member from the Bookmark object.
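In code, the embedded version could look roughly like this (a sketch; constructors, getters, and setters omitted):
@Document
public class Account {

    @Id
    private String id;
    private String username;
    private String password;

    // embedded directly in the Account document, no @DBRef
    private Set<Bookmark> bookmarkSet = new HashSet<>();
}

// no @Document and no @Id needed once Bookmark is embedded
public class Bookmark {
    private String uri;
    private String description;
}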
The best design will depend on how many bookmarks you expect each account to have. If it's only a few bookmarks, then what I suggested would work well. If you have thousands, you might want to structure it differently. There are lots of articles about schema design in NoSQL databases. This one covers the options for embedding subdocuments quite well:
http://blog.mongodb.org/post/87200945828/6-rules-of-thumb-for-mongodb-schema-design-part-1
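With that design the BookmarkRepository and the account back-reference become unnecessary; the controller can read the bookmarks straight off the account. A sketch, assuming a getBookmarkSet() accessor is added to Account:
@RequestMapping(method = RequestMethod.GET)
Collection<Bookmark> readBookmarks(@PathVariable String userId) {
    // the bookmarks now live inside the Account document itself
    return accountRepository.findOneByUsername(userId)
            .map(Account::getBookmarkSet)
            .orElseThrow(() -> new UserNotFoundException(userId));
}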