In Spring Rest, how do I override the GET and PUT calls? - java

My Spring REST program, a slight extension of a Stephen Zerhusen demo using Json Web Tokens (JWT), works OK -- as far as it goes. I added an Option object, and I can GET, PUT and POST using just an Option class (@Entity) and an OptionRepository interface (extends JpaRepository).
I'm now trying, but failing, to restrict the returned data to just what the logged-in user has rights to. As an example, suppose that my logged in user only has rights to Option values 1, 3, and 5.
If I have a service call like GET /option I should not return Option values 2 or 4.
If I have a service call like GET /option/2 I should get back a HTTP 404 result.
I understand that once the user has logged in I can get their user information through a Principal object reference. Such a solution was offered in this previous stackoverflow question, and other pages also offer similar solutions.
My immediate problem is to find where I can affect the GET and PUT behavior of /option. Here is all that I added to an existing, working demo. First the entity defining class.
@Entity
@Table(name="choice")
public class Option implements Serializable {
@Id
@Column(name="id")
@GeneratedValue(strategy=GenerationType.AUTO)
private Long id = Utilities.INVALID_ID;
@Column(name="value", length=50, nullable=false)
private String value;
@Column(name="name", length=100, nullable=false)
private String name;
public Long getId() { return this.id; }
public void setId(Long id) { this.id = id; }
public String getValue() { return this.value; }
public void setValue(String value) { this.value = value; }
public String getName() { return this.name; }
public void setName(String name) { this.name = name; }
}
Now the JpaRepository interface extension:
@RepositoryRestResource(collectionResourceRel="option", path="option")
public interface OptionRepository extends JpaRepository<Option, Long> {
}
I merely added those two files to the program and GET, PUT and POST work. BTW, it turns out that if I comment out the @RepositoryRestResource annotation the call to /option/1 returns HTTP 404. Some documentation suggests it isn't needed, but I guess it really is.
Now to filter the output. Let's pretend to filter by making the server always return Option (id = 5). I do this by:
@RepositoryRestResource(collectionResourceRel="option", path="option")
public interface OptionRepository extends JpaRepository<Option, Long> {
@RequestMapping(path = "/option/{id}", method = RequestMethod.GET)
@Query("from Option o where o.id = 5")
public Iterable<Option> getById(@PathVariable("id") Long id);
}
When I run this server and do GET /option/1 I get back ... Option 1, not Option 5. The @Query isn't used.
What is the magic needed to affect the GET, PUT, etc?
Thanks,
Jerome.

You can use a ResourceProcessor to manipulate returned resources:
@Component
public class OptionResourceProcessor implements ResourceProcessor<Resource<Option>> {
@Override
public Resource<Option> process(Resource<Option> resource) {
Option option = resource.getContent();
if (/* Logged User is not allowed to get this Option */ ) {
throw new MyCustomException(...);
} else {
return resource;
}
}
}
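In case it helps, here is a minimal sketch of what that permission check could look like with Spring Security on the classpath; OptionRightsService, getAllowedOptionIds and the String constructor of MyCustomException are assumptions, stand-ins for however you resolve the logged-in user's rights:
@Component
public class OptionResourceProcessor implements ResourceProcessor<Resource<Option>> {

    // Hypothetical service that knows which Option ids the current user may read.
    @Autowired
    private OptionRightsService optionRightsService;

    @Override
    public Resource<Option> process(Resource<Option> resource) {
        Option option = resource.getContent();
        // The logged-in user is available from Spring Security's context
        // (equivalent to the Principal mentioned in the question).
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        if (!optionRightsService.getAllowedOptionIds(auth.getName()).contains(option.getId())) {
            throw new MyCustomException("Option " + option.getId() + " is not visible");
        }
        return resource;
    }
}
If you want GET /option/2 to answer with HTTP 404 as in the question, the exception handler below can return HttpStatus.NOT_FOUND instead of FORBIDDEN.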
Then you can create a custom exception handler, for example:
@ControllerAdvice
public class ExceptionsHandler {
@ExceptionHandler(MyCustomException.class)
public ResponseEntity<?> handleMyCustomException(MyCustomException e) {
return new ResponseEntity<>(new MyCustomMessage(e), HttpStatus.FORBIDDEN);
}
}
To add some logic to PUT/POST/DELETE requests you can use a custom event handler, for example:
@RepositoryEventHandler(Option.class)
public class OptionEventHandler {
@HandleBeforeSave
public void handleBeforeSave(Option option) {
if (/* Logged User is not allowed to save this Option */ ) {
throw new MyCustomException(...);
}
}
}
You can find more SDR usage examples in my sample project...

Related

How to use Mongo Auditing and a UUID as id with Spring Boot 2.2.x?

I would like to have Documents stored with a UUID id and createdAt / updatedAt fields. My solution was working with Spring Boot 2.1.x. After I upgraded from Spring Boot 2.1.11.RELEASE to 2.2.0.RELEASE my test for MongoAuditing failed with createdAt = null. What do I need to do to get the createdAt field filled again?
This is not just a test problem. I ran the application and it has the same behaviour as my test. All auditing fields stay null.
I have a Configuration to enable MongoAuditing and UUID generation:
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {
@Bean
public GenerateUUIDListener generateUUIDListener() {
return new GenerateUUIDListener();
}
}
The listener hooks into onBeforeConvert - I guess that's where the trouble starts.
public class GenerateUUIDListener extends AbstractMongoEventListener<IdentifiableEntity> {
@Override
public void onBeforeConvert(BeforeConvertEvent<IdentifiableEntity> event) {
IdentifiableEntity entity = event.getSource();
if (entity.isNew()) {
entity.setId(UUID.randomUUID());
}
}
}
The document itself (I dropped the getters and setters):
@Document
public class MyDocument extends InsertableEntity {
private String name;
}
public abstract class InsertableEntity extends IdentifiableEntity {
@CreatedDate
@JsonIgnore
private Instant createdAt;
}
public abstract class IdentifiableEntity implements Persistable<UUID> {
@Id
private UUID id;
@JsonIgnore
public boolean isNew() {
return getId() == null;
}
}
A complete minimal example can be found here (including a test): https://github.com/mab/auditable
With 2.1.11.RELEASE the test succeeds; with 2.2.0.RELEASE it fails.
For me the best solution was to switch from event-based UUID generation to a callback-based one. By implementing Ordered we can make the new callback execute after the AuditingEntityCallback.
public class IdEntityCallback implements BeforeConvertCallback<IdentifiableEntity>, Ordered {
@Override
public IdentifiableEntity onBeforeConvert(IdentifiableEntity entity, String collection) {
if (entity.isNew()) {
entity.setId(UUID.randomUUID());
}
return entity;
}
@Override
public int getOrder() {
return 101;
}
}
I registered the callback with the MongoConfiguration. For a more general solution you might want to take a look at the registration of the AuditingEntityCallback with the MongoAuditingBeanDefinitionParser.
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {
@Bean
public IdEntityCallback registerCallback() {
return new IdEntityCallback();
}
}
MongoTemplate works in the following way in doInsert():
this.maybeEmitEvent - emits an event (onBeforeConvert, onBeforeSave and such) so any AbstractMappingEventListener can catch and act upon it, like you did with GenerateUUIDListener
this.maybeCallBeforeConvert - calls the before-convert callbacks, like the mongo auditing callback
as you can see in the source code of MongoTemplate (lines 831-832):
protected <T> T doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
BeforeConvertEvent<T> event = new BeforeConvertEvent(objectToSave, collectionName);
T toConvert = ((BeforeConvertEvent)this.maybeEmitEvent(event)).getSource(); //emit event
toConvert = this.maybeCallBeforeConvert(toConvert, collectionName); //call some before convert handlers
...
}
Mongo auditing sets createdAt only on new entities, by checking that entity.isNew() == true.
Because your code already set the id (the UUID), createdAt is not populated (the entity is not considered new).
You can do the following (ordered from best to worst):
forget about the UUID and use String for your id; let Mongo itself create and manage its entity ids (this is how MongoTemplate actually works, lines 811-812)
keep the UUID at the code level, and convert from/to String when inserting into and retrieving from the db
create a custom repository like in this post
stay with 2.1.11.RELEASE
set the createdAt/updatedAt in GenerateUUIDListener as well as the id (rename it NewEntityListener or something), basically implementing the auditing yourself
implement new isNew() logic that doesn't depend only on the entity id (see the sketch below)
In version 2.1.11.RELEASE the order of those two calls was flipped (MongoTemplate lines 804-805), so your code worked fine.
More abstractly, the nature of an event is to be send-and-forget (async compatible), so it is very bad practice to change the object itself; there is NO guarantee about the order of computation, if any.
This is why the auditing is built on callbacks and not events, and that's why Pivotal doesn't (need to) keep the order stable between versions.
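For the last option, a minimal sketch of an isNew() that does not rely only on the id; the transient flag and the markNotNew name are assumptions, not part of the original code:
public abstract class IdentifiableEntity implements Persistable<UUID> {

    @Id
    private UUID id;

    @Transient // org.springframework.data.annotation.Transient - not persisted
    private boolean isNew = true;

    @Override
    public boolean isNew() {
        return isNew;
    }

    // Call this for entities loaded from the database (e.g. from an
    // AfterConvertCallback) so they are treated as existing.
    public void markNotNew() {
        this.isNew = false;
    }

    public UUID getId() {
        return id;
    }

    public void setId(UUID id) {
        this.id = id;
    }
}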

reactive repository throws exception when saving a new object

I am using r2dbc, r2dbc-h2 and experimental spring-boot-starter-data-r2dbc
implementation 'org.springframework.boot.experimental:spring-boot-starter-data-r2dbc:0.1.0.M1'
implementation 'org.springframework.data:spring-data-r2dbc:1.0.0.RELEASE' // starter-data provides old version
implementation 'io.r2dbc:r2dbc-h2:0.8.0.RELEASE'
implementation 'io.r2dbc:r2dbc-pool:0.8.0.RELEASE'
I have created reactive repositories
public interface IJsonComparisonRepository extends ReactiveCrudRepository<JsonComparisonResult, String> {}
Also added a custom script that creates a table in H2 on startup
@SpringBootApplication
public class JsonComparisonApplication {
public static void main(String[] args) {
SpringApplication.run(JsonComparisonApplication.class, args);
}
@Bean
public CommandLineRunner startup(DatabaseClient client) {
return (args) -> client
.execute(() -> {
var resource = new ClassPathResource("ddl/script.sql");
try (var is = new InputStreamReader(resource.getInputStream())) {
return FileCopyUtils.copyToString(is);
} catch (IOException e) {
throw new RuntimeException(e);
} })
.then()
.block();
}
}
My r2dbc configuration looks like this
@Configuration
@EnableR2dbcRepositories
public class R2dbcConfiguration extends AbstractR2dbcConfiguration {
@Override
public ConnectionFactory connectionFactory() {
return new H2ConnectionFactory(
H2ConnectionConfiguration.builder()
.url("mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE")
.username("sa")
.build());
}
}
My service where I perform the logic looks like this
@Override
public Mono<JsonComparisonResult> updateOrCreateRightSide(String comparisonId, String json) {
return updateComparisonSide(comparisonId, storedComparisonResult -> {
storedComparisonResult.setRightSide(json);
return storedComparisonResult;
});
}
private Mono<JsonComparisonResult> updateComparisonSide(String comparisonId,
Function<JsonComparisonResult, JsonComparisonResult> updateSide) {
return repository.findById(comparisonId)
.defaultIfEmpty(createResult(comparisonId))
.filter(result -> ComparisonDecision.NONE == result.getDecision()) // if not NONE - it means it was found and completed
.switchIfEmpty(Mono.error(new NotUpdatableCompleteComparisonException(comparisonId)))
.map(updateSide)
.flatMap(repository::save);
}
private JsonComparisonResult createResult(String comparisonId) {
LOGGER.info("Creating new comparison result: {}.", comparisonId);
var newResult = new JsonComparisonResult();
newResult.setDecision(ComparisonDecision.NONE);
newResult.setComparisonId(comparisonId);
return newResult;
}
The domain looks like this
@Table("json_comparison")
public class JsonComparisonResult {
@Column("comparison_id")
@Id
private String comparisonId;
@Column("left")
private String leftSide;
@Column("right")
private String rightSide;
// @Enumerated(EnumType.STRING) - no support for now
@Column("decision")
private ComparisonDecision decision;
private String differences;
}
The problem is that when I try to add any object to the database it fails with the exception
org.springframework.dao.TransientDataAccessResourceException: Failed to update table [json_comparison]. Row with Id [4] does not exist.
at org.springframework.data.r2dbc.repository.support.SimpleR2dbcRepository.lambda$save$0(SimpleR2dbcRepository.java:91) ~[spring-data-r2dbc-1.0.0.RELEASE.jar:1.0.0.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:96) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:73) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoUsingWhen$MonoUsingWhenSubscriber.deferredComplete(MonoUsingWhen.java:276) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxUsingWhen$CommitInner.onComplete(FluxUsingWhen.java:536) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onComplete(Operators.java:1858) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators.complete(Operators.java:132) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoEmpty.subscribe(MonoEmpty.java:45) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
For some reason, during save in the SimpleR2dbcRepository library class, it doesn't consider objectToSave to be new, but then the update fails because the row doesn't actually exist.
// SimpleR2dbcRepository#save
@Override
@Transactional
public <S extends T> Mono<S> save(S objectToSave) {
Assert.notNull(objectToSave, "Object to save must not be null!");
if (this.entity.isNew(objectToSave)) { // not new
....
}
}
Why is it happening and what is the problem?
TL;DR: How should Spring Data know if your object is new or whether it should exist?
Relational Spring Data repositories (both JDBC and R2DBC) must decide on [Reactive]CrudRepository.save(…) whether the given object is new or whether it already exists in your database. Performing a save(…) operation results either in an INSERT or an UPDATE statement. Issuing the wrong statement either causes a primary key violation or a no-op, as standard SQL does not have a way to express an upsert.
Spring Data JDBC and R2DBC by default use the presence/absence of the @Id value. Generated primary keys are a widely used mechanism. If the primary key is provided, the entity is considered existing. If the id value is null, the entity is considered new.
Read more in the reference documentation about Entity State Detection Strategies.
You have to implement Persistable because you've provided the @Id. The library needs to figure out whether the row is new or whether it should exist. If your entity implements Persistable, then save(…) will use the outcome of isNew() to determine whether to issue an INSERT or UPDATE.
For example:
public class Product implements Persistable<Integer> {
@Id
private Integer id;
private String description;
private Double price;
@Transient
private boolean newProduct;
@Override
@Transient
public boolean isNew() {
return this.newProduct || id == null;
}
public Product setAsNew() {
this.newProduct = true;
return this;
}
}
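Applied to the entity from the question, a hedged sketch (the newResult flag and the asNew() helper are assumptions; the rest follows the posted JsonComparisonResult): since createResult() assigns the comparisonId up front, a transient flag lets save() still issue an INSERT.
@Table("json_comparison")
public class JsonComparisonResult implements Persistable<String> {

    @Id
    @Column("comparison_id")
    private String comparisonId;

    @Transient // not mapped to a column, only used for the new/existing decision
    private boolean newResult;

    @Override
    public String getId() {
        return comparisonId;
    }

    @Override
    public boolean isNew() {
        return newResult;
    }

    // Mark objects we build ourselves (as in createResult) as new.
    public JsonComparisonResult asNew() {
        this.newResult = true;
        return this;
    }

    // other fields, getters and setters as in the question
}
createResult(comparisonId) would then call asNew() on the object it builds, while entities coming back from findById stay untouched and are therefore updated.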
Maybe you should consider this:
Choose the data type of your id/primary key as INT/LONG and set it to AUTO_INCREMENT (something like below):
CREATE TABLE PRODUCT(id INT PRIMARY KEY AUTO_INCREMENT NOT NULL, modelname VARCHAR(30) , year VARCHAR(4), owner VARCHAR(50));
In your POST request body, do not include the id field.
Removing the @Id value made it issue an INSERT statement.

How to connect OSGi bundles (send entity through bundles)?

I have a REST service, which contains three classes in one module (bundle):
User.java -> Entity
UserService.java -> REST service
UserValidation.java -> Special validator for the entity. The server sends the entity to this validator and gets a validation result (true or false):
User.java
@XmlRootElement(name = "User")
public class User {
private long id;
private String name;
private String surname;
private String patronymic;
/*Getters and Setters*/
}
UserService.java
public class UserServiceImpl implements UserService {
private UserDAO userDbDao = new UserDatabaseDAO();
@POST
@Path("/users/")
public Response addUser(User user) {
UserValidator userValidator = new UserValidator(user);
if (userValidator.isValid()) {
User newUser = userDbDao.createUser(user);
return Response.ok().type("application/xml").entity(newUser).build();
} else {
return Response.status(Response.Status.BAD_REQUEST).entity(userValidator.getErrorMessage()).build();
}
}
}
UserValidator.java
public class UserValidator {
private static final int MAX_SIZE_NAME = 50;
private static final int MIN_SIZE_NAME = 2;
private User user;
public UserValidator(User user) {
this.user = user;
}
private BadUserResponse badUserResponse = new BadUserResponse();
private boolean isNameValid(String name) {
if (name == null) {
badUserResponse.setNsp("Null in fields name/surname/patronymic");
return false;
}
String tempName = name.trim();
if (tempName.length() < MIN_SIZE_NAME || tempName.length() > MAX_SIZE_NAME) {
badUserResponse.setNsp(String.format("Fields name/surname/patronymic too long or too short (Allowed length from %d to %d)", MIN_SIZE_NAME, MAX_SIZE_NAME));
return false;
}
for (int i = 0; i < tempName.length(); i++) {
if (!Character.isLetter(tempName.charAt(i))) {
badUserResponse.setNsp("Fields name/surname/patronymic contains wrong symbols (Only letters allowed)");
return false;
}
}
return true;
}
public boolean isValid() {
return (isNameValid(user.getName()) &
isNameValid(user.getSurname()) &
isNameValid(user.getPatronymic()));
}
public BadUserResponse getErrorMessage() {
return badUserResponse;
}
}
BadUserResponse.java
@XmlRootElement(name="baduserresponce")
public class BadUserResponse {
private String nsp;
public String getNsp() {
return nsp;
}
public void setNsp(String nsp) {
this.nsp = nsp;
}
}
But now I need to split this into separate bundles, because, as you can see, they use each other's functionality. For example, UserService.java
just uses this: UserValidator userValidator = new UserValidator(user);
I need to connect these bundles somehow (OSGi service, ActiveMQ).
In my opinion it works something like this:
The UserService bundle gets the User entity from the REST method.
It puts all of the User fields (name, surname, patronymic) onto an ActiveMQ queue (because the UserValidator bundle doesn't know what a User entity is).
The UserValidator bundle gets the User's fields from the queue and validates them.
The UserValidator bundle puts the validation result (true/false) onto the queue.
The UserService bundle gets the validation result from the queue and sends the User to the DAO.
But this is just a concept. Am I wrong?
What's the best way to pass an entity between bundles, and how should I do this?
Your current way of simply instantiating the UserValidator via new is technically fine even if they live in different bundles. If your validator is only needed in this one place and is simple, I would even leave it in the same bundle.
The other options can make sense to decouple your bundles. Using messaging allows you to avoid sync calls. It can also be used to send the data to a remote machine. JMS messaging is quite heavyweight though: you need a broker and depend on the API. In your case you also directly need the result of the validation, so you would simulate a sync call with JMS. So I would rather avoid this.
Using an OSGi service allows you to decouple from the implementation of the service. In this case it makes sense to create an interface for UserValidator. I would also put this interface into a separate bundle. You then need to register the service in the bundle that implements the validator and bind the service in the bundle that uses the validator. OSGi services are very lightweight and by default synchronous. So I think they would fit your problem well.
For registering and binding services do not use the OSGi API directly. Instead use Declarative Services with annotations; they take away most of the complexity of dealing with OSGi services. A rough sketch follows.
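A minimal sketch of what that could look like with Declarative Services annotations; the UserValidationService name and the boolean return type are assumptions to keep the example small:
// API bundle: the service interface that both other bundles depend on.
public interface UserValidationService {
    boolean isValid(User user);
}

// Validator bundle: publish the implementation as an OSGi service
// (annotations from org.osgi.service.component.annotations).
@Component(service = UserValidationService.class)
public class UserValidationServiceImpl implements UserValidationService {
    @Override
    public boolean isValid(User user) {
        // name/surname/patronymic checks from the question go here
        return user != null && user.getName() != null;
    }
}

// Service bundle: bind the service instead of calling "new UserValidator(user)".
@Component(service = UserService.class)
public class UserServiceImpl implements UserService {

    @Reference
    private UserValidationService validator;

    // addUser(...) then calls validator.isValid(user) as before
}
With this setup the User class (or a DTO for it) also needs to live in a bundle whose package both the service and the validator bundles can import.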
Btw. I am not sure how you do REST. I suggest having a look at the Aries JAX-RS Whiteboard.

Converting & validating CSV file upload in Spring MVC

I have a Customer entity that contains a list of Sites, as follows:
public class Customer {
@Id
@GeneratedValue
private int id;
@NotNull
private String name;
@NotNull
@AccountNumber
private String accountNumber;
@Valid
@OneToMany(mappedBy="customer")
private List<Site> sites;
}
public class Site {
@Id
@GeneratedValue
private int id;
@NotNull
private String addressLine1;
private String addressLine2;
@NotNull
private String town;
@PostCode
private String postCode;
@ManyToOne
@JoinColumn(name="customer_id")
private Customer customer;
}
I am in the process of creating a form to allow users to create a new Customer by entering the name & account number and supplying a CSV file of sites (in the format "addressLine1", "addressLine2", "town", "postCode"). The user's input needs to be validated and errors returned to them (e.g. "file is not CSV file", "problem on line 7").
I started off by creating a Converter to receive a MultipartFile and convert it into a list of Site:
public class CSVToSiteConverter implements Converter<MultipartFile, List<Site>> {
public List<Site> convert(MultipartFile csvFile) {
List<Site> results = new ArrayList<Site>();
/* open MultipartFile and loop through line-by-line, adding into List<Site> */
return results;
}
}
This worked, but there is no validation (i.e. if the user uploads a binary file or one of the CSV rows doesn't contain a town), and there doesn't seem to be a way to pass the error back (the converter doesn't seem to be the right place to perform validation anyway).
I then created a form-backing object to receive the MultipartFile and Customer, and put validation on the MultipartFile:
public class CustomerForm {
@Valid
private Customer customer;
@SiteCSVFile
private MultipartFile csvFile;
}
@Documented
@Constraint(validatedBy = SiteCSVFileValidator.class)
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
public @interface SiteCSVFile {
String message() default "{SiteCSVFile}";
Class<?>[] groups() default {};
Class<? extends Payload>[] payload() default {};
}
public class SiteCSVFileValidator implements ConstraintValidator<SiteCSVFile, MultipartFile> {
@Override
public void initialize(SiteCSVFile siteCSVFile) { }
@Override
public boolean isValid(MultipartFile csvFile, ConstraintValidatorContext cxt) {
boolean wasValid = true;
/* test csvFile for mimetype, open and loop through line-by-line, validating number of columns etc. */
return wasValid;
}
}
This also worked but then I have to re-open the CSV file and loop through it to actually populate the List within Customer, which doesn't seem that elegant:
@RequestMapping(value="/new", method = RequestMethod.POST)
public String newCustomer(@Valid @ModelAttribute("customerForm") CustomerForm customerForm, BindingResult bindingResult) {
if (bindingResult.hasErrors()) {
return "NewCustomer";
} else {
/*
validation has passed, so now we must:
1) open customerForm.csvFile
2) loop through it to populate customerForm.customer.sites
*/
customerService.insert(customerForm.customer);
return "CustomerList";
}
}
My MVC config limits file uploads to 1MB:
@Bean
public MultipartResolver multipartResolver() {
CommonsMultipartResolver multipartResolver = new CommonsMultipartResolver();
multipartResolver.setMaxUploadSize(1000000);
return multipartResolver;
}
Is there a spring-way of converting AND validating at the same time, without having to open the CSV file and loop through it twice, once to validate and another to actually read/populate the data?
IMHO, it is a bad idea to load the whole CSV in memory unless:
you are sure it will always be very small (and what if a user clicks on the wrong file?)
the validation is global (the only real use case, but that does not seem to be the case here)
your application will never be used in a production context under serious load
You should either stick to the MultipartFile object, or use a wrapper exposing the InputStream (and possibly other information you might need) if you do not want to tie your business classes to Spring.
Then you carefully design, code and test a method that takes an InputStream as input, reads it line by line, and calls per-line methods to validate and insert the data. Something like:
class CsvLoader {
@Autowired Verifier verifier;
@Autowired Loader loader;
void verifAndLoad(InputStream csv) {
// loop through csv
if (verifier.verify(myObj)) {
loader.load(myObj);
}
else {
// log the problem and possibly store the line for further analysis
}
csv.close();
}
}
That way, your application only uses the memory it really needs, looping only once over the file.
Edit: some precisions on what I meant by wrapping Spring's MultipartFile.
First, I would split the validation in two. Formal validation lives in the controller layer and only checks that:
there is a Customer field
the file size and mimetype seem OK (e.g. size > 12 && mimetype = text/csv)
The validation of the content is IMHO a business-layer concern and can happen later. In this pattern, SiteCSVFileValidator would only test the CSV for mimetype and size, as sketched below.
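A minimal sketch of that slimmed-down validator, assuming the thresholds from the example above (the exact limits are project-specific):
public class SiteCSVFileValidator implements ConstraintValidator<SiteCSVFile, MultipartFile> {

    @Override
    public void initialize(SiteCSVFile siteCSVFile) { }

    @Override
    public boolean isValid(MultipartFile csvFile, ConstraintValidatorContext ctx) {
        if (csvFile == null || csvFile.isEmpty()) {
            return false;
        }
        // Cheap, formal checks only; the content (columns, towns, ...) is
        // validated later in the business layer, while streaming the file once.
        return "text/csv".equals(csvFile.getContentType()) && csvFile.getSize() > 12;
    }
}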
Normally, you avoid directly using Spring classes from business classes. If that is not a concern, the controller directly sends the MultipartFile to a service object, also passing the BindingResult so the service can directly populate any error messages. The controller becomes:
@RequestMapping(value="/new", method = RequestMethod.POST)
public String newCustomer(@Valid @ModelAttribute("customerForm") CustomerForm customerForm, BindingResult bindingResult) {
if (bindingResult.hasErrors()) {
return "NewCustomer"; // only external validation
} else {
/*
validation has passed, so now we must:
1) open customerForm.csvFile
2) loop through it to validate each line and populate customerForm.customer.sites
*/
customerService.insert(customerForm.customer, customerForm.csvFile, bindingResult);
if (bindingResult.hasErrors()) {
return "NewCustomer"; // only external validation
} else {
return "CustomerList";
}
}
}
In the service class we have:
void insert(Customer customer, MultipartFile csvFile, Errors errors) {
// loop through csvFile.getInputStream populating customer.sites and possibly adding errors to errors
if (!errors.hasErrors()) {
// actually insert through DAO
}
}
But we get two Spring classes in a method of the service layer. If that is a concern, just replace the line customerService.insert(customerForm.customer, customerForm.csvFile, bindingResult); with:
List<Integer> linesInError = new ArrayList<Integer>();
customerService.insert(customerForm.customer, customerForm.csvFile.getInputStream(), linesInError);
if (! linesInError.isEmpty()) {
// populates bindingResult with convenient error messages
}
The service class then only adds the line numbers where errors were detected to linesInError.
But now it only gets the InputStream, whereas it might need, say, the original file name. You can pass the name as another parameter, or use a wrapper class:
class CsvFile {
private String name;
private InputStream inputStream;
CsvFile(MultipartFile file) throws IOException {
name = file.getOriginalFilename();
inputStream = file.getInputStream();
}
// public getters ...
}
and call
customerService.insert(customerForm.customer, new CsvFile(customerForm.csvFile), linesInError);
with no direct Spring dependencies.

Spring MVC: How to perform validation?

I would like to know what is the cleanest and best way to perform form validation of user inputs. I have seen some developers implement org.springframework.validation.Validator. A question about that: I saw it validates a class. Does the class have to be filled manually with the values from the user input, and then passed to the validator?
I am confused about the cleanest and best way to validate the user input. I know about the traditional method of using request.getParameter() and then manually checking for nulls, but I don't want to do all the validation in my Controller. Some good advice in this area would be greatly appreciated. I am not using Hibernate in this application.
With Spring MVC, there are 3 different ways to perform validation: using annotations, manually, or a mix of both. There is not a unique "cleanest and best way" to validate, but there is probably one that fits your project/problem/context better.
Let's have a User:
public class User {
private String name;
...
}
Method 1: If you have Spring 3.x+ and simple validation to do, use javax.validation.constraints annotations (also known as JSR-303 annotations).
public class User {
@NotNull
private String name;
...
}
You will need a JSR-303 provider in your libraries, like Hibernate Validator, which is the reference implementation (this library has nothing to do with databases and relational mapping, it just does validation :-).
Then in your controller you would have something like:
@RequestMapping(value="/user", method=RequestMethod.POST)
public void createUser(Model model, @Valid @ModelAttribute("user") User user, BindingResult result){
if (result.hasErrors()){
// do something
}
else {
// do something else
}
}
Notice the @Valid: if the user happens to have a null name, result.hasErrors() will be true.
Method 2: If you have complex validation (like big business validation logic, conditional validation across multiple fields, etc.), or for some reason you cannot use method 1, use manual validation. It is a good practice to separate the controller's code from the validation logic. Don't create your validation class(es) from scratch; Spring provides a handy org.springframework.validation.Validator interface (since Spring 2).
So let's say you have
public class User {
private String name;
private Integer birthYear;
private User responsibleUser;
...
}
and you want to do some "complex" validation like : if the user's age is under 18, responsibleUser must not be null and responsibleUser's age must be over 21.
You will do something like this
public class UserValidator implements Validator {
@Override
public boolean supports(Class clazz) {
return User.class.equals(clazz);
}
@Override
public void validate(Object target, Errors errors) {
User user = (User) target;
if(user.getName() == null) {
errors.rejectValue("name", "your_error_code");
}
// do "complex" validation here
}
}
Then in your controller you would have :
@RequestMapping(value="/user", method=RequestMethod.POST)
public void createUser(Model model, @ModelAttribute("user") User user, BindingResult result){
UserValidator userValidator = new UserValidator();
userValidator.validate(user, result);
if (result.hasErrors()){
// do something
}
else {
// do something else
}
}
If there are validation errors, result.hasErrors() will be true.
Note: You can also set the validator in an @InitBinder method of the controller, with "binder.setValidator(...)" (in which case a mixed use of method 1 and 2 would not be possible, because you replace the default validator). Or you could instantiate it in the default constructor of the controller. Or have a @Component/@Service UserValidator that you inject (@Autowired) into your controller: very useful, because most validators are singletons + unit test mocking becomes easier + the validator could call other Spring components. A sketch of that injected variant follows.
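A minimal sketch of the injected-@Component variant, reusing the UserValidator from above (the UserController name is made up for the example):
@Component
public class UserValidator implements Validator {
    @Override
    public boolean supports(Class<?> clazz) {
        return User.class.equals(clazz);
    }
    @Override
    public void validate(Object target, Errors errors) {
        // the "complex" checks shown above
    }
}

@Controller
public class UserController {

    @Autowired
    private UserValidator userValidator;

    @InitBinder("user")
    protected void initBinder(WebDataBinder binder) {
        binder.setValidator(userValidator);
    }

    // the createUser handler from method 2 then drops the manual validate(...) call;
    // annotating the model attribute with @Valid triggers the bound validator
}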
Method 3:
Why not use a combination of both methods? Validate the simple stuff, like the "name" attribute, with annotations (it is quick to do, concise and more readable). Keep the heavy validations for validators (when it would take hours to code custom complex validation annotations, or just when it is not possible to use annotations). I did this on a former project; it worked like a charm, quick & easy. A possible way to wire it up is sketched below.
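A hedged sketch of the wiring, assuming Spring 3.2+ where WebDataBinder#addValidators is available: adding (rather than setting) the custom validator keeps the default JSR-303 validator active, so both methods run.
@InitBinder("user")
protected void initBinder(WebDataBinder binder) {
    // JSR-303 annotations (@NotNull, ...) are still checked by the default
    // validator; UserValidator adds the cross-field business rules on top.
    binder.addValidators(new UserValidator());
}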
Warning : you must not mistake validation handling for exception handling. Read this post to know when to use them.
References :
A very interesting blog post about bean validation (Original link is dead)
Another good blog post about validation (Original link is dead)
Latest Spring documentation about validation
There are two ways to validate user input: annotations and by implementing Spring's Validator interface. For simple cases, the annotations are nice. If you need complex validations (like cross-field validation, e.g. a "verify email address" field), or if your model is validated in multiple places in your application with different rules, or if you don't have the ability to modify your model object by placing annotations on it, Spring's Validator is the way to go. I'll show examples of both.
The actual validation part is the same regardless of which type of validation you're using:
@RequestMapping(value="fooPage", method = RequestMethod.POST)
public String processSubmit(@Valid @ModelAttribute("foo") Foo foo, BindingResult result, ModelMap m) {
if(result.hasErrors()) {
return "fooPage";
}
...
return "successPage";
}
If you are using annotations, your Foo class might look like:
public class Foo {
@NotNull
@Size(min = 1, max = 20)
private String name;
@NotNull
@Min(1)
@Max(110)
private Integer age;
// getters, setters
}
The annotations above are javax.validation.constraints annotations. You can also use Hibernate's
org.hibernate.validator.constraints, but it doesn't look like you are using Hibernate.
Alternatively, if you implement Spring's Validator, you would create a class as follows:
public class FooValidator implements Validator {
@Override
public boolean supports(Class<?> clazz) {
return Foo.class.equals(clazz);
}
@Override
public void validate(Object target, Errors errors) {
Foo foo = (Foo) target;
if(foo.getName() == null) {
errors.rejectValue("name", "name[emptyMessage]");
}
else if(foo.getName().length() < 1 || foo.getName().length() > 20){
errors.rejectValue("name", "name[invalidLength]");
}
if(foo.getAge() == null) {
errors.rejectValue("age", "age[emptyMessage]");
}
else if(foo.getAge() < 1 || foo.getAge() > 110){
errors.rejectValue("age", "age[invalidAge]");
}
}
}
If using the above validator, you also have to bind the validator to the Spring controller (not necessary if using annotations):
@InitBinder("foo")
protected void initBinder(WebDataBinder binder) {
binder.setValidator(new FooValidator());
}
Also see Spring docs.
Hope that helps.
I would like to extend the nice answer from Jerome Dalbert. I found it very easy to write your own annotation validators in the JSR-303 way. You are not limited to "one field" validation. You can create your own annotation at the type level and have complex validation (see examples below). I prefer this way because I don't need to mix different types of validation (Spring and JSR-303) like Jerome does. Also, these validators are "Spring aware", so you can use @Inject/@Autowired out of the box.
Example of custom object validation:
@Target({ TYPE, ANNOTATION_TYPE })
@Retention(RUNTIME)
@Constraint(validatedBy = { YourCustomObjectValidator.class })
public @interface YourCustomObjectValid {
String message() default "{YourCustomObjectValid.message}";
Class<?>[] groups() default {};
Class<? extends Payload>[] payload() default {};
}
public class YourCustomObjectValidator implements ConstraintValidator<YourCustomObjectValid, YourCustomObject> {
@Override
public void initialize(YourCustomObjectValid constraintAnnotation) { }
@Override
public boolean isValid(YourCustomObject value, ConstraintValidatorContext context) {
// Validate your complex logic
// Mark field with error
ConstraintViolationBuilder cvb = context.buildConstraintViolationWithTemplate(context.getDefaultConstraintMessageTemplate());
cvb.addNode("someField").addConstraintViolation();
return true;
}
}
@YourCustomObjectValid
public class YourCustomObject {
}
Example of generic fields equality:
import static java.lang.annotation.ElementType.ANNOTATION_TYPE;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import javax.validation.Constraint;
import javax.validation.Payload;
@Target({ TYPE, ANNOTATION_TYPE })
@Retention(RUNTIME)
@Constraint(validatedBy = { FieldsEqualityValidator.class })
public @interface FieldsEquality {
String message() default "{FieldsEquality.message}";
Class<?>[] groups() default {};
Class<? extends Payload>[] payload() default {};
/**
* Name of the first field that will be compared.
*
* @return name
*/
String firstFieldName();
/**
* Name of the second field that will be compared.
*
* @return name
*/
String secondFieldName();
@Target({ TYPE, ANNOTATION_TYPE })
@Retention(RUNTIME)
public @interface List {
FieldsEquality[] value();
}
}
import java.lang.reflect.Field;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.ReflectionUtils;
public class FieldsEqualityValidator implements ConstraintValidator<FieldsEquality, Object> {
private static final Logger log = LoggerFactory.getLogger(FieldsEqualityValidator.class);
private String firstFieldName;
private String secondFieldName;
@Override
public void initialize(FieldsEquality constraintAnnotation) {
firstFieldName = constraintAnnotation.firstFieldName();
secondFieldName = constraintAnnotation.secondFieldName();
}
@Override
public boolean isValid(Object value, ConstraintValidatorContext context) {
if (value == null)
return true;
try {
Class<?> clazz = value.getClass();
Field firstField = ReflectionUtils.findField(clazz, firstFieldName);
firstField.setAccessible(true);
Object first = firstField.get(value);
Field secondField = ReflectionUtils.findField(clazz, secondFieldName);
secondField.setAccessible(true);
Object second = secondField.get(value);
if (first != null && second != null && !first.equals(second)) {
ConstraintViolationBuilder firstCvb = context.buildConstraintViolationWithTemplate(context.getDefaultConstraintMessageTemplate());
firstCvb.addNode(firstFieldName).addConstraintViolation();
ConstraintViolationBuilder secondCvb = context.buildConstraintViolationWithTemplate(context.getDefaultConstraintMessageTemplate());
secondCvb.addNode(secondFieldName).addConstraintViolation();
return false;
}
} catch (Exception e) {
log.error("Cannot validate fields equality in '" + value + "'!", e);
return false;
}
return true;
}
}
@FieldsEquality(firstFieldName = "password", secondFieldName = "confirmPassword")
public class NewUserForm {
private String password;
private String confirmPassword;
}
If you have the same error handling logic for different method handlers, you end up with lots of handlers with the following code pattern:
if (validation.hasErrors()) {
// do error handling
}
else {
// do the actual business logic
}
Suppose you're creating RESTful services and want to return 400 Bad Request along with error messages for every validation error case. Then the error handling part would be the same for every single REST endpoint that requires validation. Repeating that very same logic in every single handler is not so DRYish!
One way to solve this problem is to drop the BindingResult parameter that immediately follows each to-be-validated bean. Now your handler would look like this:
@RequestMapping(...)
public Something doStuff(@Valid Somebean bean) {
// do the actual business logic
// Just the else part!
}
This way, if the bound bean was not valid, a MethodArgumentNotValidException will be thrown by Spring. You can define a ControllerAdvice that handles this exception with that same error handling logic:
@ControllerAdvice
public class ErrorHandlingControllerAdvice {
@ExceptionHandler(MethodArgumentNotValidException.class)
public SomeErrorBean handleValidationError(MethodArgumentNotValidException ex) {
// do error handling
// Just the if part!
}
}
You can still examine the underlying BindingResult using the getBindingResult method of MethodArgumentNotValidException, as in the sketch below.
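A hedged sketch of what such a handler could look like; turning the field errors into a plain message map and the 400 status are choices of this example, not part of the original answer (SomeErrorBean is replaced by a Map here):
@ControllerAdvice
public class ErrorHandlingControllerAdvice {

    @ExceptionHandler(MethodArgumentNotValidException.class)
    @ResponseStatus(HttpStatus.BAD_REQUEST)
    @ResponseBody
    public Map<String, String> handleValidationError(MethodArgumentNotValidException ex) {
        Map<String, String> errors = new HashMap<>();
        // One entry per invalid field, taken from the exception's BindingResult.
        for (FieldError fieldError : ex.getBindingResult().getFieldErrors()) {
            errors.put(fieldError.getField(), fieldError.getDefaultMessage());
        }
        return errors;
    }
}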
Find a complete example of Spring MVC validation below:
import org.springframework.validation.Errors;
import org.springframework.validation.ValidationUtils;
import org.springframework.validation.Validator;
import com.technicalkeeda.bean.Login;
public class LoginValidator implements Validator {
public boolean supports(Class aClass) {
return Login.class.equals(aClass);
}
public void validate(Object obj, Errors errors) {
Login login = (Login) obj;
ValidationUtils.rejectIfEmptyOrWhitespace(errors, "userName",
"username.required", "Required field");
ValidationUtils.rejectIfEmptyOrWhitespace(errors, "userPassword",
"userpassword.required", "Required field");
}
}
public class LoginController extends SimpleFormController {
private LoginService loginService;
public LoginController() {
setCommandClass(Login.class);
setCommandName("login");
}
public void setLoginService(LoginService loginService) {
this.loginService = loginService;
}
#Override
protected ModelAndView onSubmit(Object command) throws Exception {
Login login = (Login) command;
loginService.add(login);
return new ModelAndView("loginsucess", "login", login);
}
}
Put this bean in your configuration class.
@Bean
public Validator localValidatorFactoryBean() {
return new LocalValidatorFactoryBean();
}
and then you can use
<T> BindingResult validate(T t) {
DataBinder binder = new DataBinder(t);
binder.setValidator(validator); // the injected Validator bean (LocalValidatorFactoryBean) from above
binder.validate();
return binder.getBindingResult();
}
for validating a bean manually. You will then get all the results in the BindingResult and can retrieve them from there, as in the usage sketch below.
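For example (a small usage sketch, assuming the User bean carries JSR-303 annotations and the Validator bean above has been injected as validator):
User user = new User(); // name is null, so a @NotNull constraint should fail
BindingResult result = validate(user);
if (result.hasErrors()) {
    result.getFieldErrors().forEach(error ->
            System.out.println(error.getField() + ": " + error.getDefaultMessage()));
}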
Validation groups
It is also worth mentioning validation for some more complex cases, when you have "multiple steps" within your business logic. In such cases we need "validation groups".
The @Validated annotation was added to support "validation groups" in the validated bean. This can be used in multi-step forms where in the first step you need to validate, for example, name and email, and in the second step you need to validate, for example, the phone number.
With @Validated you first need to declare groups. Groups are declared with your own custom marker interfaces.
@Validated example
Let's say we have a scenario where we have a form for user sign-up. On this form we want the user to provide a name and email. After the user is signed up we have another form where we suggest the user add some extra information, for example a phone number. We don't want the phone number to be provided on the first step, but it is required on the second step.
For this case, we'll declare two groups. The first group would be OnCreate, and the second group would be OnUpdate:
OnCreate:
public interface OnCreate {}
OnUpdate:
public interface OnUpdate {}
Our UserAccount class:
public class UserAccount {
// we will return this field after User is created
// and we want this field to be provided only on update
// so we can determine which user needs to be updated
@NotBlank(groups = OnUpdate.class)
private String id;
@NotBlank(groups = OnCreate.class)
private String name;
@NotBlank(groups = OnCreate.class)
private String email;
@NotBlank(groups = OnUpdate.class)
private String phone;
// standard constructors / setters / getters / toString
}
We mark the validation annotations with our group interfaces, depending on which group each validation belongs to.
And finally our Controller methods:
@PostMapping(value = "/create")
public UserAccount createAccount(@Validated(OnCreate.class) @RequestBody UserAccount userAccount) {
...
}
@PatchMapping(value = "/update")
public UserAccount updateAccount(@Validated(OnUpdate.class) @RequestBody UserAccount userAccount) {
...
}
Here we specify @Validated(...) instead of @Valid and specify the validation group that should be used in each case.
Now depending on validation group we'll perform the validations for the particular fields within different steps.
