I have forms where I want to be able to leave some numerical fields empty, but I get "Invalid value" errors from Form.bindFromRequest(). This seems like a common requirement, but I can't find any documentation on this after hours of searching. I am still hoping I've missed something obvious, though!
I did some digging and found that Play uses Spring's DataBinder class to do the actual binding. That class is really sophisticated and would allow me to set custom binders for my fields; if I were using Spring I could just add an @InitBinder method to my controller to set up the binder exactly the way I want using a CustomNumberEditor. However, it seems that Play Framework's Form object does not allow access to the DataBinder, apart from setting allowed fields. So the binder tries to convert an empty string into a long, which gives a NumberFormatException:
Failed to convert property value of type 'java.lang.String' to required type 'long'
for property 'unitPriceExVAT';
nested exception is java.lang.NumberFormatException: For input string: ""
So, is there some other way of getting the binder to allow empty numerical fields?
Declare your fields as Long instead of the long primitive type and the empty values will be treated as null.
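For example, a minimal sketch with a hypothetical form-backing class; only the field type changes:
// Hypothetical Play form-backing class: with Long instead of long,
// Form.bindFromRequest() leaves the field null when the input is empty.
public class OrderForm {
    public Long unitPriceExVAT;   // was: public long unitPriceExVAT;
    public String description;
}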
I'm using AWS in my project and I just want to make Regions configurable in a properties file.
So com.amazonaws.regions.Regions accepts enum values like Regions.AP_EAST_1.
I tried assigning it in application.properties:
aws.s3.config.region=com.amazonaws.regions.Regions.US_WEST_2
@Value("${aws.s3.config.region}")
private Regions clientRegion;
This just doesn't work out. It gives me this:
java.lang.IllegalStateException: Cannot convert value of type [java.lang.String] to required type [com.amazonaws.regions.Regions]: no matching editors or conversion strategy found
Spring knows how to map enums. Just use the enum value (not a fully-qualified name):
aws.s3.config.region: US_WEST_2
Note also that Spring Cloud AWS is available (though it may not always be the best choice; it has very strong opinions), and that in general it's better to use @ConfigurationProperties, or at a minimum constructor injection, instead of @Value on a field, which has all the pitfalls of field injection.
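For example, a minimal sketch using constructor injection (the class name is made up; the property key mirrors the question):
import com.amazonaws.regions.Regions;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
class S3Config {

    private final Regions clientRegion;

    // Spring converts the plain enum name from application.properties
    // (aws.s3.config.region=US_WEST_2) into Regions.US_WEST_2.
    S3Config(@Value("${aws.s3.config.region}") Regions clientRegion) {
        this.clientRegion = clientRegion;
    }
}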
Is it possible to pass Java code as a value in a YAML file? For example, something like this:
---
dueDate: "DueDateCalc()"
DueDateCalc() might be a method defined in the Java code that is parsing the YAML. It would then set the Java dueDate property to the return value of the predefined DueDateCalc() method.
This is possible within the constraints of Java runtime reflection; however, you need to implement it yourself.
For example, your YAML could look like this:
---
dueDate: !call DueDateCalc
!call is a local tag telling the loading code that the scalar value DueDateCalc should be interpreted as a method to be called (the tag name is chosen by you, not something predefined). You can implement this with a custom constructor for the !call tag that searches for a method with the given name within some given class, and then calls it on some given object.
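A minimal sketch of such a constructor, assuming SnakeYAML 1.x (which has a no-argument Constructor constructor) and a hypothetical DueDates class holding the callable method:
import java.lang.reflect.Method;
import org.yaml.snakeyaml.constructor.AbstractConstruct;
import org.yaml.snakeyaml.constructor.Constructor;
import org.yaml.snakeyaml.nodes.Node;
import org.yaml.snakeyaml.nodes.ScalarNode;
import org.yaml.snakeyaml.nodes.Tag;

// Hypothetical holder of the methods that may be referenced from YAML.
class DueDates {
    public static String DueDateCalc() {
        return "2038-01-19";
    }
}

class CallConstructor extends Constructor {
    CallConstructor() {
        // Route every node tagged !call through ConstructCall.
        this.yamlConstructors.put(new Tag("!call"), new ConstructCall());
    }

    private class ConstructCall extends AbstractConstruct {
        @Override
        public Object construct(Node node) {
            String methodName = ((ScalarNode) node).getValue();
            try {
                // Look up the named no-arg static method and invoke it.
                Method method = DueDates.class.getMethod(methodName);
                return method.invoke(null);
            } catch (ReflectiveOperationException e) {
                throw new IllegalArgumentException("Cannot invoke " + methodName, e);
            }
        }
    }
}
Loading dueDate: !call DueDateCalc with new Yaml(new CallConstructor()) then yields a map whose dueDate entry is whatever DueDateCalc() returned.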
What about parameters? Well, still possible, but it will get ugly fast. The first problem is how you define the parameters:
with nested YAML sequences: !call [MyMethod, [1, 2, 3]]
with a scalar that needs to be parsed: !call MyMethod(1, 2, 3)
The former option lets YAML parse the parameters and you'll get a list; the latter option requires you to parse the method call yourself from the string you get from YAML.
The second problem is loading the values into Java variables so that you can pass them as an argument list. Java reflection lets you get the method's parameter types, and you can use those to load the parameter values. For example, if the first parameter's type is String, you would load 1 as the string "1", while if it is int, you would load it as the number 1. This is possible with SnakeYAML's built-in facilities if you use nested YAML sequences to encode the method call.
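A rough sketch of that variant, again with made-up names (it only checks arity; matching parameter types against the loaded arguments is left out):
import java.lang.reflect.Method;
import java.util.List;
import org.yaml.snakeyaml.constructor.AbstractConstruct;
import org.yaml.snakeyaml.constructor.Constructor;
import org.yaml.snakeyaml.nodes.Node;
import org.yaml.snakeyaml.nodes.SequenceNode;
import org.yaml.snakeyaml.nodes.Tag;

// Hypothetical holder of callable methods, as in the sketch above.
class CalcMethods {
    public static String dueDatePlusDays(int days) {
        return "base date + " + days + " days";
    }
}

class CallWithArgsConstructor extends Constructor {
    CallWithArgsConstructor() {
        this.yamlConstructors.put(new Tag("!call"), new ConstructCallWithArgs());
    }

    // Handles the "!call [methodName, [arg1, arg2, ...]]" encoding.
    private class ConstructCallWithArgs extends AbstractConstruct {
        @Override
        public Object construct(Node node) {
            // SnakeYAML constructs the nested sequence and its elements for us.
            List<?> seq = constructSequence((SequenceNode) node);
            String methodName = (String) seq.get(0);
            List<?> args = (List<?>) seq.get(1);
            for (Method m : CalcMethods.class.getMethods()) {
                if (m.getName().equals(methodName) && m.getParameterCount() == args.size()) {
                    try {
                        return m.invoke(null, args.toArray());
                    } catch (ReflectiveOperationException e) {
                        throw new IllegalArgumentException("Cannot invoke " + methodName, e);
                    }
                }
            }
            throw new IllegalArgumentException("No matching method: " + methodName);
        }
    }
}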
This would even work if the parameters are class objects with complex structure: you'd just use normal YAML syntax and the objects will be loaded properly. Referring to variables in your code is not directly possible, but you could define another tag, !lookup, which retrieves values from a given Map structure.
While reflection lets you make method calls, you cannot directly evaluate an expression like 6*9. So before you try to implement anything, evaluate which functionality you need and check whether it's doable via reflection.
I have an application which converts values from one format to another. This conversion is required at the time of receiving an input or sending an output. I want to use annotations for this purpose.
The requirement is to create a custom annotation that can reset the value of the field it is attached to after conversion based on some value, e.g.
@Convert(value = "butter")
String meal;
So, if meal = "bread" when it is received in the request, it should be converted to "bread n butter" and then passed on for further processing.
I did try searching on Google but did not find anything useful. I have worked on custom annotations for validation, but that doesn't suit the requirement here. I am looking for a way to program such a custom annotation, and I am not aware of any interface, similar to the one implemented for custom validation annotations, that could be implemented for this purpose.
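For illustration only, here is a rough sketch of the kind of plain-reflection fallback I have in mind (all names are made up); what I am hoping for is an existing framework hook that does this for me:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface Convert {
    String value();
}

class ConvertProcessor {
    // Walks the fields of a request object and appends the annotation value
    // to every annotated String field, e.g. "bread" becomes "bread n butter".
    static void apply(Object target) throws IllegalAccessException {
        for (Field field : target.getClass().getDeclaredFields()) {
            Convert convert = field.getAnnotation(Convert.class);
            if (convert != null && field.getType() == String.class) {
                field.setAccessible(true);
                String current = (String) field.get(target);
                if (current != null) {
                    field.set(target, current + " n " + convert.value());
                }
            }
        }
    }
}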
I'm making an API call that returns JSON with a particular field that is either false or a map, depending on the content. It's a field that I don't care about. I expected GSON to ignore this particular field, but it doesn't seem to. The object generation fails with the following message:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected a string but was BEGIN_OBJECT at line 1 column 403560
I've seen this particular question (Gson deserialize json with varying value types). I want to make sure I actually need a custom deserializer before writing one; I'm wondering if I may have another issue.
edit:
Example:
"anonymous_flag": { }
vs
"anonymous_flag": "yes"
Another Edit:
I actually had the field in my model object... I was referencing the wrong class. Judge away :)
If you are deserializing the JSON into a domain object of your own design, you can simply omit the field from it if you're not interested in the value. GSON will ignore the field in the JSON.
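For example, a minimal sketch with a made-up response model; the anonymous_flag field is simply not declared:
import com.google.gson.Gson;

// Hypothetical model: anonymous_flag has no counterpart here, so GSON skips
// it whether the JSON holds a string or an object.
class ApiResponse {
    String id;
    String name;
}

public class GsonSkipFieldDemo {
    public static void main(String[] args) {
        Gson gson = new Gson();
        String withString = "{\"id\":\"1\",\"name\":\"a\",\"anonymous_flag\":\"yes\"}";
        String withObject = "{\"id\":\"2\",\"name\":\"b\",\"anonymous_flag\":{}}";
        ApiResponse r1 = gson.fromJson(withString, ApiResponse.class);
        ApiResponse r2 = gson.fromJson(withObject, ApiResponse.class);
        System.out.println(r1.name + " / " + r2.name); // both parse without error
    }
}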
I have a basic question about how to proceed in my application. I have a form and I need to validate that all the inputs are numbers. My problem is deciding the type of the attributes of the bean associated with the form. I don't know whether to set them to String or to double, for these reasons:
If I set them to double: if I enter something in the input which is not a number, then when Spring populates the bean I get an exception in the JSP saying it could not convert the value to double.
If I set them to String: the validation works well, although I have to convert the values to double later. But my problem here is that this bean is stored in a database with Hibernate, and with the @Column annotation the value would be stored as text, whereas I would like to store it as a double. Is there any possibility of changing the column type so it is stored as a double?
Can anyone give me an idea of how to proceed in this case?
Thanks.
I suggest you always work with your domain types and not use String just because that's the way HTTP sends params. If a field has type double, you will use it as such in your code and also store it as such in the database. Let Spring convert the request params to your needed type.
Data binding is useful for allowing user input to be dynamically bound to the domain model of an application (or whatever objects you use to process user input). Spring provides the so-called DataBinder class to do exactly that.
You can register custom PropertyEditors in the initBinder method of your controllers, which will allow you to transform the Strings from your request into the desired type. See for example the CustomNumberEditor class, used to parse user-entered number strings into Number properties of beans. You can then combine this with the Validator interface for more complex checks.
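A minimal sketch of such a registration (controller and field types are made up):
import java.text.NumberFormat;
import org.springframework.beans.propertyeditors.CustomNumberEditor;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.WebDataBinder;
import org.springframework.web.bind.annotation.InitBinder;

@Controller
public class OrderController {

    // Lenient editor for Double fields: empty strings become null instead of
    // causing a conversion error; malformed numbers still produce a typeMismatch.
    @InitBinder
    public void initBinder(WebDataBinder binder) {
        binder.registerCustomEditor(Double.class,
                new CustomNumberEditor(Double.class, NumberFormat.getInstance(), true));
    }
}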
EDIT: Spring binding uses typeMismatch error codes for binding errors when a conversion fails (and required codes when a field is marked as required but not supplied). In your case it defaults to showing the exception message. To change the message to a friendlier one, you must supply a bundle key in your properties file using the typeMismatch prefix.
This is configured via DataBinder.setMessageCodesResolver and defaults to org.springframework.validation.DefaultMessageCodesResolver. The javadoc of DefaultMessageCodesResolver shows complete examples, but basically you just have to add an entry like this to your properties file:
typeMismatch.yourField=Your user friendly error message goes here
You can map the exception to a custom message if you have an entry of the following form in your message.properties (or the equivalent message bundle that you are using):
typeMismatch.fieldName, where fieldName would be the name of the field you are validating.
If you are using Spring 3.0, have a look at the "Overriding Defaults with Annotations" part of Spring 3 Type Conversion and Validation. If you are using Spring 2.x, you can achieve this by registering a custom PropertyEditor, as mentioned in the answer above.