BigDecimal has scientific notation in soap message - java

I have a strange problem with our web service.
I have an object OrderPosition which has a price field (xsd:decimal with fractionDigits = 9). Apache CXF generates the proxy classes for me, and this field is a BigDecimal. When I send a value greater than 10000000.00000, the field appears in the SOAP message in scientific notation (for example 1.1423E+7).
How can I ensure the value is not sent in scientific notation?

Here is one way this can be done.
BigDecimal has a constructor which takes the input number as a string. When used, this preserves the input formatting when its toString() method is called, e.g.
BigDecimal bd = new BigDecimal("10000000.00000");
System.out.println(bd);
will print 10000000.00000.
This can be leveraged in JAXB XmlAdapters. JAXB XmlAdapters offer a convenient way to control and customize the marshalling/unmarshalling process. A typical adapter for BigDecimal would look as follows.
public class BigDecimalXmlAdapter extends XmlAdapter<String, BigDecimal> {
@Override
public String marshal(BigDecimal bigDecimal) throws Exception {
if (bigDecimal != null){
return bigDecimal.toString();
}
else {
return null;
}
}
@Override
public BigDecimal unmarshal(String s) throws Exception {
try {
return new BigDecimal(s);
} catch (NumberFormatException e) {
return null;
}
}
}
This adapter then needs to be registered with the JAXB context, for example via the @XmlJavaTypeAdapter annotation.
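A minimal field-level sketch of that registration (the OrderPosition class and its price field are assumptions based on the question; package-level registration via package-info.java is also possible):

import java.math.BigDecimal;
import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.bind.annotation.adapters.XmlJavaTypeAdapter;

@XmlRootElement
public class OrderPosition {

    // JAXB will call BigDecimalXmlAdapter.marshal()/unmarshal() for this field
    @XmlJavaTypeAdapter(BigDecimalXmlAdapter.class)
    private BigDecimal price;

    public BigDecimal getPrice() { return price; }
    public void setPrice(BigDecimal price) { this.price = price; }
}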

@Santosh Thank you! XmlAdapter was what I needed.
Furthermore, as I said in my question, I generate the client classes with Apache CXF. For this kind of problem I had to add the following code to bindings.xjb (the binding file used by the cxf-codegen-plugin in Maven).
<jaxb:javaType name="java.math.BigDecimal" xmlType="xs:decimal"
parseMethod="sample.BigDecimalFormater.parseBigDecimal"
printMethod="sample.BigDecimalFormater.printBigDecimal" />
This is my formatter code:
public class BigDecimalFormater {
public static String printBigDecimal(BigDecimal value) {
// setScale returns a new BigDecimal; the original value is immutable
return value.setScale(5).toPlainString();
}
public static BigDecimal parseBigDecimal(String value) {
return new BigDecimal(value);
}
}
The plugin then generates an Adapter for me:
public class Adapter1 extends XmlAdapter<String, BigDecimal> {
public BigDecimal unmarshal(String value) {
return (sample.BigDecimalFormater.parseBigDecimal(value));
}
public String marshal(BigDecimal value) {
return (sample.BigDecimalFormater.printBigDecimal(value));
}
}
In the generated class the BigDecimal field has the annotation @XmlJavaTypeAdapter(Adapter1.class), which resolved the problem.
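For reference, the annotated field in the generated class looks roughly like this (the field name price is an assumption; the exact set of annotations depends on the schema and XJC version):

// roughly what the generated field looks like; "price" is an illustrative name
@XmlElement(required = true)
@XmlJavaTypeAdapter(Adapter1.class)
protected BigDecimal price;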

Related

Method returning alternative data structures based on execution result

Say I have a function that looks at a file and returns two results: recognized and unrecognized. When it returns the recognized result, I want the result to also contain a message but when it is unrecognized, no message is necessary.
public Result checkFile(File file) {
...
}
There are two ways I can think of to accomplish this...
Have the Result class like so:
class Result {
private Type type;
private String message;
enum Type {
RECOGNIZED, UNRECOGNIZED
}
}
Or do it like so:
class Result {
}
class Unrecognized extends Result {
}
class Recognized extends Result {
private String message;
}
I'm inclined to use the second method, even though I'd have to check the result using instanceof and I've read that instanceof should be avoided whenever possible, but doing this avoids having a null message when the result is unrecognized. For this example a null message wouldn't be much of an issue, but what if there is a lot more data associated with a recognized result? It seems like worse practice to me to instantiate a class that could have all null fields.
What is the best practice to handle this situation? Is there some standard method or pattern?
Two classes might be overkill, because it is one and the same class of object. Also, an enum with two values which merely resemble true and false is not required. One class Result should suffice, and this would also remove the demand for a common interface. I'd be all for "no complexity beyond necessary"...
class RecognitionResult {
private String message = "default message";
private boolean recognized = false;
public RecognitionResult() {}
public RecognitionResult(boolean value) {
this.setRecognised(value);
}
public void setRecognised(boolean value) {
this.recognized = value;
}
public void setMessage(@NonNull String value) {
this.message = value;
}
public boolean getRecognised() {
return this.recognized;
}
@Nullable
public String getMessage() {
return this.recognized ? this.message : null;
}
}
then one can simply do:
return new RecognitionResult(true);
an interface for asynchronous callbacks might look like this:
interface Recognition {
void OnComplete(RecognitionResult result);
}
or if you really want to optimize:
interface Recognition {
void OnSuccess(RecognitionResult result);
void OnFailure(RecognitionException e);
}
Of course there's no 'correct' design here - it's going to be a matter of opinion which way you go. However my view is that the modern trend in OOD is to minimise the use of extension and to use delegation and implementation of interfaces wherever possible.
As a general rule, whenever you think of using instanceof, reconsider your design.
This would be my suggestion:
interface Result {
boolean isRecognised();
String getMessage();
}
class RecognisedResult implements Result {
private final String message;
public RecognisedResult(String message) {
this.message = message;
}
public boolean isRecognised() {
return true;
}
public String getMessage() {
return message;
}
}
class UnrecognisedResult implements Result {
public boolean isRecognised() {
return false;
}
public String getMessage() {
throw new UnsupportedOperationException("No message for unrecognised results");
}
}
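With this design the caller never needs instanceof; a hypothetical call site might look like this:

// Hypothetical caller; checkFile() is the method from the question
Result result = checkFile(file);
if (result.isRecognised()) {
    System.out.println("Recognised: " + result.getMessage());
} else {
    System.out.println("File was not recognised");
}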
You can look at the way Retrofit implements your concept of "recognised" and "message":
https://square.github.io/retrofit/2.x/retrofit/retrofit2/Response.html. It is similar to your first method.
What they did is have a class called Response containing a method called isSuccessful() and a method called body() that holds the payload if the call was successful (or null if it was unsuccessful).
You can try something like the following:
class Result {
private Type type;
private String message;
public boolean isSuccessful() {
return type == Type.RECOGNIZED;
}
public String getMessage(){
return message; //null if unrecognized.
}
enum Type {
RECOGNIZED, UNRECOGNIZED
}
}
The functional way to do this would be to use an Either type, which doesn't come with the JDK but is available in the vavr library. Based on your comments on this thread, it appears you don't clearly understand how type inheritance works. In that case, a functional solution may be overkill, and I'd suggest going with @sprinter's solution.
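For completeness, here is a minimal sketch of what that could look like with vavr's Either (this assumes the io.vavr dependency is on the classpath; the recognition check itself is just a placeholder):

import java.io.File;
import io.vavr.control.Either;

public class FileChecker {

    // Right carries the message of a recognised file, Left carries the unrecognised file
    public Either<File, String> checkFile(File file) {
        boolean recognised = file.getName().endsWith(".csv"); // placeholder check
        return recognised
                ? Either.right("Recognised CSV file: " + file.getName())
                : Either.left(file);
    }

    public void printResult(File file) {
        String report = checkFile(file)
                .map(message -> "OK: " + message)      // applied only to the Right case
                .getOrElse("File was not recognised"); // fallback for the Left case
        System.out.println(report);
    }
}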

Using protobuf with flink

I'm using Flink to read data from Kafka and convert it to protobuf. The problem I'm facing is that when I run the Java application I get the error below. If I modify the unknownFields variable name to something else it works, but it's hard to make this change on all the protobuf classes.
I also tried to deserialize directly when reading from Kafka, but I'm not sure what TypeInformation should be returned from the getProducedType() method.
public static class ProtoDeserializer implements DeserializationSchema{
@Override
public TypeInformation getProducedType() {
// TODO Auto-generated method stub
return PrimitiveArrayTypeInfo.BYTE_PRIMITIVE_ARRAY_TYPE_INFO;
}
Appreciate all the help. Thanks.
java.lang.RuntimeException: The field protected com.google.protobuf.UnknownFieldSet com.google.protobuf.GeneratedMessage.unknownFields is already contained in the hierarchy of the class com.google.protobuf.GeneratedMessage.Please use unique field names through your classes hierarchy
at org.apache.flink.api.java.typeutils.TypeExtractor.getAllDeclaredFields(TypeExtractor.java:1594)
at org.apache.flink.api.java.typeutils.TypeExtractor.analyzePojo(TypeExtractor.java:1515)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1412)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1319)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:609)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:437)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:306)
at org.apache.flink.api.java.typeutils.TypeExtractor.getFlatMapReturnTypes(TypeExtractor.java:133)
at org.apache.flink.streaming.api.datastream.DataStream.flatMap(DataStream.java:529)
Code:
FlinkKafkaConsumer09<byte[]> kafkaConsumer = new FlinkKafkaConsumer09<>("testArr",new ByteDes(),p);
DataStream<byte[]> input = env.addSource(kafkaConsumer);
DataStream<PBAddress> protoData = input.map(new RichMapFunction<byte[], PBAddress>() {
@Override
public PBAddress map(byte[] value) throws Exception {
PBAddress addr = PBAddress.parseFrom(value);
return addr;
}
});
Maybe you should try the following:
env.getConfig().registerTypeWithKryoSerializer(PBAddress.class, ProtobufSerializer.class);
or
env.getConfig().registerTypeWithKryoSerializer(PBAddress.class, PBAddressSerializer.class);
public class PBAddressSerializer extends Serializer<Message> {
final private Map<Class,Method> hashMap = new HashMap<Class, Method>();
protected Method getParse(Class cls) throws NoSuchMethodException {
Method method = hashMap.get(cls);
if (method == null) {
method = cls.getMethod("parseFrom",new Class[]{byte[].class});
hashMap.put(cls,method);
}
return method;
}
@Override
public void write(Kryo kryo, Output output, Message message) {
byte[] ser = message.toByteArray();
output.writeInt(ser.length,true);
output.writeBytes(ser);
}
@Override
public Message read(Kryo kryo, Input input, Class<Message> pbClass) {
try {
int size = input.readInt(true);
byte[] barr = new byte[size];
input.read(barr);
return (Message) getParse(pbClass).invoke(null,barr);
} catch (Exception e) {
throw new RuntimeException("Could not create " + pbClass, e);
}
}
}
Try this:
public class ProtoDeserializer implements DeserializationSchema<PBAddress> {
@Override
public TypeInformation<PBAddress> getProducedType() {
return TypeInformation.of(PBAddress.class);
}
// deserialize() and isEndOfStream() omitted here; see the complete sketch below
}
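DeserializationSchema also requires deserialize() and isEndOfStream(), so a complete version could look roughly like this (a sketch only; PBAddress is the generated protobuf class from the question, and the exact import path of DeserializationSchema varies between Flink versions):

import java.io.IOException;
// in older Flink versions: org.apache.flink.streaming.util.serialization.DeserializationSchema
import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;

public class ProtoDeserializer implements DeserializationSchema<PBAddress> {

    @Override
    public PBAddress deserialize(byte[] message) throws IOException {
        // parseFrom is generated for every protobuf message class
        return PBAddress.parseFrom(message);
    }

    @Override
    public boolean isEndOfStream(PBAddress nextElement) {
        // the Kafka stream is unbounded, so never signal end-of-stream
        return false;
    }

    @Override
    public TypeInformation<PBAddress> getProducedType() {
        return TypeInformation.of(PBAddress.class);
    }
}

The schema can then be passed to the Kafka consumer directly, e.g. new FlinkKafkaConsumer09<>("testArr", new ProtoDeserializer(), p), instead of mapping byte[] afterwards.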
https://issues.apache.org/jira/browse/FLINK-11333 is the JIRA ticket tracking the issue of implementing first-class support for Protobuf types with evolvable schema. You'll see that there was a pull request quite some time ago, which hasn't been merged. I believe the problem was that there is no support there for handling state migration in cases where Protobuf was previously being used by registering it with Kryo.
Meanwhile, the Stateful Functions project (statefun is a new API that runs on top of Flink) is based entirely on Protobuf, and it includes support for using Protobuf with Flink: https://github.com/apache/flink-statefun/tree/master/statefun-flink/statefun-flink-common/src/main/java/org/apache/flink/statefun/flink/common/protobuf. (The entry point to that package is ProtobufTypeInformation.java.) I suggest exploring this package (which includes nothing statefun specific); however, it doesn't concern itself with migrations from Kryo either.

set Jackson ObjectMapper class not to use scientific notation for double

I am using the com.fasterxml.jackson library for JsonSchema.
I am creating an IntegerSchema object and setting a range for it using the code below:
public static void main(String[] args) {
IntegerSchema intSchema = new IntegerSchema();
// setMaximum accepts Double object
intSchema.setMaximum(new Double(102000000));
// setMinimum accepts Double object
intSchema.setMinimum(new Double(100));
printJsonSchema(intSchema);
}
public static void printJsonSchema(JsonSchema schema) {
ObjectMapper mapper = new ObjectMapper();
try {
logger.info(mapper.writeValueAsString(schema));
} catch (JsonProcessingException e) {
throw new IllegalStateException(e);
}
}
When I convert the IntegerSchema to a string using ObjectMapper, I get the response below:
{"type":"integer","maximum":1.02E8,"minimum":100.0}
The maximum and minimum values are converted to scientific notation.
But I need the output in non-scientific notation, as below:
{"type":"integer","maximum":102000000,"minimum":100}
I cannot change the IntegerSchema class.
How can I get the required output without extending the IntegerSchema class?
Thanks in advance
This is somewhat a Java issue, I believe. If you debug your program, you will see that the Double is displayed in scientific notation, so what we want is to force it into a String. This can be achieved in Java in multiple ways; see:
How to print double value without scientific notation using Java?
In terms of your specific question about Jackson, I've written up some code for you:
public class ObjectMapperTest {
public static void main(String[] args) throws JsonProcessingException {
IntegerSchema schema = new IntegerSchema();
schema.type = "Int";
schema.max = 10200000000d;
schema.min = 100d;
ObjectMapper m = new ObjectMapper();
System.out.println(m.writeValueAsString(schema));
}
public static class IntegerSchema {
@JsonProperty
String type;
@JsonProperty
double min;
@JsonProperty
@JsonSerialize(using = MyDoubleSerializer.class)
double max;
}
public static class MyDoubleSerializer extends JsonSerializer<Double> {
@Override
public void serialize(Double value, JsonGenerator gen, SerializerProvider serializers)
throws IOException, JsonProcessingException {
BigDecimal d = new BigDecimal(value);
gen.writeNumber(d.toPlainString());
}
}
}
The trick is to register a custom serializer for your Double value. This way, you can control the output format.
I am using the BigDecimal value to create a String representation of your Double. The output then becomes (for the specific example):
{"type":"Int","min":100.0,"max":10200000000}
I hope that solves your problem.
Artur
Set Feature.WRITE_BIGDECIMAL_AS_PLAIN on your ObjectMapper.
I know I am answering late, but something I faced may help others.
While converting a BigDecimal, I found that the following works:
mapper = mapper.setNodeFactory(JsonNodeFactory.withExactBigDecimals(true));
while this did not work for me:
mapper.configure(JsonGenerator.Feature.WRITE_BIGDECIMAL_AS_PLAIN, true);
Update for Jackson 2.9.10:
The WRITE_BIGDECIMAL_AS_PLAIN property now lives on com.fasterxml.jackson.core.JsonGenerator. You can use:
mapper.enable(JsonGenerator.Feature.WRITE_BIGDECIMAL_AS_PLAIN);
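A small self-contained sketch showing the effect (the class name and the literal value are just for illustration; the feature only applies to BigDecimal values):

import java.math.BigDecimal;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.ObjectMapper;

public class PlainBigDecimalDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        mapper.enable(JsonGenerator.Feature.WRITE_BIGDECIMAL_AS_PLAIN);

        // prints 102000000 instead of 1.02E+8
        System.out.println(mapper.writeValueAsString(new BigDecimal("1.02E+8")));
    }
}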
If you are using valueToTree, then no factory settings are needed. The only problem with valueToTree is that it converts the value to a TextNode (string format), so if you have any logic based on ObjectNodes it will not work.
You should use
mapper.configure(DeserializationFeature.USE_BIG_DECIMAL_FOR_FLOATS, true);
to avoid scientific notation on floating point numbers.
You can find an example below.
ObjectMapper mapper = new ObjectMapper();
mapper.configure(DeserializationFeature.USE_BIG_DECIMAL_FOR_FLOATS, true);
String test ="{\"doubleValue\" : 0.00001}";
try {
System.out.println(mapper.readTree(test).toPrettyString());
} catch (JsonProcessingException e) {
e.printStackTrace();
}
Output
{
"doubleValue" : 0.00001
}

Genson Polymorphic / Generic Serialization

I am trying to implement JSON serialization in Java with Genson 1.3 for polymorphic types, including:
Numbers
Arrays
Enum classes
The SSCCE below demonstrates roughly what I am trying to achieve:
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import com.owlike.genson.Genson;
import com.owlike.genson.GensonBuilder;
/**
* A Short, Self Contained, Compilable, Example for polymorphic serialization
* and deserialization.
*/
public class GensonPolymorphicRoundTrip {
// our example enum
public static enum RainState {
NO_RAIN,
LIGHT_RAIN,
MODERATE_RAIN,
HEAVY_RAIN,
LIGHT_SNOW,
MODERATE_SNOW,
HEAVY_SNOW;
}
public static class Measurement<T> {
public T value;
public int qualityValue;
public String source;
public Measurement() {
}
public Measurement(T value, int qualityValue, String source) {
this.value = value;
this.qualityValue = qualityValue;
this.source = source;
}
}
public static class DTO {
public List<Measurement<?>> measurements;
public DTO(List<Measurement<?>> measurements) {
this.measurements = measurements;
}
}
public static void main(String... args) {
Genson genson = new GensonBuilder()
.useIndentation(true)
.useRuntimeType(true)
.useClassMetadataWithStaticType(false)
.addAlias("RainState", RainState.class)
.useClassMetadata(true)
.create();
DTO dto = new DTO(
new ArrayList(Arrays.asList(
new Measurement<Double>(15.5, 8500, "TEMP_SENSOR"),
new Measurement<double[]>(new double[] {
2.5,
1.5,
2.0
}, 8500, "WIND_SPEED"),
new Measurement<RainState>(RainState.LIGHT_RAIN, 8500, "RAIN_SENSOR")
)));
String json = genson.serialize(dto);
System.out.println(json);
DTO deserialized = genson.deserialize(json, DTO.class);
}
}
Numbers and arrays worked well out of the box, but the enum class is providing a bit of a challenge. In this case the serialized JSON form would, in my opinion, have to be a JSON object including a:
type member
value member
Looking at the EnumConverter class I see that I would need to provide a custom Converter. However I can't quite grasp how to properly register the Converter so that it would be called during deserialization. How should this serialization be solved using Genson?
Thanks for providing a complete example!
The first problem is that DTO doesn't have a no-arg constructor, but Genson supports classes whose constructors take arguments. You just have to enable it via the builder with useConstructorWithArguments(true).
However, this will not solve the complete problem. For the moment Genson has full polymorphic support only for types that are serialized as a JSON object, because Genson adds a property called '@class' to the object. There is an open issue for that.
Probably the best solution, and one that should work in most situations, is to define a converter that automatically wraps all values in JSON objects, so that the converter handling class metadata is able to generate it. This can be a "good enough" solution while waiting for this to be officially supported by Genson.
So first define the wrapping converter
public static class LiteralAsObjectConverter<T> implements Converter<T> {
private final Converter<T> concreteConverter;
public LiteralAsObjectConverter(Converter<T> concreteConverter) {
this.concreteConverter = concreteConverter;
}
@Override
public void serialize(T object, ObjectWriter writer, Context ctx) throws Exception {
writer.beginObject().writeName("value");
concreteConverter.serialize(object, writer, ctx);
writer.endObject();
}
@Override
public T deserialize(ObjectReader reader, Context ctx) throws Exception {
reader.beginObject();
T instance = null;
while (reader.hasNext()) {
reader.next();
if (reader.name().equals("value")) instance = concreteConverter.deserialize(reader, ctx);
else throw new IllegalStateException(String.format("Encountered unexpected property named '%s'", reader.name()));
}
reader.endObject();
return instance;
}
}
Then you need to register it with a ChainedFactory which would allow you to delegate to the default converter (this way it works automatically with any other type).
Genson genson = new GensonBuilder()
.useIndentation(true)
.useConstructorWithArguments(true)
.useRuntimeType(true)
.addAlias("RainState", RainState.class)
.useClassMetadata(true)
.withConverterFactory(new ChainedFactory() {
@Override
protected Converter<?> create(Type type, Genson genson, Converter<?> nextConverter) {
if (Wrapper.toAnnotatedElement(nextConverter).isAnnotationPresent(HandleClassMetadata.class)) {
return new LiteralAsObjectConverter(nextConverter);
} else {
return nextConverter;
}
}
}).create();
The downside of this solution is that useClassMetadataWithStaticType needs to be set to true... but I guess that is acceptable, as it is only an optimization and fixing it would imply some changes in Genson's code; the rest still works.
If you are interested in this problem, it would be great if you gave that issue a shot and opened a PR to provide this feature as part of Genson.

Serialize a Double to 2 decimal places using Jackson

I'm using Jackson, with Spring MVC, to write out some simple objects as JSON. One of the objects has an amount property of type Double. (I know that Double should not be used as a monetary amount; however, this is not my code.)
In the JSON output, I'd like to restrict the amount to 2 decimal places. Currently it is shown as:
"amount":459.99999999999994
I've tried using Spring 3's @NumberFormat annotation, but haven't had success in that direction. Looks like others had issues too: MappingJacksonHttpMessageConverter's ObjectMapper does not use ConversionService when binding JSON to JavaBean properties.
Also, I tried using the #JsonSerialize annotation, with a custom serializer.
In the model:
@JsonSerialize(using = CustomDoubleSerializer.class)
public Double getAmount()
And serializer implementation:
public class CustomDoubleSerializer extends JsonSerializer<Double> {
@Override
public void serialize(Double value, JsonGenerator jgen, SerializerProvider provider) throws IOException, JsonGenerationException {
if (null == value) {
//write the word 'null' if there's no value available
jgen.writeNull();
} else {
final String pattern = ".##";
//final String pattern = "###,###,##0.00";
final DecimalFormat myFormatter = new DecimalFormat(pattern);
final String output = myFormatter.format(value);
jgen.writeNumber(output);
}
}
}
The CustomDoubleSerializer "appears" to work. However, can anyone suggest a simpler (or more standard) way of doing this?
I know that Double should not be used as a monetary amount. However,
this is not my code.
Indeed, it should not. BigDecimal is a much better choice for storing monetary amounts because it is lossless and provides more control of the decimal places.
So for people who do have control over the code, it can be used like this:
double amount = 111.222;
setAmount(new BigDecimal(amount).setScale(2, BigDecimal.ROUND_HALF_UP));
That will serialize as 111.22. No custom serializers needed.
I had a similar situation in my project. I added the formatting code to the setter method of the POJO. DecimalFormat, Math, and other classes ended up rounding the value, whereas my requirement was not to round the value but only to limit the display to 2 decimal places.
I recreated this scenario.
Product is a POJO which has a member Double amount.
JavaToJSON is a class that will create an instance of Product and convert it to JSON.
The setter setAmount in Product will take care of formatting to 2 decimal places.
Here is the complete code.
Product.java
package com;
import java.math.BigDecimal;
import java.math.RoundingMode;
public class Product {
private String name;
private Double amount;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Double getAmount() {
return amount;
}
public void setAmount(Double amount) {
BigDecimal bd = new BigDecimal(amount).setScale(2, RoundingMode.FLOOR);
this.amount = bd.doubleValue();
}
@Override
public String toString() {
return "Product [name=" + name + ", amount=" + amount + "]";
}
}
JavaToJSON.java
package com;
import java.io.File;
import java.io.IOException;
import org.codehaus.jackson.JsonGenerationException;
import org.codehaus.jackson.map.JsonMappingException;
import org.codehaus.jackson.map.ObjectMapper;
public class JavaToJSON {
public static void main(String[] args){
ObjectMapper mapper = new ObjectMapper();
try {
Product product = new Product();
product.setName("TestProduct");
product.setAmount(Double.valueOf("459.99999999999994"));
// Convert product to JSON and write to file
mapper.writeValue(new File("d:\\user.json"), product);
// display to console
System.out.println(product);
} catch (JsonGenerationException e) {
e.printStackTrace();
} catch (JsonMappingException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
I haven't accumulated enough points so I am not able to upload the screenshots to show you the output.
Hope this helps.
Regarding what was stated above, I just wanted to fix a little something, so that people won't waste time on it as I did. One should actually use
BigDecimal.valueOf(amount).xxx
instead of
new BigDecimal(amount).xxx
and this is actually important: if you don't, your decimal amount will be messed up. This is a limitation of floating point representation.
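A small illustration of the difference (the values shown in the comments are what each call actually produces):

import java.math.BigDecimal;

public class BigDecimalConstructionDemo {
    public static void main(String[] args) {
        // exact binary value of the double:
        // 0.1000000000000000055511151231257827021181583404541015625
        System.out.println(new BigDecimal(0.1));

        // uses Double.toString(0.1) internally and prints 0.1
        System.out.println(BigDecimal.valueOf(0.1));
    }
}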
The best way I have seen so far is to create a custom serializer and use @JsonSerialize(using = NewClass.class). I wanted to try @JsonFormat(pattern = ".##") or similar, but it may not work, according to one comment from the OP (I think the formatter does not honor that).
See here: https://github.com/FasterXML/jackson-databind/issues/632
public class MoneyDeserializer extends JsonDeserializer<BigDecimal> {
private NumberDeserializers.BigDecimalDeserializer delegate = NumberDeserializers.BigDecimalDeserializer.instance;
@Override
public BigDecimal deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
BigDecimal bd = delegate.deserialize(jp, ctxt);
bd = bd.setScale(2, RoundingMode.HALF_UP);
return bd;
}
}
BUT, although this is more convenient and means less code, deciding the scale of a field is usually a concern of the business logic, not of (de)serialization. Be clear about that; Jackson should be able to just pass the data as-is.
Note that 459.99999999999994 is effectively 460 and is expected to be serialized that way, so your logic should be trickier than just dropping digits.
I might suggest something like:
Math.round(value * 100) / 100.0
You might want to put it into the setter and get rid of the custom serialization.
