Apache Commons Configuration validate properties file - java

I am using Apache Commons Configuration library with PropertiesConfiguration.
My application loads the config file right after it is started, like this:
public PropertiesConfiguration loadConfigFile(File configFile) throws ConfigurationNotFoundException {
    try {
        if (configFile != null && configFile.exists()) {
            config.load(configFile);
            config.setListDelimiter(';');
            config.setAutoSave(true);
            config.setReloadingStrategy(new FileChangedReloadingStrategy());
            setConfigLoaded(true);
        } else {
            throw new ConfigurationNotFoundException("Configuration file not found.");
        }
    } catch (ConfigurationException e) {
        logger.warn(e.getMessage());
        setDefaultConfigValues(config);
        config.setFile(configFile);
    }
    return config;
}
My question is: how can I validate the configFile, so I can be sure that no property in that file is missing and that later in my code I won't get a NullPointerException when trying to access the properties, e.g.:
PropertiesConfiguration config = loadConfig(configFile);
String rootDir = config.getString("paths.download"); // I want to be sure that this property exists right at the app start
I didn't find anything in the documentation or on Google, just something about XML validation.
The goal is to provide feedback to the user at program start that the configuration file is corrupted.
Is there no built-in mechanism for properties files?

What is a configuration object supposed to do if you pass in a key to one of its get methods that does not map to an existing property?
The default behavior as implemented in AbstractConfiguration is to return null if the return type is an object type.
For primitive return types, returning null (or any other special value) is not possible, so in this case a NoSuchElementException is thrown:
// This will return null if no property with key "NonExistingProperty" exists
String strValue = config.getString("NonExistingProperty");
// This will throw a NoSuchElementException exception if no property with
// key "NonExistingProperty" exists
long longValue = config.getLong("NonExistingProperty");
For object types like String, BigDecimal, or BigInteger this default behavior can be changed:
If the setThrowExceptionOnMissing() method is called with an argument of true, these methods will behave like their primitive counterparts and also throw an exception if the passed-in property key cannot be resolved.
The situation is a little tricky for collection and array types, as they will return an empty collection or array.
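There is no schema-style validation for plain properties files, but you can fail fast at startup by checking the keys yourself right after loading. A minimal sketch, reusing the ConfigurationNotFoundException from the question; the key list, key names, and method name are illustrative only:

// Assumed imports: java.util.Arrays, java.util.ArrayList, java.util.List
private static final List<String> REQUIRED_KEYS =
        Arrays.asList("paths.download", "paths.upload"); // example keys only

public void validateConfig(PropertiesConfiguration config) throws ConfigurationNotFoundException {
    // Make getString()/getBigDecimal() etc. throw instead of returning null later on
    config.setThrowExceptionOnMissing(true);

    List<String> missing = new ArrayList<String>();
    for (String key : REQUIRED_KEYS) {
        if (!config.containsKey(key)) {
            missing.add(key);
        }
    }
    if (!missing.isEmpty()) {
        throw new ConfigurationNotFoundException("Missing required properties: " + missing);
    }
}

This way the user gets a single message at program start listing every missing key, instead of a NullPointerException somewhere later.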

Related

Default value to identify parsing error (String -> double/any numeric)

What is the best practice in Java to check for invalid parsing results while using a "default value" instead of exceptions?
The (old-grown) project I'm working on has a fail-safe util-method to parse String to double like:
// null checks, LOG output etc. removed for readability
double parseDouble(String input, double defaultValue) {
    try {
        return Double.parseDouble(input);
    } catch (Exception e) {
        return defaultValue;
    }
}
Now the previous developer(s) always used a default value like returnedValue = parseDouble(someString, -99); and a check like if(returnedValue == -99) to identify an invalid parsing result. The (finally) added SonarQube server complains about this check using == on double, and I want to replace these checks with a "correct" check.
What is the best practice to handle such cases?
I personally would use
parseDouble(someString, Double.NaN); and accordingly the check if(Double.isNaN(returnedValue)). Is this a viable solution?
EDIT:
I forgot to mention, that the utility class is not editable (from my point) and therefore I'm looking on how to easily "fix" the existing code. Also adding third-party libraries would be nice but is (at this point in time) not possible either.
In old times I also would have gone with NaN or POSITIVE_INFINITY or MAX_VALUE or any other value that isn't used. Now I would use the Optional class. Using integers is not reliable because of conversion, and using null is Hoare's billion-dollar mistake: does null mean error, not initialized, not given? Was the input string null, or was it parsed and it was not a valid double representation?
Basically you do not want a method that gives you back a default value when there was a parsing error or null input or the user entered the default value. You want a method that gives you back the information whether there was an error or not, and if not, the parsed value.
To make your code more readable and easier to understand, I would write exactly such a method and use it everywhere where appropriate (instead of copy-pasting workarounds everywhere). If you cannot put it into the existing utility class, make your own additional utility class.
If you are using Java 8, you can make use of Optional-class. If not, then program your own Optional (or grab it from some library).
Here is the utility method:
Optional<Double> parseDouble(String input) {
    try {
        return Optional.of(Double.parseDouble(input));
    } catch (Exception e) {
        return Optional.empty();
    }
}
Here is how to use it:
String input = ...;
Optional<Double> parsedInput = parseDouble(input);
if (!parsedInput.isPresent()) {
    // print out warning and retry input or whatever
}
double convertedInput = parsedInput.get();
Remark:
SonarQube would also criticize catching the overly broad "Exception". Instead you should catch NumberFormatException and NullPointerException.
When the caller needs to know the exact cause, you can then add a method getEmptyReason() to your Optional (or a derived class) and store the exception reason there. But I guess in your case you want to use a default value if the input string was not given (empty or null) and want to do error handling if the value was given but unparseable. In this case you can use:
Optional<Double> parseDouble(String input, double defaultValue) {
    if (input == null || input.trim().isEmpty()) {
        return Optional.of(defaultValue);
    }
    try {
        return Optional.of(Double.parseDouble(input));
    } catch (NumberFormatException e) {
        return Optional.empty();
    }
}
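Used like this (a short sketch; the variable names and the default of 1.0 are just examples), a missing input falls back to the default while a malformed input still surfaces as an error:

Optional<Double> result = parseDouble(userInput, 1.0);
if (!result.isPresent()) {
    // a value was given but could not be parsed: warn the user
} else {
    double value = result.get(); // either the parsed value or the default
}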

Get Field value from entity using DTO

I have an app with several @MappedSuperclass classes. Out of one of them I need to write a CSV with columns in a very particular order established by the client.
Calling Entity.class.getDeclaredFields() used to be enough to retrieve and write the columns in the right order before we had superclasses, but now, even if I use a custom solution to iterate through the superclasses' fields, the order is incorrect, so I resorted to using a DTO entity which returns the right order when calling getDeclaredFields().
The problems come when I try to retrieve the values present in the related entities; we used to do something like:
Object value = getLineFromField(field, line);
Where getLineFromField() method would be like:
private Object getLineFromField(Field field, Entity line) {
    Object value = null;
    try {
        value = field.get(line);
    } catch (Exception e) {
        LOG.info("There is no value. Adding a WhiteSpace to the Column Value");
    }
    return value;
}
The problem appears in field.get(line): this reflective call always returns a null value.
Any experience out there doing a similar mapping?
Just trying to avoid writing a super-ugly 100-liner switch case in the codebase...
EDIT to add the internal exception I get from the Field class: UnsafeObjectFieldAccessorImpl
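One possible explanation, based only on the code shown: field comes from the DTO class, so field.get(line) on an Entity instance would throw IllegalArgumentException, which the catch block swallows and turns into the null you observe. A sketch of resolving the field by name on the entity's own class hierarchy instead (this assumes the DTO and entity fields share names; findField is a hypothetical helper):

private Object getLineFromField(Field dtoField, Entity line) {
    try {
        // Look up the field with the same name on the entity class (or a superclass)
        Field entityField = findField(line.getClass(), dtoField.getName());
        entityField.setAccessible(true);
        return entityField.get(line);
    } catch (Exception e) {
        LOG.info("There is no value. Adding a WhiteSpace to the Column Value");
        return null;
    }
}

// Walks up the hierarchy because the field may be declared in a mapped superclass
private Field findField(Class<?> type, String name) throws NoSuchFieldException {
    for (Class<?> c = type; c != null; c = c.getSuperclass()) {
        try {
            return c.getDeclaredField(name);
        } catch (NoSuchFieldException ignored) {
            // keep walking up the hierarchy
        }
    }
    throw new NoSuchFieldException(name);
}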

How do you handle blank non String values with a FieldSet in FieldSetMapper?

I am using common FieldSetMapper logic found through searches and in examples on StackOverflow and I have run into a situation which surprised me. Either it is a feature or a bug, but I thought I would present it here for review to see how others handle it.
Using Spring Batch, I have a pipe-delimited file which has string and number values which may be optional depending on position. For example:
string|string|number|number|string
string||number||string
In your field set mapper class which implements FieldSetMapper, you usually do some mapping such as:
newThingy.setString1(fieldSet.readString("string1"));
newThingy.setString2(fieldSet.readString("string2"));
newThingy.setValue1(fieldSet.readInt("value1"));
newThingy.setValue2(fieldSet.readInt("value2"));
newThingy.setString3(fieldSet.readString("string3"));
During testing the code for line 1 above worked fine.
For line 2, with the blank values for string2 and value2, a Java exception was thrown for the number but not the string:
Caused by: java.lang.NumberFormatException: Unparseable number:
at org.springframework.batch.item.file.transform.DefaultFieldSet.parseNumber(DefaultFieldSet.java:754)
at org.springframework.batch.item.file.transform.DefaultFieldSet.readInt(DefaultFieldSet.java:323)
at org.springframework.batch.item.file.transform.DefaultFieldSet.readInt(DefaultFieldSet.java:335)
at com.healthcloud.batch.mapper.MemberFieldSetMapper.mapFieldSet(MemberFieldSetMapper.java:31)
at com.healthcloud.batch.mapper.MemberFieldSetMapper.mapFieldSet(MemberFieldSetMapper.java:1)
I did some research in the DefaultFieldSet.java class provided by Spring Batch, which implements the FieldSet interface, to try and understand what is going on.
What I found is that the readAndTrim function called by readString returns null if the value read is blank:
protected String readAndTrim(int index) {
    String value = tokens[index];
    if (value != null) {
        return value.trim();
    }
    else {
        return null;
    }
}
... but when using readInt (and maybe others) we get an exception instead.
private Number parseNumber(String candidate) {
    try {
        return numberFormat.parse(candidate);
    }
    catch (ParseException e) {
        throw new NumberFormatException("Unparseable number: " + candidate);
    }
}
I do see where you can return a default value in some of the methods, but null is obviously not allowed. What I would expect is consistent behavior between all methods in FieldSet implementations which allow one to match the file to my database as the data is read. Blank values in delimited and fixed length files are fairly common.
If number based values cannot be properly handled, I will probably have to convert everything over to String as it is read and then go through the trouble to manual handle the conversion to the database, which obviously defeats the purpose of using Spring Batch.
Am I missing something that I should handle better? I can add more code if needed, I just felt this is commonly used and I could keep this short. Will edit as needed.
Edit: Add info on Unit Tests found for Spring Batch class
The comments in the test case state a default should be set instead, but why? I don't want a default. My database allows a null value in the Integer column. I would have to set the default to some arbitrary number which hopefully no one EVER sends, check for it before insert and then switch to null on insert. I still don't like this "feature."
@Test
public void testReadBlankInt() {
    // Trying to parse a blank field as an integer, but without a default
    // value should throw a NumberFormatException
    try {
        fieldSet.readInt(13);
        fail();
    }
    catch (NumberFormatException ex) {
        // expected
    }
    try {
        fieldSet.readInt("BlankInput");
        fail();
    }
    catch (NumberFormatException ex) {
        // expected
    }
}
Always sanity check your input/data. I'll usually throw together a Util class with all the parse/read/verification I need. Bare bones version below...
public static Integer getInteger(FieldSet fs, String key, Integer defaultValue) {
    if (StringUtils.isNumeric(fs.readString(key))) {
        return fs.readInt(key);
    } else {
        return defaultValue;
    }
}
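From the mapper it might be used like this (a sketch; this assumes a static import of the helper, and passes null as the default so a blank field ends up as null in the nullable Integer column):

newThingy.setValue1(getInteger(fieldSet, "value1", null));
newThingy.setValue2(getInteger(fieldSet, "value2", null));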

java.io.IOException: org.apache.thrift.protocol.TProtocolException: Cannot write a TUnion with no set value

I am using a Thrift schema to store Thrift bundles as a Pail file in a Hadoop cluster. Everything seems to be working correctly. The Thrift bundle is being created without any errors.
However, I am using Kafka to send the bundle, and while serializing, the serializer function converts the bundle into a byte array. I am getting the above-mentioned error at this point. Why would Kafka look into the bundle object when converting it into a byte array? Or is there any way I can convert any object to a byte array safely? If so, can you please provide it? The error is:
java.io.IOException: org.apache.thrift.protocol.TProtocolException: Cannot write a TUnion with no set value!
Following is the writeObject function that is throwing the error:
private void writeObject(java.io.ObjectOutputStream out)
        throws java.io.IOException {
    try {
        write(new org.apache.thrift.protocol.TCompactProtocol(
                new org.apache.thrift.transport.TIOStreamTransport(out)));
    } catch (org.apache.thrift.TException te) {
        throw new java.io.IOException(te);
    }
}
The message is quite clear: somewhere there is a union object with no value set. Exactly one value must be set for Thrift unions.
Thrift union example, see your *.thrift file:
union Foo {
    1: string bar
    2: bool baz
}
This is the condition that raised the error:
public void write(TProtocol oprot, TUnion struct) throws TException {
    if (struct.getSetField() == null || struct.getFieldValue() == null) {
        throw new TProtocolException("Cannot write a TUnion with no set value!");
    }
    // ... more code ...
}
I can't tell what object it is from your given code; you are too deep in the stack. Look at the code that calls writeObject(), then walk upwards. At some place in your code some data is not set as it should be, or the value is set to null, which is illegal with Thrift. If you do need an indicator for null values for your use case, consider an extra boolean flag.
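For the example union above, and assuming the standard Apache Thrift Java code generation, making sure exactly one field is set before the object is serialized could look like this sketch:

// Generated unions typically offer static factories and per-field setters.
Foo foo = Foo.bar("some value");   // exactly one field set
// or, equivalently:
Foo foo2 = new Foo();
foo2.setBar("some value");

// Serializing foo or foo2 now works; serializing new Foo() with nothing set
// is what triggers "Cannot write a TUnion with no set value!".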

Set deep property on bean, creating intermediary instances if needed

I'm using BeanUtils.setProperty to set deep properties on a bean.
Home home = new Home();
String path = "home.family.father.age";
Integer value = 40;
BeanUtils.setProperty(home, path, value);
// Does the same as home.getHome().getFamily().getFather().setAge(value);
// But stops on null (instead of throwing an NPE).
The behavior of BeanUtils is to do nothing if one of the intermediary properties is null. So for example in my case, home's family property is null, and nothing happens. If I do
family = new Family();
Then father will be null and I'd have to initialize it too. Obviously my real use case is more complex, with many dynamic properties (and also indexed ones).
Is there a way to tell BeanUtils to instantiate intermediate members? I know that in general this is not possible (because the concrete type of a property may not be known). But in my case all properties have concrete types and are proper beans (with a public no-args constructor), so it would be possible.
I'd like to make sure there aren't already existing solutions (using BeanUtils or something else) for this before rolling my own.
I rolled my own. It only supports simple properties but I guess adding support for nested/mapped properties wouldn't be too hard.
Here is a gist in case anyone needs the same thing:
https://gist.github.com/ThomasGirard/7115693
And here's what the most important part looks like:
/** Mostly copy-pasted from {@link PropertyUtilsBean#setProperty}. */
public void initProperty(Object bean, String path) throws SecurityException, NoSuchMethodException,
        IllegalAccessException, InvocationTargetException {
    // [...]
    // If the component is null, initialize it
    if (nestedBean == null) {
        // There has to be a method get* matching this path segment
        String methodName = "get" + StringUtils.capitalize(next);
        Method m = bean.getClass().getMethod(methodName);
        // The return type of this method is the object type we need to init.
        Class<?> propType = m.getReturnType();
        try {
            // Since it's a bean it must have a no-arg public constructor
            Object newInst = propType.newInstance();
            PropertyUtils.setProperty(bean, next, newInst);
            // Now we have something instead of null
            nestedBean = newInst;
        } catch (Exception e) {
            throw new NestedNullException("Could not init property value for '" + path + "' on bean class '"
                    + bean.getClass() + "'. Class: " + propType);
        }
    }
    // [...]
}
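A hypothetical usage sketch, assuming initProperty walks the whole path (the loop elided in the gist) and creates each null intermediate bean along the way:

Home home = new Home();
String path = "home.family.father.age";
initProperty(home, path);              // creates the home, family and father beans as needed
BeanUtils.setProperty(home, path, 40); // now actually sets the age instead of silently doing nothing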
