I have a problem with JAXB. I have a method annotated with @XmlAnyAttribute (on my getter), but it doesn't seem to work with the setter (I'm using the JAXB RI, if it matters).
Simplified code:
@XmlRootElement( name = "element" )
@XmlAccessorType( value = XmlAccessType.PUBLIC_MEMBER )
public class Element
{
private Map<QName, String> convertedAttributes = new HashMap<QName, String>();
private List<Attribute> attributes = new ArrayList<Attribute>();
@XmlAnyAttribute
public Map<QName, String> getConvertedAttributes() throws Exception
{
if ( attributes != null )
{
return new AttributeMapAdapter().marshal( attributes );
}
return new HashMap<QName, String>();
}
public void setConvertedAttributes( Map<QName, String> convertedAttributes )
{
this.convertedAttributes = convertedAttributes;
}
@XmlTransient
public List<Attribute> getAttributes()
{
return attributes;
}
public void setAttributes( List<Attribute> attributes )
{
this.attributes = attributes;
}
}
This works great for marshalling, and I get the output I want. But when I try to unmarshal it, no values are sent to the setter.
I tried moving the @XmlAnyAttribute annotation to the field, and it works fine (but then I can't do the adaptation in the getter).
It kinda feels like a bug, but I'm not sure. Any ideas? I'm using Java 1.6 on Mac OS X (10.7.2).
This isn't a bug in the JAXB RI. The problem is in your getConvertedAttributes() method. The following works a bit better:
public Map<QName, String> getConvertedAttributes() throws Exception
{
if(!convertedAttributes.isEmpty()) {
return convertedAttributes;
}
if ( attributes != null ) {
convertedAttributes = new AttributeMapAdapter().marshal( attributes );
} else {
convertedAttributes = new HashMap<QName, String>();
}
return convertedAttributes;
}
Your setter has to unmarshal the map again, so you need an adapter for the other direction, too.
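For example, a sketch of such a setter, assuming your AttributeMapAdapter also has an unmarshal method doing the reverse conversion (Map<QName, String> back to List<Attribute>); that reverse method isn't shown in the question, so its exact signature is an assumption:
public void setConvertedAttributes( Map<QName, String> convertedAttributes ) throws Exception
{
    this.convertedAttributes = convertedAttributes;
    // Rebuild the attribute list so the rest of the class sees the unmarshalled values.
    this.attributes = new AttributeMapAdapter().unmarshal( convertedAttributes );
}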
Related
I have some server response (a long one) which I've converted to a POJO (using the Moshi library).
Eventually I have a list of "Items"; each "Item" looks like this:
public class Item
{
private String aa;
private String b;
private String abc;
private String ad;
private String dd;
private String qw;
private String arew;
private String tt;
private String asd;
private String aut;
private String id;
...
}
What I actually need is to pull all the properties which start with "a", and then I need to use their values for further req ...
Is there any way to achieve it without reflection? (Maybe using streams?)
Thanks
With a Guava Function transformation you might transform your items with something like the following:
public static void main(String[] args) {
    List<Item> items = ...; // obtained from your parsed response
    Function<Item, Map<String, Object>> transformer = new Function<Item, Map<String, Object>>() {
        @Override
        public Map<String, Object> apply(Item input) {
            Map<String, Object> result = new HashMap<String, Object>();
            for (Field f : input.getClass().getDeclaredFields()) {
                if (!f.getName().startsWith("a")) {
                    continue;
                }
                f.setAccessible(true); // the fields are private
                Object value = null;
                try {
                    value = f.get(input);
                } catch (IllegalAccessException e) {
                    throw new RuntimeException("failed to read " + f.getName(), e);
                }
                result.put(f.getName(), value);
            }
            return result;
        }
    };
    Collection<Map<String, Object>> result
            = Collections2.transform(items, transformer);
}
Sounds like you may want to perform your filtering on a regular Java map structure.
// Dependencies.
Moshi moshi = new Moshi.Builder().build();
JsonAdapter<Map<String, String>> itemAdapter =
moshi.adapter(Types.newParameterizedType(Map.class, String.class, String.class));
String json = "{\"aa\":\"value1\",\"b\":\"value2\",\"abc\":\"value3\"}";
// Usage.
Map<String, String> value = itemAdapter.fromJson(json);
Map<String, String> filtered = value.entrySet().stream().filter(
stringStringEntry -> stringStringEntry.getKey().charAt(0) == 'a')
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
You could wrap the filtering logic up in a custom JsonAdapter, but validation and business logic are often nicer to leave to the application layer.
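If you did want to push the filtering into an adapter, here is a rough sketch; the class name AFilteringAdapter is made up for illustration, and it simply delegates to Moshi's built-in Map adapter and filters the decoded map:
import com.squareup.moshi.JsonAdapter;
import com.squareup.moshi.JsonReader;
import com.squareup.moshi.JsonWriter;
import com.squareup.moshi.Moshi;
import com.squareup.moshi.Types;

import java.io.IOException;
import java.util.Map;
import java.util.stream.Collectors;

final class AFilteringAdapter extends JsonAdapter<Map<String, String>> {
    private final JsonAdapter<Map<String, String>> delegate;

    AFilteringAdapter(Moshi moshi) {
        this.delegate = moshi.adapter(
                Types.newParameterizedType(Map.class, String.class, String.class));
    }

    @Override
    public Map<String, String> fromJson(JsonReader reader) throws IOException {
        Map<String, String> all = delegate.fromJson(reader);
        if (all == null) {
            return null;
        }
        // Keep only the entries whose key starts with "a".
        return all.entrySet().stream()
                .filter(e -> e.getKey().startsWith("a"))
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }

    @Override
    public void toJson(JsonWriter writer, Map<String, String> value) throws IOException {
        // Writing back out is unchanged; just delegate.
        delegate.toJson(writer, value);
    }
}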
I am getting JSON from DynamoDB that looks like this:
{
"id": "1234",
"payment": {
"payment_id": "2345",
"user_defined": {
"some_id": "3456"
}
}
}
My aim is to get the user_defined field into a Java HashMap<String, Object>, as the user_defined field can contain arbitrary user-defined fields which are unknown until the data arrives. Everything works fine except that my DynamoDBMapper cannot convert the user_defined field to a Java HashMap. It is throwing this error:
Exception occured Response[payment]; could not unconvert attribute
This is what the classes look like:
@DynamoDBTable(tableName = "PaymentDetails")
public class Response {
private String id;
public Response() {
}
private Payment payment = new Payment();
@DynamoDBHashKey(attributeName="id")
public String getId() { return id; }
public void setId(String id) { this.id = id; }
public Payment getPayment() {
return payment;
}
public void setPayment(Payment payment) {
this.payment = payment;
}
}
The payment field mapper:
@DynamoDBDocument
public class Payment {
private String payment_id;
private HashMap<String, Object> user_defined;
public Payment() {}
public String getPayment_id() {
return payment_id;
}
public void setPayment_id(String payment_id) {
this.payment_id = payment_id;
}
@DynamoDBTypeConverted(converter = HashMapMarshaller.class)
public HashMap<String, Object> getUser_defined() {
return user_defined;
}
public void setUser_defined(HashMap<String, Object> user_defined) {
this.user_defined = user_defined;
}
}
The HashMapMarshaller (just to check whether the HashMap marshaller wasn't working because of Gson, I simply defined a HashMap, put in a value and returned it, but it still doesn't seem to work):
public class HashMapMarshaller implements DynamoDBTypeConverter<String, HashMap<String, Object>> {
@Override
public String convert(HashMap<String, Object> hashMap) {
return new Gson().toJson(hashMap);
}
@Override
public HashMap<String, Object> unconvert(String jsonString) {
System.out.println("jsonString received for unconverting is " + jsonString);
System.out.println("Unconverting attribute");
HashMap<String, Object> hashMap = new HashMap<>();
hashMap.put("key", "value");
return hashMap;
//return new Gson().fromJson(jsonString, new TypeToken<HashMap<String, Object>>(){}.getType());
}
}
The marshaller approach is so far not working for me. It is also not printing any of the printlns I've put in there. I've also tried using @DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.M), and using Map instead of HashMap, on my user_defined getter, to no avail.
I want to find out how to convert the user_defined field to a Java HashMap or Map. Any help is appreciated. Thank you!
Change Map<String, Object> to Map<String, String>. It should work without any custom converters. Otherwise, be specific about the Map's value type; for example, Map<String, SimplePojo> should work. Don't forget to annotate the SimplePojo class with @DynamoDBDocument.
With Object as the type of the Map's value, DynamoDB will not be able to decide which object it has to create while reading an entry from DynamoDB. It needs to know a specific type like String, Integer, SimplePojo, etc.
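A minimal sketch of that, assuming your user-defined attributes are flat string values (in which case no converter is needed at all):
@DynamoDBDocument
public class Payment {
    private String payment_id;
    private Map<String, String> user_defined;

    public Payment() {}

    public String getPayment_id() { return payment_id; }
    public void setPayment_id(String payment_id) { this.payment_id = payment_id; }

    // With a concrete value type the mapper can unconvert this without @DynamoDBTypeConverted.
    public Map<String, String> getUser_defined() { return user_defined; }
    public void setUser_defined(Map<String, String> user_defined) { this.user_defined = user_defined; }
}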
I am stuck converting a Java Bean to a Map. There are many resources on the internet, but unfortunately they all treat converting simple beans to Maps. Mine are a little bit more extensive.
Here's a simplified example:
public class MyBean {
private String firstName;
private String lastName;
private MyHomeAddress homeAddress;
private int age;
// getters & setters
}
My goal is to produce a Map<String, Object> which, in this case, satisfies the following conditions:
map.containsKey("firstName")
map.containsKey("lastName")
map.containsKey("homeAddress.street") // street is String
map.containsKey("homeAddress.number") // number is int
map.containsKey("homeAddress.city") // city is String
map.containsKey("homeAddress.zipcode") // zipcode is String
map.containsKey("age")
I have tried using Apache Commons BeanUtils. Both approaches, BeanUtils#describe(Object) and BeanMap(Object), produce a Map whose "depth" is 1 (I mean that there's only a "homeAddress" key, holding the MyHomeAddress object as a value). My method should go deeper and deeper into the objects until it meets a primitive type (or String); then it should stop digging and insert a key like "order.customer.contactInfo.home".
So, my question is: how can this easily be done (or is there an existing project which would allow me to do that)?
Update
I have expanded Radiodef's answer to also include Collections, Maps, Arrays and Enums:
private static boolean isValue(Object value) {
    // Check for null first, otherwise value.getClass() would throw a NullPointerException.
    if (value == null) {
        return true;
    }
    final Class<?> clazz = value.getClass();
    return VALUE_CLASSES.contains(clazz)
            || Collection.class.isAssignableFrom(clazz)
            || Map.class.isAssignableFrom(clazz)
            || clazz.isArray()
            || clazz.isEnum();
}
Here's a simple reflective/recursive example.
You should be aware that there are some issues with doing a conversion the way you've asked:
Map keys must be unique.
Java allows a class to give a private field the same name as a private field in one of its superclasses.
This example doesn't address those because I'm not sure how you want to account for them (if you do). If your beans inherit from something other than Object, you will need to change your idea a little bit. This example only considers the fields of the subclass.
In other words, if you have
public class SubBean extends Bean {
this example will only return fields from SubBean.
Java lets us do this:
package com.acme.util;
public class Bean {
private int value;
}
package com.acme.misc;
public class Bean extends com.acme.util.Bean {
private int value;
}
Not that anybody should be doing that, but it's a problem if you want to use String as the keys, because there would be two keys named "value".
import java.lang.reflect.*;
import java.util.*;
public final class BeanFlattener {
private BeanFlattener() {}
public static Map<String, Object> deepToMap(Object bean) {
Map<String, Object> map = new LinkedHashMap<>();
try {
putValues(bean, map, null);
} catch (IllegalAccessException x) {
throw new IllegalArgumentException(x);
}
return map;
}
private static void putValues(Object bean,
Map<String, Object> map,
String prefix)
throws IllegalAccessException {
Class<?> cls = bean.getClass();
for (Field field : cls.getDeclaredFields()) {
if (field.isSynthetic() || Modifier.isStatic(field.getModifiers()))
continue;
field.setAccessible(true);
Object value = field.get(bean);
String key;
if (prefix == null) {
key = field.getName();
} else {
key = prefix + "." + field.getName();
}
if (isValue(value)) {
map.put(key, value);
} else {
putValues(value, map, key);
}
}
}
private static final Set<Class<?>> VALUE_CLASSES =
Collections.unmodifiableSet(new HashSet<>(Arrays.asList(
Object.class, String.class, Boolean.class,
Character.class, Byte.class, Short.class,
Integer.class, Long.class, Float.class,
Double.class
// etc.
)));
private static boolean isValue(Object value) {
return value == null
|| value instanceof Enum<?>
|| VALUE_CLASSES.contains(value.getClass());
}
}
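For example, a quick usage sketch against the MyBean/MyHomeAddress classes from the question (the MyHomeAddress field names below are guessed from your desired keys, and standard setters are assumed):
MyBean bean = new MyBean();
bean.setFirstName("John");
bean.setLastName("Doe");
bean.setAge(42);

MyHomeAddress address = new MyHomeAddress();
address.setStreet("Main St");
address.setNumber(7);
address.setCity("Springfield");
address.setZipcode("12345");
bean.setHomeAddress(address);

Map<String, Object> map = BeanFlattener.deepToMap(bean);
// {firstName=John, lastName=Doe, homeAddress.street=Main St, homeAddress.number=7,
//  homeAddress.city=Springfield, homeAddress.zipcode=12345, age=42}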
You could always use the Jackson Json Processor. Like this:
import com.fasterxml.jackson.databind.ObjectMapper;
//...
ObjectMapper objectMapper = new ObjectMapper();
//...
@SuppressWarnings("unchecked")
Map<String, Object> map = objectMapper.convertValue(pojo, Map.class);
where pojo is some Java bean. You can use some nice annotations on the bean to control the serialization.
You can re-use the ObjectMapper.
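If you need the dotted keys from your question rather than nested maps, you can flatten the map that convertValue returns. A rough sketch (the MapFlattener helper below is hypothetical, not part of Jackson):
import java.util.LinkedHashMap;
import java.util.Map;

public final class MapFlattener {

    private MapFlattener() {}

    // Turns {homeAddress={street=..., city=...}} into {homeAddress.street=..., homeAddress.city=...}.
    public static Map<String, Object> flatten(Map<String, Object> source) {
        Map<String, Object> flat = new LinkedHashMap<>();
        flatten(null, source, flat);
        return flat;
    }

    @SuppressWarnings("unchecked")
    private static void flatten(String prefix, Map<String, Object> source, Map<String, Object> target) {
        for (Map.Entry<String, Object> entry : source.entrySet()) {
            String key = (prefix == null) ? entry.getKey() : prefix + "." + entry.getKey();
            Object value = entry.getValue();
            if (value instanceof Map) {
                flatten(key, (Map<String, Object>) value, target);
            } else {
                target.put(key, value);
            }
        }
    }
}
Usage with the mapper from above:
Map<String, Object> flat = MapFlattener.flatten(objectMapper.convertValue(pojo, Map.class));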
I have this code in my JSP page:
<h:selectManyCheckbox id="chb" value="#{MyBean.selectedCheckBoxes}" layout="pageDirection">
<f:selectItems value="#{MyBean.checkBoxItems}"/>
</h:selectManyCheckbox>
And in my MyBean:
public class MyBean {
public MyBean() {
for (Elem section : sections) {
checkBoxItems.put(section.getName(), section.getObjectID());
}
}
private String[] selectedCheckBoxes;
private Map<String, Object> checkBoxItems = new LinkedHashMap<String, Object>();
public String save() {
//save is not being executed....
return FORWARD;
}
public Map<String, Object> getCheckBoxItems() {
return checkBoxItems;
}
public void setCheckBoxItems(Map<String, Object> checkBoxItems) {
this.checkBoxItems = checkBoxItems;
}
public String[] getSelectedCheckBoxes() {
return selectedCheckBoxes;
}
public void setSelectedCheckBoxes(String[] selectedCheckBoxes) {
this.selectedCheckBoxes = selectedCheckBoxes;
}
}
When I click save, it gives the below message in <t:message for="chb"/>:
"chb": Value is not a valid option.
Even though I did not add the required attribute for h:selectManyCheckbox, it is trying to validate or doing something else...
I've changed the checkBoxItems variable type (with getters/setters) to List<SelectItem>, but that is not working either.
What can be the reason, how can I solve it?
PS: I'm using JSF 1.1
You will get this error when the equals() test on a selected item has not returned true for any of the available items. So, when roughly the following happens under JSF's covers:
boolean valid = false;
for (Object availableItem : availableItems) {
if (selectedItem.equals(availableItem)) {
valid = true;
break;
}
}
if (!valid) {
// Validation error: Value is not valid!
}
That can in your particular case only mean that section.getObjectID() does not return a String, which is what your selectedCheckBoxes is declared to hold, but a different type, or a custom type whose equals() is not implemented or is broken.
Update: as per your comment, getObjectID() returns an Integer. It's thus being treated as a String because selectedCheckBoxes is declared as String[]. You should change the following
private String[] selectedCheckBoxes;
private Map<String, Object> checkBoxItems = new LinkedHashMap<String, Object>();
to
private Integer[] selectedCheckBoxes;
private Map<String, Integer> checkBoxItems = new LinkedHashMap<String, Integer>();
and maybe (not sure, can't tell from the top of my head right now) also explicitly supply a converter:
<h:selectManyCheckbox ... converter="javax.faces.Integer">
I didn't find any problem in the code; I think the problem is in the list you passed to the selectManyCheckbox.
Hard-code some values in the list in the getter and then check:
public Map<String, Object> getCheckBoxItems() {
checkBoxItems.clear();
checkBoxItems.put("aaaa", "aaaa");
checkBoxItems.put("bbbb", "bbbb");
checkBoxItems.put("cccc", "cccc");
checkBoxItems.put("dddd", "dddd");
checkBoxItems.put("eeee", "eeee");
return checkBoxItems;
}
Is there a version of BeanUtils.describe(customer) that recursively calls the describe() method on the complex attributes of 'customer'?
class Customer {
String id;
Address address;
}
Here, I would like the describe method to retrieve the contents of the address attribute as well.
Currently, all I can see is the name of the class, as follows:
{id=123, address=com.test.entities.Address#2a340e}
Funny, I would like the describe method to retrieve the contents of nested attributes as well; I don't understand why it doesn't. I went ahead and rolled my own, though. Here it is; you can just call:
Map<String,String> beanMap = BeanUtils.recursiveDescribe(customer);
A couple of caveats:
I wasn't sure how Commons BeanUtils formats attributes in collections, so I went with "attribute[index]".
I wasn't sure how it formats attributes in maps, so I went with "attribute[key]".
For name collisions the precedence is this: first, properties are loaded from the fields of super classes, then the class, then from the getter methods.
I haven't analyzed the performance of this method. If you have objects with large collections of objects that also contain collections, you might have some issues.
This is alpha code, not guaranteed to be bug-free.
I am assuming that you have the latest version of Commons BeanUtils.
Also, FYI, this is roughly taken from a project I've been working on called, affectionately, "java in jails", so you could just download it and then run:
Map<String, String[]> beanMap = new SimpleMapper().toMap(customer);
Though, you'll notice that it returns a String[] instead of a String, which may not work for your needs. Anyway, the code below should work, so have at it!
public class BeanUtils {
public static Map<String, String> recursiveDescribe(Object object) {
Set cache = new HashSet();
return recursiveDescribe(object, null, cache);
}
private static Map<String, String> recursiveDescribe(Object object, String prefix, Set cache) {
if (object == null || cache.contains(object)) return Collections.EMPTY_MAP;
cache.add(object);
prefix = (prefix != null) ? prefix + "." : "";
Map<String, String> beanMap = new TreeMap<String, String>();
Map<String, Object> properties = getProperties(object);
for (String property : properties.keySet()) {
Object value = properties.get(property);
try {
if (value == null) {
//ignore nulls
} else if (Collection.class.isAssignableFrom(value.getClass())) {
beanMap.putAll(convertAll((Collection) value, prefix + property, cache));
} else if (value.getClass().isArray()) {
beanMap.putAll(convertAll(Arrays.asList((Object[]) value), prefix + property, cache));
} else if (Map.class.isAssignableFrom(value.getClass())) {
beanMap.putAll(convertMap((Map) value, prefix + property, cache));
} else {
beanMap.putAll(convertObject(value, prefix + property, cache));
}
} catch (Exception e) {
e.printStackTrace();
}
}
return beanMap;
}
private static Map<String, Object> getProperties(Object object) {
Map<String, Object> propertyMap = getFields(object);
//getters take precedence in case of any name collisions
propertyMap.putAll(getGetterMethods(object));
return propertyMap;
}
private static Map<String, Object> getGetterMethods(Object object) {
Map<String, Object> result = new HashMap<String, Object>();
BeanInfo info;
try {
info = Introspector.getBeanInfo(object.getClass());
for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
Method reader = pd.getReadMethod();
if (reader != null) {
String name = pd.getName();
if (!"class".equals(name)) {
try {
Object value = reader.invoke(object);
result.put(name, value);
} catch (Exception e) {
//you can choose to do something here
}
}
}
}
} catch (IntrospectionException e) {
//you can choose to do something here
} finally {
return result;
}
}
private static Map<String, Object> getFields(Object object) {
return getFields(object, object.getClass());
}
private static Map<String, Object> getFields(Object object, Class<?> classType) {
Map<String, Object> result = new HashMap<String, Object>();
Class superClass = classType.getSuperclass();
if (superClass != null) result.putAll(getFields(object, superClass));
//get public fields only
Field[] fields = classType.getFields();
for (Field field : fields) {
try {
result.put(field.getName(), field.get(object));
} catch (IllegalAccessException e) {
//you can choose to do something here
}
}
return result;
}
private static Map<String, String> convertAll(Collection<Object> values, String key, Set cache) {
Map<String, String> valuesMap = new HashMap<String, String>();
Object[] valArray = values.toArray();
for (int i = 0; i < valArray.length; i++) {
Object value = valArray[i];
if (value != null) valuesMap.putAll(convertObject(value, key + "[" + i + "]", cache));
}
return valuesMap;
}
private static Map<String, String> convertMap(Map<Object, Object> values, String key, Set cache) {
Map<String, String> valuesMap = new HashMap<String, String>();
for (Object thisKey : values.keySet()) {
Object value = values.get(thisKey);
if (value != null) valuesMap.putAll(convertObject(value, key + "[" + thisKey + "]", cache));
}
return valuesMap;
}
private static ConvertUtilsBean converter = BeanUtilsBean.getInstance().getConvertUtils();
private static Map<String, String> convertObject(Object value, String key, Set cache) {
//if this type has a registered converted, then get the string and return
if (converter.lookup(value.getClass()) != null) {
String stringValue = converter.convert(value);
Map<String, String> valueMap = new HashMap<String, String>();
valueMap.put(key, stringValue);
return valueMap;
} else {
//otherwise, treat it as a nested bean that needs to be described itself
return recursiveDescribe(value, key, cache);
}
}
}
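As a rough usage sketch against the Customer/Address classes from the question (assuming standard getters and setters, and a street/city pair on Address purely for illustration):
Customer customer = new Customer();
customer.setId("123");

Address address = new Address();
address.setStreet("Main St");
address.setCity("Springfield");
customer.setAddress(address);

Map<String, String> beanMap = BeanUtils.recursiveDescribe(customer);
// Roughly: {address.city=Springfield, address.street=Main St, id=123}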
The challenge (or show stopper) is the problem that we have to deal with an object graph instead of a simple tree. A graph may contain cycles, and that requires developing some custom rules or requirements for the stop criterion inside the recursive algorithm.
Have a look at a dead simple bean (a tree structure, getters are assumed but not shown):
public class Node {
private Node parent;
private Node left;
private Node right;
}
and initialize it like this:
  root
  /  \
 A    B
Now call a describe on root. A non-recursive call would result in
{parent=null, left=A, right=B}
A recursive call instead would do a
1: describe(root) =>
2: {parent=describe(null), left=describe(A), right=describe(B)} =>
3: {parent=null,
{A.parent=describe(root), A.left=describe(null), A.right= describe(null)}
{B.parent=describe(root), B.left=describe(null), B.right= describe(null)}}
and run into a StackOverflowError because describe is called with objects root, A and B over and over again.
One solution for a custom implementation could be to remember all the objects that have been described so far (record those instances in a set, and stop if set.contains(bean) returns true) and store some kind of link in your result object.
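A minimal sketch of such a stop criterion for the Node example above (hard-coded to Node for brevity, with its getters assumed as stated earlier; an identity-based "seen" set is used so overridden equals()/hashCode() cannot hide a cycle, and a real implementation would reflect over properties instead):
import java.util.*;

public final class CycleSafeDescriber {

    public static Map<String, String> describe(Node root) {
        Map<String, String> out = new LinkedHashMap<>();
        // Identity-based set: a bean counts as "seen" only if it is the very same instance.
        Set<Object> seen = Collections.newSetFromMap(new IdentityHashMap<>());
        describe(root, "root", out, seen);
        return out;
    }

    private static void describe(Node node, String path, Map<String, String> out, Set<Object> seen) {
        if (node == null) {
            out.put(path, null);
            return;
        }
        if (!seen.add(node)) {
            // Already described this instance: store a link instead of recursing (the stop criterion).
            out.put(path, "<already described>");
            return;
        }
        describe(node.getParent(), path + ".parent", out, seen);
        describe(node.getLeft(), path + ".left", out, seen);
        describe(node.getRight(), path + ".right", out, seen);
    }
}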
You can simply use, from the same commons-beanutils:
Map<String, Object> result = PropertyUtils.describe(obj);
This returns the entire set of properties for which the specified bean provides a read method.