JSON mapping unexpected case - Java

Earlier I used com.fasterxml.jackson in my application.
Since I am using akka-http, I wanted to try living with Marshal/Unmarshal and spray.json's toJson.compactPrint, without the extra com.fasterxml.jackson dependency.
But I got stuck on a simple case.
Old working code:
...
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
val obj: AnyRef = new Object()
val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)
val json = mapper.writeValueAsString(obj)
New code:
import spray.json._
val obj: AnyRef = new Object()
val json = obj.toJson.compactPrint
This causes the error
Cannot find JsonWriter or JsonFormat type class for AnyRef on
obj.toJson.compactPrint
Help please!
Update:
Here is the real part of the code, for a better understanding of what I need. It works well: the com.fasterxml.jackson mapper has no restriction on writing an AnyRef to a JSON string.
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
object mapper {
  private val _mapper = new ObjectMapper()
  _mapper.registerModule(DefaultScalaModule)

  def get: ObjectMapper = _mapper
}
import akka.actor.{Actor, Props, ReceiveTimeout}
import akka.http.scaladsl.model.{ContentTypes, HttpEntity, HttpResponse}
object RequestHandler {
  def props(ctx: ImperativeRequestContext): Props = Props(new RequestHandler(ctx))
}

class RequestHandler(ctx: ImperativeRequestContext) extends Actor {
  import context._
  import concurrent.duration._

  setReceiveTimeout(30.seconds)

  def receive: Receive = {
    case ReceiveTimeout =>
      ctx.complete(HttpResponse(500, entity = "timeout"))
      stop(self)
    case x: AnyRef =>
      ctx.complete(HttpEntity(ContentTypes.`application/json`, mapper.get.writeValueAsString(x)))
      stop(self)
  }
}

I'm not sure what exactly you are complaining about. The Spray JSON library, as is typical for a Scala library, is more type-safe than Jackson. In particular, toJson works based on the compile-time type rather than the run-time type. And obviously there is no safe way to transform an arbitrary AnyRef (aka Object) to JSON, because an object of any type could be behind that variable. Consider the following example:
val obj: AnyRef = new Thread()
val json = obj.toJson.compactPrint
To Spray JSON this code looks exactly the same as your example, and obviously you can't write a Thread to JSON. The difference is that with Jackson you will get some kind of runtime error, while with Spray JSON it will not even compile. If you use specific types that can be converted to JSON safely, Spray JSON will work for you.
If the point is that you want some generic method that accepts arguments and, as one of its steps, converts them to JSON, then you should use generics and a context bound to constrain that type, as in:
def foo[T: JsonWriter](bar: T) = {
  // do something
  val json = bar.toJson.compactPrint
  // do something more with json
  json
}
Update: more realistic example
Here is an example that compiles for me and runs as I would expect:
import spray.json._
import DefaultJsonProtocol._
case class Foo(i: Int)
implicit val fooFormat = jsonFormat1(Foo)
case class Bar(ii: Int, ss: String)
implicit val barFormat = jsonFormat2(Bar)
def printJson[T: JsonWriter](obj: T): Unit = {
  println(obj.toJson.compactPrint)
}
printJson(Foo(42))
printJson(Bar(123, "this works!"))
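Tying this back to the RequestHandler from the question: a minimal sketch of how the handler could complete with spray-json instead of Jackson, assuming the actor only ever receives one known payload type (the Result case class and the jsonResponse helper below are purely illustrative, not part of any library):

import akka.http.scaladsl.model.{ContentTypes, HttpEntity, HttpResponse}
import spray.json._
import DefaultJsonProtocol._

// hypothetical payload type standing in for whatever the actor actually receives
case class Result(status: String, items: List[Int])
implicit val resultFormat: RootJsonFormat[Result] = jsonFormat2(Result)

// the JsonWriter context bound turns "this can be written to JSON" into a compile-time guarantee
def jsonResponse[T: JsonWriter](payload: T): HttpResponse =
  HttpResponse(entity = HttpEntity(ContentTypes.`application/json`, payload.toJson.compactPrint))

// in the handler: case r: Result => ctx.complete(jsonResponse(r)); stop(self)
jsonResponse(Result("ok", List(1, 2, 3)))

The catch-all case x: AnyRef cannot be kept as-is, because the whole point of the type class approach is that the compiler must know the concrete type (or at least a JsonWriter bound) at the call site.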

Related

Externalized Configuration does not work for Scala's Map

I tried to load property values from an external file, where the property is stored in key-value format (a Map). It works properly if I use Java's Map, as in this code:
import java.util.{Map, HashMap}

@Component
@ConfigurationProperties(prefix="config")
class ConfigProperties {
  val corporationSecrets: Map[String, String] = new HashMap[String, String]
}
But when I just change the map to Scala's Map, I cannot get any value from it, i.e., the map is empty.
import java.util.HashMap
import scala.collection.JavaConverters._

@Component
@ConfigurationProperties(prefix="config")
class ConfigProperties {
  val corporationSecrets: scala.collection.mutable.Map[String, String] = new HashMap[String, String].asScala
}
I tried both the mutable and the immutable map, but neither works.
Does that mean I cannot use Scala's Map in this case?
Yes, Spring Boot doesn't know how to handle Scala collections. But you could use Java collections internally and add methods returning the Scala versions. Of course, they'll need to have different names. E.g.
import java.util.{HashMap => JHashMap, Map => JMap}
import org.springframework.boot.context.properties.ConfigurationProperties
import org.springframework.stereotype.Component
import scala.beans.BeanProperty
import scala.collection.JavaConverters._

@Component
@ConfigurationProperties(prefix = "security-util")
class SecurityUtilProperties {
  @BeanProperty
  val corporationSecrets: JMap[String, String] = new JHashMap[String, String]

  def corporationSecretsScala = corporationSecrets.asScala
}
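A consumer can then stay entirely on the Scala side. A minimal sketch (the SecretLookup service and its method are made up for illustration):

import org.springframework.beans.factory.annotation.Autowired
import org.springframework.stereotype.Service

// hypothetical service reading the injected properties through the Scala view
@Service
class SecretLookup @Autowired()(props: SecurityUtilProperties) {
  def secretFor(corporation: String): Option[String] =
    props.corporationSecretsScala.get(corporation)
}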

Json4s serialisation of superclass

I'm having trouble writing a custom serialiser for Json4s to handle the following situation. I have case classes:
trait Condition

case class BasicExpression(field: String, operator: String, value: String) extends Condition

case class BooleanExpression(leftExpr: Condition, logicalOperator: String,
                             rightExpr: Condition) extends Condition
and I want to be able to read the JSON for both BasicExpression and BooleanExpression using, for example:
var jsonStringBasic:String = """ {"field":"name","operator":"=","value":"adam"}""";
var jsonStringBoolean:String = """{"leftExpr":{"leftExpr":{"field":"field1", "operator":"=", "value":"value1"}, "logicalOperator":"AND", "rightExpr":{"field":"field2","operator":">","value":"500"}}, "logicalOperator":"AND", "rightExpr": {"field":"field3","operator":"<","value":"10000"}}""";
var jValueBasic:JValue = parse(jsonStringBasic, false);
var readCBasic = jValueBasic.extract[Condition];
I understand how the custom serialiser works for reading the BasicExpression, and I could use type hints (e.g. ShortTypeHints), but it would be good not to have to bloat the JSON for every Condition. I could also try extract[BooleanExpression] and, if that fails, try extract[BasicExpression], but that seems ugly. Is it possible to write the custom serialiser to handle the fact that a BooleanExpression will itself contain another Condition recursively, so I can extract[Condition]?
For better JSON parsing you can try this:
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public class GsonUtils {

    public static String defaultDateTimeFormat = "yyyy-MM-dd'T'HH:mm:ssZ";
    private static GsonBuilder gsonBuilder = new GsonBuilder().setDateFormat(defaultDateTimeFormat);

    /**
     * Creates a GSON instance from the builder with the default date/time format
     *
     * @return the GSON instance
     */
    public static Gson createGson() {
        // Create with default params
        gsonBuilder = gsonBuilder.setDateFormat(defaultDateTimeFormat);
        return gsonBuilder.create();
    }

    /**
     * Creates a GSON instance from the builder specifying custom date/time format
     *
     * @return the GSON instance
     */
    public static Gson createGson(String dateTimeFormat) {
        // Create with the specified dateTimeFormat
        gsonBuilder = gsonBuilder.setDateFormat(dateTimeFormat);
        return gsonBuilder.create();
    }
}
and use it like this:
String jsonStringBasic = "{\"field\":\"name\",\"operator\":\"=\",\"value\":\"adam\"}";
Type collectionType = new TypeToken<YOUR_CLASS>() {}.getType();
YOUR_CLASS result = GsonUtils.createGson().fromJson(jsonStringBasic, collectionType);
Managed to get the CustomSerializer working so it could call itself recursively, in the case of a BooleanExpression, as below:
class ConditionSerialiser extends CustomSerializer[Condition](format => (
  {
    def deserialiseCondition: PartialFunction[JValue, Condition] = {
      case JObject(List(JField("field", JString(field)), JField("operator", JString(operator)), JField("value", JString(value)))) =>
        BasicStringExpression(field, operator, value)
      case JObject(List(JField("field", JString(field)), JField("operator", JString(operator)), JField("value", JInt(value)))) =>
        BasicNumericExpression(field, operator, value.doubleValue)
      case JObject(List(JField("field", JString(field)), JField("operator", JString(operator)), JField("value", JDouble(value)))) =>
        BasicNumericExpression(field, operator, value)
      case JObject(List(JField("leftExpr", leftExpr), JField("logicalOperator", JString(logicalOperator)), JField("rightExpr", rightExpr))) =>
        BooleanExpression(deserialiseCondition(leftExpr), logicalOperator, deserialiseCondition(rightExpr))
    }

    deserialiseCondition
  },
  {
    case bse: BasicStringExpression =>
      JObject(List(JField("field", JString(bse.field)), JField("operator", JString(bse.operator)), JField("value", JString(bse.value))))
  }
))
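For reference, wiring the serialiser in is then just a matter of adding it to the implicit Formats. A minimal sketch, assuming the json4s-native parse and the jsonStringBasic/jsonStringBoolean values from the question:

import org.json4s._
import org.json4s.native.JsonMethods._

implicit val formats: Formats = DefaultFormats + new ConditionSerialiser

val basic: Condition    = parse(jsonStringBasic).extract[Condition]
val combined: Condition = parse(jsonStringBoolean).extract[Condition]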

Gson and Serializing an ArrayList of Objects with Inheritance

I am very new to Gson and JSON. I have simple Events that I want to serialize to JSON with the help of Gson.
Note: Code in Kotlin.
public abstract class Event() {
}

public class Move : Event() {
    var from: Point? = null
    var to: Point? = null
}

public class Fire : Event() {
    var damage: Int = 0
    var area: ArrayList<Point> = ArrayList(0)
}

public class Build : Event() {
    var to: Point? = null
    var type: String = ""
    var owner: String = ""
}
I am persisting a bunch of these this way:
val events: ArrayList<Event> = ArrayList()
events.add(move)
events.add(fire)
val str = gson.toJson(events)
And unpersisting:
val type = object : TypeToken<ArrayList<Event>>(){}.getType()
val eventStr = obj.getString("events")
val events: ArrayList<Event> = gson.fromJson(eventStr, type)
I have tried both creating a serializer & deserializer for the Event class and registering it via registerTypeAdapter, and I have also tried the RuntimeTypeAdapterFactory, but neither will persist the information required to unpersist the correct type.
For example, the RuntimeTypeAdapterFactory says:
"cannot deserialize Event because it does not define a field named type"
EDIT: Here's the code for the "Adapter", which was, well, adapted from another Stack Overflow post:
public class Adapter :
    JsonSerializer<Event>,
    JsonDeserializer<Event> {

    final val CLASSNAME = "CLASSNAME"
    final val INSTANCE = "INSTANCE"

    override fun serialize(src: Event?, typeOfSrc: Type?, context: JsonSerializationContext?): JsonElement? {
        val obj = JsonObject()
        val className = (src as Event).javaClass.getCanonicalName()
        obj.addProperty(CLASSNAME, className)
        val elem = context!!.serialize(src)
        obj.add(INSTANCE, elem)
        return obj
    }

    override fun deserialize(json: JsonElement?, typeOfT: Type?, context: JsonDeserializationContext?): Event? {
        val jsonObject = json!!.getAsJsonObject()
        val prim = jsonObject.get(CLASSNAME)
        val className = prim.getAsString()
        val klass = Class.forName(className)
        return context!!.deserialize(jsonObject.get(INSTANCE), klass)
    }
}
This code fails with NullPointerException on line:
val className = prim.getAsString()
You can't do it this way.
The example you are referring to is not targeted at your case. It works in only one case: if you register the base type (not the type hierarchy) and serialize using gson.toJson(obj, javaClass<Event>()). It will never work for an array unless you write a custom serializer for your events container object too.
Generally you need another approach: use a TypeAdapterFactory and delegate adapters; see GSON: serialize/deserialize object of class, that have registered type hierarchy adapter, using ReflectiveTypeAdapterFactory.Adapter and https://code.google.com/p/google-gson/issues/detail?id=43#c15
I believe this approach is overcomplicated, so if you have only a few types the easiest solution is to serialize these types by hand, field by field, via a custom serializer and forget about attempts to delegate to the default one.

Efficient POJO mapping to/from Java Mongo DBObject using Jackson

Although similar to Convert DBObject to a POJO using MongoDB Java Driver, my question is different in that I am specifically interested in using Jackson for mapping.
I have an object which I want to convert to a Mongo DBObject instance. I want to use the Jackson JSON framework to do the job.
One way to do so is:
DBObject dbo = (DBObject)JSON.parse(m_objectMapper.writeValueAsString(entity));
However, according to https://github.com/FasterXML/jackson-docs/wiki/Presentation:-Jackson-Performance this is the worst way to go. So, I am looking for an alternative. Ideally, I would like to be able to hook into the JSON generation pipeline and populate a DBObject instance on the fly. This is possible, because the target in my case is a BasicDBObject instance, which implements the Map interface. So, it should fit into the pipeline easily.
Now, I know I can convert an object to Map using the ObjectMapper.convertValue function and then recursively convert the map to a BasicDBObject instance using the map constructor of the BasicDBObject type. But, I want to know if I can eliminate the intermediate map and create the BasicDBObject directly.
Note that because a BasicDBObject is essentially a map, the opposite conversion, namely from a DBObject to a POJO, is trivial and should be quite efficient:
DBObject dbo = getDBO();
Class clazz = getObjectClass();
Object pojo = m_objectMapper.convertValue(dbo, clazz);
Lastly, my POJOs do not have any JSON annotations and I would like to keep it that way.
You can probably use mixin annotations to annotate your POJO and the BasicDBObject (or DBObject), so annotations are not a problem. Since BasicDBObject is a map, you can use @JsonAnySetter on the put method.
m_objectMapper.addMixInAnnotations(BasicDBObject.class, YourMixIn.class);

public interface YourMixIn {
    @JsonAnySetter
    void put(String key, Object value);
}
This is all I can come up with since I have zero experience with MongoDB Object.
Update: mixins are basically a Jackson mechanism for adding annotations to a class without modifying said class. This is a perfect fit when you don't have control over the class you want to marshal (like when it comes from an external jar) or when you don't want to clutter your classes with annotations.
In your case, you said that BasicDBObject implements the Map interface, so that class has the method put, as defined by the Map interface. By adding @JsonAnySetter to that method, you tell Jackson that whenever it finds a property it doesn't know after introspection of the class, it should use that method to insert the property into the object. The key is the name of the property and the value is, well, the value of the property.
All this combined makes the intermediate map go away, since Jackson will convert directly to the BasicDBObject because it now knows how to deserialize that class from JSON. With that configuration, you can do:
DBObject dbo = m_objectMapper.convertValue(pojo, BasicDBObject.class);
Note that I haven't tested this because I don't work with MongoDB, so there might be some loose ends. However, I have used the same mechanism for similar use cases without any problem. YMMV depending on the classes.
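Pulled together in one place, the round trip described above might look roughly like this (a sketch, untested just like the above; the User class is a made-up bean-style POJO, and addMixInAnnotations takes the target class first and the mixin second):

import com.fasterxml.jackson.annotation.JsonAnySetter
import com.fasterxml.jackson.databind.ObjectMapper
import com.mongodb.BasicDBObject
import scala.beans.BeanProperty

// mixin carrying the annotation we cannot add to BasicDBObject itself
abstract class DBObjectMixIn {
  @JsonAnySetter
  def put(key: String, value: AnyRef): AnyRef
}

// made-up bean-style POJO: plain getters/setters, no JSON annotations
class User() {
  @BeanProperty var name: String = _
  @BeanProperty var age: Int = 0
}

val mapper = new ObjectMapper()
mapper.addMixInAnnotations(classOf[BasicDBObject], classOf[DBObjectMixIn])

val user = new User()
user.setName("ada")
user.setAge(36)

// POJO -> DBObject without the intermediate Map
val dbo: BasicDBObject = mapper.convertValue(user, classOf[BasicDBObject])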
Here's an example of a simple serializer (written in Scala) from POJO to BsonDocument which could be used with version 3 of the Mongo driver. The deserializer would be somewhat more difficult to write.
Create a BsonObjectGenerator object, which does streaming serialization to Mongo BSON directly:
val generator = new BsonObjectGenerator
mapper.writeValue(generator, POJO)
generator.result()
Here's the code for a serializer:
class BsonObjectGenerator extends JsonGenerator {
sealed trait MongoJsonStreamContext extends JsonStreamContext
case class MongoRoot(root: BsonDocument = BsonDocument()) extends MongoJsonStreamContext {
_type = JsonStreamContext.TYPE_ROOT
override def getCurrentName: String = null
override def getParent: MongoJsonStreamContext = null
}
case class MongoArray(parent: MongoJsonStreamContext, arr: BsonArray = BsonArray()) extends MongoJsonStreamContext {
_type = JsonStreamContext.TYPE_ARRAY
override def getCurrentName: String = null
override def getParent: MongoJsonStreamContext = parent
}
case class MongoObject(name: String, parent: MongoJsonStreamContext, obj: BsonDocument = BsonDocument()) extends MongoJsonStreamContext {
_type = JsonStreamContext.TYPE_OBJECT
override def getCurrentName: String = name
override def getParent: MongoJsonStreamContext = parent
}
private val root = MongoRoot()
private var node: MongoJsonStreamContext = root
private var fieldName: String = _
def result(): BsonDocument = root.root
private def unsupported(): Nothing = throw new UnsupportedOperationException
override def disable(f: Feature): JsonGenerator = this
override def writeStartArray(): Unit = {
val array = new BsonArray
node match {
case MongoRoot(o) =>
o.append(fieldName, array)
fieldName = null
case MongoArray(_, a) =>
a.add(array)
case MongoObject(_, _, o) =>
o.append(fieldName, array)
fieldName = null
}
node = MongoArray(node, array)
}
private def writeBsonValue(value: BsonValue): Unit = node match {
case MongoRoot(o) =>
o.append(fieldName, value)
fieldName = null
case MongoArray(_, a) =>
a.add(value)
case MongoObject(_, _, o) =>
o.append(fieldName, value)
fieldName = null
}
private def writeBsonString(text: String): Unit = {
writeBsonValue(BsonString(text))
}
override def writeString(text: String): Unit = writeBsonString(text)
override def writeString(text: Array[Char], offset: Int, len: Int): Unit = writeBsonString(new String(text, offset, len))
override def writeString(text: SerializableString): Unit = writeBsonString(text.getValue)
private def writeBsonFieldName(name: String): Unit = {
fieldName = name
}
override def writeFieldName(name: String): Unit = writeBsonFieldName(name)
override def writeFieldName(name: SerializableString): Unit = writeBsonFieldName(name.getValue)
override def setCodec(oc: ObjectCodec): JsonGenerator = this
override def useDefaultPrettyPrinter(): JsonGenerator = this
override def getFeatureMask: Int = 0
private def writeBsonBinary(data: Array[Byte]): Unit = {
writeBsonValue(BsonBinary(data))
}
override def writeBinary(bv: Base64Variant, data: Array[Byte], offset: Int, len: Int): Unit = {
val res = if (offset != 0 || len != data.length) {
val subset = new Array[Byte](len)
System.arraycopy(data, offset, subset, 0, len)
subset
} else {
data
}
writeBsonBinary(res)
}
override def writeBinary(bv: Base64Variant, data: InputStream, dataLength: Int): Int = unsupported()
override def isEnabled(f: Feature): Boolean = false
override def writeRawUTF8String(text: Array[Byte], offset: Int, length: Int): Unit = writeBsonString(new String(text, offset, length, "UTF-8"))
override def writeRaw(text: String): Unit = unsupported()
override def writeRaw(text: String, offset: Int, len: Int): Unit = unsupported()
override def writeRaw(text: Array[Char], offset: Int, len: Int): Unit = unsupported()
override def writeRaw(c: Char): Unit = unsupported()
override def flush(): Unit = ()
override def writeRawValue(text: String): Unit = writeBsonString(text)
override def writeRawValue(text: String, offset: Int, len: Int): Unit = writeBsonString(text.substring(offset, offset + len))
override def writeRawValue(text: Array[Char], offset: Int, len: Int): Unit = writeBsonString(new String(text, offset, len))
override def writeBoolean(state: Boolean): Unit = {
writeBsonValue(BsonBoolean(state))
}
override def writeStartObject(): Unit = {
node = node match {
case p @ MongoRoot(o) =>
MongoObject(null, p, o)
case p @ MongoArray(_, a) =>
val doc = new BsonDocument
a.add(doc)
MongoObject(null, p, doc)
case p @ MongoObject(_, _, o) =>
val doc = new BsonDocument
val f = fieldName
o.append(f, doc)
fieldName = null
MongoObject(f, p, doc)
}
}
override def writeObject(pojo: scala.Any): Unit = unsupported()
override def enable(f: Feature): JsonGenerator = this
override def writeEndArray(): Unit = {
node = node match {
case MongoRoot(_) => unsupported()
case MongoArray(p, a) => p
case MongoObject(_, _, _) => unsupported()
}
}
override def writeUTF8String(text: Array[Byte], offset: Int, length: Int): Unit = writeBsonString(new String(text, offset, length, "UTF-8"))
override def close(): Unit = ()
override def writeTree(rootNode: TreeNode): Unit = unsupported()
override def setFeatureMask(values: Int): JsonGenerator = this
override def isClosed: Boolean = unsupported()
override def writeNull(): Unit = {
writeBsonValue(BsonNull())
}
override def writeNumber(v: Int): Unit = {
writeBsonValue(BsonInt32(v))
}
override def writeNumber(v: Long): Unit = {
writeBsonValue(BsonInt64(v))
}
override def writeNumber(v: BigInteger): Unit = unsupported()
override def writeNumber(v: Double): Unit = {
writeBsonValue(BsonDouble(v))
}
override def writeNumber(v: Float): Unit = {
writeBsonValue(BsonDouble(v))
}
override def writeNumber(v: BigDecimal): Unit = unsupported()
override def writeNumber(encodedValue: String): Unit = unsupported()
override def version(): Version = unsupported()
override def getCodec: ObjectCodec = unsupported()
override def getOutputContext: JsonStreamContext = node
override def writeEndObject(): Unit = {
node = node match {
case p @ MongoRoot(_) => p
case MongoArray(p, a) => unsupported()
case MongoObject(_, p, _) => p
}
}
}
You might be interested in checking how Jongo does it. It is open source and the code can be found on GitHub. Or you could simply use their library. I use a mix of Jongo and plain DBObjects when I need more flexibility.
They claim that they are (almost) as fast as using the Java driver directly, so I suppose their method is efficient.
I use the little helper utility class below, which is inspired by their code base and uses a mix of Jongo (the MongoBsonFactory) and Jackson to convert between DBObjects and POJOs. Note that the getDbObject method does a deep copy of the DBObject to make it editable - if you don't need to customise anything you can remove that part and improve performance.
import com.fasterxml.jackson.annotation.JsonAutoDetect;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectReader;
import com.fasterxml.jackson.databind.ObjectWriter;
import com.fasterxml.jackson.databind.introspect.VisibilityChecker;
import com.mongodb.BasicDBObject;
import com.mongodb.DBEncoder;
import com.mongodb.DBObject;
import com.mongodb.DefaultDBEncoder;
import com.mongodb.LazyWriteableDBObject;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.bson.LazyBSONCallback;
import org.bson.io.BasicOutputBuffer;
import org.bson.io.OutputBuffer;
import org.jongo.marshall.jackson.bson4jackson.MongoBsonFactory;
public class JongoUtils {
private final static ObjectMapper mapper = new ObjectMapper(MongoBsonFactory.createFactory());
static {
mapper.setVisibilityChecker(VisibilityChecker.Std.defaultInstance().withFieldVisibility(
JsonAutoDetect.Visibility.ANY));
}
public static DBObject getDbObject(Object o) throws IOException {
ObjectWriter writer = mapper.writer();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
writer.writeValue(baos, o);
DBObject dbo = new LazyWriteableDBObject(baos.toByteArray(), new LazyBSONCallback());
//turn it into a proper DBObject otherwise it can't be edited.
DBObject result = new BasicDBObject();
result.putAll(dbo);
return result;
}
public static <T> T getPojo(DBObject o, Class<T> clazz) throws IOException {
ObjectReader reader = mapper.reader(clazz);
DBEncoder dbEncoder = DefaultDBEncoder.FACTORY.create();
OutputBuffer buffer = new BasicOutputBuffer();
dbEncoder.writeObject(buffer, o);
T pojo = reader.readValue(buffer.toByteArray());
return pojo;
}
}
Sample usage:
Pojo pojo = new Pojo(...);
DBObject o = JongoUtils.getDbObject(pojo);
//you can customise it if you want:
o.put("_id", pojo.getId());
I understand that this is a very old question, but if asked today I would instead recommend the built-in POJO support on the official Mongo Java driver.
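For reference, the setup there is just a codec registry. A sketch against the 3.x sync driver (the Person class and the database/collection names are placeholders):

import com.mongodb.MongoClient
import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}
import org.bson.codecs.pojo.PojoCodecProvider
import scala.beans.BeanProperty

// placeholder bean-style POJO: automatic(true) relies on a no-arg constructor plus getters/setters
class Person() {
  @BeanProperty var name: String = _
  @BeanProperty var age: Int = 0
}

val pojoCodecRegistry = fromRegistries(
  MongoClient.getDefaultCodecRegistry,
  fromProviders(PojoCodecProvider.builder().automatic(true).build())
)

val people = new MongoClient()
  .getDatabase("test")
  .withCodecRegistry(pojoCodecRegistry)
  .getCollection("people", classOf[Person])
// people.insertOne(person) and people.find().first() now map Person directly, no DBObject step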
Here's an update to assylias' answer that doesn't require Jongo and is compatible with the Mongo 3.x drivers. It also handles nested object graphs; I couldn't get that to work with LazyWriteableDBObject, which has been removed in the Mongo 3.x drivers anyway.
The idea is to tell Jackson how to serialize an object to a BSON byte array, and then deserialize the BSON byte array into a BasicDBObject. I'm sure you can find some low-level API in the mongo-java-driver if you want to ship the BSON bytes directly to the database. You will need a dependency on bson4jackson in order for the ObjectMapper to serialize BSON when you call writeValue(ByteArrayOutputStream, Object):
import com.fasterxml.jackson.databind.ObjectMapper;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import de.undercouch.bson4jackson.BsonFactory;
import de.undercouch.bson4jackson.BsonParser;
import org.bson.BSON;
import org.bson.BSONObject;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
public class MongoUtils {
private static ObjectMapper mapper;
static {
BsonFactory bsonFactory = new BsonFactory();
bsonFactory.enable(BsonParser.Feature.HONOR_DOCUMENT_LENGTH);
mapper = new ObjectMapper(bsonFactory);
}
public static DBObject getDbObject(Object o) {
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
mapper.writeValue(baos, o);
BSONObject decode = BSON.decode(baos.toByteArray());
return new BasicDBObject(decode.toMap());
} catch (IOException e) {
throw new RuntimeException(e);
}
}
}

How to convert a JSON object (returned from the Google Places API) to a Java object

The Google Places API returns JSON when it is asked for places under the food category, and the response includes the details of several places.
I want to create an object array where each object contains the details of a specific place.
I have used the GSON library for my implementation and it works fine for a dummy JSON object, but not with the JSON result given by the Google Places API: a JsonSyntaxException is thrown.
I am looking for a solution to the following:
1. How can I proceed with GSON and the given JSON object to create my object array, or
2. Is there any other way to accomplish my task (still using the JSON result)?
Thanks.
Update:
Class PlaceObject
import java.util.List;
public class PlaceObject {
    private List<String> results;
}
Class JSONconverter
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import com.google.gson.Gson;
public class JSONconverter {

    public static void main(String[] args) {
        Gson gson = new Gson();
        try {
            BufferedReader br = new BufferedReader(
                    new FileReader("c:\\placeAPI.json"));
            // convert the json string back to an object
            PlaceObject obj = gson.fromJson(br, PlaceObject.class);
            // obj.results = null when debugged - that's the problem
            System.out.println("Result: " + obj);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The link to the JSON:
http://www.mediafire.com/?8mmnuxuopimhdnz
I like working with Gson.
By the way, there is another relevant thread.
The Jersey client documentation proposes using the Jackson library (see the wiki).
You can also take a look at the Genson library: http://code.google.com/p/genson/.
It provides out-of-the-box integration with Jersey; you only need to have the jar on your classpath.
