Are there any generic implementations out there that can transform a Scala case class into a SolrDocument?
Since I could not find any such mapper utility to reuse, I took the following approach:
Create the case class object
Get the non-empty fields by transforming the case class object to a Map
Add the fields to the mutable document one by one.
This approach works, but it forces me to create the intermediate Map object, which I would like to avoid for verbosity and complexity reasons. Is there a better way of doing it?
Not a complete answer (I would write this in a comment, but I need a few more rep), but just to point you in a direction: macros are the way to do this in a type-safe way without writing boilerplate mapping functions for every case class. JSON libraries deal with the same problem (just replace SolrDocument with a JSON object). As an example, you can take a look at the JSON serializer/deserializer macro implementation from Play Framework:
https://github.com/playframework/playframework/blob/master/framework/src/play-json/src/main/scala/play/api/libs/json/JsMacroImpl.scala
I suspect this solution is a little heavier than you were looking for. The way I would approach it is to write the stupid boilerplate mapping functions for each case class, and only go down the macro path if that becomes a significant burden.
Seems fairly trivial to adapt one of the reflection-based answers to similar questions:
import org.apache.solr.common.SolrDocument

def getSolrDoc(cc: Product): SolrDocument = {
  val solrDoc = new SolrDocument
  // Use Java reflection to copy every declared field onto the document
  cc.getClass.getDeclaredFields.foreach { f =>
    f.setAccessible(true)
    solrDoc.addField(f.getName, f.get(cc))
  }
  solrDoc
}
And usage:
case class X(a: Int, b: String)

val x = X(1, "Test")
val doc = getSolrDoc(x)
println(doc) // prints SolrDocument{a=1, b=Test}
I want to create a Java transpiler that will read nearly-Java code (call it JavaHash) and emit "pure" Java code on the other end. In particular, I want to add a new token, the hash "#", in front of a hashmap member so that I can access it similarly to a JavaScript hash object:
Map<String, String> foo = new HashMap<String, String>();
...
foo.put("name", "Roger");
...
String name = #foo.name;
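Presumably the pure Java emitted on the other end would be the equivalent map lookup (my reading of the example above, not something stated explicitly):

String name = foo.get("name");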
I can't get JavaParser to do anything but throw an error on the "#" token.
Are there ways to catch tokens before they are parsed?
This is very far from trivial, but doable.
JavaParser is based on JavaCC; it uses a JavaCC grammar to generate the parser code. The parser then creates an abstract syntax tree out of code model classes.
If you want to add new language elements, you will need to:
implement code model classes;
extend the grammar used for parser generation.
This is not so easy; you will need good knowledge and understanding of JavaCC. But it is absolutely doable.
The rest is peanuts: you write a visitor and use it to traverse the AST. Once you encounter a node of the appropriate type, you simply transform that part of the AST into "normal" Java and serialize it.
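To make the visitor step concrete, here is a minimal sketch using the stock JavaParser API (assuming a recent JavaParser 3.x with StaticJavaParser and ModifierVisitor). It matches MethodCallExpr only as a stand-in; in the real transpiler you would match the new node type you added for the "#" construct:

import com.github.javaparser.StaticJavaParser;
import com.github.javaparser.ast.CompilationUnit;
import com.github.javaparser.ast.expr.MethodCallExpr;
import com.github.javaparser.ast.visitor.ModifierVisitor;
import com.github.javaparser.ast.visitor.Visitable;

public class RewriteSketch {
    public static void main(String[] args) {
        CompilationUnit cu = StaticJavaParser.parse(
                "class A { void m() { foo.put(\"name\", \"Roger\"); } }");
        cu.accept(new ModifierVisitor<Void>() {
            @Override
            public Visitable visit(MethodCallExpr call, Void arg) {
                // Inspect or replace the matched node here, then keep traversing.
                System.out.println("found: " + call);
                return super.visit(call, arg);
            }
        }, null);
        System.out.println(cu); // serialize the (possibly rewritten) AST back to Java
    }
}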
By the way, JavaParser is a very good basis for building something like what you want. So congratulations on your choice; that is half of the deal, actually.
I have a quite complex object structure (with a bunch of primitive fields and object references) and want to test all fields except a few of them. As an example:
ComplexObject actual = generateMagically("someInput");
ComplexObject expected = ActualFunction.instance.workMagically(actual);
// We want to be sure that workMagically() creates a new ComplexObject
// with some fields different from the "actual" object.
// assertThat(actual, samePropertyValuesAs(expected)); would check all fields.
// What I want is actually the following - note that "fieldName1" and
// "fieldName2" are primitives belonging to ComplexObject:
assertThat(actual, samePropertyValuesExceptAs(expected, "fieldName1", "fieldName2"));
Since I don't want to check all fields manually, I believe there must be a way to write that test elegantly. Any ideas?
Cheers.
You should have a look at shazamcrest, a great Hamcrest extension that offers what you need.
assertThat(actual, sameBeanAs(expected).ignoring("fieldName1").ignoring("fieldName2"));
See https://github.com/shazam/shazamcrest#ignoring-fields
Just pass the properties to ignore as the trailing varargs parameter of samePropertyValuesAs.
Hamcrest matcher API
public static <B> Matcher<B> samePropertyValuesAs(B expectedBean, String... ignoredProperties)
e.g.
samePropertyValuesAs(salesRecord, "id")
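For completeness, a small self-contained sketch of how this reads in a test; the SalesRecord bean is made up for illustration, and the ignoredProperties varargs overload requires Hamcrest 2.x:

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.beans.SamePropertyValuesAs.samePropertyValuesAs;

public class SalesRecordTest {

    // Hypothetical JavaBean; samePropertyValuesAs compares its getters.
    public static class SalesRecord {
        private final long id;
        private final String customer;
        public SalesRecord(long id, String customer) {
            this.id = id;
            this.customer = customer;
        }
        public long getId() { return id; }
        public String getCustomer() { return customer; }
    }

    @org.junit.Test
    public void matchesEverythingExceptId() {
        SalesRecord expected = new SalesRecord(1L, "ACME");
        SalesRecord actual = new SalesRecord(2L, "ACME"); // only the id differs
        assertThat(actual, samePropertyValuesAs(expected, "id")); // passes
    }
}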
In general I see two solutions, assuming you can modify ComplexObject yourself.
You could introduce an interface that represents the properties of ComplexObject that are being changed by ActualFunction. Then you can test that all properties of that new interface have changed. This would require that ComplexObject implements that new interface.
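A rough sketch of this first option (all names hypothetical): the interface narrows the object down to exactly the properties that are supposed to change, and the test compares only that view:

// Only the properties that workMagically() is supposed to change.
interface MagicallyChanged {
    String getFieldName1();
    int getFieldName2();
}

// ComplexObject would then declare: class ComplexObject implements MagicallyChanged { ... }
class MagicAssertions {
    static void assertAllChanged(MagicallyChanged before, MagicallyChanged after) {
        // Compare just this narrow view of the object.
        assert !before.getFieldName1().equals(after.getFieldName1());
        assert before.getFieldName2() != after.getFieldName2();
    }
}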
Another approach would be to replace the properties of ComplexObject that are changed by ActualFunction with a new property of a new type that contains all those properties. A better design would then be to let ActualFunction return an instance of that new type.
The last time I had a similar requirement, I came to the conclusion that manually writing both the code and the tests asserting that some values are updated is inherently fragile and error-prone.
I externalized the fields into a bag object and generated the Java source files for both the bag class itself and the copier at compile time. This way you test actual code (the generator) and keep the definition of the domain in exactly one place, so the copy code can't get out of date.
The language used to describe the properties can be anything you are comfortable with, from JSON Schema to XML to Java itself (a Java example follows; the custom annotations are to be consumed by the generator):
public class MyBag {
    @Prop public int oh;
    @Prop public String yeah;
}
Please correct me if I am wrong, but when using Java with, say, Spring MVC, you don't have to create these extra definitions to map your Java class to JSON and JSON back to the class.
Why do you have to do this in Play with Scala? Is it something to do with Scala?
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class Location(lat: Double, long: Double)

implicit val locationWrites: Writes[Location] = (
  (JsPath \ "lat").write[Double] and
  (JsPath \ "long").write[Double]
)(unlift(Location.unapply))

implicit val locationReads: Reads[Location] = (
  (JsPath \ "lat").read[Double] and
  (JsPath \ "long").read[Double]
)(Location.apply _)
The reason why you have to do it in Play is a framework design choice, and it is a very good one.
In Play, the mechanism relies on Scala implicits, a very powerful feature leveraged to make the mechanism highly pluggable, in the sense that at the moment you call:
Json.toJson(Location(4.5, 5.3))
The compiler will look for an implicit in scope matching the required type. The Scala language specification describes the algorithm used to resolve implicits, and that algorithm is designed so that you can "import" an implicit into a limited scope. Thanks to this feature, in different parts of your program you can make visible different implementations of your Reads / Writes or of any type class.
object MyImplicits {
  object ImplicitJson1 {
    // Writes all fields to JSON
    implicit val write: Writes[Location] = Writes { loc =>
      Json.obj("lat" -> loc.lat, "long" -> loc.long)
    }
  }
  object ImplicitJson2 {
    // Skips the "long" field
    implicit val write: Writes[Location] = Writes { loc =>
      Json.obj("lat" -> loc.lat)
    }
  }
}

object MyBusinessCode {
  def f1(location: Location): JsValue = {
    import MyImplicits.ImplicitJson1._
    Json.toJson(location)
  }

  def f2(location: Location): JsValue = {
    import MyImplicits.ImplicitJson2._
    Json.toJson(location)
  }

  def dynamicChoice(location: Location): JsValue = {
    // isEurope is a hypothetical predicate on Location
    implicit val write: Writes[Location] =
      if (location.isEurope) MyImplicits.ImplicitJson1.write
      else MyImplicits.ImplicitJson2.write
    Json.toJson(location)
  }
}
Instead, in Spring this is typically done through introspection and reflection. You might need to use annotations to help Spring determine how to build your JSON from your data model. The consequence is that you cannot change the way it is done, and therefore you have less flexibility.
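For comparison, here is roughly what that annotation-driven, reflection-based style looks like with Jackson, which Spring MVC typically uses under the hood (the DTO and its field names are made up; without the annotations Jackson would simply derive the JSON keys from the field names at run-time):

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LocationDto {
    // The annotations only rename the JSON keys; discovery itself is reflective.
    @JsonProperty("lat")  public double latitude;
    @JsonProperty("long") public double longitude;

    public static void main(String[] args) throws Exception {
        LocationDto loc = new LocationDto();
        loc.latitude = 4.5;
        loc.longitude = 5.3;
        System.out.println(new ObjectMapper().writeValueAsString(loc));
        // prints {"lat":4.5,"long":5.3}
    }
}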
Since you might not need more flexibility, many Scala libraries/frameworks provide functions that generate a default implementation of the type class you need. That extra line of code
implicit val fmt = Json.format[Location]
is the price you have to pay because Play JSON serialization relies on implicits.
You don't need to write the Reads and Writes by hand:
case class Location(lat: Double, long: Double)
object Location {
implicit val fmt = Json.format[Location]
}
Json.toJson(Location(4.5, 5.3)) // returns JsValue
The hand-written reads/writes/formats are useful when your JSON structure doesn't match your object definition.
Since you haven't mentioned which JSON/Spring library integration you meant, I am taking the example of JSON via the Jackson/Spring integration. I believe it takes advantage of the JavaBeans naming convention, which involves reflection and happens at run-time.
Play's Scala JSON library, however, provides compile-time safety for all the types in your JSON data. It also gives you a nice functional syntax (map, flatMap, orElse, etc.). This is a huge advantage.
See this question for more:
https://softwareengineering.stackexchange.com/questions/228193/json-library-jackson-or-play-framework
Theoretically it's possible to write a function with a default implicit parameter, like:
def toJson[T](x: T)(implicit fmt: Writes[T] = Json.format[T]) = Json.toJson(x)(fmt)
But in the case of Play JSON it will not work, because Json.format[T] is a macro and cannot resolve the generic type symbol T. It can only resolve symbols that refer directly to case classes, or to anything with an unapply (see the sources).
On the other hand, it seems possible to write a macro that generates the same function as described above, but that, instead of applying Json.format[T], uses the AST from JsMacroImpl.macroImpl(c, "format", ...).
Anyway, it's not a language restriction; it's just not implemented in the given library.
Is it possible to make ObjectMapper convert only the actual object, without converting the rest of the object tree recursively?
So that:
Map<String,Object> props = new ObjectMapper().convertValue(obj, Map.class);
results in a map of [field, value] entries where the values are the actual references to the instances of obj's fields, instead of Maps?
There is no such feature right now in Jackson. You could probably achieve it with a custom Serializer/Deserializer pair that shares some data and "protocol". But why bother, when the easier (and a LOT faster) way would be a generic way to go from POJO to Map, probably using reflection.
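A minimal sketch of that reflection-based alternative, assuming declared instance fields are enough (inherited fields, synthetic fields, and module access checks are ignored for brevity):

import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public final class ShallowMapper {
    // One level only: field name -> the actual field value, no recursion.
    public static Map<String, Object> toMap(Object obj) throws IllegalAccessException {
        Map<String, Object> props = new LinkedHashMap<>();
        for (Field f : obj.getClass().getDeclaredFields()) {
            f.setAccessible(true);
            props.put(f.getName(), f.get(obj)); // keeps the original reference
        }
        return props;
    }
}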
I am not sure I understand what you are really trying to do here.
But one thing that may help is to keep in mind that the java.lang.Object type (as well as JsonNode) can be freely included in the structure, to get a sort of "untyped" binding deeper down. With these types you can avoid rigid data binding for some subsets of the object model, and possibly convert those parts to POJOs more dynamically using ObjectMapper.convertValue().
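A hedged sketch of that mixed typed/untyped binding (the Wrapper and Location classes are invented for the example):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class UntypedBindingDemo {
    // Typed at the top level; "payload" stays an untyped JsonNode subtree.
    public static class Wrapper {
        public String id;
        public JsonNode payload;
    }

    public static class Location {
        public double lat;
        public double lng;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        Wrapper w = mapper.readValue(
                "{\"id\":\"42\",\"payload\":{\"lat\":4.5,\"lng\":5.3}}",
                Wrapper.class);
        // Bind the untyped subtree to a POJO later, on demand.
        Location loc = mapper.convertValue(w.payload, Location.class);
        System.out.println(loc.lat + ", " + loc.lng);
    }
}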
Take a case where Person is a POJO that has a List of "hobbies". I am just trying to understand this statement, in order to implement a deep-serialize mechanism:
new JSONSerializer().include("hobbies").serialize( person );
Does the syntax seem intuitive? From a Java user's point of view, it seems the syntax should be:
new JSONSerializer().serialize( person ).include("hobbies");
I say this because it seems intuitive to first serialize the primary object and then any lists or references thereof.
Also, is the source code of flexjson available for public use? It is not present on sourceforge.net.
You cannot do the latter so easily: the implementation would not know when you are done. You need some kind of terminator that performs the action, such as .run() or .done().
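To illustrate why (with invented names): in a fluent API each intermediate call only accumulates configuration and returns this, so only a designated terminal call can actually perform the work:

import java.util.ArrayList;
import java.util.List;

// Hypothetical fluent serializer, for illustration only.
public class FluentSerializer {
    private final List<String> includes = new ArrayList<>();

    // Intermediate call: records state and returns this for chaining.
    public FluentSerializer include(String field) {
        includes.add(field);
        return this;
    }

    // Terminal call: the first point where configuration is known to be complete.
    public String serialize(Object target) {
        return target + " with " + includes; // stand-in for real serialization
    }
}

If serialize(person) came first, the subsequent include("hobbies") would have nothing left to affect.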