I'm working on a WebService built from an existing WSDL, using NetBeans and Glassfish.
NetBeans has created the needed classes from the given WSDL.
The WSDL defines some base data types (for example BaseType) and other data types which extend them (for example ExtType1, ExtType2, ...).
Some of the SOAP functions described in the WSDL accept parameters of type BaseType, so it should be possible to use the extended types as parameters, too.
In the web service client, written in PHP, I can invoke a method using a base type parameter:
$response = $ws->__soapCall(
    'myFunctionName',
    array('theParameter' => array(
        'BaseTypeField1' => 'some value',
        'BaseTypeField2' => 'some other value'
        )
    )
);
or using an extended type parameter:
$response = $ws->__soapCall(
    'myFunctionName',
    array('theParameter' => array(
        'BaseTypeField1' => 'some value',
        'BaseTypeField2' => 'some other value',
        'ExtTypeField1' => 'some value',
        'ExtTypeField2' => 'some other value'
        )
    )
);
Now, in the NetBeans-generated classes, I have an object of type JAXBElement<? extends BaseType> where a BaseType object is expected.
The question is: how can I determine, from within the Java web method, whether the parameter object sent by the web service client is a BaseType or one of its extended types (and, if so, which one)?
I have tried to retrieve some class information for that object, but it always says it is a BaseType, so I cannot know for sure whether ExtTypeField1 and ExtTypeField2 are available.
Thanks
Given that you have a JAXBElement<? extends BaseType> object, you can determine the type of its value as follows:
Class<? extends BaseType> klass = object.getValue().getClass();
From there you can branch on the object type, but that is not always the best way to go. What you probably want is something more like this:
BaseType value = object.getValue();
if (value instanceof ExtType1) {
    ExtType1 ext1 = (ExtType1) value;
    // we now know that it's an ExtType1
} else if (value instanceof ExtType2) {
    ExtType2 ext2 = (ExtType2) value;
    // we now know that it's an ExtType2
} // etc.
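As a side note, if the "it always says it's a BaseType" observation came from inspecting the JAXBElement itself: JAXBElement.getDeclaredType() reports the type declared in the schema, while the runtime class of the value is what distinguishes the extended types. A minimal sketch of the difference, using only the standard javax.xml.bind API (the helper method itself is hypothetical; BaseType and ExtType1 are the classes NetBeans generated from the WSDL):

import javax.xml.bind.JAXBElement;

void inspectParameter(JAXBElement<? extends BaseType> theParameter) {
    // The schema-declared type: this may well still report BaseType
    Class<? extends BaseType> declared = theParameter.getDeclaredType();

    // The concrete class that was actually unmarshalled (BaseType, ExtType1, ...)
    Class<? extends BaseType> actual = theParameter.getValue().getClass();
}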
I have a generic getter trait
trait Getter[A] {
  def get: A
}
and I would like to parse JSON into a List of objects implementing this trait. Two such implementations:
case class CoalesceGetter[A](getters: List[Getter[String]]) extends Getter[A] {
  // cast needed because the underlying getters produce String values
  override def get: A = getters.map(_.get).find(_ != null).orNull.asInstanceOf[A]
}
case class AsnGetter(ipGetter: Getter[String]) extends Getter[Long] {
  override def get: Long = 99L // dummy function
}
I would like to parse JSON into the correct Getter class based upon a property called function, which corresponds to the class, and a property called type, which corresponds to the generic type for the getters that need one (both properties are strings in the JSON blob I'm parsing). I've looked at custom serializers for json4s but don't see how to work with generics. Any help is appreciated!
First of all, I don't think it is a good idea to jsonify classes with a type argument. It is a better design to define non-generic (case) classes that are the direct equivalent of your JSON objects, and to use the standard JSON read/write support provided by many libraries.
But then, to answer your question, I'd like to reply with another question: how would you do it "manually"?
That is, how would you write and read different CoalesceGetter[A] instances with different A?
Here is a proposition: put the type argument in a JSON field:
"ofInt": {"type-arg":"Int", "getters":[ ... list of getters in json ...]},
"ofDouble":{"type-arg":"Double", "getters":[ ... list of getters in json ...]}
Now, if you were to write the reader, how would you instantiate ofInt and ofDouble, knowing only the type-args "Int" and "Double" (which are strings)?
I see two solutions:
1) Either you have a hard-coded mapping from the arg-type string to the actual Scala type:
argType match {
  case "Int"    => new CoalesceGetter[Int](...)
  case "Double" => new CoalesceGetter[Double](...)
}
2) Or you store a fully qualified class name as the arg-type string and resolve it reflectively, e.g. with Java's Class.forName (see https://stackoverflow.com/a/7495850/1206998 for an example). But this is a really, really bad idea IMHO.
(Note: if you want to serialize objects just to reload them later or on another computer, don't use JSON; use a dedicated serialization mechanism such as Java serialization or Kryo, which is what Spark uses.)
My data on Firebase uses many fields which have a string type but really hold enum values (which I check in my validation rules). To download the data into my Android app, following the guide, the field must be a plain String. I know I can work around this with a second (excluded) field which is an enum, and set it based on the string value. A short example:
class UserData : BaseModel() {
    val email: String? = null
    val id: String = ""
    val created: Long = 0
    // ... more fields omitted for clarity

    @Exclude
    var weightUnitEnum: WeightUnit = WeightUnit.KG
    var weightUnit: String
        get() = weightUnitEnum.toString()
        set(value) { weightUnitEnum = WeightUnit.fromString(value) }
}
enum class WeightUnit(val str: String) {
    KG("kg"), LB("lb");

    override fun toString(): String = str

    companion object {
        @JvmStatic
        fun fromString(s: String): WeightUnit = WeightUnit.valueOf(s.toUpperCase())
    }
}
Now, while this works, it's not really clean:
- The enum class itself is (1) kinda long for an enum, and (2) its insides are repeated for every enum. And I have more of them.
- It's not only enums: the created field above is really a timestamp, not a Long.
- Each model uses these enum fields many times, which bloats the model classes with repetitive code.
- The helper fields/functions get much worse/longer for fields with types such as Map<SomeEnum, Timestamp>.
So, is there any way to do this properly? Some library, maybe? Or some way to write a magic "field wrapper" that would automatically convert strings to enums, or numbers to timestamps, and so on, while still being compatible with the Firebase library for getting/setting data?
(Java solutions are welcome too :) )
If converting between a property holding your enum value and another property of String type is enough, this can be done easily and flexibly using Kotlin delegated properties.
In short, you implement a delegate for String properties which performs the conversion and actually gets/sets the value of another property storing the enum value, and then delegate the String property to it.
One possible implementation would look like this:
import kotlin.reflect.KMutableProperty
import kotlin.reflect.KProperty

class EnumStringDelegate<T : Enum<T>>(
        private val enumClass: Class<T>,
        private val otherProperty: KMutableProperty<T>,
        private val enumNameToString: (String) -> String,
        private val stringToEnumName: (String) -> String) {

    operator fun getValue(thisRef: Any?, property: KProperty<*>): String {
        return enumNameToString(otherProperty.call(thisRef).toString())
    }

    operator fun setValue(thisRef: Any?, property: KProperty<*>, value: String) {
        val enumValue = java.lang.Enum.valueOf(enumClass, stringToEnumName(value))
        otherProperty.setter.call(thisRef, enumValue)
    }
}
Note: This code requires you to add the Kotlin reflection API, kotlin-reflect, as a dependency to your project. With Gradle, use compile "org.jetbrains.kotlin:kotlin-reflect:$kotlin_version".
This will be explained below, but first let me add a convenience method to avoid creating the instances directly:
inline fun <reified T : Enum<T>> enumStringLowerCase(
property: KMutableProperty<T>) = EnumStringDelegate(
T::class.java,
property,
String::toLowerCase,
String::toUpperCase)
And a usage example for your class:
// if you don't need the `str` anywhere else, the enum class can be shortened to this:
enum class WeightUnit { KG, LB }
class UserData : BaseModel() {
    // ... more fields omitted for clarity

    @Exclude
    var weightUnitEnum: WeightUnit = WeightUnit.KG
    var weightUnit: String by enumStringLowerCase(UserData::weightUnitEnum)
}
Now, the explanation:
When you write var weightUnit: String by enumStringLowerCase(UserData::weightUnitEnum), you delegate the String property to the constructed delegate object. This means that when the property is accessed, the delegate methods are called instead. And the delegate object, in turn, works with the weightUnitEnum property under the hood.
The convenience function I added saves you from having to write WeightUnit::class.java at the property declaration site (thanks to the reified type parameter) and provides the conversion functions to EnumStringDelegate (you can create other functions with different conversions at any time, or even make a function that receives the conversion functions as lambdas).
Basically, this solution saves you from the boilerplate code that represents a property of enum type as a String property, given the conversion logic, and also allows you to get rid of the redundant code in your enum, if you don't use it anywhere else.
Using this technique, you can implement any other conversion between properties, like the number to timestamp you mentioned.
I am in a similar situation and thus found your question, plus a whole lot of other similar questions/answers.
I can't answer your question directly, but this is what I ended up doing: I decided to change my app and not use enum data types at all, mainly because of the advice from the Google developer portal about how bad enums are for an app's performance. See the video below: https://www.youtube.com/watch?v=Hzs6OBcvNQE
There have been some questions answered on this before.
How can I pass a scala object reference around in Java
How can I use a Scala singleton object in Java?
But my problem is that I have nested Scala objects, something like:
object Criteria {
  object ActionCriteria {
    case class Action(parameter: String) {
      def this(parameter: String) = { this(parameter) }
    }
    object Action {
      def apply(parameter: String): Action = { apply(parameter) }
    }
  }
}
In Java I then need to create a list of Actions. I have tried this, to no avail:
import Criteria.ActionCriteria.Action$
....
List<Criteria.ActionCriteria.Action$.MODULE$> actions = new ArrayList<>();
As well as a bunch of other combinations like adding $.MODULE$ with every object. Right now I am getting the following error:
error: cannot find symbol Criteria.ActionCriteria
List<Criteria$ActionCriteria$Action> actions = new ArrayList<>();
Seems to work fine. Found this with the Scala REPL:
scala> classOf[Criteria.ActionCriteria.Action]
res1: Class[Criteria.ActionCriteria.Action] = class Criteria$ActionCriteria$Action
If you want the type of Action object, not case class (highly unlikely, but for the sake of completeness):
scala> Criteria.ActionCriteria.Action.getClass
res2: Class[_ <: Criteria.ActionCriteria.Action.type] = class Criteria$ActionCriteria$Action$
The difference is caused by Scala expecting Action to be a type in classOf[Action], so it returns the type corresponding to the case class. When you use Action in a context where a value is expected, it returns the singleton instance instead, so you can call the standard Java method getClass to get the type of object Action.
In case you need other types:
Criteria$ cm = Criteria$.MODULE$;
Criteria.ActionCriteria$ cacm = Criteria.ActionCriteria$.MODULE$;
Criteria$ActionCriteria$Action$ cacam = Criteria$ActionCriteria$Action$.MODULE$;
Criteria$ActionCriteria$Action caca = new Criteria$ActionCriteria$Action("Foo");
Criteria.ActionCriteria$ is breaking the pattern here. Why? According to Iulian Dragos' comment under bug SI-2034 this is a special case:
since objects are "the equivalent of static" in the Java world, we
wanted to make it easier for Java code to use static inner classes.
When there's only one level of nesting, there's a guaranteed
companion: every top-level object gets a mirror class (if there isn't
one) that creates static forwarders to module methods (that's how one
can run a main method defined inside an object). Therefore, a
special case for one-level nesting: those classes use the flattened
name (without a $ suffix) as outer_name. So, Java code can say new Outer.Inner.
Summary
For every level of nesting other than the first, you replace . with $ in your class names
If the target type is also an object you add $ at the end
If you want an instance you add .MODULE$
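Putting the summary together for the original goal (building a list of Action instances from Java), a sketch along these lines should work, assuming the Criteria/ActionCriteria/Action nesting shown in the question:

import java.util.ArrayList;
import java.util.List;

// The nested case class flattens to Criteria$ActionCriteria$Action in bytecode
List<Criteria$ActionCriteria$Action> actions = new ArrayList<>();

// Either call the case class constructor directly...
actions.add(new Criteria$ActionCriteria$Action("foo"));

// ...or go through the companion object's apply method
actions.add(Criteria$ActionCriteria$Action$.MODULE$.apply("bar"));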
In Java I have something like:
Collectors.groupingBy((Re r) -> r.pName)
And it works properly. Now I'm trying to do the same thing in Scala, like:
Collectors.groupingBy((r: Re) => r.pName)
but then I get stuff like
cannot resolve reference groupingBy with such signature
cannot resolve symbol groupingBy
unspecified value parameters Collector
unspecified value parameters Supplier
Let me know if you need any more info/code, and I'll create some dummy example since I'm not allowed to post the exact code.
Update based on @Vladimir Matveev's answer:
pName should be String, but if I write new java.util.function.Function[Re, java.lang.String] then I get a
type mismatch;
found : java.util.function.Function[Re,String]
required: java.util.function.Function[_ >: Re, _ <: ?0(in value x$1)(in value x$1)(in value x$1)(in value x$1)]
Java lambdas are "implementors" of arbitrary functional interfaces (in this particular case, Collectors.groupingBy() accepts a java.util.function.Function). Scala anonymous functions, however, are instances of some class implementing a scala.FunctionX trait. Consequently, you can't use Scala functions for arbitrary functional interfaces (though there are plans to allow that, as far as I know).
You need to create an anonymous class extending java.util.function.Function explicitly:
Collectors.groupingBy(new java.util.function.Function[Re, ???] {
  def apply(r: Re) = r.pName
})
(you need to put correct type of pName instead of ???, of course).
If you're doing this often, you can define an implicit conversion from Scala's T => U to java.util.function.Function[T, U]:
implicit class FunctionWrapper[T, U](f: T => U) extends java.util.function.Function[T, U] {
  def apply(x: T): U = f(x)
}
Then (given that this implicit is in scope) you can use it like you tried initially:
Collectors.groupingBy((r: Re) => r.pName)
Update: I have no idea why your error happens (probably because of some incompatibility between Scala and Java generics), but if you specify all the types explicitly it does work:
scala> Collectors.groupingBy[Re, String](new JFunction[Re, String] {
| def apply(r: Re) = r.pName
| })
res2: java.util.stream.Collector[Re, _, java.util.Map[String,java.util.List[Re]]] = java.util.stream.Collectors$CollectorImpl@4f83df68
(JFunction is an alias for java.util.function.Function).
The variant with an implicit adaptor looks nicer (but still requires explicit type annotations):
scala> Collectors.groupingBy[Re, String]((r: Re) => r.pName)
res4: java.util.stream.Collector[Re, _, java.util.Map[String,java.util.List[Re]]] = java.util.stream.Collectors$CollectorImpl@71075444
I want to create a JSON-RPC request with three parameters: a String, an Integer, and my own object. The request should look like this:
{"method":"MyMethod", "params":["text", 123, {"name": "any text", "num": 15}], "id":1}
Ideally, I would like to create an AutoBean like this (but it does not work):
interface JsonRpcRequest {
    String getJsonrpc();
    void setJsonrpc(String value);

    String getMethod();
    void setMethod(String value);

    List<Object> getParams();            // ERROR: Type Object may not be used
    void setParams(List<Object> params); // ERROR: Type Object may not be used
}

interface JsonRpcRequestFactory extends AutoBeanFactory {
    AutoBean<JsonRpcRequest> jsonRpcRequest();
}
The problem is that the AutoBean framework does not allow the use of List<Object> inside the interface.
Is there another way to create a list/array of elements of mixed types (both simple values and my own objects)?
No, you simply can't. AutoBean requires everything to be statically typed: no polymorphism, and no mixed-type lists or maps.
You might be interested in RequestFactory's built-in support for JSON-RPC, though.
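If all you need is to produce that exact params payload on the client, one workaround outside AutoBean entirely is to assemble it with GWT's com.google.gwt.json.client classes. A rough sketch, using the values from the request shown in the question (the method name buildRequestPayload is made up for the example):

import com.google.gwt.json.client.JSONArray;
import com.google.gwt.json.client.JSONNumber;
import com.google.gwt.json.client.JSONObject;
import com.google.gwt.json.client.JSONString;

// Builds the JSON-RPC request shown in the question
String buildRequestPayload() {
    // {"name": "any text", "num": 15}
    JSONObject custom = new JSONObject();
    custom.put("name", new JSONString("any text"));
    custom.put("num", new JSONNumber(15));

    // the mixed-type params array: ["text", 123, {...}]
    JSONArray params = new JSONArray();
    params.set(0, new JSONString("text"));
    params.set(1, new JSONNumber(123));
    params.set(2, custom);

    // the JSON-RPC envelope
    JSONObject request = new JSONObject();
    request.put("method", new JSONString("MyMethod"));
    request.put("params", params);
    request.put("id", new JSONNumber(1));

    return request.toString();
}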
Why do your params all need to be passed back in a list? Surely you're not going to do the same thing with a String, an Integer, and another Object! Just send them all back separately.
Further, you're not sending a custom Object over the JSON, you're sending the objid of that object... so just send the Integer id and let the server handle it.
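For illustration, "send them separately" could look like a plain statically typed AutoBean; the interface and property names below (MyMethodParams and its getters/setters) are made up for the example:

import com.google.web.bindery.autobean.shared.AutoBean;
import com.google.web.bindery.autobean.shared.AutoBeanFactory;

// Every property has a concrete type, so AutoBean can handle it
interface MyMethodParams {
    String getText();
    void setText(String text);

    int getNumber();
    void setNumber(int number);

    // reference the custom object by its id and resolve it on the server
    int getObjId();
    void setObjId(int objId);
}

interface MyMethodParamsFactory extends AutoBeanFactory {
    AutoBean<MyMethodParams> params();
}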