Avoiding garbage while creating objects using Scala runtime reflection

In the example code below, I am trying to create case class objects with default values using runtime Scala reflection (required for my use case)!
First Approach
Define default values for case class fields
Create objects at runtime
Second Approach
Create a case class object in the companion object
Fetch that object using reflection
At first glance, the second approach seemed better because the object is created only once, but after profiling the two approaches the second doesn't seem to add much value. Sampling does confirm that only one such object is created throughout the runtime of the application, yet it appears that objects are still being allocated on every reflective call (correct me if I am wrong).
(Profiler snapshots for newDefault and newDefault2 are not shown here.)
object TestDefault extends App {
  case class XYZ(str: String = "Shivam")
  object XYZ { private val default: XYZ = XYZ() }

  case class ABC(int: Int = 99)
  object ABC { private val default: ABC = ABC() }

  def newDefault[A](implicit t: reflect.ClassTag[A]): A = {
    import reflect.runtime.{universe => ru}
    import reflect.runtime.{currentMirror => cm}

    val clazz = cm.classSymbol(t.runtimeClass)
    val mod = clazz.companion.asModule
    val im = cm.reflect(cm.reflectModule(mod).instance)
    val ts = im.symbol.typeSignature
    val mApply = ts.member(ru.TermName("apply")).asMethod
    val syms = mApply.paramLists.flatten
    val args = syms.zipWithIndex.map {
      case (p, i) =>
        val mDef = ts.member(ru.TermName(s"apply$$default$$${i + 1}")).asMethod
        im.reflectMethod(mDef)()
    }
    im.reflectMethod(mApply)(args: _*).asInstanceOf[A]
  }

  for (i <- 0 to 1000000000)
    newDefault[XYZ]

  // println(s"newDefault XYZ = ${newDefault[XYZ]}")
  // println(s"newDefault ABC = ${newDefault[ABC]}")

  def newDefault2[A](implicit t: reflect.ClassTag[A]): A = {
    import reflect.runtime.{currentMirror => cm}

    val clazz = cm.classSymbol(t.runtimeClass)
    val mod = clazz.companion.asModule
    val im = cm.reflect(cm.reflectModule(mod).instance)
    val ts = im.symbol.typeSignature
    val defaultMember = ts.members.filter(_.isMethod).filter(d => d.name.toString == "default").head.asMethod
    val result = im.reflectMethod(defaultMember).apply()
    result.asInstanceOf[A]
  }

  for (i <- 0 to 1000000000)
    newDefault2[XYZ]
}
Is there any way to reduce the memory footprint? Is there a better approach that achieves the same thing?
P.S. If you are trying to run this app, comment out the following blocks alternately:
for (i <- 0 to 1000000000)
  newDefault[XYZ]

for (i <- 0 to 1000000000)
  newDefault2[XYZ]
EDIT
As per @Levi Ramsey's suggestion, I did try memoization, but it seems to make only a small difference!
import java.util.concurrent.ConcurrentHashMap
import scala.reflect.runtime.universe

val cache = new ConcurrentHashMap[universe.Type, XYZ]()

def newDefault2[A](implicit t: reflect.ClassTag[A]): A = {
  import reflect.runtime.{currentMirror => cm}

  val clazz = cm.classSymbol(t.runtimeClass)
  val mod = clazz.companion.asModule
  val im = cm.reflect(cm.reflectModule(mod).instance)
  val ts = im.symbol.typeSignature
  if (!cache.containsKey(ts)) {
    val default = ts.members.filter(_.isMethod).filter(d => d.name.toString == "default").head.asMethod
    cache.put(ts, im.reflectMethod(default).apply().asInstanceOf[XYZ])
  }
  cache.get(ts).asInstanceOf[A]
}

for (i <- 0 to 1000000000)
  newDefault2[XYZ]
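For reference, here is a rough sketch (illustrative, not from the original post) of taking the memoization one step further: cache a ready-made factory per runtime class, so that the classSymbol/companion/apply$default$N lookups happen only once and the hot path is a map lookup plus a single reflective apply call.

import java.util.concurrent.ConcurrentHashMap
import scala.reflect.ClassTag

object DefaultFactory {
  // One factory thunk per runtime class; all reflective lookups happen in buildFactory.
  private val factories = new ConcurrentHashMap[Class[_], () => Any]()

  def newDefault[A](implicit t: ClassTag[A]): A = {
    val cls = t.runtimeClass
    var thunk = factories.get(cls)
    if (thunk == null) {
      thunk = buildFactory(cls)
      val previous = factories.putIfAbsent(cls, thunk)
      if (previous != null) thunk = previous
    }
    thunk().asInstanceOf[A]
  }

  private def buildFactory(runtimeClass: Class[_]): () => Any = {
    import scala.reflect.runtime.{currentMirror => cm, universe => ru}
    val clazz = cm.classSymbol(runtimeClass)
    val mod = clazz.companion.asModule
    val im = cm.reflect(cm.reflectModule(mod).instance)
    val ts = im.symbol.typeSignature
    val mApply = ts.member(ru.TermName("apply")).asMethod
    // Pre-resolve the mirrors for apply and each apply$default$N once per class.
    val defaults = mApply.paramLists.flatten.indices.map { i =>
      im.reflectMethod(ts.member(ru.TermName(s"apply$$default$$${i + 1}")).asMethod)
    }
    val applyMirror = im.reflectMethod(mApply)
    () => applyMirror(defaults.map(_()): _*)
  }
}

Each call still allocates the case class instance itself, but the per-call reflection garbage should largely disappear; whether that actually shows up in the profiler is worth verifying.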

Related

Scala: Reflection APIs to call one of the two methods with the same name

I am trying to use the Scala Reflection APIs to call one of two methods with the same name. The only difference is that one of them takes an argument and the other one doesn't. I want to call the one that doesn't take any arguments. I am trying something like this:
val ru = scala.reflect.runtime.universe
val rm = ru.runtimeMirror(getClass.getClassLoader)
val instanceMirror = rm.reflect(myInstance)
val methodSymbol = instanceMirror.symbol.typeSignature.member(ru.TermName("getXyzMethod")).asTerm.alternatives
if (methodSymbol != null && methodSymbol.nonEmpty) {
  try {
    val method = instanceMirror.reflectMethod(methodSymbol.head.asMethod)
    val value = method()
  } catch {
    case e: java.lang.IndexOutOfBoundsException =>
      val method = instanceMirror.reflectMethod(methodSymbol.last.asMethod)
      val value = method()
    case e: Exception =>
  }
}
This works but as you can see this is a bit ugly. The reason for doing it this way is that the 'methodSymbol' is a list in which the method I want is sometimes in the 'head' position & sometimes in the 'last' position.
How do I use Scala Reflection APIs to get only the method that I want which has no arguments?
You can do something like this:
import scala.reflect.api.JavaUniverse
import scala.util.{Failure, Success, Try}

val ru: JavaUniverse = scala.reflect.runtime.universe
val rm: ru.Mirror = ru.runtimeMirror(getClass.getClassLoader)
val instanceMirror: ru.InstanceMirror = rm.reflect(myInstance)
val methodSymbol: Seq[ru.Symbol] =
  instanceMirror.symbol.typeSignature.member(ru.TermName("getXyzMethod")).asTerm.alternatives

val maybeMethods: Try[ru.MethodSymbol] =
  Try(methodSymbol.map(_.asMethod).filter(_.paramLists.flatten.isEmpty).head)

val result: ru.MethodMirror = maybeMethods match {
  case Failure(exception) => // do something with it
    throw new Exception(exception)
  case Success(value) => instanceMirror.reflectMethod(value)
}

println(result)
This will always return the method with no parameters.
It works whether the method is defined like this:
def getXyzMethod() = ???
or
def getXyzMethod = ???
Adjust the filter if the method you want has parameters; for example, if it has exactly one parameter:
val maybeMethods: Try[ru.MethodSymbol] = Try(methodSymbol.map(_.asMethod).filter(_.paramLists.flatten.size==1).head)
And so on, hope this helps.
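As a small follow-up (not part of the original answer): result is a MethodMirror, so actually invoking the selected zero-argument method is just a matter of applying it.

// result is a ru.MethodMirror; applying it calls getXyzMethod with no arguments
val value: Any = result()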

Scala accept only String or Int generic case class in List

I have a case class defined as below
case class ChooseBoxData[T](index:T, text:String)
Is it possible to declare a List so that the list only accept type of ChooseBoxData[String] and ChooseBoxData[Int]?
What I expected is something like:
val specialList:List[some type declaration] = List(
ChooseBoxData[String]("some string","some string"),/* allow, because is ChooseBoxData[String]*/
ChooseBoxData[Int](12,"some string"), /* also allow, because is ChooseBoxData[Int]*/
ChooseBoxData[Boolean](true,"some string")/* not allow type other than ChooseBoxData[String] or ChooseBoxData[Int]*/
)
Something like this maybe:
trait AllowableBoxData

object AllowableBoxData {
  private def of[T](cbd: ChooseBoxData[T]) =
    new ChooseBoxData(cbd.index, cbd.text) with AllowableBoxData

  implicit def ofInt(cbd: ChooseBoxData[Int]) = of(cbd)
  implicit def ofString(cbd: ChooseBoxData[String]) = of(cbd)
}
Now you can do things like
val list: List[ChooseBoxData[_] with AllowableBoxData] = List(ChooseBoxData("foo", "bar"), ChooseBoxData(0, "baz"))
But not val list: List[AllowableBoxData] = List(ChooseBoxData(false, "baz"))
Also, if you were looking to declare a function argument rather than just a variable, there is a slightly more elegant solution:
trait CanUse[T]
implicit case object CanUseInt extends CanUse[Int]
implicit case object CanUseString extends CanUse[String]
def foo[T : CanUse](bar: List[ChooseBoxData[T]])
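A brief illustration of how the context bound behaves in practice (the body of foo here is only a placeholder):

def foo[T: CanUse](bar: List[ChooseBoxData[T]]): Unit =
  bar.foreach(println)

foo(List(ChooseBoxData(1, "a"), ChooseBoxData(2, "b")))   // compiles: CanUse[Int] is in scope
foo(List(ChooseBoxData("x", "y")))                        // compiles: CanUse[String] is in scope
// foo(List(ChooseBoxData(true, "z")))                    // does not compile: no CanUse[Boolean]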
Here's what I came up with:
First, we create the following Algebraic Data Types (ADT):
sealed trait StringInt
case class Stringy(s : String) extends StringInt
case class Inty(s : Int) extends StringInt
And define ChooseBoxData as follows:
case class ChooseBoxData(index : StringInt, text : String)
Then we define the following implicits to convert Int and String values in scope to the defined ADT:
object CBImplicits {
implicit def conv(u : String) = Stringy(u)
implicit def conv2(u : Int) = Inty(u)
}
Now, we can enforce the requirement in the question. Here is an example:
import CBImplicits._
val list = List(ChooseBoxData("str", "text"),
ChooseBoxData(1, "text"),
ChooseBoxData(true, "text"))
If you try to compile the above, the compiler will complain about a type mismatch. But this will compile and run:
List(
ChooseBoxData("str", "text"),
ChooseBoxData(1, "text"),
ChooseBoxData(12, "text2"))
which results in:
a: List[ChooseBoxData] =
List(ChooseBoxData(Stringy(str),text), ChooseBoxData(Inty(1),text), ChooseBoxData(Inty(12),text2))
This preserves index type information (wrapped in StringInt supertype of course) which later can be easily extracted using pattern matching for individual elements.
It is easy to remove the wrapper for all elements too, but it will result in the index type to become Any which is what we would expect because Any is the lowest common ancestor for both String and Int in Scala's class hierarchy.
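For instance, a minimal sketch of that unwrapping, assuming the compiled list above is bound to a as in the REPL output:

// Pattern matching on the StringInt wrapper recovers the underlying index values.
val indices: List[Any] = a.map(_.index match {
  case Stringy(s) => s // String index
  case Inty(i)    => i // Int index
})
// indices == List("str", 1, 12), typed as List[Any]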
EDIT: A Solution Using Shapeless
import shapeless._
import syntax.typeable._
case class ChooseBoxData[T](index : T, text : String)
val a = ChooseBoxData(1, "txt")
val b = ChooseBoxData("str", "txt")
val c = ChooseBoxData(true, "txt")
val list = List(a, b, c)
val `ChooseBoxData[Int]` = TypeCase[ChooseBoxData[Int]]
val `ChooseBoxData[String]` = TypeCase[ChooseBoxData[String]]
val res = list.map {
  case `ChooseBoxData[Int]`(u) => u
  case `ChooseBoxData[String]`(u) => u
  case _ => None
}
//result
res: List[Product with Serializable] = List(ChooseBoxData(1,txt), ChooseBoxData(str,txt), None)
So it allows compilation, but will replace invalid instances with None (which then can be used to throw a runtime error if desired), or you can directly filter the instances you want using:
list.flatMap(x => x.cast[ChooseBoxData[Int]])
//results in:
List[ChooseBoxData[Int]] = List(ChooseBoxData(1,txt))
You can build an extra constraint on top of your case class.
import language.implicitConversions
case class ChooseBoxData[T](index:T, text:String)
trait MySpecialConstraint[T] {
  def get: ChooseBoxData[T]
}

implicit def liftWithMySpecialConstraintString(cbd: ChooseBoxData[String]) =
  new MySpecialConstraint[String] {
    def get = cbd
  }

implicit def liftWithMySpecialConstraintInt(cbd: ChooseBoxData[Int]) =
  new MySpecialConstraint[Int] {
    def get = cbd
  }

// Now we can just use this constraint for our list
val l1: List[MySpecialConstraint[_]] = List(ChooseBoxData("A1", "B1"), ChooseBoxData(2, "B2"))
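A quick usage note (using the l1 list above): elements come back as MySpecialConstraint[_], so the original ChooseBoxData is recovered through get.

val texts: List[String] = l1.map(_.get.text) // List("B1", "B2")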
Why can't you do it like this:
object solution extends App {
  case class ChooseBoxData[T](index: T, text: String) extends GenericType[T]

  trait GenericType[T] {
    def getType(index: T, text: String): ChooseBoxData[T] = ChooseBoxData[T](index, text)
  }

  val specialList = List(
    ChooseBoxData[String]("some string", "some string"),
    ChooseBoxData[Int](12, "some string"),
    ChooseBoxData[Boolean](true, "some string")
  )

  println(specialList)
}
//output: List(ChooseBoxData(some string,some string), ChooseBoxData(12,some string), ChooseBoxData(true,some string))

Access a value's parent naming from within the instantiated class (Scala)?

Assume Scala 2.11. I'm writing a class that will persist a Scala value. Its intention is to be used like this:
class ParentClass {
  val instanceId: String = "aUniqueId"
  val statefulString: Persisted[String] = persisted { "SomeState" }

  onEvent {
    case NewState(state) => statefulString.update(state)
  }
}
Persisted is a class with a type parameter that is meant to persist that specific value like a cache, and Persist handles all of the logic associated with persistence. However, to simplify the implementation, I'm hoping to retrieve information about its instantiation. For example, if its instance in the parent class is named statefulString, how can I access that name from within the Persisted class itself?
The purpose of doing this is to prevent collisions in automatic naming of persisted values while simplifying the API. I cannot rely on using type, because there could be multiple values of String type.
Thanks for your help!
Edit
This question may be helpful: How can I get the memory location of a object in java?
Edit 2
After reading the source code for ScalaCache, it appears there is a way to do this via WeakTypeTag. Can someone explain what exactly is happening in its macros?
https://github.com/cb372/scalacache/blob/960e6f7aef52239b85fa0a1815a855ab46356ad1/core/src/main/scala/scalacache/memoization/Macros.scala
I was able to do this with the help of Scala macros and reflection, adapting some code from ScalaCache:
import scala.concurrent.ExecutionContext
import scala.reflect.macros.blackbox
import scala.util.control.NonFatal

class Macros(val c: blackbox.Context) {
  import c.universe._

  def persistImpl[A: c.WeakTypeTag, Repr: c.WeakTypeTag](f: c.Tree)(
      keyPrefix: c.Expr[ActorIdentifier],
      scalaCache: c.Expr[ScalaCache[Repr]],
      flags: c.Expr[Flags],
      ec: c.Expr[ExecutionContext],
      codec: c.Expr[Codec[A, Repr]]) = {
    commonMacroImpl(keyPrefix, scalaCache, { keyName =>
      q"""_root_.persistence.sync.caching($keyName)($f)($scalaCache, $flags, $ec, $codec)"""
    })
  }

  private def commonMacroImpl[A: c.WeakTypeTag, Repr: c.WeakTypeTag](
      keyPrefix: c.Expr[ActorIdentifier],
      scalaCache: c.Expr[ScalaCache[Repr]],
      keyNameToCachingCall: (c.TermName) => c.Tree): Tree = {
    val enclosingValSymbol = getValSymbol()
    val valNameTree = getValName(enclosingValSymbol)
    val keyName = createKeyName()
    val scalacacheCall = keyNameToCachingCall(keyName)
    val tree = q"""
      val $keyName = _root_.persistence.KeyStringConverter.createKeyString($keyPrefix, $valNameTree)
      $scalacacheCall
    """
    tree
  }

  /**
   * Get the symbol of the val that encloses the macro,
   * or abort the compilation if we can't find one.
   */
  private def getValSymbol(): c.Symbol = {
    def getValSymbolRecursively(sym: Symbol): Symbol = {
      if (sym == null || sym == NoSymbol || sym.owner == sym)
        c.abort(
          c.enclosingPosition,
          "This persistence block does not appear to be inside a val. " +
            "Memoize blocks must be placed inside vals, so that a cache key can be generated."
        )
      else if (sym.isTerm)
        try {
          val termSym = sym.asInstanceOf[TermSymbol]
          if (termSym.isVal) termSym
          else getValSymbolRecursively(sym.owner)
        } catch {
          case NonFatal(e) => getValSymbolRecursively(sym.owner)
        }
      else
        getValSymbolRecursively(sym.owner)
    }
    getValSymbolRecursively(c.internal.enclosingOwner)
  }

  /**
   * Convert the given val symbol to a tree representing the val's name.
   */
  private def getValName(valSymbol: c.Symbol): c.Tree = {
    val valName = valSymbol.asTerm.name.toString
    // return a Tree
    q"$valName"
  }

  private def createKeyName(): TermName = {
    // We must create a fresh name for any vals that we define, to ensure we don't clash with any user-defined terms.
    // See https://github.com/cb372/scalacache/issues/13
    // (Note that c.freshName("key") does not work as expected.
    // It causes quasiquotes to generate crazy code, resulting in a MatchError.)
    c.freshName(c.universe.TermName("key"))
  }
}
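For context, the answer does not show the user-facing macro definition that drives persistImpl; a hypothetical wiring (every name and implicit parameter below is an assumption, mirroring how ScalaCache exposes its memoize macro) could look like this:

import scala.language.experimental.macros

object Persistence {
  // Hypothetical front end: the macro bundle above derives the cache key from the enclosing val's name.
  def persisted[A, Repr](f: => A)(implicit keyPrefix: ActorIdentifier,
                                  scalaCache: ScalaCache[Repr],
                                  flags: Flags,
                                  ec: ExecutionContext,
                                  codec: Codec[A, Repr]): A = macro Macros.persistImpl[A, Repr]
}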

Using boxed/atomic values in Scala with Chronicle Map

We're using ChronicleMap to support off-heap persistence in a large number of different stores, but hit a bit of a problem with the simplest use case.
First of all, here's the helper I wrote to make creation easier:
import java.io.File
import java.util.concurrent.atomic.AtomicLong
import com.madhukaraphatak.sizeof.SizeEstimator
import net.openhft.chronicle.map.{ChronicleMap, ChronicleMapBuilder}
import scala.reflect.ClassTag
object ChronicleHelper {

  def estimateSizes[Key, Value](data: Iterator[(Key, Value)], keyEstimator: AnyRef => Long = defaultEstimator, valueEstimator: AnyRef => Long = defaultEstimator): (Long, Long, Long) = {
    println("Estimating sizes...")
    val entries = new AtomicLong(1)
    val keySum = new AtomicLong(1)
    val valueSum = new AtomicLong(1)
    var i = 0
    val GroupSize = 5000
    data.grouped(GroupSize).foreach { chunk =>
      chunk.par.foreach { case (key, value) =>
        entries.incrementAndGet()
        keySum.addAndGet(keyEstimator(key.asInstanceOf[AnyRef]))
        valueSum.addAndGet(valueEstimator(value.asInstanceOf[AnyRef]))
      }
      i += 1
      println("Progress:" + i * GroupSize)
    }
    (entries.get(), keySum.get() / entries.get(), valueSum.get() / entries.get())
  }

  def defaultEstimator(v: AnyRef): Long = SizeEstimator.estimate(v)

  def createMap[Key: ClassTag, Value: ClassTag](data: => Iterator[(Key, Value)], file: File): ChronicleMap[Key, Value] = {
    val keyClass = implicitly[ClassTag[Key]].runtimeClass.asInstanceOf[Class[Key]]
    val valueClass = implicitly[ClassTag[Value]].runtimeClass.asInstanceOf[Class[Value]]
    val (entries, averageKeySize, averageValueSize) = estimateSizes(data)
    val builder = ChronicleMapBuilder.of(keyClass, valueClass)
      .entries(entries)
      .averageKeySize(averageKeySize)
      .averageValueSize(averageValueSize)
      .asInstanceOf[ChronicleMapBuilder[Key, Value]]
    val cmap = builder.createPersistedTo(file)
    val GroupSize = 5000
    println("Inserting data...")
    var i = 0
    data.grouped(GroupSize).foreach { chunk =>
      chunk.par.foreach { case (key, value) =>
        cmap.put(key, value)
      }
      i += 1
      println("Progress:" + i * GroupSize)
    }
    cmap
  }

  def empty[Key: ClassTag, Value: ClassTag]: ChronicleMap[Key, Value] = {
    val keyClass = implicitly[ClassTag[Key]].runtimeClass.asInstanceOf[Class[Key]]
    val valueClass = implicitly[ClassTag[Value]].runtimeClass.asInstanceOf[Class[Value]]
    ChronicleMapBuilder.of(keyClass, valueClass).create()
  }

  def loadMap[Key: ClassTag, Value: ClassTag](file: File): ChronicleMap[Key, Value] = {
    val keyClass = implicitly[ClassTag[Key]].runtimeClass.asInstanceOf[Class[Key]]
    val valueClass = implicitly[ClassTag[Value]].runtimeClass.asInstanceOf[Class[Value]]
    ChronicleMapBuilder.of(keyClass, valueClass).createPersistedTo(file)
  }
}
It uses https://github.com/phatak-dev/java-sizeof for object size estimation. Here's the kind of usage we want to support:
object TestChronicle {
  def main(args: Array[String]) {
    def dataIterator: Iterator[(String, Int)] = (1 to 5000).toIterator.zipWithIndex.map(x => x.copy(_1 = x._1.toString))
    ChronicleHelper.createMap[String, Int](dataIterator, new File("/tmp/test.map"))
  }
}
But it throws an exception:
[error] Exception in thread "main" java.lang.ClassCastException: Key
must be a int but was a class java.lang.Integer [error] at
net.openhft.chronicle.hash.impl.VanillaChronicleHash.checkKey(VanillaChronicleHash.java:661)
[error] at
net.openhft.chronicle.map.VanillaChronicleMap.queryContext(VanillaChronicleMap.java:281)
[error] at
net.openhft.chronicle.map.VanillaChronicleMap.put(VanillaChronicleMap.java:390)
[error] at ...
I can see that it might have something to do with atomicity of Scala's Int as opposed to Java's Integer, but how do I bypass that?
Scala 2.11.7
Chronicle Map 3.8.0
It seems suspicious that in your test the data is Iterator[(String, Int)] (rather than Iterator[(Int, String)]), so the key type is String and the value type is Int, while the error message is complaining about the key's type (int/Integer).
If the error message says Key must be a %type%, it means that you configured that type in the initial ChronicleMapBuilder.of(keyType, valueType) statement. So in your case you configured int.class (the Class object representing the primitive int type in Java), which is not allowed, while you are providing java.lang.Integer instances to the map's methods (you probably pass primitive ints, but they become Integer due to boxing), which is allowed. You should ensure that you pass java.lang.Integer.class (or the corresponding Scala class) to the ChronicleMapBuilder.of(keyType, valueType) call.
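For illustration, a small sketch of that fix in the helper's terms (boxed below is a hypothetical helper, not part of the question's code): map primitive runtime classes coming out of ClassTag to their boxed counterparts before calling ChronicleMapBuilder.of.

import scala.reflect.ClassTag

// ClassTag[Int].runtimeClass is the primitive int.class, which fails the key check
// once boxed Integers arrive at put(); substitute the boxed class instead.
def boxed[T](implicit t: ClassTag[T]): Class[T] = {
  val c = t.runtimeClass
  val b =
    if (c == classOf[Int]) classOf[java.lang.Integer]
    else if (c == classOf[Long]) classOf[java.lang.Long]
    else if (c == classOf[Double]) classOf[java.lang.Double]
    else if (c == classOf[Boolean]) classOf[java.lang.Boolean]
    else c
  b.asInstanceOf[Class[T]]
}

// e.g. ChronicleMapBuilder.of(boxed[Key], boxed[Value]) instead of the raw runtimeClass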
I don't know what size estimate that project (https://github.com/phatak-dev/java-sizeof) gives, but in any case you should specify the size in bytes that the object will take in serialized form. The serialized form depends on the default serializers chosen for a specific type by Chronicle Map (which may change between Chronicle Map versions), or on custom serializers configured for a specific ChronicleMapBuilder. So configuring a Chronicle Map with any key/value "size" information that doesn't come out of Chronicle Map itself is fragile. You can use the following procedure to estimate sizes more reliably:
public static <V> double averageValueSize(Class<V> valueClass, Iterable<V> values) {
    try (ChronicleMap<Integer, V> testMap = ChronicleMap.of(Integer.class, valueClass)
            // doesn't matter, anyway not a single value will be written to a map
            .averageValueSize(1)
            .entries(1)
            .create()) {
        LongSummaryStatistics statistics = new LongSummaryStatistics();
        for (V value : values) {
            try (MapSegmentContext<Integer, V, ?> c = testMap.segmentContext(0)) {
                statistics.accept(c.wrapValueAsData(value).size());
            }
        }
        return statistics.getAverage();
    }
}
You can find it in this test: https://github.com/OpenHFT/Chronicle-Map/blob/7aedfba7a814578a023f7975ef15ba88b4d435db/src/test/java/eg/AverageValueSizeTest.java
This procedure is hackish, but there are no better options right now.
Another recommendation:
If your keys or values are primitive-like (ints, longs, doubles, but boxed), or any other type that always has the same size, you shouldn't use the averageKey/averageValue/averageKeySize/averageValueSize methods; use the constantKeySizeBySample/constantValueSizeBySample methods instead. Specifically for java.lang.Integer, Long and Double even this is not needed; Chronicle Map already knows that those types are constantly sized.
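As a concrete sketch of that last point (values of a constant-size type need no size hints; the file path and key-size figure here are just placeholders):

import java.io.File
import net.openhft.chronicle.map.ChronicleMapBuilder

val map = ChronicleMapBuilder
  .of(classOf[String], classOf[java.lang.Integer]) // boxed Integer, not the primitive class
  .averageKeySize(8)                               // assumed average key length in bytes
  .entries(5000)
  .createPersistedTo(new File("/tmp/test.map"))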

How to wrap an incremental mutable Java class in a functional Scala class without eagerly wasting memory?

[I created an imaginary JavaClass just to be able to test the code, see at the end of the question.]
I use an incremental/mutable algorithm from a Java library (Weka, but the question applies to any Java library). I am trying to wrap its mutable nature. One way to do it is shown in the code below:
class Model {
  private val model = new JavaModel //just a fake example (could be NaiveBayes)

  //just returns the JavaModel to simplify the question.
  def last_state(items: Seq[Item]): JavaModel = {
    items foreach model.update
    model
  }
}
The problem is that sometimes some of the intermediate states are also needed.
The straightforward way to do this is to keep a copy of each of them:
class Model {
  private val model = new JavaModel //just a fake example (could be NaiveBayes)

  def states(items: Seq[Item]): Stream[JavaModel] =
    if (items.isEmpty) Stream.Empty
    else {
      val h = items.head
      val t = items.tail
      val copied = clone(model) // (AbstractClassifier.makeCopy for weka users)
      copied.update(h)
      copied #:: states(t)
    }
}
When one needs to get the last result, this is much slower than the first code because of all the unneeded copying. I could put lazy vals inside the stream. But at the time that its evaluation occurs, the instance model is not guaranteed to be the same anymore.
case class Lazy(model: JavaModel) {
  lazy val copy = clone(model) // (AbstractClassifier.makeCopy for weka users)
}

class Model {
  private val model = new JavaModel //just a fake example (could be NaiveBayes)

  def states(items: Seq[Item]): Stream[Lazy] =
    if (items.isEmpty) Stream.Empty
    else {
      val h = items.head
      val t = items.tail
      model.update(h)
      Lazy(model) #:: states(t)
    }
}
}
This ugly solution was the best I found. It has non-private mutable fields, etc.:
class JavaClass(initial: Int = 1) {
  //represents a lot of data structures
  private var state = initial

  //Int represents something more complex
  def update(h: Int) {
    println("Some heavy calculations performed. State=" + state)
    state += 1
  }

  def copy = new JavaClass(state)

  //other methods that make use of the state ...
}

case class Lazy(java_object: JavaClass) {
  var lazy_var: Option[JavaClass] = null

  def copied = if (lazy_var == null) {
    lazy_var = Some(java_object.copy)
    lazy_var
  } else lazy_var
}

class Model {
  def states(items: Seq[Int]): Stream[Lazy] = {
    val java_class = new JavaClass

    def rec(items: Seq[Int], laz: Lazy): Stream[Lazy] =
      if (laz != null && items.isEmpty) Stream.Empty
      else {
        if (laz.lazy_var == null) laz.lazy_var = None
        val h = items.head
        val t = items.tail
        java_class.update(h)
        val new_laz = Lazy(java_class)
        new_laz #:: rec(t, new_laz)
      }

    rec(items, Lazy(null))
  }
}
//Test:
scala> val m = new Model
m: Model = Model#726b80fa
scala> val states = m.states(Seq(1, 2, 3, 4, 5))
Some heavy calculations performed. State=1
states: Stream[Lazy] = Stream(Lazy(JavaClass#283e1abf), ?)
scala> states(0).copied match {case Some(x) => x; case None => -1}
res31: Any = JavaClass#1029bf49
scala> states(3).copied match {case Some(x) => x; case None => -1}
Some heavy calculations performed. State=2
Some heavy calculations performed. State=3
Some heavy calculations performed. State=4
res32: Any = JavaClass#3cb40c69
scala> states(3).copied match {case Some(x) => x; case None => -1}
res33: Any = JavaClass#3cb40c69
scala> states(1).copied match {case Some(x) => x; case None => -1}
res34: Any = -1
A good place to start is "artificial fields" using the "_=" postfix, as explained in the example from this site. Maybe it is better to open a new question with correct camelCase names.
// Fields may be artificial:
class H {
  private var realX = 0

  def x = realX

  // called for "this.x = <value>":
  def x_=(newX: Int) {
    this.realX = newX
  }
}
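A tiny usage sketch of the artificial field above: assignment goes through x_=, reads go through the x getter.

val h = new H
h.x = 42     // desugars to h.x_=(42)
println(h.x) // prints 42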
