Accessing a Java two-dimensional Vector from Scala

Here's my case:
I created a table with DefaultTableModel, so when I use getDataVector I get a two-dimensional java.util.Vector.
When I use toSeq or any other converter I get something like
Buffer([5.0, 1.0, 50.0], [10.0, 1.5, 40.0], [2.0, 1.5, 90.0], [1.0, 1.0, 100.0], [6.0, 3.0, 100.0], [16.0, 3.5, 50.0])
The inner objects are returned as java.lang.Object (AnyRef in Scala), not as arrays.
How can I convert them, or access their contents?
Here is the code to test:
import collection.mutable.{Buffer, ArrayBuffer}
import javax.swing.table._
import scala.collection.JavaConversions._

val data = Array(
  Array("5.0", "1.0", "50.0"),
  Array("10.0", "1.5", "40.0"),
  Array("2.0", "1.5", "90.0"),
  Array("1.0", "1.0", "100.0"),
  Array("6.0", "3.0", "100.0"),
  Array("16.0", "3.5", "50.0"))
val names = Array("K¹", "K²", "K³")
val m = new DefaultTableModel(data.asInstanceOf[Array[Array[AnyRef]]], names.asInstanceOf[Array[AnyRef]])
val t = m.getDataVector.toSeq

This is an older interface in Java, so it returns a pre-generic Vector (i.e. a Vector[_]). There are a variety of ways you could deal with this, but one is:
val jv = m.getDataVector.asInstanceOf[java.util.Vector[java.util.Vector[AnyRef]]]
val sv = jv.map(_.toSeq)
to first explicitly specify what the return type ought to be, and then convert it into Scala collections. If you prefer to convert to immutable collections, you can
val sv = Vector() ++ jv.map(Vector() ++ _)
among other things. (These are now Scala immutable vectors, not java.util.Vectors.)
If you want to mutate the vectors that were returned, just use jv as-is, and rely upon the implicit conversions to do the work for you.
Edit: here are a couple of other ways to get immutable collections (possible, but I wouldn't say they're better):
val sv = List(jv.map(v => List(v: _*)): _*)
val sv = Vector.tabulate(jv.length,jv(0).length)((i,j) => jv(i)(j))
Note that the second only works if the table is nonempty and rectangular.
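Putting the two steps together, here is a minimal, Swing-free sketch of the same idea. It builds a raw pre-generic Vector by hand instead of going through DefaultTableModel, and uses the explicit JavaConverters decorators rather than the implicit JavaConversions from the question (a deliberate substitution; both exist in the standard library):

```scala
import scala.collection.JavaConverters._

// Build a raw (pre-generic) java.util.Vector of rows, shaped like getDataVector's result.
val raw = new java.util.Vector[AnyRef]()
val row = new java.util.Vector[AnyRef]()
row.add("5.0"); row.add("1.0"); row.add("50.0")
raw.add(row)

// Step 1: assert the element type we know is really in there.
val jv = raw.asInstanceOf[java.util.Vector[java.util.Vector[AnyRef]]]

// Step 2: convert both nesting levels into immutable Scala vectors.
val sv: Vector[Vector[AnyRef]] = jv.asScala.map(_.asScala.toVector).toVector

println(sv) // Vector(Vector(5.0, 1.0, 50.0))
```

The cast in step 1 is unchecked (erasure means the JVM cannot verify the inner element type), which is exactly why it has to be stated explicitly.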

Related

How to convert a List[java.lang.Long] to a List[scala.Long]

I'm trying to convert from a Java List to a Scala List[scala.Long]; I have seen examples going from Scala to Java, but not the other way around.
I have tried using:
import scala.collection.JavaConverters._

def convertJavaList2ScalaList[A](list: java.util.List[A]): List[A] = {
  val buffer = list.asScala
  buffer.toList
}

It works for other objects (e.g. Person), but doesn't work when I try to convert java.lang.Long to scala.Long.
Thanks for the help.
import scala.collection.JavaConverters._
// given a Java List of Java Longs:
val jlist: java.util.List[java.lang.Long] = ???
val scalaList: List[Long] = jlist.asScala.toList.map(_.toLong)
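The reason the generic helper cannot do this on its own: asScala only swaps the collection wrapper, it never touches the elements, and java.lang.Long and scala.Long are distinct types. The unboxing therefore has to happen element by element, which is what the map does. A self-contained sketch:

```scala
import scala.collection.JavaConverters._

val jlist: java.util.List[java.lang.Long] =
  java.util.Arrays.asList(java.lang.Long.valueOf(1L), java.lang.Long.valueOf(2L), java.lang.Long.valueOf(3L))

// asScala preserves the element type: these are still boxed java.lang.Longs.
val boxed: List[java.lang.Long] = jlist.asScala.toList

// The unboxing is done per element (toLong compiles via Predef's Long2long conversion).
val unboxed: List[Long] = boxed.map(_.toLong)
```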

How to get a multi-dimensional primitive array class in Kotlin?

In Java I have been using Gson to parse a JSON like [[1.2, 4.1], [3.4, 4.4]] into a primitive multi-dimensional array double[][].
The code looks like this (and works fine):
String json = "[[1.2, 4.1], [3.4, 4.4]]";
double[][] variable = new Gson().fromJson(json, double[][].class);
Is there a way to get the double[][].class in Kotlin?
Can double[][] variable; be substituted in Kotlin?
Edit:
My goal is to achieve the same behavior with Gson in Kotlin. I have thousands of double arrays to parse.
I would like to do something like this in Kotlin:
val json = "[[1.1, 1.2], [2.1, 2.2, 2.3], [3.1, 3.2]]"
val variable: Double[][] = Gson().fromJson(json, Double[][]::class.java)
Answer to the Gson problem
For the class type of your use case, use Array<DoubleArray>::class.java.
Some additional words on multidimensional arrays
Simply nest doubleArrayOf calls (less boxing overhead than nested arrayOf) inside an arrayOf to get something like Array<DoubleArray>:
val doubles : Array<DoubleArray> = arrayOf(doubleArrayOf(1.2), doubleArrayOf(2.3))
It's also possible to nest multiple Array initializers with the following constructor:
public inline constructor(size: Int, init: (Int) -> T)
A call can look like this:
val doubles2: Array<DoubleArray> = Array(2) { i ->
  DoubleArray(2) { j ->
    j + 1 * (i + 1).toDouble()
  }
}
// [[1.0, 2.0], [2.0, 3.0]]
In the future, you can try using the Kotlin converter. I took your code and ran it through the converter and got the following working code which agrees with the answer given.
internal var json = "[[1.2, 4.1], [3.4, 4.4]]"
internal var variable = Gson().fromJson(json, Array<DoubleArray>::class.java)
You can mix arrayOf and doubleArrayOf for that case.
arrayOf(
  doubleArrayOf(1.2, 4.1),
  doubleArrayOf(3.4, 4.4)
)

convert scala hashmap with List to java hashmap with java list

I am new to Scala and Spark. I have the case class A below:
case class A(uniqueId: String,
             attributes: HashMap[String, List[String]])

Now I have a DataFrame of type A. I need to call a Java function on each row of that DataFrame, so I need to convert the HashMap to a Java HashMap and the List to a Java List.
How can I do that?
I am trying the following:
val rddCaseClass: RDD[A] = ???
val a = rddCaseClass.toDF().map { x =>
  val rowData = x.getAs[java.util.HashMap[String, java.util.List[String]]]("attributes")
  callJavaMethod(rowData)
}
But this gives me the error:
java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to java.util.List
Please help.
You can convert a Scala WrappedArray to a Java List using scala.collection.JavaConversions:
import scala.collection.mutable.WrappedArray
import scala.collection.JavaConversions

val wrappedArray: WrappedArray[String] = WrappedArray.make(Array("Java", "Scala"))
val javaList = JavaConversions.mutableSeqAsJavaList(wrappedArray)

JavaConversions.asJavaList could also be used, but it's deprecated: use mutableSeqAsJavaList instead.
I think you could use Seq instead of List for your parameters, to work efficiently with lists. This way it should work with most Seq implementations, with no need to convert seqs like WrappedArray:
val a = rddCaseClass.toDF().map { x =>
  val rowData = x.getAs[java.util.HashMap[String, Seq[String]]]("attributes")
  callJavaMethod(rowData)
}
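If the Java side really does need Java collections, the conversion can also be done explicitly, level by level, before the call. A minimal sketch (Spark-free; the attributes value and the "colors" key are made up for illustration):

```scala
import scala.collection.JavaConverters._
import scala.collection.immutable.HashMap

// The case class field from the question, with sample data.
val attributes: HashMap[String, List[String]] = HashMap("colors" -> List("red", "blue"))

// Convert both levels: each inner List becomes a java.util.List,
// then the outer map becomes a java.util.Map.
val javaAttributes: java.util.Map[String, java.util.List[String]] =
  attributes.map { case (k, v) => k -> v.asJava }.asJava
```

Note that asJava produces wrappers implementing java.util.Map / java.util.List; if the Java method insists on java.util.HashMap specifically, copy the result into one with new java.util.HashMap(javaAttributes).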

Update of the element in the DenseVector class, Spark

How can I update the element at index i in an object of the class DenseVector?
Is it possible? Well, it is:
scala> val vec = Vectors.dense(1, 2, 3)
vec: org.apache.spark.mllib.linalg.Vector = [1.0,2.0,3.0]
scala> vec.toArray(0) = 3.0
scala> vec
res28: org.apache.spark.mllib.linalg.Vector = [3.0,2.0,3.0]
I doubt this is intended behavior, though: it only works because DenseVector.toArray hands back the backing array rather than a copy. Since Vectors don't implement an update method, they are clearly designed as immutable data structures.
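The safer route is copy-then-rebuild: take a fresh copy of the values, change the copy, and construct a new vector. A dependency-free sketch of the pattern (the Vectors.dense step is left as a comment, since it needs Spark on the classpath):

```scala
// The values backing a dense vector, as vec.toArray would expose them.
val values = Array(1.0, 2.0, 3.0)

// updated returns a fresh array with index 0 replaced; the original is untouched.
val newValues = values.updated(0, 3.0)

// With Spark available, the new immutable vector would be:
// val newVec = Vectors.dense(newValues)
```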

Scala equivalent of new HashSet(Collection)

What is the equivalent Scala constructor (to create an immutable HashSet) to the Java
new HashSet<T>(c)
where c is of type Collection<? extends T>?
All I can find in the HashSet companion object is apply.
The most concise way to do this is probably to use the ++ operator:
import scala.collection.immutable.HashSet
val list = List(1,2,3)
val set = HashSet() ++ list
There are two parts to the answer. The first part is that Scala variable-argument methods taking a T* are sugar over methods taking a Seq[T]. You tell Scala to treat a Seq[T] as a list of arguments instead of a single argument by writing "seq : _*".
The second part is converting a Collection[T] to a Seq[T]. There's no general built-in way to do this in Scala's standard library just yet, but one very easy (if not necessarily efficient) way is to call toArray. Here's a complete example.
scala> val lst : java.util.Collection[String] = new java.util.ArrayList
lst: java.util.Collection[String] = []
scala> lst add "hello"
res0: Boolean = true
scala> lst add "world"
res1: Boolean = true
scala> Set(lst.toArray : _*)
res2: scala.collection.immutable.Set[java.lang.Object] = Set(hello, world)
Note that scala.Predef.Set and scala.collection.immutable.HashSet are synonyms.
From Scala 2.13, use the companion object:
import scala.collection.immutable.HashSet
val list = List(1,2,3)
val set = HashSet.from(list)
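On 2.13 the same companion-object route also covers the original java.util.Collection case directly, going through scala.jdk.CollectionConverters (the 2.13 replacement for JavaConverters). A sketch, assuming Scala 2.13+:

```scala
import scala.collection.immutable.HashSet
import scala.jdk.CollectionConverters._ // Scala 2.13+

// A Java collection, as in the question's `new HashSet<T>(c)`.
val jc: java.util.Collection[String] = java.util.Arrays.asList("hello", "world")

// The closest 2.13 analogue of the Java copy constructor.
val set: HashSet[String] = HashSet.from(jc.asScala)
```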
