Parsing ASCII communication with hardware using Java/Scala?

I am writing software that will interface with an external device via an exchange of ASCII commands. Example:
POS? 1 2
=>1=-1158.4405
=>2=+0000.0000
Here we ask for the position of a motorised microscope stage on the 1st and 2nd axes, and it responds with the positions in µm. More details and examples.
My question: is there a library that would ease parsing such string output, and/or help generate the queries?
Otherwise, what are the best practices for parsing and communicating with hardware using Java/Scala?

I am trying to cope with the following syntax (see 12.1 Format):
Reply syntax:
[<argument>[{SP<argument>}]"="]<value>LF
Multi-line reply syntax:
{[<argument>[{SP<argument>}]"="]<value>SP LF}
[<argument>[{SP<argument>}]"="]<value>LF for the last line!
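For instance, reading the POS? 1 2 exchange above against this grammar (and taking the leading "=>" in the example as marking the response direction rather than being part of the reply), the two reply lines break down as:
1=-1158.4405 SP LF    [<argument>"="]<value> SP LF   (non-final line)
2=+0000.0000 LF       [<argument>"="]<value> LF      (final line)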
This is my code:
import scala.util.parsing.combinator._

case class Result(argument: String, value: Float)

class ReplyParser extends RegexParsers {
  // Whitespace is significant in the reply grammar (SP LF separators), so do not skip it.
  override def skipWhitespace = false

  private def floatingPointNumber: Parser[String] =
    """(-|\+)?(\d+(\.\d*)?|\d*\.\d+)""".r

  private def value: Parser[Float] = floatingPointNumber ^^ (_.toFloat)

  private def argument: Parser[String] = "[^= \n]+".r

  private def arguments: Parser[List[String]] = rep1sep(argument, " ") <~ "="

  // One reply line: an optional "<argument> ... =" prefix followed by a value.
  private def result: Parser[List[Result]] = arguments.? ~ value ^^ {
    case arguments ~ value =>
      arguments.getOrElse(List("")).map {
        Result(_, value)
      }
  }

  // Lines are separated by "SP LF"; the trailing space is consumed at the end.
  def reply: Parser[List[Result]] = rep1sep(result, " \n".r) <~ " " ^^ {
    case result => result.flatten
  }
}

object Parsing extends ReplyParser {
  def main(args: Array[String]): Unit = {
    val result = parseAll(reply, "a=+000.123 \nc d=-999.567 \n789 ")
    println(s"$result")
  }
}
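For context, here is a minimal sketch of how the parser above could be fed from the device, assuming the controller is reachable over a plain TCP socket; the address, port and the line-termination handling are assumptions to adapt to the actual transport:

import java.io.{BufferedReader, InputStreamReader, OutputStreamWriter, PrintWriter}
import java.net.Socket

object StageClient extends ReplyParser {
  def main(args: Array[String]): Unit = {
    val socket = new Socket("192.168.0.100", 50000)   // placeholder address and port
    val out = new PrintWriter(new OutputStreamWriter(socket.getOutputStream, "US-ASCII"), true)
    val in  = new BufferedReader(new InputStreamReader(socket.getInputStream, "US-ASCII"))

    out.print("POS? 1 2\n")                           // query axes 1 and 2
    out.flush()

    // Per the reply grammar, every line except the last ends with "SP LF",
    // so keep reading as long as the LF-stripped line still ends with a space.
    val lines = scala.collection.mutable.ListBuffer[String]()
    var line = in.readLine()
    while (line != null && line.endsWith(" ")) {
      lines += line
      line = in.readLine()
    }
    if (line != null) lines += line                   // final line, no trailing space

    // Rebuild the shape the parser expects ("... SP LF ..." plus a trailing space)
    // and hand it to parseAll; a successful parse yields something like
    // List(Result(1,-1158.4405), Result(2,0.0)).
    val raw = lines.mkString("\n") + " "
    println(parseAll(reply, raw))

    socket.close()
  }
}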

Related

What configuration is required to get data from object storage via Swift in Spark?

I have gone through the documentation, but it is still very confusing how to get data from Swift.
I configured Swift on one of my Linux machines. Using the command below I am able to get the container list:
swift -A https://acc.objectstorage.softlayer.net/auth/v1.0/ -U
username -K passwordkey list
I have seen many blogs for Bluemix (https://console.ng.bluemix.net/docs/services/AnalyticsforApacheSpark/index-gentopic1.html#genTopProcId2) and written the code below:
sc.textFile("swift://container.myacct/file.xml")
I am looking to integrate this in Java Spark. Where do I need to configure the object storage credentials in the Java code? Is there any sample code or blog?
This notebook illustrates a number of ways to load data using Scala. Scala runs on the JVM, and Java and Scala classes can be freely mixed, whether they reside in different projects or in the same one. Looking at how Scala code interacts with OpenStack Swift object storage should help guide you towards crafting a Java equivalent.
From the above notebook, here are some steps illustrating how to configure and extract data from an OpenStack Swift object storage instance with the Stocator library, using Scala. The Swift URL decomposes into:
swift2d://container.myacct/filename.extension
  swift2d              - the Stocator protocol
  container            - name of the container
  myacct               - namespace
  filename.extension   - object storage filename
Imports
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext   // required for the SQLContext used below
import scala.util.control.NonFatal
import play.api.libs.json.Json

val sqlctx = new SQLContext(sc)           // sc is the SparkContext provided by the notebook
val scplain = sqlctx.sparkContext
Sample Creds
// #hidden_cell
var credentials = scala.collection.mutable.HashMap[String, String](
  "auth_url" -> "https://identity.open.softlayer.com",
  "project" -> "object_storage_3xxxxxx3_xxxx_xxxx_xxxx_xxxxxxxxxxxx",
  "project_id" -> "6xxxxxxxxxx04fxxxxxxxxxx6xxxxxx7",
  "region" -> "dallas",
  "user_id" -> "cxxxxxxxxxxaxxxxxxxxxx1xxxxxxxxx",
  "domain_id" -> "cxxxxxxxxxxaxxyyyyyyxx1xxxxxxxxx",
  "domain_name" -> "853255",
  "username" -> "Admin_cxxxxxxxxxxaxxxxxxxxxx1xxxxxxxxx",
  "password" -> """&M7372!FAKE""",
  "container" -> "notebooks",
  "tenantId" -> "undefined",
  "filename" -> "file.xml"
)
Helper Method
def setRemoteObjectStorageConfig(name: String, sc: SparkContext, dsConfiguration: String): Boolean = {
  try {
    val result = scala.util.parsing.json.JSON.parseFull(dsConfiguration)
    result match {
      case Some(e: Map[String, String]) => {
        val prefix = "fs.swift2d.service." + name
        val hconf = sc.hadoopConfiguration
        hconf.set("fs.swift2d.impl", "com.ibm.stocator.fs.ObjectStoreFileSystem")
        hconf.set(prefix + ".auth.url", e("auth_url") + "/v3/auth/tokens")
        hconf.set(prefix + ".tenant", e("project_id"))
        hconf.set(prefix + ".username", e("user_id"))
        hconf.set(prefix + ".password", e("password"))
        hconf.set(prefix + ".auth.method", "keystoneV3")   // note the leading dot in the key
        hconf.set(prefix + ".region", e("region"))
        hconf.setBoolean(prefix + ".public", true)
        println("Successfully modified sparkcontext object with remote Object Storage Credentials using datasource name " + name)
        println("")
        return true
      }
      case None =>
        println("Failed.")
        return false
    }
  }
  catch {
    case NonFatal(exc) =>
      println(exc)
      return false
  }
}
Load the Data
val setObjStor = setRemoteObjectStorageConfig("sparksql", scplain, Json.toJson(credentials.toMap).toString)
val data_rdd = scplain.textFile("swift2d://notebooks.sparksql/" + credentials("filename"))
data_rdd.take(5)

JSON4s can't find constructor with Spark

I've run into an issue when attempting to parse JSON in my Spark job. I'm using Spark 1.1.0, json4s, and the Cassandra Spark Connector with DSE 4.6. The exception thrown is:
org.json4s.package$MappingException: Can't find constructor for BrowserData
  org.json4s.reflect.ScalaSigReader$.readConstructor(ScalaSigReader.scala:27)
  org.json4s.reflect.Reflector$ClassDescriptorBuilder.ctorParamType(Reflector.scala:108)
  org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$6.apply(Reflector.scala:98)
  org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$6.apply(Reflector.scala:95)
  scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
My code looks like this:
case class BrowserData(navigatorObjectData: Option[NavigatorObjectData],
                       flash_version: Option[FlashVersion],
                       viewport: Option[Viewport],
                       performanceData: Option[PerformanceData])

// ... other case classes

def parseJson(b: Option[String]): Option[String] = {
  implicit val formats = DefaultFormats
  for {
    browserDataStr <- b
    browserData = parse(browserDataStr).extract[BrowserData]
    navObject <- browserData.navigatorObjectData
    userAgent <- navObject.userAgent
  } yield (userAgent)
}

def getJavascriptUa(rows: Iterable[com.datastax.spark.connector.CassandraRow]): Option[String] = {
  implicit val formats = DefaultFormats
  rows.collectFirst { case r if r.getStringOption("browser_data").isDefined =>
    parseJson(r.getStringOption("browser_data"))
  }.flatten
}

def getRequestUa(rows: Iterable[com.datastax.spark.connector.CassandraRow]): Option[String] = {
  rows.collectFirst { case r if r.getStringOption("ua").isDefined =>
    r.getStringOption("ua")
  }.flatten
}

def checkUa(rows: Iterable[com.datastax.spark.connector.CassandraRow], sessionId: String): Option[Boolean] = {
  for {
    jsUa <- getJavascriptUa(rows)
    reqUa <- getRequestUa(rows)
  } yield (jsUa == reqUa)
}

def run(name: String) = {
  val rdd = sc.cassandraTable("beehive", name).groupBy(r => r.getString("session_id"))
  val counts = rdd.map(r => (checkUa(r._2, r._1)))
  counts
}
I use :load to load the file into the REPL, and then call the run function. The failure is happening in the parseJson function, as far as I can tell. I've tried a variety of things to get this to work. From similar posts, I've made sure my case classes are at the top level of the file. I've tried compiling just the case class definitions into a jar and including that jar like this: /usr/bin/dse spark --jars case_classes.jar
I've tried adding them to the conf like this: sc.getConf.setJars(Seq("/home/ubuntu/case_classes.jar"))
And I still get the same error. Should I compile all of my code into a jar? Is this a Spark issue or a JSON4s issue? Any help is appreciated.

How to link classes from JDK into scaladoc-generated doc?

I'm trying to link classes from the JDK into the scaladoc-generated doc.
I've used the -doc-external-doc option of scaladoc 2.10.1 but without success.
I'm using -doc-external-doc:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/rt.jar#http://docs.oracle.com/javase/7/docs/api/, but I get links such as index.html#java.io.File instead of index.html?java/io/File.html.
Seems like this option only works for scaladoc-generated doc.
Did I miss an option in scaladoc, or should I file a feature request?
I've configured sbt as follows:
scalacOptions in (Compile,doc) += "-doc-external-doc:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/rt.jar#http://docs.oracle.com/javase/7/docs/api"
Note: I've seen the Opts.doc.externalAPI util in the upcoming sbt 0.13. I think a nice addition (not sure if it's possible) would be to pass a ModuleID instead of a File. The util would figure out which file corresponds to the ModuleID.
I use sbt 0.13.5.
There's no out-of-the-box way to get Javadoc links inside scaladoc. As I understand it, that is not sbt's fault but the way scaladoc works; as Josh pointed out in his comment, you should report it to scaladoc.
There is, however, a workaround I came up with: post-process the generated scaladoc so the Java URLs are rewritten into proper Javadoc links.
The file scaladoc.sbt should be placed inside an sbt project; whenever the doc task is executed, the post-processing in the fixJavaLinksTask task kicks in.
NOTE: there are lots of hardcoded paths, so use it with caution (i.e. do the polishing however you see fit).
import scala.util.matching.Regex.Match

autoAPIMappings := true

// builds -doc-external-doc
apiMappings += (
  file("/Library/Java/JavaVirtualMachines/jdk1.8.0_11.jdk/Contents/Home/jre/lib/rt.jar") ->
    url("http://docs.oracle.com/javase/8/docs/api")
)

lazy val fixJavaLinksTask = taskKey[Unit](
  "Fix Java links - replace #java.io.File with ?java/io/File.html"
)

fixJavaLinksTask := {
  println("Fixing Java links")
  val t = (target in (Compile, doc)).value
  (t ** "*.html").get.filter(hasJavadocApiLink).foreach { f =>
    println("fixing " + f)
    val newContent = javadocApiLink.replaceAllIn(IO.read(f), fixJavaLinks)
    IO.write(f, newContent)
  }
}

val fixJavaLinks: Match => String = m =>
  m.group(1) + "?" + m.group(2).replace(".", "/") + ".html"

val javadocApiLink = """\"(http://docs\.oracle\.com/javase/8/docs/api/index\.html)#([^"]*)\"""".r

def hasJavadocApiLink(f: File): Boolean = (javadocApiLink findFirstIn IO.read(f)).nonEmpty

fixJavaLinksTask <<= fixJavaLinksTask triggeredBy (doc in Compile)
I took the answer by Jacek Laskowski and modified it so that it avoids hard-coded strings and can be used for any number of Java libraries, not just the standard one.
Edit: the location of rt.jar is now determined from the runtime using sun.boot.class.path and does not have to be hard coded.
The only thing you need to modify is the map, which I have called externalJavadocMap in the following:
import scala.util.matching.Regex
import scala.util.matching.Regex.Match

val externalJavadocMap = Map(
  "owlapi" -> "http://owlcs.github.io/owlapi/apidocs_4_0_2/index.html"
)

/*
 * The rt.jar file is located in the path stored in the sun.boot.class.path system property.
 * See the Oracle documentation at http://docs.oracle.com/javase/6/docs/technotes/tools/findingclasses.html.
 */
val rtJar: String = System.getProperty("sun.boot.class.path").split(java.io.File.pathSeparator).collectFirst {
  case str: String if str.endsWith(java.io.File.separator + "rt.jar") => str
}.get // fail hard if not found

val javaApiUrl: String = "http://docs.oracle.com/javase/8/docs/api/index.html"

val allExternalJavadocLinks: Seq[String] = javaApiUrl +: externalJavadocMap.values.toSeq

def javadocLinkRegex(javadocURL: String): Regex = ("""\"(\Q""" + javadocURL + """\E)#([^"]*)\"""").r

def hasJavadocLink(f: File): Boolean = allExternalJavadocLinks exists {
  javadocURL: String =>
    (javadocLinkRegex(javadocURL) findFirstIn IO.read(f)).nonEmpty
}

val fixJavaLinks: Match => String = m =>
  m.group(1) + "?" + m.group(2).replace(".", "/") + ".html"

/* You can print the classpath with `show compile:fullClasspath` in the SBT REPL.
 * From that list you can find the name of the jar for the managed dependency.
 */
lazy val documentationSettings = Seq(
  apiMappings ++= {
    // Lookup the path to jar from the classpath
    val classpath = (fullClasspath in Compile).value
    def findJar(nameBeginsWith: String): File = {
      classpath.find { attributed: Attributed[File] => (attributed.data ** s"$nameBeginsWith*.jar").get.nonEmpty }.get.data // fail hard if not found
    }
    // Define external documentation paths
    (externalJavadocMap map {
      case (name, javadocURL) => findJar(name) -> url(javadocURL)
    }) + (file(rtJar) -> url(javaApiUrl))
  },
  // Override the task to fix the links to JavaDoc
  doc in Compile <<= (doc in Compile) map {
    target: File =>
      (target ** "*.html").get.filter(hasJavadocLink).foreach { f =>
        //println(s"Fixing $f.")
        val newContent: String = allExternalJavadocLinks.foldLeft(IO.read(f)) {
          case (oldContent: String, javadocURL: String) =>
            javadocLinkRegex(javadocURL).replaceAllIn(oldContent, fixJavaLinks)
        }
        IO.write(f, newContent)
      }
      target
  }
)
I am using SBT 0.13.8.
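For completeness, a hypothetical build.sbt fragment showing how documentationSettings might be attached to a project (the project name here is illustrative, not from the answer above):

lazy val root = (project in file("."))
  .settings(documentationSettings: _*)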

Strange NullPointerException in actors in Scala using the Play framework

I want to do some parallel computation, and I'm getting a really strange java.lang.NullPointerException when calling ANY function outside the object I have.
Take a look:
case class Return(session: String, job: Int)
case class Ready(n: Int)
case class DoJob(session: String, job: Int)
case object NotReady

object Notifications extends Controller with Secure {

  class AtorMeio extends Actor {
    import scala.collection.mutable.{Map => MMap}
    val job: MMap[(String, Int), Option[Int]] = MMap()

    def act {
      loop {
        react {
          case DoJob(session, jobn) =>
            if (job.get((session, jobn)).isEmpty) {
              jobn match {
                case 1 =>
                  job.put((session, jobn), None)
                  val n = Messaging.oi //Messaging.retrieveNumberOfMessages(new FlagTerm(new Flags(Flags.Flag.SEEN), false))
                  job.put((session, jobn), Some(n))
                case 2 =>
                  // do!
              }
            }
          case Return(session, jobn) =>
            if (job.get((session, jobn)).isDefined && job.get((session, jobn)).get.isDefined) {
              val ret = job.get((session, jobn)).get.get
              job.remove((session, jobn))
              reply(Ready(ret))
            }
            else
              reply(NotReady)
        }
      }
    }
  }

  private var meuator: AtorMeio = null

  lazy val ator = {
    if (Option(meuator).isEmpty) {
      meuator = new AtorMeio
      meuator.start
    }
    meuator
  }

  def pendingNotifications = {
    ator ! DoJob(session.getId, 1)
    ator !? Return(session.getId, 1) match {
      case Ready(ret) =>
        if (ret.toString != Option[String](params.get("current")).getOrElse("-1")) "true" else Suspend("80s")
      case _ =>
    }
  }
}
I'm getting the error when executing Messaging.oi, which is basically an object with:
def oi = 4
Here is the stacktrace:
controllers.Notifications$AtorMeio@1889d53: caught java.lang.NullPointerException
java.lang.NullPointerException
  at controllers.Messaging$.oi(Messaging.scala:108)
  at controllers.Notifications$AtorMeio$$anonfun$act$1$$anonfun$apply$1.apply(Notifications.scala:38)
  at controllers.Notifications$AtorMeio$$anonfun$act$1$$anonfun$apply$1.apply(Notifications.scala:31)
  at scala.actors.ReactorTask.run(ReactorTask.scala:34)
  at scala.actors.ReactorTask.compute(ReactorTask.scala:66)
  at scala.concurrent.forkjoin.RecursiveAction.exec(RecursiveAction.java:147)
  at scala.concurrent.forkjoin.ForkJoinTask.quietlyExec(ForkJoinTask.java:422)
  at scala.concurrent.forkjoin.ForkJoinWorkerThread.mainLoop(ForkJoinWorkerThread.java:340)
  at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:325)
Line 108 is exactly that one-liner def. The entry point is def pendingNotifications.
Can anyone help? Thanks a lot!
Have you tried replacing
private var meuator: AtorMeio = null
with something along the lines of
private var meuator: Option[AtorMeio] = None
Also configure the breakpoints view in your debugger to halt/break on NullPointerException.
And: you did see that you set this to null here:
private var meuator: AtorMeio = null
didn't you?
OK people, after digging a lot I discovered the problem: somehow, somewhere, if you have the Controller class from the Play framework mixed in, it crashes mercilessly. So I just wrapped this thing into a 'clean' class and it worked.
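A minimal sketch of what that separation might look like (the class name and layout below are illustrative, not the original code): the actor lives in a plain top-level class with no Play Controller mixed in, and the controller only keeps a reference to it.

// Hypothetical layout: the actor is defined outside any Controller.
class NotificationWorker extends Actor {
  def act {
    loop {
      react {
        case DoJob(_, _) =>
          // same job handling as in the original actor
        case Return(_, _) =>
          // same reply handling as in the original actor
      }
    }
  }
}

object Notifications extends Controller with Secure {
  // The controller only holds and starts the worker.
  lazy val ator = {
    val a = new NotificationWorker
    a.start()
    a
  }
}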

Is there a good library to embed a command prompt in a Scala (or Java) application?

I have an application that I'd like to have a prompt in. If it helps, this is a graph database implementation, and I need a prompt just like any other database client (MySQL, PostgreSQL, etc.).
So far I have my own REPL like so:
object App extends Application {
  REPL ! Read
}

object REPL extends Actor {
  def act() {
    loop {
      react {
        case Read => {
          print("prompt> ")
          var message = Console.readLine
          this ! Eval(message)
        }
        case More(sofar) => {
          // Eval didn't see a semicolon
          print(" --> ")
          var message = Console.readLine
          this ! Eval(sofar + " " + message)
        }
        case Eval(message) => {
          Evaluator ! Eval(message)
        }
        case Print(message) => {
          println(message)
          // And here's the loop
          this ! Read
        }
        case Exit => {
          exit()
        }
        case _ => {
          println("App: How did we get here")
        }
      }
    }
  }
  this.start
}
It works, but I would really like to have something with history. Tab completion is not necessary.
Any suggestions on a good library? Scala or Java works.
Just to be clear, I don't need a REPL to evaluate my code (I get that with Scala!), nor am I looking to call or use something from the command line. I'm looking for a prompt that is my user experience when my client app starts up.
Scala itself, and lots of programs out there, use a readline-like library for their REPLs. Specifically, JLine.
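As a rough sketch of how JLine could provide such a prompt with history (this assumes JLine 2 on the classpath; verify the exact API against the version you depend on):

import jline.console.ConsoleReader
import jline.console.history.FileHistory

object Prompt {
  def main(args: Array[String]): Unit = {
    val reader = new ConsoleReader()
    // Persist history between sessions in a hidden file (the path is just an example).
    val history = new FileHistory(new java.io.File(".myclient_history"))
    reader.setHistory(history)

    var line = reader.readLine("prompt> ")
    while (line != null && line.trim != "exit") {
      println("you typed: " + line)   // hand the line to your evaluator here
      line = reader.readLine("prompt> ")
    }
    history.flush()   // write the history to disk before exiting
  }
}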
I found another question about this, for which the answers don't seem promising.
BeanShell does some of what you want: http://www.beanshell.org/
I got it. These two blog posts really helped:
http://danielwestheide.com/blog/2013/01/09/the-neophytes-guide-to-scala-part-8-welcome-to-the-future.html
http://danielwestheide.com/blog/2013/01/16/the-neophytes-guide-to-scala-part-9-promises-and-futures-in-practice.html
// `reader` and `writer` below are assumed to wrap the child interpreter
// process's stdout and stdin respectively (they are not shown in the original post).
import scala.concurrent.{Future, Promise}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{Success, Failure}

def interprete(code: String): Future[String] = {
  val p = Promise[String]()
  Future {
    val result = reader.readLine()
    p.success(result)
  }
  writer.write(code + "\n")
  writer.flush()
  p.future
}

for (ln <- io.Source.stdin.getLines) {
  val f = interprete(ln)
  f.onComplete {
    case Success(s) =>
      println("future returned: " + s)
    case Failure(ex) =>
      println(s"interpreter failed due to ${ex.getMessage}")
  }
}
