How to use JNAerator in an sbt project - java

I work on a Scala project that uses C++ code, built with sbt. Once compiled, this C++ code is imported into Scala through Java code that uses JNA.
Currently, the Java wrappers are written manually, and I would like to automate this. I've found JNAerator, which can do that, but I don't know how I should use it from sbt.
I see two general approaches:
use the command line, such as java -jar jnaerator ..., but I don't know how to set up such a command-line task in sbt. I would also need to know the typical project structure to follow: where should the JNA-generated code go?
use the JNAerator Maven plugin through sbt, if that is possible?

This might take some iteration until we get it to do what you need.
For the first approach, here is how you can run a custom system command from sbt (you essentially solve this with Scala code). Add the following to your build.sbt file:
lazy val runJnaerator = taskKey[Unit]("This task generates libraries from native code")

runJnaerator := {
  import sys.process._
  Seq("java", "-jar", "jnaerator", "...").!
}
To execute:
> sbt runJnaerator
Now the question is: where do you need these files to go? Finally, how do you want to invoke everything?
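As a sketch of how this can fit a typical layout: generated sources usually go under sourceManaged, so they get compiled with the rest of the project but stay out of version control. Everything below (the jar location, package name, header path) is an illustrative assumption, not something JNAerator prescribes; sbt 1.x syntax:

lazy val runJnaerator = taskKey[Seq[File]]("Generates JNA bindings from native headers")

runJnaerator := {
  import sys.process._
  val outDir = (Compile / sourceManaged).value / "jnaerator"
  outDir.mkdirs()
  // Hypothetical jar name, package, and header; adjust to your project.
  Seq(
    "java", "-jar", "jnaerator.jar",
    "-package", "com.example.mylib",
    "-o", outDir.getAbsolutePath,
    "src/main/native/mylib.h"
  ).!
  (outDir ** "*.java").get
}

// Compile the generated sources together with the rest of the project:
Compile / sourceGenerators += runJnaerator.taskValue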

Related

How to run a Gatling simulation from Java code without Maven and Gradle?

I would like to execute my Gatling simulation from within Java code, not with a Maven or Gradle command. Is it possible to run the tests/scenarios directly from Java code?
Option 1:
If you want to run Gatling from code you can invoke this class:
io.gatling.app.Gatling
Source code:
https://github.com/gatling/gatling/blob/master/gatling-app/src/main/scala/io/gatling/app/Gatling.scala
I probably wouldn't call main directly, but rather the start function, or a custom start function like it. Here is an attempt at that.
Something like this (copied from the link above):
import io.gatling.app.Gatling
import io.gatling.core.config.GatlingPropertiesBuilder
object Engine extends App {
  val props = new GatlingPropertiesBuilder
  props.simulationClass("your.simulation.class.goes.here")
  props.dataDirectory("path.to.data.directory")         // optional
  props.resultsDirectory("path.to.results.directory")   // optional
  props.bodiesDirectory("path.to.template.directory")   // optional
  props.binariesDirectory("path.to.binaries.directory") // optional
  Gatling.fromMap(props.build)
}
Option 2:
(Compilation phase) Use the Maven archetype to generate helper classes (you probably need to compile your Java anyway). Docs. This will generate Engine (code) and other classes which you can run. This is similar to Option 1 but helps to resolve paths if you are working from a Maven project. It makes sense if you use Maven to build your project.
Option 3:
Invoke gatling.sh or gatling.bat as a process from Java with Runtime.getRuntime().exec() or similar.
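A minimal sketch of Option 3 in Scala (the same idea works from Java with ProcessBuilder); the Gatling home path and simulation name here are illustrative assumptions:

import scala.sys.process._

// Assumes an unpacked Gatling bundle; adjust the path to your installation.
val gatlingHome = sys.env.getOrElse("GATLING_HOME", "/opt/gatling")
val exitCode = Seq(
  s"$gatlingHome/bin/gatling.sh",
  "-s", "your.simulation.class.goes.here" // -s selects the simulation class
).!
if (exitCode != 0) sys.error(s"Gatling run failed with exit code $exitCode")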
Bear in mind:
Gatling tests need to be compiled before they are executed. This is basically what gatling.[sh|bat] does:
# Run the compiler
"$JAVA" $COMPILER_OPTS -cp "$COMPILER_CLASSPATH" io.gatling.compiler.ZincCompiler $EXTRA_COMPILER_OPTIONS "$@" 2> /dev/null
# Run Gatling
"$JAVA" $DEFAULT_JAVA_OPTS $JAVA_OPTS -cp "$GATLING_CLASSPATH" io.gatling.app.Gatling "$@"
If you call Scala code from Java, there is definitely interop available. Make sure the Scala code is compiled first in your case, or can easily be loaded on the Java classpath. Create a Java-friendly wrapper on the Scala side if needed, as sketched below.
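For instance, a sketch of such a Java-friendly wrapper on the Scala side, reusing the Option 1 API (the object name is illustrative):

import io.gatling.app.Gatling
import io.gatling.core.config.GatlingPropertiesBuilder

object GatlingLauncher {
  // Scala emits a static forwarder, so from Java this is callable
  // as GatlingLauncher.run("my.Simulation")
  def run(simulationClass: String): Unit = {
    val props = new GatlingPropertiesBuilder
    props.simulationClass(simulationClass)
    Gatling.fromMap(props.build)
  }
}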
I would start with Option 1 or 2 if you want tight integration, and with Option 3 if you just want to glue things together and don't mind the startup/init time.
Pay attention to the classpath needed for Gatling - this will depend on where its classes are located (in your project or outside it) and how you invoke it.
You can definitely pass test names; just see how the arguments are used in those classes - for example, simulationClass in props. All the available methods are there (simulationsDirectory, simulationClass, etc.).
I'm sure it will take a bit of trial and error but definitely can be done.

How can I run Kotlin-Script (.kts) files from within Kotlin/Java?

I noticed that IntelliJ can parse .kts files as Kotlin and the code editor picks them up as free-floating Kotlin files. You are also able to run the script in IntelliJ as you would a Kotlin file with a main method. The script executes from top to bottom.
This form is PERFECT for the project I'm working on, if only I knew an easy way to use them from within Java or Kotlin.
What's the idiomatic way to "run" these scripts from Java or Kotlin?
Note that script files support in Kotlin is still pretty much experimental. This is an undocumented feature which we're still in the process of designing. What's working today may change, break or disappear tomorrow.
That said, currently there are two ways to invoke a script. You can use the command line compiler:
kotlinc -script foo.kts <args>
Or you can invoke the script directly from IntelliJ IDEA, by right-clicking in the editor or in the project view on a .kts file and selecting "Run ...":
KtsRunner
I've published a simple library that lets you run scripts from regular Kotlin programs.
https://github.com/s1monw1/KtsRunner
Example
The example class
data class ClassFromScript(val x: String)
The .kts file
import de.swirtz.ktsrunner.objectloader.ClassFromScript
ClassFromScript("I was created in kts")
The code to load the class
import de.swirtz.ktsrunner.objectloader.KtsObjectLoader
import java.nio.file.*

val scriptReader = Files.newBufferedReader(Paths.get("path/classDeclaration.kts"))
val loadedObj: ClassFromScript = KtsObjectLoader().load<ClassFromScript>(scriptReader)
println(loadedObj.x) // >> I was created in kts
As shown, the KtsObjectLoader class can be used for executing a .kts script and return its result. The example shows a script that creates an instance of the ClassFromScript type that is loaded via KtsObjectLoader and then processed in the regular program.
As of 2020 (Kotlin 1.3.70), you can just use the straightforward
kotlin script.main.kts
Note that using the file extension .main.kts instead of .kts seems to be important.
Note that, for me, this does not seem to run the main() function if one is defined; I had to add a manual call to main() at the top level.
One of the advantages of Kotlin script is the ability to declare code and dependencies inside a single file (with @file:DependsOn; see for example here).
As of the early 2020s, kscript, which you can find at https://github.com/holgerbrandl/kscript, seems to be the most convenient and well-supported way to go.
jgo can fetch and run code from Maven repositories, so it can be used to invoke https://github.com/scijava/scijava-common plus https://github.com/scijava/scripting-kotlin to execute a local Foo.kt like so:
jgo --repository scijava.public=maven.scijava.org/content/groups/public org.scijava:scijava-common:#ScriptREPL+org.scijava:scripting-kotlin Foo.kt
If no Foo.kt is provided, it launches a Kotlin REPL.

Where should I configure code generation in NPM packages?

Disclaimer: I am the author of Jsonix and the Jsonix Schema Compiler, and I'm trying to figure out the canonical way the Jsonix Schema Compiler should be integrated in an NPM package.json.
The jsonix-schema-compiler NPM package provides a Java-based tool for code generation. If jsonix-schema-compiler is installed as a dependency, it can be used to generate XML<->JS mappings. The invocation is something like:
java -jar node_modules/jsonix-schema-compiler/lib/jsonix-schema-compiler-full.jar schema.xsd
This generates a JavaScript file like Mappings.js which is basically a part of the module's code.
Ideally, the jsonix-schema-compiler invocation above (java -jar ... and so on) should be executed during the module build. But it must be executed after the module's dependencies are installed (otherwise node_modules/jsonix-schema-compiler will be missing).
My question is - where should I canonically configure code generation in NPM packages?
Right now I'm doing it in the postinstall script, something like:
{
  ...
  "dependencies": {
    "jsonix": "x.x.x",
    "jsonix-schema-compiler": "x.x.x"
  },
  "devDependencies": {
    "nodeunit": "~0.8.6"
  },
  "scripts": {
    "postinstall": "java -jar node_modules/jsonix-schema-compiler/lib/jsonix-schema-compiler-full.jar schema.xsd",
    "test": "nodeunit src/test/javascript/tests.js"
  }
}
However, having read this:
tl;dr Don't use install. Use a .gyp file for compilation, and
prepublish for anything else.
You should almost never have to explicitly set a preinstall or install
script. If you are doing this, please consider if there is another
option.
I am now confused if postinstall is also OK.
All I want to do is to be able to execute a certain command-line command after dependencies are installed but before other things (like tests or publish). How should I canonically do it?
Typically people run things like CoffeeScript-to-JavaScript compilers, ECMAScript 6-to-5 transpilers, and minifiers as a build step, which is what it sounds like you're doing.
The difference between doing it pre-publish and post-install is that a prepublish script is probably going to run in your checked-out directory, so it's reasonable to assume that java and your various dev-dependencies are available; the post-install script, by contrast, runs after every install and will fail wherever java (etc.) is not available, as on a minimalist Docker image. So you should put your build step in a prepublish or similar script.
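For example, moving the generation command from the postinstall shown above into prepublish would look like this (same command, only the hook changes):

{
  "scripts": {
    "prepublish": "java -jar node_modules/jsonix-schema-compiler/lib/jsonix-schema-compiler-full.jar schema.xsd",
    "test": "nodeunit src/test/javascript/tests.js"
  }
}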
Personally, what I like to do is define a script mypublish in package.json that ensures all tests pass, runs the build, checks that the build artefacts exist, and then runs npm publish. I find this more intuitive than prepublish, which is meant to be used as an "I'm about to publish" hook, not a "do the build before publishing" step.
Here is a package.json that uses this setup: https://github.com/reid/node-jslint/blob/master/package.json and here's the Makefile with the prepublish target: https://github.com/reid/node-jslint/blob/master/Makefile
Let me know if you have more questions; I'm kind of rambling because there are many legitimate ways to get it done, as long as you avoid postinstall scripts. ;-)

How do I import the necessary Jinterface packages in Java?

I understand that to run Jinterface in Java I need to import its packages:
import com.ericsson.otp.erlang.*;
However, this is not included in Java's default libraries but in Erlang's. How do I access this library? Which path should I use? I've googled it but found nothing. I am using Ubuntu 13.10. The import above is not enough on its own for this to work.
If you have done any Java development before, then you know that you should add OtpErlang.jar to your application's class path.
You can do this in the command line, Ant, Maven, Gradle or even in your IDE.
Command line example:
javac -classpath OtpErlang.jar YourGame.java
I use OS X and OtpErlang.jar is under:
/usr/local/Cellar/erlang/R16B03-1/lib/erlang/lib/jinterface-1.5.8/priv/OtpErlang.jar
Keep in mind that you need to include OtpErlang.jar also when you run your game.
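For example, a run command along these lines (assuming the compiled class and OtpErlang.jar are both in the current directory):
java -classpath .:OtpErlang.jar YourGame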
Try looking in the Jinterface User's Guide; it gives an example of compiling the Java code.

RemoteActorRefProvider ClassNotFound

I'm struggling to get remote actors set up in Scala. I'm running Scala 2.10.2 and Akka 2.2.1.
I compile using (I've shortened the paths in the classpath argument for clarity's sake):
$ scalac -classpath "akka-2.2.1/lib:akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka:akka-2.2.1/lib/akka/scala-reflect-2.10.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-kernel_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:." [file.scala]
I've continuously added new libraries trying to debug this - I'm pretty sure all I really need to include is akka-remote, but the others shouldn't hurt.
No issues compiling.
I attempt to run like this:
$ scala -classpath "[same as above]" [application]
And I receive a NoSuchMethodException:
java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getDeclaredConstructor(Class.java:2053)
...
Looking into the source code, it appears that Akka 2.2.X's flavor of this constructor takes 4 arguments (the Scheduler is removed). But in Akka < 2.2.X, the constructor takes 5 args.
Thus, I'm thinking my classpath isn't set up quite right. At compile time, Scala must be finding the < 2.2.X flavor. I don't even know where it would be finding it, since I only have Akka 2.2.1 installed.
Any suggestions!? Thanks! (Please don't say to use SBT).
The problem here is that the Scala distribution contains akka-actor 2.1.0 and helpfully puts it on the boot class path for you. We very strongly recommend using a dependency manager like sbt or Maven when building anything that goes beyond the most trivial projects.
As noted in another answer, the problem is that scala puts a different version of Akka on the boot classpath.
To more directly answer your question (as you said you don't want to use sbt): you can execute your program with java instead of scala. You just have to put the appropriate Scala jars into the classpath.
Here is a spark-dev message about the problem. The important part is: "the workaround is to use java to launch the application instead of scala. All you need to do is to include the right Scala jars (scala-library and scala-compiler) in the classpath."
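Concretely, the launch then looks something like this, reusing the "[same as above]" classpath from the question; the scala-compiler.jar path is an illustrative assumption (scala-library.jar is already on that classpath):
$ java -classpath "[same as above]:akka-2.2.1/lib/scala-compiler.jar" [application]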
