How to build a .jar out of Scala in Eclipse?

I am trying to run Scala code using spark-submit. I have to package the Scala code into a .jar and then execute it. I tried to convert the .scala file to a .jar directly, but I got a "NoClassDefFoundError".
Then I tried to call the Scala class from a Java file, convert the Java file to a .jar, and execute it using spark-submit. This time the Java class is found, but the Scala class is not. When I run this in Eclipse it gives the expected output.
Java class:

package com.connect.cassandra;

public class MainClass {
    public static void main(String[] args) {
        MainScalaClass app = new MainScalaClass();
        app.main(args);
    }
}
Scala class:

package com.connect.cassandra

import com.datastax.driver.core.Cluster

class MainScalaClass {
  def main(args: Array[String]): Unit = {
    simpleCassConnector.foo()
  }
}

object simpleCassConnector {
  def foo() {
    println("inside object")
    val cluster = Cluster.builder()
      .addContactPoint("localhost")
      .withPort(9042)
      .build()
    val session = cluster.connect()
    val p = session.execute("Select * from persons.user limit 2")
    println(p.all())
  }
}
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/connect/cassandra/MainScalaClass
Where am I going wrong, and is there a way to convert the Scala file directly to a .jar and execute it without the "NoClassDefFoundError"?

Related

How to invoke a method of a groovy class from Java - Method name and parameter in string format

I have a Java program which accepts string input in the following format:

setData("hello")

I also have a Groovy script, say "sample.groovy", in the following format:
class sample {
    def doOperation() {
        println("Inside doOperation()")
    }

    def setData(String str) {
        println("Incoming data : " + str)
    }
}
From the Java class, I create an object of the above Groovy class, named sampleObj.
I have to invoke sampleObj.setData("hello") from my Java application, using the input string "setData("hello")".
How can I invoke this method?
This is exactly the kind of problem that GroovyShell solves.
Here's an example:
import groovy.transform.Canonical
import org.codehaus.groovy.control.CompilerConfiguration

@Canonical
class ScriptState {
    String data
}

abstract class MyScript extends Script {
    void setData(String data) {
        binding.state.data = data
    }
}

def state = new ScriptState()
def cc = new CompilerConfiguration(scriptBaseClass: MyScript.class.name)
def shell = new GroovyShell(MyScript.classLoader, new Binding(state: state), cc)

shell.evaluate('println "Running script"; setData "The Data"')

assert state.data == 'The Data'
println state
Running this will print:
Running script
ScriptState(The Data)
I based this example on the Groovy Goodness example.
Normally, you don't need to set the classloader as I did with MyScript.classLoader. I only needed to do that because I ran this as a script, so the script class would not be visible to the GroovyShell's script classloader otherwise.
EDIT
After the question was heavily edited, it seems the problem is that you don't know which class the Java object called from the script will have.
In that case, just change the MyScript class to do something like this:
abstract class MyScript extends Script {
    def methodMissing(String name, args) {
        // this will call any method called inside the script
        // on the sample object
        binding.sampleObject."$name"(*args)
    }
}
Now, when creating the GroovyShell:

def shell = new GroovyShell(
        MyScript.classLoader,
        new Binding(sampleObject: new Sample()),
        cc)
Running this code:
shell.evaluate('doOperation(); setData "The Data"')
will print the expected:
Inside doOperation()
Incoming data : The Data
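If pulling in Groovy is not an option, the same string-to-method dispatch can be sketched in plain Java reflection. This is only a sketch, not the answer's approach: the Dispatcher class and its regex are illustrative, and the parser assumes the input always has the exact shape name("arg") with a single string argument.

```java
import java.lang.reflect.Method;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class Dispatcher {
    public void setData(String str) {
        System.out.println("Incoming data : " + str);
    }

    // Parses input of the form methodName("arg") and invokes the
    // matching one-String-argument method on the target reflectively.
    static void invoke(Object target, String input) throws Exception {
        Matcher m = Pattern.compile("(\\w+)\\(\"(.*)\"\\)").matcher(input);
        if (!m.matches()) {
            throw new IllegalArgumentException("bad input: " + input);
        }
        Method method = target.getClass().getMethod(m.group(1), String.class);
        method.invoke(target, m.group(2));
    }

    public static void main(String[] args) throws Exception {
        invoke(new Dispatcher(), "setData(\"hello\")");
    }
}
```

Unlike the GroovyShell version, this gives you no scripting features at all, just name-based dispatch, which may be enough if the input grammar really is that one pattern.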

Running groovy script in Java code

Hello, I am looking to run a Groovy script inside Java code, but I haven't found many tutorials about that.
I have a String that contains a Groovy script:

private String processingCode = "def hello_world() { println \"Hello, world!\" }";

I have also downloaded the Groovy SDK.
Which Groovy jar should I include in the Java project, and how do I execute the script from Java?
What you need is a groovy-all dependency and GroovyShell.
Main class will be:
package lol;

import groovy.lang.GroovyShell;

public class Lol {
    public static void main(String[] args) {
        String processingCode = "def hello_world() { println 'Hello, world!' }; hello_world();";
        GroovyShell shell = new GroovyShell();
        shell.evaluate(processingCode);
    }
}
Here is a demo.
Use gradle run to run it.

ResourceManager class mismatch when using Velocity as OSGi bundle

I am using Apache Velocity in my Eclipse plugin. The corresponding entry has been added to MANIFEST.MF:
Require-Bundle: org.apache.velocity;bundle-version="1.5.0"
A Velocity instance is initialized as follows:
VelocityEngine ve = new VelocityEngine();
ve.setProperty(Velocity.RUNTIME_LOG_LOGSYSTEM_CLASS, NullLogChute.class.getName());
ve.init();
I build the JAR with my plugin and test it on several machines. On two PCs this works fine, but on the third one I get an exception:
java.lang.Exception: The specified class for ResourceManager (org.apache.velocity.runtime.resource.ResourceManagerImpl) does not implement org.apache.velocity.runtime.resource.ResourceManager; Velocity is not initialized correctly.
    at org.apache.velocity.runtime.RuntimeInstance.initializeResourceManager(RuntimeInstance.java:589)
    at org.apache.velocity.runtime.RuntimeInstance.init(RuntimeInstance.java:241)
    at org.apache.velocity.runtime.RuntimeSingleton.init(RuntimeSingleton.java:113)
    at org.apache.velocity.app.Velocity.init(Velocity.java:83)
It seems I get this exception because Velocity isn't OSGi-friendly. Can anybody give me a workaround?
I was able to solve a similar issue using this: https://wiki.eclipse.org/FAQ_How_do_I_use_the_context_class_loader_in_Eclipse%3F
Basically, you set the current thread's context class loader to the one of your Eclipse/OSGi plug-in (or of a class in your plug-in).
The resulting sample code below works fine both with plain Java execution (Run As > Java Application) and when called through the plug-in API after exporting the Eclipse product.
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.io.Writer;

import org.apache.velocity.VelocityContext;
import org.apache.velocity.app.Velocity;

public class Printer {

    @Override
    public void myPluginApiOverride() {
        Printer.main(new String[0]);
    }

    public static void main(final String[] args) {
        // Without the following class loader initialization, I get the
        // following exception when running as an Eclipse plugin:
        // org.apache.velocity.exception.VelocityException: The specified
        // class for ResourceManager
        // (org.apache.velocity.runtime.resource.ResourceManagerImpl) does not
        // implement org.apache.velocity.runtime.resource.ResourceManager;
        // Velocity is not initialized correctly.
        final ClassLoader oldContextClassLoader = Thread.currentThread().getContextClassLoader();
        Thread.currentThread().setContextClassLoader(Printer.class.getClassLoader());

        // sample Velocity call
        final Writer writer = new PrintWriter(new OutputStreamWriter(System.out));
        final VelocityContext context = new VelocityContext();
        context.put("name", "World");
        try {
            Velocity.evaluate(context, writer, "org.apache.velocity", "Hello $name !");
            writer.close();
        } catch (final IOException e) {
            e.printStackTrace();
        }

        // set back the default class loader
        Thread.currentThread().setContextClassLoader(oldContextClassLoader);
    }
}
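The save/set/restore pattern can also be isolated as a plain-Java sketch (the class name here is illustrative). Note the try/finally, which guarantees the original loader is restored even when the framework call throws; the sample above restores it only on the success path.

```java
public class ContextClassLoaderSwap {
    public static void main(String[] args) {
        // save the current context class loader
        ClassLoader old = Thread.currentThread().getContextClassLoader();
        try {
            // point the context class loader at this class's own loader,
            // which is what libraries like Velocity will use for lookups
            Thread.currentThread().setContextClassLoader(
                    ContextClassLoaderSwap.class.getClassLoader());

            // ... library calls that resolve classes via the context
            // class loader would go here ...
            System.out.println(Thread.currentThread().getContextClassLoader()
                    == ContextClassLoaderSwap.class.getClassLoader());
        } finally {
            // always restore the previous loader, even on exceptions
            Thread.currentThread().setContextClassLoader(old);
        }
    }
}
```

Restoring the loader in finally matters in an OSGi host, because the thread is shared and a leaked context class loader can break the next bundle that runs on it.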

Loading java library functions to Luaj

I am stuck with loading Java functions so that they can be called from a Lua file using LuaJ.
What I currently do is create something like this:
In some_package/aif.java:
package some_package;

public class aif extends TwoArgFunction {

    public aif() {
    }

    @Override
    public LuaValue call(LuaValue modname, LuaValue env) {
        LuaValue library = tableOf();
        library.set("foo", new foo());
        env.set("aif", library);
        return library;
    }

    // the rest contains the implementations of the Java functions
}
and then in the Lua file:

require "some_package/aif"
-- etc ...

and then in Main.java:
public static void main(String[] args) {
    String script = "lib/some_lua_file.lua";
    Globals globals = JsePlatform.standardGlobals();
    LuaValue chunk = globals.loadFile(script);
    chunk.call(LuaValue.valueOf(script));
}
This code works, but what I want is to not have to use "require" in the Lua file. I have achieved this similarly in C++ using this line:
luaL_requiref(L, "aif", luaopen_aiflib, 1);
Can we do something like that in LuaJ? I tried:
globals.load(new aif());
but I get Exception in thread "main" org.luaj.vm2.LuaError: index expected, got nil (the variable env in the call function of the aif class is nil).
Does anybody know how to set up aif as a Lua library to use with LuaJ?
You can write your MyXArgImpl like the following:
package mypackage;

import org.luaj.vm2.LuaValue;
import org.luaj.vm2.lib.ZeroArgFunction;

public class MyZeroArgImpl extends ZeroArgFunction {
    public LuaValue call() {
        return valueOf("My Zero Arg Implementation");
    }
}
and then register it in your Lua globals as follows:

Globals globals = JsePlatform.standardGlobals();
globals.set("callMyFunction", new MyZeroArgImpl());
globals.get("dofile").call(LuaValue.valueOf(yourScriptfile));

(The global has to be set before the script runs, i.e. before the dofile call, or a top-level call in the script will not find it.) Now you can call your function inside your Lua script even without require('...'):
print(callMyFunction())
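The idea here, registering a function object under a global name instead of loading it through require, can be sketched without LuaJ at all, as a plain Java map of named functions. The names below are illustrative, not LuaJ API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public class GlobalsSketch {
    public static void main(String[] args) {
        // a stand-in for the Lua globals table
        Map<String, Supplier<String>> globals = new HashMap<>();

        // analogous to globals.set("callMyFunction", new MyZeroArgImpl()):
        // the function is reachable by name with no module loading step
        globals.put("callMyFunction", () -> "My Zero Arg Implementation");

        // analogous to the script calling print(callMyFunction())
        System.out.println(globals.get("callMyFunction").get());
    }
}
```

The registration must happen before any lookup, which is exactly why, in the LuaJ version, the global should be set before the script file is executed.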
I found the answer after looking at the LuaJ implementation of the standard Lua libraries.
I modified my code:
package some_package;

public class aif extends OneArgFunction {

    public aif() {
    }

    @Override
    public LuaValue call(LuaValue env) {
        Globals globals = env.checkglobals();
        LuaTable aif = new LuaTable();
        aif.set("foo", new foo());
        env.set("aif", aif);
        globals.package_.loaded.set("aif", aif);
        return aif;
    }

    // the rest contains the implementations of the Java functions
}
I had made the aif class extend TwoArgFunction because the tutorial said to do so. With the code above, there is no need to require the class in the Lua file.
Let's say the script you are loading has a function "receive_aif":
function receive_aif( aifObj )
    -- This is how you can invoke a public function associated with aifObj
    aifObj:someAifFunction()
end
From Java, you can pass an aif instance as follows (this should work with any Java object):

aif aifObj = new aif();
LuaValue receive_aif_handle = globals.get("receive_aif");
LuaValue retvals = receive_aif_handle.call(CoerceJavaToLua.coerce(aifObj));
I am using similar constructs in my application with the "3.0 alpha-2" release.

My Scala methods do not return values to Java code

This is a Scala module:
package xpf

import java.io.File
import org.jdom.Element
import org.jdom.input.SAXBuilder

object xmlpf {
  def load_file(filename: String): Element = {
    val builder = new SAXBuilder
    builder.build(new File(filename)).getRootElement
  }
}
And here is the Java code calling the Scala method above:
package textxpf;

import org.jdom.Element;

public class Main {
    public static void main(String[] args) {
        Element root = xpf.xmlpf.load_file("/home/capkidd/proj/XmlPathFinder/Staff.xml");
        System.out.println(root.getName());
    }
}
Running the Java main procedure I see:
run:
Exception in thread "main" java.lang.NullPointerException
at textxpf.Main.main(Main.java:8)
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)
Exploring the problem I found that I cannot return any value of any type from any Scala method to the Java code that called it.
I use NetBeans 6.9.1 with Scala 2.8.1 plugin.
scala-library.jar and jdom.jar are properly plugged to the project.
What am I doing wrong? Does anybody have any idea?
Try this and then debug accordingly:
package xpf

import java.io.File
import org.jdom.Element
import org.jdom.input.SAXBuilder

object xmlpf {
  def load_file(filename: String): Element = {
    val builder = new SAXBuilder
    val re = builder.build(new File(filename)).getRootElement
    if (re == null) throw new NullPointerException("the root element is null!")
    re
  }
}
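The defensive null check above can also be written with java.util.Objects.requireNonNull, which throws a NullPointerException carrying your message. A minimal self-contained sketch, with a hypothetical loadRoot standing in for the SAXBuilder call:

```java
import java.util.Objects;

public class NullCheckDemo {
    // Hypothetical stand-in for load_file: returns null for "bad" input,
    // the way getRootElement might if parsing went wrong.
    static String loadRoot(String name) {
        String root = name.isEmpty() ? null : name.toUpperCase();
        // fails fast with a descriptive message instead of a bare NPE later
        return Objects.requireNonNull(root, "the root element is null!");
    }

    public static void main(String[] args) {
        System.out.println(loadRoot("staff"));
        try {
            loadRoot("");
        } catch (NullPointerException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Failing fast like this moves the NullPointerException to the place where the null is produced, which is far easier to diagnose than the caller's line number in the original stack trace.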
I tried a similar program with no problems:
// ms/MyObject.scala
package ms

object myObject {
  def foo(s: String) = s
}

// mj/MyObject2.java
package mj;

public class MyObject2 {
    public static void main(String[] args) {
        System.out.println(ms.myObject.foo("hello"));
    }
}
I compiled both files, then ran "scala -cp . mj.MyObject2". It works fine with Scala 2.8.1.final. Does this example work in your setup?
So I wonder if it's some sort of environment issue, such as picking up a stale build of the Scala class. Have you tried a clean build from scratch? Is your runtime classpath correct?
