I am trying to use a Scala ARIMA object, imported from a package, in my Java program. Compilation succeeds, meaning the ARIMA class is recognized at compile time, but at runtime there is a NoClassDefFoundError for the ARIMA object. The ARIMAModel class imports without any problem, since it is a class.
Is there any way to use the Scala object from my Java program?
Here is the source code for the object in the Scala package.
File: .../com/cloudera/sparkts/models/ARIMA.scala
package com.cloudera.sparkts.models
object ARIMA {
  def autoFit(ts: Vector, maxP: Int = 5, maxD: Int = 2, maxQ: Int = 5): ARIMAModel = {
    ...
  }
}

class ARIMAModel(...) {
  ...
}
Here is my Java code.
File: src/main/java/SingleSeriesARIMA.java
import com.cloudera.sparkts.models.ARIMA;
import com.cloudera.sparkts.models.ARIMAModel;

public class SingleSeriesARIMA {
    public static void main(String[] args) {
        ...
        ARIMAModel arimaModel = ARIMA.autoFit(tsVector, 1, 0, 1);
        ...
    }
}
Here is the error.
Exception in thread "main" java.lang.NoClassDefFoundError: com/cloudera/sparkts/models/ARIMA
at SingleSeriesARIMA.main(SingleSeriesARIMA.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.sparkts.models.ARIMA
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I am using Scala version 2.11.8 and Java 1.8.
You need to supply the dependency containing the ARIMA object to the Spark cluster using the --jars option, as below:
spark-submit --jars <path>/<to>/sparkts-0.4.1.jar --class SingleSeriesARIMA target/simple-project-1.0.jar
This passes the extra dependency, along with the application jar, so that it is available at Spark runtime.
To call the ARIMA object from Java, use:
ARIMA$.MODULE$.autoFit(tsVector, 1, 0, 1);
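For context (my addition, not part of the original answer): a Scala object compiles to a JVM class named ARIMA$ whose singleton instance is stored in a static MODULE$ field, which is what the call above navigates. A minimal sketch of the Java side, assuming spark-ts takes an MLlib Vector (the class name ArimaFromJava is just for illustration):

import com.cloudera.sparkts.models.ARIMA$;
import com.cloudera.sparkts.models.ARIMAModel;
import org.apache.spark.mllib.linalg.Vector;

// Calling a Scala object from Java: the singleton lives in the static
// MODULE$ field of the compiler-generated ARIMA$ class. Scala default
// arguments cannot be omitted from Java, so all four are passed explicitly.
public class ArimaFromJava {
    static ARIMAModel fit(Vector tsVector) {
        return ARIMA$.MODULE$.autoFit(tsVector, 1, 0, 1);
    }
}

The direct call ARIMA.autoFit(...) in the question also compiles, because Scala additionally emits static forwarder methods on the ARIMA class; either spelling still requires the sparkts jar on the runtime classpath.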
Related
I am using JanusGraph with HBase as the storage backend. Currently I am trying to add vertices to the database. The relevant part of the code is:
public class Graph {
    private static JanusGraph graph = JanusGraphFactory.open("conf/jg.properties");

    public static JanusGraph getGraph() {
        return graph;
    }

    public static void addVertex() {
        for (int i = 0; i < 5; i++) {
            graph.addVertex("test", i);
        }
        graph.tx().commit();
    }
}
and the main function calls:
Graph.addVertex();
The error is
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MasterNotRunningException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:56)
at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:477)
at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:409)
at org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1376)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:164)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:133)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:80)
... 5 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.MasterNotRunningException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more
I am using JanusGraph 0.2.0 (via Maven), HBase 1.2.0, and Java 1.8.
I set storage.hostname=127.0.0.1 in jg.properties, so is it a dependency error? Where exactly is MasterNotRunningException?
I have found the solution, but I am still curious.
I searched for the class MasterNotRunningException, found that it lives in the package org.apache.hadoop.hbase, and guessed that adding org.apache.hbase:hbase-client to my dependencies might help. The error disappeared.
However, I still cannot find the reason for this error. If some code in a dependency jar uses the class MasterNotRunningException, then that dependency should itself pull in the hbase-client jar, which should therefore already be on my classpath. How can this code pass compilation and then throw an exception at runtime?
Just to add: I use Eclipse's Export -> Runnable JAR file to build my jar, so all the dependencies should be included in it.
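A plausible explanation for the compile-time/runtime gap (my reading of the stack trace, not something stated in the question): JanusGraph instantiates its storage backend reflectively (note ConfigurationUtil.instantiate in the trace), so neither your code nor the compiler ever references the HBase classes directly; they are resolved only when the backend class is loaded. A minimal Java sketch of that pattern, with a made-up class name standing in for the real backend:

// Illustrative only: reflective instantiation defers class resolution to
// runtime, so missing jars surface as runtime errors, never compile errors.
public class ReflectiveLoadingDemo {
    public static void main(String[] args) {
        try {
            // Compiles without any compile-time dependency on the target class.
            Class<?> backend = Class.forName("com.example.HBaseBackend");
            Object instance = backend.getDeclaredConstructor().newInstance();
            System.out.println("Loaded " + instance);
        } catch (ClassNotFoundException e) {
            // The named class itself is missing from the classpath.
            e.printStackTrace();
        } catch (NoClassDefFoundError e) {
            // The class exists but references another missing class,
            // e.g. org.apache.hadoop.hbase.MasterNotRunningException.
            e.printStackTrace();
        } catch (ReflectiveOperationException e) {
            e.printStackTrace();
        }
    }
}

If that is what happens here, the observed fix makes sense: the JanusGraph HBase adapter likely does not ship the HBase artifacts at compile scope, so adding hbase-client explicitly puts them back on the runtime classpath.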
I am creating an Android app in Kotlin, and there is something strange with lambdas. I pass mapOf(1 to {...}, 2 to {...}) and get a NoClassDefFoundError or ClassNotFoundException.
I tried rewriting it as a desktop program and got the same error, but with a different stack trace.
fun main(args: Array<String>) {
    call(mapOf(
        1 to { "asd" },
        2 to { 999 }
    ))
}

fun call(x: Map<Int, () -> Any>) {
}
This is the stack trace:
Exception in thread "main" java.lang.NoClassDefFoundError: kotlin/jvm/internal/Intrinsics
at TestKt.main(test.kt)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jetbrains.kotlin.runner.AbstractRunner.run(runners.kt:61)
at org.jetbrains.kotlin.runner.Main.run(Main.kt:109)
at org.jetbrains.kotlin.runner.Main.main(Main.kt:119)
Caused by: java.lang.ClassNotFoundException: kotlin.jvm.internal.Intrinsics
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 8 more
This is not a problem in your code; it is a problem with how you run your application.
Kotlin comes with its own standard class library, kotlin-runtime.jar. You need to have this library on the classpath:
java -cp $KOTLIN_HOME/lib/kotlin-runtime.jar:MyApp.jar com.my.AppKt
or you should compile the application with -include-runtime:
kotlinc app.kt -include-runtime -d MyApp.jar
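Why even this tiny program needs the runtime (my explanation, added to the original answer): kotlinc compiles non-null parameters into checks against kotlin.jvm.internal.Intrinsics, so the generated TestKt class touches the Kotlin runtime before any user code runs. Roughly, the compiled output behaves like this Java sketch:

import kotlin.jvm.internal.Intrinsics;

// Approximate Java rendering of what kotlinc generates for main();
// the Intrinsics call is why the Kotlin runtime must be on the classpath.
public final class TestKt {
    public static void main(String[] args) {
        // Compiler-generated null check for the non-null parameter `args`:
        Intrinsics.checkParameterIsNotNull(args, "args");
        // ... compiled body of main ...
    }
}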
I have a Java class myObj with a static block as shown:
static {
    Class<myObj> klass = myObj.class;
    log.info("\nClientAPIVersion : " + klass.getPackage().getImplementationVersion());
}
I'm creating an instance of this class in Jython 2.2.
When I run my Python script using:
java -Dlog4j.configuration=file://${CLASSPATH}/log4j.xml -jar ~/jython_2.2/jython.jar test.py
I get an exception as shown:
File "test.py", line 42, in ?
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
java.lang.ExceptionInInitializerError: java.lang.ExceptionInInitializerError
I found online that java.lang.ExceptionInInitializerError occurs when the static block crashes.
When I remove the log.info call, the Python code executes correctly.
I suspect that klass.getPackage() might return null, but if so, the string concatenation should simply print 'null' in the log. Why the exception?
What is the source of the problem, and how can I fix it? Thanks.
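One way to narrow this down (a diagnostic sketch of my own, not from the question): ExceptionInInitializerError wraps whatever the static initializer actually threw, so forcing initialization and printing getCause() usually reveals the real culprit, for example a logging framework that failed to configure itself:

// Hypothetical diagnostic harness: trigger class initialization and
// unwrap the root cause hidden inside ExceptionInInitializerError.
public class InitDiagnostics {
    public static void main(String[] args) {
        try {
            // Replace "myObj" with the fully qualified name of the failing class.
            Class.forName("myObj");
        } catch (ExceptionInInitializerError e) {
            System.err.println("Static initializer failed; root cause:");
            e.getCause().printStackTrace();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
    }
}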
I'm trying to create a directory in a Jimfs file system using Ceylon's file module, but the Jimfs provider is not found when the file system is accessed from Ceylon.
This is my test program:
// File: test.se.gustavkarlsson.autogit.file.watcher.run
import ceylon.file {
    Nil,
    parseURI
}
import com.google.common.jimfs {
    Jimfs {
        jimFs=newFileSystem
    }
}

shared void run() {
    value fs = jimFs();
    value jPath = fs.getPath("directory");
    value uri = jPath.toUri().string;
    value path = parseURI(uri);
    value resource = path.resource;
    assert (is Nil resource);
    resource.createDirectory();
}
When run, it prints the following stack trace:
ceylon run: Provider "jimfs" not found
java.nio.file.ProviderNotFoundException: Provider "jimfs" not found
at java.nio.file.FileSystems.newFileSystem(FileSystems.java:341)
at java.nio.file.FileSystems.newFileSystem(FileSystems.java:276)
at ceylon.file.internal.createSystem_.createSystem(ConcreteSystem.ceylon:64)
at ceylon.file.createSystem_.createSystem(System.ceylon:43)
at test.se.gustavkarlsson.autogit.file.watcher.run_.run(run.ceylon:17)
at test.se.gustavkarlsson.autogit.file.watcher.run_.main(run.ceylon)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at ceylon.modules.api.runtime.SecurityActions.invokeRunInternal(SecurityActions.java:57)
at ceylon.modules.api.runtime.SecurityActions.invokeRun(SecurityActions.java:48)
at ceylon.modules.api.runtime.AbstractRuntime.invokeRun(AbstractRuntime.java:75)
at ceylon.modules.api.runtime.AbstractRuntime.execute(AbstractRuntime.java:122)
at ceylon.modules.api.runtime.AbstractRuntime.execute(AbstractRuntime.java:106)
at ceylon.modules.Main.execute(Main.java:69)
at ceylon.modules.Main.main(Main.java:42)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jboss.modules.Module.run(Module.java:312)
at org.jboss.modules.Main.main(Main.java:460)
at ceylon.modules.bootstrap.CeylonRunTool.run(CeylonRunTool.java:244)
at com.redhat.ceylon.common.tools.CeylonTool.run(CeylonTool.java:491)
at com.redhat.ceylon.common.tools.CeylonTool.execute(CeylonTool.java:380)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.redhat.ceylon.launcher.Launcher.runInJava7Checked(Launcher.java:114)
at com.redhat.ceylon.launcher.Launcher.run(Launcher.java:41)
at com.redhat.ceylon.launcher.Launcher.run(Launcher.java:34)
at com.redhat.ceylon.launcher.Launcher.main(Launcher.java:27)
Any ideas on how to install that provider?
I'm running Ceylon 1.2.0 on Linux with Jimfs 1.0 (I also tested 1.1-rc1), and working with Jimfs the "intended" way (pure Java NIO) works fine.
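For comparison, this is roughly what the working pure-NIO usage looks like (a Java sketch based on the public Jimfs API, not code from the question); here the file system comes straight from the Jimfs factory, so no provider lookup by URI is involved:

import com.google.common.jimfs.Configuration;
import com.google.common.jimfs.Jimfs;
import java.io.IOException;
import java.nio.file.FileSystem;
import java.nio.file.Files;
import java.nio.file.Path;

// Minimal Jimfs usage through java.nio directly.
public class JimfsDemo {
    public static void main(String[] args) throws IOException {
        FileSystem fs = Jimfs.newFileSystem(Configuration.unix());
        Path dir = fs.getPath("directory");
        Files.createDirectory(dir);
        System.out.println(Files.isDirectory(dir)); // prints true
    }
}

The Ceylon path instead goes through parseURI, which makes java.nio look the provider up by URI scheme via ServiceLoader, and that lookup is what fails under Ceylon's module system.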
This is related to module visibility: we need to add a "read" (in Jigsaw terminology) from the JDK to the Jimfs module.
I've opened https://github.com/ceylon/ceylon/issues/5995 to investigate.
So, I have the following use case.
I'm simplifying the usage of Spark DataFrames for a particular domain by providing a DSL-like interface.
All this code goes into a fat jar created by the Maven Shade plugin (fat jar here meaning one without the Spark and Hadoop dependencies).
The fat jar has a main class, let's call it JavaMain.
Inside JavaMain, I make a REST call to get a string whose contents are valid DSL.
I instantiate an IMain object with an initial Settings object and bind a few variables using the imain.bind method.
However, this bind fails with the following error:
Set failed in bind(results, com.dhruv.dsl.DslDataFrame.DSLResults, com.dhruv.dsl.DslDataFrame$DSLResults#7650a5f3)
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.callEither(IMain.scala:738)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:625)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:661)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:662)
at com.thoughtworks.dsl.DSL.run(DSL.scala:44)
at com.thoughtworks.dsl.JavaMain.run(JavaMain.java:30)
at com.thoughtworks.dsl.JavaMain.main(JavaMain.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassCastException: com.thoughtworks.dsl.DslDataFrame$DSLResults cannot be cast to com.thoughtworks.dsl.DslDataFrame$DSLResults
at $line3.$eval$.set(<console>:6)
at $line3.$eval.set(<console>)
... 21 more
More context:
I had classpath issues when trying this out, and it seems I haven't been able to resolve them all.
Earlier, when creating the Settings object, I was doing something like this:
val settings = {
  val x = new Settings()
  x.classpath.value += File.pathSeparator + System.getProperty("java.class.path")
  x.usejavacp.value = true
  x.verbose.value = true
  x
}
However, this didn't seem to work: when doing a spark-submit, only the Spark- and Hadoop-related jars were on the classpath.
I then added the following to the classpath:
val urLs: Array[URL] = Thread.currentThread.getContextClassLoader.asInstanceOf[URLClassLoader].getURLs
and did the following:
val settings = {
  val x = new Settings()
  x.classpath.value += File.pathSeparator + urLs(0)
  x.usejavacp.value = true
  x.verbose.value = true
  x
}
This is the code I'm using to bind the objects:
interpreter.bind("notagin", new SomeDummyObject)
This throws the exception I attached earlier.
Interestingly, the following code works (i.e. importing and instantiating the same object inside the interpreter doesn't cause a problem):
interpreter.interpret(
  """
    import com.dhruv.dsl.operations._
    import com.dhruv.dsl.implicits._
    import com.dhruv.dsl.DslDataFrame._
    import org.apache.spark.sql.Column
    import com.dhruv.dsl._
    implicit def RichColumn(column: Column): RichColumn = new RichColumn(column)
    val justdont = new SomeDummyObject()
    justdont.justdontcallme(thatJson)
  """
)
Another detail I'm aware of, and which bothers me, is that IMain internally changes the classloader. I'm not sure whether that is causing the issue.
Any help is more than appreciated.
Okay, so we figured out how to solve the problem for our case.
I think IMain uses a different classloader to load classes than the one they are supposed to be loaded with. Anyway, the following solves the problem; I'm leaving it here for others to have a look at.
val interpreter = new IMain(settings) {
  // Use the classloader that loaded this class as the interpreter's parent,
  // so that bound objects and interpreted code agree on class identity.
  override protected def parentClassLoader: ClassLoader = this.getClass.getClassLoader
}
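The "X cannot be cast to X" message makes sense once you recall that JVM class identity is the pair (classloader, class name), not the name alone. A standalone Java sketch (my own illustration, unrelated to the DSL code; the /tmp/classes path and the Dummy class are made up): the same class file loaded by two unrelated classloaders yields two distinct Class objects, so a cast between them fails.

import java.net.URL;
import java.net.URLClassLoader;

// Demonstrates that the "same" class loaded by two unrelated classloaders
// is two different classes as far as the JVM is concerned.
public class ClassIdentityDemo {
    public static void main(String[] args) throws Exception {
        URL[] path = { new URL("file:///tmp/classes/") };
        // parent = null, so neither loader delegates to the app classloader.
        ClassLoader a = new URLClassLoader(path, null);
        ClassLoader b = new URLClassLoader(path, null);

        Class<?> fromA = Class.forName("Dummy", true, a);
        Class<?> fromB = Class.forName("Dummy", true, b);

        System.out.println(fromA == fromB);             // false
        Object instance = fromA.getDeclaredConstructor().newInstance();
        System.out.println(fromB.isInstance(instance)); // false: a cast would fail
    }
}

Overriding parentClassLoader as above avoids exactly this split: the interpreter resolves DSLResults through the same loader the host application used.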