NoClassDefFoundError for MasterNotRunningException when using the JanusGraph HBase backend - java

I am using JanusGraph with HBase as the storage backend. Currently I am trying to add vertices to the database. The relevant part of the code is:
public class Graph {
    private static JanusGraph graph = JanusGraphFactory.open("conf/jg.properties");

    public static JanusGraph getGraph() {
        return graph;
    }

    public static void addVertex() {
        for (int i = 0; i < 5; i++) {
            graph.addVertex("test", i);
        }
        graph.tx().commit();
    }
}
with the main method calling:
Graph.addVertex();
The error is:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MasterNotRunningException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:56)
at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:477)
at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:409)
at org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1376)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:164)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:133)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:80)
... 5 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.MasterNotRunningException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more
I am using JanusGraph 0.2.0 (the 0.2.0 Maven artifacts), HBase 1.2.0, and Java 1.8.
I set storage.hostname=127.0.0.1 in jg.properties. So is this a dependency error? Where exactly is MasterNotRunningException defined?

I have found the solution, but I am still curious.
I searched for the class MasterNotRunningException, found that it lives in the package org.apache.hadoop.hbase, and finally guessed that adding org.apache.hbase:hbase-client to the dependencies might help. The error disappeared.
However, I still cannot find the reason for this error. If some code in a dependency jar uses MasterNotRunningException, it has to import it from the hbase-client jar, so that jar should already be in the dependency tree. How can that code pass compilation and then throw an exception at run time?
Just to add: I use Eclipse's export -> runnable JAR file to build my jar, so all the dependencies should be packaged into it.
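The stack trace itself explains how the code compiled without hbase-client: JanusGraph instantiates its storage backend reflectively (note the Class.forName call inside ConfigurationUtil.instantiate in the trace), so nothing references the HBase classes at compile time; the missing class only surfaces when the backend class is linked at run time. For reference, this is roughly the dependency entry I added (the version is my assumption, chosen to match my HBase 1.2.0 install):

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.2.0</version>
</dependency>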

Related

JackCess "NoClassDefFoundError: java/sql/Blob" exception

The following simple Java program:
import com.healthmarketscience.jackcess.util.OleBlob;

public class Test {
    public static void main(String[] args) throws Exception {
        byte[] data = new byte[100];
        OleBlob oleBlob = OleBlob.Builder.fromInternalData(data);
    }
}
gives me this exception:
Exception in thread "main" java.lang.NoClassDefFoundError: java/sql/Blob
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1010)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1088)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:182)
at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:814)
at java.base/jdk.internal.loader.BuiltinClassLoader.findClassInModuleOrNull(BuiltinClassLoader.java:735)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:660)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:634)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:182)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:519)
at com.healthmarketscience.jackcess@4.0.1/com.healthmarketscience.jackcess.util.OleBlob$Builder.fromInternalData(OleBlob.java:423)
at tabellenFahrplan/test.Test.main(Test.java:12)
Caused by: java.lang.ClassNotFoundException: java.sql.Blob
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:636)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:182)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:519)
... 12 more
The Jackcess documentation says that such an exception indicates missing dependencies. But what dependency could refer to java/sql/Blob, which is part of Java already?
Besides, through Maven I already have these dependencies:
jackcess-4.0.1.jar
commons-lang3-3.10.jar
commons-logging-1.2.jar
I am running Jackcess 4.0.1 on OpenJDK 16.0.2.
The problem was that I was missing entries in the module-info.java file. For my initial question, the required entry is
requires java.sql;
Working further with Jackcess, I also found that in order to access some tables in my MS Access database (but not some other tables) I additionally need
requires java.scripting;
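Putting it together, a minimal sketch of the module-info.java (the module names are taken from the stack trace above: tabellenFahrplan is the application module, and com.healthmarketscience.jackcess is the automatic module derived from jackcess-4.0.1.jar):

module tabellenFahrplan {
    requires com.healthmarketscience.jackcess;
    requires java.sql;        // pulls java.sql.Blob into the module graph
    requires java.scripting;  // needed for some tables, as noted above
}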

How do I import singleton object from Scala package in Java?

I am trying to use the ARIMA object (Scala), which is imported from a package, in my Java program. Although compilation succeeds, meaning that the ARIMA class is recognized at compile time, there is a NoClassDefFoundError for the ARIMA object at runtime. The ARIMAModel class imports without any problem, since it is a class.
Is there any way to use the Scala object from my Java program?
Here is the source code for the object in the Scala package.
File: .../com/cloudera/sparkts/models/ARIMA.scala
package com.cloudera.sparkts.models

object ARIMA {
  def autoFit(ts: Vector, maxP: Int = 5, maxD: Int = 2, maxQ: Int = 5): ARIMAModel = {
    ...
  }
}

class ARIMAModel(...) {
  ...
}
Here is my Java code.
File: src/main/java/SingleSeriesARIMA.java
import com.cloudera.sparkts.models.ARIMA;
import com.cloudera.sparkts.models.ARIMAModel;

public class SingleSeriesARIMA {
    public static void main(String[] args) {
        ...
        ARIMAModel arimaModel = ARIMA.autoFit(tsVector, 1, 0, 1);
        ...
    }
}
Here is the error.
Exception in thread "main" java.lang.NoClassDefFoundError: com/cloudera/sparkts/models/ARIMA
at SingleSeriesARIMA.main(SingleSeriesARIMA.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.sparkts.models.ARIMA
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I am using Scala version 2.11.8 and Java 1.8
You need to supply the dependency that contains the ARIMA object to the Spark cluster using the --jars option, as below:
spark-submit --jars <path>/<to>/sparkts-0.4.1.jar --class SingleSeriesARIMA target/simple-project-1.0.jar
This passes the extra dependency along with the application jar, so that it is available at Spark runtime.
To call the ARIMA object from Java, use:
ARIMA$.MODULE$.autoFit(tsVector, 1, 0, 1);
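For background, this is roughly what the Scala compiler emits for object ARIMA: a class ARIMA$ holding the singleton, plus a class ARIMA with static forwarder methods (which is why the direct call compiled against the jar in the first place). A simplified sketch, with the method body elided:

// Simplified sketch of the class scalac generates for `object ARIMA`:
public final class ARIMA$ {
    public static final ARIMA$ MODULE$ = new ARIMA$();  // the singleton instance

    private ARIMA$() {}

    public ARIMAModel autoFit(Vector ts, int maxP, int maxD, int maxQ) {
        throw new UnsupportedOperationException("body elided in this sketch");
    }
}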

What do we need the ignite.sh script for?

I'm new to Ignite and am trying to run a simple multi-node computation example. I wrote the following application:
try (Ignite ignite = Ignition.start("example-cache.xml")) {
    IgniteCompute asyncCompute = ignite.compute().withAsync();
    for (int i = 0; i < 100; i++) {
        int[] a = new int[1];
        a[0] = i;
        asyncCompute.call(() -> {
            out.println(a[0]);
            return a[0];
        });
    }
}
First I started 3 server nodes with the ignite.sh script supplied with the binary Ignite distribution, each in a different JVM instance.
Then I built this application and ran it with the same Spring XML config as the 3 server nodes before.
But I got the following exception:
class org.apache.ignite.IgniteCheckedException: com.test.App
at org.apache.ignite.internal.util.IgniteUtils.unmarshal(IgniteUtils.java:9826)
at org.apache.ignite.internal.processors.job.GridJobWorker.initialize(GridJobWorker.java:432)
at org.apache.ignite.internal.processors.job.GridJobProcessor.processJobExecuteRequest(GridJobProcessor.java:1108)
at org.apache.ignite.internal.processors.job.GridJobProcessor$JobExecutionListener.onMessage(GridJobProcessor.java:1894)
at org.apache.ignite.internal.managers.communication.GridIoManager.invokeListener(GridIoManager.java:1222)
at org.apache.ignite.internal.managers.communication.GridIoManager.processRegularMessage0(GridIoManager.java:850)
at org.apache.ignite.internal.managers.communication.GridIoManager.access$2100(GridIoManager.java:108)
at org.apache.ignite.internal.managers.communication.GridIoManager$7.run(GridIoManager.java:790)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: class org.apache.ignite.binary.BinaryInvalidTypeException: com.test.App
at org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:692)
at org.apache.ignite.internal.binary.BinaryUtils.doReadClass(BinaryUtils.java:1486)
at org.apache.ignite.internal.binary.BinaryUtils.doReadClass(BinaryUtils.java:1424)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.readClass(BinaryReaderExImpl.java:370)
at org.apache.ignite.internal.binary.BinaryFieldAccessor$DefaultFinalClassAccessor.readFixedType(BinaryFieldAccessor.java:828)
at org.apache.ignite.internal.binary.BinaryFieldAccessor$DefaultFinalClassAccessor.read(BinaryFieldAccessor.java:639)
at org.apache.ignite.internal.binary.BinaryClassDescriptor.read(BinaryClassDescriptor.java:833)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1498)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize(BinaryReaderExImpl.java:1450)
at org.apache.ignite.internal.binary.BinaryUtils.doReadObject(BinaryUtils.java:1640)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.readObject(BinaryReaderExImpl.java:1124)
at org.apache.ignite.internal.processors.closure.GridClosureProcessor$C2V2.readBinary(GridClosureProcessor.java:2073)
at org.apache.ignite.internal.binary.BinaryClassDescriptor.read(BinaryClassDescriptor.java:823)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1498)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize(BinaryReaderExImpl.java:1450)
at org.apache.ignite.internal.binary.GridBinaryMarshaller.deserialize(GridBinaryMarshaller.java:298)
at org.apache.ignite.internal.binary.BinaryMarshaller.unmarshal0(BinaryMarshaller.java:99)
at org.apache.ignite.marshaller.AbstractNodeNameAwareMarshaller.unmarshal(AbstractNodeNameAwareMarshaller.java:82)
at org.apache.ignite.internal.util.IgniteUtils.unmarshal(IgniteUtils.java:9820)
... 10 more
Caused by: java.lang.ClassNotFoundException: com.test.App
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.ignite.internal.util.IgniteUtils.forName(IgniteUtils.java:8459)
at org.apache.ignite.internal.MarshallerContextAdapter.getClass(MarshallerContextAdapter.java:185)
at org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:683)
This exception is perfectly clear. But what do we need the ignite.sh script for?
You should either add your App class to the classpath of all Ignite nodes, or turn on peer class loading [1].
[1] https://apacheignite.readme.io/v1.9/docs/zero-deployment#peer-class-loading
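A minimal sketch of enabling peer class loading programmatically (the same flag can also be set on the IgniteConfiguration bean in the Spring XML config; note that the setting should match across all nodes in the cluster):

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;

IgniteConfiguration cfg = new IgniteConfiguration();
cfg.setPeerClassLoadingEnabled(true);

try (Ignite ignite = Ignition.start(cfg)) {
    // closures submitted via ignite.compute() can now be loaded by the
    // server nodes on demand, without deploying com.test.App to each node
}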
You should add your JAR file to the lib/ folder under the Apache Ignite installation folder. Then you can start ignite.sh and all your classes will be automatically loaded.
Alternatively, you can try enabling peer-class-loading as suggested above, and Ignite will load the classes automatically.

Using Google Reflections within Groovy causes exception whereas equivalent Java code works

I'm trying to use some code from another answer on SO, and while the code runs in Java, from Groovy it causes an exception.
The code in question is:
Reflections reflections = new Reflections(new ConfigurationBuilder()
    .setScanners(new SubTypesScanner(false /* don't exclude Object.class */), new ResourcesScanner())
    .setUrls(ClasspathHelper.forClassLoader(classLoadersList.toArray(new ClassLoader[0])))
    .filterInputsBy(new FilterBuilder()
        .include(prefix("net.initech"))
        .exclude(prefix("net.initech.util"))));
The exception is getting thrown; the line in question seems to be ClasspathHelper.forClassLoader(...).
This happens regardless of whether I'm using @CompileStatic or not. Also, I tried just using this.getClassLoader() and the same issue occurs.
The exception is:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/ServletContext
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2688)
at java.lang.Class.getDeclaredMethods(Class.java:1962)
at org.codehaus.groovy.reflection.stdclasses.CachedSAMClass.getAbstractMethods(CachedSAMClass.java:91)
at org.codehaus.groovy.reflection.stdclasses.CachedSAMClass.getSAMMethod(CachedSAMClass.java:155)
at org.codehaus.groovy.reflection.ClassInfo.isSAM(ClassInfo.java:280)
at org.codehaus.groovy.reflection.ClassInfo.createCachedClass(ClassInfo.java:270)
at org.codehaus.groovy.reflection.ClassInfo.access$400(ClassInfo.java:36)
at org.codehaus.groovy.reflection.ClassInfo$LazyCachedClassRef.initValue(ClassInfo.java:441)
at org.codehaus.groovy.reflection.ClassInfo$LazyCachedClassRef.initValue(ClassInfo.java:432)
at org.codehaus.groovy.util.LazyReference.getLocked(LazyReference.java:46)
at org.codehaus.groovy.util.LazyReference.get(LazyReference.java:33)
at org.codehaus.groovy.reflection.ClassInfo.getCachedClass(ClassInfo.java:89)
at org.codehaus.groovy.reflection.ReflectionCache.getCachedClass(ReflectionCache.java:107)
at groovy.lang.MetaClassImpl.(MetaClassImpl.java:163)
at groovy.lang.MetaClassImpl.(MetaClassImpl.java:187)
at groovy.lang.MetaClassImpl.(MetaClassImpl.java:193)
at groovy.lang.MetaClassRegistry$MetaClassCreationHandle.createNormalMetaClass(MetaClassRegistry.java:158)
at groovy.lang.MetaClassRegistry$MetaClassCreationHandle.createWithCustomLookup(MetaClassRegistry.java:148)
at groovy.lang.MetaClassRegistry$MetaClassCreationHandle.create(MetaClassRegistry.java:131)
at org.codehaus.groovy.reflection.ClassInfo.getMetaClassUnderLock(ClassInfo.java:175)
at org.codehaus.groovy.reflection.ClassInfo.getMetaClass(ClassInfo.java:192)
at org.codehaus.groovy.runtime.metaclass.MetaClassRegistryImpl.getMetaClass(MetaClassRegistryImpl.java:255)
at org.codehaus.groovy.runtime.InvokerHelper.getMetaClass(InvokerHelper.java:859)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.createCallStaticSite(CallSiteArray.java:72)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.createCallSite(CallSiteArray.java:159)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at net.initech.DeltaCodeGen.main(DeltaCodeGen.groovy:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: java.lang.ClassNotFoundException: javax.servlet.ServletContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 35 more
I can work around this by adding the following to my pom.xml:
<dependency>
    <groupId>org.apache.tomcat</groupId>
    <artifactId>servlet-api</artifactId>
    <version>6.0.37</version>
</dependency>
but I shouldn't have to, and the Java version doesn't need it.
You might be running into the well-known problem that the Groovy compiler sometimes needs runtime dependencies to be put on its compile class path. This is because the compiler uses Java reflection to access its compile class path. There are concrete plans to fix this in an upcoming release (I don't remember if it's 2.x or 3.0).
It looks like the domain you wish to scan is "net.initech". In that case, why not use ClasspathHelper.forPackage("net.initech") (and leave the exclude pattern in place)?
Second, what's the idea behind using new ClassLoader[0]?
Also, note that using new SubTypesScanner(false) is not a best practice, as it might create a huge metadata store of all classes (well, all classes are derived from Object).
Basically, Reflections is not intended to list all classes (though it obviously can), but to aggregate types based on some criteria (annotation, supertype, and so on).
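For reference, a minimal sketch of that suggestion, assuming the same Reflections 0.9.x API as the question (prefix() here is the static FilterBuilder.prefix helper that the question statically imports):

import org.reflections.Reflections;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;
import org.reflections.util.FilterBuilder;

Reflections reflections = new Reflections(new ConfigurationBuilder()
    .setUrls(ClasspathHelper.forPackage("net.initech"))
    .filterInputsBy(new FilterBuilder()
        .include(FilterBuilder.prefix("net.initech"))
        .exclude(FilterBuilder.prefix("net.initech.util"))));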

Official supercsv dozer example does not work with Play — Play 2.1.1, Java

Update:
Now I have found out something very strange: it works when I put the compiled test classes directly into the super-csv-dozer-2.1.0.jar with the same structure as in the SVN repo.
But as soon as I use my own package it won't work. I always used the fully qualified name, like this: myPackage.csv.SurveyResponse.class
What am I missing? And why does it work in a normal Java project but not in a Play project?
I also tried it outside of Eclipse in a different directory, and I don't have any checkouts from the SVN repo that could confuse the paths. Another thing I tried was to put the Writing class into a different package, but that didn't help.
Original question:
I'm trying to use Super CSV with Dozer in a Play project.
The official Super CSV sample works fine in a separate Java project.
But when I put the code and the needed Super CSV jars into a freshly created Play project, I always get a ClassNotFoundException for SurveyResponse.class on this line of code:
beanWriter.configureBeanMapping(myPackage.csv.SurveyResponse.class, FIELD_MAPPING);
Here is a screenshot of my project structure:
http://oi44.tinypic.com/mrqp7s.jpg
I made sure that all the jars are available: I put them unmanaged into the /lib folder, they are available in Eclipse, and no errors appear during compilation.
I debugged the code, and the SurveyResponse class is found and the beanWriter is initialized.
So somehow Play must do some magic in the background that triggers this bug.
What could Play be doing in the background to possibly trigger such odd behavior?
What could I try in order to fix this?
Changes I made to the sample so it works with Play:
I used exactly the same code as the official Super CSV example. The only changes I made were to remove the Writing.main(..) method and to make the methods Writing.writeWithDozerCsvBeanWriter() and Writing.partialWriteWithCsvDozerBeanWriter() public, so they can be accessed from the Application controller.
And of course I changed the package name in all classes to package myPackage.test;
Links to the sample: see the comment; I can't add more than 2 links because of low reputation.
Controller:
public class Application extends Controller {
    public static Result index() throws Exception {
        Writing.writeWithDozerCsvBeanWriter();
        Writing.partialWriteWithCsvDozerBeanWriter();
        return ok(index.render("Your new application is ready."));
    }
}
Code that triggers the error in the Writing class:
ICsvDozerBeanWriter beanWriter = null;
try {
    beanWriter = new CsvDozerBeanWriter(new FileWriter("target/writeWithCsvDozerBeanWriter.csv"),
            CsvPreference.STANDARD_PREFERENCE);
    // configure the mapping from the fields to the CSV columns
    // Here the exception occurs:
    beanWriter.configureBeanMapping(myPackage.csv.SurveyResponse.class, FIELD_MAPPING);
Stack trace:
play.api.Application$$anon$1: Execution exception[[MappingException: java.lang.ClassNotFoundException: myPackage.test.SurveyResponse]]
at play.api.Application$class.handleError(Application.scala:289) ~[play_2.10.jar:2.1.1]
at play.api.DefaultApplication.handleError(Application.scala:383) [play_2.10.jar:2.1.1]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anon$2$$anonfun$handle$1.apply(PlayDefaultUpstreamHandler.scala:144) [play_2.10.jar:2.1.1]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anon$2$$anonfun$handle$1.apply(PlayDefaultUpstreamHandler.scala:140) [play_2.10.jar:2.1.1]
at play.api.libs.concurrent.PlayPromise$$anonfun$extend1$1.apply(Promise.scala:113) [play_2.10.jar:2.1.1]
at play.api.libs.concurrent.PlayPromise$$anonfun$extend1$1.apply(Promise.scala:113) [play_2.10.jar:2.1.1]
at play.api.libs.concurrent.PlayPromise$$anonfun$extend$1$$anonfun$apply$1.apply(Promise.scala:104) [play_2.10.jar:2.1.1]
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library.jar:na]
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library.jar:na]
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source) [na:1.6.0_37]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.6.0_37]
at java.lang.Thread.run(Unknown Source) [na:1.6.0_37]
org.dozer.MappingException: java.lang.ClassNotFoundException: myPackage.test.SurveyResponse
at org.dozer.util.MappingUtils.throwMappingException(MappingUtils.java:82) ~[dozer-5.4.0.jar:na]
at org.dozer.util.DefaultClassLoader.loadClass(DefaultClassLoader.java:38) ~[dozer-5.4.0.jar:na]
at org.dozer.util.MappingUtils.loadClass(MappingUtils.java:224) ~[dozer-5.4.0.jar:na]
at org.dozer.loader.DozerBuilder$MappingBuilder.classA(DozerBuilder.java:129) ~[dozer-5.4.0.jar:na]
at org.dozer.loader.api.BeanMappingBuilder.mapping(BeanMappingBuilder.java:72) ~[dozer-5.4.0.jar:na]
at org.dozer.loader.api.BeanMappingBuilder.mapping(BeanMappingBuilder.java:67) ~[dozer-5.4.0.jar:na]
at org.supercsv.io.dozer.CsvDozerBeanWriter$MappingBuilder.configure(CsvDozerBeanWriter.java:178) ~[super-csv-dozer-2.1.0.jar:na]
at org.dozer.loader.api.BeanMappingBuilder.build(BeanMappingBuilder.java:42) ~[dozer-5.4.0.jar:na]
at org.dozer.DozerBeanMapper.addMapping(DozerBeanMapper.java:258) ~[dozer-5.4.0.jar:na]
at org.supercsv.io.dozer.CsvDozerBeanWriter.configureBeanMapping(CsvDozerBeanWriter.java:91) ~[super-csv-dozer-2.1.0.jar:na]
at myPackage.test.Writing.writeWithDozerCsvBeanWriter(Writing.java:88) ~[na:na]
at controllers.Application.index(Application.java:12) ~[na:na]
at Routes$$anonfun$routes$1$$anonfun$applyOrElse$1$$anonfun$apply$1.apply(routes_routing.scala:49) ~[na:na]
at Routes$$anonfun$routes$1$$anonfun$applyOrElse$1$$anonfun$apply$1.apply(routes_routing.scala:49) ~[na:na]
at play.core.Router$HandlerInvoker$$anon$6$$anon$2.invocation(Router.scala:164) ~[play_2.10.jar:2.1.1]
at play.core.Router$Routes$$anon$1.invocation(Router.scala:345) ~[play_2.10.jar:2.1.1]
at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:31) ~[play_2.10.jar:2.1.1]
at play.core.j.JavaAction$$anon$2.apply(JavaAction.scala:74) ~[play_2.10.jar:2.1.1]
at play.core.j.JavaAction$$anon$2.apply(JavaAction.scala:73) ~[play_2.10.jar:2.1.1]
at play.libs.F$Promise$PromiseActor.onReceive(F.java:420) ~[play_2.10.jar:2.1.1]
at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:159) ~[akka-actor_2.10.jar:na]
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:425) ~[akka-actor_2.10.jar:na]
at akka.actor.ActorCell.invoke(ActorCell.scala:386) ~[akka-actor_2.10.jar:na]
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:230) ~[akka-actor_2.10.jar:na]
at akka.dispatch.Mailbox.run(Mailbox.scala:212) ~[akka-actor_2.10.jar:na]
at akka.dispatch.ForkJoinExecutorConfigurator$MailboxExecutionTask.exec(AbstractDispatcher.scala:502) ~[akka-actor_2.10.jar:na]
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:262) ~[scala-library.jar:na]
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:975) ~[scala-library.jar:na]
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1478) ~[scala-library.jar:na]
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104) ~[scala-library.jar:na]
Caused by: java.lang.ClassNotFoundException: myPackage.test.SurveyResponse
at java.net.URLClassLoader$1.run(Unknown Source) ~[na:1.6.0_37]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.6.0_37]
at java.net.URLClassLoader.findClass(Unknown Source) ~[na:1.6.0_37]
at java.lang.ClassLoader.loadClass(Unknown Source) ~[na:1.6.0_37]
at java.lang.ClassLoader.loadClass(Unknown Source) ~[na:1.6.0_37]
at sbt.PlayCommands$$anonfun$53$$anonfun$55$$anon$2.loadClass(PlayCommands.scala:535) ~[na:na]
at java.lang.Class.forName0(Native Method) ~[na:1.6.0_37]
at java.lang.Class.forName(Unknown Source) ~[na:1.6.0_37]
at org.apache.commons.lang3.ClassUtils.getClass(ClassUtils.java:823) ~[commons-lang3.jar:3.1]
at org.apache.commons.lang3.ClassUtils.getClass(ClassUtils.java:889) ~[commons-lang3.jar:3.1]
at org.apache.commons.lang3.ClassUtils.getClass(ClassUtils.java:872) ~[commons-lang3.jar:3.1]
at org.dozer.util.DefaultClassLoader.loadClass(DefaultClassLoader.java:36) ~[dozer-5.4.0.jar:na]
... 28 common frames omitted
Extra information:
I still couldn't solve the problem, but here is some extra information to narrow it down:
The CSV header is successfully written to the file when I remove the configureBeanMapping method call (see code below).
The SurveyResponse class works too when I do a System.out.println() on a filled SurveyResponse object (see code below).
So it is not a problem with the package or class name.
The modified code that writes only the header:
// ...more code
myPackage.csv.SurveyResponse response3 = new myPackage.csv.SurveyResponse(42, false,
        Arrays.asList(new Answer(1, null), new Answer(2, "Carl Sagan"), new Answer(3, "Star Wars")));
final List<myPackage.csv.SurveyResponse> surveyResponses = Arrays.asList(response1, response2, response3);
ICsvDozerBeanWriter beanWriter = null;
try {
    beanWriter = new CsvDozerBeanWriter(new FileWriter("target/writeWithCsvDozerBeanWriter.csv"),
            CsvPreference.STANDARD_PREFERENCE);
    // configure the mapping from the fields to the CSV columns
    // beanWriter.configureBeanMapping(myPackage.csv.SurveyResponse.class, FIELD_MAPPING);
    // prints the value 42 successfully
    System.out.println(response3.getAge());
    // write the header
    beanWriter.writeHeader("age", "consentGiven", "questionNo1", "answer1", "questionNo2", "answer2", "questionNo3", "answer3");
    // ...more code
From what I understand of the error, Play is trying to load the wrong class. Look at this line:
java.lang.ClassNotFoundException: org.supercsv.mock.dozer.SurveyResponse
I think that SurveyResponse is one of your own project classes and not part of Super CSV. Try to prefix it with the full path, such as:
beanWriter.configureBeanMapping(whateverpackage.models.SurveyResponse.class, FIELD_MAPPING);
SurveyResponse and Answer are test classes of the super-csv-dozer artifact, so they aren't packaged in the distribution (i.e. they're not in Maven or in the zip file on SourceForge).
You can view the test source online or checkout the SVN repo (which it sounds like you've already done) and just copy them into your Play project.
Both files are in the following directory:
supercsv/super-csv-dozer/src/test/java/org/supercsv/mock/dozer/
I'm a little curious why you're not getting a compilation error when trying to use SurveyResponse in your code. My guess is that you have the super-csv-dozer project checked out and open in your IDE and your project is finding it.
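If it helps, a hypothetical sketch of the copied bean, inferred from the constructor and getter calls shown in the question (the real class lives in the super-csv-dozer test sources; Answer is its companion test class and is copied the same way):

package myPackage.csv;

import java.util.List;

public class SurveyResponse {
    private int age;
    private boolean consentGiven;
    private List<Answer> answers;

    public SurveyResponse() {
        // Dozer needs a public no-arg constructor to instantiate the bean
    }

    public SurveyResponse(int age, boolean consentGiven, List<Answer> answers) {
        this.age = age;
        this.consentGiven = consentGiven;
        this.answers = answers;
    }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
    public boolean getConsentGiven() { return consentGiven; }
    public void setConsentGiven(boolean consentGiven) { this.consentGiven = consentGiven; }
    public List<Answer> getAnswers() { return answers; }
    public void setAnswers(List<Answer> answers) { this.answers = answers; }
}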
