Flink cannot find Groovy class during checkpoint - java

I have a problem in Flink. My real-time compute engine uses Groovy scripts to extend the supported compute types (sum, average, count, and so on). We define a standard compute interface (AbstractCompute); to add a new compute type to the framework, I only need to implement AbstractCompute and store the Groovy script in the database. The application then reads the script for each task and loads it into the JVM with a GroovyClassLoader.
This mechanism works very well outside Flink. The problem is that at checkpoint time Flink uses a different ClassLoader (FlinkUserCodeClassLoaders$ChildFirstClassLoader) to load the objects instantiated from the Groovy script, instead of the GroovyClassLoader that compiled them.
Code
// Initialize the Groovy ClassLoader
CompilerConfiguration classLoaderConfig = new CompilerConfiguration();
classLoaderConfig.setSourceEncoding("UTF-8");
CLASS_LOADER = new GroovyClassLoader(Thread.currentThread().getContextClassLoader(), classLoaderConfig);
......
......
// Parse the script, create an instance, and put it into the cache
Class clazz = CLASS_LOADER.parseClass(computeType.getScript());
AbstractComputable computableObject = (AbstractComputable) clazz.newInstance();
removeComputeType(computeType);
// Store the custom compute object in the cache
IndicatorCache.COMPUTABLE_OBJECT_CACHE.put(computeType.getId().intValue(), computableObject);
......
......
AbstractComputable computable = IndicatorCache.COMPUTABLE_OBJECT_CACHE.get(indicator.getComputeType());
if (computable == null) {
    if (log.isDebugEnabled()) {
        log.debug("no computeType:{} in cache", indicator.getComputeType());
    }
    return false;
}
indicator.setComputableObject(computable);
Exception stack:
com.esotericsoftware.kryo.KryoException: Unable to find class: com.xxx.xxx.common.computable.CurValueCompute
Serialization trace:
computableObject (com.xxx.xxx.common.pojo.property.IndicatorProperty)
normalIndicatorList (com.xxx.xxx.common.pojo.property.ComputeTuple)
at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:138)
at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:641)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:116)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:22)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:657)
at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.copy(KryoSerializer.java:231)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:577)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:554)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:534)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:718)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:696)
at org.apache.flink.streaming.api.operators.StreamMap.processElement(StreamMap.java:41)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:579)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:554)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:534)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:718)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:696)
at org.apache.flink.streaming.api.operators.StreamFilter.processElement(StreamFilter.java:40)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:579)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:554)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:534)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:718)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:696)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollectWithTimestamp(StreamSourceContexts.java:310)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collectWithTimestamp(StreamSourceContexts.java:409)
at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecordWithTimestamp(AbstractFetcher.java:398)
at org.apache.flink.streaming.connectors.kafka.internal.Kafka010Fetcher.emitRecord(Kafka010Fetcher.java:89)
at org.apache.flink.streaming.connectors.kafka.internal.Kafka09Fetcher.runFetchLoop(Kafka09Fetcher.java:154)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:738)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:94)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:58)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:99)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:300)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.xxx.xxx.common.computable.CurValueCompute
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader.loadClass(FlinkUserCodeClassLoaders.java:129)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
... 41 common frames omitted
How can I use a dynamic language such as Groovy correctly in Flink?

Flink needs to know the types it is processing; otherwise it cannot serialize and deserialize the instances. Therefore, the class definitions need to be contained in the user code jar that you submit to the Flink cluster.
If you want to support dynamically loaded classes, you should serialize these instances into a generic format (e.g. an AbstractComputeContainer) which you fully resolve in the user code function where you have the GroovyClassLoader.
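A minimal sketch of that approach (the container and function names below are illustrative, not part of the original framework): the container carries only the compute type id and the raw script source, so Kryo only ever serializes plain strings and integers, and the Groovy-generated class never leaves the operator that compiles it.

import java.util.HashMap;
import java.util.Map;

import groovy.lang.GroovyClassLoader;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

// Hypothetical generic container: only statically-known fields cross the wire.
class AbstractComputeContainer implements java.io.Serializable {
    private final int computeTypeId;
    private final String groovyScript; // raw script source, e.g. read from the DB

    AbstractComputeContainer(int computeTypeId, String groovyScript) {
        this.computeTypeId = computeTypeId;
        this.groovyScript = groovyScript;
    }

    int getComputeTypeId() { return computeTypeId; }
    String getGroovyScript() { return groovyScript; }
}

// The script is compiled lazily inside the task, where the GroovyClassLoader
// lives; the compiled instance is never part of the operator's input or
// output types, so checkpoints and network shuffles never see it.
class ResolveComputeFunction extends RichMapFunction<AbstractComputeContainer, Boolean> {
    private transient GroovyClassLoader groovyClassLoader;
    private transient Map<Integer, AbstractComputable> cache;

    @Override
    public void open(Configuration parameters) {
        groovyClassLoader = new GroovyClassLoader(Thread.currentThread().getContextClassLoader());
        cache = new HashMap<>();
    }

    @Override
    public Boolean map(AbstractComputeContainer container) throws Exception {
        AbstractComputable computable = cache.get(container.getComputeTypeId());
        if (computable == null) {
            Class<?> clazz = groovyClassLoader.parseClass(container.getGroovyScript());
            computable = (AbstractComputable) clazz.newInstance();
            cache.put(container.getComputeTypeId(), computable);
        }
        // apply `computable` to the record here and return the result
        return true;
    }
}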

Related

What do we need the ignite.sh script for?

I'm new to Ignite and I'm trying to run a simple multi-node computation example. I wrote the following application:
try (Ignite ignite = Ignition.start("example-cache.xml")) {
    IgniteCompute asyncCompute = ignite.compute().withAsync();
    for (int i = 0; i < 100; i++) {
        int[] a = new int[1];
        a[0] = i;
        asyncCompute.call(() -> {
            out.println(a[0]);
            return a[0];
        });
    }
}
First I started 3 server nodes with the ignite.sh script supplied with the binary Ignite distribution, each in a separate JVM instance.
Then I built this application and ran it with the same Spring XML config as the 3 server nodes.
But actually I got the following exception:
class org.apache.ignite.IgniteCheckedException: com.test.App
at org.apache.ignite.internal.util.IgniteUtils.unmarshal(IgniteUtils.java:9826)
at org.apache.ignite.internal.processors.job.GridJobWorker.initialize(GridJobWorker.java:432)
at org.apache.ignite.internal.processors.job.GridJobProcessor.processJobExecuteRequest(GridJobProcessor.java:1108)
at org.apache.ignite.internal.processors.job.GridJobProcessor$JobExecutionListener.onMessage(GridJobProcessor.java:1894)
at org.apache.ignite.internal.managers.communication.GridIoManager.invokeListener(GridIoManager.java:1222)
at org.apache.ignite.internal.managers.communication.GridIoManager.processRegularMessage0(GridIoManager.java:850)
at org.apache.ignite.internal.managers.communication.GridIoManager.access$2100(GridIoManager.java:108)
at org.apache.ignite.internal.managers.communication.GridIoManager$7.run(GridIoManager.java:790)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: class org.apache.ignite.binary.BinaryInvalidTypeException: com.test.App
at org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:692)
at org.apache.ignite.internal.binary.BinaryUtils.doReadClass(BinaryUtils.java:1486)
at org.apache.ignite.internal.binary.BinaryUtils.doReadClass(BinaryUtils.java:1424)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.readClass(BinaryReaderExImpl.java:370)
at org.apache.ignite.internal.binary.BinaryFieldAccessor$DefaultFinalClassAccessor.readFixedType(BinaryFieldAccessor.java:828)
at org.apache.ignite.internal.binary.BinaryFieldAccessor$DefaultFinalClassAccessor.read(BinaryFieldAccessor.java:639)
at org.apache.ignite.internal.binary.BinaryClassDescriptor.read(BinaryClassDescriptor.java:833)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1498)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize(BinaryReaderExImpl.java:1450)
at org.apache.ignite.internal.binary.BinaryUtils.doReadObject(BinaryUtils.java:1640)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.readObject(BinaryReaderExImpl.java:1124)
at org.apache.ignite.internal.processors.closure.GridClosureProcessor$C2V2.readBinary(GridClosureProcessor.java:2073)
at org.apache.ignite.internal.binary.BinaryClassDescriptor.read(BinaryClassDescriptor.java:823)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1498)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize(BinaryReaderExImpl.java:1450)
at org.apache.ignite.internal.binary.GridBinaryMarshaller.deserialize(GridBinaryMarshaller.java:298)
at org.apache.ignite.internal.binary.BinaryMarshaller.unmarshal0(BinaryMarshaller.java:99)
at org.apache.ignite.marshaller.AbstractNodeNameAwareMarshaller.unmarshal(AbstractNodeNameAwareMarshaller.java:82)
at org.apache.ignite.internal.util.IgniteUtils.unmarshal(IgniteUtils.java:9820)
... 10 more
Caused by: java.lang.ClassNotFoundException: com.test.App
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.ignite.internal.util.IgniteUtils.forName(IgniteUtils.java:8459)
at org.apache.ignite.internal.MarshallerContextAdapter.getClass(MarshallerContextAdapter.java:185)
at org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:683)
This exception is perfectly clear. But what do we need the ignite.sh script for?
You should either add your App class to the classpath of all Ignite nodes or
turn on peer class loading [1].
[1] https://apacheignite.readme.io/v1.9/docs/zero-deployment#peer-class-loading
You should add your JAR file to the lib/ folder under the Apache Ignite installation directory. Then you can start ignite.sh and all your classes will be loaded automatically.
Alternatively, you can enable peer class loading as suggested above, and Ignite will load the classes automatically.
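For the second option, a minimal sketch of enabling peer class loading programmatically; the flag must be set on every node, and it can equally be set in the Spring XML config via the peerClassLoadingEnabled property:

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;

public class PeerClassLoadingExample {
    public static void main(String[] args) {
        // Enable peer class loading so server nodes can fetch the closure's
        // class definition from the client node on demand.
        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setPeerClassLoadingEnabled(true);

        try (Ignite ignite = Ignition.start(cfg)) {
            // compute closures defined only in the client jar can now run
            // on the remote server nodes
        }
    }
}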

Axis client, gSOAP server

I already have a C++ server exposing a service that inserts a user into the DB; the service works great when I test it from the console.
Now I'm developing a Java client application that consumes the service with Apache Axis, but unfortunately it doesn't work. I have been searching for information on this problem, but I haven't found a similar implementation.
My Apache Axis files are in /usr/share/java, which is the value of my AXIS2_HOME variable. I use it to execute:
java -cp $AXIS2_HOME org.apache.axis.wsdl.WSDL2Java -p CrearAlumno http://localhost/CrearAlumno.wsdl
to generate the stub files. Later I execute:
javac -cp $AXIS2_HOME *.java
to compile the files, including the client class:
// CrearAlumnoClient.java
package CrearAlumno;

import java.rmi.RemoteException;
import javax.xml.rpc.ServiceException;

public class CrearAlumnoClient {
    public static void main(String[] args) {
        Input in = new Input("asdf", "adgfsdf", "asdg", 453, "asdf", "asdfasdf", "pasdfsd", "asdfsd");
        try {
            CrearAlumno_Service service = new CrearAlumno_ServiceLocator();
            CrearAlumnoPortType port = service.getCrearAlumno();
            String response = port.getInfo(in);
        } catch (RemoteException e) {
            e.printStackTrace();
        } catch (ServiceException e) {
            e.printStackTrace();
        }
    }
}
But when I execute:
java CrearAlumno.CrearAlumnoClient
my application throws these errors:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: javax/xml/rpc/ServiceException
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: javax.xml.rpc.ServiceException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
I have no idea how to solve these errors; I have been searching for a working implementation, but so far I don't have one.
I would also be pleased if anyone could show me a simple implementation with Axis and gSOAP.
Thank you for your attention :).
This looks like a simple case of your classpath not being set up correctly.
There's information on that specific topic here: http://axis.apache.org/axis/java/install.html#Classpath_setup
You need to ensure that the jar file containing javax.xml.rpc.ServiceException is present.
I see you are setting up your classpath using -cp $AXIS2_HOME, which won't work. At best, if your jars are in $AXIS2_HOME, you would need $AXIS2_HOME/*.jar, but in all likelihood you'll need something more like:
AXIS_HOME=/usr/axis
AXIS_LIB=$AXIS_HOME/lib
AXISCLASSPATH=$AXIS_LIB/axis.jar:$AXIS_LIB/commons-discovery.jar:\
$AXIS_LIB/commons-logging.jar:$AXIS_LIB/jaxrpc.jar:$AXIS_LIB/saaj.jar:\
$AXIS_LIB/log4j-1.2.8.jar:$AXIS_LIB/xml-apis.jar:$AXIS_LIB/xercesImpl.jar:\
$AXIS_LIB/wsdl4j.jar
export AXIS_HOME AXIS_LIB AXISCLASSPATH
Then invoke your application with:
java -cp $AXISCLASSPATH CrearAlumno.CrearAlumnoClient
As regards integration between Axis and gSOAP, it really should be quite straightforward. There shouldn't be any special interventions required just because you're crossing between the Java and C++ worlds, at least for simple use cases.

Loading MLlib models outside Spark

I'm training a model in Spark with MLlib and saving it:
val model = SVMWithSGD.train(training, numIterations)
model.save(sc, "~/model")
but I'm having trouble loading it from a Java app without Spark to make real-time predictions.
SparkConf sconf = new SparkConf().setAppName("Application").setMaster("local");
SparkContext sc = new SparkContext(sconf);
SVMModel model = SVMModel.load(sc, "/model");
I'm getting:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
at ModelUser$.main(ModelUser.scala:11)
at ModelUser.main(ModelUser.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
Is there a way to load the model in a normal Java app?
Have a look at PMML model export here.
PMML model export in Spark is no longer maintained, and only the old RDD API supports it.
I've been using jpmml-sparkml to solve the problem. It also has a Java runtime for standalone model execution.
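A rough sketch of that route, assuming the training job first exports the model to PMML (SVMModel implements PMMLExportable, so something like model.toPMML(sc, "/path/model.pmml") works in the training job) and a plain Java app then scores it; the field handling below follows the JPMML-Evaluator usage pattern:

import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

import org.dmg.pmml.FieldName;
import org.jpmml.evaluator.Evaluator;
import org.jpmml.evaluator.FieldValue;
import org.jpmml.evaluator.InputField;
import org.jpmml.evaluator.LoadingModelEvaluatorBuilder;

public class StandalonePredictor {
    public static void main(String[] args) throws Exception {
        // Load the PMML file exported by the Spark training job.
        Evaluator evaluator = new LoadingModelEvaluatorBuilder()
                .load(new File("/path/model.pmml"))
                .build();
        evaluator.verify();

        // Prepare one argument per declared input field, then evaluate.
        Map<FieldName, FieldValue> arguments = new LinkedHashMap<>();
        for (InputField inputField : evaluator.getInputFields()) {
            Object rawValue = 0.0; // plug in the real feature value here
            arguments.put(inputField.getName(), inputField.prepare(rawValue));
        }
        Map<FieldName, ?> results = evaluator.evaluate(arguments);
        System.out.println(results);
    }
}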

InvocationTargetException: cannot cast class X to class X when invoked in Scala IMain through spark-submit

So, I have the following use case.
I'm simplifying the usage of Spark DataFrames for a particular domain by providing a DSL-like interface.
All this code goes in a fat jar created by the Maven Shade plugin (fat jar = without the Spark and Hadoop dependencies).
This fat jar has a main class, let's call it JavaMain.
Inside JavaMain, I make a REST call to get a string whose contents are a valid DSL.
I instantiate an IMain object with an initial Settings object and bind a few variables using the imain.bind method.
However, this bind fails with the following error:
Set failed in bind(results, com.dhruv.dsl.DslDataFrame.DSLResults, com.dhruv.dsl.DslDataFrame$DSLResults@7650a5f3)
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.callEither(IMain.scala:738)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:625)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:661)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:662)
at com.thoughtworks.dsl.DSL.run(DSL.scala:44)
at com.thoughtworks.dsl.JavaMain.run(JavaMain.java:30)
at com.thoughtworks.dsl.JavaMain.main(JavaMain.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassCastException: com.thoughtworks.dsl.DslDataFrame$DSLResults cannot be cast to com.thoughtworks.dsl.DslDataFrame$DSLResults
at $line3.$eval$.set(<console>:6)
at $line3.$eval.set(<console>)
... 21 more
More context:
I had classpath issues when trying this out, and it seems I haven't been able to resolve all of them.
Earlier, when creating the Settings object, I was doing something like this:
val settings = {
  val x = new Settings()
  x.classpath.value += File.pathSeparator + System.getProperty("java.class.path")
  x.usejavacp.value = true
  x.verbose.value = true
  x
}
However, this didn't seem to work: when doing a spark-submit, this only had the Spark and Hadoop related jars on the classpath.
I then added the following to the classpath:
val urLs: Array[URL] = Thread.currentThread.getContextClassLoader.asInstanceOf[URLClassLoader].getURLs
and did the following:
val settings = {
  val x = new Settings()
  x.classpath.value += File.pathSeparator + urLs(0)
  x.usejavacp.value = true
  x.verbose.value = true
  x
}
This is the code I'm using to bind the objects:
interpreter.bind("notagin", new SomeDummyObject)
This throws the exception I attached earlier.
Interestingly, the following code works (i.e. importing and instantiating the same object inside the interpreter doesn't cause a problem):
interpreter.interpret(
  """
  import com.dhruv.dsl.operations._
  import com.dhruv.dsl.implicits._
  import com.dhruv.dsl.DslDataFrame._
  import org.apache.spark.sql.Column
  import com.dhruv.dsl._
  implicit def RichColumn(column: Column): RichColumn = new RichColumn(column)
  val justdont = new SomeDummyObject()
  justdont.justdontcallme(thatJson)
  """
)
Another detail I'm aware of, and which bothers me, is that IMain internally changes the classloader. I'm not sure whether that is causing the issue.
Any help is more than appreciated.
Okay, so we figured out how to solve the problem for us.
I think IMain uses a different classloader to load classes than the one they are supposed to be loaded with. Anyway, the following solves the problem; I'm leaving it here for others to have a look at.
val interpreter = new IMain(settings) {
  override protected def parentClassLoader: ClassLoader = this.getClass.getClassLoader
}

Using Google Reflections within Groovy causes an exception whereas equivalent Java code works

I'm trying to use some code from another answer on SO, and while the code runs fine in Java, from Groovy it causes an exception.
The code in question is:
Reflections reflections = new Reflections(new ConfigurationBuilder()
        .setScanners(new SubTypesScanner(false /* don't exclude Object.class */), new ResourcesScanner())
        .setUrls(ClasspathHelper.forClassLoader(classLoadersList.toArray(new ClassLoader[0])))
        .filterInputsBy(new FilterBuilder()
                .include(prefix("net.initech"))
                .exclude(prefix("net.initech.util"))));
The exception is being thrown, and the line in question seems to be the ClasspathHelper.forClassLoader(...) call.
This happens regardless of whether I'm using @CompileStatic or not. I also tried just using this.getClassLoader(), and the same issue occurs.
The exception is:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/ServletContext
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2688)
at java.lang.Class.getDeclaredMethods(Class.java:1962)
at org.codehaus.groovy.reflection.stdclasses.CachedSAMClass.getAbstractMethods(CachedSAMClass.java:91)
at org.codehaus.groovy.reflection.stdclasses.CachedSAMClass.getSAMMethod(CachedSAMClass.java:155)
at org.codehaus.groovy.reflection.ClassInfo.isSAM(ClassInfo.java:280)
at org.codehaus.groovy.reflection.ClassInfo.createCachedClass(ClassInfo.java:270)
at org.codehaus.groovy.reflection.ClassInfo.access$400(ClassInfo.java:36)
at org.codehaus.groovy.reflection.ClassInfo$LazyCachedClassRef.initValue(ClassInfo.java:441)
at org.codehaus.groovy.reflection.ClassInfo$LazyCachedClassRef.initValue(ClassInfo.java:432)
at org.codehaus.groovy.util.LazyReference.getLocked(LazyReference.java:46)
at org.codehaus.groovy.util.LazyReference.get(LazyReference.java:33)
at org.codehaus.groovy.reflection.ClassInfo.getCachedClass(ClassInfo.java:89)
at org.codehaus.groovy.reflection.ReflectionCache.getCachedClass(ReflectionCache.java:107)
at groovy.lang.MetaClassImpl.<init>(MetaClassImpl.java:163)
at groovy.lang.MetaClassImpl.<init>(MetaClassImpl.java:187)
at groovy.lang.MetaClassImpl.<init>(MetaClassImpl.java:193)
at groovy.lang.MetaClassRegistry$MetaClassCreationHandle.createNormalMetaClass(MetaClassRegistry.java:158)
at groovy.lang.MetaClassRegistry$MetaClassCreationHandle.createWithCustomLookup(MetaClassRegistry.java:148)
at groovy.lang.MetaClassRegistry$MetaClassCreationHandle.create(MetaClassRegistry.java:131)
at org.codehaus.groovy.reflection.ClassInfo.getMetaClassUnderLock(ClassInfo.java:175)
at org.codehaus.groovy.reflection.ClassInfo.getMetaClass(ClassInfo.java:192)
at org.codehaus.groovy.runtime.metaclass.MetaClassRegistryImpl.getMetaClass(MetaClassRegistryImpl.java:255)
at org.codehaus.groovy.runtime.InvokerHelper.getMetaClass(InvokerHelper.java:859)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.createCallStaticSite(CallSiteArray.java:72)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.createCallSite(CallSiteArray.java:159)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at net.initech.DeltaCodeGen.main(DeltaCodeGen.groovy:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: java.lang.ClassNotFoundException: javax.servlet.ServletContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 35 more
I can work around this by adding the following to my pom.xml:
<dependency>
    <groupId>org.apache.tomcat</groupId>
    <artifactId>servlet-api</artifactId>
    <version>6.0.37</version>
</dependency>
but I shouldn't have to, and I don't have it in the Java version.
You might be running into the well-known problem that the Groovy compiler sometimes needs runtime dependencies on its compile classpath. This is because the compiler uses Java reflection to access its compile classpath. There are concrete plans to fix this in an upcoming release (I don't remember whether it's 2.x or 3.0).
It looks like the domain you wish to scan is "net.initech". In that case, why not use ClasspathHelper.forPackage("net.initech") (and keep the exclude pattern)?
Second, what is the idea behind using new ClassLoader[0]?
Also, note that using new SubTypesScanner(false) is not a best practice, as it might create a huge metadata store of all classes (well, all classes are derived from Object).
Basically, Reflections is not intended to list all classes (though it obviously can), but to aggregate types based on some criteria (annotation, supertype, and so on).
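For illustration, a sketch of the suggested alternative, scanning only the "net.initech" domain while keeping the exclude filter (assuming the standard Reflections builder API):

import org.reflections.Reflections;
import org.reflections.scanners.ResourcesScanner;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;
import org.reflections.util.FilterBuilder;

// Scan only the net.initech package instead of the whole classloader,
// and keep Object.class excluded by using the default SubTypesScanner.
Reflections reflections = new Reflections(new ConfigurationBuilder()
        .setUrls(ClasspathHelper.forPackage("net.initech"))
        .setScanners(new SubTypesScanner(), new ResourcesScanner())
        .filterInputsBy(new FilterBuilder()
                .include(FilterBuilder.prefix("net.initech"))
                .exclude(FilterBuilder.prefix("net.initech.util"))));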
