Exception when running DL4J example - java

I have cloned the DL4J examples and am just trying to run one of them, LogDataExample.java. The project builds successfully and everything seems fine, except that when starting it the following exception is thrown:
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.util.concurrent.SingleThreadEventExecutor.<init>(Lio/netty/util/concurrent/EventExecutorGroup;Ljava/util/concurrent/Executor;ZLjava/util/Queue;Lio/netty/util/concurrent/RejectedExecutionHandler;)V
at io.netty.channel.SingleThreadEventLoop.<init>(SingleThreadEventLoop.java:65)
at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:78)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:73)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:60)
at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:50)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.datavec.transform.logdata.LogDataExample.main(LogDataExample.java:85)
I was not able to find anything online that would help me fix this. My code is exactly the same as in the example.
pom.xml contains the following:
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.46.Final</version>
</dependency>

I think you are forcing a newer version of Netty than Spark supports.
By running mvn dependency:tree you can see which version of Netty Spark wants, and use that instead of the one you've defined; a sketch of pinning it follows.
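For example, a minimal pom.xml sketch (the version property is a placeholder; substitute whatever version mvn dependency:tree reports for Spark):

<dependencyManagement>
    <dependencies>
        <!-- Placeholder version: use the netty-all version from Spark's dependency tree -->
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>${netty.version.from.spark}</version>
        </dependency>
    </dependencies>
</dependencyManagement>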
If you don't care about Spark, but just want to use DataVec to transform your data, take a look at https://www.dubs.tech/guides/quickstart-with-dl4j/. It is a little bit outdated regarding the dependencies, but the DataVec part shows how to use it without Spark; a rough sketch is below.
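For a rough idea of what that looks like, here is a minimal, hedged sketch of a Spark-free DataVec transform. It assumes the datavec-api and datavec-local artifacts are on the classpath, and the two-column schema is made up for illustration; it is not taken from the linked guide:

import java.util.Arrays;
import java.util.Collections;
import java.util.List;

import org.datavec.api.transform.TransformProcess;
import org.datavec.api.transform.schema.Schema;
import org.datavec.api.writable.IntWritable;
import org.datavec.api.writable.Text;
import org.datavec.api.writable.Writable;
import org.datavec.local.transforms.LocalTransformExecutor;

public class LocalDataVecSketch {
    public static void main(String[] args) {
        // Describe the raw data: one string column and one integer column
        Schema schema = new Schema.Builder()
                .addColumnString("host")
                .addColumnInteger("bytes")
                .build();

        // A trivial transform: drop the "host" column
        TransformProcess tp = new TransformProcess.Builder(schema)
                .removeColumns("host")
                .build();

        // One example row, transformed locally with no Spark involved
        List<List<Writable>> input = Collections.singletonList(
                Arrays.<Writable>asList(new Text("example.com"), new IntWritable(1024)));
        List<List<Writable>> output = LocalTransformExecutor.execute(input, tp);
        System.out.println(output);
    }
}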

Related

Adding new aggregate function in Presto fails

I am trying to build custom aggregate functions for Presto. I have created a fat jar and deployed it into the plugin directory. When I restart Presto, it always gives me this error:
java.lang.AbstractMethodError
at com.facebook.presto.server.PluginManager.installPlugin(PluginManager.java:183)
at com.facebook.presto.server.PluginManager.loadPlugin(PluginManager.java:175)
at com.facebook.presto.server.PluginManager.loadPlugin(PluginManager.java:158)
Unfortunately, I do not see any verbose error message that would give me a clue as to what is actually missing here. I used the presto-ml plugin as an example and implemented getFunctions() in my Plugin implementation. Is there a way to figure out what is missing?
I did check the source code of PluginManager.java. I am just looking for a way to debug this in a better way.
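For reference, the usual setup for a Presto plugin, and a common source of AbstractMethodError when it is wrong, is to compile against the same presto-spi version the server runs, with the SPI marked provided so it is not bundled into the fat jar. A sketch of that dependency (the version property is a placeholder for whatever your server runs):

<dependency>
    <groupId>com.facebook.presto</groupId>
    <artifactId>presto-spi</artifactId>
    <version>${presto.server.version}</version>
    <scope>provided</scope>
</dependency>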

Getting java.lang.NoSuchFieldError: usingExperimentalRuntime in GAE MemCache

Whenever I try to access MemCache I get a java.lang.NoSuchFieldError: usingExperimentalRuntime. I can't find any documentation on that field. Is there some configuration option that I'm not aware of?
Here's the source code:
MemcacheService syncCache = MemcacheServiceFactory.getMemcacheService();
syncCache.setErrorHandler(ErrorHandlers.getConsistentLogAndContinue(Level.ALL));
Object cacheObject = syncCache.get("arbitrary");
That last line crashes with this error (partial stack trace up to my code):
Caused by: java.lang.NoSuchFieldError: usingExperimentalRuntime
at com.google.appengine.api.memcache.MemcacheServicePb$MemcacheGetRequest.writeTo(MemcacheServicePb.java:1511)
at com.google.appengine.repackaged.com.google.protobuf.AbstractMessageLite.toByteArray(AbstractMessageLite.java:41)
at com.google.appengine.api.memcache.MemcacheServiceApiHelper.makeAsyncCall(MemcacheServiceApiHelper.java:97)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.doGet(AsyncMemcacheServiceImpl.java:405)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.getIdentifiable(AsyncMemcacheServiceImpl.java:422)
at com.google.appengine.api.memcache.MemcacheServiceImpl.getIdentifiable(MemcacheServiceImpl.java:54)
at com.myCode.CacheOrDbUtil.getUser(CacheOrDbUtil.java:27)
What makes this so strange is that the code was working last week, complete with unit tests using MemCache. Now they are failing. Of course I've tried undoing everything I've done, but without success:
- Upgrading gcloud
- Downgrading gcloud
- Shutting down all other servers
- Rebooting the machine
- Resetting to the last known good code revision
Either stay with 1.9.48 (out of the box in the Cloud SDK for now, upgraded next week to 1.9.49) and make sure your pom.xml/Gradle build files are all using 1.9.48,
or use the standard appengine Maven plugin from https://github.com/GoogleCloudPlatform/appengine-maven-plugin
<plugin>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-maven-plugin</artifactId>
    <version>1.9.49</version>
</plugin>
or the equivalent Gradle plugin for App Engine (not the Cloud SDK one).
We had to push everything to 1.9.49 to get it to work. I'm not sure why Maven doesn't list it on the website yet...
It's there -- it's just not listed:
https://repo1.maven.org/maven2/com/google/appengine/appengine-api-1.0-sdk/1.9.49/appengine-api-1.0-sdk-1.9.49.jar
Try updating appEngineVersion to 1.9.49.
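If you pin the SDK directly in Maven, the dependency would look like this (using the artifact from the link above):

<dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-1.0-sdk</artifactId>
    <version>1.9.49</version>
</dependency>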

NoSuchMethodError with Camel RouteDefinition class

I am trying to debug a Java / Maven project with a lot of dependencies on various libraries.
When I run it on a Linux server the program starts up fine, but when I try to run it in Eclipse it throws the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.camel.model.RouteDefinition.getErrorHandlerBuilder()Lorg/apache/camel/ErrorHandlerFactory;
at org.apache.camel.spring.spi.SpringTransactionPolicy.wrap(SpringTransactionPolicy.java:69)
at org.apache.camel.model.PolicyDefinition.createProcessor(PolicyDefinition.java:133)
at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:437)
at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:183)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:817)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:165)
at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:697)
at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:1654)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:1441)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:1338)
at org.apache.camel.impl.ServiceSupport.start(ServiceSupport.java:67)
at org.apache.camel.impl.ServiceSupport.start(ServiceSupport.java:54)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:1316)
Now, I can see that the RouteDefinition class is in camel-core-2.9.3.jar, and I can see that this library is imported. So how come it doesn't see that method?
How do I go about debugging this?
Could I get info from the process running on the Linux server? For example, can I get the list of jars that are loaded and the order in which they are loaded?
Many thanks!
The error that you're getting is caused by Maven pulling in the wrong version of camel-core. Try deleting all versions out of your local repo, add the version you want explicitly to your pom, clean out all of your builds, pray to the Eclipse gods, etc. If it still gives you the error, check your local repo to see which wrong versions it pulled in, figure out what depends on them, and add explicit exclusions for them while keeping the explicit include.
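Two sketches that may help with the debugging part. First, you can ask the JVM at runtime which jar a class was actually loaded from, both in Eclipse and on the Linux server:

// Prints the jar RouteDefinition was loaded from (getCodeSource() is null for JDK classes)
System.out.println(org.apache.camel.model.RouteDefinition.class
        .getProtectionDomain().getCodeSource().getLocation());

Second, pinning camel-core explicitly in the pom (2.9.3 taken from the jar name mentioned in the question; keep it in line with your other Camel artifacts):

<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-core</artifactId>
    <version>2.9.3</version>
</dependency>

Running mvn dependency:tree -Dincludes=org.apache.camel will also show every Camel version being pulled in and by what.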

AWS Java SDK Error - java.lang.NoSuchMethodError

I've seen this type of error over here for exceptions that are thrown by various classes, though I haven't found the right solution for mine just yet.
I'm trying to get AWS Java SDK work locally so I can write a test application that reads data from a Kinesis stream.
Problem is, when I run the init() static method I encounter the following error:
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.http.impl.conn.DefaultClientConnectionOperator.<init>
(Lorg/apache/http/conn/scheme/SchemeRegistry;Lorg/apache/http/conn/DnsResolver;)V
Now, this is not the first error I've been thrown. I've been thrown four or five exceptions prior to this one, and the solution to all of them was just importing some jars into the project, e.g.:
- apache-httpcomponents-httpclient.jar
- com.fasterxml.jackson.databind.jar
- commons-codec-1.9.jar / commons-codec-1.9-javadoc.jar / commons-codec-1.9-sources.jar
- httpclient-4.2.jar
- httpcore-4.0.1.jar
I've seen in other threads around here that it could be the version of the httpcore library; however, I imported the latest one.
Any ideas how I can resolve this? I'm thinking about starting over, as my project seems to be a heap of imports I'm not sure I'll actually use. Furthermore, I can't debug the binary imports of the AWS SDK (or can I?).
Cheers.
Problem solved: I gradually added the missing libraries to the project. The Apache httpclient jar should be version 4.0 or later, with no earlier version on the classpath to conflict with it.
I imported httpclient-4.2.jar and it worked.
Other than that, I just solved the exception that followed by importing joda-time-2.4.jar and it's all up and running.
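If you later move from hand-imported jars to Maven, the equivalent dependencies would look roughly like this (versions taken from the jar names above):

<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.2</version>
</dependency>
<dependency>
    <groupId>joda-time</groupId>
    <artifactId>joda-time</artifactId>
    <version>2.4</version>
</dependency>

That way a matching httpcore is pulled in transitively instead of the mismatched httpcore-4.0.1.jar.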

javax.xml.parsers.SAXParserFactory ClassCastException

I get on my local machine the following exception when running the tests by maven (mvn test).
ch.qos.logback.core.joran.event.SaxEventRecorder#195ed659 - Parser configuration error occured
java.lang.ClassCastException: com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl cannot be cast to javax.xml.parsers.SAXParserFactory
After googling around I came across several pages which describe the main problem behind it (several SAXParserFactoryImpl in different classloaders).
-> http://www.xinotes.org/notes/note/702/
My question is: how can I figure out which library is also providing a SAXParserFactoryImpl, so that I can exclude it? I am using Maven, IntelliJ and JDK 1.6.0_23. The issue occurs on the command line as well as when running the tests from IntelliJ.
But the strange thing is that this issue doesn't occur on the build server.
Update 1
Just figured out that when I run mvn test for the first time after an mvn clean, the error doesn't appear. But as soon as I run mvn test again (without clean), the exception occurs (when I run it from IntelliJ).
When I run it on the command line, several consecutive mvn test calls do work.
I found the issue. It was related to PowerMockito, which tried to load the SAXParserFactory. The reason I hadn't figured that out sooner is that the stack trace mentioned PowerMockito only twice, and in the middle at that :-)
So if you hit this problem in IntelliJ and you use PowerMockito, annotate your test class with the following annotation:
@PowerMockIgnore({"javax.management.*", "javax.xml.parsers.*",
        "com.sun.org.apache.xerces.internal.jaxp.*", "ch.qos.logback.*", "org.slf4j.*"})
This has solved the problem in my case.
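In context, a minimal sketch of such a test class (it assumes powermock-module-junit4 on the classpath; the class and test names are made up):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.core.classloader.annotations.PowerMockIgnore;
import org.powermock.modules.junit4.PowerMockRunner;

// Keep the XML parser and logging classes out of PowerMock's classloader so the
// JDK's SAXParserFactoryImpl is loaded by the system classloader as usual.
@RunWith(PowerMockRunner.class)
@PowerMockIgnore({"javax.management.*", "javax.xml.parsers.*",
        "com.sun.org.apache.xerces.internal.jaxp.*", "ch.qos.logback.*", "org.slf4j.*"})
public class SomeServiceTest {

    @Test
    public void saxParserFactoryLoadsWithoutClassCastException() {
        // Exercises the same lookup that logback's SaxEventRecorder performs
        javax.xml.parsers.SAXParserFactory.newInstance();
    }
}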
Your JDK probably has its own SAXParserFactoryImpl.
Check for jars like xercesImpl, xml/xml-api and sax.
On your server, the one bundled with the server is probably the one being used.
You can use a jarfinder: http://www.jarfinder.com/index.php/java/search/~SAXParserFactoryImpl~
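To see at runtime which implementation actually wins the lookup, and which jar it comes from, a quick diagnostic sketch:

import javax.xml.parsers.SAXParserFactory;

public class SaxFactoryProbe {
    public static void main(String[] args) {
        SAXParserFactory factory = SAXParserFactory.newInstance();
        // The implementation class that won the JAXP lookup
        System.out.println(factory.getClass().getName());
        // The jar it was loaded from (null means it came from the JDK itself)
        System.out.println(factory.getClass().getProtectionDomain().getCodeSource());
    }
}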
I encountered the same error today. After a lot of digging, I found that the solutions here and elsewhere were not helpful.
However, after playing around, I found a solution that works deterministically, unlike the accepted answer, which does not apply to all cases.
The answer is: look through the stack trace for any ClassCastExceptions and just add the offending packages to the @PowerMockIgnore list. Keep repeating until the issue is solved. Worked like magic for me.
