Getting java.lang.NoSuchFieldError: usingExperimentalRuntime in GAE MemCache - java

Whenever I try to access MemCache I get a java.lang.NoSuchFieldError: usingExperimentalRuntime. I can't find any documentation on that field. Is there some configuration option that I'm not aware of?
Here's the source code:
MemcacheService syncCache = MemcacheServiceFactory.getMemcacheService();
syncCache.setErrorHandler(ErrorHandlers.getConsistentLogAndContinue(Level.ALL));
Object cacheObject = syncCache.get("arbitrary");
That last line crashes with this error (partial stack trace up to my code):
Caused by: java.lang.NoSuchFieldError: usingExperimentalRuntime
at com.google.appengine.api.memcache.MemcacheServicePb$MemcacheGetRequest.writeTo(MemcacheServicePb.java:1511)
at com.google.appengine.repackaged.com.google.protobuf.AbstractMessageLite.toByteArray(AbstractMessageLite.java:41)
at com.google.appengine.api.memcache.MemcacheServiceApiHelper.makeAsyncCall(MemcacheServiceApiHelper.java:97)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.doGet(AsyncMemcacheServiceImpl.java:405)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.getIdentifiable(AsyncMemcacheServiceImpl.java:422)
at com.google.appengine.api.memcache.MemcacheServiceImpl.getIdentifiable(MemcacheServiceImpl.java:54)
at com.myCode.CacheOrDbUtil.getUser(CacheOrDbUtil.java:27)
What makes this so strange is that the code was working last week, complete with unit tests using MemCache. Now they are failing. Of course I've tried undoing everything I've done, but without success:
Upgrading gcloud
Downgrading gcloud
Shutting down all other servers
Rebooting the machine
Resetting to the last known good code revision

Either stay with 1.9.48 (out of the box in the Cloud SDK for now; it is due to be upgraded to 1.9.49 next week) and make sure your pom.xml/Gradle build files all use 1.9.48,
or use the standard App Engine Maven plugin from https://github.com/GoogleCloudPlatform/appengine-maven-plugin:
<plugin>
  <groupId>com.google.appengine</groupId>
  <artifactId>appengine-maven-plugin</artifactId>
  <version>1.9.49</version>
</plugin>
or the equivalent Gradle plugin for App Engine (not the Cloud SDK one).
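If you go the Maven route, one way to keep the SDK JAR and the plugin in lock-step (a sketch; the appengine.version property name is just illustrative) is to declare the version once and reference it from both places:

<properties>
  <appengine.version>1.9.49</appengine.version>
</properties>

<dependencies>
  <dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-1.0-sdk</artifactId>
    <version>${appengine.version}</version>
  </dependency>
</dependencies>

<build>
  <plugins>
    <plugin>
      <groupId>com.google.appengine</groupId>
      <artifactId>appengine-maven-plugin</artifactId>
      <version>${appengine.version}</version>
    </plugin>
  </plugins>
</build>

That way a single property bump moves everything together, and a mixed-version classpath (the usual cause of this NoSuchFieldError) can't creep in.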

We had to push everything to 1.9.49 to get it to work. I'm not sure why Maven doesn't list it on the website yet...
It's there -- it's just not listed:
https://repo1.maven.org/maven2/com/google/appengine/appengine-api-1.0-sdk/1.9.49/appengine-api-1.0-sdk-1.9.49.jar

Try updating appEngineVersion to 1.9.49.

Related

Suddenly getting NoSuchFieldError: usingExperimentalRuntime in App Engine with GCE libraries

I just did a push (with no code changes) of my App Engine app, and it began throwing:
java.lang.NoSuchFieldError: usingExperimentalRuntime
at com.google.appengine.api.memcache.MemcacheServicePb$MemcacheGetRequest.getSerializedSize(MemcacheServicePb.java:1597)
at com.google.appengine.repackaged.com.google.protobuf.AbstractMessageLite.toByteArray(AbstractMessageLite.java:44)
at com.google.appengine.api.memcache.MemcacheServiceApiHelper.makeAsyncCall(MemcacheServiceApiHelper.java:97)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.doGet(AsyncMemcacheServiceImpl.java:405)
at com.google.appengine.api.memcache.AsyncMemcacheServiceImpl.get(AsyncMemcacheServiceImpl.java:412)
at com.google.appengine.api.memcache.MemcacheServiceImpl.get(MemcacheServiceImpl.java:49)
at com.google.appengine.api.appidentity.AppIdentityServiceImpl.getAccessToken(AppIdentityServiceImpl.java:288)
at com.google.api.client.googleapis.extensions.appengine.auth.oauth2.AppIdentityCredential.intercept(AppIdentityCredential.java:98)
at com.google.api.client.googleapis.extensions.appengine.auth.oauth2.AppIdentityCredential$AppEngineCredentialWrapper.intercept(AppIdentityCredential.java:243)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:868)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
...
I am using version 1.9.80 of the following dependencies:
appengine-api-1.0-sdk
appengine-tools-sdk
appengine-maven-plugin
I have not changed the dependencies or code in any way since my last successful push several weeks ago. I simply did a:
mvn clean install
mvn appengine:update
and it deployed successfully.
My service is dead in the water, as I don't seem to be able to roll back.
All the questions I've seen on this subject were from years ago, and they recommended going to version 1.9.49. We're way past that now. What magic do I need in order to get all my dependencies to stop looking for nonexistent fields in other dependencies?
For this particular case, if you're seeing the error message java.lang.NoSuchFieldError: usingExperimentalRuntime and your dependencies were recently updated to a newer version, make sure that all of your JARs are updated to the current version, with none left at previous versions.
For instance, mixing some JARs from version 1.9.78 with others from version 1.9.80 can cause this issue; move all of your JARs to 1.9.80.
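A quick way to spot stragglers in a Maven build is to filter the dependency tree to the App Engine group:

mvn dependency:tree -Dincludes=com.google.appengine

Every listed artifact should resolve to the same version. If a transitive dependency still drags in an older JAR, you can force the version in dependencyManagement (a sketch, using the 1.9.80 from the question):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.appengine</groupId>
      <artifactId>appengine-api-1.0-sdk</artifactId>
      <version>1.9.80</version>
    </dependency>
  </dependencies>
</dependencyManagement>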

Exception when running DL4J example

I have cloned the DL4J examples and am just trying to run one of them: LogDataExample.java. The project builds successfully and everything seems fine, except that when I start it the following exception is thrown:
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.util.concurrent.SingleThreadEventExecutor.<init>(Lio/netty/util/concurrent/EventExecutorGroup;Ljava/util/concurrent/Executor;ZLjava/util/Queue;Lio/netty/util/concurrent/RejectedExecutionHandler;)V
at io.netty.channel.SingleThreadEventLoop.<init>(SingleThreadEventLoop.java:65)
at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:78)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:73)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:60)
at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:50)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.datavec.transform.logdata.LogDataExample.main(LogDataExample.java:85)
I was not able to find anything online that would help me fix this. My code is exactly the same as in the example.
pom.xml contains the following:
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-all</artifactId>
  <version>4.1.46.Final</version>
</dependency>
I think you are forcing a newer version of netty than Spark supports.
By running mvn dependency:tree you can see which version of netty Spark wants, and use that instead of the one you've defined (see the sketch below).
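For example, if dependency:tree shows Spark pulling in netty 4.1.17.Final (a hypothetical number; use whatever your build actually reports), the pin in the pom would become:

<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-all</artifactId>
  <!-- hypothetical version: match what mvn dependency:tree reports for Spark -->
  <version>4.1.17.Final</version>
</dependency>

Alternatively, simply deleting the explicit netty-all dependency lets Maven resolve the version Spark declares transitively.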
If you don't care about Spark, but want to just use DataVec to transform your data, take a look at https://www.dubs.tech/guides/quickstart-with-dl4j/. It is a little bit outdated concerning the dependencies, but the datavec part shows how to use it without spark.

NoSuchMethodError with Camel RouteDefinition class

I am trying to debug a Java / Maven project with a lot of dependencies on various libraries.
When I run it on a Linux server the program starts up fine, but when I try to run it in Eclipse it throws the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.camel.model.RouteDefinition.getErrorHandlerBuilder()Lorg/apache/camel/ErrorHandlerFactory;
at org.apache.camel.spring.spi.SpringTransactionPolicy.wrap(SpringTransactionPolicy.java:69)
at org.apache.camel.model.PolicyDefinition.createProcessor(PolicyDefinition.java:133)
at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:437)
at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:183)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:817)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:165)
at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:697)
at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:1654)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:1441)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:1338)
at org.apache.camel.impl.ServiceSupport.start(ServiceSupport.java:67)
at org.apache.camel.impl.ServiceSupport.start(ServiceSupport.java:54)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:1316)
Now, I can see that the RouteDefinition class is in camel-core-2.9.3.jar, and I can see that this library is imported. So how come it doesn't see that method?
How do I go about debugging this?
Could I get info from the process running on the Linux server? For example, can I get the list of JARs on the classpath and the order in which they are loaded?
Many thanks!
The error that you're getting is caused by Maven pulling in the wrong version of camel-core. Try deleting all versions out of your local repo, adding the one you want explicitly to your pom, cleaning out all of your builds, praying to the Eclipse gods, etc. If it still gives you the error, check your local repo to see which wrong versions it pulled in, figure out what depends on them, and add explicit exclusions for them while keeping the explicit include, as in the sketch below.
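For instance (a sketch assuming Camel 2.9.3 from the question; com.example:some-lib is a hypothetical dependency that drags in a conflicting camel-core):

<dependency>
  <groupId>org.apache.camel</groupId>
  <artifactId>camel-core</artifactId>
  <version>2.9.3</version>
</dependency>
<dependency>
  <!-- hypothetical dependency that pulls in a different camel-core -->
  <groupId>com.example</groupId>
  <artifactId>some-lib</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.camel</groupId>
      <artifactId>camel-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>

Running mvn dependency:tree -Dincludes=org.apache.camel will show which versions are actually being resolved and where they come from.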

AWS Java SDK Error - java.lang.NoSuchMethodError

I've seen this type of error over here for exceptions that are thrown by various classes, though I haven't found the right solution for mine just yet.
I'm trying to get AWS Java SDK work locally so I can write a test application that reads data from a Kinesis stream.
Problem is, when I run the init() static method I encounter the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.http.impl.conn.DefaultClientConnectionOperator.<init>(Lorg/apache/http/conn/scheme/SchemeRegistry;Lorg/apache/http/conn/DnsResolver;)V
Now, this is not the first error I've been thrown. I've been thrown four or five exceptions prior to this one, and the solution to all of them was just importing some jar's into the project. e.g.:
apache-httpcomponents-httpclient.jar
com.fasterxml.jackson.databind.jar
commons-codec-1.9.jar / commons-codec-1.9-javadoc.jar / commons-codec-1.9-sources.jar
httpclient-4.2.jar
httpcore-4.0.1.jar
I've seen in other threads around here that it could be the version of the httpcore library; however, I imported the latest one.
Any ideas how I can resolve this? I'm thinking about starting over, as my project seems to be a heap of imports I'm not sure I'll actually utilize. Furthermore, I can't debug the binary imports of the AWS SDK (or can I?).
Cheers.
Problem solved. I gradually added the missing libraries to the project; the key point is that the Apache httpclient JAR should be version 4.0 or later, with no earlier version on the classpath to contradict it.
I imported httpclient-4.2.jar and it worked.
Other than that, I just solved the exception that followed by importing joda-time-2.4.jar, and it's all up and running.
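If you'd rather not hand-pick JARs at all, a single Maven dependency on the Kinesis module of the SDK pulls in mutually consistent versions of httpclient, httpcore, jackson, and joda-time transitively (a sketch; the version number is illustrative, pick a current one):

<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-java-sdk-kinesis</artifactId>
  <!-- illustrative version; use the latest release -->
  <version>1.11.1000</version>
</dependency>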

geotools 10 RC1 + hsqldb 2.3

I'm currently working on a Swing project that uses hsqldb 2.3 as an optional database...
This project has a map, and GeoTools also uses hsqldb; however, it uses hsqldb 1.8.
I tried to get them both working together, but I get this exception:
"Caused by: java.lang.ClassNotFoundException: org.hsqldb.jdbc.jdbcDataSource"
I checked the source code, and I believe this happens because in 2.3 the "jdbc" part of the class name is upper-case:
"org.hsqldb.jdbc.JDBCDataSource"
I don't know what to do from here. If I add both jars I will get a class conflict error.
Any suggestions are welcome.
It seems there are also some other dependencies on hsqldb 1.8. You can start by modifying GeoTools, changing the references to the new class name. The SQL statements in the GeoTools scripts are generally compatible, but some usages may need updating; you will find out if you get an error when the scripts are run.
https://github.com/geotools/geotools/tree/master/modules/plugin/epsg-hsql/src/main/java/org/geotools/referencing/factory/epsg
Note there is some use of CREATE ALIAS in source code which may be redundant and can be removed.
See the resources directory in the same source tree for the SQL.
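For reference, the rename in question (a minimal sketch; the in-memory URL and the SA account are just placeholders): hsqldb 1.8 exposes org.hsqldb.jdbc.jdbcDataSource, while 2.x calls the same class org.hsqldb.jdbc.JDBCDataSource, so the patched GeoTools code would construct it along these lines:

import java.sql.Connection;
import java.sql.SQLException;
import org.hsqldb.jdbc.JDBCDataSource; // 1.8 spelled this org.hsqldb.jdbc.jdbcDataSource

public class HsqlRenameSketch {
    public static void main(String[] args) throws SQLException {
        JDBCDataSource ds = new JDBCDataSource();
        ds.setDatabase("jdbc:hsqldb:mem:epsg"); // placeholder in-memory database
        ds.setUser("SA");                       // hsqldb's default account
        try (Connection conn = ds.getConnection()) {
            System.out.println("hsqldb version: "
                + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}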
