Scala unsatisfiable cyclic dependency in "table-layout" library (Toolkit class) - java

When I try to compile some code containing an instance of a Table from this library with sbt, I get this error:
java.lang.AssertionError: assertion failed: unsatisfiable cyclic dependency in 'class Toolkit'
It works from Java, so I don't understand why it fails in Scala.
Here is the Toolkit class: http://code.google.com/p/table-layout/source/browse/branches/v1/tablelayout/src/com/esotericsoftware/tablelayout/Toolkit.java
As long as I get this error, my project is completely blocked :(.
Edit: it works with every milestone of Scala 2.10.0, but that version of Scala doesn't work with Android yet (or at least not with libgdx). So I still need a way to solve this problem, even if it's a bit constraining.

It will compile if you force scalac to load the dependencies in the correct order, like this:
classOf[com.esotericsoftware.tablelayout.Toolkit[_,_,_]]
println(new com.badlogic.gdx.scenes.scene2d.ui.Table toString)
This must be a bug that was accidentally fixed in 2.10.
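For context, a minimal sketch of where the workaround goes (the Main object is hypothetical, and it assumes libgdx and table-layout are on the classpath):
object Main {
  def main(args: Array[String]): Unit = {
    // Reference Toolkit's class literal before the first use of Table, so
    // scalac resolves Toolkit first and never hits the cyclic-dependency
    // assertion. At runtime this line is effectively a no-op.
    classOf[com.esotericsoftware.tablelayout.Toolkit[_, _, _]]
    val table = new com.badlogic.gdx.scenes.scene2d.ui.Table
    println(table)
  }
}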

What are the possible causes of AbstractMethodError?
Exception in thread "pool-1-thread-1" java.lang.AbstractMethodError:
org.apache.thrift.ProcessFunction.isOneway()Z
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:51)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at com.gemfire.gemstone.thrift.hbase.ThreadPoolServer$ClientConnnection.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
It usually means that you are using an old version of an interface implementation that is missing a newly added interface method. For example, the java.sql.Connection interface got a new getSchema method in Java 1.7. If you have a 1.6 JDBC driver and call Connection.getSchema, you will get an AbstractMethodError.
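To illustrate, a minimal sketch in Scala (the JDBC URL is hypothetical; the point is that the call compiles against the 1.7 interface but blows up at runtime against a driver built for 1.6):
import java.sql.{Connection, DriverManager}

object SchemaCheck {
  def main(args: Array[String]): Unit = {
    // Hypothetical URL. Compiles fine: getSchema is declared on
    // java.sql.Connection since Java 1.7.
    val conn: Connection = DriverManager.getConnection("jdbc:somedb://localhost/test")
    try {
      // Throws AbstractMethodError if the driver on the classpath was
      // compiled against the pre-1.7 interface and never implemented it.
      println(conn.getSchema)
    } finally conn.close()
  }
}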
The simple answer is this: some code is trying to call a method which is declared abstract. Abstract methods have no body and cannot be executed. Since you have provided so little information, I can't really elaborate on how this happened in your case; the compiler usually catches this problem, so as described below, it means the class must have changed incompatibly at runtime.
From the documentation of AbstractMethodError:
Thrown when an application tries to call an abstract method. Normally,
this error is caught by the compiler; this error can only occur at run
time if the definition of some class has incompatibly changed since
the currently executing method was last compiled.
A kind of special case of the above answer.
I had this error because I was using spring-boot-starter-parent (e.g. 2.1.0.RELEASE, which uses Spring version 5.1.2.RELEASE) but also included a BOM that defined some Spring dependencies in an older version (e.g. 5.0.9.RELEASE).
So one thing to do is check your dependency tree (in Eclipse, for example, you can use the Dependency Hierarchy view) to verify that you are using consistent versions.
One solution is to upgrade the Spring dependencies in your BOM; another is to exclude them (but depending on how many there are, this could get ugly).
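Outside the IDE, Maven's dependency plugin shows the same information on the command line (the includes filter shown here is just an illustration):
mvn dependency:tree -Dincludes=org.springframework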
If you download a project zip file, unzip it, and import it into Android Studio, you may be unable to run the project because of this error.
I got around the problem by uninstalling Android Studio, then downloading and installing the latest version.
I truly hope it helps.
If you are getting this error on implemented methods, make sure you have added your dependencies correctly, as mentioned in this thread.
As Damian quoted:
Normally, this error is caught by the compiler; this error can only
occur at run time if [...]
I had the same error, not caught by the compiler but raised at runtime. To solve it, I simply compiled again without modifying the code.
If you are getting this error on a minified build using ProGuard, check whether the class is a POJO; if so, exclude it from ProGuard using the rule below:
-keep class your.application.package.pojo.** {*;}
I had the same error when I imported an Eclipse project into the IntelliJ IDE. I imported it again without the .iml file, and the problem was solved.
I got this problem when I updated my Kotlin plugin to a new version while my pom file was still using the older Kotlin version. I guess it might help someone who is making the same mistake.
I am getting various of these and other errors infrequently on Android. I have to clean everything, change some configuration, rebuild, then change the configuration back to normal; for whatever reason the build tools just don't rebuild everything they should (obviously an Android Gradle bug).

Scala module requiring specific version of data bind for Spark

I am having issues trying to get Spark to load, read, and query a parquet file. The infrastructure seems to be set up (Spark standalone 3.0): the cluster is visible and picks up jobs.
The issue I am having is when this line is called
Dataset<Row> parquetFileDF = sparkSession.read().parquet(parquePath);
the following error is thrown
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.10.0 requires Jackson Databind version >= 2.10.0 and < 2.11.0
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
I looked into JacksonModule.setupModule, and when it gets to context.getMapperVersion, the version being passed is 2.9.10. It appears that DefaultScalaModule is pulling in some older version.
I'm using Gradle to build and have the dependencies set up as such
implementation 'com.fasterxml.jackson.core:jackson-core:2.10.0'
implementation 'com.fasterxml.jackson.core:jackson-databind:2.10.0'
implementation 'org.apache.spark:spark-core_2.12:3.0.0'
implementation 'org.apache.spark:spark-sql_2.12:3.0.0'
implementation 'org.apache.spark:spark-launcher_2.12:3.0.0'
implementation 'org.apache.spark:spark-catalyst_2.12:3.0.0'
implementation 'org.apache.spark:spark-streaming_2.12:3.0.0'
That didn't work, so I tried forcing databind:
implementation('com.fasterxml.jackson.core:jackson-databind') {
    version {
        strictly '2.10.0'
    }
}
I've tried a few different versions and still keep hitting this issue. Maybe I'm missing something super simple, but right now, I can't seem to get past this error.
Any help would be appreciated.
I was able to figure out the issue. I was pulling in a jar file from another project. The functionality in the jar file wasn't being used at all, so it wasn't suspect. Unfortunately, that project hadn't been updated, and some older Spark libraries in it were somehow being picked up by my running app. Once I removed the jar, the error went away. What's interesting is that the dependency graph didn't show anything about the libraries the other jar file was using.
I suppose if you run into a similar issue, double check any jar files being imported.
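One way to check a jar directly is to list what it bundles, since Gradle's dependency tree won't show the contents of a plain file dependency (the jar name here is hypothetical):
jar tf other-project.jar | grep -iE 'spark|jackson'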

GWT-Jackson-Apt seemingly undefined class constructor call

I'm looking at using the GWT-Jackson-Apt library for certain RPC tasks, but in the examples and demos there are always interfaces with a bizarre, seemingly undefined constructor call.
@JSONMapper
public interface SampleMapper extends ObjectMapper<SimpleBean> {
SampleMapper INSTANCE = new App_SampleMapperImpl();
}
source: https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/basic/basic-client/src/main/java/org/dominokit/jacksonapt/samples/basic/App.java
I've been digging around, but there is no definition of App_SampleMapperImpl() anywhere in the source code, and the code doesn't compile, complaining about an undefined symbol.
The exact same thing is done in the readme file's examples, which can be found on this page: https://github.com/DominoKit/gwt-jackson-apt/tree/f60d0358b90bcbf78d066796f680aeae1d7156bb
Can anyone explain what is going on here? How is this constructor being defined or implied? And what do I need to do to make the example compile?
Assuming you are making a Maven project, the important thing is to include the annotation processor which generates the mappers. Then, once the project knows how to generate them, you'll be able to use them in your code.
Annotation processors run while the compiler is running, which means you technically get to write code that doesn't look like it will compile. Then, as the compiler runs, it asks all registered annotation processors to generate code based on the annotations and existing types (not based on the missing references like App_SampleMapperImpl, as you might think). The processor runs, generates the missing class, and the compile continues.
Usually what happens is that you build while writing code (Eclipse, for example, does this every time a file is saved; IntelliJ does it when you ask for a build; etc.), and then the class exists and can be referenced going forward. Even when the project is cleaned and rebuilt, although the reference looks like it should not work, it will work as soon as the compiler runs.
In this case, we'll need to follow the example to make sure the processor is present. In https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/shared-mappers/shared-mappers-shared/pom.xml, we see this in the dependencies:
<dependency>
    <groupId>org.dominokit.jackson</groupId>
    <artifactId>jackson-apt-processor</artifactId>
    <version>1.0-SNAPSHOT</version>
    <scope>provided</scope>
</dependency>
This is marked scope=provided since it is only needed at compile time and shouldn't be included in later dependency graphs. Depending on your IDE, you may need to set additional options to get the processor to re-run automatically (a checkbox in Eclipse, nothing in IntelliJ I believe; I haven't used other IDEs recently enough to say).
One final note for Maven: you must use a relatively recent maven-compiler-plugin so that generated code is handled correctly. The latest is 3.8.0, published July 2018, but I think anything after 3.5.1 will be sufficient if you must use an older one.
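If you want to verify that the processor actually ran, a plain build should leave the generated sources on disk; the path below is the standard Maven default for annotation-processor output, not something specific to this library:
mvn clean compile
ls target/generated-sources/annotations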
Just follow the example on the main page of the project: https://github.com/DominoKit/gwt-jackson-apt/
Does that work?

How to debug project.clj to find which dependency is breaking my project compilation

I have just updated my system from Java 8 to OpenJDK 11. I have one project that won't compile, and I get the following error:
java.lang.IllegalArgumentException: Must hint overloaded method: toArray, compiling:(flatland/ordered/set.clj:19:1)
Exception in thread "main" java.lang.IllegalArgumentException: Must hint overloaded method: toArray, compiling:(flatland/ordered/set.clj:19:1)
From the looks of it, this error was fixed here: https://dev.clojure.org/jira/browse/CLJ-2374
So I updated my project to Clojure 1.10.0-RC3, and now I get this error:
Syntax error compiling deftype* at (flatland/ordered/set.clj:19:1).
Exception in thread "main" Syntax error compiling deftype* at (flatland/ordered/set.clj:19:1).
Has anyone seen this error, or is there a way to expand Clojure's error messages to show me which dependency in my project is failing during compilation (it could be multiple)?
I also noticed that when I copied the dependency list from the failing project into a new project, the new project compiled, even though I didn't reference the dependencies or call functions from them. Does Clojure only pull in and compile the libraries from my project.clj when they are actually required somewhere?
EDIT: I found that this is likely the cause:
https://github.com/amalloy/ordered/pull/37
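(A quick way to find out which top-level dependency drags in flatland/ordered transitively is Leiningen's dependency tree:)
lein deps :tree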
You already have an answer, though I thought I'd leave my general process for this in case it's useful for folks who come along later:
turn off any auto-AOT in my tooling so I can get a REPL without triggering the problem
load namespaces one at a time till I find one that triggers the problem (this usually doesn't take long ;-)
comment out half the dependencies of that namespace and evaluate the ns form at the top of the file
do a binary search till I find the one or two that trigger it
load just that dependency in a scrap project
... lots of effort ...
SUCCESS!
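Once the culprit is identified, the usual fix is to pin a newer version of it explicitly in project.clj so it overrides whatever an intermediate library pulls in transitively. A sketch (the version number is illustrative; check the library's changelog for the release that actually contains the fix):
:dependencies [[org.clojure/clojure "1.10.0"]
               [org.flatland/ordered "1.5.2"]]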

RemoteActorRefProvider ClassNotFound

I'm struggling trying to get remote actors setup in Scala. I'm running Scala 2.10.2 and Akka 2.2.1.
I compile using [I've shortened the paths in the classpath argument for clarity's sake]:
$ scalac -classpath "akka-2.2.1/lib:akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka:akka-2.2.1/lib/akka/scala-reflect-2.10.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-kernel_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:." [file.scala]
I've kept adding new libraries while trying to debug this. I'm pretty sure all I really need to include is akka-remote, but the others shouldn't hurt.
No issues compiling.
I attempt to run like this:
$ scala -classpath "[same as above]" [application]
And I receive a NoSuchMethodException:
java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getDeclaredConstructor(Class.java:2053)
...
Looking into the source code, it appears that Akka 2.2.X's flavor of this constructor takes 4 arguments (the Scheduler is removed). But in Akka < 2.2.X, the constructor takes 5 args.
Thus, I'm thinking my classpath isn't set up quite right. At compile time, Scala must be finding the pre-2.2.X flavor. I don't even know where it would be finding it, since I only have Akka 2.2.1 installed.
Any suggestions!? Thanks! (Please don't say to use SBT).
The problem here is that the Scala distribution contains akka-actor 2.1.0 and helpfully puts that in the boot class path for you. We very strongly recommend using a dependency manager like sbt or maven when building anything which goes beyond the most trivial projects.
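A quick way to confirm what's actually loaded is to print, from a scrap program, where akka-actor comes from and which constructors the loaded RemoteActorRefProvider exposes. This is just a diagnostic sketch (the object name is hypothetical, and it assumes akka-remote is on the classpath):
object WhichAkka {
  def main(args: Array[String]): Unit = {
    // Where does akka-actor get loaded from? (This can print null for
    // classes loaded from the boot classpath, which is itself a clue.)
    println(classOf[akka.actor.ActorSystem].getProtectionDomain.getCodeSource)
    // If only a 4-argument constructor is listed while the error asks for
    // 5 arguments, the caller (akka-actor) must come from an older Akka.
    classOf[akka.remote.RemoteActorRefProvider].getDeclaredConstructors.foreach(println)
  }
}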
As noted in another answer, the problem is that scala puts a different version of Akka into the bootclasspath.
To more directly answer your question (since you said you don't want to use sbt): you can run your program with java instead of scala. You just have to put the appropriate Scala jars on the classpath.
Here is a spark-dev message about the problem. The important part is: "the workaround is to use java to launch the application instead of scala. All you need to do is to include the right Scala jars (scala-library and scala-compiler) in the classpath."
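For example, something along these lines; the jar paths match the layout from the question, and the main class name is a placeholder:
java -classpath "akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:." MyRemoteApp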
