We've been getting a NoSuchMethodError on a simple getter method of one of our classes. The odd thing is that we can watch the error occur in the debugger (by stepping over the relevant line), yet the IDE (IntelliJ IDEA) shows that the method does exist.
Evaluating xxxx.getYYY() works fine in the IDE expression evaluator, and xxxx.getClass().getMethods() shows getYYY() in the list. We have tried cleaning out all the built files, IDE output directories and IDE caches, rebooting, etc., and nothing helps.
I would understand a NoSuchMethodError if we had compiled against one version of the class but a different jar/class were being found at runtime. But that doesn't explain why, while stopped on the line in question at runtime, we can see the method is there, yet stepping over the line throws the error.
We tried reproducing this on another machine, but it does not occur there.
Does anyone have any insight into what could be happening here?
Most likely you are not running the same version of the code in IntelliJ as the one you are editing. I hit this problem often with lots of Maven projects open at once whose dependency versions differ from the module I am editing. IntelliJ can get confused (or I get confused about which version I am actually running).
NoSuchMethodError mostly occurs at runtime. When I came across this error, the cause was incompatible versions of the asm and cglib libraries on my classpath.
The asm and cglib libraries are used by many frameworks, such as Hibernate, Spring and Hadoop, for runtime bytecode manipulation.
The class loader resolves a class from the first jar on the classpath that contains it, so a stale or duplicate jar earlier on the classpath can shadow the version you compiled against.
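A quick way to confirm what is actually being loaded at runtime is to ask the class itself; here is a minimal sketch with placeholder names (replace com.example.Xxxx and getYYY with the real class and getter):

    import java.util.Arrays;

    public class WhichJar {
        public static void main(String[] args) throws Exception {
            // Placeholder class name; use the class that throws NoSuchMethodError.
            Class<?> c = Class.forName("com.example.Xxxx");
            // The jar or directory this class was actually resolved from.
            System.out.println("Loaded from: "
                    + c.getProtectionDomain().getCodeSource().getLocation());
            // Whether the expected getter exists in the version that was loaded.
            System.out.println("Has getYYY(): "
                    + Arrays.stream(c.getMethods()).anyMatch(m -> m.getName().equals("getYYY")));
        }
    }

If the reported location is not the jar you expect, an earlier entry on the classpath is shadowing the version you compiled against.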
I am using sca-maven-plugin to scan a multi-module Maven project. Unfortunately I ran into the following warning while executing the translate step:
[warning]: The following references to java functions could not be
resolved. These functions may be part of classes that could not be
found, or there may be a type error at the call site of the given
function relative to the function declaration. Please ensure the java
source code can be compiled by a java compiler.
The code can be compiled by a Java compiler, which leads me to believe that something is wrong with the classpath. I'm not convinced, though, because as I understand it Maven handles the classpath and passes it to sourceanalyzer. How do I solve this?
Check the maven-compiler-plugin configuration you are using. Maybe some of the code was compiled with a higher version of Java than the one being used here.
Also make sure all the dependencies are declared properly and their jars are part of the classpath.
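If you suspect a bytecode version mismatch, a quick standalone check (just a sketch, not part of the plugin) is to read the major version straight out of a compiled .class file:

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;

    // Prints the class-file major version: 50 = Java 6, 51 = Java 7, 52 = Java 8, ...
    public class ClassVersionCheck {
        public static void main(String[] args) throws IOException {
            try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
                if (in.readInt() != 0xCAFEBABE) {   // every class file starts with this magic number
                    System.err.println(args[0] + " is not a class file");
                    return;
                }
                int minor = in.readUnsignedShort(); // minor version, usually 0
                int major = in.readUnsignedShort();
                System.out.println(args[0] + " -> major version " + major
                        + " (Java " + (major - 44) + ")");
            }
        }
    }

If the classes target a newer Java than the one sourceanalyzer runs under, that alone can lead to unresolved references.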
Please share your pom.xml
I am using Eclipse Kepler and Drools 6.0.1 Final, on JRE 7. The basic installation works: I can run rules and even see a refactor apply to both Java and DRL files. I can add more detail if needed, but I don't think it affects my core question.
My problem occurs when I want to incorporate an existing project. That project lives in NetBeans, so I want to just pull in its jars. I have added them via Properties -> Java Build Path -> Libraries. This works fine and executes correctly from the Java side (I've added the external jars in a variety of different ways and nothing changes the problem I am getting). However, the Drools compiler does not appear to find the same jars and hence raises Drools errors (with a different symbol, listed as "Drools Error" in the Problems list). I have spent ages trying to figure out where I might reference this external jar so Drools can see it, but with zilch success. If I disable the Drools builder in the list of builders (and clear the existing problems), then no more are raised. With the Drools builder enabled, I get "xxx cannot be resolved to a type" or "Error importing xxx" etc. It's as if the Drools compiler either has its own idea of the classpath, or just isn't bothering to scan something.
How does the Drools compiler decide where to find libraries and external jars? All the behaviour I am getting suggests it does this differently from Java compilation, which feels like a bad way for it to behave.
Any pointers most welcome.
I am currently working in a very old Java Eclipse project which has a lot of JARs linked to its build path.
I have noticed that several of them aren't used by the project any more: old libraries that have been forgotten as the code evolved over the years.
A standard way to determine whether a library is used is to simply remove it and see if there are any compilation errors.
However, I have noticed that some libraries on the build path are only invoked via reflection, so if I remove one of them I won't get any compilation errors, but the project will crash at runtime when it can no longer find the code. The problem is you don't know when that will happen.
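To make the reflection case concrete, something like the following (the library class name is made up) compiles happily even with its jar removed from the build path, and only fails when the line actually runs:

    // Hypothetical example: the library class is referenced only by its string name,
    // so removing its jar causes no compile error, just a ClassNotFoundException at runtime.
    public class ReflectiveUse {
        public static void main(String[] args) throws Exception {
            Object codec = Class.forName("com.thirdparty.codec.LegacyCodec")
                    .getDeclaredConstructor()
                    .newInstance();
            System.out.println("Loaded " + codec.getClass().getName());
        }
    }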
Is there a tool I can use to figure out which of the linked JAR libraries I can safely remove, without causing compilation errors or runtime reflection failures?
Check ClassPathHelper.
In NetBeans, I used some Scala code (a jar) written by someone else and included it in a Java project along with Scala-library.jar. It worked nicely without hiccups.
Now when I try to do the same using Eclipse, I get the following build error:
Internal compiler error: java.lang.ClassCastException:
org.eclipse.jdt.internal.compiler.lookup.BaseTypeBinding cannot be cast to
org.eclipse.jdt.internal.compiler.lookup.ReferenceBinding at
org.eclipse.jdt.internal.compiler.lookup.BinaryTypeBinding.initializeTypeVariable(BinaryTypeBinding.java:944) DemoApp.java /demo/ line 0 Java Problem
On Googling, I found that others have had this problem but not seen any fixes.
If any of you have seen this error and figured how to fix it, please share it here.
Let me know if any other information is needed. Unfortunately, I do not have the source of the Scala code that I used, just the jar. If you need the code of DemoApp.java, I can paste it here, but that is not very useful: it just references an object in the Scala code.
Details: scala-2.8.0.r22602-b20100720020114
Thanks.
One of the problems of Scala is its lack of binary compatibility between different versions.
Either use the same library version with which the original Jar was compiled, or recompile the Jar (if that's an option).
Do you have JDT Weaving enabled? Go to Preferences -> JDT Weaving to find out. If it is disabled, then you may have inexplicable errors in your IDE.
OK, I finally found the solution, thanks to this SO question.
The problem seems to be the Scala 2.8 compiler (apparently); the issue is not present in 2.9.
One of the suggested fixes was to use Scala 2.9, but that is not always possible. So here is the solution that worked for me.
The problem lies in Scala's List type. I found that I was returning (exposing) a List somewhere in the Scala code, which was causing the problem for Java in Eclipse.
To fix the problem, do not return List; return an Array or some Java type instead.
I've run across a really weird issue. In Eclipse I have a codebase I've been working on for a couple of weeks and it has been working fine. I did an svn update and all of a sudden one of my classes doesn't compile because an enum in the same package cannot be resolved to a type.
I've checked the Java version and I'm running under Java 6 so enums should be supported.
Also it worked up till yesterday and now it doesn't.
Has anyone else seen this kind of behaviour? I've reloaded Eclipse, but beyond that I don't know where to start diagnosing it.
If it does say "Step cannot be resolved to a type", just try and clean the project (Project -> Clean). Eclipse gets confused sometimes, and a clean usually helps.
I had this recently. It turned out that someone had committed jars that conflicted (they contained a previous build) and had put them on the build path. Check recent commits to see if that's the problem, or what could have caused it.
However, I would definitely do a clean build first within Eclipse, and check whether the Ant/Maven build is also affected (I assume you have such build scripts).
Weird idea, but could it be that Eclipse is trying to compile your class using a 1.4.2 compiler level and isn't recognizing the enum?
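For what it's worth, even a trivial enum like this sketch (the constant names are made up) will not compile when the compiler compliance level is below 5.0, and everything that uses it then shows up as "cannot be resolved to a type":

    // "enum" only became a keyword in Java 5; with 1.4 source compliance this
    // declaration itself fails, and every reference to Step stops resolving.
    public enum Step {
        FIRST, SECOND, THIRD
    }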
I unloaded the project and reloaded it and it just works... No idea what the original issue was...