Unable to add watch expressions in eclipse - java

I am using an Eclipse-based development environment - the Groovy/Grails Tool Suite (GGTS). The issue seems to be that Eclipse is not able to find some JARs, and I don't understand where to put them. Below are the details.
I am trying to step through the code. However, when I put a watch on any variable it gives an error, even though the same variable evaluates correctly in the Variables window. One observation: when there are a lot of expressions in my Expressions window, the debugger slows down - it takes time to evaluate those expressions but gives an error for all of them. When I remove all expressions it steps through without any slowdown. A similar error occurs when I use the Display window.
Below is a screenshot of what's happening.
I checked the Eclipse error log file. There are many errors, but the one below seems to be relevant:
!ENTRY org.grails.ide.eclipse.groovy.debug.core 4 0 2015-08-19 12:40:09.963
!MESSAGE Internal error logged from Groovy Core Debug:
!STACK 0
java.lang.Exception: (Groovy) Complete snippet:
/////start
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
println it
/////end
at org.grails.ide.eclipse.groovy.debug.core.evaluation.GroovyJDIEvaluator.performEvaluate(GroovyJDIEvaluator.java:202)
at org.grails.ide.eclipse.groovy.debug.core.evaluation.GroovyJDIEvaluator$1$1.run(GroovyJDIEvaluator.java:160)
at org.eclipse.jdt.internal.debug.core.model.JDIThread.runEvaluation(JDIThread.java:764)
at org.grails.ide.eclipse.groovy.debug.core.evaluation.GroovyJDIEvaluator$1.run(GroovyJDIEvaluator.java:165)
at org.eclipse.jdt.internal.debug.core.model.JDIThread$ThreadJob.run(JDIThread.java:3157)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
Caused by: org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
____Eval.groovy: 8: unable to resolve class org.slf4j.Logger
@ line 8, column 18.
import org.slf4j.Logger;
^
____Eval.groovy: 9: unable to resolve class org.slf4j.LoggerFactory
@ line 9, column 18.
import org.slf4j.LoggerFactory;
^
2 errors
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:313)
at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:1040)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:647)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:596)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:279)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:258)
at org.grails.ide.eclipse.groovy.debug.core.evaluation.GroovyJDIEvaluator.convertSnippetToScript(GroovyJDIEvaluator.java:349)
at org.grails.ide.eclipse.groovy.debug.core.evaluation.GroovyJDIEvaluator.performEvaluate(GroovyJDIEvaluator.java:183)
... 5 more
It seems like it needs some JARs, but which ones, and where do I put them? I tried adding slf4j-api-1.7.2.jar, which contains these classes, to %JAVA_HOME%/lib and %JAVA_HOME%\jre\lib and also to the GGTS plugins folder, but it did not help.

Related

"No such method" using NCTOOLBOX (including .jar) within a compiled MATLAB App

My goal is to use the NCTOOLBOX (https://github.com/nctoolbox/nctoolbox/wiki/) in a compiled MATLAB app.
Currently, I'm able to use the toolbox as intended when I run my code NOT from the compiled app.
However, when I try to perform the same tasks in the compiled app, it seems my scripts are struggling to find the right java drivers.
I'm relatively familiar with compiling apps using MATLAB. There are other java drivers which I successfully use in the same compiled app. The approach I use for those java drivers is to add them in the 'settings' menu of the Application Compiler.
For example, in the field "Additional parameters passed to the mcc:" I have a series of .jar files which I add like so (this driver is not the one causing the issue, it is just an example):
-a "C:\someFilePath\postgresql-42.2.12.jar"
^^ when I run my app NOT in its compiled state, the same java driver is added like so (and successfully):
javaaddpath('C:\someFilePath\postgresql-42.2.12.jar');
The NCTOOLBOX script which is erroring on me is "ncdataset.m" and the exact line of code I'm erroring out on is:
obj.netcdf = ucar.nc2.dataset.NetcdfDataset.openDataset(url)
Here is the rest of the error readout (the line of code shared above is line 87, referenced in the following error readout):
Error using ncdataset (line 87)
Java exception occurred:
java.io.IOException: java.lang.RuntimeException: java.lang.NoSuchMethodError: ucar.nc2.grib.grib2.Grib2IndexProto$GribIdSection.emptyIntList()Lcom/google/protobuf/Internal$IntList;
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:401)
at ucar.nc2.dataset.NetcdfDataset.openProtocolOrFile(NetcdfDataset.java:831)
at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:479)
at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:461)
at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:442)
at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:426)
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodError: ucar.nc2.grib.grib2.Grib2IndexProto$GribIdSection.emptyIntList()Lcom/google/protobuf/Internal$IntList;
at ucar.nc2.NetcdfFile.<init>(NetcdfFile.java:1634)
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:798)
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:398)
... 5 more
Caused by: java.lang.NoSuchMethodError: ucar.nc2.grib.grib2.Grib2IndexProto$GribIdSection.emptyIntList()Lcom/google/protobuf/Internal$IntList;
at ucar.nc2.grib.grib2.Grib2IndexProto$GribIdSection.<init>(Grib2IndexProto.java:140)
at ucar.nc2.grib.grib2.Grib2IndexProto$GribIdSection.<clinit>(Grib2IndexProto.java:1395)
at ucar.nc2.grib.grib2.Grib2Index.makeIdProto(Grib2Index.java:334)
at ucar.nc2.grib.grib2.Grib2Index.makeRecordProto(Grib2Index.java:286)
at ucar.nc2.grib.grib2.Grib2Index.makeIndex(Grib2Index.java:243)
at ucar.nc2.grib.GribIndex.readOrCreateIndexFromSingleFile(GribIndex.java:94)
at ucar.nc2.grib.collection.Grib2CollectionBuilder.makeGroups(Grib2CollectionBuilder.java:84)
at ucar.nc2.grib.collection.GribCollectionBuilder.createMultipleRuntimeCollections(GribCollectionBuilder.java:128)
at ucar.nc2.grib.collection.GribCollectionBuilder.createIndex(GribCollectionBuilder.java:120)
at ucar.nc2.grib.collection.GribCdmIndex.openGribCollectionFromDataFile(GribCdmIndex.java:825)
at ucar.nc2.grib.collection.GribCdmIndex.openGribCollectionFromDataFile(GribCdmIndex.java:804)
at ucar.nc2.grib.collection.GribCdmIndex.openGribCollectionFromRaf(GribCdmIndex.java:774)
at ucar.nc2.grib.collection.GribIosp.open(GribIosp.java:201)
at ucar.nc2.NetcdfFile.<init>(NetcdfFile.java:1610)
... 7 more
Error in cfdataset (line 59)
Error in ncgeodataset (line 74)
Despite adding all of the .jar files supplied with the NCTOOLBOX, MATLAB appears to struggle to find / use them.
I notice that in the supplied NCTOOLBOX function called "ncugrid.m", there is a block of code which 'imports' various "ucar"-related classes.
I wonder if I need to set something up in my app compilation project definition which can take the place of this 'import' block.
Example of 'import' block:
import ucar.ma2.Array;
import ucar.ma2.DataType;
import ucar.nc2.Attribute;
import ucar.nc2.Dimension;
import ucar.nc2.Variable;
import ucar.nc2.dataset.CoordinateSystem;
import ucar.nc2.dataset.NetcdfDataset;
import ucar.nc2.dataset.VariableDS;
import ucar.nc2.dt.ugrid.Cell;
import ucar.nc2.dt.ugrid.Edge;
import ucar.nc2.dt.ugrid.Face;
import ucar.nc2.dt.ugrid.Node;
import ucar.nc2.dt.ugrid.UGridDataset;
import ucar.nc2.dt.ugrid.geom.LatLonPoint2D;
If anyone has insights, suggestions on what to try, or solutions, they would be very much appreciated! Thank you in advance!
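(Not part of the original question, just a possible way to narrow this down.) A NoSuchMethodError whose descriptor mentions com.google.protobuf.Internal$IntList usually points to a protobuf-java version clash rather than a missing jar: the generated Grib2IndexProto class was compiled against a newer protobuf than the one actually loaded inside the compiled app. A minimal diagnostic sketch in Java, using the two class names from the stack trace above, prints which jar each class is loaded from (run it on the same Java class path the compiled app uses):
import java.net.URL;

public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Class names taken from the NoSuchMethodError stack trace above.
        String[] names = {
            "ucar.nc2.grib.grib2.Grib2IndexProto",
            "com.google.protobuf.Internal"
        };
        for (String name : names) {
            Class<?> c = Class.forName(name);
            URL location = c.getProtectionDomain().getCodeSource() == null
                    ? null
                    : c.getProtectionDomain().getCodeSource().getLocation();
            System.out.println(name + " loaded from " + location);
        }
    }
}
If the two classes resolve from unexpected or duplicated jars carrying different protobuf versions, that mismatch rather than a missing library would explain the error.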

How to correctly develop and upload a Java library with dependencies to Oracle DB and call my Java function from PL/SQL?

I have:
Oracle 19c
Java 8 on its machine
What I did:
I wrote a simple class with one method in Java 8.
package <mypackage>;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.networknt.schema.JsonSchema;
import com.networknt.schema.JsonSchemaFactory;
import com.networknt.schema.SpecVersion;
import com.networknt.schema.ValidationMessage;
import java.io.IOException;
import java.io.Reader;
import java.sql.Clob;
import java.sql.SQLException;
import java.util.Set;
import java.util.stream.Collectors;
public class <classname> {
<some simple validation function>
}
I compile my project with Maven and the maven-assembly-plugin to build a .jar file with dependencies.
I upload it with the loadjava tool: loadjava -f -r -v -synonym -oracleresolver -resolve -grant <user> -thin -user <credentials> <filename>.jar
There were 0 errors during the upload. All uploaded classes (including dependencies) have 'VALID' status in the dba_objects table (see the query sketch below, after the wrapper).
I wrote a PL/SQL wrapper over my Java function.
create FUNCTION <funcname>(P_IN_BODY_TEXT CLOB, P_IN_BODY_SCHEMA_TEXT CLOB)
RETURN VARCHAR2
AS LANGUAGE JAVA NAME '<packagename>.<classname>.<funcname>(java.sql.Clob, java.sql.Clob) return java.lang.String';
/
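A small aside (not from the original post): the 'VALID' status mentioned above can be checked with a query along these lines (a sketch; user_objects shows the current schema, dba_objects needs elevated privileges):
SELECT object_name, object_type, status
  FROM user_objects
 WHERE object_type LIKE 'JAVA%';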
I use this function in my ORDS REST service.
When making a request to ORDS I get this exception:
The request could not be processed because an error occurred whilst attempting to evaluate
the SQL statement associated with this resource.
Please check the SQL statement is correctly formed and executes without error. SQL Error Code: 29532,
Error Message: ORA-29532: Java call terminated by uncaught Java exception:
java.lang.NoClassDefFoundError ORA-06512: at <rest stacktrace that gives me nothing>
Question is:
What is the root of this problem? The -synonym flag makes the tool create synonyms for me, and all classes are valid. Do I need some permissions to use Java packages like java.sql that my class imports? I have uploaded other open-source Java libraries into my Oracle instance, but they don't have dependencies - is that the problem?
The problem was in the slf4j library, which throws this exception. slf4j was a dependency of the library I used.
I didn't dig into the problem; I just picked another library with fewer dependencies and it works.
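To illustrate the failure mode described above (a hedged sketch, not the poster's actual class; the names are made up): a transitive dependency like slf4j only has to be resolvable when a class that references it is first loaded, which is why the upload can report no errors while the PL/SQL call still fails.
import java.sql.Clob;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class SchemaValidator {
    // Loading SchemaValidator forces the JVM inside the database to resolve
    // Logger and LoggerFactory. If those classes (or classes they in turn need)
    // cannot be resolved in the schema, the first PL/SQL call into this class
    // fails with java.lang.NoClassDefFoundError rather than at loadjava time.
    private static final Logger LOG = LoggerFactory.getLogger(SchemaValidator.class);

    public static String validate(Clob body, Clob schema) {
        LOG.info("validating payload against schema");
        return "OK"; // placeholder result for the sketch
    }
}
A failure like this surfaces to the caller as ORA-29532 wrapping java.lang.NoClassDefFoundError, which matches the error quoted above.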

Native Quarkus with AWS lambda does not build

I am trying to use Quarkus native to build my AWS Lambda.
My setup is:
GraalVM 19.3.0
Java 11
Ubuntu
When I run
docker run -v /home/mypc/dev/java/quarkus/alexa_swear/target/<my project>-1.0-SNAPSHOT-native-image-source-jar:/project:z --user 1000:1000 --rm quay.io/quarkus/ubi-quarkus-native-image:19.2.1 -J-Djava.util.logging.manager=org.jboss.logmanager.LogManager --initialize-at-build-time= -H:InitialCollectionPolicy=com.oracle.svm.core.genscavenge.CollectionPolicy\$BySpaceAndTime -jar <my project>-1.0-SNAPSHOT-runner.jar -J-Djava.util.concurrent.ForkJoinPool.common.parallelism=1 -H:FallbackThreshold=0 -H:+ReportExceptionStackTraces -H:+AddAllCharsets -H:EnableURLProtocols=http -H:-JNI --no-server -H:-UseServiceLoaderFeature -H:+StackTrace <my project>-1.0-SNAPSHOT-runner
I get the following error:
[alexa_swear-1.0-SNAPSHOT-runner:23] (typeflow): 52,070.99 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (objects): 25,961.57 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (features): 803.41 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] analysis: 81,015.48 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (clinit): 1,277.52 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] universe: 4,416.32 ms
Error: Unsupported features in 5 methods
Detailed message:
Call path from entry point to java.lang.Runtime.traceInstructions(boolean):
at java.lang.Runtime.traceInstructions(Runtime.java)
at com.oracle.svm.reflect.Runtime_traceInstructions_91eaacf084b9d7e2af6dcc0028ee87fea9223b51_77.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.net.www.protocol.http.NTLMAuthenticationProxy.isTrustedSite(NTLMAuthenticationProxy.java:102)
at sun.net.www.protocol.http.HttpURLConnection.getServerAuthentication(HttpURLConnection.java:2481)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1743)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at io.quarkus.amazon.lambda.runtime.AmazonLambdaRecorder$2.run(AmazonLambdaRecorder.java:171)
at java.lang.Thread.run(Thread.java:748)
at com.oracle.svm.core.thread.JavaThreads.threadStartRoutine(JavaThreads.java:460)
at com.oracle.svm.core.posix.thread.PosixJavaThreads.pthreadStartRoutine(PosixJavaThreads.java:193)
at com.oracle.svm.core.code.IsolateEnterStub.PosixJavaThreads_pthreadStartRoutine_e1f4a8c0039f8337338252cd8734f63a79b5e3df(generated:0) ... 6 more
Error: Image build request failed with exit status 1
The above error is truncated: the same call stack points to different unsupported methods, such as java.lang.Thread.stop.
My basic understanding is that io.quarkus.amazon.lambda.runtime.AmazonLambdaRecorder$2.run(AmazonLambdaRecorder.java) references some unsupported methods, such as java.lang.Thread.resume().
I have also tried with Quarkus 19.2.1 unsuccessfully.
The above command was executed by mvn clean install -Pnative -Dnative-image.docker-build=true -e.
Finally, I found the cause of my issue.
In the non-working version of my code, I use a factory of com.amazon.ask.AlexaSkill that is somehow injected into the entry point, as follows:
package io.mirko.lambda;

import com.amazon.ask.AlexaSkill;
import com.amazon.ask.Skills;
import com.amazon.ask.dispatcher.request.handler.HandlerInput;
import com.amazon.ask.dispatcher.request.handler.RequestHandler;
import com.amazon.ask.model.RequestEnvelope;
import com.amazon.ask.model.ResponseEnvelope;
import com.amazon.ask.request.interceptor.GenericRequestInterceptor;
import io.mirko.lambda.handlers.*;

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Instance;
import javax.enterprise.inject.Produces;
import javax.inject.Inject;
import javax.inject.Named;
import java.util.*;
import java.util.stream.StreamSupport;

public class SkillFactory {
    @Inject
    Instance<RequestHandler> handlers;

    @Produces
    @ApplicationScoped
    @Named
    public AlexaSkill<RequestEnvelope, ResponseEnvelope> createSkill() {
        return Skills.standard()
                .addRequestHandlers(handlers.stream().toArray(RequestHandler[]::new))
                .addRequestInterceptor(new GenericRequestInterceptor<HandlerInput>() {
                    @Override
                    public void process(HandlerInput handlerInput) {
                        System.out.format("Processing %s\n", handlerInput.getRequest());
                    }
                })
                // Add your skill id below
                //.withSkillId("")
                .build();
    }
}
...
public class SwearStreamLambda extends SkillStreamHandler {
    @Named("swearStream")
    public SwearStreamLambda() {
        //noinspection unchecked
        super((AlexaSkill<RequestEnvelope, ResponseEnvelope>)
                getBean(new ParameterizedTypeImpl(AlexaSkill.class, RequestEnvelope.class, ResponseEnvelope.class)));
By removing the SkillFactory class and moving its logic inside the SwearStreamLambda class, the compilation went well.
Some notes:
The issue is not related to CDI, as it is heavily used throughout the project
The issue is not related to the @Produces annotation, which is present in other parts of the project
The issue is not related to javax.enterprise.inject.Instance, as its removal does not solve the issue
All in all, I could not find the root cause of the problem, but I consider my issue solved.
P.S. The original problem is eradicated by building with the following:
mvn clean install -Pnative -Dnative-image.docker-build=true -Dquarkus.native.enable-jni=true
Please see https://github.com/quarkusio/quarkus/issues/6395#issuecomment-570755587.
This does not solve all the problems, as Quarkus reflection has to be configured, but it solves the specific issue.
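On the reflection point above: a minimal sketch (the class name here is hypothetical, not from the original project) of how a class that is only reached via reflection can be kept in the native image with Quarkus:
import io.quarkus.runtime.annotations.RegisterForReflection;

// Classes that are only reached via reflection at runtime (for example, request/response
// models deserialized by the Alexa SDK) must be registered, otherwise the native image
// strips them and reflective lookups fail at runtime.
@RegisterForReflection
public class SwearRequestModel {
    public String phrase;
}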
Just an update for 2022:
There is an extension for Quarkus which already does some of the configuration for transitive dependencies: https://github.com/quarkiverse/quarkus-amazon-alex
You need GraalVM and Docker installed.
I just went to:
https://code.quarkus.io/?b=GRADLE&e=io.quarkiverse.amazonalexa%3Aquarkus-amazon-alexa&extension-search=origin:platform%20alexa
downloaded the zip,
overwrote all of my existing project/skill,
made small further adaptations to my existing skill as described here: https://quarkus.io/guides/writing-native-applications-tips
ran gradlew build -x test -Dquarkus.package.type=native -Dquarkus.native.container-build=true
and uploaded functions.zip to AWS Lambda.

java.lang.NoClassDefFoundError for spark-submit

I built a *.jar file for my Apache Spark Scala project using Maven. When I try to execute the main class, at some line of the code it throws Exception in thread "main" java.lang.NoClassDefFoundError for the class org.apache.spark.ml.recommendation.ALS.
I run spark-submit as follows:
sudo -u hdfs /usr/bin/spark-submit --class
org.apache.spark.examples.ml.MyTest spark-examples-*.jar --rank 10 --path
/home/ubuntu/data
It looks like org.apache.spark.ml.recommendation.ALS is the only class it cannot find. I have the following import statements in the class:
package org.apache.spark.examples.ml
import java.util.Date
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.examples.mllib.AbstractParams
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import scopt.OptionParser
How to solve this issue?
UPDATE 1:
I added the maven-assembly-plugin to pom.xml and also assembly.xml into the resources folder. Then I ran mvn package successfully, but the same problem remains.
UPDATE 2:
jar -tvf spark-examples-1.5.3-SNAPSHOT-hadoop2.2.0.jar | grep "ALS"
2678 Mon May 23 13:11:44 CEST 2016 org/apache/spark/examples/ml/recommendation/MyFunc$ALSParams.class
First, find out the reason for the failure: 1) is the class missing, or 2) is it an initialization problem?
To find out whether the class is available, run jar -tvf <.jar> | grep "class name"; this way you can tell whether the class file is present or whether it is an initialization issue.
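For what it's worth, the grep output quoted above only matches the example's own inner class (MyFunc$ALSParams), not org/apache/spark/ml/recommendation/ALS itself, which ships in the spark-mllib artifact. A hedged follow-up check, plus one possible way to supply the dependency at submit time (the path and version below are illustrative, not taken from the question):
jar -tvf spark-examples-1.5.3-SNAPSHOT-hadoop2.2.0.jar | grep "org/apache/spark/ml/recommendation/ALS"

# if the class is absent from both the application jar and the cluster's Spark assembly,
# pass the dependency explicitly:
sudo -u hdfs /usr/bin/spark-submit --class org.apache.spark.examples.ml.MyTest \
  --jars /path/to/spark-mllib_2.10-1.5.3.jar \
  spark-examples-*.jar --rank 10 --path /home/ubuntu/data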

Java issue NoClassDefFound javax.xml.soap.SOAPPart

I have been given a Java app to modify, with the following imports:
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPException;
import javax.xml.soap.SOAPMessage;
import javax.xml.soap.SOAPPart;
But when I run it from cmd it throws the error:
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/axis/SOAPPart
It seems to find the other imports without issue. Does 'NoClassDefFound' mean the class itself wasn't found? If so, how can I check it is there and replace it if not? (I am using Eclipse.)
Update: Okay, I have found the class in 'JRE System Library > rt.jar > javax.xml.soap > SOAPPart.class'. So if the class is there, why do I get the error?
Turns out I needed to choose Export > 'Runnable JAR file' in Eclipse instead of 'JAR file'.
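A short note on why the export choice matters: org/apache/axis/SOAPPart belongs to the Apache Axis library, not to the javax.xml.soap.SOAPPart class found in rt.jar, so the Axis jar has to be on the runtime classpath even though the project compiles in Eclipse. A plain 'JAR file' export leaves the library jars behind; a 'Runnable JAR file' export can package them into the jar, or copy them next to it and reference them from the manifest, roughly like this (jar names and main class are illustrative, not taken from the app):
Manifest-Version: 1.0
Main-Class: com.example.MySoapApp
Class-Path: lib/axis.jar lib/jaxrpc.jar lib/commons-logging.jar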
