I am trying to use Quarkus native to build my AWS Lambda.
My setup is:
GraalVM 19.3.0
Java 11
Ubuntu
When I run
docker run -v /home/mypc/dev/java/quarkus/alexa_swear/target/<my project>-1.0-SNAPSHOT-native-image-source-jar:/project:z --user 1000:1000 --rm quay.io/quarkus/ubi-quarkus-native-image:19.2.1 -J-Djava.util.logging.manager=org.jboss.logmanager.LogManager --initialize-at-build-time= -H:InitialCollectionPolicy=com.oracle.svm.core.genscavenge.CollectionPolicy\$BySpaceAndTime -jar <my project>-1.0-SNAPSHOT-runner.jar -J-Djava.util.concurrent.ForkJoinPool.common.parallelism=1 -H:FallbackThreshold=0 -H:+ReportExceptionStackTraces -H:+AddAllCharsets -H:EnableURLProtocols=http -H:-JNI --no-server -H:-UseServiceLoaderFeature -H:+StackTrace <my project>-1.0-SNAPSHOT-runner
I get the following error:
[alexa_swear-1.0-SNAPSHOT-runner:23] (typeflow): 52,070.99 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (objects): 25,961.57 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (features): 803.41 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] analysis: 81,015.48 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (clinit): 1,277.52 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] universe: 4,416.32 ms
Error: Unsupported features in 5 methods
Detailed message:
Call path from entry point to java.lang.Runtime.traceInstructions(boolean):
at java.lang.Runtime.traceInstructions(Runtime.java)
at com.oracle.svm.reflect.Runtime_traceInstructions_91eaacf084b9d7e2af6dcc0028ee87fea9223b51_77.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.net.www.protocol.http.NTLMAuthenticationProxy.isTrustedSite(NTLMAuthenticationProxy.java:102)
at sun.net.www.protocol.http.HttpURLConnection.getServerAuthentication(HttpURLConnection.java:2481)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1743)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at io.quarkus.amazon.lambda.runtime.AmazonLambdaRecorder$2.run(AmazonLambdaRecorder.java:171)
at java.lang.Thread.run(Thread.java:748)
at com.oracle.svm.core.thread.JavaThreads.threadStartRoutine(JavaThreads.java:460)
at com.oracle.svm.core.posix.thread.PosixJavaThreads.pthreadStartRoutine(PosixJavaThreads.java:193)
at com.oracle.svm.core.code.IsolateEnterStub.PosixJavaThreads_pthreadStartRoutine_e1f4a8c0039f8337338252cd8734f63a79b5e3df(generated:0) ... 6 more
Error: Image build request failed with exit status 1
The above error is truncated: the same call stack points to different unsupported methods, such as java.lang.Thread.stop.
My basic understanding is that io.quarkus.amazon.lambda.runtime.AmazonLambdaRecorder$2.run(AmazonLambdaRecorder.java) references some unsupported methods, such as java.lang.Thread.resume().
I have also tried with Quarkus 19.2.1 unsuccessfully.
The above command was executed by mvn clean install -Pnative -Dnative-image.docker-build=true -e.
I finally found the cause of my issue.
In the non-working version of my code, I use a factory of com.amazon.ask.AlexaSkill that is somehow injected in the entry point, as follows:
package io.mirko.lambda;
import com.amazon.ask.AlexaSkill;
import com.amazon.ask.Skills;
import com.amazon.ask.dispatcher.request.handler.HandlerInput;
import com.amazon.ask.dispatcher.request.handler.RequestHandler;
import com.amazon.ask.model.RequestEnvelope;
import com.amazon.ask.model.ResponseEnvelope;
import com.amazon.ask.request.interceptor.GenericRequestInterceptor;
import io.mirko.lambda.handlers.*;
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Instance;
import javax.enterprise.inject.Produces;
import javax.inject.Inject;
import javax.inject.Named;
import java.util.*;
import java.util.stream.StreamSupport;
public class SkillFactory {
    @Inject
    Instance<RequestHandler> handlers;

    @Produces
    @ApplicationScoped
    @Named
    public AlexaSkill<RequestEnvelope, ResponseEnvelope> createSkill() {
        return Skills.standard()
                .addRequestHandlers(handlers.stream().toArray(RequestHandler[]::new))
                .addRequestInterceptor(new GenericRequestInterceptor<HandlerInput>() {
                    @Override
                    public void process(HandlerInput handlerInput) {
                        System.out.format("Processing %s\n", handlerInput.getRequest());
                    }
                })
                // Add your skill id below
                //.withSkillId("")
                .build();
    }
}
...
public class SwearStreamLambda extends SkillStreamHandler {
    @Named("swearStream")
    public SwearStreamLambda() {
        //noinspection unchecked
        super((AlexaSkill<RequestEnvelope, ResponseEnvelope>)
                getBean(new ParameterizedTypeImpl(AlexaSkill.class, RequestEnvelope.class, ResponseEnvelope.class)));
    }
}
By removing the SkillFactory class and moving its logic inside the SwearStreamLambda class, the compilation went well.
Some notes:
The issue is not related to CDI, as it is heavily used throughout the project
The issue is not related to the @Produces annotation, which is present in other parts of the project
The issue is not related to javax.enterprise.inject.Instance, as its removal does not solve the issue
All in all, I could not find the root cause of the problem, but I consider my issue solved.
P.S. The original problem is eradicated by building with the following:
mvn clean install -Pnative -Dnative-image.docker-build=true -Dquarkus.native.enable-jni=true
Please see https://github.com/quarkusio/quarkus/issues/6395#issuecomment-570755587.
This does not solve all the problems, as Quarkus reflection has to be configured, but it solves the specific issue.
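As for the reflection configuration mentioned above, native-image accepts a JSON file passed with -H:ReflectionConfigurationFiles=reflect-config.json (in Quarkus you can instead annotate classes with @RegisterForReflection). A minimal sketch; the class name below is a hypothetical example, not taken from the project:

```json
[
  {
    "name": "io.mirko.lambda.handlers.MyHandler",
    "allDeclaredConstructors": true,
    "allDeclaredMethods": true,
    "allDeclaredFields": true
  }
]
```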
An update for 2022:
There is a Quarkus extension that already does some configuration for transitive dependencies: https://github.com/quarkiverse/quarkus-amazon-alex
You need GraalVM and Docker installed.
I just went to:
https://code.quarkus.io/?b=GRADLE&e=io.quarkiverse.amazonalexa%3Aquarkus-amazon-alexa&extension-search=origin:platform%20alexa
downloaded the zip
overwrote all of my existing project/skill
made small further adaptations to my existing skill, as described here: https://quarkus.io/guides/writing-native-applications-tips
ran gradlew build -x test -Dquarkus.package.type=native -Dquarkus.native.container-build=true
uploaded functions.zip to AWS Lambda
Related
I have:
Oracle 19c
Java 8 on its machine
What I did:
I wrote a simple class with one method in Java 8.
package <mypackage>;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.networknt.schema.JsonSchema;
import com.networknt.schema.JsonSchemaFactory;
import com.networknt.schema.SpecVersion;
import com.networknt.schema.ValidationMessage;
import java.io.IOException;
import java.io.Reader;
import java.sql.Clob;
import java.sql.SQLException;
import java.util.Set;
import java.util.stream.Collectors;
public class <classname> {
<some simple validation function>
}
I compiled my project with Maven and the maven-assembly-plugin to build a .jar file with dependencies.
I uploaded it with the loadjava tool: loadjava -f -r -v -synonym -oracleresolver -resolve -grant <user> -thin -user <credentials> <filename>.jar
There were 0 errors during upload. All uploaded classes (including dependencies) have 'VALID' status in the dba_objects table.
I wrote a PL/SQL wrapper over my Java function.
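As a sanity check, the same status information can be queried from the data dictionary, using user_objects (or dba_objects with an OWNER filter); a sketch, using the JAVA CLASS object type Oracle assigns to uploaded classes:

```sql
SELECT object_name, status
  FROM user_objects
 WHERE object_type = 'JAVA CLASS'
   AND status = 'INVALID';
```

An empty result means every uploaded class resolved.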
create FUNCTION <funcname>(P_IN_BODY_TEXT CLOB, P_IN_BODY_SCHEMA_TEXT CLOB)
RETURN VARCHAR2
AS LANGUAGE JAVA NAME '<packagename>.<classname>.<funcname>(java.sql.Clob, java.sql.Clob) return java.lang.String';
/
I used this function in my ORDS REST service.
When making a request to ORDS, I get this exception:
The request could not be processed because an error occurred whilst attempting to evaluate
the SQL statement associated with this resource.
Please check the SQL statement is correctly formed and executes without error. SQL Error Code: 29532,
Error Message: ORA-29532: Java call terminated by uncaught Java exception:
java.lang.NoClassDefFoundError ORA-06512: at <rest stacktrace that gives me nothing>
The question is:
What is the root of this problem? The -synonym flag makes the tool create synonyms for me, and all classes are valid. Do I need some permissions to use Java packages like java.sql that my class imports? I have uploaded other open-source Java libraries into my Oracle database, but they don't have dependencies; is that the problem?
The problem was in the slf4j library, which throws this exception. slf4j was a dependency of the library I used.
I didn't dig into the problem; I just picked another library with fewer dependencies and it works.
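One way to narrow down which class is actually missing at runtime is to probe for class names with Class.forName; a minimal sketch (the org.slf4j.Logger probe is just an example of a class that may be absent from the classpath):

```java
public class ClasspathCheck {
    // Returns true if the named class can be loaded by the current class loader.
    static boolean isPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("java.sql.Clob present: " + isPresent("java.sql.Clob"));
        System.out.println("org.slf4j.Logger present: " + isPresent("org.slf4j.Logger"));
    }
}
```

Running such a probe inside the Oracle JVM (e.g. via a small wrapped function) shows which transitive dependency failed to load.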
I want to do a small test with a nanoCUL868 USB device in order to transmit a signal to remote devices. The stick works with some 3rd-party software, and I was able to communicate with the remote device. I now want to test this stick with the following code in Java:
package de.saltest.home;
import java.io.IOException;
import java.io.OutputStream;
import java.util.logging.Logger;
import org.openmuc.jrxtx.Parity;
import org.openmuc.jrxtx.SerialPort;
import org.openmuc.jrxtx.SerialPortBuilder;
public class SomfyCULTest {
private static final Logger log = Logger.getLogger(SomfyCULTest.class.getName());
public static void main(String[] args) throws IOException {
log.info("Opening port ttyUSB0");
SerialPort port = SerialPortBuilder.newBuilder("/dev/ttyAMA0").setBaudRate(9600).setParity(Parity.NONE).build();
OutputStream out = port.getOutputStream();
String commandLEDOn = "l01\n";
String commandLEDOff = "l00\n";
String encryptionKey = "A1";
// C - Command (1 = My, 2 = Up, 4 = Down, 8 = Prog)
String command = "2";
String rollingCode = "001D";
String address = "000029";
String somfyCommand = "Ys" + encryptionKey + command + "0" + rollingCode + address + "\n";
out.write(somfyCommand.getBytes());
out.close();
port.close();
log.info("Closed port");
}
}
I have installed Oracle Java on a Raspberry Pi 3 and also got librxtx-java via apt-get. A dpkg-query -L librxtx-java yields:
/usr/lib/jni/librxtxRS485.so
/usr/lib/jni/librxtxRaw.so
/usr/lib/jni/librxtxI2C.so
/usr/lib/jni/librxtxParallel.so
/usr/lib/jni/librxtxSerial.so
/usr/share/java/RXTXcomm.jar
So, I assume those libraries are correctly installed for the right platform.
If I use javac to compile my code:
javac -classpath /usr/lib/jni -cp /usr/share/java/RXTXcomm.jar:. src/main/java/de/saltest/home/SomfyCULTest.java
I get the following error message:
src/main/java/de/saltest/home/SomfyCULTest.java:7: error: package org.openmuc.jrxtx does not exist
import org.openmuc.jrxtx.Parity;
^
I also have a Maven project which compiles this code fine; however, I can't execute it because of the following error:
mvn exec:java -Dexec.mainClass=de.saltest.home.SomfyCULTest
[INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ serialTest ---
Okt 07, 2018 3:25:41 PM de.saltest.home.SomfyCULTest main
INFORMATION: Opening port ttyUSB0
Could not load lib from jar and from system.
gnu.io.LibLoadException: directory does not exist /libs
at gnu.io.LibraryLoader.loadLib(LibraryLoader.java:65)
at gnu.io.LibraryLoader.loadLibsFromJar(LibraryLoader.java:48)
at gnu.io.LibraryLoader.loadRxtxNative(LibraryLoader.java:29)
at gnu.io.RXTXCommDriver.<clinit>(RXTXCommDriver.java:85)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at gnu.io.CommPortIdentifier.<clinit>(CommPortIdentifier.java:104)
at org.openmuc.jrxtx.JRxTxPort.openSerialPort(JRxTxPort.java:50)
at org.openmuc.jrxtx.SerialPortBuilder.build(SerialPortBuilder.java:166)
at de.saltest.home.SomfyCULTest.main(SomfyCULTest.java:24)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:282)
at java.lang.Thread.run(Thread.java:748)
java.lang.UnsatisfiedLinkError: no rxtxSerial in java.library.path thrown while loading gnu.io.RXTXCommDriver
From here I'm lost. I have tried many things, like copying the libraries to different places as suggested in other posts. I have also changed the classpath and many other things I could think of, but nothing seems to work. The only thing I find strange is that I use java.io, but the code tries to load gnu.io. Maybe there is some mismatch?
Any help or solution is really appreciated.
Thanks!
What seems to be the solution is to add a plugin to the Maven build which produces a JAR with all dependencies included, as described here: https://www.mkyong.com/maven/create-a-fat-jar-file-maven-assembly-plugin/
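For reference, a minimal maven-assembly-plugin configuration along those lines; the phase binding is typical, adjust to your build:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- produces <artifact>-jar-with-dependencies.jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```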
However, what's still odd: you have to add the library path as mentioned above:
-Djava.library.path=/usr/lib/jni
To my understanding this should no longer be necessary. Nevertheless, without the path it does not work.
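To see what the JVM is actually searching, you can print the effective native library path; a small sketch:

```java
public class LibPathCheck {
    // Returns the directories the JVM searches for native libraries (JNI .so files).
    static String libraryPath() {
        return System.getProperty("java.library.path");
    }

    public static void main(String[] args) {
        System.out.println("java.library.path = " + libraryPath());
    }
}
```

Running it once with and once without -Djava.library.path=/usr/lib/jni shows whether the RXTX directory is visible to the JVM.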
Did you try adding the native library path as a start parameter?
-Djava.library.path=/usr/lib/jni
I faced the same issue; these steps worked for me:
Step 1: Add RXTXcomm.jar and RXTXcomm-2.2pre2.jar to JAVA_HOME/jre/lib/ext.
Step 2: Create the war file using Eclipse or anything else.
Step 3: On the client environment these jars should be stored in the same location.
Step 4: Add RXTXcomm.jar and RXTXcomm-2.2pre2.jar to JAVA_HOME/jre/lib/ext of the client environment.
Note: If you are not getting data from the serial device, check for locks on the serial port. Another thread or process may have occupied the port or not released the lock.
To check for locks on the port, the command is fuser /dev/ttyS0, then
kill -9 PID
I hope this works.
Working with CentOS 7.
I want to test my lambda functions locally with Serverless Application Model (SAM)
In the AWS docs they write :
SAM Local leverages the docker-lambda Docker images to run your code in a sandbox that simulates the Lambda execution environment.
I pulled the docker image on my computer. I could successfully run a simple Hello World Lambda Function.
Command to run Lambda function locally:
$ docker run -v "$PWD/target/classes":/var/task lambci/lambda:java8 com.amazonaws.lambda.demo.LambdaFunctionHandler
results:
"Hello from Lambda!"
Code of Lambda function automatically generated with Eclipse Toolkit:
package com.amazonaws.lambda.demo;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
public class LambdaFunctionHandler implements RequestHandler<Object, String> {
    @Override
    public String handleRequest(Object input, Context context) {
        context.getLogger().log("Input: " + input);
        // TODO: implement your handler
        return "Hello from Lambda!";
    }
}
This is my progress so far. What I couldn't do is use SAM Local, which uses the docker-lambda image. (Maybe I should not have to download it manually?)
I installed SAM Local on my Windows machine:
npm install -g aws-sam-local
created a template.yaml SAM config file:
AWSTemplateFormatVersion: 2010-09-09
Transform: AWS::Serverless-2016-10-31
Resources:
  ExampleJavaFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: com.amazonaws.lambda.demo.LambdaFunctionHandler
      CodeUri: ./target/demo-1.0.0-shaded.jar
      Runtime: java8
I chose the name for CodeUri after building my shaded jar file with:
mvn compile shade:shade
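For reference, a minimal maven-shade-plugin configuration that produces such a shaded jar during package; the plugin version is an example:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```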
After this, I run the following to invoke my Lambda function:
$ echo '{ "some": "input" }' | sam local invoke
Now I get this error:
2017/12/05 14:56:36 Successfully parsed template.yaml
2017/12/05 14:56:36 Running AWS SAM projects locally requires Docker. Have you got it installed?
2017/12/05 14:56:36 error during connect: Get http://%2F%2F.%2Fpipe%2Fdocker_engine/_ping: open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
What is my mistake in using SAM Local with Java? Could it be that it's not working because my computer doesn't have Hyper-V and I am using Docker Toolbox?
Here you can see the advanced SAM docs for compiled languages like Java.
It was a bug in SAM Local, fixed with a new update.
If you still have a problem on Windows, then try this:
COMPOSE_CONVERT_WINDOWS_PATHS=1
This should help if your path separator is wrong (/ vs \).
I built a *.jar file for my Apache Spark Scala project using Maven. When I try to execute the main class, at some line of the code it throws Exception in thread "main" java.lang.NoClassDefFoundError for the class org.apache.spark.ml.recommendation.ALS.
I run the spark-submit as follows:
sudo -u hdfs /usr/bin/spark-submit --class
org.apache.spark.examples.ml.MyTest spark-examples-*.jar --rank 10 --path
/home/ubuntu/data
It looks like it only cannot find org.apache.spark.ml.recommendation.ALS. I have the following import statements in the class:
package org.apache.spark.examples.ml
import java.util.Date
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.examples.mllib.AbstractParams
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import scopt.OptionParser
How to solve this issue?
UPDATE 1:
I added the maven-assembly plugin to pom.xml and also an assembly.xml in the resources folder. Then mvn package succeeded, but I get the same problem again.
UPDATE 2:
jar -tvf spark-examples-1.5.3-SNAPSHOT-hadoop2.2.0.jar | grep "ALS"
2678 Mon May 23 13:11:44 CEST 2016 org/apache/spark/examples/ml/recommendation/MyFunc$ALSParams.class
First find out the reason for the failure: 1) Is the class missing? 2) Or is it an initialization problem?
To find out whether the class is available, run jar -tvf <your.jar> | grep "<class name>". This way you can find out whether the class file is present, or whether it is an initialization issue.
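The initialization case can be reproduced in plain Java: if a class's static initialization throws, the first access fails with ExceptionInInitializerError and every later access fails with NoClassDefFoundError, even though the class file is on the classpath. A minimal sketch:

```java
public class NoClassDefDemo {
    static class Broken {
        // Not a compile-time constant, so this runs during class initialization
        // and makes initialization fail.
        static final int VALUE = fail();

        static int fail() {
            throw new RuntimeException("static init failed");
        }
    }

    // Touches Broken and reports which Throwable comes back.
    static String provoke() {
        try {
            return "ok: " + Broken.VALUE;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(provoke()); // ExceptionInInitializerError
        System.out.println(provoke()); // NoClassDefFoundError
    }
}
```

So a NoClassDefFoundError does not always mean a missing jar; it can also mean an earlier initialization failure of a class that is present.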
I have been given a java app to modify with the following:
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPException;
import javax.xml.soap.SOAPMessage;
import javax.xml.soap.SOAPPart;
But when I run it from cmd it throws the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/axis/SOAPPart
It seems to find the other imports without issue. Does 'NoClassDefFound' mean the class itself wasn't found? If so, how can I check it is there, and replace it if not? (I am using Eclipse.)
Update: Okay, I have found the class in 'JRE System Library > rt.jar > javax.xml.soap > SOAPPart.class'. So if the class is there, why do I get the error?
Turns out I needed to choose Export > 'Runnable JAR file' in Eclipse instead of 'JAR file'.