Spring Boot 3 native compile fails with Java agent configured

I must say that I am pretty new to GraalVM (like most people nowadays).
I have been following a guide created by Josh Long to create a Spring Boot native application.
The guide explains that we can use the Java agent while executing tests in order to create reflect-config.json and the other files that are "given as an input to the GraalVM compiler". This is also explained in the Spring Boot documentation here and also here.
I created a repo (which is a clone of the one from the guide) and made just one commit in order to "activate the Java agent while executing the tests" (in the history of the branch, just look at the last commit).
You can clone it like this:
git clone https://github.com/clembo590/issues.git --branch issue_with_agent
When executing mvn clean -P native native:compile I get this error:
Caused by: com.oracle.svm.core.util.VMError$HostedError: New Method or Constructor found as reachable after static analysis: public java.lang.String com.example.aot.properties.DemoProperties.name()
I looked on the web to see if I could find something about this issue.
I found one issue on GitHub about it, but I do not know how to fix it in the case of a Spring Boot app.
Thank you for your help.
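For reference, this is roughly how the agent is usually wired into the test run with the GraalVM Native Build Tools Maven plugin. This is a sketch only, not the commit from the repo above, and it does not by itself explain the HostedError; it just shows the agent setup the question refers to:

```xml
<!-- Sketch: enable the native-image agent during test execution so the
     reachability metadata (reflect-config.json etc.) gets generated. -->
<plugin>
    <groupId>org.graalvm.buildtools</groupId>
    <artifactId>native-maven-plugin</artifactId>
    <configuration>
        <agent>
            <enabled>true</enabled>
        </agent>
    </configuration>
</plugin>
```

With something like this in place, running the tests under the native profile executes them with the agent attached, and the plugin's native:metadata-copy goal can then copy the generated files into src/main/resources/META-INF/native-image.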

Related

fabric8-maven-plugin - Generator for Dropwizard app (java-exec?)

I'm new to fabric8-maven-plugin.
I have a Dropwizard fat jar which I'd like to containerize for OpenShift/OKD. It seems the recommended way would be to invoke the java-exec generator: http://maven.fabric8.io/#generator-java-exec
Problem is that Dropwizard apps have a config file argument that must be provided, but I'm not sure how to instruct the generator to do so.
Correct invocation should be:
java -jar hello-world-0.0.1-SNAPSHOT.jar server hello-world.yml
The generator does the following, which is missing arguments:
java -jar hello-world-0.0.1-SNAPSHOT.jar
The following seems to be the simplest approach:
Use the exec-maven-plugin in your build configuration and supply the needed arguments: one for the command (server) and another for the configuration file location.
This example, found via a quick search, shows such a configuration: https://github.com/christian-posta/microservices-by-example-source/blob/master/hola-dropwizard/pom.xml#L103-L114.
It seems to implement the approach described here: https://www.oreilly.com/ideas/microservices-for-java-developers/page/3/dropwizard-for-microservices#chapter-3-dropwizard-for-microservices
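A minimal sketch of that configuration (the main class name com.example.HelloWorldApplication is hypothetical; adjust it and the config file name to your project):

```xml
<!-- Sketch: pass the Dropwizard "server" command and the config file
     as program arguments via exec-maven-plugin. -->
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <configuration>
        <mainClass>com.example.HelloWorldApplication</mainClass>
        <arguments>
            <argument>server</argument>
            <argument>hello-world.yml</argument>
        </arguments>
    </configuration>
</plugin>
```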

Loading VoltDB procedures via Java classes with InProcessVoltDBServer

I am trying to load VoltDB Java procedures from methods and classes via jar with the following statement:
InProcessVoltDBServer volt = new InProcessVoltDBServer();
volt.runDDLFromString("load classes StoredProcs.jar;");
After running the jar file with java -jar I get the following error message:
SQL error while compiling query: SQL Syntax error in "LOAD CLASSES StoredProcs.jar" unexpected token: LOAD
...
It's clear to me what the message points to, but I am not certain why this command is not recognized.
Is this particular syntax perhaps related only to sqlcmd?
If it is, is there another way to load the classes via Java code?
I am using the InProcessVoltDBServer and trying to load all the DDL and DML from a SQL script file in order to set up the environment for integration tests.
Also, I have followed this guide to setup the project in STS.
Could anyone please shed some light on what I am doing wrong here?
Perhaps it's not possible to load DML (test data) like this?
The "LOAD CLASSES" command is only supported in sqlcmd. It is not used or supported by the lightweight InProcessVoltDBServer.
In the guide, a Java procedure is created and tested, but there isn't an explicit step to load the class. When the DDL command "CREATE PROCEDURE PARTITION ON TABLE foo COLUMN id FROM CLASS procedures.UpdateFoo;" is executed, the procedures.UpdateFoo class is already available. This may be because the procedure and the test are in the same Eclipse project. If the procedure was developed separately, you would need to add the jar to the classpath of the Eclipse project where you develop the tests.
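To illustrate, here is a sketch of what the test setup could look like without LOAD CLASSES. The table, column, and procedure class names are hypothetical, and the actual VoltDB calls are shown as comments because they require the VoltDB test jar on the classpath:

```java
// Sketch only: names below (foo, procedures.UpdateFoo) are hypothetical.
public class VoltDdlSketch {

    // Build the schema DDL as a plain string: declare the procedure
    // FROM CLASS (the class must already be on the test classpath)
    // instead of issuing the sqlcmd-only "LOAD CLASSES" directive.
    static String buildDdl() {
        return String.join("\n",
            "CREATE TABLE foo (id INTEGER NOT NULL, val VARCHAR(32), PRIMARY KEY (id));",
            "PARTITION TABLE foo ON COLUMN id;",
            "CREATE PROCEDURE PARTITION ON TABLE foo COLUMN id FROM CLASS procedures.UpdateFoo;");
    }

    public static void main(String[] args) {
        // InProcessVoltDBServer volt = new InProcessVoltDBServer();
        // volt.runDDLFromString(buildDdl());
        System.out.println(buildDdl());
    }
}
```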
Disclosure: I work at VoltDB.

Java 9 migration on intellij issues with module system

I am trying to migrate my current project to be able to compile and run it on Java 9. My first step is to just move a Java 8 based project to Java 9 without much effort, which basically means not moving to Jigsaw.
The project structure is something like
myjava-service [myjava-service-parent] parent-pom with the following modules
- myjava-service
- myjava-service-common
- myjava-service-test
It compiles perfectly with mvn clean package and it runs when I execute the fat jar.
The nightmare starts when I try to run it using IntelliJ. To run it in IntelliJ I have to set the module I want to execute, which is myjava-service, but then apparently IntelliJ treats it as a Java 9 module, and I get a lot of split-package issues, classes not found, and other problems that I am struggling to fix. So my question is: is there a way to run the service in IntelliJ under Java 9 without the new Java module system being triggered somehow?
For the record, the issues look like:
java.lang.module.ResolutionException: Modules javax.annotation.api and annotations export package javax.annotation to module javax.el
So, apparently someone ran into the same issue I was facing. What happens is that when your service is started by a class which is not part of your sources (in my case the service is started by the Starter class from the Vert.x jar), IDEA ends up using the module path instead of the class path. Luckily the IntelliJ team was quite helpful; follow the ticket
https://youtrack.jetbrains.com/issue/IDEA-187390 . Starting with the next version, IDEA will abstain from using the module path when there is no module-info file in the sources.
Also, for those who desperately need to run a service on Java 9, there is a workaround: create a Main class that invokes the class that starts your service. In my case it looked like:
public class Start {
    public static void main(String[] args) {
        // Delegating to the real entry point from a class in our own
        // sources keeps IDEA on the class path instead of the module path.
        Starter.main(args);
    }
}

running "mvn appfuse:gen" does nothing

I'm using an Oracle DB alongside AppFuse, but nothing happens when I run the command above (mvn appfuse:gen). For example, it's expected to generate a PersonDao/PersonDaoHibernate class in the dao folder if we use the command, but nothing happens. Meanwhile, I've visited the following pages:
http://static.appfuse.org/plugins/appfuse-maven-plugin/gen-model-mojo.html
http://static.appfuse.org/plugins/appfuse-maven-plugin
but couldn't reproduce their content in practice. Can anybody show me a tutorial that explains the details, please (say, from generating the POJOs to the JSP files)?
Once you have a JavaBean (POJO class), you can generate all layers with the following command:
mvn appfuse:gen -Dentity=Name
where 'Name' has to be replaced with the name of your JavaBean (entity). Alternatively, you can use:
mvn appfuse:gen -Damp.genericCore=false -Dentity=Name
if you don't want to use GenericDao and GenericManager.
For further investigation I suggest:
http://appfuse.org/display/APF/AppFuse+Maven+Plugin#AppFuseMavenPlugin-amp-crud

RemoteActorRefProvider ClassNotFound

I'm struggling trying to get remote actors setup in Scala. I'm running Scala 2.10.2 and Akka 2.2.1.
I compile using [I've shortened the paths in the classpath argument for clarity's sake]:
$ scalac -classpath "akka-2.2.1/lib:akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka:akka-2.2.1/lib/akka/scala-reflect-2.10.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-kernel_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:." [file.scala]
I've continuously added new libraries trying to debug this - I'm pretty sure all I really need to include is akka-remote, but the others shouldn't hurt.
No issues compiling.
I attempt to run like this:
$ scala -classpath "[same as above]" [application]
And I receive a NoSuchMethodException:
java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getDeclaredConstructor(Class.java:2053)
...
Looking into the source code, it appears that Akka 2.2.X's flavor of this constructor takes 4 arguments (the Scheduler is removed). But in Akka < 2.2.X, the constructor takes 5 args.
Thus, I'm thinking my classpath isn't set up quite right. At runtime, Scala must be finding the <2.2.X flavor. I don't even know where it would be finding it, since I only have Akka 2.2.1 installed.
Any suggestions!? Thanks! (Please don't say to use SBT).
The problem here is that the Scala distribution contains akka-actor 2.1.0 and helpfully puts that on the boot class path for you. We very strongly recommend using a dependency manager like sbt or Maven when building anything that goes beyond the most trivial projects.
As noted in another answer, the problem is that scala puts a different version of Akka into the bootclasspath.
To more directly answer your question (as you said you don't want to use sbt): you can execute your program with java instead of scala. You just have to put the appropriate Scala jars into the classpath.
Here is a spark-dev message about the problem. The important part is: "the workaround is to use java to launch the application instead of scala. All you need to do is to include the right Scala jars (scala-library and scala-compiler) in the classpath."
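As a sketch, the launch could look like the following (the jar paths follow the layout from the question and the main class name MyApp is illustrative; the quoted advice also suggests adding scala-compiler if your code needs it):

```shell
# Sketch: launch with java directly so the scala launcher's boot class path
# (which bundles akka-actor 2.1.0) never gets involved.
java -classpath "akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:." MyApp
```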