I am trying to load VoltDB Java stored procedure classes from a jar with the following statement:
InProcessVoltDBServer volt = new InProcessVoltDBServer();
volt.runDDLFromString("load classes StoredProcs.jar;");
After running the jar file with java -jar I get the following error message:
SQL error while compiling query: SQL Syntax error in "LOAD CLASSES StoredProcs.jar" unexpected token: LOAD
...
It's clear what the message points to, but I am not certain why this command isn't recognized.
Is this particular syntax perhaps supported only by sqlcmd?
If so, is there another way to load the classes from Java code?
I am using the InProcessVoltDBServer and trying to load all the DDL and DML from an SQL script file in order to set up the environment for integration tests.
Also, I have followed this guide to set up the project in STS.
Could anyone please shed some light on what I am doing wrong here?
Perhaps it's not possible to load DML (test data) like this?
The "LOAD CLASSES" command is only supported in sqlcmd. It is not used or supported by the lightweight InProcessVoltDBServer.
In the guide, a java procedure is created and tested, but there isn't an explicit step to load the class. When the DDL command "CREATE PROCEDURE PARTITION ON TABLE foo COLUMN id FROM CLASS procedures.UpdateFoo;" is executed, the procedures.UpdateFoo class is already available. This may be because the procedure and test are in the same Eclipse project. If the procedure was developed separately, you would need to load the jar into the classpath for the Eclipse project where you develop the tests.
Disclosure: I work at VoltDB.
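In other words, if the procedure classes are already on the test classpath, the whole setup can be driven from DDL alone, with no LOAD CLASSES step. A minimal sketch, assuming a hypothetical foo table and a procedures.UpdateFoo class compiled into the test project (the procedure signature is illustrative):

```java
import org.voltdb.InProcessVoltDBServer;
import org.voltdb.client.Client;

// Sketch only: assumes the classes from StoredProcs.jar are on the test
// classpath, so the CREATE PROCEDURE ... FROM CLASS statement can resolve
// procedures.UpdateFoo directly. Table and procedure names are illustrative.
public class VoltTestSetup {
    public static void main(String[] args) throws Exception {
        InProcessVoltDBServer volt = new InProcessVoltDBServer();
        volt.start();
        volt.runDDLFromString(
            "CREATE TABLE foo (id INTEGER NOT NULL, val VARCHAR(32), PRIMARY KEY (id));"
          + "PARTITION TABLE foo ON COLUMN id;"
          + "CREATE PROCEDURE PARTITION ON TABLE foo COLUMN id FROM CLASS procedures.UpdateFoo;");
        Client client = volt.getClient();
        client.callProcedure("UpdateFoo", "updated", 1); // hypothetical parameters
        volt.shutdown();
    }
}
```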
I must say that I am pretty new to GraalVM (like most people nowadays).
I have been following this guide by Josh Long to create a Spring Boot native application.
The guide explains that we can use the Java agent while executing tests in order to create reflect-config.json and the other files that are "given as an input to the GraalVM compiler". This is also explained in the Spring Boot documentation here and also here.
I created a repo (which is a clone of the one from the guide) and made just one commit in order to "activate the Java agent while executing the tests" (in the history of the branch, just look at the last commit).
You can clone it like this:
git clone https://github.com/clembo590/issues.git --branch issue_with_agent
When executing mvn clean -P native native:compile I get this error:
Caused by: com.oracle.svm.core.util.VMError$HostedError: New Method or Constructor found as reachable after static analysis: public java.lang.String com.example.aot.properties.DemoProperties.name()
I searched the web for anything about this issue:
I found one issue on GitHub about it, but I do not know how to fix it in the case of a Spring Boot app.
Thank you for your help.
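For context, the "activate the Java agent while executing the tests" step from the guide boils down to attaching GraalVM's tracing agent to the test JVM. With Maven Surefire that can be done from the command line; the output directory below is illustrative:

```shell
# Run the tests with the native-image tracing agent attached; it records
# reflection/resource/proxy usage into reflect-config.json and friends,
# which native-image then picks up from META-INF/native-image.
mvn test -DargLine="-agentlib:native-image-agent=config-output-dir=src/main/resources/META-INF/native-image"
```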
I am working on a very simple Snowflake example to call a pre-compiled Java UDF from IntelliJ. I have made a jar out of my Java UDF and want to call it from IntelliJ, where I am trying to execute the UDF on Snowflake.
How can I upload this jar into Snowflake?
session.udf.registerTemporary("sampleJUdf",
"C:\\project_existing\\snowpark\\src\\main\\Java\\myjarfile.jar")
You can upload the jar file using Session.addDependency(), for example:
session.addDependency("C:\\project_existing\\snowpark\\src\\main\\Java\\myjarfile.jar")
Then you need to reference the function in your myjarfile.jar that is going to be used by the UDF, for example:
session.udf.registerTemporary("sampleJUdf", myFunc)
If the jar is part of your running application, Snowpark should automatically try to upload it, but you need to import it in your code and reference the function in registerTemporary.
See https://docs.snowflake.com/en/developer-guide/snowpark/creating-udfs.html#specifying-dependencies-for-a-udf for more details.
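Putting the two steps together in the Snowpark Java API might look roughly like the sketch below. The connection properties, the jar path, and MyUdfClass are placeholders for your own setup; the stub class here only stands in for whatever logic lives inside myjarfile.jar:

```java
import com.snowflake.snowpark_java.Session;
import com.snowflake.snowpark_java.types.DataTypes;

// Sketch only: config file, jar path, and MyUdfClass are placeholders.
public class RegisterJarUdf {

    // Stand-in for the class packaged in myjarfile.jar.
    static class MyUdfClass {
        static String transform(String s) { return s.toUpperCase(); }
    }

    public static void main(String[] args) {
        Session session = Session.builder().configFile("profile.properties").create();

        // Make the pre-compiled jar available to the UDF on the server side.
        session.addDependency("C:\\project_existing\\snowpark\\src\\main\\Java\\myjarfile.jar");

        // Register a temporary UDF that delegates to the code in the jar.
        session.udf().registerTemporary(
            "sampleJUdf",
            (String s) -> MyUdfClass.transform(s),  // hypothetical entry point
            DataTypes.StringType,   // input type
            DataTypes.StringType);  // return type
    }
}
```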
I am running a webapp using a connection to MongoDB where products reviews are stored. The current version of the webapp works correctly. (i.e. it writes and stores new reviews in the MongoDB collection).
Nonetheless, when I make changes and try to compile a new version of the MongoDB Utilities class I keep getting:
error package com.mongodb.XXX does not exist
import com.mongodb.BasicDBObject;
^
I do have the following .jar files in the \lib directory:
mongodb-driver-3.6.3
mongodb-driver-core-3.6.3
mongodb-java-driver-3.6.3
bson-3.6.3
and I referenced them in the CLASSPATH variable:
set CLASSPATH=.;C:\apache-tomcat-7.0.34\lib\servlet-api.jar;C:\apache-tomcat-7.0.34\lib\jsp-api.jar;C:\apache-tomcat-7.0.34\lib\el-api.jar;C:\apache-tomcat-7.0.34\lib\commons-beanutils-1.8.3.jar; C:\apache-tomcat-7.0.34\lib\mongo-java-driver-3.6.3.jar; C:\apache-tomcat-7.0.34\lib\bson-3.6.3;C:\apache-tomcat-7.0.34\lib\mongodb-driver-3.6.3; C:\apache-tomcat-7.0.34\lib\mongodb-driver-core-3.6.3; C:\apache-tomcat-7.0.34\lib\mysql-connector-java-5.1.38-bin.jar;
What am I doing wrong? How can I get java to compile my new MongoDB Utilities class?
I couldn't reproduce the same behaviour, but I only reference mongodb-java-driver-3.6.3 in my build scripts.
Since the mongodb-java-driver is an uber JAR that contains mongodb-driver, mongodb-driver-core, and bson, you could try removing these latter three from your classpath and build scripts and see if that resolves the issue.
I was able to solve my issue, so I am posting this answer in case someone is stuck on the same problem.
Thanks to what user "nos" posted in the answer to this question, I used the -verbose option when compiling, e.g.:
javac -verbose className.java
In the error log I noticed that the Java compiler was searching for the MongoDB classes in a different \lib folder than the one I used in my CLASSPATH definition. So I added a copy of the mongodb-java-driver there and the compilation worked.
As craigcaulfield correctly mentions above, there is no need to add the other drivers (mongodb-driver, mongodb-driver-core, and bson).
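One way to take the CLASSPATH environment variable out of the equation entirely is to pass the classpath straight to javac; the source file name below is illustrative:

```shell
# -verbose prints each class file as javac loads it, which shows exactly
# which \lib folder is being searched; -cp overrides the CLASSPATH variable.
javac -verbose -cp ".;C:\apache-tomcat-7.0.34\lib\servlet-api.jar;C:\apache-tomcat-7.0.34\lib\mongo-java-driver-3.6.3.jar" MongoUtilities.java
```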
I'm facing a problem with JUnit tests. I have written a JUnitRunner which is used to execute the WrapperTest.
The WrapperTest generates a plain JUnit test and a needed file. When I want to execute the methods of the generated test, my runner searches the development workspace for the "NeededClass".
I'm generating the needed class in the JUnit workspace, and I want the tests to use this generated class file so I can delete it from my development workspace.
So, how do I execute the generated test in the JUnit workspace? (It should look in the JUnit workspace for the needed file.)
Edit: OK, I found out it's a ClassLoader problem. The development workspace has a different ClassLoader than the JUnit workspace, which causes weird errors, for example a "class isn't the identical class" exception (java.lang.ClassCastException: org.junit.runner.JUnitCore cannot be cast to org.junit.runner.JUnitCore). It looks like I have to fix this with reflection, which is very dirty.
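The mismatch can be reproduced with nothing but the JDK. The sketch below (class names are illustrative, not taken from the actual workspaces) defines the same class through a second ClassLoader and shows that the two Class objects are not interchangeable:

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

// Demonstrates the "identical class from two ClassLoaders" problem: the same
// .class file defined by two loaders produces two distinct Class objects,
// so a cast between them throws ClassCastException.
public class LoaderDemo {

    public static class Payload {}

    // A loader that defines Payload itself instead of delegating to its
    // parent, mimicking a second workspace with its own ClassLoader.
    static class IsolatingLoader extends ClassLoader {
        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.equals(Payload.class.getName())) {
                byte[] bytes = readClassBytes(name);
                return defineClass(name, bytes, 0, bytes.length);
            }
            return super.loadClass(name, resolve);
        }

        private byte[] readClassBytes(String name) throws ClassNotFoundException {
            String resource = name.replace('.', '/') + ".class";
            try (InputStream in = getResourceAsStream(resource)) {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                int b;
                while ((b = in.read()) != -1) out.write(b);
                return out.toByteArray();
            } catch (Exception e) {
                throw new ClassNotFoundException(name, e);
            }
        }
    }

    // false: the re-defined Payload is a different Class object.
    public static boolean sameClass() {
        try {
            Class<?> other = new IsolatingLoader().loadClass(Payload.class.getName());
            return other == Payload.class;
        } catch (ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    // true: casting an instance from the other loader fails, just like the
    // JUnitCore-cannot-be-cast-to-JUnitCore error above.
    public static boolean castFails() {
        try {
            Object obj = new IsolatingLoader().loadClass(Payload.class.getName())
                    .getDeclaredConstructor().newInstance();
            Payload ignored = (Payload) obj; // ClassCastException here
            return false;
        } catch (ClassCastException expected) {
            return true;
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("same Class object: " + sameClass()); // false
        System.out.println("cast fails: " + castFails());        // true
    }
}
```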
Look into Maven and its build lifecycle. You can wire the code generation you are doing into the generate-test-sources phase and then have it participate as normal in the test phase.
See this question for an example.
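As a sketch of that wiring, a plugin execution can be bound to the generate-test-sources phase in the pom; exec-maven-plugin and the generator main class here are just illustrations, any code-generation plugin works the same way:

```xml
<!-- Illustrative only: bind your test-class generator to generate-test-sources -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <id>generate-wrapper-tests</id>
      <phase>generate-test-sources</phase>
      <goals><goal>java</goal></goals>
      <configuration>
        <!-- hypothetical generator main class -->
        <mainClass>com.example.TestGenerator</mainClass>
      </configuration>
    </execution>
  </executions>
</plugin>
```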
I am trying to provide an interface that I can call from MATLAB to access the contents of a database. I already have an existing Java interface that uses eclipselink to connect to the database and I would like to re-use it. I wrote a class to provide this and it works properly when I call it straight from Java, but when I try to call it from MATLAB I get the following exception:
javax.persistence.PersistenceException: No Persistence provider for EntityManager named DatabaseConnection
Usually this exception occurs when the META-INF folder with persistence.xml is not on the classpath properly, but I have tried putting the base folder that contains the META-INF folder on both the dynamic and static MATLAB Java class paths with no success. Again, this exact code (including the persistence.xml) works fine when run from Java. Does anyone know what I am missing?
The one main difference I was able to track down from what happens when I run the Java code directly is that MATLAB uses an OSGi class loader (Felix) rather than the default class loader Java uses, but I haven't figured out what difference that makes to finding the persistence.xml.