I am working on a very simple Snowflake example that calls a pre-compiled Java UDF from IntelliJ. I have packaged my Java UDF into a jar and want to call it from IntelliJ, where I am trying to execute the UDF on Snowflake.
How can I upload this jar into Snowflake? This is what I am trying:
session.udf.registerTemporary("sampleJUdf",
"C:\\project_existing\\snowpark\\src\\main\\Java\\myjarfile.jar")
You can upload the jar file using Session.addDependency(), for example:
session.addDependency("C:\\project_existing\\snowpark\\src\\main\\Java\\myjarfile.jar")
Then you need to reference the function from your myjarfile.jar that the UDF is going to use, for example:
session.udf.registerTemporary("sampleJUdf", myFunc)
If the jar is part of your running application, Snowpark should automatically try to upload it, but you still need to import it in your code and reference the function by name in registerTemporary.
See https://docs.snowflake.com/en/developer-guide/snowpark/creating-udfs.html#specifying-dependencies-for-a-udf for more details.
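For reference, here is a minimal end-to-end sketch using the Snowpark Java API. The class com.example.MyFunctions and its static method myFunc are hypothetical stand-ins for whatever your myjarfile.jar actually contains:

import com.snowflake.snowpark_java.Session;
import com.snowflake.snowpark_java.types.DataTypes;
import com.example.MyFunctions; // hypothetical class packaged in myjarfile.jar

public class RegisterSampleUdf {
    public static void main(String[] args) {
        Session session = Session.builder().configFile("profile.properties").create();

        // Upload the jar so its classes are available when the UDF executes on Snowflake
        session.addDependency("C:\\project_existing\\snowpark\\src\\main\\Java\\myjarfile.jar");

        // Register a temporary UDF whose body delegates to the method from the jar
        session.udf().registerTemporary(
                "sampleJUdf",
                (String s) -> MyFunctions.myFunc(s),
                DataTypes.StringType,
                DataTypes.StringType);
    }
}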
I have several Spark big data applications written in Scala. Each of these applications also has a version written in R.
I also have a web server application written in Java, which serves as the API for a web GUI. The goal is to let the GUI execute these applications and choose the version: R or Spark. I managed to call the R code from the Java API and get the result as JSON, but executing the Spark programs now seems to be quite complicated.
Until now, I was able to merge one of the Scala .jar files with the Java API using Maven. I did this by declaring my Spark program as a local repository in pom.xml so that the Scala code is included in the final .jar package. I also listed Scala and the Breeze library as dependencies in pom.xml. When I send a request to the API, it of course throws an error: java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession$. At that point I realized it was because I hadn't listed the Spark library in the Maven dependencies, but then I suspected I had been doing it wrong all along, since Spark applications are generally run by executing the spark-submit command in a terminal.
So now what I'm thinking is to put the Java API .jar and the Scala .jar in one folder, and then execute spark-submit from inside the Java API .jar, targeting the Scala .jar. Is this even correct? And how do I execute spark-submit from Java code? Does it have to use Runtime.exec(), as mentioned here?
SparkLauncher can be used to submit the Spark code (written in Scala, with the precompiled jar scala.jar placed at a certain location) from the Java API code.
The Spark documentation recommends the following way to use SparkLauncher to submit a Spark job programmatically from inside Java applications. Add the code below to your Java API code.
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class MyLauncher {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/my/scala.jar")
                .setMainClass("my.spark.app.Main")
                .setMaster("local")
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .startApplication();
        // Use the handle API to monitor / control the application.
    }
}
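The comment at the end hints at the handle API. For example, if the caller should block until the job reaches a terminal state, you can poll the handle; a minimal sketch of that variation (same launcher settings as above):

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class MyBlockingLauncher {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/my/scala.jar")
                .setMainClass("my.spark.app.Main")
                .setMaster("local")
                .startApplication();

        // isFinal() becomes true once the app is FINISHED, FAILED, or KILLED
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
        System.out.println("Application ended in state: " + handle.getState());
    }
}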
I noticed that IntelliJ can parse .kts files as Kotlin and the code editor picks them up as free-floating Kotlin files. You are also able to run the script in IntelliJ as you would a Kotlin file with a main method. The script executes from top to bottom.
This form is PERFECT for the project I'm working on, if only I knew an easy way to use them from within Java or Kotlin.
What's the idiomatic way to "run" these scripts from Java or Kotlin?
Note that support for script files in Kotlin is still pretty much experimental. This is an undocumented feature that we're still in the process of designing. What works today may change, break, or disappear tomorrow.
That said, currently there are two ways to invoke a script. You can use the command line compiler:
kotlinc -script foo.kts <args>
Or you can invoke the script directly from IntelliJ IDEA, by right-clicking in the editor or in the project view on a .kts file and selecting "Run ...":
KtsRunner
I've published a simple library that lets you run scripts from regular Kotlin programs.
https://github.com/s1monw1/KtsRunner
Example
The example class
data class ClassFromScript(val x: String)
The .kts file
import de.swirtz.ktsrunner.objectloader.ClassFromScript
ClassFromScript("I was created in kts")
The code to load the class
import de.swirtz.ktsrunner.objectloader.KtsObjectLoader
import java.nio.file.Files
import java.nio.file.Paths

val scriptReader = Files.newBufferedReader(Paths.get("path/classDeclaration.kts"))
val loadedObj: ClassFromScript = KtsObjectLoader().load<ClassFromScript>(scriptReader)
println(loadedObj.x) // >> I was created in kts
As shown, the KtsObjectLoader class can be used for executing a .kts script and returning its result. The example shows a script that creates an instance of the ClassFromScript type, which is loaded via KtsObjectLoader and then processed by the regular program.
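If you would rather not pull in a library, you can also evaluate a script by hand through Kotlin's JSR-223 script engine, which is what makes this kind of loader possible. A sketch in Java, assuming org.jetbrains.kotlin:kotlin-scripting-jsr223 is on the classpath:

import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

public class KtsEval {
    public static void main(String[] args) throws Exception {
        // The Kotlin JSR-223 artifact registers an engine for the "kts" extension
        ScriptEngine engine = new ScriptEngineManager().getEngineByExtension("kts");

        // eval returns the value of the script's last expression
        Object result = engine.eval("val x = \"I was created in kts\"\nx");
        System.out.println(result); // >> I was created in kts
    }
}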
As of 2020 (Kotlin 1.3.70), you can just use the straightforward
kotlin script.main.kts
Note that using the file extension .main.kts instead of .kts seems to be important.
Note that for me this does not seem to run the main() function if one is defined; I had to add a manual call to main() at the top level.
One of the advantages of Kotlin script is the ability to declare code and dependencies inside a single file (with #file:DependsOn, see for example here)
As of the early 2020s, kscript, which you can find at https://github.com/holgerbrandl/kscript,
seems to be the most convenient and well-supported way to go.
jgo can fetch and run code from Maven repositories, so it can be used to invoke
https://github.com/scijava/scijava-common and https://github.com/scijava/scripting-kotlin to execute a local Foo.kt like so:
jgo --repository scijava.public=maven.scijava.org/content/groups/public org.scijava:scijava-common:#ScriptREPL+org.scijava:scripting-kotlin Foo.kt
If no Foo.kt is provided, it launches a Kotlin REPL.
I work on a Scala project that uses C++ code, built with sbt. Once compiled, this C++ code is imported into Scala through Java code that uses JNA.
Currently, the Java wrappers are written manually, and I would like to automate this. I've found JNAerator, which can do that, but I don't know how I should use it with sbt.
I see two general approaches:
use the command line, such as java -jar jnaerator ..., but I don't know how to set up such a command-line task in sbt. Also, I would need to know the typical project structure to follow: where should the generated JNA code be output?
use the JNAerator Maven plugin through sbt, if that is possible.
This might take some iteration until we get it to do what you need.
For the first approach, here is how you can run a custom system command from sbt (you essentially solve this with Scala code). Add the following to your build.sbt file:
lazy val runJnaerator = taskKey[Unit]("This task generates libraries from native code")

runJnaerator := {
  import sys.process._
  Seq("java", "-jar", "jnaerator", "...").!
}
To execute:
>sbt runJnaerator
Now the question is: where do you need these files to go? And finally, how do you want to invoke everything?
I'm working on a Java/JRuby project which needs to be able to interact with GAMS. I know we can use the Java API, but I would really like to access it from JRuby if possible, since we're hoping to eventually add a DSL and some other complexity I'm not really excited about having to implement in pure Java.
Following the official Java API documentation for GAMS, I have downloaded and set up everything necessary to run GAMS from the command line, but I can't figure out how to include the GAMS directory in LD_LIBRARY_PATH and still run JRuby irb. When I run
export LD_LIBRARY_PATH=/home/wikk/Downloads/gams24.0_linux_x64_64_sfx
and then try to run irb with JRuby, I get
jruby: /home/wikk/Downloads/gams24.0_linux_x64_64_sfx/libstdc++.so.6: version 'GLIBCXX_3.4.15' not found (required by jruby)
I think this is what the documentation is asking me to do to run a Java program that calls the API. Is there maybe some way to set LD_LIBRARY_PATH in irb, but before importing all the Java class files? I can do this successfully if I don't set LD_LIBRARY_PATH, but then GAMS tells me it can't find the main program when I try to create a new GAMSWorkspace object:
irb(main):002:0> ws = GAMSWorkspace.new
Java::ComGamsApi::GAMSException: could not find a GAMS system directory from
your environment variable, please set up properly before running a program!
from com.gams.api.GAMSWorkspace.verifySystemDirectory(GAMSWorkspace.java:335)
Am I doing this wrong? Or does the API require some Java feature that isn't implemented in JRuby?
I finally came back to this problem and got it working through some trial and error. I also needed to run jruby with the -J-Djava.library.path=[GAMSDIR]/apifiles/Java/api flag and add [GAMSDIR]/apifiles/Java/api/GAMSJavaAPI.jar to the classpath.
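Putting those pieces together, the invocation looks roughly like this (run_model.rb is just a placeholder for your own script):

jruby -J-Djava.library.path=[GAMSDIR]/apifiles/Java/api \
      -J-cp [GAMSDIR]/apifiles/Java/api/GAMSJavaAPI.jar \
      run_model.rb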
Once this is all in place, you can run GAMS models from Ruby scripts:
import com.gams.api.GAMSWorkspace
import com.gams.api.GAMSJob
import com.gams.api.GAMSVariable
import com.gams.api.GAMSVariableRecord
ws = GAMSWorkspace.new
j1 = ws.addJobFromGamsLib('trnsport')
j1.run
j1.out_db.get_variable('x').each_entry do |rec|
puts "x(#{rec.get_keys[0]}, #{rec.get_keys[1]}): level = #{rec.get_level}, marginal = #{rec.get_marginal}"
end
I am writing here because it is the only thread related to the GAMS Java API problem.
In Eclipse, you have to go to "Run Configurations" and add two things:
1. (As already said) add -Djava.library.path=[GAMSDIR]\apifiles\Java\api\ to the VM arguments.
2. Go to Environment and explicitly SET a PATH variable to [GAMSDIR]. For some reason, setting the path through Windows does not work.
I am trying to use Java protobuf stubs inside Matlab. I generated the Java stub and the corresponding jar file in Eclipse. I then took the jar file and added it to the Matlab path. In Matlab I do the following:
import raven.aos.*;
import raven.aos.Messages.*;
image = Image.newBuilder();
At this point I get an error message that says:
??? Undefined variable "Image" or class "Image.newBuilder".
Error in ==> pub>pub.pub at 16
image = Image.newBuilder();
I have successfully been able to use the Java jar in a Java project using the exact same syntax. So this validates that my stub is correct. I have also successfully imported and used a different Java library, zmq.jar, in my Matlab project, so to a certain extent this verifies that I know how to import jar files properly into Matlab.
I've refrained from attaching the generated Java stub file since it is very long. I hope that someone can point out what I'm doing wrong with just the code that I've provided. If required, I will add the stub source.
Thanks in advance!
Because generated protocol buffer message classes are nested (inner) classes, you need to use Matlab's javaMethod command to get at their static methods; an import statement will not work. Using your example:
image = javaMethod('newBuilder','raven.aos.Messages$Image');
http://www.mathworks.com/help/techdoc/ref/javamethod.html
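For context, javaMethod is doing the reflective equivalent of the following Java, where raven.aos.Messages$Image is the JVM binary name of the nested Image class:

public class ReflectImage {
    public static void main(String[] args) throws Exception {
        // Nested classes use '$' in their JVM binary name
        Class<?> imageClass = Class.forName("raven.aos.Messages$Image");
        java.lang.reflect.Method newBuilder = imageClass.getMethod("newBuilder");
        Object builder = newBuilder.invoke(null); // static method, so the receiver is null
        System.out.println(builder.getClass().getName());
    }
}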