I am trying to understand how to use JPL. For this purpose I copied one of its tests from the doc section (swipl\doc\packages\examples\jpl\java\Time) into Eclipse and tried to run it.
If I double-click the batch file, everything runs fine. If I run the Time class from Eclipse I get:
Exception in thread "main" jpl.PrologException: PrologException: error(existence_error(source_sink, 'time.pl'), _0)
I created a simple Java project and copied Time.java and time.pl to the root.
I also created the needed path variables and added jpl.jar to the project.
JPL.init() works; it fails on the if statement in this part:
static void test_0() {
    Query query = new Query("consult('time.pl')");
    if (!query.hasSolution()) {
The path to the Prolog file needs the src/ prefix:
Query query = new Query("consult('src/time.pl')");
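For reference, a minimal sketch of the corrected test (assuming the legacy jpl package from the question's stack trace, and that Eclipse uses the project root as the working directory, so the file copied next to Time.java is reachable as src/time.pl):
import jpl.JPL;
import jpl.Query;

public class Time {

    static void test_0() {
        // Eclipse runs the program from the project root,
        // so the file lives at src/time.pl relative to the working directory.
        Query query = new Query("consult('src/time.pl')");
        if (!query.hasSolution()) {
            System.err.println("consult('src/time.pl') failed");
            return;
        }
        System.out.println("time.pl consulted successfully");
    }

    public static void main(String[] args) {
        JPL.init();
        test_0();
    }
}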
I'm using Krextor to convert XML to RDF. It runs fine from the command line.
I am trying to run it from Java (Eclipse) using this code:
private static void XMLToRDF() throws KrextorException, ValidityException, ParsingException, IOException, XSLException {
    Element root = new Element("person");
    Attribute friend = new Attribute("friends", "http://van-houten.name/milhouse");
    root.addAttribute(friend);

    Element name = new Element("name");
    name.appendChild("Bart Simpson");
    root.appendChild(name);

    nu.xom.Document inputDocument = new nu.xom.Document(root);
    System.out.println(inputDocument.toXML());

    Element root1 = inputDocument.getRootElement();
    System.out.println(root1);

    Krextor k = new Krextor();
    nu.xom.Document outputDocument = k.extract("socialnetwork", "turtle", inputDocument);
    System.out.println(outputDocument.toString());
}
I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: net/sf/saxon/CollectionURIResolver
Caused by: java.lang.ClassNotFoundException: net.sf.saxon.CollectionURIResolver
I have included Saxon9he on the classpath, and I have also added it manually as a library in the project, but the error is the same.
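One quick way to narrow this down (my own suggestion, not part of Krextor) is to run a small check from the same Eclipse launch configuration to see whether the missing Saxon class is visible at all, and if so, which jar it comes from:
public class SaxonCheck {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("net.sf.saxon.CollectionURIResolver");
            // Prints the jar the class was actually loaded from.
            System.out.println("Found " + c + " in "
                    + c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println("net.sf.saxon.CollectionURIResolver is not on the runtime classpath");
        }
    }
}
If this prints a Saxon jar but the Krextor call still fails, the class is probably being loaded by a different class loader than the one the failing code uses.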
I am the main developer of Krextor, and, @Michael Kay, actually a colleague of Grangel's, so I will resolve the concrete problem with him locally.
Indeed, the last Saxon version with which I did serious testing was 9.1; after that I haven't used Krextor integrated into Java but have mainly used it from the command line.
@Grangel, could you please file an issue for Krextor? Then we can work on fixing it together.
Indeed, @Michael Kay, for a while I had been including more recent Saxon versions with Krextor and updated the command-line wrapper to use them (e.g. to add the different JARs to the classpath), but I have not necessarily updated the Java wrapper code.
I'm trying to write a UDF for Hadoop Hive that parses user agents. The following code works fine on my local machine, but on Hadoop I get:
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String MyUDF.evaluate(java.lang.String) throws org.apache.hadoop.hive.ql.metadata.HiveException on object MyUDF@64ca8bfb of class MyUDF with arguments {All Occupations:java.lang.String} of size 1
Code:
import java.io.IOException;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.*;
import com.decibel.uasparser.OnlineUpdater;
import com.decibel.uasparser.UASparser;
import com.decibel.uasparser.UserAgentInfo;
public class MyUDF extends UDF {
    public String evaluate(String i) {
        UASparser parser = null;
        parser = new UASparser();
        String key = "";
        OnlineUpdater update = new OnlineUpdater(parser, key);
        UserAgentInfo info = null;
        info = parser.parse(i);
        return info.getDeviceType();
    }
}
Facts that come to my mind I should mention:
I'm compiling with Eclipse's "Export runnable JAR file" wizard with the "Extract required libraries into generated JAR" option.
I'm uploading this "fat jar" file with Hue
The minimal working example I managed to run:
public String evaluate(String i) {
    return "hello" + i.toString();
}
I guess the problem lies somewhere around the library I'm using (downloaded from https://udger.com), but I have no idea where.
Any suggestions?
Thanks, Michal
It could be a few things. The best thing is to check the logs, but here's a list of quick things you can check in a minute.
The jar does not contain all dependencies. I am not sure how Eclipse builds a runnable jar, but it may not include all of them. You can run
jar tf your-udf-jar.jar
to see what was included. You should see classes from com.decibel.uasparser. If not, you have to build the jar with the appropriate dependencies (usually you do that with Maven).
A different JVM version. If you compile with JDK 8 and the cluster runs JDK 7, it will also fail.
The Hive version. Sometimes the Hive APIs change slightly, enough to be incompatible. That's probably not the case here, but make sure to compile the UDF against the same versions of Hadoop and Hive that run on the cluster.
You should always check whether info is null after the call to parse().
The library uses a key, meaning it actually gets data from an online service (udger.com), so it may not work without a valid key. Even more important, the library updates itself online, contacting the service for each record. Looking at the code, this means it will create one update thread per record. You should change the code to do that only once, in the constructor.
Here's how to change it:
public class MyUDF extends UDF {
    UASparser parser = new UASparser();

    public MyUDF() {
        super();
        String key = "PUT YOUR KEY HERE";
        // update only once, when the UDF is instantiated
        OnlineUpdater update = new OnlineUpdater(parser, key);
    }

    public String evaluate(String i) {
        UserAgentInfo info = parser.parse(i);
        if (info != null) {
            return info.getDeviceType();
        } else {
            // return null if the string is unparseable;
            // otherwise one bad record will stop your processing
            // with an exception
            return null;
        }
    }
}
But to know for sure, you have to look at the logs: the YARN logs, but also the Hive logs on the machine you're submitting the job from (probably in /var/log/hive, but it depends on your installation).
Such a problem can probably be solved with these steps:
Override the method UDF.getRequiredJars() so that it returns a list of HDFS file paths, whose values are determined by where you put the xxx_lib folder in HDFS (see the sketch after these steps). Note that the list must contain each jar's full HDFS path string, such as hdfs://yourcluster/some_path/xxx_lib/some.jar.
Export your UDF code using the "Runnable JAR file" export wizard (choose "Copy required libraries into a sub-folder next to the generated JAR"). This step produces xxx.jar and a library folder xxx_lib next to it.
Put xxx.jar and the xxx_lib folder into HDFS at the paths your code from the first step expects.
Create the UDF with: add jar ${the-xxx.jar-hdfs-path}; create function your-function as '${qualified name of udf class}';
Try it. I tested this and it works.
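A minimal sketch of the first step; the HDFS paths and jar names below are placeholders, not actual values from the question:
import org.apache.hadoop.hive.ql.exec.UDF;

public class MyUDF extends UDF {

    // Tell Hive which extra jars to ship to the task nodes; the entries must be
    // the full HDFS paths where xxx_lib was uploaded (placeholders here).
    @Override
    public String[] getRequiredJars() {
        return new String[] {
            "hdfs://yourcluster/some_path/xxx_lib/uasparser.jar",
            "hdfs://yourcluster/some_path/xxx_lib/some-other-dependency.jar"
        };
    }

    public String evaluate(String i) {
        // ... same parsing logic as above ...
        return i;
    }
}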
I ran into library loading problems after creating a jar from my code via Maven. I use IntelliJ IDEA on Ubuntu. I narrowed the problem down to this situation:
Calling the following code from within IDEA prints the path correctly.
package com.myproject;

import java.io.File;

public class Starter {
    public static void main(String[] args) {
        File classpathRoot = new File(Starter.class.getResource("/").getPath());
        System.out.println(classpathRoot.getPath());
    }
}
Output is:
/home/ted/java/myproject/target/classes
When I call mvn install and try to run it from the command line using the following command, I get a NullPointerException, since class.getResource() returns null:
cd /home/ted/java/myproject/target/
java -cp myproject-0.1-SNAPSHOT.jar com.myproject.Starter
The same happens for:
cd /home/ted/java/myproject/target/
java -Djava.library.path=. -cp myproject-0.1-SNAPSHOT.jar com.myproject.Starter
It doesn't matter if I use class.getClassLoader().getResource("") instead. The same problem occurs when accessing single files inside the target directory via class.getClassLoader().getResource("file.txt").
I want to use this way to load native files in the same directory (not from inside the jar). What's wrong with my approach?
The classpath loading mechanism in the JVM is highly extensible, so it's often hard to guarantee a single method that works in all cases. For example, what works in your IDE may not work when running in a container, because your IDE and your container probably have highly specialized class loaders with different requirements.
You could take a two tiered approach. If the method above fails, you could get the classpath from the system properties, and scan it for the jar file you're interested in and then extract the directory from that entry.
e.g.
public static void main(String[] args) {
    File f = findJarLocation("jaxb-impl.jar");
    System.out.println(f);
}

public static File findJarLocation(String entryName) {
    String pathSep = System.getProperty("path.separator");
    String[] pathEntries = System.getProperty("java.class.path").split(pathSep);
    for (String entry : pathEntries) {
        File f = new File(entry);
        if (f.getName().equals(entryName)) {
            return f.getParentFile();
        }
    }
    return null;
}
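Another option, as a sketch rather than part of the approach above (it assumes the class was loaded from a plain jar or directory, and note that getCodeSource() can return null for classes loaded by the bootstrap loader): read the location straight from the class's CodeSource instead of scanning the classpath.
import java.io.File;
import java.net.URISyntaxException;

public class JarLocator {

    // Returns the directory containing the jar (or the classes directory)
    // that the given class was loaded from.
    public static File locationOf(Class<?> clazz) throws URISyntaxException {
        File source = new File(clazz.getProtectionDomain()
                .getCodeSource().getLocation().toURI());
        return source.isFile() ? source.getParentFile() : source;
    }

    public static void main(String[] args) throws URISyntaxException {
        System.out.println(locationOf(JarLocator.class));
    }
}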
I am having trouble using the CREATE FUNCTION command for a Derby database.
To start with, I tried
CREATE FUNCTION TO_DEGREES(RADIANS DOUBLE) RETURNS DOUBLE
PARAMETER STYLE JAVA NO SQL LANGUAGE JAVA
EXTERNAL NAME 'java.lang.Math.toDegrees'
and then
SELECT TO_DEGREES(3.142), BILLNO FROM SALEBILL
This works absolutely fine.
Now I tried making my own function like this:
package SQLUtils;

public final class TestClass
{
    public TestClass()
    {
    }

    public static int addNos(int val1, int val2)
    {
        return (val1 + val2);
    }
}
followed by
CREATE FUNCTION addno(no1 int, no2 int) RETURNS int
PARAMETER STYLE JAVA NO SQL LANGUAGE JAVA
EXTERNAL NAME 'SQLUtils.TestClass.addNos'
and then
SELECT addno(3,4), BILLNO FROM SALEBILL
This gives an exception:
Error code -1, SQL state 42X51: The class 'SQLUtils.TestClass' does not exist or is inaccessible. This can happen if the class is not public.
Error code 99999, SQL state XJ001: Java exception: 'SQLUtils.TestClass: java.lang.ClassNotFoundException'.
Line 6, column 1
I have made a jar file of the project containing the above class. I may be wrong, but the conclusion I draw from this is that the jar file needs to be on some classpath. But on which classpath, and how to add it, I am not able to work out.
I tried copying the jar file to the jdk\lib, jre\lib, and jdk\jre\lib folders, but to no avail.
Can someone please point me in the right direction ?
I am using NetBeans IDE 7.1.2, JDK 1.7.0_09, and Derby version 10.8.1.2 in network mode. The applications and data are on a server; I access them from NetBeans installed on a client computer.
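Since the jar has to be visible to the Derby network server rather than to the client, one documented Derby approach is to install the jar into the database itself and point derby.database.classpath at it. A rough sketch over JDBC, with a placeholder connection URL and jar path that you would need to adapt:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class InstallDerbyJar {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and path; adjust for your server setup.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:derby://localhost:1527/mydb;user=app;password=app");
             Statement st = con.createStatement()) {

            // Store the jar inside the database under the name APP.TestClassJar.
            st.execute("CALL SQLJ.INSTALL_JAR('C:/path/to/TestClass.jar', 'APP.TestClassJar', 0)");

            // Make the classes in that jar visible to CREATE FUNCTION calls.
            st.execute("CALL SYSCS_UTIL.SYSCS_SET_DATABASE_PROPERTY('derby.database.classpath', 'APP.TestClassJar')");
        }
    }
}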
I am relatively new to Java, so bear with me.
I am completing a tutorial on LWUIT, and just want to load a simple theme, created using the editor. Here is the code in question:
try
{
    Container container = c.getContainer();
    container.setVisible(true);

    Display.init(container);
    Display.getInstance().setPureTouch(true);

    //Resources r = Resources.open(getClass().getResourceAsStream("/res/Theme.res"));
    Resources r = Resources.open("/res/Theme.res");
    UIManager.getInstance().setThemeProps(r.getTheme("Simple"));
}
When I use the first (commented out) statement, I get
*** Signal: alarm { "name":"XletException", "domain":"ams", "appId":"com.thomasdge.xlet.hellojamaica.HelloJamaica", "msg":"XletAction['initXlet'] resulted in exception com.aicas.xlet.manager.AMSError: running xlet code caused java exception: initXlet() resulted in exception: java.lang.NullPointerException: <null>.read()I", "data":{ } }
When I use the other, I get
java.io.IOException: /res/Theme.res not found
I have my Theme.res file in /res/Theme. I have also tried it directly in the root, as well as /src. Same results for each.
Any ideas?
If you put the res file in that folder, you will need to go down one level. I recommend putting the res file in the src folder, i.e. /src/Theme.res. In the code you then only need to write Resources r = Resources.open("/Theme.res");
If the resource file is placed in the res folder, you need to add the res folder in the project properties. You also mentioned that the problem occurs even with the file in the /src folder; I suspect you didn't change the path. Just use Resources.open("/Theme.res") when the file is in the /src folder. Also check the theme name. This should work.
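For completeness, a minimal sketch of the loading code with Theme.res placed in /src (assuming the rest of the setup from the question, and that the theme inside the file really is named "Simple"):
try
{
    // With the file in /src it is packaged at the jar root, so no /res prefix.
    Resources r = Resources.open("/Theme.res");
    UIManager.getInstance().setThemeProps(r.getTheme("Simple"));
}
catch (java.io.IOException e)
{
    e.printStackTrace();
}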