In Java, we first add weka.jar to the classpath; we can then call any WEKA classification or clustering algorithm along these lines:
import weka.classifiers.trees.RandomForest;
...
RandomForest rf = new RandomForest(); // instantiate the classifier
Unfortunately, we cannot import the LibSVM algorithm this way, because there is no such class in weka.jar.
So my question is: how do I import LibSVM into my Java code? Any help will be appreciated :)
First, I'd like to say there are many ways to solve this problem. The solution mentioned here is quite simple; the other answers on StackOverflow are not described in detail, and verifying them cost me a lot of time. So I'm happy to share it with all WEKA beginners :)
a) Download LibSVM.jar from the Maven Central Repository. Note that this LibSVM.jar is different from the libsvm.jar developed by Chih-Chung Chang and Chih-Jen Lin;
b) Add the LibSVM.jar to the classpath of our Java project;
c) Call the LibSVM classifier where you need it, as in the following Java code.
import weka.classifiers.Evaluation;
import weka.classifiers.functions.LibSVM; // contained in LibSVM.jar
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import java.util.Random;

String path = "file/train.arff";
Instances train = DataSource.read(path); // load the dataset
train.setClassIndex(train.numAttributes() - 1); // set the last attribute as the class
LibSVM svm = new LibSVM(); // create the SVM classifier
svm.buildClassifier(train); // train on the dataset
Evaluation eval = new Evaluation(train);
eval.crossValidateModel(svm, train, 10, new Random(1)); // 10-fold cross-validation
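To inspect the cross-validation results, the evaluation summary can be printed as well (a small addition, not part of the original snippet):

System.out.println(eval.toSummaryString()); // accuracy, error rates, etc.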
See: https://weka.wikispaces.com/LibSVM
Use Weka's package manager to install LibSVM. Suppose "weka.jar" is in your current folder; then run this:
java -cp weka.jar weka.core.WekaPackageManager -install-package LibSVM
During the installation, it shows:
[DefaultPackageManager] Tmp file: /tmp/LibSVM1.0.107382715397815864641.zip
[DefaultPackageManager] Installing: Description.props
[DefaultPackageManager] Installing: LibSVM.jar
[DefaultPackageManager] Installing: build_package.xml
...
You can see that "LibSVM.jar" is installed somewhere. In my case, it is at:
/home/john/wekafiles/packages/LibSVM/LibSVM.jar
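Once the package is installed, it can also be loaded from Java code without adding LibSVM.jar to the classpath by hand. A sketch, assuming Weka's package-manager API (the classifier is looked up by name because it is not on the compile-time classpath):

import weka.classifiers.AbstractClassifier;
import weka.classifiers.Classifier;
import weka.core.WekaPackageManager;

// Load all installed packages (assumes LibSVM was installed as shown above)
WekaPackageManager.loadPackages(false);

// Instantiate the classifier by its fully qualified name
Classifier svm = AbstractClassifier.forName("weka.classifiers.functions.LibSVM", null);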
I am setting up a Java framework that should use Google OR-Tools. The code below compiles successfully, but throws an exception at runtime:
Exception in thread "main" java.lang.UnsatisfiedLinkError: com.google.ortools.linearsolver.operations_research_linear_solverJNI.MPSolver_CLP_LINEAR_PROGRAMMING_get()I
at com.google.ortools.linearsolver.operations_research_linear_solverJNI.MPSolver_CLP_LINEAR_PROGRAMMING_get(Native Method)
at com.google.ortools.linearsolver.MPSolver$OptimizationProblemType.<clinit>(MPSolver.java:221)
at Main.main(Main.java:15)
I am using IntelliJ 2018.3 on Windows 10. I spent a lot of time trying to get this to run, without success. Based on what I found online, the exception might be caused by bad linking and/or missing external libraries that OR-Tools depends on. However, I don't have the background to resolve this issue, and IntelliJ does not highlight anything either. Any idea what the problem is?
For completion, this is the code I run:
import com.google.ortools.linearsolver.MPConstraint; // needed for the constraint below
import com.google.ortools.linearsolver.MPObjective;
import com.google.ortools.linearsolver.MPSolver;
import com.google.ortools.linearsolver.MPVariable;
public final class Main {
    public static void main(String[] args) {
        // Create the linear solver with the GLOP backend.
        MPSolver solver =
            new MPSolver("SimpleLpProgram", MPSolver.OptimizationProblemType.GLOP_LINEAR_PROGRAMMING);

        // Create the variables x and y.
        MPVariable x = solver.makeNumVar(0.0, 1.0, "x");
        MPVariable y = solver.makeNumVar(0.0, 2.0, "y");
        System.out.println("Number of variables = " + solver.numVariables());

        // Create a linear constraint, 0 <= x + y <= 2.
        MPConstraint ct = solver.makeConstraint(0.0, 2.0, "ct");
        ct.setCoefficient(x, 1);
        ct.setCoefficient(y, 1);
        System.out.println("Number of constraints = " + solver.numConstraints());

        // Create the objective function, 3 * x + y.
        MPObjective objective = solver.objective();
        objective.setCoefficient(x, 3);
        objective.setCoefficient(y, 1);
        objective.setMaximization();

        solver.solve();
        System.out.println("Solution:");
        System.out.println("Objective value = " + objective.value());
        System.out.println("x = " + x.solutionValue());
        System.out.println("y = " + y.solutionValue());
    }
}
In my case the solution was simple - I just needed to add this single line of code:
Loader.loadNativeLibraries();
where Loader comes from com.google.ortools.Loader.
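For context, a minimal sketch of where the call goes, based on the question's own example (only the loader line is new):

import com.google.ortools.Loader;
import com.google.ortools.linearsolver.MPSolver;

public final class Main {
    public static void main(String[] args) {
        // Load the OR-Tools native libraries before touching any solver class
        Loader.loadNativeLibraries();
        MPSolver solver =
            new MPSolver("SimpleLpProgram", MPSolver.OptimizationProblemType.GLOP_LINEAR_PROGRAMMING);
        // ... build variables, constraints, and objective as in the question
    }
}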
Disclaimer: more a long comment than an answer...
note: I assume you are using the GitHub repository of or-tools; if you used the binary package, it should be more or less the same...
1) You must load the JNI library, which will load the OR-Tools C++ libraries and their dependencies...
/** Simple linear programming example. */
public class Main {
    static {
        System.loadLibrary("jniortools");
    }

    public static void main(String[] args) throws Exception {
2) Did you manage to run the Java samples?
make run SOURCE=ortools/linear_solver/samples/SimpleLpProgram.java
ref: https://developers.google.com/optimization/introduction/java#simple_example
3) As Kayaman pointed out, you must pass the folder where the Java runtime can find the native libraries (i.e. the JNI wrapper jniortools.dll and its dependency libortools.dll).
If you look at the console log, you'll see the full command line:
java -Xss2048k -Djava.library.path=lib -cp lib\sample.jar;lib\com.google.ortools.jar;lib\protobuf.jar ...\sample
which comes from the makefiles/Makefile.java.mk file:
JAVAFLAGS = -Djava.library.path=$(LIB_DIR)
...
ifeq ($(SOURCE_SUFFIX),.java) # Those rules will be used if SOURCE contains a .java file
$(CLASS_DIR)/$(SOURCE_NAME): $(SOURCE) $(JAVA_OR_TOOLS_LIBS) | $(CLASS_DIR)
	-$(DELREC) $(CLASS_DIR)$S$(SOURCE_NAME)
	-$(MKDIR_P) $(CLASS_DIR)$S$(SOURCE_NAME)
	"$(JAVAC_BIN)" -d $(CLASS_DIR)$S$(SOURCE_NAME) \
	 -cp $(LIB_DIR)$Scom.google.ortools.jar$(CPSEP)$(LIB_DIR)$Sprotobuf.jar \
	 $(SOURCE_PATH)
...
.PHONY: run # Run a Java program.
run: build
	"$(JAVA_BIN)" -Xss2048k $(JAVAFLAGS) \
	 -cp $(LIB_DIR)$S$(SOURCE_NAME)$J$(CPSEP)$(LIB_DIR)$Scom.google.ortools.jar$(CPSEP)$(LIB_DIR)$Sprotobuf.jar \
	 $(SOURCE_NAME) $(ARGS)
endif
src: https://github.com/google/or-tools/blob/46173008fdb15dae1dca0e8fa42a21ed6190b6e4/makefiles/Makefile.java.mk#L15
and
https://github.com/google/or-tools/blob/46173008fdb15dae1dca0e8fa42a21ed6190b6e4/makefiles/Makefile.java.mk#L328-L333
note: you can run make detect_java to see the flags, i.e. the value of LIB_DIR
note: if you used the precompiled package, the Makefile is here:
https://github.com/google/or-tools/blob/stable/tools/Makefile.cc.java.dotnet
Then you can try to add this option in IntelliJ...
You must understand that or-tools is a set of native C++ libraries that are wrapped for Java using the SWIG generator.
To make it work in IntelliJ (on a Windows machine) you need to:
Install Microsoft Visual C++ Redistributable for Visual Studio
Download and extract the OR-Tools library for Java
In IntelliJ, add a jar dependency on each of the 2 jars under the lib folder of the extracted files (add the 2 jars separately; do not add the lib folder itself).
Add the lib folder to the VM options. In IntelliJ, edit your run configuration and add to the VM options: -Djava.library.path=<path to the lib folder that holds the jars>
Load the JNI library statically by adding the code below to your class (as mentioned in the answer above):
static {
    System.loadLibrary("jniortools");
}
I'm trying to write a UDF for Hadoop Hive that parses user agents. The following code works fine on my local machine, but on Hadoop I get:
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String MyUDF .evaluate(java.lang.String) throws org.apache.hadoop.hive.ql.metadata.HiveException on object MyUDF#64ca8bfb of class MyUDF with arguments {All Occupations:java.lang.String} of size 1',
Code:
import java.io.IOException;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.*;
import com.decibel.uasparser.OnlineUpdater;
import com.decibel.uasparser.UASparser;
import com.decibel.uasparser.UserAgentInfo;

public class MyUDF extends UDF {
    public String evaluate(String i) {
        UASparser parser = null;
        parser = new UASparser();
        String key = "";
        OnlineUpdater update = new OnlineUpdater(parser, key);
        UserAgentInfo info = null;
        info = parser.parse(i);
        return info.getDeviceType();
    }
}
Some facts that come to mind which I should mention:
I'm compiling with Eclipse using "Export runnable JAR file" with the "extract required libraries into generated JAR" option
I'm uploading this "fat jar" file with Hue
A minimal working example that I managed to run:
public String evaluate(String i) {
    return "hello" + i.toString();
}
I guess the problem lies somewhere around the library I'm using (downloaded from https://udger.com), but I have no idea where.
Any suggestions?
Thanks, Michal
It could be a few things. The best thing is to check the logs, but here's a list of quick things you can check in a minute.
The jar does not contain all dependencies. I am not sure how Eclipse builds a runnable jar, but it may not include all dependencies. You can run
jar tf your-udf-jar.jar
to see what was included. You should see stuff from com.decibel.uasparser. If not, you have to build the jar with the appropriate dependencies (usually you do that using Maven).
A different version of the JVM. If you compile with JDK 8 and the cluster runs JDK 7, it would also fail.
The Hive version. Sometimes the Hive APIs change slightly, enough to be incompatible. Probably not the case here, but make sure to compile the UDF against the same versions of Hadoop and Hive that you have in the cluster.
You should always check whether info is null after the call to parse().
It looks like the library uses a key, meaning it actually gets data from an online service (udger.com), so it may not work without an actual key. Even more importantly, the library updates online, contacting the online service for each record; looking at the code, that means it will create one update thread per record. You should change the code to do the update only once, in the constructor. Here's how to change it:
public class MyUDF extends UDF {
    UASparser parser = new UASparser();

    public MyUDF() {
        super();
        String key = "PUT YOUR KEY HERE";
        // update only once, when the UDF is instantiated
        OnlineUpdater update = new OnlineUpdater(parser, key);
    }

    public String evaluate(String i) {
        UserAgentInfo info = parser.parse(i);
        if (info != null) return info.getDeviceType();
        // you want it to return null if it's unparseable,
        // otherwise one bad record will stop your processing
        // with an exception
        else return null;
    }
}
But to know for sure, you have to look at the logs... the YARN logs, but you can also look at the Hive logs on the machine you're submitting the job from (probably in /var/log/hive, but it depends on your installation).
Such a problem can probably be solved with the following steps:
Override the method UDF.getRequiredJars() so that it returns a list of HDFS file paths whose values are determined by where you put the xxx_lib folder (from the next step) in HDFS. Note that the list must contain each jar's full HDFS path string, such as hdfs://yourcluster/some_path/xxx_lib/some.jar (see the sketch after these steps).
Export your UDF code with the "Runnable JAR file" export wizard (choose "copy required libraries into a sub-folder next to the generated JAR"). This step results in an xxx.jar and a lib folder xxx_lib next to xxx.jar.
Put xxx.jar and the folder xxx_lib into your HDFS filesystem so they match the paths returned in the first step.
Create the UDF using: add jar ${the-xxx.jar-hdfs-path}; create function your-function as ${qualified name of udf class};
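For illustration, a sketch of the getRequiredJars() override from the first step; the jar names and HDFS paths are placeholders, not verified ones:

import org.apache.hadoop.hive.ql.exec.UDF;

public class MyUDF extends UDF {
    // Tell Hive which extra jars to ship with the job (placeholder paths)
    @Override
    public String[] getRequiredJars() {
        return new String[] {
            "hdfs://yourcluster/some_path/xxx_lib/uasparser.jar",
            "hdfs://yourcluster/some_path/xxx_lib/other-dependency.jar"
        };
    }

    public String evaluate(String i) {
        // ... same logic as the evaluate() shown earlier
        return null;
    }
}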
Try it; I tested this and it works.
I'm trying to change the existing plugin OpenComet for ImageJ. I'm not into Java, so perhaps it's an easy task.
What I'm trying to implement are the following things:
opening my files with the help of the Bio-Formats Importer plugin:
run("Bio-Formats Windowless Importer", "open=path autoscale color_mode=Default view=Hyperstack stack_order=XYCZT");
flipping the images horizontally:
run("Flip Horizontally");
This is supposed to be placed into the following code:
// Iterate over each input file
for (int i = 0; i < inFiles.length; i++) {
    // Try to open file as image
    // NUMBER 1: BIOFORMAT IMPORT AT THIS POINT
    ImagePlus imp = IJ.openImage(inFiles[i].getPath());

    // If image could be opened, run comet analysis
    if (imp != null) {
        // NUMBER 2: FLIPPING AT THIS POINT
        String imageKey = inFiles[i].getName();
Moreover, I would need to import the class of the Bio-Formats Importer or something like that, wouldn't I?
Thanks a lot in advance.
run("Bio-Formats Windowless Importer", "open=path autoscale
color_mode=Default view=Hyperstack stack_order=XYCZT");
opening my files with the help of the Bioformat Importer plugin
You can achieve this using the BF helper class of Bio-Formats (see its API documentation). For a JavaScript example, see the Bio-Formats documentation. In Java, this could look like:
import loci.plugins.BF;
[...]
ImagePlus[] imps = BF.openImagePlus(inFiles[i].getPath());
ImagePlus imp = imps[0];
run("Flip Horizontally");
Use the recorder (Plugins > Macros > Record...) in Java mode to get the required command:
import ij.IJ;
[...]
IJ.run(imp, "Flip Horizontally", "");
If you want to know the Java command at a lower level, use the Command Finder (press [L] or Plugins > Utilities > Find Commands...) and type "flip" and you'll find the class that implements the command:
ij.plugin.filter.Transformer("fliph")
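Putting both changes into the loop from the question, a sketch (BF.openImagePlus can throw checked exceptions that the surrounding plugin code must handle; the rest of the analysis is elided):

import ij.IJ;
import ij.ImagePlus;
import loci.plugins.BF;

// Iterate over each input file
for (int i = 0; i < inFiles.length; i++) {
    // NUMBER 1: open the file via Bio-Formats instead of IJ.openImage()
    ImagePlus[] imps = BF.openImagePlus(inFiles[i].getPath());
    ImagePlus imp = (imps == null || imps.length == 0) ? null : imps[0];

    // If the image could be opened, flip it and run the comet analysis
    if (imp != null) {
        // NUMBER 2: flip horizontally before the analysis
        IJ.run(imp, "Flip Horizontally", "");
        String imageKey = inFiles[i].getName();
        // ... rest of the OpenComet analysis
    }
}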
Hope that helps.
First of all, I'd like to thank everyone in advance for reading such a long post. I really appreciate your help.
The thing is that I've been doing some research on how to "connect" MATLAB and Java for a university project I'm working on. I figured that the most suitable option was MATLAB Builder JA, but I'm having a lot of trouble with it.
I followed step by step the instructions described in a tutorial (the link to the video is below) but get compilation errors over and over, and I really don't know how to fix them. The tutorial is about creating a Java package (demo.jar) with MATLAB ("com.demo"), which contains a class (MLTestClass) with a function makeSqr(n) that returns an n × n square matrix. Then I go to Eclipse, add both javabuilder.jar and demo.jar to the project, and create the following class:
public class Driver {
    public static void main(String[] args) {
        MLTestClass x = null;
        Object[] result = null;
        try {
            x = new MLTestClass();
            result = x.makeSqr(1, 5);
            System.out.println(result[0]);
        } catch (MWException e) {
            e.printStackTrace();
        }
    }
}
Of course I import com.demo.* and com.mathworks.toolbox.javabuilder.*.
Here are the errors the console gives me:
Exception in thread "main" java.lang.ExceptionInInitializerError
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration.getProxyLibraryDir(MCRConfiguration.java:163)
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration$MCRRoot.get(MCRConfiguration.java:77)
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration$MCRRoot.<clinit>(MCRConfiguration.java:87)
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration.getMCRRoot(MCRConfiguration.java:92)
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration$ModuleDir.<clinit>(MCRConfiguration.java:66)
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration.getModuleDir(MCRConfiguration.java:71)
at com.mathworks.toolbox.javabuilder.internal.MWMCR.<clinit>(MWMCR.java:1573)
at com.demo.DemoMCRFactory.(DemoMCRFactory.java:122)
at com.demo.MLTestClass.(MLTestClass.java:63)
at Driver.main(Driver.java:12)
Caused by: java.lang.NullPointerException
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration$ProxyLibraryDir.get(MCRConfiguration.java:143)
at com.mathworks.toolbox.javabuilder.internal.MCRConfiguration$ProxyLibraryDir.<clinit>(MCRConfiguration.java:158)
... 10 more
Just in case, link tutorial (it's the video): http://www.mathworks.nl/products/javabuilder/description2.html
Does anyone have any idea what the problem could be? It says something about a NullPointerException, but I don't know how to solve it, as the constructor is provided by the class created with MATLAB. I didn't have any issues installing MCR, and by the way I'm on Mac OS, which I hope is not the source of the problem :).
Again, sorry for the long post and thank you for your time.
Béntor.
Yes, please install MCR. The installation also mentions setting environment variables like LD_LIBRARY_PATH etc. If you are using Eclipse, I would recommend updating the environment variables under:
right click -> Properties -> Run/Debug Settings -> Environment Variables
I also had to make sure that the variable MCR_CACHE_ROOT pointed to a different directory, since my home directory was not big enough.
You have to install MCR (available at http://www.mathworks.com/products/compiler/mcr/index.html).
None of the above solutions helped me (I already had MCR installed, and Macs use DYLD_LIBRARY_PATH instead of LD_LIBRARY_PATH), and no one else online seemed to know. In desperation, I tried editing the DYLD_LIBRARY_PATH and got it to work by removing the last part of it: /Applications/MATLAB/MATLAB_Compiler_Runtime/v82/sys/java/jre/maci64/jre/lib
Now the demo application from the tutorial works.
Next comes trying to make my code work.
OS X Paths for Run-Time Deployment
Use these setenv commands to set your MATLAB Runtime paths.
setenv DYLD_LIBRARY_PATH \
mcr_root/version/runtime/maci64:\
mcr_root/version/bin/maci64:\
mcr_root/version/sys/os/maci64
Source: http://www.mathworks.com/help/compiler_sdk/java/mcr-path-settings-for-run-time-deployment.html
I want to import a class that I made in my project into my script.
I did this but it doesn't work:
function doFunction() {
    // Monthly objective ("Objectif Mensuel")
    importPackage(java.lang);
    importClass(KPDataModel.KPData.KPItem); // ERROR HERE, this is the class that I want to import

    var kpItem = kpItemList.get(0);
    System.out.println(kpItem.CellList.get(2).Value);
    System.out.println("-------");

    var proposedMediationSum = Integer.parseInt(kpItemList.get(0).CellList.get(2).Value);
    var refusedMediationSum = Integer.parseInt(kpItemList.get(0).CellList.get(3).Value);
    var totalMediation = proposedMediationSum + refusedMediationSum;
    kpItemList.get(0).CellList.get(4).Value = totalMediation;
}
Well, thanks a lot, I found that the problem comes from the import.
This is what the Oracle website says:
The Packages global variable can be used to access Java packages. Examples: Packages.java.util.Vector, Packages.javax.swing.JFrame. Please note that "java" is a shortcut for "Packages.java". There are equivalent shortcuts for the javax, org, edu, com, net prefixes, so practically all JDK platform classes can be accessed without the "Packages" prefix.
So, to import my class I used: importClass(Packages.KPDataModel.KPData.KPItem);