I followed the instructions in the ClusterJ tutorial here to connect to a MySQL NDB Cluster using the NDB API. Below is a sample of what I am doing right now:
Main.java
import com.mysql.clusterj.ClusterJHelper;
import com.mysql.clusterj.SessionFactory;
import com.mysql.clusterj.Session;

import java.io.*;
import java.util.Properties;

public class Main {
    public static void main(String[] args) throws IOException {
        // Load the properties from the clusterj.properties file
        File propsFile = new File("clusterj.properties");
        InputStream inStream = new FileInputStream(propsFile);
        Properties props = new Properties();
        props.load(inStream);

        // Used later to get user input
        BufferedReader br = new BufferedReader(new InputStreamReader(System.in));

        // Create a session (connection to the database)
        SessionFactory factory = ClusterJHelper.getSessionFactory(props);
        Session session = factory.getSession();

        // Create and initialize an Employee
        Employee newEmployee = session.newInstance(Employee.class);
        newEmployee.setId(988);
        newEmployee.setFirst("John");
        newEmployee.setLast("Jones");
        newEmployee.setStarted("1 February 2009");
        newEmployee.setDepartment(666);

        // Write the Employee to the database
        session.persist(newEmployee);
    }
}
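For reference, Main.java depends on two pieces not shown above: the clusterj.properties file it loads and the Employee domain interface compiled alongside it (see the javac line further down). Hedged sketches of both, modeled on the tutorial's employee table; the connect string, database name, and column mapping are assumptions:
clusterj.properties:
com.mysql.clusterj.connectstring=localhost:1186
com.mysql.clusterj.database=mydb
Employee.java:
import com.mysql.clusterj.annotation.PersistenceCapable;
import com.mysql.clusterj.annotation.PrimaryKey;

// ClusterJ maps this interface onto the employee table; getter/setter
// names are assumed to match the column names.
@PersistenceCapable(table="employee")
public interface Employee {
    @PrimaryKey
    int getId();
    void setId(int id);

    String getFirst();
    void setFirst(String first);

    String getLast();
    void setLast(String last);

    String getStarted();
    void setStarted(String date);

    int getDepartment();
    void setDepartment(int department);
}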
I encounter the error below when I try to run this:
java.lang.UnsatisfiedLinkError: no ndbclient in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at com.mysql.clusterj.tie.ClusterConnectionServiceImpl.loadSystemLibrary(ClusterConnectionServiceImpl.java:68)
at com.mysql.clusterj.tie.ClusterConnectionServiceImpl.create(ClusterConnectionServiceImpl.java:87)
at com.mysql.clusterj.core.SessionFactoryImpl.createClusterConnection(SessionFactoryImpl.java:335)
at com.mysql.clusterj.core.SessionFactoryImpl.createClusterConnectionPool(SessionFactoryImpl.java:288)
at com.mysql.clusterj.core.SessionFactoryImpl.<init>(SessionFactoryImpl.java:211)
at com.mysql.clusterj.core.SessionFactoryImpl.getSessionFactory(SessionFactoryImpl.java:146)
at com.mysql.clusterj.core.SessionFactoryServiceImpl.getSessionFactory(SessionFactoryServiceImpl.java:36)
at com.mysql.clusterj.core.SessionFactoryServiceImpl.getSessionFactory(SessionFactoryServiceImpl.java:27)
at com.mysql.clusterj.ClusterJHelper.getSessionFactory(ClusterJHelper.java:72)
at com.mysql.clusterj.ClusterJHelper.getSessionFactory(ClusterJHelper.java:57)
at com.pkg.mysql.Main.main(Main.java:27)
The error occurs because ClusterJ requires the ndbclient native library (e.g. libndbclient.so) on the java.library.path, not on the Java classpath.
Reference: http://ftp.nchu.edu.tw/MySQL/doc/ndbapi/en/mccj-getting.html
I could run the program once I specified the path:
# steps to compile and run the ndb java program on linux
javac -classpath /home/user1/clusterj-7.6.8.jar:. Main.java Employee.java
java -classpath /home/user1/clusterj-7.6.8.jar:. -Djava.library.path=/usr/lib64/mysql Main
I still could not find a way to resolve it in a Windows environment.
Add the following as a VM argument (or include it in the java command when running):
For Mac:
-Djava.library.path="/usr/local/mysql-{cluster-version}/lib/"
For Windows:
use the same flag, putting the path of the directory that contains the ndbclient library inside the double quotes
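For example, a Windows run command might look like the following (the jar and library locations are assumptions; point java.library.path at whatever directory holds ndbclient.dll in your MySQL Cluster installation):
java -classpath C:\libs\clusterj-7.6.8.jar;. -Djava.library.path="C:\Program Files\MySQL\MySQL Cluster 7.6\lib" Main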
I want to use MongoDB in Java, without an IDE or additional tools. I have downloaded mongo-java-driver-3.12.8.jar and put it in the same folder as my helloMongo.java file.
I then have tried to run it with:
javac -cp "mongo-java-driver.jar" helloMongo.java
java -cp "mongo-java-driver.jar" helloMongo
Only to get that it cannot find the main class.
Then I tried, assuming the main path had been lost in Java's braindead implementation:
javac -cp ".;mongo-java-driver.jar" helloMongo.java
java -cp ".;mongo-java-driver.jar" helloMongo
still with no luck. Then I tried:
javac -cp ".;/mongo-java-driver.jar" helloMongo.java
java -cp ".;/mongo-java-driver.jar" helloMongo
And a hundred other variants, and still no luck.
Are an IDE and Gradle essentially required to use Mongo with Java?
package com.javatpoint.java.mongo.db;

import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

public class JavaMongoDemo {
    public static void main(String[] args) {
        try {
            //---------- Connecting to the database -------------//
            MongoClient mongoClient = new MongoClient("localhost", 27017);
            //---------- Getting the database -------------------//
            MongoDatabase db = mongoClient.getDatabase("javatpoint");
            //---------- Getting the collection -----------------//
            MongoCollection<Document> table = db.getCollection("employee");
            //---------- Creating a document --------------------//
            Document doc = new Document("name", "Peter John");
            doc.append("id", 12);
            //---------- Inserting data -------------------------//
            table.insertOne(doc);
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}
If your package is com.javatpoint.java.mongo.db, then your class file has to be in
./com/javatpoint/java/mongo/db
and assuming you leave the Mongo jar in the same directory as your source, your java command must put both the current directory and the jar on the classpath:
java -cp ".;./com/javatpoint/java/mongo/db/mongo-java-driver.jar" com.javatpoint.java.mongo.db.helloMongo
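As a concrete sketch, assuming helloMongo.java declares package com.javatpoint.java.mongo.db and the downloaded mongo-java-driver-3.12.8.jar sits in the project root (use ':' instead of ';' as the separator on Linux/macOS):
javac -cp ".;mongo-java-driver-3.12.8.jar" com\javatpoint\java\mongo\db\helloMongo.java
java -cp ".;mongo-java-driver-3.12.8.jar" com.javatpoint.java.mongo.db.helloMongo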
I have the following code:
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import java.text.SimpleDateFormat;
import java.util.*;

public class screen2image {
    SimpleDateFormat formatter = new SimpleDateFormat("yyyyMMdd hh mm ss a");

    public void robo() throws Exception {
        Calendar now = Calendar.getInstance();
        Robot robot = new Robot();
        // Capture the whole screen
        BufferedImage screenShot = robot.createScreenCapture(new Rectangle(Toolkit.getDefaultToolkit().getScreenSize()));
        // Consistent separators (the original mixed "/" and "\\" in the path)
        ImageIO.write(screenShot, "JPG", new File("E:/java/AutoScreenShot/" + formatter.format(now.getTime()) + ".jpg"));
        System.out.println(formatter.format(now.getTime()));
    }

    public static void main(String[] args) throws Exception {
        screen2image s2i = new screen2image();
        while (true) {
            s2i.robo();
            Thread.sleep(10000); // take a screenshot every 10 seconds
        }
    }
}
I wrote that code in the screen2image.java class file and it runs fine. Now I want to build an executable jar of this code so that anyone can run it. I tried the following steps:
1. Right-click on my project -> Properties -> Build -> Packaging -> but there is a problem there: it shows a screenshot.war file instead of a jar file.
I am using NetBeans 8.1 and have searched everywhere without finding anything that helps. I downloaded JWrapper and tried to build an executable jar with it, but it still does not work: when I run the resulting jar it says no main manifest was found. I then wrote Main-Class: <my class name> and Class-Path: <path of my class> in the Manifest.mf file, but it still does not work.
I cannot figure out where I am going wrong. Please help me if you know about this; I'll be thankful to you.
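For the manifest route specifically, a minimal sketch (assuming the compiled screen2image.class sits in the current directory with no package; the file names are arbitrary):
manifest.txt -- note the space after the colon and the required trailing newline:
Main-Class: screen2image
Then build and run the jar:
jar cfm AutoScreenShot.jar manifest.txt screen2image.class
java -jar AutoScreenShot.jar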
I am using Hadoop 1.0.3 and HBase 0.94.22. I am trying to run a mapper program to read values from an HBase table and output them to a file. I am getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
The code is as below:
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Test {

    static class TestMapper extends TableMapper<Text, IntWritable> {
        private static final IntWritable one = new IntWritable(1);

        public void map(ImmutableBytesWritable row, Result value, Context context) throws IOException, InterruptedException {
            ImmutableBytesWritable userkey = new ImmutableBytesWritable(row.get(), 0, Bytes.SIZEOF_INT);
            String key = Bytes.toString(userkey.get());
            context.write(new Text(key), one);
        }
    }

    public static void main(String[] args) throws Exception {
        HBaseConfiguration conf = new HBaseConfiguration();
        Job job = new Job(conf, "hbase_freqcounter");
        job.setJarByClass(Test.class);
        Scan scan = new Scan();
        FileOutputFormat.setOutputPath(job, new Path(args[0]));
        String columns = "data";
        scan.addFamily(Bytes.toBytes(columns));
        scan.setFilter(new FirstKeyOnlyFilter());
        TableMapReduceUtil.initTableMapperJob("test", scan, TestMapper.class, Text.class, IntWritable.class, job);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
I exported the above code to a jar file, and on the command line I use the command below to run it:
hadoop jar /home/testdb.jar test
where test is the folder to which the mapper results should be written.
I have checked a few other links, like Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException, where it was suggested to include the zookeeper jar in the classpath, but while creating the project in Eclipse I already included the zookeeper jar from the lib directory of HBase. The jar I included is zookeeper-3.4.5.jar. I also visited HBase - java.lang.NoClassDefFoundError in java, but I am using a mapper class to get the values from the HBase table, not a client API. I know I am making a mistake somewhere; could you please help me out?
I have noticed another strange thing: when I remove all of the code in the main function except the first line, "HBaseConfiguration conf = new HBaseConfiguration();", then export the code to a jar file and try to run the jar as hadoop jar test.jar, I still get the same error. It seems either I am defining the conf variable incorrectly or there is some issue with my environment.
I found the fix for the problem: I had not added the HBase classpath in the hadoop-env.sh file. Below is what I added to make the job work.
$ export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
$HBASE_HOME/hbase-0.94.22-test.jar:\
$HBASE_HOME/conf:\
${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
${HBASE_HOME}/lib/guava-11.0.2.jar
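To sanity-check that the extra jars really end up on the launcher's classpath (my own verification step, not part of the original fix), you can print the classpath the hadoop script computes:
hadoop classpath | tr ':' '\n' | grep -i hbase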
I tried editing the hadoop-env.sh file, but the changes mentioned here didn't work for me.
What worked is this:
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"
I just added that at the end of my hadoop-env.sh.
Do not forget to set your HBASE_HOME variable.
You can also replace the $HBASE_HOME with the actual path of your hbase installation.
In case there is someone who has different paths/configuration, here is what I added to hadoop-env.sh in order to make it work:
$ export HADOOP_CLASSPATH="$HBASE_HOME/lib/hbase-client-0.98.11-hadoop2.jar:\
$HBASE_HOME/lib/hbase-common-0.98.11-hadoop2.jar:\
$HBASE_HOME/lib/protobuf-java-2.5.0.jar:\
$HBASE_HOME/lib/guava-12.0.1.jar:\
$HBASE_HOME/lib/zookeeper-3.4.6.jar:\
$HBASE_HOME/lib/hbase-protocol-0.98.11-hadoop2.jar"
NOTE: if you haven't set $HBASE_HOME, you have two choices:
- export HBASE_HOME=[your hbase installation path]
- or just replace $HBASE_HOME with your full HBase installation path
You can also let HBase compute the required MapReduce classpath itself and prepend it for a single invocation:
HADOOP_USER_CLASSPATH_FIRST=true \
HADOOP_CLASSPATH=$($HBASE_HOME/bin/hbase mapredcp) \
hadoop jar /home/testdb.jar test
Use this command (here CreateTable is my Java class file):
java -cp .:/home/hadoop/hbase/hbase-0.94.8/hbase-0.94.8.jar:/home/hadoop/hbase/hbase-0.94.8/lib/* CreateTable
This is my first post. I am trying to open a remote .mdb file, which sits in a shared folder on a Windows machine, from a Linux machine using the Jackcess library, and to set the table values in a business object. I wrote the code below.
Scenario 1: if I run the code from a Windows machine, it works fine. Scenario 2: if I run the code from a Linux machine, I get a FileNotFoundException. I hope it is a small mistake. Please tell me what I am missing here.
package com.gg.main;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Map;

import com.healthmarketscience.jackcess.Database;
import com.healthmarketscience.jackcess.Table;
import com.penske.model.Login;

public class Test {
    public static void main(String args[]) {
        Table table = null;
        Database db = null;
        Login login = null;
        ArrayList<Login> rowList = null;
        try {
            rowList = new ArrayList<Login>();
            db = Database.open(new File("//aa.bb.com/file/access.mdb"));
            table = db.getTable("Maintenance");
            // Reuse the already opened table rather than opening the database a second time
            System.out.println(table.getColumns());
            for (Map<String, Object> row : table) {
                login = new Login();
                if (row.get("Req_ID") != null) {
                    login.setId(row.get("Req_ID").toString());
                }
                if (row.get("Name") != null) {
                    login.setName(row.get("Name").toString());
                }
                if (row.get("Loc") != null) {
                    login.setLoc(row.get("Loc").toString());
                }
                rowList.add(login);
            }
            login.setRowList(rowList);
        } catch (IOException e1) {
            e1.printStackTrace();
        }
    }
}
Linux does not have native support for Windows UNC paths, as you use them here:
new File("//aa.bb.com/file/access.mdb")
You'll have to mount the remote filesystem somewhere in your Linux filesystem where your program can access it, and then change the paths in your program to use that local filesystem path, using smbfs or something like it. It's been a long time since I've had to interact with Windows machines, but it should go something like this:
mount -t smbfs -o username=foo,password=bar //aa.bb.com/file /mnt/whatever_you_choose_to_name_it
See the manpage for smbmount for details.
Of course, if your program is supposed to start automatically eg. as part of the system booting, you'll have to see to it that the filesystem is automatically mounted as well. See fstab(5).
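After the mount, the Jackcess call in the program changes along these lines (the mount point below simply mirrors the example above):
// Open the database through the local mount point instead of the UNC path
db = Database.open(new File("/mnt/whatever_you_choose_to_name_it/access.mdb"));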
I have the following code using the Java 7 NIO API:
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
public class TestDeleteOnClose {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("a", "b");
        OutputStream out = Files.newOutputStream(tmp, StandardOpenOption.DELETE_ON_CLOSE);
        ObjectOutputStream os = new ObjectOutputStream(out);
        os.write(0);
        os.flush();
        System.out.println(Files.exists(tmp));
        os.close();
        System.out.println(Files.exists(tmp));
    }
}
On Windows, I see what I expect, i.e. true false. On Linux I see false false. Is this expected? Am I doing something wrong?
The fact that the file is deleted too early is problematic, since I need to test it for its size, for instance, after having written to it.
I use jdk7u25 on both Linux and Windows and could reproduce the behavior on machines running RedHat or ArchLinux.
EDIT: even if I test for the file's existence before another call to os.write(), I am told the file no longer exists. If I open the file with the CREATE option as well, then I see true true.
It looks like the Linux JVM deletes the file as soon as you open it, which makes sense as you can do that on Linux. That's how I would implement it too. You'll have to keep track of how much has been written to the file yourself, e.g. by interposing a FilterOutputStream that counts bytes.
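A minimal sketch of such a counting filter (my own illustration of that suggestion, not code from the original answer):

import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Counts every byte written through it, so the logical file size stays
// available even after the DELETE_ON_CLOSE file vanishes from the filesystem.
class CountingOutputStream extends FilterOutputStream {
    private long count;

    CountingOutputStream(OutputStream out) {
        super(out);
    }

    @Override
    public void write(int b) throws IOException {
        out.write(b);
        count++;
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        out.write(b, off, len);
        count += len;
    }

    long getCount() {
        return count;
    }
}

Interposing it in the example above, new ObjectOutputStream(new CountingOutputStream(out)), lets you ask getCount() for the size instead of asking the filesystem.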