I have installed Hadoop 2.3.0 and successfully configured Hive on my machine, which runs Windows 7.
Now I've created a very simple project just to connect through the Hive driver:
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;
public class Hive_java {
    private static String driver = "org.apache.hive.jdbc.HiveDriver";

    public static void main(String[] args) throws SQLException, ClassNotFoundException {
        Class.forName(driver);
        Connection connect = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/data", "", ""); // connecting to the "data" database
    }
}
I also added the required JARs to the build path.
Now I get a "Connection refused" error that I haven't been able to solve for days.
Please tell me what I can do, as I am a beginner with Hive. Thanks in advance.
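"Connection refused" almost always means that nothing is listening on the port the JDBC URL points at (here, HiveServer2's default Thrift port 10000), rather than a problem in the Java code itself. As a sanity check, here is a small sketch, using only plain `java.net` and no Hive JARs, that probes the port before you blame the driver; the host and port are assumed from the URL above:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if something is accepting TCP connections on host:port.
    static boolean isListening(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            // "Connection refused" or a timeout: no server is listening there.
            return false;
        }
    }

    public static void main(String[] args) {
        // 10000 is HiveServer2's default port, assumed from the JDBC URL above.
        System.out.println(isListening("localhost", 10000, 2000)
                ? "HiveServer2 port is open"
                : "Nothing is listening on port 10000 - start HiveServer2 first");
    }
}
```

If this prints that nothing is listening, start HiveServer2 (e.g. via the `hiveserver2` script) and retry the JDBC connection before changing anything else.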
I have this code:
package routines;
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Enumeration;
import java.util.Properties;
public class SnowflakeDriverExample {
    public static void main() throws Exception {
        System.out.println("Create JDBC connection");
        Connection connection = getConnection();
        System.out.println("Done creating JDBC connection");
    }

    private static Connection getConnection() throws SQLException {
        try {
            Class.forName("net.snowflake.client.jdbc.SnowflakeDriver");
        } catch (ClassNotFoundException ex) {
            System.err.println("Driver not found");
        }

        // build connection properties
        Properties properties = new Properties();
        properties.put("user", "my_user");
        properties.put("password", "my_password");
        properties.put("db", "my_db");
        properties.put("schema", "my_schema");

        // use the default connection string if it is not set in the environment
        String connectStr = System.getenv("SF_JDBC_CONNECT_STRING");
        if (connectStr == null) {
            connectStr = "https://my_account.snowflakecomputing.com/";
        }

        // create a new connection
        return DriverManager.getConnection(connectStr, properties);
    }
}
now, when I call:
SnowflakeDriverExample.main();
I get this error:
Exception in component tJava_1 (j_example)
java.sql.SQLException: No suitable driver found for https://my_account.snowflakecomputing.com/
When I iterated over the registered drivers and printed them, I got this one:
net.snowflake.client.jdbc.SnowflakeDriver
So it looks like I have the correct Snowflake driver, and connectStr is my actual Snowflake URL.
What's the problem, then?
OK, the problem was my connectStr, which was:
connectStr = "https://my_account.snowflakecomputing.com/";
and should be:
connectStr = "jdbc:snowflake://my_account.snowflakecomputing.com/";
Now everything is fine. Thanks a lot!
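The underlying mechanism explains the error message: DriverManager asks every registered driver whether it accepts the URL (via Driver#acceptsURL), and each JDBC driver only accepts URLs starting with its own `jdbc:` prefix, so an `https://` URL matches nothing even though the driver class is loaded. A minimal sketch using a hypothetical toy driver (the `jdbc:demo:` prefix and DemoDriver class are invented for illustration, not part of any real library):

```java
import java.sql.*;
import java.util.Properties;
import java.util.logging.Logger;

// A toy driver that only accepts URLs starting with "jdbc:demo:".
public class DemoDriver implements Driver {
    static {
        try {
            DriverManager.registerDriver(new DemoDriver());
        } catch (SQLException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    @Override
    public boolean acceptsURL(String url) {
        return url != null && url.startsWith("jdbc:demo:");
    }

    @Override
    public Connection connect(String url, Properties info) throws SQLException {
        // Per the JDBC spec, return null for URLs this driver does not handle;
        // DriverManager then moves on to the next registered driver.
        if (!acceptsURL(url)) {
            return null;
        }
        throw new SQLException("demo driver: connecting not implemented");
    }

    @Override
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
        return new DriverPropertyInfo[0];
    }

    @Override public int getMajorVersion() { return 1; }
    @Override public int getMinorVersion() { return 0; }
    @Override public boolean jdbcCompliant() { return false; }
    @Override public Logger getParentLogger() { return Logger.getGlobal(); }

    public static void main(String[] args) {
        try {
            // Wrong scheme: no registered driver accepts an https:// URL.
            DriverManager.getConnection("https://my_account.snowflakecomputing.com/");
        } catch (SQLException e) {
            System.out.println(e.getMessage()); // No suitable driver found for ...
        }
    }
}
```

This is why fixing the prefix to `jdbc:snowflake://` resolves the error: the URL finally matches what the Snowflake driver's acceptsURL recognizes.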
Normally, ResultSetMetaData#getColumnType() should return 93 for DATETIME fields (usually represented as java.sql.Timestamp).
This was indeed true for driver versions 4.2.6420.100 and 4.0.4621.201 (as well as jTDS).
When using newer Microsoft JDBC drivers (6.0.7728.100, 6.0.8112.100 and 6.2.1.0 in particular) with Microsoft SQL Server 2005 (9.0.1399), I observe that a different type code is returned: -151, which doesn't map to any type in java.sql.Types at all.
At the same time, ResultSetMetaData#getColumnClassName(int) and ResultSetMetaData#getColumnTypeName(int) behave correctly (always returning java.sql.Timestamp and datetime, respectively).
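For reference, the java.sql.Types constants are plain int codes: Types.TIMESTAMP is 93, and no standard constant has the value -151. That value appears to be the driver's own vendor-specific code from its microsoft.sql.Types class (an assumption worth verifying against the driver's source). A quick sketch that resolves a type code back to its java.sql.Types constant name via reflection:

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.sql.Types;

public class TypeCodeLookup {
    // Returns the java.sql.Types constant name for a code, or null if none matches.
    static String nameOf(int code) throws IllegalAccessException {
        for (Field f : Types.class.getFields()) {
            if (Modifier.isStatic(f.getModifiers())
                    && f.getType() == int.class
                    && f.getInt(null) == code) {
                return f.getName();
            }
        }
        return null;
    }

    public static void main(String[] args) throws IllegalAccessException {
        System.out.println(nameOf(93));   // prints TIMESTAMP
        System.out.println(nameOf(-151)); // prints null: not a standard JDBC type code
    }
}
```

Running this against the codes observed above confirms that -151 is outside the standard JDBC type space, which is why portable code keying on getColumnType() breaks.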
Here's the unit test which fails when run using the above driver and server combinations:
package com.example;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.sql.Types;
import javax.sql.DataSource;
import org.eclipse.jdt.annotation.Nullable;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;
import com.microsoft.sqlserver.jdbc.SQLServerConnectionPoolDataSource;
public final class MsSqlServerTest {
    @Nullable
    private static DataSource dataSource;

    @Nullable
    private Connection conn;

    @BeforeClass
    public static void setUpOnce() {
        dataSource = new SQLServerConnectionPoolDataSource();
        ((SQLServerConnectionPoolDataSource) dataSource).setURL("jdbc:sqlserver://localhost\\SQLEXPRESS:1433;databaseName=...");
    }

    @BeforeMethod
    public void setUp() throws SQLException {
        this.conn = dataSource.getConnection("...", "...");
    }

    @AfterMethod
    public void tearDown() throws SQLException {
        if (this.conn != null) {
            this.conn.close();
        }
        this.conn = null;
    }

    @Test
    public void testDateTimeCode() throws SQLException {
        try (final Statement stmt = this.conn.createStatement()) {
            try {
                stmt.executeUpdate("drop table test");
            } catch (@SuppressWarnings("unused") final SQLException ignored) {
                // ignore
            }
            stmt.executeUpdate("create table test (value datetime)");
            try (final ResultSet rset = stmt.executeQuery("select * from test")) {
                final ResultSetMetaData metaData = rset.getMetaData();
                assertThat(metaData.getColumnClassName(1), is(java.sql.Timestamp.class.getName()));
                assertThat(metaData.getColumnTypeName(1), is("datetime"));
                assertThat(metaData.getColumnType(1), is(Types.TIMESTAMP));
            }
        }
    }
}
The above issue doesn't occur with newer Microsoft SQL Server versions (like 2014).
SQL Server Management Studio 2014 always reports column type correctly (DATETIME), regardless of the version of the server it is connected to.
What's wrong with the JDBC driver? Has Microsoft once again broken compatibility with one of its own products?
"Has Microsoft once again broken compatibility with one of its own products?"
Technically, no, because the current versions of the JDBC driver do not support SQL Server 2005. According to the SQL Server requirements for the JDBC driver:
For Microsoft JDBC Driver 4.2 and 4.1 for SQL Server, support begins with SQL Server 2008. For Microsoft JDBC Driver 4.0 for SQL Server, support beings [sic] with SQL Server 2005.
This was also discussed on GitHub.
I have already checked the other questions about this issue, but none of them solved it.
I imported the Neo4j JAR by configuring the build path, but I still get the error.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
public class TestNeo4j {
    public static void main(String[] args) throws SQLException {
        Connection c = DriverManager.getConnection("jdbc:neo4j://localhost:7474/", "neo4j", "neo4j");
        Statement st = c.createStatement();
        String cql = "match (m)-[:IS_ALLERGIC_TO]->(n:Product) where n.name = 'gluten' return m.name";
        ResultSet rs = st.executeQuery(cql);
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        c.close();
    }
}
This is my code. Can you figure out what the problem is?
As suggested here, use the Neo4j JAR with dependencies (neo4j-jdbc-2.1.4-jar-with-dependencies.jar).
You can grab it from the Neo4j release repository here.
Hi, I want to load a CSV file into an Oracle database using Java, but I get the error "ORA-00900: invalid SQL statement". I am using Oracle Database 11g Enterprise Edition, so I don't understand why it doesn't accept my LOAD statement. Any help? Thanks in advance.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class Test {
    public static void main(String[] args) {
        Test t = new Test();
        t.insertIntoDb("C:\\Users\\fiels\\2.csv");
    }

    public void insertIntoDb(String path) {
        Connection conn = null;
        Statement stmt = null;
        try {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@address:orcl", "user", "password");
            stmt = conn.createStatement();
            String select1 = "truncate table table1";
            stmt.execute(select1);
            String select2 = "LOAD DATA INFILE '" + path + "' INTO TABLE table1 FIELDS TERMINATED BY ',' (customer_nbr, nbr)";
            stmt.execute(select2);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Does LOAD DATA INFILE even work on Oracle? It seems to be MySQL-only syntax, which would explain the ORA-00900. The SQL*Loader alternative is really fast; check the official documentation to see how to configure it.
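SQL*Loader takes its instructions from a control file rather than from a SQL statement sent over JDBC. A minimal control file for the table in the question might look like the following sketch; the table and column names are taken from the question, the path from the original code, and the TRUNCATE clause stands in for the separate truncate statement, so treat it as a starting point rather than a tested configuration:

```
LOAD DATA
INFILE 'C:\Users\fiels\2.csv'
TRUNCATE
INTO TABLE table1
FIELDS TERMINATED BY ','
(customer_nbr, nbr)
```

Save this as, say, sample.ctl and pass it to sqlldr via the control= parameter.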
As the question states that you want to use Java, here is how to call SQL*Loader from Java. It basically uses a Runtime, but the exact command depends on the operating system:
String[] stringCommand = { "bash", "-c", "/usr/bin/sqlldr username/password@sid control=/path/to/sample.ctl" };
Runtime rt = Runtime.getRuntime();
Process proc = null;
try {
    proc = rt.exec(stringCommand);
    proc.waitFor(); // let sqlldr finish before cleaning up
} catch (Exception e) {
    // TODO handle the error
} finally {
    if (proc != null) {
        proc.destroy();
    }
}
But if you just want to load some table for your personal use, you won't need Java at all; you can call SQL*Loader from a .bat file.
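If SQL*Loader is not an option, the same CSV can also be loaded through plain JDBC with a batch of PreparedStatement inserts, which is slower than sqlldr but portable and avoids the invalid LOAD DATA statement entirely. A hedged sketch follows; the table and column names come from the question, and the naive split assumes simple comma-separated values with no quoted commas:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class CsvBatchLoader {
    // Parse the CSV into rows; assumes plain comma-separated values without quoting.
    static List<String[]> readCsv(Path csv) throws IOException {
        List<String[]> rows = new ArrayList<>();
        for (String line : Files.readAllLines(csv)) {
            if (!line.isEmpty()) {
                rows.add(line.split(",", -1));
            }
        }
        return rows;
    }

    // Insert all rows into table1 in one batch (column names from the question).
    static void load(Connection conn, List<String[]> rows) throws SQLException {
        String sql = "insert into table1 (customer_nbr, nbr) values (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }

    public static void main(String[] args) throws Exception {
        Path csv = Paths.get("C:\\Users\\fiels\\2.csv"); // path from the question
        if (Files.exists(csv)) {
            List<String[]> rows = readCsv(csv);
            System.out.println("parsed " + rows.size() + " rows");
            // Obtain a Connection as in the question, then: load(conn, rows);
        } else {
            System.out.println("CSV not found: " + csv);
        }
    }
}
```

For large files, committing in chunks (e.g. every few thousand rows) keeps undo usage manageable.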
I have a database in LibreOffice Base (on Debian) whose tables I need to export as an XML file. I have written the following Java code in Eclipse:
package NewDB;
import java.io.FileOutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.database.QueryDataSet;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSet;
import org.dbunit.dataset.DataSetException;
public class ExtractTestDataSet {
    public static void main(String[] args) throws Exception {
        // database connection
        Class.forName("org.hsqldb.jdbcDriver");
        Connection jdbcConnection = DriverManager.getConnection(
                "jdbc:hsqldb:/home/debian/Documents/database.odb", "sa", "");
        IDatabaseConnection connection = new DatabaseConnection(jdbcConnection);

        // full database export
        IDataSet fullDataSet = connection.createDataSet();
        FlatXmlDataSet.write(fullDataSet, new FileOutputStream("/home/debian/Documents/fulldataset.xml"));
    }
}
After looking at the DbUnit pages and various other sites, this code should be correct: the database is populated, the connection is valid, and there are no warnings or errors in the code. However, when the XML file is created, its only content is:
<?xml version='1.0' encoding='UTF-8'?>
<dataset/>
Does anyone have any idea why the dataset is not being exported?
Thanks
It turns out that the .odb database was connected to a different back end, which explains the blank dataset.
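A related pitfall worth knowing about (an assumption about this particular setup, but a common cause of empty exports): an .odb file is a ZIP archive, and pointing jdbc:hsqldb: at the .odb itself does not open the HSQLDB files stored under its internal database/ directory; HSQLDB just creates a fresh, empty database at that path, which also produces an empty dataset. A hedged sketch that extracts the embedded files first, so the JDBC URL can point at real data:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.Enumeration;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class OdbExtractor {
    // Extract the entries under "database/" from an .odb (a plain ZIP archive)
    // into targetDir, returning the number of files written. One can then connect
    // with "jdbc:hsqldb:file:" + targetDir + "/database" (assumed layout: the
    // embedded entries "script", "properties", "data" become database.script etc.).
    static int extractEmbeddedDb(Path odb, Path targetDir) throws IOException {
        int written = 0;
        try (ZipFile zip = new ZipFile(odb.toFile())) {
            Enumeration<? extends ZipEntry> entries = zip.entries();
            while (entries.hasMoreElements()) {
                ZipEntry entry = entries.nextElement();
                if (entry.isDirectory() || !entry.getName().startsWith("database/")) {
                    continue;
                }
                String suffix = entry.getName().substring("database/".length());
                Path out = targetDir.resolve("database." + suffix);
                Files.createDirectories(out.getParent());
                try (InputStream in = zip.getInputStream(entry)) {
                    Files.copy(in, out, StandardCopyOption.REPLACE_EXISTING);
                }
                written++;
            }
        }
        return written;
    }
}
```

If nothing is extracted, the .odb is likely only a front end for an external database (as in the answer above), and the export has to connect to that back end directly.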