I recently implemented C3P0 in my database testing program (I'm using it to test different queries on our data in different DB formats... SQLite, MariaDB, etc.). The program was initially set up using a single sustained connection for all queries. This worked fine with SQLite, as I had to do an initial ATTACH of another database. Now that I've moved to C3P0, where the connection is opened and closed for every query, how can I issue an ATTACH command and have it apply to subsequent queries? When it failed, I did notice that the ATTACH seemed to apply to the first query after it.
Do I really need to interlace ATTACH test as TESTDB for EVERY query???
I'm running into a similar issue with setCatalog() for MariaDB: I get a "No Database selected" error for every query after the first.
Do I really need to interlace ATTACH test as TESTDB for EVERY query???
No. As @MarkRotteveel suggested in a comment on the question, we can use a c3p0 connection customizer to tweak each connection as it is acquired for the pool. For example, if we create the class OurSQLiteConnectionCustomizer ...
package com.example.sqlite_pooled;

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

import com.mchange.v2.c3p0.AbstractConnectionCustomizer;

public class OurSQLiteConnectionCustomizer extends AbstractConnectionCustomizer {

    @Override
    public void onAcquire(Connection c, String pdsIdt) throws SQLException {
        // Run the ATTACH once, as soon as c3p0 acquires the physical connection.
        try (Statement st = c.createStatement()) {
            st.execute("ATTACH DATABASE 'C:/__tmp/SQLite/test.sqlite' AS test");
        }
    }
}
... and we tell our ComboPooledDataSource to use it ...
cpds = new ComboPooledDataSource();
cpds.setConnectionCustomizerClassName("com.example.sqlite_pooled.OurSQLiteConnectionCustomizer");
... then whenever c3p0 acquires a new SQLite connection for the pool it will automatically perform the ATTACH DATABASE for us.
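The same customizer hook should, in principle, also cover the MariaDB "No Database selected" problem mentioned in the question. The following is only a hedged sketch (the class name and the TESTDB catalog are placeholders, not something from the original answer): onCheckOut runs every time a client borrows a connection from the pool, so the catalog is re-selected before each query.

package com.example.mariadb_pooled;

import java.sql.Connection;

import com.mchange.v2.c3p0.AbstractConnectionCustomizer;

public class OurMariaDbConnectionCustomizer extends AbstractConnectionCustomizer {

    @Override
    public void onCheckOut(Connection c, String pdsIdt) throws Exception {
        // Hypothetical: "TESTDB" is a placeholder for the real catalog name.
        c.setCatalog("TESTDB");
    }
}

It would be registered with cpds.setConnectionCustomizerClassName(...) exactly as shown above.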
The Java Tutorial says there are two ways to connect to a database through JDBC: with the DriverManager class (old, not recommended) and with the DataSource class.
I understand how to do it with DriverManager:
Connection con = DriverManager.getConnection("jdbc:sqlite:mytest.db");
...
But I cannot find how to use DataSource for SQLite through JDBC. Does SQLite (or the JDBC driver providers for it, I don't know how to call it correctly) support using DataSource at all?
I am using the xerial/sqlite-jdbc driver to use SQLite from Java (https://github.com/xerial/sqlite-jdbc).
My best guess is that I shall use the org.sqlite.SQLiteDataSource class (it comes in sqlite-jdbc-3.15.1.jar for the Xerial sqlite-jdbc driver), but how? And is that so? I also guess that how to do it should be in the Xerial driver docs, but they only give an example of how to connect using DriverManager.
So I am asking for help to confirm that this Xerial driver/jar doesn't support DataSource usage, or for an example of how to do it, or for a suggestion of an alternative driver with DataSource support (for SQLite from Java), or other advice...
Java Tutorial
JDBC Driver Manager — The JDBC DriverManager class defines objects which can connect Java applications to a JDBC driver. DriverManager has traditionally been the backbone of the JDBC architecture. It is quite small and simple.
The Standard Extension packages javax.naming and javax.sql let you use a DataSource object registered with a Java Naming and Directory Interface™ (JNDI) naming service to establish a connection with a data source. You can use either connecting mechanism, but using a DataSource object is recommended whenever possible.
My best guess is that I shall use the org.sqlite.SQLiteDataSource class (it comes in sqlite-jdbc-3.15.1.jar for the Xerial sqlite-jdbc driver),
Yes, that seems likely.
but how?
I just tried the following and it worked for me:
package com.example.sqlite.sqlite_test;

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import org.sqlite.SQLiteDataSource;

public class SqliteTestMain {

    public static void main(String[] args) {
        SQLiteDataSource ds = new SQLiteDataSource();
        ds.setUrl("jdbc:sqlite::memory:");
        try (Connection conn = ds.getConnection()) {
            System.out.println("Connected.");
            String sql = "SELECT COUNT(*) AS n FROM \"sqlite_master\"";
            try (Statement s = conn.createStatement();
                    ResultSet rs = s.executeQuery(sql)) {
                rs.next();
                System.out.printf(
                        "The \"sqlite_master\" table contains %d row(s).%n",
                        rs.getInt(1));
            }
        } catch (SQLException e) {
            e.printStackTrace(System.err);
        }
    }
}
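As a usage note (an assumption on my part, reusing the mytest.db file from the question's DriverManager example rather than anything from the original answer), pointing the DataSource at a database file instead of an in-memory database is just a matter of the URL:

SQLiteDataSource ds = new SQLiteDataSource();
// Same database file the question opened via DriverManager:
ds.setUrl("jdbc:sqlite:mytest.db");
try (Connection conn = ds.getConnection()) {
    // ... use the connection exactly as in the example above ...
} catch (SQLException e) {
    e.printStackTrace(System.err);
}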
My Slick mapping in Play 2.4 wasn't working, and I boiled the problem down to this: if I do a simple SELECT from a table with a timestamp with time zone column, the time zone disappears from the results after a while. The example at the end of this message produces the following output:
2015-10-27 20:45:13.459+01
2015-10-27 20:45:13.459+01
2015-10-27 20:45:13.459+01
2015-10-27 20:45:13.459+01
2015-10-27 20:45:13.459+01
// from now on, the timezone is never returned (even after 1000 queries)
2015-10-27 20:45:13.459
2015-10-27 20:45:13.459
2015-10-27 20:45:13.459
2015-10-27 20:45:13.459
...
If I create the connection directly without using HikariCP, it works.
If I don't close the connection at the end of every query in the loop (i.e. I leak the connection), it works.
If I use the same connection each time instead of creating/closing a new one (i.e. getting one from the pool and releasing it), it works.
If I create and close a new connection each time using standard DriverManager.getConnection with the same URL, it works.
If I don't prepare the statement, it works.
If I prepare the statement twice (even without using the second one), it works.
If I prepare a second statement different from the first one, it doesn't work.
I tried both the normal and java6 versions of HikariCP.
I'm using Postgres 9.4. I work in Scala, but I created the example in Java to broaden the audience ;)
I opened an issue, but I'm in a bit of a rush, so if anyone knows what to do...
If anyone can point me to how to use BoneCP or anything else in Play 2.4, I'd be grateful as well.
You can reproduce the issue in a project with just the HikariCP and PostgreSQL driver dependencies:
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class TestHikari {

    public static void main(String[] args) throws Exception {
        HikariConfig conf = new HikariConfig();
        conf.setJdbcUrl("jdbc:postgresql://localhost/postgres");
        conf.setDriverClassName("org.postgresql.Driver");
        HikariDataSource ds = new HikariDataSource(conf);

        Connection ddlconn = ds.getConnection();
        try { ddlconn.createStatement().execute("DROP TABLE TEST"); } catch (Exception ignored) {}
        ddlconn.createStatement().execute("CREATE TABLE TEST (ts TIMESTAMP with time zone)");
        ddlconn.createStatement().execute("insert into test(ts) values('2015-10-27 20:45:13.459+01')");
        ddlconn.close();

        for (int i = 0; i < 10; i++) {
            Connection conn = ds.getConnection();
            PreparedStatement stmt = conn.prepareStatement("select ts from TEST");
            // if I uncomment the next line, it works!
            //conn.prepareStatement("select ts from TEST");
            ResultSet rs = stmt.executeQuery();
            rs.next();
            System.out.println(rs.getString(1));
            conn.close();
        }
    }
}
This question has been answered here: https://github.com/brettwooldridge/HikariCP/issues/473
There were some issues with dates / timezones when the JDBC driver switches to server-side prepared statements. By default this happens after 5 prepared statement executions (see the prepareThreshold parameter of the PostgreSQL JDBC driver).
So this is a bug in the PostgreSQL JDBC driver. It seems the only workaround for now is setting prepareThreshold=0 or switching to timestamp without time zone.
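For example (a minimal sketch, not part of the original answer, assuming the same local database and imports as the question's TestHikari class), the workaround can be applied directly on the JDBC URL that HikariCP uses:

HikariConfig conf = new HikariConfig();
// prepareThreshold=0 keeps the driver from switching to server-side
// prepared statements, which is what the linked issue points at.
conf.setJdbcUrl("jdbc:postgresql://localhost/postgres?prepareThreshold=0");
conf.setDriverClassName("org.postgresql.Driver");
HikariDataSource ds = new HikariDataSource(conf);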
I am not from a Java background, so my question may be very basic, but I need clear steps on how to implement this.
Existing project: webMethods connecting to an Oracle database to fetch certain properties files and insert log information into some tables.
Problem: The database goes down frequently, and hence there are delays in execution.
New requirement: We have to replace the existing Oracle tables with HBase. I have written code to write a file into HBase using Pig, but I really don't know how to write real-time data into HBase.
I found that I can write using the Java client or a Thrift connection. I need a very detailed explanation; I have to submit this for a project. Please help me out.
You have knowledge of row-oriented databases, and HBase is a column-oriented database. But we have Apache Phoenix.
Apache Phoenix is a relational database layer over HBase delivered as a client-embedded JDBC driver targeting low latency queries over HBase data. Apache Phoenix takes your SQL query, compiles it into a series of HBase scans, and orchestrates the running of those scans to produce regular JDBC result sets. The table metadata is stored in an HBase table and versioned, such that snapshot queries over prior versions will automatically use the correct schema. Direct use of the HBase API, along with coprocessors and custom filters, results in performance on the order of milliseconds for small queries, or seconds for tens of millions of rows.
This can easily solve your problem.
http://phoenix.apache.org/
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class test {

    public static void main(String[] args) throws SQLException {
        Statement stmt = null;
        ResultSet rset = null;

        // [zookeeper] is a placeholder for your ZooKeeper quorum host(s)
        Connection con = DriverManager.getConnection("jdbc:phoenix:[zookeeper]");
        stmt = con.createStatement();

        stmt.executeUpdate("create table test (mykey integer not null primary key, mycolumn varchar)");
        stmt.executeUpdate("upsert into test values (1,'Hello')");
        stmt.executeUpdate("upsert into test values (2,'World!')");
        con.commit();

        PreparedStatement statement = con.prepareStatement("select * from test");
        rset = statement.executeQuery();
        while (rset.next()) {
            System.out.println(rset.getString("mycolumn"));
        }
        statement.close();
        con.close();
    }
}
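Since the question is specifically about writing real-time log data, here is a hedged sketch of what that insert might look like through Phoenix; the app_log table and its columns are made up for illustration and are not part of the original answer:

// Hypothetical log table; the name and columns are placeholders.
PreparedStatement upsertLog = con.prepareStatement(
        "upsert into app_log (log_id, log_time, message) values (?, ?, ?)");
upsertLog.setLong(1, 1L);
upsertLog.setTimestamp(2, new java.sql.Timestamp(System.currentTimeMillis()));
upsertLog.setString(3, "service started");
upsertLog.executeUpdate();
con.commit();   // Phoenix buffers mutations until commit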
I have a baseDAO which contains methods for the basic CRUD operations, and for each operation we call getJpaTemplate().xxx().
The code is working fine in production, but now we have to write unit tests for the DAO layer, and we are using DBUnit.
Looking at the examples and writing the DBUnit classes, I observed that read operations work fine, but delete, update and create operations are not working at all.
When we call DAO.save(object) it doesn't throw any exception and execution moves on to the next line, but when I open the table to check the value, the new row has not been inserted; the transaction neither fails nor throws any exception.
I suspect there might be an issue with the connection.
For reference, I am attaching the getConnection() method.
protected IDatabaseConnection getConnection() throws Exception {
    Connection con = dataSource.getConnection();
    DatabaseConnection connection = new DatabaseConnection(con);
    DatabaseConfig config = connection.getConfig();
    config.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new OracleDataTypeFactory());
    return connection;
}
We have another method, called in setup(), which populates the data from an XML file; it works fine. Just for reference I am adding the code here.
protected void insertData(String xmlDataFilePath) throws Exception {
    IDatabaseConnection dbConnection = getConnection();
    TransactionOperation.CLEAN_INSERT.execute(dbConnection, getDataSet(xmlDataFilePath));
    connection = jPATransactionManager.getDataSource().getConnection();
    connection.setAutoCommit(false);
    savepoint = connection.setSavepoint("Data inserted in db");
    dbConnection.close();
}
Without seeing the new row inserted in the DB, I am not sure how to proceed further, because I tried doing
getJpaTemplate().save(object);
getJpaTemplate().load(ClassName.class, object's id);
which returns null, and there is no entry in the DB table either.
Any suggestions please?
Thanks in advance.
JE.
savepoint = connection.setSavepoint("Data inserted in db");
Can you make sure whether (and when) the savepoint gets committed?
Can you please post all the relevant API code here?
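To illustrate what the comment above is getting at (a minimal sketch, assuming the connection and savepoint fields from the question; this is not from the original answer): with auto-commit disabled, nothing written through that connection is visible to other connections until it is committed, so a missing commit would explain rows that never appear in the table.

// Hypothetical sketch: the transaction opened in insertData() must be ended
// explicitly for its changes to become visible outside this connection.
connection.setAutoCommit(false);
savepoint = connection.setSavepoint("Data inserted in db");
// ... run the DAO.save(...) under test ...
connection.commit();              // without this, other connections never see the row
// or, to undo the test's changes instead:
// connection.rollback(savepoint);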
I need to connect to a remote database using a database link, via JDBC commands.
How can it be done?
If you already have the dblink set up, you can use it in the SQL you send via JDBC by addressing the required tables like this:
select * from SCHEMA.TABLE#DBLINK_NAME
Using this query inside your Java code would look something like this:
public ResultSet execQuery() throws SQLException, ClassNotFoundException {
    //Load the database driver
    Class.forName("oracle.jdbc.OracleDriver");

    //Create connection to the database
    Connection myConnection = DriverManager.getConnection(connectURL, userName, userPwd);

    //Create a statement link to the database for running queries
    Statement myQuery = myConnection.createStatement(ResultSet.TYPE_SCROLL_SENSITIVE, ResultSet.CONCUR_UPDATABLE);

    //Create a resultSet to hold the returned query information
    ResultSet myQueryResults = myQuery.executeQuery("select * from SCHEMA.TABLE#DBLINK_NAME");
    return myQueryResults;
}
*java & oracle assumed
If you are asking about how to use JDBC to create a link between the DB you are talking to and another one, then it is "just SQL" that you (presumably) would execute in the same way as you would any other SQL statement. (If you tell us which DB you are using, we could talk about the actual SQL you need to execute.)
Otherwise, I don't think this makes sense. A DB link / Database link is a link from one database to another. But JDBC is for talking to a database from a Java client. It makes no sense (to me) to use DB link to connect a JDBC client to a database.
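For instance (purely illustrative, since the question doesn't say which database is involved; the link name, credentials and TNS alias below are placeholders, and an already-open java.sql.Connection named conn is assumed), in Oracle the link would be created by executing the DDL like any other statement:

try (Statement stmt = conn.createStatement()) {
    // Create a link named remote_link to another Oracle instance.
    // All identifiers and credentials here are placeholders.
    stmt.execute(
        "CREATE DATABASE LINK remote_link "
        + "CONNECT TO remote_user IDENTIFIED BY remote_password "
        + "USING 'remote_tns_alias'");
}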
Please take a look at orajdbclink on SourceForge.
I am planning to connect my Oracle PL/SQL sources to the Phoenix skin of HBase. It seems to me the only way to create a connector between Oracle and HBase at the moment...