My problem is as follows. I need a class that acts as a single point of access to a database connection in a web system, so that no user ever ends up with two open connections. It should be as efficient as possible and it should manage every transaction in the system; in other words, only that class should be able to instantiate DAOs. And to make it better, it should also use connection pooling! What should I do?
You will need to implement a DAO manager. I took the main idea from this website; however, I made my own implementation that solves a few issues.
Step 1: Connection pooling
First of all, you will have to configure a connection pool. A connection pool is, well, a pool of connections: when your application starts, the pool opens a certain number of connections up front, which avoids creating connections at runtime since that's an expensive operation. This guide is not meant to explain how to configure one, so look that up separately.
For the record, I'll use Java as my language and Glassfish as my server.
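If you're not running inside an application server that exposes a JNDI DataSource, a standalone pool can fill the same role. Here is a minimal sketch using HikariCP; the URL, user, password, and pool size below are placeholders, not values from this guide:
import javax.sql.DataSource;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public final class PoolProvider {
    // Builds a pooled DataSource; the connection details below are placeholders.
    public static DataSource createPool() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:mysql://localhost:3306/mydb"); // placeholder URL
        config.setUsername("user");                            // placeholder user
        config.setPassword("password");                        // placeholder password
        config.setMaximumPoolSize(10);                         // connections kept in the pool
        return new HikariDataSource(config);
    }
}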
Step 2: Connect to the database
Let's start by creating a DAOManager class and give it methods to open and close a connection at runtime. Nothing too fancy.
import java.sql.Connection;
import java.sql.SQLException;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class DAOManager {
    public DAOManager() throws Exception {
        InitialContext ctx = new InitialContext();
        this.src = (DataSource)ctx.lookup("jndi/MYSQL"); //The string should be the same name you're giving to your JNDI resource in Glassfish.
    }

    public void open() throws SQLException {
        if(this.con == null || this.con.isClosed())
            this.con = src.getConnection();
    }

    public void close() throws SQLException {
        if(this.con != null && !this.con.isClosed())
            this.con.close();
    }

    //Private
    private DataSource src;
    private Connection con;
}
This isn't a very fancy class, but it'll be the basis of what we're going to do. So, doing this:
DAOManager mngr = new DAOManager();
mngr.open();
mngr.close();
should open and close your connection to the database through a single object.
Step 3: Make it a single point!
What, now, if we did this?
DAOManager mngr1 = new DAOManager();
DAOManager mngr2 = new DAOManager();
mngr1.open();
mngr2.open();
Some might argue, "why in the world would you do this?". But you never know what a programmer will do. Even then, a programmer might forget to close a connection before opening a new one. Plus, this is a waste of resources for the application. Stop here if you actually want to have two or more open connections; what follows is an implementation for one connection per user.
In order to make it a single point, we will have to convert this class into a singleton. A singleton is a design pattern that allows us to have one and only one instance of any given object. So, let's make it a singleton!
We must convert our public constructor into a private one, so that instances are only handed out by the class itself. The DAOManager then becomes its own factory!
We must also add a private inner class that will actually hold the singleton.
Alongside all of this, we also need a getInstance() method that will give us a singleton instance we can call.
Let's see how it's implemented.
public class DAOManager {
    public static DAOManager getInstance() {
        return DAOManagerSingleton.INSTANCE;
    }

    public void open() throws SQLException {
        if(this.con == null || this.con.isClosed())
            this.con = src.getConnection();
    }

    public void close() throws SQLException {
        if(this.con != null && !this.con.isClosed())
            this.con.close();
    }

    //Private
    private DataSource src;
    private Connection con;

    private DAOManager() throws Exception {
        InitialContext ctx = new InitialContext();
        this.src = (DataSource)ctx.lookup("jndi/MYSQL");
    }

    private static class DAOManagerSingleton {
        public static final DAOManager INSTANCE;
        static {
            DAOManager dm;
            try {
                dm = new DAOManager();
            } catch(Exception e) {
                dm = null;
            }
            INSTANCE = dm;
        }
    }
}
Now, whenever anyone needs the singleton, the system will hand out the one and only DAOManager instance. Quite neat, we've created a single access point!
But singleton is an antipattern because reasons!
I know some people don't like the singleton pattern. However, it solves this problem (and solved mine) quite decently. This is just one way of implementing the solution; if you have other ways, you're welcome to suggest them.
Step 4: But there's something wrong...
Yes, indeed there is. A singleton will create only ONE instance for the whole application! And this is wrong on many levels, especially if we have a web system where our application will be multithreaded! How do we solve this, then?
Java provides a class named ThreadLocal. A ThreadLocal variable holds one value per thread. Hey, that solves our problem! Read up on how it works; you'll need to understand its purpose before we continue.
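To see the idea in isolation before we wire it into DAOManager, here is a minimal sketch (the class and field names are just for illustration):
public class ThreadLocalDemo {
    // Each thread that calls ID.get() receives its own independent value.
    private static final ThreadLocal<Integer> ID = new ThreadLocal<Integer>() {
        @Override
        protected Integer initialValue() {
            return (int) (Math.random() * 1000); // computed once per thread
        }
    };

    public static void main(String[] args) {
        Runnable task = () -> System.out.println(Thread.currentThread().getName() + " -> " + ID.get());
        new Thread(task).start();
        new Thread(task).start(); // very likely prints a different value: one instance per thread
    }
}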
Let's make our INSTANCE ThreadLocal then. Modify the class this way:
public class DAOManager {
    public static DAOManager getInstance() {
        return DAOManagerSingleton.INSTANCE.get();
    }

    public void open() throws SQLException {
        if(this.con == null || this.con.isClosed())
            this.con = src.getConnection();
    }

    public void close() throws SQLException {
        if(this.con != null && !this.con.isClosed())
            this.con.close();
    }

    //Private
    private DataSource src;
    private Connection con;

    private DAOManager() throws Exception {
        InitialContext ctx = new InitialContext();
        this.src = (DataSource)ctx.lookup("jndi/MYSQL");
    }

    private static class DAOManagerSingleton {
        public static final ThreadLocal<DAOManager> INSTANCE = new ThreadLocal<DAOManager>() {
            @Override
            protected DAOManager initialValue() {
                try {
                    return new DAOManager();
                } catch(Exception e) {
                    return null;
                }
            }
        };
    }
}
I would seriously love to not do this
catch(Exception e)
{
return null;
}
but initialValue() can't throw a checked exception. What is initialValue(), you ask? This method tells the ThreadLocal what value it should hold for each thread; basically we're initializing it. So, thanks to this, we can now have one instance per thread.
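As an aside, and assuming you can target Java 8 or later (not a requirement of this guide), ThreadLocal.withInitial() lets you express the same initializer with a lambda; the checked-exception limitation stays the same:
private static class DAOManagerSingleton {
    public static final ThreadLocal<DAOManager> INSTANCE =
        ThreadLocal.withInitial(() -> {
            try {
                return new DAOManager();
            } catch (Exception e) {
                return null; // withInitial's Supplier can't throw checked exceptions either
            }
        });
}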
Step 5: Create a DAO
A DAOManager is nothing without a DAO. So we should at least create a couple of them.
A DAO, short for "Data Access Object", is a design pattern that gives the responsibility for managing database operations to a class representing a certain table.
In order to use our DAOManager more efficiently, we will define a GenericDAO, an abstract DAO that holds the operations common to all DAOs.
public abstract class GenericDAO<T> {
    public abstract int count() throws SQLException;

    //Protected
    protected final String tableName;
    protected Connection con;

    protected GenericDAO(Connection con, String tableName) {
        this.tableName = tableName;
        this.con = con;
    }
}
For now, that will be enough. Let's create some DAOs. Let's suppose we have two POJOs: First and Second, both with just a String field named data and its getters and setters.
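As a reference, the First POJO would look roughly like this (Second is identical apart from the name):
public class First {
    private String data;

    public String getData() { return this.data; }
    public void setData(String data) { this.data = data; }
}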
public class FirstDAO extends GenericDAO<First> {
    public FirstDAO(Connection con) {
        super(con, TABLENAME);
    }

    @Override
    public int count() throws SQLException {
        String query = "SELECT COUNT(*) AS count FROM " + this.tableName;
        PreparedStatement counter = this.con.prepareStatement(query);
        ResultSet res = counter.executeQuery();
        res.next();
        return res.getInt("count");
    }

    //Private
    private final static String TABLENAME = "FIRST";
}
SecondDAO will have more or less the same structure, just changing TABLENAME to "SECOND".
Step 6: Making the manager a factory
DAOManager should not only serve as a single connection point. Really, DAOManager should answer this question:
Who is the one responsible for managing the connections to the database?
The individual DAOs shouldn't manage them; DAOManager should. We've partially answered the question, but now we shouldn't let anyone else manage connections to the database, not even the DAOs. But the DAOs need a connection to the database! Who should provide it? DAOManager indeed! What we should do is add a factory method inside DAOManager. Not just that, but DAOManager will also hand the DAOs the current connection!
Factory is a design pattern that allows us to create instances of a certain superclass without knowing exactly which child class will be returned.
First, let's create an enum listing our tables.
public enum Table { FIRST, SECOND }
And now, the factory method inside DAOManager:
public GenericDAO<?> getDAO(Table t) throws SQLException {
    if(this.con == null || this.con.isClosed()) //Let's ensure our connection is open
        this.open();

    switch(t) {
        case FIRST:
            return new FirstDAO(this.con);
        case SECOND:
            return new SecondDAO(this.con);
        default:
            throw new SQLException("Trying to link to a nonexistent table.");
    }
}
Step 7: Putting everything together
We're good to go now. Try the following code:
DAOManager dao = DAOManager.getInstance();
FirstDAO fDao = (FirstDAO)dao.getDAO(Table.FIRST);
SecondDAO sDao = (SecondDAO)dao.getDAO(Table.SECOND);
System.out.println(fDao.count());
System.out.println(sDao.count());
dao.close();
Isn't it fancy and easy to read? Not just that, but when you call close(), you close the connection every single DAO is using. But how?! Well, they all share the same connection, so it's only natural.
Step 8: Fine-tuning our class
We can do several things from here on. To ensure connections are closed and returned to the pool, do the following in DAOManager:
@Override
protected void finalize() throws Throwable {
    try { this.close(); }
    finally { super.finalize(); }
}
You can also implement methods that wrap setAutoCommit(), commit() and rollback() from the Connection so you get better control over your transactions. What I also did is, instead of holding just a Connection, have DAOManager also hold a PreparedStatement and a ResultSet, so that calling close() closes both of them as well. A fast way of closing statements and result sets!
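A rough sketch of what those wrapper methods might look like inside DAOManager (the method names are my own choice, not part of the original pattern):
public void beginTransaction() throws SQLException {
    this.open();                   // make sure we have a live connection
    this.con.setAutoCommit(false); // group the following statements into one transaction
}

public void commit() throws SQLException {
    this.con.commit();
    this.con.setAutoCommit(true);
}

public void rollback() throws SQLException {
    this.con.rollback();
    this.con.setAutoCommit(true);
}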
I hope this guide can be of any use to you in your next project!
I think that if you want to do a simple DAO pattern in plain JDBC you should keep it simple:
public List<Customer> listCustomers() {
    List<Customer> list = new ArrayList<>();
    try (Connection conn = getConnection();
         Statement s = conn.createStatement();
         ResultSet rs = s.executeQuery("select * from customers")) {
        while (rs.next()) {
            list.add(processRow(rs));
        }
        return list;
    } catch (SQLException e) {
        throw new RuntimeException(e.getMessage(), e); //or your exceptions
    }
}
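For completeness, the processRow() and getConnection() methods referenced above might look something like this; the Customer fields and the dataSource field are assumptions for the sake of the example:
// Maps one row of the result set to a Customer; the column names are assumed.
private Customer processRow(ResultSet rs) throws SQLException {
    Customer c = new Customer();
    c.setId(rs.getLong("id"));
    c.setName(rs.getString("name"));
    return c;
}

// Hands out a connection from a pooled DataSource wired up elsewhere (assumption).
private Connection getConnection() throws SQLException {
    return dataSource.getConnection();
}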
You can follow this pattern in a class called for example CustomersDao or CustomerManager, and you can call it with a simple
CustomersDao dao = new CustomersDao();
List<Customers> customers = dao.listCustomers();
Note that I'm using try-with-resources, so this code is safe from connection leaks, clean, and straightforward. You probably don't want to follow the full DAO pattern with factories, interfaces and all that plumbing, which in many cases doesn't add real value.
I don't think it's a good idea to use ThreadLocals; used badly, as in the accepted answer, they are a source of classloader leaks.
Remember to ALWAYS close your resources (Statements, ResultSets, Connections) in a try-finally block or using try-with-resources.
Related
I ran into some code and I wanted to research other people's approaches to it, but I'm not sure what the design pattern is called. I tried searching for "database executer" and mostly got results about Java's Executor framework which is unrelated.
The pattern I'm trying to identify uses a single class to manage connections and execute queries through the use of functions that allow you to isolate any issues related to connection management.
Example:
// Service class
public class Service {
    private final Executer executor;

    public void query(String query) {
        ResultSet rs = (ResultSet) executor.execute((connection) -> {
            Statement st = connection.createStatement();
            return st.executeQuery(query);
        });
    }
}

// Executer class
public class Executer {
    private final DataSource dataSource;

    public Object execute(Function function) {
        try {
            Connection connection = dataSource.getConnection();
            return function.apply(connection);
        } catch (Exception e) {
            // log...
        } finally {
            // close or return connection to pool
        }
    }
}
As you can see from above, if you ever have a connection leak you don't need to search through a bunch of DAOs or services; it's all contained in a single executor class. Any idea what this strategy or design pattern is called? Has anyone seen this before, or does anyone know of open-source projects that use this strategy/pattern?
I am using JavaFX for my project and I have two classes - MainApp class and Database class.
A very simplified implementation would look like this:
public class MainApp extends Application {
    @Override
    public void start(Stage stage) throws Exception {
        // Getting username & password, doing some initialization, etc.
        Database.setUserName(username);
        Database.setPassword(password);
        Database.testConnection();
    }

    // This method was pretty much generated by the IDE
    public static void main(String[] args) {
        launch(args);
    }
}
Only the relevant part of the Database class implementation follows (note that I have declared and initialized the variables that appear in these methods; I just don't paste them here to keep the code short):
public class Database {
    private static OracleDataSource dataSource;

    static {
        try {
            dataSource = new OracleDataSource();
            dataSource.setURL("myjdbcaddress");
            dataSource.setUser(userName);
            dataSource.setPassword(password);
            System.out.print("Static block executed...");
        } catch (SQLException e) {
            System.out.print("Static block caught...");
            throw new ExceptionInInitializerError("Initial Database Connection not established. Sorry.");
        }
    }

    public static Connection getConnection() {
        Connection conn = null;
        try {
            conn = dataSource.getConnection();
            if (conn != null)
                isConnected = true;
        } catch (SQLException e) {
            e.printStackTrace();
        }
        return conn;
    }
}
I am getting a NullPointerException because of this: the static block in the Database class is executed after the overridden start() method. Therefore, when I access properties of the Database class, they are not initialized yet.
Is there a way to force the static block to run before the start method? Did I choose the wrong approach? Should I start working with the database somewhere other than the start() method?
I am getting a NullPointerException because of this: the static block in the Database class is executed after the overridden start() method. Therefore, when I access properties of the Database class, they are not initialized yet.
No, this is not the issue. The static initializer is executed when the class is loaded, which should happen right before
Database.setUserName(username);
or earlier. (It is always run before anything other than a static constant of the class is used.)
The problem is probably that userName and password are not assigned yet at that point (although it's hard to tell without more code).
I don't recommend using static data to pass information but instead design the application in a way that allows access to a non-static object for communication with the database where it's needed.
However you could fix your problem by moving the code from the static initializer to a static method:
public class Database {
    private static OracleDataSource dataSource;

    public static void login(String userName, String password) {
        try {
            dataSource = new OracleDataSource();
            dataSource.setURL("myjdbcaddress");
            dataSource.setUser(userName);
            dataSource.setPassword(password);
            System.out.print("Data source initialized...");
        } catch (SQLException e) {
            throw new IllegalStateException("Initial Database Connection not established. Sorry.", e);
        }
    }
    ...
}
Database.login(username, password);
Database.testConnection();
But again: Try to avoid using such a Database class that allows access from everywhere.
BTW: if you need to initialize something before the Application's start method runs, it should be done in an overridden init() method of the application class.
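A minimal sketch of that; here I assume, purely for illustration, that the credentials arrive as named command-line parameters (--user=..., --password=...):
import javafx.application.Application;
import javafx.stage.Stage;

public class MainApp extends Application {
    @Override
    public void init() throws Exception {
        // init() runs on the launcher thread before start() is called.
        String username = getParameters().getNamed().get("user");     // assumed parameter
        String password = getParameters().getNamed().get("password"); // assumed parameter
        Database.login(username, password);
    }

    @Override
    public void start(Stage stage) throws Exception {
        Database.testConnection();
        // build the UI here
    }
}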
I'm using JDBC transactions as described here: JDBC Transaction example, to store a complicated object and its relations. For example, to store a car I call a public "general method" which needs to store the wheels, the engine, the car itself, etc. It delegates these tasks to private methods, to which it passes the connection. If something goes wrong at some step, the thrown exception is caught by the public method and a rollback is performed.
For example:
public boolean saveCar(Car car) throws SQLException {
    Connection con = null;
    try {
        con = ... //get connection
        con.setAutoCommit(false);
        storeCar(car, con);
        storeWheels(car, con);
        storeEngine(car, con);
        con.commit();
        return true;
    } catch(Exception e) {
        e.printStackTrace();
        con.rollback();
        return false;
    } finally {
        con.close();
    }
}

private void storeWheels(Car car, Connection con) throws Exception {
    String sql = "...";
    PreparedStatement ps = con.prepareStatement(sql);
    ps.executeUpdate();
}
But I need to close PreparedStatement as well. It should be closed in a finally clause, so I have to write my private methods like this:
private void storeWheels(Car car, Connection con) throws Exception {
    String sql = "...";
    PreparedStatement ps = null;
    try {
        ps = con.prepareStatement(sql);
        ps.executeUpdate();
    } catch (Exception e) {
        throw new Exception("Something went wrong");
    } finally {
        if (ps != null)
            ps.close();
    }
}
That's too many try-catch blocks; it makes my code error-prone and cluttered. Also, throwing Exception from the catch block doesn't look good. My question: how can I (or is it even possible to) delegate these tasks as I described while avoiding the unnecessary try-catch blocks in every private method?
Can you store an Engine without previously storing a Car? The same question applies to every other component. If all those components must be created at the same time you create a Car, you should put the logic for those components all together inside saveCar. If not, your logic is still a little "obscure", since you are creating things separately, and it might get confused with an update operation, which at some point can share the same code.
NOTE: I don't think it's a good idea to be passing a Connection object around. There's nothing strictly wrong with it, but you risk the inconvenience of an unreleased resource at some point as your program grows. Look at this as "good practice", not as something you cannot do if it remains understandable and maintainable.
I found this beautiful library, DbUtils, which solved my problem:
private QueryRunner runner = new QueryRunner(); //global runner object used across all methods in my class

public boolean saveCar(Car car) {
    Connection con = null;
    try {
        con = ... //get connection
        con.setAutoCommit(false);
        storeCar(car, con);
        storeWheels(car, con);
        storeEngine(car, con);
        DbUtils.commitAndCloseQuietly(con);
        return true;
    } catch(Exception e) {
        DbUtils.rollbackAndCloseQuietly(con);
        return false;
    }
}

private void storeWheels(Car car, Connection con) throws Exception {
    String sql = "...";
    runner.update(con, sql); //all functionality encapsulated: performs the update, closes the statement, etc.
}
In our application, every method in the DAO layer follows almost the same sequence:
public List getSomeValue(String[] parameters) {
    try {
        //Get connection from pool
        //Execute procedure
        //Process resultset
    } catch (SomeException e) {
        //Error handling mechanism
    } finally {
        //Release connection
    }
    return someResult;
}
All the commented lines in the above code describe the operations we are doing.
Now, other than the result-set processing part, everything is almost exactly the same.
My question is, can we implement some kind of design so that we don't have to write the same code again and again in each and every method, and only have to write the result-set processing part?
Utility methods are already in place, but we still have to call them in each and every procedure in exactly the same sequence. Can we have something so that these predefined methods are called automatically and we only have to write the part that differs? I'm not even sure whether this is possible at all.
Note: We can not use any ORM tool like Hibernate.
Why not pass in a mapper that knows how to convert the ResultSet into the List of elements you expect? For example:
public <T> List<T> getSomeValue(String[] parameters, RowMapper<T> mapper) {
    try {
        //Get connection from pool
        //Execute procedure
        //Process resultset
        return mapper.convert(resultSet);
    } catch (SomeException e) {
        //Error handling mechanism
    } finally {
        //Release connection
    }
}

interface RowMapper<T> {
    List<T> convert(ResultSet resultSet);
}
You'd pass in different mapper implementations, and the getSomeValue method would stay the same, so you'd only need 1 implementation of it.
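Calling it could then look something like this; the Customer POJO and the column name are made up for the example, and getSomeValue() is assumed to live in the same class:
public List<Customer> loadCustomers(String[] parameters) {
    RowMapper<Customer> customerMapper = resultSet -> {
        List<Customer> list = new ArrayList<>();
        try {
            while (resultSet.next()) {
                Customer c = new Customer();            // made-up POJO
                c.setName(resultSet.getString("name")); // column name is assumed
                list.add(c);
            }
        } catch (SQLException e) {
            throw new RuntimeException(e);              // convert() doesn't declare SQLException
        }
        return list;
    };
    return getSomeValue(parameters, customerMapper);
}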
If you are using Spring in your project, you can use its ResultSetExtractor or RowMapper classes to do something similar.
Since Java 8, your method could be refactored to something like:
Object doSomething() {
    return doInTransaction(trans -> {
        return trans.load();
    });
}
Where InTrans is
public interface InTrans<T> {
    T call(Transaction transaction);
}
and doInTransaction is
<T> T doInTransaction(InTrans<T> callable) {
    try {
        Connection connection = ...
        return callable.call(connection);
    } catch (SomeException e) {
        //Error handling mechanism
    } finally {
        //Release connection
    }
}
Alternatively, you can use declarative transactions from Spring.
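For contrast, a sketch of the declarative style; the service and repository names are invented, and Spring begins, commits, or rolls back the transaction around the annotated method:
import java.util.List;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class SomeService {

    private final SomeRepository repository; // invented collaborator that does the JDBC work

    public SomeService(SomeRepository repository) {
        this.repository = repository;
    }

    @Transactional // Spring opens the transaction, commits on success, rolls back on a RuntimeException
    public List<SomeValue> getSomeValue(String[] parameters) {
        return repository.findSomeValue(parameters);
    }
}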
This uses the same principle proposed by talex in his answer, except that for what you are looking for, I believe you want to use the Consumer<T> functional interface (or some other similar interface) with a ResultSet as input.
So basically, all your fetching of the connection, procedure execution, looping over the result set and exception handling remains inside the one method without repetition. And when you invoke the method, you pass in the code that will process each row.
It will look something like this:
public void callingCode() {
    List<String> someList = new ArrayList<>();
    performQuery(
        "SELECT * FROM ...",
        new String[]{"param1", "param2"},
        rs -> {
            try {
                // process your row here.
                someList.add(rs.getString("somecolumn"));
            } catch (SQLException e) {
                throw new RuntimeException(e);
            }
        });
}

public void performQuery(String query, String[] parameters, Consumer<ResultSet> processRow) {
    try {
        //Get connection from pool
        //Execute procedure
        ResultSet rs = null; // pretend this comes from the procedure call.
        //Process resultset
        while (rs.next()) {
            processRow.accept(rs);
        }
    } catch (Exception e) {
        //Error handling mechanism
    } finally {
        //Release resources
    }
}
EDIT:
The annoying thing with the Consumer<T> functional interface is that the method signature doesn't allow for any checked exceptions, so the SQLExceptions need to be handled explicitly (in the code above, you can see that I'm forced to wrap the SQLException in a RuntimeException). To avoid this annoyance, you could choose not to use the built-in Consumer<T> functional interface, and instead, create your own that does include throws SQLException as part of the method signature.
ResultSetConsumer interface:
public interface ResultSetConsumer {
void processRow(ResultSet rs) throws SQLException;
}
Adjusted code:
public void callingCode() {
    List<String> someList = new ArrayList<>();
    performQuery(
        "SELECT * FROM ...",
        new String[]{"param1", "param2"},
        rs -> {
            // process your row here.
            someList.add(rs.getString("somecolumn"));
        });
}

public void performQuery(String query, String[] parameters, ResultSetConsumer rsConsumer) {
    try {
        //Get connection from pool
        //Execute procedure
        ResultSet rs = null; // pretend this comes from the procedure call.
        //Process resultset
        while (rs.next()) {
            rsConsumer.processRow(rs);
        }
    } catch (Exception e) {
        //Error handling mechanism
    } finally {
        //Release resources
    }
}
You can create a base class that contains the common methods and then extend it, so every child class inherits them.
If having a superclass is not an option (because of framework restrictions or other reasons), you can write a utility class with static methods for the common functionality.
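A rough sketch of the base-class idea, built on the skeleton from the question; the helper calls marked as assumed stand in for the utility methods you say are already in place, and the abstract method is the only part each concrete DAO still writes:
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Collections;
import java.util.List;

public abstract class AbstractDao<T> {

    // Template method: connection handling, execution and cleanup live in one place.
    public List<T> getSomeValue(String[] parameters) {
        Connection con = null;
        try {
            con = getConnectionFromPool();                     // assumed utility method
            ResultSet rs = executeProcedure(con, parameters);  // assumed utility method
            return processResultSet(rs);                       // the only part that differs
        } catch (SQLException e) {
            // error handling mechanism
            return Collections.emptyList();
        } finally {
            releaseConnection(con);                            // assumed utility method
        }
    }

    // Each concrete DAO implements only the result-set processing.
    protected abstract List<T> processResultSet(ResultSet rs) throws SQLException;
}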
I have a Java application that uses lots of java.sql.Connection to a database.
I want to test that, if the database is unavailable, my services return the appropriate error codes (distinguishing between temporary and permanent problems e.g. HTTP 500 and 503).
For testing, my application connects to an embedded, local, in-memory h2 database; the application is not aware of this, only my integration test is.
How can I make writes to the database fail deterministically, e.g. hook into commits and make them throw a custom SQLException? I want a global 'database is unavailable' boolean in the test code that affects all connections and makes my application exercise its reconnect logic.
(I had started by proxying Connection and putting an if(failFlag) throw new MySimulateFailureException() in commit(), but this didn't catch PreparedStatement.executeUpdate(); before I embark on proxying PreparedStatement too - it's a lot of methods! - I'd like to be taught a better way...)
I think this is a good candidate for using aspects. With e.g. Spring it is supremely easy to pointcut entire packages or just certain methods that you wish to fail; specifically, you could have a before advice that always throws a ConnectException, or do something more advanced with an around advice.
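A sketch of what that could look like with Spring AOP, assuming your data-access classes are Spring beans in a package such as com.example.dao (the package, the flag, and the exception type are placeholders):
import java.sql.SQLException;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class DatabaseFailureAspect {

    public static volatile boolean failFlag = false; // flipped by the test

    // Runs before every method in the DAO package; aborts the call when the flag is set.
    @Before("execution(* com.example.dao..*.*(..))")
    public void maybeFail() throws SQLException {
        if (failFlag) {
            throw new SQLException("Simulated database failure");
        }
    }
}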
I ended up making my own Java reflection wrapper that intercepts Connection.commit and the PreparedStatement.execute... methods.
My final code in my 'DBFactory' class:
@SuppressWarnings("serial")
public class MockFailureException extends SQLException {
    private MockFailureException() {
        super("The database has been deliberately faulted as part of a test-case");
    }
}

private class MockFailureWrapper implements InvocationHandler {
    final Object obj;

    private MockFailureWrapper(Object obj) {
        this.obj = obj;
    }

    @Override public Object invoke(Object proxy, Method m, Object[] args) throws Throwable {
        if(dbFailure && ("commit".equals(m.getName()) || m.getName().startsWith("execute")))
            throw new MockFailureException();
        Object result;
        try {
            result = m.invoke(obj, args);
            if(result instanceof PreparedStatement)
                result = java.lang.reflect.Proxy.newProxyInstance(
                        result.getClass().getClassLoader(),
                        result.getClass().getInterfaces(),
                        new MockFailureWrapper(result));
        } catch (InvocationTargetException e) {
            throw e.getTargetException();
        } catch (Exception e) {
            throw new RuntimeException("unexpected invocation exception: " + e.getMessage());
        }
        return result;
    }
}

public Connection newConnection() throws SQLException {
    Connection connection = DriverManager.getConnection("jdbc:h2:mem:" + uuid + ";CREATE=TRUE;DB_CLOSE_ON_EXIT=FALSE");
    return (Connection)java.lang.reflect.Proxy.newProxyInstance(
            connection.getClass().getClassLoader(),
            connection.getClass().getInterfaces(),
            new MockFailureWrapper(connection));
}