How do I implement getConnection() in DataSource in Java?

I'm reading up on DataSource, here, and trying to implement it in my own little project by using a simple file as my "data source". I've created a class that is pretty simple at the moment...
public class QueueData implements DataSource { ... }
though the reason it is simple is because I haven't been able to find a resource that explains how the implemented methods should work. Everyone seems to just list the initialization of a context and a magical getConnection() call, like so.
Context ctx = new InitialContext(env1);
DataSource ds = (DataSource)ctx.lookup("jdbc/mydatasource");
Connection conn = ds.getConnection(); // Magical method!
But can one of you actually give me an example of what the code inside getConnection() should look like?

The reason no one shows samples of how to implement a DataSource, and everyone "just" uses them instead, is that only JDBC driver vendors (usually database makers) need to write them.
What it should do is, of course, return a Connection object that will also need to be an instance of a driver-specific class. In your case, something that can accept SQL statements to read from a file.
Your code could look something like this:
public Connection getConnection() throws SQLException {
    final String fileName = getFileNameFromMyDatabaseUrl();
    return new MyFileConnection(new File(fileName));
}
Which of course is not very interesting code, either.
You can look at some open-source DataSource implementations to see what they do:
Apache Commons DBCP PoolingDataSource (a connection pool)
Apache Derby's EmbeddedDataSource (an embedded database written in Java)
Postgresql's BaseDataSource (abstract base class for the Postgresql JDBC driver)

It's not magical. You retrieve a DataSource object which was bound to the JNDI server, typically when you set up your connection pool in your application server. This setup requires you to provide all the necessary database details such as the connection URL, authentication credentials and other options, including the DataSource class for that particular database (which is present in the JDBC driver that comes with that database). This setup information is used to create a DataSource which knows how to hand out a connection for that given database.
Let's say that you want to write your own implementation which gives the client a NOOP connection (a connection object on which all operations yield nothing). All you have to do is "register" a custom DataSource implementation with the app server's JNDI server. That implementation's getConnection() method would just return a class which implements the Connection interface with methods which do nothing.
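A minimal sketch of that NOOP idea (all class names here are made up; java.lang.reflect.Proxy is used so we don't have to hand-write the dozens of methods on Connection):

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import javax.sql.DataSource;

// Hypothetical NOOP DataSource: getConnection() hands out a Connection
// whose methods all do nothing and return a default value. Both proxies
// are built with java.lang.reflect.Proxy to avoid implementing every
// interface method by hand.
public class NoopDataSourceDemo {

    public static Connection noopConnection() {
        return (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[] { Connection.class },
                (proxy, method, args) -> {
                    Class<?> r = method.getReturnType();
                    if (!r.isPrimitive() || r == void.class) return null;
                    if (r == boolean.class) return false;
                    if (r == long.class) return 0L;
                    if (r == double.class) return 0.0d;
                    if (r == float.class) return 0.0f;
                    return 0; // int is the only other primitive Connection returns
                });
    }

    public static DataSource noopDataSource() {
        return (DataSource) Proxy.newProxyInstance(
                DataSource.class.getClassLoader(),
                new Class<?>[] { DataSource.class },
                // only getConnection() is handled in this sketch
                (proxy, method, args) ->
                        method.getName().equals("getConnection") ? noopConnection() : null);
    }
}
```

A real implementation registered in JNDI would be a named class rather than a proxy, but the getConnection() contract is the same: hand back something that fulfils the Connection interface.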

Connection is indeed an interface itself, which you will need to implement as well. Then, your getConnection() will return an instance of your own class. However, be prepared for a great deal of work... (e.g., if you want to be able to get the datasource over the context, you'll need to register your implementation first.)
Why do you want to develop your own DataSource? There are very lightweight file-based and in-process database libraries available on the web (although I have to admit I don't know them too well), for example HSQLDB (Wikipedia article) or H2 (Wiki)...
If you were developing in C#, I'd also recommend studying Linq To XML, but I don't know whether the Java world has come up with a counterpart for that yet...

Related

Spring JdbcTemplate alter session

I want to alter Oracle session for every connection that I get from the connection pool.
I found that it can be done by simply executing a statement. See here.
Is there a way to hook into the JdbcTemplate or the DataSource and execute a statement after the connection pool creates a new connection?
I'm using Spring Boot and creating the datasource that way:
@Bean
@ConfigurationProperties(prefix="datasource.local")
public DataSource localDataSource() {
    return DataSourceBuilder.create().build();
}
There are a lot of ways to do so.
The first one:
DataSource is an interface, so why don't you implement it yourself (use Proxy pattern)? Create something like this:
class MyDataSource implements DataSource {
    private DataSource realDataSource;

    public Connection getConnection() throws SQLException {
        Connection c = realDataSource.getConnection();
        // do whatever you want to do with the connection and then
        return c;
    }
}
All other methods will delegate directly to realDataSource.
This proxy can be used in the code snippet you provided.
The second one: you can use some AOP - just provide an advice that runs after the connection is created and does whatever you need there. Basically it's the same proxy, but created automatically by Spring.
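A plain-Java sketch of the proxy option (the class name and the ALTER SESSION text are made up; in a Spring Boot app you would wrap the DataSourceBuilder result in something like this):

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.DataSource;

// Hypothetical wrapper: runs one initialization statement (e.g. an Oracle
// ALTER SESSION) on every connection handed out by the real DataSource.
// The remaining DataSource methods would simply delegate to the target.
public class InitializingDataSource {

    private final DataSource target;
    private final String initSql;

    public InitializingDataSource(DataSource target, String initSql) {
        this.target = target;
        this.initSql = initSql;
    }

    public Connection getConnection() throws SQLException {
        Connection c = target.getConnection();
        try (Statement s = c.createStatement()) {
            s.execute(initSql); // runs on every checkout from the pool
        }
        return c;
    }
}
```

Note that with a pool this runs on every checkout, not only when the pool opens a new physical connection; if you only want it once per physical connection, pools usually expose their own hook for that (e.g. HikariCP's connectionInitSql property).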

Changing Spring bean properties on fly

I've got the following problem. There are 2 datasources for 2 DBs: the current one (A) and a standby (B). If A doesn't respond, I will try to connect to B.
This is how I achieve that.
Interceptor checks if connection is bad
Interceptor will swap datasource URL, if something goes wrong
Datasources are Spring beans. So I change Spring bean properties on fly. Is that ok? Look at the code:
@Autowired
@Qualifier("dataSourceMain")
private oracle.jdbc.pool.OracleDataSource dsMain;

@Autowired
@Qualifier("dataSourceStandBy")
private oracle.jdbc.pool.OracleDataSource dsStandBy;

public void swapURL() {
    try {
        String dsMainURL = dsMain.getURL();
        dsMain.setURL(dsStandBy.getURL());
        dsStandBy.setURL(dsMainURL);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
As I can see, my code works, but I don't know if it's good approach or not.
Can you check if this solves your problem? It seems like a similar question:
dynamically change Spring data source
which seems to be handled in a more elegant way there.
If your data sources are pooled, then they will have a pool of connections waiting to be used or re-used. Depending on your strategy for pooling, your code could have no effect as you aren't telling the data source to evict existing pooled connections that are using the old URLs.
On a general point I suggest you use AbstractRoutingDataSource to safely swap datasources. See here
How to make safe frequent DataSource switches for AbstractRoutingDataSource?
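For comparison, the swap can also be done without mutating the beans at all, by deciding per checkout in a wrapper. This is a plain-Java sketch with made-up names (AbstractRoutingDataSource is the Spring-native way to achieve the same routing):

```java
import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;

// Hypothetical failover wrapper: try the primary DataSource and fall back
// to the standby when the primary cannot produce a connection. Unlike the
// URL swap above, existing pooled connections are never left pointing at
// a stale URL, because the two underlying pools are untouched.
public class FailoverDataSource {

    private final DataSource primary;
    private final DataSource standby;

    public FailoverDataSource(DataSource primary, DataSource standby) {
        this.primary = primary;
        this.standby = standby;
    }

    public Connection getConnection() throws SQLException {
        try {
            return primary.getConnection();
        } catch (SQLException primaryDown) {
            return standby.getConnection(); // fall back to the standby database
        }
    }
}
```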

Embedded Derby in OSGi, creating multiple connection using connection pool

I want to create instances of a class which will have access to the underlying Embedded derby database and pass this class to each bundle binding to my database bundle using declarative services.
I have seen in the derby documentation that sharing one connection for multiple threads has many pitfalls. So I was thinking to create a connection for each instance of the class I am creating. Since I only want a very simple way to just create multiple connections and manage them, using "MiniConnectionPoolManager" here seems like a good option. The sample code for derby is shown below:
org.apache.derby.jdbc.EmbeddedConnectionPoolDataSource dataSource = new org.apache.derby.jdbc.EmbeddedConnectionPoolDataSource();
dataSource.setDatabaseName("c:/temp/testDB");
dataSource.setCreateDatabase("create");
MiniConnectionPoolManager poolMgr = new MiniConnectionPoolManager(dataSource, maxConnections);
...
Connection connection = poolMgr.getConnection();
...
connection.close();
But the documentation does not cover many things plus I am a beginner in using Database. My questions are:
When I create a new class that needs a database connection to perform inserts, updates and other actions, shall I pass the 'poolMgr' and call poolMgr.getConnection() from the newly created class?
When should I close this connection? I don't know for how long the bundle (user) will use the new class, so shall I save the newly created connection in a private global variable and force the user to execute an unregister method, where I could then close the connection? Or shall I just close all connections when my database bundle is deactivated?
Other suggestions are also appreciated to manage different classes accessing one database. Thank you in advance.
Edit:
The main class in my database bundle is always active as long as the application is running. It is the bundles requesting an instance of a new class (performing database operations) that come and go. Also, since this will be deployed on an embedded system, I can only use small-footprint applications.
You should get a connection from a connection pool when you need it and close the connection as soon as you can. It is the job of the connection pool to re-use connections, not yours.
In other words: Do not keep a connection alive until your consumer bundle is deactivated.
Connection pools normally implement the DataSource interface, and you should use the pools via it. In that case you can replace the pool implementation easily without changing your code. E.g.:
@Component
public class MyComponent {

    // Connection pool based DataSource
    @Reference
    DataSource dataSource;

    public void myFunction() {
        try (Connection c = dataSource.getConnection()) {
            // Database operations
        } catch (SQLException e) {
            // TODO
        }
    }
}
When you find yourself repeating the same code many times (getting connection, catching SQLException), you can write a simple component that accepts functional interfaces. E.g.:
@Component
@Service
public class SQLHelper {

    @Reference // This is a connection pool DataSource
    private DataSource dataSource;

    public <R> R execute(Callback<R> callback) {
        try (Connection c = dataSource.getConnection()) {
            return callback.call(c);
        } catch (SQLException e) {
            throw new UncheckedSQLException(e);
        }
    }
}
Your functional interface would look like this:
public interface Callback<R> {
    R call(Connection connection);
}
And you would use it like this:
sqlHelper.execute((Connection c) -> {
    // Do some stuff with the connection
});
Using transactions
If you want to use atomic transactions, I suggest that you should use org.apache.derby.jdbc.EmbeddedXADataSource together with org.apache.commons.dbcp.managed.BasicManagedDataSource from commons-dbcp. After that, you can handle transactions via JTA.
It is hard to use the JTA API directly. You should choose a library that helps you propagate transactions.
A small guide based on Declarative Services:
Install derby jar into your OSGi container
Install pax-derby bundle as well! By doing that, you will have a DataSourceFactory OSGi service
Install everit-dsf-bundle with its dependencies! You will see two new DS components. Create a configuration for the one called XADataSource via the webconsole! All configuration options have descriptions.
Install a JTA Transaction Manager into the OSGi container! You have several choices. I normally use Aries Transaction Manager, which embeds Geronimo TM.
Install everit-commons-dbcp-component with its dependencies! You will see two new DS components. Configure the Managed one in the webconsole and set the previously created XADataSource as the target! The transactional pool will take care of providing the same connection if you request-and-close connections within the scope of the same transaction.
Install everit-transaction-helper in your OSGi container! You will see a new OSGi service with the interface TransactionHelper (provided by a configurable DS component).
Now you have everything to write your code. Your component would look similar to the following:
@Component
@Service
public class MyComponent {

    @Reference
    private DataSource dataSource;

    @Reference
    private TransactionHelper th;

    public void myFunction() {
        th.required(() -> {
            try (Connection c = dataSource.getConnection()) {
                // My SQL statements
            } catch (SQLException e) {
                // TODO
            }
        });
    }
}
In case you do not need transaction handling, you can:
use the standard EmbeddedDataSource
use any non-transactional connection pool
skip the installation of the TransactionManager and TransactionHelper bundles
skip the usage of TransactionHelper from the code
A more complex guide (that also takes care of schema creation and uses OO based queries) is available at http://cookbook.everit.org/persistence/index.html.
Update
You do not have to get a connection for every SQL statement. You should get a connection, execute as many SQL statements as you can within a "moment", and then call close on the connection.
If you have to run three SQL statements right after each other, you should request a connection, execute the three SQL statements and then call close on the connection.
If you close the requested connection within the same function where you requested it from the pool, you probably do things right. You might call other functions, passing the connection as a parameter, but they should only use it to run SQL statements and then return.
You should not keep a connection alive while waiting for another user action. That is the job of the connection pool. When you call close on a connection that is provided by a pool, the connection is not closed physically but only returned to the pool.
You should keep the connection object in a local variable. If you use a member variable for your connection object, you should suspect that something is wrong with your code (the only exception is if you pass the Connection to an object that lives for a very short time and that object holds the connection in a member variable to have cleaner code).
Please note that if you use Java 6 or earlier, you should close the connection in a finally block to avoid unclosed connections.
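The close-in-finally pattern referred to above can be sketched like this (the helper class and the Work callback are invented for illustration; pre-Java-7 code would use an anonymous class instead of a lambda):

```java
import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;

// Java 6-style resource handling: the finally block guarantees the
// connection goes back to the pool even when the work in between throws.
public class FinallyCloseExample {

    public interface Work {
        void run(Connection c) throws SQLException;
    }

    public static void withConnection(DataSource ds, Work work) throws SQLException {
        Connection c = null;
        try {
            c = ds.getConnection();
            work.run(c); // the SQL statements would run here
        } finally {
            if (c != null) {
                try {
                    c.close(); // returns the connection to the pool
                } catch (SQLException ignored) {
                    // closing failed; nothing sensible to do about it
                }
            }
        }
    }
}
```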
MiniConnectionPoolManager might be a great solution for embedded devices as it is really "mini". The only issue is that it does not implement the DataSource interface, so your business code would have to use the MiniCPM classes directly. By doing that, it will be much harder to switch to another connection pool if you find a bug or need a more complex pool later.
If you decide to use MiniCPM, I suggest that you should write a component that implements DataSource and delegates the getConnection() function to a MiniCPM instance. E.g.:
@Component
@Service
public class MiniCPMDataSourceComponent implements DataSource {

    @Reference
    protected ConnectionPoolDataSource cpDataSource;

    private MiniConnectionPoolManager wrapped;

    @Activate
    public void activate() {
        this.wrapped = new MiniConnectionPoolManager(cpDataSource);
    }

    @Override
    public Connection getConnection() {
        return wrapped.getConnection();
    }

    @Override
    public Connection getConnection(String user, String password) {
        throw new UnsupportedOperationException();
    }

    @Deactivate
    public void deactivate() {
        wrapped.dispose();
    }
}
You can decorate this component with configuration possibilities like the max connection number and timeout (that is supported by MiniCPM). If you use the service that is provided by this component, you will be able to switch the connection pool without changing your business code. Also, your business bundle will not be wired directly to MiniCPM.

Instantiating a JdbcTemplate from a java.sql.Connection

I want to obtain a JdbcTemplate in my Java code. I've already got a working java.sql.Connection. To create a new JdbcTemplate it would normally need an instance of the javax.sql.DataSource interface.
Is it somehow possible to obtain a new JdbcTemplate from an existing java.sql.Connection?
Technically, you can, using SingleConnectionDataSource
new JdbcTemplate(new SingleConnectionDataSource(connection, false))
However, this is not really advisable, except for unit tests, for example.
You'd better use a full-featured DataSource and wire things using spring.
No. JdbcTemplate is a Spring class; Connection is part of the JDK. Connection knows nothing about JdbcTemplate.
The way to do it is to add a JdbcTemplate bean in your Spring app context; then inject it into the classes that need it declaratively.

writing test case for a DAO on a J2ee Application

I am trying to write some test cases for my DAO classes in a J2EE application. Methods in my DAO classes try to get a connection to the database based on a JDBC URL (which is on the app server). So if I click through the front end and trigger the DAO, it runs fine. However, when I write test cases for the DAO and the DAO object calls the method, it is not able to get the connection to the database. I think it is because the JDBC resource is on the app server that it does not work from the test class.
Because of this, when I run my tests, instead of passing or failing they return a bunch of errors.
Has someone encountered this issue? What can I do to overcome it?
Example:
public class DBConnectionManager {
    public static final String DB_URL = "jdbc/RSRC/my/connection/mydb";

    public Connection getConnection() {
        DataSource ds = ServiceLocator.getInstance().getDataSource(DB_URL);
        return ds.getConnection();
    }
}
public class MyDAO extends DBConnectionManager {
    public SomeBean getContents(String id) {
        Connection con = getConnection();
        CallableStatement cs = con.prepareCall("{call myStorProc(?)}");
        cs.setString(1, id);
        ...
        // code to call resultset and retrieve SomeBean goes here
        ..
        return someBean;
    }
}
public class MyTests extends TestCase {
    public void testGetContents() {
        MyDAO myd = new MyDAO();
        SomeBean smb = myd.getContents("someparm");
        assertEquals(5, smb.getSomeVal());
    }
}
Should I be doing something extra in my testcase...? if so what?
EDIT:
error I get is:
java.lang.NoClassDefFoundError: com/iplanet/ias/admin/common/ASException
at java.lang.ClassLoader.defineClass1(Native Method)
Your DAO has a JNDI lookup string hard wired into it. Unless you have a JNDI lookup service available, it won't be able to get a connection.
I don't think a DAO should be responsible for acquiring a database connection. This design won't allow you to set transactions for a unit of work, because a DAO can't know if it's part of a larger unit of work.
I'd recommend passing the connection into the DAO, perhaps into its constructor. That way a service layer can establish appropriate transaction boundaries if there's more than one DAO in a single unit of work.
This design will have the added benefit of making it possible for your application to use its JNDI resource appropriately and your test to get its connection from a DriverManager, without having to use a JNDI lookup. You have two different sources for acquiring the DataSource or Connection - one for the app and another for the test.
UPDATE:
Here's what I mean, expressed in your code:
public class DBConnectionManager {
    public static final String DB_URL = "jdbc/RSRC/my/connection/mydb";

    public Connection getConnection(String jndiLookup) {
        DataSource ds = ServiceLocator.getInstance().getDataSource(jndiLookup);
        return ds.getConnection();
    }

    public Connection getConnection(String driver, String url, String username, String password)
            throws ClassNotFoundException, SQLException {
        Class.forName(driver);
        return DriverManager.getConnection(url, username, password);
    }
}
public class MyDAO {
    private Connection connection;

    public MyDAO(Connection connection) {
        this.connection = connection;
    }

    public SomeBean getContents(String id) {
        CallableStatement cs = this.connection.prepareCall("{call myStorProc(?)}");
        cs.setString(1, id);
        // code to call resultset and retrieve SomeBean goes here
        return someBean;
    }
}
You show nothing about closing resources properly or transactions. Judging by this code, you'll be in trouble on both counts. I'd think carefully about your implementation.
I'll recommend Spring JDBC to you. You can write your DAOs in Spring without rewriting your whole app.
I'll also point out that you might also be looking at generics: Don't Repeat The DAO.
Test your ServiceLocator well first. As you mentioned, the problem is probably that the datasource is declared on the server. Here the "bunch of errors" should be helpful in telling whether the problem is in acquiring the DataSource or the Connection itself.
What database are you using? Can you log on to it from your machine from a console? If not, configure it so that your host is allowed.
It could be a permissions issue on the database you're trying to access. What errors are you getting?
One useful way for testing database access is to create a clean, local "test" version of your database as part of your test harness. Before you run your tests, use scripts to create a local copy of the database with all the pertinent data, then run your tests against that, rather than the remote server.
People may argue that testing against a database in a unit test is not truly a unit test, since it has an external dependency. If you're able to refactor your DAO classes, you can make it so the actual data source is injectable through some interfaces. In your test code, you'd inject a "mock" data source which provides your test data in some in memory format, then in production, you'd use/inject the actual database source classes. If you can hide the external (non-business code related) parts of your DAO behind interfaces, you can use mocks in your unit tests to test more of your functionality, rather than the actual data access.
Where I work our DAOs have an injectable connection (via constructor injection), and we unit test against a mock connection. To test the code in the DAO, we pass in a mocked (usually using Mockito) connection, and set up expectations in our unit tests as to what methods will be called. This makes for somewhat noisy tests, as the tests look very similar to the code being developed, but it works for us.
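A stdlib-only sketch of that style of test (Mockito makes this much shorter; here a java.lang.reflect.Proxy stands in for the mock, and the DAO and procedure names are invented):

```java
import java.lang.reflect.Proxy;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;

// A DAO with its Connection injected through the constructor can be
// exercised with a fake connection that merely records what was asked
// of it -- no app server and no database required.
public class DaoMockDemo {

    // Minimal DAO under test (hypothetical).
    public static class StatusDao {
        private final Connection connection;

        public StatusDao(Connection connection) {
            this.connection = connection;
        }

        public void callStatusProc(String id) throws SQLException {
            CallableStatement cs = connection.prepareCall("{call myStorProc(?)}");
            cs.setString(1, id);
            cs.execute();
        }
    }

    // Builds a fake Connection that records every prepareCall() SQL string.
    public static Connection recordingConnection(List<String> calls) {
        CallableStatement cs = (CallableStatement) Proxy.newProxyInstance(
                CallableStatement.class.getClassLoader(),
                new Class<?>[] { CallableStatement.class },
                (p, m, a) -> m.getReturnType() == boolean.class ? false : null);
        return (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[] { Connection.class },
                (p, m, a) -> {
                    if (m.getName().equals("prepareCall")) {
                        calls.add((String) a[0]);
                        return cs;
                    }
                    return null;
                });
    }
}
```

The test then asserts on what the DAO asked the connection to do, which is exactly the expectation-setting described above.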
