Oracle + dbunit gets AmbiguousTableNameException - java

I am using dbunit to create database backups, which can be imported and exported. My application can use several database engines: MySQL, PostgreSQL, SQLServer, H2 and Oracle.
All of the above work fine with the following code:
// Connect to the database
conn = BackupManager.getInstance().getConnection();
IDatabaseConnection connection = new DatabaseConnection(conn);
InputSource xmlSource = new InputSource(new FileInputStream(new File(nameXML)));
FlatXmlProducer flatXmlProducer = new FlatXmlProducer(xmlSource);
flatXmlProducer.setColumnSensing(true);
DatabaseOperation.CLEAN_INSERT.execute(connection,new FlatXmlDataSet(flatXmlProducer));
But on Oracle I get this exception:
!ENTRY es.giro.girlabel.backup 1 0 2012-04-11 11:51:40.542
!MESSAGE Start import backup
org.dbunit.database.AmbiguousTableNameException: AQ$_SCHEDULES
at org.dbunit.dataset.OrderedTableNameMap.add(OrderedTableNameMap.java:198)
at org.dbunit.database.DatabaseDataSet.initialize(DatabaseDataSet.java:231)
at org.dbunit.database.DatabaseDataSet.getTableMetaData(DatabaseDataSet.java:281)
at org.dbunit.operation.DeleteAllOperation.execute(DeleteAllOperation.java:109)
at org.dbunit.operation.CompositeOperation.execute(CompositeOperation.java:79)
at es.giro.girlabel.backup.ImportBackup.createData(ImportBackup.java:39)
at es.giro.girlabel.backup.handlers.Import.execute(Import.java:45)

From the docs:
public class AmbiguousTableNameException extends DataSetException
This exception is thrown by IDataSet when multiple tables having the
same name are accessible. This usually occurs when the database
connection have access to multiple schemas containing identical table
names.
Possible solutions:
1) Use a database connection credential that has
access to only one database schema.
2) Specify a schema name to the
DatabaseConnection or DatabaseDataSourceConnection constructor.
3) Enable the qualified table name support (see How-to documentation).
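For reference, here is a minimal sketch of option 2 applied to the code above. The schema name "MYSCHEMA" is a placeholder; for Oracle it is normally the owning user name in upper case.

// Assumed fix (option 2): tell DbUnit which schema to use instead of letting it
// scan every schema the Oracle user can see. "MYSCHEMA" is a placeholder.
conn = BackupManager.getInstance().getConnection();
IDatabaseConnection connection = new DatabaseConnection(conn, "MYSCHEMA");
// Optionally combine with option 3 (qualified table names):
connection.getConfig().setProperty(DatabaseConfig.FEATURE_QUALIFIED_TABLE_NAMES, Boolean.TRUE);
// requires: import org.dbunit.database.DatabaseConfig;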

For anyone who uses SpringDBUnit: I struggled with this very annoying issue and ended up solving it by adding configuration for com.github.springtestdbunit.bean.DatabaseConfigBean and com.github.springtestdbunit.bean.DatabaseDataSourceConnectionFactoryBean.
This is my full Spring context for SpringDBUnit:
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource"
destroy-method="close">
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver" />
<property name="url" value="jdbc:oracle:thin:#localhost:1521/XE" />
<property name="username" value="xxxx" />
<property name="password" value="xxxx" />
</bean>
<bean id="sessionFactory"
class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
<property name="dataSource">
<ref bean="dataSource" />
</property>
<property name="hibernateProperties">
<props>
<prop key="hibernate.dialect">org.hibernate.dialect.OracleDialect</prop>
<prop key="hibernate.show_sql">true</prop>
</props>
</property>
<property name="annotatedClasses">
<list>
<value>xxx.example.domain.Person</value>
</list>
</property>
</bean>
<bean id="dbUnitDatabaseConfig" class="com.github.springtestdbunit.bean.DatabaseConfigBean">
<property name="skipOracleRecyclebinTables" value="true" />
<property name="qualifiedTableNames" value="true" />
<!-- <property name="caseSensitiveTableNames" value="true"/> -->
</bean>
<bean id="dbUnitDatabaseConnection"
class="com.github.springtestdbunit.bean.DatabaseDataSourceConnectionFactoryBean">
<property name="dataSource" ref="dataSource"/>
<property name="databaseConfig" ref="dbUnitDatabaseConfig" />
<property name="schema" value="<your_schema_name>"/>
</bean>

Setting the database schema fixed it for me:
@Bean
public DatabaseDataSourceConnectionFactoryBean dbUnitDatabaseConnection(final DataSource dataSource){
final DatabaseDataSourceConnectionFactoryBean connectionFactory = new DatabaseDataSourceConnectionFactoryBean();
connectionFactory.setDataSource(dataSource);
connectionFactory.setSchema(DB_SCHEMA);
return connectionFactory;
}

I had the same AmbiguousTableNameException while running DbUnit against an Oracle DB. It had been working fine and started throwing the error one day.
Root cause: a stored procedure call had accidentally been changed to lower case; once it was changed back to upper case, it started working again.
I could also solve this by setting the schema name on the IDatabaseTester, e.g. iDatabaseTester.setSchema("SCHEMANAMEINCAPS").
Also, make sure your connection doesn't have access to multiple schemas containing the same table name.
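For completeness, a short sketch of the tester-based variant; the driver, URL and credentials are placeholders, and dbunit's JdbcDatabaseTester also accepts the schema directly in its constructor:

// Placeholders for driver/URL/credentials; the last constructor argument is the schema, in caps.
IDatabaseTester databaseTester = new JdbcDatabaseTester(
        "oracle.jdbc.driver.OracleDriver",
        "jdbc:oracle:thin:@localhost:1521/XE",
        "xxxx", "xxxx",
        "SCHEMANAMEINCAPS");
databaseTester.setDataSet(new FlatXmlDataSetBuilder().build(new File("dataset.xml")));
databaseTester.onSetup(); // runs the default CLEAN_INSERT set-up operation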

You might encounter issues when importing data from Hibernate before DBUnit runs. Depending on the database you are using, the casing of table and column names can be important.
For example, in HSQL, database object names must be declared in uppercase.
If you import data via Hibernate's import.sql, make sure the table names are also in uppercase there, otherwise you'll end up with the following problem:
Hibernate creates the tables in lower case
DBUnit reads the table names from the DB in lower case
DBUnit tries to import its datasets using upper case table names
You end up in a mess, with the ambiguous name exception.
Remember to also check whether multiple tables were created during a previous run (both upper and lower case), in which case you need to clean it up too.
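If you prefer to keep the lower-case names rather than upper-casing everything, recent DbUnit versions can instead be told to match table names case-sensitively. A minimal sketch, where jdbcConnection stands for the plain JDBC connection you hand to DbUnit:

// Make DbUnit treat table names case-sensitively instead of folding them to upper case.
IDatabaseConnection connection = new DatabaseConnection(jdbcConnection);
connection.getConfig().setProperty(
        DatabaseConfig.FEATURE_CASE_SENSITIVE_TABLE_NAMES, Boolean.TRUE);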

Related

Create database script mySQL

I have a dynamic web Java project where I am using Spring MVC/Hibernate. In my XML config file I have a bean for setting up my sessionFactory (all dependencies - JDBC URL etc. for my local database). Is there a way I can send this project + database to my friend so that it will work? I am asking because the XML config file contains the exact JDBC URL, hostname, port etc., which won't necessarily be the same on my friend's laptop. Is there a way to create a script that will create a database with exactly the same properties as my local database?
EDIT - here is config file
<!-- Define Database DataSource / connection pool -->
<bean id="myDataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"
destroy-method="close">
<property name="driverClass" value="com.mysql.jdbc.Driver" />
<property name="jdbcUrl" value="jdbc:mysql://localhost:3306/java_task?useSSL=false" />
<property name="user" value="me" />
<property name="password" value="me" />
<!-- these are connection pool properties for C3P0 -->
<property name="minPoolSize" value="5" />
<property name="maxPoolSize" value="20" />
<property name="maxIdleTime" value="30000" />
</bean>
<!-- Define Hibernate session factory -->
<bean id="sessionFactory"
class="org.springframework.orm.hibernate5.LocalSessionFactoryBean">
<property name="dataSource" ref="myDataSource" />
<property name="packagesToScan" value="com.javatask.entity" />
<property name="hibernateProperties">
<props>
<prop key="hibernate.dialect">org.hibernate.dialect.MySQLDialect</prop>
<prop key="hibernate.show_sql">true</prop>
</props>
</property>
</bean>
[...] is there a way to create a script that will create a database
with exactly the same properties as my local database?
In the hibernate-tools jar, Hibernate provides a SchemaExport tool that supports what you want to do.
You can write a little program yourself to use it, but I think most folks don't "roll their own." I think it's common to use a plugin for your build system.
Are you using Maven? You could use something like this: https://github.com/Devskiller/jpa2ddl
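If you do want to roll your own, here is a rough, untested sketch of driving SchemaExport programmatically. It assumes Hibernate 5.2+ (to match the hibernate5 LocalSessionFactoryBean above); the entity class and output file name are placeholders.

import java.util.EnumSet;

import org.hibernate.boot.Metadata;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.tool.hbm2ddl.SchemaExport;
import org.hibernate.tool.schema.TargetType;

public class GenerateSchemaScript {

    public static void main(String[] args) {
        StandardServiceRegistry registry = new StandardServiceRegistryBuilder()
                .applySetting("hibernate.dialect", "org.hibernate.dialect.MySQLDialect")
                .build();
        Metadata metadata = new MetadataSources(registry)
                .addAnnotatedClass(com.javatask.entity.SomeEntity.class) // placeholder: register each entity
                .buildMetadata();

        SchemaExport export = new SchemaExport();
        export.setOutputFile("create-schema.sql"); // the script your friend can run
        export.setDelimiter(";");
        export.setFormat(true);
        // SCRIPT target writes the drop-and-create DDL to the file instead of touching a database
        export.create(EnumSet.of(TargetType.SCRIPT), metadata);
    }
}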
Is there a way I can send this project + database to my friend so
that it will work?
Yes, you can put your code in a public repository like GitHub, and your friend can clone it from there.
I am asking because the XML config file contains the exact JDBC URL,
hostname, port etc., which won't necessarily be the same on my friend's
laptop. Is there a way to create a script that will create a
database with exactly the same properties as my local database?
We don't know your code, but if you have provided an absolute path in your configuration then it will fail on your friend's laptop. For a better answer, please provide your configuration code.
You can read about absolute paths here.
Update
I don't see any problem in your config code; it should work fine on another system. As for the database, you can simply export it from your database and put the exported dump file into the folder of your current project. Once you push your code to the repo, the database dump will be there too, and the other person can import it again on their side. Again, there is no problem.
Your friend just has to make sure that he has MySQL installed and running on port 3306. Other than that, there shouldn't be any problem. Why do you think it would fail?

DataSource in an OSGI container

I have a simple Spring app that connects to a DB via an EntityManager,
so I have the following configuration:
<bean id="domainEntityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="persistenceUnitName" value="TheManager" />
<property name="dataSource" ref="domainDataSource" />
<property name="packagesToScan" value="com.conztanz.persistence.stock.model" />
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter" />
</property>
<property name="jpaProperties">
<props>
<prop key="hibernate.hbm2ddl.auto">create-drop</prop>
<prop key="hibernate.dialect">org.hibernate.dialect.MySQL5Dialect</prop>
</props>
</property>
</bean>
<bean id="domainDataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="org.postgresql.Driver" />
<property name="url" value="jdbc:postgresql://localhost:5433/dbName" />
<property name="username" value="xxxx" />
<property name="password" value="xxxx" />
</bean>
This works fine when launched via a main class (loading the ApplicationContext manually).
But once deployed into ServiceMix I get the following error:
Property 'driverClassName' threw exception; nested exception is java.lang.IllegalStateException: Could not load JDBC driver class [org.postgresql.Driver]
I've read somewhere that OSGi and DriverManager don't mix well, but I fail to understand why.
The solution that seems to be good practice is to expose the dataSource as an OSGi bundle - do you agree? And in that case, how would you access it from a Spring context in order to have an EntityManager, for example?
DriverManager does not work well in OSGi. The easiest way is to use a DataSource directly. Most DB drivers provide such a class, and if you instantiate it in your app context it will work. The downside, though, is that it binds your application to the DB driver, as your bundle will then import the packages of the DataSource implementation.
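For illustration, a minimal sketch of that approach with the PostgreSQL driver's own DataSource class (values copied from the XML above); declaring an org.postgresql.ds.PGSimpleDataSource bean with the same properties in XML works the same way:

import javax.sql.DataSource;

import org.postgresql.ds.PGSimpleDataSource;

public class DomainDataSourceFactory {

    // Build the driver's DataSource directly - no DriverManager/Class.forName lookup,
    // which is exactly the part that misbehaves inside OSGi.
    public static DataSource createDomainDataSource() {
        PGSimpleDataSource ds = new PGSimpleDataSource();
        ds.setServerName("localhost");
        ds.setPortNumber(5433);
        ds.setDatabaseName("dbName");
        ds.setUser("xxxx");
        ds.setPassword("xxxx");
        return ds;
    }
}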
A more loosely coupled way is to use OPS4J Pax JDBC. It allows you to create a DataSource as an OSGi service from a configuration in Config Admin. In your app context you then just declare a dependency on a DataSource service, so your application is not bound to a specific DB driver. One typical use case is to use H2 in tests and Oracle in production.

Spring @Transactional on one service method spanning over two Hibernate transaction managers

I was wondering if it is possible to use two transaction managers in one service method.
Due to a limitation of the client's MySQL DB configuration, we have two different data sources within one database, i.e. one user/password/URL per schema. That's why I have configured two transaction managers. Now I have a problem when it comes to the service implementation. See the following code:
public class DemoService{
...
@Transactional(value = "t1")
public void doOne(){
doTwo();
}
@Transactional(value = "t2")
public void doTwo(){
}
...
}
Using this code pattern, I always get the exception:
org.hibernate.HibernateException: No Session found for current thread
If I run the two methods separately, it works fine.
Did I miss something? Or is there another workaround here?
Any advice would be appreciated.
BTW, here is some of my configuration:
<bean id="sessionFactorySso" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
<property name="mappingLocations">
<list>
<value>classpath*:sso.vo/*.hbm.xml</value>
</list>
</property>
<property name="hibernateProperties">
<props>
<prop key="hibernate.show_sql">true</prop>
<prop key="generateDdl">true</prop>
<prop key="hibernate.dialect">${dialect} </prop>
</props>
</property>
<property name="dataSource" ref="dataSourceSso"/>
</bean>
<bean id="dataSourceSso" class="com.mchange.v2.c3p0.ComboPooledDataSource" destroy-method="close">
<property name="driverClass" value="${driver}"/>
<property name="jdbcUrl" value="${sso.url}"/>
<property name="user" value="${sso.username}"/>
<property name="password" value="${sso.password}"/>
<!-- these are C3P0 properties -->
<property name="acquireIncrement" value="2" />
<property name="minPoolSize" value="1" />
<property name="maxPoolSize" value="2" />
<property name="automaticTestTable" value="test_c3p0" />
<property name="idleConnectionTestPeriod" value="300" />
<property name="testConnectionOnCheckin" value="true" />
<property name="testConnectionOnCheckout" value="true" />
<property name="autoCommitOnClose" value="true" />
<property name="checkoutTimeout" value="1000" />
<property name="breakAfterAcquireFailure" value="false" />
<property name="maxIdleTime" value="0" />
</bean>
<bean id="transactionManagerSso" class="org.springframework.orm.hibernate4.HibernateTransactionManager">
<property name="sessionFactory" ref="sessionFactorySso"/>
<qualifier value="sso" />
</bean>
<tx:annotation-driven transaction-manager="transactionManagerSso" />
Because you want to enlist two data sources in one transaction, you need an XA (global) transaction.
Therefore you need to:
Use the Spring JTA transaction manager
Your Hibernate properties should use the JTA platform settings
Your data source connections should be XA compliant
You need an application server JTA transaction manager or a stand-alone transaction manager (Bitronix, Atomikos, JOTM)
You will need two session factory configurations, one for each individual data source.
You won't have two transaction managers (t1 and t2); instead you will have two XA data sources that are automatically enlisted in the same global transaction. At commit time the XA transaction uses the 2PC protocol to commit both resources.
Check out this Bitronix Hibernate example.
You have a few options:
Inject the bean into itself and use the reference to call doTwo(). This really goes against the whole idea of IoC and AOP so I don't recommend it.
Switch to compile time weaving. Rather than using proxies, Spring (actually the AspectJ compiler) will add the bytecode to start/stop transactions to your class at compile time. There are pros and cons to this approach. See this page for more details.
Use load time weaving. Same as #2 except that your classes are modified as they are loaded rather than at compile time. IMO, Java classloading is complicated enough. I'm sure this works great for some folks but I personally would avoid this.
As Vlad pointed out, you can use JTA and XA.
Start a new transaction against transaction manager 2 within doOne() before calling doTwo(); see the Spring reference on programmatic transaction management (a TransactionTemplate sketch follows this list).
Check out ChainedTransactionManager. It essentially aggregates multiple transaction managers and does a "best effort" with commit/rollback. This is NOT a true two-phase commit like Vlad's solution.
All of these except for Vlad's solution (#4) have the potential to leave the databases in an inconsistent state. You need to use JTA/XA/two-phase commit to ensure consistency in the event that one of the TX managers throws an exception at commit time.
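As a sketch of option 5 (programmatic transaction management), one way - not the only one - is to run the second unit of work through a TransactionTemplate bound to the second transaction manager instead of relying on a nested @Transactional. The method and qualifier names are taken from the question; the rest is assumed:

import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionTemplate;

public class DemoService {

    private final TransactionTemplate t2Template;

    public DemoService(PlatformTransactionManager transactionManager2) {
        // wrap the second transaction manager for programmatic use
        this.t2Template = new TransactionTemplate(transactionManager2);
    }

    @Transactional("t1")
    public void doOne() {
        // ... work against the first data source inside the t1 transaction ...
        t2Template.execute(status -> {
            doTwoWork(); // work against the second data source, in its own t2 transaction
            return null;
        });
    }

    private void doTwoWork() {
        // ... second schema's work ...
    }
}

Note that the two transactions still commit independently, so like the other non-XA options this cannot guarantee consistency if the second commit fails.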

using last successful run date for query in next run

I am using JdbcPagingItemReader of Spring Batch for processing entries in my database. There is a timestamp column in the table I am querying and I want the JdbcPagingItemReader in the next run to just process the items where timestamp > "last successful job execution"
I think this should be a fairly common use case but somehow I can't figure out how to configure it. Thanks for your help!
JdbcPagingItemReader has its own custom restart logic. It searches for the last retrieved value, which maps to a unique index field, and restarts the job from there.
From the JavaDocs:
On restart it uses the last sort key value to locate the first page to read (so it doesn't matter if the successfully processed items have been removed or modified).
As you can see, your timestamp field would not make any significant difference.
Update after reading comment:
OK, then how about dynamically creating the where clause for your PagingQueryProvider?
<bean id="itemReader" class="org.spr...JdbcPagingItemReader">
<property name="dataSource" ref="dataSource"/>
<property name="queryProvider">
<bean class="org.spr...SqlPagingQueryProviderFactoryBean">
<property name="selectClause" value="select id, name, credit"/>
<property name="fromClause" value="from customer"/>
<property name="whereClause">
<bean class="your.company.WhereClauseFactorybean" />
<property />
<property name="sortKey" value="id"/>
</bean>
</property>
<property name="parameterValues">
<map>
<entry key="status" value="NEW"/>
</map>
</property>
<property name="pageSize" value="1000"/>
<property name="rowMapper" ref="customerMapper"/>
</bean>
Now implement WhereClauseFactoryBean as a FactoryBean that uses a JdbcTemplate to find the last timestamp and returns something like where timestamp > <your timestamp>, or null if no timestamp is found.
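A rough sketch of such a factory bean, assuming the standard Spring Batch metadata tables are available for the lookup; the class name, query and clause format are illustrative only - in practice you would rather expose the value as a named parameter via parameterValues:

import javax.sql.DataSource;

import org.springframework.beans.factory.FactoryBean;
import org.springframework.jdbc.core.JdbcTemplate;

public class WhereClauseFactoryBean implements FactoryBean<String> {

    private JdbcTemplate jdbcTemplate;

    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    @Override
    public String getObject() {
        // Look up the end time of the last successful job execution
        // (standard Spring Batch metadata table; filter by job name in real code).
        java.sql.Timestamp last = jdbcTemplate.queryForObject(
                "select max(end_time) from batch_job_execution where status = 'COMPLETED'",
                java.sql.Timestamp.class);
        // Return null to omit the where clause on the very first run.
        return last == null ? null : "where timestamp > '" + last + "'";
    }

    @Override
    public Class<?> getObjectType() {
        return String.class;
    }

    @Override
    public boolean isSingleton() {
        return true;
    }
}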
Reference:
Spring Batch:
JdbcPagingItemReader
Spring: FactoryBean
Spring: JdbcTemplate
Update after reading more comments:
Then I guess you will have to implement a custom StepExecutionListener, inject the AbstractSqlPagingQueryProvider into it and set the where clause in the beforeStep(StepExecution) method.

using AbstractTransactionalDataSourceSpringContextTests with Hibernate & DbUnit - inserted data not found

All,
I'm attempting to write a unit test for a Dao that uses Hibernate. I'm also using Spring, so I'm trying to extend AbstractTransactionalDataSourceSpringContextTests and also use DBUnit to insert data into the database before each test in the onSetUpInTransaction method.
From my logs, I can see that DbUnit is able to successfully insert the data in onSetUpInTransaction just fine. However, when I run a test method that uses the Dao (and therefore, Hibernate) to try to access that data (testGetPersonById2), the data isn't found, even though all this should be happening in the same transaction. After the test method finishes running (it fails), I see the log statement from the AbstractTransactionalDataSourceSpringContextTests that the transaction does get rolled back correctly.
It seems like the onSetUpInTransaction and Hibernate session must be using different transactions, but I can't figure out why. Does anyone have an example of something like this working? Advice on what I'm missing?
Here's what I've got so far:
public class PersonDaoTest extends AbstractTransactionalDataSourceSpringContextTests{
private Log logger = LogFactory.getLog(PersonDaoTest.class);
private PersonDaoImpl personDao;
@Override
public void onSetUpInTransaction() throws Exception {
// Load test data using DBUnit
super.onSetUpBeforeTransaction();
DataSource ds = jdbcTemplate.getDataSource();
Connection con = DataSourceUtils.getConnection(ds);
IDatabaseConnection dbUnitCon = new DatabaseConnection(con);
DatabaseConfig config = dbUnitCon.getConfig();
config.setFeature("http://www.dbunit.org/features/qualifiedTableNames",
true);
//This dataset contains a single entry in the Persons table,
// a new person with Id = 998877665, it gets inserted successfully
IDataSet dataSet = new FlatXmlDataSet(new FileInputStream(
"./PersonDaoTest.xml"));
logger.warn("dataSet = " + dataSet);
try {
DatabaseOperation.REFRESH.execute(dbUnitCon, dataSet);
SessionFactoryUtils.getSession(getSessionFactory(), false).flush();
} finally {
DataSourceUtils.releaseConnection(con, ds);
}
}
//This test PASSES, because the Person with Id = 9 already
//exists in the database, it does not rely on the data being set up in the
// onSetUpInTransaction method
@Test
public void testGetPersonById() {
Person person = personDao.findById(9L);
assertNotNull("person should not be null", person);
}
//This test FAILS at the assertNotNull line, because
//no Person with Id = 998877665 exists in the database,
//even though that Person was inserted
//in the onSetUpInTransaction method - it seems
//that hibernate cannot see that insertion.
@Test
public void testGetPersonById2() {
Person person = personDao.findById(998877665L);
assertNotNull("person should not be null", person);
}
UPDATE: Here's my spring config:
<bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations">
<list>
<value>classpath:jdbc.properties</value>
</list>
</property>
</bean>
<bean id="dataSource" class="com.p6spy.engine.spy.P6DataSource">
<constructor-arg>
<bean id="basicDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName">
<value>${jdbc.driverClassName}</value>
</property>
<property name="url">
<value>${jdbc.url}</value>
</property>
<property name="username">
<value>${jdbc.username}</value>
</property>
<property name="password">
<value>${jdbc.password}</value>
</property>
</bean>
</constructor-arg>
</bean>
<bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource"/>
</bean>
<!-- Hibernate SessionFactory -->
<bean id="sessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
<property name="dataSource"><ref bean="dataSource"/></property>
<property name="configLocation">
<value>classpath:hibernate.cfg.xml</value>
</property>
<property name="configurationClass">
<value>org.hibernate.cfg.AnnotationConfiguration</value>
</property>
<property name="hibernateProperties">
<props>
<prop key="hibernate.show_sql">false</prop>
<prop key="hibernate.dialect">org.hibernate.dialect.Oracle9Dialect</prop>
<prop key="hibernate.cache.use_query_cache">false</prop>
<prop key="hibernate.cache.provider_class">org.hibernate.cache.EhCacheProvider</prop>
<prop key="hibernate.cache.query_cache_factory">org.hibernate.cache.StandardQueryCacheFactory</prop>
</props>
</property>
</bean>
<!-- The Hibernate interceptor
<bean id="hibernateInterceptor" class="org.springframework.orm.hibernate3.HibernateInterceptor">
<property name="sessionFactory"><ref bean="sessionFactory"/></property>
</bean>-->
<bean id="personDao" class="my.dao.PersonDaoImpl">
<property name="sessionFactory"><ref bean="sessionFactory"/></property>
</bean>
I spent some more time with this, and never was able to find anything that worked when trying to use the onSetUpInTransaction method. I ended up switching over to using onSetUpBeforeTransaction and onTearDownAfterTransaction instead. It's not exactly ideal, because the onSetUpBeforeTransaction method does end up committing its data insertions to the database, and that data must then be cleaned up in onTearDownAfterTransaction. However, the tests themselves can still insert and update data all they want and have those changes all rolled back, since each test does still operate in its own transaction. So, I don't have to do anything special to clean up when a test inserts new data, which means I met one of my goals anyway!
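For reference, a rough sketch of what that ended up looking like; it reuses the imports and the jdbcTemplate field from the test above, and the Persons/Id names in the cleanup statement are assumptions based on the dataset:

@Override
protected void onSetUpBeforeTransaction() throws Exception {
    // Runs outside the test transaction, so the REFRESH below is committed.
    DataSource ds = jdbcTemplate.getDataSource();
    Connection con = DataSourceUtils.getConnection(ds);
    try {
        IDatabaseConnection dbUnitCon = new DatabaseConnection(con);
        IDataSet dataSet = new FlatXmlDataSet(new FileInputStream("./PersonDaoTest.xml"));
        DatabaseOperation.REFRESH.execute(dbUnitCon, dataSet);
    } finally {
        DataSourceUtils.releaseConnection(con, ds);
    }
}

@Override
protected void onTearDownAfterTransaction() throws Exception {
    // The committed test data has to be removed by hand.
    jdbcTemplate.update("delete from Persons where Id = 998877665");
}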
I had a similar problem; here is how I solved it.
I added the following code to my abstract test case class,
and then added a data source with the name "dataSource", which allowed me to insert my test data and create my test SQL using that data source.
The individual data sources datasource1 and datasource2 were still correctly injected into my DAO beans without any issue.
@Override
protected String[] getConfigLocations()
{
setAutowireMode(AUTOWIRE_BY_NAME);
setDependencyCheck(false);
return new String[] { "classpath:configuration/myappxxx-test-application-context.xml" };
}
