using last successful run date for query in next run - java

I am using JdbcPagingItemReader of Spring Batch for processing entries in my database. There is a timestamp column in the table I am querying, and I want the JdbcPagingItemReader in the next run to process only the items where timestamp > "last successful job execution".
I think this should be a fairly common use case but somehow I can't figure out how to configure it. Thanks for your help!

JdbcPagingItemReader has its own custom restart logic. It searches for the last retrieved value of the sort key (which maps to a unique index field) and restarts the job from there.
From the JavaDocs:
On restart it uses the last sort key value to locate the first page to read (so it doesn't matter if the successfully processed items have been removed or modified).
As you can see, your timestamp column would play no role in that restart logic.
Update after reading comment:
OK, then how about dynamically creating the where clause for your PagingQueryProvider?
<bean id="itemReader" class="org.spr...JdbcPagingItemReader">
<property name="dataSource" ref="dataSource"/>
<property name="queryProvider">
<bean class="org.spr...SqlPagingQueryProviderFactoryBean">
<property name="selectClause" value="select id, name, credit"/>
<property name="fromClause" value="from customer"/>
<property name="whereClause">
<bean class="your.company.WhereClauseFactorybean" />
<property />
<property name="sortKey" value="id"/>
</bean>
</property>
<property name="parameterValues">
<map>
<entry key="status" value="NEW"/>
</map>
</property>
<property name="pageSize" value="1000"/>
<property name="rowMapper" ref="customerMapper"/>
</bean>
Now implement WhereClauseFactoryBean as a FactoryBean that uses JdbcTemplate to find the last successful run's timestamp and returns something like where timestamp > <your timestamp>, or null if no timestamp is found.
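For illustration, a minimal sketch of such a factory bean, assuming the last successful run is looked up in the Spring Batch metadata tables (the class name, query, and column names are illustrative, not part of the original answer):
import java.sql.Timestamp;
import javax.sql.DataSource;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.jdbc.core.JdbcTemplate;

public class WhereClauseFactoryBean implements FactoryBean<String> {

    private JdbcTemplate jdbcTemplate;

    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    @Override
    public String getObject() {
        // Hypothetical lookup of the last successful job execution; adjust the query
        // to wherever you record the last successful run.
        Timestamp lastRun = jdbcTemplate.queryForObject(
                "select max(END_TIME) from BATCH_JOB_EXECUTION where STATUS = 'COMPLETED'",
                Timestamp.class);
        return lastRun == null ? null : "where timestamp > '" + lastRun + "'";
    }

    @Override
    public Class<String> getObjectType() {
        return String.class;
    }

    @Override
    public boolean isSingleton() {
        // Re-evaluate on each retrieval so every run sees the latest timestamp.
        return false;
    }
}
In a real setup you would rather bind the timestamp through the reader's parameterValues map instead of concatenating it into the clause.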
References:
Spring Batch: JdbcPagingItemReader
Spring: FactoryBean
Spring: JdbcTemplate
Update after reading more comments:
Then I guess you will have to implement a custom StepExecutionListener, inject the AbstractSqlPagingQueryProvider into it and set the where clause in the beforeStep(StepExecution) method.
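A rough sketch of such a listener (the constructor wiring and the metadata-table lookup are assumptions, not part of the original answer):
import java.sql.Timestamp;
import javax.sql.DataSource;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.item.database.support.AbstractSqlPagingQueryProvider;
import org.springframework.jdbc.core.JdbcTemplate;

public class LastRunWhereClauseListener implements StepExecutionListener {

    private final AbstractSqlPagingQueryProvider queryProvider;
    private final JdbcTemplate jdbcTemplate;

    public LastRunWhereClauseListener(AbstractSqlPagingQueryProvider queryProvider, DataSource dataSource) {
        this.queryProvider = queryProvider;
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // Hypothetical lookup of the last successful job execution; adjust to your setup.
        Timestamp lastRun = jdbcTemplate.queryForObject(
                "select max(END_TIME) from BATCH_JOB_EXECUTION where STATUS = 'COMPLETED'",
                Timestamp.class);
        if (lastRun != null) {
            // AbstractSqlPagingQueryProvider strips a leading "where" keyword itself.
            queryProvider.setWhereClause("where timestamp > '" + lastRun + "'");
        }
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return stepExecution.getExitStatus();
    }
}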

Related

How to implement multiple Reader using db in Spring Batch

This code is from Spring Batch version 1. I have a problem migrating it to version 4, since the org.springframework.batch.item.database.IbatisDrivingQueryItemReader class is no longer available in the current version.
The flow of the code below is that the withdrawalIbatisKeyGenerator bean should execute first, and its output is then used by the ibatisWithdrawalReader bean.
My question is how to implement this reader in the current version, since the two beans depend on each other.
<bean id="ibatisWithdrawalReader"
class="org.springframework.batch.item.database.IbatisDrivingQueryItemReader">
<property name="detailsQueryId"
value="withdrawalTransactionDao.getWithdrawalTransaction" />
<property name="sqlMapClient" ref="sqlMap" />
<property name="keyCollector"
ref="withdrawalIbatisKeyGenerator" />
</bean>
<bean id="withdrawalIbatisKeyGenerator"
class="ph.pnblife.julia.batch.BatchKeyCollector">
<property name="drivingQuery"
value="withdrawalTransactionDao.getWithdrawalTransactionKey" />
<property name="restartQueryId"
value="withdrawalTransactionDao.restartWithdrawalTransaction" />
<property name="sqlMapClient" ref="sqlMap" />
<property name="parameters">
<list>
<value>%%pricing_date_ibatis:date:pricing_date</value>
<value>%%policy_number:string:pol_no</value>
<value>%%version_no:string:version_no</value>
<value>%%start_date:date:start_dt</value>
<value>%%endt_type:string:endt_code</value>
</list>
</property>
<property name="isKeyAMap">
<value type="java.lang.Boolean">true</value>
</property>
</bean>
The withdrawalIbatisKeyGenerator bean could be registered as a StepExecutionListener, with the data required by the reader generated in the StepExecutionListener#beforeStep method.
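A minimal sketch of that approach, assuming the driving-query keys are stored in the step's ExecutionContext for the reader to consume (class name, SQL, and the context key are illustrative, not taken from the original configuration):
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.jdbc.core.JdbcTemplate;

public class WithdrawalKeyCollectorListener implements StepExecutionListener {

    private final JdbcTemplate jdbcTemplate;

    public WithdrawalKeyCollectorListener(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // Run the driving query up front and stash the keys in the ExecutionContext
        // so the reader configured for the same step can iterate over them.
        List<Map<String, Object>> keys = jdbcTemplate.queryForList(
                "select pol_no, version_no from withdrawal_transaction");
        stepExecution.getExecutionContext().put("withdrawalKeys", new ArrayList<>(keys));
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return stepExecution.getExitStatus();
    }
}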

PostgreSQL and hibernate - Function rownumber() doesn't exist

I am currently trying to use Hibernate-generated queries with a PostgreSQL database. It all works correctly except when I use setFirstResult() and setMaxResults() on my Hibernate queries.
Here is a sample of my Java code:
String queryString = "select h from History h";
Query onePageQuery = getEntityManager().createQuery(queryString)
        .setFirstResult(rowMin).setMaxResults(PAGE_SIZE);
onePageQuery.getResultList();
The generated query is the following:
select
*
from
( select
rownumber() over(
order by
history0_.DTMTC desc) as rownumber_,
history0_.ID as ID2_,
history0_.CCOUL as CCOUL2_,
history0_.CDIAM as CDIAM2_,
history0_.CODAV as CODAV2_,
history0_.COOPVM as COOPVM2_,
history0_.COOPVT as COOPVT2_,
history0_.CPOTA as CPOTA2_,
history0_.DTMTC as DTMTC2_,
history0_.TYCOUP as TYCOUP2_
from
F23VCM2D history0_
order by
history0_.DTMTC desc ) as temp_
where
rownumber_ <= ?
Before switching to PostgreSQL, I was using a DB2 database. I have a jpa.xml file in which I declare that I am using a POSTGRESQL database as follows:
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="database" value="POSTGRESQL" />
<!-- <property name="database" value="DB2" /> -->
</bean>
</property>
I think that maybe the generated query is not suited to PostgreSQL, because the error I get is (roughly translated from French):
Caused by: org.postgresql.util.PSQLException: ERROR: function rownumber() does not exist
Hint: No function matches the given name and argument types. You might need to add explicit type casts.
My questions are:
1) Is the generated query the one that should be generated? I doubt it, because I tried to run it via SQL Developer and it failed.
2) Is there something obvious I didn't do in my Hibernate setup?
Modify the jpaVendorAdapter like this:
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="databasePlatform" value="org.hibernate.dialect.PostgreSQL95Dialect"/>
</bean>
</property>
in order to set the Hibernate dialect correctly.
The problem was on my side: there were two jpa.xml files.
I didn't know this because it's a legacy project in which I'm only trying to change the database connection. I was modifying the wrong jpa.xml file, so my changes had no impact on the generated queries.
Once I had removed the "DB2" line:
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="database" value="DB2" />
</bean>
</property>
and replaced it with the "POSTGRESQL" line in the right file, it worked and the query was generated correctly. So the following version of the jpaVendorAdapter did it for me:
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="database" value="POSTGRESQL" />
</bean>
</property>
Thank you for the other input; it helped me find the answer. There didn't seem to be a need to add the dialect property, so I didn't add it in the end.

Seam/Hibernate: liquibase before JPA startup

I have a Java EE web application (Hibernate 3, Seam) that I'm running in a WebLogic container.
I want to introduce Liquibase for schema migrations.
Currently we use
<property name="hibernate.hbm2ddl.auto" value="update"/>
which we want to drop because it can be dangerous.
I want the migration to automatically happen at deployments, so I'm using the servlet listener integration.
In web.xml, the first listener is:
<listener>
<listener-class>liquibase.integration.servlet.LiquibaseServletListener</listener-class>
</listener>
Sadly, this listener comes into play after the Hibernate initialization, and I get missing-table errors (because the schema is empty).
I've been googling for hours and I'm a bit confused now.
Thanks in advance
UPDATE
If I set <property name="hibernate.hbm2ddl.auto" value="none" />, Liquibase finishes its job successfully and the app starts up as expected. If I set validate, it seems that Hibernate schema validation takes place before Liquibase and it complains about missing tables.
UPDATE
It seems that Seam initializes Hibernate, even though the Liquibase listener is listed before the SeamListener, so I have no clue how to enable schema validation and Liquibase at the same time...
My understanding is that the LiquibaseServletListener requires the path to the change log file, which is passed using the liquibase.changelog context param. So you already have a change log generated, or am I missing something here?
You can take a look at the liquibase hibernate integration library provided by Liquibase.
This library works with both the classic hibernate configuration (via .cfg and .xml files) as well as JPA configuration via persistence.xml.
AFAIK, generating the change log and running the change log are two separate processes. The Liquibase Hibernate integration library helps generate the change log from the diff between the current state of the entities in the persistence unit and the current database state.
How to determine the order of listeners in web.xml
You should place:
<listener>
<listener-class>liquibase.integration.servlet.LiquibaseServletListener</listener-class>
</listener>
before the ORM or other framework-related listeners.
I use Spring-bean-based Liquibase activation, which avoids duplicating DB authentication data by reusing the already provided dataSource bean:
<bean id="liquibase" class="liquibase.integration.spring.SpringLiquibase">
<property name="dataSource" ref="dataSource" />
<property name="changeLog" value="classpath:sql/master.sql" />
<property name="defaultSchema" value="PRODUCT" />
</bean>
To enforce this ordering, use the depends-on attribute:
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
depends-on="liquibase">
<property name="dataSource" ref="dataSource" />
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter" />
</property>
<property name="packagesToScan" value="product.domain" />
<property name="jpaProperties">
<props>
<prop key="hibernate.dialect">org.hibernate.dialect.MySQLDialect</prop>
<prop key="hibernate.hbm2ddl.auto">validate</prop>
</props>
</property>
</bean>

Spring JDBCTemplate update statement

There is a dataSource configured in Spring as shown below.
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource"
destroy-method="close">
<property name="driverClassName" value="${prop_jdbc.driverClassName}"/>
<property name="url" value="${prop_jdbc.url}"/>
<property name="username" value="${prop_jdbc.username}"/>
<property name="password" value="${prop_jdbc.password}"/>
<property name="initialSize" value="2"/>
<property name="maxActive" value="5"/>
<property name="maxIdle" value="2"/>
<property name="poolPreparedStatements" value="true"/>
<property name="maxOpenPreparedStatements" value="-1"/>
<!-- property name="defaultAutoCommit">
<value>false</value>
</property-->
</bean>
Now, I first do a DROP TABLE using a JdbcTemplate created from the above dataSource, then in the next statement I create the same table again, and finally I try to DROP it immediately in a third statement.
jdbcTemplate.update( dropSql,new Object[] { });
jdbcTemplate.update( createSql,new Object[] { });
jdbcTemplate.update( dropSql,new Object[] { });
EDIT after @Brian's comments
After the first statement the table was dropped immediately, and the second statement also creates it immediately, but the second time the DROP does not happen. There is no error either.
Does JdbcTemplate execute the DROP immediately or periodically? This is hard to understand: using the same data source, why would the second DROP not happen when the first one, two lines earlier, did?
DDL statements like CREATE and DROP are not transactional. Please share the actual DDL being executed. In the absence of the actual SQL to review, I would suggest using the execute method instead of the update method on JdbcTemplate.
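For illustration, a minimal sketch of that suggestion, reusing the dropSql, createSql, and jdbcTemplate variables from the question:
// execute() handles DDL (and arbitrary SQL) directly; update() is intended for DML with bind parameters.
jdbcTemplate.execute(dropSql);
jdbcTemplate.execute(createSql);
jdbcTemplate.execute(dropSql);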
What are you doing to drop the table between each failed attempt by your code?

Oracle + dbunit gets AmbiguousTableNameException

I am using dbunit to create database backups, which can be imported and exported. My application can use several database engines: MySQL, PostgreSQL, SQLServer, H2 and Oracle.
All of the above work fine with the following code:
// Connect to the database
conn = BackupManager.getInstance().getConnection();
IDatabaseConnection connection = new DatabaseConnection(conn);
InputSource xmlSource = new InputSource(new FileInputStream(new File(nameXML)));
FlatXmlProducer flatXmlProducer = new FlatXmlProducer(xmlSource);
flatXmlProducer.setColumnSensing(true);
DatabaseOperation.CLEAN_INSERT.execute(connection,new FlatXmlDataSet(flatXmlProducer));
But on Oracle I get this exception:
!ENTRY es.giro.girlabel.backup 1 0 2012-04-11 11:51:40.542
!MESSAGE Start import backup
org.dbunit.database.AmbiguousTableNameException: AQ$_SCHEDULES
at org.dbunit.dataset.OrderedTableNameMap.add(OrderedTableNameMap.java:198)
at org.dbunit.database.DatabaseDataSet.initialize(DatabaseDataSet.java:231)
at org.dbunit.database.DatabaseDataSet.getTableMetaData(DatabaseDataSet.java:281)
at org.dbunit.operation.DeleteAllOperation.execute(DeleteAllOperation.java:109)
at org.dbunit.operation.CompositeOperation.execute(CompositeOperation.java:79)
at es.giro.girlabel.backup.ImportBackup.createData(ImportBackup.java:39)
at es.giro.girlabel.backup.handlers.Import.execute(Import.java:45)
From the docs:
public class AmbiguousTableNameException extends DataSetException
This exception is thrown by IDataSet when multiple tables having the same name are accessible. This usually occurs when the database connection have access to multiple schemas containing identical table names.
Possible solutions:
1) Use a database connection credential that has access to only one database schema.
2) Specify a schema name to the DatabaseConnection or DatabaseDataSourceConnection constructor.
3) Enable the qualified table name support (see How-to documentation).
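For example, a minimal sketch of options 2 and 3, adapting the snippet from the question (the schema name is illustrative; DatabaseConfig is org.dbunit.database.DatabaseConfig):
conn = BackupManager.getInstance().getConnection();
// Option 2: pass the schema name explicitly to the DatabaseConnection constructor.
IDatabaseConnection connection = new DatabaseConnection(conn, "MYSCHEMA");
// Option 3: enable qualified table name support on the connection's config.
connection.getConfig().setProperty(DatabaseConfig.FEATURE_QUALIFIED_TABLE_NAMES, true);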
For those who use Spring Test DBUnit (spring-test-dbunit): I struggled with this very annoying issue and ended up solving it by adding configuration for com.github.springtestdbunit.bean.DatabaseConfigBean and com.github.springtestdbunit.bean.DatabaseDataSourceConnectionFactoryBean.
This is my full Spring context for Spring Test DBUnit:
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource"
destroy-method="close">
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver" />
<property name="url" value="jdbc:oracle:thin:#localhost:1521/XE" />
<property name="username" value="xxxx" />
<property name="password" value="xxxx" />
</bean>
<bean id="sessionFactory"
class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
<property name="dataSource">
<ref bean="dataSource" />
</property>
<property name="hibernateProperties">
<props>
<prop key="hibernate.dialect">org.hibernate.dialect.OracleDialect</prop>
<prop key="hibernate.show_sql">true</prop>
</props>
</property>
<property name="annotatedClasses">
<list>
<value>xxx.example.domain.Person</value>
</list>
</property>
</bean>
<bean id="dbUnitDatabaseConfig" class="com.github.springtestdbunit.bean.DatabaseConfigBean">
<property name="skipOracleRecyclebinTables" value="true" />
<property name="qualifiedTableNames" value="true" />
<!-- <property name="caseSensitiveTableNames" value="true"/> -->
</bean>
<bean id="dbUnitDatabaseConnection"
class="com.github.springtestdbunit.bean.DatabaseDataSourceConnectionFactoryBean">
<property name="dataSource" ref="dataSource"/>
<property name="databaseConfig" ref="dbUnitDatabaseConfig" />
<property name="schema" value="<your_schema_name>"/>
</bean>
Setting the database schema fixed it for me:
@Bean
public DatabaseDataSourceConnectionFactoryBean dbUnitDatabaseConnection(final DataSource dataSource){
final DatabaseDataSourceConnectionFactoryBean connectionFactory = new DatabaseDataSourceConnectionFactoryBean();
connectionFactory.setDataSource(dataSource);
connectionFactory.setSchema(DB_SCHEMA);
return connectionFactory;
}
I had the same AmbiguousTableNameException while executing DbUnit against an Oracle DB. It had been working fine and then started throwing the error one day.
Root cause: a stored procedure call had been changed to lower case by mistake. When it was changed back to upper case, it started working.
I could also solve this by setting the schema name on IDatabaseTester, e.g. iDatabaseTester.setSchema("SCHEMANAMEINCAPS").
Also, please make sure your connection doesn't have access to multiple schemas containing the same table name.
You might encounter issues when importing data via Hibernate before DBUnit runs. Depending on the database you are using, the casing of table and column names can matter.
For example, in HSQL, table names must be declared in uppercase.
If you import data via Hibernate's import.sql, make sure the table names are in uppercase there as well, otherwise you'll end up with the following problem:
Hibernate creates the tables in lower case
DBUnit reads the table names from the DB in lower case
DBUnit tries to import its datasets using upper case table names
You end up in a mess, with the ambiguous name exception.
Also remember to check whether tables were created in both upper and lower case during a previous run, in which case you need to clean those up too.
