I'm using Quartz 2.2.1. My code is as follows.
<bean name="myJob" class="org.springframework.scheduling.quartz.JobDetailFactoryBean">
<property name="jobClass" value="com.my.MyJob" />
<property name="durability" value="true" />
<property name="jobDataAsMap">
<map>
<entry key="param" value="myParam" />
</map>
</property>
</bean>
<bean id="myTrigger" class="org.springframework.scheduling.quartz.CronTriggerFactoryBean">
<property name="jobDetail" ref="myJob" />
<property name="cronExpression" value="0 0/1 * * * ?" />
</bean>
<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
<property name="jobDetails">
<list>
<ref bean="myJob" />
</list>
</property>
<property name="triggers">
<list>
<ref bean="myTrigger" />
</list>
</property>
<property name="quartzProperties" ref="quartzProperties"/>
</bean>
The Java class:
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class MyJob extends QuartzJobBean {
private String param;
private final org.slf4j.Logger log = LoggerFactory.getLogger(getClass());
@Override
protected void executeInternal(JobExecutionContext jobContext) throws JobExecutionException {
JobDataMap jobDataMap = jobContext.getJobDetail().getJobDataMap();
log.debug("size of jobdatamap {}", jobDataMap.keySet().size());
log.debug("quartz param {}", param);
// some code
}
public void setParam(String param) {
this.param = param;
}
public String getParam() {
return param;
}
}
But I always get param as null and the JobDataMap size as 0. I checked several examples on the web and they look the same as mine. My Quartz properties are:
org.quartz.scheduler.instanceId = AUTO
org.quartz.scheduler.makeSchedulerThreadDaemon = true
org.quartz.threadPool.class = org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount = 1
org.quartz.threadPool.makeThreadsDaemons = true
org.quartz.jobStore.class = org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.MSSQLDelegate
org.quartz.jobStore.dataSource = myDB
org.quartz.jobStore.tablePrefix = QRTZ_
org.quartz.jobStore.isClustered = false
org.quartz.dataSource.myDB.jndiURL = /jdbc/myDB
What am I missing?
I used the SchedulerFactoryBean to pass the parameters (via schedulerContextAsMap) and it works for me.
<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
<property name="jobDetails">
<list>
<ref bean="myJob" />
</list>
</property>
<property name="triggers">
<list>
<ref bean="myTrigger" />
</list>
</property>
<property name="quartzProperties" ref="quartzProperties"/>
<property name="schedulerContextAsMap">
<map>
<entry key="param" value="myParam" />
</map>
</property>
</bean>
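Here is a minimal sketch of how the job can pick that value up, assuming the "param" entry from schedulerContextAsMap above: Spring's QuartzJobBean applies scheduler-context entries to matching bean setters, and the same value can also be read explicitly through the Quartz scheduler context.

import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.SchedulerException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.quartz.QuartzJobBean;

public class MyJob extends QuartzJobBean {

    private final Logger log = LoggerFactory.getLogger(getClass());

    private String param;

    // QuartzJobBean copies scheduler-context entries onto matching setters,
    // so the "param" entry declared in schedulerContextAsMap lands here.
    public void setParam(String param) {
        this.param = param;
    }

    @Override
    protected void executeInternal(JobExecutionContext jobContext) throws JobExecutionException {
        log.debug("param injected via setter: {}", param);
        try {
            // The same entry can also be read directly from the scheduler context.
            String fromContext = (String) jobContext.getScheduler().getContext().get("param");
            log.debug("param from scheduler context: {}", fromContext);
        } catch (SchedulerException e) {
            throw new JobExecutionException("Could not access the scheduler context", e);
        }
    }
}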
I have a bean class as below:
public class Account {
private String strAccNumber = "";
private List<Account> accountList = new ArrayList<Account>();
// getter setter
....
...
@Override
public String toString() {
// code for PassThroughLineAggregator
String data="";
for (int j=0; j<accountList.size(); j++) {
Account bean = accountList.get(j);
data = data + bean.getStrAccNumber();
if (j<(accountList.size()-1)) {
data = data + "\n";
}
}
return data;
}
}
To write the data I want to set accountList in my config file.
I'm using the code below to set the bean properties.
<bean id="flatFileReader" class="org.springframework.batch.item.file.FlatFileItemReader"
scope="step">
<property name="resource" value="classpath:springBatch.csv" />
<property name="strict" value="false"></property>
<property name="linesToSkip" value="1" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="names" value="ACC#" />
</bean>
</property>
<property name="fieldSetMapper">
<bean
class="com.abc.reader.AccountDetailsFieldSetMapper" />
</property>
</bean>
</property>
</bean>
<bean id="flatFileProcessor"
class="com.abc.processor.AccountItemProcessor"
scope="step"></bean>
<bean id="flatFileWriter"
class="com.abc.FlatFileItemWriterListener"
scope="step">
<property name="resource" value="classpath:springBatch1.csv" />
<property name="lineAggregator">
<bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
<property name="fieldExtractor">
<bean class="org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor">
<property name="names" value="strAccNumber" />
</bean>
</property>
</bean>
</property>
</bean>
AccountItemProcessor -
public class AccountItemProcessor implements ItemProcessor<Account, List<Account>>{
@Override
public List<Account> process(Account accObj) throws Exception {
// logic..
}
}
After processing a single record in the process step I want to write multiple items (a list of items). How can I write multiple items at a time using an ArrayList? In my case I want to write the data from accountList.
You'll want to implement your own LineAggregator. That is what provides the String to be written to the file. In your case, you'll write one that returns a String that is really multiple lines.
You can read more about the LineAggregator interface in the documentation here: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/item/file/transform/LineAggregator.html
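A minimal sketch of such an aggregator, assuming the Account getters from the question (the class name is illustrative):

import org.springframework.batch.item.file.transform.LineAggregator;

public class AccountListLineAggregator implements LineAggregator<Account> {

    // Flattens the nested accountList of one Account item into a single
    // multi-line String: one account number per line.
    @Override
    public String aggregate(Account item) {
        StringBuilder data = new StringBuilder();
        int size = item.getAccountList().size();
        for (int j = 0; j < size; j++) {
            data.append(item.getAccountList().get(j).getStrAccNumber());
            if (j < size - 1) {
                data.append("\n");
            }
        }
        return data.toString();
    }
}

You would then reference it from the writer's lineAggregator property in place of the DelimitedLineAggregator shown above.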
I am importing a Spring XML configuration file from another project using <import resource="..."/>. The resource is apparently found, since there is no error message saying it isn't, but none of its beans get loaded. Is there something I'm doing wrong here? I should note that when I run my integration tests from within Eclipse it all works fine, but it blows up when running the same integration test from within a Maven build.
applicationContext.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd">
<import resource="classpath*:httpclient-4x.xml" />
<context:component-scan base-package="com.mycompany.data" />
<bean id="applicationProperties" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations">
<list>
<value>classpath*:properties/client.${deployment.env}.properties</value>
</list>
</property>
<property name="ignoreResourceNotFound" value="false" />
<property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE" />
<property name="searchSystemEnvironment" value="true" />
</bean>
<bean id="jacksonObjectMapper" class="com.fasterxml.jackson.databind.ObjectMapper" />
</beans>
httpclient-4x.xml
<beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context" xmlns:p="http://www.springframework.org/schema/p"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd">
<bean id="secureRestTemplate" class="org.springframework.web.client.RestTemplate">
<constructor-arg>
<ref bean="secureClientHttpRequestFactory" />
</constructor-arg>
<property name="messageConverters">
<list>
<bean class="org.springframework.http.converter.StringHttpMessageConverter">
<property name="supportedMediaTypes">
<list>
<util:constant static-field="org.springframework.http.MediaType.APPLICATION_XML" />
<util:constant static-field="org.springframework.http.MediaType.APPLICATION_JSON" />
<util:constant static-field="org.springframework.http.MediaType.TEXT_PLAIN" />
</list>
</property>
</bean>
<bean class="org.springframework.http.converter.FormHttpMessageConverter">
<property name="supportedMediaTypes">
<list>
<util:constant static-field="org.springframework.http.MediaType.MULTIPART_FORM_DATA" />
</list>
</property>
</bean>
<bean class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter">
<property name="objectMapper" ref="jacksonObjectMapper" />
</bean>
</list>
</property>
</bean>
<bean id="secureClientHttpRequestFactory" class="com.cobalt.inventory.security.PreemptiveBasicAuthClientHttpRequestFactory">
<constructor-arg>
<ref bean="secureCloseableHttpClient" />
</constructor-arg>
</bean>
<bean id="secureCloseableHttpClient" factory-bean="secureHttpClientBuilder" factory-method="build" />
<bean id="secureHttpClientBuilder" class="org.apache.http.impl.client.HttpClientBuilder" factory-method="create">
<property name="defaultCredentialsProvider" ref="credentialsProvider" />
<property name="defaultSocketConfig" ref="socketConfig" />
<property name="defaultRequestConfig" ref="requestConfig" />
<property name="userAgent" value="${http.user.agent: CDK Cobalt invJava}" />
<property name="maxConnPerRoute" value="${http.max.connections.per.route:100}" />
<property name="maxConnTotal" value="${http.max.connections:100}" />
</bean>
<bean id="credentialsProvider" class="org.apache.http.impl.client.BasicCredentialsProvider" />
<bean id="credentials" class="org.apache.http.auth.UsernamePasswordCredentials">
<constructor-arg index="0" name="userName" value="validUsername" />
<constructor-arg index="1" name="password" value="validPassword" />
</bean>
<bean id="socketConfigBuilder" class="org.apache.http.config.SocketConfig.Builder">
<property name="soKeepAlive" value="true" />
<property name="soTimeout" value="${http.socket.timeout:10000}" />
</bean>
<bean id="socketConfig" factory-bean="socketConfigBuilder" factory-method="build" />
<bean id="requestConfigBuilder" class="org.apache.http.client.config.RequestConfig.Builder">
<property name="connectTimeout" value="${http.connection.timeout:10000}" />
<property name="socketTimeout" value="${http.socket.timeout:10000}" />
<property name="cookieSpec">
<util:constant static-field="org.apache.http.client.config.CookieSpecs.IGNORE_COOKIES" />
</property>
</bean>
<bean id="requestConfig" factory-bean="requestConfigBuilder" factory-method="build" />
<!-- For accessing common authorization service -->
<bean id="commonAuthzSecureRestTemplate" class="org.springframework.web.client.RestTemplate">
<constructor-arg>
<ref bean="commonAuthzSecureClientHttpRequestFactory" />
</constructor-arg>
<property name="messageConverters">
<list>
<bean class="org.springframework.http.converter.StringHttpMessageConverter">
<property name="supportedMediaTypes">
<list>
<util:constant static-field="org.springframework.http.MediaType.APPLICATION_XML" />
<util:constant static-field="org.springframework.http.MediaType.APPLICATION_JSON" />
<util:constant static-field="org.springframework.http.MediaType.TEXT_PLAIN" />
</list>
</property>
</bean>
<bean class="org.springframework.http.converter.FormHttpMessageConverter">
<property name="supportedMediaTypes">
<list>
<util:constant static-field="org.springframework.http.MediaType.MULTIPART_FORM_DATA" />
</list>
</property>
</bean>
<bean id="jsonMessageConverter"
class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter ">
<property name="supportedMediaTypes">
<list>
<bean id="jsonMediaTypeApplicationJson" class="org.springframework.http.MediaType">
<constructor-arg value="application" />
<constructor-arg value="json" />
</bean>
</list>
</property>
</bean>
</list>
</property>
</bean>
<bean id="commonAuthzSecureClientHttpRequestFactory" class="com.mycompany.security.PreemptiveBasicAuthClientHttpRequestFactory">
<constructor-arg>
<ref bean="commonAuthzSecureCloseableHttpClient" />
</constructor-arg>
</bean>
<bean id="commonAuthzSecureCloseableHttpClient" factory-bean="commonAuthzSecureHttpClientBuilder" factory-method="build" />
<bean id="commonAuthzSecureHttpClientBuilder" class="org.apache.http.impl.client.HttpClientBuilder" factory-method="create">
<property name="defaultCredentialsProvider" ref="commonAuthzCredentialsProvider" />
<property name="defaultSocketConfig" ref="commonAuthzSocketConfig" />
<property name="defaultRequestConfig" ref="commonAuthzRequestConfig" />
<property name="userAgent" value="${http.user.agent: userAgent}" />
<property name="maxConnPerRoute" value="${http.max.connections.per.route:100}" />
<property name="maxConnTotal" value="${http.max.connections:100}" />
</bean>
<bean id="commonAuthzCredentialsProvider" class="org.apache.http.impl.client.BasicCredentialsProvider" />
<bean id="commonAuthzCredentials" class="org.apache.http.auth.UsernamePasswordCredentials">
<constructor-arg index="0" name="userName" value="${common_services.iam.remoting.user:user" />
<constructor-arg index="1" name="password" value="${common_services.iam.remoting.password:password}" />
</bean>
<bean id="commonAuthzSocketConfigBuilder" class="org.apache.http.config.SocketConfig.Builder">
<property name="soKeepAlive" value="true" />
<property name="soTimeout" value="${http.socket.timeout:10000}" />
</bean>
<bean id="commonAuthzSocketConfig" factory-bean="commonAuthzSocketConfigBuilder" factory-method="build" />
<bean id="commonAuthzRequestConfigBuilder" class="org.apache.http.client.config.RequestConfig.Builder">
<property name="connectTimeout" value="${http.connection.timeout:10000}" />
<property name="socketTimeout" value="${http.socket.timeout:10000}" />
<property name="cookieSpec">
<util:constant static-field="org.apache.http.client.config.CookieSpecs.IGNORE_COOKIES" />
</property>
</bean>
<bean id="commonAuthzRequestConfig" factory-bean="commonAuthzRequestConfigBuilder" factory-method="build" />
<util:list id="spel-configuration">
<value>#{credentialsProvider.setCredentials(T(org.apache.http.auth.AuthScope).ANY, credentials)}</value>
<value>#{commonAuthzCredentialsProvider.setCredentials(T(org.apache.http.auth.AuthScope).ANY, commonAuthzCredentials)} </value>
</util:list>
</beans>
I turned debug logging on for Spring and found the following log entries:
[2015-04-13 10:46:56,818][DEBUG][org.springframework.beans.factory.xml.XmlBeanDefinitionReader] - Loaded 0 bean definitions from location pattern [classpath*:httpclient-4x.xml]
[2015-04-13 10:46:56,818][DEBUG][org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader] - Imported 0 bean definitions from URL location [classpath*:httpclient-4x.xml]
I do not have control of the content of httpclient-4x.xml; I just reference it.
By request, here is CvsDataClientImpl:
package com.mycompany.data;
import javax.annotation.Resource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;
import com.mycompany.utility.ConditionalUtils;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
@Component
public class CvsDataClientImpl implements CvsDataClient<JsonNode> {
private static final int MAX_PAGE_SIZE = 100;
private static final String V1 = "/rest/v1.0/vehicles";
#Resource(name = "secureRestTemplate")
private RestTemplate restTemplate;
#Resource(name = "jsonVehiclesNodeMerger")
private JsonVehiclesNodeMerger nodeMerger;
#Value("${inv.vehicle-app.context:inventoryWebApp}")
private String vehicleAppContextPath;
#Value("${services.remoting.url:http://localhost:10080}")
private String remotingUrl;
@Override
public JsonNode read(String urlParameters) {
validateUrlParameters(urlParameters);
ResponseEntity<JsonNode> response = restTemplate.getForEntity(buildBaseRestUrl() + "?" + urlParameters, JsonNode.class);
return response.getBody();
}
@Override
public JsonNode readAll(String urlParameters) {
return readAll(urlParameters, null);
}
public JsonNode readAll(String urlParameters, Integer pageSize) {
int offset = 0;
int actualPageSize = pageSize != null ? pageSize : MAX_PAGE_SIZE;
JsonNode latestResult = null;
JsonNode baseResult = null;
do {
String extendedUrlParameters = buildUrlParameters(urlParameters, offset, actualPageSize);
latestResult = read(extendedUrlParameters);
baseResult = merge(latestResult, baseResult);
offset += actualPageSize;
} while (latestResult.get("searchResult").get("vehicles") != null);
return baseResult;
}
private JsonNode merge(JsonNode latestResult, JsonNode baseResult) {
JsonNode merged = null;
if (baseResult == null) {
merged = latestResult;
} else {
merged = nodeMerger.merge(latestResult, baseResult);
}
JsonNode searchResult = merged.get("searchResult");
((ObjectNode) searchResult).remove("summary");
return merged;
}
private String buildUrlParameters(String urlParameters, int offset, int pageSize) {
String parameters = urlParameters;
parameters += "&limit=" + pageSize + "&offset=" + offset;
return parameters;
}
private String buildBaseRestUrl() {
return remotingUrl + vehicleAppContextPath + V1;
}
private void validateUrlParameters(String urlParameters) {
if (ConditionalUtils.isNullOrEmpty(urlParameters) || urlParameters.indexOf("inventoryOwner") < 0 && urlParameters.indexOf("storeId") < 0) {
throw new IllegalArgumentException("Must provide at least an inventory owner or store ID");
}
}
void setVehicleAppContextPath(String contextPath) {
vehicleAppContextPath = contextPath;
}
void setRemotingUrl(String remotingUrl) {
this.remotingUrl = remotingUrl;
}
}
After some discussion in chat, minion and I found that httpclient-4x.xml is not included in the external project's built jar for some reason. Now I need to investigate why, but that's another story. :-)
What would be the best way to set custom parameters on a JdbcPagingItemReader query?
My custom JdbcPagingItemReader implementation:
public class CustomItemReader extends JdbcPagingItemReader<Long> {
public CustomItemReader(DataSource dataSource) throws Exception {
SqlPagingQueryProviderFactoryBean queryProvider = new SqlPagingQueryProviderFactoryBean();
queryProvider.setDataSource(dataSource);
queryProvider.setSelectClause("SELECT t1.id");
queryProvider.setFromClause("FROM table1 t1 LEFT JOIN table2 t2 ON t2.fk_table1_id = t1.id");
queryProvider.setWhereClause("WHERE (t1.col1 = :param1) AND ((t2.id IS NULL) OR (t2.col3 = :param2))");
queryProvider.setSortKey("t1.id");
setDataSource(dataSource);
setFetchSize(10);
setRowMapper(new RowMapper<Long>() {
@Override
public Long mapRow(ResultSet rs, int rowNum) throws SQLException {
return rs.getLong(1);
}
});
setQueryProvider(queryProvider.getObject());
}
}
To pass parameters from the job's ExecutionContext to the JdbcPagingItemReader, you'd configure the reader as follows:
<bean id="itemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
<property name="dataSource" ref="dataSource" />
<property name="rowMapper">
<bean class="MyRowMapper" />
</property>
<property name="queryProvider">
<bean class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="sortKeys">
<map>
<entry key="t1.id" value="ASCENDING"/>
</map>
</property>
<property name="selectClause" value="SELECT t1.id" />
<property name="fromClause" value="FROM table1 t1 LEFT JOIN table2 t2 ON t2.fk_table1_id = t1.id" />
<property name="whereClause" value=""WHERE (t1.col1 = :param1) AND ((t2.id IS NULL) OR (t2.col3 = :param2))" />
</bean>
</property>
<property name="pageSize" value="10" />
<property name="parameterValues">
<map>
<entry key="param1" value="#{jobExecutionContext[param1]}" />
<entry key="param2" value="#{jobExecutionContext[param2]}" />
</map>
</property>
</bean>
You can view a configuration like this in action in the Spring Batch examples here: https://github.com/spring-projects/spring-batch/tree/master/spring-batch-samples
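If you want to keep the custom Java reader from the question instead of XML, the same named parameters can be supplied programmatically via JdbcPagingItemReader's setParameterValues. A rough sketch, assuming the two values are obtained elsewhere (for example from job parameters or the execution context):

import java.util.HashMap;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;

public class CustomItemReader extends JdbcPagingItemReader<Long> {

    public CustomItemReader(DataSource dataSource, Object param1, Object param2) throws Exception {
        SqlPagingQueryProviderFactoryBean queryProvider = new SqlPagingQueryProviderFactoryBean();
        queryProvider.setDataSource(dataSource);
        queryProvider.setSelectClause("SELECT t1.id");
        queryProvider.setFromClause("FROM table1 t1 LEFT JOIN table2 t2 ON t2.fk_table1_id = t1.id");
        queryProvider.setWhereClause("WHERE (t1.col1 = :param1) AND ((t2.id IS NULL) OR (t2.col3 = :param2))");
        queryProvider.setSortKey("t1.id");

        setDataSource(dataSource);
        setFetchSize(10);
        setRowMapper((rs, rowNum) -> rs.getLong(1));
        setQueryProvider(queryProvider.getObject());

        // Bind the named parameters referenced in the WHERE clause above.
        Map<String, Object> parameters = new HashMap<>();
        parameters.put("param1", param1);
        parameters.put("param2", param2);
        setParameterValues(parameters);
    }
}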
I'm using Spring Batch to perform some calculations. In a reader I have to fetch a large amount of data to be processed in a processor/writer, and this process consumes a lot of memory (RAM).
So I tried to split the step using a partitioner, like below:
<batch:step id="MyStep.master" >
<partition step="MyStep" partitioner="MyPartitioner">
<handler grid-size="1" task-executor="TaskExecutor" />
</partition>
</batch:step>
<batch:step id="MyStep" >
<batch:tasklet transaction-manager="transactionManager">
<batch:chunk reader="MyReader" processor="MyProcessor"
writer="MyWriter" commit-interval="1000" skip-limit="1000">
<batch:skippable-exception-classes>
<batch:include class="...FunctionalException" />
</batch:skippable-exception-classes>
</batch:chunk>
</batch:tasklet>
</batch:step>
<bean id="MyPartitioner" class="...MyPartitioner" scope="step"/>
<bean id="TaskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor" >
<bean name="MyReader"
class="org.springframework.batch.item.database.JdbcCursorItemReader"
scope="step">
<property name="dataSource" ref="dataSource" />
<property name="sql">
<value>
<![CDATA[
SELECT...
]]>
</value>
</property>
<property name="rowMapper" ref="MyRowMapper" />
</bean>
<bean id="MyRowMapper" class="...MyRowMapper" />
<bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource" destroy-method="close">
<property name="driverClass" value="org.postgresql.Driver"/>
<property name="jdbcUrl" value="jdbc:postgresql://${database.host}/${database.name}"/>
<property name="user" value="${database.user}"/>
<property name="password" value="${database.password}"/>
<property name="acquireIncrement" value="1" />
<property name="autoCommitOnClose" value="true" />
<property name="minPoolSize" value="${min.pool.size}" /> <!-- min.pool.size=5 -->
<property name="maxPoolSize" value="${max.pool.size}" /> <!-- max.pool.size=15 -->
</bean>
But in vain: the partitioning takes a lot of memory too, because the slave steps are executed in parallel. What I want to do is split the step and execute the partitions sequentially (not in parallel) to reduce the memory usage. Is that possible?
The question is a bit old, so I'm not sure if this will still be helpful; you have probably solved it yourself by now.
If the row execution order is not a problem, the solution would be to query your DB in your partitioner bean and then pass each partition the information needed to split your table(s) into portions (start_key, end_key). This will reduce the RAM usage a lot.
Some warnings:
Be aware that the query used by the partitioner bean and the one used by the reader must have the same ORDER BY
The partitioned reader must be declared with scope="step"
Don't use this method if you need to process the rows in a precise order
To tune the RAM usage, try different combinations of gridSize and the task executor's pool size (that's why my gridSize is a job parameter)
Here is an example:
XML Configuration:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:batch="http://www.springframework.org/schema/batch"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:aop="http://www.springframework.org/schema/aop" xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-4.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-4.0.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-4.0.xsd
http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-4.0.xsd">
<!-- JOB -->
<batch:job id="printPdf" job-repository="jobRepository"
restartable="false">
<batch:step id="MyStep">
<batch:partition step="MyStep.template"
partitioner="myPartitioner" handler="partitionHandler">
</batch:partition>
</batch:step>
</batch:job>
<!-- Partitioner -->
<bean id="myPartitioner" class="foo.MyPartitioner"
scope="step">
<property name="jdbcTemplate" ref="myJdbcTemplate" />
<property name="sql"
value="Select ...." />
<property name="rowMap">
<bean
class="foo.MyPartitionHandlerRowMapper" />
</property>
<property name="preparedStatementSetter">
<bean
class="org.springframework.batch.core.resource.ListPreparedStatementSetter">
<property name="parameters">
<list>
<value>#{jobParameters['param1']}</value>
</list>
</property>
</bean>
</property>
</bean>
<bean id="partitionHandler" scope="step"
class="org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler">
<property name="taskExecutor" ref="customTaskExecutor" />
<property name="gridSize" value="#{jobParameters['gridSize']}" />
<property name="step" ref="MyStep.template" />
</bean>
<bean id="customTaskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="8" />
<property name="maxPoolSize" value="8" />
<property name="waitForTasksToCompleteOnShutdown" value="true" />
<property name="awaitTerminationSeconds" value="120" />
</bean>
<batch:step id="MyStep.tempate">
<batch:tasklet transaction-manager="transactionManager">
<batch:chunk commit-interval="2500" reader="myReader"
processor="myProcessor" writer="myWriter" skip-limit="2500">
<batch:skippable-exception-classes>
<batch:include class="...FunctionalException" />
</batch:skippable-exception-classes>
</batch:chunk>
</batch:tasklet>
</batch:step>
<!-- Beans -->
<!-- Processors -->
<bean id="myProcessor" class="foo.MyProcessor"
scope="step">
</bean>
<bean id="classLoaderVerifier"
class="it.addvalue.pkjwd.services.genbean.GenericStockKeysForNoDuplicate" />
<!-- Readers -->
<bean id="myReader"
class="org.springframework.batch.item.database.JdbcCursorItemReader"
scope="step">
<property name="dataSource" ref="myDataSouce" />
<property name="sql"
value="select ... from ... where ID >= ? and ID <= ?" />
<property name="rowMapper">
<bean class="foo.MyReaderPartitionedRowMapper" />
</property>
<property name="preparedStatementSetter">
<bean
class="org.springframework.batch.core.resource.ListPreparedStatementSetter">
<property name="parameters">
<list>
<value>#{stepExecutionContext['START_ID']}</value>
<value>#{stepExecutionContext['END_ID']}</value>
</list>
</property>
</bean>
</property>
</bean>
<!-- Writers -->
<bean id="myWriter"
class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="assertUpdates" value="false" />
<property name="itemPreparedStatementSetter">
<bean class="foo.MyWriterStatementSetters" />
</property>
<property name="sql"
value="insert ..." />
<property name="dataSource" ref="myDataSouce" />
</bean>
</beans>
Your Partitioner Bean will look like this:
package foo;
import foo.model.MyTable;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.lang.StringUtils;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.PreparedStatementSetter;
import org.springframework.jdbc.core.RowMapper;
public class MyPartitioner implements Partitioner
{
private JdbcTemplate jdbcTemplate;
private RowMapper<foo.model.MyTable> rowMap;
private String sql;
private PreparedStatementSetter preparedStatementSetter;
public JdbcTemplate getJdbcTemplate()
{
return jdbcTemplate;
}
public void setJdbcTemplate(JdbcTemplate jdbcTemplate)
{
this.jdbcTemplate = jdbcTemplate;
}
public RowMapper<foo.model.MyTable> getRowMap()
{
return rowMap;
}
public void setRowMap(RowMapper<foo.model.MyTable> rowMap)
{
this.rowMap = rowMap;
}
public String getSql()
{
return sql;
}
public void setSql(String sql)
{
this.sql = sql;
}
public PreparedStatementSetter getPreparedStatementSetter()
{
return preparedStatementSetter;
}
public void setPreparedStatementSetter(PreparedStatementSetter preparedStatementSetter)
{
this.preparedStatementSetter = preparedStatementSetter;
}
@Override
public Map<String, ExecutionContext> partition(int gridSize)
{
Map<String, ExecutionContext> map = new HashMap<String, ExecutionContext>();
try
{
List<MyTable> lstMyRows = jdbcTemplate.query(sql, preparedStatementSetter, rowMap);
if ( lstMyRows.size() > 0 )
{
int total = lstMyRows.size();
int rowsPerPartition = total / gridSize;
int leftovers = total % gridSize;
total = lstMyRows.size() - 1;
int startPos = 0;
int endPos = rowsPerPartition - 1;
int i = 0;
while (endPos <= (total))
{
ExecutionContext context = new ExecutionContext();
if ( endPos + leftovers == total )
{
endPos = total;
}
else if ( endPos >= (total) )
{
endPos = total;
}
context.put("START_ID", lstMyRows.get(startPos).getId());
context.put("END_ID", lstMyRows.get(endPos).getId());
map.put("PART_" + StringUtils.leftPad("" + i, ("" + gridSize).length(), '0'), context);
i++;
startPos = endPos + 1;
endPos = endPos + rowsPerPartition;
}
}
}
catch ( Exception e )
{
e.printStackTrace();
}
return map;
}
}
I have a Spring+Hibernate/Flex application that needs to switch dynamically between database schemas. To accomplish that I implemented an AbstractRoutingDataSource, following this article. Unfortunately, switching between data sources doesn't work.
Could someone help me?
I followed this link: Spring + Hibernate SessionFactory + AbstractRoutingDataSource
My code is:
- ApplicationContext:
<bean id="dataSource"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName">
<value>${jdbc.driverClassName}</value>
</property>
<property name="url">
<value>${jdbc.url}</value>
</property>
<property name="username">
<value>${jdbc.username}</value>
</property>
<property name="password">
<value>${jdbc.password}</value>
</property>
</bean>
<bean id="dataSource2"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName">
<value>${jdbc.driverClassName}</value>
</property>
<property name="url">
<value>${jdbc.url2}</value>
</property>
<property name="username">
<value>${jdbc.username2}</value>
</property>
<property name="password">
<value>${jdbc.password2}</value>
</property>
</bean>
<bean id="routingDS" class="br.com.cpb.gtf.infra.RoutingDataSource">
<property name="targetDataSources">
<map key-type="java.lang.String">
<entry key="br.com.cpb.gtf.infra.SchemaConstants.TESTE" value-ref="dataSource2"/>
<entry key="br.com.cpb.gtf.infra.SchemaConstants.PRODUCAO" value-ref="dataSource"/>
</map>
</property>
<property name="defaultTargetDataSource" ref="dataSource2"/>
</bean>
<bean id="sessionFactory"
class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="packagesToScan" value="br.com.cpb.*" />
<property name="hibernateProperties">
<props>
<prop key="hibernate.dialect">
org.hibernate.dialect.SQLServerDialect
</prop>
<prop key="hibernate.show_sql">true</prop>
<prop key="hibernate.hbm2ddl.auto">update</prop>
<prop key="hibernate.query.substitutions">true '1', false '0'" </prop>
</props>
</property>
</bean>
<bean id="transactionManager"
class="org.springframework.orm.hibernate3.HibernateTransactionManager">
<property name="sessionFactory" ref="sessionFactory" />
</bean>
-RoutingDataSource:
package br.com.cpb.gtf.infra;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
public class RoutingDataSource extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey()
{
return Globals.getSchema();
}
}
-SchemaConstants:
package br.com.cpb.gtf.infra;
public class SchemaConstants {
public static final String PRODUCAO = "dataSource";
public static final String TESTE = "dataSource2";
}
-Globals:
package br.com.cpb.gtf.infra;
public class Globals {
private static final ThreadLocal<String> schemaHolder = new ThreadLocal<String>();
public static void setSchema(String schema) {
schemaHolder.set(schema);
}
public static String getSchema() {
return schemaHolder.get();
}
public static void clearCustomerType() {
schemaHolder.remove();
}
}
-Service:
@RemotingInclude
@Transactional(propagation=Propagation.REQUIRED, rollbackFor=Exception.class)
public Adm_Usuario logar(Adm_Usuario user) {
Globals.clearCustomerType();
Globals.setSchema(SchemaConstants.TESTE);
Adm_UsuarioDao dao = new Adm_UsuarioDao(sessionFactory);
Adm_Usuario admUsuario = dao.logar(user);
setLoggedUserOnSession(admUsuario);
return admUsuario;
}
<bean id="sessionFactory"
class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
In the dataSource property, the ref should be routingDS; otherwise the routing data source is never used:
<property name="dataSource" ref="routingDS" />