I have a Spring Boot project with a custom CacheResolver, because I need to decide at runtime which cache to use. I don't have any compilation errors, but when I run some tests and place a breakpoint in my custom CacheResolver, it never gets stepped into.
This is my Configuration class for the Cache:
@Configuration
@EnableCaching(proxyTargetClass = true)
@PropertySource(CacheConfig.CLASSPATH_DEPLOY_CACHE_PROPERTIES_PROPERTIES)
public class CacheConfig extends CachingConfigurerSupport {

    public static final String CLASSPATH_DEPLOY_CACHE_PROPERTIES_PROPERTIES = "classpath:/deploy/cache-properties.properties";
    public static final String CACHEABLE_DOCUMENTS_PROPERTY = "cacheable.documents";
    public static final String TTL_CACHEABLE_DOCUMENTS_PROPERTY = "ttl.cacheable.documents";
    public static final String SIZED_CACHEABLE_DOCUMENTS_PROPERTY = "sized.cacheable.documents";
    public static final String CACHE_NAME = "permanentCache";
    public static final String TTL_CACHE = "ttlCache";
    public static final String SIZED_CACHE = "sizedCache";
    public static final String CACHEABLE_DOCUMENTS = "cacheableDocuments";
    public static final String SIZED_CACHEABLE_DOCUMENTS = "sizedCacheableDocuments";
    public static final int WEIGHT = 1000000;
    public static final int TO_KBYTES = 1000;

    @Inject
    protected Environment environment;

    //@Bean
    @Override
    public CacheManager cacheManager() {
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        GuavaCache sizedCache = new GuavaCache(SIZED_CACHE, CacheBuilder.newBuilder().maximumWeight(WEIGHT).weigher(
                (key, storable) -> {
                    String json = ((Storable) storable).toJson();
                    return json.getBytes().length / TO_KBYTES;
                }
        ).build());
        GuavaCache permanentCache = new GuavaCache(CACHE_NAME, CacheBuilder.newBuilder().build());
        //GuavaCache ttlCache = new GuavaCache(TTL_CACHE, CacheBuilder.newBuilder().expireAfterWrite(30, TimeUnit.MINUTES).build());
        cacheManager.setCaches(Arrays.asList(permanentCache, sizedCache));
        return cacheManager;
    }

    @Bean(name = "wgstCacheResolver")
    @Override
    public CacheResolver cacheResolver() {
        CacheResolver cacheResolver = new WgstCacheResolver(cacheManager(), cacheableDocuments(), sizedCacheableDocuments());
        return cacheResolver;
    }

    @Bean(name = CACHEABLE_DOCUMENTS)
    public List<String> cacheableDocuments() {
        String[] cacheableDocuments = StringUtils.commaDelimitedListToStringArray(environment.getProperty(CACHEABLE_DOCUMENTS_PROPERTY));
        return Arrays.asList(cacheableDocuments);
    }

    @Bean(name = SIZED_CACHEABLE_DOCUMENTS)
    public List<String> sizedCacheableDocuments() {
        String[] sizedCacheableDocuments = StringUtils.commaDelimitedListToStringArray(environment.getProperty(SIZED_CACHEABLE_DOCUMENTS_PROPERTY));
        return Arrays.asList(sizedCacheableDocuments);
    }
}
Here is my CacheResolver:
public class WgstCacheResolver extends AbstractCacheResolver {

    private final List<String> cacheableDocuments;
    private final List<String> sizedCacheableDocuments;

    public WgstCacheResolver(final CacheManager cacheManager, final List<String> cacheableDocuments, final List<String> sizedCacheableDocuments) {
        super(cacheManager);
        this.cacheableDocuments = cacheableDocuments;
        this.sizedCacheableDocuments = sizedCacheableDocuments;
    }

    /**
     * Resolves the cache(s) to be updated at runtime.
     * @param context
     * @return
     */
    @Override
    protected Collection<String> getCacheNames(final CacheOperationInvocationContext<?> context) {
        final Collection<String> cacheNames = new ArrayList<>();
        final AbstractDao dao = (AbstractDao) context.getTarget();
        final String documentType = dao.getDocumentType().toString();
        if (cacheableDocuments.contains(documentType)) {
            cacheNames.add("permanentCache");
        }
        if (sizedCacheableDocuments.contains(documentType)) {
            cacheNames.add("sizedCache");
        }
        return cacheNames;
    }
}
And here is my DAO, where I use the cache:
@Component
@Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE, proxyMode = ScopedProxyMode.DEFAULT)
@CacheConfig(cacheResolver = "wgstCacheResolver")
public class CacheableDao<T extends Storable> extends AbstractDao<T> {

    private final Logger logger = LoggerFactory.getLogger(CacheableDao.class);

    public CacheableDao(final Bucket bucket, final Class<T> typeParameterClass, final DocumentType documentType) {
        super(bucket, typeParameterClass, documentType);
    }

    @Cacheable(key = "{#root.methodName, #root.target.generateFullKey(#key)}")
    public T get(final String key) throws DatastoreAccessException, ObjectMappingException {
        //do something
    }

    // ...
}
I have tried implementing CacheResolver instead of extending AbstractCacheResolver but it didn't make any difference.
Thank you.
Cache names need to be included at some point; just specifying the CacheResolver to use is not enough. The @Cacheable class needs to be aware of the available cache names, so I included them with the @CacheConfig annotation:
@CacheConfig(cacheNames = {WgstCacheConfig.PERMANENT_CACHE, WgstCacheConfig.SIZED_CACHE},
        cacheResolver = WgstCacheConfig.WGST_CACHE_RESOLVER)
public class CacheableDao<T extends Storable> extends AbstractDao<T> {
One thing that I don't like is that I need to provide a null CacheManager, even if I'm not using it, otherwise I get the following error:
Caused by: java.lang.IllegalStateException: No CacheResolver specified, and no bean of type CacheManager found. Register a CacheManager bean or remove the @EnableCaching annotation from your configuration.
So I left it like this, and it works:
@Bean
public CacheManager cacheManager() {
    return null;
}

@Bean(name = WGST_CACHE_RESOLVER)
public CacheResolver cacheResolver() {
    CacheResolver cacheResolver = new WgstCacheResolver(cacheableDocuments(), sizedCacheableDocuments(),
            getPermanentCache(), getSizedCache());
    return cacheResolver;
}
I reran my tests, stepping through my custom CacheResolver, and it behaves as expected, resolving to the correct cache(s).
My configuration class no longer extends CachingConfigurerSupport.
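A minimal sketch of what the reworked resolver could look like under this setup (assuming the two GuavaCache beans passed into the constructor above, and the same AbstractDao target; this is illustrative, not the exact original code):
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

import org.springframework.cache.Cache;
import org.springframework.cache.interceptor.CacheOperationInvocationContext;
import org.springframework.cache.interceptor.CacheResolver;

public class WgstCacheResolver implements CacheResolver {

    private final List<String> cacheableDocuments;
    private final List<String> sizedCacheableDocuments;
    private final Cache permanentCache;
    private final Cache sizedCache;

    public WgstCacheResolver(final List<String> cacheableDocuments, final List<String> sizedCacheableDocuments,
                             final Cache permanentCache, final Cache sizedCache) {
        this.cacheableDocuments = cacheableDocuments;
        this.sizedCacheableDocuments = sizedCacheableDocuments;
        this.permanentCache = permanentCache;
        this.sizedCache = sizedCache;
    }

    @Override
    public Collection<? extends Cache> resolveCaches(final CacheOperationInvocationContext<?> context) {
        // Same document-type dispatch as before, but returning Cache instances
        // directly instead of cache names, so no CacheManager is required.
        final Collection<Cache> caches = new ArrayList<>();
        final String documentType = ((AbstractDao) context.getTarget()).getDocumentType().toString();
        if (cacheableDocuments.contains(documentType)) {
            caches.add(permanentCache);
        }
        if (sizedCacheableDocuments.contains(documentType)) {
            caches.add(sizedCache);
        }
        return caches;
    }
}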
After a bit of back and forth (Sorry about that!) it turns out this is indeed a bug in Spring Framework.
I've created SPR-13081. Expect a fix for the next maintenance release (4.1.7.RELEASE). Thanks for the sample project!
I have a class annotated with @Component which is used to initialize application.yml config properties. My service class uses a configuration property, but sometimes my service class instance is created before the configuration class and I get a null property value in the service class. It happens randomly, with no specific pattern.
Configuration initializer class:
@Component
public class ConfigInitializer implements InitializingBean {

    private static final Logger log = LoggerFactory.getLogger(ConfigInitializer.class);

    @Autowired
    ProxyConfig proxyConfig;

    /*@PostConstruct
    public void postConstruct() {
        setSystemProperties();
    }
    */

    @Override
    public void afterPropertiesSet() {
        setSystemProperties();
    }

    private void setSystemProperties() {
        log.debug("Setting properties...");
        Properties props = new Properties();
        props.put("PROXY_URL", proxyConfig.getProxyUrl());
        props.put("PROXY_PORT", proxyConfig.getProxyPort());
        System.getProperties().putAll(props);
    }
}
@Component
@ConfigurationProperties(prefix = "proxy-config")
public static class ProxyConfig {

    private String proxyUrl;
    private String proxyPort;

    public String getProxyUrl() {
        return proxyUrl;
    }

    public void setProxyUrl(String proxyUrl) {
        this.proxyUrl = proxyUrl;
    }

    public String getProxyPort() {
        return proxyPort;
    }

    public void setProxyPort(String proxyPort) {
        this.proxyPort = proxyPort;
    }
}
Service class:
@Service("receiverService")
public class ReceiverService {

    private static final Logger logger = LoggerFactory.getLogger(ReceiverService.class);
    private ExecutorService executorService = Executors.newSingleThreadExecutor();

    @Autowired
    public ReceiverService() throws Exception {
        initClient();
    }

    private void initClient() throws Exception {
        Future<Object> future = executorService.submit(new Callable<Object>() {
            public Object call() throws Exception {
                String value = System.getProperty("PROXY_URL"); // Here I am getting null
                logger.info("Values : " + value);
                return value;
            }
        });
        System.out.println("future.get() = " + future.get());
    }
}
The service class above gets a null value from String value = System.getProperty("PROXY_URL").
When I use the @DependsOn annotation on the service class, it works fine.
As far as I know, Spring does not guarantee a specific order of bean creation.
I want to know: if I use @Configuration instead of @Component on the ConfigInitializer class, like below, will Spring initialize the ConfigInitializer class before other beans?
@Configuration
public class ConfigInitializer implements InitializingBean {
    //code here
}
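For reference, a minimal sketch of the @DependsOn arrangement that works, as described above (assuming the default bean name "configInitializer" that Spring derives for the ConfigInitializer class):
import org.springframework.context.annotation.DependsOn;
import org.springframework.stereotype.Service;

@Service("receiverService")
@DependsOn("configInitializer")
public class ReceiverService {

    public ReceiverService() {
        // Spring now constructs configInitializer (and runs its afterPropertiesSet
        // callback) before this bean, so the system property is already set here.
        String value = System.getProperty("PROXY_URL");
    }
}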
Good evening all. Maybe someone can help me sort this out.
I'm attempting to set up multiple cache managers, each with its own set of distinct caches. To that end, I have the following:
MyCachingFactory:
public interface MyCachingFactory {
    public JCacheCacheManager cacheManager();
    public JCacheCacheManager commonCacheManager();
    public JCacheCacheManager secondaryCacheManager();
    public JCacheCacheManager tertiaryCacheManager();
}
MyCachingFactoryImpl:
@EnableCaching
public class MyCachingFactoryImpl implements MyCachingFactory {

    private static final Logger logger = LoggerFactory.getLogger(MyCachingFactoryImpl.class);

    private static final int DEFAULT_CACHE_EXPIRY = 64800;

    private static final List<String> DEFAULT_CACHES = Arrays.asList(
            "default");
    private static final List<String> COMMON_CACHES = Arrays.asList(
            "common_default",
            "common_config",
            "common_session",
            "common_roles",
            "common_capabilities");
    private static final List<String> SECONDARY_CACHES = Arrays.asList(
            "secondary_default",
            "secondary_foo");
    private static final List<String> TERTIARY_CACHES = Arrays.asList(
            "tertiary_default",
            "tertiary_foo");

    @Bean
    public JCacheCacheManager cacheManager() {
        return getNewJCacheManager(DEFAULT_CACHES);
    }

    @Bean
    public JCacheCacheManager commonCacheManager() {
        return getNewJCacheManager(COMMON_CACHES);
    }

    @Bean
    public JCacheCacheManager secondaryCacheManager() {
        return getNewJCacheManager(SECONDARY_CACHES);
    }

    @Bean
    public JCacheCacheManager tertiaryCacheManager() {
        return getNewJCacheManager(TERTIARY_CACHES);
    }

    private JCacheCacheManager getNewJCacheManager(List<String> cacheNames) {
        JCacheCacheManager springJCacheManager = new JCacheCacheManager();
        CachingProvider provider = Caching.getCachingProvider();
        EhcacheCachingProvider ehcacheProvider = (EhcacheCachingProvider) provider;
        javax.cache.CacheManager jCacheManager = ehcacheProvider.getCacheManager();
        AtomicInteger count = new AtomicInteger(1);
        cacheNames.forEach((listName) -> {
            logger.debug("[" + count + "] Creating cache name [" + listName + "].");
            jCacheManager.createCache(listName, new MutableConfiguration<>()
                    .setExpiryPolicyFactory(TouchedExpiryPolicy.factoryOf(new Duration(TimeUnit.SECONDS, DEFAULT_CACHE_EXPIRY)))
                    .setStoreByValue(false)
                    .setStatisticsEnabled(true));
            count.incrementAndGet();
        });
        springJCacheManager.setCacheManager(jCacheManager);
        return springJCacheManager;
    }
}
Example class:
public class Example {

    @Resource
    private JCacheCacheManager commonCacheManager;

    @CacheResult(cacheName = "common_config")
    public String getCachedConfigByName(String name) { ... }

    @CacheResult(cacheName = "common_config")
    public String getCachedConfigByType(String type) { ... }
}
However, when starting up my application, I see my debug statements for cache creation iterating over the same list (in this case, COMMON_CACHES) multiple times. This, in turn, causes a nested exception in which Spring complains that a cache named X already exists.
What am I doing wrong here?
Thanks in advance (go easy please... I am new to spring caching).
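One thing worth checking (offered as a hedged observation, not a confirmed diagnosis): the no-argument CachingProvider.getCacheManager() call always returns the provider's default CacheManager, so all four @Bean methods above register their caches on the same underlying javax.cache.CacheManager. A sketch of isolating each Spring-level manager behind its own JCache manager via distinct URIs, with a guard against double registration (the extra managerId parameter, the "urn:my-caching:" scheme, and the java.net.URI usage are illustrative assumptions):
private JCacheCacheManager getNewJCacheManager(String managerId, List<String> cacheNames) {
    EhcacheCachingProvider ehcacheProvider = (EhcacheCachingProvider) Caching.getCachingProvider();
    // A distinct URI per manager yields a distinct underlying javax.cache.CacheManager.
    javax.cache.CacheManager jCacheManager = ehcacheProvider.getCacheManager(
            java.net.URI.create("urn:my-caching:" + managerId), getClass().getClassLoader());
    for (String listName : cacheNames) {
        if (jCacheManager.getCache(listName) == null) { // skip names that already exist
            jCacheManager.createCache(listName, new MutableConfiguration<>()
                    .setExpiryPolicyFactory(TouchedExpiryPolicy.factoryOf(new Duration(TimeUnit.SECONDS, DEFAULT_CACHE_EXPIRY)))
                    .setStoreByValue(false)
                    .setStatisticsEnabled(true));
        }
    }
    JCacheCacheManager springJCacheManager = new JCacheCacheManager();
    springJCacheManager.setCacheManager(jCacheManager);
    return springJCacheManager;
}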
I am coding Dropwizard microservices that fetch data from a MongoDB database. The microservices work fine, but I'm struggling to use the configuration coming from my Dropwizard configuration Java class in my DAO. Currently I have:
public class XDAO implements IXDAO {

    protected DB db;
    protected DBCollection collection;

    /* singleton */
    private static XDAO instance;

    /* Get singleton */
    public static synchronized XDAO getSingleton() {
        if (instance == null) {
            instance = new XDAO();
        }
        return instance;
    }

    /* constructor */
    public XDAO() {
        initDatabase();
        initDatabaseIndexes();
    }

    private void initDatabase() {
        MongoClient client = null;
        try {
            client = new MongoClient("10.126.80.192", 27017);
            db = client.getDB("terre");
            //then some other code
        }
        catch (final MongoException e) {
            ...
        }
        catch (UnknownHostException e) {
            ...
        }
    }
}
I want to stop hard-coding the three arguments in these two lines:
client = new MongoClient("10.126.80.192", 27017);
db = client.getDB("terre");
My MongoConfiguration Java class is :
public class MongoConfiguration extends Configuration {
#JsonProperty
#NotEmpty
public String host;
#JsonProperty
public int port = 27017;
#JsonProperty
#NotEmpty
public String db_name;
public String getMongohost() {
return host;
}
public void setMongohost(String host) {
this.host = host;
}
public int getMongoport() {
return port;
}
public void setMongoport(int port) {
this.port = port;
}
public String getDb_name() {
return db_name;
}
public void setDb_name(String db_name) {
this.db_name = db_name;
}
}
My Resource class that uses the DAO is:
@Path("/mongo")
@Produces(MediaType.APPLICATION_JSON)
public class MyResource {

    private XDAO xDAO = XDAO.getSingleton();
    private String mongohost;
    private String db_name;
    private int mongoport;

    public MyResource(String db_name, String mongohost, int mongoport) {
        this.db_name = db_name;
        this.mongohost = mongohost;
        this.mongoport = mongoport;
    }

    public MyResource() {
    }

    @GET
    @Path("/findByUUID")
    @Produces(value = MediaType.APPLICATION_JSON)
    @Timed
    public Entity findByUUID(@QueryParam("uuid") String uuid) {
        return xDAO.findByUUid(uuid);
    }
}
And in my application class there is:
@Override
public void run(final MongoConfiguration configuration, final Environment environment) {
    final MyResource resource = new MyResource(configuration.getDb_name(), configuration.getMongohost(), configuration.getMongoport());
    environment.jersey().register(resource);
}
To solve my problem I tried many things. The last thing I tried was to add these four fields to my XDAO:
private String mongohost;
private String db_name;
private int mongoport;
private static final MongoConfiguration configuration = new MongoConfiguration();
along with this piece of code in the constructor of the XDAO:
public XDAO() {
    instance.mongohost = configuration.getMongohost();
    instance.mongoport = configuration.getMongoport();
    instance.db_name = configuration.getDb_name();
    /* then like before */
    initDatabase();
    initDatabaseIndexes();
}
When I try this, I get a NullPointerException when my initDatabase method is invoked: mongohost and db_name are null.
The problem is that you are creating a new configuration in your XDAO with private static final MongoConfiguration configuration = new MongoConfiguration(); instead of using the config from Dropwizard's run method.
When you do this, the fields host and db_name in the new configuration are null, which is why you are getting the NPE when instantiating XDAO.
You need to pass the instance of MongoConfiguration that you get from Dropwizard in your application class to your XDAO, ideally when the singleton XDAO is created, so it has non-null values for db_name and host.
The code below is part of the problem: you are creating the singleton without giving XDAO the MongoConfiguration instance.
public class XDAO implements IXDAO {
    //... snip

    /* Get singleton */
    public static synchronized XDAO getSingleton() {
        if (instance == null) {
            instance = new XDAO(); // no configuration information is included!
        }
        return instance;
    }

    /* constructor */
    public XDAO() {
        initDatabase(); // this call needs db_name & host but you haven't set those yet!!
        initDatabaseIndexes();
    }
}
I recommend you modify your application class to create XDAO along the lines of this:
@Override
public void run(final MongoConfiguration configuration, final Environment environment) {
    XDAO XDAOsingleton = new XDAO(configuration);
    XDAO.setSingletonInstance(XDAOsingleton); // You need to create this static method.
    final MyResource resource = new MyResource(configuration.getDb_name(), configuration.getMongohost(), configuration.getMongoport()); // MyResource depends on XDAO, so it must be created after XDAO's singleton is set
    environment.jersey().register(resource);
}
You may also need to take initDatabase() etc. out of XDAO's constructor, depending on whether you keep public static synchronized XDAO getSingleton().
I also recommend you change the constructor of MyResource to public MyResource(XDAO xdao). The resource class doesn't appear to need the configuration information, and it is better to make the dependency on an XDAO explicit (you then also don't need to keep the XDAO singleton in a static field inside XDAO's class).
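Putting the suggestions together, a minimal sketch of the reworked XDAO (the setSingletonInstance method and the constructor parameter follow the recommendations above; initDatabase() and initDatabaseIndexes() are the questioner's existing methods, so this is illustrative rather than complete):
public class XDAO implements IXDAO {

    private static XDAO instance;

    private final String mongohost;
    private final int mongoport;
    private final String db_name;

    public XDAO(MongoConfiguration configuration) {
        this.mongohost = configuration.getMongohost();
        this.mongoport = configuration.getMongoport();
        this.db_name = configuration.getDb_name();
        initDatabase();       // can now safely read mongohost, mongoport and db_name
        initDatabaseIndexes();
    }

    public static synchronized void setSingletonInstance(XDAO xdao) {
        instance = xdao;
    }

    public static synchronized XDAO getSingleton() {
        return instance; // set once from the application's run method
    }

    // initDatabase() / initDatabaseIndexes() as before, but using the fields above
}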
To integrate MongoDB with Dropwizard in a simple way, please try using a MongoDB Managed object. I will explain this in 3 simple steps.
Step 1: Create a simple MongoManaged class:
import com.mongodb.Mongo;
import io.dropwizard.lifecycle.Managed;

public class MongoManaged implements Managed {

    private Mongo mongo;

    public MongoManaged(Mongo mongo) {
        this.mongo = mongo;
    }

    @Override
    public void start() throws Exception {
    }

    @Override
    public void stop() throws Exception {
        mongo.close();
    }
}
Step 2: Mention MongoDB Host, Port, DB Name in a config yml file:
mongoHost : localhost
mongoPort : 27017
mongoDB : softwaredevelopercentral
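For completeness, a minimal sketch of the configuration class these yml keys would bind to (reconstructed from the getters used in the run method below; the collectionName field is an assumption, since getCollectionName() is also called there):
import com.fasterxml.jackson.annotation.JsonProperty;
import io.dropwizard.Configuration;

public class DropwizardMongoDBConfiguration extends Configuration {

    @JsonProperty
    private String mongoHost;

    @JsonProperty
    private int mongoPort;

    @JsonProperty
    private String mongoDB;

    @JsonProperty
    private String collectionName; // assumed: getCollectionName() is used in the run method

    public String getMongoHost() { return mongoHost; }

    public int getMongoPort() { return mongoPort; }

    public String getMongoDB() { return mongoDB; }

    public String getCollectionName() { return collectionName; }
}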
Step 3: Bind everything together in the Application Class:
public class DropwizardMongoDBApplication extends Application<DropwizardMongoDBConfiguration> {

    private static final Logger logger = LoggerFactory.getLogger(DropwizardMongoDBApplication.class);

    public static void main(String[] args) throws Exception {
        new DropwizardMongoDBApplication().run("server", args[0]);
    }

    @Override
    public void initialize(Bootstrap<DropwizardMongoDBConfiguration> b) {
    }

    @Override
    public void run(DropwizardMongoDBConfiguration config, Environment env)
            throws Exception {
        MongoClient mongoClient = new MongoClient(config.getMongoHost(), config.getMongoPort());
        MongoManaged mongoManaged = new MongoManaged(mongoClient);
        env.lifecycle().manage(mongoManaged);

        MongoDatabase db = mongoClient.getDatabase(config.getMongoDB());
        MongoCollection<Document> collection = db.getCollection(config.getCollectionName());

        logger.info("Registering RESTful API resources");
        env.jersey().register(new PingResource());
        env.jersey().register(new EmployeeResource(collection, new MongoService()));
        env.healthChecks().register("DropwizardMongoDBHealthCheck",
                new DropwizardMongoDBHealthCheckResource(mongoClient));
    }
}
I have used these steps and written a blog post; sample working application code is available on GitHub. Please check: http://softwaredevelopercentral.blogspot.com/2017/09/dropwizard-mongodb-tutorial.html
I had already tried the solutions mentioned in "Why is my Spring @Autowired field null?", yet the problem persists. I have tried annotating the class DevicesPojo (code below) with @Configurable and @Service.
Here are my beans
DistributionConfig.java
@Component
@Configuration
public class DistributionConfig {

    @Qualifier("exponentialDistribution")
    @Bean
    @Scope("prototype")
    public DistributionService exponentialDistribution() {
        return new ExponentiallyDistribute();
    }

    @Qualifier("normalDistribution")
    @Bean
    @Scope("prototype")
    public DistributionService normalDistribution() {
        return new NormallyDistribute();
    }

    @Qualifier("uniformDistribution")
    @Bean
    @Scope("prototype")
    public DistributionService uniformDistribution() {
        return new UniformlyDistribute();
    }
}
JsonFileConfig.java
@Configuration
public class JsonFileConfig {

    private static ObjectMapper mapper = new ObjectMapper();

    @Qualifier("devicesPojo")
    @Bean
    public DevicesPojo[] devicesPojo() throws Exception {
        DevicesPojo[] devicePojo = mapper.readValue(new File(getClass().getClassLoader().getResource("Topo/esnet-devices.json").getFile()), DevicesPojo[].class);
        return devicePojo;
    }

    @Qualifier("linksPojo")
    @Bean
    public LinksPojo[] linksPojo() throws Exception {
        LinksPojo[] linksPojo = mapper.readValue(new File(getClass().getClassLoader().getResource("Topo/esnet-adjcies.json").getFile()), LinksPojo[].class);
        return linksPojo;
    }
}
Here is my DevicesPojo, where I get the NullPointerException:
@JsonDeserialize(using = DeviceDeserializer.class)
@Component
public class DevicesPojo {

    private String device;
    private List<String> ports;
    private List<Integer> bandwidth;

    @Autowired
    @Qualifier("uniformDistribution")
    private DistributionService uniformDistribution; // Here uniformDistribution is null

    public DevicesPojo(String device, List<String> port, List<Integer> bandwidth) {
        this.device = device;
        this.ports = port;
        this.bandwidth = bandwidth;
        this.uniformDistribution.createUniformDistribution(1000, 0, ports.size());
    }

    public String getDevice() {
        return device;
    }

    public String getRandomPortForDevice() {
        return ports.get((int) uniformDistribution.getSample());
    }

    public List<String> getAllPorts() {
        return ports;
    }

    public int getBandwidthForPort(String port) {
        return bandwidth.get(ports.indexOf(port));
    }
}
However, if I replace private DistributionService uniformDistribution; with private DistributionService uniformDistribution = new UniformlyDistribute();, the code works fine.
There is a mix of problems here:
1. You create your DevicesPojo objects using a JSON deserializer, so Spring has no chance to interfere and inject the DistributionService.
2. Even if it could interfere, it would fail, since you are trying to use the uniformDistribution object in the constructor. Field injection happens only after an object is constructed.
Now, regarding fixing the problems.
Long story short: don't expect auto-injection in your POJOs.
Normally, dependencies like uniformDistribution in objects that are created on the fly, like your DevicesPojo, are avoided altogether.
If you insist on having them, inject them manually at construction time:
@Component
class DevicesPojoFactory {

    @Autowired
    @Qualifier("uniformDistribution")
    private DistributionService uniformDistribution;

    private ObjectMapper mapper = new ObjectMapper();

    DevicesPojo[] readFromFile(String path) throws Exception {
        DevicesPojo[] devicePojoArr = mapper.readValue(...);
        for (DevicesPojo dp : devicePojoArr) {
            dp.setDistribution(uniformDistribution); // requires adding a setter to DevicesPojo
        }
        return devicePojoArr;
    }
}
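A possible way to wire the factory into the existing JsonFileConfig (a sketch assuming DevicesPojo gains the setDistribution(...) setter mentioned above and the factory is registered as a Spring bean):
@Configuration
public class JsonFileConfig {

    @Autowired
    private DevicesPojoFactory devicesPojoFactory;

    @Qualifier("devicesPojo")
    @Bean
    public DevicesPojo[] devicesPojo() throws Exception {
        // The factory injects the DistributionService after deserialization,
        // so DevicesPojo no longer relies on field auto-injection.
        return devicesPojoFactory.readFromFile("Topo/esnet-devices.json");
    }
}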
Why would I be getting a cast error from a proxy based on a bean that is properly implementing an interface? Package and object names have been changed to protect proprietary data.
I know the error
java.lang.ClassCastException: com.sun.proxy.$Proxy20 cannot be cast to package.Order
normally comes up when a class does not implement the expected interface. In this case I am implementing interfaces precisely to facilitate proxy generation.
My specific problem is using a collection of domain objects in an enhanced for loop within an ItemWriter implementation. This exception is being thrown from the enhanced for loop I reference below.
public void write(List<? extends Order> items) throws Exception {
    for (Order order : items) {
        // ...
Order is an interface implemented by OrderBean.
public class OrderBean implements Order
And it is declared as a prototype for use by a BeanWrapperFieldSetMapper in a Spring Java configuration class, as below.
@Bean
@Scope("prototype")
public Order order()
As an experiment I commented out the Java Configuration declaration, and replaced it with the XML declaration below.
<bean name="order"
      scope="prototype"
      class="package.OrderBean"/>
The entirety of my configuration appears below, as requested in the comments. I am not sure why the Order objects are being proxied, unless possibly it comes from the BeanWrapperFieldSetMapper.
Upon further testing, I found I get the same error from any bean set with the step scope, as in @Scope("step").
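For what it's worth, one hedged observation about step scope: step-scoped beans are handed out through scoped proxies, and an interface-based JDK proxy can only be cast to the interface it implements, never to the concrete class. Forcing a class-based (CGLIB) proxy, sketched below for one bean from the configuration that follows, keeps the concrete type castable (this illustrates the mechanism; it is not a confirmed diagnosis of the error above):
@Bean
@Scope(value = "step", proxyMode = ScopedProxyMode.TARGET_CLASS)
public Map<String, Order> orderCache() {
    // CGLIB subclasses the target type here, so the scoped proxy
    // remains castable to the concrete class, not just its interfaces.
    return new HashMap<String, Order>();
}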
@Configuration
public class OrderProcessingBatchConfiguration {

    @Value("${batch.jdbc.driver}")
    private String driverClassName;
    @Value("${batch.jdbc.url}")
    private String driverUrl;
    @Value("${batch.jdbc.user}")
    private String driverUsername;
    @Value("${batch.jdbc.password}")
    private String driverPassword;
    @Value("${order.delimiter}")
    private String delimiter;
    @Value("${order.item.field.names}")
    private String[] orderItemFieldNames;
    @Value("${order.item.file.path}")
    private String orderItemFilePath;
    @Value("${order.field.names}")
    private String[] orderFieldNames;
    @Value("${order.file.path}")
    private String orderFilePath;
    @Value("${query.order.clear}")
    private String deleteOrderQuery;
    @Value("${query.order.item.clear}")
    private String deleteOrderItemQuery;
    @Value("${ftp.host.name}")
    private String ftpHostName;
    @Value("${ftp.host.port}")
    private Integer ftpHostPort;
    @Value("${ftp.client.mode}")
    private Integer ftpClientMode;
    @Value("${ftp.file.type}")
    private Integer ftpFileType;
    @Value("${ftp.host.username}")
    private String ftpUsername;
    @Value("${ftp.host.password}")
    private String ftpPassword;
    @Value("${ftp.tasklet.retryIfNotFound}")
    private Boolean retryIfNotFound;
    @Value("${ftp.tasklet.download.attempts}")
    private Integer downloadAttempts;
    @Value("${ftp.tasklet.retry.interval}")
    private Integer retryInterval;
    @Value("${ftp.tasklet.file.name.pattern}")
    private String fileNamePattern;
    @Value("${ftp.host.remote.directory}")
    private String remoteDirectory;
    @Value("${ftp.client.local.directory}")
    private File localDirectory;
    @Value("${ftp.tasklet.sftp}")
    private Boolean sftp;
    @Value("${query.order.insert}")
    private String orderInsertQuery;
    @Value("${query.order.items.insert}")
    private String orderItemInsertQuery;

    @Autowired
    @Qualifier("jobRepository")
    private JobRepository jobRepository;
    @Bean
    public DataSource dataSource() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(driverClassName);
        dataSource.setUrl(driverUrl);
        dataSource.setUsername(driverUsername);
        dataSource.setPassword(driverPassword);
        return dataSource;
    }

    @Bean
    public SimpleJobLauncher jobLauncher() {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        return jobLauncher;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        return new DataSourceTransactionManager(dataSource());
    }

    @Bean
    @Scope("prototype")
    public OrderItem orderItem() {
        return new OrderItemBean();
    }

    @Bean
    @Scope("prototype")
    public Order order() {
        return new OrderBean();
    }

    @Bean
    //@Scope("step")
    public DelimitedLineTokenizer orderItemLineTokenizer() {
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer(delimiter);
        lineTokenizer.setNames(orderItemFieldNames);
        return lineTokenizer;
    }

    @Bean
    //@Scope("step")
    public BeanWrapperFieldSetMapper<OrderItem> orderItemFieldSetMapper() {
        BeanWrapperFieldSetMapper<OrderItem> orderItemFieldSetMapper = new BeanWrapperFieldSetMapper<OrderItem>();
        orderItemFieldSetMapper.setPrototypeBeanName("orderItem");
        return orderItemFieldSetMapper;
    }

    @Bean
    //@Scope("step")
    public DefaultLineMapper<OrderItem> orderItemLineMapper() {
        DefaultLineMapper<OrderItem> orderItemLineMapper = new DefaultLineMapper<OrderItem>();
        orderItemLineMapper.setLineTokenizer(orderItemLineTokenizer());
        orderItemLineMapper.setFieldSetMapper(orderItemFieldSetMapper());
        return orderItemLineMapper;
    }

    @Bean
    //@Scope("step")
    public Resource orderItemResource() {
        Resource orderItemResource = new FileSystemResource(orderItemFilePath);
        return orderItemResource;
    }

    @Bean
    //@Scope("step")
    public FlatFileItemReader<OrderItem> orderItemItemReader() {
        FlatFileItemReader<OrderItem> orderItemItemReader = new FlatFileItemReader<OrderItem>();
        orderItemItemReader.setLineMapper(orderItemLineMapper());
        orderItemItemReader.setResource(orderItemResource());
        return orderItemItemReader;
    }

    @Bean
    //@Scope("step")
    public DelimitedLineTokenizer orderLineTokenizer() {
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer(delimiter);
        lineTokenizer.setNames(orderFieldNames);
        return lineTokenizer;
    }

    @Bean
    //@Scope("step")
    public BeanWrapperFieldSetMapper<Order> orderFieldSetMapper() {
        BeanWrapperFieldSetMapper<Order> orderItemFieldSetMapper = new BeanWrapperFieldSetMapper<Order>();
        orderItemFieldSetMapper.setPrototypeBeanName("order");
        return orderItemFieldSetMapper;
    }

    @Bean
    //@Scope("step")
    public DefaultLineMapper<Order> orderLineMapper() {
        DefaultLineMapper<Order> orderItemLineMapper = new DefaultLineMapper<Order>();
        orderItemLineMapper.setLineTokenizer(orderLineTokenizer());
        orderItemLineMapper.setFieldSetMapper(orderFieldSetMapper());
        return orderItemLineMapper;
    }

    @Bean
    //@Scope("step")
    public Resource orderResource() {
        Resource orderItemResource = new FileSystemResource(orderFilePath);
        return orderItemResource;
    }

    @Bean
    //@Scope("step")
    public FlatFileItemReader<Order> orderItemReader() {
        FlatFileItemReader<Order> orderItemItemReader = new FlatFileItemReader<Order>();
        orderItemItemReader.setLineMapper(orderLineMapper());
        orderItemItemReader.setResource(orderResource());
        return orderItemItemReader;
    }

    @Bean
    @Scope("step")
    public Map<String, Order> orderCache() {
        Map<String, Order> orderCache = new HashMap<String, Order>();
        return orderCache;
    }

    @Bean
    public JdbcTemplate jdbcTemplate() {
        return new JdbcTemplate(dataSource());
    }

    @Bean
    //@Scope("step")
    public AggregatingFlatFileOrderItemReader aggregatingFlatFileOrderItemReader() {
        AggregatingFlatFileOrderItemReader aggregatingFlatFileOrderItemReader = new AggregatingFlatFileOrderItemReader();
        aggregatingFlatFileOrderItemReader.setJdbcTemplate(jdbcTemplate());
        aggregatingFlatFileOrderItemReader.setOrderCache(orderCache());
        aggregatingFlatFileOrderItemReader.setOrderItemFlatFileItemReader(orderItemItemReader());
        aggregatingFlatFileOrderItemReader.setOrderFlatFileItemReader(orderItemReader());
        aggregatingFlatFileOrderItemReader.setDeleteOrderQuery(deleteOrderQuery);
        aggregatingFlatFileOrderItemReader.setDeleteOrderItemQuery(deleteOrderItemQuery);
        return aggregatingFlatFileOrderItemReader;
    }

    @Bean
    @Scope("step")
    public SessionFactory ftpSessionFactory() {
        DefaultFtpSessionFactory ftpSessionFactory = new DefaultFtpSessionFactory();
        ftpSessionFactory.setHost(ftpHostName);
        ftpSessionFactory.setClientMode(ftpClientMode);
        ftpSessionFactory.setFileType(ftpFileType);
        ftpSessionFactory.setPort(ftpHostPort);
        ftpSessionFactory.setUsername(ftpUsername);
        ftpSessionFactory.setPassword(ftpPassword);
        return ftpSessionFactory;
    }

    @Bean
    @Scope(value = "step")
    public FtpGetRemoteFilesTasklet myFtpGetRemoteFilesTasklet() {
        FtpGetRemoteFilesTasklet ftpTasklet = new FtpGetRemoteFilesTasklet();
        ftpTasklet.setRetryIfNotFound(retryIfNotFound);
        ftpTasklet.setDownloadFileAttempts(downloadAttempts);
        ftpTasklet.setRetryIntervalMilliseconds(retryInterval);
        ftpTasklet.setFileNamePattern(fileNamePattern);
        ftpTasklet.setRemoteDirectory(remoteDirectory);
        ftpTasklet.setLocalDirectory(localDirectory);
        ftpTasklet.setSessionFactory(ftpSessionFactory());
        ftpTasklet.setSftp(sftp);
        return ftpTasklet;
    }

    @Bean
    @Scope(value = "step")
    public OrderItemWriter orderItemWriter() {
        OrderItemWriter orderItemWriter = new OrderItemWriter();
        orderItemWriter.setJdbcTemplate(jdbcTemplate());
        orderItemWriter.setOrderInsertQuery(orderInsertQuery);
        orderItemWriter.setOrderItemInsertQuery(orderItemInsertQuery);
        return orderItemWriter;
    }