Spring 3 / Quartz 2 integration problems - java

I'm having problems integrating Spring and Quartz. I need to dynamically add a CronTriggerFactoryBean to the SchedulerFactoryBean.
Spring XML configuration:
<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean" scope="prototype">
<property name="jobDetails">
<list>
<ref bean="plannedVacationServiceJob" />
</list>
</property>
<property name="triggers">
<list>
<ref bean="plannedVacationServiceCronTrigger" />
</list>
</property>
</bean>
<bean id="plannedVacationServiceJob" class="org.springframework.scheduling.quartz.JobDetailFactoryBean">
<property name="jobClass" value="com.my.service.package.PlannerJob" />
<property name="durability" value="true" />
</bean>
<bean id="plannedVacationServiceCronTrigger" class="org.springframework.scheduling.quartz.CronTriggerFactoryBean" scope="prototype">
<property name="jobDetail" ref="plannedVacationServiceJob" />
<property name="cronExpression" value="*/15 * * * * ?" />
</bean>
Java code:
@Service
public class Planner implements Planning {

    @Autowired
    @Qualifier("plannedVacationServiceCronTrigger")
    private CronTriggerFactoryBean plannedVacationServiceCronTrigger;

    @Autowired
    private SchedulerFactoryBean schedulerFactoryBean;

    @PostConstruct
    public void init() {
        schedulerFactoryBean.start();
    }

    //some code

    private void addTask(PlannerEntity entity) {
        try {
            String name = getIdentityName(entity);
            JobKey jobKey = new JobKey(name);
            String cronExpression = getCronExpression(entity);
            plannedVacationServiceCronTrigger.setCronExpression(cronExpression);
            JobDetail jobDetail = JobBuilder.newJob(PlannerJob.class).withIdentity(jobKey).build();
            plannedVacationServiceCronTrigger.setJobDetail(jobDetail);
            plannedVacationServiceCronTrigger.setName(name);
            plannedVacationServiceCronTrigger.afterPropertiesSet();
            schedulerFactoryBean.getScheduler().scheduleJob(jobDetail, plannedVacationServiceCronTrigger.getObject());
            triggers.put(entity.getId(), plannedVacationServiceCronTrigger);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
In this case we have only one trigger, the one added in the XML mapping.
I tried removing the triggers reference from the XML and creating the CronTriggerFactoryBean in code:
private void addTask(PlannerEntity entity) {
    try {
        String name = getIdentityName(entity);
        JobKey jobKey = new JobKey(name);
        String cronExpression = getCronExpression(entity);
        CronTriggerFactoryBean trigger = new CronTriggerFactoryBean();
        trigger.setCronExpression(cronExpression);
        JobDetail jobDetail = JobBuilder.newJob(PlannerJob.class).withIdentity(jobKey).build();
        trigger.setJobDetail(jobDetail);
        trigger.setName(name);
        trigger.afterPropertiesSet();
        schedulerFactoryBean.getScheduler().scheduleJob(jobDetail, trigger.getObject());
        triggers.put(entity.getId(), trigger);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
But in this case we don't get any triggers:
2014-04-02 14:28:55,144 DEBUG [org.quartz.core.QuartzSchedulerThread(org.springframework.scheduling.quartz.SchedulerFactoryBean#0_QuartzSchedulerThread)] - batch acquisition of 0 triggers
2014-04-02 14:29:18,981 DEBUG [org.quartz.core.QuartzSchedulerThread(org.springframework.scheduling.quartz.SchedulerFactoryBean#0_QuartzSchedulerThread)] - batch acquisition of 0 triggers
2014-04-02 14:29:46,361 DEBUG [org.quartz.core.QuartzSchedulerThread(org.springframework.scheduling.quartz.SchedulerFactoryBean#0_QuartzSchedulerThread)] - batch acquisition of 0 triggers
2014-04-02 14:30:09,439 DEBUG [org.quartz.core.QuartzSchedulerThread(org.springframework.scheduling.quartz.SchedulerFactoryBean#0_QuartzSchedulerThread)] - batch acquisition of 0 triggers

You can't really do it that way. FactoryBean has a specific contract and you're trying to act in the middle of it.
A factory bean is a bean whose purpose is to create a bean programmatically. You can't really inject a factory bean because the sole purpose of its existence is to register a bean. Besides, the start callback is related to the application context lifecycle. When you call it from your code, it is too late, the scheduler is already active.
The triggers are registered when the bean is initialized (in the afterPropertiesSet callback method). If you try to add additional triggers to the factory after that, they will not be taken into account, as the scheduler has already been created.
That being said, I don't really see the relationship between the XML and the Java code. In a regular scenario, your plannedVacationServiceCronTrigger should be registered (and there is no reason to make that a prototype bean).
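If the goal is to register triggers at runtime, one way around the FactoryBean contract (a minimal sketch, not taken from the answer above; addDynamicJob is just an illustrative name) is to build the trigger with Quartz's own builder API and hand it straight to the Scheduler exposed by SchedulerFactoryBean:
import org.quartz.CronScheduleBuilder;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;

// Sketch: schedule a job at runtime without any FactoryBean involvement.
// 'scheduler' would come from schedulerFactoryBean.getScheduler() or an injected Scheduler bean.
public void addDynamicJob(Scheduler scheduler, String name, String cronExpression) throws Exception {
    JobDetail jobDetail = JobBuilder.newJob(PlannerJob.class)
            .withIdentity(name)
            .build();

    Trigger trigger = TriggerBuilder.newTrigger()
            .withIdentity(name)
            .withSchedule(CronScheduleBuilder.cronSchedule(cronExpression))
            .build();

    // The scheduler is already running, so this registers the new job/trigger pair on the fly.
    scheduler.scheduleJob(jobDetail, trigger);
}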

Related

Apache Ignite mongo configuration using spring

I am introducing Apache Ignite into our application as a cache system as well as for computation. I have configured the Spring application using the following configuration class.
@Configuration
@EnableCaching
public class IgniteConfig {

    @Value("${ignite.config.path}")
    private String ignitePath;

    @Bean(name = "cacheManager")
    public SpringCacheManager cacheManager() {
        SpringCacheManager springCacheManager = new SpringCacheManager();
        springCacheManager.setConfigurationPath(ignitePath);
        return springCacheManager;
    }
}
Using it like
@Override
@Cacheable("cache1")
public List<Channel> getAllChannels() {
    List<Channel> list = new ArrayList<Channel>();
    Channel c1 = new Channel("1", 1);
    Channel c2 = new Channel("2", 2);
    Channel c3 = new Channel("3", 3);
    Channel c4 = new Channel("4", 4);
    list.add(c1);
    list.add(c2);
    list.add(c3);
    list.add(c4);
    return list;
}
Now I want to add the write-through and read-through features. I could not find any documentation on connecting Ignite to Mongo.
The idea is not to talk to the DB directly but to go through Ignite using the write-behind feature.
EDIT:
As suggested, I implemented:
public class ChannelCacheStore extends CacheStoreAdapter<Long, Channel> implements Serializable {

    @Override
    public Channel load(Long key) throws CacheLoaderException {
        return getChannelDao().findOne(Channel.mongoChannelCode, key);
    }

    @Override
    public void write(Cache.Entry<? extends Long, ? extends Channel> entry) throws CacheWriterException {
        getChannelDao().save(entry.getValue());
    }

    @Override
    public void delete(Object key) throws CacheWriterException {
        throw new UnsupportedOperationException("Delete not supported");
    }

    private ChannelDao getChannelDao() {
        return SpringContextUtil.getApplicationContext().getBean(ChannelDao.class);
    }
}
And added this CacheStore to the cache configuration like below:
<property name="cacheConfiguration">
<list>
<bean class="org.apache.ignite.configuration.CacheConfiguration">
<property name="name" value="channelCache"/>
<property name="cacheMode" value="PARTITIONED"/>
<property name="atomicityMode" value="ATOMIC"/>
<property name="backups" value="1"/>
<property name="readThrough" value="true"/>
<!-- Sets flag indicating whether write to database is enabled. -->
<property name="writeThrough" value="true"/>
<!-- Enable database batching. -->
<!-- Sets flag indicating whether write-behind is enabled. -->
<property name="writeBehindEnabled" value="true"/>
<property name="cacheStoreFactory">
<bean class="javax.cache.configuration.FactoryBuilder$SingletonFactory">
<constructor-arg>
<bean class="in.per.amt.ignite.cache.ChannelCacheStore"></bean>
</constructor-arg>
</bean>
</property>
</bean>
</list>
</property>
But now I am getting a ClassCastException:
java.lang.ClassCastException: org.springframework.cache.interceptor.SimpleKey cannot be cast to java.lang.Long
at in.per.amt.ignite.cache.ChannelCacheStore.load(ChannelCacheStore.java:19)
You can have any kind of backing database by implementing the CacheStore interface:
https://apacheignite.readme.io/docs/persistent-store
Have you tried setting your key generator?
@CacheConfig(cacheNames = "cache1", keyGenerator = "simpleKeyGenerator")
https://github.com/spring-projects/spring-boot/issues/3625
So in the lines of code below, from what you have shared,
@Cacheable("cache1")
public List<Channel> getAllChannels(){
the @Cacheable annotation is being used on a method which does not accept any parameters. Spring Cache uses the method parameters (if they are of basic data types) as the key for the cache entry (with the response object as the value); with no parameters it falls back to SimpleKey.EMPTY, which is what ends up being passed to your CacheStore<Long, Channel> and causes the ClassCastException in load(). I believe this also makes the caching ineffective.
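As a hedged illustration (not part of the original answer; getChannelById and channelDao are hypothetical names), one way to make the cache key line up with CacheStore<Long, Channel> is to cache per-channel lookups keyed by a Long parameter, or to supply an explicit SpEL key:
import org.springframework.cache.annotation.Cacheable;

// Sketch: key the cache entry by a Long so it matches CacheStore<Long, Channel>.
// On a cache miss this is what ends up being passed to ChannelCacheStore.load(Long).
@Cacheable(value = "channelCache", key = "#id")
public Channel getChannelById(Long id) {
    return channelDao.findOne(Channel.mongoChannelCode, id); // hypothetical DAO lookup
}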

Hibernate 4 persist() works but merge() does not

I am writing a webapp that runs on WildFly 8.2 and uses Hibernate 4. I've gotten it to successfully persist a new entity but cannot seem to make it commit changes to it afterwards. I'm assuming it's some transaction setting that I have wrong, but I'm not sure what it is. I have a service layer, in which I set the transaction settings, and that layer calls a DAO layer. Here's an example:
@Transactional
@EnableTransactionManagement
@TransactionManagement(value = TransactionManagementType.CONTAINER)
@TransactionAttribute(value = TransactionAttributeType.REQUIRED)
@Stateless
@Interceptors(SpringBeanAutowiringInterceptor.class)
@DeclareRoles("Security Admin")
public class SecurityServiceBean
{
    @Override
    @PermitAll
    public UserRegistration confirmRegistration(
            String confirmationCode) throws ApplicationException
    {
        try
        {
            QueryResults<UserRegistration> userRegistrations = this.userRegistrationDAO
                    .find(new UserRegistrationQuery(null, confirmationCode));
            if (userRegistrations.getTotalRecords() == 1)
            {
                UserRegistration userRegistration = userRegistrations.uniqueResult();
                if (userRegistration.getConfirmationDate() == null)
                {
                    userRegistration.setConfirmationDate(new Date());
                    userRegistration.setState(State.CONFIRMED);
                    userRegistration = this.userRegistrationDAO.saveOrUpdate(userRegistration);
                    ...
                }
                ...
            }
        }
    }
}
and the base DAO class has this
public abstract class AbstractJpaDataAccessObject implements DataAccessObject
{
    public <T extends UniqueObject<?>> T saveOrUpdate(
            T obj) throws DAOException
    {
        try
        {
            if (obj.getId() == null)
            {
                this.em.persist(obj);
            }
            else
            {
                T attached = this.em.merge(obj);
                this.em.flush();
                return attached;
            }
            return obj;
        }
        catch (PersistenceException e)
        {
            throw new DAOException("[saveOrUpdate] obj=" + obj.toString() + ",msg=" + e.getMessage(), e);
        }
    }
}
I know that I should not need the call to flush(), but I wanted to try it to see if that helped, which it did not.
So what am I missing?
UPDATE:
There is no exception being thrown. The object being returned from SecurityServiceBean.confirmRegistration() has all of the changes that were made in the method. However, querying the database shows that the changes were not committed. None of my fields in the entity are marked as updatable=false. Below is an example. I'll limit the fields to just the "status" field, which is one of the fields I expect to be updated.
@Entity
@Table(name = "user_registrations", schema = "campaigner")
public class UserRegistration extends AbstractUserRegistration
{
}
and the mapped superclass.
@MappedSuperclass
public class AbstractUserRegistration extends CampaignerHistoryObject<Long>
{
    public static enum State {
        UNCONFIRMED, CONFIRMED, APPROVED, DENIED,
    };

    private State state;

    @Column(name = "STATE")
    public State getState()
    {
        return state;
    }
}
And here are two XML files that I use. The first is beanRefContent.xml:
<beans>
<!-- <aop:aspectj-autoproxy proxy-target-class="true"/> -->
<!--
SpringBeanAutowiringInterceptor needs this file.
We need SpringBeanAutowiringInterceptor to autowire the EJBs.
-->
<bean
class="org.springframework.context.support.ClassPathXmlApplicationContext">
<constructor-arg value="classpath:campaignerContext.xml" />
</bean>
</beans>
The second is campaignerContext.xml.
<beans>
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="persistenceUnitName" value="campaigner" />
</bean>
<bean id="em" class="org.springframework.orm.jpa.support.SharedEntityManagerBean">
<property name="entityManagerFactory" ref="entityManagerFactory" />
</bean>
</beans>
UPDATE 2: Now I'm starting to think that my problems lie in my persistence.xml file, shown below:
<persistence>
<persistence-unit name="campaigner" transaction-type="JTA">
<provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
<jta-data-source>java:/jdbc/CampaignerDS</jta-data-source>
...
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.MySQLInnoDBDialect" />
<property name="hibernate.transaction.jta.platform" value="org.hibernate.service.jta.platform.internal.JBossAppServerJtaPlatform" />
<property name="jta.UserTransaction" value="java:jboss/UserTransaction" />
<property name="jta.TransactionManager" value="java:jboss/TransactionManager" />
</properties>
</persistence-unit>
</persistence>

How to configure Async and Sync Event publishers using spring

I am trying to implement an event framework using Spring events. I came to know that the default behavior of the Spring event framework is synchronous, but during Spring context initialization, if it finds a bean with id applicationEventMulticaster, it behaves asynchronously.
Now I want to have both sync and async event publishers in my application, because some of the events need to be published synchronously. I tried to configure a sync event multicaster using SyncTaskExecutor, but I can't find a way to inject it into my AsyncEventPublisher's applicationEventPublisher property.
My Spring configuration file is below:
<bean id="taskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor" destroy-method="shutdown">
<property name="corePoolSize" value="5" />
<property name="maxPoolSize" value="10" />
<property name="WaitForTasksToCompleteOnShutdown" value="true" />
</bean>
<bean id="syncTaskExecutor" class="org.springframework.core.task.SyncTaskExecutor" />
<bean id="customEventPublisher" class="x.spring.event.CustomEventPublisher" />
<bean id="customEventHandler" class="x.spring.event.CustomEventHandler" />
<bean id="eventSource" class="x.spring.event.EventSource" />
<bean id="responseHandler" class="x.spring.event.ResponseHandler" />
<bean id="syncEventSource" class="x.spring.event.syncEventSource" />
<bean id="applicationEventMulticaster" class="org.springframework.context.event.SimpleApplicationEventMulticaster">
<property name="taskExecutor" ref="taskExecutor" />
</bean>
<bean id="syncApplicationEventMulticaster" class="org.springframework.context.event.SimpleApplicationEventMulticaster">
<property name="taskExecutor" ref="syncTaskExecutor" />
</bean>
Can anyone help me out here?
I just had to work this out for myself. By default, events are sent asynchronously unless you implement a marker interface; in my case I called it SynchronousEvent (a sketch of the marker is shown after the configuration below). You'll need an 'executor' in your config too (I omitted mine as it's quite customised).
@EnableAsync
@SpringBootConfiguration
public class BigFishConfig {

    @Autowired AsyncTaskExecutor executor;

    @Bean
    public ApplicationEventMulticaster applicationEventMulticaster() {
        log.debug("creating multicaster");
        return new SimpleApplicationEventMulticaster() {
            @Override
            public void multicastEvent(final ApplicationEvent event, @Nullable ResolvableType eventType) {
                ResolvableType type = eventType != null ? eventType : ResolvableType.forInstance(event);
                if (event instanceof PayloadApplicationEvent
                        && ((PayloadApplicationEvent<?>) event).getPayload() instanceof SynchronousEvent)
                    getApplicationListeners(event, type).forEach(l -> invokeListener(l, event));
                else
                    getApplicationListeners(event, type).forEach(l -> executor.execute(() -> invokeListener(l, event)));
            }
        };
    }
...
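For completeness, the marker interface named in the answer (but not shown there) and a payload event using it might look like the sketch below; AuditRecordedEvent is a made-up example. An object published via ApplicationEventPublisher.publishEvent(...) arrives wrapped in a PayloadApplicationEvent, which is what the instanceof check in the configuration above inspects:
// Marker interface: events implementing it are dispatched synchronously by the customized multicaster.
public interface SynchronousEvent {
}

// Hypothetical event carrying a payload; listeners receive it in the publisher's thread.
public class AuditRecordedEvent implements SynchronousEvent {

    private final String message;

    public AuditRecordedEvent(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }
}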
No, you can't do that. Spring's initApplicationEventMulticaster initializes only one multicaster, and its bean name must be applicationEventMulticaster, so you can only choose one of the following executors:
- org.springframework.core.task.SyncTaskExecutor
- org.springframework.core.task.SimpleAsyncTaskExecutor
- your own executor, e.g. org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor
However, you can customize org.springframework.context.event.SimpleApplicationEventMulticaster to add your own logic and decide per event whether it is dispatched synchronously or asynchronously (a sketch of such a customization follows the initApplicationEventMulticaster snippet below).
/**
 * Initialize the ApplicationEventMulticaster.
 * Uses SimpleApplicationEventMulticaster if none defined in the context.
 * @see org.springframework.context.event.SimpleApplicationEventMulticaster
 */
protected void initApplicationEventMulticaster() {
    ConfigurableListableBeanFactory beanFactory = getBeanFactory();
    if (beanFactory.containsLocalBean(APPLICATION_EVENT_MULTICASTER_BEAN_NAME)) {
        this.applicationEventMulticaster =
                beanFactory.getBean(APPLICATION_EVENT_MULTICASTER_BEAN_NAME, ApplicationEventMulticaster.class);
        if (logger.isDebugEnabled()) {
            logger.debug("Using ApplicationEventMulticaster [" + this.applicationEventMulticaster + "]");
        }
    }
    else {
        this.applicationEventMulticaster = new SimpleApplicationEventMulticaster(beanFactory);
        beanFactory.registerSingleton(APPLICATION_EVENT_MULTICASTER_BEAN_NAME, this.applicationEventMulticaster);
        if (logger.isDebugEnabled()) {
            logger.debug("Unable to locate ApplicationEventMulticaster with name '" +
                    APPLICATION_EVENT_MULTICASTER_BEAN_NAME +
                    "': using default [" + this.applicationEventMulticaster + "]");
        }
    }
}
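A hedged sketch of that customization (assuming Spring 4.2+, as the first answer's code already does; SelectiveEventMulticaster and the SyncEvent marker interface are made-up names) could extend SimpleApplicationEventMulticaster and be registered under the bean name applicationEventMulticaster:
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.SimpleApplicationEventMulticaster;
import org.springframework.core.ResolvableType;
import org.springframework.core.task.TaskExecutor;

// Sketch: dispatch marker-interface events in the caller's thread, everything else through an executor.
public class SelectiveEventMulticaster extends SimpleApplicationEventMulticaster {

    private final TaskExecutor asyncExecutor;

    public SelectiveEventMulticaster(TaskExecutor asyncExecutor) {
        this.asyncExecutor = asyncExecutor;
    }

    @Override
    public void multicastEvent(ApplicationEvent event, ResolvableType eventType) {
        ResolvableType type = (eventType != null ? eventType : ResolvableType.forInstance(event));
        for (ApplicationListener<?> listener : getApplicationListeners(event, type)) {
            if (event instanceof SyncEvent) {
                invokeListener(listener, event); // blocking, runs in the publishing thread
            } else {
                asyncExecutor.execute(() -> invokeListener(listener, event)); // handed off to the pool
            }
        }
    }
}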
I am not good at editing on Stack Overflow, please forgive me.
SyncTaskExecutor
This one needs little explanation: it is synchronous. This executor runs tasks in sequence and blocks on every task.
public class SyncTaskExecutor implements TaskExecutor, Serializable {

    /**
     * Executes the given {@code task} synchronously, through direct
     * invocation of it's {@link Runnable#run() run()} method.
     * @throws IllegalArgumentException if the given {@code task} is {@code null}
     */
    @Override
    public void execute(Runnable task) {
        Assert.notNull(task, "Runnable must not be null");
        task.run();
    }
}
SimpleAsyncTaskExecutor
This class is quite large, so I am only showing a section of the code. If you provide a threadFactory, the thread is retrieved from that factory; otherwise a new thread is created.
protected void doExecute(Runnable task) {
    Thread thread = (this.threadFactory != null ? this.threadFactory.newThread(task) : createThread(task));
    thread.start();
}
ThreadPoolTaskExecutor
This class builds on the JDK 5 java.util.concurrent ThreadPoolExecutor, but Spring encapsulates the functionality, which is convenient because the JDK 6 and JDK 7 concurrent packages differ in some details. It takes threads from the pool and reuses them, executing every task asynchronously. If you want more detail, see the JDK source code.
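For reference, a programmatic equivalent of the ThreadPoolTaskExecutor bean from the question's XML might look like this (a sketch, not part of the original answer):
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

public class ExecutorDemo {
    public static void main(String[] args) {
        // Sketch: same pool settings as the XML bean, configured in plain Java.
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(5);
        executor.setMaxPoolSize(10);
        executor.setWaitForTasksToCompleteOnShutdown(true);
        executor.initialize(); // outside a Spring context, initialize() must be called manually
        executor.execute(() -> System.out.println("runs on a pooled thread"));
        executor.shutdown();
    }
}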
I tried the tutorial below:
https://www.keyup.eu/en/blog/101-synchronous-and-asynchronous-spring-events-in-one-application
It walks through building sync and async multicasters and creating a wrapper over both. Make sure the bean name of the wrapper class (DistributiveEventMulticaster) is applicationEventMulticaster.

How to assign task name in Spring TaskExecutor and check if it is still alive?

Let's consider that I have the following:
public class MyRunnable implements Runnable {
    public void run() {
        //do something expensive
    }
}

public class ThreadExecutor {

    private TaskExecutor taskExecutor;

    public ThreadExecutor(TaskExecutor taskExecutor) {
        this.taskExecutor = taskExecutor;
    }

    public void fireThread() {
        taskExecutor.execute(new MyRunnable());
    }
}
my xml is the following:
<bean id="taskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="5" />
<property name="maxPoolSize" value="10" />
<property name="queueCapacity" value="25" />
</bean>
<bean id="threadExecutor" class="com.vanilla.threads.controllers.ThreadExecutor">
<constructor-arg ref="taskExecutor" />
</bean>
in my Spring MVC controller I start the task:
@RequestMapping(value = "/startTask.html", method = RequestMethod.GET)
public ModelAndView indexView() {
    ModelAndView mv = new ModelAndView("index");
    threadExecutor.fireThread();
    return mv;
}
Now let's consider that I would like to create another request (@RequestMapping(value="/checkStatus.html")) which will tell whether the task started in my previous request has finished.
So my questions are simple:
1) Can I assign a name to the task in TaskExecutor and, if yes, how can I do it?
2) How can I check that the task with that specific name has been done?
1) No, but...
Instead of using taskExecutor.execute(), use taskExecutor.submit(), which returns a Future. Put the Future in the HttpSession, and when a checkStatus request comes in, pull the Future from the session and call isDone() on it. If you need to give it a name, then instead of putting the Future in the session directly, keep a Map<String, Future> in the session where the key is the name of your task.
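A minimal sketch of that idea, assuming the pool is exposed as an AsyncTaskExecutor (ThreadPoolTaskExecutor implements it, which is what provides submit()); TaskController, the "taskFutures" session attribute and the "expensiveTask" name are hypothetical:
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Future;
import javax.servlet.http.HttpSession;
import org.springframework.core.task.AsyncTaskExecutor;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.servlet.ModelAndView;

@Controller
public class TaskController {

    private final AsyncTaskExecutor taskExecutor;

    public TaskController(AsyncTaskExecutor taskExecutor) {
        this.taskExecutor = taskExecutor;
    }

    @RequestMapping(value = "/startTask.html", method = RequestMethod.GET)
    public ModelAndView indexView(HttpSession session) {
        Future<?> future = taskExecutor.submit(new MyRunnable());
        futures(session).put("expensiveTask", future); // "expensiveTask" is an arbitrary task name
        return new ModelAndView("index");
    }

    @RequestMapping(value = "/checkStatus.html", method = RequestMethod.GET)
    @ResponseBody
    public String checkStatus(HttpSession session) {
        Future<?> future = futures(session).get("expensiveTask");
        return (future != null && future.isDone()) ? "finished" : "still running";
    }

    @SuppressWarnings("unchecked")
    private Map<String, Future<?>> futures(HttpSession session) {
        Map<String, Future<?>> map = (Map<String, Future<?>>) session.getAttribute("taskFutures");
        if (map == null) {
            map = new ConcurrentHashMap<>();
            session.setAttribute("taskFutures", map);
        }
        return map;
    }
}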

Spring 3: disable SpEL evaluation of a bean property value?

We're in the process of updating our apps from Spring 2.5 to 3.0 and we've hit a problem with the new SpEL evaluation of bean properties.
We've been using an in-house templating syntax in one module which unfortunately uses the same "#{xyz}" markup as SpEL. We have a few beans which take strings containing these expressions as properties, but Spring assumes they are SpEL expressions and throws a SpelEvaluationException when it tries to instantiate the bean.
e.g.
<bean id="templatingEngine" class="com.foo.TemplatingEngine">
<property name="barTemplate" value="user=#{uid}&country=#{cty}"/>
</bean>
Is it possible to disable SpEL evaluation, ideally per-bean, but alternatively for the whole application context?
Alternatively is there a way to escape the values?
Thanks,
Stephen
You can completely disable SpEL evaluation by calling the bean factory's setBeanExpressionResolver method and passing in null. You can define a BeanFactoryPostProcessor to do this.
public class DisableSpel implements BeanFactoryPostProcessor {

    public void postProcessBeanFactory(
            ConfigurableListableBeanFactory beanFactory)
            throws BeansException
    {
        beanFactory.setBeanExpressionResolver(null);
    }
}
Then define this bean in the application context.
<bean class="com.example.spel.DisableSpel"/>
Well, what you could do is redefine the expression language delimiters.
I would say the way to do this is through a special bean that implements BeanFactoryPostProcessor (thanks to inspiration by Jim Huang):
public class ExpressionTokensRedefiner implements BeanFactoryPostProcessor {

    private BeanExpressionResolver beanExpressionResolver;

    public void setBeanExpressionResolver(
            final BeanExpressionResolver beanExpressionResolver) {
        this.beanExpressionResolver = beanExpressionResolver;
    }

    @Override
    public void postProcessBeanFactory(
            final ConfigurableListableBeanFactory beanFactory)
            throws BeansException {
        beanFactory.setBeanExpressionResolver(createResolver());
    }

    private String expressionPrefix = "${";
    private String expressionSuffix = "}";

    public void setExpressionPrefix(final String expressionPrefix) {
        this.expressionPrefix = expressionPrefix;
    }

    public void setExpressionSuffix(final String expressionSuffix) {
        this.expressionSuffix = expressionSuffix;
    }

    private BeanExpressionResolver createResolver() {
        if (beanExpressionResolver == null) {
            final StandardBeanExpressionResolver resolver =
                    new StandardBeanExpressionResolver();
            resolver.setExpressionPrefix(expressionPrefix);
            resolver.setExpressionSuffix(expressionSuffix);
            return resolver;
        } else {
            return beanExpressionResolver;
        }
    }
}
Define it as a bean like this:
<bean class="foo.bar.ExpressionTokensRedefiner">
<property name="expressionPrefix" value="[[" />
<property name="expressionSuffix" value="]]" />
</bean>
or like this:
<!-- this will use the default tokens ${ and } -->
<bean class="foo.bar.ExpressionTokensRedefiner" />
or use a custom resolver:
<bean class="foo.bar.ExpressionTokensRedefiner">
<property name="beanExpressionResolver">
<bean class="foo.bar.CustomExpressionResolver" />
</property>
</bean>
Now you can leave your definitions untouched and if you want to use SpEL, use the new delimiters.
EDIT: I have now tested it and it actually works.
<bean class="foo.bar.ExpressionTokensRedefiner">
<property name="expressionPrefix" value="[[" />
<property name="expressionSuffix" value="]]" />
</bean>
<bean class="foo.bar.FooFritz">
<property name="fizz" value="[[ systemProperties['user.home'] ]]"></property>
<property name="fozz" value="[[ systemProperties['java.io.tmpdir'] ]]"></property>
<!-- this is what it would normally choke on -->
<property name="fazz" value="#{ boom() }"></property>
</bean>
Test code:
final ConfigurableApplicationContext context =
new ClassPathXmlApplicationContext("classpath:foo/bar/ctx.xml");
context.refresh();
final FooFritz fooFritz = context.getBean(FooFritz.class);
System.out.println(fooFritz.getFizz());
System.out.println(fooFritz.getFozz());
System.out.println(fooFritz.getFazz());
Output:
/home/seanizer
/tmp
#{ boom() }
I am no expert, but this might be of help:
https://issues.apache.org/jira/browse/CAMEL-2599
