I'm currently working on a Java EE 7 Batch API application, and I would like the lifecycle of one of my CDI beans to be tied to the current job.
Actually I would like this bean to have a @JobScoped scope (but it doesn't exist in the API). I would also like this bean to be injectable in any of my job classes.
At first, I wanted to create my own @JobScoped scope, with a JobScopedContext, etc. But then it occurred to me that the Batch API has the JobContext bean, with a unique job execution id per job.
So I wonder if I could manage the lifecycle of my job-scoped bean with this JobContext.
For example, I would have my bean that I want to be job scoped:
@Alternative
public class JobScopedBean
{
    private String m_value;

    public String getValue()
    {
        return m_value;
    }

    public void setValue(String p_value)
    {
        m_value = p_value;
    }
}
Then I would have the producer of this bean, which returns the JobScopedBean associated with the current job (thanks to the JobContext, which is unique per job):
public class ProducerJobScopedBean
{
    @Inject
    private JobContext m_jobContext; // this is the JobContext of the Batch API

    @Inject
    private JobScopedManager m_manager;

    @Produces
    public JobScopedBean getObjectJobScoped() throws Exception
    {
        if (null == m_jobContext)
        {
            throw new Exception("Job Context not active");
        }
        return m_manager.get(m_jobContext.getExecutionId());
    }
}
And the manager which holds the map of my JobScopedBeans:
@ApplicationScoped
public class JobScopedManager
{
    private final ConcurrentMap<Long, JobScopedBean> mapObjets = new ConcurrentHashMap<Long, JobScopedBean>();

    public JobScopedBean get(final long jobId)
    {
        JobScopedBean returnObject = mapObjets.get(jobId);
        if (null == returnObject)
        {
            final JobScopedBean ajout = new JobScopedBean();
            returnObject = mapObjets.putIfAbsent(jobId, ajout);
            if (null == returnObject)
            {
                returnObject = ajout;
            }
        }
        return returnObject;
    }
}
Of course, I will manage the destruction of the JobScopedBean at the end of each job (through a JobListener and a CDI Event).
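For illustration, that cleanup could look roughly like the sketch below; the JobFinishedEvent class and the listener name are assumptions made for this example (imports omitted, as in the snippets above), and the listener would still have to be referenced in the job XML:

public class JobFinishedEvent
{
    private final long executionId;

    public JobFinishedEvent(long executionId)
    {
        this.executionId = executionId;
    }

    public long getExecutionId()
    {
        return executionId;
    }
}

public class CleanupJobListener extends AbstractJobListener
{
    @Inject
    private JobContext m_jobContext;

    @Inject
    private Event<JobFinishedEvent> m_jobFinishedEvent;

    @Override
    public void afterJob() throws Exception
    {
        // fired once the job ends, so observers can release per-job state
        m_jobFinishedEvent.fire(new JobFinishedEvent(m_jobContext.getExecutionId()));
    }
}

The JobScopedManager would then observe the event and evict the entry:

public void onJobFinished(@Observes JobFinishedEvent p_event)
{
    mapObjets.remove(p_event.getExecutionId());
}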
Can you tell me if I'm wrong with this solution?
It looks correct to me but maybe I'm missing something?
Maybe there is a better way to handle this?
Thanks.
So it boils down to creating @Dependent scoped beans that are initialized from the current job at creation time. That works fine for beans with a lifespan shorter than the job, so of the standard scopes only @Dependent fits (@RequestScoped/@SessionScoped/@ConversationScoped might be fine in principle but do not apply here).
It will cause problems for other scopes, especially @ApplicationScoped/@Singleton: if you inject the JobScopedBean into one of them, you might be (un)lucky enough to have an active job the first time you need it, but the bean will then stay attached to that initial job (@Dependent is a pseudo-scope, not a normal scope, so no client proxy is created to resolve the current contextual instance).
If you want something like that, create a custom scope.
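A minimal sketch of what such a custom scope could look like, assuming the current JobContext can be resolved programmatically via CDI.current() (javax.enterprise.inject.spi.CDI); imports, error handling, passivation and eviction at job end are left out:

@NormalScope
@Inherited
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD})
public @interface JobScoped {}

public class JobScopedContext implements Context
{
    // one store of contextual instances per job execution id
    private final ConcurrentMap<Long, ConcurrentMap<Contextual<?>, Object>> store =
            new ConcurrentHashMap<Long, ConcurrentMap<Contextual<?>, Object>>();

    @Override
    public Class<? extends Annotation> getScope()
    {
        return JobScoped.class;
    }

    @Override
    @SuppressWarnings("unchecked")
    public <T> T get(Contextual<T> contextual, CreationalContext<T> creationalContext)
    {
        ConcurrentMap<Contextual<?>, Object> jobStore = storeForCurrentJob();
        Object instance = jobStore.get(contextual);
        if (instance == null)
        {
            instance = contextual.create(creationalContext);
            Object raced = jobStore.putIfAbsent(contextual, instance);
            if (raced != null)
            {
                instance = raced;
            }
        }
        return (T) instance;
    }

    @Override
    @SuppressWarnings("unchecked")
    public <T> T get(Contextual<T> contextual)
    {
        return (T) storeForCurrentJob().get(contextual);
    }

    @Override
    public boolean isActive()
    {
        // a real implementation should check whether a job execution is currently running
        return true;
    }

    private ConcurrentMap<Contextual<?>, Object> storeForCurrentJob()
    {
        long executionId = CDI.current().select(JobContext.class).get().getExecutionId();
        store.putIfAbsent(executionId, new ConcurrentHashMap<Contextual<?>, Object>());
        return store.get(executionId);
    }
}

public class JobScopedExtension implements Extension
{
    void addScope(@Observes BeforeBeanDiscovery bbd)
    {
        bbd.addScope(JobScoped.class, true, false);
    }

    void registerContext(@Observes AfterBeanDiscovery abd)
    {
        abd.addContext(new JobScopedContext());
    }
}

The extension has to be registered in META-INF/services/javax.enterprise.inject.spi.Extension, and the per-job store should be evicted when the job finishes (e.g. from a JobListener, as in the question).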
I'm using Spring Boot with an embedded Jetty web server for a web application.
I want to be 100% sure that the repo class is thread safe.
The repo class:
@Repository
@Scope("prototype")
public class RegistrationGroupRepositoryImpl implements RegistrationGroupRepository {

    private RegistrationGroup rg = null;
    Integer sLastregistrationTypeID = 0;
    private UserAccountRegistration uar = null;
    private List<RegistrationGroup> registrationGroup = new ArrayList<>();

    private NamedParameterJdbcTemplate jdbcTemplate;

    @Autowired
    public RegistrationGroupRepositoryImpl(DataSource dataSource) {
        this.jdbcTemplate = new NamedParameterJdbcTemplate(dataSource);
    }

    public List<RegistrationGroup> getRegistrationGroups(Integer regId) {
        // Some logic here which is stored in the instance variables, and registrationGroup is returned from the method
        return this.registrationGroup;
    }
}
And the service class which invokes the getRegistrationGroups method on the repo:
@Service
public class RegistrationService {

    @Autowired
    private Provider<RegistrationGroupRepository> registrationGroupRepository;

    public List<RegistrationGroup> getRegistrationGroup() {
        return registrationGroupRepository.getRegistrationGroups(1);
    }
}
Can I have a race condition if two or more requests execute the getRegistrationGroups(1) method?
I guess I'm on the safe side because I'm using method injection (Provider) with a prototype bean, so every time I'm getting a new instance from the invocation?
First of all, making your Bean a prototype Bean doesn't ensure an instance is created for every method invocation (or every usage, whatever).
In your case you're okay on that point, thanks to the Provider usage.
I noticed, however, that you're calling getRegistrationGroups directly on the Provider:
return registrationGroupRepository.getRegistrationGroups(1);
How can this code compile? You should call get() on the Provider instance.
return registrationGroupRepository.get().getRegistrationGroups(1);
Answering your question, you should be good to go with this code. I don't like the fact that you're maintaining some sort of state inside RegistrationGroupRepositoryImpl, but that's your choice.
I always prefer having all my fields as final. If one of them requires me to remove the final modifier, there is something wrong with the design.
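For illustration, here is a hedged sketch of what the repository could look like without mutable instance state; with only the final JdbcTemplate as a field, a plain singleton would be thread safe and the prototype scope plus Provider would no longer be needed (the query logic is assumed, since it isn't shown in the question):

@Repository
public class RegistrationGroupRepositoryImpl implements RegistrationGroupRepository {

    private final NamedParameterJdbcTemplate jdbcTemplate;

    @Autowired
    public RegistrationGroupRepositoryImpl(DataSource dataSource) {
        this.jdbcTemplate = new NamedParameterJdbcTemplate(dataSource);
    }

    public List<RegistrationGroup> getRegistrationGroups(Integer regId) {
        // Build the result in local variables instead of fields;
        // locals are confined to the calling thread, so no shared state remains.
        List<RegistrationGroup> registrationGroups = new ArrayList<>();
        // ... query via jdbcTemplate and populate the local list ...
        return registrationGroups;
    }
}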
I am trying to define a custom DeltaSpike ConfigSource. The custom config source will have the highest priority and check the database for the config parameter.
I have a ConfigParameter entity, that simply has a key and a value.
@Entity
@Cacheable
public class ConfigParameter ... {
    private String key;
    private String value;
}
I have a @Dependent DAO that finds all config parameters.
What I am trying to do now, is define a custom ConfigSource, that is able to get the config parameter from the database. Therefore, I want to inject my DAO in the ConfigSource. So basically something like
@ApplicationScoped
public class DatabaseConfigSource implements ConfigSource {

    @Inject
    private ConfigParameterDao configParameterDao;
    ....
}
However, when registering the ConfigSource via META-INF/services/org.apache.deltaspike.core.spi.config.ConfigSource, the class is instantiated directly by the ServiceLoader, so CDI injection does not work.
Is there any way to get CDI working in this case?
Thanks in advance, if you need any further information, please let me know.
The main problem is that the ConfigSource gets instantiated very early on, when the BeanManager is not available yet. Even a JNDI lookup does not work at that point in time. Thus, I need to delay the injection/lookup.
What I did now is add a static boolean to my config source that I set manually. We have an InitializerService that makes sure that the system is set up properly. At the end of the initialization process, I call allowInitialization() in order to tell the config source that the bean is injectable now. The next time the ConfigSource is asked for a property, it will be able to inject the bean using BeanProvider.injectFields.
public class DatabaseConfigSource implements ConfigSource {

    private static boolean allowInit;

    @Inject
    private ConfigParameterProvider configParameterProvider;

    @Override
    public int getOrdinal() {
        return 500;
    }

    @Override
    public String getPropertyValue(String key) {
        initIfNecessary();
        if (configParameterProvider == null) {
            return null;
        }
        return configParameterProvider.getProperty(key);
    }

    public static void allowInitialization() {
        allowInit = true;
    }

    private void initIfNecessary() {
        if (allowInit) {
            BeanProvider.injectFields(this);
        }
    }
}
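For completeness, the InitializerService mentioned above could look roughly like this (a sketch only; how the application actually triggers its startup logic may differ):

@ApplicationScoped
public class InitializerService {

    public void onStartup(@Observes @Initialized(ApplicationScoped.class) Object event) {
        // ... make sure the system is set up properly ...

        // once everything the DAO needs is available, let the config source inject its beans
        DatabaseConfigSource.allowInitialization();
    }
}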
I have a request-scoped bean that holds all my config variables for type-safe access.
@RequestScoped
public class Configuration {

    @Inject
    @ConfigProperty(name = "myProperty")
    private String myProperty;

    @Inject
    @ConfigProperty(name = "myProperty2")
    private String myProperty2;
    ....
}
When injecting the Configuration class in a different bean, each ConfigProperty will be resolved. Since my custom DatabaseConfigSource has the highest ordinal (500), it will be used for property resolution first. If the property is not found, it will delegate the resolution to the next ConfigSource.
For each ConfigProperty, the getPropertyValue function of the DatabaseConfigSource is called. Since I do not want to retrieve the parameters from the database for each config property, I moved the config property resolution to a request-scoped bean.
@RequestScoped
public class ConfigParameterProvider {

    @Inject
    private ConfigParameterDao configParameterDao;

    private Map<String, String> configParameters = new HashMap<>();

    @PostConstruct
    public void init() {
        List<ConfigParameter> configParams = configParameterDao.findAll();
        configParameters = configParams.stream()
                .collect(toMap(ConfigParameter::getKey, ConfigParameter::getValue));
    }

    public String getProperty(String key) {
        return configParameters.get(key);
    }
}
I could of course change the request-scoped ConfigParameterProvider to @ApplicationScoped. However, we have a multi-tenant setup and the parameters need to be resolved per request.
As you can see, this is a bit hacky, because we need to explicitly tell the ConfigSource when it is allowed to be instantiated properly (i.e. have the bean injected).
I would prefer a standardized solution from DeltaSpike for using CDI in a ConfigSource. If you have any idea on how to properly realize this, please let me know.
Even though this post has been answered already I'd like to suggest another possible solution for this problem.
I managed to load properties from my DB service by creating a @Singleton @Startup EJB which extends org.apache.deltaspike.core.impl.config.BaseConfigSource, injects my DAO as a delegate, and registers itself with org.apache.deltaspike.core.api.config.ConfigResolver.
@Startup
@Singleton
public class DatabaseConfigSourceBean extends BaseConfigSource {

    private static final Logger logger = LoggerFactory.getLogger(DatabaseConfigSourceBean.class);

    private @Inject PropertyService delegateService;

    @PostConstruct
    public void onStartup() {
        ConfigResolver.addConfigSources(Collections.singletonList(this));
        logger.info("Registered the DatabaseConfigSourceBean in the ConfigSourceProvider ...");
    }

    @Override
    public Map<String, String> getProperties() {
        return delegateService.getProperties();
    }

    @Override
    public String getPropertyValue(String key) {
        return delegateService.getPropertyValue(key);
    }

    @Override
    public String getConfigName() {
        return DatabaseConfigSourceBean.class.getSimpleName();
    }

    @Override
    public boolean isScannable() {
        return true;
    }
}
I know that creating an EJB for this purpose adds quite a bit of overhead, but I think it's a cleaner solution than handling the problem with marker booleans and static accessors ...
DeltaSpike uses the Java SE ServiceLoader (SPI) mechanism for this, which is not CDI-injectable. One solution would be to use the BeanProvider to get hold of your database-backed bean and delegate operations to it.
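A rough sketch of that delegation, assuming a CDI bean named DatabaseConfigBean that actually reads the properties (the bean name and its getProperty/getProperties methods are made up for the example):

public class DelegatingDatabaseConfigSource implements ConfigSource {

    @Override
    public String getPropertyValue(String key) {
        DatabaseConfigBean delegate = lookupDelegate();
        return delegate == null ? null : delegate.getProperty(key);
    }

    @Override
    public Map<String, String> getProperties() {
        DatabaseConfigBean delegate = lookupDelegate();
        return delegate == null ? Collections.<String, String>emptyMap() : delegate.getProperties();
    }

    @Override
    public int getOrdinal() {
        return 500;
    }

    @Override
    public String getConfigName() {
        return "database-config";
    }

    @Override
    public boolean isScannable() {
        return false;
    }

    private DatabaseConfigBean lookupDelegate() {
        // the CDI container may not be bootstrapped yet when the SPI instantiates this class
        if (!BeanManagerProvider.isActive()) {
            return null;
        }
        // optional lookup: returns null instead of failing if the bean is not resolvable
        return BeanProvider.getContextualReference(DatabaseConfigBean.class, true);
    }
}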
In my Spring application, I have components that use Spring's caching mechanism. Each #Cacheable annotation specifies the cache that is to be used. I'd like to autodiscover all the caches that are needed at startup so that they can be automatically configured.
The simplest approach seemed to create a marker interface (ex: CacheUser) to be used by each caching component:
@Component
public class ComponentA implements CacheUser {

    @Cacheable("dictionaryCache")
    public String getDefinition(String word) {
        ...
    }
}
I would then have Spring autodiscover all the implementations of this interface and autowire them to a configuration list that can be used when configuring the cache manager(s). This works.
@Autowired
private Optional<List<CacheUser>> cacheUsers;
My plan was to take each discovered class and find all methods annotated with @Cacheable. From there I would access the annotation's properties and obtain the cache name. I'm using AnnotationUtils.findAnnotation() to get the annotation declaration.
That's where the plan falls apart. Spring actually wires proxies instead of the raw component, and the annotations aren't copied over to the proxies' methods. The only workaround I've found exploits the fact that the proxy implements Advised which provides access to the proxied class:
((Advised)proxy).getTargetSource().getTargetClass().getMethods()
From there I can get the original annotations, but this approach is clearly brittle.
So two questions, really:
Is there a better way to get to the annotations defined by the proxied class?
Can you suggest any other way to discover all uses of @Cacheable in my project? I'd love to do without a marker interface.
Thanks!
Spring has a lot of infrastructure interfaces which can help you tap into the lifecycle of the container and/or beans. For your purpose you want to use a BeanPostProcessor and the SmartInitializingSingleton.
The BeanPostProcessor will get a callback for all the beans constructed; you will only need to implement the postProcessAfterInitialization method. In that method you can detect the annotations and fill a list of caches.
Then, in the SmartInitializingSingleton's afterSingletonsInstantiated method, you use this list to bootstrap/init your caches.
Something like the following (it is untested but should give you an idea).
public class CacheInitializingProcessor implements BeanPostProcessor, SmartInitializingSingleton {

    private final Set<String> caches = new HashSet<String>();

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        // inspect the user class, not the proxy class, so the annotations are visible
        Class<?> targetClass = AopUtils.getTargetClass(bean);
        ReflectionUtils.doWithMethods(targetClass, new ReflectionUtils.MethodCallback() {
            @Override
            public void doWith(Method method) throws IllegalArgumentException, IllegalAccessException {
                Cacheable cacheable = AnnotationUtils.getAnnotation(method, Cacheable.class);
                if (cacheable != null) {
                    caches.addAll(Arrays.asList(cacheable.cacheNames()));
                }
            }
        });
        return bean;
    }

    @Override
    public void afterSingletonsInstantiated() {
        for (String cache : caches) {
            // init caches
        }
    }
}
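One wiring detail worth adding: BeanPostProcessors are instantiated very early, so with Java config they are usually declared via a static @Bean method, roughly like this (a sketch, using the class above):

@Configuration
public class CacheBootstrapConfig {

    // static so the post-processor is registered before the regular beans are created
    @Bean
    public static CacheInitializingProcessor cacheInitializingProcessor() {
        return new CacheInitializingProcessor();
    }
}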
I have a property file, and using Spring's property placeholder I set values on my Spring beans. Now, this property file may be modified at runtime. Is there a way to refresh the properties of the Spring beans with the newly modified property values? In particular, I have many singleton beans; how can I refresh them with the new values? Is there already a solution for this, or does it have to be custom coded? If it doesn't already exist, can someone please suggest the best approach to achieve this? Thanks!
PS: My application is a batch application. I use Spring based Quartz configuration to schedule the batches.
I'll leave this in for reference, but the updated answer is below the divider:
Well the ConfigurableApplicationContext interface contains a refresh() method, which should be what you want, but the question is: how to access that method. Whichever way you do it, you'll start with a bean that has a dependency of type ConfigurableApplicationContext:
private ConfigurableApplicationContext context;

@Autowired
public void setContext(ConfigurableApplicationContext ctx){
    this.context = ctx;
}
Now the two basic options I'd suggest would be to either
use the Task Execution Framework and let your bean watch the property resources regularly, refreshing the ApplicationContext when it finds changes (see the sketch just after these options), or
expose the bean to JMX, allowing you to manually trigger the refresh.
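A rough sketch of the first option; it assumes a context implementation that supports repeated refresh (e.g. ClassPathXmlApplicationContext), and, as discussed below, refreshing the whole context may not be workable in every setup:

public class PropertiesWatcher {

    private final ConfigurableApplicationContext context;
    private final Resource propertiesResource;
    private long lastModified = -1L;

    public PropertiesWatcher(ConfigurableApplicationContext context, Resource propertiesResource) {
        this.context = context;
        this.propertiesResource = propertiesResource;
    }

    // schedule this method with the task execution framework, e.g. every 30 seconds
    public void checkForChanges() throws IOException {
        long current = propertiesResource.lastModified();
        if (lastModified != -1L && current != lastModified) {
            // re-reads the bean definitions and property sources
            context.refresh();
        }
        lastModified = current;
    }
}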
Referring to comments: since it seems impossible to refresh the entire context, an alternative strategy would be to create a properties factory bean and inject that into all other beans.
public class PropertiesFactoryBean implements FactoryBean<Properties> {

    private Properties value = null;
    private long lastChange = -1L;
    private Resource propertiesResource;

    public void setPropertiesResource(Resource propertiesResource){
        this.propertiesResource = propertiesResource;
    }

    @Override
    public Properties getObject() throws Exception{
        synchronized(this){
            long resourceModification = propertiesResource.lastModified();
            if(resourceModification != lastChange){
                Properties newProps = new Properties();
                InputStream is = propertiesResource.getInputStream();
                try{
                    newProps.load(is);
                } catch(IOException e){
                    throw e;
                } finally{
                    IOUtils.closeQuietly(is);
                }
                value = newProps;
                lastChange = resourceModification;
            }
        }
        // you might want to return a defensive copy here
        return value;
    }

    @Override
    public Class<?> getObjectType(){
        return Properties.class;
    }

    @Override
    public boolean isSingleton(){
        return false;
    }
}
You could inject this properties bean into all your other beans, however, you would have to be careful to always use prototype scope. This is particularly tricky inside singleton beans, a solution can be found here.
If you don't want to inject lookup methods all over the place, you could also inject a PropertyProvider bean like this:
public class PropertiesProvider implements ApplicationContextAware {

    private String propertyBeanName;
    private ApplicationContext applicationContext;

    public void setPropertyBeanName(final String propertyBeanName){
        this.propertyBeanName = propertyBeanName;
    }

    @Override
    public void setApplicationContext(final ApplicationContext applicationContext) throws BeansException{
        this.applicationContext = applicationContext;
    }

    public String getProperty(final String propertyName){
        return ((Properties) applicationContext.getBean(propertyBeanName)).getProperty(propertyName);
    }
}
I want to reinject singleton-scoped dependencies into prototype Spring beans, after they have been deserialized.
Say I've got a Process bean, which depends on a Repository bean. The Repository bean is a scoped as a singleton, but the Process bean is prototype-scoped. Periodically I serialize the Process, and then later deserialize it.
class Process {
    private Repository repository;
    // getters, setters, etc.
}
I don't want to serialize and deserialize the Repository. Nor do I want to put "transient" on the member variable that holds a reference to it in Process, nor a reference to some kind of proxy, or anything other than a plain old member variable declared as a Repository.
What I think I want is for the Process to have its dependency filled with a serializable proxy that points (with a transient reference) to the Repository, and, upon deserialization, can find the Repository again. How could I customize Spring to do that?
I figure I could use a proxy to hold the dependency references, much like Spring's scoped proxies; I wish I could use that exact technique. But the proxy I've seen Spring generate isn't serializable, and the docs say that if I use it with a singleton bean, I'll get an exception.
I could use a custom scope, perhaps, on the singleton beans, that would always supply a proxy when asked for a custom-scoped bean. Is that a good idea? Other ideas?
I used this instead, without any proxy:
public class Process implements HttpSessionActivationListener {
    ...
    @Override
    public void sessionDidActivate(HttpSessionEvent e) {
        ServletContext sc = e.getSession().getServletContext();
        WebApplicationContext newContext = WebApplicationContextUtils
                .getRequiredWebApplicationContext(sc);
        newContext.getAutowireCapableBeanFactory().configureBean(this, beanName);
    }
}
The example is for a web environment when the application server serializes the session, but it should work for any ApplicationContext.
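Outside a web environment the same re-attachment could be done in readObject, assuming the ApplicationContext is reachable through some holder; the StaticContextHolder below is a hypothetical accessor, not a Spring class:

public class Process implements Serializable {

    @Autowired
    private transient Repository repository;

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        // re-attach singleton dependencies after deserialization
        ApplicationContext ctx = StaticContextHolder.getContext(); // hypothetical holder
        ctx.getAutowireCapableBeanFactory().autowireBean(this);
    }
}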
Spring provides a solution for this problem.
Take a look at the spring documentation http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/aop.html#aop-atconfigurable.
7.8.1 Using AspectJ to dependency inject domain objects with Spring
...
The support is intended to be used for objects created outside
of the control of any container. Domain objects often fall into
this category because they are often created programmatically
using the new operator, or by an ORM tool as a result of a database query.
The trick is to use load-time weaving. Just start the JVM with -javaagent:path/to/org.springframework.instrument-{version}.jar. This agent will recognize every object that is instantiated, and if it is annotated with @Configurable it will configure (inject @Autowired or @Resource dependencies into) that object.
Just change the Process class to
@Configurable
class Process {

    @Autowired
    private transient Repository repository;

    // getters, setters, etc.
}
Whenever you create a new instance
Process process = new Process();
Spring will automatically inject the dependencies.
This also works if the Process object is deserialized.
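Besides the -javaagent flag, the @Configurable support itself has to be switched on (with spring-aspects on the classpath). With Java config that is roughly the following sketch; the XML equivalents are <context:spring-configured/> and <context:load-time-weaver/>:

@Configuration
@EnableSpringConfigured   // enables the @Configurable aspect from spring-aspects
@EnableLoadTimeWeaving    // pairs with the -javaagent instrumentation jar
public class AppConfig {
}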
How about using aspects to add an injection step when you deserialize the object?
You would need AspectJ or similar for this. It would work very similarly to the @Configurable support in Spring.
E.g. add some advice around the private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException method.
This article may also help: http://java.sun.com/developer/technicalArticles/Programming/serialization/
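A very rough, untested sketch of what such advice might look like; it assumes AspectJ weaving (Spring's proxy-based AOP cannot intercept a private readObject) and a hypothetical StaticContextHolder for reaching the ApplicationContext:

@Aspect
public class ReinjectOnDeserializationAspect {

    // matches Java serialization's private readObject on the Process class; requires AspectJ weaving
    @After("execution(private void Process.readObject(java.io.ObjectInputStream))")
    public void reinject(JoinPoint joinPoint) {
        Object deserialized = joinPoint.getThis();
        // hypothetical holder giving access to the current ApplicationContext
        StaticContextHolder.getContext()
                .getAutowireCapableBeanFactory()
                .autowireBean(deserialized);
    }
}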
I think the idea of serializing a bean and then forcing a reinjection of dependencies is not the best architecture.
How about having some sort of ProcessWrapper bean instead, which could be a singleton? It would be injected with the Repository and would either manage the deserialization of the Process or have a setter for it. When a new Process is set in the wrapper, it would call setRepository() on the Process. The beans that use the Process could either be handed the new one by the wrapper or call the ProcessWrapper, which would delegate to the Process.
class ProcessWrapper {

    private Repository repository;
    private Process process;

    // getters, setters, etc.

    // named doWork() here because "do" is a reserved word in Java
    public void doWork() {
        process.doWork();
    }

    public void setProcess(Process process) {
        this.process = process;
        this.process.setRepository(repository);
    }
}
Answering my own question: how I've solved the problem so far is to create a base class which serializes and deserializes using a cheap little proxy. The proxy contains only the name of the bean.
You'll note that it uses a global to access the Spring context; a more elegant solution might store the context in a thread-local variable, something like that.
public abstract class CheaplySerializableBase
        implements Serializable, BeanNameAware {

    private String name;

    private static class SerializationProxy implements Serializable {

        private final String name;

        public SerializationProxy(CheaplySerializableBase target) {
            this.name = target.name;
        }

        Object readResolve() throws ObjectStreamException {
            return ContextLoader.globalEvilSpringContext.getBean(name);
        }
    }

    @Override
    public void setBeanName(String name) {
        this.name = name;
    }

    protected Object writeReplace() throws ObjectStreamException {
        if (name != null) {
            return new SerializationProxy(this);
        }
        return this;
    }
}
The resulting serialized object is 150 bytes or so (if I remember correctly).
The method applicationContext.getAutowireCapableBeanFactory().autowireBean(detachedBean) can be used to reconfigure a Spring-managed bean that was serialized and then de-serialized (whose @Autowired fields become null). See the example below; the serialization details are omitted for simplicity.
public class DefaultFooService implements FooService {

    @Autowired
    private ApplicationContext ctx;

    @Override
    public SerializableBean bar() {
        SerializableBean detachedBean = performAction();
        ctx.getAutowireCapableBeanFactory().autowireBean(detachedBean);
        return detachedBean;
    }

    private SerializableBean performAction() {
        SerializableBean outcome = ... // Obtains a deserialized instance, whose @Autowired fields are detached.
        return outcome;
    }
}
public class SerializableBean {

    @Autowired
    private transient BarService barService;

    private int value;

    public void doSomething() {
        barService.doBar(value);
    }
}