Retry OptimisticLockException with db-util library does not work - java

I have used this example to implement OptimisticLockException handling:
How to retry JPA transactions after an OptimisticLockException
Dependency in pom.xml:
<dependency>
    <groupId>io.hypersistence</groupId>
    <artifactId>hypersistence-utils-hibernate-55</artifactId>
    <version>${hypersistence-utils.version}</version>
</dependency>
I have annotated the method which may receive OptimisticLockException:
@Retry(times = 10, on = OptimisticLockException.class)
public void modifySomething() {...}
However, when RollbackException/OptimisticLockException/StaleStateException occurs, I get a stack trace, but the method is not retried.
I use it with Guice 4.1.0. Should I bind it somewhere, or write a method interceptor myself? How do I add the AOP aspect?

You need to add the RetryAspect to your Spring configuration.
If you are using Guice, you have to create a similar aspect, because the one coming with Hypersistence Utils only works with Spring.
In Spring, the configuration is very easy; you can do it like this:
@Configuration
@EnableAspectJAutoProxy
@ComponentScan(
    basePackages = {
        "io.hypersistence.utils.spring.aop"
    }
)
public class RetryAspectConfiguration {
}
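With Guice, one option is to bind an AOP Alliance method interceptor to the @Retry annotation. Below is a minimal sketch, not a drop-in implementation: it assumes the library's @Retry annotation lives in io.hypersistence.utils.spring.annotation and exposes times() and on() attributes as in the original db-util version, and it only applies to objects that Guice itself constructs.
import java.lang.reflect.Method;

import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;

import com.google.inject.AbstractModule;
import com.google.inject.matcher.Matchers;

import io.hypersistence.utils.spring.annotation.Retry;

public class RetryModule extends AbstractModule {

    @Override
    protected void configure() {
        // Intercept every non-private, non-final method annotated with @Retry
        bindInterceptor(Matchers.any(), Matchers.annotatedWith(Retry.class), new RetryInterceptor());
    }

    static class RetryInterceptor implements MethodInterceptor {
        @Override
        public Object invoke(MethodInvocation invocation) throws Throwable {
            Method method = invocation.getMethod();
            Retry retry = method.getAnnotation(Retry.class);
            Throwable lastError = null;
            // one initial attempt plus retry.times() retries
            for (int attempt = 0; attempt <= retry.times(); attempt++) {
                try {
                    return invocation.proceed();
                } catch (Throwable t) {
                    if (!isRetryable(t, retry.on())) {
                        throw t;
                    }
                    lastError = t;
                }
            }
            throw lastError;
        }

        // Walk the cause chain, because the OptimisticLockException is usually
        // wrapped in a RollbackException by the JPA provider.
        private boolean isRetryable(Throwable t, Class<? extends Exception>[] retryOn) {
            while (t != null) {
                for (Class<? extends Exception> type : retryOn) {
                    if (type.isInstance(t)) {
                        return true;
                    }
                }
                t = t.getCause();
            }
            return false;
        }
    }
}
Install the module with Guice.createInjector(new RetryModule(), ...) and let Guice create the object that declares modifySomething(); Guice AOP does not apply to instances created with new.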

Related

Micronaut - What is the Spring Framework @Bean equivalent?

I am very new to Micronaut and I have a fair bit of experience developing Spring Boot applications. With this background, I stumbled when creating custom beans the way I used to with @Bean annotations in Spring applications.
In my case I have a library that provides an interface and its implementation class. I want to use the interface in my code and inject the implementation, but it fails with the error below:
Caused by: io.micronaut.context.exceptions.NoSuchBeanException: No bean of type [io.vpv.saml.metadata.service.MetaDataParser] exists for the given qualifier: @Named('MetaDataParserImpl'). Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
Here is my code
@Singleton
public class ParseMetadataImpl implements ParseMetadata {
    private Logger logger = LoggerFactory.getLogger(this.getClass());

    @Inject
    @Named("MetaDataParserImpl")
    private MetaDataParser metaDataParser;

    @Override
    public IDPMetaData getIDPMetaData(URL url) throws IOException {
        logger.info("Parsing {}", url);
        logger.info("metaDataParser {}", metaDataParser);
        return metaDataParser.parseIDPMetaData(url);
    }
}
I am sure there is something wrong that I am doing and I need to understand what to do. I have this working by adding the code below and removing the annotations around metaDataParser.
@PostConstruct
public void initialize() {
    // Want to avoid this stuff
    this.metaDataParser = new MetaDataParserImpl();
}
Using Spring Boot, it is possible to add a @Bean annotation to create custom beans that we can then inject with @Autowired everywhere in our application. Is there an equivalent in Micronaut that I am missing? I went through the guide at https://docs.micronaut.io/2.0.0.M3/guide/index.html and was not able to get this working.
Can someone suggest how I can use @Inject to inject custom beans?
Just in case you want to see it, here is the application on GitHub:
https://github.com/reflexdemon/saml-metadata-viewer
With help from Deadpool and a bit of reading, I got what I was looking for. The solution was creating a @Factory class.
See the documentation here: https://docs.micronaut.io/latest/guide/ioc.html#builtInScopes
The @Prototype annotation is a synonym for @Bean because the default scope is prototype.
Here is an example that matches the behavior of the Spring Framework, for anyone who is looking for such a thing.
import io.micronaut.context.annotation.Factory;
import io.vpv.saml.metadata.service.MetaDataParser;
import io.vpv.saml.metadata.service.MetaDataParserImpl;

import javax.inject.Singleton;

@Factory
public class BeanFactory {

    @Singleton
    public MetaDataParser getMetaDataParser() {
        return new MetaDataParserImpl();
    }
}
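With that factory in place, the field from the question can be injected by type. A small sketch (assuming the rest of the class stays as posted), with the @Named qualifier removed since the factory does not register a bean under that name:
@Singleton
public class ParseMetadataImpl implements ParseMetadata {

    // Provided by the @Factory above; no @Named qualifier is needed
    // because there is only one MetaDataParser bean.
    @Inject
    private MetaDataParser metaDataParser;

    @Override
    public IDPMetaData getIDPMetaData(URL url) throws IOException {
        return metaDataParser.parseIDPMetaData(url);
    }
}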

Correct way to Integrate JAX-RS with CDI?

I used to integrate service and DAO beans into Jersey REST resources by annotating them with @Path, following the Java EE tutorial:
In general, for JAX-RS to work with enterprise beans, you need to annotate the class of a bean with @Path to convert it to a root resource class. You can use the @Path annotation with stateless session beans and singleton POJO beans.
So my code used to be something like this:
@Path("/")
public class ServiceResource {
    @Inject
    private AccountService accountService;

    @GET
    @Path("/account/get")
    public Account getAccount(@QueryParam("id") String id) {
        return accountService.get(id);
    }
}

@javax.inject.Singleton
@Path("")
public class AccountService {
    public Account get(String id) {...}
}
Now, I started integrating a Quartz Job into my application, and I wanted to find a way to inject my AccountService inside a job like this
public class AccountJob implements Job {
    @Inject
    private AccountService accountService;

    @Override
    public void execute(JobExecutionContext jec) throws JobExecutionException {
        accountService.updateAllAccounts();
    }
}
I found this answer that suggests using DeltaSpike to do the job, so I added the following dependencies to my pom.xml, and without adding any more lines of code to any class, the injection of accountService into my Job works fine:
<dependency>
    <groupId>org.apache.deltaspike.modules</groupId>
    <artifactId>deltaspike-scheduler-module-api</artifactId>
    <version>1.7.2</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.apache.deltaspike.modules</groupId>
    <artifactId>deltaspike-scheduler-module-impl</artifactId>
    <version>1.7.2</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.deltaspike.cdictrl</groupId>
    <artifactId>deltaspike-cdictrl-api</artifactId>
    <version>1.7.2</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.apache.deltaspike.cdictrl</groupId>
    <artifactId>deltaspike-cdictrl-weld</artifactId>
    <version>1.7.2</version>
    <scope>runtime</scope>
</dependency>
However, I realized that when I remove @Path("") from AccountService, its instance is still injected fine into ServiceResource, so my questions are the following:
Why did adding the DeltaSpike dependencies make it possible to inject my beans without using @Path on them?
By searching more, I understood that DeltaSpike internally uses Weld to do the injection, and since I am already using GlassFish 4.0, I know that Weld is already there. So why is the injection not working by default in my Job class and in the ServiceResource class without adding @Path on my beans? Actually, why is adding @Path even suggested in the Java EE tutorial?
Are there any bad side effects that I don't see in my code? I think that I am mixing multiple DI methods here without really understanding how they work.
Update: After more searching, I realized that Jersey doesn't use Weld for dependency injection; instead it uses HK2, a different framework that also happens to be part of GlassFish. When I try to inject AccountService without using @Path, it shows the following exception:
org.glassfish.hk2.api.UnsatisfiedDependencyException: There was no object available for injection at SystemInjecteeImpl(requiredType=AccountService,parent=ServiceResource,qualifiers={}...
So this updates the questions to the following:
How do I make HK2 injection work without using @Path as mentioned in the Java EE tutorial?
If I manage to do DI with HK2, will it be safe to use DeltaSpike to do DI for the Quartz Job? Is it okay to mix two DI frameworks together to scan the classes and do the injection?
I put my source code on pastebin; the pom.xml is here and the Java is here.
You do not need to set the @Path annotation on your AccountService CDI bean. If CDI is enabled in your application (either with an empty beans.xml in CDI 1.0 or discovery-mode="all" in CDI > 1.0), you can @Inject any CDI bean into your JAX-RS resource.
So you just have to write the following class:
@Path("/")
public class ServiceResource {
    @Inject
    private AccountService accountService;

    @GET
    @Path("/account/get")
    public Account getAccount(@QueryParam("id") String id) {
        return accountService.get(id);
    }
}

@javax.inject.Singleton
public class AccountService {
    public Account get(String id) {...}
}
The article you linked in your post deals with mixing EJB and CDI annotations. For example, you can mix the @Stateless and @Path annotations. That is interesting because you can:
Benefit from EJB transactions in your REST resource (even though you can now use the @Transactional interceptor binding)
Set a pool of resources
etc.
Note that all of this works without the help of the DeltaSpike dependency.
For your second question: as Quartz manages its own threads, its classes are not handled by CDI, so you cannot inject beans into Quartz classes. The aim of the DeltaSpike module is to allow injecting CDI beans into Quartz jobs. Internally, DeltaSpike controls the CDI contexts.
EDIT
For your last questions:
Your HK2 problem most likely comes from a missing dependency (in your application or on the server). As said in a previous comment, I managed to deploy your app on GlassFish 4 (build 89) with the source files you provided.
Regarding the integration of CDI with Quartz, I think the best approach is to implement your own JobFactory and instantiate your jobs using the BeanManager. Take a look at this link: https://devsoap.com/injecting-cdi-managed-beans-into-quarz-jobs/
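A rough sketch of such a JobFactory (assuming CDI 1.1+, where the CDI.current() entry point is available; the class name is illustrative):
import javax.enterprise.inject.spi.CDI;

import org.quartz.Job;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.spi.JobFactory;
import org.quartz.spi.TriggerFiredBundle;

public class CdiJobFactory implements JobFactory {

    @Override
    public Job newJob(TriggerFiredBundle bundle, Scheduler scheduler) throws SchedulerException {
        Class<? extends Job> jobClass = bundle.getJobDetail().getJobClass();
        // Let CDI instantiate the job so that its @Inject points are satisfied,
        // instead of Quartz calling the no-arg constructor itself.
        return CDI.current().select(jobClass).get();
    }
}
You would then register it with scheduler.setJobFactory(new CdiJobFactory()) before starting the scheduler.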
First of all, the injected resources (beans) and the Jersey endpoint class (the injection point) must be CDI-aware, i.e. detectable by CDI. We can either use bean-discovery-mode="all", so that CDI scans all classes, or bean-discovery-mode="annotated" and mark our class with a proper bean-defining annotation (see: Bean defining annotations). I prefer @Dependent or @RequestScoped.
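For example, a minimal WEB-INF/beans.xml enabling discovery of all classes (CDI 1.1 schema) could look like this:
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee
                           http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
       bean-discovery-mode="all"
       version="1.1">
</beans>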
Then we must use the Jersey extension
<dependency>
    <groupId>org.glassfish.jersey.ext.cdi</groupId>
    <artifactId>jersey-cdi1x-servlet</artifactId>
    <version>{version}</version>
    <scope>runtime</scope>
</dependency>
to connect CDI with the HK2 discovery mechanism.
Here is the official Oracle guideline.
The default beans.xml discovery-mode (in Java EE 7) is "annotated", which means only beans that have CDI bean-defining annotations are recognized and managed by CDI.
Your AccountJob class is not annotated. If you want CDI to be able to inject the service into it, then you need to annotate it with some scope annotation, e.g. @ApplicationScoped.
Your other option is to create a CDI producer for creating AccountJob beans. See:
http://docs.jboss.org/weld/reference/latest/en-US/html_single/#_producer_methods
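For the producer option, a rough sketch could look like the class below. Note that objects returned from a producer method are not field-injected by the container, so the dependency has to be handed over explicitly (the setAccountService setter is hypothetical and would have to be added to AccountJob):
import javax.enterprise.context.Dependent;
import javax.enterprise.inject.Produces;
import javax.inject.Inject;

public class AccountJobProducer {

    @Inject
    private AccountService accountService;

    // Called by CDI whenever an AccountJob bean is requested
    @Produces
    @Dependent
    public AccountJob createAccountJob() {
        AccountJob job = new AccountJob();
        job.setAccountService(accountService); // hypothetical setter
        return job;
    }
}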

How to call a Spring managed object from a POJO?

I am running a web app, which has one exposed class (available to other POJO classes) that has one autowired private member.
Spring managed class
public class EPSQueueSender {
    @Autowired
    private AmqpTemplate epsMessageTemplate;

    public void dosomething(...) {
        // Here epsMessageTemplate is null if the instance of EPSQueueSender
        // was created by another POJO
        epsMessageTemplate.convertAndSend(...);
    }
}
POJO class
public class Test {
    public void run() {
        EPSQueueSender sender = new EPSQueueSender();
        sender.dosomething(....); // gives a null exception on epsMessageTemplate
    }
}
The Spring code (running as a web app) and the POJO class code (in a different jar) are in the same JVM. The POJO does not get the initialized autowired object. However, it is initialized if I use it in the web app project.
Can someone please give me a suggestion on how I can overcome this problem?
The last thing I would like to try is to hit the web server with an HTTP request from the POJO.
Beans can be defined as POJOs or in XML; many examples might help. You already have @Autowired, but you did not create the @Bean method itself, which belongs in a class annotated with @Configuration.
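A minimal sketch of such a configuration class (the class and method names here are illustrative, not taken from the question):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class EpsConfig {

    // Declaring EPSQueueSender as a Spring bean lets the container
    // populate its @Autowired AmqpTemplate field.
    @Bean
    public EPSQueueSender epsQueueSender() {
        return new EPSQueueSender();
    }
}
The caller then has to obtain the bean from the ApplicationContext (or be a Spring bean itself and have it injected) rather than calling new EPSQueueSender().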
Your problem could be overcome using the @Configurable feature of Spring. For that you have to configure it in XML with code like the one below:
<context:annotation-config/>
<context:spring-configured/>
<context:load-time-weaver/>
or in Java configuration like below:
@Configuration
@EnableAspectJAutoProxy
@EnableSpringConfigured
@EnableLoadTimeWeaving
public class ConfigApplicationContext {
}
With this configuration you can benefit from the load-time weaving technique: through the built-in Spring aspect AnnotationBeanConfigurerAspect, you can inject Spring beans into a POJO that is annotated with @Configurable. You could have code like the one below:
@Configurable
public class Test {
    @Autowired
    private EPSQueueSender sender;

    public void method() {
        sender.dosomething(....); // sender is injected even though Test is created with 'new'
    }
}
Of course, since you are using a load-time weaving technique, you have to configure an agent that performs the instrumentation. The configuration is very simple: you have to add a line like the one below when starting the JVM or Tomcat:
java -javaagent:/path/to/spring-instrument.jar
Remember, of course, to add the Spring AOP, Spring Aspects, and Spring Instrument Maven dependencies:
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-aop</artifactId>
    <version>yourVersion</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-aspects</artifactId>
    <version>yourVersion</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-instrument</artifactId>
    <version>yourVersion</version>
</dependency>
I hope this helps.

Pointcut targeting a third party JAR class is not triggered

As a temporary fix for the bug https://github.com/spring-projects/spring-hateoas/issues/220, I would like to modify the return value of the org.springframework.hateoas.core.AnnotationMappingDiscoverer.getMapping methods so that I can resolve placeholders manually. Here is the aspect I tried:
<aop:aspectj-autoproxy />

@Component
@Aspect
public class AnnotationMappingDiscovererFix {

    @Around("execution(* org.springframework.hateoas.core.AnnotationMappingDiscoverer.getMapping(..))")
    public Object resolvePlaceholders(ProceedingJoinPoint joinPoint) throws Throwable {
        Object mapping = joinPoint.proceed();
        // resolve placeholders manually...
        return mapping;
    }
}
But this pointcut never gets triggered. Any idea why?
With proxy-based Spring AOP you can only target Spring beans/components. I am not a Spring user, so I do not know for sure, but I do not think that you can actually intercept Spring framework classes via its own AOP framework, due to "hen vs. egg" bootstrapping problems.
But if you use full AspectJ via load-time weaving (LTW), you should be able to achieve what you want, because the AspectJ weaving agent (aspectjweaver.jar) is loaded before the Spring classes and thus can modify them during the class-loading phase. The Spring documentation explains how to use AspectJ in connection with Spring.
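For reference, a minimal META-INF/aop.xml for the AspectJ load-time weaver might look like the sketch below (the aspect's package name is an assumption, and the JVM must be started with -javaagent:/path/to/aspectjweaver.jar):
<!DOCTYPE aspectj PUBLIC "-//AspectJ//DTD//EN" "https://www.eclipse.org/aspectj/dtd/aspectj.dtd">
<aspectj>
    <weaver options="-showWeaveInfo">
        <!-- limit weaving to the target class and the aspect's own package -->
        <include within="org.springframework.hateoas.core.*"/>
        <include within="com.example.aspects.*"/>
    </weaver>
    <aspects>
        <aspect name="com.example.aspects.AnnotationMappingDiscovererFix"/>
    </aspects>
</aspectj>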

Can I intercept Spring's @Autowired process to do validation checks?

Sometimes we make mistakes in our code and @Autowired a prototype-scoped bean into a singleton-scoped bean. This is of course wrong, because the singleton is then probably going to use that dependency as if it were also a singleton.
Is there any way of intercepting the autowiring/DI process to detect this and raise an error? This would be for detection at development time.
The best way to achieve this is through your unit tests. For example:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = MyAppConfig.class, loader = AnnotationConfigContextLoader.class)
public class MyServiceTest {

    @Autowired(required = true)
    MyService myService;

    @Test
    public void shouldAutowire() {}
}
The @ContextConfiguration annotation can be used with Java config as above, or it can refer to XML config files. By doing this, Spring will be used to inject all of your dependencies whenever you run your tests. By including required = true on your @Autowired fields, you ensure that Spring will throw an exception during that phase and your test will fail. The example above may not look fancy, but it will ensure that any configuration errors are caught. Of course, you can go further and have your tests make use of the injected beans. I find that rather handy for database access integration tests.
This is not intercepting the autowiring process itself, but you can of course test that your beans are behaving correctly.
You will need to import the spring-test dependency, e.g. for Maven:
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-test</artifactId>
    <version>${spring.version}</version>
    <scope>test</scope>
</dependency>
