Is there a way to access SlingRepository in a POJO that is not managed by OSGi?
For example, we might have a POJO named Site:
public class Site {
private String domainName;
public String getDomainName() { return domainName; }
public void setDomainName(String domainName) { this.domainName = domainName; }
public void validate() throws Exception {
SlingRepository repo = ...;
// validation logic dependent on repo
}
}
That is used like this:
Site site = new Site();
site.validate();
Update (re. Tomek's answer)
The reason I cannot use @Reference or the current request's ResourceResolver is that I am trying to implement a JSR-303 (a.k.a. Bean Validation) validator.
In our Sling app, we have a bunch of servlets that receive JSON payloads from the browser. We then convert them to POJOs:
Person p = new Gson().fromJson(json, Person.class);
validate(p); // Validate p using Bean Validation
Person is a simple POJO annotated with JSR-303 annotations:
public class Person {

    @NotNull
    private String name;

    @UniqueProperty(
        name = "email",
        path = "/content/myapp",
        nodeType = "cq:PageContent"
    )
    private String email;
}
In short, my goal is to implement the @UniqueProperty validation. In the example above, if a node of type cq:PageContent under /content/myapp already has an email property with the same value as this model, the validation fails.
The validator class itself will look like this:
public class UniquePropertyValidator implements ConstraintValidator<UniqueProperty, String> {

    @Override
    public void initialize(UniqueProperty constraintAnnotation) {
        ...
    }

    @Override
    public boolean isValid(String object, ConstraintValidatorContext constraintContext) {
        // need to execute a JCR query here
    }
}
UniquePropertyValidator will be instantiated by the JSR-303 implementation (e.g. Hibernate Validator) as needed, and there is no way it can access the current request's resource resolver, which is why I was looking for a way to access the SlingRepository.
First of all, if you use Sling then using ResourceResolver is generally preferable to SlingRepository. It gives you a useful Resource abstraction layer, and you can still get the underlying JCR Session using the adaptTo() method.
But back to your question: a POJO always lives in a context; there is some entrypoint that runs the whole thing. In Sling there are a few such places: a JSP, a servlet, or an OSGi component. In each of these entrypoints there is at least one way to get access to the repository:
JSP
use resourceResolver binding
use getService(ResourceResolverFactory.class) to create an administrative resolver,
Servlet or filter
use request.getResourceResolver() to get the request session,
or see the next point.
Any OSGi service (including servlet or filter)
use @Reference ResourceResolverFactory to get the factory and create an administrative resolver.
After that you can pass the resource resolver to your POJO constructor (see the sketch after this list). I think it's a better option than using hacks with FrameworkUtil, for a few reasons:
if you pass the ResourceResolver to the POJO constructor, it's clear that this particular class operates on the repository,
you don't need to worry about closing sessions (as POJO may not have a defined lifecycle),
if you create your POJO in servlet or component (cases 1 and 2), it's better to use the request's session to avoid working on the administrative one.
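To illustrate the last point, here is a minimal sketch (my own illustration, not part of the original answer) of a Sling servlet that reuses the request's resolver and hands it to the POJO. It assumes Site is given a hypothetical constructor that accepts a ResourceResolver:
@SlingServlet(paths = "/bin/myapp/validate-site", methods = "POST")
public class SiteValidationServlet extends SlingAllMethodsServlet {

    @Override
    protected void doPost(SlingHttpServletRequest request, SlingHttpServletResponse response)
            throws IOException {
        // reuse the request's session rather than opening an administrative one
        ResourceResolver resolver = request.getResourceResolver();
        Site site = new Site(resolver); // hypothetical constructor taking a ResourceResolver
        try {
            site.validate();
        } catch (Exception e) {
            response.sendError(HttpServletResponse.SC_BAD_REQUEST, e.getMessage());
        }
    }
}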
In general you should not look up external resources from inside a POJO. Instead, add a constructor or setter and inject the SlingRepository at the place where you call new. This has the advantage that your POJO is independent of OSGi and can be used in different environments. Unit testing is also easier with this approach.
Site site = new Site(slingRepository);
Of course this just moves the problem to the class that creates the instance. I guess at some point you start at an activator where you have access to the BundleContext and can lookup the service.
In the rare cases where you really want to lookup a service directly use
FrameworkUtil.getBundle(this.getClass()).getBundleContext();
From there you can lookup the service.
As Christian Schneider wrote, you can use this:
BundleContext bundleContext = FrameworkUtil.getBundle(this.getClass()).getBundleContext();
ServiceReference<SlingRepository> serviceReference = bundleContext.getServiceReference(SlingRepository.class);
SlingRepository slingRepository = null;
if (serviceReference != null) {
    slingRepository = bundleContext.getService(serviceReference);
}
// remember to release the service with bundleContext.ungetService(serviceReference) when you are done
I just found a way to tap into the lifecycle of the ConstraintValidator.
See:
http://beanvalidation.org/1.0/spec/#constraintsdefinitionimplementation-validationimplementation
Chapter 2.5 on the ConstraintValidatorFactory might help you. If you register a ConstraintValidatorFactory in, for example, your Activator, you can supply it with the SlingRepository. The factory can then forward the SlingRepository to the validators it creates, so you keep the OSGi logic out of the validator itself.
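For illustration, a rough sketch of such a factory (my own, assuming UniquePropertyValidator is given a hypothetical setRepository method) could look like this:
public class SlingAwareConstraintValidatorFactory implements ConstraintValidatorFactory {

    private final SlingRepository repository;

    public SlingAwareConstraintValidatorFactory(SlingRepository repository) {
        this.repository = repository;
    }

    @Override
    public <T extends ConstraintValidator<?, ?>> T getInstance(Class<T> key) {
        try {
            T validator = key.newInstance();
            // hand the repository to the validators that need it
            if (validator instanceof UniquePropertyValidator) {
                ((UniquePropertyValidator) validator).setRepository(repository);
            }
            return validator;
        } catch (InstantiationException | IllegalAccessException e) {
            throw new ValidationException("Could not create validator " + key, e);
        }
    }
}

// registering it, e.g. in your Activator, when bootstrapping the ValidatorFactory:
Validator validator = Validation.byDefaultProvider()
        .configure()
        .constraintValidatorFactory(new SlingAwareConstraintValidatorFactory(slingRepository))
        .buildValidatorFactory()
        .getValidator();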
Related
Every request that my Java application receives, passes through 4 layers:
Handler --> Validator --> Processor --> DAO
Handler is the API Resource. (Handler.java)
Validator validates the input. (Validator.java)
Processor performs some business logic. (Processor.java)
DAO is the DB communication layer. (DAO.java)
The input request has a field called the request_type. Based on this request_type, I want to create different objects for all the layer classes, i.e:
request_type_A should pass through Handler1, Validator1, Processor1, DAO1 (instances)
request_type_B should pass through Handler2, Validator2, Processor2, DAO2 (instances)
request_type_C should pass through Handler3, Validator3, Processor3, DAO3 (instances).. and so on
To clarify, the requirement is to create a different chain of objects for each request type, so that two requests with different request_type values go through entirely different object chain instances. Basically, I want to shard my application's objects based on the request_type.
I am using Spring Boot. Is there a way that Spring's ApplicationContext can provide different object chains for different request types, or should I manage these instances on my own?
Is there a way I can create a library which would give me a new object instance for every layer, based on the request_type, using Spring's ApplicationContext?
Or should I create multiple ApplicationContexts?
Based on the comments and the question, I understand that you will be receiving two or three request_type values.
The main idea I have used here is constructor injection of the chained objects, with different configuration beans selected based on the request type.
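A rough, simplified sketch of that idea (my own illustration, not the actual code from the repository linked below), using hypothetical Handler/Validator/Processor/Dao constructors:
@Configuration
public class ChainConfig {

    @Bean("requestTypeAChain")
    public Handler requestTypeAChain() {
        // one dedicated instance of every layer for request_type_A
        return new Handler(new Validator(new Processor(new Dao())));
    }

    @Bean("requestTypeBChain")
    public Handler requestTypeBChain() {
        // a completely separate set of instances for request_type_B
        return new Handler(new Validator(new Processor(new Dao())));
    }
}
The caller can then inject both chains with @Qualifier (or look them up by bean name) and pick one based on the incoming request_type.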
Feel free to check out this simple demonstration code on GitHub where I have proposed my idea: https://github.com/patilashish312/SpringObjectChaining
Based on this code, I can confirm that:
the application does not create a new chain of objects per request, but reuses the chain when it receives requests of the same type,
the objects assigned to one request type are not used by any other request type.
The console output below demonstrates this:
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeA)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler#31182e0a
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator#484e3fe7
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor#70f9b9c7
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao#2a8175d9
inside dao, doing DAO processing
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeA)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler#31182e0a
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator#484e3fe7
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor#70f9b9c7
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao#2a8175d9
inside dao, doing DAO processing
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeB)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler#55ea9008
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator#5b2d74c5
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor#5f12fb78
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao#1a107efe
inside dao, doing DAO processing
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeB)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler#55ea9008
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator#5b2d74c5
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor#5f12fb78
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao#1a107efe
inside dao, doing DAO processing
I had a similar requirement in my solution. What I built was a general-purpose command handler, and I used a decorator pattern of annotations on each command to specify which handlers, validators, processors, and DAOs to use.
In my implementation, I have API handlers which convert requests to specific commands. Each command class was a subclass of an abstract command class with a generic type parameter.
API -> all API variables are copied into a wrapper data model. (This could encapsulate the entrypoint of your handler concept or request_type concept)
Command extends AbstractCommand<T>, where T is the wrapper data model.
Then I would have an annotation for each of your concepts: Handler, Validator, Processor, Dao.
The general-purpose command handler would have a method that "process"es commands by reading their annotations and then lining up the annotation helper that corresponds to each annotation. It could use the application context to load the bean of the class referenced in the annotation value. By providing a sequencing property for each of the annotation helpers, you could loop over the sorted helpers to perform the actions in the right order (a rough sketch of this follows below).
In my implementation this was further augmented by whether or not the command included asynchronous behavior, so that all the synchronous behavior would occur first, and the asynchronous behavior would be wrapped in a background thread.
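A rough sketch of this idea (my own illustration, with hypothetical annotation and interface names), reduced to two phases for brevity:
// hypothetical marker annotations placed on each command class
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface ValidateWith { Class<? extends CommandStep> value(); }

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface ProcessWith { Class<? extends CommandStep> value(); }

public interface CommandStep {
    void execute(Object command);
}

// the general-purpose handler: reads the annotations off the command's class and
// resolves the referenced step beans from the ApplicationContext, in order
@Component
public class CommandDispatcher {

    private final ApplicationContext context;

    public CommandDispatcher(ApplicationContext context) {
        this.context = context;
    }

    public void dispatch(Object command) {
        ValidateWith v = command.getClass().getAnnotation(ValidateWith.class);
        if (v != null) {
            context.getBean(v.value()).execute(command);
        }
        ProcessWith p = command.getClass().getAnnotation(ProcessWith.class);
        if (p != null) {
            context.getBean(p.value()).execute(command);
        }
    }
}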
The beans that are injected into the REST controller don't vary with the HTTP request content. What you can do is expose your request_type as a path variable and create the desired chains in separate HTTP mappings, like so:
@PostMapping(value = "/request_type_A")
public Object handle1(args...){
// Validator1 --> Processor1 --> DAO1
}
@PostMapping(value = "/request_type_B")
public Object handle2(args...){
// Validator2 --> Processor2 --> DAO2
}
If this is not practical for whatever reason and you must specify the type dynamically in the @RequestBody, @PathVariable or @RequestParam, then I would suggest implementing a resolver bean similar to this:
@Component
public class Resolver {
private final RequestTypeAValidator requestTypeAValidator;
private final RequestTypeBValidator requestTypeBValidator;
...
public IValidator getValidator(String requestType) {
switch (requestType) {
case "request_type_A":
return requestTypeAValidator;
case "request_type_B":
return requestTypeBValidator;
default:
throw new IllegalArgumentException("cannot find validator");
}
}
}
The drawback of this approach is that it does not comply with the open-closed principle, in the sense that for any new request type you will need to edit the resolver. That can be fixed by using a HashMap in the resolver and letting the beans register themselves in that map on @PostConstruct:
@Component
public class Resolver {
private final Map<String, IValidator> validators = new HashMap<>();
public IValidator getValidator(String requestType) {
IValidator result = validators.get(requestType);
if (Objects.isNull(result)) {
throw new IllegalArgumentException("cannot find validator");
}
return result;
}
public void register(String type, IValidator validator) {
validators.put(type, validator);
}
}
@Component
public class ValidatorA implements IValidator {
private final Resolver resolver;
@PostConstruct
private void register() {
resolver.register("request_type_A", this);
}
...
}
However, in this approach there is a direct dependency from all implementations back to the Resolver.
Lastly, you could inject dynamically like so:
@Component
public class Resolver {
private final ApplicationContext applicationContext;
...
public IValidator getValidator(String requestType) {
switch (requestType) {
case "request_type_A":
try {
return applicationContext.getBean(ValidatorA.class);
} catch (NoSuchBeanDefinitionException e) {
// rethrow so execution does not fall through into the next case
throw new IllegalArgumentException("cannot find validator", e);
}
case "request_type_B":
try {
return applicationContext.getBean(ValidatorB.class);
} catch (NoSuchBeanDefinitionException e) {
// rethrow so execution does not fall through into the default case unintentionally
throw new IllegalArgumentException("cannot find validator", e);
}
default:
throw new IllegalArgumentException("cannot find validator");
}
}
}
Note: avoid taking the client-specified string as the class name or type directly in the applicationContext.getBean() call. That is not safe and may introduce a serious security vulnerability; use a switch or a dictionary to resolve the correct bean name or bean type.
If you want to inject multiple instances of the same classes, create a configuration class and declare the beans like this:
@Configuration
public class BeanConfiguration {
@Bean
public IValidator aValidator(){
return new ValidatorImpl(...);
}
@Bean
public IValidator bValidator(){
return new ValidatorImpl(...);
}
}
And then to inject it, you can either use the dynamic resolution by name as above, or use the @Qualifier annotation:
@Service
public class MyService {
private final ApplicationContext applicationContext;
private final IValidator bValidator;
public MyService(ApplicationContext applicationContext, @Qualifier("bValidator") IValidator bValidator) {
this.applicationContext = applicationContext;
this.bValidator = bValidator;
}
public void getDynamically(){
IValidator aValidator = (IValidator)applicationContext.getBean("aValidator");
}
}
I'm trying to add a GraphQLServlet to my existing webservice using graphql-spqr.
I have added the correct annotations and now only need to setup the GraphQL endpoint, which is where I am struggling.
All the tutorials and Google results I found refer to deprecated constructors of the class graphql.servlet.SimpleGraphQLServlet.
I can create a servlet object like this:
UserService userService = new UserService();
GraphQLSchema schema = new GraphQLSchemaGenerator()
.withOperationsFromSingleton(userService)
.generate();
graphql.servlet.GraphQLServlet servlet = new Builder(schema).build();
But I can't figure out if and how you can register a servlet instance.
And search engines either misinterpret my search or come up empty.
I'm grateful for any pointers.
Using Servlet 3+, you can dynamically register the servlet. This is what I did for the exact same reason (I didn't want to use deprecated constructors).
@WebListener
public class GraphQLServletRegistrationServletContextListener implements ServletContextListener {
private GraphQLSchema schema;
public GraphQLServletRegistrationServletContextListener() {}
@Inject
public GraphQLServletRegistrationServletContextListener(final GraphQLSchema schema) {
this.schema = schema;
}
@Override
public void contextInitialized(final ServletContextEvent event) {
final ServletRegistration.Dynamic dynamicGraphQLServlet
= event.getServletContext().addServlet("GraphQLEndpoint", SimpleGraphQLServlet.create(schema));
dynamicGraphQLServlet.addMapping("/graphql");
}
@Override
public void contextDestroyed(final ServletContextEvent event) {}
}
I'm using CDI to inject the GraphQLSchema, but you can just ignore that and construct it manually any way you like.
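If you are not using CDI, a rough sketch of building the schema in the no-arg constructor instead (reusing the generator setup from the question) might look like this:
public GraphQLServletRegistrationServletContextListener() {
    this.schema = new GraphQLSchemaGenerator()
            .withOperationsFromSingleton(new UserService())
            .generate();
}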
UPDATE: As of graphql-java-servlet 6.x, this answer seems to be out of date
Unless you're using Spring, it's actually quite difficult to register a servlet instance yourself (and there's no standard way to do this), as the Servlet spec never really intended it to be used this way (the container is expected to take care of instantiating the servlets). And if you are using Spring, creating a servlet is unnecessary anyway. For this reason, using the SimpleGraphQLServlet is often unintuitive and inconvenient.
One way you can go about it is to subclass the SimpleGraphQLServlet, and let the container instantiate it for you as intended:
@WebServlet(urlPatterns = "/graphql")
public class GraphQLEndpoint extends SimpleGraphQLServlet {
public GraphQLEndpoint() {
super(schema());
}
private static GraphQLSchema schema() {
UserService userService = new UserService();
GraphQLSchema schema = new GraphQLSchemaGenerator()
.withOperationsFromSingleton(userService)
.generate();
return schema;
}
}
You can also register GraphQLEndpoint the old-school way via web.xml.
You'll notice that you need to inline the entire setup, as the call to super has to be the very first statement in the constructor. This is what makes SimpleGraphQLServlet so inconvenient.
Another option is to create your own servlet instead. There really aren't that many things you need to do to get started: just create the schema in its init method (no awkward inline setup needed), store it in an instance field, and accept queries in doGet or doPost. There is of course the matter of handling errors, different URLs, etc., which is a bit more involved, but you don't need that right away for learning the basics.
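As an illustration only (not code from this answer's author), a bare-bones sketch of such a servlet, assuming a recent graphql-java version and a client that posts a JSON body of the form {"query": "..."}, with variables and error handling left out:
@WebServlet(urlPatterns = "/graphql")
public class MinimalGraphQLServlet extends HttpServlet {

    private GraphQL graphQL;

    @Override
    public void init() {
        // build the schema once, when the container initializes the servlet
        GraphQLSchema schema = new GraphQLSchemaGenerator()
                .withOperationsFromSingleton(new UserService())
                .generate();
        graphQL = GraphQL.newGraphQL(schema).build();
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        JsonObject body = new Gson().fromJson(req.getReader(), JsonObject.class);
        ExecutionResult result = graphQL.execute(body.get("query").getAsString());
        resp.setContentType("application/json");
        new Gson().toJson(result.toSpecification(), resp.getWriter());
    }
}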
You might also want to use a full-fledged framework, like Spring MVC or Boot. You can find an example of this in graphql-spqr samples.
I have a moderate sized Java EE 6 project that uses several EJBs, including one which sole purpose is managing database calls through JPA. My question is what is the best way to add a new class that does some random bit of functionality and then calls the database access EJB to persist the data from this class.
Does this new class have to be an EJB if it needs access to annotations and injections? Does it have to be an EJB if it has to be deployed with the rest of the project?
I was told that if you want to add a new logic class to the project it either has to be an EJB or you can remotely integrate it using JNDI to access EJB elements and create some kind of client interface. Right now my new class is just a POJO but it's not able to access the EJB functionality.
What should I do in general?
EDIT: Please note my question IS NOT about database access. That's just an example I'm using. My question is broader: I want to know how to access EJB methods from other classes I create. From one EJB to another you can simply inject the other EJB, since they're both container-managed. But say I create another class in the same package as the EJBs: how can I access those methods? Is it possible? What are the best practices here?
Right now I have a class that takes Twitter feed data from a URL, parses the JSON, and returns a string of the top 10 entries. I want to call my EJB that manages database access and pass that string to its corresponding method, but I cannot do that because my class is not also an EJB.
EJBs are generally used to implement services of any kind. They integrate really well with JPA so are often used for DB access, but that's not their only usage.
What EJBs are typically not suited for is modeling data. I.e. they should be the verbs in your application, not the nouns. The following is thus wrong:
@Stateless
@Entity
public class CreditCard { // silly, don't do this!

    @Id
    Long id; // + getters/setters

    Date expirationDate; // + getters/setters
}
The following is better, it's a service that when your application starts up fetches some quote data from somewhere:
@Singleton
@Startup
public class QuoteFetcher {

    private List<Quote> quotes; // + getter

    @PostConstruct
    public void fetchQuote() {
        quotes = SomeUrlBuilder.someUrl().getQuotes();
    }
}
The following is the obligatory DAO example:
@Stateless
public class JPAInvoiceDAO implements InvoiceDAO {

    @PersistenceContext
    private EntityManager entityManager;

    public Invoice getById(Long invoiceId) {
        return entityManager.find(Invoice.class, invoiceId);
    }

    // More DAO methods
}
The following shows how declarative security is used, and how a bean looks up something that has been externally mapped into its private context (ENC):
@Stateless
public class TokenFetcher {

    @Resource
    private SessionContext sessionContext;

    @RolesAllowed("SYSTEM")
    public Token getToken() {
        return (Token) sessionContext.lookup("token");
    }
}
The second part of the question seems to be how to use these beans in your code. There are basically four methods:
Injection in managed beans
Bootstrapping via JNDI
Automatically called at startup
Automatically via a timer
Injection is the easiest way, but only managed beans are injection candidates (basically meaning the Java EE framework creates the bean, and you don't instantiate it yourself with new).
E.g. Injection:
@ManagedBean
public class InvoiceBacking {

    private Long invoiceId; // + getter/setter
    private Invoice invoice; // + getter

    @EJB
    private InvoiceDAO invoiceDAO;

    @PostConstruct
    public void init() {
        invoice = invoiceDAO.getById(invoiceId);
    }
}
(also see Communication in JSF 2.0#Processing GET request parameters)
Bootstrapping via JNDI:
public class SomeQuartzJob implements Job {
public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
InvoiceDAO invoiceDAO = (InvoiceDAO) new InitialContext().lookup("java:global/myApp/myEJB/InvoiceDAO");
List<Invoice> openInvoices = invoiceDAO.getAllByOpenStatus();
// verbose exception handling and closing JNDI context omitted for brevity
}
}
The @Singleton bean shown earlier was an example of how the Java EE framework calls your code itself at startup. For the automatic timer you would use the @Schedule annotation on a bean's method.
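For example, a minimal sketch of such a timer bean (hypothetical name and schedule, reusing the InvoiceDAO from above):
@Singleton
public class OpenInvoiceReporter {

    @EJB
    private InvoiceDAO invoiceDAO;

    // the container calls this automatically every night at 02:00
    @Schedule(hour = "2", minute = "0", persistent = false)
    public void reportOpenInvoices() {
        List<Invoice> openInvoices = invoiceDAO.getAllByOpenStatus();
        // do something with the open invoices
    }
}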
I have an application which is part JavaEE (the server side) part JavaSE (the client side). As I want that client to be well architectured, I use Weld in it to inject various components. Some of these components should be server-side #EJB.
What I plan to do is extend the Weld architecture to provide the "component" allowing Weld to perform a JNDI lookup and load instances of EJBs when the client references them. But how do I do that?
In other words, I want to have
on the client side
public class ClientCode {
public @Inject @EJB MyEJBInterface myEJB;
}
on the server-side
@Stateless
public class MyEJB implements MyEJBInterface {
}
With Weld "implicitely" performing the JNDI lookup when ClientCode objects are created. How can I do that ?
Basically, doing so requires writing a so-called portable CDI extension.
But as it is quite long and requires a few tweaks, let me explain it further.
Portable extension
As the Weld documentation explains, the first step is to create a class implementing the Extension marker interface, in which you write code that reacts to the interesting CDI events. In this case, the most interesting event is, to my mind, AfterBeanDiscovery, since it occurs after all "local" beans have been found by the CDI implementation.
So writing the extension is, more or less, writing a handler for that event:
public void loadJndiBeansFromServer(
#Observes AfterBeanDiscovery beanDiscovery, BeanManager beanManager)
throws NamingException, ClassNotFoundException, IOException {
// Due to my inability to navigate in server JNDI naming (a weird issue in Glassfish naming)
// This props maps interface class to JNDI name for its server-side
Properties interfacesToNames = extractInterfacesToNames();
// JNDI properties
Properties jndiProperties = new Properties();
Context context = new InitialContext();
for (Entry<?, ?> entry : interfacesToNames.entrySet()) {
String interfaceName = entry.getKey().toString();
Class<?> interfaceClass = Class.forName(interfaceName);
String jndiName = entry.getValue().toString();
Bean<?> jndiBean = createJndIBeanFor(beanManager, interfaceClass, jndiName, jndiProperties);
beanDiscovery.addBean(jndiBean);
}
}
Creating the bean is not a trivial operation: it requires transforming "basic" Java reflection objects into more advanced Weld ones (well, in my case):
private <Type> Bean<Type> createJndIBeanFor(BeanManager beanManager, Class<Type> interfaceClass,
String jndiName, Properties p) {
AnnotatedType<Type> annotatedType = beanManager
.createAnnotatedType(interfaceClass);
// Creating injection target in a classical way will fail, as interfaceClass is the interface of an EJB
JndiBean<Type> beanToAdd = new JndiBean<Type>(interfaceClass, jndiName, p);
return beanToAdd;
}
Finally, one has to write the JndiBean class. But before that, a small detour into the realm of annotations is required.
Defining the used annotation
At first, I used the @EJB one. A bad idea: Weld uses the values returned by the qualifier annotation's methods to build the bean's hashcode! So I created my very own @JndiClient annotation, which holds no methods nor constants, in order for it to be as simple as possible.
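A minimal sketch of such an annotation (assuming it is declared as a CDI qualifier) could be:
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.TYPE, ElementType.METHOD, ElementType.FIELD, ElementType.PARAMETER })
public @interface JndiClient {
    // deliberately no members, so nothing can interfere with the bean's hashcode
}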
Constructing a JNDI client bean
Two notions merge here.
On one side, the Bean interface seems (to me) to define what the bean is.
On the other side, the InjectionTarget defines, to a certain extent, the lifecycle of that very bean.
From the literature I was able to find, implementations of those two interfaces often share at least some of their state. So I've decided to implement them in a single class: the JndiBean!
In that bean, most of the methods are left empty (or return a default value), except for:
Bean#getTypes, which must return the EJB remote interface and all extended @Remote interfaces (as methods from these interfaces can be called through this interface),
Bean#getQualifiers, which returns a Set containing only one element: an AnnotationLiteral corresponding to the @JndiClient annotation,
Contextual#create (you forgot that Bean extends Contextual, didn't you?), which performs the lookup:
@Override
public T create(CreationalContext<T> arg0) {
// Some classloading confusion occurs here in my case, but I guess they're of no interest to you
try {
Hashtable contextProps = new Hashtable();
contextProps.putAll(jndiProperties);
Context context = new InitialContext(contextProps);
Object serverSide = context.lookup(jndiName);
return interfaceClass.cast(serverSide);
} catch (NamingException e) {
// An unchecked exception to go through Weld and break the world apart
throw new LookupFailed(e);
}
}
And that's all
Usage ?
Well, now, in my GlassFish Java client code, I can write things such as
private @Inject @JndiClient MyRemoteEJB instance;
And it works without any problems.
A future ?
Well, for now, user credentials are not managed, but I guess that would be entirely possible using the C of CDI: Contexts ... oh no! Not contexts: scopes!
Section 3.5 of the CDI spec should help out. You may want to use some of the properties on the EJB annotation as well. Also, (probably don't need to tell you this) make sure you have JNDI set up correctly on the client to reference the server, and pack any of the needed interfaces into your client jar.
I have a POJO class with a method annotated with @Transactional:
public class Pojo {
@Transactional
public void doInTransaction() {
...
}
}
Spring declarative transaction management is based on AOP but I don't have any experience with that. My question is:
Is it possible that when invoking (new Pojo()).doInTransaction() on its own, Spring will start a transaction?
Spring declarative transaction
management is based on AOP but I don't
have any experience with that.
I would recommend starting to work with it; you will gain experience with transaction advice and AOP. A good starting point is here.
Is it possible that when invoking the
(new Pojo).doInTransaction() alone,
Spring will start a Transaction.
No, you can't expect Spring to be aware of an object that you instantiated and invoked manually. However, it sounds like you want to avoid declarative transaction management and do programmatic transaction management instead. There is a way to do that with Spring, using the TransactionTemplate. Is that what you were looking for?
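For illustration, a rough sketch of what that might look like for the Pojo class (my own example, assuming a PlatformTransactionManager is handed in from wherever the object is created):
public class Pojo {

    private final TransactionTemplate transactionTemplate;

    public Pojo(PlatformTransactionManager transactionManager) {
        this.transactionTemplate = new TransactionTemplate(transactionManager);
    }

    public void doInTransaction() {
        transactionTemplate.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                // the work that must run inside a transaction goes here
            }
        });
    }
}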
It is somewhat possible, but in a cumbersome way: You must use the AutowireCapableBeanFactory mechanism.
Here is a transactional class as example
public interface FooBar{
void fooIze(Object foo);
}
public class FooBarImpl implements FooBar{
@Transactional
@Override
public void fooIze(final Object foo){
// do stuff here
}
}
And here is how we can use it:
public class FooService implements ApplicationContextAware{
private ApplicationContext applicationContext;
@Override
public void setApplicationContext(
final ApplicationContext applicationContext){
this.applicationContext = applicationContext;
}
public void serviceMethod(){
//declare variable as interface, initialize to implementation
FooBar fooBar = new FooBarImpl();
// try to use it, won't work, as it's not a proxy yet
Object target = new Object[0];
fooBar.fooIze(target); // no transaction
// now let spring create the proxy and re-assign the variable
// to the proxy:
fooBar = // this is no longer an instance of FooBarImpl!!!
(FooBar) applicationContext
.getAutowireCapableBeanFactory()
.applyBeanPostProcessorsAfterInitialization(fooBar,
"someBeanName");
fooBar.fooIze(fooBar); // this time it should work
}
}
This is not a best practice. For one thing, it makes your application highly aware of the Spring Framework and also, it violates the dependency injection principles. So use this only if there is no other way!
Yes, it is possible. Spring does not require the use of dynamic proxies for #Transactional to work. Instead, you can use "true AOP", as provided by AspectJ.
For the details, see http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/transaction.html#transaction-declarative-aspectj
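For completeness, a sketch of enabling that mode with Java configuration (assuming Spring 3.1+ and that load-time or compile-time weaving is set up; this is my illustration, not taken from the linked documentation):
@Configuration
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
@EnableLoadTimeWeaving
public class TransactionConfig {
    // PlatformTransactionManager bean definition omitted
}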
The way Spring handles transactions through the annotation is indeed AOP, as you've said.
The AOP part is implemented using dynamic proxies (see the docs).
So you'll need to retrieve an instance of your class (Pojo here) through the Spring container: to make it work, Spring returns you a dynamic proxy over your Pojo that automatically surrounds any annotated method with the transaction management code.
If you simply do a
Pojo p = new Pojo();
p.doInTransaction();
Spring doesn't have any role to play here and your method call won't be inside a transaction.
so what you need to do is something like this
ApplicationContext springContext = ...;
Pojo p = (Pojo) springContext.getBean("your.pojo.id");
p.doInTransaction();
Note: this is just an example; you should prefer dependency injection over retrieving your bean manually from the context.
By doing so, and with a properly configured Spring context, Spring will have scanned your classes for the transactional annotation and automatically wrapped your beans in annotation-aware dynamic proxy instances. From your point of view that doesn't change anything: you'll still cast your object to your own classes, but if you print out the class name of the Pojo bean obtained from the Spring context, you'll get something like Proxy$... rather than your original class name.