EntityManager initialization best practices - java

When using EntityManager, is it better to get one instance with PersistenceContext and pass it around in my program, or should I use dependency injection more than once?
In my application each client will communicate with a stateful session bean, and each bean needs to use EntityManager at some point.
I guess that bean methods are invoked concurrently (but I'm not even sure).
How do I guarantee that I use EntityManager in a thread-safe manner? With transactions? With a separate instance in each bean?
Sorry if this is confusing, I'm new to EJB/JPA and I couldn't find any material which addresses my questions.

Yes, you should inject the EntityManager instances (which will be different for each thread/client request) into your stateful session beans (which are not invoked concurrently, at least not from different clients).
There is no point in creating DAO classes, though. JPA already is a high-level persistence API that gives you RDBMS independence and portability between different JPA implementations. So, DAOs would only add clutter to the codebase.
For transactions, you don't really need to do anything. Business methods in session beans have a "Required" transaction attribute by default, so they will always run inside a client-specific transaction.
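To make this concrete, here is a minimal sketch of the arrangement described above (the bean, entity, and unit names are made up for illustration; this only runs inside an EJB container):

```java
import javax.ejb.Stateful;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateful
public class ClientSessionBean {

    // The container injects a proxy; the underlying persistence
    // context is bound to the current JTA transaction, so each
    // client request works in isolation.
    @PersistenceContext(unitName = "myUnit")
    private EntityManager em;

    // No transaction attribute declared: business methods default to
    // REQUIRED, so this runs in a container-managed transaction.
    public void saveOrder(Order order) {
        em.persist(order);
    }
}
```

Nothing here has to be synchronized by hand: thread safety comes from the container dispatching each client's calls serially and binding the persistence context to the transaction.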

Use @PersistenceContext to inject your EntityManager into your DAO class(es). These are the classes that will handle the database operations. Then inject your DAO class(es) into all other (service) classes. Your DAO should be a stateless bean (no need for a remote interface; a local one is enough).
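A sketch of that DAO arrangement (class, entity, and unit names are illustrative; this requires an EJB container):

```java
import java.util.List;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless
public class CustomerDao {

    // Container-managed, transaction-scoped EntityManager proxy.
    @PersistenceContext(unitName = "myUnit")
    private EntityManager em;

    public Customer find(Long id) {
        return em.find(Customer.class, id);
    }

    public List<Customer> findAll() {
        return em.createQuery("SELECT c FROM Customer c", Customer.class)
                 .getResultList();
    }
}
```

Service beans would then obtain this DAO with @EJB rather than touching the EntityManager directly.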

Related

Combining MDB, JPA and JTA

I'm developing a system to process messages and update the database accordingly, but I need to keep some degree of isolation between layers. I have in mind something like the following.
MyDao.java: a @Stateless bean that provides database access. MyDao accesses the database using JPA, with an EntityManager injected by @PersistenceContext.
MyMdb.java: an MDB that listens on a queue. MyMdb uses MyDao by injection with @EJB.
One single execution of MyMdb.onMessage() needs to perform several accesses to the database, both read and write.
On the one hand, this makes me think that a @Stateless bean is not the right choice for MyDao: the EntityManager instance in MyDao could be randomly accessed by different executions of MyMdb.onMessage(), leading threads to interfere with each other.
On the other hand, JPA documentation says that the injected EntityManager is just a proxy: the actual persistence context on which it operates is the one bound to the JTA transaction. This way everything should be ok because, even though EntityManagers are "shared", each MDB will have a different transaction ongoing and thus work in safe isolation.
What is the right scenario? Am I missing something?
Injecting entity managers into a stateless EJB the way you described is exactly what you should do.
This type of injection provides a 'Container-Managed Entity Manager' which is 'transaction scoped'.
So in the scenario that you describe:
the onMessage() MDB call will start a transaction;
the call to the stateless bean will happen in the same transaction context, creating an entity manager that lives until the transaction finishes, usually when the MDB method returns.
The injected entity manager for a given EJB instance does not survive a transaction and is not re-used across different transactions.
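The scenario above can be sketched like this (the queue name and DAO methods are hypothetical, made up for illustration; this only runs in a container):

```java
import javax.ejb.EJB;
import javax.ejb.MessageDriven;
import javax.jms.Message;
import javax.jms.MessageListener;

@MessageDriven(mappedName = "jms/myQueue")
public class MyMdb implements MessageListener {

    @EJB
    private MyDao dao;   // the @Stateless DAO with the injected EntityManager

    @Override
    public void onMessage(Message message) {
        // The container starts a transaction for this method (default
        // REQUIRED). Both DAO calls below join it, so the proxy inside
        // MyDao resolves to the same persistence context for the whole
        // message, isolated from other concurrent deliveries.
        // (loadState/updateState are hypothetical DAO methods.)
        Object state = dao.loadState(message);
        dao.updateState(state);
    }
}
```

Concurrent deliveries go to different MDB instances and different transactions, so they never share a persistence context.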

EJB: seems I don't understand the crucial point

I'm new to EJB and persistence in general, so please excuse me if this is a stupid question.
I'm reading a book about EJB and JPA, and I came across a phrase that I don't understand at all:
Intended to fully insulate developers from dealing directly with
persistence, it (EJB) introduced an interface-based approach, where
the concrete bean class was never directly used by client code.
Instead, a specialized bean compiler generated an implementation of
the bean interface to facilitate such things as persistence, security,
and transaction management, delegating the business logic to the
entity bean implementation.
and
The notion of container-managed entity beans was introduced, where
bean classes became abstract and the server was responsible for
generating a subclass to manage the persistent data.
What does it mean:
Specialized bean compiler generated an implementation of the bean interface
server was responsible for generating a subclass to manage the persistent data
Actually, I can't grasp what "generate an implementation/subclass" means. Does that happen at runtime?
Thank you in advance.
EDITED:
Finally, entity beans were modeled as remote objects that used RMI and
CORBA, introducing network overhead and restrictions that should never
have been added to a persistent object to begin with.
Is this also a thing of the past now?
1) Interfaces: To define a bean, you had to declare a Local interface and a Remote interface (if you were writing a bean MyEJB, they had to be MyEJBLocal and MyEJBRemote; MyEJB would implement both). From those, a compiler generated derived classes implementing the methods; those methods would just connect to the EJB server to retrieve a bean and execute its methods.
I am not so sure about 2), since we had so many performance issues that we ended up implementing JDBC logic in the session beans (I know, I know)...
Specialized bean: Since Java EE 5, EJBs are used through annotations such as @EJB. The annotations play the role the old interfaces did: they bring in security and transaction management and delegate to your business logic, with the plumbing generated for you.
JPA: Entity beans no longer exist since Java EE 5. Now, if you put @Entity on a POJO, the server turns it into a container-managed entity and communicates with the database through the persistence context.
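For contrast with the old CMP entity beans, a JPA entity is just an annotated POJO (the class and field names below are illustrative): there is no abstract class, no remote interface, and any subclassing or bytecode weaving the provider does is an internal detail you never see.

```java
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Customer {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    // JPA requires a no-arg constructor and accesses fields or
    // getters directly; the provider tracks changes through the
    // persistence context, not through a generated bean subclass
    // you have to code against.
    public Long getId() { return id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```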

How many EntityManagers are injected for a given PersistenceContext?

I am injecting EntityManager objects in stateless EJB3 beans (which act as DAO objects and each provide access to a different database table). Deployment is in JBoss AS 7.
I then added code using System.identityHashCode in the EJB3 bean methods to see the various instances of the EntityManagers injected (hoping to see the same instance in all DAOs). E.g. like:
@Stateless
public class AFacade {

    @PersistenceContext(unitName = "foo")
    EntityManager em;

    public List<A> findAll() {
        l.info("entity manager is: " + System.identityHashCode(em)
               + " class is: " + em.getClass().getSimpleName());
        ...
    }
}
However, what I noticed is that each DAO (e.g. AFacade, BFacade and so on) was injected with a different EntityManager (as reported by identityHashCode), although the PersistenceContext was the same. The implementing class was TransactionScopedEntityManager in all cases.
It is not clear to me why these different EntityManager objects are injected, or whether this is something that should concern me. Also, I understand that the EJB3 container may actually inject proxies to the real EntityManager, so these different instances may actually be proxies to a single EntityManager.
Yes, they are proxies (in fact, I think they are thread safe decorators, rather than proxies) on the real entity managers.
I'm not sure if you know that the EntityManager is a wrapper around a connection. Without this decorator (or proxy), all invocations of that stateless bean would share the same connection (and potentially the same transaction), which is not what you want.
The injected EntityManagers are proxies generated by the EJB container.
For transaction-scoped entity managers, each transaction uses a single, separate instance of the provider's EntityManager.
When a method call is made to this proxy, the container checks javax.transaction.TransactionSynchronizationRegistry (implemented by the EJB container) to see whether a provider EntityManager has already been created for this transaction. If not, it creates the provider EntityManager, registers it in the TransactionSynchronizationRegistry, and then delegates the method call to it. If one is already present, it simply retrieves that provider EntityManager and delegates the method call to it.
Transaction scoped EntityManagers are stateless according to the book "Pro JPA2 Mastering the Java Persistence API" by Mike Keith and Merrick Schincariol (See Chapter 6).
The proxy object injected into each EJB instance is different, although a single proxy object could have been used, given the stateless nature of transaction-scoped entity managers.
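The per-transaction lookup the container performs can be sketched in plain Java. This is a simplified simulation, not the real TransactionSynchronizationRegistry API: a map keyed by the current transaction holds at most one provider EntityManager per transaction.

```java
import java.util.HashMap;
import java.util.Map;

public class TxScopedProxyDemo {

    // Stand-in for the provider's real EntityManager.
    static class ProviderEntityManager {}

    // Simplified stand-in for TransactionSynchronizationRegistry:
    // at most one provider EntityManager per transaction key.
    static final Map<String, ProviderEntityManager> registry = new HashMap<>();

    // What the injected proxy does on every call: find the
    // EntityManager bound to the current transaction, creating it
    // on first use within that transaction.
    static ProviderEntityManager resolve(String txKey) {
        return registry.computeIfAbsent(txKey, k -> new ProviderEntityManager());
    }

    public static void main(String[] args) {
        ProviderEntityManager a1 = resolve("tx-1");
        ProviderEntityManager a2 = resolve("tx-1"); // same transaction -> same instance
        ProviderEntityManager b  = resolve("tx-2"); // different transaction -> new instance

        System.out.println(a1 == a2); // true
        System.out.println(a1 == b);  // false
    }
}
```

This is why the proxies reported different identityHashCode values while the actual persistence context per transaction was still unique.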
Also take a look at : http://piotrnowicki.com/2011/11/am-i-in-the-same-transaction-am-i-using-the-same-persistencecontext/

JSF2: Open Session in View with EJBs?

Does it make sense to talk about the Open Session In View Pattern within JSF2 applications?
My app has JSF2 Managed Beans calling Business Service EJBs that do all the db-related stuff (there's a DAO layer but that doesn't matter right now).
Having OSIV pattern would mean that the Managed Bean would have to somehow make sure the underlying session was opened.
I am also using JPA.
Theoretically, the issue is exactly the same: entities become detached when they leave the EJB unless something keeps the scope of the EntityManager open. (Here is a great post about the topic in general: JPA implementation patterns: Lazy loading).
From a blog post I read:
8) No Open Entity Manager In View support. [...] In EJB3, when your entity leaves a bean with a transaction-scoped EntityManager, it is detached from the persistence context and you may no longer rely on lazy loading (in fact, the JPA specification does not define the behavior in such a situation; probably some vendor-dependent exception will be thrown...). Of course, you may use an EntityManager with an extended persistence context, holding the transaction and persistence context open as long as you want. But this feature is only available for SFSBs, while DAO classes are typical examples of stateless services, since they only dispatch calls to the persistence layer. Additionally, having a dedicated DAO bean instance for each client seems to be a big overkill.
I'm not sure that is really true, however. From my understanding, you should be able to write a servlet filter that uses UserTransaction to start and commit the transaction (like the regular filter in OSIV). The EJB would then participate in the transaction started in the filter, and the EntityManager would remain open. I haven't tested it, but my suggestion would be to give it a try.
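A rough sketch of that filter idea (untested, as the answer says; the class name is made up and error handling is minimal): start a JTA transaction before the request is processed and commit it afterwards, so that EJB calls made during view rendering join it and the persistence context stays open for lazy loading.

```java
import java.io.IOException;
import javax.annotation.Resource;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.transaction.UserTransaction;

public class OpenEntityManagerInViewFilter implements Filter {

    @Resource
    private UserTransaction utx;

    @Override
    public void doFilter(ServletRequest req, ServletResponse res,
                         FilterChain chain) throws IOException, ServletException {
        try {
            utx.begin();               // transaction spans the whole request
            chain.doFilter(req, res);  // EJB calls during rendering join it
            utx.commit();              // persistence context closes here
        } catch (Exception e) {
            try { utx.rollback(); } catch (Exception ignored) { }
            throw new ServletException(e);
        }
    }

    @Override
    public void init(FilterConfig config) { }

    @Override
    public void destroy() { }
}
```

Note the trade-off: the transaction (and usually the database connection) is held for the entire request, which is exactly the criticism usually leveled at OSIV.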

Injecting EntityManager Vs. EntityManagerFactory

A long question, please bear with me.
We are using Spring+JPA for a web application. My team is debating over injecting an EntityManagerFactory into the GenericDAO (a DAO based on generics, along the lines provided by AppFuse; we do not use JpaDaoSupport for some reason) versus injecting an EntityManager. We are using "application managed persistence".
The arguments against injecting an EntityManagerFactory are that it's too heavyweight and therefore not required; the EntityManager does everything we need. Also, since Spring would create a new instance of the DAO for every web request (I doubt this), there would be no concurrency issues such as the same EntityManager instance being shared by two threads.
The argument for injecting the EMF is that it's good practice; it's always good to have a handle to a factory.
I am not sure which is the best approach, can someone please enlighten me?
The pros and cons of injecting EntityManagerFactory vs EntityManager are all spelled out in the Spring docs here, I'm not sure if I can improve on that.
Saying that, there are some points in your question that should be cleared up.
...Spring would create a new instance of a DAO for every web request...
This is not correct. If your DAO is a Spring bean, then it's a singleton, unless you configure it otherwise via the scope attribute in the bean definition. Instantiating a DAO for every request would be crazy.
The argument for injecting the EMF is that it's good practice; it's always good to have a handle to a factory.
This argument doesn't really hold water. General good practice says that an object should be injected with the minimum collaborators it needs to do its job.
I am putting down what I have finally gathered. From the section "Implementing DAOs based on plain JPA" in the Spring Reference:
Although EntityManagerFactory instances are thread-safe, EntityManager instances are not. The injected JPA EntityManager behaves like an EntityManager fetched from an application server's JNDI environment, as defined by the JPA specification. It delegates all calls to the current transactional EntityManager, if any; otherwise, it falls back to a newly created EntityManager per operation, in effect making its usage thread-safe.
This means that, per the JPA specification, EntityManager instances are not thread-safe, but when Spring handles them, they are made thread-safe.
If you are using Spring, it is better to inject EntityManagers instead of EntityManagerFactory.
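In Spring that typically looks like the following sketch (the DAO and entity names are made up): the DAO is a singleton, and the injected EntityManager is a shared, thread-safe proxy that delegates to the EntityManager of the current transaction.

```java
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

@Repository
public class ProductDao {

    // Shared proxy; safe to hold in a singleton bean. Each
    // transaction resolves to its own underlying EntityManager.
    @PersistenceContext
    private EntityManager em;

    @Transactional
    public void save(Product product) {
        em.persist(product);
    }
}
```

Injecting the EntityManagerFactory instead would force you to obtain, use, and close EntityManagers yourself in every method, which is exactly the boilerplate the shared proxy removes.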
I think this has already been well covered, but just to reinforce a few points.
The DAO, if injected by Spring, is a singleton by default. You have to explicitly set the scope to prototype to create a new instance every time. The entity manager injected by @PersistenceContext is thread-safe.
That being said, I did have some issues with a singleton DAO in my multi-threaded application. I ended up making the DAO an instanced bean, and that solved the problem. So while the documentation may say one thing, you probably want to test your application thoroughly.
Follow-up:
I think part of my problem is that I am using @PersistenceContext(unitName = "unit", type = PersistenceContextType.EXTENDED)
If you use PersistenceContextType.EXTENDED, keep in mind that (if I understand correctly) you have to close the transaction manually. See this thread for more information.
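For reference, an extended persistence context is declared like this and is only meaningful on a stateful bean (the class, entity, and unit names are illustrative): the context lives for the lifetime of the bean instance rather than for a single transaction.

```java
import javax.ejb.Remove;
import javax.ejb.Stateful;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.PersistenceContextType;

@Stateful
public class EditSessionBean {

    // EXTENDED: entities stay managed across transactions until the
    // bean is removed, so lazy loading keeps working between calls.
    @PersistenceContext(unitName = "unit",
                        type = PersistenceContextType.EXTENDED)
    private EntityManager em;

    public Item load(Long id) {
        return em.find(Item.class, id);  // stays managed after return
    }

    @Remove
    public void finish() {
        // the persistence context is closed when the bean is removed
    }
}
```

An extended context held in a singleton is a recipe for exactly the stale-cache problems described in the follow-up above, since it is never discarded between transactions.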
Another Follow-up:
Using an instanced DAO is an extremely bad idea. Each instance of the DAO will have its own persistence cache and changes to one cache will not be recognized by other DAO beans. Sorry for the bad advice.
I found that setting the @Repository Spring annotation on our DAOs, and having the EntityManager managed by Spring and injected via the @PersistenceContext annotation, is the most convenient way to get everything working fluently. You benefit from the thread safety of the shared EntityManager and from exception translation. By default, the shared EntityManager will manage transactions if you combine several DAOs from a manager, for instance. In the end you'll find that your DAOs will become anemic.
