OpenSessionInView vs PersistenceContext (Extended) - java

I'm working on a Hibernate/JPA/Spring/ZK architecture, and I have a lot of questions these days because I have many frameworks to learn.
One question has left me perplexed for several days.
I hear about the OpenSessionInView "pattern", which keeps a Hibernate session alive to allow lazy loading.
Many also say that this pattern is not very clean.
On the other hand, it is said that an extended PersistenceContext is not thread-safe, and is therefore not suitable for keeping the EntityManager alive.
So, what is the real solution to these problems?
I presume these issues arise from the introduction of Ajax, which opens up more possibilities, especially lazily loading heavy collections only when necessary.
For the moment, I have tried @PersistenceContext in extended mode. It's working...
I had to set it up for my JUnit tests, and it also works in my web application with lazy loading, without any further configuration.
Does the evolution of the frameworks (Spring, JPA 2.0) mean that it is now easier and "cleaner" to work with an extended PersistenceContext?
If not, should we use Spring's OpenSessionInViewFilter and go back to a transaction-scoped PersistenceContext?
Thank you.

I hear you. I've implemented both patterns in several applications since 2008. Now I avoid stateful patterns altogether. When you introduce state to the client, you create scalability and state-management issues: do you merge in the client? Do you save in the user session? What happens when you walk through a wizard and the object must stay transient before saving? How do you synchronize client-side and server-side state? What happens when the database changes; does the client break?
Look at the trend of existing technologies, including Spring MVC: the pattern is to build two projects: 1) RESTful web services, 2) user interfaces. State is shared through an immutable domain model. Sure, you might end up maintaining a set of DTOs, but they're predictable, cheap, and they scale.
My recommendation? Avoid sending proxied objects over the wire; deal with DTOs on the client, or share a domain model with the client if you want to reuse server-side validations. Lazy collections can be loaded via fine-grained API calls through Ajax. That way, you give complete control to the client.
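A minimal sketch of the DTO approach described above (all class and method names here are hypothetical): the entity's lazy collection never crosses the wire; the client fetches it on demand through a separate fine-grained call that returns its own DTOs.

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical domain entities; in a real app 'orders' would be a lazy collection.
class User {
    long id; String name; List<Order> orders;
    User(long id, String name, List<Order> orders) { this.id = id; this.name = name; this.orders = orders; }
}
class Order {
    long id; double total;
    Order(long id, double total) { this.id = id; this.total = total; }
}

// DTOs sent over the wire: immutable, no proxies, no lazy collections.
record UserDto(long id, String name) {}
record OrderDto(long id, double total) {}

class UserApi {
    // e.g. GET /users/{id} -- returns only scalar fields; collections are excluded.
    static UserDto getUser(User u) { return new UserDto(u.id, u.name); }

    // e.g. GET /users/{id}/orders -- separate fine-grained call the client makes via Ajax.
    static List<OrderDto> getOrders(User u) {
        return u.orders.stream().map(o -> new OrderDto(o.id, o.total)).collect(Collectors.toList());
    }
}
```

With this split the client decides when the collection is worth fetching, and no open session is needed at render time.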
That's how the social web has scaled in the past five years.

Related

how to roll back a transaction happening between microservices?

We have a microservice architecture where, for the most part, each microservice is independent. But for some legacy reasons, there is a situation where one microservice has to call another.
e.g., the following method is part of the Legal Service:
@Autowired
private UserServiceManager userServiceManager;

public void updateUserLegalData() {
    // do db update of legal info for the user
    userServiceManager.setAcceptedLegal(true);
}
There are two db transactions going on above: one updates the legalService db and the other updates the userService db. Please NOTE that userService is a microservice running on a separate VM.
We are seeing situations where the legalService db is updated but the call to userService fails (internal server error), which leaves the application in an inconsistent state. How can we fix this in a recommended way?
Thanks
This situation can be handled with JTA global/distributed transactions. JTA is part of the Java EE standard and has various implementors; Atomikos is often the tool of choice.
There is a good write-up from Dave Syer (a Spring ecosystem contributor). It also contains working examples. It's a little outdated, but still relevant. You can apply some more modern Spring abstractions on top of his examples.
I created a few GitHub examples of JTA transactions for my book. Notice that errors are simulated and the transaction is spread across JMS and JDBC data sources.
But also bear in mind that JTA transactions across various data sources are slow, because of the two-phase commit algorithm involved. So people often try to avoid them and instead deal with inconsistencies pragmatically.
Don't do distributed transactions.
For integration with your existing legacy system one approach could be a separate (micro)service which listens to update events from your userService and forwards the respective updates to the legalService.
Spring integration may be suitable for such a task.
Cheers,
Michael
If you read a little about the subject on the internet, you'll see it is hotly debated at the moment, but there is one answer everybody agrees on: distributed transactions are not the way to go. They are too clumsy and buggy to rely on for data consistency.
So what are our options? People are currently trying to coordinate microservice transactions via Apache Kafka or with Event Sourcing (which concentrates on saving the events that change the data instead of saving the data itself). What is the problem with those? They are quite different from the usual programming model we are used to, and quite complex from a technical and organisational point of view, so instead of programming for business problems, you start programming against the technical challenge.
So what is the alternative? I personally developed another concept and wrote a blog post about it, which might interest you. At its core, it uses full microservice design principles and Spring Boot + Netflix in a Java EE container while still fully using transactions. It is too long to cover all the details here; if you are interested, you can read more at the link below.
Micro Services and Transactions with Spring Boot + Netflix
Transaction across microservices can become complex and can slow down the system, one of the best ways to solve the problem of distributed transactions is to avoid them completely.
If you avoid distributed transactions across microservices then you will not end up in such situation.
If you do have to implement distributed transactions across microservices, then I think there are a couple of ways:
Two-phase commit protocol
Eventual consistency
In your case, I would recommend using a message bus and a flag to communicate between the services. When the legal service adds the data to the legal database, it puts a lock on that record and sends a message on the message bus; when the user service is up, it picks up the message, updates the database at its end, and sends an ack message onto the bus. Once the ack message is received, the lock is removed; otherwise, the record is deleted/rolled back after a certain time. This looks complex, but it is a reliable and failure-tolerant solution in your case.
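The lock/message/ack/unlock flow described above can be simulated in-memory to make the moving parts concrete. Everything here is illustrative (the bus, service classes, and message format are hypothetical stand-ins; a real system would use Kafka, RabbitMQ, etc.):

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// Minimal in-memory stand-in for a message bus.
class Bus {
    Queue<String> messages = new ArrayDeque<>();
    void publish(String m) { messages.add(m); }
    String poll() { return messages.poll(); }
}

class LegalService {
    Map<Long, Boolean> locked = new HashMap<>();

    void updateUserLegalData(long userId, Bus bus) {
        // 1. write legal info locally and lock the record until the peer confirms
        locked.put(userId, true);
        bus.publish("LEGAL_UPDATED:" + userId);
    }

    void onAck(long userId) { locked.put(userId, false); } // 4. unlock on ack
    boolean isLocked(long userId) { return locked.getOrDefault(userId, false); }
}

class UserService {
    Map<Long, Boolean> acceptedLegal = new HashMap<>();

    // 2-3. consume the message, update own db, acknowledge
    void consume(Bus bus, LegalService legal) {
        String m = bus.poll();
        if (m != null && m.startsWith("LEGAL_UPDATED:")) {
            long id = Long.parseLong(m.substring("LEGAL_UPDATED:".length()));
            acceptedLegal.put(id, true);
            legal.onAck(id);
        }
    }
}
```

A real implementation would also need the timeout/rollback path for records whose lock is never released, which is omitted here.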

CDI interceptors and memcache

I was reading about interceptors and AOP, and the way they can unclutter your code and externalize cross-cutting concerns into aspects. I instantly thought of CDI and the use of custom interceptors to check a cache every time one tries to access the database.
Is there any library that already implements this and supports memcache? I think calls to the EntityManager should be intercepted.
IMHO, if you want to go that way, you need a pretty good reason to justify why Hibernate's cache / JBoss Cache (just guessing at your technology stack, but there are products/solutions for almost all stacks) won't fit your needs.
You certainly don't want to reinvent the wheel by developing your own query or object cache, do you?
In general, using memcached to directly avoid DB requests is very difficult to get right and inefficient. You really want to cache higher-level concepts, such as at DAO -> DTO boundaries.
I've used AOP to inject cache-invalidation and observer-management code into Java programs pretty successfully. AOP lets me think differently about the reusability of different parts of my code. It doesn't mean I don't have to design these aspects, but it frees me from limitations and keeps me from cutting and pasting, etc...
So my recommendation would be to design this access pattern such that you have a well-defined bunch of work at each of these boundaries, and then design cross-cuts that inject that work at compile time.
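One way to sketch the intercept-and-cache idea without a full AOP framework is a JDK dynamic proxy around the DAO boundary (the DAO interface and names here are hypothetical; a CDI @AroundInvoke interceptor would be structurally similar, with InvocationContext playing the role of the handler):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.Arrays;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical DAO boundary we want to cache at.
interface UserDao {
    String findName(long id);
}

class CachingProxy {
    // Wraps any interface; caches results keyed by method name + args,
    // much like a memcached get/set around the real call.
    @SuppressWarnings("unchecked")
    static <T> T cache(Class<T> iface, T target, Map<String, Object> store) {
        InvocationHandler h = (proxy, method, args) -> {
            String key = method.getName() + Arrays.toString(args);
            // cache hit: skip the "database" entirely
            if (store.containsKey(key)) return store.get(key);
            // cache miss: delegate to the real DAO and remember the result
            Object result = method.invoke(target, args);
            store.put(key, result);
            return result;
        };
        return (T) Proxy.newProxyInstance(iface.getClassLoader(), new Class<?>[]{iface}, h);
    }
}
```

Note this sketch deliberately ignores invalidation, which in practice is the hard part of caching at this boundary.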

Looking for design patterns to isolate framework layers from each other

I'm wondering if anyone has any experience in "isolating" framework objects from each other (Spring, Hibernate, Struts). I'm beginning to see design "problems" where an object from one framework gets used in another object from a different framework. My fear is we're creating tightly coupled objects.
For instance, I have an application where we have a DynaActionForm with several attributes...one of which is a POJO generated by the Hibernate Tools. This POJO gets used everywhere...the JSP populates data to it, the Struts Action sends it down to a Service Layer, the DAO will persist it...ack!
Now, imagine that someone decides to do a little refactoring on that POJO...so that means the JSP, Action, Service, DAO all needs to be updated...which is kind of painful...There has got to be a better way?!
There's a book called Core J2EE Patterns: Best Practices and Design Strategies (2nd Edition)...is this worth a look? I don't believe it touches on any specific frameworks, but it looks like it might give some insight on how to properly layer the application...
Thanks!
For instance, I have an application where we have a DynaActionForm with several attributes...one of which is a POJO generated by the Hibernate Tools. This POJO gets used everywhere...the JSP populates data to it, the Struts Action sends it down to a Service Layer, the DAO will persist it...ack!
To me, there is nothing wrong with having domain objects as a "transversal" layer in a web application (after all, you want their state to go from the database to the UI, and I don't see the need to map them into intermediate structures).
Now, imagine that someone decides to do a little refactoring on that POJO...so that means the JSP, Action, Service, DAO all needs to be updated...which is kind of painful...There has got to be a better way?!
Sure, you could read "Beans" from the database at the DAO layer level, map them into "Domain Objects" at the service layer and map the Domain Objects into "Value Objects" for the presentation layer and you would have very low coupling. But then you'll realize that:
Adding a column in the database usually means adding some information to the view, and vice versa.
The duplication of objects and mappings is extremely painful to create and maintain.
And you'll forget this idea.
There's a book called Core J2EE Patterns: Best Practices and Design Strategies (2nd Edition)...is this worth a look? I don't believe it touches on any specific frameworks, but it looks like it might give some insight on how to properly layer the application...
This book was a "showcase" of how to implement (over-engineered) applications using the whole J2EE stack (with EJB 2.x), and it has always been considered somewhat too complicated (too many patterns). On top of that, it is clearly outdated today. So it is interesting, but must be taken with a giant grain of salt.
In other words, I wouldn't recommend that book (at least certainly not as state of the art). Instead, have a look at Real World Java EE Patterns - Rethinking Best Practices (see Chapter 3 - Mapping of the Core J2EE patterns into Java EE) and/or the Spring literature if you are not using Java EE.
First, avoid Struts 1. Having to extend a framework class (like DynaActionForm) is one of the reasons this framework is no longer a good choice.
You don't use Spring classes in the usual scenarios. Spring is non-invasive; it just wires your objects together. You depend on it only if you use some of its interfaces, like ApplicationContextAware, or its Hibernate or JDBC extensions. Using these extensions together with Hibernate/JDBC is completely fine and is not an undesired coupling.
Update: if you are forced to work with Struts 1 (honestly, try negotiating for Struts 2; Struts 1 is obsolete!), the usual way to go is to create a copy of the form class that contains the exact same fields but does not extend the framework class, with a factory method that takes the form and returns the plain POJO. This duplicates code, but I've seen it in practice and it's not that bad (compared to the use of Struts 1 :) )
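The copy-and-factory approach can be sketched like this (class and field names are hypothetical; UserForm stands in for the class that would otherwise extend the Struts ActionForm):

```java
// Form-shaped class mirroring the framework-bound form, field for field.
class UserForm {
    private String name;
    private String email;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }

    // Factory method: hand a framework-free POJO down to the service layer.
    public User toUser() {
        User u = new User();
        u.setName(name);
        u.setEmail(email);
        return u;
    }
}

// Plain POJO used by the service/DAO layers; no framework dependency.
class User {
    private String name;
    private String email;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}
```

The duplication is real, but it is mechanical and confined to one conversion point, so refactoring the POJO no longer ripples into the web tier.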
I think your problem is not as big as it seems.
Let's imagine what you can really change in your POJO:
1) The name of its class: any IDE with refactoring support will automatically make all the necessary changes for you.
2) Adding a field/method: this almost always means adding new functionality, which should always be done manually and carefully. It usually causes some changes in your service layer, very seldom in the DAO, and usually in your view (JSP).
3) Changing a method's implementation: with good design, this shouldn't cause any changes in other classes.
That's all, IMHO.
Make a decision about the technology for implementing your business logic (EJB or Spring) and use its dependency-injection facilities. DI makes the different parts of your program communicate through interfaces, which should be enough to reach the necessary (low enough) level of coupling.
It's always nice to keep things clear if you can and separate the layers etc. But don't go overboard. I've seen systems where the developers were so intent on strictly adhering to their adopted patterns and practices that they ended up with a system worse than the imaginary one they were trying to avoid.
The art of good design is understanding the good practices and patterns, knowing when and how to apply them, but also knowing when it's appropriate to break or ignore them.
So take a good look at how you can achieve what you are after, read up on the patterns. Then do a trial on a separate proof of concept or a small part of your system to see your ideas in practice. My experience is that only once you actually put some code in place, do you really see the pros and cons of the idea. Once you have done that, you will be able to make an informed decision about what you will or will not introduce.
Finally, it's possible to build a system that handles all the issues you are concerned about, but be pragmatic: is each goal you are attempting to reach worth the extra code and APIs you will have to introduce to reach it?
I'd say that Core J2EE Patterns: Best Practices and Design Strategies (2nd Edition) addresses EJB 2.0 concerns, some of which would be considered anti-patterns today. Knowledge is never wasted, but I wouldn't make this my first choice.
The problem is that it's impossible to decouple all the layers. Refactoring the POJO means modifying the problem you're solving, so all the layers DO have to be modified. There's no way around that.
Pure decoupling of layers that have no knowledge of each other requires a lot of duplication, translation, and mapping to occur. Don't fall for the idea that loose coupling means this work goes away.
One thing you can do is have a service layer that's expressed in terms of XML requests and responses. It forces you to map the XML to objects on the service side, but it does decouple the UI from the rest.

How to deal with databases for websites written in Java, more specifically Wicket?

I'm new to website development using Java, but I've gotten started with Wicket and made a little website. I'd like to expand on what I've already made (a website with a form, labels, and links) and implement database connectivity.
I've looked at a couple of examples, for example Mystic Paste, and I see that they're using Hibernate and Spring. I've never touched Hibernate or Spring before, and to be honest the heavy use of annotations scares me a little, as I haven't really made use of them before, with the exception of suppressing warnings and overriding.
At this point I have one Connection object, which I set up in the WebApplication class upon initialization. I then retrieve this Connection object whenever I need to perform queries. I don't know whether this is a bad approach for a production web application.
All help is greatly appreciated.
Wicket, Spring, and Hibernate is pretty much the standard stack for Wicket applications. Or rather: "any web framework" plus Spring and Hibernate is pretty much the standard stack for any web framework.
Regarding Wicket, injecting objects into components with @SpringBean is an extremely nice feature to have. Additionally, the OpenSessionInViewFilter manages Hibernate sessions for you (while Hibernate itself takes care of connections).
Therefore, I'd really suggest you look into Spring and Hibernate. Neither requires annotations, but annotations are usually easier to use than configuration files (typically XML).
If you still don't want to use Spring or Hibernate, I'd suggest you look at the OpenSessionInViewFilter and create something similar yourself: create a connection for each request, use it during that request, and close it at the end. As this won't perform very well, you might instead get connections from a pool and return them at the end of each request. But instead of writing this code, you could already be injecting beans into your components ;)
It's a bad approach because a Connection object is intended for use by a single thread, and web application requests are processed by a pool of threads.
At best you'll suffer big performance problems, because one connection object can't execute queries concurrently.
The solution to this problem is to use a connection pool.
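The pooling idea can be sketched with a tiny generic pool (purely illustrative; a real application would use a library such as HikariCP or Apache DBCP, with real JDBC Connections and validation/timeout handling):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

// Minimal pool: each request borrows its own resource and returns it afterwards,
// so no two threads ever share one "connection" at the same time.
class SimplePool<T> {
    private final BlockingQueue<T> idle;

    SimplePool(int size, Supplier<T> factory) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) idle.add(factory.get());
    }

    // Blocks until a resource is free; concurrent borrowers get distinct resources.
    T borrow() {
        try {
            return idle.take();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("interrupted while waiting for a pooled resource", e);
        }
    }

    void release(T resource) { idle.add(resource); }
}
```

A request-scoped filter would then borrow at the start of the request and release in a finally block at the end, which is essentially what the OpenSessionInViewFilter does for Hibernate sessions.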
If you have time, you can dig into Apache Cayenne; it's far lighter than Hibernate. For dependency injection, combine it with Google Guice, again very lightweight. Wicket has the wicket-guice subproject, which provides DI in Wicket components, much like the Spring context.
IMHO it's a fair alternative and has worked very nicely so far.

Why does Hibernate seem to be designed for short lived sessions?

I know this is a subjective question, but why does Hibernate seem to be designed for short-lived sessions? Generally in my apps I create DAOs to abstract my data layer, but since I can't predict how the entity objects are going to be used, some of their collections are lazy-loaded, or I should say, fail to load once the session is closed.
Why did they not design it so that it would automatically re-open the session, or have sessions always stay open?
Because once you move out of your transaction boundary, you can't hit the database again without starting a new transaction. Having long-running transactions "just in case" is a Bad Thing (tm).
I guess you want to lazy-load objects from your view; take a look here for some options. I prefer to define exactly how much of the object graph is going to be returned by my session facade methods. I find this makes it easier to unit-test and performance-test my business tier.
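Fixing the fetch depth at the facade boundary can be sketched as two explicit methods (all names hypothetical, with the data stores faked as maps; with JPA the "with orders" variant would typically be a JOIN FETCH query):

```java
import java.util.List;
import java.util.Map;

class Customer {
    long id; String name;
    List<String> orders; // null unless the facade explicitly fetched it

    Customer(long id, String name, List<String> orders) {
        this.id = id; this.name = name; this.orders = orders;
    }
}

// The facade decides exactly how much of the graph each method returns,
// so callers never trip over an uninitialized lazy collection.
class CustomerFacade {
    private final Map<Long, String> names;
    private final Map<Long, List<String>> orders;

    CustomerFacade(Map<Long, String> names, Map<Long, List<String>> orders) {
        this.names = names; this.orders = orders;
    }

    // Shallow: scalar fields only; safe to use after the session is gone.
    Customer findCustomer(long id) {
        return new Customer(id, names.get(id), null);
    }

    // Deep: the collection is fetched eagerly inside the facade call.
    Customer findCustomerWithOrders(long id) {
        return new Customer(id, names.get(id), orders.get(id));
    }
}
```

Each facade method becomes a documented contract about what is loaded, which is what makes it straightforward to unit-test and performance-test in isolation.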
I worked on a desktop app that used EJB and Hibernate. We had to set lazy=false everywhere, because when the objects get serialized, they lose their ability to be fetched from the backend. That's just how it goes, unfortunately.
If you are concerned with performance, you could use caching on the backend so that your non-lazy fetches are not as painful.
You're looking for the OpenSessionInView pattern, which is essentially a conceptual filter (and sometimes implemented as a servlet filter) that detects when a session needs to be transparently reopened. Several frameworks implement this, so it's handled automagically.
I'm writing a desktop application so using a filter isn't applicable.
Connections are a scarce resource that needs to be recycled as soon as you are done with them. If you are using connection pooling, getting another one when you need it should be quick. This is the architecture you have to use to make websites scale; even though you are writing a desktop app, Hibernate's use cases concentrate on scalable sites.
If you look at MS ADO.NET, you will see a similar focus on keeping connections open for a short time; it has a whole offline model for updating data while disconnected and then applying the changes to the database when you are ready.
Hibernate is designed as a way to map objects to relational database tables, and it accomplishes that job very well. But it can't please everybody all of the time. I think there is some complexity in learning how initialization works, but once you get the hang of it, it makes sense. It wasn't "designed" specifically to anger you; it's just the way it happened.
If it magically reopened sessions in non-webapps, I think the complexity of learning the framework would far outweigh the benefits.
