why have manager files in JDBC - java

I'm playing around with JDBC, and I've noticed that there's usually a manager file that sits between the front end and the DAO.
I was wondering: why is this the case?
Is it bad form to have the front end directly interact with the DAO and call the methods?

The question isn't very clear. If by "manager file" you mean "service", I think that's closer to the truth.
The reason is that usually there's more work to be done than a single DAO call to accomplish a use case, so a service marshals all the objects that are required.
If there are any write operations, they typically all need to succeed or fail together, so the service can own the transaction and manage the commit/rollback behavior, as in the sketch below.
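For illustration only, here is a minimal sketch of that idea with plain JDBC; OrderService, OrderDao, InventoryDao and Order are made-up names standing in for your own classes:

import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;

// Hypothetical sketch: a service that coordinates two DAO calls in one transaction.
public class OrderService {
    private final DataSource dataSource;
    private final OrderDao orderDao;
    private final InventoryDao inventoryDao;

    public OrderService(DataSource dataSource, OrderDao orderDao, InventoryDao inventoryDao) {
        this.dataSource = dataSource;
        this.orderDao = orderDao;
        this.inventoryDao = inventoryDao;
    }

    public void placeOrder(Order order) throws SQLException {
        try (Connection con = dataSource.getConnection()) {
            con.setAutoCommit(false);                   // the service owns the transaction
            try {
                orderDao.insert(con, order);            // first write
                inventoryDao.reserveStock(con, order);  // second write
                con.commit();                           // both succeed together...
            } catch (SQLException e) {
                con.rollback();                         // ...or fail together
                throw e;
            }
        }
    }
}

The front end only ever calls placeOrder; it never sees the DAOs or the connection handling.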

Related

Objectify and transactions

I'm trying to use Objectify in my App Engine project. It works, but I've got several "paths" where a single entity can be read and written by a single servlet. If I've understood the architecture correctly, the servlet container can instantiate my servlet multiple times depending on load, right? So the question is: do I need to use Objectify transactions in this case? My doubt is quite basic, because I think this kind of situation happens 99% of the time in this context, so the other question is: when can I use the simple Objectify load and save? I hope someone can clarify a bit.
From the Objectify Wiki: If you operate on the datastore without an explicit transaction, each datastore operation is treated like a separate little transaction which is retried separately (link: https://github.com/objectify/objectify/wiki/Concepts#transactionless).
So each save() or delete() is executed in its own separate transaction, and it doesn't matter if GAE starts multiple instances of your servlet.
You would want to start a transaction explicitly when you want to perform multiple operations as an atomic unit (either all or none), e.g. select and modify, or modify multiple objects together, as sketched below.
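As a rough illustration, an explicit Objectify transaction might look something like this; the Account entity and its balance accessors are made-up names, not part of your project:

import static com.googlecode.objectify.ObjectifyService.ofy;
import com.googlecode.objectify.VoidWork;

// Sketch: modify two hypothetical Account entities as one atomic unit.
public void transfer(final long fromId, final long toId, final long amount) {
    ofy().transact(new VoidWork() {
        public void vrun() {
            Account from = ofy().load().type(Account.class).id(fromId).now();
            Account to = ofy().load().type(Account.class).id(toId).now();
            from.setBalance(from.getBalance() - amount);
            to.setBalance(to.getBalance() + amount);
            ofy().save().entities(from, to).now();  // committed together, or the whole unit fails/retries
        }
    });
}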

Database transaction combined with REST call

I have to save records to a database and then send some data to a restful web service. I need them to happen together. If one fails then the other should not happen as well. So for example, consider the following code:
saveRecords(records);
sendToRestService(records);
If saveRecords fails with a database constraint violation then I don't want the REST call to happen. I could make saveRecords happen in its own transaction and commit it before the call to sendToRestService, but there is still the potential for the REST service to be down. I could keep track of whether the REST service succeeds and, if it doesn't, try to send the records later. I was just wondering if there is a better strategy, since this seems like a common scenario.
Thanks for any advice.
Why don't you try the Observer design pattern?
I'm assuming the saveRecords(records) and sendToRestService(records) methods are in two different classes.
If you use the Observer design pattern, you can notify the class containing the sendToRestService() method when the observed object in the calling class changes, as in the sketch after the reference below.
Ref: Observer Design Pattern
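A rough sketch of that idea (Record, RecordSaver and the listener interface are hypothetical names): the REST-calling class registers itself as an observer and is only notified after the save succeeds.

import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the Observer idea applied to this scenario.
interface RecordsSavedListener {
    void onRecordsSaved(List<Record> records);
}

class RecordSaver {
    private final List<RecordsSavedListener> listeners = new ArrayList<RecordsSavedListener>();

    void addListener(RecordsSavedListener listener) {
        listeners.add(listener);
    }

    void saveRecords(List<Record> records) {
        // persist the records here; if this throws, no observer is notified
        for (RecordsSavedListener listener : listeners) {
            listener.onRecordsSaved(records);  // e.g. an observer that calls sendToRestService(records)
        }
    }
}

Note that this only orders the two calls; it does not make the database write and the REST call atomic, so the REST service being down still has to be handled (for example by retrying later, as you describe).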

DAO and Service layer design

I am developing a web application with Java EE 6. In order to minimize calls to the database, would it be a good idea to have these classes:
A data access class (DAO) exposing only basic CRUD methods: getAllClients, getAllProducts, getAllOrders, delete, update.
A service class which calls the CRUD methods but adds filter methods, e.g. findClientByName, findProductByType, findProductByYear, findOrderFullyPaid/NotPaid, etc., built on top of the basic DAO methods.
Thank you
In my experience (albeit limited), DAO classes tend to have all the possible database operations which the application is allowed to perform. So in your case, it will have methods such as getAllClients() and getClientByName(String name), etc.
Getting all the users in your DAO and iterating over all of them until you find the one you need will result in a needless waste of computation time and memory.
If you want to reduce the number of times your database is hit, you could implement some caching mechanism. An ORM framework such as Hibernate should be able to provide what you need through its second-level cache.
EDIT:
As per your comment question: no, your service will not be made redundant. What one usually does is use a Service layer to expose the DAO functionality. This basically keeps the DAO from being visible to the front end of your application. It also usually allows for extra methods, such as, for instance, public String getUserFormatted(String userName). This would make use of the getUserByName function offered by the DAO but provide some extra functionality.
The Service layer will also make itself useful should there be a change in specification and you now also need a web service to interface with your application. Having a service layer in between will allow the web service to query the DAO through the Service layer.
So basically, the DAO layer will still worry about the database stuff (CRUD Operations) while the service will adapt the data returned by the DAO without exposing the DAO.
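A minimal sketch of what that might look like (UserDao and User are stand-ins for your own types):

// Sketch: the service builds on the DAO without exposing it to the front end.
public class UserService {
    private final UserDao userDao;  // hypothetical DAO offering getUserByName(...)

    public UserService(UserDao userDao) {
        this.userDao = userDao;
    }

    // Extra functionality layered on top of the DAO's getUserByName
    public String getUserFormatted(String userName) {
        User user = userDao.getUserByName(userName);
        return user.getLastName() + ", " + user.getFirstName();
    }
}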
It's hard to say without more information, but I think it's probably a good idea to leverage your database more than with just CRUD operations. Databases are good at searching, provided you configure them correctly, so IMHO it's a good idea to let your database handle the searching in your find methods for you. This means that your find methods would probably go in your DAOs...
It's good to think about and be aware of the implications of DB access on performance, but don't go overboard. Also, your approach implies that since your services are going to do the filtering, you are going to load a large amount of DB data into your application, which is a bad idea. The bottom line is you should use your RDBMS as it is intended to be used, and worry about performance due to over-access when you can show it's a problem. I doubt you will run into that scenario.
I would say that you're better off having your DAO be more fine grained than you've specified.
I'd suggest putting findClientByName, findProductByType, findProductByYear, findOrderFullyPaid/NotPaid on your DAO as well in some form, because your database will most likely be better at filtering and sorting data than your in-memory code.
Imagine you have 10 years of data and you call findProductsByYear on your service class; it then calls getAllProducts and throws away 9 years of data in memory. You're far better off getting your database to return only the year you are interested in.
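For instance, with Hibernate the year filter could be pushed down into the DAO as a query instead of being applied in memory. A rough sketch, assuming the DAO holds a SessionFactory and there is a Product entity with a year property:

// Sketch: let the database do the filtering.
@SuppressWarnings("unchecked")
public List<Product> findProductsByYear(int year) {
    return sessionFactory.getCurrentSession()
            .createQuery("from Product p where p.year = :year")
            .setParameter("year", year)
            .list();
}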
Yes, this is the right way to do it.
The service will own the transactions. You should write these as POJOs; that way you can expose them as SOAP or REST web services, EJBs, or anything else that you want later on.

Create Hibernate-Session per Request

I just started a simple Java test project which manages some entities using Hibernate and provides a REST interface to manipulate these objects, plus some additional business logic. The REST interface is created using RESTEasy and Jetty.
Everything works fine so far, but I have the feeling that I'm actually writing too much boilerplate code. As I don't have much experience in these Java frameworks I'm just wondering if anyone could give me a hint on how to improve the situation.
Creating Hibernate Sessions per Request
Well, as far as I understand, I have to create a Hibernate session per request and close it at the end. So currently all of my service methods look like this:
Session session = HibernateUtil.getInstance().getSessionFactory().openSession();
...
...
...
session.close();
Is there any way to remove these two lines in order to somehow do this automatically?
Currently my service is registered as a RESTEasy singleton. Would changing to a RESTEasy resource and creating the session in the constructor solve the problem? I think it would solve the problem of creating the session, but where do I close it?
In C++ this can easily be done by creating a scoped object which closes the session at the end. But in Java?
Whenever such a REST request is made, I have to check for a valid session (the user has to have logged in previously). Is a ServletFilter the right way to do this?
General: are there any other patterns or frameworks I should consider using? I want to keep it as simple as possible, and especially since I don't have that much experience I don't want to end up using Spring or some other heavyweight framework. I'm used to the simplicity of Python and Django, but for this little project I have to use Java.
Thanks so far!
Hibernate's current recommended approach for managing Sessions is detailed on this wiki page. In particular, I think you need to read the last paragraph: This is all very difficult, can't this be done easier?
In the end, you do need to tell the persistence layer that "I'm about to do something" (which usually also gets the Session to do it with) and "I'm done doing it". You can do it with annotations, or JTA transactions, but that information still has to be communicated!
Inject a SessionFactory into your data access object and use sessionFactory.getCurrentSession() to access the Hibernate Session object.
You can make use of any of the available factory classes to implement this.
Then your code would look like this:
sessionFactory.getCurrentSession().save(newInstance);
You should try writing a Filter that does this. Spring's OpenSessionInViewFilter is a good place to start if you need an example.
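A rough sketch of such a filter, assuming the HibernateUtil from the question and hibernate.current_session_context_class set to thread (so getCurrentSession() binds the session to the request thread, and committing the transaction also closes it); the class name is illustrative:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import org.hibernate.SessionFactory;

// Sketch: open a session/transaction per request and close it when the request ends.
public class HibernateSessionFilter implements Filter {
    private SessionFactory sessionFactory;

    public void init(FilterConfig config) {
        sessionFactory = HibernateUtil.getInstance().getSessionFactory();
    }

    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        sessionFactory.getCurrentSession().beginTransaction();
        try {
            chain.doFilter(request, response);  // your RESTEasy resource runs here
            sessionFactory.getCurrentSession().getTransaction().commit();  // also closes the session
        } catch (RuntimeException e) {
            sessionFactory.getCurrentSession().getTransaction().rollback();
            throw e;
        }
    }

    public void destroy() {
    }
}

With this in place, the resource methods can simply call sessionFactory.getCurrentSession() and no longer need the openSession()/close() boilerplate.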

Three tier layered application using Wicket + Spring + Hibernate. How would you handle transactions?

I'm thinking about using the Open Session In View (OSIV) filter or interceptor that comes with Spring, as it seems like a convenient way for me as a developer. If that's what you recommend, do you recommend using a filter or an interceptor and why?
I'm also wondering how it will mix with HibernateTemplate, and whether I will lose the ability to mark methods as @Transactional(readOnly = true) etc. and thus lose some more fine-grained transaction control?
Is there some kind of best practice for how to integrate this kind of solution with a three tier architecture using Hibernate and Spring (as I suppose my decision to use Wicket for presentation shouldn't matter much)?
If I use OSIV I will at least never run into lazy loading exceptions; on the other hand, my transaction will live longer before it can commit, since it stays open in the view as well.
It's really a matter of personal taste.
Personally, I like to have transaction boundaries at the service layer. If you start thinking SOA, every call to a service should be independent. If your view layer has to call 2 different services (we could argue that this is already a code smell) then those 2 services should behave independently of each other, could have different transaction configurations, etc... Having no transactions open outside of the services also helps make sure that no modification occurs outside of a service.
OTOH you will have to think a bit more about what you do in your services (lazy loading, grouping functionalities in the same service method if they need a common transactionality, etc ...).
One pattern that can help reduce lazy-loading errors is to use Value Objects outside of the service layer. The services should always load all the data needed and copy it into VOs. You lose the direct mapping between your persistent objects and your view layer (meaning you have to write more code), but you might find that you gain in clarity...
Edit: The decision will be based on trade-offs, so I still think it is at least partly a matter of personal taste. Transactions at the service layer feel cleaner to me (more SOA-like, the logic is clearly restrained to the service layer, different calls are clearly separated, ...). The problem with that approach is LazyInitializationExceptions, which can be resolved by using VOs. If VOs are just a copy of your persistent objects, then yes, it is clearly a break of the DRY principle. If you use VOs like you would use a database view, then VOs are a simplification of your persistent objects. It will still be more code to write, but it will make your design clearer. It becomes especially useful if you need to plug in some authorization scheme: if certain fields are visible only to certain roles, you can put the authorization at the service level and never return data that should not be viewed.
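A small sketch of the VO idea (Order, OrderItem and the field names are made up): the service resolves everything the view needs, including lazy associations, and copies it into a plain value object before the transaction ends.

import java.util.ArrayList;
import java.util.List;

// Sketch: a plain value object the view can use safely after the session is closed.
public class OrderVO {
    private final String customerName;
    private final List<String> itemLabels;

    public OrderVO(String customerName, List<String> itemLabels) {
        this.customerName = customerName;
        this.itemLabels = itemLabels;
    }

    public String getCustomerName() { return customerName; }
    public List<String> getItemLabels() { return itemLabels; }
}

// Inside the service, while the session/transaction is still open:
public OrderVO loadOrderForView(long orderId) {
    Order order = orderDao.findById(orderId);       // persistent entity, possibly with lazy collections
    List<String> labels = new ArrayList<String>();
    for (OrderItem item : order.getItems()) {       // lazy collection resolved here, not in the view
        labels.add(item.getLabel());
    }
    return new OrderVO(order.getCustomer().getName(), labels);
}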
"If I use OSIV I will at least never run into lazy loading exceptions"
That is not true; in fact, it's extremely easy to run into the infamous LazyInitializationException: just load an object and try to read an attribute of it after the view, and depending on your configuration you WILL get the LIE.
