I have a custom filter for querying the database.
The API layer builds the filter and sends it to the DAO layer; the DAO executes the filter (filter.toCriteria()) and returns a List of results.
public interface IFilter {
    Criteria toCriteria();
}
I want the DAO API to always ask for a filter + a security filter in every method.
List getAll(IFilter filter, IFilter security); // each filter will become a Criteria in the end
I end up inside the DAO with two Criteria objects: the regular filter and the security filter.
How can I execute two Criteria and still return one List of results?
Or do you think I should use only one filter and have the API layer add the security constraints to it?
Unless you want to go with an interceptor approach (e.g., a SecurityInterceptor/proxy class which transparently modifies the criteria), I think it would be a nicer design to have two separate filters.
Note that I don't think it is possible to join two DetachedCriteria objects together. However, you can have a routine that takes the two IFilter objects and returns a single DetachedCriteria built, e.g., with Restrictions.and(criterion1, criterion2).
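If you go that route, here is a minimal sketch of such a routine. It assumes IFilter is reworked to produce a Hibernate Criterion rather than a full Criteria, and FilteredQuery is just a made-up helper name:

import org.hibernate.criterion.Criterion;
import org.hibernate.criterion.DetachedCriteria;
import org.hibernate.criterion.Restrictions;

// Hypothetical reworked filter contract: each filter contributes a Criterion
// instead of a full Criteria, so the two can be combined.
interface IFilter {
    Criterion toCriterion();
}

public class FilteredQuery {

    // Builds a single DetachedCriteria that applies both the regular filter
    // and the security filter with a logical AND.
    public DetachedCriteria toCriteria(Class<?> entityClass, IFilter filter, IFilter security) {
        return DetachedCriteria.forClass(entityClass)
                .add(Restrictions.and(filter.toCriterion(), security.toCriterion()));
    }
}

The DAO then executes the resulting DetachedCriteria as usual, so it never has to deal with two separate criteria at all.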
I'm brand new to Spring Boot, and I've created a very basic REST service that uses JPA and exposes a RepositoryRestResource for CRUD and query operations on my model:
@RepositoryRestResource
public interface CatalogueOrderRepository extends JpaRepository<CatalogueOrder, Long>,
        QuerydslPredicateExecutor<CatalogueOrder> {
}
Using this, I'm able to perform queries that involve searching for values, pagination, and ordering, for instance:
?page=0&size=5&sort=priority,desc&orderStatus=submitted
Is it possible to search for values that are not equal, without any additional work? For instance, all orders where the orderStatus is NOT equal to 'submitted'.
I notice that the Predicate interface has a not() method, though I'm not sure if it's related.
For such cases you do have to do some work. There are different approaches; see the Spring Data JPA docs and examples.
E.g. you can use @Query or Specifications.
You can also try "Query creation from method names".
Let's say you want to search for Orders where orderStatus <> 'submitted':
List<Order> findByOrderStatusNot(String orderStatus);
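Applied to the repository above, a minimal sketch could look like this (assuming the entity property is named orderStatus; @Param makes the parameter usable from the exposed search resource):

import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;
import org.springframework.data.repository.query.Param;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;

@RepositoryRestResource
public interface CatalogueOrderRepository extends JpaRepository<CatalogueOrder, Long>,
        QuerydslPredicateExecutor<CatalogueOrder> {

    // Derived query: WHERE orderStatus <> :orderStatus
    List<CatalogueOrder> findByOrderStatusNot(@Param("orderStatus") String orderStatus);
}

Spring Data REST would then expose it under something like /catalogueOrders/search/findByOrderStatusNot?orderStatus=submitted.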
I am working with a REST API.
Is there anything wrong with returning the list of model class objects directly to the user as the response,
or should I map those actual model classes to POJO classes before returning?
E.g.:
if the API is for getting all Users ("/Users"),
then is it good coding practice to return directly
return userRepository.findAll();
or should I convert it to a List<UserPOJO> before returning?
Or are there any good coding standards for this?
From my experience, it is usually better to map the Entities to equivalent POJO classes.
Here are a few reasons:
1) Most of the time you do not need all the data that is stored in an entity. You can map only the subset that is needed in the response.
2) From a security perspective, it is always good to have some sort of middle ground where you filter out sensitive data that should not actually be put in the response, or that should only be exposed to certain users, which you can decide during the mapping.
3) Hibernate objects are not plain objects; they are proxies. This may cause unnecessary lazy loading, for example of @OneToMany and @ManyToMany relations. You should be able to control that, and from my experience Jackson serializes everything it can reach unless you annotate it with @JsonIgnore.
If you are working with a very simple and not security-heavy app, I would stay with the Hibernate objects. Otherwise, which is most cases, I would go for the mapping.
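As an illustration, a minimal sketch with a hypothetical UserDto for the User entity, exposing only what the client needs:

// Hypothetical DTO: sensitive fields (password hash, lazy collections, ...)
// are simply not part of it.
public class UserDto {

    private Long id;
    private String username;

    public static UserDto from(User user) {
        UserDto dto = new UserDto();
        dto.id = user.getId();
        dto.username = user.getUsername();
        return dto;
    }

    public Long getId() { return id; }
    public String getUsername() { return username; }
}

The controller would then return something like userRepository.findAll().stream().map(UserDto::from).collect(Collectors.toList()) instead of the entities themselves (getId()/getUsername() on User are assumed here).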
I was generating POJOs from my database using different tools and noticed that some would generate collections as fields (with getters and setters) for one-to-many relationships, while others didn't.
Let's say I have an Order and Product table. Each order can have one or many products.
Collection<Product> list = new ArrayList<>();
list.add(product1);
list.add(product2);
Method 1:
Order order = new Order();
order.setDate(...);
orderDao.add(order);
orderDao.addProductBatch(list);
Method 2:
Order order = new Order();
order.setDate(...);
order.setProductCollection(list);
orderDao.add(order);
and in the add method, include an addProductBatch call.
Which method is preferred? Also, for some one-to-many relationships, adding multiple objects in a single transaction never occurs; in that case you wouldn't need some of these collections. Is this correct?
It depends on the implementation of the DAO...
In Method 2, you build your order and its products in the business model, then pass the complete and consistent order (order + list of products) to be saved by the DAO. The transaction handling is internal to the DAO.
In Method 1, you call the DAO twice: first with the order (without products), then again with the list of products related to the order. That means either the DAO is stateful and has some method to execute the transaction once you are done setting it up, or there are two transactions. In the latter case, the consistency of the DB can be broken (e.g., an order without any products).
Method 2 is certainly better, since it allows a stateless DAO and clean transaction management.
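A minimal sketch of what Method 2 can look like with Spring's declarative transactions (OrderDao and its private helpers are made up here; the point is the single unit of work):

import java.util.Collection;
import org.springframework.transaction.annotation.Transactional;

public class OrderDao {

    // The order and all of its products are saved as one unit of work;
    // if anything fails, the whole insert is rolled back.
    @Transactional
    public void add(Order order) {
        saveOrder(order);
        saveProductBatch(order, order.getProductCollection());
    }

    private void saveOrder(Order order) {
        // JDBC or ORM specific INSERT of the order row
    }

    private void saveProductBatch(Order order, Collection<Product> products) {
        // JDBC or ORM specific batch INSERT of the product rows
    }
}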
I would prefer a variant of the first method. We avoid passing data directly between entities and DAO services in the form of collections, and instead prefer creating methods such as add, get, and remove directly in the entity (I know this goes somewhat against the rules); all the methods the entity first needs during its life cycle are encapsulated in the entity.
I'm currently developing an application using three layers: UI, service, and DAO. At the DAO level I am using Spring's JdbcTemplate. So far so good, but I encountered a situation on which I'd like some more insight.
My DAOs had only simple CRUD methods at the beginning. At the service level I'm checking input values, delegating to the DAOs, and also dealing with transactions.
Now I need things more like the one below:
List<Book> getAllBooksByAuthorName(String name);
My question is where to put this one: in the DAO layer using SQL, or in the service layer by using the core CRUD methods and computing it in plain Java?
I tend to use SQL as much as possible instead of computing at the service layer. But now it seems that for every new method I also need to change the interface of the DAO and add a corresponding method to the interface of the service. Then the service becomes nothing more than a delegator and parameter checker. It doesn't feel right.
Your points are quite valid, but I don't quite see why you are in doubt. Generally, the DAO pattern reduces coupling between business logic and persistence logic.
public interface BooksDAO {
    public boolean save(Book book);
    public boolean update(Book book);
    public Book findByBookIsbn(int isbn);
    public boolean delete(Book book);

    // here is what you want
    public List<Book> getAllBooksByAuthorName(String name);
}
Now you can have different implementations of BooksDAO, like HibernateBooksDAOImpl or JdbcBooksDAOImpl. The DAO pattern also makes it easy to write isolated JUnit tests that run fast.
If you have complex queries, you can still use the DAO pattern. There is always a way to write complex queries on the implementation side, whether it is plain JDBC (SQL can be used), Spring's JdbcTemplate (still SQL), or Hibernate (use Criteria).
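To make that concrete, here is a rough sketch of the JdbcTemplate variant (table and column names are made up, and the other BooksDAO methods are left out):

import java.util.List;
import org.springframework.jdbc.core.BeanPropertyRowMapper;
import org.springframework.jdbc.core.JdbcTemplate;

public class JdbcBooksDAOImpl {

    private final JdbcTemplate jdbcTemplate;

    public JdbcBooksDAOImpl(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // The complex query lives here, hidden behind the DAO layer.
    public List<Book> getAllBooksByAuthorName(String name) {
        String sql = "SELECT b.* FROM books b JOIN authors a ON b.author_id = a.id WHERE a.name = ?";
        return jdbcTemplate.query(sql, new BeanPropertyRowMapper<>(Book.class), name);
    }
}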
see:
http://docs.jboss.org/hibernate/core/3.6/javadocs/org/hibernate/Criteria.html
For more information look:
http://javarevisited.blogspot.com/2013/01/data-access-object-dao-design-pattern-java-tutorial-example.html
http://www.oracle.com/technetwork/articles/entarch/spring-jdbc-dao-101284.html
That's how it should be, however. If the business logic is reduced to nothing except calling a DAO method, then you are lucky to have simple business logic.
It would obviously be extremely inefficient and completely unrealistic to have the service call BookDAO.findAll() and filter the giant list of books returned by the DAO. SQL is the right tool for the job.
Note that the days when mocking was only possible with interfaces are past. Using an interface to define your DAO methods isn't really necessary anymore.
For example, you could use the Entity-Control-Boundary pattern.
Your package structure will look like the following:
Under the namespace of your application you could introduce a package called "business"; in that package there can be packages named after the business responsibilities, and each of these is separated into "entity", "control" and "boundary".
com.example.myapplication.business.project.entity -> if you are using JPA, all your entities can be stored in this package; it also contains DTOs.
com.example.myapplication.business.project.control -> in this package refactored services can be stored; for example, if DAO code is needed in more than one boundary, it can be refactored into this package.
com.example.myapplication.business.project.boundary -> this package contains all services that can be seen by the client (for example your web page).
In the "presentation" package your UI controllers can be stored, and the UI controllers should only access the services stored in the boundary package.
com.example.myapplication.presentation.project
By using this pattern you avoid the use of delegators, because the services stored in the boundary package can also contain SQL-specific stuff, and all services and entities are in the package they belong to.
The pattern can also be used outside of JEE. Adam Bien has revolutionised this pattern in the JEE architecture, and I'm also using it in my own projects. Here is an example -> http://www.youtube.com/watch?v=JWcoiXNoKxk#t=2380
The methods of your boundary could look like the following:
public interface ProjectService {
    public Project createProject(Project project);
    public Project getProjectById(String projectId);
    public List<Project> getProjectList(ListConfig config); // where ListConfig is a class containing information about how the list should be sorted, optional pagination information, etc., so that the interface does not have to change every time you need a new parameter
    public Project updateProject(Project project);
    public void deleteProject(String projectId);
    public Project addFeature(Project project, Feature feature);
}
@ayan ahmedov: Sorry, the first time I tried to answer your question I unfortunately edited your question, and my answer ended up in the content area of your question. I've reverted the accidental changes.
Correct me if anything is wrong.
Now, when we use Spring DAO for ORM templates together with the @Transactional annotation,
we do not have control over the transaction and/or session; they are managed for us when the method is called externally, not within the method.
Lazy loading saves resources: fewer queries to the DB, and less memory needed to keep all the fetched collections in the application's memory.
So if lazy=false, then everything is fetched, including all associated collections, which is not efficient if there are 10,000 records in a linked set.
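For instance, a lazy one-to-many mapping on a hypothetical User entity looks roughly like this; no query for the orders is issued until the collection is actually touched inside an open session:

import java.util.HashSet;
import java.util.Set;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.OneToMany;

@Entity
public class User {

    @Id
    @GeneratedValue
    private Long id;

    // LAZY is the default for @OneToMany; spelled out here for clarity.
    @OneToMany(mappedBy = "user", fetch = FetchType.LAZY)
    private Set<Order> orders = new HashSet<>();
}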
Now, I have a method in a DAO class that is supposed to return me a User object.
It has collections that represent linked tables of the database.
I need to get an object by id and then query its collections.
A Hibernate "failed to lazily initialize a collection" exception occurs when I try to access a linked collection of the object this DAO method returns.
Please explain: what is the workaround here?
Update: All right, let me ask you this. The DAO is an abstract layer, so a method "getUserById(Integer id)" is supposed to return an object.
What if in some cases I need certain linked collections of the User object, and in other situations I need other collections?
Are there only two ways:
1) lazy loading = false
2) create different methods: getUserByIdWithTheseCollections(), getUserByIdWithOtherCollections() and inside those methods use your approach?
I mean are there only 2 ways and nothing better?
Update 2: Please explain, what would the explicit use of a SessionFactory give me?
How does it look in practice? We create an instance of the DAO object, then inject it with a SessionFactory; would this mean that two consecutive method calls to the DAO run within the same transaction?
It seems to me that, either way, the DAO is detached from the classes that make use of it!
The logic and transactions are encapsulated within the DAO, right?
You can access the linked collection within the transaction, so that it is loaded while you're still inside the transaction:
User user = sessionFactory.getCurrentSession().get(User.class, userId);
user.getLinkedCollection().size();
return user;
As BalusC has pointed out, you can use Hibernate.initialize() instead of size(). That's a lot cleaner.
Then when you return such an entity, the lazy field is already initialized.
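The same method with Hibernate.initialize() would then look roughly like this (getLinkedCollection() standing in for whatever lazy association you need):

User user = sessionFactory.getCurrentSession().get(User.class, userId);
Hibernate.initialize(user.getLinkedCollection()); // load the lazy collection while the session is still open
return user;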
Replying to your PS: is using transactions at the service level (rather than the DAO level) feasible? It seems to be, as doing each DAO call in a separate transaction seems wasteful (and may be incorrect).
I find that it's best to put @Transactional at the service layer rather than the DAO layer. Otherwise, all your DAO calls run in separate Hibernate sessions, so all that object-equality stuff won't work.
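A minimal sketch of that layering, with a hypothetical UserService and UserDao, and assuming User has a lazy orders collection:

import org.hibernate.Hibernate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class UserService {

    private final UserDao userDao;

    public UserService(UserDao userDao) {
        this.userDao = userDao;
    }

    // One transaction (and one Hibernate session) for the whole use case,
    // so entities loaded here stay attached until the method returns.
    @Transactional(readOnly = true)
    public User getUserWithOrders(Long id) {
        User user = userDao.getUserById(id);
        Hibernate.initialize(user.getOrders()); // touch the lazy collection while the session is open
        return user;
    }
}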
In my opinion, the best way to solve this problem is to design the application around a session-per-request model. Then, if you have an object obtained from the DAO, as long as your Open Session in View (OSIV) setup is in effect you can use the object safely anywhere in the application, even in views, without worrying about this. This is probably a better solution than those proposed, because:
Hibernate.initialize() or size() is a very artificial workaround: what if you want a User with a different collection initialized; would you write yet another method for getting the user?
The service-layer transactional model is OK, but the same problem arises when you want to take an object obtained from the service layer and use it in a controller or view.
You could do something like the following:
public User getByUserId(Long id, String... fetch) {
    Criteria criteria = createCriteria(); // helper that creates a Criteria for the User entity
    if (fetch != null) {
        for (String fieldName : fetch) {
            criteria.setFetchMode(fieldName, FetchMode.JOIN); // fetch these associations eagerly
        }
    }
    return (User) criteria.add(Restrictions.eq("id", id)).uniqueResult();
}
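Calling code can then decide per call which associations it needs, for example (the association names here are just illustrative):

User plainUser = userDao.getByUserId(42L); // no collections fetched eagerly
User userWithOrders = userDao.getByUserId(42L, "orders", "roles"); // these collections fetched via join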