In my Spring web app I'm using a generic dao class:
public abstract class GenericDaoImpl<T> implements GenericDao<T> {

    @Override
    public T create(final T t) {
        this.getEntityManager().persist(t);
        return t;
    }

    @Override
    public void delete(final Object id) {
        this.getEntityManager().remove(
                this.getEntityManager().getReference(getEntityType(), id));
    }

    @Override
    public T find(final Object id) {
        return (T) this.getEntityManager().find(getEntityType(), id);
    }

    @Override
    public T update(final T t) {
        return this.getEntityManager().merge(t);
    }
}
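getEntityManager() and getEntityType() are omitted above for brevity; roughly (a simplified sketch using java.lang.reflect.ParameterizedType, the real helper may differ), getEntityType() resolves the entity class from the type argument of the concrete subclass:

    @SuppressWarnings("unchecked")
    protected Class<T> getEntityType() {
        // e.g. GruppoDaoImpl extends GenericDaoImpl<Gruppo>, so the type argument is Gruppo
        ParameterizedType pt = (ParameterizedType) getClass().getGenericSuperclass();
        return (Class<T>) pt.getActualTypeArguments()[0];
    }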
I implement this class for every entity in my model and it works perfectly. For example:
@Repository
public class GruppoDaoImpl extends GenericDaoImpl<Gruppo> implements GruppoDao {
}
I use these DAO classes in my service layer. I have a service class for every entity in my model, but the methods of most of these classes are the same, so I tried to create a generic service class that I can extend in the same way I do for the generic DAO:
public abstract class GenericAdminServiceImpl<ENTITY extends AbstractEntity, DTO extends AbstractDto>
        implements GenericAdminService<ENTITY, DTO> {

    private GenericDao<ENTITY> dao;
    private Class<ENTITY> entityClass;
    private Class<DTO> dtoClass;

    @SuppressWarnings({ "unchecked", "rawtypes" })
    protected GenericAdminServiceImpl(GenericDao<ENTITY> dao) {
        this.dao = dao;
        // resolve the actual type arguments declared by the concrete subclass
        Type t = getClass().getGenericSuperclass();
        ParameterizedType pt = (ParameterizedType) t;
        this.entityClass = (Class) pt.getActualTypeArguments()[0];
        this.dtoClass = (Class) pt.getActualTypeArguments()[1];
    }

    public DTO getById(Object id) {
        DTO dto = null;
        ENTITY entity = dao.find(id);
        if (entity != null) {
            try {
                dto = dtoClass.newInstance();
                initDto(entity, dto);
            } catch (Exception e) {
            }
        }
        return dto;
    }

    public void create(DTO dto) throws ServiceOperationException {
        ENTITY entity;
        try {
            entity = entityClass.newInstance();
            initEntity(dto, entity);
            Date dt = new Date();
            entity.setDataUltimoAggiornamento(dt);
            entity.setUtenteUltimoAggiornamento(dto.getLoggedUser());
            entity.setDataInserimento(dt);
            entity.setUtenteInserimento(dto.getLoggedUser());
            dao.create(entity);
        } catch (Exception e) {
            throw new ServiceOperationException("impossibile creare entity ["
                    + entityClass.getSimpleName() + "]", e);
        }
    }

    public void update(DTO dto) throws ServiceOperationException {
        ENTITY entity = dao.find(dto.getId());
        if (!entityExists(entity)) {
            throw new ServiceOperationException("entity non esistente ["
                    + entityClass.getSimpleName() + "#" + dto.getId() + "]");
        }
        initEntity(dto, entity);
        Date dt = new Date();
        entity.setDataUltimoAggiornamento(dt);
        entity.setUtenteUltimoAggiornamento(dto.getLoggedUser());
        dao.update(entity);
    }

    public void delete(Object id) throws ServiceOperationException {
        try {
            dao.delete((int) id);
        } catch (Exception e) {
            throw new ServiceOperationException(
                    "impossibile eliminare entity ["
                            + entityClass.getSimpleName() + "#" + id + "]", e); // TODO
        }
    }

    protected abstract void initDto(ENTITY entity, DTO outDto);

    protected abstract void initEntity(DTO dto, ENTITY outEntity);

    protected abstract boolean entityExists(ENTITY entity);
}
By extending this class I only have to implement the entity-specific parts, leaving all the common logic in the abstract/generic class.
The problem is that with the generic service, merge, persist and delete don't work. Only the select seems to work, and I cannot understand why...
When I run debug mode in Eclipse everything looks correct: a consistent entity is passed to the merge/persist methods, so why don't they work? Can you help me?
UPDATE #1
This is an example of implementation:
@Service
@Transactional(propagation = Propagation.REQUIRES_NEW)
public class GruppoServiceImplG extends
        GenericAdminServiceImpl<Gruppo, GruppoDto> implements GruppoServiceG {

    @Autowired
    protected GruppoServiceImplG(GruppoDao gruppoDao) {
        super(gruppoDao);
    }

    @Override
    protected void initDto(Gruppo entity, GruppoDto outDto) {
        outDto.setId(entity.getId());
        outDto.setNome(entity.getNome());
        outDto.setDescrizione(entity.getDescrizione());
        outDto.setDataInizioValidita(entity.getDataInizioValidita());
        outDto.setDataFineValidita(entity.getDataFineValidita());
    }

    @Override
    protected void initEntity(GruppoDto dto, Gruppo outEntity) {
        outEntity.setId(dto.getId());
        outEntity.setNome(dto.getNome());
        outEntity.setDescrizione(dto.getDescrizione());
        outEntity.setDataInizioValidita(dto.getDataInizioValidita());
        outEntity.setDataFineValidita(dto.getDataFineValidita());
    }

    @Override
    protected boolean entityExists(Gruppo entity) {
        return entity != null && entity.getId() > 0;
    }
}
UPDATE #2
Following Łukasz L.'s suggestion, I added a flush() to all my CRUD methods. Now I get this exception: javax.persistence.TransactionRequiredException: no transaction is in progress. What's wrong with my transaction declaration? It works fine with the non-generic services...
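For example, in create() the change looks roughly like this (the other CRUD methods got the same flush() call; the exception is thrown on the flush() line):

    @Override
    public T create(final T t) {
        this.getEntityManager().persist(t);
        this.getEntityManager().flush(); // javax.persistence.TransactionRequiredException is thrown here
        return t;
    }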
If you read up on Spring and Hibernate flush behaviour, you'll see that committing your transaction does not necessarily mean the EntityManager has saved all changes. Spring and JPA (Hibernate & Co.) are designed to work together quite nicely (from the Spring side), but you must still make sure that your entity manager writes all its queries to the database before the transaction is committed.
The problem: JPA providers like to cache, which means they tend to avoid issuing queries. For a SELECT they have no choice, they must fetch the data (unless that portion of data was already fetched, as when getting a single entity by ID). For INSERTs and UPDATEs, however, they CAN cache: create, merge or remove will usually not issue a query to the RDBMS until you call flush() on the EntityManager.
If you leave the transactional block without calling flush and the entity manager is delaying operations, you will commit a transaction in which the modifying queries were never issued!
Just make sure to call EntityManager.flush() at least at the end of each transactional method. You can also call it after every DML operation; it's your choice (I prefer that, because it gives me full control over the order in which the DML queries are issued by JPA; if you rely heavily on DB constraints/triggers, that can be essential).
@Transactional
public void myTransactionalMethod() {
    getEntityManager().persist(a); // I've made some JPA change that is not issued to the DB yet
    ...
    // I'm doing something more
    ...
    getEntityManager().flush(); // the last moment to flush; after this instruction I leave the transactional context
}
Following Łukasz L.'s suggestion, I discovered the actual issue in my generic class.
The transaction declaration was wrong: I had put @Transactional(propagation = Propagation.REQUIRES_NEW) only on the concrete service class, so the CRUD methods inherited from the (non-annotated) abstract superclass were not running in a transaction.
I solved it this way:
@Transactional(propagation = Propagation.REQUIRES_NEW)
public abstract class GenericAdminServiceImpl<ENTITY extends AbstractEntity, DTO extends AbstractDto>
        implements GenericAdminService<ENTITY, DTO> {
    // ...
}
And in the concrete implementation:
@Service
@Transactional
public class GruppoServiceImplG extends
        GenericAdminServiceImpl<Gruppo, GruppoDto> implements GruppoServiceG {
    // ...
}
Related
While implementing OData V4 using Olingo in Java, I am getting a NullPointerException.
Here is a detailed description:
I am trying to implement OData V4 using Olingo in my Spring Boot application, following the official documentation: https://olingo.apache.org/doc/odata4/tutorials/readep/tutorial_readep.html
Rather than feeding in static data manually, I am using a database to provide the data.
As per the documentation, I have created a class Storage.java to simulate the data layer.
public class Storage {

    @Autowired
    CompanyService cservice;

    private List<Entity> companyentityList;

    public Storage() throws Exception {
        companyentityList = new ArrayList<Entity>();
        initSampleData();
    }

    // PUBLIC FACADE

    public EntityCollection readEntitySetData(EdmEntitySet edmEntitySet) throws NullPointerException {
        // actually, this is only required if we have more than one Entity Set
        if (edmEntitySet.getName().equals(DemoEdmProvider.ES_COMPANY_RECORDS)) {
            return getCompaniesData();
        }
        return null;
    }

    public Entity readEntityData(EdmEntitySet edmEntitySet, List<UriParameter> keyParams) throws Exception {
        EdmEntityType edmEntityType = edmEntitySet.getEntityType();
        // actually, this is only required if we have more than one Entity Type
        if (edmEntityType.getName().equals(DemoEdmProvider.ET_COMPANY)) {
            return getCompany(edmEntityType, keyParams);
        }
        return null;
    }

    // INTERNAL

    public EntityCollection getCompaniesData() throws NullPointerException {
        EntityCollection retEntitySet = new EntityCollection();
        for (Entity companyEntity : this.companyentityList) {
            retEntitySet.getEntities().add(companyEntity);
        }
        return retEntitySet;
    }

    public Entity getCompany(EdmEntityType edmEntityType, List<UriParameter> keyParams) throws Exception {
        // the list of entities at runtime
        EntityCollection entitySet = getCompaniesData();
        // generic approach to find the requested entity
        Entity requestedEntity = Util.findEntity(edmEntityType, entitySet, keyParams);
        if (requestedEntity == null) {
            // this variable is null if our data doesn't contain an entity for the requested key
            // throw a suitable exception
            throw new ODataApplicationException("Entity for requested key doesn't exist",
                    HttpStatusCode.NOT_FOUND.getStatusCode(), Locale.ENGLISH);
        }
        return requestedEntity;
    }

    // Helper

    public void initSampleData() {
        try {
            getData();
        } catch (NullPointerException e) {
            System.out.print("<<<<<<<---------------- Database unable to provide data ------------>>>>>>");
        }
    }

    public List<Company> getAllcompanyList() {
        Collection<Company> checkingdata = new ArrayList<>();
        try {
            checkingdata = cservice.getDetails();
        } catch (NullPointerException e) {
            System.out.print("<<<<<<<---------------- Database unable to provide data ------------>>>>>>");
        }
        return (List<Company>) checkingdata;
    }

    // final EntityCollection entitySet = new EntityCollection();
    // loop over List<Company>, converting each instance of Company into an Olingo Entity
    public EntityCollection makeEntityCollection(List<Company> companyList) {
        EntityCollection entitySet = new EntityCollection();
        for (Company cmp : companyList) {
            entitySet.getEntities().add(createEntity(cmp));
        }
        return entitySet;
    }

    // convert an instance of a Company object into an Olingo Entity
    public Entity createEntity(Company cmp) {
        final Entity tmpEntity = new Entity().addProperty(new Property(null, "ID", ValueType.PRIMITIVE, cmp.getId()))
                .addProperty(new Property(null, "Name", ValueType.PRIMITIVE, cmp.getContent()));
        companyentityList.add(tmpEntity);
        return tmpEntity;
    }

    public void getData() throws NullPointerException {
        // ... code to get data from the database into a List and call makeEntityCollection
        List<Company> companyList = getAllcompanyList();
        makeEntityCollection(companyList);
        // System.out.println(companyList.size());
    }
}
In the above code I have @Autowired the CompanyService interface reference cservice.
Here is the CompanyService interface:
public interface CompanyService {
    Collection<Company> getDetails() throws Exception;
}
The CompanyService interface is implemented by CompanyServiceImplementation:
@Service
public class CompanyServiceImplementation implements CompanyService {

    @Autowired
    private CompanyDAOImplementation cDAOWrapper;

    public Collection<Company> getDetails() throws Exception {
        return cDAOWrapper.findAll();
    }
}
In the above class, the findAll() method is returning the data from the database.
So the problem is that the CompanyService reference cservice, which is @Autowired in the Storage.java class, is null; it is not getting initialised, and hence I am getting a NullPointerException when calling cservice.getDetails().
Please let me know what is wrong with my code. Thanks in advance.
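Note that @Autowired is only processed for beans that Spring itself creates and manages; if Storage is instantiated manually (e.g. with new Storage()), cservice will always stay null, which matches the NullPointerException described above. Purely as an illustration (this is an assumption about the missing wiring, not code from the project), a Spring-managed Storage with constructor injection would look roughly like this:

    import java.util.ArrayList;
    import java.util.List;
    import org.apache.olingo.commons.api.data.Entity;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Component;

    @Component
    public class Storage {

        private final CompanyService cservice;                       // supplied by the Spring container
        private final List<Entity> companyentityList = new ArrayList<Entity>();

        @Autowired
        public Storage(CompanyService cservice) throws Exception {
            this.cservice = cservice;                                 // non-null before initSampleData() runs
            initSampleData();
        }

        // ... remaining methods as shown above ...
    }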
I have these two entities:
@Entity
@Immutable
@Cacheable
@Cache(region = "dosefrequency", usage = CacheConcurrencyStrategy.READ_ONLY)
public class DoseFrequency {
    .....
}

@Entity
public class PrescriptionDrug {
    .....

    @ManyToOne(optional = false)
    @JoinColumn(name = "doseFrequency")
    public DoseFrequency getDoseFrequency() {
        return doseFrequency;
    }
}
DoseFrequency is a read-only entity, and each PrescriptionDrug has one DoseFrequency associated with it. I want to ensure that each time I load one or many PrescriptionDrug instances, the DoseFrequency instances are not duplicated.
I know that the DoseFrequency instances will be cached in Hibernate's first-level cache, but the loads happen in several sessions (it is a web app). I tried the second-level cache, but that cache doesn't store instances; it only stores the serialized state of the entity.
I got this behaviour working using a Tuplizer on DoseFrequency, but I don't know whether there is another way to achieve it.
@Entity
@Immutable
@Cacheable
@Cache(region = "dosefrequency", usage = CacheConcurrencyStrategy.READ_ONLY)
@Tuplizer(impl = DoseFrequencyTuplizer.class)
public class DoseFrequency {
    ....
}

public class DoseFrequencyTuplizer extends PojoEntityTuplizer {

    public DoseFrequencyTuplizer(EntityMetamodel entityMetamodel, PersistentClass mappedEntity) {
        super(entityMetamodel, mappedEntity);
    }

    @Override
    protected Instantiator buildInstantiator(EntityMetamodel entityMetamodel, PersistentClass persistentClass) {
        return new DoseFrequencyInstantiator(DoseFrequency.class.getName(), entityMetamodel, persistentClass, null);
    }
}

public class DoseFrequencyInstantiator implements Instantiator {

    private final Class targetClass;
    protected PojoEntityInstantiator pojoEntityInstantiator;

    public DoseFrequencyInstantiator(String targetClassName, EntityMetamodel entityMetamodel,
            PersistentClass persistentClass, ReflectionOptimizer.InstantiationOptimizer optimizer) {
        try {
            this.targetClass = Class.forName(targetClassName);
        } catch (ClassNotFoundException e) {
            throw new HibernateException(e);
        }
        pojoEntityInstantiator = new PojoEntityInstantiator(entityMetamodel, persistentClass, optimizer);
    }

    @Override
    public Object instantiate(Serializable id) {
        DoseFrequency df = MedereEMRCache.instance.findDoseFrequencyByID(Long.valueOf(id.toString()));
        if (df == null) {
            return pojoEntityInstantiator.instantiate(id);
        }
        return df;
    }

    @Override
    public Object instantiate() {
        return instantiate(null);
    }

    @Override
    public boolean isInstance(Object object) {
        try {
            return targetClass.isInstance(object);
        } catch (Throwable t) {
            throw new HibernateException("could not get handle to entity as interface : " + t);
        }
    }
}
I'm aware that the instances will be shared among all the threads of the application, but they are treated as read-only, so they should not be modified.
Is this approach right?
Thank you
You can also use a LoadEventListener to always serve the same instance (a sketch is shown below). Nevertheless, this functionality is not really needed: since the entity is immutable, even if you have multiple copies of it, each copy is still immutable.
Moreover, even if you implement a singleton pattern, it will only be enforced per JVM, so I don't see why you would want to implement this requirement.
Entities are meant to be treated as singletons per EntityManager only.
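For reference, a rough sketch of the LoadEventListener approach mentioned above (untested; registration through an org.hibernate.integrator.spi.Integrator / EventListenerRegistry with EventType.LOAD is still needed, and the interplay with the default load listener should be verified):

    import org.hibernate.HibernateException;
    import org.hibernate.event.spi.LoadEvent;
    import org.hibernate.event.spi.LoadEventListener;

    public class DoseFrequencyLoadListener implements LoadEventListener {

        @Override
        public void onLoad(LoadEvent event, LoadType loadType) throws HibernateException {
            if (DoseFrequency.class.getName().equals(event.getEntityClassName())) {
                // MedereEMRCache is the application cache already used by the Tuplizer above
                DoseFrequency cached = MedereEMRCache.instance
                        .findDoseFrequencyByID(Long.valueOf(event.getEntityId().toString()));
                if (cached != null) {
                    event.setResult(cached); // hand back the shared, read-only instance
                }
            }
        }
    }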
I am trying to add some custom functionality to a Spring Data repository.
Using this as my starting point http://docs.spring.io/spring-data/jpa/docs/current/reference/html/#repositories.single-repository-behaviour I have created the following code:
public interface TableLock<T> {
    void checkout(T entity);
    void checkin(T entity, boolean cmpltBatch);
}

public interface BatchTableLock extends TableLock<MyEntity> {
}

public class BatchTableLockImpl implements BatchTableLock {

    private static final Logger logger = LoggerFactory.getLogger(BatchTableLockImpl.class);

    @PersistenceContext(unitName = "mysql")
    private EntityManager em;

    @Override
    public void checkout(MyEntity batch) {
        Long id = batch.getMyEntityId();
        try {
            MyEntity p = em.find(MyEntity.class, id, LockModeType.PESSIMISTIC_WRITE);
            if (p == null) {
                logger.error("checkout : MyEntity id {} must be valid", id);
                throw new PessimisticLockException();
            }
            if (myCondition is true) {
                return;
            }
        } catch (LockTimeoutException | PessimisticLockException e) {
            logger.error("checkout : Unable to get write lock on MyEntity id {}", id, e);
        }
        throw new PessimisticLockException();
    }

    @Override
    public void checkin(MyEntity batch, boolean cmplt) {
        Long id = batch.getMyEntityId();
        try {
            MyEntity p = em.find(MyEntity.class, id, LockModeType.PESSIMISTIC_WRITE);
            if (p == null) {
                logger.error("complete : MyEntity id {} must be valid", id);
                return;
            }
            if (this is true) {
                if (cmplt) {
                    yep;
                } else {
                    nope;
                }
            } else if (cmplt) {
                logger.error("complete: Unable to complete MyEntity {} with status.", id);
            }
        } catch (LockTimeoutException | PessimisticLockException e) {
            logger.error("complete : Unable to get write lock on MyEntity id {}", id, e);
        }
    }
}

@Repository
public interface MyDao extends CrudRepository<MyEntity, BigInteger>, BatchTableLock {
    ... My queries ...
}
Unfortunately I am getting the following error:
org.springframework.data.mapping.PropertyReferenceException: No property checkin found for type MyEntity!
If I'm not mistaken, this means that Spring is trying to generate a query based on the method checkin and can't find a field in MyEntity with the name checkin, which is correct: there is no such field. How do I make it stop doing this? Based on the link above, I don't think it should be trying to generate a query for this method, but it seems to be doing it anyway. I may be missing something (that is usually the case), but I don't see what it is.
As stated in the reference documentation section you linked to, you need a MyDaoImpl class that implements the custom methods. I guess the easiest way is to either rename BatchTableLockImpl to that or just create an empty MyDaoImpl extending that class.
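For example, the second option can be as small as this (assuming the default Impl suffix for custom repository implementations):

    // The name <repository interface> + "Impl" tells Spring Data that checkout(..) and
    // checkin(..) are custom implementation methods, not properties to derive queries from.
    public class MyDaoImpl extends BatchTableLockImpl {
    }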
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {HibernateConfigTest.class})
@Transactional
@Sql(scripts = {"api_routes.sql",
        "profile.sql",
        "status.sql",
        "user.sql",
        "game_token.sql",
        "game.sql",
        "message.sql"},
        config = @SqlConfig(transactionMode = ISOLATED),
        executionPhase = ExecutionPhase.BEFORE_TEST_METHOD)
@Sql(scripts = "delete_data.sql",
        executionPhase = ExecutionPhase.AFTER_TEST_METHOD)
public class GameDaoTest {

    @Autowired
    private GameDao gameDao;

    @Test
    public void getGetRecentGames() {
        Game game = null;
        for (int i = 0; i < 1000; i++) {
            game = new Game(i + 1000);
            game.setStartedAt(DateUtils.getCurrentUTCDate());
            gameDao.save("game", game);
        }
        List<Game> recentGames = gameDao.getRecentGames(1000);
        assertNotNull(recentGames);
        assertEquals(1000, recentGames.size());
    }
}
When I get to the line List<Game> recentGames = gameDao.getRecentGames(1000);, Hibernate prints out all of the insert statements. Unfortunately, when the games are retrieved, none of the ones I inserted are returned. Is there any way to retrieve those games? Maybe a better question is: how do I put those inserts in their own transaction so they are persisted for the subsequent methods?
Here is the AbstractDao that GameDao extends:
public class AbstractDao {

    @PersistenceContext
    protected EntityManager entityManager;

    protected Session getSession() {
        return entityManager.unwrap(Session.class);
    }

    public void save(Object entity) {
        getSession().save(entity);
    }

    public void save(String entityName, Object entity) {
        getSession().save(entityName, entity);
    }

    public void persist(Object entity) {
        getSession().persist(entity);
    }
}
The persist method throws an exception for some detached entity reason that I am not familiar with.
Try calling flush on your session before calling getRecentGames. If that method is executing a custom query, as opposed to using one of Hibernate's session methods, it's going directly against the database -- but since your save hasn't yet been synced with the underlying database, the data isn't there. You don't actually need to do a true commit as long as Hibernate can see the data.
for (int i = 0; i < 1000; i++) {
    game = new Game(i + 1000);
    game.setStartedAt(DateUtils.getCurrentUTCDate());
    gameDao.save("game", game);
}
gameDao.flush(); // calls getSession().flush()
List<Game> recentGames = gameDao.getRecentGames(1000);
// call assert methods as needed
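The flush() used above is not part of the AbstractDao shown in the question; a minimal delegating version (an assumption based on the comment in the snippet) would be:

    // Added to AbstractDao: pushes pending SQL to the database without
    // committing the surrounding test transaction.
    public void flush() {
        getSession().flush();
    }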
I have a project that has been running on GWT 2.4 for some time (and 2.0 etc. before that). When I switch it to GWT 2.5 or 2.6, child objects attached to my main entity no longer save changes made to them. I don't change any code, but switching between 2.4 and 2.6 breaks it. I believe the changes aren't being sent to the server: watching the POST data, it seems incomplete, missing the changes that I see being sent on 2.4.
Are there changes to RequestFactory from 2.4 to 2.5 that would cause this? Or something in the way I built it that was not proper design? I appreciate any feedback!
Here's a sample retrieve/update pattern:
// Retrieve the object from the server
MyEntityRequest request = App.getRequestFactory().myEntityRequest();
MyEntityProxy myEntity;

request.get(id).with("child").fire(new Receiver<MyEntityProxy>() {
    @Override
    public void onSuccess(MyEntityProxy response) {
        request = App.getRequestFactory().myEntityRequest();
        myEntity = response;
    }
});

// Edits made client side..

// Save the updated object
myEntity = request.edit(myEntity);
myEntity.childEntity.setName("new value");

request.save(myEntity).fire(new Receiver<Void>() {
    @Override
    public void onSuccess(Void response) { }
});
Example Request interface:
@Service(value = MyEntityDao.class, locator = DaoLocator.class)
public interface MyEntityRequest extends RequestContext {
    Request<Void> save(MyEntityProxy entity);
}
Domain objects:
@ProxyFor(value = MyEntity.class, locator = DomainObjectLocator.class)
public interface MyEntityProxy extends EntityProxy {
    Integer getId();
    Set<ChildProxy> getChildren();
    void setChildren(Set<ChildProxy> children);
}

@ProxyFor(value = Child.class, locator = DomainObjectLocator.class)
public interface ChildProxy extends EntityProxy {
    Integer getId();
    String getName();
    void setName(String name);
}
Server object:
@Entity
@Table(name = "MyEntity")
@BatchSize(size = 25)
public class MyEntity extends DomainObject {

    @OneToMany(mappedBy = "myEntity")
    @BatchSize(size = 25)
    @Cascade(CascadeType.ALL)
    private Set<Child> children;
}
Server persist:
public static Void save(MyEntity myEntity) {
    Session session = HibernateUtil.getSessionFactory().getCurrentSession();
    session.beginTransaction();
    try {
        myEntity = (MyEntity) session.merge(myEntity);
    } catch (Exception e) {
        e.printStackTrace();
        session.getTransaction().rollback();
    } finally {
        session.getTransaction().commit();
    }
    return null; // the method is declared to return Void
}
It probably has something to do with https://code.google.com/p/google-web-toolkit/issues/detail?id=7827
TL;DR: yes, there were some changes in 2.5.