While implementing OData V4 using Olingo in Java, I am getting a NullPointerException.
Here is a detailed description --
I am trying to implement OData V4 using Olingo in my Spring Boot application, following the official documentation: https://olingo.apache.org/doc/odata4/tutorials/readep/tutorial_readep.html
Rather than feeding in static data manually, I am using a database to provide the data.
As per the documentation, I have created a class Storage.java to simulate the data layer.
public class Storage {
@Autowired
CompanyService cservice;
private List<Entity> companyentityList;
public Storage() throws Exception {
companyentityList = new ArrayList<Entity>();
initSampleData();
}
// PUBLIC FACADE
public EntityCollection readEntitySetData(EdmEntitySet edmEntitySet) throws NullPointerException {
// actually, this is only required if we have more than one Entity Sets
if (edmEntitySet.getName().equals(DemoEdmProvider.ES_COMPANY_RECORDS)) {
return getCompaniesData();
}
return null;
}
public Entity readEntityData(EdmEntitySet edmEntitySet, List<UriParameter> keyParams) throws Exception {
EdmEntityType edmEntityType = edmEntitySet.getEntityType();
// actually, this is only required if we have more than one Entity Type
if (edmEntityType.getName().equals(DemoEdmProvider.ET_COMPANY)) {
return getCompany(edmEntityType, keyParams);
}
return null;
}
// INTERNAL
public EntityCollection getCompaniesData() throws NullPointerException {
EntityCollection retEntitySet = new EntityCollection();
for (Entity companyEntity : this.companyentityList) {
retEntitySet.getEntities().add(companyEntity);
}
return retEntitySet;
}
public Entity getCompany(EdmEntityType edmEntityType, List<UriParameter> keyParams) throws Exception {
// the list of entities at runtime
EntityCollection entitySet = getCompaniesData();
// generic approach to find the requested entity
Entity requestedEntity = Util.findEntity(edmEntityType, entitySet, keyParams);
if (requestedEntity == null) {
// this variable is null if our data doesn't contain an entity for the requested
// key
// Throw suitable exception
throw new ODataApplicationException("Entity for requested key doesn't exist",
HttpStatusCode.NOT_FOUND.getStatusCode(), Locale.ENGLISH);
}
return requestedEntity;
}
// Helper
public void initSampleData() {
try {
getData();
} catch (NullPointerException e) {
System.out.print("<<<<<<<---------------- Database unable to provide data ------------>>>>>>");
}
}
public List<Company> getAllcompanyList() {
Collection<Company> checkingdata = new ArrayList<>();
try {
checkingdata = cservice.getDetails();
} catch (NullPointerException e) {
System.out.print("<<<<<<<---------------- Database unable to provide data ------------>>>>>>");
}
return (List<Company>) checkingdata;
}
// final EntityCollection entitySet = new EntityCollection();
// loop over List<Company> converting each instance of Company into an Olingo Entity
public EntityCollection makeEntityCollection(List<Company> companyList) {
EntityCollection entitySet = new EntityCollection();
for (Company cmp : companyList) {
entitySet.getEntities().add(createEntity(cmp));
}
return entitySet;
}
// Convert instance of cmp object into an Olingo Entity
public Entity createEntity(Company cmp) {
final Entity tmpEntity = new Entity().addProperty(new Property(null, "ID", ValueType.PRIMITIVE, cmp.getId()))
.addProperty(new Property(null, "Name", ValueType.PRIMITIVE, cmp.getContent()));
companyentityList.add(tmpEntity);
return tmpEntity;
}
public void getData() throws NullPointerException {
// ... code to get Data from the DataBase in List and calling makeEntityCollection
List<Company> companyList = getAllcompanyList();
makeEntityCollection(companyList);
// System.out.println(companyList.size());
}
}
In the above code, I @Autowired the CompanyService interface reference cservice.
Here is the CompanyService interface:
public interface CompanyService {
Collection<Company> getDetails() throws Exception;
}
The CompanyService interface is implemented by CompanyServiceImplementation:
@Service
public class CompanyServiceImplementation implements CompanyService {
@Autowired
private CompanyDAOImplementation cDAOWrapper;
public Collection<Company> getDetails() throws Exception {
return cDAOWrapper.findAll();
}
}
In the above class, the findAll() method is returning the data from the database.
So the problem is: the CompanyService reference cservice, which is @Autowired in the Storage.java class, is null. It never gets initialised, and hence I get a NullPointerException when calling cservice.getDetails().
Please let me know what is wrong with my Code. Thanks in advance.
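For what it's worth, the symptom can be reproduced in plain Java with no Spring at all (the class and method names below are made-up stand-ins for Storage and CompanyService): a dependency field that nobody assigns stays null, which is exactly what happens when a class with an @Autowired field is instantiated with `new` instead of being created and injected by the Spring container.

```java
import java.util.List;

// Stand-in for CompanyService / CompanyServiceImplementation.
class CompanyServiceStub {
    List<String> getDetails() {
        return List.of("ACME", "Globex");
    }
}

// Stand-in for Storage: the field is never assigned unless someone injects it.
class StorageLike {
    CompanyServiceStub cservice; // would be @Autowired in the real Storage

    List<String> getAllCompanyList() {
        return cservice.getDetails(); // NullPointerException if never injected
    }
}

public class InjectionSketch {
    public static void main(String[] args) {
        StorageLike storage = new StorageLike(); // `new` bypasses the container
        try {
            storage.getAllCompanyList();
        } catch (NullPointerException e) {
            System.out.println("cservice was never injected");
        }
        // Simulate what the container would have done: assign the dependency.
        storage.cservice = new CompanyServiceStub();
        System.out.println(storage.getAllCompanyList());
    }
}
```

The sketch suggests where to look: if Storage is constructed with `new Storage()` rather than obtained as a Spring-managed bean, its @Autowired field is never populated.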
I'm trying to map an entity to a DTO using ModelMapper. The problem comes when a @JoinColumn is not loaded (lazy load). ModelMapper tries to access the lazily loaded entity's properties, and then a LazyInitializationException is thrown.
I already have a strategy to solve that, but I could not find a single ModelMapper feature which does what I need.
Here is what I need to do:
For each entity that is not loaded, I'll create a new target object using my factory. If the object is loaded, then the default mapping must be applied.
The following example is a ModelMapper feature that would fit my needs exactly, if it weren't for the fact that it does not provide the source (it provides only the source type):
public static class MyConverter implements ConditionalConverter<Object, Object> {
private EntityManager em;
public MyConverter(EntityManager em) {
this.em = em;
}
@Override
public MatchResult match(Class<?> sourceType, Class<?> destinationType) {
Object source = null; // I need the source instead of its type.
PersistenceUnitUtil persistenceUnitUtil = em.getEntityManagerFactory().getPersistenceUnitUtil();
return persistenceUnitUtil.isLoaded(source) ? MatchResult.NONE : MatchResult.FULL;
}
@Override
public Object convert(MappingContext<Object, Object> context) {
return LazyEntityProxyFactory.factory(context.getSource(), context.getDestinationType()); // Creates the target object
}
}
Do you guys know of any ModelMapper feature which provides what I need? Or maybe a hack?
*Obs: I've looked into ModelMapper's code and noticed that when ConditionalConverter.match is called the context already exists and therefore it possesses the source. What if ModelMapper also had a ConditionalContextConverter interface which passes the context in the match method? Just an idea.
I just found what I needed! The secret is to check the properties from the parent entity. After that, I was able to leverage the default mapping and also use my own factory when needed.
Here's my ConditionalConverter:
public static class MyConverter implements ConditionalConverter<Object, Object> {
private EntityManager em;
public MyConverter(EntityManager em) {
this.em = em;
}
@Override
public MatchResult match(Class<?> sourceType, Class<?> destinationType) {
return MatchResult.FULL;
}
@Override
public Object convert(MappingContext<Object, Object> context) {
Object source = context.getSource();
Object destination = context.getMappingEngine().createDestination(context);
try {
Field[] sourceFields = context.getSourceType().getDeclaredFields();
Field[] destinationFields = context.getDestinationType().getDeclaredFields();
for (Field sourceField : sourceFields) {
sourceField.setAccessible(true);
for (Field destinationField : destinationFields) {
destinationField.setAccessible(true);
if (sourceField.getName().equals(destinationField.getName())) {
Object sourceFieldValue = sourceField.get(source);
PersistenceUnitUtil persistenceUnitUtil = em.getEntityManagerFactory().getPersistenceUnitUtil();
if (persistenceUnitUtil.isLoaded(sourceFieldValue)) {
MappingContext<?, ?> myContext = context.create(sourceFieldValue, destinationField.getType());
Object destinationValue = context.getMappingEngine().map(myContext);
destinationField.set(destination, destinationValue);
} else {
// Here is your factory call;
destinationField.set(destination, SomeFactory.factory(sourceFieldValue, destinationField.getType()));
}
break;
}
}
}
} catch (Exception e) {
e.printStackTrace();
}
return destination;
}
}
I am trying to write unit tests for Repository layer classes with Junit and Mockito.
I have mocked the base class that supplies NamedParameterJdbcOperations and tried to inject it into the repo class.
In the repo class, we load SQL queries from files on the classpath. This is done in a method annotated with @PostConstruct.
When testing a method of the repo, it is not able to find or load the query and thus throws a NullPointerException.
I need help / suggestions on how to deal with this scenario.
PS: I am not allowed to change the repo class implementation.
Attaching the code of repo and test class for reference.
RepositoryImpl.java
@Repository
public class RepositoryImpl extends AppJdbcImpl implements Repository {
private static final StudentMapper STUDENT_ROW_MAPPER = new StudentMapper();
private static final CourseMapper COURSE_ROW_MAPPER = new CourseMapper();
@Value("classpath:sql/sql1.sql")
private Resource sql1;
private String query1;
@Value("classpath:sql/sql2.sql")
private Resource sql2;
private String query2;
public RepositoryImpl() { }
public RepositoryImpl(NamedParameterJdbcOperations jdbc) {
super(jdbc);
}
@PostConstruct
public void setUp() {
query1 = loadSql(sql1);
query2 = loadSql(sql2);
}
public Iterable<Course> findCoursesByStudentId(int studentId) throws DataAccessException {
try {
return jdbc().queryForObject(query1,
ImmutableMap.of("studentId", studentId),
COURSE_ROW_MAPPER);
} catch (EmptyResultDataAccessException emptyResult) {
return null;
} catch (DataAccessException e) {
// Need to create exception classes and throw specific exceptions
throw e;
}
}
public Iterable<Student> findStudentsByCourseId(int courseId) throws DataAccessException {
try {
return jdbc().query(query2,
ImmutableMap.of("courseId", courseId),
STUDENT_ROW_MAPPER);
} catch (DataAccessException e) {
// Need to create exception classes and throw specific exceptions
throw e;
}
}
private String loadSql(Resource resource) {
try {
return CharStreams.toString(new InputStreamReader(resource.getInputStream()));
} catch (IOException e) {
return null;
}
}
}
RepositoryImplTest.java
@RunWith(MockitoJUnitRunner.class)
public class RepositoryImplTest {
@Mock
private NamedParameterJdbcOperations jdbc;
@Mock
private ResultSet resultSet;
@Mock
private StudentMapper studentMapper;
@Mock
private CourseMapper courseMapper;
@InjectMocks
private RepositoryImpl repository;
private Student student1;
private Student student2;
private Course course1;
private Course course2;
private Course course8;
@Before
public void setUp() throws Exception {
MockitoAnnotations.initMocks(this);
course1 = new Course(1, "Karate");
course2 = new Course(2, "Riding");
course8 = new Course(8, "Swimming");
List<Course> courseList = Arrays.asList(course1, course2, course8);
student1 = new Student(1, "Chuck", "Norris", 27, Arrays.asList(course1, course2));
student2 = new Student(2, "Bruce", "Lee", 54, Arrays.asList(course1, course8));
List<Student> studentList = Arrays.asList(student1, student2);
when(jdbc.queryForObject(Matchers.anyString(), anyMap(),
isA(StudentMapper.class)))
.thenAnswer(new Answer() {
@Override
public Object answer(InvocationOnMock invocationOnMock) throws Throwable {
Object[] args = invocationOnMock.getArguments();
int queryParam = Integer.parseInt(args[0].toString());
Iterable<Student> result = studentList.stream()
.filter(d -> d.getId() == queryParam)
.collect(Collectors.toList());
return result;
}
});
}
@Test
public void findCoursesByStudentId() {
Iterable<Course> result = repository.findCoursesByStudentId(1);
assertNotNull(result);
}
}
In the repo class, the exception is thrown because query1 is null.
I need help properly solving this issue.
Thanks, Baru
@RunWith(MockitoJUnitRunner.class)
You start the test with the Mockito runner, not the Spring runner. This means Spring does not provide your beans, and the Mockito runner knows nothing about the @PostConstruct annotation.
You can call the @PostConstruct method yourself in the JUnit setUp method or in the test method.
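A minimal plain-reflection sketch of that suggestion (RepoLike is a made-up stand-in for RepositoryImpl, and the query string is injected directly instead of going through the @Value Resource; with spring-test on the classpath, ReflectionTestUtils.setField does the same field assignment): set the private field the way Spring would have, then invoke the init method yourself.

```java
import java.lang.reflect.Field;

// Stand-in for RepositoryImpl: a private query field that is normally
// populated by the @PostConstruct setUp() from an injected @Value resource.
class RepoLike {
    private String query1;

    public void setUp() {
        // imagine this method annotated with @PostConstruct;
        // in the real class it would do: query1 = loadSql(sql1);
    }

    public String runQuery1() {
        if (query1 == null) {
            throw new NullPointerException("query1 was never loaded");
        }
        return "executed: " + query1;
    }
}

public class PostConstructByHand {
    public static void main(String[] args) throws Exception {
        RepoLike repo = new RepoLike();
        // 1) Inject what Spring would have resolved from @Value, via reflection.
        Field f = RepoLike.class.getDeclaredField("query1");
        f.setAccessible(true);
        f.set(repo, "SELECT 1");
        // 2) Call the @PostConstruct method yourself, since Mockito won't.
        repo.setUp();
        System.out.println(repo.runQuery1()); // executed: SELECT 1
    }
}
```

In the actual test, both steps would go into the @Before setUp method, before any repository method is exercised.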
I am trying to add some custom functionality to a spring data repository.
Using this as my starting point http://docs.spring.io/spring-data/jpa/docs/current/reference/html/#repositories.single-repository-behaviour I have created the following code:
public interface TableLock<T> {
void checkout(T entity);
void checkin(T entity, boolean cmpltBatch);
}
public interface BatchTableLock extends TableLock<MyEntity> {
}
public class BatchTableLockImpl implements BatchTableLock {
private static final Logger logger = LoggerFactory.getLogger(BatchTableLockImpl.class);
@PersistenceContext(unitName = "mysql")
private EntityManager em;
@Override
public void checkout(MyEntity batch) {
Long id = batch.getMyEntityId();
try {
MyEntity p = em.find(MyEntity.class, id, LockModeType.PESSIMISTIC_WRITE);
if (p == null) {
logger.error("checkout : MyEntity id {} must be valid", id);
throw new PessimisticLockException();
}
if (myCondition is true) {
return;
}
} catch (LockTimeoutException | PessimisticLockException e) {
logger.error("checkout : Unable to get write lock on MyEntity id {}", id, e);
}
throw new PessimisticLockException();
}
@Override
public void checkin(MyEntity batch, boolean cmplt) {
Long id = batch.getMyEntityId();
try {
MyEntity p = em.find(MyEntity.class, id, LockModeType.PESSIMISTIC_WRITE);
if (p == null) {
logger.error("complete : MyEntity id {} must be valid", id);
return;
}
if (this is true) {
if (cmplt) {
yep;
} else {
nope;
}
} else if (cmplt) {
logger.error("complete: Unable to complete MyEntity {} with status.", id);
}
} catch (LockTimeoutException | PessimisticLockException e) {
logger.error("complete : Unable to get write lock on MyEntity id {}", id, e);
}
}
}
@Repository
public interface MyDao extends CrudRepository<MyEntity, BigInteger>, BatchTableLock {
... My queries ...
}
Unfortunately, I am getting the following error:
org.springframework.data.mapping.PropertyReferenceException: No property checkin found for type MyEntity!
If I'm not mistaken, this means that Spring is trying to derive a query from the method name 'checkin' and cannot find a field named 'checkin' in MyEntity, which is correct: there is no such field. How do I make it stop doing this? Based on the link above, I don't think it should be trying to generate a query for this method, but it seems to be doing it anyway. I may be missing something (that is usually the case), but I don't see what it is.
As stated in the reference documentation section you linked to, you need a MyDaoImpl class that implements the custom methods. I guess the easiest way is to either rename BatchTableLockImpl to MyDaoImpl or just create an empty MyDaoImpl extending that class.
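A sketch of that fix, with the question's types reduced to compilable stubs (MyEntity replaced by String, method bodies elided; this assumes Spring Data's default lookup of a custom fragment implementation named after the repository interface plus the "Impl" suffix):

```java
// Stubs of the question's custom-fragment interfaces.
interface TableLock<T> {
    void checkout(T entity);
    void checkin(T entity, boolean cmpltBatch);
}

interface BatchTableLock extends TableLock<String> { // String stands in for MyEntity
}

// The existing implementation, with the pessimistic-lock logic elided.
class BatchTableLockImpl implements BatchTableLock {
    @Override
    public void checkout(String entity) { /* em.find(..., PESSIMISTIC_WRITE) */ }

    @Override
    public void checkin(String entity, boolean cmpltBatch) { /* release logic */ }
}

// The missing piece: the class must be named <RepositoryInterface> + "Impl",
// i.e. MyDaoImpl for MyDao, so Spring Data wires checkout/checkin to this
// implementation instead of trying to derive queries from the method names.
public class MyDaoImpl extends BatchTableLockImpl {
}
```

With this in place, MyDao can keep extending both CrudRepository and BatchTableLock, and only the derived-query methods are parsed for property names.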
I'm building a RESTful web service with Jersey. I use JAXB to convert incoming JSON objects into Java objects. Unfortunately, this approach allows clients to create Java objects that don't have all mandatory fields. If I have 3 mandatory fields but the JSON contains only 1 field, I would like to see an exception thrown.
Resource class:
@XmlRootElement
@XmlAccessorType(XmlAccessType.FIELD)
public class Resource {
private int field1;
private String field2;
private String field3;
public Resource() {
}
...
}
REST class:
@Path("resource")
public class ResourceREST {
...
@POST
@Consumes(APPLICATION_JSON)
@Produces(TEXT_PLAIN)
public String createResource(Resource resource) {
...
}
...
}
Is there any way to do this with JAXB? If not, how can I implement this input validation?
Thanks in advance!
I have gone through the same scenario and applied some logic to fix this after the JSON is received.
In a List, add the field names that you consider mandatory.
public static final List<String> REQUIRED_FIELDS = new ArrayList<String>();
static {
REQUIRED_FIELDS.add("Field1");
REQUIRED_FIELDS.add("Field2");
};
Send the JSON that you have built to a validate method.
Your validate method should look like this:
public void validateRequiredFields(JSONObject jsonObject, List<String> requiredFields) throws ParserException, Exception {
if (log.isDebugEnabled()) {
log.debug("Entering validateForRequiredFields");
}
List<String> missingFields = new ArrayList<String>();
try {
if (requiredFields != null) {
for (String requiredField : requiredFields) {
if (ifObjectExists(jsonObject, requiredField)) {
if (StringUtils.isEmpty(jsonObject.getString(requiredField))) {
missingFields.add(requiredField);
}
} else {
missingFields.add(requiredField);
}
}
}
if (!missingFields.isEmpty()) {
throw new Exception("Missing required fields: " + missingFields);
}
} catch (JSONException e) {
throw new ParserException("Error occurred in validateRequiredFields", e);
}
}
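The same idea can be sketched without a JSON library (this is a hypothetical reduction: the parsed payload is represented as a plain Map instead of a JSONObject, and findMissing plays the role of the validate method above):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class RequiredFieldsDemo {
    // The mandatory field names, as in the answer's REQUIRED_FIELDS list.
    static final List<String> REQUIRED_FIELDS = List.of("Field1", "Field2");

    // Collects every required field that is absent or empty in the payload.
    static List<String> findMissing(Map<String, String> payload, List<String> required) {
        List<String> missing = new ArrayList<>();
        for (String field : required) {
            String value = payload.get(field);
            if (value == null || value.isEmpty()) {
                missing.add(field); // absent or blank counts as missing
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        Map<String, String> payload = Map.of("Field1", "42"); // Field2 absent
        System.out.println(findMissing(payload, REQUIRED_FIELDS)); // [Field2]
    }
}
```

In the Jersey resource, a non-empty result would translate into throwing the exception (or returning a 400 response) before the entity is processed.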
In my Spring web app I'm using a generic dao class:
public abstract class GenericDaoImpl<T> implements GenericDao<T> {
@Override
public T create(final T t) {
this.getEntityManager().persist(t);
return t;
}
@Override
public void delete(final Object id) {
this.getEntityManager().remove(
this.getEntityManager().getReference(getEntityType(), id));
}
@Override
public T find(final Object id) {
return (T) this.getEntityManager().find(getEntityType(), id);
}
@Override
public T update(final T t) {
return this.getEntityManager().merge(t);
}
}
I implement this class for every entity in my model and it works perfectly. For example:
@Repository
public class GruppoDaoImpl extends GenericDaoImpl<Gruppo> implements GruppoDao {
}
I use these dao classes in my service layer. I have a service class for every entity in my model, but the methods of most of these classes are the same, so I tried to create a generic service class that I can extend in the same way I do for the generic dao:
public abstract class GenericAdminServiceImpl<ENTITY extends AbstractEntity, DTO extends AbstractDto>
implements GenericAdminService<ENTITY, DTO> {
private GenericDao<ENTITY> dao;
private Class<ENTITY> entityClass;
private Class<DTO> dtoClass;
@SuppressWarnings({ "unchecked", "rawtypes" })
protected GenericAdminServiceImpl(GenericDao<ENTITY> dao) {
this.dao = dao;
//
Type t = getClass().getGenericSuperclass();
ParameterizedType pt = (ParameterizedType) t;
this.entityClass = (Class) pt.getActualTypeArguments()[0];
this.dtoClass = (Class) pt.getActualTypeArguments()[1];
}
public DTO getById(Object id) {
DTO dto = null;
ENTITY entity = dao.find(id);
if (entity != null) {
try {
dto = dtoClass.newInstance();
initDto(entity, dto);
} catch (Exception e) {
}
}
return dto;
}
public void create(DTO dto) throws ServiceOperationException {
ENTITY entity;
try {
entity = entityClass.newInstance();
initEntity(dto, entity);
Date dt = new Date();
entity.setDataUltimoAggiornamento(dt);
entity.setUtenteUltimoAggiornamento(dto.getLoggedUser());
entity.setDataInserimento(dt);
entity.setUtenteInserimento(dto.getLoggedUser());
dao.create(entity);
} catch (Exception e) {
throw new ServiceOperationException("impossibile creare entity ["
+ entityClass.getSimpleName() + "]", e);
}
}
public void update(DTO dto) throws ServiceOperationException {
ENTITY entity = dao.find(dto.getId());
if (!entityExists(entity)) {
throw new ServiceOperationException("entity non esistente ["
+ entityClass.getSimpleName() + "#" + dto.getId() + "]");
}
initEntity(dto, entity);
Date dt = new Date();
entity.setDataUltimoAggiornamento(dt);
entity.setUtenteUltimoAggiornamento(dto.getLoggedUser());
dao.update(entity);
}
public void delete(Object id) throws ServiceOperationException {
try {
dao.delete((int) id);
} catch (Exception e) {
throw new ServiceOperationException(
"impossibile eliminare entity ["
+ entityClass.getSimpleName() + "#" + id + "]", e); // TODO
}
}
protected abstract void initDto(ENTITY entity, DTO outDto);
protected abstract void initEntity(DTO dto, ENTITY outEntity);
protected abstract boolean entityExists(ENTITY entity);
}
Extending this class I just have to implement specific parts for every entity, leaving all the common stuff in the abstract/generic class.
The problem is that when I use the generic service, merge, persist and delete don't work. Only select seems to work, and I cannot understand why...
When I run debug mode in Eclipse, everything seems correct. A consistent entity is passed to the merge/persist methods, so why don't they work? Can you help me?
UPDATE #1
This is an example of implementation:
@Service
@Transactional(propagation = Propagation.REQUIRES_NEW)
public class GruppoServiceImplG extends
GenericAdminServiceImpl<Gruppo, GruppoDto> implements GruppoServiceG {
@Autowired
protected GruppoServiceImplG(GruppoDao gruppoDao) {
super(gruppoDao);
}
@Override
protected void initDto(Gruppo entity, GruppoDto outDto) {
outDto.setId(entity.getId());
outDto.setNome(entity.getNome());
outDto.setDescrizione(entity.getDescrizione());
outDto.setDataInizioValidita(entity.getDataInizioValidita());
outDto.setDataFineValidita(entity.getDataFineValidita());
}
@Override
protected void initEntity(GruppoDto dto, Gruppo outEntity) {
outEntity.setId(dto.getId());
outEntity.setNome(dto.getNome());
outEntity.setDescrizione(dto.getDescrizione());
outEntity.setDataInizioValidita(dto.getDataInizioValidita());
outEntity.setDataFineValidita(dto.getDataFineValidita());
}
@Override
protected boolean entityExists(Gruppo entity) {
return entity != null && entity.getId() > 0;
}
}
UPDATE #2
Following Łukasz L.'s suggestion, I added a flush() to all my CRUD methods. Now I get this exception: javax.persistence.TransactionRequiredException: no transaction is in progress. What's wrong with my transaction declaration? It works fine with non-generic services...
As that question about Spring and Hibernate flush behaviour explains, committing your transaction does not necessarily make the EntityManager save all changes. Spring and JPA (Hibernate & co.) are designed to work together quite nicely (from the Spring side), but you must still make sure your entity manager writes all queries to the database before the transaction commits.
The problem: JPA providers like to cache. That means they tend to avoid issuing queries. For a SELECT they have no choice: they must fetch some data (as long as that portion of data was not already fetched, as when getting a single entity by ID). For INSERTs and UPDATEs, however, they CAN cache. This means that create, merge or remove will usually not issue a query to the RDBMS until you call flush() on the EntityManager.
If you leave the transactional block without calling flush and the entity manager is delaying operations, you commit a transaction for which the modifying queries were never issued!
Just make sure to call EntityManager.flush() at least at the end of the transactional method. You can also call it after each DML operation; that's your choice (I prefer it that way because it gives me full control over the order in which the DML queries are issued by JPA; if you use DB constraints/triggers heavily, it can be essential).
@Transactional
public void myTransactionalMethod() {
getEntityManager().persist(a); // I've made some JPA change, that is not issued to DB yet
...
// I'm doing something more
...
getEntityManager().flush(); // the last moment to flush, after that instruction I leave transactional context
}
Following Łukasz L.'s suggestion, I discovered the actual issue in my generic class.
The transaction declaration was wrong: I had set @Transactional(propagation = Propagation.REQUIRES_NEW) only on the concrete service class.
I solved this way:
@Transactional(propagation = Propagation.REQUIRES_NEW)
public abstract class GenericAdminServiceImpl<ENTITY extends AbstractEntity, DTO extends AbstractDto>
implements GenericAdminService<ENTITY, DTO> {
// ...
}
And (in concrete implementation):
@Service
@Transactional
public class GruppoServiceImplG extends
GenericAdminServiceImpl<Gruppo, GruppoDto> implements GruppoServiceG {
// ...
}