Hibernate JPA Query from Different Sources not Generating Proper Jackson JSON Response - java

I am retrieving data from multiple tables/POJOs and I want the data in JSON format. In the POJO classes I am using @JsonProperty, but I am still not getting the result JSON in the desired format.
My result:
[["2017 Sprint 1","Android development",23/12/2016,16/01/2017]]
I want the result in this format:
{
  "iteration": "2017 Sprint 1",
  "project": "MDM - Core & Integration",
  "isd": "23/12/2016",
  "ied": "16/01/2017"
}
My main controller method:
@Controller
@RequestMapping("/json/retrospective")
public class MainControllerClass
{
    @RequestMapping(value = "{userid}", method = RequestMethod.GET, produces = MediaType.APPLICATION_JSON_VALUE)
    @ResponseBody
    public List<Details> getInfoInJSON(@PathVariable int userid)
    {
        Configuration con = new Configuration();
        con.configure("hibernate.cfg.xml");
        SessionFactory SF = con.buildSessionFactory();
        Session session = SF.openSession();
        Query test = session.createQuery("select itr.iteration_name,prj.project_name,itr.isd,itr.ied from RetrospectiveInfo retro,IterationInfo itr,ProjectInfo prj where retro.retrospective_id =" + userid + " and retro.project_id = prj.project_id and retro.iteration_id = itr.iteration_id");
        List<Details> details = test.list();
        session.close();
        SF.close();
        return details;
    }
}
The Details class:
public class Details
{
    @JsonProperty("iteration")
    private String iteration;
    @JsonProperty("project")
    private String project;
    @JsonProperty("isd")
    private Date isd;
    @JsonProperty("ied")
    private Date ied;
    // getters/setters
}
I have the three Jackson jars (annotations, databind and core, latest version 2.8) on the build path. Why am I getting such a result? What changes do I need to make in my code? Are any jars to be added? Kindly help.

The main issue is that you are building the Details result from a JPA query that selects several individual columns of different types, so each row comes back as a plain value array (compare the error: Cannot create TypedQuery for query with more than one return).
To resolve the issue, create a constructor for the required JSON attributes:
package com.sof.type;

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;

@JsonPropertyOrder({ "iteration", "project", "isd", "ied" })
public class Details {

    @JsonProperty
    private String iteration;
    @JsonProperty
    private String project;
    @JsonProperty
    private String isd;
    @JsonProperty
    private String ied;

    public Details(String iteration, String project, String isd, String ied) {
        this.iteration = iteration;
        this.project = project;
        this.isd = isd;
        this.ied = ied;
    }
}
Then use it this way:
@PersistenceContext
private EntityManager em;
or
EntityManagerFactory entityManagerFactory = Persistence.createEntityManagerFactory("com.sof");
EntityManager em = entityManagerFactory.createEntityManager();
with
List<Details> details = em.createQuery(
"SELECT NEW com.sof.type.Details(itr.iteration_name, prj.project_name, itr.isd, itr.ied) " +
"FROM RetrospectiveInfo retro, " +
" IterationInfo itr, " +
" ProjectInfo prj " +
"WHERE retro.retrospective_id = :userId " +
"AND retro.project_id = prj.project_id " +
"AND retro.iteration_id = itr.iteration_id", Details.class)
.setParameter("userId", userid)
.getResultList();
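With the constructor expression in place, the controller can simply return the typed list and let Jackson serialize each Details object. A minimal sketch, reusing the names from the question and assuming a container-managed EntityManager rather than hand-built SessionFactory plumbing:
@Controller
@RequestMapping("/json/retrospective")
public class MainControllerClass {

    @PersistenceContext
    private EntityManager em;

    @RequestMapping(value = "{userid}", method = RequestMethod.GET, produces = MediaType.APPLICATION_JSON_VALUE)
    @ResponseBody
    public List<Details> getInfoInJSON(@PathVariable int userid) {
        // Each row is mapped to a Details instance, so Jackson produces
        // [{"iteration":"...","project":"...","isd":"...","ied":"..."}]
        return em.createQuery(
                "SELECT NEW com.sof.type.Details(itr.iteration_name, prj.project_name, itr.isd, itr.ied) " +
                "FROM RetrospectiveInfo retro, IterationInfo itr, ProjectInfo prj " +
                "WHERE retro.retrospective_id = :userId " +
                "AND retro.project_id = prj.project_id " +
                "AND retro.iteration_id = itr.iteration_id", Details.class)
            .setParameter("userId", userid)
            .getResultList();
    }
}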

Related

How to implement Server-side processing of DataTables with JDBC so that it paginates?

I have a Spring Boot app with DataTables server-side processing and an Oracle database. I started by implementing one of the tutorials, and it worked. The tutorial uses JPA; I want to implement the same thing using JDBC. I made all the corresponding classes, the repository, and the new model with the same fields but without JPA. But when I tried to fetch the data, I could only get the first page, with no way to get to the second page. Below I will post extracts of the original and added code. The original tutorial used these classes:
@Entity
@Table(name = "MYUSERS")
public class User {
    @Id
    @Column(name = "USER_ID")
    private Long id;
    @Column(name = "USER_NAME")
    private String name;
    @Column(name = "SALARY")
    private String salary;
    // ...getters and setters
}
And
@Entity
public class UserModel {
    @Id
    private Long id;
    private String name;
    private String salary;
    private Integer totalRecords;
    @Transient
    private Integer rn;
    // ...getters and setters
}
And I substituted these two classes with one like this:
public class NewUser {
private Long id;
private String name;
private String salary;
private Integer totalRecords;
private Integer rn;
...getters and setters
}
The table itself has only 3 fields: id, name and salary, the other 2 fields are created and filled later.
The repository the original author has for the user looks like this:
public interface UserRepository extends JpaRepository<User, Long> {
    @Query(value = "SELECT * FROM MYUSERS", nativeQuery = true)
    List<User> findAllByUsernames(List<String> listOfUsernames);
}
My own repository looks like this:
@Repository
public class NewUserRepoImpl extends JdbcDaoSupport implements NewUserRepo {

    private static final String SELECT_ALL_SQL = "SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS";

    private final NamedParameterJdbcTemplate namedParameterJdbcTemplate;
    private final JdbcTemplate jdbctemplate;

    public NewUserRepoImpl(NamedParameterJdbcTemplate namedParameterJdbcTemplate, JdbcTemplate jdbctemplate, DataSource dataSource) {
        this.namedParameterJdbcTemplate = namedParameterJdbcTemplate;
        this.jdbctemplate = jdbctemplate;
        setDataSource(dataSource);
    }

    @Override
    public List<NewUser> findAll(PaginationCriteria pagination) {
        try {
            String paginatedQuery = AppUtil.buildPaginatedQueryForOracle(SELECT_ALL_SQL, pagination);
            return jdbctemplate.query(paginatedQuery, newUserRowMapper());
        } catch (DataAccessException e) {
            throw new EntityNotFoundException("No Entities Found");
        }
    }

    @Bean
    public RowMapper<NewUser> newUserRowMapper() {
        return (rs, i) -> {
            final NewUser newUser = new NewUser();
            newUser.setId(rs.getLong("ID"));
            newUser.setName(rs.getString("NAME"));
            newUser.setSalary(rs.getString("SALARY"));
            newUser.setTotalRecords(rs.getInt("TOTAL_RECORDS"));
            newUser.setTotalRecords(rs.getInt("RN"));
            return newUser;
        };
    }
}
The buildPaginatedQueryForOracle helper transforms my query so that it also returns the totalRecords and rn columns. Below I will post its output both for the original and for my query (they are the same, I checked).
So, the main part, the controller. I left both the old and the new pieces in it for now for debugging purposes and am just returning one of the results:
#RequestMapping(value="/users/paginated/orcl", method=RequestMethod.GET)
#ResponseBody
public String listUsersPaginatedForOracle(HttpServletRequest request, HttpServletResponse response, Model model) {
DataTableRequest<User> dataTableInRQ = new DataTableRequest<User>(request);
System.out.println(new Gson().toJson(dataTableInRQ));
DataTableRequest<NewUser> dataTableInRQNew = new DataTableRequest<NewUser>(request);
System.out.println(new Gson().toJson(dataTableInRQNew));
PaginationCriteria pagination = dataTableInRQ.getPaginationRequest();
System.out.println(new Gson().toJson(pagination));
PaginationCriteria paginationNew = dataTableInRQNew.getPaginationRequest();
System.out.println(new Gson().toJson(paginationNew));
String baseQuery = "SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS";
String paginatedQuery = AppUtil.buildPaginatedQueryForOracle(baseQuery, pagination);
String paginatedQueryNew = AppUtil.buildPaginatedQueryForOracle(baseQuery, paginationNew);
System.out.println(paginatedQuery);
System.out.println(paginatedQueryNew);
Query query = entityManager.createNativeQuery(paginatedQuery, UserModel.class);
System.out.println("Query:");
System.out.println(query);
#SuppressWarnings("unchecked")
List<UserModel> userList = query.getResultList();
System.out.println(new Gson().toJson(userList));
#SuppressWarnings("unchecked")
List<NewUser> userListNew = newUserRepo.findAll(paginationNew);
System.out.println(new Gson().toJson(userListNew));
DataTableResults<UserModel> dataTableResult = new DataTableResults<UserModel>();
DataTableResults<NewUser> dataTableResultNew = new DataTableResults<NewUser>();
dataTableResult.setDraw(dataTableInRQ.getDraw());
dataTableResultNew.setDraw(dataTableInRQNew.getDraw());
dataTableResult.setListOfDataObjects(userList);
dataTableResultNew.setListOfDataObjects(userListNew);
if (!AppUtil.isObjectEmpty(userList)) {
dataTableResult.setRecordsTotal(userList.get(0).getTotalRecords()
.toString());
if (dataTableInRQ.getPaginationRequest().isFilterByEmpty()) {
dataTableResult.setRecordsFiltered(userList.get(0).getTotalRecords()
.toString());
} else {
dataTableResult.setRecordsFiltered(Integer.toString(userList.size()));
}
}
if (!AppUtil.isObjectEmpty(userListNew)) {
dataTableResultNew.setRecordsTotal(userListNew.get(0).getTotalRecords()
.toString());
if (dataTableInRQ.getPaginationRequest().isFilterByEmpty()) {
dataTableResultNew.setRecordsFiltered(userListNew.get(0).getTotalRecords()
.toString());
} else {
dataTableResultNew.setRecordsFiltered(Integer.toString(userListNew.size()));
}
}
System.out.println(new Gson().toJson(dataTableResult));
System.out.println(new Gson().toJson(dataTableResultNew));
return new Gson().toJson(dataTableResult);
}
So, I log out everything possible in the console. Here is the output:
{"uniqueId":"1579786571491","draw":"1","start":0,"length":5,"search":"","regex":false,"columns":[{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},{"index":1,"data":"name","name":"Name","searchable":true,"orderable":true,"search":"","regex":false},{"index":2,"data":"salary","name":"Salary","searchable":true,"orderable":true,"search":"","regex":false}],"order":{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},"isGlobalSearch":false,"maxParamsToCheck":3}
{"uniqueId":"1579786571491","draw":"1","start":0,"length":5,"search":"","regex":false,"columns":[{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},{"index":1,"data":"name","name":"Name","searchable":true,"orderable":true,"search":"","regex":false},{"index":2,"data":"salary","name":"Salary","searchable":true,"orderable":true,"search":"","regex":false}],"order":{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},"isGlobalSearch":false,"maxParamsToCheck":3}
{"pageNumber":0,"pageSize":5,"sortBy":{"mapOfSorts":{"id":"ASC"}},"filterBy":{"mapOfFilters":{},"globalSearch":false}}
{"pageNumber":0,"pageSize":5,"sortBy":{"mapOfSorts":{"id":"ASC"}},"filterBy":{"mapOfFilters":{},"globalSearch":false}}
SELECT * FROM (SELECT FILTERED_ORDERED_RESULTS.*, COUNT(1) OVER() total_records, ROWNUM AS RN FROM (SELECT BASEINFO.* FROM ( SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS ) BASEINFO ) FILTERED_ORDERED_RESULTS ORDER BY id ASC ) WHERE RN > (0 * 5) AND RN <= (0 + 1) * 5
SELECT * FROM (SELECT FILTERED_ORDERED_RESULTS.*, COUNT(1) OVER() total_records, ROWNUM AS RN FROM (SELECT BASEINFO.* FROM ( SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS ) BASEINFO ) FILTERED_ORDERED_RESULTS ORDER BY id ASC ) WHERE RN > (0 * 5) AND RN <= (0 + 1) * 5
Query:
org.hibernate.query.internal.NativeQueryImpl#3ea49a4
[{"id":3,"name":"user3","salary":"300","totalRecords":18},{"id":4,"name":"user4","salary":"400","totalRecords":18},{"id":5,"name":"user5","salary":"500","totalRecords":18},{"id":6,"name":"user6","salary":"600","totalRecords":18},{"id":7,"name":"user7","salary":"700","totalRecords":18}]
[{"id":3,"name":"user3","salary":"300","totalRecords":1},{"id":4,"name":"user4","salary":"400","totalRecords":2},{"id":5,"name":"user5","salary":"500","totalRecords":3},{"id":6,"name":"user6","salary":"600","totalRecords":4},{"id":7,"name":"user7","salary":"700","totalRecords":5}]
{"draw":"1","recordsFiltered":"18","recordsTotal":"18","data":[{"id":3,"name":"user3","salary":"300","totalRecords":18},{"id":4,"name":"user4","salary":"400","totalRecords":18},{"id":5,"name":"user5","salary":"500","totalRecords":18},{"id":6,"name":"user6","salary":"600","totalRecords":18},{"id":7,"name":"user7","salary":"700","totalRecords":18}]}
{"draw":"1","recordsFiltered":"1","recordsTotal":"1","data":[{"id":3,"name":"user3","salary":"300","totalRecords":1},{"id":4,"name":"user4","salary":"400","totalRecords":2},{"id":5,"name":"user5","salary":"500","totalRecords":3},{"id":6,"name":"user6","salary":"600","totalRecords":4},{"id":7,"name":"user7","salary":"700","totalRecords":5}]}
It helped me realize that:
- the incoming DataTableRequest is the same for both JPA and JDBC;
- the PaginationCriteria are also the same;
- the paginatedQuery built with the method specified above is the same in both cases.
The differences first appear in the lists: where the JPA list retrieved with the native query has totalRecords as 18 for every row, the JDBC repository with the same query returns 1, 2, 3... for each subsequent row.
It made me think that I should look at the query built for JPA. But, as you can see in the log, System.out.println wasn't able to print it in a readable form for some reason.
Any advice on how to decipher it and more importantly how to get the right total result for each row would be greatly appreciated!!!

JPA limits `queryResultList` even though `setMaxResults` is not defined

I have written the following code snippet to fetch records of certain zip files from the zips table, using Hibernate as the JPA provider.
public List<ZipEntity> getZipEntityFromZipName(String zipName, String version, String createdBy,
        String type) throws FileException {
    int numAttempts = 0;
    do {
        numAttempts++;
        EntityManager entityManager = getNewEntityManager();
        try {
            TypedQuery<ZipEntity> query = entityManager
                    .createNamedQuery(Constants.Database.Queries.GET_FROM_ZIP_NAME, ZipEntity.class)
                    .setParameter("zipName", zipName)
                    .setParameter("version", version)
                    .setParameter("createdBy", createdBy)
                    .setParameter("type", type);
            return query.getResultList();
        } catch (PersistenceException e) {
            validatePersistenceException(e);
        } finally {
            closeEntityManager(entityManager);
        }
    } while (numAttempts <= maxRetries);
    throw new FileException("Database connection failed.");
}
Here are the relevant entity classes
@NamedNativeQueries({
    @NamedNativeQuery(
        name = Constants.Database.Queries.GET_FROM_ZIP_NAME,
        query = Constants.Database.Queries.GET_FROM_ZIP_NAME_QUERY,
        resultClass = ZipEntity.class
    )
})
@Entity
@Table(name = "zips")
public class ZipEntity {

    @EmbeddedId
    private ZipKey ZipKey;

    public ZipEntity() {
    }

    public ZipEntity(String zipName, String version, String createdBy, String file, String type,
            String extension) {
        this.ZipKey = new ZipKey(zipName, version, createdBy, file, type, extension);
    }
}
@Embeddable
public class ZipKey implements Serializable {

    static final long serialVersionUID = 1L;

    @Column(name = "zip_name")
    private String zipName;
    @Column(name = "version")
    private String version;
    @Column(name = "created_by")
    private String createdBy;
    @Column(name = "filepath")
    private String file;
    @Column(name = "type")
    private String type;
    @Column(name = "extension")
    private String extension;

    // Getters, setters and constructor
}
And the query in the Constants class is as follows:
public static final String GET_FROM_ZIP_NAME = "getFile";
public static final String GET_FROM_ZIP_NAME_QUERY = "SELECT * FROM zips WHERE zip_name = " +
":zipName AND version = :version AND created_by = :createdBy AND type = :type";
Even though setMaxResults() is not defined for the above query, the results obtained from the above code snippet are limited to 25 records, although the same query executed at the DB returns 35 records. What am I doing wrong here?
Please debug your solution and check the values of the "zipName", "version", "createdBy" and "type" parameters to verify that they are the values you expect. This query has four conditions combined with AND logic, which affects your results. To get 35 records, your parameters must make the conditions true for all 35 records.
You can also set the fetch size on the NamedNativeQuery as below, which fetches 35 records at a time:
@NamedNativeQuery(
    name = Constants.Database.Queries.GET_FROM_ZIP_NAME,
    query = Constants.Database.Queries.GET_FROM_ZIP_NAME_QUERY,
    fetchSize = 35,
    resultClass = ZipEntity.class
)
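For comparison, if an explicit cap on the number of returned rows were ever wanted, JPA's setMaxResults could be applied to the query itself; a minimal sketch, assuming the same named query and parameters as above:
TypedQuery<ZipEntity> query = entityManager
        .createNamedQuery(Constants.Database.Queries.GET_FROM_ZIP_NAME, ZipEntity.class)
        .setParameter("zipName", zipName)
        .setParameter("version", version)
        .setParameter("createdBy", createdBy)
        .setParameter("type", type);
// setMaxResults(n) caps the result list at n rows; when it is never called,
// JPA itself does not impose a row limit.
query.setMaxResults(35);
List<ZipEntity> results = query.getResultList();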

Avoid repetitive code in Java

I am working on an ETL Java project and it does 3 things:
extract - read the data from a table
transform - convert the data to JSON
load - load the data
It works fine. The issue is that I am doing it for each table. The way I have it right now is:
class ETLHelper
{
    private Person read(ResultSet results) {
        Person p = new Person();
        p.setPersonId(results.getString("PERSON_ID"));
        p.setPersonName(results.getString("PERSON_NAME"));
        return p;
    }

    private String transform(Person p) {
        TransformPerson t = new TransformPerson();
        t.setTransformPersonId(p.getPersonId());
        t.setTransformPersonName(p.getPersonName());
        PersonEData eData = new PersonEData();
        eData.setDate1(p.date1);
        eData.setDate2(p.date2);
        t.seteData(eData);
        PersonDetails pd = new PersonDetails();
        pd.settransformdata(t);
        return writeValueAsString(pd);
    }

    public void etl() {
        Connection c = null;
        PreparedStatement p = null;
        ResultSet r = null;
        c = getConnection();
        p = c.prepareStatement(getSql());
        r = p.executeQuery();
        while (r.next()) {
            messages.add(transform(read(r)));
            /* code for loading data */
        }
    }
}
Person.java:
@JsonTypeName(value = "PERSON")
@JsonTypeInfo(include = JsonTypeInfo.As.WRAPPER_OBJECT, use = JsonTypeInfo.Id.NAME)
public class Person {
    @JsonProperty(value = "PERSON_ID")
    private String personId;
    // getter and setter for personId
    @JsonProperty(value = "PERSON_NAME")
    private String personName;
    // getter and setter for personName
}
TransformPerson.java:
@JsonRootName(value = "Person")
class TransformPerson {
    private String transformPersonName;
    private String transformPersonId;
    /* getters and setters for transformPersonName and transformPersonId */

    @Override
    public String toString() {
        return "Person [name=" + transformPersonName + ", id=" + transformPersonId + "]";
    }
}
PersonEData.java:
public class PersonEData {
    private String date1;
    private String date2;
    /* getters and setters */

    @Override
    public String toString() {
        return "PersonEData [date1=" + date1 + ", date2=" + date2 + "]";
    }
}
So a Person class, a transformation class and an ETL class are written for each table. There are also some additional classes like PersonEData whose toString() returns a JSON-like representation. Is there any way I can change this design to avoid writing similar code for each table? There are some constraints: each table is different, and the transformation class is needed because other programs use the generated JSON, so we need to produce JSON those programs understand.
In your current solution, you have created:
- a Person class, to hold the data retrieved from the DB;
- a TransformPerson class, to copy the data from Person into another representation, with JSON creation added by overriding toString().
To keep it simple, what you can do is:
- have a single class like Person for each entity (table) which is JSON-serializable;
- don't override toString() to produce the JSON representation; use a JSON serializer instead (see the sketch below).
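For illustration, a minimal sketch of that approach with Jackson's ObjectMapper (the JsonTransformer helper name is made up here; any Jackson-annotated entity can be passed to it instead of hand-building strings in toString()):
import com.fasterxml.jackson.databind.ObjectMapper;

class JsonTransformer {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Serializes any Jackson-annotated entity (Person, or a class for another
    // table) to JSON, so no per-table toString() implementation is needed.
    static String toJson(Object entity) throws Exception {
        return MAPPER.writeValueAsString(entity);
    }
}

// usage inside the ETL loop:
// messages.add(JsonTransformer.toJson(read(resultSet)));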

JPA Could not serialize

I have this class, but when I make a query it throws an exception:
org.hibernate.type.SerializationException: could not serialize
@Entity
public class Google implements Serializable {

    @Id
    String nombre;
    String pass;

    public Google() {
        nombre = "defecto";
        pass = "defecto";
    }

    public Google(String anom, String apass) {
        nombre = anom;
        pass = apass;
    }

    // Getters, setters..
}
This is the query. I am using JPA, Hibernate and a MySQL DB, and the class implements Serializable, so I don't know what the problem is.
public void findNombresGoogle(Map<String, Amigo> anombresAmigos){
List<Google> resultados = new LinkedList<Google>();
Map<String, Amigo> nombresAmigos = anombresAmigos;
Query query = entityManager.createNativeQuery("FROM Google google WHERE "
+ "google.nombre IN (?1)", Google.class);
query.setParameter(1, nombresAmigos);
resultados = (List<Google>) query.getResultList();
}
In the following line:
query.setParameter(1, nombresAmigos);
nombresAmigos should be a List<String>, not a Map<String, Amigo>.
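A minimal sketch of that fix, passing the map's keys as the collection parameter (written as a JPQL query here rather than the original native query, since collection parameters bind cleanly to an IN clause there; names follow the question's code):
// Bind the friend names (the map's keys) rather than the Map itself.
List<String> nombres = new ArrayList<>(nombresAmigos.keySet());

List<Google> resultados = entityManager
        .createQuery("SELECT g FROM Google g WHERE g.nombre IN :nombres", Google.class)
        .setParameter("nombres", nombres)
        .getResultList();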

JPA returning empty result list while DB returns row set

I am trying to run a query to fetch some statistics from my database, and I'm using JPA. But I ran into a problem: when I run the JPQL query, an empty result set is returned, yet when I run the SQL that JPA produces for that JPQL query, I get a single row of data.
Here's what I've got:
The Ticket entity
@Entity
@Table(name="tickets")
public class Ticket extends AbstractEntity {

    @Id
    @GeneratedValue(strategy=GenerationType.IDENTITY)
    private int id;

    @Embedded
    private Owner owner;

    @ManyToOne
    @JoinColumn(name="flightId")
    private Flight flight;

    private String status;

    public Ticket() {
        this.status = "AVAILABLE";
    }
The Flight entity
@Entity
@Table(name="flights")
public class Flight extends AbstractEntity {

    @Id
    @GeneratedValue(strategy=GenerationType.IDENTITY)
    private int id;

    private String departure;
    private String destination;
    private Date date;
    private float ticketCost;

    @OneToMany(mappedBy="flight", fetch=FetchType.LAZY, cascade=CascadeType.ALL)
    private List<Ticket> tickets = new ArrayList<Ticket>();
The result row class
public class SoldReportRow {
private String departure;
private String destination;
private DateTime date;
private int ticketsSold;
private float totalCost;
public SoldReportRow(Date date, String departure, String destination, Long ticketsSold, Double totalCost) {
this.departure = departure;
this.destination = destination;
this.ticketsSold = ticketsSold.intValue();
this.totalCost = totalCost.floatValue();
this.date = new DateTime(date);
}
The JPQL
SELECT NEW entities.SoldReportRow(f.date, f.departure, f.destination,
COUNT(t.id), SUM(f.ticketCost))
FROM Ticket t JOIN t.flight f
WHERE t.status = 'SOLD' AND t.owner IS NOT NULL AND f.date BETWEEN ? and ?
GROUP BY f.id
The generated SQL
SELECT t0.DATE, t0.DEPARTURE, t0.DESTINATION, COUNT(t1.ID), SUM(t0.TICKETCOST)
FROM flights t0, tickets t1
WHERE ((((t1.STATUS = ?) AND NOT ((((((t1.ADDRESS IS NULL)
AND (t1.EMAIL IS NULL)) AND (t1.NAME IS NULL)) AND (t1.OWNERFROM IS NULL))
AND (t1.PHONE IS NULL)))) AND (t0.DATE BETWEEN ? AND ?))
AND (t0.ID = t1.flightId)) GROUP BY t0.ID
So when I run the JPQL I get an empty result list, while running the generated SQL directly returns a single row of data.
UPD: the TicketDAO methods
// ...
protected static EntityManagerFactory factory;
protected static EntityManager em;
static {
factory = Persistence.createEntityManagerFactory(UNIT_NAME);
}
// ...
public static List<SoldReportRow> soldReportByDate(String from, String to) {
DateTimeFormatter dfTxt = DateTimeFormat.forPattern("dd/MM/yyyy");
DateTimeFormatter dfSql = DateTimeFormat.forPattern("yyyy-MM-dd");
String startDate = dfSql.print(dfTxt.parseDateTime(from));
String endDate = dfSql.print(dfTxt.parseDateTime(to));
String query = String.format(
"SELECT NEW entities.SoldReportRow(f.date, f.departure, f.destination, COUNT(t.id), SUM(f.ticketCost)) FROM " +
"Ticket t JOIN t.flight f " +
"WHERE t.status = 'SOLD' AND t.owner IS NOT NULL AND f.date BETWEEN '%s' and '%s' " +
"GROUP BY f.id",
startDate, endDate
);
return TicketDAO.query(SoldReportRow.class, query);
}
public static <T> List<T> query(Class<T> entityClass, String query) {
EntityManager entityManager = getEntityManager();
TypedQuery<T> q = entityManager.createQuery(query, entityClass);
List<T> entities = null;
try {
entities = q.getResultList();
} finally {
entityManager.close();
}
return entities;
}
public static EntityManager getEntityManager() {
return factory.createEntityManager();
}
The question is, why does this happen and how to fix that?
Thanks!
After some research, I found that the trouble was caused by the data in the database. By default, SQLite has no real DATE column type and uses strings to store timestamps. So for date comparisons (such as SELECT ... WHERE date BETWEEN a AND b) it's better to use a numeric UTC form rather than a string one (1397036688 is a better value than 2014-03-09).
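For illustration, a minimal sketch of preparing such numeric bounds with the Joda-Time formatter already used in the DAO (this assumes the flights date column is also repopulated with UTC epoch seconds):
import org.joda.time.DateTimeZone;
import org.joda.time.format.DateTimeFormat;
import org.joda.time.format.DateTimeFormatter;

// Convert the dd/MM/yyyy request parameters to UTC epoch seconds so that
// BETWEEN compares numbers instead of strings.
DateTimeFormatter dfTxt = DateTimeFormat.forPattern("dd/MM/yyyy").withZone(DateTimeZone.UTC);
long startEpoch = dfTxt.parseDateTime(from).getMillis() / 1000L;
long endEpoch = dfTxt.parseDateTime(to).getMillis() / 1000L;

// e.g. used in a query: SELECT ... FROM flights WHERE date BETWEEN :start AND :end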
