Update database with JDBC when some values should be ignored - java

I am having some trouble using JDBC to update a table column. Say I have a table such as User(name, address, hobby, ...) with about 15 fields. From the frontend I receive an object built from a form in which the user can type only the entries that should change. Now I need to save the changes in the database, but not all of the fields were changed, so my DAO contains some null values. For example, name and address should be updated while the other columns stay as they are. Is there a smart way to put that into a JDBC PreparedStatement, or do you know other solutions? I am trying to avoid a long chain of value != null checks.
Thanks in advance!
(I am using Spring for the backend and Angular for the frontend.)

Since you're using Spring, you can use NamedParameterJdbcTemplate, but the real trick is COALESCE, which falls back to the current column value when the given parameter is NULL:
@Autowired
private DataSource dataSource;

public void updateUser(int id, String name, String address, String hobby) {
    NamedParameterJdbcTemplate jdbcTemplate = new NamedParameterJdbcTemplate(dataSource);
    String sql = "UPDATE User" +
            " SET Name = COALESCE(:name, Name)" +
            ", Address = COALESCE(:address, Address)" +
            ", Hobby = COALESCE(:hobby, Hobby)" +
            " WHERE Id = :id";
    MapSqlParameterSource paramMap = new MapSqlParameterSource();
    paramMap.addValue("id", id, Types.INTEGER);
    paramMap.addValue("name", name, Types.VARCHAR);
    paramMap.addValue("address", address, Types.VARCHAR);
    paramMap.addValue("hobby", hobby, Types.VARCHAR);
    if (jdbcTemplate.update(sql, paramMap) == 0)
        throw new EmptyResultDataAccessException("User not found: " + id, 1);
}
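For example, a call that only changes the name and address can simply pass null for the rest (a hypothetical caller of the method above):
// Hobby stays untouched because COALESCE(:hobby, Hobby) falls back to the current column value.
updateUser(42, "New name", "New address", null);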
Or, if you use a POJO with the user data:
public class User {
    private int id;
    private String name;
    private String address;
    private String hobby;

    // Getters and setters here
}
public void updateUser(User user) {
    NamedParameterJdbcTemplate jdbcTemplate = new NamedParameterJdbcTemplate(dataSource);
    String sql = "UPDATE User" +
            " SET Name = COALESCE(:name, Name)" +
            ", Address = COALESCE(:address, Address)" +
            ", Hobby = COALESCE(:hobby, Hobby)" +
            " WHERE Id = :id";
    BeanPropertySqlParameterSource paramMap = new BeanPropertySqlParameterSource(user);
    if (jdbcTemplate.update(sql, paramMap) == 0)
        throw new EmptyResultDataAccessException("User not found: " + user.getId(), 1);
}
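One caveat, depending on your driver (an extra note, not something from the original answer): a few drivers cannot infer the SQL type of a null bean property. If that happens, BeanPropertySqlParameterSource lets you register the types explicitly:
BeanPropertySqlParameterSource paramMap = new BeanPropertySqlParameterSource(user);
// Explicit SQL types so null name/address/hobby values don't confuse type inference.
paramMap.registerSqlType("name", Types.VARCHAR);
paramMap.registerSqlType("address", Types.VARCHAR);
paramMap.registerSqlType("hobby", Types.VARCHAR);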

A simple solution is to use SimpleJdbcInsert and add usingGeneratedKeyColumns("ID").
@Autowired
private DataSource dataSource;

@Override
public ResponseEntity<Void> insertEntity(Entity obj) {
    SimpleJdbcInsert simpleJdbcInsert = new SimpleJdbcInsert(dataSource);
    simpleJdbcInsert.withTableName(TABLE_APP_REPO).usingGeneratedKeyColumns("ID");
    BeanPropertySqlParameterSource paramSource = new BeanPropertySqlParameterSource(obj);
    try {
        simpleJdbcInsert.execute(paramSource);
    } catch (Exception e) {
        return ResponseEntity.badRequest().build();
    }
    return ResponseEntity.ok().build();
}
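If you also need the generated ID back, SimpleJdbcInsert can return it; a small variation on the snippet above:
// executeAndReturnKey() runs the same insert but hands back the generated "ID" column.
Number newId = simpleJdbcInsert.executeAndReturnKey(paramSource);
long id = newId.longValue();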


Using jdbcTemplate.query with parameters

I have 3 tables in my database: Lecture --< LectureGroups >-- Groups.
I want to get the schedule for a certain group on a certain day. I tried to do it this way:
@Repository
public class Schedule {

    private static final String GET_GROUP_DAY_SCHEDULE = "SELECT * FROM LECTURES " +
            "INNER JOIN LECTUREGROUPS ON LECTURES.ID = LECTUREGROUPS.LECTUREID " +
            "INNER JOIN GROUPS ON GROUPS.ID = LECTUREGROUPS.GROUPID " +
            "WHERE GROUPID = :GROUPID AND DATE = :DATE";

    @Autowired
    private JdbcTemplate jdbcTemplate;

    public List<Lecture> getGroupDayLectures(int groupId, LocalDateTime dateTime) {
        MapSqlParameterSource parameters = new MapSqlParameterSource()
                .addValue("groupid", groupId)
                .addValue("date", dateTime);
        return jdbcTemplate.query(GET_GROUP_DAY_SCHEDULE, new BeanPropertyRowMapper<>(Lecture.class), parameters);
    }
}
But I get an exception on the query call:
Caused by: org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of org.springframework.jdbc.core.namedparam.MapSqlParameterSource. Use setObject() with an explicit Types value to specify the type to use.
How can I fix it?
I also tried a variant with
private static final String GET_GROUP_DAY_SCHEDULE = "SELECT * FROM LECTURES " +
        "INNER JOIN LECTUREGROUPS ON LECTURES.ID = LECTUREGROUPS.LECTUREID " +
        "INNER JOIN GROUPS ON GROUPS.ID = LECTUREGROUPS.GROUPID " +
        "WHERE GROUPID = ? AND DATE = ?";

@Autowired
private JdbcTemplate jdbcTemplate;

public List<Lecture> getGroupDayLectures(int groupId, LocalDateTime dateTime) {
    return jdbcTemplate.query(GET_GROUP_DAY_SCHEDULE, new Object[]{groupId, dateTime}, new BeanPropertyRowMapper<>(Lecture.class));
}
and it works, but it returns only one Lecture in the list (there should be 3).
There is an overload of query in the JdbcTemplate class that takes the bind values as varargs:
public <T> List<T> query(String sql, RowMapper<T> rowMapper, @Nullable Object... args)
so it is very easy to use it this way:
private static final String GET_GROUP_DAY_SCHEDULE = "SELECT * FROM LECTURES " +
        "INNER JOIN LECTUREGROUPS ON LECTURES.ID = LECTUREGROUPS.LECTUREID " +
        "INNER JOIN GROUPS ON GROUPS.ID = LECTUREGROUPS.GROUPID " +
        "WHERE GROUPID = ? AND DATE = ?";

@Autowired
private JdbcTemplate jdbcTemplate;

public List<Lecture> getGroupDayLectures(int groupId, LocalDate date) {
    return jdbcTemplate.query(GET_GROUP_DAY_SCHEDULE, new BeanPropertyRowMapper<>(Lecture.class), groupId, date);
}
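If you would rather keep the named parameters from your first attempt, the template itself has to be a NamedParameterJdbcTemplate; a plain JdbcTemplate treats the MapSqlParameterSource as one ordinary bind value, which is what the exception complains about. A sketch of that variant:
@Autowired
private NamedParameterJdbcTemplate namedParameterJdbcTemplate;

public List<Lecture> getGroupDayLectures(int groupId, LocalDate date) {
    String sql = "SELECT * FROM LECTURES " +
            "INNER JOIN LECTUREGROUPS ON LECTURES.ID = LECTUREGROUPS.LECTUREID " +
            "INNER JOIN GROUPS ON GROUPS.ID = LECTUREGROUPS.GROUPID " +
            "WHERE GROUPID = :groupId AND DATE = :date";
    // Parameter names must match the :groupId and :date placeholders exactly.
    MapSqlParameterSource parameters = new MapSqlParameterSource()
            .addValue("groupId", groupId)
            .addValue("date", date);
    return namedParameterJdbcTemplate.query(sql, parameters, new BeanPropertyRowMapper<>(Lecture.class));
}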

How to implement Server-side processing of DataTables with JDBC so that it paginates?

I have a Spring Boot app with DataTables server-side processing and an Oracle database. I started by implementing one of the tutorials, and it worked. The tutorial uses JPA; I want to implement the same thing using JDBC. I created all the corresponding classes, the repository, and a new model with the same fields but without JPA annotations. But when I tried to fetch the data, it only let me get the first page, with no way to get to the second page. Below I will post extracts of the original and the added code. The original tutorial used these classes:
@Entity
@Table(name = "MYUSERS")
public class User {

    @Id
    @Column(name = "USER_ID")
    private Long id;

    @Column(name = "USER_NAME")
    private String name;

    @Column(name = "SALARY")
    private String salary;

    ...getters and setters
}
And
@Entity
public class UserModel {

    @Id
    private Long id;
    private String name;
    private String salary;
    private Integer totalRecords;

    @Transient
    private Integer rn;

    ...getters and setters
}
And I substituted these two classes with one like this:
public class NewUser {
    private Long id;
    private String name;
    private String salary;
    private Integer totalRecords;
    private Integer rn;

    ...getters and setters
}
The table itself has only 3 fields: id, name and salary; the other 2 fields are created and filled later.
The repository the original author has for the user looks like this:
public interface UserRepository extends JpaRepository<User, Long> {
    @Query(value = "SELECT * FROM MYUSERS", nativeQuery = true)
    List<User> findAllByUsernames(List<String> listOfUsernames);
}
My own repository looks like this:
@Repository
public class NewUserRepoImpl extends JdbcDaoSupport implements NewUserRepo {

    private static final String SELECT_ALL_SQL = "SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS";

    private final NamedParameterJdbcTemplate namedParameterJdbcTemplate;
    private final JdbcTemplate jdbctemplate;

    public NewUserRepoImpl(NamedParameterJdbcTemplate namedParameterJdbcTemplate, JdbcTemplate jdbctemplate, DataSource dataSource) {
        this.namedParameterJdbcTemplate = namedParameterJdbcTemplate;
        this.jdbctemplate = jdbctemplate;
        setDataSource(dataSource);
    }

    @Override
    public List<NewUser> findAll(PaginationCriteria pagination) {
        try {
            String paginatedQuery = AppUtil.buildPaginatedQueryForOracle(SELECT_ALL_SQL, pagination);
            return jdbctemplate.query(paginatedQuery, newUserRowMapper());
        } catch (DataAccessException e) {
            throw new EntityNotFoundException("No Entities Found");
        }
    }

    @Bean
    public RowMapper<NewUser> newUserRowMapper() {
        return (rs, i) -> {
            final NewUser newUser = new NewUser();
            newUser.setId(rs.getLong("ID"));
            newUser.setName(rs.getString("NAME"));
            newUser.setSalary(rs.getString("SALARY"));
            newUser.setTotalRecords(rs.getInt("TOTAL_RECORDS"));
            newUser.setTotalRecords(rs.getInt("RN"));
            return newUser;
        };
    }
}
The buildPaginatedQueryForOracle method transforms my query so that it also returns totalRecords and rn. Below I will post its output for both the original and my queries (they are the same, I checked).
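For reference, this is roughly the shape of such a wrapper (an illustration only: the real implementation is the tutorial's AppUtil, and the getPageNumber()/getPageSize() accessors on PaginationCriteria are assumed from the JSON dumps below):
// Wraps the base query so every row also carries COUNT(1) OVER() as total_records
// and ROWNUM as RN, then keeps only the rows of the requested page window.
public static String buildPaginatedQueryForOracle(String baseQuery, PaginationCriteria pagination) {
    int page = pagination.getPageNumber(); // assumed accessor
    int size = pagination.getPageSize();   // assumed accessor
    return "SELECT * FROM (SELECT FILTERED_ORDERED_RESULTS.*, COUNT(1) OVER() total_records, ROWNUM AS RN"
            + " FROM (SELECT BASEINFO.* FROM ( " + baseQuery + " ) BASEINFO ) FILTERED_ORDERED_RESULTS"
            + " ORDER BY id ASC )"
            + " WHERE RN > (" + page + " * " + size + ") AND RN <= (" + page + " + 1) * " + size;
}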
So, the main part, the controller. I left both the old and the new pieces in it for debugging purposes and just return one of the results:
@RequestMapping(value = "/users/paginated/orcl", method = RequestMethod.GET)
@ResponseBody
public String listUsersPaginatedForOracle(HttpServletRequest request, HttpServletResponse response, Model model) {
    DataTableRequest<User> dataTableInRQ = new DataTableRequest<User>(request);
    System.out.println(new Gson().toJson(dataTableInRQ));
    DataTableRequest<NewUser> dataTableInRQNew = new DataTableRequest<NewUser>(request);
    System.out.println(new Gson().toJson(dataTableInRQNew));
    PaginationCriteria pagination = dataTableInRQ.getPaginationRequest();
    System.out.println(new Gson().toJson(pagination));
    PaginationCriteria paginationNew = dataTableInRQNew.getPaginationRequest();
    System.out.println(new Gson().toJson(paginationNew));
    String baseQuery = "SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS";
    String paginatedQuery = AppUtil.buildPaginatedQueryForOracle(baseQuery, pagination);
    String paginatedQueryNew = AppUtil.buildPaginatedQueryForOracle(baseQuery, paginationNew);
    System.out.println(paginatedQuery);
    System.out.println(paginatedQueryNew);
    Query query = entityManager.createNativeQuery(paginatedQuery, UserModel.class);
    System.out.println("Query:");
    System.out.println(query);
    @SuppressWarnings("unchecked")
    List<UserModel> userList = query.getResultList();
    System.out.println(new Gson().toJson(userList));
    @SuppressWarnings("unchecked")
    List<NewUser> userListNew = newUserRepo.findAll(paginationNew);
    System.out.println(new Gson().toJson(userListNew));
    DataTableResults<UserModel> dataTableResult = new DataTableResults<UserModel>();
    DataTableResults<NewUser> dataTableResultNew = new DataTableResults<NewUser>();
    dataTableResult.setDraw(dataTableInRQ.getDraw());
    dataTableResultNew.setDraw(dataTableInRQNew.getDraw());
    dataTableResult.setListOfDataObjects(userList);
    dataTableResultNew.setListOfDataObjects(userListNew);
    if (!AppUtil.isObjectEmpty(userList)) {
        dataTableResult.setRecordsTotal(userList.get(0).getTotalRecords().toString());
        if (dataTableInRQ.getPaginationRequest().isFilterByEmpty()) {
            dataTableResult.setRecordsFiltered(userList.get(0).getTotalRecords().toString());
        } else {
            dataTableResult.setRecordsFiltered(Integer.toString(userList.size()));
        }
    }
    if (!AppUtil.isObjectEmpty(userListNew)) {
        dataTableResultNew.setRecordsTotal(userListNew.get(0).getTotalRecords().toString());
        if (dataTableInRQ.getPaginationRequest().isFilterByEmpty()) {
            dataTableResultNew.setRecordsFiltered(userListNew.get(0).getTotalRecords().toString());
        } else {
            dataTableResultNew.setRecordsFiltered(Integer.toString(userListNew.size()));
        }
    }
    System.out.println(new Gson().toJson(dataTableResult));
    System.out.println(new Gson().toJson(dataTableResultNew));
    return new Gson().toJson(dataTableResult);
}
So, I log out everything possible in the console. Here is the output:
{"uniqueId":"1579786571491","draw":"1","start":0,"length":5,"search":"","regex":false,"columns":[{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},{"index":1,"data":"name","name":"Name","searchable":true,"orderable":true,"search":"","regex":false},{"index":2,"data":"salary","name":"Salary","searchable":true,"orderable":true,"search":"","regex":false}],"order":{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},"isGlobalSearch":false,"maxParamsToCheck":3}
{"uniqueId":"1579786571491","draw":"1","start":0,"length":5,"search":"","regex":false,"columns":[{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},{"index":1,"data":"name","name":"Name","searchable":true,"orderable":true,"search":"","regex":false},{"index":2,"data":"salary","name":"Salary","searchable":true,"orderable":true,"search":"","regex":false}],"order":{"index":0,"data":"id","name":"ID","searchable":true,"orderable":true,"search":"","regex":false,"sortDir":"ASC"},"isGlobalSearch":false,"maxParamsToCheck":3}
{"pageNumber":0,"pageSize":5,"sortBy":{"mapOfSorts":{"id":"ASC"}},"filterBy":{"mapOfFilters":{},"globalSearch":false}}
{"pageNumber":0,"pageSize":5,"sortBy":{"mapOfSorts":{"id":"ASC"}},"filterBy":{"mapOfFilters":{},"globalSearch":false}}
SELECT * FROM (SELECT FILTERED_ORDERED_RESULTS.*, COUNT(1) OVER() total_records, ROWNUM AS RN FROM (SELECT BASEINFO.* FROM ( SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS ) BASEINFO ) FILTERED_ORDERED_RESULTS ORDER BY id ASC ) WHERE RN > (0 * 5) AND RN <= (0 + 1) * 5
SELECT * FROM (SELECT FILTERED_ORDERED_RESULTS.*, COUNT(1) OVER() total_records, ROWNUM AS RN FROM (SELECT BASEINFO.* FROM ( SELECT USER_ID as id, USER_NAME as name, SALARY as salary FROM MYUSERS ) BASEINFO ) FILTERED_ORDERED_RESULTS ORDER BY id ASC ) WHERE RN > (0 * 5) AND RN <= (0 + 1) * 5
Query:
org.hibernate.query.internal.NativeQueryImpl#3ea49a4
[{"id":3,"name":"user3","salary":"300","totalRecords":18},{"id":4,"name":"user4","salary":"400","totalRecords":18},{"id":5,"name":"user5","salary":"500","totalRecords":18},{"id":6,"name":"user6","salary":"600","totalRecords":18},{"id":7,"name":"user7","salary":"700","totalRecords":18}]
[{"id":3,"name":"user3","salary":"300","totalRecords":1},{"id":4,"name":"user4","salary":"400","totalRecords":2},{"id":5,"name":"user5","salary":"500","totalRecords":3},{"id":6,"name":"user6","salary":"600","totalRecords":4},{"id":7,"name":"user7","salary":"700","totalRecords":5}]
{"draw":"1","recordsFiltered":"18","recordsTotal":"18","data":[{"id":3,"name":"user3","salary":"300","totalRecords":18},{"id":4,"name":"user4","salary":"400","totalRecords":18},{"id":5,"name":"user5","salary":"500","totalRecords":18},{"id":6,"name":"user6","salary":"600","totalRecords":18},{"id":7,"name":"user7","salary":"700","totalRecords":18}]}
{"draw":"1","recordsFiltered":"1","recordsTotal":"1","data":[{"id":3,"name":"user3","salary":"300","totalRecords":1},{"id":4,"name":"user4","salary":"400","totalRecords":2},{"id":5,"name":"user5","salary":"500","totalRecords":3},{"id":6,"name":"user6","salary":"600","totalRecords":4},{"id":7,"name":"user7","salary":"700","totalRecords":5}]}
It helped me realize that:
- the incoming DataTableRequest is the same for both the JPA and the JDBC path;
- the PaginationCriteria are also the same;
- the paginated queries built with the method above are the same;
- the differences first show up in the lists: the JPA list retrieved with the native query has totalRecords = 18 for every row, while the JDBC repo running the same query returns 1, 2, 3... for each subsequent row.
It made me think that I should look at the Query made for JPA. But, as you can see in the log, System.out.println couldn't render it in a useful way (it only prints the NativeQueryImpl object reference).
Any advice on how to inspect it, and more importantly how to get the right total for each row, would be greatly appreciated!
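One detail worth double-checking in the JDBC row mapper above before digging into the JPA query: setTotalRecords is called twice, the second time with the RN column, so the row number overwrites the total, which matches the 1, 2, 3... values in the JDBC output exactly. A corrected mapper (assuming NewUser has the usual setRn setter) would be:
@Bean
public RowMapper<NewUser> newUserRowMapper() {
    return (rs, i) -> {
        final NewUser newUser = new NewUser();
        newUser.setId(rs.getLong("ID"));
        newUser.setName(rs.getString("NAME"));
        newUser.setSalary(rs.getString("SALARY"));
        newUser.setTotalRecords(rs.getInt("TOTAL_RECORDS"));
        newUser.setRn(rs.getInt("RN")); // was a second setTotalRecords(...) call
        return newUser;
    };
}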

Joining classes in ORMLite for Android throws SQL exception: could not find a foreign class or vice versa

I'm trying to create a join query using QueryBuilder for two different classes, a Product class and a Coupon class; the Coupon references a Product attribute, the storeId.
public class Coupon {
    @DatabaseField(columnName = TableColumns.PRODUCT, foreign = true, foreignColumnName = Product.TableColumns.STOREID)
    private Product product;
}

public class Product {
    @DatabaseField(columnName = TableColumns.ID, generatedId = true)
    private Integer id;

    @DatabaseField(columnName = TableColumns.STOREID, index = true, unique = true)
    private Integer storeId;
}
I need to get a Coupon list based on the Product storeId.
public static List<Coupon> getByProduct(Product product) throws SQLException {
    QueryBuilder<Coupon, String> couponBuilder = dao.queryBuilder();
    QueryBuilder<Product, Integer> productBuilder = Product.dao.queryBuilder();
    productBuilder.where().eq(Product.TableColumns.STOREID, product.getStoreId());
    return couponBuilder.join(productBuilder).query();
}
This query is throwing a SQL Exception:
04-22 11:26:08.058: W/System.err(19479): java.sql.SQLException: Could not find a foreign class com.cde.express.mccopa.model.Coupon field in class com.cde.express.mccopa.model.Product or vice versa
04-22 11:26:08.059: W/System.err(19479): at com.j256.ormlite.stmt.QueryBuilder.matchJoinedFields(QueryBuilder.java:554)
04-22 11:26:08.059: W/System.err(19479): at com.j256.ormlite.stmt.QueryBuilder.addJoinInfo(QueryBuilder.java:525)
04-22 11:26:08.059: W/System.err(19479): at com.j256.ormlite.stmt.QueryBuilder.join(QueryBuilder.java:316)
The exception says my classes are not related by a foreign field, but the Coupon class has a foreign Product attribute, annotated properly. I already checked the database; the values in the tables are correct.
How can I fix it?
Looking at the code inside QueryBuilder.java:
for (FieldType fieldType : tableInfo.getFieldTypes()) {
    // if this is a foreign field and its foreign-id field is the same as the other's id
    FieldType foreignIdField = fieldType.getForeignIdField();
    if (fieldType.isForeign() && foreignIdField.equals(joinedQueryBuilder.tableInfo.getIdField())) {
        joinInfo.localField = fieldType;
        joinInfo.remoteField = foreignIdField;
        return;
    }
}
// if this other field is a foreign field and its foreign-id field is our id
for (FieldType fieldType : joinedQueryBuilder.tableInfo.getFieldTypes()) {
    if (fieldType.isForeign() && fieldType.getForeignIdField().equals(idField)) {
        joinInfo.localField = idField;
        joinInfo.remoteField = fieldType;
        return;
    }
}
The above states that if you want to make a join using QueryBuilder, the foreign field has to line up with the other table's id field, so this seems to be a limitation of QueryBuilder.java in ORMLite. If the foreign key in Coupon.java had pointed at Product's id, it would have worked.
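To illustrate (a sketch only, reusing the field names from the question): dropping foreignColumnName so the foreign field stores Product's generated id would satisfy the check above.
public class Coupon {
    // Without foreignColumnName, ORMLite stores Product's id (the generatedId),
    // which is exactly what QueryBuilder's matchJoinedFields() is looking for.
    @DatabaseField(columnName = TableColumns.PRODUCT, foreign = true)
    private Product product;
}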
A quick workaround is to use a raw query string to achieve what you want.
For example:
final GenericRawResults<Coupon> results = couponDao.queryRaw("SELECT "
        + "product_table.product_id AS id, "
        + "product_table.store_id AS storeId "
        + " FROM coupon_table "
        + "JOIN product_table ON coupon_table.store_id = product_table.store_id "
        + "WHERE product_table.store_id = 1", new RawRowMapper<Coupon>() {
    @Override
    public Coupon mapRow(String[] columnNames, String[] resultColumns) throws SQLException {
        final Integer productId = Integer.parseInt(resultColumns[0]);
        final Integer storeId = Integer.parseInt(resultColumns[1]);
        final Product product = new Product();
        product.setId(productId);
        product.setStoreId(storeId);
        final Coupon coupon = new Coupon();
        coupon.setProduct(product);
        return coupon;
    }
});
final Coupon coupon = results.getResults().get(0);
final Product product = coupon.getProduct();
System.out.println("Product id is " + product.getId() + " , Store id is " + product.getStoreId());

Getting a PL/SQL array of structs from a stored procedure using SimpleJdbcCall

I'm trying to execute an Oracle stored procedure using SimpleJdbcCall. All tables and stored procedures are in the restaurant schema. The table looks like this:
CREATE TABLE STAFF
(
STAFF_ID NUMBER(5),
STAFF_FIRST_NAME VARCHAR2(10 BYTE) NOT NULL,
STAFF_LAST_NAME VARCHAR2(20 BYTE) NOT NULL,
STAFF_ROLE VARCHAR2(20 BYTE) NOT NULL,
STAFF_OTHER_DETAILS VARCHAR2(50 BYTE)
);
my type package:
CREATE OR REPLACE PACKAGE Staff_Types
AS
TYPE Staff_Collection IS TABLE OF Staff%ROWTYPE;
END Staff_Types;
my access package:
CREATE OR REPLACE PACKAGE Staff_TAPI
AS
FUNCTION getAllStaff RETURN Staff_Types.Staff_Collection;
END Staff_TAPI;
CREATE OR REPLACE PACKAGE BODY Staff_Tapi
AS
FUNCTION getAllStaff
RETURN Staff_Types.Staff_Collection
IS
all_staff Staff_Types.Staff_Collection;
BEGIN
SELECT *
BULK COLLECT INTO all_staff
FROM Staff;
RETURN all_staff;
END;
END Staff_Tapi;
Java Access:
@Component
@Qualifier("staffJdbcDAO")
public class StaffJDBCDAO implements StaffDAO {

    JdbcTemplate jdbcTemplate;
    SimpleJdbcCall getAllMembersSP;

    @Autowired
    @Qualifier("dataSource")
    DataSource dataSource;

    @Autowired
    @Qualifier("jdbcTemplate")
    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
        initializeStoredProceduresCalls();
    }

    private void initializeStoredProceduresCalls() {
        getAllMembersSP = new SimpleJdbcCall(jdbcTemplate);
        getAllMembersSP.withCatalogName("Staff_Tapi");
        getAllMembersSP.withFunctionName("getAllStaff");
        getAllMembersSP.declareParameters(
                new SqlOutParameter("return",
                        Types.OTHER,
                        "Staff_Types.Staff_Collection",
                        new SqlReturnStructArray<>(new StaffMapper())
                )
        );
        getAllMembersSP.compile();
    }

    @Override
    public List<Staff> getAllMembers() {
        Staff[] staff = getAllMembersSP.executeFunction(Staff[].class, new HashMap<String, Object>());
        return Arrays.asList(staff);
    }
}
mapping class:
public class StaffMapper implements StructMapper<Staff> {

    @Override
    public STRUCT toStruct(Staff staff, Connection connection, String typeName) throws SQLException {
        StructDescriptor descriptor = StructDescriptor.createDescriptor(typeName, connection);
        Object[] attributes = new Object[5];
        attributes[0] = staff.getId();
        attributes[1] = staff.getFirstName();
        attributes[2] = staff.getLastName();
        attributes[3] = staff.getProfession();
        attributes[4] = staff.getOtherDetails();
        return new STRUCT(descriptor, connection, attributes);
    }

    @Override
    public Staff fromStruct(STRUCT struct) throws SQLException {
        StructDescriptor descriptor = struct.getDescriptor();
        ResultSetMetaData metaData = descriptor.getMetaData();
        Object[] attributes = struct.getAttributes();
        Map<String, Object> attributeMap = new HashMap<>();
        int idx = 1;
        for (Object attribute : attributes) {
            attributeMap.put(metaData.getColumnName(idx++), attribute);
        }
        int id = ((Number) attributeMap.get("STAFF_ID")).intValue();
        String firstName = (String) attributeMap.get("STAFF_FIRST_NAME");
        String lastName = (String) attributeMap.get("STAFF_LAST_NAME");
        String staffRole = (String) attributeMap.get("STAFF_ROLE");
        String otherDetails = (String) attributeMap.get("STAFF_OTHER_DETAILS");
        return new Staff(id, firstName, lastName, staffRole, otherDetails);
    }
}
and staff:
public class Staff {
    private int id;
    private String firstName;
    private String lastName;
    private String profession;
    private String otherDetails;

    public Staff(int id, String firstName, String lastName, String profession, String otherDetails) {
        this.id = id;
        this.firstName = firstName;
        this.lastName = lastName;
        this.profession = profession;
        this.otherDetails = otherDetails;
    }

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    // and other getters and setters
}
When I execute getAllMembers from StaffDAO I constantly get:
CallableStatementCallback; uncategorized SQLException for SQL
[{? = call STAFF_TAPI.GETALLSTAFF()}];
SQL state [99999]; error code [17004]; Invalid column type: 1111;
When I change the return type parameter to Types.ARRAY I get:
CallableStatementCallback; uncategorized SQLException for SQL
[{? = call STAFF_TAPI.GETALLSTAFF()}];
SQL state [99999]; error code [17074];
invalid name pattern: restaurant.Staff_Types.Staff_Collection;
I tried it both ways with the pattern "Staff_Types.Staff_Collection", getting the same results. I have been trying to do this for nearly 2 days without any idea what I should do; if anyone has any suggestions I will be grateful.
You cannot load a PL/SQL record from a stored procedure through JDBC. In fact, you cannot even load such a type from Oracle SQL. See also this question for details:
Fetch Oracle table type from stored procedure using JDBC
You can only load SQL types through JDBC (as opposed to PL/SQL types). Given your example, you'll need to write:
-- You cannot really avoid this redundancy
CREATE TYPE STAFF AS OBJECT
(
STAFF_ID NUMBER(5),
STAFF_FIRST_NAME VARCHAR2(10 BYTE) NOT NULL,
STAFF_LAST_NAME VARCHAR2(20 BYTE) NOT NULL,
STAFF_ROLE VARCHAR2(20 BYTE) NOT NULL,
STAFF_OTHER_DETAILS VARCHAR2(50 BYTE)
);
CREATE TYPE STAFF_TABLE AS TABLE OF STAFF;
And then:
CREATE OR REPLACE PACKAGE Staff_TAPI
AS
FUNCTION getAllStaff RETURN STAFF_TABLE;
END Staff_TAPI;
In order to easily integrate your PL/SQL call, and since it is already built as a function: have you thought about something like this?
select * from TABLE(CAST(Staff_Tapi.getAllStaff() as Staff_Types.Staff_Collection))
This way, you can execute it easily as a regular JDBC query. Once done, just process the ResultSet it returns with some minor variant of your fromStruct method in order to return a List<Staff> to whatever business logic you have on top of it. Hope you find this useful!
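Wired into the Spring JDBC pieces you already have, that could look roughly like this (a sketch: it assumes the function has been switched to return the SQL-level STAFF_TABLE type from above, and it maps columns by name instead of going through STRUCTs):
public List<Staff> getAllMembers() {
    // TABLE() turns the collection returned by the function into rows we can query.
    String sql = "SELECT * FROM TABLE(Staff_Tapi.getAllStaff())";
    return jdbcTemplate.query(sql, (rs, rowNum) -> new Staff(
            rs.getInt("STAFF_ID"),
            rs.getString("STAFF_FIRST_NAME"),
            rs.getString("STAFF_LAST_NAME"),
            rs.getString("STAFF_ROLE"),
            rs.getString("STAFF_OTHER_DETAILS")));
}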
You may want to capitalize your custom type name in the Java code, like this:
getAllMembersSP.declareParameters(
        new SqlOutParameter("return",
                Types.OTHER,
                "STAFF_TYPES.STAFF_COLLECTION",
                new SqlReturnStructArray<>(new StaffMapper())
        )
);

Hibernate SQL Query result Mapping/Convert TO Object/Class/Bean

Cases 1 and 2: selecting table.* / all columns works fine:
String sql = "select t_student.* from t_student";
//String sql = "select t_student.id, t_student.name, ... from t_student"; // select all columns
SQLQuery query = session.createSQLQuery(sql);
query.addEntity(Student.class); // or query.addEntity("alias", Student.class);
//query.list(); // [Student#..., Student#..., Student#...]
query.setResultTransformer(Transformers.ALIAS_TO_ENTITY_MAP); // or another transformer
query.list(); // [{Student(or alias)=Student#...}, {Student=Student#...}]
Case 3: selecting only some of the columns fails:
String sql = "select t_student.id, t_student.name, t_student.sex from t_student";
SQLQuery query = session.createSQLQuery(sql);
query.addEntity(Student.class);
query.setResultTransformer(Transformers.ALIAS_TO_ENTITY_MAP);
query.list(); // Exception: invalid column/no column
I want "3" to work OK, and have the result mapped to Student.class, like: Student[id=?, name=?, sex=?, (other fields are null/default)].
I have no idea how to fix this error, please help me!
You can go further and add
.setResultTransformer(Transformers.aliasToBean(YOUR_DTO.class));
to automatically map the result to your custom DTO object; see also Returning non-managed entities.
For example:
public List<MessageExtDto> getMessagesForProfile2(Long userProfileId) {
    Query query = getSession().createSQLQuery(" "
            + " select a.*, b.* "
            + " from messageVO AS a "
            + " INNER JOIN ( SELECT max(id) AS id, count(*) AS count FROM messageVO GROUP BY messageConversation_id) as b ON a.id = b.id "
            + " where a.id > 0 "
            + " ")
            .addScalar("id", new LongType())
            .addScalar("message", new StringType())
            ......... your mappings
            .setResultTransformer(Transformers.aliasToBean(MessageExtDto.class));
    List<MessageExtDto> list = query.list();
    return list;
}
I want "3" to work ok, and let the result can be mapped to Student.class
That's possible using
Query createNativeQuery(String sqlString, String resultSetMapping)
In the second argument you could tell the name of the result mapping. For example:
1) Let's consider a Student entity; the magic is in the SqlResultSetMapping annotation:
import java.io.Serializable;

import javax.persistence.ColumnResult;
import javax.persistence.ConstructorResult;
import javax.persistence.Entity;
import javax.persistence.SqlResultSetMapping;
import javax.persistence.Table;

@Entity
@Table(name = "student")
@SqlResultSetMapping(name = "STUDENT_MAPPING", classes = {@ConstructorResult(
        targetClass = Student.class, columns = {
        @ColumnResult(name = "name"),
        @ColumnResult(name = "address")
})})
public class Student implements Serializable {

    private String name;
    private String address;

    /* Constructor for the result mapping; the key is the order of the args */
    public Student(String aName, String anAddress) {
        this.name = aName;
        this.address = anAddress;
    }

    // the rest of the entity
}
2) Now you can execute a query whose results will be mapped by the STUDENT_MAPPING logic:
String query = "SELECT s FROM student s";
String mapping = "STUDENT_MAPPING";
Query query = myEntityManager.createNativeQuery(query, mapping);
#SuppressWarnings("unchecked")
List<Student> students = query.getResultList();
for (Student s : students) {
s.getName(); // ...
}
Note: I think it's not possible to avoid the unchecked warning.
There are only two ways. You can use the 1st or the 2nd snippet; according to the Hibernate documentation you should prefer the 2nd.
Or you can get just a list of object arrays, like this:
String sql = "select name, sex from t_student";
SQLQuery query = session.createSQLQuery(sql);
query.addScalar("name", StringType.INSTANCE);
query.addScalar("sex", StringType.INSTANCE);
query.list();
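If the goal is still to end up with Student objects rather than Object[] rows, the two ideas combine (a sketch for case 3 from the question, assuming a numeric id property on Student):
String sql = "select t_student.id as id, t_student.name as name, t_student.sex as sex from t_student";
SQLQuery query = session.createSQLQuery(sql);
query.addScalar("id", LongType.INSTANCE);   // assumes Student.id is a Long
query.addScalar("name", StringType.INSTANCE);
query.addScalar("sex", StringType.INSTANCE);
// aliasToBean fills the matching properties; the remaining fields stay null/default.
query.setResultTransformer(Transformers.aliasToBean(Student.class));
List<Student> students = query.list();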
I had the same problem with an HQL query. I solved it by changing the transformer.
The problem is caused by code that transforms the result as a Map, which is not suitable for an alias bean. You can see the offending code below: it casts the target to a Map and puts the new field into it.
Class: org.hibernate.property.access.internal.PropertyAccessMapImpl.SetterImpl
Method: set
@Override
@SuppressWarnings("unchecked")
public void set(Object target, Object value, SessionFactoryImplementor factory) {
    ( (Map) target ).put( propertyName, value );
}
I solved the problem by duplicating the transformer and changing the code. You can see the code in the project.
Link: https://github.com/robeio/robe/blob/DW1.0-migration/robe-hibernate/src/main/java/io/robe/hibernate/criteria/impl/hql/AliasToBeanResultTransformer.java
Class:
import java.lang.reflect.Field;
import java.util.Map;

import io.robe.hibernate.criteria.api.query.SearchQuery;
import org.hibernate.HibernateException;
import org.hibernate.transform.AliasedTupleSubsetResultTransformer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AliasToBeanResultTransformer extends AliasedTupleSubsetResultTransformer {

    private static final Logger LOGGER = LoggerFactory.getLogger(AliasToBeanResultTransformer.class);

    private final Class resultClass;

    // Holds the fields of the transform class as a map, keyed by field name.
    private Map<String, Field> fieldMap;

    public AliasToBeanResultTransformer(Class resultClass) {
        if (resultClass == null) {
            throw new IllegalArgumentException("resultClass cannot be null");
        }
        fieldMap = SearchQuery.CacheFields.getCachedFields(resultClass);
        this.resultClass = resultClass;
    }

    @Override
    public boolean isTransformedValueATupleElement(String[] aliases, int tupleLength) {
        return false;
    }

    @Override
    public Object transformTuple(Object[] tuple, String[] aliases) {
        Object result;
        try {
            result = resultClass.newInstance();
            for (int i = 0; i < aliases.length; i++) {
                String name = aliases[i];
                Field field = fieldMap.get(name);
                if (field == null) {
                    LOGGER.error(name + " field not found in " + resultClass.getName() + " class ! ");
                    continue;
                }
                field.set(result, tuple[i]);
            }
        } catch (InstantiationException e) {
            throw new HibernateException("Could not instantiate resultclass: " + resultClass.getName());
        } catch (IllegalAccessException e) {
            throw new HibernateException("Could not instantiate resultclass: " + resultClass.getName());
        }
        return result;
    }
}
After creating the new transformer, you can use it like below:
query.setResultTransformer(new AliasToBeanResultTransformer(YOUR_DTO.class));
You can map it automatically:
Your model, Student.java:
public class Student {
    private String name;
    private String address;
}
Repository
String sql = "Select * from student";
Query query = em.createNativeQuery(sql, Student.class);
List ls = query.getResultList();
so it will automatically map the result to the Student class.
