I'm in the process of creating a front end for a database-driven application and could do with some advice. I have the following basic entities in my database:
aspect
aspect_value
As you can imagine, I can have many aspect values for each aspect, so that when a user records an aspect they can select more than one value per aspect... simple.
I've created a POJO entity to model each aspect, and my question is this: using Spring and the JdbcTemplate, how would I be able to create the desired composite relationship using org.springframework.jdbc.core.RowMapper, i.e. each aspect object containing one or more aspect value objects? And for that matter, I would appreciate it if you could let me know whether this is the best way to do it. I'm keen at some point to delve deeper into ORM, but I've been put off so far by the number of issues I've encountered, which has slowed down my development and led to the decision to use JdbcTemplate instead.
Thanks
You can use a RowMapper if you are storing aspect_values in your aspect object as objects. Each call to RowMapper returns an object, so you'll end up with a collection of aspect_values. If you need to build an aspect object (or objects) with values contained in the aspect_value table, then a ResultSetExtractor is the better choice.
Here are my examples as promised. I have to type these in by hand because our development network is on an internal network only so any typos are copy errors and not errors in the code. These are abbreviated versions of inner classes in my DAO:
This maps a single row in the ResultSet to an object:
public List<MessageSummary> getMessages(Object[] params)
{
    // mList is filled with objects created in MessageRowMapper,
    // so the length of the list is equal to the number of rows in the ResultSet
    List<MessageSummary> mList = jdbcTemplate.query(sqlStr, new MessageRowMapper(),
            params);
    return mList;
}
private final class MessageRowMapper implements RowMapper<MessageSummary>
{
    @Override
    public MessageSummary mapRow(ResultSet rs, int i) throws SQLException
    {
        MessageSummary ms = new MessageSummary();
        ms.setId(rs.getInt("id"));
        ms.setMessage(rs.getString("message"));
        return ms;
    }
}
ResultSetExtractor works on the same idea except you map the entire set yourself instead of just converting a row into an object. This is useful when your object has attributes from multiple rows.
public Map<Integer, List<String>> getResults(Object[] params)
{
    Map<Integer, List<String>> result = jdbcTemplate.query(sqlStr, new ResultExtractor(),
            params);
    return result;
}
private final class ResultExtractor implements ResultSetExtractor<Map<Integer, List<String>>>
{
    @Override
    public Map<Integer, List<String>> extractData(ResultSet rs)
            throws SQLException, DataAccessException
    {
        Map<Integer, List<String>> resultMap = new HashMap<Integer, List<String>>();
        while (rs.next())
        {
            int id = rs.getInt("id");
            List<String> nameList = resultMap.get(id);
            if (nameList == null)
            {
                nameList = new ArrayList<String>();
                resultMap.put(id, nameList);
            }
            nameList.add(rs.getString("name"));
        }
        return resultMap;
    }
}
The RowMapper interface provides a method

Object mapRow(ResultSet rs, int rowNum) throws SQLException

You implement this method in a class and provide code to populate your entity object with the values held in the current row of the ResultSet rs. To obtain the ResultSet itself from the database, you can use the JdbcTemplate.query method's overload

List jdbcTemplate.query(String sql, RowMapper mapper)
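Applied to the original question, the grouping of aspect_value rows under their parent aspect is exactly the ResultSetExtractor case. Below is a minimal sketch of that grouping logic; the class names, column names, and the join query in the comment are assumptions (the original post only names the aspect and aspect_value entities). With Spring, the body of extract would live inside ResultSetExtractor.extractData and the extractor would be passed to jdbcTemplate.query:

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

class AspectValue {
    final int id;
    final String value;
    AspectValue(int id, String value) { this.id = id; this.value = value; }
}

class Aspect {
    final int id;
    final String name;
    final List<AspectValue> values = new ArrayList<>();
    Aspect(int id, String name) { this.id = id; this.name = name; }
}

class AspectExtractor {
    // Expects rows from something like:
    //   select a.id, a.name, v.id as value_id, v.value
    //   from aspect a join aspect_value v on v.aspect_id = a.id
    // Consecutive rows for the same aspect id are folded into one Aspect object.
    static List<Aspect> extract(ResultSet rs) throws SQLException {
        Map<Integer, Aspect> byId = new LinkedHashMap<>();
        while (rs.next()) {
            int aspectId = rs.getInt("id");
            Aspect aspect = byId.get(aspectId);
            if (aspect == null) {
                aspect = new Aspect(aspectId, rs.getString("name"));
                byId.put(aspectId, aspect);
            }
            aspect.values.add(new AspectValue(rs.getInt("value_id"), rs.getString("value")));
        }
        return new ArrayList<>(byId.values());
    }
}
```

A LinkedHashMap is used so the aspects come back in the order the database returned them.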
Using JOOQ 3.5.2 with MySQL 5.7, I'm trying to accomplish the following...
MySQL has a set of JSON functions which allow for path targeted manipulation of properties inside larger documents.
I'm trying to make an abstraction which takes advantage of this using jOOQ. I began by creating a JSON-serializable document model which keeps track of changes, and then implemented a jOOQ custom Binding for it.
In this binding, I have all the state information necessary to generate calls to these MySQL JSON functions with the exception of the qualified name or alias of the column being updated. A reference to this name is necessary for updating existing JSON documents in-place.
I have been unable to find a way to access this name from the *Context types available in the Binding interface.
I have been considering implementing a VisitListener to capture these field names and pass them through the Scope custom data map, but that option seems quite fragile.
What might be the best way to gain access to the name of the field or alias being addressed within my Binding implementation?
--edit--
OK, to help clarify my goals here, take the following DDL:
create table widget (
    widget_id bigint(20) NOT NULL,
    jm_data json DEFAULT NULL,
    primary key (widget_id)
) ENGINE=InnoDB DEFAULT CHARSET=utf8
Now let's assume jm_data will hold the JSON representation of a java.util.Map<String,String>. For this JOOQ provides a very nice extension API by implementing and registering a custom data-type binding (in this case using Jackson):
public class MySQLJSONJacksonMapBinding implements Binding<Object, Map<String, String>> {

    private static final ObjectMapper mapper = new ObjectMapper();

    // The converter does all the work
    @Override
    public Converter<Object, Map<String, String>> converter() {
        return new Converter<Object, Map<String, String>>() {
            @Override
            public Map<String, String> from(final Object t) {
                try {
                    return t == null ? null
                            : mapper.readValue(t.toString(),
                                    new TypeReference<Map<String, String>>() {
                                    });
                } catch (final IOException e) {
                    throw new RuntimeException(e);
                }
            }

            @Override
            public Object to(final Map<String, String> u) {
                try {
                    return u == null ? null
                            : mapper.writer().writeValueAsString(u);
                } catch (final JsonProcessingException e) {
                    throw new RuntimeException(e);
                }
            }

            @Override
            public Class<Object> fromType() {
                return Object.class;
            }

            @Override
            public Class toType() {
                return Map.class;
            }
        };
    }

    // Rendering a bind variable for the binding context's value and casting it to the json type
    @Override
    public void sql(final BindingSQLContext<Map<String, String>> ctx) throws SQLException {
        // Depending on how you generate your SQL, you may need to explicitly distinguish
        // between jOOQ generating bind variables or inlined literals. If so, use this check:
        // ctx.render().paramType() == INLINED
        ctx.render().visit(DSL.val(ctx.convert(converter()).value()));
    }

    // Registering VARCHAR types for JDBC CallableStatement OUT parameters
    @Override
    public void register(final BindingRegisterContext<Map<String, String>> ctx)
            throws SQLException {
        ctx.statement().registerOutParameter(ctx.index(), Types.VARCHAR);
    }

    // Converting the Map to a String value and setting that on a JDBC PreparedStatement
    @Override
    public void set(final BindingSetStatementContext<Map<String, String>> ctx) throws SQLException {
        ctx.statement().setString(ctx.index(),
                Objects.toString(ctx.convert(converter()).value(), null));
    }

    // Getting a String value from a JDBC ResultSet and converting that to a Map
    @Override
    public void get(final BindingGetResultSetContext<Map<String, String>> ctx) throws SQLException {
        ctx.convert(converter()).value(ctx.resultSet().getString(ctx.index()));
    }

    // Getting a String value from a JDBC CallableStatement and converting that to a Map
    @Override
    public void get(final BindingGetStatementContext<Map<String, String>> ctx) throws SQLException {
        ctx.convert(converter()).value(ctx.statement().getString(ctx.index()));
    }

    // Setting a value on a JDBC SQLOutput (useful for Oracle OBJECT types)
    @Override
    public void set(final BindingSetSQLOutputContext<Map<String, String>> ctx) throws SQLException {
        throw new SQLFeatureNotSupportedException();
    }

    // Getting a value from a JDBC SQLInput (useful for Oracle OBJECT types)
    @Override
    public void get(final BindingGetSQLInputContext<Map<String, String>> ctx) throws SQLException {
        throw new SQLFeatureNotSupportedException();
    }
}
...this implementation is attached at build time by the code-gen like so:
<customTypes>
    <customType>
        <name>JsonMap</name>
        <type>java.util.Map<String,String></type>
        <binding>com.orbiz.jooq.bindings.MySQLJSONJacksonMapBinding</binding>
    </customType>
</customTypes>
<forcedTypes>
    <forcedType>
        <name>JsonMap</name>
        <expression>jm_.*</expression>
        <types>json</types>
    </forcedType>
</forcedTypes>
...so with this in place, we have a nice, strongly typed Java Map which we can manipulate in our application code. The binding implementation, though, always writes the entire map contents to the JSON column, even if only a single map entry has been inserted, updated, or deleted. This implementation treats the MySQL JSON column like a normal VARCHAR column.
This approach poses two problems of varying significance depending on usage:

1. Updating only a portion of a large map with many thousands of entries produces unnecessary SQL wire traffic as a side effect.
2. If the contents of the map are user-editable, and there are multiple users editing the contents at the same time, the changes of one may be overwritten by the other, even if they are non-conflicting.
MySQL 5.7 introduced the JSON data type, and a number of functions for manipulating documents in SQL. These functions make it possible to address the contents of the JSON documents, allowing for targeted updates of single properties. Continuing our example...:
insert into DEV.widget (widget_id, jm_data)
values (1, '{"key0":"val0","key1":"val1","key2":"val2"}');
...the above Binding implementation would generate SQL like this if I were to change the java Map "key1" value to equal "updated_value1" and invoke an update on the record:
update DEV.widget
set DEV.widget.jm_data = '{"key0":"val0","key1":"updated_value1","key2":"val2"}'
where DEV.widget.widget_id = 1;
...notice the entire JSON string is being updated. MySQL can handle this more efficiently using the json_set function:
update DEV.widget
set DEV.widget.jm_data = json_set( DEV.widget.jm_data, '$."key1"', 'updated_value1' )
where DEV.widget.widget_id = 1;
So, if I want to generate SQL like this, I need to first keep track of changes made to my Map from when it was initially read from the DB until it is to be updated. Then, using this change information, I can generate a call to the json_set function which will allow me to update only the modified properties in place.
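The change tracking described above could be sketched as a thin map wrapper that remembers which keys were touched since the document was read; the class and method names here are hypothetical, not part of jOOQ or the original code:

```java
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Records which keys have been put or removed since construction, so an
// UPDATE can target only those JSON paths with json_set / json_remove.
// Note: HashMap.putAll and entrySet mutation bypass put(), so this sketch
// does not track them.
class ChangeTrackingMap extends HashMap<String, String> {

    private final Set<String> dirtyKeys = new LinkedHashSet<>();

    ChangeTrackingMap(Map<String, String> initial) {
        super(initial);
    }

    @Override
    public String put(String key, String value) {
        dirtyKeys.add(key);
        return super.put(key, value);
    }

    @Override
    public String remove(Object key) {
        dirtyKeys.add(String.valueOf(key));
        return super.remove(key);
    }

    Set<String> dirtyKeys() {
        return dirtyKeys;
    }
}
```

At update time, dirtyKeys() gives exactly the set of `$."key"` paths that need to appear in the json_set call.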
Finally, getting to my actual question: you'll notice that in the SQL I wish to generate, the value of the column being updated contains a reference to the column itself: json_set( DEV.widget.jm_data, ... ). This column (or alias) name does not seem to be available to the Binding API. Is there a way to identify the name of the column or alias being updated from within my Binding implementation?
Your Binding implementation is the wrong place to look for a solution to this problem. You don't really want to change the binding of your column to somehow magically know of this json_set function, which does an incremental update of the json data rather than a full replacement. When you use UpdatableRecord.store() (which you seem to be using), the expectation is for any Record.field(x) to reflect the content of the database row exactly, not a delta. Of course, you could implement something similar in the sql() method of your binding, but it would be very difficult to get right and the binding would not be applicable to all use-cases.
Hence, in order to do what you want to achieve, simply write an explicit UPDATE statement with jOOQ, enhancing the jOOQ API using plain SQL templating.
// Assuming this static import
import static org.jooq.impl.DSL.*;
public static Field<Map<String, String>> jsonSet(
        Field<Map<String, String>> field,
        String key,
        String value
) {
    return field("json_set({0}, {1}, {2})", field.getDataType(), field, inline(key), val(value));
}
Then, use your library method:
using(configuration)
    .update(WIDGET)
    .set(WIDGET.JM_DATA, jsonSet(WIDGET.JM_DATA, "$.\"key1\"", "updated_value1"))
    .where(WIDGET.WIDGET_ID.eq(1))
    .execute();
If this is getting too repetitive, I'm sure you can factor out common parts as well in some API of yours.
I'm trying to implement data transfer from one DB type to another via JDBC.
With resultSet.getMetaData().getColumnType() I can get the JDBCType of a column.
How can I use it to get java Class object to be able to use resultSet.getObject(i, XXX.class)?
Of course I can do something like this https://stackoverflow.com/a/5253901/7849631
Does any library already implement this according to the DB vendor? Or maybe there is a special result set floating around somewhere?
Here's a simple example to understand my problem:
private List<Map<String, Object>> readDataFromResultSet(ResultSet resultSet,
        Map<String, JDBCType> columnNamesTypes) throws SQLException {
    List<Map<String, Object>> dataToTransfer = new ArrayList<>();
    while (resultSet.next()) {
        Map<String, Object> value = new LinkedHashMap<>();
        for (String columnName : columnNamesTypes.keySet()) {
            final JDBCType type = columnNamesTypes.get(columnName);
            value.put(columnName, resultSet.getObject(columnName, convertJDBCTypeToClass(type)));
        }
        dataToTransfer.add(value);
    }
    return dataToTransfer;
}
UPDATE: this has been resolved with org.hibernate.type.descriptor.sql.JdbcTypeJavaClassMappings
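For reference, a hand-rolled convertJDBCTypeToClass needs nothing beyond java.sql. The sketch below covers only a few common types and is deliberately not exhaustive; Appendix B of the JDBC specification defines the full mapping, and individual drivers may differ:

```java
import java.math.BigDecimal;
import java.sql.JDBCType;

final class JdbcTypeMapper {

    // Maps a JDBCType to the Java class commonly used with
    // ResultSet.getObject(column, Class). Unknown types fall back to Object.
    static Class<?> toJavaClass(JDBCType type) {
        switch (type) {
            case BIT:
            case BOOLEAN:       return Boolean.class;
            case TINYINT:       return Byte.class;
            case SMALLINT:      return Short.class;
            case INTEGER:       return Integer.class;
            case BIGINT:        return Long.class;
            case REAL:          return Float.class;
            case FLOAT:
            case DOUBLE:        return Double.class;
            case NUMERIC:
            case DECIMAL:       return BigDecimal.class;
            case CHAR:
            case VARCHAR:
            case LONGVARCHAR:   return String.class;
            case DATE:          return java.sql.Date.class;
            case TIME:          return java.sql.Time.class;
            case TIMESTAMP:     return java.sql.Timestamp.class;
            case BINARY:
            case VARBINARY:
            case LONGVARBINARY: return byte[].class;
            default:            return Object.class;
        }
    }
}
```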
I'm working on some web services, which I'm implementing with Hibernate. The issue is that I need to access the names of the entity tables (say, those in @Table(name = "Table_A")).
The code is generated, so I cannot alter the entity classes themselves to give me what I need.
This is what I was doing so far:
public static <T extends Entity<K>, K extends Serializable>
String getTableName(Class<T> objClass) {
    Table table = objClass.getAnnotation(Table.class);
    return (table != null) ? table.name() : null;
}
However, after some research I discovered that reflection is not the best performance-wise, and since this method will be called a lot, I'm looking for another way to go about it.
I'm trying to follow the advice given here:
Performance of calling Method/Field.getAnnotation(Class) several times vs. Pre-caching this data in a Map.
This is what I came up with:
public final class EntityUtils {

    // initialized eagerly: the private constructor is never invoked
    private static final HashMap<String, String> entityCache = new HashMap<String, String>();

    private EntityUtils() {
    }

    public static <T extends Entity<K>, K extends Serializable>
    String getTableName_New(Class<T> objClass) {
        String tableName = entityCache.get(objClass.getCanonicalName());
        if (tableName == null) {
            Table table = objClass.getAnnotation(Table.class);
            if (table != null) {
                tableName = table.name();
                entityCache.put(objClass.getCanonicalName(), tableName);
            }
        }
        return tableName;
    }
}
However, I'm not sure about this. Is it a good idea to cache reflection data in a static map? Is there any alternative way to accomplish this?
Ideally, I would use a Guava cache with weak keys; that way you're not keeping any references to the class object in case you use some advanced ClassLoader magic.
LoadingCache<Class<?>, String> tableNames = CacheBuilder.newBuilder()
    .weakKeys()
    .build(new CacheLoader<Class<?>, String>() {
        public String load(Class<?> key) {
            Table table = key.getAnnotation(Table.class);
            return table == null ? key.getSimpleName() : table.name();
        }
    });
Usage:
String tablename = tableNames.get(yourClass);
See: Caches Explained
On the other hand: annotations are cached by the JDK as well, so your previous approach is probably fine.
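If you'd rather avoid the Guava dependency, the same idea can be sketched with a plain ConcurrentHashMap and computeIfAbsent. Note this keeps strong references to the Class keys (no weakKeys() equivalent), and the @Table annotation below is a local stand-in for javax.persistence.Table, declared only so the example compiles on its own:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Stand-in for javax.persistence.Table, for the sake of a self-contained example.
@Retention(RetentionPolicy.RUNTIME)
@interface Table {
    String name();
}

final class TableNames {

    private static final Map<Class<?>, String> CACHE = new ConcurrentHashMap<>();

    // Reflection runs at most once per class; later calls are a map lookup.
    // Falls back to the simple class name when the annotation is absent.
    static String get(Class<?> type) {
        return CACHE.computeIfAbsent(type, c -> {
            Table table = c.getAnnotation(Table.class);
            return table == null ? c.getSimpleName() : table.name();
        });
    }
}

@Table(name = "Table_A")
class ExampleEntity {
}
```

computeIfAbsent also makes the cache safe for concurrent callers, which the plain HashMap in the question is not.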
So when we use JDBI to query the database, the result comes back as Map<String, Object>.
I want to get it as my customized object (via its constructor) instead of a Map<String, Object>.
DBI dbi = establishConnection(url, userName, passWord);
Handle handle = dbi.open();
List<Map<String, Object>> rs = handle.select("select * from sometable");
Instead I want to use:
List<customizedObject> rs = handle.select("select * from sometable");
Where customizedObject class is an object that contains all the column properties with it.
Is there any way to do this? I found some relative documentation, but I cannot really understand the implementation.
http://jdbi.org/sql_object_api_queries/
Please also see the previous page in the documentation that shows how to link your Handle or DBI with the mappers.
Essentially, you need a mapper to convert the ResultSet to the desired object and an interface to refer to the mapper.
Let's assume a minimal example. First the mapper needs to be provided:
public class CustomizedObjectMapper implements ResultSetMapper<customizedObject> {

    @Override
    public customizedObject map(int index, ResultSet r, StatementContext ctx)
            throws SQLException {
        return new customizedObject(r.getString("uuid"), r.getString("other_column"));
    }
}
Then we need an interface to define which query provides the data that is passed to the mapper class. One result row leads to one invocation of CustomizedObjectMapper.map(...):
@RegisterMapper(CustomizedObjectMapper.class)
public interface CustomizeObjectQuery {

    @SqlQuery("select uuid, other_column from schema.relation")
    List<customizedObject> get();
}
Finally, the objects can be retrieved: List<customizedObject> test = dbi.open(CustomizeObjectQuery.class).get().
You can also put the components together individually like so, and omit the interface:

dbi.open().createQuery("select uuid, other_column from schema.relation").map(new CustomizedObjectMapper()).list()
A few days into Spring now, integrating Spring JDBC into my web application. I was successfully able to perform CRUD operations on my DB, and I'm impressed with the boiler-plate code reduction. But I am failing to use the query*() methods provided in NamedParameterJdbcTemplate. Most of the examples on the internet show the usage of either RowMapper or ResultSetExtractor. Though both uses are fine, they force me to create classes which implement these interfaces. I have to create a bean for every type of data I am loading from the DB (or maybe I am mistaken).
The problem arises in code sections where I have used something like this:

String query = "select username, password from usertable where username=?";
ps = conn.prepareStatement(query);
ps.setString(1, username);
rs = ps.executeQuery();
if (rs.next()) {
    String username = rs.getString("username");
    String password = rs.getString("password");
    // Performs operations on them
}
As these values are not stored in any bean but used directly, I am not able to integrate JdbcTemplate in these kinds of situations.
Another situation arises when I am extracting only part of the properties present in a bean from my database.
Example:
public class MangaBean {
    private String author;
    private String title;
    private String isbn;
    private String releaseDate;
    private String rating;
    // getters and setters
}
Mapper:
public class MangaBeanMapper implements RowMapper<MangaBean> {

    @Override
    public MangaBean mapRow(ResultSet rs, int arg1) throws SQLException {
        MangaBean mb = new MangaBean();
        mb.setAuthor(rs.getString("author"));
        mb.setTitle(rs.getString("title"));
        mb.setIsbn(rs.getString("isbn"));
        mb.setReleaseDate(rs.getString("releaseDate"));
        mb.setRating(rs.getString("rating"));
        return mb;
    }
}
The above arrangement runs fine like this:
String query = "select * from manga_data where isbn=:isbn";
Map<String, String> paramMap = new HashMap<String, String>();
paramMap.put("isbn", someBean.getIsbn());
return template.query(query, paramMap, new MangaBeanMapper());
However, if I only want to retrieve two or three values from my db, I cannot use the above pattern, as it generates a BadSqlGrammarException: releaseDate does not exist in ResultSet. Example:

String query = "select title, author from manga_data where isbn=:isbn";
Map<String, String> paramMap = new HashMap<String, String>();
paramMap.put("isbn", someBean.getIsbn());
return template.query(query, paramMap, new MangaBeanMapper());
template is an instance of NamedParameterJdbcTemplate. Please advise me on solutions for these situations.
The other answers are sensible: you should create a DTO bean, or use the BeanPropertyRowMapper.
But if you want more control than the BeanPropertyRowMapper gives you (or reflection makes it too slow), you can use the
queryForList
method, which will return a list of Maps (one per row) with the returned columns as keys (queryForMap is the single-row variant). Because you can call get(/* key that is not there */) on a Map without throwing an exception (it will just return null), you can use the same code to populate your object irrespective of which columns you selected.
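To illustrate that point: populating a bean from such a row Map tolerates missing columns, because absent keys simply yield null. A small sketch, using a pared-down MangaBean and a fromRow helper that is mine rather than part of Spring:

```java
import java.util.HashMap;
import java.util.Map;

class MangaBean {
    String author;
    String title;
    String isbn;

    // Works for any subset of selected columns: keys missing from the
    // row map just leave the corresponding field null, no exception thrown.
    static MangaBean fromRow(Map<String, Object> row) {
        MangaBean mb = new MangaBean();
        mb.author = (String) row.get("author");
        mb.title = (String) row.get("title");
        mb.isbn = (String) row.get("isbn");
        return mb;
    }
}
```

The same fromRow then serves both the "select *" and the "select title, author" queries unchanged.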
You don't even need to write your own RowMapper; just use the BeanPropertyRowMapper that Spring provides. The way it works is that it matches the returned column names to the properties of your bean. Your query has columns that match your bean exactly; if it didn't, you would use a column alias (as) in your select, as follows...

-- This query matches a property named matchingName in the bean
select my_column_that_doesnt_match as matching_name from mytable;
The BeanPropertyRowMapper should work with both queries you listed.
Typically, yes: for most queries you would create a bean or object to transform the result into. I would suggest that for most cases, that's what you want to do.
However, you can create a RowMapper that maps a result set to a map, instead of a bean, like this. The downside would be losing the type management of beans, and you'd be relying on your JDBC driver to return the correct type for each column.
As #NimChimpskey has just posted, it's best to create a tiny bean object : but if you really don't want to do that, this is another option.
class SimpleRowMapper implements RowMapper<Map<String, Object>> {

    String[] columns;

    SimpleRowMapper(String[] columns) {
        this.columns = columns;
    }

    @Override
    public Map<String, Object> mapRow(ResultSet resultSet, int i) throws SQLException {
        Map<String, Object> rowAsMap = new HashMap<String, Object>();
        for (String column : columns) {
            rowAsMap.put(column, resultSet.getObject(column));
        }
        return rowAsMap;
    }
}
In your first example I would just create a DTO bean/value object to store them. There is a reason it's a commonly implemented pattern: it takes minutes to code and provides many long-term benefits.
In your second example, create a second implementation of RowMapper where you don't set the fields, or supply a null/substitute value to MangaBean where necessary:
@Override
public MangaBean mapRow(ResultSet rs, int arg1) throws SQLException {
    MangaBean mb = new MangaBean();
    mb.setAuthor(rs.getString("author"));
    mb.setTitle(rs.getString("title"));
    /* mb.setIsbn("unknown"); */
    mb.setReleaseDate("unknown");
    mb.setRating(null);
    return mb;
}