Opting for an alternative to JPA and Spring Data, I wanted to try out JDBI for my repository implementation with SQLite.
Repository Code
/**
* SQLite implementation of the FooRepository interface
*/
public class SqliteFooRepository implements FooRepository {
private final DBI connection;
/**
* The constructor initialises the connection to the local SQLite file
*
* @param dataSource the SQLite data source, e.g. backed by "jdbc:sqlite::resource:db/foo.db"
* @throws IllegalArgumentException when an invalid DB file is given
*/
public SqliteFooRepository(final SQLiteDataSource dataSource) {
checkNotNull(dataSource, "dataSource required");
connection = new DBI(dataSource);
}
/**
* Returns a list of Foo objects for a website locale in the DB
* @param websiteLocale the website locale to filter by
* @return List of Foo
* @throws SQLException error querying
*/
@Override
public List<Foo> getFoosByWebsiteLocale(final String websiteLocale) throws SQLException {
checkNotNull(websiteLocale, "websiteLocale required");
final String fooQuery = query...
Handle queryHandler = connection.open();
final List<Foo> fooList = queryHandler.createQuery(fooQuery)
.map(FooMapper.class);
queryHandler.close();
return fooList;
}
}
}
Mapper
public class FooMapper implements ResultSetMapper<Foo> {
/**
* Construct a Foo object from a record in the result set
* @param index row number
* @param resultRow row
* @param ctx statement context
* @return Foo object
* @throws SQLException when accessing sql result set
*/
@Override
public Foo map(final int index, final ResultSet resultRow, final StatementContext ctx) throws SQLException {
return Foo.builder()
.name(resultRow.getString("foo_name"))
.type(resultRow.getString("foo_type"))
.build();
}
}
I am struggling to understand how I will create a list of Foo objects using ResultSetMapper.
The JDBI documentation also appears to be broken in this area:
http://jdbi.org/maven_site/apidocs/org/skife/jdbi/v2/tweak/ResultSetMapper.html
Help would be appreciated on how to make this work.
Your mapper only needs to map one row to one Foo object. JDBI will create the list and put the objects in the list for you.
I.e.:
final List<Foo> fooList = queryHandler.createQuery(fooQuery).map(FooMapper.class).list();
Related
We are putting all our data points for one row into a HashMap. We don't want to use a POJO because the values are a different set each time. For example, we might get "place" on some records and "hometown" on others. Actually, we have thousands of different column names to choose from. Our code looks like this:
Map<String, Object> aMap = new HashMap<>();
aMap.put("id", Integer.valueOf(1));
aMap.put("age", Integer.valueOf(45));
aMap.put("name", "mark");
aMap.put("place", "home");
final GenericRecord record = new GenericData.Record(avroSchema);
aMap.forEach((k, v) -> {
record.put(k, v);
});
writer.write(record);
We would like to put all the values in a map and then generate a schema. Since it can be done for a POJO using the Reflect API, I was wondering if it could be done from a HashMap as well?
As a side question, is there any way to eliminate the forEach above and just write the map?
Here is what we came up with. We also had nested columns.
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.apache.avro.Schema;
import org.apache.avro.Schema.Parser;
import org.apache.avro.Schema.Type;
/**
* This does NOT do all types. So far just the types we think we need. See
* https://docs.oracle.com/database/nosql-12.1.3.0/GettingStartedGuide/avroschemas.html
* <p>
* We need some error handling here, and when we don't have the correct type, call it out!
* <p>
* This runs in 1-2ms even with a large payload.
*/
public class AvroSchemaBuilder {
/**
* Construct!
*/
private AvroSchemaBuilder() {
//private constructor. All methods are static.
}
/**
* Build the Avro schema and return it.
*
* @param name Name of object.
* @param nameTypeConsumer The nameTypeConsumer of objects being saved.
* @return the Avro schema.
*/
public static Schema getAvroSchema(String name, NameTypeConsumer nameTypeConsumer) {
String json = Lson.toJson(getAvroSchemaAsMap(name, nameTypeConsumer, true));
Parser parser = new Parser().setValidate(true);
return parser.parse(json);
}
/**
* Returns the map with all the attributes to build a schema. This would be recursive if we need
* to build a complex schema. For example for Trends this would build a complex schema where some
* of the types are maps that are themselves described as another nested schema.
*/
private static Map<String, Object> getAvroSchemaAsMap(String name,
NameTypeConsumer nameTypeConsumer,
boolean addNameSpace) {
Map<String, Object> schemaMap = new LinkedHashMap<>();
schemaMap.put("type", "record");
schemaMap.put("name", name);
if (addNameSpace) {
schemaMap.put("namespace", "com.blah.blah");
}
List<Field> fields = new ArrayList<>();
nameTypeConsumer.consumeNestedNameType((columnName, nestedNameType) -> {
Object avroType;
if (nestedNameType.getNameTypeConsumer() != null) {
avroType = getAvroSchemaAsMap(columnName, nestedNameType.getNameTypeConsumer(), false);
} else {
avroType = getAvroType(nestedNameType.getType()).getName();
}
Object[] types = {"null", avroType}; //adding null first always.
fields.add(new Field(columnName, types));
});
schemaMap.put("fields", fields);
return schemaMap;
}
/**
* Finds the avro type by class.
*
* @param type the Type (this is an avro type).
* @return avro constant.
*/
private static Type getAvroType(Class<?> type) {
if (type.equals(Integer.class)) {
return Type.INT;
}
if (type.equals(Long.class)) {
return Type.LONG;
}
if (type.equals(Float.class)) {
return Type.FLOAT;
}
if (type.equals(Double.class)) {
return Type.DOUBLE;
}
if (type.equals(String.class)) {
return Type.STRING;
}
if (type.equals(Boolean.class)) {
return Type.BOOLEAN;
}
throw new GenericRuntimeException("Cannot get Avro type for type " + type.getName());
}
/**
* Nested class to make our field.
*/
private static class Field {
public final String name;
public final Object[] type;
public Field(String name, Object[] type) {
this.name = name;
this.type = type;
}
}
}
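On the side question about eliminating the forEach: as far as I know there is no GenericRecord constructor that takes a map, but the loop can at least collapse to a method reference. A minimal sketch (the schema and field names are made up for illustration):

```java
import java.util.Map;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class RecordFromMap {

    // Copies every map entry into a fresh record; equivalent to the
    // explicit aMap.forEach((k, v) -> record.put(k, v)) loop above.
    public static GenericRecord toRecord(Schema schema, Map<String, Object> values) {
        GenericRecord record = new GenericData.Record(schema);
        values.forEach(record::put);
        return record;
    }
}
```

Note that GenericData.Record.put(String, Object) still validates field names against the schema, so unknown map keys fail fast rather than silently disappearing.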
I have a class to hold some json data as follows:
package org.swx.nursing.tools.configuration.data;
import java.util.Set;
import com.cerner.system.exception.Verifier;
import com.cerner.system.exception.VerifyException;
import com.google.common.collect.ImmutableSet;
/**
* Class representing a simple {@link JsonData#identifier},
* {@link JsonData#data} format. This class can be used to
* persist application data, for example in a Configuration file.
*
* @author SW029693
* @since v1.0
*/
public class JsonData<T> {
/**
* Represents a unique identifier
*/
private String identifier;
/**
* Represents the data pertaining to this {@link JsonData#identifier}
*/
private T data;
private static final Set<String> VALID_JSON_ID_TYPES = ImmutableSet.of("CONFIG","HOTKEYS");
public JsonData(String identifier, T data) {
super();
this.identifier = identifier;
this.data = data;
}
/**
* Getter for {@link JsonData#identifier}
* @return the identifier
*/
public String getIdentifier() {
return identifier;
}
/**
* Sets the {@link JsonData#identifier} to the given value
* @param identifier
* Represents a unique {@link JsonData#identifier}
* @throws VerifyException
* If the argument is {@code null} or {@code empty}
*/
public void setIdentifier(String identifier) throws VerifyException{
Verifier.verifyNotNull(identifier, "identifier : null");
Verifier.verifyNotEmpty(identifier,"identifier : empty");
this.identifier = identifier;
}
/**
* Getter for {@link JsonData#data}
* @return the data
*/
public T getData() {
return data;
}
/**
* Sets the {@link JsonData#data} to the given value
* @param data
* The data pertaining to this {@link JsonData}
* @throws VerifyException
* If the argument is {@code null}
*/
public void setData(T data) {
Verifier.verifyNotNull(data, "data : null");
this.data = data;
}
@Override
public String toString() {
return "JsonData [identifier=" + identifier + ", data=" + data + "]";
}
}
I am trying to convert some data to JSON (write it to a file), read it back and cast into the above JsonData object. This is my unit test which fails:
@Test
@SuppressWarnings("unchecked")
public void testWriteContentsToFile() {
ConfigurationManager<JsonData> configurationManager = (ConfigurationManager<JsonData>)
ConfigurationManager.Factory.create();
assertNotNull(configurationManager);
configurationManager.write(getMockJsonData());
System.out.println("1="+getMockJsonData().getData().toString());
assertEquals(getMockJsonData().getData(), ((JsonData) readconfigFile()).getData());
}
The helper methods for this test are as follows:
/**
* Helper method to read the config.json file
* @return the {@link JsonData} read from config.json
*/
private static JsonData<ConfigurationProperty> readconfigFile() {
Reader reader = null;
JsonData<ConfigurationProperty> data = null;
Gson gson = null;
try {
reader = new FileReader("./config.json");
gson = new GsonBuilder().create();
data = gson.fromJson(reader, JsonData.class);
System.out.println("yooo="+data.getData().getRunnableContext());
} catch (FileNotFoundException e) {
e.printStackTrace();
fail("Test failed while reading the config.json file: " + e.getMessage());
} finally {
try {
reader.close();
} catch (IOException e) {
e.printStackTrace();
fail("Test failed while reading the config.json file: "+e.getMessage());
}
}
return data;
}
and
/**
* Helper method which creates a mock {@link JsonData} object for testing purposes
*
* @return An instance of the newly created mock {@link JsonData} object
*/
private static JsonData<ConfigurationProperty> getMockJsonData() {
JsonData<ConfigurationProperty> data = new JsonData<ConfigurationProperty>("CONFIG",
getMockConfigurationProperty("testKey", "testContext", "APPLICATION"));
System.out.println("data ==="+data);
return data;
}
/**
* Helper method which creates a {@link ConfigurationProperty} based on the given
* non-null and non-empty arguments. This method can be used to get the
* {@link ConfigurationProperty} instance created with the desired fields when ALL
* of the 3 parameters are non-null, non-empty & valid.
*
* @param mockHotKey
* A non-null & non-empty mockHotKey parameter passed in as an argument
*
* @param mockContext
* A non-null & non-empty mockContext parameter passed in as an argument
*
* @param mockContextType
* A non-null & non-empty mockContextType parameter passed in as an argument
*
* @throws VerifyException
* If the argument validation for non-null or non-empty arguments has failed.
*
* @throws IllegalArgumentException
* The {@link ConfigurationProperty.Builder#withRunnableContextType(String)} method
* throws an {@link IllegalArgumentException} if the provided type is not supported in
* {@link ConfigurationProperty#RUNNABLE_CONTEXT_TYPE}
*
* @return An instance of the newly created {@link ConfigurationProperty} object
*/
private static ConfigurationProperty getMockConfigurationProperty (
String mockHotKey, String mockContext, String mockContextType) {
Verifier.verifyNotNull(mockHotKey);
Verifier.verifyNotNull(mockContext);
Verifier.verifyNotNull(mockContextType);
Verifier.verifyNotEmpty(mockHotKey);
Verifier.verifyNotEmpty(mockContext);
Verifier.verifyNotEmpty(mockContextType);
return ConfigurationProperty.Builder
.create()
.withHotKey(mockHotKey)
.withRunnableContext(mockContext)
.withRunnableContextType(mockContextType)
.build();
}
I am getting the following error when I run the unit test to read the data from the JSON file into my JsonData object:
java.lang.ClassCastException: com.google.gson.internal.LinkedTreeMap cannot be cast to org.swx.nursing.tools.configuration.data.ConfigurationProperty
at org.swx.nursing.tools.configuration.data.ConfigurationManagerImplTest.readconfigFile(ConfigurationManagerImplTest.java:181)
at org.swx.nursing.tools.configuration.data.ConfigurationManagerImplTest.testWriteContentsToFile(ConfigurationManagerImplTest.java:166)
... (remaining frames: JUnit and Eclipse runner internals)
The file gets written like this:
{
"identifier": "CONFIG",
"data": {
"hotKey": "testKey",
"type": "APPLICATION",
"runnableContext": "testContext"
}
}
Because of this error, I am unable to read the 'data' part of the JSON written to the file into the JsonData object.
Please advise.
Thanks
Please see https://sites.google.com/site/gson/gson-user-guide#TOC-Serializing-and-Deserializing-Generic-Types
When you call toJson(obj), Gson calls obj.getClass() to get information on the fields to serialize. Similarly, you can typically pass MyClass.class object in the fromJson(json, MyClass.class) method. This works fine if the object is a non-generic type. However, if the object is of a generic type, then the Generic type information is lost because of Java Type Erasure. Here is an example illustrating the point:
class Foo<T> {
T value;
}
Gson gson = new Gson();
Foo<Bar> foo = new Foo<Bar>();
gson.toJson(foo); // May not serialize foo.value correctly
gson.fromJson(json, foo.getClass()); // Fails to deserialize foo.value as Bar
The above code fails to interpret value as type Bar because Gson invokes foo.getClass() to get its class information, and this method returns a raw class, Foo.class. This means that Gson has no way of knowing that this is an object of type Foo<Bar>, and not just plain Foo.
You can solve this problem by specifying the correct parameterized type for your generic type. You can do this by using the TypeToken class.
Type fooType = new TypeToken<Foo<Bar>>() {}.getType();
gson.toJson(foo, fooType);
gson.fromJson(json, fooType);
The idiom used to get fooType actually defines an anonymous local inner class containing a method getType() that returns the fully parameterized type.
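Applied to the question's JsonData, a minimal sketch (Payload is a hypothetical stand-in for ConfigurationProperty, whose source isn't shown):

```java
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;

public class TypeTokenDemo {

    // Mirrors the question's JsonData<T> wrapper.
    static class JsonData<T> {
        String identifier;
        T data;
    }

    // Hypothetical stand-in for ConfigurationProperty.
    static class Payload {
        String hotKey;
        String runnableContext;
    }

    public static void main(String[] args) {
        String json = "{\"identifier\":\"CONFIG\","
                + "\"data\":{\"hotKey\":\"testKey\",\"runnableContext\":\"testContext\"}}";
        Gson gson = new Gson();

        // With the raw class, T is erased and 'data' comes back as a LinkedTreeMap.
        JsonData<?> raw = gson.fromJson(json, JsonData.class);
        System.out.println(raw.data.getClass().getSimpleName()); // LinkedTreeMap

        // The TypeToken keeps the parameterized type, so 'data' is a real Payload.
        Type type = new TypeToken<JsonData<Payload>>() {}.getType();
        JsonData<Payload> typed = gson.fromJson(json, type);
        System.out.println(typed.data.hotKey); // testKey
    }
}
```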
I need a data structure to store objects of different types, e.g. String, Boolean and other classes.
Is it good practice to use a Map<String, Object>, where you retrieve an object by its key and are assumed to know how to cast it?
Is there a better solution?
That's a perfect use case for a PropertyHolder I wrote a while ago. You can read about it at length on my blog. I developed it with immutability in mind; feel free to adapt it to your needs.
In general I'd say that if you want to profit from type safety in Java, you need to know your keys. What I mean by that: it will hardly be possible to develop a type-safe solution where the keys come from an external source.
Here's a special key that knows the type of its value (it's not complete; please download the source for the complete version):
public class PropertyKey<T> {
private final Class<T> clazz;
private final String name;
public PropertyKey(Class<T> valueType, String name) {
this.clazz = valueType;
this.name = name;
}
public boolean checkType(Object value) {
if (null == value) {
return true;
}
return this.clazz.isAssignableFrom(value.getClass());
}
... rest of the class
}
Then you develop a data structure that utilizes it:
public class PropertyHolder {
private final ImmutableMap<PropertyKey<?>, ?> storage;
/**
* Returns value for the key of the type extending the one declared in the {@link PropertyKey}.
*
* @param key {@link PropertyKey} instance.
* @return Value of the type declared in the key.
*/
@SuppressWarnings("unchecked")
public <T extends Serializable> T get(PropertyKey<T> key) {
return (T) storage.get(key);
}
/**
* Adds key/value pair to the state and returns new
* {@link PropertyHolder} with this state.
*
* @param key {@link PropertyKey} instance.
* @param value Value of type specified in {@link PropertyKey}.
* @return New {@link PropertyHolder} with updated state.
*/
public <T> PropertyHolder put(PropertyKey<T> key, T value) {
Preconditions.checkNotNull(key, "PropertyKey cannot be null");
Preconditions.checkNotNull(value, "Value for key %s is null",
key);
Preconditions.checkArgument(key.checkType(value),
"Property \"%s\" was given "
+ "value of a wrong type \"%s\"", key, value);
// Creates ImmutableMap.Builder with new key/value pair.
return new PropertyHolder(filterOutKey(key)
.put(key, value).build());
}
/**
* Returns {@link Builder} with all the elements from the state except for the given key.
*
* @param key The key to remove.
* @return {@link Builder} for further processing.
*/
private <T> Builder<PropertyKey<? extends Serializable>, Serializable> filterOutKey(PropertyKey<T> key) {
Builder<PropertyKey<? extends Serializable>, Serializable> builder = ImmutableMap
.<PropertyKey<? extends Serializable>, Serializable> builder();
for (Entry<PropertyKey<? extends Serializable>, Serializable> entry : this.storage.entrySet()) {
if (!entry.getKey().equals(key)) {
builder.put(entry);
}
}
return builder;
}
... rest of the class
}
I omit a lot of unnecessary details here; please let me know if something is not clear.
A typesafe heterogeneous container can be used for this purpose:
import java.util.HashMap;
import java.util.Map;
public class Container {
private Map<Class<?>, Object> container = new HashMap<Class<?>, Object>();
public <T> void putElement(Class<T> type, T instance) {
if (type == null) {
throw new NullPointerException("Type is null");
}
//container.put(type, instance); // 'v1'
container.put(type, type.cast(instance)); // 'v2' runtime type safety!
}
public <T> T getElement(Class<T> type) {
return type.cast(container.get(type));
}
public static void main(String[] args) {
Container myCont = new Container();
myCont.putElement(String.class, "aaa");
myCont.putElement(Boolean.class, true);
myCont.putElement(String[].class, new String[] {"one", "two"});
System.out.println(myCont.getElement(String.class));
System.out.println(myCont.getElement(String[].class)[1]);
}
}
Limitation: this container in its current form can store only one instance per type.
In putElement() you can achieve runtime type safety by using a dynamic cast. This will, however, add extra overhead.
E.g.: try to pass a raw class object to the container. Note where the exception occurs:
Class raw = Class.forName("MyClass");
myCont.putElement(raw, "aaa"); //ClassCastException if using 'v2'
System.out.println(myCont.getElement(raw)); //ClassCastException if using 'v1'
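If the one-instance-per-type limitation matters, the same idea extends to a composite key of type plus name. This is a hypothetical variant, not part of the original container:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// Variant of the typesafe container keyed by (type, name), so several
// instances of the same type can coexist under different names.
public class NamedContainer {
    private final Map<String, Object> container = new HashMap<>();

    private static String key(Class<?> type, String name) {
        return type.getName() + "#" + name;
    }

    public <T> void putElement(Class<T> type, String name, T instance) {
        Objects.requireNonNull(type, "Type is null");
        container.put(key(type, name), type.cast(instance)); // runtime type check
    }

    public <T> T getElement(Class<T> type, String name) {
        return type.cast(container.get(key(type, name)));
    }

    public static void main(String[] args) {
        NamedContainer c = new NamedContainer();
        c.putElement(String.class, "first", "aaa");
        c.putElement(String.class, "second", "bbb"); // same type, second slot
        System.out.println(c.getElement(String.class, "second")); // bbb
    }
}
```

Retrieval stays type-safe via Class.cast(), exactly as in the single-instance container above.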
I have this snippet of code:
public abstract class Repository<Entity extends BaseObject> {
...
public void readFromJson(){
String content = "JSON content here";
Gson gson = new Gson();
Type entityType = new TypeToken<JSONObject<Entity>>(){}.getType();
jsonObject = gson.fromJson(content, entityType);
for (Entity ent : jsonObject.getEntities()) ;
}
}
When I try to do the foreach my entities object is no longer of type Entity but LinkedHashMap and I get this exception: java.lang.ClassCastException: java.util.LinkedHashMap cannot be cast to com.tranca.bookstore.domain.shared.BaseObject
Here is the JSONObject class (created by me):
public class JSONObject<Entity> {
private List<Entity> entities = new ArrayList<Entity>();
private long lastId = -1;
public List<Entity> getEntities() {
return entities;
}
public void setEntities(List<Entity> entities) {
this.entities = entities;
}
public long getLastId() {
return lastId;
}
public void setLastId(long lastId) {
this.lastId = lastId;
}
public void incrementLastId() {
this.lastId++;
}
}
Maybe the base object is relevant, so I will put the code here:
public abstract class BaseObject implements Serializable {
protected long id = (long) -1;
protected int version = 0;
protected BaseObject(){}
public long getId() {
return id;
}
public void setId(long id) {
this.id = id;
}
public int getVersion() {
return version;
}
public void setVersion(int version) {
this.version = version;
}
}
I had the same, or a similar, problem. To give a clearer answer in a slightly different context:
I had the following method, which produced the error "com.google.gson.internal.LinkedTreeMap cannot be cast to MyType":
/**
* Reads a LinkedHashMap from the specified parcel.
*
* @param <TKey>
*            The type of the key.
* @param <TValue>
*            The type of the value.
* @param in
*            The in parcel.
* @return Returns an instance of linked hash map or null.
*/
public static <TKey, TValue> LinkedHashMap<TKey, TValue> readLinkedHashMap(Parcel in) {
Gson gson = JsonHelper.getGsonInstance();
String content = in.readString();
LinkedHashMap<TKey, TValue> result = gson.fromJson(content, new TypeToken<LinkedHashMap<TKey, TValue>>(){}.getType());
return result;
}
I wanted an easy generic way to read/write a linked hash map. The above solution does not work because the type information of the TypeToken with TKey and TValue is lost after compilation, as far as I understand. And this is the problem. If you change your code to the following example, it works, because now we explicitly define the type token. I am not deep enough into Java to understand why it is possible to read the type information at runtime in this case.
/**
* Reads a LinkedHashMap from the specified parcel.
*
* @param <TKey>
*            The type of the key.
* @param <TValue>
*            The type of the value.
* @param in
*            The in parcel.
* @return Returns an instance of linked hash map or null.
*/
public static <TKey, TValue> LinkedHashMap<TKey, TValue> readLinkedHashMap(Parcel in, TypeToken<LinkedHashMap<TKey, TValue>> typeToken) {
Gson gson = JsonHelper.getGsonInstance();
Type type = typeToken.getType();
String content = in.readString();
LinkedHashMap<TKey, TValue> result = gson.fromJson(content, type);
return result;
}
And now you would call the above function like:
readLinkedHashMap(in, new TypeToken<LinkedHashMap<UUID, MyObject>>(){});
A sidenote 1: When writing the linked hash map, you do not need to specify any type token at all; toJson(map) is sufficient.
A sidenote 2 (to a problem which I had): By default gson uses toString() to serialize the key. If you register a type adapter for the key type, which is maybe a more complex type, then this type adapter is not applied when serializing, but only when deserializing. This leads to an inconsistent and therefore failing process. The following option activates complex map key serialization.
gsonBuilder.enableComplexMapKeySerialization()
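For illustration, a minimal sketch of that builder call (Point is a made-up key type, not from the question):

```java
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.LinkedHashMap;
import java.util.Map;

public class MapKeyDemo {

    // A made-up complex key type; default Gson would serialize it via toString().
    static class Point {
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) {
        Gson gson = new GsonBuilder()
                .enableComplexMapKeySerialization() // keys as JSON values, not toString()
                .create();

        Map<Point, String> map = new LinkedHashMap<>();
        map.put(new Point(1, 2), "a");

        // With complex keys, Gson writes the map as an array of [key, value] pairs,
        // e.g. [[{"x":1,"y":2},"a"]]
        Type type = new TypeToken<Map<Point, String>>() {}.getType();
        System.out.println(gson.toJson(map, type));
    }
}
```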
Finally got it!
The problem was that:
new TypeToken<JSONObject<Entity>>(){}.getType();
returns the type of JSONObject<T>, not the specific entity type of the subclass extending Repository (e.g. UserRepository extends Repository<User>).
The trick was to create an abstract method to force the subclasses to set the Type for deserialization.
In conclusion, if you get this error, be sure you have the right type of class (in case you use subclasses, be sure it returns the type of your subclass, not the superclass).
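To make the trick concrete, here is a hand-written sketch of that pattern (JsonWrapper, User and the field names are illustrative stand-ins, not the question's actual classes):

```java
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;

// Stand-ins for the question's JSONObject and entity types.
class JsonWrapper<Entity> {
    List<Entity> entities = new ArrayList<>();
    long lastId = -1;
    List<Entity> getEntities() { return entities; }
}

class User {
    String name;
}

// The abstract method forces each concrete repository to supply the
// fully parameterized type, so Gson deserializes entities as the real
// subclass instead of LinkedHashMap.
abstract class Repository<Entity> {
    protected abstract Type getJsonType();

    public JsonWrapper<Entity> readFromJson(String content) {
        return new Gson().fromJson(content, getJsonType());
    }
}

class UserRepository extends Repository<User> {
    @Override
    protected Type getJsonType() {
        return new TypeToken<JsonWrapper<User>>() {}.getType();
    }
}
```

With this, new UserRepository().readFromJson(json).getEntities() yields real User instances.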
I've seen lots of examples of how to create pagination with some really simple queries. But I don't see any using HibernateTemplate's findByNamedParam method.
How can I set a query's firstResult and maxResult parameters while also using the findByNamedParam method?
Basically, I'm trying to add pagination to an HQL query I'm creating via HibernateTemplate's findByNamedParam method.
OK, after a lot of research, I finally got what I wanted.
First, we need to create a HibernateCallback implementation:
HibernateCallbackImpl.java:
import java.sql.SQLException;
import java.util.List;
import org.hibernate.HibernateException;
import org.hibernate.Query;
import org.hibernate.Session;
import org.springframework.orm.hibernate3.HibernateCallback;
public class HibernateCallbackImpl<T>
implements HibernateCallback<List<T>> {
private String queryString;
private String[] paramNames;
private Object[] values;
private int firstResult;
private int maxResults;
/**
* Fetches a {@link List} of entities from the database using pagination.
* Execute HQL query, binding a number of values to ":" named parameters in the query string.
*
* @param queryString a query expressed in Hibernate's query language
* @param paramNames the names of the parameters
* @param values the values of the parameters
* @param firstResult a row number, numbered from 0
* @param maxResults the maximum number of rows
*/
public HibernateCallbackImpl(
String queryString,
String[] paramNames,
Object[] values,
int firstResult,
int maxResults) {
this.queryString = queryString;
this.paramNames = paramNames;
this.values = values;
this.firstResult = firstResult;
this.maxResults = maxResults;
}
@Override
public List<T> doInHibernate(Session session) throws HibernateException,
SQLException {
Query query = session.createQuery(queryString);
query.setFirstResult(firstResult);
query.setMaxResults(maxResults);
// TODO: throw proper exception when paramNames.length != values.length
for (int c=0; c<paramNames.length; c++) {
query.setParameter(paramNames[c], values[c]);
}
@SuppressWarnings("unchecked")
List<T> result = query.list();
return result;
}
}
Then, I can just instantiate the new object and it will return what I want:
Example:
@SuppressWarnings("unchecked")
List<TitleProductAccountApproval> tpaas =
getHibernateTemplate().executeFind(
new HibernateCallbackImpl(
hql.toString(),
paramNames.toArray(new String[paramNames.size()]),
values.toArray(),
firstResult,
maxResult
)
);
The solution by @Corey works great, but it has a problem inside the for-loop where query.setParameter(...) is called.
The problem is that it doesn't account for parameters which are either a collection or an array, and this will result in weird ClassCastExceptions, because Hibernate tries to determine the ID by calling getId() on the collection or array (which is wrong). This happens e.g. if you are using an IN clause (e.g. ...WHERE department IN (:departments) ...) where 'departments' is an array or collection of Department entities.
This is because collections or arrays need to use 'query.setParameterList(paramName, (Object[]) value)' or 'query.setParameterList(paramName, (Collection) value)'
Long story short:
I modified the version by @Corey by adding an 'applyNamedParameterToQuery()' method which I borrowed from org.springframework.orm.hibernate3.HibernateTemplate.applyNamedParameterToQuery(Query, String, Object):
import java.sql.SQLException;
import java.util.Collection;
import java.util.List;
import org.hibernate.HibernateException;
import org.hibernate.Query;
import org.hibernate.Session;
import org.springframework.orm.hibernate3.HibernateCallback;
public class HibernateCallbackImpl<T>
implements HibernateCallback<List<T>> {
private String queryString;
private String[] paramNames;
private Object[] values;
private int firstResult;
private int maxResults;
/**
* Fetches a {@link List} of entities from the database using pagination.
* Execute HQL query, binding a number of values to ":" named parameters in the query string.
*
* @param queryString a query expressed in Hibernate's query language
* @param paramNames the names of the parameters
* @param values the values of the parameters
* @param firstResult a row number, numbered from 0
* @param maxResults the maximum number of rows
*/
public HibernateCallbackImpl(
String queryString,
String[] paramNames,
Object[] values,
int firstResult,
int maxResults) {
this.queryString = queryString;
this.paramNames = paramNames;
this.values = values;
this.firstResult = firstResult;
this.maxResults = maxResults;
}
@Override
public List<T> doInHibernate(Session session) throws HibernateException,
SQLException {
Query query = session.createQuery(queryString);
query.setFirstResult(firstResult);
query.setMaxResults(maxResults);
// TODO: throw proper exception when paramNames.length != values.length
for (int c=0; c<paramNames.length; c++) {
applyNamedParameterToQuery(query, paramNames[c], values[c]);
}
@SuppressWarnings("unchecked")
List<T> result = query.list();
return result;
}
/**
* Code borrowed from org.springframework.orm.hibernate3.HibernateTemplate.applyNamedParameterToQuery(Query, String, Object)
*
* Apply the given named parameter to the given Query object.
* @param queryObject the Query object
* @param paramName the name of the parameter
* @param value the value of the parameter
* @throws HibernateException if thrown by the Query object
*/
protected void applyNamedParameterToQuery(Query queryObject, String paramName, Object value)
throws HibernateException {
if (value instanceof Collection) {
queryObject.setParameterList(paramName, (Collection) value);
}
else if (value instanceof Object[]) {
queryObject.setParameterList(paramName, (Object[]) value);
}
else {
queryObject.setParameter(paramName, value);
}
}
}