Good afternoon.
I'm trying to bind a list of strings to the query used by the IN operator.
I'm using Oracle.
I followed the example described at this link:
How to use IN operator with JDBI?
List<String> ms = new ArrayList<>();
ms.add("Novosibirsk");
ms.add("Perm");

public interface CityDAO {
    @RegisterMapper(CitiesMapper.class)
    @SqlQuery("SELECT * FROM Universities WHERE Location IN (:cities)")
    List<cities> getItems(@Bind("cities") List<String> cities);
}
I created a ListArgumentFactory
public class ListArgumentFactory implements ArgumentFactory<List> {

    @Override
    public boolean accepts(Class<?> expectedType, Object value, StatementContext ctx) {
        return value instanceof List;
    }

    @Override
    public Argument build(Class<?> expectedType, final List value, StatementContext ctx) {
        return new Argument() {
            @Override
            public void apply(int position, PreparedStatement statement, StatementContext ctx) throws SQLException {
                String type = null;
                if (value.get(0).getClass() == String.class) {
                    type = "varchar";
                } else if (value.get(0).getClass() == Integer.class) {
                    // for Integer and so on...
                } else {
                    // throw an error: type not handled
                }
                Array array = ctx.getConnection().createArrayOf(type, value.toArray());
                statement.setArray(position, array);
            }
        };
    }
}
I registered the factory
public class DBI extends AbstractModule {

    private DBI dbi;

    @Override
    protected void configure() {
        this.dbi = new DBI(provideConfig().url());
        this.dbi.registerArgumentFactory(new ListArgumentFactory());
    }
}
But when I make a request I get an exception
org.skife.jdbi.v2.exceptions.UnableToCreateStatementException: Exception while binding 'cities' [statement:"SELECT * FROM Universities WHERE Location IN (:cities)", arguments:{ positional:{}, named {cities:factory.ListArgumentFactory$1#6788168c}, finder:[]}]
Help me figure out what I'm doing wrong
According to the JDBI documentation, achieving something like this with Oracle could be quite complex, so it might be a better idea to use the first approach described there (UseStringTemplate3StatementLocator):
Oracle supports something similar, but you need to use Oracle specific APIs and oracle.sql.ARRAY instances. In the Oracle case you have to pre-declare the array type in the database first, and as it stores the array in the database, free it after the call.
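For illustration only, here is a hedged sketch of what that Oracle-specific route might look like inside build(). It assumes a collection type such as CREATE TYPE varchar_array_t AS TABLE OF VARCHAR2(100) has already been created in the schema, and that the query is rewritten along the lines of WHERE Location IN (SELECT column_value FROM TABLE(:cities)); the type name is illustrative, not from the question.

// Hedged sketch only; requires the Oracle JDBC driver (oracle.jdbc.OracleConnection).
@Override
public Argument build(Class<?> expectedType, final List value, StatementContext ctx) {
    return new Argument() {
        @Override
        public void apply(int position, PreparedStatement statement, StatementContext ctx) throws SQLException {
            // unwrap the pooled connection to get at the Oracle-specific API
            OracleConnection oracleConnection = ctx.getConnection().unwrap(OracleConnection.class);
            // VARCHAR_ARRAY_T must be pre-declared in the database
            Array array = oracleConnection.createARRAY("VARCHAR_ARRAY_T", value.toArray());
            statement.setArray(position, array);
        }
    };
}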
Having said that, there is a simple approach that can be used to make this work in Oracle, which is to join the elements of the list with commas. I have modified the ListArgumentFactory using the Java 8 String.join method:
public class ListArgumentFactory implements ArgumentFactory<List> {

    @Override
    public boolean accepts(Class<?> expectedType, Object value, StatementContext ctx) {
        return value instanceof List;
    }

    @Override
    public Argument build(Class<?> expectedType, final List value, StatementContext ctx) {
        return new Argument() {
            @Override
            public void apply(int position, PreparedStatement statement, StatementContext ctx) throws SQLException {
                statement.setString(position, String.join(",", value));
            }
        };
    }
}
I have tried the approach described in the JDBI documentation to use oracle.sql.ARRAY and a custom TYPE in the Oracle DB but was not successful for me.
This is failing:
public List<TypeActionCommerciale> requestTypeActionCommercialeSansNip() throws PersistenceException {
    Query query = createQuery("from TypeActionCommercialeImpl where type != :type1");
    query.setParameter("type1", TypeActionCommercialeEnum.NIP);
    return (List<TypeActionCommerciale>) query.list();
}
exception:
Hibernate: select typeaction0_.id as id1_102_, typeaction0_.libelle as
libelle3_102_, typeaction0_.code as code4_102_, typeaction0_.type as
type5_102_ from apex.typeActionCommerciale typeaction0_ where
typeaction0_.type<>?
ERROR
org.hibernate.engine.jdbc.spi.SqlExceptionHelper.logExceptions(SqlExceptionHelper.java:129)
No value specified for parameter 1 org.hibernate.exception.DataException: could not extract ResultSet at
I used setProperties, but I get the same error:
public List<TypeActionCommerciale> requestTypeActionCommercialeSansNip() throws PersistenceException {
    Query query = createQuery("from TypeActionCommercialeImpl where type <> :type1");
    final Map<String, Object> properties = new HashMap<>();
    properties.put("type1", TypeActionCommercialeEnum.NIP);
    query.setProperties(properties);
    return (List<TypeActionCommerciale>) query.list();
}
The problem is here: query.setParameter("type1", TypeActionCommercialeEnum.NIP);
The enum type is not defined in Hibernate, so you must store the name of the enum and use it in the query (the easy way):
query.setString("type1", TypeActionCommercialeEnum.NIP.name());
To use the enum directly (the hard way) you must implement your own CustomUserType. You can find out how here: https://docs.jboss.org/hibernate/orm/5.0/manual/en-US/html/ch06.html#types-custom
The main advantages of using a CustomUserType are:
You can store an integer in the DB (which is smaller) instead of a string representing the enum.
Parsing is delegated to Hibernate when storing and retrieving the object.
You can use the enum directly in the query (as you are trying to do).
Try to use <> instead of != like this:
"from TypeActionCommercialeImpl where type <> :type1"
I resolved my problem. I have a class public class EnumUserType<E extends Enum<E>> implements UserType, and I implemented these methods:
@Override
public Object nullSafeGet(ResultSet rs, String[] names, SessionImplementor session, Object owner)
        throws HibernateException, SQLException {
    String name = rs.getString(names[0]);
    Object result = null;
    if (!rs.wasNull()) {
        result = Enum.valueOf(clazz, name);
    }
    return result;
}

@Override
public void nullSafeSet(PreparedStatement st, Object value, int index, SessionImplementor session)
        throws HibernateException, SQLException {
    if (null == value) {
        st.setNull(index, Types.VARCHAR);
    } else {
        st.setString(index, ((Enum<E>) value).name());
    }
}
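For completeness, here is a hedged sketch of how such a UserType is typically attached to the entity field. It assumes the EnumUserType also implements ParameterizedType and reads the enum class from an "enumClass" parameter; the package names are illustrative, not from the question.

// Hedged sketch only (Hibernate 5 annotations), not the original mapping.
@Type(type = "com.example.persistence.EnumUserType",
      parameters = @Parameter(name = "enumClass",
                              value = "com.example.model.TypeActionCommercialeEnum"))
private TypeActionCommercialeEnum type;

With the field mapped this way, query.setParameter("type1", TypeActionCommercialeEnum.NIP) can be used directly, because Hibernate then knows how to convert the enum.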
The following code is what I was using for updating a field's value in the database:
public void updatesomeField(String registrationID) {
    ContentValues objValues;
    try {
        objDatabase = this.getWritableDatabase();
        objValues = new ContentValues();
        objValues.put(COLUMN_REGISTRATION_ID, registrationID);
        objDatabase.update(CUSTOMER_USERS_TABLE_NAME, objValues, null, null);
        objDatabase.close();
    } catch (Exception errorException) {
        Log.d("error", "" + errorException);
    }
}
Then I decided to use a generic approach, and hence rewrote the above code like this:
public <T> void update(String tableName, String columnName, T value) {
    ContentValues objValues;
    try {
        objDatabase = this.getWritableDatabase();
        objValues = new ContentValues();
        objValues.put(columnName, value); // the error occurs here, because of 'value'
        objDatabase.update(tableName, objValues, null, null);
        objDatabase.close();
    } catch (Exception errorException) {
        Log.d("error", "" + errorException);
    }
}
The error occurs because ContentValues is final, so I cannot extend it to create my own class that stores my generic value. What should I change so that I can get rid of the error while keeping essentially the same code?
Another version of the same question (in case the above is difficult to understand):
I have a final, predefined class ContentValues with a put method that takes parameters in the form of a key and a value.
However, I want to implement generic functionality and decide at run time what the type of the value should be.
You could write an Adapter which does the dispatching for you:
class ContentValuesAdapter {

    private ContentValues values;

    public ContentValuesAdapter(ContentValues values) {
        this.values = values;
    }

    public void put(String columnName, Object value) {
        if (value instanceof String) {
            values.put(columnName, (String) value);
        } else if ( ... ) {
            ...
        }
    }

    public ContentValues getContentValues() {
        return this.values;
    }

    /* Delegate all other methods to the ContentValues instance. */
}
Now you can use this class instead of the original class and keep your calling code clean.
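A hedged usage sketch of the generic update method with this adapter; objDatabase and the SQLiteOpenHelper context are assumed to be the ones from the question.

// Sketch only: the generic update delegating type dispatch to the adapter.
public <T> void update(String tableName, String columnName, T value) {
    try {
        objDatabase = this.getWritableDatabase();
        ContentValuesAdapter adapter = new ContentValuesAdapter(new ContentValues());
        adapter.put(columnName, value); // dispatches on the runtime type of 'value'
        objDatabase.update(tableName, adapter.getContentValues(), null, null);
        objDatabase.close();
    } catch (Exception errorException) {
        Log.d("error", "" + errorException);
    }
}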
I use generic types throughout my applications, and have to say that one of the simplest (though possibly not the best or most efficient) ways of determining an object's type is instanceof, for example:
if (obj instanceof SiteContact) {
    buildContactDropdownList((SiteContact) obj);
} else if (obj instanceof Delivery) {
    buildDeliveryList((Delivery) obj);
}
You could quite easily wrap this up inside a helper class too.
If the ContentValues class belongs to a third-party library and you have no access to its source code, then I'm afraid you're stuck with it.
Apparently, ContentValues contains a put(String, String) method. What you could do is call the toString() method of value:
objValues.put(columnName, value.toString());
Since the toString() method is already defined in the class Object, you don't need generics for this, so you can simply declare your method as:
public void update(String tableName, String columnName, Object value) {
    ...
}
I'm learning how to use SQLData and I'm having an issue with casting back to my object.
My Oracle types look something like this:
CREATE OR REPLACE TYPE activities_t AS OBJECT
(
  list activity_list_t
);

CREATE OR REPLACE TYPE activity_list_t AS TABLE OF activity_t;

CREATE OR REPLACE TYPE activity_t AS OBJECT
(
  startDate DATE,
  endDate   DATE
);
And my Java looks like this:
public class Activities implements SQLData {

    private String sqlType = "ACTIVITIES_T";
    List<Activity> list;

    // must have a default constructor!
    public Activities() {
    }

    public String getSQLTypeName() throws SQLException {
        return sqlType;
    }

    public List getList() {
        return list;
    }

    public void setList(List list) {
        this.list = list;
    }

    public void readSQL(SQLInput stream, String typeName) throws SQLException {
        Array a = stream.readArray();
        // :(
    }

    public void writeSQL(SQLOutput stream) throws SQLException {
        // stream.writeArray(this.list);
    }
}
I've tried a few things in readSQL but I am not having much success - what am I missing?
I am calling a PL/SQL stored procedure which has an OUT parameter of type "activities_t" using JDBC:
Map map = connection.getTypeMap();
map.put("ACTIVITIES_T", Class.forName("Activities"));
connection.setTypeMap(map);

callableStatement = connection.prepareCall("{call GET_ACTIVITIES(?)}");
callableStatement.registerOutParameter(1, Types.STRUCT, "ACTIVITIES_T"); // OUT parameter of type activities_t
callableStatement.execute();
Activities activities = (Activities) callableStatement.getObject(1);
Thanks!
Steve
(most of the above is from memory as the code is at work...)
You'll need to add a type mapping for the type ACTIVITY_T as well as the one for ACTIVITIES_T. It's not clear from your question whether you've already done this.
Let's assume you've done this and created a class called Activity which implements SQLData as well. Once you've done that, the following should suffice to read the activity list within Activities:
public void readSQL(SQLInput stream, String typeName) throws SQLException {
    Array array = stream.readArray();
    this.list = new ArrayList<Activity>();
    for (Object obj : (Object[]) array.getArray()) {
        list.add((Activity) obj);
    }
}
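For reference, a minimal sketch of what that Activity element class could look like, assuming ACTIVITY_T really has just the two DATE attributes from the question (everything else is illustrative):

// Hedged sketch of the element type mapping for ACTIVITY_T (startDate DATE, endDate DATE).
public class Activity implements SQLData {

    private java.sql.Date startDate;
    private java.sql.Date endDate;

    public Activity() {
    }

    public String getSQLTypeName() throws SQLException {
        return "ACTIVITY_T";
    }

    public void readSQL(SQLInput stream, String typeName) throws SQLException {
        this.startDate = stream.readDate(); // attributes are read in declaration order
        this.endDate = stream.readDate();
    }

    public void writeSQL(SQLOutput stream) throws SQLException {
        stream.writeDate(startDate);
        stream.writeDate(endDate);
    }
}

Remember to register it as well, e.g. map.put("ACTIVITY_T", Activity.class), alongside the ACTIVITIES_T entry.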
Tips:
JDBC APIs are case-sensitive with regard to type names; you will see an Unable to resolve type error if your type name does not match exactly. Oracle will uppercase your type name unless you double-quoted the name in its CREATE statement.
You may need to specify SCHEMA.TYPE_NAME if the type isn't in your default schema.
Remember to grant execute on types if the user you are connecting with is not the owner.
If you have execute on the package, but not the type, getArray() will throw an exception when it tries to look for type metadata.
getArray()
My solution is essentially the same as Luke's. However, I needed to provide a type mapping when getting the array: array.getArray(typeMap)
You can also set a default type map on the Connection, but this didn't work for me.
When calling getArray() you get an array of the object type, i.e. the SQLData implementation you created that represents activity_t
Here is a generic function you might find useful:
public static <T> List<T> listFromArray(Array array, Class<T> typeClass) throws SQLException {
if (array == null) {
return Collections.emptyList();
}
// Java does not allow casting Object[] to T[]
final Object[] objectArray = (Object[]) array.getArray(getTypeMap());
List<T> list = new ArrayList<>(objectArray.length);
for (Object o : objectArray) {
list.add(typeClass.cast(o));
}
return list;
}
writeArray()
Figuring out how to write an array was frustrating: the Oracle APIs require a Connection to create an Array, but you don't have an obvious Connection in the context of writeSQL(SQLOutput sqlOutput). Fortunately, this blog has a trick/hack to get the OracleConnection, which I've used here.
When you create an array with createOracleArray() you specify the list type for the type name, NOT the object type. i.e. activity_list_t
Here's a generic function for writing arrays. In your case, listType would be "activity_list_t" and you would pass in a List<Activity>
public static <T> void writeArrayFromList(SQLOutput sqlOutput, String listType, @Nullable List<T> list) throws SQLException {
    final OracleSQLOutput out = (OracleSQLOutput) sqlOutput;
    OracleConnection conn = (OracleConnection) out.getSTRUCT().getJavaSqlConnection();
    conn.setTypeMap(getTypeMap()); // not needed?
    if (list == null) {
        list = Collections.emptyList();
    }
    final Array array = conn.createOracleArray(listType, list.toArray());
    out.writeArray(array);
}
Note: at one point I thought setTypeMap was required, but now when I remove that line my code still works, so I'm not sure if it's necessary.
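A hedged sketch of how the Activities class from the question could call this helper, assuming the helper is visible from that class; the list type name comes from the question's DDL.

// Sketch only: delegate writeSQL to the generic helper above.
public void writeSQL(SQLOutput stream) throws SQLException {
    // "activity_list_t" is the collection (TABLE OF) type, not the object type
    writeArrayFromList(stream, "activity_list_t", this.list);
}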
I could work around this problem, but I cannot understand it, so I am asking for some explanation (and maybe a better question title as well).
Please consider this:
public class TBGService {

    // TBGObject is an abstract base class which is extended by model classes
    public <T> T doGet(TBGObject model) throws TBGServiceException {
        String uri = model.buildUrl(repository) + model.getObjectKey();
        GetMethod method = new GetMethod(uri);
        T returned = execute(method, credentials, model.getClass());
        return returned;
    }
}
and this:
public enum TBGTaskAttributes {

    private TBGTaskAttributes(String id, String type, String label, Object... flags) {
        builder = new TaskAttributeBuilder();
        builder.withId(id).withLabel(label);
        for (Object flag : flags) {
            processFlag(flag);
        }
    }

    public abstract String getValueFromIssue(TBGIssue issue);

    public abstract void setValueInIssue(TBGIssue issue, String value);
}
when I write this code to define an enum item:
PROJECT(TaskAttribute.PRODUCT, TaskAttribute.TYPE_SINGLE_SELECT, "Project", new OptionProvider() {
    @Override
    public Set<Entry<String, String>> getOptions(TaskRepository repository) {
        try {
            List<TBGProject> list = TBGService.get(repository)
                    .doGet(new TBGProjects()).getProjects();
            [...]
            return map.entrySet();
        } catch (TBGServiceException e) { [...] }
        return null;
    }
}) {
    @Override
    public String getValueFromIssue(TBGIssue issue) {
        return issue.getProjectKey();
    }

    @Override
    public void setValueInIssue(TBGIssue issue, String value) {
        issue.setProjectKey(value);
    }
},
[... other items ...]
I get a compiler error (and Eclipse auto-completion does not work either):
The method getProjects() is undefined for the type Object
and if I hover over the doGet method, Eclipse shows it as defined like:
<Object> Object TBGService.doGet(TBGObject model)
Elsewhere, hovering shows the signature correctly as:
<TBGProjects> TBGProjects TBGService.doGet(TBGObject model)
when called with parameter new TBGProjects().
Just changing:
List<TBGProject> list = TBGService.get(repository)
.doGet(new TBGProjects()).getProjects();
with:
TBGProjects projects = TBGService.get(repository).doGet(new TBGProjects());
List<TBGProject> list = projects.getProjects();
makes it work. But what's happening here? What am I missing?
Java infers the type of T based on what you assign the return value of the method to.
If you don't assign the return value to anything, Java has no idea what T should be.
To fix this, you can change the parameter to be of type T so Java can infer T from the parameter you pass:
public <T extends TBGObject> T doGet(T model) throws TBGServiceException {
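With that bounded signature, a call shaped like the one in the question should compile; this sketch assumes TBGProjects extends TBGObject, as the question implies.

// Sketch: T is now inferred from the argument, so the chained call no longer falls back to Object.
List<TBGProject> list = TBGService.get(repository)
        .doGet(new TBGProjects()) // T inferred as TBGProjects
        .getProjects();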
My domain object has a couple of Joda-Time DateTime fields. When I'm reading database values using SimpleJdbcTemplate:
Patient patient = jdbc.queryForObject(sql,
        new BeanPropertyRowMapper(Patient.class), patientId);
It just fails and, surprisingly, no errors are logged. I guess it's because the conversion of the timestamp to DateTime is not working with JDBC.
If it were possible to subclass BeanPropertyRowMapper and instruct it to convert all java.sql.Timestamp and java.sql.Date values to DateTime, it would be great and could save a lot of extra code.
Any advice?
The correct thing to do is to subclass BeanPropertyRowMapper, override initBeanWrapper(BeanWrapper) and register a custom Property Editor:
public class JodaDateTimeEditor extends PropertyEditorSupport {

    @Override
    public void setAsText(final String text) throws IllegalArgumentException {
        setValue(new DateTime(text)); // date time in ISO8601 format
                                      // (yyyy-MM-ddTHH:mm:ss.SSSZZ)
    }

    @Override
    public void setValue(final Object value) {
        super.setValue(value == null || value instanceof DateTime ? value
                : new DateTime(value));
    }

    @Override
    public DateTime getValue() {
        return (DateTime) super.getValue();
    }

    @Override
    public String getAsText() {
        return getValue().toString(); // date time in ISO8601 format
                                      // (yyyy-MM-ddTHH:mm:ss.SSSZZ)
    }
}

public class JodaTimeSavvyBeanPropertyRowMapper<T> extends BeanPropertyRowMapper<T> {

    // pass the mapped class up to BeanPropertyRowMapper
    public JodaTimeSavvyBeanPropertyRowMapper(Class<T> mappedClass) {
        super(mappedClass);
    }

    @Override
    protected void initBeanWrapper(BeanWrapper bw) {
        bw.registerCustomEditor(DateTime.class, new JodaDateTimeEditor());
    }
}
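A hedged usage sketch with the query from the question (sql and patientId are the question's variables):

// Sketch only: drop-in replacement for the plain BeanPropertyRowMapper.
Patient patient = jdbc.queryForObject(sql,
        new JodaTimeSavvyBeanPropertyRowMapper<>(Patient.class), patientId);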
Looking at the BeanPropertyRowMapper implementation, the way it sets the fields is:
Object value = getColumnValue(rs, index, pd);
if (logger.isDebugEnabled() && rowNumber == 0) {
    logger.debug("Mapping column '" + column + "' to property '" +
            pd.getName() + "' of type " + pd.getPropertyType());
}
try {
    bw.setPropertyValue(pd.getName(), value);
}
where getColumnValue(rs, index, pd) delegates to JdbcUtils.getResultSetValue.
That pd parameter in getColumnValue is the actual property descriptor, and pd.getPropertyType() is used in JdbcUtils as the type of the field to map to.
If you look at the JdbcUtils code for the getResultSetValue method, you'll see that it simply goes from one if statement to the next, matching pd.getPropertyType() against all the standard types. When it does not find a match, since DateTime is not a "standard" type, it falls back to rs.getObject():
} else {
// Some unknown type desired -> rely on getObject.
Then, if this object is a SQL Date, it converts it to a Timestamp and returns it to be set on the DateTime field of your domain object => which is where it fails.
Hence, there does not seem to be a straightforward way to inject a Date/Timestamp-to-DateTime converter into a BeanPropertyRowMapper, so it would be cleaner (and more performant) to implement your own RowMapper, for example:
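A minimal sketch of such a hand-written RowMapper, assuming Patient has id, name and a Joda birthDate property; all of these names and columns are illustrative, not from the question.

// Hedged sketch: map the row by hand and convert the timestamp explicitly.
public class PatientRowMapper implements RowMapper<Patient> {

    @Override
    public Patient mapRow(ResultSet rs, int rowNum) throws SQLException {
        Patient p = new Patient();
        p.setId(rs.getLong("id"));
        p.setName(rs.getString("name"));
        Timestamp ts = rs.getTimestamp("birth_date");
        p.setBirthDate(ts != null ? new DateTime(ts) : null); // Joda DateTime accepts a Timestamp
        return p;
    }
}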
In case you'd like to see the mapping error in a console, set your logging level for org.springframework.jdbc to "debug" or better yet "trace" to see exactly what happens.
One thing you can try, which I have not tested, is to extend BeanPropertyRowMapper and handle the DateTime-typed properties by overriding:
/**
 * Initialize the given BeanWrapper to be used for row mapping.
 * To be called for each row.
 * <p>The default implementation is empty. Can be overridden in subclasses.
 * @param bw the BeanWrapper to initialize
 */
protected void initBeanWrapper(BeanWrapper bw) {}
The answer from @Sean Patrick Floyd is perfect unless you have many custom types.
Here is a generalized extension that can be configured according to usage:
public class CustomFieldTypeSupportBeanPropertyRowMapper<T> extends BeanPropertyRowMapper<T> {

    private Map<Class<?>, Handler> customTypeMappers = new HashMap<Class<?>, Handler>();

    public CustomFieldTypeSupportBeanPropertyRowMapper() {
        super();
    }

    public CustomFieldTypeSupportBeanPropertyRowMapper(Class<T> mappedClass, boolean checkFullyPopulated) {
        super(mappedClass, checkFullyPopulated);
    }

    public CustomFieldTypeSupportBeanPropertyRowMapper(Class<T> mappedClass) {
        super(mappedClass);
    }

    public CustomFieldTypeSupportBeanPropertyRowMapper(Class<T> mappedClass, Map<Class<?>, Handler> customTypeMappers) {
        super(mappedClass);
        this.customTypeMappers = customTypeMappers;
    }

    @Override
    protected Object getColumnValue(ResultSet rs, int index, PropertyDescriptor pd) throws SQLException {
        final Class<?> current = pd.getPropertyType();
        if (customTypeMappers.containsKey(current)) {
            return customTypeMappers.get(current).f(rs, index, pd);
        }
        return super.getColumnValue(rs, index, pd);
    }

    public void addTypeHandler(Class<?> class1, Handler handler2) {
        customTypeMappers.put(class1, handler2);
    }

    public static interface Handler {
        public Object f(ResultSet rs, int index, PropertyDescriptor pd) throws SQLException;
    }
}
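A hedged usage sketch registering a Joda DateTime handler and reusing the question's query; Patient, sql and patientId come from the question, and the handler body is an assumption about how you would convert the column.

// Sketch only: map Timestamp columns to Joda DateTime for DateTime-typed properties.
CustomFieldTypeSupportBeanPropertyRowMapper<Patient> mapper =
        new CustomFieldTypeSupportBeanPropertyRowMapper<>(Patient.class);
mapper.addTypeHandler(DateTime.class, (rs, index, pd) -> {
    Timestamp ts = rs.getTimestamp(index);
    return ts != null ? new DateTime(ts) : null;
});
Patient patient = jdbc.queryForObject(sql, mapper, patientId);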