I am working on a screen that uploads a file to an Oracle table as a BFILE. I am using Spring 3 and Hibernate 3.
The BO class looks like this:
@Entity
@Table(name="abc_manuals")
public class ManualBo implements Serializable {
/* Persistent Fields */
@Id
@Column(name="id", nullable=false)
@GeneratedValue(strategy=GenerationType.IDENTITY)
private Long mId;
@Column(name="guide")
@Type(type="com.bo.entity.BFILEType")
private BFILE guide;
public Long getMId() {
return mId;
}
public void setMId(Long mId) {
this.mId = mId;
}
public BFILE getGuide() {
return guide;
}
public void setGuide(BFILE guide) {
this.guide = guide;
}
}
I have defined a BFILE userType:
public class BFILEType implements UserType, Serializable {
#SuppressWarnings("unused")
private static final String MARK_EMPTY = "<EmptyString/>";
private static final int[] TYPES = { OracleTypes.BFILE };
public int[] sqlTypes() {
return TYPES;
}
#SuppressWarnings("rawtypes")
public Class returnedClass() {
return BFILE.class;
}
public boolean equals(Object x, Object y) {
if (x==y) return true;
if (x==null || y==null) return false;
return x.equals(y);
}
public Object deepCopy(Object x) {
return x;
}
public boolean isMutable() { return false; }
public Object nullSafeGet(ResultSet rs, String[] names, Object owner) throws
HibernateException, SQLException {
BFILE bfile = (BFILE)rs.getObject(names[0]);
return bfile;
}
public void nullSafeSet(PreparedStatement st, Object value, int index) throws
HibernateException, SQLException {
if(value==null)
st.setObject(index, null);
else
st.setObject(index, value, OracleTypes.BFILE);
}
public Object assemble(Serializable arg0, Object arg1) throws HibernateException {
return deepCopy(arg0);
}
public Serializable disassemble(Object value) {
return (Serializable) deepCopy(value);
}
public int hashCode(Object arg0) throws HibernateException {
return arg0.hashCode();
}
public Object replace(Object arg0, Object arg1, Object arg2) throws
HibernateException {
return deepCopy(arg0);
}
}
The problem is that when I try to set the file in the controller:
manual.setGuide((oracle.sql.BFILE) form.getFile());
it compiles fine, but when I upload a file from the screen it throws the following exception:
java.lang.ClassCastException: org.springframework.web.multipart.commons.CommonsMultipartFile cannot be cast to oracle.sql.BFILE
How can I solve this?
Solved:
I tried:
public void nullSafeSet(PreparedStatement st, Object value, int index) throws HibernateException, SQLException {
if (value == null) {
st.setObject(index, null);
} else {
OracleConnection oc = (OracleConnection) st.getConnection();
OraclePreparedStatement opst = (OraclePreparedStatement) st;
OracleCallableStatement ocs = (OracleCallableStatement) oc.prepareCall("{? = call BFILENAME('" + directory + "', '"+ filename + "')}");
ocs.registerOutParameter(1, OracleTypes.BFILE);
ocs.execute();
BFILE bfile = ocs.getBFILE(1);
opst.setBFILE(index, bfile);
}
}
and it's working fine. One detail the snippet above glosses over is where the directory and filename values come from; a sketch of that follows below.
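For completeness, here is how the uploaded file itself can reach that directory. This is only a hedged sketch: the folder path and the controller wiring are assumptions, not part of the original code.
// Hypothetical controller-side step: copy the uploaded bytes into the
// operating-system folder that the Oracle DIRECTORY object points to,
// and keep the file name so nullSafeSet can build the locator via BFILENAME.
String filename = form.getFile().getOriginalFilename();
File target = new File("/u01/app/manuals", filename); // assumed folder behind the DIRECTORY object
form.getFile().transferTo(target);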
Thanks!
java.lang.ClassCastException: org.springframework.web.multipart.commons.CommonsMultipartFile cannot be cast to oracle.sql.BFILE
That is correct: the two classes don't share a type hierarchy, so there is no way to cast one to the other. Instead of casting, you should concentrate on transferring the file's contents.
This should be the code you need:
BFILE bfile = new BFILE();
bfile.setBytes(form.getFile().getBytes());
manual.setGuide(bfile);
UPDATE: It turns out it's not that easy, as a BFILE can't just be constructed. Here's an Oracle tutorial for working with BFILEs in Java.
CommonsMultipartFile represents a field of a multipart form request, i.e. one item in a collection of fields. You have to extract the file content from it, e.g. by calling getFileItem() or getInputStream().
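For example, a minimal sketch of pulling the content out of the multipart field:
// Either read the whole upload into memory ...
byte[] bytes = form.getFile().getBytes();
// ... or stream it, which is kinder to memory for large files.
InputStream in = form.getFile().getInputStream();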
I would like to marshal and unmarshal objects whose fields are instances of a static nested class of their own class.
class A {
private B field_b=null;
public static class B {
public static B B1 = new B("b1");
public static B B2 = new B("b2");
private final String name;
private B(String name) {
this.name=name;
}
}
public B getBforName(String name) {
if (B.B1.name.equals(name)) return B.B1;
else if (B.B2.name.equals(name)) return B.B2;
else return null;
}
}
I want to produce and read XML like this:
<A>
<field_B>b1</field_B>
</A>
The writing part is easy.
The reading part is more complicated.
I would like to write a converter:
public class BConverter implements Converter {
public boolean canConvert(Class type) {
return B.class.isAssignableFrom(type) ;
}
public void marshal(Object source, HierarchicalStreamWriter writer, MarshallingContext context) {
// ...
}
public Object unmarshal(HierarchicalStreamReader reader, UnmarshallingContext context) {
A parent_v1 = (A) context.getCurrentObject(); // !!! always empty
A parent_v2 = (A) context.get("current_unmarshalled_A");
return parent_v2.getBforName((String)reader.getValue());
}
}
The context.getCurrentObject() returns null and seems to be obsolete (from forums I read).
The context.get("current_unmarshalled_A")would require that I put in this unMarshallingContext that key and the A object being unmarshalled. I don't find to do that without writing an AConverter. And that is not neat as I would loose the default unmarshalling behaviour for the class A.
Anyone has an idea ?
I'm not sure this is the cleanest way to do it, but it works.
I use a static method in the B class ...
class A {
private B field_b=null;
public static class B {
public static B B1 = new B("b1");
public static B B2 = new B("b2");
public static B getForName(String name) {
if (B1.name.equals(name)) return B1;
else if (B2.name.equals(name)) return B2;
else return null;
}
private final String name;
private B(String name) {
this.name=name;
}
// accessor used by BConverter.marshal()
public String getName() {
return name;
}
}
}
... and reflection in the Converter
public class BConverter implements Converter {
public boolean canConvert(Class type) {
return B.class.isAssignableFrom(type) ;
}
public void marshal(Object source, HierarchicalStreamWriter writer, MarshallingContext context) {
writer.setValue(((B) source).getName());
}
public Object unmarshal(HierarchicalStreamReader reader, UnmarshallingContext context) {
try {
Method method = context.getRequiredType().getMethod("getForName", String.class);
final String v = reader.getValue();
Object b= method.invoke(null, v);
if (b== null)
throw new ConversionException("Could not retrieve a B object for \"" + v + "\"");
return b;
} catch (Exception ex) {
throw new ConversionException("Error while retrieving a B object", ex);
}
}
}
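For completeness, a hedged usage sketch showing how the converter might be wired up; the alias names are assumptions chosen to match the XML above:
// Hypothetical wiring: register the converter and map <A>/<field_B> onto the class and field.
XStream xstream = new XStream();
xstream.registerConverter(new BConverter());
xstream.alias("A", A.class);
xstream.aliasField("field_B", A.class, "field_b");

A a = (A) xstream.fromXML("<A><field_B>b1</field_B></A>");
String xml = xstream.toXML(a); // writes <field_B>b1</field_B> back out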
I am attempting to store JSON in a PostgreSQL 9.4 database using the JSONB data type, with Dropwizard and JDBI. I am able to store the data, but if my JSON goes any deeper than a single level, the nested part gets stored as a string instead of nested JSON.
For instance, the following json
{
"type":"unit",
"nested": {
"key":"embedded"
}
}
actually gets stored as
{
"type":"unit",
"nested":"{key=embedded}"
}
The method signature in my DAO is
#SqlUpdate("insert into entity_json(id, content) values(:id, :content\\:\\:jsonb)")
protected abstract void createJson(#Bind("id") String id, #Bind("content") Map content);
I obviously have something wrong, but I can't seem to figure out the correct way to store this nested data.
You can use PGobject to build a JSONB value in Java. This way you can avoid any special handling as part of the SQL:
PGobject dataObject = new PGobject();
dataObject.setType("jsonb");
dataObject.setValue(value.toString());
A full example, including converting an object to a tree and using an ArgumentFactory to convert it to a PGobject, could look like this:
public class JsonbTest {
@Test
public void tryoutjson() throws Exception {
final DBI dbi = new DBI("jdbc:postgresql://localhost:5432/sighting", "postgres", "admin");
dbi.registerArgumentFactory(new ObjectNodeArgumentFactor());
Sample sample = dbi.onDemand(Sample.class);
ObjectMapper mapper = new ObjectMapper();
int id = 2;
User user = new User();
user.emailaddress = "me@home.com";
user.posts = 123;
user.username = "test";
sample.insert(id, mapper.valueToTree(user));
}
public static class User {
public String username, emailaddress;
public long posts;
}
public interface Sample {
#SqlUpdate("INSERT INTO sample (id, data) VALUES (:id, :data)")
int insert(#Bind("id") long id, #Bind("data") TreeNode data);
}
public static class ObjectNodeArgumentFactor implements ArgumentFactory<TreeNode> {
private static class ObjectNodeArgument implements Argument {
private final PGobject value;
private ObjectNodeArgument(PGobject value) {
this.value = value;
}
@Override
public void apply(int position,
PreparedStatement statement,
StatementContext ctx) throws SQLException {
statement.setObject(position, value);
}
}
@Override
public boolean accepts(Class<?> expectedType, Object value, StatementContext ctx) {
return value instanceof TreeNode;
}
@Override
public Argument build(Class<?> expectedType, TreeNode value, StatementContext ctx) {
try {
PGobject dataObject = new PGobject();
dataObject.setType("jsonb");
dataObject.setValue(value.toString());
return new ObjectNodeArgument(dataObject);
} catch (SQLException e) {
throw new RuntimeException(e);
}
}
}
}
I was able to solve this by passing in a string obtained by calling writeValueAsString(Map) on a Jackson ObjectMapper. My createJson method turned into:
@SqlUpdate("insert into entity_json(id, content) values(:id, :content\\:\\:jsonb)")
public abstract void createJson(@Bind("id") String id, @Bind("content") String content);
and I obtained the string to pass in by creating a mapper:
private ObjectMapper mapper = Jackson.newObjectMapper();
and then calling:
mapper.writeValueAsString(map);
This gave me the nested json I was looking for.
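To tie it together, here is a hedged usage sketch; the dao variable stands for an instance of the abstract DAO above, obtained however your Dropwizard/JDBI setup creates it (e.g. via dbi.onDemand()):
// Hypothetical caller: build the nested structure as Maps, serialize once,
// and hand the resulting JSON string to the DAO.
ObjectMapper mapper = Jackson.newObjectMapper();

Map<String, Object> nested = new HashMap<String, Object>();
nested.put("key", "embedded");

Map<String, Object> content = new HashMap<String, Object>();
content.put("type", "unit");
content.put("nested", nested);

dao.createJson("some-id", mapper.writeValueAsString(content));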
How can I customize Jackson to deserialize a value that contains a ${token}?
Here is an example of the functionality I want to add, borrowed from Apache Commons Configuration variable interpolation:
{
"file" = "${sys:user.home}/path/to/my_file"
}
You can register a custom String deserialiser, based on the default one, which interpolates the variables after deserialisation.
But, as was pointed out in the comments, that would not work for non-String types such as File and URL. The better idea is to override the getText() and getValueAsString() methods of the JsonParser by passing in a custom JsonFactory.
Here is an example:
public class JacksonInterpolateString {
static final String JSON = "{ \"file\":\"${sys:user.home}/path/to/my_file\" }";
public static class Bean {
public File file;
@Override
public String toString() {
return file.toString();
}
}
private static class MyJsonParser extends JsonParserDelegate {
public MyJsonParser(final JsonParser d) {
super(d);
}
@Override
public String getText() throws IOException {
final String value = super.getText();
if (value != null) {
return interpolateString(value);
}
return value;
}
@Override
public String getValueAsString() throws IOException {
return getValueAsString(null);
}
@Override
public String getValueAsString(final String defaultValue) throws IOException {
final String value = super.getValueAsString(defaultValue);
if (value != null) {
return interpolateString(value);
}
return null;
}
}
private static class MyMappingJsonFactory extends MappingJsonFactory {
@Override
protected JsonParser _createParser(
final char[] data,
final int offset,
final int len,
final IOContext ctxt,
final boolean recyclable)
throws IOException {
return new MyJsonParser(super._createParser(data, offset, len, ctxt, recyclable));
}
@Override
protected JsonParser _createParser(final Reader r, final IOContext ctxt)
throws IOException {
return new MyJsonParser(super._createParser(r, ctxt));
}
}
private static String interpolateString(final String value) {
return value.replace("${sys:user.home}", "/home/user");
}
public static void main(String[] args) throws IOException {
final JsonFactory factory = new MyMappingJsonFactory();
final ObjectMapper mapper = new ObjectMapper(factory);
System.out.println(mapper.readValue(JSON, Map.class));
System.out.println(mapper.readValue(JSON, Bean.class));
}
}
Output:
{file=/home/user/path/to/my_file}
/home/user/path/to/my_file
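If Commons Lang is already on the classpath, the hard-coded interpolateString() could be swapped for something along these lines. This is only a hedged sketch using org.apache.commons.lang.text.StrSubstitutor, which understands plain ${user.home}-style tokens, so the Commons Configuration "sys:" prefix is stripped first; other prefixes would need their own handling.
private static String interpolateString(final String value) {
    // Crude prefix stripping for the sake of the example; full Commons
    // Configuration interpolation supports several lookup prefixes.
    final String plain = value.replace("${sys:", "${");
    return StrSubstitutor.replaceSystemProperties(plain);
}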
I have a hibernate-mapped Java object, JKL, which is full of a bunch of normal hibernate-mappable fields (like strings and integers).
I'm adding a new embedded field to it (which lives in the same table -- not a mapping), asdf, which is a fj.data.Option<ASDF>. I've made it an Option to make it clear that this field may not actually contain anything (as opposed to having to handle null every time I access it).
How do I set up the mapping in my JKL.hbm.xml file? I'd like hibernate to automatically convert a null in the database to a none of fj.data.Option<ASDF> when it retrieves the object. It should also convert a non-null instance of ASDF to a some of fj.data.Option<ASDF>.
Is there any other trickery that I have to do?
I would suggest introducing FunctionalJava's Option in the accessors (getter and setter), while leaving Hibernate to handle a simple Java field that is allowed to be null.
For example, for an optional Integer field:
// SQL
CREATE TABLE `JKL` (
`JKL_ID` INTEGER PRIMARY KEY,
`MY_FIELD` INTEGER DEFAULT NULL
)
You can map the private field directly in Hibernate:
// Java
@Column(nullable = true)
private Integer myField;
You could then introduce Option at the accessor boundary:
// Java
public fj.data.Option<Integer> getMyField() {
return fj.data.Option.fromNull(myField);
}
public void setMyField(fj.data.Option<Integer> value) {
myField = value.toNull();
}
Does that work for your needs?
You can use Hibernate's custom mapping types. Documentation is here. Here is an analogous example of mapping Scala's Option with a Hibernate custom type.
Simply put, you would need to implement the org.hibernate.UserType interface. You could also create a generic-typed base class with a JKL-typed sub-type, similar to what you see in the Scala example.
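To make that concrete, here is a minimal, hedged skeleton of such a generic-typed base class; the names, the SQL type handling and the cast are illustrative assumptions, not taken from the linked Scala example. A JKL-specific subclass would supply the wrapped class and the column's JDBC type, plus the remaining UserType boilerplate.
public abstract class OptionUserType<T> implements org.hibernate.usertype.UserType {

    // Supplied by a JKL-specific subclass, e.g. ASDF.class and java.sql.Types.VARCHAR.
    protected abstract Class<T> wrappedClass();
    protected abstract int columnSqlType();

    public int[] sqlTypes() {
        return new int[] { columnSqlType() };
    }

    public Class returnedClass() {
        return fj.data.Option.class;
    }

    public Object nullSafeGet(java.sql.ResultSet rs, String[] names, Object owner)
            throws java.sql.SQLException {
        // NULL in the database becomes none(), anything else becomes some(...);
        // this assumes the JDBC driver already returns the wrapped type.
        return fj.data.Option.fromNull(wrappedClass().cast(rs.getObject(names[0])));
    }

    public void nullSafeSet(java.sql.PreparedStatement st, Object value, int index)
            throws java.sql.SQLException {
        fj.data.Option<?> option = (fj.data.Option<?>) value;
        st.setObject(index, option == null ? null : option.toNull(), columnSqlType());
    }

    // equals, hashCode, deepCopy, isMutable, assemble, disassemble and replace
    // still have to be implemented, as for any other immutable UserType.
}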
I think using a getter/setter is simpler, but here's an example of what I did to make it work.
(It works fine for numbers and strings, but not for dates; I get an error with the @Temporal annotation.)
import com.cestpasdur.helpers.PredicateHelper;
import com.google.common.annotations.VisibleForTesting;
import com.google.common.base.Optional;
import org.apache.commons.lang.ObjectUtils;
import org.apache.commons.lang.StringUtils;
import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;
import org.joda.time.DateTime;
import java.io.Serializable;
import java.sql.*;
public class OptionUserType implements UserType {
@Override
public int[] sqlTypes() {
return new int[]{
Types.NULL
};
}
@Override
public Class returnedClass() {
return Optional.class;
}
@Override
public boolean equals(Object o, Object o2) throws HibernateException {
return ObjectUtils.equals(o, o2);
}
@Override
public int hashCode(Object o) throws HibernateException {
assert (o != null);
return o.hashCode();
}
@Override
public Optional<? extends Object> nullSafeGet(ResultSet rs, String[] names, Object owner) throws HibernateException, SQLException {
return Optional.fromNullable(rs.getObject(names[0]));
}
@VisibleForTesting
void handleDate(PreparedStatement st, Date value, int index) throws SQLException {
st.setDate(index, value);
}
@VisibleForTesting
void handleNumber(PreparedStatement st, String stringValue, int index) throws SQLException {
Double doubleValue = Double.valueOf(stringValue);
st.setDouble(index, doubleValue);
}
@Override
public void nullSafeSet(PreparedStatement st, Object value, int index) throws SQLException {
if (value != null) {
if (value instanceof Optional) {
Optional optionalValue = (Optional) value;
if (optionalValue.isPresent()) {
String stringValue = String.valueOf(optionalValue.get());
if (StringUtils.isNotBlank(stringValue)) {
if (PredicateHelper.IS_DATE_PREDICATE.apply(stringValue)) {
handleDate(st, new Date(DateTime.parse(stringValue).getMillis()), index);
} else if (StringUtils.isNumeric(stringValue)) {
handleNumber(st, stringValue, index);
} else {
st.setString(index, optionalValue.get().toString());
}
} else {
st.setString(index, null);
}
} else {
System.out.println("else Some");
}
} else {
//TODO replace with Preconditions guava
throw new IllegalArgumentException(value + " is not implemented");
}
} else {
st.setString(index, null);
}
}
@Override
public Object deepCopy(Object o) throws HibernateException {
return o;
}
@Override
public boolean isMutable() {
return false;
}
@Override
public Serializable disassemble(Object o) throws HibernateException {
return (Serializable) o;
}
@Override
public Object assemble(Serializable serializable, Object o) throws HibernateException {
return serializable;
}
@Override
public Object replace(Object original, Object target, Object owner) throws HibernateException {
return original;
}
}
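For reference, a hedged sketch of how a field could be mapped with this type; the package, column name and field are assumptions, and the fully-qualified name passed to @Type has to match wherever OptionUserType actually lives:
// Hypothetical entity field using the OptionUserType above (Guava's Optional, as in the code).
@Column(name = "my_field")
@Type(type = "com.example.OptionUserType")
private Optional<Integer> myField = Optional.absent();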
Oracle supports the use of the VARRAY and NESTED TABLE data types, allowing multivalued attributes. (http://www.orafaq.com/wiki/NESTED_TABLE)
I am currently using Hibernate 3 as my ORM framework, but I can't see how I can map Hibernate to a NESTED TABLE/VARRAY data type in my database.
I looked at defining custom types in Hibernate, with no success. (Can Hibernate even handle the "COLUMN_VALUE" Oracle keyword necessary to unnest the subtable?)
Does anyone know how to implement these data types in Hibernate?
Thank you all for your help.
-- TBW.
Here is a Hibernate UserType for Oracle's TABLE OF NUMBER type.
OracleNativeExtractor can be found here: https://community.jboss.org/wiki/MappingOracleXmlTypeToDocument . Replace the string YOUR_CUSTOM_ARRAY_TYPE with the name of your own array type.
import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;
import org.apache.commons.lang.ArrayUtils;
import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;
import java.io.Serializable;
import java.sql.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public class ArrayUserType
implements UserType, Serializable {
private static final OracleNativeExtractor EXTRACTOR = new OracleNativeExtractor();
@Override
public int[] sqlTypes() {
return new int[]{Types.ARRAY};
}
@Override
public Class returnedClass() {
return List.class;
}
@Override
public boolean equals(Object x, Object y) throws HibernateException {
if (x == null && y == null) return true;
else if (x == null && y != null) return false;
else return x.equals(y);
}
@Override
public int hashCode(Object x) throws HibernateException {
return x.hashCode();
}
@Override
public Object nullSafeGet(ResultSet rs, String[] names, Object owner) throws HibernateException, SQLException {
return Arrays.asList(ArrayUtils.toObject(((ARRAY) rs.getObject(names[0])).getLongArray()));
}
@Override
public void nullSafeSet(PreparedStatement st, Object value, int index) throws HibernateException, SQLException {
ARRAY array = null;
if (value != null) {
Connection nativeConn = EXTRACTOR.getNativeConnection(st.getConnection());
ArrayDescriptor descriptor =
ArrayDescriptor.createDescriptor("YOUR_CUSTOM_ARRAY_TYPE", nativeConn);
array = new ARRAY(descriptor, nativeConn, ((List<Long>) value).toArray(new Long[]{}));
}
st.setObject(index, array);
}
@Override
public Object deepCopy(Object value) throws HibernateException {
if (value == null) return null;
return new ArrayList<Long>((List<Long>) value);
}
@Override
public boolean isMutable() {
return false;
}
public Object assemble(Serializable _cached, Object _owner)
throws HibernateException {
return _cached;
}
public Serializable disassemble(Object _obj)
throws HibernateException {
return (Serializable) _obj;
}
@Override
public Object replace(Object original, Object target, Object owner) throws HibernateException {
return deepCopy(original);
}
}
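For reference, a hedged sketch of the corresponding entity mapping; the package, column and field names are assumptions, and YOUR_CUSTOM_ARRAY_TYPE is expected to be an Oracle collection type such as a TABLE OF NUMBER created separately in the schema:
// Hypothetical entity field mapped with the ArrayUserType above; adjust the
// fully-qualified class name to wherever ArrayUserType actually lives.
@Column(name = "numbers")
@Type(type = "com.example.ArrayUserType")
private List<Long> numbers = new ArrayList<Long>();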
I hope I'm wrong and that you find a better answer in your research, but this feature is not supported in Hibernate. Hibernate relies on standard JDBC to talk to the database, and these features are not part of the standard; they are Oracle extensions.
That said, I can think of a few workarounds:
1) Implement your own UserType. With your specific user type, you'll have a chance to manipulate the values provided by the database (or about to be sent to the database). But that will only work if Oracle provides this value as one of these java.sql.Types: http://download.oracle.com/javase/1.5.0/docs/api/java/sql/Types.html
2) The other option is to use JDBC directly, through the use of a Hibernate worker. See this example of a Worker: https://github.com/hibernate/hibernate-core/blob/master/hibernate-core/src/test/java/org/hibernate/test/jdbc/GeneralWorkTest.java
That said, I think you have to weigh the solutions and re-evaluate whether you really need a nested table.