How to fetch the length of a blob field with JPQL?

I have a JPA entity with a blob field in it. I want to write a JPQL query that fetches the length of the entity's blob (I don't want to load the blob into memory).
For instance, in Oracle, I can use the following SQL query:
SELECT length(blob_field) FROM my_table WHERE id = ?
How can I fetch the blob's length with JPQL?

You cannot. JPQL's LENGTH function counts the number of characters in a string, and the number of bytes in a BLOB does not fit that definition.
However, you can use LENGTH for attributes that are persisted as CLOBs (a char/Character array or a String as the Java type).
//can use the length function for CLOBs:
@Lob char[] a;
@Lob Character[] b;
@Lob String c;
//cannot use the length function for BLOBs:
@Lob byte[] d;
@Lob Byte[] e;
@Lob Serializable f;
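For example, the length function can then be used in a query like the following (a minimal sketch; the entity name MyEntity and its CLOB-mapped String attribute c are hypothetical, and the exact numeric result type can vary by provider):
TypedQuery<Number> query = entityManager.createQuery(
        "SELECT LENGTH(e.c) FROM MyEntity e WHERE e.id = :id", Number.class);
query.setParameter("id", 1L);
Number charCount = query.getSingleResult(); // number of characters, not bytes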

If you are using EclipseLink, you can use the FUNC operator in JPQL to call a native database function.
See:
http://wiki.eclipse.org/EclipseLink/UserGuide/JPA/Basic_JPA_Development/Querying/Support_for_Native_Database_Functions
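For instance, a sketch that passes the database's own length function straight through to SQL (assuming EclipseLink's FUNC operator and a hypothetical entity MyEntity with a blobField attribute; on JPA 2.1 the standard FUNCTION keyword can be used instead):
// FUNC hands 'LENGTH' to the database, so the blob itself is never loaded into memory.
TypedQuery<Number> query = entityManager.createQuery(
        "SELECT FUNC('LENGTH', e.blobField) FROM MyEntity e WHERE e.id = :id",
        Number.class);
query.setParameter("id", 1L);
Number byteCount = query.getSingleResult();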

Related

Why can't I fetch bytea values from my postgres db?

I have a DAO with these methods:
@SqlUpdate("INSERT INTO my_test (ba) VALUES (:ba)")
void insertBytea(@Bind("ba") byte[] ba);
@SqlQuery("SELECT ba from my_test fetch first 1 row only")
byte[] selectBytea();
When I execute the insert method:
byte[] bytea = new byte[1];
bytea[0] = 1;
myDao.insertBytea(bytea);
the value ends up in the database.
So far so good.
But when I retrieve it:
byte[] bytes = myDao.selectBytea();
... this happens:
...
Caused by: org.postgresql.util.PSQLException: Bad value for type byte : \x01
at org.postgresql.jdbc.PgResultSet.getByte(PgResultSet.java:2135)
at org.jdbi.v3.core.mapper.PrimitiveMapperFactory.lambda$primitiveMapper$0(PrimitiveMapperFactory.java:64)
at org.jdbi.v3.core.mapper.SingleColumnMapper.lambda$new$0(SingleColumnMapper.java:41)
at org.jdbi.v3.core.mapper.SingleColumnMapper.map(SingleColumnMapper.java:55)
at org.jdbi.v3.core.result.ResultSetResultIterator.next(ResultSetResultIterator.java:83)
I'm not sure what is going on, but when I debug the code, it seems as if the Postgres library has transformed the value from a byte array to a string and back to a byte array.
The values [92, 120, 48, 49] correspond to the string "\x01", which seems to be one of the ways Postgres expresses bytea values.
I am using the jdbi3 libraries to access the database, and I depend on the postgresql artifact, version 42.2.18.
JDBI's internal mapper strategy detects the byte[] return type as a container type and expects the query to return an array of single-byte values. In your case, however, the query returns a single value that happens to contain an array of bytes.
The solution is simple: just add the org.jdbi.v3.sqlobject.SingleValue annotation to your method.
From the @SingleValue javadoc:
Indicate to SqlObject that a type that looks like a container should be treated as a single element.
Your particular example will look like the following:
@SingleValue
@SqlQuery("SELECT ba from my_test fetch first 1 row only")
byte[] selectBytea();
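Put together, the whole DAO would look roughly like this (a sketch assuming the JDBI 3 SQL Object module; the interface name MyTestDao is made up):
import org.jdbi.v3.sqlobject.SingleValue;
import org.jdbi.v3.sqlobject.customizer.Bind;
import org.jdbi.v3.sqlobject.statement.SqlQuery;
import org.jdbi.v3.sqlobject.statement.SqlUpdate;

public interface MyTestDao {

    @SqlUpdate("INSERT INTO my_test (ba) VALUES (:ba)")
    void insertBytea(@Bind("ba") byte[] ba);

    // @SingleValue: treat byte[] as one bytea value, not a container of single-byte rows
    @SingleValue
    @SqlQuery("SELECT ba from my_test fetch first 1 row only")
    byte[] selectBytea();
}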

How do I convert a column of large objects to long text?

I'm using Spring Boot with Hibernate, JPA and PostgreSQL. I want to convert database large objects into text content. Previously I was defining my long text in my JPA entity as @Lob:
@Lob
String description;
I then discovered that @Lob often causes problems and decided to change the mapping to:
@Type(type="org.hibernate.type.StringClobType")
String description;
This is represented in the database as a text type. Unfortunately, the reference numbers (oids) of the previous large objects are now stored in my rows instead of the actual content. For example:
id | description
---------------------
1 | 463784 <- This is a reference to the object rather than the content
instead of:
id | description
---------------------
1 | Once upon a time, in a galaxy...
My question is now that we have thousands of rows of data in the database, how do I write a function or perform a query to replace the large object id with the actual text content stored in the large object?
Special thanks to @BohuslavBurghardt for pointing me to this answer. For your convenience:
UPDATE table_name SET column_name = lo_get(cast(column_name as bigint))
I needed some additional conversion:
UPDATE table_name SET text_no_lob = convert_from(lo_get(text::oid), 'UTF8');
I had the same problem with Spring, Postgres and JPA (Hibernate). I had a payload field like the one below:
@NotBlank
@Column(name = "PAYLOAD")
private String payload;
I wanted to change the data type to text to support large data, so I used @Lob and got the same error. To resolve it I first changed the field in my entity like below:
@NotBlank
@Column(name = "PAYLOAD")
@Lob
@Type(type = "org.hibernate.type.TextType")
private String payload;
And because the existing data in this column was scalar (the large object's OID number), I converted it to normal text with the following command in Postgres:
UPDATE MYTABLE SET PAYLOAD = lo_get(cast(PAYLOAD as bigint))
Thanks a lot @Sipder.

How to dump byte array to INSERT INTO SQL script (Java + PostgreSQL)

How can I populate a PostgreSQL INSERT INTO SQL script from a data collection based on the following Java entity class? The issue is how to write the content of the byte array as an INSERT INTO value ('content of the byte[]') in the SQL script.
This data cannot be fed in via the Java application; according to the requirement, a raw SQL script is needed to populate the existing database in the production environment. Thanks.
Entity class
@Entity
@Table(name="image_table")
public class ImageData implements Serializable {
    @Id
    @GeneratedValue
    @Column(name = "id")
    private Integer id;
    @Column(name = "content")
    private byte[] content;
}
Format of the raw SQL script that needs to be generated:
INSERT INTO image_table (id, content) VALUES ('1', '<content of the byte[]>');
INSERT INTO image_table (id, content) VALUES ('2', '<content of the byte[]>');
To answer your question literally: you can write a SQL INSERT script for an integer and a blob value, but it would be rather horrible with hex-escaped strings for the bytes, and you could easily run into problems with very long statements for larger blobs; PostgreSQL itself has no practical limit, but most regular text editors do.
As of PostgreSQL 9.0 you can use the hex format (documentation here). Basically, a hex value in text representation looks like E'\x3FB5419C' etc. So you output from your method E'\x, then the byte[] as a hex string, then the closing '. Writing the hex string from the byte[] content can be done with org.apache.commons.codec.binary.Hex.encodeHexString(content) or use this SO answer for a plain Java solution. Depending on your production environment you may have to escape the backslash \\ and fiddle with the newline character. I suggest you give this a try with a small blob to see what works for you.
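A rough sketch of such a generator, assuming the content column is stored as bytea and Apache Commons Codec is on the classpath (the class name InsertScriptWriter is made up; the quoting may need the E'\x...' form with a doubled backslash instead, as noted above):
import org.apache.commons.codec.binary.Hex;

public class InsertScriptWriter {

    // Emits one INSERT line per row using PostgreSQL's hex format for binary values.
    // Verify the escaping against your server settings with a small blob first.
    static String toInsertStatement(int id, byte[] content) {
        String hex = Hex.encodeHexString(content);
        return "INSERT INTO image_table (id, content) VALUES (" + id + ", '\\x" + hex + "');";
    }
}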
A better approach is direct insertion. Assuming that you are using the PostgreSQL JDBC driver, the documentation has a worked out example. Given that you have a byte[] class member, you should use the setBytes() method instead of setBinaryStream() (which expects an InputStream instance):
PreparedStatement ps = conn.prepareStatement("INSERT INTO image_table (id, content) VALUES (?, ?)");
// first row
ps.setInt(1, 1);
ps.setBytes(2, content);
ps.executeUpdate();
// second row; the content parameter bound above stays in effect and is reused
ps.setInt(1, 2);
ps.executeUpdate();
ps.close();
You have to place the @Lob annotation on top of your content column. Your final result would be something like this:
import javax.persistence.Lob;
...
@Lob
@Column(name = "content")
private byte[] content;

How to store a file in a database as byte[] with Hibernate?

I want to store a file in a database table as byte[]. Now I have a question concerning storing a byte[] in a Postgres database table with Hibernate. I have this implementation in my entity class:
@Column
private byte[] bytes;
My database table shows the datatype byte for the byte[] column.
I can now see plain text in this column (in my case the name of the file that I want to store); for each byte the file name is repeated in the column. But why?
Is another Hibernate annotation necessary?
Thanks for help!
Greetz
Marwief
You must annotate the field with @Lob.
Note: If the column is autogenerated, you might think of setting the column definition with an additional annotation (e.g. for MySQL):
@Lob
@Column(columnDefinition = "LONGBLOB") // or VARBINARY(128) if you need a smaller limit
private byte[] bytes;
Domain
import javax.persistence.Lob;
@Lob
private byte[] image;
Repository layer
byte[] image = getImageAsBytes();
domain.setImage(image);
entityManager.persist(domain); // javax.persistence.EntityManager has persist(), not save()
It should work; at least it works for us. No special magic.

EclipseLink/JPA: Specify length of byte array

I have the following definition:
#Column(name = "password", length = 80)
byte[] password;
When I use EclipseLink to create the tables (MySQL) I get a table with a LONGBLOB column. A TINYBLOB would suffice.
How do I specify the length?
I know that I could add a columnDefinition, but I'd like to keep the mapping database/SQL agnostic.
If you want a specific type for a blob column, columnDefinition should be used. The problem with length is that it only applies to strings; this is also stated in the API documentation.
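If you can give up the database-agnostic mapping for this one column, a minimal sketch with columnDefinition might look like this (TINYBLOB is MySQL-specific and chosen only because the question mentions it would suffice):
// MySQL-specific: overrides the LONGBLOB type that EclipseLink generates by default.
@Column(name = "password", columnDefinition = "TINYBLOB")
byte[] password;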
