How can I populate a PostgreSQL database with an INSERT INTO SQL script built from the following Java entity class's data? The issue is how to write the content of the byte array as an INSERT INTO value ('content of the byte[]') in the SQL script.
The data cannot be fed in via the Java application; the requirement is a raw SQL script to populate the existing database in the production environment. Thanks.
Entity class
@Entity
@Table(name = "image_table")
public class ImageData implements Serializable {

    @Id
    @GeneratedValue
    @Column(name = "id")
    private Integer id;

    @Column(name = "content")
    private byte[] content;
}
Format of the raw SQL script that needs to be generated:
INSERT INTO image_table (id, content) VALUES ('1', '<content of the byte[]>');
INSERT INTO image_table (id, content) VALUES ('2', '<content of the byte[]>');
To answer your question literally: you can write a SQL INSERT script for an integer and a blob value, but it would be rather horrible, with hex-escaped strings for the bytes, and you could easily run into problems with long statements for larger blobs; PostgreSQL itself has no practical limit, but most regular text editors do.
As of PostgreSQL 9.0 you can use the hex format (documentation here). Basically, a hex value in text representation looks like E'\\x3FB5419C' (the doubled backslash keeps the \x intact through string parsing; with standard_conforming_strings on, the default since 9.1, a plain '\x3FB5419C' also works). So from your method you output E'\\x, then the byte[] as a hex string, then the closing '. Writing the hex string from the byte[] content can be done with org.apache.commons.codec.binary.Hex.encodeHexString(content), or use this SO answer for a plain Java solution. Depending on your production environment you may have to escape the backslash further or fiddle with the newline character. I suggest you give this a try with a small blob to see what works for you.
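If you do go down this route, a minimal sketch of such a script generator might look like this (assuming Commons Codec is on the classpath; the entity and table names are taken from the question):
import java.io.PrintWriter;
import org.apache.commons.codec.binary.Hex;

public class InsertScriptGenerator {
    // Writes one INSERT statement per image, with the blob encoded as a
    // PostgreSQL hex literal. E'\\x...' keeps the backslash intact through
    // string parsing; with standard_conforming_strings on (the default
    // since 9.1) a plain '\x...' literal would also work.
    public static void writeInsert(PrintWriter out, int id, byte[] content) {
        out.printf("INSERT INTO image_table (id, content) VALUES (%d, E'\\\\x%s');%n",
                id, Hex.encodeHexString(content));
    }
}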
A better approach is direct insertion. Assuming that you are using the PostgreSQL JDBC driver, the documentation has a worked-out example. Given that you have a byte[] class member, you should use the setBytes() method rather than setBinaryStream() (which expects an InputStream instance):
PreparedStatement ps = conn.prepareStatement(
        "INSERT INTO image_table (id, content) VALUES (?, ?)");
ps.setInt(1, 1);
ps.setBytes(2, content);
ps.executeUpdate();
ps.setInt(1, 2);        // the content parameter keeps its previous value
ps.executeUpdate();
ps.close();
You have to place the @Lob annotation on your content column. The final result would be something like this:
import javax.persistence.Lob;
.
.
.
@Lob
@Column(name = "content")
private byte[] content;
Related
I need to store a large string in an Oracle database; its length will be at most 10000 bytes. I understand there is a configuration option in Oracle 12c that can raise the 4000-byte limit of VARCHAR2, but I do not have the option to use it.
So I am inclined to use the CLOB data type. I have no previous experience with CLOB, so I have my concerns.
I saw the following on SO
Java: How to insert CLOB into oracle database
I did not want to use any Oracle-specific package to handle the CLOB type. My question is: is the following safe enough for my purpose?
To store:
try {
    String myclobstring = "xx ........";
    String sql = "Insert into mytable (clobfield) values (?)";
    PreparedStatement stmt = conn.prepareStatement(sql);
    stmt.setString(1, myclobstring);
    .
    .
}
To Retrieve:
try {
    String sql = "select clobfield from mytable";
    stmt = conn.createStatement();
    ResultSet result = stmt.executeQuery(sql);
    String s = result.getString("clobfield");
    .
    .
}
You can create the CLOB from a String, though using stmt.setCharacterStream might be a bit better for something very large. Here is an example I found that shows this nicely:
Storing Clobs
You can also use java.sql.Clob if you want to avoid Oracle-specific code.
From Oracle's documentation on CLOB
Use the java.sql.Clob and create the CLOB with the connection's createClob function.
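A minimal sketch of that portable approach (the table and column names are the ones from the question; requires a JDBC 4.0+ driver):
import java.sql.Clob;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Inserts text as a CLOB without any Oracle-specific classes.
void insertClob(Connection conn, String text) throws SQLException {
    Clob clob = conn.createClob();
    clob.setString(1, text); // CLOB positions are 1-based
    try (PreparedStatement ps = conn.prepareStatement(
            "Insert into mytable (clobfield) values (?)")) {
        ps.setClob(1, clob);
        ps.executeUpdate();
    }
    clob.free();
}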
I thought I would post an answer, as I did not see an explicit one. So far setString()/getString() is storing up to 10K characters into the CLOB field, and I am getting back what I stored without a problem.
I did see the following related SO post that gives me a bit more confidence.
How to use setClob() in PreparedStatement in JDBC
I'm using Spring Boot with Hibernate, JPA and PostgreSQL. I want to convert database large objects into text content. Previously I defined my long text in my JPA entity as a @Lob:
@Lob
String description;
I then discovered that @Lob columns often cause problems and decided to change the mapping to:
@Type(type = "org.hibernate.type.StringClobType")
String description;
This is represented in the database as a text type. Unfortunately, the reference numbers (oids) of the previous large objects are now stored in my rows instead of the actual content. For example:
id | description
---------------------
1 | 463784 <- This is a reference to the object rather than the content
instead of:
id | description
---------------------
1 | Once upon a time, in a galaxy...
My question: now that we have thousands of rows of data in the database, how do I write a function or query that replaces each large-object id with the actual text content stored in that large object?
Special thanks to @BohuslavBurghardt for pointing me to this answer. For your convenience:
UPDATE table_name SET column_name = lo_get(cast(column_name as bigint))
I needed some additional conversion:
UPDATE table_name SET text_no_lob = convert_from(lo_get(text::oid), 'UTF8');
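If you would rather drive the migration from Java, a rough JDBC sketch could look like this (the method name is mine; the table and column names are those from the UPDATE above):
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

// One-off migration: replace stored large-object oids with the text they reference.
void migrateLobColumn(Connection conn) throws SQLException {
    try (Statement stmt = conn.createStatement()) {
        int updated = stmt.executeUpdate(
                "UPDATE table_name SET text_no_lob = convert_from(lo_get(text::oid), 'UTF8')");
        System.out.println("Migrated " + updated + " rows");
    }
}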
I had the same problem with Spring, Postgres and JPA (Hibernate). I had a payload field like the one below:
@NotBlank
@Column(name = "PAYLOAD")
private String payload;
I wanted to change the data type to text to support large data, so I used @Lob and got the same error. To resolve it, I first changed the field in my entity like this:
@NotBlank
@Column(name = "PAYLOAD")
@Lob
@Type(type = "org.hibernate.type.TextType")
private String payload;
And because the existing data in this column consisted of large-object references (numbers), I converted it to plain text with the command below in Postgres:
UPDATE MYTABLE SET PAYLOAD = lo_get(cast(PAYLOAD as bigint))
Thanks a lot @Sipder.
I have a problem while selecting from a table containing UTF-8 data in MySQL using Java: in the WHERE clause I need to check whether a column value is equal to a Java String, but they don't match.
"Select age from student where name = '"+stringVariable+"';"
The stringVariable can sometimes be in Arabic, in which case they don't match.
The database is utf8mb4 and the connection between Java and the database is also UTF-8. I don't have a problem inserting or just selecting data; the problem only occurs when comparing.
I tried converting the string like this, and it also didn't match:
byte[] b = stringVariable.getBytes("UTF-8");
String str = new String(b,"UTF-8");
So does anyone have a solution for this?
Thank you in advance.
Use parameters. The driver should then encode them correctly according to the connection properties.
PreparedStatement statement = connection.prepareStatement(
"select age from student where name = ?");
statement.setString(1, stringVariable);
In addition, this also correctly escapes special SQL characters (like single quotes).
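For completeness, executing the statement and reading the result would then look something like this (continuing the snippet above):
ResultSet result = statement.executeQuery();
while (result.next()) {
    int age = result.getInt("age");
    // ...
}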
You shouldn't have to transform anything; the driver takes care of it. Try using a prepared statement rather than string concatenation. This has the additional advantage of preventing SQL injection attacks, and it makes sure your statement works even if the string variable contains a quote.
I solved this problem. It was not only related to the database query: I get the string from an AJAX request, so I had to decode it from UTF-8 first and then read the parameter value using
URLDecoder.decode(request.getQueryString(), "UTF-8")
and then pass it to the query. I also used a prepared statement, which encodes the value according to the connection properties, as you mentioned.
Thank you
I have a JPA entity with a blob field in it. I want to write a JPQL query that fetches the length of the entity's blob (I don't want to load the blob into memory).
For instance, in Oracle, I can use the following SQL query:
SELECT length(blob_field) FROM my_table WHERE id = ?
How can I fetch the blob's length with JPQL?
You cannot. JPQL's LENGTH function counts the number of characters in a string, and the number of bytes in a BLOB doesn't fit that definition.
You can, however, use LENGTH for attributes that are persisted as CLOB (char/Character arrays or String as the Java type).
// can use the LENGTH function for CLOBs:
@Lob char[] a;
@Lob Character[] b;
@Lob String c;
// cannot use the LENGTH function for BLOBs:
@Lob byte[] d;
@Lob Byte[] e;
@Lob Serializable f;
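For example, a length query against one of the CLOB attributes (the entity and attribute names here are illustrative) might look like:
// Works because 'c' is mapped as a CLOB (character data):
Integer len = em.createQuery(
        "SELECT LENGTH(e.c) FROM MyEntity e WHERE e.id = :id", Integer.class)
    .setParameter("id", id)
    .getSingleResult();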
If you are using EclipseLink, you can use the "FUNC" function in JPQL to call a native database function.
See,
http://wiki.eclipse.org/EclipseLink/UserGuide/JPA/Basic_JPA_Development/Querying/Support_for_Native_Database_Functions
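A sketch of that approach against Oracle, where dbms_lob.getlength returns the length of a LOB (the entity and attribute names are illustrative):
// EclipseLink-specific: delegate to the database's own LOB length function.
Number len = (Number) em.createQuery(
        "SELECT FUNC('dbms_lob.getlength', e.content) FROM MyEntity e WHERE e.id = :id")
    .setParameter("id", id)
    .getSingleResult();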
I am able to save (with Spring/Hibernate saveOrUpdate()) the field
@Lob
@Column(name = "FILENAME")
private String filename;
into an Oracle database; the column datatype is CLOB. But when I try to retrieve it, I get this error:
ERROR - JDBCExceptionReporter.logExceptions(72) | ORA-00932: inconsistent datatypes: expected - got CLOB
Below is how I retrieve it from the database:
DetachedCriteria crit = DetachedCriteria.forClass(Storagefile.class);
crit.addOrder(bSortOrder ? Order.asc(sortColumnId) : Order.desc(sortColumnId));
List<Storagefile> result = (List<Storagefile>) getHibernateTemplate().findByCriteria(crit, nFirst, nPageSize);
It's not clear from your sample code, but my guess is that you're trying to sort by the CLOB column, and Oracle does not permit that. That error code is Oracle's charming way of telling you so.
Are you sure you need a CLOB to store a filename? Oracle can store up to 4000 characters in a VARCHAR2 column; surely that's enough for a filename? If you want to sort by the filename, that's what you'll need to do.
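A sketch of what that mapping could look like (plain JPA, no @Lob; 4000 matches Oracle's default VARCHAR2 limit):
import javax.persistence.Column;
.
.
.
// Stored as VARCHAR2, so the column can be used in ORDER BY.
@Column(name = "FILENAME", length = 4000)
private String filename;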
Have you waded through this:
https://www.hibernate.org/56.html
There seems to be an issue with the Oracle 9i driver and LOBs (not sure what your setup is).