Quarkus Hibernate exception in native mode - java

I'm setting up a simple Quarkus application using Hibernate to test Quarkus's capabilities.
It contains a simple Hibernate mapping:
@Entity
public class AnEntity {
    public double[] data;
}
It maps to the PostgreSQL table anentity(data bytea).
This is fine for me and works well when running on the JVM: no problems on that side. I can easily read the content of the table using a native query:
Query query = em.createNativeQuery("select * from anentity", AnEntity.class);
List<AnEntity> resultList = query.getResultList();
for (AnEntity anEntity : resultList) {
    System.out.println(anEntity.data);
}
The behavior here is that the content of the data array is serialized/deserialized using "standard" Java serialization when the data field is written to and read from the database.
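Roughly, what that "standard" serialization amounts to is the following (a standalone sketch of the concept, not Hibernate's actual internals):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class SerializationSketch {
    public static void main(String[] args) throws Exception {
        double[] data = {1.0, 2.0, 3.0};

        // Writing: the array is turned into a byte stream that ends up in the bytea column.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(data);
        }

        // Reading: the byte stream is deserialized back into a double[].
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            double[] roundTripped = (double[]) in.readObject();
            System.out.println(roundTripped.length); // prints 3
        }
    }
}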
Now, when compiling in native mode, I get the following serialization-related exception:
ERROR [io.qua.run.Application] (main) Failed to start application (with profile prod): java.io.StreamCorruptedException: invalid type code: 40
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1712)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2103)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:493)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:451)
at org.hibernate.internal.util.SerializationHelper.doDeserialize(SerializationHelper.java:225)
at org.hibernate.internal.util.SerializationHelper.deserialize(SerializationHelper.java:287)
at org.hibernate.type.descriptor.java.SerializableTypeDescriptor.fromBytes(SerializableTypeDescriptor.java:138)
at org.hibernate.type.descriptor.java.SerializableTypeDescriptor.wrap(SerializableTypeDescriptor.java:113)
at org.hibernate.type.descriptor.java.SerializableTypeDescriptor.wrap(SerializableTypeDescriptor.java:29)
at org.hibernate.type.descriptor.sql.VarbinaryTypeDescriptor$2.doExtract(VarbinaryTypeDescriptor.java:60)
at org.hibernate.type.descriptor.sql.BasicExtractor.extract(BasicExtractor.java:47)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:257)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:253)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:243)
at org.hibernate.type.AbstractStandardBasicType.hydrate(AbstractStandardBasicType.java:329)
at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:3214)
at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1887)
at org.hibernate.loader.Loader.hydrateEntityState(Loader.java:1811)
at org.hibernate.loader.Loader.instanceNotYetLoaded(Loader.java:1784)
at org.hibernate.loader.Loader.getRow(Loader.java:1624)
at org.hibernate.loader.Loader.getRowFromResultSet(Loader.java:748)
at org.hibernate.loader.Loader.getRowsFromResultSet(Loader.java:1047)
I tried registering double[].class in a @RegisterForReflection annotation and changing double[] to Double[], but it did not help.
I'm not sure what's wrong here. It looks like the default Java serialization/deserialization is not working well in native mode, and I'm wondering how to solve this.
Thanks for any hints/pointers. I have the feeling it is a simple problem of a missing declaration or configuration parameter somewhere, at least I hope so :)
I'll double-check known issues with Java deserialization in native mode.

Solved!
After more investigation, it was simply a matter of registering the double[] class in the native-image builder's serialization configuration file.
I added this line to the Quarkus application.properties:
quarkus.native.additional-build-args = -H:SerializationConfigurationResources=serialization-config.json
with serialization-config.json being simply:
[
{"name":"double[]"}
]
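If more types turn out to need Java serialization later, they can be listed in the same file. The -H:SerializationConfigurationResources flag reads the file as a classpath resource, so placing it under src/main/resources should work. A hypothetical extended version (java.util.ArrayList is just an example entry):

[
  {"name":"double[]"},
  {"name":"java.util.ArrayList"}
]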

Related

Couchbase Java SDK N1QL UPDATE issue

I'm experimenting with Java and Couchbase 6.0 Community Edition using the Java 2.7 SDK.
I'm trying to execute a simple update query from my Java application via the Couchbase Java 2.7 SDK:
String query = "UPDATE admin SET FIELDNAME='TEST'";
N1qlParams params = N1qlParams.build().adhoc(false);
N1qlQuery nquery = N1qlQuery.simple(query, params);
N1qlQueryResult nqr= this.rbucket.query(nquery);
And I am getting the following exception (the most meaningful part):
com.couchbase.client.core.CouchbaseException: N1qlQuery Error - {"msg":"syntax error - at UPDATE","code":3000}
The actual exception starts like this:
Exception in thread "main" com.couchbase.client.core.CouchbaseException: Error while preparing plan
Of course, this query works fine through the Couchbase web UI and I can update without a problem.
Just for info: I tried escaping the single quotes, even tried setting the column to be equal to itself - same error.
Select queries are executed in a similar manner without any problem.
"admin' is not a good name for a bucket, as it is usually related to some reserved keywords, if you still want to use this name, you have to use backticks around it:
update `admin` set FIELDNAME = 'TEST'
It also might ask you to create a primary index if you don't have one yet.
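Putting that together with the SDK calls from the question, the working version would look roughly like this (a sketch reusing the rbucket reference from the question; the index statement is only needed once):

// Backticks keep the bucket name from being parsed as a keyword.
String query = "UPDATE `admin` SET FIELDNAME = 'TEST'";

N1qlParams params = N1qlParams.build().adhoc(false);
N1qlQuery nquery = N1qlQuery.simple(query, params);
N1qlQueryResult nqr = this.rbucket.query(nquery);

// If the query service reports a missing index, create a primary index first:
// this.rbucket.query(N1qlQuery.simple("CREATE PRIMARY INDEX ON `admin`"));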

Typecast error of Postgres in JBoss while calling finder method in EJB 2.1

First of all, things were working perfectly on my previous laptop; since I shifted my code to the new laptop I am getting this error. Basically, I am calling a finder method in my EJB, and its code was working fine. In the ejb-jar it looks like this:
<query>
    <query-method>
        <method-name>finduserrightsonform</method-name>
        <method-params>
            <method-param>java.lang.String</method-param>
            <method-param>java.lang.String</method-param>
        </method-params>
    </query-method>
    <ejb-ql>select object(o) from UserRightsOnForm o where o.ser_groups_users_id=?1 and o.ser_form_id=?2</ejb-ql>
</query>
Both columns in the DB are integers, and on my previous laptop Postgres auto-cast them, so what I am missing now is strange.
So far I have tried:
casting with cast(columnname as character varying), which also throws an error
changing the method params to Integer, which does not work either and in fact stopped the jar from deploying
downgrading Postgres to the previous version
casting with ::smallint
Caused by: org.postgresql.util.PSQLException: ERROR: operator does not exist: integer = character varying
ERROR [stderr] (EJB default - 3) Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
It looks like an environment or maybe a jar issue, but I can't catch what I am missing.
calling code:
sms.viewUserRightsOnFormImpl(userId, "validationForm", groupId);
definition:
Collection findUserRightOnForm(String user_id, String form_id) throws FinderException;
Using:
JDeveloper 11g
PostgreSQL 9.2
JBoss 6
EJB 2.1
Note: I have tried all the Google and Stack Overflow links I could find and have almost wasted two days on this.
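As a rough sketch of what the "add explicit type casts" hint means, here is the same mismatch at the plain JDBC level (hypothetical table and connection details; the EJB container generates the actual finder SQL):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class CastSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details, for illustration only.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");

        // Binding the parameter as a String makes PostgreSQL compare
        // integer = character varying, which has no operator since implicit
        // casts were removed in PostgreSQL 8.3:
        //   where ser_form_id = ?   with setString(...)  -> operator does not exist
        //
        // An explicit cast in the query (or binding an Integer) avoids it:
        PreparedStatement ps = conn.prepareStatement(
                "select * from userrightsonform where ser_form_id = cast(? as integer)");
        ps.setString(1, "42");
        try (ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.println(rs.getInt("ser_form_id"));
            }
        }
        conn.close();
    }
}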

Aggregation with MongoDB 3.6 and Morphia 1.3.2

I am trying to run an aggregation against MongoDB (3.6) using Morphia (1.3.2).
Currently it is a simple match and unwind to understand Morphia's API.
The problem that I am facing however is related to MongoDB 3.6:
Changed in version 3.6: MongoDB 3.6 removes the use of aggregate command without the cursor option unless the command includes the explain option. Unless you include the explain option, you must specify the cursor option.
This paragraph comes directly from the MongoDB documentation (MongoDB Aggregate).
This means that a cursor is mandatory for the aggregate to work. However, I can't find a way to do this using Morphia. Therefore my aggregate does not work.
AggregationPipeline data = aggregation.match(query).unwind("data");
Iterator<LoraHourData> out = data.aggregate(Data.class);
The error that is produced using the above code is as follows:
Command failed with error 9: The cursor option is required, except for aggregate with the explain argument on server localhost:27017. The full response is { ok : 0.0, errmsg: The cursor option is required, except for aggregate with the explain argument", code : 9, codeName : FailedToParse }
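One way to satisfy the cursor requirement, assuming Morphia 1.3.x exposes the aggregate(Class, AggregationOptions) overload and the driver's com.mongodb.AggregationOptions builder (a sketch using the question's classes, not a verified fix):

import com.mongodb.AggregationOptions;

// Options that force the aggregate command to return a cursor, as MongoDB 3.6 requires.
AggregationOptions options = AggregationOptions.builder()
        .outputMode(AggregationOptions.OutputMode.CURSOR) // deprecated in newer drivers, but available in 3.x
        .batchSize(100)
        .build();

AggregationPipeline pipeline = aggregation.match(query).unwind("data");
Iterator<LoraHourData> out = pipeline.aggregate(LoraHourData.class, options);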

AbstractMethodError thrown when creating a Clob object in Java oracle procedure

I'm developing a Java PL/SQL procedure that must instantiate a CLOB object. In order to instantiate that Clob object I must use a connection object. Based on the Oracle documentation, I can get the current connection using java.sql.DriverManager; however, when I execute the following code, an AbstractMethodError is thrown:
Connection conn = DriverManager.getConnection("jdbc:default:connection:");
Clob clob = conn.createClob();
I see a lot of posts talking about compatibility between the driver and the Java runtime running the code, but as I am working inside an Oracle DB, I suppose they should be compatible.
Oracle version: 11.2.0.4.0
My goal is to create a Clob inside my Java method and return it to my PL/SQL code. How can I instantiate a Clob inside a Java class stored inside an Oracle database?
Thanks for any help!
Creating the CLOB using conn.createClob() works fine for me in 12c. I suspect from your error message that on 11 you'll have to use the CLOB.createTemporary method to create the CLOB (which works fine in 12c as well, but is marked as deprecated).
Here is an example.
Java class
CREATE OR REPLACE AND RESOLVE JAVA SOURCE NAMED "CreateCLOB"
AS
import java.sql.*;
import oracle.sql.*;
import oracle.jdbc.*;
/****** START PASTE JAVA CLASS HERE *****/
public class CreateCLOB {
    public static void ClobProc(Clob[] cl) throws SQLException {
        Connection conn = DriverManager.getConnection("jdbc:default:connection:"); /* or use "jdbc:oracle:kprb:" */
        Clob clob = CLOB.createTemporary(conn, false, oracle.sql.CLOB.DURATION_SESSION); /* this is deprecated in 12c */
        // conn.createClob(); /* works fine in 12.1.0.2.0 */
        clob.setString(1, "Test Data");
        cl[0] = clob;
    }
}
/
Wrapper
create or replace procedure MyClob (cl OUT Clob)
as language java
name 'CreateCLOB.ClobProc(java.sql.Clob[])';
/
Test
declare
x Clob;
begin
MyClob(x);
dbms_output.put_line('created CLOB length = ' || dbms_lob.getlength(x));
end;
/
created CLOB length = 9
As mentioned I can't test it on version 11, but I suppose it will work.
select * from v$version;
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
You need to follow these steps:
1. On your computer you should have an Oracle JDBC driver (http://www.oracle.com/technetwork/database/features/jdbc/index-091264.html)
2. Load a driver
Class.forName("oracle.jdbc.driver.OracleDriver");
3. Get a connection
Connection connection =
    DriverManager.getConnection("jdbc:oracle:thin:@localhost:1521:SID", "username", "password");
UPDATE: If you work inside the Oracle DB, you can choose the type of driver (Server-Side Thin Driver or Server-Side Internal Driver) - https://docs.oracle.com/cd/E11882_01/appdev.112/e13995/oracle/jdbc/OracleDriver.html
Even when you are not using createClob(), you may get this error when starting the application if you are using an older version of the Hibernate core library.
The error is just thrown, but the application still starts. If you are not worried about it you may leave it, but if you don't want to see an error during application startup, then here is the fix.
It's a bug in the Spring Boot 2.x - Hibernate core 5.3.x combination. There are two solutions for the issue; I would suggest #1.
1. Upgrade all Hibernate libraries to 5.4.x (my way).
2. Provide spring.jpa.properties.hibernate.jdbc.lob.non_contextual_creation=true in your application properties/yml file (if you really do not want to upgrade your Hibernate core library); see the snippet below.
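For option 2, the same setting in YAML form would look roughly like this (standard Spring Boot property nesting):

spring:
  jpa:
    properties:
      hibernate:
        jdbc:
          lob:
            non_contextual_creation: true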
I got this when upgrading Spring Boot to 2.4 and using a Teradata DB, without any createClob().
If you still want to know how the error comes about, it's well explained in https://alexpask.com/spring-boot-createclob-error/

Strings > 4000 to CLOB conversion using hibernate

I have a Java class persisting a string (> 4k) into a CLOB field of a database table.
If the string is less than 4k then it works.
I have the field annotated with @Lob. Initially I was getting an exception because batching is not supported for streams, so I set the batch size to 0 in the Hibernate config, which now gives this exception:
Caused by: java.sql.SQLException: ORA-01460: unimplemented or unreasonable conversion requested
at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java:289)
at oracle.jdbc.ttc7.Oall7.receive(Oall7.java:582)
at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java:1986)
at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java:1144)
at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:2152)
at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:2035)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:2876)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:609)
at org.hibernate.jdbc.NonBatchingBatcher.addToBatch(NonBatchingBatcher.java:23)
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2062)
... 36 more
I only get this issue when using the code from Grails; when I use the same code from a pure Java application, I don't get it. Both applications have the same Hibernate config (except that I need to set the batch size to 0 in Grails). Could the issue be the difference in Hibernate versions, which is 3.2.6ga in Grails as far as I can see and 3.2.5ga for the Java application? The Oracle driver is the same in both cases.
Any answers welcome.
Try annotating the field with @Column(length = Integer.MAX_VALUE). This Hibernate bug report mentions it helped in Derby.
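For reference, a minimal sketch of that mapping (entity and field names are made up; only the annotations matter):

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Lob;

@Entity
public class Document {
    @Id
    private Long id;

    // Map the large string as a CLOB and give it an explicit length
    // so values larger than 4000 characters are accepted.
    @Lob
    @Column(length = Integer.MAX_VALUE)
    private String content;
}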
