I have an issue related to SQLInput not reading data with readString().
The same code works on two different Oracle databases, but on this one I have the following issue.
This code:
@Override
public void readSQL(SQLInput stream, String typeName) throws SQLException {
    userId = stream.readBigDecimal().longValue();
    name = stream.readString();
    modified = stream.readTimestamp();
}
returns userId and modified correctly, but for name it returns "???" even though the database has data in that column.
I have no idea what the issue is.
The query returns its data into an Oracle type similar to:
create or replace TYPE MY_ROW AS OBJECT (
    USER_ID  NUMBER,
    NAME     VARCHAR2(50),
    MODIFIED TIMESTAMP
)
which is used in a table type of that row type:
create or replace TYPE MY_TABLE as TABLE of MY_ROW
So again, I have no idea why this doesn't work for this specific Oracle database when it works with two others.
It's as if VARCHAR2 fields read with readString() are not being returned and I'm just getting ???.
Does anyone have any idea what to do?
EDIT:
Even after including orai18n.jar in the classpath, the issue is still there.
I finally managed to resolve it by combining orai18n.jar and ojdbc7.jar into the same jar file.
Not sure why adding orai18n.jar separately to the classpath didn't do the trick.
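For reference, here is a minimal sketch of the kind of SQLData mapping this question relies on, together with the type-map registration; the class name MyRow and the registration snippet are illustrative assumptions, not the asker's actual code:

import java.math.BigDecimal;
import java.sql.SQLData;
import java.sql.SQLException;
import java.sql.SQLInput;
import java.sql.SQLOutput;
import java.sql.Timestamp;

// Hypothetical Java mapping for the MY_ROW object type shown above.
public class MyRow implements SQLData {
    private long userId;
    private String name;
    private Timestamp modified;

    @Override
    public String getSQLTypeName() {
        return "MY_ROW";
    }

    @Override
    public void readSQL(SQLInput stream, String typeName) throws SQLException {
        // Attributes must be read in the order they are declared in the object type.
        userId = stream.readBigDecimal().longValue();
        name = stream.readString(); // comes back as "???" when the driver lacks the needed character-set data
        modified = stream.readTimestamp();
    }

    @Override
    public void writeSQL(SQLOutput stream) throws SQLException {
        stream.writeBigDecimal(BigDecimal.valueOf(userId));
        stream.writeString(name);
        stream.writeTimestamp(modified);
    }
}

The mapping is registered on the connection's type map so the driver knows to instantiate MyRow for MY_ROW values:

// conn is an open java.sql.Connection
java.util.Map<String, Class<?>> typeMap = conn.getTypeMap();
typeMap.put("MY_ROW", MyRow.class);
conn.setTypeMap(typeMap);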
Related
I have a Java application through which I do different operations on a MySQL DB. The problem is that when inserting a UTF-8 string, it is not inserted correctly. The charset of the DB is utf8 and I have set the collation to utf8_unicode_ci. The server connection collation is also utf8_unicode_ci. Furthermore, when I insert data from phpMyAdmin it is inserted correctly, but when I do it from the Java application using jOOQ, it is not. Example:
Result<ExecutorsRecord> executorsRecord =
    context.insertInto(EXECUTORS, EXECUTORS.ID, EXECUTORS.NAME, EXECUTORS.SURNAME,
                       EXECUTORS.REGION, EXECUTORS.PHONE, EXECUTORS.POINTS, EXECUTORS.E_TYPE)
           .values(id, name, surname, region, phone, 0, type)
           .returning(EXECUTORS.ID)
           .fetch();
where name = "Бобр" and surname = "Добр" produces a tuple with ???? as the name and ???? as the surname. I have checked both strings; they are passed to the method correctly.
As @spencer7593 suggested, the problem could be in the JDBC connector. So I added the following to the connection URL: ?characterEncoding=utf8, so that the final URL was "jdbc:mysql://localhost:3306/mydb?characterEncoding=utf8", where mydb is the name of the database. This sorted out my problem. I would also like to add the following statement (again by @spencer7593):
When we've got things configured correctly and things aren't working, our go-to suspect is the JDBC driver. To get timezone differences between the JVM and the MySQL database sorted out, and to prevent the JDBC driver from "helping" by doing an illogical combination of various operations, we had to add two extra obscurely documented settings to the connection string.
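For illustration, opening the connection with that parameter looks roughly like this; the database name mydb is from the question, while the credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;

public class Utf8ConnectionExample {
    public static void main(String[] args) throws Exception {
        // characterEncoding=utf8 tells Connector/J to send Java strings as UTF-8
        // instead of the platform default encoding.
        String url = "jdbc:mysql://localhost:3306/mydb?characterEncoding=utf8";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected; string parameters will be transcoded to UTF-8");
        }
    }
}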
Further reading
We have an Oracle database with the following charset settings
SELECT parameter, value FROM nls_database_parameters WHERE parameter like 'NLS%CHARACTERSET'
NLS_NCHAR_CHARACTERSET: AL16UTF16
NLS_CHARACTERSET: WE8ISO8859P15
In this database we have a table with a CLOB field, which has a record that starts with the following string, stored obviously in ISO-8859-15: X²ARB (here correctly converted to Unicode; in particular, that superscript two is important and correct).
Then we have the following trivial piece of code to get the value out, which is supposed to automatically convert the charset to unicode via globalization support in Oracle:
private static final String STATEMENT = "SELECT data FROM datatable d WHERE d.id=2562456";

public static void main(String[] args) throws Exception {
    Class.forName("oracle.jdbc.driver.OracleDriver");
    try (Connection conn = DriverManager.getConnection(DB_URL);
         ResultSet rs = conn.createStatement().executeQuery(STATEMENT)) {
        if (rs.next()) {
            System.out.println(rs.getString(1).substring(0, 5));
        }
    }
}
Running the code prints:
with ojdbc8.jar and orai18n.jar: X�ARB -- incorrect
with ojdbc7.jar and orai18n.jar: X�ARB -- incorrect
with ojdbc-6.jar: X²ARB -- correct
By using UNISTR and changing the statement to SELECT UNISTR(data) FROM datatable d WHERE d.id=2562456, I can get ojdbc7.jar and ojdbc8.jar to return the correct value, but this would require an unknown number of changes to the code, as this is probably not the only place where the problem occurs.
Is there anything I can do to the client or server configurations to make all queries return correctly encoded values without statement modifications?
It definitely looks like a bug in the JDBC thin driver (I assume you're using thin). It could be related to LOB prefetch, where the CLOB's length, character-set ID, and the first part of the LOB data are sent in-band. This feature was introduced in 11.2. As a workaround, you can disable LOB prefetch by setting the connection property
oracle.jdbc.defaultLobPrefetchSize
to "-1". Meanwhile, I'll follow up on this bug to make sure that it gets fixed.
Please have a look at Database JDBC Developer's Guide - Globalization Support
The basic Java Archive (JAR) file, ojdbc7.jar, contains all the necessary classes to provide complete globalization support for:
CHAR or VARCHAR data members of objects and collections for the character sets US7ASCII, WE8DEC, WE8ISO8859P1, WE8MSWIN1252, and UTF8.
To use any other character sets in CHAR or VARCHAR data members of
objects or collections, you must include orai18n.jar in the CLASSPATH
environment variable:
ORACLE_HOME/jlib/orai18n.jar
I am trying to insert a new record into a simple database table with MyBatis, but I get a strange exception. Maybe it is related to the fact that I am not using a POJO.
MyBatis version: 3.4.5
My table:
CREATE TABLE IF NOT EXISTS image
(
    id      BIGINT PRIMARY KEY,
    content BYTEA
) WITHOUT OIDS;
MyBatis mapper:
@Insert("INSERT INTO image (id, content) VALUES (#{id}, #{content})")
@SelectKey(statement = "SELECT NEXTVAL('image_seq')", keyProperty = "id", before = true, resultType = long.class)
long insertImage(byte[] content);
The way I am trying to use it:
byte[] fileContent = IOUtils.toByteArray(inputStream);
long id = imageDao.insertImage(fileContent);
The exception that I get:
java.lang.ClassCastException: java.lang.Long cannot be cast to [B
at org.apache.ibatis.type.ByteArrayTypeHandler.setNonNullParameter(ByteArrayTypeHandler.java:26)
at org.apache.ibatis.type.BaseTypeHandler.setParameter(BaseTypeHandler.java:53)
at org.apache.ibatis.scripting.defaults.DefaultParameterHandler.setParameters(DefaultParameterHandler.java:87)
at org.apache.ibatis.executor.statement.PreparedStatementHandler.parameterize(PreparedStatementHandler.java:93)
at org.apache.ibatis.executor.statement.RoutingStatementHandler.parameterize(RoutingStatementHandler.java:64)
at org.apache.ibatis.executor.SimpleExecutor.prepareStatement(SimpleExecutor.java:86)
at org.apache.ibatis.executor.SimpleExecutor.doUpdate(SimpleExecutor.java:49)
at org.apache.ibatis.executor.BaseExecutor.update(BaseExecutor.java:117)
at org.apache.ibatis.executor.CachingExecutor.update(CachingExecutor.java:76)
at org.apache.ibatis.session.defaults.DefaultSqlSession.update(DefaultSqlSession.java:198)
at org.apache.ibatis.session.defaults.DefaultSqlSession.insert(DefaultSqlSession.java:185)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
I do not want to create a POJO class with getter/setter methods for this single "content" parameter, but I think this issue is related to the missing POJO.
What is the solution?
EDIT
I am trying to debug the MyBatis code and I have found "[B" (the JVM's internal name for byte[]) in the parameterTypes.
@SelectKey is useful when you want to reuse the generated value further in the code, but it seems you will not.
Then why not keep everything in the SQL:
INSERT INTO image (id, content) VALUES ((SELECT NEXTVAL('image_seq')), #{content})
Regarding the exception about parameters: parameters must be named with the @Param annotation:
int insertImage(@Param("content") byte[] content);
or
int insertImage(@Param("id") Long id, @Param("content") byte[] content);
Note that INSERT as well as UPDATE and DELETE statements return type int, being the number of inserted/updated/deleted rows, [...]
EDIT: unless you consider that, under the hood, the Java 8 PreparedStatement.executeLargeUpdate, which returns long, is executed.
[...] and not the generated key as is suggested. If it turns out you eventually want the key value, that means back to square one with @SelectKey and the need for a POJO and a target property for the generated value. It even works with bulk inserts with generated keys.
I have discovered lately that actual parameter names can be used (then your code will work as is) if you follow the instructions in the settings section of the documentation:
useActualParamName: Allow referencing statement parameters by their actual names declared in the method signature. To use this feature, your project must be compiled in Java 8 with the -parameters option. (Since: 3.4.1) Valid values: true | false. Default: true.
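Putting this together, a minimal sketch of the mapper without @SelectKey or a POJO might look like the following; the interface and sequence names follow the question, but treat it as an illustration rather than the asker's final code:

import org.apache.ibatis.annotations.Insert;
import org.apache.ibatis.annotations.Param;

public interface ImageDao {
    // The sequence is consumed inside the statement itself, so no @SelectKey and
    // no POJO are needed; the returned int is the number of inserted rows.
    @Insert("INSERT INTO image (id, content) VALUES ((SELECT NEXTVAL('image_seq')), #{content})")
    int insertImage(@Param("content") byte[] content);
}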
java.lang.Long cannot be cast to [B
This says that a Long is being cast to byte[] ([B is the JVM's internal name for byte[]).
Looking at the source of org.apache.ibatis.type.ByteArrayTypeHandler:
public void setNonNullParameter(PreparedStatement ps, int i, byte[] parameter, JdbcType jdbcType) throws SQLException {
    ps.setBytes(i, parameter);
}
I think you need to remove #{id} from the insert annotation (as this value is autogenerated):
@Insert("INSERT INTO image (content) VALUES (#{content})")
Otherwise the parameters are shifted by one.
I am getting the exception below when trying to insert a batch of rows into an existing table:
ORA-00942: table or view does not exist
I can confirm that the table exists in the DB and I can insert data into that table using Oracle SQL Developer. But when I try to insert rows using a PreparedStatement in Java, it throws a "table does not exist" error.
Please find the stack trace of the error below:
java.sql.SQLException: ORA-00942: table or view does not exist
at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java:289)
at oracle.jdbc.ttc7.Oall7.receive(Oall7.java:573)
at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java:1889)
at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java:1093)
at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:2047)
at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:1940)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:2709)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:589)
at quotecopy.DbConnection.insertIntoDestinationDb(DbConnection.java:591)
at quotecopy.QuoteCopier.main(QuoteCopier.java:72)
Can anyone suggest the reasons for this error?
Update: Issue solved
There was no problem with my database connection properties or with my table or view name. The solution to the problem was very strange. One of the columns that I was trying to insert was of CLOB type. As I had had a lot of trouble handling CLOB data in Oracle before, I gave it a try by replacing the CLOB setter with a temporary string setter, and the same code executed without any problems; all the rows were correctly inserted.
i.e. preparedStatement.setClob(columnIndex, clob)
was replaced with
preparedStatement.setString(columnIndex, "String")
Why was a "table or view does not exist" error thrown for a problem inserting CLOB data? Could anyone please explain?
Thanks a lot for your answers and comments.
Oracle will also report this error if the table exists, but you don't have any privileges on it. So if you are sure that the table is there, check the grants.
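If you want to check this from the application side, you can ask the data dictionary which privileges the connected user actually holds; a minimal sketch, where the table name passed in is a placeholder:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class GrantCheck {
    // USER_TAB_PRIVS is a standard Oracle data-dictionary view listing grants
    // where the current user is the owner, grantor, or grantee.
    static void listGrants(Connection conn, String tableName) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT grantee, privilege FROM user_tab_privs WHERE table_name = ?")) {
            ps.setString(1, tableName); // e.g. "MYTABLE"; Oracle stores unquoted names in upper case
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + ": " + rs.getString(2));
                }
            }
        }
    }
}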
There seems to be some issue with setCLOB() that causes an ORA-00942 under some circumstances when the target table does exist and is correctly privileged. I'm having this exact issue now; I can make the ORA-00942 go away by simply not binding the CLOB into the same table.
I've tried setClob() with a java.sql.Clob and setCLOB() with an oracle.jdbc.CLOB, but with the same result.
As you say, if you bind it as a string the problem goes away, but this then limits your data size to 4K.
From testing, it seems to be triggered when a transaction is open on the session prior to binding the CLOB. I'll report back when I've solved this; checking Oracle support.
@unbeli is right. Not having appropriate grants on a table will result in this error. For what it's worth, I recently experienced this: I was having the exact problem you described, where I could execute insert statements through SQL Developer but they would fail when using Hibernate. I finally realized that my code was doing more than the obvious insert: it was also inserting into other tables that did not have appropriate grants. Adjusting the grant privileges solved this for me.
Note: I don't have the reputation to comment, otherwise this would have been a comment.
We experienced this issue on a BLOB column. Just in case anyone else lands on this question when encountering this error, here is how we resolved the issue:
We started out with this:
preparedStatement.setBlob(parameterIndex, resultSet.getBlob(columnName)); break;
We resolved the issue by changing that line to this:
java.sql.Blob blob = resultSet.getBlob(columnName);
if (blob != null) {
    java.io.InputStream blobData = blob.getBinaryStream();
    preparedStatement.setBinaryStream(parameterIndex, blobData);
} else {
    preparedStatement.setBinaryStream(parameterIndex, null);
}
I found out how to solve this problem without using JDBC's setString() method, which limits the data to 4K.
What you need to do is use preparedStatement.setClob(int parameterIndex, Reader reader). At least this is what worked for me. I thought the Oracle driver converted the data to a character stream on insert anyway, but it seems it does not, or something specific was causing the error.
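A minimal sketch of that approach; the table and column names are made up for illustration:

import java.io.StringReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class ClobInsert {
    // Binding a Reader avoids the ~4K limit that setString() imposes on CLOB columns.
    static void insertClob(Connection conn, long id, String largeText) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO mytable (id, body) VALUES (?, ?)")) {
            ps.setLong(1, id);
            ps.setClob(2, new StringReader(largeText)); // JDBC 4.0 setClob(int, Reader) overload
            ps.executeUpdate();
        }
    }
}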
Using a character stream seems to work for me. I am reading tables from one DB and writing to another one using JDBC, and I was getting the "table not found" error just as mentioned above. This is how I solved the problem:
case Types.CLOB: // using a switch statement over all column types; this branch handles CLOB columns
    Clob clobData = resultSet.getClob(columnIndex); // from the source db
    if (clobData != null) {
        preparedStatement.setClob(columnIndex, clobData.getCharacterStream());
    } else {
        preparedStatement.setClob(columnIndex, clobData);
    }
    clobData = null;
    return;
All good now.
Is your script providing the schema name, or do you rely on the user logged into the database to select the default schema?
It might be that you do not name the schema and that you perform your batch with a system user instead of the schema user, resulting in the wrong execution context for a script that would work fine if executed by the user that has the target schema set as the default schema. Your best action would be to include the schema name in the insert statements:
INSERT INTO myschema.mytable (mycolumn) VALUES ('myvalue')
Update: are you trying to bind the table name as a bound value in your prepared statement? That won't work.
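A minimal sketch combining both points; the schema, table, and column names are the placeholders from the statement above. Identifiers cannot be bound with '?', only column values can:

import java.sql.Connection;
import java.sql.PreparedStatement;

public class SchemaQualifiedInsert {
    static void insert(Connection conn, String myValue) throws Exception {
        // The schema-qualified table name is part of the SQL text; only the
        // column value is bound as a parameter.
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO myschema.mytable (mycolumn) VALUES (?)")) {
            ps.setString(1, myValue);
            ps.executeUpdate();
        }
    }
}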
It works for me:
Clob clob1;
while (rs.next()) {
    sta.setString(1, rs.getString("FIELD_1")); // set on the target PreparedStatement, not on the ResultSet
    clob1 = rs.getClob("CLOB1");
    if (clob1 != null) {
        sta.setClob(2, clob1.getCharacterStream());
    } else {
        sta.setClob(2, clob1);
    }
    clob1 = null;
    sta.setString(3, rs.getString("FIELD_3"));
}
Is it possible that you are doing an INSERT for the VARCHAR but doing an INSERT followed by an UPDATE for the CLOB?
If so, you'll need to grant UPDATE permission on the table in addition to INSERT.
See https://stackoverflow.com/a/64352414/1089967
Here I got the solution for the question. The problem is on GlassFish, if you are using it. When you create the JNDI name, make sure the pool name is correct: it must be the name of the connection pool that you created.
I have a Play 2.1.3 Java app using Ebean. I am getting the OptimisticLockException below.
[OptimisticLockException: Data has changed. updated [0] rows sql[update person
set name=? where id=? and email=? and name=? and password is null and created=?
and deleted is null] bind[null]]
I understand that it is trying to tell me the record has changed between when I read it and when I tried to write it. But the only change is happening in this method.
public void updateFromForm(Map<String, String[]> form) throws Exception {
    this.name = form.get("name")[0];
    String password = form.get("password")[0];
    if (password != null && password.length() != 0) {
        String hash = Password.getSaltedHash(password);
        this.password = hash;
    }
    this.update();
}
Am I doing this wrong? I saw similar logic in zentasks. Also, should I be able to see the values for the bind variables?
UPDATE: I am calling updateFromForm() from inside a controller:
@RequiresAuthentication(clientName = "FormClient")
public static Result updateProfile() throws Exception {
    final CommonProfile profile = getUserProfile();
    String email = getEmail(profile);
    Person p = Person.find.where().eq("email", email).findList().get(0);
    Map<String, String[]> form = request().body().asFormUrlEncoded();
    if (p == null) {
        Person.createFromForm(form);
    } else {
        p.updateFromForm(form);
    }
    return ok("HI");
}
I have an alternative approach to this, where I add the annotation
@EntityConcurrencyMode(ConcurrencyMode.NONE)
to the entity class.
This disables the optimistic-locking concurrent-modification check, meaning the SQL becomes
update person set name=? where id=?
This is even more optimistic since it simply overwrites any intermediate changes.
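A minimal sketch of an entity using it; this assumes the com.avaje.ebean.annotation package from the Ebean version bundled with Play 2.1 (the annotation was removed in later Ebean releases):

import javax.persistence.Entity;
import javax.persistence.Id;
import com.avaje.ebean.annotation.ConcurrencyMode;
import com.avaje.ebean.annotation.EntityConcurrencyMode;

@Entity
@EntityConcurrencyMode(ConcurrencyMode.NONE)
public class Person extends play.db.ebean.Model {
    @Id
    public Long id;
    public String name;
    // With ConcurrencyMode.NONE the generated UPDATE's WHERE clause checks only
    // the id, so concurrent modifications are silently overwritten.
}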
A little bit late, but for your case the @Version annotation should be the solution. We're using it mostly with java.util.Date, so it can also be used for determining the date of the last record update; in a Play model that's just:
@Version
public java.util.Date version;
In such a case the update statement will be done with the id and version fields only, which is especially useful with large models:
update person set name='Bob'
where id=1 and version='2014-03-03 22:07:35';
Note: you don't need to (and shouldn't) update this field manually at each save; Ebean does it itself. The version value changes ONLY when data was actually updated (so calling obj.update() when nothing has changed doesn't update the version field).
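A minimal sketch of the entity; it assumes the javax.persistence.Version annotation, which Ebean honors, and a matching version column in the person table:

import java.util.Date;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Version;

@Entity
public class Person extends play.db.ebean.Model {
    @Id
    public Long id;
    public String name;

    // Maintained by Ebean on every data-changing update; doubles as a
    // last-modified timestamp for the row.
    @Version
    public Date version;
}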
Mystery solved.
First, a public service announcement: "OptimisticLockException" is a big bucket. If you are trying to track one of these down, be open to the idea that it could really be anything.
I figured out my problem by dumping SQL to the log and finding this:
update person set name='Bob'
where id=1 and email='jj@test.com'
and name='Robert' and password is null
and created=2013-12-01 and deleted is null
So I guess what happens when you do an update is that it builds a WHERE clause with all the known fields and their values as they were originally read.
That means that if any other part of your code or another process changes something behind your back, this query will fail. I wrongly assumed that the problem was that somehow .setName('Bob') had changed the name in the DB or some object cache.
Really, what was happening is that the WHERE clause included a date while my database holds an entire timestamp with date, time, and timezone.
For now, I fixed it by just commenting out the timestamp in the model until I can figure out if/how Ebean can handle this data type.
I had the same problem. After hours of searching, I found the reason: it was an inconsistency between the parameter's type in the database (in my case, a string) and the object I created and tried to save, a java.util.Date. After changing the database to hold a datetime object, the problem was solved.