Oracle schema validator - Java

I'm using an Oracle database in my application.
My application allows the user to create a schema, and for that reason
I want to do some validation before my application is set up.
For example, I want to make sure that the user didn't create a table with an overly long column name
(Oracle limits table and column names to a maximum of 30 bytes).
I hold a Dialect object in my validation function.
Is it possible, using the Dialect object, to find out that the user input (in my example, a column name)
is not valid, because the column name is more than 30 bytes long?
Please assist.
Thanks,
Jhon.

I found out how to do it.
I declared a new object of class java.sql.DatabaseMetaData.
That class has a getMaxColumnNameLength() method which returns the limit for each database
(for example, in Oracle that method returns 30),
and now I can do my validation!
Thanks anyway :)
John.
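For reference, a minimal sketch of that validation, assuming an open java.sql.Connection (the class name and the UTF-8 byte-length comparison are my own choices, since Oracle's limit is in bytes while getMaxColumnNameLength() is documented in characters):

import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.SQLException;

public class ColumnNameValidator {

    // Rejects a proposed column name that exceeds the limit reported by the driver.
    public static void validate(Connection conn, String columnName) throws SQLException {
        DatabaseMetaData meta = conn.getMetaData();
        int maxLength = meta.getMaxColumnNameLength(); // 30 on Oracle versions with the 30-byte limit; 0 means no known limit
        int byteLength = columnName.getBytes(StandardCharsets.UTF_8).length;
        if (maxLength > 0 && byteLength > maxLength) {
            throw new IllegalArgumentException("Column name '" + columnName + "' is "
                    + byteLength + " bytes; this database allows at most " + maxLength);
        }
    }
}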

Related

How to avoid writing the schema name in an Oracle SQL query?

I have the Workspace/Schema EDUCATION in Oracle XE.
In my Java code I want to execute queries like SELECT * FROM Table instead of SELECT * FROM EDUCATION.Table.
When I write the query without EDUCATION, I get the error: table or view does not exist.
I tried to set the default schema to % (screenshot), but it did not help.
How can I avoid writing the Workspace/Schema name?
If I understand correctly, you want to access tables in other schemas without using the schema name.
One simple way to do this uses synonyms. In the schema you are connected to:
create synonym table for education.table;
Then you can use table where you would otherwise use education.table.
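If you would rather not create a synonym per table, another option is to change the session's default schema with Oracle's ALTER SESSION SET CURRENT_SCHEMA. A sketch from the Java side, reusing the names from the question (the connection details are placeholders):

// Connection details are placeholders for your own environment.
String url = "jdbc:oracle:thin:@localhost:1521:XE";
try (Connection conn = DriverManager.getConnection(url, "myuser", "mypassword");
     Statement stmt = conn.createStatement()) {
    // From now on, unqualified table names in this session resolve against EDUCATION.
    stmt.execute("ALTER SESSION SET CURRENT_SCHEMA = EDUCATION");
    try (ResultSet rs = stmt.executeQuery("SELECT * FROM SomeTable")) {
        while (rs.next()) {
            // ... process the rows
        }
    }
}

Note that this only changes name resolution; you still need the appropriate grants on the EDUCATION tables.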

hbm2ddl.auto is not creating schema automatically when set to create [duplicate]

I am getting the exception below when trying to insert a batch of rows into an existing table:
ORA-00942: table or view does not exist
I can confirm that the table exists in the DB and that I can insert data into it using Oracle
SQL Developer. But when I try to insert rows using a PreparedStatement in Java, it throws the "table does not exist" error.
Please find the stack trace of the error below:
java.sql.SQLException: ORA-00942: table or view does not exist
at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java:289)
at oracle.jdbc.ttc7.Oall7.receive(Oall7.java:573)
at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java:1889)
at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java:1093)
at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:2047)
at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:1940)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:2709)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:589)
at quotecopy.DbConnection.insertIntoDestinationDb(DbConnection.java:591)
at quotecopy.QuoteCopier.main(QuoteCopier.java:72)
Can anyone suggest the reasons for this error?
Update: issue solved.
There was no problem with my database connection properties or with my table or view name. The solution to the problem was very strange. One of the columns that I was trying to insert was of CLOB type. As I had had a lot of trouble handling CLOB data in an Oracle DB before, I tried replacing the CLOB setter with a temporary String setter, and the same code executed without any problems: all the rows were correctly inserted!
i.e.
preparedStatement.setClob(columnIndex, clob)
was replaced with
preparedStatement.setString(columnIndex, "String")
Why was a "table or view does not exist" error thrown for a problem with inserting CLOB data? Could anyone explain?
Thanks a lot for your answers and comments.
Oracle will also report this error if the table exists but you don't have any privileges on it. So if you are sure that the table is there, check the grants.
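If you want to check the grants from Java rather than in SQL Developer, a sketch along these lines should work, given an open Connection conn (ALL_TAB_PRIVS is a standard Oracle data-dictionary view; the schema and table names are placeholders):

try (PreparedStatement ps = conn.prepareStatement(
        "SELECT privilege FROM all_tab_privs WHERE table_schema = ? AND table_name = ?")) {
    ps.setString(1, "MYSCHEMA");
    ps.setString(2, "MYTABLE");
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            System.out.println(rs.getString("privilege")); // e.g. SELECT, INSERT, UPDATE
        }
    }
}

If this returns no rows for the connected user, ORA-00942 is what you will see even though the table exists.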
There seems to be some issue with setCLOB() that causes an ORA-00942 under some circumstances when the target table does exist and is correctly privileged. I'm having this exact issue now; I can make the ORA-00942 go away by simply not binding the CLOB into the same table.
I've tried setClob() with a java.sql.Clob and setCLOB() with an oracle.jdbc.CLOB, but with the same result.
As you say, if you bind it as a String the problem goes away, but that then limits your data size to 4K.
From testing, it seems to be triggered when a transaction is open on the session prior to binding the CLOB. I'll feed back when I've solved this... checking Oracle support.
@unbeli is right. Not having the appropriate grants on a table will result in this error. For what it's worth, I recently experienced this. I was having the exact problem you described: I could execute insert statements through SQL Developer, but they would fail when run through Hibernate. I finally realized that my code was doing more than the obvious insert: it was also inserting into other tables that did not have the appropriate grants. Adjusting the grant privileges solved this for me.
Note: I don't have the reputation to comment, otherwise this would have been a comment.
We experienced this issue on a BLOB column. Just in case anyone else lands on this question when encountering this error, here is how we resolved the issue:
We started out with this:
preparedStatement.setBlob(parameterIndex, resultSet.getBlob(columnName)); break;
We resolved the issue by changing that line to this:
java.sql.Blob blob = resultSet.getBlob(columnName);
if (blob != null) {
    java.io.InputStream blobData = blob.getBinaryStream();
    preparedStatement.setBinaryStream(parameterIndex, blobData);
} else {
    preparedStatement.setBinaryStream(parameterIndex, null);
}
I found out how to solve this problem without using JDBC's setString() method, which limits the data to 4K.
What you need to do is use preparedStatement.setClob(int parameterIndex, Reader reader). At least this is what worked for me. I thought the Oracle driver converted the data to a character stream before inserting, but it seems it does not, or something specific was causing the error.
Using a character stream seems to work for me. I am reading tables from one DB and writing to another one using JDBC, and I was getting the "table not found" error just as mentioned above. This is how I solved the problem:
case Types.CLOB: // in a switch statement over the column types; this branch handles CLOB columns
    Clob clobData = resultSet.getClob(columnIndex); // from the source DB
    if (clobData != null) {
        preparedStatement.setClob(columnIndex, clobData.getCharacterStream());
    } else {
        preparedStatement.setClob(columnIndex, clobData);
    }
    clobData = null;
    return;
All good now.
Is your script providing the schema name, or do you rely on the user logged into the database to select the default schema?
It might be that you do not name the schema and that you perform your batch with a system user instead of the schema user, resulting in the wrong execution context for a script that would work fine if executed by the user that has the target schema set as its default schema. Your best option is to include the schema name in the insert statements:
INSERT INTO myschema.mytable (mycolumn) VALUES ('myvalue')
Update: are you trying to bind the table name as a bound value in your prepared statement? That won't work.
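To illustrate that last point, a quick sketch (the names are made up): a ? placeholder can only stand in for a value, never for a table or column identifier, so the identifier has to go into the SQL text itself.

// This will NOT work: the driver binds values only, not identifiers.
PreparedStatement bad = conn.prepareStatement("INSERT INTO ? (mycolumn) VALUES (?)");

// The table name has to be part of the SQL text instead. If it comes from
// user input, validate it first to avoid SQL injection.
String table = "MYSCHEMA.MYTABLE";
PreparedStatement good = conn.prepareStatement(
        "INSERT INTO " + table + " (mycolumn) VALUES (?)");
good.setString(1, "myvalue");
good.executeUpdate();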
It works for me:
Clob clob1;
while (rs.next()) {
    sta.setString(1, rs.getString("FIELD_1")); // set on the PreparedStatement, not the ResultSet
    clob1 = rs.getClob("CLOB1");
    if (clob1 != null) {
        sta.setClob(2, clob1.getCharacterStream());
    } else {
        sta.setClob(2, clob1);
    }
    clob1 = null;
    sta.setString(3, rs.getString("FIELD_3"));
    sta.executeUpdate(); // execute the statement for this row
}
Is it possible that you are doing an INSERT for the VARCHAR columns but an INSERT followed by an UPDATE for the CLOB?
If so, you'll need to grant UPDATE permission on the table in addition to INSERT.
See https://stackoverflow.com/a/64352414/1089967
Here I got the solution for the question. The problem can be in GlassFish, if you are using it: when you create the JNDI name, make sure the pool name is correct, i.e. that it is the name of the connection pool you actually created.

ORA-24816: Expanded non LONG bind data supplied after actual LONG or LOB column

I'm getting the following exception while updating a table via Hibernate:
ORA-24816: Expanded non LONG bind data supplied after actual LONG or LOB column
I have extracted the SQL query as well; it looks like:
UPDATE table_name SET columnName (LOB) = value, column2 (String with 4000) = value WHERE id = ?;
Entity class:
class Test {
    @Lob
    private String errorText;

    @Column(length = 4000)
    private String text;
}
Please help me understand what is wrong here.
Thanks,
Ravi Kumar
Running oerr ora 24816 to get the details on the error yields:
$ oerr ora 24816
24816, ... "Expanded non LONG bind data supplied after actual LONG or LOB column"
// *Cause: A Bind value of length potentially > 4000 bytes follows binding for
// LOB or LONG.
// *Action: Re-order the binds so that the LONG bind or LOB binds are all
// at the end of the bind list.
So another solution that uses only 1 query would be to move your LOB/LONG binds after all your non-LOB/LONG binds. This may or may not be possible with Hibernate. Perhaps something more like:
update T set column2 (String with 4000)=:1, columnName (LOB)=:3 where id=:2;
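In plain JDBC, that reordering might look like the sketch below (the table and column names are guesses based on the entity in the question). The bind that can exceed 4000 bytes comes before the CLOB; the small id bind after the CLOB is harmless, since the error is only about large binds following a LOB.

PreparedStatement ps = conn.prepareStatement(
        "UPDATE test SET text = ?, error_text = ? WHERE id = ?");
ps.setString(1, text);                              // the VARCHAR2(4000) bind, before the LOB
ps.setClob(2, new java.io.StringReader(errorText)); // the LOB bind, after all large non-LOB binds
ps.setLong(3, id);                                  // small bind; safe even though it follows the LOB
ps.executeUpdate();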
This DML limitation appears to have been around since at least Oracle 8i.
References:
http://openacs.org/forums/message-view?message_id=595742
https://community.oracle.com/thread/417560
I do realise that this thread is quite old, but I thought I'd share my own experience with the same error message here for future reference.
I had the exact same symptoms (i.e. ORA-24816) for a couple of days. I was a bit side-tracked by various threads I came across suggesting that this was related to the order of parameter binding. In my case this was not the problem. Also, I struggled to reproduce this error; it only occurred after deploying to an application server, and I could not reproduce it through integration tests.
However, I took a look at the code where I was binding the parameter and found:
preparedStatement.setString(index, someStringValue);
I replaced this with:
preparedStatement.setClob(index, new StringReader(someStringValue));
This did the trick for me.
This thread from back in 2009 was quite useful.
I found the issue.
When Hibernate updates data in the DB and the entity has both a 4000-character column and a LOB-type column, Hibernate throws this exception.
I solved the issue by writing two update queries:
1. First I saved the entity using update().
2. Then I ran another update query for the LOB column.
Thanks,
ravi
I also encountered the same error with an Oracle DB and found that the Hibernate team fixed it here.
In my case we were already using Hibernate 4.3.7 but hadn't marked the field as a Lob in the entity.
Reproducing steps:
Have fields with VARCHAR2 and CLOB data types, and make sure your column names are in this alphabetic order: clob_field, varchar_two_field1, varchar_two_field2.
Now update clob_field with less than 2000 bytes and varchar_two_field1 with 4000 bytes of data.
This should end up with the error ORA-24816: Expanded non LONG bind data supplied after actual LONG or LOB column.
Solution:
Make sure you are on Hibernate 4.1.8 / 4.3.0.Beta1 or later.
Annotate your CLOB/BLOB field in the respective entity:
import javax.persistence.Lob;
...
@Lob
@Column(name = "description")
private String description;
...
If you want to see the difference after making the above changes, enable debug output for the SQL statements by setting "hibernate.show_sql" to "true" in persistence.xml.
I came across this issue today while trying to insert data into a table. To avoid this error, just keep all the fields with a LOB data type at the end of the insert statement.
For example, Table1 has 8 fields (Field1, Field2, ... Field8), of which Field1 and Field2 are of the CLOB data type and the rest are VARCHAR2. Then, while inserting the data, make sure you keep the Field1 and Field2 values at the end, like below:
INSERT INTO TABLE1 (Field3,Field4,Field5,Field6,Field7,Field8,Field1,Field2)
VALUES ('a','b','c','d','e','f','htgybyvvbshbhabjh','cbsdbvsb')
Place your LOB bindings last. See if that solves the issue.

Change Table names in derby database using entitymanager

I am using an Apache Derby database and basing my database interactions on EntityManager, and I don't want to use the JDBC classes to build a query that changes my tables' names (I just need to add a prefix for each new user of the application, while keeping the same table structure). For example:
//em stands for the EntityManager object
Query tableNamesQuery = em.createNamedQuery("RENAME TABLE SCHEMA.EMP_ACT TO EMPLOYEE_ACT");
em.executeUpdate();
// ... rest of the function's work
// The command works from the database command prompt, but I don't know how to use it in a program
// Or, as far as I know you can't change system table data, but here's the code:
Query tableNamesQuery = em.createNamedQuery("UPDATE SYS.SYSTABLES SET TABLENAME='NEW_TABLE_NAME' WHERE TABLETYPE='T'");
em.executeUpdate();
// ... rest of the function's work
My questions are:
Is this syntax correct?
Will it work?
Is there any other alternative?
Should I just use SYS.SYSTABLES, find all the tables that have 'T' as their table type, and alter their names there? Will that change the access name?
I think you're looking for the RENAME TABLE statement: http://db.apache.org/derby/docs/10.10/ref/rrefsqljrenametablestatement.html
Don't just issue update statements against the system catalogs; you will corrupt your database.
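If you want to stay with the EntityManager, the RENAME TABLE statement can be issued as a native query, since it is DDL rather than JPQL. A sketch, reusing the names from the question (transaction handling depends on your environment):

em.getTransaction().begin();
em.createNativeQuery("RENAME TABLE SCHEMA.EMP_ACT TO EMPLOYEE_ACT").executeUpdate();
em.getTransaction().commit();

Note that createNamedQuery, as used in the snippets above, expects the name of a predefined query, which is why it would not work there; createNativeQuery takes the SQL string directly.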

Querying the appropriate database schema

This is a follow-on question to my earlier question about specifying multiple schemata in Java when using jOOQ to interact with H2.
My test H2 DB currently has two schemata, PUBLIC and INFORMATION_SCHEMA. PUBLIC is specified as the default schema by H2. When running a query that should extract information from e.g. INFORMATION_SCHEMA.TABLES, the query fails with a "table unknown" SQL error. I am only able to execute such queries by calling factory.use(INFORMATION_SCHEMA). There are no build errors etc., and Eclipse properly autocompletes e.g. TABLES.TABLE_NAME.
If I don't do this, jOOQ doesn't seem to prepend the appropriate schema, even though I create the correct Factory object for the schema, e.g.:
InformationSchemaFactory info = new InformationSchemaFactory(conn);
I read about mapping but am a bit confused as to which schema I would use as the input/output.
By default, the InformationSchemaFactory assumes that the supplied connection is actually connected to the INFORMATION_SCHEMA. That's why schema names are not rendered in SQL. Example:
// This query...
new InformationSchemaFactory(conn).selectFrom(INFORMATION_SCHEMA.TABLES).fetch();
// ... renders this SQL (with the asterisk expanded):
SELECT * FROM "TABLES";
The above behaviour should be documented in your generated InformationSchemaFactory Javadoc. In order to qualify "TABLES" with "INFORMATION_SCHEMA", you have several options.
Use a regular factory instead, which is not tied to any schema:
// This query...
new Factory(H2, conn).selectFrom(INFORMATION_SCHEMA.TABLES).fetch();
// ... renders this SQL:
SELECT * FROM "INFORMATION_SCHEMA"."TABLES";
Use another schema's factory, such as the generated PublicFactory:
// This query...
new PublicFactory(conn).selectFrom(INFORMATION_SCHEMA.TABLES).fetch();
// ... renders this SQL:
SELECT * FROM "INFORMATION_SCHEMA"."TABLES";
Use Settings and an appropriate schema mapping to force the schema name to be rendered.
The first option is probably the easiest one.
This blog post will give you some insight into how to log executed queries to your preferred logger output: http://blog.jooq.org/2011/10/20/debug-logging-sql-with-jooq/
