I have an application which processes a very large file and sends data to an oracle database (using Java 6, oracle 9).
In a loop, I use a PreparedStatement ps and add all generated SQL statements to it with ps.addBatch().
I have a situation where a BatchUpdateException bue is thrown somewhere during ps.executeBatch(). At that point, execution of the batch stops.
I'd like the batch execution to continue, so that I can then check on failed updates in a method processUpdateCounts(bue.getUpdateCounts()).
The javadoc about class BatchUpdateException says:
After a command in a batch update fails to execute properly and a BatchUpdateException is thrown, the driver may or may not continue to process the remaining commands in the batch.
Is there a way to enforce continuation or do I need to alter my program so that it will execute the statement individually?
Just found this link:
JDBC Batch Update Problem
Apparently, it says there is NO WAY with Oracle batch JDBC to proceed after the first failure, so I am resorting to sending the inserts one by one.
Thank you
(sorry for not finding the link above before posting).
There is a workaround that would allow you to keep using the batch feature. Instead of executing a plain INSERT statement, you can execute a PL/SQL block that deals with the error appropriately:
BEGIN
INSERT INTO your_table VALUES (?,?,...?);
EXCEPTION
WHEN OTHERS THEN
/* deal with the error. For example, log the error id and error msg
so that you can list them after the batch */
INSERT INTO error_table VALUES (?, sqlerrm);
END;
The performance should be on par with the batch insert (should be faster than individual execution of the statements). You could also call a stored procedure instead of a PL/SQL block.
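For illustration, here is a minimal sketch of how such a block could be wired into the existing batch loop. The column lists, the error_table layout, and the Record type holding the parsed file rows are all assumptions, not part of the original question:

// Sketch only: your_table(id, name) and error_table(failed_id, error_msg) are assumed layouts.
String plsqlBlock =
    "BEGIN\n" +
    "  INSERT INTO your_table (id, name) VALUES (?, ?);\n" +
    "EXCEPTION\n" +
    "  WHEN OTHERS THEN\n" +
    "    INSERT INTO error_table (failed_id, error_msg) VALUES (?, SQLERRM);\n" +
    "END;";

PreparedStatement ps = connection.prepareStatement(plsqlBlock);
for (Record r : records) {            // Record/records: hypothetical holder for the parsed file rows
    ps.setLong(1, r.getId());
    ps.setString(2, r.getName());
    ps.setLong(3, r.getId());         // bound again so the error row can identify the failed insert
    ps.addBatch();
}
ps.executeBatch();                    // each block swallows its own failure, so the whole batch runs
ps.close();

After the batch, a simple SELECT on error_table tells you which rows failed and why.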
Oracle itself can continue after an error in a batch; see here: http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14250/oci04sql.htm#sthref616
However, it doesn't seem that this functionality is exposed to JDBC, not even in the Oracle-specific classes.
Because of the rather useless JDBC error handling ("the driver may or may not continue"), I always set a savepoint before the batch and perform a rollback to that point on error. That's the only JDBC-compliant way to establish a known state after an Oracle batch error, as far as I know.
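A minimal sketch of that savepoint pattern, reusing the ps from the question (savepoints require auto-commit to be off):

connection.setAutoCommit(false);
Savepoint beforeBatch = connection.setSavepoint();
try {
    ps.executeBatch();
    connection.commit();
} catch (BatchUpdateException bue) {
    connection.rollback(beforeBatch);  // back to a known state, whatever the driver did with the batch
    // fall back to your recovery strategy here, e.g. executing the inserts one by one
}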
Since the specification doesn't seem to mandate it (as clearly shown by the Javadoc), any "forced" continuation would have to be done on a per-driver basis. A simple standard-compliant workaround would be to check the array returned by getUpdateCounts() and "re-run" the batch for those statements which failed. You can make this approach a bit more sophisticated by adding logic for the number of retries.
Sure, this seems a bit messy (keeping track of the "batch" added and then checking the output) but would work across all databases and driver implementations. Just a thought...
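A rough sketch of that idea, assuming you keep the batched rows in a list so they can be replayed; batchedRows and retryRow() are hypothetical placeholders:

try {
    ps.executeBatch();
} catch (BatchUpdateException bue) {
    int[] counts = bue.getUpdateCounts();
    // Caveat: a driver that stops at the first failure may return fewer counts than
    // the number of batched statements, so anything beyond counts.length is also suspect.
    for (int i = 0; i < counts.length; i++) {
        if (counts[i] == Statement.EXECUTE_FAILED) {
            retryRow(batchedRows.get(i));  // re-run (or log) the statement that failed
        }
    }
}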
Related
I came across an old piece of code which looks like below
Statement stmt = connection.createStatement();
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
//there is some data which needs to be deleted before inserting the new data.
stmt.execute(deleteQuery);
stmt.executeBatch();
Here we are batching up a few queries, but before executing the batch, this code executes a separate delete query and only then executes the batch.
Is it legal to do this?
Will the above code work as expected that it will first execute the delete query and then the batch update?
The JDBC specification (version 4.3) says:
The behavior of the methods executeQuery, executeUpdate, and execute is implementation-defined when a statement’s batch is non-empty.
In other words, the behaviour is not specified and depends on the driver implementation, which means it should not be relied on.
A quick (but not thorough) scan of the pgjdbc sources seems to indicate that the PostgreSQL driver indeed allows you to first add statements to the batch, execute a single statement, and then execute the batch.
But in the code shown, I'd suggest simply executing the delete query first, and only then populating and executing the batch. That order would be a lot easier to follow for people unfamiliar with the code.
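Something along these lines (the same statements, just reordered):

Statement stmt = connection.createStatement();

// Delete the old data first, while no batch is pending...
stmt.execute(deleteQuery);

// ...then build and run the batch.
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.executeBatch();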
So basically, I would like to avoid stored procedures, but at the same time I wouldn't want multiple round-trips to the database to execute sequential statements.
Apparently this blog says Facebook uses MySQL's multiple-statement queries. Unfortunately, that is a C API; is there a Java equivalent of it?
So in brief, the question is: in Java + MySQL, how can a second JDBC statement use the output of the first statement as its input, without an extra round-trip to the database and without a stored procedure?
If not how do other people approach this problem?
Yes, the JDBC driver for MySQL supports multi-statement queries. It is, however, disabled by default for security reasons, as multi-statement queries significantly increase the risk associated with potential SQL injection.
To turn on multi-statement query support, simply add the allowMultiQueries=true option to your connection string (or pass the equivalent option as a connection property). You can get more information on that option here: https://dev.mysql.com/doc/connector-j/5.1/en/connector-j-reference-configuration-properties.html.
Once this option is enabled, you can simply execute a call similar to statement.execute("select ... ; select ... ; select ..."). The returned ResultSets can be iterated from the Statement object: statement.getResultSet() retrieves the current one, and statement.getMoreResults() moves to the next result and tells you whether another ResultSet is available.
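A minimal sketch of that flow; the connection URL and the queries are made-up examples:

// allowMultiQueries must be enabled on the connection, e.g. via the URL.
Connection conn = DriverManager.getConnection(
        "jdbc:mysql://localhost:3306/test?allowMultiQueries=true", "user", "password");
Statement stmt = conn.createStatement();

boolean isResultSet = stmt.execute("SELECT * FROM t1; SELECT * FROM t2");
while (true) {
    if (isResultSet) {
        ResultSet rs = stmt.getResultSet();
        while (rs.next()) {
            // process the current result set
        }
        rs.close();
    } else if (stmt.getUpdateCount() == -1) {
        break;                         // neither a result set nor an update count: we are done
    }
    isResultSet = stmt.getMoreResults();
}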
It sounds like you want to do batch processing.
Here is a duplicate question with a good answer:
How to execute multiple SQL statements from java
I want to confirm whether we can execute multiple select statements in one shot and get multiple result sets. Please give me some idea of how to do this.
I have to execute two select queries in one statement:
String sql = "select * from test; select * from test where empid=1";
I am expecting to run it like this:
statement.execute(sql);
thanks
I don't believe that standard JDBC supports this. Certainly the ResultSet interface is oriented towards "multiple rows, one row at a time" - but not "multiple sets of results".
That doesn't mean it's not feasible with your specific database, however - it's possible that there's a driver for your database which extends JDBC to allow it. If you specify which database and driver you're using, we could verify that more easily.
In my opinion, JDBC does not allow executing multiple statements in one go. The language used in the JDBC specification and API docs indicates that the expectation is that one Statement execution is one statement, not multiple statements (e.g. it uses "a SQL statement", which in the SQL specification means a single SELECT, INSERT, etc.). However, it never explicitly states that this is not allowed.
Some drivers do support execution of multiple statements in one execution, but this usually has to be explicitly enabled using a connection property. Also some databases support executing a block of stored procedure code without explicitly defining a stored procedure (in that case the block is considered to be the statement).
Create a stored procedure containing that set of select statements, then use Statement.getMoreResults() to check whether more ResultSets are available and Statement.getResultSet() to retrieve each one. An example sketch is given below.
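Assuming a stored procedure named get_test_data that contains both SELECTs (the name is a placeholder, and whether multiple result sets come back this way depends on the database and driver), the loop could look like this:

CallableStatement cs = connection.prepareCall("{call get_test_data()}");
boolean hasResultSet = cs.execute();

while (hasResultSet) {                   // works here because the procedure only returns result sets
    ResultSet rs = cs.getResultSet();
    while (rs.next()) {
        // read the columns of the current result set
    }
    rs.close();
    hasResultSet = cs.getMoreResults();  // true while another ResultSet is available
}
cs.close();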
I'm getting this BatchUpdateException from a stmt.executeBatch() statement:
BatchUpdateException: A resultset was created for update
The internet does not have any information on this exception message. What does it mean? The traceback doesn't contain anything useful other than that a stored procedure failed.
I'd interpret the message as saying that an SQL statement you added via addBatch() produced a ResultSet, meaning it's not a normal INSERT, UPDATE or DELETE statement.
Statements that should return results can't be executed in batches with JDBC.
The JDBC Tutorial (under the heading "Handling Batch Update Exceptions") confirms it:
You will get a BatchUpdateException when you call the method executeBatch if (1) one of the SQL statements you added to the batch produces a result set (usually a query) or (2) one of the SQL statements in the batch does not execute successfully for some other reason.
You seem to be running into case 1 here.
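For example, a batch like the following would run into case 1 (the table and statements are made up for illustration):

Statement stmt = connection.createStatement();
stmt.addBatch("UPDATE accounts SET balance = balance - 10 WHERE id = 1");  // fine: yields an update count
stmt.addBatch("SELECT balance FROM accounts WHERE id = 1");                // not fine: yields a ResultSet
stmt.executeBatch();   // a driver will typically throw BatchUpdateException because of the SELECT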
A batch update is several insert/update/delete statements which are processed by the database together. This is usually done for performance reasons: one batch of 1000 inserts is much faster than 1000 individual inserts. A BatchUpdateException means one (or more) statements failed, most often due to a constraint violation.
You will have to look at the stored procedure to see what it has been doing. Maybe your DBA can give you more information about what went wrong.
I dropped one column from the table.
When I tried to insert records into that table, I was getting a BatchUpdateException.
After running the command below, the problem was solved:
REORG TABLE TABLE_NAME
Somewhere deep inside JBoss, in a Hibernate query, I'm catching an error that leaves me with a ResultSet. This code is a plugged-in custom data type.
It would be nice if I could simply do rs.getStatement().toString() and be done with it, but unfortunately that doesn't reveal anything about the SQL statement behind it.
I was thinking of doing something with ((PreparedStatement) rs.getStatement()).getMetaData().
I really wish Hibernate were a little more informative when it runs into errors.
Does anyone have a good solution to help reveal which table and which column that was used when the exception occurred?
Simply enable SQL logging in the Hibernate configuration properties by setting the hibernate.show_sql property to true.
This is more reliable than examining the result set's metadata, since that metadata does not give you the where clause.
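For example, with the classic Hibernate 3.x style of programmatic configuration (a sketch; the same property can equally go into hibernate.cfg.xml or hibernate.properties):

Configuration cfg = new Configuration().configure();  // loads hibernate.cfg.xml
cfg.setProperty("hibernate.show_sql", "true");         // print every SQL statement to stdout
cfg.setProperty("hibernate.format_sql", "true");       // optional: pretty-print the generated SQL
SessionFactory sessionFactory = cfg.buildSessionFactory();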
One way you can debug Hibernate is by turning on its detailed logging.
For example, you can log all SQL statements as they are executed by turning on logging for org.hibernate.SQL. From here you should be able to narrow down the last statement executed prior to your exception.
Documentation can be found here.
Getting the MetaData for the ResultSet will not allow you to get the information that was passed in. In Hibernate you can have the statements output to a log file.
Most JDBC drivers allow you to set tracing so that you can debug.
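At the plain JDBC level, the standard hook for that is DriverManager.setLogWriter; how much each driver actually logs through it varies:

// Route DriverManager (and driver) log output to stdout; the level of detail depends on the driver.
DriverManager.setLogWriter(new PrintWriter(System.out, true));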