What does this BatchUpdateException mean? - java

I'm getting this BatchUpdateException from a stmt.executeBatch() statement:
BatchUpdateException: A resultset was created for update
The internet does not have any information on this exception message. What does it mean? The stack trace doesn't contain anything useful other than that a stored procedure failed.

I'd interpret the message as saying that an SQL statement you added via addBatch() has produced a ResultSet, i.e. it's not a normal INSERT, UPDATE, or DELETE statement.
Statements that should return results can't be executed in batches with JDBC.
The JDBC Tutorial (under the heading "Handling Batch Update Exceptions") confirms it:
You will get a BatchUpdateException when you call the method executeBatch if (1) one of the SQL statements you added to the batch produces a result set (usually a query) or (2) one of the SQL statements in the batch does not execute successfully for some other reason.
You seem to be running into case 1 here.
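For illustration, here is a minimal sketch (the table and statements are made up) of a batch that runs into case 1:
Statement stmt = connection.createStatement();
stmt.addBatch("INSERT INTO accounts (name) VALUES ('alice')");
stmt.addBatch("SELECT COUNT(*) FROM accounts"); // produces a ResultSet
try {
    stmt.executeBatch(); // most drivers throw BatchUpdateException here
} catch (BatchUpdateException e) {
    System.err.println(e.getMessage()); // the exact message text is driver-specific
}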

A batch update is several INSERT/UPDATE/DELETE statements that are sent to the database and processed together, usually for performance reasons: one batch of 1000 inserts is much faster than 1000 individual inserts (a sketch of the pattern follows below). A BatchUpdateException means one (or more) of the statements failed, most often due to a constraint violation.
You will have to look at the stored procedure to see what it has been doing. Maybe your DBA can give you more information about what went wrong.
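A minimal sketch of that pattern, with a made-up table and a hypothetical messages list:
PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO log_entries (message) VALUES (?)");
for (String message : messages) {
    ps.setString(1, message);
    ps.addBatch(); // queue the parameter set
}
int[] updateCounts = ps.executeBatch(); // one round trip for all rows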

I dropped one column from the table.
When I tried to insert records into that table I was getting a BatchUpdateException.
Running the command below (a DB2 command) solved the problem:
REORG TABLE TABLE_NAME

Related

Using executeBatch and execute methods on a single Statement object in JDBC

I came across an old piece of code which looks like below
Statement stmt = connection.createStatement();
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
//there is some data which needs to be deleted before inserting the new data.
stmt.execute(deleteQuery);
stmt.executeBatch();
Here we are batching up a few queries, and before executing the batch the code executes a separate delete query, then executes the batch.
Is it legal to do this?
Will the above code work as expected, that is, first execute the delete query and then the batch update?
The JDBC specification (version 4.3) says:
The behavior of the methods executeQuery, executeUpdate, and execute is implementation-defined when a statement’s batch is non-empty.
In other words, the behaviour is not specified and depends on the driver implementation, which means it should not be relied on.
A quick (but not thorough) scan of the pgjdbc sources seems to indicate that the PostgreSQL driver indeed allows you to first add statements to the batch, execute a single statement, and then execute the batch.
But for the code shown, I'd suggest simply executing the delete query first, and only then populating and executing the batch. That order is a lot easier to follow for people unfamiliar with the code.
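Rearranged that way, the snippet from the question would read:
Statement stmt = connection.createStatement();
// delete the old data first...
stmt.execute(deleteQuery);
// ...then build and run the batch
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.addBatch(insertQuery);
stmt.executeBatch();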

Intermittent state of executing multiple queries using a Statement object

I am executing multiple SQL statements using single Connection and Statement object as two processes.
Pre-process (Preprocess.java)
Post-process (Postprocess.java)
My process steps are as follows (a minimal sketch follows the list below):
Create a Connection object and set autoCommit to false.
Create a Statement object.
During pre-process, execute the required SQL statements using the created Statement object.
Then, for post-process, pass the same Statement object and execute the required SQL statements.
Finally, commit the transaction (connection.commit();) and close the Statement and Connection objects.
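A minimal sketch of that flow; the url/user/password values and the preProcessSql/postProcessSql strings are placeholders for what the two classes actually run:
Connection conn = DriverManager.getConnection(url, user, password);
conn.setAutoCommit(false);
Statement stmt = conn.createStatement();
stmt.executeUpdate(preProcessSql);  // Preprocess.java step
stmt.executeUpdate(postProcessSql); // Postprocess.java step, same Statement
conn.commit();
stmt.close();
conn.close();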
My Problem is:
Sometimes the statements executed in post-process are not reflected in the database. That is, if I insert or update any records in post-process, those records are not available in the database.
But the post-process execution completes without any exception.
This happens about 1 time out of 5 (4 times it works as I expect, but 1 time it is problematic).
Can any one point out the problem?
I am using JDK 1.7 and PostgreSQL 9.3.

Batch MySQL inserts, one a primary record, one a detail record with foreign key

I have an application that logs a lot of data to a MySQL database. The in-production version already runs insert statements in batches to improve performance. We're changing the db schema a bit so that some of the extraneous data is sent to a different table that we can join on lookup.
However, I'm trying to properly design the queries to work with our batch system. I wanted to use the MySQL LAST_INSERT_ID() function so I wouldn't have to worry about retrieving the generated keys and matching them up (which seems like a very difficult task).
However, I can't seem to find a way to add different insert statements to one batch, so how can I resolve this? I assume I need to build a second batch and add all the detail queries to that, but then LAST_INSERT_ID() loses its meaning.
s = conn.prepareStatement("INSERT INTO mytable (stuff) VALUES (?)");
while (!queue.isEmpty()){
    s.setLong(1, System.currentTimeMillis() / 1000L);
    // ... set other data
    s.addBatch();
    // Add insert query for extra data if needed
    if (a.getData() != null && !a.getData().isEmpty()){
        s = conn.prepareStatement("INSERT INTO mytable_details (stuff_id, morestuff) "
                + "VALUES (LAST_INSERT_ID(), ?)");
        s.setString(1, a.getData());
        s.addBatch();
    }
}
This is not how batching works. Batching only works within one Statement, and for a PreparedStatement that means you can only add batches of parameters for one and the same statement. Your code also never executes the statements.
For what you want to do, you should use setAutoCommit(false), execute both statements, and then commit() (or rollback if an error occurred).
I'd also suggest you look into the standard JDBC way of retrieving generated keys, as that will make your code less MySQL-specific. See also Retrieving AUTO_INCREMENT Column Values through JDBC.
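A hedged sketch of that approach, reusing the table names from the question; Item and queue are stand-ins for the asker's data, and it assumes the driver populates getGeneratedKeys() after executeBatch() when the statement is prepared with Statement.RETURN_GENERATED_KEYS (MySQL Connector/J supports this):
PreparedStatement master = conn.prepareStatement(
        "INSERT INTO mytable (stuff) VALUES (?)",
        Statement.RETURN_GENERATED_KEYS);
List<String> extraData = new ArrayList<>(); // one detail value per master row, null if none
while (!queue.isEmpty()) {
    Item a = queue.poll(); // Item is a hypothetical type for the queued records
    master.setLong(1, System.currentTimeMillis() / 1000L);
    // ... set other data
    master.addBatch();
    extraData.add(a.getData());
}
master.executeBatch();
PreparedStatement detail = conn.prepareStatement(
        "INSERT INTO mytable_details (stuff_id, morestuff) VALUES (?, ?)");
ResultSet keys = master.getGeneratedKeys();
for (int i = 0; keys.next(); i++) {
    String data = extraData.get(i);
    if (data != null && !data.isEmpty()) {
        detail.setLong(1, keys.getLong(1));
        detail.setString(2, data);
        detail.addBatch();
    }
}
detail.executeBatch();
conn.commit(); // with setAutoCommit(false), as suggested above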
I've fixed it for now, though I wish there were a better way. I built an ArrayList of extra data values that I can associate with the generated keys returned from the batch inserts. After the first batch executes, I build a second batch with the right ids/data.

JDBC statement fails to delete row in specific MySql table

I have a table, say example1, and I'm using a JDBC Statement to delete one of its rows. I have tried various approaches, from delete from example1 where id = 1 to statement.addBatch(sql), but it does not delete the row. If I execute the same SQL statement in Toad for MySQL, it deletes the row just fine.
The weird thing is that using JDBC I am able to delete rows from other tables just fine; it's just this one particular table that gives me unexpected results.
There is nothing special about this table. It has a primary key and no constraints/foreign key relationships.
Also, this delete is part of a transaction, so auto-commit is set to false, and the commit is done once all records have been updated/inserted/deleted. This does not seem to be a problem with any other table; all the other updates/deletes/inserts go through just fine.
Permission-wise, this table has the same permissions for the db user as any other table in the db.
Any ideas or pointers will be greatly appreciated!
Turning on general logging on the database or profiling in the JDBC driver would show you what's actually going to the database:
http://dev.mysql.com/doc/refman/5.0/en/connector-j-reference-configuration-properties.html
Enable profiling of queries for Connector/J by adding this to your connection string: profileSQL=true
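For example (host and database name are placeholders):
String url = "jdbc:mysql://localhost:3306/mydb?profileSQL=true";
Connection conn = DriverManager.getConnection(url, user, password);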
General Logging documentation:
http://dev.mysql.com/doc/refman/5.1/en/query-log.html
There's also mk-query-digest for sniffing your network traffic and analyzing the results:
http://www.maatkit.org/doc/mk-query-digest.html
I have come across the same situation.
In my case there was definitely a mistake in the query.
Try executing the query in SQLyog or any other MySQL GUI and check whether it works; I am 100% sure it won't.
Then correct the query and check it again.

BatchUpdateException: the batch will not terminate

I have an application which processes a very large file and sends the data to an Oracle database (using Java 6, Oracle 9).
In a loop, I use a PreparedStatement ps and queue all the generated SQL statements with ps.addBatch().
I have a situation where a BatchUpdateException bue is thrown somewhere during ps.executeBatch(). At that point, the batch stops executing.
I'd like the batch execution to continue, so that I can then check the failed updates in a method processUpdateCounts(bue.getUpdateCounts()).
The Javadoc for class BatchUpdateException says:
After a command in a batch update fails to execute properly and a BatchUpdateException is thrown, the driver may or may not continue to process the remaining commands in the batch.
Is there a way to enforce continuation, or do I need to alter my program so that it executes the statements individually?
Just found this link:
JDBC Batch Update Problem
Apparently, it says there is no way with Oracle batch JDBC to proceed after the first failure, so I am resorting to sending the inserts one by one.
Thank you
(and sorry for not looking harder for the link above before asking).
There is a workaround that lets you keep using the batch feature: instead of executing a plain INSERT statement, you can execute a PL/SQL block that deals with the error appropriately:
BEGIN
  INSERT INTO your_table VALUES (?,?,...?);
EXCEPTION
  WHEN OTHERS THEN
    /* deal with the error: for example, log the error id and error message
       so that you can list them after the batch */
    INSERT INTO error_table VALUES (?, sqlerrm);
END;
The performance should be on par with the batch insert (and faster than executing the statements individually). You could also call a stored procedure instead of a PL/SQL block.
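A sketch of how that block might be batched from JDBC; the column list and the error-id parameter are assumptions, and Row/rows are hypothetical stand-ins for the file data:
String plsql =
        "BEGIN\n"
      + "  INSERT INTO your_table VALUES (?, ?);\n"
      + "EXCEPTION\n"
      + "  WHEN OTHERS THEN\n"
      + "    INSERT INTO error_table VALUES (?, sqlerrm);\n"
      + "END;";
PreparedStatement ps = conn.prepareStatement(plsql);
for (Row row : rows) {
    ps.setLong(1, row.getId());
    ps.setString(2, row.getValue());
    ps.setLong(3, row.getId()); // id to log if this row fails
    ps.addBatch();
}
ps.executeBatch(); // failing rows land in error_table instead of aborting the batch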
Oracle itself can continue past a failed statement; see: http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14250/oci04sql.htm#sthref616
However, this functionality doesn't seem to be exposed to JDBC, not even in the Oracle-specific classes.
Because of the rather useless JDBC error handling ("the driver may or may not continue"), I always set a savepoint before the batch and perform a rollback to that savepoint on error. As far as I know, that's the only JDBC-compliant way to establish a known state after an Oracle batch error.
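As a sketch of that savepoint pattern:
conn.setAutoCommit(false);
Savepoint beforeBatch = conn.setSavepoint();
try {
    ps.executeBatch();
    conn.commit();
} catch (BatchUpdateException e) {
    conn.rollback(beforeBatch); // back to a known state
    // then log, retry individually, or abort as appropriate
}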
Since the specification doesn't mandate it (as the Javadoc clearly shows), any "forced" continuation would have to be implemented per driver. A simple standards-compliant workaround is to check the array returned by getUpdateCounts() and re-run the batch for those statements that failed. You can make this approach a bit more sophisticated by adding logic for the number of retries.
Sure, this seems a bit messy (keeping track of what was added to the batch and then checking the output), but it would work across all databases and driver implementations. Just a thought...
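A rough sketch of that retry idea; batchedRows (the bookkeeping list) and retry() are hypothetical, and note that some drivers, Oracle's included, may stop early and return fewer update counts than there were commands:
int[] counts;
try {
    counts = ps.executeBatch();
} catch (BatchUpdateException e) {
    counts = e.getUpdateCounts();
}
for (int i = 0; i < counts.length; i++) {
    if (counts[i] == Statement.EXECUTE_FAILED) {
        retry(batchedRows.get(i)); // re-run just the failed entry
    }
}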
