I have an issue with a PreparedStatement in Java when inserting into an Oracle database.
In fact, I prepare the template of my INSERT query in the PreparedStatement, set all my parameters, and call addBatch() for each record I want to insert.
I add several records to the batch, for example 500 at a time.
Up to that point everything works well and I can insert what I want.
However, if my PreparedStatement raises a BatchUpdateException (for example a constraint violation) on the 500th row I want to insert, nothing gets inserted at all.
At the very least I would like to drop the record that causes the problem (the constraint violation) and still insert the 499 rows that are OK.
How can I do that? Any pointer would be appreciated.
Just for info: I want to insert many rows in one go (around 500), so inserting row by row does not suit me for performance reasons.
Regards
Maybe this is not what you want, but Oracle has built-in DML error logging.
You have to create an error table first.
E.g. if the table is called emp, run this:
exec dbms_errlog.create_error_log(dml_table_name => 'emp');
That will create a table err$_emp that will catch the errors.
Then you can do something like the code below (note the LOG ERRORS INTO clause).
The batch will succeed, and you will have to check the error table for errors after you run it.
import java.sql.*;

public class Class1 {
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName("oracle.jdbc.driver.OracleDriver");
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
        PreparedStatement preparedStatement;
        int records = 20;
        try {
            Connection connection = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//host/db", "scott", "tiger");
            // LOG ERRORS INTO sends failing rows to ERR$_EMP instead of failing the statement
            String compiledQuery = "INSERT INTO EMP(EMPNO)" +
                    " VALUES" + "(?) LOG ERRORS INTO ERR$_EMP REJECT LIMIT UNLIMITED";
            preparedStatement = connection.prepareStatement(compiledQuery);
            for (int index = 1; index <= records; index++) {
                preparedStatement.setInt(1, index);
                preparedStatement.addBatch();
            }
            long start = System.currentTimeMillis();
            try {
                int[] inserted = preparedStatement.executeBatch();
            } catch (SQLException e) {
                System.out.println("sql error");
            }
            long end = System.currentTimeMillis();
            System.out.println("total time taken to insert the batch = " + (end - start) + " ms");
            System.out.println("average time per record = " + (end - start) / records + " ms");
            preparedStatement.close();
            connection.commit();
            connection.close();
        } catch (SQLException ex) {
            System.err.println("SQLException information");
            while (ex != null) {
                System.err.println("Error msg: " + ex.getMessage());
                ex = ex.getNextException();
            }
            throw new RuntimeException("Error");
        }
    }
}
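After executeBatch() you can see which rows were rejected and why by querying the error table. This is just a sketch (run it while the connection is still open); ORA_ERR_NUMBER$ and ORA_ERR_MESG$ are the standard columns dbms_errlog generates, and EMPNO is the column copied from EMP:

// Sketch: inspect the rejected rows in the error log table
try (Statement check = connection.createStatement();
     ResultSet rs = check.executeQuery(
             "SELECT ora_err_number$, ora_err_mesg$, empno FROM err$_emp")) {
    while (rs.next()) {
        System.out.println("error " + rs.getInt(1) + ": " + rs.getString(2)
                + " (EMPNO=" + rs.getString(3) + ")");
    }
}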
I am writing a Java class to insert data into a database in a loop. Although I can now insert data, I am struggling with handling Oracle errors. In this case, I have deliberately created a Primary Key Constraint error by trying to insert duplicate primary keys (I have pre-loaded the database with the same entries as I am trying to insert with Java).
So, as expected, I get "ORA-00001: unique constraint". However, the problem I am having is that after 300 iterations I reach a new error: "ORA-01000: maximum open cursors exceeded".
I guess the issue is that every failed executeUpdate() keeps a cursor open.
I have temporarily solved the issue by calling close() in the catch block of the error. However, I am wondering:
Should the failed executeUpdate() not be closing the cursor?
Is there a better way I can close cursors on Exception?
Why does it return a null Exception?
My Java Code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class TestOracleError {

    private static Connection conn;

    public static void main(String[] args) {
        // Connect
        try {
            conn = DriverManager.getConnection("jdbc:oracle:thin:" +
                    "@XXX.XXX.XXX.XXX:1521:XXXX", "XXXX", "XXXX");
        } catch (SQLException e) {
            System.err.println(e.getMessage());
        }
        // Iterate insert
        for (int i = 1; i < 5000; i++) {
            PreparedStatement pstmt = null;
            try {
                // Deliberate error on the SQL (Primary Key Constraint)
                pstmt = conn.prepareStatement("INSERT INTO DUMMY (ID) VALUES "
                        + "('" + i + "')");
                pstmt.executeUpdate();
                pstmt.close();
            } catch (Exception e) {
                System.err.println("DB Error on iteration " + i + ": " +
                        e.getMessage());
                // If I add the following line then I can solve the error!
                //try { pstmt.close(); } catch (Exception e1) {}
            }
        }
    }
}
If you need to insert many rows, put the creation of the PreparedStatement outside the loop and set only the values inside the loop.
// Moved outside
PreparedStatement pstmt = null;
// Using the question mark as a placeholder for a variable
pstmt = conn.prepareStatement("INSERT INTO DUMMY (ID) VALUES (?)");
for (int i = 1; i < 5000; i++) {
    try {
        // Only set the variable in the loop
        pstmt.setInt(1, i);
        pstmt.executeUpdate();
    } catch (Exception e) {
        System.err.println("DB Error on iteration " + i + ": " +
                e.getMessage());
    }
}
pstmt.close(); // Moved out of loop
Note: your code doesn't close the pstmt if an exception happens, so statements remain open. This can potentially create the problem of too many open cursors.
Generally the best solution is to close the resources in a finally block or to use a try-with-resources statement.
Use finally or try-with-resources.
try {
    // Deliberate error on the SQL (Primary Key Constraint)
    // Only set the variable in the loop
    pstmt.setInt(1, i);
    pstmt.executeUpdate();
} catch (Exception e) {
    System.err.println("DB Error on iteration " + i + ": " +
            e.getMessage());
} finally {
    pstmt.close();
}
or
try (Connection con = DriverManager.getConnection(myConnectionURL);
     PreparedStatement ps = con.prepareStatement(sql)) {
    // ......
} catch (SQLException e) {
    e.printStackTrace();
}
I guess the issue is that every failed executeUpdate() keeps a cursor open.
No, the issue is that you aren't closing your PreparedStatements if an SQLException occurs.
Should the failed executeUpdate() not be closing the cursor?
No.
Is there a better way I can close cursors on Exception?
Close the PreparedStatement in a finally block, or with the try-with-resources syntax.
Why does it return a null Exception?
It doesn't. It throws an exception with a null message.
Instead of preparing 5000 PreparedStatements, you should also investigate batching.
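For illustration, a minimal sketch of what that could look like, reusing a single PreparedStatement with a bind variable and one batch (table and column names taken from the question; note that with deliberate duplicate keys the whole batch would still fail with a BatchUpdateException, so this only addresses the cursor problem and the per-row round trips):

String sql = "INSERT INTO DUMMY (ID) VALUES (?)";
try (PreparedStatement ps = conn.prepareStatement(sql)) {
    for (int i = 1; i < 5000; i++) {
        ps.setInt(1, i);
        ps.addBatch();          // queue the row instead of executing it immediately
    }
    int[] counts = ps.executeBatch();   // one call to the database for all rows
    System.out.println("Executed " + counts.length + " batched inserts");
} catch (SQLException e) {
    System.err.println("DB Error: " + e.getMessage());
}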
I have a problem with my program. When I try to write to my database (MySQL) I get this error: "Column count doesn't match value count at row 1".
This is my code:
public void registreerNieuwSpelbord(String spelnaam, String mapcode) {
    try (Connection connectie = DriverManager.getConnection(Connectie.JDBC_URL)) {
        Statement stmt = connectie.createStatement();
        String schrijfSpelbordWeg = "INSERT INTO spelbord(Mapcode, spel_Spelnaam) values('" + mapcode + "," + spelnaam + "')";
        stmt.executeUpdate(schrijfSpelbordWeg);
    } catch (SQLException ex) {
        throw new RuntimeException(ex);
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Note: there is also a third column, an ID that is given a number automatically.
You have two columns listed in the insert, but only one value.
Try this:
String schrijfSpelbordWeg = "INSERT INTO spelbord(Mapcode, spel_Spelnaam) values('" + mapcode + "','" + spelnaam + "')";
You should always use a PreparedStatement and bind variables when dealing with SQL that takes input parameters. This way, you're eliminating the chance of SQL injection, allowing the DB to re-use/cache your query and sparing yourself from hunting down bugs that are caused by missing a quote around a parameter.
Here's a refactored version that uses parameterized SQL:
public void registreerNieuwSpelbord(String spelnaam, String mapcode) {
    String sql = "INSERT INTO spelbord(Mapcode, spel_Spelnaam) values(?, ?)";
    try (Connection connectie = DriverManager.getConnection(Connectie.JDBC_URL);
         PreparedStatement ps = connectie.prepareStatement(sql)) {
        ps.setString(1, mapcode);
        ps.setString(2, spelnaam);
        ps.executeUpdate();
    } catch (SQLException ex) {
        throw new RuntimeException(ex);
    }
}
OK, I know that batch processing allows you to group related SQL statements into a batch and submit them with one call to the database, and that sending several SQL statements to the database at once reduces the communication overhead, thereby improving performance. In this particular situation (see code below) I don't think batching serves its purpose, because stmt.executeBatch() is called straight away after adding a single batch entry. Wouldn't stmt.executeUpdate() do the same thing?
public void tableChanged(TableModelEvent e) {
    int row = e.getFirstRow();
    int col = e.getColumn();
    model = (MyTableModel) e.getSource();
    String stulpPav = model.getColumnName(col);
    Object data = model.getValueAt(row, col);
    Object studId = model.getValueAt(row, 0);
    System.out.println("tableChanded works");
    try {
        new ImportData(stulpPav, data, studId);
        bottomLabel.setText(textForLabel());
    } catch (ClassNotFoundException e1) {
        e1.printStackTrace();
    } catch (SQLException e1) {
        e1.printStackTrace();
    }
}

public class ImportData {
    public ImportData(String a, Object b, Object c)
            throws ClassNotFoundException, SQLException {
        Statement stmt = null;
        try {
            connection = TableWithBottomLine.getConnection();
            String stulpPav = a;
            String duom = b.toString();
            String studId = c.toString();
            System.out.println(duom);
            connection.setAutoCommit(false);
            stmt = connection.createStatement();
            stmt.addBatch("update finance.fin set " + stulpPav + " = " + duom
                    + " where ID = " + studId + ";");
            stmt.executeBatch();
            connection.commit();
        } catch (BatchUpdateException e) {
            e.printStackTrace();
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            if (stmt != null)
                stmt.close();
            connection.setAutoCommit(true);
            System.out.println("Data was imported to database");
        }
    }
}
In this case using a batch has no advantage at all. It might even introduce additional overhead over a direct executeUpdate() (but that is driver and database dependent).
However, don't assume that batching has advantages with all JDBC drivers. I haven't looked at the specifics of MySQL, but I know there are JDBC drivers where batching is internally just a normal execute of each statement in the batch.
The code in your question, however, has a bigger problem: it is vulnerable to SQL injection.
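For example, the update could be rewritten with bind variables inside the ImportData constructor, replacing the Statement/addBatch code. A sketch only: the column name cannot be bound as a parameter, so this assumes stulpPav really comes from model.getColumnName() (or is otherwise whitelisted) rather than from free user input:

// Sketch: bind the values; the column name must be validated separately,
// since identifiers cannot be passed as JDBC parameters
String sql = "update finance.fin set " + stulpPav + " = ? where ID = ?";
try (PreparedStatement ps = connection.prepareStatement(sql)) {
    ps.setObject(1, b);   // the edited cell value
    ps.setObject(2, c);   // the student ID
    ps.executeUpdate();   // a single statement; no batch needed here
    connection.commit();
}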
When I do an insert into a remote table using jConnect, it gives me the following error:
Unexpected exception : java.sql.SQLException: This transaction has been rolled back, rather than only the current statement.
, sqlstate = ZZZZZjava.sql.SQLException: This transaction has been rolled back, rather than only the current statement.
at com.sybase.jdbc4.jdbc.SybConnection.getAllExceptions(SybConnection.java:2780)
at com.sybase.jdbc4.jdbc.SybStatement.handleSQLE(SybStatement.java:2665)
at com.sybase.jdbc4.jdbc.SybStatement.nextResult(SybStatement.java:295)
at com.sybase.jdbc4.jdbc.SybStatement.nextResult(SybStatement.java:272)
at com.sybase.jdbc4.jdbc.SybStatement.updateLoop(SybStatement.java:2515)
at com.sybase.jdbc4.jdbc.SybStatement.executeUpdate(SybStatement.java:2499)
at com.sybase.jdbc4.jdbc.SybStatement.executeUpdate(SybStatement.java:577)
at connectSybase.main(connectSybase.java:48)
Do you know what it might be?
Here's my full code:
import java.io.*;
import java.sql.*;

public class connectSybase {
    public static void main(String args[])
    {
        try
        {
            // jconn3 <-- from the OMS team
            //Class.forName("com.sybase.jdbc3.jdbc.SybDriver");
            // jconn4 <-- from the OMS1_PAR_DEV_SQL server
            Class.forName("com.sybase.jdbc4.jdbc.SybDriver");
        }
        catch (ClassNotFoundException cnfe)
        {
            System.out.println("BUM!");
        }
        try
        {
            System.out.println("Any of the following may throw an SQLException.");
            System.out.println("Opening a connection.");
            Connection con = java.sql.DriverManager.getConnection
                ("----------------------------");
            // more code to use connection ...
            System.out.println("Creating a statement object.");
            Statement stmt = con.createStatement();
            System.out.println("Executing the query.");
            ResultSet rs = stmt.executeQuery("Select top 10 * from OMS_DEV..SCRIBE_AR");
            System.out.println("Process the result set.");
            while (rs.next())
            {
                System.out.println("Fetched value " + rs.getString(1));
            }
            System.out.println("Executing the query.");
            int result = stmt.executeUpdate("---------------");
            System.out.println("Process the result set: " + result);
        }
        catch (SQLException sqe)
        {
            sqe.printStackTrace();
            System.out.println("Unexpected exception : " +
                sqe.toString() + ", sqlstate = " +
                sqe.getSQLState());
            System.exit(1);
        }
        System.exit(0);
    }
}
I've omitted the insert statement and the connection string, but both work: I get the result of the first select (only the insert fails), and the insert itself is correct because it works from isql or DBArtisan.
The Sybase error message was not specific, but the problem turned out to be related to the packet size.
In ASE it was 8192 and in IQ only 2048.
The error was raised when the packet exceeded 2 KB.
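If it helps, jConnect lets you ask for a specific network packet size through its PACKETSIZE connection property. A sketch below; the URL, the credentials and whether the server honours the requested size are all assumptions about your environment:

// Sketch: request a packet size that stays within the IQ limit (2048 bytes).
// The server configuration still decides what it will actually accept.
java.util.Properties props = new java.util.Properties();
props.put("user", "myUser");          // placeholder credentials
props.put("password", "myPassword");
props.put("PACKETSIZE", "2048");      // jConnect connection property, value in bytes
Connection con = DriverManager.getConnection(
        "jdbc:sybase:Tds:host:port/database", props);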
I have a method which does a simple MySQL insert. I tried to roll back the insert on an error as follows, but it is not rolling back on errors. Please assist me.
public void addFamer(FamerDTO famer) throws Exception {
    Connection con = JDBCConnectionPool.getInstance().checkOut();
    con.setAutoCommit(false);
    try {
        String generalFamerDataSQL = "INSERT INTO famers(famer_code, name_wt_initials, full_name, gender, "
                + "nic_or_passport_no, sc_possition, phone_home, phone_mobile, phone_office) VALUES(?,?,?,?,?,?,?,?,?)";
        PreparedStatement insertFamerPS = con.prepareStatement(generalFamerDataSQL, PreparedStatement.RETURN_GENERATED_KEYS);
        insertFamerPS.setString(1, famer.getFamerCode());
        insertFamerPS.setString(2, famer.getNameWithInitials());
        insertFamerPS.setString(3, famer.getNameInFull());
        insertFamerPS.setString(4, famer.getGender());
        insertFamerPS.setString(5, famer.getNICorPassportNo());
        insertFamerPS.setString(6, famer.getSocietyPosission());
        insertFamerPS.setString(7, famer.getHomePhone());
        insertFamerPS.setString(8, famer.getMobilePhone());
        insertFamerPS.setString(9, famer.getOfficePhone());
        insertFamerPS.execute();

        String famerRelations = "INSERT INTO org_reg_blk_soc_fmr(org_id, region_id, famer_id, block_id, soc_id) "
                + "VALUES (?,?,?,?,?)";
        PreparedStatement famerRelationsPS = con.prepareStatement(famerRelations);
        famerRelationsPS.setInt(1, famer.getOrganization().getOrg_id());
        famerRelationsPS.setInt(2, famer.getRegion().getRegion_id());
        famerRelationsPS.setInt(3, famerID);
        famerRelationsPS.setInt(4, famer.getBlock().getBlockId());
        famerRelationsPS.setInt(6, famer.getSociety().getSoc_id()); // intentionally made an error here to test, put index as 6 instead of 5
        famerRelationsPS.execute();
        con.commit();
    } catch (Exception e) {
        if (con != null) {
            logger.info("Rolling back!");
            con.rollback();
        }
        logger.error(e.getLocalizedMessage());
    } finally {
        con.setAutoCommit(true);
        JDBCConnectionPool.getInstance().checkIn(con);
    }
}
Once this method is called with the required parameters, since there is an error in the second insert statement, I expected the first insert to be rolled back. But although the error is shown, a record is still added to the database by the first insert statement.
Just to check: what table type are you using? Last time I used MySQL, MyISAM tables didn't support transactions, meaning you have to use another table type, e.g. InnoDB.
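If you want to verify that from JDBC, here is a minimal sketch; it assumes the famers table lives in the schema the connection points at, and reads the engine from information_schema:

// Sketch: report the storage engine of the famers table. Rollback only works
// with a transactional engine such as InnoDB; MyISAM silently ignores it.
try (Statement st = con.createStatement();
     ResultSet rs = st.executeQuery(
             "SELECT engine FROM information_schema.tables "
             + "WHERE table_schema = DATABASE() AND table_name = 'famers'")) {
    if (rs.next()) {
        System.out.println("famers engine: " + rs.getString(1));
    }
}
// If it reports MyISAM, a one-off conversion is: ALTER TABLE famers ENGINE = InnoDB;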