How can I insert data using procedure SQL? [duplicate] - java

In my app I need to do a lot of INSERTs. It's a Java app and I am using plain JDBC to execute the queries, with Oracle as the database. I have enabled batching, which saves me network round-trips, but the queries still execute serially as separate INSERTs:
insert into some_table (col1, col2) values (val1, val2)
insert into some_table (col1, col2) values (val3, val4)
insert into some_table (col1, col2) values (val5, val6)
I was wondering if the following form of INSERT might be more efficient:
insert into some_table (col1, col2) values (val1, val2), (val3, val4), (val5, val6)
i.e. collapsing multiple INSERTs into one.
Any other tips for making batch INSERTs faster?

This is a mix of the two previous answers:
PreparedStatement ps = c.prepareStatement("INSERT INTO employees VALUES (?, ?)");
ps.setString(1, "John");
ps.setString(2, "Doe");
ps.addBatch();       // queue the first row; nothing is sent yet
ps.clearParameters();
ps.setString(1, "Dave");
ps.setString(2, "Smith");
ps.addBatch();       // queue the second row
ps.clearParameters();
int[] results = ps.executeBatch(); // send both rows in one round-trip
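One more tip, hedged: most drivers execute a batch noticeably faster inside a single explicit transaction, so it can pay to disable auto-commit first. A minimal sketch, assuming the same Connection c as above, instead of calling executeBatch() directly:
c.setAutoCommit(false); // one transaction for the whole batch
try {
    int[] results = ps.executeBatch();
    c.commit();             // commit all rows in one go
} catch (SQLException e) {
    c.rollback();           // undo the partial batch on failure
    throw e;
}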

Though the question asks about inserting efficiently into Oracle using JDBC, I'm currently working with DB2 (on an IBM mainframe). Conceptually inserting is similar, so I thought it might be helpful to share my metrics comparing
inserting one record at a time
inserting a batch of records (very efficient)
Here are the metrics:
1) Inserting one record at a time
public void writeWithCompileQuery(int records) {
    PreparedStatement statement;
    try {
        Connection connection = getDatabaseConnection();
        connection.setAutoCommit(true); // each executeUpdate() is its own transaction
        String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
                " VALUES" + "(?, ?, ?, ?, ?)";
        statement = connection.prepareStatement(compiledQuery);
        long start = System.currentTimeMillis();
        for (int index = 1; index <= records; index++) {
            statement.setInt(1, index);
            statement.setString(2, "emp number-" + index);
            statement.setInt(3, index);
            statement.setInt(4, index);
            statement.setString(5, "username");
            long startInternal = System.currentTimeMillis();
            statement.executeUpdate(); // one round-trip per row
            System.out.println("each transaction time taken = " + (System.currentTimeMillis() - startInternal) + " ms");
        }
        long end = System.currentTimeMillis();
        System.out.println("total time taken = " + (end - start) + " ms");
        System.out.println("avg total time taken = " + (end - start) / records + " ms");
        statement.close();
        connection.close();
    } catch (SQLException ex) {
        System.err.println("SQLException information");
        while (ex != null) {
            System.err.println("Error msg: " + ex.getMessage());
            ex = ex.getNextException();
        }
    }
}
The metrics for 100 transactions :
each transaction time taken = 123 ms
each transaction time taken = 53 ms
each transaction time taken = 48 ms
each transaction time taken = 48 ms
each transaction time taken = 49 ms
each transaction time taken = 49 ms
...
..
.
each transaction time taken = 49 ms
each transaction time taken = 49 ms
total time taken = 4935 ms
avg total time taken = 49 ms
The first transaction takes around 120-150 ms for the query parse plus execution; subsequent transactions take only around 50 ms each. (That is still high, but my database is on a different server; I need to troubleshoot the network.)
2) Inserting in a batch (the efficient one) - achieved via preparedStatement.executeBatch()
public int[] writeInABatchWithCompiledQuery(int records) {
    PreparedStatement preparedStatement;
    try {
        Connection connection = getDatabaseConnection();
        connection.setAutoCommit(true);
        String compiledQuery = "INSERT INTO TESTDB.EMPLOYEE(EMPNO, EMPNM, DEPT, RANK, USERNAME)" +
                " VALUES" + "(?, ?, ?, ?, ?)";
        preparedStatement = connection.prepareStatement(compiledQuery);
        for (int index = 1; index <= records; index++) {
            preparedStatement.setInt(1, index);
            preparedStatement.setString(2, "empo number-" + index);
            preparedStatement.setInt(3, index + 100);
            preparedStatement.setInt(4, index + 200);
            preparedStatement.setString(5, "usernames");
            preparedStatement.addBatch(); // queue the row; nothing is sent yet
        }
        long start = System.currentTimeMillis();
        int[] inserted = preparedStatement.executeBatch(); // one round-trip for the whole batch
        long end = System.currentTimeMillis();
        System.out.println("total time taken to insert the batch = " + (end - start) + " ms");
        System.out.println("avg time taken per record = " + (end - start) / records + " ms");
        preparedStatement.close();
        connection.close();
        return inserted;
    } catch (SQLException ex) {
        System.err.println("SQLException information");
        while (ex != null) {
            System.err.println("Error msg: " + ex.getMessage());
            ex = ex.getNextException();
        }
        throw new RuntimeException("Error");
    }
}
The metrics for a batch of 100 transactions is
total time taken to insert the batch = 127 ms
and for 1000 transactions
total time taken to insert the batch = 341 ms
So 100 inserts drop from ~5000 ms (one transaction at a time) to ~150 ms (one batch of 100 records).
NOTE - My network is very slow, so the absolute numbers are inflated, but the relative difference should hold.

The plain Statement class gives you the following option:
Statement stmt = con.createStatement();
stmt.addBatch("INSERT INTO employees VALUES (1000, 'Joe Jones')");
stmt.addBatch("INSERT INTO departments VALUES (260, 'Shoe')");
stmt.addBatch("INSERT INTO emp_dept VALUES (1000, 260)");
// submit a batch of update commands for execution
int[] updateCounts = stmt.executeBatch();

You'll have to benchmark, obviously, but over JDBC issuing multiple inserts will be much faster if you use a PreparedStatement rather than a Statement.
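As a rough illustration of the difference, here is a minimal sketch of the two approaches; the connection conn, the table, and the row count of 1000 are assumptions for the example:
// Statement: the SQL text (with inlined values) is re-parsed for every row
try (Statement st = conn.createStatement()) {
    for (int i = 0; i < 1000; i++) {
        st.executeUpdate("insert into some_table (col1, col2) values (" + i + ", " + i + ")");
    }
}
// PreparedStatement: parsed once, then executed repeatedly with new bind values
try (PreparedStatement ps = conn.prepareStatement("insert into some_table (col1, col2) values (?, ?)")) {
    for (int i = 0; i < 1000; i++) {
        ps.setInt(1, i);
        ps.setInt(2, i);
        ps.addBatch();
    }
    ps.executeBatch();
}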

You can use the rewriteBatchedStatements connection parameter to make batch inserts even faster.
You can read about the parameter here: MySQL and JDBC with rewriteBatchedStatements=true
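For reference, the flag goes on the JDBC URL. A sketch assuming the MySQL Connector/J driver and hypothetical host, database, and credentials:
String url = "jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true";
// With this flag, Connector/J rewrites a batch of INSERTs into a single
// multi-row INSERT ... VALUES (...), (...), ... on the wire.
Connection conn = DriverManager.getConnection(url, "user", "password");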

SQLite: The above answers are all correct. For SQLite, it is a little different: batching (sometimes) does not improve performance at all. In that case, try disabling auto-commit and committing by hand after you are done. (Warning: when multiple connections write at the same time, these operations can clash.)
// connect(), yourList and compiledQuery you have to implement/define beforehand
try (Connection conn = connect()) {
    conn.setAutoCommit(false); // one transaction for all the inserts
    PreparedStatement pstmt = conn.prepareStatement(compiledQuery);
    for (Object o : yourList) {
        pstmt.setString(1, o.toString());
        pstmt.executeUpdate();
        pstmt.getGeneratedKeys(); // if you need the generated keys
    }
    pstmt.close();
    conn.commit();
}
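If it is still slow, SQLite's journaling settings are often the real bottleneck. A hedged sketch, assuming the Xerial sqlite-jdbc driver (weigh the durability trade-offs before copying this):
try (Statement st = conn.createStatement()) {
    st.execute("PRAGMA journal_mode = WAL");   // write-ahead log: faster writes
    st.execute("PRAGMA synchronous = NORMAL"); // fewer fsyncs, weaker durability
}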

How about using Oracle's INSERT ALL statement?
INSERT ALL
INTO table_name VALUES (...)
INTO table_name VALUES (...)
...
SELECT 1 FROM DUAL;
The final SELECT is mandatory: INSERT ALL requires a subquery as its row source, and SELECT 1 FROM DUAL fills that role when all of your values are literals.
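From JDBC that might look like the following sketch, reusing the question's some_table with made-up values:
String sql =
    "INSERT ALL" +
    "  INTO some_table (col1, col2) VALUES (1, 2)" +
    "  INTO some_table (col1, col2) VALUES (3, 4)" +
    " SELECT 1 FROM DUAL";
try (Statement st = conn.createStatement()) {
    st.executeUpdate(sql); // inserts both rows in one statement
}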
You might consider using PreparedStatement instead as well. Lots of advantages!
Farid

You can use addBatch and executeBatch for batch inserts in Java. See this example: Batch Insert In Java

In my code I have no direct access to the PreparedStatement, so I cannot use batching; I just pass the query and a list of parameters. The trick, however, is to build a variable-length INSERT statement and a LinkedList of parameters. The effect is the same as the example at the top, with variable-length parameter input. See below (error checking omitted).
Assuming 'myTable' has 3 updatable fields: f1, f2 and f3
String[] args = {"A", "B", "C", "X", "Y", "Z"}; // etc, input list of triplets
final String QUERY = "INSERT INTO [myTable] (f1,f2,f3) values ";
LinkedList<String> params = new LinkedList<>();
String comma = "";
StringBuilder q = new StringBuilder(QUERY);
for (int nl = 0; nl < args.length; nl += 3) { // args is a list of value triplets
    params.add(args[nl]);
    params.add(args[nl + 1]);
    params.add(args[nl + 2]);
    q.append(comma).append("(?,?,?)");
    comma = ",";
}
int nr = insertIntoDB(q.toString(), params);
In my DBInterface class I have:
int insertIntoDB(String query, LinkedList<String> params) throws SQLException {
    PreparedStatement preparedUPDStmt = connectionSQL.prepareStatement(query);
    int n = 1;
    for (String x : params) {
        preparedUPDStmt.setString(n++, x);
    }
    return preparedUPDStmt.executeUpdate();
}

If you use Spring's JdbcTemplate, then:
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;

public int[] batchInsert(List<Book> books) {
    return this.jdbcTemplate.batchUpdate(
        "insert into books (name, price) values(?,?)",
        new BatchPreparedStatementSetter() {
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                ps.setString(1, books.get(i).getName());
                ps.setBigDecimal(2, books.get(i).getPrice());
            }
            public int getBatchSize() {
                return books.size();
            }
        });
}
Or, with more advanced configuration:
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.ParameterizedPreparedStatementSetter;

public int[][] batchInsert(List<Book> books, int batchSize) {
    int[][] updateCounts = jdbcTemplate.batchUpdate(
        "insert into books (name, price) values(?,?)",
        books,
        batchSize,
        new ParameterizedPreparedStatementSetter<Book>() {
            public void setValues(PreparedStatement ps, Book argument)
                    throws SQLException {
                ps.setString(1, argument.getName());
                ps.setBigDecimal(2, argument.getPrice());
            }
        });
    return updateCounts;
}
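A hypothetical call site, assuming a Book bean with a (name, price) constructor:
List<Book> books = Arrays.asList(
    new Book("Effective Java", new BigDecimal("45.00")),
    new Book("Clean Code", new BigDecimal("39.99")));
int[] counts = batchInsert(books); // one update count per row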

Using a PreparedStatement can be slower than a plain Statement when you only execute a handful of statements, because of the extra prepare round-trip. The prepare cost typically pays off once the statement is reused in a loop, roughly 50 iterations or more; benchmark with your own driver.

Related

Is prepared statement significantly slower than single query for large number of inserts? [duplicate]

I'm inserting a few million rows into a MySQL table. I'm using a prepared statement, as shown below.
Would creating a single insert string like the one directly below be expected to be substantially faster?
Single-string approach, from "Is 22 seconds a good time for inserting 500 rows in mysql?":
INSERT INTO example
(example_id, name, value, other_value)
VALUES
(100, 'Name 1', 'Value 1', 'Other 1'),
(101, 'Name 2', 'Value 2', 'Other 2'),
(102, 'Name 3', 'Value 3', 'Other 3'),
(103, 'Name 4', 'Value 4', 'Other 4');
What I'm currently doing:
//
// method to do upload
//
public static void doUpload(Connection conn) {
    log.info("Deleting existing data...");
    Database.update("truncate table attribute", conn);
    log.info("Doing inserts");
    String sqlString = "insert into attribute values (null,?,?,?)";
    int max = 1000000;
    PreparedStatement ps = Database.getPreparedStatement(sqlString, conn);
    for (int i = 0; i < max; i++) {
        // add params
        String subjectId = i + "";
        addParam(subjectId, "GENDER", getGender(), ps);
        addParam(subjectId, "AGE", getAge(), ps);
        addParam(subjectId, "CITY", getCity(), ps);
        addParam(subjectId, "FAVORITE_COLOR", getColor(), ps);
        addParam(subjectId, "PET", getPet(), ps);
        if (i % 1000 == 0) {
            log.info("Executing " + i + " of " + max);
            Database.execute(ps);
            log.info("Done with batch update");
            ps = Database.getPreparedStatement(sqlString, conn);
        }
    }
    if (!Database.isClosed(ps)) {
        Database.execute(ps);
    }
}
//
// method to add param to the prepared statement
//
private static void addParam(String subjectId, String name, String val, PreparedStatement ps) {
    ArrayList<String> params = new ArrayList<String>();
    params.add(subjectId);
    params.add(name);
    params.add(val);
    Database.addToBatch(params, ps);
}
//
// addToBatch
//
public static void addToBatch(List<String> params, PreparedStatement ps) {
    try {
        for (int i = 0; i < params.size(); i++) {
            ps.setString(i + 1, params.get(i));
        }
        ps.addBatch();
    } catch (Exception exp) {
        throw new RuntimeException(exp);
    }
}
What is the fastest way to do this type of insert?
I'm currently inserting 1000 rows in about 5 seconds. Is it reasonable to expect much better than this?
I'm running locally and have already dropped all indexes on the table I'm inserting into.
The fastest way to do batch inserts with JDBC is to use addBatch / executeBatch, which you appear to be doing already. But that will only get you so much performance.
For a real performance boost, add rewriteBatchedStatements=true to your JDBC URL. You will see a significant improvement. See MySQL and JDBC with rewriteBatchedStatements=true
Keep in mind that your "single string approach" is similar in spirit, but rewriteBatchedStatements=true also makes the network communication with the database more efficient.
I'm not sure what Database.getPreparedStatement is doing, but you usually do not need to recreate the PreparedStatement object after each batch execution; you can keep reusing it. Also, have you tried a batch size larger than your current 1000?
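A sketch of that reuse pattern, keeping the question's SQL and helpers, with the batch size of 1000 as an assumption:
PreparedStatement ps = conn.prepareStatement("insert into attribute values (null,?,?,?)");
for (int i = 0; i < max; i++) {
    ps.setString(1, String.valueOf(i));
    ps.setString(2, "GENDER");
    ps.setString(3, getGender());
    ps.addBatch();
    if (i % 1000 == 0) {
        ps.executeBatch(); // flush; the same statement stays prepared and reusable
    }
}
ps.executeBatch(); // flush the remainder
ps.close();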
Prepared statements also offer the advantage of security (resistance to SQL injection), and since the statement is precompiled it should in theory still offer better performance.

Reading rows from Java of a table with 3000000 rows

I want to read the rows of a table with 3000000 rows. I have used st.setFetchSize(10000), thinking I would get the 3000000 rows in packages of 10000, but I only get the first 10000 rows and the program ends. Could anyone please tell me how to get all 3000000 rows in packages of 10000?
public class InsertBatch {
    public static void main(String[] args) throws SQLException {
        try (Connection connection = DriverManager.getConnection("jdbc:postgresql://localhost:5432/postgres", "postgres", "root")) {
            connection.setAutoCommit(false);
            Statement st = connection.createStatement(
                ResultSet.TYPE_FORWARD_ONLY,
                ResultSet.CONCUR_READ_ONLY,
                ResultSet.FETCH_FORWARD
            );
            System.out.println(new Date());
            st.setFetchSize(10000);
            System.out.println("start query ");
            ResultSet rs = st.executeQuery("SELECT * FROM contratacion");
            System.out.println("done query ");
            String insert = "INSERT INTO contrato(contrato, codigo_postal,cups) VALUES(?, ?, ?)\n" +
                "ON CONFLICT (contrato) DO\n" +
                "UPDATE SET codigo_postal = excluded.codigo_postal, cups = excluded.cups";
            PreparedStatement pst = connection.prepareStatement(insert);
            int cont = 0;
            while (rs.next()) {
                cont++;
                Integer contrato = rs.getInt(1);
                Integer codigo_postal = rs.getInt(2);
                String cups = rs.getString(3);
                pst.setInt(1, contrato);
                pst.setInt(2, codigo_postal);
                pst.setString(3, cups);
                pst.executeUpdate();
                connection.commit();
                System.out.println(cont);
            }
            System.out.println(new Date());
        } catch (SQLException ex) {
        }
    }
}
Please read the documentation, i.e. the javadoc of createStatement(int resultSetType, int resultSetConcurrency, int resultSetHoldability):
Parameters:
resultSetType - one of the following ResultSet constants: ResultSet.TYPE_FORWARD_ONLY, ResultSet.TYPE_SCROLL_INSENSITIVE, or ResultSet.TYPE_SCROLL_SENSITIVE
resultSetConcurrency - one of the following ResultSet constants: ResultSet.CONCUR_READ_ONLY or ResultSet.CONCUR_UPDATABLE
resultSetHoldability - one of the following ResultSet constants: ResultSet.HOLD_CURSORS_OVER_COMMIT or ResultSet.CLOSE_CURSORS_AT_COMMIT
Your code is:
Statement st = connection.createStatement(
    ResultSet.TYPE_FORWARD_ONLY, // Good
    ResultSet.CONCUR_READ_ONLY,  // Good
    ResultSet.FETCH_FORWARD      // BAD !!!!!
);
As you can see, the 3rd parameter is not one of the valid values.
Since you call connection.commit(); inside the while(rs.next()) loop, it'd work much better for you if you pass ResultSet.HOLD_CURSORS_OVER_COMMIT.
Of course, you shouldn't even be doing that, because 3000000 INSERT statements will take forever, especially if you commit each one individually. Yikes!
If you have to, because you need to process the data in Java, at least use batching.
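If you do keep the Java round-trip, a hedged sketch of what batching could look like here, reusing the question's statement objects (the batch size of 10000 is an arbitrary choice, and committing mid-stream assumes the holdability fix above):
connection.setAutoCommit(false);
int count = 0;
while (rs.next()) {
    pst.setInt(1, rs.getInt(1));
    pst.setInt(2, rs.getInt(2));
    pst.setString(3, rs.getString(3));
    pst.addBatch();
    if (++count % 10000 == 0) {
        pst.executeBatch();  // send 10000 upserts in one round-trip
        connection.commit(); // keep the transaction bounded
    }
}
pst.executeBatch();          // flush the remainder
connection.commit();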
Instead, just write it as a single statement:
INSERT INTO contrato ( contrato, codigo_postal, cups )
SELECT contrato, codigo_postal, cups
FROM contratacion
ON CONFLICT (contrato) DO UPDATE
SET codigo_postal = excluded.codigo_postal
, cups = excluded.cups

How to insert List<String[]> data into database using JDBC?

The current format of my List<String[]> is:
60 52 0 0 1512230400
76 52 1 1 1514044800
42 52 4 1 1516464000
Each space-separated line is a row in my database table, for example: 60 52 0 0 1512230400. I want to insert the 5 separate values per loop iteration. I want to insert all these lines into my database but am not sure exactly how. The database connection itself already works.
This is my rough idea:
String query = "INSERT INTO games (team1_id, team2_id, score1, score2, created_at) VALUES (? ,?, ?, ?, ? )";
Connection con = DBConnector.connect();
PreparedStatement stmt = con.prepareStatement(query);//prepare the SQL Query
for (String[] s : fixtures) {
}
Any help is amazing.
Many thanks
In your for-loop, you can do something like this:
stmt.setString(1, s[0]); //team1_id if it's of string type in db
stmt.setInt(2, Integer.parseInt(s[1])); //team2_id if it's of type integer in db
stmt.setInt(3, Integer.parseInt(s[2])); //score1
stmt.setInt(4, Integer.parseInt(s[3])); //score2
stmt.setLong(5, Long.parseLong(s[4])); //created_at
stmt.executeUpdate();
The above code shows how to deal with String, Long and Integer; you can handle other types similarly.
List<String[]> fixtures = new ArrayList<>();
fixtures.add(new String[] {"60","52","0","0","1512230400"});
fixtures.add(new String[] {"76","52","1","1","1514044800"});
fixtures.add(new String[] {"42","52","4","1","1516464000"});
String query =
    "INSERT INTO games (team1_id, team2_id, score1, score2, created_at)\n"
    + " VALUES (?, ?, ?, ?, ?)";
try (
    Connection con = DBConnector.connect();
    PreparedStatement stmt = con.prepareStatement(query);
) {
    for (String[] s : fixtures) {
        stmt.setString(1, s[0]);
        stmt.setString(2, s[1]);
        stmt.setString(3, s[2]);
        stmt.setString(4, s[3]);
        stmt.setString(5, s[4]);
        stmt.execute();
    }
    con.commit();
}
With this approach, we pass the bind variables as strings. If needed, based on the actual types of the target columns, conversion from string (VARCHAR) to numeric (NUMBER) will be done by the database.
You got basically all of it right, but didn't take the next step of actually setting the bind-variables ...
This can work if the input List is already created:
List<String[]> fixtures = ...; // assuming this data is already created
String query = "INSERT INTO games (team1_id, team2_id, score1, score2, created_at) VALUES (?, ?, ?, ?, ?)";
try (Connection con = DBConnector.connect();
     PreparedStatement stmt = con.prepareStatement(query)) {
    for (String[] row : fixtures) {
        // This gets executed for each row insert
        for (int i = 0; i < row.length; i++) {
            stmt.setInt(i + 1, Integer.parseInt(row[i]));
        }
        stmt.executeUpdate();
    }
} catch (SQLException ex) {
    ex.printStackTrace();
    // code that handles the exception...
}

How to insert about 500,000 data rows into a table efficiently

I have about 500,000 rows of data to insert into one table.
I am currently inserting them one at a time (I know it's bad), like this:
DAO method:
public static final String SET_DATA = "insert into TABLE (D_ID, N_ID, VALUE, RUN_ID) " + "values (?, ?, ?, ?)";

public void setData(String dId, String nId, BigDecimal value, Run run) throws HibernateException {
    if (session == null) {
        session = sessionFactory.openSession();
    }
    SQLQuery select = session.createSQLQuery(SET_DATA);
    select.setString(0, dId);
    select.setString(1, nId);
    select.setBigDecimal(2, value);
    select.setLong(3, run.getRunId());
    select.executeUpdate();
}
How can I do this more efficiently?
Why did you go for a hand-written SQL query? If you write SQL that way, you are not getting the benefits of Hibernate.
Learn the batch insert pattern. Example code for a batch insert:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < 100000; i++) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if (i % 20 == 0) { // 20, same as the JDBC batch size
        // flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}
tx.commit();
session.close();
The fastest way is always to use the native bulk-import tool of your DBMS.
Do not use Hibernate or Java for that.
Dump the data into some format your DB understands (most probably on the same file system as your DB) and use your DBMS's native import tool.
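For example, a sketch assuming MySQL's LOAD DATA (Oracle has SQL*Loader, PostgreSQL has COPY); the file path and table name are hypothetical:
try (Statement st = conn.createStatement()) {
    // bulk-load a CSV server-side, bypassing per-row JDBC overhead entirely
    st.execute("LOAD DATA INFILE '/tmp/rows.csv' INTO TABLE my_table " +
               "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'");
}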
Ideally you should use a batch insert. Refer to the example provided here; it inserts multiple records into the DB in a single round-trip.
dbConnection.setAutoCommit(false); // commit the transaction manually
String insertTableSQL = "INSERT INTO DBUSER"
        + "(USER_ID, USERNAME, CREATED_BY, CREATED_DATE) VALUES"
        + "(?,?,?,?)";
PreparedStatement preparedStatement = dbConnection.prepareStatement(insertTableSQL);
for (int i = 0; i < 500000; i++) {
    preparedStatement.setInt(1, 101);
    preparedStatement.setString(2, "mkyong101");
    preparedStatement.setString(3, "system");
    preparedStatement.setTimestamp(4, getCurrentTimeStamp());
    preparedStatement.addBatch();
}
preparedStatement.executeBatch();
dbConnection.commit();
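One caveat worth adding: 500,000 rows queued in a single batch can use a lot of client memory. A common refinement (sketch; the flush interval of 10000 is an arbitrary assumption) is to execute the batch periodically:
for (int i = 0; i < 500000; i++) {
    // ... set the four parameters as above ...
    preparedStatement.addBatch();
    if (i % 10000 == 0) {
        preparedStatement.executeBatch(); // flush to keep client memory bounded
    }
}
preparedStatement.executeBatch(); // flush the remainder
dbConnection.commit();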
1. Solution (note: this snippet is C#, but the approach translates directly to Java). Beware that concatenating values into the SQL string like this is open to SQL injection; prefer bind parameters where you can.
StringBuilder sb = new StringBuilder();
sb.AppendLine("insert into Table_Name (column1, column2, column3, column4) values ");
foreach (var item in req)
{
    sb.AppendFormat("({0},{1},{2},'{3}'),",
        item.val1, item.val2, item.val3, item.val4);
}
sb = sb.Remove(sb.Length - 1, 1); // drop the trailing comma
ExecuteNonQuery(sb.ToString());
return true;
If the record count is greater than 1000, build separate statements instead:
StringBuilder sb = new StringBuilder();
foreach (var item in req)
{
    sb.AppendLine("insert into Table_Name (column1, column2, column3, column4) values ");
    sb.AppendFormat("({0},{1},{2},'{3}');",
        item.val1, item.val2, item.val3, item.val4);
}
sb = sb.Remove(sb.Length - 1, 1); // drop the trailing semicolon
ExecuteNonQuery(sb.ToString());
return true;

Getting ExecuteBatch to execute faster

I'm trying to read a table from a Sybase server, process the rows, and output the results to another table. (Below is my code.)
The code retrieves the table pretty fast and processes equally fast (it gets to the sending part within 30 seconds). But when I run executeBatch, it sits there for 20 minutes before finishing (for a test table with 8400 rows).
Is there a more efficient way to do this? I'm flexible about how I receive or send the queries (I can create a new table, update a table, etc.); I just don't know why this is so slow (I'm sure the data is < 1 MB, and I'm sure the SQL server doesn't take 20 minutes to parse 8400 rows). Any ideas?
Note: the reason this is really bad for me is that I eventually have to process a table with 1.2 million rows (the table I'm working with right now is a test table with 8400 rows).
Connection conn = DriverManager.getConnection(conString, user, pass);
String sql = "SELECT id,dateid,attr from user.fromtable";
Statement st = conn.createStatement();
ResultSet rs = st.executeQuery(sql);
String sqlOut = "INSERT INTO user.mytabletest (id,attr,date,estEndtime) values (?,?,?,?)";
PreparedStatement ps = conn.prepareStatement(sqlOut);
int i = 1;
while (rs.next()) {
    int date = rs.getInt("dateid");
    String attr = rs.getString("attr");
    String id = rs.getString("id");
    Time tt = getTime(date, attr);
    Timestamp ts = new Timestamp(tt.getTime());
    ps.setString(1, id);
    ps.setString(2, attr);
    ps.setInt(3, date);
    ps.setTimestamp(4, ts);
    ps.addBatch();
    if (i % 10000 == 0) {
        System.out.println(i);
        ps.executeBatch();
        conn.commit();
        ps.clearBatch();
    }
    i++;
}
System.out.println("sending " + (new Date()));
int[] results = ps.executeBatch();
System.out.println("committing " + (new Date()));
conn.commit();
System.out.println("done " + (new Date()));
To work with batches effectively you should turn auto-commit off and turn it back on after executing the batch (or, alternatively, call connection.commit()):
connection.setAutoCommit(false);
while (rs.next()) {
    .....
    ps.addBatch();
}
int[] results = ps.executeBatch();
connection.setAutoCommit(true); // re-enabling auto-commit also commits the open transaction
Add ?rewriteBatchedStatements=true to the end of your JDBC URL. It will give you a serious performance improvement. Note that this is specific to MySQL and will have no effect with other JDBC drivers.
E.g.: jdbc:mysql://server:3306/db_name?rewriteBatchedStatements=true
It improved my performance by more than 15 times.
I had this same problem and finally figured it out, though I was not able to find the right explanation anywhere.
The answer is that for simple unconditioned inserts, .executeBatch() may not help on its own: with some drivers (notably MySQL without rewriteBatchedStatements=true), batch mode still sends lots of individual "insert into table x ..." statements over the wire, and that is why it runs slowly. If the insert statements were more complex, perhaps with conditions that affect each row differently, individual statements might genuinely be required and batch execution would be useful.
An example of what works: the following builds a single multi-row insert statement as a PreparedStatement (the same concept would apply to a Statement object), and it solves the slowness:
public boolean addSetOfRecords(String tableName, Set<MyObject> objects) {
    StringBuffer sql = new StringBuffer("INSERT INTO " + tableName + " VALUES (?,?,?,?)");
    for (int i = 1; i < objects.size(); i++) {
        sql.append(",(?,?,?,?)"); // one placeholder group per additional row
    }
    try {
        PreparedStatement p = db.getConnection().prepareStatement(sql.toString());
        int i = 1;
        for (MyObject obj : objects) {
            p.setString(i++, obj.getValue());
            p.setString(i++, obj.getType());
            p.setString(i++, obj.getId());
            p.setDate(i++, new java.sql.Date(obj.getRecordDate().getTime()));
        }
        p.execute();
        p.close();
        return true;
    } catch (SQLException e) {
        e.printStackTrace();
        return false;
    }
}
There is a commercial solution from Progress DataDirect to translate JDBC batches into the database's native bulk load protocol to significantly improve performance. It's very popular with SQL Server since it does not require BCP. I am employed by that vendor and wrote a blog on how to bulk insert JDBC batches.
