I am populating two tables, which have a 1-many relationship.
So I insert a line in outer, get the (autoincrement primary key) id for that line, and then insert 100 lines into inner (all with a foreign key pointing to outer.id).
Then I repeat, 50 times. For every entry in outer I have to insert, read id, and then insert into inner.
This is slow. Most of the time is spent in loading the 100 lines into inner. I suspect it would be much faster if I could insert all 50*100 lines into inner in one batch operation. But I cannot see how to do that - how can I make the foreign keys work?
How do other people make this efficient?
I am using Java / Spring. The 100 lines are inserted with a JdbcTemplate.batchUpdate().
public final void insert(final JdbcTemplate db,
        final Iterable<DataBlock> data) {
    String insertSql = getInsertSql();
    String idQuery = getIdQuery();
    ItemRowMapper.IdRowMapper mapper = new ItemRowMapper.IdRowMapper();
    for (DataBlock block: data) {
        Object[] outer = block.getOuter();
        LOG.trace("Loading outer");
        db.update(insertSql, outer);
        LOG.trace("Getting index");
        // currently retrieve index based on natural key, but could use last index
        int id = db.query(idQuery, mapper, uniqueData(outer)).get(0);
        LOG.trace("Getting inner");
        List<Object[]> inner = block.getInner(id);
        // most time spent here
        LOG.trace(format("Loading inner (%d)", inner.size()));
        innerTable.insert(db, inner);
    }
}
And pseudo-SQL:
create table outer (
    id integer primary key autoincrement,
    ...
);
create table inner (
    outer integer references outer(id),
    ...
);
Update - The following appears to work with Spring 3.1.1 and Postgres 9.2-1003.jdbc4.
/**
 * An alternative implementation that should be faster, since it inserts
 * in just two batches (one for outer and one for inner).
 *
 * @param db A connection to the database.
 * @param data The data to insert.
 */
public final void insertBatchier(final JdbcTemplate db,
        final AllDataBlocks data) {
    final List<Object[]> outers = data.getOuter();
    List<Integer> ids = db.execute(
        new PreparedStatementCreator() {
            @Override
            public PreparedStatement createPreparedStatement(
                    final Connection con) throws SQLException {
                return con.prepareStatement(getInsertSql(),
                        Statement.RETURN_GENERATED_KEYS);
            }
        },
        new PreparedStatementCallback<List<Integer>>() {
            @Override
            public List<Integer> doInPreparedStatement(final PreparedStatement ps)
                    throws SQLException {
                for (Object[] outer: outers) {
                    for (int i = 0; i < outer.length; ++i) {
                        setParameterValue(ps, i + 1,
                                SqlTypeValue.TYPE_UNKNOWN, outer[i]);
                    }
                    ps.addBatch();
                }
                ps.executeBatch();
                RowMapperResultSetExtractor<Integer> ids =
                        new RowMapperResultSetExtractor<Integer>(
                                new ItemRowMapper.IdRowMapper());
                try (ResultSet keys = ps.getGeneratedKeys()) {
                    return ids.extractData(keys);
                }
            }
        });
    innerTable.insert(db, data.getInner(ids));
}
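This relies on the JDBC driver returning generated keys in insertion order. The step that pairs those keys back to the child rows (what getInner(ids) does internally here) can be sketched as a pure helper; the names below are hypothetical, not from the code above:

```java
import java.util.ArrayList;
import java.util.List;

public class KeyZip {
    // Pairs each child row (an Object[] of bind parameters) with its
    // parent's generated ID, assuming keys come back in insertion order.
    // The FK is prepended as the first parameter of each child row.
    public static List<Object[]> attachIds(List<Integer> ids,
                                           List<List<Object[]>> childrenPerParent) {
        List<Object[]> flat = new ArrayList<>();
        for (int i = 0; i < ids.size(); i++) {
            for (Object[] child : childrenPerParent.get(i)) {
                Object[] withId = new Object[child.length + 1];
                withId[0] = ids.get(i); // FK column first
                System.arraycopy(child, 0, withId, 1, child.length);
                flat.add(withId);
            }
        }
        return flat;
    }
}
```

The flattened list can then go straight into a single batchUpdate() for the inner table.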
I'm not as familiar with JdbcTemplate, but assuming it is similar to plain JDBC, I would do it with something like the following code (I would probably break this into multiple methods):
private static final int BATCH_SIZE = 50;

public void addBatch(Connection connection, List<Outer> outers) throws SQLException {
    PreparedStatement outerInsertStatement = connection.prepareStatement("...", Statement.RETURN_GENERATED_KEYS);
    PreparedStatement innerInsertStatement = connection.prepareStatement("...");
    List<Integer> outerIds = new ArrayList<Integer>();
    for(Outer outer : outers) {
        outerInsertStatement.setParameter(...);
        ...
        outerInsertStatement.setParameter(...);
        outerInsertStatement.addBatch();
    }
    outerInsertStatement.executeBatch();
    //Note, this line requires JDBC3
    ResultSet primaryKeys = outerInsertStatement.getGeneratedKeys();
    while(primaryKeys.next()) {
        outerIds.add(primaryKeys.getInt(1)); //JDBC columns are 1-indexed
    }
    int pending = 0;
    for(int i = 0; i < outers.size(); i++) {
        Outer outer = outers.get(i);
        Integer outerId = outerIds.get(i);
        for(Inner inner : outer.getInners()) {
            //One of these setParameter calls would use outerId
            innerInsertStatement.setParameter(...);
            ...
            innerInsertStatement.setParameter(...);
            innerInsertStatement.addBatch();
            if(++pending % BATCH_SIZE == 0) {
                innerInsertStatement.executeBatch();
            }
        }
    }
    innerInsertStatement.executeBatch(); //flush the remainder
}
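One detail worth separating out is the flush bookkeeping: the decision to call executeBatch() should depend on how many statements have been added, not on the outer loop index. A tiny counter makes that explicit (hypothetical names, just a sketch):

```java
public class BatchCounter {
    private final int batchSize;
    private int pending = 0;

    public BatchCounter(int batchSize) {
        this.batchSize = batchSize;
    }

    // Call after each addBatch(); returns true when executeBatch() is due.
    public boolean addAndCheck() {
        if (++pending == batchSize) {
            pending = 0;
            return true;
        }
        return false;
    }

    // True if a final executeBatch() is still needed after the loops.
    public boolean hasPending() {
        return pending > 0;
    }
}
```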
Related
I'm using jdbcTemplate to Batch-Insert into 2 tables. The 1st table is easy, and has an ID. The 2nd table has an FK Reference USER_ID which I need to obtain from Table 1 before inserting.
Suppose I have this:
Main Java Code (here I split up into batches <= 1000)
for(int i = 0; i < totalEntries.size(); i++) {
    // Add to Batch-Insert List; if list size ready for batch-insert, or if at the end, batch-persist & clear list
    batchInsert.add(user);
    if (batchInsert.size() == 1000 || i == totalEntries.size() - 1) {
        // 1. Batch is ready, insert into Table 1
        nativeBatchInsertUsers(jdbcTemplate, batchInsert);
        // 2. Batch is ready, insert into Table 2
        nativeBatchInsertStudyParticipants(jdbcTemplate, batchInsert);
        // Reset list
        batchInsert.clear();
    }
}
Method to Batch-Insert into Table 1 (note I'm getting the Seq Val here for USERS_T)
private void nativeBatchInsertUsers(JdbcTemplate jdbcTemplate, final List<UsersT> batchInsert) {
    String sqlInsert_USERS_T = "INSERT INTO PUBLIC.USERS_T (id, password, user_name) " +
            "VALUES (nextval('users_t_id_seq'), ?, ?" +
            ")";
    // Insert into USERS_T using JdbcTemplate's native-SQL batchUpdate() on the string "sqlInsert_USERS_T"
    jdbcTemplate.batchUpdate(sqlInsert_USERS_T, new BatchPreparedStatementSetter() {
        @Override
        public int getBatchSize() {
            return batchInsert.size();
        }
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            ps.setString(1, null);
            ps.setString(2, batchInsert.get(i).getUsername());
            // etc.
        }
    });
}
Method to Batch-Insert into Table 2
private void nativeBatchInsertStudyParticipants(JdbcTemplate jdbcTemplate, final List<UsersT> batchInsertUsers) {
    String sqlInsert_STUDY_PARTICIPANTS_T =
            "INSERT INTO PUBLIC.STUDY_PARTICIPANTS_T (id, study_id, subject_id, user_id) " +
            "VALUES (nextval('study_participants_t_id_seq'), ?, ?, ?" +
            ")";
    // Insert into STUDY_PARTICIPANTS_T using JdbcTemplate's native-SQL batchUpdate()
    jdbcTemplate.batchUpdate(sqlInsert_STUDY_PARTICIPANTS_T, new BatchPreparedStatementSetter() {
        @Override
        public int getBatchSize() {
            return batchInsertUsers.size();
        }
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            // PROBLEM: For Param #4, USER_ID, need to get the USERS_T.ID from Batch-Insert #1
        }
    });
}
When I come to the 2nd Batch-Insert, one of the columns is an FK back to USERS_T.ID which is called STUDY_PARTICIPANTS_T.USER_ID. Is it possible for me to obtain it by keeping the jdbcTemplate.batchUpdate() logic?
Here's the answer.
1) One solution, if you're using jdbcTemplate (Spring JDBC), is to reserve your own ID range in advance. Then provide the manually-calculated IDs for each row yourself. E.g.
@Transactional(readOnly = false, rollbackFor = Exception.class)
public void doMultiTableInsert(List<String> entries) throws Exception {
    // 1. Obtain current Sequence values
    Integer currTable1SeqVal = table1DAO.getCurrentTable1SeqVal();
    Integer currTable2SeqVal = table2DAO.getCurrentTable2SeqVal();
    // 2. Immediately update the Sequences to the calculated final value (this reserves the ID range immediately)
    table1DAO.setTable1SeqVal(currTable1SeqVal + entries.size());
    table2DAO.setTable2SeqVal(currTable2SeqVal + entries.size());
    for(int i = 0; i < entries.size(); i++) {
        // Prepare Domain object...
        UsersT user = new UsersT();
        user.setID(currTable1SeqVal + 1 + i); // Set ID manually
        user.setCreatedDate(new Date());
        // etc.
        StudyParticipantsT sp = new StudyParticipantsT();
        sp.setID(currTable2SeqVal + 1 + i); // Set ID manually
        // etc.
        user.setStudyParticipant(sp);
        // Add to Batch-Insert List
        batchInsertUsers.add(user);
        // If list size ready for Batch-Insert (in this ex. 1000), or if at the end of all entries, perform Batch Insert (both tables) and clear list
        if (batchInsertUsers.size() == 1000 || i == entries.size() - 1) {
            // Part 1: Insert batch into USERS_T
            nativeBatchInsertUsers(jdbcTemplate, batchInsertUsers);
            // Part 2: Insert batch into STUDY_PARTICIPANTS_T
            nativeBatchInsertStudyParticipants(jdbcTemplate, batchInsertUsers);
            // Reset list
            batchInsertUsers.clear();
        }
    }
}
then your Batch-Insert submethods referenced above:
1)
private void nativeBatchInsertUsers(JdbcTemplate jdbcTemplate, final List<UsersT> batchInsertUsers) {
    String sqlInsert = "INSERT INTO PUBLIC.USERS_T (id, password, ... )"; // etc.
    jdbcTemplate.batchUpdate(sqlInsert, new BatchPreparedStatementSetter() {
        @Override
        public int getBatchSize() {
            return batchInsertUsers.size();
        }
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            ps.setInt(1, batchInsertUsers.get(i).getId()); // ID (provided by ourselves)
            ps.setDate(2, batchInsertUsers.get(i).getCreatedDate());
            // etc.
        }
    });
}
2)
private void nativeBatchInsertStudyParticipants(JdbcTemplate jdbcTemplate, final List<UsersT> batchInsertUsers) {
    String sqlInsert = "INSERT INTO PUBLIC.STUDY_PARTICIPANTS_T (id, ... )"; // etc.
    jdbcTemplate.batchUpdate(sqlInsert, new BatchPreparedStatementSetter() {
        @Override
        public int getBatchSize() {
            return batchInsertUsers.size();
        }
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            ps.setInt(1, batchInsertUsers.get(i).getStudyParticipant().getId()); // ID (provided by ourselves)
            // etc.
        }
    });
}
There are ways to get/set Sequence values, e.g. in Postgres it's
SELECT last_value FROM users_t_id_seq; -- GET SEQ VAL
SELECT setval('users_t_id_seq', 621938); -- SET SEQ VAL
Note also that everything is under @Transactional. If there are any exceptions in the method, all data gets rolled back (for all exceptions, rollbackFor = Exception.class). The only thing that doesn't get rolled back is the manual Sequence update. But that's OK; sequences can have gaps.
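The arithmetic of the reservation step is worth spelling out: if the sequence currently stands at currVal and you immediately bump it by n, then the IDs currVal+1 through currVal+n are yours to assign. A sketch (hypothetical names):

```java
public class IdRange {
    // Returns the n IDs reserved by moving a sequence from currVal
    // to currVal + n: currVal+1, currVal+2, ..., currVal+n.
    public static int[] reserve(int currVal, int n) {
        int[] ids = new int[n];
        for (int i = 0; i < n; i++) {
            ids[i] = currVal + 1 + i;
        }
        return ids;
    }
}
```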
2) Another solution, if you're willing to drop down to the PreparedStatement level, is Statement.RETURN_GENERATED_KEYS:
PreparedStatement ps = con.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS);
After you execute ps, the ResultSet will contain your IDs in the order they were created. You can iterate over the ResultSet and store the IDs in a separate list.
while (rs.next()) {
    generatedIDs.add(rs.getInt(1));
}
Remember that in this case you're responsible for your own transaction management. You need conn.setAutoCommit(false); so the batches pile up without being persisted, and then conn.commit(); / conn.rollback();.
Well, I have been updating some legacy code for the last few days.
Explanation:
There is a table CUSTOMER with columns blah1, blah2, blah3, blah4, blah.....
As per our architecture, I need to create an insert statement dynamically which can insert data into any table with any number of columns.
Currently we have the following code.
public void save(Table table, Connection conn) throws Exception {
    PreparedStatement pstmt = null;
    try {
        List<Row> rows = table.getRows();
        String sql = "";
        if(rows != null && !rows.isEmpty()) //NULL CHECK MUST COME FIRST
        {
            for(Row row: rows) //READ EACH ROW
            {
                String columnName = ""; String columnValue = "";
                List<String> params = new ArrayList<String>();
                List<Column> columns = row.getColumns();
                if(columns != null && !columns.isEmpty())
                {
                    for(Column column: columns) //GET EACH COLUMN DATA
                    {
                        columnName += ", "+column.getName();
                        columnValue += ", ?";
                        String value = column.getValue();
                        params.add(value); //ADD VALUE TO PARAMS
                    }
                    //INSERT QUERY
                    sql = "INSERT INTO "+table.getTableName()+" ("+columnName+") VALUES ("+columnValue+")";
                    if(pstmt == null) pstmt = conn.prepareStatement(sql);
                    //POPULATE PREPARED STATEMENT
                    for (int i = 0; i < params.size(); i++) {
                        pstmt.setString(i+1, (String)params.get(i));
                    }
                    pstmt.addBatch();
                }
            }
            pstmt.executeBatch(); //BATCH COMMIT
            conn.commit();
        }
    } catch (Exception e) {
        if (conn != null) {
            conn.rollback();
        }
        throw e;
    }
}
Now, instead of using the typical pstmt.executeBatch(), I want to use Spring's batchUpdate() as follows:
public void save(Table table, Connection conn) throws Exception {
    String sql = createSaveQuery(table); //CREATES the INSERT query
    getJdbcTemplate().batchUpdate(sql.toString(), new BatchPreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement ps, int j) throws SQLException {
            //PROBLEM AREA: How to map this for each insert statement?
            for(int i = 0; i < params.size(); i++){
                ps.setString(i+1, (String)params.get(i));
            }
        }
        @Override
        public int getBatchSize() {
            return 0;
        }
    });
}
But I cannot figure out how to set the params for each insert query. With plain JDBC we can call pstmt.setString(i, params.get(i)); for each row; how do I achieve the same in BatchPreparedStatementSetter?
Any suggestions will be appreciated. If I need to further improve the explanation, please let me know.
I guess it's against the nature of Spring's batch updates to create dynamic SQL queries. That's because of the nature of PreparedStatement (read more here: What does it mean when I say Prepared statement is pre-compiled?). Long story short: PreparedStatements are compiled on the DB side, and they are really fast. If you modify the SQL query, it can't be reused. It is hard to give advice without knowing the actual DB schema, but in general you should have one POJO for the CUSTOMER table (I hope you don't have BLOBs) and you should read and write all fields. It will be faster than an "optimized" dynamic query.
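If the rows genuinely do vary in shape, one workable middle ground (not from the answer above, just a sketch with hypothetical names) is to group rows by their column signature first, so each distinct column set is batched under a single reusable PreparedStatement:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class RowGrouper {
    // Groups rows (column name -> value) by their sorted column-name
    // signature, so each group can share one PreparedStatement and batch.
    public static Map<String, List<Map<String, String>>> bySignature(
            List<Map<String, String>> rows) {
        Map<String, List<Map<String, String>>> groups = new LinkedHashMap<>();
        for (Map<String, String> row : rows) {
            List<String> cols = new ArrayList<>(row.keySet());
            Collections.sort(cols);
            String signature = String.join(",", cols);
            groups.computeIfAbsent(signature, k -> new ArrayList<>()).add(row);
        }
        return groups;
    }
}
```

Each group then yields one INSERT statement whose placeholders match that group's columns, and every row in the group becomes one addBatch() call.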
I have created a custom function to insert data into my MySQL database. The function first creates a query based on the input given. The query will look like INSERT INTO tableName (columnName1, ..., columnNamei) VALUES (?, ..., ?), ..., (?, ..., ?). After that, the PreparedStatement needs to be made, which contains the real values. These need to be added to a batch, because I want to add multiple rows at once (as shown here: Java: Insert multiple rows into MySQL with PreparedStatement). Here is the code:
insertData() Function
public static void insertData(String table, List<HashMap<String, Object>> list) throws SQLException {
    //Create query: make sure all of the rows in the table get the same amount of values passed
    //Prepare colnames string
    String colNamesParsed = "";
    int counter = 1;
    //Iterate over only the first hashmap of the list (THATS WHY ALL THE ROWS NEED TO HAVE THE SAME AMOUNT OF VALUES PASSED)
    for (String colName : list.get(0).keySet()) {
        //Check if it is the last col name
        if (counter != list.get(0).keySet().size()) {
            colNamesParsed = colNamesParsed + colName + ", ";
        }
        else {
            colNamesParsed = colNamesParsed + colName;
        }
        counter++;
    }
    //Now create the place holder for the query variables
    String queryVariablesPlaceholder = "";
    int rowSize = 0;
    for (HashMap<String, Object> row : list) {
        //This part is to check if all row sizes are equal
        if (rowSize == 0) {
            rowSize = row.values().size();
        }
        else {
            //Check if the rowsize is equal for all rows
            if (row.values().size() != rowSize) {
                System.out.println("The rows of the arrays are from a different size");
                return;
            }
        }
        String queryVariablesRow = "(?, ";
        for (int j = 1; j < (row.values().size()-1); j++) {
            queryVariablesRow = queryVariablesRow + "?, ";
        }
        queryVariablesRow = queryVariablesRow + "?)";
        //Make sure the query does not start with a comma
        if (queryVariablesPlaceholder.equals("")) {
            queryVariablesPlaceholder = queryVariablesRow;
        }
        else {
            queryVariablesPlaceholder = queryVariablesPlaceholder + ", " + queryVariablesRow;
        }
    }
    //The MySQL query needs to be built now
    String query = "INSERT INTO " + table + " (" + colNamesParsed + ") VALUES " + queryVariablesPlaceholder + ";";
    System.out.println(query);
    //Init prepared statement
    PreparedStatement statement = con.prepareStatement(query);
    for (HashMap<String, Object> map : list) {
        int varCounter = 1;
        //Iterate over all values that need to be inserted
        for (Object object : map.values()) {
            if (object instanceof Integer) {
                statement.setInt(varCounter, Integer.parseInt(object.toString()));
            }
            else if (object instanceof String) {
                statement.setString(varCounter, object.toString());
            }
            else if (object instanceof Timestamp) {
                statement.setTimestamp(varCounter, parseStringToTimestamp(object.toString()));
            }
            else if (object instanceof Double) {
                statement.setDouble(varCounter, Double.parseDouble(object.toString()));
            }
            System.out.println(varCounter);
            varCounter++;
        }
        //Add row to the batch
        try {
            statement.addBatch();
        }
        catch (SQLException e) {
            e.printStackTrace();
        }
    }
    //Execute the query, which is in fact the batch
    statement.executeBatch();
}
When I want to insert some data in the database, I execute the following code:
Functional part
List<HashMap<String, Object>> list = new ArrayList<>();
for (Object object : listOfObjects) {
    HashMap<String, Object> map = new HashMap<>();
    map.put("columnName1", object.getSomeValue());
    /....../
    map.put("columnName2", object.getSomeOtherValue());
    list.add(map);
}
Functions.insertData("tableName", list);
Creating the dynamic query seems to work perfectly. However, I can't get the statement.addBatch() to work. It keeps giving me the following error:
java.sql.SQLException: No value specified for parameter 9
I don't get it, because I only have 8 parameters to pass in every unit of the batch. My target table has 9 columns, so I tried to add a value for that column, but then it says: No value specified for parameter 10, so it seems like it isn't closing the 'batch unit' or something.
What am I missing here?
Any help is greatly appreciated!
This
INSERT INTO tableName (columnName1, ..., columnNamei) VALUES (?, ..., ?), ..., (?, ...,?)
is a multi-row insert: the prepared statement contains a "?" for every value of every row, and JDBC expects all of them to be bound before each addBatch(). Binding only one row's worth of values is why you get "No value specified for parameter 9".
Use:
INSERT INTO tableName (columnName1, ..., columnNamei) VALUES (?, ..., ?)
and add every row to the batch.
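Building the single-row form is simple enough to factor into a helper; a sketch (hypothetical names) that produces one row of placeholders regardless of how many rows you intend to batch:

```java
import java.util.Collections;
import java.util.List;

public class SqlBuilder {
    // Builds e.g. "INSERT INTO t (a, b) VALUES (?, ?)"; the same
    // statement is then reused for every addBatch() call.
    public static String insertSql(String table, List<String> columns) {
        String names = String.join(", ", columns);
        String marks = String.join(", ", Collections.nCopies(columns.size(), "?"));
        return "INSERT INTO " + table + " (" + names + ") VALUES (" + marks + ")";
    }
}
```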
The db tables that I am using change very often: new columns can be added, which will affect my SQLs.
The solution I was thinking of is to first read the metadata into a map and use it to retrieve the values, something like this.
Read meta data:
public class Dynamic {
    static public final Map<Integer, String> metadata = initMetaData();
    HashMap<String, String> data = new HashMap<String, String>();

    private static Map<Integer, String> initMetaData() {
        Map<Integer, String> tmpMap = new HashMap<Integer, String>();
        try {
            Connection connection = DBConnection.getConnection();
            try {
                Statement stmt = connection.createStatement();
                ResultSet result = stmt.executeQuery("SELECT * FROM TMP WHERE ROWNUM = 1");
                for (int i = 1; i <= result.getMetaData().getColumnCount(); i++) {
                    tmpMap.put(new Integer(i), result.getMetaData().getColumnName(i));
                }
            } finally {
                connection.close();
            }
        } catch (SQLException ex) {
            // ...
        }
        return Collections.unmodifiableMap(tmpMap);
    }

    public static String getColumnName(Integer index) {
        return metadata.get(index);
    }

    public void setData(String column, String value) {
        data.put(column, value);
    }
}
And when running the SQL:
public static void test() {
    try {
        Connection connection = DBConnection.getConnection();
        try {
            Statement stmt = connection.createStatement();
            ResultSet result = stmt.executeQuery("SELECT * FROM TMP where idx = 'R5'");
            while (result.next()) {
                Dynamic d = new Dynamic();
                for (int i = 1; i <= Dynamic.metadata.size(); i++) {
                    d.setData(Dynamic.getColumnName(i), result.getString(Dynamic.getColumnName(i)));
                }
            }
        } finally {
            connection.close();
        }
    } catch (SQLException ex) {
        // ...
    }
}
With this approach I have two problems (that I notice):
1) I need to execute two loops.
2) I don't know which getter to use on the ResultSet, since the type can also change.
How can I overcome those problems?
I would also appreciate other suggestions; maybe there is a simpler way.
Thanks
1) What is your alternative to the inner loop? How else would you get the field values? Do you really think there is much overhead in looping over a small number of integers?
2) You can get the field's data type from the same metadata that gives you the field name, and choose the getter method accordingly.
3) You could build a map covering multiple tables (table/fieldSeq/fieldType/methodName and maybe a few extra details) so that you don't have to fetch them all dynamically every time.
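For point 2, java.sql.Types (available from ResultSetMetaData.getColumnType(i)) can drive the choice of getter. A sketch of the dispatch table (only common types covered; adjust for your schema):

```java
import java.sql.Types;

public class TypeDispatch {
    // Maps a java.sql.Types code to the name of the ResultSet getter
    // one would call for that column; common cases only.
    public static String getterFor(int sqlType) {
        switch (sqlType) {
            case Types.INTEGER:
            case Types.SMALLINT:
                return "getInt";
            case Types.BIGINT:
                return "getLong";
            case Types.NUMERIC:
            case Types.DECIMAL:
                return "getBigDecimal";
            case Types.DOUBLE:
            case Types.FLOAT:
                return "getDouble";
            case Types.TIMESTAMP:
                return "getTimestamp";
            case Types.DATE:
                return "getDate";
            default:
                return "getString"; // safe fallback for most drivers
        }
    }
}
```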