I'm running into an issue trying to insert a batch of roughly 10,000 records into an Oracle 11g table with 13 columns. The insert is blocked outright if I try more than 1,000 records, and even a fine-grained test with a batch insert of 500 records is too slow.
The batch generates huge trace files on the database server due to ORA-4031 errors. The SQL statement running in each session is over 80,000 lines long and takes over 42,000 bind variables as input.
Driver: oracle.jdbc.driver.OracleDriver
I'm trying to satisfy the requirement that if any data for an insert is invalid, the whole insert must be rolled back.
My question is:
Is there a way to tell MyBatis to divide the incoming bulk batch into small batches of, say, 10 records each, insert them, roll back all of the batches if any one of them fails, and commit only when all of the batches have been inserted successfully?
Or is there an alternative approach to this?
Below is the xml mapping for the insert:
<insert id="insertToTable" parameterType="java.util.List">
INSERT ALL
<foreach item="line" collection="list" >
<foreach item="lineItem"
collection="line.entrySet()" open="" close="" separator="">
into
TABLE_TEST_T
(col1, col2, col3, col4, col5,
col6, col7, col8, col9, col10,
col11, col12, col13)
values(
<!-- #{lineItem.item1, jdbcType=DATE}, -->
#{lineItem.item1, jdbcType=VARCHAR},
#{lineItem.item2, jdbcType=VARCHAR},
#{lineItem.item3, jdbcType=VARCHAR},
#{lineItem.item4, jdbcType=NUMERIC},
#{lineItem.item5, jdbcType=NUMERIC},
#{lineItem.item6, jdbcType=NUMERIC},
#{lineItem.item7, jdbcType=NUMERIC},
#{lineItem.item8, jdbcType=NUMERIC},
#{lineItem.item9, jdbcType=NUMERIC},
#{lineItem.item10, jdbcType=NUMERIC},
#{lineItem.item11, jdbcType=NUMERIC},
#{lineItem.item12, jdbcType=NUMERIC},
#{lineItem.item13, jdbcType=NUMERIC}
)
</foreach>
</foreach>
SELECT * FROM dual
</insert>
Appreciate your feedback.
Yes, you can start a transaction, roll it back if an error happens, and commit it if there are no errors.
There are many ways to manage transactions in Java applications. Since you are using MyBatis, you can look into SqlMapTransactionManager.
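As a minimal sketch of that idea with plain MyBatis 3 (the LineMapper interface and the Map-based row type are assumptions; only the insertToTable statement comes from the mapper XML above): open one session with auto-commit off, run the insert once per small chunk, and commit only after every chunk has succeeded. Each chunk still uses the INSERT ALL statement, just with far fewer rows and bind variables per execution.

```java
import java.util.List;
import java.util.Map;

import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class ChunkedInsert {

    // Hypothetical mapper interface; insertToTable is the statement from the question's XML above.
    public interface LineMapper {
        void insertToTable(List<Map<String, Object>> list);
    }

    public void insertInChunks(SqlSessionFactory factory,
                               List<Map<String, Object>> lines,
                               int chunkSize) {
        try (SqlSession session = factory.openSession(false)) {   // one transaction, manual commit
            LineMapper mapper = session.getMapper(LineMapper.class);
            try {
                for (int from = 0; from < lines.size(); from += chunkSize) {
                    int to = Math.min(from + chunkSize, lines.size());
                    // each call is a small INSERT ALL instead of one 80,000-line statement
                    mapper.insertToTable(lines.subList(from, to));
                }
                session.commit();        // every chunk went in, make them durable
            } catch (RuntimeException e) {
                session.rollback();      // one failing chunk undoes all of them
                throw e;
            }
        }
    }
}
```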
I used a Camel EIP Splitter in a route and wrapped the query in a transaction. More details can be found at
http://camel.465427.n5.nabble.com/Camel-Mybatis-2-13-1-BATCH-of-10-000-records-tt5756211.html
Related
I'm trying to write an INSERT ALL method inside my mapper.
The problem is with the selectKey inside the foreach (it seems I cannot use it there).
If I call a nextVal method from the outside, it always returns the same number.
<select id="nextValKey" resultType="java.lang.Long">
SELECT MY_SEQUENCE.nextVal from dual
</select>
<insert id="insertAll" parameterType="list">
INSERT ALL
<foreach collection="items" item="item" index="i">
<![CDATA[
into MY_TABLE (ID)
values (
#{item.id, jdbcType=DECIMAL}
)
]]>
</foreach>
SELECT * FROM dual
</insert>
If I understand correctly, you generate ids for the items via a call to nextValKey.
The problem is that MyBatis uses the cached value when you invoke the same select statement a second time within the same session.
If you have a query that returns different values each time, you can instruct MyBatis to clear the cache after statement execution (by default this is off for select and on for insert, update and delete):
<select id="nextValKey" resultType="java.lang.Long" flushCache="true">
SELECT MY_SEQUENCE.nextVal from dual
</select>
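For illustration, a small sketch of the effect (SequenceMapper is a hypothetical interface bound to the statement above, not part of your code):

```java
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

// Hypothetical mapper interface bound to the <select id="nextValKey"> above.
interface SequenceMapper {
    Long nextValKey();
}

class SequenceDemo {
    static void printTwoKeys(SqlSessionFactory sqlSessionFactory) {
        try (SqlSession session = sqlSessionFactory.openSession()) {
            SequenceMapper mapper = session.getMapper(SequenceMapper.class);
            Long first = mapper.nextValKey();
            Long second = mapper.nextValKey();
            // with flushCache="true" these differ; without it, the second call is
            // served from the local session cache and repeats the first value
            System.out.println(first + ", " + second);
        }
    }
}
```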
I don't know Java or MyBatis.
However, why would you want to use INSERT ALL in this case? It is usually used when you want to insert rows into different tables, using the same INSERT statement.
In your case, though - as far as I understand it - all you do is (pseudocode)
insert into my_table (id) a_sequence_of_numbers
If that's so, and since that "sequence_of_numbers" gets its values from my_sequence, then just do it as
insert into my_table (id)
select my_sequence.nextval
from dual
connect by level <= 10; -- it would insert 10 numbers
[EDIT: how to insert a bunch of values]
You'd do it as simply as this:
SQL> create table my_table (id number);
Table created.
SQL> set timing on
SQL>
SQL> insert into my_table
2 select level from dual
3 connect by level <= 10000;
10000 rows created.
Elapsed: 00:00:00.02
SQL>
Or, if you insist on a sequence you created,
SQL> insert into my_table
2 select seqa.nextval from dual
3 connect by level <= 10000;
10000 rows created.
Elapsed: 00:00:00.08
SQL>
So I am trying to do a batch insert using MyBatis; below is my XML mapper:
<insert id="insertMessages" parameterType="java.util.List">
INSERT INTO Messages(user, id, month,
key, value, name,
message, dateCreated, createdBy)
VALUES
<foreach item="item" collection="list" open="(" separator="),(" close=")">
#{item.user},
#{item.id},
#{item.month},
#{item.key},
#{item.value},
#{item.name},
#{item.message},
GETDATE(),
#{item.createdBy}
</foreach>
</insert>
Right now it only works when the batch size is 200 or less. Anything more than 200 gets the error:
The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Too many parameters provided in the RPC request. The maximum is 2100.
This is caused by SQL Server not accepting more than 2,100 parameters: MyBatis builds the query with "?" placeholders and then passes the parameters to it. In my case there are 8 parameters per row, so a batch of 300 means 2,400 parameters are passed to the DB. I can keep adjusting the batch insert size, but there is probably a better way. What can I do to avoid hitting this parameter limit on the SQL Server side, without changing SQL Server? Thanks!
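One workaround along the lines already described, sketched below (Message and MessageMapper are hypothetical names for the row bean and mapper interface; only the insertMessages statement comes from the XML above): derive the chunk size from the limit and call the statement once per chunk. With 8 bind variables per row (GETDATE() is not a parameter), anything up to about 260 rows per statement stays under 2,100.

```java
import java.util.List;

import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class ChunkedMessageInsert {

    private static final int MAX_PARAMS = 2000;                         // stay safely under SQL Server's 2100
    private static final int PARAMS_PER_ROW = 8;                        // 8 bind variables per row
    private static final int CHUNK_SIZE = MAX_PARAMS / PARAMS_PER_ROW;  // 250 rows per statement

    // Hypothetical row bean (user, id, month, key, value, name, message, createdBy).
    public static class Message { }

    // Hypothetical mapper interface bound to the insertMessages statement above.
    public interface MessageMapper {
        void insertMessages(List<Message> list);
    }

    public void insertAll(SqlSessionFactory factory, List<Message> messages) {
        try (SqlSession session = factory.openSession(false)) {         // manual commit
            MessageMapper mapper = session.getMapper(MessageMapper.class);
            for (int from = 0; from < messages.size(); from += CHUNK_SIZE) {
                int to = Math.min(from + CHUNK_SIZE, messages.size());
                mapper.insertMessages(messages.subList(from, to));      // at most 2,000 parameters
            }
            session.commit();   // commit once; closing without commit rolls the chunks back
        }
    }
}
```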
I am trying to insert 1,000 records into a table (Oracle DB) using MyBatis as a batch operation (using ExecutorType.BATCH in SqlSessionTemplate and a <foreach> tag in the mapper XML).
While executing the mapper function to insert, if there is any error in between, it rolls back completely and skips the rest of the batch insertion process.
Our requirement is to log an error for whichever records fail and continue with the insertion of the remaining records.
Is there any option available in a MyBatis batch insert to achieve this?
Sample query used in mapper XML:
<insert id="addSampleBatch" parameterType='java.util.Map'>
INSERT ALL
<foreach collection="sampleList" item="vehicle">
INTO
vehicle
(id,name)
VALUES
(#{vehicle.id},
#{vehicle.name} )
</foreach>
SELECT * FROM dual
</insert>
MyBatis version: 3.2.8
mybatis-spring jar version: 1.2.2
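There is no built-in option in a MyBatis batch insert that skips the failing rows of a single INSERT ALL statement and keeps going. One way to get the behaviour described is sketched below; note that it trades the multi-row statement for one statement per record, and Vehicle, VehicleMapper and the single-row insertVehicle statement are assumptions rather than anything from the mapper above:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.ibatis.exceptions.PersistenceException;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class VehicleLoader {

    // Hypothetical row bean matching the columns in the XML above.
    public static class Vehicle {
        public Long id;
        public String name;
    }

    // Hypothetical mapper with a single-row variant of the insert.
    public interface VehicleMapper {
        void insertVehicle(Vehicle vehicle);
    }

    /** Inserts what it can, logs each failure, and returns the records that failed. */
    public List<Vehicle> insertSkippingFailures(SqlSessionFactory factory, List<Vehicle> vehicles) {
        List<Vehicle> failed = new ArrayList<>();
        try (SqlSession session = factory.openSession(false)) {
            VehicleMapper mapper = session.getMapper(VehicleMapper.class);
            for (Vehicle v : vehicles) {
                try {
                    mapper.insertVehicle(v);
                } catch (PersistenceException e) {
                    // Oracle rolls back only the failed statement, so the rest of the
                    // transaction is still intact; log the bad record and carry on.
                    System.err.println("Insert failed for vehicle " + v.id + ": " + e.getMessage());
                    failed.add(v);
                }
            }
            session.commit();   // keep every record that was inserted successfully
        }
        return failed;
    }
}
```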
<insert id="insertIntoScheduleReportUserLink" parameterType="com.mypackage.model.ScheduleReport">
<selectKey keyProperty="id" resultType="java.lang.Integer" order="BEFORE">
select NEXTVAL('schedule_report_user_link_sequence')
</selectKey>
INSERT INTO schedule_report_user_link(
id, schedule_report_detail_id, to_user_id)
<foreach collection="selectedUsers" item="user" separator=",">
VALUES (#{id}, #{scheduleReportDetail.id}, #{user.id})
</foreach>;
</insert>
Here I am using a foreach loop to do a multi-row insert. I need to know whether selectKey generates a new id for each insert.
Is there any better approach?
The loop runs only over the insert section, not over the key generation part, so it seems the key will be generated only once.
Don't rely on hypotheses; run it for a little data and see for yourself.
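A quick check along those lines, as a sketch only (ScheduleReportMapper, User and the setters are hypothetical names matching the parameterType and mapper XML above):

```java
import java.util.Arrays;

import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

import com.mypackage.model.ScheduleReport;

class SelectKeyCheck {
    static void insertTwoUsers(SqlSessionFactory sqlSessionFactory) {
        try (SqlSession session = sqlSessionFactory.openSession()) {
            ScheduleReportMapper mapper = session.getMapper(ScheduleReportMapper.class);

            ScheduleReport report = new ScheduleReport();
            report.setSelectedUsers(Arrays.asList(new User(1), new User(2)));

            mapper.insertIntoScheduleReportUserLink(report);
            session.commit();

            // <selectKey order="BEFORE"> ran exactly once, so both inserted rows
            // carry this single sequence value in their id column.
            System.out.println("id used for every row: " + report.getId());
        }
    }
}
```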
INSERT INTO [UPLOAD_FILE_RECORD_FIELDS_DATA]([RECORD_ID], [FIELD_ORDER], [FIELD_VALUE], [ERROR_CODE])
select ?,?,?,?
union all
select ?,?,?,?
union all
select ?,?,?,?
I have to insert multiple records into one table, so I am using a query like the one shown above and setting the parameter values, but I am getting error code 77. What is the cause?
The number of records to be inserted is approximately 70,000, so I insert 100 records in one query, call addBatch() on the PreparedStatement, and after 700 such statements I execute the whole batch.
Actually, it was not error code 77; it was the number of updates per statement. So everything is working fine.
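For reference, that is exactly what JDBC hands back: executeBatch() returns an int[] of per-statement update counts, not error codes. A minimal sketch with the statement from the question, shortened to two rows per statement (the dummy values and the assumption that auto-commit is off are mine):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Arrays;

public class RecordFieldBatch {

    private static final String SQL =
        "INSERT INTO UPLOAD_FILE_RECORD_FIELDS_DATA "
      + "(RECORD_ID, FIELD_ORDER, FIELD_VALUE, ERROR_CODE) "
      + "SELECT ?, ?, ?, ? UNION ALL SELECT ?, ?, ?, ?";

    public void insert(Connection connection, long recordId) throws SQLException {
        try (PreparedStatement ps = connection.prepareStatement(SQL)) {
            for (int statement = 0; statement < 3; statement++) {
                // bind the 8 parameters covering the two rows of this statement
                for (int row = 0; row < 2; row++) {
                    int base = row * 4;
                    ps.setLong(base + 1, recordId);                           // RECORD_ID
                    ps.setInt(base + 2, statement * 2 + row);                 // FIELD_ORDER
                    ps.setString(base + 3, "value-" + statement + "-" + row); // FIELD_VALUE
                    ps.setString(base + 4, null);                             // ERROR_CODE
                }
                ps.addBatch();
            }
            // one element per batched statement: here [2, 2, 2], i.e. rows inserted
            // per statement, which is what looked like a mysterious "error code".
            int[] updateCounts = ps.executeBatch();
            connection.commit();   // assumes auto-commit was disabled on the connection
            System.out.println(Arrays.toString(updateCounts));
        }
    }
}
```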