String[] insertCompIdarray = insertCompId.split(",");
String[] ttidarray = ttId.split(",");
List<TestInsert> tilist = new ArrayList<TestInsert>();
try {
    // Build one TestInsert per (ticketTypeId, componentId) pair.
    for (int i = 0; i < ttidarray.length; i++) {
        for (int j = 0; j < insertCompIdarray.length; j++) {
            TestInsert ti = new TestInsert();
            ti.setTicketTypeId(ttidarray[i]);
            ti.setTicketCodeType(ticketCode);
            ti.setComponentId(insertCompIdarray[j]);
            tilist.add(ti);
        }
    }

    final int batchSize = 65500;
    sqlMapClient.startTransaction();
    for (int i = 0; i < tilist.size(); i++) {
        if (i % batchSize == 0) {
            sqlMapClient.startBatch();
        }
        sqlMapClient.insert("EFT_TICKET_TYPE_TICKET_CODE.testInsertTicketCodeRelation", tilist.get(i));
        if ((i + 1) % batchSize == 0) {
            sqlMapClient.executeBatch();
        }
    }
    // Flush the trailing partial batch only; when tilist.size() is an exact
    // multiple of batchSize, the last batch has already been executed above.
    if (tilist.size() % batchSize != 0) {
        sqlMapClient.executeBatch();
    }
    sqlMapClient.commitTransaction();
} catch (SQLException e) {
    throw e;
} finally {
    sqlMapClient.endTransaction();
}
<typeAlias alias="inserttest" type="com.hkt.engineering.oneoss.ticketing.domain.TestInsert"/>
<insert id="testInsertTicketCodeRelation" parameterClass="inserttest">
insert into TICKET_TYPE_TICKET_CODE
(
TICKET_TYPE_TICKET_CODE_ID,
TICKET_TYPE_ID,
TICKET_CODE_TYPE,
COMPONENT_ID
)
values
(
TICKET_TYPE_AND_CODE_ID_SEQ.nextVal,
#ticketTypeId#,
#ticketCodeType#,
#componentId#
)
</insert>
With Oracle batch inserts, when batchSize is set to 100,000, data goes missing: out of 1,018,160 rows, only about 300,000 actually end up in the table. With batchSize at 65,000 or below there is no loss and all 1,018,160 rows are inserted.

What is the cause of this data loss?

Also, inserting the full data set takes about 147 s. I would like to shorten this; is there any way to reduce the insert time?
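As a first step on the timing question, it may help to measure how much of the 147 s is spent building `tilist` versus executing the batches, so optimization effort goes to the right phase. A minimal stopwatch helper (the class and method names here are hypothetical, not from the original code) could look like:

```java
public class Stopwatch {
    // Runs a task and returns its elapsed wall-clock time in milliseconds,
    // so each phase (list building, batch execution) can be timed separately.
    static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long ms = timeMillis(() -> {
            long sum = 0;                       // stand-in for the real work
            for (int i = 0; i < 1_000_000; i++) sum += i;
        });
        System.out.println("elapsed ~" + ms + " ms");
    }
}
```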