douyueqing1530 2013-09-27 20:38
Viewed 37 times
Accepted

Inserting large amounts of data into MySQL

I'm currently trying to optimize some code.

What is a better way to insert a large amount of data into a table?

Consider the following code:

$arrayOfdata = execute_query('SELECT c.id FROM table1 c');

It fetches every id from table1 into an array and then inserts each row into the target table:

    private function insertSomeData($arrayOfdata, $other_id) {
        foreach ($arrayOfdata as $data) {
            // build and run one INSERT statement per row
            execute_query(
                'INSERT INTO table (other_id, other2_id, is_transfered)
                 VALUES (' . $other_id . ', ' . $data['id'] . ', 0)'
            );
        }
    }

I know this code is very slow when table1 has 500k rows, so I tried something else: I put everything into one SQL query.

INSERT INTO table (
    other_id,
    other2_id,
    is_transfered
)
SELECT
    'other_id', c.id, 0
FROM table1 c

I read that inserting too much data at once can cause MySQL to slow down or time out. I tried this code with 500k rows on my local machine and it ran well.

Is there anything that could cause a problem when this much data is inserted? Are there other ways to insert faster that would not make the server use too many resources?


7 answers

  • douketangyouzh5219 2013-09-27 21:59

    Assuming that you are using the InnoDB engine (which is the default in most recent MySQL versions), you should simply use transactions: wrap your insert loop in a BEGIN; ... COMMIT; block.

    By default, every statement runs as its own transaction, and the server must make sure the data reaches the disk safely before continuing to the next statement. If you start a transaction, do many inserts, and then commit, the server only has to flush all the data to disk at commit time. On modern hardware, this can amount to just a few disk operations instead of 500k of them.
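    A minimal sketch of that idea in PHP using PDO (the $pdo connection object is an assumption; the table name, $arrayOfdata and $other_id are carried over from the question; error handling is reduced to a rollback):

        <?php
        // Assumption: $pdo is an existing PDO connection with PDO::ERRMODE_EXCEPTION
        // enabled, and $arrayOfdata / $other_id come from the question's code.
        $pdo->beginTransaction();          // one transaction for the whole batch
        try {
            foreach ($arrayOfdata as $data) {
                // still one INSERT per row, but the durable flush happens only at COMMIT
                $pdo->exec('INSERT INTO table (other_id, other2_id, is_transfered)
                            VALUES (' . (int) $other_id . ', ' . (int) $data['id'] . ', 0)');
            }
            $pdo->commit();                // single flush instead of 500k of them
        } catch (Exception $e) {
            $pdo->rollBack();              // undo the partial batch if anything fails
            throw $e;
        }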

    Another consideration is to use prepared statements. The server has to parse every SQL statement before executing it. This parsing does not come for free, and it is not unusual for the parsing time to be more expensive than the actual execution time. Usually this parsing is done every single time, and in your case it would be done 500k times. If you use prepared statements, parsing/preparation is done only once, and the cost of executing the statement is just the disk write (which is reduced further if you are inside an active transaction, because the server can batch that write by delaying it until the transaction commits).
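    A sketch combining the two points with PDO: the statement is prepared once and then executed repeatedly inside a transaction (same assumed $pdo connection and variables as above; note that the MySQL PDO driver emulates prepares by default, so emulation is switched off here to get a real server-side prepare):

        <?php
        // Assumption: $pdo, $arrayOfdata and $other_id as in the previous sketch.
        $pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false); // use server-side prepared statements

        $stmt = $pdo->prepare(
            'INSERT INTO table (other_id, other2_id, is_transfered) VALUES (?, ?, 0)'
        );

        $pdo->beginTransaction();
        foreach ($arrayOfdata as $data) {
            // parsed and planned once by prepare(); only the parameter values change here
            $stmt->execute([$other_id, $data['id']]);
        }
        $pdo->commit();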

    The total improvement from using these methods can be dramatic - I have seen situations where using transactions reduced total run time from 30 minutes to 20 seconds.

    This answer should give you some idea of how to use transactions in PHP with MySQL.

    Accepted as the best answer by the asker.