douyueqing1530
2013-09-28 04:38
Views: 29
Accepted

MySQL: inserting large amounts of data

I'm currently trying to optimize some code.

What is a better way to insert a large amount of data into a table?

Consider that this code is running:

$arrayOfdata = execute query (SELECT c.id FROM table1 c)

It fetches all the rows from table1, stores them in an array, and then inserts them one by one into the target table:

private function insertSomeData($arrayOfdata, $other_id) {
    foreach ($arrayOfdata as $data) {
        // One INSERT statement (and one server round trip) per row --
        // this is what makes the loop slow for 500k rows.
        $sql = 'INSERT INTO `table` (other_id, other2_id, is_transfered)'
             . ' VALUES (' . (int)$other_id . ', ' . (int)$data['id'] . ', 0)';
        $this->db->query($sql); // $this->db: whatever DB handle the class holds
    }
}

I know that if table1 holds 500k rows this code is very slow, so I tried something like this instead: I put everything into one SQL query.

INSERT INTO `table` (
    other_id,
    other2_id,
    is_transfered
)
SELECT
    :other_id, c.id, 0   -- :other_id stands for the value of $other_id
FROM table1 c
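
Run from PHP, that single statement could look like the following minimal sketch; the $pdo connection (and its credentials) and the :other_id bound parameter are assumptions, while the table names come from the query above:

    // Assumption: $pdo is an existing PDO connection to the database.
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

    $stmt = $pdo->prepare(
        'INSERT INTO `table` (other_id, other2_id, is_transfered)'
      . ' SELECT :other_id, c.id, 0 FROM table1 c'
    );
    // One round trip; MySQL copies all the rows server-side.
    $stmt->execute([':other_id' => $other_id]);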

I read that inserting too much data at once can cause MySQL to slow down or time out. I tried this code with 500k rows of data on my local computer and it runs well.

Is there any way this could cause a problem when a large amount of data is inserted? Are there other ways to insert faster that would not cause the server to use too many resources?



7 Answers

  • douketangyouzh5219 2013-09-28 05:59
    Accepted

    Assuming that you are using the InnoDB engine (which is the default in recent MySQL versions), you should simply use transactions: wrap your insert loop in a BEGIN; ... COMMIT; block.

    By default, every statement runs as its own transaction, and the server must make sure the data safely reaches disk before continuing to the next statement. If you start a transaction, do many inserts, and then commit the transaction, the server only has to flush all the data to disk once, at commit time. On modern hardware, this can amount to just a few disk operations instead of 500k of them.
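
    As a minimal sketch (assuming PDO and the variable names from the question's code), the transaction wrapping could look like this:

        // Wrap the whole insert loop in one transaction so the server
        // flushes to disk only once, at COMMIT.
        $pdo->beginTransaction();               // BEGIN
        try {
            foreach ($arrayOfdata as $data) {
                $pdo->exec('INSERT INTO `table` (other_id, other2_id, is_transfered)'
                         . ' VALUES (' . (int)$other_id . ', ' . (int)$data['id'] . ', 0)');
            }
            $pdo->commit();                     // COMMIT: one flush for all rows
        } catch (Exception $e) {
            $pdo->rollBack();                   // undo everything on error
            throw $e;
        }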

    Another consideration is to use prepared statements. The server has to parse every SQL statement before executing it. This parsing does not come for free, and it is not unusual for the parsing time to be more expensive than the actual query execution time. Normally this parsing is done every single time, so in your case it would be done 500k times. If you use prepared statements, parsing/preparation is done only once, and the cost of executing the statement is only the disk write (which is reduced further if you are inside an active transaction, because the server can batch that write by delaying it until the transaction commits).
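
    A sketch of the prepared-statement variant (again assuming PDO and the question's variable names): the statement is parsed once, then executed once per row inside the transaction:

        // Prepare once: MySQL parses the statement a single time.
        $stmt = $pdo->prepare(
            'INSERT INTO `table` (other_id, other2_id, is_transfered)'
          . ' VALUES (:other_id, :other2_id, 0)'
        );

        $pdo->beginTransaction();
        foreach ($arrayOfdata as $data) {
            // Re-execute with new values; no re-parsing, and the writes
            // are batched until COMMIT.
            $stmt->execute([
                ':other_id'  => $other_id,
                ':other2_id' => $data['id'],
            ]);
        }
        $pdo->commit();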

    The total improvement from using these methods can be dramatic; I have seen situations where using transactions reduced the total run time from 30 minutes to 20 seconds.

    This answer should give you some idea of how to use transactions in PHP with MySQL.

  • View 6 more answers
