douyueqing1530
2013-09-28 04:38
37 views
Accepted

MySQL: inserting a large amount of data

I'm currently trying to optimize some code.

What is a better way to insert a large amount of data into a table?

Consider this code as it runs:

$arrayOfdata = execute query (SELECT c.id FROM table1)

It fetches all the ids from table1, stores them in an array, and inserts them into the target table one by one:

private function insertSomeData($arrayOfdata, $other_id) {
    foreach ($arrayOfdata as $data) {
        // one INSERT statement per row (query-execution helper implied)
        $this->db->query(
            'INSERT INTO table (other_id, other2_id, is_transfered)
             VALUES (' . $other_id . ', ' . $data['id'] . ', 0)'
        );
    }
}
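A common way to speed up this kind of loop (not from the original post) is to batch many rows into each INSERT instead of issuing one statement per row. A minimal sketch in Python, using the stdlib SQLite driver as a stand-in for MySQL; the table name `target` and the chunk size of 1000 are assumptions for illustration:

```python
import sqlite3

def insert_batched(conn, array_of_data, other_id, chunk_size=1000):
    """Insert rows in multi-row batches instead of one INSERT per row."""
    cur = conn.cursor()
    rows = [(other_id, d["id"], 0) for d in array_of_data]
    for i in range(0, len(rows), chunk_size):
        chunk = rows[i:i + chunk_size]
        # executemany sends the whole chunk at once; MySQL drivers typically
        # rewrite this into a single multi-row INSERT, cutting round trips
        cur.executemany(
            "INSERT INTO target (other_id, other2_id, is_transfered) "
            "VALUES (?, ?, ?)",
            chunk,
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (other_id INT, other2_id INT, is_transfered INT)")
insert_batched(conn, [{"id": i} for i in range(5000)], other_id=7)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 5000
```

Batching also keeps each statement's size bounded, which matters for MySQL's `max_allowed_packet` limit.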

I know this code is very slow when table1 holds 500k rows, so I tried putting it all into one SQL query:

INSERT INTO table (
    other_id,
    other2_id,
    is_transfered
)
SELECT 'other_id', c.id, 0
FROM table1 c

I've read that inserting too much data at once can cause MySQL to slow down or time out. I tried this query on 500k rows on my local machine and it ran well.

Is there any way this could cause a problem when a large amount of data is inserted? Are there other approaches to faster inserts that won't make the server use too many resources?
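One way to keep a very large `INSERT ... SELECT` from holding one huge transaction (an approach not from the post, offered as an assumption) is to copy in bounded key ranges and commit per chunk. A sketch, again using stdlib SQLite as a stand-in for MySQL, with assumed table name `target` and an arbitrary chunk size:

```python
import sqlite3

def copy_in_chunks(conn, other_id, chunk_size=1000):
    """Run INSERT ... SELECT over bounded id ranges so no single
    statement touches the whole source table at once."""
    cur = conn.cursor()
    lo, hi = cur.execute("SELECT MIN(id), MAX(id) FROM table1").fetchone()
    if lo is None:
        return  # source table is empty
    for start in range(lo, hi + 1, chunk_size):
        cur.execute(
            "INSERT INTO target (other_id, other2_id, is_transfered) "
            "SELECT ?, c.id, 0 FROM table1 c "
            "WHERE c.id >= ? AND c.id < ?",
            (other_id, start, start + chunk_size),
        )
        conn.commit()  # committing per chunk keeps each transaction short

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table1 (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE target (other_id INT, other2_id INT, is_transfered INT)")
conn.executemany("INSERT INTO table1 (id) VALUES (?)", [(i,) for i in range(1, 5001)])
copy_in_chunks(conn, other_id=7)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 5000
```

Short transactions reduce lock contention and undo-log growth on the server, at the cost of a few extra statements.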



7 answers
