dpkajqd31574096 2017-11-06 15:46
Viewed 91 times

PHP - Allowed memory size exceeded when processing a large dataset

I have a dataset with 999,000 records.

I run a select query and use a while loop to fetch the data, calling array_push to append each retrieved row to one array.

Then I want to process the values in this array in batches of 1000 per loop iteration.

My problem is that when I use array_push with this much data, I get the error:

Fatal Error: Allowed Memory Size of 134217728 Bytes

How can I optimize my code to resolve this problem?

My code is below:

 $sql = "select customer_id";
 $sql .= " from";
 $sql .= "  t_customer t1";
 $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
 $result = $conn->query($sql);
 $customerArray = array();
 while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
    array_push($customerArray, $row);
 }
 // Process the records in batches of 1000
 foreach (array_chunk($customerArray, 1000) as $execCustomerArray) {
   // My code to execute for each batch of records.
   // ....
 }

2 answers

  • doujing2497 2017-11-06 15:54

I'm not sure it will fix anything, but one thing I will say is that pushing all the records into an array one by one is wasteful.

You're using fetch to retrieve them one by one, then adding them all to an array. Why on earth aren't you just using PDOStatement::fetchAll()?

    Example:

     $sql = "select customer_id";
     $sql .= " from";
     $sql .= "  t_customer t1";
     $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
     $result = $conn->query($sql);
     $customerArray = $result->fetchAll(PDO::FETCH_ASSOC);
     // Process the records in batches of 1000
     foreach (array_chunk($customerArray, 1000) as $execCustomerArray) {
       // My code to execute for each batch of records.
       // ....
     }
    

This may not fix your memory issue, because we can't see what the heavy lifting is for each customer record, but I will say that while the while loop you had was clumsy, it is most likely not the cause of your memory problem.
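If the per-batch work doesn't need the whole result set at once, another option is to buffer rows as you fetch them and process each batch of 1000 immediately, so the full 999,000-row array never has to exist. A minimal sketch, assuming the same `$conn` PDO connection from the question; `processBatch()` is a hypothetical stand-in for whatever the real per-batch work is:

```php
// Process rows in batches of 1000 while fetching, instead of
// collecting everything into one array first.
$sql = "select customer_id"
     . " from t_customer t1"
     . " inner join t_mail_address t2 using(mid, customer_id)";
$result = $conn->query($sql);

$batch = array();
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
    $batch[] = $row;
    if (count($batch) === 1000) {
        processBatch($batch); // hypothetical: your per-batch work here
        $batch = array();     // release the batch before fetching more
    }
}
if (!empty($batch)) {
    processBatch($batch);     // handle the final partial batch
}
```

One caveat: with the pdo_mysql driver, results are buffered client-side by default, so you may also need to disable `PDO::MYSQL_ATTR_USE_BUFFERED_QUERY` on the connection for this to actually reduce memory use.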

Depending on whether this is a CLI script or a web page, you could also use an incremental loop together with the MySQL LIMIT clause to implement basic paging for your data, thus preventing it from coming into memory all at once.
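The LIMIT-based paging described above could look roughly like this. A sketch only, again assuming the `$conn` connection and a hypothetical `processBatch()` helper; an ORDER BY is added because LIMIT/OFFSET paging needs a stable row order to avoid skipping or repeating rows:

```php
// Basic LIMIT/OFFSET paging: fetch 1000 rows per query until a
// page comes back short, keeping at most one page in memory.
$pageSize = 1000;
$offset   = 0;
do {
    $sql = "select customer_id"
         . " from t_customer t1"
         . " inner join t_mail_address t2 using(mid, customer_id)"
         . " order by customer_id"
         . " limit :limit offset :offset";
    $stmt = $conn->prepare($sql);
    // LIMIT/OFFSET placeholders must be bound as integers with pdo_mysql
    $stmt->bindValue(':limit',  $pageSize, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset,   PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!empty($rows)) {
        processBatch($rows); // hypothetical: your per-batch work here
    }
    $offset += $pageSize;
} while (count($rows) === $pageSize);
```

Note that large OFFSET values get progressively slower on MySQL; for very big tables, keyset pagination (e.g. `where customer_id > :last_id order by customer_id limit 1000`) scales better.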

