douruoshen1449 2012-08-07 11:43
261 views
Accepted

PHP script runs slowly, but the server CPU is idle

I have a PHP script to import various data from text files.

The import is quite complex, and my test file has 32,000 entries. Each entry has to be parsed and inserted into a MySQL database.

When I run the script, it takes 30 minutes to finish, and during that time the server CPU is 99% idle.

Is there a way to optimize PHP and MySQL so that they use more of the machine's power?

code:

if ($handle = @fopen($filename, "r")) {

    $canceled = false;
    $item = null;
    $result = null;

    while (!feof($handle)) {
        $line = fgets($handle, 4096);

        // Skip record types '00' and '99' and lines too short to parse.
        if (substr($line, 0, 2) == '00') {
            continue;
        }
        else if (substr($line, 0, 2) == '99') {
            continue;
        }
        else if (strlen($line) < 75) {
            continue;
        }

        Reread:
        if ($canceled) {
            break;
        }

        // Start a new item if the previous one has been closed.
        if ($item == null) {
            $item = new DA11_Item();
            $item->boq_id = $boq_id;
        }
        $result = $this->add_line_into_item($item, $line);

        if ($result == self::RESULT_CLOSED) {
            // Item is complete: save it and start fresh.
            $this->add_item($item);
            $item = null;
        }
        else if ($result == self::RESULT_REREAD) {
            // Line belongs to the next item: save the current one,
            // then process the same line again.
            $this->add_item($item);
            $item = null;
            goto Reread;
        }
        else if ($result == self::RESULT_IGNORD) {
            // Keep the item only if it collected any detail lines.
            if (count($item->details()) > 0) {
                $this->add_item($item);
            }
            $item = null;
        }
    }

    // Save the last item if the file ended without closing it.
    if ($item !== NULL) {
        $this->add_item($item);
    }

    fclose($handle);
}

add_item calls $item->save(), which writes the item to the database.

Thanks and kind regards, viperneo

3 answers

  • dongsanhu4784 2012-08-07 11:48

    One problem you have is that every single insert is a separate request to your DB server, including its response. With 32,000 records you can imagine how much overhead that adds up to. Use bulk inserts for, let's say, 1000 records at once:

    INSERT INTO foo (col1, col2) VALUES
      (1,'2'),
      (3,'4'),
      -- 997 additional rows
      (1999, '2000');
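
    For example, in PHP the parsed rows can be collected and flushed in batches. This is only a sketch, assuming an open mysqli connection in $db; the table, columns, and the $col1/$col2 values are placeholders matching the SQL above:

    $batch = array();

    function flush_batch(mysqli $db, array &$batch) {
        if (count($batch) === 0) {
            return;
        }
        // One round trip for up to 1000 rows instead of one per row.
        $db->query("INSERT INTO foo (col1, col2) VALUES " . implode(',', $batch));
        $batch = array();
    }

    // Inside the parsing loop, instead of saving each item immediately:
    $batch[] = sprintf("(%d,'%s')", (int)$col1, $db->real_escape_string($col2));
    if (count($batch) >= 1000) {
        flush_batch($db, $batch);
    }

    // After the loop, flush the remainder.
    flush_batch($db, $batch);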
    

    Additionally, wrapping all of the inserts in a single transaction may help.
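
    A minimal sketch of that, again assuming a mysqli connection in $db (the table must use InnoDB; MyISAM ignores transactions):

    $db->autocommit(false);   // implicitly starts a transaction
    // ... run the import with its batched inserts here ...
    $db->commit();            // one commit at the end instead of one per insert
    $db->autocommit(true);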

    Update, since you mentioned ActiveRecord: I recommend avoiding any additional abstraction layer for mass-import tasks like this.

    Selected by the asker as the best answer.