douhuang3740 2010-09-01 11:02
Viewed 64 times

XML XPath search and array looping with PHP, memory problem

I'm dealing with large XML files (several megabytes) on which I have to perform various kinds of checks. However, I have a problem with memory and time usage, which grow very quickly. I've tested it like this:

$xml = new SimpleXMLElement($string);
$sum_of_elements = 0.0;

// Sum the value of every <Amt> element in the document.
foreach ( $xml->xpath('//Amt') as $amt ) {
  $sum_of_elements += (double)$amt;
}

Using the microtime() and memory_get_usage() functions, I get the following results when running this code:

  • 5 MB file (7480 Amt elements):
    • execution time 0.69 s
    • Memory usage grows from 10.25 MB to 29.75 MB

That's still quite OK. But with a slightly bigger file, memory and time usage grow dramatically:

  • 6 MB file (8976 Amt elements):
    • execution time 8.53 s
    • Memory usage grows from 10.25 MB to 99.25 MB

The problem seems to be in looping over the result set. I've also tried a for loop instead of foreach, but it makes no difference. Without the loop, memory usage does not grow nearly as much.

Any idea where the problem could be?
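For reference, here is roughly the measurement harness (a minimal sketch; $string is assumed to already hold the file contents, e.g. from file_get_contents()):

// Minimal sketch of the measurement described above.
$mem_before  = memory_get_usage();
$time_before = microtime(true);

$xml = new SimpleXMLElement($string);
$sum_of_elements = 0.0;
foreach ($xml->xpath('//Amt') as $amt) {
    $sum_of_elements += (double)$amt;
}

printf("time: %.2f s, memory: %.2f MB -> %.2f MB\n",
       microtime(true) - $time_before,
       $mem_before / 1048576,
       memory_get_usage() / 1048576);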


1 answer

  • douzhuolong9886 2010-09-01 11:11

    SimpleXML is tree-based and loads the entire document into memory. Calling unset() inside the loop on resources you no longer need marks them for cleanup by PHP's garbage collector and may reduce memory usage somewhat. If that doesn't solve the issue, consider XMLReader for a pull-based (streaming) approach. You won't be able to use XPath, but memory consumption should be significantly lower.
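    A minimal sketch of the XMLReader approach (assuming the file sits on disk as large.xml and the <Amt> elements are not namespaced):

    $reader = new XMLReader();
    $reader->open('large.xml');   // path is an assumption; any stream URI works

    $sum_of_elements = 0.0;
    while ($reader->read()) {
        // Only act on opening <Amt> tags; readString() returns the element's text content.
        if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'Amt') {
            $sum_of_elements += (double)$reader->readString();
        }
    }
    $reader->close();

    Because only the current node is kept in memory, memory usage stays roughly flat regardless of file size.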


