dongsuyou6938 2018-05-29 09:45
Views: 50

SimpleXML uses a large amount of RAM when writing to a file

I'm running into a problem with my Laravel application. I have a bunch of data in my Postgres DB that I want to write to an XML file, but the command keeps getting killed because it runs out of memory and can't allocate any more.

I'm using Laravel 5.6 with PostgreSQL 10 and SimpleXML.

Below is the piece of code that builds the XML document.

    $startTime = time();

    $feedXml = new SimpleXMLElementExtended(
        '<?xml version="1.0" encoding="UTF-8"?><products></products>',
        LIBXML_NOERROR | LIBXML_NOWARNING
    );

    $storeChannelProducts =
        StoreChannelProduct::where('store_channel_id', $storeChannelId)
                           ->whereNull('channel_data_exclude_reason')
                           ->get();

    /*
     * Increase memory and execution time based on the number of products
     */
    ini_set('memory_limit', (384 + min_max(count($storeChannelProducts) * 2.5, 0, 1728)).'M');
    set_time_limit(min_max(count($storeChannelProducts) * 0.5, 30, 3600));

    // keyBy() returns a new collection, so its result has to be assigned
    $storeChannelProducts = $storeChannelProducts->keyBy('store_channel_product_id');

    $i = 1;
    foreach ($storeChannelProducts as $storeChannelObject) {

        \Log::debug('Feed building after making record ' . $i);
        \Log::debug('Memory usage: ' . memory_get_usage());
        \Log::debug('Time since start = ' . (time() - $startTime));

        $i++;

        if (gc_enabled()) {
            gc_disable();
        }

        // ... addChild()/addAttribute() calls on $feedXml omitted here ...

        $storeChannelProducts->forget($storeChannelObject->store_channel_product_id);
        unset($storeChannelObject, $feedProduct, $product);
    }

    \Log::debug('End of feed building');

    \Log::debug('End of feed building');

In the foreach loop I get the data from my models and use the addChild and addAttribute methods to add the data to my XML file. The collection I loop over consists of about 4,600 products.
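One alternative I've been looking at is streaming the document with XMLWriter instead of building the whole SimpleXML tree in memory. This is only a minimal sketch: the `$products` array and the element/attribute names are placeholders, not my real schema or collection.

```php
<?php
// Minimal sketch: stream the feed straight to disk with XMLWriter, so
// only one product's worth of XML is buffered at a time. $products is
// a stand-in for the real collection; element names are placeholders.
$products = [
    ['id' => 1, 'name' => 'Widget'],
    ['id' => 2, 'name' => 'Gadget'],
];

$writer = new XMLWriter();
$writer->openUri('feed.xml');              // write directly to a file
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('products');

foreach ($products as $p) {
    $writer->startElement('product');
    $writer->writeAttribute('id', (string) $p['id']);
    $writer->writeElement('name', $p['name']);
    $writer->endElement();                 // </product>
    $writer->flush();                      // push buffered bytes to disk
}

$writer->endElement();                     // </products>
$writer->endDocument();
$writer->flush();
```

Because openUri() writes to the file incrementally and flush() is called every iteration, memory use should stay roughly constant no matter how many products there are.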

When I ran the command, the terminal printed "Killed!". I ran it again while watching memory usage in another tab: after about 1 minute the command was using 1000MB of RAM, after about 3 minutes it stabilized around 2370MB, and from 5.5 minutes onwards it started climbing again until it crashed a little past the 8-minute mark, at which point this one command was using 5000MB of RAM.

I already tried a few things, like unsetting all variables that are reassigned on the next iteration of the foreach loop, and calling gc_disable() since I read somewhere that the garbage collector can use a lot of RAM checking all the XML nodes to see which of them are still alive.
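Another thing I'm considering is not calling ->get() at all, since that loads all ~4,600 models into a collection up front; Laravel 5.6 has cursor(), which yields models one at a time. Here is a tiny self-contained sketch of that lazy pattern using a plain generator (fetchRows() is a stand-in I made up for the query, not a real API):

```php
<?php
// Sketch of lazy row iteration with a generator. In Laravel, the
// equivalent change would be replacing ->get() with ->cursor(), which
// fetches rows one at a time instead of materializing the whole result
// set. fetchRows() below is a stand-in for the database query.
function fetchRows(int $count): Generator {
    for ($i = 1; $i <= $count; $i++) {
        // stand-in for one database row
        yield ['store_channel_product_id' => $i];
    }
}

$processed = 0;
foreach (fetchRows(4600) as $row) {
    // build this row's XML here; afterwards the row is garbage-collectable
    $processed++;
}
echo $processed, "\n"; // prints 4600
```

The point of the pattern is that only one row is ever held in scope, so memory stays flat instead of growing with the result set.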

I hope someone can give me some insight or advice on how to fix this, or at least reduce the amount of RAM being used.


0 answers
