I'm running into a problem with my Laravel application. I have a lot of data in a Postgres database that I want to write to an XML file, but the command keeps getting killed because it runs out of memory and can't allocate any more.
I'm using Laravel 5.6 with PostgreSQL 10 and SimpleXML.
Below is the part of the code that builds the XML document.
$startTime = time();

$feedXml = new SimpleXMLElementExtended(
    '<?xml version="1.0" encoding="UTF-8"?><products></products>',
    LIBXML_NOERROR | LIBXML_NOWARNING
);

$storeChannelProducts = StoreChannelProduct::where('store_channel_id', $storeChannelId)
    ->whereNull('channel_data_exclude_reason')
    ->get();

/*
 * Increase memory and execution time based on the number of products
 */
ini_set('memory_limit', (384 + min_max(count($storeChannelProducts) * 2.5, 0, 1728)).'M');
set_time_limit(min_max(count($storeChannelProducts) * 0.5, 30, 3600));

// keyBy() returns a new collection; reassign so forget() below can find the keys
$storeChannelProducts = $storeChannelProducts->keyBy('store_channel_product_id');

$i = 1;
foreach ($storeChannelProducts as $storeChannelObject) {
    \Log::debug('Feed building after making record ' . $i);
    \Log::debug('Memory usage: ' . memory_get_usage());
    \Log::debug('Time since start = ' . (time() - $startTime));
    $i++;

    // ... addChild()/addAttribute() calls omitted here ...

    if (gc_enabled()) {
        gc_disable();
    }

    // Drop the processed product from the collection to free memory
    $storeChannelProducts->forget($storeChannelObject->store_channel_product_id);
    unset($storeChannelObject, $feedProduct, $product);
}
\Log::debug('End of feed building');
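For reference, `min_max()` is not a PHP or Laravel built-in; it's a small helper that clamps a value between a lower and upper bound, roughly equivalent to this sketch:

```php
<?php
// Hypothetical sketch of the min_max() helper: clamp $value into [$min, $max]
function min_max($value, $min, $max)
{
    return max($min, min($value, $max));
}
```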
In the foreach loop I fetch the data from my models and use the addChild() and addAttribute() methods to add it to the XML document. The collection I loop over consists of about 4,600 products.
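The loop body does roughly the following — simplified here to a plain SimpleXMLElement with a placeholder array instead of my real models and field names:

```php
<?php
// Standalone sketch of how each product gets appended to the feed;
// the field names are stand-ins for the real model attributes
$feedXml = new SimpleXMLElement(
    '<?xml version="1.0" encoding="UTF-8"?><products></products>'
);

$product = ['id' => 123, 'name' => 'Example product', 'sku' => 'SKU-123'];

$feedProduct = $feedXml->addChild('product');
$feedProduct->addAttribute('sku', $product['sku']);
$feedProduct->addChild('id', (string) $product['id']);
$feedProduct->addChild('name', htmlspecialchars($product['name']));

echo $feedXml->asXML();
```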
When I ran the command, the terminal printed "Killed!" as output. Running it again while watching memory usage in another tab, I saw RAM usage reach about 1,000 MB after roughly one minute, stabilize around 2,370 MB after about three minutes, then start climbing again from about 5.5 minutes until the command crashed a little past the 8-minute mark, at which point it was using around 5,000 MB of RAM.
I've already tried a few things, such as unsetting all variables that get reassigned on the next iteration of the foreach loop, and calling gc_disable(), since I read somewhere that the garbage collector can use a lot of RAM checking all the XML nodes to see which of them are still referenced.
I hope someone can give me some insight or advice on how to fix this or reduce the amount of RAM being used.