I have a recursive function that iterates over 11M database records, 1000 at a time. As it approached 9M records it stopped. My assumption of a memory problem was confirmed when I displayed memory_get_usage() after every 1000 records.
The function works something like this:
<?php
get_data_block();

function get_data_block($id = 0)
{
    // open a csv file for appending
    $packages_sorted_csv = fopen("./csv/packages_sorted.csv", "a");
    $ct = 0;
    // get 1000 records and process them
    // $unsorted = array of 1000 records from database
    foreach ($unsorted as $row) {
        $ct++;
        $id++;
        // $packages_sorted = array of processed data
        // write output
        fputcsv($packages_sorted_csv, $packages_sorted);
    }
    fclose($packages_sorted_csv);
    if ($ct == 1000) {
        unset($unsorted);
        echo 'Mem usage: ' . memory_get_usage();
        get_data_block($id); // call the function again
    } else {
        // finished
    }
}
?>
Does anyone have a tip on how to release all resources in a recursive function? Or is there a way to call the same function again so that it isn't called by itself?
Notes:
- I have to chunk the data into blocks to free up the busy MySQL server.
- I have tried unsetting every defined variable that's not in the global scope.
- The only thing I can't seem to unset is the fopen resource.
- The memory size grows by about 400k each iteration.
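For the second part of the question (calling the function again without recursion), the chunking can be driven by a plain loop instead: each recursive call keeps its stack frame and locals alive until the whole chain unwinds, while a loop reuses the same variables. Below is a minimal sketch of that shape; fetch_chunk() is a hypothetical stand-in for the real MySQL query, and the 2500-row limit inside it exists only so the sketch is self-contained.

```php
<?php
// Hypothetical stand-in for the real database query: returns up to
// $size rows whose ids follow $id. Replace with the actual MySQL fetch.
function fetch_chunk(int $id, int $size): array
{
    $rows = [];
    for ($i = $id + 1; $i <= min($id + $size, 2500); $i++) {
        $rows[] = ['id' => $i];
    }
    return $rows;
}

function export_all(string $path, int $chunkSize = 1000): int
{
    // Open the file once, before the loop, so one handle is reused
    // instead of reopening it for every chunk.
    $fh = fopen($path, "a");
    $id = 0;
    do {
        $rows = fetch_chunk($id, $chunkSize);
        $count = count($rows);          // remember size before unsetting
        foreach ($rows as $row) {
            $id = $row['id'];
            fputcsv($fh, $row);         // write the processed row
        }
        unset($rows);                   // release the chunk before the next fetch
    } while ($count === $chunkSize);    // a short chunk means we are done
    fclose($fh);
    return $id;                         // last id written
}
```

Because the loop never nests calls, memory usage should stay flat across chunks; the same pattern also keeps only one open file handle for the whole export instead of one per recursion level.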