I have several scripts running that download the daily index XML, look for every .xml entry in it, and download those files to a different folder, so:
                  1234.xml
                 /
daily.index.xml - 4567.xml
                 \
                  6789.xml
Now I wish to do the same with the files.index.xml file, but every time I try to open the index file the server stops with:
PHP Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 1073217536 bytes)
Is there a way to open up and dissect files.index.xml without crashing my server constantly?
Update: I believe the server hangs somewhere while running the script, as some XML files are being stored in the directory.
Script:
// URL for index file
$url = "http://data.icecat.biz/export/level4/EN/files.index.xml";

// Custom header (username/pass is a paid account, so I can't share the credentials)
$context = stream_context_create(array(
    'http' => array(
        'header' => 'Authorization: Basic ' . base64_encode("username:pass")
    )
));

// Get XML file (this is the call that exhausts the memory limit)
$indexfile = file_get_contents($url, false, $context);

// Remove the old copy, if any, before writing the new one
$file = '../myhomeservices/fullindex/files_index.xml';
if (file_exists($file)) {
    unlink($file);
}

$dailyfile = fopen($file, "w") or die("Unable to open file!");
chmod($file, 0777); // chmod() takes the path, not the file handle

// Write the contents back to the file
if (fwrite($dailyfile, $indexfile) === false) {
    echo 'Error!';
}
fclose($dailyfile); // was fclose($myfile), an undefined variable
The Apache logs show that the file_get_contents($url, false, $context); call is what maxes out the memory.
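One way to avoid that allocation is to never hold the whole file in memory: open the URL as a stream and copy it straight to disk. A minimal sketch, reusing the URL and auth context from the script above (the output path is the same assumed path as before):

```php
<?php
// Stream the remote index straight to disk instead of buffering it with
// file_get_contents(). Memory use stays small regardless of file size.
$url = "http://data.icecat.biz/export/level4/EN/files.index.xml";
$context = stream_context_create(array(
    'http' => array(
        'header' => 'Authorization: Basic ' . base64_encode("username:pass")
    )
));

$src = fopen($url, "r", false, $context);                          // HTTP read stream
$dst = fopen("../myhomeservices/fullindex/files_index.xml", "w");  // local write stream
if ($src === false || $dst === false) {
    die("Unable to open stream!");
}

// stream_copy_to_stream() copies in chunks internally, so the 1.4 GB
// file is never loaded into memory all at once.
stream_copy_to_stream($src, $dst);

fclose($src);
fclose($dst);
```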
Currently I'm trying to upload the files.index.xml (a 1.41 GB file) in the hope that I can process it that way.
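Once the file is on disk, it can be dissected without loading it whole by using XMLReader, which pulls one node at a time off the stream. A minimal sketch; the element name `file` and attribute `path` are assumptions about the index layout, so adjust them to whatever the real files.index.xml contains:

```php
<?php
// Stream-parse the saved index with XMLReader: only the current node is
// held in memory, so a multi-gigabyte file parses in constant memory.
$reader = new XMLReader();
$reader->open('../myhomeservices/fullindex/files_index.xml');

while ($reader->read()) {
    // Assumed structure: <file path="..."/> entries; adjust names as needed.
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'file') {
        $path = $reader->getAttribute('path');
        if ($path !== null) {
            echo $path, "\n"; // e.g. queue this path for download
        }
    }
}
$reader->close();
```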