This question already has an answer here:
- Processing large JSON files in PHP
I have a very large JSON object that I fetch with cURL and need to break apart for insertion into MySQL. But when I try to walk through it with a foreach loop, I run out of memory. Are there any options for handling the array with less load on the server, without raising the memory limit in PHP? Here is my code, but it fails with:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 20480 bytes)
<?php
#ini_set('memory_limit', '-1');
if ($curl = curl_init()) {
    curl_setopt($curl, CURLOPT_AUTOREFERER, true);
    curl_setopt($curl, CURLOPT_HEADER, 0);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($curl, CURLOPT_URL, 'http://img.combats.com/auction/dump.js');

    $out_shortinfo = curl_exec($curl);
    curl_close($curl);

    // The dump is a JS assignment, so strip the prefix before decoding
    $json = json_decode(str_replace('var auction_dump=', '', $out_shortinfo), true);

    foreach ($json as $key => $value) {
        if ($key > 0) {
            echo $value['name'].' ';
            echo $value['txt'].' ';
            echo $value['_auc']['id'].'<br>';
        }
    }
}
?>
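One possible lower-memory sketch, under two assumptions not in the question: the response is streamed to a temporary file with `CURLOPT_FILE` instead of being buffered in a PHP string, and the items are then iterated lazily with the third-party halaxa/json-machine library (installed via Composer). The URL, the `var auction_dump=` prefix, and the key names are taken from the question.

```php
<?php
// Sketch only: assumes `composer require halaxa/json-machine` has been run.
require 'vendor/autoload.php';

use JsonMachine\Items;

// Download straight to disk so the body never sits in a PHP string
$tmp = tempnam(sys_get_temp_dir(), 'dump');
$fp  = fopen($tmp, 'w');

$curl = curl_init('http://img.combats.com/auction/dump.js');
curl_setopt($curl, CURLOPT_FILE, $fp);
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
curl_exec($curl);
curl_close($curl);
fclose($fp);

// Strip the "var auction_dump=" JS prefix without loading the whole file
$in  = fopen($tmp, 'r');
$out = fopen($tmp . '.json', 'w');
fseek($in, strlen('var auction_dump='));
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// Items::fromFile() parses the JSON incrementally, one element at a time,
// so only the current item is held in memory during the loop
foreach (Items::fromFile($tmp . '.json') as $key => $value) {
    if ($key > 0) {
        echo $value->name . ' ' . $value->txt . ' ' . $value->_auc->id . '<br>';
    }
}
```

Note that json-machine decodes each item as an object by default, hence the `->` accessors instead of the array syntax used in the question.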