I am working on a project where I must pull data from an XML API using PHP cURL.
I am unable to retrieve very large data sets from the API. For example, pulling 500 user email addresses works fine, but when I request a list of 15,000 email addresses I only get about 4,000 back, and the request takes about 3 minutes.
I have adjusted the timeout in php.ini, which did not help.
My thought is to retrieve the data in batches and append each batch to the data already pulled in. I am pretty new to PHP/back-end scripting, so forgive me if this is a simple question.
My research turned up array_chunk, but wouldn't that require the full data set to already be in memory before it can be chunked? Or could I pull the data in as a chunk, go back to where I left off in the data set, and rinse/repeat?
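Something like the loop below is roughly what I have in mind. To be clear, load_users_page() is hypothetical: it would be my load_users() function (shown further down) extended with paging parameters, assuming the API even supports paging, which I haven't confirmed.

// Sketch of the batching idea. load_users_page() is hypothetical: my
// load_users() function extended with page/per-page parameters, assuming
// the API supports paging at all (not confirmed).
function load_all_users($username, $token, $path, $list_id)
{
    $all_subscribers = array();
    $page = 1;
    $per_page = 500; // a batch size I know the API returns reliably

    do {
        $batch = load_users_page($username, $token, $path, $list_id, $page, $per_page);
        if ($batch === false || count($batch) === 0) {
            break; // an error, or no more subscribers to fetch
        }
        $all_subscribers = array_merge($all_subscribers, $batch); // append this batch
        $page++;
    } while (count($batch) === $per_page); // a short batch means we reached the end

    return $all_subscribers;
}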
Here is the code I have so far.
function load_users($username, $token, $path, $list_id)
{
    // Build the XML request body for the GetSubscribers call
    $xml = '<xmlrequest>
    <username>'.$username.'</username>
    <usertoken>'.$token.'</usertoken>
    <requesttype>subscribers</requesttype>
    <requestmethod>GetSubscribers</requestmethod>
    <details>
        <searchinfo>
            <List>'.$list_id.'</List>
            <Email></Email>
        </searchinfo>
    </details>
</xmlrequest>';

    // POST the XML to the API endpoint and capture the response as a string
    $ch = curl_init($path);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $xml);
    $result = curl_exec($ch);

    if ($result === false) {
        // Log cURL errors instead of failing silently
        error_log('cURL error: '.curl_error($ch));
        curl_close($ch);
        return false;
    }
    curl_close($ch);

    // Parse the XML response into a SimpleXMLElement
    return simplexml_load_string($result);
}
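One more thing on the timeout: as far as I can tell, the setting I changed in php.ini governs PHP script execution time, not how long cURL waits on the network, so I'm guessing I also need cURL's own timeout options, something like:

// Standard cURL timeout options (the values here are guesses for my case)
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30); // seconds allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 300);       // seconds allowed for the entire transfer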