I'm currently building something like a search bot, a kind of web crawler. I found out that cURL can fetch the content of several URLs in parallel via its multi interface (the curl_multi_* functions). Here is my code:
protected function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();
        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        $mrc = curl_multi_exec($mh, $running);
    } while ($running > 0);
    //} while ($mrc == CURLM_CALL_MULTI_PERFORM);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);
    return $result;
}
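For reference, this is roughly how I call it from inside the same class (the URLs here are just placeholders, not the ones I actually crawl):

$urls = array(
    'first'  => 'http://example.com/page1',
    'second' => 'http://example.com/page2',
);
$pages = $this->multiRequest($urls);
foreach ($pages as $id => $html) {
    // print how much content each request returned
    echo $id . ': ' . strlen($html) . " bytes\n";
}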
It worked when I passed an array with a single link: it returned an array containing one big string (the HTML content). However, when I call it a second time with a significantly bigger array (~30 links), it returns an array of the same size full of empty strings, as if the server just didn't want to answer all those requests. Is there a problem with my code?
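One thing I haven't verified yet is what each transfer actually reports. A minimal diagnostic sketch (my own addition, placed inside multiRequest() right after the exec loop) would read the per-transfer result with curl_multi_info_read() and the HTTP status with curl_getinfo(); an empty body alongside a 301/302 status would point at redirects, since I never set CURLOPT_FOLLOWLOCATION:

// inside multiRequest(), after the do/while loop finishes
while ($info = curl_multi_info_read($mh)) {
    if ($info['result'] !== CURLE_OK) {
        // transfer-level failure, e.g. a timeout or DNS error
        echo 'a handle failed with cURL error code ' . $info['result'] . "\n";
    }
}
foreach ($curly as $id => $c) {
    // 0 means the transfer never received an HTTP response at all
    echo $id . ' => HTTP ' . curl_getinfo($c, CURLINFO_HTTP_CODE) . "\n";
}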
Thanks for your help
Erik Brendel