I have a database of reciprocal URLs and check them daily for availability, so I can separate working links from dead ones.
I currently do it like this:
<?php
foreach ($urls as $url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible;)');
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, 1);
    curl_setopt($ch, CURLOPT_NOBODY, true);             // fetch headers only, skip the body
    curl_setopt($ch, CURLOPT_FAILONERROR, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 10000); // was 60000, which exceeded the 30 s total timeout
    curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1:107');

    $page = curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($httpcode >= 100 && $httpcode < 600) {
        echo $httpcode;
    } else {
        echo '0';
    }
}
?>
It outputs the HTTP status code for each link. How can I do the same check in a faster, less blocking way? As it stands, the script takes about 15 minutes for 100 links, because each request waits for the previous one to finish. Thank you for any ideas and suggestions. I also tried curl_multi, but I can't understand how to manage the outputs (i.e., how to match each result back to its URL).
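For context, here is a rough, untested sketch of how curl_multi could map each result back to its URL by keeping the handles in an array keyed by URL. The `$urls` array and the proxy line are placeholders taken from the loop above, not a drop-in solution; with curl_multi the requests run in parallel, so total time is roughly that of the slowest link rather than the sum of all of them.

```php
<?php
// Assumes $urls is the same array of links as in the sequential version.
$urls = ['http://example.com/', 'http://example.org/']; // placeholder list

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible;)');
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_NOBODY, true);             // header check only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    // curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1:107'); // re-enable if you need the proxy
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;   // remember which handle belongs to which URL
}

// Drive all transfers until every one has finished.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // block until there is activity, avoids busy-waiting
    }
} while ($active && $status == CURLM_OK);

// Read the result per URL, then clean up each handle.
foreach ($handles as $url => $ch) {
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo $url . ' => ' . (($httpcode >= 100 && $httpcode < 600) ? $httpcode : '0') . "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>
```

Keying `$handles` by URL avoids version-specific tricks like casting the handle to an integer (curl handles became objects in PHP 8), and `curl_getinfo()` can still be called on each handle after the multi loop finishes.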