I have a list of URLs in an Excel sheet. I am reading all the URLs from the sheet, and storing existing URLs and non-existing URLs in separate text files. My problem is that if a URL is not responding, the loop stops. My aim is to skip that URL and check the next one.
require_once 'excel_reader2.php';

$data = new Spreadsheet_Excel_Reader("urls.xls");
$totalsheets = count($data->sheets);

for ($i = 0; $i < $totalsheets; $i++) // loop over all sheets in the file
{
    if (count($data->sheets[$i]['cells']) > 0) // skip empty sheets
    {
        $totalrows = count($data->sheets[$i]['cells']);
        for ($j = 1; $j <= $totalrows; $j++) // loop over each row of the sheet
        {
            $name = $data->sheets[$i]['cells'][$j][1];
            $url  = $data->sheets[$i]['cells'][$j][2];

            $file_headers = get_headers($url);
            $line = $j . " " . $name . " " . $url . "\n";

            // strpos() can return 0, so compare strictly against false
            if (strpos($file_headers[0], "200") !== false)
            {
                $exists = 'yes';
                // append mode ("a") so earlier results are not overwritten
                $myfile = fopen("existurls.txt", "a") or die("Unable to open file!");
                fwrite($myfile, $line);
                fclose($myfile);
                echo "Url exists";
            }
            else
            {
                $exists = 'no';
                $myfile = fopen("nonexisturls.txt", "a") or die("Unable to open file!");
                fwrite($myfile, $line);
                fclose($myfile);
                echo "Url Not exists";
            }
        }
    }
}
The code is otherwise working fine, but if any URL is not responding, the loop stops and does not move on to the next one. Please help me skip the non-responding URL and continue the loop.
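One way to handle this, sketched below under the assumption that `get_headers()` is the check being used: shorten the socket timeout so a dead host cannot stall the loop, suppress the warning with `@`, and test for the `false` return value so the row can be skipped. The helper name `url_is_alive` and the 5-second timeout are my own choices, not part of the original code.

```php
<?php
// Limit how long get_headers() waits on a connection before giving up,
// so a non-responding host fails quickly instead of hanging the loop.
ini_set('default_socket_timeout', 5);

function url_is_alive($url)
{
    // @ suppresses the warning get_headers() emits when the host is
    // unreachable; in that case it returns false instead of an array.
    $headers = @get_headers($url);
    if ($headers === false) {
        return false; // not responding: caller should skip this URL
    }
    // Treat a 200 status line as "exists".
    return strpos($headers[0], "200") !== false;
}
```

Inside the row loop you would then replace the direct `get_headers()` call with something like `if (!url_is_alive($url)) { continue; }` (or write the row to `nonexisturls.txt` before continuing), so one dead URL no longer blocks the rest.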