I've got a cron job that scans URLs to see whether they're online. My problem is that it never gets through all 844 websites; each run stops somewhere between 260 and a little over 300.
The cron job calls this PHP file every 30 minutes, but it never completes my full list. Is there anything I'm doing incorrectly that prevents the loop from finishing?
// php file
// first part opens the CSV file and reads the 844 sites
if (($handle = fopen("/public/csvs/" . $csv, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 0, ",")) !== FALSE) {
        $num = count($data);
        for ($c = 0; $c < $num; $c++) {
            $site = $data[$c];
            $curl = curl_init();
            curl_setopt_array($curl, array(
                CURLOPT_HEADER         => true,
                CURLOPT_NOBODY         => true,
                CURLOPT_RETURNTRANSFER => true,
                CURLOPT_URL            => 'https://' . $site,
            ));
            $headers = explode("\r\n", curl_exec($curl));
            curl_close($curl);
            // gets the status of the URL, then fills it in the database
            $statushttp = $headers[0];
            $mysqltime = date("Y-m-d H:i:s", $phptime);
            //$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
            //$sql = $conn->prepare($sql);
            $sql = "INSERT INTO csv (http_status,status,url,csv,related)
                    VALUES ('$statushttp','$status','$site','$csv',1)";
            // use exec() because no results are returned
            $conn->exec($sql);
            //echo $site . " " . $statushttp . "<br>";
            //echo $statusCode . "<br>";
        }
    }
    fclose($handle);
}
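For reference, here is a trimmed-down sketch of the same check with per-request timeouts added. My working assumption is that a few unresponsive hosts are each hanging for curl's default timeout and eating the 30-minute window before the loop reaches the end of the list. The timeout values (5 s connect, 10 s total) and the `statusLineCode()` helper are my own additions, not part of the original script:

```php
<?php
// Sketch only: per-request timeouts so one dead host cannot stall the run.

// Pull the numeric code out of a status line like "HTTP/1.1 200 OK".
function statusLineCode(string $statusLine): ?int
{
    return preg_match('#^HTTP/\S+\s+(\d{3})#', $statusLine, $m)
        ? (int) $m[1]
        : null;
}

// Returns the raw status line, or 'DOWN' on timeout/DNS failure/refusal.
function checkSite(string $site): string
{
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_HEADER         => true,
        CURLOPT_NOBODY         => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_CONNECTTIMEOUT => 5,   // give up connecting after 5 s
        CURLOPT_TIMEOUT        => 10,  // hard cap for the whole request
        CURLOPT_URL            => 'https://' . $site,
    ));
    $response = curl_exec($curl);
    curl_close($curl);

    if ($response === false) {
        return 'DOWN';
    }
    $headers = explode("\r\n", $response);
    return $headers[0];               // e.g. "HTTP/1.1 200 OK"
}
```

With a 10-second cap, even a worst case of 844 slow sites fits in roughly 844 × 10 s ≈ 2.3 hours, and typical runs should be far faster; without a cap, a handful of hung connections can consume the whole window. If the cron entry hits the script over HTTP rather than via the PHP CLI, `max_execution_time` may also be killing it mid-loop.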
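The commented-out `prepare()` call suggests I already tried prepared statements; here is a sketch of that version of the INSERT. The in-memory SQLite handle and the sample values are stand-ins I added to keep the snippet self-contained — in the real script `$conn` is the existing database connection:

```php
<?php
// Stand-in connection and table so the snippet runs on its own;
// the real $conn would be the script's existing PDO handle.
$conn = new PDO('sqlite::memory:');
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$conn->exec("CREATE TABLE csv (http_status TEXT, status TEXT, url TEXT, csv TEXT, related INT)");

// Sample values; in the loop these come from the CSV and the curl check.
$statushttp = 'HTTP/1.1 200 OK';
$status     = 'online';
$site       = 'example.com';
$csv        = 'sites.csv';

// Binding values instead of interpolating them avoids SQL injection and
// avoids breaking the query when a URL or status line contains a quote.
$stmt = $conn->prepare(
    "INSERT INTO csv (http_status, status, url, csv, related)
     VALUES (:http_status, :status, :url, :csv, 1)"
);
$stmt->execute(array(
    ':http_status' => $statushttp,
    ':status'      => $status,
    ':url'         => $site,
    ':csv'         => $csv,
));
```

Preparing the statement once before the loop and calling `execute()` per row would also shave a little time off each of the 844 inserts.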