dongzhiyi2006 2014-01-31 17:57
Accepted

Connection timeout with curl or file_get_contents to an API kills my script

When the connection times out while using curl or file_get_contents, because of a network error or because the remote server doesn't respond for some reason, it kills my script.

I am making these remote calls in a loop, and if one fails it kills my script.

What is the best way to handle a failed request so that the loop moves on to the next call instead of the script dying?


2 answers

  • dongshen6060 2014-01-31 18:03

    First, set a timeout limit (in seconds) for cURL:

     curl_setopt($ch, CURLOPT_TIMEOUT, 1800);
    

    The return value of your curl_exec() call tells you whether the request succeeded:

    for (/* anything */) {
        $ch = curl_init();
        //...
        $result = curl_exec($ch);
        if ($result === false) {   // curl_exec() returns false on failure
            curl_close($ch);
            continue;              // jump to the next loop iteration
        }

        // this code will not be executed if the request failed
        curl_close($ch);
    }
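
    Since the question also mentions file_get_contents, here is a hedged sketch of the same skip-on-failure pattern using a stream context for the timeout; the URLs are placeholders, not from the original question:

    ```php
    <?php
    // Hypothetical list of API endpoints to call in a loop.
    $urls = ['https://api.example.com/a', 'https://api.example.com/b'];

    // A stream context lets you set a per-request timeout (in seconds).
    $context = stream_context_create([
        'http' => ['timeout' => 10],
    ]);

    foreach ($urls as $url) {
        // @ suppresses the warning on failure; the call returns false instead.
        $body = @file_get_contents($url, false, $context);
        if ($body === false) {
            continue; // skip this URL and move on to the next one
        }
        // process $body here
    }
    ```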
    
    This answer was selected as the best answer by the asker.
