dongzhiyi2006
2014-01-31 17:57 · 25 views
Accepted

When the connection times out using curl or file_get_contents to call an API, it kills my script

When the connection times out while using curl or file_get_contents, because of a network error or because the remote server doesn't respond for some reason, it kills my script.

I am making these remote calls in a loop and if one fails it kills my script.

What is the best way to handle a failed request so that the loop moves on to the next call instead of the script dying?
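
For context, here is a minimal sketch of the kind of loop being described, assuming file_get_contents() with default settings (the URLs are placeholders):

    $urls = ['https://api.example.com/a', 'https://api.example.com/b'];

    foreach ($urls as $url) {
        // With default settings this blocks until the server answers or
        // default_socket_timeout expires, and it returns false on failure.
        $response = file_get_contents($url);
        // ... process $response ...
    }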


2 answers

  • Accepted
    dongshen6060 2014-01-31 18:03

    First, set cURL's timeout parameter:

     curl_setopt($ch, CURLOPT_TIMEOUT, 1800);
    

    The result of your curl_exec() call will show you if the request was successful or not:

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response as a string instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 1800);        // apply the timeout limit set above
        $result = curl_exec($ch);
        curl_close($ch);

        if ($result === false) {
            continue; // request failed or timed out: jump to the next loop iteration
        }

        // this code will not be executed if the request failed
    }
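
    Since the question also mentions file_get_contents(), the same skip-on-failure approach can be applied there as well; a sketch, with an assumed 30-second timeout set through a stream context:

    $context = stream_context_create([
        'http' => ['timeout' => 30], // seconds to wait for the response (example value)
    ]);

    foreach ($urls as $url) {
        $result = @file_get_contents($url, false, $context); // returns false on failure
        if ($result === false) {
            continue; // skip this URL and keep looping
        }
        // process $result
    }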
    
  • douba8758 2014-01-31 18:00

    Use a try/catch block:

    foreach ($urls as $url) {
        try {
            // your current code goes here; this assumes it throws an Exception on failure
            $result = makeRemoteCall($url);
        } catch (Exception $e) {
            // what to do if it fails, e.g. log the error and move on to the next call
            error_log($e->getMessage());
        }
    }
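
    Note that curl_exec() and file_get_contents() do not throw exceptions on failure by default, so this only helps if the code inside the try block throws one itself. A minimal sketch of such a wrapper (the name makeRemoteCall() used above is just a placeholder):

    // Hypothetical helper: throws an exception when the request fails,
    // so the try/catch above has something to catch.
    function makeRemoteCall($url)
    {
        $result = @file_get_contents($url); // @ suppresses the warning on failure
        if ($result === false) {
            throw new Exception("Request to $url failed or timed out");
        }
        return $result;
    }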
    

    If your whole script needs more time to run, use:

    set_time_limit(500); // or however many seconds you need
    
