dongzhiyi2006 2014-01-31 17:57
Viewed 31
Accepted

Connection timeout when using curl or file_get_contents with an API kills my script

When the connection times out while using curl or file_get_contents, because of a network error or because the remote server doesn't respond for some reason, it kills my script.

I am making these remote calls in a loop, and a single failure kills the entire script.

If a specific request fails, what is the best way to have the loop move on to the next call instead of the script dying?

2 answers

  • dongshen6060 2014-01-31 18:03

    First, set a timeout limit on the cURL handle:

        curl_setopt($ch, CURLOPT_TIMEOUT, 1800);
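    CURLOPT_TIMEOUT caps the total time of the whole transfer. If you also want to give up quickly when the remote host never accepts the connection in the first place, cURL has a separate option for the connection phase; a minimal sketch (the 10-second value is an arbitrary choice, not from the original answer):

        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // abort if connecting alone takes longer than 10 seconds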
    

    The return value of curl_exec() will tell you whether the request succeeded:

        for (/* anything */) {
            $ch = curl_init();
            // ... set CURLOPT_URL and the other request options here ...
            $result = curl_exec($ch);
            if ($result === false) { // curl_exec() returns false on failure
                curl_close($ch);     // free the handle before skipping
                continue;            // jump to the next iteration
            }

            // this code will not be executed if the request failed
            curl_close($ch);
        }
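    The question also mentions file_get_contents(), which the snippet above doesn't cover. The same skip-on-failure pattern works there too; a minimal sketch, assuming a hypothetical $urls array of API endpoints and using the standard http stream-context timeout option:

        $context = stream_context_create([
            'http' => ['timeout' => 30], // give up after 30 seconds
        ]);

        foreach ($urls as $url) {
            // file_get_contents() returns false (and raises a warning) on
            // failure; @ suppresses the warning so the loop can carry on.
            $body = @file_get_contents($url, false, $context);
            if ($body === false) {
                continue; // skip this endpoint and move to the next one
            }
            // process $body here ...
        }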
    
