dst67283
2012-03-09 05:42

PHP: stop URL data fetching after a specific time

Accepted

I am using PHP 5.2 and I am fetching data from URLs with the file_get_contents function. I loop over 5000 URLs, which I have divided into slots of 500, each handled by a script like the one below. A batch of 500 takes about 3 hours to complete, because some URLs take far too long while others finish in about 1 second, which is fine.

What I want is: if a URL takes more than 30 seconds, skip it and go on to the next one. In other words, I want to stop the fetch after 30 seconds.

    <?php
    // Create the stream context
    $context = stream_context_create(array(
        'http' => array(
            'timeout' => 1       // Timeout in seconds
        )
    ));

    // Fetch the URL's contents
    echo date("Y-m-d H:i:s") . "\n";
    $contents = file_get_contents('http://example.com', 0, $context);
    echo date("Y-m-d H:i:s") . "\n";

    // Check for empties
    if (!empty($contents)) {
        // Woohoo
        // echo $contents;
        echo "file fetched";
    } else {
        echo $contents;
        echo "more than 30 sec";
    }
    ?>

I have already tried this, but it is not working for me: file_get_contents does not stop, it keeps going. The only difference is that I now get no result after the timeout, but the time it takes is the same, as you can see in the output. Output of the PHP script:

2012-03-09 11:26:38 2012-03-09 11:26:40 more than 30 sec


2 answers

  • doutu1939, 9 years ago

    You can set the HTTP timeout. (Not tested)

    <?php
    $ctx = stream_context_create(array(
        'http' => array(
            'timeout' => 30
        )
    ));
    file_get_contents("http://example.com/", 0, $ctx); 
    

    Source

    Edit: I don't know why this code isn't working for you. But if you can't get it to work this way, you may also want to give cURL a try; it could even be faster (though I don't know whether it really is).
    If that works for you, you can then use the curl_setopt function to set the timeout with the CURLOPT_TIMEOUT option.
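
    For example, a minimal, untested cURL sketch along those lines (example.com and the 30-second value are just placeholders) could look like this:

    $ch = curl_init('http://example.com/');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);   // give up if connecting alone takes too long
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // give up if the whole transfer takes more than 30 sec
    $contents = curl_exec($ch);
    if ($contents === false) {
        echo "more than 30 sec (or another cURL error): " . curl_error($ch);
    }
    curl_close($ch);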

  • doutan1637, 9 years ago

    There is some info in the PHP manual about timeouts.

    http://php.net/manual/en/function.file-get-contents.php

    It mentions the following, as of PHP 5.2.1:

    ini_set('default_socket_timeout', 120);   
    $a = file_get_contents("http://abcxyz.com");
    

    or adding a context, which is more or less the same:

    // Create the stream context
    $context = stream_context_create(array(
        'http' => array(
            'timeout' => 3      // Timeout in seconds
        )
    ));
    // Fetch the URL's contents
    $contents = file_get_contents('http://abcxyz.com', 0, $context);
    

    A third option is using PHP's fsockopen, which has an explicit timeout option:
    http://www.php.net/manual/en/function.fsockopen.php

    $timeout = 2; // seconds
    $fp = fsockopen($hostname, 80, $errNo, $errString, $timeout);
    /* $hostname is a bare host such as "example.com" (not a full URL);
       stops connecting after 2 seconds,
       stores the error number in $errNo,
       the error string in $errString */
    
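    To actually read the page over that socket, a sketch of my own (not from the original answer; "example.com" and the 30-second value are just placeholders) could pair the fsockopen connect timeout with stream_set_timeout for the reads:

    $host    = 'example.com';
    $timeout = 30; // seconds

    $fp = fsockopen($host, 80, $errNo, $errStr, $timeout); // connect timeout
    if ($fp) {
        stream_set_timeout($fp, $timeout);                 // read timeout
        fwrite($fp, "GET / HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");

        $response = '';
        while (!feof($fp)) {
            $chunk = fread($fp, 8192);
            $meta  = stream_get_meta_data($fp);
            if ($chunk === false || $meta['timed_out']) {
                echo "more than $timeout sec";
                break;
            }
            $response .= $chunk;
        }
        fclose($fp);
    }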

    To save writing a lot of code, you could use it as a quick check whether the host is up.

    i.e.:

    if (pingLink($domain, $timeout)) {
        $contents = file_get_contents($url);
    }

    function pingLink($domain, $timeout = 30) {
        $status = 0; // default: site is down
        // $domain is a bare hostname such as "example.com"
        $file   = fsockopen($domain, 80, $errNo, $errStr, $timeout);
        if ($file) {
            $status = 1; // site is up
            fclose($file);
        }
        return $status;
    }
    
