2014-09-28 19:48
Viewed 72 times


I'm trying to fetch 10 webpages simultaneously.

I'm using curl_multi.

However, I end up with a lot of 503 (too many requests) errors on most of the fetched webpages. How can I fix this?

Here's the PHP script that I ran: http://pastebin.com/HhrffciC

You can run it on any PHP-enabled server.

Here is what the output on my machine looked like. http://i.imgur.com/dhnXJwz.jpg


1 answer

  • dongyou6847 2014-09-28 20:49

    There is a library called ParallelCurl that lets you control how many simultaneous requests are sent out. The script below sets the maximum to 5 and simply sends a series of GET requests to the URLs in your code. If this still produces 503 errors for you (it doesn't for me), lower $max_requests until they stop.

    require __DIR__ . '/parallelcurl.php';

    // Called as each request completes.
    function on_request_done($content, $url, $ch, $search) {
        echo $content;
    }

    // Placeholder URL; substitute the list from your original script.
    $data = array('http://example.com/');

    $max_requests = 5;
    $parallel_curl = new ParallelCurl($max_requests);
    foreach ($data as $url) {
        $parallel_curl->startRequest($url, 'on_request_done');
    }
    // Block until all outstanding requests have finished.
    $parallel_curl->finishAllRequests();
    The GitHub README explains how to use the library further.
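    If you would rather not pull in an extra library, you can get the same throttling effect with plain curl_multi by splitting the URL list into batches yourself. A minimal sketch (the fetch_in_batches name and the batch size of 5 are my own choices for illustration, not part of ParallelCurl or your script):

    ```php
    <?php
    // Fetch $urls at most $batchSize at a time using plain curl_multi.
    // Returns an array mapping each URL to its response body.
    function fetch_in_batches(array $urls, $batchSize = 5) {
        $results = array();
        foreach (array_chunk($urls, $batchSize) as $batch) {
            $mh = curl_multi_init();
            $handles = array();
            foreach ($batch as $url) {
                $ch = curl_init($url);
                curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
                curl_multi_add_handle($mh, $ch);
                $handles[$url] = $ch;
            }
            // Drive this batch to completion before starting the next,
            // so no more than $batchSize requests are ever in flight.
            do {
                curl_multi_exec($mh, $running);
                if ($running > 0) {
                    curl_multi_select($mh);
                }
            } while ($running > 0);
            foreach ($handles as $url => $ch) {
                $results[$url] = curl_multi_getcontent($ch);
                curl_multi_remove_handle($mh, $ch);
                curl_close($ch);
            }
            curl_multi_close($mh);
        }
        return $results;
    }
    ```

    If the server still answers 503, add a sleep() between batches or lower $batchSize further.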

