doupaxia2478
2012-04-27 20:12

PHP file_get_contents() timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, the file being fetched is 200 MB.

  • Will this process time out if downloading to the server takes too long?
  • If so, is there a way to extend this timeout?
  • Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?

I am just trying to make sure that I know what my options and limitations are before I go much further.

Thank you for your time.

2 Answers

  • Accepted answer
    duan19740319 2012-04-27 20:15

    Yes, it can time out, but you can use set_time_limit(0) or the max_execution_time directive to remove the time limit that PHP imposes.

    You can open the file as a stream and relay it to the user as it arrives, rather than buffering the whole thing first.
    Read about fopen()
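
    A minimal sketch of how that streaming could look, assuming allow_url_fopen is enabled; the URL and filename are hypothetical:

    set_time_limit(0); // remove PHP's execution time limit for this long download

    // Open the remote file as a read stream instead of buffering it whole
    $remote = fopen('http://example.com/large-file.zip', 'rb'); // hypothetical URL
    if ($remote === false) {
        die('Could not open remote file');
    }

    // Tell the browser a file download is coming
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="large-file.zip"');

    // Relay the remote stream to the client as it arrives
    fpassthru($remote);
    fclose($remote);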

  • dongzhuang5741 2012-04-27 20:15

    If it is not a timeout, you may well run into memory issues, depending on how your PHP is configured. You can adjust most of these settings from code without much difficulty.

    http://php.net/manual/en/function.ini-set.php

    ini_set('memory_limit', '256M'); // raise the per-script memory limit (value is illustrative)
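
    As an alternative to raising the limit, a chunked copy keeps memory use flat no matter how large the file is; this is only a sketch, and the URL and target path are hypothetical:

    // Copy the remote file to disk in small chunks so the whole
    // 200 MB never has to fit in memory at once
    $in  = fopen('http://example.com/large-file.zip', 'rb');
    $out = fopen('/tmp/large-file.zip', 'wb');
    while (!feof($in)) {
        fwrite($out, fread($in, 8192)); // 8 KB per iteration
    }
    fclose($in);
    fclose($out);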
    
