doupaxia2478 2012-04-27 20:12

PHP file_get_contents() timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, that the file being fetched is 200 MB.

  • Will this process time out if downloading to the server takes too long?
  • If so, is there a way to extend this timeout?
  • Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?

I am just trying to make sure I know what my options and limitations are before I go much further.

Thank you for your time.


2 answers

  • duan19740319 2012-04-27 20:15

    Yes, the request can time out. You can use set_time_limit(0) or the max_execution_time directive to remove the execution time limit that PHP imposes on the script.
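    Note that the script's execution time limit is not the only limit in play: the network read inside file_get_contents() is also bounded by the default_socket_timeout directive (60 seconds by default), which you can override per call with a stream context. Below is a minimal sketch; the URL and the 600-second timeout are placeholder values, and it assumes allow_url_fopen is enabled.

        <?php
        // Remove PHP's execution time limit for this script
        // (equivalent to max_execution_time = 0).
        set_time_limit(0);

        // file_get_contents() network reads honour default_socket_timeout;
        // a stream context can raise it for this one request.
        $context = stream_context_create(array(
            'http' => array('timeout' => 600), // seconds to wait on the socket
        ));

        $data = file_get_contents('http://example.com/big-file.zip', false, $context);
        if ($data === false) {
            exit('Download failed');
        }

    Keep in mind that file_get_contents() buffers the whole response in memory, so a 200 MB download also needs a memory_limit comfortably above 200 MB.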

    You can also open the file as a stream and transfer it to the user while it downloads, so it does not have to be saved on the server first.
    Read about fopen()
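    A minimal sketch of that streaming approach, again assuming allow_url_fopen is enabled; the URL and filename are placeholders:

        <?php
        set_time_limit(0);

        // Tell the browser to treat the response as a file download.
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="big-file.zip"');

        $src = fopen('http://example.com/big-file.zip', 'rb');
        if ($src === false) {
            exit('Could not open remote file');
        }

        // Relay the file in small chunks so memory use stays flat
        // regardless of the file size.
        while (!feof($src)) {
            echo fread($src, 8192);
            flush(); // push each chunk to the client immediately
        }
        fclose($src);

    If you don't need chunk-level control, readfile() with the same URL after the same headers does the relay in one call. Either way, make sure output buffering is disabled or flushed, so chunks actually reach the client as they arrive.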

