I am in the early stages of building a PHP application, part of which involves using
file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, that the file being fetched is 200 MB.
- Will this process time out if downloading to the server takes too long?
- If so, is there a way to extend this timeout?
- Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?
I am just trying to make sure I know what my options and limitations are before I go much further.
Thank you for your time.
2 Answers
- [Accepted] duan19740319 2012-04-27 20:15
- dongzhuang5741 2012-04-27 20:15
If not a timeout, you may well run into memory issues, depending on how your PHP is configured. You can adjust many of these settings at runtime in code without much difficulty.
ini_set('memory_limit', '256M');
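To the third question in the original post: you don't have to buffer the whole file in memory or save it to disk first. A sketch of one approach, assuming an HTTP-accessible remote file: open the remote URL as a stream and relay it to the client in fixed-size chunks, so memory use stays flat regardless of file size. `relayStream()` here is a hypothetical helper name, not a built-in, and the URL and chunk size are placeholders.

```php
<?php
// Relay a source stream to a destination stream in fixed-size chunks,
// instead of loading the whole file with file_get_contents().
// relayStream() is a hypothetical helper, not part of PHP itself.
function relayStream($src, $dst, int $chunkSize = 8192): int
{
    $sent = 0;
    while (!feof($src)) {
        $chunk = fread($src, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $sent += fwrite($dst, $chunk); // memory use stays bounded by $chunkSize
    }
    return $sent;
}

// Typical use (URL and filename are illustrative):
// set_time_limit(0);                       // lift the PHP execution time limit
// ini_set('default_socket_timeout', 300);  // tolerate a slow remote server
// $src = fopen('http://example.com/big-file.zip', 'rb');
// header('Content-Type: application/octet-stream');
// header('Content-Disposition: attachment; filename="big-file.zip"');
// relayStream($src, fopen('php://output', 'wb'));
// fclose($src);
```

The commented-out usage also hedges at the two timeout questions: `set_time_limit(0)` addresses the script execution limit, and `default_socket_timeout` governs how long the remote fetch may stall before the stream gives up.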