I am in the early stages of building a PHP application, part of which involves using file_get_contents()
to fetch large files from a remote server and transfer them to a user. Let's say, for example, that the file being fetched is 200 MB.
- Will this process time out if downloading to the server takes too long?
- If so, is there a way to extend this timeout?
- Can the file be streamed to the user while it is still downloading, or does it have to be fully saved on the server first and then fetched by the user separately once the download has completed?
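
For concreteness, here is a rough sketch of the approach I have in mind (the URL and headers are just placeholders, not my real setup):

```php
<?php
// Placeholder URL for the large remote file.
$url = 'https://example.com/large-file.bin';

// Fetch the entire 200 MB file into memory on my server first...
$data = file_get_contents($url);

// ...then send it on to the user in one shot.
header('Content-Type: application/octet-stream');
header('Content-Length: ' . strlen($data));
echo $data;
```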
I am just trying to make sure I know what my options and limitations are before I get too far in.
Thank you for your time.