I have to implement a simple file download client in PHP that can download large files and resume broken downloads.
Is there a way I can download large files (>700 MB) in PHP while keeping the PHP memory limit at 128M? I'm guessing this has to do with writing to a file pointer. Any clue on which file-handling functions to use? There are so many; I'm guessing fopen(), flock(), fwrite()/fgets()/fread(), and fclose(). Or should I use cURL?
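The key to staying under the memory limit is exactly what you guessed: read from one stream and write to a file pointer in small chunks, so the whole file is never held in memory. A minimal sketch (the function name and chunk size are my own, not anything standard):

```php
<?php
// Stream a source (a local path or an http:// URL) to a destination file
// in fixed-size chunks, so memory use stays bounded regardless of file size.
function streamToFile(string $src, string $dest, int $chunkSize = 8192): int
{
    $in = fopen($src, 'rb'); // fopen accepts URLs if allow_url_fopen is on
    if ($in === false) {
        throw new RuntimeException("Cannot open source: $src");
    }
    $out = fopen($dest, 'wb');
    if ($out === false) {
        fclose($in);
        throw new RuntimeException("Cannot open destination: $dest");
    }
    $written = 0;
    while (!feof($in)) {
        $chunk = fread($in, $chunkSize); // only $chunkSize bytes in memory at once
        if ($chunk === false || $chunk === '') {
            break;
        }
        $written += fwrite($out, $chunk);
    }
    fclose($in);
    fclose($out);
    return $written; // total bytes copied
}
```

cURL works just as well for this: pass an open file handle via `CURLOPT_FILE` and cURL streams the response body straight to disk instead of buffering it in memory.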
How do I resume downloads that were broken, e.g. by a script execution timeout, the user stopping the script, or a remote server timeout?