dongwo2772
2010-07-18 04:38 · 92 views
Accepted

PHP timeouts and FTP functions

In implementing the backup script I described in this serverfault question, I ran into some timeout issues that prompted optimizations to the code (namely, backing up one file per execution of the script and doing everything I can to minimize the number of file hashes I calculate over the very large data files).

So far, that seems to have solved the timeout issue, but given the size of the files, there is certainly room for the transfer to take longer than the standard 30s allotted before a script times out. If that happens, I assume the file will simply be cut off as partially transferred. Is there any way to protect against this?

Note that I am operating on a shared-hosting environment, so editing the php.ini file is not an option.
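
For illustration, here is a minimal sketch of the one-file-per-run approach described above (the host, credentials, paths and hash bookkeeping are placeholders, not the actual script from the serverfault question):

    <?php
    // Back up at most one changed file per run, hashing each candidate only once.
    $state = json_decode(@file_get_contents('backup_state.json'), true);
    if (!is_array($state)) { $state = array(); }

    foreach (glob('/path/to/data/*') as $file) {
        $hash = md5_file($file);
        if (isset($state[$file]) && $state[$file] === $hash) {
            continue;                      // unchanged since the last run
        }

        $conn = ftp_connect('ftp.example.com');
        ftp_login($conn, 'user', 'pass');
        ftp_pasv($conn, true);
        if (ftp_put($conn, basename($file), $file, FTP_BINARY)) {
            $state[$file] = $hash;         // record the hash only after a complete upload
            file_put_contents('backup_state.json', json_encode($state));
        }
        ftp_close($conn);
        break;                             // one file per execution
    }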


3 Answers

  • Accepted
    duanpu1111 2010-07-18 05:27

    According to the documentation for set_time_limit(), this should never be an issue: time spent on activity that happens outside the execution of the script (system calls, stream operations and the like) is not counted when determining how long the script has been running.
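
    For illustration, a minimal sketch of that documented rule (host, credentials and paths are placeholders, and it assumes allow_url_fopen is enabled so the ftp:// stream wrapper works):

    <?php
    // The copy() below is a stream operation, so on non-Windows hosts the time
    // it takes does not tick the execution-time clock at all.
    set_time_limit(30);  // demo value; the call may be refused on some shared hosts
    copy('/path/to/backup.tar.gz', 'ftp://user:pass@ftp.example.com/backup.tar.gz');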

  • dongpu1331 2010-07-18 05:01

    If it's enabled, you can call set_time_limit(). Alternatively, if you run PHP from the command line (via cron or similar), the maximum execution time does not apply.
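
    A minimal guarded call, assuming the host has not disabled the function outright:

    <?php
    // 0 removes the per-request limit; the function_exists() check skips hosts
    // where the function is blocked, and @ hides the warning if the call is
    // refused anyway (e.g. under safe_mode).
    if (function_exists('set_time_limit')) {
        @set_time_limit(0);
    }

    Under the CLI SAPI, max_execution_time defaults to 0, which is why a cron-driven run sidesteps the limit entirely.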

  • douxiaochun4964 2010-07-18 05:01

    Can you try running the ftp job via the shell? Might work on a shared host...

    shell_exec('nohup ftp my-ftp-command 2> /dev/null');
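
    A slightly fuller sketch of that idea (the host, batch file and paths are invented; -n suppresses auto-login so the batch file can supply credentials, and the output redirection plus trailing & let the transfer keep running after the PHP request returns):

    <?php
    // Hand the upload to the shell so it can outlive the web request.
    // /path/to/ftp-batch.txt would hold the usual "user ... / put ... / quit" lines.
    $cmd = 'nohup ftp -n ftp.example.com < /path/to/ftp-batch.txt > /dev/null 2>&1 &';
    shell_exec($cmd);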
    
