douyi02577 2012-12-05 16:16
Viewed 154 times
Accepted

PHP download of a large file returns a 0 Kb file

Hello fellow Stack Overflowers,

I'm having a problem with PHP/Apache. I have an application that lets the admin upload 100 Mb files. The uploading works well, but I'm having problems with the downloading.

It works perfectly with smaller files (tested with a 50 Mb file), but for some reason I can't get the 100 Mb files.

Here's my PHP code:

$extension = 'zip'; //for testing

switch ($extension) {
  case "dwg": $contentType="image/vnd.dwg"; break;
  case "dxf": $contentType="image/vnd.dxf"; break;
  case "pdf": $contentType="application/pdf"; break;
  case "zip": $contentType="application/zip"; break; 
  case "png": $contentType="image/png"; break; 
  case "jpeg": $contentType="image/jpeg"; break; 
  case "jpg": $contentType="image/jpeg"; break; // "image/jpg" is not a registered MIME type
  case "gif": $contentType="image/gif"; break; 
  default: 
    $contentType = '';
}

@header("Content-type: " . $contentType);
@header("Content-Disposition: attachment; filename=$filename");
@header("Cache-Control: no-cache, must-revalidate");
@header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // date in the past

echo file_get_contents($url);

I've also tried other solutions I found on SO.

header('Content-Description: File Transfer');
header('Content-Transfer-Encoding: binary');
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-type: " . $contentType);
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header("Content-Length: ".filesize($url));

echo self::url_get_contents(URL_PUBLIC . $url);

...

private function url_get_contents($url) {
    if (!function_exists('curl_init')) {
        die('cURL is not installed!');
    }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output; // note: the original also echoed $output here, sending the body twice
}
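As an aside on the cURL helper above: CURLOPT_RETURNTRANSFER buffers the entire response in PHP memory before anything reaches the client, which is exactly what hurts at 100 Mb. A hedged sketch of a streaming variant (the function name and chunk handling are mine, not from the original code):

```php
// Sketch: stream a remote file straight to the output buffer, so peak
// memory is one chunk rather than the whole file. Illustrative only.
function url_pass_through($url) {
    if (!function_exists('curl_init')) {
        die('cURL is not installed!');
    }
    $ch = curl_init($url);
    // cURL calls this for every chunk it receives; echo and flush it
    // immediately instead of accumulating the body in a string.
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
        echo $chunk;
        flush();
        return strlen($chunk); // tell cURL the whole chunk was consumed
    });
    $ok = curl_exec($ch);
    curl_close($ch);
    return $ok;
}
```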

Or

$handle = fopen($url, "rb");
fpassthru($handle); // echoing the handle itself prints "Resource id #N", not the file
fclose($handle);

All solutions return the same thing: a 0 Kb file. Like I said, smaller files work.

Also, when I test locally, the 100 Mb files download correctly, so my guess is that the problem comes from the server. I've changed php.ini as follows:

register_globals = Off
magic_quotes_gpc = Off
post_max_size = 128M
memory_limit = 256M
upload_max_filesize = 128M
max_execution_time = 120
expose_php = off
session.save_path = /tmp
mysqli.default_socket = /tmp/mysql5.sock
mysql.default_socket = /tmp/mysql5.sock

It's probably a memory limit problem but not sure.
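If it is memory, the simplest test is to stop building the whole file in a string: file_get_contents() has to hold the full 100 Mb (plus copies) under memory_limit, while a chunked loop only ever holds one buffer. A minimal sketch, assuming the file is a local path readable by PHP (send_file_chunked is an illustrative name, not from the question):

```php
// Sketch: stream a file to the client in 8 KB chunks, so memory_limit
// is never a factor no matter how large the file is.
function send_file_chunked($path) {
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        echo fread($handle, 8192); // one chunk in memory at a time
        flush();                   // push it out to Apache immediately
    }
    fclose($handle);
    return true;
}
```

For local files, readfile($path) does the same thing in a single call.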


1 answer

  • doukanzhuo4297 2012-12-05 16:24

    I had a similar problem before. My problem turned out to be the hosting provider. Once a script takes too long to execute, the hosting provider typically just shuts it off. I've had the same issues when trying to upload large sql scripts through phpMyAdmin.
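If that's what's happening here, relaxing PHP's own per-request limits before streaming the download is worth a try. This is only a sketch: many shared hosts ignore these calls, and a kill at the web-server or process-manager level overrides them anyway.

```php
// Sketch: lift PHP's per-request limits before a large download.
// A hard kill at the Apache/host level will still override these.
@set_time_limit(0);             // remove the execution-time cap
@ini_set('memory_limit', '-1'); // remove the memory cap for this request
```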

    This answer was accepted by the asker.
