douyi02577 2012-12-05 16:16
Viewed 154
Accepted

PHP download of a large file returns a 0 KB file

Hello fellow Stack Overflowers,

I'm having a problem with PHP/Apache. I have an application that lets the admin upload 100 MB files. Uploading works well, but I'm having problems with downloading.

It works perfectly with smaller files (tested with a 50 MB file), but for some reason I can't get the 100 MB files.

Here's my PHP code:

$extension = 'zip'; //for testing

switch ($extension) {
  case "dwg": $contentType="image/vnd.dwg"; break;
  case "dxf": $contentType="image/vnd.dxf"; break;
  case "pdf": $contentType="application/pdf"; break;
  case "zip": $contentType="application/zip"; break; 
  case "png": $contentType="image/png"; break; 
  case "jpeg": $contentType="image/jpeg"; break; 
  case "jpg": $contentType="image/jpeg"; break; // "image/jpg" is not a registered MIME type
  case "gif": $contentType="image/gif"; break; 
  default:
    $contentType = 'application/octet-stream'; // generic binary fallback instead of an empty header
}

@header("Content-type: " . $contentType);
@header("Content-Disposition: attachment; filename=\"$filename\"");
@header("Cache-Control: no-cache, must-revalidate");
@header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // a date in the past

echo file_get_contents($url);
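One thing worth noting: `file_get_contents()` buffers the entire file in RAM before echoing it, so a 100 MB download needs well over 100 MB of `memory_limit`. A chunk-by-chunk variant that keeps peak memory constant (a sketch, assuming `$url` is a readable local path or stream) would look like:

```php
<?php
// Stream the file in 8 KB chunks instead of buffering it all
// with file_get_contents(); peak memory stays constant.
$handle = fopen($url, 'rb');
if ($handle === false) {
    die('Cannot open file');
}
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush(); // push each chunk out to Apache/the client
}
fclose($handle);
```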

I've also tried other solutions I found on SO.

header('Content-Description: File Transfer');
header('Content-Transfer-Encoding: binary');
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-type: " . $contentType);
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header("Content-Length: ".filesize($url));

echo self::url_get_contents(URL_PUBLIC . $url);

...

private function url_get_contents ($url) {
    if (!function_exists('curl_init')) {
        die('cURL is not installed!');
    }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output; // let the caller echo it; echoing here too sent the body twice
}
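Note that `CURLOPT_RETURNTRANSFER` also buffers the whole response in memory before anything is sent. A streaming variant (a sketch; each chunk is written straight to the client as it arrives) would be:

```php
<?php
// Stream the cURL response directly to the client instead of
// buffering it with CURLOPT_RETURNTRANSFER.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
    echo $chunk;
    flush();
    return strlen($chunk); // tell cURL the chunk was consumed
});
curl_exec($ch);
curl_close($ch);
```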

Or

$handle = fopen($url, "rb");
fpassthru($handle); // echoing the handle itself only prints "Resource id #..."
fclose($handle);

All of these return the same thing: a file of 0 KB. Like I said, smaller files work.

Also, when I test locally, the 100 MB files download correctly, so my guess is that the problem comes from the server. I've changed php.ini as follows:

register_globals = Off
magic_quotes_gpc = Off
post_max_size = 128M
memory_limit = 256M
upload_max_filesize = 128M
max_execution_time = 120
expose_php = off
session.save_path = /tmp
mysqli.default_socket = /tmp/mysql5.sock
mysql.default_socket = /tmp/mysql5.sock

It's probably a memory-limit problem, but I'm not sure.


1 answer

  • doukanzhuo4297 2012-12-05 16:24

    I had a similar problem before. Mine turned out to be the hosting provider: once a script takes too long to execute, the provider typically just kills it. I've had the same issue when trying to upload large SQL scripts through phpMyAdmin.
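    If the host honors it, a sketch that lifts PHP's own time limit and keeps output flowing in chunks (some providers kill scripts that stay silent or buffer too long; `$path` is a placeholder for the local file to serve):

```php
<?php
// Lift PHP's time limit (the host's external watchdog may still
// apply) and disable output buffering so chunks reach the client.
set_time_limit(0);
while (ob_get_level() > 0) {
    ob_end_flush();
}
$handle = fopen($path, 'rb'); // $path: local file to serve (assumed)
while (!feof($handle)) {
    echo fread($handle, 65536);
    flush();
}
fclose($handle);
```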

    This answer was accepted as the best answer by the asker.
