dtvfxzq3802 2009-08-04 19:53
Viewed 148 times
Accepted

Uncompressing large files with gzip in PHP

I'm using a simple unzip function (shown below) on my files so I don't have to decompress them manually before they are processed further.

function uncompress($srcName, $dstName) {
    // gzfile() reads and decompresses the whole file into an array of lines,
    // so the entire uncompressed contents end up in memory at once.
    $string = implode("", gzfile($srcName));
    $fp = fopen($dstName, "w");
    fwrite($fp, $string, strlen($string));
    fclose($fp);
}

The problem is that if the gzip file is large (e.g. 50 MB), the decompression takes a large amount of RAM.

The question: can I parse a gzipped file in chunks and still get the correct result? Or is there a better way to handle extracting large gzip files (even if it takes a few seconds longer)?


4 answers

  • doutaoer3148 2009-08-04 20:16

    gzfile() is a convenience method that calls gzopen, gzread, and gzclose.

    So, yes, you can call gzopen yourself and gzread the file in chunks.

    This will uncompress the file in 4kB chunks:

    function uncompress($srcName, $dstName) {
        // Open the gzipped source for reading and the destination for writing.
        $sfp = gzopen($srcName, "rb");
        $fp = fopen($dstName, "w");

        // Read and write 4 kB of decompressed data at a time,
        // so memory use stays constant regardless of file size.
        while (!gzeof($sfp)) {
            $string = gzread($sfp, 4096);
            fwrite($fp, $string, strlen($string));
        }
        gzclose($sfp);
        fclose($fp);
    }
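
    As an alternative (not part of this answer, and assuming the zlib extension is enabled), PHP's compress.zlib:// stream wrapper can read the decompressed bytes directly, and stream_copy_to_stream() then copies them to the destination in small internal chunks, so peak memory stays low. A minimal sketch; uncompressStream is a hypothetical helper name:

    function uncompressStream($srcName, $dstName) {
        // The compress.zlib:// wrapper yields decompressed data as it is read.
        $sfp = fopen("compress.zlib://" . $srcName, "rb");
        $fp = fopen($dstName, "wb");

        // stream_copy_to_stream() moves the data in small chunks internally,
        // so the whole file never has to fit in memory at once.
        stream_copy_to_stream($sfp, $fp);

        fclose($sfp);
        fclose($fp);
    }

    Usage would look like uncompressStream("data.xml.gz", "data.xml"); the same call shape as the uncompress() function above.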
    
    This answer was accepted as the best answer by the asker.