2016-11-21 07:18
Viewed 570 times

How to stream files from AWS S3 into a Zip

I'm using the PHP Flysystem package to stream content from my AWS S3 bucket. In particular, I'm using $filesystem->readStream.

My Question

When I stream a file, it ends up in myzip.zip and the size is correct, but when I unzip it, it becomes myzip.zip.cpgz. Here is my prototype:

header('Pragma: no-cache');
header('Content-Description: File Download');
header('Content-disposition: attachment; filename="myZip.zip"');
header('Content-Type: application/octet-stream');
header('Content-Transfer-Encoding: binary');
$s3 = Storage::disk('s3'); // Laravel Syntax
echo $s3->readStream('directory/file.jpg');

What am I doing wrong?

Side Question

When I stream a file like this, does it:

  1. get fully downloaded into my server's RAM, then get transferred to the client, or
  2. does it get saved - in chunks - in the buffer, and then get transferred to the client?

Basically, is my server being burdened if I have dozens of GBs of data being streamed?


1 answer

  • dpw5865 2016-11-21 07:38

    You are currently dumping the raw contents of directory/file.jpg as the zip, but a jpg is not a zip. You need to create an actual zip file containing those contents.

    Instead of

    echo $s3->readStream('directory/file.jpg');

    Try the following in its place using the Zip extension:

    // use a temporary file to store the Zip archive
    $zipFile = tmpfile();
    $zipPath = stream_get_meta_data($zipFile)['uri'];
    // and another temporary file for the downloaded jpg
    $jpgFile = tmpfile();
    $jpgPath = stream_get_meta_data($jpgFile)['uri'];
    // download the file to disk, chunk by chunk
    stream_copy_to_stream($s3->readStream('directory/file.jpg'), $jpgFile);
    // create the zip archive and add the downloaded file to it
    $zip = new ZipArchive();
    $zip->open($zipPath, ZipArchive::OVERWRITE);
    $zip->addFile($jpgPath, 'file.jpg');
    $zip->close();
    // send the finished zip to the client
    readfile($zipPath);

    Using tmpfile() and stream_copy_to_stream(), the file is downloaded in chunks to a temporary file on disk, not into RAM.
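    That chunked behaviour can be checked in isolation. The sketch below uses an in-memory php://temp stream as a hypothetical stand-in for $s3->readStream(...) (it is not the S3 call itself) and copies it with stream_copy_to_stream(), which moves data in internal chunks rather than materialising the whole payload as one PHP string:

```php
<?php
// Hypothetical stand-in for $s3->readStream('directory/file.jpg'):
// a seekable temp stream holding 1 MiB of dummy data.
$source = fopen('php://temp', 'r+');
fwrite($source, str_repeat('x', 1024 * 1024));
rewind($source);

// Copy to a temporary file on disk, as in the answer above.
// stream_copy_to_stream() reads and writes in chunks internally,
// so peak memory stays bounded regardless of the payload size.
$dest = tmpfile();
$copied = stream_copy_to_stream($source, $dest);

echo $copied; // prints 1048576, the number of bytes copied
fclose($source);
fclose($dest);
```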

