dongsui4658 2010-03-29 13:21

Can I stream a file into and out of a server without holding the data in RAM?

PHP question (new to PHP after 10 years of Perl, arrggghhh!).

I have a 100 MB file that I want to send to another server.

I have managed to read the file in and "post" it without curl (cannot use curl for this app). Everything is working fine on smaller files.

However, with the larger files, PHP complains about not being able to allocate memory.

Is there a way to open a file, line by line, and send it as a post ALSO line by line?

This way nothing is held in RAM, which would prevent my errors and work around the strict memory limit.

Chris

Here's my current code that errors with large files:

<?php
ini_set('display_errors',1);
error_reporting(E_ALL|E_STRICT);

$resp = do_post_request("/local/file.txt","http://www.mysite.com/receivedata.php");
exit;

function do_post_request($file,$url){
   $fileHandle = fopen($file, "rb");
   $fileContents = stream_get_contents($fileHandle);
   fclose($fileHandle);

   $params = array(
      'http' => array
      (
          'method' => 'POST',
          'header' => "Content-Type: multipart/form-data\r\n",
          'content' => $fileContents
      )
   );

   $ctx = stream_context_create($params);
   $fp = fopen($url, 'rb', false, $ctx);

   $response = stream_get_contents($fp);
   return $response;
}
?>
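The code above fails because `stream_get_contents` loads the entire file into `$fileContents` before the request is built. A way around this is to open a raw socket and write the body in small chunks, so only one chunk is ever in memory. Below is a minimal sketch of that approach; the host, path, chunk size, and `application/octet-stream` content type are assumptions, and the receiving script would need to read the raw POST body (e.g. from `php://input`) rather than a multipart form field:

```php
<?php
// Sketch: stream a file to a remote server chunk by chunk,
// so the whole file is never held in RAM at once.
function do_post_request_streaming($file, $host, $path)
{
    $size = filesize($file);
    $in   = fopen($file, 'rb');
    $out  = fsockopen($host, 80, $errno, $errstr, 30);
    if (!$in || !$out) {
        return false;
    }

    // Write the request line and headers by hand.
    fwrite($out, "POST $path HTTP/1.1\r\n");
    fwrite($out, "Host: $host\r\n");
    fwrite($out, "Content-Type: application/octet-stream\r\n");
    fwrite($out, "Content-Length: $size\r\n");
    fwrite($out, "Connection: close\r\n\r\n");

    // Send the body 8 KB at a time; only one chunk is in memory.
    while (!feof($in)) {
        fwrite($out, fread($in, 8192));
    }
    fclose($in);

    // Read the server's response (headers + body).
    $response = stream_get_contents($out);
    fclose($out);
    return $response;
}
```

Because `Content-Length` is set from `filesize()` up front, the server knows when the body ends even though it is written incrementally.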

1 Answer

  • dtg25862 2010-03-29 13:29

    You can use fopen and fgets (or, alternatively, fread) to read the file sequentially.

    However if your only purpose is to flush the file to standard output, you can simply use readfile('filename') and it'll do exactly what you want.
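    As a short illustration of both suggestions (the file path here is just a placeholder):

    ```php
    <?php
    // Sequential read: only one line is held in memory at a time.
    $fh = fopen('/local/file.txt', 'rb');
    while (($line = fgets($fh)) !== false) {
        // process or forward $line here
    }
    fclose($fh);

    // Or, to flush the whole file straight to output without
    // buffering it in PHP's memory:
    readfile('/local/file.txt');
    ```

    `readfile` streams the file to the output buffer internally, so it does not count against `memory_limit` the way `stream_get_contents` does.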

