How to output a massive file with PHP without running out of memory

I have the code below to output a big file, but it's falling over because PHP's memory use seems to grow and grow as the file is read:

  <?php
  // various header() calls etc.
  $stream = fopen($tarfile,'r');
  ob_end_flush();
  while (!feof($stream)) {
    $buf = fread($stream, 4096);
    print $buf;
    flush();
    unset($buf);
    $aa_usage = memory_get_usage(TRUE); // ← this keeps going up!
  }
  fclose($stream);

I had thought that the combination of flush() and unset() would limit the additional memory use to the 4 KB buffer, but I'm clearly wrong.


If all you need is to output the contents of a file, then the right tool for the job is PHP's readfile() function. Replace all the code you posted with:

  readfile($tarfile);

As the documentation says:

Note:

readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
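
Putting that note into practice, here is a minimal sketch (assuming $tarfile holds the path and the various header() calls from the question have already run; the Content-Length header is an optional extra, not something readfile() requires):

  <?php
  // Close any output-buffering levels that are still active, as the
  // note above suggests, so the file streams straight to the client
  // instead of accumulating in a buffer.
  while (ob_get_level() > 0) {
      ob_end_flush();
  }

  header('Content-Length: ' . filesize($tarfile));  // optional, lets the client show progress
  readfile($tarfile);
  exit;

If memory still climbs with this in place, the growth is coming from somewhere other than the file output itself.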


You can also try loading only the data you actually need; if you have to jump to a specific part of the file, use the fseek() function.
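
For example, a hedged sketch of sending just one byte range of the file with fseek() (the offset and length below are made-up values, purely for illustration):

  <?php
  // Hypothetical slice: start 1 MB into the file and send 4 MB of it.
  $start  = 1048576;
  $length = 4 * 1024 * 1024;

  $stream = fopen($tarfile, 'rb');
  fseek($stream, $start);                  // jump to the requested offset

  $remaining = $length;
  while ($remaining > 0 && !feof($stream)) {
      $chunk = fread($stream, min(4096, $remaining));
      if ($chunk === false || $chunk === '') {
          break;                           // read error or nothing left to send
      }
      print $chunk;
      flush();
      $remaining -= strlen($chunk);
  }
  fclose($stream);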
