dongxing4643 2013-08-28 20:47
Views: 24

PHP server file download unexpectedly cut off

I have a web interface that I built into the admin section of a WordPress site. It scrapes a few tables in my database and just displays a big list of data row by row. There are about 30,000 rows of this data, displayed with a basic echo in a for loop. Displaying all 30,000 rows on a page works fine.

Additionally, I include an option to download a CSV file of all the rows of data. I use fopen and then fputcsv to build the CSV file for download from the result of the data query. This feature used to work, but now that the dataset is at 30,000 rows, the CSV no longer generates correctly. Only the first 200–1,000 rows get written to the CSV file, leaving out the majority of the data; I estimate the complete CSV would be about 10 MB. The truncated file then downloads as though everything were working correctly.

Here is the code:

// This gets a huge list of data from a SP I built. This data is well formed
$data = $this->run_stats_stored_procedure($job_to_report);

// This is where the data is converted into a csv file. This part is broken
// the file may already exist at that location; burn it down if it does
if(file_exists(ABSPATH . "some/path/to/my/file/candidate_export.csv")) {
    unlink(ABSPATH . "some/path/to/my/file/candidate_export.csv");
}
$csv_file_handler = fopen(ABSPATH . "some/path/to/my/file/candidate_export.csv", 'w');

if(!empty($csv_file_handler)) {

    $title_array = array(
        "ID",
        "other_feild"
    );

    fputcsv($csv_file_handler, $title_array, ",");

    if(!empty($data)) {

        foreach($data as $data_piece) {
            $array_as_csv_line = array();   

            foreach($data_piece as $object_property) {                  
                $array_as_csv_line[] = (string)$object_property;
            }

            fputcsv($csv_file_handler, $array_as_csv_line, ",");    
            unset($array_as_csv_line);
        }
    } else {
        fputcsv($csv_file_handler, array("empty"), ",");
    }           
    // pros clean everything up when they are done
    fclose($csv_file_handler);
}

I'm not sure what I need to change to get the entire CSV file to download. I believe this could be a configuration issue, but I'm not certain. I am led to believe this because this function used to work with 20,000 CSV rows; now that the dataset is at 30,000, it breaks. Please let me know if additional info would help. Has anyone bumped into issues with huge CSV files before? Thank you to anyone who can help.


2 answers

  • doumei7420 2013-08-29 02:18

    Is the "download" taking more than say a minute, two minutes, or three minutes? If so, the webserver could be closing the connection. For example, if you're using the Apache FCGI module, it has this directive:

    FcgidBusyTimeout
    

    which defaults to 300 seconds.

    This is the maximum time limit for request handling. If a FastCGI request does not complete within FcgidBusyTimeout seconds, it will be subject to termination.
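    A quick way to confirm is to time the export on the server side. This is a minimal sketch (the log message and placement are illustrative assumptions, not your actual code):

        // Hypothetical timing check: wrap the existing export code and log
        // how long generation takes, to compare against the server timeout.
        $start = microtime(true);
        // ... run the query and the fputcsv loop here ...
        error_log(sprintf("CSV export took %.1f seconds", microtime(true) - $start));

    If the export really does exceed the limit, raising it in the Apache configuration might look like the following (assuming mod_fcgid is in use; the 900-second value is an arbitrary example, tune it to however long your export actually takes):

        <IfModule mod_fcgid.c>
            # Allow FastCGI requests up to 15 minutes before termination
            # (the documented default is 300 seconds).
            FcgidBusyTimeout 900
        </IfModule>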

    Hope this helps you solve your problem.

