I have a DB of sensor data that is collected every second. The client would like to be able to download 12-hour chunks in CSV format - this part is all done.
Sadly the output isn't straight data and needs to be processed before the CSV can be created (parts are stored as JSON in the DB), so I can't just dump the table.
So, to reduce load, I figured that the first time the file is downloaded I would cache it to disk, and any further requests would just download that file.
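Roughly, the serving side I have in mind looks like this (a minimal sketch; the header/readfile combination is just my assumption of how to stream the cached file back):

if (file_exists($pathToFile)) {
    // chunk was already generated - stream it straight from disk
    header('Content-Type: text/csv');
    readfile($pathToFile); // reads in chunks, so memory use stays flat
    exit;
}
// otherwise fall through and generate the CSV row by row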
If I don't try to write it (using file_put_contents with FILE_APPEND) and just echo every line, it is fine; but if I write it, the script runs out of memory even when I give it 512M.
So this works:
while ($stmt->fetch()) {
    // processing code
    $content = // CSV formatting
    echo $content;
}
This does not:
while ($stmt->fetch()) {
    // processing code
    $content = // CSV formatting
    file_put_contents($pathToFile, $content, FILE_APPEND);
}
It seems like, even though I am calling file_put_contents on every iteration, it is storing everything in memory.
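I'm wondering whether opening the file once with fopen and using fwrite per row would behave any differently - something like this untested sketch (mirroring the loop above, with the CSV formatting elided):

$fh = fopen($pathToFile, 'a'); // open the handle once instead of per row
while ($stmt->fetch()) {
    // processing code
    $content = /* CSV formatting */ '';
    fwrite($fh, $content); // write just this row
}
fclose($fh);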
Any suggestions?