Yii: large SQL query consumes a huge amount of memory

I am using Yii 1.1.14 with PHP 5.3 on CentOS 6, and I am using CDbCommand to fetch data from a very large table. The result set is ~90,000 records over 10 columns. I am exporting it to a CSV file, and the file size is about 15MB. The script always crashed without any error message; only after some research did I figure out that I needed to raise memory_limit in php.ini to get the script to execute successfully. The only problem is that for a successful execution I had to raise the memory limit to 512MB(!), which is a lot, and if 10 users run the same script my server will not respond very well...

I was wondering if anyone knows of a way to reduce memory consumption for SQL queries with Yii. I know I can split the query into multiple queries using limits and offsets, but it just doesn't seem logical that a 15MB result would consume 512MB.

Here is the code:

set_time_limit(0); // no time limit for the export
$connection = new CDbConnection($dsn, $username, $password);
$command = $connection->createCommand('SELECT * FROM TEST_DATA');
$result = $command->queryAll(); // this is where the script crashes
print_r($result);
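
For what it's worth, the memory growth can be confirmed directly with PHP's built-in counters (a diagnostic sketch using the same query as above):

set_time_limit(0);
$connection = new CDbConnection($dsn, $username, $password);

echo 'Before query: ' . memory_get_usage(true) . " bytes\n";
$result = $connection->createCommand('SELECT * FROM TEST_DATA')->queryAll();
echo 'After queryAll: ' . memory_get_usage(true) . " bytes\n";
echo 'Peak: ' . memory_get_peak_usage(true) . " bytes\n";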

Any ideas would be greatly appreciated!

Thanks,

duanraotun1674: Hi, thanks for the reply! @mikemiller, it shouldn't be a timing issue, since the script fails quickly (after ~5 seconds) and I have already set the time limit to 0.
nearly 6 years ago
dongli564510: So you think that if something takes 15MB when written to a file, it must allocate the same amount of memory when converted into the data structures of the language holding the information? But to answer the question you have already (correctly) guessed at: use LIMIT and OFFSET. Don't load everything at once. Count how many rows you have and process 5k rows before moving on to the next 5k, until you have gone through all 90k.
nearly 6 years ago
ds355020: Are you sure the problem isn't in the database? If it is a very large query, you may be timing PHP out rather than exhausting its memory. I assume you are using MySQL. Maybe try some indexes to speed it up; otherwise you are stuck with limit and offset, or with migrating to a NoSQL solution.
nearly 6 years ago

1 Answer

Instead of using queryAll, which returns all the rows in a single array (the real memory problem is here), you should simply use a foreach loop (take a look at CDbDataReader), e.g.:

$command = $connection->createCommand('SELECT * FROM TEST_DATA');
$rows = $command->query(); // returns a CDbDataReader; rows are fetched one at a time
foreach ($rows as $row)
{
    // process $row here (e.g. write it to the CSV file)
}
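
Note that with the MySQL PDO driver the full result set is still buffered on the client by default, so query() alone may not lower the peak much. A minimal sketch, assuming MySQL over PDO, that turns buffering off so rows really stream from the server:

$connection->active = true; // make sure the connection is open before touching PDO
// assumption: MySQL driver; disable client-side result buffering
$connection->pdoInstance->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$rows = $connection->createCommand('SELECT * FROM TEST_DATA')->query();
foreach ($rows as $row)
{
    // each row is fetched from the server as you iterate
}

Keep in mind that with an unbuffered query no other statement can run on the same connection until the reader has been fully consumed.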

EDIT: Using LIMIT

$count = $connection->createCommand('SELECT COUNT(*) FROM TEST_DATA')->queryScalar();
$maxRows = 1000; // rows per chunk
$maxPages = ceil($count / $maxRows);

for ($i = 0; $i < $maxPages; $i++)
{
    $offset = $i * $maxRows;
    $rows = $connection->createCommand("SELECT * FROM TEST_DATA LIMIT $offset,$maxRows")->query();
    foreach ($rows as $row)
    {
      // Here your code
    }
}
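
To tie this back to the original CSV export, each chunk can be written straight to disk instead of being accumulated in memory. A minimal sketch, assuming a hypothetical output file export.csv:

$maxRows = 1000; // rows per chunk
$count = $connection->createCommand('SELECT COUNT(*) FROM TEST_DATA')->queryScalar();
$maxPages = ceil($count / $maxRows);

$fp = fopen('export.csv', 'w'); // hypothetical output path

for ($i = 0; $i < $maxPages; $i++)
{
    $offset = $i * $maxRows;
    $rows = $connection->createCommand("SELECT * FROM TEST_DATA LIMIT $offset,$maxRows")->query();
    foreach ($rows as $row)
    {
        fputcsv($fp, $row); // one row at a time; memory use stays flat
    }
}

fclose($fp);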
dounao5856: The answer has been updated.
nearly 6 years ago
duandange7480: You should use LIMIT.
nearly 6 years ago
douweibiao8471: Hi, this approach works really well; using your example I was able to create a ~16MB file with about 110,000 records, so that is roughly a 20% improvement! But the script still crashes at $rows = $command->query(); when trying to fetch more records... any ideas?
nearly 6 years ago