2013-03-12 20:15 · Viewed 71 times


I am wondering if there is a way to do what I am doing more efficiently. Right now, I have a class that retrieves statuses from the database. It's pretty simple and shouldn't really affect performance all that much.

public function get($var1, $var2, $var3) {
    $posts = array(); // Initialize an empty array
    // Query the database
    $Statement = $this->Database->prepare("SELECT id, name, excerpt, post, timestamp, tags, title FROM posts WHERE col1 = ? AND col2 = ? AND col3 = ? ORDER BY id DESC LIMIT 15");
    $Statement->execute(array($var1, $var2, $var3));

    while ($row = $Statement->fetch(PDO::FETCH_ASSOC)) {
        $posts[] = array("id" => $row["id"], /* etc... */);
    }

    return $posts;
} // end get

And then my page is set up something like this, which I know is not efficient at all:

<?php for ($count = 1; $count <= $total; $count++): // Display the calendar
    echo $count;
    $feed = $Feed->get($count, $total, $var3);

    foreach ($feed as $post):
        echo $post["id"];
    endforeach;
endfor; ?>

I hope that makes sense. There's a lot more HTML thrown in there and everything. Right now there are only 18 rows in my database, and it takes 10 seconds to load the page, which is really bad. I have to set it up this way because of the design of the site: the whole thing is laid out as a calendar, so the foreach loop has to be inside the for loop.

My question is whether it would be more efficient to select all of the rows once, save them outside of the for loop, and then work with that array, or whether it's better to run a query inside the loop the way I'm doing it now (I've read a lot, and know that most people say this is a huge no-no). And what kind of issues would I run into with the former option if there were, say, a million rows in the database?
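For what it's worth, the "select once, work with the array" option can be sketched roughly like this: fetch every row the calendar needs in a single query, group the rows in PHP by the column the for loop iterates over, and then read from that array inside the loop. The helper below is a hypothetical illustration (the column name "col1" and the grouping function are assumptions, not part of the original class):

```php
<?php
// Hypothetical helper: group already-fetched rows by a key column,
// so the calendar loop never touches the database.
function groupRowsBy(array $rows, $key) {
    $grouped = array();
    foreach ($rows as $row) {
        // Append each row to the bucket for its key value.
        $grouped[$row[$key]][] = $row;
    }
    return $grouped;
}

// Fetch everything once, outside the for loop (sketch, assuming PDO
// and the same schema as the question):
//
// $Statement = $Database->prepare(
//     "SELECT id, name, excerpt, post, timestamp, tags, title
//      FROM posts WHERE col2 = ? AND col3 = ? ORDER BY id DESC");
// $Statement->execute(array($total, $var3));
// $byDay = groupRowsBy($Statement->fetchAll(PDO::FETCH_ASSOC), "col1");
//
// Then the calendar loop only reads from memory:
//
// for ($count = 1; $count <= $total; $count++) {
//     echo $count;
//     $posts = isset($byDay[$count]) ? $byDay[$count] : array();
//     foreach ($posts as $post) {
//         echo $post["id"];
//     }
// }
```

With 18 rows this is one query instead of ~30; the memory concern only becomes real when the single result set itself is huge, at which point you would add a WHERE clause or LIMIT to fetch just the date range the calendar displays.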

I hope that makes sense; I'll update the question if it doesn't. Right now about 30 queries are being made just to access 1 or 2 rows each. The only other option I could come up with is selecting all of the rows in the table and then working with that array, but if there were, say, 1 million rows in the db, I feel like that would hurt performance a lot more.

Am I right, and what are some solutions? Thanks.

1 answer

  • Accepted answer
    duanping2695 · 2013-03-12 21:36

    I just want to point out that I did resolve the issue. If anyone is wondering why the foreach loop was querying so slowly: I had accidentally deleted a line, which meant the code connected to the Facebook API inside the foreach loop on every iteration to gather the poster's information. So if anyone ever stumbles upon this question, just to be clear: making many Facebook API calls in a loop is a bad thing. Save the info in your database and query that instead.
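    The "save it in your database" idea from the answer is essentially a cache with a freshness check: only call the external API when the stored copy is missing or stale. A minimal sketch, where the table name, column names, and fetchPosterFromFacebook() are all hypothetical placeholders for whatever your schema and API wrapper actually look like:

    ```php
    <?php
    // Decide whether a cached row is still fresh enough to reuse.
    // $fetchedAt and $now are Unix timestamps; default max age is one day.
    function isFresh($fetchedAt, $now, $maxAgeSeconds = 86400) {
        return ($now - $fetchedAt) < $maxAgeSeconds;
    }

    // Usage idea (not runnable as-is; depends on your schema/API wrapper):
    //
    // $row = /* SELECT name, fetched_at FROM posters WHERE fb_id = ? */;
    // if ($row === false || !isFresh($row["fetched_at"], time())) {
    //     $info = fetchPosterFromFacebook($fbId); // one API call, only on a miss
    //     /* INSERT ... ON DUPLICATE KEY UPDATE name = ?, fetched_at = ? */;
    // }
    ```

    The render loop then reads poster names from the local table only, so page load time no longer depends on the Facebook API at all.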
