donglinyi4313 2012-06-20 15:23
Viewed 83 times
Accepted

Which is better for pagination: storing the results in $_SESSION, or running a separate query for each page?

Imagine I've got a database with lots of data that users can search.
The result of a typical search is generally around 20-100 rows, which are then paginated (20 rows per page).

I've thought of two approaches to handle the navigation for these pages and would like to know the pros and cons of each, and whether there are better alternatives.

  1. Query once, store the results in a $_SESSION variable, and filter the rows according to the current page. The idea is to retrieve the data only once, so the database doesn't have to be queried again for every page the user navigates to. I don't know whether this is better or worse than the other alternative.

    session_start();
    
    $search = $_GET['search'];          //search word ($_GET is already URL-decoded)
    $interval = (int)$_GET['interval']; //rows per page
    $page = (int)$_GET['page'];         //page
    
    $min_row = $interval * ($page - 1) + 1;
    $max_row = $interval * $page;
    
    //query only if there are no cached results or the search term has changed
    if(empty($_SESSION['SEARCH_RESULTS']) || $_SESSION['SEARCH_RESULTS']['TERM'] !== $search){
        $_SESSION['SEARCH_RESULTS'] = array();
        $_SESSION['SEARCH_RESULTS']['TERM'] = $search;
        $_SESSION['SEARCH_RESULTS']['ROWS'] = array(); //initialize so count() is safe on empty results
    
        //NOTE: interpolating $search directly is an SQL-injection risk; it should be parameterized
        $query = "exec usp_Search '$search'";
    
        $dbh = new DBH;
        $dbh->Connect()->Query($query);
    
        while($row = $dbh->Fetch_Array()){
            $_SESSION['SEARCH_RESULTS']['ROWS'][] = $row;
        }
    }
    
    for($j = 0; $j < count($_SESSION['SEARCH_RESULTS']['ROWS']); $j++){
        //skip rows outside the current page ($j is 0-based, row numbers are 1-based)
        if($j < $min_row - 1 || $j > $max_row - 1) continue;
    
        $row = $_SESSION['SEARCH_RESULTS']['ROWS'][$j];
    
        //print stuff
    }
    
  2. Query page by page. The query and the pagination are pretty straightforward (a parameterized version of this approach is sketched after the list).

    //Query
    $search = $_GET['search'];
    $interval = (int)$_GET['interval']; //cast so the numbers can't carry SQL
    $page = (int)$_GET['page'];
    
    $min_row = $interval * ($page - 1) + 1;
    $max_row = $interval * $page;
    
    //NOTE: $search is still interpolated here; see the parameterized sketch below
    $query = "exec usp_Search '$search', $min_row, $max_row";
    
    $dbh = new DBH;
    $dbh->Connect()->Query($query);
    
    while($row = $dbh->Fetch_Array()){
        //print stuff
    }
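
Both snippets interpolate user input straight into the SQL string, which is an injection risk. Here is a minimal sketch of the second approach using prepared statements; it assumes PDO with the sqlsrv driver, and the connection string and credentials are placeholders, not the original DBH class:

    //Sketch only: PDO with bound parameters instead of string interpolation
    $search = $_GET['search'];
    $interval = (int)$_GET['interval'];
    $page = (int)$_GET['page'];
    
    $min_row = $interval * ($page - 1) + 1;
    $max_row = $interval * $page;
    
    //connection details are placeholders
    $pdo = new PDO('sqlsrv:Server=localhost;Database=mydb', $db_user, $db_pass);
    
    //ODBC call syntax; the inputs never touch the SQL string
    $stmt = $pdo->prepare('{CALL usp_Search(?, ?, ?)}');
    $stmt->execute([$search, $min_row, $max_row]);
    
    while($row = $stmt->fetch(PDO::FETCH_ASSOC)){
        //print stuff
    }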
    

SQL procedures from the alternatives

  1. Is just a procedure with a SELECT query

    SELECT 
        COL1,
        COL2,
        COL...
    FROM TABLE1
    WHERE (
        COL1 LIKE '%'+@search+'%' OR 
        COL2 LIKE '%'+@search+'%' OR 
        COL... LIKE '%'+@search+'%'
    )
    
  2. Is a procedure that builds a temp table with row numbers, then selects the rows between the start and end variables (a single-statement CTE variant is sketched after this list).

    SELECT 
        COL1,
        COL2,
        COL...,
        ROW_NUMBER() OVER (ORDER BY COL1) AS [ROW_NUMBER]
    INTO #result
    FROM TABLE1
    WHERE (
        COL1 LIKE '%'+@search+'%' OR 
        COL2 LIKE '%'+@search+'%' OR 
        COL... LIKE '%'+@search+'%'
    )   
    
    SELECT 
        COL1,
        COL2,
        COL...
    FROM #result
    WHERE [ROW_NUMBER] BETWEEN @row_start AND @row_end
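
As an aside, the temp table in the second procedure isn't strictly necessary: the same paging can be done in a single statement by filtering ROW_NUMBER() through a CTE. A sketch with the extra columns elided, using the same parameters:

    WITH result AS (
        SELECT 
            COL1,
            COL2,
            ROW_NUMBER() OVER (ORDER BY COL1) AS [ROW_NUMBER]
        FROM TABLE1
        WHERE (
            COL1 LIKE '%'+@search+'%' OR 
            COL2 LIKE '%'+@search+'%'
        )
    )
    SELECT 
        COL1,
        COL2
    FROM result
    WHERE [ROW_NUMBER] BETWEEN @row_start AND @row_end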
    

2 answers

  • douju3911 2012-06-20 15:46

    You really can't store all of the results in _SESSION, for at least a couple of reasons:

    • Users may make multiple searches simultaneously
    • Search results may change between a user's page loads.

    The second point depends on how frequently you update your DB, but it is something to consider. The first is the major one, but you may be able to get around it if you store things in the session in a clever way (without letting _SESSION grow too large either); one such approach is sketched below. Both points apply irrespective of performance.
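
    For illustration, one "clever" way to store things (a hypothetical sketch, not the answerer's code) is to key the cache by search term and slice out the current page; fetch_all_rows() is an assumed helper that runs the parameterized search query:

        //Hypothetical sketch: keying by search term keeps two simultaneous
        //searches in one session from overwriting each other
        session_start();
        
        $search = $_GET['search'];
        $interval = (int)$_GET['interval'];
        $page = (int)$_GET['page'];
        
        if(!isset($_SESSION['SEARCH_CACHE'][$search])){
            $_SESSION['SEARCH_CACHE'][$search] = fetch_all_rows($search); //assumed helper
        }
        
        //array_slice replaces the manual skip-loop from the question
        $rows = array_slice($_SESSION['SEARCH_CACHE'][$search], ($page - 1) * $interval, $interval);
        
        foreach($rows as $row){
            //print stuff
        }

    Capping or expiring old SEARCH_CACHE entries would address the size concern above.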

    Another consideration about fetching all of the results at once and storing them in _SESSION is that the majority of your users may make only one search request per visit. I know you would like to think they will always look at all 100 results, but if a large chunk of those results never gets viewed, you're wasting quite a lot just to save a query or two. It's up to you to figure out how your users navigate.


    After reading that this is only going to be used by 20-30 people and only 70 rows a day, I'm satisfied to say you're wasting time trying to improve performance at this point. Go for the code that's easier to update later in case of major changes.

    This answer was selected by the asker as the accepted answer.
