dongxing4805 2015-05-12 22:09
Viewed 41 · Accepted

Updating page information with jQuery from a PHP script that makes an external connection

I have a PHP script that connects to my other server using file_get_contents, then retrieves and displays the data:

    //authorize connection to the ext. server
    $xml_data = file_get_contents("http://server.com/connectioncounts");
    $doc = new DOMDocument();
    $doc->loadXML($xml_data);

    //variables to check for name / connection count
    $wmsast = $doc->getElementsByTagName('Name');
    $wmsasct = $wmsast->length;

    //start the loop that fetches and displays each name
    for ($sidx = 0; $sidx < $wmsasct; $sidx++) {
        $strname = $wmsast->item($sidx)->getElementsByTagName("WhoIs")->item(0)->nodeValue;
        $strctot = $wmsast->item($sidx)->getElementsByTagName("Sessions")->item(0)->nodeValue;

        /**************************************
        Display only one instance of their name.
        strpos will check to see if the string contains a _ character
        **************************************/
        if (strpos($strname, '_') !== FALSE) {
            //null. ignoring any duplicates
        }
        else {
            //Leftovers. This section contains the names that are only the BASE (no _jibberish, etc)
            echo $sidx . " <b>Name: </b>" . $strname . " Sessions: " . $strctot . "<br />";
        }//end display base check
    }//end name loop

From the client side, I'm calling this script with jQuery's load(), triggered on mousemove():

    $(document).mousemove(function(event){
        $('.xmlData').load('./connectioncounts.php').fadeIn(1000);
    });

I've also experimented with setInterval, which works just as well:

    var auto_refresh = setInterval(function () {
        $('.xmlData').load('./connectioncounts.php').fadeIn("slow");
    }, 1000); //refresh, 1000 milli = 1 second

It all works and the contents appear in "real time", but I can already notice a performance impact, and that's with just me using it.

I'm trying to come up with a better solution but falling short. The problem with my current approach is that every client forces the script to open a new connection to the other server, so I need a way to keep the information consistently updated without each client triggering a new connection directly.

One idea I had was to use a cron job that executes the script, and to modify the PHP to write the contents to a cache. Then I could simply read that cache from the client side. This would mean only one connection is made instead of a new connection every time a client wants the data.
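
To illustrate what I mean (the script name and cache path below are just placeholders), the cron job would run a small fetcher that writes the XML to a local file, and connectioncounts.php would read that file instead of contacting the other server:

    // fetch_connectioncounts.php -- run by cron, not by visitors
    // Grab the XML once and store it locally so clients never hit the remote server
    $xml_data = file_get_contents("http://server.com/connectioncounts");

    // Only overwrite the cache if the fetch actually succeeded
    if ($xml_data !== false) {
        file_put_contents("/tmp/connectioncounts_cache.xml", $xml_data, LOCK_EX);
    }

    // connectioncounts.php would then start with:
    // $xml_data = file_get_contents("/tmp/connectioncounts_cache.xml");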

The only problem is that the cron job would have to run frequently, like every few seconds. I've read about people running cron that often, but none of the examples I've come across were also making an external connection each time.

Is there any option other than cron to achieve this, or is cron good enough in your experience?


2 Answers

  • doudi5892 2015-05-12 22:18

    How about this: When the first client reads your data, you retrieve them from the remote server and cache them together with a timestamp.

    When the next clients read the same data, you check how old the cached contents are, and only if they're older than 2 seconds (or whatever) do you access the remote server again.
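
    Roughly something like this (the cache path and the 2-second window are just examples, not tested code):

        // Serve the cached XML if it is fresh, otherwise refetch and rewrite the cache
        $cache_file = "/tmp/connectioncounts.xml";
        $max_age = 2; // seconds

        if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
            // Cache is fresh enough -- no remote connection needed
            $xml_data = file_get_contents($cache_file);
        } else {
            // Cache is stale or missing -- hit the remote server once and refresh it
            $xml_data = file_get_contents("http://server.com/connectioncounts");
            if ($xml_data !== false) {
                file_put_contents($cache_file, $xml_data, LOCK_EX);
            } elseif (file_exists($cache_file)) {
                // Remote fetch failed -- fall back to the stale copy
                $xml_data = file_get_contents($cache_file);
            }
        }

        // ...then feed $xml_data to DOMDocument and run the existing Name/Sessions loop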

    This answer was accepted by the asker as the best answer.
