dongxing4805 2015-05-12 22:09
Accepted

Updating page information with jQuery from a PHP script that performs an external connection

I have a PHP script that performs a connection to my other server using file_get_contents, and then retrieves and displays the data.

    //authorize connection to the ext. server
    $xml_data = file_get_contents("http://server.com/connectioncounts");
    $doc = new DOMDocument();
    $doc->loadXML($xml_data);

    //variables to check for name / connection count
    $wmsast = $doc->getElementsByTagName('Name');
    $wmsasct = $wmsast->length;

    //start the loop that fetches and displays each name
    for ($sidx = 0; $sidx < $wmsasct; $sidx++) {
        $strname = $wmsast->item($sidx)->getElementsByTagName("WhoIs")->item(0)->nodeValue;
        $strctot = $wmsast->item($sidx)->getElementsByTagName("Sessions")->item(0)->nodeValue;

        /**************************************
        Display only one instance of their name.
        strpos will check to see if the string contains a _ character
        **************************************/
        if (strpos($strname, '_') !== FALSE) {
            //null. ignoring any duplicates
        }
        else {
            //Leftovers. This section contains the names that are only the BASE (no _jibberish, etc)
            echo $sidx . " <b>Name: </b>" . $strname . " Sessions: " . $strctot . "<br />";
        } //end display base check
    } //end name loop

From the client side, I'm calling this script with jQuery's load() and triggering it on mousemove().

    $(document).mousemove(function(event){
        $('.xmlData').load('./connectioncounts.php').fadeIn(1000);
    });

I've also experimented with setInterval(), which works just as well:

    var auto_refresh = setInterval(function () {
        $('.xmlData').load('./connectioncounts.php').fadeIn("slow");
    }, 1000); //refresh every 1000 ms = 1 second

It all works and the contents appear in "real time", but I can already notice an impact on performance, and that's with just me using it.

I'm trying to come up with a better solution but falling short. The problem with what I have now is that every client forces the script to open a new connection to the other server, so I need a solution that keeps the information consistently updated without each client triggering a new connection to that server directly.

One idea I had was to use a cron job that executes the script, and modify the PHP to log the contents. Then I could simply get the contents of that cache from the client side. This would mean that there is only one connection being made instead of forcing a new connection every time a client wants the data.
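Roughly, I imagine the cron side and the client-facing side looking something like this (just a sketch; the cache path is a placeholder):

    //cron_fetch.php - run by cron; makes the single external connection and caches the result
    $xml_data = file_get_contents("http://server.com/connectioncounts");
    if ($xml_data !== false) {
        //LOCK_EX so a client never reads a half-written file
        file_put_contents("/tmp/connectioncounts.cache", $xml_data, LOCK_EX);
    }

And connectioncounts.php would then read the local cache instead of hitting the other server:

    //connectioncounts.php - now parses the cached copy instead of connecting out
    $xml_data = file_get_contents("/tmp/connectioncounts.cache");
    $doc = new DOMDocument();
    $doc->loadXML($xml_data);
    //...same name/session loop as above...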

The only problem is that the cron job would have to run frequently, like every few seconds. I've read about people running cron that often before, but in every instance I've come across, the job isn't also making an external connection each time.
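One thing I'm aware of is that standard cron can't fire more often than once per minute, so if I went this route I'd probably have cron start a short-lived wrapper each minute that refreshes the cache a few times with sleep() in between. A rough sketch (the 5-second interval and the paths are just examples):

    //fetch_loop.php - started by cron once per minute, e.g. "* * * * * php /path/to/fetch_loop.php"
    //refreshes the cache roughly every 5 seconds, then exits before the next cron run
    for ($i = 0; $i < 11; $i++) {
        $xml_data = file_get_contents("http://server.com/connectioncounts");
        if ($xml_data !== false) {
            file_put_contents("/tmp/connectioncounts.cache", $xml_data, LOCK_EX);
        }
        sleep(5);
    }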

Is there any option for me other than cron to achieve this, or in your experience is that good enough?


2 answers

  • doudi5892 2015-05-12 22:18

    How about this: when the first client requests your data, you retrieve it from the remote server and cache it together with a timestamp.

    When subsequent clients request the same data, you check how old the cached contents are, and only if they're older than 2 seconds (or whatever threshold you choose) do you access the remote server again.
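    In PHP that could be as simple as something like the following, using the cache file's modification time as the timestamp (the cache path and the 2-second threshold are just placeholders):

    //connectioncounts.php - refresh the cache only when it's older than 2 seconds
    $cache = "/tmp/connectioncounts.cache";

    if (!file_exists($cache) || time() - filemtime($cache) > 2) {
        $xml_data = file_get_contents("http://server.com/connectioncounts");
        if ($xml_data !== false) {
            //LOCK_EX so a concurrent reader never sees a half-written file
            file_put_contents($cache, $xml_data, LOCK_EX);
        }
    }

    //from here on, parse the cached copy exactly as in your existing loop
    $xml_data = file_get_contents($cache);
    $doc = new DOMDocument();
    $doc->loadXML($xml_data);

    That way only the first request in any 2-second window touches the remote server; every other client just reads the local file.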

    This answer was accepted by the asker as the best answer.
