Asynchronous function calls in PHP

I am working on a PHP web application and I need to perform some network operations during the request, like fetching some data from a remote server based on the user's request.

Is it possible to simulate asynchronous behavior in PHP, given that I have to pass some data to a function and also need its output?

My code is like:

<?php

     $data1 = processGETandPOST();
     $data2 = processGETandPOST();
     $data3 = processGETandPOST();

     $response1 = makeNetworkCall($data1);
     $response2 = makeNetworkCall($data2);
     $response3 = makeNetworkCall($data3);

     processNetworkResponse($response1);
     processNetworkResponse($response2);
     processNetworkResponse($response3);

     /*HTML and OTHER UI STUFF HERE*/

     exit;
?>

Each network operation takes around 5 seconds to complete, adding a total of 15 seconds to the response time of my application, given that I make 3 requests.

The makeNetworkCall() function just does an HTTP POST request.

The remote server is a 3rd-party API, so I don't have any control over it.

PS: Please do not answer with suggestions about AJAX or other things. I am currently looking into whether I can do this through PHP, maybe with a C++ extension or something like that.

doupu0619: You can use PHP's stream_select function to run non-blocking code. React uses it to create an event-driven loop similar to node.js.
Replied about 5 years ago
dongpeng8994: Possible duplicate of: calling a PHP function asynchronously with PHP threads
Replied over 6 years ago
drake900918: I believe the answer is here: stackoverflow.com/questions/13846192/... Quick note: use threads
Replied over 7 years ago
doujiang1993: Try using cURL to trigger the request and fetch some data from the web...
Replied almost 8 years ago

6 answers




Nowadays, it's better to use queues than threads (for those who don't use Laravel, there are tons of other implementations out there, like this).

The basic idea is: your original PHP script puts tasks or jobs into a queue. Then you have queue job workers running elsewhere, taking jobs out of the queue and processing them independently of the original PHP script.

The advantages are:

  1. Scalability - you can just add worker nodes to keep up with demand. In this way, tasks are run in parallel.
  2. Reliability - modern queue managers such as RabbitMQ, ZeroMQ, Redis, etc, are made to be extremely reliable.
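
A minimal sketch of the idea, assuming the phpredis extension and a Redis server on localhost; the 'network_jobs' list name, worker.php, and the supervisord hint are illustrative choices, and makeNetworkCall()/processNetworkResponse() are the question's own placeholders:

<?php
// request script — only enqueues the work and returns to the browser at once
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
foreach (array($data1, $data2, $data3) as $data) {
    $redis->lPush('network_jobs', json_encode($data)); // push the job onto the queue
}
/* HTML and other UI stuff can be sent right away here */
?>

<?php
// worker.php — run one or more copies separately (e.g. under supervisord)
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
while (true) {
    // brPop blocks until a job is available and returns array(listName, payload)
    list(, $payload) = $redis->brPop(array('network_jobs'), 0);
    $response = makeNetworkCall(json_decode($payload, true));
    processNetworkResponse($response);
}
?>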




cURL is going to be your only real choice here (either that, or using non-blocking sockets and some custom logic).

This link should send you in the right direction. There is no asynchronous processing in PHP, but if you're trying to make multiple simultaneous web requests, cURL multi will take care of that for you.
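
A minimal sketch of the curl_multi idea: the endpoint URL is a hypothetical stand-in for the 3rd-party API, and $data1..$data3 are the question's own variables, assumed here to be arrays.

<?php
$endpoint = 'https://api.example.com/endpoint'; // hypothetical 3rd-party URL
$payloads = array($data1, $data2, $data3);

$mh = curl_multi_init();
$handles = array();
foreach ($payloads as $i => $payload) {
    $ch = curl_init($endpoint);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($payload));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $handles[$i] = $ch;
    curl_multi_add_handle($mh, $ch);
}

// Drive all transfers at once; the three ~5-second calls overlap instead of
// running back to back.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for socket activity instead of busy-looping
} while ($running > 0);

$responses = array();
foreach ($handles as $i => $ch) {
    $responses[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>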




I think if the HTML and other UI stuff needs the returned data, then there is not going to be a way to make it async.

I believe the only way to do this in PHP would be to log a request in a database and have a cron job check it every minute, or use something like Gearman queue processing, or maybe exec() a command-line process.

In the meantime your PHP page would have to generate some HTML or JS that makes it reload every few seconds to check on progress, which is not ideal.

To sidestep the issue, how many different requests are you expecting? Could you download them all automatically every hour or so and save to a database?
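
A rough sketch of the exec() variant mentioned above: worker.php, the payload format, and where the result is stored are assumptions for illustration, and this relies on shell backgrounding, so it only works on Unix-like hosts.

<?php
// In the web request: launch a detached CLI process and return immediately.
// Redirecting output and appending '&' means PHP does not wait for it.
$payload = escapeshellarg(json_encode($data1));
exec("php /path/to/worker.php {$payload} > /dev/null 2>&1 &");

// worker.php would then do the slow part on its own, e.g.:
//   $data     = json_decode($argv[1], true);
//   $response = makeNetworkCall($data);
//   ...and store $response in a DB/cache/file for the page to poll later.
?>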




I don't have a direct answer, but you might want to look into these things:




There is also the http v2 extension, which is a wrapper for cURL. It can be installed via PECL.

http://devel-m6w6.rhcloud.com/mdref/http/

I think some code about the cURL solution is needed here, so I will share mine (it was written by mixing several sources, such as the PHP Manual and its comments).

It makes some parallel HTTP requests (to the URLs in $aURLs) and prints each response as soon as it is completed (responses are also stored in $done for other possible uses).

The code is longer than needed because of the real-time printing part and the excess of comments, but feel free to edit the answer to improve it:

<?php
/* Strategies to avoid output buffering, ignore the block if you don't want to print the responses before every cURL is completed */
ini_set('output_buffering', 'off'); // Turn off output buffering
ini_set('zlib.output_compression', false); // Turn off PHP output compression       
//Flush (send) the output buffer and turn off output buffering
while (@ob_end_flush());        // flush and close all open output buffers
if (function_exists('apache_setenv')) apache_setenv('no-gzip', true); //prevent apache from buffering it for deflate/gzip
header("Content-type: text/plain"); //Remove to use HTML
ini_set('implicit_flush', true); // Implicitly flush the buffer(s)
ob_implicit_flush(true);
header('Cache-Control: no-cache'); // recommended to prevent caching of event data.
output(str_repeat(' ', 1000)); //Safari and Internet Explorer have an internal 1K buffer.
//Here starts the program output

function output($string){
    ob_start();
    echo $string;
    if(ob_get_level()>0) ob_flush();
    ob_end_clean();  // clears buffer and closes buffering
    flush();
}

function multiprint($aCurlHandles,$print=true){
    global $done;
    // iterate through the handles and get your content
    foreach($aCurlHandles as $url=>$ch){
        if(!isset($done[$url])){ //only check for unready responses
            $html = curl_multi_getcontent($ch); //get the content           
            if($html){
                $done[$url]=$html;
                if($print) output("$html".PHP_EOL);
            }           
        }
    }
};

function full_curl_multi_exec($mh, &$still_running) {
    do {
      $rv = curl_multi_exec($mh, $still_running); //execute the handles 
    } while ($rv == CURLM_CALL_MULTI_PERFORM); //CURLM_CALL_MULTI_PERFORM means you should call curl_multi_exec() again because there is still data available for processing
    return $rv;
} 

set_time_limit(60); //Max execution time 1 minute

$aURLs = array("http://domain/script1.php","http://domain/script2.php");  // array of URLs

$done=array();  //Responses of each URL

    //Initialization
    $aCurlHandles = array(); // create an array for the individual curl handles
    $mh = curl_multi_init(); // init the curl Multi and returns a new cURL multi handle
    foreach ($aURLs as $id=>$url) { //add the handles for each url        
        $ch = curl_init(); // init curl, and then setup your options
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER,1); // returns the result - very important
        curl_setopt($ch, CURLOPT_HEADER, 0); // no headers in the output
        $aCurlHandles[$url] = $ch;
        curl_multi_add_handle($mh,$ch);
    }

    //Process
    $active = null; //the number of individual handles it is currently working with
    $mrc=full_curl_multi_exec($mh, $active); 
    //As long as there are active connections and everything looks OK…
    while($active && $mrc == CURLM_OK) { //CURLM_OK means the multi handle reported no error; data may still be pending
        // Wait for activity on any curl-connection and if the network socket has some data…
        if(curl_multi_select($mh,1) != -1) { //if waiting for activity on any curl_multi connection has no failures (1 second timeout)
            usleep(500); //Adjust this wait to your needs               
            //Process the data for as long as the system tells us to keep getting it
            $mrc=full_curl_multi_exec($mh, $active);        
            //output("Still active processes: $active".PHP_EOL);        
            //Printing each response once it is ready
            multiprint($aCurlHandles);  
        }
    }

    //Printing all the responses at the end
    //multiprint($aCurlHandles,false);      

    //Finalize
    foreach ($aCurlHandles as $url=>$ch) {
        curl_multi_remove_handle($mh, $ch); // remove the handle (assuming  you are done with it);
    }
    curl_multi_close($mh); // close the curl multi handler
?>