If you put your exec() call in a separate script, you can call that external script multiple times in parallel using curl_multi_exec(). That way, all the calls are made in separate requests, so they can execute simultaneously. Poll the $still_running reference argument to see when all the requests have finished, after which you can collect the results from each.
Update: Here's a blog post detailing exactly what I'm describing.
Example
Based on the blog post linked above, I put together the following example.
Script being run in parallel:
<?php
// waitAndDate.php
// (the comment must come after the open tag, or it would be sent as literal output)
sleep((int)$_GET['time']);
printf('%d secs; %s', $_GET['time'], shell_exec('date'));
Script making calls in parallel:
<?php
// multiExec.php
$start = microtime(true);
$mh = curl_multi_init();
$handles = array();
// create several requests
for ($i = 0; $i < 5; $i++) {
    $ch = curl_init();
    $rand = rand(5, 25); // just making up data to pass to the script
    curl_setopt($ch, CURLOPT_URL, "http://domain/waitAndDate.php?time=$rand");
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
// execute requests and poll periodically until all have completed
$isRunning = null;
do {
    curl_multi_exec($mh, $isRunning);
    usleep(250000);
} while ($isRunning > 0);
// fetch output of each request
$outputs = array();
for ($i = 0; $i < count($handles); $i++) {
    $outputs[$i] = trim(curl_multi_getcontent($handles[$i]));
    curl_multi_remove_handle($mh, $handles[$i]);
}
curl_multi_close($mh);
print_r($outputs);
printf("Elapsed time: %.2f seconds\n", microtime(true) - $start);
Here is some output I received when running it a few times:
Array
(
[0] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
[1] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
[2] => 18 secs; Mon Apr 2 19:01:43 UTC 2012
[3] => 11 secs; Mon Apr 2 19:01:36 UTC 2012
[4] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
)
Elapsed time: 18.36 seconds
Array
(
[0] => 22 secs; Mon Apr 2 19:02:33 UTC 2012
[1] => 9 secs; Mon Apr 2 19:02:20 UTC 2012
[2] => 8 secs; Mon Apr 2 19:02:19 UTC 2012
[3] => 11 secs; Mon Apr 2 19:02:22 UTC 2012
[4] => 7 secs; Mon Apr 2 19:02:18 UTC 2012
)
Elapsed time: 22.37 seconds
Array
(
[0] => 5 secs; Mon Apr 2 19:02:40 UTC 2012
[1] => 18 secs; Mon Apr 2 19:02:53 UTC 2012
[2] => 7 secs; Mon Apr 2 19:02:42 UTC 2012
[3] => 9 secs; Mon Apr 2 19:02:44 UTC 2012
[4] => 9 secs; Mon Apr 2 19:02:44 UTC 2012
)
Elapsed time: 18.35 seconds
Hope that helps!
One side note: make sure your web server can process this many parallel requests. If it serves them sequentially or can only serve very few simultaneously, this approach gains you little or nothing. :-)
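Another side note, purely optional: the fixed usleep(250000) in the polling loop above works, but it wastes up to a quarter second per iteration. curl_multi_select() blocks until there is activity on one of the handles (or a timeout elapses), so the loop wakes up as soon as a response arrives. Here's a sketch of a drop-in replacement for the polling loop — untested against your setup, and it assumes the same $mh and $isRunning variables from the script above:

```php
// Replacement for the do/while polling loop in multiExec.php:
// block on socket activity instead of sleeping a fixed interval.
do {
    curl_multi_exec($mh, $isRunning);
    if ($isRunning > 0) {
        // Wait up to 1 second for activity on any handle.
        // curl_multi_select() can return -1 on a select failure,
        // in which case a short sleep avoids busy-spinning.
        if (curl_multi_select($mh, 1.0) === -1) {
            usleep(100000);
        }
    }
} while ($isRunning > 0);
```

The overall elapsed time should still be roughly the duration of the slowest request, but each finished response is picked up almost immediately instead of on the next quarter-second tick.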