I'm testing a PHP script that creates 48 cURL multi handles, each holding about 1,500 requests, all executed in parallel. To verify that every request is actually sent, I log the requests on my server. Here is dump.php, which my script makes requests to; it logs each request to a file:
<?php
ob_start();
echo $_SERVER["REQUEST_URI"];
$dump = ob_get_clean();
$log = "dump.txt";
$fh = fopen($log, 'a') or die("can't open file");
fwrite($fh, $dump . "\n");
fclose($fh);
echo "Dump Completed";
?>
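For context, each of the 48 batches on the client side is driven by curl_multi, roughly like this stripped-down sketch (the URL, host, and batch size here are placeholders, not my real values):

```php
<?php
// Rough sketch of one batch (of the 48). The real script uses
// ~1500 URLs per multi handle; 5 here keeps the demo fast.
$urls = [];
for ($i = 0; $i < 5; $i++) {
    $urls[] = "http://localhost/dump.php?id=$i"; // placeholder host
}

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every request has completed.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```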
However, there is a problem: my quad-core PC fires these requests at a rate of about 1,500 every 10 seconds, which most likely overloads my server as it tries to open the log file concurrently. I've put a delay of 0.1 seconds between requests, but I think the server still cannot open the file, write the line, and close it before the next request arrives. As a result, my script reports that about 72k requests were sent, but my server logged only about 14k received. Either my script is overloading my PC (or exhausting the port system) and some requests are never actually sent, or my server isn't fast enough to handle 150 requests a second.
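One change I could make on the server side is to replace the fopen/fwrite/fclose sequence with a single atomic append using file_put_contents with FILE_APPEND | LOCK_EX, so concurrent requests can't interleave or drop lines. A minimal sketch (the loop just simulates many requests sequentially, and the file name is a placeholder for this demo):

```php
<?php
// Sketch: atomic append with an exclusive lock. Under real concurrency,
// LOCK_EX serializes the writers so no line is lost or interleaved.
$log = "dump_test.txt"; // placeholder file name for this demo
@unlink($log);

for ($i = 0; $i < 1000; $i++) {
    // FILE_APPEND + LOCK_EX makes each write atomic with respect
    // to any other process appending to the same file.
    file_put_contents($log, "/dump.php?id=$i\n", FILE_APPEND | LOCK_EX);
}

// Count the logged lines to compare against the number sent.
$lines = count(file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
echo $lines, "\n"; // should print 1000
```

Comparing that line count against the client's sent counter would at least tell me how many requests actually arrived, even if it doesn't explain the gap.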
Is there a reliable way to verify that all my requests were actually sent? When my client runs the script, it is imperative that at least 99.5% of the requests go out, and his server is powerful enough to handle that load without a problem. I can't test on his servers for security reasons, and mine isn't powerful enough to simulate the real working conditions.