I'm attempting to kick off an "async" PHP job for a long-running script via a curl request that times out almost immediately. I'm aware that better solutions exist on other stacks, but this seems to be a common suggestion when vanilla PHP is all that's available.
$c = curl_init();
curl_setopt($c, CURLOPT_URL, $url);
curl_setopt($c, CURLOPT_FOLLOWLOCATION, true); // Follow redirects (needed for mod_rewrite)
curl_setopt($c, CURLOPT_HEADER, false);        // Don't retrieve headers
curl_setopt($c, CURLOPT_NOBODY, true);         // Don't retrieve the body
curl_setopt($c, CURLOPT_RETURNTRANSFER, true); // Return from curl_exec rather than echoing
// Time out almost immediately, so the call becomes "async".
curl_setopt($c, CURLOPT_TIMEOUT_MS, 1);
return curl_exec($c);
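One thing worth noting: CURLOPT_TIMEOUT_MS budgets the entire request, including DNS resolution and the TCP handshake, so a 1 ms budget can expire before the request ever reaches the server. A sketch of an alternative setup that gives the connection phase its own budget and then aborts the transfer quickly (the 1000 ms and 50 ms values are guesses to tune, not known-good numbers; CURLOPT_CONNECTTIMEOUT_MS and CURLOPT_NOSIGNAL are standard cURL options):

```php
<?php
$c = curl_init($url);
curl_setopt($c, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt($c, CURLOPT_NOSIGNAL, true);           // needed for sub-second timeouts on many builds
curl_setopt($c, CURLOPT_CONNECTTIMEOUT_MS, 1000);  // let the TCP handshake finish
curl_setopt($c, CURLOPT_TIMEOUT_MS, 50);           // then give up quickly
curl_exec($c);
curl_close($c);
```

Without CURLOPT_NOSIGNAL, some libcurl builds round sub-second timeouts up to one second, which can mask the behavior being tested here.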
In the target $url, I've defined a script that is supposed to ignore a user abort and then do its own thing.
ob_start();
ignore_user_abort(true);
set_time_limit(0);
echo "OK";
ob_flush();
flush();
// wait 5s to help debug
sleep(5);
// processing stuff
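A related workaround I've seen suggested is to have the target script finish the HTTP response itself, so the caller can disconnect with an ordinary timeout. A sketch, assuming the fallback branch works on the server in question (fastcgi_finish_request() only exists under PHP-FPM; the Connection/Content-Length header approach is best-effort and depends on the SAPI):

```php
<?php
ignore_user_abort(true);
set_time_limit(0);

if (function_exists('fastcgi_finish_request')) {
    echo "OK";
    fastcgi_finish_request(); // response is complete; the client can disconnect
} else {
    ob_start();
    echo "OK";
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    flush();
}

sleep(5); // long-running work continues after the client is gone
// processing stuff
```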
So with CURLOPT_TIMEOUT_MS set to 1 ms, the script never runs. With it set to 100 ms, it still never runs. With it set to 500 ms, it runs with the expected behavior (the parent script gives up and returns after ~500 ms, and the expected logic of my script executes after the 5 s sleep).
Am I doing something wrong, or is the simple answer that even ignore_user_abort() at the top of the script needs a certain amount of time before it takes effect?
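To check whether the request ever reaches the server at all, the timing can be measured on the caller's side. A sketch using cURL's standard transfer statistics (CURLINFO_CONNECT_TIME and CURLINFO_STARTTRANSFER_TIME report seconds as floats):

```php
<?php
$c = curl_init($url);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt($c, CURLOPT_NOSIGNAL, true);
curl_setopt($c, CURLOPT_TIMEOUT_MS, 500);
curl_exec($c);
printf("connect: %.1f ms, first byte: %.1f ms\n",
    curl_getinfo($c, CURLINFO_CONNECT_TIME) * 1000,
    curl_getinfo($c, CURLINFO_STARTTRANSFER_TIME) * 1000);
curl_close($c);
```

If the connect time alone already exceeds the CURLOPT_TIMEOUT_MS budget, curl aborts before the target script is ever invoked, which would explain why ignore_user_abort() never gets a chance to run.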