We have a PHP (version 5.3.10) CLI application doing some heavy work on an Ubuntu 12.04 64-bit machine. The script can run for a long time depending on the dataset it receives. These datasets are zip files containing a lot of XML, image, and MS doc files.
Earlier, this script used a few system commands (shell, Perl, Java) to complete its task, and we had no problems then. Recently we reworked it: we added RabbitMQ for multiple concurrent invocations, moved from cron-based scheduling to supervisord for automatic recovery and monitoring, and switched to PHP's core libraries and functions as much as possible to avoid shell invocations.
Now, after deploying to production, we found that the script fatally crashes on the line where ZipArchive is used to create an archive, specifically only in its open() and addFile() methods. We tested this many times with the problematic dataset and confirmed that this is where the real problem lies.
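For context, the archiving step is essentially the following (a minimal sketch; the paths are placeholders, and the real script adds thousands of XML, image and MS doc files from the unpacked dataset):

```php
<?php
// Minimal sketch of the step that crashes. Paths are placeholders.
$archivePath = sys_get_temp_dir() . '/dataset-sketch.zip';
@unlink($archivePath);

$zip = new ZipArchive();
// open() is one of the two methods on which the fatal error appears
if ($zip->open($archivePath, ZipArchive::CREATE) !== true) {
    die("Cannot create $archivePath\n");
}

// stand-in for one of the dataset's files
$sample = sys_get_temp_dir() . '/sample.xml';
file_put_contents($sample, '<root/>');

// addFile() only queues the entry; the actual compression and disk I/O
// happen inside close(), so a large dataset makes this section slow
$zip->addFile($sample, 'sample.xml');
$zip->close();

echo file_exists($archivePath) ? "archive written\n" : "failed\n";
```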
The error thrown was "Fatal Error: Maximum execution time of 300 seconds exceeded". We know about PHP's limit on execution time, so we double-checked php.ini and all the settings under the "/etc/php5/conf.d" folder; everywhere, "max_execution_time" is set to 0. We also confirmed with php_sapi_name() that the script's SAPI is "cli", and ini_get("max_execution_time") also returns 0.
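For reference, those two checks boil down to the following standard calls (under the CLI SAPI, which forces the execution limit to 0 at startup, they print "cli" and "0" on our setup):

```php
<?php
// Both checks described above.
echo php_sapi_name(), "\n";               // SAPI in use; "cli" here
echo ini_get('max_execution_time'), "\n"; // "0" means no limit
```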
Even when the script is managed by supervisord, the SAPI and execution limit are the same. We could not find where this "max_execution_time" limit of 300 seconds is coming from.
One more thing: the script had actually run for more than 600 seconds when it crashed with this message. Our impression is that the error occurs only when ZipArchive itself takes more than 300 seconds, but we are not sure. The partial zip archive it creates when this happens is between 280 MB and 290 MB. We also downloaded the PHP source from its repository and did a quick grep to see whether ZipArchive's code base has any such limit; we found none.
As a workaround, we are now replacing the ZipArchive code with a shell command. We have yet to test it; I will post our findings here soon.
Have any of you faced such issues before? Is this something related to ZipArchive? Is ZipArchive recommended for creating huge archives?