I'm creating some flat files from my MySQL database in a PHP job. Each file is 225 KB, and I create 40 files.
Basically, I'm running a PHP script which calls a query in a loop. The first iteration uses LIMIT 0, 1800; the loop runs 40 times, and the last query uses LIMIT 72000, 1800.
In the loop I sleep for 0.7 seconds. The whole process takes 45 seconds.
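For reference, the loop looks roughly like this (a sketch only; the table, column, connection details, and file contents here are placeholders, not my real code):

```php
<?php
// Sketch of the current approach. Connection details and the
// pages(url) table are made up for illustration.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$batch = 1800;
for ($offset = 0; $offset <= 72000; $offset += $batch) {
    $start = microtime(true);

    // The offset grows each pass: LIMIT 0,1800 ... LIMIT 72000,1800
    $result = $db->query(
        "SELECT url FROM pages ORDER BY id LIMIT $offset, $batch"
    );

    // Write this batch out as one ~225 KB sitemap file.
    $fh = fopen("../sitemap-$offset.xml", 'w');
    while ($row = $result->fetch_assoc()) {
        fwrite($fh, $row['url'] . "\n");
    }
    fclose($fh);

    printf("Query took %.2f second for ../sitemap-%d.xml done!\n",
           microtime(true) - $start, $offset);

    usleep(700000); // sleep 0.7 seconds between batches
}
```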
Here's some debug info I produced.
Query took 0.03 second for ../sitemap-0.xml done!
Query took 0.06 second for ../sitemap-1800.xml done!
..snip ..
Query took 0.9 second for ../sitemap-70200.xml done!
Query took 0.9 second for ../sitemap-72000.xml done!
Took: 44.7057
You'll notice the queries take longer and longer as the job runs. I assume this is because MySQL has to scan past all the rows before the offset to find the starting position for LIMIT, so the further in the job gets, the more rows each query has to step over. As a result, the CPU usage increases with each query.
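To illustrate what I mean (the query itself is a stand-in, not my actual one):

```sql
-- Cheap: MySQL reads 1800 rows and stops.
SELECT url FROM pages ORDER BY id LIMIT 0, 1800;

-- Expensive: MySQL generates 73800 rows and discards
-- the first 72000 before returning the last 1800.
SELECT url FROM pages ORDER BY id LIMIT 72000, 1800;
```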
The maximum cpu times limit on the server is 60 seconds.
My prime consideration is to keep CPU time and database query time as low as possible; the rising values toward the end of the job are unacceptable. Running essentially the same query over and over also seems like a waste of resources.
Is there a better way to do this?