I've inherited a project that uses a single file, DBOperations, for all interactions with our database. It's a huge mess of static function calls for reads/writes: about 24,000 lines and maybe 1,000 functions.
This file is loaded with require_once on most pages, and merely including it adds about 300ms of load time. I need to sort this out and I'm looking for advice on how to approach it.
We don't have the resources to rewrite everything, so I want to reorganise it instead.
I currently plan to split my big DBOperations file into many smaller ones. Each database table will get its own DBOps file, e.g. my 'users' table will have a corresponding UserDBOps.php whose class extends a base class with some generic CRUD functions. I'll move all functions for the users table from DBOperations into UserDBOps. To avoid breaking any existing code, I'll replace the body of each function in DBOperations with a call to the corresponding function in UserDBOps. Finally I'll introduce autoloading.
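Roughly what I have in mind for the delegation step (a sketch only — `UserDBOps`, `BaseDBOps`, and `getUserById` are illustrative names, not my real functions):

```php
<?php
// UserDBOps.php — new per-table class.
// BaseDBOps is the planned parent holding the generic CRUD helpers.
class UserDBOps extends BaseDBOps
{
    public static function getUserById(int $id): ?array
    {
        // ...original query code moved here unchanged...
    }
}

// DBOperations.php — the old god class.
// Each moved function's body becomes a one-line delegation, so every
// existing caller keeps working while call sites migrate gradually.
class DBOperations
{
    public static function getUserById(int $id): ?array
    {
        return UserDBOps::getUserById($id);
    }
}
```

The idea is that DBOperations shrinks to a thin forwarding layer that can eventually be deleted once no page calls it any more.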
So in summary my current plan is:
- Split DBOperations into DBOps files for each database table.
- Move all functions for a table into its DBOps file. Update the DBOperations functions to delegate to the new DBOps functions (so existing code keeps working).
- Slowly refactor all calls to DBOperations to the new DBOps files. Eventually I'll be able to get rid of DBOperations altogether.
- Introduce autoloading.
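For the autoloading step, I'm thinking of something like this minimal `spl_autoload_register` setup (a sketch — the `dbops/` directory and one-class-per-file naming are assumptions about my eventual layout; with Composer I'd use a PSR-4 mapping in composer.json instead):

```php
<?php
// bootstrap.php — replaces the blanket require_once of DBOperations.
// Each class file is only loaded the first time its class is used,
// so pages no longer pay for the entire 24,000-line file up front.
spl_autoload_register(function (string $class): void {
    $file = __DIR__ . '/dbops/' . $class . '.php'; // e.g. dbops/UserDBOps.php
    if (is_file($file)) {
        require_once $file;
    }
});
```

That should address the 300ms include cost directly, since a page touching only the users table would load just UserDBOps and its base class.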
Would you make any changes to my plan?