I have recently come across a situation involving a query that joins two tables with about 16k records each.
Executing the raw SQL brings the response time down to a few seconds, but if I stick to the ORM queries, the same request ends in an nginx timeout because the query takes more than a minute to process.
Has anyone faced a similar scenario? I am aware that heavy queries can be run using Doctrine.
The implementation looks like this. A more detailed explanation of the architecture in use: https://stackoverflow.com/a/19771835/540771
The Laravel app's default connection points to DB 1, which stores the connection details for DB 2. Based on those details, the app switches to the appropriate database at runtime.
DB 1:
connections table ( id, db_name, db_username, db_password, hostname, port)
DB 2:
customers ( customer_id, name )
customer_details ( customer_id, email, etc ... )
The query executes on DB 2 and looks like:
SELECT *
FROM customers as c LEFT JOIN customer_details as cd
ON c.customer_id=cd.customer_id
ORDER BY c.created_at DESC
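In case it is relevant, the join and sort columns could be indexed as below — this is a sketch, assuming they are not indexed already; the index names are placeholders of my own:

```sql
-- Speeds up the LEFT JOIN lookup on customer_details
CREATE INDEX idx_customer_details_customer_id
    ON customer_details (customer_id);

-- Speeds up the ORDER BY on customers
CREATE INDEX idx_customers_created_at
    ON customers (created_at);
```

Running EXPLAIN on the query before and after would show whether the join and sort actually use these indexes.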
Both customers and customer_details hold about 16k rows each. The relationship is 1..*, but business rules constrain it so that each customer record maps to exactly one customer_details record.
The result is fetched into a DataTable. Pagination is yet to be added to limit the number of records returned.
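A paginated version of the query above might look like this (the page size of 100 is an arbitrary example):

```sql
SELECT *
FROM customers AS c
LEFT JOIN customer_details AS cd
    ON c.customer_id = cd.customer_id
ORDER BY c.created_at DESC
LIMIT 100 OFFSET 0;  -- page 1; page N would use OFFSET (N - 1) * 100
```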