I'm quite inexperienced in this area, so this may seem like a dumb question, but please bear with me.
I have a MySQL database and a query, run via PHP PDO, that looks like this:
SELECT p.*, GROUP_CONCAT(DISTINCT c.color) AS color, b.breeder_puppy_thumb, b.breeder_name
FROM puppy_list p
JOIN puppies_join_colors pjc ON p.ID = pjc.puppy_ID
JOIN puppy_colors c ON c.ID = pjc.color_ID
JOIN breeder_list b ON b.ID = p.puppy_breeder_id
WHERE p.puppy_breed = :breedName
GROUP BY p.ID
ORDER BY p.price DESC
The ORDER BY clause can vary (the results can be ordered by a different field).
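Since the varying ORDER BY column can't be sent as a PDO bound parameter, it has to be validated against a whitelist before being interpolated into the SQL. The check itself is language-agnostic; here is a minimal sketch in JavaScript, with assumed column names:

```javascript
// Allowed sort columns -- assumed names, adjust to the real schema.
const SORTABLE = new Set(['price', 'puppy_breed', 'ID']);

function orderByClause(field, dir) {
  // Fall back to a safe default if the requested column isn't whitelisted,
  // so arbitrary user input never reaches the SQL string.
  const col = SORTABLE.has(field) ? field : 'price';
  const direction = dir === 'ASC' ? 'ASC' : 'DESC';
  return `ORDER BY p.${col} ${direction}`;
}
```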
I send this query via AJAX to a PHP file, which forwards it to the DB server, so the round trip is four hops.
On my website, pagination is done on the fly: all the data is held in memory as one big JSON object, and the processing (pagination included) happens on that object in the front-end framework (AngularJS based), without querying the DB again.
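The client-side pagination boils down to slicing the in-memory array per page. Roughly like this, as a plain JavaScript sketch (the real version would live in an AngularJS filter or controller):

```javascript
// rows: the full result set already held in memory as a JSON array.
// Returns the slice of rows for a 1-based page number.
function getPage(rows, page, pageSize) {
  const start = (page - 1) * pageSize;
  return rows.slice(start, start + pageSize);
}
```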
My first question: given a query like this one, with the rows returned numbering in the thousands (even tens of thousands), what's a safe limit to put on a single query? How many rows should I return so that I don't overwhelm the client?
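For a rough sense of scale, the transfer cost is approximately rows × average serialized row size. A back-of-envelope sketch (the per-row byte figure is an assumption, not measured; `JSON.stringify(sampleRow).length` on real data would give the actual number):

```javascript
// Rough JSON payload estimate. avgRowBytes is an assumed figure --
// measure it on a real row before trusting the result.
function estimatePayloadMB(rowCount, avgRowBytes) {
  return (rowCount * avgRowBytes) / (1024 * 1024);
}
```

For example, 10,000 rows at roughly 1 KB each come to just under 10 MB of JSON before gzip.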
My second question: should I move part of the processing from Angular to the back-end framework, or keep it as it is?
If I didn't formulate the question well enough, please let me know and I'll try to address that.