Assume that I have a big (MySQL) table (>10k rows) mapping id -> string. I can load all rows into an array and cache that array. But the question is: how do I cache it efficiently?
a) Cache it as one big item, i.e. serialize the whole array and store it under a single key.
Quite short and easy. But for every entry I need, I have to fetch and unserialize the whole thing. Absolutely inefficient.
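For illustration, option (a) might look like the following sketch. The `CacheStub` class is a hypothetical in-memory stand-in for the two Redis commands used, so the example runs without a server; with phpredis you would use a real `\Redis` instance the same way:

```php
<?php
// Hypothetical in-memory stand-in for Redis SET/GET, so this sketch
// runs without a server; a phpredis \Redis instance behaves the same.
class CacheStub {
    private array $data = [];
    public function set(string $key, string $val): bool { $this->data[$key] = $val; return true; }
    public function get(string $key) { return $this->data[$key] ?? false; }
}

$redis = new CacheStub();
$array = [1 => 'foo', 2 => 'bar', 3 => 'baz']; // stands in for the 10k-row table

// Option (a): the whole array as one serialized blob under a single key.
$redis->set('array', serialize($array));

// Reading ONE entry still means fetching and unserializing EVERYTHING:
$whole = unserialize($redis->get('array'));
$str = $whole[2];
```

The write is one command, but every read transfers and deserializes the full payload, which is where the inefficiency comes from.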
b) Cache each entry by itself:
foreach( $array as $id => $str ) $redis->set( "array:$id", $str );
This way, I will have >10k keys in Redis. That doesn't feel good. If I have 10 of these tables, I will have 100k keys...
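Option (b) pays off on reads, since each lookup fetches only the entry it needs. A runnable sketch, again with a hypothetical in-memory stub in place of a real Redis connection:

```php
<?php
// Hypothetical in-memory stand-in for Redis SET/GET (no server needed).
class CacheStub {
    private array $data = [];
    public function set(string $key, string $val): bool { $this->data[$key] = $val; return true; }
    public function get(string $key) { return $this->data[$key] ?? false; }
}

$redis = new CacheStub();
$array = [1 => 'foo', 2 => 'bar', 3 => 'baz'];

// Option (b): one Redis key per row.
foreach ($array as $id => $str) {
    $redis->set("array:$id", $str);
}

// A single entry is now a single cheap GET...
$one = $redis->get('array:2');
// ...but the keyspace grows with the table: count($array) keys per table.
```

So the trade-off is per-read cost (option a) versus keyspace size (option b).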
So what's your proposal? How to cache a big array?