I have an application which routes requests depending on predefined URLs, in the following manner:

1st step: Cache lookup

All the URLs are aggregated into a single array (which is serialized and stored in cache) whose structure is as follows:

- each array key represents a URL;
- the URL information (i.e. what to do with the URL) is defined as the corresponding array value.

In practice, this gives me (FYI, PHP 5.4 array syntax):
<?php
// structure of the cached URLs array
$cached_urls = [
    '/pageA' => [
        'controller' => 'ProductController',
        'content_id' => 1234
    ],
    '/serviceA' => [
        'controller' => 'ServiceController',
        'content_id' => 45678
    ]
];

// working with the array (retrieve $cached_urls from cache, then...)
if (!isset($cached_urls[$request['url']])) {
    // 404
} else {
    $url = $cached_urls[$request['url']];
    // further actions based on $url
}
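For completeness, the cache retrieval could look roughly like this. I'm using a file-backed cache here purely for illustration (the actual backend — APCu, Memcached, etc. — doesn't matter for the question), and `cache_fetch()` is a made-up name:

```php
<?php
// Hypothetical file-backed cache; returns the URL array on a hit, false on a miss.
function cache_fetch($file)
{
    if (!is_file($file)) {
        return false; // cache miss
    }
    // @ suppresses the notice unserialize() emits on corrupt data
    $data = @unserialize(file_get_contents($file));
    return is_array($data) ? $data : false;
}

$cached_urls = cache_fetch('/tmp/cached_urls.ser');
if ($cached_urls === false) {
    // 2nd step: DB lookup & cache rebuilding
} elseif (!isset($cached_urls[$request['url']])) {
    // 404
} else {
    $url = $cached_urls[$request['url']];
    // further actions based on $url
}
```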
2nd step: DB lookup & cache rebuilding

If the $cached_urls array could not be retrieved from cache, I do 3 things:

- retrieve the URL information from a url table (where each row represents 1 URL, and the URL itself is the field used to filter the query) and other related tables;
- process the request the same way I would in the cache scenario once I have $url;
- rebuild the cache so that next time a request comes in, I don't have to do a DB lookup.
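Sketched out, the DB fallback and the rebuild look roughly like this (the single url table with url / controller / content_id columns is an assumption based on my description above; in reality the rebuild also joins other related tables, which is why it is so slow):

```php
<?php
// Fetch the routing info for one URL; returns the row on a hit, false otherwise.
function lookup_url(PDO $db, $path)
{
    $stmt = $db->prepare('SELECT controller, content_id FROM url WHERE url = :url');
    $stmt->execute(['url' => $path]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row ?: false;
}

// Rebuild the whole $cached_urls array from the DB and serialize it to cache.
function rebuild_cache(PDO $db, $file)
{
    $urls = [];
    foreach ($db->query('SELECT url, controller, content_id FROM url') as $row) {
        $urls[$row['url']] = [
            'controller' => $row['controller'],
            'content_id' => (int) $row['content_id'],
        ];
    }
    file_put_contents($file, serialize($urls)); // the slow (~2000ms) part
    return $urls;
}
```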
In terms of execution speed, here is what we have (from fastest to slowest):

- Cache lookup (~10ms)
- DB lookup (~100ms)
- Rebuilding the cache (~2000ms)
When the cache is available, pages are served very fast; however, every time the cache gets rebuilt (which is about every minute), pages take a couple of seconds to be served, which is a problem. Therefore I was wondering:

What design patterns are available in PHP to perform asynchronous processing (which in my case would be used to rebuild the cache) while preventing the same task from being executed several times concurrently? (I only need the cache to be rebuilt once until it is done, not once for every request that hits my application in the meantime.)
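To make the "only once" requirement concrete, here is roughly the kind of guard I have in mind: a non-blocking flock() so that only the first request triggers the rebuild and concurrent requests skip it. (The function name and lock-file path are placeholders, and I'm asking whether there is a better-established pattern than this.)

```php
<?php
// Run $rebuild only if no other request is already running it.
// Returns true if this request did the rebuild, false if it was skipped.
function try_rebuild($lockFile, $rebuild)
{
    $fp = fopen($lockFile, 'c'); // 'c': create if missing, don't truncate
    if (!flock($fp, LOCK_EX | LOCK_NB)) {
        fclose($fp);
        return false; // another request holds the lock: skip the rebuild
    }
    $rebuild(); // the ~2000ms rebuild runs exactly once
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}
```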