I have written a library that persists some data (indexed by process ID)
across multiple requests handled by the same PHP-FPM worker process. Currently I have
pm.max_requests set to 10000, so the data is shared across those 10000 requests
until the process is recycled.
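
For reference, this is the relevant pool setting (the pool file name and path vary by distribution):

```ini
; /etc/php-fpm.d/www.conf (path varies by install)
; Each worker process is recycled after serving this many requests.
pm.max_requests = 10000
```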
When the data is not yet available (initially), the request currently being handled creates it, and subsequent requests reuse it.
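
Roughly, the pattern looks like this (a minimal sketch with hypothetical names; I am assuming an APCu-style cache here for illustration, while my actual library uses its own storage keyed by PID):

```php
<?php
// Minimal sketch of the lazy-init pattern (hypothetical names).
function getSharedData(): array
{
    $key = 'shared_data_' . getmypid();  // one cache entry per FPM worker process

    $data = apcu_fetch($key, $found);
    if (!$found) {                        // cache miss: this request builds the data
        $data = buildExpensiveData();     // hypothetical expensive initialisation
        apcu_store($key, $data);
    }
    return $data;                         // later requests in this process reuse it
}
```

The check-and-store above is not atomic, which is why I am wondering whether two requests handled by the same process could interleave.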
Problem
I see that this data is created multiple times for the same process. So, does a single process handle multiple requests concurrently? Or are requests to one process handled sequentially, like a queue?