2012-02-17 07:34


  • apache
  • json
  • php
  • caching

I'm trying to cache JSON content generated by a PHP script from a database. The dataset is very stable: there are very few changes or additions, meaning the data can go unchanged for weeks. The issue is that it contains a LOB column, and that just takes a noticeable time to load compared to serving the JSON from a text file, meaning it is the actual database call that makes it slow.

I'm displaying the data in a table with pagination (DataTables jQuery plugin), and for each page change the data is fetched from the database again, including when going back to the previous page.

I've tried the following:

"beforeSend": function (request) {
    request.setRequestHeader("cache-control", "max-age=86400");
}

Does not work.

I tried mod_expires:

ExpiresActive On
ExpiresDefault "access plus 4 hours"
ExpiresByType application/javascript "access plus 1 day"
ExpiresByType application/json "access plus 1 day"

Does not work.

Therefore I assume all these settings apply only to real files on the file system and not to dynamically generated content?
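From what I've read, a dynamically generated response can still be cached if the script emits the caching headers itself; something along these lines is what I'd imagine (just a sketch, the `make_etag` helper and its inputs are made up):

```php
<?php
// Hypothetical helper: derive a stable ETag from whatever identifies the
// response -- query/filter expression, requested page, and a cheap data
// version marker (e.g. MAX(updated_at) from the table).
function make_etag($filter, $page, $dataVersion)
{
    return '"' . md5($filter . '|' . $page . '|' . $dataVersion) . '"';
}

$filter = isset($_GET['filter']) ? $_GET['filter'] : '';
$page   = isset($_GET['page'])   ? (int) $_GET['page'] : 0;

$etag = make_etag($filter, $page, '2012-02-17');

header('Cache-Control: max-age=86400'); // let the client reuse it for a day
header('ETag: ' . $etag);

// If the browser already has this version, answer 304 and skip the slow LOB query.
$clientEtag = isset($_SERVER['HTTP_IF_NONE_MATCH']) ? $_SERVER['HTTP_IF_NONE_MATCH'] : '';
if ($clientEtag === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: application/json');
// ... run the expensive query and echo the JSON here ...
```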

I would prefer a configurable approach, meaning Apache/PHP, since I will not have full control over the server.


Note that the JSON contains multiple records, so a key/value store would be somewhat difficult to achieve: the key would have to contain a lot of information, such as the query/filter expression and the requested page for paging.
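If I did go the key/value route, the key would have to encode everything that affects the output; roughly like this (a sketch, the parameter names are made up):

```php
<?php
// Build a single cache key out of everything that changes the JSON:
// filter expression, sort column/direction, page number and page size.
function cache_key($filter, $sortCol, $sortDir, $page, $pageSize)
{
    return md5(serialize(array($filter, $sortCol, $sortDir, $page, $pageSize)));
}
```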


Development and production are on Windows... so memcached is not really an option...
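With memcached out, a plain file cache would probably be my fallback; something like this is what I have in mind (just a sketch, the path and TTL are placeholders):

```php
<?php
// Minimal file cache: store the generated JSON per cache key and serve
// it from disk until it is older than $ttl seconds.
function cached_json($key, $ttl, $generate)
{
    $file = sys_get_temp_dir() . '/json_cache_' . $key;

    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file); // cache hit: skip the DB entirely
    }

    $json = call_user_func($generate);   // slow path: DB query with the LOB column
    file_put_contents($file, $json, LOCK_EX);
    return $json;
}

// Usage: $json = cached_json($someKey, 86400, function () { /* query DB */ return '...'; });
```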


I've tried kristovaher's solution, but it does not work. The cache headers are not in the response all the time, and after some playing around I believe I determined the issue: I'm required to use NTLM authentication, and when doing two requests shortly after each other it works fine. However, if you wait a bit, it seems the user is re-authenticated and then the Cache-Control header is "lost".
