2012-08-23 06:42



I'm monitoring a website (using curl) to see if it's up and to know the response time for a simple GET request on the homepage. I'm using a homemade script to do this, and I don't want to use Nagios or any other existing tool for that.

My problem is that I have no idea how to store the results for a long time period (say, months). I don't need the response time for every single request since epoch, of course, but I'd like to get the status (up|down) and an average response time for large periods. For example:

  1. [last month] -> daily response time (avg)
  2. [last 3 months] -> weekly response time (avg)
  3. [last 12 months] -> monthly response time (avg)

And for the whole duration, I want to keep the status. For that, I guess I will just save the date of the last status change.
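A minimal sketch of the daily-average part of this scheme, using Python's sqlite3 in place of MySQL (the `checks` table and its columns are made up for illustration):

```python
import sqlite3

# Sketch only: sqlite3 stands in for MySQL, and the table/column
# names (checks, ts, up, response_ms) are made up for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE checks (ts TEXT, up INTEGER, response_ms REAL)")

# Two days of raw samples; one check failed (up = 0).
db.executemany("INSERT INTO checks VALUES (?, ?, ?)", [
    ("2012-08-21 06:00:00", 1, 120.0),
    ("2012-08-21 18:00:00", 1, 180.0),
    ("2012-08-22 06:00:00", 0, 0.0),
    ("2012-08-22 18:00:00", 1, 90.0),
])

# Daily average response time, counting only successful checks.
daily = db.execute(
    """SELECT date(ts) AS day, AVG(response_ms)
       FROM checks WHERE up = 1
       GROUP BY day ORDER BY day"""
).fetchall()
# daily -> [('2012-08-21', 150.0), ('2012-08-22', 90.0)]
```

Once a day is aggregated, the raw rows for that day can be deleted, which is what keeps the table from growing without bound.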

So the question is: what is the optimal way to store that kind of information with this frequency/retention? I'm coding in PHP with a MySQL DB behind it, but maybe there's something better than a DB for that?

Any language/tech is accepted as long as it's free and runs on Linux/Unix.

Thank you



  • doudengshen5591 · 9 years ago

    That sounds like a typical round-robin database. RRDtool allows you to manage such a database. I don't see a reason for re-inventing the wheel.
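    The round-robin idea can be sketched in a few lines of Python: a fixed-size archive that overwrites its oldest entry, so storage never grows. Here `collections.deque` stands in for RRDtool's on-disk archives; this only illustrates the concept, not RRDtool's actual API.

```python
from collections import deque

# Sketch of the round-robin idea behind RRDtool: a fixed-size archive
# that silently overwrites its oldest entry, so storage never grows.
daily_archive = deque(maxlen=31)  # room for ~1 month of daily averages

# Simulate 40 days of daily-average measurements (made-up values).
for day in range(1, 41):
    daily_archive.append((f"day-{day:02d}", 100.0 + day))

# Only the newest 31 entries survive; days 1-9 were overwritten.
assert len(daily_archive) == 31
assert daily_archive[0] == ("day-10", 110.0)
assert daily_archive[-1] == ("day-40", 140.0)
```

    RRDtool additionally consolidates data as it ages (e.g. 31 daily points feeding a coarser monthly archive), which maps directly onto the daily/weekly/monthly retention the question asks for.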

  • dongzouqie4220 · 9 years ago

    Save every request to a DB table, and write simple PHP functions to calculate the daily, weekly, and monthly average response times.

    If you are only going to store response times and request dates, your table won't grow fast, and with the correct indexes it will perform well.
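    A sketch of this approach, again with Python's sqlite3 standing in for MySQL (the table, the index name, and the columns are all illustrative):

```python
import sqlite3

# Sketch only: sqlite3 stands in for MySQL; the checks table, the
# index name, and the columns are all illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE checks (ts TEXT, response_ms REAL)")
db.execute("CREATE INDEX idx_checks_ts ON checks (ts)")  # keeps date-range queries fast

db.executemany("INSERT INTO checks VALUES (?, ?)", [
    ("2012-07-15 06:00:00", 100.0),
    ("2012-07-20 06:00:00", 140.0),
    ("2012-08-14 06:00:00", 200.0),
])

# Monthly average: the same table answers daily, weekly, and monthly
# questions; only the grouping expression changes.
monthly = db.execute(
    """SELECT strftime('%Y-%m', ts) AS month, AVG(response_ms)
       FROM checks GROUP BY month ORDER BY month"""
).fetchall()
# monthly -> [('2012-07', 120.0), ('2012-08', 200.0)]
```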
