I'm monitoring a website with curl to check whether it's up and to measure the response time of a simple GET request on the homepage. I'm using a homemade script for this, and I don't want to use Nagios or any other existing tool.
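For context, a single check like the one described can be done with curl's `--write-out` timing variables. This is just a minimal sketch; the function name, log format, and 10-second timeout are my own assumptions, not part of the original script:

```shell
# check_site: probe a URL with curl and append one line per check to a log:
#   <unix timestamp> <http status> <total time in seconds>
check_site() {
  url=$1
  log=$2
  # -s: silent, -o /dev/null: discard the body, -w: print status and timing.
  # --max-time 10 is an assumed threshold: slower than 10s counts as down.
  result=$(curl -s -o /dev/null --max-time 10 \
                -w '%{http_code} %{time_total}' "$url") \
      || result="000 0"
  printf '%s %s\n' "$(date +%s)" "$result" >> "$log"
}

# example usage (paths are illustrative):
# check_site "https://example.com/" "/var/log/site-monitor.log"
```

Run from cron every minute or so, this produces a flat, append-only log that is cheap to write and easy to aggregate later.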
My problem is that I have no idea how to store the results over a long period (say, months). I don't need the response time for every single request since the epoch, of course, but I'd like to keep the status (up|down) and an average response time over large periods. For example:
- [last month] -> daily response time (avg)
- [last 3 months] -> weekly response time (avg)
- [last 12 months] -> monthly response time (avg)
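The first downsampling step of that scheme (raw checks -> daily averages) could look like the sketch below. It assumes raw checks are stored as flat log lines of the form `<unix timestamp> <http status> <response seconds>`; the function name and the UTC day bucketing are my own illustrative choices:

```shell
# daily_averages: collapse raw check lines into one row per UTC day:
#   <day start as unix timestamp> <check count> <average seconds>
daily_averages() {
  awk '{
    # bucket each sample by the UTC midnight of its day
    day = int($1 / 86400) * 86400
    sum[day] += $3
    n[day]++
  }
  END {
    for (day in sum)
      printf "%d %d %.4f\n", day, n[day], sum[day] / n[day]
  }' "$1" | sort -n
}
```

The same pass, with a 7-day or ~30-day bucket, gives the weekly and monthly rollups; once a period's average is written you can delete the raw rows for it, which is essentially what round-robin databases like RRDtool do internally.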
And I want to keep the up/down status for the whole duration. For that, I guess I will just save the date of each status change.
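Recording only the transitions keeps that history tiny regardless of check frequency. A minimal sketch of that idea, with an assumed state-log format of `<unix timestamp> <up|down>` (names and format are mine, not from the post):

```shell
# log_status_change: append a record only when the status flips,
# so the state log holds one line per transition, not per check.
log_status_change() {
  status=$1                                  # "up" or "down"
  state_log=$2
  last=$(tail -n 1 "$state_log" 2>/dev/null | awk '{print $2}')
  if [ "$status" != "$last" ]; then
    printf '%s %s\n' "$(date +%s)" "$status" >> "$state_log"
  fi
}
```

Uptime over any window can then be reconstructed by walking the transitions that fall inside it.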
So the question is: what is the optimal way to store this kind of information, given that frequency and retention? I'm coding in PHP with a MySQL DB behind it, but maybe there's something better than a DB for this?
Any language/technology is acceptable as long as it's free and runs on Linux/Unix.
Thank you