Here is a made-up example for my question:
Every 15 seconds I run a PHP script which checks the status of 5+ servers. It gets the ping, status, version and motd. The data should be stored for two weeks; after that it can be removed.
At the moment I would store it like this:
Table 1:
server_ID | name | ip | last_update
Table 2:
ID | server_ID | status | ping | version | motd | timestamp
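Sketched as MySQL DDL (the column types are just my guesses, not fixed yet), that would be roughly:

```sql
CREATE TABLE table1 (
  server_ID   INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  name        VARCHAR(64)  NOT NULL,
  ip          VARCHAR(45)  NOT NULL,   -- 45 chars also fits IPv6
  last_update TIMESTAMP    NULL
);

CREATE TABLE table2 (
  ID          BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  server_ID   INT UNSIGNED      NOT NULL,
  status      TINYINT           NOT NULL,
  ping        SMALLINT UNSIGNED NULL,
  version     VARCHAR(32)       NULL,
  motd        VARCHAR(255)      NULL,
  `timestamp` TIMESTAMP         NOT NULL DEFAULT CURRENT_TIMESTAMP,
  INDEX idx_server_time (server_ID, `timestamp`),
  FOREIGN KEY (server_ID) REFERENCES table1 (server_ID)
);
```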
I think this way nothing is duplicated, and I can easily rename a server or change its IP.
But how should I handle dropping old rows? Is it too much if I run the delete every time the script runs (every 15 seconds)? 14*24*60*4 = 80640 rows per server for the two weeks.
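The cleanup itself would presumably just be a DELETE against the timestamp column, something like:

```sql
-- remove everything older than two weeks from the history table
DELETE FROM table2
WHERE `timestamp` < NOW() - INTERVAL 14 DAY;
```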
Or how should I store the data so that I'm able to get the latest data and also the data to display a two-week line chart?
Maybe a third table, which has hourly data.
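If I went with that hourly table, the rollup could presumably be an INSERT ... SELECT; the table and column names here are just placeholders:

```sql
-- aggregate the last hour of raw samples into one row per server
INSERT INTO table3 (server_ID, hour_start, avg_ping, samples)
SELECT server_ID,
       DATE_FORMAT(`timestamp`, '%Y-%m-%d %H:00:00') AS hour_start,
       AVG(ping) AS avg_ping,
       COUNT(*)  AS samples
FROM table2
WHERE `timestamp` >= NOW() - INTERVAL 1 HOUR
GROUP BY server_ID, hour_start;
```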
Edit:
Thank you for your answers. What about this:
At the 15-second update I put all the data into Table 1, but as an update, so there is only one row per server.
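That 15-second update would then be an upsert, something like this (assuming Table 1 now also has the status/ping/version/motd columns and server_ID is its primary key):

```sql
INSERT INTO table1 (server_ID, status, ping, version, motd, last_update)
VALUES (?, ?, ?, ?, ?, NOW())
ON DUPLICATE KEY UPDATE
  status      = VALUES(status),
  ping        = VALUES(ping),
  version     = VALUES(version),
  motd        = VALUES(motd),
  last_update = NOW();
```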
Then, every hour, a scheduled job copies the current data into a second table and deletes everything from it that is older than two weeks?
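What I have in mind for the hourly step is something like a MySQL event (this needs the event scheduler enabled, and in the mysql client the DELIMITER has to be changed around the BEGIN ... END block):

```sql
CREATE EVENT archive_and_purge
ON SCHEDULE EVERY 1 HOUR
DO BEGIN
  -- copy the current snapshot of every server into the history table
  INSERT INTO table2 (server_ID, status, ping, version, motd, `timestamp`)
  SELECT server_ID, status, ping, version, motd, NOW()
  FROM table1;

  -- drop history older than two weeks
  DELETE FROM table2
  WHERE `timestamp` < NOW() - INTERVAL 14 DAY;
END;
```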
At the moment I have server_ID as the primary key in Table 1 and ID as the primary key in Table 2. Is there a good reason to use the timestamp as the primary key instead?