I'm monitoring a website with curl, both to check whether it's up and to measure the response time of a simple GET request on the homepage. I'm doing this with a homemade script, and I don't want to use Nagios or any other existing tool for it.
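For context, the probe boils down to something like this (a simplified sketch; the URL and timeout are placeholders for what the real script uses):

```php
<?php
// Minimal probe: one GET of the homepage, recording up/down status
// and the total request time.
$ch = curl_init('https://example.com/');          // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // keep the body out of stdout
curl_setopt($ch, CURLOPT_TIMEOUT, 10);            // treat >10 s as down
$ok = curl_exec($ch) !== false;

$status = ($ok && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200) ? 'up' : 'down';
$rt     = curl_getinfo($ch, CURLINFO_TOTAL_TIME); // seconds, as a float
curl_close($ch);

printf("%s %s %.3f\n", date('c'), $status, $rt);
```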
My problem is that I have no idea how to store the results over a long period (say, months). Of course I don't need the response time of every single request since epoch, but I'd like to keep the status (up|down) and an average response time over large periods: for example, fine-grained averages for the recent past and coarser (say, daily) averages for older history.
And over the whole duration I want to keep the up/down status; for that I guess I'll just save the last date a status change happened.
So the question is: what is the optimal way to store that kind of information at this frequency and retention? I'm coding in PHP with a MySQL DB behind it, but maybe there's something better than a database for this?
Any language/technology is fine as long as it's free and runs on Linux/Unix.
Thank you
That sounds like a textbook case for a round-robin database: RRDtool manages exactly this kind of fixed-size storage, automatically consolidating old samples into averages. I don't see a reason to reinvent the wheel.
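For instance, the existing PHP script could feed an RRD by shelling out to the rrdtool CLI. This is only a sketch; the file path, step size and retention periods are example choices, not recommendations:

```php
<?php
// Feed one measurement into an RRD, creating the file on first use.
function record(string $rrd, string $status, float $seconds): void
{
    if (!file_exists($rrd)) {
        exec(sprintf(
            'rrdtool create %s --step 300 ' .
            'DS:rt:GAUGE:600:0:U ' .        // response time in seconds
            'RRA:AVERAGE:0.5:1:288 ' .      // 5-min samples for one day
            'RRA:AVERAGE:0.5:12:720 ' .     // hourly averages for 30 days
            'RRA:AVERAGE:0.5:288:730',      // daily averages for two years
            escapeshellarg($rrd)
        ));
    }
    // "U" marks the sample as unknown when the site is down,
    // so outages don't drag the averages toward zero.
    $value = $status === 'up' ? sprintf('%.3f', $seconds) : 'U';
    exec(sprintf('rrdtool update %s N:%s', escapeshellarg($rrd), $value));
}

record('/var/lib/monitor/homepage.rrd', 'up', 0.412);
```

The nice part is that the file never grows: `rrdtool fetch homepage.rrd AVERAGE -s -1month` prints the consolidated averages back out, and `rrdtool graph` can plot them directly.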
Save every request to a DB table, and write simple PHP functions to calculate the daily, weekly and monthly average response times. If you only store response times and request dates, the table won't grow fast, and with the right indexes the queries will stay quick.
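A minimal sketch of that approach with PDO (the table and column names here are made up for illustration):

```php
<?php
// Assumed schema:
//   CREATE TABLE checks (
//       checked_at DATETIME NOT NULL,
//       status     ENUM('up','down') NOT NULL,
//       rt         FLOAT NULL,        -- response time in seconds, NULL when down
//       INDEX (checked_at)
//   );
$pdo = new PDO('mysql:host=localhost;dbname=monitor', 'user', 'pass');

// Record one probe result.
$stmt = $pdo->prepare('INSERT INTO checks (checked_at, status, rt) VALUES (NOW(), ?, ?)');
$stmt->execute(['up', 0.412]);

// Daily averages for the last 30 days. GROUP BY DATE() does the bucketing;
// swap in YEARWEEK(checked_at) or DATE_FORMAT(checked_at, '%Y-%m') for
// weekly or monthly averages.
$rows = $pdo->query(
    "SELECT DATE(checked_at) AS day, AVG(rt) AS avg_rt
       FROM checks
      WHERE checked_at >= NOW() - INTERVAL 30 DAY
      GROUP BY DATE(checked_at)
      ORDER BY day"
)->fetchAll(PDO::FETCH_ASSOC);
```

Even at one probe per minute that's only about half a million rows a year, which MySQL handles easily with the index on checked_at.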