Monitoring other people's sites (Was: Website for returns "HTTP/1.1 500 Internal Server Error")

Jeroen Massar jeroen at
Tue Mar 20 09:54:13 CDT 2012

On 2012-03-20 15:40 , Vinny_Abello at wrote:
> FYI - it's also the main IPv4 site, not just IPv6... although I'm
> unsure if it's the same issue.
> I was monitoring availability as a point of reference for my network
> and started receiving 500 errors recently as well that tripped up the
> monitoring system, even though the page comes up in any browser I
> try.
> GET / HTTP/1.1 User-Agent: Mozilla/4.0 (compatible; MSIE 4.01;
> Windows NT)

For everybody who is "monitoring" other people's websites: please, please,
please monitor something static like /robots.txt, as that can be served
statically and is kind of appropriate, since it is intended for robots
anyway. Oh, and of course do set the User-Agent to something meaningful,
and to be super nice include a contact address, so that people who do
check their logs once in a while for fishy things at least know what is
happening there and that it is not a process run amok or something.
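The advice above (probe a static file, identify yourself, include a contact address) can be sketched roughly like this; the URL, agent string, and contact address are placeholders you would replace with your own:

```python
import urllib.request

# Hypothetical values -- substitute your own monitoring target and contact.
URL = "http://example.com/robots.txt"
USER_AGENT = "example-monitor/1.0 (contact: noc@example.net)"

def build_probe(url=URL):
    """Build a GET request for a static file with an identifiable User-Agent."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

def probe(url=URL, timeout=5):
    """Fetch the static file and return the HTTP status code."""
    with urllib.request.urlopen(build_probe(url), timeout=timeout) as resp:
        return resp.status
```

A check like this hits a file the server can answer from cache, and anyone reading their access logs can see at a glance who is polling them and whom to contact about it.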

Of course, asking before doing tends to be a good idea too.

The IPv6 Internet already consists far too much of monitoring done by
pulling pages and doing pings...

Fortunately that should change significantly in a few months.


 (who noticed a certain s....h company performing latency checks against
one of his sites, which was no problem, but the fact that they were
causing almost more hits/traffic/load than normal clients was a bit
much; having them pull robots.txt instead solved their problem of
checking whether their IPv6 worked fine, and the load issue on the
server side was gone too, as nginx happily serves little robots.txt
files at great speed from cache ;)

 And for the few folks pointing their Nagios instances at other people's
sites: they obviously do not understand that even if the alarm goes off
because something is broken, they cannot fix it anyway, so why bother...

More information about the NANOG mailing list