SORBS on autopilot?

Michelle Sullivan matthew at
Fri Jan 15 17:01:43 UTC 2010

paul wrote:
> Michelle,
> Thanks for your email.  Please specifically look at ticket 260695.  I 
> created the ticket on January 5th at about 1:30EST.  Immediately I got 
> my response from the robot.

See my other message in addition.
> I replied a few minutes later with:
>> TTL is right.  PTR is right.

That is my view; however, most (if not all) of the tickets were for the 
/22, not the /32, which is why it was rejected.
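The scope difference matters: a /22 listing covers 1024 addresses, so a correct PTR on one /32 inside it does not by itself clear the listing. A quick sketch with Python's stdlib `ipaddress` module (the addresses are illustrative documentation ranges, not the actual networks from the ticket):

```python
import ipaddress

# Hypothetical listed network and one host inside it (RFC 5737 ranges).
listed = ipaddress.ip_network("203.0.112.0/22")
single = ipaddress.ip_address("203.0.113.77")  # the one /32 with a correct PTR

print(listed.num_addresses)  # 1024 -- every address the /22 listing covers
print(single in listed)      # True -- the fixed /32 is only one of them
```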

> From your email, it is my understanding this should have went to a 
> human. I have no idea why my IP address wasn't accepted in the first 
> place.  And I have no idea why I didn't get a human response.

So go back to the robot response and tell me where it says it'll be sent 
to a human...please...?

> A couple suggestions:
> -program the robot to give the exact reason why it is denying: TTL 
> wrong or PTR indicates dynamic or whatever

It does give several responses; however, the more exact the response, the 
more issues we have had with people not understanding the reply.  Our 
approach has been to format the message with a link to the FAQ, where 
there is a much more detailed explanation.  I will review the response 
with your suggestions in mind and see if we can change it to something clearer.

> -kind of leaping to conclusions here, but possibly the robot is 
> caching DNS?  Which means even if what was broken had been fixed, the 
> robot wouldn't see it?

The robot caches results for 48 hours to prevent people from launching DoS 
attacks on our systems as well as yours.  The results are easily checked 
here: <first octet>/<second octet>/<network>.txt
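The 48-hour result cache described above can be sketched as follows. This is purely illustrative (it is not SORBS's actual code, and the class and method names are invented); the clock is injectable so expiry can be exercised without waiting two days:

```python
import time

CACHE_TTL = 48 * 3600  # results are held for 48 hours


class ResultCache:
    """Illustrative 48-hour cache of per-network test results."""

    def __init__(self, clock=time.time):
        self._clock = clock   # injectable clock, handy for testing
        self._entries = {}    # network -> (timestamp, result)

    def put(self, network, result):
        self._entries[network] = (self._clock(), result)

    def get(self, network):
        entry = self._entries.get(network)
        if entry is None:
            return None
        stamp, result = entry
        if self._clock() - stamp >= CACHE_TTL:
            # Expired: drop the stale entry and force a fresh lookup.
            del self._entries[network]
            return None
        return result
```

A consequence of this design is exactly what paul suspected: even if the DNS is fixed immediately, the robot keeps returning the cached (failing) result until the entry expires.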


In this case you can easily see why the robot was unable to process the 
request...  PTRs were requested from the nominated authoritative servers, 
only to receive a "NODATA" response (commonly seen when TCP responses are 
required or when CNAMEs are returned without a PTR).
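The CNAME-without-PTR case typically comes from RFC 2317 classless delegation, where the parent /24 reverse zone answers with a CNAME into a child zone; if the child zone is missing the PTR at the target name, the chain ends in NODATA. A rough sketch of such a setup (zone names and hosts are illustrative):

```
; Parent zone 2.0.192.in-addr.arpa (a /24), delegating a /26 sub-range:
0-63            NS      ns1.example.net.
1               CNAME   1.0-63.2.0.192.in-addr.arpa.
2               CNAME   2.0-63.2.0.192.in-addr.arpa.

; Child zone 0-63.2.0.192.in-addr.arpa, served by ns1.example.net:
1               PTR     host1.example.net.
; No record at "2" here -- a PTR query for 192.0.2.2 follows the
; CNAME and then receives NODATA, exactly as the robot saw.
```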

There is an issue with the robot and some correctly assigned classless 
delegations due to the way we process the data.  There are various 
catches to correct this and re-process the network with a more reliable 
(but considerably more resource-hungry) method.  Unfortunately it's not 
foolproof, which is why we tell people to reply to the robot's 
response to get a human to review it.  If anyone out there is 
knowledgeable in Perl, C, and DNS and wants to take a shot at fixing that 
issue, I'd love to have the help.


More information about the NANOG mailing list