Bell Labs or Microsoft security?

Jack Bates jbates at brightok.net
Thu Jan 30 15:29:28 UTC 2003


From: "Simon Waters"

>
> 40 years of experience says it is unreasonable to expect the
> programmer to get it right 100% of the time.
>
> A modern server or Desktop OS is measured in hundreds of millions
> of lines of code, what is an acceptable error rate per line of code?
>
Perhaps I'm missing it, but is it unreasonable to have a tool that does
buffer checks? Obviously, the issue often comes up when data is passed
to outside routines whose handling of it the compiler may not know
about. However, it isn't hard to flag such calls as possible sources of
overflow when they are unknown, or to continue the checks when the tool
does know that particular routine and what its data manipulation
consists of. The idea of hand-checking code for mistakes is
unreasonable. As you say, mistakes will happen. Yet buffer overflows
are mathematically detectable. If all APIs are handed out with proper
specs of data handling, then verifying the integrity of pointer usage
could be guaranteed.
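
To make that concrete, here is a minimal C sketch (routine names are
hypothetical, not from any real API) of the two cases: a call a checker
could verify because the callee's spec bounds its writes, and an
unchecked copy it could only flag as a possible overflow source.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical "outside" routine. Its spec -- "writes at most
     * n bytes into dst, including the terminator" -- is what a
     * bounds checker would need in order to verify callers. */
    static void fill_record(char *dst, size_t n)
    {
        strncpy(dst, "record data", n - 1);
        dst[n - 1] = '\0';
    }

    int main(void)
    {
        char buf[8];

        /* Case 1: the buffer size is visible and the callee's
         * spec bounds its writes to n bytes, so a checker can
         * verify this call never writes past buf. */
        fill_record(buf, sizeof(buf));
        printf("%s\n", buf);

        /* Case 2: a copy whose source length the compiler cannot
         * see. With no spec for the data, a checker can only flag
         * it as a possible overflow:
         *
         *     strcpy(buf, untrusted_input);
         */
        return 0;
    }
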

Am I missing something?

-Jack



