Digex transparent proxying

Chris A. Icide chris at priori.net
Tue Jun 30 23:10:52 UTC 1998


[NANOG WARNING: This document contains neither directions on contacting the
operational support group for any entity, nor the configuration code for
any network hardware]

Against my better judgement, I've decided to add my tiny bit of content to
this thread.  If there is anyone to thank for getting me into it, Mr.
Porter has done so with his excellent review of the thread.  Without it, I
might not have seen the humor in it.

However, there is something that I just can't understand.  It's amazing to
me that with the number of examples in history, people are able to make
grand statements like many have in this thread.  Please allow me to point
out a few historical statements that I feel are very similar to some of the
grand statements made in this thread:

1. The world is flat and rests on the back of a giant turtle.
2. Man will never be able to fly faster than the speed of sound.
3. The Dow Jones Industrial Average will never break 4000.

Grandiose statements are generally risky, and almost without exception, the
grander the statement, the more likely it is to be wrong to some degree.  So
if you think some technology is just a short-term hack, doomed to be
forgotten, sure, tell us, and then tell us why you believe it.  The grand
statement alone is worthless.

On to the topic:  As far as I can tell there are three major subjects in
this thread: the ethical nature of using a transparent cache; the technical
and operational problems and solutions associated with implementing a
transparent cache; and finally, the customer support and process management
involved in implementing the transparent cache.

I'll address these in reverse order.

Customer support and process management seem to be the Achilles' heel of
the Internet industry.  The industry has built up a terrible reputation for
poor customer support.  Coming from the nuclear industry, I am truly jaded
when it comes to process management, but the Internet industry seems to have
gone as far as possible in the other direction.  The lack of process
management in many (not all) of the organizations involved in this industry
amazes me.  Procedures seem to be a dirty word, and tend to result in an
exodus of technical talent when imposed upon an organization.  In many
cases, you can find people in the industry jokingly referring to their
facilities as really neat toys.

Compared to the multiple interactions in the Internet, a nuclear power
plant is simple.  And yet, there are very few people, if any, who could
singly determine what effect one particular action would have upon the
operation of the plant as a whole.  Obviously, the consequences of something
bad happening are far worse, but in many cases, "bad" means the plant is
shut down for some time period, and the company loses cash at an astonishing
rate.  However, the use of the Internet as a tool for many companies'
revenue streams continues to grow, and the problems generated by poor
process management are beginning to have a much wider financial effect.

It's apparent that we on the operational side are truly more at fault than
anyone else.  It's our duty to provide this service to our customers as
best we can.  If there is any area in which a customer could claim
negligence or seek damages, this is one place where they may find a good
case.  I'm not pointing fingers here; I've been in organizations that were
party to this myself.  I'm merely making a general observation.  Many
organizations have made good headway here.

Events such as the infamous 7007 incident, the widespread connectivity
problems concurrent with filter implementations, down to local effects such
as those that gave rise to this thread, will continue to occur as long as we
continue to take a lax view of the results of our actions.

Several organizations have apparently been formed recently that claim to be
attempting to address this problem.  However, I have not yet seen an
industry-wide effect from the operations of these organizations.

As far as the technical and operational problems associated with the
implementation of current transparent caching technology, it appears to me
that the people on this list with the most background in cache technology
have come to an agreement that technological answers to the problems
experienced in this event exist.

I can't, and won't, feign any significant technical knowledge in the caching
field.  I have a general knowledge level at best.

From a general level, this industry's technological growth rate is
phenomenal, and the resulting technological implementation problems can be
significant.  However, there are organizations that are going to continue
to develop and implement these new technologies.  The successful ones will
be financially rewarded in our capitalistic society, and the unsuccessful
ones will fare less well.  Sooner or later those implementing the new
technology will have a significant advantage over those who don't.
Avoiding the challenge of new technology can be extremely damaging to the
viability of a company in the Internet industry.

Yes, I admit the possibility that any current implementation of a new
technology such as caching may end up being a temporary fix.  However, I
would be amazed to find that the effort put into the development and
operational testing of such a technology did not result in some advancement
in the field.  From caching, for example, we may find interesting
advancements in distributed technology.

Finally, the ethical question comes around.  Is it, or is it not, ethical
for someone to use a transparent cache?  IMHO, the arguments on this
portion of the thread have been nothing but sensationalist.  It reeks of
paranoia.  Perhaps the X-Files can do (or have they already done?) a show on
this.  THEY are stealing your packets without your knowledge!  THEY are
monitoring your every transmission and receipt!  THEY know your innermost
secrets!  Bah!

Does transparent caching (or any caching technology, for that matter) give
someone additional tools to pry into the data flow?  Yes.  Does the
technology specifically do this and provide this information?  No.  It's a
tool that could be mismanaged and used for something other than its original
purpose, perhaps even for intrusive actions.  However, that requires someone
who wants to do such a thing, and tools already exist to do similar things
if someone so chooses.  Attempting to limit the growth of a technology
because it could be used wrongly is pure folly.

In the case of transparent caching, it's my belief that as long as an end
user receives content as provided by the content provider, in the format
specified by the content provider (yes, this includes dynamic information),
both the content provider and the end user couldn't care less what physical,
electromagnetic, or optical transformations take place from end to end.  If
the end-to-end transit is truly transparent, then the goal is accomplished.
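
To make that notion of transparency concrete, here is a minimal sketch in
Python of the one decision a transparent cache must get right to stay
transparent: serve a stored copy only when the origin's headers say it may,
and pass dynamic or explicitly uncacheable content straight through.  This
is purely my own illustration (the names is_cacheable, handle_request, and
fetch_from_origin are hypothetical), not Digex's or any vendor's code.

def is_cacheable(headers):
    """Return True only if the origin server's headers permit caching."""
    cache_control = headers.get("Cache-Control", "").lower()
    if any(d in cache_control for d in ("no-store", "no-cache", "private")):
        return False
    # Dynamic, per-user responses typically carry Set-Cookie; pass them through.
    if "Set-Cookie" in headers:
        return False
    # Require either an explicit freshness lifetime or a validator.
    return ("max-age" in cache_control or "ETag" in headers
            or "Last-Modified" in headers)

def handle_request(url, cache, fetch_from_origin):
    """fetch_from_origin(url) is assumed to return (headers, body) from the origin."""
    if url in cache:
        return cache[url]                    # hit: serve the stored copy
    headers, body = fetch_from_origin(url)   # miss: go to the real server
    if is_cacheable(headers):
        cache[url] = (headers, body)         # store only what the origin allows
    return headers, body

A cache that follows the origin's own directives this way can interpose
itself in the path without changing what the end user ultimately receives.
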

However, the expectation that this goal could be accomplished while content
providers, transit providers, and client/server software providers work
independently of each other is ludicrous.  These entities must work
together to achieve it.  Pointing fingers and blaming the other parties
does nothing more than delay the maturation of this industry.  This is a
complex and extremely interactive system, and it must be dealt with as a
whole for an optimal solution to arise.

Perhaps it's the place here for a Grand Statement concerning the likelihood
of this occurring.  :)



-Against my better judgement,
Chris





