Reducing Usenet Bandwidth

Stephen Stuart stuart at tech.org
Sat Feb 9 06:09:12 UTC 2002



> Like many Internet settlement schemes, this seems to not make much sense. If 
> a person reads USENET for many years enjoying all of its wisdom, why should 
> he get a free ride? And why should the people who supply that wisdom have to 
> pay to do so? A USENET transaction is presumed to benefit both parties, or 
> else they wouldn't have configured their computers to make that transaction.

Well, the idea wasn't exactly fully formed, and you've taken it in a
direction that doesn't match what I was thinking. I am definitely
*not* thinking at the granularity of "users." I've heard of users, and
their hunger for pornography, MP3s, and pirated copies of Word, but
this isn't about them. It's about sites that want to offer USENET to
these "users," and the ever-increasing cost to play in the global
USENET pool.

The topic under discussion is how to reduce USENET bandwidth. One
way to do that is to pass pointers around instead of complete
articles. If the USENET distribution system flooded pointers to
articles rather than the articles themselves, sites could
"self-tune" their spools to the content that their readers (the
"users") found interesting, fetching articles - either in advance or
on demand - from a source that offered to actually spool them. They
would still have access to the "total accumulated wisdom" of USENET
- and maybe it wouldn't need to be reposted every week, because
sites could also offer value to their "users" on the publishing side
by keeping their content available longer.
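
To make that concrete, here's a minimal sketch (Python, with the
record fields and function names invented for illustration - this
isn't any actual NNTP extension) of what a flooded pointer might
carry and how a site's spool could fetch bodies on demand:

    # Hypothetical: flood small article *pointers*; fetch bodies only
    # when (or before) a local reader asks. Names are invented.
    from dataclasses import dataclass

    @dataclass
    class ArticlePointer:
        message_id: str   # globally unique Message-ID
        newsgroup: str    # group the article was posted to
        size: int         # body size in bytes
        sources: list     # sites currently offering to spool the body

    class Spool:
        def __init__(self, fetch_body):
            self.pointers = {}  # message_id -> ArticlePointer
            self.bodies = {}    # message_id -> cached article body
            self.fetch_body = fetch_body  # callable(site, id) -> bytes

        def receive_pointer(self, ptr):
            # Flooding moves only this small record, not the article.
            self.pointers[ptr.message_id] = ptr

        def read(self, message_id):
            # Fetch on demand the first time a reader asks; raises
            # KeyError if no offering site can supply the body.
            if message_id not in self.bodies:
                for site in self.pointers[message_id].sources:
                    try:
                        body = self.fetch_body(site, message_id)
                        self.bodies[message_id] = body
                        break
                    except IOError:
                        continue  # try the next offering site
            return self.bodies[message_id]

Pre-fetching would just mean calling read() eagerly for the groups a
site's readers actually follow.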

It would be helpful if that last decision - how much to publish -
didn't have global impact the way it does now. When someone injects
a copy of Word into the USENET distribution system today, every
site's disk and bandwidth cost is incurred immediately. If a pointer
were flooded instead, the up-front cost would be smaller, and the
article transfer cost could (arguably) more closely match a site's
level of demand rather than other sites' willingness to supply.
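
Back-of-the-envelope, with made-up numbers, the asymmetry looks
something like this:

    # Made-up figures, purely to illustrate the cost asymmetry.
    sites = 10000                # hypothetical full-feed sites
    article = 5 * 1024 * 1024    # one 5 MB binary posting
    pointer = 500                # a small pointer record, in bytes
    demand = 200                 # sites whose readers actually want it

    flood_articles = sites * article  # ~52 GB moved immediately
    flood_pointers = sites * pointer + demand * article  # ~1 GB

    print(flood_articles // flood_pointers)  # roughly 50x less traffic

The exact ratio depends entirely on how broad the real demand turns
out to be, but the up-front flood shrinks from the article size times
the number of sites to the pointer size times the number of sites.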

This would let sites with different levels of willingness to expend
resources still play in the global USENET game with, in theory,
greater access to information. It would, again in theory, allow
sites that don't have the resources to spool many articles
themselves to leverage access to sites that do (thus the comment
about the cost being incurred by the publisher - or, more
appropriately, by those willing to publish). The primary benefit, I
think, is that sites that publish poorly - that allow a lot of trash
to be posted - could do so without poisoning the village green for
everyone else. A downstream spool could implement the policy option
of declining to pre-fetch on a site-by-site basis, rather than
having to tune its spool on a group-by-group basis, and the
information would all still be there. The incentive to publish
quality information is that downstream sites become more willing to
pre-fetch from you, lowering your bandwidth costs.
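
A sketch of what that policy knob might look like - the reputation
scores and site names are invented here, and a real implementation
would have to decide how such scores get earned:

    # Hypothetical pre-fetch policy keyed on the injecting site,
    # not on the newsgroup. Scores and site names are invented.
    site_reputation = {
        "news.good-example.net": 0.9,  # mostly quality postings
        "spool.trashy.example": 0.1,   # lets a lot of junk through
    }

    PREFETCH_THRESHOLD = 0.5

    def should_prefetch(origin_site):
        # Eagerly fetch from well-behaved publishers; fetch lazily,
        # on first reader request, from everyone else. The trash is
        # still reachable - it just isn't pushed to every spool.
        return site_reputation.get(origin_site, 0.0) >= PREFETCH_THRESHOLD

    print(should_prefetch("news.good-example.net"))  # True
    print(should_prefetch("spool.trashy.example"))   # False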

There are, of course, a thousand devils in the details, like how to
chase down an article when you don't have it locally and don't
necessarily know who might still be offering a copy. Some of the
problems in that vein that appeared insurmountable ten years ago
might have solutions in current "peer-to-peer" networking
technologies (thus the off-hand comment about Napster).
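
The "who is offering a copy right now" question looks a lot like
what distributed hash tables are built for. A minimal sketch of one
such approach - rendezvous hashing over Message-IDs, purely
illustrative and not any deployed protocol:

    # Illustrative: rendezvous (highest-random-weight) hashing picks
    # which spooling sites are likely holders of a given article, so
    # a fetcher knows where to ask first without a global directory.
    import hashlib

    def candidate_holders(message_id, sites, replicas=3):
        def weight(site):
            digest = hashlib.sha1((site + message_id).encode()).hexdigest()
            return int(digest, 16)
        return sorted(sites, key=weight, reverse=True)[:replicas]

    sites = ["spool-a.example", "spool-b.example",
             "spool-c.example", "spool-d.example"]
    print(candidate_holders("<12345@news.example.com>", sites))

Every site computes the same candidate list independently, so an
article's likely holders can be found without flooding a location
update each time a copy appears or expires.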

Users, in theory, would not see anything different from what they
see today. Under the covers, though, (a) some articles might take
longer to fetch than others, and (b) there'd be less trash
distributed globally. I don't envision reducing the hunger of
"users" for pornography, MP3s, or pirated copies of Word. Maybe we
don't need to incur so much cost transmitting them to, and storing
them in, thousands of sites around the net each week, though.

Stephen


