Reducing Usenet Bandwidth

Iljitsch van Beijnum iljitsch at muada.com
Sun Feb 17 12:44:53 UTC 2002


On Fri, 8 Feb 2002, Stephen Stuart wrote:

> The topic being discussed is to try to reduce USENET bandwidth. One
> way to do that is to pass pointers around instead of complete
> articles. If the USENET distribution system passed pointers to
> articles around instead of the actual articles themselves, sites could
> then "self-tune" their spools to the content that their readers (the
> "users") found interesting (fetch articles from a source that offered
> to actually spool them), either by pre-fetching or fetching on-demand,
> but still have access to the "total accumulated wisdom" of USENET -
> and maybe it wouldn't need to be reposted every week, because sites
> could also offer value to their "users" on the publishing side by
> offering to publish their content longer.
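The scheme Stephen describes could be sketched roughly as follows. This is only an illustration under my own assumptions (all class and function names are hypothetical, and the real distribution protocol is not specified anywhere): a reading site accepts the flood of pointers but holds article bodies only for what its local readers actually request, so the spool "self-tunes" to local interest.

```python
from typing import Callable, Dict, Optional

class OnDemandSpool:
    """Hypothetical local spool: stores a pointer for every article,
    but a body only once a local reader has asked for it."""

    def __init__(self, fetch_remote: Callable[[str], bytes]):
        self.fetch_remote = fetch_remote    # e.g. pull from an upstream spool
        self.pointers: set = set()          # flooded to us instead of bodies
        self.bodies: Dict[str, bytes] = {}  # self-tuned cache of local interest

    def receive_pointer(self, message_id: str) -> None:
        """Called for every article announced on the network."""
        self.pointers.add(message_id)

    def read(self, message_id: str) -> Optional[bytes]:
        """Serve a local reader, fetching the body on first request."""
        if message_id not in self.pointers:
            return None                      # article unknown to the network
        if message_id not in self.bodies:    # first local request: go fetch
            self.bodies[message_id] = self.fetch_remote(message_id)
        return self.bodies[message_id]       # later requests hit the cache
```

Pre-fetching would just mean calling the fetch step for selected pointers before any reader asks; either way the site still "sees" all of Usenet through the pointer flood while storing only a fraction of it.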

I'm a bit behind on reading the NANOG list, so excuse the late reply.

If we can really build such a beast, this would be extremely cool. The
method of choice for publishing free information on the Net is WWW these
days. But it doesn't work very well, since there is no direct relationship
between a URL and the published text or file. So people will use a "far
away" URL because they don't know the same file can be found much closer,
and URLs tend to break after a while.
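One way to remove that mismatch (a sketch on my part, not something the pointer proposal spells out; the hash choice is an assumption) is to derive the pointer from the article's content rather than from its location, so any spool holding the same bytes can serve it and the pointer can never silently point at something else:

```python
import hashlib

def article_pointer(article_body: bytes) -> str:
    """Derive a location-independent identifier for an article.

    Unlike a URL, the identifier depends only on the content, so a
    reader can fetch from whichever spool is closest and verify that
    the bytes received are the article the pointer named.
    """
    return hashlib.sha256(article_body).hexdigest()

body = b"Subject: test\n\nHello, world."
ptr = article_pointer(body)
fetched = body  # pretend this came back from the nearest spool
assert article_pointer(fetched) == ptr  # verify before trusting it
```

A "far away" copy and a nearby one would carry the same identifier, and a broken or tampered copy would fail the check instead of being served.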

I've thought about this for quite a while, and even written down a good
deal of my ideas. If you're interested:

http://www.muada.com/projects/usenet.txt

Iljitsch van Beijnum



