Smurf Prevention

Richard Thomas buglord at ex-pressnet.com
Mon Jul 13 11:41:53 UTC 1998


-----Original Message-----
From: Dalvenjah FoxFire <dalvenjah at dal.net>
To: Richard Thomas <buglord at ex-pressnet.com>
Cc: nanog at merit.edu <nanog at merit.edu>
Date: Sunday, July 12, 1998 6:24 PM
Subject: Re: Smurf Prevention


>On Mon, Jul 13, 1998 at 04:48:41AM -0400, Richard Thomas put this into my
>mailbox:
>
>> Perhaps we might have some success preventing smurfs from the most common
>> sources, hacked machines on university dorm networks, by getting the
>> university backbones to filter spoofs. Things like SUnet, FUnet, NYSERnet,
>> etc, account for a large portion of universities used to smurf from, and it
>> might be easier than trying to get each school to filter individually. I
>> found the following two addresses for nysernet and funet but was unable to
>> read or translate the Swedish on www.sunet.se.
>
>That's one solution. What might be a better solution would be if the Big Few
>networks (MCI, Sprint, UUnet, etc.) were to take the list of smurf amplifiers
>from something like the SAR, *verify* that they're still smurf amplifiers,
>and then refuse to route traffic from those networks.
>
>Not only would it cut the smurfs down cold, but it would also get the folks
>responsible for those networks to fix things.
>
>Then again, if the big-bandwidth folks cared about such things, perhaps
>they would have done so already.

Unfortunately the big guys have no incentive to deal with it on a
large-scale basis. The amount of traffic involved is insignificant to them,
they're making money off the bandwidth used, and of course there's the
"filtering smurfs takes too much CPU time on our routers" answer (they
don't seem to realize that as soon as the kids lose the instant
gratification of seeing a ping timeout, they will get bored and stop). You
know, several global broadcast scans have been conducted, I've submitted
them all to the Smurf Amplifier Registry, and we now seem stuck at around
14,000. That leads one to believe this may be all there ARE (of the .0 and
.255 variety, anyhow). 10,000 of those show fewer than 10 dupes, and only
500 or so show more than 30. Also remember that the SAR doesn't rescan to
remove dead ones by itself (something I am currently working to add to my
bcast databases), so a good many of those are probably fixed by now.
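For what it's worth, a rescan of that kind is not much work: send one echo to
the directed-broadcast address and count how many distinct hosts answer (the
"dupes"). Below is a rough Python sketch of the counting half, parsing
Linux-style `ping` output; the function name and sample output are my own
illustration, not part of the SAR or anyone's actual bcast database.

```python
# Hypothetical sketch: count distinct hosts answering one echo sent to a
# broadcast address, by parsing Linux-style `ping -b` output. An address
# with 0 or 1 responders is effectively dead as an amplifier.
import re

def count_responders(ping_output):
    """Return the number of distinct source IPs seen in ping output."""
    hosts = set()
    for line in ping_output.splitlines():
        m = re.search(r"bytes from (\d+\.\d+\.\d+\.\d+)", line)
        if m:
            hosts.add(m.group(1))
    return len(hosts)

# Example output from pinging a (hypothetical) broadcast address once:
sample = """\
64 bytes from 10.1.2.3: icmp_seq=1 ttl=255 time=12.1 ms
64 bytes from 10.1.2.7: icmp_seq=1 ttl=255 time=12.4 ms (DUP!)
64 bytes from 10.1.2.9: icmp_seq=1 ttl=255 time=12.9 ms (DUP!)
"""
print(count_responders(sample))  # -> 3
```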

HOW HARD CAN IT BE to take care of 500 broadcasts? Very hard, since the only
bcasts still left are those with broken contact information and upstreams
who haven't been informed or who don't give a damn. Maybe if we all picked
10 of the worst offenders every day, picked up the phone, and started
informing people who have missed the boat...
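And when you do get someone on the phone, the fix you're asking for is only
a few lines of router configuration. A Cisco IOS-style sketch (interface
names and addresses are placeholders, not anyone's real config):

```
! Stop being an amplifier: don't translate directed broadcasts
! (e.g. to 10.1.2.255) into link-layer broadcasts on the LAN.
interface Ethernet0
 no ip directed-broadcast
!
! Stop sourcing smurfs: on the customer/dorm-facing interface, only
! accept packets whose source address belongs to that network.
interface Serial0
 ip access-group 101 in
!
access-list 101 permit ip 10.1.2.0 0.0.0.255 any
access-list 101 deny ip any any
```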




