Can P2P applications learn to play fair on networks?

James Hess mysidia at gmail.com
Mon Oct 22 03:35:17 UTC 2007


Possible scenario...

Subscriber bandwidth caps are, in theory, too high for what the ISP can
actually support -- but if the ISP were to lower them, the competition's
service would look better, advertising the larger supposed data rate --
plus the cap reduction would hurt polite users.

In the absence of P2P applications, the limits were fine, so hurting the
P2P application may be preferable to the ISP charging everyone more to
support the excessive bandwidth usage of the 2-3% of subscribers who use
P2P applications, or dropping that 6m bandwidth cap to a 256 kilobit cap
just to be able to guarantee everyone can use it all at the same time.


Many ISP customers might thank them for blocking P2P if it keeps their
subscription costs low -- in the absence of sufficient customer demand
for P2P, it will be throttled or filtered. If they're paying for a 1.5m
connection (not a 6m) that costs half the price of a normal 1.5m
connection but blocks P2P, many customers might like to make that
tradeoff.

> That's the ONLY thing they have to give us. Forget looking at L4 or alike,
> that will be encrypted as soon as ISPs start to discriminate on it. Users
> have enough computing power available to encrypt everything.

I'm afraid the response could then be for providers that limit P2P to
begin treating everything
encrypted as suspicious.


The source and destination address are enough to do a lot in theory....

If the first packet exchanged between two hosts was sourced from a
subscriber, then the ISP's monitoring mechanism can record a session --
"session started from inside to outside" -- just like a stateful
firewall.
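As a toy sketch of that stateful direction-tracking (the subscriber
addresses and function names here are invented for illustration; a real
monitor would key on the full 5-tuple per flow):

```python
# Hypothetical sketch: record a flow's direction from whoever sent its
# first packet, like a stateful firewall would.

SUBSCRIBERS = {"10.0.0.5", "10.0.0.9"}   # example "inside" addresses

sessions = {}   # normalized address pair -> recorded direction

def observe(src, dst):
    """Classify a flow by the sender of its first observed packet."""
    key = tuple(sorted((src, dst)))
    if key not in sessions:
        # Only the first packet of the pair sets the direction;
        # later packets in either direction reuse the recorded state.
        sessions[key] = "inside-out" if src in SUBSCRIBERS else "outside-in"
    return sessions[key]
```

Return traffic for an established session keeps the original
classification, which is what makes the outside-to-inside case below
detectable.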

The ratio of bytes a customer sends to an address versus the bytes they
receive from that address can be used: anything above 1.0 is an upload,
anything below 1.0 is a download; a high ratio earns a reduced
bandwidth cap.
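That ratio heuristic is simple enough to sketch (assuming per-peer byte
counters are already collected somewhere; the names are made up):

```python
# Toy byte-ratio classifier for a single customer<->address pair.

def classify(bytes_sent, bytes_received):
    """Label a traffic relationship by its send/receive byte ratio."""
    # Guard against division by zero on a receive-silent peer.
    ratio = bytes_sent / max(bytes_received, 1)
    return "upload" if ratio > 1.0 else "download"
```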

Very poor treatment could be given to sessions started from outside to inside.

An address that only one or two subscribers exchange traffic with is
probably a P2P app. An address that many subscribers try to exchange
traffic with is probably an e-commerce site.

Thus the whitelists could be built through automated means, just by
counting the number of distinct inside sources per outside destination.
(If 1000 different customer source addresses send encrypted port 443
traffic to one host, then that host could be automatically listed as
"probably not a P2P host.")
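The fan-in counting could look something like this sketch (the
threshold and names are invented, and any real system would age entries
out over time):

```python
from collections import defaultdict

# Hypothetical automated whitelist by fan-in: track distinct inside
# sources per outside destination.

POPULARITY_THRESHOLD = 1000   # invented cutoff for "popular site"

seen = defaultdict(set)   # outside address -> set of inside sources

def record_flow(inside_src, outside_dst):
    seen[outside_dst].add(inside_src)

def probably_not_p2p(outside_dst):
    """A destination many distinct subscribers reach looks like a site,
    not a peer."""
    return len(seen[outside_dst]) >= POPULARITY_THRESHOLD
```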

A second possibility is that the ISP could examine the SSL certificate
of the remote destination -- if a site has gone to the trouble of
having a high-grade X.509 certificate signed by a for-fee official CA,
then it's probably not a P2P peer.

If a user tries to connect to a site that has no certificate signed by
a recognized CA, then it's probably either a possible phisher or a P2P
peer -- these could in theory be blocked as a "stop phishing"
measure.

"Security" measure.
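The certificate heuristic amounts to attempting verification against
the system's trust store; a rough active-probe sketch (a real
deployment would presumably inspect handshakes passively rather than
connecting out):

```python
import socket
import ssl

def has_ca_signed_cert(host, port=443, timeout=5):
    """Return True if a TLS handshake with the host verifies against
    the default CA bundle; False on verify failure or no connection."""
    ctx = ssl.create_default_context()   # uses the system trust store
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True    # certificate chained to a recognized CA
    except (ssl.SSLError, OSError):
        return False           # self-signed, untrusted, or unreachable
```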


> > Only if P2P shared network resources with other applications well does
> > increasing network resources make more sense.
> If your network cannot handle the traffic, don't offer the services.

Exactly what they would seem to be doing. By blocking or throttling P2P
uploads, they are choosing not to offer full P2P access.

Some ISPs may block P2P and be very quiet about it, which is
unfortunate, as customers would want to know about extra restrictions
on the use of their X-megs connection.


Generally, warnings that excessive-bandwidth applications may be
limited will be mentioned in ISPs' existing Acceptable Use Policies;
they're probably just not outright saying "we block Xyz."


P2P applications seem to be a valuable tool; however, it would be an
ISP's choice to refuse to support them -- or to require P2P users to
pay extra, in proportion to the additional network usage the service
requires, when P2P usage is a burden on their network.

Their network, their rules.


The bigger issue, I would say, is that in many areas provider
monopolies exist on affordable residential access services.

So if "Provider A" happens to be the cable company in a local area,
owning all that infrastructure and the rights to hang cable, there's no
opportunity for a "Provider B" to satisfy the demand if they can't get
a wire between them and their would-be customer....


No competition and no cost-effective alternative access path gives
"Provider A" too much free rein.

Free rein in terms of limiting consumer choice and forcing customers to
accept substandard or partial services, when customers are tricked by
shiny advertising into thinking they are buying high-grade, fully
featured services.


-- 
-J



More information about the NANOG mailing list