nanog mailing list archives

Re: Can P2P applications learn to play fair on networks?


From: Florian Weimer <fw () deneb enyo de>
Date: Sun, 21 Oct 2007 17:17:31 +0200


* Sean Donelan:

> If it's not the content, why are network engineers at many university
> networks, enterprise networks, and public networks concerned about the
> impact particular P2P protocols have on network operations?  If it
> were just a single network, maybe they are evil.  But when many
> different networks all start responding, then maybe something else is
> the problem.

Uhm, what about civil liability?  It's not necessarily a technical issue
that motivates them, I think.

> The traditional assumption is that all end hosts and applications
> cooperate and fairly share network resources.  NNTP is usually
> considered a very well-behaved network protocol: big bandwidth, but
> sharing network resources.  HTTP is a little less well behaved, but
> still seems to share network resources roughly equally with other
> users.  P2P applications seem to be extremely disruptive to other
> users of shared networks, and cause problems for other "polite"
> network applications.
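The fairness point above can be made concrete with a toy model.  TCP's
congestion control divides capacity roughly per *flow*, not per user, so
a P2P client opening many parallel connections captures a share
proportional to its connection count.  This sketch is my own
illustration (function name and numbers invented; real AIMD fairness
also depends on RTT and loss rates):

```python
def per_user_share(capacity_mbps, flows_per_user):
    """Toy model of per-flow fair sharing: each TCP flow gets an equal
    slice of the bottleneck, so a user's throughput is proportional to
    how many flows it opens.  Not a real congestion-control model."""
    total_flows = sum(flows_per_user)
    return [capacity_mbps * f / total_flows for f in flows_per_user]

# Two single-connection web users sharing a 100 Mbps link with one
# P2P client running 50 connections: the P2P client takes ~96% of
# the link, each web user under 2 Mbps.
shares = per_user_share(100.0, [1, 1, 50])
```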

So is Sun RPC.  I don't think the original implementation performs
exponential back-off.
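For contrast, here is what "well-behaved" retry timing looks like: an
exponential back-off with jitter, where each failed attempt doubles the
maximum wait.  This is an illustrative sketch (names and parameters are
mine), not the Sun RPC implementation, which retransmits at a roughly
fixed interval:

```python
import random

def backoff_delays(base=1.0, cap=60.0, attempts=6):
    """Exponential back-off with full jitter: before retry N, wait a
    random time in [0, min(cap, base * 2**N)] seconds.  A protocol
    that lacks this (retrying at a fixed rate) keeps hammering an
    already-congested network."""
    return [random.uniform(0.0, min(cap, base * 2 ** n))
            for n in range(attempts)]
```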

If there is a technical reason, it's mostly that the network as deployed
is not sufficient to meet user demand.  Instead of providing more
resources, lack of funds may force some operators to discriminate
against certain traffic classes.  In such a scenario, it doesn't even
matter much that the targeted traffic class transports content of
questionable legality.  It's more important that the measures applied
to it have actual impact (Amdahl's law dictates that you target popular
traffic), and that you can get away with it (this is where the legality
comes into play).
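The Amdahl's-law-style point above is just arithmetic: throttling one
traffic class can reduce total load by at most that class's share of the
traffic.  A quick illustration (my own numbers, purely hypothetical):

```python
def total_reduction(class_share, class_throttle):
    """Fraction of total traffic removed when throttling one class:
    (class's share of traffic) * (fraction of that class removed).
    Overall savings are capped by class_share, however aggressively
    the class is throttled."""
    return class_share * class_throttle

# Halving a class that is 60% of traffic frees 30% of the link;
# completely blocking a 5% class frees only 5%.
big = total_reduction(0.60, 0.5)
small = total_reduction(0.05, 1.0)
```

Hence operators under capacity pressure go after the popular protocol,
not the most objectionable one.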
