Interesting People mailing list archives

A flaw in the Internet architecture?


From: David Farber <dave () farber net>
Date: Fri, 11 Jul 2008 05:10:35 -0700


________________________________________
From: Tony Lauck [tlauck () madriver com]
Sent: Thursday, July 10, 2008 9:16 PM
To: David Farber
Cc: Richard Bennett
Subject: Re: [IP] A flaw in the Internet architecture?

I'm not sure what particular aspect of the Jacobson algorithm Richard
Bennett considers a flaw. Normally TCP flow rates decrease with
increasing round-trip time -- this is a natural stabilizing property of
any window-based, end-to-end flow control mechanism. By locating its
servers closer to its end users, a CDN operator reduces round-trip time,
which benefits its users, but it also reduces transmission costs, which
indirectly benefits network operators and other network users. I fail to
see any exploitation here. In any event, research dating at least as far
back as the early 1980s shows that fair allocation of network resources
cannot be achieved at the ends of a network. Network operators must be
responsible for allocating their own resources equitably among their
customers; practical mechanisms exist to achieve this, some of which I
have described in previous posts to this list.
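A minimal sketch of that relationship, assuming a single window-limited
flow and ignoring loss and slow start (the 64 KiB window and the RTT
values are illustrative assumptions, not measurements):

    # Rough illustration: a window-limited TCP flow can move at most
    # about one window of data per round trip, i.e. window_bytes / rtt.
    # Window size and RTTs below are illustrative, not measurements.

    def window_limited_throughput(window_bytes, rtt_seconds):
        """Approximate bytes/second for one window-limited connection."""
        return window_bytes / rtt_seconds

    WINDOW = 64 * 1024  # assumed 64 KiB window

    for rtt_ms in (100, 50, 10):  # distant server vs. closer CDN nodes
        rate = window_limited_throughput(WINDOW, rtt_ms / 1000.0)
        print("RTT %3d ms -> ~%.2f MB/s" % (rtt_ms, rate / 1e6))

Halving the round-trip time doubles the ceiling; the nearby server is
simply running the same control loop with a shorter delay.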

In my opinion, a common carrier should not be allowed into any other
related business, period.  (This works both ways:  Google should not be
allowed to become a carrier.)  Limited space monopolies may be
unavoidable in certain corners of an otherwise free market, but
companies should not be allowed to use integration or contracts to
leverage a monopoly. I believe the railroads made this clear during the
19th century. In my opinion, these problems are structural and cannot be
solved by ever more complex regulations. What is needed is a set of simple
structural laws that make it impossible for unavoidable monopolies to
spread outside of their niche. (Maybe nothing in our advanced
civilization can be simple any longer. If so, then we are doomed, and
justly so.)

Tony Lauck
www.aglauck.com


Richard Bennett wrote:
The flaw that I see Google and the other CDNs exploiting is the inverse
correlation of TCP flow rates to round-trip time under congestion, which
is caused by the Jacobson algorithm; the multi-connection flaw exploited
by BitTorrent is a separate issue. In order to achieve the Google flow
rate, a competing search engine or video streamer would need to build or
hire infrastructure comparable to the Google network, and this is
something that ISPs could certainly provide on an economical basis if
they're permitted to do so by law. The NN regulations can be read as
barring ISPs from doing this, and that would be sad and
anti-competitive.
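A rough sketch of that inverse correlation, using the standard
loss-limited TCP approximation of Mathis et al. rather than anything
specific to the argument above (the segment size, loss rate, and RTTs
are illustrative assumptions, not measurements):

    import math

    # Steady-state approximation for a loss-limited TCP flow
    # (Mathis et al., 1997): rate ~= (MSS / RTT) * sqrt(3/2) / sqrt(p),
    # where p is the packet loss probability. Values are illustrative.

    def mathis_throughput(mss_bytes, rtt_seconds, loss_prob):
        """Approximate sustainable bytes/second for one TCP connection."""
        return (mss_bytes / rtt_seconds) * math.sqrt(1.5 / loss_prob)

    MSS = 1460.0   # typical Ethernet-sized segment, in bytes
    LOSS = 0.01    # assumed 1% loss under congestion

    for rtt_ms in (100, 20):  # distant origin vs. nearby CDN cache
        rate = mathis_throughput(MSS, rtt_ms / 1000.0, LOSS)
        print("RTT %3d ms -> ~%.2f MB/s per flow" % (rtt_ms, rate / 1e6))

At the same loss rate, the flow to the nearby cache runs roughly five
times faster, which is the per-connection advantage at issue.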

And FYI, Tony, I don't work for Comcast, but I'm certainly not opposed
to taking money from them, Google, or anybody else as long as I have
control of my message. I see the ISPs using my arguments all the time,
and it would only be respectable for them to send me a check now and
then. Ironically, I do have a small financial relationship with Google
inasmuch as I have an AdWords account.

RB



-------------------------------------------
Archives: https://www.listbox.com/member/archive/247/=now
RSS Feed: https://www.listbox.com/member/archive/rss/247/
Powered by Listbox: http://www.listbox.com

